
ETL/Technical Lead Resume


Warren, MI

PROFESSIONAL SUMMARY:

  • Highly motivated, solutions-driven IT professional with 7 years of data warehousing experience in ETL design and development. Involved in the complete Software Development Life Cycle (SDLC), including requirements gathering, system design, data profiling, data modeling, ETL design/development, production enhancements, support, and maintenance.
  • Strong understanding of data warehousing principles, including fact tables, dimension tables, and star/snowflake schema modeling. Strong SQL experience with SQL queries, PL/SQL stored procedures, table partitions, and packages to load data into data warehouses/data marts and generate reports.
  • Excellent Interpersonal skills with the ability to remain highly focused and self-assured in fast-paced, high-pressure environments.
  • Extensive ETL tool experience with IBM InfoSphere/WebSphere DataStage and Ascential DataStage. Worked with DataStage client tools such as DataStage Designer, DataStage Director, and DataStage Administrator.
  • Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Worked extensively with data profiling, dimensional modeling, data migration, data cleansing, and ETL processes for data warehouses.
  • Configured Environment variables using Datastage Administrator.
  • Experience in using QualityStage for data cleansing.
  • Used Enterprise Edition/Parallel stages such as Dataset, Sort, Join, Lookup, Change Data Capture, Funnel, Row Generator, and many other stages to accomplish the ETL coding.
  • Worked with various databases like Oracle 10g/9i/8i, DB2, SQL Server, Teradata, MS Access.
  • Familiar with building highly scalable parallel processing infrastructure using parallel jobs and multiple-node configuration files.
  • Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning, and enhancements.
  • Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, User Acceptance Testing and Performance Testing.
  • Involved in providing support in QA, UAT and prod environments.
  • Knowledge of Erwin as a leading data modeling tool for logical data models (LDM) and physical data models (PDM).
  • Extensive experience in design and development of Decision Support Systems (DSS).
  • Assisted in development efforts for Data marts and Reporting.
  • Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
  • Knowledge of reporting tools such as Cognos.
  • Knowledge of PL/SQL for writing stored procedures, functions, and triggers.

TECHNICAL SKILLS:

ETL Tools: IBM InfoSphere DataStage 8.5, IBM WebSphere Information Server 8.0.1 (Designer, Director, Administrator), Ascential DataStage 7.5 (Designer, Director, Administrator, Manager), Parallel Extender, MetaStage & QualityStage

Databases: Oracle 10g/9i/8i, Teradata, SQL Server 2003/2005/2008, IBM DB2, MS Access

Data Warehousing: Star & Snow-Flake schema Modeling, Fact and Dimensions, Conceptual, Physical and Logical Data Modeling

Data Modeling Tools: Erwin r8.1, Business Objects XI R2, MS Visio

Reporting Tools: SQL Server SSIS/SSRS, BO 6.5/5.1, Cognos 8/7/6.0, Crystal Reports

Operating Systems: Windows 7/NT/XP, UNIX, Linux, MS-DOS

Languages/Scripting: C, C++, Java, Visual Basic, PL/SQL, UNIX shell scripts, Perl

Debugging Tools: TOAD, SQL Navigator, WinSQL, Charles

Scheduling Tools: DataStage Director, Autosys, Control-M, crontab

PROFESSIONAL EXPERIENCE:

Confidential, Warren, MI

ETL/Technical Lead

Responsibilities:

  • Interacted with Business Users and Managers in gathering business requirements.
  • Prepared documents for every phase of the project to GM standards and uploaded all the documents, phase-wise, to the configuration management tool (ClearCase).
  • Calculated the capacity requirements for this project.
  • Used Datastage Designer, Director, Manager and Administrator to develop Parallel, Server and Sequence Jobs as per business rules.
  • Developed and modified jobs according to the business logic.
  • Created environment variables and job parameters.
  • Created reprocess logic for any failed jobs.
  • Handled coding and debugging, resolving technical problems as they arose.
  • Analyzed the types of business tools and transformations to be applied.
  • Tuned performance of DataStage Jobs for large data files and Tables by Creating Indexes, increasing scratch space, configuring multi-Nodes.
  • Involved in writing the Unit Test Cases using SQL.
  • Involved in creating various change control forms to promote the code from Dev to QA and to Production.
  • Developed UNIX shell scripts to send out an e-mail on success of the process, indicating the destination folder where the files are available (a sample notification script is sketched after this list).
  • Performed end-to-end unit testing of all the mappings and participated in UAT.
  • Involved in managing deliverables from Offshore.
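
For illustration, a minimal sketch of the kind of success-notification script referenced above; the job name, destination folder, and recipient address are placeholders rather than details from the actual project:

    #!/bin/sh
    # notify_success.sh - mail a success notice once a load finishes
    # (illustrative sketch; names and paths are hypothetical)
    JOB_NAME="$1"                              # e.g. seq_load_customer
    DEST_DIR="/data/outbound/${JOB_NAME}"      # folder where the output files land
    RECIPIENTS="etl-support@example.com"

    FILE_COUNT=$(ls -1 "${DEST_DIR}" 2>/dev/null | wc -l)

    printf "Job %s finished successfully on %s at %s.\n%s file(s) are available in %s.\n" \
        "${JOB_NAME}" "$(hostname)" "$(date)" "${FILE_COUNT}" "${DEST_DIR}" |
        mailx -s "SUCCESS: ${JOB_NAME} completed" "${RECIPIENTS}"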

Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, Mainframe, SQL Server, Windows, UNIX, MS Visio, ASCII, flat files, PuTTY, WinSCP, Autosys

Confidential, Raleigh, NC

ETL/Sr. Datastage Developer

Responsibilities:

  • Analyzed, designed, developed, implemented, and maintained parallel jobs using IBM InfoSphere DataStage 8.5.
  • Involved in the design of the dimensional data model - star schema and snowflake schema.
  • Generated database scripts from the data modeling tool and created the physical tables in the database.
  • Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
  • Deployed different partitioning methods like Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and for performance boost.
  • Repartitioned job flows to make the best use of the available DataStage PX resources.
  • Experienced in PX file stages that include Complex Flat File stage, DataSet stage, LookUp File Stage, Sequential file stage.
  • Implemented Shared container for multiple jobs and Local containers for same job as per requirements.
  • Adept at mapping source data to target data using IBM DataStage 8.x.
  • Implemented multi-node declaration using configuration files (APT Config file) for performance enhancement.
  • Experienced in developing parallel jobs using various Development/debug stages (Peek stage, Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicate Stage)
  • Worked within a team to populate Type I and Type II slowly changing dimension tables from several operational source files
  • Created some routines (Before-After, Transform function) used across the project.
  • Involved in creating UNIX shell scripts for database connectivity and for executing queries during parallel job execution.
  • Used DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.
  • Debug, test and fix the transformation logic applied in the parallel jobs
  • Experienced in using SQL*Loader and the import utility in TOAD to populate tables in the data warehouse (a sample SQL*Loader invocation is sketched after this list).
  • Coordinated with other team members and participated in or organized daily meetings to discuss development progress.
  • Created, implemented, modified, and maintained simple to complex business reports using the Business Objects reporting module (Cognos).
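
For illustration, a minimal sketch of a shell wrapper around the SQL*Loader step mentioned above; the connect string, control file, and table name are placeholders, and the control file is assumed to already define the flat-file layout:

    #!/bin/sh
    # load_stg_customer.sh - bulk-load a pipe-delimited flat file into a staging
    # table with SQL*Loader (illustrative sketch; all names are hypothetical)

    # stg_customer.ctl is assumed to hold the usual LOAD DATA ... INTO TABLE
    # stg_customer ... FIELDS TERMINATED BY '|' definition for the source file.
    sqlldr userid=etl_user/"${ETL_PWD}"@DWPROD \
           control=stg_customer.ctl \
           log=stg_customer.log \
           bad=stg_customer.bad \
           errors=50

    if [ $? -ne 0 ]; then
        echo "SQL*Loader reported errors - check stg_customer.log" >&2
        exit 1
    fi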

Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, Teradata, SQL Server, SOA, Windows, UNIX, MS Visio, XML, flat files, TOAD, Cognos, Autosys

Confidential, Greensboro, NC

Sr. Datastage Developer

Responsibilities:

  • Designed the ETL jobs using IBM WebSphere DataStage 8.0.1 to extract, transform, and load the data into staging and then into the target database.
  • Extensively used the designer to develop various parallel jobs to extract, transform, integrate and load the data into Corporate Data warehouse (CDW).
  • Designed and developed the ETL jobs using Parallel edition which distributed the incoming data concurrently across all the processors, to achieve the best performance.
  • Designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Pivot, Sort, Surrogate Key Generator, Change Data Capture (CDC), Modify, Row Generator, and Aggregator.
  • Handled Performance Tuning of Jobs to ensure faster Data Loads
  • Created Master controlling sequencer jobs using DataStage Job Sequencer.
  • Extensively worked with Shared Containers for Re-using the Business functionality.
  • Extensively developed and deployed UNIX shell scripts as wrappers that supply values to DataStage jobs at runtime (a sample wrapper is sketched after this list).
  • Created Job Parameters and Environment variables to run the same job for different sources and targets.
  • Used Director to monitor, run, and validate jobs and their components.
  • Involved in performance tuning of the DataStage jobs using different partitioning methods and node configurations in the environment configuration file, designing and editing configurations to increase read and write speed when fetching from or loading to files and databases.
  • Provided data models and dataflow (extract, transform and load analysis) of the data marts and feeder/target systems in the aggregation effort.
  • Migrated projects from development to QA to Production environments
  • Performed the Integration and System testing on the ETL processes.
  • Took regular backups of the jobs using the DataStage export/import utility.
  • Worked with the BI team to apply the business rules for OLAP and to design the framework models.
  • Assisted the operations support team with transactional data loads by developing SQL and UNIX scripts.
  • Scheduled jobs using Autosys Job scheduler utility based on the requirements and monitored the production closely for any possible errors.
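
For illustration, a minimal sketch of the kind of runtime wrapper referenced above, assuming the standard dsjob command-line interface that ships with DataStage; the install path, project, job, and parameter names are placeholders:

    #!/bin/sh
    # run_dsjob.sh - wrapper that supplies runtime values to a DataStage job
    # (illustrative sketch; paths and names are hypothetical)
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv   # assumed engine environment file

    PROJECT="CDW_PROD"
    JOB="px_load_sales"
    RUN_DATE=$(date +%Y%m%d)

    dsjob -run -wait \
          -param SRC_DIR=/data/inbound \
          -param RUN_DATE="${RUN_DATE}" \
          "${PROJECT}" "${JOB}"
    RC=$?

    dsjob -jobinfo "${PROJECT}" "${JOB}"   # print the final job status for the log
    exit ${RC}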

Environment: IBM WebSphere DataStage 8.1, Windows, UNIX, IBM AIX 5.2, Teradata, SQL Server 2005, flat files, TOAD, PuTTY, WinSCP, Autosys, HP QC 10.

Confidential, Newark, NJ

ETL Developer

Responsibilities:

  • Worked with the business analysts and the DBAs on requirements gathering, analysis, testing, metrics, and project coordination.
  • Successfully handled the slowly changing dimensions.
  • Involved in the Dimensional modeling of the Data warehouse.
  • Developed documents like Source to Target mapping for developing the ETL jobs.
  • Worked with DataStage server stages like OCI, ODBC, Transformer, Hash file, Sequential file, Aggregator, Sort, Merge, and other stages.
  • Imported the required Metadata from heterogeneous sources at the project level.
  • Involved in designing various jobs using PX.
  • Developed Parallel jobs using Parallel stages like Merge, Join, Lookup, Transformer (Parallel), Teradata Enterprise Stage, Funnel, Dataset, Oracle Enterprise Stage.
  • Performed debugging on these jobs using Peek stage by outputting the data to Job Log or a stage.
  • Used Remove Duplicates stage to remove the duplicates in the data.
  • Involved in the migration of DataStage jobs from Development to Production environment.
  • Designed and implemented several UNIX shell script wrappers to execute the DataStage jobs and create job reports from the DataStage job execution results.
  • Designed and implemented wrappers to execute the DataStage jobs from remote servers (a remote-execution sketch follows this list).
  • Worked on database connections, SQL joins, views, aggregate conditions, parsing of objects and hierarchies.
  • Tuned SQL queries for better performance for processing business logic in the database.
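
For illustration, a minimal sketch of the kind of remote-execution wrapper referenced above, assuming ssh access to the ETL server and the standard dsjob command line; the host, dsenv path, project, and job names are placeholders:

    #!/bin/sh
    # run_remote_dsjob.sh - trigger a DataStage job on a remote ETL server and
    # capture a simple run report (illustrative sketch; all names are hypothetical)
    ETL_HOST="etl-server01"
    PROJECT="DW_PROJECT"
    JOB="px_load_policy"
    REPORT="${JOB}_$(date +%Y%m%d).rpt"

    ssh "dsadm@${ETL_HOST}" \
        ". /opt/datastage/DSEngine/dsenv && dsjob -run -wait ${PROJECT} ${JOB} && dsjob -logsum ${PROJECT} ${JOB}" \
        > "${REPORT}" 2>&1
    RC=$?

    echo "Remote run of ${JOB} finished with status ${RC}; report written to ${REPORT}"
    exit ${RC}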

Environment: DataStage 7.5 (Designer, Manager, Director, Administrator), Oracle 9i, TOAD, SQL/PL-SQL, Teradata, Erwin 4.0, UNIX, IBM AIX.

Confidential, Plano, TX

ETL Developer/Tester

Responsibilities:

  • Involved with business users and ETL leads from different teams to implement an ETL framework using a combination of server and parallel jobs.
  • Extracted data from various sources like DB2 UDB, Flat Files and loaded into a Corporate Data warehouse.
  • Upgraded from Ascential DataStage 7.5 to IBM WebSphere DataStage 8.0.1.
  • Implemented various strategies for Slowly Changing Dimensions in server/parallel jobs using the framework approach.
  • Designed jobs using different parallel job stages such as Join, Merge, Lookup, Filter, Dataset, Lookup File Set, Remove Duplicates, Change Data Capture, Switch, Modify and Aggregator.
  • Involved in developing server and parallel jobs in Designer for extracting, cleansing, transforming, and integrating/loading data into the Corporate Data Warehouse (CDW).
  • Developed Job Sequences with job restart capability for the designed jobs using Job Activity, Exec Command, E-Mail Notification Activities and Triggers.
  • Extensively designed, developed and implemented Parallel Extender jobs using Parallel Processing (Pipeline and partition parallelism) techniques to improve job performance while working with bulk data sources.
  • Extensively used Director to monitor jobs, check run statistics, and clear job logs and job resources.
  • Extensively used Manager to export/import components.
  • Involved in Unit Testing of the jobs.
  • Created UNIX shell scripts for scheduling various batch jobs (a cron-based scheduling sketch follows this list).
  • Used configuration management tools like ClearCase/ClearQuest for version control and migration.
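
For illustration, sample crontab entries of the kind used to schedule the batch-job wrapper scripts mentioned above; the script paths, log locations, and run times are placeholders:

    # m  h  dom mon dow  command
    30   1  *   *   *    /home/dsadm/scripts/run_daily_batch.sh  >> /home/dsadm/logs/daily_batch.log  2>&1
    0    3  *   *   0    /home/dsadm/scripts/run_weekly_batch.sh >> /home/dsadm/logs/weekly_batch.log 2>&1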

Environment: IBM WebSphere DataStage 8.x, Ascential DataStage 7.5 (Director, Designer, Manager, Administrator), Ascential DataStage 7.0, DB2 UDB 8.1.6 (EEE), SQL Server, DB2, UNIX shell scripting, ClearCase/ClearQuest, IBM AIX.

Confidential

DW/ETL Tester

Responsibilities:

  • Designed and created various Test Plans, Test Cases based on the Business requirements.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for Querying the DB2 database in UNIX environment
  • Performed execution of test cases manually to verify the expected results.
  • Used Mercury Quality Center 9.0 to record requirements, business components, test cases, test runs for every iteration, and defects, and to link defects with requirements.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Worked with other members of the QA and Development teams and offshore team (INDIA) in improving the processes, tools, methods, effectiveness and efficiency
  • Used TOAD for Debugging DB issues.
  • Participated in walkthrough and defect report meetings periodically.
  • Involved in Unit, Functional, Regression and System testing.
  • Documented and reported various bugs during Manual Testing.
  • The data warehouse stored data about customers and accounts; the loads were scheduled (daily and weekly) depending on the frequency of the source data.
  • Wrote SQL queries to define, identify, and validate the code written for the data movement into the database tables (a sample validation script follows this list).
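
For illustration, a minimal sketch of the kind of back-end validation described above, run from a UNIX shell against DB2; the database, schema, table, and column names are placeholders:

    #!/bin/sh
    # validate_load.sh - compare staging and target row counts after a load
    # (illustrative sketch; all object names are hypothetical)
    db2 connect to DWDB user etl_test using "${DB_PWD}" > /dev/null

    SRC_CNT=$(db2 -x "SELECT COUNT(*) FROM STG.CUSTOMER" | awk 'NR==1 {print $1}')
    TGT_CNT=$(db2 -x "SELECT COUNT(*) FROM DW.DIM_CUSTOMER WHERE CURRENT_FLAG = 'Y'" | awk 'NR==1 {print $1}')

    if [ "${SRC_CNT}" -eq "${TGT_CNT}" ]; then
        echo "PASS: source count (${SRC_CNT}) matches target count (${TGT_CNT})"
    else
        echo "FAIL: source count (${SRC_CNT}) does not match target count (${TGT_CNT})"
    fi

    db2 connect reset > /dev/null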

Environment: ETL (DataStage), UNIX, DB2, Oracle 10g, Autosys, Windows XP, TOAD, PL/SQL, HP QC 9.0
