
Senior ETL Consultant Resume


Oakland, CA

PROFESSIONAL SUMMARY:

  • Over 9 years of experience in Information Management, spanning end-to-end Data Warehousing, Business Intelligence, Data Integration, and Data Migration. Developed Data Warehouse roadmaps, strategy, and architecture, along with enterprise data warehouses, ODS, dimensional data marts, and end-user reports for the Healthcare, Manufacturing, Retail, Pharmaceutical, Financial, and Insurance industries.
  • Over 8 years of experience designing ETL logic using IBM InfoSphere DataStage 9.1/8.7/8.5/8.0.1/7.5.2/6.0 (Administrator, Designer, Director, and Manager), covering both Server and Parallel Extender/Orchestrate editions and Multi-Client Manager (7.5.2 to 8.7); also Informatica PowerCenter 9.0/8.5/7.2.
  • Data cleansing experience (deduplication, relationship discovery, address validation, identification, standardization, matching, and reconciliation) using IBM DataStage and QualityStage.
  • Data profiling experience with IBM Information Analyzer 8.0.1/ProfileStage (validating data values, column/table relationships, and source-to-target field mappings; source system profiling and analysis) in collaboration with SMEs and data modelers.
  • IBM WBI Message Broker (MQ, JMS, AS1, AS2, XML, SOAP, WSDL), IBM WebSphere TX, B2B Gateway, Agent to Agent.
  • Worked with Business System Analysts, Designers, and Developers to adopt processes based on Agile/Scrum development techniques.

TECHNICAL SKILLS:

ETL: IBM InfoSphere DataStage 9.1/8.7/8.5 (Server/PX)

IBM: QualityStage 8.1, IBM Information Analyzer 8.1, Informatica PowerCenter 9.0/8.5/7.2, Pentaho 4.2

OLAP: Cognos 8.1, Business Objects XI R2, SAS 9, OBIEE

Data Modeling: Erwin 7/4.5, MS Visio, PowerDesigner 16

Database: Oracle 10g, DB2 UDB, SQL Server, Teradata, Netezza

Languages: C, Java, SQL, PL/SQL, XML

Operating Systems: AIX 5.3, Solaris 10, Linux, Windows XP, Windows 7

Scheduling: Autosys, UC4, Tivoli

Knowledge of: Siebel 8.0 (SFA), SAP CRM 5.0 (Service Agreements), Oracle EBS 11.5.1 TCA (Order Management, Finance)

SAP R/3: ABAP 4.7(IDOC, BAPI), BW 3.5

PROFESSIONAL EXPERIENCE:

Confidential, Oakland, CA

Senior ETL Consultant

Responsibilities:

  • Understood the Sybase scripts and prepared the mapping documents.
  • Developed parallel jobs to extract data from Sybase IQ and load it into DB2.
  • Developed parallel jobs to extract data from mainframe-generated .dat files and load it into the DB2 database.
  • Validated the target DB2 data against the Sybase IQ database and the mainframe data.
  • Prepared the mapping documents using IRAP-provided business requirements.
  • Developed parallel jobs to extract the source mainframe files and load them into stage tables.
  • Developed parallel jobs to read data from the stage tables and apply edit rules, collecting error records into an error table.
  • Developed a parallel job to read the rejected error dataset, apply edit rules, look up against the error reference table, and load into the element table.
  • Developed a wrapper script to execute sequence jobs from the command prompt.
  • Developed parallel jobs to load data into the dimension and fact tables.
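The command-prompt wrapper mentioned above can be sketched as follows. This is a minimal sketch, not the production script: the project, job, and parameter names are placeholders, and the dsjob invocation is commented out because it needs a host with the DataStage engine installed.

```shell
#!/bin/sh
# Hypothetical wrapper around the DataStage CLI; names below are placeholders.
PROJECT="DW_PROJECT"
SEQ_JOB="Seq_Load_DimFact"
PROC_DATE=$(date +%Y%m%d)
DSJOB="${DSJOB:-dsjob}"          # real path is usually $DSHOME/bin/dsjob

# -run launches the job, -jobstatus waits and reflects the finish status
CMD="$DSJOB -run -jobstatus -param ProcessDate=$PROC_DATE $PROJECT $SEQ_JOB"
echo "launching: $CMD"
# eval "$CMD"                    # uncomment on a host with the DataStage engine
# RC=$?; [ $RC -le 2 ] || { echo "job failed with status $RC" >&2; exit 1; }
```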

Environment: IBM Information Server 8.7 (DataStage, QualityStage), IBM Cognos BI, Linux, Sybase IQ, DB2, DbVisualizer & TOAD, Control-M.

Confidential, Peoria, IL

Lead ETL Consultant

Responsibilities:

  • Designed an ETL solution using IBM InfoSphere DataStage to build a Central HR Data Repository and to build Employee Education, Certification, and Work Experience interfaces for the HR TALEO Talent Management System.
  • Involved in design-phase meetings with customers and technical leads of various data source systems to understand the data flow and business processes behind the Central HR Data Repository, mainly comprising Payroll, Compensation, and Time and Labor.
  • Worked with the client to analyze new requirements; provided application data and reports to the client for new-requirements research.
  • Designed DataStage parallel jobs to extract data from PeopleSoft 8.9/9.2 delivered and custom tables and load it into the data repository.
  • Designed DataStage parallel jobs to extract data from PeopleSoft 9.2 tables and apply transformation rules to generate files for the Taleo application.
  • Retrofitted the PeopleSoft EPM delivered server jobs to the PeopleSoft 9.2 tables and loaded data into target DB2 UDB database tables to support the Cognos reporting team.
  • Converted server jobs to parallel jobs per client requirements.
  • Applied Unicode handling while generating files for the Cat Japan payroll interfaces.
  • Designed a generic parallel job, driven by a loop sequence job, to extract files generated by Cat Brazil, Cat Japan, and other Cat group companies and load them into data repository tables.
  • Converted PeopleSoft Application Engine programs into DataStage parallel jobs.
  • Extensively used environment variables in jobs to pass default values while loading; created parameter sets to group related parameters and values files to store multiple values for each parameter.
  • Worked with the onsite CAT team to define and configure DataStage jobs in the TIDAL Enterprise Scheduling System to automate the HR Data Repository and TALEO integration load process.
  • Supported off-shore team members in development activities, fixing defects and resolving issues.
  • Participated in weekly status meetings, and conducted internal and external reviews as well as formal walkthroughs among the team, documenting the proceedings.
  • Involved in weekly project plan meetings to track progress and reach milestones.
  • Analyzed requirements and prepared the High Level Design Document (HLDD), Internal Design Document (IDD), and ETL mapping documents.
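The file pickup behind the looping load described above can be sketched in shell. This is illustrative only: directory and file names are invented, and the echo stands in for the call to the generic DataStage parallel job.

```shell
#!/bin/sh
# Each group company drops extract files into a landing directory; the loop
# hands one file per iteration to the load step. Paths/names are illustrative.
LANDING="${LANDING:-/tmp/hr_landing_demo}"
mkdir -p "$LANDING"
touch "$LANDING/cat_brazil_emp.dat" "$LANDING/cat_japan_emp.dat"   # demo files

for f in "$LANDING"/*.dat; do
  # in production this step invoked the generic DataStage parallel job
  echo "processing $(basename "$f")"
done
```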

Environment: IBM Information Server 8.5 (DataStage, QualityStage), IBM Cognos BI Framework Manager/Reports, AIX 5.3, Oracle 11g, TWS, PeopleSoft 8.9/9.2, DB2 UDB, Remedy, TIDAL.

Confidential, Fremont, CA

Senior ETL Consultant

Responsibilities:

  • Analyzed the existing credit view application Oracle scripts (PL/SQL ELT loads).
  • Prepared ETL data flow, technical documents, and mapping documents.
  • Designed DataStage jobs to read the source data feeds (.dat files) and load them into source tables.
  • Designed DataStage jobs to apply duplicate-detection logic, flag and insert duplicate records into a staging table, and create a duplicates sequential file.
  • Applied change data capture on the non-duplicate source data against the target staging table.
  • Created sequence jobs that include batch control, wait-for-file activity, and sending reject files to business users, using DataStage job sequence activity stages.
  • Applied DataStage best practices using parallel methods for DML operations to avoid deadlocks on target tables.
  • Designed DataStage jobs using the Lookup, Sort, Funnel, and Aggregator stages to combine individual SOR data, apply aggregation logic, and load into the SOR table.
  • Designed an ETL restartability framework that supports executing jobs from the point of failure and dynamically recovering data.
  • Designed reusable DataStage components such as routines, shared containers, and generic jobs by enabling RCP.
  • Applied database performance optimizations, including creating and dropping indexes to improve throughput.
  • Applied partition exchange to swap data between partitioned tables.
  • Designed a DataStage job to generate an ETL job status report for each category.
  • Prepared unit test cases to validate the data at each step of the ETL logic.
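The duplicate-split step above can be illustrated on a flat file. This is a minimal sketch of the idea, not the DataStage job itself: the feed layout and file names are made up, and the first occurrence of each key goes to the clean feed while repeats go to a duplicates file.

```shell
#!/bin/sh
# Sample feed: key|name, with one repeated key (A001).
FEED=/tmp/credit_feed.dat
printf 'A001|Smith\nA002|Jones\nA001|Smith\n' > "$FEED"

awk -F'|' '
  seen[$1]++ { print > "/tmp/credit_dups.dat"; next }   # repeat of a key
             { print > "/tmp/credit_clean.dat" }        # first occurrence
' "$FEED"
```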

Environment: IBM Information Server 8.1 (DataStage, QualityStage), AIX 5.3, Oracle 11g, Tortoise SVN, Autosys.

Confidential, Sacramento, CA

Senior ETL Consultant

Responsibilities:

  • Converted functional requirements into technical specifications.
  • Designed the ETL framework using DataStage.
  • Designed DataStage jobs to apply ETL logic and load ODS data into staging tables.
  • Used the XML stage to read XML document data coming from iCapture.
  • Used the Web Services Transformer stage to perform lookups against web services.
  • Performed loading of messages into message queues.
  • Designed DataStage jobs to capture DB2 replication data using IBM InfoSphere Change Data Capture.
  • Designed DataStage jobs using the CDC Transaction stage to insert and update replicated data into dimension, fact, and data mart tables.
  • Acted as a technical resource and lead to the Application Development Team and Data Team.
  • Designed QualityStage jobs using the Investigate, Standardize, and Match stages, applying patterns to maintain party details such as business and individual taxpayer information.
  • Designed and developed address cleansing and standardization of foreign addresses.
  • Designed a QualityStage job for the address-cleansing web service using the QA-AVI 10 plugin.
  • Captured the most recent taxpayer address information into a reference table.
  • Designed DataStage jobs that feed data to IBM Initiate MDM using the Web Service Pack.
  • Used the DB2 Connector stage to load the daily refresh data with the bulk load write method.
  • Responsible for performance tuning of the ETL process and updating the ETL best practices document.
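The web-service lookups above follow the usual SOAP request/response shape. The sketch below is illustrative only: the envelope element names and the commented-out endpoint are invented placeholders, not the real address-cleansing service contract.

```shell
#!/bin/sh
# Build a SOAP request body of the kind a Web Services Transformer lookup
# sends; element names and endpoint are hypothetical.
REQ=$(cat <<'EOF'
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <cleanseAddress>
      <line1>123 Main St</line1>
      <city>Sacramento</city>
      <state>CA</state>
    </cleanseAddress>
  </soapenv:Body>
</soapenv:Envelope>
EOF
)
echo "$REQ"
# curl -s -X POST -H 'Content-Type: text/xml' --data "$REQ" "$ENDPOINT"
```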

Environment: IBM Information Server 8.7 (DataStage, QualityStage, Metadata Workbench, FastTrack, Business Glossary), IBM InfoSphere Change Data Capture 6.5.2, IBM Initiate MDM 9.5, Pega Rules 6, SAP BO 3.5, PowerDesigner 16, AIX 5.3, DB2 9.7, ClearCase, ClearQuest.

Confidential, Franklin Lakes, NJ

Lead ETL Consultant

Responsibilities:

  • Worked with the SAP ABAP team to fetch the legacy data files.
  • Created an Oracle stored procedure to compare old SAP data against new ECC data.
  • Followed database and DataStage optimization best practices.
  • Used Oracle tuning utilities such as autotrace and explain plan to optimize data extracts and avoid database timeouts.
  • Designed extract jobs using the SAP PACKs (ABAP Extract, IDOC Extract, and BAPI).
  • Designed DataStage ETL jobs applying the business rules provided by the SAP process team.
  • Designed QualityStage jobs for the de-duplication of SAP partners and vendors.
  • Designed jobs for loading data into legacy tables and LSMW tables.
  • Modified the Oracle procedures that support the Cognos error and data validation reports.
  • Used the Investigate, Standardize, Match, and Survive stages in QualityStage to harmonize and align the data to create a single view of the Business Partner and Vendor system.
  • Used domain preprocessor and domain-specific rule sets such as USPREP, USNAME, USADDR, and USAREA, along with custom rule sets in QualityStage, to standardize, match, dedupe, and survive Business Partner and Vendor Master data.
  • Designed sequence jobs for the SFTP process that delivers the LSMW file.
  • Supported the SFTP jobs for manual files and golden files.
  • Designed delta jobs to fetch data from staging and load it into legacy tables.
  • Exported .dsx files to move developed code into SVN.
  • Worked with the off-shore team as part of the development process.
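The scripted SFTP push behind the LSMW delivery can be sketched as a batch-driven sftp call. Host, user, and paths below are placeholders; the sftp invocation itself is commented out since it needs configured SSH keys.

```shell
#!/bin/sh
# Build an sftp batch file for pushing the LSMW load file (paths illustrative).
BATCH=/tmp/lsmw_sftp.batch
cat > "$BATCH" <<'EOF'
put /data/out/lsmw_partners.txt /inbound/lsmw_partners.txt
bye
EOF
echo "batch file ready: $BATCH"
# sftp -b "$BATCH" etluser@sap-host    # run where SSH keys are configured
```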

Environment: IBM Information Server 8.1 (DataStage PX, QualityStage, Information Analyzer, SAP PACKs), SAP R/3 4.0b/4.7, ECC 6.0, Oracle 10g, Cognos 8.4, Sun Solaris 10, Tortoise SVN, UC4.

Confidential, Trenton, NJ

Senior ETL Consultant

Responsibilities:

  • Analyzed the source data files coming from mainframes.
  • Understood the business and technical documents of the project.
  • Prepared column-level ETL mappings from various source systems to the target database system, applying transformation logic while building the mapping documents.
  • Designed DataStage jobs to extract data from ODS systems, load it into staging, and later write it into dimension tables.
  • Designed daily incremental DataStage jobs to compare old and new records using a CDC/delta process.
  • Wrote custom before/after and transformation routines.
  • Designed sequence jobs and set up mail notifications at the sequence or individual job level for reject handling.
  • Improved ETL job performance through query optimization and job tuning.
  • Designed reusable shared containers at the project level.
  • Prepared unit test cases and validated the data based on business rules.
  • Imported and exported developed and executable ETL job components to maintain production backups and version control.
  • Modified COBOL File Definition PIC clauses and data types to read the EBCDIC files.
  • Prepared the source-to-target mapping document, including the transformation rules.
  • Designed ETL jobs to read mainframe files and write into staging and dimension tables.
  • Followed best practices and naming standards while designing DataStage jobs.
  • Designed and developed ETL jobs using DataStage 8.5 for new projects and DataStage 7.5 for existing projects.
  • Designed the audit, error log, and control tables to maintain job statistics for users.
  • Prepared the unit test document by validating the data in the tables.
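The EBCDIC handling above can be illustrated at the byte level with dd, whose conv=ascii option translates EBCDIC to ASCII. This is only a sketch: the real jobs read full COBOL FD layouts, while the three bytes below are just EBCDIC 'A', 'B', 'C'.

```shell
#!/bin/sh
# Write three EBCDIC bytes (0xC1 0xC2 0xC3 = A B C) and translate to ASCII.
printf '\301\302\303' > /tmp/sample_ebcdic.bin
dd if=/tmp/sample_ebcdic.bin of=/tmp/sample_ascii.txt conv=ascii 2>/dev/null
cat /tmp/sample_ascii.txt    # prints ABC
```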

Environment: IBM Information Server 8.5/7.5 (DataStage, QualityStage & Information Analyzer), Business Objects XI R3, Oracle 10g, AIX 5.3, Focus, Adabas, Erwin, and TOAD.

Confidential, St. Paul, MN

EPM Developer

Responsibilities:

  • Prepared source-to-target data mappings for ETL design and development.
  • Developed DataStage ETL jobs and Data Loader definitions in the Enterprise Warehouse based on the client's requirements.
  • Experienced in database programming for data warehouses (schemas); proficient in dimensional modeling (star schema and snowflake modeling).
  • Followed the ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices as per the existing Confidential .
  • Reconfigured and set up the DataStage ETL jobs in the ODS layer of the Enterprise Warehouse.
  • Created multiple documents covering requirements, scope, and customization analysis; customized the DataStage jobs according to client business needs.
  • Identified bottlenecks in the DataStage ETL and Data Loader processes and tuned as necessary, decreasing processing time.
  • Prepared the schedule process for all the individual sequence jobs.
  • Applied CRC logic (hash files) to compare transaction data coming from PeopleSoft source systems into the OWS environment.
  • Validated the data and record counts in the OWS, OWE, and MDW tables.
  • Prepared documentation by validating the data in the error tables.
  • Set up and configured the Enterprise Warehouse, including PF Business Units and SETIDs.
  • Created a reconciliation process (SQL scripts, Word documents, etc.) to verify the accuracy of the results.
  • Created many deliverables, including the Technical Project Plan, Customization Effort Level Document, ETL Processes Definition, Enterprise Warehouse Outline, End-to-End Documentation, Setting up Source Environments, Setting up a New EPM Environment, Test Scripts, and SQL Scripts.
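The old-vs-new compare behind the CRC/hash-file step can be illustrated on flat files: rows that are new or changed in the latest extract form the delta to load. Table names and values below are demo data; the production job used DataStage hash files rather than sorted text files.

```shell
#!/bin/sh
# Two extracts of the same table: E2 changed, E3 is new in the second one.
printf 'E1|Smith|100\nE2|Jones|200\n'             > /tmp/ows_old.txt
printf 'E1|Smith|100\nE2|Jones|250\nE3|Lee|300\n' > /tmp/ows_new.txt
sort /tmp/ows_old.txt > /tmp/ows_old.sorted
sort /tmp/ows_new.txt > /tmp/ows_new.sorted
# comm -13 keeps lines found only in the new extract (inserts/updates)
comm -13 /tmp/ows_old.sorted /tmp/ows_new.sorted > /tmp/ows_delta.txt
cat /tmp/ows_delta.txt
```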

Environment: IBM Information Server 8.1 (DataStage, Metadata Workbench, Business Glossary), PeopleSoft EPM 9.1, PeopleTools 8.5.11, Oracle 11g, OBIEE 10.3

Confidential, Westborough, MA

ETL Developer

Responsibilities:

  • Worked with the data modeler and database administrator to implement database changes.
  • Mentored developers by introducing best practices to reduce design complexity and implement the best parallelism methods in DataStage PX jobs.
  • Designed and developed new ETL jobs and modified existing jobs per the new process.
  • Designed DataStage PX jobs that extract, integrate, aggregate, transform, and load data into the data warehouse or data mart.
  • Designed the SCM data mart dimension and fact table loads for data coming from JDA.
  • Designed jobs using the OCI/Oracle EE, ODBC Enterprise, Lookup, Change Capture, Sort, Funnel, Transformer, Peek, Head, and Tail stages.
  • Performed data cleansing and data manipulation activities using the NZSQL utility.
  • Developed UNIX shell scripts in conjunction with the NZSQL/NZLOAD utilities to load data from flat files into the Netezza database.
  • Loaded data into Netezza from legacy systems and flat files using complex UNIX scripts.
  • Implemented slowly changing dimension SCD Type-1 and Type-2 mappings to update slowly changing dimension tables.
  • Created tables and indexes and modified the aggregate tables as per requirements.
  • Prepared UNIT and SIT test cases based on the designed and modified jobs.
  • Modified the incremental sequencer to support the modified jobs.
  • Worked with metadata definitions and the import and export of DataStage jobs.
  • Used the FTP plug-in to get mainframe data and load it into DB2 tables.
  • Wrote shell scripts for the file watcher and file archiving processes.
  • Worked with the TJX Canadian team as part of production support.
  • Defined the backup and recovery process for DataStage projects.
  • Extensively developed UNIX shell scripts for data manipulation.
  • Developed scheduling charts and scheduled shell scripts, DataStage ETL jobs, and Cognos reports using ESP.
  • Moved developed DataStage code into AllFusion Harvest version control.
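The file-watcher and archiving scripts mentioned above can be sketched as a polling loop. Directory names are placeholders, and the trigger file is created up front so the demo completes immediately; in production the upstream system drops it.

```shell
#!/bin/sh
# Poll for a trigger file, then archive it with a timestamp suffix.
WATCH="${WATCH:-/tmp/fw_demo}"; ARCHIVE="$WATCH/archive"
mkdir -p "$WATCH" "$ARCHIVE"
touch "$WATCH/feed.done"            # demo only: upstream drops this file

tries=0
while [ ! -f "$WATCH/feed.done" ] && [ "$tries" -lt 30 ]; do
  sleep 2; tries=$((tries + 1))     # poll with a bounded retry count
done
if [ -f "$WATCH/feed.done" ]; then
  mv "$WATCH/feed.done" "$ARCHIVE/feed.done.$(date +%Y%m%d%H%M%S)"
  echo "trigger archived"
else
  echo "timed out waiting for trigger" >&2
  exit 1
fi
```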

Environment: IBM InfoSphere Information Server 8.1, DB2/UDB 9.1, Netezza 6.0 (TwinFin 24, TwinFin 36), Aginity Workbench 2.1, SQL Server, Sybase, Oracle EBS, JDA 7, Cognos 8.3, AIX 5.3, Hummingbird, CA ESP, Harvest, Linux, Windows XP.

Confidential, Durham, NC

ETL Developer

Responsibilities:

  • Analyzed the existing Confidential and prepared the mapping documents.
  • Designed and developed new ETL jobs and modified existing jobs.
  • Followed the ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices as per the existing Confidential .
  • Designed DataStage PX jobs that extract, integrate, aggregate, transform, and load data into the data warehouse or data mart.
  • Built PL/SQL packages to create daily reports in CSV format and email them to business users using the UTL_FILE and UTL_SMTP built-in Oracle packages.
  • Built PL/SQL packages for daily summarization of sales and customer data, developed in UNIX and PL/SQL.
  • Built PL/SQL packages to automate many of the manual queries used by business users.
  • Created and reused metadata and job components.
  • Designed jobs using the OCI/Oracle EE, Lookup, CDC, Sort, Funnel, Transformer, Peek, Head, and Tail stages.
  • Implemented SCD Type-2 using the SCD and Change Capture stages.
  • Worked with metadata definitions and the import and export of DataStage jobs; implemented security among DataStage users and projects.
  • Created crosscheck UNIX shell scripts for interface files and audit reports on data extracted and loaded; implemented post-execution scripts to reconcile the data.
  • Set up UNIX groups, defined UNIX user profiles, and assigned privileges.
  • Defined the backup and recovery process for DataStage projects.
  • Defined and implemented DataStage job process monitoring.
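The daily-report flow above (write a CSV, mail it to business users) was implemented as a PL/SQL package with UTL_FILE and UTL_SMTP; the shell sketch below only mirrors its shape with made-up data, and the mail step is commented out since mailer setup varies by host.

```shell
#!/bin/sh
# Shell-shaped analogue of the PL/SQL report package; data is demo only.
REPORT=/tmp/daily_sales.csv
{
  echo "region,total_sales"
  echo "East,12500"
  echo "West,9800"
} > "$REPORT"
echo "report written: $REPORT"
# mailx -s "Daily sales report" biz-users@example.com < "$REPORT"  # hypothetical recipients
```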

Environment: IBM DataStage 7.5.2/7.5.3 (PX/MVS), Sun Solaris, Cognos 8.1, Oracle 10g, SQL Server 2005, TOAD, Tortoise CVS 1.8.3, Erwin.

Confidential, Sunnyvale, CA

ETL Developer

Responsibilities:

  • Analyzed the existing Confidential environment and identified the gaps.
  • Performed impact analysis and identified cardinality changes.
  • Converted business logic into technical specifications.
  • Prepared high-level and low-level design documents.
  • Scheduled meetings with upstream and downstream teams.
  • Designed and developed new ETL jobs and modified existing jobs.
  • Followed the ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices as per the existing Confidential .
  • Tuned DataStage jobs at the design level and in custom SQL scripts.
  • Used the Investigate, Standardize, Match, and Survive stages in QualityStage to harmonize and align the data to create a single view of customers; used domain preprocessor and domain-specific rule sets such as USPREP, USNAME, USADDR, and USAREA to standardize the Customer Master and Vendor Master.
  • Standardized, matched, deduped, and survived records using custom rule sets in QualityStage.
  • Used QualityStage to develop jobs that converted variable-length records to fixed-length records, parsed fields into single-domain data fields, identified the most commonly used pattern for each field, selected subsets of records, and standardized the data by converting each field into its most commonly used format.
  • Used the SAP BW Pack's BW Load and BW Open Hub Extract stages to push and pull data through SAP BW InfoPackages and process chains.
  • Extensively used SAP R/3 stages such as IDOC Load, IDOC Extract, ABAP, and BAPI.
  • Customized PL/SQL code as per the rules engine.
  • Used PL/SQL to create packages, functions, and procedures.
  • Worked with different internal teams as well as the offshore team.
  • Created tables and indexes and modified the aggregate tables as per requirements.
  • Prepared unit test cases based on the designed and modified jobs.
  • Modified the incremental sequencer to support the modified jobs.
  • Maintained and assigned defects using HP Quality Center.
  • Implemented security among DataStage users and projects.
  • Set up development, QA, and production environments.
  • Migrated jobs from development to QA to production environments.
  • Involved in preparing FSD documentation.
  • Defined production support methodologies and strategies.
  • Defined the backup and recovery process for DataStage projects.
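The variable-to-fixed-length conversion described above can be illustrated on a flat file with awk. The field widths and sample records are arbitrary demo values; the real conversion ran inside QualityStage jobs.

```shell
#!/bin/sh
# Pad each comma-separated field to a fixed width: 15 + 10 + 2 = 27 bytes/record.
printf 'Smith,John,NY\nDoe,Jane,CA\n' > /tmp/var_len.csv

awk -F',' '{ printf "%-15s%-10s%-2s\n", $1, $2, $3 }' \
  /tmp/var_len.csv > /tmp/fixed_len.dat

awk '{ print length($0) }' /tmp/fixed_len.dat    # every record is 27 bytes
```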

Environment: Windows XP/Sun Solaris, IBM DataStage 7.5.x (Server), IBM DataStage, QualityStage & Information Analyzer 8.0.1, Cognos 8.1, OBIEE, Siebel 8.0, Oracle 11i, Oracle CDH, SAP ECC 6.0, mySAP CRM 5.0, SAP NetWeaver PI 7.0, TIBCO, Oracle 10g, TOAD 8.0, Erwin.

Confidential, Roanoke, VA

ETL Developer

Responsibilities:

  • Designed the ETL jobs based on the DMD with the required tables in the Dev environment.
  • Designed and developed a star schema dimensional model.
  • Developed various jobs using DataStage PX stages: DB2 API/DB2 EE, Lookup, Data Set, Funnel, Remove Duplicates, Change Capture, Change Apply, and ODBC stages.
  • Provided production and customer support for the newly developed data marts and subject areas such as Replenishment Stock and Inventory Reduction.
  • Applied rule sets using QualityStage to maintain customer information.
  • Provided staging solutions for data validation and cleansing with QualityStage and DataStage ETL jobs.
  • Loaded the financial data mart with data from the PeopleSoft GL tables.
  • Worked as a DataStage consultant with PeopleSoft EPM.
  • Developed DataStage ETL jobs and Data Loader definitions as specified by customer requirements.
  • Reconfigured and set up the DataStage ETL jobs to go to the ODS layer.
  • Prepared the schedule process for all the individual sequence jobs.
  • Validated the data and record counts in the OWS, OWE, and MDW tables.
  • Read the supply chain data from the Salesforce application.
  • Tuned DataStage jobs at the source, transformation, and target load levels.
  • Supported the existing jobs in DataStage 7.5.2 using Multi-Client Manager.
  • Extracted data from iSeries DB2 databases, Oracle, and flat files.
  • Implemented slowly changing dimension Type-2 concepts.
  • Performed performance tuning of the DB2 target database using explain plan (access plan).
  • Performed validation testing and unit testing using the existing AS/400 data.
  • Validated and compared the source flat file data using a Perl script on the UNIX box.
  • Scheduled the DataStage batch jobs using UC4.
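The flat-file validation step above compares what was sourced against what was loaded. The sketch below is a row-count reconciliation in shell (the production check was a Perl script); file names and contents are demo values.

```shell
#!/bin/sh
# Compare source-feed row counts against the target extract.
printf 'r1\nr2\nr3\n' > /tmp/src_feed.dat
printf 'r1\nr2\nr3\n' > /tmp/tgt_extract.dat
SRC=$(wc -l < /tmp/src_feed.dat)
TGT=$(wc -l < /tmp/tgt_extract.dat)
if [ "$SRC" -eq "$TGT" ]; then
  echo "counts match: $SRC rows"
else
  echo "count mismatch: src=$SRC tgt=$TGT" >&2
  exit 1
fi
```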

Environment: IBM InfoSphere Information Server 8.0, Crystal Reports, SPSS, Business Objects XI R2, PeopleSoft EPM 8.9/9.1, PeopleTools 8, Oracle 11g, OBIEE 10.3, JDA, Oracle 10g, SQL Server, DB2 UDB 8/9.1, IBM BCU, TOAD, AIX 5.3, Win XP Pro, UC4.
