
Sr. ETL DataStage Developer Resume


Lansing, MI

SUMMARY:

  • 8+ years of strong experience in data warehousing, data integration, data migration, and ETL using IBM InfoSphere DataStage 11.5/11.3/9.1/8.7/8.1/8.0, Oracle Data Integrator (ODI), and Ascential DataStage 7.5, including data cleansing and data analysis.
  • Involved in all phases of the SDLC: requirement gathering, design, development, unit testing, UAT, production roll-out, enhancements, and production support.
  • Designed, Developed and tested jobs to extract data from multiple data sources like Oracle, SQL Server, Flat Files, Sybase, DB2, XML
  • Strong Experience in Dimensional modeling using Star and Snowflake schema. Experience in Data warehouse and RDBMS methodologies.
  • Strong working experience on Data Warehousing applications, directly responsible for the Extraction, Transformation and Loading of data from multiple sources into Data Warehouse.
  • Assisted the data modeler/data warehouse developer in reviewing staging and data warehouse designs.
  • Responsible for all activities related to the development, implementation, administration and support of ETL processes for large-scale data warehouses using Data Stage.
  • Strong in SQL, and in embedding tuned SQL within ETL jobs for better performance.
  • Experience in Unix shell scripting
  • Knowledge in Core Java.
  • Documentation of ETL mappings, processes and batch process.
  • Working knowledge of ETL design and commercial frameworks that implement the data design patterns.
  • Experienced in designing parallel and server jobs using Data Stage 11.X/9.X/8.X, extensively used Link Partitioner, Link collector, Routines, Transformations, Hashed file and Developed Job sequencers for executing Data stage jobs.
  • Experience in resolving on-going maintenance issues and bug fixes and monitoring ETL processes.
  • Worked extensively with slowly changing dimensions.
  • Used ODI Designer to develop complex interfaces (mappings) to load the data from the various sources into dimensions and facts
  • Experience in UNIX Shell scripting as part of file manipulation, and have strong knowledge in scheduling Data Stage jobs using Autosys, Control M.
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings.
  • Used Business intelligence application tools with an extensive use of SQL knowledge of business processes, internal and external identification and documentation of Business intelligence requirements.
  • Experienced in integration of various data sources (Oracle, Teradata and XML) into data staging area.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing
  • Expertise in coordinating on-site and off-shore teams; able to handle multiple tasks.
  • Clear understanding of business procedures and ability to work as an individual and as a part of a team.
  • Flexible and versatile to adapt to new environment and hardworking and self-motivated.
  • Excellent communication skills and analytical skills and ability to perform as part of the team.

TECHNICAL SKILLS:

ETL Tool: IBM InfoSphere DataStage and QualityStage 11.5/11.3/9.1/8.1/8.0, Oracle Data Integrator (ODI), Ascential DataStage 7.5

Operating Systems: DOS, UNIX (HP-Unix, Solaris 2.7/2.8, Linux 7.1), Windows 98/NT/2000/XP/7, IBM Mainframe.

Tools: Toad, PL/SQL Developer, SQL*Loader, SQL Navigator, IBM Cognos 10.0.1, Erwin, DataStage 9.1/11.3, Git, JIRA

Languages: SQL*Plus, PL/SQL, SAS 8.2, BASIC, Visual Studio 6.0, Java, C, C++, COBOL, Korn shell scripts, MS Office

Methodologies: Star schema modeling, Snowflake schema modeling, Ralph Kimball methodology, RDBMS, fact and dimension tables, aggregation tables, Erwin.

Databases: Teradata V2R6, Teradata V2R12, Oracle 11g/10g/9i/8i, MS SQL Server 7.0/2000, DB2 UDB v9.7/9.1/7.1, Netezza 6.0, MS Access 97/2000.

Testing Tools: Win Runner 7, Load Runner 6.5, Test Director 7.6, Mercury Quality Center 8.2.

Other: Autosys, Zena, Control M.

Other Utilities: Erwin 7.2/4.0, MS Office, MS Visio, TOAD

PROFESSIONAL EXPERIENCE:

Confidential, Lansing, MI

Sr. ETL Data Stage Developer

Responsibilities:-

  • Involved in requirement gathering, analysis and study of existing systems.
  • Extensively used DataStage Designer to build parallel and server jobs and develop complex mappings based on user specifications.
  • Worked in Designer to extract data, perform transformations, and aggregate data.
  • Worked with technical architect to setup the end-to-end ETL framework for this project.
  • Generated .XML files for various state departments such as MDOT (Michigan Department of Transportation), MSP (Michigan State Police), MDHHS (Michigan Department of Health and Human Services), MDNR (Michigan Department of Natural Resources), and MDOS (Michigan Department of State) using Oracle DB tables provided by them.
  • Developed jobs using ETL DataStage 11.5/11.3 to process SDS data for Retrospective and Prospective runs. This includes the data analysis and data generation for multiple sources in SDS.
  • Created process flow diagrams using Microsoft VISIO.
  • Created validation process to validate the .xml generated according to design specifications.
  • Extensively used real time stages in Data Stage such as XML input, XML output, and Hierarchical stages to generate .XML files for different departments of State.
  • Used Orchadmin commands to control datasets from command line.
  • Added, deleted, and set up DataStage projects and managed users from DataStage Administrator.
  • Used QualityStage to coordinate the delivery and consistency of source information, removing data anomalies and spelling errors.
  • Used QualityStage stages such as Investigate, Standardize, Match, and Survive to address data quality and data profiling issues during design.
  • Involved in creation of database objects like tables, views, materialized views, procedures and packages using oracle tools like Toad, PL/SQL Developer and SQL* plus.
  • Used Data Stage Administrator for setting the variables and parameters.
  • Used ETL Tool, to create jobs for extracting the data from different databases, flat files & Mainframe files.
  • Distributed load among different processors by implementing the Partitioning method.
  • Designed complex jobs to handle data and loading them to Target.
  • Involved in bug fixing for the existing data stage codes.
  • Developed and implemented a customized MDM solution based on IBM MDM specification documentation.
  • Designed Mappings between sources and operational staging targets.
  • Created generic jobs using RCP to extract data from various source systems
  • Extensively worked on building DataStage jobs using various stages like Oracle Connector, Funnel, Transformer stage, Sequential file stage, Lookup, Join and Peek Stages
  • Designed and developed ETL processes using Data Stage designer to load data from Oracle and XML files to staging database and from staging to the target Data Warehouse database
  • Designed DS jobs which involved the Extraction, Transformation, and Loading of data into an Oracle database
  • Involved in developing Source to Target mapping document for all the ETL jobs including the Test Cases.
  • Worked on the Quality stage modules for the Data Analysis and Data profiling and Address verifications and standardizations.
  • Worked with ETL source system and target system (ETL mapping document) for writing test cases and test script
  • Performed database migration and integration from the source server and prepared data mapping
  • Imported data from various transactional data sources residing on Oracle, SQL Server, Sybase, DB2 and Flat files and loaded into Oracle database
  • Involved in business requirements gathering and preparing architecture design documents.
  • Involved in preparing Technical Design Documents for the ETL development.
  • Designed and developed ODI interface flows for file upload/download.
  • Supported the full legacy-to-ODI data conversion and integration effort.
  • Prepared high level Micro Design documentation to be reviewed by client
  • Attended meetings with client and business teams to understand the requirement and prepared the low level design document, technical specification document
  • Evaluated daily ETL development processes, design and code review procedures
  • Created unit test cases and documented them for approval, for code move to SIT
  • Coordinated with development teams to provide them with package bodies, stored procedures, functions and necessary insights into database table information
  • Wrote Linux shell Scripts for FTP of files from remote server and backup of repository and folder
  • Developed a job using the Execute Command stage to FTP the generated .xml files to the landing zone.
  • Used the Java Transformer stage to load BLOB data into the database.
  • Involved in writing UNIX shell scripts for automation, job run, file processing, initial load, batch loads, cleanup, job scheduling
  • Created jobs sequences and job schedules to automate the ETL process by extracting the data from flat files, Oracle and Teradata into Data Warehouse
  • Read the supply chain data from Salesforce application.
  • Used Git for version control.
  • Used JIRA for bug tracking.
  • Worked with the Control-M job scheduler.
  • Expertise in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships
  • Worked on data cleansing and standardization
  • Performed code enhancement and maintenance tracked through SharePoint tasks.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels
  • Prepared migration document to move the jobs from development to testing and then to production
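
The XML generation, validation, and landing-zone hand-off described above commonly combine into a small shell wrapper around the DataStage output directory. A minimal, hypothetical sketch; the directory names, the `Record` root element, and the inline demo files are illustrative assumptions, not the project's actual values:

```shell
#!/bin/sh
# Sketch: validate generated .xml files (non-empty, expected root element
# opens and closes) before moving them to the landing zone for FTP pickup.
OUT_DIR=${OUT_DIR:-./outbound}      # where the DataStage job writes .xml files
LANDING=${LANDING:-./landing_zone}  # directory the FTP step picks up from
ROOT_TAG=${ROOT_TAG:-Record}        # expected XML root element (assumption)

mkdir -p "$OUT_DIR" "$LANDING"

# demo files standing in for real DataStage output
printf '<Record id="1">ok</Record>\n' > "$OUT_DIR/good.xml"
: > "$OUT_DIR/empty.xml"

for f in "$OUT_DIR"/*.xml; do
  [ -e "$f" ] || continue            # glob matched nothing
  if [ ! -s "$f" ]; then
    echo "REJECT (empty): $f" >&2
    continue
  fi
  # crude structural check: root element must both open and close
  if grep -q "<$ROOT_TAG[ >]" "$f" && grep -q "</$ROOT_TAG>" "$f"; then
    mv "$f" "$LANDING"/ && echo "LANDED: $(basename "$f")"
  else
    echo "REJECT (bad root): $f" >&2
  fi
done
```

Rejected files stay behind in the outbound directory so the run can be investigated without losing data.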

Environment: IBM Info Sphere Data Stage 11.5/11.3, Oracle 11G, SQL Server 2008, Teradata, Toad, MS Visio, Windows 2000, Unix/Linux, GIT, JIRA

Confidential, Troy, MI

Data Stage Developer

Responsibilities:

  • Involved in requirement gathering, analysis and study of existing systems.
  • Extensively used DataStage Designer to build parallel and server jobs and develop complex mappings based on user specifications.
  • Developed the source to target process and mapping documentation.
  • Prepared technical data flow proposals for enhancements and integration of existing third-party data. Communicated with business users and project management to get business requirements and translate to ETL/ELT specifications.
  • Worked in Designer to extract data, perform transformations, and aggregate data.
  • Provide the technical ESB integration expertise guidance on integrating between ESB and ETL frameworks.
  • Designed jobs using different stages like Transformer, Aggregator, lookup, Source dataset, external filter, Row generator, column generator, peek stages.
  • Used stages like standardize, match frequency, Investigate in Quality Stage for cleansing of customer data based on address, state, zip.
  • Implemented Quality Stage for Data cleansing & Data standardization Process.
  • Used Quality Stage to check the data quality of the source system prior to ETL process.
  • Added, deleted, and set up DataStage projects and managed users from DataStage Administrator.
  • Designed Server jobs to integrate the PL/SQL procedures and sequencers.
  • Used DataStage Administrator for setting the variables and parameters.
  • Created various sales (Mortgage) reports using Cognos reporting tool
  • Distributed load among different processors by implementing the Partitioning method.
  • Extracted data from Oracle, transformed it, and loaded it into the Oracle data warehouse.
  • Designed complex jobs to handle multi-million-record volumes and load them to target.
  • Developed and implemented data masking, encoding, decoding measures using the DataStage and Unix scripting
  • Experience in Integration of various data sources like Oracle, Teradata, DB2, Sybase, SQL Server, Mainframes into Staging area
  • Designed and customized data models for Data warehouse supporting data from multiple sources on real time
  • Involved in preparing Technical Design Documents for the ETL development.
  • Developed several complicated PL/SQL procedures using recursive loops and techniques that cannot be designed in DataStage.
  • Participated in code reviews and peer review sessions.
  • Involved in bug fixing for the existing data stage codes.
  • Design and develop customized MDM code based on Specification.
  • Execution of SQL queries to extract data from DB2 tables for running test scripts.
  • Used the Java stage to invoke APIs when needed.
  • Developed Java Custom Objects to derive the data using Java API.
  • Participated in weekly team meetings to discuss the test execution status for each client
  • Worked on a Migration project from Mercator (DataStage TX) to DataStage 11.3
  • Used Data Profiling tool Information Analyzer for Column Analysis, Primary Key Analysis and Foreign Key Analysis.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines
  • Recommended problem solutions and participated in their implementation.
  • Created generic jobs using RCP to extract data from various source systems.
  • Developed Analytical applications that can analyze large amounts of online and offline data using orchestrate environment.
  • Designed and developed ETL processes using DataStage designer to load data from Oracle, MS SQL, Flat Files (Fixed Width) and XML files to staging database and from staging to the target Data Warehouse database.
  • Used Netezza Enterprise stage for doing loads into Netezza Database
  • Used DataStage Netezza Enterprise stage to load data, utilizing the available processors to achieve job performance, configuration management of system resources in Orchestrate environment
  • Transferring the contents of tables between 2 different environments using NZLOAD utility in Netezza.
  • Parameters used with NZSQL command: -u, -pw, -db, -df, -lf, -bf, -delim, -dateStyle, -dateDelim, -maxErrors
  • Wrote Linux shell Scripts for FTP of files from remote server and backup of repository and folder
  • Active participation in decision making and QA meetings and regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements & Design.
  • Used SQL Assistant to query Teradata tables.
  • Created tables, views in Teradata, according to the requirements.
  • Worked on building up Master Data Management, Initiate Meta data management, data stewardship, data quality, Data governance, Master Dataflow
  • Supported APEX applications and conducted performance tuning for PLSQL.
  • Extended data profiling capabilities and provided foundation for data governance initiatives using IBM Information analyzer and job maintainability
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Used Git for version control.
  • Used JIRA for bug tracking.
  • Worked with the Control-M job scheduler.
  • Prepared migration document to move the jobs from development to testing and then to production.
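
One of the bullets above mentions data masking implemented with UNIX scripting. A hedged sketch of what such a shell-level masking step might look like, assuming a pipe-delimited extract with the sensitive value in the third column; the layout and sample rows are invented for illustration:

```shell
#!/bin/sh
# Sketch: mask all but the last four digits of the 3rd field (e.g. an SSN)
# in a pipe-delimited staging extract before it leaves the secure zone.

# demo extract standing in for a real staging file
cat > extract.dat <<'EOF'
1001|SMITH|123456789|MI
1002|JONES|987654321|CA
EOF

# awk rebuilds each record with OFS after the field assignment
awk -F'|' 'BEGIN { OFS = "|" } { $3 = "XXXXX" substr($3, 6); print }' \
    extract.dat > extract_masked.dat

cat extract_masked.dat
# first row becomes: 1001|SMITH|XXXXX6789|MI
```

The same pattern generalizes to any fixed column position; in practice the column list would be driven by a parameter rather than hard-coded.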

Environment: IBM Info Sphere Data Stage 11.3, IBM WebSphere Data stage 8.1, Oracle 11G, Teradata, IBM DB2, MS Visio, Netezza 6.0, Windows 2000, Unix/Linux, IBM Cognos, GIT, JIRA

Confidential, San Francisco, CA

ETL Application Developer.

Responsibilities:

  • Actively interacted with business users and functional experts to understand requirements and implement them to user expectations.
  • Involved in Design, Data mapping and Development of ETL jobs that supports Super fusion EDW process.
  • Involved in requirement gathering, analysis and study of existing systems.
  • Extensively used Data Stage designer for designing server jobs and performed complex mappings based on user specifications.
  • Worked in designer for Extracting Data, Performing Data Transformations and Aggregate Data.
  • Developed Source to Target mapping document for all the ETL jobs including the Test Cases.
  • Worked as part of the ETL testing team in writing test cases and coordinate with the relevant Dev teams for optimization of jobs.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Worked in DataStage Manager to store and manage reusable metadata, and created custom routines and transforms for the jobs.
  • Developed the complete ETL DataStage Process which involves PL/SQL, ORACLE 10g.
  • Validated, ran, scheduled, and monitored DataStage jobs.
  • Extensively worked in Data Stage Administrator to assign privileges to user groups.
  • Designed jobs using different stages like Transformer, Aggregator, lookup, Source dataset, external filter, Row generator, column generator, peek stages.
  • Worked with sequential files, Oracle as sources and Oracle and XML as Targets.
  • Distributed load among different processors by implementing the Partitioning method.
  • Extracted data from flat files, sequential files and access files, applied transformations and loaded the data into data warehouse.
  • Created routines for the validation of files provided by client.
  • Coded and tested DB2 programs that update and insert tables with daily transaction data.
  • Participated in disaster recovery phase.
  • Read the supply chain data from Salesforce application.
  • Investigate client reported issues and identify root cause.
  • Determine short term and permanent resolution, including possible workarounds.
  • Communicate with the client as needed to diagnose and present options.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE
  • Created data governance strategies that reduced data redundancy wherever possible
  • Worked on IBM tools Information Analyzer, Quality Stages like Investigate, Match
  • Specify and develop code corrections for defects and minor enhancements.
  • Created job sequences to define the flow of the jobs
  • Hands-on experience with Oracle APEX 4.0/4.1 (Oracle Application Express).
  • Experience with version control software such as SVN
  • Used JIRA to log and track bugs and defects
  • Responsible for revalidation of resolved defects
  • Wrote Linux shell Scripts for FTP of files from remote server and backup of repository and folder
  • Written UNIX shell scripts to automate the Data Load processes to the target Data warehouse
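
The client-file validation routines mentioned above often reduce to a trailer-record reconciliation in shell. A minimal sketch, assuming an invented layout where detail rows start with `D|` and the last line is a trailer `T|<count>`:

```shell
#!/bin/sh
# Sketch: reject a client feed whose trailer count does not match the
# number of detail records actually present in the file.

# demo feed standing in for a real client delivery
cat > client_feed.dat <<'EOF'
D|1001|WIDGET
D|1002|GADGET
D|1003|GIZMO
T|3
EOF

expected=$(tail -1 client_feed.dat | cut -d'|' -f2)  # count from trailer
actual=$(grep -c '^D|' client_feed.dat)              # detail rows found

if [ "$actual" -eq "$expected" ]; then
  echo "VALID: $actual records" > validation.log
else
  echo "INVALID: trailer says $expected, file has $actual" > validation.log
  exit 1
fi
```

A wrapper like this typically runs ahead of the load job so that a short or corrupt delivery fails fast instead of loading partial data.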

Environment: IBM Info Sphere Data Stage 9.1, Ascential Data Stage 7.5, Oracle 10G, Microsoft SQL Server, DB2, Windows 2000, Unix/Linux

Confidential

ETL Developer

Responsibilities:

  • Designed and developed Data Stage jobs for data extraction from different data feeds into EPM.
  • Involved in functional and technical meetings and responsible for creating ETL Source - to - Target maps.
  • Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions and packages to migrate data from SQL server database to Oracle database.
  • Developed various jobs using Dynamic RDBMS, Universe, Hashed File, Aggregator, and Sequential File stages.
  • Involved in Logical & Physical Database Layout Design.
  • Designed, developed and tested the ETL jobs for the applications.
  • Used Data Stage Manager for importing metadata from repositories, new job categories, import and export .dsx files, routines and shared containers.
  • Implemented profile stage for source data profiling analysis.
  • Extensively worked on PeopleSoft POSITION and LINE ITEM budgeting and STRS areas.
  • Worked on Change Requests (CRs) from Budgeting domain according to changing business rules.
  • Customized PeopleSoft-delivered ETL jobs to incorporate additional functionality per client requirements.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Implemented quality stage for data cleansing & data standardization process.
  • Defined Standardized jobs for standardized files and standardized results.
  • Responsible for creating jobs in Regulatory (FCC & PUC) reporting area.
  • Responsible for creating/modifying MASTER Sequencers for different subject areas/domains, to run/schedule all the jobs related to that particular subject area/domain.
  • Involved in pulling the data from various PeopleSoft applications like CRM, HRMS, and FMS into EPM.
  • Developed ETL maps to pull data from Flat Files, 3rd party Data base tables/views.
  • Developed UNIX Shell scripts to handle and validate the incoming source files.
  • Extensively used PeopleSoft Application Designer to create new tables/views/fields and to modify delivered objects.
  • Involved in migration process from DEV to Test and then to PRD.
  • Developed scripts for performance and customer data verification using data driven tests
  • Performed sprint level, functional, systems integration, and regression testing
  • Used shared containers for multiple jobs, which have the same business logic.
  • Responsible for creating ETLs to pull the data from both PeopleSoft and non-PeopleSoft source system.
  • Responsible for developing ETLs to load the data into EPM incrementally or destructively based on the requirement.
  • Involved in documentation, customization tracking and in maintaining PeopleSoft standards across all the teams in EPM.
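
The shell scripts for handling incoming source files described above typically begin by archiving each arrival with a business-date stamp before the load job touches it. An illustrative sketch; the directory names and demo file are assumptions:

```shell
#!/bin/sh
# Sketch: stamp each arriving source file with the run date and keep an
# audit copy in an archive directory before hand-off to the load job.
INBOX=./inbox
ARCHIVE=./archive
STAMP=$(date +%Y%m%d)

mkdir -p "$INBOX" "$ARCHIVE"
printf 'sample row\n' > "$INBOX/positions.csv"   # demo arrival

for f in "$INBOX"/*.csv; do
  [ -e "$f" ] || continue                        # glob matched nothing
  base=$(basename "$f" .csv)
  cp "$f" "$ARCHIVE/${base}_${STAMP}.csv"        # dated audit copy
  echo "archived: ${base}_${STAMP}.csv"
done
```

Keeping the dated copy makes destructive reloads (mentioned above for EPM) repeatable, since any day's input can be replayed from the archive.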

Environment: Ascential DataStage 7.5 EE, Ascential ProfileStage 7.5, PeopleSoft EPM 8.9, PeopleSoft ERP Applications, nVision, PeopleTools 8.46/8.45, Oracle 9i, Informix 7.3x, Erwin 3.5.2, Test Director, HP-UNIX 11.11
