
ETL DataStage Developer Resume


Charlotte, NC

SUMMARY

  • Highly competent ETL IBM DataStage Developer with 6+ years of experience in information technology using IBM WebSphere/InfoSphere DataStage v8.x and Ascential DataStage 7.x.
  • Extensively worked with the DataStage client tools: Designer, Director, Administrator and Manager.
  • Complete Software Development Life Cycle (SDLC) experience covering system design, development, implementation, testing, support and enhancements.
  • Good knowledge of data warehousing principles, including data marts, OLTP, OLAP, dimensional modeling, fact and dimension tables, and star/snowflake schema modeling.
  • Skilled in building highly scalable parallel processing solutions using parallel jobs with multi-node configuration files.
  • Experienced in scheduling sequence, parallel and server jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Designed and developed parallel, server and sequence jobs using DataStage Designer.
  • Experience in using different types of stages such as Transformer, Aggregator, Merge, Join, Lookup, Sort, Copy, Remove Duplicates, Funnel, Filter, Pivot and Shared Containers for developing jobs.
  • Worked with and extracted data from various data sources such as Oracle, MS SQL Server, Netezza, MS Access, Teradata, DB2, XML and flat files.
  • Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, User Acceptance Testing and Performance Testing.
  • Created local and shared containers to facilitate ease and reuse of jobs.
  • Proven track record in addressing production issues such as performance tuning, enhancements, and data, environment and memory problems.
  • Imported the required Metadata from heterogeneous sources at the project level.
  • Used the Data Stage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
  • Extensive production support experience, including resolving live production issues.
  • Extensively used BI Integration Services to design ETL processes and BI Analysis Services to create cubes.
  • Quick learner and adaptive to new and challenging technological environments.

TECHNICAL SKILLS

Database Specialties: Database Architecture, Data Analysis, Enterprise Data Warehouse, Database Design and Modeling, Data Integration and Migration, ETL Architecture and Design, Data Warehouse, OLTP, OLAP

Modeling Tools: Erwin 9.x, ER/Studio, MS Visio.

Databases: MS SQL Server 2012/2008 R2/2005, Oracle 12c/11g/10g/9i/8i, Teradata R14/R13/R12, Hive

ETL Tools: DataStage 11.3/9.1/8.5/8.1.0 (Designer, Manager, Director), DataStage Parallel Extender, Ascential DataStage, Information Analyzer 8.1

Programming Languages: SQL, PL/SQL, UNIX shell scripting

Operating Systems: Windows, UNIX, MS DOS, Sun Solaris.

Reporting Tools: SSRS, Tableau

Web technologies: HTML, DHTML, XML, CSS.

Scripting Languages: JavaScript, UNIX Shell Script.

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD).

Big Data Tools: Hadoop, Kafka, HIVE.

Tools: MS Office suite (Word, Excel, MS Project and Outlook), TOAD, Teradata MultiLoad, Teradata FastExport.

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

ETL DataStage Developer

Responsibilities:

  • Served as primary on-site ETL developer during the analysis, planning, design, development and implementation stages of projects using IBM WebSphere software.
  • Prepared data mapping documents and designed the ETL jobs based on the business requirements.
  • Followed Agile methodology and used JIRA during the development process.
  • Actively participated in decision-making and QA meetings, and regularly interacted with business analysts and the development team to gain a better understanding of the business process, requirements and design.
  • Designed and built reusable framework jobs.
  • Consumed data from Kafka topics, transformed it and produced it back to Kafka topics.
  • Worked with COBOL files and read data from web APIs using the Hierarchical Data stage.
  • Validated real-time streaming data sent through Kafka topics into Oracle.
  • Created a resiliency framework for the active Kafka node to perform dynamic routing.
  • Worked on the MySQL to Oracle 12c technology migration.
  • Prepared contingency plans when deploying code to production.

Environment: InfoSphere DataStage 11.5 and 11.7 (Designer, Director and Administrator), Autosys Scheduling tool, Hive, Toad, Shell Scripts, Kafka.
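The consume-transform-produce pattern described in this project can be sketched in Python. The topic names, message fields and transform rule below are hypothetical, and in-memory lists stand in for the actual Kafka consumer and producer so the sketch is self-contained:

```python
import json

# Hypothetical transform applied to each message pulled from the input topic.
def transform(record: dict) -> dict:
    # Example rule: normalize field names and tag the record with a status.
    return {
        "account_id": record["acct"],
        "amount": round(float(record["amt"]), 2),
        "status": "VALIDATED",
    }

def consume_transform_produce(input_topic: list, output_topic: list) -> None:
    """Drain the input topic, transform each message, publish to output."""
    while input_topic:
        raw = input_topic.pop(0)                 # stands in for consumer.poll()
        record = json.loads(raw)
        result = transform(record)
        output_topic.append(json.dumps(result))  # stands in for producer.send()

# In-memory lists stand in for the real Kafka topics in this sketch.
source = ['{"acct": "A1", "amt": "10.50"}']
sink = []
consume_transform_produce(source, sink)
```

In a real deployment the two lists would be replaced by a Kafka consumer/producer pair, with the same transform applied per message.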

Confidential

ETL DataStage Developer

Responsibilities:

  • Migrated jobs from 11.3 to 11.5.
  • Evaluated data extracted from the source to detect irregularities and classify corrupt data so that suitable transformations could be applied in the job mappings.
  • Captured and compared the result sets of 11.3 and 11.5.
  • Tuned the parallel jobs for higher performance and replaced old stages with new stages in the migrated jobs.
  • Resolved compilation/run-time errors in the migrated jobs and end-to-end tested their output data in the new environment.
  • Worked on changes to be made to jobs in 11.5 in case of errors.
  • Worked on data issues in source files.
  • Communicated with client in case of code changes in 11.5.
  • Worked in performance tuning phase.
  • Monitored jobs in 11.3 production and 11.5 production.
  • Prepared tracking sheets for understanding performance improvements.

Environment: InfoSphere DataStage 11.3 and 11.5 (Designer, Director and Administrator), Autosys Scheduling tool, Toad, Shell Scripts.
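The 11.3-versus-11.5 result-set comparison described in this migration can be sketched as a simple key-based reconciliation. The column names and sample rows below are hypothetical:

```python
def compare_result_sets(rows_old, rows_new, key_cols):
    """Compare two result sets by key; report missing, extra and mismatched rows."""
    index_old = {tuple(r[c] for c in key_cols): r for r in rows_old}
    index_new = {tuple(r[c] for c in key_cols): r for r in rows_new}
    missing = sorted(set(index_old) - set(index_new))   # in 11.3 only
    extra = sorted(set(index_new) - set(index_old))     # in 11.5 only
    mismatched = sorted(k for k in set(index_old) & set(index_new)
                        if index_old[k] != index_new[k])
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

# Hypothetical sample rows from the 11.3 and 11.5 job runs.
rows_v113 = [{"id": 1, "amt": 100}, {"id": 2, "amt": 200}]
rows_v115 = [{"id": 1, "amt": 100}, {"id": 2, "amt": 250}]
report = compare_result_sets(rows_v113, rows_v115, key_cols=["id"])
```

In practice the two row lists would come from running the same job in each environment and exporting the target tables.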

Confidential, Los Angeles, CA

ETL DataStage Developer

Responsibilities:

  • Worked on the architecture of the ETL process.
  • Created DataStage jobs (ETL processes) to continually populate the data warehouse from different source systems such as the ODS and flat files, and scheduled them using DataStage Sequencer for system integration testing.
  • Prepared Mapping document templates for Source to Target landing.
  • Designed and developed ETL processes using DataStage to load data from Teradata, Flat Files to staging database and from staging to the target Data Warehouse database.
  • Developed and supported the Extraction, Transformation and Load process (ETL) for a data warehouse from various data sources using DataStage Designer
  • Developed reusable components and best practices that were later used in other data warehouse projects.
  • Sent data from DataStage via NDM (Network Data Mover) to Hadoop.
  • Worked on troubleshooting, performance tuning and performance monitoring to enhance DataStage jobs and builds across the development, QA and production environments.
  • Exported jobs from development to the testing environment and then to production, working closely with the DataStage administrator on deployment.
  • Extracted data from sources like Oracle and Flat Files.
  • Experience in using different types of stages such as Transformer, Aggregator, Merge, Join, Lookup, Sort, Remove Duplicates, Funnel, Filter, Pivot and Shared Containers for developing jobs.
  • Troubleshooting performance issues with SQL tuning.
  • Working in a team with other associate product & component developers.
  • Worked on change requests as per client and project technical specification needs.

Environment: InfoSphere DataStage 11.3, 9.1 and 8.5 (Designer, Manager, Director and Administrator), Teradata, Autosys Scheduling tool, Oracle 12c/11g/10g/9i, Hadoop, Mainframes, Toad, Shell Scripts.
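The flat-file-to-staging step described in this project can be sketched as a simple delimited-file load. The file layout and staging table below are hypothetical, with SQLite standing in for the staging database:

```python
import csv
import io
import sqlite3

def load_flat_file_to_staging(conn, flat_file):
    """Parse a pipe-delimited flat file and bulk-insert its rows into staging."""
    reader = csv.DictReader(flat_file, delimiter="|")
    rows = [(r["order_id"], r["customer"], float(r["amount"])) for r in reader]
    conn.executemany(
        "INSERT INTO stg_orders (order_id, customer, amount) VALUES (?, ?, ?)",
        rows)
    return len(rows)  # row count for load auditing

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id, customer, amount)")

# io.StringIO stands in for the actual flat file on disk.
sample = io.StringIO("order_id|customer|amount\n1001|Acme|250.00\n1002|Beta|99.50\n")
count = load_flat_file_to_staging(conn, sample)
```

A staging-to-target step would then apply transformations and move the validated rows into the warehouse tables, mirroring the two-hop design described above.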

Confidential, Detroit, MI

DataStage Developer

Responsibilities:

  • Gathered client requirements to develop the jobs.
  • Troubleshot DataStage jobs and added error handling.
  • Rectified and modified scripts so that previously developed jobs ran seamlessly.
  • Developed and configured a new database for the client.

Environment: DataStage, SQL Server, UNIX Shell Scripting, Crontab

Confidential

DataStage Developer

Responsibilities:

  • Understood the business rules completely and implemented the data transformation methodology.
  • Understood the Technical design document and Source-To-Target mappings.
  • Designed Datastage jobs for validating and loading data.
  • Used the Datastage Designer to develop processes for extracting, cleansing, transforming, integrating and loading into Data Warehouse.
  • Extracted data from sources such as Oracle and flat files to transform and load into target databases.
  • Used DataStage Designer to develop mappings with stages including Transformer, Join, Lookup, Merge, Funnel, Filter, Sequential File, Dataset, File Set, Copy and Modify.
  • Implemented SCD Type II techniques.
  • Designed Sequencer to automate the whole process of data loading.
  • Extensively worked with DataStage Shared Containers to reuse business functionality.
  • Experience in running jobs using Job Sequencers, Job Batches.
  • Involved in performance tuning of jobs, identifying and resolving performance issues.
  • Involved in unit testing of Datastage jobs.

Environment: DataStage 8.5, XP, Oracle 10g, UNIX, Teradata
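The SCD Type II loads mentioned in this project follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. A minimal sketch with SQLite standing in for the target database (the table and column names are hypothetical):

```python
import sqlite3

def apply_scd2(conn, cust_key, new_city, load_date):
    """Expire the current dimension row and insert a new version on change."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE cust_key=? AND is_current=1",
        (cust_key,))
    row = cur.fetchone()
    if row is not None and row[0] == new_city:
        return  # no change, nothing to do
    if row is not None:
        # Close out the current version (Type II expiry).
        conn.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE cust_key=? AND is_current=1", (load_date, cust_key))
    # Insert the new current version with an open-ended end date.
    conn.execute(
        "INSERT INTO dim_customer (cust_key, city, start_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (cust_key, new_city, load_date))

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (cust_key, city, start_date, end_date, is_current)")
apply_scd2(conn, "C1", "Detroit", "2015-01-01")
apply_scd2(conn, "C1", "Chicago", "2015-06-01")  # city change triggers a new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY start_date").fetchall()
```

In DataStage the same expiry/insert logic is typically expressed with a Lookup or Change Capture stage feeding separate update and insert links.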

Confidential

DataStage Developer

Responsibilities:

  • Extensively used Data Stage Designer for Developing Parallel jobs and performed complex mappings based on Business specifications.
  • Involved in the design and development of the ETL process for the data warehouse.
  • Worked extensively on different types of stages like Sequential file, Aggregator, Funnel, change capture, Transformer, Merge, Join, Lookup for developing jobs.
  • Extensively used processing stages: the Lookup stage to perform lookup operations against various target tables, the Modify stage to alter the record schema of the input data set, the Funnel stage to combine multiple datasets into a single large dataset, and the Switch stage to trigger the required output based on a specific condition.
  • Constructed containers for performance analysis.
  • Involved in the unit testing.

Environment: DataStage 8.5, XP, Oracle 10g, UNIX
