Data Integrator - Datastage Developer Resume

Charlotte, NC

PROFESSIONAL SUMMARY:

  • 8 years of experience in ETL development and maintenance using the DataStage 7.5/8.5/11.5 ETL suite and Teradata V12/13/14.
  • Experience in Requirements Gathering, Documentation, Business Requirement Analysis, Software Requirement Specifications and Test Case design.
  • Experience in Data Warehousing, Data Integration and Data Migration using IBM WebSphere DataStage, Oracle, DB2 and SQL Server 2000/2005.
  • Strong knowledge of Extraction, Transformation and Loading processes using Data Stage ETL Tool.
  • Participated in high - level and detailed analysis and design efforts focusing on extract, transformation, and loading of data.
  • Extensively worked on parallel jobs using various stages: Join, Lookup, Lookup File Set, Change Capture, Funnel, Copy, Sort, File Set, Sequential File, Data Set, Oracle Enterprise, Aggregator, Remove Duplicates, Merge and Filter.
  • Extensively worked on Job Sequences to control the execution of the job flow using various activities such as Job Activity, Email Notification, Sequencer, Routine and Execute Command.
  • Created ETL scripts using BTEQ, MULTILOAD, FLOAD and FASTEXPORT as per the requirements of the client.
  • Experienced in designing and developing complex Data Stage jobs and Sequences.
  • Used Volatile, Global Temporary, Set and MultiSet tables when processing data through BTEQ scripts (see the BTEQ sketch after this list).
  • Used inner and outer joins, UNION, INTERSECT, RANK, ROWNUM and the OLAP functions CSUM, MSUM, MAVG and MDIFF.
  • Experience in extracting data from various sources and loading it, after cleansing, into the Data Warehouse.
  • Experience in loading high-volume data, debugging, troubleshooting and performance tuning.
  • Delivered Source-to-Target Mapping documents for ETL design and development efforts.
  • Experienced with all phases of software development life cycle.
  • Experience in writing complex queries to extract the data efficiently.
  • Involved in the design and implementation of Data Warehouse fact and dimension tables.
  • Experience in UNIX shell scripting.
  • Experience in Dimensional Data Modeling, Star/Snowflake schemas, and fact and dimension tables.
  • Worked on different scheduling tools such as Autosys and Control-M.
  • Migrated jobs from development to QA to Production environments.
  • Experience working in 24/7 Production Support environment.
  • Experience working in an onshore-offshore business model.
  • Highly adaptive to team environments, with a proven ability to work in fast-paced settings and excellent communication skills.
  • Experience in coordinating with Release Management, Data Management and Configuration Management.
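To make the BTEQ bullets above concrete, here is a minimal Korn shell sketch, not production code: it stages rows in a volatile table and computes a running total with the Teradata CSUM OLAP function. The logon string and all object names (sales_db.daily_txn, vt_txn) are hypothetical placeholders.

#!/bin/ksh
# Minimal BTEQ sketch: stage rows in a volatile table, then report a
# running total with the Teradata CSUM OLAP function.
# Logon string and all object names are hypothetical placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

CREATE VOLATILE TABLE vt_txn AS (
    SELECT txn_date, store_id, txn_amt
    FROM   sales_db.daily_txn
    WHERE  txn_date = CURRENT_DATE - 1
) WITH DATA ON COMMIT PRESERVE ROWS;

/* Running total per store; GROUP BY resets CSUM for each store */
SELECT store_id,
       txn_date,
       CSUM(txn_amt, txn_date) AS running_amt
FROM   vt_txn
GROUP BY store_id;

.LOGOFF;
.QUIT;
EOF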

TECHNICAL SKILLS:

Data Warehousing Tools: IBM InfoSphere DataStage 7.5.3/8.1/8.5/9.1/11.5, Informatica 9.6.0/10.0

Database Tools/Utilities: TOAD, SQL Developer, SQL*Plus, SQL*Loader, Export, Import, FastExport, FastLoad, MultiLoad, Teradata SQL Assistant, Tableau 10

Databases: Oracle 9i/11g, DB2, Netezza, Teradata V2R6, Microsoft SQL Server 2000/2005/2008, Hadoop 2.2

Operating Systems: IBM AIX 5.2, HP-UX 10.2, Windows 9x/NT/2000/XP, Windows Server 2003/2008, Solaris 2.8/SunOS 5.8, Red Hat Linux

Programming Languages: C/C++

Scripting Languages: Korn Shell Scripting

Code Versioning Tools: CVS, ClearCase, PVCS

Scheduling Tools: Control-M, Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Data Integrator - Datastage Developer

Responsibilities:

  • Designed and developed daily ETL processes to extract transaction data and replenish the financial information in the data warehouse, loading the target tables in Oracle per the business requirements.
  • Extensively used the Transformer, Lookup, Merge, Join, Sort, Filter, Aggregator and Remove Duplicates stages to transform the data extracted from source systems and load it into the target tables in Oracle.
  • Designed and developed the ETL to transform the data in the Staging area and load the target objects in the Hub environment.
  • Designed and developed daily ETL jobs to load delta changes into the data warehouse, implementing the business rules per the specs and refreshing the fact tables in the target (a delta-load sketch follows this list).
  • Used the Change Data Capture (CDC) stage to identify changed records in the extracted data before loading the Oracle targets.
  • Developed an ETL job to extract List Ids from the source tables and load the fact table in the Oracle database.
  • Monitored the daily production jobs and the ETL jobs in the UAT environment.
  • Delivered fixes for the issues identified in the Production environment as per the release cycle.
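As a hedged illustration of the delta-load pattern above, a minimal Korn shell sketch that applies staged delta rows to an Oracle fact table with a MERGE; the MERGE approach, the connect string and the stg_txn/txn_fact names are all assumptions for illustration, not the project's actual code.

#!/bin/ksh
# Sketch of a delta (upsert) load into an Oracle target table.
# Connect string and table names are hypothetical placeholders.
sqlplus -s etl_user/etl_pass@ORAPROD <<'EOF'
WHENEVER SQLERROR EXIT FAILURE ROLLBACK

-- Update existing transactions, insert new ones
MERGE INTO txn_fact t
USING stg_txn s
ON (t.txn_id = s.txn_id)
WHEN MATCHED THEN
  UPDATE SET t.txn_amt   = s.txn_amt,
             t.load_date = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (txn_id, txn_amt, load_date)
  VALUES (s.txn_id, s.txn_amt, SYSDATE);

COMMIT;
EXIT
EOF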

Environment: IBM InfoSphere 11.5, Informatica PowerCenter 9.6.0/10.0, Oracle 11g, SQL Server, Netezza, PVCS, Linux, Control-M 10.0

Confidential, Englewood, CO

Data Integrator - Datastage Developer

Responsibilities:

  • Developed a DataStage ETL solution to refresh the target ACCT objects in Salesforce.
  • Designed and developed daily ETL processes to update the financial information of customers in ACM and Salesforce through DataStage.
  • Designed and developed weekly DataStage jobs that replenish and update the financial information of customers across the globe as per the business requirements.
  • Extensively used Transformer, Lookup, Merge, Join, Sort, Filter, Aggregator, and Remove duplicate stages, Change Data Capture Stage (CDC) to transform the data extracted from source systems and loaded the data into the Target tables in Netezza.
  • Designed and developed DataStage jobs to extract data from SQL Server, transform the data per the specs and load the fact tables in the Netezza database to support the reporting requirements of the business users.
  • Designed and developed the ETL to transform the data in GADB and load the target objects in Salesforce as per client requirements.
  • Designed ETL jobs to support Ad-hoc reports for the business users.
  • Supported monitoring of daily jobs in the Production environment.
  • Participated in discussion sessions to design the ETL cycle.
  • Delivered ETL process to update the payment details and provider details as per requirements of the client.
  • Developed daily ETL jobs that extracted the company records that had changed from De-active to Active status and provided a report for the business users.
  • Designed and developed DataStage jobs to extract data from SQL Server, apply the transformation rules per the business requirements and load the results into Salesforce.
  • Designed and developed DataStage jobs to extract Agent information from SQL Server, transform it per the specs and load it into the Account object in the EU ORG for business users.
  • Designed and developed DataStage jobs to extract the settlement data from the DB2 database, apply the transformation rules and load the data into the respective Salesforce objects as per the business requirements.
  • Executed and monitored batch processes as per the run schedule and reported the statistics (a monitoring sketch follows this list).
  • Supported code migration to higher environments for each development release.
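For the batch monitoring and statistics reporting described above, a minimal sketch using the DataStage dsjob command-line client; the project name, job name and log path are hypothetical placeholders.

#!/bin/ksh
# Capture run statistics for a finished DataStage job and append them
# to a daily log. Project, job and log names are hypothetical.
PROJECT=SFDC_PROJ
JOB=load_acct_object
LOG=/var/etl/logs/run_stats_$(date +%Y%m%d).log

# The DETAIL report includes start/end times, job status and link rows
$DSHOME/bin/dsjob -report "$PROJECT" "$JOB" DETAIL >> "$LOG" 2>&1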

Environment: IBM InfoSphere 8.5/9.1, Informatica PowerCenter 9.6.0/10.0, Oracle 11g, SQL Server, Netezza, PVCS, Linux, Salesforce

Confidential, Eden Prairie, MN

Data Integrator - Datastage/ETL Developer

Responsibilities:

  • Designed and delivered monthly ETL processes to update customer health information in the EDW using BTEQ/MLOAD.
  • Designed and developed quarterly DataStage jobs to replenish and update the health information of customers across the country as per the business requirements.
  • Developed DataStage jobs to extract data from Teradata and load the staging area and work tables in the Aster database using the nCluster loader.
  • Extensively used the Transformer, Lookup, Merge, Join, Sort, Filter, Aggregator and Remove Duplicates stages to transform the data extracted from source systems.
  • Developed processes to check data quality against business rules and generate data quality reports (a sketch follows this list).
  • Participated in the discussion sessions to design the ETL cycle.
  • Supported monitoring of daily jobs in Production environment.
  • Delivered ETL process to update the patient insurance coverage data in Netezza.
  • Delivered ETL process to update the payment details and provider details in Netezza as per requirements of the client.
  • Delivered archive process to archive the processed data.
  • Designed ETL jobs to support Ad-hoc reports for the business users.
  • Executed and monitored batch processes as per run schedule and reported the statistics.
  • Supported code migration to higher environments for each development release.
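A minimal sketch of the rule-based data quality checks mentioned above, assuming a Teradata staging table; the logon string and the hc_db.claim_stg object are hypothetical placeholders.

#!/bin/ksh
# Count rows that violate simple business rules and write a small
# data quality report. All object names are hypothetical.
bteq <<'EOF' > /tmp/dq_report.txt
.LOGON tdprod/etl_user,etl_pass;
.SET WIDTH 120;

SELECT 'Missing member id'     AS rule_name, COUNT(*) AS violations
FROM   hc_db.claim_stg
WHERE  member_id IS NULL
UNION ALL
SELECT 'Negative paid amount', COUNT(*)
FROM   hc_db.claim_stg
WHERE  paid_amt < 0;

.LOGOFF;
.QUIT;
EOF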

Environment: IBM InfoSphere 9.1, Oracle 11g, Netezza, Teradata 13, Aster, PVCS, Linux

Confidential, Jacksonville, FL

Datastage/ ETL Developer

Responsibilities:

  • Designed and delivered ETL job streams in Datastage to update inventory of items across all Winn Dixie stores for the current financial period as per business requirements. The enhancement includes all Winn Dixie and BILO data.
  • Enhanced the item-processing ETL job stream in DataStage to update the Items dimension table, populating the Winn Dixie and BILO items in the unified format defined in the business requirements.
  • Extensively used Transformer, Lookup, Merge, Join, Sort, Filter, Aggregator, Remove duplicate stages to transform the data extracted into staging tables.
  • Delivered BTEQ scripts to update Distribution Center and Distribution Region data in the respective dimension tables.
  • Designed the daily job that updates item replenishment in the data warehouse for BILO and Winn Dixie products.
  • Developed DataStage extract jobs from DB2 and Teradata that loaded the data into a staging area as flat files/datasets.
  • Delivered required fixes to the monthly and daily jobs for the defects identified in UAT and Production as per the requirements of the business team.
  • Participated in the discussion sessions to design the ETL cycle.
  • Supported monitoring of daily jobs in Production environment.
  • Designed BTEQ scripts to generate ad-hoc reports for the business users (see the export sketch after this list).
  • Supported code migration to higher environments for each development release.
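A hedged sketch of an ad-hoc BTEQ report of the kind described above, exporting item counts per distribution center to a flat file; the logon string, report path and retail_db.item_dim names are hypothetical placeholders.

#!/bin/ksh
# Ad-hoc BTEQ report: item counts per distribution center, exported
# to a flat report file. All object names are hypothetical.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.EXPORT REPORT FILE = /data/reports/dc_item_counts.txt;

SELECT dc_id,
       COUNT(*) AS item_cnt
FROM   retail_db.item_dim
GROUP BY dc_id
ORDER BY dc_id;

.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF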

Environment: IBM InfoSphere 8.5, DB2, Netezza, Oracle 11g, Teradata V12, PVCS, UNIX

Confidential, Tampa

ETL Developer

Responsibilities:

  • Worked with the users to determine and understand the requirements, and designed the ETL process for loading the source Business data into AEDW.
  • Designed the mapping of the source Business data from the MDSW, MTAS, CBSS and COPS legacy systems to the EDGE Data Warehouse model.
  • Designed and developed ETL processes to load source data to meet the BCAD/RCAD specifications using DataStage.
  • Delivered MLoad scripts to load data files into the AEDW database (a MultiLoad sketch follows this list).
  • Delivered FastExport scripts to extract large volumes of data.
  • Delivered FLoad scripts to load large amounts of data into the database.
  • Delivered ETL solutions to the AEDW data warehouse through ETL processes that corrected corrupted data in the database and ensured data integrity.
  • Delivered ETL jobs to support ad-hoc requests for business users.
  • Delivered Source-to-Target mappings for the ETL jobs, along with unit test scripts.
  • Provided daily and weekly support of the data warehouse processes (ETL, shell, SQL).
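A minimal MultiLoad sketch in the spirit of the MLoad deliveries above: it bulk-loads a pipe-delimited file into a staging table. The logon string, file path and table names are hypothetical placeholders.

#!/bin/ksh
# MultiLoad sketch: insert pipe-delimited records into a stage table.
# Logon string and all object/file names are hypothetical.
mload <<'EOF'
.LOGTABLE work_db.acct_ml_log;
.LOGON tdprod/etl_user,etl_pass;
.BEGIN IMPORT MLOAD TABLES stg_db.acct_stg;

.LAYOUT acct_layout;
.FIELD acct_id  * VARCHAR(18);
.FIELD acct_bal * VARCHAR(20);

.DML LABEL ins_acct;
INSERT INTO stg_db.acct_stg (acct_id, acct_bal)
VALUES (:acct_id, :acct_bal);

.IMPORT INFILE /data/in/acct.dat
        FORMAT VARTEXT '|'
        LAYOUT acct_layout
        APPLY ins_acct;

.END MLOAD;
.LOGOFF;
EOF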

Environment: UNIX, C++, Teradata, Netezza, DataStage 8.7, Continuus

Confidential, Canton, OH

ETL Developer

Responsibilities:

  • Designed target tables per the requirements from the reporting team, and designed the Extraction, Transformation and Loading (ETL) processes using DataStage.
  • Extensively used Transformer, Merge, Join, Aggregator, Sort, Filter, Look up and Remove Duplicates stages to transform the data extracted into staging tables.
  • Designed and delivered ETL processes to load the data from the staging tables into the target tables using BTEQ/MULTILOAD scripts.
  • Re-designed existing jobs with a different logical approach to improve performance.
  • Used the Datastage Designer to develop processes for Extracting, Cleansing, Transforming and Loading data into Data Warehouse.
  • Involved in designing the procedures for getting the data from all systems to the Operational Data Store.
  • Involved in developing shell scripts for loading data into Oracle using SQL*Loader (SQLLDR), and developed the control files for this process (see the sketch after this list).
  • Created table definitions, indexes and views.
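A hedged sketch of the SQL*Loader pattern described above: generate a control file, then load a pipe-delimited flat file into an Oracle stage table. The connect string, file paths and the acct_stage table are hypothetical placeholders.

#!/bin/ksh
# Write a SQL*Loader control file, then run sqlldr against it.
# Connect string, paths and table name are hypothetical.
cat > /tmp/acct_stage.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/acct_stage.dat'
APPEND
INTO TABLE acct_stage
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(acct_id, acct_name, open_date DATE "YYYY-MM-DD", balance)
EOF

sqlldr userid=etl_user/etl_pass@ORAPROD \
       control=/tmp/acct_stage.ctl \
       log=/tmp/acct_stage.log \
       errors=100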

Environment: IBM InfoSphere 8.7, DB2, Oracle 11g

Confidential, Dallas, TX

Datastage/ETL Developer

Responsibilities:

  • Designed parallel jobs using various stages like Oracle Connector, Data Set, Sort, Join, Merge, Lookup, Filter and Aggregator.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data marts.
  • Designed and delivered BTEQ/MULTILOAD scripts to load the data into the target tables.
  • Improved the performance of complex jobs by designing the sequences to enable multiple jobs to run in parallel where possible.
  • Worked with various relational databases, including DB2, SQL Server, Oracle and Teradata.
  • Wrote complex SQL queries using joins, sub queries and correlated sub queries.
  • Used the DataStage Director for testing and debugging job components and monitoring the jobs.
  • Delivered shell scripts to run the DataStage jobs so that they can be called by Autosys for scheduling (see the wrapper sketch after this list).
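A hedged sketch of the Autosys wrapper pattern above: a Korn shell script that runs a DataStage job through the dsjob client and maps the job status to an exit code Autosys can act on. Project and job names are hypothetical placeholders.

#!/bin/ksh
# Run a DataStage job and translate its finishing status into an exit
# code for Autosys. Project and job names are hypothetical.
PROJECT=DW_PROJ
JOB=load_txn_fact

# -jobstatus makes dsjob wait and return the job status as its exit code
$DSHOME/bin/dsjob -run -jobstatus "$PROJECT" "$JOB"
rc=$?

# DataStage status codes: 1 = finished OK, 2 = finished with warnings
if [ "$rc" -eq 1 ] || [ "$rc" -eq 2 ]; then
    echo "Job $JOB completed (status $rc)"
    exit 0
else
    echo "Job $JOB failed (status $rc)" >&2
    exit 1
fi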

Environment: IBM/Ascential DataStage v8/7.5.1, PL/SQL Developer, Netezza, DB2, Oracle 11g, UNIX (Sun Solaris 5.10), TOAD, Windows XP, Autosys
