
ETL Developer / Datastage Developer - Lead / BI Report Developer Resume


Charlotte, NC

PROFESSIONAL SUMMARY:

  • 8.5 years of experience in ETL Development, Business Intelligence, and Data Warehousing technologies.
  • 8.5 years of experience using IBM Datastage and Oracle.
  • 3 years of experience with the Tableau reporting tool.
  • Experience working with Informatica, Pentaho Data Integration, UNIX shell scripts, and DB2.
  • 5 years of experience in the Asset Management domain across development models from Waterfall to Agile, with specialization in ETL development and reporting.
  • Experience in working with Slowly Changing Dimensions and setting up Changing Data Capture (CDC) mechanisms.
  • Having Very strong experience in ETL using IBM Websphere Information Server (9.1/8.7/8.5/8.1/7.5.2 ) Server edition, Enterprise edition (Parallel Extender).
  • Experience in all stages of WebSphere Administration like installation, Configuration, Deployment, Migration, and Troubleshooting on AIX 6.0/5.3/5.2, Red Hat Linux, SUSE Linux, Solaris 10/9/8, Windows 2003 Server Environments.
  • Developed efficient jobs for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Used the Control-M job scheduler to automate the regular monthly run of the DW cycle in both production and UAT environments.
  • Good experience in scheduling jobs using AutoSys, Tivoli, Zeke, and crontab.
  • Designed and prepared functional specification documents, technical specification documents, and mapping documents for source-to-target mapping with ETL transformation rules.
  • Worked on extracting and loading data from various sources: SQL Server 2008/2005, Oracle, IBM DB2, Teradata V2R5/R6/TD12, flat files, and XML files.
  • Worked with stored procedures to insert and cleanse data in the data warehouse tables; fired triggers on load completion to log every loaded table with its job name and load date.
  • Wrote UNIX shell scripts in ksh for scheduling sessions, process automation, and pre- and post-session scripts.
  • Experienced in developing both Server and Parallel jobs running on a UNIX platform.
  • Experience in UNIX shell scripting for pre-processing files and checking file integrity.
  • Strong understanding of Data Warehousing principles using Fact Tables, Dimension Tables, Star Schema modeling and Snowflake Schema modeling.
  • Implemented Slowly Changing Dimension phenomenon while building Data Warehouses.
  • Deep understanding of OLTP, OLAP, and data warehousing environments, and of tuning both kinds of systems for peak performance.
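
The file pre-processing and integrity checks mentioned above can be sketched as a small shell routine. This is a minimal, hypothetical example: it assumes a pipe-delimited feed whose last line is a trailer record of the form `TRL|<row_count>`; the layout and file names are illustrative, not taken from any specific project.

```shell
#!/bin/sh
# Sketch of a pre-ETL file integrity check.  Assumes a pipe-delimited
# feed ending in a hypothetical "TRL|<row_count>" trailer record.

check_feed() {
    feed="$1"

    # Reject missing or empty files before the ETL cycle starts.
    [ -s "$feed" ] || { echo "FAIL: $feed missing or empty"; return 1; }

    # Compare the detail row count against the count in the trailer.
    expected=$(tail -n 1 "$feed" | cut -d'|' -f2)
    actual=$(( $(wc -l < "$feed") - 1 ))   # exclude the trailer itself

    if [ "$actual" -eq "$expected" ]; then
        echo "OK: $feed ($actual rows)"
    else
        echo "FAIL: $feed expected $expected rows, found $actual"
        return 1
    fi
}
```

A wrapper like this would typically run as a pre-session step, with the scheduler holding downstream loads until it exits 0.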

TECHNICAL SKILLS:

ETL Tools: Datastage 11.5, 11.3, 8.5, 8.1, 7.5, Pentaho DI, Informatica.

Reporting Tools: Tableau

Languages: C, C++, UNIX shell scripting, SQL, HTML.

Database: Oracle 11g (SQL, PL/SQL), DB2, SQL Server, Greenplum

Operating systems: Windows XP, Windows 7, UNIX

Scheduling tools: Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

ETL Developer / Datastage Developer - Lead/ BI Report Developer

Environment: Datastage 8.1, Datastage 11.3, Oracle, Tableau 9, UNIX Shell Scripts, Borland StarTeam, Autosys

Responsibilities:

  • Translated requirements into a data model (star schema).
  • Managed the ETL process end to end with Analysis, Design, Development and testing of Datastage jobs.
  • Implemented SCD Type 1 and Type 3 dimensions with CDC.
  • Automated the entire ETL process with Datastage sequences and daily files; automated handling of job failures, file delays, file validations, and data reloads.
  • Tableau development: created dashboards per requirements, supported UAT testing, and deployed dashboards to Production.
  • Provided end-to-end production support and maintenance for SAMI applications, resolving tickets and incidents.
  • Communicated with the vendor to request the source feeds that load the data warehouse.
  • Scheduled jobs using Autosys and deployed them in Production.
  • Maintained code versions (both Datastage and Oracle) in Borland StarTeam.
  • Handled all project-related documents, including the BRD and technical design.
  • Supported enterprise-wide Disaster Recovery testing for SAMI applications.
  • Handled change management in ServiceNow: end-to-end support in creating, implementing, and closing changes for all production releases.
  • Managed and worked with the offshore team, allocating development tasks and leading daily status calls.
  • Developed ETL jobs to load data from SQL Server, Oracle databases, and flat files to the target warehouse.
  • Used Datastage 8.5 to transform a variety of financial transaction files from different product platforms into a standardized data mart.
  • Scheduled jobs in Control-M, replacing the mainframe jobs with Datastage jobs.
  • Created various PL/SQL Scripts involving Insert, Sequences, Constraint and Index scripts.
  • Developed ETL jobs to load data from DB2 database, Flat files and XML files to Target warehouse, and experience with high volume databases.
  • Optimized SQL Queries for maximum performance.
  • Proficient in writing Packages, Stored Procedures, Functions, Views and Database Triggers using SQL and PL/SQL in Oracle.
  • Used Parallel Extender for parallel processing for improving performance when extracting the data from the sources.
  • Used Datastage Designer 8.5 to develop the jobs; later migrated them from 8.5 to 9.1.
  • Involved in development of Datastage jobs with required stages such as Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Surrogate Key Generator, Sort, and Transformer.
  • Designed and developed jobs using Datastage/QualityStage for loading data into dimension and fact tables.
  • Developed job sequences for automating and scheduling the Datastage jobs in Control-M.
  • Experience in generating and interpreting mapping documentation and translating it into detailed design specifications and ETL code.
  • Installed and configured DB2 on AIX.
  • Extensively designed, developed and implemented Parallel Extender jobs using Parallel Processing (Pipeline and partition) techniques to improve job performance while working with bulk data sources.
  • Used Datastage Director to execute the jobs.
  • Involved in Unit Testing and Integration Testing.
  • Migrated Datastage jobs and components from 7.5 to 8.5.
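
Runs launched from Datastage Director, as described above, are typically driven from a scheduler such as Control-M via IBM's `dsjob` command-line utility. The sketch below is hypothetical: the project and job names are placeholders, and a `DRYRUN` switch only prints the command so the wrapper can be illustrated without a Datastage engine present.

```shell
#!/bin/sh
# Sketch of invoking a Datastage job run from the command line with
# `dsjob`, as a scheduler such as Control-M would.  Project and job
# names are hypothetical; DRYRUN=1 only prints the command.

run_ds_job() {
    project="$1"
    job="$2"
    # -run starts the job; -jobstatus waits for it and returns its status.
    cmd="dsjob -run -jobstatus $project $job"
    if [ "${DRYRUN:-0}" -eq 1 ]; then
        echo "$cmd"
    else
        $cmd
    fi
}
```

Usage (hypothetical names): `DRYRUN=1; run_ds_job DWPROJ LoadFactDaily`. In a real schedule, the scheduler would check the wrapper's exit status before releasing dependent jobs.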

Environment: IBM Datastage 9.1/8.5/7.5 (Director, Designer), PL/SQL, IBM DB2, WebSphere ESB, IBM BPM 7.5/8.x, SQL Server, Perl, Oracle, AIX 6.0, UNIX, BO, Linux, Control-M.

Confidential, Charlotte, NC

ETL Developer / Datastage Developer - Lead

Environment: Datastage 8.1, Oracle

Responsibilities:

  • Created Datastage jobs that generate XML files and use COBOL files as the source.
  • Data Analysis and Design of the ETL framework.
  • Optimized the existing jobs for high performance.
  • Development of JIL files and scheduling of jobs in CA Autosys
  • End to End Testing support in Dev and UAT
  • Migration of jobs to higher environments - IT, UAT and PROD
  • PROD support and maintenance - PROD fixes and Solving Incidents.
  • Created all project-related documents, covering both functional and technical design.
  • Created Change Requests/Deployment packages for Production releases.
  • Analyzed Business Requirements by working closely with Business Area.
  • Designed Star/Snowflake schemas, converted pre-existing serial applications to parallel processing using Data Stage parallel extender.
  • Utilized Tivoli scheduler to schedule jobs based on different dependency checks such as incoming file and previous job runs.
  • Designed jobs using different parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Change Captured, Modify, and Aggregator.
  • Used debugging stages such as Row Generator, Column Generator, and Peek; worked with DB2 and Teradata stages.
  • Demonstrated proficiency in working with the XML Input and XML Output stages to extract data and create XML documents.
  • Used XML parallel Real Time Integration stages to extract and transform XML nodes from OLTP systems.
  • Extensively used Teradata utilities like Fast Export and MLoad.
  • Implemented Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
  • Implemented incremental extraction and incremental load.
  • Designed and developed various jobs for scheduling and running jobs under job sequencer and Data Stage Director.
  • Extensively worked on scripts to schedule jobs to run from Tivoli Scheduler.
  • Performed Administrator functions such as creating projects, setting tunables, protecting project, releasing the jobs, and setting environment variables.
  • Extensively implemented import/export utilities for migrating code.
  • Replaced Transformer stages with other stages to improve job performance.
  • Worked with the configuration file in Parallel Extender to take maximum advantage of the parallel environment.
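
The Autosys JIL development mentioned above can be sketched as a shell function that emits a job definition for a Datastage load. The `insert_job`, `command`, `machine`, and `condition` attributes are standard JIL, but every value here (job names, machine, script and log paths) is a hypothetical placeholder.

```shell
#!/bin/sh
# Sketch of generating a CA Autosys JIL definition for an ETL job.
# All names and paths are hypothetical placeholders.

make_jil() {
    job="$1"; upstream="$2"; script="$3"

    # condition: s(<job>) releases this job on the upstream job's success.
    cat <<EOF
insert_job: $job   job_type: CMD
command: $script
machine: etl_host01
condition: s($upstream)
std_out_file: /logs/$job.out
std_err_file: /logs/$job.err
EOF
}
```

Usage (hypothetical): `make_jil LOAD_DIM_CUST CHK_SRC_FILES /scripts/run_load_dim_cust.sh > load_dim_cust.jil`, then load the file with the `jil` utility.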

Environment: IBM Information Server (Data Stage & Quality Stage) 8.5, MS Visio, Tectia SSH, IBM DB2, Teradata, Teradata SQL Assistant, Autosys, Serena Dimensions (Version Control), Cognos

Confidential

ETL Developer / Datastage Developer

Environment: Datastage 8.1, Oracle, Unix- shell scripts

Responsibilities:

  • Development of Datastage jobs from Informatica jobs (migration of the entire ETL framework from Informatica to Datastage).
  • Requirement analysis and design of Datastage jobs.
  • Analysis of the existing Informatica job designs and ETL framework.
  • Optimization of the Datastage jobs to produce outputs faster than the original Informatica jobs.
  • Created SCD Type 1 jobs/ Used Containers for efficient job design and reusability.
  • End to End Testing with Dev, QA and UAT support
  • Development of JIL files and scheduling of jobs in Autosys
  • Migration of jobs to higher environments - IT, UAT and PROD
  • PROD support and maintenance solving Incidents and Change requests.
  • Designed and delivered monthly ETL processes to update the health information of customers in the EDW by using BTEQ/MLOAD.
  • Designed and developed quarterly Datastage jobs to replenish and update the health information of customers across the country, per the business requirements.
  • Developed Data Stage jobs to extract data from Teradata and load into the Staging area and Work Tables in Aster Database using Ncluster Loader.
  • Extensively used Transformer, Lookup, Merge, Join, Sort, Filter, Aggregator, Remove duplicate stages to transform the data extracted from source systems
  • Developed processes to check the quality of data based on business rules and generate reports on data quality.
  • Developed UNIX shell scripts to automate file manipulation and data loading procedures.
  • Scheduled the jobs using Control-M.
  • Participated in the discussion sessions to design the ETL cycle.
  • Supported monitoring of daily jobs in Production environment.
  • Designed ETL jobs to support Ad-hoc reports for the business users.
  • Delivered ETL process to update the patient insurance coverage details
  • Delivered ETL process to update the payment details and provider details as per requirements of the client.
  • Delivered archive process to archive the processed data.
  • Executed and monitored batch processes as per run schedule and reported the statistics.
  • Supported code migration to higher environments for each development release.
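
The rule-based data quality checks and shell automation described above could look like the following awk-based sketch. The field positions and the two validation rules are hypothetical examples, not the actual business rules of the engagement.

```shell
#!/bin/sh
# Sketch of a rule-based data quality check on a pipe-delimited
# extract.  Field positions and rules are hypothetical examples:
#   field 2 (amount)     must be numeric
#   field 3 (state code) must be two uppercase letters

dq_check() {
    file="$1"
    awk -F'|' '
        $2 !~ /^[0-9]+(\.[0-9]+)?$/ { bad++; print "bad amount: " $0; next }
        $3 !~ /^[A-Z][A-Z]$/        { bad++; print "bad state: " $0; next }
        { good++ }
        END { printf "good=%d bad=%d\n", good+0, bad+0 }
    ' "$file"
}
```

In practice a check like this would write its rejects to a report file and gate the load, failing the batch when the bad-row count exceeds a threshold.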

Environment: IBM InfoSphere 8.5, Oracle 11g, Netezza, Teradata 13, Aster, PVCS, Linux

Confidential

ETL Developer / Datastage Developer - Lead

Environment: Pentaho Data Integration, Greenplum, Shell scripts

Responsibilities:

  • Involved in Data Model design and Design/Estimates.
  • Development in Open source tools - Pentaho and Greenplum.
  • Management of resources and allocation of tasks, including QA.
  • Development of ETL jobs that loads Dimensions and Facts.
  • Implemented SCD type 2 with Effective start dates and end dates with Pentaho stages.
  • Testing/Migration of code to higher environments.
  • Conducting Pentaho ETL Training sessions for the team.
  • PROD support and maintenance.
  • Created project documents.
