Senior DataStage Developer / PL/SQL Developer Resume
Kansas City, MO
EXPERIENCE SUMMARY:
- Nearly 11 years (10 years, 11 months) of experience across the Software Development Life Cycle (SDLC), including design, development and testing on a variety of technology platforms, with special emphasis on Data Warehouse and Business Intelligence applications.
- Experience in designing Data Marts and Data Warehouses using Star Schema and Snowflake Schema to implement Decision Support Systems.
- Expertise in the Data Warehousing ETL tool IBM InfoSphere DataStage PX 8.5, 8.7, 9.1, 11.3 and 11.5, using components such as DataStage Designer and DataStage Director, as well as the Microsoft ETL tool SSIS.
- Good knowledge of real-time data warehouse development using the CDC tool Informatica PowerExchange 9.1/8.6/8.1.
- Good knowledge of troubleshooting DataStage jobs and addressing issues such as performance tuning and quality assurance (QA) testing.
- Good knowledge of DataStage migration from 8.7 to 11.3/11.5 and data conversion projects.
- Good knowledge of PL/SQL, Oracle, DB2, SQL*Loader, Control-M, AutoSys, ASG-Zena and UNIX/Linux.
- Extensive experience working in an Agile development environment and DevOps.
TECHNOLOGY:
Data Warehousing ETL & Profiling tools: InfoSphere DataStage PX 8.5, 8.7, 9.1, 11.3, 11.5; Microsoft SSIS; SSRS; Crystal Reports; data warehousing; Hadoop, HDFS, MapReduce, Big Data, HBase, Pig, Sqoop, Oozie, Hive; QualityStage; Informatica PowerCenter 9.5.1/8.6/7.1; data cleansing
Programming Languages: SQL and PL/SQL, Unix/Linux commands & Shell Scripts
Database: Oracle, DB2, Sybase, SQL Server 2008 & 2012, JDE Database, Netezza, Teradata
Operating Systems: Windows 2003/XP/7 and UNIX/Linux
Special Software: ASG-Zena, HP Quality Center, FootPrints, MS VSS, TFS, Control-M, AutoSys, Agility tool, Kanban board, SVN, JIRA, SharePoint, DOORS, EIM, DevOps, VSTS
PROFESSIONAL EXPERIENCE:
Confidential, Kansas City, MO
Tools: DataStage 11.5, DB2, Oracle, Sybase, PL/SQL, UNIX/Linux, ASG-Zena, SharePoint
Senior DataStage Developer/PL SQL Developer
Responsibilities:
- Solution design: working with data modelers and BDAs to help design the optimal solution for the current project at BlueKC.
- Working primarily in the Appeals, Claims and Membership subject areas, processing member data into the warehouse and then processing claims data after mapping members to 10 claims tables in the warehouse.
- Developed various complex DataStage jobs, both parallel and server, for processing Claims, Membership, Appeals and other data.
- Worked on extracting data from various source databases such as DB2, Sybase, SQL Server and Netezza, applied various complex transformations, and loaded the data into various target databases.
- Worked extensively with third-party vendor files for Appeals, Claims, member matching, etc.
- Worked extensively with flat files (fixed-width and delimited), web service calls (SOAP and REST), XML files and hashed files in server jobs.
- Designed various shareable components such as Shared Containers, custom-built routines and transforms.
- Worked on generating EOBs (Explanation of Benefits), a one-of-its-kind complex process designed in DataStage.
- Highly competent in writing UNIX scripts and integrating them into DataStage.
- Worked on enhancing the DataStage framework into which every developer's jobs must fit, which simplified debugging and let anyone follow the flow of data.
- Designed ETL DataStage jobs to meet business output requirements in an optimized way, using DataStage and UNIX shell scripting.
- Adhered to design reviews, code reviews, deployment documents and appropriate industry-standard development methodologies.
- Tasks performed on this project included requirements gathering, estimation, creating related documents in all SDLC phases, creating the data model, developing IBM DataStage jobs to populate the DWH and data marts, creating UNIX shell scripts, ensuring project deliverables for smooth rollout to the production environment, production support, and transition to the support team.
- Involved in code reviews and provided solutions to improve performance tuning of ETL DataStage jobs and query optimization (see the sketch after this list).
- Worked closely with program analysts, testers and clients to identify the features the software needs to contain and to write stories for the functionality to be completed.
- Used Scrum for agile development and participated in team-led solutions, reviewing peers' code for quality and completeness.
- Worked on job scheduling using the Zena tool and designed various complex scheduling techniques, some of them custom-built.
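A minimal sketch of the kind of query tuning referenced above, assuming a hypothetical claims star schema; the member_dim and claim_fact tables, their columns and the bind variables are illustrative only, not the actual BlueKC schema.

```sql
-- Restrict the claim scan with selective predicates and bind variables before
-- joining, then review the optimizer's plan. All object names are hypothetical.
EXPLAIN PLAN FOR
SELECT m.member_id,
       c.claim_number,
       c.paid_amount
FROM   member_dim m
JOIN   claim_fact c
       ON c.member_sk = m.member_sk
WHERE  m.plan_code    = :plan_code          -- bind variables keep the plan reusable
AND    c.service_date >= :period_start
AND    c.service_date <  :period_end;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```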
Confidential, Chandler, AZ
Tools: DataStage 9.1/11.3/11.5, Oracle, DB2, PL/SQL, AutoSys, UNIX/Linux, SharePoint, Hive, HBase, Hadoop, MapReduce
Senior DataStage Developer/ PL SQL Developer
Responsibilities:
- Used DataStage 9.1/11.3 as an ETL tool to extract data from source systems, perform data validations, aggregate and filter the data, and load it into the target tables.
- Extensively used the Designer, Director, Administrator and Manager to develop various parallel jobs/sequences to extract, transform, integrate and load data into the data warehouse.
- Extracted data from mainframe applications (a hierarchical file system) using the DB2 stage; designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Pivot, Sort, Surrogate Key Generator, Change Data Capture (CDC), Modify, Row Generator and Aggregator.
- Created Job Parameters and Environment variables to run the same job for different schemas.
- Implemented audit functionalities for Confidential Rewards project.
- Worked on AutoSys for job scheduling; created EIM jobs and IFB files for data loads from EIM to Siebel.
- Participated in project planning sessions with project managers, business analysts and team members to analyse business requirements and outline the proposed solution.
- Created new Linux/Unix shell scripts and enhanced the existing shell scripts
- Developed complex SQL queries using various joins and developed various dynamic SQL statements throughout the projects.
- Conducted Oracle database and SQL code tuning to improve application performance, using bulk binds, in-line queries, dynamic SQL, analytic functions and subquery factoring.
- Handled performance tuning of jobs to ensure faster data loads; worked extensively with Shared Containers to reuse business functionality and write complex transformation algorithms.
- Assisted the operations support team with transactional data loads by developing SQL and UNIX/Linux scripts; participated in unit testing and integration testing of DataStage jobs.
- Involved in writing complex PL/SQL queries, validation scripts, stored procedures, cursors, and triggers for ETL requirements.
- Created Hive tables and wrote Hive queries for data analysis to meet the business requirements (see the sketch after this list).
- Involved in loading data from UNIX file system to HDFS.
- Created HBase tables to store data arriving in varying formats from different portfolios.
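A minimal HiveQL sketch of the Hive work mentioned above: an external table over files landed in HDFS plus a simple analysis query. The reward_txn table, its columns and the HDFS path are hypothetical, not the actual project objects.

```sql
-- External table over delimited files already loaded into HDFS (names illustrative).
CREATE EXTERNAL TABLE IF NOT EXISTS reward_txn (
    account_id  STRING,
    txn_date    STRING,
    txn_amount  DECIMAL(12,2),
    channel     STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/raw/reward_txn';

-- Summarise transaction volume and amount by channel for one (illustrative) month.
SELECT channel,
       COUNT(*)        AS txn_count,
       SUM(txn_amount) AS total_amount
FROM   reward_txn
WHERE  txn_date LIKE '2016-06%'
GROUP  BY channel;
```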
Confidential
Tools: DataStage 11.5, DB2, PL/SQL, Control-M, UNIX/Linux, JIRA, DOORS, SharePoint, QualityStage
Senior ETL (DataStage) and PL/SQL Developer
Responsibilities:
- Participated in project planning sessions with project managers, business analysts and team members to analyse business requirements and outline the proposed solution.
- Designed ETL jobs using IBM InfoSphere DataStage 11.5 to extract high-volume data from legacy systems, transform it, and load it into staging and then into the database after data conversion.
- Involved in Control-M job scheduling, flow analysis, task creation, monitoring, troubleshooting and agent installation.
- Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Workflow Manager.
- Created new Linux/Unix shell scripts and enhanced the existing shell scripts
- Developed complex SQL queries using various joins and developed various dynamic SQL statements throughout the projects.
- Created sessions, database connections and batches using Informatica Workflow Manager
- Extensively used the Designer, Director, Administrator and Manager to develop various parallel jobs/sequences to extract, transform, integrate and load data into the data warehouse.
- Extracted data from mainframe applications (a hierarchical file system) using the DB2 stage; designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Pivot, Sort, Surrogate Key Generator, Change Data Capture (CDC), Modify, Row Generator and Aggregator.
- Handled performance tuning of jobs to ensure faster data loads; worked extensively with Shared Containers to reuse business functionality and write complex transformation algorithms.
- Created Job Parameters and Environment variables to run the same job for different schemas.
- Performed code reviews for all jobs before they were moved to production.
- Assisted the operations support team with transactional data loads by developing SQL and UNIX/Linux scripts; participated in unit testing and integration testing of DataStage jobs.
- Involved in writing complex PL/SQL queries, validation scripts, stored procedures, cursors, and triggers for ETL requirements.
- Used QualityStage stages such as Investigate, Standardize, Match and Survive to address data quality and data profiling issues during design.
- Prepared unit test scripts, the technical design document and the STM (Source-to-Target Mapping) for DataStage development.
- Involved in all DataStage admin tasks, patch installation and configuring IBM InfoSphere enterprise edition software on workstations.
- Created numerous triggers, procedures, packages and functions for front-end validation (see the sketch below).
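A minimal PL/SQL sketch of the front-end validation objects described above, assuming a hypothetical member_stg staging table; the procedure, trigger, table and error messages are illustrative only.

```sql
-- Hypothetical validation procedure reused by a row-level trigger.
CREATE OR REPLACE PROCEDURE validate_member (
    p_member_id IN member_stg.member_id%TYPE,
    p_dob       IN member_stg.date_of_birth%TYPE
) IS
BEGIN
    IF p_member_id IS NULL THEN
        RAISE_APPLICATION_ERROR(-20001, 'Member id must not be null');
    END IF;
    IF p_dob > SYSDATE THEN
        RAISE_APPLICATION_ERROR(-20002, 'Date of birth cannot be in the future');
    END IF;
END validate_member;
/

-- Trigger fires the validation before every insert or update on the staging table.
CREATE OR REPLACE TRIGGER trg_member_stg_validate
BEFORE INSERT OR UPDATE ON member_stg
FOR EACH ROW
BEGIN
    validate_member(:NEW.member_id, :NEW.date_of_birth);
END;
/
```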
Confidential, Vanguard, PA
Tools: DataStage 8.7 & 11.5, Oracle, DB2, PL/SQL, Control-M, Agility tool, UNIX/Linux, Informatica PowerCenter, Kanban board
Sr. DataStage (ETL) Developer and Administrator/PL/SQL Developer
Responsibilities:
- Participated in project planning sessions with project managers, business analysts and team members to analyse business requirements and outline the proposed solution.
- Preparing Technical Design document, STM (Source to Target Mapping) for DataStage development.
- Developed complex PL/SQL queries using various joins and developed various dynamic SQL statements throughout the projects.
- Used Scrum for agile development and participated in team-led solutions, reviewing peers' code for quality and completeness.
- Conducted Oracle database and SQL code tuning to improve application performance, using bulk binds, in-line queries, dynamic SQL, analytic functions and subquery factoring (see the sketch after this list).
- Prepared the unit test case document based on the low-level documentation.
- Used DataStage as an ETL tool to extract data from source systems, perform data validations, aggregate and filter the data, and load it into the target tables and create XML output.
- Used QualityStage to coordinate delivery and consistency and to remove data anomalies and spelling errors from the source information.
- Used QualityStage stages such as Investigate, Standardize, Match and Survive to address data quality and data profiling issues during design.
- Involved in building customized rule sets for countries that have no default rule sets in QualityStage.
- Involved in Control-M job scheduling, flow analysis, task creation, monitoring, troubleshooting and agent installation.
- Supported Informatica mappings that load data from the Siebel database to the BI database.
- Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Workflow Manager.
- Coordinated with offshore team to support the existing ODS system jobs.
- Developed Informatica mappings to load data into slowly changing dimensions.
- Involved in upgrading Informatica from the older version to the newer version.
- Used Remove Duplicate stage, sort stage, Transformer stage, Filter stage, Join stage, Copy stage, Sequential file, Lookup stage, Oracle connector stage, Dataset stage, Funnel stage, XML Input, XML Output stage etc. for development, enhancement and performance tuning of DataStage ETL jobs.
- Imported and exported jobs/sequences from one server to another and was involved in preparing builds for DataStage migration to SAT, CAT and production.
- Involved in developing new ETL jobs for new modules and enhancing the existing system.
- Involved in creating job sequences and used different stages such as Notification Activity and Exception Handler.
- Built jobs/sequences for the data-strategy data masking project as a key member.
- Participated in daily scrum calls/meetings as part of the agile methodology and Kanban.
- Involved in production support as a DataStage specialist.
- Involved in code and job review meetings and provided solutions to improve performance tuning of the ETL DataStage jobs.
- Involved in requirement gathering, discussion with business analyst and client team members (FAS, IIG, and INTL).
- Involved in managing and handling offshore/onshore DataStage team members.
- Involved in writing complex PL/SQL queries, stored procedures, cursors, and triggers for requirements. Developed UNIX/Linux scripts for the ETL project.
- Responsible for identifying and fixing performance bottlenecks in DataStage business mappings.
- Involved and supported all DataStage Admin tasks in current DataStage project.
- Involved and interacted with users, business and offshore/onshore team members during DataStage code migration to SAT, CAT and Production.
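A minimal PL/SQL sketch of the tuning techniques mentioned above (subquery factoring, an analytic function, and a FORALL bulk bind); the acct_txn and acct_daily_bal tables and their columns are hypothetical, not the actual project schema.

```sql
DECLARE
    -- Collection typed against the hypothetical target table.
    TYPE t_bal_tab IS TABLE OF acct_daily_bal%ROWTYPE;
    l_rows t_bal_tab;
BEGIN
    -- Subquery factoring (WITH) plus an analytic running total per account.
    WITH daily AS (
        SELECT account_id,
               TRUNC(txn_date)                                 AS bal_date,
               SUM(txn_amount) OVER (PARTITION BY account_id
                                     ORDER BY TRUNC(txn_date)) AS running_balance
        FROM   acct_txn
    )
    SELECT DISTINCT account_id, bal_date, running_balance
    BULK   COLLECT INTO l_rows
    FROM   daily;

    -- Bulk bind: one context switch for the whole insert instead of row-by-row.
    FORALL i IN 1 .. l_rows.COUNT
        INSERT INTO acct_daily_bal VALUES l_rows(i);

    COMMIT;
END;
/
```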
Confidential, Chicago
Tools: DataStage 8.5, 8.7, 9.1, Oracle, PL/SQL, UNIX/Linux, Control-M, HP Quality Center, Informatica, QualityStage
Senior DataStage (ETL) Developer and Module Lead
Responsibilities:
- Used DataStage as an ETL tool to extract data from source systems, perform data validations, aggregate and filter the data, and load it into the target tables and create XML output.
- Used Remove Duplicate stage, sort stage, Transformer stage, Filter stage, Join stage, Copy stage, Sequential file, Lookup stage, Oracle connector stage, Dataset stage, Funnel stage, XML Input, XML Output stage etc. for development, enhancement and performance tuning of DataStage ETL jobs.
- Imported and exported jobs and QualityStage rule sets from one server to another.
- Involved in developing new ETL jobs to enhance the existing system.
- Involved in creating job sequences and used different stages such as Notification Activity and Exception Handler.
- Used QualityStage stages such as Investigate, Standardize, Match and Survive to address data quality and data profiling issues during design.
- Involved in creating new DTDs and modifying existing ones for new requirements.
- Prepared DataStage TFS build document, best practice document, Check variance document, Approach document, Apt Transport Block Size document for Confidential HR project
- Supported Informatica mappings that load data from the Oracle database to the BI database for report generation.
- Involved in upgrading Informatica from the older version to the newer version.
- Involved in building customized rule sets for countries that have no default rule sets in QualityStage.
- Involved in creating DDL, DML and DCL statements in Oracle for the Confidential HR project.
- Involved in code and job review meetings and provided solutions to improve performance tuning of the ETL DataStage jobs.
- Prepared unit and integration test cases based on the DBR (detailed business requirements) and DTD (detailed technical design).
- Involved in creating packages for the TFS DataStage build through IBM InfoSphere Information Server Manager to move ETL jobs and QualityStage rule sets to SIT, UAT and production.
- Involved in managing and handling offshore/onshore DataStage team members.
- Involved in writing complex PL/SQL queries, stored procedures, cursors, and triggers for requirements, as well as UNIX/Linux shell scripts.
- Developed complex PL/SQL queries using various joins and developed various dynamic SQL statements throughout the projects (see the sketch after this list).
- Involved in Control-M job scheduling, flow analysis, task creation, monitoring, troubleshooting and agent installation.
- Responsible for identifying and fixing performance bottlenecks in DataStage business mappings.
- Involved and supported all DataStage Admin tasks in current DataStage project.
- Involved and interacted with users, business and offshore/onshore team members during DataStage code migration to SAT, CAT and Production.
- Performed installation and configuration activities for IIS v8.0, v8.1, v8.5, v8.7, v9.0.1 and v11.3 servers.
- Installed migration patches and prerequisite patches for IS81 and IS85.
- Performed installation and configuration activities for IIS v8.7, v9.0.1, v9.1.2 and v11.3.
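A minimal PL/SQL sketch of the dynamic SQL pattern referenced above, where the statement is assembled at runtime because the target table name arrives as a parameter; the purge_staging procedure, its parameters and the load_date column are hypothetical.

```sql
CREATE OR REPLACE PROCEDURE purge_staging (
    p_table_name IN VARCHAR2,
    p_purge_days IN NUMBER
) IS
    l_sql VARCHAR2(4000);
BEGIN
    -- DBMS_ASSERT validates the table name before it is concatenated into the statement.
    l_sql := 'DELETE FROM ' || DBMS_ASSERT.SQL_OBJECT_NAME(p_table_name)
          || ' WHERE load_date < SYSDATE - :days';

    EXECUTE IMMEDIATE l_sql USING p_purge_days;
    DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' rows purged from ' || p_table_name);
    COMMIT;
END purge_staging;
/
```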
Confidential
Tools: DataStage 9.1, DB2, PL/SQL, UNIX/Linux, Control-M, Quality Centre
ETL Team Lead and Senior DataStage Developer
Responsibilities:
- Used DataStage as an ETL tool to extract data from source systems, perform data validations, aggregate and filter the data, and load it into the target tables.
- Used Remove Duplicate stage, sort stage, Transformer stage, Filter stage, Join stage, Copy stage, Sequential file, Merge stage, Lookup stage, DB2 stages, ODBC stage, Dataset stage, Funnel stage etc. for development, enhancement and performance tuning of DataStage ETL jobs.
- Imported and exported jobs from one server to another.
- Involved in developing jobs for enhancements and new requirements.
- Involved in creating job sequences and used different stages such as Notification Activity and Sequencer.
- Wrote new UNIX/Linux shell scripts and enhanced existing ones for the project.
- Prepared Technical Design document, best practice document, Analysis document, Approach document, STM (Source to Target Mapping) for DataStage development
- Involved in creating DDL and DML statements and raising EDM requests for the IA FNA project (see the sketch after this list).
- Developed complex PL/SQL queries using various joins and developed various dynamic SQL statements throughout the projects.
- Involved in building customized rule sets for countries that have no default rule sets in QualityStage.
- Effectively used the Standardize stage to standardize source data with the existing rule sets (such as name, address and area) for multiple countries, and generated valid- and invalid-data reports.
- Prepared the low-level documentation based on the high-level documentation.
- Prepared unit and integration test cases based on the low-level documentation.
- Interacted with users and other team members during UAT and promoted the code to production.
- Involved in preparing IQA and EQA documents and reviews as a QA expert.
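A minimal sketch of the DDL and DML referenced above, assuming a hypothetical staging table; the fna_party_stg name and its columns are illustrative only, not the actual IA FNA objects.

```sql
-- Hypothetical staging table (DDL).
CREATE TABLE fna_party_stg (
    party_id     NUMBER(12)    NOT NULL,
    party_name   VARCHAR2(200),
    country_code CHAR(2),
    load_date    DATE          DEFAULT SYSDATE,
    CONSTRAINT pk_fna_party_stg PRIMARY KEY (party_id)
);

-- Typical DML used while loading and correcting staged rows.
INSERT INTO fna_party_stg (party_id, party_name, country_code)
VALUES (1001, 'SAMPLE PARTY LTD', 'US');

UPDATE fna_party_stg
SET    country_code = 'GB'
WHERE  party_id = 1001;
```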