
Sr. ETL Developer/Lead Resume


Pleasanton, CA

PROFESSIONAL SUMMARY:

  • Around 10 years of experience in Client/Server business systems and Decision Support Systems (DSS) analysis, design, development, testing, and implementation.
  • Strong experience in Extraction, Transformation and Loading (ETL) applications for Data Warehouses and Data Marts using DataStage 11.5/8.x/7.x (DataStage Manager, DataStage Designer, DataStage Director, DataStage Administrator, Version Control).
  • Experience working with the Confidential Information Server Suite 11.3.
  • Extensively worked on Confidential InfoSphere (DataStage, QualityStage) / Information Analyzer 11.5/8.5.
  • Expertise in Logical and Physical Data Modeling and Design
  • Migrated jobs from 8.5 to 11.5 and developed new DataStage jobs using the DataStage/QualityStage Designer.
  • Experience working with DW and BI security architectures such as ERP.
  • Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.
  • Experience in implementing Best Practice in Data Modeling.
  • Extensive experience in identifying Fact and Dimension tables and working with Star and Snowflake schemas.
  • Extensively worked on Information Analyzer to perform Column Analysis, Table Analysis, Primary Key Analysis and developed rules to identify overlapping data across domains
  • Experience in working with Shared and Local Containers
  • Strong working knowledge with Oracle database Design and SQL Scripting, PL/SQL.
  • Designed advanced SQL queries, stored procedures, packages, scripts, and cursors; created, maintained, modified, and optimized Oracle databases.
  • Efficient in all phases of the software development lifecycle, coherent with Data Cleansing, Data Conversion, Performance Tuning, Unit Testing, System Testing, User Acceptance Testing.
  • Designed mapping documents, ETL architecture documents, and specifications.
  • Analyzed the source data and designed the source system documentation.
  • Extensive experience in loading high volume data, and performance tuning.
  • Set up development, QA & Production environments.
  • Extensive experience in Dimensional modeling techniques.

TECHNICAL SKILLS:

ETL Tool: Confidential InfoSphere Information Server 11.5, DataStage 11.5/8.5, QualityStage, ProfileStage, WebSphere Information Server 8.0

Databases: Oracle 8i/9i/10g/11g, SQL Server 2014/2008/2005, DB2, Teradata

Data Modeling: Star Schema, Snow Flake Schema, Confidential - InfoSphere Data Architect, Erwin, MS Visio, Fact and Dimension Tables

Reporting tools: OBIEE and Business objects 6.5, SAP R/3 6.0 ECC

Languages: SQL, PL/SQL, C, C++, Java, HTML, XML, Jscript, VB Script and Shell script

Operating Systems: Windows 98/NT/2000, Windows Server 2003, XP, UNIX, LINUX, HP-UX 11.23, Solaris 10.x

Other Tools: DB Visualizer, TOAD, SQL Developer, Tivoli LoadLeveler 3.4.1.2 and Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Pleasanton, CA

Sr. ETL Developer/Lead

Responsibilities:

  • Worked closely with the Solution Architect to understand the requirements, identifying the dimension tables, fact tables, and the relationships between them.
  • Implemented the ODS, Staging Area, intermediate tables, and the target data warehouse.
  • Involved in handling huge volumes of data processing. Used DataStage parallel processing along with partitioning techniques to process huge volumes of data. Also used Oracle partitioning based on a date column to process the history data from source to target tables.
  • Analyzed and applied indexes on the target tables for faster data processing on Type 1 or Type 2 data.
  • Worked on tables with different grains of data; conformed these tables to the same grain level and merged them as per the requirement.
  • Created reusable objects that can be used as part of different file loads into Staging Areas using RCP. Also created reusable objects that can be used for CDC.
  • Analyzed DataStage jobs to find performance bottlenecks. Applied corresponding sorting/partitioning techniques to improve the performance of stages like Join/Merge, and replaced Transformers with other stages where possible for performance improvement.
  • Implemented reusable ETL job framework using Datastage for the Delta extraction from different Sources.
  • Code review and Performance tuning of Datastage jobs and SQL’s for optimization.
  • Experience in writing stored procedures to load/fix the history data in target tables.
  • Worked with DBA and applied hints on Source query to improve the performance of data extraction.
  • Experience working with Auto Partition and List Partition on Oracle DB to partition the target data.
  • Preparation of implementation plan, backout plan and Control plan.
  • Support for production deployment activities, implementation and warranty support after post-production.
  • Experience working with Agile methodology: updated user stories and tasks on a daily basis, and attended Sprint Planning and Sprint Review every two weeks.
  • Prepared a dependency document that gives a clear picture of the dependencies between Sequence jobs. Worked with the scheduling team to schedule the DataStage jobs on Tivoli.
  • Worked with multiple release teams, conducting regular meetings to discuss dependent items between teams and get them resolved.
  • Concurrently worked Level 3 PROD support for another project; PROD failures are raised through the ticketing system and tickets are worked on a weekly basis.
  • Worked in an onsite/offshore model as Lead Developer, holding daily calls with offshore to walk through the requirements and get the work done on time.
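Job runs and production support of the kind described above are typically scripted around the `dsjob` command-line interface. The sketch below is illustrative only: the project and job names are placeholders, and the return-code convention (1 = finished OK, 2 = finished with warnings) should be verified against the engine version in use.

```shell
# Illustrative wrapper for running a DataStage job and checking its status.
# DSJOB, project and job names are placeholders, not from this resume.
DSJOB=${DSJOB:-dsjob}

run_job() {
  proj=$1
  job=$2
  # -jobstatus makes dsjob wait and return the job's finishing status
  "$DSJOB" -run -jobstatus -wait "$proj" "$job"
  rc=$?
  case $rc in
    1|2) echo "OK: $job (rc=$rc)" ;;        # finished OK or with warnings
    *)   echo "FAILED: $job (rc=$rc)"; return 1 ;;
  esac
}
```

A scheduler such as Control M or Tivoli can then call the wrapper and act on its exit code.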

Environment: Confidential InfoSphere DataStage 11.5, Oracle 11g, DB2, UNIX, Tableau, Oracle SQL Developer, Control M and Windows XP.

Confidential, San Leandro, CA

Sr. Datastage Developer

Responsibilities:

  • Experience in converting the SQL Server stored procedures in old legacy systems into ETL jobs using the tool Datastage 11.5.
  • Created Framework in Datastage that controls the ATM Hardware application (for all 24 Modules) to process the data as part of daily run.
  • Created process table which controls the whole ATM hardware application.
  • Created the dimension tables using the ETL datastage parallel jobs.
  • Created fact tables using Oracle Stored procedures.
  • Created multi instance parallel job for the file archival process.
  • Created basic Server Routines to pass values between job activities in Sequence jobs.
  • Involved in understanding the business requirements with business team to develop ETL Jobs.
  • Analyzed Data Sources to perform Metadata Validation.
  • Designed /Wrote tech specifications (Source- Target Mappings) for the ETL Mappings
  • Prepared unit test cases for the applications developed.
  • Experience in writing the complex custom SQL’s to test the data between Source and Destination databases.
  • Interacted with the Business Team to understand more about the requirements and conducted status meetings with development team to discuss project status regularly.
  • Migrated the Datastage Jobs from 8.5 to 11.5.
  • Verified migration requirements; installed the new version of Information Server; performed export and import and configuration of tiers, databases, user groups, etc.
  • Worked with the Confidential team and applied patches to support some of the Server jobs from version 8.5 to 11.5.
  • Experience in migrating the jobs from one environment to another using Information Server Manager.
  • Scheduled the jobs using Control M.
  • Created parameter sets that can be used across all the jobs within the application.
  • Used Version Control for DataStage to track the changes made to DataStage project components and to protect jobs by making them read-only.
  • Involved in Unit testing, Functional testing and Integration testing and provide process run time.
  • Some experience with Information Analyzer for integration of data, analyzing business information to assure it is accurate, consistent, timely, and coherent.
  • Experience migrating data from old legacy systems and loading it into an Oracle database using dimensional data modeling.
  • Expertise in writing, re-writing and dissecting complex SQL queries using multiple join conditions, case statements, arithmetic, aggregate functions and query tuning/optimization techniques.
  • Writing SQL Scripts to extract the data from Database for Testing Purposes.
  • Performance tuned and optimized various complex SQL queries.
  • Provided conceptual and technical modeling assistance to developers and DBAs using Erwin and Model Mart. Validated data models with IT team members and the client.
  • Created the Physical Data Model (PDM) for the OLAP application using ER Studio.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extensively used Designer, Administrator, and Director for creating and implementing jobs.
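Migrations between environments like those described above are often scripted with the Information Server `istool` CLI rather than driven manually. The sketch below only builds the command strings; the host names, credentials, archive paths, and exact flag spellings are assumptions to check against the istool reference for the installed Information Server version.

```shell
# Illustrative sketch of scripting DataStage asset migration with istool.
# All names and paths here are placeholders, not from this resume.
build_export_cmd() {
  domain=$1
  project=$2
  archive=$3
  echo "istool export -domain $domain -username dsadm -password-file /secure/pw" \
       "-archive $archive -datastage '$domain/$project/*/*.*'"
}

build_import_cmd() {
  domain=$1
  archive=$2
  echo "istool import -domain $domain -username dsadm -password-file /secure/pw" \
       "-archive $archive -datastage '$domain'"
}
```

Exporting from DEV and importing into QA then becomes two reviewable one-liners instead of a manual Designer session.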

Environment: Confidential InfoSphere DataStage 11.5/8.5, Oracle 11g, SQL Server 2005, UNIX, Business Objects, PL/SQL, Toad, Control M and Windows XP.

Confidential, Albany, NY

Sr. Datastage Developer

Responsibilities:

  • Used the Confidential InfoSphere DataStage and QualityStage 11.3 to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Used DataStage Parallel Extender parallel jobs for improving the performance of jobs.
  • Implementing Industry ETL standards and best practices, performance tuning during designing the Datastage Jobs.
  • Developed Parallel jobs using Parallel stages like: Dataset, Sequential file, Funnel, Filter, Modify, Merge, Join, Lookup, Transformer and External Source Stage.
  • Extensively used processing stages like the Lookup stage to perform lookup operations based on the various target tables, Modify stage to alter the record schema of the input data set.
  • Modified the Data stage jobs to increase performance using various performance tuning techniques.
  • Involved in code migration and code review sessions as part of migration plans.
  • Involved in migration process from DEV to Test and then to PRD.
  • Used Director to run various loads into the warehouse and worked through the various issues that occurred in this process.
  • Performed analysis, database design and development.
  • Responsible for knowledge transfer to the operations team.
  • Utilized the Hadoop open-source framework to store streaming data for big data processing in a distributed environment across clusters of computers. Used the Big Data File stage to access files on the Hadoop Distributed File System (HDFS); the Big Data File stage is similar in function to the Sequential File stage.
  • Provided massive scalability by running jobs using the HDFS stage on the InfoSphere Information Server parallel engine, and used Confidential InfoSphere DataStage Balanced Optimization to process this logic within the Hadoop cluster.

Environment: Confidential Info Sphere DataStage 11.3 (DataStage and Quality Stage), Oracle 10g/9i, SQL, PL/SQL, SQL Developer.

Confidential, Columbus, GA

Sr. Datastage Developer

Responsibilities:

  • Experience working with the Confidential BDW model, which is very useful in the banking sector for building a data warehouse.
  • Installed and administered Confidential Infosphere Information Server Suite 11 Version on Windows.
  • Worked closely with Data Modeler for the requirement gatherings and experience in working with offshore team to pass the requirements and get the things done on time.
  • Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Worked on conceptual/logical/physical data model level using ERWIN according to requirements.
  • Interacted with various Business users in gathering requirements to build the data models and schemas
  • Expertise in understanding data modeling involving logical and physical data model.
  • Worked with DBAs to create a best fit physical data model from the logical data model
  • Created shared containers to incorporate complex business logic in jobs.
  • Extensively used the slowly changing dimension Type 2 approach to maintain history in the database. Maintained separate history tables to maintain history in the BDW model.
  • Created ALT and Relationship tables for maintaining history in BDW Model.
  • Active participation in decision making and QA meetings and regularly interacted with the Business Analysts &development team to gain a better understanding of the Business Process, Requirements & Design.
  • Created parameter sets to assign values to jobs at run time.
  • Created Job Sequencers to automate the jobs.
  • Involved in the full Software Development Life Cycle (SDLC).
  • Created queries to compare data between two databases to make sure data is matched.
  • Created queries using join and case statement to validate data in different databases.
  • Extracted the data from multiples sources like files, Oracle and loaded the data into SQL Server.
  • Developed Parallel jobs using various stages like Join, Merge, Funnel, Lookup, Sort, Transformer, Copy, Remove Duplicate, Filter, Peek, Column Generator, Pivot and Aggregator stages for grouping and summarizing on key performance indicators used in decision support systems
  • Designed and developed job sequences to run multiple jobs.
  • Involved in writing SQL Queries to test the data from the DataStage jobs.
  • Tuned the Long running Parallel jobs based on Partitioning techniques.
  • Worked with Control M to schedule the jobs.
  • Experience in working with 24/7 Production Support
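Cross-database comparison queries like those above are often backed by a simple file-level reconciliation of exported key columns. A minimal sketch, with placeholder file names, using `sort` and `comm`:

```shell
# Illustrative reconciliation of key columns spooled from two databases to
# flat files: prints keys present in the source extract but missing from
# the target extract. File names are placeholders.
reconcile() {
  src=$1
  tgt=$2
  sort "$src" -o "$src.sorted"
  sort "$tgt" -o "$tgt.sorted"
  # comm -23: lines unique to the first (source) file
  comm -23 "$src.sorted" "$tgt.sorted"
}
```

Any key the function prints is a row that loaded into the source system but never arrived in the target, which is exactly what the validation queries look for.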

Environment: Confidential InfoSphere Information Server 11.3, Data stage, Quality Stage, SQL Server Management Studio 2014, Visual Source Safe 2005, HP Quality Center 10.0, Windows XP, Oracle 10g, Confidential - InfoSphere Data Architect and Control M.

Confidential, Oakland, CA

Sr. Datastage Developer

Responsibilities:

  • Involved as an ETL developer during the analysis, planning, design, development, and implementation stages of multiple projects, like UCUES (UC Undergraduate Experience Survey) and UC Path, using Confidential WebSphere DataStage.
  • Acquired the requirements (mapping specs) from the Architect and designed the ETL jobs based on the requirements.
  • Used Confidential WebSphere DataStage as an ETL tool to extract data from source file systems and load it into a DB2 database.
  • Optimized very complicated SQL queries (explain plans, collecting statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Performed Data Profiling and Data Analysis using SQL queries looking for Data issues, Data anomalies.
  • Tuned Teradata SQL statements using Explain, analyzing the data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Worked with Teradata 14.10/14 utilities (MultiLoad, FastLoad, FastExport, TPump, BTEQ).
  • Extracted the data from source File systems and loaded it into staging, base and BI area.
  • The staging area is a direct mapping from the source file systems; the base area contains all the transformations based on the requirement; and the BI area is where Type 2 applies.
  • Reporting from the BI area is done with the Business Objects tool.
  • Experience in working with Quality stage 8.5
  • Migrated jobs from 8.5 to 11.3
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Maintained Data Warehouse by loading dimensions and facts as part of project.
  • Also worked for different enhancements in FACT tables.
  • Created Job Sequences.
  • Collaborated in developing Java Custom Objects to derive the data using Java API.
  • Experience in working with Switch stage and Java LDAP Stage
  • Analyzed performance and monitored workloads for capacity planning.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
  • Participated in weekly status meetings.
  • Participated in loading huge volumes of data.
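Teradata validation checks of this kind are commonly wrapped in generated BTEQ scripts driven from the shell. A hedged sketch follows: the logon string, table name, and quit codes are placeholders, and the command syntax should be verified against the Teradata BTEQ documentation.

```shell
# Illustrative generator for a BTEQ row-count check used to validate loads.
# Logon string, credentials and table name are placeholders.
make_bteq_script() {
  table=$1
  cat <<EOF
.LOGON tdprod/dw_user,secret
SELECT COUNT(*) FROM $table;
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF
}
```

The generated script can be piped into `bteq`, with the nonzero quit code surfacing load failures to the calling scheduler.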

Environment: Confidential WebSphere DataStage 11.3/8.5, QualityStage 8.5, DB Visualizer 9.0.6, Java, HP Quality Center 10.0, Teradata 14.10/14 (MultiLoad, FastLoad, FastExport, TPump, BTEQ), Unix, Windows XP, Oracle 10g, Control M, MS Visio 2010.

Confidential, Milwaukee, WI

Datastage Developer

Responsibilities:

  • Extracted the data from source SAP systems and loaded it into staging area, after cleansing and validation loaded into Oracle database.
  • Worked extensively with the ABAP Stage to extract data from the SAP ECC system, apply transformations, and finally load it into an Oracle database.
  • Used RFC data transfer method to extract the data from SAP using ABAP Stage and loaded it to Sequential files.
  • Experience in Working with Information Server RTL (Ready to Launch) for SAP.
  • Worked closely with Architect for the requirement gatherings and experience in working with offshore team to pass the requirements and get the things done on time.
  • Experience in migrating the jobs from Dev environment to QA environment.
  • Extensive experience in identifying the errors corresponding to ABAP Stage.
  • Migrated jobs from 7.5 to 8.1 and developed new DataStage jobs using the DataStage/QualityStage Designer.
  • Extensive experience working with DataStage tools like DataStage Designer and DataStage Director for developing jobs and viewing logs for errors.
  • Tuned the parallel jobs using appropriate partitioning techniques used in the jobs and worked closely with DBA to create the proper indexes to handle the long run jobs.
  • Extensive experience in writing SQL to check the data loaded by DS Jobs.
  • Experience in working with Routines
  • Experience in generating the Surrogate key using key management functions while loading the data into Oracle in Datastage 8.5 Version.
  • Experience in creating Technical Spec documents as well as Unit test document
  • Used transaction codes (TCs) in SAP to check the data in SAP tables and the ABAP program code generated from the ABAP Stage.
  • Experience in solving many issues while working with the ABAP Stage.

Environment: Confidential WebSphere Data stage 8.5/8.1, Quality Stage 8.5, Confidential AIX Server, SAP ECC R/3 6.5, SQL Developer 8.0, Toad 9.5, HP Quality Center 10.0, Unix, Windows XP, Oracle 10g and Teradata.

Confidential, Farmingdale, NY

Datastage Developer

Responsibilities:

  • Designed and developed the jobs using DataStage Designer for extracting, cleansing, transforming, integrating and loading data using various stages like Aggregator, Funnel, Change Capture, Change Apply and copy.
  • Worked with DataStage Director to schedule, monitor and analyze performance of individual stages and run DataStage jobs.
  • Developed and supported the Extraction, Transformation and Load process (ETL) for a data warehouse from various data sources using DataStage Designer
  • Designed and developed job sequences to run multiple jobs.
  • Worked with DataStage Administrator to set up environment variables.
  • Worked with DataStage Director for testing and monitoring the executable jobs
  • Developed DataStage jobs to convert the data obtained in COBOL format from Emblem Health to Oracle Database to be used for further data cleansing processes.
  • Wrote various stored procedures for testing the application.
  • Involved in writing SQL Queries to test the data from the DataStage jobs.
  • Tuned the Parallel jobs for better performance
  • Experience in working with Copybooks
  • Created objects like tables, views, materialized views, procedures, and packages using Oracle tools like PL/SQL, SQL*Plus, and SQL*Loader, and handled exceptions.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Packages, Records and Collections.
  • Created various indexes on tables to improve the performance by eliminating the full table scans.
  • Created views for hiding actual tables and to eliminate the complexity of the large queries.
  • Created source table definitions in the DataStage Repository.

Environment: Confidential WebSphere Data stage 8.1, Ascential Datastage 7.5.2, UNIX Shell Scripting (Korn /KSH), PLSQL Developer 8.0, Visual Source Safe 2005, HP Quality Center 10.0, Unix, Windows XP, Oracle 10g, Mainframes, Teradata and Autosys

Confidential, NC

Datastage Developer

Responsibilities:

  • Designed and Developed DataStage jobs in Server Edition initially and then converted those to parallel jobs using Enterprise Edition 8.1 so as to tune the overall performance of system.
  • Used DataStage Director to Run and Monitor the Jobs performed, automation of Job Control using Batch logic to execute and schedule various DataStage jobs.
  • Worked extensively with both Parallel and Server jobs.
  • Knowledge of configuration files for Parallel jobs.
  • Used Partition methods and collecting methods for implementing parallel processing.
  • Improved the performance of jobs by four times by using a multi-node configuration (4 nodes).
  • Scheduled job runs using DataStage director, and used DataStage director for debugging and testing.
  • Used Shared Container for repeated business logic, which is used across the project.
  • Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.
  • Extensive experience in developing the Sequence jobs
  • Experience in working with Routines.
  • Experience in combining multiple jobs into a single job.
  • Involved in Unit testing, Functional testing and Integration testing and provide process run time
  • Prepared Data Volume estimates.
  • Used the BAPI Stage to load data from DataStage into SAP by calling a function module in SAP.
  • Converted the BAPI Stage from a passive stage to an active stage.
  • Used transaction codes (TCs) in SAP to check whether the data loaded into SAP properly or not.
  • Used Parallel Extender for splitting the data into subsets, utilized Lookup, Sort, Merge and other stages to achieve job performance.
  • Experience in creating Technical Spec documents as well as Unit test document.

Environment: Confidential WebSphere DataStage 8.1/8.5, Ascential DataStage 7.5.2, Confidential AIX Server, SAP R/3 6.5, WINSQL, DB2, Oracle 10g, UNIX, Windows XP

Confidential, CA

Datastage Developer

Responsibilities:

  • Extracted the data from source systems and loaded it into staging area, after cleansing and validation loaded into SAP.
  • Experience working with flat files, DB2, and Oracle data sources.
  • Created FTP job to extract the data from legacy system.
  • Used DataStage plug-ins to load the data into SAP (IDOC Load).
  • Created a job using IDOC LOAD stage for posting the materials into SAP.
  • Extensive experience in creating the segments in the IDOC Load stage and populating the segments according to the requirement in SAP.
  • Used transaction codes (TCs) in SAP to process the IDOCs.
  • Experience in identifying the errors in segments on SAP.
  • Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.
  • Worked with Data Sets and used the Change Capture and Change Apply stages with them.
  • Migrated the jobs from 7.5 to 8.1 and developed new DataStage jobs using the DataStage/QualityStage Designer. Imported and exported repositories across projects.
  • Experience in generating the Surrogate key using key management functions while loading the data into SAP in Datastage 8.1 Version.
  • Wrote Shell Scripts to run data stage jobs, PL/SQL blocks.
  • Imported metadata, table definitions and stored procedure definitions using the Manager.
  • Used the Tivoli LoadLeveler scheduler to run more jobs in less time by matching each job's processing needs and priority.
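Shell scripts that run DataStage jobs and PL/SQL blocks, as described above, usually pipe an anonymous block into SQL*Plus from a wrapper. A minimal sketch; the connect string, schema, and the `stg_load_pkg.refresh_materials` procedure are hypothetical names, not from this project:

```shell
# Illustrative shell wrapper running an anonymous PL/SQL block via SQL*Plus.
# Connect string and package procedure are placeholders.
SQLPLUS=${SQLPLUS:-sqlplus}

run_plsql() {
  "$SQLPLUS" -s dw_user/"$DW_PASS"@DWDB <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  stg_load_pkg.refresh_materials;  -- hypothetical package procedure
END;
/
EXIT
EOF
}
```

`WHENEVER SQLERROR EXIT FAILURE` makes SQL*Plus return a nonzero exit code on error, so a scheduler can treat the wrapper like any other failing job step.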

Environment: Confidential WebSphere DataStage 8.1, Ascential DataStage 7.5.2, SAP R/3 6.0 ECC, UNIX Shell Scripting (Korn/KSH), WINSQL, Oracle 9i/10g, Teradata, UNIX, Windows XP, and Tivoli LoadLeveler 3.4.1.2
