
Sr. Datastage Consultant Resume


Portland, OR

PROFESSIONAL SUMMARY:

  • Over 5 years of ETL and data integration experience in the IT industry as a developer in system analysis, design, development, testing, and support of projects using DataStage 7.5/8.1/8.5/9.1/11.3 in the Banking, Retail, Insurance, and Automobile verticals.
  • Experience in the Software Development Life Cycle (SDLC) phases of analysis, design, development, and testing, including requirements gathering, client interaction, and use case design and understanding.
  • Hands-on experience in the design and development of complex DataStage jobs and sequencers, as well as FastTrack, Metadata Workbench, and Business Glossary.
  • Hands-on experience with the AutoSys and Control-M job schedulers.
  • Experience in UNIX scripting, troubleshooting, and file handling on UNIX systems.
  • Strong hands-on experience with RDBMSs such as Oracle, Netezza, SQL Server, Teradata, and DB2. Working experience in data modeling and implementing stored procedures using PL/SQL, extensive knowledge of writing complex SQL queries, and working experience with DB2 cursors, triggers, and stored procedures.
  • Experience in the design and development of ETL methodology supporting data migration, data transformation, and processing in a corporate-wide ETL solution using Teradata 14.0/13.0/12.0.
  • Experience with Teradata tools and utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump).
  • In-depth knowledge of the Teradata database architecture and experience with Teradata unloading utilities such as FastExport.
  • Experience in integrating various data source definitions such as SQL Server, Oracle, Teradata (SQL Assistant), MySQL, flat files, XML, and XSDs.
  • Demonstrated work experience with state-of-the-art DataStage grid environments.
  • Experience in Data Integration, EDW, and Data Mart projects.
  • Strong knowledge of OLAP and OLTP systems and of dimensional modeling using Star and Snowflake schemas.
  • Demonstrated work experience in mainframe-to-DataStage migration projects.
  • Experienced in designing and using several process-tracking mechanisms and communication designs that give a snapshot of job status in an application/system.
  • Worked on integrating data from sequential files, flat files, COBOL files and XML files.
  • Experience in working closely with mainframe applications for the ETL interactions.
  • Prepared high-level design, low-level design, and technical design documents for various projects; good at bug fixing, code reviews, and unit and system testing.
  • Expertise in application development across DataStage versions 7.5/8.1/8.5/9.1/11.3/11.5.
  • Strong data warehousing ETL experience using Informatica PowerCenter 9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio.
  • Expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access, and Teradata.
  • Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect on Oracle, DB2, and SQL Server databases.
  • Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration Server.
  • Expertise in creating Data Mapping for various projects and applications.
  • Experience working on multi-dimensional warehouse projects.
  • Experience in reading and loading high-volume Type 2 dimensions by implementing SCDs (Slowly Changing Dimensions); a minimal sketch of the Type 2 logic follows this list.
  • Experience in using mainframe applications for browsing files and transferring files via NDM.
  • Hands-on experience working with BusinessObjects XI (BOXI) for report generation.
  • Good knowledge of Hadoop cluster architecture.
  • Developed large-scale data processing and big data analytics using the Hadoop ecosystem.
  • Good at database design using triggers, stored procedures, and functions, with knowledge of writing queries in SQL Server and DB2.
  • Good experience working with HP ALM (Quality Center).
  • Good experience in designing metadata for all data moves and reusing it during job design with the help of Metadata Management and FastTrack mapping specifications.
  • Provided 24x7 production support for application stability.
  • Quick learner, up to date with industry trends, with excellent written and oral communication, analytical, and problem-solving skills; a good team player, well organized, and able to work independently.
  • Developed and maintained mostly Python and some Perl ETL scripts to scrape data from external websites and load cleansed data into a MySQL database (see the second sketch below).
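For illustration, a minimal Python sketch of the Type 2 logic referenced above: expire the current dimension row and insert a new version when a tracked attribute changes. The row layout and column names (current_flag, effective_from, effective_to) are hypothetical; in DataStage this is typically built with Change Capture/Change Apply and Surrogate Key stages, so treat this as a sketch of the idea, not the original implementation.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, business_key, tracked_cols):
    """Apply Type 2 SCD logic to an in-memory dimension (illustrative only)."""
    # Index the current (open) version of each business key.
    current = {r[business_key]: r for r in dim_rows if r["current_flag"] == "Y"}
    today = date.today()
    for rec in incoming:
        old = current.get(rec[business_key])
        unchanged = old is not None and all(old[c] == rec[c] for c in tracked_cols)
        if unchanged:
            continue
        if old is not None:
            old["current_flag"] = "N"        # expire the existing version
            old["effective_to"] = today
        dim_rows.append({**rec,              # insert the new current version
                         "current_flag": "Y",
                         "effective_from": today,
                         "effective_to": None})
    return dim_rows
```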
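And a hedged sketch of the scrape-cleanse-load pattern from the last bullet above. The URL, table, and column names are placeholders, and the libraries shown (requests, BeautifulSoup, mysql-connector-python) are assumptions rather than a record of the original stack.

```python
import requests
from bs4 import BeautifulSoup
import mysql.connector

def scrape_and_load(url, db_config):
    """Scrape an HTML table, cleanse it, and bulk-insert into MySQL."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    # Cleanse: strip whitespace and drop rows with empty cells.
    rows = [[td.get_text(strip=True) for td in tr.find_all("td")]
            for tr in soup.find_all("tr")]
    rows = [r for r in rows if r and all(r)]

    conn = mysql.connector.connect(**db_config)
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO staging_scraped (col_a, col_b) VALUES (%s, %s)",
            [tuple(r[:2]) for r in rows],    # hypothetical staging table
        )
        conn.commit()
    finally:
        conn.close()
```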

TECHNICAL SKILLS:

Languages: SQL, PL/SQL, UNIX shell scripting, COBOL, Python, C++

ETL Tools: DataStage 7.5/8.1/8.5/9.1/11.3 (Designer, Administrator, Director), Informatica PowerCenter 9.1/8.6/8.5/7.1, FastTrack, Metadata Workbench, Business Glossary

Databases: Oracle 11g/10g/9i/8i, DB2, Teradata, SQL Server, Netezza

Software: TOAD for Oracle, TOAD for DB2, WinSCP, MS SQL Server 2008 Management Studio, SQL*Plus, SQL Developer, Queryman, AQT for DB2, AQT for Oracle, Mainframes

Big Data Tools: HDFS, MapReduce, Hive, Kafka, Flume, Sqoop, Impala, Storm

Operating Systems: UNIX (AIX), Linux, Windows 95/98/2000/XP/2003, MS-DOS, z/OS

Reporting Tools: BusinessObjects XI R2 (BOXI)

PROFESSIONAL EXPERIENCE:

Confidential, Portland, OR

Sr. Datastage Consultant

Responsibilities:

  • Served as the primary on-site ETL developer during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software.
  • Prepared Data Mapping Documents and designed the ETL jobs based on the DMDs with the required tables in the Dev environment.
  • Participated actively in decision-making and QA meetings and regularly interacted with the Business Analysts and development team to gain a better understanding of the business process, requirements, and design.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into data warehouse databases.
  • Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
  • Extensively worked with Join, Lookup (normal and sparse), and Merge stages.
  • Extensively worked with Sequential File, Data Set, File Set, and Lookup File Set stages.
  • Extensively used parallel stages like Row Generator, Column Generator, Head, and Peek for development and debugging purposes.
  • Used the DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executables on an ad hoc or scheduled basis.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
  • Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Created job sequences.
  • Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on different enhancements to FACT tables.
  • Created shell scripts to run DataStage jobs from UNIX and scheduled them through a scheduling tool (a minimal sketch of this wrapper follows the list).
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Analyzed performance and monitored workloads for capacity planning.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Participated in weekly status meetings.
  • Developed a test plan that included the scope of the release, entrance and exit criteria, and the overall test strategy; created detailed test cases and test sets and executed them manually.
  • Sound knowledge of Netezza SQL.
  • Conducted ETL development in the Netezza environment using standard design methodologies.
  • Assessed the Netezza environment for implementation of the ETL solutions.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use by the company.
  • Collaborated with software architects to ensure alignment with the Netezza environment.
  • Managed Netezza queries through performance tuning and techniques such as clustered base tables (CBT) and collocation.
  • Additionally moved some ML code functionality into back-end Python scripts.
  • Developed shared Python modules to encapsulate common functionality (a minimal sketch follows this list).
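A minimal sketch of the run-from-UNIX wrapper mentioned above, written here as a shared Python module around the DataStage dsjob CLI. The project and job names are placeholders, and the exit-code convention shown (with -jobstatus, dsjob's exit code reflects the job's finishing status; 1 is commonly "finished OK" and 2 "finished with warnings") should be verified against the installed DataStage release.

```python
import subprocess

def run_datastage_job(project, job, params=None):
    """Run a DataStage job via the dsjob CLI and return its finishing status."""
    cmd = ["dsjob", "-run", "-wait", "-jobstatus"]
    for name, value in (params or {}).items():
        cmd += ["-param", f"{name}={value}"]   # optional job parameters
    cmd += [project, job]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Assumed convention: 1 = finished OK, 2 = finished with warnings.
    if result.returncode not in (1, 2):
        raise RuntimeError(f"{project}/{job} failed (rc={result.returncode}): "
                           f"{result.stdout}{result.stderr}")
    return result.returncode
```

A scheduler then only needs to call this module, which keeps the dsjob invocation and status-checking logic in one place.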

Environment: IBM WebSphere DataStage 11.3 Parallel Extender, Web Services, QualityStage 8.1 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server, Teradata, Oracle 11g, Queryman, UNIX, Windows, Facets 5.5.

Confidential, Irving, TX

ETL Developer

Responsibilities:

  • Attended meetings with client and business teams to understand the requirements, and prepared the low-level design and technical specification documents.
  • Identified the impacts and created Data Mapping Sheets.
  • Participated actively in decision-making and QA meetings and regularly interacted with the Business Analysts and development team to gain a better understanding of the business process, requirements, and design.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the IBM DB2 database.
  • Involved in the design and development of the SQL and PL/SQL stored procedures, packages, and triggers for the DataStage ETL processes.
  • Created jobs to read and write data in complex flat files, COBOL files, and sequential files.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the data mart.
  • Created DataStage 8.5 jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Column Generator, Difference, Row Generator, Sequencer, Email Communication activity, Command activity, Sequential File, CFF stage, Dataset, and Terminator activity.
  • Used the DataStage Director and its run-time engine for job monitoring, testing and debugging components, and monitoring the resulting executables on an ad hoc or scheduled basis.
  • Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Created master job sequencers.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data (a BTEQ-driven sketch follows this list).
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Created TPT (Teradata Parallel Transporter) scripts to transfer data from the Oracle system to Teradata.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Worked on the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server, and flat files) by incorporating business rules using different objects and functions that the tool supports.
  • Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
  • Created ETL/Talend jobs, both design and code, to process data into target databases.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote some Java code to capture global map variables and use them in the jobs.
  • Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.
  • Developed complex Talend ETL jobs to migrate data from flat files to databases.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and other files, and loaded them into target databases using the Talend Open Studio ETL tool.
  • Created Talend jobs using the dynamic schema feature.
  • Interacted with the business community and gathered requirements based on changing needs; incorporated the identified factors into Talend jobs to build the Data Mart.
  • Provided on-call support when the project was deployed to further phases.
  • Used the Talend Administration Center Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (cron triggers).
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression, and Update Strategy.
  • Created shell scripts to run DataStage jobs from UNIX, then scheduled these scripts through a scheduler called VERTIS, which runs based on ILOG JRules.
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Designed and built Hadoop applications.
  • Worked with advanced concepts such as Apache Spark and Scala programming.
  • Involved in the continuous development of MapReduce code that runs on Hadoop clusters.
  • Monitored and managed Hadoop log files.
  • Designed and implemented topic configurations in the new Kafka cluster in all environments.
  • Successfully secured the Kafka cluster with Kerberos (see the producer sketch after this list).
  • ETL experience working with Hive and MapReduce.
  • Experience in designing and implementing large-scale distributed data processing applications built on Hadoop, HBase, Hive, MapReduce, YARN, and other Hadoop ecosystem components (Hue, Oozie, Spark, Sqoop, and ZooKeeper).
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Prepared DDLs for table creation, table modification, and index changes; tested and executed them in all environments (Dev, CIT, SIT, UAT, pre-production, and production).
  • Prepared the DMLs for maintenance tables, then reviewed, tested, and executed them.
  • Used ClearCase and IBM InfoSphere Information Server Manager as version control tools for version control and movement of code to higher environments (SIT, UAT, pre-production, and production).
  • Used HP ALM to upload and execute the unit test cases, and to track project defects through the different phases of testing, debugging, and investigation, working toward closure of the defects.
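As an illustration of the BTEQ scripting above, a hedged Python sketch that feeds a BTEQ script to the bteq utility on stdin. The host, credentials, database, and table names are placeholders (real scripts would pull credentials securely), and FastLoad/MultiLoad steps follow the same drive-a-utility pattern.

```python
import subprocess

# Placeholder logon and tables for illustration only.
BTEQ_SCRIPT = """\
.LOGON tdhost/etl_user,etl_password;
DELETE FROM stage_db.customer_stg;
INSERT INTO stage_db.customer_stg
SELECT customer_id, customer_name
FROM   landing_db.customer_raw;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

def run_bteq(script=BTEQ_SCRIPT):
    # bteq reads its commands from stdin and exits nonzero on .QUIT 8.
    result = subprocess.run(["bteq"], input=script, text=True, capture_output=True)
    if result.returncode != 0:
        raise RuntimeError(f"BTEQ step failed:\n{result.stdout}")
    return result.stdout
```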
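And a hedged sketch of the Kerberos (SASL/GSSAPI) client settings involved in securing a Kafka cluster, assuming the kafka-python client and a valid Kerberos ticket (kinit) on the host; the broker address, topic, and service name are placeholders.

```python
from kafka import KafkaProducer

# SASL/GSSAPI is Kafka's Kerberos mechanism; SASL_SSL would add TLS on top.
producer = KafkaProducer(
    bootstrap_servers=["broker1.example.com:9092"],   # placeholder broker
    security_protocol="SASL_PLAINTEXT",
    sasl_mechanism="GSSAPI",
    sasl_kerberos_service_name="kafka",               # broker's service principal
)
producer.send("etl.audit.events", b"load complete")   # placeholder topic/payload
producer.flush()
```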

Environment: IBM WebSphere DataStage 11.5/11.3 Parallel Extender, IBM InfoSphere Information Server Manager, Informatica PowerCenter Designer 8.6/8.1, Teradata, Informatica Repository Manager, Web Services, ClearCase, Microsoft Visio, IBM AIX 4.2/4.1, SQL, PL/SQL, SQL Server, IBM DB2, VERTIS, ILOG JRules, UNIX, Windows, HP ALM

Confidential, Columbus, Ohio

Datastage Consultant

Responsibilities:

  • Identified source systems, their connectivity, and related tables and fields, and ensured the data's suitability for mapping.
  • Prepared FastTrack mapping specifications, created metadata layouts in Metadata Workbench, and updated the Business Glossary.
  • Prepared the Data Mapping Document and designed the ETL jobs based on the DMD with the required tables in the Dev environment.
  • Participated actively in decision-making and QA meetings and regularly interacted with the Business Analysts and development team to gain a better understanding of the business process, requirements, and design.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the IBM DB2 database.
  • Wrote SQL and PL/SQL queries, stored procedures, and triggers for implementing business rules and transformations.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the data warehouse.
  • Created DataStage 8.5 jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Surrogate Key, Column Generator, Difference, Row Generator, Sequencer, Email Communication activity, and Command activity.
  • Used the DataStage Director and its run-time engine for job monitoring, testing and debugging components, and monitoring the resulting executables on an ad hoc or scheduled basis.
  • Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Created master job sequencers.
  • Also worked on different enhancements to FACT tables.
  • Created shell scripts to run DataStage jobs from UNIX and scheduled them through the AutoSys scheduling tool using JILs (a JIL sketch follows this list).
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Used HP ALM to upload and execute the unit test cases, and to track project defects through the different phases of testing, debugging, and investigation, working toward closure of the defects.
  • Performed performance testing with different node configurations, different queues, and different volumes.
  • Prepared the DMLs for maintenance tables, then reviewed, tested, and executed them.
  • Used the Tortoise SVN version control tool for version control and movement of code to higher environments (SIT, UAT, pre-production, and production).
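A hedged sketch of the AutoSys JIL piece mentioned above, generated here from a small Python helper. The job name, machine, owner, and paths are placeholders, and while the JIL attributes shown (insert_job, job_type, command, machine, start_times, alarm_if_fail) are standard, each site's conventions differ.

```python
# Hypothetical JIL template for a CMD job that invokes the DataStage wrapper.
JIL_TEMPLATE = """\
insert_job: {job_name}   job_type: CMD
command: /opt/etl/bin/run_dsjob.sh {project} {ds_job}
machine: etlhost01
owner: etluser
start_times: "02:00"
alarm_if_fail: 1
std_out_file: /var/log/etl/{job_name}.out
std_err_file: /var/log/etl/{job_name}.err
"""

def write_jil(path, **kwargs):
    """Render the JIL definition so it can be loaded with the jil command."""
    with open(path, "w") as f:
        f.write(JIL_TEMPLATE.format(**kwargs))

write_jil("/tmp/load_customer_dim.jil",
          job_name="LOAD_CUSTOMER_DIM",
          project="DW_PROJECT", ds_job="jb_load_customer_dim")
```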

Environment: IBM WebSphere DataStage 8.5 Parallel Extender, grid environment, FastTrack, Metadata Workbench, Business Glossary, Web Services, QualityStage 8.5, SQL, PL/SQL, Microsoft Visio, IBM AIX 4.2/4.1, SQL Server, IBM DB2, Oracle 11g, AutoSys scheduler, UNIX, Windows, HP ALM

Confidential, Columbus, Ohio

Datastage Consultant

Responsibilities:

  • Used Parallel Extender development/debugging stages like Row Generator, Column Generator, Head, Tail, and Peek.
  • Used DataStage hashed files as reference tables based on a single key.
  • Extensively dealt with performance tuning of the jobs.
  • Created process flow diagrams using Microsoft VISIO.
  • Documented the development process and performed knowledge transfer to Business Intelligence team.

Environment: DataStage/QualityStage 8.0 (IBM WebSphere DataStage and QualityStage Designer, Director, Administrator), DB2 UDB 9.2, SQL Server 2000, Linux 10, SQL, PL/SQL, UNIX shell scripting, Microsoft Visio, DB2 Visualizer, MS SQL Server, Mainframe, COBOL.

Confidential

Datastage Consultant

Responsibilities:

  • Studied the business requirements and prepared the impact analysis document.
  • Prepared the technical specification document and, upon review of the solution, developed it using DataStage jobs and sequencers.
  • Used the Sequential File stage as the source for most of the source systems.
  • Developed a file-check process that checks the format, volume, and date in the file to decide whether the right file was sent by the source and whether the right file is being loaded into the database (a minimal sketch follows this list).
  • Used the Aggregator, Lookup, Join, Merge, Data Set, Transformer, Sequencer, Sequential File, DB2 Bulk Load, Hashed File, and Surrogate Key Generator stages.
  • Created DDL statements for new tables, changes to table structure, index changes, and creation of triggers and stored procedures.
  • Prepared unit test cases and test plans.
  • Executed the test cases and captured the results.
  • Supported SIT and UAT testing.
  • Worked on packaging the code with the Tortoise SVN version control tool and worked with the respective teams to deploy the code.
  • Supported the system post-production and worked in coordination with the production support teams to resolve any issues.
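A minimal Python sketch of the file-check process described above: validate the file name and its date stamp, the format (column count), and the volume (row count) before the load proceeds. The naming pattern, column count, and threshold are hypothetical.

```python
import csv
import os
import re
from datetime import datetime

def check_file(path, expected_cols=12, min_rows=1):
    """Return (ok, reason) for an inbound feed file before loading it."""
    name_pat = re.compile(r"^CUSTFEED_(\d{8})\.csv$")   # hypothetical naming rule
    m = name_pat.match(os.path.basename(path))
    if not m:
        return False, "unexpected file name"
    try:
        datetime.strptime(m.group(1), "%Y%m%d")          # date stamp must parse
    except ValueError:
        return False, "bad date stamp in file name"

    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    if any(len(r) != expected_cols for r in rows):       # format check
        return False, "format check failed: wrong column count"
    if len(rows) < min_rows:                             # volume check
        return False, "volume check failed: too few rows"
    return True, "ok"
```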

Environment: DataStage 8.1 (Designer, Director, Manager, Administrator) Enterprise Edition, SQL Server 2005, SQL, PL/SQL, IBM DB2, AS/400, ERwin 4.0, MS Visio 2000.
