
DataStage Developer Resume


Hoffman Estates, IL

SUMMARY

  • Around 6 years of experience building ETL processes using IBM InfoSphere DataStage.
  • 6+ years of experience with relational database systems including Oracle 10g/9i/8.x, SQL Server, DB2, Sybase and Teradata.
  • Excellent knowledge of and experience in the data warehouse development life cycle, dimensional modeling, repository management and administration, and implementation of Star and Snowflake schemas and slowly changing dimensions.
  • Experience in source system analysis and data extraction from various sources such as flat files, Oracle 8.x/9i, DB2 UDB, COBOL copybooks, MS SQL Server 2000, Teradata, Sybase, PeopleSoft and SAP.
  • Extensive experience in designing and developing jobs using DataStage Designer, exporting and importing jobs using DataStage Manager, and monitoring jobs using DataStage Director.
  • Experience with Parallel Extender for Parallel Processing to improve job performance while working with bulk data sources.
  • Performed debugging, troubleshooting, monitoring and performance tuning using DataStage.
  • Proficient in using scheduling tools like Control-M.
  • Excellent communication, client interaction and problem solving skills.
  • Sound knowledge of the compile-time/runtime architecture of DataStage and the internals of its parallelism.
  • Strong understanding of Oracle SQL and PL/SQL programming.
  • Good working knowledge of Teradata utilities including BTEQ, FastLoad, FastExport, TPump and MultiLoad.
  • Good working knowledge of UNIX as an operating system as well as UNIX shell scripting.
  • Well versed in ETL concepts and methodologies.
  • Experience with parallel processing environments.
  • Strong knowledge of building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts and Decision Support Systems (DSS), and of Star and Snowflake schema design addressing Slowly Changing Dimensions (SCDs); a sample Type 2 SCD load pattern is sketched after this summary.
  • Strong experience in developing complex mappings using transformations such as Unconnected and Connected Lookups, Router, Aggregator, Sorter, Rank, Joiner, Stored Procedure, Update Strategy and reusable transformations.
  • Strong understanding of dimensional modeling.
  • Extensive experience in designing and developing ETL environments involving various source and target databases such as Oracle, flat files (fixed width, delimited), DB2, XML, SQL Server and Teradata.
  • Expert in database development including DML, DDL and DCL, and in PL/SQL development including cursors, ref cursors, anonymous blocks, stored procedures, functions, triggers, packages, bulk operations, collections, partitioned tables and materialized views; performed query optimization and database performance tuning.
  • Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
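
To illustrate the slowly changing dimension work noted above, the following is a minimal sketch (not project code) of a Type 2 SCD load in Oracle run through sqlplus: the current dimension row is expired when a tracked attribute changes, and a new current row is inserted. All table, column and connection names (dim_customer, stg_customer, DB_USER, etc.) are hypothetical.

#!/bin/sh
# Hypothetical Type 2 SCD load sketch; DB_USER, DB_PASS and DB_SID are assumed environment variables.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
-- Expire the current dimension row when a tracked attribute has changed
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Insert a new current row for changed or brand-new customers
INSERT INTO dim_customer
      (customer_key, customer_id, address,
       effective_start_dt, effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
COMMIT;
EOF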

TECHNICAL SKILLS

Tools/Packages: DataStage 8.0/7.5.1/7.1/6.0, QualityStage 7.5, DataStage PX (Parallel Extender), DataStage Version Control, WebSphere Information Server

Databases: IBM DB2 UDB, Oracle 10g/9i/8i/8.0/7.3, MS SQL Server 2000/7.0/6.5, Teradata V2R5/V2R4/V2R3

Operating System: IBM AIX, Windows NT/2000/XP/2003

Languages: VB.NET, C#, C, VB 6.0 and HTML

Scripting: SQL, Unix Shell Scripting

Web Technology: ASP.NET, ADO.NET, HTML, DHTML, XML, HTTP, Web Services

Environment: UNIX (Sun Solaris, HP-UX, AIX), Windows 2003/2000/XP/98

Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, Erwin 4.5/4.2/3.5.2/2.x

PROFESSIONAL EXPERIENCE

Confidential, Hoffman Estates, IL

DataStage Developer

Responsibilities:

  • Understanding the business requirements and functional flow, and analyzing and implementing the recommended solutions for development of the application.
  • Prepared ETL mapping document, high level design documents and testing documents for ETL jobs.
  • Working on development of new requirements using the ETL mapping document; the project covered applications that provide pricing, classification, commission, marketing, reporting and local marketplace feeds.
  • Working on requests for enhancements of application.
  • Working on improving job performance to meet strict timelines.
  • Developing extraction, transformation and load (ETL) processes for loading data into the data warehouse from various data sources using DataStage.
  • Worked on Join, Lookup, Sequencer, Transformer, Aggregator, database and other stages.
  • Also wrote SQL queries and UNIX shell scripts for FTPing files; a sample transfer script is sketched after this section.
  • Responsible for the design, development, coding, testing and debugging of application to meet the requirements of the users.
  • Migration of developed application between Development/QA/Prod environments.
  • Coordinated with scheduling team to automate the process of running the jobs.
  • Coordinated project activities, tasks and dependent deliverables; managed tasks, issues and risk retirement.
  • Provided production deployment guidelines and an execution plan.
  • Provided documentation support.
  • Created DataStage sequences to orchestrate the NZSQL scripts.
  • Assisted QA team to create test cases and data.
  • Prepared batch dependency diagram to schedule jobs in Control-M and coordinated the same.

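A minimal sketch of the kind of UNIX shell script used for FTPing feed files, as referenced above; the host, credentials, directories and file name are placeholders, not actual project values.

#!/bin/sh
# Illustrative file-transfer sketch; all values below are placeholders.
FEED_HOST="ftp.example.com"
FEED_USER="etl_user"
FEED_PASS="secret"
REMOTE_DIR="/outbound/pricing"
LOCAL_DIR="/data/landing/pricing"
FEED_FILE="pricing_feed_`date +%Y%m%d`.dat"

# -i: no interactive prompts, -n: suppress auto-login, -v: verbose
ftp -inv "$FEED_HOST" <<EOF
user $FEED_USER $FEED_PASS
binary
cd $REMOTE_DIR
lcd $LOCAL_DIR
get $FEED_FILE
bye
EOF

# Fail the step if the file did not arrive or is empty
if [ ! -s "$LOCAL_DIR/$FEED_FILE" ]; then
    echo "ERROR: $FEED_FILE not transferred" >&2
    exit 1
fi
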
Environment: IBM DataStage 8.1 (Designer, Director, Administrator), Oracle 10g, DB2/AIX64 9.7.2, Control-M 6.4

Confidential, LOS ANGELES, CA

DataStage Developer

Responsibilities:

  • Prepared technical design specifications.
  • Worked with business analysts from Confidential and Farmers Group to finalize the EDW changes resulting from the merger, and prepared EDW change documents.
  • Prepared Mapping Documents, Conceptual Design, and Testing Documents for ETL jobs.
  • As a data warehousing team member, involved in designing, developing and testing the ETL (Extract, Transform and Load) strategy to populate data from various source system feeds.
  • Converted existing Server Jobs into Parallel Jobs, along with new changes to be implemented.
  • Utilized DataStage DB2 API stages and the Oracle Enterprise stage to extract data from DB2 OS/390 tables and load into Oracle.
  • Created DataStage jobs to load policy, policy term, driver, driver rating, vehicle and premium charge tables using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set and File.
  • Built jobs in DEV Environment and moved them across DEV/IST/PROD Environments.
  • Created DataStage batch jobs to run individual jobs in the category.
  • Tested individual modules of the software against the unit test cases.
  • Modified existing UNIX scripts to run DataStage jobs.
  • Worked with Autosys team to schedule the jobs to run every night.
  • Monitored the batch run every day and fixed any failures in the QA and Production environments.
  • Logged the job failures or data issues in the Log that was created.
  • Tuned DataStage jobs to enhance their performance.
  • Coordinated with the MicroStrategy team to create reports.
  • Exported data using Teradata FastExport; a sample export script is sketched after this section.
  • Prepared unit test cases and performed Unit test.
  • Assisted QA team to create test cases and data.

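A minimal sketch of a Teradata FastExport run of the kind referenced above, wrapped in a shell script; the logon string, log table, source table and output file are all placeholders.

#!/bin/sh
# Illustrative Teradata FastExport sketch; names and credentials below are placeholders.
fexp <<'EOF'
.LOGTABLE edw_work.policy_export_log;
.LOGON tdprod/etl_user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/extracts/policy_export.dat
        MODE RECORD FORMAT TEXT;
SELECT  CAST(policy_nbr  AS CHAR(10)) || '|' ||
        CAST(policy_term AS CHAR(4))  || '|' ||
        CAST(premium_amt AS CHAR(12))
FROM    edw.policy
WHERE   load_dt = CURRENT_DATE;
.END EXPORT;
.LOGOFF;
EOF
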
Environment: Ascential DataStage 7.5.2, Oracle 10g, Toad 8.0, Mainframe, Autosys, DB2 OS/390, IBM AIX, Teradata.

Confidential, Scottsdale, AZ

DataStage ETL Developer

Responsibilities:

  • Created source-to-target mappings
  • Created low-level design documents for DataStage jobs
  • Prepared SQL queries for data extraction from Oracle sources
  • Created batch dependency diagram for jobs execution
  • Defined data quality rules and implemented them; analyzed the existing data warehouse data model and participated in investigating data subject areas and documenting data flows.
  • Interacted with data modelers and database administrators to make modifications to the data model according to the requirements.
  • Used technical specifications document to design and develop the ETL jobs. Also modified the technical documents to accommodate the changes in specs.
  • Designed, developed and tested DataStage jobs using Designer and Director based on business user requirements and business rules to load data from heterogeneous data sources such as Sybase, Oracle 9i, text files and MS SQL Server.
  • Developed DataStage jobs using Transformer, Aggregator, Lookup, Stored Procedure, Join, Merge, Sort and RTI stages.
  • Developed several Parallel jobs to improve performance by reducing the runtime for several jobs.
  • Extensively used SQL coding for overriding the generated SQL in DataStage and also tested the data loading into the database.
  • Extensively used DataStage built-in transforms and created user defined subroutines to implement some of the complex logic to meet the business requirements.
  • Used DataStage Manager to import table definitions from various databases and to import and export DataStage jobs between development, testing and production environments.
  • Used DataStage director to validate, monitor, schedule and execute the jobs.
  • Used DataStage Version Control to version the designed DataStage jobs.
  • Created DataStage sequencers using Job Activity, Routine Activity, Execute Command and Notification activities along with triggers to ensure the sequential run of all designed jobs.
  • Used QualityStage to maintain high-quality master data.
  • Created stored procedures, functions, packages, views and triggers using PL/SQL.
  • Extensively used Oracle exception handlers and created user-defined exceptions to capture errors in the designed PL/SQL functions and procedures; a sample handler is sketched after this section.
  • Involved in manual testing of the designed jobs, UAT, integration testing and production process.
  • Used Rational ClearCase and ClearQuest for defect management and version control of the created procedures and UNIX scripts.
  • Developed UNIX shell scripts to automate file manipulation and data loading process.
  • Supported during the QA phases of the project

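A minimal sketch of the PL/SQL exception handling referenced above, executed through sqlplus from a shell script; the user-defined exception aborts the load when a (hypothetical) staging table is empty, and all object and connection names are illustrative only.

#!/bin/sh
# Illustrative PL/SQL error-handling sketch; stg_claims, dw_claims and the DB_* variables are placeholders.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
DECLARE
    e_empty_stage EXCEPTION;   -- user-defined exception
    v_row_count   NUMBER;
BEGIN
    SELECT COUNT(*) INTO v_row_count FROM stg_claims;

    IF v_row_count = 0 THEN
        RAISE e_empty_stage;   -- abort the load when the stage table is empty
    END IF;

    INSERT INTO dw_claims (claim_id, claim_amt, load_dt)
    SELECT claim_id, claim_amt, SYSDATE FROM stg_claims;
    COMMIT;
EXCEPTION
    WHEN e_empty_stage THEN
        DBMS_OUTPUT.PUT_LINE('stg_claims is empty - load skipped');
        RAISE_APPLICATION_ERROR(-20001, 'stg_claims contained no rows');
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END;
/
EOF
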
Environment: Ascential DataStage 7.5.2, Teradata, HP-UX

Confidential, Columbus, OH

DataStage Developer

Responsibilities:

  • Interacted with business users extensively in source data analysis and data cleansing.
  • Actively interacted with the data model architect in developing the data mart design with fact and dimension tables for the data warehouse and reporting purposes.
  • Created data flow diagrams and workflow/process models.
  • Designed Parallel jobs using different parallel job stages such as Join, Merge, Lookup, Filter, Change Data Capture, Modify and Aggregator.
  • Used Peek, Head, Tail and Sample stages for debugging parallel jobs.
  • Created Server jobs to fetch data from AS/400 systems.
  • Used DataStage Manager for importing metadata from the repository and for importing and exporting jobs between different environments.
  • Used DataStage Director to schedule, monitor, analyze performance of individual stages and run multiple instances of a job.
  • Created SQL*Loader scripts to load data into the staging environment; a sample control file is sketched after this section.
  • Effectively used database indexes to tune the designed SQL queries, thereby reducing execution time.
  • Created UNIX shell scripts to automate the execution of procedures and SQL scripts.
  • Created UNIX scripts using the sort, cut, head, tail, grep commands to work with the source sequential files.

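A minimal sketch of an SQL*Loader staging load of the kind referenced above: the shell script writes a control file and invokes sqlldr. The table, file names, delimiter and connection variables are assumptions for illustration only.

#!/bin/sh
# Illustrative SQL*Loader sketch; control-file contents, table and file names are placeholders.
cat > /tmp/stg_orders.ctl <<'EOF'
LOAD DATA
INFILE '/data/landing/orders_feed.dat'
BADFILE '/data/landing/orders_feed.bad'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  order_id,
  customer_id,
  order_amt,
  order_dt DATE "YYYY-MM-DD"
)
EOF

# Run the load and keep a log for failure analysis
sqlldr userid="$DB_USER/$DB_PASS@$DB_SID" \
       control=/tmp/stg_orders.ctl \
       log=/data/landing/orders_feed.log
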
Environment: DataStage 7.5.2, Control-M, Oracle 9i, DB2 UDB 7.2

Confidential

DataStage Developer

Responsibilities:

  • Created jobs for checking data validation rules
  • Used Join, Merge and Lookup stages to combine data
  • Used Oracle Enterprise stage for loading and updating tables
  • Unit tested UNIX scripts and DataStage jobs
  • Created job sequences to batch DataStage jobs
  • Prepared necessary documentation for the project
  • Tested DataStage jobs
  • Batched jobs using job sequences
  • Created random data for testing using Column generator and Row generator.
  • Used Sort stage and Remove duplicate stages for applicable business scenarios.
  • Performed unit testing
  • Prepared SQL scripts for testing purposes; sample validation queries are sketched after this section

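A minimal sketch of the kind of SQL used for testing, as referenced above: a row-count reconciliation and a duplicate-key check run through sqlplus. Table names (stg_policy, dw_policy) and the connection variables are placeholders.

#!/bin/sh
# Illustrative test-query sketch; table names and DB_* variables are placeholders.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
-- Row-count reconciliation between the source feed table and the loaded target
SELECT (SELECT COUNT(*) FROM stg_policy) AS src_rows,
       (SELECT COUNT(*) FROM dw_policy)  AS tgt_rows
FROM   dual;

-- Duplicate-key check on the target table
SELECT policy_nbr, COUNT(*) AS dup_count
FROM   dw_policy
GROUP BY policy_nbr
HAVING COUNT(*) > 1;
EOF
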
Environment: Ascential DataStage 7.5.1, Oracle 9i, IBM AIX
