
Sr. ETL/DataStage Developer Resume


Baltimore, MD

SUMMARY:

  • 6+ years of experience in Data Warehousing design, analysis, and development using IBM InfoSphere DataStage 11.x/9.x/8.x/7.5 (DataStage Designer, Director, Manager, and Administrator), MetaStage 7.5, QualityStage, ProfileStage, Quality Manager, Oracle, DB2, Teradata, and UNIX.
  • Good exposure to other ETL tools such as Informatica 9.6.1/9.1/8.x.
  • Worked extensively with different database stages of DataStage like Oracle Enterprise, Dynamic RDBMS, DB2, Stored Procedure, ODBC.
  • Worked on DataStage migration from v8.5 to v11.5.
  • Extensive experience in creating repository, source, and target databases and developing extraction, transformation, and loading strategies using DataStage 11.5/9.1/8.7/8.5/8.1/7.5 on Windows and UNIX environments.
  • Strong experience in Data warehouse development life cycle and Design of Data marts with Star Schemas, Snowflake Schemas and Integrated Schemas.
  • Experience in migrating DataStage jobs from 9.1 to 11.5 versions.
  • Used the DataStage Designer to develop processes for extracting, transforming, and loading data into data warehouse databases.
  • Extensively worked with Parallel Extender for splitting bulk data into subsets to distribute the data to all available processors to achieve best job performance.
  • Extensive experience and well versed in UNIX commands (shell, AWK, sed, wildcards).
  • Maintained and modified Shell scripts.
  • Expertise in database design and development on client/server applications using Oracle 12C/11g/10g/9i/8i, SQL, PL/SQL, Forms & Reports
  • Good understanding of extracting data from different sources such as AS400/DB2, various source databases, and flat files.
  • Strong experience in making DataStage processes as repeatable as possible, reducing man-hours and defect ratios and enabling faster, more cost-effective delivery.
  • Strong working experience in designing DataStage job scheduling both outside and within the DataStage tool, as required by client/customer company standards.
  • Worked on building new environments: defining config files, ODBC configurations, and environment variables.
  • Involved in the migration of jobs when migrating from DB2 to Netezza.
  • Hands-on experience with stages such as Lookup, Join, Merge, Lookup File Set, Dataset, Row Generator, Column Generator, Funnel, Peek, Transformer, Change Capture, Remove Duplicates, External Filter, Sort, Head, Tail, BASIC Transformer, ODBC/Teradata/DB2 Connectors, Oracle Enterprise, and Copy.
  • Involved in developing and reviewing the Scrubbing, Matching, Survivorship rules and standardization using Quality Stage.
  • Performed thorough data profiling using the Investigate stage of QualityStage and by writing PL/SQL queries to identify and analyze data anomalies, patterns, and inconsistencies.
  • Experience in integration of data from various data sources like DB2, Oracle, Netezza, Teradata, Sybase, AS400, Universe, MS SQL server, Web Services and XML files.
  • Created, updated and reviewed SDLC life cycle documents like Functional requirement specification, Software development specification, Test Strategy, Test Specification, Change Request, Impact Analysis, etc.
  • Defined test cases, executed test cycles, and documented unit test and system integration test results.
  • Experience in creating required documentation for production support hand off, and training production support on commonly encountered problems, possible causes for problems and ways to debug them.
  • A hardworking individual with strong analytical, problem solving and excellent communication skills.
  • Excellent interpersonal and communication skills, and experienced in working with senior level managers, business people and developers across multiple disciplines.

TECHNICAL SKILLS:

ETL Tools: IBM Information Server (DataStage) 11.5/9.1/8.7/8.5/8.1/8.0.1/7.5 (Designer, Manager, Director, Administrator, Parallel Extender), QualityStage/Integrity, Informatica PowerCenter 9.6.1/9.1.

Operating Systems: Windows XP/2003/2000/98, UNIX (Sun Solaris, AIX, HP-UX), Linux.

Programming Languages: SQL, PL/SQL, Shell Scripting, C, C++, C#.Net, VB.Net, HTML.

Databases: Teradata, Oracle 11g/10g/9i/8i/8.0/7.0, Netezza, MS SQL Server, DB2 UDB 9.7/8.0/7.0

Scheduling Tools: Control-M, Autosys

Methodologies: Star Schema, Snowflake Schema

Other Tools: Toad, Rapid SQL, Win SQL, SQL Assistant, SQL Tools 1.4, Informatica PowerExchange 9.6.1/9.1/8.6.1, FastLoad, MultiLoad, iSeries Navigator, DbVisualizer, DB2 AS/400, Microsoft Visio Professional 5.0/2002, PuTTY, Autosys.

PROFESSIONAL EXPERIENCE:

Confidential, Baltimore, MD

Sr. ETL/DataStage Developer

Responsibilities:

  • Worked on DataStage migration from v8.5 to v11.5.
  • Worked extensively with the Designer, Director, and Administrator of InfoSphere DataStage 11.5/9.1/8.7/8.5/8.1.
  • Created, updated, and reviewed SDLC documents such as Functional Requirement Specification, Software Development Specification, Test Strategy, Test Specification, Change Request, and Impact Analysis.
  • Dealt extensively with data from Teradata sources.
  • Designed and developed Extract, Transform, and Load (ETL) processes for extracting data from various legacy systems and loading it into target tables using SQL and DataStage Enterprise Edition.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the IBM DB2 database.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Populated Data Marts at different levels of granularity for Vendors using DataStage, SQL scripts and stored procedures (being executed from shell scripts driven by Control-M).
  • Involved in Migration of Databases from Oracle 10g to Oracle 11g and Oracle 12C.
  • Understood the entire business flow of the project from beginning to end.
  • Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Designed and developed the ETL (Extract, Transform & Load) strategy to populate the Data Warehouse from various source systems such as Oracle, Oracle GG, flat files, XML, and MySQL.
  • Extensively used Sequential File, DB2, Join, Lookup, Transformer, Dataset, Filter, Merge, Sort, and Remove Duplicates stages when designing jobs in the DataStage Designer.
  • Worked with data feeds from various source systems from flat files, XML files, DB2 databases.
  • Extensively used file stages such as Sequential File and File Set for extracting and reading data.
  • Applied advanced techniques in jobs to improve performance.
  • Involved in analyzing and modifying existing scripts.
  • Developed Sequences and used different Stages like Execute Command, Job Activity, Notification Activity, Routine Activity, Sequencer, and Wait for File Activity stages.
  • Coordinated the code and data movement to production in production implementation activities. Also involved in the effort of integration testing of Datastage and shell code with scheduler.
  • Worked on SFTP setup and jobs to push/pull data between servers.
  • Tuned DataStage jobs for performance and ease of implementation.
  • Transferred files from one server to another through FTP.
  • Defined test cases as a part of user testing and drove testing cycle execution of both SIT and UAT. Also documented the results for test cycles.
  • Identified, resolved source file format issues for production loading & data quality.
  • Enhanced jobs per the requirements and performed unit testing of the jobs.
  • Fixed QA defects within the timelines.
  • Involved in production support activities.
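The SFTP push/pull jobs mentioned above typically drive `sftp` in batch mode from a shell wrapper. A minimal sketch, assuming hypothetical hosts and paths (none of these names come from the actual project): the wrapper builds a batch file of sftp commands, which a real job would then feed to `sftp -b`.

```shell
#!/bin/sh
# Sketch of an SFTP push job: build a batch file of sftp commands,
# then run sftp in batch mode. Host, user, and paths are hypothetical.

make_sftp_batch() {
    file_to_push="$1"   # local file to send
    remote_dir="$2"     # destination directory on the remote server
    batch_file="$3"     # where to write the sftp command list

    {
        echo "cd $remote_dir"
        echo "put $file_to_push"
        echo "bye"
    } > "$batch_file"
}

# In the real job the batch file would be fed to sftp, e.g.:
#   make_sftp_batch /data/out/extract.dat /inbound sftp.cmds
#   sftp -b sftp.cmds etluser@target-server
```

Keeping the command list in a generated batch file makes the transfer step easy to log and rerun, which matters when the job is restarted by a scheduler.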

Environment: InfoSphere DataStage 11.5/9.1/8.5 (Administrator, Director, Designer), flat files, DB2, Rapid SQL, Teradata, SOAP, Application Lifecycle Management, PuTTY, Autosys scheduler, Oracle 11g, Sybase, MS Visio, GitEye.

Confidential, St. Louis, MO

ETL/ Datastage Developer

Responsibilities:

  • Created new projects in Datastage Administrator and have set project level parameter settings.
  • Worked on DataStage version upgrade from Ascential DataStage 7.5.2 to InfoSphere DataStage 8.5.
  • Imported and exported DataStage components.
  • Involved in creating .odbc.ini file for all the ODBC entries in Datastage server box.
  • Involved in creating DB2 catalog and node entries for all the DB2 APIs.
  • Used QualityStage to ensure consistency, removing data anomalies and spelling errors of the source information before being delivered for further processing.
  • Designed and developed reusable Mapplets and used mapping variables and mapping parameters in ETL mappings.
  • Used Quality Stage to check the data quality of the source system prior to ETL process.
  • Killed hung DataStage processes on the DataStage server.
  • Involved in the design and development of Data Warehouse.
  • Transferring files from one server to other through FTP.
  • Performed Oracle RAC configuration on 9i/10g/11g/12c.
  • Preparation of technical specification for the development of Extraction, Transformation and Loading (ETL) jobs to load data into various tables in Data marts.
  • Wrote custom support modules for upgrade implementation using PL/SQL and Unix Shell Scripts.
  • Developed various jobs using stages ODBC, ODBC Connector, DB2, Look-up, Merge, Join, Funnel, Aggregator, Remove Duplicates, Filter, Copy, Sort, Peek, Row Generator, Dataset, Sequential File, Modify, FTP, Difference, Pivot and Transformer.
  • Creating environment variables, job parameters and parameter sets based on the requirements.
  • Extensively used Teradata Connector/Enterprise stages to extract data from and load data into Teradata tables.
  • Identified and resolved source file format issues for production loading and data quality issues post-loading.
  • Used the Netezza Enterprise stage for loads into the Netezza database.
  • Used QualityStage to perform Data Investigation, Data Standardization, etc.
  • Created shell scripts to archive files older than n days and remove them from original location.
  • Developed Sequences and used different Stages like Execute Command, Job Activity, Notification Activity, Routine Activity, Sequencer, and Wait for File Activity stages
  • Implemented Shared Containers for multiple jobs that have the same business logic.
  • Used DataStage Director and its run-time engine to schedule job runs, test and debug their components, and monitor the resulting executable versions.
  • Created Master Job Sequencer to run jobs based on dependencies.
  • Have extensive experience in analysis and coding of UNIX shell scripting and scheduling DataStage jobs through DS-Director.
  • Performing Unit Testing and integration testing.
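The file-archival bullet above can be sketched as a small shell script. This is a minimal illustration, not the project's actual code; the directory layout and retention window are placeholders.

```shell
#!/bin/sh
# Archive files older than N days from a source directory, then
# remove them from the original location (hypothetical paths).

archive_old_files() {
    src_dir="$1"      # directory to scan
    archive_dir="$2"  # where aged files are moved
    days="$3"         # retention window in days

    mkdir -p "$archive_dir"

    # -mtime +N matches files last modified more than N days ago.
    # Moving the file both archives it and removes it from src_dir.
    find "$src_dir" -maxdepth 1 -type f -mtime +"$days" \
        -exec mv {} "$archive_dir"/ \;
}
```

For example, `archive_old_files /data/landing /data/archive 30` would move every flat file in the landing area untouched for more than 30 days into the archive directory.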

Environment: InfoSphere DataStage 9.1/8.5 (Administrator, Director, Designer, QualityStage), DB2, SOAP, Teradata, Netezza 6.0, Oracle 10g, SQL Server, iSeries Navigator, DB2 Client, Sybase, WinSCP, Toad, SQL Assistant, Super PuTTY, Autosys.

Confidential, Cranston, RI.

Datastage Developer

Responsibilities:

  • Analyzed requirements and created high-level and low-level technical design documents.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Based on the Business requirements developed DataStage jobs for transformation and cleansing of data.
  • Worked with Business customers to identify the different sources of data in operational systems and developed strategies to build data warehouse.
  • Designed and developed Extract, Transform, and Load (ETL) processes for extracting data from various legacy systems and loading it into target tables using SQL and DataStage Enterprise Edition.
  • Developed jobs in Parallel Extender using different stages such as Join, Lookup, DRS, Copy, Filter, and Funnel.
  • Used QualityStage to ensure consistency, removing data anomalies and spelling errors of the source information before being delivered for further processing.
  • Used parameter sets and variables to make jobs and sequencers more flexible.
  • Involved in creating and maintaining Sequencer jobs.
  • Unit testing the jobs and working actively in the integration testing.
  • Implemented the SCD Type2 to keep track of historical data.
  • Actively coordinated and communicated with the corresponding teams, as the developed Interface has the dependency with other team’s interfaces.
  • Set up and scheduled jobs in DataStage, which in turn utilized Autosys/at on the backend.
  • Used DataStage Director to verify logs and monitoring jobs during run session.
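Scheduling of this kind is usually expressed in Autosys JIL. The fragment below is a hypothetical illustration only (job name, machine, user, paths, and times are invented, and the exact `dsjob` invocation would depend on the installation):

```
/* Hypothetical Autosys JIL: run a DataStage job via dsjob each night */
insert_job: dw_load_customers
job_type: c
command: /opt/IBM/InformationServer/Server/DSEngine/bin/dsjob -run -jobstatus DWPROJ LoadCustomers
machine: etl_server
owner: etluser
start_times: "02:00"
description: "Nightly customer dimension load"
std_out_file: /var/log/autosys/dw_load_customers.out
std_err_file: /var/log/autosys/dw_load_customers.err
```

Wrapping the `dsjob` call in a JIL command job lets Autosys own the calendar and dependencies while DataStage Director remains available for log inspection and monitoring.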

Environment: InfoSphere Datastage 9.1/8.5/7.5 (Administrator, Director, Designer), Flat files,DB2, Aqua Data studio, Application Life cycle management, Putty, Autosys scheduler, MS Visio

Confidential

Datastage Developer

Responsibilities:

  • Analyzed the BRD.
  • Developed DataStage jobs using DataStage Designer.
  • Used Teradata SQL Assistant to fetch data from the Teradata database.
  • Used Toad to fetch data from the Oracle database.
  • Prepared unit test cases and test plans. Executed the test cases, captured the results. Supported the SIT testing, UAT testing.
  • Involved in Migration of Jobs from Dev Environment to Testing Environment
  • Used Aggregator, Lookup, Join, Merge, Dataset, Transformer, Sequencer, Sequential File, DB2 Bulk Load, Hashed File, and Surrogate Key Generator stages.
  • Supported the system post production and worked in co-ordination with the production support teams to resolve any issues.
  • Involved in Daily Status call with client
  • Used Datastage director to run and monitor jobs
  • Used Shared Containers to reuse SAP code block logic.

Environment: InfoSphere DataStage 7.x (Administrator, Director, Designer), flat files, Teradata, Oracle, Toad, PuTTY, Autosys scheduler
