- Over seven years of ETL and data integration experience in the IT industry as a developer, spanning system analysis, design, development, testing, and support of projects using DataStage 7.5/8.1/8.5/9.1/11.3 in the Health Care, Banking, and Automobile Insurance domains.
- Experience across the Software Development Life Cycle (SDLC): analysis, design, development, and testing, including requirement gathering, client interaction, and use-case design.
- Hands-on experience in the design and development of complex DataStage jobs and sequencers, as well as FastTrack, Metadata Workbench, and Business Glossary.
- Hands-on experience with Autosys and Control-M job schedulers.
- Experience in UNIX shell scripting, troubleshooting, and file handling on UNIX systems.
- Hands-on experience with RDBMSs such as Oracle, Netezza, SQL Server, Teradata, and DB2; working experience in data modeling and implementing stored procedures in PL/SQL; extensive experience writing complex SQL queries.
- Demonstrated work experience with state-of-the-art DataStage grid environments.
- Experience in Data Integration, EDW and Data Mart projects.
- Strong knowledge of OLAP and OLTP systems, and of dimensional modeling using star and snowflake schemas.
- Demonstrated work experience in Mainframe to Datastage Migration projects.
- Working experience in DB2 cursors, PL/SQL triggers and stored procedures.
- Experienced in designing and using process-tracking mechanisms and communication designs that give a snapshot of job status across an application or system.
- Worked on integrating data from sequential files, flat files, COBOL files and XML files.
- Used Toad for analyzing data, writing SQL and PL/SQL scripts, and performing DDL operations.
- Experience in working closely with mainframe applications for the ETL interactions.
- Prepared high-level, low-level, and technical design documents for various projects; strong at bug fixing, code reviews, and unit and system testing.
- Expertise in application development across DataStage versions 7.5/8.1/8.5/9.1.
- Expertise in creating Data Mapping for various projects and applications.
- Experience working on multi-dimensional warehouse projects.
- Experience in reading and loading high-volume Type 2 dimensions by implementing SCD (Slowly Changing Dimensions).
- Experience using mainframe applications for browsing files and transferring them via NDM (Connect:Direct).
- Hands-on experience working with Business Objects XI (BO XI) for report generation.
- Good experience working with HP ALM (Quality Center).
- Good experience designing metadata for all data movements and reusing it during job design with the help of Metadata Management and FastTrack mapping specifications.
- Provided 24x7 production support to maintain application stability.
- Quick learner, up to date with industry trends; excellent written and oral communication, analytical, and problem-solving skills; a good team player, well organized, and able to work independently.
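The process-tracking mechanisms mentioned in the summary could be sketched as a small shell report over a pipe-delimited job-audit file; the audit-file layout, field order, and function name below are illustrative assumptions, not the actual design:

```shell
#!/bin/sh
# Hypothetical audit-file layout, one line per job:
#   JOB_NAME|STATUS|START_TS|END_TS
# Counts jobs per status so support staff can see at a glance how many
# jobs finished, aborted, or are still running.

job_snapshot() {
    audit="$1"
    cut -d'|' -f2 "$audit" | sort | uniq -c |
        awk '{printf "%-10s %s\n", $2, $1}'
}
```

Run against an audit file, it prints one line per status with its job count.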
Languages: SQL, PL/SQL, Unix shell scripting, COBOL
ETL Tools: DataStage 7.5/8.1/8.5/9.1/11.3
DataStage Components: Designer, Director, Administrator, FastTrack, Metadata Workbench, Business Glossary
Databases: Oracle 10g, DB2, Teradata, SQL Server, Netezza
Software: TOAD for Oracle, TOAD for DB2, WinSCP, MS SQL Server 2008 Management Studio, SQL*Plus, SQL Developer, Queryman, AQT for DB2, AQT for Oracle, Mainframes
Other Tools: ClearCase, Tortoise SVN, HP ALM
Operating Systems: UNIX (AIX), Linux, Windows 95/98/2000/XP/2003, MS-DOS, z/OS
Reporting Tools: Business Objects XI R2 (BO XI)
Confidential, Providence, RI
ETL Datastage Developer
- Defined and implemented an end-to-end enterprise data integration strategy
- Promoted data and business rules reuse and understanding
- Standardized the data content.
- Created Datastage 8.5 and 11.3 jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Surrogate Key, Column Generator, Difference, Row Generator, Sequencer, Email Communication activity, Command activity.
- Centralized the business rules management
- Used several stages like Sequential file, Hash file, Aggregator, Funnel, Change Capture, Change Apply, Row Generator, Peek, Remove Duplicates, Copy, Lookup, Join, Merge, Filter, Datasets during the development process of the DataStage jobs.
- Worked on standing up the performance environment: created value files, parameter sets, and UNIX shell scripts, and ran one-time PL/SQL scripts to insert/update values in various tables; deployed the code to all other test environments and supported QA through all their test cases.
- Measured and improved data quality.
- Enabled Enterprise data warehouse
- Designed and developed solutions to complex application problems.
- Maintained and provided support for existing systems.
- Developed new functionality and interface as per the business requirements and integrated them with the legacy application.
- Identified areas of the application where improvement and automation of manual processes were required, and implemented the improvements.
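The environment stand-up work described above (distributing value files across test environments) might look roughly like this shell sketch; DEPLOY_ROOT, the directory layout, the environment names, and the function name are all hypothetical:

```shell
#!/bin/sh
# Sketch: copy a job's value file into each target environment's
# ParameterSets directory. DEPLOY_ROOT and the layout are assumptions.

deploy_value_file() {
    vfile="$1"; shift                 # source value file
    for env in "$@"; do               # e.g. SIT UAT PERF
        dest="$DEPLOY_ROOT/$env/ParameterSets/$(basename "$vfile")"
        mkdir -p "$(dirname "$dest")" || return 1
        cp "$vfile" "$dest" && echo "deployed $(basename "$vfile") to $env"
    done
}
```

A single call can then push one value file to every environment in the promotion path.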
Environment: Netezza, Oracle, Datastage 8.1, 11.3, CA7 Scheduler, Unix Shell scripting, MS SQL Server 2008 R2.
Confidential, Windsor, CT
- Involved in gathering Business Requirements from Business users. Analyzed all jobs in the project and prepared ADS document for the impacted jobs.
- Made the required changes in the job according to the Business requirements.
- Coordinated the offshore team, ensuring smooth delivery of quality work on time.
- Used SQL and PL/SQL explain plans in Queryman and SQL Developer to fine-tune the SQL used to extract data in database stages.
- Designed, developed and tested the DataStage jobs using Designer and Director based on business requirements and business rules to load data from source to target tables.
- Modified the existing job with new functionality in the code.
- Prepared the test cases for system test.
- Excellent with PL/SQL, T-SQL, stored procedures, database triggers, and SQL*Loader.
- Resolved defects raised by QA.
- Established best practices for DataStage jobs to ensure optimal performance, reusability, and restartability.
- Involved in developing the Cognos reports by writing complex SQL queries
- Used Autosys to schedule, run and monitor Datastage jobs.
- Extracted data from the DB2 database and loaded it into downstream mainframe files for report generation.
Environment: IBM InfoSphere Information Server DataStage 9.1, UNIX, AIX, DB2, Netezza, Mainframe files, CA7, Job Control, SVN, TeamForge, HP QC.
Confidential, Durham, NC
- Identified source systems, their connectivity, related tables and fields and ensured data suitability for mapping.
- Prepared FastTrack mapping specifications, created metadata layouts in Metadata Workbench, and updated the Business Glossary.
- Prepared the Data Mapping Document (DMD) and designed the ETL jobs based on it, using the required tables in the Dev environment.
- Actively participated in decision-making and QA meetings, and regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements, and design.
- Studied the PL/SQL code developed to relate the source and target mappings.
- Used DataStage as the ETL tool to extract data from source systems and load it into the IBM DB2 database.
- Used DataStage Director and its run-time engine for job monitoring, testing and debugging job components, and monitoring the resulting executables on an ad hoc or scheduled basis.
- Also worked on various enhancements to fact tables.
- Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
- Performed Performance testing with different sets of node configuration, different queue and different volumes.
- Prepared DML statements for maintenance tables, then reviewed, tested, and executed them.
- Used the Tortoise SVN version-control tool for version control and movement of code to higher environments such as SIT, UAT, pre-production, and production.
Environment: IBM WebSphere DataStage 8.5 Parallel Extender, Grid environment, FastTrack, Metadata Workbench, Business Glossary, Web Services, QualityStage 8.5, Microsoft Visio, IBM AIX 4.2/4.1, SQL Server, IBM DB2, Oracle 11g, Autosys Scheduler, Unix, Windows, HP ALM
Confidential, WI
- Implemented a data quality process that transforms input fields into the data types used in the target database tables, performs basic checks on the data fields, and reports data errors.
- Developed Subject Area Jobs for performing all the transformations required in the data process for creating the load ready files.
- Strong Experience in writing PL/SQL, Stored Procedures, Functions and Database Triggers.
- Extensively dealt with change capture techniques for implementing slowly changing dimensions process.
- Developed shell scripts to automate file manipulation and data-loading procedures.
- Used Shared container for simplifying design and maintenance.
- Experienced in fine-tuning, troubleshooting, bug fixing, and defect analysis in DataStage jobs.
- Used error-handler jobs to capture lookup failures, duplicates, and data-type errors, writing them to an error-log detail table.
- Used the DB2/UDB stage to load data into mart tables and the DB2 Bulk Load stage to load data into staging tables.
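A file-manipulation automation of the kind described above could be sketched as follows; the archive layout, retention count, and function name are assumptions for illustration:

```shell
#!/bin/sh
# Sketch: compress a processed load file into an archive directory with a
# timestamp suffix, keeping only the most recent copies. Names are
# illustrative, not taken from the actual application.

archive_load_file() {
    src="$1"; arch_dir="$2"; keep="$3"
    mkdir -p "$arch_dir" || return 1
    stamp=$(date +%Y%m%d%H%M%S)
    name=$(basename "$src")
    gzip -c "$src" > "$arch_dir/$name.$stamp.gz" || return 1
    # Prune archived copies beyond the retention count, oldest first.
    ls -1t "$arch_dir/$name".*.gz 2>/dev/null |
        tail -n +"$((keep + 1))" | xargs -r rm -f
}
```

Called after each successful load, this keeps the archive directory bounded while preserving recent history for reruns.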
Environment: DataStage/QualityStage 8.0 (IBM WebSphere DataStage and QualityStage Designer, Director, Administrator), DB2 UDB 9.2, SQL Server 2000, Linux 10, SQL, PL/SQL, UNIX Shell Scripting, Microsoft Visio, DB2 Visualizer, MS SQL Server, Mainframe, COBOL.
Confidential
- Studied the business requirements and prepared the impact-analysis document.
- Prepared the technical specification document; upon its review, developed the solution using DataStage jobs and sequencers.
- Used sequential file stage as the source for most of the source systems.
- Developed a file-check process that validates the format, volume, and date in each file, determining whether the source sent the right file and whether the right file is being loaded into the database.
- Used the aggregator, lookup, join, merge, dataset, transformer, sequencer, sequential file, DB2 bulk load, hashed file, and surrogate key generator stages.
- Created DDL statements for new tables, changes to table structure, index changes, and creation of triggers and stored procedures.
- Knowledge in using PL/SQL to write stored procedures, functions, and triggers.
- Prepared unit test cases and test plans.
- Executed the test cases, captured the results.
- Supported the SIT testing, UAT testing.
- Worked on packaging the code using the Tortoise SVN version-control tool and worked with the respective teams to deploy it.
- Supported the system post production and worked in co-ordination with the production support teams to resolve any issues.
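The file-check process described in this section could be sketched roughly like this; the NAME_YYYYMMDD.dat naming convention, the 'HDR' header tag, and the volume threshold are assumptions, not the actual specification:

```shell
#!/bin/sh
# Sketch of a file-check process: validate the file's name date, header
# record, and row volume before allowing a load. The naming convention
# (NAME_YYYYMMDD.dat), header tag, and thresholds are assumptions.

check_inbound_file() {
    file="$1"       # e.g. CLAIMS_20240131.dat
    min_rows="$2"   # minimum expected number of detail rows

    base=$(basename "$file")

    # 1. The name must carry a YYYYMMDD business date before ".dat".
    date_part=$(echo "$base" | sed -n 's/.*_\([0-9]\{8\}\)\.dat$/\1/p')
    [ -n "$date_part" ] || { echo "BAD NAME: $base"; return 1; }

    # 2. The first record must carry the expected header tag.
    head -n 1 "$file" | grep -q '^HDR' || { echo "BAD HEADER: $base"; return 2; }

    # 3. Volume check: detail rows, excluding header and trailer records.
    rows=$(($(wc -l < "$file") - 2))
    [ "$rows" -ge "$min_rows" ] || { echo "LOW VOLUME: $rows rows"; return 3; }

    echo "OK: $base ($rows rows, business date $date_part)"
}
```

A nonzero return code from the check would stop the load and route the file to a reject area for investigation.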
Environment: DataStage 8.1 (Designer, Director, Manager, Administrator) Enterprise Edition, SQL Server 2005, IBM DB2, AS/400, ERwin 4.0, MS Visio 2000.