DataStage Developer Resume

McLean, VA

SUMMARY:

  • Over 8 years of experience in administration, analysis, data modeling, design, development, testing, deployment, and production support of Data Mart, Data Warehouse, and IBM InfoSphere Information Server 11.3/9.x/8.x/7.x systems.
  • Collaborate with business users to analyze the business requirements and develop the assessment, business process, proposed IT solutions, strategy, and technology roadmap documents.
  • Expertise in Agile and waterfall development methodologies
  • Perform peer reviews on architecture, design, code, and standards documentation for Data Mart, Data Warehouse, and ETL systems.
  • Expertise in designing and development of ETL mappings.
  • Expertise in IBM DataStage Designer, Manager, Director, and Administrator.
  • Expertise in administering, developing, and designing ETL methodology to support data transformations using IBM DataStage 11.x/9.x/8.x/7.x.
  • Built ETL jobs to load data from different sources into data marts and data warehouse using IBM DataStage 11.3/9.x/8.x/7.x Designer.
  • Expertise in implementing complex Business rules by creating robust mappings, mapplets, and reusable transformations using Informatica Power Center.
  • Working knowledge of different data sources such as flat files, Oracle, SQL Server, XML, and Excel.
  • Experience with complex mappings using Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence generator, Rank.
  • Experience in creating Sessions, Workflows, and Worklets using Workflow Manager tools such as Task Developer, Worklet Designer, and Workflow Designer.
  • Skilled in creating mappings, assigning workflows, testing the flow, and monitoring jobs.
  • Used Debugger to validate the mappings and gain troubleshooting information about data and error conditions.
  • Experience in optimizing query performance, session performance and fine tuning the mappings for optimum performance
  • Experience with pre-session and post-session SQL commands to drop indexes on the target before a session runs, then recreate them when the session completes.
  • Involved in configuring, tuning and maintenance of many DataStage applications.
  • Implemented ETL and ELT framework for Data marts.
  • Expertise in Data Profiling, and Analysis. Developed complex queries in these areas.
  • Expertise in developing QualityStage jobs using Standardize, Investigate, Data Rules, Match Frequency, One-Source Match, and Two-Source Match stages.
  • Experience in developing test plans, test cases, test scripts, and test validation data sets for Data Mart, Data Warehouse, and ETL systems.
  • Expertise in all phases of testing including Unit Testing, Functional Testing, System Testing, Regression Testing, Integration Testing, End to End Testing, Usability Testing, Load/Volume Testing, Performance Testing, and User Acceptance Testing.
  • Developed high- and low-level technical design specification documents.
  • Vast experience in optimizing and working on performance improvement for processes.
  • Expertise in promoting the code from lower environment (DEV) to upper environment (QA/Pre-Production/Production)
  • Excellent knowledge in various operating systems and database systems such as UNIX, LINUX, Windows, Oracle, Netezza, Teradata, SQL Server, DB2.
  • Expertise in Data Warehousing concepts such as Data Cleansing, Slowly Changing Dimension phenomenon (SCD), surrogate key, and CDC (Change Data Capture).
  • Experience in providing support in post-deployment phase and project transition to product support.
  • Always a team player and able to multitask within the team.
  • Involved in mentoring team members to be compliant with industry best practices for developing, administering, and maintaining DataStage applications.
  • Generated pivot tables, VLOOKUP/HLOOKUP, and array formulas using Excel.
  • Detail-oriented with strong problem-solving, organizational, and analytical skills; highly motivated and adaptive, with the ability to learn quickly.
  • Ability to work effectively and efficiently in a team and individual environments with excellent interpersonal, technical and communication skills.
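
As an illustration of the pre-session/post-session SQL pattern noted above, such commands are typically entered in the session properties; the table and index names here are hypothetical:

```sql
-- Hypothetical pre-session SQL: drop the index so the bulk load is not
-- slowed by per-row index maintenance.
DROP INDEX tgt_sales_fact_ix1;

-- Hypothetical post-session SQL: recreate the index once the load completes.
CREATE INDEX tgt_sales_fact_ix1
    ON tgt_sales_fact (customer_key, date_key);
```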

TECHNICAL SKILLS:

Languages: C, C++, SQL, PL/SQL, Pro*C, Linux/UNIX shell scripts, Perl, Java, and JavaScript

Database: Oracle, Netezza, SQL Server 2008/2012, DB2, Teradata, Sybase, MySQL, MS Access, and Hadoop.

Tools: WinSQL, TOAD, SQL*Navigator, Oracle Developer, Erwin, PVCS, Remedy, HP service desk, and HEAT

Special Software: IBM InfoSphere Information Server 11.x/9.x/8.x, IBM DataStage, MicroStrategy, Cognos, Tableau, Crystal Reports, Business Objects, SSRS, Control-M, Autosys.

PROFESSIONAL EXPERIENCE:

Confidential, McLean, VA

DataStage Developer

Responsibilities:

  • Created DataStage jobs to extract, transform, and load data from the CDW (Corporate Data Warehouse) into staging and then into the DB2 database according to the requirements.
  • Assisted the business team in drafting the business requirements for the ETL process; designed and coded ETL jobs per the BRS.
  • Extracted, transformed data from heterogeneous systems like DB2, SQL Server and flat files and loaded into DB2 tables
  • Created shell scripts to invoke DataStage jobs and pre-/post-process files, and scheduled them using the Autosys scheduling tool.
  • Created Autosys JILs (command jobs) to automate the ETL process.
  • Performed data analysis, gap analysis and data validation according to the requirement.
  • Performed Unit testing and System Integration testing by executing different test cases according to the requirement.
  • Created migration forms to deploy the code to higher environments.
  • Created clean up and archival scripts to archive the files once they are successfully loaded into the target tables/DB.
  • Worked on defect resolution for bug fixing in higher environments.
  • Created Error Files and Log Tables containing data with discrepancies to analyze and re-process the data.
  • Used DataStage utility to import / export the Datastage jobs.
  • Promoted the code from lower environment (DEV) to upper environment (QA/UAT/PRODUCTION).
  • Used TeamForge for distributed version control (Git) and to maintain governance, compliance, and code security.
  • Assisted the Scrum Master with all development activities; updated VersionOne to create ETL development tasks.
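
The cleanup/archival scripts described above could look roughly like this minimal sketch; the directory layout, the `*.dat` file pattern, and the function name are assumptions:

```shell
# archive_loaded: sketch of a cleanup/archival step -- once source files have
# been loaded to the target tables, compress them and move them into a
# date-stamped archive directory.
archive_loaded() {
    landing="$1"
    archive="$2"
    stamp=$(date +%Y%m%d)
    mkdir -p "$archive/$stamp"
    for f in "$landing"/*.dat; do
        [ -e "$f" ] || continue               # nothing to archive
        gzip -f "$f"                          # produces $f.gz, removes $f
        mv "$f.gz" "$archive/$stamp/"
    done
}
```

A scheduler step would call this after a successful load, keeping the landing area clean while preserving the source files.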

Environment: IBM InfoSphere DataStage 8.5, SQL Server 2014, DB2-UDB 9.7, AIX 6.0, VersionOne, Autosys, UNIX Shell Scripting, Flat files, MicroStrategy, TeamForge, GitEye, HP Quality Center.
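
An Autosys command job of the kind mentioned above might be defined in JIL along these lines; the job name, machine, paths, and schedule are all hypothetical:

```
/* Hypothetical Autosys JIL for a command job wrapping an ETL script */
insert_job: etl_cdw_stg_load   job_type: CMD
command: /opt/etl/bin/run_stg_load.sh
machine: etlhost01
owner: etlbatch
std_out_file: /var/log/etl/etl_cdw_stg_load.out
std_err_file: /var/log/etl/etl_cdw_stg_load.err
alarm_if_fail: 1
start_times: "02:00"
```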

Confidential, San Francisco, CA

DataStage Developer

Responsibilities:

  • Reviewed business and technical requirements and ensured the data integration platform met requirements.
  • Interacted with business users to gather information on new requirements.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop.
  • Extensively worked on Job Sequences to Control the Execution of the job flow using various Activities & Triggers (Conditional and Unconditional) like Job Activity, Wait for file, Email Notification, Sequencer, Exception handler activity and Execute Command.
  • Ensured DataStage coding standards were met for ease of understanding.
  • Involved in error handling and auditing in the ETL process.
  • Adhered to the SLAs of different modules on a daily basis.
  • Involved in documenting existing as well as upcoming projects from a support perspective.
  • Implemented parallel processing, load balancing, and near-real-time and real-time ETL/ELT processes.
  • Conducted end-to-end testing: executed job flows, investigated system defects, resolved defects, and documented results.
  • Assisted in the development, documentation, and application of best practices and procedures that govern DW implementations and operations.

Environment: IBM DataStage 11.5, DB2, Oracle 11g, Hadoop, Aqua, UNIX Shell scripts, Control - M, Jira, Subversion (version control).

Confidential, Dallas, TX

Informatica / DataStage Developer

Responsibilities:

  • Reviewed business and technical requirements and ensured the data integration platform met requirements.
  • Interacted with business users to gather information on new requirements.
  • Led design, development, and implementation of an end-to-end complex ETL system using DataStage tools.
  • Involved in development, testing, and deployment of DataStage IDW/EDW jobs.
  • Based on requirements and mapping documents, created various DataStage parallel jobs using stages such as Checksum, CDC, Lookup, Join, Transformer, Filter, Funnel, and Surrogate Key Generator.
  • Created sequences for various subject areas.
  • Created DDLs from different sources (EPIC, PATCOM, VISION, LUMINDEX, CAG) and added them to configuration files for different subjects.
  • Ensured DataStage coding standards were met for ease of understanding.
  • Kept the legacy data warehouse application up and running.
  • Adhered to the SLAs of different modules on a daily basis.
  • Involved in documenting existing as well as upcoming projects from a support perspective.
  • Represented the application health status to managers on a weekly basis.
  • Implemented ETL systems that are operationally stable, efficient, and automated.
  • Implemented ETL systems that maximize reusable components/services, collect/share metadata, and incorporate audit, reconciliation, and exception handling.
  • Implemented parallel processing, load balancing, and near-real-time and real-time ETL/ELT processes.
  • Conducted end-to-end testing: executed job flows, investigated system defects, resolved defects, and documented results.
  • Worked with DBAs, application specialists, and technical services to tune performance of the system to meet performance standards.
  • Assisted in the development, documentation, and application of best practices and procedures that govern DW implementations and operations.
  • Worked closely with offshore team members and guided them in case of any issues.
  • Provided off-hours and weekend support as required to meet project milestones.

Environment: IBM DataStage 11.3, Oracle 11g, Netezza 7.1, SQL Server, Putty, TOAD, Secure CRT, UNIX Shell scripts, Autosys, Serena Dimensions, Jira.

Confidential, Omaha, NE

DataStage Developer

Responsibilities:
  • Understand business processes and collaborate with business users/analysts to get specific user requirements.
  • Prepared high- and low-level technical specification documents based on the business requirements.
  • Created Source to Target ETL mapping documents.
  • Architected solutions using IBM DataStage 11.3 to read data from multiple sources such as Oracle, Sybase, and flat files, and to transform and load the data into the Enterprise Data Warehouse.
  • Design and develop PL/SQL Stored procedures to generate the outbound extract files for Third party vendors.
  • Developed Parallel jobs with various stages including Aggregate, Join, Lookup, Sort, Merge, Transformer, Change Capture, Surrogate Key generator and slowly changing dimension (SCD) stages.
  • Developed QualityStage jobs using Standardize, Investigate, Data Rules, and Match Frequency stages to clean up the data from multiple sources.
  • Implemented ETL/ELT framework for Enterprise Data warehouse. Worked on resolving the performance issues.
  • Resolved performance issues for various ETL jobs especially transforming and loading XML formatted files.
  • Wrote complex SQL and PL/SQL queries to generate Data Lineage and Reconciliation reports.
  • Built UNIX scripts to invoke DataStage jobs.
  • Define appropriate Testing (Unit, SIT, Performance, and UAT), Quality Assurance, and Quality Control Strategy.
  • Develop test plans, test cases, test scripts, and test validation data sets for all the newly built interfaces.
  • Developed shared containers for reusability, as well as job templates.
  • Documented industry best practices for developing DataStage solutions.
  • Mentored team members to be compliant with industry best practices for developing, administering, and maintaining DataStage applications.
  • Provided production support and resolved issues in a timely manner.
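
A UNIX wrapper that invokes a DataStage job, as described above, might be sketched as follows; the project and job names are caller-supplied, and the `DSJOB_CMD` override is an assumption added for testability:

```shell
# run_ds_job: minimal sketch of a UNIX wrapper for invoking a DataStage job.
# DSJOB_CMD defaults to the real dsjob binary but can be overridden
# (an assumption added for testing).
run_ds_job() {
    project="$1"
    job="$2"
    "${DSJOB_CMD:-dsjob}" -run -jobstatus "$project" "$job"
    rc=$?
    # With -jobstatus, dsjob's exit status reflects the job status:
    # 1 = finished OK, 2 = finished with warnings; anything else is a failure.
    case "$rc" in
        1|2) echo "Job $job completed (status $rc)"; return 0 ;;
        *)   echo "Job $job FAILED (status $rc)" >&2; return 1 ;;
    esac
}
```

A scheduler such as Autosys or Control-M can then call the wrapper and rely on its exit code to decide success or failure.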

Environment: IBM DataStage 11.5, Oracle, Cognos, Netezza, WinSQL, TOAD, Secure CRT, UNIX Shell scripts, Autosys, Serena Dimensions, HP-Quality Center

Confidential, TX

Informatica Developer

Responsibilities:
  • Proposed system solutions and architected them using Informatica PowerCenter 8.5 for integrating various data sources with the data warehouse.
  • Developed and designed various new processes and fixed existing processes per new business requirements gathered in meetings with users.
  • Designed and developed complex ETL jobs for extracting, transforming, integrating, and loading data into the data mart using Informatica Designer.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and run jobs and to monitor and analyze the performance of individual sessions.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.
  • Extensively used various processing and development transformations to build the mappings.
  • Generated Surrogate Keys for composite attributes while loading the data into Data Warehouse using Key Management functions.
  • Developed Complex ETL jobs to implement Slowly Changing Dimensions.
  • Prepared technical design documents and source to target mapping documents.
  • Involved in configuring, tuning and maintenance of Informatica applications.
  • Developed Informatica jobs using various transformations such as Aggregator, Sort, Expression, Merge, Join, Lookup, SCD, and Sequence Key Generator.
  • Conducted peer designing, code reviews, and standard documentation of ETL procedures.
  • Define appropriate Testing (Unit, SIT, Performance, and UAT), Quality Assurance, and Quality Control Strategy.
  • Develop test plans, test cases, test scripts, and test validation data sets for all the newly built interfaces.
  • Automate all the new ETL jobs using UNIX shell scripts and add data validation checks including business rules and referential integrity checks.
  • Worked on troubleshooting, performance tuning and performances monitoring for enhancement of Informatica jobs.
  • Mentored the team members to be compliant with the Industry best practices for developing, administering and maintenance of the Informatica applications.

Environment: Informatica PowerCenter 8.5, Oracle, Teradata, SQL Server 2012, SQL, PL/SQL, Cognos, DB2, UNIX Shell scripts

Confidential, Irving, TX

DataStage Lead Developer

Responsibilities:
  • Understand business processes and collaborate with business users in requirements analysis, functional design, and technical design documentation.
  • Followed Agile development practices for the Inbound and Outbound ETL interfaces.
  • Propose system solutions and architect the solutions with IBM DataStage for Promote integration.
  • Involved in creating the functional and technical design specification documents for Promote Inbound and Outbound interfaces.
  • Build end-to-end integration between Oracle Promote and Netezza enterprise Data warehouse using IBM DataStage 8.7 to implement promotional planning optimization (PPO) solution.
  • Developed jobs using various stages like Aggregator, Sort, Transformer, Merge, Join, Lookup, Change Data Capture, SCD, Surrogate key generator, Netezza connector, Oracle connector and shared containers.
  • Designed and developed Oracle SQL queries and stored procedures; used the nzload utility.
  • Created data lineage and reconciliation design documents and wrote complex SQL queries to support them.
  • Worked on analyzing the long running queries in Oracle and optimizing them to improve the performance.
  • Developed shell scripts to execute DataStage jobs
  • Develop test plans, test cases, test scripts, and test validation data sets for IBM DataStage integration/ETL processes.
  • Develop data management, information governance strategy documents along with ETL standards document.
  • Implemented ETL/ELT framework to implement the Data warehouse and Data marts
  • Perform peer reviews on architecture, design, code, and standards documentation for ETL processes.
  • Involve in architecting and designing the new ETL processes for Promote Inbound, Outbound and Weekly interfaces.
  • Developed PL/SQL Stored procedures, triggers and functions; Made changes to some of the existing ones as well.
  • Build UNIX shell scripts to automate the DataStage processes through Control-M scheduler.
  • Worked on troubleshooting, performance tuning and performances monitoring for enhancement of DataStage jobs.
  • Involved in mentoring the team members to be compliant with the Industry best practices for developing, administering and maintenance of the DataStage applications.
  • Provide support in post-deployment phase and involve in project transition to production support.

Environment: IBM DataStage 8.7, Oracle 10g, Netezza 6.0, Microstrategy, UNIX Shell scripts, Control-M, PVCS, Pro*C scripts, SQL, PL/SQL, Erwin and Perl.

Confidential, Richardson, TX

Business / Data Warehouse Analyst

Responsibilities:
  • Worked closely with a project team to gather business requirements, and interacted with stakeholders and business users to translate those requirements into functional and technical specifications and data-movement documents for building Java applications.
  • Interacted with Users for verifying User Requirements, managing change control process, updating existing documentation.
  • Facilitated Joint Application Development (JAD) sessions to resolve issues arising from differences between business requirements and technical design.
  • Assisting in the design and documentation of logical and physical analytical enterprise databases, particularly the Enterprise Data Warehouse (EDW)
  • Formulated rules and created flows using PaperFree to build 837 Dental, 837 Professional, and 837 Institutional maps in accordance with HIPAA compliance.
  • Conducted analysis of EDI transactions (834, 835, and 837) and documented and produced metric reports.
  • Consulted with business, IT and third-party resources to identify and document the business reporting needs.
  • Analyzed data profiling reports and evaluated quality, identifying issues and gaps in data with regard to supporting business processes.
  • Performed gap analysis to ensure the appropriate data was obtained to meet the requirements.
  • Created data flow diagrams, data mapping from Source to Target mapping documents indicating the source tables, columns, data types, transformations required and business rules to be applied.
  • Wrote multiple SQL queries based on requirement to obtain data from various sources.
  • Created data flow diagrams, data mapping documents for Real time and Historical reporting.
  • Used various metrics to report key performance indicators for various providers.

Environment: IBM DataStage 8.5, Oracle, Teradata, UNIX Shell scripts, ZENA, Serena Dimensions, HP-Quality Center, SQL Assistant.

Confidential, Colorado Springs, CO

Sr. Informatica / Cognos Developer

Responsibilities:
  • Developed and designed various new processes and fixed existing processes per new business requirements gathered in meetings with users.
  • Designed and developed jobs for extracting, transforming, integrating, and loading data into data mart using Designer.
  • Developed, executed, monitored and validated the ETL DataStage jobs in the DataStage designer and Director Components.
  • Worked with Director to schedule, monitor, analyze performance of individual stages and run jobs.
  • Extensively used various Processing and debug/development stages to develop server and parallel jobs.
  • Created low and high level technical specification documents.
  • Generated Surrogate Keys for composite attributes while loading the data into Data Warehouse using Key Management functions.
  • Developed Complex ETL jobs to implement Slowly Changing Dimensions and Change Data Capture.
  • Implemented ELT framework for few Data marts to achieve more performance.
  • Performed Troubleshooting and Tuning of DataStage Jobs using job parameters, configuration files and environment variables.
  • Worked on SQL query optimizations and performance tuning of Informatica jobs.
  • Provided production support for deployed applications and worked on support tickets to resolve the issues.
  • Analyzed business requirements for reports and worked closely with business analysts to understand the business needs.
  • Performed Metadata Modeling using the Cognos 10 Framework Manager and published the packages.
  • Created dimensions, levels and hierarchies using Framework Manager.
  • Created simple and Ad hoc reports using Cognos 8.1/8.2 Query studio.
  • Involved in creation of advanced reports using Cognos 8.1/8.2 Report Studio.
  • Developed list reports, cross tab reports, chart reports, reports with multiple prompts(Text box, value, search, Date, Time, Interval), cascading prompts and Filters.
  • Created complex reports such as dashboard reports, master detail reports, drill through reports.
  • Applied model filters, query filters, user access filters, detail filters and summary filters to generate user specific reports.
  • Implemented report bursting to distribute reports to several users.
  • Generated prompts based on parameters.
  • Involved in applying conditional formatting in reports.
  • Took an active part in performance tuning and optimization.
  • Actively involved in the administration part of Cognos Connection for organizing, security, scheduling and distributing reports.
  • Designed the Cognos Connection portal according to user specifications.
  • Trained users on navigation and creating simple reports in query studio.

Environment: Informatica 8.5, Netezza 6.0, Cognos 11.2, Oracle 9i, SQL server 2005, DB2, SQL, PL/SQL, WinSQL, Toad, SQL Navigator, Control-M

Confidential, Irving, TX

DataStage Lead Developer

Responsibilities:
  • Played a key role in building the strategy to migrate the Oracle based Retek Data Warehouse (RDW) to Netezza based Enterprise Data Warehouse (EDW).
  • Actively involved in building the strategy to Implement Customer and transaction level data warehouse on Netezza appliance.
  • Document user requirements and translate requirements into system solutions.
  • Architect Star & Snowflake based logical & physical data models for Data Warehouse systems using data modeling tools such as Erwin.
  • Implement the migration plan with Oracle, Pro*C, SQL, PL/SQL, Netezza data warehouse appliance, and IBM DataStage.
  • Involved in creating the functional specification documents for ETL interfaces.
  • Architect, design, develop, deploy, and support of integration processes across the enterprise by utilizing IBM DataStage.
  • Created source to target mapping documents & Data lineage documents.
  • Created low and high level technical specification documents.
  • Develop test plans, test cases, test scripts, and test validation data sets for Data Mart, Data Warehouse, and IBM DataStage integration/ETL processes.
  • Architecting and designing the Customer module for Master Data Management.
  • Designing and Implementing the ETL processes for History load and Incremental loads for EDW, Customer and Transaction level data warehouse.
  • Played major role in designing and implementing proof of concept for migrating the data from RDW (Retek Data Warehouse) to EDW (Enterprise Data Warehouse).
  • Document all the interface processes in current data warehouse system and translate them into new ETL processes using IBM DataStage.
  • Wrote complex SQL queries for data reconciliation and lineage reports.
  • Prepare the Migration scope and validation strategy documents.
  • Perform Data cleansing activities to improve the data quality.
  • Automate all the new DataStage ETL jobs using UNIX shell scripts through Control-M scheduler and add data validation checks including business rules and referential integrity checks.
  • Developed multiple shared container jobs for reusability & created job templates.
  • Create dimension and fact tables in Netezza and perform data loads using nzload utility.
  • Developed PL/SQL Stored procedures to perform database operations.
  • Design and develop Parallel jobs to extract data, clean, transform, and to load the target tables using the DataStage Designer.
  • Utilize DataStage Director to run, schedule, monitor, and test the interfaces and obtain the performance statistics.
  • Perform troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs.
  • Provide support in post-deployment phase and involve in project transition to production support.
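
The data validation checks mentioned above, such as referential integrity between flat-file extracts, might be sketched as follows; the pipe delimiter and column positions are assumptions for illustration:

```shell
# check_ri: sketch of a flat-file data-validation check -- verifies that every
# foreign-key value in column 3 of a fact extract exists in column 1 of a
# dimension extract, printing any orphan rows.
check_ri() {
    dim="$1"
    fact="$2"
    awk -F'|' '
        NR == FNR { keys[$1] = 1; next }   # first file: collect dimension keys
        !($3 in keys) { print "orphan row: " $0; bad = 1 }
        END { exit bad }                   # non-zero exit if orphans were found
    ' "$dim" "$fact"
}
```

Run as a post-load step, a non-zero exit code from the check can fail the scheduled job before bad data reaches downstream marts.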

Confidential, Austin, TX

DataStage / Business Objects Developer

Responsibilities:
  • Understand business processes and collaborate with business users/analysts to get specific user requirements.
  • Develop DataStage Parallel jobs using various stages like Join, Merge, Funnel, Lookup, Sort, Transformer, Copy, Remove Duplicate, Filter, Peek, Column Generator, Pivot and Aggregator stages for grouping and summarizing on key performance indicators used in decision support systems.
  • Involved in configuring, tuning and maintenance of many DataStage applications.
  • Design and develop the Routines and Job Sequence for the ETL jobs and prepare the complete data mappings.
  • Created source to target mapping documents.
  • Created low and high level technical design specification documents.
  • Develop UNIX shell scripts to execute DataStage jobs.
  • Develop test plans, test cases, test scripts, and test validation data sets for Data Mart, Data Warehouse, and ETL processes.
  • Validate, schedule, run, and monitor the ETL jobs using IBM DataStage Director.
  • Manage the ETL repository using IBM DataStage Manager.
  • Supported Disaster Recovery activities for all the InfoSphere suite related applications.
  • Prepare and execute the System Integration Testing (SIT) test cases.
  • Work on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs.
  • Convert complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Participate in weekly status meetings.
  • Designed several universes and used various methods to optimize queries on database, universe and reporting level.
  • Worked towards optimal development of universes by removing loops through the use of aliases and contexts.
  • Created universes with a number of classes and objects, and derived tables using inner/outer and equi-joins as per requirements.
  • Created and tested classes and objects to organize the universe.
  • Exported the universes to the Repository to make resources available to the users.
  • Created reports for various Portfolios using the Universes as the main data Providers.
  • Created reports using Business Objects functionality such as queries, cross tabs, master detail, and formulas.
  • Applied drill-up and drill-down techniques in master detail reports for multi-dimensional analysis of retrieved data.
  • Extensively worked with Ranks, alerters, filters, prompts, variables and calculation contexts in designing reports.

Environment: IBM DataStage 7.5.x, Oracle 9i, SQL server 2005, DB2, Sybase, Business Objects, Erwin, SQL, PL/SQL, Toad, SQL Navigator, Control-M, LINUX/UNIX Shell scripts.
