
ETL DataStage Consultant Resume


Pleasanton, CA

SUMMARY

  • 8+ years of experience in the IT industry involving software analysis, design, implementation, coding, development, unit testing, and maintenance, with a focus on data warehousing applications.
  • 8+ years of IT experience using ETL methodology in data warehouse/data mart/conversion projects, with extraction, transformation, and loading (ETL) processes using IBM InfoSphere Information Server - DataStage versions 11.5/9.1/8.7 Enterprise Edition, Server Edition, and IBM Information Analyzer 8.7/8.1.
  • Strong functional and technical knowledge of IBM InfoSphere DataStage, IBM InfoSphere QualityStage, IBM InfoSphere Information Analyzer, IBM InfoSphere Business Glossary, and Tableau.
  • 7+ years of experience writing SQL on a wide variety of RDBMSs, including Netezza 7.0, Oracle 11g/10g/9i/8i, DB2 UDB 7/8, SQL Server 2016/2012/2008, and Teradata V2R5/V2R6/V12.
  • 5+ years of experience writing UNIX shell scripts to automate file manipulation, file checks, scheduling, and data loading procedures.
  • Experience working in agile environments.
  • 2+ years of experience using Informatica PowerCenter (8.1).
  • 4+ years of experience writing NZSQL and stored procedures in the Netezza database.
  • Expertise in T-SQL statements, stored procedures, table-valued functions, and triggers to meet complex task requirements.
  • Strong Knowledge of Data Warehouse Architecture and Designing Star Schema, Snowflake Schema, FACT and Dimensional Tables, Physical and Logical Data Modeling using Erwin 3.5/4.0.
  • Strong knowledge of all phases of the Software Development Life Cycle (SDLC), including requirement gathering, development, testing, migration, security management, training, and support.
  • Expert knowledge of IBM InfoSphere DataStage best practices and of designing jobs for optimal performance, reusability, and restartability.
  • Worked extensively with IBM DataStage Director to run, schedule, monitor, debug, and test applications in development and to obtain performance statistics, alongside the scheduling tools Autosys and Control-M.
  • Experienced in interacting with business users to analyze business processes and requirements, and in turning requirements into database designs, documentation, and rolled-out deliverables.
  • Worked with the Snowflake database, loading data from Azure containers, creating schemas/stages/tables/views, and running benchmark queries of varying complexity.

TECHNICAL SKILLS

Primary Skills: IBM InfoSphere DataStage 11.5/9.1/8.7/8.5/8.0.1/7.5.1 (Server Edition, Parallel Extender) Designer, Director, Administrator; Information Analyzer 8.7; Informatica PowerCenter 8.1; SQL and UNIX scripting.

Operating System: Windows NT4.0/2000, MS-DOS, UNIX (Solaris, Linux, AIX)

Languages: PL/SQL, UNIX Shell Scripting, Bash

Database: Snowflake, Netezza 7.0, Oracle 11g/10g/9i/8i, MS SQL Server 2016/2012/2008, DB2 UDB 7/8, MS Access, Teradata V2R5/V2R6/V12

Domain Knowledge: Health Care and Retail domains.

Other Tools: SQL Developer 1.5.4, SQL Programmer, TOAD 7.4/7.5, Teradata SQL Assistant 13.10, Advanced Query Tool 9.0.7, Aginity Workbench for PureData System for Analytics 4.9

Scheduling Tools: Autosys, Control M, TWS

Version Control: SharePoint, SVN, GIT, Clear Case, Visual Source Safe.

PROFESSIONAL EXPERIENCE

Confidential, Pleasanton, CA

ETL DataStage Consultant

Responsibilities:

  • Worked with the business to understand the scope of requirements, and analyzed and verified requirements for completeness, consistency, comprehensibility, feasibility, and conformance to standards.
  • Using the IBM InfoSphere DataStage Designer, extracted data from various source databases such as SAP HANA, SQL Server, Oracle, and Salesforce.
  • Worked with various IBM InfoSphere DataStage stages, such as the ODBC connector, the Azure Storage connectors, and the Snowflake connector.
  • Using IBM InfoSphere Information Server, created and built packages for migration from one environment to another.
  • Developed an ETL pipeline to extract tables from various sources, load them into intermediate storage in Microsoft Azure Blob Storage, and then load the data into the target Snowflake database schemas.
  • Built the ETL framework for both truncate-and-load and incremental logic for extracting data from sources and loading it into the Snowflake database.
  • Created audit tables for ETL auditing, log maintenance, exception handling, and error-communication standards.
  • Created schemas, tables, and views in the corresponding Snowflake databases as per the requirements.
  • In the Snowflake database, created external stages to load data directly from the Azure container into Snowflake tables.
  • To improve performance while loading data into the Snowflake database, created scripts using the COPY and MERGE commands.
  • Created Maestro scripts for scheduling the ETL jobs in the Production environment.
  • Coordinated with the offshore team on design issues, helped them understand requirements, and helped them resolve access/connectivity issues.
  • Created ETL unit test scripts and validations based on design specifications for unit testing. Prepared test data for testing, error handling, and analysis, and documented test cases.
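
The truncate-and-load and incremental COPY/MERGE scripts described above might be sketched in shell roughly as follows. The database, stage, table, and column names here are hypothetical placeholders, not the actual project objects, and the snowsql invocation in the closing comment is an assumption about the client used.

```shell
#!/bin/sh
# Sketch of the two Snowflake load paths: a full (truncate-and-load) path using
# COPY INTO from an external Azure stage, and an incremental path using MERGE.

STAGE='@etl_db.public.azure_stage'    # external stage pointing at the Azure container
TARGET='etl_db.public.customer'       # final target table
STAGING='etl_db.public.customer_stg'  # staging table for incremental deltas

# Full load: truncate the target, then bulk-copy the files from the Azure stage.
full_load_sql() {
cat <<SQL
TRUNCATE TABLE ${TARGET};
COPY INTO ${TARGET}
FROM ${STAGE}/customer/
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);
SQL
}

# Incremental load: copy the delta files into staging, then MERGE into the target.
incremental_sql() {
cat <<SQL
COPY INTO ${STAGING}
FROM ${STAGE}/customer_delta/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
MERGE INTO ${TARGET} t
USING ${STAGING} s ON t.customer_id = s.customer_id
WHEN MATCHED THEN UPDATE SET t.name = s.name, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
VALUES (s.customer_id, s.name, s.updated_at);
SQL
}

# In practice the generated SQL would be piped to a client, e.g.:
#   full_load_sql | snowsql -c my_connection
full_load_sql
incremental_sql
```

Generating the SQL from a script keeps the object names parameterized, so the same framework can be pointed at different schemas per environment.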

Environment: IBM InfoSphere DataStage and QualityStage 11.7.1 (Designer, Director, Administrator), Microsoft Azure, Snowflake database, Salesforce, SQL Server, SAP HANA, SVN, GIT.

Confidential, Albany, NY

ETL DataStage Consultant

Responsibilities:

  • Worked with the business team to understand the scope of the project for each module, analyzed the requirements, and identified dependencies with other applications. Also worked with stakeholders from different teams to prioritize projects based on client inputs.
  • Developed and implemented tools to maintain, monitor, and troubleshoot business intelligence applications in the healthcare domain.
  • Developed applications to extract data from various source systems on different RDBMSs, such as Mainframe AS/400, flat files, XML, Netezza, Oracle, MySQL, and SQL Server, and from web services such as Salesforce and applications exposed via REST APIs, applying the required transformations for integration using IBM InfoSphere DataStage, UNIX shell scripting, and SQL.
  • Developed batch jobs to load data into IBM Netezza, where the data is staged, transformed, and loaded into final tables that are integrated with other sources.
  • Worked with objects in the IBM Information Server repository, such as jobs, sequences, parameter sets, routines, shared containers, and table definitions.
  • Using the IBM DataStage Designer, developed master sequences using activity stages such as User Variables, Execute Command, Start Loop, End Loop, Terminator, and Exception Handler.
  • Created FTP file-watcher scripts to pull files from a Windows shared drive to the local UNIX server.
  • Optimized the existing jobs for better performance and implemented new business logic in the production environment.
  • Created UNIX scripts to pull and push files between the Windows server and the local UNIX box.
  • Automated manual operations to increase productivity and reduce workload across the team.
  • Coordinated with vendors and the support team on problem resolution, design issues, and upgrades.
  • Analyzed the production environment to detect critical deficiencies, and designed and tested solutions to fix those deficiencies and improve application performance.
  • Led the team in identifying unexpected incidents, aborts, and issues with the applications, working with source-system support to ensure the issues were resolved.
  • Designed solutions for bug fixes, enhancements, and improvements to existing processes.
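
The file-watcher pattern mentioned above might look roughly like the sketch below: poll a landing directory for an expected file, wait until it arrives or a timeout expires, then move it into the processing area for the ETL job to pick up. The directory paths are placeholders; the actual scripts first pulled the files from a Windows shared drive over FTP/SFTP.

```shell
#!/bin/sh
# Minimal file-watcher sketch: wait for a file to land, then stage it.

LANDING_DIR="${LANDING_DIR:-/tmp/etl/landing}"
PROCESS_DIR="${PROCESS_DIR:-/tmp/etl/processing}"

# Wait up to $2 seconds (default 300) for file $1 to appear in LANDING_DIR.
wait_for_file() {
    file="$1"
    timeout="${2:-300}"
    waited=0
    while [ ! -f "${LANDING_DIR}/${file}" ]; do
        if [ "$waited" -ge "$timeout" ]; then
            echo "Timed out waiting for ${file}" >&2
            return 1
        fi
        sleep 5
        waited=$((waited + 5))
    done
    return 0
}

# Move an arrived file into PROCESS_DIR so the downstream job can consume it.
stage_file() {
    mkdir -p "${PROCESS_DIR}"
    mv "${LANDING_DIR}/$1" "${PROCESS_DIR}/$1"
}
```

A scheduler entry would typically call `wait_for_file` and `stage_file` in sequence, aborting the downstream DataStage job if the wait times out.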

Environment: IBM InfoSphere DataStage and QualityStage 11.5/9.1 (Designer, Director, Administrator), Oracle 11g, SQL Developer, Toad Data Point 3.5, Linux, Aginity Workbench for PureData System for Analytics 4.9, Netezza 7.0

Confidential

Responsibilities:

  • Interacted with the end-user community to understand business requirements and identify data sources.
  • Analyzed informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
  • Prepared mapping document for the subsystems like Claims, Provider, Recipients, Managed care organizations, TPL, PA and References.
  • Involved in ETL processes to load data from COBOL files, MS Access, Excel, flat files, Oracle, and other RDBMS sources into target Oracle databases, applying business logic in transformation mappings to insert and update records during loading.
  • Performed data profiling, including primary key, foreign key, and referential integrity analysis, as well as range analysis.
  • Used DataStage stages such as Sequential File, Complex Flat File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Pivot Enterprise, Funnel, and Peek in the ETL coding.
  • Enhanced job reusability by designing, developing, and deploying shared containers and multiple-instance jobs.
  • Used the DataStage Director extensively to monitor job logs and resolve issues.
  • Developed job sequencers with proper job dependencies, job control stages, and triggers.
  • Coordinated a 10-member offshore-onshore team and set up a daily HOTO call with offshore to keep team members on the same page, ensure a smooth transition and handover of work between onshore and offshore, enable open discussion of requirements, design, and development strategies, and track timelines, milestones, and checklists.
  • Tested the converted data for business-rule compliance and verified record counts. Established and maintained the overall data conversion design of the system.
  • Documented ETL unit test scripts and validations based on design specifications for unit and system testing. Prepared test data for testing and error handling, and documented test cases in Quality Center.
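
The referential-integrity profiling described above, applied to flat-file extracts, might be sketched as below: list foreign-key values in a child file that have no matching key in the parent file (orphan records). File names and column positions are hypothetical; the actual profiling ran against database tables via Information Analyzer and SQL.

```shell
#!/bin/sh
# Orphan-key check on comma-delimited extracts using awk.
# Usage: orphan_keys parent.csv parent_key_col child.csv child_fk_col
orphan_keys() {
    parent="$1"; pcol="$2"; child="$3"; ccol="$4"
    awk -F',' -v pcol="$pcol" -v ccol="$ccol" '
        NR == FNR { keys[$pcol] = 1; next }  # first file: collect parent keys
        !($ccol in keys) { print $ccol }     # second file: report orphan FKs
    ' "$parent" "$child"
}
```

For example, `orphan_keys recipients.csv 1 claims.csv 3` would print any claim foreign keys (column 3) with no matching recipient key (column 1).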

Environment: IBM InfoSphere DataStage 8.7 (Designer, Director, Administrator), Oracle 11g, SQL Developer, Sun Solaris 5.10, Toad 9.7.2.
