
ETL Architect / Analyst / Data Warehouse Lead / ETL Lead Developer Resume


NYC

SUMMARY OF QUALIFICATIONS:

  • Fifteen years of IT experience in architecture, analysis, design, and development of software applications, including ten years implementing DataStage as an ETL tool, Business Objects reporting in a dimensional model, and Data Integrator Services for data warehouse environments. Experienced in various capacities and designations, from Programmer and Project Leader to Architect.
  • Full Software Development Life Cycle (SDLC) experience, including definition, analysis, and review of software and business requirement specifications, design, development, and testing to solidify client requirements in conjunction with software developers.
  • Experienced in integration and migration of various data sources such as Oracle, SQL Server, Sybase, DB2, mainframe systems, and complex flat files into the staging area.
  • Involved in E-R modeling and dimensional data modeling, including design of star schemas and snowflake schemas. Used ERwin for physical and logical data modeling.
  • Proficient in data warehousing techniques for data cleansing, slowly changing dimensions, and surrogate key assignment.
  • Extensively worked on PL/SQL programming, including packages, procedures, functions, triggers, and cursors.
  • Performed DataStage administration tasks.
  • Wrote shell scripts extensively for bulk loads, mail notifications, and running SQL commands.
  • Industry experience includes Finance, Banking, Wholesale Banking, Manufacturing, Corporate Credit Risk Management, Telecommunications, and Pharmaceuticals.
  • Strong experience in the CMM Level 5 quality process at organizations such as JPMorgan Chase, Hewlett Packard, and Standard Chartered Bank Singapore.
  • Experience in creating Business Objects reports and Crystal Reports.
  • Experience working with the Business Objects modules Supervisor, Designer, BO Reports, and Broadcast Agent.
  • Expert in data analysis: running SQL queries to validate databases and perform gap analysis.
  • Strong experience interacting with business users, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, and identifying and analyzing risks using appropriate templates and analysis tools.
  • Experience in creating Project Plans, Design Documents, Test Plans, and Test Cases from the requirements document. Documented Test Plans, Test Cases, Test Scripts, and Test Procedures based on the Design Document and User Requirements Document for Unit, Integration, Regression, Functional, Performance, and User Acceptance Testing (UAT).
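The surrogate key assignment mentioned above can be illustrated with a minimal sketch, assuming a simple in-memory dimension load; all names here (`assign_surrogate_keys`, `cust_id`, `sk`) are hypothetical and not tied to any specific ETL tool:

```python
# Minimal sketch of surrogate key assignment for a dimension load.
# Known natural keys reuse their existing surrogate key; new natural
# keys are allocated the next integer. Names are illustrative only.

def assign_surrogate_keys(incoming, existing, natural_key):
    """Return incoming rows with a surrogate key ('sk') attached."""
    key_map = {row[natural_key]: row["sk"] for row in existing}
    next_sk = max(key_map.values(), default=0) + 1
    out = []
    for row in incoming:
        nk = row[natural_key]
        if nk not in key_map:
            key_map[nk] = next_sk
            next_sk += 1
        out.append({**row, "sk": key_map[nk]})
    return out

existing = [{"cust_id": "C1", "sk": 1}]
incoming = [{"cust_id": "C1", "name": "Acme"}, {"cust_id": "C2", "name": "Beta"}]
result = assign_surrogate_keys(incoming, existing, "cust_id")
```

In a real warehouse the key map would come from the target dimension table rather than a Python dict, but the lookup-or-allocate logic is the same.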

TECHNICAL COMPETENCIES:

ETL Tools: Ardent DataStage 3.6; IBM Information Server 8.5, 9.1; IBM InfoSphere DataStage; Ascential DataStage 7.5 (PX); Ascential Suite (ProfileStage, QualityStage, MetaStage); IBM InfoSphere Information Server 11.5; Informatica 7.1.1, 8.2, 9.1; Ab Initio 1.14.34; Talend DI 3.0, 5.5; SAP BO Data Services

Job Management: Control-M; AutoSys 6.1, 11; Reflection for Secure IT (Unicenter)

Reporting Tools: Business Objects 6.5.1/XI 4.0, Brio Query 6.6, Crystal Reports 5/6/8, Tableau 7.0, MicroStrategy 7i, QlikView 11

Data Modeling Tools: ERwin 3.5/4.1

Databases: Oracle 7.x/8i/10g/11g, MS SQL Server 6.5/7.0/2000/2005/2008, MS Access 97/2000, Sybase 11.9.2, DB2 UDB, Teradata V2R4

Database Tools: TOAD 8.6, SQL*Plus 3.3/8.0

GUI Tools: PowerBuilder 4/5/6/7/8

Operating Systems: DOS, Windows 95/NT/2000/XP, HP-UX, IBM AIX

Version Control Tools: Visual SourceSafe, ClearCase

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, Korn Shell Script, C, Core Java, Perl, Python, Basic, WinSQL

Other Tools: Visio, UML, Syncsort

Protocols: FTP, Telnet

Big Data: Hadoop ETL with Hive, Pig, MapReduce; Cloudera and Hortonworks distributions

PROFESSIONAL EXPERIENCE:

Confidential, NYC

ETL Architect / Analyst / Data Warehouse Lead / ETL Lead Developer

Responsibilities:

  • Worked with SMEs to understand the requirements.
  • Performed data analysis to quality-check data before it was processed.
  • Worked on the dimensional model for the project.
  • Prepared the high-level and ETL technical specifications for ETL jobs.
  • Prepared mapping document templates for source-to-target landing.
  • Built PL/SQL routines to handle performance for large projects.
  • Developed jobs to load JSON files into HDFS in the Hadoop environment.
  • Loaded data to SAP using SAP IDocs.
  • Developed and scheduled ETL jobs and maintained the daily, monthly, and yearly production job schedules.
  • Involved in writing test plans based on the Functional Requirements and Business Requirements documents.
  • Performed unit testing and integration testing of the module.
  • Participated in the UAT conducted by the SMEs.
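Loading JSON files into HDFS typically starts by reshaping the records into newline-delimited JSON. A minimal sketch, assuming the file is then staged into Hadoop with a command such as `hdfs dfs -put` (the function name and sample records are hypothetical):

```python
import json

def to_ndjson(records):
    """Serialize a list of dicts to newline-delimited JSON, the layout
    commonly staged before loading a file into HDFS."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

# Illustrative records, not real project data.
records = [{"id": 1, "amt": 10.5}, {"id": 2, "amt": 7.25}]
ndjson = to_ndjson(records)
```

One record per line keeps the file splittable, so downstream Hive or MapReduce jobs can process it in parallel.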

Environment: IBM InfoSphere Information Server 11.5, Oracle 11g, MS SQL 2008, Business Objects, Rainstore DB, AutoSys, QlikView 11

Confidential, NJ

ETL Architect / Analyst / Data Warehouse Lead / ETL Lead Developer

Role & Responsibilities:

  • Worked with SMEs to understand the requirements.
  • Prepared the high-level and ETL technical specifications for ETL jobs.
  • Prepared mapping document templates for source-to-landing.
  • Migrated many Oracle PL/SQL jobs to Talend jobs.
  • Imported data from a large number of files and loaded it into SCD Type 1 tables.
  • Developed and scheduled ETL jobs and maintained the daily, monthly, and yearly production job schedules.
  • Involved in writing test plans based on the Functional Requirements and Business Requirements documents.
  • Improved performance of poorly performing SQL jobs and database ETL queries.
  • Maintained job development documents, change documents, and test documents in the project shared directory.
  • Performed unit testing and integration testing of the module.
  • Used big data file components to process HDFS files in the Hadoop environment.
  • Extracted data from JSON files.
  • Used Hortonworks as the UI.
  • Participated in the UAT conducted by the SMEs.
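An SCD Type 1 load, as in the file loads above, simply overwrites attribute values in place with no history kept. A minimal sketch with an in-memory target keyed by the natural key (all names are illustrative, not Talend components):

```python
def scd1_upsert(target, updates, key):
    """SCD Type 1: overwrite attributes for existing keys, insert rows
    for new keys. 'target' is a dict keyed by the natural key."""
    for row in updates:
        target[row[key]] = dict(row)  # overwrite or insert; no history
    return target

# Illustrative data: P1 exists and gets its price overwritten, P2 is new.
target = {"P1": {"prod_id": "P1", "price": 9.99}}
updates = [{"prod_id": "P1", "price": 12.50}, {"prod_id": "P2", "price": 3.00}]
scd1_upsert(target, updates, "prod_id")
```

Contrast with SCD Type 2, where the old row would be end-dated and a new versioned row inserted instead of overwriting.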

Environment: Talend 5.5, Hadoop, Hive, Python, Oracle 11g, MS SQL 2008, PL/SQL, TWS

Confidential, Grand Rapids, MI

ETL Architect / Analyst / Data Warehouse Lead / ETL Lead Developer

Role & Responsibilities:

  • Worked extensively on understanding the existing mainframe code and on the architecture and design with the engineering team.
  • Prepared the high-level and ETL technical specifications for the DataStage ETL jobs for Product and Identity.
  • Prepared mapping document templates for source-to-landing and landing-to-base objects.
  • Involved in writing test plans based on the Functional Requirements and Business Requirements documents.
  • Developed and scheduled ETL jobs and explained the daily, monthly, and yearly job schedules and logs to the production team.
  • Created shell scripts to feed data from different sources to the ETL jobs.
  • Loaded data to MDM.
  • Extensively used QualityStage components to standardize employee and contractor names and addresses in the Identity class.
  • Extracted data from files supplied by First Data servers to track credit cards and payments.
  • Implemented the parallel-processing methodology and node concepts in all Parallel Extender jobs.
  • Created DataStage predefined template jobs (i.e., including container jobs and sequencer jobs).
  • Improved performance of poorly performing DataStage jobs and database ETL queries.
  • Maintained job development documents, change documents, and test documents in the project shared directory.
  • Gained source code/version control experience using SourceForge and Citrix.
  • Completed testing of other developers' jobs and signed off.
  • Performed GUI, functional, regression, integration, and system testing.
  • Performed unit testing and integration testing of the module.
  • Participated in the UAT conducted by the business users.
  • Strictly followed IBM WebSphere DataStage best-practice methods.
  • Used MPP (Massively Parallel Processing).
  • Used the XML stage to process data.
  • Worked on Business Objects whenever the client requested.
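The kind of name and address standardization QualityStage performs can be sketched in a few lines: normalize case and whitespace, then expand common abbreviations. This is a crude stand-in, not QualityStage's actual rule sets, and the abbreviation table is illustrative only:

```python
import re

# Hypothetical abbreviation table; real QualityStage rule sets are far
# richer and locale-aware.
ABBREV = {"st": "Street", "ave": "Avenue", "rd": "Road"}

def standardize_address(raw):
    """Collapse whitespace, title-case tokens, and expand a few
    street-type abbreviations."""
    tokens = re.sub(r"\s+", " ", raw.strip()).split(" ")
    out = []
    for tok in tokens:
        key = tok.rstrip(".").lower()
        out.append(ABBREV.get(key, tok.capitalize()))
    return " ".join(out)

addr = standardize_address("  123  main st. ")
```

Standardizing both sides before matching is what lets records like "123 main st." and "123 Main Street" be recognized as the same address.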

Environment: IBM Information Server DataStage 8.5, 9.0 & IBM InfoSphere (PX), DB2 10, MS SQL 2008, PL/SQL, UNIX shell scripting, Business Objects 4.1 with BO Data Integration Services

Confidential, Des Moines, IA

ETL Architect / Analyst / Data Warehouse Lead / ETL Lead Developer

Role & Responsibilities:

  • Involved extensively in both the design team and the engineering team.
  • Prepared the high-level and ETL technical specifications for Talend ETL jobs.
  • Prepared mapping document templates for source-to-landing and landing-to-base objects.
  • Involved in writing test plans based on the Functional Requirements and Business Requirements documents.
  • Developed and scheduled ETL jobs and explained the daily, monthly, and yearly job schedules and logs to the production team.
  • Improved performance of poorly performing jobs and database ETL queries.
  • Maintained job development documents, change documents, and test documents in the project shared directory.
  • Performed unit testing and integration testing of the module.

Environment: Talend 3.0, Oracle 11g, PL/SQL, SQL*Plus, TOAD

Confidential, Jersey City, NJ

System Analyst

Role & Responsibilities:

  • Involved extensively in both the design team and the engineering team.
  • Worked closely with business users to gather new requirements.
  • Prepared the Business Requirements Document.
  • Performed data migration/integration using DataStage 7.5.
  • Prepared the high-level and ETL technical specifications for the DataStage ETL jobs for Product and Identity.
  • Prepared mapping document templates for source-to-landing and landing-to-base objects.
  • Involved in writing test plans based on the Functional Requirements and Business Requirements documents.
  • Developed and scheduled ETL jobs and explained the monthly job schedules and logs to the production team.
  • Worked extensively with UNIX shell scripting (Korn/ksh) and file transfers via FTP and SFTP.
  • Created shell scripts to feed data from different sources to the ETL jobs.
  • Prepared comparative statements for the identified key fields and discussed them in JAD sessions.
  • Implemented the parallel-processing methodology and node concepts in all Parallel Extender jobs.
  • Improved performance of poorly performing DataStage jobs and database ETL queries.
  • Maintained job development documents, change documents, and test documents in the project shared directory.
  • Completed testing of other developers' jobs and signed off.
  • Performed GUI, functional, regression, integration, and system testing.
  • Performed unit testing and integration testing of the module.
  • Participated in the UAT conducted by the business users.
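The feeder scripts described above move arriving source files into the directory the ETL jobs read from. A minimal Python sketch of that pattern (the directory names and `.dat` extension are illustrative assumptions; the originals were Korn shell):

```python
import pathlib
import shutil
import tempfile

def feed_files(landing_dir, input_dir, pattern="*.dat"):
    """Move files matching 'pattern' from a landing directory into the
    ETL input directory; returns the file names moved, in sorted order."""
    landing, target = pathlib.Path(landing_dir), pathlib.Path(input_dir)
    target.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(landing.glob(pattern)):
        shutil.move(str(f), str(target / f.name))
        moved.append(f.name)
    return moved

# Demo with temporary directories standing in for the real landing area.
base = pathlib.Path(tempfile.mkdtemp())
(base / "landing").mkdir()
(base / "landing" / "trades.dat").write_text("a|b|c\n")
moved = feed_files(base / "landing", base / "etl_in")
```

Moving (rather than copying) the file doubles as a simple "already processed" marker, which is why feeder scripts usually work this way.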

Environment: DataStage and QualityStage 7.5, Oracle 10g, Microsoft SQL Server 2007, TOAD, UNIX shell scripting (Korn/ksh), AutoSys 6.1, Business Objects XI

Confidential, Indianapolis, IN

Analyst / Data Warehouse Lead / ETL Lead Developer

Role & Responsibilities:

  • Involved in designing the dimensional model using ERwin 3.5.2.
  • Worked extensively to create and execute change requests for the SFA application in close association with the validation group, following the Sales and Marketing SDLC management guidelines.
  • Worked closely with business users to gather requirements for the existing functionality.
  • Used data from third-party vendors to provide online analysis reports to the sales reps.
  • Involved in data extraction from the source database (i.e., Oracle 9i) and transformed it into the staging area using Informatica 9.1.
  • Created Informatica mappings with PL/SQL procedures/functions to implement business rules (e.g., case of pathology, small-molecule inhibitors) for loading data.
  • Used various stages to load from source to staging and then to the target data warehouse.
  • Migrated the repository from development to UAT.
  • Transformed cleansed and conformed data into the Oracle 9i database.
  • Developed jobs between source and target and loaded the data into slowly changing dimension tables.
  • Scheduled tasks daily, weekly, and monthly for loads so that the data is available for generation of BI reports.
  • Ensured that the data is compliant with FDA 21 CFR Part 1002 (Records and Reports), subpart 4 (Confidentiality of Information) and subpart 42 (Confidentiality of records furnished by dealers and distributors).
  • Responsible for tuning jobs to improve performance.
  • Made reports available to users based on their permissions and profiles.
  • Created and ran sessions and batches using Workflow Manager to load the data into the data warehouse components.
  • Wrote stored procedures in PL/SQL for key business requirements.
  • Worked with validation analysts to ensure that reports requiring statutory compliance with FDA 21 CFR Parts 203 and 205 (Sales and Marketing) met those requirements.
  • Unit tested and ensured that the mappings perform the transformations as proposed.

Environment: Informatica 9.1, Sun Solaris, Shell scripts, Oracle 9i, PL/SQL, Brio Query

Confidential, Pontiac, MI

Analyst / ETL Lead Developer / Onsite, Offshore model

Role & Responsibilities:

  • Followed all the CMM Quality process and GM Process (STP 21).
  • Involved in gathering business requirements from the client.
  • Data model review was done by the team to check the feasibility.
  • Erwin was used to create the Logical and physical data model.
  • High level & low level designing of ETL Jobs
  • ETL design & ETL development
  • Developing complex ETL jobs involving multiple data sources including several RDBMS tables
  • Develop custom routines & transforms
  • Trouble shoot Jobs
  • Expertise in doing usage analysis using Ascential DataStage across entire target DataStage projects.
  • Prepare the high level design (ETL flow Plan) & low level design (Mapping Docs).
  • Involved in understanding the scope of application
  • Prepared the ETL flow of data from source files to staging and then to the data mart.
  • Used stage editors for database, file (complex flat file), and Transformer stages.
  • Used the XML Input stage to extract data and the XML Transformer stage to load it.
  • Used Version Control to backup the source code.
  • Worked on improving the performance of the designed mappings by using various performance tuning strategies.
  • Testing on two levels: unit testing and integration testing with all the other developers.
  • Used Brio Query and Materialized Views.
  • Supported the UAT team during testing.
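Extraction via an XML Input stage boils down to parsing a document and flattening repeating elements into rows. A minimal stand-in using Python's standard library (the `<orders>` schema is invented for illustration, not from the project):

```python
import xml.etree.ElementTree as ET

# Invented sample document; real feeds would come from source files.
SAMPLE = """<orders>
  <order id="1"><amount>250.00</amount></order>
  <order id="2"><amount>99.95</amount></order>
</orders>"""

def extract_orders(xml_text):
    """Flatten each <order> element into a dict, one row per element."""
    root = ET.fromstring(xml_text)
    return [{"id": o.get("id"), "amount": float(o.findtext("amount"))}
            for o in root.findall("order")]

orders = extract_orders(SAMPLE)
```

The resulting list of flat rows is the shape a downstream Transformer or database stage expects.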

Environment: DataStage Enterprise Edition 7.5 (PX), HP UNIX Shell scripts, Oracle 10g, Teradata V2R4.

Confidential

Analyst / ETL Lead / Onsite, Offshore model

Role & Responsibilities:

  • Followed all the CMM Quality process.
  • Performed data analysis to study the database data required from the different source systems.
  • Reviewed the data model with the client, since the model was supplied by the client.
  • Erwin was used to create the Logical and physical data model
  • Involved in understanding the scope of the application, both as already implemented and for new development.
  • Involved in the Architecture review.
  • Involved in the high level design (ETL design Plan) & low level design (Mapping Docs).
  • Used the DataStage Designer to create mappings to load various dimensions, including the Products, Product Categories, Coverage Type, Coverage Level, and Risk Level dimensions.
  • Created jobs using hashed file, sort, transformer, and lookup stages, and ran and monitored them using the DataStage Director.
  • Used the XML Input stage to extract data and the XML Transformer stage to load the data to the staging area.
  • Used Version Control to backup the source code.
  • QualityStage was used to standardize/integrate the data from different sources.
  • Designed new Business Objects reports.
  • Reviewed report designs and improved queries for performance.
  • Used the Broadcast agent to broadcast the report.
  • Used Control M for scheduling the jobs.
  • Unit testing of the Jobs developed.
  • Used QualityStage to standardize, match, and correct the data.
  • Coordinated with all onsite technical professionals and the Project Manager.
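The matching step of standardize/match/correct can be sketched with a simple similarity score. This uses `difflib.SequenceMatcher` as a crude stand-in for QualityStage's match logic; the threshold value is an illustrative assumption:

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Similarity ratio between two normalized strings, 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_match(a, b, threshold=0.85):
    """Flag two values as the same entity when their score clears an
    (illustrative) threshold."""
    return match_score(a, b) >= threshold

same = is_match("Standard Chartered Bank", "standard chartered bank ")
diff = is_match("Standard Chartered Bank", "JPMorgan Chase")
```

Production match engines weight multiple fields (name, address, date of birth) rather than scoring a single string, but the score-and-threshold pattern is the same.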

Environment: DataStage and QualityStage 7 Server version, HP UNIX, Oracle 9i, Business Objects 6.5, Control M

Confidential

Sr. ETL Developer / Onsite, Offshore model

Role & Responsibilities:

  • Followed all the CMM Quality process.
  • Involved in the high level design (ETL Plan) & low level design (Mapping Docs).
  • Involved in understanding the scope of application for development.
  • Designed and developed several workflows/sessions using Workflow Manager and Designer and monitored the workflows using Workflow Monitor.
  • Performed data manipulations using various Informatica transformations such as Joiner, Expression, Sorter, Aggregator, and Filter.
  • Enhanced Business Objects universes and designed fixed reports.
  • Involved in understanding the scope of application wif the Customer information.
  • Used the DataStage Designer to develop jobs for extracting, transforming, and loading data into data warehouse
  • Used various stages like aggregator, sort, transformer, link partitioner, link collector, sequential file, hashed file, Lookup Stage to design the jobs
  • Performed Import and Export of DataStage components and table definitions using DataStage Manager
  • Created sequencers to sequentially execute the designed jobs. Used job activity, routine activity, notification activity and sequencer stages to implement these sequencers
  • Used UDB utilities.
  • Created Views and triggers and used DB2 Catalog tables.
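The sequencer behavior described above (job activity plus notification activity) amounts to running jobs in order, stopping on the first failure, and raising a notification. A toy sketch with callables standing in for DataStage jobs; all names are illustrative:

```python
# Toy sequencer: run job callables in order, stop on the first failure,
# and record a notification, loosely mirroring a DataStage sequencer's
# job-activity and notification-activity stages.

def run_sequence(jobs):
    """jobs is a list of (name, callable) pairs; each callable returns
    True on success. Returns the run log and any notifications."""
    log, notices = [], []
    for name, job in jobs:
        ok = job()
        log.append((name, "OK" if ok else "FAILED"))
        if not ok:
            notices.append(f"notify: {name} failed")
            break  # abort the sequence on failure
    return log, notices

jobs = [("extract", lambda: True),
        ("transform", lambda: False),  # simulated failure
        ("load", lambda: True)]
log, notices = run_sequence(jobs)
```

Note that "load" never runs once "transform" fails, which is exactly the behavior a production scheduler relies on to avoid loading bad data.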

Environment: Windows 2000 Advanced Server, DataStage 7, DB2 UDB.
