
Sr. ETL Consultant / Solution Architect Resume


SUMMARY:

  • Dedicated IT professional with 10+ years of experience in Data Warehousing, Data Migration, and Business Intelligence.
  • 10+ years of experience in building Data Warehouses/Business Intelligence (BI) and decision support systems (DSS), with strong domain experience in the Health, Finance, Power, Manufacturing, and Defense sectors, among others.
  • Good understanding of relational databases, entity-relationship models, and normalization concepts.
  • Experience with highly scalable RDBMS platforms such as SQL Server, Oracle 11i/10g/9i/8i/8.0/7.3, Teradata, Ingres 1.2/2.0, and DB2 UDB on AIX.
  • 8 years of experience in UNIX scripting and query languages.
  • Exceptional background in business analysis, requirements gathering, design, development, customization, testing, implementation and support of data warehouse, MDM applications and products.
  • Familiar with enterprise-wide data warehouse architecture; expertise in the Ralph Kimball methodology and warehousing concepts such as star schema, snowflake schema, and the identification of KPIs, dimensions, and facts.
  • Experience with gathering business requirements, use cases and requirements analysis.
  • Experience in quality assurance for data warehouse projects: creating test plans, objectives, strategies, and test cases, and ensuring the data warehouse meets business requirements.
  • Experience in data profiling, analysis, mappings, and match merges, identifying gaps between source and target, and preparing comparative reports.
  • Experience in JAD sessions, Data modeling with Erwin case tool (R7), Data analysis, ETL, design and administration.
  • Skilled in conceptual and logical modeling and physical database design for large-scale data warehouse and OLTP implementations using Erwin.
  • Experience in “Leading Practices” with Confidential Information Management tools, including DataStage Enterprise Edition Parallel Extender (DS-PX) 7.x/8.1/8.5/8.7, ProfileStage, and QualityStage.
  • Experience designing and implementing generic ETL streams that perform common data manipulation tasks on data sourced from a variety of systems in different formats.
  • Good experience with the DataStage Administrator component and administrative operations.
  • Hands on experience with SAP adapters/Datastage SAP R/3 and BW packs in interface projects.
  • Experience in different data sources ranging from flat files, Excel, Oracle, Sybase, SQL Server 2000 & SAP.
  • Experience in processing XML input and output files using Oracle RDBMS.
  • Familiar with different software development life cycles including extreme programming, spiral and agile models.
  • Experience in SQL/PL/SQL development, including schema definition, stored procedure/function development, job stream creation, database fine-tuning, and query optimization.
  • Experience with the vi editor and KornShell on different UNIX flavours; strong UNIX shell scripting for data warehouse and migration projects.
  • Good communication skills; a self-starter able to work with little guidance and a solid understanding of business flow.
  • Excellent team player with strong written, verbal, interpersonal, and analytical skills; capable of working in high-stress environments with resource constraints.
  • Good experience with SAP R/3 and BW.
  • Good MM/SD/CRM/ERP domain knowledge as a techno-functional consultant.
  • Excellent problem solving and analytical skills with ability to learn new technologies.

TECHNICAL SKILLS:

ETL: Confidential Information Server 8.7/8.5/8.1, DataStage Enterprise Edition (DS-PX) 7.5.2/5.1/6.0.

Tools: TOAD, Aqua Data Studio 4.0, Visual Studio 2000, Confidential (ProfileStage, QualityStage, FastTrack, Metadata Workbench, Business Glossary).

Business Intelligence tools: SAP BI 7.0/ BW 3.5/3.1

Data Modeling tools: Erwin R7/4.0/3.5.

Databases: Netezza 7.2.0, Oracle 11i/10g/9i/8i/7.x, DB2 UDB, Teradata 13.10, Teradata Utilities, Ingres, Unify, MS SQL Server 6.5/7.0 & MS Access.

Testing Tools: Auto Tester, Test director, Lotus notes

Operating System: UNIX Sun Solaris 2.6/2.7, HP-UX, Confidential AIX 4.2/4.3, Windows NT, Windows 2000/98/95, MVS XE390.

Programming Languages: PL/SQL, SQL*Loader, Shell Scripts, 4GL, SAP R/3 4.7.

Domain: MM, SD, CRM, FICO

PROFESSIONAL EXPERIENCE:

Confidential

Sr ETL Consultant / Solution Architect

Responsibilities:

  • Handled multiple tasks across a wide variety of functionality, issues, and projects.
  • Worked extensively at the interface of the Business and Engineering teams.
  • Analyzed functional requirement specifications and created high-level technical and functional design documents.
  • Worked as Onshore-Offshore coordinator to follow up on development process and provide status reports to the management.
  • Worked as senior developer on Onsite-Offshore development model projects.
  • Prepared gap analysis, clarification, and issue reports on functional and technical specs.
  • Communicated with subject matter experts to resolve and clarify issues raised during the development cycle.
  • Adhered to project standards and processes and provided valuable suggestions.
  • Created common solutions implemented throughout the projects.
  • Performed data quality checks on source systems.
  • Defined process flow using Data Stage job sequences.
  • Extracted data from multiple legacy SAP R/3 systems using DataStage jobs.
  • Loaded data into an Oracle database used as the staging area.
  • Loaded SAP-MM data into PSES from multiple country sources for Confidential using SAP R/3 pack.
  • High-level design, development, testing and deployment of Interface applications for Confidential Blue Harmony.
  • Responsible for source and target system analysis, data transformation guidelines, data validation, and loading from source systems to Interface applications in the Confidential Blue Harmony project.
  • Understood the SAP BW structure and loaded DMS data into the FIGS BW database for Confidential.
  • Developed ETL jobs and applied harmonization and survivorship rules in ETL transformation and load jobs.
  • Extensively worked on SAP R/3 and BW Pack and Oracle for C&IC and CCA-PLAN tracks development.
  • Prepared SQL queries and UNIX shell scripts to pull the active data into the ETL work zone (see the sketch after this list).
  • Performed quality testing on jobs using trial runs to ensure the programs were error free.
  • Tracked all issues and documented the development process for program reference.
  • Prepared test cases and unit test plans, and was involved in testing peers' jobs and sign-off.
  • Resolved implementation technical issues immediately in the war room.
  • Strictly implemented Confidential Information Server best practice methods.
  • Hands-on experience working with the Agile methodology.
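
A minimal sketch of the extract pattern referenced above, pairing a Korn shell wrapper with SQL*Plus; the table, columns, credentials, and work-zone path are illustrative placeholders rather than the actual project objects:

#!/bin/ksh
# Hypothetical sketch: pull "active" source rows into a delimited file in the ETL work zone.
WORKZONE=/etl/workzone                         # assumed landing directory
OUTFILE=$WORKZONE/active_customers_$(date +%Y%m%d).dat

sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 2000 TRIMSPOOL ON
SPOOL $OUTFILE
SELECT cust_id || '|' || cust_name || '|' || status
FROM   src_customer
WHERE  status = 'ACTIVE';
SPOOL OFF
EXIT
EOF

# Fail the wrapper if SQL*Plus returned a non-zero status or the extract is empty.
[ $? -eq 0 ] && [ -s "$OUTFILE" ] || exit 1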

Environment: Information Server 8.0.5/8.0.1, Teradata 13.10, Teradata Utilities, SAP 7.10, BI 7.0, SAP PACK, BW PACK, MQ PACK, SQL*Plus, Oracle 10g, Aqua Data Studio 4.0, Visio 2000, Information Analyzer, Quality Stage 8.01, UNIX Shell Scripting (Korn/KSH), Windows 2K, Erwin R7, AutoSys, B.O-XI R/3, HP Quality Center.

Confidential, GROTON, CT

Data Stage Developer

Responsibilities:

  • Participated extensively in both the design and engineering teams.
  • Extensively worked on Confidential Information Server 8.0.1 (i.e., DataStage 8.0.1).
  • Designed Confidential Identity conceptual, logical, physical model using Erwin R7.
  • Participated in preparing technical specifications for data stage ETL jobs.
  • Prepared outbound agreement document template for MDM-China Identity.
  • Prepared mapping document templates for source to landing, landing to base objects for Confidential Identity inbound and outbound.
  • Prepared Test cases, use cases for different Confidential Identity ETL tasks.
  • Involved in the requirements and design phases of three pan-Confidential classes, i.e., Identity, Customer, and Product.
  • For the Identity class, performed data analysis and data profiling, prepared mapping documents, and prepared a gap analysis report for inbound and outbound Confidential Identity data.
  • Used Information Analyzer (i.e., ProfileStage) for the POC and performed column analysis, row analysis, primary key analysis, cross-table analysis, relationship analysis, and normalization reports.
  • Used trillium for data profiling of source systems.
  • Extensively used the Information Server 8.0.1 QualityStage component to standardize Identity class employee and contractor addresses and names.
  • Built and deployed new classes and rules for MDM-China quality standardization.
  • Customized quality stage rules for project requirement.
  • Prepared a comparative statement for the identified key fields and discussed it in JAD sessions.
  • Customized DataStage jobs for the Confidential Identity module.
  • Implemented parallel processing methodology and node concepts in Parallel Extender MDM modules; identified and implemented project facts and dimensions.
  • Created DataStage predefined template jobs (i.e., including container jobs and sequencer jobs).
  • Extensively worked with the 3 components of DataStage (Designer, Director, Manager).
  • Improved performance of poorly performing DataStage jobs and database ETL queries.
  • Worked with multiple data sources for Confidential Identity project.
  • Created 3 major Universes with B.O 6i for Identity China reports.
  • Generated a reporting application for Identity China with B.O 6i.
  • Responsible for Application back-ups (i.e., Daily/Weekly/Monthly).
  • Maintained job development documents, change documents, and test documents in the project shared directory.
  • Involved in testing peers' jobs and sign-off.
  • Developed and scheduled ETL jobs and walked the production team through the (Daily/Monthly/Yearly) scheduled jobs and logs.
  • Created shell scripts to feed data from different sources to the ETL jobs and worked extensively with File Transfer Protocol (FTP); see the FTP sketch after this list.
  • Involved in writing test plans based on Functional Requirement and Business Requirement Documents.
  • Performed GUI, Functional, Regression, Integration and System Testing.
  • Performed Unit testing and Integration testing of module.
  • Strictly followed Confidential Information Server (i.e., DataStage) best practice methods.
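
A minimal sketch of the kind of FTP feed script mentioned above, written as a plain Korn shell ftp session; the host, credentials, directories, and file name are illustrative placeholders:

#!/bin/ksh
# Hypothetical sketch: pull a source extract over FTP and drop it where the DataStage jobs expect it.
REMOTE_HOST=source.example.com
REMOTE_DIR=/outbound/extracts
LOCAL_DIR=/etl/incoming
FILE=identity_feed.dat

ftp -n "$REMOTE_HOST" <<EOF
user $FTP_USER $FTP_PWD
binary
cd $REMOTE_DIR
lcd $LOCAL_DIR
get $FILE
bye
EOF

# Only hand the file to the ETL stream if it actually arrived with data in it.
[ -s "$LOCAL_DIR/$FILE" ] || { echo "FTP of $FILE failed" >&2; exit 1; }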

Environment: Confidential - Confidential Information Server 8.0.1, Enterprise Edition (DS-PX), Oracle 10G, SQL*Plus, Toad, Visio 2000, SQL Server 2005, Trillium 5, Information Analyzer, Quality Stage 8.01, Erwin R7, B.O (5/6i), UNIX Shell Scripting (Korn/KSH), XML, Windows 2K.

Confidential, Wayne, PA

Data Stage Developer

Responsibilities:

  • Participated as a senior DataStage developer on the Beacon project for DLL.
  • Participated in the design and implementation of the data warehouse star schema model.
  • Implemented parallel processing methodology and node concepts in all Parallel Extender projects.
  • Identified and implemented project facts and dimensions.
  • Prepared Data Stage ETL standards, data validation standards and source-target mapping documents.
  • Prepared functional specifications for first phase.
  • Created DataStage predefined template jobs and pre-load DataStage jobs for the Beacon project (i.e., including container jobs and sequencer jobs).
  • Created slowly changing dimension template jobs (see the Type 2 sketch after this list).
  • Developed Data Stage jobs using necessary stages and validated them.
  • Extracted data from 3 different sources, transformed it, and loaded it into the warehouse.
  • Generated predefined SQL scripts and shell scripts for the Beacon project.
  • Prepared test scenarios, test plans and test cases.
  • Extensively worked with 3 components of Data Stage.
  • Improved performance of poor performing queries.
  • Worked with multiple data sources for data extraction.
  • Documented all the developed jobs, change documents, and test documents in the project shared directory.
  • Involved in testing peers' jobs and sign-off.
  • Developed and scheduled ETL jobs and walked the production team through the (Daily/Monthly/Yearly) scheduled jobs and logs.
  • Strictly followed Confidential WebSphere DataStage best practice methods.
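
The slowly changing dimension template jobs above can be illustrated with the usual Type 2 pattern (expire the changed row, insert a new current version). The sketch below shows that logic as SQL run through SQL*Plus from a Korn shell wrapper; the table and column names are placeholders, and in the project the logic lived in DataStage template jobs rather than hand-written SQL:

#!/bin/ksh
# Hypothetical Type 2 SCD illustration: close out changed rows, then insert new versions.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<EOF
-- Expire current dimension rows whose tracked attribute changed in staging.
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.effective_end_date = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.cust_name <> d.cust_name);

-- Insert a new current version for changed or brand-new customers.
INSERT INTO dim_customer
       (cust_key, cust_id, cust_name, effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, TRUNC(SYSDATE), NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.cust_id = s.cust_id
                   AND    d.current_flag = 'Y'
                   AND    d.cust_name = s.cust_name);

COMMIT;
EXIT
EOF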

Environment: WebSphere DataStage 7.5, Enterprise Edition (DS-PX), Oracle 10G, SQL*Plus, Toad, UNIX Shell Scripting (Korn/KSH), XML, CRT, Windows 2K, VOILA, B.O (5/6i).

Confidential, Raleigh, NC

Data Stage Developer

Responsibilities:

  • Designed target schema definition and extraction, transformation (ETL) using Data stage.
  • Performed data migration from DB2 6.0 to SAP R/3 (4.7H) using DataStage.
  • Worked on HRMS module to migrate data from different source systems.
  • Involved in CEMA to SAP-HR POC and prepared POC document.
  • Involved in HR extract to ODM Interface POC and prepared POC document.
  • Involved in the high-level and low-level design of the migration project.
  • Participated in environment setup and was involved in DB2 client setup on the DataStage server.
  • Prepared legacy-to-SAP HR mapping docs.
  • Generated functional specifications and designed test plans and test cases.
  • Conducted training program on Data stage to new team members.
  • Applied high performance methods in ETL process.
  • Extensively used sequential file, CFF, sort, link collector and FTP stages.
  • Used SAP packs to extract data from SAP Systems.
  • Used ABAP EXT for R/3 and IDoc EXT for R/3 to extract data from and load data into R/3.
  • Used the load & extract pack for SAP BW.
  • Implemented SAP packs delivered by Ascential for R/3 conversions.
  • Customized ABAP programs while using ABAP Stage.
  • Wrote shell scripts to combine, compress, and FTP compressed files to the target system (see the sketch after this list).
  • Developed extract, cleanse, transform and integration jobs.
  • Worked on transformations using DataStage from the legacy DB2 system to the SAP 4.7 system.
  • Extracted data from CEMA, generated IDocs using the DS IDoc Load stage, and sent them to SAP HRMS.
  • Used ProfileStage to perform column analysis, row analysis, primary key analysis, cross-table analysis, relationship analysis, and normalization.
  • Used DataStage Director and its run-time engine to schedule and run solutions, test and debug components, and monitor the resulting executable versions.
  • Created shared containers to simplify DataStage design and used them as common job components throughout the project.
  • Extensively worked on DataStage admin activities such as creating new projects, creating new roles and responsibilities, re-indexing DataStage jobs, releasing locks, creating or increasing mount points, and maintaining the necessary folders in UNIX for the DataStage server and job priority folders.
  • Worked with Data stage manager for importing metadata.
  • Implemented all enterprise parallel process new features in this project.
  • Implemented parallel process methodology and nodes concepts in all parallel extender projects.
  • Experience with RDBMS like DB2 UDB on AIX.
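
A minimal sketch of the combine/compress/FTP step described above; the directories, target host, and file names are illustrative placeholders:

#!/bin/ksh
# Hypothetical sketch: concatenate the day's extract files, compress the result, and ship it to the target system.
SRC_DIR=/etl/extracts/hr
STAGE_DIR=/etl/outbound
TODAY=$(date +%Y%m%d)
COMBINED=hr_extract_$TODAY.dat

cat "$SRC_DIR"/hr_*.dat > "$STAGE_DIR/$COMBINED"   # combine the individual extracts
compress -f "$STAGE_DIR/$COMBINED"                 # produces $COMBINED.Z

ftp -n target.example.com <<EOF
user $FTP_USER $FTP_PWD
binary
lcd $STAGE_DIR
cd /inbound
put $COMBINED.Z
bye
EOF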

Environment: WebSphere DataStage Enterprise Edition (PX) 7.5.2, Profile Stage 7.0, SAP R/3 4.7, SAP Pack 5.2, DB2, DB2 Connect v8.1, AIX, z/OS MVS, XML, UNIX Shell Scripting (Korn/KSH), Teradata V2R5.0, Windows 2K, Lotus Notes, Confidential Personal Communication 5.7, Raindance v2.03.

Confidential, Lexington, KY

Data Stage Developer

Responsibilities:

  • Fully responsible for STG1 data cleansing and extraction, EDR data extraction, and EDM data.
  • Participated in requirements gathering and designed data model for Lex-connect project.
  • Designed the ETL plan, job designs, and the Oracle/UNIX environment design.
  • Prepared the EDR & EDM design plan and the load-time tracking plan.
  • Involved in designing the data model using Erwin and creating DDL scripts.
  • Extensively used Datastage manager, designer, administrator, and director for creating and implementing jobs.
  • Developed various jobs using ODBC, hashed file, aggregator, sequential file, PIVOT, IPC stages.
  • Created shared containers to use in multiple jobs.
  • Extensively used the Link Partitioner and Link Collector stages to improve performance.
  • Involved in writing test plans based on functional requirement and business requirement documents.
  • Performed GUI, functional, regression, Integration and system testing.
  • Performed unit testing and Integration testing of module.
  • Created shell scripts that in turn call DataStage jobs with all paths to sources and targets and even connection information (see the dsjob sketch after this list).
  • Generated aggregate procedures, views and indexes on tables.
  • Fine-tuned all extract queries using AUTOTRACE and EXPLAIN PLAN.
  • Developed batch jobs that run as scheduled jobs (Daily/Monthly/Yearly).
  • Documented the purpose of each mapping to help personnel understand the process and incorporate changes when necessary.
  • Worked on DataStage admin activities: created new projects and new roles & responsibilities, re-indexed DataStage jobs, released locks, created or increased mount points, and maintained the necessary folders in UNIX for the DataStage server and job priority folders.
  • Implemented all enterprise parallel process new features in this project.
  • Implemented parallel process methodology and nodes concepts in all parallel extender projects.
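
A minimal sketch of the dsjob wrapper described above, assuming the standard dsjob command-line interface shipped with DataStage; the project, job, and parameter names are placeholders:

#!/bin/ksh
# Hypothetical wrapper: pass connection details and file paths as job parameters, run the job, and check its status.
PROJECT=LEXCONNECT
JOB=LoadEDM
SRC_FILE=/etl/incoming/edm_feed.dat

dsjob -run -mode NORMAL -jobstatus \
      -param SrcFile="$SRC_FILE" \
      -param TgtDsn="$ORA_SID" \
      -param TgtUser="$ORA_USER" \
      -param TgtPwd="$ORA_PWD" \
      "$PROJECT" "$JOB"
RC=$?

# With -jobstatus, dsjob exits with the job status: 1 = finished OK, 2 = finished with warnings.
if [ $RC -eq 1 ] || [ $RC -eq 2 ]; then
    echo "$JOB completed (dsjob status $RC)"
else
    echo "$JOB failed (dsjob status $RC)" >&2
    exit 1
fi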

Environment: WebSphere DataStage 7.5, Enterprise Edition (PX), Oracle Data Integrator, B.O (5/6i), XML, RTI, UNIX Shell Scripting (Korn/KSH), SQL*Plus, Windows 2K.

Confidential, AZ

Data Stage Developer

Responsibilities:

  • Used Business Objects to develop a financial reporting system producing financial reports for various branch locations across the United States.
  • Responsible for creating and testing ETL processes composed of multiple DataStage jobs using the DataStage Job Sequencer.
  • Integrated mainframe IMS data files to DB2 Data warehouse with Data stage ETL process.
  • Implemented CFF stage for loading IMS flat files and flattened redefines and groups.
  • Created several general routines (Before-after, transform function) used across project.
  • Developed various jobs using ORA OCI, ODBC, Hashed file, Aggregator, Sequential file stages.
  • Designed and developed DataStage server and parallel jobs.
  • Implemented XML plug-ins for reading DataStage log files and populated system tables with audit data.
  • Imported the Oracle schema into DataStage with Manager for data mappings.
  • Installed DataStage clients/servers and maintained metadata in repositories.
  • Used shared containers for multiple jobs that share the same business logic.
  • Used DataStage Manager to import metadata from the repository, create new job categories, and create new data elements.
  • Tuned Datastage jobs for better performance by creating Datastage hash files for staging data and lookups. Used Datastage director for running Jobs.
  • Used manager to import, export jobs / routines.
  • Created shell scripts that invoke DataStage jobs, passing all the variables a job needs to execute, such as source and target database connection information, user ID, password, and file locations.
  • Developed processes for extract, cleanse, transform, integrate and load the data into Oracle database using Datastage designer.
  • Worked with all Datastage components for data conversion & load it into target.
  • Created reports such as reports by product, by customer, by period, demographic reports, and comparative reports. Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
  • Wrote PL/SQL procedures with the given functional logic to get / put the data in the database.
  • Fine tuned the SQL queries for better performance.
  • Installed and configured DataStage with plug-ins, adapters, and the DataStage pack for SAP.
  • Extracted SAP R/3 data with IDOC and wrote ABAP code to customize IDOC objects.
  • Implemented SAP R/3 BAPI interfaces to load data into SAP R/3 system.
  • Extracted data and loaded to SAP BW system with SAP Adapters.
  • Worked with SAP plug-in stages like IDOC Load, IDOC Extract, BW load, BW extract stages.
  • Involved in extracting data from SAP MM and SD tables.
  • Involved in customizing generated ABAP code as per requirements.
  • Understood the operating system and how its component parts are structured: job, task, and data management.
  • Worked with environment control, history, on-line access, and databases.
  • Extensively worked on DataStage admin activities such as creating new projects, creating new roles & responsibilities, re-indexing DataStage jobs, releasing locks, creating or increasing mount points, and maintaining the necessary folders in UNIX for the DataStage server and job priority folders.
  • Experienced with Teradata Utilities (BTEQ, FastLoad & MultiLoad); see the BTEQ sketch after this list.
  • Experienced in tuning of batch BTEQ queries.
  • Implemented all enterprise edition parallel process new features in this project.
  • Implemented parallel process methodology and nodes concepts in all parallel extender projects.
  • Mapped skill requirements using a skill matrix.
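
A minimal sketch of a batch BTEQ step of the kind referenced above; the TDPID, credentials, and table name are illustrative placeholders:

#!/bin/ksh
# Hypothetical sketch: run a small BTEQ script and fail the wrapper if BTEQ reports an error.
bteq <<EOF
.LOGON tdprod/$TD_USER,$TD_PWD;
.SET ERROROUT STDOUT;

-- Simple audit count after the FastLoad/MultiLoad step.
SELECT COUNT(*) FROM edw.sales_fact;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

[ $? -eq 0 ] || { echo "BTEQ step failed" >&2; exit 1; }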

Environment: DataStage Enterprise Edition (PX), Quality Stage & Meta Stage, XE/390, MVS, Mainframe, COBOL, DB2, SAP BIW 3.0B, SAP MM, SAP SD, ABAP 4.7C, XML, version control, Teradata V2R5.0, Teradata Utilities, PL/SQL, HP-UX and Windows NT, Lotus Notes, Business Objects 6i.
