ETL - Solution Consultant Resume

SUMMARY:

  • An ambitious self-starter with more than 18 years of experience in system analysis, design, development, and documentation across data warehousing, ETL methodology, client-server technologies, Unix, and J2EE, including 3-tier and n-tier architectures.
  • A multi-skilled Data Architect / MDM Architect / Solutions Architect / ETL Architect / Tech Lead / Senior Developer with strong all-round ability to handle multiple projects and meet deadlines while comprehending complex, interdependent business processes.

CORE COMPETENCIES:

IT Strategy & Management: IT Strategy and Execution, Outsourcing and Offshore, Standards and Guidelines, Capacity Planning, Training & Program Management, Production Support / Enhancement, Process Improvement, Data Governance, Compliance

Languages & Frameworks: J2EE, Java, JSP, Servlet, EJB, PHP, Struts, JavaScript, XML/XSL/DTD, RMI, JDBC, JNDI, Web Services (SOAP, REST), JMS, MQ

Application Servers: WebLogic, Tomcat, Apache

Architecture: Application Architecture, SOA Implementation, Refactoring, Cloud Computing (Amazon EC2)

ETL / Data Management: Informatica 9.0/8.5.1/8.1.1, Power Exchange, Power Connect, IDQ, ODI, SSIS, Performance Tuning, Data Profiling, Data Cleansing

MDM: Informatica MDM, Kalido

Big Data: Hadoop, MapReduce, Pig, Hive, HBase, MongoDB, CouchDB, HDFS

Reporting: Business Objects, Cognos, SSRS, SSAS, Crystal Reports

Databases: Oracle, SQL Server, Teradata, Netezza, Greenplum, DB2, Sybase

ERP: PeopleSoft (HR, Payroll), SAP

Modeling & Process: UML, RUP, Rational Suite

Methodology: Agile, Waterfall, Scrum

Enterprise Architecture Frameworks: TOGAF, FEAF, MODAF, DODAF, ITIL

Scripting: Perl, AWK, Sed

Operating Systems: Linux, AIX, Sun Solaris, Windows XP

PROFESSIONAL EXPERIENCE:

Confidential

ETL - Solution Consultant

Responsibilities:

  • Performed the role of Solution Integration Consultant for decommissioning existing legacy systems (Demantra) and bringing a new ERP ( Confidential ) in place: extracted existing ETL rules from Informatica PowerCenter for the Sales Order, Supply Finance, Open Orders, Invoice Daily, Invoiced Monthly, Spend, Spend Latest Estimate, and Sales Forecast subject areas, and mapped those rules back to JD Edwards.
  • Mapped tables and fields between the legacy system and the new ERP for integration.
  • Delivered technical solutions in a variety of situations, wearing many hats: Data Modeler, Data Analyst, Business Analyst, and ETL Architect.
  • Utilized Teradata and Oracle as SQL databases; used Teradata utilities for their systems.
  • Recommended generic common framework components that work as plug-ins: data standardization, error handling, archiving, purging, email notification, ETL batch-id logic, an audit framework, a common PowerShell script for taking different intake files, and a common PowerShell script that checks the status of existing workflows (Running, Failed, Suspended, etc.) and invokes workflows based on that status. A minimal batch-id / audit sketch follows this list.
  • Analyzed numerous usage-tracking reports, data-lineage reports, and legacy security reports with the OBIEE reporting tool.
  • Worked with flat-file sources from different legacy systems and loaded them into the Teradata database.
  • Built data models for different subject areas, wherever applicable, for the ETL layers.
  • Used common STG process-control tables to pass parameters from one procedure to the next.
  • Recommended tuning for existing Informatica processes and stored procedures.
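
A minimal sketch of the batch-id / audit framework component recommended above, assuming an Oracle implementation; every object name (ETL_BATCH_AUDIT, ETL_BATCH_SEQ, OPEN_BATCH, CLOSE_BATCH) is illustrative, not the client's actual code:

    -- Audit table and sequence; workflows call OPEN_BATCH before a load
    -- and CLOSE_BATCH afterwards (all names hypothetical).
    CREATE SEQUENCE etl_batch_seq;

    CREATE TABLE etl_batch_audit (
        batch_id    NUMBER        PRIMARY KEY,
        subject     VARCHAR2(64)  NOT NULL,   -- e.g. SALES_ORDER, INVOICE_DAILY
        status      VARCHAR2(12)  NOT NULL,   -- RUNNING / SUCCEEDED / FAILED
        start_ts    TIMESTAMP     DEFAULT SYSTIMESTAMP,
        end_ts      TIMESTAMP,
        rows_read   NUMBER,
        rows_loaded NUMBER
    );

    CREATE OR REPLACE FUNCTION open_batch(p_subject IN VARCHAR2) RETURN NUMBER IS
        v_id NUMBER;
    BEGIN
        SELECT etl_batch_seq.NEXTVAL INTO v_id FROM dual;
        INSERT INTO etl_batch_audit (batch_id, subject, status)
        VALUES (v_id, p_subject, 'RUNNING');
        COMMIT;
        RETURN v_id;   -- the workflow carries this id through the whole run
    END;
    /

    CREATE OR REPLACE PROCEDURE close_batch(p_batch_id IN NUMBER,
                                            p_status   IN VARCHAR2,
                                            p_read     IN NUMBER,
                                            p_loaded   IN NUMBER) IS
    BEGIN
        UPDATE etl_batch_audit
           SET status = p_status, end_ts = SYSTIMESTAMP,
               rows_read = p_read, rows_loaded = p_loaded
         WHERE batch_id = p_batch_id;
        COMMIT;
    END;
    /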

Environment: Informatica 9.6.1, Erwin, Toad, PL/SQL, Oracle 11g, Teradata, OBIEE

Confidential

ETL - Solution Architect / ETL Architect/ Lead Developer

Responsibilities:

  • Performed the role of Solution Architect / ETL Architect / Senior Developer for converting the existing Claim, Enrollment, and Billing systems from SQL Server / PL/SQL procedures to Informatica. The Enrollment system consists of AppIn (OEC), Member Creation, Member Sync, Accretion, TRR (Transaction Reply Report), MMR (Monthly Membership Report), LIS/LEP (Low Income Subsidy / Late Enrollment Penalty), MPWR (Member Premium Withholding Report), and other modules; Billing has Delinquency and other modules.
  • Functioned as the customer-facing side of Microsoft to enterprises, delivering technical solutions in a variety of situations and wearing many hats: web and Azure developer, project coordinator, Solution Architect, ETL Architect, and Lead Developer.
  • Utilized Azure SQL Database, Web API, Data Factory, and Azure Active Directory.
  • Developed generic common framework components that work as plug-ins: data standardization, error handling, archiving, purging, email notification, ETL batch-id logic, an audit framework, a common PowerShell script for taking different intake files, and a common PowerShell script that checks the status of existing workflows (Running, Failed, Suspended, etc.) and invokes workflows based on that status. A hedged error-logging sketch follows this list.
  • Reverse-engineered ETL code for data-lineage and data-flow documentation.
  • Generated RACI charts for Data Governance.
  • Used Teradata utilities for one of their core systems.
  • Loaded data from various database sources, flat files, and XML files into the Netezza database.
  • Utilized Informatica Big Data Edition (BDE): extracted data from Hadoop, modified it according to business requirements, and loaded it back into Hadoop.
  • Built data models for different subject areas, wherever applicable, for the ETL layers.
  • Designed the real-time analytics and ingestion platform using Storm and Kafka; hands-on experience with multiple NoSQL databases.
  • Used common STG process-control tables to pass parameters from one procedure to the next.
  • The legacy stack used Web Services, C++, XML, SQL Server procedures, and PL/SQL procedures to sync members from the Enrollment system (SQL Server) to the Claim system (Oracle), which caused performance overhead; the new ETL architecture replaced all of these technologies with Informatica alone.
  • Performed extensive tuning of existing Informatica processes.
  • Involved in Data Governance by creating RACI charts.
  • Used IDQ for data quality: strong in data profiling, creating scorecards, creating reference tables, and documenting data-quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency. Good knowledge of IDQ transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer. Used Address Doctor to perform address validation for US and international data.
  • Involved in moving code from Dev to QA to SIT to UAT to PRD; prepared the run book.
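
A hedged sketch of the error-handling framework component, again with hypothetical object names (ETL_ERROR_LOG, LOG_ETL_ERROR): soft errors are logged and the load continues, while hard errors are logged and then raised so the batch fails:

    CREATE SEQUENCE etl_error_seq;

    CREATE TABLE etl_error_log (
        error_id   NUMBER       PRIMARY KEY,
        batch_id   NUMBER,
        severity   VARCHAR2(4),             -- SOFT: skip row; HARD: abort batch
        source_row VARCHAR2(4000),
        error_msg  VARCHAR2(4000),
        logged_at  TIMESTAMP DEFAULT SYSTIMESTAMP
    );

    CREATE OR REPLACE PROCEDURE log_etl_error(p_batch_id   IN NUMBER,
                                              p_severity   IN VARCHAR2,
                                              p_source_row IN VARCHAR2,
                                              p_error_msg  IN VARCHAR2) IS
        PRAGMA AUTONOMOUS_TRANSACTION;  -- keep the log row even if the load rolls back
    BEGIN
        INSERT INTO etl_error_log
        VALUES (etl_error_seq.NEXTVAL, p_batch_id, p_severity,
                p_source_row, p_error_msg, SYSTIMESTAMP);
        COMMIT;
        IF p_severity = 'HARD' THEN
            RAISE_APPLICATION_ERROR(-20001, 'Hard error: aborting batch ' || p_batch_id);
        END IF;
    END;
    /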

Environment: Informatica 10.1, Informatica BDE, Windows, Erwin, Toad, PL/SQL, Oracle 11g, DB2, SQL Server, Teradata, Data Governance, Azure SQL DB, Data Factory, Data Quality 10.1, NoSQL, HDFS, Kafka

Confidential, New York

ETL - Data Architect / MDM Architect / Lead Developer

Responsibilities:

  • Performed the role of MDM Architect / ETL Architect / Senior Developer building a customized Master Data Management (MDM) repository for Policy, Product, and Party. Used Informatica, PL/SQL, SQL, Unix, and AWK scripting to build it.
  • Built an in-house MDM tool instead of using Informatica MDM.
  • Built a generic data model for Party; a party can be, for example, a customer, company, organization, trust, employee, employer, or seller.
  • Moved data from different relational and non-relational source systems to the staging area; staging is a truncate-and-load region where no transformations are performed.
  • From staging, data moved to the Consolidated Source region, a persistence region where data was retained longer; data then moved to Consolidated Standard, where standardization was applied.
  • Developed generic ETL for performing standardization, using metadata tables to register standardization IDs and UDFs keyed on those IDs. Performed standardization for SSN, address (normalizing Street, Avenue, Road, and Suite to consistent abbreviations), country, and email.
  • After standardization, data moved through temporary tables where matching algorithms assigned a single global unique ID to all records belonging to the same customer. Matches were based on combinations of first name, middle name, last name, SSN, birth date, address, ZIP code, and gender code. Ten algorithms were used: the first five were weighted at 100%, the next two at 90%, the next two at 80%, and so on. For every 100% match, the same global ID was applied to all records of that customer; once the 100% algorithms had run, each remaining record received its own unique global ID. Weights below 100% were compared against the 100% results (100-to-90%, 100-to-80%, etc.), but no global ID was generated for them. A hedged matching sketch follows this list.
  • Used MERGE statements for updating millions of records.
  • Used BULK COLLECT in PL/SQL stored procedures for moving millions of records (a combined MERGE / BULK COLLECT sketch follows this list).
  • Used Unix AWK and shell scripting to start workflows; also used Sed scripting.
  • Used Autosys for scheduling.
  • Performed incremental loads for Master Data Management.
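
A hedged illustration of one 100%-weight match rule from the bullets above (exact match on standardized first name, last name, SSN, and birth date); the table, column, and sequence names are assumptions, not the actual MDM schema:

    -- Give every record in a matching cluster the smallest party key as its global id.
    UPDATE party_match_stg t
       SET t.global_id = (SELECT MIN(s.party_key)
                            FROM party_match_stg s
                           WHERE s.first_name_std = t.first_name_std
                             AND s.last_name_std  = t.last_name_std
                             AND s.ssn_std        = t.ssn_std
                             AND s.birth_date     = t.birth_date)
     WHERE t.global_id IS NULL
       AND t.ssn_std IS NOT NULL;

    -- After all 100% rules have run, any record still unmatched gets a fresh id.
    UPDATE party_match_stg
       SET global_id = global_id_seq.NEXTVAL
     WHERE global_id IS NULL;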
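
And a combined sketch of the two high-volume patterns above, a set-based MERGE and a memory-bounded BULK COLLECT / FORALL loop, again with illustrative table names:

    -- (a) MERGE: upsert millions of rows in a single set-based statement.
    MERGE INTO party_master m
    USING party_match_stg s
       ON (m.party_key = s.party_key)
    WHEN MATCHED THEN
        UPDATE SET m.global_id = s.global_id
    WHEN NOT MATCHED THEN
        INSERT (party_key, global_id) VALUES (s.party_key, s.global_id);

    -- (b) BULK COLLECT with LIMIT + FORALL: move rows in bounded chunks.
    DECLARE
        CURSOR c_src IS SELECT party_key, global_id FROM party_match_stg;
        TYPE t_keys IS TABLE OF party_match_stg.party_key%TYPE;
        TYPE t_gids IS TABLE OF party_match_stg.global_id%TYPE;
        v_keys t_keys;
        v_gids t_gids;
    BEGIN
        OPEN c_src;
        LOOP
            FETCH c_src BULK COLLECT INTO v_keys, v_gids LIMIT 10000;
            EXIT WHEN v_keys.COUNT = 0;
            FORALL i IN 1 .. v_keys.COUNT
                INSERT INTO party_master_hist (party_key, global_id)
                VALUES (v_keys(i), v_gids(i));
            COMMIT;   -- commit per chunk to keep undo small
        END LOOP;
        CLOSE c_src;
    END;
    /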

Environment: Informatica 9.6, AIX, Perl, Erwin, Toad, PL/SQL, Oracle 11g, DB2, SQL Server, Unix, AWK, Sed, Data Governance

Confidential, Roseland, NJ

Data Architect / ETL Architect / Lead Developer

Responsibilities:

  • Performed the role of ETL Architect / ETL Lead, leading a team of eight for Informatica, IDQ, Data Exchange, and Data Transformation work. Performed data warehouse and ETL architecture in both Kimball and Inmon styles.
  • Involved in database and table design and implementation for the data warehouse and related ETL processes for Claim Payment, Eligibility, ASO (Administrative Services Only), and Absence files for different customers (Dell, Sedgwick, Blue Cross Blue Shield, etc.). Involved in integrating new systems with old systems (DCMS, COMPASS, GIDW, etc.).
  • Responsible for moving data through the Enterprise Data Hub framework (Source to EDH to RDS to outbound file).
  • Analyzed and modified logical, physical, and dimensional models to include new functionality with the help of Erwin.
  • Configured the same workflow for concurrency with multiple parameter files and reused the same mapping across projects. Designed and performed file validations with Unix, data validation, data standardization, referential-integrity validation (a hedged sketch follows this list), SOR (System of Rules), delta detection, error and exception handling (soft errors vs. hard errors), restartability, and recoverability.
  • Gathered requirements and analyzed, designed, coded, and tested highly efficient, highly scalable integration solutions using Informatica, Oracle 10g, SQL, DB2 systems, flat files, and Unix shell scripting.
  • Used Perl scripting and in-built Perl modules.
  • Responsible for functional design, technical design, coding, testing, debugging, and documentation for Informatica processes.
  • Documented source-to-target matrices, mapping specifications, mapping inventory, unit test plans, and database sizing.
  • Used IDQ in-built mapplets for US and Canada SSN validation, and for ZIP code, address, and telephone number validation and standardization.
  • Used Match, Labeler, and Merge transformations to find duplicate data and merge different data sets.
  • Created custom profiles, data objects, and expression rules for profiling.
  • Used the Informatica data profiling tool for additional profiling work. Identified data-quality rules and implemented them in Informatica mappings. Implemented a reusable error-handling mechanism to capture errors and warnings.
  • Used the Informatica Analyst tool to gather requirements and turn them into mappings; created data objects, profiles, and scorecards.
  • Used a customized MDM solution built with Informatica.
  • Defined accountability procedures governing data access, processing, storage, retention, reporting, and auditing, measuring contract compliance.
  • Developed a data stewardship program establishing metadata registry responsibilities.
  • Scheduled Informatica workflows using Autosys and the Data Exchange scheduling tool, and troubleshot the workflows.
  • Heavily involved in tuning Informatica processes and Oracle queries.
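
A hedged sketch of the referential-integrity validation mentioned above, treating orphan rows as soft errors routed to an error table rather than failing the load; all object names (STG_CLAIM_PAYMENT, DIM_MEMBER, ETL_SOFT_ERRORS, EDH_CLAIM_PAYMENT) are illustrative:

    -- Flag claim rows whose member is missing from the reference dimension.
    INSERT INTO etl_soft_errors (claim_id, error_msg)
    SELECT c.claim_id,
           'Orphan member_id ' || c.member_id
      FROM stg_claim_payment c
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_member m
                        WHERE m.member_id = c.member_id);

    -- Pass only the clean rows downstream.
    INSERT INTO edh_claim_payment
    SELECT c.*
      FROM stg_claim_payment c
     WHERE EXISTS (SELECT 1
                     FROM dim_member m
                    WHERE m.member_id = c.member_id);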

Environment: Informatica 9.5, Oracle 11g, DB2, AIX, Data Exchange, IDQ, Data Transformation, MDM, Metadata Manager, Informatica Analyst, Informatica Developer, Erwin, Toad, PL/SQL, AWK, Data Governance, Perl

Confidential, NJ

Data Architect / ETL Architect / ETL Lead

Responsibilities:

  • Performed the role of ETL Architect / Solution Architect for a team of 15.
  • Performed data conversion using Informatica, Power Exchange (real-time CDC), Oracle, and AIX for forward flow / reverse flow.
  • Laid out the ETL architecture for data conversion, data validation, and recovery.
  • Improved the historical bulk-load window from 7 days to 1 day for subject areas such as Registration, Vehicle, Title, Customer, Odometer, and Lien.
  • Responsible for getting real-time CDC data with Informatica Power Exchange for DATACOM sources on the z/OS operating system.
  • Responsible for maintaining the same set of data maps and registrations across environments (DEV / ST / PERFORMANCE TESTING / PROD).
  • Responsible for the monthly production load.
  • Responsible for separating environments from one Unix box (LPAR) onto different LPARs.
  • Responsible for CDC volume testing.
  • Involved in designing the error-handling mechanism for errors generated through real-time CDC and through the Informatica PMERR tables (a hedged PMERR query sketch follows this list).
  • Responsible for designing batch interfaces for Oracle GL (DataStage batch jobs) for external vendors such as Experian, the Department of Revenue, and Polk.
  • Used Informatica MDM to load customer data from various agencies. Performed data standardization and cleansing using cleanse functions in Informatica MDM. Configured landing, staging, and loading processes, trust and validation rules, and the match-and-merge process. Created and executed batch jobs and batch groups. Created hierarchies, relationship types, packages, and profiles.
  • Configured Informatica Data Director (IDD).
  • Used Informatica in-built mapplets to validate US addresses.
  • Used IDQ for data cleansing and profiling.
  • Coded complex quality rules for cleansing, parsing, and standardizing data.
  • Performed complex validations with IDQ.
  • Used the MDM SIF API for external clients to interact with the MDM Hub.
  • Used the Hierarchies tool in MDM to configure entity and relationship types; defined validation rules and match-and-merge rules in MDM.
  • Involved in database backup and recovery, Flashback configuration, and Data Guard configuration to meet the agreed MTTR (mean time to recover) per the SLA.
  • Analyzed and modified logical and physical models to include new functionality with the help of Erwin. Gathered requirements and analyzed, designed, coded, and tested highly efficient, highly scalable integration solutions using Informatica, Oracle 11g, SQL, DATACOM systems, flat files, Unix shell scripting, and Perl scripting.
  • Worked on upgrading Informatica to 9.5 and configuring Informatica Power Exchange for Hadoop with HDFS for flat-file sources and targets.
  • Responsible for functional design, technical design, coding, testing, debugging, and documentation for Informatica processes.
  • Documented source-to-target matrices, mapping specifications, mapping inventory, unit test plans, and database sizing.
  • Scheduled Informatica workflows using the Informatica scheduler and troubleshot the workflows.
  • Heavily involved in tuning Informatica processes and Oracle queries.
  • Involved in automating unit testing.
  • Involved in Data Governance meetings.
  • Involved in all high-level meetings with client and HP stakeholders.
  • Responsible for scoping new work with agile methodology.
  • Responsible for recruiting new resources and training them on the existing architecture and technologies.
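
When row error logging is enabled, PowerCenter writes rejected rows to relational PMERR_* tables (PMERR_MSG, PMERR_SESS, PMERR_TRANS, PMERR_DATA). A hedged monitoring query in the spirit of the error-handling design above; column names follow the documented PMERR_MSG layout but should be verified against your PowerCenter version:

    -- Summarize the last day's row errors by transformation and error code.
    SELECT trans_name,
           error_code,
           COUNT(*) AS error_rows
      FROM pmerr_msg
     WHERE error_timestamp >= TRUNC(SYSDATE) - 1
     GROUP BY trans_name, error_code
     ORDER BY error_rows DESC;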

Environment: Informatica 9.1, Informatica Power Exchange 9.1, Oracle 11g, Informatica MDM, IDD, SIF API, Datacom, Informatica IDQ, AIX, Perl, Erwin, Toad, PL/SQL, DataStage, J2EE, Java, JavaScript, XML, FileNet 8 Content Manager, ILOG JRules, IBM WebSphere Message Broker, Oracle RAC, Hadoop, HDFS, Business Objects, Microsoft Active Directory, Data Governance, Sun Java System Identity Manager, IBM WebSphere Application Server

Confidential, NY

ETL Lead / Solution Consultant

Responsibilities:

  • Performed the role of ETL Lead Consultant for Informatica, Power Exchange, Data Masking, and IDQ work. Performed data warehouse and ETL architecture in both Kimball and Inmon styles.
  • Involved in database and table design and implementation for the data warehouse and related ETL processes.
  • Involved in an Informatica data conversion project converting data from DB2 and IMS databases to Oracle with Informatica and Power Exchange for the CUSTOMER, HOLDINGS, CUSTODY, SECURITIES, and P and I subject areas.
  • Analyzed and modified logical and physical models to include new functionality with the help of Erwin.
  • Gathered requirements and analyzed, designed, coded, and tested highly efficient, highly scalable integration solutions using Informatica, Oracle 10g, SQL, mainframe (DB2) systems, MS SQL Server databases, flat files, and Unix shell scripting.
  • Responsible for functional design, technical design, coding, testing, debugging, and documentation for Informatica processes.
  • Documented source-to-target matrices, mapping specifications, mapping inventory, unit test plans, and database sizing.
  • Used the Informatica data profiling tool for additional profiling work. Identified data-quality rules and implemented them in Informatica mappings. Implemented a reusable error-handling mechanism to capture errors and warnings.
  • Used the Data Masking tool to mask critical data wherever required.
  • Scheduled Informatica workflows using the Control-M scheduling tool and troubleshot the workflows.
  • Heavily involved in tuning Informatica processes and Oracle queries.
  • Wrote new MDX calculations for the PRIVATE and PUBLIC cubes in SSAS.
  • Created new SSIS ETL packages for the EPRISM, SMTS, ICE, and FL applications.
  • Heavily involved in tuning Informatica mappings, Oracle queries, SSAS cube queries, and SSIS ETL packages.
  • Developed a customized MDM solution for common entities across different divisions.
  • Learned Hive, Pig, MapReduce, HDFS, Hadoop, CouchDB, and MongoDB.
  • Learned integration of Hadoop and HDFS with Informatica via Power Exchange.

Environment: Informatica 9.0, Oracle 11g, DB2, SQL Server, Power Exchange, Sun Solaris, Erwin, Toad, PL/SQL, VB Scripts, SSIS, SSAS, MDX, Hadoop (Big Data)

Confidential, NJ

Technical Lead /Architect

Responsibilities:

  • Performed the role of Technical Lead consultant for the EXPR, CLCS, and Credit Card DDM projects.
  • Analyzed logical, physical, and dimensional models.
  • Discussed requirements with business users for all vendors (Experian, Credit Card, Acquisition, etc.) in order to build a dimensional data mart for each vendor.
  • Designed the ETL piece with Informatica and created tech specs for Informatica mappings.
  • Developed business and technical roadmaps.
  • Heavily involved in Oracle and ETL tuning. Heavily involved in data masking, using the Informatica MD5 function and Oracle built-in packages to mask customer SSNs, co-maker SSNs, account numbers, etc. (a hedged masking sketch follows this list).
  • Heavily involved in extracting and loading data from the TPR (Transient Persistent Repository) to the DDM: loading data from flat files to the TPR area, from TPR to staging, from staging to the shadow area, and from shadow to the DDM.
  • Involved in reviewing explain plans, applying hints, rewriting queries, changing architecture design, making parallel background calls with Unix, using Oracle built-in PL/SQL packages, and using DBMS_PROFILER for PL/SQL tuning.
  • Performed data profiling and data-quality checks with customized ETL.
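
A hedged sketch of the database-side masking described above, hashing SSNs one-way with Oracle's DBMS_CRYPTO using MD5 (the mapping-side equivalent is Informatica's MD5() expression function). The table and column names are illustrative, and EXECUTE on DBMS_CRYPTO must be granted:

    -- Replace the clear-text SSN with its 32-character MD5 hex digest.
    UPDATE tpr_customer
       SET ssn_masked = RAWTOHEX(
               DBMS_CRYPTO.HASH(UTL_RAW.CAST_TO_RAW(ssn),
                                DBMS_CRYPTO.HASH_MD5))
     WHERE ssn IS NOT NULL;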

Environment: Informatica 8.6.1, Oracle 11g, Linux, Erwin, Toad, PL/SQL

Confidential, NY

Solution Architect/ Technical Lead

Responsibilities:

  • Performed the role of ETL Architect / Technical Lead for a team of 11 on the RIC (RMBS Integration and Commercialization) project.
  • Led the planning and coordination of an IT vision, strategy, goals, budget, and initiatives supporting the company's long-term objectives.
  • Participated in conflict resolution and implemented best practices across the enterprise.
  • Created the overall architecture, system spec, and design doc.
  • Conducted performance reviews and technical discussions of deliverables; actively involved in hiring resources; contributed to future initiatives by providing technical implementation strategy, budget, and resourcing.
  • Created web services with Informatica as the tool; refactored old code onto the new architecture; used Agile/Scrum methodology.
  • Analyzed and modified logical, physical, and dimensional models to include new functionality with the help of Erwin.
  • Discussed requirements with business users for all vendors (Experian, Veros, Zenta, etc.) that send data to S&P for evaluation, and came up with a modular, reusable, object-oriented approach to cover all requirements.
  • Designed the ETL piece with Informatica and created tech specs for Informatica mappings.
  • Heavily involved in Oracle and ETL performance tuning. Designed the purge and archive mechanism for the ETL process (a hedged sketch follows this list). Performed metadata modeling.
  • Evaluated Amazon EC2 cloud computing for the historical data load.
  • Performed virtualization in order to process historical data faster.
  • Heavily involved in managing offshore resources.
  • Heavily involved in extracting and loading data in a VLDB database (terabytes).
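
A minimal sketch of a purge-and-archive pass of the kind described above, driven by a date-based retention rule; the table names and the 36-month window are hypothetical:

    DECLARE
        v_cutoff DATE := ADD_MONTHS(TRUNC(SYSDATE), -36);  -- assumed retention
    BEGIN
        -- Archive first, then purge, in a single transaction.
        INSERT INTO rmbs_eval_archive
        SELECT * FROM rmbs_eval WHERE load_date < v_cutoff;

        DELETE FROM rmbs_eval WHERE load_date < v_cutoff;
        COMMIT;
    END;
    /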

Environment: Informatica 9.0/8.1.1, Informatica MDM, Oracle 11g, Linux, Erwin, Toad, PL/SQL, SQL*Loader, Business Objects XI 2

Confidential, PA

Data Warehouse Architect/Technical Lead

Responsibilities:

  • Performed the role of Data Warehouse Architect / Production Support / Administrator.
  • Led a team of six.
  • Heavily involved in production support for various DTS properties.
  • Created and maintained BTEQ, FastLoad, and MultiLoad scripts.
  • Created Teradata external loader connections (MultiLoad, Upsert and Update, FastLoad) for loading data into target tables in the Teradata database.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica.
  • Involved in analysis, design, development, maintenance, tuning, documentation, and implementation of enterprise logical data models and physical database objects.
  • Performed impact analysis of source-system updates on the data warehousing application.
  • Provided innovation, technology- and architecture-related thought leadership, strategic direction, and long-term vision.
  • Performed capacity planning for future enhancements and growth. Worked on the disaster recovery strategy and designed a failover strategy to allow continuous loads. Controlled database access permissions and privileges. Conducted training programs within the organization.
  • Used MicroStrategy as one of the reporting tools; extensively created and integrated MicroStrategy reports and objects (attributes, filters, metrics, facts, and prompts) with the data warehouse. Created data mart reports using complex filters and compound metrics.
  • Worked on proposals to win more data warehousing business from different casinos.
  • Used Netezza CLI commands (nzsql, nzload) to load dimensions and facts and benchmarked performance.
  • Worked on the implementation strategy / technical implementation piece.

Environment: DTS, SQL Server 2005, T-SQL, Teradata V2R6, Netezza, Erwin, stored procedures, Cognos, Windows 2003 Server, Teradata Manager, Teradata Administrator, MLOAD, BTEQ, MicroStrategy

Confidential, CT

Technical Lead

Responsibilities:

  • Performed the role of Technical Lead for a team of 20 on the CLAIM project.
  • Worked in an onshore/offshore model, including supervision, conflict resolution, and resolving technical challenges. Served as a key technical steward ensuring business and IT alignment. Responsible for planning project activities, arranging timelines, and approving vacations in light of their impact on existing implementations. Used Kalido as the MDM tool.
  • Responsible for managing a team of DBAs, data modelers, developers, SMEs, and architects across different projects.
  • Participated in the upgrade of Informatica PowerCenter from 7.0 to 8.5.1, including configuration of users/groups, permissions, roles, and privileges. Used change data capture to load changed data from source to the persistence stage (PSTAGE) by three methods: the double-MINUS technique (a hedged sketch follows this list), the Oracle CDC technique, and a date-based mechanism.
  • Wrote PL/SQL scripts and used a couple of metadata tables for data profiling, referential integrity, and data-quality work before loading data into PSTAGE.
  • Created documentation for capacity planning, backup and archival strategy, restart and recovery strategy, and purge strategy.
  • Used Kalido for Master Data Management (MDM).
  • Used the Informatica data profiling tool for additional profiling work. Used a code generator built on PL/SQL supplied packages to generate multiple PL/SQL procedures and functions, eliminating a lot of manual coding. Used Informatica PowerCenter to load data from PSTAGE to KSTAGE (Kalido stage).
  • Responsible for mentoring the whole team to enhance productivity.
  • Developed a large Data Quality application with PL/SQL packages.
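
A hedged sketch of the double-MINUS change-detection technique named above: two set differences between the source extract and the persistence stage, run in both directions; table names are illustrative:

    -- Rows that are new or changed in the source (insert/update candidates).
    SELECT * FROM src_claim
    MINUS
    SELECT * FROM pstage_claim;

    -- Rows that disappeared from the source (delete candidates).
    SELECT * FROM pstage_claim
    MINUS
    SELECT * FROM src_claim;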

Environment: Informatica 8.5.1, Oracle 10g, Netezza, Kalido (MDM), Linux, Erwin, Autosys, Toad, PL/SQL, SQL*Loader, PVCS, Mercury Quality Center, Business Objects
