
ETL Informatica Consultant Resume


Jersey City, New Jersey

OBJECTIVE

  • To obtain a position as a Senior ETL Developer or ETL Tech Lead and apply my skills in data integration services with highly advanced data warehousing and BI solutions to help the organization achieve its long-term and short-term goals.

SUMMARY

  • 8.5 years of experience in business requirements analysis, application design, data modeling, coding, development, testing, and implementation of business applications with RDBMS, data warehouse/data mart, ETL, OLAP, OLTP, and client/server environments.
  • Proficient in Software Development Life Cycle (SDLC) Methodologies, ISO 9001:2000, SEPG, CMM Level 5 and validations to ensure complete quality assurance control.
  • 8.5 years of data warehousing experience using Informatica PowerCenter 8.x/9.x.
  • Strong knowledge and work experience in designing, building and managing large complex DW BI systems.
  • Extensive exposure to ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter. Advanced knowledge of the ETL client tools (Source Analyzer, Warehouse Designer, Mapplet Designer, Transformation Developer, Mapping Designer, Repository Manager, Workflow Manager, and Workflow Monitor).
  • Good experience with Star schema, Snowflake schema, fact and dimension tables, and slowly changing dimensions (a Type 2 sketch follows this summary).
  • Acted as liaison between the clients, programmers and managers of the project development. Instrumental in the development and implementation of several project/module phases, maintenance and application testing.
  • Preparation of Technical Documents (Design specification, mapping documents).
  • Experience working with real-time data using PowerExchange for mainframes.
  • Experience with cloud-based databases such as Snowflake, MongoDB, MySQL, and Amazon RDS.
  • Good experience in data profiling using Informatica Data Explorer.
  • Experience in debugging and performance tuning of ETL designs.
  • Proficient in writing and using stored procedures, triggers, materialized views, cursors, partitioning, exception handling, and optimization techniques (a minimal PL/SQL sketch also follows this summary).
  • Work experience in using Oracle 9i, SQL, and PL/SQL.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Good Knowledge on Data Warehousing concepts like Star Schema, Dimensions and Fact tables.
  • Very Good Knowledge on Physical and Logical Database Modeling, as well as OLAP, ROLAP and MOLAP concepts.
  • Worked on projects in various domains, including Banking and Finance, Insurance, Supply Chain, and Publishing.
  • Self-motivated, adaptive to new technologies, and a team player with good interpersonal and communication skills.
  • Experience with the NoSQL database MongoDB.
  • Experience working with web services (REST and SOAP).
  • Experience with other ETL tools, including Pentaho Kettle and IBM DataStage.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business stakeholders, and developers across multiple disciplines.
  • Experience using automation and scheduling tools such as Tivoli Workload Scheduler, Tidal, and Control-M.
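
For illustration, the Type 2 slowly changing dimension pattern referenced in the summary can be sketched in plain Oracle SQL. The table, column, and sequence names below (STG_CUSTOMER, DIM_CUSTOMER, DIM_CUSTOMER_SEQ) are hypothetical; in the projects described here the same logic was built with PowerCenter transformations rather than hand-written SQL.

    -- Step 1 (hypothetical tables): expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
       SET d.curr_flag  = 'N',
           d.eff_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.curr_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

    -- Step 2: insert a new current row for changed or brand-new customers.
    INSERT INTO dim_customer (customer_sk, customer_id, cust_name, cust_addr,
                              eff_start_dt, eff_end_dt, curr_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.curr_flag   = 'Y'
                          AND d.cust_name   = s.cust_name
                          AND d.cust_addr   = s.cust_addr);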
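
Likewise, the stored procedure and exception handling experience listed above usually takes a shape similar to the minimal PL/SQL sketch below; the procedure, tables, and columns are hypothetical and are shown only to indicate the pattern.

    -- Hypothetical post-load step: archive processed staging rows and record the outcome.
    CREATE OR REPLACE PROCEDURE archive_stg_orders (p_batch_id IN NUMBER) AS
      v_rows NUMBER := 0;
    BEGIN
      INSERT INTO stg_orders_hist
        SELECT * FROM stg_orders WHERE batch_id = p_batch_id;
      v_rows := SQL%ROWCOUNT;

      DELETE FROM stg_orders WHERE batch_id = p_batch_id;

      INSERT INTO etl_audit_log (batch_id, step_name, row_count, status, load_ts)
      VALUES (p_batch_id, 'ARCHIVE_STG_ORDERS', v_rows, 'SUCCESS', SYSTIMESTAMP);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        INSERT INTO etl_audit_log (batch_id, step_name, row_count, status, load_ts)
        VALUES (p_batch_id, 'ARCHIVE_STG_ORDERS', 0, SUBSTR('FAILED: ' || SQLERRM, 1, 200), SYSTIMESTAMP);
        COMMIT;
        RAISE;
    END archive_stg_orders;
    /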

TECHNICAL SKILLS

ETL Tool(s): Informatica PowerCenter 8.x/9.x, Informatica BDE, Informatica PowerExchange, Informatica Data Explorer, Pentaho Kettle

BI Tools: Business Objects, RS Unify, Jaspersoft iReports

Databases: Oracle 9i/10g, MS SQL Server, DB2, Silverpop, Emptoris, Snowflake, MS-Access, Teradata, Netezza

NoSQL: MongoDB

CRM Tools: Salesforce, PeopleSoft

Languages: UNIX shell scripting, SQL, PL/SQL, C and C++

Querying Tools: TOAD, SQL Developer, SQL Navigator, Query Analyzer

Supporting Tools: UltraEdit, WinSCP, HP Quality Center, JIRA, ClearQuest

PROFESSIONAL EXPERIENCE

Confidential, Jersey City, New Jersey

ETL Informatica Consultant

Responsibilities:

  • ETL design and development in Informatica PowerCenter.
  • Data Integration validation and data quality controls for ETL processes.
  • Define and document data interface specifications.
  • Documentation of processes, procedures and environment.
  • All aspects of data management and policies, data interface requirements, data analysis and data modeling.
  • Involved in creating rules for data validation and standardization per business requirements (a sample check is sketched after this list).
  • Align with company strategies and objectives.
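
The validation rules mentioned above often reduce to simple SQL checks that divert failing rows to an error table before the main load; the sketch below uses hypothetical STG_ORDERS / STG_ORDERS_ERR tables and an illustrative date range, and is not taken from the actual project.

    -- Hypothetical rule: reject staging rows with a missing business key or an out-of-range order date.
    INSERT INTO stg_orders_err (order_id, customer_id, order_dt, err_reason, err_ts)
    SELECT s.order_id, s.customer_id, s.order_dt,
           CASE WHEN s.customer_id IS NULL THEN 'MISSING CUSTOMER_ID'
                ELSE 'ORDER_DT OUT OF RANGE' END,
           SYSTIMESTAMP
      FROM stg_orders s
     WHERE s.customer_id IS NULL
        OR s.order_dt NOT BETWEEN DATE '2000-01-01' AND TRUNC(SYSDATE);

    -- Only rows that pass the rule continue on to the target load.
    DELETE FROM stg_orders s
     WHERE EXISTS (SELECT 1 FROM stg_orders_err e WHERE e.order_id = s.order_id);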

Environment: Oracle 11g, Informatica 9.6.1, Linux, JIRA, AutoSys, HP Quality Center, SVN

Confidential, New York

Project Lead

Responsibilities:

  • ETL design and development in Informatica Power Center and Informatica IDQ.
  • Monitoring of Informatica jobs, processes and environment.
  • Data Integration validation and data quality controls for ETL processes.
  • Technical administration of Informatica Power Center environment.
  • Define and document data interface specifications.
  • Documentation of processes, procedures and environment.
  • All aspects of data management and policies, data interface requirements, data analysis and data modeling.
  • Informatica Data Quality (IDQ 8.6.1) is the tool used here for data quality measurement.
  • Used IDQ components for address verification, data standardization and data profiling.
  • Involved in creating rules for data validation and standardization per business requirements (a SQL equivalent is sketched after this list).
  • Participate in and sometimes lead data modeling and data requirement analysis sessions.
  • Information delivery: report validation processes for warehouse reports and data quality benchmark dashboards.
  • Align with company strategies and objectives.
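
The standardization rules noted above (built with IDQ components and PowerCenter expression logic) come down to transformations of roughly this kind; the SQL below is a hypothetical equivalent for name, address, and phone cleanup, not an export of the actual IDQ rules.

    -- Hypothetical standardization pass over a customer staging table:
    -- trim and upper-case names, collapse repeated whitespace in addresses,
    -- and keep only digits in phone numbers.
    UPDATE stg_customer
       SET cust_name = UPPER(TRIM(cust_name)),
           cust_addr = REGEXP_REPLACE(TRIM(cust_addr), '[[:space:]]+', ' '),
           phone     = REGEXP_REPLACE(phone, '[^0-9]', '');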

Environment: Oracle 11g, JSON, XMLs, Informatica 9.6.1, Informatica BDE, IDQ 9.5, Linux, Salesforce, Silverpop, Emptoris, Birst, Power BI, PeopleSoft, MySQL, MongoDB, Snowflake, Amazon Redshift, Omniture, Web services, JIRA

Confidential, New York

Project Lead

Responsibilities:

  • ETL design and development in Informatica Power Center.
  • Monitoring of Informatica jobs, processes and environment.
  • Data Integration validation and data quality controls for ETL processes.
  • Technical administration of Informatica Power Center environment.
  • Define and document data interface specifications for Salesforce integration.
  • Effectively prepared test cases for Salesforce integration.

Environment: Oracle 11g, JSON, XMLs, Informatica 9.6.1, Informatica BDE, Linux, Salesforce, PeopleSoft, Web services, JIRA

Confidential

Sr. Informatica Consultant

Responsibilities:

  • Designed the Mapping Technical Specifications on the basis of Functional Requirements.
  • Created ETL mappings using Informatica Designer to move data from source systems such as DB2 and flat files into a common staging area and then into the data marts.
  • Developed mappings using transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner, and Router to load data into staging tables and then into the target.
  • Debugged mappings, setting breakpoints to check the data flow through transformations.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent sessions and scheduling them to run at specified times.
  • The objective was to extract data stored in different databases, load it into staging tables first, and then into Oracle.
  • Involved in performance tuning by optimizing the sources, targets, and mappings.
  • The Informatica mappings within those batches would then populate the staging tables.
  • Performed daily status checks and troubleshooting in case of issues.
  • Created unit test cases (a reconciliation check of the kind used is sketched after this list).
  • Stayed up to date on upgrade options, enhancements, add-ons, and features in the ETL tool with a view to improving productivity.
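
The unit test cases referenced above typically include a source-to-target reconciliation query such as the one sketched below; the table names, the BATCH_ID column, and the batch value are hypothetical.

    -- Hypothetical reconciliation check: row counts and amount totals should match
    -- between the staging table and the loaded target for a given batch.
    SELECT 'STAGING' AS layer, COUNT(*) AS row_cnt, SUM(order_amt) AS total_amt
      FROM stg_orders
     WHERE batch_id = 1001
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amt)
      FROM fact_orders
     WHERE batch_id = 1001;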

Environment: Oracle 11g, XMLs, Informatica 9.5.1, Linux, AutoSys, ClearQuest

Confidential

ETL Developer

Responsibilities:

  • Supported the development team in identifying and fixing issues.
  • Analyzed and documented the existing Pentaho Kettle framework.
  • Fixed bugs in the existing Pentaho Kettle jobs.
  • Prepared unit test cases and participated in mock tests.
  • Monitored production jobs.

Confidential

Informatica Consultant

Responsibilities:

  • Analysis of the existing system and processes; interaction with the onsite team for requirement capture.
  • Delivery of successfully completed and tested modules to the customer on time.
  • Developed the LLD of the system, prepared program specifications, performed coding, and prepared and executed test cases.
  • Developed ETL mappings, transformations using Informatica Power Center.
  • Involved in Developing and executing unit test Cases.
  • Involved in writing UNIX Shell scripts to run Informatica workflows.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Checking and testing of the ETL procedures and programs using Informatica session log.
  • Worked extensively on SQL Server and DB2 as data sources.
  • Prepared deployment documents for code moves from one environment to another.
  • Involved in Source Code Review.
  • Involved in providing Knowledge Transfer to Other Team.

Environment: Informatica 9.5.1, Informatica PowerExchange, Informatica Data Explorer, XMLs, Mainframes, MS SQL Server, Teradata, Linux, Tivoli scheduler, HP Quality Center, ClearQuest

Confidential

Informatica Consultant

Responsibilities:

  • Involved in understanding the client requirements and prepared a specification document capturing the exact requirements.
  • Extracting, Transforming and Loading the data from Source to Staging and Staging to Target according to the Business requirements.
  • Designed and created data cleansing and validation logic for the data warehouse using the Informatica ETL tool.
  • Scheduling, Monitoring and Debugging of Informatica Sessions and Workflows in Autosys Scheduler.
  • The objective was to extract data stored in different databases such as SQL Server and flat files, load it into staging tables first, and then into Oracle.
  • Involved in Peer reviews.
  • Involved in performance tuning by optimizing the sources, targets, and mappings (a source-side filter sketch follows this list).
  • Prepared Unit test cases for the mappings.
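
The performance tuning mentioned above often amounts to pushing filters and column selection down to the source database (for example through a Source Qualifier SQL override) instead of filtering inside the mapping; the before/after below is a hypothetical illustration, not the project's actual query.

    -- Before (hypothetical): the mapping reads the full table and filters downstream.
    SELECT * FROM src_transactions;

    -- After (hypothetical): only the needed columns and rows are read at the source,
    -- reducing the data moved into the Informatica transformations.
    SELECT txn_id, account_id, txn_dt, txn_amt
      FROM src_transactions
     WHERE txn_dt >= TRUNC(SYSDATE) - 7;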

Environment: Informatica 9.5.1, Informatica PowerExchange, Informatica Data Explorer, XMLs, Mainframes, MS SQL Server, Teradata, Linux, Tivoli scheduler, HP Quality Center, ClearQuest
