
ETL/Informatica Developer Resume Profile

Houston, Texas

PROFESSIONAL SUMMARY:

  • Over 7 years of experience in the complete data warehouse life cycle, covering analysis, design, development, testing, and maintenance of software applications, and in implementing data warehouse and data integration applications using ETL.
  • Strong Knowledge of all phases of Software Development Life Cycle and have worked in large scale projects with incremental deliverables.
  • Experience in OLTP modeling (2NF, 3NF) and OLAP dimensional modeling (Star and Snowflake schemas) using Erwin (conceptual, logical, and physical data models).
  • Extensive knowledge in Architecture/Design of Extract, Transform, Load environment using Informatica PowerCenter.
  • Experience in implementing complex business rules by creating Informatica transformations and reusable transformations (Lookup - connected and unconnected, Joiner, Union, Sorter, Aggregator, Rank, Normalizer, Filter, Router, and Update Strategy) and by developing complex mappings and mapplets.
  • Extensively worked on Slowly Changing Dimensions (SCD Type 2/3/Hybrid) and on setting up Change Data Capture (CDC) mechanisms.
  • Working knowledge of Informatica MDM Hub and involved in the configuration of landing tables, staging tables, Data Models, Relationships.
  • Experience in data quality tools such as Informatica Data Quality (IDQ).
  • Profiled data from disparate source systems using Informatica Data Explorer (IDE).
  • Experience in integrating various data sources such as SQL Server, Oracle, Sybase, ODBC connectors, and flat files.
  • Expertise in debugging and performance tuning of Informatica mappings, sessions, and SQL stored procedures.
  • Experience in Identifying and Resolving ETL production issues.
  • Experience in Maintenance, Enhancements, Performance tuning of ETL Mappings.
  • Experience with relational databases such as Oracle 8i/9i/10g, SQL Server, Sybase, MS Access.
  • Strong skills in SQL and PL/SQL packages, functions, stored procedures, and triggers.
  • Experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
  • Experience with database SQL tuning and query optimization tools such as Explain Plan.
  • Experience with Oracle external tables, SQL*Loader, UTL_FILE, and BCP.
  • Hands on experience in UNIX shell scripting.
  • Experience in using the Informatica command line utility 'pmcmd' to schedule and control sessions and batches.
  • Experience working with various scheduling tools (Control-M, CA AutoSys).
  • Documented projects with Functional Requirement Specifications (FRS), use case specifications, ER diagrams, test plans, and test scripts/test cases.
  • Team player with excellent interpersonal skills and the ability to work effectively with multiple stakeholders on a project.

TECHNICAL SKILLS:

Operating Systems: UNIX, MS-DOS, Windows 2000/XP

ETL: Informatica PowerCenter 9.1/8.6.1/8.1/7.1, Informatica Data Quality 8.5

Modeling Tools: Power Designer, Erwin 7.2/4.1, MS Visio.

Databases: Oracle 8i/9i/10g/11g, MS SQL Server 200x, MS Access 97/2000.

Languages: C, HTML, Java, PL/SQL, SQL*Plus, Visual Basic 6.0/5.0, Shell Scripting, Perl Scripting, JSP, XML, DHTML.

Application Tools: MS Office, JavaScript, MS Project, TOAD.

PROFESSIONAL EXPERIENCE:

Confidential

ETL/Informatica Developer

Responsibilities:

  • Involved in requirement gathering with internal users and in sending requirements to our source system vendor.
  • Worked on Migration of the Metadata changes to the Informatica repository.
  • Installed and configured Informatica PowerCenter 9.1 and the Informatica client tools.
  • Developed several complex Informatica mappings, mapplets, reusable transformations, and workflows for cleansing data and loading it into the staging area and data warehouse.
  • Worked on the implementation of both Type I and Type II SCDs for the data mart.
  • Extensively used Transformations for heterogeneous data joins, complex aggregations and external procedure calls.
  • Informatica Designer tools were used to design the source definition, target definition and transformations to build mappings.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, testing of Informatica Sessions, and the Target Data.
  • Used Erwin 4.2 for data modeling and dimensional data modeling, and helped design database schemas.
  • Applied business rules using the Informatica Data Quality (IDQ) tool to cleanse data.
  • Tested all the applications, transported the data to the target warehouse (Oracle) tables on the server, scheduled and ran extraction and load processes, and monitored sessions using Informatica Workflow Manager.
  • Developed data conversion, integration, loading and verification specifications.
  • Designed the slowly changing dimension strategy for the warehouse.
  • Developed UNIX shell scripts that invoke pmcmd to start and stop sessions and batches.
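The last bullet describes shell scripts driving the pmcmd command-line utility. A minimal sketch of such a wrapper is shown below; the service, domain, user, folder, and workflow names are hypothetical placeholders, and the script echoes the assembled command rather than executing it (a real deployment would run it, supply credentials via pmpasswd or the environment, and check the exit status):

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper. All names below are hypothetical
# placeholders, not values taken from this resume.
INFA_USER="etl_user"
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="DOM_DEV"
FOLDER="DW_LOADS"
WORKFLOW="${1:-wf_daily_load}"
ACTION="${2:-start}"

case "$ACTION" in
  start)
    CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER -wait $WORKFLOW"
    ;;
  stop)
    CMD="pmcmd stopworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER $WORKFLOW"
    ;;
  *)
    echo "usage: $0 <workflow> start|stop" >&2
    exit 1
    ;;
esac

# Echo instead of exec so the script can be reviewed or dry-run;
# scheduler wrappers like this typically log the command as well.
echo "$CMD"
```

Such a wrapper is what a scheduler (Control-M, AutoSys, cron) would call, keeping connection details out of the job definitions themselves.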

Environment:

Informatica PowerCenter 9.1, XML, Erwin 4.2, Oracle 11g, SQL Server, PL/SQL, SQL*Loader, UNIX shell scripting.

Confidential

Role: ETL Developer.

Responsibilities:

  • Involved in requirements gathering, design overview, ETL specifications, data analysis/troubleshooting, implementation tasks, and general mentoring.
  • Creation and management of the solution proposal, Macro and Micro designs, and deployment plans.
  • Provided in-depth technical consultation to ensure development of efficient application systems by utilizing standard methodologies and best practices.
  • Responsible for writing program specifications for developing mappings.
  • Created Complex Informatica Mappings to implement complex business rules by using mapplets, reusable transformations and mapping parameters.
  • Design and Development of Mapping specifications, Physical Flow diagrams and Build documents.
  • Developed reusable mappings to process real time files/admin systems extracts, error handling and notification process.
  • Designed and developed ETL load strategies, update strategies, error processing, mapping dependencies, batch processing and balance control approaches.
  • Developed mappings, workflows and schedules that reflect ETL methodology standards.
  • Designed and implemented schemas for reference data, Change Data Capture, process control and audit balancing.
  • Developed mappings/mapplets, used dynamic parameter files.
  • Used transformations like Normalizer, Lookup, Aggregator, Expression, Sequence Generator, Router, Filter, Joiner and Union Transformations.
  • Created data checks and implemented error strategies using the Database Logging.
  • Used Informatica version control to maintain the changes to the code.
  • Created partitions to improve the Session Performance.
  • Involved in code reviews to facilitate the right coding practices.
  • Automated hourly and daily transaction reports using Talend Open Studio.
  • Extensively involved in developing shell scripts, Perl scripts for loading the data and for automating the ETL process.
  • Responsible for migrating projects between multiple environments (DEV, QA, UAT, and Prod).
  • Involved in production support.
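One bullet above mentions dynamic parameter files. A PowerCenter parameter file is a sectioned, INI-style text file; a minimal sketch might look like the following, where the folder, workflow, session, and connection names are invented for illustration and the exact section-header syntax can vary by PowerCenter version:

```
[Global]
$$LOAD_DATE=2009-01-01

[DW_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$DBConnection_Source=CONN_ORA_SRC
$DBConnection_Target=CONN_ORA_DWH
$$LastExtractDate=2009-01-01 00:00:00
```

At run time the Integration Service reads the section matching the folder, workflow, and session, so the same mappings can pick up different connections and incremental-extract dates per environment simply by swapping the file.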

Environment: Informatica PowerCenter 8.1, Oracle 10g, SQL Server 2005, Erwin, Windows, UNIX, HP Quality Center, Talend, Netezza, Metadata Manager.

Confidential

Informatica Developer.

Responsibilities:

  • Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Developed mappings and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
  • Handled Slowly Changing Dimensions (SCD Type I, Type II, and Type III) based on the business requirements.
  • Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data into the data warehouse.
  • Used Informatica PowerCenter and all its features extensively in migrating data from OLTP systems to the enterprise data warehouse.
  • Worked closely with the client to understand business requirements, perform data analysis, and deliver on client expectations.
  • Used the Informatica PowerCenter Workflow Manager to create sessions and batches that run with the logic embedded in the mappings.
  • Extracted data from different sources such as Oracle, flat files, XML, DB2, and SQL Server and loaded it into the data warehouse (DWH).
  • Extensively used Erwin for Logical and Physical data modeling and designed Star Schemas.
  • Involved in creation of Folders, Users, Repositories and Deployment Groups using Repository Manager.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Wrote PL/SQL stored procedures triggers, cursors for implementing business rules and transformations.
  • Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, then tuned it by identifying and eliminating bottlenecks for optimum performance.
  • Worked extensively with the different caches: index cache, data cache, and lookup cache (static, dynamic, persistent, and shared).
  • Involved in Performance tuning for sources, targets, mappings and sessions.
  • Involved in scheduling the Informatica workflows using Autosys.
  • Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
  • Created deployment groups, migrated the code into different environments.
  • Worked closely with reporting team to generate various reports.
  • Responsible for creating workflows and sessions using the Informatica Workflow Manager and for monitoring workflow runs and statistics in the Informatica Workflow Monitor.
  • Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle 10g in accordance with requirements and specifications.
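The workflows in this role were scheduled through AutoSys, whose jobs are defined in JIL. A command job wrapping an Informatica workflow might be sketched roughly as below; the job, machine, owner, and script names are hypothetical:

```
insert_job: DWH_WF_DAILY_LOAD   job_type: c
command: /opt/infa/scripts/run_workflow.sh wf_daily_load
machine: etl_host01
owner: etladmin
start_times: "02:00"
alarm_if_fail: 1
```

Dependencies between loads are then expressed with JIL conditions on predecessor jobs rather than inside the Informatica workflows themselves.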

Environment: Informatica PowerCenter 9.1/8.6.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 11g, SQL Server 2005, UDB DB2 8.1, XML, AutoSys, TOAD 6.0, SQL, PL/SQL, UNIX.

Confidential

Informatica Developer.

Responsibilities:

  • Extracted data from various sources, applied business logic to load them in to the Data Warehouse.
  • Worked on Informatica 8.6.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Involved in design and development of complex ETL mappings.
  • Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mapping.
  • Developed Mapplets, Worklets and Reusable Transformations for reusability.
  • Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
  • Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
  • Developed Informatica SCD Type I and Type II mappings.
  • Implemented incremental loads, change data capture, and incremental aggregation.
  • Created Stored Procedures in PL/SQL.
  • Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
  • Created UNIX shell scripts and called them as pre-session and post-session commands.
  • Developed documentation for all the routines (mappings, sessions, and workflows).
  • Involved in scheduling the workflows through CONTROL-M using UNIX scripts.
  • Unit tested the mappings and responsible for production support as well.
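A typical pre-session command, as mentioned in the bullets above, is a small housekeeping script run before a session reads its flat files. The sketch below archives leftover extract files; all directory and file names are hypothetical, and mktemp plus a simulated leftover file keep the sketch self-contained and runnable anywhere:

```shell
#!/bin/sh
# Sketch of a pre-session command script: archive previously processed
# flat files before a new extract lands. Names are hypothetical.
SRC_DIR="${SRC_DIR:-$(mktemp -d)}"
ARC_DIR="${ARC_DIR:-$(mktemp -d)}"

# Simulate a leftover extract file from the previous run.
touch "$SRC_DIR/customers_20090101.dat"

# Move any leftover .dat files into the archive, timestamping each
# copy so reruns on the same day do not overwrite earlier archives.
for f in "$SRC_DIR"/*.dat; do
  [ -e "$f" ] || continue
  mv "$f" "$ARC_DIR/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done
```

In PowerCenter such a script would be referenced in the session's pre-session command property, with a matching post-session script typically compressing or purging the archive.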

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 10g, PuTTY, TOAD for Oracle, MS SQL Server, flat files, PL/SQL, Erwin 4.1 (data modeling tool), Cognos, UNIX shell scripting, CONTROL-M.

Confidential

Informatica Developer.

Responsibilities:

  • Developed a complete understanding of data in the various sources in order to support the Business Intelligence (BI) team throughout the SDLC.
  • Supported BI team to create Logical and Physical Data Models for Data Warehousing.
  • Created incremental, iterative ETL releases using an Agile process.
  • Created and supported source-to-target documentation along with business analysts.
  • Supported and fixed ETL and reporting bugs identified by the QA team, following Agile methodology.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing.
  • Created ETL code based on source-to-target documentation to serve EDW data needs for various reports and dashboards built using MicroStrategy.
  • Created ETL code to support data flow from source to staging and from staging to the EDW, so as to standardize ETL coding across the SDLC.
  • Created ETL code with various incremental logics (Slowly Changing Dimensions Type 1 and 2) based on requirements.
  • Created Stored Procedures, Cursors and Triggers, to load, test and validate data in EDW Database.
  • Verified the data loaded into critical and non-critical columns based on the ETL updates or inserts.
  • Created various parameters on the Informatica server to support connections for source, target, and lookup tables.
  • Wrote several command-task scripts in UNIX Korn shell for file transfers and the clean-up process.
  • Tuned ETL Codes/scripts, SQL queries to improve the system performance.
  • Created or Dropped Indexes to improve the Load Performance.
  • Migrated ETL code from the development environment to the test environment, and deployed it to the production environment after QA ETL testing.
  • Scheduled and monitored workflows in the production environment under various circumstances, and created Event-Wait and Event-Raise tasks as needed.
  • Supported ETL Testing QA Team in writing various Test Cases and validate data in Test Environment.
  • Attended daily status meetings with the BI team and updated the project status under Agile methodology.
  • Attended biweekly meetings with the BI team to update and plan iterations.

Environment: Oracle 11g, Informatica 8.1, XML, flat files, SQL Developer, SQL*Plus, Windows, UNIX, HP Quality Center, Longview Khalix, PeopleSoft, VMware, DAC.
