
ETL/Informatica Developer Resume

Los Angeles, CA

SUMMARY

  • Overall 8 years of Software Life Cycle experience in System Analysis, Design, Development, Implementation, Maintenance, and Production support of Data Warehouse Applications.
  • Extensive ETL/Informatica Power Center and data integration experience in developing ETL mappings and scripts using Informatica Power Center 10.x/9.x/8.x/7.x and IDQ.
  • Have clear understanding of Data Warehousing and BI concepts with emphasis on ETL and life cycle development using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Experience in creating High Level Design and Detailed Design in the design phase.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges with large data sets.
  • Strong experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Static and Dynamic Lookups, Java, SQL, Stored Procedure, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
  • Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
  • Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
  • Good understanding of relational database management systems like Oracle, DB2, and SQL Server, and worked on Data Integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
  • Worked on IDQ/IDE tools for data profiling, data enrichment and standardization.
  • Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations. Experience in data profiling and analyzing the scorecards to design the data model.
  • Worked on Real Time Integration between MDM Hub and External Applications using Power Center.
  • Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, and Cache Management.
  • Extensively created Mapplets, common functions, reusable transformations, look-ups for better usability.
  • Extensively used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
  • Experience in UNIX shell scripting, job scheduling and server communication.
  • Involved in Unit testing and System testing to check whether the data loaded into the target is accurate.
  • Extensive database experience and highly skilled in SQL Server, Oracle, DB2, Sybase, XML Files, Flat Files, MS Access.
  • Excellent communication and problem-solving skills; result-oriented with minimal supervision and a strong team player.

TECHNICAL SKILLS

ETL Tools: Informatica 10.1/9.6/9.1/8.6.1/8.1 (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager), Data Cleansing, Data Quality, Repository, Metadata, Data Mart, OLAP, OLTP, IDQ, MDM, SQL Server SSIS.

Data modeling tools: Erwin

Databases: Oracle 11g/10g/9i/8i, IBM-DB2, MSSQL Server

Other Tools: Toad, SQL Developer

Programming Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting

Job scheduling: Shell Scripting, Autosys, Tidal, Control-M

Environment: MS Windows 2012/2008/2005, UNIX

PROFESSIONAL EXPERIENCE

Confidential, Los Angeles, CA

ETL/Informatica Developer

Responsibilities:

  • Develop, implement, and maintain the appropriate end-to-end transformation and load processes associated with the Data Warehouse and Data Marts, including Data Context, Data Mapping, Testing, Quality, Metadata, Operations SLAs, Data Modeling, and ETL
  • Develop, test, and deploy data processes using a combination of custom and off-the-shelf tools
  • Work with architects, business users, project managers/Scrum Masters, and other engineers to achieve sustainable solutions in a complex environment
  • Participate in code reviews, code quality checks, and developer integration testing
  • Use minimum requirements to analyze, design, develop, and test data transformation solutions for the Enterprise Data Warehouse system; implement ETL solutions with processing and error-handling techniques
  • Monitor, troubleshoot, maintain, and continuously improve existing data/ETL/testing applications
  • Identify, document, and implement functional requirements
  • Follow set ETL performance standards
  • Document and design processes and procedures for installation and maintenance of ETL processes
  • Perform deep troubleshooting and issue analysis; implement software enhancements and/or apply patches in Informatica ETL.
  • Responsible for code Migration from development to System test and Production environments.
  • Testing of ETL jobs that are scheduled for file transfers from Operational Data Stores to designated file systems/directories.
  • Developed Test Strategy document, Test schedule, Test plan and Test cases for Application development projects.
  • Implemented Informatica MDM workflows, including data profiling, configuration specification, match-rule coding, tuning, and migration
  • Defined and built best practices for creating business rules within the Informatica MDM solution
  • Performed the land process to load data into MDM Hub landing tables using external batch processing for the initial data load into the Hub Store, and defined an automation process for staging, loading, match, and merge.
  • Experience in writing SQL test cases for Data quality validation.
  • Experience in various data validation and Data analysis activities to perform data quality testing.
  • Experience in maintaining Data Quality, Data consistency and Data accuracy for Data Quality projects.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ)
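The SQL test cases for data quality validation mentioned above could be sketched as below; the table and column names (stg_customer, dim_customer, customer_id) are illustrative assumptions, not from an actual project:

```sql
-- Hypothetical data-quality checks (all names are assumptions).
-- 1) Null check on a mandatory business key: expect zero rows.
SELECT COUNT(*) AS null_keys
FROM   stg_customer
WHERE  customer_id IS NULL;

-- 2) Duplicate check on the natural key: expect zero rows.
SELECT   customer_id, COUNT(*) AS dup_count
FROM     stg_customer
GROUP BY customer_id
HAVING   COUNT(*) > 1;

-- 3) Referential check: staged rows whose dimension key is missing.
SELECT s.customer_id
FROM   stg_customer s
LEFT   JOIN dim_customer d ON d.customer_id = s.customer_id
WHERE  d.customer_id IS NULL;
```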

Environment: Informatica Power Center, SQL Server, JIRA, SVN, GitHub, Toad, Oracle, UNIX, Bitbucket

Confidential, Grand Rapids, MI

ETL/Informatica Developer

Responsibilities:

  • Gathering business requirements, translating that information into detailed technical specifications from which programs will be written or configured, and validating that the proposed applications align with the architectural design and with the business needs.
  • Developing test validations for ETL scripts and recommending modifications to ETL scripts, data warehouse, and data marts using JUnit and a source code editor.
  • Involved in Data Profiling, Analysis, Standardization, Verification, and Cleansing using IDQ 10.2
  • Developing and customizing ETL packages using PL/SQL; identifying the scope for new, more efficient process developments and ETL mappings, and working on planning, prioritizing, and estimating timelines for design, coding, testing, and production deployment.
  • Developing, creating, and modifying the stored procedures for standard or ad-hoc reporting and managing Tableau’s data source.
  • Worked with Informatica Data Quality toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
  • Designing, coding, debugging, deploying, and testing the Data model using Oracle PL/SQL, ETL Informatica and Tableau.
  • Modifying the existing data models using SQL and optimizing database and application performance using Oracle features such as Explain Plan.
  • Developed the FACETS DA tables load mappings for all the new tables added as a part of Facets version update.
  • Parsing high-level design specification to simple ETL coding along with mapping standards.
  • Worked extensively to develop customized MDM services.
  • Created necessary batch interfaces to and from MDM hub.
  • Maintained MDM jobs and Master Data sequences; built test scripts for unit testing of customized MDM code.
  • Worked on Informatica Data Quality to resolve customers related issues.
  • Worked on Informatica Power Exchange Real Time Change Data Capture (CDC) and Informatica Power Exchange Data maps against IMS.
  • Hands on experience working on profiling data using IDQ.
  • Created dictionaries using Informatica Data Quality (IDQ) that were used to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Worked on FACETS data tables and created audit reports using queries. Manually loaded data in FACETS and have good knowledge of FACETS business rules.
  • Coordinate with Business team to consume the data and publish into reports using Tableau.
  • Performed Unit testing and maintained test logs and test cases for all the mappings.

Environment: Informatica Power Center, SQL Server, SQL Assist, SQL Workbench, JIRA, SVN, GitHub, Toad, Oracle.

Confidential, Bakersfield, CA

ETL/Informatica Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Design and developed end-to-end ETL process from various source systems to Staging area, from staging to Data Marts.
  • Worked on Informatica Power Center tools - Mapping Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools like Source Analyzer, Target designer, Mapping designer, Mapplet designer, Transformation Developer.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Experience in Investigating and communicating data quality issues and data failures to onsite DQ development team and fix them.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Developed the FACETS DA tables load mappings for all the new tables added as a part of Facets version update.
  • Did the forward and backward data mapping between the fields in the mainframe and FACETS.
  • Used PL/SQL procedures in Informatica mappings to truncate the data in target tables at run time.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
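Source-to-target validation queries of the kind described above can be sketched as follows; the table and column names (src_orders, tgt_orders, order_id, order_amt) are hypothetical, not from an actual engagement:

```sql
-- Hypothetical source-to-target reconciliation (Oracle syntax; names are assumptions).
-- Row-count comparison between source and target tables.
SELECT (SELECT COUNT(*) FROM src_orders) AS src_rows,
       (SELECT COUNT(*) FROM tgt_orders) AS tgt_rows
FROM   dual;

-- Rows present in the source but missing (or differing) in the target.
SELECT order_id, order_amt FROM src_orders
MINUS
SELECT order_id, order_amt FROM tgt_orders;
```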

Environment: Informatica Power Center 10.1, MDM, Oracle 11g, UNIX, PL/SQL, SQL*Plus, TOAD, MS Excel.

Confidential, Chicago, IL

ETL/Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Developed rules and Mapplets that are commonly used in different mappings.
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Worked on Power Center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Developed an Address Validator transformation in IDQ to be integrated into an Informatica PowerCenter mapping.
  • Performed Unit Testing and tuned the mappings for better performance.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created PL/SQL programs like procedures, function, packages, and cursors to extract data from Target System.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Responsibilities include creating the sessions and scheduling the sessions.
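A PL/SQL extract procedure with a cursor, of the kind mentioned above, might look like this sketch; every name (extract_recent_orders, tgt_orders, extract_orders, the columns) is an illustrative assumption:

```sql
-- Hypothetical PL/SQL sketch (Oracle): cursor-based extract from a target table.
CREATE OR REPLACE PROCEDURE extract_recent_orders (p_since IN DATE) AS
  CURSOR c_orders IS
    SELECT order_id, order_amt
    FROM   tgt_orders
    WHERE  load_date >= p_since;
BEGIN
  FOR r IN c_orders LOOP
    -- Placeholder for downstream handling, e.g. loading an extract table.
    INSERT INTO extract_orders (order_id, order_amt)
    VALUES (r.order_id, r.order_amt);
  END LOOP;
  COMMIT;
END extract_recent_orders;
/
```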

Environment: Informatica Power Center 9.6, Oracle 11g, UNIX, PL/SQL, IDQ, SQL*Plus, TOAD, MS Excel.

Confidential, Montgomery, AL

Informatica Developer

Responsibilities:

  • Worked with the users and made changes to Informatica mappings according to the business requirements.
  • Developed the Informatica mappings using various transformations, Sessions, and Workflows. SQL Server was the target database; the sources were a combination of Flat files, Oracle tables, PeopleSoft, Excel files, CSV files, etc.
  • Worked with IDQ toolkit, Analysis, data cleansing, data matching, data conversion, reporting and monitoring capabilities.
  • Handled technical and functional call across the teams.
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers and data flow management into multiple targets using Router.
  • Optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and Mapplets.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for the consumers.
  • Used the Address Doctor Geo-coding table to validate the address and performed exception handling reporting and monitoring the data.
  • Created Reference/Master data for profiling using IDQ Analyst tools.
  • Experience in Data Quality Analysis, Data Profiling, Data cleansing and Master data management.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Involved in deployment of IDQ mappings to application and to different environments.
  • Worked on Data Quality checks for data feeds and performance tuning.
  • Worked on data analysis to find the data duplication and existed data pattern using a data profiling tool, IDE.
  • Monitored Data Quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Checked data received from third parties for accuracy (DQ) before providing it to the internal transformations.
  • Used the Address Validator transformation for validating customer addresses from various countries by using the SOAP interface.

Environment: Informatica Power Center 9.5.1, IDQ, Oracle 11g, UNIX, PL/SQL, SQL*Plus, TOAD, MS Excel.

Confidential

Jr. ETL Developer

Responsibilities:

  • Involved in creating Technical Specification Document (TSD) for the project.
  • Used Informatica for loading the historical data from various tables for different departments.
  • Involved in the development of Data Mart and populating the data marts using Informatica.
  • Created and maintained metadata and ETL documentation that supported business rules and detailed source to target data mappings.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Developed sessions using Server Manager and improved session performance.
  • Created reusable transformations and mapplets and used them across different mappings.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
  • Designed and coded maps that extracted data from existing source systems into the Data Warehouse.
  • Scheduled sessions and batch processes to run on demand, on time, or only once using Informatica Server Manager.
  • Managed migration in multi-vendor supported server and database environments.

Environment: Oracle RDBMS 9i, Informatica, JAVA, SQL*Plus Reports, SQL*Loader, XML, Toad
