
ETL/Informatica Developer Resume

Woonsocket, RI

SUMMARY

  • Overall 6+ years of Software Life Cycle experience in System Analysis, Design, Development, Implementation, Maintenance, and Production Support of Data Warehouse applications.
  • Extensive experience in ETL/Informatica Power Center and data integration, developing ETL mappings and scripts using Informatica Power Center 10.x/9.x/8.x/7.x and IDQ.
  • Have clear understanding of Data Warehousing and BI concepts with emphasis on ETL and life cycle development using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Experience in data extraction from heterogeneous sources using Teradata utilities and Informatica Power Center.
  • Experience in creating High Level Design and Detailed Design documents in the Design phase.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges with large data sets.
  • Strong experience in designing and developing complex mappings from varied transformation logic like Unconnected and Connected, Static and Dynamic lookups, Java, SQL, Stored Procedure, Router, Filter, Expression, Aggregator, Joiner, Update Strategy.
  • Strong knowledge of Entity - Relationship concept, Facts, and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snowflake Schema).
  • Worked on migrating table DDLs, views, and stored procedures from Hive to T-SQL, Netezza to Snowflake, and Teradata to Snowflake.
  • Expertise in working with Teradata Stored Procedures, Oracle Stored Programs, Packages, Cursors, Triggers, Tables, Constraints, Views, Indexes, Sequence, and Synonyms in distributed environment.
  • Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
  • Good understanding of relational database management systems like Oracle, DB2, SQL Server and worked on Data Integration using Informatica for the Extraction transformation and loading of data from various database source systems.
  • Extensive experience in implementing projects using Agile (Kanban) and Waterfall methodologies.
  • Created project plans to carry out testing for implementation and upgrade of an existing legacy system with Guidewire PolicyCenter and Guidewire BillingCenter.
  • Loaded and exported data using Teradata utilities such as TPT, FastLoad, MultiLoad, FastExport, and TPump.
  • Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, and Cache Management.
  • Extensively created Mapplets, common functions, reusable transformations, look-ups for better usability.
  • Extensively used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
  • Experience in UNIX shell scripting, job scheduling and server communication.
  • Involved in Unit testing and System testing to verify that data loaded into targets is accurate.
  • Extensive database experience and highly skilled in SQL Server, Oracle, DB2, Sybase, XML Files, Flat Files, MS Access.
  • Excellent communication and problem-solving skills; results-oriented, works well with minimal supervision, and a strong team player.
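The slowly changing dimension work summarized above (SCD Type 2) can be illustrated with a minimal sketch. This is a hypothetical example using Python's sqlite3 standing in for a warehouse database; the `dim_customer` table, its columns, and the `apply_scd2` helper are illustrative assumptions, not taken from any actual project schema.

```python
import sqlite3

# Hypothetical dimension table; in practice this logic lives in an
# Informatica mapping (Lookup + Update Strategy transformations).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Boston', '2020-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] == new_city:
        return  # no change: nothing to do
    # SCD Type 2: close out the old version instead of overwriting it
    cur.execute(
        "UPDATE dim_customer SET end_date=?, is_current=0 "
        "WHERE customer_id=? AND is_current=1",
        (load_date, customer_id),
    )
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, new_city, load_date),
    )

apply_scd2(cur, 1, "Providence", "2021-06-01")
cur.execute("SELECT city, is_current FROM dim_customer WHERE customer_id=1 ORDER BY eff_date")
print(cur.fetchall())  # [('Boston', 0), ('Providence', 1)]
```

The same change applied as SCD Type 1 would simply overwrite `city` in place, losing the history row.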

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.x/9.x/8.x, IICS, Data Cleansing, IDQ, Repository, Metadata, Data Mart, OLAP, OLTP, SQL Server SSIS.

Data modeling tools: Erwin

Databases: Oracle 11g/10g/9i/8i, Confidential - DB2, MS SQL Server, BigQuery

Other Tools: Toad, SQL Developer, Crystal Reports, SQL Assistant

Programming Languages: SQL, Java, PL/SQL, T-SQL, UNIX Shell Scripting

Job scheduling: Shell Scripting, AutoSys, Tidal, Control-M

Environment: MS Windows 2012/2008/2005, UNIX

PROFESSIONAL EXPERIENCE

Confidential, Woonsocket, RI

ETL/Informatica Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the Data Marts.
  • Worked on Informatica Power Center tools- Mapping Designer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools like Source Analyzer, Target designer, Mapping designer, Mapplet designer, Transformation Developer.
  • Experience with Snowflake Multi-Cluster Warehouses; built the logical and physical data models for Snowflake as per the required changes.
  • In-depth knowledge of Snowflake databases, schemas, and table structures.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Java, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Created workflows and sessions for each mapping, extracting from source systems to the staging area and from staging to the target.
  • Extensively created Re-Usable Transformations and Mapplets to standardize business logic.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from multiple source systems.
  • Responsibilities include creating the sessions and scheduling the sessions.
  • Translated Business specifications into PL/SQL code. Extensively used Explain Plan. Designed, Developed and fine-tuned Oracle Stored Procedures and triggers.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
  • Documented and presented the production/support documents for the components developed, when handling over the application to the product support team.
  • Effectively used the IICS Data Integration console to create mapping templates to bring data into the staging layer from different source systems such as SQL Server, Oracle, Teradata, Salesforce, Flat Files, and Excel Files.
  • Experience working with IICS transformations like Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, and renaming bulk fields.
  • Unit tested data between Teradata and Snowflake, Netezza and Snowflake, and Hive and T-SQL.
  • Experience working with Custom Built Query to load dimensions and Facts in IICS.
  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Experience working with Data integration concepts not limited to mapping, mapping configuration task, Task flows, deployment using GIT automation, schedules, connections, API integration.
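The cross-platform unit testing above (Teradata vs. Snowflake, Netezza vs. Snowflake, Hive vs. T-SQL) boils down to reconciling row counts and aggregates between a source and a target. A minimal sketch, assuming two sqlite3 connections standing in for the two systems; the `orders` table, its data, and the `reconcile` helper are hypothetical:

```python
import sqlite3

# "src" and "tgt" stand in for the actual source and warehouse connections.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
src.executemany("INSERT INTO orders VALUES (?, ?)", rows)
tgt.executemany("INSERT INTO orders VALUES (?, ?)", rows)

def reconcile(src, tgt, table, amount_col):
    # Compare a row count and a simple aggregate between the two systems;
    # real validation SQL would also hash or sample individual columns.
    s_cnt, s_sum = src.execute(f"SELECT COUNT(*), SUM({amount_col}) FROM {table}").fetchone()
    t_cnt, t_sum = tgt.execute(f"SELECT COUNT(*), SUM({amount_col}) FROM {table}").fetchone()
    return s_cnt == t_cnt and abs(s_sum - t_sum) < 1e-9

print(reconcile(src, tgt, "orders", "amount"))  # True
```

In practice the same COUNT/SUM pair is run as native SQL on each platform and the results compared side by side.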

Environment: Informatica Power Center 10.1, Snowflake, Hive, T-SQL, IICS, Oracle 11g, UNIX, PL/SQL, SQL*Plus, TOAD 14.0, MS Excel, ActiveBatch V12 Console, Cognos, BigQuery, SQL Server Management Studio 2016, GIT, SQL, Jira.

Confidential, Southbury, CT

ETL/Informatica Developer

Responsibilities:

  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
  • Integrated Guidewire with QAS.com using Web services to implement address verification functionality.
  • Extensively used Teradata SQL Assistant to write different queries to understand the data in source tables.
  • Participated in Kanban meetings held twice a week to track project progress.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Extracted data from Oracle using SQL scripts and loaded it into Teradata using FastLoad/MultiLoad, transforming it according to business transformation rules to insert/update data in the data marts.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Used the Address Validator transformation to validate customer addresses from various countries using the SOAP interface.
  • Worked on Data Quality checks for data feeds and performance tuning.
  • Worked on data analysis to find data duplication and existing data patterns using a data profiling tool (IDE).
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Checked data validated by third parties for accuracy (DQ) before providing it to internal transformations.
  • Experience in scheduling of ETL jobs using Autosys, Tidal, Control-M.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
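The data quality checks described above (duplicate detection, data patterns, feed validation) can be sketched with a simple profiling pass. This is an illustrative stand-in for what IDQ/IDE profiling reports; the `stg_customer` table, its data, and the `profile` helper are assumptions made for the example:

```python
import sqlite3

# Hypothetical staging table with one NULL email and one duplicate key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customer (customer_id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO stg_customer VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (2, "b@x.com"), (3, "c@x.com")],
)

def profile(conn, table, key_col, check_col):
    """Return basic DQ stats: row count, NULLs in a column, duplicated keys."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {check_col} IS NULL"
    ).fetchone()[0]
    dup_keys = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return {"rows": total, "null_" + check_col: nulls, "duplicate_keys": dup_keys}

print(profile(conn, "stg_customer", "customer_id", "email"))
# {'rows': 4, 'null_email': 1, 'duplicate_keys': 1}
```

The same counts, tracked per feed over time, feed the weekly/monthly success-failure statistics mentioned above.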

Environment: Informatica Power Center 9.6, Oracle 11g, UNIX, PL/SQL, IDQ, Teradata V13.0, SQL*Plus, TOAD, Teradata SQL Assistant, MS Excel.

Confidential, Lowell, AR

ETL/Informatica Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the Data Marts.
  • Worked on Informatica Power Center tools- Mapping Designer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools like Source Analyzer, Target designer, Mapping designer, Mapplet designer, Transformation Developer.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Java, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Created workflows and sessions for each mapping, extracting from source systems to the staging area and from staging to the target.
  • Extensively created Re-Usable Transformations and Mapplets to standardize business logic.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from multiple source systems.
  • Responsibilities include creating the sessions and scheduling the sessions.
  • Translated Business specifications into PL/SQL code. Extensively used Explain Plan. Designed, Developed and fine-tuned Oracle Stored Procedures and triggers.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance in pre- and post-session management.
  • Strong knowledge of the PL/SQL Wrapper utility to protect PL/SQL procedures and packages.
  • Designed an end-to-end solution, including the database model, workflows, and job schedule, for the IDQ Cleansing/Matching and the IDQ Data Mart.
  • Scheduled jobs using the IDQ Scheduler by deploying the workflows as an application to the Data Integration Service.
  • Exported Mapplets from IDQ into Informatica Power Center for use in various mappings implementing Address Doctor.
  • Created dictionaries using Informatica Data Quality (IDQ) to cleanse and standardize data, and worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Extensively worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
  • Used the Address Doctor Geo-coding table to validate the address and performed exception handling reporting and monitoring the data.
  • Used Address validator transformation in IDQ.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Implemented Informatica MDM workflows, including data profiling, configuration specification, match-rule coding, tuning, and migration.
  • Defined and built best practices for creating business rules within the Informatica MDM solution.
  • Performed the land process to load data into MDM Hub landing tables using external batch processing for the initial data load into the hub store, and defined the automation process for stage, load, match, and merge.
  • Worked extensively to develop customized MDM services.
  • Created the necessary batch interfaces to and from the MDM hub.
  • Maintained MDM jobs and Master Data sequences; built test scripts for unit testing of customized MDM code.
  • Actively participated in Scrum Meetings.
  • Designed and developed the ETL processes using Informatica to load data from Oracle, Flat files and XML files to target Oracle Data warehouse.
  • Created partitioned tables, indexes for manageability and scalability of the application.
  • Leveraged Workflow Manager for session management, database connection management, and scheduling of jobs.
  • Developed and tuned various mappings using Oracle and SQL*Plus in the ETL process.
  • Involved in testing of Stored Procedures and Functions, and Unit and Integration testing of Informatica Sessions, Batches, and the Target Data.
  • Created Control-M jobs and scheduled them.
  • Experience in scheduling of ETL jobs using Autosys, Tidal, Control-M.
  • Created Test cases and Test Plans for unit testing.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
  • Documented and presented the production/support documents for the components developed, when handling over the application to the product support team.
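The MDM match-and-merge steps above can be sketched as a deterministic match rule plus a survivorship pass. This is a simplified illustration, not Informatica MDM Hub behavior: the record fields, the name+ZIP match key, and the "first non-empty value wins" survivorship rule are all assumptions made for the example.

```python
# Deterministic match rule: records agreeing on normalized name + ZIP
# are considered the same master record.
def match_key(rec):
    return (rec["name"].strip().lower(), rec["zip"])

def merge(records):
    """Fold matching records into master records with simple survivorship."""
    masters = {}
    for rec in records:
        key = match_key(rec)
        if key in masters:
            # Survivorship: keep the first non-empty value seen per field
            for field, value in rec.items():
                if not masters[key].get(field):
                    masters[key][field] = value
        else:
            masters[key] = dict(rec)
    return list(masters.values())

records = [
    {"name": "John Smith", "zip": "02895", "phone": ""},
    {"name": "john smith ", "zip": "02895", "phone": "555-0100"},
    {"name": "Jane Doe", "zip": "06488", "phone": "555-0199"},
]
print(merge(records))
# The two "John Smith" rows collapse to one master with the phone filled in.
```

Real MDM Hub match rules add fuzzy matching, trust scores, and configurable survivorship per source system; the fold-and-fill structure is the same.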

Environment: Informatica Power Center 9.6, MDM, Oracle 11g, UNIX, PL/SQL, SQL*Plus, TOAD, MS Excel

Confidential

ETL/Informatica Developer

Responsibilities:

  • Involved in creating Technical Specification Document (TSD) for the project.
  • Used Informatica for loading the historical data from various tables for different departments.
  • Involved in the development of Data Mart and populating the data marts using Informatica.
  • Designed ETL process to translate the business requirements into Mappings using Informatica Power Center - Source Analyzer, Warehouse designer, Mapping Designer, Workflow Manager/Monitor.
  • Created and maintained metadata and ETL documentation that supported business rules and detailed source to target data mappings.
  • Experienced in working with Teradata ETL tools like FastLoad, MultiLoad, TPump, and FastExport.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Developed sessions using Server Manager and improved their performance.
  • Created reusable transformations and mapplets to use across different mappings.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
  • Designed and coded maps which extracted data from existing source systems into the data warehouse.
  • Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Server Manager.
  • Managed migration in a multi-vendor supported Server and Database environments.
  • Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.

Environment: Informatica Power Center, Teradata SQL Assistant 12.0, Oracle RDBMS 9i, Java, SQL*Plus Reports, SQL*Loader, XML, Toad, PL/SQL, Unix Shell Scripting, Windows.
