
ETL/Informatica Developer Resume

Denver, CO

SUMMARY:

  • Over 7 years of software development life cycle experience in system analysis, design, development, implementation, maintenance, and production support of data warehouse applications.
  • Extensive ETL and data integration experience developing mappings and scripts using Informatica Power Center 10.x/9.x/8.x/7.x and IDQ.
  • Have clear understanding of Data Warehousing and BI concepts with emphasis on ETL and life cycle development using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Experience in data extraction from heterogeneous sources using Teradata and Informatica Power Center.
  • Experience in creating High Level Design and Detailed Design in the Design phase.
  • Extensively worked on ETL mappings, analysis, and documentation of OLAP report requirements. Solid understanding of OLAP concepts and the challenges of large data sets.
  • Strong experience in designing and developing complex mappings from varied transformation logic like Unconnected and Connected, Static and Dynamic lookups, Java, SQL, Stored Procedure, Router, Filter, Expression, Aggregator, Joiner, Update Strategy.
  • Strong knowledge of Entity - Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
  • Expertise in working with Teradata stored procedures and Oracle stored programs, packages, cursors, triggers, tables, constraints, views, indexes, sequences, and synonyms in a distributed environment.
  • Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
  • Good understanding of relational database management systems like Oracle, DB2, SQL Server and worked on Data Integration using Informatica for the Extraction transformation and loading of data from various database source systems.
  • Extensive experience in implementing projects using Agile (Kanban) and Waterfall methodologies.
  • Worked on IDQ/IDE tools for data profiling, data enrichment, and standardization.
  • Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations. Experience in data profiling and analyzing the scorecards to design the data model.
  • Created a project plan to carry out testing for implementation and upgrade of an existing legacy system with Guidewire PolicyCenter and Guidewire BillingCenter.
  • Loaded and exported data using Teradata utilities such as TPT, FastLoad, MultiLoad, FastExport, and TPump.
  • Worked on Real Time Integration between MDM Hub and External Applications using Power Center.
  • Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, Cache Management.
  • Extensively created Mapplets, common functions, reusable transformations, look-ups for better usability.
  • Extensively used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
  • Experience in UNIX shell scripting, job scheduling and server communication.
  • Involved in Unit testing, System testing to check whether the data loads into target are accurate.
  • Extensive database experience and highly skilled in SQL Server, Oracle, DB2, Sybase, XML Files, Flat Files, MS Access.
  • Excellent communication and problem-solving skills; results-oriented, works with minimal supervision, and a strong team player.
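The Teradata load utilities listed above (FastLoad, MultiLoad, TPump) are typically driven by control scripts generated and submitted from the shell. A minimal sketch of generating a FastLoad control file for a staging load; the logon, table, and file names are all hypothetical placeholders, and exact command ordering varies by site standard:

```shell
#!/bin/sh
# Sketch: generate a Teradata FastLoad control script for a staging load.
# All object names (logon, database, table, input file) are illustrative.

CTL=stg_customer.fastload

cat > "$CTL" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
DROP TABLE stg.customer_err1;
DROP TABLE stg.customer_err2;
.SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(60)),
       city      (VARCHAR(30))
FILE = customer_extract.dat;

BEGIN LOADING stg.customer ERRORFILES stg.customer_err1, stg.customer_err2;
INSERT INTO stg.customer VALUES (:cust_id, :cust_name, :city);
END LOADING;
.LOGOFF;
EOF

echo "Generated $CTL"
# In a real environment the script would be submitted with:
#   fastload < "$CTL"
```

The control file is generated rather than run here, so the sketch stands alone without a Teradata client installed.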

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.1/9.6/9.1/8.6.1/8.1 (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager), Data Cleansing, Data Quality, Repository, Metadata, Data Mart, OLAP, OLTP, IDQ, MDM, SQL Server SSIS.

Data modeling tools: Erwin

Databases: Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server

Other Tools: Toad, SQL Developer, Crystal Reports, SQL Assistant

Programming Languages: SQL, Java, PL/SQL, T-SQL, UNIX Shell Scripting

Job scheduling: Shell Scripting, Autosys, Tidal, Control-M

Environment: MS Windows 2012/2008/2005, UNIX

PROFESSIONAL EXPERIENCE:

Confidential, Denver, CO

ETL/Informatica Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
  • Worked on Informatica Power Center tools - Mapping Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools like Source Analyzer, Target designer, Mapping designer, Mapplet designer, Transformation Developer.
  • Created mappings using transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Java, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Created workflows and sessions for each mapping, extracting from source systems to the staging area and from the staging area to the target.
  • Extensively created reusable transformations and mapplets to standardize business logic.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from multiple source systems.
  • Responsibilities include creating the sessions and scheduling the sessions.
  • Translated Business specifications into PL/SQL code. Extensively used Explain Plan. Designed, Developed and fine-tuned Oracle Stored Procedures and triggers.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session management and performance.
  • Used the PL/SQL Wrapper utility to protect PL/SQL procedures and packages.
  • Designed End2End Solution including Database Model, Workflows and Job Schedule for the IDQ Cleansing/Matching and IDQ Data Mart.
  • Scheduled jobs using the IDQ scheduler by deploying the workflows as an application to the Data Integration Service.
  • Exported mapplets from IDQ into Informatica Power Center for use in various mappings implementing Address Doctor.
  • Created dictionaries using Informatica Data Quality (IDQ) to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Extensively worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
  • Used the Address Doctor Geo-coding table to validate the address and performed exception handling reporting and monitoring the data.
  • Used Address validator transformation in IDQ.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Implemented Informatica MDM workflows, including data profiling, configuration specification, match-rule coding, tuning, and migration.
  • Defined and built best practices for creating business rules within the Informatica MDM solution.
  • Performed the land process to load data into MDM Hub landing tables using external batch processing for the initial data load into the hub store, and defined the automation process for staging, loading, match, and merge.
  • Worked extensively to develop customized MDM services.
  • Created the necessary batch interfaces to and from the MDM hub.
  • Maintained MDM jobs and master data sequences; built test scripts for unit testing of customized MDM code.
  • Actively participated in Scrum Meetings.
  • Designed and developed the ETL processes using Informatica to load data from Oracle, Flat files and XML files to target Oracle Data warehouse.
  • Created partitioned tables, indexes for manageability and scalability of the application.
  • Leveraged workflow manager for session management, database connection management and scheduling of jobs.
  • Developed various mapping and tuning using Oracle and SQL*Plus in the ETL process.
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Created control-M jobs and scheduled them.
  • Experience in scheduling of ETL jobs using Autosys, Tidal, Control-M.
  • Created Test cases and Test Plans for unit testing.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
  • Documented and presented the production/support documents for the components developed, when handling over the application to the product support team.
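The pre- and post-session shell work described above commonly drops indexes before a bulk load and rebuilds them afterwards. A minimal sketch; all object names are hypothetical, and in practice the emitted SQL would be piped to sqlplus from the session's pre/post-session commands:

```shell
#!/bin/sh
# Sketch of a pre/post-session helper for an Informatica session:
# emit the SQL to drop indexes before a bulk load and recreate them
# afterwards. Table and index names are illustrative placeholders.

emit_sql() {
    case "$1" in
        pre)
            # Dropping the index and disabling logging speeds the bulk insert.
            echo "ALTER TABLE sales_fact NOLOGGING;"
            echo "DROP INDEX sales_fact_dt_idx;"
            ;;
        post)
            # Rebuild the index and restore logging after the load completes.
            echo "CREATE INDEX sales_fact_dt_idx ON sales_fact (sale_date);"
            echo "ALTER TABLE sales_fact LOGGING;"
            ;;
    esac
}

emit_sql pre
# Production usage would resemble:  emit_sql pre | sqlplus -s "$ETL_CONN"
```

Keeping the SQL in one script for both phases avoids the drop and rebuild statements drifting apart.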

Environment: Informatica Power Center 10.1, MDM, Oracle 11g, UNIX, PL/SQL, SQL*Plus, TOAD, MS Excel

Confidential, Phoenix, AZ

ETL/Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Developed rules and Mapplets that are commonly used in different mappings.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
  • Integrated Guidewire with QAS.com using Web services for address verification functionality implementation.
  • Extensively used Teradata SQL Assistant to write different queries to understand the data in source tables.
  • Participated in Kanban meetings held twice a week to track the progress of the projects.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Extracted data from Oracle using SQL scripts and loaded it into Teradata using FastLoad/MultiLoad, transforming it according to business rules to insert/update the data in the data marts.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system.
  • Developed data reconciliation reports across various source systems and Teradata.
  • Responsibilities include creating the sessions and scheduling the sessions.
  • Translated Business specifications into PL/SQL code. Extensively used Explain Plan. Designed, Developed and fine-tuned Oracle Stored Procedures and triggers.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Created dictionaries using Informatica Data Quality (IDQ) to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Extensively worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
  • Used the Address Doctor Geo-coding table to validate the address and performed exception handling reporting and monitoring the data.
  • Used Address validator transformation in IDQ.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Used Address validator transformation for validating various customers address from various countries by using SOAP interface
  • Worked on Data Quality checks for data feeds and performance tuning.
  • Worked on data analysis to find data duplication and existing data patterns using a data profiling tool, IDE.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis, and enhanced existing production ETL scripts.
  • Validated third-party data for accuracy (DQ checks) before providing it to the internal transformations.
  • Experience in scheduling of ETL jobs using Autosys, Tidal, Control-M.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
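Shell scripts that cleanse source files before a session typically normalize whitespace and remove blank or duplicate records. A self-contained sketch; the file names and the sample pipe-delimited extract are illustrative:

```shell
#!/bin/sh
# Sketch of a source-file cleansing step scheduled ahead of an Informatica
# session: trim whitespace, drop blank lines, and remove duplicate records
# from a pipe-delimited extract. File names and data are illustrative.

IN=customer_extract.dat
OUT=customer_extract.clean

# Illustrative dirty extract: padded fields, a blank line, a duplicate row.
printf 'C001|  Acme Corp |Denver\n\nC002|Beta LLC|Phoenix\nC001|  Acme Corp |Denver\n' > "$IN"

# sed trims blanks around delimiters and at line ends; awk drops blank
# lines (NF) and duplicate records (seen[]) while preserving input order.
sed -e 's/[[:space:]]*|[[:space:]]*/|/g' \
    -e 's/^[[:space:]]*//;s/[[:space:]]*$//' "$IN" \
  | awk 'NF && !seen[$0]++' > "$OUT"

cat "$OUT"
```

The `awk 'NF && !seen[$0]++'` idiom deduplicates without sorting, so record order is preserved for downstream loads that depend on it.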

Environment: Informatica Power Center 9.6, Oracle 11g, UNIX, PL/SQL, IDQ, Teradata V13.0, SQL*Plus, TOAD, Teradata SQL Assistant, MS Excel.

Confidential, San Francisco, CA

ETL Developer

Responsibilities:

  • Involved in Analysis, Design and Development, test and implementation of Informatica transformations and workflows for extracting the data from the multiple systems.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica.
  • Designed mapping templates to specify the high-level approach.
  • Involved in massive data cleansing and data profiling of the production data load.
  • Designed and implemented ETL environments using ETL strategies and tools such as Informatica Power Center 8.x/9.x, Power Exchange, Metadata Manager, IDQ, and B2B.
  • Developed several PowerShell scripts to gather data related to job history, policy evaluation, backup durations, and so on using Central Management Server and SSRS subscriptions.
  • Built the Physical Layer /Business Model and Mapping Layer/ Presentation Layer of a Repository by using Star Schemas.
  • Performed data validation, reconciliation and error handling in the load process.
  • Tested data to ensure it was properly masked; applied unit testing, integration testing, system testing, and data validation to the developed Informatica mappings.
  • Extensively worked with Informatica - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Repository server and Informatica server to load data from flat files, SQL Server.
  • Created scripts for transforming unstructured data to a structured format using Informatica B2B Data Transformation.
  • Involved in PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system.
  • Processed Vendor Address elements like Cleansing and converting using Informatica.
  • Created Informatica mappings to handle complex flat files and load data into the warehouse.
  • Designed the mappings between sources (files and databases) to operational staging targets.
  • Developed custom selection of reports ordering using SQL Server Reporting Services (SSRS)
  • Configured and implemented the first instance of Informatica B2B - Data Transformation and Data Exchange.
  • Used Aggregator, Sequence Generator, Lookup, Expression, Filter, Joiner, Rank, Router, and Update Strategy transformations in the data population process.
  • Involved in Informatica Repository migration.
  • Involved in using the Stored Procedures, Functions, Materialized views and Triggers at Data Base level and imported them in Informatica for ETL.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into oracle Server.
  • Translated Business specifications into PL/SQL code. Extensively used Explain Plan. Designed, Developed and fine-tuned Oracle Stored Procedures and triggers.
  • Logged defects and submitted change requests using the defects module of Test Director.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
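A flat-file restructuring step in the spirit of the B2B transformation work above can be sketched with awk: converting a fixed-width vendor file into a pipe-delimited layout that a mapping can consume. The column widths (10/20/5), file names, and sample data are all hypothetical:

```shell
#!/bin/sh
# Sketch: convert a fixed-width vendor file to a pipe-delimited layout.
# Column widths (id=10, name=20, qty=5) and file names are illustrative.

IN=vendor_fixed.dat
OUT=vendor_delim.dat

# Illustrative fixed-width input: 10-char id, 20-char padded name, 5-char qty.
printf 'V000000001ACME SUPPLY CO      00042\nV000000002BETA PARTS LLC      00007\n' > "$IN"

# substr() slices each record at the agreed offsets; gsub trims the
# trailing padding; %d strips leading zeros from the quantity.
awk '{
    id  = substr($0, 1, 10)
    nm  = substr($0, 11, 20)
    qty = substr($0, 31, 5)
    gsub(/[[:space:]]+$/, "", nm)
    printf "%s|%s|%d\n", id, nm, qty
}' "$IN" > "$OUT"

cat "$OUT"
```

Offsets live in one place (the awk block), so a change to the vendor layout is a one-line edit rather than a mapping rebuild.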

Environment: Informatica Power Center 9.1/8.6, PL/SQL, SSRS, B2B DT/DX, Oracle 11g, JCL, Teradata, SQL Server 2000, Windows 2000, Shell Scripting, Autosys.

Confidential

ETL Developer

Responsibilities:

  • Communicated effectively with data architects, designers, application developers, and senior management to collaborate on projects involving multiple teams in a highly time-sensitive environment.
  • Effectively involved in allocation and review of various development activities/tasks with onshore counterparts.
  • Maintained and tuned Teradata production and development systems.
  • Understood the business logic behind every piece of code and documented requirements in a reverse-engineering fashion.
  • Optimized Query Performance, Session Performance and Reliability, did performance tuning of Informatica components for daily and monthly incremental loading tables.
  • Designed ETL process to translate the business requirements into Mappings using Informatica Power Center - Source Analyzer, Warehouse designer, Mapping Designer, Workflow Manager/Monitor.
  • Used Timer, Event Raise, Event Wait, Decisions, and Email tasks in Workflow Manager.
  • Worked on loading of data from several flat files sources to Staging using Teradata TPUMP, MLOAD, FLOAD and BTEQ.
  • Used Workflow Manager for creating, validating, testing, and running sequential batches.
  • Involved in design, development and implementation of the Enterprise Data Warehouse (EDW) and Data Mart.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Schedule and Run Extraction and Load process and monitor sessions using Informatica Workflow Manager.
  • Experienced in working with Teradata ETL tools like Fast Load, Multi Load, Tpump and Fast Export.
  • Extensively used the Teradata Priority Scheduler to control system load.
  • Used Teradata Viewpoint to monitor system performance under load.
  • Used external tools like Name Parser and Address Cleansing for cleansing the data in source systems.
  • Designed mappings using Source qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy transformations and Mapplets load data into the target involving slowly changing dimensions.
  • Used Workflow Manager for creating and maintaining the Sessions and Workflow Monitor to monitor workflows.
  • Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Carried out unit and integration testing for Informatica mappings, sessions and workflows.
  • Used Autosys as a scheduling tool for triggering jobs.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Worked extensively with Teradata SQL Assistant to interface with Teradata.
  • Experienced with Teradata Manager, which is used to create alerts and monitor the system.
  • Used Perforce as a version control tool to maintain revision history for code.
  • Documented and presented the production/support documents for the components developed when handing-over the application to the production support team.
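A BTEQ script of the kind described above is usually generated and submitted from a shell wrapper so the load date can be parameterized. A minimal sketch; the logon, database, and table names are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch: shell wrapper generating a Teradata BTEQ script for an
# incremental insert/select. Logon, database, and table names are
# illustrative placeholders.

LOAD_DT=$(date +%Y-%m-%d)
BTQ=load_sales_incr.btq

# Unquoted EOF so $LOAD_DT is substituted into the generated script.
cat > "$BTQ" <<EOF
.LOGON tdprod/etl_user,etl_password;

INSERT INTO dw.sales_fact
SELECT * FROM stg.sales
WHERE  load_dt = DATE '$LOAD_DT';

.IF ERRORCODE <> 0 THEN .QUIT 1;
.LOGOFF;
.QUIT 0;
EOF

echo "Generated $BTQ for $LOAD_DT"
# Submitted in production as:  bteq < "$BTQ" >> load_sales_incr.log 2>&1
```

The `.IF ERRORCODE <> 0 THEN .QUIT 1` check propagates a SQL failure as a nonzero exit code, which is what a scheduler such as Autosys keys on.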

Environment: Informatica Power Center 7.X/8.1, Teradata SQL Assistant 12.0, Teradata V12.0R2, Oracle 10g/9i, MS SQL Server 2005, Business Objects, Autosys, Toad 7.6, SQL, PL/SQL, Unix Shell Scripting, Windows.

Confidential

ETL Developer

Responsibilities:

  • Involved in creating Technical Specification Document (TSD) for the project.
  • Used Informatica for loading the historical data from various tables for different departments.
  • Involved in the development of Data Mart and populating the data marts using Informatica.
  • Created and maintained metadata and ETL documentation that supported business rules and detailed source to target data mappings.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Developed sessions using Server Manager and improved session performance.
  • Created reusable transformations and mapplets and used them across different mappings.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
  • Designed and coded maps that extracted data from existing source systems into the data warehouse.
  • Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica Server Manager.
  • Managed migration in a multi-vendor supported Server and Database environments.
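Scheduled session runs like these are typically triggered through pmcmd, the PowerCenter command-line client. A sketch that builds (but only echoes) the command so it stands alone; the domain, service, folder, and workflow names are hypothetical:

```shell
#!/bin/sh
# Sketch of a scheduler hook that starts a PowerCenter workflow via pmcmd.
# Domain, service, folder, and workflow names are illustrative; credentials
# come from environment variables (-uv/-pv) rather than the command line.

DOMAIN=Domain_DW
INT_SVC=IS_DW
FOLDER=SALES_MART
WORKFLOW=wf_load_sales_daily

CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -uv PMUSER -pv PMPASS -f $FOLDER -wait $WORKFLOW"

# Echoed here instead of executed, since pmcmd requires a live
# Integration Service; a scheduler (Autosys/Control-M) would run $CMD
# and alert on a non-zero exit from the -wait'ed workflow.
echo "$CMD"
```

Using `-wait` makes pmcmd block until the workflow completes, so the scheduler's success/failure status reflects the load itself.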

Environment: Oracle RDBMS 9i, Informatica, Java, SQL*Plus Reports, SQL*Loader, XML, Toad
