Sr. ETL Informatica Developer Resume

Chicago, IL

SUMMARY:

  • 8+ years of experience in Information Technology with a strong background in Database development and Data warehousing.
  • Good experience in the design and implementation of Data Warehousing and Business Intelligence solutions using ETL tools such as Informatica PowerCenter, Informatica Developer (IDQ), Informatica PowerExchange and Informatica Intelligent Cloud Services (IICS).
  • Good understanding of the design, architecture and implementation of Informatica MDM projects; built mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and Unstructured Data transformation.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2 and Type 3), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, Star/Snowflake modeling, data marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
  • Experience using the Informatica IICS tool for data integration and data migration from multiple source systems into Azure SQL Data Warehouse.
  • Experience with IICS concepts relating to data integration, the Monitor and Administrator consoles, deployments, permissions and schedules.
  • Experience integrating Azure SQL Data Warehouse with IICS using the native v2 and v3 connectors and the Microsoft connector with pushdown optimization (PDO) support for Azure SQL.
  • Good understanding of Azure SQL Data Warehouse concepts such as storage, distribution, DWU units, resource user groups and connection strings.
  • Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
  • Experience working with the Informatica Data Quality tool, using the Address Validator transformation for Address Doctor implementations.
  • Worked in different phases of the projects involving Requirements Gathering and Analysis, Design, Development, Testing, Deployment and Support.
  • Good Knowledge on Database architecture of OLTP and OLAP applications, Data Analysis.
  • Worked with wide variety of sources like Relational Databases, Flat Files, XML files, Mainframes, Salesforce Objects, Unstructured data files and Scheduling tools like CA7, Control-M and Informatica Scheduler.
  • Experience working with various Informatica concepts like partitioning, Performance tuning, identifying bottlenecks, deployment groups, Informatica scheduler and more.
  • Strong understanding of requirements and of solutions to implementation issues throughout the Software Development Life Cycle (SDLC).
  • Hands on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes), UNIX Shell scripting and Windows Batch scripting.
  • Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica mappings.
  • Experience working in production support, supporting emergency fixes and migrating fixes from lower to higher environments per policy.
  • Very good exposure to Oracle 11g/10g/9i and MS SQL Server 2012/2008/2005.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5.1/9.1/8.6.x, Informatica PowerExchange 10.2/9.5.1, Informatica Developer 9.6.1, Informatica Intelligent Cloud Services (IICS)

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2012/2008/2005, Azure SQL Data Warehouse, DB2, Teradata

Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, Control-M.

Reporting Tools: MicroStrategy and Hyperion Essbase.

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, Java, Shell, Perl, Batch

Operating Systems: Windows 7/XP/NT/2000, DOS, UNIX and LINUX

Other Tools: SQL*Plus, Toad, SQL Developer, Putty, WINSCP, MS-Office.

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Sr. ETL Informatica Developer

Technical Environment: Informatica IICS, Oracle 11g, SQL Server 2016, Azure SQL Data Warehouse, Teradata v15, flat files, Excel files, Salesforce, Cognos Reporting, Batch and Python scripting

Responsibilities:

  • Interacted with business users to identify process metrics and key dimensions and facts; involved in the full life cycle of the project.
  • Assisted the architect in developing the STG/ODS/Hub/dimensional warehouse layers in Azure SQL Data Warehouse.
  • Assisted in defining logical and physical database models for building a new enterprise data warehouse in the cloud to replace the existing on-premise warehouse.
  • Identified ETL specifications based on business requirements and created ETL mapping documents and high-level documentation for product owners and data managers.
  • Defined modeling and naming standards and best practices for the modeling team to use in data models as well as in DDLs and DMLs when creating new data elements and adding attributes.
  • Effectively used the IICS Data Integration console to create mapping templates to bring data into the staging layer from source systems such as SQL Server, Oracle, Teradata, Salesforce, flat files, Excel files and PowerExchange CDC.
  • Experience working with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields and bulk field renaming.
  • Experience working with Informatica PowerExchange integration with IICS to read data from condense files and load it into the Azure SQL Data Warehouse environment.
  • Experience with PWX concepts including registration maps, logger, listener and condense files.
  • Experience working with IICS with full pushdown optimization to move data from staging to ODS at scale using data integration templates.
  • Experience working with custom-built queries to load dimensions and facts in IICS.
  • Experience working with various Azure SQL Data Warehouse connectors, including the v2 and v3 connectors.
  • Experience working with Microsoft ODBC drivers to enable full PDO for loads within the SQL data warehouse.
  • Experience working with the Salesforce connector to read data from Salesforce objects into the cloud warehouse using IICS.
  • Experience working with IICS monitoring and administrator concepts.
  • Experience working with data integration concepts including mappings, mapping configuration tasks, taskflows, deployment using Git automation, schedules, connections and API integration.
  • Experience working with key-range partitioning in IICS, handling file loads with the file list option, creating fixed-width file formats and using the file listener.
  • Experience integrating data using IICS for reporting needs.
  • Experience building a semantic layer after fact loads for reporting tools to connect to the data warehouse.
  • Responsible for deployments in Higher Environments and prod support for warranty period before turning over to managed services.
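As an illustration of the key-range partitioning mentioned above, the sketch below splits an integer key interval into contiguous ranges the way parallel readers are assigned key ranges; the function name and the 1..1000 customer-key range are hypothetical, not from any IICS API.

```python
def key_ranges(start, end, partitions):
    """Split the integer key interval [start, end] into contiguous,
    non-overlapping ranges, one per partition (hypothetical helper
    mimicking key-range partitioning for parallel reads)."""
    size = (end - start + 1) // partitions
    ranges, lo = [], start
    for i in range(partitions):
        # the last partition absorbs any remainder
        hi = end if i == partitions - 1 else lo + size - 1
        ranges.append((lo, hi))
        lo = hi + 1
    return ranges

# e.g. split customer_id 1..1000 across 4 parallel readers
print(key_ranges(1, 1000, 4))
# → [(1, 250), (251, 500), (501, 750), (751, 1000)]
```

Each reader then extracts only rows whose key falls in its range, so the four reads can run concurrently without overlap.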

Confidential, Livonia, MI

Sr. ETL Informatica Developer

Technical Environment: Informatica PowerCenter 9.6.1, Oracle 11g, SQL Server 2012, DB2, flat files, XML files, Informatica IDQ, B2B Data Transformation, Informatica Cloud, Informatica Address Doctor, Batch scripting, MicroStrategy.

Responsibilities:

  • Good understanding of business requirements, technical specifications, source repositories and physical data models for project activities.
  • Experience in creating high level documents, Source to Target mapping document and detailed design level document for the entire ETL process.
  • Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML and flat files.
  • Project involved usage of most of the transformations like Transaction Control, Active and Passive look up transformation, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured data transformation, SQL transformation and more.
  • Extensive implementation of Incremental/Delta loads with the help of various concepts like mapping variable, mapping parameter and parameter table concept.
  • Created ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
  • Developed mappings to load data into landing layer, staging layer and publish layer with extensive usage of SCD Type I and SCD Type II development concept.
  • Experience developing SCD Type I and Type II with the help of the MD5 hash function.
  • Experience working with B2B data transformation concept of Informatica.
  • Experience working with advanced Informatica transformation like unstructured data transformation for parsing HL7 data file.
  • Experience working in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
  • Exported the mapplets from IDQ into Informatica PowerCenter for use in various mappings implementing Address Doctor.
  • Hands on experience working on profiling data using IDQ.
  • Experience extracting data from and loading data directly into Salesforce objects using Informatica PowerCenter.
  • Experience working with various session properties to extract data from Salesforce objects using the standard API and the Bulk API.
  • Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created reusable transformations, mapplets and worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Loaded data from Unstructured file format using unstructured data transformation into Oracle database.
  • Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
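The MD5-based SCD Type II change detection described in this section can be sketched in Python; the table and column names are hypothetical, and a real implementation would run as Informatica mapping logic (Lookup, Expression with MD5, Update Strategy) rather than Python.

```python
import hashlib
from datetime import date

def row_hash(row, tracked_cols):
    """MD5 digest over the tracked attribute columns; comparing one hash
    is cheaper than comparing every column when detecting changes."""
    payload = "|".join(str(row[c]) for c in tracked_cols)
    return hashlib.md5(payload.encode()).hexdigest()

def scd2_merge(dim, incoming, key, tracked_cols):
    """SCD Type II merge: when the hash of the tracked attributes differs,
    expire the current dimension row and insert a new version."""
    today = date.today().isoformat()
    for rec in incoming:
        new_hash = row_hash(rec, tracked_cols)
        current = next((r for r in dim
                        if r[key] == rec[key] and r["is_current"]), None)
        if current is None or current["md5"] != new_hash:
            if current is not None:        # changed row: close the old version
                current["end_date"] = today
                current["is_current"] = False
            dim.append({**rec, "md5": new_hash, "eff_date": today,
                        "end_date": None, "is_current": True})
        # identical hash: unchanged row, nothing to do
```

A daily run compares staged rows' hashes against the current dimension versions, so unchanged rows are skipped without column-by-column comparison.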

Confidential, Cleveland, OH

ETL Technical Onsite Lead

Technical Environment: Informatica 9.5.1, Oracle 11g, SQL Server 2008, PowerExchange 9.5.1, flat files, Mainframe, Teradata, Toad for Oracle 11, Harvest Version Control tool, Windows Server, UNIX, Perl Scripting and CA7 Scheduler

Responsibilities:

  • Analyzed business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica PowerCenter.
  • Used EPIC Clarity tables for reporting and analytics purposes.
  • Worked extensively on transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator.
  • Developed ETL Informatica mappings to load data into the staging area; extracted data from Mainframe files, flat files and SQL Server and loaded it into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Wrote stored procedures and functions for data transformations and integrated them with Informatica programs and existing applications.
  • Worked on SQL overrides for the generated SQL queries in Informatica.
  • Involved in unit testing to validate data from the different data sources.
  • Developed workflows for dimension loads, fact loads based on daily/monthly runs.
  • Developed code to archive monthly data into history tables and effective use of the history table to load the data back into the system for a particular past month.
  • Developed Audit tables to keep the track of ETL Metrics for each individual run.
  • Experience working with Audit Balance control concept to create the parameter files dynamically for each workflow before its run.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experience dealing with partitioned tables and automating partition drop and create in the Oracle database.
  • Involved in migrating the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL queries to make sure all the modules were integrated correctly.
  • Developed Informatica SCD type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, mapplets and others.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Involved in performance tuning for a better data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Worked on Direct Connect NDM process to transfer the files between servers.
  • Documented and presented the production/support documents for the developed components when handing over the application to the production support team.
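The Audit Balance Control idea of generating parameter files dynamically before each workflow run can be sketched as below; the folder, workflow and parameter names are hypothetical, and a real implementation would read the values from the ABC control tables rather than hard-code them.

```python
def write_param_file(path, folder, workflow, params):
    """Write an Informatica PowerCenter-style parameter file for one
    workflow run (values would normally come from ABC control tables)."""
    lines = [f"[{folder}.WF:{workflow}]"]                 # section header
    lines += [f"{name}={value}" for name, value in params.items()]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# e.g. parameters generated for tonight's dimension load
write_param_file("wf_load_dim.par", "DW_FOLDER", "wf_load_dim",
                 {"$$RUN_ID": 42, "$$LOAD_DATE": "2015-06-01"})
```

The scheduler would call this step immediately before the workflow, so each run picks up a fresh run ID and load date from the control tables.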

Confidential, Farmington Hills, MI

ETL Informatica developer

Technical Environment: Informatica Power Center 9.1/8.6.x, Power Exchange, PL/SQL, Oracle 10g/9i, SQL Server 2008/2005, Windows Server 2003, UNIX, Tivoli Scheduler, Toad

Responsibilities:

  • Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
  • Involved in creating Logical and Physical Data Models and creating Star Schema models.
  • Assisted in designing Star Schema to design dimension and fact tables.
  • Involved in developing star schema model for target database using ERWIN Data modeling.
  • Worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Created complex mappings using Unconnected and Connected Lookups, Aggregator and Router transformations to populate target tables efficiently.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files
  • Created Mapplet and used them in different Mappings.
  • Used the Sorter transformation and the dynamic lookup cache.
  • Created events and tasks in the workflows using Workflow Manager.
  • Developed Informatica mappings and also tuned them for better performance with PL/SQL Procedures/Functions to build business rules to load data.
  • Created Schema objects like Indexes, Views and Sequences.
  • Provided daily production support for scheduled jobs, resolving and fixing failed jobs.
  • Created Stored Procedures, Functions and Indexes Using Oracle.
  • Worked on batch processing and file transfer protocols.
  • Performance tuning of long-running workflows.
  • Analyzed enhancement requests coming from individual clients and implemented them.
  • Created technical documents.
  • Developed mappings for policy, claims dimension tables.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Simulated real time transaction feed by fabricating transactions required for UAT and manually calculating expected results to generate expectancy going into the testing process.
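The policy and claims dimension/fact loads above rely on resolving natural keys to surrogate keys. A minimal sketch of that lookup step follows, with hypothetical customer and policy keys; in PowerCenter this corresponds to cached Lookup transformations plus a Router for no-match rows.

```python
def build_lookup(dim_rows, natural_key, surrogate_key):
    """Index a dimension on its natural key, like a cached Lookup
    transformation over the dimension table."""
    return {r[natural_key]: r[surrogate_key] for r in dim_rows}

def load_fact(staging, cust_lookup, policy_lookup):
    """Resolve natural keys to surrogate keys; rows with no dimension
    match are routed aside (late-arriving members), like a Router
    splitting matched and unmatched rows."""
    facts, errors = [], []
    for row in staging:
        cust_key = cust_lookup.get(row["cust_id"])
        policy_key = policy_lookup.get(row["policy_no"])
        if cust_key is None or policy_key is None:
            errors.append(row)            # no dimension match yet
        else:
            facts.append({"cust_key": cust_key, "policy_key": policy_key,
                          "amount": row["amount"]})
    return facts, errors
```

Unmatched rows can be reprocessed on the next run, after the late-arriving dimension members have been loaded.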
