
Sr. ETL Informatica Developer Resume


Boston, MA

SUMMARY

  • Over 10 years of experience in Information Technology with a strong background in database development and data warehousing.
  • Good experience in designing and implementing Data Warehousing and Business Intelligence solutions using ETL tools such as Informatica PowerCenter, Informatica Developer (IDQ), Informatica PowerExchange and Informatica Intelligent Cloud Services (IICS).
  • Good understanding of the design, architecture and implementation of Informatica MDM project mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, Unstructured Data transformation and more.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
  • Experience working with the Informatica IICS tool, effectively using it for data integration and data migration from multiple source systems into Azure SQL Data Warehouse.
  • Experience working with Azure SQL Data Warehouse integration with IICS using the native (v2 and v3) connectors and the Microsoft connector with PDO support for Azure SQL.
  • Good understanding of Azure SQL Data Warehouse concepts such as storage, distribution, DWU units, resource user groups and connection strings (see the SQL sketch at the end of this list).
  • Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
  • Experience working with the Informatica Data Quality tool to effectively use the Address Validator transformation for Address Doctor implementation.
  • Worked in different phases of the projects involving Requirements Gathering and Analysis, Design, Development, Testing, Deployment and Support.
  • Good knowledge of the database architecture of OLTP and OLAP applications, and of data analysis.
  • Worked with a wide variety of sources such as relational databases, flat files, XML files, Mainframes, Salesforce objects and unstructured data files, and scheduling tools such as CA7, Control-M and Informatica Scheduler.
  • Experience working with various Informatica concepts such as partitioning, performance tuning, identifying bottlenecks, deployment groups, Informatica Scheduler and more.
  • Strong skills and a clear understanding of requirements and solutions to various implementation issues throughout the Software Development Life Cycle (SDLC).
  • Experience in maintaining batch logging and error logging with event handlers and configuring connection managers using SSIS.
  • Proficient using query tools like TOAD, SQL Developer, PL/SQL developer and Teradata SQL Assistant.
  • Experienced in creating DTS and SSIS packages and scheduling them using Windows Scheduler, SQL Server Agent and Autosys.
  • Very good exposure to Oracle 11g/10g/9i and MS SQL Server 2012/2008/2005.
  • Excellent Verbal and Written Communication Skills. Has proven to be highly effective in interfacing across business and technical groups.
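
A minimal SQL sketch of the Azure SQL Data Warehouse distribution, DWU and resource-group concepts referenced earlier in this list; the table names, database name and service objective shown here are hypothetical placeholders, not objects from the actual project.

    -- Hash-distribute a large fact table on a high-cardinality join key so joins
    -- against it avoid cross-distribution data movement (hypothetical names).
    CREATE TABLE dbo.FactClaims
    (
        ClaimKey    BIGINT        NOT NULL,
        MemberKey   BIGINT        NOT NULL,
        ClaimAmount DECIMAL(18,2) NULL,
        LoadDate    DATETIME2     NULL
    )
    WITH
    (
        DISTRIBUTION = HASH(MemberKey),   -- co-locate rows that join on MemberKey
        CLUSTERED COLUMNSTORE INDEX       -- default analytical storage
    );

    -- Replicate a small dimension to every distribution so dimension joins stay local.
    CREATE TABLE dbo.DimMember
    (
        MemberKey  BIGINT        NOT NULL,
        MemberName NVARCHAR(200) NULL
    )
    WITH
    (
        DISTRIBUTION = REPLICATE,
        CLUSTERED COLUMNSTORE INDEX
    );

    -- Scale compute (DWU) up or down; run from the master database.
    ALTER DATABASE MyDw MODIFY (SERVICE_OBJECTIVE = 'DW400c');

    -- Resource user groups: assign a loading user to a larger resource class.
    EXEC sp_addrolemember 'largerc', 'etl_load_user';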

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5.1/9.1/8.6.x, Informatica PowerExchange 10.2/9.5.1, Informatica Developer 9.6.1, Informatica Intelligent Cloud Services (IICS)

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2012/2008/2005, SQL Server Integration Services (SSIS), Azure SQL Data Warehouse, DB2, Teradata

Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, Control-M.

Reporting Tools: MicroStrategy and Hyperion Essbase.

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, Java, Shell, Perl, Batch

Operating Systems: Windows 7/XP/NT/2000, DOS, UNIX and LINUX

Other Tools: SQL*Plus, Toad, SQL Developer, Putty, WINSCP, MS-Office.

PROFESSIONAL EXPERIENCE

Confidential, Boston MA

Sr. ETL Informatica Developer

Environment: Informatica IICS, Oracle 11g, SQL Server 2016, SQL Server Integration Services (SSIS), Azure SQL Data Warehouse, Teradata v15, flat files, Excel files, Salesforce, Cognos Reporting, Batch and Python scripting

Responsibilities:

  • Interacted with business users to identify process metrics and key dimensions and facts; involved in the full life cycle of the project.
  • Assisted the architect in developing the STG/ODS/Hub/dimensional warehouse layers in Azure SQL Data Warehouse.
  • Assisted in defining logical and physical database models for building the new enterprise data warehouse in the cloud to replace the existing on-premises warehouse.
  • Identified ETL specifications based on business requirements and created ETL mapping documents and high-level documentation for product owners and data managers.
  • Defined modeling and naming standards and best practices for the modeling team to use in data models as well as in DDLs and DMLs when creating new data elements and adding attributes.
  • Imported and exported databases using SQL Server Integration Services (SSIS) and Data Transformation Services (DTS packages).
  • Effectively used the IICS Data Integration console to create mapping templates to bring data into the staging layer from different source systems such as SQL Server, Oracle, Teradata, Salesforce, flat files, Excel files and PWX CDC.
  • Involved in all phases of the migration from DTS to SSIS packages.
  • Experience working with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, renaming bulk fields and more.
  • Experience working with Informatica PowerExchange integration with IICS to read data from condense files and load it into the Azure SQL Data Warehouse environment.
  • Experience with PWX concepts including registration maps, logger, listener and condense files.
  • Experience working with IICS full pushdown optimization to push data from staging to ODS at scale using data integration templates (see the SQL sketch after this list).
  • Experience working with custom-built queries to load dimensions and facts in IICS.
  • Experience working with various Azure SQL Data Warehouse connectors, including the v2 and v3 connectors.
  • Experience working with Microsoft ODBC drivers to enable full PDO for loads within the SQL data warehouse.
  • Experience working with the Salesforce connector to read data from Salesforce objects into the cloud warehouse using IICS.
  • Experience working with IICS monitoring and administrator concepts.
  • Experience working with data integration concepts including, but not limited to, mappings, mapping configuration tasks, taskflows, deployment using Git automation, schedules, connections and API integration.
  • Experience working with key range partitioning in IICS, handling file loads with the file list option, creating fixed-width file formats, using the file listener and more.
  • Experience integrating data using IICS for reporting needs.
  • Experience in building the semantic layer after fact loads so reporting can connect to the data warehouse.
  • Responsible for deployments to higher environments and for production support during the warranty period before turning the application over to managed services.
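
As referenced above, a minimal sketch of the kind of set-based SQL that full pushdown optimization lets the warehouse execute for a staging-to-ODS load, instead of row-by-row processing in the integration engine; the schema, table and column names are hypothetical, and the real SQL would be generated by IICS from the mapping.

    -- Pushdown-friendly load: one INSERT ... SELECT executed inside Azure SQL Data Warehouse.
    INSERT INTO ods.DimCustomer (CustomerId, CustomerName, City, LoadDate)
    SELECT s.CustomerId,
           s.CustomerName,
           s.City,
           CURRENT_TIMESTAMP
    FROM   stg.Customer AS s
    LEFT JOIN ods.DimCustomer AS d
           ON d.CustomerId = s.CustomerId
    WHERE  d.CustomerId IS NULL;      -- insert only customers not already present in ODS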

Confidential, MI

Sr. ETL Informatica Developer

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, SQL Server 2012, SQL Server Integration Services (SSIS), DB2, flat files, XML files, Informatica IDQ, B2B Data Transformation, Informatica Cloud, Informatica Address Doctor, Batch scripting, MicroStrategy.

Responsibilities:

  • Good Understanding of business requirements, technical specifications, source repositories and physical data models for project activities.
  • Experience in creating high-level documents, source-to-target mapping documents and detailed design documents for the entire ETL process.
  • Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML and flat files.
  • Project involved usage of most of the transformations, such as Transaction Control, active and passive Lookup transformations, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured Data transformation, SQL transformation and more.
  • Extensive implementation of incremental/delta loads with the help of concepts such as mapping variables, mapping parameters and a parameter table.
  • Created ETL code in such a way as to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
  • Developed mappings to load data into the landing, staging and publish layers with extensive usage of the SCD Type I and SCD Type II development concepts.
  • Experience developing SCD Type I and Type II loads with the help of the MD5 hash function (see the SQL sketch after this list).
  • Experience working with the B2B Data Transformation concept of Informatica.
  • Experience working with advanced Informatica transformations such as the Unstructured Data transformation for parsing HL7 data files.
  • Experience working in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
  • Exported the mapplets from IDQ into Informatica PowerCenter to use them in various mappings for the Address Doctor implementation.
  • Hands-on experience profiling data using IDQ.
  • Experience working with extracting and loading data directly into Salesforce objects using Informatica PowerCenter.
  • Experience working with various session properties to extract data from Salesforce objects using the standard API and the Bulk API.
  • Created new and enhanced existing stored procedure SQL used for semantic views, and load procedures for materialized views.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Loaded data from Unstructured file format using unstructured data transformation into Oracle database.
  • Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
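
As referenced above, a minimal SQL Server-style sketch of the MD5-based SCD Type II pattern combined with a parameter-table-driven delta filter. In the actual mappings this logic lived in Informatica transformations (an MD5() expression, Lookup and Update Strategy); the table, column and workflow names below are hypothetical.

    -- Delta filter driven by a parameter/control table holding the last successful run time.
    DECLARE @last_run DATETIME2 =
        (SELECT LastRunTs FROM etl.ParamTable WHERE WorkflowName = 'wf_dim_member');

    -- SCD Type II: compare an MD5 hash of the tracked columns to detect changes,
    -- expiring the current version of any member whose hash differs.
    MERGE dim.Member AS tgt
    USING (
        SELECT s.MemberId,
               s.MemberName,
               s.Address,
               HASHBYTES('MD5', CONCAT(s.MemberName, '|', s.Address)) AS RowHash
        FROM   stg.Member AS s
        WHERE  s.UpdatedTs > @last_run               -- incremental/delta extract
    ) AS src
    ON  tgt.MemberId = src.MemberId
    AND tgt.CurrentFlag = 'Y'
    WHEN MATCHED AND tgt.RowHash <> src.RowHash THEN
        UPDATE SET tgt.CurrentFlag = 'N', tgt.EndDate = SYSDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (MemberId, MemberName, Address, RowHash, CurrentFlag, StartDate, EndDate)
        VALUES (src.MemberId, src.MemberName, src.Address, src.RowHash, 'Y', SYSDATETIME(), NULL);

    -- Changed members now have no current row, so insert their new 'Y' version.
    INSERT INTO dim.Member (MemberId, MemberName, Address, RowHash, CurrentFlag, StartDate, EndDate)
    SELECT s.MemberId, s.MemberName, s.Address,
           HASHBYTES('MD5', CONCAT(s.MemberName, '|', s.Address)), 'Y', SYSDATETIME(), NULL
    FROM   stg.Member AS s
    WHERE  s.UpdatedTs > @last_run
      AND NOT EXISTS (SELECT 1 FROM dim.Member d
                      WHERE d.MemberId = s.MemberId AND d.CurrentFlag = 'Y');

    -- Advance the watermark for the next incremental run.
    UPDATE etl.ParamTable SET LastRunTs = SYSDATETIME() WHERE WorkflowName = 'wf_dim_member';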

Confidential, Chicago IL

ETL Informatica Developer

Environment: Informatica 9.5.1, Oracle 11g, SQL Server 2008, SSIS, PowerExchange 9.5.1, flat files, Mainframe, Teradata, Toad for Oracle 11, Harvest Version Control tool, Windows Server, UNIX, Perl Scripting and CA7 Scheduler

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica PowerCenter.
  • Designed and developed complex ETL using SSIS.
  • Worked extensively on various transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator.
  • Developed ETL Informatica mappings to load data into the staging area. Extracted data from Mainframe files, flat files and SQL Server and loaded it into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Wrote stored procedures and functions to perform data transformations and integrated them with Informatica programs and the existing applications.
  • Worked on SQL coding to override the generated SQL query in Informatica.
  • Involved in unit testing for the validity of the data from different data sources.
  • Developed workflows for dimension loads and fact loads based on daily/monthly runs.
  • Developed code to archive monthly data into history tables and made effective use of the history tables to load data back into the system for a particular past month.
  • Developed audit tables to keep track of ETL metrics for each individual run (see the SQL sketch after this list).
  • Experience working with the Audit Balance Control concept to create parameter files dynamically for each workflow before its run.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions. Experience dealing with partitioned tables and automating the process of partition drop and create in the Oracle database.
  • Involved in migrating the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
  • Developed Informatica SCD Type I and Type II mappings. Extensively used almost all of the Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets and others.
  • Implemented update strategies, incremental loads, change data capture and incremental aggregation.
  • Involved in performance tuning for a better data migration process.
  • Analyzed session log files to resolve errors in mappings, identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using the CA7 Scheduler.
  • Worked on the Direct Connect NDM process to transfer files between servers.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
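
As referenced above, a minimal sketch of an ETL audit table and the bookkeeping statements a workflow might run at the start and end of each load; the table, column and workflow names are hypothetical, and in the real jobs the run identifier and row counts would come from session statistics rather than literals.

    -- Audit table capturing ETL metrics for each individual run (hypothetical names).
    CREATE TABLE etl_audit
    (
        audit_id      INTEGER      NOT NULL,
        workflow_name VARCHAR(100) NOT NULL,
        start_time    TIMESTAMP    NOT NULL,
        end_time      TIMESTAMP,
        rows_read     INTEGER,
        rows_loaded   INTEGER,
        status        VARCHAR(20)
    );

    -- At workflow start: record the run as in progress.
    INSERT INTO etl_audit (audit_id, workflow_name, start_time, status)
    VALUES (1001, 'wf_monthly_fact_load', CURRENT_TIMESTAMP, 'RUNNING');

    -- At workflow end: close out the run with row counts and the final status.
    UPDATE etl_audit
    SET    end_time    = CURRENT_TIMESTAMP,
           rows_read   = 250000,       -- from session statistics in practice
           rows_loaded = 249950,
           status      = 'SUCCESS'
    WHERE  audit_id = 1001;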

Confidential, Auburn Hills, MI

Informatica Developer

Environment: Informatica PowerCenter 8.6.x, PowerExchange, PL/SQL, Oracle, SQL Server 2008/2005, Windows Server 2003, UNIX, Tivoli Scheduler, Toad

Responsibilities:

  • Developed a standard ETL framework to enable the reusability of similar logic across the board; involved in system documentation of data flow and methodology
  • Extensively developed low-level designs (mapping documents) by understanding the different source systems
  • Designed complex mappings, sessions and workflows in Informatica PowerCenter to interact with MDM and the EDW
  • Design and develop mappings to implement full/incremental loads from source system
  • Design and develop mappings to implement type1/type2 loads
  • Responsible for ETL requirement gathering and development wif end-to-end support
  • Responsible for coordinating the DB changes required for ETL code development
  • Responsible for ETL code migration, DB code changes and scripting changes to higher environments
  • Responsible for supporting the code in the production and QA environments
  • Developed complex IDQ rules which can be used in Batch Mode
  • Developed Address validator transformation through IDQ to be interacted in Informatica PowerCenter mapping
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter
  • Worked closely with the MDM team to identify the data requirements for their landing tables and designed the IDQ process accordingly
  • Extensively used transformations like router, lookup, source qualifier, joiner, expression, sorter, XML, Update strategy, union, aggregator, normalizer and sequence generator
  • Created reusable mapplets, reusable transformations and performed Unit tests over Informatica code
  • Responsible for providing daily status report for all Informatica applications to customer, Monitoring and tracking the critical daily applications & code migration during deployments
  • Responsible for reloads of Informatica applications data in production and closing user tickets and incidents
  • Identify performance issues and bottlenecks
