
Senior Informatica Developer Resume


PA

SUMMARY

  • 8 years of IT experience in analysis, design, development, testing and implementation of Informatica workflows for data warehouse/data mart design, ETL, and OLAP client/server applications.
  • ETL and data integration experience in developing ETL mappings and scripts using Informatica Power Center 9.x/8.x
  • Solid understanding of ETL design principles and good practical knowledge of performing ETL design processes through Informatica.
  • Developed complex mappings using Informatica Data Quality (IDQ) and used the Informatica Developer and Analyst tools.
  • Strong ETL experience using Informatica PowerCenter versions 9.x/8.x/7.x/6.x: client tools (Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor) and server tools (Informatica Server, Repository Server Manager, PowerMart).
  • Ability to write complex SQL queries, stored procedures and UNIX shell scripts for ETL jobs and data analysis.
  • Good exposure on Informatica Cloud Services
  • Good understanding of and experience with Informatica Cloud with Oracle and Salesforce.
  • Experience with Change Data Capture (CDC).
  • Experience across the software development life cycle (SDLC), including development, testing and migration.
  • Proficient with databases and sources including Oracle, SQL Server, IBM DB2, Teradata, XML/XSDs, Excel sheets and flat files.
  • Strong understanding of OLAP and OLTP concepts.
  • Understand business rules from high-level design specifications and implement the corresponding data transformation methodologies.
  • Excellent working knowledge of UNIX shell scripts and scheduling of jobs by using tools like Control M, Autosys.
  • Extensively worked on data migration, data cleansing and data staging of operational sources using ETL processes, and provided data-mining support for data warehouses.
  • Strong Experience in creating Transformations such as Aggregation, Expression, Update Strategy, Lookup, Joiner, Rank, Router and Source Qualifier Transformations in the Informatica Designer.
  • Experience in maintaining Data Concurrency, Replication of data
  • Critical thinker and self-starter, willing to take initiative and work with little or no supervision.
  • Demonstrated proficiency in improving reusability, automating data loads, and tuning Informatica performance by identifying session bottlenecks and database performance issues.
  • Analytical and Technical aptitude with the ability to solve complex problems
  • Systematic and disciplined, with an analytical and logical approach to problem solving; able to work under tight schedules and manage time efficiently.
  • Excellent written, communication and analytical skills, with the ability to perform independently as well as in a team.
  • Hands on experience in developing SCD type 1/2/3 mappings.
  • Experience in installing Informatica in UNIX environment
  • Experience in Informatica Administration and hands-on experience in installing, configuring, upgrading the Informatica Power Center and Data Explorer.
  • Experienced in handling various Informatica Power Center code migration methods (XML Export/Import, Deployment Group and Object Copy)
  • Experience in handling Informatica repository back-ups, restore and fail-over of services from primary to back-up nodes and vice-versa
  • Ability to work in teams as well as individually, quick learner and able to meet deadlines

TECHNICAL SKILLS

Tools: Informatica PowerCenter 9.x/8.x, Informatica PowerExchange 9.1, Informatica Data Quality 9.6.1, B2B DX/DT v8.0

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting

Operating Systems: Windows 7/Vista/XP/2003/2000/NT/98/95, MS-DOS, UNIX/Linux

Office Applications: Microsoft Word, Excel, Outlook, Access, Project, PowerPoint

Databases: SQL Server 2008/2005, Oracle 11g/10g/9i, MS Access 2010/2007/2003, Teradata, DB2, XML/XSD

Reporting Tools: Tableau 9.2

PROFESSIONAL EXPERIENCE

Confidential, PA

Senior Informatica Developer

Environment: Informatica PowerCenter 9.6.1/9.5.1 (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Erwin 4.5, Data Quality 9.6.1, RDBMS, Oracle 11g/10g/9i, PL/SQL, Toad, Teradata, UNIX scripting, Windows XP.

Responsibilities:

  • Extensively used Informatica Client tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager
  • Experience with HCM applications such as PeopleSoft and Workday, including migrating data from PeopleSoft to Workday.
  • Identified and eliminated duplicates in customer address/contact information using IDQ transformations.
  • Development and production support for data warehousing and reporting solutions using Teradata, Informatica PowerCenter and UNIX technologies.
  • Extensive knowledge of Teradata system architecture and concepts; strong data warehousing experience in Teradata spanning strategy, design, architecture, development and testing. Extracted, transformed and loaded (ETL) data from various sources into data warehouses using Informatica PowerCenter and Teradata load utilities.
  • Utilized Informatica IDQ to complete initial data profiling or data cleansing in data quality ETL processes
  • Developed complex mapping using Informatica Power Center tool and configured Informatica with DAC.
  • Extracting data from Oracle and Flat file, Excel files and performed complex joiner, Expression, Aggregate, Lookup, Stored procedure, Filter, Router transformations and Update strategy transformations to load data into the target systems.
  • Used concept of staged mapping in order to perform asynchronous Web Services request and response as part of Informatica Mappings
  • Working with various sources such as Flat files, Relational, XML and Web services as part of Informatica Mappings
  • Extracted data from various source systems into the landing zone by creating Informatica mappings using Teradata FastLoad connections.
  • Designed Mappings using B2B Data Transformation Studio
  • Extensively used Informatica Cloud and PowerCenter client tools - Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Developed complex mapping using Informatica Power Center Cloud
  • Created Sessions, Tasks, Workflows and worklets using Workflow manager
  • Worked with Data modeler in developing STAR Schemas
  • Involved in performance tuning and query optimization.
  • Experience in converting SAS scripts to Informatica
  • Worked on Teradata utilities FastLoad (FLOAD), MultiLoad (MLOAD) and FastExport (FEXP), and created batch jobs using BTEQ.
  • Extracted data from Oracle, XML, SAP and MS SQL Server (T-SQL), and implemented delta load logic using Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems.
  • Integrated Oracle EBS with Informatica Power Center and OBIEE. Configured DAC for Finance
  • Responsible for Configuration and administration of Meta data Manager
  • Used TOAD, SQL Developer to develop and debug procedures and packages.
  • Involved in developing the Deployment groups for deploying the code between various environment (Dev, QA)
  • Experience developing and supporting complex DW transformations
  • Excellent understanding of Star Schema Data Models; Type 1 and Type 2 Dimensions
  • Created pre-SQL and post-SQL scripts to run at the Informatica session level.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, XML, XSD’s, Flat Files and Teradata.
  • Expertise in using both connected and unconnected Lookup Transformations
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Monitored and improved query performance by creating views, indexes, hints and sub queries
  • Extensively involved in enhancing and managing Unix Shell Scripts
  • Developed workflow dependencies in Informatica using Event Wait and Command tasks.
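The parameter files for incremental loading mentioned above can be sketched as a small shell script that maintains a last-run watermark and writes the PowerCenter parameter file before each run. The folder, workflow, session and parameter names here are hypothetical, not from a real project:

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file for an incremental load.
# All names (FolderName, wf_load_customer, $$LAST_EXTRACT_DATE) are placeholders.
PARM_FILE=wf_load_customer.parm
LAST_RUN_FILE=last_run.txt

# Default to an early watermark on the very first run.
if [ -f "$LAST_RUN_FILE" ]; then
    LAST_RUN=$(cat "$LAST_RUN_FILE")
else
    LAST_RUN="1970-01-01 00:00:00"
fi

# [Folder.WF:workflow.ST:session] is the standard parameter-file section header.
cat > "$PARM_FILE" <<EOF
[FolderName.WF:wf_load_customer.ST:s_m_load_customer]
\$\$LAST_EXTRACT_DATE=$LAST_RUN
EOF

# Record the new watermark for the next run.
date '+%Y-%m-%d %H:%M:%S' > "$LAST_RUN_FILE"
```

The mapping's Source Qualifier would then filter on `$$LAST_EXTRACT_DATE` so each run picks up only rows changed since the previous one.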

Confidential, NY

Sr. ETL Informatica Developer

Environment: Informatica Power Center 9.1, Data Quality 9.6.1, Toad, Oracle 10g, PL/SQL, Teradata, RDBMS, Shell Programming

Responsibilities:

  • Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow
  • Delivered data profiling (IDQ) reports. Responsible for data verification, data cleansing and data mapping in ETL
  • Developed ETL mappings, transformations using Informatica Power Center 9.5.1
  • Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
  • Worked on many scripts and well versed in Teradata tools and utilities - FastLoad, MultiLoad, TPump, FastExport, TPT and BTEQ - and their usage in different scenarios; proficient in writing SQL scripts, views and triggers in Teradata.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Extensively used Lookup and Update Strategy transformations while working with Slowly Changing Dimensions (SCD), including reading and loading high-volume Type 2 dimensions.
  • Created variables and parameter files for mappings and sessions so they can be migrated easily across environments and databases.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Develop Informatica Mappings, Sessions and Workflows.
  • Worked on Extraction, Transformation and Loading of data using Informatica.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data
  • Developed joiner transformation for extracting data from multiple sources
  • Tuned Queries on Oracle DB to improve performance
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Responsible for populating the business rules using mappings into the Repository for Meta data Management.
  • Used Pre-session and post-session scripts
  • Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
  • Worked with Informatica workflow monitor in running and debugging its components and monitoring the resulting executable version
  • Involved in unit testing and documentation of the ETL process
  • Daily monitoring of the mappings that ran the day before and fixing the issues
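The daily monitoring of the previous day's runs described above often amounts to a scripted scan of session logs. A minimal sketch, with made-up log names and a simplified error pattern standing in for real Informatica session logs:

```shell
#!/bin/sh
# Sketch: scan session logs for failures and summarize them.
# The log directory, file names and 'ERROR' pattern are illustrative assumptions.
LOG_DIR=./sess_logs
mkdir -p "$LOG_DIR"

# Two sample logs standing in for real session logs.
echo 'Session run completed successfully' > "$LOG_DIR/s_m_load_ok.log"
echo 'ERROR: Writer run terminated'       > "$LOG_DIR/s_m_load_fail.log"

# Collect the names of any sessions whose log contains an error line.
: > failed_sessions.txt
for log in "$LOG_DIR"/*.log; do
    if grep -q 'ERROR' "$log"; then
        basename "$log" >> failed_sessions.txt
    fi
done

cat failed_sessions.txt
```

In practice the failure list would feed an email or ticketing step rather than just being printed.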

Confidential, CA

ETL (Informatica) Consultant

Environment: Informatica Power Center 9.0.1/8.6, Informatica Power Connect, RDBMS, Oracle 10g/11g, PL/SQL, Toad, UNIX scripting, OBIEE 10.1.3.4.1

Responsibilities:

  • Participated in all phases including Client Interaction, Design, Coding, Testing, Release, Support and Documentation
  • Interacted with Management to identify key dimensions and Measures for business performance
  • Involved in defining the mapping rules and identifying required data sources and fields.
  • Created ER (Entity Relationship) diagrams
  • Extensively used RDBMS and Oracle Concepts
  • Dimensional modeling using STAR schemas (Facts and Dimensions)
  • Generated weekly and monthly report Status for the number of incidents handled by the support team
  • Worked on data conversions and data loads using PL/SQL; created measure objects and aggregations and stored them in MOLAP mode.
  • Involved in Performance Tuning
  • Extensively created and used various Teradata SET tables, MULTISET tables, global temporary tables, volatile tables and temp tables.
  • Performed performance tuning of Teradata SQL statements using the Teradata EXPLAIN command.
  • Used various Teradata index techniques to improve query performance.
  • Extensively involved in data transformation, validation, extraction and loading processes. Implemented various Teradata join types (inner join, outer join, self-join, cross join) and join strategies (merge join, product join, nested join, row-hash join).
  • Worked on slowly changing dimension table to keep full history which was used across the board.
  • Used Informatica Designer to create complex mappings using different transformations like Source Qualifier, Expression, Lookup, Aggregator, and Update Strategy, Joiner, Filter and Router transformations to pipeline data to Data Warehouse/Data Marts
  • Worked in support activities (24x7 production support), monitoring jobs and working on enhancements and change requests.
  • Familiarity with Data Analysis and building Reports and Dashboards with OBIEE
  • Designed and developed reusable mapplets and used mapping variable and mapping parameters in the ETL mapping.
  • Involved in writing various UNIX shell scripts for writing automated scripts for scheduled queue process
  • Optimized the performance of the mappings by various tests on sources, targets and transformations. Identified the different bottlenecks
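Teradata batch jobs of the kind described above are usually packaged as BTEQ scripts. This sketch only generates such a script; the logon string, databases and table names are placeholders, and the error-handling directive follows standard BTEQ conventions:

```shell
#!/bin/sh
# Sketch: emit a BTEQ batch script that reloads a staging table.
# tdprod, etl_user, stage_db and src_db are all hypothetical names.
cat > load_stage.bteq <<'EOF'
.LOGON tdprod/etl_user,password;

DELETE FROM stage_db.customer_stg;

INSERT INTO stage_db.customer_stg
SELECT customer_id, customer_name, updated_ts
FROM   src_db.customer;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "wrote load_stage.bteq"
```

The generated file would be run as `bteq < load_stage.bteq`, with the non-zero `.QUIT` code letting a scheduler such as Control-M or Autosys detect failures.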

Confidential, NY

ETL (Informatica) Developer

Environment: Informatica Power Center 9.1, Informatica Power Connect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Erwin 4.5, RDBMS, Oracle 10g/9i, PL/SQL, Toad, UNIX scripting.

Responsibilities:

  • Involved in the Analysis, Design and Creation of the Enterprise Data warehouse
  • Analyzed data flow requirements and developed a scalable architecture for staging and loading data and translated business rules and functionality requirements into ETL procedures.
  • Responsible for standardizing data to store various Business Units in tables
  • Designed and built the physical data model for the ODS of the data warehouse using Erwin.
  • Developed Informatica mappings and tuned them for better performance.
  • Documented standards handbook for Informatica code development
  • Developed a number of Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data warehouse
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers)
  • Used RDBMS Concepts
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data. Most of the transformations were used like the Source qualifier, Aggregators, Connected & unconnected lookups, Filters & Sequence Generators
  • Extensively worked on the Database Triggers, Stored Procedures, Functions and Database Constraints. Written complex stored procedures and triggers and optimized for maximum performance.
  • Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
  • Involved in Performance Tuning

Confidential

ETL Developer

Environment: Informatica PowerCenter 8.6.1, SQL Server 2005, Autosys, Oracle 9i, PL/SQL, Toad, Business Objects 6.0, Windows NT, UNIX shell scripts.

Responsibilities:

  • Creating dimensions and facts in the physical data model
  • Involved in designing the Data Mart model with Erwin using Star Schema methodology
  • Used aggregate, expression, lookup, update strategy, router, and rank transformation.
  • Used Lookup Transformation to access data from tables, which are not the source for mapping and used Unconnected Lookup to improve performance
  • Created FTP connections and database connections for the sources and targets.
  • Loading Data to the Interface tables from multiple data sources such as MS Access, SQL Server, Text files and Excel Spreadsheets using SQL Loader, Informatica and ODBC connection
  • Wrote a stored procedure to compare source data against warehouse data; records not present in the warehouse were written to a spool table, which was then used as a lookup in a transformation.
  • Implemented Variables and Parameters in Transformations to calculate billing data in billing Domain
  • Modified the existing batch process, shell script, and PL/SQL Procedures for Effective logging of Error messages into the Log table
  • Generated weekly and monthly report Status for the number of incidents handled by the support team
  • Improved Performance of the workflows by identifying the bottlenecks in targets, sources, mappings, sessions
  • Designed a mapplet to update a slowly changing dimension table to keep full history, which could be used across the board.

Confidential

ETL Developer

Environment: Informatica Power Center 8.1, SQL Server 2005, DB2, PL/SQL, Toad, Business Objects 6.0, Windows NT, UNIX shell scripts.

Responsibilities:

  • Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow
  • Developed ETL mappings, transformations using Informatica Power Center 8.6.1
  • Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Extensively used Lookup and Update Strategy transformations while working with Slowly Changing Dimensions (SCD), including reading and loading high-volume Type 2 dimensions.
  • Created various transformations such as Union, Aggregator, Lookup, Update Strategy, Joiner, Filter and Router Transformations
  • Involved in Performance tuning. Identified and eliminated bottlenecks (source, target, and mapping)
  • Used reusable Session for different level of workflows
  • Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait and so on
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Involved in Unit Testing, User Acceptance Testing (UAT) to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements
  • Developed mappings that extract data from the ODS to the data mart, and monitored the daily, weekly and monthly loads.
  • Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.
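The UNIX shell automation of data loads described above typically wraps PowerCenter's pmcmd command line. pmcmd and its startworkflow options are the real CLI, but the service, domain, folder and workflow names below are placeholders; this sketch only generates the wrapper script:

```shell
#!/bin/sh
# Sketch: write a wrapper script that a scheduler would invoke to run a workflow.
# IntSvc_Dev, Domain_Dev, DailyLoads and wf_daily_load are hypothetical names.
cat > run_wf_daily_load.sh <<'EOF'
#!/bin/sh
# Start the workflow and block until it completes (-wait),
# reading credentials from the environment.
pmcmd startworkflow -sv IntSvc_Dev -d Domain_Dev \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f DailyLoads -wait wf_daily_load
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_daily_load failed (rc=$rc)" >&2
fi
exit $rc
EOF

chmod +x run_wf_daily_load.sh
echo "created run_wf_daily_load.sh"
```

Propagating pmcmd's exit code lets Autosys or Control-M mark the job failed and trigger alerting without any extra plumbing.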
