
Informatica Developer Resume


Cary, NC

PROFESSIONAL SUMMARY:

  • 8+ years of IT experience in design, analysis, development, documentation, coding, and implementation including Databases, Data Warehouse, ETL Design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality.
  • Expertise in Master Data Management concepts, Methodologies, and ability to apply this knowledge in building MDM solutions
  • Involved in the complete software development life cycle (SDLC) of projects, with experience in domains such as Healthcare, Banking, Insurance, and Appliances.
  • Expertise in the ETL tool Informatica, with extensive experience in PowerCenter client tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, User Defined Functions, etc.
  • Extensively worked on Relational Databases Systems like Oracle11g/10g/9i/8i, MS SQL Server, Teradata and Source files like flat files, XML files and COBOL files
  • Excellent background in implementation of business applications and in using RDBMS and OOP concepts.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Expertise in Data warehousing concepts like OLTP/OLAP System Study, Analysis, and E-R modeling, developing database Schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional data modeling.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
  • Hands-on experience in Informatica upgrade from 8.6 to 9.1
  • Extensive experience in debugging mappings, identifying bottlenecks/bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations
  • Solid experience in implementing business requirements, error handling, job control & job auditing using Informatica Power Center tools
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
  • Sound knowledge of Linux/UNIX, Shell scripting, experience in command line utilities like pmcmd to execute workflows in non-windows environments
  • Proficiency in working with Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
  • Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse.
  • Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server, MS Access, Teradata, Flat Files, XML files and other sources like Salesforce, etc.
  • Experience in Migrating Data from Legacy systems to Oracle database using SQL*Loader
  • Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts, and scheduling tools (Control-M v7/v8, CA WA Workstation (ESP)).
  • Expert in analyzing Business & Functional Specifications, creating Technical Design Document and Unit Test cases
  • Experience in performance tuning of targets, sources, mappings, workflows, and systems.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Involved in the SDLC (Waterfall, Scrum/Agile) of building a Data Warehouse on Windows and UNIX platforms.
  • Well versed with onsite/offshore project delivery model and experience in working with offshore teams
  • Designed Applications according to the customer requirements and specifications.
  • Excellent Interpersonal and Communication skills, coupled with strong technical and problem-solving capabilities.
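
The Slowly Changing Dimension mapping techniques listed above can be illustrated outside of Informatica. A minimal Python sketch of the Type 2 idea (expire the current row, insert a new version when a tracked attribute changes); all table, key, and column names here are invented for illustration:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Type 2 SCD sketch: expire the current version and insert a new one
    whenever a tracked attribute changes. `dimension` is a list of row
    dicts keyed on the business key `cust_id` (names are illustrative)."""
    today = today or date.today()
    current = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        old = current.get(row["cust_id"])
        if old is None:
            # brand-new business key: first version
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif old["city"] != row["city"]:  # tracked attribute changed
            old["end_date"] = today
            old["is_current"] = False
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
    return dimension
```

In a real PowerCenter mapping this logic lives in a Lookup plus Update Strategy transformation pair; the sketch only shows the versioning rule itself.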

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.

Reporting Tools: Business Objects XIR2/6.1/5.0, QlikView, OBIEE, MicroStrategy, Oracle Analytics, etc.

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Data Modeling: Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.

Hadoop Ecosystem: HDFS, Map Reduce, Sqoop, Syncsort.

DB Tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), DB2, Teradata, AQT v10/v9 (Advanced query tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle Sybase), Visio, ERWIN

Languages: C, C++, Java, .Net, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Operating Systems: UNIX, Linux, Windows

PROFESSIONAL EXPERIENCE:

Informatica Developer

Confidential, Cary, NC

Responsibilities:

  • Prepared High-level Design and Low-Level Design based on Functional and Business requirements of the project.
  • Designed and documented the functional specs and prepared the technical design.
  • Conducted gap analysis and discussions with subject matter experts as a team to gather requirements, highlight problem areas, and define deliverables.
  • Supported the development, optimization, and maintenance of Informatica mappings with various source data including DB2.
  • Involved in reverse engineering (backward and forward analysis) to identify the attributes to be populated from existing tables.
  • Developed mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files.
  • Implemented change data capture (CDC) for mappings so as to capture changes and preserve history.
  • Involved in Migration projects to migrate data from data warehouses on DB2.
  • Involved in logical and physical data modeling and analyzed and designed the ETL processes.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created ETL processes for continually populating the data warehouse from different source systems such as ODS and flat files.
  • Developed mapping parameters and variables to support SQL override.
  • Developed workflows with Worklets, Event Wait, Decision, Assignment, Conditional flows, Email, and Command tasks using Workflow Manager.
  • Interacted with the source team and the business to validate the data.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Writing SQL Scripts to extract the data from Database and for Testing Purposes.
  • Worked with the Maestro scheduler for scheduling Informatica PowerCenter workflows; worked with the scheduling team to create and schedule jobs in Workload Scheduler.
  • Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
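
The change data capture bullet above boils down to classifying rows against the previous extract. PowerExchange does this at the database-log level; this hedged Python sketch shows only the comparison idea, with a hypothetical `id` key:

```python
def capture_changes(previous, current):
    """CDC sketch: classify rows as inserts, updates, or deletes by
    comparing the current extract against the previous one, keyed on
    `id` (an illustrative business key)."""
    prev = {r["id"]: r for r in previous}
    curr = {r["id"]: r for r in current}
    inserts = [r for k, r in curr.items() if k not in prev]
    updates = [r for k, r in curr.items() if k in prev and prev[k] != r]
    deletes = [r for k, r in prev.items() if k not in curr]
    return inserts, updates, deletes
```

The update/insert split is what feeds an Update Strategy transformation downstream; deletes are usually soft-deleted in the warehouse to preserve history.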

Environment: Informatica Power Center 9.6.1, SQL, DB2, PL/SQL, TOAD, Microsoft Visio, Maestro, Unix, Winscp, SQL Server 2008, AQT

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Designed and developed UNIX Shell scripts for creating, dropping tables which are used for scheduling the jobs.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed banking management scripts in Python to support the Chase website in creating user profiles and transactions for withdrawals and deposits.
  • Worked with IDQ on data quality: data cleansing, removing unwanted data, and verifying the correctness of data.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter
  • Involved in Migration projects to migrate data from data warehouses on Oracle/DB2 and migrated those to Teradata
  • Designed High-level and Low-Level Documentation based on Functional and Business requirements of the project.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created DataStage jobs (ETL processes) for continually populating the data warehouse from different source systems such as ODS, flat files, and HDFS files, and scheduled them using the DataStage Sequencer for system integration testing.
  • Developed and maintained programs for scheduling data loading and transformations using DataStage.
  • Developed mapping parameters and variables to support SQL override.
  • Worked on import and export of data from sources to Staging and Target using Teradata MLOAD, Fast Export, TPUMP and BTEQ.
  • Developing workflows with Worklets, Event waits, Assignments, Conditional flows, Email and Command Tasks using Workflow Manager.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Handling all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Worked with the third-party scheduler Autosys for scheduling Informatica PowerCenter workflows; worked with the scheduling team to create and schedule jobs in the Autosys Workload Scheduler.
  • Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
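
Mapping parameters and variables that drive a SQL override (mentioned above) work like token substitution into the source qualifier query, with values supplied by a parameter file at run time. A rough stand-alone illustration; the `$$` token syntax is Informatica's, but the function and parameter names are hypothetical:

```python
import re

def expand_parameters(sql, params):
    """Replace Informatica-style $$PARAM tokens in a SQL override with
    values from a parameter set (mimicking what a parameter file
    supplies at session run time)."""
    def sub(match):
        name = match.group(1)
        if name not in params:
            raise KeyError(f"undeclared mapping parameter: $${name}")
        return str(params[name])
    return re.sub(r"\$\$(\w+)", sub, sql)
```

Raising on an undeclared parameter mirrors how a session fails fast when the parameter file is missing an expected entry, rather than silently running a malformed query.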

Environment: Informatica Power Center 9.6, Oracle 11g, SQL, IDQ, DB2, Teradata, DataStage, PL/SQL, PERL, Python, TOAD, Microsoft Visio, Autosys, Unix, SQL Server 2008.

Confidential, Long Beach, CA

Informatica ETL Developer

Responsibilities:

  • Involved in Designing High Level Technical Documentation based on specification provided by the Manager.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata
  • Performed agile data loading for Amazon Redshift using the Informatica drag-and-drop cloud designer to create integrations with multiple source objects and targets.
  • Loading data from large data files into Hive tables.
  • Importing and exporting data into HDFS and Hive using Sqoop.
  • Created all the target table DDLs as well as the source table DDLs in Teradata.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Performance tuned ETL processes at the mapping, session and Database level.
  • Integrated sources from different databases and flat files.
  • Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.
  • Independently perform complex troubleshooting, root-cause analysis, solution development
  • Involved in end to end system testing, performance and regression testing and data validations and Unit Testing.
  • Extensively used SQL and PL/SQL Scripts.
  • Basic Informatica administration such as creating folders, users, privileges, server setting optimization, and deployment groups, etc.
  • Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Used Informatica IDQ to do data profiling of the source and check for the accuracy of data using dashboard.
  • Managed Change Control Implementation and coordinating daily, monthly releases and reruns.
  • Used Teradata utilities such as MLOAD, FLOAD, and TPUMP.
  • Created BTEQ scripts.
  • Used UNIX scripts for automating processes.
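
The IDQ profiling bullet above amounts to computing per-column statistics over a source before staging it. A toy stand-in for that idea; the row data and column names are invented, and real IDQ profiles cover far more (patterns, domains, outliers):

```python
def profile_column(rows, column):
    """Basic column profile sketch: row count, null count, distinct
    count, and null percentage, as a data-profiling dashboard would
    report per column."""
    values = [r.get(column) for r in rows]
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = len({v for v in values if v not in (None, "")})
    total = len(values)
    return {
        "rows": total,
        "nulls": nulls,
        "distinct": distinct,
        "null_pct": round(100.0 * nulls / total, 1) if total else 0.0,
    }
```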

Environment: Informatica Power Center 10.1/9.6.1/9.1.0, Informatica Developer Client, IDQ, MDM, Power Exchange, DB2, SAP, Oracle 11g, Hadoop HDFS, Hive, Sqoop, Syncsort, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.

Confidential, Boston, MA

Informatica developer

Responsibilities:

  • Understanding existing business model and customer requirements.
  • Worked on ETL tool Informatica to create maps and transformations.
  • Extensively used ETL to load data from flat files (both fixed-width and delimited) and from the relational database, Oracle 9i.
  • Configured Repository Manager, and created folders and managed objects in it for Informatica.
  • Used Filter, Router, Aggregator, Lookup, Expression, and Update Strategy transformations whenever required in the mapping.
  • Involved in complete software development life cycle (SDLC) of the project.
  • Developed Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.
  • Used Change Data Capture to implement incremental load.
  • Used CDC mechanism to extract data which has changed at source since last extract and load it in Data marts.
  • Involved in Migration projects to migrate data from data warehouses on Oracle/DB2 and migrated those to Teradata
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Identified and fixed bottlenecks and tuned the mappings and sessions for improved performance. Tuned both the ETL processes and the databases.
  • Created and monitored sessions and batches to run the mappings.
  • Involved in creation of sessions and scheduling of sessions.

Environment: Oracle 8i, Informatica 8.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Cognos, Test Director, SQL*Plus, UNIX.

Confidential, Boston, MA

Informatica Developer

Responsibilities:

  • Involved in all the phases of SDLC.
  • Worked closely with business analysts and data analysts to understand and analyze the requirement to come up with robust design and solutions.
  • Involved in data standardization, such as changing a reference data set to a new standard.
  • Ensured that data validated by a third party was checked for accuracy (DQ) before being provided to the internal transformations.
  • Involved in massive data profiling prior to data staging.
  • Created profiles and score cards for the users.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Designed the mappings according to the OBIEE specifications like SDE (Source Dependent Extraction) and SIL (Source Independent Load).
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Responsible for development, support, and maintenance of the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5, using various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, Connected and Unconnected Lookup, etc.
  • Created Informatica components required to operate Data Quality (Power Center required)
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Developed the required Informatica mappings to process the data into Dimension and facts tables which satisfy the OBIEE reporting rules by interacting with reporting team.
  • Developed scripts for creating tables, views, synonyms and materialized views in the data mart.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimension tables.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system.
  • Utilized dimensional and star-schema modeling to come up with new structures to support drill down.
  • Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes.
  • Created mapping documents to outline source-to-target mappings and explain business-driven transformation rules.
  • Ensured that data sourced from a database with valid NOT NULL columns did not undergo a redundant DQ completeness check.
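
The completeness rule in the bullets above (columns the source already constrains as NOT NULL need no downstream DQ completeness check) can be expressed as a small filter. A hedged sketch; the column names and rule shape are illustrative, not an IDQ API:

```python
def columns_needing_completeness_check(all_columns, not_null_columns):
    """DQ rule sketch: only columns the source database does NOT already
    constrain as NOT NULL need a downstream completeness check."""
    exempt = set(not_null_columns)
    return [c for c in all_columns if c not in exempt]

def completeness_failures(rows, columns_to_check):
    """Return (row_index, column) pairs where a checked column is empty,
    as an exception-handling step would route rows to a bad-records table."""
    return [(i, c) for i, r in enumerate(rows)
            for c in columns_to_check if r.get(c) in (None, "")]
```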

Environment: Informatica Power Center 9.1/8.6, PL/SQL Developer, OBIEE, IDQ, Oracle 11g, UNIX, Microsoft SQL Server, TOAD, Teradata, Netezza.

Confidential

Informatica/ETL Consultant

Responsibilities:

  • Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Performed Source System Data Profiling using Informatica Data Explorer (IDE).
  • Involved in designing Staging and Data mart environments and built DDL scripts to reverse engineer the logical/physical data model using Erwin.
  • Implemented pushdown, pipeline partition, persistence cache for better performance.
  • Developed reusable transformations and Mapplets to use in multiple mappings.
  • Implemented Slowly Changing Dimensions (SCD) methodology to keep track of historical data.
  • Assisted the QC team in carrying out its QC process of testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications.
  • Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
  • Involved in Data Quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Involved as a part of Production support.
  • Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission.
  • Managed Change Control Implementation and coordinating daily, monthly releases and reruns.
  • Responsible for code migration, code review, test Plans, test scenarios, test cases as part of Unit/Integrations testing, UAT testing.
  • Used Teradata utilities such as MLOAD, FLOAD, and TPUMP.
  • Used UNIX scripts for automating processes
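
Job auditing of the sort described above (an audit table plus error-handling processes) typically records per-run row counts, rejects, and a status. A hypothetical sketch of that bookkeeping; the audit schema and function names are invented for illustration:

```python
from datetime import datetime

def audit_run(job_name, load_fn, rows):
    """Run a load function over rows and return an audit record of the
    kind an ETL audit table would hold (schema is invented). Failing
    rows are counted as rejects rather than aborting the load."""
    record = {"job": job_name, "started": datetime.now(),
              "rows_read": len(rows)}
    loaded, rejected = 0, 0
    for row in rows:
        try:
            load_fn(row)
            loaded += 1
        except Exception:
            rejected += 1  # a real job would route the row to an error table
    record.update(rows_loaded=loaded, rows_rejected=rejected,
                  status="SUCCESS" if rejected == 0 else "COMPLETED_WITH_ERRORS",
                  ended=datetime.now())
    return record
```

Keeping the audit row separate from the load itself is what makes daily/monthly reruns traceable: each rerun appends a fresh record instead of overwriting the last one.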

Environment: Informatica Power Center 9.1.1, Informatica Developer Client, IDQ, Power Exchange, SAP, Oracle 11g, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.
