
Senior Informatica Developer Resume


Fort Washington, PA

SUMMARY:

  • 8+ years of IT experience in analysis, design, development, and testing in client/server environments, with a focus on data warehousing applications using ETL/OLAP tools such as Informatica, SSIS, and Business Objects against Oracle and SQL Server databases.
  • Involved in the complete software development life cycle (SDLC) of projects, with domain experience in Retail, Healthcare Insurance, and Automotive.
  • Extensive experience in ETL (Extract, Transform, Load), Data Integration, and Data Warehousing using Informatica Power Center.
  • Expertise in Data Warehousing concepts and architecture; thorough knowledge of Dimensional, Conceptual, Logical, and Physical Data Models.
  • Experience in data profiling and in developing Data Quality rules using Informatica Data Quality (IDQ).
  • Worked with IDQ tools for data profiling, data enrichment, and standardization.
  • Experience in developing IDQ mappings that load cleansed data into target tables using various IDQ transformations.
  • Experience in loading data, troubleshooting, and debugging mappings; performance-tuned Informatica objects and transformations to improve session performance.
  • Implemented change data capture (CDC) using Informatica Power Exchange to load data from the Clarity database into the Teradata warehouse.
  • Experience in Informatica mapping specification documentation and in tuning mappings for performance; proficient in creating and scheduling workflows, with expertise in automating ETL processes using scheduling tools such as Autosys.
  • Experience in developing Enterprise level Data warehouse, Data marts and Operational databases.
  • Experience in creating complex mappings using various transformations, and in developing strategies for Extraction, Transformation and Loading (ETL) using Informatica 9.1.6/9.1.0/8.6/7.x/6.x/5.x.
  • Experience in source systems analysis and data extraction from various sources like Flat files, Oracle 11g/10g/9i/8i, DB2 UDB.
  • Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2 UDB, Sybase, Teradata 13/12/V2R5/V2R4, Netezza, MS Access, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x, Erwin 4.0/3.5, SQL*Loader, TOAD, stored procedures, and triggers.
  • Experience in Performance Tuning. Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Experience in using Teradata utilities such as TPump, FastLoad, and MultiLoad. Created BTEQ scripts.
  • Strong data modeling experience using Star/Snowflake schema, Re-engineering, Dimensional Data modeling, Fact & Dimension tables, Physical & logical data modeling.
  • Experience in writing complex subqueries, PL/SQL programs (functions, procedures, packages), stored procedures, and shell scripts to run pre-session and post-session commands (a sketch follows this list).
  • Worked with heterogeneous data sources such as Oracle, SQL Server 2008/2005, flat files, XML files, DB2 UDB, mainframes, and COBOL files.
  • Performed error handling as part of production support in both Informatica and UNIX.
  • Experience in performance tuning of sources, transformations, targets, mapplet, mappings, worklets, workflows, sessions and batches.
  • Expertise in using GUI tools such as Erwin, Toad, SQL Developer, SQL Navigator, and SQL*Loader.
  • Excellent communication skills, team player, ability to work individually and meet tight deadlines.
  • Experience in installation, configuration, and administration of Informatica Power Center 9.x/8.x/7.x/6.x, Power Exchange 8.1, and Power Mart 5.x/6.x client and server.
  • Experience in validating ETL tools such as Informatica Power Center.
  • Extensive experience in writing SQL scripts to validate the database systems and for backend database testing.
  • Strong Team Player with communication and relationship management skills.
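
A minimal sketch of the kind of post-session shell script described in the bullet above, assuming a flat-file source; the argument convention, paths, and log file are hypothetical:

#!/bin/ksh
# Hypothetical post-session command script: archive the flat file the
# session just consumed and log the event. The $1 argument, archive
# directory, and log path are illustrative assumptions.
SRC_FILE="$1"
ARCHIVE_DIR=/data/archive
STAMP=$(date +%Y%m%d%H%M%S)

if [ ! -f "$SRC_FILE" ]; then
    echo "post-session: source file $SRC_FILE not found" >&2
    exit 1
fi

mv "$SRC_FILE" "$ARCHIVE_DIR/$(basename "$SRC_FILE").$STAMP"
echo "$(date): archived $SRC_FILE" >> /data/logs/post_session.log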

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center, Power Exchange 9.5/8.6/8.1/7.1, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, DataStage, Salesforce

Reporting Tools: Business Objects XIR2/6.1/5.0, Qlikview, OBIEE, Microstrategy, Oracle Analytics

Databases: Oracle 11g/10g, MS SQL Server 2008/2005/2000, MS Access, IBM DB2, Teradata 14.0, Netezza 4.0, Siebel CRM, PeopleSoft

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling.

DB Tools: TOAD, SQL Developer, SQL Assistant, Visio, ERWIN

Languages: C, C++, Java, SQL, PL/SQL, Unix Shell Scripting

Operating Systems: UNIX, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS

PROFESSIONAL EXPERIENCE:

Confidential - Fort Washington, PA

Senior Informatica Developer

RESPONSIBILITIES:

  • Translated the business processes/SAS code into Informatica mappings for building the data mart.
  • Used Informatica Power Center to load data from different sources such as flat files, Oracle, and Teradata into the Oracle data warehouse.
  • Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance.
  • Applied business rules that identify relationships among the data using Informatica Data Quality (IDQ 8.6).
  • Modified existing Informatica Data Quality (IDQ 8.6) workflows to integrate the business rules that certify the quality of the data.
  • Defined measurable metrics and required attributes for the subject area to support a robust and successful deployment of the existing Informatica MDM 9.5 platform.
  • Planned Informatica MDM 9.5 requirement analysis sessions with business users.
  • Created Informatica MDM 9.5 Hub Console mappings.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Hands-on experience creating and converting Oracle scripts (SQL, PL/SQL) to Teradata scripts (a sketch follows this list).
  • Configured rules for the Power Center operations team covering no-file monitoring, process-not-started alerts, reject records, and long-running jobs.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Assisted the QC team in carrying out its QC process of testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications.
  • Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
  • Created mappings using Data Services to load data into SAP HANA.
  • Involved in Data Quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Responsible for requirement definition and analysis in support of Data Warehousing efforts.
  • Developed ETL mappings, transformations using Informatica Power Center 8.6.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.
  • Developed data Mappings between source systems and target system using Mapping Designer.
  • Developed shared folder architecture with reusable Mapplets and Transformations.
  • Extensively worked with the Debugger for handling the data errors in the mapping designer.
  • Created events and various tasks in the workflows using Workflow Manager.
  • Responsible for tuning ETL procedures to optimize load and query Performance.
  • Set up batches and sessions to schedule loads at the required frequency using Informatica Workflow Manager and an external scheduler.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
  • Took part in Informatica administration; migrated development mappings as well as hot fixes to the production environment.
  • Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
  • Migrating Informatica Objects using Deployment groups.
  • Troubleshot issues in TEST and PROD; performed impact analysis and fixed the issues.
  • Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
  • Developed unit test cases and unit test plans to verify the data loading process, and used UNIX scripts for automating processes.
  • Involved as a part of Production support.
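
A minimal sketch of the Oracle-to-Teradata conversion work referenced in the list above, run through BTEQ from a UNIX shell; the logon string, database, and table names are hypothetical, and the NVL-to-COALESCE rewrite is one illustrative conversion:

#!/bin/ksh
# Hypothetical BTEQ wrapper for a script converted from Oracle:
# Oracle's NVL() is rewritten as COALESCE() for Teradata. The logon
# credentials and object names are illustrative.
bteq <<'EOF' > conv_customer.log 2>&1
.LOGON tdprod/etl_user,etl_password;
UPDATE stg.customer
SET region_cd = COALESCE(region_cd, 'UNK');
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
exit $?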

ENVIRONMENT: Informatica Power Center 9.1, Informatica MDM 9.5, Informatica IDQ 8.6, Power Exchange, Teradata, Oracle 11g, MS Access, UNIX Shell Scripts, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, Qlikview, SQL Assistant, Netezza, DB2.

Confidential - Tampa, FL

Sr. ETL Informatica / IDQ

RESPONSIBILITIES:

  • Creating test input requirements and preparing the test data for data-driven testing.
  • Writing modification requests for bugs in the application and helping developers track and resolve problems in the warehouse.
  • Developing and optimizing standard, reusable mappings and mapplets, as well as various reusable transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Sorter.
  • Working on Teradata stored procedures and functions to conform the data and load it into the target tables.
  • Using Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
  • Working with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server into Teradata.
  • Creating customized MLoad scripts on the UNIX platform for Teradata loads.
  • Using Teradata Data Mover for its broader data management capabilities, such as copying indexes and global temporary tables.
  • Writing Teradata Macros and used various Teradata analytic functions.
  • Automating UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, as a consistency check (a sketch follows this list).
  • Recommending physical tuning strategies for overall better performance of database, queries and load process by suggesting vertical and horizontal partitioning of existing tables and changes in data types where possible for saving space.
  • Performing data profiling, Address Doctor validation, monitoring, and cleansing across the enterprise through Informatica Data Quality (IDQ).
  • Using Informatica IDQ for initial data profiling and for matching and removing duplicate data.
  • Moving the ETL jobs From Development to QA Server.
  • Wrote all DDL scripts to create tables, views, transaction tables, triggers, and stored procedures for base tables and CDC processes in all layers.
  • Designed Jobs by Unix Shell Scripts for Tivoli to schedule workflows. Wrote SOP/AID documents for smooth transfer of project.
  • Migrated codes from Dev to Test to Pre-Prod. Created effective Unit, Integration test of data on different layers to capture the data discrepancies/inaccuracies to ensure successful execution of accurate data loading.
  • Designed the ETL process using DataStage tool to load data from source to target Schema.
  • Extensively worked on CDC to capture data changes in the sources for delta loads. Used the Debugger to validate the mappings and gather troubleshooting information about the data and error conditions.
  • Provide technical support for the IDQ, DIH and MDM Environment.
  • Responsible for ensuring that jobs set up in the EDW/DIH/ETL Integration environment complete on a daily basis.
  • Moving the ETL jobs from Development to Production.
  • Debugged invalid mappings and tested mappings, sessions, and workflows to identify bottlenecks, then tuned them for better performance.
  • Involved in the data center migration and the upgrade of Informatica DIH and Control-M.
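
A minimal sketch of the daily record-count check described in the list above, assuming the count is pulled out of BTEQ output with awk; the logon string, table, and mail recipient are hypothetical:

#!/bin/ksh
# Hypothetical daily consistency check: confirm the incremental load
# added rows to a base table today. Logon, table name, and recipient
# are illustrative assumptions.
TODAY_CNT=$(bteq 2>/dev/null <<'EOF' | awk '/^ *[0-9]+ *$/ {print $1; exit}'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM edw.orders WHERE load_dt = CURRENT_DATE;
.LOGOFF;
.QUIT;
EOF
)

if [ "${TODAY_CNT:-0}" -eq 0 ]; then
    echo "no rows loaded into edw.orders today" |
        mailx -s "Incremental load check FAILED" etl_support@example.com
    exit 1
fi
echo "$(date): edw.orders incremental count = $TODAY_CNT"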

Environment: Informatica Power Center 9.6.1 HF2, IDQ, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, DIH, SQL Developer, Tivoli, Service Now, Power Designer, SAP HANA, Business Objects.

Confidential - Fairfax, Virginia

ETL Informatica Developer

RESPONSIBILITIES:

  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Worked on Teradata SQL, BTEQ, MLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to run ETL interfaces through BTEQ, FastLoad, or FastExport. Created numerous Volatile, Global Temporary, Set, and MultiSet tables. Created batch jobs for FastExport.
  • Prepared Conceptual Solutions and Approach documents and gave Ballpark estimates.
  • Designed and developed Amazon Redshift databases.
  • Worked with XML, Redshift, Flat file connectors.
  • Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.
  • Data modeling and design of the data warehouse and data marts in star schema methodology, with conformed and granular dimensions and FACT tables.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Worked on Inbound, Outbound and Carve-out Data Feeds and developed mappings, sessions, workflows, command line tasks etc. for the same.
  • Involved in migration projects that moved data warehouses from Oracle/DB2 to Teradata.
  • Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA’s, operations and business units to build and deploy.
  • Used DataStage to manage the metadata repository and to import/export jobs.
  • Worked with connected and unconnected stored procedures for pre- and post-load sessions.
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Used the PMCMD command to start, stop, and ping the server from UNIX, and created shell scripts to automate the process (a sketch follows this list).
  • Worked on production tickets to resolve the issues in a timely manner.
  • Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.
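
A minimal sketch of the PMCMD automation mentioned in the list above; the service, domain, folder, and workflow names are hypothetical, and INFA_USER/INFA_PWD are assumed to be exported by the calling environment:

#!/bin/ksh
# Hypothetical pmcmd wrapper: ping the Integration Service, then start
# a workflow and wait for it to finish. All names are illustrative;
# INFA_USER and INFA_PWD are assumed to come from the environment.
INFA_SVC=IS_PROD
INFA_DOM=Domain_PROD
FOLDER=EDW_LOADS
WF=wf_daily_load

pmcmd pingservice -sv "$INFA_SVC" -d "$INFA_DOM" || {
    echo "Integration Service $INFA_SVC is not responding" >&2
    exit 1
}

pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOM" \
    -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" -wait "$WF"
RC=$?
[ "$RC" -ne 0 ] && echo "workflow $WF failed with rc=$RC" >&2
exit $RC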

ENVIRONMENT: Informatica Power Center 9.x, Informatica IDQ, Oracle 10g, Teradata, SQL Server 2008, Toad, SQL*Plus, SQL Query Analyzer, SQL Developer, MS Access, Tivoli Job Scheduler, Windows Azure.

Confidential - Hartford, CT

ETL Developer

RESPONSIBILITIES:

  • Involved in all the phases of SDLC.
  • Worked closely with business analysts and data analysts to understand and analyze the requirement to come up with robust design and solutions.
  • Involved in standardization of data, such as changing a reference data set to a new standard.
  • Data validated by a third party before being provided to the internal transformations was still checked for accuracy (DQ).
  • Involved in massive data profiling prior to data staging.
  • Created profiles and score cards for the users.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Involved in massive data cleansing prior to data staging from flat files.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5, with various transformations such as Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and connected and unconnected Lookups.
  • Created Informatica components required to operate Data Quality (Power Center required)
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Developed scripts for creating tables, views, synonyms, and materialized views in the data mart (a sketch follows this list).
  • Involved in designing and developing logical and physical data models to best suit the requirements.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system.
  • Utilized dimensional and star-schema modeling to come up with new structures to support drill down.
  • Converted business requirements into highly efficient, reusable, and scalable Informatica ETL processes.
  • Created mapping documents to outline source-to-target mappings and explain business-driven transformation rules.
  • Data sourced from a database with valid NOT NULL columns did not undergo the DQ check for completeness.
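
A minimal sketch of the data-mart object-creation scripts referenced in the list above, run through SQL*Plus; the connect string and the materialized-view definition are hypothetical:

#!/bin/ksh
# Hypothetical data-mart DDL run through SQL*Plus. The connect string
# and object names are illustrative.
sqlplus -s etl_user/etl_password@DMPROD <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE
CREATE MATERIALIZED VIEW dm.mv_daily_sales
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
AS
SELECT sale_dt, store_id, SUM(sale_amt) AS total_amt
FROM   dm.fact_sales
GROUP BY sale_dt, store_id;
EXIT
EOF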

ENVIRONMENT: Informatica Power Center 9.1, Oracle 11g, Teradata, SAP HANA, UNIX Shell Scripts.

Confidential

Datawarehouse Developer

RESPONSIBILITIES:

  • Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Performed Source System Data Profiling using Informatica Data Explorer (IDE).
  • Involved in designing Staging and Data mart environments and built DDL scripts to reverse engineer the logical/physical data model using Erwin.
  • Extracted data from SAP using Power Exchange and loaded data into SAP systems.
  • Translated the business processes/SAS code into Informatica mappings for building the data mart.
  • Implemented pushdown, pipeline partition, persistence cache for better performance.
  • Developed reusable transformations and Mapplets to use in multiple mappings.
  • Implemented Slowly Changing Dimension (SCD) methodology to keep track of historical data.
  • Assisted the QC team in carrying out its QC process of testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications.
  • Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
  • Involved in Data Quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Involved as a part of Production support.
  • Informatica Power Exchange for Mainframe was used to read/write VSAM files from/to the Mainframe.
  • Basic Informatica administration such as creating folders, users, privileges, server setting optimization, and deployment groups, etc.
  • Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
  • Responsible for Code Migration, Code Review, Test Plans, Test Scenarios, Test Cases as part of Unit/Integrations testing, UAT testing.
  • Used Teradata utilities such as MLoad, FLoad, and TPump (a sketch follows this list).
  • Used UNIX scripts for automating processes.
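
A minimal sketch of a Teradata FastLoad job like the utility work noted in the list above, assuming a pipe-delimited flat file loaded into an empty staging table; the logon string, table, and field layout are hypothetical:

#!/bin/ksh
# Hypothetical FastLoad job: bulk-load a pipe-delimited file into an
# empty staging table. Logon, table names, and the record layout are
# illustrative assumptions.
fastload <<'EOF' > fload_customer.log 2>&1
LOGON tdprod/etl_user,etl_password;
DATABASE stg;
BEGIN LOADING stg.customer_ld
    ERRORFILES stg.customer_err1, stg.customer_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       region_cd (VARCHAR(6))
FILE = /data/in/customer.dat;
INSERT INTO stg.customer_ld
VALUES (:cust_id, :cust_name, :region_cd);
END LOADING;
LOGOFF;
EOF
exit $?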

ENVIRONMENT: Informatica Power Center 9.1.1, Informatica Developer Client, IDQ, Power Exchange, SAP, Oracle 11g, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.
