
Sr. Informatica Developer Resume

St Petersburg, FL


  • 8+ years of experience in Information Technology, with a focus on database management, data warehousing, data integration, and data migration projects.
  • Experienced in database design and development, data warehousing, and ETL processes using Informatica Power Center 9.x: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Metadata Manager, Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
  • Experience in data modeling: Star/Snowflake schema modeling, fact and dimension tables, and physical and logical data modeling using the Erwin data modeling tool.
  • Analyzed business requirements, technical specifications, source repositories, physical data models, and source-to-target mappings for ETL mapping and process flow.
  • Expertise in developing complex Informatica mappings to load data from various sources into the data warehouse, using transformations such as Joiner, Aggregator, Update Strategy, Router, Lookup (connected and unconnected), Expression, Sequence Generator, Filter, Sorter, and Source Qualifier, along with various built-in functions.
  • Extensive experience with Informatica Power Center 9.6/9.5/9.0, IDQ (Informatica Data Quality), and Informatica MDM (Master Data Management).
  • Extensively used Informatica client tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager, and server tools such as Informatica Server and Repository Server.
  • Excellent data analysis skills and the ability to translate business logic into mappings using complex transformation logic for ETL processes.
  • Implemented complex business rules in Informatica Power Center by creating reusable transformations and robust Mapplets.
  • Created mappings and workflows to extract data from Oracle, SQL Server, and flat-file sources and load it into various business entities.
  • Experience in Performance Tuning techniques at various levels such as Source, Target, Mapping, and Session levels.
  • Experience in scheduling workflows using both Informatica and external scheduling tools such as Autosys, Control-M, cron, and Tivoli.
  • Good knowledge of dimensional and relational data modeling concepts such as Star-Schema modeling, Snowflake schema modeling, and fact and dimension tables.
  • Knowledge of SQL Server Integration Services (SSIS), including deriving data with the expression language and loading data through various transformations.
  • Knowledge of reporting tools such as Tableau: data blending, connecting to cubes, and creating views, stories, waterfall charts, and histograms.
  • Good experience in data warehousing applications; responsible for the extraction, cleansing, transformation, and loading of data from various sources into the data warehouse.
  • Developed Slowly Changing Dimension (SCD) Type 1 and Type 2 Informatica mappings.
  • Worked with Informatica Data Quality (IDQ) for data cleansing, data matching and data conversion.
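
The Slowly Changing Dimension work listed above can be sketched in Python. This is only an illustration of Type 2 behavior: the customer_id/address columns and the effective-date fields are hypothetical, and in the actual mappings this logic lives in Lookup and Update Strategy transformations.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Sketch of SCD Type 2: expire the current row version on change
    and insert a new current version; insert new keys directly."""
    today = today or date.today()
    for row in incoming:
        current = next((d for d in dimension
                        if d["customer_id"] == row["customer_id"]
                        and d["is_current"]), None)
        if current is None:
            # New key: insert the first version.
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["address"] != row["address"]:
            # Changed attribute: close out the old version, add a new one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension
```

A Type 1 variant would simply overwrite the attribute in place instead of versioning the row.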


Data warehousing ETL tools: Informatica Power Center 9.x, Informatica MDM, Informatica Data Quality 9.1(IDQ), Microsoft SSIS

Data Modeling: Relational Data Modeling, Dimensional Data Modeling, Star Schema, Snow-Flake Modeling, Fact and Dimensions Tables

Databases: Oracle 11g/10g, SQL Server 2014/2012/2008/2008 R2, DB2, Teradata V12

Reporting Tools: Tableau Desktop, Cognos, SSRS

Languages: SQL, PL/SQL, C, C++

DB Tools: SQL*Plus, SQL*Loader, SQL*Forms, TOAD

Web Tools: ASP.NET, JScript.NET, HTML, JavaScript

Querying Tools: Oracle SQL Developer, MS Visual Studio, SQL Server Management Studio

Operating Systems: Windows XP/Vista/7/10, Windows 95/98, UNIX

Others: MS Office, MS Project, MS Visio, MS OneNote


Confidential - St. Petersburg, FL

Sr. Informatica Developer


  • Developed Informatica conversion mappings according to the business rules and imported them through Mapping Architect for Visio.
  • Involved in documenting technical specifications and the business data flow from source to targets.
  • Developed various complex Informatica mappings, Mapplets and reusable transformations for loading data into data warehouse.
  • Implemented SCD Type1 and Type2 method in ETL mapping to load data into EDW.
  • Created transformations such as B2B Data Transformation, Expression, Sorter, Aggregator, Router, Filter, Lookup, Sequence Generator, XML Generator, XML Parser, Web Services, HTTP, SQL, Joiner, and Update Strategy for loading data into the targets.
  • Created web services request-response mappings by importing source and target definition using WSDL file.
  • Created conversion scripts using Oracle SQL queries, functions, and stored procedures, along with test cases and plans, before ETL migrations.
  • Created Mapplets and reusable transformations to reduce coding effort and simplify code maintenance.
  • Identified and removed bottlenecks, tuning mappings and sessions to improve performance.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Performed Informatica code migrations using deployment groups and XML export/import.
  • Migrated mappings, sessions, workflows, and common objects from development to acceptance and on to production.
  • Worked on advanced Informatica concepts, including implementation of Informatica Pushdown Optimization and Partitioning.
  • Designed and developed dashboards in Tableau using action filters, user filters, advanced navigation techniques, and Level-of-Detail (LOD) expressions.
  • Blended data from multiple databases into a single report for data validation.
  • Prepared dashboards using calculations and parameters in Tableau.
  • Deployed, scheduled, and monitored deployment groups in the development, acceptance, and production environments.
  • Created new database objects such as procedures, functions, packages, triggers, indexes, and views using T-SQL in development and production environments for SQL Server 2008/2012.
  • Handled on-call production support to fix incidents in a timely manner.
  • Used debugger tool to fix bugs and defects.

Environment: Informatica 9.6.1, Oracle 9i, DB2, MS SQL Server 2016, T-SQL, Microsoft Visual Studio 2008/2010, Tableau Desktop.

Confidential - Woodland Hills, CA

Sr. Informatica Developer


  • Involved in preparing the ETL technical specification document and analyzed approaches for working with different source systems such as flat files and SQL Server databases.
  • The project followed an agile methodology for delivering the implemented objects.
  • Built a team of 9 ETL developers (2 onsite, 7 offshore) to gather lineage from Informatica objects and database tasks and to support production applications.
  • Provided weekly status updates to the business reporting manager on project milestones, highlighted the team's achievements, and prepared the weekly status report.
  • Used the Erwin tool to develop fact and dimension tables and logical and physical models.
  • Created Python scripts to generate flat files from SQL tables using the SQL BCP utility and transmitted the flat files to a downstream server using a UNIX credential object.
  • Successfully migrated an SSIS ETL project to Informatica 9.6.x.
  • Used Informatica Metadata Manager and Custom Metadata Configurator to display the Lineage data in graphical form.
  • Supported and maintained their existing Informatica Power Center Workflow manager sessions and workflows to run the logic embedded in the mappings.
  • Involved in creating, enhancing and maintaining several activities of Data Warehouse for modifying target tables as per requirement.
  • Worked on the performance tuning of existing sessions and mappings.
  • Generated shell scripts for automated daily load processes.
  • Used the Informatica IDQ tool to find null values and mixed-case values and to standardize them. Worked on migrating Informatica ETLs to SSIS packages.
  • Effectively interpreted session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.
  • Provided authoritative and trustworthy data to downstream reporting systems by using Informatica IDQ.
  • Worked with the Informatica IDQ tool to create rules and validate data, cleanse data, and build scorecards on existing rules.
  • Prepared documentation for the business data flow from sources to targets and for the migration of ETLs from Informatica to SQL Server SSIS packages.
  • Performed debugging, troubleshooting, optimization, and performance tuning for the ETL process.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system.
  • Developed complex SQL queries using CASE statements, subqueries, and joins in PL/SQL, and created complex stored procedures and functions to meet user requirements.
  • Developed reports in Tableau, using Oracle as the database and ETL tool.
  • Created solution-driven dashboards by developing different chart types, including heat maps, geo maps, symbol maps, pie charts, bar charts, tree maps, line charts, area charts, and scatter plots in Tableau Desktop.
  • Created new databases for new clients, set up database objects such as tables, indexes, procedures, and functions in development and production environments, and tuned database performance by updating statistics and backing up and restoring logs.
  • Involved in User acceptance testing and worked on data cleaning.
  • Troubleshot existing data loads and quickly fixed bugs per tickets opened by reporting users.
  • Collaborated closely with the manager and team members through daily stand-up meetings, working under the agile process.
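
The flat-file extracts described above (SQL tables dumped to delimited files for downstream transfer) can be sketched with Python's standard library. Here sqlite3 and csv stand in for the actual SQL Server/BCP setup, and the table, column, and delimiter choices are hypothetical.

```python
import csv
import io
import sqlite3

def export_table_to_flat_file(conn, table, out):
    """Dump a table to a pipe-delimited flat file (writable text stream),
    header row first. Sketch of a BCP-style extract."""
    cur = conn.execute(f"SELECT * FROM {table}")
    writer = csv.writer(out, delimiter="|", lineterminator="\n")
    writer.writerow([col[0] for col in cur.description])  # column names
    writer.writerows(cur.fetchall())

# Example: build a tiny table in memory and export it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 9.5), (2, 3.0)])
buf = io.StringIO()
export_table_to_flat_file(conn, "sales", buf)
```

In the production setup, the output stream would be a file handed off to the downstream transfer step rather than an in-memory buffer.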

Environment: Informatica Power Center 9.6.1, Oracle 11g, MS SQL Server 2016, SQL, Informatica Data quality 9.1 (IDQ), Informatica MDM, PL/SQL, Tableau, MS Visual studio 2015.

Confidential - Destin, FL

Informatica Developer


  • Involved in gathering, analyzing, and documenting business requirements, functional requirements, and data specifications from users, and transformed them into technical specifications.
  • Extracted data from various sources such as flat files, XML files, and Oracle, and loaded it into target tables.
  • Worked on Informatica client tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.
  • Created complex mappings using the Mapping designer, respective workflows and Worklets using the Workflow manager.
  • Troubleshot mappings using the Debugger and improved data loading efficiency using SQL overrides and Lookup SQL overrides.
  • Implemented incremental loads, Change Data Capture, and Incremental Aggregation.
  • Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
  • Created UNIX shell scripts to encrypt and decrypt the files, move the files, archive the files and split the files.
  • Developed UNIX shell scripts and SQL queries to get data from Oracle tables.
  • Created stored procedures, functions and triggers to load data into summary tables.
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Wrote extensive complex SQL to extract data from sources and load it into the target tables.
  • Implemented parallelism in loads by partitioning workflows using Key Range partitioning.
  • Worked on MDM manual maintenance to add, delete, update or merge data.
  • Experience in MDM implementation including data profiling, data migration and pre-landing processing.
  • Created Informatica Data Quality services (Data Integration Service, Analyst Service, and Content Management Service) and gained experience using the Informatica Developer and Analyst tools.
  • Responsible for migration of the mappings and sessions from development repository to production repository and provided 24/7 production support.
  • Practiced agile methodology while strategizing and implementing solutions.
  • Involved in Code reviews.
  • Scheduled Informatica workflows using the TWS scheduling tool.
  • Developed Unit test plans for every mapping developed and executed the test plans.
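
The file-handling scripts mentioned above (moving and archiving load files) can be sketched in Python rather than shell; the directory layout and the .dat pattern are hypothetical stand-ins for the actual inbound-file conventions.

```python
import shutil
from datetime import date
from pathlib import Path

def archive_processed_files(inbound_dir, archive_dir, pattern="*.dat"):
    """Move processed load files into a dated archive folder and
    return the new paths. Sketch of the archiving step only; the
    encryption and file-splitting steps are omitted."""
    inbound = Path(inbound_dir)
    archive = Path(archive_dir) / date.today().isoformat()
    archive.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(inbound.glob(pattern)):
        dest = archive / f.name
        shutil.move(str(f), str(dest))
        moved.append(dest)
    return moved
```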

Environment: Informatica Power Center 9.6, Meta Data, Informatica MDM, Oracle 11g, Toad, HP Quality Center, MS Office Suite, Autosys, Erwin, UNIX Shell.

Confidential - Phoenix, AZ

Informatica developer


  • Gathered requirements and implemented them into source-to-target mappings.
  • Integrated data sources such as SQL Server and MS Access, as well as non-relational sources such as flat files, into the staging area.
  • Designed custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.
  • Worked on dimensional data modeling using the Erwin data modeling tool.
  • Populated Data Marts and did System Testing of the Application.
  • Built the Informatica workflows to load table as part of data load.
  • Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
  • Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
  • Extensively used the Sequence Generator in all mappings. Urgently fixed bugs and tickets raised in production for existing mappings in the common folder for new files, using versioning (check-in and check-out), and supported QA in component unit testing and validation.
  • Used shortcuts for sources, targets, transformations, Mapplets, and sessions to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Applied slowly changing Dimensions Type I and Type II on business requirements.
  • Worked with large amounts of data, independently executing data analysis using appropriate tools and techniques, interpreting results, and presenting them to both internal and external clients.
  • Wrote SQL queries to create end-user reports, and developed SQL queries and stored procedures in support of ongoing work and application support.
  • Fine-tuned ETL processes by considering mapping and session performance issues.
  • Responsible for creating workflows and Worklets; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Maintained clear communication between other teams and the client.
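
The end-user report queries mentioned above can be illustrated with a small self-contained example using Python's sqlite3 module as the database; the orders table and the amount tiers are hypothetical.

```python
import sqlite3

# Build a tiny table to report against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 50.0), (2, 500.0), (3, 5000.0)])

# A CASE-based tiering query of the kind used in end-user reports.
REPORT_SQL = """
SELECT CASE
         WHEN amount < 100 THEN 'small'
         WHEN amount < 1000 THEN 'medium'
         ELSE 'large'
       END AS tier,
       COUNT(*) AS orders
FROM orders
GROUP BY tier
ORDER BY tier
"""
rows = conn.execute(REPORT_SQL).fetchall()
```

In practice the same query shape would run against SQL Server or Oracle, often wrapped in a stored procedure.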

Environment: Informatica Power Center 9.5.1, SQL, PL/SQL, UNIX, Shell Scripting, SQL Server 2008.


ETL Informatica Consultant


  • Worked with Development Life Cycle teams.
  • Designed SSIS packages to extract data from various data sources, such as Access databases, Excel spreadsheets, and flat files, into SQL Server 2005/2008 R2 for further data analysis and reporting using multiple transformations; scheduled and maintained nightly and weekly data loads by creating the corresponding job tasks.
  • Wrote database triggers to automatically update tables and views.
  • Designed and developed forms and reports.
  • Created jobs using Informatica Workflow Manager to validate scheduled runs, and monitored jobs using Workflow Monitor.
  • Analyzed, designed, developed, implemented and maintained moderate to complex initial load and incremental Informatica mappings and workflows to provide data for Data Warehouse.
  • Created several mappings using various active and passive transformations like Source Qualifier, Lookup, Router, Aggregator, Filter, Joiner and Expression.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Developed Conceptual Business Model, Logical Data Model, Physical Data Model and Star-Schema design.
  • Researched the operational data store (ODS) and legacy systems to find qualified source data.
  • Coordinated with source system owners, monitored day-to-day ETL progress, and handled data warehouse target schema design (Star Schema) and maintenance.
  • Extensively created MLOAD and FLOAD control files to load data into EDW tables.
  • Used Shell Scripting to automate the loading process.
  • Used Pipeline Partitioning feature in the sessions to reduce the load time.
  • Analyzed and Created Facts and Dimension Tables.
  • Performed regular backup of data warehouse, backup of various production, development repositories including automating and scheduling processes.
  • As part of optimization process, performed design changes in Informatica mappings, transformations, sessions.
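
The pipeline-partitioning idea behind the session tuning above can be sketched in Python; the id key, the range boundaries, and the thread-based loader are hypothetical stand-ins for Informatica's session partitioning.

```python
from concurrent.futures import ThreadPoolExecutor

def partition_by_key_range(rows, boundaries, key="id"):
    """Split rows into len(boundaries)+1 partitions by key range,
    mimicking key-range partitioning of a load."""
    partitions = [[] for _ in range(len(boundaries) + 1)]
    for row in rows:
        # Index = number of boundaries the key is at or beyond.
        idx = sum(row[key] >= b for b in boundaries)
        partitions[idx].append(row)
    return partitions

def load_partitions_in_parallel(partitions, load_fn, workers=4):
    """Run the per-partition load function concurrently and
    return results in partition order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(load_fn, partitions))
```

Splitting the load this way is what lets independent partitions be written concurrently, which is the source of the reduced load time claimed above.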

Environment: Informatica Power Center 9.0.x, ETL, SSIS, Oracle 9i, MS Access, SQL, PL/SQL, MS Visual Studio 2008.
