
Sr. Informatica Developer Resume

Shakopee, MN


  • Around 7 years of IT experience with expertise in the analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using ETL tools with RDBMSs such as Oracle, MS SQL Server, Teradata, DB2 and PostgreSQL on Windows and UNIX platforms.
  • Strong experience in Informatica Cloud, PowerCenter 10.0/9.x/8.x and PowerExchange 10.0/9.x/8.x.
  • Worked on multiple data synchronization tasks using Informatica Cloud, including cloud application to cloud application, on-premise application to cloud application, and on-premise application to on-premise application.
  • Performed data integration with SFDC and Microsoft Dynamics CRM using Informatica Cloud.
  • Extensive exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5.
  • Experience in the integration of various data sources such as Oracle, Teradata, Mainframes, SQL Server, XML, flat files and JSON, with extensive experience on Oracle, Teradata and MS SQL Server.
  • Very strong in data warehousing concepts such as Type I, II and III dimensions, facts, surrogate keys, ODS, staging areas, etc.
  • Solid understanding of data modeling (dimensional & relational) concepts such as star-schema and snowflake-schema modeling.
  • Proficient in data analysis, data validation, data lineage, data cleansing, data verification and identifying data mismatches.
  • Expert in writing complex SQL queries and PL/SQL and in optimizing queries in Oracle, SQL Server and Teradata; also excellent in working with views, synonyms, indexes, partitioning, database joins, stored procedures, statistics and optimization.
  • Well-versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool space issues etc.
  • Experience in developing very complex mappings, reusable transformations, sessions and workflows using Informatica ETL tool to extract data from various sources and load into targets.
  • Experience in tuning and scaling the procedures for better performance by running explain plan and using different approaches like hint and bulk load.
  • Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution times for huge data volumes on company merger projects. Extensively created mapplets, user-defined functions, reusable transformations and lookups.
  • Technical expertise in designing technical processes using internal modeling & working with analytical teams to create design specifications; successfully defined & designed critical ETL processes, extraction logic, job control audit tables, dynamic generation of session parameter files, file mover processes, etc.
  • Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump and TPT, and tools such as SQL Assistant and Viewpoint.
  • Expert in T-SQL coding and testing: functions, views, triggers, cursors, dictionary, stored procedures etc.
  • Assisted the other ETL developers in solving complex scenarios and coordinated with source systems owners with day-to-day ETL progress monitoring.
  • Skilled in creating & maintaining ETL specification documents, use cases, source-to-target mappings and requirement traceability matrices, performing impact assessments and providing effort estimates and deployment artifacts.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Deftly executed multi-resource projects following the onsite-offshore model while serving as a mentor for junior team members.
  • Strong knowledge of the Hadoop ecosystem (HDFS, Hive, Impala, Hue, etc.).
  • Exposure to the end-to-end SDLC and Agile methodology.
  • Good communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.
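As an illustration of the MD5-based change data capture mentioned above, the core idea can be sketched in Python (a minimal sketch only; the column names, delimiter and hashing scheme are hypothetical stand-ins for what an Informatica expression transformation would compute):

```python
import hashlib

def row_md5(row, columns):
    """Concatenate the tracked columns with a delimiter and hash them,
    mirroring an MD5 checksum computed per row during extraction."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(incoming, existing_hashes, key, columns):
    """Classify incoming rows as inserts or updates by comparing MD5
    digests against the hashes stored for previously loaded rows."""
    inserts, updates = [], []
    for row in incoming:
        digest = row_md5(row, columns)
        old = existing_hashes.get(row[key])
        if old is None:
            inserts.append(row)          # new key: insert
        elif old != digest:
            updates.append(row)          # changed attributes: update
        # identical digest: row unchanged, skip it
    return inserts, updates
```

Comparing one digest per row instead of every column individually is what keeps this approach cheap on wide tables.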


ETL: Informatica Cloud, Informatica PowerCenter 10.0/9.x/8.x, Informatica PowerExchange 10.0/9.x/8.x

ETL Scheduling Tools: Control-M, ESP

RDBMS: DB2, Oracle 11g/12c, Teradata 15/13, SQL Server 2008/2014, MySQL, Snowflake Cloud Database, PostgreSQL 9.2

Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)

Data Modeling Tools: Erwin 9.3/7.5

UNIX: UNIX, Shell scripting

Reporting Tools: Cognos 10.0 /9.0, QlikView, Tableau 9/10

Defect Tracking Tools: Quality Center, Bugzilla

Operating Systems: Windows XP/2000/9x/NT, UNIX

Source Management: BitBucket, Visual SourceSafe

Programming Languages: C, C++, PL/SQL

Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Teradata Viewpoint, JIRA, Rally


Confidential, Shakopee, MN

Sr. Informatica Developer


  • Involved in all phases of the SDLC, from requirement gathering, design, development and testing through production deployment, user training and support of the production environment.
  • Actively involved in interacting with business users to record user requirements and perform business analysis.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Developed data synchronization tasks that load data from a source to a target and apply transformations in transit.
  • Created mappings that chain multiple complex operations (filters, joins, functions) to build a complex integration process.
  • Performed create, read, update and delete (CRUD) operations, as well as bulk pulling and loading of data to third-party systems and on-premise databases.
  • Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and AWS cloud database.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Created the design and technical specifications for the ETL process of the project.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
  • Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated data into Snowflake from the S3 bucket using complex SQL queries.
  • Loaded diverse data types (structured, JSON, flat files, etc.) into the Snowflake cloud data warehouse.
  • Worked on Informatica PowerCenter tool - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
  • Worked with various complex mappings; designed Type 1 and Type 2 slowly changing dimensions.
  • Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and Maintenance activities of the data warehouse.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Implemented various new components like increasing the DTM Buffer Size, Database Estimation, Incremental Loading, Incremental aggregation, Validation Techniques, and load efficiency.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities.
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Tuning the mappings based on criteria, creating partitions in case of performance issues.
  • Tested end to end to verify failures in the mappings using shell scripts.
  • Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
  • Resolving the tickets based on the priority levels raised by QA team.
  • Developed parameter files for passing values to the mappings for each type of client.
  • Scheduled batches and sessions within Informatica using the Informatica scheduler, and wrote shell scripts for job scheduling.
  • Understanding the entire functionality and major algorithms of the project and adhering to the company testing process.
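The Type 1/Type 2 slowly changing dimension handling described above can be sketched in Python for the Type 2 case (a minimal illustration; the key, attribute and date column names are hypothetical, and a real mapping would also assign surrogate keys and handle late-arriving data):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" end date

def apply_scd2(dimension, change, key, tracked, today):
    """Apply one source row to a Type 2 dimension (a list of dicts):
    expire the current version if any tracked attribute changed,
    then append a new current row."""
    current = next((r for r in dimension
                    if r[key] == change[key] and r["end_date"] == HIGH_DATE),
                   None)
    if current is not None:
        if all(current[c] == change[c] for c in tracked):
            return dimension             # nothing changed: no new version
        current["end_date"] = today      # close out the old version
    dimension.append({key: change[key],
                      **{c: change[c] for c in tracked},
                      "start_date": today, "end_date": HIGH_DATE})
    return dimension
```

Each business key thus carries its full history, with exactly one open-ended row as the current version.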

Environment: Informatica Cloud, Informatica PowerCenter 10.0, Tableau 10.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Snowflake cloud data warehouse, AWS S3 Bucket, SQL Server 2012, PostgreSQL 9.3, Control-M, Shell Scripting, JSON, SQL*Loader

Confidential, Minneapolis, MN

Sr. Informatica Developer


  • Coordinated with various business users, stakeholders and SMEs to obtain functional expertise, review designs and business test scenarios, participate in UAT and validate data from multiple sources.
  • Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.
  • Worked on SQL tools like TOAD and SQL Developer to run SQL Queries and validate the data.
  • Scheduled Informatica Jobs through Autosys scheduling tool.
  • Involved in creating Informatica mappings, mapplets, worklets and workflows to populate the data from different sources to warehouse.
  • Responsible to facilitate load testing and benchmarking the developed product with the set performance standards.
  • Used MS Excel, Word, Access, and Power Point to process data, create reports, analyze metrics, implement verification procedures and fulfill client requests for information.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump scripts to load data into Teradata tables.
  • Created UNIX shell scripts for the automation of ETL processes.
  • Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.
  • Involved in testing the database using complex SQL scripts and handled the performance issues effectively.
  • Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.
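The Teradata utility scripts mentioned above were typically generated per table; a minimal Python sketch of rendering a FastLoad control script (the host, database, credential placeholder and column layout are all hypothetical, and production scripts would add error handling, CHECKPOINT and session settings):

```python
def fastload_script(table, datafile, columns, tdpid="tdprod", user="etl_user"):
    """Render a minimal Teradata FastLoad control script for loading a
    pipe-delimited file into an empty staging table."""
    field_defs = ",\n   ".join(f"{c} (VARCHAR(255))" for c in columns)
    insert_cols = ", ".join(f":{c}" for c in columns)
    return f"""LOGON {tdpid}/{user},$TD_PASSWORD;
DATABASE stg;
BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;
SET RECORD VARTEXT "|";
DEFINE
   {field_defs}
FILE={datafile};
INSERT INTO {table} VALUES ({insert_cols});
END LOADING;
LOGOFF;
"""
```

Generating the script from a column list keeps the DEFINE block and the INSERT VALUES list in sync, which is a common source of FastLoad errors when scripts are edited by hand.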

Environment: Informatica PowerCenter 9.1, Cognos 9.2, SQL Server 2012, Oracle 11g, MySQL, PL/SQL, TOAD, Putty, Autosys Scheduler, UNIX, Teradata 14, Erwin 8.1

Confidential, Richfield, MN

Informatica Developer


  • Involved in full life cycle development including Design, ETL strategy, troubleshooting Reporting, and Identifying facts and dimensions.
  • Created sessions and configured workflows to extract data from various sources, transform the data and load it into the data warehouse.
  • Moved data from source systems to different schemas based on the dimension and fact tables, using slowly changing dimensions (SCD).
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
  • Developed various mappings using reusable transformations.
  • Prepared the required application design documents based on functionality required.
  • Used Debugger to test the mappings and fixed the bugs.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Developed and executed scripts to schedule loads and to call Informatica workflows using the pmcmd command.
  • Worked on Dimensional Data Modeling using Data modeling tool Erwin.
  • Populated Data Marts and did System Testing of the Application.
  • Built the Informatica workflows to load table as part of data load.
  • Tuning Informatica Mappings and Sessions for optimum performance.
  • Worked on different tasks in workflows such as sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer and scheduling of the workflow.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
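The pmcmd-driven workflow invocation described above can be sketched as a small Python helper that assembles the command line (the domain, service and folder names are hypothetical placeholders; in a scheduler script the resulting list would be handed to subprocess.run):

```python
def pmcmd_start_workflow(domain, integration_service, folder, workflow,
                         paramfile=None):
    """Build (but do not execute) a pmcmd startworkflow command line,
    as used by scheduler shell scripts to kick off Informatica workflows."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", integration_service,   # integration service name
           "-d", domain,                 # Informatica domain
           "-f", folder,                 # repository folder
           "-wait"]                      # block until the workflow finishes
    if paramfile:
        cmd += ["-paramfile", paramfile]  # per-client parameter file
    cmd.append(workflow)                  # workflow name goes last
    return cmd
```

Building the argument list programmatically avoids the quoting problems that arise when parameter-file paths are interpolated into a single shell string.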

Environment: Informatica PowerCenter 8.1, DB2, MS SQL Server 2008, Oracle 10g, Toad, SQL Developer, Control-M, Putty, WinSCP, UNIX, Erwin 7.5
