
Sr. Informatica Lead Resume

Scottsdale, Arizona


  • 12+ years of IT experience in analysis, design, and development as a Senior Software Developer on client/server applications, developing strategies for ETL (Extraction, Transformation, and Loading) using Informatica Power Center in complex, high-volume data warehousing projects.
  • Extensive experience in ETL design, development and maintenance using SQL, PL/SQL, Informatica Power Center 10.x/9.x/8.x on UNIX and Windows Platforms.
  • Strong Data Warehousing ETL experience of using Informatica Power Center 10.x/9.x/8.x, Client Tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server Manager.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, Oracle PL/SQL. Proficient in Toad.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Experience in Change Data Capture (CDC) and a strong understanding of OLAP and OLTP concepts.
  • Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and UDT.
  • Modified existing reports and developed new reports and dashboards in Tableau for enhanced data visualization.
  • Experience in using Automation Scheduling tools like Autosys and Tivoli job scheduler.
  • Extensive experience in unit testing, System testing and test data management.
  • Strong hands on experience using Teradata utilities (SQL, B-TEQ, Fast Load, Multiload, Fast Export, Tpump, Visual Explain, and Query man), Teradata parallel support and Unix Shell scripting.
  • Proficient in coding of optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ.
  • Involved in Performance tuning and peer review of code and using IDQ to test and compare the data in different tables.
  • Built logical data objects (LDO) and developed various mappings, Mapplet/rules.
  • Extensively Designed SCD 1, SCD 2 mappings.
  • Experience working with UNIX Shell Scripts for automatically running sessions, aborting sessions and creating parameter files.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Performed Data validation, data profiling, data quality, data analysis and statistical analysis using SQL scripts, Tableau Dashboards and MS Excel.
  • Experience working in Agile methodology and the ability to manage change effectively.
  • Strong experience in Dimensional Modeling using Star and Snowflake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin, MS Visio and ER-Studio.
  • Experience in Data Modeling and database designing with Business Process Re-engineering (BPR) and Information Engineering (IE) methodologies.
  • Experience in Administrative activities such as creating folders, assigning security permissions to users and creating repositories, setting up groups/users permissions in Informatica Repository Manager and admin console.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment.
  • Involved in full life cycle development of Data Warehousing.
  • Expertise in creating packages to transfer data between ORACLE, MS ACCESS and FLAT FILES to SQL SERVER.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Extensive experience in managing teams/On Shore-Offshore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
  • Experience in writing batch and Perl scripts to create rule files, calc scripts, and MaxL scripts while loading into Oracle Hyperion Essbase ASO/BSO cubes.
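Several bullets above mention scripts that generate Informatica parameter files for flexible workflow runs. A minimal sketch of that pattern in Python (the folder, workflow, and parameter names are hypothetical placeholders, not from an actual project):

```python
from datetime import date

def build_param_file(folder, workflow, params):
    """Render an Informatica PowerCenter parameter file section.

    The section header format is [Folder.WF:workflow]; each mapping
    parameter or variable is written on its own line as $$Name=value.
    """
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

# Hypothetical folder/workflow names, for illustration only.
content = build_param_file(
    "SALES_DW", "wf_load_sales_fact",
    {"LoadDate": date(2020, 1, 15).isoformat(), "SourceSystem": "POS"},
)
print(content)
```

In practice a wrapper script would write this file before calling `pmcmd startworkflow` with the `-paramfile` option.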


Functional: Project Management, Team Building, Team Leader, Business Analysis and Development, Excellent Communication and Interpretation skills

Technical: Informatica Power Center (10.1, 9.5, 9.1, 8.6, 8.1), IBM Data Stage 9.1, SSIS, SSRS, SQL Server DTS packages, Oracle Hyperion Essbase 11.1.0, Oracle 10g/9i, MS SQL Server 2008/2000, Teradata 15.0, Control M, AutoSys, Agile Model


Confidential, Scottsdale - Arizona

Sr. Informatica Lead


  • Developed, configured, coded, tested and debugged new software solutions.
  • Developed, tested and implemented enterprise data movement (ETL and CDC) solutions.
  • Scheduled sessions to extract, transform, and load data from REST APIs into the warehouse database per business requirements.
  • Worked extensively on HTTP Transformation to extract data from API.
  • Parsed JSON- and XML-format responses from source REST APIs.
  • Addressed system defects and implemented enhancements to existing functionality.
  • Worked with onshore/offshore team to analyze, develop and improve ETL run times as well as to produce accurate defect free code.
  • Maintained productive working relationships with project sponsors and key systems users.
  • Analyzed the business and functional requirements and provided high level technical design specifications to drive ETL Development efforts.
  • Troubleshot issues with minimal guidance, identified bottlenecks in existing data workflows, and provided solutions for scalable, defect-free applications.
  • Participated in the definition of application scope and objectives through research and fact finding.
  • Involved in extraction, transformation, loading, and implementation.
  • Worked on Informatica tool - Source Analyzer, Mapping Designer and Transformations.
  • Identified and tracked the slowly changing dimensions’ tables, heterogeneous sources and determined the hierarchies in dimensions.
  • Developed mappings using Informatica Power Center Designer to transform and load data from source systems such as flat files, SQL Server, and Oracle into an Oracle target database.
  • Extensively used various types of transformations such as source qualifier, Expression, Joiner, Update strategy, Lookup, Dynamic Lookup, Router, Normalizer, Union, Filter, Sorter, Rank, Sequence generator etc. to load the data.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Performed Mapping Optimizations to ensure maximum Efficiency.
  • Implemented SCD Type 1 mappings, which retain only current data.
  • Implemented complex Slowly Changing Dimension (Type 2) mappings using a current-record flag.
  • Created sessions to run the mappings and set the session parameters to improve the load performance.
  • Wrote shell scripts to invoke Informatica workflows, EACS scripts, and wrapper scripts.
  • Held discussions with stakeholders to understand requirements.
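The API bullets above parse JSON responses before staging them. A minimal sketch of flattening such a payload into load-ready rows in Python (the field names are hypothetical, not from an actual client API):

```python
import json

def flatten_orders(payload):
    """Flatten a nested JSON API response into (order_id, sku, qty)
    rows suitable for loading into a staging table."""
    doc = json.loads(payload)
    rows = []
    for order in doc.get("orders", []):
        for item in order.get("items", []):
            rows.append((order["id"], item["sku"], item["qty"]))
    return rows

# Hypothetical response body, for illustration only.
sample = '{"orders": [{"id": 101, "items": [{"sku": "A-1", "qty": 2}]}]}'
print(flatten_orders(sample))  # [(101, 'A-1', 2)]
```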
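The Type 2 bullet above versions dimension rows with a current-record flag. A minimal sketch of that expire-and-insert logic using Python and SQLite (the table and column names are illustrative, not the actual warehouse schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    cust_id  TEXT,                               -- natural key
    city     TEXT,
    curr_flg TEXT DEFAULT 'Y')""")

def scd2_upsert(conn, cust_id, city):
    """Expire the current row if the tracked attribute changed,
    then insert a new current version (SCD Type 2)."""
    row = conn.execute(
        "SELECT cust_key, city FROM dim_customer "
        "WHERE cust_id = ? AND curr_flg = 'Y'", (cust_id,)).fetchone()
    if row is None:
        conn.execute("INSERT INTO dim_customer (cust_id, city) VALUES (?, ?)",
                     (cust_id, city))
    elif row[1] != city:
        conn.execute("UPDATE dim_customer SET curr_flg = 'N' WHERE cust_key = ?",
                     (row[0],))
        conn.execute("INSERT INTO dim_customer (cust_id, city) VALUES (?, ?)",
                     (cust_id, city))

scd2_upsert(conn, "C100", "Scottsdale")
scd2_upsert(conn, "C100", "Phoenix")   # attribute change creates a new version
history = conn.execute(
    "SELECT city, curr_flg FROM dim_customer ORDER BY cust_key").fetchall()
print(history)  # [('Scottsdale', 'N'), ('Phoenix', 'Y')]
```

In a PowerCenter mapping the same decision is made by a Lookup on the current row feeding an Update Strategy transformation.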

Confidential, Blue Ash - Ohio

Sr. Informatica Lead


  • Built partnerships across the application, business and infrastructure teams.
  • Interacted with business users and customers to confirm requirements for developing and modifying jobs, identified the various sources of data in operational systems, and developed strategies to build the data warehouse.
  • Analyzed the business and functional requirements and provided high level technical design specifications to drive ETL Development efforts.
  • Actively involved in the Design and development of the STAR schema data model.
  • Designed ETL specification documents to load the data in target using various transformations according to the business requirements.
  • Extensively worked with Informatica Power Center to load data from flat files, oracle and other source systems into target database.
  • Designed and developed Complex Informatica mappings using various transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, and Rank for populating target tables/Files in an efficient manner.
  • Designed and developed ETL workflows with job dependencies and scheduling, participated in code reviews of ETL processes.
  • Created Visio diagrams of ETL process to include in the design documents.
  • Created check-lists for coding, reviewing, bug logging, troubleshooting, testing and release for smooth functioning of the entire project.
  • Created and Configured Workflows, Worklets, and Sessions to load the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Wrote Unix Scripts to invoke Informatica Workflows and sessions.
  • Scheduled sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Implemented Error handling logic to capture Invalid/Null records coming from staging tables.
  • Tuned and optimized ETL jobs for performance and throughput.
  • Created test plans for unit testing, integration testing and UAT.
  • Developed SQL Scripts and ad-hoc queries for data verification and validation processes.
  • Managed QA and PROD deployments and automation of ETL Jobs.
  • Provided assistance in diagnosing production problems related to the project.
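The error-handling and validation bullets above capture invalid or null records coming from staging. A minimal sketch of such an ad-hoc check using Python and SQLite (the staging table and rejection rules are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id TEXT, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                 [("O1", 25.0), (None, 10.0), ("O3", -5.0)])

# Reject rows with a missing natural key or a negative amount;
# in the real flow these rows would be routed to an error table.
rejects = conn.execute("""
    SELECT order_id, amount FROM stg_orders
    WHERE order_id IS NULL OR amount < 0
""").fetchall()
print(rejects)
```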

Confidential, Lake Forest - IL

Sr. Informatica consultant (ETL Developer)


  • Led the Data Integration team to drive detailed ETL and data requirements to ensure accurate capturing of client's business requirements and deliverables.
  • Involved in strategy development, project planning, and resource planning, allocation and budget management.
  • Acted as the Subject Matter Expert for ETL and Sales Comp.
  • Collaborated with IT and Business Teams to gather high-level Integration/Compensation requirements.
  • Designed and built ETL solutions to automate data feeds from client systems.
  • Served as the go-to resource for all data-integration questions in Professional Services.
  • Took the lead on building custom Process Queuing system for the data Integration processes giving customers the visibility into the current status of their processes.
  • Worked with Cross Functional Teams on various initiatives.
  • Involved in all phases of our company’s evolution from a Start Up to an Enterprise.
  • Extensively worked on standardizing our methodologies to meet the industry standards with focus on quality, scalability and reusability.
  • Always lived up to our Core values of CARE (Customer Focus, Accountability, Respect and Excellence).

Environment: Informatica Power Center 9.0, Oracle 10g, Flat files, Git migration, STASH, UNIX, Shell Scripting, Xactly Connect, JIRA
