
Sr. ETL/Informatica Developer Resume


Charlotte, NC


  • 8+ years of data warehousing experience in the development, installation, upgrade, administration, analysis, design, and implementation of data warehousing ETL solutions using Informatica PowerCenter, Metadata Manager, Data Validation Option, BDM, BDQ, EDC, and Data Lake.
  • Configured Informatica BDM/BDQ/EIC/IDL 10.1/10.2.1 and integrated them with Kerberos-enabled Cloudera and Hortonworks Hadoop frameworks.
  • Extensive IT experience in the design, development, administration, and implementation of data warehouses and data marts using Informatica products 9.x/8.x/7.x/6.x as the ETL tool.
  • Expert in integrating various operational data stores (ODS) with multiple relational databases such as Oracle, SQL Server, and Teradata; also worked on integrating data from various flat files.
  • Expert in Informatica 9.x installation, configuration, debugging, tuning, and administration, including systems implementation, operations, and optimization as an Informatica admin.
  • Performed data integration with SFDC and Microsoft Dynamics CRM using Informatica Cloud.
  • Experienced in developing, maintaining, and implementing EDWs, data marts, and data warehouses with star and snowflake schemas.
  • Expert in data extraction, transformation, and loading from sources such as Teradata, Oracle, SQL Server, XML, flat files, and COBOL and VSAM files.
  • Expert in stored procedures, triggers, and packages using tools such as TOAD, SSMS, and PL/SQL Developer.
  • Expert in using Informatica Designer to develop mappings, mapplets, transformations, and workflows and to build complex mappings based on user specifications and requirements.
  • Expert in operational and dimensional modeling, ETL (Extract, Transform, and Load) processes, OLAP (Online Analytical Processing), dashboard design, and various other technologies.
  • Highly experienced in working and managing projects in Onsite-Offshore models. Proficient in handling and meeting client expectations/requirements.
  • Experience in Informatica Big Data Developer Edition and Autosys. Thorough knowledge of relational and dimensional models (star and snowflake), fact and dimension tables, and Slowly Changing Dimensions (SCD).
  • Extensively worked on Teradata queries and utilities: BTEQ, MLOAD, FLOAD, FEXPORT, TPUMP.
  • Hands-on experience implementing Slowly Changing Dimension types (I, II, and III).
  • Expert in migrating Informatica mappings, workflows, and sessions to the production repository.
  • Proven ability to manage multiple project modules simultaneously.
  • Expertise in Informatica PowerCenter setup, configuration, user security, and administration.
  • A personality with strong business acumen and excellent interpersonal skills, possessing strong leadership and team-building capabilities.
  • Experience using the SAS ETL tool, Talend, and SAS Enterprise Data Integration Server.
  • Experienced in handling high volumes of data, performance tuning, and query optimization using Vertica.
  • Effectively developed complex Vertica SQL (temp/volatile tables, complex joins, etc.).
  • Worked on data warehouse and business intelligence projects with teams using Informatica, Talend (ETL), Cognos 10, Impromptu, and PowerPlay.
  • Expertise in ServiceNow incidents and in the IMR (Incident Management Record) and CMR (Change Management Record) processes and their management.
  • Designed and prepared functional specification documents, technical specification documents, and source-to-target mapping documents with ETL transformation rules.
  • Expert in fine-tuning mappings and sessions by identifying bottlenecks to improve performance.
  • Expert in typical admin jobs such as migrations, FTP requests, performance tuning, installation of patches and hotfixes, and upgrades of different tools.
  • Strong experience in scheduling jobs using UNIX shell scripts and crontab.
  • Excellent analytical, problem-solving, and communication skills, including leading and managing a team.
  • Experienced as an Informatica admin in monitoring resource utilization and proactively identifying Informatica job performance and resource issues such as network, disk usage, and memory usage.
  • Expertise in setting up LDAP authentication security in Informatica.
  • Experience in Vertica performance tuning, including creation of projections, partitions, LAP, and segmentation.
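A minimal illustration of the Type 2 Slowly Changing Dimension pattern listed above, written as Oracle-style SQL against hypothetical stg_customer/dim_customer tables (all table, column, and sequence names are assumptions for the sketch, not from an actual project):

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed
UPDATE dim_customer d
   SET d.expiry_date  = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a new current version for changed or brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, address, segment,
        effective_date, expiry_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

In PowerCenter the same versioning logic is typically driven through an Update Strategy transformation rather than hand-written DML; the SQL above only sketches the underlying pattern.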


ETL DW Tools: Informatica PowerCenter 9.x/8.x/7.x, Informatica PowerExchange 9.x/8.x, TOAD, SSMS, BODS, Informatica Cloud.

Reporting Tools: Business Objects XI R2/6.5 (Supervisor, Designer, Business Objects), Cognos, SSRS

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.

Programming Languages: C, C++, Shell Scripts, C Shell, Korn Shell Scripts, VBScript, and the vi editor.

Databases: Oracle, Oracle Apps, MS SQL Server, SAP R/3, Flat Files, Teradata, PL/SQL, DB2, Sybase, Vertica, Hadoop

GUI: Visual Basic, FrontPage, Excel, PowerPoint.

Servers: Oracle 9iAS, Apache, Tomcat, MS Site Server, WebSphere.


Confidential, Charlotte, NC

Sr. ETL/Informatica Developer


  • Familiar with the install/upgrade process for BDM/EDC and IDL and their association with the different application services on the domain.
  • Familiar with split-domain functionality for BDM and EDC, using the same Blaze engine on the cluster.
  • Used Informatica PowerCenter 9.6 to extract data from various sources and transform and load it into a staging database, and from the staging database into Oracle servers and files. Also built mappings that loaded tables directly from various sources into Oracle databases.
  • Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica Cloud.
  • Developed Cloud mappings to extract the data for different regions (APAC, UK and America).
  • Refreshing the mappings for any changes/additions to CRM source attributes.
  • Developed the audit activity for all the cloud mappings.
  • Responsible for optimization of mappings, sessions, and SQL queries using Informatica.
  • Worked on the alternative-approach stream, which included mapping creation for 60+ tables and the pre- and post-migration stages of the project in the Dev, Test, QA, and Prod environments.
  • Developed UNIX shell scripts to run batch jobs and automate workflows.
  • Extracted XML data from different sources such as the TIBCO messaging system, files, and databases using the XML Parser transformation; also used an XML editor to create and validate personalized views.
  • Responsible for designing Informatica trigger processes to initiate and monitor Informatica events.
  • Used the Informatica Data Quality (IDQ) tool for data profiling based on the given business rules and converted them into reusable mapplets that can be plugged into the PowerCenter tool.
  • Worked with the data modeler in designing conceptual, logical, and physical data models using ERwin for relational OLTP systems.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Effectively developed complex Vertica SQL (temp/volatile tables, complex joins, etc.).
  • Developed an ETL process to extract project data from Primavera P6 to IBM Cognos TM1, and another to load project status information from the Project Proposal System into Primavera P6.
  • Responsible for requirements gathering and analysis and for leading end-user meetings.
  • Performed and led typical admin jobs such as migrations, FTP requests, performance tuning, installation of patches and hotfixes, and tool upgrades.
  • Developed test plans and test cases based on the high-level and detailed designs.
  • Standardized the formats of irregular data coming from upstream systems into more meaningful datasets for analysis using transformations such as Standardizer, Labeler, Match, and Comparison.
  • Creating PL/SQL Stored Procedures, Functions, Triggers & Packages to implement business logics
  • Implemented exception handling using autonomous transactions, pragmas, and locks; used savepoints, commits, and rollbacks to maintain transactional consistency and database integrity.
  • Created and scheduled sessions and batch processes to run on demand, on schedule, or only once using the Informatica Server Manager.
  • Migrated data from SQL Server, Sea Quest (HP internal Database) to HBase using Sqoop.
  • Configured various big data workflows to run on top of Hadoop and these workflows comprise of heterogeneous jobs like VSQL, Sqoop and MapReduce.
  • Tuned mappings to ensure data loads completed within SLA.
  • Worked with the Cognos developers to create customized BI reports meeting user requirements using Cognos Query Studio.
  • Responsible for converting Functional Requirements into Technical Specifications
  • Involved in unit testing and user acceptance testing to verify that data extracted from different source systems loaded accurately into the targets according to user requirements.
  • Led onshore and offshore teams to identify and resolve issues relating to processes, Informatica, and other databases.
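A sketch of the savepoint/rollback pattern mentioned above, in PL/SQL. The table names and the log_error procedure are hypothetical; log_error is assumed to be declared with PRAGMA AUTONOMOUS_TRANSACTION so the error record survives the rollback:

```sql
BEGIN
  SAVEPOINT before_load;

  -- hypothetical batch load step
  INSERT INTO tgt_orders
  SELECT * FROM stg_orders;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    -- undo the partial load while leaving earlier work in the session intact
    ROLLBACK TO before_load;
    -- autonomous-transaction logger commits independently of this transaction
    log_error(SQLCODE, SQLERRM);
    RAISE;
END;
/
```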

Environment: Informatica PowerCenter 9.6/9.1, Informatica Power Exchange 9.6/9.1, Oracle 10.0.1, PL/SQL developer, TOAD, Informatica Cloud, Teradata, SQL Server Management Studio, Putty, UNIX Shell Script, Cognos Query Studio, Informatica - BDM 10.2.1, EDC 10.2.1, IDL 10.2.1.

Confidential, Scotts Valley, CA

ETL Developer


  • Developed and supported the Extraction, Transformation, and load process (ETL) for data migration.
  • Extensively used joins, triggers, stored procedures, and functions when interacting with the backend database using PL/SQL and TOAD.
  • Responsibilities included developing and maintaining efficient Talend ETL jobs and reporting using Cognos 8 BI.
  • Installed and configured Cognos 8.4/10 and Talend ETL in single- and multi-server environments.
  • Used the Normalizer transformation to split a single source row into multiple target rows, eliminating redundancy and inconsistent dependencies.
  • Worked on the Talend RTX ETL tool; developed and scheduled jobs in Talend Integration Suite. Extensively used ETL concepts to load data from AS400 and flat files to Salesforce.
  • Extracted XML data from different sources such as the TIBCO messaging system, files, and databases using the XML Parser transformation; also used an XML editor to create and validate personalized views.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Designed and developed Informatica packages, designed stored procedures, configuration files, tables, views, and functions.
  • Configured match rule set property by enabling search using rules in MDM as per Business Rules
  • Migrated Informatica mappings to SQL Server Integration Services (SSIS) packages to transform data from SQL Server 2000 to MS SQL Server 2005.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2 and Flat Files.
  • Worked on dimension/fact tables to implement business rules and get the required results. Developed reusable transformations and reusable mapplets.
  • Used various transformations like Lookup, Filter, Normalizer, Joiner, Aggregator, Expression, Update strategy, Sequence generator, XML Generator, router etc. in the mappings.
  • Worked on XML Parser transformation to read the XSD file and build the source definition and accordingly to read the XML source file.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use at the company, and accessed the Netezza environment to implement ETL solutions.
  • Involved in loading data into Netezza from legacy systems and flat files using UNIX scripting; used the NZSQL and NZLOAD utilities of Netezza.
  • Responsible for unit testing and integration testing of mappings and workflows, as well as performance tuning at the source, target, mapping, and session levels.
  • Rational ClearCase was used to control versions of all files and folders (check-out, check-in); defect tracking and reporting were done with Rational ClearQuest.
  • Provided excellent customer service to the internal functional team by pro-actively following up with the issues on hand (through detailed emails and by setting up short meetings).

Environment: Informatica PowerCenter 9.5/9.1, Informatica PowerExchange 9.5/9.1, TOAD, Teradata, SQL Server 2012/2008, Oracle 11g, Shell Scripts, UNIX, Quality Center, IDQ tool, AutoSys scheduling tool, Rational ClearQuest.

Confidential, New York, NY

Informatica Developer


  • Designed and created mappings using tools such as Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformation Developer, Informatica Repository Manager, and Workflow Manager.
  • Developed mappings to load Fact and Dimension tables, for Type 1 and Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Involved in various phases of the project from design to development, integration and acceptance
  • Extensively used transformations such as Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, and Sequence Generator, as well as transformation-language features such as expressions, constants, system variables, and data format strings.
  • Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions
  • Involved in performance tuning of mappings to tune the data load & SQLs for better performance
  • Debugged maps using Debugger and Transformation's verbose data.
  • Analyzed business requirements, performed source-system analysis, and prepared the technical design document and source-to-target data mapping document.
  • Profiled data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Performance tuning of mappings, transformations and sessions to optimize session performance.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Created shared folders, local and global shortcuts to reuse metadata.
  • Resolved performance issues for huge volume of data & increased the performance significantly
  • Scheduled and monitored workflows by use of Workflow Manager and Workflow Monitor.
  • Worked on dimension/fact tables to implement business rules and get the required results. Developed reusable transformations and reusable mapplets.
  • Defined parameters, variables and parameter files for flexible mappings/sessions.
  • Prepared various kinds of documents like Test case document, mapping design document.
  • Involved in the migration of code from development to QA environment and then to production.
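As an illustration of the parameter files mentioned above, a minimal PowerCenter parameter file sketch; the folder, workflow, session, parameter, and connection names are all hypothetical:

```
[DWH_Folder.WF:wf_daily_load.ST:s_m_load_customer]
$$LoadDate=2010-06-30
$DBConnection_Source=ORA_SRC
$DBConnection_Target=ORA_DWH
$InputFile1=/data/in/customer.dat
```

The section heading scopes the values to one session within one workflow; $$ names are mapping parameters/variables, while $-prefixed names override session-level attributes such as connections and file paths.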

Environment: Informatica PowerCenter 8.1 (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager), Worklets, SQL, MS SQL 2005, Windows NT, HP-UX.


SQL Developer


  • Worked on implementation of the new CA Tools interface by interacting with Materialized views via Data pump.
  • Created complex Materialized views for CA Tools based on the different type of wholesale loans as specified by the business.
  • Involved in developing a new, data-driven application architecture framework to replace the current process-driven one.
  • Created new PL/SQL packages and procedures for the AU-SU changes so the LNU reports run on the architecture framework.
  • Developed stored procedures, packages, and functions for the CMS interface using PL/SQL for the batch framework.
  • Implemented the data movement strategy using the Oracle 10g DBMS Data Pump API over the network to replace the legacy NDM and Oracle Export/Import process.
  • Worked on enhancements for flat file loading from Mainframe to Oracle database.
  • Responsible for loading multiple forms of data from Mainframe source using SQL Loader.
  • Involved in converting the business requirements into technical design document.
  • Involved in tuning long-running PL/SQL queries for maximum efficiency across various business schemas in batch processing.
  • Worked with Business teams to make sure requirements were converted to Functional Documentation and the precise logic was implemented in the technical design and code.
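The SQL*Loader work above revolves around a control file; a minimal sketch is below, with hypothetical file, table, and column names (the loan table and fields are assumptions, not from an actual project):

```
-- invoked as: sqlldr userid=scott/tiger control=loans.ctl log=loans.log
LOAD DATA
INFILE 'loans.dat'
APPEND
INTO TABLE wholesale_loans
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(loan_id,
 borrower_name,
 loan_amount,
 funded_date DATE 'YYYY-MM-DD')
```

APPEND adds to existing rows; TRUNCATE or REPLACE would clear the table first, which matters when reloading mainframe extracts.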

Environment: Oracle 10g, SQL, PL/SQL, TOAD, SQL Loader, UNIX Shell Scripts.
