
Informatica Developer/ Data Analyst Resume

Rocky Hill, CT


  • Data Analyst/ETL Informatica expert with around 6 years of experience in IT requirement gathering, data modeling, design, development, testing and support, with extensive work in Business Intelligence / Data Warehousing technologies.
  • Extensive experience providing IT services in the Insurance, Utilities and Banking domains.
  • Extensive experience in developing ETL applications and performing statistical analysis of data on Oracle, MySQL and SQL Server databases.
  • Superior SQL skills: able to write and interpret complex SQL statements, and skilled in SQL optimization, ETL debugging and performance tuning.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica Power Center.
  • Experience in developing online transaction processing (OLTP), operational data store (ODS) and data warehouse databases.
  • Extensive knowledge of SQL (DDL and DML) queries to improve database performance and availability.
  • Good skills in understanding and developing business rules for the standardization, cleansing and validation of data in various formats.
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary with Interface requirements documents.
  • Created, documented and maintained logical and physical database models in compliance with enterprise standards, and maintained metadata definitions for enterprise data stores within a metadata repository.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Experience working with Dimension tables, Fact tables, Slowly Changing Dimensions, Data Marts and dimensional modeling schemas.
  • Experience in data modeling: dimensional modeling, E-R modeling, and OLTP and OLAP in data analysis; very familiar with SCD Type 1 and Type 2 in star and snowflake schemas.
  • Experience in Extraction, Transformation and Loading (ETL) data from various data sources into Data Marts and Data Warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Informatica Administration Console).
  • Strong Experience in developing Sessions/Tasks, Worklets and Workflows using Workflow Manager tools - Task Developer, Workflow & Worklet Designer.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance for the large volume projects.
  • Experience in debugging mappings, identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good experience in writing UNIX shell scripts, SQL scripts for development, automation of ETL process, error handling and auditing purposes.
  • Exposure to Microsoft Azure Cloud database and high availability configurations.
  • Good knowledge of Amazon Web Services (AWS) components such as EC2 and S3.
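
A minimal SQL sketch of the SCD Type 2 pattern referenced above (table, column and sequence names are hypothetical; Oracle syntax): expire the current dimension row when a tracked attribute changes, then insert the new version.

```sql
-- SCD Type 2 sketch (hypothetical names): close out the current row
-- when a tracked attribute differs from the staged value.
UPDATE dim_customer d
   SET d.eff_end_dt = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);          -- tracked attribute

-- Insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```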


ETL: Informatica PowerCenter 10.1, 9.5.1, 8.6.5

Hadoop: SparkSQL, Hive, HDFS

Data Warehouse Concepts: ODS, Star Schema, Snowflake schema, OLTP, OLAP

Scheduling Tools: Autosys, Control M

RDBMS: Oracle 11g/12c, SQL Server 2012, MySQL

Data Modeling: ERWin 7.5, Entity Relationship and Dimensional (Star, Snowflake Schema)

UNIX: Shell scripting

Reporting Tools: Cognos BI 8x/9x

Defect Tracking Tools: HP Quality Center, Bug Tracker

Operating Systems: Windows XP/2000/9x/NT, UNIX

Cloud Computing: Amazon Web Services (AWS), S3, Redshift

Other Tools: Notepad++, Toad, SQL Navigator, Microsoft Excel, MS Visio, Microsoft Visual Studio 2015, PowerPoint


Confidential, Rocky Hill, CT

Informatica Developer/ Data Analyst


  • Translated business requirements into SQL queries to implement performance optimizations on the advertiser side.
  • Created Data Mapping Documents applying business rules to map source data to target data.
  • Responsible for creating the Design document and design flow of the data in the project.
  • Responsible for ensuring the metadata delivers business value, with domain values normalized and redundancy eliminated.
  • Extracted, transformed and loaded (ETL) data from various data sources into Data Marts and the Data Warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Informatica Administration Console).
  • Developed Sessions/Tasks, Worklets and Workflows using the Workflow Manager tools: Task Developer, Workflow & Worklet Designer.
  • Generated ad hoc SQL queries using joins, Database connections and transformation rules to fetch data from Oracle database and SQL Server.
  • Served as data expert and SME in multiple areas, including the Consolidated Data Repository, Customer Data Warehouse, IVR data repository and Metering Database.
  • Reviewed normalized and denormalized schemas for effective, optimally tuned queries and data validations in OLTP environments.
  • Responsible for developing mappings, sessions and workflows per the data flow and loading the data into the Oracle database.
  • Solely responsible for migration of the ETL code to different environments - Dev, Test and Prod.
  • Created the DDL scripts and implemented in different environments.
  • Enhanced the ETL mapping to improve the performance.
  • Worked on Informatica Power Center and used different transformations - Filter, Router, Joiner, Update Strategy (Static and Dynamic), Aggregator.
  • Created parameter files and passed connection string and parameter values using parameter files.
  • Worked on Unix shell scripting to transfer the files to different server locations.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering the missing rows out into flat files at the mapping level.
  • Developed User Defined Functions and Created Materialized Views for high availability of data.
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Scheduled and ran Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Created SSIS packages with control flow and data flow tasks.
  • Connected SSIS to the Autosys scheduling tool to schedule and run the packages.
  • Migrated the data from on-premise Oracle database to Azure Cloud database using SSIS packages.
  • The Azure cloud database was HA-configured (high availability), available in the East and Central regions with a 95.5% SLA; any downtime or performance issues were worked with Microsoft and fixed immediately.
  • Worked on parameterizing the SSIS connection to Cloud Central and East Database regions.
  • Supported the project in different phases - Dev, SIT, UAT.
  • Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
  • Worked on SQL queries to query the Repository DB to find the deviations from Company’s ETL Standards for the objects created by users such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.
  • Ensured that all support requests were properly approved, documented and communicated using the MQC tool; documented common issues and resolution procedures.
  • Developed and modified jil files for Informatica workflows and SSIS packages and scheduled in Autosys scheduling tool.
  • Monitored the incremental data loads and scheduled jobs in Autosys in different phases of the project.
  • Supported the project during warranty phase and fixed critical issues in time.
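
The Autosys scheduling described above pairs a jil definition with a pmcmd call that kicks off the Informatica workflow; a sketch along these lines, where every job, server, folder and credential name is a placeholder:

```
/* Hypothetical Autosys jil definition for an Informatica workflow */
insert_job: etl_load_customer_dim
job_type: c                       /* command job */
machine: etl_server01
owner: infa_batch
command: pmcmd startworkflow -sv INT_SVC -d INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f CUST_FOLDER -wait wf_load_customer_dim
days_of_week: all
start_times: "02:00"
std_out_file: /var/log/autosys/etl_load_customer_dim.out
std_err_file: /var/log/autosys/etl_load_customer_dim.err
alarm_if_fail: 1
```

The `-wait` flag keeps pmcmd attached until the workflow finishes, so the job's exit code reflects workflow success or failure and Autosys can alarm on it.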

Environment & Tools: Erwin Data Modeler, Visio, Informatica PowerCenter 9.6, Oracle, UNIX Shell Scripting, SQL Server, Microsoft Azure Cloud SQL Server Database, Autosys Scheduling Tool, FileZilla, Putty, Microsoft Import & Export Wizard, AQT, SQL Developer, HP Quality Center.

Confidential, Virginia

ETL Informatica Developer (Intern)


  • Review SQL queries, Create Data Mapping Documents and work with Data Modeling team for analysis and documentation.
  • Analyze business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
  • Coordinate and develop all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Used Repository Manager to create repositories, users and groups, and managed users by setting up privileges and profiles.
  • Used the Debugger to debug mappings and correct them.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions).
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized the mappings and implemented complex business rules by creating reusable transformations and Mapplets.
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy.
  • Worked on mapping to load Fact and Dimension tables for SCD - Type 1 and Type 2 and incremental loading and unit tested the mappings.
  • Worked with real-time Web Services that perform a lookup operation using a key column as input and return multiple rows of data belonging to that key.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
  • Created UNIX shell scripts for automation of ETL processes; used UNIX for check-ins and check-outs of workflows.
  • Automated ETL workflows using Control-M Scheduler. Used Informatica workflow manager for creating, running the Batches and Sessions and scheduling them to run at specified time.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Prepared Functional Specifications, System Architecture/Design, Implementation Strategy, Test Plans and Test Cases.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improved the performance of the ETL by indexing and caching.
  • Involved in production deployment and later moved into warranty support until transition to production support team.
  • Monitored and reported issues for the daily, weekly and monthly processes; resolved issues on a priority basis and reported them to management.
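
The shell-based automation and error handling described above can be sketched as a minimal pre-load file check; the feed name and trailer layout (`TRL|<row_count>` as the last line) are assumptions for illustration:

```shell
#!/bin/sh
# Sketch of a pre-load feed validation (hypothetical layout): the last line
# of the feed is a trailer carrying the expected detail row count; fail the
# job if the counts disagree so the scheduler can alert and skip the load.
validate_feed() {
    feed="$1"
    # Trailer format assumed: TRL|<row_count>
    expected=$(tail -1 "$feed" | cut -d'|' -f2)
    actual=$(($(wc -l < "$feed") - 1))      # all lines minus the trailer
    if [ "$actual" -ne "$expected" ]; then
        echo "ERROR: $feed expected $expected rows, found $actual" >&2
        return 1
    fi
    echo "OK: $feed ($actual rows)"
}
```

A scheduler job would call `validate_feed` before the workflow step and rely on its non-zero exit status for error handling.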

Environment: Erwin Data Modeler, Informatica PowerCenter 9.6, Control M, Oracle, MS SQL Server, JIRA, Putty, HPQC


Informatica Developer/ Data Analyst


  • Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing.
  • Worked on getting the requirements, developing, testing and support during various conversions.
  • Interacted with clients and users to understand their requirements and provided solutions to meet their requirements.
  • Analyzed the system thoroughly and created a System Document for a complex system without any existing inputs or documentation, which helped win the project from competitors.
  • Participated in client discussions to gather scope information and analyzed it to provide inputs for project scoping documents.
  • Used UNIX scripting for file formatting before it gets into Informatica.
  • Based on business needs, prepared conceptual data models for long-term solutions and created logical and physical data models using best practices to ensure high data quality.
  • Optimized and updated logical and physical data models to support new and existing design requirements.
  • Used the Data Vault technique and achieved many of the advantages of the Data Vault approach, among them a simplified data ingestion process and removal of the cleansing requirements of a star schema.
  • Applied Data Vault as both a data loading technique and a methodology that accommodates historical data, auditing and tracking of data.
  • Handled the functional and practical implementation of data governance and was responsible for designing common data governance frameworks.
  • Provided inputs for the overall implementation plan, and led deployment of applications/infrastructure and post-production support activities.
  • Good domain knowledge in the Cards domain.
  • Developed complex modules and delivered defect-free, highly optimized deliverables.
  • Trained and groomed new resources on domain and technical knowledge and all processes.
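
The Data Vault structures mentioned above separate the business key from its descriptive history; a minimal sketch with hypothetical table and column names (Oracle syntax), using the Cards domain for illustration:

```sql
-- Data Vault sketch (hypothetical names): a hub holds only the business key,
-- a satellite holds the descriptive attributes with full load history.
CREATE TABLE hub_card (
    card_hkey      CHAR(32)     PRIMARY KEY,   -- hash of the business key
    card_number    VARCHAR2(19) NOT NULL,      -- business key
    load_dts       DATE         NOT NULL,
    record_source  VARCHAR2(30) NOT NULL
);

CREATE TABLE sat_card_detail (
    card_hkey      CHAR(32)     NOT NULL REFERENCES hub_card (card_hkey),
    load_dts       DATE         NOT NULL,      -- each change is a new row
    card_type      VARCHAR2(20),
    credit_limit   NUMBER(12,2),
    record_source  VARCHAR2(30) NOT NULL,
    PRIMARY KEY (card_hkey, load_dts)
);
```

Because satellites are insert-only and keyed by load timestamp, history and auditing come for free, and raw data can be landed without the up-front cleansing a star schema requires.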

Environment: Informatica 9.5/8.6, Erwin, Oracle, Flat files, SQL, Toad, Cognos 8, Control M


Informatica Developer/ Data Analyst


  • Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing.
  • Understanding the Functional Design Specs and preparing the Technical design.
  • Involved in designing and developing High level and Low level designs
  • Worked extensively on Informatica tools: Repository Manager, Designer and Server Manager.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Created the Source and Target Definitions using Informatica Power Center Designer.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored Batches and Sessions using Control M scheduling Tool.
  • Tuned the mappings to increase their efficiency and performance.
  • Used Informatica Workflow Manager to create workflows and Workflow Monitor to run and monitor them.
  • Developed unit test plans thoroughly covering all the business scenarios.
  • As a fresher, performed technical analysis on the existing code.
  • Worked on coding and testing during acquisitions/mergers of banks such as Vijaya Bank and South Indian Bank.
  • Worked on enhancements of Cognos BI reports.
  • Worked on enhancement activities related to GL accounts
  • Involved in unit testing, systems testing, integrated testing and user acceptance testing.
  • Supported the application during the warranty period.

Environment: Informatica 8.5, Erwin 7.5, Oracle 10g, Flat files, SQL, Toad, Cognos 8, Control M.
