
Sr. Informatica/Teradata Developer Resume

Richmond, Virginia


  • 6+ years of experience in the IT industry, with an emphasis on Data Warehousing tools and industry-accepted methodologies and procedures.
  • Technical expertise in Informatica 10.x/9.x/8.x - Power Center, client tools - Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager/Monitor - and server tools - Informatica Server Manager, Repository Server Manager, and Power Exchange.
  • Expertise in Data Warehousing, Data Migration, Data Integration and Data Cleansing.
  • Experience in installing and configuring Informatica in Windows and UNIX environments.
  • Directly responsible for the Extraction, Transformation & Loading of data from multiple sources into the Data Warehouse. Complete knowledge of data warehouse methodologies (Agile Scrum, Ralph Kimball, Inmon), ODS, EDW, and metadata repositories.
  • Experience in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using Microsoft SSIS and Informatica.
  • Expertise in BTEQ shell scripting for data loading into EDWard (Enterprise Data Warehouse and Research Depot).
  • Experience in Apache Hadoop technologies: Hadoop Distributed File System (HDFS), the MapReduce framework, YARN, Pig, Hive, Sqoop, Spark, BigSheets, and BigSQL.
  • Expertise in Extraction, Transformation & Loading of data using heterogeneous sources and targets.
  • Knowledge of testing environments and Data Analysis (Big Data, Teradata, Ab Initio).
  • Experienced in integrating various data sources such as Oracle, Teradata, Confidential DB2, MS SQL Server, MySQL, XML files, and Mainframe sources into a staging area and different target databases.
  • Expert in Extracting, Transforming and Loading (ETL) data using Informatica/SSIS; creating mappings/workflows to extract data from SQL Server, Excel files, other databases, and flat-file sources and load it into various business entities.
  • Expertise in Oracle (SQL/PLSQL) and SQL server performance tuning using optimization techniques.
  • Experience in creating ETL Jobs, Alerts in Automation tools AutoSys, Control M.
  • Extensively used Teradata features such as BTEQ, TPump, FastLoad, MultiLoad, SQL Assistant, and DDL/DML commands. Very good understanding of Teradata UPI and NUPI, secondary indexes, and join indexes.
  • Expertise in operational data stores (ODS).
  • Data Analysis - strong experience on Data Design/Analysis, Impact Analysis, Data Profiling, Business Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships and Source Systems Analysis.
  • Expertise in creation of complex mappings using SCD type II involving transformations such as expression, joiner, aggregator, lookup, update strategy, and filter.
  • Extensive database experience and highly skilled in SQL in Oracle, MS SQL Server, DB2, Flat Files, CSV files, MS Access.
  • Skilled in SQL, Teradata SQL Assistant, PL/SQL, SQL*Plus, TOAD.
  • Experience in implementing Data Warehousing Applications using Informatica Power Center / Power Mart for designing and developing transformations, mappings, sessions, and for scheduling and configuring workflows.
  • Skilled in UNIX shell programming using the vi editor, Windows scripting, and Informatica sessions.
  • Worked within an Agile Scrum methodology to deliver high-quality work in every monthly iteration through Sprint Planning and Backlog Refinement sessions.
  • Worked closely with product owner and other functional team members to meet tight Agile Deadline.
  • Outstanding communication and interpersonal skills, strong analytical reasoning, and the ability to learn new technologies and tools quickly.
  • Self-motivated team player, experienced in working with senior-level managers, business people, and developers across multiple disciplines.
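As context for the SCD Type II mappings noted above, the Update Strategy logic in such a mapping typically resolves to an expire-and-insert pair of statements like the following sketch (the table, columns, and bind variables are illustrative only, not from any actual project):

```sql
-- Hypothetical SCD Type II pattern: expire the current dimension row,
-- then insert the new version. All names below are illustrative.

-- 1. Close out the existing current row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.eff_end_dt  = CURRENT_DATE,
       d.current_flg = 'N'
WHERE  d.customer_id = :in_customer_id
AND    d.current_flg = 'Y'
AND    d.address     <> :in_address;   -- change detected on a tracked column

-- 2. Insert the new version as the current row with an open-ended end date.
INSERT INTO dim_customer
  (customer_sk, customer_id, address, eff_start_dt, eff_end_dt, current_flg)
VALUES
  (seq_customer_sk.NEXTVAL, :in_customer_id, :in_address,
   CURRENT_DATE, DATE '9999-12-31', 'Y');
```

In an Informatica mapping the same decision is made by the Update Strategy transformation (DD_UPDATE vs. DD_INSERT) after a lookup against the current dimension row.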


ETL Tools: Informatica Power Center 10.1.1/9.6.1/9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server), BTEQWin, SSIS, SSRS, Cognos 10.x/9.x

Databases: Teradata 15.10, Oracle 12c/11g/10g, MS SQL Server 2012/2010/2008, Facets, Sybase, MS Access 2010/2003/2000, Confidential DB2

Programming: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Perl

Scheduling Tools: AutoSys, BatchIR, Control M, WLM (Workload Manager)

Operating Systems: UNIX, Linux, Windows 10/8/7/2000/XP/NT

Methodologies: Agile, Waterfall, Spiral

Other tools: Jira, Jira-Bizz, Confidential Confluence, Confidential Rational ClearTeam Explorer, CAT & FCAT (Code Assessment Tools), IP Switch, Sococo.


Confidential, Richmond, Virginia

Sr. Informatica/Teradata Developer


  • Responsible for requirement analysis and impact analysis to gain a complete understanding of the business requirements.
  • Performed data analysis to understand the current data flow in the system and to inspect, cleanse, transform, and model the data with the goal of discovering useful information, suggesting conclusions, and supporting decision making.
  • Responsible for creating the Technical Design Document, which provides a comprehensive design overview of the solution at a specific point in time and serves as a communication vehicle among designers, software developers, and project team members regarding design decisions.
  • Responsible for extracting data from Source (.data, .txt, .cpy, .csv, COBOL, Mainframe, Flat files) to the Landing Zone Area (LZ layer) using Informatica workflows.
  • Created new mappings and modified existing mappings for data loading into the LZ layer (i.e., no hard coding, leveraging mapplets, reuse concepts, etc.) in Power Center versions 9.6.1 and 10.1.1.
  • Used different connections such as Relational, File Writer, and Loader (MLoad, TPump - insert/update/upsert) to load millions of records.
  • Applied different transformation logic based on the source file type, layout, and structure to satisfy the business rules. Used transformations such as Expression, Filter, Router, Joiner, Sequence Generator, Update Strategy, Unconnected Lookup, and Aggregator in Informatica.
  • Created BTEQ scripts for the pre-population of intermediate work tables and STG tables prior to the main load process.
  • Used Volatile tables, Global Temporary tables (GTT) and derived queries for breaking up the complex queries into simpler queries.
  • Developed scripts for $N tables, backup tables, and one-time (1T) work tables.
  • Loaded the EDWard production tables using UNIX/BTEQ shell scripts.
  • Created UNIX/BTEQ shell scripts to insert, merge, and update the EDWard production tables from the STG or final work tables during the final load process.
  • Used different joins such as left outer join, inner join, and cross join, as well as lookups, in the BTEQ scripts based on the business requirement.
  • Used Push-down optimization to load EDWard tables.
  • Performed unit testing in the Development environment to ensure the data was populated correctly before getting code approval from the Tech Lead.
  • Used the SELF CAT tool for code assessment/code review to ensure no errors existed in the developed code.
  • Performed ClearCase baselining and used FCAT to place the code/components in the staging area and migrate them to the test environment.
  • Used Control M and WLM (Work Load Manager) to create jobs and schedule their runtimes. Created Control M and WLM jobs to load data into the Test (SIT and UAT) and IR environments.
  • Performed regression testing to ensure the existing code worked properly in the IR environment before proceeding with the production execution.
  • Used JIRA to track the work in progress and to report the daily status to the team.
  • Attended daily standup calls with the whole team to report work completed and hours burned on the subtasks assigned to each resource.
  • Involved in Backlog Refinement sessions, Sprint Planning, Sprint Retrospectives, and Sprint Demos with the Scrum Master, Project Manager, Product Owner, Project Team Lead, and all other team members.
  • Involved in requirement-gathering discussions, decision making in critical situations, and supporting the team in design, development, and the whole SDLC of the project.
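A minimal sketch of the kind of BTEQ final-load script described in the bullets above (the logon details, database/table names, and return codes are placeholders, not actual project artifacts):

```sql
.LOGON tdprod/etl_user,password;   -- placeholder TDPID and credentials

-- Merge the staged rows into a (hypothetical) EDWard production table.
MERGE INTO edward_db.claim_fact AS tgt
USING stg_db.claim_stg AS src
  ON  tgt.claim_id = src.claim_id
WHEN MATCHED THEN UPDATE
  SET claim_amt = src.claim_amt,
      load_dt   = CURRENT_DATE
WHEN NOT MATCHED THEN INSERT
  (claim_id, claim_amt, load_dt)
  VALUES (src.claim_id, src.claim_amt, CURRENT_DATE);

-- Abort with a non-zero return code if the merge failed, so the
-- wrapping UNIX shell script can detect the failure via $?.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```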

Environment: Informatica Power Center 10.1.1/9.6.1, Teradata 15.10, Putty, JIRA, Confidential Confluence, Rational ClearTeam Explorer, CAT & FCAT(Code Assessment Tools), Control M, WLM, IP Switch, Facets/Sybase, Microsoft Visio, MS Office, UNIX, Windows 10 Operating System, Sococo.

Confidential, Louisville, KY

Informatica/ETL Developer


  • Responsible for the MLQM (Member Life Quality Measures) and Quality Score Card projects, covering the ETL process, design, and development.
  • Loaded data from flat Excel sources into the DA (Data Access) layer in the Oracle data mart using Informatica 9.6.1.
  • Loaded the Lab and Biometric data for each monthly release into the relational tables in the DA layer.
  • Worked with the Spark API over Hadoop MapReduce to perform analytics on data.
  • Developed mappings to load data from the Oracle data mart to SBI (Supplemental Business Intelligence), which resides in SQL Server, and scheduled the sessions to run for each monthly release.
  • Staged data in the MLQM Stage database (SQL Server) by building new ETL to load the data into the appropriate schema in MLQM. Created the mappings and applied the business logic to load the data into the tables according to the client's requirements.
  • Exported the calculated custom quality-measure data to flat files (Excel) using Informatica and reported it to the Provider Portal using SSRS.
  • Created ETL jobs to run the Workflows to move the data from Oracle to SQL server and SQL server to Flat files.
  • Worked with the Reflection FTP Client to transfer files to the client's server and to define the source path in the source file directory.
  • Worked in a UNIX environment to start and run the sessions in Informatica.
  • Created Shell Scripts in UNIX to run the ETL jobs.
  • Performed query optimization with the help of explain plans, collect statistics, and primary and secondary indexes. Used volatile tables and derived queries to break up complex queries into simpler ones. Streamlined the shell-script migration process on the UNIX box.
  • Extracted data from the Oracle database of site applications and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities.
  • Worked on Hadoop cluster administration, monitoring and managing Hadoop clusters using Cloudera Manager.
  • Used various transformations such as Expression, Aggregator, Sequence Generator, Update Strategy, Joiner, SCD Type 1 and Type 2, Filter, Normalizer, and Lookup (connected and unconnected) based on different conditions, and worked in the Transformation Designer to design new transformations and reuse existing ones.
  • Generated complex SQL queries and overrides in the Source Qualifier and lookup conditions wherever needed.
  • Used Joiner transformations to join different target tables, and user-defined joins in the Source Qualifier to join different sources such as flat files (.csv, Excel, XML) and relational sources.
  • Used lookup conditions to join tables with primary key/foreign key relationships, using both cached and uncached lookups.
  • Analyzed the data flow and worked on session log files to resolve issues and errors by modifying the mappings in the Mapping Designer and mapplets, and by looking into the ODBC connection strings to make the workflows run successfully.
  • Responsible for creating workflows and sessions using the Informatica Workflow Manager and monitoring workflow runs and statistics in the Informatica Workflow Monitor.
  • Involved in writing stored procedures in Oracle SQL Developer and Microsoft SQL Server.
  • Worked with TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Worked with SSRS to report the MLQM custom measures from staging data to the Provider Portal for each monthly release deployment, and was involved in production management.
  • Participated in daily status check-point team meetings and development meetings with the lead, and conducted internal and external reviews as well as formal walkthroughs among various teams, documenting the proceedings.
  • Involved in regular discussions with the Facets team to enter test data.
  • Attended weekly DRB meetings with the testing team, fixed identified problems in existing production data, and developed one-time scripts to correct them.
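The query-optimization work mentioned above (volatile tables, collect statistics, explain plans) follows a common Teradata pattern, sketched below with hypothetical table and column names:

```sql
-- Illustrative Teradata tuning pattern: break a complex query into a
-- volatile table, collect statistics on the join column, and check the
-- optimizer's plan with EXPLAIN. All names are hypothetical.

CREATE VOLATILE TABLE vt_member_labs AS (
  SELECT member_id, lab_cd, MAX(result_dt) AS last_result_dt
  FROM   lab_results
  GROUP  BY member_id, lab_cd
) WITH DATA
PRIMARY INDEX (member_id)
ON COMMIT PRESERVE ROWS;

COLLECT STATISTICS COLUMN (member_id) ON vt_member_labs;

-- EXPLAIN shows the join strategy and row redistribution the optimizer
-- chose, so costly product joins can be caught before promotion.
EXPLAIN
SELECT m.member_id, v.last_result_dt
FROM   members m
JOIN   vt_member_labs v ON v.member_id = m.member_id;
```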

Environment: Informatica PowerCenter 9.6.1, Oracle 11g/12c, Microsoft SQL Server Management Studio 2012, SSRS, SSIS, Facets 4.7.1, TOAD 10, SQL*Plus, Putty, Reflection FTP Client, Mainframe 8210, Business Objects 4.1, HP Client, Big Data, Microsoft Visio, MS Office, UNIX, Windows 10, Sococo.

Confidential, Schaumburg, IL

ETL/Informatica Developer


  • Involved in gaining a complete understanding of the business requirements and in analyzing the sources to load into the Oracle warehouse.
  • Used Informatica PowerCenter 8.6 and 9.1 to make changes to the existing ETL mappings in each of the environments.
  • Involved in the migration of Informatica 8.6 workflows into Informatica 9.1 version.
  • Wrote PL/SQL procedures which are called from Stored Procedure transformation to perform database actions such as truncate the target before load, delete records based on a condition and rename the tables.
  • Designed source-to-target mappings from SQL Server and Excel/flat files to Oracle using Informatica Power Center.
  • Developed Perl and shell scripts to automate the finance billing file.
  • Responsible for Detail design of each interface.
  • Assisted with implementation/upgrade of AutoSys and JAWS, troubleshoot errors, and documentation/procedures.
  • Implemented slowly changing dimension Type 1 and Type 2 for Change data capture using Version control.
  • Responsible for creating workflows and sessions using the Informatica Workflow Manager and monitoring workflow runs and statistics in the Informatica Workflow Monitor.
  • Created mappings, mapplets, and rules in IDQ to cleanse and transform data.
  • Involved in writing stored procedures both in Oracle and SQL Server.
  • Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and performance related issues.
  • Created various Oracle database SQL, PL/SQL objects like Indexes, stored procedures, views and functions for Data Import/Export.
  • Used the Informatica MDM (Siperian) tool to manage master data in the EDW.
  • Created relationships for the base objects, reflected from lookup tables in MDM.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
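The pre-load PL/SQL procedures called from the Stored Procedure transformation, as described above, typically look something like this sketch (the procedure name and usage are illustrative):

```sql
-- Hypothetical pre-load procedure invoked from an Informatica Stored
-- Procedure transformation before the session loads the target table.
CREATE OR REPLACE PROCEDURE prc_pre_load (p_table IN VARCHAR2) AS
BEGIN
  -- TRUNCATE is DDL, so it must go through dynamic SQL; DBMS_ASSERT
  -- guards against SQL injection through the table-name argument.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE '
                    || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table);
END prc_pre_load;
/
```

The same dynamic-SQL approach extends to the other pre-load actions mentioned (conditional deletes, table renames), each parameterized from the session.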

Environment: Informatica Power Center 8.x/9.x, MS SQL Server 2012, Informatica IDQ, Oracle 11g/10g, Data Quality, Toad, PL/SQL (Stored Procedures, Triggers, Packages), Smart SVN, AutoSys, Erwin, SSIS, CSV files, Unix shell scripting, MS Visio, Windows XP.


Jr. Informatica Developer


  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Extracted data from various Source Systems like Oracle, SQL Server and Flat Files as per the requirements.
  • Developed mappings and mapplets using the Informatica Source Analyzer, Warehouse Designer, Transformation Designer, and Mapplet Designer.
  • Created tables, views, indexes, sequences and constraints.
  • Worked on stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Loaded data into the database using SQL*Loader.
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Involved in data validation, load process and error control routines.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly inventory/ purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote SQL reports for sales data.
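A SQL*Loader load of the kind described above is driven by a control file along these lines (the file, table, and column names are placeholders):

```
-- inventory.ctl: hypothetical control file for a comma-delimited flat file
LOAD DATA
INFILE 'inventory.dat'
APPEND INTO TABLE inventory_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( item_id, item_name, qty_on_hand, unit_price )
```

The load would then be run from the shell with something like `sqlldr userid=etl_user control=inventory.ctl log=inventory.log`, with the log file checked for rejected rows.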

Environment: Informatica Power Center 8.x, Oracle 9i, SQL, PL/SQL, SQL *Loader, Windows.


Entry-Level/Jr. Informatica Developer


  • Monitored the ETL jobs on regular basis. In case of any failure, performed root cause analysis and recovered the job.
  • Developed Informatica mappings to implement migration activity for client, decreasing processing time by 30% and reduced the burden on the database server up to 60% to meet the SLAs.
  • Tuned Informatica mappings to identify and remove processing bottlenecks.
  • Automation and scheduling of UNIX shell scripts and Informatica sessions and batches.
  • Performed ETL design, development, and maintenance using Oracle SQL, PL/SQL, and Informatica Power Center 8.6.
  • Scheduled Informatica workflows using Informatica Scheduler to run at regular intervals.

Environment: Informatica Power Center 8.6, Oracle 10g, MS SQL Server 2008, MS Access, MS Excel 2008, SQL Developer, Windows, Unix/Linux.
