
Informatica Developer Resume

SUMMARY

  • Over nine (9+) years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the financial sector using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
  • Over one year of experience with Talend Open Studio, Hive, and Spark.
  • Over nine years of experience with Informatica, Oracle, and Teradata.
  • Certified Developer in Informatica PowerCenter Data Integration.
  • Certified Professional in Java and C Programming Languages
  • Extensive experience developing Teradata SQL, stored procedures, and BTEQ scripts.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle (see the sketch after this list).
  • Created warehousing solutions using the Relational Online Analytical Processing (ROLAP) approach.
  • Strong experience building ETL workflows for multiple data sources such as flat files, XML, Teradata, Oracle, and SQL Server.
  • Experience with tools such as SQL Server Management Studio, SQL Server 2005/2008 Integration Services (SSIS), and Reporting Services (SSRS).
  • Extracted data from the SAP system to the staging area and loaded it into the target database through ETL processes using Informatica PowerCenter.
  • Performed performance tuning at the source and target levels using indexes, hints, and partitioning in DB2, Oracle, and Informatica.
  • Designed and developed various PL/SQL stored procedures to perform various calculations related to fact measures.
  • Converted PL/SQL procedures to Informatica mappings and, at the same time, created procedures at the database level for optimum mapping performance.
  • Worked extensively in building dimensions, bridges, facts, and star, snowflake (extended star), and galaxy schemas.
  • Expertise with the tools in Hadoop Ecosystem including Pig, Hive, HDFS, MapReduce, Sqoop, Storm, Spark, Kafka, Yarn, Oozie, and Zookeeper.
  • Excellent knowledge of Hadoop ecosystem components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience in designing and developing POCs in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Worked on converting some of the existing PL/SQL scripts into Informatica mappings.
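
A minimal sketch, with illustrative logon, database, table, and column names, of the Teradata EXPLAIN and COLLECT STATISTICS tuning checks referenced above:

#!/bin/sh
# Capture the optimizer plan for a query, then refresh statistics on its
# join/filter columns. All object names below are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
EXPLAIN
SELECT f.acct_id, SUM(f.txn_amt)
FROM   edw.txn_fact f
JOIN   edw.acct_dim a ON a.acct_id = f.acct_id
WHERE  f.txn_dt = DATE '2015-06-30'
GROUP BY f.acct_id;
COLLECT STATISTICS COLUMN (acct_id), COLUMN (txn_dt) ON edw.txn_fact;
.LOGOFF;
.QUIT;
EOF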

TECHNICAL SKILLS

Operating Systems: Windows 7, Windows XP, Linux

ETL Tools: Informatica PowerCenter 10.2.0/9.5.1/8.6.1, IBM DataStage 9.1.2.0, Talend Data Integration 7.0.1, Talend Big Data 7.0.1

Scheduling Tools: Control-M 9.0, Autosys r11

Big Data Technologies: Hive SQL, Spark, Hue, Talend Big Data 7.0.1

Databases: Oracle 12c/11g/10g, MS SQL Server 12.0, Teradata 15.10

Data Modelling Tools: Aqua Data Studio 17.0.3, Erwin, MS Visio

Applications: Cognos TM1, Business Objects, Denodo, PL/SQL Developer, SAP, Autosys

Languages: SQL, PL/SQL, PostgreSQL, UNIX shell scripting, C, Java, HTML, JavaScript

Trainings: Big Data, Hadoop, Tableau

Banking Tools: Connected Teller, Aperio, Nautilus, DocPlus

Functional: Treasury, Financial Services

Project Management Tools: Planview, MS Project, MS Notes, Service Now

PROFESSIONAL EXPERIENCE

Confidential

Informatica Developer

Responsibilities:

  • Understood the business requirements from functional specifications and designed the ETL methodology in technical specifications.
  • Developed data conversion, quality, and cleansing rules and executed data cleansing activities.
  • Responsible for the development, support, and maintenance of ETL (Extract, Transform, and Load) processes using Informatica PowerCenter 10.2.
  • Integrated heterogeneous data sources such as Oracle, VSAM, and flat files (fixed-width and delimited) into the staging area.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, improving mapping performance.
  • Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Implemented complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Improved session performance by enabling the incremental aggregation property to load incremental data into the target table.
  • Worked with Functional team to make sure required data has been extracted and loaded and performed the Unit Testing and fixed the errors to meet the requirements.
  • Copied/exported/imported the mappings/sessions/worklets/workflows from the development to the test repository and promoted them to production.
  • Used session parameters, mapping variables/parameters, and parameter files to allow flexible workflow runs based on changing variable values.
  • Worked with Static, Dynamic and Persistent Cache in lookup transformation for better throughput of Sessions.
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX (see the sketch below).
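
A minimal sketch of the kind of pmcmd automation described above; the domain, Integration Service, folder, and workflow names are illustrative:

#!/bin/sh
# Start an Informatica workflow from UNIX and wait for completion.
# INFA_DOMAIN, INFA_IS, INFA_USER, and INFA_PASSWD are environment variables
# assumed to be set by the calling job; folder/workflow names are placeholders.
pmcmd startworkflow \
    -service "$INFA_IS" -domain "$INFA_DOMAIN" \
    -uv INFA_USER -pv INFA_PASSWD \
    -folder FIN_DW -wait wf_load_daily_facts
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_load_daily_facts failed with return code $rc" >&2
    exit $rc
fi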

Technology: Informatica PowerCenter 10.2.0, Oracle 12.1.0, SQL, PL/SQL, Control-M 9.0, Informatica IDQ

Confidential

Informatica Developer

Responsibilities:

  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and other files and loaded them into target databases using the Talend Open Studio ETL tool.
  • Created Talend jobs using the dynamic schema feature.
  • Interact with business community and gathered requirements based on changing needs. Incorporated identified factors into Talend jobs to build the Data Mart.
  • Used a wide range of Talend components across job designs, including tMap, tFilterRow, tJava, tOracle input/output components, tXMLMap, delimited-file components, tLogRow, and logging components.
  • Worked on Joblets (reusable code) & Java routines in Talend.
  • Developed complex Talend ETL jobs to migrate data from flat files to databases.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Followed the organization-defined naming conventions when naming flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
  • Worked on Informatica PowerCenter and Informatica PowerExchange for metadata analysis.
  • Worked on various source platforms (Oracle, Teradata, SQL Server, and DB2 on the mainframe) to design and create SQL.
  • Wrote complex queries against Oracle, DB2, and Teradata to evaluate data.
  • Performance-tuned sources, targets, mappings, and SQL queries in transformations; designed, created, and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models.
  • Designed and implemented Partitioned Primary Indexes, Join Indexes, and other techniques for effective database space management and archival.
  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as Multi Load, Fast Load, Fast Export, BTEQ (Teradata) and SQL*Plus, SQL*Loader (Oracle).
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
  • Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression, and Update Strategy.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Developed stored procedures, used them in Stored Procedure transformations for data processing, and used data migration tools.
  • Documented Informatica mappings in Excel spreadsheets.
  • Tuned the Informatica mappings for optimal load performance.
  • Used the Teradata utilities BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from flat files (a FastLoad example is sketched after this list).
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
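
A minimal FastLoad sketch of the flat-file loads described above; the server, logon, table, and file names are illustrative:

#!/bin/sh
# Bulk-load a pipe-delimited flat file into an empty Teradata staging table.
# All object names and the input path are placeholders.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,etl_password;
SET RECORD VARTEXT "|";
DEFINE acct_id (VARCHAR(18)),
       txn_dt  (VARCHAR(10)),
       txn_amt (VARCHAR(20))
FILE = /data/inbound/txn_daily.dat;
BEGIN LOADING stg.daily_txn ERRORFILES stg.daily_txn_e1, stg.daily_txn_e2;
INSERT INTO stg.daily_txn (acct_id, txn_dt, txn_amt)
VALUES (:acct_id, :txn_dt, :txn_amt);
END LOADING;
LOGOFF;
EOF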

Technology: Informatica PowerCenter 10.2.0, Oracle 12.1.0, SQL, PL/SQL, Teradata 15.10, IBM Datastage 9.1.2.0, Hive, Spark, Aqua Data Studio 17.0.3, Talend Big Data 7.0.1, SSIS, SQL Server 12.0, Autosys r11

Confidential

Informatica Developer

Responsibilities:

  • Analyze client’s business requirements and processes through document analysis, interviews, workshops, and workflow analysis.
  • Used Informatica ETL to load data from flat files, which includes fixed-length as well as delimited files and SQL Server to the Data mart on Oracle database.
  • Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse.
  • Created dimension and fact tables for the data mart.
  • Created Informatica mappings, sessions, workflows, etc., for loading fact and dimension tables for data mart presentation layer.
  • Implemented Slowly Changing Dimensions (SCD) Type I and Type II for data loads.
  • Developed mappings using parameters and variables.
  • Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions.
  • Used Timer, Event Raise, Event Wait, Decisions, and Email tasks in Informatica Workflow Manager.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and cut back the running time.
  • Analyzed workflow, session, event, and error logs for troubleshooting the Informatica ETL process.
  • Worked with Informatica Debugger to debug the mappings in Informatica Designer.
  • Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows.
  • Communicate client’s business requirements by constructing easy-to-understand data and process models.
  • Drafted and maintained business requirements documents (BRD) and aligned them with functional and technical requirements documents (FSD and TDD).
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Developed metadata repository using OBIEE Administration tool in Physical, Business Model and Mapping, and Presentation Layer.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Involved in migrating Informatica ETL application and Database objects through various environments such as Development, Testing, UAT and Production environments.
  • Developed SQL Server stored procedures and tuned SQL queries (using indexes and execution plans).
  • Worked with QlikView 12.x, SQL Server, Oracle, Netezza, Excel, CSV files.
  • Developed dashboards pertaining to KPI monitoring using QlikView 12.x.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Developed performance utilization charts, optimized and tuned SQL and designed physical databases. Assisted developers with Teradata load utilities and SQL.
  • Created tables, views in Teradata, according to the requirements.
  • Used Teradata utilities like MultiLoad, TPump, and FastLoad to load data into the Teradata data warehouse from Oracle and DB2 databases.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Created security settings in OBIEE Administration Tool and set up groups, access privileges and query privileges and also managed security for groups in Answers.
  • Integrated BI Publisher with OBIEE to build reports in Word, Excel, and other document formats.
  • Involved in Performance Tuning of mappings in Informatica.
  • Loaded data into the Teradata database using load utilities (FastLoad, MultiLoad, and TPump) and exported data using FastExport.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Resolved various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.
  • Performance tuning of Oracle Databases and User applications.
  • Created a function in Lambda that aggregates the data from incoming events, then stores the result data in Amazon DynamoDB and S3.
  • Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.
  • Designed and developed OLAP Cubes and Dimensions using SQL Server Analysis Services (SSAS).
  • Developed MLOAD scripts to load data from load-ready files into the Teradata warehouse (see the sketch after this list).
  • Extract, Transform, Load (ETL) development using SQL Server 2005/2008 Integration Services (SSIS).
  • Created source and target table definitions using SSIS; source data was extracted from flat files, SQL Server, and DB2 databases.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
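
A minimal MultiLoad sketch of the load-ready-file loads referenced above; the server, logon, table, and file names are illustrative:

#!/bin/sh
# Apply a pipe-delimited load-ready file to a target Teradata table.
# All object names and the input path are placeholders.
mload <<'EOF'
.LOGTABLE edw.customer_ml_log;
.LOGON tdprod/etl_user,etl_password;
.BEGIN IMPORT MLOAD TABLES edw.customer_dim
       WORKTABLES edw.customer_wt
       ERRORTABLES edw.customer_et edw.customer_uv;
.LAYOUT cust_layout;
.FIELD cust_id   * VARCHAR(18);
.FIELD cust_name * VARCHAR(60);
.DML LABEL ins_cust;
INSERT INTO edw.customer_dim (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.IMPORT INFILE /data/loadready/customer.dat
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;
.END MLOAD;
.LOGOFF;
EOF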

Technology: Windows 7, Unix, Informatica Power Center 10.1/9.1/Cloud, Oracle 11g, Teradata 15.10, MS SQL Server 2008, DB2 v8.1, Erwin, OBIEE 10.1.3.4, QlikView 12.10, Autosys, SQL, PL/SQL, UNIX, Shell scripts, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as Multi Load, Fast Load, Fast Export, BTEQ (Teradata) and SQL*Plus, SQL*Loader (Oracle).
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Involved in data mining using Teradata Miner.
  • Translated stakeholder requirements into over 10 different tangible deliverables such as functional specifications, use cases, user stories, workflow/process diagrams, and data flow/data model diagrams.
  • Engaged the client to gather software requirements/business rules and ensured alignment with development teams.
  • Used the Teradata EXPLAIN facility, which describes to end users how the database system will perform any request.
  • Executed, scheduled workflows using Informatica Cloud tool to load data from Source to Target.
  • Created Hierarchies, Levels, and implemented Business Logic by creating level based measures in OBIEE business model & mapping layer.
  • Created Security settings in OBIEE Administration Tool to set up groups, access privileges and query privileges and also managed security for groups in Answers.
  • Configured and created the repository using the OBIEE Administration Tool. Worked with the TS/API product, a system that allows products designed for SQL/DS to access the Teradata database machine without modification.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name and to load contract number coming from flat file name into Target.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Worked on different tasks in Workflow Manager like Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, Timer, and scheduling of the workflow.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Coded in Teradata BTEQ SQL and wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Performance Tuning in SQL Server 2008 using SQL Profiler and Data Loading.
  • Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source flat files and Oracle tables to target tables.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Installed SQL Server client-side utilities and tools for all the front-end developers/programmers.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Involved in writing procedures, functions in PL/SQL.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Extract, Transform, Load (ETL) development using SQL Server 2005/2008 Integration Services (SSIS).
  • Created reports for the BI team by using Sqoop to import data into HDFS and Hive (see the sketch after this list).
  • Installed and configured MapReduce, HIVE and the HDFS; implemented CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Worked with SQL*Loader tool to load the bulk data into Database.
  • Prepared UNIX shell scripts that were scheduled in Autosys for automatic execution at specific times.
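
A minimal Sqoop sketch of the HDFS/Hive loads referenced above; the JDBC URL, credentials, and table names are illustrative:

#!/bin/sh
# Pull a relational table into HDFS/Hive so the BI team can report on it.
# Connection details, the password file, and table names are placeholders.
sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user \
    --password-file /user/etl/.ora_pwd \
    --table SALES.DAILY_TXN \
    --hive-import --hive-table analytics.daily_txn \
    --num-mappers 4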

Technology: Windows 7, Linux, Informatica Power Center 9.1, Informatica Cloud, Oracle 11g, Teradata 15.10, MS SQL Server 2008, DB2 v8.1, Erwin, MS Visio, OBIEE 10.1.3.4, QlikView 12.10, Autosys, SQL, PL/SQL, UNIX, Shell scripts, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential

Informatica ETL developer

Responsibilities:

  • Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions, and packages to migrate data from a SQL Server database to an Oracle database (see the sketch after this list).
  • Used Teradata Manager, Index Wizard, and PMON utilities to improve performance.
  • Populated and refreshed Teradata tables using FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Worked on DTS/SSIS for transferring data from Heterogeneous Database (Access database and xml format data) to SQL Server.
  • Involved in data integration by identifying information needs within and across functional areas of the enterprise; performed database upgrade and migration with the SQL Server export utility.
  • Worked on the Reports module of the project as a developer on MS SQL Server 2005 (using SSRS, T-SQL, scripts, stored procedures and views).
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 8.5.
  • Experience in integration of heterogeneous data sources like Oracle, DB2, SQL Server and Flat Files (Fixed & delimited) into Staging Area.
  • Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.
  • Developed many Reports / Dashboards with different Analytics Views (Drill-Down, Pivot Table, Chart, Column Selector, and Tabular with global and local Filters) using OBIEE.
  • Reduced Teradata space used by optimizing tables - adding compression where appropriate and ensuring optimum column definitions.
  • Used OBIEE Web Catalog to set up groups, access privileges and query privileges.
  • The data obtained from various sources was fed into the staging area in Teradata.
  • Involved in loading of data into Teradata from legacy systems and flat files using complex MLOAD scripts and Fast Load.
  • Extracted data from Oracle database transformed and loaded into Teradata database according to the specifications.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Developed reusable Mapplets and Transformations.
  • Used a data integrator tool to support batch and real-time integration and worked on the staging and integration layers.
  • Optimized the performance of the mappings through various tests on sources, targets, and transformations.
  • Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.
  • Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
  • Scheduled sessions to extract, transform, and load data into the warehouse database based on business requirements.
  • Migrated SSIS Packages from SQL Server 2005 to SQL Server 2008.
  • Extracted, transformed, and loaded data from multiple data sources into the QlikView application.
  • The project initially used Tableau but migrated to QlikView in the second week of development.
  • Designed, built, tested, debugged, monitored, and troubleshot QlikView solutions.
  • Developed a conceptual model using Erwin based on requirements analysis.
  • Involved in writing Teradata SQL bulk programs and in performance tuning of Teradata SQL statements using Teradata EXPLAIN.
  • Audited application SQL code with DB2 Explain prior to production implementation
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX.
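
A minimal sketch of a conversion run like the one described in the first bullet above; the connect string, schema, and procedure names are illustrative:

#!/bin/sh
# Run a PL/SQL conversion procedure that moves staged SQL Server data into the
# Oracle target. conv_pkg.load_customers is a hypothetical packaged procedure;
# ORA_PWD is assumed to be set in the calling environment.
sqlplus -s etl_user/"$ORA_PWD"@ORCL <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
EXEC conv_pkg.load_customers(p_batch_dt => SYSDATE);
COMMIT;
EXIT
EOF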

Technology: Windows 7, Linux, Informatica Power Center 9.1/8.6, Oracle 11g/10g, Teradata 14.1, MS SQL Server 2008, DB2 v8.1, Erwin, MS Visio, Cognos 8.4, Business Objects XI r2/6.x, OBIEE 10.1.3.4, QlikView 12.10, Autosys, Control-M, SQL, PL/SQL, UNIX, Shell scripts, Java, JavaScript, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential

Sr. Informatica developer

Responsibilities:

  • Collected requirements from the business and studied them extensively; mapped subject areas onto dimensions, identified facts from the Key Performance Indicators (KPIs), and formulated conformed dimensions using dimensional modeling.
  • Created packages using SSIS for data extraction from Flat Files, Excel Files, and OLEDB to SQL Server.
  • Performance tuning of Oracle Databases and User applications.
  • Modeled multiple data marts within the warehouse and articulated respective Star/Snow-Flake Schemas.
  • Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows to load the data from source to stage, stage to dimensions, bridges, facts, summary and snapshot facts.
  • Designed and developed OLAP Cubes and Dimensions using SQL Server Analysis Services (SSAS).
  • Used external loaders like MultiLoad, TPump, and FastLoad to load data into the Teradata database.
  • Loaded data into Teradata using DataStage, FastLoad, BTEQ, Fast Export, MultiLoad, and Korn shell scripts.
  • Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata Physical Data Model.
  • Installed, implemented, and trained staff in the use of Platinum DB2 tools, Candle Monitor for DB2, and DB2 tuning techniques.
  • Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.
  • Made use of various Designer transformations like Source Qualifier, connected and unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence Generator, Union, and Update Strategy while creating mapplets/mappings.
  • Used DTS Packages as ETL tool for migrating Data from SQL Server 2000 to Oracle 10g
  • Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
  • Fine-tuned procedures/SQL queries for maximum efficiency in various databases using Oracle hints for rule-based optimization.
  • Contributed toward Informatica upgrade from version 8.6.1 to 9.1.0.
  • Developed Oracle PL/SQL code to write DDL/DML statements and Stored Procedures for data transformation and manipulation.
  • Created UNIX shell scripts to FTP and cleanse source data files and archive data and log files.
  • Worked on Teradata Queryman to validate the data in the warehouse as a sanity check.
  • Sequenced the different ETL components (Informatica workflows, Oracle SQL, UNIX scripts) in a list file and created a data load regulation program by writing a parent UNIX shell script that executes the components within the list file sequentially, invoking Informatica using the pmcmd utility and Oracle using sqlplus (see the sketch after this list).
  • Incorporated the mechanism to restart the load in the main program itself by logging status to ensure smooth resumption in case of any failures during the load.
  • Created modules and chains to schedule the loads using UC4 Applications Manager.
  • Created and maintained DDL for all DB2 databases.
  • Successfully deployed data marts (DM) like Prospecting/Sales DM, Finance DM, Risk DM, Originations DM, and Customer Analytics DM to constitute the entire warehouse.
  • Provided support for the applications after production deployment to take care of any post-deployment issues.
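
A minimal sketch of the parent load-regulation script described above; the file paths and component commands are illustrative:

#!/bin/sh
# Run the ETL components listed in a sequence file one by one, logging each
# completed step so a failed run can be restarted from the point of failure.
LIST=/etl/config/load_sequence.lst    # one component command per line
STATUS=/etl/logs/load_status.log      # steps completed in the current run

while IFS= read -r step; do
    # Skip steps that already completed in a previous (failed) run.
    grep -qxF "$step" "$STATUS" 2>/dev/null && continue
    echo "Running: $step"
    sh -c "$step"
    if [ $? -ne 0 ]; then
        echo "Step failed: $step" >&2
        exit 1
    fi
    echo "$step" >> "$STATUS"
done < "$LIST"

# All steps finished; clear the status log so the next run starts fresh.
: > "$STATUS"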

Technology: Windows 7, Unix, Informatica Power Center 8.6/8.5/8.1, Oracle 11g/10g, Teradata 14.1, MS SQL Server 2008, Cognos 8.4, Business Objects XI r2/6.x, Autosys, Control-M, SQL, PL/SQL, UNIX, Shell scripts
