Sr. ETL Informatica Developer Resume

Houston, Texas

PROFESSIONAL SUMMARY:

  • Over 7 years of IT experience in the Banking, Finance, Telecommunications, Public Sector and Healthcare industries, including designing, developing, implementing and supporting Data Warehouses, data marts, data integration and ETL projects.
  • Good team player with excellent communication and interpersonal skills, able to perform individually as well as in a group or as a Team Lead. Excellent problem-solving, analytical and programming skills, good time management, and a quick learner with the initiative to pick up new technologies and tools quickly.
  • Over 7 years' experience as an Informatica Developer in data integration, migration and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, PowerExchange (CDC) and Informatica Data Quality, in both real-time and batch processes.
  • Extensive understanding of Informatica Grid architecture and Oracle/Teradata architecture, and of how load and resources are distributed across the Grid to maximize utilization of available resources and increase performance.
  • Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities, flexible in work schedules and possess good communication skills.
  • Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Oracle, Flat Files, Teradata, XML, SAP, DB2, SQL Server, Tibco.
  • Designed and developed Informatica mappings involving XML files, flat files, and Oracle and AWS Redshift tables.
  • Involved in the Technical Architecture design process, gathering high-level Business Requirement Documents and Technical Specification Documents from the Technical Architecture team and following its conventions.
  • Analyzed data through high-level audits and JAD sessions during business requirement definition, evaluating granularity, historical consistency, valid values and attribute availability.
  • Proven ability to interface and coordinate with cross-functional teams, analyze existing systems and business needs, design and implement database solutions, and provide integration solutions.
  • Well acquainted with Performance Tuning of sources, targets, mapping and sessions to overcome the bottlenecks in mappings.
  • Developed complex mappings using Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Router, Union, Stored Procedure, Web Services, Transaction Control and other transformations, including Slowly Changing Dimensions (Type 1, Type 2 and Type 3) to keep track of historical data.
  • Day-to-day responsibilities include reviewing high-level documentation and TRDs, identifying development objectives, and tracking defects and fixes.
  • Implemented Performance Tuning techniques at application, database, and system levels by using Hints, Indexes, Partitioning, Materialized View, External Table, Procedures, Functions and Explain Plan.
  • Automated Informatica through UNIX shell scripts for running sessions, aborting sessions and creating parameter files (a sketch follows this list). Wrote a number of Korn shell scripts to run various jobs, with experience in FTP file processing.
  • Created database objects such as tables, views, synonyms, DB links and indexes from logical database design documents; experienced in writing, testing and implementing stored procedures, functions and triggers using Oracle PL/SQL and Teradata.
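
A minimal Korn shell sketch of the session automation described above; the service, domain, folder, workflow and directory names are illustrative assumptions, not actual project values.

#!/bin/ksh
# Hypothetical example: build a PowerCenter parameter file, then run a
# workflow with pmcmd. All names below are placeholders.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE=/opt/infa/params/wf_daily_load_${RUN_DATE}.par

# Write the parameter file the session reads at run time
# (the backslashes keep the literal $$ prefix Informatica expects).
cat > "$PARAM_FILE" <<EOF
[FOLDER_EDW.WF:wf_daily_load.ST:s_m_load_customer]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/inbound
EOF

# Start the workflow and wait; a non-zero exit code signals failure.
pmcmd startworkflow -sv INFA_INT_SVC -d INFA_DOMAIN \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f FOLDER_EDW -paramfile "$PARAM_FILE" -wait wf_daily_load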

TECHNICAL SKILLS:

Data Modeling: 1NF/2NF/3NF, Logical and Physical Modeling with ERwin, ER/Studio, MS Visio.

ETL Tools: Informatica PowerCenter 10.2/10.1/9.6/9.1/8.6.1/8.6.0/8.1.1/8.0/7.x/6.x, PowerExchange 8.x, IDQ (rules, data profiling, data cleansing, data parsing), IDE, DT Studio with structured and semi-structured data.

Databases: Oracle 8i/9i/10g/11g, MS SQL Server 2000/2005/2008/2012, Teradata (V2R5, 13), Teradata utilities TTU (BTEQ, MLOAD, FLOAD, FastExport).

Big Data: Hadoop, MapReduce, Teradata Aster, Query It, Teradata Studio, HBase, Hive, and PuTTY.

Programming Languages: SQL, SQL*Plus, PL/SQL, Teradata procedures, PowerShell/Batch/UNIX/Perl shell scripting, Java, HTML, XML.

Reporting Tools: Teradata SQL Assistant, Toad 9.x, Oracle SQL Developer, SSMS, SSRS, Visual Studio, MS Excel, MS Word, Autosys 11.3, IBM Cognos 8.

Environment: UNIX, Linux, IBM AIX, Windows XP/NT/2003/Vista/7

Cloud: AWS Lambda, S3, SQS, and EC2

PROFESSIONAL EXPERIENCE:

Sr ETL Informatica Developer

Confidential, Houston, Texas

Responsibilities:

  • Involved in business requirements analysis and design; prepared functional and technical design documents.
  • Used Erwin for logical and Physical database modeling of the staging tables, worked with the Data Modeler and contributed to the Data Warehouse and Data Mart design and specifications.
  • Developed technical design specifications to load the data into the data mart tables, conforming to the business rules.
  • Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable components such as Mapplets, Reusable transformations and sessions etc.
  • Involved in loading the data from source tables to ODS (Operational Data Store) tables using transformation and cleansing logic in Informatica.
  • Developed complex Informatica mappings to load the data from various sources using different transformations like Source Qualifier, connected and unconnected Lookup, Update Strategy, Expression, Aggregator, Joiner, Filter, Normalizer, Rank and Router.
  • Developed mapplets and worklets for reusability.
  • Developed workflow tasks like reusable Email, Event-Wait, Timer, Command and Decision tasks.
  • Implemented partitioning and bulk loads for loading large volume of data.
  • Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Involved in performance tuning of mappings, transformations and sessions to optimize session performance.
  • Prepared AWS Redshift/SQL queries to validate the data in both source and target databases (see the sketch after this list).
  • Strong understanding of Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM) and data validation.
  • Installed/configured Teradata PowerConnect for FastExport for Informatica, and the Amazon Redshift cloud data integration application for faster data queries.
  • Created JDBC and ODBC connections in Amazon Redshift from the Connect Client tab of the console.
  • Automated administrative tasks in Amazon Redshift such as provisioning and monitoring.
  • Familiar with Amazon Redshift columnar storage, data compression and zone maps.
  • Parsed the target data in IDQ using the Parser transformation.
  • Worked on data profiling and development of various data quality rules using Informatica Data Quality.
  • Worked extensively on the Data Masking transformation, applying all available rules to mask data in QA/DEV/SIT environments depending on SME requirements. Implemented various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Sequence Generator and Router.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Designed, Installed, Configured core Informatica Master Data Management (MDM) Hub components such as Informatica Master Data Management (MDM) Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter & Data Modeling.
  • Used Informatica's Data Transformation (B2B) tool to retrieve unstructured (XML) data.
  • Developed Informatica Data Director (IDD) applications and queries used for Data Steward Analysis.
  • Created Materialized views for summary tables for better query performance.
  • Implemented weekly error tracking and correction process using Informatica.
  • Developed Documentation for all the routines (Mappings, Sessions and Workflows).
  • Experience in integrating various data sources with multiple relational databases like Oracle, Teradata and SQL Server, and worked on integrating data from flat files such as fixed-width, delimited, CSV, HL7 (Health Level 7) and EPIC EXT files.
  • Worked with DT/DX (Data Transformation Studio), Data Transformation Accelerator, the Data Format Transformation Library and the Data Transformation Engine, with full integration into PowerCenter.
  • Worked on extracting data from heterogeneous source systems like MS SQL Server, Oracle and HL7 files, loading into the landing layer and then into the Data Warehouse.
  • Created test cases and detailed documentation for Unit, System, Integration and UAT testing to check the data quality.
  • Worked closely with the end users in writing the functional specifications based on the business needs. Analyzed the source data coming from Oracle. Coordinated with the Data Warehouse team in developing the data discovery model.
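
Below is a minimal sketch of the kind of source-to-target validation query referenced in the list above; the connect strings, cluster endpoint and table names are assumptions for illustration, not actual project values.

#!/bin/ksh
# Illustrative source-vs-target row count check; all names are placeholders.

# Count rows in the Oracle source (-s suppresses banners for scripting).
SRC_CNT=$(sqlplus -s "$ORA_USER/$ORA_PWD@ORCL" <<'EOF'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM stg.customer;
EOF
)

# Count rows in the Redshift target over its psql-compatible endpoint.
TGT_CNT=$(psql -h example-cluster.redshift.amazonaws.com -p 5439 \
    -U "$RS_USER" -d edw -t -A -c 'SELECT COUNT(*) FROM dw.customer;')

# Surface any mismatch for the daily audit log.
if [ "$SRC_CNT" -ne "$TGT_CNT" ]; then
    echo "COUNT MISMATCH: source=$SRC_CNT target=$TGT_CNT" >&2
    exit 1
fi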

Environment: Informatica 10.1, Netezza, Oracle 11g, OBIEE 11.1.1.7, DB2, Flat Files (fixed-width and delimited), SQL, TOAD, Data Masking, AWS Redshift, AWS Lambda, Informatica Data Transformation (B2B), Informatica MDM 10.1, PuTTY, UNIX, Autosys r11.

Sr ETL Informatica Developer

Confidential, San Francisco, CA

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Analyzed the system for the functionality required as per the requirements and created System Requirement Specification document (Functional Requirement Document).
  • Involved in the development of the conceptual, logical and physical data model of the star schema using ERWIN.
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Developed various complex mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Worked in the AWS Redshift database, designing tables and loading data.
  • Used Amazon S3 for loading data from flat files/SQL Server into AWS Redshift (see the sketch after this list).

  • Optimized and Tuned SQL queries used in the source qualifier of certain mappings to eliminate Full Table scan.
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.
  • Created various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
  • Installed and configured OBIEE 11, BI Publisher Enterprise and the Repository Creation Utility.
  • Expertise in debugging and optimizing the Oracle BI / OBIEE Dashboards / Reports and ODI Mappings / Workflows.
  • Developed UNIX shell scripts to generate parameter files and executed Oracle procedures as batch jobs.
  • Created scripts for Batch test and set the required options for overnight, automated execution of test scripts.
  • Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, in order to check for consistency.
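
A hedged sketch of the S3-to-Redshift load pattern mentioned in the list above; the cluster endpoint, bucket, table and IAM role are illustrative assumptions.

#!/bin/ksh
# Illustrative S3-to-Redshift COPY; endpoint, bucket, role and table are
# placeholders, not actual project values.
psql -h example-cluster.redshift.amazonaws.com -p 5439 \
    -U "$RS_USER" -d edw <<'EOF'
-- COPY loads the delimited file from S3 in parallel across cluster slices.
COPY stg.orders
FROM 's3://example-bucket/inbound/orders.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
CSV
IGNOREHEADER 1;
EOF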

Environment: Informatica 9.6, Netezza, Oracle 11g, OBIEE 11.1.1.7, DB2, Flat Files (fixed-width and delimited), SQL, TOAD, PuTTY, UNIX, Autosys r11.

Sr. ETL Developer

Confidential

Responsibilities:

  • Worked extensively on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer.
  • Used Informatica PowerCenter 10.1 to make changes to the existing ETL. Wrote PL/SQL procedures, called from the Stored Procedure transformation, to perform database actions such as truncating the target before load, deleting records based on a condition, and renaming tables (a sketch follows this list).
  • Created ETL mappings with complex business logic for high-volume data loads using various transformations such as Aggregator, Sorter, Filter, Normalizer, SQL Transformation, Lookup, Joiner, Router, Update Strategy, Union and Sequence Generator, along with transformation-language features like expressions, constants, system variables and data format strings.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.x.
  • Extensively worked on building workflows, worklets, sessions, command tasks etc.
  • Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD).
  • Created mappings using different IDQ transformations like Parser, Standardizer, Match, Labeler and Address Validator.
  • Worked extensively on ETL performance tuning to tune the data loads, and worked with DBAs on SQL query tuning.
  • Used iPaaS and DaaS cloud services enabling customers to develop, execute and govern integration flows.
  • Cloud computing experience with hands-on exposure to cloud and iPaaS platforms.
  • Performed design and analysis on business systems applications, system interfaces, databases, reporting and business intelligence systems.
  • Involved in writing SQL Stored procedures and Shell Scripts to access data from various sources.
  • Performed System, Integration testing and supported User Acceptance Testing.
  • Managed stakeholder communication and validated the fixed issues in order to ensure support availability as per agreed SLAs.
  • Responsible for handling multiple (18) Informatica applications to design, implement, support and operate the Data Integration infrastructure.
  • Worked on planning, designing and implementing standards, guidelines and best practices.
  • Experienced in using the Informatica Cloud REST API to access and perform Informatica Cloud tasks.
  • Experienced in developing, maintaining and enhancing Informatica Cloud mappings, task flows and processes (Data Synchronization, Data Replication).
  • Experienced in integrating Salesforce and Web Services using Informatica PowerExchange and Informatica Cloud.
  • Conducted requirement planning sessions for hands-on development of the implementation strategy, and customized payroll layouts based on best practices and plan knowledge.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ's standardized plans for addresses and names clean ups.
  • Worked on IDQ file configuration on users' machines and resolved the issues.
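
A minimal sketch of the kind of pre-load PL/SQL procedure described above, wrapped in a sqlplus call; the procedure and table names are hypothetical.

#!/bin/ksh
# Illustrative pre-load procedure of the kind invoked from a Stored
# Procedure transformation; all names are placeholders.
sqlplus -s "$ORA_USER/$ORA_PWD@ORCL" <<'EOF'
CREATE OR REPLACE PROCEDURE prc_truncate_stage AS
BEGIN
  -- TRUNCATE is DDL, so it has to run via dynamic SQL inside PL/SQL.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg.daily_load';
END;
/
-- Smoke test before wiring the call into the mapping.
EXEC prc_truncate_stage
EXIT
EOF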

Environment: Informatica PowerCenter 10.2, Oracle 11g, Teradata v13, Teradata SQL Assistant, RHEL (awk, sed), Windows 2003 Server, Toad, SQL Developer.

Sr. ETL Developer

Confidential, Minneapolis, MN

Responsibilities:

  • As a Sr. ETL developer, maintained a high-level overview of all required TRDs and mapping documents, reviewing them from a development perspective to determine what development was required.
  • Involved in meetings with the business to discuss the business requirements, ETL specifications, and the calculation of all the metrics, sub-metrics and reports that Frontier generates for the business to review.
  • As the primary developer on the team, responsible for scheduling team meetings to discuss development, required changes, and the timeline to meet all SLAs.
  • As part of the development team, understood the business requirements and was involved in code changes to the Oracle stored procedures and packages required for the different metrics that calculate revenue based on CLEC usage, Retail and DUF usage.
  • Developed the Informatica workflows required for the new Order, Trouble and Billing feeds using Informatica PowerCenter 9.1 and PowerExchange, and was responsible for the daily load of over 500 files.
  • Developed standard, reusable mappings and mapplets using transformations like Expression, Lookup and Joiner to extract, transform and load the data between different environments using the relational writer and FastExport.
  • Responsible for analyzing the data coming from different sources (Oracle 11g, Teradata v13, XML, flat files) and databases using complex queries.
  • Independently wrote Oracle stored procedures and functions that calculate penalties at the end of the month.
  • Created and modified Teradata utility scripts (BTEQ, MLOAD, FLOAD) to load the data from various data sources and legacy systems into the Teradata test and production environments (see the BTEQ sketch after this list).
  • Used the Teradata EXPLAIN and visual explain to analyze the cost and improve the Query performance.
  • Created the workflow for Incremental Load and Slowly Change Dimensions (SCD1, SCD2, SCD3) using the Lookup and Aggregator transformation both for real time and Batch Processes.
  • Developed real-time workflows that process messages and message queues from MQ Series and web service messages, using the XML Parser and Web Service Consumer transformations.
  • Involved in writing various shell scripts in UNIX/Perl and automating the Informatica workflows using cron jobs.
  • Used hints, stored procedures and complex queries in Toad 9.5.0 to help the team with database performance tuning.
  • After development of the Informatica workflows, responsible for importing and exporting the code to QA and PRD.
  • Involved in unit testing the Informatica workflows, and worked with the testing team to help them write test cases for the different metrics.
  • Used PowerExchange for Web Services to submit member assessments to the health plan and consume the results.
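
A minimal BTEQ sketch of the Teradata utility scripts mentioned above; the TDPID, credentials and tables are illustrative assumptions, not the project's actual objects.

#!/bin/ksh
# Illustrative BTEQ load step; TDPID, credentials and tables are placeholders.
bteq <<EOF
.LOGON tdprod/${TD_USER},${TD_PWD}

/* Stage the day's usage records ahead of the month-end penalty job. */
INSERT INTO edw.daily_usage
SELECT * FROM stg.daily_usage
WHERE  load_dt = CURRENT_DATE;

/* Fail the calling script if the insert errored. */
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF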

Environment: Informatica PowerCenter 9.1, Oracle 11g, Teradata v13, Teradata SQL Assistant, RHEL (awk, sed), Windows 2003 Server, Toad, SQL Developer.

Sr. ETL Developer

Confidential, Plano TX

Responsibilities:

  • Gathered requirements from business analysts for the design and development of the system, developed transformation logic, and designed various complex mappings in the Designer for data load and data cleansing.
  • Involved in writing the Functional Design Specification (FDS) document, translating BRD business requirement documents into technical specifications, and creating/maintaining/modifying database design documents with detailed descriptions of logical entities and physical tables.
  • Used Master Data Management (MDM) processes and tools to provide data transformation, data consolidation and data governance in support of the business's Decision Support System.
  • Categorized Dimension and Fact table by interviewing Oracle functional expert and Business Analyst and evaluated the granularity, Historical consistency and attribute availability.
  • Extensively used Star schema and Snow Flake Schema methodologies in building and designing the logical data model in SCD1, SCD2, and SCD3.
  • Created an environment of trust and open communication, provided the team with a vision of the project objectives, and motivated and inspired team members.
  • Categorized transactions by department (sales, customers, etc.) by looking at the transaction codes in the code master table, to implement the correct logic and filter out unnecessary transactions.
  • Responsible for communicating with the offshore team, keeping time-critical work with the onsite team and assigning non-time-critical work to the offshore team.
  • Used Informatica PowerExchange with the CDC option to capture inserted, updated, deleted and otherwise changed data, and exchanged data between different partners, each with their own formats and databases.
  • Designed mappings using Mapping Designer to load the data from various sources (flat files, SAP, Teradata, VSAM) using different transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Joiner, Filter and Sorter.
  • Extensively used the capabilities of PowerCenter such as file lists, pmcmd, target load order and concurrent lookup caches. Created and monitored workflows using Workflow Manager and Workflow Monitor.
  • Extensively worked on mapping variables, mapping parameters, workflow variables and session parameters, and built parameter files to pass values through shell scripting.
  • Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Used the Debugger to test the mappings and fix bugs, finding errors by fetching the session log from Workflow Monitor.
  • Extensively involved in identifying performance bottlenecks in targets, sources, mappings, sessions and successfully tuned them by using persistence cache, lookup transformation and parallel processing for maximum performance.
  • Did error handling and performance tuning in Teradata queries and utilities for the existing complex scripts.
  • Created Autosys jobs to schedule sessions and workflows on Linux, and scheduled various reports using the Schedule Management tool/Event Studio.
  • Developed unit tests, monitoring run times to ensure successful execution of the data loading processes, and was involved in preparing high-level documents about mapping details.
  • Used shell scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information from the log files on the Linux server.
  • Involved in preparing the SDD (System Design Document) based on the PRD (Project Requirement Document), TRD (Technical Requirement Document) and market analysis team inputs.
  • Worked with the Talend solution implementation guidelines on a daily basis and helped the team integrate Informatica with big data.
  • As a Sr. ETL developer, integrated Informatica with HDFS and Teradata Aster.
  • Developed Hive and Pig scripts for daily data analysis (see the sketch after this list).
  • Integrated with Hadoop HDFS using PowerExchange for Hadoop to load data.
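
A small sketch of a daily Hive analysis run of the sort referenced above; the database, table and report path are hypothetical placeholders.

#!/bin/ksh
# Illustrative daily Hive summary; database, table and paths are placeholders.
RUN_DT=$(date +%Y-%m-%d)

# hive -e runs an inline HiveQL statement; results go to a dated report file.
hive -e "
  SELECT txn_code, COUNT(*) AS txn_cnt
  FROM   edw.transactions
  WHERE  txn_dt = '${RUN_DT}'
  GROUP  BY txn_code;
" > /data/reports/txn_summary_${RUN_DT}.txt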

Environment: Informatica PowerCenter 8.6, Teradata, Teradata SQL Assistant, IBM AIX (awk, sed, Korn shell), Windows 2003 Server, Autosys 11.3

ETL Developer

Confidential, NY

Responsibilities:

  • Responsible for analyzing functional specifications and preparing technical specifications, development, deployment and testing according to the business requirements.
  • Involved in the data profiling process to systematically examine the quality and scope of the data sources, in order to build a reliable ETL system with minimal transformation and human intervention before loading to the target tables.
  • Involved in the cleansing process to clean invalid values (such as zip code formats), ensure consistency across records, remove duplicates and ensure that complex business rules have been enforced.
  • Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Extracted the data from flat files and relational databases into the staging area and populated the data warehouse using SCD Type 2 logic to maintain the history (the expire-and-insert pattern is sketched after this list).
  • Developed a number of complex Informatica mappings, mapplets and reusable transformations to implement the business logic and to load the data incrementally.
  • Employed the Source Qualifier, Lookup, Expression, Aggregator, Rank, Sequence and Joiner Transformations in the Mappings to populate data into the target.
  • Tested behavior of the mappings using the Load Test option & Unit Test cases and debugging using the Debugger to see mapping flow through various transformations.
  • Closely worked with the QA team during the testing phase and fixed the bugs that were reported.
  • Created & maintained tables, views, synonyms and indexes from Logical database design document and wrote stored procedures in PL/SQL for certain key business requirements.
  • Troubleshot connectivity problems; looked for errors by maintaining a separate log file.
  • Performance tuned the mappings by integrating the logic and reducing the number of transformations used.
  • Maintained high-level documents including the source name, target name, number of rows in both target and source, transformations used, and session information.
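
The expire-and-insert pattern behind an SCD Type 2 load, sketched as SQL run through sqlplus; the dimension and staging tables are assumptions, and in the project itself this logic lived in Informatica mappings rather than hand-written SQL.

#!/bin/ksh
# Illustrative SCD Type 2 expire-and-insert; table and column names are
# placeholders (the production logic ran inside Informatica mappings).
sqlplus -s "$ORA_USER/$ORA_PWD@ORCL" <<'EOF'
-- Expire the current version of any customer whose tracked attribute changed.
UPDATE dw.customer_dim d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg.customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.address <> d.address);

-- Insert a fresh current row for new and changed customers alike.
INSERT INTO dw.customer_dim (cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.address, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg.customer s
WHERE  NOT EXISTS (SELECT 1 FROM dw.customer_dim d
                   WHERE  d.cust_id = s.cust_id
                   AND    d.current_flag = 'Y');
COMMIT;
EXIT
EOF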

Environment: Oracle 10g, Informatica PowerCenter 7.1, PL/SQL, UNIX, Windows 2003 Server, Toad 8.6.1.

ETL Developer

Confidential

Responsibilities:

  • Understanding the business needs by gathering information from Business Analyst, Data Modeler and Business user.
  • Involved in the process of preparing the physical design of inputs and outputs by documenting record layouts, source and target locations, and file/table sizing information.
  • Extensively explored the physical model of the Data Warehouse, including all the dimensions and facts (household, branch, account, etc.), to better understand the requirements and meet all SLAs.
  • Used the Informatica Designer tools to design source definitions, target definitions and transformations to build mappings, and documented how the data would be transformed.
  • Monitored batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database using the cron scheduler.
  • Supported technical teams involved with Oracle issues and operations; also developed UNIX shell scripts and PL/SQL procedures to extract and load data.
  • Used Informatica command-line utilities like pmcmd to communicate with the Integration Service and perform tasks such as starting and aborting workflows.
  • Performance-tuned the mappings using different techniques like indexes, stored procedures, functions and materialized views to provide maximum efficiency.
  • Developed UNIX shell scripts to move source files to an archive directory, maintain logs and automate processes using command utilities like sed, awk and cut (a sketch follows this list).
  • Involved in unit testing to validate the mappings and sessions before the testing phase started, and reported all bugs.
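
A short sketch of the file-handling automation described above; the directories and the crontab entry are illustrative assumptions.

#!/bin/ksh
# Illustrative post-load archive script; all paths are placeholders.
SRC_DIR=/data/inbound
ARC_DIR=/data/archive/$(date +%Y%m%d)
LOG=/data/logs/archive.log

mkdir -p "$ARC_DIR"

# Move processed source files into a dated archive directory.
for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue
    mv "$f" "$ARC_DIR/" && echo "$(date '+%H:%M:%S') archived ${f##*/}" >> "$LOG"
done

# Pull an error count out of the load log with awk for the daily summary.
awk '/ERROR/ {n++} END {print "errors:", n+0}' /data/logs/load.log >> "$LOG"

A crontab entry along the lines of 0 2 * * * /opt/scripts/archive_inbound.ksh would run such a script nightly after the loads finish.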

Environment: Informatica Power Center 6.2, Oracle 9i, PL/SQL, Windows NT, UNIX.
