
Sr. ETL Consultant Resume

Findlay, OH

SUMMARY

  • 7+ years of IT experience in the implementation of ETL methodologies to support data extraction, data migration, data transformation and data loading using Informatica PowerCenter 9.6/9.1/8 and Informatica Data Quality 9.6/9
  • Coordinated with Clients, Business Analysts & Data Modelers in understanding the BRD - Business Requirement Document, Mapping Document and Data Models
  • Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor and Server Manager)
  • Developed complex mappings using transformations like Source qualifier, Expression, Aggregator, Joiner, Normalizer, SQL, Union, Update Strategy, Filter, Router, XML transformation, Stored Procedure and more
  • Performed various workflow tasks such as sessions, event-raise and event-wait tasks, e-mail and command tasks, worklets, and workflow scheduling
  • Worked on Data Profiling using IDE (Informatica Data Explorer) and IDQ (Informatica Data Quality) to analyze trends and patterns in source data
  • Extensively involved in the optimization and tuning of Informatica mappings and sessions by defining and removing bottlenecks at source, transformation, and target layer
  • Developed ETL mappings to migrate data from different sources like SQL Server, Oracle, Mainframe, Teradata and flat files into data marts and data warehouse using Informatica Power Center - Designer, Workflow Manager, and Workflow Monitor
  • Experience with many MDM implementations, including Data Profiling, Data Extraction, Data Validation, Data Cleaning, Data Cleansing, Data Match, Data Load, Data Migration, Validation of Trust Score
  • Proficient in the creation of transformations such as Parser, Classifier, Standardizer and Decision in Informatica IDQ. Experienced in creating profiles, rules, scorecards for data profiling and quality using IDQ
  • Experience in the performance tuning of HiveQL and Pig scripts and the development of Hive database views to load into Hive and Netezza databases
  • Experienced in Informatica Big Data Edition (BDE) - Read, Write data from and to HDFS, XMap data transformation, Hadoop data ingestion, Hadoop cluster mapping
  • Experienced with Teradata utilities Fast Load, Multi Load, BTEQ scripting, Fast Export, SQL Assistant
  • Experience with Teradata DBA utilities Teradata Manager, Workload Manager, index wizard, Stats Wizard and Visual Explain
  • Built the platform to implement Informatica Cloud to extend the advantages of enterprise cloud computing
  • Extensive understanding of RDBMS concepts and Relational and Dimensional Data modeling, and experienced in the creation and use of stored procedures, roles, views, and materialized views
  • Experience on Slowly Changing Dimensions (SCD Type1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP and FACT and Dimensions tables, Physical and Logical data modeling
  • Involved in understanding business processes, grain identification, and identification of dimensions and measures for OLAP applications
  • Solid experience writing SQL queries, Stored Procedures, and UNIX shell Scripting
  • Experience in working with big data Hadoop stack tools like HDFS, HIVE, Pig, Sqoop and imported and exported data into HDFS and Hive using Sqoop
  • Extensive knowledge of the SDLC (Software Development Life Cycle); worked on projects following both Waterfall and Agile methodologies
  • Configured, developed, tested, implemented, supported, and maintained AutoSys JIL scripts
  • Excellent communication, presentation, and project management skills; a strong team player and self-starter able to work both independently and as part of a team, with a proven ability to bridge business and technical groups
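
The SCD Type 2 handling referenced above can be sketched in a few lines; this is a minimal in-memory illustration only (the row layout and the `apply_scd_type2` helper are illustrative, not drawn from any project described here):

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Apply SCD Type 2 logic to an in-memory dimension table.
    Each dimension row is a dict with 'key', 'attrs', 'effective_from',
    and 'effective_to' (None marks the current version of a key)."""
    today = today or date.today()
    current = {row["key"]: row for row in dimension if row["effective_to"] is None}
    for rec in incoming:
        existing = current.get(rec["key"])
        if existing is None:
            # New business key: insert a fresh current row
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "effective_from": today, "effective_to": None})
        elif existing["attrs"] != rec["attrs"]:
            # Changed attributes: expire the old version, insert a new one
            existing["effective_to"] = today
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "effective_from": today, "effective_to": None})
        # Unchanged records are left as-is, preserving history
    return dimension
```

In a real Informatica mapping this compare-expire-insert pattern is implemented with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy to route expirations versus inserts.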

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.5/9.1/8.6/8.5/8.1/7.1, Informatica Power Exchange 8.6/7.1, Informatica Power Connect, Informatica MDM 9.1, Informatica IDQ 9.1

Databases: Oracle 11g/10g/9i/8i/7.x, MS Access, Teradata 13/12/V2R5, DB2 UDB 9.7/10.5, SQL Server 2012/2008/2005

Data modeling Tools: Erwin r7/r4/r3.5, Microsoft Visio 2007/2010, PowerDesigner and Tableau 8.1/8.0/7.1

Utility tools: PL/SQL, SQL, SQL*Loader, TOAD, Shell Scripting

Operating Systems: UNIX, Linux, Windows XP/Vista/7/8, Windows Server 2003/2008

Scheduling Tools: Tivoli, Autosys, Control-M, Informatica Scheduler

Big Data Ecosystem: HDFS, HBase, Hadoop MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Cassandra

Programming Languages: HTML, PL/SQL, ASP.NET, Visual Basic 7.0/6.0/5.0, Python, R

Methodologies: Waterfall, Agile

PROFESSIONAL EXPERIENCE

Confidential, Findlay, OH

Sr. ETL Consultant

Responsibilities:

  • Coordinated with cross-functional team members, Business Analysts, and onsite and offshore teams. Involved in initial, incremental, and daily loads to ensure that data is loaded into tables in a timely and appropriate manner
  • Expertise in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality
  • Developed Informatica Power Center (ETL) mappings using Designer and extracted data from various sources, transforming data according to the requirements. Created Mapping Parameters, Session Parameters, Mapping Variables and Session Variables
  • Performed complex troubleshooting and root cause analysis and provided solutions to repair defects independently
  • Migrated Mappings, Sessions, Workflows from Development to SIT and then to UAT environment and fixed the defects
  • Developed the Informatica Mappings by usage of Filter, Expression, Sequence Generator, Update Strategy, Joiner, Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager
  • Strong experience in Data Warehousing (EDW), Dimensional Modeling using Star and Snowflake Schema, identifying Facts and Dimensions, and physical and logical data modeling using Erwin and ER/Studio
  • Developed workflows using Workflow Manager with Worklets, Event Waits, Assignments, Conditional Flows, Email and Command Tasks
  • Utilized BTEQ, FastExport, MultiLoad, and FastLoad utilities to load data from different data sources and legacy systems into Teradata
  • Worked on MDM Hub configurations - Data modeling & Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director (IDD)
  • Worked on Informatica Data Quality (IDQ) toolkit, analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting, and monitoring capabilities of IDQ
  • Worked extensively on CDC to capture changes in data. Used debugger to verify the mappings and collect details about the data and error conditions from troubleshooting
  • Optimization and performance tuning of Hive QL, formatting table column using Hive functions.
  • Performed requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and Data Integration projects
  • Developed UNIX shell scripts and PL/SQL procedures for table creation/dropping and performance indexes for pre- and post-session management, and worked extensively with Autosys for scheduling load jobs
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines
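
The change-data-capture work mentioned above can be illustrated with a simple checksum comparison between source and target snapshots; this is a minimal sketch, and the `row_hash`/`detect_changes` helpers and key column are hypothetical:

```python
import hashlib

def row_hash(row):
    """Stable checksum of a row's values, used to detect changed records."""
    payload = "|".join(str(row[col]) for col in sorted(row))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(source_rows, target_rows, key="id"):
    """Compare source and target snapshots: classify each source row as an
    insert or an update; target keys missing from the source are deletes."""
    target_hashes = {r[key]: row_hash(r) for r in target_rows}
    source_keys = {r[key] for r in source_rows}
    inserts, updates = [], []
    for row in source_rows:
        if row[key] not in target_hashes:
            inserts.append(row)
        elif row_hash(row) != target_hashes[row[key]]:
            updates.append(row)
    deletes = [k for k in target_hashes if k not in source_keys]
    return inserts, updates, deletes
```

In practice the same idea is expressed in a mapping as a Lookup on the target plus an Expression comparing a hash column, with a Router splitting the insert/update flows.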

Environment: Informatica Power Center 10.2, Informatica Data Quality (IDQ), Oracle 11g, Teradata 16, UNIX/LINUX, Shell Scripting, Autosys, T-SQL, PL/SQL, SQL, SSIS

Confidential

Informatica Developer

Responsibilities:

  • Developed ETL programs using Informatica Power center 9.6.1/9.5.1 to implement the business requirements
  • Developed Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer
  • Led a team of 6 other resources and worked on multiple small to medium-sized projects
  • Experienced with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings
  • Experience working on Data quality tools Informatica IDQ 9.1, Informatica MDM 9.1
  • Worked with Informatica tools IDQ Data Analyst, Developer with various data profiling techniques to cleanse, match/remove duplicate data, fixing bad data and fixing NULL values
  • Communicated with clients to convert business requirements into technical specifications, documented business needs, and was responsible for designing the Mapping Design documents and the Deployment documents
  • Implemented performance tuning on sources, targets, mappings, and sessions and reduced runtime, and worked on Dimension as well as Fact tables
  • Developed test plans and test scripts, conducted SIT and UAT testing, and coordinated with clients to support end-user acceptance testing
  • Created measurable metrics and attributes to support a robust and successful deployment of the existing Informatica MDM 9.5 platform
  • Developed Informatica MDM 9.5 Hub Console Mappings and Planned requirement analysis sessions with business users
  • Worked with Tidal to schedule Informatica jobs and implemented dependencies if necessary
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Wrote end-to-end documentation for project development, logic used, code, testing, changes, and corrections; optimized mappings by changing the logic and reduced running time
  • Implemented error-handling and exception handling by logging into the error tables and sending an alert message via e-mail to the concerned address list
  • Responsible for migrating the code to QA and production environments and extending the support for the QA and UAT for issues and post-implementation defects
  • Expertise in using Teradata utilities BTEQ, MultiLoad, FastLoad, TPT and FastExport in combination with Informatica for efficient loads into the Teradata database
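
The error-handling pattern described above (logging failures to an error table and alerting a distribution list) can be sketched as follows; this is a minimal illustration using SQLite as a stand-in error table, and the `log_and_alert` helper and table name are hypothetical (the e-mail is built but not sent, so the sketch stays self-contained):

```python
import sqlite3
from datetime import datetime
from email.message import EmailMessage

def log_and_alert(conn, job_name, error_text, recipients):
    """Record a load failure in an error table and build the alert e-mail
    that a real deployment would hand off to an SMTP relay."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS etl_errors "
        "(job_name TEXT, error_text TEXT, logged_at TEXT)"
    )
    conn.execute(
        "INSERT INTO etl_errors VALUES (?, ?, ?)",
        (job_name, error_text, datetime.now().isoformat()),
    )
    conn.commit()
    msg = EmailMessage()
    msg["Subject"] = f"ETL failure: {job_name}"
    msg["To"] = ", ".join(recipients)
    msg.set_content(error_text)
    return msg
```

In production the same pattern is typically driven from a post-session failure task or a wrapper shell script around the load job.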

Environment: Informatica 9.6/10.2, Teradata, UNIX, Informatica MDM 9.5

Confidential

Informatica Developer

Responsibilities:

  • Worked on data extraction, cleansing and data integration, and loading from Oracle database and excel files using Informatica power center capabilities
  • Developed mapplets that provide reusability in mappings and used debugger to detect bugs in existing mappings by analyzing data flow and evaluating transformations
  • Implemented Mapping parameters, variables, and session parameters to pass values dynamically between sessions. Developed scripts for creating new tables and views for new project enhancements, and created indexes on tables for faster retrieval of data and better performance
  • Partitioned fact tables and created materialized views which perform complex aggregations and high cost joins thereby improving performance
  • Created Triggers in Oracle PL/SQL to insert records from transaction tables into history tables and performed tuning of queries using explain plan and autotrace utilities
  • Implemented stored procedures, packages, and functions to move data from staging to various targets like data marts
  • Created Autosys JIL files to define jobs scheduling various project processes such as loading and unloading of data
  • Executed SQL queries, stored procedures and performed data validation as a part of backend testing
  • Performed unit testing, adhoc testing and documented all the plans and observations.
  • Deployed various scripts like SQL scripts, Informatica workflows, Autosys and Unix from lower environments (CIT, SIT, QA and UAT) to higher environments (Production)
  • Supported with code release activities to the production environment and coordinated with post-production validation
  • Worked on Dimensional modeling to design and develop star schemas using Erwin 4.0, identifying Fact and Dimension tables
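
The backend data validation mentioned above can be sketched as a row-count and sum reconciliation between staging and target tables; this is a minimal illustration using SQLite, and the `validate_load` helper and table/column names are hypothetical:

```python
import sqlite3

def validate_load(conn, staging_table, target_table, amount_col):
    """Backend validation after a load: reconcile the row count and the
    sum of a numeric column between the staging and target tables."""
    count_sql = "SELECT COUNT(*), COALESCE(SUM({col}), 0) FROM {tbl}"
    src_count, src_sum = conn.execute(
        count_sql.format(col=amount_col, tbl=staging_table)).fetchone()
    tgt_count, tgt_sum = conn.execute(
        count_sql.format(col=amount_col, tbl=target_table)).fetchone()
    return {
        "counts_match": src_count == tgt_count,
        "sums_match": src_sum == tgt_sum,
        "source": (src_count, src_sum),
        "target": (tgt_count, tgt_sum),
    }
```

The same reconciliation queries are commonly run directly in the target database as a post-load check before signing off SIT/UAT cycles.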

Environment: Informatica Power center 9.1/9.6, Oracle 11g, DB2, SQL Server 2005/2008, Windows 2003/2008, UNIX/LINUX, Shell Scripting, Autosys, HP Quality center, TOAD, Remedy

Confidential

Informatica ETL Developer

Responsibilities:

  • Worked on Informatica Power Center 8.6 for extraction, transformation, and load (ETL) of data in the data warehouse
  • Developed mappings and mapplets using Informatica Designer to load data into various targets like Teradata and ODS from various heterogeneous source systems
  • Implemented various transformations such as expression, filter, rank, source qualifier, joiner, aggregator, and Normalizer in the mappings and applied surrogate keys on target table and developed reusable transformations
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager
  • Created various tasks such as Worklets, Sessions, Batches, E-mail notifications, and Decision tasks, and scheduled jobs using Workflow Manager
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions, and workflows
  • Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing and moved the code to production environment
  • Extensively used TOAD for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping
  • Implemented SCD Type 1 and Type 2 as per business requirements to maintain historical data
  • Developed and implemented UNIX shell script for the start and stop procedures of the sessions and automated the Informatica jobs
  • Interacted with the onshore team daily on development activities and kept track of sprints
  • Created Low Level Design and High-Level Design documents for ETL Process and developed Test cases for the mappings developed
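
The UNIX start/stop automation mentioned above typically wraps Informatica's `pmcmd` command-line utility. The sketch below only assembles the `startworkflow` command line rather than executing it, so it runs without an Informatica installation; the service, domain, folder, and workflow names are placeholders:

```python
def build_pmcmd_start(service, domain, user, password, folder, workflow, wait=True):
    """Assemble the pmcmd command line used to start an Informatica
    workflow from a scheduler or shell wrapper. Returning the argument
    list keeps the sketch testable; a real script would pass it to the
    shell or a subprocess call."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-p", password,
           "-f", folder]
    if wait:
        # -wait blocks until the workflow finishes, so the wrapper's
        # exit status can reflect the workflow result
        cmd.append("-wait")
    cmd.append(workflow)
    return cmd
```

A scheduler job (Autosys, Control-M, cron) then invokes the generated command and branches on its exit code for success/failure handling.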

Environment: Informatica Power Center 8.6, Teradata, SQL Server 2005/2008, Windows 2003/2008, ServiceNow
