
ETL Consultant Resume


Nashville, TN

SUMMARY

  • Over 8 years of IT experience in Data Warehousing and Business Intelligence, with emphasis on Business Requirements Analysis, Application Design, Development, Testing, Implementation and Maintenance of Data Warehouse and Data Mart systems.
  • Involvement in all phases of SDLC (Systems Development Life Cycle) from analysis and planning to development and deployment.
  • Experience in the design and development of ETL (Extract, Transform, Load) methodology supporting data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter.
  • Excellent data analysis skills and the ability to translate business logic into mappings using complex transformation logic for ETL processes.
  • Expert in troubleshooting, debugging and improving performance at different stages: database, workflows and mappings.
  • Experience in scheduling workflows using both Informatica and external scheduling tools such as Autosys, Control-M and crontab.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Good exposure to Informatica MDM, where data cleansing, de-duplication and address correction were performed.
  • Experience in integrating various data sources such as SQL Server, Oracle, Teradata, flat files and DB2 on mainframes.
  • Hands-on experience in the Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant and BTEQ.
  • Sound knowledge of relational and dimensional modeling techniques and data warehouse concepts: Star and Snowflake schemas, SCDs, OLAP, OLTP, and normalization/denormalization.
  • Developed Slowly Changing Dimensions of Type 1, Type 2 and Type 3.
  • In-depth knowledge in designing Fact and Dimension tables and Physical and Logical data models.
  • Experience in writing SQL queries and PL/SQL, including the use of stored procedures, functions and triggers to implement business rules and validations.
  • Knowledge in design and development of Business Intelligence reports using BI tools Business Objects.
  • Involved in writing Unit test cases for complex scenarios.
  • Experience in creating UNIX shell scripts.
  • Enthusiastic and goal-oriented team player possessing excellent communication, interpersonal skills and leadership capabilities with high level of adaptability.
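The SCD work summarized above can be illustrated with a minimal Type 2 load sketch in Teradata-style SQL. This is an illustrative sketch only: the table and column names (dim_customer, stg_customer, curr_flag, etc.) are hypothetical, and the surrogate key is assumed to be an identity column on the dimension table.

```sql
/* Hypothetical Type 2 SCD load: expire the current row, then insert the new
   version. dim_customer holds customer_id (natural key), address,
   eff_start_dt, eff_end_dt and curr_flag; customer_sk is an identity column. */

/* Step 1: close out current rows whose tracked attribute changed in staging. */
UPDATE dim_customer
SET eff_end_dt = CURRENT_DATE - 1,
    curr_flag  = 'N'
WHERE curr_flag = 'Y'
  AND customer_id IN (
      SELECT s.customer_id
      FROM stg_customer s
      JOIN dim_customer d
        ON d.customer_id = s.customer_id
       AND d.curr_flag = 'Y'
      WHERE d.address <> s.address);

/* Step 2: insert new versions as the current rows. After step 1, both
   brand-new keys and just-expired keys have no current row, so this
   anti-join picks up exactly the rows that need a fresh version. */
INSERT INTO dim_customer (customer_id, address, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id
 AND d.curr_flag = 'Y'
WHERE d.customer_id IS NULL;
```

Type 1 differs only in that step 1 becomes an in-place UPDATE of the changed attribute, with no history row inserted.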

TECHNICAL SKILLS

  • ETL: Informatica PowerCenter 8.6 & 9.x, Informatica Cloud, Informatica Data Quality (IDQ), SSIS
  • Databases: Oracle 11g/10g/9i, Teradata 13.10/14.10, SQL Server 2005/2000, Netezza, MS Access
  • Data Modelling: Erwin 9.2, MS Visio
  • Utilities: Toad 7.x, Toad Data Point 3.2, Nexus Chameleon, Teradata SQL Assistant, Teradata load utilities, Teradata Viewpoint, Teradata performance tuning
  • Scheduling tools: WLM, ESP, Workflow Monitor, Tivoli
  • CRM: Salesforce
  • Reporting: SSRS

PROFESSIONAL EXPERIENCE

Confidential - Nashville, TN

ETL Consultant

Responsibilities:

  • Working on a migration project: designing the move of an existing SQL Server database to a Teradata database and decommissioning the legacy system.
  • Interact with the business to create and design data models and to build the ETL components.
  • Perform end-to-end impact analysis to decommission the existing SQL Server tables.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Worked on Informatica power center tools - designer, repository manager, workflow manager, and workflow monitor.
  • Extracted the data from the flat files and other RDBMS database into staging area and populated onto data warehouse.
  • Cleansed the data using MDM techniques.
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
  • Designed and developed ETL and Data Quality mappings to load and transform data from source to DWH using PowerCenter and IDQ.
  • Used the DataStage Designer to develop various job processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Used the BTEQ, FastLoad and MultiLoad Teradata utilities to export and load data to/from flat files.
  • Performed logical and physical data modelling with Erwin for the data warehouse database using a star schema.
  • Involved in performance tuning at source, target, mapping, sessions and system levels
  • Successfully migrated and decommissioned 15 tables to help the reporting team to use a single source of truth.
  • Created data models and updated the metadata for future reference.
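Flat-file loads of the kind described in the bullets above are typically scripted through a Teradata utility. Below is a minimal MultiLoad sketch; the logon string, input file path, layout and table names are all placeholders, not details from the actual engagement.

```sql
.LOGTABLE work_db.cust_ml_log;          /* restart/checkpoint log table */
.LOGON tdpid/etl_user,password;         /* placeholder credentials */

.BEGIN IMPORT MLOAD TABLES stage_db.stg_customer;

/* Describe the pipe-delimited input record. */
.LAYOUT cust_layout;
.FIELD in_customer_id * VARCHAR(18);
.FIELD in_address     * VARCHAR(200);

/* DML applied to each input record. */
.DML LABEL ins_cust;
INSERT INTO stage_db.stg_customer (customer_id, address)
VALUES (:in_customer_id, :in_address);

/* Read the flat file and apply the insert. */
.IMPORT INFILE /data/in/customer.dat
  FORMAT VARTEXT '|'
  LAYOUT cust_layout
  APPLY ins_cust;

.END MLOAD;
.LOGOFF;
```

FastLoad follows the same shape for empty-table loads, and FastExport/BTEQ cover the export direction.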

Environment: Informatica 9.5, IDQ, MDM, SQL Server, Teradata 14.10, Linux, SSIS, Erwin 9.2, Toad Data Point, Microsoft Office 2010, Teradata load utilities, Teradata Viewpoint, Salesforce, DataStage, performance tuning.

Confidential, Columbus, OH

Sr. ETL Developer

Responsibilities:

  • Work closely with inter-departmental and cross-functional teams to capture and refine business requests and correctly relate them to data warehouse terminology.
  • Correctly interpret the business requirement specifications.
  • Answer queries raised by the business team.
  • Prepare a Logical and Physical Data model as per requirements.
  • Prepare a Teradata ETL design with Teradata Utilities and Informatica.
  • Developed an end-to-end detailed-level design document translating all requirements into a technical understanding.
  • Developed Test plans to verify the logic of every phase. The test plans included counts verification, look up hits, business logic of each element of data attributes and target counts.
  • Writing Teradata SQL queries for joins and any modifications to the tables.
  • Creation of customized MultiLoad scripts on the UNIX platform for Teradata loads.
  • Provide design for CDC implementation for real-time data solutions.
  • Interact with the business to collect critical business metrics and provide solutions to certify data for business use; analyze and recommend solutions for data issues.
  • Writing Teradata BTEQ scripts to implement the business logic.
  • Interact with technical analysts, business analysts and operations analysts to resolve data issues.
  • Creation of BTEQ, FastExport, MultiLoad, TPump and FastLoad scripts for extracting data from various production systems.
  • Involved in Logical and Physical data model design.
  • Generated DDL’s and used Teradata best practices to implement Physical data model.
  • Hands-on experience in the Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant and BTEQ.
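The BTEQ scripting mentioned above generally wraps the business-logic SQL in a batch file with return-code checks so the scheduler can detect failures. A minimal sketch follows; the logon string, database, table and the 90-day inactivity rule are all hypothetical.

```sql
.LOGON tdpid/etl_user,password;         /* placeholder credentials */

/* Hypothetical business rule: flag accounts inactive after 90 days
   without a transaction. */
UPDATE edw_db.account
SET status_cd = 'INACT'
WHERE last_txn_dt < CURRENT_DATE - 90
  AND status_cd = 'ACTV';

/* Abort the batch with a non-zero return code if the update failed,
   so the calling UNIX script or scheduler can react. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The same pattern scales to multi-step loads: each SQL statement is followed by an ERRORCODE check, and the final .QUIT value becomes the shell exit status.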

Environment: Informatica 9.5, IDQ, MDM, Teradata 14.1.0, Teradata Administrator, Teradata SQL Assistant, PL/SQL Developer, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, UNIX, Unix Shell scripts, Informatica, Cognos, FTP, Windows XP.

Confidential, New Orleans, LA

ETL Developer

Responsibilities:

  • Involved in development of Informatica mappings and tuning for better performance.
  • Created Informatica mappings with stored procedures to build business rules to load data
  • Various transformations (Source Qualifier, Aggregator, connected & unconnected Lookup, Filter & Sequence) were used to handle situations depending upon the requirement.
  • Worked extensively with the Workflow Manager to handle workflows and worklets
  • Used the new tasks provided in PowerCenter to exert greater control over the execution of workflows
  • Called stored procedures to perform database operations in pre-session and post-session commands.
  • Wrote parameter files for batch processing from different repositories.
  • Created partitions to concurrently load the data from sources
  • Loaded bad data using reject loader utility
  • Performed Unit Testing and tuned for better performance.
  • Wrote UNIX shell scripts for getting data from all the source systems to the data warehousing system.
  • Built new dimensions in Universes to support the new reporting requirements of business users.
  • Used SQL tools like TOAD to run SQL queries and validate the data pulled in Cognos reports.
  • Created reports in Cognos Impromptu and exported to the repository.
  • Created reports using Cognos functionality such as Slice and Dice, Drill Down, Functions, Cross Tab, Master Detail and Formulae.
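Pre- and post-session database operations like those above are commonly packaged as stored procedures that the PowerCenter session invokes around the load. An Oracle PL/SQL sketch follows; the procedure, table and job names are hypothetical.

```sql
-- Hypothetical pre-session procedure: empty the staging table before the load.
CREATE OR REPLACE PROCEDURE prep_stage_orders AS
BEGIN
  -- DDL must go through dynamic SQL inside PL/SQL.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';
END prep_stage_orders;
/

-- Hypothetical post-session procedure: record the completed load in an
-- audit table for reconciliation.
CREATE OR REPLACE PROCEDURE log_load_complete(p_rows IN NUMBER) AS
BEGIN
  INSERT INTO etl_audit (job_name, row_cnt, load_ts)
  VALUES ('orders_load', p_rows, SYSDATE);
  COMMIT;
END log_load_complete;
/
```

The session's pre-SQL would then call prep_stage_orders and its post-SQL log_load_complete, keeping the database-side logic out of the mappings themselves.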

Environment: Informatica PowerCenter 6.2/5.1, Cognos 6.0, Oracle 9i, PL/SQL, SQL Server 2000, Windows NT, TOAD, Datastage.

Confidential, Minneapolis, MN

Data Warehouse Developer

Responsibilities:

  • Development of Source-target data mappings, Transformations for the ETL process using Informatica PowerCenter 6.1.
  • Worked extensively on transformations like Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Update Strategy, Sequence generator and Stored procedure transformations.
  • Testing the mappings/programs
  • Extensively used PL/SQL programming in back-end and front-end functions, procedures and packages to implement business rules and security efficiently.
  • Created the Universes using Business Objects Designer.
  • Created various classes, objects, Filter, Conditions in the Universe.
  • Developed Hierarchies to provide the drill down and Slice & Dice functionality
  • Published the reports using Web Intelligence and scheduled them using BroadCast Agent.
  • System testing the mappings/programs before delivery
  • Delivered the components to the onsite team.

Environment: Informatica, Business Objects 5.1, Web Intelligence 2.6, BroadCast Agent, SQL, PL/SQL, Pro*C, Oracle 8i, Maestro, IBM xSeries 230 server and UNIX

Confidential

Informatica Developer

Responsibilities:

  • Worked with Business Analysts to understand the requirement.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations
  • Scheduled the sessions to extract, transform and load data into the warehouse database per business requirements.
  • Developed test cases for unit testing of the mappings, and was also involved in integration testing.
  • Scheduled the tasks using Autosys
  • Wrote stored procedures and triggers and optimized them for maximum performance.
  • Developed pre-post UNIX scripts to drop and build indexes.
  • Modeled the data warehouse ODS environment to identify impact on changes to model.

Environment: Informatica PowerCenter 7.1, Oracle, Erwin, Visio, UNIX, SQL, XML, Shell-Script
