
ETL Lead/Technical Lead Resume


Des Moines, IA

PROFESSIONAL SYNOPSIS:

  • Expertise using different technologies including Informatica 10.1/9.6/9.1/8.x/7.1, SQL, and the Teradata and Greenplum databases
  • Good knowledge of AWS services, including S3 and Redshift.
  • Experience using the Attunity replication tool.
  • Experience in data analysis and data profiling using Informatica Analyst.
  • Experience in leading a team of ETL developers and guiding them throughout the project phase.
  • Experience in performing data analysis for data warehousing projects and writing SQL queries.
  • Understand the data model, prepare source-to-target mappings, and hand them over to the ETL developers for development.
  • Create labels and deployment groups for the Informatica code deployment from DEV to higher environments.
  • Design, development, and coding with Teradata; experienced in writing complex queries, FastLoad, and BTEQ scripts.
  • Experience in Application Design, Data Extraction, Data Acquisition, Data Mining, Development, Implementations and Testing of Data Warehousing and Database Business Systems.
  • Good knowledge of Data Modeling using Star Schema/Snowflake Schema, FACT and Dimension tables, and Physical and Logical data modeling.
  • Experience in Integration of various data sources like Greenplum, Teradata, Oracle, SQL Server, and Delimited Flat Files.
  • Strong experience in Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange, Power Mart, Power Analyzer, and Power Connect.
  • Strong experience on Workflow Manager Tools - Task Developer, Workflow and Worklet Designer.
  • Hands on experience with mappings from varied transformation logics including Unconnected and Connected, Lookups, Router, Aggregator, Joiner, Update Strategy, Java Transformations and re-usable Transformations.
  • Extracted data from multiple operational sources for loading into the staging area, Data Warehouse, and Data Marts using CDC/SCD (Type 2) loads.
  • Excellent analytical and logical programming skills, a good understanding at the conceptual level, and excellent presentation and interpersonal skills with a strong desire to achieve specified goals.
  • Excellent knowledge of system health reviews, capacity planning, disaster recovery planning, etc.
  • Worked on upgrading all Confidential applications from Informatica 7.1 to 8.1 to 8.6.
  • Highly experienced in preparing Project Estimates and Project Plans.
  • Experience in interacting with Business Users in analyzing the Business Process requirements and transforming them into documenting, designing, and rolling out the deliverables.
  • Excellent communication and social skills.
  • Good troubleshooting and problem solving Skills.
  • Good exposure to production environment and operational processes.
  • Experience leading a team of 20 members across different locations, including the US, India, China, and Malaysia.
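The CDC/SCD (Type 2) loads mentioned above follow a standard expire-and-insert pattern. A minimal sketch in Python with SQLite; the table, columns, and sample values (dim_customer, cust_id, city) are illustrative assumptions, not taken from any project here:

```python
import sqlite3

# In-memory stand-in for a dimension table with SCD Type 2 tracking columns.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT, eff_date TEXT, end_date TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Atlanta', '2020-01-01', NULL, 1)")

def scd2_upsert(cust_id, city, load_date):
    # Expire the current version only when the tracked attribute changed.
    cur.execute("""UPDATE dim_customer
                   SET end_date = ?, is_current = 0
                   WHERE cust_id = ? AND is_current = 1 AND city <> ?""",
                (load_date, cust_id, city))
    if cur.rowcount:  # a row was expired, so insert the new current version
        cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                    (cust_id, city, load_date))

scd2_upsert(1, 'Des Moines', '2021-06-01')
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
print(rows)  # [('Atlanta', 0), ('Des Moines', 1)]
```

In a real load the same logic would run as an Update Strategy/Lookup mapping in Informatica or as a set-based MERGE; brand-new keys would also need a separate insert path.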

TECHNICAL SKILLS:

ETL Tools:  Informatica PowerCenter 10/9.x/8.x/7.x, PowerConnect

RDBMS:  Teradata, Greenplum (PostgreSQL), Oracle, SQL DB

Operating System:  HP-UX, IBM AIX 4.3/4.2, Windows NT/2000/XP/7

Job Scheduling Tools:  Informatica Scheduler, Maestro scheduler

Languages:  SQL, PL/SQL, Unix Shell Scripting, Perl Scripting, PostgreSQL

Modeling Tools:  Erwin, PowerDesigner

Database Frontend Tools:  Aqua Data Studio, Oracle SQL Developer, Quest Toad for Oracle, Teradata SQL Assistant, pgAdmin III

PROFESSIONAL EXPERIENCE:

Confidential, DES MOINES, IA

ETL Lead/Technical Lead

Environment:  Windows Professional, AWS

Software: Informatica, Teradata

Database:  AWS Redshift, Teradata, Oracle, SQL Server

Tools:  Informatica 10.1, Aqua Data Studio, Maestro, Attunity

Applications: Jira, TFS, AWS

Responsibilities:
  • Handled several roles in the project including business analyst, data analyst, ETL Lead and developer, and QA tester
  • Led the ETL development team and guided them through the ETL development.
  • Provided technical advice to project managers and assisted them with effective planning and issue resolution.
  • Worked on performance tuning of time-consuming ETL mappings.
  • Worked with the data warehouse manager on ETL development status updates.
  • Participated in daily stand-up meetings to discuss project status.
  • Involved in gathering the requirements and created requirements document.
  • Performed data analysis and wrote the data requirements document.
  • Scheduled meetings with research scientists to understand the business requirements and the scope of the project.
  • Worked with Maestro team to schedule the Informatica workflows for daily and weekly runs.
  • Performed data profiling based on the requirements and the source data.
  • Wrote complex SQL queries to understand the business data.
  • Worked with the data modeler in drafting the target data model based on the requirements and the data analysis performed.
  • Prepared the source to target mapping document based on the target data model.
  • Designed ETL mappings, sessions and workflow based on the source to target mapping.
  • Performed Unit testing and SIT in DEV and QA environments.
  • Worked on bugs identified as part of SIT and UAT.
  • Coordinated with ETL admins and Database admins on the project activities.
  • Performed Informatica deployments to higher environments.
  • Provided production support to the projects deployed to PROD. 

Confidential, Atlanta, GA

Project Lead

Environment: Windows Professional

Software: Informatica, Teradata, SQL Server, Greenplum

Database: Teradata, Greenplum, SQL Server

Tools: Informatica 9.1

Responsibilities:
  • Managing the team on daily support activities and coordinating on different production issues
  • Involved in gathering requirements and created design documents and mapping documents.
  • Developed BTEQ scripts to load data coming from SQL Server into the GR Teradata database
  • Used FastLoad and MultiLoad scripts to load data from GE wind turbines through Informatica into the GR Teradata database
  • Worked with GE customers and application owners to gather requirements and execute the project using Agile methodology
  • Performed error handling and performance tuning in Teradata queries and utilities.
  • Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, FastExport, and Queryman.
  • Worked with ET, UV, and WT Tables.
  • Responsible for database performance and tuning
  • Responsible for resource allocation, managing resources (offshore), prepare ETL Design (source to target mapping documents), helping developers in developing efficient code, ensure coding standards are followed, perform code reviews, delivery of quality code, meeting the timelines for development, perform end-to-end testing and provide weekly status on projects to clients
  • Implemented the common staging area per source system.
  • Designed the Incremental Load strategy for daily loads.
  • Tuned the mappings and reduced the ETL load time window
  • Designed ETL Logic for mappings.
  • Implemented Parallel loading mechanism
  • Implemented single workflow for both daily and weekly loads.
  • Optimized the Dimension structure and created the conformed dimensions.
  • Extensively worked on Power Center Designer (Source Analyzer, Warehouse designer, Transformation Developer, Mapping Designer and Mapplet Designer).
  • Created, built, and scheduled batches and sessions using the Server Manager for better maintenance of mappings.
  • Developed Reusable Transformations, Aggregations and created Target mappings that contain business rules.
  • Optimizing/Tuning mappings for better load performance and efficiency.
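The incremental (daily/weekly) load strategy above rests on a watermark: pull only rows changed since the last successful load. A minimal sketch; the record layout and dates are illustrative assumptions, not project data:

```python
from datetime import date

# Toy source rows standing in for a table with a last-updated timestamp column.
source_rows = [
    {"id": 1, "updated": date(2021, 1, 1)},
    {"id": 2, "updated": date(2021, 3, 5)},
    {"id": 3, "updated": date(2021, 3, 6)},
]

def incremental_extract(rows, last_load):
    # Keep only rows modified after the previous successful load (the watermark).
    return [r for r in rows if r["updated"] > last_load]

delta = incremental_extract(source_rows, date(2021, 3, 1))
print([r["id"] for r in delta])  # [2, 3]
```

In Informatica the same filter is typically a mapping parameter or a `$$LastLoadDate` variable in the Source Qualifier; the watermark is advanced only after the session succeeds, so a failed run re-extracts the same delta.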

Confidential, Atlanta

Project Lead

Environment: Windows Professional

Software: Informatica, Teradata, Greenplum

Database: Teradata, Greenplum

Tools: Informatica 9.1

Responsibilities:
  • Identified the sources connecting to the Teradata database and converted them to Greenplum
  • Converted BTEQ scripts from Teradata to Greenplum UDFs
  • Created new connections for the Greenplum DB
  • Created external tables to replace the Teradata FastLoad scripts
  • Managed the team on daily migration activities
  • Involved in gathering requirements and created design documents and mapping documents.
  • Tuned the mappings and reduced the ETL load time window
  • Designed ETL Logic for mappings.
  • Optimizing/Tuning mappings for better load performance and efficiency.
  • Handled daily and weekly calls with customers to provide updates on the migration
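Part of a Teradata-to-Greenplum conversion is mechanically rewriting Teradata SQL shorthand into PostgreSQL-compatible syntax. A hypothetical helper showing only two simple rules (real migrations also handle QUALIFY, load-utility replacements, and data-type mapping); the function name and rule set are illustrative:

```python
import re

# Teradata accepts SEL/DEL as abbreviations for SELECT/DELETE; Greenplum
# (PostgreSQL) does not, so those tokens must be expanded during conversion.
RULES = [
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    (re.compile(r"\bDEL\b", re.IGNORECASE), "DELETE"),
]

def teradata_to_greenplum(sql):
    # Apply each rewrite rule in order; \b keeps SELECT/DELETE themselves intact.
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

print(teradata_to_greenplum("SEL cust_id FROM gr.orders;"))
# SELECT cust_id FROM gr.orders;
```

A rule table like this keeps each syntax difference reviewable in one place, which matters when converted scripts must be diffed and signed off during migration.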

Confidential

Team Member

Environment: Windows Professional

Software: Informatica, Oracle

Database: Oracle

Tools: Informatica

Responsibilities:
  • Analysis of the BRDs provided by my IM.
  • Creation of Mappings as per the Business Logic.
  • Documentation.
  • Regularly (at 30-minute intervals) monitored the Trigger Table in the Energy GL instance and checked for any new record insertion in the table columns signaling that Hierarchy Maintenance work is completed.
  • If the Trigger table is updated with a new record, then extracted all records from the Energy GL base tables as listed above and detailed in the sections Business Rules and P&L Table Design and Mapping.
  • Validated that the number of records is the same in the Energy GL tables and the Staging Tables.

Confidential

Team Member

Environment: Windows Professional

Software: Informatica, Oracle

Database: Oracle

Tools: Informatica

Responsibilities:
  • Analysis of the specifications provided by the clients.
  • Designing and developing Informatica mappings and Business Objects Reports.
  • Preparation of documentation for Informatica mappings and Business Objects Reports, and execution of the transition to post-production support for all the projects.
