
Sr ETL Developer Resume


SUMMARY:

  • Around 10 years of IT experience in analysis, design, development, implementation, and troubleshooting of Data Mart / Data Warehouse applications using ETL tools such as Informatica PowerCenter 9.1/8.x/7.x/6.x/5.x and business intelligence (BI) tools.
  • Expertise in Dimensional and Relational Physical & logical data modeling using Erwin and ER/Studio.
  • Extensive experience in implementing change data capture (CDC) using Informatica PowerExchange 8.x/7.x.
  • Experience in integrating various data sources with multiple relational databases such as Oracle and SQL Server, and in integrating data from flat files (fixed-width and delimited).
  • Extensively worked with the Informatica tools: Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators.
  • Profound knowledge of Teradata database architecture. Developed Teradata loading and unloading scripts using utilities such as FastExport, FastLoad, and MultiLoad (MLoad).
  • Extensive knowledge of Teradata SQL Assistant. Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse and from the data warehouse to data marts for specific reporting requirements. Tuned existing BTEQ scripts to enhance performance.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Working knowledge of Master Data Management (MDM).
  • Experience in identifying Bottlenecks in ETL Processes, improving the Performance of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, and Normalization / Denormalization strategies.
  • Experience using Informatica Data Quality (IDQ).
  • Extensively worked with Oracle PL/SQL Stored Procedures, Functions and Triggers and involved in Query Optimization.
  • Involved in all phases of data warehouse project life cycle. Designed and developed ETL Architecture to load data from various sources like Sybase, DB2, Oracle, SQL Server, Flat files, and XML files, Teradata into Sybase, Oracle, Teradata, Flat Files, and SQL server targets.
  • Good experience in performing and supporting Unit testing, System Integration testing, UAT and production support for issues raised by application users.
  • Strong in UNIX shell and Perl scripting. Developed UNIX scripts using the pmcmd utility and scheduled ETL loads using utilities such as AutoSys, crontab, and Maestro.
  • Excellent technical and professional client interaction skills. Interacted with technical, functional, and business audiences across different phases of the project life cycle.
  • Highly motivated to take independent responsibility as well as ability to contribute and be a productive team member.
  • Versatile team player with excellent analytical, presentation and interpersonal skills with an aptitude to learn new technologies.

TECHNICAL SKILLS:

Operating Systems: Windows 9x/2000/XP/NT, MS-DOS, HP-UX, UNIX, IBM AIX 4.3/4.2, and Solaris

ETL Tools: Informatica PowerCenter/PowerMart 9.1/8.1/7.1.x/6.2/6.1/5.1, Informatica PowerExchange 7.x/8.x, Informatica PowerConnect, DataStage 7.5.x, IDQ

Tools: UNIX Shell Scripts, Visual Basic, T-SQL, PL/SQL, TOAD, crontab, AutoSys, Informatica Scheduler, MDM, Perl Scripting, Teradata SQL Assistant

Databases: Oracle 11g/10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.0, Teradata 13/12/V2R4/V2R5, MS SQL Server 2012/2008/2005/2000, MS Access

Database utilities: SQL*Plus, stored procedures, functions, exception handling.

Data Modeling Tool/Methodology: MS Visio, Erwin 4.x/3.x, Ralph Kimball methodology, Bill Inmon methodology, star schema, snowflake schema, extended star schema, physical and logical modeling.

Reporting Tools: Business Objects XI R2/R1/6.5/6.1/5.1, Cognos, SSRS (SQL Server Reporting Services)

PROFESSIONAL EXPERIENCE:

Confidential

Sr ETL Developer

Responsibilities:

  • Involved in the full life cycle design and development of the data warehouse.
  • Interacted with business analysts, source data architects, and source application developers to develop the data model.
  • Prepared the required application design documents based on functionality required.
  • Played a key role as a core developer in the ARORA HR data migration project from PeopleSoft to Workday.
  • Worked on Workday-related requirements and provided solutions for related issues, covering requirement analysis, ETL design, workflow scheduling, and ongoing maintenance, enhancement, and support for the client.
  • Created Informatica mappings that connect to web services (WSDL/XSLT/XML) to read data from web servers.
  • Created mappings to load data from Workday (XML source) into the staging database and from the staging database into the warehouse.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Designed and developed the logic for loading slowly changing dimension tables by flagging records with the Update Strategy transformation to populate the desired target rows.
  • Used Workflow Manager to create, validate, test, and run sequential and concurrent sessions, schedule them to run at specified times, and read data from different sources and write it to target databases.
  • Identified and removed bottlenecks to improve the performance of mappings and workflows.
  • Designed and Developed ETL logic for implementing CDC by tracking the changes in critical fields required by the user.
  • Optimized the existing applications at the mapping level, session level and database level for a better performance.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Performed Informatica upgrade from V8.6.1 to 9.0.1
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for selected base tables as a consistency check (a reconciliation query of this kind is sketched after this list).
  • Optimized and tuned SQL queries and PL/SQL blocks to eliminate full table scans, reducing disk I/O and sorts.
  • Involved in migration of mappings and setting up the AutoSys jobs between different environments
  • Worked on setting up and running the batches in different UAT environment for multiple projects.
  • Actively involved in monitoring the daily and weekly batch loads.
  • Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loads into the target according to user requirements.
  • Investigated and fixed problems encountered in the production environment on a day to day basis.
  • Worked as L3 support for resolving production issues.
  • Documented and maintained technical documentation regarding the extract, transformation and load process.
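
The daily consistency check described above was driven by a shell script, but its core is a simple reconciliation query. The sketch below is illustrative only, with hypothetical stg_employee / employee_dim tables and a load_date column; the exact date function varies by database.

```sql
-- Hypothetical reconciliation: compare today's staging extract against the
-- rows stamped with today's load date in the base table.
SELECT s.staging_count,
       t.loaded_today,
       CASE WHEN s.staging_count = t.loaded_today THEN 'MATCH'
            ELSE 'MISMATCH'
       END AS status
  FROM (SELECT COUNT(*) AS staging_count FROM stg_employee) s
 CROSS JOIN
       (SELECT COUNT(*) AS loaded_today
          FROM employee_dim
         WHERE load_date = CURRENT_DATE) t;
```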

Environment: Informatica Power Center 8.6/9.1, Sybase, DBArtisan 8.7.4/9.1.2, Rapid SQL, Business Objects XI 3/XI 4, UNIX, AutoSys, Workday 23/21, Perforce, PuTTY, UNIX shell scripts

Confidential

Sr ETL Developer

Responsibilities:

  • Involved in the full life cycle design and development of the data warehouse.
  • Interacted with business analysts, source data architects, and source application developers to develop the data model.
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
  • Prepared the required application design documents based on functionality required.
  • Worked on PPM-related requirements and provided solutions for related issues, covering requirement analysis, ETL design, workflow scheduling, and ongoing maintenance, enhancement, and support for the client.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Designed and developed the logic for loading slowly changing dimension tables by flagging records with the Update Strategy transformation to populate the desired target rows (an equivalent set-based SQL sketch appears after this list).
  • Used Workflow Manager to create, validate, test, and run sequential and concurrent sessions, schedule them to run at specified times, and read data from different sources and write it to target databases.
  • Designed and Developed ETL logic for implementing CDC by tracking the changes in critical fields required by the user.
  • Optimized the existing applications at the mapping level, session level and database level for a better performance.
  • Created mappings for loading the project, program/portfolio, and product and services attributes into the legacy warehouse system.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Member of On call team for providing support for daily and weekly batch loads.
  • Developed UNIX shell scripts to generate parameter files and execute Oracle procedures as batch jobs.
  • Involved in migration of mappings and setting up the AutoSys jobs between different environments.
  • Involved in conversion of data using ETL mappings during Oracle upgrade from Oracle 10g to 11g
  • Modified shell scripts to rename and back up the extracts.
  • Worked as L3 production support for resolving production issues.
  • Documented and maintained technical documentation regarding the extract, transformation and load process.
  • Investigated and fixed problems encountered in the production environment on a day to day basis.
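
The slowly changing dimension handling referenced above was implemented in Informatica with the Update Strategy transformation; the statements below are only an illustrative set-based equivalent of the Type 2 flag-and-insert pattern, using hypothetical project_dim / stg_project tables and a project_dim_seq sequence.

```sql
-- 1) Expire the current dimension row when a tracked attribute changes.
UPDATE project_dim d
   SET current_flag = 'N',
       end_date     = TRUNC(SYSDATE)
 WHERE current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_project s
                WHERE s.project_id = d.project_id
                  AND (s.project_name <> d.project_name
                       OR s.status <> d.status));

-- 2) Insert a fresh current row for projects with no current version
--    (new projects and the projects just expired above).
INSERT INTO project_dim (project_key, project_id, project_name, status,
                         start_date, end_date, current_flag)
SELECT project_dim_seq.NEXTVAL, s.project_id, s.project_name, s.status,
       TRUNC(SYSDATE), NULL, 'Y'
  FROM stg_project s
 WHERE NOT EXISTS (SELECT 1
                     FROM project_dim d
                    WHERE d.project_id = s.project_id
                      AND d.current_flag = 'Y');
```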

Environment: Informatica Power Center 8.6.1, Business Objects XI 3.1, DBArtisan 8.7.4/9.1.2, Oracle 11g/10g, Windows XP, BI Tools, UNIX, PPM by HP, DB2, and Perforce Visual Client/NTX 86/2010.1/264284

Confidential, Nebraska

Sr ETL Developer

Responsibilities:

  • Involved in the full life cycle design and development of the data warehouse.
  • Interacted with business analysts, source data architects, and source application developers to develop the data model.
  • Prepared the required application design documents based on functionality required.
  • Created logical and physical data models using Erwin and created Entity Relationship (ER) diagrams based on requirements.
  • Extracted source data from legacy systems using PowerExchange.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Designed and developed the logic for loading slowly changing dimension tables by flagging records with the Update Strategy transformation to populate the desired target rows.
  • Used Workflow Manager to create, validate, test, and run sequential and concurrent sessions, schedule them to run at specified times, and read data from different sources and write it to target databases.
  • Identified and removed bottlenecks to improve the performance of mappings and workflows.
  • Designed and developed ETL logic for implementing CDC by tracking changes in the critical fields required by the user (see the Teradata sketch after this list).
  • Optimized the existing applications at the mapping level, session level and database level for a better performance.
  • Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport).
  • Used FastLoad to load data into empty tables.
  • Coded MultiLoad scripts to load data into different tables.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Optimized and tuned SQL queries and PL/SQL blocks to eliminate full table scans, reducing disk I/O and sorts.
  • Designed and developed UNIX shell scripts for creating and dropping the tables used in scheduling the jobs.
  • Involved in migration of mappings and sessions from development repository to production repository.
  • Used Data Warehouse Master Data Management (MDM) in source identification, data collection, data transformation, normalization, rule administration, error detection and correction, data consolidation, data storage, data distribution, and data governance.
  • Created and managed the non-transactional data entities using MDM.
  • Conducted meetings for every deployment to make sure the job schedules and dependencies are developed in such a way that we are not missing the SLA on a day-to-day basis.
  • Involved in production support, working the various mitigation tickets created while users were retrieving data from the database.
  • Prepared a run book for the daily batch loads documenting job dependencies and how to restart a failed job, to ease handling of job failures during loads, and shared it with the other teams.
  • Documented and maintained technical documentation regarding the extract, transformation and load process.
  • Investigated and fixed problems encountered in the production environment on a day to day basis.
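
The CDC loads noted above were built in Informatica mappings; the MERGE below is only a minimal Teradata SQL sketch of the underlying idea, using hypothetical stg.claim_daily / edw.claim_fact tables and assuming claim_id is the target table's primary index.

```sql
-- Apply only new rows and rows whose critical fields changed.
MERGE INTO edw.claim_fact AS tgt
USING (
        SELECT s.claim_id, s.claim_status, s.paid_amount
          FROM stg.claim_daily s
          LEFT JOIN edw.claim_fact f
            ON f.claim_id = s.claim_id
         WHERE f.claim_id IS NULL                -- new claim
            OR f.claim_status <> s.claim_status  -- critical field changed
            OR f.paid_amount  <> s.paid_amount
      ) AS src
   ON tgt.claim_id = src.claim_id
WHEN MATCHED THEN
  UPDATE SET claim_status = src.claim_status,
             paid_amount  = src.paid_amount,
             load_ts      = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN
  INSERT (claim_id, claim_status, paid_amount, load_ts)
  VALUES (src.claim_id, src.claim_status, src.paid_amount, CURRENT_TIMESTAMP);
```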

Environment: Informatica Power Center 8.6, Business Objects 3.1, Teradata 12, Oracle 10g/9i, TOAD, Erwin 4.5, SQL, PL/SQL, XML, Microsoft Visio, Windows XP, BI Tools, HP UNIX, Test Director/Quality Center

Confidential, Tampa, Florida

ETL Developer/Teradata Developer

Responsibilities:

  • Interacted with the Business Personnel to analyze the business requirements and transform the business requirements into the technical requirements.
  • Prepared technical specifications for the development of Informatica (ETL) process to load data into various target tables
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
  • Used Erwin to reverse engineer and refine business data models
  • Administered and worked with various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager, Workflow Manager and Workflow Monitor.
  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in sales and marketing areas.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator etc.
  • Designed and Developed several mapplets and worklets for reusability.
  • Implemented CDC using Informatica Power Exchange.
  • Implemented weekly error tracking and correction process using Informatica.
  • Implemented audit process to ensure Data warehouse is matching with the source systems in all reporting perspectives.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Developed Teradata BTEQ scripts to load data from Teradata staging to the enterprise data warehouse (a simplified sketch of such a load appears after this list).
  • Extensively used Stored Procedures, Functions and Packages using PL/SQL.
  • Worked on Teradata Global temporary and volatile tables.
  • Worked with Teradata DBA team to create secondary indexes required for performance tuning of Data mart loads and reports.
  • Created Maestro schedules/jobs for automation of the ETL load process.
  • Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems was loaded accurately into the target according to user requirements.
  • Conducted meetings for every deployment to make sure the job schedules and dependencies are developed in such a way that we are not missing the SLA on a day-to-day basis
  • Actively involved in production support and transferred knowledge to the other team members. Created Business Objects functionality from the analytical database to the transactional data warehouse, and created reports using Business Objects full client and Web Intelligence.
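
The staging-to-EDW loads above ran as BTEQ scripts; the Teradata SQL below is a simplified, hypothetical sketch of one such step (imaginary stg.policy_stage / edw.policy_fact tables). In the real scripts these statements were wrapped with BTEQ session commands and return-code checks.

```sql
-- Volatile work table holding only the rows to publish in this run.
CREATE VOLATILE TABLE vt_policy_delta AS
(
  SELECT policy_id, customer_id, premium_amt, effective_dt
    FROM stg.policy_stage
   WHERE load_dt = CURRENT_DATE
) WITH DATA
ON COMMIT PRESERVE ROWS;

-- Publish the delta into the enterprise data warehouse table.
INSERT INTO edw.policy_fact
       (policy_id, customer_id, premium_amt, effective_dt, load_dt)
SELECT policy_id, customer_id, premium_amt, effective_dt, CURRENT_DATE
  FROM vt_policy_delta;
```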

Environment: Informatica Power Center 8.5, Power Exchange 8.1, Oracle 10g, Teradata V2R5, ANSI/Teradata, Mainframe, Solaris, Erwin, Business Objects 6.5.

Confidential, Oakland, CA

Data Warehouse Developer

Responsibilities:

  • Developed complex mappings using Informatica Power Center Designer to transform and load the data from various source systems like Oracle, Teradata, and Sybase into the Oracle target database.
  • Analyzed and understood all data in the source databases and designed the overall data architecture and all the individual data marts in the data warehouse for each of the areas Finance, Credit Cards, Brokerage.
  • Involved in the creation of Oracle tables, table partitions, and indexes (an illustrative DDL sketch follows this list).
  • Implemented various integrity constraints for data integrity, such as referential integrity using primary key and foreign key relationships.
  • Handled alerting mechanisms, system utilization issues, performance statistics, capacity planning, integrity, monitoring, population, maintenance, reorganization, security, and recovery of databases.
  • Worked in an offshore-onshore coordination setting, delegating to and managing a group at Accenture in India.
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions.
  • Used Task developer in the Workflow manager to define sessions
  • Assisted retailers in understanding consumer buying patterns and in creating consumer sets for marketing campaigns
  • Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit.
  • Involved in quality assurance of data, automation of processes.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Created repository, users, groups and their privileges using Informatica Repository Manager
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Connection management and scheduling of jobs to be run in the batch process
  • Generated detailed reports from the data marts using Business Objects.
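
The table, partition, and constraint work above followed a standard pattern; the DDL below is only an illustrative sketch with hypothetical account_dim / txn_fact tables, showing a range-partitioned fact table, referential integrity back to a dimension, and a local index on the foreign key.

```sql
CREATE TABLE account_dim (
  account_key   NUMBER        PRIMARY KEY,
  account_no    VARCHAR2(20)  NOT NULL,
  account_type  VARCHAR2(10)
);

CREATE TABLE txn_fact (
  txn_id       NUMBER        NOT NULL,
  account_key  NUMBER        NOT NULL,
  txn_date     DATE          NOT NULL,
  txn_amount   NUMBER(12,2),
  CONSTRAINT pk_txn_fact PRIMARY KEY (txn_id, txn_date),
  CONSTRAINT fk_txn_acct FOREIGN KEY (account_key)
             REFERENCES account_dim (account_key)
)
PARTITION BY RANGE (txn_date) (
  PARTITION p2003 VALUES LESS THAN (TO_DATE('2004-01-01', 'YYYY-MM-DD')),
  PARTITION p2004 VALUES LESS THAN (TO_DATE('2005-01-01', 'YYYY-MM-DD')),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- Local (per-partition) index supporting joins back to the dimension.
CREATE INDEX ix_txn_acct ON txn_fact (account_key) LOCAL;
```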

Environment: Informatica Power Center/Power mart 7.1, Business Objects, Oracle 9i, Teradata, SQL/PLSQL, dynamic SQL, UNIX Shell Programming, Erwin 4.0, UNIX (IBM - AIX), and Windows NT.

Confidential

ETL Developer

Responsibilities:

  • Analysis and Design of the system
  • Created mappings to extract, transform & load data from different sources using various Transformations.
  • Created and scheduled workflows thereby combining sessions for individual units.
  • Created PL/SQL scripts for data transformation in line with program specifications.
  • Created temporary tables and indexes for loading and updating data.
  • Designed and developed stored procedures, packages, and functions using PL/SQL and SQL scripts (a minimal sketch appears after this list).
  • Developed shell scripts for database replication processes.
  • Tested all the modules.
  • Involved in ETL documentation
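
As a minimal, hypothetical sketch of the kind of PL/SQL written for these loads (imaginary stg_customer / customer tables): a procedure that publishes validated staging rows and marks them as processed.

```sql
CREATE OR REPLACE PROCEDURE load_customer AS
BEGIN
  -- Move validated staging rows into the target table.
  INSERT INTO customer (customer_id, customer_name, city)
  SELECT customer_id, UPPER(customer_name), city
    FROM stg_customer
   WHERE status = 'VALID';

  -- Mark the rows just published so the next run skips them.
  UPDATE stg_customer
     SET status = 'PROCESSED'
   WHERE status = 'VALID';

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- re-raise so the calling batch job sees the failure
END load_customer;
/
```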

Environment: Oracle 7.X, SQL, JDBC, PL/SQL, SQL*Loader, Developer 2000, Windows 95, Informatica 4.1

Confidential

PL/SQL Developer

Responsibilities:

  • Created, monitored and maintained Oracle databases.
  • Created tablespaces for data and indexes.
  • Created Tables, Indexes, Sequences, Clusters, Triggers, Procedures, Functions and Packages.
  • Reorganized databases and managed them for optimum performance levels.
  • Wrote database scripts for user management and roles.
  • Configured ODBC connectivity for new instances and clients.
  • Performed SQL query optimization using hints, indexes, and EXPLAIN PLAN (see the sketch after this list).
  • Wrote PL/SQL scripts for DDL operations such as creating, altering, and dropping database objects like tables, views, sequences, procedures, and functions.
  • Wrote stored procedures and functions to retrieve data from the database using PL/SQL.
  • Created database triggers containing PL/SQL that were stored in the database and fired when database contents changed.
  • Created various reports based on the client requirements using Crystal Reports
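
The query tuning mentioned above followed the usual Oracle workflow of the time: inspect the optimizer's plan, then add a hint where the default plan was poor. The statements below are a hypothetical illustration (imaginary orders table and orders_cust_idx index).

```sql
-- Capture the execution plan for the problem query.
EXPLAIN PLAN FOR
SELECT order_id, order_total
  FROM orders
 WHERE customer_id = :cust_id;

-- Review the plan rows (pre-9i style, straight from PLAN_TABLE).
SELECT operation, options, object_name, id, parent_id
  FROM plan_table
 ORDER BY id;

-- Hinted version forcing the customer_id index when the optimizer
-- would otherwise choose a full table scan.
SELECT /*+ INDEX(o orders_cust_idx) */
       order_id, order_total
  FROM orders o
 WHERE customer_id = :cust_id;
```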

Environment: Visual Basic 6.0, Oracle 8i, PL/SQL, Crystal Reports 6, Erwin, Windows NT.
