
Senior ETL Developer Resume


Wayne, MI

SUMMARY:

  • 7+ years of experience in the design, development, testing and maintenance of data warehouse, business intelligence and operational data systems.
  • Strong experience in the analysis, design, development, testing and implementation of business intelligence solutions using data warehouse/data mart design, ETL, OLAP and BI.
  • Expertise in developing ETL mappings and scripts using Informatica Power Center 9.6.1/8.6.1: Designer (Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer), Repository Manager, Workflow Manager (Task Developer, Worklet Designer, Workflow Designer) and Workflow Monitor.
  • Experience in working with relational sources such as Oracle, DB2 and SQL Server and non-relational sources such as flat files and XML, loading them into staging areas.
  • Worked with business analysts during the requirements analysis phase to identify, study and understand requirements and translated them into ETL code.
  • Data modeler with strong conceptual and logical data modeling skills; experienced with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
  • Experience in data warehouse architecture and development, data architecture (conceptual/logical/physical data modeling), and implementation of data warehouses across all phases of the Software Development Life Cycle.
  • Experience in creating reusable transformations (Expression, Joiner, Sorter, Aggregator, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and mappings in Informatica Designer, and in processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Extensively worked on Oracle backend programming using PL/SQL: stored procedures, functions, packages, tables and database triggers (see the hedged sketch after this list).
  • Extensively worked on ETL mappings and on the analysis and documentation of OLAP reporting requirements; solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Architected data warehouses and data marts, from documenting the functional requirements through design and creation, using Inmon and Kimball methodologies.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using Erwin.
  • Strong SQL development skills on Oracle 11g/10g/9i and SQL Server 2014/2012/2008, including triggers, stored procedures and functions, using tools such as SQL Server Management Studio and SQL Developer.
  • Experience in installation and configuration of the core Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server and cleanse adapters on Windows.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
  • Used batch files and shell scripts to run workflows via the pmcmd utility, and scheduled ETL jobs using tools such as Informatica Scheduler, Control-M and CA Workload Automation.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
  • Proficient in the integration of various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2 and Teradata, and with flat files, into the staging area, data warehouse and data marts.
  • Created an enhanced logical model in 3NF using ER/Studio, resolving many-to-many relationships between entities with associative tables.
  • Worked on performance tuning: developing, debugging, and identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
  • Experience in performing unit testing and system testing to ensure that data loaded into the targets is accurate.
  • Developed Informatica mappings and PL/SQL blocks ranging from medium to high complexity.
  • Experienced in preparing technical design documents, mapping documents, data definition documents and detailed design documents for source-to-target mapping.
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills.
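
As a hedged illustration of the PL/SQL backend work noted in the list above, a minimal load procedure; STAGE_ORDERS, DW_ORDERS and their columns are assumed names for the sketch, not from an actual engagement.

    -- Minimal sketch: nightly load procedure; table and column names are
    -- illustrative assumptions, not from a real project.
    CREATE OR REPLACE PROCEDURE load_orders AS
    BEGIN
      INSERT INTO dw_orders (order_id, order_date, amount)
      SELECT order_id, order_date, amount
      FROM stage_orders
      WHERE order_date >= TRUNC(SYSDATE) - 1;  -- pick up yesterday's rows
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the error to the scheduler/session log
    END load_orders;
    /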

TECHNICAL SKILLS:

ETL: Informatica Power Center 9.6.1/9.5/9.1/8.6.1, Informatica MDM, Source Analyzer, Mapping Designer, Workflow Manager, Workflow Monitor, Data Cleansing, Data Quality, Data Mart, Pushdown Optimization, Metadata.

RDBMS: Oracle 11g/10g/9i, Teradata, SQL Server, SQL, PL/SQL.

Tools: SQL*Plus, PL/SQL Developer, SQL Server Management Studio, SQL Forms, Toad.

OS: Windows 10/8.1/7/2003/2000/XP, UNIX.

PROFESSIONAL EXPERIENCE:

Senior ETL Developer

Confidential, Wayne, MI

Responsibilities:

  • Involved in understanding the booking system, which was necessary for developing the various data marts. The booking system can be thought of as a piece of an object model in which an account has a collection of trips.
  • Created Entity relationship diagrams, function relationship diagrams, data flow diagrams and enforced all referential integrity constraints.
  • Analyzed existing logical data model (LDM) & made appropriate changes to make it compatible with business requirements.
  • Expanded Physical Data Model (PDM) for the OLTP application using Erwin.
  • Studied the existing OLTP system(s) and created facts, dimensions and a star schema representation for the data mart.
  • Extensively worked with the online tracking system for work requests, database change request issues, automated UNIX script modification issues and various scheduler job issues.
  • Involved in data model reviews with the internal data architect, business analysts and business users, explaining the data model to ensure it was in line with business requirements.
  • Developed PL/SQL scripts and conducted data masking in the testing environment to protect sensitive data (see the hedged sketch after this list).
  • Designed and developed complex Informatica mappings using expressions, aggregators, filters, lookup and stored procedures to ensure movement of the data between various applications.
  • Created sequential/concurrent sessions and batches for the data loading process and used pre- and post-session SQL scripts to meet business logic.
  • Developed shell scripts to disable or enable constraints in tables and to truncate tables, avoiding table fragmentation caused by frequent deletion of records.
  • Migrated folders, mappings and workflows across repositories and folders and to a higher version of Informatica.
  • Developed Mappings to implement slowly changing dimensions type 1 and type 2. Created shortcuts, reusable transformations and Mapplets to use in multiple mappings.
  • Extensively used Teradata utilities such as FastLoad, MultiLoad, BTEQ and FastExport. Created Teradata external loader connections such as MLoad, Insert and Update, and FastLoad while loading data into target tables in the Teradata database.
  • Accessed data from multiple operational data sources, re-mapped source data into a common format, filtered data, calculated derived values and aggregates, and loaded the cleansed data into the central DW. Extensive use of Informatica 9.6.1 and ETL performance tuning.
  • Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Developed SQL scripts in Toad and created Oracle objects such as tables, views and indexes.
  • Set up batches and sessions to schedule loads at the required frequency using Power Center Workflow Manager, pmcmd and scheduling tools such as Autosys. Generated completion messages and status reports using Workflow Manager.
  • Extensively worked on performance tuning of programs, ETL procedures and processes; performed error checking and testing of ETL procedures and programs using the Informatica session log.
  • Worked extensively with Oracle Database 10g and flat files as data sources.
  • Configured database performance tools such as Oracle Enterprise Manager and SQL Navigator and used them for application tuning.
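
A minimal sketch of the kind of PL/SQL data-masking script referenced in the list above; the CUSTOMER_STG table and its columns are hypothetical placeholders, not from the original project.

    -- Minimal sketch: mask sensitive columns in a test-environment copy.
    -- CUSTOMER_STG, SSN and EMAIL_ADDR are illustrative assumptions.
    BEGIN
      UPDATE customer_stg
         SET ssn        = 'XXX-XX-' || SUBSTR(ssn, -4),  -- keep last 4 digits
             email_addr = 'user' || customer_id || '@masked.example';
      COMMIT;
    END;
    /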

Environment: Informatica 9.6.1, Oracle 10g, DB2, SQL, PL/SQL, Teradata, Windows, Erwin, UNIX, Toad.

Senior Data Warehouse Developer

Confidential, Troy, MI

Responsibilities:

  • Worked with the business analysts on requirements gathering and business analysis and translated the business requirements into technical specifications to build the enterprise data warehouse.
  • Resolved issues related to the enterprise data warehouse and stored procedures in the OLTP system, and analyzed, designed and developed ETL strategies.
  • Involved in the data analysis for source and target systems using data warehousing concepts, staging tables, dimension tables, fact tables, star schema and snowflake schema.
  • Created a relational model and dimensional model for online services such as online banking and automated bill pay.
  • Used Informatica features to implement type 1 and type 2 changes in slowly changing dimension tables, and developed complex mappings to facilitate daily, weekly and monthly loading of data.
  • Analyzed the business data and defined the match and merge rules for the Informatica MDM Hub.
  • Extensively used transformations such as Source Qualifier, Expression, Filter, Aggregator, Joiner, Lookup, Sequence Generator, Router, Sorter and Stored Procedure, and used the Debugger to test the mappings and fix bugs.
  • Worked with the Informatica Data Quality 9.5.1 toolkit for analyzing, standardizing, cleansing, matching, conversion, exception handling, reporting and monitoring of the data.
  • Used Informatica data quality profiler and developer tool to analyze source system data and discover underlying issues.
  • Created workflows and used various tasks such as Email, Event-Wait, Event-Raise, Timer, Scheduler, Control, Decision and Session in the Workflow Manager.
  • Wrote BTEQ and MLoad scripts to load data from Oracle to Teradata and to transfer large volumes of data within Teradata.
  • Worked with Pushdown Optimization Viewer to preview the SQL statements and mapping logic that the integration service can push to the source or target database.
  • Experienced in writing, testing and implementing triggers, cursors, procedures and functions at the database level using PL/SQL (see the hedged trigger sketch after this list).
  • Used Update Strategy expressions DD_INSERT, DD_DELETE, DD_UPDATE and DD_REJECT to insert, delete, update and reject rows based on the requirements.
  • Created a customer onboarding process to load data into the landing tables of the MDM Hub using external batch processing for the initial data load into the Hub Store, and defined an automated process for staging, loading, match and merge.
  • Implemented the Informatica MDM workflow, including data profiling, configuration specification, coding, match rule tuning and migration.
  • Worked with TOAD to increase user productivity and application code quality while providing an interactive environment to support the user experience.
  • Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for cleanup and updates.
  • Responsible for scheduling workflows and sessions and monitoring them to ensure data was properly loaded onto the target tables.
  • Performed tuning of SQL queries and stored procedures for speedy extraction of data to resolve and troubleshoot issues in OLTP environment.
  • Involved in meetings with the production team for issues related to deployment, maintenance, future enhancements, backup and crisis management of data warehouse.
  • Worked with the production team to resolve data issues in production database of OLAP and OLTP systems.
  • Used SQL Navigator to increase user productivity and application code quality, with rich features to manage database objects, develop and debug PL/SQL, and create, execute and optimize SQL queries.
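
A minimal sketch of a database-level trigger of the kind referenced above; ACCOUNT, AUDIT_LOG and their columns are assumed names for the sketch, not from the project.

    -- Minimal sketch: row-level audit trigger; all names are illustrative.
    CREATE OR REPLACE TRIGGER trg_account_audit
    AFTER UPDATE OF balance ON account
    FOR EACH ROW
    BEGIN
      INSERT INTO audit_log (account_id, old_balance, new_balance, changed_on)
      VALUES (:OLD.account_id, :OLD.balance, :NEW.balance, SYSDATE);
    END;
    /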

Environment: Informatica Power Center 9.5.1, IDQ, MDM, Teradata, Toad, Oracle 11g/10g, SQL Server 2012, PL/SQL, SQL*Plus.

ETL Developer

Confidential, Nutley, NJ

Responsibilities:

  • Understood the business requirements based on the functional specification and designed the ETL methodology in the technical specifications.
  • Involved in the development of logical and physical data models that captured the existing state.
  • Used Erwin's Model Mart for effective model management: sharing, dividing and reusing model information and designs to improve productivity.
  • Extracted the data from various heterogeneous sources like DB2, SQL Server and Flat files to the target database.
  • Extensively used various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank and Aggregator.
  • Solid expertise in using both connected and unconnected Lookup transformations; extensively worked with various lookup caches, including static, dynamic and persistent caches.
  • Created workflows and sessions for each mapping, extracting data from the source systems to the staging area and from the staging area to the target.
  • Extensively created re-usable transformations and Mapplets to standardize the application.
  • Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
  • Worked with the Informatica Data Quality 8.6.1 toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Involved in fixing invalid mappings, testing of Informatica sessions, worklets and workflows.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Acted as a technology specialist to propose best practices for enterprise data integration (ETL, data replication, data services) during project execution, software implementation and upgrades; played a crucial role in defining the data replication services for Facets SQL Server tables.
  • Dealt with data issues in the staging flat files; after the data was cleaned up, it was sent to the targets.
  • Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.
  • Extracted data from flat files and an Oracle database and applied business logic to load them into the central Oracle database (see the hedged staging sketch after this list).
  • Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
  • Copied, exported and imported the mappings, sessions, worklets and workflows from the development repository to the test repository and promoted them to production.
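
As a hedged illustration of staging flat-file data into Oracle (see the bullet above), a minimal external-table pattern; the directory, file, table and column names are all assumptions for the sketch.

    -- Minimal sketch: expose a flat file as an external table, then apply
    -- simple business logic while loading the staging table. STG_DIR,
    -- customers.dat, CUSTOMER_EXT and CUSTOMER_STG are assumptions.
    CREATE TABLE customer_ext (
      customer_id   NUMBER,
      full_name     VARCHAR2(100),
      open_date_txt VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY stg_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('customers.dat')
    );

    -- Staging load with simple cleansing applied in SQL.
    INSERT INTO customer_stg (customer_id, full_name, open_date)
    SELECT customer_id, UPPER(full_name), TO_DATE(open_date_txt, 'YYYY-MM-DD')
    FROM customer_ext;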

Environment: Informatica Power Center 9.1, IDQ 8.6.1, Teradata, Oracle 11g, PL/SQL, DB2, XML, SQL Server 2008, SQL*Plus, MS Excel.

Informatica Developer

Confidential, NY

Responsibilities:

  • Worked with the business analyst and used methods such as information packages and joint application development to communicate with employees in Confidential's financial and IT departments and to perform requirements gathering, analysis and documentation.
  • Researched the existing system, documents and industry research materials, and made recommendations to supplement and optimize the requirements document.
  • Created the conceptual model for the data warehouse, with emphasis on insurance (life and health), using the ER/Studio data modeling tool.
  • Worked with the data analyst and DBA to design the data model of the data warehouse and operational data store (star schemas and snowflake schemas) using Erwin and the Target Designer of PowerDesigner.
  • Analyzed the sources, including Oracle and flat files, and used Power Center and Power Exchange to create the source definitions.
  • Migrated master data module from old OLTP systems to new OLTP system from legacy format to the latest XML technology.
  • Extracted data from the source system to a DataMart running on Teradata using utilities like MultiLoad and FastLoad.
  • Created mappings to load the dimension tables, fact tables and aggregated fact tables from the staging area using transformations including Source Qualifier, Lookup, Filter, Aggregator, Stored Procedure and Expression. Used a PL/SQL stored procedure to populate the date dimension (see the hedged sketch after this list).
  • Created and executed workflows and worklets using workflow manager to load the data into the target database in project wide standards by using different tasks including session, command, decision, event wait, email and assignment.
  • Handled workload management using the Priority Scheduler, Teradata Dynamic Query Manager and Teradata Manager.
  • Used tools and methods such as the session log and Debugger to validate the workflows and mappings, identify bottlenecks, and troubleshoot information about data and error conditions.
  • Involved in performance tuning of the ETL process by addressing various performance issues on the source side and Informatica side, such as using sorted data and Pipeline Partitioning.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters and parameterization.
  • Excellent knowledge of ETL tools such as Informatica for loading data to Teradata, making various connections to load and extract data to and from Teradata efficiently.
  • Designed and implemented incremental loads, including SCD types 1, 2 and 3, by creating mappings with Lookup, Expression and Update Strategy transformations.
  • Performed performance tuning for the data warehouse (Teradata) and data warehouse operations. Implemented unit and integration tests, then wrote the test documents.
  • Developed unit/assembly test cases and UNIX shell scripts to run with the daily, weekly and monthly batches to reduce or eliminate manual testing effort.
  • Involved in creating and managing global and local repositories and assigning permissions using Repository Manager. Also migrated repositories between development, testing, QA and production systems.
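
A minimal sketch of the date-dimension population procedure mentioned above; DIM_DATE and its columns are hypothetical names for the sketch.

    -- Minimal sketch: populate a date dimension over a range; DIM_DATE and
    -- its columns are illustrative assumptions.
    CREATE OR REPLACE PROCEDURE load_dim_date (p_start IN DATE, p_end IN DATE) AS
    BEGIN
      FOR i IN 0 .. (p_end - p_start) LOOP
        INSERT INTO dim_date (date_key, cal_date, cal_year, cal_month, day_name)
        VALUES (TO_NUMBER(TO_CHAR(p_start + i, 'YYYYMMDD')),  -- surrogate key
                p_start + i,
                EXTRACT(YEAR FROM (p_start + i)),
                EXTRACT(MONTH FROM (p_start + i)),
                TRIM(TO_CHAR(p_start + i, 'Day')));
      END LOOP;
      COMMIT;
    END load_dim_date;
    /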

Environment: Informatica Power Center 9.1, Power Exchange, Oracle 11g/10g, Teradata, Erwin, DataMart, SQL Developer, PL/SQL, Windows 7 Professional.

Oracle/ ETL Developer

Confidential

Responsibilities:

  • Involved in gathering business requirements and created the Technical Specification Document (TSD) for the project.
  • Performed logical modeling, physical database design, data sourcing, data transformation, data loading, SQL development and performance tuning.
  • Customized and developed database objects such as PL/SQL procedures, functions, packages and views; involved in database design and prepared the SQL scripts to support the larger database.
  • Used Informatica for loading historical data from various tables for different departments, and used the Mapping Designer to develop mappings with transformations including aggregation, update and lookup.
  • Worked on loading data from several flat-file sources using Teradata MLoad and FastLoad.
  • Involved in the development of data mart and populating the data marts using Informatica.
  • Used transformations like aggregate, expression, filter, sequence generator, joiner, and stored procedure transformations.
  • Created reusable transformations (Mapplets) and used them in mappings where transformations needed to be reused across different mappings.
  • Developed and scheduled workflows using task developer, Worklet designer and workflow designer in workflow manager and monitored the results in workflow monitor.
  • Used workflow manager for creating, validating, testing and running the sequential and concurrent batches and sessions, and scheduled them to run at a specified time.
  • Used stored procedures to transform the data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
  • Created SQL loader scripts to load legacy data into oracle staging tables and wrote SQL queries to perform data validation and data integrity testing.
  • Wrote complex stored procedures using dynamic SQL to populate data into temp tables from fact and dimension tables for reporting purposes (see the hedged sketch after this list).
  • Guided the team on development; created custom objects such as shared containers in DataStage, Mapplets in Informatica and reusable SQL scripts to support project needs.
  • Fine-tuned the performance of queries and reports using PL/SQL and SQL*Plus.
  • Performed ETL performance tuning by utilizing dynamic cache for lookups and partitioning the sessions.
  • Created unit test case documents and performed various kinds of testing, such as unit, regression and system testing, in the Dev and QA environments before deployment; involved in production support.
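
A minimal sketch of the dynamic-SQL pattern referenced above; REPORT_TMP, the column names and the aggregation are illustrative assumptions for the sketch.

    -- Minimal sketch: populate a temp reporting table from a fact table whose
    -- name is supplied at runtime. All names here are assumptions.
    CREATE OR REPLACE PROCEDURE load_report_tmp (p_fact_table IN VARCHAR2) AS
      v_sql VARCHAR2(4000);
    BEGIN
      -- DBMS_ASSERT guards the identifier against SQL injection.
      v_sql := 'INSERT INTO report_tmp (prod_key, total_amt) '
            || 'SELECT prod_key, SUM(sale_amt) FROM '
            || DBMS_ASSERT.sql_object_name(p_fact_table)
            || ' GROUP BY prod_key';
      EXECUTE IMMEDIATE v_sql;
      COMMIT;
    END load_report_tmp;
    /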

Environment: Informatica 8.6, Oracle RDBMS 9i, SQL*Plus, SQL*Loader, Teradata, XML, Toad.

Database Developer

Confidential

Responsibilities:

  • Created database objects such as Tables, Views, Synonyms, Indexes, Sequences and Database Links as well as custom packages tailored to business requirements.
  • Worked closely with architects, designers and developers to translate data requirements into the physical schema definitions for Oracle.
  • Created SQL and PL/SQL scripts, calling C external routines from within PL/SQL to transfer data between the databases.
  • Created primary database storage structures with appropriate placement of data files for maximum efficiency and performance. Re-worked bad files generated by SQL*Loader.
  • Extensively worked on fixing poorly designed mappings and developed schedules to automate the Informatica workflows.
  • Tuned the database and SQL scripts for optimal performance; redesigned and rebuilt schemas to meet performance targets.
  • Created Stored Procedures, Functions, Packages and Triggers using SQL, PL/ SQL and maintained the scripts for various data feeds.
  • Converted very slow and complicated views to tables and implemented database triggers to update them automatically.
  • Strong knowledge of the PL/SQL wrapper, used to protect PL/SQL procedures and packages.
  • Created shared folders, local and global shortcuts for the reuse of metadata. Used joins, indexes effectively in where clauses for query optimization.
  • Managed data model changes in all enterprise applications and performed impact analysis to ensure all system leads were informed, coordinating efforts to implement the changes.
  • Used bulk collections for better performance and easy retrieval of data, reducing context switching between the SQL and PL/SQL engines (see the hedged sketch after this list).
  • Worked closely with data architects and the DBA team to implement data model changes in the database in all environments. Generated DDL scripts for database modifications, macros, views and SET tables.
  • Involved in Performance testing, User Acceptance testing of the application and participated in Peer Code and Design Reviews, Status meetings and Walkthroughs.
  • Involved in capturing data lineage, table and column data definitions, valid values and other necessary information in the data models.
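
A minimal sketch of the BULK COLLECT pattern referenced above, which batches fetches and inserts to cut SQL/PL-SQL context switches; EMP_SRC and EMP_TGT are assumed names for the sketch.

    -- Minimal sketch: BULK COLLECT + FORALL; the source and target tables
    -- are illustrative assumptions, not from a real project.
    DECLARE
      TYPE emp_tab IS TABLE OF emp_src%ROWTYPE;
      v_rows emp_tab;
      CURSOR c_src IS SELECT * FROM emp_src;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO v_rows LIMIT 1000;  -- 1000-row batches
        EXIT WHEN v_rows.COUNT = 0;
        FORALL i IN 1 .. v_rows.COUNT  -- single context switch per batch
          INSERT INTO emp_tgt VALUES v_rows(i);
      END LOOP;
      CLOSE c_src;
      COMMIT;
    END;
    /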

Environment: Oracle 10g/9i, SQL*Plus, SQL, PL/SQL, SQL*Loader, Windows 8.1.
