Sr. ETL Developer/Consultant Resume
Livonia, MI
SUMMARY
- Over 8 years of experience in the Healthcare, Pharmaceutical, and Consulting industries.
- Extensive ETL experience using Informatica Power Center 10.1/9.x/8.x, Informatica Developer 9.x, and Informatica Big Data Edition, with expertise in upgrades. Experienced in creating repositories and target databases and in developing strategies for Extraction, Transformation, and Loading (ETL) using Informatica.
- Experience working with Informatica Data Quality (IDQ) and data profiling.
- Expertise in implementing business rules by creating complex Informatica mappings/mapplets, shortcuts, reusable transformations, tasks, sessions, and workflows/worklets using the Designer and Workflow Manager.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator, Union and Stored Procedure transformations.
- Extensively used BMC Control-M to schedule jobs. Wrote UNIX shell scripts to process and cleanse incoming text files and to enforce data-flow and control-flow ordering.
- Exposure to the entire process of Software Development Life Cycle (SDLC) including Requirement Analysis, Requirement Gathering, Cost Estimation, Project Management, Design, Development, Implementation, Testing and Maintenance.
- Expertise in relational database and data warehousing systems such as Oracle, DB2, Teradata, MS Access, SQL Server, Hive, and Impala.
- Experienced in data modeling with Erwin: Enterprise Data Model (EDM), Star/Snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
- Comprehensive experience in UNIX scripting and basic Perl scripting for optimal process flow.
- Created notification UNIX shell scripts for maintaining the Informatica server, including scripts that remove logs on a regular schedule, check disk usage, and send mail when usage exceeds a threshold.
- Excellent communication skills.
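The notification scripts described above can be sketched along these lines; the paths, retention window, threshold, and mail recipient below are illustrative assumptions, not actual production values:

```shell
#!/bin/sh
# Housekeeping sketch for an Informatica host. The path, retention window,
# threshold, and recipient are illustrative assumptions only.

LOG_DIR=/opt/informatica/infa_shared/SessLogs   # assumed session-log location
RETENTION_DAYS=7                                # assumed retention window
THRESHOLD=90                                    # alert above 90% disk usage
MAILTO=etl-support@example.com                  # hypothetical recipient

if [ -d "$LOG_DIR" ]; then
    # Remove session/workflow logs older than the retention window.
    find "$LOG_DIR" -type f -name '*.log' -mtime +"$RETENTION_DAYS" -exec rm -f {} \;

    # Parse the percent-used column for the log filesystem.
    USED=$(df -P "$LOG_DIR" | awk 'NR==2 {gsub(/%/, "", $5); print $5}')

    # Mail an alert when usage crosses the threshold (a mail client is assumed).
    if [ "$USED" -gt "$THRESHOLD" ]; then
        echo "Disk usage on $LOG_DIR is ${USED}% (threshold ${THRESHOLD}%)" |
            mail -s "Informatica disk usage alert" "$MAILTO"
    fi
fi
```

In practice such a script would be run on a schedule (e.g. from cron); the scheduling details are omitted here.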
TECHNICAL SKILLS
ETL: Informatica Power Center 10.1/9.6.1/9.5/8.x/7.x/6.x, Informatica Big Data Edition, Informatica Data Quality (Developer tool) 9.x, Pentaho Kettle (PDI).
DATABASES: Teradata v14, SQL Server 2012/2008, Oracle 10g/9i, DB2
REPORTING TOOLS: Business Objects XI, CSV reports to read in MS Excel
DATABASE TOOLS: Teradata SQL Assistant, Oracle tools (Toad, SQL Analyzer), SQuirreL SQL Client 3.2.1, Hive, Impala
SCHEDULING TOOLS: Tivoli Workload Scheduler (TWS), BMC Control-M, Informatica Scheduler
SCRIPTING: UNIX, PERL
ENVIRONMENT: UNIX, AIX, Linux, Windows XP/2000/NT, Windows Vista, and Windows 7
OTHERS: Microsoft Word, Microsoft Excel, Microsoft Access, EDM, Erwin.
PROFESSIONAL EXPERIENCE
Confidential, Livonia, MI
Sr. ETL Developer/Consultant
Responsibilities:
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
- Created complex mappings to load Donor data from Salesforce into MDM via API calls.
- Worked extensively on CDC Type 1 and Type 2 mappings, using MD5 hash functions to load data into relational tables in a SQL Server database.
- Tuned Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Created mappings to load data files received from external vendors, working with input formats such as CSV, XML, and tab-delimited files.
- Implemented Address Doctor in IDQ to validate and standardize addresses.
- Experienced in pushing data to Hive and Impala using Informatica Big Data Edition.
- Created data file caches to improve lookup performance.
- Used hints to suppress the ORDER BY clause in lookup queries to improve performance.
- Used PL/SQL stored procedures within mappings to improve performance.
- Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
- Worked with the Teradata database, writing BTEQ queries and using the TPump, MultiLoad, FastLoad, and FastExport load utilities.
- Experience setting up TPT connections for data loads to and from Teradata.
- Experience handling the log and error tables involved in Teradata loads.
- Worked with the Unstructured Data transformation to handle the HL7 unstructured file format.
- Worked with standard EDI files such as 835 and 837, and with NCPDP files.
- Knowledge of B2B Data Transformation.
- Created mapplets and reused them across mappings.
- Used the Sorter transformation and the dynamic lookup cache.
- Created events and tasks in workflows using Workflow Manager.
- Developed Informatica mappings and tuned them for better performance, using PL/SQL procedures/functions to implement business rules for loading data.
- Created Schema objects like Indexes, Views, and Sequences.
- Created Teradata triggers and stored procedures.
- Developed UNIX shell scripts and Windows batch files.
- Developed mappings for various inbound sources.
- Created extracts for different clients in XML, .txt, and .csv formats.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
- Developed shell scripts to run batch jobs and schedule them.
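The MD5-based change detection behind the CDC mappings above can be illustrated outside Informatica with a small shell sketch; the pipe-delimited layout (key|col1|col2) and file names are assumptions for illustration. Each incoming row's non-key columns are hashed and compared against the hash captured on the previous load: a missing key is an insert, a differing hash is an update, and a matching hash means no change.

```shell
#!/bin/sh
# Sketch of MD5-based change detection (CDC Type 1/2 style). Assumes
# pipe-delimited input rows of the form key|col1|col2 and a prior snapshot
# file of "key<TAB>md5(non-key columns)" lines. Illustrative only.

classify() {
    data_file=$1    # incoming delimited data
    hash_file=$2    # hashes captured on the previous load
    while IFS='|' read -r key col1 col2; do
        # Hash the non-key columns, as the MD5 expression in a mapping would.
        new_hash=$(printf '%s|%s' "$col1" "$col2" | md5sum | cut -d' ' -f1)
        old_hash=$(awk -F'\t' -v k="$key" '$1 == k {print $2}' "$hash_file")
        if [ -z "$old_hash" ]; then
            echo "INSERT $key"      # key never seen before: new row
        elif [ "$old_hash" != "$new_hash" ]; then
            echo "UPDATE $key"      # hash changed: Type 1 overwrite or Type 2 new version
        else
            echo "NOCHANGE $key"    # identical hash: skip the row
        fi
    done < "$data_file"
}
```

In the actual mappings this comparison was done with an MD5 expression and a lookup against the target table; the sketch only mirrors that decision logic.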
Environment: Informatica Power Center 10.1/9.5.1/9.6 (Workflow Manager/Monitor), Power Exchange 9.5, Erwin, Teradata v14, Oracle 11g/10g, PL/SQL, SQL*Loader, UNIX shell scripting, Control-M, SQL Server 2012.
Confidential, Detroit MI
ETL Developer/Designer
Responsibilities:
- Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (Star Schema).
- Designed Informatica mappings by translating the business requirements.
- Developed reusable Transformations.
- Widely used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, and Workflow Manager.
- Extensively used Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator, and Update Strategy transformations.
- Assisted in building physical and conceptual data models using Erwin 4.0.
- Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
- Performed extensive bulk loading into targets using Oracle SQL*Loader and OWB.
- Used workflow manager for session management, database connection management and scheduling of jobs.
- Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
- Performed extensive performance tuning by identifying bottlenecks at various points, including targets, sources, mappings, and sessions.
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Involved in the process design documentation of the DW Dimensional Upgrades.
Environment: Informatica Power Center 9.6.1/7.x (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), OWB 10g, Oracle 10g, TOAD, Erwin 4.0, PL/SQL, UNIX (Sun Solaris)
Confidential
ETL / DW Developer - Internship
Responsibilities:
- Involved in requirements gathering, functional/technical specification, and the design and development of the end-to-end ETL process for a Sales Data Warehouse.
- Involved in designing, implementing and maintaining the database system.
- Created database objects such as tables, views and database links as well as custom packages tailored to business requirements.
- Used Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.
- Used transformations such as Expression, Aggregator, Source Qualifier, Sequence Generator, Filter, Router, Update Strategy, and Lookup.
- Developed SQL queries performing DDL, DML, and DCL operations.
Environment: Informatica Power Center 8.6.1, Oracle 9i, TOAD, SQL Server 2008.