
Senior ETL Developer / Denodo Developer Resume


Newark, CA

SUMMARY

  • Seven years of IT experience with expertise in Data Warehousing, Data Migration/Data Conversion (ETL), Business Intelligence and OLAP/ROLAP reporting
  • Experience in all phases of the project lifecycle: Design, Development, Review, Testing, Implementation and Post-Production support
  • Experience in gathering Business Requirements, Data Analysis, Data Modeling (both Logical and Physical) and Design of Data Warehouses using both Star and Snowflake schemas.
  • Experience in documenting and designing Data Warehouse and ETL data mapping specifications and Reporting specifications.
  • Strong background in Oracle, Redshift, SQL Server, MySQL, Greenplum, Informatica, Pentaho PDI, Denodo, PL/SQL and T-SQL.
  • Strong technical expertise in the design and development of the Denodo data virtualization platform, including the Scheduler and VDP components and caching on different backing databases.
  • Strong technical experience in ETL tools: Informatica, Pentaho, PL/SQL and T-SQL.
  • Experience with Informatica IDQ tool and Informatica Cloud
  • Good knowledge of the SFDC (Salesforce.com) application
  • Experience with HTML, XML and JSON data.
  • Strong knowledge of Source Data Analysis, Data Profiling and Data Cleansing.
  • Expertise in UNIX Shell Scripting.
  • Expertise in all phases of Testing like Unit, System, Integration and User Acceptance and creating Test Plans and Test Scripts for testing.
  • Experience with both Waterfall and Agile (Scrum) methodologies
  • Possesses very good interpersonal and communication skills; has individually managed various clients and is a strong individual contributor and team player.

TECHNICAL SKILLS

Operating Systems: Windows XP/2000/7, Unix

Languages: UNIX Shell Scripting, PL/SQL

Databases: Redshift, Oracle 12c/11g/10g/9i, Greenplum, SQL Server 2000/2005, MySQL, MS Access, PostgreSQL, Amazon Aurora (MySQL-compatible engine for Amazon RDS), Derby

ETL Tools: Informatica 9.1/8.6, Denodo Platform 5.5/6.0, Pentaho 5.4/6.0 and PL/SQL

Version Control Tools: SVN, Git

Reporting Tools: Business Objects, OBIEE and Tableau

Database Tools: SQL*Plus, SQL*Loader, TOAD and PL/SQL Developer

PROFESSIONAL EXPERIENCE

Confidential, Newark, CA

Senior ETL developer /Denodo Developer

Responsibilities:

  • Working as a Senior ETL/Denodo/Pentaho developer, responsible for the design and development of ETL processes
  • Working extensively on gathering Business Requirements, Data Profiling, Data Cleansing, Migrations and Reporting
  • Responsible for Creating and Monitoring all components of Denodo tool (Administration, VDP, Scheduler, Custom Views and Caching).
  • Designed and developed high-quality integration solutions using the Denodo virtualization tool, reading data from multiple sources including Oracle, SFDC and Redshift
  • Read tables from Amazon Redshift into Denodo, implemented business logic in Denodo and exposed the final business views in Tableau.
  • Created caching jobs in different databases like MySQL, Amazon Aurora (MySQL Compatible engine for Amazon RDS) and Derby.
  • Created Informatica Mappings to load data from the sources to Staging, from Staging to the Target database and from the Target database to the Data Warehouse.
  • Worked on Informatica IDQ tool to provide web services to users.
  • Created Salesforce-to-EDW mappings using the Informatica Cloud platform.
  • Created various Sources, Targets, Mappings, Workflows using Informatica Power Center
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance
  • Extensively used Informatica Client tools: Designer, Workflow Manager and Workflow Monitor.
  • Parameterized the mappings to increase reusability
  • Created audit tables reconciling the EDW database against the Redshift database using the Denodo tool.
  • Collaborating with other application development teams to design, develop and deploy the best solutions, ensuring a high level of customer service
  • Worked with SFDC/Legacy teams to gather requirements and load legacy data into SFDC.
  • Involved in the Oracle R12 migration.
  • Created the Test Plans and Test Cases for ETL Process.
  • Supported the Client in User Acceptance Testing
  • Involved in different POC projects like Pentaho data integration tool, Redshift database and Denodo data virtualization tool.
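The Denodo work described above — building derived views over multiple sources and enabling caching — can be sketched in VQL roughly as follows. All view, table and column names here are hypothetical, and the cache syntax should be verified against your Denodo Platform version:

```sql
-- Hypothetical VQL sketch: a derived view joining a base view imported
-- from Oracle with a base view imported from Redshift.
CREATE OR REPLACE VIEW dv_customer_orders AS
SELECT c.customer_id,
       c.customer_name,
       o.order_total
FROM   bv_oracle_customers c
JOIN   bv_redshift_orders o
  ON   c.customer_id = o.customer_id;

-- Enable full caching on the derived view so Scheduler jobs can
-- pre-load it into the configured cache database (e.g. MySQL/Aurora).
ALTER VIEW dv_customer_orders CACHE FULL;
```

In practice the cached view would then be refreshed by a Denodo Scheduler VDPCache job and exposed to Tableau through the VDP server.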

Environment: Oracle 11g/12c, SQL Server 2005, MySQL, Informatica 9.1/8.6, IDQ, PL/SQL, Denodo Platform 5.5/6.0, OBIEE, Salesforce.com, UNIX shell scripting, Pentaho PDI 5.4, Redshift (AWS) and Tableau

Confidential, Palo Alto, CA

Data Warehouse developer

Responsibilities:

  • Worked as a Data Warehouse developer
  • Responsible for Designing and Development of Data Warehouse and ETL process
  • Gathered the Business Requirements for Data Profiling, Cleansing, Migration and Reporting from Data Warehouse
  • Created Detail Design Specification for Data Warehouse, ETL (including Data Profiling and Cleansing) and Reporting
  • Created the Logical and Physical Data Model for the Data Warehouse.
  • Created Informatica Mappings to load data from the sources to Staging, from Staging to the Target database and from the Target database to the Data Warehouse.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner and Normalizer.
  • Parameterized the mappings to increase reusability
  • Used PL/SQL procedures within Informatica mappings to truncate target data at run time.
  • Created and Modified PL/SQL (Functions, Procedures, Packages and Triggers) and Shell Scripts for improving the performance of the System
  • Some of the performance techniques used: analyzing explain plans, TKPROF, DBMS Scheduler, indexes, partitioning, materialized views, etc.
  • Developed PL/SQL stored procedures and user-defined functions for complex calculations and bundled them into a stored package that could be invoked from the Forms triggers
  • Created Custom Triggers, Stored Procedures, Packages and SQL Scripts
  • Carried out Data Profiling and Source Data Analysis and made recommendations for errors/defects in the data, Data Cleansing and modifications to the ETL logic.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database
  • Drove the performance tuning efforts for the complex transformation logic and queries used in both ETL and Reporting by creating indexes such as bitmap, B-tree and function-based indexes.
  • Created the Test Plans and Test Cases for the Data Warehouse and ETL Process.
  • Supported the Client in User Acceptance Testing
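The run-time truncation of target tables via PL/SQL mentioned above is commonly implemented with a small procedure like the sketch below, invoked from an Informatica pre-session task or Stored Procedure transformation. The procedure and table names are illustrative:

```sql
-- Hypothetical sketch: TRUNCATE is DDL, so it must go through
-- EXECUTE IMMEDIATE inside PL/SQL. DBMS_ASSERT guards against
-- injection through the table-name parameter.
CREATE OR REPLACE PROCEDURE trunc_stage_table (p_table_name IN VARCHAR2) IS
BEGIN
  EXECUTE IMMEDIATE
    'TRUNCATE TABLE ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
END trunc_stage_table;
/
```

A mapping would then call, for example, `trunc_stage_table('STG_CUSTOMERS')` before the load, keeping the truncate logic in the database rather than in session properties.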

Environment: Oracle 11g, Greenplum, Informatica 9.1/8.6, PL/SQL and UNIX shell scripting

Confidential, Denver, CO

ETL developer

Responsibilities:

  • Gathered the Business Requirements and responsible for Data Migration and Data Cleansing.
  • Created the Detail Design Specification and Technical Specifications for ETL Process including Data Cleansing rules.
  • Designed and Developed the Staging Area for historical data migration.
  • Created the logical flow of Data Conversion, including source data analysis, data profiling and data cleansing for the historical data; created Test Plans and Test Scripts and performed testing of the data conversion process across all phases, from Unit, System and Integration through UAT
  • Analyzed the Source Data and made recommendations/modifications to the ETL logic
  • Developed complex mappings in Informatica to load the data from various sources into Gentax
  • Parameterized the mappings to increase reusability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Applied slowly changing dimensions like Type 1 and 2 effectively to handle the delta Loads.
  • Created Materialized Views on Summary Tables for Data Summarization for Reporting.
  • Performed Analysis, Performance Tuning of the Informatica Mappings and Queries for Reporting by creating Indexes and Materialized Views.
  • Extensively worked on Oracle Packages, Triggers, procedures, Functions, Database links, Synonyms, Indexes, Sequences, Views, Materialized views and Cursors.
  • Drove the Performance Tuning efforts of the complex transformation logic and queries
  • Assisted in Integration and User Acceptance Testing.
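The Type 1 and Type 2 slowly changing dimension handling noted above was implemented in Informatica mappings; in plain SQL the Type 2 pattern reduces to an expire-then-insert pair like the simplified sketch below (table and column names are hypothetical, and only one tracked attribute is shown):

```sql
-- Step 1 (hypothetical sketch): expire the current dimension row
-- when a tracked attribute has changed in the staging data.
UPDATE dim_taxpayer d
   SET d.effective_end_date = SYSDATE,
       d.current_flag       = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_taxpayer s
                WHERE s.taxpayer_id = d.taxpayer_id
                  AND s.address    <> d.address);

-- Step 2: insert a new current version for changed or brand-new keys.
INSERT INTO dim_taxpayer
       (taxpayer_key, taxpayer_id, address,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_taxpayer_seq.NEXTVAL, s.taxpayer_id, s.address,
       SYSDATE, NULL, 'Y'
  FROM stg_taxpayer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_taxpayer d
                    WHERE d.taxpayer_id  = s.taxpayer_id
                      AND d.current_flag = 'Y');
```

A Type 1 attribute, by contrast, is simply overwritten in place with an UPDATE, keeping no history.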

Environment: Oracle 10g, Informatica 8.6, PL/SQL, ETL, SQL*Loader, TOAD, MS Office, FCR tool and FTP

Confidential, Denver, CO

PL/SQL and ETL developer

Responsibilities:

  • Migrated/converted three clients' legacy data into the PowerSuite Workers' Compensation product
  • Designed the ETL process using PL/SQL and Informatica
  • Designed and created 100+ mappings to move data from source to PowerSuite
  • Involved in gathering Requirements, Systems Analysis, preparing Functional Specifications, Design Reviews, Plan Reviews, Implementation and Post Implementation
  • Participated in project planning sessions and all technical client meetings.
  • Created Informatica Mappings to load data from Source to Staging and from the Staging area to Target, applying business rules.
  • Created various Sources, Targets, Mappings, Workflows using Informatica Power Center
  • Developed T-SQL, PL/SQL (Functions, Procedures, Packages, Views and Triggers).
  • Carried out Data Profiling and Source Data Analysis and made recommendations for errors/defects in the data, Data Cleansing and modifications to the conversion logic wherever necessary.
  • Provided 24/7 application support during the post-production process.
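Source data profiling of the kind described above typically starts from summary queries such as this hedged sketch, checking row counts, null rates and duplicate business keys in a legacy table (table and column names are hypothetical):

```sql
-- Hypothetical profiling query against a legacy claims table:
-- total volume, missing claim numbers, and duplicated claim numbers.
SELECT COUNT(*)                            AS total_rows,
       COUNT(*) - COUNT(claim_no)          AS null_claim_nos,
       COUNT(claim_no)
         - COUNT(DISTINCT claim_no)        AS duplicate_claim_nos
  FROM legacy_claims;
```

Results from queries like this drive the data-cleansing rules and the error/defect recommendations fed back into the conversion logic.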

Environment: Oracle 9i/10g, SQL Server 2000, Informatica 8.6, UNIX shell scripting, PL/SQL, T-SQL, TOAD, PL/SQL Developer and MS Office
