
Lead Developer / Data / ETL Architect Resume


Detroit, MI

CAREER SUMMARY:

  • 14 years of experience in the development and design of ETL methodology supporting data transformations and processing.
  • Exposure to the design and architecture of ETL jobs.
  • Hands on programming and development experience with data integration and transformations.
  • Ability to lead and manage projects and deliver outcomes effectively and on time.
  • Experience in providing technical guidance to others within the team.
  • Ability to perform root cause analysis for complex data mapping issues.
  • Clear understanding of Business Intelligence and Data Warehousing Concepts
  • Recently worked on Siebel to Salesforce Migration.
  • Experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehouse jobs using Extraction, Transformation and Loading (ETL).
  • Experience in mapping techniques for Type 1, Type 2 and Type 3 Slowly Changing Dimensions.
  • Experience with Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Experience with all components of Mapping Designer, such as Source Analyzer, Warehouse Designer, Transformation Developer and Mapplet Designer.
  • Experience in Performance Tuning of Sources, Mappings, Targets and Sessions.
  • Strong skills in development of Mappings, documentation, implementation and enhancement of ETL strategies.
  • Extraction experience with relational sources, mainframe sources such as DB2 and VSAM files, and Salesforce.
  • Experience with PowerCenter in designing and developing complex mappings applying various transformations such as Source Qualifier, Expression, Sorter, Router, Filter, Aggregator, Joiner, Connected and Unconnected Lookup, Sequence Generator, Rank, Update Strategy and Stored Procedure.
  • Experience with PowerExchange, using the Normalizer transformation for VSAM sources to handle OCCURS clauses in COBOL copybooks and to filter header and trailer records from COBOL (VSAM) source files.
  • Teradata experience with tools such as BTEQ, MLOAD and FASTLOAD in large data warehouse environments.
  • Worked with Salesforce application sources and targets: used the upsert operation to insert/update Salesforce targets based on an external ID, used pipeline lookups (lookup on a source column of a Salesforce object) for better performance, loaded Salesforce objects based on parent-child relationships, and used the Data Loader utility to upload CSV files to Salesforce objects.
  • Performed Salesforce data analysis using SOQL, validating records on Salesforce based on the Salesforce account ID and source external ID; created CDC mappings using Informatica templates with PowerExchange.
  • Worked with file sources (CSV, tab-delimited), XML sources and targets.
  • Used pmcmd and mailx commands to execute jobs and send notifications from the UNIX platform (see the sketches after this list).
  • Involved in design of Dimension Models, Star Schemas and Snowflake schemas.
  • Programming skills in PL/SQL and shell scripting; created scripts for collecting statistics, reorganizing tables and indexes, and creating indexes to enhance data access performance (see the sketches after this list).
  • Coordinated with business users and team members to resolve data quality related issues.
  • Provided production support, fixing problems according to their priority.
  • Experience in application study, systems analysis, design, development, testing, implementation, maintenance, and business analysis.
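
The bullets above mention running ETL jobs from UNIX with pmcmd and mailx. A minimal sketch of such a wrapper script follows; the domain, integration service, folder, workflow and mailbox names are placeholders, not values taken from the engagements below.

#!/bin/sh
# Start an Informatica workflow and notify support of the outcome.
# All connection details are illustrative placeholders.
INFA_USER="etl_user"
INFA_PWD="********"
DOMAIN="Domain_ETL"
INT_SVC="IS_ETL"
FOLDER="SALES_DW"
WORKFLOW="wf_load_sales_fact"

pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

# mailx sends a success or failure notification to the support mailbox.
if [ $RC -eq 0 ]; then
    echo "$WORKFLOW completed successfully" | mailx -s "ETL SUCCESS: $WORKFLOW" etl-support@example.com
else
    echo "$WORKFLOW failed with return code $RC" | mailx -s "ETL FAILURE: $WORKFLOW" etl-support@example.com
fi
exit $RC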
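
The statistics-collection and index-maintenance scripting mentioned above can be driven from a shell wrapper around SQL*Plus. A minimal sketch follows; the schema, table and index names are hypothetical examples.

#!/bin/sh
# Gather optimizer statistics and rebuild an index for a staging table.
ORACLE_CONN="etl_user/password@ORCL"   # placeholder credentials
sqlplus -s "$ORACLE_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
-- Refresh statistics so the optimizer sees current row counts and histograms.
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'STG', tabname => 'STG_SALES', cascade => TRUE);
-- Rebuild a fragmented index to improve data access performance.
ALTER INDEX STG.IX_STG_SALES_DT REBUILD;
EXIT
EOF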

AREAS OF EXPERTISE:

  • Informatica
  • Salesforce
  • Trillium
  • Teradata
  • Oracle
  • Business Objects
  • COGNOS
  • MicroStrategy
  • Pro*C
  • C
  • PL/SQL
  • Forms
  • Visual Basic
  • VB Script
  • ASP.Net
  • DB2
  • UDB
  • MS SQL Server
  • MS Access
  • TOAD
  • Unix Shell Scripting
  • Erwin
  • UNIX

PROFESSIONAL EXPERIENCE:

Confidential, Detroit, MI

Lead Developer / Data / ETL Architect

Responsibilities:

  • Enabled data analytical solutions via Spotfire dashboards.
  • Developed predictive solutions for spend analytics.
  • Improved performance through a monthly snapshot mechanism.
  • Improved cycle time 3x by redesigning the data load process.
  • Implemented data audits to improve data quality.
  • Created metadata structures and job control tables to effectively schedule and monitor ETL jobs.
  • Designed wing-to-wing workflows to achieve atomicity and consistency of data loads by checking dependencies.
  • Developed various reusable ETL components.
  • Completed ITIL foundation certificate in IT Service Management.
  • Worked on PowerCenter, PowerExchange and Informatica Developer to handle Salesforce and COBOL copybook sources.
  • Worked with internal and external Hive tables in Hadoop to view data from delimited files.
  • Used Sqoop for data imports/exports between relational sources and Hadoop (see the sketch below).
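
A minimal sketch of the Sqoop import and external Hive table pattern described in the last two bullets; the JDBC connection string, source table, HDFS paths and column layout are hypothetical placeholders.

#!/bin/sh
# Pull a relational table into HDFS as comma-delimited files (placeholder connection details).
sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user -P \
    --table SALES.ORDERS \
    --target-dir /data/raw/orders \
    --fields-terminated-by ',' \
    --num-mappers 4

# Expose the delimited files through an external Hive table for ad hoc queries.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS raw_orders (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_date   STRING,
    order_amount DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/orders';
"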

Environment: Informatica PowerCenter 9.1/9.5/10.1, Informatica PowerExchange, Informatica Developer (IDQ), Salesforce, Oracle 11g, Greenplum, SQL Developer, Talend, HDFS, Hue, Hive, Sqoop, Scala, Spark, Spotfire

Confidential, San Jose, CA

Sr. Data Warehouse Lead Developer / Data Analyst

Responsibilities:

  • Involved in Data Analysis and development.
  • Interacted with the client for the requirements.
  • Worked with the offshore development team.
  • Used EXPLAIN PLAN to understand SQL performance issues (see the sketch after this list).
  • Prepared mapping specification documents and developed Informatica mappings.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 9.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Worked on PowerExchange to load data from COBOL copybook (VSAM) files to relational targets using the Normalizer transformation.
  • Worked on Informatica PowerExchange to load Salesforce objects: Account, Contact, SR Cases and SR Activity Tasks.
  • Used pipeline lookups when working with Salesforce object lookups.
  • Loaded CSV data into Salesforce objects using the Data Loader utility.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Involved in identifying bugs in existing mappings developed by the team, reviewing mappings, evaluating transformations and fixing bugs so that they conform to business needs.
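
A minimal sketch of the EXPLAIN PLAN check mentioned above, run from a shell wrapper around SQL*Plus; the query, table and column names are hypothetical placeholders.

#!/bin/sh
# Generate and display the execution plan for a slow load query (placeholder SQL).
ORACLE_CONN="etl_user/password@ORCL"   # placeholder credentials
sqlplus -s "$ORACLE_CONN" <<'EOF'
SET LINESIZE 200 PAGESIZE 100
EXPLAIN PLAN FOR
SELECT o.order_id, c.customer_name
FROM   stg_orders o
JOIN   dim_customer c ON c.customer_id = o.customer_id
WHERE  o.load_date = TRUNC(SYSDATE);
-- Display the plan just written to PLAN_TABLE.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF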

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange, Spotfire, Salesforce, SOQL, Apex Developer, Data Loader, Toad, Oracle 10g, Teradata

Confidential, Palo Alto, CA

Sr. Data Warehouse Developer / Data Analyst

Responsibilities:

  • Used the Trillium library for address attribute validation.
  • Involved in Data Analysis and development.
  • Interacted with the client for the requirements.
  • Interacted with DBAs and Informatica support on SFDC target connectivity issues.
  • Coordinated code migration across environments.
  • Resolved data source issues such as removing duplicate data and populating appropriate dates on the target from the sources, which involved critical logic.
  • Prepared mapping specification documents and developed Informatica mappings.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 8.6.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Used the Normalizer transformation to work with COBOL copybooks (.cbl).
  • Created connections for relational sources and the Salesforce application.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Implemented incremental loads and historical loads.
  • Involved in the design and development of mappings from the legacy system to the target SFDC environment.
  • Used pipeline lookups for Salesforce objects.
  • Worked on Informatica PowerExchange to load Salesforce objects: Account, Contact, SR Cases and SR Activity Tasks.
  • Loaded CSV data into Salesforce objects using the Data Loader utility and validated the loaded records with SOQL (see the sketch after this list).
  • Worked on PowerExchange to load data from COBOL copybook files to relational targets using the Normalizer transformation.
  • Involved in performance tuning of the mappings, making necessary modifications to reduce load time.
  • Involved in identifying bugs in existing mappings developed by the team, reviewing mappings, evaluating transformations and fixing bugs so that they conform to business needs.
  • Used Informatica debugger for data fixes.
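
A minimal sketch of the kind of SOQL check used to validate loaded Salesforce records, shown here against the Salesforce REST query endpoint; the instance URL, API version, session token and external ID field name are assumptions for illustration only.

#!/bin/sh
# Count contacts loaded today that carry a legacy external ID (all values are placeholders).
SF_INSTANCE="https://na1.salesforce.com"
SF_TOKEN="00D...SESSION_TOKEN"   # session ID obtained separately at login
SOQL="SELECT COUNT(Id) cnt FROM Contact WHERE Legacy_External_Id__c != null AND CreatedDate = TODAY"

# The query endpoint URL-encodes the SOQL and returns JSON with the record count.
curl -s --get "$SF_INSTANCE/services/data/v39.0/query" \
     -H "Authorization: Bearer $SF_TOKEN" \
     --data-urlencode "q=$SOQL"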

Environment: Informatica PowerCenter 8.6.1, Informatica PowerExchange, Salesforce, SOQL, Apex Developer, Data Loader, MySQL, Toad, Oracle 10g

Confidential, Chicago, IL

Sr. Data Warehouse Developer / Data Analyst

Responsibilities:

  • Worked in an Agile methodology; created mappings to load dimensions and facts with onboard airline sales data.
  • Involved in Data Analysis and development.
  • Reviewed existing star schema, suggesting changes needed after source data analysis.
  • Developed Informatica mappings for existing SSIS packages.
  • Created mappings for new development requirements.
  • Interacted with the client for the requirements.
  • Interacted with end users and Informatica support on target connectivity issues (Access and XLS source connectivity issues).
  • Coordinated code migration across environments.
  • Resolved data source issues such as removing duplicate data and populating appropriate dates on the target from the sources, which involved critical logic.
  • Prepared mapping specification documents and developed Informatica mappings.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 9.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Implemented incremental loads and historical loads.
  • Involved in performance tuning of the mappings, making necessary modifications to reduce load time.
  • Involved in identifying bugs in existing mappings developed by the team, reviewing mappings, evaluating transformations and fixing bugs so that they conform to business needs.
  • Used Informatica debugger for data fixes.

Environment: Informatica PowerCenter 9.1/9.5/10.1, SSIS, SQL Server, MS Access, Toad, Oracle 11g

Confidential, Schaumburg, IL

Sr. Data Warehouse Lead Developer / Data Analyst

Responsibilities:

  • Involved in Data Analysis and development.
  • Interacted with the client for the requirements.
  • Coordinated with offshore to understand the functionality.
  • Resolved data source issues such as removing duplicate data and populating appropriate dates on the target from the sources, which involved critical logic.
  • Prepared mapping specification documents and developed Informatica mappings.
  • Prepared mapping documents for loading data from the legacy system and flat files to the staging area.
  • Prepared mapping documents for loading data from the staging area to ZNAW.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 8.6.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Implemented incremental loads and historical loads.
  • Involved in the design and development of mappings from the legacy system to the target database.
  • Worked on PowerExchange to load data from COBOL copybook files to relational targets using the Normalizer transformation.
  • Involved in performance tuning of the mappings, making necessary modifications to reduce load time.
  • Involved in identifying bugs in existing mappings developed by the offshore team, evaluating transformations and fixing bugs so that they conform to business needs.
  • Used the Informatica Debugger for data fixes and coordinated with the offshore team to resolve mapping and functional issues.

Environment: Informatica PowerCenter 8.6.1, Informatica PowerExchange, DB2, Windows Server 2005, AQT, Oracle 10, PEGA

Confidential, Rosemont, IL

Data Warehouse Developer / Data Analyst

Responsibilities:

  • Involved in Data Analysis and development.
  • Prepared mapping specification documents and unit testing documents for developed Informatica mappings.
  • Prepared mapping documents for loading data from the legacy system and flat files to the staging area.
  • Prepared mapping documents for loading data from the staging area to the ODS.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 8.5.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Worked on PowerExchange to load data from COBOL copybook files to relational targets using the Normalizer transformation.
  • Implemented incremental loads and historical loads.
  • Involved in the design and development of mappings from the legacy system to the target database.
  • Involved in performance tuning of the mappings, making necessary modifications to reduce load time.
  • Defined test cases and prepared test plan for testing ODS jobs.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing bugs so that they conform to business needs.
  • Used the Informatica Debugger for data fixes and coordinated with the offshore team to resolve mapping and functional issues.

Environment: Informatica PowerCenter 8.5.1/8.6, Informatica PowerExchange, COGNOS 8.3, SQL Server 2005, Windows Vista, Oracle 10g

Confidential, Indianapolis, IN

Data Warehouse Developer / Onsite Coordinator / Team Lead

Responsibilities:

  • Involved in the design and development of business requirements by coordinating with business users.
  • Prepared functional specifications documents.
  • Prepared mapping documents for loading data from the legacy system and flat files to the staging area.
  • Prepared mapping documents for loading data from the staging area to the ODS.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on Informatica PowerCenter 7.1.4 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets and Reusable Transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Implemented incremental loads and historical loads.
  • Used pmcmd and mailx commands to execute jobs and send notifications from the UNIX platform.
  • Worked on PowerExchange to load data from COBOL copybook files to relational targets using the Normalizer transformation.
  • Involved in the design and development of mappings from the legacy system to staging, from staging to the ODS, and from the ODS to data marts.
  • Involved in performance tuning of the mappings, making necessary modifications to reduce load time.
  • Defined test cases and prepared test plan for testing ODS jobs.
  • Supported UAT and production testing by identifying bugs and helping to resolve various data and load issues.
  • Coordinated with the DBA team to resolve issues pertaining to load timings and other database issues hindering load performance.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing bugs so that they conform to business needs.
  • Used the Informatica Debugger for data fixes and coordinated with the offshore team to resolve mapping and functional issues.

Environment: Informatica PowerCenter 7.1, Informatica PowerExchange, Business Objects, MicroStrategy, Oracle 10g, SQL Server 2005, DB2, Toad, PL/SQL, Windows XP, UNIX shell scripting
