
ETL Architect/Lead/Developer Resume


CO

SUMMARY

  • He is a senior data warehouse consultant providing end-to-end data warehouse solutions for building and maintaining an enterprise-wide data warehouse.
  • He has extensive hands-on experience with all phases of a data warehouse project life cycle to build a sustainable, integrated, enterprise-wide data warehouse.
  • He brings over 19 years of operational and data warehouse experience.
  • He has worked with clients in various industries, including Information Technology, Finance, Government, Manufacturing, and Healthcare.
  • He has played many different roles in the design, development, upgrade, and maintenance of data warehouses.
  • He demonstrated a successful co-shore model in implementing and maintaining data warehouse solutions to maximize productivity at a lower cost.
  • He has strong analytical and problem-solving skills, is result-oriented and hard-working, and works well with a team.

TECHNICAL SKILLS

  • SAP Data Services, Business Objects, Informatica PowerCenter, PowerConnect, Informatica Data Quality, Informatica Test Data Management, Informatica Master Data Management, Informatica Data Validation Option, Informatica PowerExchange, Informatica Metadata Reporter, PowerCenter Mapping Architect for Visio, Microsoft SQL Server 2008, Microsoft SSIS/SSRS 2008/2016, Microsoft Scorecard Manager, Oracle, SQL, PL/SQL, AutoSys, Oracle Designer, Erwin Data Modeler r7.3, ProVision v6.1, Exceed, DataFlux, Toad, ASP, HTML, CVS, Siebel 6/7, SAP, Oracle 10.7/Oracle 11i, XML, UNIX Korn shell scripts using sed, awk, nawk, and other UNIX utilities, JMS queues, Web Services, Salesforce 15+

PROFESSIONAL EXPERIENCE

Confidential, CO

ETL Architect/Lead/Developer

Responsibilities:

  • Setup the Audit, Balance and Control process, ETL standards and ETL process templates.
  • Trained and mentored ETL team members on how to use the templates for development.
  • Created an automated script to download weather files from the Amazon S3 file system to load external weather data.
  • Created an automated script to validate XML files against ETL standards before deploying from development to test and from test to production.
  • Automated Informatica deployments between environments using UNIX scripts.
  • Extracted data from Oracle and loaded it into the Snowflake cloud data warehouse concurrently using Informatica, shell scripts, and SnowSQL.
  • Created table functions in Snowflake and converted Oracle analytic functions to their Snowflake equivalents.
  • Set up RBAC roles for developers, analysts, and SMEs to access Snowflake.
  • Installed and configured the Snowflake Informatica client driver files to connect to Snowflake from Informatica client tools and used the driver to load Snowflake tables.
  • Learned, understood, and applied new Snowflake features during development; mentored and trained fellow team members on the Snowflake cloud data warehouse.
  • Created type1, type2 and type3 ETL templates to capture the data from source to stage to integration and to presentation layer.
  • Created job dependency to optimize the data processing to avoid any wait times between jobs.
  • Tuned performance for long-running jobs at all levels.
  • Conducted gap analysis between the legacy data warehouse and the current EDW to identify and add missing data elements covering the legacy functionality to the EDW at the appropriate grain.
  • Automated reprocessing of orphaned child records to associate them with their late-arriving parent records caused by replicated source data.
  • Implemented over-processing logic in audit, balance, and control to avoid orphan records.
  • Reloaded fact and dimension tables to add new measures/attributes to existing facts and dimension.
  • Implemented self-correction to ETL job failures to reprocess them.
  • Created a stored procedure to unpivot source columns to rows, preserving history on new columns added in the source.
  • Identified the data elements used in the legacy warehouse and mapped them into the new EDW; missing elements were supplemented in the new EDW environment so that users could start using the EDW instead of the legacy DW.
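The Type 1/Type 2/Type 3 templates mentioned above follow the standard slowly changing dimension patterns. A minimal Python sketch of the Type 2 logic (column and key names here are illustrative assumptions, not the actual template) might look like:

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, load_date):
    """Type 2 SCD merge sketch: expire the current version of a row and
    insert a new version when any tracked attribute changes."""
    out = list(dim_rows)
    for rec in incoming:
        current = next((r for r in out
                        if r[key] == rec[key] and r["current_flag"]), None)
        if current is None:
            # New member: insert as the current version.
            out.append({**rec, "current_flag": True,
                        "effective_date": load_date, "end_date": None})
        elif any(current[c] != rec[c] for c in tracked):
            # Changed member: expire the old version, insert a new one.
            current["current_flag"] = False
            current["end_date"] = load_date
            out.append({**rec, "current_flag": True,
                        "effective_date": load_date, "end_date": None})
        # Unchanged member: no action; Type 2 keeps history only on change.
    return out

# Hypothetical dimension with one customer; the city changes in the new load.
dim = [{"cust_id": 1, "city": "Denver", "current_flag": True,
        "effective_date": date(2020, 1, 1), "end_date": None}]
new = [{"cust_id": 1, "city": "Boulder"},
       {"cust_id": 2, "city": "Aurora"}]
result = scd2_merge(dim, new, "cust_id", ["city"], date(2021, 6, 1))
```

A Type 1 template would instead overwrite the tracked columns in place, and a Type 3 template would store the prior value in a dedicated "previous" column.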

Confidential, San Diego, CA

ETL Developer

Responsibilities:

  • Created and fixed bugs in the facts and dimensions data loads and optimized loading and querying performance.
  • Created mapplets to calculate the financial measures about their devices.
  • Created a checklist for data testing.
  • Upgraded Informatica from 8.1 to 9.1 and then to 9.5.1.
  • Tested the functionality between PowerCenter 8.1 and 9.5.1.
  • Supported production 24x7 on the existing legacy data warehouse and marts.
  • Created data connectors and virtual layers; defined KPIs, data sets, hierarchies, and formulas in dashboard design.
  • Designed numerous reports/visualizations using Dundas.
  • Set up interactions between visualizations in dashboards.
  • Improved the performance of dashboard data refreshes and tuned queries.
  • Upgraded Dundas Dashboard to version 5.
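A data-testing checklist of the kind described above typically automates a few standard checks between source and target extracts. A hypothetical Python sketch (function and key names are assumptions for illustration) could be:

```python
def run_checks(source_rows, target_rows, key):
    """Run basic data-testing checks: row-count match, no null business
    keys, and no duplicate keys in the target. Returns a list of failures."""
    failures = []
    if len(source_rows) != len(target_rows):
        failures.append("row count mismatch: "
                        f"{len(source_rows)} source vs {len(target_rows)} target")
    keys = [r.get(key) for r in target_rows]
    if any(k is None for k in keys):
        failures.append(f"null {key} in target")
    if len(keys) != len(set(keys)):
        failures.append(f"duplicate {key} in target")
    return failures

# Example: the target accidentally loaded the same order twice.
src = [{"order_id": 1}, {"order_id": 2}]
tgt = [{"order_id": 1}, {"order_id": 1}]
issues = run_checks(src, tgt, "order_id")
```

In practice each checklist item would run as a query against the source and target databases rather than in-memory lists.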

Confidential, CT

Technical ETL Team Lead/Administrator/SME

Responsibilities:

  • Installed and configured PowerCenter, Metadata Manager, Reporting Service, and PowerExchange client and server tools. Configured a gateway node and a worker node on the same domain. Performed other administrative tasks, including purging, backups, automatic service startup, and the deployment process.
  • Performed a proof of concept to write to and read from Web Services and JMS queues in real time.
  • Designed solutions, mentored offshore team to get the solution implemented.
  • Worked on customizing large, complex XML file definitions in Informatica to extract only the required elements.
  • Improved performance of the ODS system cycle, reducing its runtime from 48 hours to under 4 hours.
  • Troubleshooting, bug fixing and supporting Production Support team.
  • Managing offshore development team, mentoring and assisting them on their tasks.
  • Conducted status meetings with the offshore team to assign tasks and assist the team with their development activity.
  • Worked with clients and BAs on gathering requirements; conducted impact analysis for any changes in scope.
  • Conducted code walkthrough and provided feedback.
  • Worked with BAs, the client, and the Oracle partner team on design discussions and issue resolution.
  • Performed ETL design reviews with the client, confirming the functional and technical requirements.
  • Set up and configured the development environment on 32-bit and the deployment on a 64-bit environment.
  • Created ETL standards, developer checklist, test case templates, and dependency diagrams at subject area and table level.
  • Created a table load method document to track ETL development, testing, and issues/clarifications on requirements. It covers the source system, frequency, load type, table type, source-to-target mapping document, SSIS solution name, and the phases to deliver data to the Oracle partner team for data load and certification.
  • Created ETL framework using Microsoft SQL Server Integration Services 2008 for extracting data from various internal sources like SAP, Oracle, flat files and external sources like Advent Geneva, Bloomberg.
  • Set up dynamic database connections using Package Configurations to ease migrations.
  • Enabled logging and checkpoints to make recovery and restarts easier.
  • Created a master package to reuse for any new package development.
  • Automated the error-handling and exception process.
  • Implemented audit, balance, and control as part of the ETL framework; reconciled record counts and the sum of a measure from source to stage and from stage to the Reveleus databases.
  • Created reusable validation using script transformation to verify the data types.
  • Used Synchronous and Asynchronous script object to parse varying length data files from Bloomberg external source system.
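The audit-balance-control reconciliation described above comes down to comparing record counts and a summed measure between two layers. A minimal Python sketch (layer and column names are illustrative assumptions, not the actual framework):

```python
from decimal import Decimal

def reconcile(source_rows, stage_rows, measure):
    """Compare record counts and the sum of a measure between two layers
    (e.g. source vs stage); Decimal avoids float rounding in money sums."""
    src_count, stg_count = len(source_rows), len(stage_rows)
    src_sum = sum(Decimal(str(r[measure])) for r in source_rows)
    stg_sum = sum(Decimal(str(r[measure])) for r in stage_rows)
    return {
        "count_match": src_count == stg_count,
        "sum_match": src_sum == stg_sum,
        "count_delta": stg_count - src_count,
        "sum_delta": stg_sum - src_sum,
    }

# Example: counts balance but one amount was truncated in stage.
source = [{"amount": "10.25"}, {"amount": "5.75"}]
stage = [{"amount": "10.25"}, {"amount": "5.50"}]
result = reconcile(source, stage, "amount")
```

A mismatch in either check would be logged to the control tables and flag the batch for reprocessing rather than letting it certify.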
