
Data/ETL/Big Data Architect/Sr. ETL Developer Resume


SUMMARY:

  • Highly skilled Teradata/Big Data certified professional with around ten years of experience in data warehousing, business intelligence, and Big Data analytics.
  • Implemented large-scale data warehousing and analytics solutions for telecommunication, healthcare, retail, manufacturing, education, and construction companies in the US, Europe, and the Middle East.
  • Implemented the complete BI SDLC, carrying multiple roles, e.g. solution designer, ETL developer, data mapping, source system analysis, senior data analyst, data architect, semantic layer design and development, data warehouse design, data modeling, technical team lead, ad hoc analytical reporting and KPI analysis, dashboard design and development, and data lake design and implementation.
  • Strong ETL and DWH design and development experience using cutting-edge industry tools, e.g. Oracle ODI, Oracle ODIC, Informatica (versions 8, 9.x), Teradata 12/13/14, Teradata utilities (BTEQ, FastLoad, TPT), MS SSIS, MS SSRS, Toad for Oracle, and Oracle SQL Developer (a BTEQ sketch follows this summary).
  • Designed high-level ETL architecture for overall data transfer from OLTP to OLAP using multiple ETL tools; prepared ETL mapping processes and maintained the mapping documents.
  • Strong DWH logical/physical data modeling and solution architecture expertise using industry best practices and tools, e.g. Erwin and ER/Studio.
  • Experience in relational (OLTP) / 3NF (Bill Inmon) and dimensional (OLAP, Ralph Kimball) modeling concepts and constructs.
  • Data mining skills (including data auditing, aggregation, validation, and reconciliation), advanced modeling techniques, testing, and creating and explaining results in clear, concise reports.
  • Proficient in implementing business logic and procedures in the backend using stored procedures and user-defined functions with standard RDBMS tools, e.g. Teradata stored procedures, Teradata macros, SQL, Oracle PL/SQL, Oracle packages, and MS T-SQL.
  • Experienced in gathering requirements and developing standard/custom reports and dashboards, including tabular, matrix, and ad hoc reports, in multiple formats using Tableau Desktop, Tableau Server, and MS Excel.
  • Experienced in release management and deploying ETL and BI solutions to the production environment per industry standards, using tools e.g. SVN and MS Team Foundation Server.
  • Experienced in handling post-go-live project support and change management/incident management implementations using tools e.g. SVN, TFS, and Remedy.
  • Experienced in performance tuning and optimization of database ETL scripts and ETL jobs to meet business SLAs.
  • Intend to grow in the domain of data analytics and BI solutions; possess strong experience and knowledge in big data and hybrid analytics, have implemented projects and carried out several POCs for big data analytics, and have written blogs about it.
  • Certified Big Data professional with hands-on experience in the Hadoop framework, Hive, Pig, Flume, HBase, Java, MapReduce, and NoSQL databases (DynamoDB); also a good understanding of in-memory data processing, e.g. Apache Storm and Apache Spark.
  • Passionate about statistical data analysis and hybrid application architectures that utilize existing EDW infrastructure along with emerging big data technologies.
  • Cooperated and communicated effectively, both orally and in writing, with users, technical staff, and management.
  • Established priorities, organized assignments, and worked effectively within timelines and under general supervision.
  • Attended instructor-led training at Amazon on AWS web services for designing and managing large-scale, highly available, and fault-tolerant solutions for Big Data analytics on the Amazon cloud.
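
To make the Teradata scripting experience above concrete, here is a minimal BTEQ sketch of a delta merge into a fact table. The logon string, table, and column names are hypothetical, and the staged delta is assumed to have been loaded separately via FastLoad/TPT.

```sql
.LOGON tdprod/etl_user,etl_password;

-- Merge the staged delta (hypothetical stg.stg_sales, loaded via FastLoad/TPT)
-- into the target fact table: update changed rows, insert new ones.
MERGE INTO edw.fact_sales AS tgt
USING stg.stg_sales AS src
  ON tgt.sale_id = src.sale_id
WHEN MATCHED THEN UPDATE SET
  amount  = src.amount,
  load_ts = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN INSERT (sale_id, amount, load_ts)
  VALUES (src.sale_id, src.amount, CURRENT_TIMESTAMP);

-- Abort with a non-zero return code on failure so the scheduler can recover.
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
```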

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter, Oracle Data Integrator (Oracle ODI), Oracle ODIC (Oracle Data Integration Console), Microsoft SSIS, Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT), Oracle PL/SQL, Oracle stored procedures, Oracle packages, Teradata macros, Toad for Oracle, Oracle SQL Developer, SQL*Loader, Talend Integration Studio, Talend Big Data

Data Modeling Tools: ERwin, ER/Studio, ERwin Data Model Manager

Big Data: Hortonworks Hadoop, Sqoop, MapReduce, Hive, Pig, Apache Flume, Oozie, NoSQL database HBase

Databases: Oracle 9i/11g/12c, Teradata 12/13/14, SQL Server 2012/2008 R2

Programming Languages: Java, C++, Unix/Linux shell scripting, Windows shell/batch scripting

Reporting Tools: Tableau Desktop/Server, MS SSRS, Microsoft Excel

PROFESSIONAL EXPERIENCE:

Confidential

Data/ETL/Big Data Architect/Sr. ETL Developer

Responsibilities:

  • Data analysis and requirements gathering for the business-required dashboards/reports.
  • Identified data feeds from corporate systems and compiled a high-level solution design methodology.
  • Built a centralized 3NF data model using the Inmon methodology to integrate the data feeds into a single structure.
  • Designed and documented the source-to-target data mapping.
  • Designed the overall ETL/ELT strategy in Informatica, defining the logical and physical schemas and design decisions to support efficient real-time data integration.
  • Designed and implemented the overall initial load and delta processing to achieve real-time data processing and integration in the EDW.
  • Integration and business data validation/testing of the ETL process.
  • Designed the overall methodology to migrate data from the source systems to the Hadoop data lake, identifying the most efficient toolset and setting the standard for ELT/ETL processes on the lake.
  • Designed the data mapping from source to the Hadoop data lake and implemented it using Talend and Hadoop-native utilities (Apache Hive, HBase, Impala); see the Hive sketch after this list.
  • Designed the dashboards and KPI metrics using Tableau as the front-end tool and prepared the source-to-report matrix.
  • Release management of all database objects (table DDLs, stored procedures, packages) from the Development to the QA and Production repositories.
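
As a concrete illustration of the Hive-based lake loading above, a minimal HiveQL sketch; the database, table, and HDFS path names are all hypothetical. A raw landing feed is exposed as an external table, and one business date is rewritten into a curated, partitioned ORC table.

```sql
-- Expose the raw landing files as an external table (hypothetical path/schema).
CREATE EXTERNAL TABLE IF NOT EXISTS raw.customer_feed (
  customer_id BIGINT,
  name        STRING,
  updated_at  TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/landing/customer_feed';

-- Curated layer: partitioned by load date, stored as ORC.
CREATE TABLE IF NOT EXISTS curated.customer (
  customer_id BIGINT,
  name        STRING,
  updated_at  TIMESTAMP
)
PARTITIONED BY (load_dt STRING)
STORED AS ORC;

-- Delta processing: rewrite exactly one day's partition from the raw feed.
INSERT OVERWRITE TABLE curated.customer PARTITION (load_dt = '2016-01-01')
SELECT customer_id, name, updated_at
FROM   raw.customer_feed
WHERE  TO_DATE(updated_at) = '2016-01-01';
```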

Confidential, Reston, VA

Data Architect/Sr. ETL (ODI) Developer

Responsibilities:

  • Information management: analyzed different K-12 source systems and identified the primary data feeds per subject area.
  • Worked closely with the data modeler to build a centralized 3NF data model using the Inmon methodology to integrate all 7 data feeds into a single structure.
  • Worked closely with the data modeler to prepare the source-to-target data mapping.
  • Designed the overall ETL/ELT strategy in Oracle Data Integrator (ODI 11), defining the logical and physical schemas and design decisions to support efficient real-time data integration.
  • Designed and implemented the overall initial load and delta processing to achieve real-time data processing and integration in the EDW.
  • Designed the overall ODI configuration (physical schemas, data servers, logical schemas, and models) in Oracle ODI according to standard practices.
  • Developed ODI jobs/ELT mappings using Oracle ODI and Oracle stored procedures for data loading and transformations per the source-to-target matrix (SMX) design.
  • Developed and unit-tested all ODI components; prepared packages, generated scenarios, and prepared load plans for production scheduling.
  • Release management of all Oracle database objects (table DDLs, stored procedures, packages) and Oracle ODI objects from the Development to the QA and Production repositories.
  • Monitored ODI job performance and optimized the ELT to complete within the defined SLAs; developed scripts and modified ODI jobs to meet the business-defined SLAs.
  • Involved in implementing the landing process of loading the Student, Person, Business Line, Geography, and all other dimensions into the MDM staging area from various sources, e.g. RDBMS and manually maintained Excel files.
  • Created ODI interfaces to periodically load data from the MDM data sources (RDBMS and manually maintained files) into the staging area.
  • Worked on data cleansing and data standardization to prepare the data for applying MDM business rules.
  • Prepared business rules to consolidate data from different sources to achieve master data management (MDM).
  • Ran match/merge rules to check the effectiveness of the MDM business rules and data processes, and prepared reports for the Data Steward to review.
  • Developed Oracle PL/SQL scripts to apply the MDM rules, consolidate data, and prepare a single version of the truth (a match/merge sketch follows this list).
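
A minimal Oracle SQL sketch of the match/merge idea behind those scripts, with all table and column names hypothetical: records are matched on a standardized key, and the most recently updated record per key survives as the golden record.

```sql
-- Build a match key from standardized attributes, pick the latest record per
-- key as the survivor, and upsert the result into the master table.
MERGE INTO mdm_person_master m
USING (
  SELECT UPPER(TRIM(first_name)) || '|' || UPPER(TRIM(last_name)) || '|' ||
         TO_CHAR(birth_dt, 'YYYYMMDD') AS match_key,
         MAX(person_id) KEEP (DENSE_RANK LAST ORDER BY updated_at) AS surviving_id
  FROM   mdm_person_staging
  GROUP  BY UPPER(TRIM(first_name)) || '|' || UPPER(TRIM(last_name)) || '|' ||
         TO_CHAR(birth_dt, 'YYYYMMDD')
) s
ON (m.match_key = s.match_key)
WHEN MATCHED THEN UPDATE SET m.golden_person_id = s.surviving_id
WHEN NOT MATCHED THEN INSERT (match_key, golden_person_id)
  VALUES (s.match_key, s.surviving_id);
```

Production MDM survivorship is normally attribute-level and rule-driven; this single pass only shows the match-key and survivor-selection mechanics.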

Confidential

Senior DWH/ETL Consultant/Solution Designer

Responsibilities:

  • Information sourcing: mapped business requirements to the source systems and prepared the source-to-target matrix (data mappings) for ETL implementation.
  • Created logical and physical data models using Erwin and deployed schema objects (tables, partitions, indexes, etc.) to the DWH environment to support an efficient ETL strategy.
  • Designed the overall ETL strategy to load data from OLTP to OLAP.
  • Installed, configured, and maintained the Oracle ODI repositories (master and work repositories) for all environments, e.g. Development, Test, UAT, and Production.
  • Designed the Oracle ODI topologies, physical architecture, logical architecture, and models inside the Oracle ODI project to ensure standard ETL development across the project life cycle.
  • Designed the Oracle ODI mappings to load data from various heterogeneous sources into the DWH using Oracle ODI mapping transformations and tools.
  • Optimized the Oracle ODI ETL mappings, utilizing the ELT approach and implementing pushdown optimization/pass-through to load data efficiently and improve performance.
  • Prepared load plans and scenarios and scheduled ODI jobs to trigger automatically and load the EDW on a periodic basis.
  • Data profiling/data quality using Oracle PL/SQL; built ETL jobs in Oracle ODI to profile source system data and report error records to the DBA admin according to the business rules.
  • Big data design/development using Hortonworks Hadoop: loaded data from social media feeds (Twitter) using Apache Flume, Hive, Pig, and HBase for sentiment analysis and network log analysis (see the Hive sketch after this list).
  • Designed the overall data quality and data standardization process with the client's business stakeholders to configure, document, and agree on business rules, data publishing, data validation, and the data set to be included for MDM.
  • Developed ODI interfaces and manual PL/SQL loader scripts to load data from RDBMS and file-based inputs into the MDM staging area.
  • Cleansed and standardized the data sets related to Student, Teacher, Employee, and University Organization Hierarchy master data.
  • Developed PL/SQL scripts to tokenize data and apply data cleansing and data standardization rules for the MDM process.
  • Developed PL/SQL procedures applying all the business rules to match and merge data into a consolidated version of the master reference data.
  • Worked with the client-side Data Steward to validate and improve the effectiveness of the MDM rules, and implemented a process for updating the data back into the staging area with an approval status.
  • Published the finalized version of the data to the EDW layer for data analysis and corporate reporting/dashboards.
  • Worked with the BI visualization team to prepare the reports/dashboards according to the business needs using Tableau Desktop and Tableau Server.
  • Provided solutions to customer analytical requirements.
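
A minimal HiveQL sketch of the sentiment step referenced above. It assumes Flume lands raw tweets as one JSON document per line and that a hypothetical word-polarity lexicon table (social.sentiment_dict) exists.

```sql
-- Raw tweets landed by Flume (hypothetical HDFS path).
CREATE EXTERNAL TABLE IF NOT EXISTS social.tweets_raw (json STRING)
LOCATION '/data/flume/twitter';

-- Tokenize the tweet text and score each tweet against the polarity lexicon.
SELECT t.tweet_id,
       SUM(CASE WHEN d.polarity = 'positive' THEN  1
                WHEN d.polarity = 'negative' THEN -1
                ELSE 0 END) AS sentiment_score
FROM (
  SELECT GET_JSON_OBJECT(json, '$.id')   AS tweet_id,
         GET_JSON_OBJECT(json, '$.text') AS txt
  FROM   social.tweets_raw
) t
LATERAL VIEW EXPLODE(SPLIT(LOWER(t.txt), '\\s+')) w AS word
JOIN social.sentiment_dict d ON d.word = w.word
GROUP BY t.tweet_id;
```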

Confidential

Data Modeler /Sr. ETL Developer

Responsibilities:

  • Information sourcing: analyzed the source systems to identify the subject areas that should be integrated into the enterprise EDW.
  • Met with business stakeholders to identify the information and KPIs required for corporate reporting and monitoring.
  • Translated the business requirements into technical data aspects and identified the most suitable and correct data sources for data integration.
  • Created logical and physical data models using Erwin and deployed schema objects (tables, partitions, indexes, etc.) to the DWH environment to support an efficient ETL strategy.
  • Designed the overall ETL strategy to load data from OLTP to OLAP.
  • Designed the data mappings to load data from various heterogeneous sources into the DWH, and provided the ETL development team with business rules and the implementation source-to-target matrix (data mappings).
  • Data profiling/data quality using standard PL/SQL (a profiling sketch follows this list).
  • Worked with a vendor to set up the MDM process: as the MDM SME, provided the business knowledge and data elements to the vendor team and worked with them to identify and design business rules to cleanse, standardize, and match/merge the major business dimensions in the telecommunications industry.
  • Worked with the vendor to conduct MDM workshops with business stakeholders to validate the MDM rules and verify the MDM process outcomes.
  • Implemented complex business requirements and complex ETL/ELT for presentation layer aggregates/cubes for OLAP analysis using standard Oracle/Teradata SQL, stored procedures, and packages.
  • Worked with the BI visualization team to prepare the reports/dashboards according to the business needs.
  • Provided assistance to all team stakeholders for analytical requirements.
  • Ad hoc statistical reporting to the CXO level using PL/SQL and Microsoft Excel.
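
A minimal Oracle SQL profiling sketch of the kind used here; the subscriber table and its columns are hypothetical. Null ratio, distinct count, and value range give a quick quality baseline before a column is mapped into the EDW.

```sql
SELECT COUNT(*)                               AS row_cnt,
       COUNT(msisdn)                          AS non_null_cnt,
       ROUND(1 - COUNT(msisdn) / COUNT(*), 4) AS null_ratio,     -- share of NULLs
       COUNT(DISTINCT msisdn)                 AS distinct_cnt,   -- key-candidate check
       MIN(activation_dt)                     AS min_activation,
       MAX(activation_dt)                     AS max_activation
FROM   src_crm.subscriber;
```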

Confidential

Lead ETL Developer

Responsibilities:

  • Met with source system technical teams to understand the source system data flows and logical/physical entities needed to deliver the required corporate reports/dashboards.
  • Worked with the data architect to implement the overall ETL strategy and design the business deliverables and technical documents.
  • Installed, configured, and maintained the Oracle ODI repositories for all environments, e.g. Development, Test, UAT, and Production.
  • Implemented the Oracle ODI topologies, physical architecture, logical architecture, and models inside the Oracle ODI project with the help of the data architect to ensure standard ETL development across the project life cycle.
  • Designed the Oracle ODI mappings to load data from various heterogeneous sources into the DWH using Oracle ODI mapping transformations and tools.
  • Optimized the Oracle ODI ETL mappings, utilizing the ELT approach and implementing pushdown optimization/pass-through to load data efficiently.
  • Semantic layer design and development with the help of the BI developer and data architect: prepared the aggregates and the required denormalized tables for quality performance reporting using Oracle ODI and PL/SQL stored procedures.
  • Prepared business rules to capture data quality issues using PL/SQL and Oracle ODI mappings, identifying error records and routing them to the DBA admin (a sketch follows this list).
  • Prepared load plans and scenarios and scheduled ODI jobs to trigger automatically and load the EDW on a periodic basis.
  • Monitored and provided post-production support to resolve issues in Oracle ODI mappings and recover job failures.
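
A minimal Oracle SQL sketch of that error-capture pattern, with hypothetical rule, staging, and target names: rows failing a data quality rule are logged with a rule id for the DBA admin, and only clean rows continue into the warehouse.

```sql
-- Log rows violating rule DQ_001 (negative amount or missing customer).
INSERT INTO etl_error_log (rule_id, src_table, src_row_id, logged_at)
SELECT 'DQ_001', 'STG_ORDERS', ROWIDTOCHAR(ROWID), SYSDATE
FROM   stg_orders
WHERE  order_amount < 0 OR customer_id IS NULL;

-- Load only the rows that pass the rule.
INSERT INTO dwh_orders (order_id, customer_id, order_amount)
SELECT order_id, customer_id, order_amount
FROM   stg_orders
WHERE  order_amount >= 0 AND customer_id IS NOT NULL;
```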

Confidential

Technical ETL Lead - DWH Program

Responsibilities:

  • Prepared the source-to-target matrix (data mappings) under the guidelines of the data architect and source system SME.
  • Implemented the ETL jobs using Informatica: built mappings and defined the transformations, workflows, session mapping variables, and loading connection configurations.
  • Worked with the data architect to implement the overall data loading for the initial load and daily load ETL jobs.
  • Documented the ETL development standards and prepared the technical solution document with the help of the solution architect and data modelers.
  • Installed, configured, and maintained the Informatica repositories for all environments, e.g. Development, Test, UAT, and Production.
  • Designed the Informatica mappings to load data from various heterogeneous sources into the DWH using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.
  • Release management of the ETL jobs (Informatica mappings) across environment repositories using Informatica Repository Manager.
  • Optimized ETL mapping performance, utilizing the ELT approach and implementing pushdown optimization/pass-through to load data efficiently.
  • Prepared the aggregate/denormalized (Ralph Kimball) data model for reporting using Informatica and PL/SQL stored procedures (an aggregate refresh sketch follows this list).
  • Production support and ETL job monitoring to ensure the business-defined SLAs are met.
  • Defect fixing in the UAT/post-production period to meet business deliverables.
  • Release and change management to implement change requests, new development, and defect fixes, e.g. physical table DDLs, PL/SQL script DMLs, Informatica mapping jobs, etc.
  • Informatica mapping/script failure handling and job recovery.
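
A minimal Oracle SQL sketch of that aggregate refresh, with hypothetical fact and dimension names: the denormalized daily summary is rebuilt for one load date so reports hit a small pre-joined table instead of the base fact.

```sql
-- Idempotent daily rebuild: clear the load date, then re-aggregate it.
DELETE FROM agg_sales_daily WHERE day_id = :load_day;

INSERT INTO agg_sales_daily
  (day_id, store_name, product_category, total_amount, txn_cnt)
SELECT f.day_id,
       s.store_name,
       p.category,
       SUM(f.amount),
       COUNT(*)
FROM   fact_sales  f
JOIN   dim_store   s ON s.store_key   = f.store_key
JOIN   dim_product p ON p.product_key = f.product_key
WHERE  f.day_id = :load_day
GROUP  BY f.day_id, s.store_name, p.category;
```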

Confidential

Data Modeler

Responsibilities:

  • Gathered business requirements and analyzed the source systems to identify the subject areas that should be integrated into the enterprise EDW.
  • Identified the right sources of data in the enterprise to meet the business requirements, reviewing the technical data aspects and identifying the most accurate data source for data integration.
  • Created logical and physical data models using Erwin and deployed schema objects (tables, partitions, indexes, etc.) to the DWH environment to support an efficient ETL/ELT strategy (a DDL sketch follows this list).
  • Managed the data model across multiple environments and deployed the required schema objects.
  • Worked closely with the release manager to streamline the development processes and ensure BI program delivery.
  • Worked closely with the development and design teams to prepare the source-to-target matrix (data mappings).
  • Worked with the reporting teams to prepare the denormalized (Ralph Kimball) data models and the corresponding source-to-target matrix (data mappings).
  • Implemented several complex aggregates using PL/SQL scripts to address BI reporting layer challenges and improve the user experience.
  • Worked closely with all stakeholders to deliver the BI program and attended weekly meetings with the technical teams and the CXO committee.
  • Ad hoc reporting to the CXO level using PL/SQL and Microsoft Excel.
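
A minimal Oracle DDL sketch of that physicalization work; the table, column, and index names are hypothetical. A range-partitioned fact with a local bitmap index is typical of the schema objects deployed from the Erwin model to support partition pruning in the ETL/ELT.

```sql
-- Daily interval-partitioned fact table (Oracle 11g+ interval partitioning).
CREATE TABLE fact_usage (
  day_id      DATE          NOT NULL,
  account_key NUMBER(12)    NOT NULL,
  usage_mb    NUMBER(18, 2)
)
PARTITION BY RANGE (day_id)
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
( PARTITION p_initial VALUES LESS THAN (DATE '2012-01-01') );

-- Local bitmap index on the dimension key for star-style reporting joins.
CREATE BITMAP INDEX ix_fact_usage_acct
  ON fact_usage (account_key) LOCAL;
```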

Confidential

ETL Developer

Responsibilities:

  • Understood the subject areas, source system data flows, and logical/physical entities from the data modeler and solution architect to deliver the required corporate reports/dashboards.
  • Implemented the ETL development standards under the guidelines of the solution architect to ensure standard ETL development across the project life cycle.
  • Designed the Informatica mappings to load data from various heterogeneous sources into the DWH using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.
  • Semantic layer design and development with the help of the BI developer and data architect: prepared the aggregates and the required denormalized tables for reporting using PL/SQL scripts and stored procedures (a refresh procedure sketch follows this list).
  • Monitored and provided post-production support to resolve issues in Informatica mappings and recover job failures.
  • Fixed defects during the UAT and post-production support periods.
  • Optimized the Informatica ETL jobs to meet the standard ETL SLAs by implementing job performance optimization techniques and enabling pushdown optimization.
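
A minimal Oracle PL/SQL sketch of such a semantic-layer refresh, with all names hypothetical: a stored procedure truncates and repopulates a denormalized reporting table from the warehouse tables.

```sql
CREATE OR REPLACE PROCEDURE refresh_rpt_customer AS
BEGIN
  -- Full rebuild: clear the reporting table, then repopulate it pre-joined.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE rpt_customer';

  INSERT INTO rpt_customer (customer_id, customer_name, segment, region)
  SELECT c.customer_id, c.customer_name, s.segment_name, r.region_name
  FROM   dwh_customer c
  JOIN   dwh_segment  s ON s.segment_id = c.segment_id
  JOIN   dwh_region   r ON r.region_id  = c.region_id;

  COMMIT;
END refresh_rpt_customer;
/
```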

Confidential

BI/ETL Developer

Responsibilities:

  • ETL development, ETL testing, ETL production support, data mapping, UAT support, KPI development for reporting, business presentation layer development, and report development.
