
Advisory Engineer, Data Warehouse Engineer Resume


  • Data Warehouse Engineer with extensive experience in Dimensional Modeling in DBMS and Hadoop environments.
  • Architecture, development and testing of ETL pipelines.
  • Deep understanding and use of relational and dimensional models, star and snowflake models and multidimensional cubes.
  • Knowledge of lambda architecture.
  • Conceptual, logical and physical design of Data Marts and Data Warehouses.
  • Full development life cycle using Waterfall and Agile methodologies.
  • Master data management.
  • Collection and analysis of the business requirements.
  • Source data analysis, data profiling and cleansing, dimension conformation and mapping of source data elements to KPIs and metrics.
  • Preparation of development documentation such as Requirement Documents, Development Standards, Design Documents and Functional Specifications, QA test cases, various diagrams and user manuals.
  • Hands-on familiarity with ETL and BI tools such as DataStage, Pentaho and Tableau.
  • Writing MapReduce jobs to load and extract Big Data in a Hadoop environment.
  • L3 support, performance tuning, debugging and troubleshooting of ETL jobs.
  • Hands-on writing of batch-automation and process-monitoring scripts in bash.
  • Migration of the legacy systems, reverse engineering and remodeling.
  • Impact analysis, development and maintenance of processes with complex dependencies.
  • Practical knowledge in preparing test cases and conducting QA testing at various stages of the development cycle, including unit, integration, regression and system tests.
  • Familiarity with banking regulatory compliance, such as the Privacy and Security rules, the USA PATRIOT Act and Foreign Assets Control regulations.
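The bash batch-automation and process-monitoring work above can be illustrated with a minimal sketch; the log-path argument and the ERROR/ORA- failure markers are assumptions for illustration, not actual production conventions:

```shell
#!/usr/bin/env bash
# Minimal sketch of an ETL log check of the kind used for process monitoring.
# The "ERROR"/"ORA-" markers are illustrative; real jobs define their own.
check_etl_log() {
    local log_file="$1"
    # Flag the run as failed if the job log contains any error marker.
    if grep -qE 'ERROR|ORA-[0-9]+' "$log_file"; then
        echo "FAILED"
        return 1
    fi
    echo "OK"
}
```

A check like this would typically run from cron after each batch window and alert the on-call engineer on FAILED.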


Programming languages: PL/SQL, SQL, T-SQL, Java 1.7, JDBC, Perl, XML, PHP, HTML, Bash, awk.

Database environments: Vertica, Oracle 10g/9i/8i on HP-UX, RedBrick, MS SQL Server 7.0/2005, Aster, DB2, SQL*Loader, Oracle Application Server, MySQL, MS Access.

Operating systems: UNIX, Linux (SuSE, Red Hat), Windows NT/95/98

Development tools: TOAD, Eclipse, DbVisualizer, VSQL, SQL*Plus, Cygwin, PuTTY, WinSCP, SQL Navigator, MS SQL Server Manager and Query Builder, ProfileStage, Meta Data, DataStage, Oracle Financial Suite, ClearCase, VSS, SQL Developer, Oracle Designer, Oracle Discoverer, JDeveloper, Cognos, Tableau, Pentaho, BusinessObjects.

Modeling tools: ERwin, Visio

MS Office applications: MS Office Suite, including Excel, Access, PowerPoint, Word and Outlook; OpenOffice.



Advisory Engineer, Data Warehouse Engineer


  • L3 support, maintenance, enhancement and tuning of existing ETL and reporting applications on Oracle (PL/SQL, bash), Aster and Hadoop (Java Spring, SQL).
  • Data modeling, design and development of new data feeds and addition of new functionality to the existing pipeline.
  • Migration of client databases from Oracle to Aster, and from Aster to Hadoop.
  • Led the migration of several legacy ETL applications:
  • Collected business requirements via reverse engineering.
  • Analyzed requirements and modeled the new environment.
  • Developed ETL processes to load data into a repository on HBase and Hive.
  • Wrote technical documentation for developers and OPS.
  • Led a group of 4 developers using an Agile approach.
  • Validation, troubleshooting and fixing of production bugs and user complaints.
  • Unit, integration, smoke and user acceptance testing.
  • Initiated tuning of batch ETL procedures (PL/SQL) and DB health-maintenance events before the 2013 Holiday season, which were adopted by the DBAs for future use. As a result, we experienced no Holiday bottlenecks from 2013 onward.
  • Developed and coded a data-retention process (Oracle PL/SQL) to purge and/or archive historical data as part of the Holiday-season tuning initiative.
  • Participated in modeling, design, coding and implementation of the DDR (Dimension Data Repository) on Hadoop, using Java Spring framework and SQL.
  • Refactored the Client Data Import pipeline from the legacy ESB and integrated it into the Hadoop pipeline framework.
  • Integrated new data source from Call Center into the Import flow above.
  • Enhanced with new functionality and implemented the Baseline application (Java Spring) for migrating client historical data from the legacy system to the Hadoop repository.
  • Developed a utility (Java) for OPS to manage HBase tables.
  • Developed and implemented a Test Data Generator (Java) to generate master data for QA automation testing.
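The production data-retention logic described above ran in Oracle PL/SQL; a hypothetical file-level analogue of the same purge-or-archive idea, sketched in bash (the directory layout and 90-day retention window are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of a purge-or-archive retention pass over flat extract files.
# The real process operated on Oracle tables; paths and cutoff are assumptions.
retain() {
    local data_dir="$1" archive_dir="$2" days="$3"
    mkdir -p "$archive_dir"
    # Move anything older than the retention window into the archive area.
    find "$data_dir" -maxdepth 1 -type f -mtime +"$days" \
        -exec mv {} "$archive_dir"/ \;
}
```

In the PL/SQL version the equivalent step was a partition-by-date purge, with archived rows copied to history tables before deletion.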

Environment: Agile, Oracle 10/11g, SQL Developer, PL/SQL, Aster, Linux, Java, Spring, Hadoop, HBase, Hive, Jira, Parature, GitLab, Jenkins, Eclipse, Maven.


Sr. Data Warehouse Engineer, Architect


  • Collection of requirements through interviews with business partners.
  • Analysis of requirements and SORs to derive rules and standards for the ETL design.
  • Integration of source data from various SORs (Salesforce CRM, MS SQL Server, Dynamics GP and WTS Paradigm ERP).
  • Conceptual and logical design of projected Data Warehouse.
  • Evaluation and selection of vendors.
  • Sizing and budget estimation of the Data Warehouse project, including hardware costs, software and tool licenses, developer staffing for the new skillset and/or the cost of retraining existing personnel, as well as estimation of DW maintenance costs.
  • Extraction of data from the SORs into the staging area for the demo version.
  • Development of ETL processes to load the data into the demo data mart.
  • Created a demo dashboard for KPI and trend analysis using SSRS and MS Excel for data visualization.
  • Presented conceptual EDW solution and demo data mart to the business partners.
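Staging loads like the ones above usually begin with a cleansing pass over the SOR extracts. A minimal awk-based sketch, where the pipe-delimited layout is an assumption about the extract format:

```shell
#!/usr/bin/env bash
# Sketch of a staging-area cleansing pass: drop blank lines and trim
# surrounding whitespace from each field of a pipe-delimited extract.
clean_extract() {
    awk -F'|' 'NF {
        for (i = 1; i <= NF; i++) {
            gsub(/^[ \t]+|[ \t]+$/, "", $i)
        }
        out = $1
        for (i = 2; i <= NF; i++) out = out "|" $i
        print out
    }' "$1"
}
```

The cleaned output would then be bulk-loaded into staging tables (e.g. via `bcp` or `BULK INSERT` on MS SQL Server).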

Environment: MS SQL Server 2005, Salesforce CRM, Dynamics GP, WTS Paradigm ERP, T-SQL, Windows.


Sr. Data Warehouse Engineer, DBA


  • Development and testing of ETL process for a corporate DW (Vertica, Pentaho, SQL and Python).
  • DBA support of Vertica, including backups, user and security management, tuning and upgrades.
  • Collection and analysis of business requirements, KPI and client metrics through interviews of the business stakeholders.
  • Analysis of the data in SOR and the log files to derive transformation rules and data quality standards.
  • Conceptual and logical design of the Cloud Services DW.
  • Modeling and physical design of the data mart for selected business process.
  • Selection of software vendors for the DW implementation.
  • Data profiling and mapping of the data source elements to the DW data components, KPI and BI metrics.
  • Hands on development of ELT pipeline, including batch scheduler, DDL (creation of database objects), DML - transformation and load procedures, and DCL to manage the access to the data.
  • Integration and user acceptance testing.
  • Migration of the historical data from the SOR system for selected business process.
  • Led development of BI dashboards and reports from the mart.
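The log-file analysis above can be sketched with awk; the common-log-format layout with the HTTP status code in field 9 is an assumption about the Tomcat/Catalina access logs:

```shell
#!/usr/bin/env bash
# Sketch of log profiling used to derive load rules and quality standards:
# count requests per HTTP status in a common-log-format access log.
status_counts() {
    awk '{ counts[$9]++ } END { for (s in counts) print s, counts[s] }' "$1" | sort
}
```

Distributions like these fed the transformation rules (e.g. which status codes count toward a usage KPI and which are filtered as noise).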

Environment: MySQL, Vertica, Red Hat Linux, Tomcat/Catalina log files, device-context upload files, Visio, Toad, Bash, awk, SQL.


Sr. Application Engineer


  • Project planning, capacity management and scheduling of delivery of the new projects.
  • Scope analysis of user requests, projects sizing and planning.
  • Collection and analysis of business requirements from business partners as well as via reverse engineering of existing code and source-data profiling.
  • Master data management.
  • New projects impact analysis.
  • Modeling, logical and physical design of the ETL processes and BI repositories and BI reports.
  • Applying regulatory act requirements to the ETL and data storage.
  • Producing periodic audit report to assure data compliance with the regulatory acts.
  • Smoke, integration and UAT testing of developed applications.
  • Preparation of technical and user documentation.
  • L3 support of production applications on a rotating basis.
  • Migration of the STS Data Mart from RedBrick to Oracle 10g, including reengineering of the ETL pipeline, migration of historical data and development of user periodic and scorecard reports (Oracle 10g, PL/SQL, bash, OWB).
  • Took part in modeling and in the conceptual and physical design of the BBG Data Warehouse. Developed the ETL pipeline to load FDR data (Oracle, PL/SQL, ProfileStage).
  • Migration of CIRM Data Mart to the BBG Data Warehouse (Oracle 10g, DataStage, PL/SQL).
  • Reengineered the Compliance BI Repository ETL process to fix periodic failures.
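Migrations like the STS and CIRM moves above are typically validated by reconciling row counts between source and target. A hypothetical sketch over two "table|row_count" extracts (the file format and table names are assumptions):

```shell
#!/usr/bin/env bash
# Sketch of a post-migration reconciliation: join two "table|row_count"
# extracts (one per database) and report tables whose counts differ.
reconcile_counts() {
    local a b
    a=$(mktemp); b=$(mktemp)
    sort "$1" > "$a"; sort "$2" > "$b"
    # join on table name, then print mismatches as: table src_count tgt_count
    join -t'|' "$a" "$b" | awk -F'|' '$2 != $3 { print $1, $2, $3 }'
    rm -f "$a" "$b"
}
```

An empty report means the row counts reconcile; any output line flags a table for deeper column-level comparison.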

Environment: Oracle 10g, DB2, Teradata, RedBrick, MS SQL Server, ERwin, Visio, Toad, Cronacle, AutoSys, VSS, Ascential ETL Suite, SQL, PL/SQL, JDBC, shell, Perl, HP-UX, Windows XP
