Sr. Data Warehouse Consultant Resume
New York City, NY
SUMMARY
- Over 10 years of IT experience in ETL and data warehousing using Google BigQuery, Informatica 10.1/9.6.1, Teradata 14.10, Oracle 8i/9i/10g, SQL Server 2005/2008/2008 R2, IBM DataStage 8.7, Sybase 2.1, Hyperion, and Tableau 8.3/9.0.
- Experience using Google BigQuery for data warehousing.
- Developed and tested programs in BigQuery.
- Defined scalable enterprise architectures.
- Designed and implemented ETL using Google Cloud technologies and tools.
- Experience in Teradata Parallel Transporter (TPT) and extensive knowledge of Pushdown Optimization (PDO).
- Exposure to Informatica Blaze and Smart Executor.
- Developed several complex mappings in Informatica using transformations, Mapping Parameters, Mapping Variables, Mapplets & Parameter files in Mapping Designer.
- Strong knowledge of Data warehouse concepts: Star Schema, Snowflake and Slowly Changing Dimensions (SCD) Type I and II.
- Developed complex mappings using transformation logic such as Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Joiner, Union, Connected/Unconnected Lookups, Update Strategy, and Normalizer.
- Loaded data from varied sources: flat files (delimited and fixed-width), text files, CSV, XML, Oracle, SQL Server, DB2, and Access.
- Created Macros, Stored Procedures, Views in Teradata 14.10.
- Experienced in writing BTEQ scripts and loading data into Teradata using the FASTLOAD, MLOAD, and TPUMP utilities.
- Exposure to industries including media, retail, and supply chain.
- Strong development skills in writing Oracle PL/SQL, Stored Procedures/functions, Packages, Triggers.
- Extensive experience in Oracle tools and utilities such as SQL Loader, Data Pump, External Tables, Export/Import.
- Experienced in enhancing performance using objects such as views, materialized views, and SQL hints.
- Strong performance tuning skills: creating indexes and materialized views, and modifying PL/SQL procedures using FORALL and BULK COLLECT.
- Outstanding communication and interpersonal skills; a quick learner with excellent analytical reasoning, adaptive to new and challenging technological environments.
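The SCD Type 2 work noted above can be sketched in ANSI-style SQL; the table and column names (dim_customer, stg_customer) are hypothetical, not from any specific engagement:

```sql
-- Expire the current dimension row for any customer whose attributes changed
-- (dim_customer / stg_customer are illustrative names only).
UPDATE dim_customer d
SET    eff_end_dt  = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  d.current_flg = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
                 AND  s.cust_name  <> d.cust_name);

-- Insert a fresh current-version row for changed and brand-new customers
INSERT INTO dim_customer (customer_id, cust_name, eff_start_dt, eff_end_dt, current_flg)
SELECT s.customer_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flg = 'Y'
WHERE  d.customer_id IS NULL;
```

Because the UPDATE runs first, changed customers no longer have a current row, so the anti-join in the INSERT picks them up along with genuinely new keys.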
TECHNICAL SKILLS
Data warehousing & ETL: Google BigQuery, DataStage 8.7, Informatica 10.1/9.1/8.6, SQL Server Integration Services (SSIS) 2012/2008
Databases: MS Access 2000, Teradata 14.0/13.11, Oracle 11g/10g/9i/8i/7i, MS SQL Server 2012/2008 R2/2008/2005
Job Scheduling: Autosys, Tivoli
Languages: PL/SQL, UNIX shell, T-SQL
BI & Reporting: Tableau 8.2, SQL Server Reporting Services (SSRS) 2012/2008 R2
Data Modeling: Erwin 8.2
Version control: Git
PROFESSIONAL EXPERIENCE
Confidential, New York City, NY
Sr. Data Warehouse Consultant
- Created efficient BigQuery queries to migrate existing Informatica/Teradata mappings.
- Created UNIX scripts to call the BQ queries to load data.
- Designed the data model for BigQuery.
- Performed performance tuning in BigQuery.
- Pushed code to Git for version control.
- Led the architecture design for the migration to Google Cloud, drawing on significant BigQuery exposure.
- Implemented de-normalized Google BigQuery design for the DWH.
- Analyzed business rules in the existing system, then implemented optimized Informatica mappings, sessions, and workflows, and Teradata macros, stored procedures, and views.
- Created complex mappings using Unconnected and Connected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
- Designed the data life cycle: initial source-system exploration, high-level ETL load strategy, and a parallel load before go-live. Developed and presented technical designs to clients, helping complete the migration on time with no issues.
- Used Mapping Variables, Mapping Parameters and Session Parameters to increase re-usability.
- Implemented a CDC/full-load strategy.
- Worked extensively with Informatica PowerCenter transformations, including Expression, Joiner, Sorter, Filter, Router, Normalizer, Rank, Lookup, Stored Procedure, Update Strategy, Source Qualifier, Union, and CDC, as required.
- Served as the senior subject matter expert for the end-to-end migration design, build, execution, and transition; led the technical architecture and design of the migration process.
- Translated complex business requirements into mapping documents; captured complex business logic in mapping documents used by developers.
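The denormalized BigQuery design described above typically replaces join tables with nested and repeated fields; a minimal sketch in BigQuery Standard SQL, with an illustrative (hypothetical) dataset and schema:

```sql
-- Orders denormalized with repeated line items (demo_dwh.orders is hypothetical).
CREATE TABLE demo_dwh.orders (
  order_id   STRING,
  order_dt   DATE,
  customer   STRUCT<id STRING, name STRING>,
  line_items ARRAY<STRUCT<sku STRING, qty INT64, amount NUMERIC>>
)
PARTITION BY order_dt;  -- partition pruning limits bytes scanned per query

-- UNNEST flattens the repeated field at query time, so no join is needed
SELECT o.order_id,
       li.sku,
       li.qty * li.amount AS line_total
FROM   demo_dwh.orders AS o,
       UNNEST(o.line_items) AS li
WHERE  o.order_dt = DATE '2024-01-01';
```

The design trade-off: storage is duplicated relative to a normalized Teradata model, but reads avoid large joins, which suits BigQuery's columnar, scan-priced execution.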
Confidential, Englewood Cliffs, NJ
Sr. ETL Informatica Developer
- Created and Imported/Exported various Sources, Targets, and Transformations using Informatica Power Center, Repository Manager and Designer.
- Created Reusable Mapplets in the Mapplet Designer and used in different Mappings.
- Designed and developed Informatica Mappings, Reusable Sessions and Workflows.
- Extensively developed mappings using transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Update Strategy, Aggregator, Filter, Router, and Joiner.
- Prepared technical specifications for extracting, transforming, and loading data into various staging tables.
- Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
- Developed and implemented stored procedures to automate some of the existing process.
- Fixed bugs and issues raised by the business during production support.
- Created UNIX shell script to extract data from database tables for consumption downstream.
- Developed SQL scripts for Tableau reporting.
- Coordinated with On-Site and Off-Shore developing and testing teams.
- Led onsite production support while managing a three-member offshore team.
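Reporting SQL feeding Tableau, as mentioned above, is commonly a pre-aggregated extract query; a hypothetical example (fact_sales, dim_region, and all columns are illustrative names):

```sql
-- Hypothetical weekly sales extract consumed by a Tableau data source.
SELECT DATE_TRUNC('week', s.sale_dt)  AS sale_week,
       r.region_name,
       SUM(s.amount)                  AS total_sales,
       COUNT(DISTINCT s.order_id)     AS order_cnt
FROM   fact_sales s
JOIN   dim_region r
       ON r.region_id = s.region_id
GROUP  BY DATE_TRUNC('week', s.sale_dt), r.region_name;
```

Pre-aggregating in the database keeps the Tableau extract small and pushes the heavy grouping to the warehouse rather than the BI tier.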
Confidential, Somerset, NJ
ETL Informatica Developer
- Worked with business analysts to understand requirements and build a plan for the ETL process.
- Extracted data from Oracle, flat-file, and DB2 UDB source systems and loaded the processed data into Oracle target tables and flat files.
- Involved in data analysis of the OLTP system to identify source data elements.
- Prepared Source to Target ETL mapping documents.
- Designed and developed mappings and Mapplets with reusable transformations, built in the Transformation Developer to reduce the time spent generating large, complex, and redundant mappings.
- Used Expression, Aggregator, Lookup, Filter, Router, Sequence Generator, Sorter, Stored Procedure, and Normalizer transformations to perform the data calculations and manipulations required by the business rules and load the data into target tables.
- Created UNIX shell scripts.
- Wrote complex SQL queries involving multiple tables with joins.
- Used Normalizer Transformation to generate multiple data fields from a single row of data.
- Identified and resolved performance issues in Informatica mappings and sessions.
- Used Autosys to schedule the workflows automatically.
- Checked error log files to determine the cause of workflow/session failures and resolved them by thoroughly rechecking mapping logic, database connections, tasks, and workflow settings.
- Developed PL/SQL procedures for processing business logic in the database.
- Created test cases and pseudo-data to verify mapping logic and required output.
- Involved in Unit testing, User Acceptance Testing and documented the results.
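A database-side business rule of the kind cited above is typically an Oracle PL/SQL procedure called from the workflow; a minimal sketch, with every object and column name hypothetical:

```sql
-- Illustrative PL/SQL procedure applying a simple business rule
-- (orders, net_amount, gross_amount are made-up names).
CREATE OR REPLACE PROCEDURE apply_discounts AS
BEGIN
  UPDATE orders o
  SET    o.net_amount = o.gross_amount * 0.95  -- 5% discount rule
  WHERE  o.gross_amount > 1000
    AND  o.status = 'OPEN';

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- surface the error so the calling session fails visibly
END apply_discounts;
/
```

Raising the exception back to the caller lets the Informatica session log capture the failure, matching the log-driven troubleshooting described in this section.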
Confidential, Somerset, NJ
Database Developer
- Developed Oracle PL/SQL objects such as functions, procedures, packages, and triggers.
- Provided production support for the Inventory, Order Management, and General Ledger modules.
- Developed standard PL/SQL stored procedures and automated the re-submission process of failed invoice transactions.
- Developed SQL reports for invoicing, order delivery, quotations, stock inwards/outwards, and inventory adjustments.
- Developed ad hoc weekly reports, daily stock status reports, and weekly, monthly, and periodic reports.
- Documented business process flows, user manual and technical support manual.
- Created various PL/SQL code to implement business rules through procedures, functions, and packages using SQL*Plus.
- Extensively involved in support and root-cause analysis of data issues in PL/SQL stored procedures.
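The automated resubmission of failed invoice transactions described above pairs naturally with the BULK COLLECT/FORALL tuning pattern cited in the summary; a sketch under assumed names (invoices table, FAILED/PENDING statuses are hypothetical):

```sql
CREATE OR REPLACE PROCEDURE resubmit_failed_invoices AS
  TYPE t_ids IS TABLE OF invoices.invoice_id%TYPE;
  v_ids t_ids;
BEGIN
  -- Fetch all failed invoice ids in a single round trip
  SELECT invoice_id BULK COLLECT INTO v_ids
  FROM   invoices
  WHERE  status = 'FAILED';

  -- Re-queue them with one bulk DML statement instead of a row-by-row loop
  FORALL i IN 1 .. v_ids.COUNT
    UPDATE invoices
    SET    status    = 'PENDING',
           retry_cnt = retry_cnt + 1
    WHERE  invoice_id = v_ids(i);

  COMMIT;
END resubmit_failed_invoices;
/
```

FORALL batches the UPDATEs into a single context switch between PL/SQL and the SQL engine, which is the performance rationale for the pattern.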