
Sr. Data Warehouse Developer Resume


Austin, TX

SUMMARY:

  • Twelve-plus years of planning, engineering, development, and implementation of client/server applications, including three years of experience with the Apache Hadoop ecosystem. Expertise in delivering large-scale global/enterprise-level Data Warehouse solutions and actionable BI insights for management while serving Fortune 100/500 clients.
  • Experience developing highly scalable, distributed systems using various open source tools, as well as designing and optimizing large, multi-terabyte data warehouses.
  • Hands-on expertise in requirement engineering, data profiling, data cleansing and conforming, SQL performance optimization, and data analysis to discern business-relevant trends and drive analytic reporting.
  • Creative problem solver/technologist, adept at conceptualization, analysis, process flow, proofs of concept, and prototyping of company-specific data/information solutions, and at launching global platforms.
  • Excellent understanding of Hadoop Architecture and underlying Hadoop framework including Storage Management.
  • Expert in the Ralph Kimball and Bill Inmon data warehouse methodologies.
  • Expert knowledge of data modeling, fact and dimension tables, and physical and logical data modeling.
  • Developed metadata mappings from various source systems.
  • Expert in ETL/ELT using DataStage, Informatica, SSIS, and Ab Initio.
  • Experience in importing and exporting data between HDFS and RDBMS using Sqoop.
  • Expert in writing Teradata macros, Hive queries, NoSQL, Pig, SQL, and T-SQL, including dynamic queries, subqueries, and complex joins, as well as complex stored procedures, triggers, user-defined functions, views, and cursors.
  • Expertise in Teradata, PostgreSQL, SQL Server, Oracle, SQL, PL/SQL, shell scripting (ksh, awk, sed), Python, Pro*C, and DB2.
  • Used Autosys, Unix shell scripts, and cron jobs to schedule production jobs for complex batch/continuous systems.
  • Involved in full life cycle development of projects.
  • Executed various Agile Projects
  • Strong team player with exceptional communication and presentation skills.
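The data-cleansing and awk/sed shell scripting mentioned above can be illustrated with a minimal sketch; the file name, field layout, and cleansing rules are hypothetical, not drawn from an actual engagement:

```shell
#!/bin/sh
# Minimal data-cleansing sketch: trim whitespace, drop rows with a
# missing key (field 1), and normalize dates from MM/DD/YYYY to
# YYYY-MM-DD in field 3 of a pipe-delimited extract.
# File name and layout are illustrative.

cat > extract.dat <<'EOF'
1001| Alice |03/14/2020
|Bob|05/01/2020
1003|Carol |12/31/2019
EOF

awk -F'|' -v OFS='|' '
    { gsub(/^ +| +$/, "", $2) }               # trim the name field
    $1 == "" { next }                          # drop rows missing the key
    { split($3, d, "/"); $3 = d[3] "-" d[1] "-" d[2]; print }
' extract.dat > clean.dat

cat clean.dat
```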

TECHNICAL SKILLS:

ETL: DataStage, Ab Initio, Informatica, MS SQL Server Integration Services (SSIS), NETIK

Database: Hadoop, Teradata, PostgreSQL, SQL Server, Oracle, DB2, DB2 UDB

BI: Business Objects

Reporting Tools: Power BI, MS SQL Server Reporting Services (SSRS)

O/S: UNIX, AIX

Modeling: ER Studio

Scheduling: SQL Server Agent, Autosys, TCL, cron

Programming Languages: Python, C/C++, PRO*C

Structured Query Language: PL/SQL, T-SQL, SQL, SQL*LOADER, SQL*PLUS, HIVE

Project Software: JIRA, Git, TFS

EXPERIENCE:

Sr. Data Warehouse Developer

Confidential - Austin, TX

Responsibilities:

  • Successfully led several data extraction, warehousing and analytics initiatives.
  • Led the architecture design/ETL design effort of several projects, including identifying, improving, and resolving data cleansing issues. Defined information architecture, technology infrastructure, and data model design.
  • Worked closely with business and system SMEs to finalize business rules.
  • Designed, developed, tested, and documented various projects using Teradata, Hadoop, Hive, Sqoop, NoSQL, DataStage, Unix scripts, Python, Oracle, and SQL Server.
  • Designed and developed Big Data analytics platform for processing cash flow data using Hadoop, Hive and Pig.
  • Loaded structured and unstructured data into the Hadoop Distributed File System (HDFS).
  • Optimized Hive scripts to use HDFS efficiently by using various compression mechanisms.
  • Created Hive schemas using performance techniques such as partitioning and bucketing.
  • Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
  • Developed DataStage ETL jobs, Hive queries, and Teradata macros to load aggregated data into Teradata for reporting, dashboarding, and ad-hoc analyses.
  • Extensively worked on optimizing and debugging SQL queries.
  • Developed DataStage ETL jobs to process data in PostgreSQL.
  • Worked on a Proof of Concept to use Power BI on Hadoop data.
  • Created Python scripts to compute numeric results and process sequential files.
  • Created several Unix scripts and used sed/awk within them to achieve results.
  • Automated ETL jobs using Unix scripts to minimize Autosys jobs.
  • Used TFS for deployment and Git for version control.
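The Hive partitioning and bucketing techniques above can be sketched as DDL. The table, columns, and bucket count are hypothetical; this script only generates the HQL file, which in practice would be submitted with something like `hive -f cashflow.hql`:

```shell
#!/bin/sh
# Sketch of a partitioned, bucketed Hive table with ORC compression.
# Table/column names and the bucket count are illustrative; the script
# writes the HQL rather than executing it against a cluster.

cat > cashflow.hql <<'EOF'
SET hive.exec.dynamic.partition.mode=nonstrict;

CREATE TABLE IF NOT EXISTS cashflow (
    account_id BIGINT,
    amount     DECIMAL(18,2),
    txn_ts     TIMESTAMP
)
PARTITIONED BY (txn_date STRING)           -- prune scans by date
CLUSTERED BY (account_id) INTO 32 BUCKETS  -- even joins and sampling
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'SNAPPY');
EOF

wc -l cashflow.hql
```

Partitioning keeps each day's data in its own directory so date-filtered queries skip the rest, while bucketing spreads rows evenly by key for joins and sampling.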

Sr. Data Warehouse Developer/Analyst

Confidential - Austin, TX

Responsibilities:

  • Examined the existing SQL database and reviewed related business processes.
  • Held meetings with the business to review data quality checkpoints and audit reports.
  • Developed, tested, and documented SQL scripts for data quality checkpoints.
  • Designed, developed, tested and documented load process using SSIS.
  • Developed and tested SQL scripts to load SQL server database.
  • Designed, developed, tested and documented SSIS Reconciliation Process to create formatted Excel reports.
  • Designed, developed, tested, and documented an SSIS validation process to compare profiling and load results.
  • Successfully migrated legacy data from SQL Server database to staging SQL Server Database using SSIS, Stored Procedures, Views and TSQL.
  • Supported team in resolving SSIS, SQL Reporting services and T-SQL related issues.
  • Wrote T-SQL Stored Procedures for load, reconciliation and validation process.
  • Used SQL Server Agent to schedule jobs.
  • Deployed SSRS on Reporting Server for Business.
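The reconciliation idea behind the SSIS process above can be sketched outside SSIS as a row-count and total comparison between a source extract and a post-load extract; the files, fields, and checks are illustrative stand-ins for the package's logic:

```shell
#!/bin/sh
# Reconciliation sketch: compare row counts and amount totals between
# a source extract and a post-load extract -- the same checks an SSIS
# reconciliation package would formalize. Data is illustrative.

cat > source.csv <<'EOF'
id,amount
1,10.50
2,20.00
3,5.25
EOF

cat > loaded.csv <<'EOF'
id,amount
1,10.50
2,20.00
3,5.25
EOF

recon() {
    # Print "<rowcount> <sum(amount)>" for a CSV with a header row.
    awk -F, 'NR > 1 { n++; s += $2 } END { printf "%d %.2f\n", n, s }' "$1"
}

src=$(recon source.csv)
tgt=$(recon loaded.csv)

if [ "$src" = "$tgt" ]; then
    echo "RECONCILED: $src"
else
    echo "MISMATCH: source=$src target=$tgt"
fi
```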

Sr. Data Warehouse Developer/Analyst

Confidential - Township of Warren, NJ

Responsibilities:

  • Led the onsite-offshore ETL and BI development team for the Data Mart implementation, the platform rollout, and various post-implementation projects in 2009, with added responsibilities.
  • Facilitated the execution of ETL/BI Enterprise level cross functional projects by participating in a variety of activities including developing project plans, monitoring project status, business requirement engineering, data analysis, coordinating data regulations and compliance filings and preparing management reports.
  • Interacted with the Citi Global PMO and Electronic Banking colleagues, with regular contact with Shared Technology Services, including Infrastructure, Technology, Architecture, upstream Business Analysts, and Global Operations.
  • Collaborated with Electronic Banking on project scope definition and overall planning and execution; translated high-level business requirements into technical specifications and created a traceability matrix.
  • Led the primary architecture design effort of the Custody and Securities Data Mart; this included identifying, improving and resolving data cleansing, conforming and data governance issues and maintenance of schema and ETL process models. Defined information architecture, technology infrastructure and Data Model design.
  • Delivered the Conceptual, Logical & Physical Data Model of the Dimensional DB2 Data Mart using ERWIN.
  • Data Architecture - Prepared data flow diagrams and identified major streams such as data conversion, historical data, data retention and archival, data security, backup and restore, and data quality.
  • Data Warehouse Architecture - Prepared high-level ETL design documents, process flows, and logical source-to-target mappings; designed data standardization, cleansing, and conforming modules; and managed metadata.
  • Took part in the DB2 performance optimization project: reviewed and rewrote the SQL used in ETL processes and DB2 stored procedures (access plans/paths/index strategies) for improved system throughput.
  • Provided System Database Administration team the Data Mart tables volumetric worksheet for efficient capacity planning and storage layout.
  • Created the cultural and technical infrastructure to measure, analyze, and improve data quality. Delivered weekly metrics and scorecards.
  • Maintenance of existing SQL Server Data Warehouse and Stored procedures, Analysis and Replication / Enhancements into the new Oracle Data Warehouse and DB2 Data mart data load ETL processes.
  • Enforced database standards and followed up with SMEs and partner systems on inconsistencies.
  • Maintained up-to-date plans for disaster recovery and failover capabilities and tested them as recommended.
  • Created and maintained the global development team user access matrix reports for database security.
  • Designed the trigger mechanism to update reporting User entitlement database with respect to User Accounts.
  • Tuned ETL process performance by removing redundant processes and following best practices.
  • Instituted Standardized processes, re-usable templates, data dictionary and provided extensive training in DWH Technologies and training on securities and funds industry to improve team efficiency.
  • Designed and delivered the consolidated OLA and SLA e-mail alert mechanism for the Service Management team.
  • Delivered the unified Global Security Master integrating near real time data from various sources as part of Master Data Management initiatives. EAI architecture approach was successfully implemented.
  • Performed data integration, profiling, and analysis of source systems; provided data exception reports showing critical violations of data/business rules. Assessed data quality and conformance issues between disparate source systems. Determined the cleansing strategy based on the data quality assessment and reporting solutions.
  • Analyzed and gathered functional and non-functional requirements. Conducted JAD sessions with business and IT teams to resolve complex business use cases. Reviewed the BRD for acceptance criteria and how they could be quantified for accurate SIT and User Acceptance Tests. Created FRD and Technical Specification documents.
  • Created Informatica mappings with SQL procedures/functions to build business rules to load data.
  • Used the techniques like Incremental aggregation, Incremental load and Constraint based loading for better performance.
  • Successfully migrated data between heterogeneous sources such as flat files, Excel, and SQL Server 2008 using Informatica, BCP, and Bulk Insert.
  • Created Reusable Transformations, Mapplets, Sessions and Worklets and made use of the Shared Folder concept using shortcuts wherever possible to avoid redundancy.
  • Created PL/SQL packages, triggers, and functions using various Oracle utilities.
  • Extensively used Ab Initio to load data from Oracle, MySQL, DB2, Excel sheets, and flat files to different target systems.
  • Designed, developed, tested, and documented the EOD process to load Oracle and DB2 from flat files using Ab Initio.
  • Designed, developed, tested, and documented Ab Initio batch/continuous graphs.
  • Used Ab Initio MFS features for parallel processing.
  • Used Ab Initio Conduct>It for process orchestration.
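The incremental-load technique noted above keeps a watermark of the last key or timestamp already loaded and extracts only newer rows on each run. This sketch uses a watermark file in shell rather than an Informatica mapping variable; the data and file names are illustrative:

```shell
#!/bin/sh
# Incremental-load sketch: keep a watermark of the highest key already
# loaded and extract only rows above it on each run. In Informatica the
# same idea is carried by a mapping variable; here it is a watermark
# file. Data and file names are illustrative.

cat > orders.csv <<'EOF'
101,widget
102,gadget
103,sprocket
EOF

WM_FILE=watermark.txt
[ -f "$WM_FILE" ] || echo 0 > "$WM_FILE"
last=$(cat "$WM_FILE")

# Extract only rows whose key (field 1) is greater than the watermark.
awk -F, -v wm="$last" '$1 > wm' orders.csv > delta.csv

# Advance the watermark to the highest key just extracted, if any.
new=$(awk -F, 'NR == 1 || $1 > m { m = $1 } END { print m + 0 }' delta.csv)
[ "$new" -gt "$last" ] && echo "$new" > "$WM_FILE"

wc -l < delta.csv
```

Re-running the script extracts nothing until new keys appear above the stored watermark, which is what makes the load incremental.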

Data Warehouse Developer

Confidential - Kenilworth, NJ

Responsibilities:

  • Business Requirements Analysis, System Requirement Analysis, Business Model Analysis.
  • Worked in a team creating technical design specifications for code segments and contributed to the technical design document for the entire project.
  • Examined pre-existing flat files and databases, and reviewed related business processes. Created a relational database application to better manage data and processes.
  • Imported flat-file data, transformed it into a relational structure, and loaded it for comparative reporting.
  • Developed, deployed, and monitored T-SQL queries.
  • Designed, developed, tested, and documented Ab Initio processes.
  • Used Ab Initio MFS features for parallel processing.
  • Used the database components of Ab Initio to load and unload data files from Sybase and Oracle databases.
  • Designed, developed, tested and documented staging architecture using Unix Shell scripts, AWK, SED.
  • Developed loading scripts using SQL*Loader.
  • Developed Unix Shell wrappers to automate various UNIX and PL/SQL processes.
  • Wrote shell scripts and used cron to schedule jobs.
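The shell-wrapper pattern above can be sketched minimally: each step is run through a function that logs start/finish with timestamps and stops the chain on a nonzero exit code. The wrapped commands here are hypothetical stand-ins for real sqlplus or SQL*Loader invocations:

```shell
#!/bin/sh
# Shell-wrapper sketch: run each step (in practice sqlplus, SQL*Loader,
# etc.), log it with a timestamp, and abort on a nonzero exit code.
# The wrapped commands below are hypothetical stand-ins.

LOG=etl_$(date +%Y%m%d).log

run_step() {
    step="$1"; shift
    echo "$(date '+%H:%M:%S') START $step" >> "$LOG"
    if "$@" >> "$LOG" 2>&1; then
        echo "$(date '+%H:%M:%S') OK $step" >> "$LOG"
    else
        rc=$?
        echo "$(date '+%H:%M:%S') FAIL $step rc=$rc" >> "$LOG"
        exit "$rc"
    fi
}

# Stand-ins for real steps such as:
#   run_step load_stage sqlldr control=stage.ctl
run_step extract true
run_step load true

echo "pipeline complete"
```

A wrapper like this is what a cron entry would invoke, so every scheduled run leaves a timestamped audit trail and failures surface through the exit code.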
