
Data Warehouse Architect Resume


SUMMARY:

  • Over 25 years of experience working in fast-paced start-up and software environments demanding strong organizational, analytical, and time management skills, as well as exceptional development and problem-solving skills.
  • Extensive background in data architecture design, metadata definition, and data migration and integration, with emphasis on both high-end OLTP and Business Intelligence solutions.
  • Special skills in efficient and effective data storage designs (data warehouses, data marts, ODS, and database platforms) with data quality (profiling and cleansing) considerations.
  • Quickly grasps complex concepts and translates business ideas into strategy.
  • Over 20 years of CASE-based database design and success implementing data management standards, company standards, and data/business process analysis.

AREAS OF EXPERTISE:

PostgreSQL 9.2, Greenplum 4.2, Data Modeling, Jasper Reports, PL/pgSQL, Dimensional Modeling, Oracle 10.2.3, Perl, Java, Big Data, Web Services, Pentaho, Queuing, Talend, R/SPSS, Data Mining, Mobile Web, Metadata

PROFESSIONAL EXPERIENCE:

Data Warehouse Architect

Confidential

Responsibilities:

  • Implemented Greenplum 4.2 readable and writable external tables for Confidential (see the sketch after this list)
  • Conducted Greenplum Architecture and Code Review
  • Optimized Greenplum Functions called by Pentaho
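
A minimal sketch of the readable/writable external table pattern referenced above, assuming a gpfdist server and hypothetical host, table, and column names:

    -- Hypothetical Greenplum 4.2 external tables; host, port, paths, and columns are assumptions.
    -- Readable external table: parallel load of pipe-delimited files via gpfdist.
    CREATE EXTERNAL TABLE stg_orders_ext (
        order_id   bigint,
        order_date date,
        amount     numeric(12,2)
    )
    LOCATION ('gpfdist://etl-host:8081/orders/*.dat')
    FORMAT 'TEXT' (DELIMITER '|' NULL '')
    SEGMENT REJECT LIMIT 100 ROWS;

    -- Writable external table: parallel unload of transformed rows back out through gpfdist.
    CREATE WRITABLE EXTERNAL TABLE out_orders_ext (
        order_id   bigint,
        order_date date,
        amount     numeric(12,2)
    )
    LOCATION ('gpfdist://etl-host:8081/out/orders.dat')
    FORMAT 'TEXT' (DELIMITER '|')
    DISTRIBUTED BY (order_id);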

Data Warehouse Architect

Confidential

Responsibilities:

  • Implemented Greenplum 4.2 readable and writable external tables for Confidential
  • Developed Prototype to compare Greenplum to Vertica
  • Developed several complex Greenplum data transformation and loading functions (a simplified sketch follows this list)
  • Developed Pentaho Spoon workflows that called Greenplum functions
  • Developed Dimensional Data Model - Logical and Physical - using Toad Data Modeler
  • Linux, Bash, Perl
  • Amazon Redshift and AWS
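
A simplified, hypothetical sketch of the kind of transformation-and-load function the Pentaho Spoon workflows would call; the schema, table, and column names are assumptions:

    -- Hypothetical Greenplum PL/pgSQL load function of the kind invoked from Pentaho Spoon.
    -- Schema, table, and column names are illustrative; the real functions were more involved.
    CREATE OR REPLACE FUNCTION etl.load_fact_sales(p_batch_id bigint)
    RETURNS bigint AS $$
    DECLARE
        v_rows bigint;
    BEGIN
        INSERT INTO dw.fact_sales (date_key, product_key, store_key, sales_amount)
        SELECT d.date_key, p.product_key, s.store_key, stg.sales_amount
        FROM   stg.sales      stg
        JOIN   dw.dim_date    d ON d.calendar_date = stg.sale_date
        JOIN   dw.dim_product p ON p.product_code  = stg.product_code
        JOIN   dw.dim_store   s ON s.store_code    = stg.store_code
        WHERE  stg.batch_id = p_batch_id;

        GET DIAGNOSTICS v_rows = ROW_COUNT;
        RETURN v_rows;  -- row count returned so the calling Pentaho job can log it
    END;
    $$ LANGUAGE plpgsql;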

Data Warehouse Architect

Confidential

Responsibilities:

  • Implemented Greenplum 4.2 readable and writable external tables for Confidential
  • Developed logical and physical Greenplum Dimensional Data Warehouse Design
  • Defined Meta Data
  • Developed several complex Greenplum data transformation and loading functions (a dimension-load sketch follows this list)
  • Used pushdown optimization in Informatica to call Greenplum GPLoad functions
  • Developed Informatica SQL Transformations
  • Called Greenplum Business Rules, Data Rules and Transform Rules functions using Informatica Stored Procedure Transformation
  • Developed Dimensional Data Model - Logical and Physical - using Toad Data Modeler
  • Linux, Bash, Perl
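
One common pattern behind this kind of dimensional load is a Type 2 slowly changing dimension update; a hypothetical Greenplum SQL sketch with assumed table and column names:

    -- Hypothetical Type 2 slowly changing dimension load; tables and columns are assumptions.
    -- Step 1: expire current rows whose attributes changed in the incoming batch.
    UPDATE dw.dim_customer d
    SET    current_flag = 'N',
           effective_end_date = CURRENT_DATE
    FROM   stg.customer s
    WHERE  d.customer_id = s.customer_id
      AND  d.current_flag = 'Y'
      AND  (d.customer_name <> s.customer_name OR d.segment <> s.segment);

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dw.dim_customer
           (customer_id, customer_name, segment,
            effective_start_date, effective_end_date, current_flag)
    SELECT s.customer_id, s.customer_name, s.segment,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg.customer s
    LEFT JOIN dw.dim_customer d
           ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;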

Data Warehouse Architect/Developer

Confidential

Responsibilities:

  • Developed database code using SQL Server 2008 and Stored Procedures
  • Developed date range partition design and scripts for the SQL Server data source (see the sketch after this list)
  • Developed real-time summary tables using SQL Server indexed views (materialized views) and supporting indexes
  • Called SQL Server stored Procedures using Informatica Stored Procedure Transformation
  • Developed Source Data Model - Logical and Physical - using Erwin
  • Defined Meta Data
  • Linux, Bash
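
A minimal sketch of a SQL Server 2008 date range partition design of the kind described above; the boundary dates, filegroups, and table are illustrative assumptions:

    -- Hypothetical SQL Server 2008 date range partitioning; boundaries, filegroups,
    -- and table/column names are illustrative assumptions.
    CREATE PARTITION FUNCTION pf_order_month (date)
    AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-02-01', '2012-03-01');

    CREATE PARTITION SCHEME ps_order_month
    AS PARTITION pf_order_month ALL TO ([PRIMARY]);

    CREATE TABLE dbo.fact_orders (
        order_id   bigint        NOT NULL,
        order_date date          NOT NULL,
        amount     decimal(12,2) NOT NULL
    ) ON ps_order_month (order_date);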

Data Warehouse Architect/Developer

Confidential

Responsibilities:

  • Designed and developed a sophisticated high-tech parts tracking and financial system as an additional component to SAP.
  • Designed database objects using Erwin and developed database code in PL/pgSQL stored functions.

Data Warehouse Architect

Confidential

Responsibilities:

  • Developed database code using PostgreSQL 9.2 and PL/pgSQL.
  • Created stored procedures and functions (a simplified example follows this list)
  • Developed over 50 reports using Jasper Reports
  • Developed Pentaho workflows using Spoon that called PostgreSQL functions
  • Linux, Bash
  • Amazon AWS and Redshift
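
A simplified, hypothetical PL/pgSQL set-returning function of the kind a Jasper report or Pentaho step could select from; the schema, tables, and columns are assumptions:

    -- Hypothetical PostgreSQL 9.2 set-returning function backing a Jasper report.
    -- Schema, table, column, and parameter names are illustrative assumptions.
    CREATE OR REPLACE FUNCTION rpt.monthly_revenue(p_from date, p_to date)
    RETURNS TABLE (report_month date, total_revenue numeric) AS $$
    BEGIN
        RETURN QUERY
        SELECT date_trunc('month', o.order_date)::date AS report_month,
               sum(o.amount)                           AS total_revenue
        FROM   sales.orders o
        WHERE  o.order_date >= p_from
          AND  o.order_date <  p_to
        GROUP  BY 1
        ORDER  BY 1;
    END;
    $$ LANGUAGE plpgsql STABLE;

    -- The Jasper report's query then reduces to (JasperReports parameter syntax):
    -- SELECT * FROM rpt.monthly_revenue($P{from_date}, $P{to_date});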

Database Developer/Architect

Confidential

Responsibilities:

  • Developed distributed database code using PL/Proxy, PostgreSQL 9.0, and PL/pgSQL (a minimal PL/Proxy sketch follows this list).
  • Created Stored Procedures and Functions deployed on 32 PostgreSQL databases using Londiste Replication and executing thousands of transactions per second. Some of the tools used include PostgreSQL 9.0, PL/pgSQL, PL/Proxy and Londiste.
  • Sun Unix, KSH, Perl
  • Amazon AWS and Redshift
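
A minimal, hypothetical sketch of the PL/Proxy pattern used to fan calls out across the sharded PostgreSQL databases; the cluster, function, and column names are assumptions:

    -- Hypothetical PL/Proxy proxy function; cluster, function, and column names are
    -- illustrative. The proxy runs on the router database and forwards the call to a
    -- same-named PL/pgSQL function on the shard chosen by hashing the key.
    CREATE OR REPLACE FUNCTION get_account(IN i_account_id bigint,
                                           OUT account_id bigint,
                                           OUT balance numeric)
    RETURNS SETOF record AS $$
        CLUSTER 'accounts_cluster';
        RUN ON hashtext(i_account_id::text);
    $$ LANGUAGE plproxy;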

Database Architect/ETL Developer

Confidential

Responsibilities:

  • Developed ETL code using MySQL 5.1, Greenplum, and Pentaho.
  • Created Stored Procedures and Functions as well as dashboards and reports.

Database Architect

Confidential

Responsibilities:

  • Developed ETL code using Oracle 10.2.3 and PL/SQL stored procedures and functions to filter and migrate an entire operating company's Data Warehouse (a simplified load pattern is sketched after this list).
  • Over 200 tables and 500 gigabytes of data were migrated, including partitioned tables as large as 50 gigabytes.
  • Created stored procedures and functions as well as SQL scripts to filter and load this Data Warehouse.
  • Some of the tools used include Oracle 10.2.3, Toad, and Erwin
  • Linux, Bash
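
A simplified, hypothetical sketch of the filter-and-load pattern for migrating one partition of a large table with a direct-path insert; the schemas, table, partition, and filter column are assumptions:

    -- Hypothetical Oracle 10.2 PL/SQL filter-and-load for one partition of a large table.
    -- Schema, table, partition, and filter column names are illustrative assumptions.
    CREATE OR REPLACE PROCEDURE migrate_sales_partition(p_partition IN VARCHAR2) AS
    BEGIN
        EXECUTE IMMEDIATE
            'INSERT /*+ APPEND */ INTO dw_new.sales_fact
             SELECT *
             FROM   dw_old.sales_fact PARTITION ('
                 || DBMS_ASSERT.SIMPLE_SQL_NAME(p_partition) || ')
             WHERE  company_code = :cc'
            USING 'OPCO1';   -- keep only the operating company being migrated
        COMMIT;              -- a direct-path insert must be committed before the table is read again
    END migrate_sales_partition;
    /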

Database Architect/ETL Developer

Confidential

Responsibilities:

  • Developed ETL code using Java, Talend, and MySQL stored procedures to track page views for a large web company (a simplified counter pattern is sketched after this list).
  • The delivered code was very fast and scalable.
  • Created data models, ETL, stored procedures, and technical specifications and documentation.
  • Created conceptual, logical, and physical models for IT and business users.
  • Some of the tools used include Erwin, Talend, MySQL 5.1, Excel, and Visio.
  • Linux, Bash
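
A minimal, hypothetical MySQL 5.1 counter pattern for page-view tracking; the table and procedure names are assumptions:

    -- Hypothetical MySQL 5.1 page-view counter; table and procedure names are illustrative.
    -- The upsert keeps one row per page per day.
    CREATE TABLE page_view_daily (
        page_id   BIGINT UNSIGNED NOT NULL,
        view_date DATE            NOT NULL,
        views     BIGINT UNSIGNED NOT NULL,
        PRIMARY KEY (page_id, view_date)
    ) ENGINE=InnoDB;

    DELIMITER //
    CREATE PROCEDURE record_page_view(IN p_page_id BIGINT UNSIGNED)
    BEGIN
        INSERT INTO page_view_daily (page_id, view_date, views)
        VALUES (p_page_id, CURRENT_DATE, 1)
        ON DUPLICATE KEY UPDATE views = views + 1;
    END //
    DELIMITER ;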

Database Architect

Confidential

Responsibilities:

  • Worked as an ETL Data Architect - responsible for creating Enterprise Data Strategies and Data Management Strategies for the Confidential.
  • Conducted data mapping, cleansing and profiling for multiple Confidential wholesale bank applications.
  • Used custom SQL for data profiling on Teradata and SQL Server (a representative profiling query is sketched after this list).
  • Created long-term and short-term Data Architecture visions and tactical roadmaps for moving from the current state to the target Data Architecture.
  • Some of the tools used include Erwin, Toad SQL developer, Teradata, SQL Server 2008, Oracle, Talend and Visio.
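
A representative, hypothetical column-profiling query of the kind used here; the table and column names are assumptions, and the same shape runs on both Teradata and SQL Server:

    -- Hypothetical column-profiling query; table and column names are illustrative.
    -- Captures row count, null count, distinct count, and value range for one column.
    SELECT COUNT(*)                                  AS row_cnt,
           SUM(CASE WHEN account_open_date IS NULL
                    THEN 1 ELSE 0 END)               AS null_cnt,
           COUNT(DISTINCT account_open_date)         AS distinct_cnt,
           MIN(account_open_date)                    AS min_value,
           MAX(account_open_date)                    AS max_value
    FROM   wholesale.account;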

Data Architect/Engineer

Confidential, San Carlos, CA

Responsibilities:

  • Extended the Markdown Data Model using ER/Studio and developed DB2 functions and stored procedures.
  • Converted DB2 data and stored procedures to EnterpriseDB and PostgreSQL, and ran part of the Markdown application against the converted databases.
  • Researched Vertica as well.
  • Creation of Enterprise Data Warehouse reports using DB2 9.5 and DB2 Stored Procedures.
  • The Data Architecture technical vision was geared toward Java development and supported data distribution and data migration across various areas and platforms, including Linux, over 160 DB2 databases, a web farm, and various data marts. All systems were classified as high availability with 24/7 support.
  • Some of the tools used include Oracle Admin tools, ER Studio, Word, Toad SQL developer for DB2, Excel, Visio and Oracle 10.2.3.
  • Developed high-level and detailed design documents in a fast-paced, agile one-month release cycle.
  • Linux, KSH, Perl

Data Warehouse Architect

Confidential

Responsibilities:

  • Developed ETL Facts and Dimension transformation and loading code using PL/SQL packages.
  • Extended an existing star schema Data Warehouse.
  • Developed the logical and physical database design for the Bulk Vehicles vertical using ER/Studio.
  • Designed an Oracle function and procedure generator used to generate code for the Confidential.
  • Developed an automated code generator that dynamically generates a pin trigger for each instance in the RAC cluster and pins all application procedures, functions, and packages (a simplified pin trigger is sketched after this list).
  • Developed log parsing tracking application using Perl and MySQL.
  • Creation of Enterprise Data Warehouse reports using Oracle 10.2.3 and Oracle Stored Procedures.
  • Some of the tools used include Oracle Admin tools, ER Studio, Oracle Materialized Views, Toad SQL developer for Oracle, Oracle RAC, Excel, Visio and Oracle 10.2.3.
  • Linux, KSH
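
A simplified, hypothetical version of the generated pin trigger described above, using Oracle's DBMS_SHARED_POOL package; the object names are assumptions:

    -- Hypothetical startup pin trigger of the kind the generator produced for each RAC
    -- instance; the package/function names being pinned are illustrative assumptions.
    CREATE OR REPLACE TRIGGER pin_app_code
    AFTER STARTUP ON DATABASE
    BEGIN
        -- flag 'P' keeps a package, procedure, or function in the shared pool
        DBMS_SHARED_POOL.KEEP('APP_OWNER.PKG_FACT_LOAD',    'P');
        DBMS_SHARED_POOL.KEEP('APP_OWNER.PKG_DIM_LOAD',     'P');
        DBMS_SHARED_POOL.KEEP('APP_OWNER.FN_SURROGATE_KEY', 'P');
    END;
    /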

Data Warehouse Architect

Confidential, HI

Responsibilities:

  • Developed all of the ETL transformation code, in PL/SQL, to build an Oracle Warehouse Builder analytical Data Warehouse.
  • Defined the transformation rules, aggregations, and cleansing rules (a representative cleansing step is sketched after this list).
  • Used Erwin to develop the logical and physical Data Model for the JMEWS Application, which tracked injured soldiers in Iraq in real time.
  • Played a key role in architecting and delivering a multi-tier Solaris EMC CX300 SAN solution.
  • This solution included built-in fail-over for the database server, map server, and application server. Advanced performance features included multi-path I/O, asynchronous I/O, raw partitions, and database code pinning.
  • Creation of Enterprise Data Warehouse reports using Oracle 10.2.3 and OWB.
  • Duties and responsibilities included developing data models and leading the design and development of databases, data marts, data warehouses, and ODS data storage areas.
  • Some of the tools used include Oracle Admin tools, Erwin, Word, Toad SQL developer for Oracle, OWB, Visio and Oracle 10.2.3.
  • Sun Solaris (Unix), Bash, Perl
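
A representative, hypothetical cleansing transformation of the kind behind the OWB mapping rules; the sequence, tables, and columns are assumptions:

    -- Hypothetical cleansing/standardization step of the kind defined in the OWB
    -- transformation rules; sequence, table, and column names are illustrative assumptions.
    INSERT INTO dw.patient_dim (patient_key, patient_name, gender_code, admit_date)
    SELECT dw.patient_seq.NEXTVAL,
           INITCAP(TRIM(src.patient_name)),        -- standardize casing and whitespace
           CASE UPPER(TRIM(src.gender))
                WHEN 'M'      THEN 'M'
                WHEN 'MALE'   THEN 'M'
                WHEN 'F'      THEN 'F'
                WHEN 'FEMALE' THEN 'F'
                ELSE 'U'                           -- unknown / unmapped values
           END,
           NVL(src.admit_date, DATE '1900-01-01')  -- default for missing dates
    FROM   stg.patient src;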

Database Consultant/Architect

Confidential

Responsibilities:

  • Worked as a DBA and Database consultant/Data Modeler/Architect using Erwin to develop conceptual, logical, and business process models.
  • Designed and developed Java stored procedures to do XML parsing and send XML messages using Oracle Advanced Queuing (a minimal enqueue sketch follows this list).
  • This system supports XML Document exchange of diverse documents up to 300K in size.
  • All packages are highly optimized and take advantage of bulk operations, array processing and SAX Parsing.
  • This system exchanges XML Documents throughout Asia, Europe and Latin America.
  • Played a key role in reverse engineering and cleaning up the logical and physical Data Model.
  • Developed key Materialized Views, Stored Packages and Data clustering which dramatically improved customer performance of the ServSmart Service Contract Application Software.
  • Developed complex Materialized Views that provide near real time analytics.
  • Some of the tools used include Oracle 9.1.6, Erwin, Oracle AQ, Oracle XML and Materialized Views.
  • Linux, KSH, Perl
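
A minimal, hypothetical sketch of enqueuing an XML document with Oracle Advanced Queuing from PL/SQL (the Java stored procedures drove the same queues); the queue name and payload are assumptions, and the queue table is assumed to have an XMLType payload:

    -- Hypothetical Oracle AQ enqueue of an XML document; queue name and payload are
    -- illustrative, and the queue table is assumed to be defined with an XMLType payload.
    DECLARE
        v_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
        v_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
        v_msg_id    RAW(16);
        v_payload   SYS.XMLType;
    BEGIN
        v_payload := SYS.XMLType('<order><id>123</id><amount>99.50</amount></order>');

        DBMS_AQ.ENQUEUE(
            queue_name         => 'XML_EXCHANGE.DOC_OUT_QUEUE',
            enqueue_options    => v_enq_opts,
            message_properties => v_msg_props,
            payload            => v_payload,
            msgid              => v_msg_id);
        COMMIT;
    END;
    /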

Founding Data Architect

Confidential

Responsibilities:

  • Implemented a stored procedure Data Service API for key airline search queries (a simplified finder is sketched after this list).
  • Designed the Confidential 2.0 Data Architecture using JDOs, object caching, Oracle Advanced Queuing, and Oracle Object Types; managed a team of five Java engineers.
  • The application uses a stored Java API for all application finder methods.
  • Architected, coded, and tested a highly scalable Oracle AQ application that tracks partner referral purchases.
  • This application is capable of propagating 200 transactions per second per queue.
  • Responsibilities included developing data models in Erwin, developing complex stored procedures returning ref cursors and developing O/R Mapping code in Toplink.
  • Designed, coded and delivered a reporting solution using Oracle AQ, Crystal reports and Oracle PL/SQL.
  • Some of the tools used included Erwin, Oracle tools, Crystal Reports, Visio, Word, Excel and Microsoft Project.
  • Sun Solaris (Unix), KSH, Perl
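
A simplified, hypothetical finder from the kind of ref-cursor Data Service API described above; the package, table, and column names are assumptions:

    -- Hypothetical ref-cursor finder of the kind exposed to the Java/Toplink layer;
    -- package, table, and column names are illustrative assumptions.
    CREATE OR REPLACE PACKAGE search_api AS
        FUNCTION find_flights(p_origin IN VARCHAR2,
                              p_dest   IN VARCHAR2,
                              p_date   IN DATE) RETURN SYS_REFCURSOR;
    END search_api;
    /
    CREATE OR REPLACE PACKAGE BODY search_api AS
        FUNCTION find_flights(p_origin IN VARCHAR2,
                              p_dest   IN VARCHAR2,
                              p_date   IN DATE) RETURN SYS_REFCURSOR IS
            v_rc SYS_REFCURSOR;
        BEGIN
            OPEN v_rc FOR
                SELECT flight_no, depart_time, fare
                FROM   flights
                WHERE  origin      = p_origin
                AND    destination = p_dest
                AND    depart_date = p_date
                ORDER  BY fare;
            RETURN v_rc;
        END find_flights;
    END search_api;
    /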

Founding Data Architect

Confidential

Responsibilities:

  • Designed, developed, and helped implement the live365 Web site.
  • This site provides free MP3 streaming for mostly alternative music.
  • The first release of the site was developed in less than two months.
  • Performed all of the logical and physical database design and developed the music directory.
  • The first release uses Oracle 8, Perl DBI and Apache.
  • This release can support up to 10,000 concurrent listeners.
  • Database logical and physical design.
  • Some of the tools used included Erwin, Perl DBI, Java, PL/SQL, Word and Excel.
  • Sun Solaris (Unix), Bash, Perl

Founding Data Architect

Confidential

Responsibilities:

  • Researched and designed the logical and physical data architecture for Confidential.
  • Was primarily responsible for the use of Oracle 8, Tuxedo, and Oracle Parallel Server.
  • These technologies provided unparalleled scalability and availability.
  • These component choices were very rare and radical for the Internet space.
  • Was also fully responsible for all of the logical and physical database design.
  • The data architecture uses both vertical and horizontal data partitioning for scalability.
  • Data-dependent routing of Tuxedo/PRO*C transactions is used to increase the performance of Oracle Parallel Server. CASE tools were used to develop the data model.
  • Referential integrity constraints are enforced within each Oracle schema.
  • In addition to these duties, designed and coded the search backend for Confidential; Tuxedo, Verity, and C were used to develop the search service.
  • Confidential is processing 3 billion messages per month, which represents about 2 percent of all email traffic on the Internet.
  • Developed Verity/Tuxedo C search code.
  • Installed and configured Oracle Parallel Server version 8.0 on a 2 node Sun Cluster.
  • Some of the tools used included Erwin, Verity, Oracle Parallel Server, Tuxedo, and C.

Oracle Applications Architect

Confidential

Responsibilities:

  • I assisted multiple Fortune 500 companies with Oracle 10.7 Applications architecture issues.
  • These customers were running Oracle Applications with concurrent users on multi-domain Sun E10000 servers.
  • I conducted application design reviews.
  • I participated in the development of an Oracle 11 benchmark kit for the Java-enabled Oracle Applications for the web.
  • I wrote two white papers.
  • I worked on developing a data mining benchmark for Oracle Manufacturing.
  • I have extensive experience developing and configuring Oracle 10.7 and Oracle 11 code for all of the major financial and manufacturing modules.
  • I also have experience configuring concurrent manager processes.
  • Conducted Oracle Applications architecture reviews for customers in Singapore and South Korea.
  • Some of the tools used were Oracle Applications, Sun Solaris and Power Point.

Senior Performance Engineer

Confidential

Responsibilities:

  • The first benchmark was a modified TPC-D scale factor 1000 benchmark.
  • The cardinalities were modified to model Confidential 's actual Call Database.
  • Oracle and Pyramid won this benchmark.
  • The second benchmark was conducted on two 64 CPU Origin 2000 Servers using Cellular Irix.
  • The application was a real-time multi-dimensional satellite tracking system.
  • The workload was a combination of real-time inserts and long running multi-dimensional queries.
  • Oracle Spatial Data Option was used for this benchmark. Tuxedo was used for data-dependent routing and transaction routing.
  • Some of the tools used were Sun Solaris (Unix), Oracle Parallel Server, Tuxedo, PL/SQL, and C.
