
Data Architect Resume


SUMMARY:

  • IT Professional with over twenty years of experience in large-scale applications in the Financial Services (Banking, Credit Card, Mortgage), Federal Agency, Telecommunications, Oil and Gas, Management Consulting, and Hospitality industries.
  • Experience in designing and implementing Data Strategy across the entire ‘Analytics Lifecycle’: acquiring data; storing and processing data; managing data quality; extracting insights from data; and leveraging those insights to enhance customer, business, and risk decisions.
  • Experience in developing and implementing lifecycle management for data received from international agencies under government-to-government treaties/agreements, e.g. Foreign Bank Account and other confidential financial data (used to monitor Money Laundering, Financial Crimes, etc.), and Foreign Military Sales data used to monitor arms proliferation and facilitate legitimate foreign military sales to US allies and partners.
  • Responsible for the design and development of Data Warehouses and Operational Data Stores (ODS) of up to 70 terabytes in size, with throughput volumes of up to hundreds of millions of transactions, including architecture, data modeling & database design (Conceptual, Logical, and Physical data models), and ETL process development. Specialized in the design and implementation of high-throughput, scalable ETL architecture focused on data integration, data management, data architecture, and data analysis.
  • Responsible for ensuring that data solution designs fulfilled functional and technical requirements and made consistent progress, by focusing on client value and satisfaction while balancing corporate objectives, feasibility, and scalability of solutions. Accountable for the accuracy, process, integrity, and quality of client solutions by communicating clearly and effectively with the development team, management, business users, other IT system owners, and DA/DBA groups, and by taking a “holistic” view of the problem. Worked with external auditors on SOX compliance audits of applications.
  • Planned and managed multimillion-dollar projects, aligning business goals with technology solutions to drive process improvements, competitive advantage, and bottom-line gains. Experience leading, planning, and supporting a functionally organized, technically oriented environment with staff working on a wide variety of technical activities. Expertise in forecasting, developing, and implementing organizational initiatives. Able to think strategically, set priorities, allocate resources, provide follow-through, ensure a well-organized workforce, and evaluate projects and efforts.

PROFESSIONAL EXPERIENCE:

Confidential

Data Architect

Responsibilities:

  • Enterprise-level data life-cycle management, which includes defining and enforcing the set of rules, policies, standards, and models that govern what data is collected and how it is used, stored, managed, and integrated within an enterprise or organization. The group is responsible for enterprise-level Data Stewardship, collaborating with data architects, application developers, ETL architects, business data owners, and others to uphold data consistency and data quality metrics.
  • Database Design (Conceptual, Logical, and Physical Data Models) for the pilot of a dynamic, rule-driven, and flexible Examiner Productivity Count system, which provides Counts and Credits (and eventual bonus dollars) to Examiners. This is a high-visibility effort, considering that Examiners comprise 75-80% of the PTO workforce (Examiners are responsible for examining and approving Patents). This “integrated” system aims to replace the current system, which is inefficient, relies on too much discretion, and has to depend on Count/Credit eligibility parameters from many systems.
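The core idea of a rule-driven count system can be sketched briefly. The rules, field names, and credit values below are entirely hypothetical illustrations (the real eligibility parameters lived in the data model, not application code); the point is that each count rule pairs an eligibility predicate with a credit value, so new rules become data changes rather than code changes:

```python
# Minimal sketch of a rule-driven count/credit engine.
# RULES is illustrative only: real rules would be rows in a rules table.
RULES = [
    {"name": "first_action", "credit": 1.25,
     "applies": lambda e: e["action"] == "first_office_action"},
    {"name": "final_action", "credit": 0.25,
     "applies": lambda e: e["action"] == "final_rejection"},
    {"name": "allowance", "credit": 0.50,
     "applies": lambda e: e["action"] == "allowance"},
]

def credits_for(events):
    """Total examiner credits for a list of examination events."""
    total = 0.0
    for event in events:
        for rule in RULES:
            if rule["applies"](event):
                total += rule["credit"]
    return total
```

Because eligibility lives in the `RULES` structure, adjusting a credit value or adding an event type does not touch `credits_for` itself.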

Environment: Oracle 11/12, MySQL, ER/Studio

Confidential

Data Architect

Responsibilities:

  • Developed and maintained a formal description of the data and data structures, including data definitions, data modeling (Conceptual, Logical, and Physical data models), data flow diagrams, metadata management, and business semantics, and defined data layers for enterprise-level reporting capabilities, in collaboration with Application Development Teams, Business Teams, and key stakeholders, to design a scalable and robust Enterprise Data Warehouse (EDW) and Operational Data Store (ODS)
  • Defined strategies to combine local EDW data structures (Data Marts) into enterprise-level data structures, making the “local” Data Marts in the EDW more “enterprise” (or conformed), so that information across Data Marts was linked and formed part of one overall “Enterprise Warehouse” solution, by taking a “holistic” view of the business, data, and reporting
  • Involved in defining a major Master Data Management solution for the "Property" data domain, covering anything Hilton considered "Property"-related worldwide, e.g. Hilton Resorts, Hotels, and similar assets.
  • Defined and implemented enterprise-level database/modeling standards that were used across all new Data Marts of the Hilton EDW and ODS systems
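A conformed dimension is what makes "local" Data Marts behave as one warehouse: multiple fact tables reference the same dimension rows through a shared surrogate key. The sketch below uses an illustrative schema (not Hilton's actual model) with SQLite standing in for Oracle:

```python
import sqlite3

# Two marts (bookings and revenue) share one conformed dim_property,
# so a report can join across marts on the same surrogate key.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_property (property_key INTEGER PRIMARY KEY, property_name TEXT);
CREATE TABLE fact_bookings (property_key INTEGER, bookings INTEGER);
CREATE TABLE fact_revenue  (property_key INTEGER, revenue  REAL);
INSERT INTO dim_property VALUES (1, 'Hotel A'), (2, 'Resort B');
INSERT INTO fact_bookings VALUES (1, 120), (2, 45);
INSERT INTO fact_revenue  VALUES (1, 54000.0), (2, 98000.0);
""")

# Cross-mart report made possible by the conformed dimension:
rows = con.execute("""
    SELECT d.property_name, b.bookings, r.revenue
    FROM dim_property d
    JOIN fact_bookings b ON b.property_key = d.property_key
    JOIN fact_revenue  r ON r.property_key = d.property_key
    ORDER BY d.property_key
""").fetchall()
```

Without the shared key, each mart would carry its own property list and cross-mart reporting would require fuzzy matching on names.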

Environment: Oracle 11, HP-UX, Erwin, Informatica

Data Architect/Sr. Data Warehousing Architect

Confidential

Responsibilities:

  • As a Data Architect at the Defense Technical Information Center (DTIC), an agency of the Department of Defense (DOD), responsible for developing and maintaining a formal description of the data and data structures, including data definitions, data modeling (logical and physical), data flow diagrams, metadata management, business semantics, and the definition of data layers for local and enterprise-level reporting capabilities. Defined strategies to combine local data structures into enterprise-level data structures.
  • Involved in the design of the Data Warehouse (Logical and Physical) for DoD’s Foreign Military Sales initiative.
  • Involved in defining the Future State Architectural framework for the Navy’s Blood Borne Infectious Management Center’s eAccess system, which handled sending military members’ specimens to the labs and collecting and processing lab results for blood-borne infectious diseases (HIV, Hepatitis, etc.)
  • Defined the Future State Architectural framework for the Department of Energy (DOE) Enterprise Data Warehouse, which involved a holistic analysis of the current “as-is” EDW implementation, performing a gap analysis, and providing the artifacts for a “future state” framework offering a more efficient, scalable, optimized, and lower-maintenance solution.
  • OWB (Oracle Warehouse Builder) was the ETL tool of choice to extract, transform, enrich, cleanse, and load data from legacy and COTS systems into the Enterprise Data Warehouse (EDW)

Environment: Oracle, SQL Server, Teradata, Netezza, OWB, MS Azure, HP-UX, Linux, Solaris, Erwin, ER Studio

Confidential

Specialist Senior

Responsibilities:

  • Led the Operational Data Store (ODS) and Data Warehouse initiative from planning to implementation through iterative strategy development and tactical execution of objectives. This included data capture (structured and unstructured), storage, analytics, reporting, and dissemination of Bank Secrecy Act (BSA) data from financial institutions
  • Defined the Conceptual, Logical, and Physical data models of 2 gateway systems handling tens of millions of transactions every month to capture Bank Secrecy Act (BSA) data from financial institutions all over the nation, including banks, credit unions, casinos, money transfer agencies, and others. These were complex filings on possible suspicious financial activities.
  • Designed the Conceptual, Logical, and Physical Data Models of a “Unified Staging” environment: a highly optimized data structure of around 70-80 tables (an Operational Data Store) handling hundreds of millions of transactions, which brought uniformity to both the data structure (data repository) and the data itself across 2-3 data gateway systems. This Unified Staging store was used by 3 downstream reporting systems (or task orders), which either pulled data from it into separate reporting Data Marts or used its existing data structure directly for their data needs.
  • Designed a major Data Warehouse to report on data quality checks, track the history of filing patterns of BSA filers, and provide canned reports on data filings and data patterns. Designed another Data Warehouse to report on Query Audit Logs (activity around BSA data) and to network the BSA user community (Law Enforcement agencies) that may be looking for the same type of BSA information, based on BSA data search patterns.
  • Designed an “ETL Execution Workflow Architecture” to handle files arriving in many different formats from multiple systems: sanity-checking files against pre-defined dynamic configuration, managing the ETL processes that load and process files through the ETL engine depending on the quality of the data received and the sanity-check results, and providing a failover and recovery mechanism to handle files in an automated fashion
  • Responsible for authoring the Detailed Design Specifications and other PMO defined artifacts
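The configuration-driven sanity-check step in that workflow can be sketched as follows. The file layout, source name, and rules are hypothetical (the real system ran on DataStage/Informatica, not Python); the idea is that each source's expectations live in configuration, and a file either passes cleanly to the ETL engine or is quarantined with recorded reasons for automated recovery:

```python
import csv
import io

# Per-source configuration: delimiter, expected column count, and which
# column positions must be non-empty. Entirely illustrative.
CONFIG = {
    "bsa_ctr": {"delimiter": "|", "columns": 4, "required": {0, 1}},
}

def sanity_check(source, text):
    """Validate a file's rows against the source's configuration.

    Returns a list of (line_number, reason) tuples; an empty list
    means the file may be handed to the ETL engine, otherwise it
    is quarantined for recovery/reprocessing.
    """
    cfg = CONFIG[source]
    errors = []
    for lineno, row in enumerate(
            csv.reader(io.StringIO(text), delimiter=cfg["delimiter"]), 1):
        if len(row) != cfg["columns"]:
            errors.append((lineno, "column count"))
            continue
        if any(not row[i].strip() for i in cfg["required"]):
            errors.append((lineno, "missing required field"))
    return errors

good = "F123|2009-01-15|CTR|45000\nF124|2009-01-16|CTR|9100"
bad = "F125||CTR|700"
```

Adding a new gateway format then means adding a `CONFIG` entry rather than new validation code.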

Environment: Oracle, Sybase, Autonomy, Linux, Solaris, Erwin, IBM InfoSphere (DataStage), Informatica, Crystal Reports, SAS

Confidential, NJ

ETL Architect

Responsibilities:

  • Provided a scalable Data & ETL framework for new high-volume, complex applications across Confidential, defining the overall ETL & Data architecture, including key decisions such as loading, real-time/batch processing, data validation, modularity, re-usability, and parallelization, to ensure that ETL applications using relational databases (Oracle/Sybase) and/or the Ab Initio ETL tool were re-usable, portable, scalable, and low-maintenance.
  • Provided a Capacity Enhancement Architecture framework for older applications, to ensure they could handle more volume than they were originally designed for, without changing their business logic
  • Provided ad-hoc and on-demand support for any ETL architecture related issues to application teams.

Environment: Oracle, SQL Server, Ab Initio GDE 1.14/1.13, Co>Operating 2.14/2.13, EME, HP-UX, Sun Solaris, AIX

Confidential, VA

Sr. ETL Specialist

Responsibilities:

  • Analyzed, designed, developed, implemented, and supported ETL processes to create source/target system mappings of data from different applications into the Confidential Impairments ODS, applying complex sets of business rules in these processes (using the Ab Initio ETL tool in an Oracle database environment), and worked closely with business analysts and upstream and downstream application development groups to ensure clean implementation of ETL requirements.
  • Designed, developed, and tuned complex Ab Initio graphs in Oracle database environment, for data interfacing and ETL processes, using advanced Ab Initio features.
  • Developed ETL process strategies, and worked closely with business and data validation groups to provide assistance and guidance for system analysis, data integrity analysis, and data validation activities.
  • Performance tuning of Ab Initio and Oracle ETL processes, and code reviews for applying best coding standard practices

Environment: Oracle 10g, Ab Initio GDE 1.13, Co>Operating 2.13, EME, HP-UX, Sun Solaris

Confidential, VA

Sr. Consultant

Responsibilities:

  • Involved in Capacity Enhancement and Process Tuning of Operational Data Store (ODS) and Data Warehousing interface processes written with Ab Initio, Unix, and Oracle tools, for an application originally developed 5 years earlier to serve roughly one-third of the current 45 million customers. The re-design took a “holistic” approach to the ETL processes, fine-tuning (and in many cases re-designing) their components to make the ETL engine more efficient for an application interfacing with 15 other systems handling millions of transactions.
  • Responsible for the “Detailed Design” and development of the “Over-limit Charges” and “Adverse Credit Action” modules of the existing system, which involved attending Design and Integration workshops in the capacity of system SME and writing Detailed Design artifacts with end-to-end implementation details of these modules.
  • Worked as the application SME with “external” auditors on a SOX compliance audit of the application, providing database and server access procedures, file transfer and handling procedures, and security procedures.

Environment: Oracle 9i, Ab Initio GDE 1.13, Co>Operating 2.13, EME, HP server, HP-UX, Perl, Toad

Confidential, VA

Sr. Consultant

Responsibilities:

  • Designed and developed ETL processes using Oracle tools to pull data from external Confidential systems, i.e. Ensamble (billing system for billing/adjustment data), the PeopleSoft AM module, and customer usage and tax data, and load it into the Confidential Data Warehousing schema for business intelligence reporting. These reports saved Confidential tens of millions of dollars through better strategic planning around tax, depreciation, and related areas. Confidential’s data warehousing schema was one of the largest in the nation, around 40 terabytes in size, and with some processes dealing with 700-800 million transactions, this system required a high level of business and user interaction.
  • Designed and developed Star Schemas with static and slowly changing dimensions and different fact tables, ensuring that the data in the DW repositories stayed in sync and that data integrity could be maintained as the source data changed.
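Keeping warehouse data consistent as source data changes is the classic slowly-changing-dimension problem. The sketch below shows a Type-2 approach with a generic, illustrative schema (SQLite standing in for Oracle): when a tracked attribute changes, the current row is expired and a new version inserted, so facts loaded earlier keep pointing at the attribute values that were in effect at load time:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key referenced by facts
    customer_id  TEXT,                 -- natural/business key
    region       TEXT,                 -- tracked attribute
    valid_from   TEXT,
    valid_to     TEXT,                 -- NULL = still current
    is_current   INTEGER)""")

def apply_change(con, customer_id, region, as_of):
    """Type-2 upsert: expire the current version on change, insert a new one."""
    cur = con.execute(
        "SELECT customer_key, region FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if cur and cur[1] == region:
        return  # attribute unchanged, nothing to do
    if cur:  # expire the current version
        con.execute("UPDATE dim_customer SET valid_to = ?, is_current = 0 "
                    "WHERE customer_key = ?", (as_of, cur[0]))
    con.execute("INSERT INTO dim_customer "
                "(customer_id, region, valid_from, valid_to, is_current) "
                "VALUES (?, ?, ?, NULL, 1)", (customer_id, region, as_of))

apply_change(con, "C-100", "Northeast", "2004-01-01")
apply_change(con, "C-100", "Southwest", "2004-07-01")  # region change
versions = con.execute(
    "SELECT region, is_current FROM dim_customer "
    "WHERE customer_id = 'C-100' ORDER BY customer_key").fetchall()
```

Each fact row stores the surrogate `customer_key` of the version current at load time, which is what preserves historical integrity when dimensions change.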

Environment: Oracle 9i, Data Warehousing, Star Schemas, Essbase, Hyperion, PL/SQL, SQL*Plus, SQL*Loader, HP-UX

Confidential, Houston, TX

Technical Analyst

Responsibilities:

  • Shared expertise in EAI architectural design and strategy, including performance and configuration, and infrastructure design. Designed and developed real-time and batch interfaces, to integrate BMC corporate customer and product data with Siebel (Siebel interface & Siebel EIM processes), Peoplesoft-Vantive, and Oracle Financials applications.
  • Integrated the Customer and Dun and Bradstreet (D&B) modules of the Siebel application using matched D&B data. Designed interfaces to send Siebel “customer” data to D&B for matching and to integrate the matched D&B data into the Siebel Customer repository. BMC used D&B data to identify duplicates in customer data and for data cleansing.

Environment: Oracle 9i/8i, SQL Server, Siebel 7.x/6.x (Customer Module, Product Module, D&B Module, Siebel Interface tables, Siebel EIM processes), HP-UX, Dun and Bradstreet (D&B), Erwin, Java

Confidential, NJ

Sr. Programmer Analyst

Responsibilities:

  • Coded the main report-processing back-end engine of this project using Oracle tools (PL/SQL), Unix, C, and Pro*C
  • Designed report tables to store temporary report-processing data, used by the back-end report-processing program.

Confidential, Houston, TX

Programmer Analyst

Responsibilities:

  • Coded, implemented, and performance tuned data migration processes using Oracle PL/SQL, Unix/C/Pro*C/SQL*Loader
  • Designed/Coded/implemented automatic error handling and recovery processes for table data loads
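The error-handling pattern here can be sketched in a few lines. The record layout is hypothetical; the approach mirrors SQL*Loader's bad-file behavior: rows that fail validation are routed to a reject list with a reason instead of aborting the batch, so the load completes and failures can be corrected and reprocessed:

```python
def load_rows(rows):
    """Load (account, amount) records, routing bad rows to a reject list."""
    loaded, rejected = [], []
    for n, row in enumerate(rows, 1):
        try:
            acct, amount = row
            loaded.append((acct, float(amount)))   # type conversion may raise
        except (ValueError, TypeError) as exc:
            rejected.append((n, row, str(exc)))    # record line no. and reason
    return loaded, rejected

loaded, rejected = load_rows([("A1", "10.5"), ("A2", "oops"), ("A3", "7")])
```

The reject list is the recovery hook: a retry pass can re-run `load_rows` over corrected rejects without reloading the rows that already succeeded.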

Confidential

Programmer Analyst

Responsibilities:

  • The Documentation and Revenue System (DRS) was a general liner shipping package supporting a wide range of activities performed by liner shipping agents worldwide. Developed programs using Unix, C, and Oracle tools for the Container Tracking module.
  • The project had 3000+ programs and 200+ technical people, with strict coding standards for efficiency & reusability.
