Data Warehouse Architect Resume Profile
NJ
Experience Summary
I am a Technical/Project Manager, Data Solutions Architect, Data Warehouse Architect, Data Architect, Data Analyst, Business Analyst, ETL Developer, and Data Modeler for Oracle, Teradata, SQL Server, and Sybase platforms. I am also proficient in the following languages: Java, C, C++, KSH, Perl, and VBScript. Working platforms include Solaris, AIX, Linux, and Windows.
Technical Summary
- Languages: Java, C, Perl, VBScript, KSH/SED/AWK, PL/SQL, T-SQL, JCL
- O/S: Solaris, AIX, Linux, Windows, Z/OS
- Technologies: IBM InfoSphere Optim TDM 8.1, MS SSIS, Access, Erwin, Power Designer, IBM TWS, FrontRange ITSM, HP QC (Quality Center) 10, HP QTP (QuickTest Professional) 10
- Databases: DB2, Teradata, Oracle, SQL Server, Sybase
- Hardware: Sun, IBM
Project Experience
CONFIDENTIAL
- Developing a TDM (Test Data Management) framework to facilitate automated test data provisioning activities across QA and Development platforms (Oracle, Sybase, SQL Server, and DB2) using Java (NetBeans 8 framework), KSH, and T-SQL.
- Designed and implemented a non-deterministic masking algorithm, consistent across all platforms, to obfuscate unsecured production data exposed in development and test platforms.
- Implementing a Rules-Based Engine to support the following:
- Intelligent non-interactive activities and processes (e.g., unattended data-securing masking)
- Dynamic code generation that includes masking logic specific to each database platform (a simplified sketch follows this project's bullets)
- Scripts and wrappers for various levels of masking and provisioning activities
- Database functions, where supported, that include masking algorithms
- Stored procedures that drive the masking process (update-in-place)
- Optional method: scripts external to the target database, in case the database version does not support functions or the target database is not available for updates
- The provisioning framework consists of a dynamic Self-Learning Engine (SLE) with the capability to learn from various provisioning actions and events and subsequently utilize that knowledge to determine optimal actions.
- A Central Metadata Repository (CMR) is included as the heart of the TDM framework to collect the following:
- All database-related attributes such as DDLs, including columns, indexes, constraints, etc.
- All provisioning activity statistics, including a log of each, to be utilized for audit purposes as well as analytics/reporting.
- Metadata deltas as database changes occur, including DDL changes potentially affecting provisioning activities; implements Dynamic Adaptation (DA) of changes to dynamically prevent process failures, with self-correction policies implemented using the Rules-Based Engine.
- Process- and activity-related data, including latency and event-based exceptions, to feed into the Rules-Based Engine and be utilized by the Decision-Making Engine (DME).
- Designing interfaces between various data feeds and target databases to transform data into masked form.
- Producing a TDM framework with a centralized and integrated database platform to provide data provisioning service on demand through a self-service interface.
- Developed TDM strategy/documentation for Deutsche Bank, including features such as automation and integration of various data feeds from source to target platforms, as well as functionality such as data masking, sub-setting, and data generation.
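A minimal sketch of the dynamic masking-code generation referenced above, in KSH, assuming a hypothetical pipe-delimited rules file (table|column|mask_type), Oracle as the target, and an ORACLE_CONN connect-string variable; the actual engine covered all four platforms and richer rule types:

    #!/bin/ksh
    # generate_mask.ksh - emit Oracle UPDATE statements from a masking rules file.
    # Hypothetical rules file format: table|column|mask_type
    RULES=${1:?usage: generate_mask.ksh rules.txt}

    while IFS='|' read tbl col mask; do
        case $mask in
            hash)    expr="TO_CHAR(ORA_HASH(${col}))" ;;          # non-reversible hash
            redact)  expr="'XXXXXXXX'" ;;                         # fixed-value redaction
            randnum) expr="TRUNC(DBMS_RANDOM.VALUE(1, 1E9))" ;;   # random numeric
            *)       print -u2 "unknown mask type: $mask"; continue ;;
        esac
        print "UPDATE ${tbl} SET ${col} = ${expr};"
    done < "$RULES" > mask_$$.sql
    print "COMMIT;" >> mask_$$.sql

    # Update-in-place: apply the generated statements through SQL*Plus
    sqlplus -s "$ORACLE_CONN" @mask_$$.sql

In the same pattern, per-platform generators would emit the corresponding T-SQL or DB2 dialect from the same rules file.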
CONFIDENTIAL TDM/Data Solutions Architect
Architecture
- Developed award-winning data-related architecture/solutions (1st prize, 2013 WIPRO Innovations).
- Responsible for assessment of business requirements for multiple divisions/lines of business across the entire firm.
- Acquired technical specifications, including process flow documents, data models, etc.
- Responsible for driving various discovery and requirement-gathering sessions with SMEs and business stakeholders.
- Performed detailed analysis based on business/data requirements; produced analysis, assessment, and solutions documents covering strategy, architecture, time, cost, and resource allocation.
- Developed tools/applications across multiple platforms (Z/OS-DB2-COBOL/JCL, Linux-Oracle/Teradata-KSH, Windows-SQL Server-PowerShell) supporting testing and implementation efforts.
- Worked with vendors such as IBM and TIBCO to achieve collaborative efforts for various solution implementations.
- Evaluated and implemented various technology solutions: 3rd-party and custom applications/tools.
- Architected/implemented an ETL process from mainframe DB2 to WebSphere MQ message queues and downstream to Oracle and SQL Server.
- Mined retail data for business analysis from DB2 and Teradata, feeding into a SAS system, utilizing SQL/KSH on the Linux platform.
- Supported retail audit functions within Z/OS-DB2 platforms.
- Implemented sub-setting, masking, and data-generation processes using Informatica PowerCenter 9.5 and ILM-TDM.
- Provided data masking for store credit card and customer identification columns within sales/transaction tables.
- Architected Gold Copy solutions for various projects.
- Produced detailed documents.
Custom Tools
- Designed/implemented an XML data-generation tool in C that fed sales data to a message queue, supporting performance testing.
- Designed an intelligent ETL tool for Teradata using CLI to replace its Data Mover utility.
- Designed a referential-constraints resolver used in ETL processes to resolve table parent-child relationships.
- Designed and implemented a custom process implementing Oracle Flashback for QA testing efforts (see the sketch after this list).
- Designed an automated SQL Server restoration process consisting of application-based services, mirroring, version syncing, and production backup images.
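A minimal sketch of the Oracle Flashback process referenced above, assuming flashback logging is enabled and SYSDBA access; the restore-point name is illustrative:

    #!/bin/ksh
    # qa_flashback.ksh {baseline|rewind} - reset a QA database between test cycles.
    case ${1:?usage: qa_flashback.ksh baseline|rewind} in
    baseline)
        # Capture the pre-test baseline once
        sqlplus -s "/ as sysdba" <<'EOF'
    CREATE RESTORE POINT qa_baseline GUARANTEE FLASHBACK DATABASE;
    EOF
        ;;
    rewind)
        # Rewind to the baseline after a destructive test run
        sqlplus -s "/ as sysdba" <<'EOF'
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    FLASHBACK DATABASE TO RESTORE POINT qa_baseline;
    ALTER DATABASE OPEN RESETLOGS;
    EOF
        ;;
    esac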
Testing
- Collaborated with IBM, Toshiba, and TIBCO on a large application implementation supporting Kohl's eCommerce business that included end-to-end order/transaction coverage and flow through settlements with banks, integrating Oracle ATG, IBM OMS, the GIV inventory system, the AJB settlement application, and First Data financial processing.
- Facilitated integration testing by generating XML-based transactions from ATG through OMS, through the Sales Hub, and on to First Data for payment processing.
- Managed various testing efforts with multiple teams, including functional, performance, regression, UAT, etc.
- Supported a large EDW (Kohl's Enterprise Data Warehouse) on the Linux/Teradata platform, including security implementation using Informatica ILM-TDM masking as well as Protegrity encryption and tokenization.
- Supported IBM OMS (Order Management System) for Kohl's in collaboration with the IBM team.
- Coordinated collection and establishment of a repository containing various artifacts for each project/application, including data models, data flow diagrams, process flow diagrams, etc.
- Collaborated with TIBCO on custom solutions over MDM product applications.
- Provided technical support including machine/database/server related issues.
Leadership
- Provided hands-on technical/relationship management and leadership for onshore (US) and offshore (India) teams.
- Developed team structure implementing a 5x24 support model that included dedicated leads/resources aligned with lines of business.
- Provided mentoring, guidance, and leadership on various technical engagements, including timelines and resource allocation along with architecture and solutions.
- Interacted with client management on various project-related requirements and negotiations in terms of resource, time, and cost allocation.
- Provided impact and risk analysis in terms of time and cost of deliverables.
- Interacted with various teams, including DBAs (DB2, Oracle, SQL Server), infrastructure teams (Windows, UNIX/Linux servers), hardware, and message queues, for access and acquisition/allocation-related issues.
- Provided guidance on achieving a strategic approach and architecture for various technical/data requirements.
- Collaborated with offshore management to negotiate resource allocations supporting 5x24 coverage that included analysis and development efforts.
- Set direction on tool selection such as Informatica PowerCenter/ILM, scripting languages such as REXX, JCL, PowerShell, and KSH, and high-level languages such as Java and C, based on specific requirements.
- Provided technology training to individuals, covering processes consisting of COBOL/JCL code, message queues, XML, Windows services, SQL Server backup/restore, and mirror/unmirror process/automation architecture.
- Provided guidance on various process-automation tools and applications such as XML generation tools, database (Oracle/SQL Server) refresh processes, etc.
CONFIDENTIAL Data Solutions Architect
Architecture/Tools
- Developed custom ETL data provisioning tool integrated with IBM Optim that performs key functions:
- Subsets data based on specific Selection Criteria
- Extracts data in specific order considering Parent-Child relationships
- Extracts data considering table characteristics: reference vs. transactions
- Leverages IBM Optim's multithreading feature for parallel extracts
- Utilizes Oracle Data Pump for parallel data import and performance optimization
- Developed an auto-generation tool for Oracle Data Pump export and import functions (a sketch follows this list).
- Developed a database-object comparison tool to be integrated with export/import processes.
- Developed an Oracle database/table properties (e.g., tablespace, partitions) discovery tool for integration with export/import processes.
- Implemented masking capabilities for PII data using IBM Optim.
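A minimal sketch of the Data Pump auto-generation tool mentioned above, assuming schema-level exports, a pre-created DATA_PUMP_DIR directory object, and an illustrative SYS_PWD password variable:

    #!/bin/ksh
    # gen_datapump.ksh - emit a matching expdp/impdp script for one schema.
    SCHEMA=${1:?usage: gen_datapump.ksh SCHEMA [parallel]}
    PAR=${2:-4}
    STAMP=$(date +%Y%m%d)

    cat <<EOF > dp_${SCHEMA}_${STAMP}.ksh
    # Parallel export: %U expands to one dump file per worker
    expdp system/\$SYS_PWD schemas=${SCHEMA} directory=DATA_PUMP_DIR \\
          dumpfile=${SCHEMA}_${STAMP}_%U.dmp logfile=${SCHEMA}_exp.log parallel=${PAR}

    # Parallel import into the target, remapped to a QA schema
    impdp system/\$SYS_PWD directory=DATA_PUMP_DIR \\
          dumpfile=${SCHEMA}_${STAMP}_%U.dmp logfile=${SCHEMA}_imp.log \\
          parallel=${PAR} remap_schema=${SCHEMA}:${SCHEMA}_QA table_exists_action=replace
    EOF
    chmod +x dp_${SCHEMA}_${STAMP}.ksh

Splitting dump files with %U is what lets the parallel workers read and write concurrently.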
Data Analysis/Mining/Masking
- Collaborated with business analysts and the QA team on various data requirements and selection criteria through interview and discovery sessions.
- Evaluate and implement data selection criteria, based on financial advisor, investor, portfolio, and fund specifications provided by the client, to be used for data mining and extraction from the enterprise data warehouse to produce replicated platforms for development, sales, demo, QA, and UAT.
- Perform obfuscation (masking/disguising real data) to hide proprietary and private client information using IBM InfoSphere Optim TDM.
- Produced documentation, including process and data flow diagrams, illustrating the hardware infrastructure and the processes that utilize it within and across network domains.
Infrastructure Preparation
- Coordinate efforts to establish the infrastructure necessary for the entire testing platform: source and target database connections (including updating TNS entries), database access (Oracle accounts), and schemas with the privileges required to use Oracle Data Pump for data extraction (a sketch of the required grants follows this list).
- Specify and acquire Linux accounts and privileges to read/write the Data Pump directory at the OS level, with the help of system administrators on both source and target server machines.
- Secure both Oracle and Linux ports to be available/open so that servers distributed across firewall-restricted networks can communicate.
- Assist in specifying criteria for WebLogic application server configuration, including the client interface URL.
- Manage Oracle database objects such as constraints, indexes, sequences, stored procedures, and packages, in addition to loading data, in the process of preparing a composite replicated platform.
- Work with DBA, Linux, and WebLogic administrators, along with release teams, to coordinate and synchronize Oracle, WebLogic, and Java code for each required release.
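A minimal sketch of the Data Pump access setup described above, assuming SYSDBA rights; the qa_user account and /u01/dpump path are illustrative:

    #!/bin/ksh
    # dp_access_setup.ksh - directory object and grants needed for Data Pump.
    sqlplus -s "/ as sysdba" <<'EOF'
    CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/dpump';
    GRANT READ, WRITE ON DIRECTORY dp_dir TO qa_user;
    -- Allow full exports/imports from this account
    GRANT DATAPUMP_EXP_FULL_DATABASE TO qa_user;
    GRANT DATAPUMP_IMP_FULL_DATABASE TO qa_user;
    EOF

    # Matching OS-level access for the Linux account that runs expdp/impdp
    chgrp dba /u01/dpump && chmod 770 /u01/dpump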
Platform Validation
- Validate mined data against the selection criteria (advisor, investor, and fund information) following the mining/extraction process, including reference and price tables.
- Verify obfuscated data for consistency, including PII (Personally Identifiable Information) data.
- Verify the application by connecting through its URL and running various reports.
- Validate the entire Oracle database for object validity, including stored procedures, packages, DB links, etc. (see the sketch after this list).
- Validate code releases in terms of SQL DDLs as well as Java code.
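A minimal sketch of the object-validity check above, assuming an ORACLE_CONN connect string with access to DBA views; APP_SCHEMA is illustrative:

    #!/bin/ksh
    # check_invalid.ksh - report and recompile invalid objects after a refresh.
    sqlplus -s "$ORACLE_CONN" <<'EOF'
    SET PAGESIZE 200 LINESIZE 120
    -- Anything left invalid by the import or code release
    SELECT owner, object_type, object_name
    FROM   dba_objects
    WHERE  status = 'INVALID'
    ORDER  BY owner, object_type;

    -- Attempt automatic recompilation of the application schema
    EXEC DBMS_UTILITY.COMPILE_SCHEMA(schema => 'APP_SCHEMA', compile_all => FALSE)
    EOF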
Troubleshooting
- Debug issues with the application and data by examining trace logs from the WebLogic application server/instance.
- Review Oracle objects, including stored procedures, to detect issues that prevent applications from functioning properly and/or cause failures in report generation.
CONFIDENTIAL
Data Analysis/QA
- Performed data analysis using Toad/SQL on the Linux/Oracle platform for data validation and quality testing/management of BOA compliance data used to publish government-mandated BOA holdings reports.
- Participated in testing and validation of vendor and processed data from the Informatica ETL process.
- Reviewed the existing data model and published data dictionary for greater understanding of Oracle table relationships, to specify object modifications in the technical requirements document for the technology team as well as to develop various SQL queries for the analysis process.
- Performed data validation and comparative analysis using various MicroStrategy reports against the source Oracle tables and views.
- Maintained various static data sets in Excel/CSV provided to the database team; also performed analysis using Excel functions such as VLOOKUP.
- Performed data profiling of BOA holdings data along with the holding companies and sources that provide the data.
- Worked with technology and testing teams to implement test/production data into the UAT platform for various validations and testing activities.
- Prepared test plan for disaster recovery exercise.
Business Analysis:
- Responsible for analysis and development of specifications supporting the sourcing of BOA position data, instrument-enrichment processes, decomposition of complex instrument types, and conversion to common-stock-equivalent value.
- Reviewed, organized, and maintained all existing documents, including business requirements and functional/technical specification documents.
- Developed various business/technical specifications based on business requirements specified by the BOA compliance group.
- Collaborated with the technology team to analyze various business/technical requirements and derive technical specification documents for development and implementation.
- Analyzed and developed technical specifications related to position-holding (institutional) managers for implementation in MicroStrategy based on specific business requirements.
- Utilized HP Quality Center to implement and maintain business and functional requirements.
CONFIDENTIAL Solutions Architect
Data Analytic Tools Architecture:
- Designed and developed a database-driven, rules-based application using VBScript and SQL on the Oracle 10g platform to identify Health Care Professionals and their payment history; rules included combinations of services provided, extracted from a Data Warehouse containing HCP data.
- Designed and developed the Vermont Price Disclosure Report application using VBScript and SQL on the Oracle 10g platform to generate Vermont state-mandated reports in Excel format detailing Pfizer drug pricing compared against competitor pricing.
- Performed analysis for migration of third-party product pricing data for reporting and automation, including all necessary data mapping.
- Developed the process for migrating product pricing data into Oracle using Informatica PowerCenter 9.
- Designed and developed a complete framework using HP Quality Center 10, QuickTest Professional 10, Oracle 10g, VBScript, and XML, eliminating the requirement for any additional coding.
- Developed various data-validation tests incorporated within the framework, each driven by an independent XML file with its own customized SQL statements (see the sketch after this list).
- Test automation included QC defect generation, automated defect-record generation in Excel format, and automated email notification to testers.
- Used Informatica PowerCenter 9 for various ETL processes to extract, transform, and load source data into database tables for use by tools developed for the Data Stewardship team.
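The framework itself was VBScript-based; as a rough shell illustration of the XML-driven pattern, each test's SQL and expected value could be pulled from its XML file and executed (the single-line tag layout and ORACLE_CONN variable are assumptions):

    #!/bin/ksh
    # run_xml_tests.ksh - one data-validation test per XML definition file.
    # Hypothetical layout: <test><sql>SELECT ...</sql><expect>0</expect></test>
    for tf in tests/*.xml; do
        sql=$(sed -n 's:.*<sql>\(.*\)</sql>.*:\1:p' "$tf")
        expect=$(sed -n 's:.*<expect>\(.*\)</expect>.*:\1:p' "$tf")
        got=$(sqlplus -s "$ORACLE_CONN" <<EOF
    SET HEADING OFF FEEDBACK OFF
    $sql;
    EOF
    )
        got=$(print "$got" | tr -d ' \n')
        if [[ "$got" != "$expect" ]]; then
            print "FAIL $tf: expected $expect, got $got"   # a QC defect would be raised here
        else
            print "PASS $tf"
        fi
    done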
Hardware/Software Platform Evaluation/Specification/Implementation:
- Performed evaluation of release management, requirements tracking, automation/testing, and defect-tracking software: HP QC 10 and QTP 10.
- Specified and led a Windows server build to produce a robust testing platform, including installation of various third-party software such as HP QC 10, QTP 10, the .NET Framework, etc.
- Set up an evaluation/pilot platform for the Informatica 9.0.1 data analysis suite of tools, including IDD (Data Director), IDQ (Data Quality), IDE (Data Explorer), DA (Data Analyst), and PowerCenter 9.
Data Analysis/QA:
- As a Technical Data Analyst, responsible for understanding and translating business capability needs for data quality services into system architectures and configurations for enterprise data quality solutions. These solutions comprised a variety of data quality capabilities, including record matching, data profiling, data cleansing, etc.
- Developed various testing utilities and procedures for testing data prior to production release.
- Became subject matter expert for data quality operations for the Transaction Repository Data Warehouse and the Healthcare Professionals/State Reporting solution delivery, including enhancement-request delivery and maintenance of business rules and data integrity.
- Identified various data sources, both internal and from external vendors, for data migration into a local repository.
- Created a Universe and reports using Business Objects XI for business stakeholders.
- Developed various complex SQL queries, including multi-table joins.
- Assisted with the Data Governance process, which included maintaining locally dedicated HCP (Health Care Professional) master data on Oracle by means of QA tools (HP QC 10 and QTP 10) running periodic tests to detect errors and exceptions violating the enforced rules within cleansed data sets retrieved from various sources, including the Pfizer HCP Core Master MDM on the Siperian platform.
- Used Quest Toad to perform SQL queries for various data analysis functions on the Oracle 10g platform.
- Utilized MS Access to import cleansed data from Excel and upload it into Oracle.
- Developed tables in Oracle for various analysis- and tool-related efforts.
- Developed logical and physical data models using Erwin 7, implementing all of the rules as well as the HCP and product/pricing tables.
Business Analysis:
- As Business Analyst, collaborated with business stakeholders to deliver data requirements; produced business requirements in support of state and national Healthcare Professional payment-disclosure reporting based on state and federal government regulations.
- Facilitated joint working sessions to collect data requirements supporting application/reporting efforts.
- Recommended automation and tools to support all data collection and cleansing efforts.
- Interviewed business stakeholders and produced process documentation detailing the Data Stewardship process, in addition to identifying opportunities for improvement and automation of manual processes.
- Produced functional requirements for a web-based application for Pfizer field representatives to verify client Healthcare Professional information.
- Established a complete application release management platform using HP Quality Center 10, comprising its own release manager, functional-requirements tracker, tests, and defect management.
- Made recommendations on best practices for release management, functional requirements, and testing efforts, and coordinated accountability for each function.
- Produced complete technical, functional and user guides/documentation.
CONFIDENTIAL Manager/Data Solutions Architect
Technical Manager
- Provided hands-on technical/relationship management and leadership for onshore (US) and offshore (Philippines) teams.
- Developed team structure implementing a 5x24 support model that included dedicated leads/resources aligned with lines of business.
- Provided mentoring, guidance, and leadership on various technical engagements, including architecture and solutions.
- Interacted with client management on various project-related requirements and negotiations in terms of resource, time, and cost allocation.
- Interacted with various teams, including DBAs (DB2, Oracle, SQL Server), infrastructure teams (Windows, UNIX/Linux servers), hardware, and message queues, for access and acquisition/allocation-related issues.
- Provided guidance on achieving a strategic approach and architecture for various technical/data requirements.
- Managed various testing efforts with multiple teams, including functional, performance, regression, UAT, etc.
- Provided technical support, including machine/database/server-related issues.
- Provided technology training to individuals.
- Provided guidance for various process automation tools.
Test Infrastructure Design/Automation:
- Architected and implemented a Test Data Management System (TDM) used as a central data repository, aiding in establishing MDM and optimizing Data Analyst team activities by automatically depositing test data produced from databases on the Oracle 10g platform based on use cases provided by System Analysts.
- Developed spreadsheet parser and loader scripts for TDM using KSH, used to load TCR spreadsheets (corresponding to use cases and containing test cases for various scenarios) into TDM (a sketch follows this list).
- Developed a dynamic Data Profile Generator, along with view and stored-procedure generators, using PL/SQL, KSH, SED, and AWK on the AIX platform.
- Developed a Test Query Generator and Loader used to validate data during test executions based on the values passed in for each test case.
- Set up and managed PVCS, utilized as the repository and control center for workflow between the System Analysts producing use cases and the Data Analysts creating test data for spreadsheets containing scenarios and test cases based on those use cases.
- Architected a performance-testing platform using Mercury LoadRunner, including setup of clients/servers and integration with Quality Center automation and an SDLC database containing test cases and corresponding parameters for correlation into VuGen scripts.
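A minimal sketch of the spreadsheet parser/loader pattern above, assuming the TCR spreadsheet is exported to CSV and an ORACLE_CONN connect string; the table and columns are illustrative:

    #!/bin/ksh
    # load_tcr.ksh - parse an exported TCR CSV and load it into the TDM repository.
    CSV=${1:?usage: load_tcr.ksh tcr_export.csv}

    # Strip the header row and blank lines before loading
    tail -n +2 "$CSV" | grep -v '^[[:space:]]*$' > tcr_clean.dat

    # Generate a SQL*Loader control file (illustrative table/columns)
    cat > tcr.ctl <<'EOF'
    LOAD DATA
    INFILE 'tcr_clean.dat'
    APPEND INTO TABLE tdm_test_cases
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (use_case_id, test_case_id, scenario, expected_result)
    EOF

    sqlldr userid="$ORACLE_CONN" control=tcr.ctl log=tcr.log bad=tcr.bad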
Infrastructure/Architecture:
- Evaluated, tested, and deployed (including complete installation) Centennial Discovery to automatically find and track all IT assets such as PCs, servers, printers, switches, and hubs within the LAN and all subnets; set up the system with a dedicated SQL Server 2005 instance on the Windows Server 2003 platform.
- Configured Centennial to perform software audits of all applications installed on PCs and servers, including software usage and patch status, using SNMP-like agents deployed across all computers.
- Evaluated, tested, and implemented a distributed incident management and call-tracking system, FrontRange ITSM, including a data repository on SQL Server 2005 on the Windows Server 2003 platform.
- Utilized BPML (Business Process Modeling Language) to manage alerts, notifications, and escalations based on specific SLAs (Service Level Agreements).
- Integrated LDAP/Active Directory with ITSM using XML/XSLT, providing automated and centralized user profile and account generation.
- Set up a CMDB (configuration management database) in ITSM by means of integration with Centennial Discovery, allowing identification of individual IT assets.
- Evaluated, tested, and architected a complete Tivoli job scheduler (TWS, Tivoli Workload Scheduler) implementation using Oracle 10g as the repository on AIX test/QA/production platforms.
- Integrated LDAP/Active Directory with TWS to manage centralized user accounts.
- Developed tools (KSH scripts) to maintain and manage FTAs (Fault-Tolerant Agents) across all AIX servers running batch jobs, by means of automated detection of FTA states.
- Developed custom FTP/SFTP utilities in Perl using standard Perl libraries such as Net::FTP, including retry, restart, and recoverability functions along with automated detection of missing and inconsistent file transmissions.
- Developed a robust, flexible, generic, and fully configurable directory and file cleanup tool using KSH on the AIX platform (a sketch follows this list).
- Developed a robust job-promotion tool using KSH, designed to work with TWS (Tivoli Workload Scheduler), that allows automated job promotion to multiple platforms simultaneously.
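A minimal sketch of the configurable cleanup tool described above, assuming a pipe-delimited config of directory|pattern|retention-days; the paths are illustrative:

    #!/bin/ksh
    # cleanup.ksh - purge aged files per a pipe-delimited config: dir|pattern|days
    CFG=${1:-/etc/cleanup.cfg}
    LOG=/var/log/cleanup.log

    grep -v '^#' "$CFG" | while IFS='|' read dir pat days; do
        [[ -d "$dir" ]] || { print "skip: missing $dir" >> "$LOG"; continue; }
        # Log before deleting so every purge is auditable
        find "$dir" -name "$pat" -type f -mtime +"$days" -print >> "$LOG"
        find "$dir" -name "$pat" -type f -mtime +"$days" -exec rm -f {} \;
    done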
Hardware/Software Platform Evaluation/Specification/Implementation:
- Performed a cost-effectiveness evaluation of the Windows vs. AIX platforms for implementation of the IBM Tivoli job scheduler (TWS) master, in terms of managing TWS FTAs (fault-tolerant agents) across heterogeneous platforms including AIX and Windows.
- Established development, QA, and production platforms for IBM TWS on independent Windows-based servers, including installation and configuration of the product.
- Established an Oracle 10g platform to be utilized as the transaction and history repository for IBM TWS.
- Performed evaluation of the ITSM trouble-ticketing system on the Windows platform.
- Established development, QA, and production platforms for ITSM on independent Windows-based servers, including installation and configuration of the product.
- Evaluated the CMDB product FrontRange Discovery to identify and manage company-wide hardware and software.
Data Analysis:
- Performed data analysis to produce data from the Data Warehouse on the Oracle 10g platform for testing of the PREP application, which tracks royalties and revenue for composers, lyricists, and music publishers nationally as well as internationally.
- Involved in analyzing data requirements for test cases corresponding to scenarios from various use cases.
- Performed usage-path analysis against the data flow diagram, producing specific queries to yield specific sets of data.
- Produced test data.
- Utilized Mercury Quality Center to track data defects exposed through various tests.
- Managed test data produced by the data analysis team, made available for testing through Quality Center utilizing the TDM (Test Data Management System).
- Involved in development of standardized scripts to load data into the Data Mart on the Sybase IQ/AIX platform.
- Performed data analysis of the ITSM incident management system on the SQL Server platform: developed SQL queries and stored procedures to fix/cleanse data.
- Developed XML/XSLT scripts for importing user-specific data from Active Directory.
- Analyzed and produced Tivoli TWS job-scheduling reports from the corresponding data repository on SQL Server.
- Utilized MS Access to import data from Excel and to generate various reports for end users.
- Developed Logical and Physical data models implementing the entire TDM system using Power Designer.
Business Analysis:
- As lead data analyst, met with business users to discuss and verify various business functions and financial calculations affecting specific data conditions.
- As lead architect of the trouble-ticketing system (ITSM), interacted with end users on account-related and other ticket-related issues, resulting in interactive troubleshooting sessions.
- Produced complete technical and user documentation for the ticketing system.
- As lead architect of IBM Tivoli Workload Scheduler, interacted with the IBM technical team, the offshore support team (Cognizant), and internal staff, providing production support for mission-critical processes.
- Produced technical documentation, a user guide, and training materials for IBM TWS.
- Facilitated and led formal training sessions on IBM TWS for internal staff as well as the offshore team (Cognizant).
- Architected and developed various fault-tolerant FTP modules customized for vendors such as RR Donnelley, in coordination with the vendor's application development team.
- Produced complete technical, functional and user guides/documentation.
- Produced complete TDM User Manual and Documentation including all scripts and usage, troubleshooting section and examples.
CONFIDENTIAL Data Solutions Architect
Architecture:
- Developed complete automation tools for selective data migration, comprising an ETL process that uses pre-specified selection criteria to select, transform, and load, implemented with KSH/SED/AWK on the Linux platform and Oracle 10g.
- Implemented an automated self-analysis process in the migration tool to accommodate referential integrity along with indexes and constraints at load time within target databases containing large volumes of data (up to several terabytes).
- Implemented size analysis for batching beyond a specified threshold and for controlling transaction size.
- Developed an automated masking process for data privacy to load and prepare databases containing proprietary financial data in disguised form, hiding personal and key financial information, using Compuware File-AID and KSH/SED/AWK on the Linux platform.
- Implemented an automated primary/foreign key propagation scheme retaining referential integrity.
- Developed data-validation processes for masked and regular data using a rules-based relation template.
- Developed various DDL generation and reverse-engineering tools using SQL*Plus, PL/SQL, KSH, SED, and AWK on the Linux platform and Oracle 10g: a tool to reverse-engineer database objects (tables, columns, constraints, indexes, sequences, etc.) and a tool for data migration (a sketch follows this list).
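A minimal sketch of the DDL reverse-engineering approach above, wrapping Oracle's DBMS_METADATA in KSH; the schema argument and ORACLE_CONN variable are illustrative:

    #!/bin/ksh
    # extract_ddl.ksh - dump CREATE statements for every table in a schema.
    SCHEMA=${1:?usage: extract_ddl.ksh SCHEMA}

    sqlplus -s "$ORACLE_CONN" <<EOF > ${SCHEMA}_ddl.sql
    SET LONG 100000 PAGESIZE 0 LINESIZE 200 FEEDBACK OFF
    -- The same call with 'INDEX' or 'CONSTRAINT' covers the other object types
    SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, '${SCHEMA}')
    FROM   all_tables
    WHERE  owner = '${SCHEMA}';
    EOF
    print "DDL written to ${SCHEMA}_ddl.sql"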
Business/Data Analysis/Design:
- Performed data analysis of the Albridge Wealth Management System, including client account and reference data (portfolio accounting, performance reporting, fee billing), security prices/transactions (current and history), assets/positions/classifications, portfolio performance, and corresponding advisors/investors, for implementation of data migration from its Data Warehouse into smaller Data Marts for the Development, Integration, and QA platforms.
- Analyzed the existing data model for the data warehouse containing wealth management data for reporting.
- Developed logical and physical models using Erwin 7.0, implementing the database that supports the data extraction automation process.
- Implemented prototype for EMPS using MS Access 2000.
Business Analysis:
- Interviewed principals and gathered various documents to identify business functions, process and system functionality, and the system and process flow, leading to the delivery of a complete technology platform (EMPS) including hardware, database, web infrastructure (BEA WebLogic), and application.
- Performed analysis and prepared the specification document for EMPS (Environment Management Process System) to standardize, manage, and track building and deployment of the Development, Integration, and QA platforms, which include hardware, software, and database components, coordinating between various parties such as the project managers, database team, and infrastructure team (ISS).
- Prepared an Environment Build Management automation tool specification, including a functional specification.
- Developed a user guide, functional documentation, and a product presentation for the Compuware File-AID product for internal use within Albridge.
- Gathered data requirements for financial reporting to clients, later developed with masked data for confidentiality reasons.
- Interacted and coordinated with various technical teams (Oracle DBA team, Linux admin team) to launch various data-related activities for the UAT/QA platforms.
- Interacted with the application development team to troubleshoot data-related issues arising from testing activities.
- Produced complete technical, functional and user guides/documentation.
CONFIDENTIAL Data Warehouse Architect
Analysis Design:
- Performed analysis of new and existing financial reporting systems (G/L, A/P, and A/R):
- CODA on the VAX platform, in terms of the current data sources: table and data feed layouts, content, data mapping, and logic pertaining to the existing reports.
- The SAP system, in terms of the new accounting structure, in order to design and implement a new reporting system consisting of an ODS platform and an OLAP platform on SQL Server 2000/2005.
- The Universe in Business Objects, through data mapping to the Data Mart.
- Performed business analysis in JAD and discovery sessions in partnership with principals and clients, producing data flow diagrams along with a conceptual model supporting financial data.
- Specified data flow diagrams and process flow diagrams for streamlining data from SAP to the ODS to the Data Warehouse.
- Designed and implemented both the logical and physical models using Erwin 7.0 for the ODS platform, accommodating diverse data feeds, data cleansing, and a central data repository through both real-time and batch processes from various sources.
- Produced a dimensional model (star schema) for implementation of dimension and fact tables using Erwin 7.0, customized for all reporting requirements within the OLAP platform.
- Implemented De-normalization for performance and efficiency.
- Implemented Measures consisting of various Aggregations and Pre-calculations.
Business Analysis:
- Participated in requirements-gathering sessions with the Finance department to understand both legacy and new reporting needs, along with analysis of the legacy architecture/infrastructure.
- Held various data mapping sessions with the SAP development team to design and implement ETL processes and validate data for reporting requirements.
- Held design sessions with the BI team to optimize the star schema implemented for the Data Warehouse and to design the Business Objects universe for reporting and BI activities.
- Produced complete technical, functional and user guides/documentation.
Data Feeds ETL Process:
- Implemented ODS supporting Centralized Data Repository along with a Data Cleansing/Processing platform.
- Specified and designed data feeds consisting of various data files (columns, lengths, formats) feeding both static reference data and dynamic business data into the ODS.
- Designed and developed the ETL process using Microsoft SSIS to perform cleansing, mapping, formatting, and data integrity validation operations, transferring various forms of data (flat files, Excel, and other DBMS sources such as Sybase and SQL Server 2000/2005).
- Developed various calculations, validation and exception reports complementing daily loads, and on-demand reports using T-SQL/stored procedures.
- Developed standardizations in terms of entities and attributes, along with a common data dictionary.
- Produced a design document including the ERD, entity definitions, and column mappings.
- Developed the process supporting data feeds received through FTP (a sketch follows this list).
- Implemented Backup and Recovery processes for DBMS as well as related objects such as ETL metadata.
- Defined and implemented scheduled Jobs in SSIS along with notification systems.
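A minimal sketch of the FTP feed pickup described above; the production process ran alongside SSIS, so this shell version is purely illustrative, as are the host, account, and row-count manifest convention:

    #!/bin/ksh
    # fetch_feeds.ksh - pull daily feed files over FTP and verify completeness.
    HOST=ftp.example.com
    STAMP=$(date +%Y%m%d)

    ftp -n "$HOST" <<EOF
    user feeduser $FEED_PWD
    binary
    prompt
    cd /outbound
    mget feed_${STAMP}_*.dat
    bye
    EOF

    # A feed counts as complete only when its row count matches the .ctl manifest
    for f in feed_${STAMP}_*.dat; do
        expected=$(cat "${f%.dat}.ctl" 2>/dev/null)
        actual=$(wc -l < "$f")
        [[ "$actual" -eq "${expected:-0}" ]] || print "incomplete feed: $f" >&2
    done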
CONFIDENTIAL UNIX Administrator/Disaster Recovery Specialist
Provided UNIX administration support to clients on various Sun/Solaris and IBM/AIX platforms, including OS installation and configuration along with storage and media (ADIC, StorageTek, IBM tape library) setup and configuration. Actively participated in disaster recovery tests with clients.