Data Architect Resume
Cleveland, USA
SUMMARY
- I am seeking a competitive and challenging environment where I can contribute to the success of the company while pursuing advancement opportunities.
- I am a data architect with more than 10 years of IT experience in the analysis, design, development, testing, and implementation of business application systems in banking and financial services, offering extensive real-world experience in Data Architecture and Information Management systems.
- I have successfully built and integrated high-performance teams of five to twenty professionals, onshore and offshore, while mentoring them and driving them to success.
PROFESSIONAL EXPERIENCE
Data Architect
Confidential, Cleveland, USA
Responsibilities:
- The basic objective is to extract data from the source systems and present it to the downstream applications in an organized manner that enhances their capabilities.
- The data from the source systems, arriving in various formats, is aggregated in the Combined Staging Area (CSA) in a standard database format to ensure consistency.
- Data from the CSA is extracted and stored in the Enterprise Data Repository (EDR or BDW) with the help of the Data Warehouse Standard Interface Files (SIFs); a simplified sketch of such a load appears after this list.
- The Data Warehouse SIFs were introduced to streamline the extraction of data from the source systems into the data warehouse and to reduce the impact on the data warehouse of any changes in the source systems that supply data to the EIP.
- The data stored in the Enterprise Data Repository (EDR) is finally loaded into the downstream Datamarts using the Enterprise Data Exchange Files (EDXF).
- The data available in the downstream Datamarts can then be used by the downstream applications and analytical engines to perform the required calculation and reporting functions.
- Requirements gathering - Involved in all discussions with the client during the requirements phase.
- Analysis & Architecture - Involved in analysing the existing legacy source systems and defining the solution architecture for the given requirements.
- Identified the integration points between the new source systems and the existing source systems, and designed the new source systems to integrate with the existing ones.
- Converted business-level requirements into technical design specifications.
- Analysed, designed, and developed ETL strategies, and resolved issues related to the Enterprise Data Warehouse (EDW).
- Performed many technical POCs on new processes/technologies before implementing the changes in the EDW (Protegrity, Unity Data Mover, CA7-to-Autosys conversion...).
- Identified long-running ETL jobs in production and fine-tuned them to reduce run time in the production database.
- Created weekly project status reports, tracked task progress against the schedule, and reported risks and contingency plans to management and business users.
- Involved in meetings with the production team on issues related to deployment, maintenance, future enhancements, backup, and crisis management of the DW.
- Involved in analysing and fixing defects in the UAT/QA and PROD databases.
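A minimal sketch of the CSA-to-EDR load described above, assuming a UNIX wrapper around Teradata BTEQ; the system, database, table, and account names (tdprod, CSA_DB, EDR_DB, ACCT_SIF, edw_batch) are hypothetical placeholders, not the actual Confidential objects.

    #!/bin/sh
    # Hypothetical example: load one Data Warehouse SIF from the Combined
    # Staging Area (CSA) into the Enterprise Data Repository (EDR).
    SIF_NAME=ACCT_SIF
    LOGFILE=/var/log/edw/${SIF_NAME}_$(date +%Y%m%d).log

    bteq <<EOF > "$LOGFILE" 2>&1
    .LOGON tdprod/edw_batch,********;

    -- Move the standardized SIF rows from the staging layer into the EDR,
    -- keeping only records for the current business date.
    INSERT INTO EDR_DB.ACCOUNT
    SELECT ACCT_ID, ACCT_TYPE_CD, OPEN_DT, BAL_AMT
    FROM   CSA_DB.${SIF_NAME}
    WHERE  LOAD_DT = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF

    # A non-zero exit code tells the scheduler (CA7/Autosys) the load failed.
    exit $?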
Data Architect
Confidential, Cleveland, USA
Responsibilities:
- Extensive experience with Teradata's Unity Data Mover tool.
- Extensively involved in the design and implementation of a Disaster Recovery solution for more than 5 applications within Confidential (EII, DQA, ADA, DPF, CDM, CSD).
- Created 1200+ CA7 and Unity Data Mover jobs for migrating data to the disaster recovery database for various applications within Confidential.
- Designed and created multiple UNIX shell scripts for migrating data to the Disaster Recovery database for various applications.
- Created reusable pre-load and post-load BTEQ scripts for Unity Data Mover objects that can be accessed by all applications (a sketch of such a pair appears after this list).
- Involved in analysing and fixing defects on Data Mover objects in the UAT/QA region for various applications.
- Fine-tuned Data Mover jobs that ran long in production for various applications.
- Created both partial and full Data Mover jobs based on the requirements of each application.
- Designed/recommended the scheduling of Data Mover jobs for various applications based on their RPO/RTO.
- Created generic documents that help any application on-board onto the Teradata Unity Data Mover framework: setting up the Data Mover metadata tables, setting up the pass/fail jobs in the UAT/QA and PROD regions, and troubleshooting Data Mover failures in the various environments.
- Created run-books for various applications within Confidential; the run-book is executed during failback and failover.
- Designed connection test jobs that are executed during the Confidential-level failback/failover process to test the connections to the various relational sources.
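A minimal sketch of the reusable pre-load/post-load BTEQ pair mentioned above, assuming job status is tracked in a shared metadata table; the names (drprod, dm_batch, DM_META.JOB_AUDIT, EII_ACCT_COPY) are hypothetical placeholders.

    -- pre_load.btq: record the start of a Data Mover copy.
    .LOGON drprod/dm_batch,********;
    INSERT INTO DM_META.JOB_AUDIT (JOB_NAME, STEP, STATUS, TS)
    VALUES ('EII_ACCT_COPY', 'PRE', 'STARTED', CURRENT_TIMESTAMP(0));
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;

    -- post_load.btq: close the audit entry after the copy completes; in the
    -- real script the target row count would also be checked so that a
    -- mismatch exits non-zero and fails the CA7 job.
    .LOGON drprod/dm_batch,********;
    UPDATE DM_META.JOB_AUDIT
    SET    STATUS = 'COMPLETE', TS = CURRENT_TIMESTAMP(0)
    WHERE  JOB_NAME = 'EII_ACCT_COPY' AND STEP = 'PRE';
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;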
Data Architect
Confidential, Cleveland, USA
Responsibilities:
- Converted Oracle ETL mappings into Teradata mappings.
- Implemented Stats Manager for different layers (SIF and BDW) in EDW.
- Implemented full pushdown optimization (PDO) on source and target mappings.
- Designed the monthly and daily history retention processes on the Teradata system.
- Designed and implemented solutions for a deadlock issue caused by PDO ETL mappings in the BDW layer.
- Converted the Oracle materialized views into Teradata tables and implemented the materialized view logic in Teradata BTEQ scripts (illustrated after this list).
- Fine-tuned many of the EDW mappings during this conversion project.
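A minimal sketch of the materialized view conversion described above: the Oracle MV becomes a plain Teradata table whose refresh logic is rebuilt in BTEQ by the batch. All object names here are hypothetical.

    .LOGON tdprod/edw_batch,********;

    -- Rebuild the summary table that replaced the Oracle materialized view.
    DELETE FROM EDW_DB.ACCT_DAILY_SUMM;

    INSERT INTO EDW_DB.ACCT_DAILY_SUMM (ACCT_ID, BUS_DT, TXN_CNT, TXN_AMT)
    SELECT ACCT_ID, BUS_DT, COUNT(*), SUM(TXN_AMT)
    FROM   EDW_DB.ACCT_TXN
    GROUP BY ACCT_ID, BUS_DT;

    -- Refresh statistics so the optimizer sees the new demographics.
    COLLECT STATISTICS ON EDW_DB.ACCT_DAILY_SUMM COLUMN (ACCT_ID);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;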
Lead Developer
Confidential, Cleveland, USA
Responsibilities:
- Extensively involved in the design of the Autosys-to-CA7 conversion project.
- Migrated 1600+ Autosys jobs to CA7 (an illustrative source job definition appears after this list).
- Involved in creating DOC05, DOC40, AGENTS, and JCLP for 1600+ CA7 jobs.
- Extensively involved in analysing and correcting the issues observed in CA7 jobs with the help of the scheduling/transition team.
- Implemented various CA7 look-back concepts and negative dependencies in EDW CA7 jobs.
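An illustrative Autosys job of the kind that was mapped to CA7, loaded through the jil command-line utility; the job, machine, and script names are hypothetical. The equivalent CA7 job would carry the same command and predecessor through its DOC05/DOC40 documentation, schedule, and agent/JCL setup.

    jil <<'EOF'
    insert_job: EDW_ACCT_LOAD
    job_type: c
    machine: edwprod01
    owner: edwbatch
    command: /opt/edw/bin/load_acct_sif.sh
    condition: s(EDW_ACCT_EXTRACT)   /* becomes a CA7 predecessor */
    date_conditions: 1
    days_of_week: mo,tu,we,th,fr
    start_times: "02:00"
    std_out_file: /var/log/edw/EDW_ACCT_LOAD.out
    std_err_file: /var/log/edw/EDW_ACCT_LOAD.err
    EOF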
Domain: Banking & Financial Services
ETL Developer/BDW Offshore Module Lead
Confidential
Responsibilities:
- Involved in all phases of the SDLC: requirements gathering, design, development, testing, and production support.
- Designed complex UNIX scripts and automated them to run the workflows daily, weekly, and monthly (a sketch of such a wrapper appears after this list).
- Used Informatica Designer to create complex mappings using different transformations - Filter, Router, Connected & Unconnected Lookup, Stored Procedure, Normalizer, Rank, Sorter, Source Qualifier, Joiner, Update Strategy, Expression, and Aggregator - to pipeline data to the Data Warehouse.
- Responsible for monitoring scheduled, running, completed, and failed sessions; involved in debugging failed mappings and developing error-handling methods.
- Implemented various performance tuning techniques on sources, targets, mappings, and workflows.
- Worked on the design and development of complex requirements.
- Designed and developed complex Type 1, Type 2, and Type 3 mappings for the stage-to-integration layer in the project.
- Involved in the migration from Informatica 8.6 to 9.5.1.
- Worked with heterogeneous sources (Mainframe, flat files, relational database - Oracle) to extract data into the staging area.
- Involved in unit testing and User Acceptance Testing (UAT) to verify that the data extracted from the different source systems and loaded into the targets was accurate and met the user requirements.
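A minimal sketch of such a UNIX wrapper, assuming Informatica's pmcmd utility; the service, domain, folder, and credential names are hypothetical, and the password would normally come from an encrypted variable rather than plain text.

    #!/bin/sh
    # Hypothetical wrapper: start an Informatica workflow and surface its
    # result to the scheduler (daily/weekly/monthly runs reuse this script).
    WF=${1:?usage: run_wf.sh <workflow_name>}

    pmcmd startworkflow \
        -sv INT_SVC_PROD -d DOM_PROD \
        -u edw_batch -p "$PM_PASSWORD" \
        -f EDW_FOLDER -wait "$WF"
    RC=$?

    if [ $RC -ne 0 ]; then
        echo "Workflow $WF failed with return code $RC" >&2
    fi
    exit $RC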
Technologies: Informatica PowerCenter 8.6/9.5/10.2, Power Exchange 9.5/10.2, Oracle Exadata, Teradata 14/16, Oracle PL/SQL, Autosys & CA7 scheduling, UNIX shell scripts, Harvest change management tool, Confidential BDW Model, Mainframe.