Data Warehouse Data Analyst Resume Profile
PROFESSIONAL SUMMARY
- 9 years and 5 months of IT experience in Insurance, Healthcare, and Financial Services.
- Extensive experience with Technical and Business Requirements Analysis.
- Extensive experience in prototyping the requirements using SQL Queries.
- Extensive experience with regulatory reports (CCAR/Dodd-Frank/eSNC) on Large Corporate Syndicated Loans, Basel elements from LoanIQ, AFS, LUCAS, and GFI, and Risk Weighted Assets (RWA).
- Experience in prototyping requirements into a full package for the development team to implement in the production environment.
- Experience working with Conceptual, Logical, and Physical data models.
- Experience in creating source to target mapping documents.
- Experience working on data warehouse projects and with Agile methods.
- Committed to learning Big Data technologies, including Apache Hive Query Language (HiveQL/HQL).
TECHNOLOGIES
- Data warehouse
- Advanced SQL query design
REGULATORY REPORTING ANALYST
- Large Corporate Syndicated Credit
- Experience with LoanIQ and AFS data model tables and reference structures
DATABASE QUERYING TOOLS
- Teradata V2R5
- SQL-Server-2008
- Oracle 10gR2
- DB2 9.7
- MS-Access databases.
LOAN APPLICATION
- Commercial Syndicated Loan - Misys LoanIQ
- Commercial Syndicated Loan - AFS - CICS on the Mainframe platform
QUERYING TOOLS
- SQL Management Studio Tool
- Teradata SQL Assistant
- Toad
- SQL Plus
- IBM - DB2 Query Management Facility
BUSINESS INTELLIGENCE TOOLS
- MicroStrategy BI tool 8, 9 - dashboards and reports validation
- OBIEE BI tool 11g - dashboards and reports validation
DOCUMENTATION TOOLS
- Microsoft Visio
- Microsoft Office products for documentation, presentations, and workflow diagrams (business/technical)
- SQL Server Import and Export Wizard
- Waterfall and Agile Scrum Methods
- HP-Quality Center
PROFESSIONAL EXPERIENCE
02/2013 - Present
Role: Sr. Technical Business/Data/Quality Analyst
Location: Charlotte, NC
Project:
- Project Description: The LCSC project at Wells Fargo improved data collection and consolidated the aggregation processes for Large Corporate Syndicated Credit (LCSC) regulatory reporting (CCAR/Dodd-Frank/eSNC) required by the Federal Reserve and the Office of the Comptroller of the Currency, drawing on multiple systems of record (SORs): LoanIQ (Misys), AFS, LUCAS, GFI, and minor systems of record. Wells Fargo lines of business responsible for servicing facilities subject to LCSC reporting are currently employing dissimilar processes to extract data from the various systems of record, then validate and deliver that data to Corporate Credit, where it is aggregated and submitted to the Federal Reserve. In the majority of cases these processes involve significant manual data manipulation and thus allow for potential errors.
- The goal of this project is to leverage and/or enhance existing month-end data flows to consolidate data from each Wells Fargo lending system of record (SOR) into a single source from which it can be extracted for LCSC reporting. This will allow LCSC data aggregation to be managed by a single group of subject matter experts. In addition to improving data aggregation, this project will implement and/or enhance the processes necessary to address data quality issues within each system of record.
Responsibilities
- Responsible for requirement gathering, analysis, defining the framework, planning, execution, and maintenance of prototype scripts.
- Interact with SMEs from the various system-of-record teams to collect the business logic needed to meet the data requirements.
- Experience with LoanIQ and AFS data model tables and reference/lookup structures to support business needs.
- Develop SQL queries to support the requirements and prototype the complete process so the development team can follow the SDLC with artifacts.
- Perform unit testing on the developed SQL scripts and validate the transformation logic between source and target systems/databases.
- Analyze complex business requirements and system logic to develop SQL queries implementing complex transformation logic that meets business needs.
- Actively participate in the data reconciliation process to maintain data integrity between source and target systems.
- Experience in creating data flow diagrams for loading data generated by multiple systems.
- Experience in loading data into a database using import/export tools.
- Translate business requirements from requirement documents into technical documents for the development team to implement in production.
- Experience in creating artifacts for business users and field officers to correct booked loans in the LoanIQ and AFS systems.
- Experience in creating source-to-target mapping documents spanning multiple systems for the development team to implement in the production environment.
- Support the development team in understanding the business, technical, and system requirements to consolidate the different systems of record into a single file for regulatory submission on a monthly and quarterly basis.
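The source-to-target validation described above can be sketched as a minimal, self-contained example. The table names, columns, and transformation here are hypothetical (SQLite stands in for the actual databases); the pattern is applying the expected transformation to the source and diffing it against the target:

```python
import sqlite3

# Hypothetical sketch of source-to-target transformation validation.
# Table and column names are illustrative, not from any real SOR.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source system extract
cur.execute("CREATE TABLE src_loans (loan_id TEXT, amount REAL, status TEXT)")
cur.executemany("INSERT INTO src_loans VALUES (?, ?, ?)", [
    ("L001", 1000.0, "active"),
    ("L002", 2500.0, "closed"),
    ("L003", 750.0, "active"),
])

# Target table after the expected transformation (status upper-cased)
cur.execute("CREATE TABLE tgt_loans (loan_id TEXT, amount REAL, status TEXT)")
cur.executemany("INSERT INTO tgt_loans VALUES (?, ?, ?)", [
    ("L001", 1000.0, "ACTIVE"),
    ("L002", 2500.0, "CLOSED"),
    ("L003", 750.0, "INACTIVE"),   # deliberately wrong row
])

# Apply the expected transformation to the source and diff against the target.
mismatches = cur.execute("""
    SELECT loan_id, amount, UPPER(status) FROM src_loans
    EXCEPT
    SELECT loan_id, amount, status FROM tgt_loans
""").fetchall()

print(mismatches)  # -> [('L003', 750.0, 'ACTIVE')]
```

In practice the same EXCEPT (or MINUS, on Oracle) diff is run in both directions to catch rows missing from either side.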
Confidential
Data Warehouse Sr. Data/Quality Analyst
Projects:
TIAA-CREF
- Project Description: SRP (Strategic Reporting Platform) is a data warehouse platform to support Portfolio Management and Research. It serves as a strategic reporting platform for Fixed Income, Capital Markets, Equities, and Derivatives positions, transactions, valuation, performance, market data, and reference data to support the reporting requirements of portfolio management and research activities.
- SRP's charter is to establish a single point of data and application distribution that can support all portfolio management and research solutions across all Fixed Income and Equities security types. SRP supports FI security types such as Bonds, Treasury Bills, Private Placements, Preferred Stock, Agency Notes, Stock Options, Swaps, Synthetics, Commercial Paper, Index Futures, Bond Futures, SNEUPS, and Leveraged Loans, as well as the Total Return, TCAM, Short Term, and General Accounts for the clients of Teachers Advisers, Inc. and TIAA-CREF Investment Management, LLC (the Advisers).
Responsibilities
- Responsible for requirement analysis, defining strategy, planning, execution, and maintenance of test logs. Along with the business owners, I own the responsibility of providing sign-off on the data and the reports for production migration.
- Write SQL queries daily to validate the transformation logic between source and target systems/databases.
- Experience in converting user stories into data testing requirements in the Agile method.
- Actively participate in the data reconciliation process to maintain data integrity between source and target systems.
- Experienced in data migration testing projects from Informatica to DataStage.
- Extract business requirements from requirement documents and technical documents to design test scenarios/cases and supporting SQL queries to validate data consistency between source and target result sets.
- Experienced in creating use cases for new requirements against existing data.
- Review test cases designed by the team and guide the team technically to complete the testing of complex reports and transformations.
- Experience in creating mapping documents for developers highlighting transformation logic issues.
- Validate ad hoc requests and changes to the existing database, backing those requirements with complete artifacts; also collect artifacts to explain the story to business users.
- Actively participated in Agile (Scrum) projects as part of the data analysis team.
- Very good data analysis and data validation experience on Agile projects.
- Understand the data flow from database to BI tool to validate the data on reports generated by end users; also validate reports between OBIEE 10g and 11g.
- Frequently interact with BI developers, business analysts, and data analysts.
- Publish the project status report from Quality Center; conduct defect triage meetings and maintain defects in HP Quality Center.
Project Name: Asset Management and Enterprise Services (AMES)
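The reconciliation work above typically comes down to comparing row counts and column checksums between source and target loads. A minimal sketch, with hypothetical table names and SQLite standing in for the real databases:

```python
import sqlite3

# Minimal reconciliation check: compare the row count and a column
# checksum (SUM of amounts) between a source and a target table.
# Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")

rows = [(1, 10.0), (2, 20.5), (3, 30.25)]
cur.executemany("INSERT INTO src VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt VALUES (?, ?)", rows)

src_count, src_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM src").fetchone()
tgt_count, tgt_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM tgt").fetchone()

# The load reconciles when both measures agree.
reconciled = (src_count == tgt_count) and (src_sum == tgt_sum)
print(reconciled)  # -> True
```

Counts catch dropped or duplicated rows; the checksum catches value-level drift that counts alone would miss.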
Data Warehouse Data/Quality Analyst
Confidential
Project Description: Group Claims Management (GCM) - GCM's overall goal is to improve Cigna Group Insurance Claims operations by replacing the existing claims management system (Acclaim) and the associated manual/supporting processes with a modern, scalable, flexible, and supportable claims management system that enhances and provides new capabilities for the disability, life, accident, and waiver-of-premium product lines, including integration with Family Medical Leave Administration (FMLA). The main streams of the GCM program are: Fundamentals, Reporting, Workbench, and Correspondence.
Responsibilities:
- Analyze the requirement documents and resolve clarifications raised by onshore and offshore teams with the business and system analysts to make sure all test scenarios/cases are covered.
- Actively participated in data modeling discussions when new databases/tables were introduced, helping the team improve the quality of test scenarios and test cases.
- Coordinate with the offshore team to make sure all deliverables meet the organization's quality standards with complete test artifacts.
- Conduct meetings between the database analysts and business analysts to understand the data model and validate the designed test scenarios and test cases.
- Provided QA estimations for new requirements and ad hoc requests.
- Prepared the test scenarios/cases and walked through them with the business analysts.
- Request sign-off from the stakeholders before proceeding with test execution.
Data Warehouse Data/Quality Analyst
Confidential
The Travelers Indemnity Company
- Project Description: Agency Admin Workstation (AAW) - The main business purpose is to streamline the entry of the Net-appoint attributes into PALS via an automated download. The automated download will create the Entity, Producer Code, and Licensee Code in PALS. Appointments will also be automated. The automated download from Net-appoint to PALS will save specialists time and will ensure data integrity by eliminating manual entry into PALS.
- The Leads Management system can be used to manage business opportunities and track them to closure. In the Leads Management system, a prospect is a potential client who is interested in buying insurance, and a lead is a business opportunity for Travelers to sell insurance to a prospect. Reports can also be generated through this application to track performance. Agency Administration Appointment Validation (AAAV): the project team will analyze premium and appointment data to determine which agencies and agents, by Company Code and State, have generated no written premium over the previous three years. If no premium has been written over the three-year period, the appointment will be targeted for non-renewal. The existing process performs the over-appointment analysis and generates Excel workbooks with the unused appointment details. The information provided in the Excel workbooks is then used to terminate the potential over-appointments. However, further scope for enhancements and fixes to the current process has been identified.
Responsibilities
- Extract the business requirements from business requirement documents and technical documents to design test scenarios/cases and supporting SQL queries to validate data consistency between source and target result sets.
- Design and review the traceability matrix, test scenarios, test cases, and SQL test queries, and upload them to Quality Center before execution starts.
- Execute the designed SQL queries to validate/compare data consistency against the business requirements between the source system database, the Web Service, and the presentation layer or target database. Analyze each business requirement and schedule regular walkthroughs with business stakeholders to make sure their requirements are conveyed to the IT team.
- Good knowledge of testing fact and dimension Star Schema data in a Teradata database.
- Test changes to the data reconciliation process in the Teradata database.
- Experience in validating reports in MicroStrategy.
- Create the test strategy and plan, and review them in advance of the environment setup based on the technology used in the project.
- Experience in cross-verifying mapping logic between multiple source and target system tables while creating a new data warehouse environment.
- End-to-end testing, from requirement gathering, effort estimation, planning, and resource allocation through execution and implementation.
- Participate in Agile (Scrum) projects with the data analysis team, covering everything from estimation to implementation for stories/requirements.
- Analyze different ways to test data consistency/manipulation between the source and target systems.
- Coordinate with on-site/offshore teams to provide enough information and make sure their understanding is in sync with the business requirements, delivering quality database data on time.
- Upload all test results to HP Quality Center (HPQC) to track defects, and conduct defect triage meetings with development.
- Publish the project status report from Quality Center to the project managers and dependent workgroup management.
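One common form of the star-schema testing described above is an orphan-key check: fact rows whose foreign key has no matching row in a dimension table. A sketch with hypothetical tables (SQLite standing in for Teradata):

```python
import sqlite3

# Illustrative star-schema integrity check: find fact rows whose
# dimension key has no match in the dimension table (orphans).
# The schema is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_agency (agency_key INTEGER, agency_name TEXT)")
cur.execute("CREATE TABLE fact_premium (agency_key INTEGER, premium REAL)")
cur.executemany("INSERT INTO dim_agency VALUES (?, ?)",
                [(1, "Alpha"), (2, "Beta")])
cur.executemany("INSERT INTO fact_premium VALUES (?, ?)",
                [(1, 500.0), (2, 750.0), (3, 100.0)])  # key 3 is an orphan

# LEFT JOIN the fact to the dimension; unmatched rows come back NULL.
orphans = cur.execute("""
    SELECT f.agency_key FROM fact_premium f
    LEFT JOIN dim_agency d ON f.agency_key = d.agency_key
    WHERE d.agency_key IS NULL
""").fetchall()

print(orphans)  # -> [(3,)]
```

A passing check returns an empty list; any rows returned point at referential-integrity defects in the load.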