Data Analyst, Business System Analyst Resume
Santa Clara, California
SUMMARY:
- Data Analyst with over nine years of experience leading and managing projects of all sizes and complexities. Strong analytical and design skills with the ability to grasp new data and processes quickly. Extensive experience on projects in the financial domain across Retail & Commercial Banking, Consumer Servicing, Loan Originations, Check Operations, Investment Banking and Capital Markets. Worked with cross-functional teams managing medium to large projects in Regulatory Risk Reporting, Finance, PMO, Operations and Basel. Experience in Quantitative Analysis, Data Mining, model development, and scoring and validation of predictive models.
- Recognized by Confidential & Confidential as one of the top Business/Data Analysts for contributions to the design and development of Confidential NA systems for over 5 years. Selected out of 5,000 associates to deliver a critical project for Confidential NA.
- Data Analysis functions include Data Sourcing & Transformation, Data Pre-Processing, Data Visualization and Data Virtualization. Extensive experience with SQL, ETL (Informatica Power Center, Informatica MDM, SSIS), PL/SQL, Python and Excel.
- Data Governance functions include working with Data Stewards and Data Custodians to implement and maintain Data Governance programs and improve Data Quality. Developed summarized Data Governance / Data Quality analyses and dashboards.
- Knowledge of Data Modeling and Data Warehousing concepts with emphasis on ETL of Party data (Customer Lifecycle Management).
- Expertise in Comprehensive Capital Analysis & Review (CCAR) - well versed in building logic and mappings for FR Y-14M and FR Y-14Q reports and in FR Y-14A report model development for stress scenarios. Experience with other regulatory reporting such as FR 2052a liquidity reporting and IFRS 9.
- Understanding and knowledge of CCAR / IHC / Basel concepts - Credit Risk, Market Risk, Liquidity Risk.
- Skilled in Data Mining techniques - Regression, Classification, Clustering, Decision Trees and Random Forests.
- Implementation knowledge of Regression Modeling, Time Series Analysis, Statistical Testing, Correlation, Multivariate Analysis, Forecasting, Model Building, Business Intelligence tools and application of Statistical Concepts.
- Work experience in analytics, converting large volumes of structured and unstructured data into actionable insights and business value.
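As a concrete illustration of the regression modeling listed above, here is a minimal sketch (toy data, pure Python, not tied to any project mentioned in this resume) of fitting a simple linear regression by ordinary least squares:

```python
# Toy sketch: ordinary-least-squares fit for simple linear regression.
def fit_simple_linear(xs, ys):
    """Return (slope, intercept) minimizing squared error for y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]            # exactly y = 2x + 1
slope, intercept = fit_simple_linear(xs, ys)
print(slope, intercept)          # 2.0 1.0
```

In practice this would be done with a statistics library rather than by hand; the closed form above is only meant to show the underlying calculation.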
TECHNICAL SKILLS:
Languages: SQL, Python, PL/SQL, T-SQL, R, C/C++, VBScript, VBA, XML, UML, Perl, Shell Scripting
Data Management: Toad, Oracle, MS Excel, MS SQL Server, MS Access, Apache Spark, Hadoop
Data Visualization Tools: Atom, Tableau, MS Visio, QlikView, R, Microsoft Power BI, SAS
Business Intelligence Analysis: Informatica, Denodo, SQL, Python, Collibra, R (Statistical Software), SSIS (Integration Services)
Machine Learning: Regression (linear, logistic), Classification (SVM, DT, KNN), Clustering (K-means, hierarchical)
Statistical Methods: Time Series, Regression models, Hypothesis Testing, Multivariate Analysis
Testing Tools: Mercury QuickTestPro, Mercury QualityCenter
Documentation & Reporting: MS Office, RTC, Jira, Planview, ServiceNow, Rational Clearquest, Rational Clearcase
PROFESSIONAL EXPERIENCE:
Confidential - Santa Clara, California
Responsibilities:
- Liaise with corporate leadership, subject matter experts, department heads and other key players in assessing the state of the enterprise.
- Assessed the end-to-end current state of the enterprise with an in-depth review of month-end Business Intelligence (BI) reports, data landscape, data quality, technical architecture, and cross-functional collaboration and operational hand-offs. Reported escalations and updates to the Enterprise Working Group and Data Advisory Council.
- Proposed a new roadmap for the architecture and data landscape, designed with Data Governance in mind.
- As part of Governance, conducted Data Quality assessment of the Critical Data Elements (CDE) using quality dimensions to track concerns.
- Configured Domain, Asset and Attribute in Collibra. Designed Business Glossary and uploaded Collibra Assets.
- Identify missed checkpoints in the operational workflow.
- Organized a working committee of subject matter experts in defining Business Terms, Landscape review, Quality, and Risk review.
- As part of the stewardship framework, delivered a clear Enterprise Data Management and Data Governance roadmap.
- Reviewed current state, quality and architecture; designed and documented the future-state enterprise.
- Built strong working relationships within the core teams across producers and consumers of the data.
- Trained staff at the enterprise level and articulated the benefits of and need for Data Governance and Data Quality.
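The Critical Data Element assessment described above can be sketched as scoring each element on common data-quality dimensions. A hypothetical minimal example (the element, sample values and format rule are made up for illustration):

```python
import re

# Hypothetical sketch: score a Critical Data Element (CDE) on two common
# data-quality dimensions -- completeness and validity.
def assess_cde(values, validity_pattern):
    total = len(values)
    non_null = [v for v in values if v not in (None, "", "N/A")]
    valid = [v for v in non_null if re.fullmatch(validity_pattern, v)]
    return {
        "completeness": len(non_null) / total,  # share of populated values
        "validity": len(valid) / total,         # share matching the expected format
    }

# Toy example: a customer tax-ID element expected to be exactly 9 digits.
tax_ids = ["123456789", "987654321", "", "12AB", None]
scores = assess_cde(tax_ids, r"\d{9}")
print(scores)  # {'completeness': 0.6, 'validity': 0.4}
```

Scores like these per CDE are what a governance dashboard would aggregate to track quality concerns over time.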
Environment: Oracle, Toad, SQL, MDM, Tableau Desktop, Collibra, Python
Confidential - San Ramon, California
Data Analyst, Business System Analyst
Responsibilities:
- Implemented end-to-end Business Intelligence (BI) and data warehouse solutions for the Master Data Management (Confidential) & CCAR project, including logical and physical data modeling, ETL data flow, source-to-target mapping, Data Quality and Data Remediation.
- Extensive analysis of source data coming from originations and servicing (product processor) systems such as Loanserv, AFS, ALS, CACS, APPRO and UNIFI for the MDM & reporting of Confidential, Corporate Loans, Consumer Loans, HELOCs, Mortgages and Overdraft lines.
- Gathered requirements and prepared mappings for the staging and work layers before data is loaded into the MDM hub.
- Performed gap analysis of the KDEs for Schedules A through D of the FR Y-14M and Schedule H of the FR Y-14Q Fed reports.
- Prepared Metadata, Data Lineage and Data remediation requirement documentation for Informatica MDM project for Confidential , Corp Loans, First Lien and Home Equity.
- Extensive analysis of Party (Customer) data for the work-layer data model - Party, Account & Products - and for Data Quality and Match & Merge at the landing zone of Informatica MDM.
- Created Tableau scorecards and dashboards using various chart types to manage risk more efficiently based on charge-offs, recovery over a 12-month period, borrower ratings and the overall health of each portfolio.
- Worked on data virtualization tools to connect heterogeneous data sources and standardize data across the organization for use as a feed to the regulatory reporting tool. Created base, derived & flattened views, joins, unions, projections, selections, interfaces and associations of data service layers in Denodo. Experience installing and configuring the Virtual DataPort (VDP) database setup.
- Performed exploratory data analysis, including calculation of descriptive statistics, outlier detection, assumption testing and factor analysis.
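The exploratory steps above (descriptive statistics plus outlier detection) can be sketched with the standard 1.5 × IQR rule; the balance figures below are invented toy data:

```python
import statistics

# Toy sketch of exploratory data analysis: descriptive statistics plus
# outlier detection using the 1.5 * IQR (interquartile range) rule.
def describe_with_outliers(values):
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartiles
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return {
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values),
        "outliers": [v for v in values if v < lower or v > upper],
    }

balances = [100, 102, 98, 105, 97, 101, 99, 500]  # 500 is an obvious outlier
summary = describe_with_outliers(balances)
print(summary["outliers"])  # [500]
```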
Environment: Oracle, Toad, SQL, DENODO, Informatica PS, Informatica MDM, Python, Hadoop, Tableau desktop
Confidential - Livingston, NJ & Austin, TX
Business Data Analyst, Data Governance Consultant
Responsibilities:
- Integral part of the team integrating Confidential with Confidential by building one source layer (federated layer) to provide data to Confidential consumers for reporting - Regulatory, Credit Risk, Tax and Treasury. Implemented and managed end-to-end delivery of mortgage data for the FR Y-14M CCAR & FR 2052a liquidity reports. Extensively used SQL & Excel for data profiling, analysis and presentation.
- Maintained Critical Data Elements based on various regulatory and risk requirements, providing key information such as authoritative sources, usage, data quality and data standards. Created QlikView dashboards to generate monthly metrics for the Regulatory & Risk teams.
- As part of the stewardship framework, conducted working sessions with data stewards on sourcing and acquisition and on data quality issues and remediation. Reported escalations and updates to the Enterprise Working Group and Data Advisory Council. Restructured the existing process through automation and business process improvement.
- Built strong working relationships within the core teams across producers and consumers of the data through transparent communication, listening to both sides, documenting issues, and solving problems with deep domain knowledge.
- Created Data Mapping, BRD, test cases, provided in depth data analysis and worked through quality issues. Built a manual reconciliation process to GL.
- Created over 50 reconciliations from the ground up using feeds from OWB systems and the data feed from the Accounting team.
- Developed a machine learning model in Python to predict the best-performing OWB products relative to the average performance of products in a portfolio, analyzing the last ten years of historical data.
- Created monthly presentations for senior management on the edit checks received from the FRB for FR Y-14M CCAR reports.
- Created visualizations for monitoring and managing the loan portfolio in terms of liquidity risk, such as loan delinquency.
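A simplified sketch of the labeling step such a predictive model could be trained against - flagging products whose historical performance beats the portfolio average. Product names and figures below are entirely hypothetical, not actual OWB data:

```python
# Hypothetical sketch: label products that outperform the portfolio-average
# performance over a historical window. Product names and figures are made up.
def outperformers(history):
    """history: {product: [yearly performance figures]} -> sorted outperforming products."""
    all_values = [v for series in history.values() for v in series]
    portfolio_avg = sum(all_values) / len(all_values)
    return sorted(
        product
        for product, series in history.items()
        if sum(series) / len(series) > portfolio_avg
    )

history = {
    "product_a": [5.0, 6.0, 7.0],   # avg 6.0 -> above portfolio average
    "product_b": [1.0, 2.0, 3.0],   # avg 2.0 -> below
    "product_c": [4.0, 4.0, 4.0],   # avg 4.0 == portfolio avg -> excluded
}
print(outperformers(history))  # ['product_a']
```

A real model would use such labels as the target variable and train a classifier on product features; the sketch only illustrates the "above portfolio average" criterion.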
Environment: SQL Server, Denodo virtualization tool, QlikView, Collibra, OneSumX, SSIS, Python, QRM
Confidential - Saint Louis, MO
Sr. Business Analyst, Data Analyst, Data Engineer
Responsibilities:
- Implemented a DWH and BI system for COP (Customer Offer Palette) Application that converted manually intensive reporting into an automated solution utilized by Offer & Lead Management teams in making decisions to provide offers to customers.
- Implemented the Party (Customer) Hub, which acts as a golden source for distinct customers at the enterprise level. Integral part of the MDM Hub for the Customer Lifecycle Management (CLM) team.
- Created Data Quality rules to identify records with bad or missing data before they are loaded into Informatica MDM.
- Implemented global matching and merging rules on Party data coming from different SORs to produce the golden record for customers. Coordinated with the business team, data modelers, development and other teams to have these rules implemented in the Hub.
- Worked on the Global Rainbow project to replace the main systems of Confidential NA, such as OSRO, CWS and DBSales, with Eclipse G2C V2.
- Developed and implemented a Data Quality monitoring system to catch and report exceptions from the receivables system.
- Delivered the Cards POS Alignment project, adding 7 new products with enhanced value propositions to better meet consumer needs, drive acquisition and correct the misalignment of some products currently available at POS.
- Implemented a state-level pricing project to comply with the Dodd-Frank Wall Street Reform and Consumer Protection Act, which mandates that banks comply with state laws differentiating consumer and loan products by state.
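The match-and-merge step described above can be sketched as deterministic matching on a normalized key plus a most-recent-wins survivorship rule. The field names, key choice and sample records below are illustrative assumptions, not the actual Hub rules:

```python
# Illustrative sketch of MDM-style match & merge: records from different
# systems of record (SORs) are matched on a normalized key and merged into a
# golden record using a most-recent-update survivorship rule.
def golden_records(records):
    merged = {}
    for rec in records:
        # Deterministic match key: normalized name + date of birth (assumed).
        key = (rec["name"].strip().lower(), rec["dob"])
        current = merged.get(key)
        # Survivorship: keep the record with the latest update timestamp.
        if current is None or rec["updated"] > current["updated"]:
            merged[key] = rec
    return list(merged.values())

records = [
    {"name": "Jane Doe ", "dob": "1980-01-01", "updated": "2020-01-01", "sor": "ALS"},
    {"name": "jane doe",  "dob": "1980-01-01", "updated": "2021-06-01", "sor": "AFS"},
    {"name": "John Roe",  "dob": "1975-05-05", "updated": "2019-03-01", "sor": "CACS"},
]
golden = golden_records(records)
print(len(golden))  # 2 -- the two Jane Doe records merge into one
```

Production MDM hubs also support fuzzy (probabilistic) matching and per-attribute survivorship; the sketch covers only the simplest deterministic case.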
Environment: Oracle, Informatica, Informatica MDM, Collibra, Java, Mainframes, .NET, TIBCO, ESB, UNIX/Windows, SQL Server, WebSphere App Server
Confidential
PMO, team member - Branch Restructuring
Responsibilities:
- Change Manager: Responsible for all infrastructure changes of the branch. Coordinated changes to applications like TCSBanks.
- Branch Restructuring Manager: Responsible for opening, closing or relocating retail branches across India. Managed the team that coordinated with more than 200 groups for these activities.
- Vendor Manager: Responsible for renewals of application software, including getting quotations from vendors and creating purchase orders.