- Professional with over 8 years of Data Analyst/Business Analyst experience in the Finance and Banking domains.
- Experience as a Business Analyst with solid understanding of Business Process Flows, Case Tools, and Business Analysis.
- Comprehensive experience in the financial risk management domain, especially in Credit Risk Measurement, Capital Adequacy compliance (Basel III regulatory aspects), Asset and Liability Management, and Liquidity Risk Management.
- Worked on different OFSAA Enterprise Risk Management (ERM) solutions such as Basel Regulatory Capital.
- Experienced with Fixed Income, Mortgage-Backed Securities, Derivatives, Options, Equities, Corporate Bonds, Credits, Foreign Exchange, Banking, and other financial products. In-depth knowledge of the Defect Management Life Cycle and the Software Development Life Cycle using methodologies such as Agile and Waterfall.
- Experience in the Mortgage Market and Debt Securities.
- Extensive experience in documenting Business Requirements Documents (BRD) and Functional Requirements Documents (FRD).
- Good knowledge of SQL, with practical experience in MS SQL Server and Oracle databases and SharePoint.
- Expertise in UML business modeling tools and data modeling.
- Extensive work experience in technology related to Financial Services, Wholesale Banking, Consumer Banking, Basel II and III, Fixed Income Trading, Corporate Loans, Over-the-Counter (OTC) products, and Exchange-Traded Funds (ETFs).
- Extensive understanding of Software Development Life Cycle (SDLC) including Waterfall, Agile Scrum.
- Vast experience in performing SDLC tasks - Requirement gathering, Analysis and Feasibility, Creating Business Requirement Document (BRD), Transforming BRD to Functional Specification Documents (FSD), High Level and Low-Level Design documents etc.
- Experience with CCAR, Basel II, Basel III, and Dodd-Frank regulatory reporting: FR Y-14A/Q/M (annual, quarterly, monthly), FR Y-9C, DFA 165, DFA 165(e), and other reports required of banks for compliance reporting to various regulatory agencies.
- Strong analytical skill in analyzing Financial Risk such as Interest Rate Risk, Default Risk, Credit Risk, Inflation Risk, Operational Risk, Market Risk, Systematic Risk, and Volatility Risk associated with investing in securities.
- Detailed knowledge in Risk Management using metrics such as Greeks, VaR, Duration and Convexity.
- Expertise in identifying, developing interface and process specifications, documenting them in Business Requirement Documents, creating Functional Specifications Design (FSD), Business Process Documents, Data Flow Diagrams.
- Experience in diverse phases of the Software Development Life Cycle (SDLC) and methodologies such as Spiral, Agile-Scrum, Waterfall, Waterfall-Scrum, Hybrid, and Rational Unified Process (RUP).
- Business process research: Gap Analysis, Impact Analysis, and Feasibility Analysis.
- Requirements elicitation techniques, including conducting interviews.
- Experienced in continuous association and interaction with users during the entire application development life cycle (SDLC).
- Expertise in Requirement Analysis, Use Case development, UML Modelling with Sequence Diagrams, Activity Diagrams using MS Visio.
- Proficient in executing SQL queries to create user reports, perform data validation, and test data flows for business processes.
- Expertise in handling Production tickets in collaboration with Production support teams and business user groups
- Excellent analytical and problem-solving skills; a team player with strong interpersonal and communication proficiency.
- Good Knowledge of HIPAA Compliance, HIPAA Insurance Regulations and Claims.
- Good experience with Online Banking, Retail Banking, E-Commerce, Online Transaction Processing, and Electronic Funds Transfer.
- Strong knowledge of Software Development Life Cycle (SDLC) models (Agile, SCRUM, Waterfall, Rapid Prototyping, and Spiral Model)
- Effectively interacted with customers and Subject Matter Experts (SME) to gather and document business requirements across multiple business process areas.
- Strong experience in coordinating Joint Application Development (JAD) sessions and interviews.
- Experience in documenting system and business requirements and specifications; designing and developing use cases, test-case scenarios, root-cause analysis, and GAP analysis; developing test plans and test scripts using SQL; and conducting System Integration Testing (SIT) and User Acceptance Testing (UAT).
- Effectively monitored the defect tracking process in the functional, integration, and regression test phases of the project. Created and maintained the Requirement Traceability Matrix (RTM) using MS Excel.
- Experience in conducting GAP analysis, SWOT analysis and Root Cause Analysis.
- Documented Test Plans, Test Cases, Test Scripts, Test Procedures based on the Design Document and User Requirement Document for the Black Box, Functional, Usability and User Acceptance Testing (UAT).
- Developed user interfaces, mockups, and wireframes to help users relate to the requirements during requirements sessions.
- Comprehensive knowledge of the testing life cycle within web, client/server, and mainframe environments.
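Two of the risk metrics named above (historical VaR and Macaulay duration) can be illustrated with a short Python sketch; the bond cash flows and P&L figures below are made up for illustration, not drawn from any production model.

```python
# Minimal sketch of two risk metrics: one-day historical VaR and
# Macaulay duration. All inputs are illustrative.
import numpy as np

def historical_var(pnl, confidence=0.95):
    """One-day historical VaR: the loss threshold exceeded in
    (1 - confidence) of the observed P&L history."""
    return -np.percentile(pnl, (1 - confidence) * 100)

def macaulay_duration(cashflows, times, ytm):
    """Weighted-average time to receive a bond's cash flows,
    each discounted at the yield-to-maturity `ytm`."""
    times = np.array(times, dtype=float)
    pv = np.array(cashflows, dtype=float) / (1 + ytm) ** times
    return float(np.sum(times * pv) / np.sum(pv))

# Hypothetical 5-year annual 5% coupon bond (face 100) at a flat 5% yield;
# its Macaulay duration comes out a bit over 4.5 years.
dur = macaulay_duration([5, 5, 5, 5, 105], [1, 2, 3, 4, 5], 0.05)
```

Convexity and the Greeks extend the same idea (second-order sensitivity and option sensitivities, respectively) but need more inputs than fit a short sketch.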
Sr. Data Analyst/ Business Analyst
Confidential - Dallas, TX
- Performed reverse engineering for a wide variety of relational DBMSs, including Microsoft Access, Oracle, and Teradata, connecting to existing databases and creating graphical representations (E-R diagrams) using Erwin 7.3.
- Worked in a Teradata environment based on data from the PDM; conceived, designed, developed, and implemented this model from scratch.
- Performed star schema dimensional modeling for the data mart using Visio and created dimension and fact tables based on business requirements.
- Responsible for statistical applications support and programming, primarily in SAS, supporting the Marketing team in ACOE. Used pandas with Python to create high-performance data structures.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005.
- Performed statistical analysis using SQL, SAS, R, Python, and Excel.
- Generated regular and ad-hoc reports using SAS tools, Oracle, and ETL sources.
- Built statistical, predictive, and regression models using SAS tools.
- Gathered report requirements and determined the best solution to deliver the results as a Reporting Services report, an Analytical Cube, or an Excel table.
- Designed SSIS packages to extract data from different sources such as SQL Server 2008, MS Excel, and MS Access, then transform and load it into dimension and fact tables in the Data Warehouse.
- Developed SSIS packages using a Foreach Loop in the Control Flow to process all Excel files within a folder, a File System Task to move each file into an archive after processing, and an Execute SQL Task to insert transaction log data into a SQL table.
- Wrote complex SQL queries using joins to extract data from the Teradata database and scheduled the jobs in UNIX.
- Developed and supported Extraction, Transformation, and Loading (ETL) processes using Informatica PowerCenter to populate Teradata tables and flat files.
- Migrated SQL Server objects (tables/views) to Teradata objects.
- Created and maintained predictive and multivariate models in the SAS BI suite.
- Created and maintained required Data marts and OLAP Cubes using SAS DI Studio and SAS OLAP Cube Studio to fulfill reporting requirements
- Migrated data from SQL Server to Teradata; worked with SQL across database environments including SAS, Oracle, Teradata, and Hadoop.
- Implemented metadata standards, data governance and stewardship, master data management, ETL, ODS, data warehouse, data marts, reporting, dashboard and predictive modeling.
- Developed mappings in Informatica PowerCenter 9.6 to load data from various sources, including SQL Server, DB2, Oracle, and flat files, into the Data Warehouse, using transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator.
- Conducted User Acceptance Testing (UAT) on the application - resolved issues from the participants, prepared and submitted Test Analysis Reports and participated in Product Readiness Review.
- Responsible for debugging, troubleshooting, and tuning PL/SQL stored procedures and functions and Complex SQL.
- Designed, coded, and implemented SAS code for analytic studies to fulfill internal and external client requests, maintained, evaluated, and reported on recurring analytic studies. Performed data hygiene and created data validation program.
- Performed in-depth data analysis and prepared weekly, biweekly, and monthly reports using SQL, SAS, MS Excel, MS Access, and UNIX.
- Extracted, validated, and generated SAS data sets from Teradata using the SQL pass-through facility.
- Created and manipulated datasets using SAS, Access, and Excel.
- Involved in building a specific data mart as part of a Business Objects Universe, replacing the existing reporting system that was based on exporting data sets from Teradata to Excel spreadsheets.
- Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform.
- Used SAS EG extensively for data analysis and involved in writing macros.
- Used SAS procedures to achieve the required data analysis and for data transformations.
Environment: Linux, MySQL, Spark, R, R-Studio, SPSS, Machine Learning, Tableau, Perl, SQL, Python.
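The flat-file import and cleansing work described in this role can be sketched in pandas; the pipe-delimited sample and column names below are hypothetical stand-ins for the actual Teradata/Oracle/SQL Server feeds.

```python
# Minimal pandas cleansing sketch: trim whitespace, coerce types,
# drop duplicates and rows with missing balances. Data is made up.
import io
import pandas as pd

raw = io.StringIO("acct_id|balance|open_date\n"
                  "A1| 1500.00 |2014-03-01\n"
                  "A1| 1500.00 |2014-03-01\n"   # exact duplicate row
                  "A2|  |2014-05-10\n")          # missing balance

df = pd.read_csv(raw, sep="|")
df.columns = df.columns.str.strip()
# Coerce balance to numeric; unparseable/blank values become NaN.
df["balance"] = pd.to_numeric(df["balance"].astype(str).str.strip(),
                              errors="coerce")
df["open_date"] = pd.to_datetime(df["open_date"])
df = (df.drop_duplicates()
        .dropna(subset=["balance"])
        .reset_index(drop=True))
```

The same trim/coerce/dedupe steps map naturally onto SSIS derived-column and lookup transformations when the pipeline runs inside SQL Server instead of Python.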
Confidential - Boston, MA
- Defined processes and tools best suited to the two projects (SGS-CM Integration & Sigue Agent App). Moved between Waterfall and Agile approaches depending on project specifics and stakeholder goals, creating detailed project road maps, plans, schedules and work breakdown structures.
- Gathered, analyzed, and documented Business Requirement Documents (BRD) and Functional Requirement Documents (FRD) from both formal and informal sessions, and validated the needs of stakeholders and SMEs on the SGS-Control Money integration project.
- Worked on requirements gathering and designed screen functionalities and wireframes for the Sigue Agent App's 'Pending Queue', 'Confirmed Queue', 'Get Quote', 'User and location recap', and 'Agent balance' modules.
- Planned and documented procedures for data processing and prepared Data flow diagrams for the Sigue Agent application.
- Prepared Functional Design Specifications (FDS) employing Use Case scenarios, Sequence diagrams and Class diagrams using MS Visio.
- Used Rally and assisted the project manager in making the Project Plan with necessary modifications.
- Documented Test Cases and assisted the QA Team with the Test Plan. Documented the Requirement Traceability Matrix (RTM) for tracing the requirements and the Test Cases related to them.
- Implemented the defect tracking tool Kayako to log, track, and resolve defects and to generate reports that helped the team trace QA status.
- Performed Requirement analysis and Impact analysis. Followed a systematic approach to elicit, organize, and document requirements on the legacy systems.
- Studied legacy system methodologies to understand the "Red Phone" model in the money transfer and transaction process.
- Performed GAP analysis; identified and documented areas for improvement between the SGS and Control Money legacy systems to meet money transfer regulations.
- Conducted interviews, meetings and JAD sessions during the process of Requirement Gathering.
- Reviewed the Joint Requirement Documents (JRD) with the cross functional team to analyze the high-level requirements.
- Documented user stories in the Product Backlog for the Sigue Agent App; drove story review meetings while ensuring clear acceptance criteria were derived for each user story.
- Provided walk-throughs of User Stories, Mock Ups and User Acceptance Criteria to Development team, QA and Stakeholders during the work item Kick-Off Meetings for Sigue Agent App.
- Assisted with User Acceptance Testing (UAT), developed and maintained quality procedures ensuring that appropriate documentation is in place.
- Provided inputs (Scope, Goals, Risks, Constraints, Timelines and Interfaces) to Product Owner for Project Planning and Scheduling.
- Participated in the Incident Management and Problem Management processes for root cause analysis, resolution and reporting.
- Designed User Interfaces and mock up screens using HTML to validate the requirements.
- Customized and maintained the company's intranet which included company communications, dashboards, reports, business libraries and workflows using SharePoint.
- Created and implemented a Test Plan and thereby assisted Quality Analysts to test the product in a timely manner.
- Interfaced with Quality Analysts for Unit, Integration, System and User Acceptance Testing.
Environment: Microsoft Office Suite, Invision, Microsoft Visio, Rally, Smart Draw, Kayako, Java, Windows XP
Data Analyst/Business Analyst
Confidential - Charlotte, NC
- Performed extensive "Gap analysis" of the Wealth Management application to document existing business process flows and derive requirements for the proposed functionalities
- Administered Joint Application Development (JAD) sessions and interviewed end-users to elicit business requirements; Categorized and documented these requirements in the form of user stories and epics
- Collaborated with the Product Owner to conduct story-writing workshops; participated in writing user stories, defining acceptance criteria, and estimating the level of effort to complete each user story
- Teamed-up with Subject Matter Experts to comprehend the artifacts within Investment Portfolio Management, Performance Management, Trade Lifecycle and Asset Management including asset allocation, asset diversification, risk & return analysis and cash flow management
- Assisted solution architects to develop technology solutions for the wealth management application and asset classes including Equities, Commodities and Fixed Income instruments
- Gathered functional requirements, analyzed workflows and created narrative Use Cases to develop new process workflows for the Wealth Management application
- Employed tools such as MS Visio and Balsamiq Mockups to create UML diagrams, wireframes and screen mockups to capture functional requirements of the proposed application
- Participated in daily stand-up meetings to assist the Scrum team to prioritize work and resolve technical issues
- Utilized Agile tool "JIRA" for backlog management and created reports for Sprint meetings using Burn down charts and Task boards
- Employed document management tool "Confluence" to ensure efficient cross-functional collaboration
- Collaborated with the Scrum Master in conducting Scrum ceremonies including Sprint planning, Sprint review and Sprint retrospective sessions
- Coordinated with business partners to develop a test plan and conduct User Acceptance Tests (UATs) to validate the application
- Collaborated with diverse lines of business to understand the application's cross-functional impact and devise a mitigation plan
- Executed SQL queries to extract data from the database to check the data for consistency
Environment: Agile (Scrum), Microsoft Visio, Balsamiq Mockups, JIRA, Confluence, MS Office Suite, HP ALM (Quality Center), SQL Server 2008, XML
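The SQL consistency checks mentioned in this role can be illustrated with a self-contained sketch; SQLite and the `portfolio`/`trade` tables below are hypothetical stand-ins for the actual wealth-management schema.

```python
# Hypothetical data-consistency check: every trade must reference an
# existing portfolio. Orphan rows are found with a LEFT JOIN.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE portfolio (portfolio_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE trade (trade_id INTEGER PRIMARY KEY,
                        portfolio_id INTEGER, amount REAL);
    INSERT INTO portfolio VALUES (1, 'Growth'), (2, 'Income');
    -- trade 12 points at portfolio 9, which does not exist
    INSERT INTO trade VALUES (10, 1, 2500.0), (11, 2, 900.0), (12, 9, 50.0);
""")
orphans = cur.execute("""
    SELECT t.trade_id FROM trade t
    LEFT JOIN portfolio p ON p.portfolio_id = t.portfolio_id
    WHERE p.portfolio_id IS NULL
""").fetchall()
```

The same orphan-row pattern works unchanged on SQL Server or Teradata; only the connection layer differs.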
Confidential - Charlotte, NC
- Acted as a primary contact in all phases of the Software Development Life Cycle (SDLC), including Quality Assurance Testing, Performance Testing, and User Acceptance Testing.
- Recommended changes for system design, methods, procedures, policies and workflows affecting Medicare/ Medicaid claims processing.
- Performed GAP analysis for ICD-9 and ICD-10 and EDI Message Structure with the 4010 Structure. Developed End-to-End Business Process Flows for HIPAA 5010 EDI transactions including 834 (Benefit Enrollment and Maintenance), 835 (ERN-Electronic Remittance Notification) and 837 (Claims Submission) Transactions.
- Worked on the analysis of the ICD-9 to ICD-10 code conversion project using GEM (General Equivalence Mapping) processes and concepts. Worked on a project involving Miami Systems to create drug cards and created test files to be sent to drug card vendors for approval.
- Involved in end-to-end testing of Facets Billing, Claim Processing and Subscriber/Member module.
- Maintained knowledge of Medicare and Medicaid rules and regulations pertaining to the Facets configuration and evaluated the impact of proposed changes in rules and regulations.
- Contributed to the build and design of an organizational wiki that provided comprehensive knowledge of workflows, policies and procedures, patient care objectives, regulatory requirements, and industry best practices for membership management.
- Worked on a "Paid Without Prejudice" project for various states. Performed UAT using the Morae usability testing tool in Observer and Manager modes.
- Experience in writing SQL queries, Stored Procedures and Triggers.
- Worked extensively with Tableau and MS Excel to generate reports.
- Responsible for architecting integrated HIPAA, Medicare solutions, and Facets EDI 834.
- Worked with FACETS, eBilling and EDI HIPAA Claims (837/835/834) processing.
- Developed test Scripts using Test Director/Quality Center and coordinated with developers to quickly resolve the defects associated with them.
- Conducted JAD sessions with the management, users and other stakeholders for open and pending issues to develop specifications. Analyzed and evaluated User Interface Designs, Technical Design Documents and Quality Assurance Test Conditions to test the performance of the application from various dimensions.
- Helped create the 'Business Glossary' to facilitate efficient understanding of the business process among the other teams. Assisted in creating the Functional Design Document from the Business Requirements Document, which the development team used as a reference while preparing the design, and held responsibility for the data setup required for unit testing.
- Worked extensively on executing SQL queries against the database to verify data integrity.
Environment: IDX, MS Visio, Word, Excel, PowerPoint, CMMI, Rational Rose, Requisite Pro, Clear Case, Clear Quest, SQL, Oracle, J2EE technology.
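The GEM-based ICD-9 to ICD-10 conversion described above boils down to a (sometimes one-to-many) code lookup; the tiny mapping table below is a hypothetical stand-in for the full CMS GEM files.

```python
# Illustrative GEM-style forward mapping (ICD-9 -> ICD-10).
# The entries here are a small hand-picked sample, not the real GEM data.
gem_forward = {
    "250.00": ["E11.9"],             # type 2 diabetes, no complications
    "401.9": ["I10"],                # essential hypertension
    "V72.0": ["Z01.00", "Z01.01"],   # eye exam: one-to-many mapping
}

def map_icd9(code):
    """Return candidate ICD-10 codes, or [] when no GEM entry exists."""
    return gem_forward.get(code, [])
```

One-to-many entries like "V72.0" are why GEM conversion needs analyst review rather than a purely mechanical translation.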
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables and columns as part of the Source Data Analysis responsibilities.
- Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
- Worked on Extracting, Transforming and Loading (ETL) process to load data from Excel, Flat file to MS SQL Server using SSIS.
- Used NoSQL database with Cassandra and MongoDB.
- Worked on code migration from PC SAS to the SAS Grid server.
- Used Python, R, SAS, and SQL to create machine learning models involving multivariate regression, linear regression, logistic regression, PCA, random forest models, matrix factorization models, and Bayesian collaborative models to target users with email campaigns and native ads.
- Built and implemented mathematical models with engineering code in Python (pandas, scikit-learn, and others).
- Involved in migrating the data model from one database to a Teradata database and prepared a Teradata staging model.
- Used the matplotlib and seaborn libraries to create modern, informative visualizations in Python.
- Created SSIS packages to load the data from Text File to staging server and then from staging server to Data warehouse.
- Implemented ETL using SQL Server Integration Services (SSIS), applying business logic and data cleaning in the staging server to maintain parent-child relationships.
- Deliver reports and ad-hoc analysis focused in the area of client behavior and profiling using SAS, SQL and Excel.
- Worked on Control flow tasks such as Execute SQL task, Send Mail Task, File System Task, Dataflow Task and used different data sources and destination with derived column, lookup transformation within Dataflow Task.
- Built logistic regression models to identify the control group cohorts using SAS/STAT and other SAS modules.
- Worked on NoSQL databases including HBase, MongoDB, and Cassandra.
- Migrated data from flat files, CSV, MS Access, Excel, and OLE DB sources to SQL Server databases.
- Used SQL Server Reporting Services (SSRS) to generate reports.
- Configured report server and report manager scheduling and granted permissions to different levels of users in SQL Server Reporting Services (SSRS).
- Generated scheduled reports for business analysis or management decision using SQL Server Reporting Services (SSRS).
- Experienced in generating and documenting metadata while designing OLTP and OLAP system environments.
- Involved in preparing data in Excel for Tableau dashboards.
- Involved with the organization's non-transactional data entities under MDM (Master Data Management), with the objective of providing processes for collecting, aggregating, matching, consolidating, quality assurance, persistence, and distribution.
- Worked with business partners to understand and capture reporting and analytics requirements, facilitated decisions and resolutions across multiple stakeholders, and translated business requirements into reporting solutions.
- Applied in-depth knowledge of Human Resources to lead the design of moderately to highly complex Cognos reports per customer specifications, including design and analysis.
- Identified and enhanced existing reports and processes.
Environment: SQL, MySQL, SAS, R, R-Studio, Hadoop, Linux, Tableau Desktop, Tableau Server, Unix shell scripting, Python, SQL Server, Bash, Microsoft Excel.
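A logistic-regression cohort model like the ones described in this role can be sketched with scikit-learn; the synthetic features and labels below are assumptions for illustration, not the actual campaign data.

```python
# Toy logistic-regression cohort classifier on synthetic data.
# In production the equivalent models ran in SAS/STAT (PROC LOGISTIC).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
# Synthetic label: cohort membership driven mostly by the first feature,
# with a little noise so the classes are not perfectly separable.
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)
accuracy = model.score(X, y)   # training accuracy on the synthetic set
```

The fitted coefficients (`model.coef_`) play the same role as the parameter estimates table in PROC LOGISTIC output.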
Confidential - NYC, NY
- Gathered the business requirements by conducting a series of meetings with business users.
- Tuned query performance by working intensively on indexes.
- Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
- Used Sybase Power Designer tool for relational database and dimensional data warehouse designs.
- Created reports using SQL Server Reporting Services (SSRS).
- Involved in Data mapping specifications to create and execute detailed system test plans.
- The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Used SQL for querying the database in the UNIX environment.
- Responsible for creating mapping documents required for the ETL team.
- Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (loading data and analyzing it using OLAP tools).
- Experience in Extraction, Transformation, and Loading (ETL) processes using SSIS.
- Conducted data mapping sessions and wrote the required business requirements documentation, including use cases.
- Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
- Used Star Schema and Snowflake Schema for data marts and the Data Warehouse.
- Prepared the joins/filter logic documents to help the ETL design team perform the joins on tables that arrive as flat files before loading them to FDS or any downstream systems.
- Created documentation and test cases, worked with users for new module enhancements and testing.
- Used SQL Server 2005 tools like Management Studio, Query Editor, Business Intelligence Development Studio (BIDS) including SSIS and SSRS.
- Created job schedules to automate the ETL process.
- Normalized the database to 3NF before organizing it into the star schema of the data warehouse.
- Wrote PL/SQL statements and stored procedures in Oracle for extracting as well as writing data.
- Collected business requirements to set rules for proper data transfer from Data Source to Data Target in Data Mapping.
- Loaded staging tables on Teradata and then loaded target tables on SQL Server via DTS transformation packages.
- Reduced labor-intensive manual processes such as job entry, approval, and invoicing by exposing a well-defined SOA interface allowing B2B integration with financial clients.
- Worked on building load rules and transformation logic for Extraction, Transformation and Loading (ETL) of the data into the data warehouse.
Environment: Sybase, SOA, Sybase Power Designer, SSIS, DTS, Rational Rose, Oracle 9i, ETL, Teradata, Oracle Designer, PL/SQL, XSD, DOS and UNIX shell scripting, Sequential Files.
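The dimension/fact loading pattern described in this role (load dimensions first, then resolve each fact row's surrogate key) can be sketched with SQLite as a stand-in for the actual Teradata/SQL Server targets; the table and column names are hypothetical.

```python
# Toy star-schema load: one dimension, one fact table, surrogate-key lookup.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                              product_code TEXT UNIQUE, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             product_key INTEGER REFERENCES dim_product,
                             amount REAL);
""")
# 1) Load the dimension first so surrogate keys exist.
cur.executemany("INSERT INTO dim_product (product_code, name) VALUES (?, ?)",
                [("P1", "Checking"), ("P2", "Savings")])
# 2) For each incoming fact row, resolve the natural key to a surrogate key.
for code, amount in [("P1", 120.0), ("P2", 75.5), ("P1", 30.0)]:
    key = cur.execute("SELECT product_key FROM dim_product "
                      "WHERE product_code = ?", (code,)).fetchone()[0]
    cur.execute("INSERT INTO fact_sales (product_key, amount) VALUES (?, ?)",
                (key, amount))
# A reporting query joins fact back to dimension, as SSRS/OLAP tools would.
totals = cur.execute("""
    SELECT d.product_code, SUM(f.amount) FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_code ORDER BY d.product_code
""").fetchall()
```

In SSIS the per-row key resolution in step 2 is what the Lookup transformation does inside the Data Flow.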