Sr. Business Data Analyst Resume
Philadelphia, PA
SUMMARY
- Over 15 years of professional experience in Information Technology as a Sr. Business Data Analyst working on AWS Cloud and Client/Server applications, including 13 years of experience in the mortgage and banking industry.
- Possess strong presentation skills and the ability to communicate at different levels within the organization, as well as exceptional problem-solving and analytical skills.
- Intensive exposure to the entire Software Development Life Cycle (SDLC), including requirements, analysis, design, development, implementation, maintenance, production support, and testing, in both client/server and web-based applications.
- Proficient in data analysis, data profiling, data cleansing, and data mining using Dremio, PL/SQL, Python, Power BI, Tableau, IDQ, DataFlux, and Excel.
- Strong experience in using Python for data analysis and visualization, as well as in machine learning and automation.
- Created data visualization reports using Power BI Desktop, Python, Business Objects XI (BOXI), and SSRS.
- Experience working on data classification and handling sensitive data, as well as managing data access groups.
- Collaborated and coordinated with cross-team stakeholders to conduct integration testing on UAT/E2E environments using prod-like data.
- Created E2E data flow diagrams, business process and data flow charts using Microsoft Visio.
- Worked on Source to Target Mapping documents and assisted in creating logical and physical Data Models.
- Created risk assessment and impact analysis documents for new functionality and changes to the existing functionality.
- Extensive experience in writing project scope documents, functional/technical/transitional/operational requirements, report requirements, test strategy, test scenarios, test plans, test cases, RTM, defect log, test summary report, and UAT/system mockup data preparation.
- Experience in creating automated control-validation DQ scripts to identify potential anomalies before distributing production data to consumers (see the sketch after this list).
- Developed and maintained end-user computing applications to automate business functionality using PL/SQL, stored procedures, and VBA code.
- Experience in creating and validating Data Correction Utilities (DCUs) and Emergency Fix Scripts to correct the production data.
- Expertise in executing Autosys jobs, verifying error logs, and troubleshooting issues.
- Experience in creating ETL packages using SSIS to extract, transform, and load source/legacy data, as well as to load consumer data to an SFTP server.
- Experience creating SSRS reports and loading them to SharePoint for end-user usage.
- Extensive experience in functionality testing, backend testing, regression testing, performance testing, stress testing, GUI testing, system integration testing, and system testing, as well as user acceptance testing.
- Experience in developing applications using ASP.NET, VB.NET, ASP, VB, Visual C++, XML, and Crystal Reports XI.
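Illustrative sketch of such a control-validation DQ script, in Python with pandas; the file name and columns (loan_feed.csv, loan_id, upb_amount, loan_purpose) are hypothetical stand-ins, not an actual production feed:

    import pandas as pd

    # Hypothetical extract of a production feed prior to distribution.
    df = pd.read_csv("loan_feed.csv")

    # Each control maps a name to a pass/fail result.
    checks = {
        "row_count_nonzero": len(df) > 0,
        "no_duplicate_keys": df["loan_id"].is_unique,
        "no_null_balances": df["upb_amount"].notna().all(),
        "valid_loan_purpose": df["loan_purpose"].isin(["PURCHASE", "REFI"]).all(),
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        raise SystemExit(f"DQ controls failed: {failed}")  # block distribution
    print("All DQ controls passed; data cleared for consumers.")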
TECHNICAL SKILLS
Development: AWS, Python 3, Power BI, Tableau, Hadoop, Hive, Spark, Ab Initio, Informatica, DataFlux, IDQ, MicroStrategy, Business Objects XI, Crystal Reports XI, JSON, SSIS, SSRS, ASP.NET, VB.NET, C#, C++, VB, ASP, PL/SQL, and XML
Databases: MongoDB, Snowflake, PostgreSQL, Redshift, SQL Server, Oracle, Mainframe, Teradata, SAS, IBM DB2, Sybase, and MS Access
Applications: JIRA, VersionOne, Collibra, Confluence, Dremio, Splunk, TOAD, SQL*Plus, RapidSQL, ER Studio, SharePoint, MS Visio, IBM Rational DOORS, MS Visual Studio 2005, PuTTY, Cucumber, ALM, and Quality Center
PROFESSIONAL EXPERIENCE
Confidential, Philadelphia, PA
Sr. Business Data Analyst
Responsibilities:
- Actively participated in Agile ceremonies, providing timely input in sprint planning, retrospectives, and daily stand-up meetings.
- Performed data profiling, data cleansing, and data analysis using Power BI and shared data insights with business users.
- Supported the migration of on-premises data to the AWS Cloud Enterprise Data Lake (EDL).
- Automated validation scripts in Python for on-premises and AWS S3 bucket data (see the sketch after this section) and developed monthly management metrics in Python.
- Developed Power BI reports and dashboards to surface operational and business metrics.
- Customized the JIRA sprint board to set Service Level Agreements (SLAs) and priorities, which helped identify missed-SLA tasks and bugs and enabled proactive communication about approaching SLAs.
- Wrote Hive queries to analyze data in the Hive warehouse using the Hive Query Language (HQL).
- Validated on-premises and Cloud EDL S3 data against PostgreSQL and Redshift databases using Dremio.
- Worked on source-to-target mapping (STTM) documents, a consolidated attribute list at the enterprise level, consolidated DBMODs, and producer and consumer interface documents.
- Worked with the Data Governance team to model and define the newly onboarded data elements.
- Created Data Correction Utilities (DCUs) and Emergency Fix scripts to update production data.
- Worked with SMEs to understand the functional flow of data from the source to the target system.
- Tracked and resolved issues, promptly escalating to management when needed.
Environment: AWS, JIRA, MongoDB, Snowflake, PostgreSQL, Redshift, AWS S3, Dremio, CloudWatch, Kibana, Kinesis, Python 3, Power BI, Collibra, Confluence, Control-M, and Okera.
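Illustrative sketch of the Python validation of on-premises data against its S3 copy, assuming a Parquet feed; the bucket, key, local path, and loan_id column are hypothetical:

    import io
    import boto3
    import pandas as pd

    # Hypothetical locations; real names would come from the EDL configuration.
    BUCKET, KEY = "edl-prod-raw", "loans/2020/03/loans.parquet"
    LOCAL_FILE = "/data/onprem/loans.parquet"

    on_prem = pd.read_parquet(LOCAL_FILE)
    obj = boto3.client("s3").get_object(Bucket=BUCKET, Key=KEY)
    cloud = pd.read_parquet(io.BytesIO(obj["Body"].read()))

    # Control 1: row counts must match after hydration.
    print(f"on-prem rows={len(on_prem)}, cloud rows={len(cloud)}")

    # Control 2: no keys dropped in flight.
    missing = set(on_prem["loan_id"]) - set(cloud["loan_id"])
    if missing:
        print(f"{len(missing)} loan_ids missing from the S3 copy")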
Confidential, San Francisco, CA
Sr. Business Data Analyst
Responsibilities:
- Performed data analysis, provided data insights, and identified producer and consumer systems impacted by each user story.
- Worked on Data Quality rules and reference data for enumeration values.
- Worked closely with the business to clarify requirements/features and create user stories.
- Developed complex data sets using advanced queries in Teradata and Hadoop.
- Participated in PI planning, innovation items, and showcase sprint planning.
- Worked on maintenance features for production issues and consumer requests.
- Performed risk assessments on SLAs and documented risk levels with short-term and long-term remediations.
- Created E2E data flow diagrams and business process and data flow charts using Microsoft Visio.
- Provided knowledge transfer (KT) sessions on the EU application to new team members.
- Reviewed the business requirements and provided feedback for EU performance metrics.
- Optimized Pega source queries to fit business needs.
- Implemented the project's best practices and coding standards.
- Performed data analysis, reporting, and dashboarding using Microsoft Power BI Desktop and shared data insights with business users.
- Analyzed complex business requirements and updated the source-to-target mapping (STTM) document, including data definitions, transformations, derivations, and enumerations.
- Worked with SMEs to understand the functional flow of data from the source to the target system.
- Updated the sprint board with accurate information to identify risks and issues proactively at the sprint level.
- Tracked and resolved issues, promptly escalating to management when needed.
- Performed deep-dive analysis to triage data issues and documented root causes, remediations, and impacted systems.
- Automated the merging of master mapping documents and their load into Collibra for data governance and data lineage (see the sketch after this section).
- Collaborated with cross-team stakeholders to conduct integration testing on UAT/E2E environments with prod-like data.
Environment: Teradata, SQL Server, Hive, Python 3, JIRA, Power BI, Spark, SharePoint, and UNIX.
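Illustrative sketch of the mapping-document merge, assuming one Excel workbook per source system with an "STTM" sheet; the folder layout, sheet name, and output file are hypothetical, and the consolidated CSV would then be loaded into Collibra through its import tooling:

    import glob
    import pandas as pd

    # Hypothetical layout: one workbook per source system, each with an STTM sheet.
    frames = []
    for path in glob.glob("mappings/*.xlsx"):
        df = pd.read_excel(path, sheet_name="STTM")
        df["source_file"] = path  # keep lineage back to the originating document
        frames.append(df)

    master = pd.concat(frames, ignore_index=True).drop_duplicates()
    # Consolidated master mapping, ready for the Collibra import step.
    master.to_csv("master_mapping.csv", index=False)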
Confidential, Sterling, VA
Sr. Business Data Analyst
Responsibilities:
- Worked on the source-to-target mapping (STTM) document, including data definitions, transformations, derivations, and enumerations.
- Analyzed business requirements and worked with SMEs to understand the functional flow of information from source systems to the target system.
- Worked with the Product Owner on backlog grooming, user story creation, and prioritization.
- Performed data profiling, data cleansing, and data analysis on target-system data to provide statistics that helped business stakeholders make informed decisions.
- Wrote complex SQL queries to pull large data sets and performed analysis using PL/SQL.
- Performed impact analysis on new source-system data to be consumed by the EDW.
- Worked with business users on Change Control Board (CCB) tickets and provided LOEs to implement them in the EDW.
- Worked closely with the development team to ensure the software application fit within the architecture and exhibited the required behaviors.
- Created an automation tool to merge individual mapping documents into an enterprise-level master mapping document.
- Reviewed report requirements and provided feedback based on EDW implementation feasibility.
- Provided guidelines and mapping details to create a semantic layer for reports and self-serve usage.
- Produced loan data forecasts on critical elements using a Python linear regression model trained on historical data (see the sketch after this section).
- Created data visualization reports rendering the data as scatter, line, bar, and pie charts using the Matplotlib, Plotly, and Seaborn libraries.
- Analyzed and processed complex data sets using advanced querying in Python.
- Involved in UI Design for Power BI Reports and designed effective layouts.
- Worked on requirements for dashboards and charts, including clustered column charts, waterfall charts, gauges, pie charts, and treemaps, using Power BI visualizations.
- Worked on Data Analysis Expressions (DAX) for accessing data directly from a tabular SSAS database.
- Developed Tablix reports, tabular reports, matrix reports, drill-down reports, and charts using SQL Server Reporting Services (SSRS).
- Worked on a data definition standards document in support of Enterprise Data Management, whose goal is to consistently define and make standardized data available across the enterprise.
- Worked independently and/or collaboratively with Subject Matter Experts and business points of contact.
Environment: AWS Cloud, MS SQL Server, Python 3, JIRA, SSAS tabular model, Power BI, MS Visual Studio 2014, SSIS, SSRS, SQL Server Data Tools (SSDT), SharePoint, UNIX, MS Access 2016, and MS Project.
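Illustrative sketch of the linear regression forecast and charting, with synthetic data standing in for the historical loan data:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.linear_model import LinearRegression

    # Synthetic 24-month history of a critical element (e.g., funded loan volume).
    months = np.arange(1, 25).reshape(-1, 1)
    volume = 1000 + 42 * months.ravel() + np.random.default_rng(0).normal(0, 60, 24)

    # Fit on history, then forecast the next 6 months.
    model = LinearRegression().fit(months, volume)
    future = np.arange(25, 31).reshape(-1, 1)
    forecast = model.predict(future)

    plt.plot(months, volume, "o", label="history")
    plt.plot(future, forecast, "r--", label="forecast")
    plt.xlabel("month")
    plt.ylabel("loan volume")
    plt.legend()
    plt.savefig("loan_forecast.png")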
Confidential, Washington, DC
Sr. Business Data Analyst
Responsibilities:
- Automated Remote Tele-Worker eligible-expense processing (hotel stays, travel, and transportation) using Confidential's HR data portal, based on primary Confidential offices.
- Reconciled all activities necessary to process multi-state payroll and account for related transactions (e.g., salaries, benefits, deductions, taxes, and third-party payments).
- Automated employee pay-period preview reports (new hires, overtime, paychecks issued to inactive employees, multiple paychecks per pay period, etc.).
- Reconciled ADP top-row validations by comparing ADP data with employee payroll data from Confidential HR Home (see the sketch after this list).
- Ensured compliance with relevant laws and internal policies. Established and monitored appropriate controls, policies, and procedures.
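Illustrative sketch of the ADP-versus-HR payroll reconciliation in Python with pandas; the file names and columns (employee_id, gross_pay) are hypothetical stand-ins for the actual feeds:

    import pandas as pd

    # Hypothetical extracts of the two systems being reconciled.
    adp = pd.read_csv("adp_payroll.csv")   # employee_id, gross_pay, ...
    hr = pd.read_csv("hr_payroll.csv")

    merged = adp.merge(hr, on="employee_id", how="outer",
                       suffixes=("_adp", "_hr"), indicator=True)

    # Employees present in only one system.
    orphans = merged[merged["_merge"] != "both"]

    # Matched employees whose gross pay disagrees by more than a cent.
    both = merged[merged["_merge"] == "both"]
    diffs = both[(both["gross_pay_adp"] - both["gross_pay_hr"]).abs() > 0.01]

    orphans.to_csv("unmatched_employees.csv", index=False)
    diffs.to_csv("pay_discrepancies.csv", index=False)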
Confidential, McLean, VA
Sr. Business Data Analyst
Responsibilities:
- Worked in an Agile (iterative/Scrum) environment as a core team member, participating in sprint planning, daily stand-ups, sprint demos/reviews, retrospectives, and backlog refinement.
- Created end-to-end process flowcharts for the Primary Mortgage Loan APP and the Correspondent/Aggregator Assignment Center.
- Performed impact analysis on upstream and downstream systems for attribute changes in the PML APP.
- Analyzed data between the Loan APP and CDW legacy systems with a sample set of loans and provided comparison results on compliance with data completeness and data accuracy dimensions.
- Assisted with ad-hoc requests from business teams and worked with the data engineering and governance teams.
- Extensively used VersionOne to create user stories and tasks and to estimate task completion times.
- Worked on production data movement controls using the Business Activity Monitor (BAM) tool.
- Performed impact analysis for newly added attributes to the PML APP and notified impacted consumer systems.
- Worked on project deliverables such as the BRS, UAT artifacts, Operational Readiness Requirements (ORR), data flows, Non-Functional Requirements (NFRs), Operational Requirements (OPRs), STTM, physical/logical data models, and the BRD.
- Performed post-production validations and shared the results to support the Go/No-Go decision.
- Validated DDLs and performed smoke tests for PROD deployments.
- Applied predictive analytics, including machine learning and data mining techniques, to forecast loan eligibility and measured forecast accuracy (see the sketch after this section).
- Identified gaps in the Loan APP and PML APP where consumer elements were mapped incorrectly.
- Tracked existing production issues and provided solutions to prevent recurring issues.
- Analyzed and documented production data anomalies to notify consumers in a timely manner.
- Worked with data modelers to create data warehouse objects, applying relational database concepts such as referential integrity to ensure data accuracy and consistency.
- Performed E2E UAT with production-like data, coordinating with PML APP source and consumer team members to execute test cases and validate data points.
- Created test cases and test plans and executed them in HP ALM for UAT/E2E scenarios.
- Created UAT Testing artifacts for various releases; Test Strategy & Plan, Test Cases, Test Results, Test Summary Report, Defect Log and Requirements Traceability Matrix (RTM).
- Updated source-to-target mapping (STTM) documents with legacy systems' constituent attributes, transformation logic, and enumeration values to implement in the APPs.
- Reviewed SIT scenarios and test cases to ensure all functional requirements were covered.
Environment: IBM DB2, Hadoop, MongoDB, Python, Informatica, VersionOne, AWS Data Lake, Rapid SQL, DOORS, BAM, SharePoint, HP ALM, ER Studio, MS Visio, MicroStrategy, and XML Spy.
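Illustrative sketch of forecasting loan eligibility, using a scikit-learn logistic regression as a stand-in model; the file and feature names (fico, ltv, dti, eligible) are hypothetical, not the actual PML APP attributes:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Hypothetical training extract with an eligible (0/1) outcome column.
    loans = pd.read_csv("loan_history.csv")
    X, y = loans[["fico", "ltv", "dti"]], loans["eligible"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Measure accuracy on the holdout set.
    print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))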
Confidential, Reston, VA
Sr. Data Analyst/UAT Lead
Responsibilities:
- As a Team Lead, responsible for guiding a group of Test Engineers to complete the project deliverables on time.
- Created and reviewed test data to test business functionality, enumerations, and DQ rules.
- Created data profiling SQL scripts to identify mock-up data gaps for enumerations, transformations, and derivations.
- Created business process flow charts as part of sprint deliverables.
- Generated test results using the DTF tool by providing source and target test case details.
- Used Cucumber to automate Autosys data-load jobs and acceptance test case execution.
- Created Cucumber step definition and feature files in Ruby using RubyMine.
- Reconciled source-to-target data movement and CEHL (Common Error Handling Language) reports.
- Created an "AutoGenTestcases" tool to generate vending test cases as well as data-load count validations across the Staging, Error, Exception, and Target layers (see the sketch after this section).
- Conducted daily check-in status meetings with the team and provided UAT status to management, covering accomplishments, in-progress tasks, and past-due tasks.
- Used a test case auto-compare tool to compare expected with actual results and to load test results into QC.
- Worked on controls using SQL scripts to verify process anomalies before publishing data to consumers.
- Executed Autosys jobs and investigated root causes of job failures in the UNIX environment.
- Provided reconciliation, frequency, and metric counts to the business teams using Tableau.
- Validated Data Mart reports in the NETEZZA environment.
- Created change tickets and incident tickets to copy production archive files, including scrambled NPI data.
- Provided justification/resolution for BizApps and CEHL reports.
- Reviewed the issue tracker and opened iCART defects.
- Coordinated with development and business teams to resolve defects on a daily basis.
- Helped create the Application Logical Data Model (ALDM) and Enterprise Logical Data Model (ELDM).
- Performed data profiling on source systems' data to understand patterns and analyze the data.
- Worked on the source-to-target mapping document for consumer data glossary elements.
- Analyzed legacy system data in SQL Server to create Data Quality (DQ) rules for pre- and post-validations, executed at-rest or in-line.
- Worked on Metadata requirements to maintain the Logical/Physical Data modeling standards and metadata file specifications.
- Worked on Acquisition Loan source enumerations document and Stakeholder fact sheets.
Environment: ALM QC 11/Quality Center 10, NETEZZA, Tableau, Informatica 9.5, Oracle 9i, SQL Server 2014, Oracle SOA Suite 11g, DTF tool, Cucumber, TIBCO, DOORS, DataFlux, Embarcadero ER Studio 9.5.0, Toad for Oracle 9.5, SharePoint, MS Visio, XML Spy, MS Access 2007, Rational ClearQuest, PuTTY, FileZilla, and TIBCO GEMS.
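Illustrative sketch of the data-load count validation across layers, here in Python with sqlite3 standing in for the actual UAT databases; the table names are hypothetical:

    import sqlite3

    conn = sqlite3.connect("uat.db")  # stand-in for the real UAT connection

    def count(table):
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    staging = count("stg_loans")
    errors = count("err_loans")
    exceptions = count("exc_loans")
    target = count("tgt_loans")

    # Control: every staged record must land in exactly one downstream layer.
    accounted = errors + exceptions + target
    if staging != accounted:
        print(f"RECON FAILURE: {staging} staged vs {accounted} accounted for")
    else:
        print("Counts reconcile across Staging/Error/Exception/Target layers.")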
Confidential, McLean, VA
Sr. Tester/ Sr. Data Analyst/Business Analyst
Responsibilities:
- Responsible for functional testing, integration testing, defect coordination, test status, user acceptance testing, and signoff with respect to the functionality, as Test Lead.
- Analyzed the data to identify and interpret patterns and trends, assess data quality and eliminate irrelevant data using DataFlux and PL/SQL Scripts.
- Created Logical and Physical Data models, Data Glossary and BRS documents.
- Led the UAT team through the SQL Server version migration, Complaints Analytics, HAMP Tier 2, and Data Integration projects.
- Responsible for estimating testing effort, preparing test schedules, performing risk analysis, and identifying and allocating resources.
- Responsible for prioritizing and assigning testing tasks and monitoring testing stages/levels.
- Facilitated defect review meetings, management meetings, and Go/No-Go decision meetings.
- Led cross-functional QA efforts in intersystem testing and conducted user acceptance testing.
- Analyzed Confidential-provided HAMP data to create requirements and provide risk assessments to the business team.
- Validated source data XML and XSD files and reported anomalies to Confidential.
- Worked on Scope, Business requirements and Technical requirements for the implementation of new projects.
- Created Test Cases, Test Scripts, Test Results and Dash board Reports using Quality Center.
- Created and/or reviewed Testing artifacts; Test Strategy, Requirements Traceability Matrix (RTM), Defect Log, Test Results, and Test Summary Report.
- Reviewed test data to accurately simulate real-time user scenarios.
- Developed an EUC application to process HAMP complaints data and provide statistics reports to the business team.
- Executed Autosys jobs in UAT, verified error logs, and provided status to the team.
- Created SQL scripts for data accuracy validations and loan sampling tranche logic (see the sketch after this section).
- Validated post-production data and provided status to the business and production teams.
- Evaluated MicroStrategy report data to quickly identify problems, issues, and gaps.
- Improved performance by optimizing SQL scripts to remove bottlenecks and eliminated duplicate data by normalizing database tables.
- Helped develop servicer loan sampling using ASP.NET.
- Worked closely with the business to fix data anomalies and provided ad-hoc reports using PL/SQL scripts.
- Published best practices to the enterprise and implemented industry best practices within the team.
- Coordinated with external teams to set up test data and track defects on their end.
- Conducted test status meetings/walkthroughs, resolved issues, and escalated when required.
Environment: MS SQL Server 2008/2005, DataFlux, ER Studio, ASP.NET, ETL - SSIS, SharePoint, MS Visio, XML Spy, ALM QC 11, MicroStrategy, Rational ClearCase, Rational ClearQuest, IBM Rational DOORS and PuTTY.
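Illustrative sketch of loan sampling tranche logic, done here in Python with pandas rather than SQL; the tranche cut-offs, sampling rate, and column names are hypothetical:

    import pandas as pd

    # Hypothetical servicer loan population.
    loans = pd.read_csv("hamp_loans.csv")  # loan_id, servicer, upb, ...

    # Assign tranches by unpaid principal balance.
    loans["tranche"] = pd.cut(loans["upb"],
                              bins=[0, 100_000, 300_000, float("inf")],
                              labels=["small", "medium", "jumbo"])

    # Draw a fixed-rate random sample within each tranche.
    sample = (loans.groupby("tranche", group_keys=False, observed=True)
                   .apply(lambda g: g.sample(frac=0.05, random_state=1)))
    sample.to_csv("audit_sample.csv", index=False)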
Confidential, Herndon, VA
Sr. Tester/ Data Analyst
Responsibilities:
- Validated Data Quality reports, Data mapping and Auto publish reports using PL/SQL queries.
- Validated daily jobs for Confidential, Ginnie Mae, and non-Confidential REMICs.
- Created PL/SQL scripts to validate Extract process, Auto Recon process, Transformation process and Load process.
- Performed Data Profiling on raw sources using technical tools such as TOAD and MS Access.
- Provided loans/securities production data to the Confidential & Touche (D&T) business audit team to perform certain computerized procedures in SAS data sets or delimited text files.
- Validated iUAT against production data using ad-hoc queries.
Environment: ETL - AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.
Confidential, Herndon, VA
Sr. QA Tester/ Business Analyst
Responsibilities:
- Worked with Business team to understand the detailed impact and business changes needed to support new policies.
- Created business requirements and developed test scenarios, test plans, test strategies, test cases, data mappings, data models, UAT mock data, and expected results.
- Generated test results using an automation tool developed with VBA code in MS Access and compared actual and expected results.
- Executed UAT and iUAT Autosys jobs using PuTTY, based on instructions provided by the development team.
- Provided support to production applications by tracking production issues and troubleshooting them to sustain application in production.
- Updated business requirements based on new enhancements.
- Implemented standards for requirements and scope documentation and ensured quality deliverables.
- Performed data comparison/analysis between current production ADW data warehouse and legacy production Pool Prepay SAS data sets.
- Created SLS (Subledger) expected results using VBA code in MS Access.
- Extracted data from FDW and performed ad-hoc queries by using TOAD.
- Identified data issues in the ADW and PDR databases before they reached production.
Environment: AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.
Confidential, Washington, DC
Sr. Data Management Analyst
Responsibilities:
- Worked with clients to gather requirements and create use cases, STTM documents, and a logical data model.
- Generated state and federal statistics reports (Blackman-Jones, Special Conditions, Child Count, etc.) that measure the timeliness of IEPs and eligibilities, using SQL Server Reporting Services (SSRS).
- Created SSIS/DTS packages for data migration between EasyIEP and the DCPS/PCS secure FTP servers (see the sketch after this section).
- Produced reporting procedures that allowed clients to address data quality issues before loading data into the STAGING database.
- Generated ad-hoc reports based on management requirements using BusinessObjects XI (BOXI).
- Automated daily storage of specific reports on SharePoint using SSRS subscription schedules.
- Scheduled the SSIS packages to process inbound and outbound data using SQL Server Agent.
- Created automated DQ scripts to find anomalies in the PROD database, triggered after data loads complete.
- Created process flow diagrams to get a better understanding of the processes involved during data migration.
Environment: ASP.NET, MS SQL Server 2005/2003, MS Visual Studio 2005, SharePoint, WS FTP Pro, Core FTP, SSIS, DTS, SSRS, Business Objects XI, UNIX, MS Access 2007, Visual Source Safe, Visio, Rational Suite and MS Project 4.1.
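The production transfer above was implemented in SSIS/DTS; an equivalent SFTP step sketched in Python with paramiko might look like the following, with the host, credentials, and paths as hypothetical placeholders:

    import paramiko

    HOST, PORT = "sftp.example.org", 22  # hypothetical secure FTP endpoint

    transport = paramiko.Transport((HOST, PORT))
    transport.connect(username="easyiep_svc", password="***")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        # Push the outbound extract to the remote inbound folder.
        sftp.put("outbound/student_extract.csv", "/inbound/student_extract.csv")
    finally:
        sftp.close()
        transport.close()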
Confidential, Herndon, VA
Sr. QA Analyst/ Sr. Data Analyst
Responsibilities:
- Responsible for developing and maintaining appropriate controls to ensure high-quality data is provided.
- Created Expected Results (ERDB) using PL/SQL scripts (stored procedures/packages) to validate the GFAS development Data Transformation and Pre-Processor systems.
- Extracted data from the ADW database and performed ad-hoc queries using PL/SQL scripts.
- Created the ER database using VBA code in MS Access to validate the GFAS Processor engine.
- Identified data issues in the ADW and PDR databases before they reached production.
- Developed ad-hoc reports per business team, operations team, and manager requests.
- Collected and documented policies, calculation methods, business processes, and business rules.
Environment: ETL - AbInitio, Oracle 9i, MS Access 2003, Business Objects XI, Toad for Oracle 8.6, SQL Plus, DbVisualizer 4.3.1, Rational ClearCase, Rational RequisitePro, MS Excel, MS Project 4.1, and Windows XP.