Sr. Business Data Analyst Resume

McLean, VA

SUMMARY:

  • 18 years of professional experience in Information Technology as a Sr. Business Data Analyst and Sr. QA Analyst in the Financial, Educational, Banking, and Commercial domains, including 13 years in the housing finance industry.
  • Extensive exposure to the entire SDLC: feasibility study, requirements, analysis, design, development, implementation, maintenance, production support, and testing of client/server and web-based applications.
  • Experience in data analysis, data profiling, and data cleansing using PL/SQL scripts, IDQ, DataFlux, MS Excel, and MS Access.
  • Developed and maintained EUC applications to automate business functionality using MS Access and Excel VBA code.
  • Proficient in all phases of Software Testing Life Cycle, Software Development Life Cycle including Agile and Waterfall methodologies.
  • Worked on Source to Target Mapping documents and created Logical/Physical Data Models using ER Studio.
  • Extensive Experience in Functionality Testing, Backend Testing, Regression Testing, Performance Testing, Stress Testing, GUI Testing, System Integration Testing, System Testing and User Acceptance Testing.
  • Created control validation scripts in PL/SQL to reconcile against source production systems and surface potential anomalies (see the sketch after this list).
  • Created Data Correction Utilities (DCU) and Emergency Fix Scripts to correct the production data.
  • Extensive experience in writing project Scope documents, Functional/Technical/Transitional/Operational Requirements, Report requirements, Test Strategy, Test Scenarios, Test Plans, Test Cases, RTM, Defect Log, Test Summary Report, and UAT/System Test data preparation.
  • Expertise in executing Autosys jobs, verifying error logs, and troubleshooting issues.
  • Validated BizApps, CEHL, Data Quality and Auto publish reports using PL/SQL queries.
  • Extensively used TOAD, Rapid SQL, SQL*Plus and DbVisualizer for data analysis and reporting purposes.
  • Analyzed production source data, tracking prior issues and troubleshooting them before data was sent to consumers.
  • Experience creating SSIS ETL packages to extract, transform, and load source/legacy data and to vend consumer data to an SFTP server.
  • Experience creating SSRS reports and publishing them to SharePoint for end users.
  • Experience in developing applications using ASP.NET, VB.NET, ASP, VB, Visual C++, XML, COM, Business Objects XI R2 and Crystal Reports XI.
  • Created expected results database for UAT and SIT validation using PL/SQL scripts.
  • Expert in PL/SQL, Stored Procedures, Triggers, Cursors, and BO Web Intelligence reports.
  • Strong presentation skills and the ability to communicate at different levels within the organization with exceptional problem solving and analytical skills.
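
For illustration, a minimal sketch of the kind of PL/SQL control validation referenced above; the loan_source, loan_target and anomaly_log tables and the daily count check are hypothetical placeholders, not actual production objects.

    -- Hypothetical control: compare source vs. target counts for the current business date
    -- and log any mismatch as a potential anomaly. Table/column names are illustrative only.
    DECLARE
      v_src_count NUMBER;
      v_tgt_count NUMBER;
    BEGIN
      SELECT COUNT(*) INTO v_src_count FROM loan_source WHERE business_date = TRUNC(SYSDATE);
      SELECT COUNT(*) INTO v_tgt_count FROM loan_target WHERE business_date = TRUNC(SYSDATE);

      IF v_src_count <> v_tgt_count THEN
        INSERT INTO anomaly_log (run_date, check_name, src_count, tgt_count)
        VALUES (SYSDATE, 'DAILY_LOAN_COUNT', v_src_count, v_tgt_count);
        COMMIT;
      END IF;
    END;
    /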

TECHNICAL SKILLS:

Testing Tools: Cucumber, ALM QC11, Quality Center 10.0, Test Director, Win Runner, QTP and Load Runner

Languages: ASP.NET, VB.NET, C#, C++, COM/DCOM, VB, ASP, PL/SQL, XML, DOM, XSL, VBA and UML

Databases: SQL Server 2014, Oracle 11.2.0.1, MongoDB, NETEZZA, Teradata, IBM DB2, Sybase and MS-Access

Applications: Hadoop, VersionOne, ER Studio, AbInitio, SharePoint, MS Visio, IBM Rational DOORS, Informatica, DataFlux, SSIS, SSRS, Oracle SOA Suite 11g, MS Visual Studio 2005, MicroStrategy, Business Objects XI, Crystal Reports XI, TOAD, SQL*Plus, RapidSQL, DbVisualizer, ClearQuest and RequisitePro

Other Tools: PuTTY, Core FTP and PSFTP

PROFESSIONAL EXPERIENCE:

Confidential, McLean, VA

Sr. Business Data Analyst

Responsibilities:

  • Developed end-to-end (E2E) process flowcharts for the Primary Mortgage Loan APP and the Correspondent/Aggregator Assignment Center.
  • Performed E2E UAT with production-like data, coordinating with PML APP source and consumer team members to execute test cases and validate data points.
  • Performed impact analysis on upstream and downstream systems for attributes changes in PML APP.
  • Worked in an Agile (Iterative/Scrum) environment as a core team member; coordinated with other team members and participated in sprint planning, daily stand-ups, demos, retrospectives, and backlog refinement across iterations/sprints.
  • Extensively used VersionOne to create user stories, tasks and estimated time to complete the tasks.
  • Worked on legacy data analysis to implement a Data Lake system as a single store of all enterprise data.
  • Worked with data models to create data warehouse objects, applying relational database concepts such as referential integrity for accuracy and consistency of data.
  • Used Hadoop framework to manage data processing and storage of documents.
  • Created test cases and test plans and executed them in HP ALM for UAT/E2E scenarios.
  • Created UAT Testing artifacts for various releases; Test Strategy & Plan, Test Cases, Test Results, Test Summary Report, Defect Log and Requirements Traceability Matrix (RTM).
  • Worked on Production Data Movement Controls using Business Activity Monitor (BAM) tool.
  • Performed impact analysis for newly added attributes in PML APP and notified impacted consumer systems.
  • Worked on project deliverables such as the BRS, UAT artifacts, Operational Readiness Requirements (ORR), Data Flows, Non-Functional Requirements (NFRs), Operational Requirements (OPRs), STTM, Physical/Logical Data Models and the BRD.
  • Performed post production validations and shared the results to make Go/No-Go decision.
  • Supported production deployments for validating DDLs and Smoke Test scenarios.
  • Analyzed source data in MongoDB to implement Target system data models.
  • Tracked continuous production issues and provided solutions to prevent recurring issues.
  • Analyzed data anomalies behind production issues and notified consumers in a timely manner.
  • Analyzed data between the Loan APP and the CDW legacy systems using a sample set of loans and reported whether it complied with the data completeness and accuracy dimensions (see the sketch after this list).
  • Updated Source To Target Mapping (STTM) documents with legacy systems' constituent attributes, transformation logic, and enumeration values for implementation in the APPs.
  • Reviewed the SIT scenarios and Test cases to make sure all Functional Requirements were covered.
  • Responsible for identifying gaps in the Loan APPs and PML APP when consumer-requested data elements did not exist.
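
A hedged illustration of the Loan APP vs. CDW comparison described above, checking one completeness and one accuracy dimension; the cdw_loan and loan_app_stg tables and the loan_nbr/note_rate columns are assumed names for the sketch.

    -- Completeness: loans present in CDW but missing in Loan APP.
    -- Accuracy: loans where a sampled attribute (note rate) does not match.
    -- All table and column names are illustrative placeholders.
    SELECT 'MISSING_IN_LOAN_APP' AS check_name, COUNT(*) AS loan_cnt
    FROM   cdw_loan c
    WHERE  NOT EXISTS (SELECT 1 FROM loan_app_stg a WHERE a.loan_nbr = c.loan_nbr)
    UNION ALL
    SELECT 'NOTE_RATE_MISMATCH', COUNT(*)
    FROM   cdw_loan c
    JOIN   loan_app_stg a ON a.loan_nbr = c.loan_nbr
    WHERE  NVL(a.note_rate, -1) <> NVL(c.note_rate, -1);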

Environment: IBM DB2, Hadoop, MongoDB, Informatica, VersionOne, Azure Data Lake, Rapid SQL, DOORS, BAM, SharePoint, HP ALM, ER Studio, MS Visio, Microstrategy and XML Spy.

Confidential, Reston, VA

UAT Lead/ Sr. Data Analyst

Responsibilities:

  • As a Team Lead, responsible for guiding a group of Test Engineers to complete the project deliverables on time.
  • Involved in and reviewed test data creation to test business functionality, enumerations, and DQ rules.
  • Created data profiling SQL scripts to identify mock-up data gaps for enumerations, transformations, and derivations (see the sketch after this list).
  • Responsible for explaining the business process and assigning tasks to the UAT team.
  • Responsible for reconciling source-to-target data movement and CEHL (Common Error Handling language) reports.
  • Used the Cucumber tool to automate Autosys data load jobs and acceptance test case execution.
  • Created Cucumber step definition and feature files in Ruby using the RubyMine IDE.
  • Practiced Scrum (Agile) methodology; actively participated and provided timely input in sprint planning, review, and retrospective meetings as well as the daily scrum stand-up meeting.
  • Created an “AutoGenTestcases” tool to generate vending test cases as well as data load count validations across the Staging, Error, Exceptions, and Target layers.
  • Conducted daily check-in status meetings with the team and provided UAT status to management covering accomplishments, in-progress tasks, and past-due tasks.
  • Reviewed Test Strategy, Test cases & Plan and RTM documents.
  • Utilized a test case Auto Compare tool to compare expected with actual results and to load test results to QC.
  • Worked on controls using SQL scripts to verify process anomalies before publishing data to consumers.
  • Responsible for executing Autosys jobs and investigating root causes of job failures in the UNIX environment.
  • Provided Reconciliation, Frequency and Metric counts for active production data to consumer business teams.
  • Validated Data Mart reports on NETEZZA environment.
  • Created change tickets and incident tickets to copy production archive files, including scrambled NPI data.
  • Responsible for providing justification/resolution for BizApps and CEHL reports.
  • Responsible for reviewing the Issue Tracker and opening iCART defects.
  • Responsible for coordinating with development and business teams to resolve defects on a daily basis.
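
A minimal data-profiling sketch along the lines of the scripts mentioned above, assuming a hypothetical staging table stg_loan with an enumeration column loan_purpose_type; the real profiling scripts cover many more attributes.

    -- Profile an enumeration column: distinct values, frequencies and share of total,
    -- to spot mock-up data that misses expected enumeration values or leaves them null.
    SELECT loan_purpose_type,
           COUNT(*)                                          AS row_cnt,
           ROUND(100 * COUNT(*) / SUM(COUNT(*)) OVER (), 2)  AS pct_of_total
    FROM   stg_loan
    GROUP  BY loan_purpose_type
    ORDER  BY row_cnt DESC;
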
Confidential, Reston, VA

Sr. Analyst

Responsibilities:

  • Involved in creating the Application Logical Data Model (ALDM) and Enterprise Logical Data Model (ELDM).
  • Performed Data Profiling on source systems data to understand the pattern and to analyze the data.
  • Worked on Source to Target mapping document for consumer data glossary elements.
  • Analyzed legacy system data in SQL Server to create DQ (Data Quality) rules for pre- and post-validations, executed either at rest or in line (see the sketch after this list).
  • Worked on Metadata requirements to maintain the Logical/Physical Data modeling standards and metadata file specifications.
  • Worked on Acquisition Loan source enumerations document and Stakeholder fact sheets.
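
An illustrative in-line DQ rule check of the kind referenced above; the acq_loan table, the rule identifiers and the UPB range are assumptions made for the sketch.

    -- Flag rows that violate a mandatory-field rule or an allowed-range rule
    -- before the data moves downstream. Table, columns and thresholds are hypothetical.
    SELECT loan_id,
           CASE
             WHEN borrower_ssn IS NULL             THEN 'DQ_RULE_001: missing borrower SSN'
             WHEN upb_amt < 0 OR upb_amt > 5000000 THEN 'DQ_RULE_002: UPB out of range'
           END AS dq_failure
    FROM   acq_loan
    WHERE  borrower_ssn IS NULL
       OR  upb_amt < 0 OR upb_amt > 5000000;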

Environment: ALM QC 11/Quality Center 10, NETEZZA, Informatica 9.5, Oracle 9i, SQL Server 2014, Oracle SOA Suite 11g, TIBCO, DOORS, DataFlux, Embarcadero ER Studio 9.5.0, Toad for Oracle 9.5, SharePoint, MS Visio, XML Spy, MS Access 2007, Rational ClearQuest, PuTTY, FileZilla and TIBCO GEMS.

Confidential, Vienna, VA

Sr. Data Warehouse Analyst

Responsibilities:

  • Created SQL scripts to compare pre- and post-migration objects and table data counts for the production server migration (see the sketch after this list).
  • Performed Data Profiling on the source systems and provided data characteristics, patterns and allowable values to create the Source to Target Mapping (STTM) document.
  • Reconciled user-view reports with source data using SQL scripts.
  • Involved in setting up the SIT environment, HP Quality Center, and access roles.
  • Created the SIT process & approach document and System Test scenarios.
  • Created Test Cases and Test Scripts to validate the data from Source to Landing, Landing to LoadReady and LoadReady to Base layers based on the STTM document.
  • Created HP Quality Center dashboard reports for defect life cycle management.
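
A sketch of the pre/post migration count comparison from the first bullet, assuming the counts were captured into snapshot tables pre_mig_counts and post_mig_counts; actual object names and capture methods differ by project.

    -- Report only the tables whose post-migration row count drifted from the
    -- pre-migration snapshot. Both snapshot tables are hypothetical.
    SELECT p.table_name,
           p.row_cnt             AS pre_cnt,
           t.row_cnt             AS post_cnt,
           t.row_cnt - p.row_cnt AS diff
    FROM   pre_mig_counts  p
    JOIN   post_mig_counts t ON t.table_name = p.table_name
    WHERE  t.row_cnt <> p.row_cnt;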

Environment: Informatica 9.5.1, Teradata SQL Assistant, Informatica Power Center DVO, MS SQL Server 2012, HP Quality Center 10, SharePoint, MS Visio, XML Spy and Beyond Compare 3.

Confidential, McLean, VA

Sr. Tester/ Sr. Data Analyst/Business Analyst

Responsibilities:

  • Responsible for functional testing, integration testing, defect coordination, test status, User Acceptance Testing and signoff with respect to the functionality as a Test Lead.
  • Analyzed the data to identify and interpret patterns and trends, assess data quality and eliminate irrelevant data using DataFlux and PL/SQL Scripts.
  • Created Logical and Physical Data models, Data Glossary and BRS documents.
  • Led the UAT team through the SQL Server version migration, Complaints Analytics, HAMP Tier 2 and Data Integration projects.
  • Responsible for estimating testing effort, preparing test schedules, risk analysis, identifying and allocating the resources.
  • Responsible for prioritizing and assigning the testing tasks & Monitoring testing stages/levels.
  • Facilitated defect review meetings, management meetings and Go/No-Go decision meetings.
  • Led cross-functional QA efforts in inter-system testing and conducted User Acceptance Testing.
  • Analyzed Confidential-provided HAMP data to create requirements and provide risk assessments to the business team.
  • Validated source data XML and XSD files and provided anomalies to Confidential.
  • Created BRS and Data models based on source data files and requirements.
  • Worked on Scope, Business requirements and Technical requirements for the implementation of new projects.
  • Created Test Cases, Test Scripts, Test Results and Dash board Reports using Quality Center.
  • Created and/or reviewed Testing artifacts; Test Strategy, Requirements Traceability Matrix (RTM), Defect Log, Test Results, and Test Summary Report.
  • Responsible for reviewing test data to accurately simulate real-world user scenarios.
  • Created DCUs and Emergency Fix scripts to update production data.
  • Developed an EUC application to process HAMP Complaints data and provide statistics reports to the business team.
  • Responsible for executing Autosys jobs in UAT, verifying the error logs and providing status to the team.
  • Created SQL scripts for data accuracy validations and loan sampling tranche logic (see the sketch after this list).
  • Responsible for validating post-production data and providing status to the Business and Production teams.
  • Validated MicroStrategy reports.
  • Responsible for improving performance by optimizing SQL scripts to remove bottleneck processes and eliminating duplicate data by normalizing the database tables.
  • Involved in Developing Servicer Loan Sampling using ASP.NET.
  • Worked closely with Business to fix data anomalies and provide them Ad-hoc Reports using PL/SQL scripts.
  • Published best practices to the enterprise and implemented industry best practices within the team.
  • Coordinated with the database team to set up the right test data in the right environment for each release.
  • Coordinated with external teams to set up test data and track defects at their end.
  • Conducted test status meetings/walkthroughs, resolving issues and escalating when required.
  • Responsible for explaining the application to new testers, developers and senior management.
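
A hedged example of loan-sampling tranche logic of the sort referenced above, written as T-SQL for the SQL Server environment listed below; the hamp_loan table, four tranches and 25-loan sample size are illustrative assumptions.

    -- Assign each loan to one of four sampling tranches by unpaid principal balance,
    -- then draw a fixed-size random sample from each tranche.
    ;WITH ranked AS (
      SELECT loan_id,
             servicer_id,
             upb_amt,
             NTILE(4) OVER (ORDER BY upb_amt) AS tranche
      FROM   dbo.hamp_loan
    ),
    sampled AS (
      SELECT r.*,
             ROW_NUMBER() OVER (PARTITION BY r.tranche ORDER BY NEWID()) AS rn
      FROM   ranked r
    )
    SELECT loan_id, servicer_id, upb_amt, tranche
    FROM   sampled
    WHERE  rn <= 25;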

Environment: MS SQL Server 2008/2005, DataFlux, ER Studio, ASP.NET, ETL - SSIS, SharePoint, MS Visio, XML Spy, MS Access 2007, ALM QC 11/QC 10, Rational ClearCase, Rational ClearQuest, IBM Rational DOORS and PuTTY.

Confidential, Herndon, VA

Sr. Tester/ Data Analyst

Responsibilities:

  • Validated Data Quality reports, Data mapping and Auto publish reports using PL/SQL queries.
  • Validated daily jobs for Confidential, Ginnie Mae and non-Confidential REMICs.
  • Created PL/SQL scripts to validate the Extract process, Auto Recon process, Transformation process and Load process (see the sketch after this list).
  • Involved in updating production data using data correction requests.
  • Performed Data Profiling on raw sources using technical tools such as TOAD and MS Access.
  • Provided loans/securities production data to the Deloitte & Touche (D&T) business audit team to perform computerized procedures against SAS data sets or delimited text files.
  • Validated iUAT with production data using ad-hoc queries.
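
A sketch of the extract/transform/load reconciliation checks described above; the extract_stage, transform_stage and remic_target tables are placeholders, and the actual PL/SQL scripts validate far more than record counts.

    -- Reconcile record counts across the extract, transform and load layers for one cycle date.
    -- :cycle_dt is a bind variable; table names are illustrative only.
    SELECT (SELECT COUNT(*) FROM extract_stage   WHERE cycle_date = :cycle_dt) AS extract_cnt,
           (SELECT COUNT(*) FROM transform_stage WHERE cycle_date = :cycle_dt) AS transform_cnt,
           (SELECT COUNT(*) FROM remic_target    WHERE cycle_date = :cycle_dt) AS load_cnt
    FROM   dual;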

Environment: ETL - AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.

Confidential, Herndon, VA

Sr. QA Tester/ Business Analyst

Responsibilities:

  • Worked with Business team to understand the detailed impact and business changes needed to support new policies.
  • Created Business requirements and developed Test Scenarios, Test Plan, Test Strategies, Test Cases, Data mapping, Data modeling, UAT mock data and created expected results.
  • Supported the business UAT team by executing detailed test cases and validating and documenting the test results.
  • Generated test results using an automation tool developed with VBA code in MS Access and compared actual and expected results.
  • Executed UAT and iUAT Autosys jobs using PuTTY based on instructions provided by the development team.
  • Provided support to production applications by tracking production issues and troubleshooting them to sustain application in production.
  • Validated PLAE, LAR and Back-Testing BOXI reports using PL/SQL scripts.
  • Responsible for updating business requirements based on new enhancements.
  • Implemented standards for requirements and scope documentation and ensured quality deliverables.
  • Performed data comparison/analysis between the current production ADW data warehouse and the legacy production Pool Prepay SAS data sets (see the sketch after this list).
  • Created SLS (Subledger) expected results using VBA code in MS Access.
  • Extracted data from FDW and performed ad-hoc queries by using TOAD.
  • Created the ER database using VBA code in MS Access to validate the GFAS Processor engine.
  • Responsible for identifying data issues in the ADW and PDR databases before they went to production.
  • Worked closely with the development team to ensure the software application fit within the architecture and had the required behaviors.
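
A minimal expected-vs-actual comparison in the spirit of the validations above, assuming both result sets are loaded into identically structured tables exp_results and act_results; the names and structure are assumptions for the sketch.

    -- Rows that were expected but not produced, and rows produced but not expected.
    -- Both tables are hypothetical and assumed to share the same column list.
    SELECT 'MISSING_FROM_ACTUAL' AS diff_type, e.*
    FROM  (SELECT * FROM exp_results
           MINUS
           SELECT * FROM act_results) e
    UNION ALL
    SELECT 'UNEXPECTED_IN_ACTUAL', a.*
    FROM  (SELECT * FROM act_results
           MINUS
           SELECT * FROM exp_results) a;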

Environment: AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.

Confidential, Washington, DC

Sr. Data Management Analyst/ Team Lead

Responsibilities:

  • Involved in working with clients in gathering requirements, creating Use cases and a logical data model.
  • Created process flow diagrams to get a better understanding of the processes involved during data migration.
  • Reviewed methodology and testing procedures for loading data into the warehouse to ensure data quality.
  • Analyzed user requirements, attended Change Request meetings to document changes and implemented procedures to test changes.
  • Created SSIS/DTS Packages for data migration between the EasyIEP and DCPS/PCS Secure FTP servers.
  • Performed Data Analysis and Data validation by writing SQL queries.
  • Generated the state and federal statistics reports that measure the timeliness of IEPs and eligibilities (Blackman-Jones, Special Conditions, Child Count, etc.) using SQL Server Reporting Services (SSRS); see the sketch after this list.
  • Automated daily delivery of all reports to SharePoint using SSRS subscription schedules.
  • Scheduled the SSIS packages using SQL Server Agent.
  • Generated Ad hoc Reports based on Management Requirements using BusinessObjects XI (BOXI).
  • Produced reporting procedures that allowed clients to address data quality issues before loading data into the STAGING database.
  • Worked on transformation of data from MS-Access to SQL Server.
  • Generated Out-of-Sync reports using SSRS.
  • Performed data analysis such as pulling and scrubbing data based on end-user information prior to migrations.
  • Performed backup/restore, regular maintenance and troubleshooting of data.
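
An illustrative query of the type that would sit behind a timeliness report like those described above; the dbo.iep table, its columns and the 30-day window are assumptions, not the actual EasyIEP schema or compliance rule.

    -- Percentage of IEPs completed within an assumed 30-day window, grouped by school.
    SELECT school_id,
           COUNT(*) AS total_ieps,
           SUM(CASE WHEN DATEDIFF(day, referral_date, iep_completed_date) <= 30
                    THEN 1 ELSE 0 END) AS timely_ieps,
           CAST(100.0 * SUM(CASE WHEN DATEDIFF(day, referral_date, iep_completed_date) <= 30
                                 THEN 1 ELSE 0 END) / COUNT(*) AS DECIMAL(5, 2)) AS pct_timely
    FROM   dbo.iep
    GROUP  BY school_id;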

Environment: ASP.NET, MS SQL Server 2005/2003, MS Visual Studio 2005, SharePoint, WS FTP Pro, Core FTP, SSIS, DTS, SSRS, Business Objects XI, UNIX, MS Access 2007, Visual Source Safe, Visio, Rational Suite and MS Project 4.1.

Confidential, Herndon, VA

Sr. QA Analyst/ Sr. Data Analyst

Responsibilities:

  • Analyzed various business requirements.
  • Responsible for developing and maintaining appropriate controls to ensure high-quality data is provided.
  • Created the Expected Results database (ERDB) using PL/SQL scripts (stored procedures/packages) to validate the GFAS development Data Transformation and Pre-Processor systems (see the sketch after this list).
  • Extracted data from ADW database and performed ad-hoc queries using PL/SQL Scripts.
  • Created the ER database using VBA code in MS Access to validate the GFAS Processor engine.
  • Analyzed, designed and coded to build a new schema for staging the tables and views.
  • Responsible for comparing UAT and Development expected results.
  • Responsible for identifying data issues in the ADW and PDR databases before they went to production.
  • Developed ad-hoc reports per business team, operations team and manager requests.
  • Performed query optimization using Query Analyzer and index tuning.
  • Worked with the D&T, E&Y and EUC teams to help them understand various data sources.
  • Collected and documented policies, calculation methods, business processes as well as business rules.
  • Accountable for managing traceability from scope specifications within ReqPro to track dependencies and changes, which helped control the numerous artifacts produced by the teams.
  • Developed the macros to load the UAT data into Test Database.
  • Involved in presenting technical reports, project overviews and training using PowerPoint.
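
A skeletal PL/SQL procedure in the spirit of the ERDB population scripts mentioned above; the gfas_input and erdb_expected tables and the simplified interest rule are placeholders only.

    -- Populate the expected-results table by applying a documented business rule
    -- to the same inputs the engine receives, so actuals can be compared later.
    CREATE OR REPLACE PROCEDURE load_erdb_expected (p_cycle_date IN DATE) AS
    BEGIN
      DELETE FROM erdb_expected WHERE cycle_date = p_cycle_date;

      INSERT INTO erdb_expected (cycle_date, loan_id, expected_interest_amt)
      SELECT p_cycle_date,
             loan_id,
             ROUND(upb_amt * (note_rate / 100) / 12, 2)  -- simplified monthly interest rule
      FROM   gfas_input
      WHERE  cycle_date = p_cycle_date;

      COMMIT;
    END load_erdb_expected;
    /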

Environment: ETL - AbInitio, Oracle 9i, MS Access 2003, Business Objects XI, Toad for Oracle 8.6, TOAD, SQL Plus, DbVisualizer 4.3.1, Rational ClearCase, Rational RequisitePro, MS Excel, MS Project 4.1, Windows XP.

Confidential, Towson, MD

Sr. IT Analyst

Responsibilities:

  • Involved in UML, designed Use Case Diagram, Class Diagram, Sequence Diagram, and State Diagram for project documentation using Select Enterprise.
  • Developed Use Cases using Rational Rose for the Web and the Windows Application.
  • Involved in the design and development of the logical database design and analysis of the application.
  • Performed in-depth analysis of data and prepared weekly, biweekly and monthly reports using MS Access and PL/SQL.
  • Spooled data into flat files and sent them to the database through a secure server.
  • Involved in writing UNIX shell scripts for loading data using SQL*Loader and PL/SQL scripts for validating data.
  • Involved in analysis, design, coding and development of building a new schema (staging tables, views, SQL files).
  • Designed and created datasets from various sources like Excel, flat files and XML files.
  • Involved in the Design and Development of the Database for various clients.
  • Developed Stored Procedures and Triggers using SQL Server for complex business rules (see the sketch after this list).
  • Created Complex reports in Business Objects.
  • Migrated the data to UAT, Production and Training Environments.
  • Designed XML schema files for mapping data into database tables.
  • Extensively involved in production support for various clients like Winston Salem, Milwaukee, Dallas and D.C.
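
A small T-SQL sketch of a business-rule trigger like those referenced above; the dbo.claim table and the approved-limit rule are hypothetical.

    -- Reject inserts or updates that would push a claim amount above its approved limit.
    CREATE TRIGGER trg_claim_amount_check
    ON dbo.claim
    AFTER INSERT, UPDATE
    AS
    BEGIN
      IF EXISTS (SELECT 1
                 FROM   inserted i
                 WHERE  i.claim_amount > i.approved_limit)
      BEGIN
        RAISERROR ('Claim amount exceeds the approved limit.', 16, 1);
        ROLLBACK TRANSACTION;
      END
    END;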

Environment: VB.NET, ASP.NET, C#, XML, XSL, SQL Server 2000, MS-Access 2000, DB2, Crystal Reports 8.0, Citrix N-Fuse Desktop, BizTalk Server 2002, Apache 1.4, Track-It and Heat 6.0 Defect Tracking Tool.

Confidential, Norman, OK

Programmer

Responsibilities:

  • Analyzed business requirements.
  • Developed sequence diagrams and class diagrams using UML and Rational Rose.
  • Developed COM objects using C++ for business logic and data logic.
  • Responsible for report generation using Seagate Crystal Reports.
  • Involved in writing and executing test plans and test cases based on the business requirements.
  • Responsible for black-box and white-box testing of the application.
  • Used XSL to control the formatting and typographical layout of pages with style and efficiency.

Environment: C++, VC++, COM, ATL, PHP, HTML, DHTML, JavaScript, VBScript, XSL, Rational Rose and UML.
