
Business Data Analyst Resume


Charlotte, NC

SUMMARY

  • 5 years of professional work experience as a Business Data Analyst and Tableau Developer. Good understanding of SDLC methodologies such as Agile and Waterfall models.
  • Experience in designing and developing workbooks, reports, and interactive Tableau dashboards using Tableau Desktop versions 7.x/8.x/9.x/10.x and Tableau Server.
  • Proficient in analyzing system requirements, use cases, FRDs, and BRDs to gain an overall understanding of a new application and determine the appropriate level of testing required; experienced in creating Test Plans, Test Cases, Test Scripts, Requirement Traceability Matrices (RTM), and Test Environments, and in analyzing test results against requirements.
  • Well versed in data analysis, with strong hands-on experience in SAS, SQL, and Teradata, and good experience in Test Data Management.

TECHNICAL SKILLS

Tools: ALM, Jira, VersionOne, SAS, Tableau 9.3.x/10.x, Spotfire, Informatica, IBM Optim

Web Technologies: HTML

Frameworks: Hadoop, Hive

Databases: MySQL, Oracle, Teradata

Operating Systems: Windows XP/7/8/10, UNIX, MAC OS

PROFESSIONAL EXPERIENCE

Confidential - Charlotte, NC

Business Data Analyst

Responsibilities:

  • Worked with business stakeholders, data stewards, and system owners to document business rules to be applied to core data.
  • Designed and developed Tableau dashboards and story reports, working closely with management, to help analyze operations at one of the largest data warehouses, and published them to Tableau Server.
  • Developed POC dashboards using Tableau that show data defects, data remediation methods, and data quality.
  • Designed the database model in Access by creating fact and dimension tables for the POC data for various commercial lending applications, used as the input source for Tableau reporting (see the star-schema sketch after this list).
  • Developed Tableau workbooks from multiple data sources using data blending and created gauges, speedometers, scatter plots, Gantt charts, and heat maps.
  • Organized and created trending reports on a daily, monthly, and yearly basis, interacting with stakeholders to analyze the data generated by various applications for defect remediation.
  • Generated and published dashboards with summary tabs, using effective filters (quick filters, action filters, context filters), parameters, groups, and sets to handle views more efficiently.
  • Created business and technical data quality metrics and a POC for enterprise data quality execution to drive the data governance policy.
  • Worked with the team to automate the process of uploading the application files to the Access database.
  • Identified the physical data fields and the reference (or transactional) data tables in which they reside, and updated the data tables to be made available in the DQ management repository (Ab Initio Data Quality, Collibra, etc.).
  • Experience in data integration solutions for data warehousing and other systems using ETL tools and SQL Server Integration Services (SSIS).
  • Good knowledge of IBM Optim Test Data Management and IBM Optim objects (e.g., extract, convert, insert), and of the relevant scripts/functions to support them, per the design specification.
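
A minimal sketch of the fact/dimension (star schema) structure described above, written as SAS PROC SQL purely for illustration; the actual POC tables were built in Access, and the table and column names below are hypothetical placeholders.

proc sql;
   /* Hypothetical dimension table describing each lending application */
   create table work.dim_application (
      application_key  num,
      application_name char(50),
      line_of_business char(30)
   );

   /* Hypothetical fact table holding defect counts, keyed to the dimension */
   create table work.fact_data_defects (
      defect_key       num,
      application_key  num,                 /* joins to dim_application */
      defect_date      num format=date9.,
      defect_count     num,
      remediated_count num
   );
quit;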

Confidential - Charlotte, NC

Business Data Analyst

Responsibilities:

  • Validated data models to ensure effective prediction of the bank's potential defaults on commercial real estate loans.
  • Designed and created data quality rules at every level of the data hierarchy to establish data quality.
  • Good knowledge of PD/LGD models; worked on generating reports using the models and building the historical linkage.
  • Created scorecards determining the risk strategies for various models and performed sensitivity and shock analysis.
  • Automated the process using SAS and created SQL pass-through queries to run the data quality checks periodically (see the sketch after this list).
  • Worked on data extraction, cleansing, analysis, and reporting using SAS, SQL, and Advanced Excel.
  • Executed the code in UAT and compared the results with the actual production execution results. Worked with Technology to deliver the HLD and LLD documents that could be leveraged for UAT and implemented in production.
  • Integrated multiple data sources to provide a streamlined resource for future loss forecasting models.
  • Worked on UAT of the CICM and Capital models: tested the data in the UAT environment, checked it against actuals, and provided sign-off for deployment.
  • Worked closely with the business team on the requirements and created epics for the acceptance criteria.
  • Created ad-hoc reports for users in Tableau by connecting various data sources.
  • Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
  • Developed visualizations using sets, parameters, calculated fields, dynamic sorting, filtering, and parameter-driven analysis.
  • Created epics with the business justifications and acceptance criteria for the technology team to develop the functionality.
  • Interacted with business experts and senior developers to arrive at the right transaction mix in the production environment for load/stress/performance testing.
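
A minimal sketch of the SQL pass-through data quality check referenced above, assuming the SAS/ACCESS Interface to Teradata; the server, credentials, schema, table, and rule names are hypothetical placeholders rather than the actual production objects.

proc sql;
   connect to teradata (server='tdprod' user=&td_user password=&td_pwd mode=teradata);

   /* Run one hypothetical rule in the database and bring back a summary */
   create table work.dq_results as
   select * from connection to teradata (
      select 'MISSING_LOAN_ID' as dq_rule,
             count(*)          as violation_cnt
      from   edw.loan_master
      where  loan_id is null
   );

   disconnect from teradata;
quit;

/* Write a log message if the rule found violations */
data _null_;
   set work.dq_results;
   if violation_cnt > 0 then
      put 'NOTE: DQ rule failed: ' dq_rule= violation_cnt=;
run;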

Confidential - Charlotte, NC

Business Data Analyst

Responsibilities:

  • Implemented and followed a Scrum Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
  • Verified the correlation between the UML diagrams and developed detailed diagrams.
  • Imported data using LIBNAME and the PROC SQL pass-through facility to reduce processing time.
  • Good knowledge of ETL tools such as Informatica PowerCenter, used to integrate data from different sources into a data warehouse composed of the Moody's Analytics data mart.
  • Documented the developed macros, explaining their functionality and the macro parameters defined, for future reference.
  • Responsible for importing and extracting data into and out of the SAS data warehouse to external files such as Excel using the SAS Import/Export Wizard (see the sketch after this list).
  • Conducted business and technical analysis to gather functional requirements, identified business rules and transformations associated with the relevant data domains, and designed a Data Integration Hub to facilitate data movement across all environments.
  • Extensively used Tableau features such as data blending, extracts, parameters, filters, calculations, context filters, hierarchies, actions, and maps.
  • Identified the factors affecting report performance and coordinated the creation of indexes and statistics to improve query turnaround time.
  • Created actions, parameters, local and global filters, and calculated sets for preparing dashboards and worksheets using Tableau Desktop.
  • Administered users, user groups, and scheduled instances for reports in Tableau.
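
A minimal sketch of the import/export step described above, using PROC EXPORT and PROC IMPORT as the code equivalent of the SAS Import/Export Wizard; the library, data set names, and file paths are hypothetical placeholders.

libname dw '/sasdata/warehouse';              /* hypothetical SAS data warehouse library */

/* Export a SAS data set to an external Excel file */
proc export data=dw.monthly_summary
    outfile='/reports/monthly_summary.xlsx'
    dbms=xlsx
    replace;
    sheet='Summary';
run;

/* Read an external Excel file back into SAS */
proc import datafile='/incoming/adjustments.xlsx'
    out=work.adjustments
    dbms=xlsx
    replace;
    getnames=yes;
run;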

Confidential - Charlotte, NC

Data Analyst

Responsibilities:

  • Collected customer data from multiple sources such as Teradata/SQL, ensured data integrity, and learned feature engineering.
  • Worked on fraud analytics, identifying key information and strategies for dealing with heterogeneous, unfiltered data sets from different data sources.
  • Created and documented sampling procedures, BRDs, and other artifacts associated with the internal employees in the organization.
  • Involved in migrating all the tables from the Bacardi database to the W database in Teradata using SAS filters and deploying the code to production (CRD pod to the APR Tera).
  • Wrote stored procedures in the Teradata database, manipulated enormous data sets, and pulled them into SAS data sets (see the sketch after this list).
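
A minimal sketch of pulling Teradata data into SAS data sets via the SAS/ACCESS LIBNAME engine, as referenced above; the server, credentials, schema, table, and column names are hypothetical placeholders, not the actual migration objects.

libname td teradata server='tdprod' user=&td_user password=&td_pwd schema=w_db;

data work.customer_accounts;
   set td.customer_accounts;            /* Teradata table read into a SAS data set */
   where open_dt >= '01JAN2016'd;       /* filter applied during the pull */
run;

libname td clear;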

Confidential

Business Data Analyst

Responsibilities:

  • The challenge was to build a process to extract, transform, and load data sets with basic modeling variables for each segment type.
  • Wrote Korn shell scripts for executing SAS programs in batch on UNIX to save execution time and for automation.
  • Worked on advanced database querying using SAS Enterprise Guide, calculating computed columns, applying filters, and manipulating and preparing the data for reporting and statistical summarization (see the sketch after this list).
  • Designed and developed data transformation jobs using DataStage and integrated the client's data warehouses using Teradata and Oracle.
  • Used Quality Center to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
  • Developed Selenium WebDriver scripts in Java for integration testing and system testing of functional requirements.
  • Developed frameworks for cross-browser testing of the login pages to validate credentials with Selenium WebDriver.
  • Tested the code in production and conducted UAT sessions with end users to make sure the functional requirements were met.
  • Designed the test cases based on the user view and tested them in UAT, covering all the high-level scenarios. Performed alpha and beta testing as part of the UAT process.
  • Good experience with ALM, Jira, and VersionOne test management tools in Agile testing.
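
A minimal sketch of the kind of computed-column, filter, and summarization logic produced with SAS Enterprise Guide, as referenced above; the input data set, variables, and values are hypothetical placeholders.

proc sql;
   create table work.segment_prep as
   select segment_type,
          account_id,
          balance,
          balance * rate as projected_interest,          /* computed column */
          case when balance > 100000 then 'HIGH'
               else 'STANDARD' end as balance_band       /* computed column */
   from   work.model_input
   where  status = 'ACTIVE';                             /* filter */
quit;

/* Statistical summarization by segment */
proc means data=work.segment_prep n mean sum maxdec=2;
   class segment_type balance_band;
   var   balance projected_interest;
run;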
