
Sr. Data Analyst Resume


Richmond, VA


  • Around 7 years of strong experience in data warehousing, ETL (extraction, transformation, loading), data analysis, data migration, data preparation, graphical presentation, statistical analysis, reporting, data validation, and documentation.
  • Worked closely with Business Analysts/Ops Analysts and Oracle developers to tune products to work in concert with Teradata and to create standards.
  • Acted as a liaison between the IT department and core business groups to maintain/prepare database reporting and perform data analysis and documentation using Oracle, Teradata, SQL Server, MS Access, and OLAP tools.
  • Experienced in producing ad-hoc reports using Teradata, BTEQ, Linux, and UNIX.
  • Assisted IT developers and non-IT developers (database analysts) with Teradata utilities.
  • Developed custom dashboards/reports/GUIs using QlikView.
  • Experience with QlikView sheet objects such as list boxes, multi boxes, tables, and current selection boxes, as well as multiple chart types, buttons, drill-down, hyperlinking, ranking, breaks, KPIs, and custom Excel-export requests; experienced in editing scripts to create variables, fields, and calculated expressions using basic and aggregated functions.
  • Created quarterly, monthly, and yearly charts for report analysis per customer requirements, using the chart types provided by QlikView.
  • Extensive Excel and pivot table development experience, including VLOOKUPs, data sorting and filtering, data validation, creating pivot tables and pivot charts, and maintaining large data sets in spreadsheets.
  • Performed user acceptance, system, and unit testing for the entire database using Test Director, WinRunner, and LoadRunner.
  • Periodically monitored tablespaces and tablespace growth and rebuilt indexes.
  • Developed UNIX shell scripts to run batch jobs and communicate status messages to the database analysts and developers.
  • Proficient in prioritization and multi-tasking to ensure that tasks are completed on time.
  • Assisted in the analysis, design, coding, and development of a new schema (staging tables, views, SQL files).
  • Monitored the performance of existing code and changed the code for better performance.
  • Used UNIX shell scripting and crontab to automate log checks (reviewing daily logs), the backup process, and SQL scripts.
  • Automated Teradata and SAS scripts in UNIX and Linux environments.
  • Maintained database referential integrity for all processes and DBMS systems.
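The crontab-driven log automation mentioned above can be sketched roughly as follows. This is a minimal illustration, not the actual script: the paths, log directory, and report location are all hypothetical.

```shell
#!/bin/sh
# Hypothetical sketch of a crontab-scheduled daily log check.
# Example crontab entry (run the check every morning at 6:00):
#   0 6 * * * /home/etl/bin/check_daily_logs.sh

LOG_DIR="${LOG_DIR:-/var/log/etl}"       # where batch jobs write their logs (assumed path)
REPORT="${REPORT:-/tmp/log_summary.txt}" # summary sent on to analysts (assumed path)

check_daily_logs() {
    today=$(date +%Y-%m-%d)
    : > "$REPORT"
    for log in "$LOG_DIR"/*.log; do
        [ -f "$log" ] || continue
        # Flag any job whose log contains ERROR or FAILED lines
        if grep -Eq 'ERROR|FAILED' "$log"; then
            echo "ATTENTION: $(basename "$log") has errors ($today)" >> "$REPORT"
        else
            echo "OK: $(basename "$log") completed clean ($today)" >> "$REPORT"
        fi
    done
    cat "$REPORT"
}
```

In practice the summary would be mailed or pushed to the database analysts; here it is simply written to a report file and echoed.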


Databases: MS Access, Oracle 9i/10g/11g, SAS, Teradata utilities (BTEQ, FastLoad, FastExport), SQL Server 2008

GUI tools: Developer 2000, Visual Basic 6.0, Oracle Reports, MS Access, Base SAS, Visual Studio 2005/2008

Languages: SQL, PL/SQL, HTML, Developer 2000, VBA, Shell Scripting, T-SQL

BI Tools: QlikView 8.5/9, SSIS

Tools: SQL Assistant, TOAD, EditPlus, Enterprise Edition (SSIS, SSRS), PuTTY, SQL Server Management Studio

O/S: MVS, UNIX, Linux, Windows 98, Windows XP, Windows 7, and MS-DOS

Software: MS Access, MS Office and MS Excel

Defect Tracking tool: Test Director


Confidential, Richmond, VA

Sr. Data Analyst


  • Created and maintained various one-time and ongoing reports for the Recoveries group, refreshed on a daily, weekly, or monthly basis depending on the business criteria. All reports were hosted on an intranet site called Knowledge Link, and an e-mail was sent to the respective clients informing them of the updated data availability. These reports supported the Contingency team, which handles accounts that are over 180 days past due and have been charged off.
  • Supported reporting needs of various Ops groups within Recoveries - Bankrupts, Cash, Estates and Legal.
  • Supported HSBC data integration with Confidential: helped create reports to test the migration, productionized combined HSBC and Confidential reports using different data sources, and ensured seamless migration of data to one central repository created for the merger.
  • Created exception reports to highlight differences and help bridge data/normalization issues.
  • Compiled and distributed Recoveries metrics for inclusion in executive dashboard - reported to the board.
  • Helped load data via file import using FastLoad to refresh/maintain mapping tables from third-party agencies, and automated file transfers via FTP to send data files to downstream systems.
  • Created daily and weekly new-account reports for various lines of business using Teradata SQL Assistant, UNIX, SQL, and MS Excel.
  • Developed and executed beginning-of-month financial reports to monitor new accounts in the production databases (Teradata).
  • Created SAS datasets using Base SAS from Teradata and Oracle tables.
  • Extracted data from various production databases and exported it into MS Excel using Teradata utilities, SAS, SQL, and UNIX.
  • Worked on various ad-hoc requests for financial analysts, business analysts, and project managers.
  • Automated financial reports per business analyst and project manager requests, parameterized by frequency (daily, bi-weekly, monthly).
  • Generated daily, weekly, and monthly financial and business reports and prepared comparative statements.
  • Created pivot tables, charts, data formatting, conditional formatting, VBA, and macros according to business requirements.
  • Performed data validation using Teradata, Oracle, and UNIX before providing data to business and financial analysts.
  • Migrated the existing automation processes from UNIX to Linux environments.
  • Extracted, manipulated, and validated data from various data stores to build a financial data mart using Teradata and Oracle utilities.
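The frequency-parameterized report automation described above can be sketched as a small shell wrapper. This is an illustrative stub only (assuming GNU date); in the real job, the computed window would drive a Teradata/SQL extract rather than a printed message.

```shell
#!/bin/sh
# Minimal sketch: dispatch a report run on a daily/biweekly/monthly parameter.

report_window() {
    # Print the start date of the reporting window for a given frequency
    # (GNU date's -d option is assumed).
    case "$1" in
        daily)    date -d "1 day ago"   +%Y-%m-%d ;;
        biweekly) date -d "14 days ago" +%Y-%m-%d ;;
        monthly)  date -d "1 month ago" +%Y-%m-%d ;;
        *) echo "usage: report_window daily|biweekly|monthly" >&2; return 1 ;;
    esac
}

run_report() {
    start=$(report_window "$1") || return 1
    end=$(date +%Y-%m-%d)
    # Stub standing in for the actual BTEQ/SQL extract over [start, end]
    echo "Running $1 report for $start .. $end"
}
```

A single script like this lets the same extract logic serve the daily, bi-weekly, and monthly requests without duplication.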

Confidential, Mountain View, CA

Data Analyst


  • Created customer lists for mailing and phone solicitations, created prospect list files in the format specified by the vendor, and tracked customer participation in the responses.
  • Periodically monitored tablespaces and tablespace growth and rebuilt indexes.
  • Generated weekly, bi-weekly, and monthly reports with the help of Oracle, Teradata, SQL, BTEQ, MS Access, MS Excel, UNIX, and SAS.
  • Created quarterly, monthly, and yearly charts for report analysis per customer requirements, using the chart types provided by QlikView.
  • Loaded data into the data warehouse using TPump and MultiLoad.
  • Wrote numerous macros and procedures to automate the loading of data.
  • Extracted and Imported data from different sources like Oracle, SQL Server, MS Access, Teradata, XML files and flat files using OLE DB/ODBC to meet business report needs.
  • Involved in creating dashboard style of layouts using different sheet objects like List boxes, Multi boxes, slider, current selections box, buttons, charts, text objects, bookmarks, etc.
  • Designed dashboards for client to keep track of Sales representative performance, and efficiency of converting customer leads into revenue.
  • Good knowledge of building QVDs and QVWs, applying business rules and data validations.
  • Communicated with Financial analysts, Business Analyst, Process Analyst, and Ops Analysts to build financial models.
  • Developed ad-hoc SQL queries to fulfill data requests from business analysts, operations analysts, and financial analysts.
  • Exported data into various formats by using MS Excel, MS Access, and SAS.
  • Performed a crucial role in the gathering of Report requirements and preparation of the report design documents.
  • Wrote case expressions and data manipulation commands using Teradata (BTEQ).
  • Prepared a wide range of statistical analyses, including analysis of variance, regression, categorical data analysis, psychometric analysis, and survival analysis, using SAS.
  • Created tables, indexes, string functions, and outer joins in Teradata.
  • Performed data validation and data integrity checks before delivering data to operations and financial analysts.
  • Provided production support, including resolving trouble tickets on a daily basis and checking and fixing error logs.
  • Performed extensive testing of database scripts to ensure accuracy and timely processing of data.
  • Enhanced the existing shell scripts to include new customers and executed the Korn shell jobs that schedule the customer scoring process.
  • Supported and migrated the development, testing, demo, and training environments.
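A customer-scoring driver of the kind described above might look like the following sketch: a shell loop that reads a customer list and kicks off a per-customer scoring step. The file names and the scoring stub are assumptions; the real jobs ran under a Korn shell scheduler against Teradata.

```shell
#!/bin/sh
# Illustrative customer-scoring driver (all names hypothetical).

score_customers() {
    list="$1"    # one customer identifier per line
    while IFS= read -r cust; do
        [ -n "$cust" ] || continue
        # The real version would launch the per-customer BTEQ scoring job here
        echo "scored $cust"
    done < "$list"
}
```

Keeping the customer list in a flat file is what makes "including new customers" a one-line change rather than a script edit.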

Confidential, Foster City, CA

Data Analyst


  • Designed and implemented an enterprise data warehouse and data marts for business decision making.
  • Maintained financial reporting, databases, and processes, and supported non-technical and technical departments.
  • Interacted with technical analysts, business analysts, and operations analysts to resolve data issues.
  • Built financial models for business analysts, operations analysts, and financial analysts (high-level business entities) and identified key problems and process improvement opportunities.
  • Developed ad-hoc reports using Oracle and UNIX.
  • Created ad-hoc and daily reports per business analyst, operations analyst, and project manager requirements on a daily basis.
  • Built the operational data store using BTEQ, Teradata, Oracle, SQL, PL/SQL, and UNIX.
  • Used the Basic Teradata Query (BTEQ) utility for reporting purposes, developing BTEQ scripts for interactive and batch queries.
  • Wrote case expressions and data manipulation commands using Teradata (BTEQ).
  • Created tables, indexes, string functions, and outer joins in Teradata.
  • Enhanced the existing shell scripts to include new customers and executed the Korn shell jobs that schedule the customer scoring process.
  • Generated trend-over-time and variance reports; switch, pie, bar, nested, and line charts; and performed exception highlighting, sorting, drill-down, slice and dice, filter-change displays, and ranking.
  • Developed the mechanism to load data from the data warehouse into the various data marts.
  • Designed and developed financial and ad-hoc reports using BTEQ and SQL.
  • Experienced in working with MS Excel and creating pivot tables and pivot charts.
  • Automated backend processes through shell scripting.


SQL Developer


  • Participated actively in client calls and meetings with the Project Manager to discuss problems and propose alternatives.
  • Responsible for implementing ETL solutions using SQL Server Integration Services (SSIS).
  • Designed and developed SSIS packages using Control Flow tasks, Data Flow tasks, Execute SQL tasks, Derived Column transformations, and Merge Join transformations in SSIS Designer.
  • Analyzed the requirements document for every client requirement.
  • Responsible for writing T-SQL queries, stored procedures, user-defined functions, and indexes for testing discrepant records.
  • Extracted customer data as outlined in the agreement, with the newly appended information delivered to the customer.
  • Created UNIX shell scripts to automate the execution processes.
  • Developed reports using complex queries per client requests as part of production support.
  • Generated extracts in the client-requested format with PGP encryption, per client requirements.
  • Involved in QA testing and analysis of the DB extract against the mainframe extract.
  • Involved in discussions with the onsite SMEs for clarification.
  • Provided production support for the CMAS services.
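The client-format extract step above can be sketched as a small reshaping script. The delimiter, field positions, and output layout are all assumptions standing in for whatever the client actually specified, and the PGP encryption step is omitted here.

```shell
#!/bin/sh
# Illustrative sketch: reshape a raw pipe-delimited extract into a
# client-requested CSV layout (hypothetical fields).

format_extract() {
    # Input:  pipe-delimited rows "cust_id|name|balance"
    # Output: headed CSV "name,cust_id"
    in="$1"; out="$2"
    echo "name,cust_id" > "$out"
    awk -F'|' '{ print $2 "," $1 }' "$in" >> "$out"
}
```

In the real pipeline, the resulting file would then be PGP-encrypted (e.g. with a gpg step) before delivery to the client.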
