Data Analyst Resume

Mooresville, NC

OBJECTIVE

  • Seeking a challenging position as a Data Analyst or Data Architect to utilize my skills and experience, backed by excellent problem-solving abilities in Data Validation, Data Modeling, Data Mapping, Data Profiling, Data Manipulation and Data Cleaning.

SUMMARY

  • 6+ years of progressive professional experience in the Information Technology industry using advanced SQL coding across platforms such as Teradata, Oracle and SQL Server for data mining and report generation.
  • Extensive knowledge and experience of developing SQL queries for data extraction.
  • Strong analytical skills in database design and in querying with SQL Server, Oracle, TOAD and COBOL.
  • Experienced in using SQL constructs such as Inner and Outer Joins, Union and Intersect, and OLAP functions such as Cube, Rank and Quantile.
  • Experience in using SharePoint, Excel and Remedy.
  • Excellent problem-solving ability in query performance tuning and in finding issues with resource-intensive queries.
  • Hands-on experience using Teradata utilities such as SQL Assistant, BTEQ, MLOAD, FLOAD and TPUMP.
  • Wrote complex SQL queries to extract and validate data from the Facets database.
  • Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
  • Experienced in creating numerous Volatile, Global, Set and Multi-Set tables (see the sketch after this summary).
  • Experience using Oracle, MS SQL Server, DB2, Teradata, PL/SQL, Stored Procedures, Triggers, Materialized Views and cursors.
  • Extensively worked on backend programming using PL/SQL stored procedures, functions, packages and database triggers for Oracle, as well as SQL queries, reports and scripts; expertise in SQL and PL/SQL programming, developing complex code units and database triggers.
  • Advanced experience in RDBMS implementation using object-oriented concepts and database toolkits.
  • Involved in retrieving and extracting data from multiple sources.
  • Extensive experience with business intelligence tools and concepts such as OLAP, data warehousing, reporting and querying tools, data mining and spreadsheets.
  • Performed ad hoc Data analysis through UNIX command line.
  • Experience in constructing UNIX Korn shell driver scripts.
  • Knowledge of end-to-end Data Warehousing concepts
  • Excellent at data analysis and at identifying issues while creating reports from multiple complex data sources.
  • Experience in both manual testing and automated testing.
  • Created and executed test plans, test cases and test scripts for various types of testing, including black-box, white-box, integration, performance, security, transaction, backend and regression testing.
  • Strong working knowledge of Software Development Life Cycle (SDLC) & Software Testing Life Cycle (STLC).
  • Experience conducting User Acceptance Testing (UAT) and writing test cases for the same
  • Configured tools like Mercury’s WinRunner, LoadRunner, QuickTest Pro to automate testing.
  • Excellent analytical skills for understanding the business requirements, business rules, business process and detailed design of the application
  • Good problem-solving skills, attention to detail, and sound initiative and judgment. Good business understanding and effective at working across multiple teams within an organization.
  • Excellent written and oral communication skills, along with innate relationship-building qualities.
  • Display a keen eye for detail and a strong ability to multitask effectively in a globally competitive, fast-paced environment.
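
The sketch below illustrates the kind of Teradata work described in this summary: a volatile staging table plus an OLAP-style ranking query. It is a minimal sketch only; the table and column names (sales_stg, store_id, sales_amt) are placeholders, not from any client system.

    -- Hypothetical volatile staging table (Teradata syntax); names are placeholders.
    CREATE VOLATILE TABLE sales_stg
    ( store_id  INTEGER
    , sale_dt   DATE
    , sales_amt DECIMAL(12,2)
    ) ON COMMIT PRESERVE ROWS;

    -- OLAP-style ranking: keep the ten highest-selling stores.
    SELECT store_id
         , SUM(sales_amt) AS total_sales
    FROM   sales_stg
    GROUP  BY store_id
    QUALIFY RANK() OVER (ORDER BY SUM(sales_amt) DESC) <= 10;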

TECHNICAL SKILLS

Operating Systems: Windows 2000/XP/Vista/7, Windows NT/2008/2012, UNIX

Testing Tools: HP Quality Center, QTP, Load Runner

Applications: Hyperion Essbase, Planning & Forecasting

Databases: MS SQL Server 2005/2008/2012, Teradata 12/13, MySQL 5.2, Oracle 9i/10g/11g

Web Technologies: HTML, XML, JavaScript, VBScript

MS Office Tools: MS Office, MS Visio, MS Project, MS SharePoint and Remedy

Programming Languages: C, C++, Java, SQL, VB, VC++

Querying Tools: SQL Server Management Studio 2008, Teradata SQL Assistant, SQL Plus, SQL Developer, Teradata-BTEQ, TOAD 9.1

PROFESSIONAL EXPERIENCE

Confidential, Mooresville, NC

Data Analyst

Responsibilities:

  • Performed numerous data extraction requests involving SQL scripts and UNIX. Wrote complex SQL queries to fulfill ad-hoc requests. Designed & developed various ad hoc reports.
  • Wrote PROC SQL for highly advanced merging and matching between large data sets.
  • Developed T-SQL stored procedures/functions to process the data and populate it into the appropriate destination tables.
  • Used MS Excel, SQL and UNIX to prepare weekly, biweekly and monthly reports.
  • Automated BTEQ query submission using UNIX shell scripting.
  • Performed Data Analysis and Data Quality Checking.
  • Created GAP Analysis Document.
  • Created various volatile tables, user perm tables for fulfilling data mining requests.
  • Used derived tables, sub-queries and correlated sub-queries to fetch data from different tables on the Teradata platform.
  • Created numerous MultiLoad and FastLoad scripts.
  • Created numerous Volatile, Global, Set, Multi-Set tables.
  • Widely used set operators such as MINUS, INTERSECT, UNION ALL and UNION to merge and consolidate table information (see the sketch after this list).
  • Wrote SQL queries using concepts such as Explain, Stats, CAST and volatile tables.
  • Frequently interacted with Business Analysts to clarify customer needs.
  • Gathered system design requirements and designed and wrote system specifications.
  • Suggested alternatives to Business Analysts and helped correct their understanding.
  • Interacted with Business Analysts to understand their data loading needs.
  • Discussed physical model tables and the loading workflow with Business Analysts.
  • Experience in using Excel, SharePoint, and Remedy.
  • Experience in using SAS to import data into and export data from Teradata tables.
  • Created reports that sliced and diced data using pivot tables.
  • Created numerous permanent tables, volatile tables and global temporary tables.
  • Created numerous Universes to implement web-based business intelligence for clients.
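
As referenced above, a minimal sketch of consolidating extracts with set operators of the kind used in this role; the claims_* table names are hypothetical stand-ins for the actual source and target tables.

    -- Hypothetical consolidation of two yearly extracts into one result set.
    SELECT member_id, claim_id FROM claims_2013
    UNION ALL
    SELECT member_id, claim_id FROM claims_2014;

    -- Rows present in the source extract but missing from the loaded target.
    SELECT member_id, claim_id FROM claims_source
    MINUS
    SELECT member_id, claim_id FROM claims_target;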

Confidential, Royal Oak - MI

Data Analyst

Responsibilities:

  • Maintained data consistency, database integrity and reported instances of data corruption.
  • Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for reports.
  • Applied solid program development and business process knowledge, utilizing effective communication skills.
  • Performed data quality checking.
  • Created various volatile tables, user perm tables for fulfilling data mining requests.
  • Used derived tables, sub-queries and correlated sub-queries to fetch data from different tables on the Teradata platform.
  • Widely used set operators such as MINUS, INTERSECT, UNION ALL and UNION to merge and consolidate table information.
  • Wrote SQL queries using concepts such as Explain, Stats, CAST and volatile tables.
  • Created numerous views so that information is provided on a need-to-know basis.
  • Periodically designed conceptual, logical and physical models to effectively improve business processes.
  • Used Teradata OLAP functions such as QUALIFY, TOP n, SAMPLE and RANDOM; performed Teradata SQL performance tuning.
  • Used Teradata utilities like SQL Assistant, BTEQ, MLOAD, FLOAD, TPUMP.
  • Used SQL Server and SAS procedures, packages, functions and database triggers to develop and execute MDM reports for performance and response purposes.
  • Extracted data from Teradata and Oracle and created pivot tables in Excel.
  • Created Transformations and Mappings by using Informatica Designer
  • Processed tasks using Workflow Manager to move data from multiple sources into targets.
  • Developed T-SQL stored procedures/functions to process the data and populate it into the appropriate destination tables (see the sketch after this list).
  • Examined existing data and made recommendations for improvement and efficiency; resolved requirements and design issues.
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Created Report-Models for ad-hoc reporting and analysis.
  • Participated in the testing process, ensuring testing verifies the user requirements.
  • Worked to continually improve the quality and efficiency of business systems.
  • Aided in the management and documentation of all project deliverables as well as training knowledge-base articles.
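
A minimal sketch of the T-SQL stored procedure work referenced above; the procedure, table and column names (LoadMemberSummary, ClaimStaging, MemberSummary) are hypothetical and shown only to illustrate the pattern of aggregating staged rows into a destination table.

    -- Hypothetical T-SQL procedure that aggregates staged rows into a destination table.
    CREATE PROCEDURE dbo.LoadMemberSummary
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.MemberSummary (member_id, claim_count, total_paid)
        SELECT member_id,
               COUNT(*)      AS claim_count,
               SUM(paid_amt) AS total_paid
        FROM   dbo.ClaimStaging
        GROUP  BY member_id;
    END;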

Confidential, Chicago - IL

Data Architect

Responsibilities:

  • Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects reports.
  • Requirement gathering & analysis.
  • Documented Business & System Requirements.
  • Performed numerous data extraction requests involving SQL scripts and UNIX.
  • Extensively developed complex BTEQ scripts for Invoice Processing.
  • Performed in-depth analysis of data and prepared weekly, biweekly and monthly reports using MS Excel, SQL and UNIX.
  • Used UML Diagrams to graphically capture system functionalities.
  • Periodically designed conceptual, logical and physical models to effectively improve business processes.
  • Prepared test scripts using programming constructs and verified particular operations by adding tl_step statements.
  • Programmed tests in TSL using the Function Generator and performed regression testing using batch tests.
  • Created function library to enhance the function usage for the entire team.
  • Wrote standard SQL queries to perform data validation and to create Excel summary reports (pivot tables and charts); see the sketch after this list.
  • Involved in requirements gathering, source data analysis and identified business rules for data migration and for developing data warehouse/data mart.
  • Daily interaction with internal users, and business analysts for updates and implementation of data request for key business reports.
  • Interacted with Business Analysts to understand client needs and suggested alternatives for a more effective and efficient Business process.
  • Designed & developed various Ad hoc reports for the finance (Oracle SQL, PLSQL, MS EXCEL)
  • Developed UNIX Shell Scripts for automating logs, backup procedures and data health monitoring.
  • Extensively used SAS procedures such as PROC REPORT, PROC TABULATE, PROC SUMMARY and PROC PRINT, as well as DATA _NULL_ reports.
  • Created numerous views so that the data is available on a need-to-know basis.
  • Created pivot tables in Excel by getting data from Teradata and Oracle.
  • Created Multi-set tables in Teradata database.
  • Maintained Data consistency and database integrity.
  • Developed scenarios for Unit testing and Integration testing.
  • Involved in production support and troubleshooting data quality and integrity issues.
  • Interacted with end-users and business analysts to identify and develop business requirements, transformed them into technical requirements, and was ultimately responsible for delivering the solution.
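
A minimal data validation sketch of the kind referenced above: reconciling row counts between a staging table and its target, then flagging duplicate keys in the target. The invoice_stg / invoice_fact names and the invoice_nbr column are placeholders.

    -- Hypothetical reconciliation: compare row counts between staging and target tables.
    SELECT 'source' AS side, COUNT(*) AS row_count FROM invoice_stg
    UNION ALL
    SELECT 'target' AS side, COUNT(*) AS row_count FROM invoice_fact;

    -- Flag duplicate invoice numbers in the target table.
    SELECT invoice_nbr, COUNT(*) AS dup_count
    FROM   invoice_fact
    GROUP  BY invoice_nbr
    HAVING COUNT(*) > 1;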

Confidential

Data Analyst

Responsibilities:

  • Analyzed the requirements to identify the necessary tables that need to be populated into the staging database.
  • Involved in analyzing the source data coming from different Data Sources.
  • Made intensive use of SQL within the Teradata platform. Performed back-end testing manually by writing and executing SQL statements on an Oracle database in a UNIX environment to query the database for data validation, issue identification and data cleaning/scrubbing.
  • Performed Documentation of Business & Systems Requirements.
  • Wrote numerous SQL statements to load and extract data to and from the database.
  • Created Multi Set, Set and Volatile tables using Teradata.
  • Performed in-depth analysis of data and prepared weekly, biweekly and monthly reports using MS Excel, SQL and UNIX.
  • Designed & developed various ad hoc reports for the finance team (Oracle SQL, PL/SQL, MS Excel).
  • Developed UNIX shell scripts for automating log checks and backup procedures, along with notification procedures using sendmail and mail.
  • Extensively used SAS procedures such as PROC REPORT, PROC TABULATE, PROC SUMMARY and PROC PRINT, as well as DATA _NULL_ reports.
  • Performed Data Analysis, created GAP Analysis Document, and performed Data Quality Checking.
  • Created temporary, derived and volatile tables.
  • Used Teradata OLAP functions like RANK, QUALIFY, CSUM, SAMPLE and RANDOM.
  • Extensive use of OLAP Function as well as complex joins to extract data for reporting purposes.
  • Created and modified views within Teradata for Business Analysts.
  • Identified the usage of Indexes in Teradata at various situations as part of tuning.
  • Collected statistics and used join indexes.
  • Worked with the ETL team during the upload process.
  • Created views to make data available on need to know basis.
  • Created queries to fetch data from different tables, using derived tables, Sub queries, as well as Correlated queries on SQL Server platform.
  • Acted as the point of contact for tuning inefficient queries so that they could be run during peak hours as well.
  • Converted various SAS DATASETS into Teradata tables and vice versa after doing the necessary Data Cleaning/Scrubbing and Data Validation.
  • Created source to target mapping documents from staging area to Data Warehouse.
  • Wrote numerous SQL queries that were highly tuned using concepts like Explain, Stats and CAST (see the sketch after this list).
  • Used UNIX to automate BTEQ scripts.
  • Used UML Diagrams to graphically capture system functionalities.
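
A minimal sketch of the tuning steps referenced above: refreshing statistics on the filter and join columns, then checking the optimizer plan with EXPLAIN before releasing a heavy query. The claim_fact table, its columns and the date range are hypothetical.

    -- Hypothetical Teradata tuning steps; table and column names are placeholders.
    COLLECT STATISTICS ON claim_fact COLUMN (member_id);
    COLLECT STATISTICS ON claim_fact COLUMN (service_dt);

    -- Review the optimizer plan before running the full extract.
    EXPLAIN
    SELECT member_id
         , SUM(paid_amt) AS total_paid
    FROM   claim_fact
    WHERE  service_dt BETWEEN DATE '2013-01-01' AND DATE '2013-12-31'
    GROUP  BY member_id;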

Confidential, IL

Data Analyst - Contractor

Responsibilities:

  • Created and refreshed monthly Data Marts for various processes.
  • Performed requirements analysis, identifying the necessary tables to be populated in the database.
  • Performed numerous data extraction requests involving SQL scripts.
  • Analyzed data coming from different Data Sources.
  • In depth analysis and preparation of data, generating weekly, biweekly and monthly reports using MS Access and SQL.
  • Extensive use of SAS procedures such as PROC REPORT, PROC TABULATE, PROC SUMMARY and PROC PRINT, as well as DATA _NULL_ reports.
  • Data Analysis, Documentation, and Data Quality Checking.
  • Periodically created UML diagrams to arrive at a data design that helped the business run more efficiently.
  • Created Multi-set and Set tables in the Teradata database (see the sketch after this list).
  • Maintained Data consistency and integrity.
  • Demonstrated excellent analytical and documentation skills, along with the ability to be a key player in a team environment.
  • Interacted with end-users and business analysts, identifying and developing business requirements.
  • Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects reports.
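
A minimal sketch of the SET versus MULTISET table definitions referenced above (Teradata syntax): a SET table rejects fully duplicate rows, while a MULTISET table accepts them. All object names are placeholders chosen for illustration.

    -- Hypothetical SET table: duplicate rows are rejected.
    CREATE SET TABLE policy_dim
    ( policy_id   INTEGER NOT NULL
    , policy_type CHAR(2)
    ) UNIQUE PRIMARY INDEX (policy_id);

    -- Hypothetical MULTISET staging table: duplicate rows are allowed.
    CREATE MULTISET TABLE payment_stg
    ( policy_id INTEGER
    , paid_dt   DATE
    , paid_amt  DECIMAL(12,2)
    ) PRIMARY INDEX (policy_id);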
