
Data Analyst Resume

Irving, Texas


  • Data Analyst with 8+ years of experience in interpreting and analyzing data to drive business solutions using SQL, SAS and Excel.
  • Experience in ETL, Data Warehousing and Agile (Scrum) methodology. Proficient knowledge of statistics, mathematics and analytics.
  • Strong working experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehouses using Data Conversion, Data Extraction, Data Transformation and Data Loading (ETL).
  • Experience in development methodologies like RUP, SDLC, AGILE, SCRUM and Waterfall
  • Strong working knowledge of SQL, ETL (Informatica, DB2), SQL Server, Oracle, DB2, SAS, Tableau and Cognos while handling various applications in multiple projects
  • Extensive working experience in RDBMS technologies like Oracle 10g/11i, MS SQL Server … Excel. SQL Developer, TOAD, SQL Plus, Win SQL. Good at working with SQL Assistant 7.1 in a Teradata environment.
  • Experience in developing Entity Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
  • Proficient in Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance in OLTP, OLAP, Data Warehouse and Data Mart environments.
  • Experienced in working with BI Reporting teams that use tools such as MicroStrategy, BusinessObjects and SSRS, as well as development of ETL mappings and scripts.
  • Experienced in Data Extraction/Transformation/Loading (ETL), Data Conversion and Data Migration using SQL Server Integration Services (SSIS) and PL/SQL scripts.
  • Experienced in supporting Informatica applications, data extraction from heterogeneous sources using Informatica Power Center.
  • In depth technical knowledge and understanding of Data Warehousing, Data Validations, OLAP, SQL Server, Oracle and ETL.
  • Expert in Data Modeling, Data Analysis, Data Visualization and Modern Data Warehouse concepts. Designed various reports/dashboards to provide insights and data visualization using BI/ETL tools like Mainframe SAS, SSAS, SSIS, OLTP, OLAP, Business Objects, Tableau, Informatica PowerCenter and DataStage.
  • Strong SQL query skills and experience with Designing and verifying Databases using Entity-Relationship Diagrams (ERD) and Data profiling utilizing queries, dashboards, macros etc.
  • Very strong working experience with ALM for requirements management and defect management, and responsible for various reconciliation activities
  • Hands-on experience in preparing ETL Mappings (Source-Stage, Stage-Integration, ISD), Requirements gathering, Reporting requirements, Summary tabulations and Prototype Design by developing various use case scenarios
  • Extensive experience in Object Oriented Analysis and Design (OOAD) techniques with UML using Flow Charts, Use Cases, Class Diagrams, Sequence Diagrams, Activity Diagrams and State Transition Diagrams
  • Extensively interacted with the QA Team in executing Test Plans, providing Test Data, creating Test Cases, issuing STRs upon identification of bugs and collecting the Test Metrics
  • Experience in performing User Acceptance Testing (UAT) and End-to-End testing, monitoring test results and escalating based on priorities
  • Experience working with RUP, Waterfall and Agile methodologies and demonstrated excellent quality in delivering the output.
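The normalization techniques listed above can be illustrated with a minimal sketch (hypothetical order data, using SQLite via Python for portability): a flat, repeating-group table is decomposed into 3NF tables so each non-key attribute depends only on its own table's key, and a join confirms no information was lost.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized order records: customer details repeat on every row.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER, customer_id INTEGER,
    customer_name TEXT, customer_city TEXT, amount REAL)""")
cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?,?)", [
    (1, 10, "Acme", "Irving", 250.0),
    (2, 10, "Acme", "Irving", 125.0),
    (3, 20, "Globex", "Columbus", 300.0),
])

# 3NF decomposition: customer attributes move to their own table,
# keyed by customer_id, eliminating the repeated group.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY, customer_name TEXT, customer_city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers, amount REAL)""")
cur.execute("""INSERT INTO customers
    SELECT DISTINCT customer_id, customer_name, customer_city FROM orders_flat""")
cur.execute("INSERT INTO orders SELECT order_id, customer_id, amount FROM orders_flat")

# A join reconstructs the original rows, so the decomposition is lossless.
rows = cur.execute("""SELECT o.order_id, c.customer_name, o.amount
    FROM orders o JOIN customers c USING (customer_id)
    ORDER BY o.order_id""").fetchall()
print(rows)
```

All table and column names here are hypothetical; the same pattern applies to any OLTP schema.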


Languages: T-SQL, PL/SQL, SQL, C, C++, XML, HTML, DHTML, HTTP, Matlab, DAX, Python

Application Software: MS Office suite, Rational Suite, MS Project

Methodologies: RUP, OOAD, UML & Business/Data Modeling, ER modeling

Documents & Processes: SRS, Use Cases, UML diagrams, FRS, UAT, Test plans & cases, Business Process Modelling, Project Planning & tracking

Data warehousing Tools: Informatica, Business Objects, Cognos, Erwin

Web Technologies: HTML, XML

DWH / BI Tools: Microsoft Power BI, Tableau, SSIS, SSRS, SSAS, Business Intelligence Development Studio (BIDS), Visual Studio, Crystal Reports, Informatica 6.1.

Database Design Tools and Data Modeling: MS Visio, ERWIN 4.5/4.0, Star Schema/Snowflake Schema modeling, Fact & Dimension tables, physical & logical data modeling, Normalization and De-normalization techniques, Kimball & Inmon Methodologies

Tools and Utilities: SQL Server Management Studio, SQL Server Enterprise Manager, SQL Server Profiler, Import & Export Wizard, Microsoft Management Console, Visual Source Safe 6.0, DTS, Crystal Reports, Power Pivot, ProClarity, Microsoft Office, Excel Power Pivot, Excel Data Explorer, Tableau, JIRA, Spark MLlib


Confidential, Irving, Texas

Data Analyst


  • Played a Data Analyst role at WF, which included gathering data requirements for data warehouses as well as data requirements analysis, mapping and data profiling.
  • Worked within the existing team of data analysts and with our business partners to gather data requirements from business users.
  • Worked on Requirement Analysis, Data Analysis and Gap Analysis of data coming from multiple source systems.
  • Conducted JAD sessions to gather input from different stakeholders such as editorial staff, designers, etc.
  • Performed Business Process mapping for new requirements.
  • Involved in Business and data analysis during requirements gathering.
  • Performed Data Validation with Data profiling
  • Worked independently and closely with data modelers and data analysts doing design work to hand off data requirements and facilitate the transition from requirements to design, including capturing data consumption requirements and managing traceability.
  • Independently analyzed source and profile data related to business requirements, matched it to business names and definitions, and worked with business and technical metadata.
  • Involved in translating the business requirements into data requirements across different systems.
  • Involved in Data Extraction from XML Blobs and Flat Files.
  • Designed reports in Access and Excel using advanced functions including, but not limited to, VLOOKUP, pivot tables and formulas.
  • Used SQL to validate the data going into the Data Warehouse.
  • Created complex data analysis queries to troubleshoot issues reported by users.
  • Evaluated data mining request requirements and helped develop the queries for the requests.
  • Conducted UAT (User Acceptance Testing) for multiple iterations by writing Test Cases and signed off the same after approval.
  • Responsible for creating the Requirements Traceability Matrix.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Verified and maintained Data Quality, Integrity, data completeness, ETL rules, business logic.
  • In depth technical knowledge and understanding of Data Warehousing, Data Validations, OLAP, Oracle and ETL.
  • Involved in Designing Star Schema, Creating Fact tables, Dimension tables and defining the relationship between them.
  • Created Traceability Matrix to ensure that all requirements are covered in test cases.
  • Skilled in various MS Office tools including MS Visio and MS Project
  • Excellent written and oral communication skills and a team-player with a results-oriented attitude.
  • Worked in Agile (Scrum) and Waterfall models.
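The warehouse-load validation described above (SQL checks on data going into the Data Warehouse) can be sketched as a minimal example. The staging and target tables here are hypothetical, and SQLite stands in for Oracle/Teradata: compare row counts between staging and target, and check that business keys are never NULL, before signing off a load.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (cust_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE dw_customer  (cust_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?,?)",
                [(1, "Acme"), (2, "Globex"), (None, "Bad Row")])
cur.executemany("INSERT INTO dw_customer VALUES (?,?)",
                [(1, "Acme"), (2, "Globex")])

# Row-count reconciliation between staging and the warehouse target.
stg_count = cur.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
dw_count = cur.execute("SELECT COUNT(*) FROM dw_customer").fetchone()[0]

# Completeness check: business keys must not be NULL.
null_keys = cur.execute(
    "SELECT COUNT(*) FROM stg_customer WHERE cust_id IS NULL").fetchone()[0]

print(f"staging={stg_count} warehouse={dw_count} null_keys={null_keys}")
```

Here the one NULL-keyed staging row explains the count mismatch; in practice each failed check would be logged as a defect before the load is certified.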

Environment: Oracle 11g, 10g, HP ALM, Informatica Power Center 9/8.5.1 (Workflow Manager, Workflow Monitor), XML, Test Cases, Teradata V2R6, Teradata SQL Assistant, Unix, Test Matrix, TOAD, HP Quality Center 10, Putty

Confidential, Columbus, Ohio

Data Analyst


  • Worked on Requirement Analysis, Data Analysis and Gap Analysis of data coming from multiple source systems. Responsible for BI Data Quality.
  • Organized several requirements gathering sessions to prepare the Business/Data requirement documents.
  • Took active part in review meetings with stakeholders and business groups to understand their requirements and usage.
  • Analyzed requirements for various reports, dashboards and scorecards, and created proofs of concept / prototypes for the same using Tableau Desktop.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps and Gantt charts using the Show Me functionality.
  • Involved in Data Extraction from Teradata and Flat Files using SQL Assistant.
  • Used Excel sheets, flat files and CSV files to generate Tableau ad-hoc reports.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Verified and maintained Data Quality, Integrity, data completeness, ETL rules, business logic.
  • Involved in creating dashboards to compare data using dual axis for comparison.
  • Extensively used SQL queries to check storage and accuracy of data in database tables.
  • Used SQL for Querying the Oracle database.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Designed reports in Access and Excel using advanced functions including, but not limited to, VLOOKUP, pivot tables and formulas.
  • Evaluated data mining request requirements and helped develop the queries for the requests.
  • Worked extensively with advanced analysis Actions, Calculations, Parameters, Background Images, Maps, Trend Lines, Statistics, Log Axes, Groups, Hierarchies and Sets to create detail-level summary reports and dashboards
  • Created Dashboards by joining multiple complex tables.
  • Tested whether the reports developed in Cognos are as per company standards.
  • Worked with business team to test the reports developed in Cognos.
  • Personalized Cognos Connection with specified display options for testing regional options and personal information; also wrote several complex SQL queries for validating Cognos Reports.
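The SQL validation of report output mentioned above typically takes the form of a reconciliation query: aggregate the source the same way the report does, then tie the report lines back to the grand total. A minimal sketch with a hypothetical sales table, using SQLite in place of Oracle:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?,?)",
                [("East", 100.0), ("East", 50.0), ("West", 75.0)])

# Aggregate the source data the same way the report does...
detail = dict(cur.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())

# ...and reconcile the grand total against the sum of the report lines.
grand_total = cur.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
assert grand_total == sum(detail.values()), "report lines do not tie to total"
print(detail, grand_total)
```

Any discrepancy between the detail rollup and the total would be raised as a report defect; the table and column names above are purely illustrative.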

Environment: Oracle 11g,Oracle 10g, Cognos 8.0 Series, Teradata V2R6, Teradata SQL Assistant, TOAD, Tableau Desktop 9.0, 10.0/10.2, Tableau Server

Confidential, Fountain Valley, CA

Data Analyst


  • Created a report decomposition plan where every risk-related report is broken down to its lowest elemental information, such as table, view, schema, calculations, format and data-structure-level information
  • Responsible for training resources, validating the decomposition approach with the team, scheduling client interviews, report walkthroughs and BRD reviews, and managing escalations
  • Worked with IT on creation of a report portal where all the decomposed reports are created in the Cognos reporting tool, and created a plan for data operations covering permissions, approvals, validation requirements, etc.
  • Interacted with Senior Management to provide estimates, status and issues on the risk report decomposition, which involved 120 reports
  • Created a reverse-engineering process for target-to-source mapping, identifying risk-related reports and the original source of record
  • Extensively worked with SAS datasets for various reports; responsible for creating several macros to simplify the process of decomposing the SAS code and identifying risk data elements, interim calculations and the original sources required to establish risk aggregation metrics
  • Worked on the Data migration plan to understand the future reporting elements, where DB2 and Mainframe are the SORs and Teradata is the target database, as part of the IDW to ICDW migration
  • Responsible for creating a data validation template along with SMEs and IT, and coordinated the QA process where every report is tested and validated.

Confidential, New York, NY

Data Analyst


  • Conducted initial workshop sessions with DELL and IKA systems to facilitate discussions and drafted a road map covering all the key areas of Data conversion
  • Created current data architecture layer diagrams for the Enrollment & Claims systems covering the end-to-end data flow processes involved in each system
  • Reverse engineered the current state data model and designed an enhanced data model to support the future state requirements
  • Performed Data gap Analysis by creating future state diagrams and documented all the pain points involved in various Data processes, as well as some other manual processes involved in the current system
  • Created Transformations using the SQL script to modify the data before loading into staging tables
  • Developed mappings using Informatica PowerCenter Designer to load data from various source systems to the target database
  • Created and used mapping parameters, mapping variables using Informatica mapping designer to simplify the data flow process and data standards
  • Worked with various departments (Enrollment, Claims, Billing, Customer Service) on complex Data issues and developed a mitigation plan to resolve them in a timely manner
  • Conducted ICD-9 to ICD-10 Data mapping sessions and documented the components utilizing ICD-9 within various internal and external systems, thereby identifying the processes that need to be converted
  • Worked on the databases to understand the source & target systems, volume of data, staging areas, transfer methods, reports and listed all of them in a spread sheet as part of conversion activities
  • Very strong working knowledge of CMS governance data standards; made sure various forms of data such as Eligibility, Provider, Claims, Premium Billing, Encounter data and other transactional data meet the standards.
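The SQL-script transformations described above, applied before data lands in staging tables, usually standardize keys and code values in flight. A minimal sketch with hypothetical member records, using SQLite for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_member (member_id TEXT, gender TEXT)")
cur.executemany("INSERT INTO src_member VALUES (?,?)",
                [("  M001 ", "male"), ("M002", "F"), ("M003 ", "FEMALE")])

cur.execute("CREATE TABLE stg_member (member_id TEXT, gender_code TEXT)")

# Transform while loading: trim stray whitespace from keys and
# standardize free-form gender values to a one-letter code.
cur.execute("""
    INSERT INTO stg_member
    SELECT TRIM(member_id),
           CASE UPPER(SUBSTR(TRIM(gender), 1, 1))
                WHEN 'M' THEN 'M' WHEN 'F' THEN 'F' ELSE 'U' END
    FROM src_member""")

staged = cur.execute("SELECT * FROM stg_member ORDER BY member_id").fetchall()
print(staged)
```

The source layout and the M/F/U coding are assumptions for the sketch; the real conversion rules would come from the data mapping sessions described above.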


Data Analyst


  • Scrum Master for two internal projects, both involving automating various manual processes in Agile mode to improve timely delivery and quality
  • Responsible for creating sales reporting metrics using Cognos across all markets (11 states) and providing improvement solutions that benefited the individual markets' sales revenue
  • Strong Excel working knowledge of data mining, filtering, pivot tables and formulas, and of setting up database connections for automatic data refresh as well as SharePoint links
  • Experience conducting loads using Informatica tools; handled Performance Tuning of Informatica jobs
  • Responsible for running Daily, Weekly and Monthly reports from both SQL Server and Oracle data warehouses
  • Primary contact for one of the vendors regarding compliance and data-related checkpoints for one of the processes
  • Worked with the marketing team and analyzed marketing data using Access/Excel and SQL tools to generate reports, tables, listings and graphs
  • Strong domain knowledge of Data governance and Data quality to make sure compliance standards are properly incorporated
  • Extensive experience in Data migration, extraction, Data cleansing and Data Staging of operational sources using ETL (Informatica) processes
  • Strong Data mining work; responsible for working with Business Stakeholders and Vice Presidents to report weekly sales, issues, timeliness, trends, statistics and all other roadblocks that need to be addressed
  • Developed various automated and customized reports, along with improved template formats, for various data metrics as part of reporting
  • Analyzed different kinds of data from many systems using ad-hoc queries, SQL scripts and Cognos report designs, and delivered various comparisons, trends, statistics, errors and suggestions
  • Responsible for conducting technical breakout sessions with different groups in order to capture Data improvement requirements for some of the systems as per priority list
  • Maintained Change control process, conducted thorough analysis on various parameters, documented and presented the same to the Managers and Directors


SQL Developer


  • Created new database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in Development and Production environments for SQL Server 2000
  • Actively participated in gathering of requirements and system specifications.
  • Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and formatted the results into reports and kept logs
  • Strong Understanding of Agile Data Warehouse Development.
  • Worked on complex T-SQL statements, and implemented various codes and functions.
  • Installed, authored, and managed reports using SQL Server 2005 Reporting Services
  • Wrote Transact SQL utilities to generate table insert and update statements.
  • Developed and optimized database structures, stored procedures, DDL triggers and user-defined functions.
  • Implemented new T-SQL features added in SQL Server 2005, such as error handling through the TRY-CATCH statement and Common Table Expressions (CTEs).
  • Created Stored Procedures to transform the data and worked extensively in T-SQL on the various transformations needed while loading the data.
