
Sr. Data Analyst Resume

Pittsburgh, PA


  • 8+ years of professional experience in the Information Technology (IT) Industry serving as a Data Analyst.
  • Experience in Data Analysis, Business Process Analysis/Modeling, Business Requirements Gathering, Database Design, Data Warehousing, Data Mapping, and development of Web-Based and Client/Server Applications in the Banking, Finance, and Insurance domains.
  • Extensive experience in System and Data Analysis, Data Profiling, Data Mapping, Data Migration, Data Conversion, Data Quality, Data Management, Data Governance, Data Integration and Metadata Management Services and Configuration Management.
  • Hands-on experience across the full project life cycle using methodologies such as Agile, Waterfall, RUP, and hybrid approaches.
  • Extensive experience interacting with system users to gather business requirements and developing projects; chaired and conducted JAD sessions with business and technical teams.
  • Expert in analyzing/troubleshooting and providing technical support for Oracle/Unix applications and their interfaces.
  • Extensive experience with business intelligence (BI) tools and technologies such as OLAP, data warehousing, reporting and querying tools, data mining, and spreadsheets.
  • Experience in Relational and Dimensional Data Modeling for creating Logical and Physical Designs of Databases and ER Diagrams using data modeling tools such as Erwin.
  • Extensively involved in MDM to support strategic decision making and process improvements, streamlining data sharing among personnel and departments.
  • Vast experience working in the area of data management, including data analysis, gap analysis, and data mapping.
  • Experience in Data Transformation, Metadata, Data dictionary, Data Loading, Modeling and Performance Tuning.
  • Experience in DBMS/RDBMS implementation using object-oriented concepts and database toolkits.
  • Experienced in creating documentation, providing production support, and performing development and enhancements on Oracle (SQL, PL/SQL), UNIX, and Cognos/QlikView (BI tools) projects.
  • Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and Interface requirements documents.
  • Worked on integration and implementation of projects and products, database creations, modeling, calculation of object sizes, table spaces and database sizes.
  • Experience with creating reports using Business Objects.
  • Well versed in system analysis, ER/Dimensional Modeling, database design, and implementing RDBMS-specific features.
  • Involved in testing the complex COGNOS reports developed, supporting the Testing Team, writing the Test cases and creating the Test Data.
  • Created and executed Test Plan, Test Scripts, and Test Cases based on Design document and User Requirement document for testing purposes.
  • Extensive experience in the testing environment, which included User Acceptance Testing (UAT), functional testing and system testing and defect tracking using ClearQuest.
  • Coordinated and prioritized outstanding defects and system requests based on business requirements. Acted as a liaison between the development team and the management team to resolve any conflicts in terms of requirements.
  • Experience with Star schema modeling and knowledge of Snowflake dimensional modeling.
  • Experience with SQL and PL/SQL objects: creating packages, writing procedures and functions, and tuning using various types of partitioning and indexes.
  • Interacted with all levels of the project development team, from end users to Software Architects, Technical Leads, Database Administrators, and System Administrators.
  • Excellent knowledge in Database Creation and maintenance of physical data models with Oracle, Netezza, DB2 and SQL Server databases.
  • Experience in Data Integrity constraints, Performance Tuning, Repairing dead locks, Query Optimization and Validation issues.
  • Highly proficient in the use of T-SQL for developing complex stored procedures, triggers, tables, user functions, user profiles, relational database models and data integrity, SQL joins and query writing.
  • Possess strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in sessions to identify requirement feasibility.
  • Excellent communication and interpersonal skills with a clear view of business process flow.
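The query tuning with indexes described above can be sketched as follows. This is a minimal, illustrative example (not from any specific project here); the table and index names (`accounts`, `idx_accounts_acct`) are hypothetical, and SQLite stands in for the Oracle/SQL Server environments listed.

```python
import sqlite3

# Hedged sketch of index-based tuning: add an index on the lookup column
# and confirm the query planner uses it. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE accounts (acct_id INTEGER, balance REAL)")
cur.execute("CREATE INDEX idx_accounts_acct ON accounts (acct_id)")

# EXPLAIN QUERY PLAN shows whether the equality lookup can use the index.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT balance FROM accounts WHERE acct_id = 1"
).fetchall()
print(any("idx_accounts_acct" in row[-1] for row in plan))  # expected: True
```

The same check-before-and-after-indexing workflow is how slow queries are typically diagnosed in Oracle (via `EXPLAIN PLAN`) or SQL Server (via execution plans).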


Databases: SQL Server 2012, MS Access, Oracle 9i, DB2, Netezza, OBIEE.

Visualization Tools: Tableau, QlikView

Data Modeling Tools: Erwin, ER Studio and Oracle Designer.

Business Intelligence Tools: SAS Enterprise Miner, XL Miner, Cognos, SAP, Business Objects

Software Applications: MS Visio, Toad for Oracle

Modeling Tools: MS Visio, iGrafx, Rational Rose

Requirement Management: RTC, DOORS, Version One

Diagrams: Use Case Diagrams, Activity Diagrams, State Diagrams, Sequence Diagrams

Testing Tools: HP Mercury Quality Center, QTP

Reporting Tools: Business Objects, Crystal Reports

Others: MS Project, MS Office, MS Visio, DB2, STAR Team, IBM ClearCase, TOAD, SQL*Plus, SQL*Loader, C++, UNIX, PL/SQL, etc.


Confidential, Pittsburgh, PA

Sr. Data Analyst


  • Met with stakeholders and users to review / manage expectations and identify opportunities to improve ROI.
  • Performed extensive data modeling to differentiate between the OLTP and Data Warehouse data models
  • Analyzing and mining business data to identify patterns and correlations among the various data points.
  • Working closely with data mapping SME and QA team to understand the business rules for acceptable data quality standards.
  • Performed data profiling on datasets with millions of rows on Teradata environment, validating key gen elements, ensuring correctness of codes and identifiers, and recommending mapping changes.
  • Wrote complex SQL queries to identify granularity issues and relationships between data sets and created recommended solutions based on analysis of the query results
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Delivered Enterprise Data Governance, Data Quality, Metadata, and ETL Informatica solution
  • Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
  • Interfaced with business users to verify business rules and communicated changes to ETL development team.
  • Creating and executing SQL queries to perform Data Integrity testing on a Teradata Database to validate and test data using TOAD.
  • Worked with data architects team to make appropriate changes to the data models.
  • Worked on the ETL Informatica mappings and other ETL Processes (Data Warehouse)
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Generate ad-hoc or management specific reports using SAS and Excel.
  • Provided multiple demonstrations of Tableau functionality and efficient data visualization approaches to the client's senior management as part of the BRD process.
  • Involved in data validation of Tableau results by querying the database and checking the numbers against the data in the underlying tables.
  • Acquire data from primary or secondary data sources like RedShift, FTP, and external or internal files; extract and analyze data to generate reports.
  • Participated in initial MDM implementations and future rollouts. Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
  • Involved in creating logical and physical data modeling with STAR and SNOWFLAKE schema techniques using Erwin in Data warehouse as well as in Data Mart.
  • Worked in database objects like tables, views, materialized views, procedures and packages using Oracle tools like Toad, PL/SQL Developer and SQL plus.
  • Involved in physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, and Dimensions), Entities, Attributes, OLAP, OLTP, Cardinality, and ER Diagrams.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Developed control files for SQL Loader and PL/SQL programs for loading and validating the data into the Database.
  • Created Stored Procedures in both SQL Server and DB2 and was involved in several DTS packages.
  • Created various transformation procedures using SAS, ETL, and SAS Enterprise Guide.
  • Involved in Data Modeling of both Logical Design and Physical Design of Data Warehouse and datamarts in Star Schema and Snow Flake Schema methodology.
  • Involved in Informatica MDM processes including batch based and real-time processing.
  • Responsible for reviewing data model, database physical design, ETL design, and Presentation layer design.
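The staging-to-warehouse validation queries described above can be sketched as a simple source-to-target reconciliation. This is an illustrative example only; the table and column names (`stg_orders`, `dw_orders`, `amount`) are hypothetical, and SQLite stands in for the Teradata/Oracle environments named in the bullets.

```python
import sqlite3

# Sketch of source-to-target reconciliation: compare row counts and
# measure totals between a staging table and a warehouse table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders  (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (3, 75.25)])
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (3, 75.25)])

# Validate that counts and totals match across the two layers.
src_count, src_total = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM stg_orders").fetchone()
tgt_count, tgt_total = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone()
print(src_count == tgt_count and src_total == tgt_total)  # True when in sync
```

In practice such checks are extended per-column and per-partition, but count-and-sum reconciliation is the usual first gate before deeper profiling.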

Environment: Erwin, Informatica PowerCenter, MDM, ETL, SQL, PL/SQL, DB2, Oracle, Normalization/De-normalization, Star Schema and Snowflake Schema, TOAD, Metadata, SQL Server 2008, etc.

Confidential, Boston, MA

Data Analyst


  • Collaborated with cross-functional teams in support of business case development, identifying modeling methods to provide business solutions; determined the appropriate statistical and analytical methodologies to solve business problems within specific areas of expertise.
  • Built models and solved complex business problems where analysis of situations and/or data required in-depth evaluation of variable factors.
  • Generating Data Models using Erwin and developed relational database system and involved in Logical modeling using the Dimensional Modeling techniques such as Star Schema and Snow Flake Schema.
  • Participate in documenting the data governance framework including processes for governing the identification, collection, and use of data to assure accuracy and validity.
  • Performed metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
  • Involved in creating Informatica mappings to populate staging tables and data warehouse tables from various sources, including flat files and DB2, Netezza, and Oracle sources.
  • Involved in loading data from vendor-submitted flat files into Oracle 12c external tables; extensively used ETL to load data from Oracle databases, XML files, and flat files, and imported data from IBM mainframes.
  • Involved with ETL team to develop Informatica mappings for data extraction and loading the data from source to MDM Hub Landing tables.
  • Involved in MDM Process including data modeling, ETL process, and prepared data mapping documents based on graph requirements.
  • Used Pivot tables, vlookups and conditional formatting to verify data uploaded to proprietary database and online reporting.
  • Performing data management projects and fulfilling ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, MS Access and Excel.
  • Coordinated with Data Architects and Data Modelers to create new schemas and views in Netezza to improve report execution time; worked on creating optimized Data Mart reports.
  • Used both Kimball and Bill Inmon methodologies for creating data warehouse and transformed data from different OLTP systems.
  • Independently worked on owning IT support tasks related to Tableau Reports on Server.
  • Primary subject matter expert in implementing Oracle's self-service OBIEE reporting system. This system allows for real-time data to be accessible at all HR locations throughout the company.
  • Evaluate, identify, and solve process and system issues utilizing business analysis, design best practices and recommended enterprise and Tableau solutions.
  • Used advanced functions like VLookups, Pivots, graphs, and analytical and statistical tool packs in Excel.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Extensively used SQL Server 2014 tools to develop Oracle stored packages, functions and procedures for Oracle database back-end validations and Web application development.
  • Perform analyses such as regression analysis, logistic regression, discriminant analysis, cluster analysis using SAS programming.
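The star-schema modeling described above (fact and dimension tables built with Erwin) can be sketched as a minimal example. The table names (`fact_sales`, `dim_product`) and the data are hypothetical, and SQLite stands in for the Oracle/Netezza environments listed; this is an illustration of the pattern, not any specific project model.

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to one dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, revenue REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Books"), (2, "Toys")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 10.0), (1, 15.0), (2, 7.5)])

# Aggregate fact measures by a dimension attribute -- a typical mart report.
rows = cur.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Books', 25.0), ('Toys', 7.5)]
```

A snowflake variant would further normalize `dim_product` (e.g., splitting category into its own table), trading join depth for reduced redundancy.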

Environment: Erwin, Oracle 12c, SQL Assistant, Netezza, OBIEE, PL/SQL, T-SQL, Tableau, MDM, DataStage, DB2 UDB, Unix, SQL Server 2014, Normalization/De-normalization, Physical & Logical Data Modeling, Star Schema/Snowflake Schema, etc.

Confidential, Philadelphia, PA

Data Analyst


  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Utilized a corporation-developed Agile SDLC methodology; used Scrum Work Pro and Microsoft Office software to perform required job functions.
  • Executed ETL operations for Business Intelligence reporting solutions using Excel.
  • Using Shared Containers and creating reusable components for local and shared use in the ETL process.
  • Provide input for the data governance requirements and standards.
  • Developed and maintained sales reporting using MS Excel queries and SQL in MS Access; produced performance reports and implemented changes for improved reporting.
  • Tested reports created in Business Objects by running the underlying SQL statements.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Designed Excel templates, created Pivot Tables and utilized VLOOKUPs with complex formulas.
  • Provided weekly, monthly & ad hoc web analytics reports using Adobe Site Catalyst & Google Analytics.
  • Analyzed, evaluated, and compared website analytics packages to create a gap analysis and action plan for the IT department's migration process from Site Catalyst (Adobe Analytics) to Google Analytics.
  • Involved in Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
  • Evaluate, identify, and solve process and system issues utilizing business analysis, design best practices and recommended enterprise and Tableau solutions.
  • Substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.
  • Writing PL/SQL procedures for processing business logic in the database. Tuning of SQL queries for better performance.
  • Utilize complex Excel functions such as pivot tables and vlookups to manage large data sets and make information readable for other teams.
  • Developed SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis, and reporting purposes.
  • Data mapping, logical data modelling, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database.
  • Worked on creating dashboard analyses using Tableau Desktop covering car rental records; analysis covered customers, days of rental, service region, travel type, etc.
  • Performed extensive requirement analysis including Data analysis and Gap analysis.
  • Analyzed business requirements and segregated them into high-level and low-level Use Cases and Activity Diagrams using Rational Rose according to UML methodology, thus defining the Data Process Models.
  • Used extracts to better analyze the data, extracted data source was stored in Tableau Server, updated on a daily basis.
  • Designed and implemented basic SQL queries for QA Testing and Report / Data Validation.
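The basic SQL queries for QA testing and data validation mentioned above can be sketched as simple integrity checks: counting NULL keys and duplicate keys. This is an illustrative example; the table name (`customers`) and data are hypothetical, with SQLite standing in for the Oracle/SQL Server databases listed.

```python
import sqlite3

# Sketch of QA data-validation queries: flag NULL keys and duplicate keys.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (2, "Bob"), (None, "Eve")])

# Count rows whose key is missing entirely.
null_keys = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE cust_id IS NULL").fetchone()[0]

# Count keys that appear more than once.
dup_keys = cur.execute("""
    SELECT COUNT(*) FROM (
        SELECT cust_id FROM customers
        WHERE cust_id IS NOT NULL
        GROUP BY cust_id HAVING COUNT(*) > 1
    )
""").fetchall()[0][0]
print(null_keys, dup_keys)  # 1 1
```

Checks like these are typically run against both source and target after each load, with any nonzero result raised as a data-quality defect.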

Environment: Toad, MS Visio, OLTP, OLAP, Business Objects, Tableau, Informatica, Erwin, Cognos, Crystal Reports, Oracle, SSRS, SQL Server, MS-SQL, MS Access, HTML, DHTML, XML

Confidential, Kansas City, MO

System Analyst


  • Identify business, functional and Technical requirements through meetings and interviews and JAD sessions.
  • Define the ETL mapping specification and design the ETL process to source the data from source systems and load it into DWH tables.
  • Designed the logical and physical schema for data marts and integrated the legacy system data into data marts.
  • Integrated DataStage metadata into Informatica metadata and created ETL mappings and workflows.
  • Designed mappings and identified and resolved performance bottlenecks in Source-to-Target mappings.
  • Developed Mappings using Source Qualifier, Expression, Filter, Look up, Update Strategy, Sorter, Joiner, Normalizer and Router transformations.
  • Involved in writing, testing, and implementing triggers, stored procedures and functions at Database level using PL/SQL.
  • Developed Stored Procedures to test ETL Load per batch and provided performance optimized solution to eliminate duplicate records.
  • Involved with the team on ETL design and development best practices.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Conducted project related presentations and provided reports to the project sponsors.
  • Created/ Tuned PL/SQL Procedures, SQL queries for Data Validation and for various data profiling activities for current system and performed data analysis.
  • Served as the primary contact for business concerns. Conducted training sessions educating team members on use of requirement/defect management tools. Implemented project milestones, and monitored timelines.
  • Worked closely with the QA team; documented errors and tracked them to completion by communicating and coordinating with the development, business, and testing groups.
  • Worked on the Snow-flaking the Dimensions to remove redundancy.
  • Followed standards and best practices in indexing and partitioning where performance can be a concern.
  • Supported ETL team in understanding the process and creating/enhancing the ETL maps.
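The duplicate-record elimination described above (stored procedures that dedupe each ETL load batch) can be sketched minimally as follows. The table and column names (`load_batch`, `target`, `emp_id`) are hypothetical, and SQLite stands in for the Oracle/SQL Server environment; this shows the pattern, not the actual procedure.

```python
import sqlite3

# Sketch of duplicate elimination during an ETL load: keep one row per
# business key when materializing the batch into the target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE load_batch (emp_id INTEGER, salary REAL)")
cur.executemany("INSERT INTO load_batch VALUES (?, ?)",
                [(10, 500.0), (10, 500.0), (20, 700.0)])

# GROUP BY the business key so each emp_id survives exactly once
# (here the duplicate rows are identical, so any survivor is correct).
cur.execute("""
    CREATE TABLE target AS
    SELECT emp_id, salary FROM load_batch
    GROUP BY emp_id
""")
count = cur.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 2
```

When duplicates differ by a timestamp, a window function such as `ROW_NUMBER() OVER (PARTITION BY key ORDER BY load_ts DESC)` is the usual way to pick the latest survivor.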

Environment: Oracle, SQL Server, MS Excel, PL/SQL, T-SQL, Crystal Reports, RequisitePro, ClearQuest, HTML, UML, Agile Methodology, MS Office, MS Visio, MS Project, and SQL.
