
Sr. Data Analyst Resume

Chicago, IL


  • Over 7 years of experience in the IT industry, with expertise in Data Modeling, Data Analysis and Design.
  • Dimensional Data Modeling experience using ER/Studio, Erwin, Multidimensional Data Modeling Schema - Star Join Schema, Snowflake modeling, Fact & Dimension tables, Conceptual, Physical & Logical Data modeling.
  • Thorough knowledge of the Ralph Kimball and Bill Inmon Data Warehouse Methodologies. Experienced in building ODS, EDW and Staging layers and maintaining a metadata repository.
  • Expert in data analysis, design, development, implementation, and design/code reviews for Extraction, Transformation and Loading (ETL).
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
  • Strongly capable of handling Very Large Databases (VLDBs) of about 15 TB, with expert-level working knowledge of the architecture involved.
  • Expertise in Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
  • Highly proficient in Data Modeling, with a strong grounding in RDBMS concepts, Logical and Physical Data Modeling up to Third Normal Form (3NF), and Multidimensional Data Modeling Schemas (Star schema, Snowflake modeling, Facts and Dimensions).
  • Expert at Full SDLC processes involving Requirements Gathering, Source Data Analysis, Creating Data Models, and Source to target data mapping, DDL generation, performance tuning for data models.
  • Experience in using the Informatica command line utility to schedule and control sessions and batches.
  • Extensive Knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Mart and Power Center.
  • Experience in identifying and resolving bottlenecks in Sources, Targets, Transformations, Mappings and Sessions for better performance.
  • Strong working knowledge and background in multiple databases such as Oracle, SQL Server, DB2, and Sybase.
  • Strong experience in writing SQL and PL/SQL, Transact SQL programs for Stored Procedures, Triggers and Functions.
  • Responsible for interacting with business users to identify information needs and business requirements for reports.
  • Created complex and sophisticated Business Objects reports with multiple data providers, drill-down, and slice-and-dice features.
  • Experience in retrieving data from Business Objects Universes, Personal Data Files, Stored Procedures and Free Hand SQL.
  • Experience in creating Business Objects & Web Intelligence reports from various sources such as Oracle, Teradata, MS SQL Server, Access and Sybase.
  • Excellent team member with interpersonal and communication skills, highly motivated, result oriented with strong analytical, organizational, presentation and problem solving skills.
  • Expert in developing effective working relationships with client teams to understand support requirements, developing tactical and strategic plans to implement technology solutions, and effectively managing client expectations.


Data Modeling: ER/Studio 10.0/9.0/7.6.1/7.5.1/7.0, Erwin 9.64/9.6/9.5/9.2/9.1/9.0/8.2/7.3/7.0/6.5, OBIEE 11g

ETL Tools: Informatica 9.6/8.6/7.1.4/6.0/5.0, SSIS, Data Services 3.2, DataStage 8.5

Databases: Teradata 15.0/14.0/13.0/12, Oracle 12c/11g/11i/10g/9i/8i/7.x, SQL Server 2016/2014/2012/2008/2005/2000, DB2 UDB 9.x/8.x/7.x, Sybase 12/11

Programming Skills: SQL, PL/SQL, T-SQL, C, C++, C#, HTML, DHTML, XML

Operating Systems: Windows 2000/XP/7/8/10, Linux

Tools: TOAD 8.5/7.4/7.3, MS Office, MS Visio, UltraEdit, SQL Developer, Microsoft Visual Studio 2013/2010

Reporting Tools: Business Objects XI R3 (3.1)/XI R2, SSRS, Cognos, SSAS, PowerPlay, Transformer


Confidential, Chicago, IL

Sr. Data Analyst


  • Interacted with clients to gather business and system requirements which involved documentation of processes based on the user requirements.
  • Analyzed the pertinent client data and worked with the Analytics team to inspect, cleanse, transform and model data and provide valuable feedback to the client.
  • Performed Data Analysis using visualization tools such as Tableau, Spotfire, and SharePoint to provide insights into the data.
  • Implemented Multidimensional and Tabular cubes to perform interactive analysis.
  • Configured Azure platform offerings for web applications and business intelligence using Power BI, Azure Data Factory, etc.
  • Conceptualized and designed the Extraction, Transformation and Loading of client data from multiple sources using SQL Server Integration Services (SSIS).
  • Formulated SQL stored procedures.
  • Assisted in mining data from the SQL database that was used in several significant presentations.
  • Assisted in offering support to other personnel who were required to access and analyze the SQL database.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Responsible for development of workflow analysis, requirement gathering, data governance, data management and data loading.
  • Performed root cause analysis of data discrepancies between different business systems by examining business rules and the data model, and provided the analysis to the development/bug-fix team.
  • Evaluated existing practices of storing and handling important financial data for compliance; ensured corporate compliance with all billing and credit standards, with direct responsibility for accounts receivable and supervision of accounts payable.
  • Set up data governance touch points with key teams to ensure data issues were addressed promptly.
  • Responsible for facilitating UAT (User Acceptance Testing), PPV (Post Production Validation) and maintaining Metadata and Data dictionary.
  • Responsible for source data cleansing, analysis and reporting using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation in Excel.
  • Actively involved in data modeling for the QRM Mortgage Application migration to Teradata and developed the dimensional model.
  • Created views for reporting purposes, involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses and outer joins, as per the functional needs in the Business Requirements Document (BRD).
  • Conducted data cleaning, manipulation, modification and combination using variety of SAS steps and functions. Analyzed, tested, documented and maintained SAS programs and macros to generate SAS datasets, spreadsheets, data listing, tables and reports.
  • Responsible for generating financial business reports using SAS Business Intelligence tools (SAS/BI) and developed ad-hoc reports using SAS Enterprise Guide.

Environment: Agile, Teradata, BTEQ, Fast Load, Fast Export, Multiload, Oracle, Unix Shell Scripts, SQL Server 2005/08, SAS, PROC SQL, MS Office Tools, MS Project, Windows XP, MS Access, Pivot Tables
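Data profiling of the kind described above typically boils down to row-count, null-count and distinct-count queries against each source table. A minimal, self-contained sketch using Python's built-in sqlite3 as a stand-in (the table and columns are hypothetical; the actual work was on Oracle and Teradata):

```python
import sqlite3

# In-memory database standing in for a real source system (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Ann", "ann@x.com"), (2, "Bob", None), (3, "Bob", None)],
)

# A typical profiling query: row count, null count and distinct count per column.
row_count, null_emails, distinct_names = conn.execute(
    """
    SELECT COUNT(*),
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END),
           COUNT(DISTINCT name)
    FROM customers
    """
).fetchone()

print(row_count, null_emails, distinct_names)  # 3 2 2
```

The same CASE/COUNT pattern scales to one query per table, feeding a profiling report that flags columns with unexpected nulls or low cardinality.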

Confidential, San Francisco, CA

Data/ Business Analyst


  • Worked with Business analysts for requirements gathering, business analysis, testing, metrics and project coordination.
  • Involved in gathering requirements from business users and created Multidimensional model using Erwin.
  • Used Erwin for logical and physical database modeling, applying the Ralph Kimball and Bill Inmon Data Warehouse Methodologies.
  • Analyzed OLTP source tables and built dimension and fact tables accordingly.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Created multiple dimension and fact tables, using concepts such as degenerate dimensions, sub-dimensions, factless fact tables and aggregate fact tables in the multidimensional model and Snowflake schema.
  • Conducted ETL design review meetings to obtain approval from cross-functional teams.
  • Organized and managed meetings between clients, third-party vendors, development teams and analysts on a scheduled basis.
  • Gathered user requirements and translated them into functional requirements/technical specifications for automation.
  • Involved in the Extraction, Transformation and Load process (ETL) for a Data Warehouse from their legacy systems using Informatica.
  • Worked on multiple disparate Data sources ranging from flat files, XML Files, Oracle, SQL server, DB2 databases to load the data into Heterogeneous targets such as XML files and Oracle Database.
  • Created High Level Design & Architectural document, Detail Level Design document, Unit test plan as a part of SDLC Process.
  • Created Reusable Transformations, Mapplets and used them in Mappings to develop the business logic for transforming source data into target.
  • Extensively used transformations such as Lookup, Update Strategy, Sorter, Expression and Aggregator.
  • Extensively used Informatica Workflow Manager and Workflow Monitor for creating and monitoring workflows, worklets and sessions.
  • Created test data and unit test cases to ensure a successful data loading process.
  • Scheduled reports to run on a monthly, weekly, and daily basis using Cognos Connection Scheduler.
  • Involved in transferring knowledge about the system to the Production Support team.
  • Supported the system for two months after it went live into production and helped the Production Support team come up to speed.
  • Helped create Production Support manuals.

Environment: Oracle 11i, Erwin 9.0, Microsoft Visual SourceSafe, SSIS, SQL Server 2016, Teradata 15.0, TFS, T-SQL, PL/SQL, TOAD 8.5, OBIEE 11g, Informatica PowerCenter 8.5.1, Microsoft Visual Studio 2013, Cognos.
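The dimension/fact work described above follows the standard star-schema load pattern: resolve each incoming transaction's natural keys to dimension surrogate keys, then emit a fact row holding only keys and additive measures. A simplified Python sketch (the table and field names are illustrative, not from the actual project):

```python
# Dimension lookups: natural key -> surrogate key (illustrative data).
dim_customer = {"CUST-001": 1, "CUST-002": 2}
dim_product = {"SKU-9": 10, "SKU-7": 11}

def load_fact(transactions):
    """Resolve natural keys to surrogate keys and build fact rows."""
    fact_sales = []
    for txn in transactions:
        fact_sales.append({
            "customer_key": dim_customer[txn["customer_id"]],
            "product_key": dim_product[txn["sku"]],
            "amount": txn["amount"],  # additive measure at transaction grain
        })
    return fact_sales

rows = load_fact([{"customer_id": "CUST-001", "sku": "SKU-7", "amount": 25.0}])
print(rows[0])  # {'customer_key': 1, 'product_key': 11, 'amount': 25.0}
```

In a real ETL tool this key resolution is what the Lookup transformation does; the point of surrogate keys is that the fact table stays stable even when dimension attributes change.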

Confidential, Plano, TX

Data Analyst


  • Gathered and analyzed requirements from the end users and converted functional specifications to technical specifications documents.
  • Reverse-Engineered numerous existing database schemas to get an understanding of existing database architecture.
  • Created a data model from scratch in a complete life cycle implementation that encompassed business requirement gathering, source system analysis, and building a financial data mart to support reporting.
  • Created conceptual, logical & physical data models with Erwin and did design review meetings with other team members in order to finalize the model.
  • Implemented advanced modeling concepts such as Bus Matrix Architecture, resolving multi-valued dimensions, and conformed, junk and degenerate dimensions in designing the multidimensional model.
  • Created and developed Slowly Changing Dimension tables (SCD Type 2 and Type 3) to facilitate maintenance of history.
  • Worked with the Data Governance, Data Profiling and Data Quality teams to maintain data consistency and quality, which required managing master data from all business units as well as IT and ensuring data quality standards across the enterprise.
  • Developed and maintained enterprise data dictionary by interacting with data steward, data governance team.
  • Divided model into subject areas for reflecting understandable view to business as well as data model reviewers.
  • Created DDL for physical model using Erwin and handed over to the DBA team.
  • Experienced in writing PL/SQL ETL routines in data warehousing applications; involved in database design and data modeling.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created complex mappings involving Slowly Changing Dimensions to implement business logic and capture records deleted in the source systems.
  • Created a number of reports for Credit, Sales, Document Funding, application Security etc. These reports used Crystal formulas, Parameters, Selection criteria, Sub reports, Graphical Representations etc.
  • Worked efficiently even under tight deadlines.

Environment: Oracle 11i, ER/Studio 9.0, Microsoft Visual SourceSafe, SSIS, SQL Server 2008, T-SQL, PL/SQL, TOAD 8.5, Informatica PowerCenter 8.5.1, Cognos.
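The SCD Type 2 work mentioned above follows a standard pattern: when a tracked attribute changes, expire the current dimension row and insert a new versioned row, so history is preserved. A minimal sketch in Python (the column names and change-detection rule are assumptions, not the project's actual design):

```python
from datetime import date

def apply_scd2(dimension, key, new_attrs, today):
    """Expire the current row for `key` and append a new current row
    when any tracked attribute changed (SCD Type 2)."""
    current = next(
        (r for r in dimension if r["key"] == key and r["end_date"] is None), None
    )
    if current is not None and all(current[k] == v for k, v in new_attrs.items()):
        return  # no change: nothing to do
    if current is not None:
        current["end_date"] = today  # expire the old version
    dimension.append({"key": key, **new_attrs, "start_date": today, "end_date": None})

dim = [{"key": 42, "city": "Plano", "start_date": date(2010, 1, 1), "end_date": None}]
apply_scd2(dim, 42, {"city": "Dallas"}, date(2012, 6, 1))
print(len(dim))            # 2: the expired row plus the new current row
print(dim[0]["end_date"])  # 2012-06-01
```

SCD Type 3, also listed above, instead keeps history in extra columns (e.g. `previous_city`) on the same row, trading full history for a simpler table.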

Confidential, OH

Data Analyst


  • Performed daily validation of business data reports by querying databases, and reran missing business events before the close of the business day.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments; developed working documents to support findings and assign specific tasks.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Gathered Business requirements by interacting with the business users, defined subject areas for analytical data requirements.
  • Created datasets from flat files using SAS PROC SQL.
  • Optimized complex queries for data retrieval from very large databases.
  • Performed root cause analysis of data discrepancies between different business systems by examining business rules and the data model, and provided the analysis to the development/bug-fix team.
  • Led the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.
  • Conducted downstream analysis of the tables involved in data discrepancies and arrived at solutions to resolve them.
  • Performed extensive data mining of the attributes involved in business tables and provided consolidated analysis reports and resolutions on a regular basis.
  • Performed data analysis and data profiling using complex SQL.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Verified data quality after every deployment and performed extensive analysis of data variance pre- and post-implementation.
  • Utilized data to prepare and conduct quality control with SAS.
  • Provided programming parameters to technical staff for all data extracts.
  • Partnered with internal teams to prepare analytics and validate results with SAS.
  • Conducted data cleaning, manipulation, modification and combination using a variety of SAS steps.
  • Developed and maintained MS Access database.
  • Created an MS Access database to collect data from multiple systems in order to provide lists, status reports and management overview reports.
  • Managed all data related aspects of multiple directory projects through various stages of production.
  • Defined the periodic extraction and formatting of data from multiple sources, and laid out the basic queries and reports required by the customer.
  • Analyzed Critical Data Elements (CDEs) in business processes, data conversion, data movement and data integrity checks before delivering data to operations and financial analysts for uploading to databases, in accordance with IM policy compliance.
  • Extensively involved in User Acceptance Testing (UAT) and Regression testing.
  • Involved in Data Reconciliation Process while testing loaded data with user reports.
  • Documented all custom and system modifications.
  • Worked with offshore and other environment teams to support their activities.
  • Responsible for deployment on test environments and supporting business users during User Acceptance testing (UAT).

Environment: DataStage 8.1, Oracle 10g, DB2, Sybase, TOAD, SQL Server 2008, TSYS Mainframe, SAS PROC SQL, SQL, PL/SQL, ALM/Quality Center 11, QTP 10, UNIX, Shell Scripting, XML, XSLT.
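The data reconciliation testing described above usually amounts to comparing row counts and control totals between source extracts and loaded targets. A hedged sketch of such a check in Python (the function and field names are hypothetical, not from the actual project):

```python
def reconcile(source_rows, target_rows, amount_field="amount"):
    """Compare row counts and a control total between source and target.

    Returns a dict of pass/fail flags; rounding avoids spurious
    floating-point mismatches on monetary totals.
    """
    source_total = round(sum(r[amount_field] for r in source_rows), 2)
    target_total = round(sum(r[amount_field] for r in target_rows), 2)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "total_match": source_total == target_total,
    }

src = [{"amount": 10.0}, {"amount": 5.5}]
tgt = [{"amount": 10.0}, {"amount": 5.5}]
print(reconcile(src, tgt))  # {'row_count_match': True, 'total_match': True}
```

In practice the same counts and totals are produced as SQL aggregates on both sides (or PROC SQL summaries on the SAS side) and compared in a nightly report.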
