
Senior Data Analyst Resume


Minneapolis, MN

SUMMARY

  • Sr. Data Analyst with 7+ years of experience in data modeling, data analysis, and analyzing business operations; specialized in finding patterns, gathering requirements, root cause analysis, decision making, and developing user stories from large, complex data.
  • Proficient at translating IT user requirements into specific database capabilities.
  • Experience developing advanced, complex SQL Server queries with stored procedures, triggers, views, and cursors.
  • Expert in working with various relational database management systems (RDBMS) such as SQL Server and Oracle, and in writing DML, DDL, and DQL commands in SQL Server and PL/SQL.
  • Well versed in system analysis, entity-relationship (ER) and dimensional modeling, database design, and implementing RDBMS-specific features.
  • Built data integration, workflow, and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS), with data warehousing techniques such as slowly changing dimensions and snowflaking.
  • Experience in Tableau visual analytics, generating reports with visualizations such as bar, line, and pie charts, scatter plots, heat maps, and bubble charts according to end-user requirements.
  • Extensive knowledge of normalization/de-normalization techniques for effective and optimal performance in Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) environments, developing database schemas such as star and snowflake schemas in relational, dimensional, and multidimensional modeling.
  • Designed logical and physical data models using the Erwin tool.
  • Knowledge of UNIX shell scripting, Python, NoSQL, and tools such as Informatica 8.x/9.x.
  • Ability to prepare detailed documentation of procedures and specifications for data models to be developed.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Mapping, Data Munging and identifying data mismatch.
  • Experience in managing projects using software development lifecycle (SDLC) methodologies such as Agile and waterfall.
  • Good at logical and critical thinking - an analyst by title and subject matter expert by role, a smart, hard worker by nature, with a flexible, outgoing personality and strong technical knowledge from a software development background.
  • A quick, self-directed learner with a can-do attitude; constantly keeping up with new versions of existing technologies and with new technologies and methodologies, and effectively using third-party or proprietary tools to reduce product delivery time.
  • Excellent verbal and written communication skills; an individual contributor who meets deadlines with great accuracy in fast-paced environments.
  • Flexible working on a team or individually; have handled projects from scratch, worked on live projects in sync with existing team members, and implemented enhancements. Never-ending enthusiasm for learning new tools and technologies makes me a quick learner, adaptable to any work environment.
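The star/snowflake modeling summarized above can be sketched minimally in code. This is an illustrative sketch only - the `dim_product`/`fact_sales` tables and their columns are hypothetical names, not taken from any actual project:

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to a single dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,   -- surrogate key
        product_name TEXT,
        category TEXT
    );
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        amount REAL
    );
""")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (1, 1, 3, 29.97)")

# Typical star-schema query: join the fact to its dimension and aggregate.
cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""")
rows = cur.fetchall()
print(rows)  # [('Hardware', 29.97)]
```

In a snowflake variant, `dim_product` would itself be normalized (e.g., `category` split into its own table), trading query simplicity for reduced redundancy.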

TECHNICAL SKILLS

Operating Systems: Windows XP/2003/2007, Windows Server 2003/2008, UNIX/LINUX

Data Warehousing Tools: Informatica 9.x/8.x, Data Modeling Star-Schema, Snowflake, Fact and Dimension tables, Erwin r7.x/8.x/9.x, Visio

Databases: MS SQL Server 2008/2008R2/2012, Oracle 12c/11g/10g/9i/8i/8.0

Computer Skills: MS Office - Excel, Word, Access, Outlook, Publisher, PowerPoint, SharePoint

Languages: SQL, PL/SQL, HTML, C++, Java, R, Python

Methodologies: Software Development Life Cycle, Waterfall, Agile, Scrum

PROFESSIONAL EXPERIENCE

Senior Data Analyst

Confidential, Minneapolis, MN

Responsibilities:

  • Participated in all phases of the data warehouse life cycle - analysis, design, development, and testing - as part of the Data Quality design framework; developed extraction and loading using Informatica.
  • Prepared SQL & PL/SQL Queries to validate the data in both source and target databases.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
  • Developed process for capturing and maintaining metadata from all data repository components.
  • Solely responsible for designing and migrating the Data quality framework from composite to Informatica.
  • Used various transformations like Source Qualifier, Expression, Normalizer, Aggregator, and Filter for Designing and optimizing the Mapping.
  • Performed pharmacy claims data extraction and analysis to produce reports evaluating the potential financial impact of implementing medical policy claim edits.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Designed entity relationship diagrams and multidimensional data models, reports and diagrams for marketing.
  • Worked extensively on Data Profiling, Data cleansing, Data Mapping and Data Quality.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Collaborated with data architects for data modeling management and control visions by conducting data model reviews with project team members.
  • Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
  • Independently performed complex troubleshooting, root-cause analysis, and solution development.
  • Worked with upper management to identify and brainstorm solutions for bottlenecks and errors of ongoing reports.
  • Extracted data from various database sources like Oracle, DB2, SQL Server using Informatica to load the data into a single data warehouse repository.
  • Utilized a corporation-developed Agile SDLC methodology, using Jira and Microsoft Office software to perform required job functions.
  • Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Designed Logical and Physical modeling, Dimensional modeling using Erwin in data warehouse.
  • Prepared test data sets and performed data testing using PL/SQL scripts; also used MS Excel for data mining, data cleansing, data mapping, data dictionaries, and data analysis.
  • Created and optimized RDBMS objects (tables, materialized views, synonyms, indexes, triggers, procedures, functions) and Discoverer reports. Development of dashboard in Tableau for data visualization to expedite the data analysis process and report automation.
  • Reviewed and created conceptual model for Enterprise Data Warehouse with business users/clients.
  • Played an important role on an Agile team, participating in sprints on Scrum-based projects with onsite and offshore team members.

Environment: Agile, Erwin, PL/SQL, RDBMS, SDLC, MS Office, enterprise data warehouse, Jira, data mapping, data cleansing.
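One common form of the source-to-target validation described in this role is a row-count reconciliation plus an anti-join for missing keys. This is a minimal sketch with hypothetical `src_claims`/`tgt_claims` tables, not the actual queries used on the project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical staging tables standing in for source and target databases.
cur.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL);
    INSERT INTO src_claims VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO tgt_claims VALUES (1, 100.0), (2, 250.0);
""")

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_claims").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_claims").fetchone()[0]

# Anti-join: rows present in source but missing from target.
missing = cur.execute("""
    SELECT s.claim_id FROM src_claims s
    LEFT JOIN tgt_claims t ON s.claim_id = t.claim_id
    WHERE t.claim_id IS NULL
    ORDER BY s.claim_id
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3,)]
```

A production version would typically also compare column checksums or aggregates (e.g., `SUM(amount)`) per partition, not just counts.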

Senior Data Management Analyst

Confidential, Mounds View, MN

Responsibilities:

  • Reviewed normalized/de-normalized schemas for effective, optimal performance when tuning queries and validating data in OLTP and OLAP environments.
  • Extensively involved in Performance Tuning for ETL process at source, target, Informatica Mappings, Sessions and at database level for optimum performance.
  • Designed & created OLAP Cubes with Star schema using SSAS.
  • Built data integration, workflow, and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
  • Worked on creating DDL, DML scripts for the data models.
  • Primary focus was administration and support of warehouse management and order fulfillment applications written in Progress RDBMS.
  • Utilized Agile/Scrum and PMI methodologies to monitor, steer, and develop project objectives.
  • Created various data mapping repository documents as part of metadata services.
  • Provided subject matter expertise to Compliance on expected data behaviors, trends, and patterns, as well as the meaning of data fields and values.
  • Developed entity-relationship diagrams and modeled transactional databases and the data warehouse using tools such as Erwin 7.x.
  • Ensured data warehouse and data mart designs efficiently supported BI and end users.
  • Created and maintained all data mappings and metadata; resolved all data quality issues.
  • Performed requirements gathering and analysis including data analysis, gap analysis (AS-IS to TO-BE), & documentation for end users.
  • Designed & developed various Ad hoc reports for different teams in Business (Teradata and Oracle SQL, MS ACCESS, MS EXCEL).
  • Created dimensional logical model with various facts, dimensions with attributes combination using ER studio.
  • Responsible for data design, metadata and assisting with the data integration layer's repository creation.
  • Developed slowly changing dimension schemes - type 1 and type 3 - for most of the dimensions.
  • Designed the business requirement collection approach based on the project scope and SDLC methodology.
  • Reviewed the Business Requirement Document and Business Process Flow Diagram.
  • Created logical and physical data models using best practices to ensure high data quality and reduced redundancy.
  • Reviewed various documents, including the Business Requirement Document (BRD) and Functional System Design (FSD).
  • Created UML diagrams, including context diagrams, business rules flows, and class diagrams.

Environment: Informatica, Tableau, Microsoft Office, Oracle, SSRS, SQL Server, MS Excel, Teradata, SCD, Agile.
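The slowly-changing-dimension work above (type 1 and type 3) reduces to a simple update pattern. A minimal sketch with a hypothetical `dim_customer` table - illustrative only, not project code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical dimension with a type 3 "previous value" column.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        city TEXT,           -- type 1: overwritten in place
        prev_city TEXT       -- type 3: holds one level of history
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Minneapolis', NULL)")

# Incoming change: the customer moved. Type 3 shifts the old value into
# prev_city in the same statement that does the type 1 overwrite
# (SET expressions on the right-hand side see the pre-update row).
cur.execute("""
    UPDATE dim_customer
    SET prev_city = city, city = ?
    WHERE customer_key = ?
""", ("St. Paul", 1))

row = cur.execute("SELECT city, prev_city FROM dim_customer").fetchone()
print(row)  # ('St. Paul', 'Minneapolis')
```

Type 1 alone would simply overwrite `city` and lose history; a type 2 design would instead insert a new row with effective dates.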

Data Analyst

Confidential

Responsibilities:

  • Worked with the analysis teams and management teams and supported them by providing variables based on their requirements.
  • Generated PL/SQL scripts for data manipulation, validation and materialized views for remote instances.
  • Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
  • Created and modified several database objects such as Tables, Views, Indexes, Constraints, Stored procedures, Packages, Functions and Triggers using SQL Server.
  • Analyzed existing logical data model and made the necessary changes to make it compatible with business requirements.
  • Requirements, Analysis and Design - Involved in data analysis, preparation of mapping documents, procuring sign-offs on ETL design.
  • Wrote UNIX shell scripts to automate ETL and various other processes.
  • Developed robust functional and technical ETL designs for the ETL solution.
  • Assisted with task identification and effort estimates for ETL development per the Agile development methodology.
  • Led and mentored the offshore team in designing, developing, and delivering ETL tasks.
  • Identified the entities and relationship between the entities to develop Conceptual Model using Erwin 7.x/8.x.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
  • Developed live reports in a drill down mode to facilitate usability and enhance user interaction.
  • Queried data from Hadoop/Hive and MySQL sources to build visualizations in Tableau.
  • Facilitated the automation of the Delinquency Report, which was required to run monthly.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop. Involved in creating Tree Map, Heat maps and background maps.
  • Involved in generating dual-axis bar chart, Pie chart and Bubble chart with multiple measures and data blending in case of merging different sources.
  • Developed storytelling dashboards in Tableau Desktop and published them on to Tableau Server which allowed end users to understand the data on the fly with the usage of quick filters for on demand needed information.
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
  • Tested dashboards to ensure data matched business requirements and to detect any changes in the underlying data.
  • Created reports using the analysis output and published them to the web so customers could access them over the Internet.

Environment: Microsoft SQL Server 2012, SSRS, SSIS, RDBMS, MySQL, Tableau, shell scripting.

Business Integrity Analyst

Confidential, San Jose, CA

Responsibilities:

  • Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary terms as they evolved and changed during the project.
  • Coordinated with DBA on database build and table normalizations and de-normalizations.
  • Created, documented and maintained logical & physical database models.
  • Created data mappings, tech designs, and loading strategies for ETL to load newly created and existing tables.
  • Performed in-depth data analysis, loading customer details from the data warehouse to generate comprehensive reports for decision makers and others affected by the results.
  • Created schema objects such as indexes, views, sequences, triggers, grants, roles, and snapshots.
  • Worked closely with the data modeling team to create the logical model for the EDW, with approximately 75 entities and 1,000 attributes, using Erwin 7.x.
  • Developed strategies and loading techniques for better loading and faster query performance.
  • Extensively worked on documentation of Data Model, Mapping, Transformations and Scheduling batch jobs.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Generated DDL using forward engineering; worked on merging and complete compares of physical models.
  • Used the Data Warehousing Life Cycle to identify data elements from the source systems, performed data analysis to come up with data cleansing and integration rules for the ETL process.
  • Documented requirements with stakeholders, and partner with People Insights and People Analytics on broader changes to core dashboards and solutions.
  • Developed standard operating procedures to automate regular patterns, and conducted in-depth investigations leveraging large volumes of data.
  • Worked in 2-week sprints and assisted in managing the resources during development, testing and documented the process and updates of the project as well.
  • Created data reports and visualizations in Tableau and forecasted the beneficial project contracts.
  • Tracked defects/issues and escalated them from time to time on JIRA.
  • Conducted Data Analysis and trend analysis on the customer issues and behavior based on the historical data provided and data extracted from the wall board.
  • Worked with Scrum Master and PO in prioritizing the PBI and putting them out to the Scrum teams.

Environment: RUP, MS Office, Tableau, MS SQL Server, Informatica, SSRS, SSIS, SSAS, PL/SQL, JIRA, data modeling (star and snowflake schemas), Erwin.
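The data cleansing and integration rules referenced above typically normalize whitespace, standardize null markers, and de-duplicate on a key. A minimal, hypothetical sketch of such rules (the marker set and column names are illustrative assumptions):

```python
# Minimal sketch of typical data-cleansing rules: trim whitespace,
# map null-marker strings to None, and de-duplicate on a key column.
NULL_MARKERS = {"", "N/A", "NULL", "-"}  # hypothetical marker set

def clean_row(row):
    """Trim string values and convert null markers to None."""
    out = {}
    for col, val in row.items():
        if isinstance(val, str):
            val = val.strip()
            if val.upper() in NULL_MARKERS:
                val = None
        out[col] = val
    return out

def dedupe(rows, key):
    """Keep the first row seen for each distinct key value."""
    seen, result = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            result.append(row)
    return result

raw = [
    {"id": 1, "city": "  Minneapolis "},
    {"id": 2, "city": "N/A"},
    {"id": 1, "city": "Minneapolis"},   # duplicate key, dropped
]
cleaned = dedupe([clean_row(r) for r in raw], key="id")
print(cleaned)  # [{'id': 1, 'city': 'Minneapolis'}, {'id': 2, 'city': None}]
```

In an ETL tool these rules would live in transformations rather than Python, but the logic is the same.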

Data Analyst

Confidential

Responsibilities:

  • Created System Sequence Diagram to depict the interaction or flow of transactions between the entities.
  • Managed deployment of complex software releases with dependencies among different implementations and engagement of multiple release managers.
  • Worked with the UX team to update/create mockups of the pages.
  • Worked with Analytics Team to engage Data Analytics in the project.
  • Developed workflow diagrams, activity diagrams, sequence diagrams, and use cases covering all business needs using MS Visio.
  • Worked with engineering and external partners to generate, capture, and ingest source data; also worked with data engineering to build ETLs that transform the data into manageable nuggets.
  • Built reports and performed analyses to draw insights, and worked with different parts of the organization to implement changes based on those insights.
  • Analyzed and determined risks in portfolio decisions, forecasting potential losses.
  • Learned Amazon data structures (MySQL, Oracle).
  • Wrote sophisticated and optimized SQL queries to extract large data and built complex models.
  • Defined and created procedures to install Tableau desktop in silent mode.
  • Generated DDLs and made them available to the DBA for execution.
  • Empowered productivity improvements and data sharing throughout a major banking enterprise by using Erwin r7.x for effective model management.
  • Served as an SME and worked with other SMEs to identify solutions for reporting issues found while comparing existing data models.
  • Utilized Agile/Scrum and PMI methodologies to monitor, steer, and develop project objectives.
  • Analyzed large volumes of transactional, historical, and streaming data to support business development and problem solving.
  • Gathered requirements through meetings, surveys and got them approved by the product manager.
  • Predicted future trends from current developments in the market and recommended strategic plans to management accordingly.
  • Wrote and tuned SQL scripts to test the flow of online quotations to the database and verify the data.
  • Facilitated User Acceptance Testing with business stakeholders to ensure all requirements have been met.
  • Identified and documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Facilitated meetings with business users, data architects, data modelers, business analysts, QA and multiple delivery teams to define the data quality and profiling requirements.
  • Used SDLC (Software Development Life Cycle) methodologies such as Agile.

Environment: Oracle, MySQL, SSIS, SSAS, MDX, Informatica, Erwin, Windows, UNIX, Excel, Agile

Data Analyst

Confidential

Responsibilities:

  • Responsible for defining the project scope and business rules, gathering business requirements, and documenting them textually or using models. Interacted with cross-functional teams to facilitate requirements gathering.
  • Assisted in documenting business requirements, technical specifications and implementation of various ETL standards in the mappings.
  • Created data flow diagrams, data mapping from Source to stage and Stage to Target mapping documents indicating the source tables, columns, data types, transformations required and business rules to be applied.
  • Implemented CDC (change data capture) for inserting and updating slowly changing dimension tables in target for maintaining the history data.
  • Created large datasets by combining individual datasets using various inner and outer joins in SQL, along with dataset sorting and merging techniques using Base.
  • Validated and manipulated data within Oracle using PL/SQL; wrote PL/SQL queries to perform data analysis.
  • Utilized a corporation-developed Agile SDLC methodology, using Jira and Microsoft Office software to perform required job functions. Maintained the requirements traceability matrix throughout the project. Prepared test data sets and performed data testing using PL/SQL scripts; also used MS Excel for data mining, data cleansing, data mapping, data dictionaries, and data analysis.
  • Worked with data warehouse concepts for data cleaning, data integration, data transformation, and periodic data refreshes.
  • Involved in project cycle planning for the data warehouse: source data analysis, data extraction, transformation, and ETL loading strategy design. Worked with the data warehouse team and generated reports using the Oracle BI tool.
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • Created a thorough data analysis document to gather accurate data for the application rewrite and to eliminate bad data. Ran SQL queries to create, modify, delete, and update the Oracle database and to analyze the data.
  • Used data modeling techniques in designing the data marts project.
  • Extensively involved in Data mapping and Data modeling.
  • Expanded physical data model of OLTP application using Erwin 7.x.
  • Assisted the DBA in converting the logical models to physical.
  • Conducted design discussions and meetings to arrive at the appropriate data mart using the Kimball methodology.

Environment: SQL, Informatica, Oracle, Erwin, SSRS, Jira
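The change data capture (CDC) insert/update pattern described above is essentially an upsert against the dimension table. A minimal sketch using SQLite's `ON CONFLICT` clause (requires SQLite 3.24+); the `dim_account` table and captured changes are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_account (
        account_id INTEGER PRIMARY KEY,
        status TEXT
    )
""")
cur.execute("INSERT INTO dim_account VALUES (1, 'open')")

# Captured changes from the source: account 1 updated, account 2 new.
changes = [(1, "closed"), (2, "open")]

# Upsert: insert new rows, overwrite existing ones (a type 1 update).
cur.executemany("""
    INSERT INTO dim_account (account_id, status) VALUES (?, ?)
    ON CONFLICT(account_id) DO UPDATE SET status = excluded.status
""", changes)

rows = cur.execute(
    "SELECT * FROM dim_account ORDER BY account_id"
).fetchall()
print(rows)  # [(1, 'closed'), (2, 'open')]
```

For history-preserving dimensions, the conflict branch would instead close out the current row (end-date it) and insert a new versioned row, rather than overwriting in place.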
