
Sr. Data Analyst Resume


Pittsburgh, PA

SUMMARY

  • Overall 8+ years of professional experience in the banking industry across different environments and platforms, including data analysis, data warehousing, data modeling, data mapping, data profiling, and client-server systems.
  • Domain experience includes credit cards, retail banking, lending, cross-sell, campaign management, auto finance, customer intelligence, customer servicing, channels, acquisitions, recoveries, mergers, digital & phone banking, mortgage, core banking solutions, complex data warehouse environments, data analytics, data mining, and reporting.
  • Experienced in using both MOLAP and ROLAP tools and operational reporting tools, and able to evaluate BI tools.
  • Experience with designing and verifying databases on Oracle and SQL Server RDBMS using Entity-Relationship Diagrams (ERD).
  • Strong skills in Data Warehouse, Data Exploration/Extraction, Data Validation, Reporting and Excel.
  • Experience in working with Data Management and Data Governance based assignments.
  • Extensive experience in RDBMS implementation and development using SQL, PL/SQL stored procedures and query optimization.
  • Strong understanding of Relational Database Management System (RDBMS) including data model, tables, views, indexes, table space and partitioning etc.
  • Expert in T-SQL DDL/DML; perform most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.
  • Strong working experience in the data analysis, design, development, implementation, and testing of data warehousing using data conversions, data extraction, data transformation, and data loading (ETL).
  • Working experience building interactive dashboards and reports in Tableau for monitoring operational performance on a day-to-day basis.
  • Experience in creating Data Governance Policies, Business Glossary, Data Dictionary, Reference Data, Metadata, Data Lineage, and Data Quality Rules.
  • Extensively worked on Data analysis skills including Data mapping from source to target database schemas, Data Cleansing and processing, writing data extract scripts/programming of data conversion and researching complex data problems.
  • Extensively designed Data mapping and filtering, consolidation, cleansing, Integration, ETL, and customization of data mart.
  • Good understanding of the Data Modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables.
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities.
  • Experience working with consultants from multiple vendors including big 5 consulting firms.
  • Strong knowledge of Core Banking System, Retail and Commercial banking business, products, policies and procedures.
  • Experience in the ETL (Extract, Transform and Load) of data into a data warehouse/data mart and Business Intelligence (BI) tools like Business Objects modules (Reporter, Supervisor, Designer, and Web Intelligence).
  • Good working knowledge of Meta-data management in consolidating metadata from disparate tools and sources including Data warehouse, ETL, Relational Databases and third-party metadata into a single repository to get information on data usage and end-to-end change impact analysis.
  • Solid understanding of data governance theories, principles, processes, practices, and tools, including master data, data quality, data modeling, and data stewardship.
  • Worked on integration and implementation of projects and products, database creations, modeling, calculation of object sizes, table spaces and database sizes.
  • Extensive experience working with BI tools.
  • Exposure to implementation and operations of data governance, data strategy, data management and solutions.
  • Possess strong Documentation skills and knowledge sharing among team, conducted data modeling review sessions for different user groups, participated in sessions to identify requirement feasibility.
  • Strong knowledge of all phases in the Software Development Life Cycle (SDLC) and the iterative Rational Unified Process (RUP).
  • Extensive experience using data cleansing techniques, Excel pivot tables, formulas, and charts. Strong end-user computing skills (Excel, SharePoint, PowerPoint, Word).
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and Interface requirements documents.
  • Proficient in various management tools like JIRA, Confluence, Version One and Rational Requisite Pro.
  • Executed different types of testing including User Acceptance Testing (UAT), Black Box Testing, Unit Testing, Integration Testing and System Testing.
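The data-cleansing and pivot-table work mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not from any actual engagement; the column names and values are hypothetical.

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw extract with messy values (inconsistent case, stray spaces)
raw = io.StringIO(
    "region,product,amount\n"
    "NE ,Credit Card,100\n"
    "ne,Auto Loan,250\n"
    "SW,Credit Card, 75\n"
    "SW,credit card,25\n"
)

# Cleanse: trim whitespace and normalize case before aggregating
totals = defaultdict(float)  # (region, product) -> summed amount
for row in csv.DictReader(raw):
    region = row["region"].strip().upper()
    product = row["product"].strip().title()
    totals[(region, product)] += float(row["amount"])

# Pivot-style view: one line per (region, product) pair
for (region, product), amount in sorted(totals.items()):
    print(f"{region:4} {product:12} {amount:8.2f}")
```

Without the normalization step, "NE"/"ne" and "Credit Card"/"credit card" would land in separate buckets, which is exactly the kind of bad data a cleansing pass catches.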

PROFESSIONAL EXPERIENCE

Sr. Data Analyst

Confidential, Pittsburgh, PA

Responsibilities:

  • Interacted with SMEs (subject matter experts) of different divisions and followed a data analysis and design methodology built around the agile (Scrum) methodology.
  • Identified and extracted relevant data from large data sets using data reduction and mining techniques.
  • Wrote SQL queries to check data migration, data transformations, and database integrity in Teradata.
  • Identify customer marketing opportunities by applying data mining models and writing advanced T-SQL stored procedures and queries.
  • Involved in designing the ETL process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Responsible for Data Warehousing (DWH) and BI with Data Analysis, Data Mapping, Data Cleansing, Entity-Relationship Diagrams (ERD), Data Modeling (Conceptual, Logical, & Physical), & Master Data Mgmt. (MDM).
  • Involved with a team on a BI tool evaluation to choose the right tool.
  • Worked on the development of dashboard reports for the key performance indicators for top management, and developed test cases and performed unit testing.
  • Performed Data profiling and exploration using Informatica IDQ.
  • Worked on developing, analyzing, and reviewing multiple logical data models/data architectures for a given business situation.
  • Involved in designing and developing a data management application with a central operational database.
  • Extracted and transformed data from various sources and transferred it to the Teradata data warehouse tables/views.
  • Involved in defining the source-to-target data mappings, business rules, and business and data definitions.
  • Involved in Master Data Management, responsibilities include Data governance consisting of Data Stewardship, Metadata management, Data Quality, Data Profiling, Data Integration & Consolidation.
  • Involved in logical & physical data modeling, database schema design, and modification of triggers, scripts, and stored procedures in Sybase database servers.
  • Performed data validation to check for proper conversion, and data cleansing to identify bad data and clean it.
  • Gathered requirements and modeled the data warehouse and the underlying transactional database.
  • Worked with SQL, RDBMS, and Data warehousing (STAR Schema).
  • Performed data entry and reconciliation as well as back-end data systems maintenance work using MDM tools Informatica and SQL language programming.
  • Managed SQL RDBMS, ability to read and understand data flow diagrams and ERD.
  • Used Snowflake functions to perform semi-structured data parsing entirely with SQL statements.
  • Created Talend flows to load and transform the data from various sources like flat files (delimited and positional), Oracle, XML, MS SQL Server, MS Access, and Excel files.
  • Part of a team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Worked with Finance, Risk, and Investment Accounting teams to create Data Governance glossary, Data Governance framework and Process flow diagrams.
  • Used data warehousing for data profiling to examine the data available in an existing database.
  • Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
  • Collected business requirements to govern proper data transfer from data source to data target during the data mapping process.
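The migration and integrity checks described in this role typically reduce to count and value reconciliation between source and target tables. A minimal sketch, with sqlite3 standing in for Teradata; the table and column names are illustrative, not from an actual project.

```python
import sqlite3

# In-memory database playing the part of the warehouse for this sketch
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE tgt_customer (id INTEGER PRIMARY KEY, balance REAL);
    INSERT INTO src_customer VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_customer VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Row-count reconciliation between source and target
src_count, tgt_count = cur.execute(
    "SELECT (SELECT COUNT(*) FROM src_customer),"
    "       (SELECT COUNT(*) FROM tgt_customer)"
).fetchone()
print("counts match:", src_count == tgt_count)

# Value-level check: rows present in source but missing or mismatched in target
mismatches = cur.execute("""
    SELECT s.id FROM src_customer s
    LEFT JOIN tgt_customer t ON s.id = t.id
    WHERE t.id IS NULL OR s.balance <> t.balance
""").fetchall()
print("mismatched rows:", len(mismatches))
```

The LEFT JOIN pattern is the key design choice: it surfaces both dropped rows (`t.id IS NULL`) and transformed-incorrectly rows (value mismatch) in a single pass.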

Data Analyst

Confidential, New York, NY

Responsibilities:

  • Supported analysis and evaluation of data mining in a data warehouse environment, which included data design, database architecture, metadata, and data governance in an Oracle environment.
  • Performed extensive system analysis to design a data model that helped in loading data from multiple data sources.
  • Responsible for Text Analytics, generating data visualizations using R, Python and creating dashboards using tools like Tableau.
  • Performed data mapping and logical data modeling, created class diagrams and ERDs, and used SQL queries to filter data within the Oracle database.
  • Participated in data modeling discussions, exchanging ideas to arrive at the best model to efficiently serve both the ETL and reporting teams.
  • Worked on Data Management, Data Analysis, Data Mapping and Testing of ETL and BI Reporting requirements.
  • Created STTM documentation for data mart integrating different source systems.
  • Worked on various databases, SOA platforms, ESBs, ETL tools, and BI tools, as well as XPath and XQuery.
  • Created the data for QlikView, an exploration & discovery analysis tool.
  • Involved in creating logical and physical data models with star and snowflake schema techniques using Erwin, in the data warehouse as well as in the data mart.
  • Performed dependency analysis (by identifying any breaks in lineage), root cause analysis, and remediation as part of the data lineage effort.
  • Performed mappings and ran sessions and workflows to execute the land process of loading the customer data set, coming from various source systems, into the Informatica MDM Hub using an ETL tool.
  • Independently worked on owning IT support tasks related to Tableau Reports on Server.
  • Created Use Case Diagrams, Data Flow Diagrams, Data mapping, Data Lineage mapping, Sequence Diagrams, ODD and ERD in MS Visio.
  • Involved in creating Informatica mappings to populate staging tables and data warehouse tables from various sources like flat files, DB2, Netezza, and Oracle.
  • Worked with Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Generated data models using Erwin, developed the relational database system, and was involved in logical modeling using dimensional modeling techniques such as star schema and snowflake schema.
  • Involved in data definitions, data element naming conventions, logical/physical data designs, and data architecture.
  • Performing data management projects and fulfilling ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, MS Access and Excel.
  • Well versed with various aspects of ELT processes used in loading and updating Teradata data warehouse.
  • Worked on RDBMSs like Oracle, DB2, SQL Server, and MySQL.
  • Participated in System Analysis, E-R/Dimensional Data Modeling, Database Design and implementing RDBMS specific features.
  • Wrote SQL involving complex queries, joins, and subqueries, and created temp tables, using Teradata SQL Assistant and SQL Developer for data requests.
  • Involved in defining the source-to-target data mappings, business rules, and data definitions.
  • Performed data integrity testing by extensive use of SQL queries and executed backend testing.
  • Involved in creating and designing a reports framework with dimensional data models for the data warehouse, and worked with the development team on SQL Server tools like Integration Services and Analysis Services.
  • Responsible for preparing ad-hoc reports using custom SQL and views and presenting them to business users and program managers.
  • Developed user documentation for all the application modules; also responsible for writing test plan documents and unit testing for the application modules.
  • Prepared project reports for management and assisted the project manager in the development of weekly and monthly status reports, policies, and procedures.

Data Analyst

Confidential - Austin, TX

Responsibilities:

  • Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.
  • Performed data analysis and documented Source to Target mapping, Data quality checks, Data sharing agreements and documents.
  • Involved in migration projects to migrate data warehouses from Oracle to DB2.
  • Performed data auditing, validation and cleansing as well as metadata mapping and migrations.
  • Identified integration impact, data flows and data stewardship.
  • Involved in migration of various objects like stored procedures, tables, and views from various data sources to SQL Server.
  • Assisted in identifying and clarifying defined functional issues, and supported IT development staff throughout the design, development, unit testing, and implementation phases of the software development life cycle.
  • Involved in designing the DB2 process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Involved in developing and maintaining data architectures, data repositories, and data management procedures and standards.
  • Part of a team implementing REST APIs in Python using a micro-framework (Flask) with SQLAlchemy on the backend for management of data center resources on which OpenStack would be deployed.
  • Worked with the data migration team in determining the data elements and period for data migration.
  • Involved in Data modeling process, conceptual, logical and physical models, ERD and data warehouse design-Dimensional-Star, snowflake schema, Data Integrity and OLAP and facts, indexing, data dictionary.
  • Conducted data cleansing for migrating data from the legacy system to the new data warehouse, and performed data conversion validation testing to make sure there was a seamless conversion of data from the current system to the new system.
  • Performed data conversion/data migration using Informatica PowerCenter.
  • Responsible for executing user interface testing, system testing, and data quality testing on the configuration design and prototypes.
  • Worked with the data steward and data owners in creating metadata, lineage, data quality rules, and guidelines.
  • Created data governance templates and standards for the data governance organization.
  • Performed source data profiling to verify that data meets the business requirements.
  • Worked on issues with migration of data from the development to the QA/test environment.
  • Proposed solutions to reporting needs and developed prototypes using SQL and Business Objects that addressed those needs.
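Source data profiling of the kind described above usually starts with per-column null counts, distinct counts, and top values. A minimal stdlib sketch; the field names and records are hypothetical.

```python
from collections import Counter

# Hypothetical extract from a legacy source system awaiting migration
records = [
    {"ssn": "111-22-3333", "state": "TX"},
    {"ssn": None,          "state": "TX"},
    {"ssn": "444-55-6666", "state": "ny"},
    {"ssn": "444-55-6666", "state": None},
]

# Per-column profile: null count, distinct non-null values, most common value
profile = {}
for col in ("ssn", "state"):
    values = [r[col] for r in records]
    non_null = [v for v in values if v is not None]
    profile[col] = {
        "null_count": values.count(None),
        "distinct": len(set(non_null)),
        "top_value": Counter(non_null).most_common(1)[0][0],
    }

for col, stats in profile.items():
    print(col, stats)
```

A profile like this immediately flags business-rule violations before migration, e.g. a null in a column the target schema declares NOT NULL, or the lowercase "ny" hinting at an unstandardized code set.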

Data Analyst

Confidential, Minneapolis, MN

Responsibilities:

  • Worked on creating the security master for fixed income securities; this was implemented using the market data and reference data feeds from various sources like Bloomberg.
  • Wrote and analyzed complex Oracle PL/SQL queries to understand data flow between multiple legacy systems.
  • Responsible for defining the key identifiers for each mapping/interface.
  • Helped create and design dashboard mockup screens for development purposes, understanding KPIs to present charts and graphs.
  • Involved in Data Modeling of both Logical Design and Physical Design of Data Warehouse and data marts in Star Schema and Snowflake Schema methodology.
  • Involved in data extraction, transformation, and loading (the ETL process) from source to target systems using Informatica PowerCenter.
  • Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Performed data analysis and documented Source to Target mapping, Data quality checks, Data sharing agreements and documents.
  • Used Data Warehouse Architecture and Design - Star Schema or Snowflake Schema, FACT and Dimensional Tables, Physical/Logical Data Modeling and data flow diagrams.
  • Involved in migration projects to migrate data warehouses from Oracle to DB2.
  • Involved in migration of various objects like stored procedures, tables, and views from various data sources to SQL Server.
  • Involved in designing the DB2 process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Analyzed data lineage processes to identify vulnerable data points, control gaps, data quality issues, and overall lack of data governance.
  • Supported BI development in centralized solutions repository for dashboard properties, Key performance indicators, BI metrics, facts, dimensions, data anomalies, data dependencies, data dictionaries, metadata, data profiling and table-column definitions in data warehouse.
  • Performed the ETL process using Data Manager for loading the data into data marts, which were used for performance reporting.
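Defining a key identifier for each mapping/interface, as described above, implies checking that the proposed key is actually unique in the source feed. A small sketch of that check; the security records are hypothetical.

```python
from collections import Counter

# Hypothetical reference-data feed rows for a security master load,
# with CUSIP proposed as the key identifier for the mapping
source_rows = [
    {"cusip": "037833100", "desc": "Bond A"},
    {"cusip": "594918104", "desc": "Bond B"},
    {"cusip": "037833100", "desc": "Bond A (dup feed)"},
]

# Count occurrences of the proposed key; any count > 1 disqualifies it
counts = Counter(row["cusip"] for row in source_rows)
duplicates = {key: n for key, n in counts.items() if n > 1}
print("key is unique:", not duplicates)
print("duplicate keys:", duplicates)
```

When the same security arrives from multiple feeds (as with Bloomberg plus other vendors), duplicates like this are exactly why a survivorship or dedup rule has to be defined before the key can anchor the mapping.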

Data Analyst

Confidential, San Francisco, CA

Responsibilities:

  • Generated data models using Erwin, developed the relational database system, and was involved in logical modeling using dimensional modeling techniques such as star schema and snowflake schema.
  • Involved in data extraction, transformation, and loading (the ETL process) from source to target systems using Informatica PowerCenter.
  • Metrics reporting, data mining and trends in helpdesk environment using Access.
  • Assisted in data warehousing and database modeling and the development of Entity-Relationship Diagrams (ERDs).
  • Used both Kimball and Bill Inmon methodologies for creating the data warehouse, and transformed data from different OLTP systems.
  • Involved in requirement gathering and database design and implementation of star-schema, dimensional data warehouse using Erwin.
  • Involved in writing complex SQL queries for extracting data from multiple tables.
  • Used SQL Queries to verify and validate Data populated in Front-End is consistent with that of Backend, used Insert, Update, Aggregate and Join queries.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
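Verifying that front-end figures match the backend, as in the bullets above, comes down to recomputing the displayed aggregate directly against the base tables. A sketch with sqlite3 and a hypothetical schema; the UI value is assumed for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, 10, 0.5), (3, 11, 40.0);
""")

# Value the front end reportedly displays for customer 10 (assumed here)
ui_total = 100.0

# Recompute the same aggregate from the backend tables
(db_total,) = cur.execute(
    "SELECT SUM(amount) FROM orders WHERE cust_id = 10"
).fetchone()
print("front end matches backend:", db_total == ui_total)
```

In practice the same pattern extends to joins across the fact and dimension tables feeding the screen; the discipline is always to derive the check from base data rather than from an intermediate report.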
