
Sr. Data Analyst Resume


San Diego, CA

SUMMARY

  • Over 7 years of experience in the IT industry as a Senior Data Analyst, Data Modeler, Data Mapper, and Data Profiler.
  • Excellence in delivering quality conceptual, logical, and physical data models for multiple projects involving new and existing enterprise applications and data warehouses.
  • Experienced in identifying entities, attributes, metrics, and relationships; also assigning keys and optimizing the model.
  • Proficient in data mart design, creation of cubes, identifying facts & dimensions, and star & snowflake schemas.
  • Expert in developing transactional enterprise data models that strictly meet normalization rules, as well as Enterprise Data Warehouses using Kimball and Inmon Data Warehouse methodologies.
  • Expert in Data Quality Management techniques such as Data Profiling, Data Cleansing, Data Integrity, Reference Data, Data Security, and Data Mining.
  • Proven experience designing data mining and analytics modeling solutions across datasets.
  • Strong knowledge of Big Data and Data Analytics technologies.
  • Solid experience in working with and creating 3rd Normal Form (ODS) and dimensional (OLAP) models.
  • Extensive experience with Normalization (1NF, 2NF, 3NF, and BCNF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
  • Excellent experience in modeling DW components: staging area, normalized layer, and reporting database (star schema/data mart).
  • Extensive knowledge of enterprise repository tools, data modeling tools, data mapping and data profiling tools.
  • Extensive working experience in advanced SQL Queries and PL/SQL stored procedures.
  • Good experience with Mainframe systems (COBOL, JCL, CICS, VSAM, DB2, IMS, IDMS) and with converting Mainframe data to ETL staging tables.
  • Involved in the maintenance of metadata repositories.
  • Experience in logical/physical database design and review sessions to determine and describe data flow and data mapping from source to target databases, coordinating with end users, business analysts, DBAs, and application architects.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Excellent experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects
  • Excellent knowledge of creating DML statements for underlying tables.
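
The dimensional-modeling skills above (facts, dimensions, star schemas) can be sketched with a minimal example. All table and column names below are illustrative, not from any project listed here; SQLite stands in for the warehouse platforms.

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    sales_amt   REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01'), (20240102, '2024-01-02');
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO fact_sales VALUES
    (20240101, 1, 100.0), (20240101, 2, 50.0), (20240102, 1, 25.0);
""")

# Typical star-schema query: aggregate the fact grain by a dimension attribute.
rows = cur.execute("""
    SELECT p.product_name, SUM(f.sales_amt) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
print(rows)  # [('Gadget', 50.0), ('Widget', 125.0)]
```

The fact table holds measures at the lowest grain; every query joins out to conformed dimensions, which is what keeps star-schema reporting simple and fast.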

TECHNICAL SKILLS

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, VBScript, PERL, AWK, SED

Databases: Oracle 11g/10g/9i, Teradata R12/R13/R14, MS SQL Server 2005/2008, MS Access

Tools: MS Office suite (Word, Excel, MS Project, Outlook), VSS, Erwin, MDM tools, Informatica, SSAS, Microsoft Visual Studio 2008

Testing and defect tracking Tools: HP/Mercury (Quality Center, Win Runner, Load Runner, Quick Test Professional, Performance Center, VU Scripting, Business Availability Center), Requisite, MS Visio & Visual Source Safe

Operating Systems: Windows Vista/XP/2000/98/95, DOS, UNIX

ETL/Data Warehouse Tools: Informatica 9.5/9.1/8.6.1/8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SAP Business Objects XIR3.1/XIR2, Web Intelligence

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant, Informatica Metadata Manager

PROFESSIONAL EXPERIENCE

Confidential, San Diego, CA

Sr. Data Analyst

Responsibilities:

  • Involved in creating Physical and Logical models using Erwin.
  • Created and maintained database objects (tables, views, indexes, partitions, synonyms, database triggers, stored procedures) in the data model.
  • Presented the data scenarios via Erwin logical models and Excel mockups to visualize the data better.
  • Worked on building the data model using ER Studio as per the requirements, discussion and approval of the model from the BA.
  • Provided subject matter expertise as appropriate to ETL requirements, information analytics, modeling & design, development, and support activities.
  • Worked with requestors to develop and understand data requirements and reports specifications and worked on credit risk and operational risk.
  • Involved with Data Analysis primarily Identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats
  • Designed and Developed logical & physical data models and Meta Data to support the requirements using Erwin
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata as per business requirements using Erwin 9.5
  • Involved with Data Profiling activities for new sources before creating new subject areas in warehouse
  • Conducted the Data Analysis and identified the Data quality issues using Data profiling methodologies.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Used forward engineering approach for designing and creating databases for OLAP model.
  • Worked with business to identify the distinct data elements in each report to determine the number of reports needed to satisfy all reporting requirements.
  • Prepared source to target mapping and technical specification document.
  • Prepared requirement for design of data model for data marts using the identified data elements.
  • Resolved data issues and updates for multiple applications using SQL queries/scripts.
  • Worked on Data Quality Framework design, logical/physical model, and architecture.
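
The data-masking work mentioned above (masking sensitive data between production and test environments) can be illustrated with a small sketch. The masking rule, field names, and sample record below are hypothetical; deterministic hashing is just one common approach, chosen here because it preserves joinability across masked tables.

```python
import hashlib

# Deterministic hash-based masking: the same input always masks to the
# same token, so keys still join, but real values never reach test.
def mask_value(value: str, salt: str = "test-env") -> str:
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]  # short stable token in place of the real value

prod_row = {"customer_id": "C1001", "ssn": "123-45-6789", "balance": 2500.0}
SENSITIVE = {"ssn"}  # columns flagged as sensitive (illustrative)

test_row = {k: (mask_value(v) if k in SENSITIVE else v)
            for k, v in prod_row.items()}
print(test_row["ssn"])      # masked token, stable across runs
print(test_row["balance"])  # non-sensitive fields pass through unchanged
```

In a real masking mapping this rule would be applied per-column by the ETL tool; the point of the sketch is that masking must be consistent, irreversible, and limited to the flagged columns.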

Environment: Netezza, Erwin 9.x, DB2, Information Analyzer, Informatica, MDM, Quality Center, Excel, Word.

Confidential, Winston-Salem, NC

Data Analyst / Data Modeler

Responsibilities:

  • Acted as liaison between the business units, technology teams and support teams.
  • Responsible for researching data quality issues (inaccuracies in data), worked with business owners/stakeholders to assess business and risk impact, provided solution to business owners
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Extracted data from various sources like Oracle, Mainframes, flat files and loaded into the target Netezza database
  • Involved in investigating and resolving data anomalies
  • Analyzed data, including investigation, analysis, documentation, and filtering of bad data through the generation of Crystal Reports.
  • Designed 3rd normal form target data model and mapped to logical model.
  • Worked with Data Warehouse Extract and load developers to design mappings for Data Capture, Staging, Cleansing, Loading, and Auditing.
  • Developed, enhanced and maintained Snow Flakes and Star Schemas within data warehouse and data mart conceptual & logical data models.
  • Analyzed business requirements and translated them into detailed conceptual data models, process models, logical models, physical models and entity relationship diagrams.
  • Gathered enterprise level data requirements.
  • Communicated data needs internally and externally to 3rd parties. Analyzed and understood the data options and documented data dictionary.
  • Designed and created Data Marts as part of a data warehouse. Effectively used triggers and stored procedures necessary to meet specific application's requirements.
  • Involved in coordinating efforts with the application development team and obtaining new data requirements.
  • Involved in extensive data validation using SQL queries and back-end testing
  • Used SQL for querying the database in a UNIX environment
  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications using data management software and tools such as Perl, Toad, MS Access, Excel, and SQL
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
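
The back-end data validation described above commonly boils down to reconciling a source table against its loaded target. The sketch below shows one such check (row count plus a column checksum); table names and data are made up, and SQLite stands in for the Oracle/Netezza/Teradata platforms listed here.

```python
import sqlite3

# Source-to-target reconciliation: compare row count and an amount
# checksum between the extract source and the loaded target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def profile(table: str):
    # Returns (row_count, amount_checksum) for the given table.
    return cur.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {table}").fetchone()

src, tgt = profile("src_orders"), profile("tgt_orders")
print(src == tgt)  # True when counts and checksums match
```

Count-and-checksum checks catch dropped or duplicated rows cheaply; mismatches are then drilled into with row-level minus/except queries.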

Environment: PL/SQL, Business Objects XIR2, Informatica 9.5/8.6/9.1, Oracle 11g, Teradata V2R13/R14.10, Teradata SQL Assistant 12.0, Netezza, Power Designer, Erwin

Confidential, Austin, TX

Data Analyst

Responsibilities:

  • Liaised with lead business users (PK Analyst, QC Coordinator/QC Reviewer) and developed system process flows and User Requirements Specifications through surveys and interviews.
  • Extensively worked on Data Governance, i.e., metadata management, Master Data Management, Data Quality, and Data Security
  • Worked with the Data Quality team on defining and configuring rules, monitoring, and preparing Data Quality analyses and dashboards
  • Involved in data model reviews with internal data architect, business analysts, and business users with explanation of the data model to make sure it is in-line with business requirements.
  • Used Reverse Engineering approach to redefine entities, relationships and attributes in the data model as per new specifications in Erwin after analyzing the database systems currently in use.
  • Worked on various business data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Supported business areas and database platforms to ensure logical data model and database design, creation, and generation follows enterprise standards, processes, and procedures
  • Designed database solution for applications, including all required database design components and artifacts
  • Provided input into database systems optimization for performance efficiency and worked on full lifecycle of data modeling (logical - physical - deployment)
  • Worked on data modeling with ER/Studio DA and Erwin to design data models
  • Identified and documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Prepared the Dimension models and validated the models with the business requirements.
  • Prepared Data Cleansing methodologies by relating them to DQ issues.
  • Prepared the ETL mapping specifications to load the data into the staging area, and from the staging area into the data marts.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Provided data modeling support for numerous strategic application development project
  • Documented the purpose of each mapping to help personnel understand the process and incorporate changes as and when necessary.

Environment: PL/SQL, Business Objects XIR2, Informatica 9.5/8.6/9.1, Oracle 11g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0

Confidential, McLean, VA

Data Analyst

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center
  • Tested the ETL process for both before data validation and after data validation process.
  • Tested the messages published by ETL tool and data loaded into various databases
  • Created UNIX scripts for file transfer and file manipulation.
  • Performed data reconciliation between integrated systems.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access
  • Wrote complex SQL queries to validate data against various reports generated by Business Objects XIR2
  • Designed and developed cubes using SQL Server Analysis Services (SSAS) in Microsoft Visual Studio 2008
  • Used ODS to integrate data from disparate source systems in a single structure using data integration technologies like data virtualization.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
  • Used Metadata Manager to integrate data, and for data analysis and transformation
  • Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW using Informatica.

Environment: PL/SQL, Business Objects XIR2, Informatica 8.6/9.1, Oracle 11g, Teradata V2R12/R13.10

Confidential, New York, NY

Data Analyst / Data Modeler

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Wrote complex SQL queries on Netezza and used them in lookup SQL overrides and Source Qualifier overrides.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center
  • Tested the ETL process for both before data validation and after data validation process.
  • Tested the messages published by ETL tool and data loaded into various databases
  • Created UNIX scripts for file transfer and file manipulation.
  • Performed data reconciliation between integrated systems.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access
  • Wrote complex SQL queries to validate data against various reports generated by Business Objects XIR2
  • Extracted data from various sources like Oracle, Netezza, and flat files and loaded it into the target Netezza database
  • Designed and developed cubes using SQL Server Analysis Services (SSAS) in Microsoft Visual Studio 2008
  • Used ODS to integrate data from disparate source systems in a single structure using data integration technologies like data virtualization.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
  • Used Metadata Manager to integrate data, and for data analysis and transformation
  • Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW using Informatica.
  • Designed database solution for applications, including all required database design components and artifacts
  • Provided input into database systems optimization for performance efficiency and worked on full lifecycle of data modeling (logical - physical - deployment)
  • Maintained data in the database with consistency and integrity.
  • Involved with data cleansing/scrubbing and validation.
  • Formed numerous Volatile, Global, Set, and Multi-Set tables.
  • Used Teradata OLAP functions like RANK, ROW_NUMBER, QUALIFY, CSUM, and SAMPLE.
  • Automated the process of uploading data into production tables using UNIX.
  • Performed dicing and slicing on data using Pivot tables to determine the churn rate pattern and prepared reports as required.
  • Identified critical metadata for key business divisions to create crucial data mapping and metadata documents
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
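
The Teradata OLAP functions listed above (CSUM for running totals, QUALIFY for filtering on a window result) are vendor-specific; the same ideas can be sketched with standard window functions. SQLite (3.25+) stands in for Teradata below, and the table and data are made up.

```python
import sqlite3

# CSUM(amt, day) ~ SUM(amt) OVER (ORDER BY day); QUALIFY RANK() = 1 is
# emulated with a derived table, since only Teradata has QUALIFY.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE daily_sales (day INTEGER, amt REAL);
INSERT INTO daily_sales VALUES (1, 10.0), (2, 30.0), (3, 20.0);
""")

rows = cur.execute("""
    SELECT day, running_total FROM (
        SELECT day,
               SUM(amt) OVER (ORDER BY day)      AS running_total,
               RANK()   OVER (ORDER BY amt DESC) AS amt_rank
        FROM daily_sales
    ) WHERE amt_rank = 1
""").fetchall()
print(rows)  # [(2, 40.0)] -- the top-selling day, with its running total
```

QUALIFY is convenient precisely because it avoids this derived-table wrapper: it filters on the window result directly, after the windows are computed.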

Environment: PL/SQL, Business Objects XIR2, Informatica 8.6/9.1, Oracle 11g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0, Erwin Data Modeler, ODS

Confidential, Stevens Point, WI

Data Analyst

Responsibilities:

  • Acted as liaison between the business units, technology teams and support teams.
  • Responsible for researching data quality issues (inaccuracies in data), worked with business owners/stakeholders to assess business and risk impact, provided solution to business owners
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Extracted data from various sources like Oracle, Mainframes, flat files and loaded into the target Netezza database
  • Involved in investigating and resolving data anomalies
  • Analyzed data, including investigation, analysis, documentation, and filtering of bad data through the generation of Crystal Reports.
  • Designed 3rd normal form target data model and mapped to logical model.
  • Worked with Data Warehouse Extract and load developers to design mappings for Data Capture, Staging, Cleansing, Loading, and Auditing.
  • Developed, enhanced and maintained Snow Flakes and Star Schemas within data warehouse and data mart conceptual & logical data models.
  • Involved in extensive data validation using SQL queries and back-end testing
  • Used SQL for querying the database in a UNIX environment
  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications using data management software and tools such as Perl, Toad, MS Access, Excel, and SQL

Environment: PL/SQL, Business Objects XIR2, Informatica 9.5/8.6/9.1, Oracle 11g, Teradata V2R13/R14.10, Teradata SQL Assistant 12.0, Power Designer

Confidential, Buffalo, NY

Data Analyst

Responsibilities:

  • Worked on thorough data analysis to make sure that all sensitive data had been identified.
  • Conducted workflow, process diagram and gap analyses to derive requirements for existing systems enhancements.
  • Worked with the reference data to analyze the sensitivity of the data and secured it to protect the privacy of the client.
  • Worked on transformation logic using PL/SQL queries.
  • Worked with the Risk Matrix and identified the different tiers of the risks in the data warehouse environment.
  • Used Heat Maps to identify different Risk Mediums across different internal departments.
  • Performed Data Analysis and Data validation by writing SQL queries.
  • Worked on developing a tool to extract data from a DB2 database and conducted metadata and data analysis.
  • Used regular expressions, and data mining techniques to search for defined data patterns.
  • Worked with several business intelligence tools such as Tableau and QlikView to view and report data analysis, patterns, and trends.
  • Created SQL queries to perform Data Validation and Data Integrity testing.
  • Worked extensively with R and Python to view the data for analysis and create statistical models for deeper analyses.
  • Maintained/updated system data flow chart, Heat Maps, Tree Maps, Visio documents, and system documentations.
  • Led meetings with clients and business owners to discuss the requirements of the data analysis.
  • Created and maintained several custom reports using Tableau and QlikView.
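
The regex-based scanning for defined data patterns mentioned above can be sketched in a few lines. The pattern (US SSN format) and the sample text are illustrative only; a real scan would run a library of such patterns over extracted column values.

```python
import re

# Pattern-based scan for sensitive data: flag values that look like a
# US SSN (ddd-dd-dddd). Pattern and sample text are made up.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

sample = "Contact 555-0100; account note: SSN 123-45-6789 on file."
hits = SSN_PATTERN.findall(sample)
print(hits)  # ['123-45-6789'] -- the phone-like '555-0100' does not match
```

Word boundaries and exact digit-group lengths keep false positives down; each match would then be routed to the data-sensitivity review described above.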

Environment: MS Excel, MS Project, Oracle 9i, Heat Maps, MS SQL Server 2005, Tableau, QlikView, RStudio, Python, MySQL, Visual Basic
