
Sr. Data Analyst/Business Analyst/Data Modeler/SQL Analyst Resume


Richardson, TX

SUMMARY:

  • 8+ years of industry experience as a Data Analyst with a solid understanding of Data Modeling, Data Source Evaluation, Data Warehouse/Data Mart Design, ETL, BI, OLAP, and Client/Server applications.
  • Expert in writing and optimizing SQL queries in Oracle, SQL Server 2008, and Teradata.
  • Excellent Software Development Life Cycle (SDLC) experience with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Excellent knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches. Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata (a profiling sketch appears after this list).
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport. Strong experience using Excel and MS Access to load and analyze data based on business needs. Excellent knowledge of Perl and UNIX.
  • Experienced in working with Excel Pivot Tables and VBA macros for various business scenarios.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter. Experience in writing and testing SQL and PL/SQL statements - stored procedures, functions, triggers, and packages.
  • Excellent experience using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump in UNIX/Windows environments and running Teradata batch processes.
  • Extensive experience supporting Informatica applications and extracting data from heterogeneous sources using Informatica PowerCenter. Experience automating and scheduling Informatica jobs with UNIX shell scripting, including configuring Korn shell jobs for Informatica sessions.
  • Experienced in various Teradata utilities such as FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant. Expertise in loading data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad), and working with loader logs.
  • Experience in designing error and exception handling procedures to identify, record, and report errors. Excellent knowledge of creating SAP Business Objects reports and web reports for multiple data providers. Excellent experience in writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment. Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
  • Excellent knowledge of preparing required project documentation and regularly tracking and reporting project status to all project stakeholders.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Experience in testing Business Intelligence reports generated by various BI tools such as Cognos and Business Objects. Good exposure to working in an offshore/onsite model, with the ability to understand and/or create functional requirements while working with clients, and good experience in requirements analysis and generating test artifacts from requirements documents.
  • An excellent team player and technically strong professional with the ability to work with business users, project managers, team leads, architects, and peers, maintaining a healthy environment in the project.
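
A minimal sketch of the kind of data-profiling SQL referenced above, assuming a hypothetical source table and columns (customer_src, cust_id, email, created_dt) that stand in for any real source system:

  -- Profile a source table: row counts, distinct keys, nulls, and date range
  SELECT COUNT(*)                                        AS total_rows,
         COUNT(DISTINCT cust_id)                         AS distinct_cust_ids,
         SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
         MIN(created_dt)                                 AS earliest_record,
         MAX(created_dt)                                 AS latest_record
  FROM   customer_src;

  -- Flag duplicate natural keys that would break downstream loads
  SELECT   cust_id, COUNT(*) AS dup_count
  FROM     customer_src
  GROUP BY cust_id
  HAVING   COUNT(*) > 1;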

PROFESSIONAL EXPERIENCE:

Confidential, Richardson, TX

Sr. Data Analyst/Business Analyst/Data Modeler/SQL Analyst

Responsibilities:

  • Extracted data from Oracle, SQL Server and DB2 using Informatica to load it into a single data warehouse repository.
  • Created entity relationship diagrams and multidimensional data models, reports and diagrams for marketing.
  • Used Erwin Model Mart for effective model management - sharing, dividing, and reusing model information and designs to improve productivity.
  • Created a relational model and dimensional model for online services such as online banking and automated bill pay.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
  • Developed logical data models and physical data models using ER-Studio.
  • Adjusted and maintained SQL scripts and performed further data analysis and data aggregation.
  • Served as SME (subject matter expert) for remediation exclusions and the Data Dictionary.
  • Involved in defining the source to target data mappings, business rules and data definitions.
  • Worked closely with QA team and developers to clarify/understand functionality, resolve issues and provided feedback to nail down the bugs.
  • Gathered and translated business requirements into detailed, production-level technical specifications detailing new features and enhancements to existing business functionality.
  • Created the Source System Analysis documents and Architectural Solution documents.
  • Developed the logical and physical data model for the proposed solution.
  • Responsible for creating the staging tables and source to target mapping documents for the ETL process (a staging sketch appears after this list).
  • Worked with ETL teams and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
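
A minimal sketch of a staging table and a source-to-target mapping expressed in Oracle-style SQL; the table and column names (stg_account, dw_account, account_id, balance_amt) are hypothetical:

  -- Hypothetical staging table mirroring a source extract
  CREATE TABLE stg_account (
      account_id     NUMBER(10)   NOT NULL,
      account_type   VARCHAR2(20),
      open_dt        DATE,
      balance_amt    NUMBER(15,2),
      src_system_cd  VARCHAR2(10),
      load_dt        DATE         DEFAULT SYSDATE
  );

  -- Source-to-target mapping as SQL: trim, standardize case, and default nulls
  INSERT INTO dw_account (account_key, account_type, open_dt, balance_amt, src_system_cd)
  SELECT account_id,
         UPPER(TRIM(account_type)),
         open_dt,
         NVL(balance_amt, 0),
         src_system_cd
  FROM   stg_account;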

Environment: Erwin r9.5, Informatica 8.6.1, Oracle SQL Developer, Oracle Data Modeler, TOAD, Oracle 10g/11g, Teradata 14, IBM DB2, SSIS, Business Objects, SQL Server 2012/2014, SQL, PL/SQL, VBA, MS Excel, ER/Studio, Windows XP, Sybase PowerDesigner.

Confidential, Richardson, TX

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Demonstrated strong analytical skills in identifying and resolving data exchange issues. Developed Data Mapping, Data Governance, and Transformation and Cleansing rules for the Master Data Management.
  • Applied Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
  • Designed data models in Dimensional Modeling (DM) environment. Executed SQL queries to retrieve data from databases for analysis.
  • Created Data Model Reports and Data Dictionary (DD) using ERWIN.
  • Developed normalized Logical and Physical database models to design OLTP system for enterprise applications.
  • Performed forward engineering operations to create a physical data model with DDL that best suits requirements from the logical data model.
  • Developed Facts & Dimensions using ERWIN. Used Normalization up to 3NF and De-normalization for effective performance.
  • Developed Star and Snowflake schemas using dimensional data models (a star schema sketch appears after this list).
  • Enforced Referential Integrity for consistent relationship between parent and child tables.
  • Strong working experience in Extract/Transform/Load (ETL) design and implementation in areas related to Teradata utilities such as FastExport and MultiLoad for handling multiple tasks.
  • Extracted data from databases like Oracle, SQL server and DB2 using Informatica to load it into a single repository for data analysis.
  • Created source-to-target mapping specifications using the Informatica Data Quality tool.
  • Involved in development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
  • Experienced in data migration and cleansing rules for the integrated architecture (OLTP, ODS, DW).
  • Involved in generating reports and maintaining the database.
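
A minimal star schema sketch for the dimensional models described above; the fact and dimension names (fact_sales, dim_customer, dim_date) are hypothetical, and the foreign keys illustrate the referential integrity enforced between parent and child tables:

  -- Hypothetical dimension tables
  CREATE TABLE dim_customer (
      customer_key   INTEGER      NOT NULL PRIMARY KEY,
      customer_name  VARCHAR(100),
      region_cd      VARCHAR(10)
  );

  CREATE TABLE dim_date (
      date_key       INTEGER      NOT NULL PRIMARY KEY,
      calendar_date  DATE,
      fiscal_quarter VARCHAR(6)
  );

  -- Fact table; foreign keys keep fact rows consistent with their dimensions
  CREATE TABLE fact_sales (
      customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
      date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
      sales_amt      DECIMAL(15,2),
      units_sold     INTEGER
  );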

Environment: Erwin r7.0, SQL/MS SQL Server, MS Analysis Services, Windows NT, MS Visio, XML, Informatica, ER/Studio, Windows XP, Oracle Data Modeler, Teradata 14, SSIS.

Confidential, Farmers Branch, TX

Data Analyst/SQL Analyst

Responsibilities:

  • Involved in understanding Logical and Physical Data models using the Erwin tool.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment.
  • Developed working documents to support findings and assign specific tasks.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications interacting with client, developers and QA team.
  • Extensively tested the reports for data accuracy and universe-related errors.
  • Tested several dashboards and deployed them across the organization to monitor the performance.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables (a reconciliation sketch appears after this list).
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Worked on Teradata Appliance Backup Utility (ABU) and ARC to back up data to and from Teradata nodes.
  • Worked on importing and cleansing data from various sources such as DB2, Oracle, and flat files onto SQL Server 2005 with high-volume data.
  • Worked with Excel Pivot tables.
  • Identify & record defects with required information for issue to be reproduced by development team.
  • Flexible to work late hours to coordinate with offshore team.
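
A minimal sketch of the kind of validation SQL run against the ODS and target tables; the schema and table names (ods.claim_detail, dw.fact_claim) and the key column are hypothetical:

  -- Reconcile row counts between the ODS source and its warehouse target
  SELECT 'ODS'    AS layer, COUNT(*) AS row_cnt FROM ods.claim_detail
  UNION ALL
  SELECT 'TARGET' AS layer, COUNT(*) AS row_cnt FROM dw.fact_claim;

  -- Find source records that never arrived in the target
  SELECT s.claim_id
  FROM   ods.claim_detail s
  LEFT JOIN dw.fact_claim t ON t.claim_id = s.claim_id
  WHERE  t.claim_id IS NULL;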

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Business Objects XIR3, ETL Tools (Informatica 8.6), Oracle 10G, Teradata V2R12, Teradata SQL Assistant 12.0

Confidential, Austin, TX

SQL Analyst

Responsibilities:

  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Performed data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Netezza.
  • Involved in migrating the data model from another database to a Teradata database and prepared a Teradata staging model.
  • Used the Erwin v8.2 modeling tool to publish a data dictionary, review the model and dictionary with subject matter experts, and generate data definition language (DDL).
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2.
  • Worked on importing and cleansing data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005 with high-volume data.
  • Involved in extensive data validation by writing several complex SQL queries, performing back-end testing, and working through data quality issues (a comparison-query sketch appears after this list).
  • Developed regression test scripts for the application, was involved in metrics gathering, analysis, and reporting to the concerned teams, and tested the test programs.
  • Identified, assessed, and communicated potential risks associated with testing scope, product quality, and schedule.
  • Reviewed the model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
  • Designed a 3rd normal form (3NF) target data model and mapped it to the logical model.
  • Held brainstorming sessions with application developers and DBAs to discuss various de-normalization, partitioning, and indexing schemes for the physical model.
  • Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents.
  • Used SQL for querying the database in a UNIX environment.
  • Developed the Star Schema for the proposed warehouse model to meet the requirements.
  • Identified the objects and relationships between the objects to develop a logical model using Erwin, and later translated the model into a physical model.
  • Worked with Data Warehouse Extract and load developers to design mappings for Data Capture, Staging, Cleansing, Loading, and Auditing.
  • Developed, enhanced, and maintained Snowflake and Star schemas within data warehouse and data mart conceptual and logical data models.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
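
A minimal sketch of a column-level comparison used in this kind of data validation; the table and column names (stg_policy, dw_policy, policy_id) are hypothetical, and MINUS is the Oracle/Teradata set operator:

  -- Rows returned here indicate values that changed or were dropped during the load
  SELECT policy_id, policy_status, premium_amt
  FROM   stg_policy
  MINUS
  SELECT policy_id, policy_status, premium_amt
  FROM   dw_policy;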

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Java, Business Objects XIR2, ETL Tools (Informatica 8.6/9.1/9.5), SSIS, Oracle 11G, Teradata R13, Teradata SQL Assistant, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, Perl, Shell Scripting
