Senior Data Analyst/Data Modeler Resume
Houston, TX
SUMMARY:
- 7 years of IT experience in testing both Business Intelligence and ETL applications
- Highly skilled in developing Test Plans, Test Scripts, and Test Matrices, and in executing Test Cases
- Proficient in Data Quality Analysis, System Analysis, Business Requirement Gathering, Data Modeling, Technical Documentation.
- Proficient in Data Governance, Data Lifecycle, Data Quality Improvement, Master Data Management, and Metadata Management.
- Worked with the Data Governance team to evaluate test results for fulfillment of all data requirements
- Hands-on experience in writing test scripts, preparing test data, testing Informatica mappings, and creating SQL scripts using stored procedures, functions, and PL/SQL.
- Proficient in SAS/BASE, SAS/SQL, SAS/Macro, SAS/Connect, SAS/Access
- Expert in using Informatica Administrator, Informatica Data Quality Developer 9.6.1 and Informatica Data Quality Analyst 9.6.1 tools for data quality improvement.
- Executed and validated the data on weekly and monthly reports in MicroStrategy before UAT.
- Experience in creating ETL test scenarios, test scripts, test cases, and deployment plans based on ETL mappings
- Advanced knowledge of healthcare data coding such as CPT-4, HCPCS, ICD-9/10, and DRG
- Well versed in preparing Financial Reports and On-Demand Commercial and Consumer Reports using Oracle SQL, SAS, and Microsoft SQL Server Management Studio.
- Working knowledge of Dimensional Data Modeling using Star and Snowflake schemas.
- Experience in Data Analysis, Data Validation, Data Verification, Data Cleansing, Data Completeness and identifying data mismatch.
- Contributed to the design, analysis, and implementation of Customer Master Data Management using the Informatica tool suite, integrating 7 source systems.
- Excellent testing experience in all phases and stages of Software Testing Life Cycle (STLC) and Software Development Life Cycle (SDLC) with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling.
- Expert knowledge of data mapping, relational database modeling concepts and practices.
- Created ETL test data for all ETL mapping rules to test the functionality of the ETL process based on the mapping document.
- Experienced in working with Business Requirements Documents (BRD), ETL Data Mapping Documents, Functional Requirement Documents (FRD), and Training Business Users.
- Tested the ETL flow and other ETL Processes (Data Warehouse Testing)
- Expert in writing SQL queries, with expertise in Test Case Design, Test Tool Usage, Test Execution, and Defect Management (a representative validation query is sketched after this list).
- Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica
- Excellent analytical, problem solving, and communication skills, with ability to interact with individuals at all levels.
- Responsible for technically mentoring resources and training other QA team members.
- Experience with QA methodology to ensure quality assurance and control.
- Good at analyzing risks and issues, and able to communicate them to management to determine mitigation plans in a timely manner
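The following is a minimal sketch of the kind of ETL validation query referenced above, reconciling a source staging table against its loaded target; all table and column names (stg_customer, dim_customer, customer_id) are hypothetical.

    -- Compare row counts between source staging and target (hypothetical tables).
    SELECT 'row_count' AS check_name,
           (SELECT COUNT(*) FROM stg_customer) AS source_value,
           (SELECT COUNT(*) FROM dim_customer) AS target_value;

    -- Data completeness: source rows that never arrived in the target.
    SELECT s.customer_id
    FROM   stg_customer s
    LEFT JOIN dim_customer d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;

Checks like these would typically be run per mapping rule, with any nonzero mismatch logged as a defect.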
TECHNICAL SKILLS:
- Data Warehousing Informatica 9/8.5/8.1/7.1/6.2/5.1 (PowerCenter), Informatica Data Quality Analyst/Developer 9.6.1 HotFix2, SSIS 2008/2012
- Reporting Tools OBIEE 10g/11g, MicroStrategy 8.0/9.4.1, Power BI, SSRS 2008/2012
- Data Modeling Erwin Data Modeler v8.2, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables
- Testing Tools HP ALM 11/Quality Center
- RDBMS Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2014/2012/2010/2005/2008 R2, MS Access 2008/2013, Teradata v14, SAS EG 5.1
- Programming UNIX Shell Scripting, SQL, SQL*Plus, PL/SQL, TOAD
- Environment UNIX, HP-UX, Win 3.x/95/98/XP/Win 7, NT 4.0, Red Hat Linux
- Industry Experience Financial Services, Mortgage Servicing, Commercial Credit Risk/Retail, Healthcare (Medicaid/Medicare), Treasury, Commercial Loan Servicing, Banking
PROFESSIONAL EXPERIENCE:
Confidential, Houston, TX
Senior Data Analyst/Data Modeler
Responsibilities:
- Perform analysis of all data flowing into, out of, and within CareSource in support of data warehousing efforts.
- Identify and quantify data issues within the organization and assist in developing plans to resolve them.
- Involved in Planning, Defining, and Designing data based on business requirements and prepared documentation.
- Analyzed and accurately extracted large volumes of medical, facility, pharmacy, and dental claims data from the Enterprise Data Warehouse system for regulatory CMS submissions
- Support regular data management processes and ad hoc user requests for reporting.
- Developed complex stored procedures in SQL Server Management Studio to automate regulatory data submission to the Centers for Medicare & Medicaid Services (CMS); a representative sketch follows this list.
- Performed root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
- Monitored and analyzed the quality of data submitted to regulatory agencies.
- Hands-on experience ingesting third-party vendor files, such as XML, XSD, and flat files, into the CareSource Enterprise Data Warehouse system.
- Support the verification of data accuracy within DSI Analytic Systems and source systems.
- Develop, document and perform testing and validation as needed.
- Wrote queries to perform data validation and created Excel summary reports.
- Involved in working with Subject Matter Experts to develop business rules that support the transformation of data.
- Develop data models according to enterprise data management practices.
- Designed data models for reporting and analytic databases that translate business rules into queryable data structures.
- Designed data models for development of a data warehouse and data marts.
- Reverse engineered data models for existing databases and systems.
- In charge of supporting the maintenance of the Analytic Systems data dictionaries.
- Responsible for facilitating discussions between end users and data modelers as needed.
- Worked with Business Requirements Documents (BRD), ETL Data Mapping documents, Functional Requirement Documents (FRD) and Source to Target Mapping Documents (STM).
- Assisted in ETL development, documentation and testing as needed.
- Ensure the data migration development adheres to standards for development and the SDLC methodology.
- Worked closely with areas directly connected to the Enterprise Data Warehouse to ensure that reporting, business intelligence, and analytic data needs are met.
- Responsible for creating source to target mappings for ETL Development Process.
- Write complex SQL queries to extract data from the Enterprise Data Warehouse system to analyze data used in MicroStrategy dashboards.
- Created high level ETL design document and assisted ETL developers in the detail design and development of ETL maps using Informatica.
- Analyzed and performed unit/regression testing in Development, Integration, Certification, and Production Environments.
- Provided mentoring to team members in the use of Informatica Developer and Informatica Analyst tools.
- Create Data Quality Check templates to track and improve the quality of data in Teradata Enterprise Data Warehouse.
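A minimal T-SQL sketch of the stored-procedure pattern described above for staging a regulatory extract; the procedure, tables, and columns (usp_StageCmsSubmission, medical_claims, cms_submission_stage) are illustrative assumptions, not the production code.

    -- Stage adjudicated claims for a CMS submission window (hypothetical schema).
    CREATE PROCEDURE dbo.usp_StageCmsSubmission
        @PeriodStart DATE,
        @PeriodEnd   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.cms_submission_stage (claim_id, member_id, cpt_code, paid_amount)
        SELECT c.claim_id, c.member_id, c.cpt_code, c.paid_amount
        FROM   dbo.medical_claims AS c
        WHERE  c.service_date >= @PeriodStart
          AND  c.service_date <  @PeriodEnd
          AND  c.claim_status  =  'FINAL';  -- submit only adjudicated claims
    END;

A scheduler such as a SQL Agent job would then call the procedure once per reporting period, e.g. EXEC dbo.usp_StageCmsSubmission @PeriodStart = '2016-01-01', @PeriodEnd = '2016-02-01';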
Environment: Informatica PowerCenter, Informatica Data Quality Analyst (IDQA), Informatica Data Quality Developer (IDQD) 9.6, MS Office Suite 2013, Teradata 14, Microsoft SQL Server Management Studio 2008/2014, Erwin Data Modeler 8.2, Windows 7, SAS EG, MicroStrategy 9.4.1, Power BI, SharePoint
Confidential, Cincinnati, OH
Data Analyst
Responsibilities:
- Analyze the client data and business terms from a data quality and integrity perspective.
- Perform root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
- Worked to ensure high levels of data consistency between diverse source systems including flat files, XML and SQL Database.
- Develop and run ad hoc data queries from multiple database types to identify system of records, data inconsistencies, and data quality issues.
- Created data quality mappings and workflows for Data Remediation process in Informatica Data Quality Analyst 9.6.1.
- Managed the FNFG Data Quality surveillance setup process in Informatica Data Quality 9.6.1
- Involved in translating the business requirements into data requirements across different systems.
- Created Informatica Mappings, Mapplets, Workflows using Informatica Data Quality Developer and deployed applications into Informatica Data Integration Services (DIS)
- Involved in understanding the customer needs with regards to data, documenting requirements, developing complex SQL statements to extract the data and packaging/encrypting data for delivery to business users.
- Used SAS Business Intelligence (BI) to drive the automation of standardized analytics
- Extensive use of PROC SQL to perform queries and join tables
- Design SQL query to select, create, append, and update database tables and views.
- Developed numerous ad-hoc SAS programs to create summaries and listings
- Built and managed over 30 daily reports from Advanced Commercial Banking System (ACBS) databases for Commercial Credit Risk and Treasury management users for day-to-day management of the business.
- Executed and validated data on weekly basis in OBIEE dashboards for Customer Master Data 360 improvement.
- Provided support to Data Architect and Data Modeler in Designing and Implementing Databases for MDM.
- Identified the database tables for defining the queries for the reports using SSRS.
- Worked on data analysis using Microsoft SQL Server Management Studio, PL/SQL, IDQ Developer, SAS, and other query-based applications.
- Compile and generate reports in a presentable format for the project team.
- In charge of running Data Quality Surveillance reports on a weekly basis to identify potential data quality issues for the Commercial and Consumer Credit Line (representative checks are sketched after this list).
- Updated business users, Operations management, Commercial Credit Risk, Loan Admin, and Consumer Credit Line with the summary of data quality issues.
- Involved with profiling data elements from source to target columns using Informatica Data Quality Analyst 9.6 for Master Data Management.
- Write queries to extract data from source, production, and development environments to identify and validate the issues sent by Commercial and Consumer Credit Risk.
- Send summary reports to DQ manager on the findings for data quality surveillance.
- Worked closely with Data Quality Managers, Subject Matter Experts, Loan Admin, the Commercial and Consumer Credit Line department, Data Stewards, and Project Managers.
- Participated in Data Quality Team meetings on a weekly basis for data quality improvement.
- Identified defects in the software environment and logged new tickets in HP ALM to remediate them.
- Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
- Accustomed to working with multiple projects, such as General Data Quality/Production, Product Specification and Master Data Management (MDM) Projects.
- Highly involved in data quality improvement training organized by the client, such as Custom Informatica Data Analyst with Profiling 9.6 training and Metadata and Master Data Management training
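Representative data quality surveillance checks of the kind run weekly above, expressed as plain SQL; the credit_line table and its columns are hypothetical.

    -- Completeness rule: credit lines missing a limit.
    SELECT 'missing_credit_limit' AS dq_rule, COUNT(*) AS violations
    FROM   credit_line
    WHERE  credit_limit IS NULL
    UNION ALL
    -- Uniqueness rule: duplicated account numbers.
    SELECT 'duplicate_account_number', COUNT(*)
    FROM  (SELECT account_number
           FROM   credit_line
           GROUP BY account_number
           HAVING COUNT(*) > 1) AS dup;

Rules like these would typically be implemented as Informatica Data Quality mappings; the SQL form is shown only to make the checks concrete.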
Environment: Informatica Data Quality Analyst (IDQA) 9.6.1 HotFix2, SAS, Informatica Data Quality Developer (IDQD) 9.6.1 HotFix2, MS Office, VBA Excel 2013, Microsoft SQL Server Management Studio 2008, Windows 7, Oracle 11g, SAS EG 5.1, OBIEE 11g, HP ALM.
Confidential, CA
Sr. Data Analyst
Responsibilities:
- Enrollment Process - keep all subscribers' details up to date; maintain all client subscribers' details in the HSEnroll DB
- Campaign Correspondence - set up a process to archive data for all communications with subscribers, such as welcome letters and transition letters
- Activation Process - set up the Activation process to send positive files with cell numbers to clients to activate cell numbers that have been dispatched to subscribers.
- Cleanse Process - set up the Cleanse file process to send the client a list of cell numbers that need to be deactivated
- Create OLAP cubes using SAS OLAP tools for the creation of dashboards and drill-through web based reporting.
- Configured SAS BI components and solved BI related issues.
- Revised a series of Oracle PL/SQL programs into SAS for extraction of data from Oracle in Enterprise Miner.
- Developed methods to streamline analysis and present business critical information through a combination of SAS BI dashboards and web-based reporting
- Create reusable Programmed ETL (PETL) components (stored procedures) to solve business problems (a sketch follows this list).
- Create ETL routines in SSIS
- Define and translate business requirements into data specifications
- Determine usage and needs for data collection and reporting
- Identify and resolve conflicts with data interpretation
- Perform impact analysis on data and process changes
- Work with customers to identify data that supports their products and answers their questions.
- Perform analysis on data to detect and address issues and establish criteria for data quality standards
- Work with Database Engineering to maintain a centralized data dictionary
- Capture and plan for future data needs
- Work with BAs to translate business requirements into functional requirements
- Create mapping document using the functional requirement
- Help developers understand the mapping document
- Validate the stored procedures written by the report developers
- Raise RFCs and prepare replication request forms
- Validate the replicated tables and views
- Help the ETL team understand the mapping document
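A hedged sketch of a reusable PETL stored procedure as described above; the snapshot table, source table (hs_enroll_subscribers), and the delete-then-reload refresh rule are assumptions for illustration.

    -- Idempotent daily snapshot load (hypothetical HSEnroll source).
    CREATE PROCEDURE dbo.usp_LoadSubscriberSnapshot
        @AsOfDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Refresh one day's slice: delete any prior load, then reload.
        DELETE FROM dbo.subscriber_snapshot WHERE as_of_date = @AsOfDate;

        INSERT INTO dbo.subscriber_snapshot (as_of_date, subscriber_id, cell_number, status)
        SELECT @AsOfDate, s.subscriber_id, s.cell_number, s.status
        FROM   dbo.hs_enroll_subscribers AS s;
    END;

The delete-then-insert pattern keeps reruns safe: calling the procedure twice for the same date yields the same result.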
Environment: MS Office, Power Pivot Excel, Access, SQL Server 2008, SAS, SSRS, SSAS, SSMS, SSIS, T-SQL, Windows 7, Oracle 11g.
Confidential, Richmond, VA
Data Analyst
Responsibilities:
- Analyze the client data and business terms from a data quality and integrity perspective.
- Perform root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
- Worked to ensure high levels of data consistency between diverse source systems including flat files, XML and SQL Database.
- Develop and run ad hoc data queries from multiple database types to identify system of records, data inconsistencies, and data quality issues.
- Involved in translating the business requirements into data requirements across different systems.
- Involved in understanding the customer needs with regards to data, documenting requirements, developing complex SQL statements to extract the data and packaging/encrypting data for delivery to customers.
- Performed extensive data cleansing during the ETL extraction and loading phases by analyzing the raw data, writing SAS programs, and creating complex reusable macros.
- Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems (a sketch follows this list).
- Worked with Data Modeling team to create Logical/Physical models for Enterprise Data Warehouse.
- Reviewed normalized/denormalized schemas for effective and optimal performance tuning of queries and data validations in OLTP and OLAP environments.
- Identified the database tables for defining the queries for the reports using SSRS.
- Worked on data analysis using SQL, T-SQL, and other query-based applications.
- Experienced in conducting JAD Sessions.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Developed data mapping documents between Legacy, Production, and User Interface Systems.
- Documented data content, data relationships and structure, and processed the data using Informatica PowerCenter Metadata Exchange.
- Transferred data objects and queries from MS Access to SQL Server.
- Assisted ETL team to define Source to Target Mappings.
- Worked with Data Architect in Designing the CIM Model for Master Data Management.
- Compile and generate reports in a presentable format for the project team.
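A minimal sketch of the stored-procedure-and-view work plus the kind of reconciliation query used in migration testing; all object names (v_active_loans, legacy.loans) are hypothetical.

    -- Reporting view over the production loans table (hypothetical).
    CREATE VIEW dbo.v_active_loans AS
    SELECT loan_id, borrower_id, principal_amount, origination_date
    FROM   dbo.loans
    WHERE  loan_status = 'ACTIVE';
    GO

    -- Migration spot-check: keys present on one side but not the other.
    SELECT l.loan_id AS legacy_id, p.loan_id AS prod_id
    FROM   legacy.loans AS l
    FULL OUTER JOIN dbo.loans AS p ON p.loan_id = l.loan_id
    WHERE  l.loan_id IS NULL OR p.loan_id IS NULL;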
Environment: Informatica, XML, MS Office, VBA Excel 2013, Access, SSRS, SQL Server 2008, Windows 2000, Oracle 10g.
Confidential, New York, NY
Data Analyst
Responsibilities:
- Involved in business and data analysis during requirements gathering.
- Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
- Performed segmentation to extract data and create lists to support direct marketing mailings and marketing mailing campaigns.
- Defined data requirements and elements used in XML transactions.
- Reviewed and recommended database modifications
- Analyzed and rectified data in source systems and Financial Data Warehouse databases.
- Generated and reviewed reports to analyze data using different Excel formats
- Documented requirements for numerous ad hoc reporting efforts
- Troubleshot, resolved, and escalated data-related issues and validated data to improve data quality.
- Designed, developed, and implemented two professionally finished systems for tracking IT requests and providing a data repository for reports. Documented all system functionality.
- Participated in testing of procedures and data, utilizing PL/SQL, to ensure integrity and quality of data in data warehouse.
- Performed metrics reporting, data mining, and trend analysis in a help desk environment using Access
- Gather data from the Confidential Help Desk Ticketing System and write ad hoc reports, charts, and graphs for analysis.
- Defined report layouts and wrote queries for drill down reports and identified and included report parameters using SSRS.
- Identify and report on various computer problems within the company to upper management
- Report on emerging trends to identify changes or trouble within the systems, using Access and Crystal Reports.
- Guide, train and support teammates in testing processes, procedures, analysis and quality control of data, utilizing past experience and training in Oracle, SQL, Unix and relational databases.
- Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
- Used T-SQL to query the SQL Server database for data validation and data conditioning (a sketch follows this list)
- Created cases for settlement claims analysts by printing documents from a website that maintains settlement claim data, compiling documents, and updating settlement claim data in the database system
- Modified user profiles, including changing users' cost center locations and changing users' authority to grant monetary amounts to certain departments; these amounts were part of the overall budget granted per department
- Deleted users from cost centers, deleted users' authority to grant certain monetary amounts to certain departments, and deleted certain cost centers and profit centers from the database
- Created a report, using the SAP reporting feature, that showed which users had not scanned journal voucher documents into the system.
- Created Excel pivot tables showing users who had not scanned journal voucher documents; users could find documents by double-clicking a name within the pivot table
- Load new or modified data into back-end Oracle database.
- Optimized and tuned several complex SQL queries for better performance and efficiency.
- Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
- Worked on issues with migration from development to testing.
- Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling the data.
- Validated cube and query data from the reporting system back to the source system.
- Tested analytical reports using Analysis Studio
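Illustrative T-SQL validation and conditioning queries of the kind referenced above; the helpdesk_tickets table and its columns are hypothetical.

    -- Validation: tickets with impossible date ranges.
    SELECT ticket_id, opened_date, closed_date
    FROM   dbo.helpdesk_tickets
    WHERE  closed_date < opened_date;

    -- Conditioning: trim stray whitespace and normalize empty strings to NULL.
    UPDATE dbo.helpdesk_tickets
    SET    category = NULLIF(LTRIM(RTRIM(category)), '');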
Environment: SAS/BASE, SAS/Access, SAS/Connect, Informatica PowerCenter (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL*Loader, T-SQL, SSRS, Oracle, SQL Server 2000, Windows 2000, TOAD.
