Sr. Data Analyst Resume
Long Beach, CA
SUMMARY
- 8 years of IT experience in the field of Business & Data analysis, ETL Development, ETL Testing and Data Modeling.
- Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, Master Data Management (MDM), Metadata Management Services, Reference Data Management (RDM) and Configuration Management.
- Strong understanding of project life cycle and SDLC methodologies including RUP, RAD, Waterfall and Agile.
- Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings.
- Experience in Mapping ICD 9 codes with corresponding ICD 10 Diagnosis and Procedure Codes.
- Strong experience in interacting with Stakeholders/Customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying, and analyzing risks using appropriate templates and analysis tools.
- Strong working experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL).
- Good understanding of SQL Server Architecture in terms of ETL and query processing, structure and performance and table partitions.
- Strong expertise in ETL, Data warehousing, Operations Data Store (ODS) concepts, data marts and OLAP technologies.
- Knowledge in the ETL (Extract, Transform and Load) of data into a data warehouse/data mart and Business Intelligence (BI) tools like Business Objects modules (Reporter, Supervisor, Designer, and Web Intelligence).
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using ETL tools such as Informatica PowerCenter.
- Strong experience in using Informatica toolset (Informatica Data Explorer, and Informatica Data Quality) to analyze legacy data for Data Profiling.
- Experience in Data Modeling, creating Star and Snowflake schemas, FACT and Dimension tables, and Physical and Logical Data Modeling using Erwin.
- Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad and MultiLoad.
- Expertise in Master Data Management, Meta Data, Informatica Business Glossary & Data Quality.
- Very good understanding of BI visualization tools (Tableau, QlikView, MicroStrategy).
- Worked on Launching the Master Data Management (MDM) initiative after gaining business buy-in, by defining and prioritizing Enterprise Master Data Entities based on industry criteria including their core attributes through the Data Governance Process.
- Experience in creating Views, Stored Procedures, DDL/DML Triggers and User-Defined Functions to implement Business Logic and Data Protection.
- Experienced in automation, Unit, integration, and performance testing.
- Participated in Design walk-through with SMEs to baseline the business architecture.
- Extensively experienced working on Health Care Reform projects such as Health Insurance Exchange (HIX), ICD 10 Remediation and HIPAA 5010 Implementation.
- Extensive knowledge of Medical Management Information Systems (MMIS), National Provider Identification (NPI), Health Insurance Portability & Accountability Act (HIPAA) standards, Electronic Data Interchange (EDI), Health Level 7 (HL7), HIE (Health Information Exchange), EMR/EHR, Health Care Reform and the Patient Protection and Affordable Care Act (PPACA).
- Hands on Experience on Customer Churn, Sales Forecasting, Market Mix Modeling, Customer Classification, Survival Analysis, Sentiment Analysis, Text Mining, Recommendation Systems.
- Extensive hands-on experience and high proficiency in writing complex SQL, including stored procedures, triggers, joins and subqueries, and used MongoDB for data extraction.
- Experience with statistical programming languages such as R and Python.
- Experienced in Time series analysis to create a forecasted portfolio using R.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning, Data Mining solutions to various business problems and generating Data Visualizations using R and Python.
- Experienced working with Excel to analyze the data based on business needs.
- Knowledge and experience in SAS to analyze data based on business needs.
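As an illustration of the data profiling and data quality checks described above, a minimal sketch in Python with pandas; the table and column names and sample data are hypothetical, not from any actual engagement:

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile: dtype, null count, and distinct count."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "nulls": df.isna().sum(),
        "distinct": df.nunique(),  # counts distinct non-null values
    })

# Hypothetical member extract standing in for a real legacy source table.
members = pd.DataFrame({
    "member_id": [1, 2, 3, 3],
    "dob": ["1980-01-01", None, "1975-06-30", "1975-06-30"],
})
report = profile(members)
```

A profile like this is typically the first pass over a legacy source: null counts flag incomplete columns, and a distinct count lower than the row count flags duplicate keys.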
TECHNICAL SKILLS
Programming Languages: Python, R, SQL, MS-Excel, SAS, Weka, Orange
Querying languages: SQL, DB2, Teradata, Snowflake
Visualization: Tableau, Seaborn, Power BI, Excel charts, ggplot2, Matplotlib
IDE Tools: Jupyter, PyCharm, Visual Studio
Project Management: JIRA, Rally, SharePoint
SDLC Methodologies: Agile, Scrum, Waterfall
Databases: Oracle, DB2 and MS SQL.
ETL tools: SSRS, SSIS, Data Transformation Services
BI Tools: Informatica Power Center (9.x/8.1/7.x/6.x/5.x), SSIS, SSRS
PROFESSIONAL EXPERIENCE
Confidential, Long Beach, CA
Sr. Data Analyst
Responsibilities:
- Integrated various systems with HEDIS and created designs for HEDIS and other systems to pull data into HEDIS.
- Attended a two-day Final Design Review Meeting with HIX, PCG, CMS, Deloitte, HSD (Human Services Department) to see the progress of the project.
- Worked as a Data Analyst while building the US Health Care Public Exchange (HIX) & WEM, as well as for the respective implementation carved out of the Federal Exchange product.
- Involved in reviewing BRD, URS and Functional Requirements for MDM Business and Data Quality Requirements for Health Insurance Exchange (HIX) and Health information exchange (HIE) system.
- Involved with Data Analysis, primarily identifying Data Sets, Source Data, Source Metadata, Data Definitions and Data Formats.
- Involved in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying Data Mismatch.
- Manipulating, cleansing & processing data using Excel, Access and SQL.
- Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.
- Performed data analysis and documented Source to Target mappings, Data quality checks, and Data sharing agreements and documents.
- Develop Logical and Physical data models that capture current state/future state data elements and data flows using Erwin.
- Wrote Test cases for Enterprise Data Warehousing (EDW) Tables and Data Mart Staging Tables.
- Review the logical model with Business users, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
- Led the development of a training program to train users on a custom web application and a Cognos ad-hoc reporting environment.
- Generated Use Case diagrams, Sequence diagrams, Business Object and Domain Object models in Visio to depict process flows, and prepared PowerPoint presentations.
- Working with UI & UX team to define wireframes and visuals reflecting the PRD requirements.
- Responsible for ensuring HIPAA EDI Trading Partner transactions meet established standards and can be transmitted and processed.
- Performed analysis and impact analysis for ICD-10 and prepared the corresponding BRD.
- Worked on EDI transactions such as 834, 270/271, 276/277 and 835/837, and on Use Cases for batch processing of EDIs.
- Drive modification to any EDI transactions maps as required by Trading Partners.
- Involved in writing use cases based on HIPAA standards.
- Responsible for creation of Collaboration Diagrams, Activity Diagram, Project Flow Diagram in MS Visio.
- Responsible for extracting specific data sets from the Operational Data Store (ODS) using PL/SQL queries for the data cleansing process and new feed processing.
- Involved in ETL specification design for incoming feeds from various data sources and for data manipulation on them.
- Involved in creating and designing reports framework with dimensional data models for the data warehouse and worked with the development team on SQL Server 2005 tools like Integrations Services and Analysis Services.
- Performed extensive analysis of HIPAA rules to incorporate into the development of utility applications that were not HIPAA compliant.
- Performed data auditing, validation and cleansing as well as metadata mapping and migrations.
- Involved in migration of various objects, such as stored procedures, tables and views, from various data sources to SQL Server.
- Extensively used PL/SQL tables and bulk loader programs for processing data and loading it into Oracle tables.
- Conducted meetings with the Data Conversion team to provide source-to-source staging data requirements and definitions for mapping purposes.
- Conducted Data Cleansing for migrating data from the legacy system to the new data warehouse, and performed data conversion validation testing to ensure seamless conversion of data from the current system to the new system.
- Involved in data conversions and extracted data using SSIS packages (ETL) to transfer data from different server locations and heterogeneous sources such as Oracle, Excel, CSV, flat file, XML and text format data.
- Worked on SQL queries to retrieve data from Oracle for white box testing.
- Created SQL queries to validate data between source and target systems.
- Experienced in writing complex SQL queries for extracting data from multiple tables.
- Used SQL queries to verify and validate that data populated in the front end is consistent with the back end, using Insert, Update, Aggregate and Join queries.
- Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
- Wrote detailed specifications for various reports generated using Cognos.
- Proposed solutions to reporting needs and developed prototypes using SQL and Business Objects that address those needs.
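The source-to-target validation queries described above follow a common pattern: compare row counts, then find keys that did not survive the load. A minimal sketch, using an in-memory SQLite database as a stand-in for the actual source and target servers (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL);
    INSERT INTO src_claims VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_claims VALUES (1, 100.0), (2, 250.5), (3, 75.0);
""")

# Count validation: row totals must match after the load.
src_count = conn.execute("SELECT COUNT(*) FROM src_claims").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_claims").fetchone()[0]

# Row-level validation: keys present in source but missing from target.
missing = conn.execute("""
    SELECT claim_id FROM src_claims
    EXCEPT
    SELECT claim_id FROM tgt_claims
""").fetchall()
```

On real servers the same two queries run against the staging and warehouse schemas; an empty `missing` result and matching counts are the pass criteria.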
Environment: Tableau Desktop/Server, Python, MS SQL Server, Erwin.
Confidential, Alpharetta, GA
Data Analyst
Responsibilities:
- Expertise in configuring the instance in the eThority tool as per the Configuration Acceptance document approved by the Business Analyst and Activation Manager.
- Experience in configuring the CORE/IRS instance, which involves data files with single or multiple feeds and single or multiple control groups, while preparing an Instance Verification document for all CORE/IRS instances. Each instance has a different version of the specification file, created by the Business Analyst per the client requirement, used to check employee eligibility for the 1095-C form.
- Experience in modifying the instance and purging (i.e., Clearing the data from the database) the data from the data warehouse as required by the client to get the clean validation pass for the files.
- Analyzed data by performing the Extraction, Transformation & Load functions specified by clients to produce accurate feedback, including the causes of warnings and failures, and explained the cause of record loss when data moves from the import table to the transformation table.
- Analyzed problems with clients' data by verifying file contents in the data warehouse, providing accurate feedback with the exact cause of the problem and its fix. Analyzed data in SQL Server by running SQL queries and troubleshot issues related to clients' data. Created a PPACA document per the client requirement.
- Worked on cleaning, exploring, and manipulating source data and transforming it to the target system using Python and tools such as Pandas, NumPy and Matplotlib.
- Good experience in data mining: collecting, searching through, and analyzing large amounts of data in a database to discover patterns or relationships. Good experience in transforming EMR (Electronic Medical Record) data into a database.
- Experience supporting the SME (Subject Matter Expert) review performed by the Business Analyst and Activation Manager by thoroughly verifying and updating the questionnaire list and informing them of the actions needed to resolve issues.
- Experience performing regression testing: finding loopholes or bugs in the tool by building test data per the test cases, which is then sent to the development team to develop a software patch that fixes the issue. Experience working with large amounts of data, with the ability to see through the data and draw conclusions. Experience working as a Quality Analyst reviewing tasks done by the QA ETL Tester, and mentoring and training new joiners on the project.
- Ability to present findings and translate data into understandable documents; to write and speak clearly, communicating complex ideas easily; and to examine numbers, trends, and data to reach new conclusions based on the findings.
- Ability to understand current business processes and implement efficient business processes.
- Expertise in defining scope of projects based on gathered Business Requirements including documentation of constraints, assumptions, business impacts & project risks. Strong background in support documentation. Analysis and review of Business Requirement Documents.
- Conducting requirement gathering sessions, feasibility studies and organizing the business requirements in a structured way.
- Gathering business and technical requirements that would best suit the needs of the technical architectural development process.
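The Pandas-based cleaning and transformation step mentioned above commonly reduces to dropping duplicates and normalizing fields before the load. A hedged sketch; the column names, rules, and sample rows are illustrative, not actual client data:

```python
import pandas as pd

# Hypothetical raw feed with an exact duplicate row and inconsistent formatting.
raw = pd.DataFrame({
    "emp_id": [101, 101, 102],
    "name": [" Alice ", " Alice ", "Bob"],
    "plan_code": ["hmo", "hmo", "PPO"],
})

clean = (
    raw.drop_duplicates()                                  # remove exact duplicate rows
       .assign(
           name=lambda d: d["name"].str.strip(),           # trim stray whitespace
           plan_code=lambda d: d["plan_code"].str.upper(), # normalize code values
       )
       .reset_index(drop=True)
)
```

Keeping each rule as a separate `assign` step makes the cleansing logic easy to map back to the rules documented in the specification file.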
Environment: eThority (BI Tool), SQL Server, MS-Office, Tableau, Teradata, SQL, WinRunner.
Confidential, Buffalo, NY
Data Analyst
Responsibilities:
- Working closely with the BA Teams (Business and Technology) to ensure they have captured all the requirements correctly.
- Conducting JAD sessions, writing meeting minutes, collecting requirements from business users and analyze based on the requirements.
- Developing Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management (MDM) architecture involving OLTP, ODS and OLAP.
- Performing transformations on files received from clients to be consumed by SQL Server.
- Involved in defining the source to target data mappings, business rules, and data definitions.
- Performing data profiling on various source systems that are required for transferring data to ECH using Informatica Analyst tool.
- Responsible for defining the key identifiers for each mapping/interface, as well as the functional requirement documents for each source-to-target interface.
- Providing source-to-target mappings to the ETL team to perform initial, full, and incremental loads into the target data mart.
- Performing Data Profiling, Cleansing, Integration and extraction using tools such as Informatica.
- Working with users to identify the most appropriate source of record and profile the data required for sales and service.
- Have experience in Dimensional Modeling using Snowflake schema methodologies for Data Warehouse and Integration projects.
- Working closely with the ETL (SSIS, SSRS) developers to explain complex data transformation logic.
- Generating comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
- Worked on processes to transfer/migrate data from AWS S3, relational databases, and flat files in various formats into common staging tables and from there into meaningful data in Snowflake.
- Did extensive data mining to find relevant features in an anonymized dataset using R and Python.
- Strong experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Defining the list codes and code conversions between the source systems and the data mart using Reference Data Management (RDM).
- Worked extensively with Business and IT in understanding and documenting system requirements pertaining to the Credit Card Authorization Project in the e-commerce department.
- Performed data mapping of vendor data, items, and purchase orders, as specified by business.
- Worked with the Scrum Master in implementing Agile SDLC processes, including the ticketing system, burn up/down charts, scrum sessions, and sprint and product backlogs.
- Centralized Project Management and Business Analysis capabilities into one team to ensure consistency of IT processes and SDLC deliverables across all IT Solution Delivery teams.
- Functioned as a primary point of contact between IT and the leaders of each of the business units.
- Conducted JAD Sessions, interviewed Subject Matter Experts (SMEs) and Stakeholders, asking detailed functionality aspects of the business process.
- Performed data mapping, optimized data collection procedures, and generated reports on a weekly, monthly, and quarterly basis.
- Performed GAP Analysis of business rules, business and system process flows, user administration, and requirements. Ensured Point of Sale (POS) PCI compliance requirements were met.
- Performed Feasibility, Adaptability and Risk Analysis to identify the business critical and high-risk areas of the application.
- Experience in writing and refining user stories.
- Communicated with design, development, and testing team.
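One way the source-to-target mappings and cleansing rules defined above get applied in code is sketched below. The source fields, target names, and per-field rules are hypothetical stand-ins for a real mapping document:

```python
import csv
import io

MAPPING = {            # source column -> target column (hypothetical)
    "cust_nm": "customer_name",
    "acct_no": "account_number",
}
RULES = {              # target column -> cleansing rule (hypothetical)
    "customer_name": str.title,
    "account_number": str.strip,
}

# In-memory stand-in for a client flat file.
source = io.StringIO("cust_nm,acct_no\nJOHN DOE, 12345 \n")

rows = []
for rec in csv.DictReader(source):
    out = {MAPPING[k]: v for k, v in rec.items()}  # rename per the mapping
    for col, rule in RULES.items():
        out[col] = rule(out[col])                  # apply cleansing rules
    rows.append(out)
```

Driving the transform from two small dictionaries keeps the code in step with the source-to-target mapping spreadsheet that the ETL team receives.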
Environment: MS Excel, POS.
Confidential, Longview, WA
Data Analyst
Responsibilities:
- Performed data profiling in the source systems required for the New Customer Engagement (NCE) data mart.
- Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
- Manipulating, cleansing & processing data using Excel, Access, and SQL.
- Responsible for loading, extracting and validation of client data.
- Good experience in data mining with regard to converting raw data into useful information by using software to look for patterns in large batches of data.
- Built and analyzed datasets using R, MATLAB, and Python (in decreasing order of usage).
- Liaising with end users and 3rd-party suppliers; analyzing raw data, drawing conclusions and developing recommendations; writing SQL scripts to manipulate data for data loads and extracts.
- Developing data analytical databases from complex financial source data. Performing daily system checks.
- Data entry, data auditing, creating data reports & monitoring all data for accuracy. Designing, developing and implementing new functionality.
- Monitoring the automated loading processes. Advising on the suitability of methodologies and suggesting improvements.
- Involved in defining the source to target data mappings, business rules, business and data definitions. Responsible for defining the key identifiers for each mapping/interface.
- Responsible for defining the functional requirement documents for each source to target interface.
- Document, clarify, and communicate change requests with the requestor and coordinate with the development and testing teams. Reverse engineered all the source databases using Embarcadero.
- Coordinate with business users to design new reporting needs in an appropriate, effective and efficient way based on the existing functionality.
- Document data quality and traceability documents for each source interface.
- Designed and implemented data integration modules for Extract/Transform/Load (ETL) functions.
- Involved in Data warehouse and Data mart design. Experience with various ETL, data warehousing tools and concepts.
- Worked with internal architects, assisting in the development of current and target state data architectures.
- Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
- Assessed the impact of low-quality and/or missing data on the performance of the data warehouse client and identified design flaws in the data warehouse.
- Designed application components in an Agile environment utilizing a test driven development approach. Created and maintained project tasks and schedules.
- Provided task estimates, identified potential problems and recommended alternative solutions.
- Worked in close cooperation with project managers and other functional team members to form a team effort in development. Collaborated with other members of the product development team.
- Coordinated configuration of back-end components in support of application development.
Environment: SQL/Server, Oracle10 &11g, MS-Office, Netezza, Teradata, MDM, Informatica Data Quality, ER Studio, TOAD, Business Objects, Microstrategy, SAP, Greenplum Database, Qlikview, PowerPivot, Selenium, SoapUI, CruiseControl.Net, HP Quality Center 10, Maven, PL/SQL, OBIEE, Cognos.
Confidential
Data Analyst
Responsibilities:
- Designed Data Stage ETL jobs for extracting data from heterogeneous source systems, transform and finally load into the Data Marts.
- Strong knowledge of P & C industry and internal business operations.
- Extensive experience in implementing BI solutions in the Healthcare, Banking, Health Insurance and P&C industries using the MSBI stack (SSIS, SSAS, SSRS, and Power BI).
- Broad Business Analysis and Testing experience that includes fully integrated ERP Systems that crosses P & C and Health Insurance.
- Responsible for documenting custom-software requirements related to ePolicy's P&C policy administration system (RightRisk) as implemented at its customer site.
- Created custom reports via Excel, SQL, QLIK, and electronic dashboards for the P&C division.
- Involved in managing a data modeling project from logical design through implementation of a Sybase database.
- Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping.
- Designed the business requirement collection approach based on the project scope and SDLC methodology.
- Used MS Visio for Process modeling, Process mapping and Business Process flow diagrams.
- Expertise in development of High-level design, Conceptual design, and Logical and Physical design for Database, Data Warehousing and many Distributed IT systems.
- Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
- Led Business Intelligence report development efforts by working closely with the MicroStrategy, Teradata, and ETL teams.
- Designed, developed, implemented and rolled out MicroStrategy Business Intelligence applications.
- Reported and analyzed all application defects, user issues and resolution status to higher management using Mercury Test Director.
- Prepared Test plans which include an introduction, various test strategies, test schedules, QA team’s role, test deliverables, etc.
- Responsible for writing test cases to cover overall quality assurance using Test Director.
- Performed initial manual testing of the application as part of sanity testing.
- Performed various tests, such as positive and negative, to check business functionality manually.
- Designed and developed use cases, activity diagrams, and sequence diagrams using UML.
- Created the business process model using MS Visio for better understanding of the system and presented it to Project Manager and other team members for validation.
- Established traceability matrix using Rational Requisite Pro to trace completeness of requirements in different SDLC stages.
- Used Query Analyzer and Execution Plan to optimize SQL queries.
- Enabled proper selection of hash table design parameters for faster table lookups.
- Created re-usable components using shared containers for local use or shared use.
- Imported and exported repositories across projects.
- Created Error Files and Log Tables containing data with discrepancies to analyze and re-process the data.
- Collaborated with the development team to enforce the implementation of requirements throughout the entire coding cycle and managed change request using Rational Clear Quest.
- Developed business process models in RUP to document existing and future business processes.
- Troubleshot the designed jobs using the Data Stage Debugger.
- Created job schedules to automate the ETL process.
- Enhanced the job properties for performance tuning.
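The error-file/log-table pattern noted above, where rows with discrepancies are captured for re-processing instead of aborting the load, can be sketched as follows; the field names and validation rule are hypothetical:

```python
def load_with_rejects(rows):
    """Split incoming rows into loaded records and rejects with an error reason."""
    loaded, rejects = [], []
    for row in rows:
        try:
            # Rule (illustrative): amount must parse as a non-negative number.
            amount = float(row["amount"])
            if amount < 0:
                raise ValueError("negative amount")
            loaded.append({**row, "amount": amount})
        except (KeyError, ValueError) as exc:
            # Keep the original row plus the reason, for the error file / log table.
            rejects.append({**row, "error": str(exc)})
    return loaded, rejects

loaded, rejects = load_with_rejects([
    {"id": "1", "amount": "10.5"},
    {"id": "2", "amount": "oops"},
    {"id": "3", "amount": "-4"},
])
```

Writing the reject list to a log table preserves the discrepant data for analysis and re-processing once the source issue is fixed.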
Environment: SAP SD, SQL Server, MS Access, SDLC, RUP, P&C Enterprise Applications, UML, UAT, XML Files, MVS, IMS.