Sr. Data Analyst Resume

Lewisville, TX

SUMMARY

  • Data Analyst with 8+ years of professional experience and expertise in Business Requirements, Functional Specifications, Business Process Flow, Business Process Mapping, data analysis, data mining, reporting dashboards, metrics, systems analysis, programming, testing, Project Management, and application development benchmarking in the Healthcare/Life Sciences industry, including EMR/EHR, Claims, HIPAA and EDI.
  • Experience as a Data Analyst for data warehousing, with good knowledge and expertise in all phases of the SDLC, including analysis, design, development, testing and maintenance.
  • Excellent knowledge of Software Development Life Cycle (SDLC) methodologies such as Agile/Scrum and Waterfall, as well as UML and project management methodologies.
  • Expertise in the development and design of ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter.
  • Knowledge in Health Administration such as Claims processing (auto adjudication), COB, HIPAA, enrollment, EDI, Medicare, Medicaid, CDHP (consumer driven health plans).
  • Expertise in providing an Enterprise Level Data Management Strategy and Framework encompassing Data Governance, Data Lifecycle, Data Quality Improvement, Decision Support, Master Data Management (MDM), and Metadata Management.
  • Expertise in creating and developing rich, polished Informatica and Power BI dashboards.
  • Strong working experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL).
  • Extensively used ETL methodology for the Data Extraction, transformation and loading process in a corporate-wide ETL Solution using Informatica.
  • Worked on different EDI healthcare transactions like 837-Institutional, 837-Professional, 837-Dental, 835-Claim Payment/Remittance Advice, 270/271-Eligibility Benefit Inquiry/Response, and 276/277-Claim Status Inquiry/Response Transactions.
  • Very good understanding of gathering MDM requirements from the business users and implementing them.
  • Knowledge in enterprise data modeling and analytics (SSRS, SSAS, SSIS, ETL, MDM, Entity Data Models and data warehousing) incorporating transaction-oriented business/service level data processes and metadata collection techniques to enhance informatics strategies.
  • Good knowledge of Health Insurance Plans and managed care concepts (Medicaid and Medicare); experienced in determining membership eligibility and in life and disability billing within health plans, with a thorough understanding of CPT coding, CMS-1500 claim forms and reimbursement forms.
  • Strong knowledge of HL7 standards and HIPAA (Health Insurance Portability and Accountability Act).
  • Experience in Data Analysis, Data Validation, Data modeling, Data Cleansing, Data Verification and identifying data mismatch.
  • Excellent Knowledge in Electronic Medical Record (EMR) / Electronic Health Records (EHR) modules and process flow.
  • Strong experience in Data Analysis, Data Profiling, Data Cleansing & Quality, Data Migration, Data Integration.
  • Highly experienced in creating SSIS packages to load data into the Data Warehouse using various SSIS tasks such as the Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task and XML Task, along with various transformations.
  • Experience in installing and configuring Reporting services and assigning permissions to different levels of users in SSRS.
  • Extensive knowledge of Tableau, used during projects to help with building, designing and web analytics.
  • Experience working on Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling and Data Validation in ETL.
  • Proficient in performing data extraction, transformation and loading (ETL) between systems using SQL tools such as SSIS; a brief sketch of this pattern follows this summary.
  • Proficient in developing Use Case Models, Analysis Models, Design Models, Implementation Models, Use Case Diagrams, Behavior Diagrams (Sequence diagrams, Collaboration diagrams, State chart diagrams, Activity diagrams) and Class Diagrams based on UML using Rational Rose.
  • Exploited the power of Teradata to solve complex business problems through data analysis on large data sets.
  • Experienced in conducting JAD Sessions.
  • Experience in defining events within Business Process using Workflows.
  • Highly proficient in creating ETL packages with different data sources (SQL Server, flat files, Excel source files, XML files, etc.) and loading the data into destination tables by performing different kinds of transformations using Informatica PowerCenter and SSIS.
  • Experienced in SQL with good knowledge of PL/SQL programming, including developing Stored Procedures and Triggers; also worked with DataStage, DB2, UNIX, Cognos, MDM, Hadoop, Hive and Pig.
  • Strong knowledge of Extraction Transformation and Loading (ETL) processes using UNIX shell scripting, SQL and SQL Loader.
  • Experience in building and publishing customized interactive reports and dashboards, and scheduling reports using Tableau Server.
  • Excellent analytical, interpersonal and communication skills with a clear understanding of business process flows.
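
As a brief illustration of the flat-file ETL pattern referenced above, below is a minimal Python sketch, not a production implementation; the file name, table name and connection string are hypothetical, and pandas with SQLAlchemy stands in here for what a packaged SSIS or PowerCenter job would do.

    # Minimal ETL sketch: extract a flat file, apply light transforms, load a staging table.
    # All names (claims.csv, stg_claims, the connection string) are hypothetical.
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("oracle+cx_oracle://user:pass@db-host:1521/?service_name=orcl")

    # Extract: read the source flat file as strings to avoid silent type coercion
    df = pd.read_csv("claims.csv", dtype=str)

    # Transform: trim stray whitespace and standardize a date column
    df = df.apply(lambda col: col.str.strip())
    df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")

    # Load: append the cleansed rows into a staging table
    df.to_sql("stg_claims", engine, if_exists="append", index=False)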

PROFESSIONAL EXPERIENCE

Confidential - Lewisville, TX

Sr. Data Analyst

Responsibilities:

  • Worked with business requirements analysts/subject matter experts to identify and understand requirements.
  • Conducted user interviews and data analysis review meetings; defined key facts and dimensions necessary to support the business requirements along with the Data Modeler.
  • Created draft data models to aid understanding and to assist the Data Modeler.
  • Resolved data-related issues such as assessing data quality, consolidating data and evaluating existing data sources.
  • Provided support to Data Architect and Data Modeler in Designing and Implementing Databases for MDM using ERWIN Data Modeler Tool and MS Access.
  • Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
  • Created MDM, data architecture, analytical datamarts, and cubes optimized for reporting.
  • Involved in Design and development of MDM data model and integration between MDM and WebSphere Process Server.
  • Designed STAR Schemas for the detailed data Marts and plan Data Marts involving Shared Dimensions.
  • Worked on raw data migration to Amazon cloud into S3 and performed refined data processing.
  • Parsed data from S3 via Python API calls through Amazon API Gateway, generating batch sources for processing (see the sketch after this list).
  • Created a road map for the client for the planning, developing, and implementing of MDM solutions, enabling consolidation of MDM data following Mergers and Acquisitions.
  • Prepared and updated the technical requirements document (TRD) for receiving and sending enrollment data via 834 EDI file transfers/HIPAA transactions and membership renewals.
  • Supported both automated and manual data entry to the MDM database.
  • Involved in complete-cycle, end-to-end testing of Termination and Change transactions received via state files through internal databases and front-end customer service applications.
  • Actively participated in monthly production rollouts and regression testing of the portals.
  • Involved in functional requirement and test case development for data flow in the form of XML messages between internal systems for appropriate distribution of data.
  • Worked with team of developers on Python applications for RISK management.
  • Reviewed code for SAS and Python application upgrades and extensions.
  • Developed advanced SQL/SAS queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
  • Contributed to pre and post-sales solutions with MDM solutions, including WebSphere Product Center, WebSphere Customer Center and WebSphere portal.
  • Extracted data from Teradata, Excel and flat files into SAS datasets.
  • Cleaned and processed third-party spending data into workable deliverables in specific formats using Excel macros and Python libraries.
  • Involved in creation of AMIs, performed troubleshooting and monitoring of the Linux server on AWS.
  • Responsible for loading, extracting and validating client data.
  • Worked closely with Data Architect to review all the conceptual, logical and physical database design models with respect to functions, definition, maintenance review and support Data analysis, Data Quality and ETL design that feeds the logical data models.
  • Analyzed the source data coming from various data sources like Mainframe & Oracle.
  • Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
  • Involved in managing, updating and manipulating report orientation and structures with the use of advanced Excel functions including Pivot Tables and VLOOKUPs.
  • Performed extensive data Validation, data Verification against data Warehouse.
  • Validated and profiled flat-file data loaded into Teradata tables using UNIX shell scripts.
  • Used SAS to solve problems with data and analytics.
  • Tested the data using the Logs generated after loading the data into Data warehouse. Prepared Traceability Matrix with requirements versus test cases.
  • Involved in testing EDI transactions according to HIPAA code set 834 for enrollment and disenrollment in a health plan; processed the 270/271, 276/277 and 834 files in the UAT environment using Edifecs Transaction Management to validate the output from the files.
  • Performed onsite Health Plan reviews for Medicaid encounter compliance.
  • Responsible for the design, development and administration of transactional and analytical data constructs/structures.
  • Worked on data quality, data organization, metadata, and data profiling.
  • Involved in working on different layers of the Business Intelligence infrastructure.
  • Worked extensively in data consolidation and harmonization, and in a T-SQL environment to run queries and explore the databases.
  • Worked with OLAP, ETL, data warehousing and modeling tools; used Informatica/SSIS to extract, transform and load data from SQL Server to Oracle databases; met with user groups to analyze requirements and propose changes in design and specifications.
  • Performed data/systems analysis to determine best BI solution (reports, dashboards, scorecards, data cubes, etc.) using Tableau
  • Published data sources and dashboards from Tableau Desktop to Tableau Server, handling user management, security, content restriction, authentication and authorization.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
  • Manipulated, cleansed and processed data using Excel, Access and SQL; performed data validations using SQL Developer.
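
The S3 parsing step above might look like the following hedged sketch, assuming AWS credentials are already configured for boto3; the bucket and prefix are hypothetical, and the API Gateway layer from the original workflow is omitted for brevity.

    # Hedged sketch of pulling raw objects from S3 for batch processing.
    # Bucket and prefix names are hypothetical assumptions.
    import io
    import boto3
    import pandas as pd

    s3 = boto3.client("s3")

    # List raw files under a hypothetical prefix and read each into a DataFrame
    resp = s3.list_objects_v2(Bucket="raw-enrollment-data", Prefix="incoming/834/")
    for obj in resp.get("Contents", []):
        body = s3.get_object(Bucket="raw-enrollment-data", Key=obj["Key"])["Body"].read()
        frame = pd.read_csv(io.BytesIO(body))
        # ...refined processing / batch-source generation would happen here...
        print(obj["Key"], len(frame))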

Confidential - Nashville, TN

Sr. Data Analyst

Responsibilities:

  • Participated in daily Agile Scrum, biweekly Sprint Planning and Retrospective sessions, and updated the team on work status.
  • Gathered and documented business and technical requirements for the development of MDM, Data Warehousing and Reporting implementation roadmaps.
  • Involved in creating dimensional model based on star schemas and designed them using Erwin.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Combined cross-functional technology and business skills with creative vision to create impactful and engaging websites for sale, supporting operational launch and growth across various industries.
  • Extensively used VLOOKUPs, Pivot Tables and Excel functions, and worked on creating pivot graphs, reports and dashboards.
  • Analyzed query results in Excel, created pivot table/chart reports and shared them with MDM team members.
  • Created and updated data mapping document(s) with reference to the source for 270/271 (Eligibility & Benefit Inquiry & Response), 276/277 (Claim Status Inquiry & Response) and 837 (Health Care Claim).
  • Assisted in regular meetings with the technical team and the clients to assess the current process and how it could be changed to propose a better one.
  • Analyzed relational data with SQL to quickly understand existing data stores and to validate source/target mappings and transformation logic.
  • Worked with Facets Team for HIPAA Claims Validation and Verification Process (Pre-Adjudication).
  • Checked for Carriage Return/Line Feeds (CR/LF) in the data stream and cleaned them up before loading the data into the target data storage.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Power Designer to design the business process, dimensions and measured facts.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Worked with Informatica PowerCenter tools such as the Source Analyzer, Mapping Designer, mapplets and transformations.
  • Developed Informatica mappings and tuned for better performance.
  • Worked and extracted data from various database sources like Oracle, SQL Server and Teradata.
  • Developed data variation analysis and data pair association analysis in the Bioinformatics field.
  • Interacted with the ETL, BI teams to understand / support on various ongoing projects.
  • Extensively used MS Excel for data validation.
  • Worked on Dimensional modelling, Data cleansing and Data Staging of operational sources using ETL processes.
  • Performed data pre-processing and split the identified data set into training and test sets.
  • Performed data wrangling to clean, transform and reshape the data using the pandas library; a brief sketch follows this list.
  • Manipulated and mined data from database tables (Oracle, Data Warehouse).
  • Interfaced with other technology teams to load (ETL), extract and transform data from a wide variety of data sources.
  • Reported and dashboarded analytical results to the client (R, Tableau, Excel).
  • Created Tableau scorecards, dashboards using stack bars, bar graphs, scattered plots, geographical maps, heat maps, bullet charts, Gantt charts demonstrating key information for decision making.
  • Utilized a broad variety of statistical and analytical packages such as SAS, R, MLlib, graphing tools and MapReduce, among others.
  • Provided input and recommendations on technical issues to Business & Data Analysts, BI Engineers and Data Scientists.
  • Involved in writing complex SQL queries for validating the data against different kinds of reports generated by Cognos.
  • Regularly accessed the JIRA tool and other internal issue trackers for project development.
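
A minimal sketch of the wrangling and train/test split steps described above; the input file, column names and the 80/20 split are illustrative assumptions.

    # Sketch: clean, reshape, and split a data set with pandas and scikit-learn.
    import pandas as pd
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("claims_sample.csv")                  # hypothetical extract
    df = df.drop_duplicates().dropna(subset=["member_id"])
    df["paid_amount"] = pd.to_numeric(df["paid_amount"], errors="coerce")

    # Reshape: one row per member, one column per month (long to wide)
    monthly = df.pivot_table(index="member_id", columns="month",
                             values="paid_amount", aggfunc="sum")

    # Split the identified data set into training and test sets
    train, test = train_test_split(df, test_size=0.2, random_state=42)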

Confidential - Lemoyne, PA

Data Analyst

Responsibilities:

  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Developed a long-term data warehouse roadmap, and designed and built the data warehouse framework per the roadmap.
  • Heavily involved in a Data Modeling role, reviewing business requirements and composing source-to-target data mapping documents.
  • Involved in the implementation of MDM concept with upstream and downstream integration of different applications in agile environment.
  • Managed functional requirements for interfaces and conversions from other legacy systems to Teradata and MDM, as well as enhancements, workflows and reports for MDM.
  • Worked closely with MDM Informatica Architects to fully understand and "read" ETL processes and process flows for business requirements, and to understand the business rules coded within the ETL load processes.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Translated business requirements into working logical and physical data models for Data warehouse, Datamarts and OLAP applications.
  • Independently coded new programs and designed tables to load and test the programs effectively for the given POCs using Big Data/Hadoop.
  • Worked on health plan operations in enrollment and billing, claims and benefits, network management, medical policy and utilization management.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from NoSQL sources and a variety of portfolios (see the sketch after this list).
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Extensively used Erwin for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Presented the data scenarios via Erwin logical models and Excel mockups to visualize the data better.
  • Developed data mapping, data governance, transformation and cleansing rules for Master Data Management involving OLTP and ODS.
  • Involved in the configuration of Facets Subscriber/Member Application-group.
  • Analyzed the member/eligibility information on the claim to that in Facets.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Used ETL to load data from flat files (Excel/Access) and SQL Server to an Oracle database.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, and translated business rules and functionality requirements into ETL procedures.
  • Responsible for ETL resources in the Basel Project, defining the scope of work based on the requirements and timelines.
  • Implemented standards for creating ETL Workflows/Mappings.
  • Analyzed end-user requirements and was involved in modeling the ETL schema.
  • Designed and developed the Interface Engine for loading data into the framework.
  • Performed current state assessment and translated interface requirements into technical specifications.
  • Coordinated the purchase of interface services from vendors.
  • Acted as the main point of contact for interface validation, ensuring that all required interface functionality was tested appropriately using a dynamic approach and methodical testing habits.
  • Handled custom development requests for interface functionality outside the scope of standard interface templates, including SQL stored procedures.
  • Worked closely with application analyst, interface engineers and vendors to design, test and troubleshoot interfaces.
  • Installed, configured, managed and deployed Big Data solutions and the underlying infrastructure using many Big Data tools.
  • Wrote SQL queries for Oracle and DB2 databases using SQL*Plus.
  • Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
  • Responsible for ETL setup for data transfer and data integration across different applications, RDBMS, flat files and mainframe systems.
  • Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica power center.
  • Involved in training employees on systems use by demonstrations and walkthroughs.
  • Factored in the inputs of various SMEs and senior management to drive the solutions towards the goals.
  • Successfully created the Tableau data extracts using best database approach to support the goals.
  • Worked with a team in multiple locations nationwide and acted as a liaison between business units and IT programmers, gathering requirements used to develop reports.
  • Helped and trained analysts to create visualizations in Tableau and get quick results.
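
The HBase loading step mentioned above might be sketched as follows; happybase is an assumed client choice (the work above does not name one), and the table, column family and row keys are hypothetical.

    # Hedged sketch: create an HBase table and batch-load semi-structured rows.
    # happybase is an assumed client; all names here are hypothetical.
    import happybase

    conn = happybase.Connection("hbase-thrift-host")      # assumed Thrift endpoint
    conn.create_table("member_events", {"d": dict()})     # one column family, "d"

    table = conn.table("member_events")
    with table.batch(batch_size=1000) as batch:
        batch.put(b"member-001|2017-01", {b"d:event": b"enrolled", b"d:plan": b"HMO"})
        batch.put(b"member-002|2017-02", {b"d:event": b"claim", b"d:amount": b"125.40"})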

Confidential - Atlanta, GA

Data Analyst

Responsibilities:

  • Collaborated with cross-functional teams in support of business case development, identifying modeling methods to provide business solutions and determining the appropriate statistical and analytical methodologies to solve business problems within specific areas of expertise.
  • Built models and solved complex business problems where analyses of situations and/or data required in-depth evaluation of variable factors.
  • Generated data models using Erwin 9.6, developed a relational database system and was involved in logical modeling using dimensional modeling techniques such as the Star Schema and Snowflake Schema.
  • Participated in documenting the data governance framework, including processes for governing the identification, collection and use of data to assure accuracy and validity.
  • Performed metrics reporting, data mining and trend analysis in a helpdesk environment using Access.
  • Involved in creating Informatica mappings to populate staging tables and data warehouse tables from various sources such as flat files and DB2, Netezza and Oracle sources.
  • Involved in loading the data from flat files submitted by vendors into Oracle 12c external tables; extensively used ETL to load data from Oracle databases, XML files and flat files, and imported data from IBM mainframes.
  • Used Pivot tables, VLOOKUP and conditional formatting to verify data uploaded to proprietary database and online reporting.
  • Coordinated with Data Architects and Data Modelers to create new schemas and views in Netezza to improve report execution time; worked on creating optimized Data Mart reports.
  • Used both Kimball and Inmon methodologies for creating the data warehouse, and transformed data from different OLTP systems into a Teradata database for analysis, reporting and data mining.
  • Independently worked on owning IT support tasks related to Tableau Reports on Server.
  • Worked with primary subject matter expert in implementing Oracle's self-service OBIEE reporting system. This system allows for real-time data to be accessible at all HR locations throughout the company.
  • Evaluated, identified and solved process and system issues utilizing business analysis and design best practices, and recommended enterprise and Tableau solutions.
  • Used advanced functions like VLOOKUP, Pivots, graphs, and analytical and statistical tool packs in Excel.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Designed Physical Data Model (PDM) using Erwin data modeling tool and ORACLE PL/SQL.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Extensively used SQL Server 2014 tools, and developed Oracle stored packages, functions and procedures for Oracle database back-end validations and web application development.
  • Performed analyses such as regression analysis, logistic regression, discriminant analysis and cluster analysis using SAS programming; a Python analogue is sketched after this list.
  • Worked with various Teradata 15 tools and utilities like Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator, BTEQ and other Teradata utilities.
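
The regression work above was done in SAS; purely as a Python analogue (statsmodels swapped in for the SAS procedure, with a hypothetical input file and columns), a logistic regression fit looks like this:

    # Python analogue of a SAS logistic regression step; data and columns are hypothetical.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("members.csv")
    X = sm.add_constant(df[["age", "num_visits", "chronic_flag"]])
    y = df["readmitted"]                      # 0/1 outcome

    model = sm.Logit(y, X).fit()              # fit the logistic regression
    print(model.summary())                    # coefficients, p-values, fit statistics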

Confidential - Pittsburgh, PA

Data Analyst

Responsibilities:

  • Primary point of contact working with stakeholders to understand their systems, AS-IS process and business needs. Worked as a trusted advisor to identify opportunities for new insights, analytics and reporting solutions.
  • Involved in designing and developing Data Models and DataMarts that support the Business Intelligence Data Warehouse.
  • Utilized corporation developed Agile SDLC methodology. Used Scrum Work Pro and Microsoft Office software to perform required job functions.
  • Executed ETL operations for Business Intelligence reporting solutions using Excel.
  • Used Shared Containers and created reusable components for local and shared use in the ETL process.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access; produced performance reports and implemented changes for improved reporting.
  • Designed Excel templates, created Pivot Tables and utilized VLOOKUPs with complex formulas.
  • Conducted data analysis to answer specific business questions asked by clients and/or support teams, using data to establish facts and draw valid conclusions.
  • Developed data models and performed data mapping across downstream applications, working closely with Architects.
  • Developed Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
  • Evaluated, identified and solved process and system issues utilizing business analysis and design best practices, and recommended enterprise and Tableau solutions.
  • Gained substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu and Microsoft Excel.
  • Wrote PL/SQL procedures for processing business logic in the database, and tuned SQL queries for better performance.
  • Developed SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis, and reporting purposes.
  • Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
  • Created dashboard analyses using Tableau Desktop covering car rental records, including customers, days of rental, service region and travel type.
  • Performed extensive requirement analysis including Data analysis and Gap analysis.
  • Used extracts to better analyze the data; the extracted data source was stored on Tableau Server and updated on a daily basis.
  • Designed and implemented basic SQL queries for QA testing and report/data validation; a brief sketch follows this list.
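
A minimal sketch of the kind of QA/report validation described above: a source-versus-target row-count reconciliation. The connection strings and table names are hypothetical assumptions.

    # Sketch: reconcile row counts between a source table and its warehouse target.
    import pandas as pd
    from sqlalchemy import create_engine

    src = create_engine("oracle+cx_oracle://user:pass@src-host/?service_name=orcl")
    tgt = create_engine("oracle+cx_oracle://user:pass@dw-host/?service_name=orcl")

    src_n = pd.read_sql("SELECT COUNT(*) FROM claims", src).iloc[0, 0]
    tgt_n = pd.read_sql("SELECT COUNT(*) FROM dw_claims", tgt).iloc[0, 0]

    assert src_n == tgt_n, f"Row-count mismatch: source={src_n}, target={tgt_n}"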
