
Sr. Data Analyst Resume


SUMMARY:

  • Confidential is a proactive, results-driven Sr. Data Analyst / Business Analyst with a solid track record of supporting, maintaining, and expanding existing data collection and data delivery platforms.
  • He is well-versed in creating new data collection systems that optimize data management, capture, delivery, and quality. He is a SAS Certified Base Programmer with 11+ years of experience in SAS, data analytics, ETL, and SQL Server. He has significant exposure to enrollment, medical, and Rx claims data and has experience analyzing and validating medical claims, provider, and client/member data.
  • Confidential has excellent verbal and written communication skills and is looking forward to his next project.

TECHNICAL SKILLS:

Languages: SAS, JCL

Operating Systems: Mainframe, Windows XP, Windows 7, UNIX and Linux

Database/Database tools: SQL Server 2008, DB2, Oracle, Teradata, Hadoop, MS Excel, MS Access, Toad for Data Analysts

Special Software: Stat, SAS/Access, SAS/Connect, PuTTY, QMF, Toad, CA7, Change Management & ITIL, Autosys, Panvalet, UltraEdit, WinSCP, SSIS, Excel, Pivot Tables, Tableau, Rally, SAS EG 9.2, SAS EG 9.4

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Data Analyst

Responsibilities:

  • Analysis of various data feeds (30%)
  • Importing large amounts of data (enrollment, medical claims, pharmacy claims) from various healthcare groups into the SCIO SAS system.
  • Creating mappings to correlate the different data elements: converting each carrier’s standard data layout to the SCIO standard for enrollment, medical, and Rx data. Checking the data layout against the data dictionary for each group and ensuring all mandatory fields are present in the input files per the SCIO standard.
  • Document and receive sign-off (20%)
  • Creating the file map document: a registry of all data feeds, file naming conventions, delivery schedules, and SFTP retrieval details.
  • Creating the vendor registry: per-vendor, per-data-feed configuration information, including expected file layout, number of fields, header/footer information, delimiters, and record length. This table captures the rules for transforming the medical data file into separate facility and professional claims warehouses.
  • Creating the data mapping document: captures the list of all fields across all data feeds, field attributes, per-field QC acceptance thresholds, and the mapping logic to fit the SCIO standard layout.
  • Writing, executing, and analyzing SAS jobs (30%)
  • Creating the SAS program that generates SAS metadata for client-specified files.
  • Creating the SAS macro program to replicate the production environment in the development region and test the files.
  • Once the development setup is complete, the following steps are involved:
  • Importing the data set - Validating from the SAS log that the import completed without any errors or warnings.
  • Raw QC - A quality-check process on the imported raw data file: validating field names, lengths, and data against the data dictionary.
  • Mapping the data set - Mapping the client files (file map / vendor registry / data mapping) to the SCIO standard.
  • Map QC - A quality-check process on the mapped fields and imported data; validation is done by running a set of SAS programs.
  • Building the warehouse - Building the SAS warehouse after validation is done. Once the warehouse is approved and signed off internally, it is shared with the client for review.
  • Reviewing the output and sign-off (20%)
  • Monitoring the daily import process, which is driven by a program that leverages the Client File Map, Client Vendor Registry, and import criteria derived from the Client Data Mapping. A Raw QC report is generated automatically once the SAS program completes; Raw QC reports must be reviewed, validated, and signed off before further processing.
  • Monitoring the daily jobs that run the mapping and warehouse build process, which leverages the Client Data Defaults, Client Data Mapping, and Client Code Mapping tables. Once completed, a Map QC report and ETL summary are generated, which must be reviewed and signed off.
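The Raw QC step described above - checking an imported file against a data dictionary for field names, lengths, and mandatory values - can be sketched as follows. This is a Python illustration (rather than SAS, which was used on the project), and the data dictionary contents and field names are hypothetical:

```python
# Sketch of a Raw QC pass: validate imported raw records against a data
# dictionary. Field names and dictionary entries are illustrative only.

DATA_DICTIONARY = {
    # field: (max_length, mandatory)
    "member_id":    (11, True),
    "claim_id":     (15, True),
    "service_date": (8,  True),
    "diag_code_2":  (7,  False),
}

def raw_qc(records):
    """Return a list of (record_index, field, issue) QC findings."""
    findings = []
    for i, rec in enumerate(records):
        for field, (max_len, mandatory) in DATA_DICTIONARY.items():
            value = rec.get(field, "")
            if mandatory and not value:
                findings.append((i, field, "missing mandatory value"))
            elif len(value) > max_len:
                findings.append((i, field, "exceeds max length"))
        for field in rec:
            if field not in DATA_DICTIONARY:
                findings.append((i, field, "not in data dictionary"))
    return findings

sample = [
    {"member_id": "A0001", "claim_id": "C100", "service_date": "20240101"},
    {"member_id": "", "claim_id": "C101", "service_date": "20240102",
     "bad_field": "x"},
]
print(raw_qc(sample))
```

A file passes Raw QC when the findings list is empty; otherwise the findings feed the QC report that is reviewed and signed off before mapping.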

Confidential, CT

Facets Claims processing/Business Analyst

Responsibilities:

  • Accountable for processing claims and provider contracts for medical, ancillary, and ASC entities.
  • Enrolled members and providers in the Facets system.
  • Performed quality assurance of new or existing contracts and configuration to ensure appropriate and correct payment was made, via unit testing.
  • Participated in claims testing, including building and testing member records.
  • Troubleshooting and problem resolution of provider reimbursement utilizing reimbursement policies, methodologies, and standards.
  • Analyzed configuration across Medicaid lines of business.
  • Created and modified queries utilizing Facets data tables.
  • Analyzed configuration across commercial and government lines of business.
  • Researched/planned products and plans for new markets.
  • Networx: modified/updated Medical Agreements via the Medical Agreement Configurator.
  • Manually priced claims by modifying agreements in Networx.
  • Updated/modified Facets fee schedules.
  • Utilized the Facets data model to document, map, and query the data required for the 271 benefit response.
  • Built Medicaid products for new markets. Assisted the configuration team in building SEPYs, modifying service rules, and creating/modifying LTLTs and attaching them to the variable component rows.
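The kind of ad-hoc query run against claims and member tables for configuration and payment verification can be sketched as below. This is an illustrative Python/SQLite mock-up, not the actual Facets schema; the table and column names are simplified stand-ins:

```python
import sqlite3

# Build a tiny in-memory stand-in for claim and member tables. The schema
# below is hypothetical and only echoes the claim/member join idea.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE claim  (clcl_id TEXT, meme_ck INTEGER, clcl_tot_chg REAL);
CREATE TABLE member (meme_ck INTEGER, meme_last_name TEXT);
INSERT INTO claim  VALUES ('C001', 1, 125.00), ('C002', 1, 80.00), ('C003', 2, 40.00);
INSERT INTO member VALUES (1, 'SMITH'), (2, 'JONES');
""")

# Total charges per member - the sort of query used to spot-check whether
# configuration changes produced the expected payment totals.
cur.execute("""
SELECT m.meme_last_name, SUM(c.clcl_tot_chg) AS total_charges
FROM claim c
JOIN member m ON m.meme_ck = c.meme_ck
GROUP BY m.meme_last_name
ORDER BY m.meme_last_name
""")
rows = cur.fetchall()
print(rows)  # [('JONES', 40.0), ('SMITH', 205.0)]
```
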
Confidential

Sr. Data Analyst / Business Analyst

Responsibilities:

  • Actively interacted with customers to understand business requirements and prepare the business documents.
  • Gathered requirements and designed and implemented the TCS (Therapeutic Class Strategies) analytics reporting application.
  • Worked with specialty and non-specialty PDL (prescription drug list) teams to create reports and analyze data for various programs such as step therapy, notification, double copay, and PA programs.
  • Worked on Hep C therapeutic class (Sovaldi and Harvoni) analysis, monitoring, UM (utilization management) savings reports, genotype, SVR12, etc.
  • Worked on deep-dive analysis for different therapeutic classes such as MS, Hep C, HIV, cancer, and cystic fibrosis.
  • Wrote SAS macros to automate the TCS process and created the reporting layer that feeds into Tableau to create dashboards and a self-service application.
  • Used PROC SQL and PROC DS2 to read and manipulate data from Teradata and Hadoop (Hive and Impala) to create the COE provider reporting application.
  • Extensively used PROC DS2 to take advantage of SAS in-database and parallel execution; the threading technique improves the performance of the SAS DATA step process.
  • Wrote the SAS Teradata script to create the data layer and reporting layer for BOB Lago reporting.
  • Assisted the team with any data-related Teradata or SAS questions.
  • Worked with the profiling team, business analysts, and SMEs.
  • Performed data validation and data cleaning using statistical procedures such as PROC FREQ and PROC MEANS.
  • Worked with multiple stakeholders to gather requirements and to schedule and coordinate meetings.
  • Experienced in ad-hoc programming.
  • Added ICD-10 fields to the existing process.
  • Possess a strong ability to quickly adapt to new applications and platforms.
  • Generated reports in HTML, PDF, or RTF format according to client specifications.
  • Made extensive use of PROC SQL to perform queries and join tables.
  • Actively participated in monthly departmental meetings, providing input on how to improve standards.
  • Conducted analysis and generated tables and listings using SAS.
  • Used DATA _NULL_ and PROC REPORT to generate the outputs.
  • Extracted data from mainframe datasets.
  • Read data from SQL databases, tab-delimited files, and other types of raw data files.
  • Tested categorical data with PROC FREQ and produced statistics with PROC MEANS.
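The PROC FREQ / PROC MEANS style of validation - frequency counts on a categorical field, summary statistics on a numeric field - can be sketched as follows. This is a Python illustration of the same checks, with made-up records and field names:

```python
from collections import Counter
from statistics import mean

# Hypothetical records standing in for a claims extract.
records = [
    {"gender": "F", "paid_amt": 100.0},
    {"gender": "M", "paid_amt": 250.0},
    {"gender": "F", "paid_amt": 75.0},
    {"gender": "F", "paid_amt": 125.0},
]

# Categorical frequencies, in the spirit of PROC FREQ.
freq = Counter(r["gender"] for r in records)

# Numeric summary statistics, in the spirit of PROC MEANS.
paid = [r["paid_amt"] for r in records]
stats = {"n": len(paid), "mean": mean(paid),
         "min": min(paid), "max": max(paid)}

print(freq)
print(stats)
```

Unexpected category levels or out-of-range summary statistics are the usual signals of a data-quality problem in the extract.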

Environment: Windows XP, SAS, Unix, Mainframe, File-aid, BMRS, ESP, QMF, WINSCP, Putty

Confidential

Team Lead IDS

Responsibilities:

  • IDS Scheduled Reporting generates reports for the Commercial Lines, Claims, and Wealth Management portfolios using SAS and mainframes. These reports help business customers with decision making and research activities.
  • Reports are generated on a periodic basis to help the actuaries understand business trends. The Business Knowledge Facility (BKF), Hart Source Data Mart (HSDM), and CIDER are repositories of commercial lines data; CCPS, APG, WCRS, and CI contain volumes of claims data.
  • Generated Excel reports by analyzing the data available in the data marts and providing them in the various formats requested by the customer. Once a report is approved, it is scheduled in Autosys/CA7 to be delivered periodically with data for different timelines.
  • Scram, Growth, Capre, and Insurance Score are critical applications handled by IDS-SR; they maintain policy details and provide pricing and score information to downstream systems.
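One small piece of the scheduled delivery described above - stamping each periodic report with its reporting timeline so an Autosys/CA7 job can deliver it repeatedly - can be sketched like this. The naming convention and function are hypothetical, for illustration only:

```python
from datetime import date

def report_filename(portfolio, period_end):
    """Build a date-stamped report name for a scheduled (e.g. monthly) run.

    The <portfolio>_report_<YYYYMM>.xlsx convention is a made-up example of
    how a recurring job can emit a distinct file per reporting period.
    """
    return f"{portfolio}_report_{period_end.strftime('%Y%m')}.xlsx"

print(report_filename("commercial_lines", date(2024, 3, 31)))
# commercial_lines_report_202403.xlsx
```
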

Confidential

Technical Developer

Responsibilities:

  • The Proposal Pricing and Tracking System (PPTS) was created for the Military Proposals Pricing & Analysis department to support costing, pricing, tracking, and generating proposals for all military engines, spare parts (including direct foreign and support equipment), ECPs (Engineering Change Proposals), and catalogs. PPTS comprises two modules: the costing procedures, located on the mainframe, and the pricing module, located on the LAN.
  • Military proposals are initially entered into the IBM mainframe, where they are exploded and costed automatically where possible. Parts that cannot be costed automatically are distributed to the responsible buyer or VPE for discrete costing online.
