Sr. Technical Data Analyst Resume


SUMMARY

  • Senior Technical Data Analyst and Tech Lead with over 12 years of experience in data warehousing, analytics, and information management using SAS, SSIS, DB2, Teradata, SQL Server, and reporting tools
  • Experienced in multi-domain environments such as healthcare, personal/commercial insurance, manufacturing, and retail
  • Hands-on experience in data integration and data quality validation
  • Good experience with EDI medical claims data in 837I/837P/837/835 formats
  • Experienced in every phase of IT projects, including acting as a Business Systems Advisor and liaison between technical personnel and business end users, business process analysis, planning, design and architecture, development, testing, quality control, project management, and ongoing support
  • Excellent knowledge of business intelligence tools and data warehousing concepts such as data modeling, data quality, data governance, and entity-relationship and dimensional models
  • Significant exposure to enrollment, medical, and Rx claims data and billing; experienced in analyzing and validating medical claims, provider, and client/member data
  • Experienced as Tech Lead, Data Quality Analyst Lead and Project Lead

TECHNICAL SKILLS

Languages: SAS, JCL, COBOL, C, C++

Operating Systems: Mainframe, Windows XP, Windows 7, UNIX and Linux

Database/Database tools: SQL Server 2008, DB2, Oracle, Teradata, Hadoop, MS Excel, MS Access, and Toad for Data Analysts

Special Software/Tools: Stat, SAS/ACCESS, SAS/CONNECT, PuTTY, QMF, Toad, Hive, CA7, Change Management & ITIL, Autosys, Panvalet, UltraEdit, WinSCP, SSIS, Excel, Pivot Tables, Tableau, COBOL, Rally, ALM, Informatica, Cognos Reports, SAS EG 9.2, SAS EG 9.4, SAS Enterprise Miner, Confluence, JIRA, PL/SQL Developer, Visio, Python, and Pandas

PROFESSIONAL EXPERIENCE

Confidential

Sr. Technical Data Analyst

Responsibilities:

  • Understanding the existing check run process (BCBS Michigan, BCBS Nebraska, BCBS Arkansas, and Total Health)
  • Working with BAs/SMEs to gather the necessary details for documentation
  • Preparing step-by-step procedure documents
  • Understanding existing PL/SQL procedures and preparing the BRD/FSD with business rules
  • Capturing the business rules and making sure the development team understands the existing logic
  • Gathering and tracking status reports from the team
  • Performing data analysis and data mapping to develop technical specs for ETL process development
  • Creating data dictionaries and data flow diagrams
  • Performing gap analysis at various levels during data migration (see the reconciliation sketch after this list)
  • Sharing business knowledge across the team
  • Creating dashboards for senior management
  • Creating the traceability matrix and maintaining the project plan
  • Working with existing developers to understand the program logic and the purpose of the code
  • Maintaining data quality and consistency
  • Creating development and QA environments and granting teams access to schemas and tables
  • Preparing source-to-target mapping documents
  • Capturing data movement across the different layers
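
A minimal sketch of the kind of row-count reconciliation used for the gap analysis above; the librefs, table names, and macro name (recon_counts) are placeholders rather than the actual project objects:

    /* Compare row counts between a source table and its migrated target. */
    /* Librefs and table names below are illustrative placeholders.       */
    %macro recon_counts(src=srclib.claims, tgt=tgtlib.claims);
        proc sql noprint;
            select count(*) into :src_rows trimmed from &src;
            select count(*) into :tgt_rows trimmed from &tgt;
        quit;
        %if &src_rows ne &tgt_rows %then
            %put WARNING: &src has &src_rows rows but &tgt has &tgt_rows rows;
        %else
            %put NOTE: &src and &tgt both have &src_rows rows;
    %mend recon_counts;

    %recon_counts()

The same pattern extends to column-level checks (sums and distinct counts of key fields) when a simple row count is not enough.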

Confidential

Sr. Technical Data Analyst

Responsibilities:

  • Analysis of various data feeds (30%)
  • Importing large amounts of data (enrollment, medical claims, pharmacy claims) from various healthcare groups into the Confidential SAS system.
  • Creating mappings to correlate the different data elements, converting each carrier’s standard data layout to the Confidential standard for enrollment, medical, and Rx data. Checking the data layout against each group’s data dictionary and ensuring the mandatory fields are all available in the input files per the Confidential standard.
  • Document and receive sign off (20%)
  • Create the file map document: Registry of all data feeds, file naming conventions, delivery schedules, and SFTP retrieval details.
  • Create the vendor registry: per-vendor, per-data-feed configuration information, including expected file layout, number of fields, header/footer information, delimiters, and record length. This table captures the rules for transforming the medical data file into separate facility and professional claims warehouses.
  • Creating the data mapping document: captures the list of all fields across all data feeds, field attributes, per-field QC acceptance thresholds, and mapping logic to fit the Confidential standard layout.
  • Writing, executing, and analyzing SAS jobs (30%)
  • Creating the SAS program that builds SAS metadata for client-specified files.
  • Creating the SAS macro program to replicate the production environment in the development region and test the files.
  • Once the development setup is complete, the following steps are involved:
  • Import data set - validate from the SAS log that the import completed without errors or warnings.
  • Raw QC - a quality check of the imported raw data file; validate field names, lengths, and data against the data dictionary (a simplified sketch follows this list).
  • Map data set - map the client files (file map / vendor registry / data mapping) to the Confidential standard.
  • Map QC - a quality check of the mapped fields and imported data; validation is done by running a set of SAS programs.
  • Build warehouse - build the SAS warehouse after validation is complete. Once approved and signed off internally, the warehouse is shared with the client for review.
  • Reviewing the output and sign-off (20%)
  • Monitoring the daily import process, which is driven by a program that leverages the Client File Map, Client Vendor Registry, and import criteria derived from the Client Data Mapping. A Raw QC report is generated automatically once the SAS program completes and must be reviewed, validated, and signed off before further processing.
  • Monitoring the daily jobs that run the mapping and warehouse build process, which leverages the Client Data Defaults, Client Data Mapping, and Client Code Mapping tables. Once complete, a Map QC report and ETL summary are generated and must be reviewed and signed off.
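
A simplified Raw QC sketch of the layout check described above; the librefs (raw, meta) and the data set and variable names are placeholders, not the client's actual objects:

    /* Compare the imported raw file's layout against the data dictionary */
    proc contents data=raw.enrollment out=work.actual_layout noprint;
    run;

    proc sql;
        /* Fields required by the dictionary but absent from the file */
        create table work.missing_fields as
        select d.name
        from meta.data_dictionary as d
        left join work.actual_layout as a
            on upcase(d.name) = upcase(a.name)
        where a.name is null;
    quit;

    /* Field-level distribution check reviewed during Raw QC sign-off */
    proc freq data=raw.enrollment;
        tables member_gender relationship_code / missing;
    run;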

Confidential

Sr. Technical Data Analyst

Responsibilities:

  • Actively interacting with customers to understand the business requirements and prepare the business documents.
  • Creating and reviewing business requirement documents and developing the SAS programs.
  • Responsible for FTPing the files through the Accent team.
  • Creating reports for the campaign management team.
  • Source-to-target data mapping per the file layout.
  • Working on process improvement and performance tuning.
  • Managing projects in SharePoint every day by collaborating with the Development, Application, Infrastructure, Data, BA, and QA teams.
  • Writing SAS macros to automate the TCS process and create the reporting layer that feeds Tableau dashboards and a self-service application.
  • Using PROC SQL and PROC DS2 to read and manipulate data from Teradata and Hadoop (Hive and Impala) to build the COE provider reporting application (see the pass-through sketch after this list).
  • Extracting IS project task reports from SharePoint, analyzing the data, and preparing KPI presentations for senior management.
  • Writing the SAS Teradata scripts that create the data layer and reporting layer for BOB Lago reporting.
  • Conducting daily calls with application team managers to provide updates on service requests; maintaining a SharePoint site tracking all relevant request information, including approval status.
  • Assisting the team with data-, Teradata-, and SAS-related questions.
  • Working on gap analysis and root cause analysis.
  • Working on performance improvement and automating manual processes; tuning performance to reduce CPU and run time.
  • Working with the profiling team, business analysts, and SMEs.
  • Validating and cleaning data using statistical procedures such as PROC FREQ and PROC MEANS.
  • Working with multiple stakeholders to gather requirements and to schedule and coordinate meetings.
  • Experienced in ad hoc programming.
  • Optimizing code to reduce program run time and CPU time.
  • Working with Relational databases such as SQL Server, Oracle, and Teradata.
  • Adding ICD-10 fields to the existing process.
  • Working with ICD-9/ICD-10, CPT codes, revenue codes, DRGs, and bill type codes.
  • Working with large datasets.
  • Possess strong ability to quickly adapt to new applications and platforms.
  • Generating reports in HTML, PDF, or RTF format according to client specifications.
  • Extensive use of PROC SQL to perform queries and join tables.
  • Scheduling daily team meetings to get updated status on project deliverables and updating the project plan in SharePoint.
  • Actively participating in monthly departmental meetings and providing input on how to improve standards.
  • Automating the existing manual process.
  • Conducting analysis and generating tables and listings using SAS.
  • Using DATA _NULL_ and PROC REPORT to generate outputs.
  • Extracting data from mainframe datasets.
  • Reading data from SQL databases, tab-delimited files, and other types of raw data files.
  • Testing categorical data with PROC FREQ and producing statistics with PROC MEANS (a simplified QC-and-reporting example follows this list).
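
A minimal sketch of the explicit SQL pass-through used to pull Teradata data into SAS for reporting; the server, credentials, schema, and column names are placeholders, not the actual COE provider reporting objects:

    /* Explicit pass-through: the inner query runs on Teradata;        */
    /* only the aggregated result is returned to the SAS work library. */
    proc sql;
        connect to teradata (server=tdprod user=&td_user password=&td_pwd);
        create table work.provider_summary as
        select * from connection to teradata (
            select provider_id,
                   count(*)      as claim_cnt,
                   sum(paid_amt) as paid_total
            from   claims_db.medical_claims
            group by provider_id
        );
        disconnect from teradata;
    quit;

Hive and Impala sources follow the same pattern through the corresponding SAS/ACCESS engines.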
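
A simplified example of the validation-and-reporting pattern mentioned above: PROC FREQ and PROC MEANS checks routed into a PDF report alongside a PROC REPORT summary. The data set and variable names are illustrative only:

    ods pdf file="claims_qc_summary.pdf";

    /* Categorical checks */
    proc freq data=work.claims;
        tables claim_status bill_type_cd / missing;
    run;

    /* Numeric checks */
    proc means data=work.claims n nmiss min max mean;
        var paid_amt allowed_amt;
    run;

    /* Summary report grouped by provider */
    proc report data=work.claims nowd;
        column provider_id paid_amt;
        define provider_id / group;
        define paid_amt    / analysis sum format=dollar14.2;
    run;

    ods pdf close;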

Confidential

Team Lead

Responsibilities:

  • IDS Scheduled Reporting generates reports for the commercial lines, claims, and wealth management portfolios using SAS and mainframes. These reports help business customers with decision making and research activities.
  • Reports are generated on a periodic basis to help actuaries understand business trends. The Business Knowledge Facility (BKF), Hart Source Data Mart (HSDM), and CIDER are repositories of commercial lines data; CCPS, APG, WCRS, and CI contain volumes of claims data.
  • Generated Excel reports by analyzing the data available in the data marts and provided them in various formats as requested by the customer. Once a report is approved, it is scheduled in Autosys/CA7 to be delivered on a periodic basis with data for different timelines.
  • Scram, Growth, Capre, and Insurance Score are critical applications handled by IDS-SR, which maintains policy details and provides pricing and score information to downstream systems.

Confidential

Technical Developer/Team Lead

Responsibilities:

  • The Proposal Pricing and Tracking System (PPTS) was created for the Military Proposals Pricing & Analysis department to support costing, pricing, tracking, and generating proposals for all military engines, spare parts (including direct foreign and support equipment), ECPs (Engineering Change Proposals), and catalogs. PPTS comprises two modules: the costing procedures, located on the mainframe, and the pricing module, located on the LAN.
  • Extracting data from DB2 systems and creating the weekly/monthly reports (a minimal extract sketch follows this list).
  • Analyzing large datasets and preparing trend analysis reports.
  • Creating datasets and transferring them to downstream systems (LAN/Informatica).
  • Maintaining historical data (member data, geographic data, aviation parts).
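
A minimal sketch of the kind of weekly DB2 extract and report described above, using a SAS/ACCESS libref; the libref, DB2 subsystem ID, authorization ID, and table/column names are placeholders, not the actual PPTS objects:

    /* Read pricing rows from DB2 and summarize them for the weekly report */
    libname pptsdb db2 ssid=db2p authid=ppts;

    proc summary data=pptsdb.part_pricing nway;
        class part_no;
        var quoted_price;
        output out=work.weekly_pricing (drop=_type_ _freq_) sum=total_quoted;
    run;

    proc export data=work.weekly_pricing
        outfile="weekly_pricing_report.xlsx"
        dbms=xlsx replace;
    run;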

Confidential

Technical Developer

Responsibilities:

  • AOS is responsible for the health of a particular system, and the team ensures the system's health through various checks.
  • The purpose of the project is to develop new code, maintain existing code, and resolve production failures.

Confidential

Technical Developer

Responsibilities:

  • The CAS-MVS project is part of the Centralized Application Support (CAS) project executed for Confidential, Minneapolis, USA. The team maintains and enhances four business areas: Supply, Corporate Services (Finance, Admin, and HR/Payroll), Guest (Distribution and Stores), and Credit. The main objective of the project is to maintain Confidential's MVS applications, which includes development and production support, resolving client issues, providing permanent fixes for production problems, and carrying out minor enhancements.
  • Maintenance and enhancement of existing programs to meet changing business requirements.
  • Code development, test plan preparation, and testing.
  • Monitoring the production cycle for main jobs/transactions to catch problems as early as possible.
Environment: Windows XP, SAS, UNIX, Mainframe, File-AID, BMRS, ESP, QMF, WinSCP, PuTTY
