
Business/Data Analyst Resume


SUMMARY

  • Around 4 years of IT experience as a Business Systems Analyst with a focus on the Property & Casualty and Health Care insurance domains.
  • Expertise in applying both the linear Waterfall methodology and iterative Agile approaches such as Scrum and Kanban across the Software Development Life Cycle (SDLC).
  • Expert in creating business and data flow diagrams and in authoring high-level documents such as the Business Requirement Document (BRD), Functional Requirement Document (FRD), System Requirement Specification (SRS), Technical Requirement Document (TRD), and descriptions of user stories and use cases.
  • Proficient in Object-Oriented Analysis and Design using Unified Modeling Language (UML) artifacts such as use case diagrams, activity diagrams, sequence diagrams, class diagrams, and user stories, and in creating prototypes, wireframes, and screen mockups.
  • Solid experience in the Software Test Life Cycle (STLC) and in Quality Assurance (QA): developing, reviewing, and executing test plans, test scripts, test cases, test reports, and the Requirement Traceability Matrix (RTM).
  • Strong hands-on experience in conducting and documenting User Acceptance Testing (UAT) test cases and in following defects through the defect tracking life cycle.
  • Deep knowledge of data warehouse profiling and data management life cycle concepts, including ETL (Extract, Transform and Load), data mining, data selection, data cleansing, and the use of Business Intelligence (BI) tools.
  • Wrote simple and complex SQL queries to extract data for data analysis and testing, and used DDL and DML commands extensively for back-end testing on Oracle and SQL databases (a short SQL validation sketch follows this list).
  • Substantial knowledge of the healthcare industry on both the Payer and Provider sides.
  • Experienced in creating business reports, BI visualizations, and dashboards using tools such as QlikView and Power BI; working knowledge of Tableau.
  • Built reports and dashboards for Salesforce objects such as Account, Policy, Opportunity, and Broker.
  • Adept in the claims processing life cycle (auto-adjudication), especially EDI transactions such as 837 (Health Care Claim), 834 (Benefit Enrollment and Maintenance), and 835 (Health Care Claim Payment/Remittance Advice); see the EDI segment sketch after this list.
  • Good exposure to different Facets modules such as Claims, Encounter, Eligibility, Membership, Billing, Payment, and Authorization.
  • Knowledge of claims pricing and testing, Health Insurance Portability and Accountability Act (HIPAA) rules and regulations, Electronic Data Interchange (EDI), Consumer Driven Health Plans (CDHP), Coordination of Benefits (COB), and Explanation of Benefits (EOB)/drafts.
  • Deep understanding of HL7 standard messages (ADT, ORM, ORU, and DFT), Health Insurance Exchanges (HIX), Electronic Health Records (EHR), Electronic Medical Records (EMR), Centers for Medicare and Medicaid Services (CMS) regulations, Meaningful Use compliance, and Health Care Reform.
  • Worked with plan types such as Preferred Provider Organization (PPO), Point of Service (POS), Health Maintenance Organization (HMO), and Medicare (Parts A, B, C, and D).
  • Hands-on experience with Informatica MDM Match and Merge rule criteria, tokenization, and trust settings on master data elements.
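
The SQL bullet above refers to DDL/DML back-end testing; the sketch below is only a minimal illustration of that style of validation. It uses an in-memory SQLite database so it runs standalone, and the table and column names are invented; the actual work ran against Oracle and SQL databases.

```python
# Minimal sketch of back-end SQL validation (DDL + DML + checks).
# SQLite in-memory DB keeps the example self-contained; table/column
# names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: create an illustrative policy table.
cur.execute("""
    CREATE TABLE policy (
        policy_id   INTEGER PRIMARY KEY,
        party_name  TEXT NOT NULL,
        state_code  TEXT,
        premium_amt REAL
    )
""")

# DML: load a few sample rows.
cur.executemany(
    "INSERT INTO policy VALUES (?, ?, ?, ?)",
    [(1, "Acme Corp", "NY", 1200.50),
     (2, "Beta LLC",  None, 980.00),
     (3, "Gamma Inc", "TX", 0.0)],
)

# Back-end test queries: row count, missing state codes, non-positive premiums.
checks = {
    "row_count":     "SELECT COUNT(*) FROM policy",
    "missing_state": "SELECT COUNT(*) FROM policy WHERE state_code IS NULL",
    "bad_premium":   "SELECT COUNT(*) FROM policy WHERE premium_amt <= 0",
}
for name, sql in checks.items():
    cur.execute(sql)
    print(name, cur.fetchone()[0])

conn.close()
```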
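
The EDI bullet above concerns X12 claim transactions; the toy example below only sketches the segment/element structure involved (segments delimited by "~", elements by "*") and reads the CLM segment of a synthetic 837-style string. It is not a full X12 parser, and the claim data is made up.

```python
# Toy illustration of X12 segment structure for an 837 claim. Segments are
# delimited by "~" and elements by "*"; the claim content below is synthetic.
sample_837 = (
    "ST*837*0001~"
    "CLM*PATIENT001*725.00***11:B:1*Y*A*Y*Y~"
    "CLM*PATIENT002*150.00***11:B:1*Y*A*Y*Y~"
    "SE*4*0001~"
)

for segment in sample_837.split("~"):
    elements = segment.split("*")
    if elements[0] == "CLM":
        claim_id = elements[1]         # CLM01: claim submitter's identifier
        charge = float(elements[2])    # CLM02: total claim charge amount
        print(f"claim {claim_id}: billed {charge:.2f}")
```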

PROFESSIONAL EXPERIENCE

Confidential

Business/Data Analyst

Responsibilities:

  • Analyzed domestic and international commercial parties sourced from Dun & Bradstreet, Confidential underwriting systems, and Confidential international policy booking systems for all the MDM master data elements required to build a party's 360-degree view.
  • Worked on different subject areas such as Party, Policy, Submission, and Producer; built the required PKEY source objects and the relationships among these entities.
  • Contributed to the design and build of the MDM platform, including complex pieces from Informatica Data Quality (IDQ) and Pitney Bowes (PB), with the goal of creating one and only one golden Party record with its 360-degree view in the modernized MDM.
  • Collaborated with Subject Matter Experts, Data Owners, and Data Modeling Architects to capture the entity-relationship requirements of the MDM project on the Oracle Exadata platform.
  • Coordinated with the Data Architect/Data Modeling team to build the Conceptual (CDM), Logical (LDM), and Physical (PDM) target MDM data models, including depicting cardinality/relationships (one-to-one, one-to-many, etc.), defining data types, and enforcing Referential Integrity (RI).
  • Refined user stories with the necessary details and walked the Data Architects through each story during grooming.
  • Wrote user stories with story bodies and acceptance criteria, and estimated story points and hours to prioritize stories from backlog items into sprint-ready items.
  • Created a detailed technical specification document covering the end-to-end requirements of the Data Hub's Software Development Life Cycle for the offshore developers building the DataStage ETL jobs.
  • Identified critical data quality issues in sourced master data elements and built data quality and cleansing rules.
  • Created Source-to-Target Mapping (STTM) documents for all three source systems: Account Database (ADB), Deal Management System (DMS), and Global All Lines Databases (GoalD). Developed mappings building Party-Policy, Party-Submission, and Party-to-child-entity relationships for all three sources (a simplified STTM sketch appears after this list).
  • In-depth understanding of Informatica MDM Base Object (BO) tables, including internal tables such as the BO cross-reference (XREF), BO match (MTCH), and BO merge (MRG) tables, and external match tables such as EMI and EMO.
  • Built and validated MDM match rules and helped the business define trust/survivorship rules for Critical Data Elements (CDE). Supported the MDM architect in building match and merge rule criteria, tokenization, and trust rule criteria for successful golden record creation (see the match-rule sketch after this list).
  • Utilized QlikView to create various analytical dashboards depicting critical master data elements such as Corporate Tree Position, SIC code, NAICS code, FEIN, and national tax IDs for commercial parties.
  • Analyzed the existing KPI dashboards from the MR&A team for commercial Party and Policy, and documented required/critical data elements based on the KPI reports.
  • Analyzed Pitney Bowes output for international and domestic party addresses, identifying possible data quality issues and improvements.
  • Presented the conceptual data flow report to offshore developers and technical project managers to obtain sign-off on the reverse-engineered functional data flow.
  • Supported customization of trust settings and match rules for the different sources on selected master data elements.
  • Built and executed sanity-check scripts and UAT test scripts, and provided MDM UI support and walkthroughs to users.
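
For illustration, a simplified version of the STTM structure referenced above, with invented source columns, target names, and transformation rules; the real STTM documents covered the ADB, DMS, and GoalD source systems in full.

```python
# Simplified sketch of Source-to-Target Mapping (STTM) rows plus an
# illustrative transformation step. All names and rules are invented.
sttm = [
    {"source_system": "ADB", "source_column": "CUST_NM",
     "target_table": "C_PARTY", "target_column": "PARTY_NAME",
     "rule": "trim and upper-case"},
    {"source_system": "DMS", "source_column": "FED_TAX_ID",
     "target_table": "C_PARTY", "target_column": "FEIN",
     "rule": "strip non-digits; null if length != 9"},
]

def apply_rule(mapping_row, value):
    """Apply the (illustrative) transformation rule for one STTM row."""
    if mapping_row["target_column"] == "PARTY_NAME":
        return value.strip().upper()
    if mapping_row["target_column"] == "FEIN":
        digits = "".join(ch for ch in value if ch.isdigit())
        return digits if len(digits) == 9 else None
    return value

print(apply_rule(sttm[0], "  Acme Corp "))   # -> "ACME CORP"
print(apply_rule(sttm[1], "12-3456789"))     # -> "123456789"
```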
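
The match-rule work above was configured in the Informatica MDM Hub Console rather than written in code; the snippet below is only a simplified stand-in for the underlying idea (fuzzy name similarity plus an exact identifier match deciding auto-merge versus manual review), with thresholds and attributes chosen purely for illustration.

```python
# Simplified stand-in for MDM match/merge logic: fuzzy party-name similarity
# plus an exact FEIN match decide auto-merge vs. manual review. Thresholds
# and attributes are illustrative; the real rules lived in the MDM Hub.
from difflib import SequenceMatcher

def match_decision(party_a, party_b, auto=0.90, review=0.75):
    name_score = SequenceMatcher(
        None, party_a["name"].lower(), party_b["name"].lower()
    ).ratio()
    same_fein = party_a.get("fein") and party_a.get("fein") == party_b.get("fein")

    if same_fein or name_score >= auto:
        return "auto-merge", name_score
    if name_score >= review:
        return "manual review", name_score
    return "no match", name_score

a = {"name": "Acme Corporation", "fein": "123456789"}
b = {"name": "ACME Corp.",       "fein": "123456789"}
print(match_decision(a, b))   # exact FEIN -> ('auto-merge', ...)
```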

Tools & Technologies: Oracle SQL Developer, Informatica MDM, Informatica Data Director (IDD), Informatica MDM Hub Console, DataStage, Agile Central (Rally), QlikView, Power BI, MS Office Suite, Informatica Data Quality (IDQ), Pitney Bowes, and Mainframe (basic COBOL script analysis).

Confidential

Business / Data Analyst

Responsibilities:

  • Participated in sprint technical workshops with Data Architecture, SMEs, and Data Analysts to identify and document business needs and objectives; current operational procedures and problems; inputs, data needs, and retention; open, pending, and critical issues; and usage, formatting, and security requirements.
  • Performed data analysis for data continuity and data integrity, and provided derivation logic based on analysis of various source files such as Excel files, text files, landing files (extracted from other databases and landed in CDH), and key files.
  • Collaborated with the CDH core team to standardize the fact and dimension data, which enhanced the design of the CDH data models and architecture used by Business Intelligence and reporting consumers such as I-dive, Salesforce, the KPI dashboard, and the Pricing/Actuarial team.
  • Contributed to the data flow diagram of the Dynamic Key File process, which applies thousands of rules to data after standard extraction from the source files.
  • Analyzed existing requirements in SAS EG, then documented and compared/validated data from the old system's APRODUCT library (SAS) against the new CDH data warehouse.
  • Worked across the Salesforce, KPI, and core CDH teams to source attributes for Policy, Premium, Submission, Party, and Producer from sources such as CRS, OGIS (Overseas General Insurance System), Global All Lines Databases (GoalD), EDW (Enterprise Data Warehouse), and ADB, covering Confidential domestic and international general and property & casualty insurance data.
  • Built customized Opportunity reports in summary and matrix format using the Salesforce UI (a SOQL summary sketch follows this list).
  • Designed, documented, built, tested, and deployed enhancements to Salesforce custom objects, page layouts, workflows, alerts, reports, and complex dashboards. Partnered with end users and first-line support to understand administrative needs and resolve support issues.
  • Created and managed custom objects, fields, formulas, validation rules, custom workflows, and approval processes in Salesforce.
  • Developed STTM documents based on the data profiling and analysis performed (see the profiling sketch after this list).
  • Reviewed code changes for each story in GitHub and peer-reviewed data analyst validation with other team members.
  • Created and executed test cases in the QA and UAT environments, and mentored users during production data validation.
  • Identified data issues during functional/QA testing, resolved them, and provided feedback to narrow down bugs raised by users.
  • Maintained the CDH data dictionary (Edict) for attributes added during development, including each attribute's business definition and metadata.
  • Worked with standard data elements used in property and general insurance, such as DUNS numbers, SIC (Standard Industrial Classification) codes, and NAICS (North American Industry Classification System) codes.
  • Wrote simple and complex SQL queries and DDL/DML commands for data validation, data visualization, and data analysis.
  • Closely involved in four different ETL processes, using GitHub for current development, and updated the structure of existing ETL data mapping templates for the integration process.
  • Involved in creating test plans, test cases, and test scripts for different modules in the QC application according to the business requirement document.
  • During the transition phase, helped business users achieve self-supportability and roll out the product.
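
The summary and matrix Opportunity reports above were built in the Salesforce report builder; the sketch below only shows the equivalent aggregate SOQL and a small client-side pivot over hard-coded, illustrative rows (it does not call the Salesforce API).

```python
# Sketch of the aggregation behind a summary-style Opportunity report.
# SUMMARY_SOQL is standard aggregate SOQL; the rows below are hard-coded
# stand-ins for what a report or API client would return.
from collections import defaultdict

SUMMARY_SOQL = (
    "SELECT StageName, COUNT(Id) cnt, SUM(Amount) total "
    "FROM Opportunity GROUP BY StageName"
)

# Illustrative rows for a matrix-style pivot: stage x close year.
rows = [
    {"StageName": "Prospecting", "CloseYear": 2017, "Amount": 50000},
    {"StageName": "Closed Won",  "CloseYear": 2017, "Amount": 120000},
    {"StageName": "Closed Won",  "CloseYear": 2018, "Amount": 80000},
]

matrix = defaultdict(float)
for r in rows:
    matrix[(r["StageName"], r["CloseYear"])] += r["Amount"]

for (stage, year), total in sorted(matrix.items()):
    print(f"{stage:<12} {year}  {total:>10,.0f}")
```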
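
A minimal sketch of the kind of profiling that fed the STTM documents above, shown with pandas over an invented sample frame; the real profiling ran against the CDH/Netezza sources, and the column names here are illustrative only.

```python
# Minimal data-profiling sketch: null rate, distinct count, and a sample
# value per column, plus a duplicate-key check. Sample data is invented.
import pandas as pd

df = pd.DataFrame({
    "policy_id":   ["P001", "P002", "P003", "P003"],
    "duns_number": ["123456789", None, "987654321", "987654321"],
    "sic_code":    ["6411", "6411", None, "6331"],
    "premium":     [1200.0, 980.0, 0.0, 450.0],
})

profile = pd.DataFrame({
    "null_pct":      df.isna().mean().round(2),
    "distinct_vals": df.nunique(),
    "sample_value":  df.apply(
        lambda col: col.dropna().iloc[0] if col.notna().any() else None
    ),
})
print(profile)
print("duplicate policy_id rows:", df["policy_id"].duplicated().sum())
```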

Tools & Technologies: Netezza, SAS Enterprise Grid, Jira (Agile Scrum & Kanban), Confluence, GitHub, Salesforce, MS Office Suite, SharePoint, QlikView

Confidential

Business / Data Analyst

Responsibilities:

  • Gathered requirements as user stories, analyzed them, documented functional and technical requirements from both formal and informal sessions, and validated the needs of the business stakeholders.
  • Prioritized user stories, documented user story descriptions, assigned story points, and then placed the stories into different sprints.
  • Developed activity diagrams to illustrate the workflow precisely and used swim lanes to define which entity was responsible for which activity.
  • Co-authored the TRD for Commercial and Medicare claims data with the exclusion and filter criteria identified by the business during JAD sessions.
  • Identified Data Owners and Data Stewards for HIE outbound data to ensure the data quality and governance process was established.
  • Coordinated with Business, Development, Testing, and Solution Architecture to meet product deliverables within the minimum thresholds defined by the business.
  • Arranged findings-review sessions with the business and IT teams covering potential issue counts, impact on the Day 1 release, and potential risks.
  • Participated in change control meetings, recommending appropriate action after analyzing changing requirements.
  • Analyzed existing SAS files for payer claims data coming from different sources and translated PROC SQL into SQL requirements. Created mappings between SAS library files and Teradata target tables (a PROC SQL-to-Teradata sketch follows this list).
  • Executed simple and complex SQL and profiled data sources for ad hoc analysis against the production database to identify deficiencies in the current system.
  • Determined solution suitability across different data models, such as the Teradata Healthcare database, the ODS, the Enterprise Data Warehouse, and other third-party application databases.
  • Performed root cause analysis on data anomalies and worked with data stewards to resolve the issues.
  • Closely involved with the ICOE's data extraction, transformation, and loading activities.
  • Performed detailed analysis identifying trends and patterns in the data, and documented and presented findings to the business across domains such as Membership, Provider, Medical Claims, Rx Claims, Authorization, and PCP.
  • Assisted the data modeling team in understanding the CDM, LDM, and PDM for the audit, bounce, and control process of the acknowledgement file for HIE (HIX) inbound data for Confidential.
  • Facilitated review sessions with business users and other team members.
  • Worked with the SQA testing team and provided the required support from the Data Governance team on the assigned WS data elements.
  • Worked with the security team to tag PHI/PII security levels for outbound data.
  • Produced standard commercial reporting deliverables that provide summary metrics (Data Matrix) and insights.
  • Worked on HL7 standards and HL7 interfaces to integrate information and procedures from different departments according to the provider's EHR system rules.
  • Dealt with different HL7 message types such as Patient Administration (ADT), Charges (DFT), Orders (ORM), and Results (ORU); a minimal parsing sketch follows this list.
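
The PROC SQL-to-SQL translation mentioned above is illustrated below with an invented claim summary. The two statements are shown as strings because the point is the syntax mapping (notably Teradata's CREATE TABLE ... AS (...) WITH DATA form); the library, table, and column names are not from the actual project.

```python
# Illustration of translating SAS PROC SQL into Teradata SQL. Names are
# invented; only the syntax mapping is the point here.
sas_proc_sql = """
proc sql;
    create table work.claim_summary as
    select member_id, sum(paid_amount) as total_paid
    from payerlib.claims
    group by member_id;
quit;
"""

teradata_sql = """
CREATE TABLE claim_summary AS (
    SELECT member_id, SUM(paid_amount) AS total_paid
    FROM claims
    GROUP BY member_id
) WITH DATA;
"""

print("SAS PROC SQL:", sas_proc_sql)
print("Teradata equivalent:", teradata_sql)
```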
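
The HL7 work above was interface-level rather than hand-coded; the snippet below only sketches the HL7 v2 message structure (pipe-delimited fields, MSH-9 message type, PID-5 patient name) against a synthetic ADT message.

```python
# Sketch of HL7 v2 structure using a synthetic ADT message: segments are
# separated by carriage returns and fields by "|". In MSH the field separator
# itself counts as MSH-1, so the message type (MSH-9) sits at split index 8;
# in PID the patient name (PID-5) sits at index 5.
sample_adt = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|202001011200||ADT^A01|MSG0001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19700101|M",
    "PV1|1|I|ICU^01^A",
])

for segment in sample_adt.split("\r"):
    fields = segment.split("|")
    if fields[0] == "MSH":
        print("message type:", fields[8])        # e.g. ADT^A01
    elif fields[0] == "PID":
        last, first = fields[5].split("^")[:2]   # PID-5: patient name
        print("patient:", first, last)
```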

Tools & Technologies: Teradata SQL, Teradata Studio, SAS Enterprise Grid, Informatica, MS Office Suite, MS Visio, SharePoint, Erwin, JIRA.
