- Scrum Master with experience in project management and in structured and unstructured real-world data across sectors such as Healthcare and Information Technology. Hands-on experience in data analysis in data lake environments using tools and technologies such as SQL, Excel, Python, R, and Tableau.
TECHNICAL PRODUCT OWNER
Confidential, Atlanta, Georgia
- Collaborate with scrum masters, technical leads, product owners, project managers, and third parties to groom backlog and plan sprints.
- Create and maintain Product backlog with Product vision in focus.
- Attend and facilitate the various Scrum ceremonies to keep the team in alignment with Product and Solution Roadmap.
- Agile Technical Product Owner role on geographically distributed scrum/Kanban delivery teams.
- Execute SQL queries and develop stored procedures, functions, packages, tables, views, and triggers using Oracle SQL Developer and MS SQL Server for ad hoc tasks, and Hadoop Impala/Hive for the data lake (Azure).
- Skilled in VLOOKUP formula development, Pivot Table generation, and Pivot reporting in Excel.
- Created and maintained the solution vision, roadmap, and backlog of work through the project's life cycle.
- Define Scrum Team’s backlog, plan iterations, participate in PI Planning ceremony and contribute to Vision and Roadmap.
- Maintain the Scrum Team’s sprint and product backlog and adjust them based on reviews/feedback from System Demos.
- Regularly demonstrate the Scrum Team’s sprint increment during the biweekly System Demo to stakeholders and business members in the Agile Release Train (ART).
- Successfully led a Scrum team of 10 and implemented a Tableau project resulting in more than $6 Million in savings and a reduction in turnaround time from 3 weeks to approximately 5 minutes.
- Excellent competency in project management tools (JIRA, Confluence, etc.)
- Proven ability to handle competing demands and multiple, changing deadlines.
- Used regular expressions, text parsing, and data mining techniques to search for defined data patterns within millions of claim lines and within unstructured raw claims data from multiple businesses.
- Hands-on experience with Global Data Formats (GDFs) and Common Data Structure/Format (CDS/CDF) in the Data Lake (Azure/Hadoop) to structure data from multiple schemas, entities, and businesses.
- Familiar with Master Data Management and Entity Resolution processes for healthcare entity data such as Member, Coverage, Providers, Clearinghouses, Claims, Encounters, and Long-Term Care.
- Experience in Waterfall, Agile Scrum & Kanban SDLC methodologies.
- Act as a strong liaison between Product Management, Product Owners and development to ensure a transparent and productive workflow environment.
- Drive the development of and continued improvements to the automation framework(s).
- Maintain/update system data flow chart, Heat Maps, Tree Maps, Visio documents, and system documentations.
- Used DDL, DML, and DCL commands in Impala, Oracle SQL Developer, and MS SQL Server.
- Work with clients and user groups to analyze requirements and propose changes in design and specifications.
- Created mockup diagrams using MS Visio and provided screenshots replicating the changes to the User Interface (UI) per the requirements.
- Review system designs to ensure adherence to defined requirements; decompose features into user stories, write clear acceptance criteria, and formally accept user stories.
- Product and project management skills including requirements elicitation, documentation, planning and tracking project through the entire product lifecycle.
- Excellent team collaboration skills and effective communication with internal and external business and technical stakeholders.
- Prepared Business Requirements Document (BRD) by gathering requirements from all the stakeholders; translated BRD to Functional Specifications Document (FSD) and created prototypes.
- Facilitated pre-planning meetings, demos, and daily stand-ups.
- Experience in Medicaid, Medicare, HIPAA standards and various EDI transaction sets like 837P, 837D, 837I, 276/277 & X12 5010 formats etc.
- Performed Exploratory Data Analysis on claims data to predict claim adjudication status using pandas, SciPy, and NumPy, implementing various supervised learning algorithms.
- Performed data cleaning, transformation, and loading for the various subsystems using Python.
- Created mass adjustments for various Medicaid & Medicare claims as part of the claims adjudication process using Pro*C, SQL*Plus, Oracle 11g, UNIX, etc.
- Created pivot tables and charts using worksheet data and external resources, modified pivot tables, sorted items and group data, and refreshed and formatted pivot tables.
- Familiar with the various subsystems in the MMIS system like Claims, Long Term Care, HMO, Pharmacy etc.
- Facilitated JAD sessions and daily scrum meetings, and conducted periodic peer review conferences to keep track of project goals.
- Identified the Data Attributes for the data required by the users & developed the Data Mapping & Dictionaries.
- Tracked daily issues and dependencies and coordinated with the team to work on the impediment list.
- Coordinated with development team to help them better understand the requirements.
- Assisted Project Manager during the inception phase to create the project scope, charter, and initiation documentation.
- Worked with change control board to initiate/manage Change Requests.
- Good understanding of healthcare industry claims management process, Medicaid (MMIS), Medicare Services and insurance sector.
- Conducted multiple levels of testing like regression, unit, functional and user acceptance to verify the client’s needs are met.
- Used MS Visio for flow-charts, process model and architectural design of the application.
- Experience in all phases of the software development life cycle (SDLC), including requirement gathering, requirement analysis, design, test case definition, development, unit testing, and maintenance.
- Experience in SDLC development methodologies like Agile, Scrum and Waterfall.
- Excellent communication, interpersonal, analytical skills, and strong ability to manage and motivate the team.
- Executed SQL Queries, Stored procedures, functions, packages, tables, views, triggers using Oracle SQL Developer for ad hoc tasks.
- Designed and populated specific tables, databases for collection, tracking and reporting of data using SQL.
- Hands-on experience with various ML algorithms (regression, clustering, etc.) using Python and R, and with Tableau for advanced visualizations.
- Develop and maintain sales reporting using MS Excel queries, SQL in SQL Developer, and MS Access.
- Used regular expressions, text parsing, and data mining techniques to search for defined data patterns.
- Performed Data Analysis and Data validation by writing SQL queries and Regular expressions.
- Extracted and cleansed data (munging) and manipulated datasets as needed for reporting and forecasting analysis.
- Worked extensively with UNIX and Linux to view data for analysis.
- Maintained/updated system data flow chart, Heat Maps, Tree Maps, Visio documents, and system documentations.
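The pattern-search work described above (scanning claim lines for defined data patterns with regular expressions) can be sketched as follows. The claim-line format, field names, and `find_claims` helper are illustrative assumptions, not the actual proprietary format.

```python
import re

# Hypothetical claim-line layout: "CLM|<claim id>|<CPT code>|<billed amount>"
CLAIM_PATTERN = re.compile(
    r"^CLM\|(?P<claim_id>\d+)\|(?P<cpt>\d{5})\|(?P<amount>\d+\.\d{2})$"
)

def find_claims(lines, cpt_code):
    """Return (claim_id, amount) pairs for lines matching the given CPT code."""
    matches = []
    for line in lines:
        m = CLAIM_PATTERN.match(line.strip())
        if m and m.group("cpt") == cpt_code:
            matches.append((m.group("claim_id"), float(m.group("amount"))))
    return matches

sample = [
    "CLM|1001|99213|125.00",
    "malformed row that should be skipped",
    "CLM|1002|99214|180.50",
    "CLM|1003|99213|95.25",
]
print(find_claims(sample, "99213"))  # [('1001', 125.0), ('1003', 95.25)]
```

Named groups keep the extraction readable, and the anchored pattern skips malformed rows rather than partially matching them.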
BUSINESS/DATA ANALYST
- Involved in requirement gathering, analysis of the requirements from the business owners and users.
- Conducted a detailed Gap Analysis to provide Impact Assessment on various departments across the enterprise.
- Created data cleansing, data quality check strategies based upon analysis of complex business requirements.
- Interacted with users for verifying user requirements, managing change control process, updating existing documentation.
- Wrote complex SQL queries using joins, subqueries, and correlated subqueries to retrieve data from the database with SQL Developer; the results were used for further data mining.
- Analyzed reports, fixed bugs in stored procedures, and used DDL, DML, and DCL commands in Oracle SQL Developer.
- Interpreted raw data using a variety of tools (Python, R, Excel), algorithms, and statistical/econometric models.
- Performed data cleaning and feature selection using WEKA and Excel; reduced dimensionality of datasets using PCA.
- Documented all programs and procedures to ensure an accurate historical record of work completed and to improve quality and efficiency.
- Wrote ETL scripts in Python/SQL for extraction and validating the data.
- Reviewed all system designs to ensure adherence to defined requirements.
- Met with user groups to analyze requirements and propose changes in design and specifications.
- Performed flat-file conversions in data warehouse scenarios.
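The correlated-subquery style mentioned above can be sketched with Python's built-in `sqlite3` module. The `claims` table, its columns, and the sample rows are illustrative assumptions; the actual work used Oracle SQL Developer against a different schema.

```python
import sqlite3

# In-memory table of hypothetical claim payments (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id INTEGER, provider TEXT, paid REAL);
INSERT INTO claims VALUES
  (1, 'A', 100.0), (2, 'A', 300.0), (3, 'B', 50.0), (4, 'B', 250.0);
""")

# Correlated subquery: claims paid above their own provider's average.
rows = conn.execute("""
SELECT c.claim_id, c.provider, c.paid
FROM claims c
WHERE c.paid > (SELECT AVG(c2.paid) FROM claims c2 WHERE c2.provider = c.provider)
ORDER BY c.claim_id
""").fetchall()
print(rows)  # [(2, 'A', 300.0), (4, 'B', 250.0)]
```

The inner query re-evaluates per outer row (correlated on `c.provider`), which is what distinguishes it from a plain subquery computed once.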