
Sr Data Analyst Resume


Washington, DC

SUMMARY

  • 16 years of IT experience, including 7 years as a Data Analyst/Business Analyst/Data Modeler, working in the areas of requirements engineering, implementation, quality assurance, testing, release management, change management, and production support. Extensive knowledge and experience in all phases of the Systems Development Life Cycle (SDLC), especially in healthcare.
  • Strong understanding of SDLC methodologies such as Agile, Rational Unified Process (RUP) and Waterfall.
  • Good communication and interpersonal skills to establish communication channels between the end client and the SDLC team.
  • 4+ years of experience in Teradata development and analysis.
  • Experience using quality tools such as HP ALM 12 and HP UFT for test management and automation; knowledge of preparing payroll from the Optum Market benefit tool.
  • Experience in developing and customizing interfaces and data conversion programs using SQR, Application Engine, and Component Interfaces for HR, Benefits, and Payroll modules.
  • Hands-on experience in Hadoop data modeling, metadata solutions (Erwin, System Architect, Informatica), data mining, implementing ER models, and data aggregation.
  • Excellent systems and network administration skills.
  • Knowledge of the Family Educational Rights and Privacy Act (FERPA) and other state and federal regulations regarding the release of student data.
  • Requirement elicitation techniques such as Joint Application Development (JAD) sessions, interviews, brainstorming, surveys, workshops, and one-on-one meetings.
  • Experience in data warehousing and business intelligence using ETL tools such as Informatica, SQL Server, Business Objects, and SSIS.
  • Experience in Oracle PL/SQL and Oracle databases.
  • Experience in SAS, MDM match-and-merge, and Cognos.
  • Adept at writing SQL in Teradata, Oracle, MySQL, and MS SQL Server.
  • Created data modeling artifacts: ER diagrams, mapping documents, and various Erwin metadata reports.
  • Automated Teradata utility scripts using UNIX. Demonstrated expertise with ETL tools, including SQL Server Integration Services (SSIS), Data Transformation Services (DTS), and ETL package design, and with RDBMS platforms such as SQL Server, Oracle, and DB2.
  • Knowledge of EIR.
  • Proficient in creating UML diagrams such as use case, activity, and sequence diagrams.
  • Creating Requirement Traceability Matrices (RTM) to trace requirements to other project deliverables.
  • Extracting data from multiple data sources and conducting data analysis using SQL queries (a sample query of this kind follows this list).
  • Writing test cases, logging defects, and documenting test summary reports for User Acceptance Testing (UAT).
  • Excellent knowledge of Health Insurance Portability & Accountability Act (HIPAA) standards, Medicaid and Medicare regulations, Health Care Reform (HCR), and Electronic Health Record (EHR).
  • Strong knowledge and experience with claims associated with payers, claims by providers and member/subscriber claims.
  • Applied Online Analytical Processing (OLAP) functionality through data analysis and SQL execution.
  • Experience negotiating rates with drug makers and pharmacy companies in a pharmacy benefit management role.
  • Worked on HL7 and have knowledge of Groovy, a JVM scripting language.
  • Understanding of HIPAA EDI inbound and outbound transactions, and HIPAA conversion analysis involving 834 (Enrollment and Maintenance), 837 (claim processing and claim adjudication including COB), 835 (Claim Payment and Remittance), and 820 (Payment Order and Remittance).
  • Involved in the full HIPAA compliance lifecycle, from gap analysis and mapping using General Equivalence Mappings (GEM) to migration of HIPAA ANSI X12 4010 to ANSI X12 5010 and translation of ICD-9 codes into ICD-10 codes.
  • Used Facets Claims and Member/Subscriber modules and worked on editing and validating claims.
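
For illustration, below is a minimal sketch of the kind of cross-source analysis query referenced above; the member and claim table and column names are hypothetical, not taken from any specific engagement.

  -- Summarize paid claim amounts per member for one plan year (hypothetical schema: members, claims)
  SELECT m.member_id,
         m.plan_code,
         COUNT(c.claim_id)  AS claim_count,
         SUM(c.paid_amount) AS total_paid
  FROM members m
  LEFT JOIN claims c
         ON c.member_id = m.member_id
        AND c.service_year = 2015
  GROUP BY m.member_id, m.plan_code
  ORDER BY total_paid DESC;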

TECHNICAL SKILLS

Operating Systems: Windows XP, Vista, 7, Windows Server (32 and 64 bit), UNIX, Ubuntu

Databases: MS SQL Server 2005/2008/2012, Teradata 12/13, Oracle 9i/10g/11g, MySQL 5.2, DB2, T-SQL

ETL/Other Tools: SAS Base 9.x, Informatica PowerCenter 8.6.1/9, SSIS (Visual Studio 2010 Shell), Erwin Data Modeler

Test Tools: HP QuickTest Professional 10/11, HP Quality Center 10

Query Tools: SQL Management Studio 2008, Teradata SQL Assistant, SQL Plus, SQL Developer, Teradata-BTEQ

Scripting: SQL, PL/SQL

Reporting Tools: Informatica, Tableau, SSRS, Crystal Reports

Languages: Java, AJAX, JavaScript, VB.NET, PHP, R, C++, CSS, HTML 5.0, Apache

PROFESSIONAL EXPERIENCE

Confidential, Washington, DC

Sr Data Analyst

Responsibilities:

  • Assisted the project manager in the creation of the project charter & vision document during the inception phase of the project.
  • Worked closely with end users during User Acceptance Testing (UAT) and documented issues and defects.
  • Used SQL Server Management Studio for data analysis and SSIS for ETL on a daily basis.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
  • Experience in SAP master database analysis.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Developed Test Cases for Testing the Facets Model.
  • Coordinated with the developers and IT architects to design the interface of the new system according to the X12 (270, 276, 278, 834, 837 (I,P,D) and 820) standards
  • Developed, designed, and implemented a department plan to operationalize the new FACETS integrated processing system, including but not limited to workflow, management oversight, and performance analysis.
  • Translated business requirements into functional specifications and documented the work processes and information flows of the organization.
  • Worked as data analyst team lead.
  • Experience with RStudio Server management.
  • Responsible for creating reports for HEDIS data.
  • Worked on HL7 and applied knowledge of Groovy, a JVM scripting language.
  • Investigated and resolved data anomalies; tuned SQL queries to optimize performance.
  • Applied Online Analytical Processing (OLAP) functionality through data analysis and SQL execution.
  • Used SSIS components such as Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, and Execute Package Task.
  • Conducted Teradata performance tuning.
  • Contributed to the build and design of an organizational wiki that provided comprehensive knowledge of workflows, policies and procedures, patient care objectives, regulatory requirements, and industry best practices for management.
  • Observed and gathered business requirements, documented pertinent requirements and performed extensive data analysis of consumer data.
  • Designed, developed, implemented and maintained conceptual, logical and physical data models.
  • Created data modeling artifacts: ER diagrams, mapping documents, and Erwin metadata reports.
  • Performed data mapping and modeling.
  • Adept at processing Pharmacy benefit health claims and payment issues.
  • Involved in various aspects of the project, such as logging, tracking, and resolving issues; current-state workflow assessments; integration and script testing; and downtime activities/testing.
  • Created detailed use cases, use case diagrams, and activity diagrams using MS Visio
  • Explained to claims personnel the new Affinity payments and Explanations of Payment (EOPs) for the same claim processing cycle.
  • Worked extensively with the ETL using the SSIS packages.
  • Created SSIS packages to clean and load data to data warehouse.
  • Created SSIS packages using Lookup and Derived Column transformations.
  • Designed and implemented complex SQL queries for QA testing and data validation (a sample validation query follows this list).
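
A minimal sketch of the kind of validation query referenced above, assuming a hypothetical claim fact table in the warehouse; table and column names are illustrative only.

  -- Flag target rows that violate basic business rules (hypothetical table: dw.claim_fact)
  SELECT claim_id, claim_status, billed_amount, paid_amount
  FROM dw.claim_fact
  WHERE paid_amount > billed_amount                       -- paid should never exceed billed
     OR billed_amount < 0                                 -- negative charges are invalid
     OR claim_status NOT IN ('PAID', 'DENIED', 'PENDED'); -- unexpected status codes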

Environment: Facets, Oracle, MS Project, MS Office suite, SQL, ETL, Informatica, SSRS, SSIS, HEDIS, SQL Server, Rational Suite, Citrix, MS SharePoint.

Confidential

Data Analyst/ System Analyst

Responsibilities:

  • Involved in all phases of the SDLC, from requirements, design, development, and testing to rollout to field users and production support.
  • Strong knowledge and experience in SQL server management studio.
  • Designed, developed, implemented, and maintained conceptual, logical, and physical data models.
  • Performed Data Quality Analysis to determine cleansing requirements.
  • Used SSIS to extract, transform, and load data from transaction systems.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
  • Worked as data analyst team lead.
  • Used visualization tools such as Tableau and Informatica.
  • Strong knowledge of and experience with HL7.
  • Proficiently used Informatica to load data from fixed width and delimited Flat files.
  • Oversaw the development, implementation, and ongoing maintenance of HIPAA 837 (claims), 834 (enrollment), 835 (remittance), 276/277 (claim status inquiry and response), and 278 (referral authorization) EDI transactions as required to ensure quality claims.
  • Worked with FACETS Team for HIPAA Claims Validation and Verification Process (Pre-Adjudication).
  • Worked with FACETS edits and EDI HIPAA claims (837/835/834) processing; built complex mappings using corresponding sources, targets, and transformations such as Update Strategy, Lookup, Stored Procedure, SQL, Sequence Generator, Joiner, Aggregator, Java, and Expression to extract data in compliance with the business logic.
  • Created SSIS packages to migrate slowly changing dimensions (a Type 2 SCD sketch follows this list).
  • Worked closely with Data Architect to review all the conceptual, logical and physical database design models with respect to functions, definition, maintenance review and support Data analysis, Data Quality and ETL design that feeds the logical data models.
  • Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
  • Involved in working on different layers of the business intelligence infrastructure.
  • Performed data cleansing and scrubbing to ensure data quality.
  • Worked on integration workflows and load processes
  • Set up ETL processes to load data into the physical data objects using SSIS and SQL Server Data Tools.
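
A minimal Type 2 slowly changing dimension sketch in T-SQL, assuming hypothetical staging and dimension tables (stg.member, dw.dim_member); the real packages used SSIS transformations, and this only illustrates the expire-and-insert pattern.

  -- Step 1: expire current dimension rows whose tracked attributes changed in the staging feed
  UPDATE d
  SET    d.is_current = 0,
         d.end_date   = GETDATE()
  FROM   dw.dim_member AS d
  JOIN   stg.member    AS s ON s.member_id = d.member_id
  WHERE  d.is_current = 1
    AND  (s.plan_code <> d.plan_code OR s.address_line1 <> d.address_line1);

  -- Step 2: insert a new current row for every changed or brand-new member
  INSERT INTO dw.dim_member (member_id, plan_code, address_line1, start_date, end_date, is_current)
  SELECT s.member_id, s.plan_code, s.address_line1, GETDATE(), NULL, 1
  FROM   stg.member AS s
  LEFT JOIN dw.dim_member AS d ON d.member_id = s.member_id AND d.is_current = 1
  WHERE  d.member_id IS NULL;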

Environment: SQL Server 2008, SSIS, HIPAA standards, Facets, SQL, MS Outlook, Java, HTML.

Confidential, Washington, DC

Data Analyst

Responsibilities:

  • Works with systems and security engineering to keep environment secure and operate at peak levels.
  • Performs transaction analysis and performance tuning of database applications and batch processes.
  • Implements database backup and recovery procedures and disaster recovery plans (a sample backup script follows this list).
  • Manages scheduled database processes supporting automated file transfers.
  • Maintains databases, including updating statistics, indexing, structure modifications, capacity planning, and index analysis. Manages database security, logins, users, roles, and object permissions.
  • Works with integrated project teams as project database administrator for large-scale modifications and new product releases. Ensures availability and integrity of RDBMS.
  • Plans and executes small-scale database modifications and SQL scripts for bug fixes, data repair, and functionality enhancements. Designs and develops data structures to store enterprise data.
  • Supports data replication to and from multiple web sites and maintains the underlying batch processes.
  • Assists software engineers and end users with query writing, database access, and general information.
  • Maintains online analytical processing server and processes to replicate production data to the server. Troubleshoots and maintains database encryption.
  • Confers with management and staff on policies and procedures, technical problems, and priorities that relate to database operations. Prepares project status/progress reports on assigned activities.
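
A minimal backup sketch in T-SQL, assuming a hypothetical database name and file paths; in practice these statements would be scheduled through SQL Server Agent jobs.

  -- Nightly full backup plus periodic transaction log backups (hypothetical names and paths)
  BACKUP DATABASE ClaimsDB
  TO DISK = N'D:\Backups\ClaimsDB_full.bak'
  WITH INIT, CHECKSUM, STATS = 10;

  BACKUP LOG ClaimsDB
  TO DISK = N'D:\Backups\ClaimsDB_log.trn'
  WITH CHECKSUM, STATS = 10;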

Environment: MS SQL Server 2008, SQL Server Management Studio 2008, Windows OS, MS Excel.

Confidential, Pleasanton, CA

Data Analyst/ QA Analyst

Responsibilities:

  • Used ETL tool for Extraction, Transformation and Loading the data into target database.
  • Executed Test Cases using positive and negative data in Quality Center’s Test Lab and reported results and defects using Quality Center’s Defects tool.
  • Tested the XML feeds received from a third-party source for data consistency.
  • Tested the ETL with XML as the source and data warehouse tables as the target (a sample reconciliation query follows this list).
  • Responsible for ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
  • Extracted data from flat files, Sybase, DB2, Oracle, and MS SQL Server 2005/2008 to load into the target database.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
  • Proficiently used Informatica to load data from fixed width and delimited Flat files.
  • Worked with Repository Manager, Designer, Workflow Manager and Monitor to import and load Source Definitions using Source Analyzer and Target Definitions using Warehouse Designer.
  • Built complex mappings using corresponding sources, targets, and transformations such as Update Strategy, Lookup, Stored Procedure, SQL, Sequence Generator, Joiner, Aggregator, Java, and Expression to extract data in compliance with the business logic.
  • Created SSIS Packages to migrate slowly changing dimensions.
  • Extensively used Informatica PowerCenter and created mappings with Update Strategy transformations to flag records for populating the desired slowly changing dimension tables.
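
A minimal sketch of the kind of source-to-target reconciliation query referenced above, assuming hypothetical staging and warehouse tables; the point is simply to compare row counts and amount totals on both sides of the load.

  -- Compare row counts and billed totals between source and target (hypothetical tables)
  SELECT 'source' AS side, COUNT(*) AS row_count, SUM(billed_amount) AS total_billed
  FROM   stg.claims_feed
  UNION ALL
  SELECT 'target', COUNT(*), SUM(billed_amount)
  FROM   dw.claim_fact;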

Environment: SQL Server, DTS, SSIS, SQL Server Data Tools, Visual Studio, Visual SourceSafe, Team Foundation Server

Confidential

MS SQL Server 2000 Database Administrator (DBA)

Responsibilities:

  • Installed and maintained MS SQL Server 2000 on Windows 2003 Server. Worked on SQL Server 2000 failover clustering which is built on top of a Windows 2003 server cluster.
  • Installed SQL Server service packs based on the errors and flaws that are found in the application
  • Generated Script files of the databases whenever changes are made to stored procedures or views.
  • Used Data Transformation Services (DTS), an Extract, Transform, Load (ETL) tool in SQL Server, to populate data from various data sources and created packages for different data loading operations for the application.
  • Managed the use of disk space, memory and connections.
  • Ran DBCC utilities and fixed data corruption in application databases (a sample maintenance script follows this list).
  • Worked on monitoring and tuning of SQL scripts. Created upgrade scripts for production database and supported it.
  • Created and managed schema objects such as tables, views, indexes, procedures, triggers & maintained Referential Integrity.
  • Performed Server & Application fine-tuning & troubleshooting. Monitored MS SQL Server performance and network issues.
  • Helped Development Team in deploying and testing the application, which uses SQL Server as a database.
  • Monitored databases for lock escalation, deadlocks, and resource utilization; captured long-running SQL/runaway queries using SQL Profiler.
  • Scheduled the backups for Databases and Transaction log. Performed restoration operations. Suggested backup strategies.
  • Implemented different types of Replication Models like Snapshot and Transactional.
  • Dropped and recreated the indexes on tables for performance improvements. Used Index tuning wizard for tuning of indexes.
  • Controlled user access rights, and managed data security.
  • Traced long running queries and deadlocks using SQL Profiler
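
A minimal maintenance sketch using SQL Server 2000-era commands, with hypothetical database and table names; the real scripts were tailored to the application databases.

  DBCC CHECKDB ('ClaimsDB');               -- verify logical and physical integrity
  DBCC SHOWCONTIG ('dbo.Claims');          -- report index fragmentation on a busy table
  DBCC DBREINDEX ('dbo.Claims', '', 90);   -- rebuild all indexes with a 90 percent fill factor
  UPDATE STATISTICS dbo.Claims;            -- refresh optimizer statistics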

Environment: MS SQL Server 2000, Windows 2003 Server, Windows
