Sr. Data Analyst Resume
Wilmington, DE
PROFESSIONAL SUMMARY:
- Over 8 years of industry experience as a Sr. Data Analyst with a solid understanding of Business Intelligence, Data Warehousing, Data Mapping, Data Validation, Data Integrity, Master Data Management (MDM), Reporting, Data Governance, Data Stewardship and Data Modeling.
- Experience as a Data Analyst in the healthcare domain, and as a passionate business, process, and qualitative analyst with expertise in understanding business problems.
- Experience in Health Administration - Claims processing (auto adjudication), COB, EOB/Drafts, Claims pricing and testing, HIPAA, enrollment, EDI, Medicare, Medicaid, CDHP (consumer driven health plans).
- Experience in various project management methodologies like Waterfall and Agile Project Management
- Expertise in data visualization, developing Dashboards, Charts, Graphs and reports using Tableau
- Extensive experience working with Business Intelligence data visualization tools with specialization in Tableau Desktop, and Tableau Server.
- Experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica PowerCenter and DataStage.
- Experience in building Data Warehouses (ER and Multi-Dimensional), Data Marts, Data Migration, System & Data Integration, Reporting & Dashboards, Data Management, Operational Data Stores and ETL Processes using Business Intelligence (BI), Data Warehouse (DWH) and Data Analytics for clients in major industry sectors like Health Care.
- Experience in creating ETL transformations and using Data Integration Designer and scheduling them on BI Server.
- Installation and Configuration of Informatica MDM Hub, Cleanse and Match Server, Informatica Power Center.
- Proficient in Unit Test Plans (UTP) and Integrated Test Plans (ITP); extensive experience in User Acceptance Testing (UAT).
- Expertise in Logical Modeling, Physical Modeling, Dimensional Modeling, Star and Snow-Flake schema
- Experience with healthcare system, Medicaid and with prime focus on claims adjudication, provider, eligibility and prior authorization.
- Proficient in the area of project implementation (SDLC), specifically in integration of business intelligence strategy, requirement gathering, requirement analysis, data modeling, information processing, system design, testing and training.
- Comprehensive knowledge of software development methodologies such as Waterfall, RAD, RUP and AGILE SCRUM.
- Expertise in implementing data modeling with Erwin, Dimensional Modeling, the Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP, FACT & Dimension tables, Physical & Logical data modeling and Oracle Designer.
- Experience with standards for medical transactions like 820 (premium payments), 834 (enrollment), 835 (medical claims payments), 837 (medical claims), 270 (eligibility inquiry), 271 (eligibility response), 276 (claim status inquiry), and 277 (claim status response).
- Experience in data modeling, data analysis, data mapping, SQL queries, and database design. Deep understanding of the data warehouse environment, including dimensional modeling, star and snowflake schemas, ETL and BI.
- Developed expertise in claims processing and direct clearinghouse claim processing and billing to insurance companies nationwide, all on a SQL backend.
- Medical claims experience in process documentation, analysis and implementation of 835/837/834/270/271 (X12 standards) processes of the medical claims industry from the provider/payer side.
- Deployed and uploaded the SSRS reports to SharePoint Server for the end users and involved in enhancements and modifications.
- Experienced in health information and health care services regulatory environment including HIPAA, Medicaid/Medicare, CCHIT, EDI and XML
- Experienced in payer rules, requirements, governmental regulations and HIPAA compliance. Interacted with claims payment and enrollment to review, analyze and document business processes.
- Experience in data extraction (ETL), MIS reporting, data conversion and manipulation, transformation, data cleansing and suppression, data validation & error trapping, testing, and loading large data sets into SAS from various applications/databases (Oracle, Teradata, DB2, SQL Server, Access, spreadsheets, text files) and environments (Windows, UNIX, TSO/MVS Mainframe).
- Have worked with Data Mart systems, BPM and SQL queries.
- Experience with the Informatica 8.x/7.x ETL suite (PowerCenter Designer, Repository Manager, Workflow Manager, Workflow Monitor).
- Experienced in full life cycle MDM development including data analysis, database design, data mapping and data load in batch.
- In-depth knowledge of the Rational Unified Process (RUP) methodology, Use Cases, Software Development Life Cycle (SDLC) processes, and Object Oriented Analysis and Design (OOA/D).
- Facilitated Change Management across entire process from Project conceptualization to Testing through Project Delivery, Software Development and Implementation Management in diverse Business and Technical Environments.
- Experience involving Business Process Re-Engineering, Software Re-Engineering using RUP methodology and ERP Systems Implementation and Continuous Improvement techniques.
- Software Development Life Cycle (SDLC) experience including requirements, specification analysis/design, and testing.
- Responsible for tracking, documenting, capturing, managing and communicating requirements using a Requirement Traceability Matrix (RTM), which helped in controlling the numerous artifacts produced by the teams across the deliverables for a project.
- Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases. Expertise in designing and developing Test Plans and Test Scripts.
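Several bullets above reference star and snowflake dimensional modeling. As a minimal, hedged sketch (table and column names are invented for illustration, not drawn from any engagement described here), a star schema and a typical BI aggregate query look like:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_provider (provider_key INTEGER PRIMARY KEY, npi TEXT, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE fact_claim (
    claim_id     TEXT,
    provider_key INTEGER REFERENCES dim_provider(provider_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    paid_amount  REAL
);
""")
con.execute("INSERT INTO dim_provider VALUES (1, '1234567890', 'Dr. Example')")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO fact_claim VALUES ('C-001', 1, 20240101, 125.50)")

# Typical BI query: aggregate the fact table by a dimension attribute.
row = con.execute("""
    SELECT p.name, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_provider p USING (provider_key)
    GROUP BY p.name
""").fetchone()
print(row)  # ('Dr. Example', 125.5)
```

A snowflake variant would further normalize the dimensions (e.g. splitting a provider's specialty into its own table); the fact table is unchanged.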
TECHNICAL SKILLS:
Databases: MS SQL Server 2014/2012/2008 R2/2005, Oracle 9i/10g, MS Access 2007
ETL Tools: SQL Server Integration Services (SSIS) 2005/2008/2012/2014, DataStage, DTS, SQL Server Analysis Services (SSAS).
Operating Systems: Windows 2012 R2/2008/2003 Advanced Server, Windows NT 4.0/3.51, and working knowledge of Linux.
Reporting Tools: SQL Server Reporting Services (SSRS), Crystal Reports 8/10, Excel Pivot.
Programming Languages: T-SQL, C, C++, C#.NET, VB.NET, Visual Basic for Applications (VBA), XML, and MDX.
Database Tools: MS SQL Server Enterprise Manager, SQL Profiler, Query Analyzer, Database Engine Tuning Advisor (DTA).
Other: Microsoft Office, MS Project, MS Visio, MS Lync, SQL*Loader, DbVisualizer, TFS, QuickTest Professional (QTP) and Quality Center (QC).
Internet Languages: JavaScript, XML, HTML, SOAP, REST, XSL, XSLT.
WORK EXPERIENCE:
Confidential, Wilmington, DE
Sr. Data Analyst
Environment: Microsoft Office Suite, MS Visio, SOA, Java, C, Windows NT, JavaScript, UNIX, HTML, Oracle, SQL, SAS, UML, RUP, Agile, Mercury TestDirector, Mercury LoadRunner
Responsibilities:
- Involved in designing and developing a data management application with a central operational database.
- Prepared Test Plans for each release, written Test Cases and executed them as part of Functional Testing. Prepared Test Reports and Deliverables and submitted for version releases.
- Validating the EDI 837 claim billing (professional, institutional and dental claims) & 835 (remittance advice or payment) claims adjudications.
- Developed test plans, created and managed test scripts for software development projects using HP Quality Center (QC).
- Implemented the project using the Agile Methodology based on iterative and incremental development, where requirements and solutions evolve through collaboration with cross-functional teams, producing artifacts across the Software Development Life Cycle (SDLC) phases.
- Extensively used Star and Snowflake Schema methodologies in building and designing the logical data model into Dimensional Models.
- Creating UI Mockups for the Data Visualizations and presenting to the Stakeholders.
- Implemented the project using the Waterfall Methodology to produce artifacts in the different phases of the Software Development Life Cycle (SDLC).
- Involved in business requirement analysis, functional requirement preparation, overall design, use-case analysis, initial summary, risk analysis, test plan development, test case generation, test script preparation, documentation and test execution.
- Created and executed various XML scripts for testing each individual functionality & Regression Testing of the applications for enhancements.
- Designed test cases and created components based on requirements in HP Quality Center.
- Responsible for source system analysis, data transformation, data loading and data validation from source systems to Transactional Data system and Warehouse System.
- Created ETL test data and tested all ETL mapping rules to the functionality of the Informatica mapping and Ab Initio graphs.
- Developed vision and strategy for building the Data Integration/Data Warehouse team.
- Created EDI maps and the corresponding requirement documents, Project Charter, and logical design documents for EDI transactions and code sets.
- Played key role in defining test automation procedure and standards, creating Win Runner and QuickTest Professional scripts for all the modules, which reduced the regression cycle drastically and improved the testing efforts for daily builds.
- Involved in the full HIPAA compliance lifecycle from GAP analysis, mapping, implementation, and testing for processing of Medicaid Claims.
- Played active role during daily scrum meeting and task planning as part of agile methodology
- Provided production support: application break fixes were immediately resolved, along with modifications/updates, MIS reporting, ETL, and data conversion and manipulation; occasionally new applications were implemented using SAS/UNIX scripting, X Manager, Reflections, Teradata, FUTRIX, and Tableau.
- Writing SQL queries to validate new data coming into the data warehouse from JDE migration project.
- Documented and gathered functional specifications for 837 (claims), 278 (authorizations) and 270/271 (eligibility inquiry and response).
- Developed use case documentation for system requirements, business process flows, and screen mockups using MS Visio.
- Created SAS programs that are used for data validation, statistical report generation and program validation and automated the Edit Check programs using Macros
- Used ETL methodologies for supporting data extraction, transformations and loading processing, in a corporate-wide-ETL Solution using Informatica Power Center.
- Responsible for attaining HIPAA EDI validation from Medicare, Medicaid and other payers of government carriers.
- Used Data Integration to design all ETL processes to extract data from various sources including live system and external files, cleanse and then load the data into target data warehouse.
- Extensively involved in data analysis, data modeling, Data mapping, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database.
- Conducted thorough analysis of the Business Requirements and created design specification to accomplish and achieve business needs related to Healthcare EDI X12 transactions such as 835 and 837.
- Development included the use of Test Driven Development, Extreme Programming for continuous Integration and enhancements.
- Performed User Acceptance Testing (UAT) with business users and stakeholders, gathered requirements through interviews, workshops, and existing system documentation.
- Developed and Documented Use Cases for the intended system.
- Performed Business Process Management (BPM), use case modeling using UML, and data modeling.
- Met project milestone dates for Technical Use Cases and Test Script completion dates for Phase releases.
- Worked with the Centers for Medicare & Medicaid Services (CMS) to provide quick, easy, and affordable access to the health care service of their choice. Created wireframes and mockup screens using MS Visio during elicitation of requirements.
- Extensively used Informatica Client tools like Informatica Repository Manager, Informatica Designer, Informatica Workflow Manager and Informatica Workflow Monitor.
- Used Teradata utilities FastLoad, MultiLoad and TPump to load data.
- Process mapping: created work-stream, reporting, and UI process diagrams to analyze the current business state, identify problem areas, and help create the best possible user experience in the future.
- Created automated Load test scripts using Load Runner.
- Provide IT quality compliance support to R&D and Medical Devices projects.
- Worked in identifying the efficient Design flow for migrating from SQL Server to Oracle Exadata Platform.
- Tested Inbound and Outbound Feeds of different format (XML, Pipe-Delimited, CSV)
- Conducted data-driven testing using QTP for backend testing.
- Involved in process modeling, conducted & Participated in Joint Application Development (JAD) sessions with System Users.
- Consultant at a medical device company on a GxP program documenting laboratory and production applications.
- Responsible for cost estimation and timelines for various Business Intelligence reports
- Used Test Director and Mercury Quality Center for updating the status of all the Test Cases & Test Scripts that are executed during testing process.
- Performed Load and Stress Testing using Load Runner.
- Automated confidence tests that run on new builds on a regular basis.
- Involved in setting up different configuration environment for compatibility testing and manual testing.
- Upgraded the existing Test Scripts and created new scripts for client application to be able to work for new versions and patches, which improved product quality.
- Evaluated testing results for each potential release build using Test Director, Quality Center and Bugzilla reports, listing summarized bug information in priority sequence, recommended viability of release for production.
- Involved in preparing the Traceability Matrix to design test cases.
- Prepared weekly action reports & QA feedback for the QA team & manager.
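The 837/835 validation work above operates on X12 interchanges. As a rough, hedged illustration only (real 837 files declare their own delimiters in the ISA header; this sketch hard-codes the common `~` segment and `*` element separators and uses a made-up fragment), pulling claim IDs and charge amounts from CLM segments could look like:

```python
# Minimal, illustrative X12 parsing: split segments on '~', elements on '*'.
# The fragment below is invented; CLM01 is the patient account number and
# CLM02 the total claim charge in an 837.
sample = "ST*837*0001~CLM*PATACCT01*125.5***11:B:1~SE*3*0001~"

segments = [s.split("*") for s in sample.strip("~").split("~")]
claims = {seg[1]: float(seg[2]) for seg in segments if seg[0] == "CLM"}
print(claims)  # {'PATACCT01': 125.5}
```

A production validator would also check envelope structure (ISA/GS/ST balancing) and the situational rules in the relevant implementation guide.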
Confidential, Raleigh, NC
Data Analyst
Environment: Mainframes, COBOL, JCL, DB2, Oracle, ASP.NET, C#, Java/J2EE, SDLC, Rose, Macros, MS SQL 2000, Cognos, ETL, MS Word, MS Excel, MS Visio, SAS, Pivot Tables, Agile/Scrum, LoadRunner, WinRunner
Responsibilities:
- Conducted Joint Application Development (JAD) sessions with IT groups. Identified the Key Changes, and participated in Stakeholder Management to communicate effectively with them
- Analyzed, transformed, resolved, and assisted with the implementation and ongoing support of the Health Insurance Exchange (HIX) environments utilizing strategic planning, business model analysis, process design, and systems analysis.
- Created maps & layouts for the HIPAA-mandated EDI transactions: 837, 834, 835, 270/271, 277/275, 276/277, 278.
- Involved in Migrating the Informatica objects using Unix SVN from Dev to QA Repository.
- Worked on Data mapping, logical data modeling used SQL queries to filter data within the Oracle database tables
- Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
- Interacted with stakeholders and business users to understand business requirements and provide insights/recommendations via interactive data visualization dashboard views/metrics developed in Tableau.
- Utilized the corporation-developed Agile SDLC methodology. Used Scrum Work Pro and Microsoft Office software to perform required job functions.
- Executed ETL operations for Business Intelligence reporting solutions using Excel.
- Using Shared Containers and creating reusable components for local and shared use in the ETL process.
- Developed and Documented timelines for Project Delivery, and managed Projects and Resources to successful completion.
- Generated and wrote appropriate ad hoc and routine report applications. Performed data management, data manipulation and data transformation among different reporting applications. Performed data and report analysis in support of quality assurance marketing activities.
- Designed, developed, and tested a new data integration / ETL process for a data warehouse running within an Oracle database environment.
- Created source to target mappings, performed data validation, and developed DDL scripts.
- Prepared implementation plan and negotiated agreement on EDI file specifications with Trading Partners and vendors in accordance with HIPAA guidelines
- Created SQL Server traces in SQL Server Profiler to collect a variety of information about SQL Server connections, stored procedures and Transact-SQL statements.
- Created traces using SQL Server Profiler to find long-running queries and modified those queries as part of performance tuning operations.
- Created on-demand ad hoc reports, parameterized reports, linked reports, snapshot reports, drill-down reports and sub-reports using SSRS.
- Created numerous SSIS packages (ETL) to migrate data from different server locations and heterogeneous sources like Excel, CSV, flat file, XML and text format data.
- Assisted for the business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse project.
- Interfaced with technical staff, trading partners, Clearinghouses and internal staff including programmers on all HIPAA related information and changes
- Used Rational RequisitePro for Managing and Configuring the Requirement Analysis effort.
- Involved in exhaustive documentation for technical phase of the project and training materials for all data management functions
- Performed data analysis and design, and created and maintained large, complex logical and physical data models and metadata repositories using Excel and the Snowflake database.
- Performed SQL Queries in Snowflake cloud-based data warehouse.
- Documented all data mapping and transformation processes in the functional design documents based on the business requirements.
- Worked in conjunction with the IT EDI group to develop standard product offerings with respect to HIPAA transaction sets 837/835.
- Followed Workgroup for Electronic Data Interchange EDI standards for testing that need to comply with the HIPAA transaction sets.
- Responsible for attaining HIPAA EDI validation from Medicare, Medicaid and other payers of government carriers.
- Researched, analyzed and projected cost of inventory for projects; reconciled balances; performed process mapping; developed policies and procedures to improve tracking and issuing of Watershed material; and prepared MS Excel and MS Word reports for senior management.
- Authored test cases for HIPAA EDI transactions, specifically 837.
- Prepared graphical depictions of Use Cases, Use Case Diagrams, State Diagrams, Activity Diagrams, Sequence Diagrams, Component Based Diagrams, and Collateral Diagrams and creation of technical design (UI screen) using Microsoft Visio.
- Worked on Documentum for Version Controlling, to maintain up to date changes in the Documents.
- Used Data Loader for insert, update, upsert, and bulk import or export of data from Salesforce Objects. Used it to read, extract and load data from comma separated value (CSV) files.
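The Data Loader bullet above describes CSV-driven upserts keyed on an object ID. A hedged sketch of that pattern against a local table (the CSV content, table, and column names are invented for illustration) is:

```python
import csv, io, sqlite3

# Illustrative Data Loader-style bulk upsert: CSV rows keyed on 'id' are
# inserted, and a repeated key overwrites the earlier row.
csv_text = "id,name\n1,Acme\n2,Globex\n1,Acme Corp\n"

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, name TEXT)")
for row in csv.DictReader(io.StringIO(csv_text)):
    # INSERT OR REPLACE gives upsert semantics on the primary key.
    con.execute("INSERT OR REPLACE INTO account VALUES (?, ?)",
                (int(row["id"]), row["name"]))

rows = con.execute("SELECT id, name FROM account ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme Corp'), (2, 'Globex')]
```

The same read-extract-load shape applies whether the target is a Salesforce object via Data Loader or a staging table in a warehouse.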
Confidential, Phoenix, AZ
Data Analyst
Environment: Scrum, Agile, Unix, ClearQuest, Documentum, Windows NT/XP, Oracle, SQL, SQL Server, Macros, Sybase, PL/SQL, VBScript, C#, Microsoft Office Suite, SOA, SDLC, MS Access, SAS, Microsoft Visio, Pivot Tables, DOORS, Cognos, C++, .NET, ASP.NET, Business Objects, TestDirector, LoadRunner
Responsibilities:
- Identify source systems, connectivity, tables, and fields, ensure data suitability for mapping.
- Designed data flow models, and performed functional decomposition analysis on various business processes.
- Responsible for business system analysis of customizing the BPS Risk Management product with involvement through the whole SDLC.
- Introduced Agile and RUP methodologies to reflect the liquid nature of the front office, improving time-to-market.
- Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
- Created Custom ETL procedures and TDE files in Alteryx from various Data Sources for ingestion into Tableau
- Performed data profiling, data definition and data mining; validated and analyzed data and presented reports.
- Conducted process mapping and process pinch points during a Kaizen activity to reduce tool and manpower waste.
- Creating mockups of Dashboards and individual data visualizations in MS Excel.
- Create the architectural artifacts for the Enterprise Data Warehouse and the Operational Dashboard, such as Entity Relationship Diagrams (ERD), the DDL scripts, the Conceptual Data Model, and technical as well as business documents.
- Performed performance tuning on sources, targets and mappings, and SQL (optimization) tuning.
- Created test scripts for all the test cases in PL/SQL.
- Evaluated and provided development templates for various ETL and data integration tools under consideration for purchase by the IT department.
- Involved in the full HIPAA compliance lifecycle from GAP analysis, mapping, implementation, and testing for processing of health insurance claims. Worked on HIPAA standard EDI transactions 270, 271, 276, 277, 278, 834, 835, 837 (P, I, D), 997 and 999 to identify key data set elements for the designated record set. Interacted with Claims, Payments and Enrollment, analyzing and documenting related business processes.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management architecture involving OLTP, ODS and OLAP.
- Extensively used SQL queries for data validation, data quality, data manipulation, data integration, data governance, data dictionary, data segmentation and data conversion from a legacy system to Guidewire; data profiling for the EDW; ER/Studio, Erwin, data cleaning and MicroStrategy reporting.
- Involved in creating data transformation logic to load the data into Staging & Data Warehouse using integration tools like Informatica.
- Involved with the testing team in reviewing the Test Plan, Test Cases, Unit and System Integration test plans, and UAT.
- Prepared Test Plans for each release, written Test Cases and executed them as part of Functional Testing. Prepared Test Reports and Deliverables and submitted for version releases.
- Involved in interactions with the Subject Matter Expert, Project Manager, Developers, and the end-users to gather key issues involved in the project and propose the new solutions.
- Used the guidelines of the Rational Unified Process (RUP) to strategize the Implementation of Rational Unified Process effort in different iterations and phases of the project.
- Developed detailed use cases using Rational Software.
- Worked with the UAT team to validate that the developed application will meet the business requirements.
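The data-profiling bullets above boil down to per-column metrics such as null rate and distinct count. A minimal sketch over an invented sample (field names and values are hypothetical):

```python
# Illustrative column profiling: null rate and distinct count per field.
rows = [
    {"member_id": "M1", "state": "DE"},
    {"member_id": "M2", "state": None},
    {"member_id": "M3", "state": "DE"},
]

profile = {}
for col in rows[0]:
    values = [r[col] for r in rows]
    non_null = [v for v in values if v is not None]
    profile[col] = {
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }
print(profile)  # state: one third null, 1 distinct non-null value
```

In practice the same metrics would be computed in-database (COUNT, COUNT(DISTINCT)) rather than in application code, but the shape of the report is the same.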
Confidential, St. Louis, MO
Data Analyst
Environment: Web Logic, Java, Quick Test Pro 8.2,Agile, SQL, Mercury Quality Center 9.0, SAS,Load Runner 8.2, Sun Java, GXP,MSJVM, Oracle 8i/9i/10g, SQL Server 2000, DB2, MS-Project 2000/2003
Responsibilities:
- Responsible for coding the complex client PBM plan designs within the adjudication support system.
- Experienced working in the area of data management, including Data Modeling, Metadata, Data Analysis, Data Integrity, Data Mapping and Data Dictionaries.
- Worked on the complete end-to-end processing of 837 claims testing; also helped debug and recommended changes in guidelines for 837 claims processing in Ramp Manager.
- Involved in data querying using MS SQL for data analysis; prepared Data Analysis and Data Profile Reports documenting data quality issues.
- Responsible for testing and production processing and daily support for the 837 HIPAA transactions.
- Wrote SQL Queries in MS Access to sort data and analyze the large set of data during project life cycle.
- Created automated Load test scripts using Load Runner.
- Created Use-Cases and Requirements documents to document business needs and involved in creating use cases based on HIPAA standards.
- Conducted GUI and functionality testing using QTP.
- Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
- Tested the data using the Logs generated after loading the data into Data warehouse.
- Prepared a Traceability Matrix with requirements versus test cases.
- Worked on Master Data Management (MDM) for maintaining the customer information and for the ETL rules to be applied.
- Created story boards of backlog items in Agile and developed items according to business needs.
- Created Tableau dashboards/reports for data visualization, Reporting and Analysis and presented it to Business.
- Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and data profiling on source data.
- Extensively Worked in Data Analysis, Data Requirement Analysis and Data Mapping for ETL process.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
- Involved in creating logical and physical data modeling with STAR and SNOWFLAKE schema techniques using Erwin in Data warehouse as well as in Data Mart.
- Worked on database objects like tables, views, materialized views, procedures and packages using Oracle tools like Toad, PL/SQL Developer and SQL*Plus.
- Involved in physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, and Dimensions), Entities, Attributes, OLAP, OLTP, Cardinality, and ER Diagrams.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
- Developed control files for SQL Loader and PL/SQL programs for loading and validating the data into the Database.
- Created stored procedures in both SQL Server and DB2 and was involved in several DTS packages.
- Created various transformation procedures using SAS, ETL and SAS Enterprise Guide.
- Involved in data modeling of both the logical design and physical design of the Data Warehouse and data marts using Star Schema and Snowflake Schema methodology.
- Involved in Informatica MDM processes including batch based and real-time processing.
- Responsible for reviewing data model, database physical design, ETL design, and Presentation layer design.
- Worked on Informatica Power Center tool -Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
- Conducted JAD sessions with SMEs and other stakeholders for open and pending issues
- Tested Inbound and Outbound Feeds of different format (XML, Pipe-Delimited, CSV)
- Conducted data-driven testing using QTP for backend testing.
- Developed and executed project validation deliverables, such as Validation Plans, Migration Plans, IQ/OQ/CQ, and System, Integration and UAT test scripts.
- Used SDLC (System Development Life Cycle) methodologies like Agile and Scrum.
- Responsible for cost estimation and timelines for various Business Intelligence reports
- Used Test Director and Mercury Quality Center for updating the status of all the Test Cases & Test Scripts that are executed during testing process.
- Involved in setting up different configuration environment for compatibility testing and manual testing.
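The load-and-validate bullets above (SQL Loader control files, PL/SQL programs that validate data into the database) follow a common staging pattern: accept clean rows, divert bad ones to a reject list. A hedged, self-contained sketch with invented data:

```python
import sqlite3

# Illustrative validate-then-load pass: good rows reach the target table,
# bad rows are trapped for review instead of aborting the whole load.
incoming = [("C-001", "125.50"), ("C-002", "not-a-number"), ("C-003", "80")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claim (claim_id TEXT, amount REAL)")

rejects = []
for claim_id, raw_amount in incoming:
    try:
        con.execute("INSERT INTO claim VALUES (?, ?)",
                    (claim_id, float(raw_amount)))
    except ValueError:
        rejects.append((claim_id, raw_amount))

loaded = con.execute("SELECT COUNT(*) FROM claim").fetchone()[0]
print(loaded, rejects)  # 2 [('C-002', 'not-a-number')]
```

SQL*Loader implements the same idea declaratively: the control file maps fields, and rejected records land in a bad file for error trapping.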
Confidential, Atlanta, GA
Data Analyst
Environment: XML, Java, J2EE, JSP, RUP, VoAVPN, Oracle, MS Visio, MS Office, DB2, Sybase, SQL Server Reports, WinRunner, LoadRunner, SAS, QuickTest Pro, Agile, Rational ClearCase, DOORS, IBM WebSphere
Responsibilities:
- Used the guidelines and artifacts of the Rational Unified Process (RUP) to strategize the Implementation of RUP effort in different iterations and phases of the Software Development Life Cycle.
- Involved in requirement gathering phase (Provider, Claim components and HIPAA).
- Utilized the corporation-developed Agile SDLC methodology. Used Scrum Work Pro and Microsoft Office software to perform required job functions.
- Met with business users, gathered business requirements and prepared the documentation for requirement analysis.
- Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
- Tested the data using the Logs generated after loading the data into Data warehouse.
- Prepared a Traceability Matrix with requirements versus test cases.
- Worked on Master Data Management (MDM) for maintaining the customer information and for the ETL rules to be applied.
- Followed the UML based methods using Microsoft Visio to create: Use Cases Diagrams, Activity Diagrams, State Chart Diagrams, Sequence Diagrams and Collaboration Diagrams.
- Performed extensive data modeling to differentiate between the OLTP and Data Warehouse data models.
- Worked on data warehousing, batch and data integration analysis, dimensional modeling, ETL development, business intelligence (Tableau) and file transfer.
- Analyzing and mining business data to identify patterns and correlations among the various data points.
- Working closely with data mapping SME and QA team to understand the business rules for acceptable data quality standards.
- Validated that the data flow and control flow transformations work according to functionality in SSIS packages.
- Performed data profiling on datasets with millions of rows on Teradata environment, validating key gen elements, ensuring correctness of codes and identifiers, and recommending mapping changes.
- Wrote complex SQL queries to identify granularity issues and relationships between data sets and created recommended solutions based on analysis of the query results
- Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
- Delivered an enterprise Data Governance, Data Quality, Metadata and ETL Informatica solution.
- Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
- Developed Tableau data visualization using Cross tabs, Heat maps, Box and Whisker charts, Scatter Plots, Geographic Map, Pie Charts and Bar Charts and Density Chart.
- Worked in Agile methodology and used JIRA to track day-to-day responsibilities.
- Interfaced with business users to verify business rules and communicated changes to ETL development team.
- Creating and executing SQL queries to perform Data Integrity testing on a Teradata Database to validate and test data using TOAD.
- Worked with data architects team to make appropriate changes to the data models.
- Worked on the ETL Informatica mappings and other ETL Processes (Data Warehouse)
- Involved in the full HIPAA compliance lifecycle from GAP analysis, mapping, implementation, and testing for processing of Medicaid Claims.
- Responsible for designing, developing, testing, documenting and delivering technology solutions for Microsoft Dynamics ERP, Management Reporter, Crystal Reports, and SQL Server including customizations and alterations to existing applications.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Promoted, Implemented and Tested Informatica objects from Development to UAT to Production environments.
- Prepared Logical Data Models that contains set of Entity Relationship Diagrams and Data Flow Diagrams and supporting documents and descriptions of the Relationships between the data elements to analyze and document the Business Data Requirements.
- Responsible for providing analytical support in the Design, Development and Implementation of Project.
- Verified the Business Scenarios on new builds to allow extended testing by the QA team.
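The granularity checks described above (SQL queries that find duplicate keys breaking a table's expected grain) reduce to a GROUP BY ... HAVING query. A minimal sketch over invented staging data (table and key names are hypothetical):

```python
import sqlite3

# Illustrative grain check: a staging table expected to be unique per
# (member_id, svc_date) is probed for duplicate keys.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE stg_claim (member_id TEXT, svc_date TEXT)")
con.executemany(
    "INSERT INTO stg_claim VALUES (?, ?)",
    [("M1", "2024-01-01"), ("M1", "2024-01-01"), ("M2", "2024-01-02")],
)

dupes = con.execute("""
    SELECT member_id, svc_date, COUNT(*) AS n
    FROM stg_claim
    GROUP BY member_id, svc_date
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('M1', '2024-01-01', 2)]
```

An empty result confirms the expected grain; any row returned identifies a key whose duplication would inflate downstream fact-table aggregates.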
