
Data Analyst Resume


Dallas, TX

SUMMARY

  • Dedicated IT professional with 7 years of progressive experience in data analysis, business requirement analysis, quality assurance, design, development, and testing to further the success of various organizations' business goals and objectives.
  • Experience as a Data Analyst with a solid understanding of Data Modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and Client/Server applications used to implement management's and staff's business requirements into software applications in the healthcare industry.
  • Expert in writing and optimizing SQL queries in Oracle, SQL Server 2008, Teradata, and Microsoft Power BI.
  • Experienced in QlikView Server and Publisher maintenance, including creating scheduled jobs for QVD extracts and report reloads.
  • Expert in Data Modeling, Data Analysis, Data Visualization, and modern Data Warehouse concepts.
  • Designed various reports/dashboards to provide insights and data visualization using BI/ETL tools such as mainframe SAS, SSAS, SSIS, OLTP, OLAP, Business Objects, Tableau, Informatica PowerCenter, and DataStage.
  • Excellent knowledge of the Software Development Life Cycle (SDLC) with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Sound understanding of Big Data Analytics and its technologies, Digital and IoT concepts, AWS Cloud, and SMAC (Social, Mobility, Analytics, and Cloud).
  • Designed various reports/dashboards to provide insights and data visualization using BI/ETL tools such as Business Objects, Tableau, and Pentaho.
  • Extensive experience working with Tableau 9.0 and 10.0 Desktop along with Tableau Server. Knowledge of claims billing (procedure codes, modifiers, diagnoses).
  • Worked on healthcare standards such as HIPAA 4010, 5010, and ICD-9/10.
  • Working knowledge of healthcare Technology standards such as HL7 and ISO standards
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting and configuring Korn shell jobs for Informatica sessions.
  • Built dashboards using techniques for guided analytics, interactive dashboard design, and visual best practices
  • Expertise in the ETL (Extract, Transform, and Load) of data into a data warehouse/data mart and in Business Intelligence (BI) tools such as the Business Objects modules (Reporter, Supervisor, Designer, and Web Intelligence).
  • Results-oriented Joint Application Development (JAD) facilitator and meetings coordinator with excellent interpersonal skills.

TECHNICAL SKILLS

Data analysis: Requirements Gathering, JAD sessions, Process/Production Model analysis, Data Normalization, Cleansing, Profiling, System Design, Data Architecture, internal standards development, Metadata and Reports, Source and Target System Analysis

Languages: SQL, T-SQL, PL/SQL, Basic C, UNIX

MS Office Suite: MS Word, MS PowerPoint, MS Excel, MS Access

Database Systems: SQL Server, Oracle, Teradata, DB2

Operating Systems: Microsoft Windows, Linux, Unix

ETL and Reporting Environment: SQL Server, SSIS, SSRS, Informatica, SAS, DataStage, Qlikview, Tableau

Data Modeling Tools: Power Designer 16.5, Erwin, ER/Studio

PROFESSIONAL EXPERIENCE

Confidential - Dallas, TX

Data Analyst

Responsibilities:

  • Expert in data validation, cleansing, consolidation, and mining of data for consistency, accuracy, and compliance with standards.
  • Captured data lineage for all top-level reports by validating the authorized data sources against the system of record and system of origin.
  • Performed analysis on the existing data model to understand the model's methodology and help QlikView developers understand the requirements.
  • Created scheduled jobs for QVD extracts and report reloads using QlikView Server and Publisher.
  • Validated the data at the QVD and dashboard levels using QlikView.
  • Provided tech team support for SDLC Data Warehouse.
  • Assisted Project Managers in establishing plans, risk assessments and milestone deliverables.
  • Designed data models using Oracle Designer.
  • Designed programs for data extraction and loading into Oracle database.
  • Managed database tables, procedures and indexes.
  • Created SQL*Loader scripts to load legacy data into Oracle staging tables and wrote SQL queries to perform Data Validation and Data Integrity testing.
  • Implemented enhancements to existing software products.
  • Performed data manipulation using Informatica and Oracle software.
  • Performed Source System Analysis (SSA) to identify the source data that needs to be moved into the target tables.
  • Defined frameworks for Operational data system (ODS), Brokerage data warehouse (BDW), Central file distribution (CFD) and Data Quality (DQ) and created functional data requirement (FDR) and Master Test Strategy documents.
  • Defined target load order plan and constraint based loading for loading data correctly into different target tables.
  • Designed and created a Data Quality baseline flow diagram, which includes error handling and test plan flow data.
  • Created Test Plans, Test Cases, and Test Scripts for all testing events such as System Integration Testing (SIT), User Acceptance Testing (UAT) and Unit Integration Testing.
  • Coordinated with the offshore vendor and established offshore teams and processes for code development.
  • Coordinated with the execution team to define the batch flow for the data validation process.
  • Involved in requirement gathering with the business team for building new Tableau reports and migrating existing BOBJ reports to Tableau.
  • Created Allowance Limit, Customer Loyalty, Periodical Volume, Brand Awareness, and other reports for measuring KPIs.
  • Converted existing BO reports to QlikView dashboards.
  • Developed Tableau data visualizations using cross tabs, heat maps, box-and-whisker charts, scatter plots, geographic maps, pie charts, bar charts, and density charts.
  • Created complex workbooks and dashboards by connecting to multiple data sources using data blending and joins.
  • Created attributes, filters, advanced chart types, visualizations, and complex calculations to manipulate the data, analyze it, and obtain insights into large data sets.
  • Developed SQL to extract the right data for analysis and built confidence with the business that requirements were accurate and unambiguous.
  • Interpreted allowance data residing in various data sources such as DB2, Teradata, and SQL Server tables, analyzed results using statistical techniques, and provided ongoing reports.
  • Identified, analyzed, and interpreted trends and patterns in complex Walmart sales, receiving, and allowance data by performing data decomposition.
  • Involved in data profiling and creation of data mapping documents.
  • Participated in tech meetings between Technical Architects and business teams.
  • Documented various artifacts, such as Requirements and Analysis and Design documents, in SharePoint.
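
The staging-load validation described above can be sketched as a pair of reconciliation queries; all table and column names here are illustrative, not taken from the actual project:

```sql
-- Hypothetical reconciliation between a staging table loaded by
-- SQL*Loader and its target: flag a row-count mismatch (Oracle syntax).
SELECT 'row_count_mismatch' AS check_name,
       (SELECT COUNT(*) FROM stg_claims) AS staging_rows,
       (SELECT COUNT(*) FROM dw_claims)  AS target_rows
FROM   dual
WHERE  (SELECT COUNT(*) FROM stg_claims) <> (SELECT COUNT(*) FROM dw_claims);

-- Orphan check: staging keys that never reached the target table.
SELECT s.claim_id
FROM   stg_claims s
LEFT JOIN dw_claims d ON d.claim_id = s.claim_id
WHERE  d.claim_id IS NULL;
```

Both queries return no rows when the load is clean, which makes them easy to wire into an automated batch-validation step.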

Environment: Windows, MS Office (MS Word, MS Excel, MS PowerPoint, MS SharePoint, MS Visio), Wireframes, Informatica, Oracle, Qlikview, Tableau, Teradata, SQL

Confidential - Southfield, MI

Data Analyst

Responsibilities:

  • Demonstrable expertise in core IT processes, utilizing ETL tools to query, validate, and analyze data.
  • Expert business and technical requirements documentation skills employing contemporary tools for data mapping, diagramming, Use Cases, and business rules to produce concise functional specifications.
  • Follow and assess the business process model, defining metadata rules and critical data elements.
  • Conduct analysis, gather requirements, develop Use Cases, data mapping, and workflow diagrams.
  • Develop a global incident management reporting dashboard for the DQM over multiple IM platforms.
  • Investigate unused modules of teh DQM and report viability and feasibility for implementation.
  • Comparative cost/benefit analysis between DQM modules and Inquiry Framework for DQ assessments.
  • Utilize multimedia office suite applications and conduct surveys for high level dashboard reporting.
  • Performing daily integration and ETL tasks by extracting, transforming and loading data to and from different RDBMS.
  • Creating complex SQL queries and scripts to extract and aggregate data to validate the accuracy of the data.
  • Business requirement gathering and translating them into clear and concise specifications and queries.
  • Prepare high-level analysis reports with Excel and Tableau; provide feedback on the quality of data, including identification of billing patterns and outliers.
  • Identify and document limitations in data quality that jeopardize the work of internal and external data analysts.
  • Wrote standard SQL Queries to perform data validation and created excel summary reports (Pivot tables and Charts).
  • Gather analytical data to develop functional requirements using data modeling and ETL tools.
  • Systems Documentation, change control/defect analysis and updates. Implementation testing.
  • Gathered data and documented it for further reference, and designed the database using the Erwin data modeler.
  • Experienced in logical and physical database design and development, normalization, and data modeling using Erwin.
  • Used ref cursors and collections with bulk bind and bulk collect to access complex data resulting from joins of a large number of tables when extracting data from the data warehouse.
  • Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle hints and explain plans.
  • Used Teradata as a source and a target for a few mappings. Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
  • Loaded data from an MS Access database to SQL Server 2005 using SSIS (creating staging tables and then loading the data).
  • Highly proficient in using T-SQL for developing complex Stored Procedures, Triggers, Tables, Views, User Functions, User profiles, Relational Database Models and Data Integrity, SQL joins and Query Writing.
  • Migration of MS Access to SQL SERVER 2012.
  • Requirements gathering, analysis, Use Cases, data mapping, and workflow diagramming.
  • Data quality analysis and execution of the Data Quality Management (DQM) package.
  • Wrote SQL queries using analytical functions.
  • Documented Business Requirements, Functional Specifications, User stories.
  • Created UML based diagrams such as Activity diagrams using MS Visio.
  • Perform data extrapolation and validation of reports for analysis and audits.
  • Created T-SQL statements (SELECT, INSERT, UPDATE, DELETE) and stored procedures.
  • Utilize SSIS for ETL data modeling, data migration, and analysis.
  • Performed drill down analysis reports using SQL Server Reporting Services.
  • Documenting the extent to which data fails to meet threshold reporting requirements.
  • Project-manage analytics for deployment within the Development Life Cycle.
  • Developed SQL scripts involving complex joins for reporting purposes.
  • Develop various SQL scripts and anonymous blocks to load data into SQL Server 2005.
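
The T-SQL stored-procedure and data-validation work above might look like the following minimal sketch; the procedure and table names are hypothetical:

```sql
-- Illustrative T-SQL stored procedure: returns staging rows whose
-- mandatory fields fail basic data-quality rules, with a reason code.
-- All object names are made up for the example.
CREATE PROCEDURE dbo.usp_ValidateCustomerLoad
AS
BEGIN
    SET NOCOUNT ON;

    SELECT CustomerID,
           CASE WHEN CustomerName IS NULL           THEN 'missing name'
                WHEN LEN(Phone) NOT IN (10, 11)     THEN 'bad phone length'
                ELSE 'unknown'
           END AS FailureReason
    FROM   dbo.Stg_Customer
    WHERE  CustomerName IS NULL
       OR  LEN(Phone) NOT IN (10, 11);
END;
```

A procedure like this can be called from an SSIS Execute SQL task after each load, so failed rows surface before the data reaches reporting tables.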

Environment: Windows, MS Office (MS Word, MS Excel, MS PowerPoint, MS Access, MS SharePoint, MS Visio), SQL, SSIS, SSRS, ETL, Erwin, Tableau

Confidential - Richardson, TX

Data Analyst

Responsibilities:

  • Created a logical data model from the conceptual model and converted it into the physical database design using Erwin.
  • Used SAS to build models to identify definite patterns and present the business with possible problems and feasible solutions. Identified requirements for possible new business flows.
  • Created reports using Cognos and/or Tableau, based on client needs, for dynamic interactions with the data produced.
  • Designed, developed, and deployed databases using MS Access to assist with inventory, payment tracking, and various other data control needs.
  • Used Excel and PowerPoint on various projects as needed for presentations or summarization of data to provide insight into key business decisions; performed data consolidation for weekly and monthly operational business reports and delivered effective and efficient solutions for dealing with large data sets.
  • Created database designs of fact and dimension tables and conceptual, physical, and logical data models using the Erwin tool.
  • Gathered and documented the audit trail and traceability of extracted information for data quality.
  • Worked with the Business Intelligence and ETL developers on the analysis and resolution of data-related problem tickets and other defects, and assisted them in designing efficient processes to load and manage the data, including a Data Quality Assessment to ensure the quality of the source data meets the information requirements.
  • Pulled sales order data and generated ad hoc stock-pulling lists daily using advanced Excel functions.
  • Created templates and optimized procedures, significantly decreasing time spent on daily work.
  • Performed in-depth data analysis that successfully made more than 4,000 slow-selling items competitive in the market.
  • Involved in troubleshooting, resolving, and escalating data-related issues and validating data to improve data quality.
  • Business requirement gathering and translating them into clear and concise specifications and queries.
  • Provided feedback on the quality of data, including identification of billing patterns and outliers.
  • Identified and documented limitations in data quality that jeopardize the work of internal and external data analysts.
  • Used SAS to mine, alter, manage, and retrieve data from a variety of sources and perform statistical analysis.
  • Performed data governance, defining processes concerning how data is stored, archived, backed up, and protected from mishaps, theft, or attack.
  • Used data warehousing for data profiling to examine the data available in an existing database and created a data mart.
  • Developed and tested PL/SQL scripts and stored procedures designed and written to find specific data.
  • Written PL/SQL Stored Procedures and Functions for Stored Procedure Transformation in Informatica.
  • Implemented PL/SQL scripts in accordance wif teh necessary Business rules and procedures.
  • Generated SQL and PL/SQL scripts to create and drop database objects including: Tables, Views, and Primary keys, Indexes, Constraints, Packages, Sequences and Synonyms.
  • Conducted or participated in requirement gathering workshops and design sessions necessary to capture teh business needs and develop models.
  • Interfaced with business and technology stakeholders to gather, analyze, and document business and data requirements.
  • Used HP Quality Center for UAT Test Case Management and defect tracking and resolution.
  • Designed and implemented basic SQL queries for testing and report/data validation.
  • Ensured the compliance of the extracts with the Data Quality Center initiatives.
  • Understood the new business requirements and planned the facilitation of data marts.
  • Prepared sales reports and interpreted them for managers for decision making.
  • Extracted key information from highly unorganized data using Excel and SQL.
  • Conducted data management and used statistical techniques to clean the data.
  • Tracked and reported issues to the project team and management during test cycles.
  • Utilized SSIS for ETL data modeling, data migration, and analysis.
  • Prepare high level analysis reports wif Excel and Tableau.
  • Analyzed and assessed invalid and missing data in the All Payer Claims Data.
  • Documented the extent to which data fails to meet threshold reporting requirements.
  • Created T-SQL statements (SELECT, INSERT, UPDATE, DELETE) and stored procedures.
  • Designed and implemented basic PL/SQL queries for testing and report/datavalidation.
  • Troubleshooting and performance tuning of PL/SQL scripts and stored procedures.
  • Defined and represented entities, attributes, and joins between the entities.
  • Extensively developed PL/SQL Procedures, Functions, Triggers and Packages.
  • Wrote UNIX shell scripts to automate loading files into the database using crontab.
  • Developing batch files to automate or schedule tasks.
  • Supported the development, pre-production, and production databases.
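
The PL/SQL batch-loading work described above follows a well-known bulk-processing pattern, sketched here with illustrative table and variable names:

```sql
-- Hypothetical PL/SQL block using BULK COLLECT with a LIMIT clause to
-- move rows from a source table into a reporting table in batches,
-- avoiding row-by-row context switches between SQL and PL/SQL.
DECLARE
    CURSOR c_orders IS
        SELECT order_id, order_total FROM sales_orders;
    TYPE t_orders IS TABLE OF c_orders%ROWTYPE;
    l_batch t_orders;
BEGIN
    OPEN c_orders;
    LOOP
        FETCH c_orders BULK COLLECT INTO l_batch LIMIT 1000;
        EXIT WHEN l_batch.COUNT = 0;

        -- FORALL issues the inserts as a single bulk-bound statement.
        FORALL i IN 1 .. l_batch.COUNT
            INSERT INTO rpt_orders (order_id, order_total)
            VALUES (l_batch(i).order_id, l_batch(i).order_total);

        COMMIT;
    END LOOP;
    CLOSE c_orders;
END;
/
```

A script like this is easy to drive from a cron-scheduled shell wrapper, which matches the crontab-based automation mentioned in the bullets.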

Environment: SAS, Erwin, Cognos, Tableau, Oracle, PL/SQL, SQL, SSIS, SSRS, MS Excel, MS Access, UNIX Shell Scripting
