
Sr. Data Analyst Resume


Boise, ID

SUMMARY

  • Over 8 years of experience as a Data Modeler and Data Analyst with high proficiency in requirement gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
  • Experience in performance tuning and debugging of mappings and sessions. Strong in optimizing mappings by creating and using reusable transformations, Mapplets and PL/SQL stored procedures.
  • Experience in Data Warehousing, Data Architecture and Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center.
  • Profound knowledge of data modeling, including Star Schema and Snowflake dimensional data modeling and 3NF normalized data modeling. Very good understanding of logical modeling and fine-tuned physical data modeling.
  • Worked on Tableau and created ad-hoc reports and dashboards.
  • Worked in creating different Visualizations in Tableau using Bar charts, Line charts, Pie charts, Maps, Scatter Plot charts, Heat maps and Table reports.
  • Involved in Design of the Enterprise Data Visualization Architecture. Defined best practices for Tableau report development.
  • Expert knowledge of UNIX shell scripting and a working understanding of Perl and Korn shell scripting.
  • Ability to use custom SQL for complex data pulls.
  • Well versed in system analysis, ER/Dimensional Modeling, Database design and implementing RDBMS specific features.
  • Skilled in visualizing, manipulating and analyzing large datasets, with the ability to design and develop effective reports. Proficient with MS Office applications and Excel (Pivot Tables, HLOOKUP, VLOOKUP, COUNTIF, etc.), and also experienced with Tableau and MS SQL Server.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import and Data Export using multiple ETL tools such as Informatica Power Center. Experience in testing, data validation and writing SQL and PL/SQL statements - stored procedures, functions, triggers and packages.
  • Experience in integration and extraction of data from various sources such as DB2, SQL Server, Oracle, Sybase, Teradata, MS Access and flat files into a staging area, and in using ETL methodologies to support data extraction, data migration, data transformation and loading with Informatica Power Center and IDQ.
  • Strong SQL Skills related to information retrieval and analysis. Exposure to Client Interaction, User requirement Analysis and Support.
  • Extensive experience in creating data scope and requirements. Strong expertise in understanding various data sources.
  • Experience working with data analysis tools for data lineage, metadata, and data profiling
  • Hands on experience in importing, cleaning, transforming, and validating data and making conclusions from the data for decision-making purposes.
  • Solid understanding of statistical analysis, predictive analysis, machine learning, data mining, quantitative analytics, multivariate testing, and optimization algorithms.
  • Worked on various databases: Oracle, SQL Server, Teradata and DB2.
  • Expertise in writing SQL queries, dynamic queries, sub-queries and complex joins for generating complex stored procedures, triggers, user-defined functions, views and cursors (a representative query sketch follows this list).
  • Extensive experience in developing Unit, Integration and UAT test plans and cases, as well as generating and executing SQL test scripts and test results.
  • Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
  • Worked on Data Model design, Data Extraction, Transformations and Loading, Mappings & Workflows, Customized Analytics Reports.
  • Highly skilled in Tableau Desktop for data visualization, reporting and analysis: cross maps, scatter plots, geographic maps, pie charts and bar charts, page trails and density charts.
  • Experience with data warehousing techniques such as Slowly Changing Dimensions, surrogate keys and snowflaking. Worked with Star Schemas, data models, E-R diagrams and physical data models.
  • Extensive knowledge on Data Profiling using Informatica Developer tool.
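
A representative sketch of the kind of SQL referenced above (joins plus correlated sub-queries). The table and column names are hypothetical, for illustration only:

    -- Hypothetical example: customers whose most recent order exceeds their average order value
    SELECT c.customer_id,
           c.customer_name,
           o.order_total
    FROM   customers c
    JOIN   orders    o ON o.customer_id = c.customer_id
    WHERE  o.order_date = (SELECT MAX(o2.order_date)
                           FROM   orders o2
                           WHERE  o2.customer_id = c.customer_id)
      AND  o.order_total > (SELECT AVG(o3.order_total)
                            FROM   orders o3
                            WHERE  o3.customer_id = c.customer_id);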

TECHNICAL SKILLS

Analytical Tools: Tableau (Desktop/Server/Public/Online/Reader) 9.x/8.x/7.x, Adobe Analytics, SSRS, Business Objects

ETL Tools: SAP BODS, Informatica Power Center, SSIS

Databases: Oracle, DB2 UDB, MS-SQL Server, Sybase, Teradata, Hadoop

Front End Tools: Microsoft Project, Microsoft Office

Methodologies: Data Modeling - Kimball/Inmon, Logical/Physical/Dimensional, Star/Snowflake Schema, ETL, OLAP, Waterfall, Agile.

Data Modeling Tools: Power Designer, ER Studio, MS Visio

Languages: SQL (SQL Server 2008), PL/SQL, Java, HQL

Operating System: Windows NT/2000, UNIX (HP UX, AIX)

Web Applications: HTML, XML

PROFESSIONAL EXPERIENCE

Confidential, Boise, ID

Sr. Data Analyst

Responsibilities:

  • Coordinated with DBA on database build and table normalizations and de-normalizations
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of business requirements mapped to test scripts, ensuring any change control in requirements leads to a test case update.
  • Performed administrative tasks, including creation of database objects such as databases, tables and views, using SQL DCL, DDL and DML statements.
  • Created advanced chart visualizations in Tableau using Dual Axis, Box Plots, Bullet Graphs, Tree maps, Bubble Charts, Pie Chart, Gantt chart, Histograms.
  • Involved in Creation of dashboards, stories and visualizations in Tableau. Created report schedules on Tableau Server.
  • Optimized DB2 SQL by reducing the number of table reads as part of performance tuning.
  • Performed Data mapping, logical data modeling, data mining, created class diagrams and ER diagrams and used SQL queries to filter data.
  • Maintained and enhanced SSRS reports and optimized them for reporting efficiency.
  • Created PL/SQL reports using SQL Developer and packaged them for Sybase.
  • Tested and migrated SSIS workflows and mappings from one repository to another.
  • Maintained the internal MS Access application whose routine jobs take data from the SQL Server database and load it into Teradata.
  • Created MS Excel reports for metadata extracts using MS Excel pivot tables.
  • Created and reported defects in the client's JIRA.
  • Validated web services manually and through automation using SOAP UI (XML).
  • Maintaining the DB2 database environments and troubleshooting performance problems.
  • Troubleshooting problems with SSRS relating to our databases.
  • Performed extensive Data Integrity testing by executing SQL Statements on Oracle & SQL database.
  • Tested the functionality and performance of web services using SOAP UI.
  • Created stored procedures, views, triggers and user-defined functions, scripting complex T-SQL for business logic.
  • Used UNIX commands for various processes while working in the UNIX environment.
  • Worked on SQL queries in a dimensional data warehouse as well as a relational data warehouse.
  • Created, documented and maintained logical & physical database models.
  • Wrote complex SQL queries to extract data from different databases such as DB2, SQL Server, MySQL, Teradata and Hadoop. Used HQL to retrieve data from Hadoop systems.
  • Performed data analysis utilizing tools such as Spotfire, Tableau, Pipeline Pilot, or SQL.
  • Designed star schemas for the detailed data marts and plan data marts, consisting of conformed dimensions (a DDL sketch follows this list).
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Extensively worked on documentation of Data Model, Mapping, Transformations and Scheduling batch jobs.
  • Generated the DDL using forward engineering and worked on merging and complete compare of physical models.
  • Used the Data Warehousing Life Cycle to identify data elements from the source systems, performed data analysis to come up with data cleansing and integration rules for the ETL process.
  • Identified the entities and relationship between the entities to develop Conceptual Model using ERWIN.
  • Created data mappings, tech designs and loading strategies for ETL to load newly created or existing tables.
  • Involved in extensive data validation by writing several complex SQL queries, and involved in back-end testing and working through data quality issues.
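
As referenced in the star-schema bullet above, the fragment below is a minimal, hypothetical sketch of a conformed dimension and a fact table with surrogate and foreign keys; the actual DDL was forward-engineered from the logical model, and the names here are illustrative only:

    -- Hypothetical conformed dimension
    CREATE TABLE dim_customer (
        customer_key    INTEGER      NOT NULL PRIMARY KEY,   -- surrogate key
        customer_id     VARCHAR(20)  NOT NULL,               -- natural/business key
        customer_name   VARCHAR(100),
        effective_date  DATE,
        end_date        DATE                                 -- supports slowly changing dimensions
    );

    -- Hypothetical fact table in the detailed data mart
    CREATE TABLE fact_sales (
        sales_key      INTEGER       NOT NULL PRIMARY KEY,
        customer_key   INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INTEGER       NOT NULL,
        sales_amount   DECIMAL(12,2),
        quantity       INTEGER
    );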

Environment: Tableau Desktop, Tableau Server, MySQL, Hadoop, PL/SQL, R, SharePoint, Spotfire, SQL, MS Visio, MS Excel, MS PowerPoint

Confidential, Dallas, TX

Sr. Data Analyst

Responsibilities:

  • Develop new statistical predictive models to forecast demand, including pricing models.
  • Worked on developing complex reports using custom SQL scripts to extract and present sales data.
  • Loaded the tables from different data sources into Tableau and created variety of graphs showing the factors responsible for revenue leaks. Involved in developing ad-hoc reporting for various sales operations for different customers using Tableau dashboards.
  • Built and published customized interactive reports and dashboards, and scheduled reports using Tableau Server.
  • Developed Tableau workbooks from multiple data sources using data blending.
  • Produced Crosstabs to display underlying data based on various graphs and charts created for further data analysis
  • Responsible for creating dashboards using bars, stacked bars, pie charts, scatter plots, line charts, Gantt charts, and maps.
  • Communicate and explain demand models (inputs, outputs, rationale, gaps) and recommendations to senior management. Involved in data management and data governance.
  • Partner with cross-functional teams to ensure that inventory flows and product assortments support performance goals and forecast demand
  • Developed and tested PL/SQL scripts and stored procedures designed and written to find specific data.
  • Wrote PL/SQL stored procedures and functions for the Stored Procedure transformation in Informatica.
  • Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.
  • Generated SQL and PL/SQL scripts to create and drop database objects, including tables, views, primary keys, indexes, constraints, packages, sequences and synonyms (a sketch follows this list).
  • Performed data extraction, data analysis and data manipulation, and prepared various production and ad-hoc reports to support cost optimization initiatives and strategies using R, Excel, Python and Tableau, applying machine learning and predictive modeling.
  • Conduct data mining and analysis to uncover trends and correlations to better predict demand.
  • Used Data Blending, groups, combine fields, calculated fields, and aggregated fields and spotlighting to compare and analyze data in different perspectives.
  • Created guided navigation links for Interactive dashboards in Tableau.
  • Created Actions in Worksheets and Dashboards for interactivity and to compare data against different views.
  • Used advanced Excel functions to generate spreadsheets and pivot tables.
  • Designed documentation and managed its delivery; provided weekly and monthly project status reports and resource time sheets to senior management and the client.
  • Managed and tracked project issues and debugged them as needed; developed SQL queries in SQL Server Management Studio and Toad and generated complex reports for the end users.
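
A minimal sketch of the kind of object-creation script described in the DDL bullet above. The object names are hypothetical, and the real scripts also covered constraints, packages and synonyms:

    -- Hypothetical DDL fragment: create supporting objects
    CREATE SEQUENCE seq_order_id START WITH 1 INCREMENT BY 1;

    CREATE TABLE stg_orders (
        order_id     INTEGER       NOT NULL,
        customer_id  VARCHAR2(20)  NOT NULL,
        order_total  NUMBER(12,2),
        CONSTRAINT pk_stg_orders PRIMARY KEY (order_id)
    );

    CREATE INDEX ix_stg_orders_cust ON stg_orders (customer_id);

    CREATE OR REPLACE VIEW v_large_orders AS
        SELECT order_id, customer_id, order_total
        FROM   stg_orders
        WHERE  order_total > 10000;

    -- Matching drop statements used when rebuilding the schema
    DROP VIEW v_large_orders;
    DROP TABLE stg_orders;
    DROP SEQUENCE seq_order_id;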

Environment: Tableau Desktop, Tableau Server, MySQL, PL/SQL, R, SharePoint, Microsoft SQL Server, MS Visio, Microsoft Office Suite (Word, Excel, PowerPoint), Pivot tables, Informatica Cloud.

Walgreens, Deerfield, IL

Data Analyst

Responsibilities:

  • Used ETL methodology for data extraction, transformation and loading in a complex Enterprise Data Warehouse (EDW).
  • Conducted team meetings and Joint Application Design (JAD) sessions.
  • Worked on the MDM by planning and coordinating the activities across multiple groups in the engineering organization.
  • Experience in OBIEE for different business verticals - Insurance, Life Sciences/Pharma, Public Sector - and horizontals such as Call Center, Service, Marketing and Sales.
  • Performed analysis, coding, testing, implementation and troubleshooting on production reports and tables, as well as producing ad hoc reports using SAS.
  • Modeled the dimensions and facts using Erwin for centralized data warehouse.
  • Designed ETL mapping specs for the ETL team, covering source-to-target mappings and the corresponding business rules involved.
  • Conducted meetings with business users to gather data warehouse requirements.
  • Led database development, ETL and report development activities, working closely with on-shore and off-shore developers to ensure proper data flow from SAP to Teradata and its availability for reporting solutions.
  • Contributed to development of new products and functionality by working in conjunction with editorial, data management and technology groups.
  • Ensured data was properly tagged, allowing for proper linking, search functionality and presentation in web-based products.
  • Contributed to increased collaboration and integration across development teams by seeking out opportunities to assist other teams with content-specific deliverables.
  • Resolved failed linking and processes and programming code-related errors by viewing transaction error log files.
  • Worked with ETL team using SSIS packages, to plan an effective package development process, and design the control flow within the packages.
  • Worked in the Tableau environment to create weekly, monthly and daily dashboards and reports using Tableau Desktop and publish them to Tableau Server.
  • Performed Tableau administration functions such as migration of Tableau workbooks and dashboards from one server/environment to another.
  • Implemented proactive Tableau environment health checks and performance threshold alerting.
  • Developed reports, charts, tables, graphs and intermediate statistical analyses using tools such as SAS, SQL, Tableau and MS Excel.
  • Used Agile Methodology of Data Warehouse development using Kanbanize.
  • Developed Star and Snowflake schema-based dimensional models for the data warehouse.
  • Used Teradata utilities (FastLoad, MultiLoad, TPump) to load data.
  • Implemented referential integrity using primary key and foreign key relationships in Teradata (a sketch follows this list).
  • Worked with the Business Analyst, QA team in their testing and DBA for deployment of databases in the correct Teradata Environment, business analysis, testing and project coordination.
  • Used SAS/V8 on the mainframe in conjunction with Teradata/Fastload to create and load Teradata tables.
  • Implemented the OBIEE 12c Dev, QA, UAT, and PROD environments.
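
A minimal sketch of the Teradata referential-integrity pattern referenced above, with hypothetical parent and child tables; the production DDL was considerably larger:

    -- Hypothetical parent (dimension) table
    CREATE TABLE store_dim (
        store_id    INTEGER NOT NULL,
        store_name  VARCHAR(100),
        CONSTRAINT pk_store PRIMARY KEY (store_id)
    );

    -- Hypothetical child (fact) table referencing the parent
    CREATE TABLE daily_sales (
        store_id    INTEGER NOT NULL,
        sales_date  DATE    NOT NULL,
        sales_amt   DECIMAL(12,2),
        CONSTRAINT pk_daily_sales PRIMARY KEY (store_id, sales_date),
        CONSTRAINT fk_daily_sales_store FOREIGN KEY (store_id) REFERENCES store_dim (store_id)
    );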

Confidential, MD

Data Analyst

Responsibilities:

  • Worked with the informatics department and was responsible for building predictive models using clinical, survey, or administrative data in support of informatics analytical projects.
  • Extensively used Teradata-SQL Assistant and Advanced query tool to write SQL queries.
  • Wrote various complex SQL queries with joins, subqueries and nested queries.
  • Coded complex SQL queries to retrieve data from the database depending on the need.
  • Created Cursors and Ref cursors as a part of the procedure to retrieve the selected data.
  • Used set operators in PL/SQL such as UNION, UNION ALL, INTERSECT and MINUS (a sketch follows this list).
  • Built various graphs for business decision making using Python matplotlib library.
  • Updated and manipulated content and files using Python scripts.
  • Connected a Python-based script to Teradata using the ODBC driver on UNIX.
  • Extracted 3 years' data from different databases (Teradata/MS SQL/Oracle) using SAS/SQL.
  • Imported and Exported data files to and from SAS using Proc Import and Proc Export from Excel and various delimited text-based data files such as .TXT (tab delimited) and .CSV (comma delimited) files into SAS datasets for analysis.
  • Created reports in RTF, PDF and HTML formats using SAS/ODS.
  • Built complex formulas in Tableau for various business calculations.
  • Developed Geo/Area Maps to show details on which states have more patients who are hospitalized using Tableau.
  • Created bar charts compiled from the data sets and added trend lines and forecasting on future trends in financial performance.
  • Compiled interactive dashboards in Tableau Desktop and published them to Tableau Server with Quick Filters, providing on-demand information at the click of a button.
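
As noted in the set-operator bullet above, a minimal sketch with hypothetical visit tables showing UNION, INTERSECT and MINUS; the actual queries ran against the clinical and administrative data sets described:

    -- Hypothetical comparison of patient lists across two source tables
    SELECT patient_id FROM inpatient_visits
    UNION                                    -- distinct patients appearing in either table
    SELECT patient_id FROM outpatient_visits;

    SELECT patient_id FROM inpatient_visits
    INTERSECT                                -- patients appearing in both tables
    SELECT patient_id FROM outpatient_visits;

    SELECT patient_id FROM inpatient_visits
    MINUS                                    -- inpatients with no outpatient visit
    SELECT patient_id FROM outpatient_visits;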

Confidential

SQL Developer

Responsibilities:

  • Worked on the integration of existing systems at the Confidential data warehouse and application systems level.
  • Extensively used SQL for data analysis and to understand and document data behavior.
  • Wrote SQL queries for database access and modification, and created tables, views, triggers and functions to facilitate data analysis and validation.
  • Generated reports through SQL and MS Excel.
  • Produced reports using Cognos and Business Objects for stakeholders.
  • Communicated and collaborated with clients and stakeholders.
  • Analyzed business requirements and performed feasibility checks.
  • Provided release artifacts to DBAs and Release Management teams and promoted code releases.
  • Assisted senior management with questions related to data analysis and patterns.
  • Debugged and troubleshot code and resolved data issues.
  • Made updates to design and test specification documents.
  • Participated in code reviews, data management standards and conventions, and data element naming standards.
  • Reverse-engineered existing databases to understand the data and business flows of existing systems and to integrate new requirements into the future enhanced and integrated system.
  • Designed the procedures for getting the data from all systems to Data Warehousing system.
  • Worked with ETL architects and developers to design performance-centric ETL mappings.
  • Extensive experience in reporting for CRM On Demand, Oracle Sales Cloud, OBIEE, etc.
  • Created batch processes using Teradata FastLoad, MultiLoad, BTEQ scripts, UNIX shell and Teradata SQL to transfer, clean up and summarize data (a sketch follows this list).
  • Good understanding of the data model across Eloqua, SFDC, Oracle Sales Cloud, Big Machine, Oracle CRM OnDemand, Siebel and OBIEE.
  • Experience in OBIEE for different business verticals - Insurance, Life Sciences/Pharma, Public Sector - and horizontals such as Call Center, Service, Marketing and Sales.
  • Performed analysis, coding, testing, implementation and troubleshooting on production reports and tables, as well as producing ad hoc reports using SAS.
  • Developed reports, charts, tables, graphs and intermediate statistical analyses using tools such as SAS, SQL, Tableau and MS Excel.
  • Involved in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica Power Center, as well as testing and writing SQL and PL/SQL statements - stored procedures, functions, triggers and packages.
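
A minimal, Teradata-style SQL sketch of the transfer/clean-up/summarize step referenced in the batch-process bullet above; in practice this logic was driven from BTEQ scripts and UNIX shell, and the table and column names here are hypothetical:

    -- Hypothetical clean-up step: drop unusable staging rows before summarizing
    DELETE FROM stg_opportunity
    WHERE  opportunity_id IS NULL
       OR  amount < 0;

    -- Hypothetical summarize step: roll cleaned staging rows up into a reporting table
    INSERT INTO rpt_opportunity_summary (region_code, close_month, opportunity_count, total_amount)
    SELECT region_code,
           EXTRACT(YEAR FROM close_date) * 100 + EXTRACT(MONTH FROM close_date) AS close_month,
           COUNT(*),
           SUM(amount)
    FROM   stg_opportunity
    GROUP  BY region_code,
              EXTRACT(YEAR FROM close_date) * 100 + EXTRACT(MONTH FROM close_date);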

Confidential

SQL Developer

Responsibilities:

  • Interacted with the end-user community to understand the business requirements, analyzed them and designed the specification document.
  • Gathered Requirements and performed Business Analysis and documented all the data after pulling the data from databases.
  • Hands-on experience in analyzing data and writing custom MySQL queries for better performance, joining tables and selecting the required data to run reports.
  • Restricted data for Users using row level security and user filters.
  • Utilized technology such as MySQL and Excel PowerPivot to query test data and customize end-user requests.
  • Developed SQL scripts and wrote triggers and cursors.
  • Performed data validation in a timely manner to check whether the data is correct and clean.
  • Generated reports by tracking, compiling and extracting the data.
  • Gained expertise in deduplication, removing duplicates from source data using the various data quality transforms available in SAP BusinessObjects Data Services (a SQL sketch of the same idea follows this list).
  • Scheduled the sessions to extract, transform and load data into warehouse database as per Business requirements. Improved the session performance by pipeline partitioning.
  • Extensively used ETL and SAP BODS to load data from Oracle, flat files into the target SAP system.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Testing of Sessions, Batches and the Target Data.
  • Re-engineered existing Informatica ETL processes to improve performance and maintainability.
  • Developed, executed and tested BODS jobs per business requirements.
  • Performed SQL query performance tuning to identify problem tables and understand the performance of the database tables.
  • Mapped business requirements to the business data model and worked with the system analyst on canonical mapping.
  • Used SQL Server Reporting Services to handle reporting on the designed hierarchy dimensions.
  • Developed Source to Target Data Mapping, Data Profiling, Transformation and Cleansing rules using SAP BODS.
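
The deduplication bullet above describes SAP BODS data quality transforms; purely for illustration, the snippet below expresses the same idea in SQL against a hypothetical source table, keeping the most recent row per business key:

    -- Hypothetical deduplication: keep the latest row per customer_id
    SELECT customer_id, customer_name, email, updated_at
    FROM (
        SELECT s.customer_id, s.customer_name, s.email, s.updated_at,
               ROW_NUMBER() OVER (PARTITION BY s.customer_id
                                  ORDER BY s.updated_at DESC) AS rn
        FROM   src_customer s
    ) ranked
    WHERE  rn = 1;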

Environment: MS Excel, MS Word, Agile, MS Visio, SQL server, Oracle 10g, DB2, SAP BODS.
