
Data Analyst / Tableau Developer Resume


Charlotte, NC

PROFESSIONAL SUMMARY:

  • About 7 years of experience in the analysis, design, administration, development, integration and maintenance of Data Warehouse, Business Intelligence, ETL and database platforms, including 2 years of experience with Big Data / Hadoop ecosystem technologies such as HDFS, Hive, Pig, HBase, Sqoop, Flume and Oozie.
  • Expertise in the design and development of Tableau visualizations and dashboards using Tableau Desktop, and in publishing them to Tableau Server, Tableau Public, Tableau Reader and external websites.
  • Extensive experience in various reporting objects like Facts, Attributes, Hierarchies, Transformations, Filters, Prompts, Calculated Fields, Sets, Groups, Parameters in Tableau.
  • Built dashboards in Tableau 8.0/8.2/9.2/10.0 using guided analytics, interactive dashboard design and visual best practices.
  • Extensive experience with the design and development of Tableau analytics, and with Business Intelligence platforms and processes across the Pharmaceutical, Communications, Utilities, Retail and Financial industries.
  • Created a variety of visualizations in Tableau, including bar charts, line charts, pie charts, maps, scatter plot charts, heat maps and table reports.
  • Extensive experience with query languages including SQL, PL/SQL, NoSQL and HiveQL.
  • Expertise in developing Pig Latin scripts and using Hive Query Language.
  • Collected data from different sources such as web servers and social media using Flume, stored it in HDFS and analyzed it with other Hadoop technologies.
  • Ability to import and export data between HDFS and Relational Database Management Systems using Sqoop.
  • Proficient in Data Analysis with sound knowledge in extraction of data from various database sources like MySQL, MSSQL and Oracle.
  • Expertise in developing advanced PL/SQL code through Stored Procedures, Triggers, Cursors, Tables, Views and User Defined Functions.
  • Experience with Level of Detail (LOD) expressions for complex calculations and advanced grouping to implement semi-additive measures.
  • Excellent knowledge of ETL processes and dimensional and relational data modeling, including Star and Snowflake schemas built from fact and dimension tables, as well as relational databases and client/server applications.
  • Comprehensive knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of its phases: Requirements, Analysis/Design, Development, Testing and Implementation.
  • Good understanding of business environments, excellent communication skills and the ability to work proactively in a fast-paced environment; a quick learner with a positive attitude towards new challenges.
  • Flexible and versatile to adapt to new environment with a strong desire to keep pace with latest technologies.

TECHNICAL SKILLS:

Programming languages: C, C++, R, Hive

Methodologies: Star schema, Snowflake schema

Hadoop/Big data: HDFS, HBase, Pig, Hive, Sqoop, Flume, Oozie

Databases: Microsoft SQL Server 2005/2008/2012, Oracle 11i, MySQL, DB2

Operating System: Mac OS and Windows (95, Vista, XP, 7, 8, 8.1, 10)

BI tools: Tableau 8.0/9.0/9.1/10.0/10.1

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Data Analyst / Tableau Developer

Responsibilities:

  • Involved in planning, defining and designing data based on business requirements, and provided documentation for future reference.
  • Implemented Tableau BI Dashboard reporting solution for various groups in the organization.
  • Published dashboards to Tableau Server, from which consumers could choose their viewing medium (laptop, PC, iPad).
  • Created extensive joins for data blending from several different data sources.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps and Gantt charts via the Show Me functionality.
  • Built complex formulas in Tableau for various business calculations.
  • Worked extensively with advanced analysis actions, calculations, parameters, background images, maps, trend lines, statistics, log axes, groups, hierarchies and sets to create detailed summary reports and dashboards using Tableau Desktop.
  • Worked on analyzing Hadoop cluster and different big data analytic tools including Pig, Hive, HBase and Sqoop.
  • Worked with highly unstructured and semi-structured data.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Involved in importing and exporting data (SQL Server, Oracle, CSV and text files) from local/external file systems and RDBMS to HDFS; loaded log data into HDFS using Flume.
  • Performed ETL data cleansing, integration and transformation using Pig; responsible for managing data from disparate sources.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the R&D team.
  • Designed a data warehouse using Hive, and created and managed Hive tables in Hadoop (a minimal sketch follows this list).
  • Created and maintained Technical documentation for launching Hadoop Clusters and for executing Hive queries and Pig Scripts.
  • Introduced several levels of dashboard drill-down and links to interrelated performance metrics.
  • Administered and provided production support for an enterprise-wide Tableau application spanning multiple key functional business domains.
  • Worked with business analysts, interpreted business requirements, and created server based Tableau dashboards with automatic updates.
  • Leveraged advanced features of Tableau such as calculated fields, parameters, and sets to support data analysis and data mining.
  • Created several dashboards with various chart types such as stacked bar charts, scatter plots, heat maps, and text tables that demonstrated insightful trends over a period of time.
  • Documented complete server architecture and solutions to troubleshoot performance, scheduling, and server related issues.
  • Created electronic spreadsheets.
  • Performed Tableau Server administration activities including Site and Project creations, adding users and providing permissions to workbooks.
  • Conducted training sessions for users on Tableau as a self-service BI tool for ad-hoc reporting.
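
The Hive warehouse bullet above can be illustrated with a minimal HiveQL sketch; the web_logs table, its columns and the HDFS location are assumed names used for illustration only, not details of the actual project:

  -- Hypothetical external table over log files landed in HDFS by Flume/Sqoop
  CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
      event_time  STRING,
      user_id     STRING,
      page_url    STRING,
      status_code INT
  )
  PARTITIONED BY (load_date STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS TEXTFILE
  LOCATION '/data/raw/web_logs';

  -- Register a daily partition once new files arrive
  ALTER TABLE web_logs ADD IF NOT EXISTS PARTITION (load_date = '2016-01-15');

  -- Simple aggregate of the kind that feeds downstream Tableau reporting
  SELECT load_date, status_code, COUNT(*) AS hits
  FROM web_logs
  GROUP BY load_date, status_code;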

Environment: Tableau 9.x/8.x (Desktop/Server), Hadoop, HDFS, Hive, Pig, Sqoop, Oozie, HBase, Oracle 11i, R

Confidential, Dallas, TX

SQL / TABLEAU Developer

Responsibilities:

  • Gathered business requirements from end users, analyzed and documented the same for various reporting needs.
  • Worked with complex SQL queries involving joins, sub-queries and nested queries.
  • Coded complex SQL queries to retrieve data from the database as needed.
  • Created cursors and ref cursors within procedures to retrieve the selected data.
  • Used set operators in PL/SQL such as UNION, UNION ALL and INTERSECT (a minimal sketch follows this list).
  • Updated and manipulated content and files using Python and R scripts.
  • Extracted 3 years’ data from different databases (MS SQL/Oracle) using SQL.
  • Developed and reviewed SQL queries with join clauses (inner, left, right) in Tableau Desktop to validate static and dynamic data.
  • Created reports using custom SQL for complex report views.
  • Created data extracts for better performance of dashboards and accessing data offline.
  • Created views and dashboards based on requirements and published them to Tableau Server for business and end user teams to be reviewed.
  • Designed and developed mock-up Tableau Dashboards to explore options for visualization of data, presentation, and analysis.
  • Created spreadsheets and line charts to display trends over time, and paired them with bar charts to show revenue growth.
  • Created interactive, parametrized dashboards by extracting data from different sources using data blending and applying actions (filter, highlight and URL).
  • Generated Interactive Dashboards with Quick filters, Parameters and Actions to handle views more efficiently.
  • Created Table of Contents (TOC), a common navigation page which had all the links to various reports and dashboards.
  • Developed various dashboards, used context filters, and sets while dealing with a huge volume of data.
  • Used parameters and input controls to give users control over certain values.
  • Integrated dashboards with a live connection into Microsoft PowerPoint using the Live Web add-on.
  • Blended data from multiple databases into one report by selecting primary keys from each for data validation.
  • Executed and tested required queries and reports before publishing.
  • Participated in weekly meetings, reviews and user group meetings, and communicated with stakeholders and business groups.
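
A minimal PL/SQL sketch of the ref cursor and set operator patterns described above; the get_orders_by_region procedure, the staging table names and their columns are assumed for illustration only:

  -- Hypothetical procedure returning a result set through a ref cursor
  CREATE OR REPLACE PROCEDURE get_orders_by_region (
      p_region  IN  VARCHAR2,
      p_orders  OUT SYS_REFCURSOR
  ) AS
  BEGIN
      OPEN p_orders FOR
          SELECT order_id, customer_id, order_total
          FROM   orders
          WHERE  region = p_region;
  END get_orders_by_region;
  /

  -- Set operators of the kind used when reconciling extracts from two sources
  SELECT customer_id FROM stg_orders_mssql
  UNION
  SELECT customer_id FROM stg_orders_oracle;

  SELECT customer_id FROM stg_orders_mssql
  INTERSECT
  SELECT customer_id FROM stg_orders_oracle;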

Environment: Tableau Desktop 8.0, Oracle 11i, MS SQL Server, MS Visio, Microsoft Office Suite (Word, Excel, PowerPoint)

Confidential, Winston Salem, NC

SQL Developer

Responsibilities:

  • Worked closely with management and end users to create and evaluate business requirements.
  • Extensively used SQL queries to retrieve the data from large databases for further decision making.
  • Created and optimized SQL queries, stored procedures, views, joins, aggregate functions, sub-queries, cursors, indexes, constraints, functions, and GROUP BY / ORDER BY clauses (a minimal sketch follows this list).
  • Designed, developed and maintained relational databases.
  • Modified and maintained SQL Server stored procedures, views, ad-hoc queries.
  • Created various custom maps to improve business understanding.
  • Experienced in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Used Star and Snowflake Schema methodologies.
  • Delivered reports to the business team in a timely manner.
  • Provide support to Business and Data Analyst(s) in gathering and/or clarifying data and reporting requirements from business owners.
  • Responsible for the Extraction, Transformation and Loading of data from Multiple Sources to Data Warehouse using SSIS.
  • Wrote custom code expressions in SSRS.
  • Created complex Stored Procedures, Functions, Indexes, Tables, Views and SQL Joins for applications.
  • Created and maintained data model/architecture standards, including master data management.
  • Monitored the nightly ETL process across widely differing source systems, including SQL-based databases and Excel files, and ensured that nightly backup jobs, cube processing and other ETL jobs did not interfere with one another.
  • Followed agile methodology and coordinated daily scrum meetings.
  • Performed data cleansing for accurate reporting; thoroughly analyzed data and integrated different data sources to run matching functions.
  • Tested and improved Report Performance, and wrote SQL, PL/SQL Packages, Stored procedures, Functions, and Triggers.
  • Defined, stored and maintained security information using Access Manager.
  • Coordinated all communications with the Database Management team.
  • Conducted Unit Testing of the reports to ensure correctness of the data and was responsible for resolving issues after QA testing.
  • Developed, managed, and coordinated all User Acceptance Testing.
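
A minimal T-SQL sketch of the view and stored procedure work described above; the dbo.vw_MonthlySales and dbo.usp_GetMonthlySales objects and their columns are assumed names for illustration only:

  -- Hypothetical view aggregating sales with a join and GROUP BY
  CREATE VIEW dbo.vw_MonthlySales
  AS
  SELECT  c.Region,
          YEAR(o.OrderDate)  AS OrderYear,
          MONTH(o.OrderDate) AS OrderMonth,
          SUM(o.OrderTotal)  AS TotalSales
  FROM    dbo.Orders o
          INNER JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
  GROUP BY c.Region, YEAR(o.OrderDate), MONTH(o.OrderDate);
  GO

  -- Hypothetical stored procedure exposing the view to reports
  CREATE PROCEDURE dbo.usp_GetMonthlySales
      @Region NVARCHAR(50)
  AS
  BEGIN
      SET NOCOUNT ON;
      SELECT OrderYear, OrderMonth, TotalSales
      FROM   dbo.vw_MonthlySales
      WHERE  Region = @Region
      ORDER BY OrderYear, OrderMonth;
  END
  GO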

Environment: Windows XP, MS SQL Server 2005/2008, MSBI, Microsoft Office Suite (Word, Excel, PowerPoint)

Confidential

Program Analyst

Responsibilities:

  • Determined data rules and conducted logical and physical design reviews with business analysts, developers and DBAs.
  • Experienced in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Created various physical data models based on discussions with DBAs and ETL developers.
  • Worked on the data mapping process from source systems to target systems.
  • Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
  • Extensively used Star and Snowflake Schema methodologies.
  • Developed and maintained a data dictionary used to create metadata reports for technical and business purposes.
  • Worked on performance tuning of the database, including indexing and optimizing SQL statements.
  • Used Erwin Model Mart for effective model management: sharing, dividing and reusing model information and designs to improve productivity.
  • Created tables, views, sequences, triggers, tablespaces and constraints, and generated DDL scripts for physical implementation (a minimal sketch follows this list).
  • Worked at conceptual/logical/physical data model level using Erwin according to requirements.
  • Performed data mining using complex SQL queries and discovered patterns.
  • Used SQL for Querying the database.
  • Involved in extensive data validation by writing complex SQL queries, performing back-end testing, and working through data quality issues.
  • Performed data analysis, Data Migration and data profiling using complex SQL on Oracle source systems.
  • Designed semantic layer data model.
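
A minimal DDL sketch of the kind of star schema fragment modeled above; the dim_date, dim_product and fact_sales tables and their columns are assumed names for illustration only:

  -- Hypothetical dimension tables
  CREATE TABLE dim_date (
      date_key      NUMBER(8)     PRIMARY KEY,
      calendar_date DATE          NOT NULL,
      month_name    VARCHAR2(10)  NOT NULL,
      year_number   NUMBER(4)     NOT NULL
  );

  CREATE TABLE dim_product (
      product_key   NUMBER(10)    PRIMARY KEY,
      product_name  VARCHAR2(100) NOT NULL,
      category      VARCHAR2(50)
  );

  -- Hypothetical fact table referencing the dimensions by surrogate key
  CREATE TABLE fact_sales (
      date_key      NUMBER(8)     NOT NULL REFERENCES dim_date (date_key),
      product_key   NUMBER(10)    NOT NULL REFERENCES dim_product (product_key),
      quantity      NUMBER(10),
      sales_amount  NUMBER(12,2)
  );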

Environment: DB2, Oracle 11g, MS Visio, MS Office, SQL Architect, SQL*Loader, SQL Server 2005, MS Excel.
