
Data Analyst Resume


CA

SUMMARY

  • 18 years of experience in Business Intelligence, ETL pipelines, data analysis, data mapping, and data science.
  • Experienced in creating ETL pipelines, dashboards, and top-management scorecards. Strong SQL, PL/SQL, HiveQL, Presto, and Python coding skills for pipelines. Applied algorithms to find trends, anomalies, and outliers in data.
  • Hands-on experience with analytical and ETL tools such as Dataswarm, Scuba, SAP Data Services, Pentaho, Tableau, Kibana, and SAP BO, and with project management tools such as Pivotal Tracker, Jira, Microsoft Project, and TFS.
  • Worked with many data stores, including Elasticsearch, Hadoop HDFS, Oracle RDBMS, MySQL, SQL Server, and SAP HANA.
  • Worked mainly in the BI space: multidimensional data modeling, multi-tenancy, and cloud computing. Used data modeling tools such as Erwin, Oracle Designer 2000, Oracle Modeler, and Visio.
  • Experienced in gathering requirements and creating business documents such as TRDs, process flows, data flow diagrams, and ETL and DW system architecture diagrams. Produced weekly status reports covering completed tasks, plans for the following week, and issue tracking.
  • Participated in sprint planning, reviews, and scrum meetings, and created scrum work items such as epics, scenarios, user stories, and tasks.
  • Process- and methodology-oriented, with strong analytical and problem-solving skills and a high degree of proficiency in written and verbal communication. Excellent at delivering clear, articulate presentations, with 10 years of Toastmasters experience. Well organized and proactive in task management and problem resolution. Highly collaborative with users and teammates.

TECHNICAL SKILLS

BI: Tableau Desktop 8.2/9/10, Unidash, Scuba, Business Objects XI R3 with Web Intelligence, SAP BO 4.0; follows Ralph Kimball's DW Lifecycle Toolkit methodology

Big Data: Elasticsearch (ELK stack), Hadoop HDFS, HBase, MapReduce, Hive, MongoDB, Cassandra, Couchbase, Kafka producers and consumers, Hortonworks for long-term storage; certified in Big Data solutions

ETL: Dataswarm Studio, Pentaho, SAP Data Services, SSIS, and Informatica

Operating Systems: Linux (accessed via PuTTY), Ubuntu 12, Windows 8

Databases: Elasticsearch, Oracle 11g, SQL Server 2012, MySQL, SAP HANA (in-memory)

Languages: SQL (strong), PL/SQL (strong), Python 3.4, Java, JavaScript, Perl 5.x

CRM: Siebel Call Center & Self Service.

Verticals: cloud computing, TV (Mediaroom/STB), mobile app data models, manufacturing (order management and supply chain), e-learning, financial, automotive, insurance, front office and customer service, Oracle 11i.

Case Tools: Erwin, Visio, Business Objects Universe Designer, Oracle Modeler.

Front End: Toad, Oracle SQL Developer, Confluence, Jira, wikis, Salesforce (SFDC) Apex, Crystal Reports, BO Dashboard Designer, and Tableau Server.

PROFESSIONAL EXPERIENCE

Confidential, CA

Data Analyst

Responsibilities:

  • Worked closely with the business and with members of the law enforcement response team. Analyzed terabytes of data using Hive and created many Tableau dashboards presenting data from Oracle and Hadoop Hive. Highly conversant with SQL and HiveQL, using both efficiently to derive analytics results from Hive and Oracle (see the query sketch at the end of this list).
  • Used Unidash to present top-level metrics on digital dashboards displayed at vantage points around the department: active ERs, Live# volumes, top NOC types, active cases on a global map, weekly case movements, percentiles, etc.
  • Used Argus to pull data from Hive and present it in tabular analytical formats tailored to various levels of user privileges.
  • Created pipelines using Dataswarm Studio (daggerDS), ran Presto queries via Daiquery, and used an OData service to connect Hive queries to Tableau.
  • Extensively used Tableau to create 15+ dashboards showing meaningful metrics such as site usage, risk areas, user behaviors, and personnel travel locations. Created government transparency dashboards showing the growth of transparency data access on a global map and country-wise growth trends of content restrictions and takedowns. Created a first-response-time dashboard showing the time taken on critical cases, converted to d:hh:mm:ss format (see the formatting sketch at the end of this list); it covers response times for ERs across NOC, LP, and Live#. Created task management dashboards using Oracle SQL and Tableau to present service center loads, request timeframes, and aging details.
  • Created dashboards showing requests from the LEON system for various case types and the movement of critical ones into other systems, using heat maps, bubble charts, and stacked bars as needed. Also created a site usage dashboard showing counts on a world map and a monthly usage trend from the Nectar Hive cluster. Created a US team load dashboard with monthly and average trending, a weekly assignment-growth trend with assignment groupings, and an NCMEC dashboard comparing that portion with Confidential and mapping the overlap of perpetrators vs. victims.
  • Also managed Tableau Server usage roles and other administration tasks.
  • Worked in Scrum and Agile project management using the team's project tool.
  • Used Tableau 10, Hive for big data, SQL, Scuba, Presto (via Daiquery), Dataswarm Studio, Argus, Unidash, Python, VBA, and Oracle.
  • Successfully completed Confidential Data Camp.
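
A minimal sketch of the kind of HiveQL aggregation described above, run from Python. The open-source PyHive client stands in here for the internal query tooling (Daiquery), and the table, columns, and host (`case_events`, `noc_type`, `ds`) are hypothetical stand-ins.

```python
# Minimal sketch: run a HiveQL aggregation from Python. PyHive stands in
# for the internal query tooling; table, columns, and host are hypothetical.
from pyhive import hive

conn = hive.Connection(host="hive-gateway.example.com", port=10000,
                       username="analyst")
cursor = conn.cursor()

# Weekly case counts by NOC type, the kind of metric fed into Tableau.
cursor.execute("""
    SELECT noc_type,
           weekofyear(created_ts) AS wk,
           COUNT(*)               AS case_cnt
    FROM   case_events
    WHERE  ds >= '2017-01-01'
    GROUP  BY noc_type, weekofyear(created_ts)
    ORDER  BY wk, case_cnt DESC
""")
for noc_type, wk, case_cnt in cursor.fetchall():
    print(f"week {wk:2d}  {noc_type:<12} {case_cnt}")
```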
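
The d:hh:mm:ss conversion used in the first-response-time dashboard reduces to integer division on a duration in seconds; a minimal Python sketch of the same arithmetic (a Tableau calculated field would mirror it):

```python
# Minimal sketch: format a response time given in seconds as d:hh:mm:ss,
# as in the first-response-time dashboard.
def format_response_time(total_seconds: int) -> str:
    days, rem = divmod(total_seconds, 86_400)    # 86,400 seconds per day
    hours, rem = divmod(rem, 3_600)
    minutes, seconds = divmod(rem, 60)
    return f"{days}:{hours:02d}:{minutes:02d}:{seconds:02d}"

assert format_response_time(93_784) == "1:02:03:04"  # 1 d 2 h 3 min 4 s
```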

Confidential, CA

Data Scientist

Responsibilities:

  • Used Python and shell scripting to load costing and customer usage data for all of our services from Azure into Elasticsearch via web services and Kafka, writing Kafka producers and consumers in Python (see the loading sketch at the end of this list).
  • Created Pentaho ETL transformations to load hierarchies of epics, scenarios, user stories, and work items from a SQL Server database into a MySQL data warehouse, and developed Tableau workbooks from that data to present agile management reports and dashboards to top management and on digital display boards throughout the department. From the same data, created release-level reports for end customers: story-point distribution, burn-up charts, forecast sprint completion with confidence bands, bug status, and open vs. closed tickets.
  • Very thorough in SQL; wrote many MySQL, SQL Server, and Cassandra queries, used analytical functions and joins in Tableau, and wrote Lucene queries.
  • Analyzed terabytes of data on Mediaroom video usage, customer data, and agile sprint and release status, loading it as needed into MySQL and Elasticsearch using Python, Pentaho, and RESTful web services; also analyzed IIS logs and Azure logs.
  • Identified patterns and anomalies and forecast data such as finance costings, subscriber additions, and sprint-point completion (see the forecasting sketch at the end of this list). Created 25+ workbooks and dashboards using Tableau.
  • Analyzed data and created Tableau dashboards for finance metrics such as service-level and component-level costs, calculating cost per user. Analytics included finance actuals and forecasts, system and environment performance dashboards, TV rates, and user and device counts; sprint metrics such as current sprint status, sprint over sprint, and committed vs. completed; and various KPIs such as scenario cascading reports, load tests, operator/NOC metrics, quality reports, program management reports, service management, and insights-team metrics for cloud.tv.
  • Developed a thorough understanding of interfaces and data from various systems: master data and usage data (subscribers, roles, features, URIs, APIs, IIS logs, client logs, system logs, device counts, usage metrics, and system performance counters such as request counts, success rate, average latency, and 95th-percentile latency) and sprint data (TFS releases, scenarios, sprints, iterations, area paths, bugs, tasks, etc.).
  • Managed Tableau Server, installing and upgrading as necessary. Automated delivery of Tableau workbooks to email inboxes, and automated the presentation of Tableau reports and dashboards on the company's digital display boards directly from Tableau Server.
  • Worked in Scrum and Agile project management, using Jira work items to track releases and sprints. Worked closely with the business and other members of the insights team.
  • Later also used Kibana 4.2, Grafana, and D3 to create visualizations and executive dashboards.
  • Mainly used Tableau 8.2/9, Pentaho 5/6, Python 3.4 (Canopy), and Elasticsearch for big data storage, along with Kibana 4.2, Logstash, Lucene queries and index shards, ES cluster monitoring with Marvel and user management with Shield, Kafka, Spark and Scala, PowerShell, MySQL and MySQL Workbench, HeidiSQL, DataStax DevCenter, Cassandra, MongoDB, SQL Server 2012 for TFS, cURL, web services, Java, Hortonworks Hadoop, HiveQL, HTML, CSS, JavaScript, jQuery, Node.js, and Dynatrace instrumentation, on both Azure and AWS. Hadoop certified by MindsMapped.
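
A minimal sketch of the Kafka-to-Elasticsearch loading path described in the first bullet. The open-source kafka-python and elasticsearch-py clients (8.x signatures) stand in for the original stack, and the topic, index, and host names are hypothetical.

```python
# Minimal sketch: consume usage records from Kafka and index them into
# Elasticsearch. Topic, index, and hosts are hypothetical stand-ins.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("http://localhost:9200")
consumer = KafkaConsumer(
    "azure-usage",                                   # hypothetical topic
    bootstrap_servers=["kafka1:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for msg in consumer:                                 # one usage record each
    es.index(index="cost-usage", document=msg.value) # 8.x keyword; 7.x uses body=
```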
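
The "forecast with confidence bands" reports come down to fitting a trend and widening it by the residual spread. A minimal NumPy sketch of that idea, with made-up story-point data (the production reports were built in Tableau):

```python
# Minimal sketch: linear trend forecast with a +/-2-sigma confidence band,
# the idea behind the forecast-sprint-completion reports. Data is made up.
import numpy as np

completed = np.array([21, 25, 23, 30, 28, 34, 33, 38])    # points per sprint
sprints = np.arange(len(completed))

slope, intercept = np.polyfit(sprints, completed, deg=1)  # fit the trend line
residual_sd = np.std(completed - (slope * sprints + intercept))

for s in range(len(completed), len(completed) + 3):       # next three sprints
    forecast = slope * s + intercept
    lo, hi = forecast - 2 * residual_sd, forecast + 2 * residual_sd
    print(f"sprint {s + 1}: {forecast:.1f} points (band {lo:.1f}-{hi:.1f})")
```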

Confidential, CA

Business Systems Integration Analyst

Responsibilities:

  • Interacted daily with the business team to define and implement the data migration (g-HR to Workday using Pentaho), the testing strategy, and rollout plans. This HR orgstore is the central store, and its data was used by many upstream and downstream applications.
  • Created test cases for the business logic of the Pentaho pipelines, based on the integration team's user requirements. Troubleshot .ktr (transformation) and .kjb (job) Pentaho files.
  • Coded Python tests for various transformations and invoked them using the Pentaho script executor.
  • Conducted end-to-end testing of the g-HR to Workday migration for the Master Data Services department.
  • Wrote Dremel and org-console SQL queries to test the ETL data in the orgstore. The data covered all aspects of employees and TVCs: employment status, scales, country, addresses, exit status, etc.
  • Checked XML input files (full and shard files) for proper formats and data validity (see the validation sketch at the end of this list).
  • Used macOS, Goobuntu Linux 12, orgstore (NoSQL), Dremel queries, Buganizer, Pentaho Data Integration 4.5/Kettle (PDI), Python, and Confidential trix.
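
A minimal sketch of the XML input checks described above, using only the standard library; the element names (`employee`, `employment_status`, etc.) are hypothetical stand-ins for the actual record layout.

```python
# Minimal sketch: validate employee XML files for well-formedness and
# required fields, in the spirit of the full/shard file checks above.
import sys
import xml.etree.ElementTree as ET

REQUIRED = ("employee_id", "employment_status", "country")

def validate(path):
    try:
        root = ET.parse(path).getroot()          # raises on malformed XML
    except ET.ParseError as exc:
        return [f"{path}: not well-formed ({exc})"]
    errors = []
    for i, record in enumerate(root.iter("employee")):
        for field in REQUIRED:
            node = record.find(field)
            if node is None or not (node.text or "").strip():
                errors.append(f"{path}: record {i}: missing {field}")
    return errors

if __name__ == "__main__":
    problems = [e for f in sys.argv[1:] for e in validate(f)]
    print("\n".join(problems) or "all files passed")
```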

Confidential, CA

BI Lead Analyst

Responsibilities:

  • Met with business end users to gather their reporting and analytical needs. Converted business requirements into technical documents, including ETL documents.
  • Evaluated ETL tools and selected the Pentaho suite based on business needs. Benchmarked presentation tools and recommended Tableau. Both Pentaho and Tableau were selected for functionality and features, effectiveness in data crunching, scalability, and cost of ownership, and the business was persuaded to procure them.
  • Participated in design discussions on the architecture of the BI solution and related servers and tools, including all components of Pentaho 4.8 and Tableau. Responsible for coordinating with IT on server procurement and on installation, setup, and configuration of all Pentaho components and Saiku analytics, and for installing Tableau Server and Desktop, including the necessary MySQL ODBC driver, data sources, and database connections.
  • Created complex ETL transformations in Kettle/PDI to pull RESTful data from various affiliate sites and Adobe SiteCatalyst for integration and processing in multi-level steps (see the extraction sketch at the end of this list). Designed and developed a star schema data model in MySQL covering all data elements from the sites, with dimensions created per business needs. Created all required MySQL structures and many complex SQL queries.
  • Coded in Python to convert XML to JSON (see the conversion sketch at the end of this list). Created many Tableau dashboards and workbooks from the loaded data, including reports and graphical charts.
  • Led a team of two developers and an analyst. Guided and motivated the team for optimal results, providing all necessary documents, including ETL mappings, data ER diagrams, DFDs, business vision, etc.
  • Created models in Metadata Editor and used them in Pentaho Report Designer to build interactive analytical reports. Created Mondrian OLAP cubes in Schema Workbench. Created user access and assigned reports and cubes for profile-based access. Monitored the system for smooth functioning of jobs and report access.
  • Certified in Big Data solutions; used Hadoop, HBase, MapReduce, HDFS, Hive, MongoDB, Cassandra, and Couchbase. Created PDI transformations and jobs on the Community Edition using the Hadoop adapter. Certified as a Big Data analyst at the Big Data World Conference.
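
A minimal sketch of the REST extraction step that fed the Kettle/PDI flows. The endpoint and paging parameters are hypothetical; the real sources were the affiliate sites and Adobe SiteCatalyst.

```python
# Minimal sketch: pull paged JSON from a REST endpoint for staging into
# the ETL flow. URL and parameters are hypothetical stand-ins.
import requests

def fetch_all(base_url, page_size=100):
    rows, page = [], 1
    while True:
        resp = requests.get(base_url,
                            params={"page": page, "per_page": page_size},
                            timeout=30)
        resp.raise_for_status()        # fail loudly on HTTP errors
        batch = resp.json()
        if not batch:                  # an empty page means we're done
            return rows
        rows.extend(batch)
        page += 1

traffic = fetch_all("https://affiliate.example.com/api/v1/traffic")
```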
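
The XML-to-JSON conversion mentioned above can be sketched with the standard library alone; this recursive version keeps attributes, child elements, and leaf text (the sample document is illustrative only).

```python
# Minimal sketch: convert XML to JSON using only the standard library,
# as in the XML-to-JSON step mentioned above.
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    node = dict(elem.attrib)                       # keep attributes
    for child in elem:                             # recurse into children
        node.setdefault(child.tag, []).append(element_to_dict(child))
    text = (elem.text or "").strip()
    if text and not node:
        return text                                # plain leaf: just the text
    if text:
        node["#text"] = text                       # mixed content: keep both
    return node

xml_doc = "<site id='7'><visits>120</visits><visits>95</visits></site>"
root = ET.fromstring(xml_doc)
print(json.dumps({root.tag: element_to_dict(root)}, indent=2))
# -> {"site": {"id": "7", "visits": ["120", "95"]}}
```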

Confidential, CA

DW Architect and BI Solutions Specialist

Responsibilities:

  • Met with internal and end users to understand and gather their reporting and business intelligence needs. Users came from Sales (both channel and carrier), Marketing, etc. Collected and synthesized those needs into requirements for an extensible, high-performance data architecture. Data sources included Salesforce (SFDC), mobile data, and app provisioning data.
  • Grouped sales data according to various criteria and codifications, such as NAICS, for consumption by the marketing department. Created many MySQL queries with complex joins across structures in the mobile app deployment database to show customer patterns such as monthly new memberships and churn rate (see the query sketch at the end of this list). Very strong in SQL queries and in joins across views and tables.
  • Worked with engineering to understand the collection of product usage statistics, as Confidential Mobile Workforce Management helps organizations with employees in the field meet their productivity and service goals. Analyzed the product/engineering data model in the mobile app database for integrity and suggested enhancements. Modeled data for scalability of the mobile usage data. Proactively identified shortcomings in the current system and guided the IT team in implementing resolutions.
  • Responsible for design and delivery of an enterprise-wide BI solution using the latest Pentaho 4.5. Coordinated with the IT infrastructure team to procure and set up the needed hardware and networks. Used all components of Pentaho, PDI, and BA Server.
  • Provided prototypes of job flows and report blueprints to the offshore team and helped the team deliver report development on time by providing solutions, following up on progress, and holding early-morning and late-night calls to suit various time zones.
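
A minimal sketch of the membership-pattern queries described in the second bullet, run through the mysql-connector-python client; the `subscriptions` table, its columns, and the connection details are hypothetical stand-ins.

```python
# Minimal sketch: monthly new memberships and cancellations from a
# hypothetical subscriptions table; churn rate is then cancellations
# divided by the active base at the start of the month.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="analyst",
                               password="secret", database="mobile_apps")
cursor = conn.cursor()

cursor.execute("""
    SELECT DATE_FORMAT(started_at, '%Y-%m') AS month,
           COUNT(*)                         AS new_members
    FROM   subscriptions
    GROUP  BY month
""")
new_by_month = dict(cursor.fetchall())

cursor.execute("""
    SELECT DATE_FORMAT(cancelled_at, '%Y-%m') AS month,
           COUNT(*)                           AS cancels
    FROM   subscriptions
    WHERE  cancelled_at IS NOT NULL
    GROUP  BY month
""")
for month, cancels in sorted(cursor.fetchall()):
    print(month, "new:", new_by_month.get(month, 0), "cancelled:", cancels)
```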

Confidential, CA

Senior BI Business Analyst (BIOD)

Responsibilities:

  • Gathered requirements and performed business analysis for the needs of various customers of the hosted Business Intelligence Cloud environment. A fully customer-facing role, with customers such as Jive, Confidential, Confidential, etc. Set up and ran conference calls for remote on-call customer support across time zones.
  • Converted these requirements into technical documents and, after modeling the data, created the data flows and UIs using 4GL tools such as Data Services and BO Web Intelligence. Responsible for daily, error-free operation of this SaaS system's ETL flows on a multi-tenant architecture.
  • Diligently analyzed customer issues, communicated status and roadmaps to customers, and provided timely solutions. Thoroughly investigated the ETL flows against the ETL logs to find issues. Also created many SQL Server queries, using joins between objects to retrieve data subject to object-level security in a multi-client environment.
  • Coordinated customer needs so the technical team could set up development stacks on virtual machines for customer tenants, including installation of the VM, BO Data Services, and Crystal Reports.
  • Created product requirements and supplied them to Engineering to continually update the product based on timely customer feedback, maintaining the quality of the solution through high performance and scalability at all times.
  • Coordinated and worked closely with sales and marketing on prototyping BO XI R3 solutions for customers, taking care of customers' immediate needs. Resolved customer issues and thereby doubled sales and customer retention.
  • Provided necessary information and logs to the product team, followed up, and got product issues resolved. Played the scrum master role. Tested the system end to end after sprint and quarterly maintenance deployments.

Confidential, San Jose, CA

BI Analyst and Project Lead

Responsibilities:

  • Collected requirements from business users across departments (Sales Operations, Customer Service, Quality, Finance, HR, Sales, and Purchasing) for their data warehousing and analytics needs, and was responsible for all stages from gathering user requirements to final delivery and system rollout. Created many functional specifications, design documents, and TRDs.
  • Analyzed and designed the database structures, SQL queries, ETL flows, presentation layouts, etc. for projects including the Quality data mart, TL 9000 dashboards, the service request (SR) metrics dashboard, and bug tracking and COO dashboards, launched using Web Intelligence. Ensured DW data was available on a timely basis for use by the IBM Web portal. Designed the main portal security architecture using WAS portal and LDAP (AD).
  • Thorough in creating complex SQL queries and in building materialized views. Created many ETL pipelines using PL/SQL procedures and managed the schedules of the ETL jobs (see the pipeline sketch at the end of this list). Re-architected the data warehouse after moving from built-in PL/SQL procedures to a dedicated ETL tool: created 25+ ETL pipelines in Data Integrator mirroring the SQL-procedure pipelines, cutting project rollout time from weeks to days.
  • Supported the smooth functioning of the data warehouse, Business Objects, and ETL components. Kept DW/BO products current with the latest releases, patches, and service packs. Took care of storage needs, installed Data Integrator/Data Services, and upgraded to 11.5. Installed the salesforce.com (SFDC) adapter for integration with Oracle to meet all architectural needs.
  • Redesigned the systems and added many DW enhancements, including adapting the DW to multi-currency enablement in Oracle ERP and adding key features such as embedded support and a pre-booking cleanup process. Designed fact and conformed dimension tables and table storage parameters, and created various transformations and mappings in Data Integrator.
  • Managed metadata for data coming from various sources, including Oracle ERP Order Management, supply chain management, Siebel Call Center and Self Service, Agile PLC, salesforce.com (SFDC), and ExtraView bug tracking.
  • Created various metrics/KPIs, including ERI, YRI, LTR, NPR, SO, OTD, and other TL 9000 standard metrics; monthly sales; top-ten customers; top-level part requirements; fulfillment and backlog reports; supplier performance; forecasting; opportunities in the pipeline; purchase order reports; cases by severity; cases by product category; RMA number and part; etc.
  • Facilitated workshops and trained users in BO, dashboard navigation, and DI. Created user profiles and managed user rights and profile-based access using the BO Supervisor module. Facilitated brainstorming sessions to discuss possibilities, prototypes, and mockups.
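
A minimal sketch of driving one of the PL/SQL-procedure pipelines and refreshing a reporting materialized view from Python via the cx_Oracle client; the procedure and view names are hypothetical stand-ins for the scheduled ETL jobs.

```python
# Minimal sketch: invoke a PL/SQL ETL procedure, then refresh the
# materialized view it feeds. Names are hypothetical stand-ins.
import cx_Oracle

conn = cx_Oracle.connect(user="dw_etl", password="secret", dsn="dwhost/ORCL")
cursor = conn.cursor()

cursor.callproc("etl_pkg.load_sales_fact")           # hypothetical pipeline proc
cursor.callproc("dbms_mview.refresh", ["SALES_MV"])  # rebuild the reporting MV
conn.commit()
```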

Confidential, San Jose, CA

Senior lead analyst

Responsibilities:

  • Served as liaison between business and IT for the worldwide Competency Development myLearning reporting project. Interviewed many users and business teams across theaters (EMEA, APAC, US) for requirements gathering and analysis. Worked closely with the upstream and downstream application teams, analyzed application data structures, and worked on data mapping and integrity to collate the data into the central Cisco data warehouse through the Operational Data Store, using SQL queries and PL/SQL procedures. Cleansed, combined, deduplicated, and standardized the data to conform to the dimensional model and DW bus (see the cleansing sketch at the end of this list).
  • Created clear and unambiguous requirements. Designed and developed numerous universes using object/row-level security, aggregate awareness, linked universes, derived tables, aggregate navigation, and contexts, and used drill-downs and scope of analysis in reports.
  • Designed and created complex analytical reports using Business Objects and deployed them enterprise-wide by publishing to the web for the Learning Management System (LMS). Trained the end-user community and junior developers. Tested the analytics and reporting portal along with the business. Worked closely with the CDW, Competency/myDevelopment, ODS, and Moses teams. Supported a large user and client base, creating ad hoc reports and resolving data and access problems.
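
A minimal sketch of the cleanse/combine/deduplicate/standardize step before conforming to the dimensional model, done here with pandas; the column names and values are hypothetical stand-ins for the learner records.

```python
# Minimal sketch: standardize and deduplicate ODS records before they
# conform to the dimensional model. Columns and values are made up.
import pandas as pd

ods = pd.DataFrame({
    "emp_id":  [101, 101, 102],
    "name":    [" alice  ", "ALICE", "Bob"],
    "theater": ["emea", "EMEA", "apac"],
})

ods["name"] = ods["name"].str.strip().str.title()      # standardize casing
ods["theater"] = ods["theater"].str.upper()            # conform theater codes
clean = ods.drop_duplicates(subset=["emp_id"], keep="first")
print(clean)
```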
