
Big Data Analyst/ Tableau Developer Resume


Raleigh, NC

PROFESSIONAL SUMMARY:

  • Big Data & BI Solution Engineer with 14+ years of IT experience; able to define and deliver the data management vision, goals, priorities, design principles, and operating policies that support the organization's business goals.
  • Hands-on experience in Big Data architecture (Cloudera Hadoop distributions) and components such as HDFS, MapReduce, Pig, Hive, Python, Spark, HBase, Impala, and Sqoop.
  • Experience in Hadoop data ingestion and curation for Data Lakes, including data governance, auditing, aggregation, validation, and reconciliation.
  • Expert in building Data Lake solutions, designing ingestion, transformation, and enrichment layers using Alteryx, Hive, HDFS, Impala, Spark, Sqoop, and Python/Unix scripts, along with data mapping for the enrichment layers.
  • Strong experience in migrating data warehouses and databases into Hadoop/NoSQL platforms.
  • Monitored and scheduled job workflows through Rundeck and Control-M, and implemented continuous integration using DevOps practices and Git.
  • Expert in requirements gathering, design, development, integration, documentation, testing, and build.
  • Expert in Data Visualization and analytics using Tableau desktop and Tableau Server.
  • Expert in developing enterprise BI applications using Tableau Desktop, Actuate, Jasper iReport and Studio, Business Objects, SQL Server, Oracle, and PL/SQL.
  • Expertise in the design and development of tabular, ad-hoc, drill-down/drill-through, cascading, master/sub, and list reports, crosstabs, charts, and interactive dashboards.
  • Experience includes overall technical support, troubleshooting, report design, and monitoring of system usage.
  • Knowledge of Data warehouse, Data ETL/ELT and Data Modeling and Data Visualization.
  • Application migration expert (tactical approach to strategic solution): Actuate to Jasper, Jasper to Tableau.
  • Experienced in the Agile model; expert in day-to-day Scrum activity, sprint planning, and story writing.
  • Smart worker, quick learner, enthusiastic and highly committed to the growth and success of the organization.
  • Strong analytical and critical thinking, innovative and strategic thinking and planning, software development, project management, debugging and problem-solving, and presentation skills, with the ability to prioritize and complete several work requests simultaneously.

TECHNICAL SKILLS:

Hadoop Platform: Cloudera Hadoop Distributions, HDFS, YARN, MapReduce, Hive, Pig, Impala, Sqoop, Flume, Spark, Kafka, Storm, Zookeeper and Oozie

BI Tools: Tableau Desktop, Tableau Server, Business Objects (Desktop Intelligence, Web Intelligence reports), Jasper Reports, Jasper Studio, Jasper Server, Jasper API, Actuate (8, 9, 10, 11), Actuate Server

Languages: Python, Scala, R, Pig Latin, Java, Visual Basic (VB), C++, JavaScript

Database: Oracle, Sybase, SQL Server, MarkLogic

ETL/Analytical Tools: Alteryx, Informatica 9.5

Server: Windows, Unix

Other Tools: MS Word, MS Excel, MS PowerPoint, TOAD, VersionOne, Jenkins, Visio, GitHub, Control-M

Methodology: Agile, Waterfall, DevOps

PROFESSIONAL EXPERIENCE:

Confidential, Raleigh, NC

Technology/ Tools - Hadoop Cloudera CDH 5.15, HDFS, MapReduce, Sqoop, Hive, Impala, Spark, Python, Oracle, HUE, Alteryx, Tableau Desktop, Tableau Server 10.3, Unix Script, Java, Kerberos Security, DevOps, Control-M, Git.

Big Data Analyst/ Tableau Developer

Responsibilities:

  • Maintained active relationships with business partners/stakeholders to understand business requirements.
  • Identified and ingested data from different sources, including databases, file systems, web services, and streams.
  • Performed analysis on source data for quality checks and evaluated detailed business and technical requirements.
  • Defined the ingestion strategy and performed data acquisition with incremental data loads into Hive.
  • Created source-specific BRDs and DDAs (Data Delivery Agreements).
  • Built the Hadoop Data Lake architecture for data ingestion, storage, processing, and visualization using tools such as Sqoop, FTPS, Unix scripts, Python scripts, Alteryx, HDFS, Hive, Impala, Spark SQL, PySpark, and Tableau reporting.
  • Designed logical and physical data models (using IBM InfoSphere) for Payment, Clearing, and Settlement sources.
  • Used Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive. Developed Spark code using Python and Spark SQL for faster testing and processing of data.
  • Pulled data from web services and loaded it into HDFS and Hive using curl, LFTP, and Python scripting.
  • Created, scheduled, and monitored Rundeck and Control-M jobs; built external and internal Hive tables in Parquet format with partitioning, bucketing, and vectorization per design needs; and maintained Impala metadata for downstream Tableau dashboard design.
  • Implemented authentication with Kerberos and authorization with Apache Sentry.
  • Performed ELT and analytics with the Alteryx tool. Used Git and SVN for code versioning.
  • Worked with Agile development methodologies, tools, and processes; participated in Scrum activities, sprint planning, story writing, backlog grooming, mid-sprint reviews, and project road-map design.
  • Developed Tableau dashboards/workbooks to display PCS summary reports, PCS playbook reports, FMU vs. MLE distribution, and Pending Exposure reports to the business.
  • Built and published customized interactive reports and dashboards, and scheduled reports using Tableau Server.
  • Mentored others on coding standards, data-mapping design, code integration, and performance tuning.
  • Collaborated with cross-functional teams (infrastructure, network, database, and application) on development, setup, and framework rollout activities.
  • Served as platform SME and provided Level-3 technical support for troubleshooting.
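The incremental-load pattern named above can be sketched in plain Python. This is a minimal illustration, not the production job: the table rows, column names, and dates are hypothetical stand-ins for a Sqoop-style incremental-append pull from a source database into Hive.

```python
from datetime import date

# Hypothetical in-memory stand-in for a source table; a real run would use
# Sqoop/JDBC against Oracle and land the rows in HDFS/Hive.
SOURCE = [
    {"id": 1, "loaded_on": date(2019, 1, 10)},
    {"id": 2, "loaded_on": date(2019, 1, 20)},
    {"id": 3, "loaded_on": date(2019, 2, 5)},
]

def incremental_load(source, last_watermark):
    """Pull only rows newer than the saved watermark (the idea behind
    Sqoop's --incremental append on a check column), then advance it."""
    new_rows = [r for r in source if r["loaded_on"] > last_watermark]
    new_watermark = max((r["loaded_on"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows, wm = incremental_load(SOURCE, date(2019, 1, 15))
# rows now holds ids 2 and 3; wm advances to 2019-02-05
```

Persisting `wm` between runs (Sqoop stores it in a saved job) is what keeps each load small instead of re-pulling the full table.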

Confidential, New York, NY

Technology/ Tools - Business Objects (Desktop Intelligence, Web Intelligence reports), Tableau Desktop 9.2, Tableau Server, Informatica, SQL Server, Oracle, Java, UNIX, Control-M, RainStor Database

Technical Architect

Responsibilities:

  • Led a team of consultants in analyzing business needs and identifying a suitable solution, and guided them through the solution's design and development.
  • Analyzed the existing database application and designed a prototype Universe. Created complex reports by linking data from multiple data providers, using free-hand SQL, PL/SQL, and functionality such as combined queries.
  • Created and monitored Control-M and Informatica jobs that update the PB data warehouse.
  • Wrote UNIX scripts to connect to the database, run user-specific queries, and email scheduled Excel reports to end users.
  • Created Web Intelligence and desktop reports to display asset, performance, transaction, holding, liability, and cash-flow reports. Created user prompts, conditions, and filters to improve the quality of report generation.
  • Supported Business Objects reports through root-cause analysis, user modification, and scheduling.
  • Developed interactive Tableau dashboards using heat-map views to display Client Reporting Group (CRG) rank based on assets, and filter actions to visualize the assets, holdings, liabilities, and cash flow of a particular CRG.
  • Migrated several BO reports to Tableau Desktop views.
  • Wrote SQL and PL/SQL code based on business needs in Oracle and SQL Server databases.
  • Analyzed and debugged data and queries with the DBA team.
  • Coordinated with release approvers to set up approvals for report deployment.

Confidential, New York, NY

Technology/ Tools - Cloudera 5.3, RHEL, Sqoop, Spark, Oozie, Hive, Scala, Python, Tableau 9.2, UNIX

Big Data Developer

Responsibilities:

  • Interacting with data scientists and gathering requirements.
  • Analyzing the tools and technologies required.
  • Identifying key tables to load into the Hadoop environment (Hive).
  • Importing data from Oracle using Sqoop and other files through FTP.
  • Scheduling jobs using Control-M
  • Developing and integrating Scala applications on Spark (RDDs) using the Hadoop cluster.
  • Transforming data and tables through Spark SQL.
  • Coding and implementation of Hive tables using monthly partitions.
  • Developing and scripting in Python and Scala.
  • Developing and running scripts on Linux production environment.
  • Reporting and visualizations using Business objects and Tableau Desktop.
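The monthly-partitioning idea above can be sketched in a few lines of plain Python. The transaction records and field names here are hypothetical; they stand in for a Hive table partitioned by a month column and populated from Spark with dynamic partitioning.

```python
from collections import defaultdict
from datetime import date

# Hypothetical transaction feed; the real data lived in Hive tables
# loaded via Sqoop from Oracle.
records = [
    {"txn_id": "a", "txn_date": date(2015, 3, 2), "amount": 100.0},
    {"txn_id": "b", "txn_date": date(2015, 3, 30), "amount": 50.0},
    {"txn_id": "c", "txn_date": date(2015, 4, 1), "amount": 75.0},
]

def partition_by_month(rows):
    """Group rows under a yyyy-MM key, mirroring a Hive table declared
    PARTITIONED BY (month STRING): queries filtered on the month column
    then scan only the matching directory, not the whole table."""
    parts = defaultdict(list)
    for r in rows:
        parts[r["txn_date"].strftime("%Y-%m")].append(r)
    return dict(parts)

parts = partition_by_month(records)
# parts has two partitions: "2015-03" (2 rows) and "2015-04" (1 row)
```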

Confidential, New York, NY

Technology/ Tools - Tableau Desktop (8.3, 9.1), Tableau Server (8.3, 9.1), JavaScript API, Informatica PowerCenter 9.5, Oracle 11g, PL/SQL, PL/SQL Developer, Temenos (Wealth Management Tool), Actuate 9 eRDPro and Server

BI Consultant

Responsibilities:

  • Development of Ad-Hoc and scheduled business reports using Actuate and Tableau desktop
  • Data analysis and data preparation for business requirement
  • Performed data-mining tasks in Tableau using A/B testing to gain more data insights and identify patterns.
  • Designed mockup reports and presented them to business users to give a high-level look and feel of report dashboards and worksheets.
  • Interacted with the functional team and business stakeholders to gather requirements and manage delivery.
  • Designed interactive dashboards using filter actions to visualize the asset allocation, equity summary, holdings, realized gain and loss, and fixed income of a particular account or reporting group.
  • Created actions, filters, parameters, and calculated sets for dashboards and worksheets in Tableau. Created dual-axis charts to show portfolio performance against indexes, and used tree maps, scatter charts, calculated fields, bins, table calculations, and lines for visualization.
  • Defined best practices for creating Tableau dashboards: matching requirements to the charts to be chosen, choosing color patterns per user needs, and standardizing dashboard size, look, and feel.
  • Analysis and Debugging of data and queries with DBA team
  • Wrote SQL and PL/SQL code based on reports and business requirements in the Oracle DB.
  • Performed query optimization to improve overall report-generation performance.
  • Published worksheets and dashboards to Tableau Server, with scheduling and maintenance.
  • Integrated Tableau Server with the delivery portal (user interface) using the Tableau JavaScript API.
  • Documentation and estimation of data visualization related activities
  • Support and maintenance of existing Actuate Report application along with Tableau
  • 3rd level Production support
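The A/B testing mentioned above reduces to comparing two conversion proportions. A minimal sketch with the standard two-proportion z-test follows; the conversion counts are hypothetical (the actual analysis was done inside Tableau).

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test with a pooled standard error, the usual
    significance check behind an A/B comparison; returns the z statistic
    and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided tail probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: variant B converts 250/2000 vs. A's 200/2000.
z, p = ab_test(200, 2000, 250, 2000)
# z is about 2.5, p about 0.012: significant at the 5% level
```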

Confidential, Atlanta

Technology/ Tools - Tableau Desktop 8.3, Tableau Server 8.3, JavaScript API, Informatica PowerCenter 9.5, Oracle 9i, PL/SQL, PL/SQL Developer, Core Java, VersionOne

Technical Specialist

Responsibilities:

  • BI Reporting Module Lead. Requirement gathering from functional team and designing of BI reporting framework.
  • Involved in reviewing business requirements and analyzing data sources from flat files, Excel, and Oracle for the design, development, testing, and production rollout of reporting and analysis projects within Tableau Desktop.
  • Involved in creating interactive dashboards and applied actions (filter, highlight, and URL) to dashboards.
  • Developed various analytical reports from multiple data sources by blending data on a single worksheet.
  • Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
  • Created dashboards by extracting data from different sources using parameters and calculations.
  • Involved in creating tree maps, heat maps, background maps, calculated fields, mapping, and hierarchies.
  • Involved in generating dual-axis bar charts, pie charts, and bubble charts with multiple measures, and in data blending when merging different sources.
  • Writing of SQL and PL SQL code based on reports and business requirement in Oracle DB
  • Involved in Query Optimization task to increase overall performance of Report generation
  • As this was an Agile project, actively contributed to planning sessions and story creation related to ETL and BI reporting.
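The data blending mentioned in the bullets above has a specific shape in Tableau: each source is aggregated at the linking field first, then the secondary aggregate is joined onto the primary. A minimal sketch with hypothetical sales and quota sources:

```python
from collections import defaultdict

# Hypothetical primary (sales) and secondary (quota) sources sharing "region".
primary = [
    {"region": "East", "sales": 120.0},
    {"region": "East", "sales": 80.0},
    {"region": "West", "sales": 90.0},
]
secondary = [
    {"region": "East", "quota": 150.0},
    {"region": "West", "quota": 100.0},
    {"region": "West", "quota": 20.0},
]

def blend(primary_rows, secondary_rows, key):
    """Tableau-style blend: aggregate each source at the linking field,
    then left-join the secondary aggregate onto the primary one (unlike a
    row-level join, which would duplicate rows before aggregating)."""
    agg_p, agg_s = defaultdict(float), defaultdict(float)
    for r in primary_rows:
        agg_p[r[key]] += r["sales"]
    for r in secondary_rows:
        agg_s[r[key]] += r["quota"]
    return {k: {"sales": v, "quota": agg_s.get(k)} for k, v in agg_p.items()}

blended = blend(primary, secondary, "region")
# East: sales 200.0 against quota 150.0; West: sales 90.0 against quota 120.0
```

Aggregating before joining is what keeps blended measures from being inflated when the secondary source has multiple rows per linking value.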

Confidential, Boston

Technology/ Tools - Jasper Reports 4.x, Jasper Server, Jasper API, Actuate 9 (eRDPro, server), Oracle 9i, PL/SQL, Java, UNIX, PLSQL Developer

Report Developer

Responsibilities:

  • Listing of Actuate functionalities (Generic and report specific).
  • Analysis of Jasper iReport Designer and its functionalities.
  • POC and Demo for 30 dynamic report specific functionalities.
  • Preparation of project estimation plan and complexity measurement
  • Conversion plan from Actuate to Jasper
  • Writing of Java code for Jasper reports using the Jasper API
  • End to end migration of reports from Actuate to Jasper
  • Deployment of Jasper reports on file system
  • Prepared Unit test cases for existing functionality as per the requirement and executed the same.
