Sr. Data And Reporting Consultant Resume

Texas

SUMMARY:

  • 9+ years of experience in the analysis, design and development of various data warehousing projects using Tableau, Spotfire and SAP Business Objects.
  • Good Experience with Hadoop and SAS.
  • Strong understanding of OLAP Data warehouse concepts, Dimensional Data Model, Star Schema, Snowflake Schema.
  • Strong SDLC knowledge including entire life cycle of analysis, design, build, test and deployment.
  • Strong business knowledge in Banking and Finance, Utilities and Pharma.
  • Expert in documenting business requirements, functional, technical specifications and test cases.
  • Capable of understanding the business environment and translating business requirements into technical solutions.
  • Capable of working as a Team Member or individually with minimum supervision. Flexible to adapt to any new environment with a strong desire to keep pace with latest technologies.
  • Good Communication and Interpersonal skills with the ability to interact with end - users, managers and technical personnel.
  • Experience in creating complex Tableau reports: line charts, bar charts, pie charts, combination charts, tree maps and map charts.
  • Good experience with Level of Detail expressions, filters, date functions, window functions, table calculations, sets, groups, etc.
  • Good Experience with Base SAS, SAS SQL, SAS/MACROS and SAS/ODS in Unix Environment.
  • Programming experience in Base SAS with procedures such as PROC SQL, PROC REPORT, PROC TABULATE, PROC PRINT and PROC SORT; DROP, KEEP and RETAIN statements; date manipulations; formats; and SAS EG.
  • Strong experience in writing complex SQL and PL/SQL programs and stored procedures in various databases: Oracle, SQL Server, Netezza, Teradata and DB2.
  • Created complex statistical and financial reports, trend over time reports, ranking and forecast reports, Executive Dashboards using TIBCO Spotfire.
  • Good Experience in customizing the Spotfire Dashboards using HTML, Python and Java scripting.
  • Expert in designing Spotfire dashboards/reports with multiple complex data sources (SQL, Excel, flat files), relational design and optimizing applications.
  • Experience with Hadoop ecosystem tools and databases: HDFS, Sqoop and Hive.
  • Experience in creating Complex Tableau Visualizations and Spotfire Dashboards.
  • Good Experience in Data analysis, data extraction using SQL.
  • Experience in building data marts and Data warehousing.
  • Experience with SAP HANA and HANA Studio.
  • Experience with Microsoft reporting tools SSRS.
  • Experience with PL/SQL procedures, Stored Procedures in Netezza, Functions and Triggers.
  • Strong Experience with tableau server, security access, publishing the dashboards and access control.
  • Created various Reports using Tableau and involved in Performance tuning.
  • Knowledge in Data Warehousing Concepts in OLTP/OLAP System Analysis and developing Database Schemas like Star Schema and Snowflake Schema for Relational and Dimensional Modeling.
  • Extensively worked on relational databases such as Oracle 11g/10g, IBM Netezza, Teradata, MS SQL Server, Sybase, Access and DB2, and database access tools Toad, Rapid SQL and Aginity.
  • Experience with ETL tools: Informatica PowerCenter, Workflow Manager and Workflow Monitor.
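
The ranking and window-function SQL work mentioned above can be sketched as follows. This is a minimal illustration only, using SQLite as a stand-in for Oracle/Netezza/Teradata and a hypothetical `sales` table rather than data from any actual project:

```python
# Illustrative sketch of a ranking/window-function query, run against
# SQLite as a stand-in for Oracle/Netezza/Teradata.
# The table and column names here are made up for demonstration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
INSERT INTO sales VALUES
  ('East', 'Ann', 500), ('East', 'Bob', 300),
  ('West', 'Cat', 700), ('West', 'Dan', 200);
""")

# Rank reps within each region by sales amount -- the shape of a
# typical "ranking report".
rows = conn.execute("""
SELECT region, rep, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
ORDER BY region, rnk
""").fetchall()

for region, rep, amount, rnk in rows:
    print(region, rep, amount, rnk)
```

The same `RANK() OVER (PARTITION BY ...)` pattern carries over to the databases named above, with only dialect-level differences.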

TECHNICAL SKILLS:

Business Intelligence Tools: Tableau 6.0/8.1/9.1/10.3, TIBCO Spotfire Professional 3.2/3.3/4.0/5.5/6.5, Spotfire Web Player, Business Objects X2, X3

Servers: Application Servers (WAS, Tomcat), Web Servers (IIS, HIS, Apache).

Operating Systems: Windows 7/Vista/NT/XP/ME/2000, UNIX.

Databases: IBM Netezza, SQL Server, Sybase, Oracle 11g/10g, Teradata, SAP HANA, HBase, Hive, MS Access and DB2.

Programming Languages: SAS, C, C++, Core Java, SQL, PL/SQL, COBOL, XML.

PROFESSIONAL EXPERIENCE:

Confidential, Texas

Sr. Data and Reporting Consultant

Responsibilities:

  • Gathered business requirements and performed data analysis and reporting.
  • Experience in creating Tableau Dashboards for Management as well as for Operations.
  • Good experience creating line charts, bar charts, map charts, cross tables, pie charts, etc. using Tableau.
  • Experience with various aggregate functions, quick filters, context filters, sets, grouping, calculated fields and table calculations in Tableau.
  • Experience publishing reports to Tableau Server and providing access depending on the type of user.
  • Extracted data from various data sources using pass-through SQL, created macros, exported and imported data into SAS, used PROC SORT, email automation, etc.
  • Good experience in building various charts, property controls, calculated columns and custom expressions in Spotfire.
  • Extracted data from RDBMS databases such as Oracle, Netezza and MS SQL Server using SQL and SAS.
  • Good experience with SAS procedures, pass-through SQL, PROC REPORT, DROP, KEEP and various date functions.
  • Experience with Tableau data extracts and packaged workbooks.
  • Used the ETL tool Informatica PowerCenter Designer to migrate data from Oracle to Netezza.
  • Experience with data blending, sets, context filters, parameters etc.
  • Good Experience with Tableau server and access controls.
  • Responsible for creating stored procedures in Netezza.
  • Experience with Hadoop and Tableau integration; built reports on data stored in Hive tables.
  • Experience deploying Java UDFs into Hadoop.
  • Good experience with the Hadoop ecosystem tools Sqoop, Hive, HDFS and Pig for data storage and transfer.
  • Created Hive tables and involved in performance tuning of Hive.
  • Experience with Java and Python scripting.
  • Used HTML to customize visualizations in Spotfire.
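
As a rough illustration of the table-calculation logic referenced above, a Tableau-style running total corresponds to code like the following; the monthly figures are made up for demonstration:

```python
# Sketch of the logic behind a Tableau "running total" table calculation,
# expressed in plain Python. The figures are illustrative only.
from itertools import accumulate

monthly_sales = [120, 90, 150, 80]           # hypothetical monthly values
running_total = list(accumulate(monthly_sales))
print(running_total)  # -> [120, 210, 360, 440]
```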

Environment: SAS 9.2, SAS EG, Tableau 10.3, Spotfire 7.0, Netezza, Oracle, Java, Toad, Informatica 9.3, Aginity, Hadoop Hive, HDFS, Sqoop

Confidential, Herndon, VA

Senior Business and Data Analyst

Responsibilities:

  • Gathered Business requirements and designed technical requirements documentation.
  • Involved in data analysis and Data reporting.
  • Responsible for gathering requirements from the business, analyzing them, and building SQL queries to extract data from various databases (Oracle, SQL Server and Netezza) using Toad and Aginity; created data marts for 6 business groups.
  • Extracted data from various data sources using pass-through SQL, created macros, exported and imported data into SAS, used PROC SORT, email automation, etc.
  • Consolidated several reports in a business group into a single Tableau dashboard for both Managerial and Analysts use.
  • Used the Hadoop ecosystem tool Sqoop to move data from RDBMS systems (Oracle, Netezza) into HDFS.
  • Experience with the processing tools Hive and Pig, and knowledge of MapReduce using Java programming.
  • Experience copying data between different RDBMS systems using Informatica.
  • Published dashboard to Server and involved in Access controls. Worked with Filters, calculated columns etc.
  • Experience with data blending, sets, context filters, parameters etc.
  • Involved in Performance tuning of Tableau reports and Scheduling.
  • Experience with Complex Filters, Action Filters, Sets, calculated Fields, Parameters etc.
  • Created various complex Cross tables, Bar Charts, Line Charts and Tree map and Geo Maps.
  • Created Data sources for Report creation.
  • Customized data by adding Property Controls, Filters, Calculations, Summaries and Functions.
  • Created Information Links using Information Designer to pull the data from the Oracle database and the SAS data sets required to create the reports.
  • Created stored procedures in Netezza for Performance Improvement.
  • Used Automation Job Service Builder to Export DXP file during Migration Process.
  • Developed reports using Spotfire Professional and published the same to Spotfire Server for Business Users use.
  • Experience in Writing SQL Queries.
  • Worked on Tableau and Hadoop integration.
  • Created various prototypes and showed them to the users.
  • Created Action Controls using Python Scripting.
  • Experience in enhancing the visualizations using Custom expressions.
  • Experience in creating Bookmarks.
  • Experience in Handling Multiple data tables in Analysis.
  • Converted various Excel, Tableau and BO reports to Spotfire.
  • Conducted unit testing of the reports developed in Spotfire Professional.
  • Responsible for migrating reports from the Development to the Acceptance/Production servers.
  • Developed a training program and user guide for Spotfire Professional and Spotfire Web Player covering development and design best practices.

Environment: Tableau 6.0/8.1/9.1, Microsoft SQL Server, SAP HANA, IBM Netezza, MS Access, Aginity, Sybase, Rapid SQL, SAS, SAS Enterprise Guide, Autosys, Hummingbird, Hive, Sqoop, Cloudera Hadoop ecosystem, Informatica PowerCenter 9.2, Spotfire Professional 3.2/4.0/5.5/6.5, Spotfire Web Player, Spotfire Enterprise Player, Business Objects, Oracle 11g and Toad.

Confidential, Texas

Senior Data Analyst

Responsibilities:

  • Gathered business requirements and performed data analysis and reporting.
  • Responsible for consolidating the reports and rewriting the SAS programs.
  • Extracted data from SAP HANA views, Teradata and various RDBMS databases.
  • Expert in writing SQL queries against Oracle, SAP HANA, Teradata and DB2.
  • Identified data anomalies and reported the issues for the business to correct.
  • Created Hive tables, Partitioning and performance tuning.
  • Utilized the Teradata utilities FastLoad, FastExport and MultiLoad for loading and exporting data to and from Teradata tables.
  • Extracted data from various data sources using pass-through SQL, created macros, exported and imported data into SAS, used PROC SORT, email automation, etc.
  • Experience with various formats, date manipulations and SAS functions.
  • Built and enhanced Tableau dashboards; created and scheduled the data sources.
  • Experience in SQL performance tuning.

Environment: SAS, SAS EG, Tableau 6.0, SAP HANA, Hadoop Hive, HDFS, Sqoop, Teradata, Teradata SQL Assistant, SAP HANA Studio, DB2, Vertica, Control-M.

Confidential

Business Objects Consultant

Responsibilities:

  • Participated in project planning sessions with project managers, business analysts and team members to analyze business requirements and outline the proposed solution.
  • Involved in developing new universes as per the user requirements by identifying the required tables from Data mart and by defining the universe connections.
  • Used Derived tables to create the universe for best performance and use context and alias tables to solve the loops in Universe.
  • Created complex reports stating revenue generation per year using cascading and user objects such as measure objects, applying the @Aggregate_Aware function to create summarized reports.
  • Exported the Universes to the Repository to make resources available to the users for ad-hoc report needs.
  • Created users and user groups and maintained user access rights using the CMC.
  • Created reports containing all the information regarding the publishers, distributors and retailers using crystal reports XIR3.
  • Generated Reports using the Universe as the main Data provider and using the personal data files provided by the users.
  • Developed critical reports like drill down, Slice and Dice, master/detail for analysis of parts benefits and headcount.
  • Created different reports containing charts showing revenue and operating income by different divisions, market share of newspaper by circulation.
  • Created complicated reports, including sub-reports, graphical reports, and formula-based and well-formatted reports, according to user requirements.

Environment: Business Objects XIR3, Web Intelligence XIR3, Xcelsius 2008, Crystal Reports XI, 2008, Oracle 9i (PL/SQL), Windows Server 2003.