
Consultant - Data Analyst Resume


Newark, DE

SUMMARY:

  • 8+ years of IT experience as an ETL developer, including 3+ years on big data technologies such as Hive/Impala, Spark, and Scala.
  • Automated QA processes in big data projects using Spark/Scala and shell scripting.
  • Business Data Analyst: gathered ETL/BI requirements and converted them into usable functional requirements.
  • Created wireframes of report and dashboard mock-ups, along with the SQL logic for each data element.
  • Good knowledge of data warehouse concepts and writing complex SQL queries.
  • Good knowledge of dimensional modeling to identify dimension and fact tables and their associated data elements.
  • Wrote complex SQL in Hive for data reporting and processed data held in complex datatypes (Map, Struct, and Array); see the sketch after this list.
  • Performed data profiling to validate data quality for critical data elements.
  • Involved in UAT to ensure the code in place satisfied all requirements before going to production.
  • Good knowledge of wealth and asset management concepts as well as trading life cycles across investment classes.
  • Involved in end-to-end testing of DWH projects, including warehouse and report testing.
  • Extensively worked on creating test plans, test scenarios, test scripts, and test execution to meet business and functional specifications.
  • Extracted data from various sources such as Oracle tables and flat files.
  • Verified ETL mapping rules against source data to meet ETL requirements.
  • Ran SQL queries to validate the data loaded into target tables against the source tables.
  • Verified reports and dashboards against fact and dimension data to meet reporting requirements.
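A minimal sketch of the kind of Hive complex-type query mentioned above, run through Spark SQL. The table (customer_profile) and its columns are illustrative placeholders, not data from any engagement:

    import org.apache.spark.sql.SparkSession

    object ComplexTypeQuery {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-complex-types")
          .enableHiveSupport()
          .getOrCreate()

        // Struct fields via dot notation, map values via ['key'], and one
        // row per array element via LATERAL VIEW explode.
        spark.sql(
          """SELECT t.account_id,
            |       t.address.city          AS city,
            |       t.attributes['segment'] AS segment,
            |       txn.amount              AS txn_amount
            |FROM customer_profile t
            |LATERAL VIEW explode(t.transactions) tx AS txn""".stripMargin
        ).show()

        spark.stop()
      }
    }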

TECHNICAL SKILLS:

Programming Languages: SQL, Hive/Impala (HQL), Spark, Scala, Unix shell scripting

Databases: SAS, Oracle, DB2, Vertica

Tools Used: Bugzilla, Tidal, Informatica PowerCenter 8.6, OBIEE Reporting, FileZilla, Code Migration, IVV&T, Functional & Regression Testing, Quality Center, ClearQuest, PuTTY, Toad, Excel, JIRA, IntelliJ, Notepad++, Jenkins, Maven, SBT

Source Control: Rational ClearCase, Microsoft VSS, PVCS, CVS

ETL Tools: Informatica BDS

Operating Systems: Windows 9x/2000/XP, Linux, UNIX, Mac

Reporting: Tableau

PROFESSIONAL EXPERIENCE:

Confidential, Newark, DE

Consultant - Data Analyst

Responsibilities:

  • As a Data Analyst on the team, interacted with clients (data modelers, business analysts) to aggregate their data for credit/debit fraud calculations, along with innovation-lab activities for analysis.
  • Worked with the product team to provide data logic in SQL for data analysis, and wrote complex SQL to evaluate business reports.
  • Developed, enhanced, and maintained client and internal reporting using Tableau Desktop for visual analytics.
  • Wrote complex HQL for database management, data migration, and database validation (SQL, DB2, Vertica, and Hive/Impala).
  • Performed count validation, dimensional analysis, statistical analysis, and data quality validation during data migration.
  • Created database objects such as tables, views, and functions using SQL in Hive/Impala and Spark SQL to define structure and maintain data efficiently.
  • Worked with large volumes of data, creating and maintaining complex reports in Tableau.
  • Built and published customized interactive reports and dashboards, with report scheduling, on Tableau Server 10.4.
  • Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau 10.4.
  • Restricted data access per user with row-level security and user filters.
  • Developed Tableau visualizations, dashboards, and stories, and forecasted data using trend lines in Tableau Desktop.
  • Developed Tableau workbooks from multiple data sources using data blending.
  • Validated reports and datasets against source systems to ensure quality and accuracy.
  • Experience with fundamental concepts of continuous integration and Agile methodologies, working in Confluence and JIRA.
  • Converted SAS files to CSV format for data ingestion using Scala.
  • Designed and developed a QA/QC framework for count, hash-total, and transpose validation using shell scripting and Spark; see the sketch after this list.
  • Worked on Oracle SQL and Hive/Impala SQL scripts to load data.
  • Wrote shell scripts to process data, scheduled through the Control-M job scheduler.
  • Worked on business solutions with the architects to design the system.
  • Assisted the Quality Analyst team with writing test scripts between Hive and Oracle, and between Excel/CSV files and Hive tables, using Spark.
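A minimal sketch of the count and hash-total checks from the QA/QC framework described above, in Spark/Scala. Table names are placeholders; the hash total here is a sum of per-row hashes, an order-independent integrity check that is cheap to compare between source and target:

    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.apache.spark.sql.functions._

    object QaQcValidation {
      // Hash total: sum a hash computed over every column of every row.
      def hashTotal(df: DataFrame): Long =
        df.select(hash(df.columns.map(col): _*).cast("long").as("h"))
          .agg(coalesce(sum("h"), lit(0L)))
          .first().getLong(0)

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("qa-qc-validation")
          .enableHiveSupport()
          .getOrCreate()

        // Placeholder tables: data landed from the source system vs. the
        // Hive target it was loaded into.
        val source = spark.table("staging.transactions")
        val target = spark.table("curated.transactions")

        val countCheck = source.count() == target.count()
        val hashCheck  = hashTotal(source) == hashTotal(target)

        println(s"count validation:      ${if (countCheck) "PASS" else "FAIL"}")
        println(s"hash-total validation: ${if (hashCheck) "PASS" else "FAIL"}")

        spark.stop()
        if (!(countCheck && hashCheck)) sys.exit(1) // signal failure to the scheduler
      }
    }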

Confidential

Data Analyst

Responsibilities:

  • Gathered business requirements and provided input to the development team's activities.
  • Performed data analysis in the life sciences domain for HCPs and patients.
  • Used complex Hive datatypes such as Map, Struct, and Array for business use cases.
  • Built a QA/QC framework for data validation using shell scripts, Spark, and Scala.
  • Converted SAS files to CSV format for data ingestion using Java/Scala; see the sketch after this list.
  • Designed and developed the QA/QC framework for count, hash-total, and transpose validation in Scala and Spark.
  • Worked on Oracle SQL and Hive/Impala SQL scripts to load data into the data lake.
  • Wrote shell scripts to process data, scheduled through the Control-M job scheduler.
  • Worked on business solutions with the architects to design the system.
  • Assisted the Quality Analyst team with writing test scripts between Hive and Oracle, and between Excel/CSV files and Hive tables, using Spark.
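A minimal sketch of the SAS-to-CSV conversion step, assuming the open-source spark-sas7bdat reader (com.github.saurfang:spark-sas7bdat); the actual project may have used a different reader, and the paths are placeholders:

    import org.apache.spark.sql.SparkSession

    object SasToCsv {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("sas-to-csv").getOrCreate()

        // Read the sas7bdat file through the third-party reader, then write
        // CSV with a header row so the ingestion layer sees column names.
        val df = spark.read
          .format("com.github.saurfang.sas.spark")
          .load("/landing/patient_claims.sas7bdat")

        df.write
          .option("header", "true")
          .mode("overwrite")
          .csv("/ingest/patient_claims_csv")

        spark.stop()
      }
    }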

Confidential

Data Analyst

Roles and Responsibilities:

  • Worked closely with the team to gather functional requirements, and worked with the project manager to produce a high-level design and the cost associated with the project.
  • Developed an understanding of Omniture data for different programs on the Hcom site.
  • Worked on big data/Hive testing, validating data through the Hue UI and Pig scripting.
  • Wrote shell scripts to validate logs.
  • Created scenarios based on experimental testing.
  • Worked with Tableau for reporting.
  • Provided development support.
  • Assisted the Quality Analyst team with writing test scripts in DB2 SQL to validate detail, summary, and report data, and fine-tuned those scripts. Developed automation scripts to analyze data quality and the scope of the load process upon completion of data loads; see the sketch after this list.
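A hedged sketch of the detail-versus-summary reconciliation described in the last bullet, expressed through Spark's JDBC reader against DB2 rather than the original standalone DB2 SQL scripts; connection details, credentials, tables, and columns are placeholders:

    import org.apache.spark.sql.SparkSession

    object Db2Reconciliation {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("db2-reconciliation").getOrCreate()
        import spark.implicits._

        // Helper: run a query against DB2 through Spark's JDBC reader.
        def db2(query: String) = spark.read
          .format("jdbc")
          .option("url", "jdbc:db2://db2host:50000/RPTDB")
          .option("driver", "com.ibm.db2.jcc.DB2Driver")
          .option("user", sys.env.getOrElse("DB2_USER", ""))
          .option("password", sys.env.getOrElse("DB2_PASS", ""))
          .option("dbtable", s"($query) AS q")
          .load()

        // Detail rows rolled up by region should reconcile with the
        // pre-aggregated summary table.
        val detail = db2(
          "SELECT region, SUM(sales_amt) AS detail_total FROM detail_sales GROUP BY region")
        val summary = db2(
          "SELECT region, total AS summary_total FROM summary_sales")

        val mismatches = detail.join(summary, Seq("region"))
          .where($"detail_total" =!= $"summary_total")

        val badRegions = mismatches.count()
        mismatches.show()
        spark.stop()
        if (badRegions > 0) sys.exit(1) // fail the run if any region is out of balance
      }
    }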

Confidential

Data Analyst

Roles and Responsibilities:

  • Worked closely with the team to gather functional requirements, and worked with the project manager to produce a high-level design and the cost associated with the project.
  • Worked with Tableau for reporting.
  • Worked with Vertica as the database and Pentaho jobs to load it from relational and non-relational sources.
  • Created test scenarios per the user stories and validated the data.
  • Wrote shell scripts to validate load logs; see the sketch after this list.
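A minimal sketch of the log-validation step. In the project this was a shell script (grep-style scanning of loader logs); it is shown in Scala here to keep a single language across these examples, and the path and match patterns are placeholders:

    import scala.io.Source

    object LogValidator {
      def main(args: Array[String]): Unit = {
        val logPath = args.headOption.getOrElse("/var/log/etl/load.log")

        val src   = Source.fromFile(logPath)
        val lines = try src.getLines().toVector finally src.close()

        val errors  = lines.filter(_.contains("ERROR"))
        val rejects = lines.count(_.toLowerCase.contains("rejected"))

        errors.foreach(println)
        println(s"lines mentioning rejected records: $rejects")

        // A non-zero exit code tells the scheduler the load needs attention.
        if (errors.nonEmpty) sys.exit(1)
      }
    }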

Confidential

Associate Consultant

Roles and Responsibilities:

  • Worked closely with the team to gather functional requirements, and worked with the project manager to produce a high-level design and the cost associated with the project.
  • Worked with Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Extensively used Informatica PowerCenter to load databases from relational and non-relational sources.
  • Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to run in the batch process.
  • Extensively used environment SQL commands in workflows prior to extracting data in the ETL tool.
  • Monitored the sessions using Workflow Monitor.
  • Wrote shell scripts to execute the workflows; see the sketch after this list.
  • Tested rejected records in Workflow Monitor.
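A hedged sketch of kicking off a PowerCenter workflow from a script, shown in Scala (scala.sys.process) rather than the original shell. The pmcmd flags are typical but should be verified against the installed PowerCenter version; service, domain, folder, and workflow names are placeholders:

    import scala.sys.process._

    object RunWorkflow {
      def main(args: Array[String]): Unit = {
        val cmd = Seq(
          "pmcmd", "startworkflow",
          "-sv", "IS_DEV",                        // integration service (placeholder)
          "-d", "Domain_Dev",                     // domain (placeholder)
          "-u", sys.env.getOrElse("PM_USER", ""),
          "-p", sys.env.getOrElse("PM_PASS", ""),
          "-f", "SALES_DW",                       // repository folder (placeholder)
          "-wait",                                // block until the run finishes
          "wf_load_sales"                         // workflow name (placeholder)
        )
        val exitCode = cmd.!                      // run pmcmd and capture its exit code
        if (exitCode != 0) sys.exit(exitCode)
      }
    }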
