
Big Data Hadoop Developer Resume


SUMMARY

  • Big Data Hadoop Developer with over 3 years of hands-on technical experience in the Healthcare industry.
  • Extensive ETL experience moving data from relational databases into on-prem data lakes and cloud data warehouses such as Snowflake.
  • Proficient in SQL for curating raw data and performing analytics on complex datasets.
  • Able to translate business requirements into technical specifications.
  • Successful in delivering solutions in both technological and corporate environments.

TECHNICAL SKILLS

  • Ambari, HDFS, MapReduce, Sqoop, Pig, Hive, HiveQL, HBase
  • Spark SQL
  • Oracle, SQL Server
  • Snowflake (cloud data warehouse)
  • Epic Chronicles
  • Clarity Workbench
  • Clarity Reporting
  • Clarity Data Warehouse (BI/Analytical Reporting)

PROFESSIONAL EXPERIENCE

Big Data Hadoop Developer

Confidential

Responsibilities:

  • Manage daily ETL batch jobs that migrate data from EMR applications (Epic) into the data lake's Hive raw zone.
  • Generate ad hoc (BAU) reports from Hive to quickly support various business analytical needs.
  • Troubleshoot potential technical issues from batch jobs.
  • Support daily jobs that extract flat files from Chronicles (Caché) and load them into Clarity (Oracle) to support enterprise reporting.
  • Help develop new strategies to capture, store and analyze data for increased access and lower costs.
  • Analyze datasets using Hive and Sqoop to recommend business improvements.
  • Import data from Oracle into HDFS using Sqoop.
  • Tune performance of legacy Hive queries to improve data processing and retrieval.
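A minimal dry-run sketch of the kind of Sqoop import described above. The JDBC host, source table, Hive table, and password-file path are all hypothetical placeholders, not details from this role:

```shell
#!/bin/sh
# Dry-run sketch of a nightly Sqoop import from Oracle into a Hive raw zone.
# Host, schema, table names, and paths below are hypothetical placeholders.
ORACLE_JDBC="jdbc:oracle:thin:@//oracle-host:1521/EMRDB"

# Compose the command as a string and echo it rather than executing it,
# so the invocation can be reviewed before wiring into the batch scheduler.
CMD="sqoop import \
  --connect $ORACLE_JDBC \
  --username etl_user --password-file /user/etl/.oracle.pwd \
  --table ENCOUNTERS \
  --hive-import --hive-table raw_zone.encounters \
  --num-mappers 4"
echo "$CMD"
```

In practice a job like this would also use `--incremental append` with a `--check-column` so each nightly run picks up only new rows.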

Data Analyst

Confidential

Responsibilities:

  • Helped develop ETL batch jobs that migrate data from Allscripts into an Oracle database.
  • Developed various business reports using SQL to generate CSV files.
  • Analyzed datasets in Hadoop using HiveQL, Impala, and Spark SQL.
  • Developed Sqoop scripts to extract data from Oracle into Hive.
  • Developed complex data models to support analytical reporting.
  • Developed data models for storing EMR records such as patient treatment plans, medications, and recovery tracking.
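A sketch of what one table in such an EMR data model might look like as HiveQL DDL. The schema, table name, and columns are illustrative assumptions, not the actual model from this role:

```shell
#!/bin/sh
# Illustrative HiveQL DDL for one hypothetical EMR table; emitted as a
# string (not executed) so it can be reviewed or piped into beeline.
DDL=$(cat <<'SQL'
CREATE TABLE IF NOT EXISTS emr.treatment_plans (
  patient_id      BIGINT,
  plan_id         BIGINT,
  medication_code STRING,
  plan_start      DATE,
  recovery_status STRING
)
PARTITIONED BY (ingest_date DATE)
STORED AS PARQUET;
SQL
)
echo "$DDL"
```

Partitioning by ingest date is a common choice for batch-loaded clinical data, since daily reports and reprocessing jobs can then prune to a single partition.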
