
Senior Big Data Developer Resume


Basking Ridge, NJ

SUMMARY:

  • All-encompassing understanding of requirements gathering, impact analysis, high-level design, programming, testing, job scheduling, and implementation of Big Data applications on an enterprise Hadoop distribution
  • Involved in designing, building, and establishing enterprise Big Data solutions that land large volumes of data in a data lake, preserving decades of historical and valuable data for the long term
  • Designed and developed a generic Sqoop framework to import historical and incremental data from various relational databases (Mainframe, Oracle, SQL, etc.)
  • Proficient in the Apache Hadoop ecosystem (Pig, Flume, HBase, ZooKeeper, Hive, Sqoop, Spark, Kafka, StreamSets), with a strong understanding of HDFS architecture
  • Experience in writing complex Unix shell scripts
  • Good understanding of Object-Oriented Programming (OOP) and of both Waterfall and Agile methodologies

TECHNICAL SKILLS:

Big Data: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, HBase

Scripting/Programming Languages: XML, Shell Script, Python, Java

Databases: Elasticsearch, Hive, HBase, Oracle, MySQL, DB2, and SQL/PL-SQL

Development Tools: Eclipse (IDE), Oracle SQL Developer, MS Visio, XML Spy, Teradata, IntelliJ, DbVisualizer, DevStudio, SoapUI, Kibana, StreamSets, Cognos reporting tool

Server & Version Control: Unix, Apache Tomcat, JBoss, SVN, GitHub

Environments: UNIX, Windows 7/8/10, Mac OS, Linux

PROFESSIONAL EXPERIENCE:

Confidential - Basking Ridge, NJ

Senior Big Data Developer

  • Extract, transform, and load data from different sources to build the right solutions for Hadoop projects
  • Participate in project kick-off meetings to understand high-level business requirements
  • Automate the data ingestion process and QA testing
  • Migrate reports and extracts generated by the source Mainframe and ETL application to the Big Data platform
  • Develop a generic Sqoop process to import historical and incremental data from various relational databases into the data lake, the central repository for all databases on the Big Data platform (a minimal sketch follows this list)
  • Use Sqoop to load data from MySQL, Oracle, and DB2 into HDFS on a regular basis
  • Develop shell scripts and batch jobs to schedule the various ingestion processes
  • Schedule production jobs in Talend
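
A minimal sketch of what such a generic Sqoop import wrapper might look like; the connection string, credentials path, table name, and target directory below are hypothetical placeholders, not values from the actual project:

#!/bin/bash
# Hypothetical generic import wrapper: the scheduler passes in the source
# connection, table, and HDFS landing path for each feed.
SRC_JDBC_URL="$1"    # e.g. jdbc:mysql://dbhost:3306/sales
SRC_TABLE="$2"       # e.g. orders
HDFS_TARGET="$3"     # e.g. /datalake/raw/sales/orders

sqoop import \
  --connect "$SRC_JDBC_URL" \
  --username "$DB_USER" \
  --password-file /user/etl/.dbpass \
  --table "$SRC_TABLE" \
  --target-dir "$HDFS_TARGET" \
  --num-mappers 4 \
  --as-textfile

# Incremental runs of the same wrapper might add, for example:
#   --incremental lastmodified --check-column updated_at --last-value "$LAST_RUN_TS"

Because every source-specific detail is a parameter, the same wrapper can be invoked from a scheduled shell script or a Talend job for each source table, which is what makes the process generic.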

Confidential - New Brunswick, NJ

Administrative Assistant

  • Oversaw inventory and office supply purchases
  • Liaised with vendors to order and maintain inventory of office supplies
  • Oversaw daily office operations for staff of 60 employees
  • Ordered and distributed office supplies while adhering to a fixed office budget
  • Screened applicant resumes and coordinated both phone and in-person interviews
  • Answered and managed incoming and outgoing calls while recording accurate messages

Confidential - Somerville, NJ

Judiciary Intern

  • Reviewed and updated client correspondence files and scheduling database
  • Entered numerical data into databases in a timely and accurate manner
  • Scanned documentation and entered it into the database

Data Ingestion

Confidential

  • Data lake implementation at UnitedHealth Group to ingest data from numerous source systems into HDFS
  • Migrate legacy application extracts and reports to the Big Data platform by designing and developing equivalent Spark, Hive, or Pig processes
  • Move data from different databases (Oracle, DB2, MySQL) using Sqoop
  • Validate Sqoop-ingested data against the source data (see the sketch after this list)
  • Create Hive tables on the Hadoop infrastructure as part of the ingestion process
  • Create reports, views, and extracts using Hive for analytics purposes
  • Actively provide user support, debugging and troubleshooting issues from trace logs
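
A minimal sketch, using hypothetical database, table, and path names, of creating a Hive external table over Sqoop-landed files and reconciling its row count against the source as a basic validation step:

#!/bin/bash
# datalake.orders_raw and /datalake/raw/sales/orders are illustrative names,
# not taken from the original project.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS datalake.orders_raw (
  order_id    INT,
  customer_id INT,
  order_ts    STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/datalake/raw/sales/orders';"

# Basic reconciliation: compare this count with SELECT COUNT(*) on the source table.
HIVE_COUNT=$(hive -S -e "SELECT COUNT(*) FROM datalake.orders_raw;")
echo "Hive row count for orders_raw: $HIVE_COUNT"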

Contract Transformation and Providers

Confidential

  • Converted the CT and Encounters data from .dat to JSON format using a transformation process
  • Loaded the JSON data into Elasticsearch (see the sketch after this list)
  • Created the index and mapping used to connect with Kibana
  • Generated reports and charts in Kibana to provide more insight to upper management
  • Used the Cognos reporting tool to present weekly business error reports
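
A minimal sketch of creating an Elasticsearch index with an explicit mapping and loading one converted JSON record over the REST API (Elasticsearch 7+ syntax; the host, index name, and field names are assumptions for illustration):

#!/bin/bash
ES_HOST="http://localhost:9200"   # placeholder Elasticsearch endpoint

# Create the index with an explicit mapping so Kibana picks up correct field types.
curl -X PUT "$ES_HOST/ct_encounters" -H 'Content-Type: application/json' -d '{
  "mappings": {
    "properties": {
      "contract_id":    { "type": "keyword" },
      "encounter_date": { "type": "date" },
      "error_code":     { "type": "keyword" }
    }
  }
}'

# Index a single converted record; large .dat-to-JSON loads would use the _bulk API.
curl -X POST "$ES_HOST/ct_encounters/_doc" -H 'Content-Type: application/json' -d '{
  "contract_id": "C-1001",
  "encounter_date": "2019-06-01",
  "error_code": "E42"
}'

Once documents are indexed, a Kibana index pattern over the index makes the data available for the reports and charts described above.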
