
Big Data Developer Resume


NJ

SUMMARY

  • Over 9 years of professional IT experience, including over 3 years in Big Data ecosystem technologies and over 5 years in Java-related technologies.
  • Excellent understanding and knowledge of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Experience in installing, configuring, supporting and managing Cloudera's Hadoop platform along with CDH3 and CDH4 clusters.
  • Experience with leveraging Hadoop ecosystem components including Pig and Hive for data analysis, Sqoop for data migration, Oozie for scheduling and HBase as a NoSQL data store.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications and HDFS.
  • Experience with the NoSQL database HBase.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Experience with Hadoop shell commands, writing MapReduce programs, and verifying, managing and reviewing Hadoop log files.
  • In-depth knowledge of JobTracker, TaskTracker, NameNode, DataNode and MapReduce concepts.
  • Experience in understanding the security requirements for Hadoop and integrating it with the Kerberos authentication and authorization infrastructure.
  • Experience in Big Data analysis using Pig and Hive, with an understanding of Sqoop and Puppet.
  • Good understanding of HDFS design, daemons, federation and HDFS high availability (HA).
  • Experienced in developing MapReduce programs using Apache Hadoop for working with Big Data.
  • Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality (a minimal UDF sketch follows this list).
  • Experience in programming languages such as Core Java and COBOL.
  • Experience in using Oracle 10g, DB2, SQL Server 2008 and MySQL databases and writing complex SQL queries.
  • Experience in designing and developing database tables using SQL Server.
  • Experienced in the Healthcare (Medicare/Medicaid/Pricing/PBM RxClaim) and Banking/Financial (credit card statement systems) domains.
  • Strong team player with the ability to work both independently and in a team, adapt to a rapidly changing environment, and a commitment to learning.
  • Administering and troubleshooting Linux and Windows systems, comfortable with UNIX environment and shell scripting.
  • Ability to blend technical expertise with strong conceptual, business and analytical skills to provide quality solutions, with a results-oriented problem-solving approach and leadership skills.
  • Knowledge of project management concepts, software development lifecycle and quality assurance techniques.
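
The custom UDF experience called out above (extending Hive and Pig Latin with Java functions) follows a small, well-defined API. The sketch below shows a minimal Hive UDF built on the org.apache.hadoop.hive.ql.exec.UDF base class; the class name MaskAccountNumber and the masking rule are illustrative assumptions for this example, not code from any project listed on this resume.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: masks all but the last four characters of an identifier,
    // e.g. "1234567890" becomes "******7890".
    public final class MaskAccountNumber extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            String value = input.toString();
            if (value.length() <= 4) {
                return new Text(value);
            }
            String masked = value.substring(0, value.length() - 4).replaceAll(".", "*")
                    + value.substring(value.length() - 4);
            return new Text(masked);
        }
    }

After packaging the class into a jar and adding it to the Hive session (ADD JAR), the function would typically be registered with CREATE TEMPORARY FUNCTION and then used like any built-in function in HiveQL.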

TECHNICAL SKILLS

Languages: Java, JavaScript, Shell Scripting, COBOL

BigData/ Hadoop Ecosystem: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, HBase, Oozie, HDInsight.

Java technologies: Java Server Pages, Servlets, Junit, Spring, Hibernate

Database technologies: MySQL, SQL Server, Oracle, DB2

Other technologies: HTML, UML (modeling), AJAX, CSS, Tomcat, SVN

IDEs: Eclipse.

Designing Tools: Microsoft Visio

Operating Systems: Windows XP/7, LINUX, MAC

PROFESSIONAL EXPERIENCE

Confidential, NJ

Big Data developer

Responsibilities:

  • Extensively involved in the installation and configuration of the Cloudera Hadoop distribution (CDH3), including the NameNode, Secondary NameNode, JobTracker, TaskTrackers and DataNodes.
  • Involved in loading data from MySQL and Oracle into HDFS using Sqoop.
  • Implemented a Hadoop framework to capture user navigation across the application, validate the user interface and provide analytical feedback and results to the UI team.
  • Loaded data into the cluster from dynamically generated files using Flume and from relational database management systems using Sqoop.
  • Performed analysis on the unused user-navigation data by loading it into HDFS and writing MapReduce jobs; the analysis provided input to the front-end developers of the new project.
  • Wrote MapReduce jobs using the Java API and Pig Latin (a minimal sketch follows this job's environment line).
  • Loaded data from Teradata into HDFS using the Teradata Hadoop connectors.
  • Used Flume to collect, aggregate and store the web log data in HDFS.
  • Wrote Pig scripts to run ETL jobs on the data in HDFS.
  • Used Hive to analyze the data and identify different correlations.
  • Worked on importing and exporting data from Oracle and DB2 into HDFS and Hive using Sqoop.
  • Used Sqoop to load data from MySQL into HDFS on a regular basis.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Automated all the jobs that pull data from the FTP server and load it into Hive tables, using Oozie workflows.
  • Involved in creating Hive tables and working on them using HiveQL.
  • Supported MapReduce programs running on the cluster.
  • Maintained and monitored the clusters.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, ZooKeeper, Cloudera, Oozie, MongoDB, SQL*PLUS, NoSQL, Windows.
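
As context for the MapReduce bullets above, the following is a minimal sketch of a Java MapReduce job of the kind described, counting hits per page from user-navigation logs. It assumes a tab-delimited log line whose second field is the page URL; the class names and field layout are illustrative assumptions rather than the actual project code.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class PageHitCount {

        // Emits (pageUrl, 1) for every navigation log line.
        public static class PageMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text page = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                if (fields.length > 1) {          // field layout is an assumption
                    page.set(fields[1]);
                    context.write(page, ONE);
                }
            }
        }

        // Sums the hit counts per page.
        public static class HitReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "page hit count");
            job.setJarByClass(PageHitCount.class);
            job.setMapperClass(PageMapper.class);
            job.setCombinerClass(HitReducer.class);
            job.setReducerClass(HitReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

A job like this would normally be packaged into a jar and submitted with hadoop jar, with the input and output HDFS paths passed as arguments.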

Confidential, Phoenix, AZ

Hadoop Developer

Responsibilities:

  • Installed and configured Hadoop, MapReduce and HDFS (Hadoop Distributed File System); developed multiple MapReduce jobs in Java for data cleaning and processing.
  • Developed data pipeline using Flume, Sqoop, Pig and Java MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.
  • Used Pig as an ETL tool to do transformations, event joins and some pre-aggregations before storing the data in HDFS.
  • Implemented teh workflows using Apache Oozie framework to automate tasks.
  • Developed Pig Latin scripts to extract the data from the web server output files and load it into HDFS.
  • Applied MapReduce framework jobs in Java for data processing on the installed and configured Hadoop/HDFS cluster.
  • Created Hive external tables, loaded the data into them and queried the data using HiveQL.
  • Developed workflows in Oozie to automate the tasks of loading the data into HDFS and pre-processing it with Pig.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Created HBase tables to store various data formats of PII data coming from different portfolios (an HBase client sketch follows this job's environment line).
  • Involved in creating Hive tables, loading them with data and writing Hive queries, which run internally as MapReduce jobs.
  • Involved in writing Hive UDFs.
  • Involved in HDFS maintenance and monitored it through the web UI and the Hadoop Java API.
  • Performed data analysis in Hive by creating tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Worked on analyzing the Hadoop cluster and different big data analytics tools, including Pig, the HBase NoSQL database and Sqoop.
  • Extracted data from Hive through Sqoop, placed it in HDFS and processed it.
  • Developed shell scripts to pull data from third-party systems into the Hadoop file system.

Environment: Hadoop, MapReduce, HDFS, Flume, Sqoop, Pig, HBase, Hive, ZooKeeper, Cloudera, Oozie, NoSQL, UNIX/LINUX.
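
The HBase bullet above mentions tables that store PII data from different portfolios. As a rough illustration of the HBase Java client API typically involved in that kind of work, the sketch below writes a single row; the table name pii_records, the column family pii and the row-key scheme are assumptions made only for this example.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PiiRecordWriter {
        public static void main(String[] args) throws Exception {
            // Reads hbase-site.xml from the classpath for cluster connection details.
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("pii_records"))) {
                // Row key and column layout are illustrative; the real portfolios
                // and PII fields are not described in this resume.
                Put put = new Put(Bytes.toBytes("portfolioA#customer123"));
                put.addColumn(Bytes.toBytes("pii"), Bytes.toBytes("name"), Bytes.toBytes("Jane Doe"));
                put.addColumn(Bytes.toBytes("pii"), Bytes.toBytes("dob"), Bytes.toBytes("1980-01-01"));
                table.put(put);
            }
        }
    }

In practice the table and its column families would be created up front (for example from the HBase shell) and writes would be batched, but the Table/Put pattern shown here is the core of it.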

Confidential, Thousand Oaks

Senior Mainframe Developer

Responsibilities:

  • Gathered the requirements from the various users and the business.
  • Coordinated the preparation of the Test Strategy document and reviewed it. Coordinated and performed the development activities: coding and unit testing.
  • Imparted domain and technical knowledge to new entrants as well as fellow team members.
  • Also responsible for performing the analysis for various enhancements, performing impact analysis to find the systems/programs that could potentially be affected by the proposed change(s), coding, testing and implementation activities.
  • Effective offshore coordination to complete the development efforts as per the design and to review the offshore deliverables against quality standards and client expectations.
  • Handled a team size of 6 people as an onsite client lead.
  • For rollout-type work, the responsibilities included gathering the requirements from the clients, analyzing the business requirements, developing high-level and detailed system designs, driving the development activities with the offshore team, unit testing, system testing, implementing the system in the production environment and providing warranty support.
  • Prepared the following technical and functional documents (where applicable, depending on the request type): Requirement Specification, High-Level Analysis and Approach, High-Level Design, Detailed Design, system test plan and test procedure documents, and implementation plans.
  • Reviewed the deliverables for completeness and correctness to ensure that the business objective was met. Performed rigorous testing. Set up the test environment for User Acceptance Testing.

Environment: COBOL, JCL, IMS DB, DB2, Mainframe OS

Confidential, Pennsylvania

Senior Mainframe Developer

Responsibilities:

  • Involved in Design Review sessions such as Business System Design (BSD), Technical System Design (TSD) Reviews with Business Stakeholders, IT Leads, Business Analyst and Test Leads/Managers.
  • Created Business Spec documents, Technical Spec documents and the Development Test Plan.
  • Conducted internal reviews with the impacted teams and approved the test documents.
  • Coordinated with the testing team on any defects and issues found in testing.
  • Prepared the development test plan approach.
  • Conducted meetings with client business leads for weekly status, test approach, defect analysis and risk analysis, and prepared minutes-of-meeting (MOM) documents.
  • Ensured that the project documentation was maintained per the project life cycle and that all documents were version controlled and maintained for client reviews and audits.
  • Created JCL for executing the batch programs, using a tool to generate the JCL. Good experience with the DB2 database and proficient in SQL queries to retrieve data from it.
  • Created multiple online screens using CICS.
  • Conducted unit testing for the programs that were developed and made sure that all the items in the test plan were executed as expected.
  • Coordinated with the offshore team on requirements clarification and schedule requirements.
  • Received multiple client appreciations for excellent offshore coordination and timely completion of the project activities.

Environment: IBM DB2 V9.1, COBOL, JCL, VSAM, Mainframe OS

Confidential

Mainframe Developer

Responsibilities:

  • Gathered the user requirements by interacting with the various interface teams.
  • Created detailed-level designs and specifications for the new development and obtained sign-off from all the stakeholders.
  • Code analysis, modification and coding.
  • System, regression and unit testing, along with test plan preparation.
  • Coordinated with the onsite development team to complete the development efforts as per the design, solved problems and issues faced by them, and handled the development of the critical functionality.
  • Installed the enhancements in the production environment.
  • Performed rigorous testing and set up the test environment for User Acceptance Testing.

Environment: IBM DB2 V9.1, COBOL, JCL, VSAM, Mainframe OS
