Hadoop Developer Resume

Atlanta, GA

PROFESSIONAL SUMMARY

  • Over 11 years of technical experience in database (DB2) development, including 2.5 years in Hadoop development.
  • Extensive experience across the SDLC, with proficiency in mapping business requirements, technical documentation, application design, development, integration, testing, and troubleshooting for mission-critical applications.
  • Extensive experience in Agile and Waterfall methodologies.
  • Experienced in quality management techniques using SEI CMM-based processes during all phases of the project life cycle.
  • An effective team member with proven ability to contribute throughout the project phases and to train and guide team members.
  • Excellent communication, leadership, and interpersonal skills, with a proven ability to resolve complex software and application issues.
  • Worked extensively in the onsite-offshore team model, as both lead and coordinator.
  • Experience in setting up Cloudera CDH3 and CDH4 Hadoop clusters.
  • Experience in Hadoop architecture and MapReduce programming; proficient in installation and configuration of Cloudera components: HDFS, Sqoop, Flume, Oozie, and Hive.
  • Experience in ETL using Sqoop, Flume, and HDFS (see the Sqoop sketch after this list).
  • Experience in HiveQL, NoSQL (Cassandra, HBase), and Pig Latin for creating reports and writing scripts for business use cases.
  • Working knowledge of ZooKeeper and R.
  • Extensive DB2 UDB and Teradata experience in logical and physical design (stored procedures, indexes, tables, aliases, synonyms, and views), data management, and performance tuning.
  • Handled various project aspects, including project documentation, system design and integration, module coding, peer reviews, and monitoring of critical paths with timely corrective action.
  • Designed and executed test plans, test cases, test scripts/procedures, and gap analyses to ensure that business requirements and functional specifications are tested and fulfilled.
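
The ETL bullet above references the sketch below: a minimal Sqoop import of one relational table into HDFS. The JDBC URL, credentials, table, and target directory are placeholders for illustration, not details from the original projects.

    # Hypothetical example: pull one DB2 table into HDFS as delimited text.
    sqoop import \
      --connect jdbc:db2://dbhost:50000/SAMPLE \
      --username etl_user -P \
      --table CUSTOMERS \
      --target-dir /data/raw/customers \
      --num-mappers 4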

TECHNICAL SKILLS

Domain: Financial Market, Energy and Utility, Retail and Logistics.

Project Management Methodologies: Agile (Scrum), Waterfall

Big Data Ecosystem: Cloudera Manager, Hadoop ecosystem (HDFS, Pig Latin, Hive, Oozie, Impala)

Big Data Analysis Platform: MapReduce

Data Aggregation and Transfer: Sqoop, Flume

Database and Data Access: DB2, Teradata, Oracle 10g, MySQL, PL/SQL, NoSQL (HBase, Cassandra)

Database Modelling Tools: Data Studio, Teradata Query Analyser

Programming Languages: Java, JavaScript, SQL, and R

Data Exchange Format: XML, JSON

Operating Systems: Ubuntu Linux, Windows 8/7/XP/2000, MVS OS/390, z/OS, and UNIX

Virtualization Tools: VMware, VirtualBox

Configuration Management and Build: Jenkins, Ant, Maven

Interface: MQSeries

Integrated Development Env (IDE): Eclipse

Job Scheduling Tools: Oozie

Incident Management Tools: Remedy, Peregrine, HP QC, HP ALM

Work Tracking Tools: Rational Project Management (RPM), Rational Team Concert (RTC)

Other Tools/Technologies: CICS, PL/1, COBOL, JCL, Telon, Easytrieve, REXX, VSAM, Endevor, Librarian, SCLM, ChangeMan, MVS, File-AID for File and DB2, IBM File Manager, SPUFI, QMF, InSync, Rapid SQL, SQL Programmer, DB2 Command Line, Platinum, Control-M, CA7, ESP, Lotus Notes, Outlook, MS Office, Xpediter, Viasoft SmartTest, DTCN, CEDF

PROFESSIONAL EXPERIENCE

Confidential, Atlanta GA

Hadoop Developer

Responsibilities:

  • Translated functional and technical requirements into detailed architecture and design.
  • Developed and supported MapReduce programs and jobs for data-cleansing features such as schema validation and row counts (a minimal sketch follows this list).
  • Responsible for managing data coming from different sources.
  • Imported and exported data into and out of HDFS using Sqoop and Flume.
  • Created Hive tables and wrote Hive queries using HiveQL.
  • Wrote Pig (Pig Latin) scripts for ad hoc data retrieval.
  • Developed workflows to schedule various Hadoop programs using Oozie.
  • Involved in configuring a multi-node, fully distributed Hadoop cluster.
  • Involved in analysis, design, and testing phases; responsible for documenting technical specifications.
  • Wrote R programs for statistical representation of data stored on a 20-node HDFS cluster (see the R sketch after this list).
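
A minimal sketch of the row-count style of data-cleansing job mentioned above, assuming plain-text input in HDFS; class, path, and field names are illustrative, not from the original project.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class RowCount {

        // Emits (fileName, 1) for every input line so the reducer can
        // total rows per source file for validation against the source.
        public static class RowMapper
                extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String file = ((FileSplit) context.getInputSplit())
                        .getPath().getName();
                context.write(new Text(file), ONE);
            }
        }

        // Sums the per-file counts; also reusable as a combiner.
        public static class SumReducer
                extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values,
                    Context context) throws IOException, InterruptedException {
                long total = 0;
                for (LongWritable v : values) {
                    total += v.get();
                }
                context.write(key, new LongWritable(total));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "row count");
            job.setJarByClass(RowCount.class);
            job.setMapperClass(RowMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.addOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }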

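The R bullet above points here: a small sketch of producing a statistical summary in R, assuming the cleansed data has first been copied out of HDFS to the local file system (for example with hadoop fs -get); the file and column names are invented.

    # Illustrative only: sales.csv is assumed to have been fetched from HDFS,
    # e.g.  hadoop fs -get /data/cleansed/sales.csv .
    sales <- read.csv("sales.csv", header = TRUE)

    # Basic statistical representation of one numeric column.
    summary(sales$amount)
    hist(sales$amount,
         main = "Distribution of transaction amounts",
         xlab = "Amount")
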
Confidential, NY

Hadoop Developer

Responsibilities:

  • Imported and exported data using Sqoop between HDFS and relational database systems (DB2 and Oracle).
  • Developed Java MapReduce programs for custom processing.
  • Created Hive tables and wrote Hive queries using HiveQL (see the sketch after this list).
  • Developed workflows to schedule various Hadoop programs using Oozie.
  • Used Pig (Pig Latin) scripts for ad hoc data retrieval (see the Pig sketch after this list).
  • Involved in analysis, design, and testing phases; responsible for documenting technical specifications.
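
A small HiveQL sketch of the table-plus-query pattern referenced above; the table, columns, and HDFS location are invented for illustration.

    -- Hypothetical external table over delimited files already in HDFS.
    CREATE EXTERNAL TABLE transactions (
      txn_id     BIGINT,
      account_id BIGINT,
      amount     DOUBLE,
      txn_date   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/raw/transactions';

    -- Ad hoc report: top days by total amount.
    SELECT txn_date, SUM(amount) AS daily_total
    FROM transactions
    GROUP BY txn_date
    ORDER BY daily_total DESC
    LIMIT 10;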

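A matching Pig Latin sketch of the ad hoc retrieval bullet, against the same assumed input; field names and the filter threshold are illustrative.

    -- Hypothetical ad hoc retrieval: top accounts by large-transaction spend.
    txns   = LOAD '/data/raw/transactions' USING PigStorage(',')
             AS (txn_id:long, account_id:long, amount:double, txn_date:chararray);
    big    = FILTER txns BY amount > 1000.0;
    byAcct = GROUP big BY account_id;
    totals = FOREACH byAcct GENERATE group AS account_id, SUM(big.amount) AS total;
    sorted = ORDER totals BY total DESC;
    top10  = LIMIT sorted 10;
    DUMP top10;
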
Confidential, NY

Sr. DB2 Developer

Responsibilities:

  • Supported development teams with database management (logical and physical design, performance tuning).
  • Worked with DB2 LUW tools to monitor and manage DB2 databases on Windows and AIX.
  • Worked on query performance tuning.
  • Designed data models (Erwin).
  • Created new database objects, including tables, views, triggers, and indexes (Erwin, DB2 Data Studio, DDL scripts, File-AID for DB2 (z/OS), Platinum).
  • Wrote and tuned large SQL queries in Teradata.
  • Populated dimension tables with data from various sources, including text files, writing shell or Perl scripts when necessary (UNIX shell, Perl, Load/Import/Export utilities, Control Center, load-unload utility on z/OS).
  • Developed stored procedures and triggers, creating additional indexes in the process for performance improvement and automation (DB2 Data Studio, Design Advisor); a minimal sketch follows this list.
  • Migrated changes to test and production databases; unit tested and tuned the application.
  • Documented the complete database design, high-level data design, and technical data-flow design.
  • Supported system integration testing (SIT) and user acceptance testing (UAT).
  • Worked as a senior developer on the CAS GAS Migration project.
  • Coordinated with clients and the offshore team to anchor the global delivery model.
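
A minimal DB2 SQL PL sketch of the stored-procedure-plus-supporting-index pattern described above; all object names are illustrative. (When run through the CLP, the procedure body needs an alternate statement terminator such as @.)

    -- Hypothetical index to back the procedure's predicate.
    CREATE INDEX IX_ORDERS_CUST ON ORDERS (CUSTOMER_ID);

    -- Simple SQL PL procedure returning one customer's order total.
    CREATE PROCEDURE GET_ORDER_TOTAL (
        IN  P_CUSTOMER_ID INTEGER,
        OUT P_TOTAL       DECIMAL(15,2))
    LANGUAGE SQL
    BEGIN
        SELECT COALESCE(SUM(AMOUNT), 0)
          INTO P_TOTAL
          FROM ORDERS
         WHERE CUSTOMER_ID = P_CUSTOMER_ID;
    END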
