
Hadoop Developer Resume


Wilmington, DE

SUMMARY

  • 9+ years of experience in IT, including 3+ years in the Big Data ecosystem, with hands-on experience in data extraction, analysis, and cleansing on the Cloudera platform (HDFS, Hive, Sqoop, Spark, Scala); very good experience in database/application migration and integration, and in managing critical applications from onsite and offshore.
  • Good knowledge of the complete Hadoop ecosystem and extensive experience in understanding clients' Big Data business requirements and transforming them into Hadoop-centric technologies.
  • Imported and exported large sets of data between HDFS and Oracle using Sqoop.
  • Worked on Hive partitioning and bucketing concepts and created external and internal Hive tables with partitions (see the sketch after this list).
  • Worked on developing applications in Hadoop big data technologies: Hive, MapReduce, Kafka, and Spark (Scala).
  • Involved in loading data into the Hadoop Distributed File System (HDFS) for processing.
  • Very good knowledge of Cassandra and Elasticsearch.
  • Oracle Certified Associate (9i) and IBM Certified Solution Developer - InfoSphere DataStage v8.5.
  • Worked in the following technologies: DataStage 8.5, UNIX shell scripting, and Oracle 11g PL/SQL.
  • Excellent experience in designing, developing, documenting, and testing ETL jobs and mappings (sequence and parallel jobs) using DataStage to populate tables in the data warehouse.
  • Expert in designing parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, XML, and Salesforce.
  • Expertise in UNIX shell scripts (ksh) for process automation, and in scheduling DataStage jobs using Control-M.
  • Experience in production maintenance activities, coordinating and validating the system after each move.
  • Committed team player with strong analytical and problem-solving skills.
  • Dedicated to successful project completion, with excellent communication and interpersonal skills and a strong sense of responsibility.
  • Strong experience with and exposure to Oracle, SQL Server, ETL, and UNIX shell scripting.
  • Integrated Salesforce with DataStage.
  • Experience in design and implementation of structured, semi-structured, relational, and object-oriented data technologies.
  • Strong in data migration and data loading using scripts and Oracle tools such as SQL*Loader.
  • Well experienced in RDBMS, especially Oracle 9i/10g.
  • Responsible for development of LLD, mapping specifications, UTP, and UTR.
  • Proficient in writing, implementing, and testing triggers, procedures, and functions in Oracle PL/SQL.
  • Strong in SQL optimization and tuning.
  • Expertise in data handling using UNIX shell scripting.
  • Strong problem analysis and Solution optimization skills with ability to work in multi-platform environments like Windows and UNIX
  • Proficient knowledge of Software Development Life Cycle and Procedural techniques
  • Great customer support skills and adaptable to changing business needs
  • Self-motivated, detail oriented, time bound, responsible team player and ability to coordinate in a team environment
  • Excellent communication skills and keen to learn new technologies.
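
The Hive partitioning work above can be illustrated with a minimal Scala sketch using Spark SQL with Hive support; the table name, schema, HDFS location, and partition value below are hypothetical placeholders, not project artifacts.

    import org.apache.spark.sql.SparkSession

    object HivePartitionSketch {
      def main(args: Array[String]): Unit = {
        // Needs a Spark build with Hive support and a reachable metastore.
        val spark = SparkSession.builder()
          .appName("HivePartitionSketch")
          .enableHiveSupport()
          .getOrCreate()

        // Hypothetical external table partitioned by load date.
        spark.sql("""
          CREATE EXTERNAL TABLE IF NOT EXISTS customer_ext (
            customer_id BIGINT,
            name        STRING
          )
          PARTITIONED BY (load_date STRING)
          STORED AS PARQUET
          LOCATION 'hdfs:///data/customer_ext'
        """)

        // Register a partition once its directory has been populated.
        spark.sql("ALTER TABLE customer_ext ADD IF NOT EXISTS " +
          "PARTITION (load_date = '2016-01-01')")

        spark.stop()
      }
    }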

TECHNICAL SKILLS

Big Data Ecosystems: Spark (Scala), Hive, HDFS, Sqoop, ZooKeeper, Pig, MapReduce

Languages: UNIX Shell (ksh), T-SQL, ANSI SQL, Java, Scala

OS: Solaris, Windows 2000/NT, UNIX, Linux, AIX

Databases: Oracle 9i/10g/11g, MS SQL Server

NoSQL Databases: Cassandra

Tools: Elasticsearch, Logstash, Kibana, Bitbucket, FTP, SFTP, Toad, PL/SQL Developer, Oracle SQL Developer, DataStage, Crystal Reports, Visual SourceSafe (VSS 6.0), TortoiseSVN (1.6.2), DbSymphony tracking tool, MindAlign chat client, PuTTY

Web Server: ASP, Internet Information Server (IIS)

PROFESSIONAL EXPERIENCE

Confidential, Wilmington DE

Hadoop Developer

RESPONSIBILITIES:

  • Developed data pipelines using Sqoop and Spark to ingest regulatory- and customer-related data into HDFS.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Worked on Hive partitioning and bucketing concepts and created external and internal Hive tables with partitions.
  • Migrated legacy DataStage code into the Hadoop ecosystem using Spark (Scala).
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs; used Scala applications to extract data from source systems, applying transformations through DataFrames and Spark SQL (see the first sketch after this list).
  • Involved in creating Hive tables and loading data into Hive.
  • Imported and exported large sets of data between HDFS and Oracle using Sqoop.
  • Followed agile software development (Scrum) practices, including pair programming and test-driven development.
  • Worked on various POCs, such as using Kafka to read deployment log files, feeding them into Elasticsearch/Logstash/Kibana, and alerting users on issues (see the second sketch after this list). Worked on Cassandra data modeling for a legacy system.
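
First sketch: the DataFrame/Spark SQL extraction pattern described above, in minimal Scala. The JDBC URL, credentials, tables, and filter are hypothetical stand-ins; on the project the raw ingest into HDFS was done with Sqoop.

    import org.apache.spark.sql.SparkSession

    object OracleToHdfsSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("OracleToHdfsSketch")
          .enableHiveSupport()
          .getOrCreate()

        // Hypothetical Oracle source; the project's raw ingest into HDFS
        // used Sqoop rather than a direct JDBC read.
        val accounts = spark.read
          .format("jdbc")
          .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
          .option("dbtable", "REG.ACCOUNTS")
          .option("user", "etl_user")
          .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
          .load()

        // Express the former Hive/SQL logic as Spark SQL over a temp view.
        accounts.createOrReplaceTempView("accounts")
        val active = spark.sql(
          "SELECT account_id, balance FROM accounts WHERE status = 'ACTIVE'")

        // Persist the transformed result as an HDFS-backed Hive table.
        active.write.mode("overwrite").saveAsTable("curated.active_accounts")

        spark.stop()
      }
    }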
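
Second sketch: one plausible shape for the Kafka log-reading POC, using Spark Structured Streaming (requires the spark-sql-kafka package). The topic, brokers, and console sink are hypothetical; the actual POC fed Logstash/Elasticsearch/Kibana for alerting.

    import org.apache.spark.sql.SparkSession

    object DeploymentLogStreamSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("DeploymentLogStreamSketch")
          .getOrCreate()
        import spark.implicits._

        // Hypothetical topic carrying raw deployment log lines.
        val logs = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "deployment-logs")
          .load()
          .selectExpr("CAST(value AS STRING) AS line")

        // Flag error lines; the POC shipped these to Logstash/Elasticsearch
        // so Kibana dashboards could drive the alerting.
        val errors = logs.filter($"line".contains("ERROR"))

        // Console sink stands in for the Logstash/Elasticsearch output.
        val query = errors.writeStream.format("console").start()
        query.awaitTermination()
      }
    }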

Environment: Sqoop, Pig, Hive, MapReduce, Spark, Kafka, HDFS, Elasticsearch, Logstash, Oracle SQL Developer, SQL*Loader, SQL Server, Toad, SQL*Plus, DataStage 8.5/9.1, FTP, SFTP, UNIX, ksh

Confidential, San Antonio TX

Information Technology Analyst

RESPONSIBILITIES:

  • Discussed requirements with business personnel; involved in analysis and design, coding, system testing, user acceptance testing, and implementation.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
  • Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Salesforce, and XML.
  • Extensively worked with Join, Lookup (normal and sparse), and Merge stages.
  • Extensively worked with Sequential File, Data Set, and File Set stages.
  • Extensively used parallel stages such as Row Generator, Column Generator, Head, and Peek for development and debugging.
  • Used the DataStage Director and its run-time engine to schedule the solution, test and debug its components, and monitor the resulting executables on an ad hoc or scheduled basis.
  • Created shell scripts to run DataStage jobs from UNIX, then scheduled those scripts through a scheduling tool.
  • Developed batches to extract, transform, and load source data into the target system.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins (see the sketch after this list).
  • Analyzed user requirements, procedures, and problems to automate or improve existing systems.
  • Supported maintenance activities by coordinating and validating the system after each move.
  • Monitored server processes and other services.
  • Reported server/database downtime to the appropriate business users and coordinated with them per business requirements.
  • Worked with the big data team, focused on Sqoop, Scala, and Hive.
  • Solved critical production issues and P1 tickets.
  • Created shell scripts to deploy Oracle database objects.
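
The stored procedures above lived in PL/SQL; as a hedged illustration of exercising one from application code, this Scala sketch calls a hypothetical procedure with an IN and an OUT parameter over JDBC (the connection details and GET_ORDER_TOTAL are placeholders).

    import java.sql.{DriverManager, Types}

    object CallProcSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical connection; GET_ORDER_TOTAL stands in for the kind
        // of IN/OUT stored procedure described above.
        val conn = DriverManager.getConnection(
          "jdbc:oracle:thin:@//dbhost:1521/ORCL",
          "etl_user",
          sys.env.getOrElse("DB_PASSWORD", ""))
        try {
          val stmt = conn.prepareCall("{call GET_ORDER_TOTAL(?, ?)}")
          stmt.setLong(1, 42L)                        // IN: order id
          stmt.registerOutParameter(2, Types.NUMERIC) // OUT: total amount
          stmt.execute()
          println(s"Order total: ${stmt.getBigDecimal(2)}")
          stmt.close()
        } finally {
          conn.close()
        }
      }
    }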

Environment: Oracle SQL Developer, SQL*Loader, SQL Server, Toad, SQL*Plus, DataStage 8.5/9.1, FTP, SFTP, UNIX, ksh, StarTeam

Confidential, NYC, NY

Technology Analyst

RESPONSIBILITIES:

  • Created user accounts in LDAP.
  • Monitored server processes and services during maintenance.
  • Wrote shell scripts to automate and improve existing systems and work processes.
  • Validated the system after each move.
  • Prepared daily status reports on service requests.
  • Prepared knowledge articles for reported issues.
  • Tracked technical issues and provided solutions to users.
  • Spooled reports to track end users who had not closed their tickets.
  • Attended weekly status meetings.
  • Reported server/database downtime to the appropriate business users and coordinated with them per business requirements.
  • Solved critical production issues and P1 tickets.
  • Created shell scripts to deploy Oracle database objects.

Environment: SQL*Plus, FTP, SFTP, UNIX, ksh

Confidential, NY

Software Engineer

RESPONSIBILITIES:

  • Migrated Oracle 9i databases and objects to Oracle 10g without changing functionality or business logic.
  • In a custom-designed ETL environment, developed stored procedures, functions, triggers, and packages using third-party tools such as TOAD and Oracle SQL Developer.
  • Used Oracle database links to extract data from different schemas and load it into source systems.
  • Used Oracle features to implement Type 1 changes in slowly changing dimension tables (see the sketch after this list).
  • Used bulk loading in Oracle to load data into the database quickly.
  • Used shell scripts for file validation and for calling PL/SQL procedures.
  • Involved in the migration process from development to QA and production.
  • Responsible for gathering functional requirements and performing impact analysis on them.
  • As lead developer, also responsible for development of LLD, mapping specifications, UTP, and UTR.
  • Performed cross-functional coordination activities to facilitate consistency and availability of dependent data sources.
  • Performed analysis and tuning of all queries, stored procedures, and user-defined functions introduced by each new software release to ensure maximum performance and throughput.
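
The Type 1 logic above was implemented with Oracle features (e.g. MERGE); purely as an illustrative sketch of the same overwrite-in-place idea, here it is with Spark DataFrames, with hypothetical table and column names and the assumption that both tables share a schema.

    import org.apache.spark.sql.SparkSession

    object ScdType1Sketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ScdType1Sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Hypothetical dimension and staging tables with identical schemas.
        val dim    = spark.table("dw.customer_dim")
        val staged = spark.table("stg.customer_updates")

        // Type 1: incoming rows simply replace old values, keeping no
        // history. Keep dimension rows with no update, then add every
        // staged row (new and changed alike).
        val unchanged = dim.join(staged, Seq("customer_id"), "left_anti")
        val merged    = unchanged.unionByName(staged)

        // Write to a new table; Spark cannot overwrite a table it is
        // still reading from in the same job.
        merged.write.mode("overwrite").saveAsTable("dw.customer_dim_new")
        spark.stop()
      }
    }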

Environment: Toad, SQL*Plus, Oracle SQL Developer, SQL*Loader, external tables, FTP, SFTP, UNIX, ksh, PuTTY, DbSymphony tracking tool, TortoiseSVN

Confidential

Analyst Programmer

RESPONSIBILITIES:

  • Created table structures and database objects (functions, procedures, packages, and triggers).
  • Used SQL*Loader for bulk loading (see the sketch after this list).
  • Interacted with customers on daily issues and developments.
  • Analyzed business requirements and technical requirements.
  • Identified scope, limitations, and bottlenecks. As a senior developer, assigned responsibilities to developers and coordinated reviews and weekly meetings to achieve quality deliverables within the defined time frame.
  • Wrote PL/SQL procedures and packages to automate adding policies to the relevant schemas for different security levels.
  • Wrote shell scripts for deployment and automation at various stages of projects.
  • Used TOAD and SQL*Plus for performance tuning.
  • Involved in documenting the project.
  • Performed impact analysis by analyzing the existing system for new enhancements.
  • Reported server/database downtime to the appropriate business users and coordinated with them per business requirements.
  • Solved critical production issues and P1 tickets.
  • Created shell scripts to deploy Oracle database objects.
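
A small hedged sketch of driving SQL*Loader from code: the Scala snippet below shells out to sqlldr. The control file, connect string, and log name are placeholders; sqlldr must be on the PATH.

    import scala.sys.process._

    object SqlLoaderRunnerSketch {
      def main(args: Array[String]): Unit = {
        // Placeholders: the control file describes the flat file and target
        // table; credentials come from the environment.
        val cmd = Seq(
          "sqlldr",
          s"userid=etl_user/${sys.env.getOrElse("DB_PASSWORD", "")}@ORCL",
          "control=policies.ctl",
          "log=policies.log")

        val exitCode = cmd.!  // run sqlldr and wait for it to finish
        if (exitCode != 0) sys.error(s"sqlldr failed with exit code $exitCode")
      }
    }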

Environment: PL/SQL, ksh, Oracle 10g/11g, AIX, VB, Toad, SQL*Plus, FTP, SFTP, UNIX, Java
