
Sr. Big Data Hadoop Developer Resume

SUMMARY

  • 10 years of total IT experience, including 2+ years with big data ingestion, analytics, and visualization frameworks.
  • Experience on Big Data Hadoop projects using MapReduce, HDFS, Pig, Hive, Sqoop, Oozie, UNIX scripting, and Platfora.
  • Exposure to Apache Spark and Python.
  • Experience using Apache Tez as the execution engine for Pig and Hive.
  • Experience using Avro, nested Avro, SequenceFile, Parquet, and ORC file formats.
  • Cloudera Certified Developer for Apache Hadoop.
  • Proficient in building advanced Hive scripts using various analytic functions (a minimal sketch appears after this list).
  • Expert in optimizing Hive and Pig scripts.
  • Experience building visualization charts/graphs using Platfora.
  • Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modelling and data mining.
  • Strong knowledge of and experience with data warehousing concepts, ETL methodologies, and various database technologies.
  • Extensive experience across the complete life cycle of Mainframe projects, from design, development, and integration testing through production implementation.
  • Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design, and review.
  • Experience working on Agile projects.
  • Excellent client-facing, negotiation, and conflict-resolution skills; a highly motivated self-starter and team player who interacts effectively with stakeholders to translate business requirements into IT deliverables.
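
The analytic-function Hive work above can be illustrated with a minimal, hypothetical sketch run through PySpark's Hive support; the table and column names (txn, acct_id, txn_ts, amount) are illustrative assumptions, not details from any engagement:

    from pyspark.sql import SparkSession

    # A minimal sketch, assuming a Hive table named txn with columns
    # acct_id, txn_ts, and amount (all hypothetical).
    spark = (
        SparkSession.builder
        .appName("hive-analytics-sketch")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Rank each account's transactions by recency and keep only the latest,
    # the kind of windowed (analytic) HiveQL referenced in the summary.
    latest = spark.sql("""
        SELECT acct_id, txn_ts, amount
        FROM (
            SELECT acct_id, txn_ts, amount,
                   ROW_NUMBER() OVER (PARTITION BY acct_id
                                      ORDER BY txn_ts DESC) AS rn
            FROM txn
        ) t
        WHERE rn = 1
    """)
    latest.show()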

TECHNICAL SKILLS

Big Data Ecosystems: Hadoop 1.0/2.x, Spark, MapReduce, HDFS, YARN, Hive, Pig, Sqoop, Oozie, Tez, Avro, Parquet, ORC

Languages/Frameworks: Pig Latin, HiveQL, shell scripting, Python, SQL, COBOL, JCL

Databases: DB2, Teradata

Tools: Hortonworks, Cloudera CDH 3 & CDH 4, Ab Initio, GitHub, Hue, PuTTY, WinSCP, ChangeMan, VersionOne, IBM Rational Team Concert

Domain: Banking and Financial Services, Credit Cards, Insurance

Platforms: Windows (2000/XP), Linux, UNIX, z/OS

Methodologies: Agile, Waterfall

PROFESSIONAL EXPERIENCE

Confidential

Sr. Big Data Hadoop Developer

Responsibilities:

  • Interface with the business team to understand requirements and to conceptualize and explain analytical solutions.
  • Coordinate with the offshore team during development.
  • Develop advanced Hive scripts using various analytic functions.
  • Optimize Hive and Pig scripts.
  • Write Python scripts for text parsing/mining (a standalone sketch follows this list).
  • Develop visualization charts/graphs using Platfora.
  • Coordinate with business and upstream teams during the SIT and UAT phases.
  • Perform unit testing, documentation, and training for support staff.
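
A hedged sketch of the sort of text-parsing script mentioned above; the log-line format and field names are assumptions made purely for illustration:

    import re
    from collections import Counter

    # Hypothetical log format assumed for illustration, e.g.:
    # "2015-03-01 10:22:13 WARN  TxnService - declined card=**** reason=limit"
    LINE_RE = re.compile(
        r"^(?P<date>\d{4}-\d{2}-\d{2}) \S+ (?P<level>\w+)\s+\S+ - (?P<msg>.*)$"
    )

    def parse(path):
        """Yield (level, msg) pairs for lines matching the assumed format."""
        with open(path) as fh:
            for line in fh:
                m = LINE_RE.match(line)
                if m:
                    yield m.group("level"), m.group("msg")

    if __name__ == "__main__":
        # Count message severities as a simple text-mining example.
        counts = Counter(level for level, _ in parse("app.log"))
        print(counts.most_common())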

Environment: Hadoop, Python, HDFS, MapReduce, Tez, Pig, Hive, Sqoop, Hortonworks Distribution (HDP 2.x), Platfora, WinSCP, PuTTY

Capital One Bank, January 2014 - July 2015

Confidential

Sr. Hadoop Developer

Responsibilities:

  • Create ETL processes to move data from source systems into Hadoop.
  • Create MapReduce code to convert source files from EBCDIC to ASCII (a standalone sketch of the conversion follows this list).
  • Create a data quality framework to perform basic validation of source data.
  • Create the Key and Split framework for adding key columns and splitting NPI/non-NPI data.
  • Transform and analyze data using HiveQL and Pig Latin.
  • Develop custom UDFs/UDAFs, handle updates in Hive, and work with Apache Sentry.
  • Optimize Hive queries and tune performance (see the partitioning sketch after this list).
  • Run historical (one-time) loads from Teradata using an unload script built on Teradata Parallel Transporter (TPT) and FastLoad scripts.
  • Register datasets in a metadata registry that controls admittance into Hadoop.
  • Good understanding of Hadoop data classification and directory structure options.
  • In-depth understanding of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, MRv1, and MRv2 (YARN).
  • Provide solutions for integration and user acceptance testing issues; coordinate with the offshore team and give them analysis and guidance.
  • Ensure timely completion of unit and integration testing by coordinating with business SMEs, IT, interface teams, and stakeholders.
  • Participate in daily Scrum meetings to discuss sprint progress and help keep the meetings productive.
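
One plausible shape for the EBCDIC-to-ASCII conversion bullet is sketched below as a standalone Python filter (in a real MapReduce job it would run as a Hadoop Streaming mapper behind a fixed-length input format, so records are not split on stray newline bytes); the 100-byte record length and the cp037 code page are assumptions, not project details:

    import sys

    RECORD_LEN = 100  # assumed fixed record width; real copybooks vary

    def main():
        data = sys.stdin.buffer.read()
        for off in range(0, len(data), RECORD_LEN):
            record = data[off:off + RECORD_LEN]
            # cp037 is a common US EBCDIC code page (an assumption here);
            # decode to text and emit newline-delimited ASCII output.
            sys.stdout.write(record.decode("cp037").rstrip() + "\n")

    if __name__ == "__main__":
        main()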
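
For the Hive query-optimization bullet, one standard tactic is partitioning, so the engine prunes irrelevant data instead of scanning full tables; the sketch below uses hypothetical table and column names and shows only one of several tuning levers (file format, join strategy, and statistics being others):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-tuning-sketch")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Hypothetical partitioned table; partitioning by load_date lets the
    # engine read only the matching partition directories.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS txn_part (
            acct_id STRING,
            amount  DECIMAL(18,2)
        )
        PARTITIONED BY (load_date STRING)
        STORED AS ORC
    """)

    # The load_date predicate hits a single partition (partition pruning),
    # which is the kind of win the optimization work above targets.
    spark.sql("""
        SELECT acct_id, SUM(amount) AS total
        FROM txn_part
        WHERE load_date = '2015-01-31'
        GROUP BY acct_id
    """).show()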

Environment: Hadoop (Cloudera Distribution), UNIX, Teradata, MapReduce, HDFS, Pig, Hive, Sqoop, UNIX shell scripting, Ab Initio, Mainframes.

Confidential

Technical Lead (ETL/Mainframes)

Responsibilities:

  • Next Gen Teller BIC: As part of this project, we built the tool 'Transend', which stores Argo transactions of all types and allows Confidential bankers to search them via the EJ Search tool.
  • Book of Business: As part of this project, we built a data store for Confidential retail bank transactions. It stores balances for deposit/lending/investment accounts and allows Confidential bankers to search transactions via the Bank Book View tool.
  • Develop extraction programs that use ALE or File; analyze requirements, convert them into a proper ETL design, and develop it using Ab Initio graphs.
  • Develop UNIX wrapper Korn shell scripts.
  • Promote the project from development through SIT and UAT to production.
  • Monitor jobs using the Control-M GUI and work closely with the production support team to ensure successful job completion.
  • Maintain project versions in the EME.
  • Coordinate with other teams (Java, load testing, QA) across environments during System Integration Testing (SIT) and User Acceptance Testing (UAT).
  • Optimize graphs periodically, resolve graph performance issues, and explore the tools to use them to the optimum level.
  • Maintain project documentation, such as design documents, and explain the documents to other teams.

Environment: UNIX, Ab Initio 3.x, Mainframes

Confidential

Sr. Mainframe Developer

Responsibilities:

  • Requirement analysis and design preparation.
  • Development and unit testing of batch programs.
  • Provide system and integration testing support.
  • Fix defects found in system and integration testing.
  • Resolve production issues and ensure the batch process SLAs are met.

Environment: Mainframes

Confidential

Mainframe Developer

Responsibilities:

  • Requirement analysis, design, development, and unit testing.
  • Development and unit testing of batch programs.
  • Provide system and integration testing support.
  • Fix defects found in system and integration testing.
  • Resolve production issues and ensure the batch process SLAs are met.
