
Sr. Big Data Hadoop Developer Resume


SUMMARY

  • 10 years of total IT experience, including 2+ years in big data ingestion, analytics, and visualization frameworks.
  • Experience on Big Data Hadoop projects using MapReduce, HDFS, Pig, Hive, Sqoop, Oozie, Unix scripting, and Platfora.
  • Exposure to Apache Spark and Python.
  • Experience using Apache Tez as the execution engine for Pig and Hive.
  • Experience using Avro, nested Avro, SequenceFile, Parquet, and ORC file formats.
  • Cloudera Certified Developer for Apache Hadoop.
  • Efficient in building advanced Hive scripts using various analytic functions.
  • Expert in optimizing Hive and Pig scripts.
  • Experience building visualization charts/graphs using Platfora.
  • Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, and data mining.
  • Strong knowledge and experience of data warehousing concepts, ETL methodologies, and various database technologies.
  • Extensive experience in the complete life cycle of mainframe projects, from design, development, and integration testing through production implementation.
  • Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design, and review.
  • Experience working on Agile projects.

TECHNICAL SKILLS

Big Data Ecosystems: Hadoop 1.0/2.x, Spark, MapReduce, HDFS, YARN, Hive, Pig, Sqoop, Oozie, Tez, Avro, Parquet, ORC

Languages/Frameworks: Pig Latin, HiveQL, Shell scripting, Python, SQL, COBOL, JCL

Databases: DB2, Teradata

Tools: Hortonworks, Cloudera CDH 3 & CDH 4, Ab Initio, GitHub, Hue, PuTTY, WinSCP, ChangeMan, VersionOne, IBM Rational Team Concert

Domain: Banking and Financial Services, Credit Cards, Insurance

Platforms: Windows (2000/XP), Linux, UNIX, z/OS

Methodologies: Agile, Waterfall

PROFESSIONAL EXPERIENCE

Confidential

Sr. Big Data Hadoop Developer

Responsibilities:

  • Interfaced with the business team to understand requirements and to conceptualize and explain analytical solutions.
  • Coordinated with the offshore team during development.
  • Developed advanced Hive scripts using various analytic functions.
  • Optimized Hive and Pig scripts.
  • Wrote Python scripts for text parsing/mining.
  • Developed visualization charts/graphs using Platfora.
  • Coordinated with business and upstream teams during the SIT and UAT phases.
  • Performed unit testing, documentation, and training for support staff.
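The text parsing/mining scripts mentioned above can be sketched as follows; this is a minimal illustration, and the log-line format and field names here are invented, not taken from the actual project:

```python
import re
from collections import Counter

# Hypothetical log format: "2015-06-01 14:02:11 WARN payment timeout"
LOG_PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<level>\w+) (?P<message>.*)"
)

def parse_log(lines):
    """Parse raw log lines into dicts; silently skip non-matching lines."""
    records = []
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            records.append(match.groupdict())
    return records

def level_counts(records):
    """Count records per log level -- a simple text-mining aggregate."""
    return Counter(rec["level"] for rec in records)

sample = [
    "2015-06-01 14:02:11 WARN payment timeout",
    "2015-06-01 14:02:12 INFO payment retried",
    "not a log line",
]
records = parse_log(sample)
counts = level_counts(records)
```

The same parse-then-aggregate shape scales from ad hoc scripts to mapper logic in a Hadoop streaming job.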

Environment: Hadoop, Python, HDFS, MapReduce, Tez, Pig, Hive, Sqoop, Hortonworks Distribution HDP 2.x, Platfora, WinSCP, PuTTY
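The "advanced Hive scripts using analytic functions" above typically combine ranking and running aggregates over partitioned data. The sketch below shows that query pattern; the txns table and its columns are invented, and SQLite (3.25+) stands in for Hive purely so the example is runnable:

```python
import sqlite3

# In-memory table playing the role of a Hive transactions table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (acct TEXT, txn_date TEXT, amount REAL);
    INSERT INTO txns VALUES
        ('A1', '2015-01-01', 100.0),
        ('A1', '2015-01-02', 250.0),
        ('B2', '2015-01-01', 75.0);
""")
# Rank each account's transactions by amount and keep a running total --
# the same pattern HiveQL writes with RANK() and SUM() OVER (PARTITION BY ...).
rows = conn.execute("""
    SELECT acct, txn_date, amount,
           RANK() OVER (PARTITION BY acct ORDER BY amount DESC) AS amt_rank,
           SUM(amount) OVER (PARTITION BY acct ORDER BY txn_date) AS running_total
    FROM txns
    ORDER BY acct, txn_date
""").fetchall()
```

The window clauses are valid HiveQL as well, so the query body ports back to Hive largely unchanged.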

Confidential

Sr. Hadoop Developer

Responsibilities:

  • Created ETL processes to move data from source systems to Hadoop.
  • Created MapReduce code to convert source files from EBCDIC format to ASCII.
  • Created a data quality framework to perform basic validation of data from source systems.
  • Created the Key and Split framework for adding key columns and splitting NPI/non-NPI data.
  • Transformed and analyzed data using HiveQL and Pig Latin.
  • Developed custom UDFs/UDAFs, handled updates in Hive, and worked with Apache Sentry.
  • Optimized and performance-tuned Hive queries.
  • Performed historical (one-time) loads from Teradata using an unload script built on Teradata Parallel Transporter (TPT) and FastLoad scripts.
  • Registered datasets in a metadata registry that controls admittance into Hadoop.
  • Good understanding of Hadoop data classification and directory structure options.
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, MRv1, and MRv2 (YARN).
  • Provided solutions for integration and user acceptance testing issues; coordinated with the offshore team and provided them analysis and guidance.
  • Ensured timely completion of unit and integration testing on the project by coordinating with business SMEs/IT, interface teams, and stakeholders.
  • Participated in daily Scrum meetings to discuss sprint progress and was active in making the meetings more productive.
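The EBCDIC-to-ASCII conversion above was implemented in MapReduce; the core per-record step can be sketched in a few lines of Python. This is a simplified illustration: cp037 is one common EBCDIC code page (the actual project's code page and record layout are not stated here), and real mainframe files also need fixed-width and packed-decimal field handling that this sketch omits:

```python
# Core of an EBCDIC-to-ASCII record conversion. cp037 is an assumed
# EBCDIC code page; Python's codec machinery does the byte translation.
EBCDIC_CODEC = "cp037"

def ebcdic_to_ascii(record: bytes) -> str:
    """Decode one EBCDIC record into an ASCII-compatible string."""
    return record.decode(EBCDIC_CODEC)

# Round-trip a sample record: the EBCDIC bytes differ from ASCII bytes,
# but decoding recovers the original text.
ebcdic_record = "HELLO WORLD".encode(EBCDIC_CODEC)
ascii_text = ebcdic_to_ascii(ebcdic_record)
```

In the MapReduce job, this decode would run inside the mapper over each input record.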

Environment: Hadoop (Cloudera Distribution), UNIX, Teradata, MapReduce, HDFS, Pig, Hive, Sqoop, Unix shell scripting, Ab Initio, Mainframes
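The Key and Split framework above can be read as two steps: attach a surrogate key to each record, then route its columns into NPI and non-NPI outputs that share the key. The sketch below shows that idea; the field names and the NPI column list are hypothetical (in the real framework such metadata would be configured, not hard-coded):

```python
import itertools

# Hypothetical NPI (non-public information) columns for illustration only.
NPI_COLUMNS = {"ssn", "dob"}

_key_seq = itertools.count(1)

def key_and_split(record: dict):
    """Attach a surrogate key, then split the record into NPI and
    non-NPI halves that share the key so they can be rejoined later."""
    key = next(_key_seq)
    npi = {"row_key": key}
    non_npi = {"row_key": key}
    for col, value in record.items():
        (npi if col in NPI_COLUMNS else non_npi)[col] = value
    return npi, non_npi

npi_rec, public_rec = key_and_split(
    {"acct": "A1", "ssn": "123-45-6789", "dob": "1980-01-01", "balance": 42.0}
)
```

Splitting on a shared key lets the sensitive half be stored under tighter access controls while the non-NPI half remains broadly queryable.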

Confidential

Technical Lead(ETL/Mainframes)

Responsibilities:

  • Next Gen Teller BIC: Built the tool "Transend", which stores Argo transactions of all types and allows Capital One bankers to search transactions via the EJ Search tool.
  • Book of Business: Built a data store for Capital One retail bank transactions; it stores balances for deposit/lending/investment accounts and allows bankers to search transactions via the Bank Book View tool.
  • Developed extraction programs (using ALE or file sources); analyzed requirements, converted them into proper ETL designs, and implemented them as Ab Initio graphs.
  • Developed UNIX Korn shell wrapper scripts.
  • Promoted the project from development to SIT, UAT, and production.
  • Monitored jobs using the Control-M GUI and worked closely with the production support team to ensure successful job completion.
  • Maintained project versions in the EME.
  • Worked in coordination with other teams (Java, load testing, QA) across environments, including System Integration Testing (SIT) and User Acceptance Testing (UAT).
  • Optimized graphs periodically, resolved graph performance issues, and explored tools to use them to the optimum level.
  • Maintained proper project documentation, such as design documents, and explained the documents to other teams.

Environment: UNIX, Ab Initio 3.x, Mainframes

Confidential

Sr Mainframe Developer

Responsibilities:

  • Performed requirement analysis and design preparation.
  • Developed and unit-tested batch programs.
  • Provided system testing/integration testing support.
  • Fixed defects found in system and integration testing.
  • Resolved production issues and ensured the batch process SLAs were met.

Environment: Mainframes

Confidential

Mainframe Developer

Responsibilities:

  • Performed requirement analysis, design, development, and unit testing.
  • Developed and unit-tested batch programs.
  • Provided system testing/integration testing support.
  • Fixed defects found in system and integration testing.
  • Resolved production issues and ensured the batch process SLAs were met.
