Sr. Hadoop Developer Resume

SUMMARY:

  • Over 9 years of experience in the IT industry, with strong experience in application development on the Hadoop platform, data analytics and ETL in the Banking and Insurance sectors
  • Cloudera Certified Hadoop Developer; completed the CCD-410 certification from Cloudera
  • Hands-on experience in core Hadoop and the Hadoop technology stack, including Spark, Sqoop, HiveServer2, HBase, Oozie, Pig, Impala, Sentry, Hue and MapReduce programming
  • Experience in all phases of the software project life cycle: analysis, design, development, testing, deployment and maintenance
  • Working knowledge and experience with Continuous Integration tools such as Jenkins and Ansible, performing automated build and deploy across lanes and environments as part of DevOps
  • Worked on large-scale data imports/exports with RDBMSs such as Teradata, Oracle, Netezza and SQL Server, and built the associated data pipelines
  • Extensive experience with IDEs including Eclipse and IntelliJ
  • Developed using Agile methodology and took part in Agile ceremonies such as Scrum, sprint planning, story grooming and tech-debt management
  • Supported various CDH platform upgrades and assisted the team in creating benchmark results during each upgrade
  • Interacted directly with the Cloudera team on Hadoop cluster issues and resolved them
  • Conducted orientation sessions and mentored new joiners at Accenture

TECHNICAL SKILLS:

Hadoop Ecosystem: Spark, HiveServer2, Sqoop, Oozie, ZooKeeper, Impala, HBase, Pig, Cloudera CDH 5.13, Sentry, Kerberos and YARN

Languages: Python, SQL, Shell scripting and PL/SQL

DBMS: Teradata, Netezza, SQL server, Oracle

Operating Systems: RHEL 6.8, Ubuntu 14.X and Z/OS-390

Specialized Tools: Autosys, Jenkins, Teradata SQL Assistant, Oracle SQL Developer, Maven, Eclipse, SVN explorer, TortoiseSVN, Fisheye/Crucible

Testing Tools: QTP, QC, Tosca, Datameer

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Hadoop Developer

Responsibilities:
  • Developed reusable Spark data standardization, data integrity and data ingestion components, reducing the effort needed to develop any new sourcing process
  • Created Sqoop and Python scripts for validating the data coming from various RDBMS sources
  • Built lineage for Key Business Elements by scanning through the transformation flow
  • Created Python scripts for PySpark processing to produce end-of-day reports on ingestion execution and task-level performance monitoring (see the sketch after this list)
  • Built a Hadoop logger framework to log messages from the MapReduce framework and its abstraction layers such as Hive and Pig
  • Built Autosys JILs to schedule jobs that run within the allocated windows/SLAs
  • Supported the team with CDH platform upgrade validations (5.7.1 to 5.13.1)
  • Wrote Hive and Pig scripts to validate each KBE (Key Business Element) value against the business logic
  • Coordinated the automated build, deploy and continuous integration process with the DevOps team using Jenkins
  • Actively involved in code review and the integration process for code promotion across lanes/environments up to the production release
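
A minimal PySpark sketch of the end-of-day ingestion report mentioned above; the audit table, column names, run date and output path are illustrative assumptions, not the actual project schema:

    # eod_ingestion_report.py - illustrative sketch, not the original project code
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("eod_ingestion_report")
             .enableHiveSupport()
             .getOrCreate())

    # Hypothetical audit table written by the ingestion framework
    audit = spark.table("audit_db.ingestion_task_audit")

    report = (audit
              .filter(F.col("business_date") == "2017-10-31")   # example run date
              .groupBy("source_system", "task_name")
              .agg(F.count("*").alias("runs"),
                   F.sum("records_ingested").alias("records"),
                   F.max("duration_seconds").alias("max_duration_s"),
                   F.avg("duration_seconds").alias("avg_duration_s")))

    # Publish the end-of-day report for downstream monitoring
    report.coalesce(1).write.mode("overwrite").parquet("/data/reports/eod/2017-10-31")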

Confidential

Sr. Hadoop Developer

Responsibilities:
  • Built Oozie workflows that source data from legacy systems to staging, perform data integrity checks and dedup logic, and publish the data in Parquet format (the dedup-and-publish step is sketched after this list)
  • Developed a Spark project for data ingestion using HiveServer2 and executed it with Spark SQL for rapid processing
  • Mapped and designed the migration of historical elements from Risk/Capital data to the Alpide publish layers and mapped the new required go-forward fields into BFV
  • Imported data with Sqoop from Teradata via the FastExport JDBC driver and from Netezza via the default drivers
  • Improved parallelism and map splits for the Sqoop process by evaluating boundary queries
  • Processed ~3.7 TB of retail risk data containing over 165 million records every month end
  • Coordinated with the admin team to build and deploy code components on the edge node and add JAR files to the cluster using Jenkins
  • Used a Java JCEKS keystore for Sqoop password encryption and implemented Sentry as a service for table and view access control
  • Built Autosys schedules for running the Oozie jobs to archive data based on the allocated windows
  • Performed Parquet data analysis for query optimization based on partitions and DFS block sizes
  • Coordinated with the on-call support team on issues in the production environment during the month-end cycle, and created JARs and code release branches for production deployment
  • Converted EBCDIC input files to ASCII using MapReduce programming and parsed copybooks with a Python script to generate schemas for tables and views (record-level decoding is sketched after this list)
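
A minimal PySpark sketch of the dedup-and-publish step described in the first bullet of this list; the key columns, timestamp column and paths are assumptions for illustration only:

    # dedup_publish.py - illustrative sketch of the staging -> publish step
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("dedup_publish").enableHiveSupport().getOrCreate()

    staged = spark.read.parquet("/data/staging/risk_capital/")   # hypothetical staging path

    # Basic data integrity check: reject records missing the business key
    valid = staged.filter(F.col("account_id").isNotNull())

    # Dedup logic: keep the latest record per business key by load timestamp
    w = Window.partitionBy("account_id", "business_date").orderBy(F.col("load_ts").desc())
    deduped = (valid
               .withColumn("rn", F.row_number().over(w))
               .filter(F.col("rn") == 1)
               .drop("rn"))

    # Publish in Parquet, partitioned by business date
    deduped.write.mode("overwrite").partitionBy("business_date").parquet("/data/publish/risk_capital/")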
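
The EBCDIC conversion itself ran as MapReduce, but the record-level logic can be sketched in plain Python; the record length and field layout below are hypothetical stand-ins for what the copybook parser would produce:

    # ebcdic_to_ascii.py - sketch of per-record EBCDIC (code page 037) decoding
    RECORD_LENGTH = 80                      # assumed fixed LRECL from the copybook
    FIELDS = [("policy_id", 0, 10),         # (name, offset, length) - illustrative layout
              ("customer_name", 10, 30),
              ("balance", 40, 12)]

    def decode_record(raw_bytes):
        """Decode one fixed-length EBCDIC record into a dict of ASCII fields."""
        text = raw_bytes.decode("cp037")    # cp037 = EBCDIC US/Canada
        return {name: text[offset:offset + length].strip()
                for name, offset, length in FIELDS}

    with open("input.ebcdic", "rb") as f:
        while True:
            record = f.read(RECORD_LENGTH)
            if len(record) < RECORD_LENGTH:
                break
            print(decode_record(record))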

Confidential

Automation Engineer and Module lead

Responsibilities:
  • Strengthened hands-on QTP expertise by automating test data creation for the project
  • Analyzed the need for automation testing and wrote automation test scripts for the policy creation and tax calculation pages using VBScript in QTP
  • Developed a data-driven test framework for 4 different applications across SAP, .NET, Web and Mainframe platforms
  • Developed an HTML report presenting the test results after execution of the QTP automation scripts
  • Provided test estimations
  • Mentored new team members
  • Upskilled team members by training them on QTP

Confidential

Automation Engineer and Module lead

Responsibilities:
  • Module lead for the Peachtree Forms testing team
  • Identified the areas where automation was required and developed QTP tools and macros to reduce manual effort
  • Designed the work breakdown structure and allocated the relevant tasks to team members
  • Automated generation of daily and weekly status reports, delivered on time with accuracy maintained
  • Followed testing standards and ensured all standards were met
  • Collaborated with other Test Leads in creating Test Completion Report
  • Conducted review meetings within the team
  • Ensured that any risks associated with the agreed test strategy and the system test plan were clearly documented and described to clients/users and colleagues
  • Developed a tool that finds content in policy documents and returns the file name
  • Automated the mainframe database validation process, as it consumed most of the test execution time
  • Automated product validation using QTP and captured the product applicability list screenshot from the Mainframe

Confidential

Build Developer

Responsibilities:
  • Built packaging for the Dev team using the Octopus and WiX tools
  • Created and troubleshot Octopus packages used for automated deployment
  • Developed and documented a best-practices process for build deployment in LPO
  • Created tools to streamline the process, reducing churn/turnaround time on build requests and automating manual processes wherever possible
  • Honed skills in C#/.NET and Perl scripting and created small tools
