
Hadoop Consultant Resume

Plano, TX


  • Experienced IT professional with nearly 8 years of experience, including 2+ years in the Big Data ecosystem and 5+ years in Oracle, covering analysis, design, development, testing, upgrades, and support.
  • Proven ability to work with diverse and disparate teams and consistently recognized for organizing and completing projects in a business environment including all phases of software development.
  • Experienced in developing Big Data applications for processing terabytes of data using the Hadoop ecosystem (HDFS, MapReduce, Spark, Hive, Sqoop), with in-depth knowledge of the MR1 (classic) and MR2 (YARN) frameworks.
  • Experienced in extract, transform, and load (ETL) of Big Data from multiple federated data sources with DataFrames in Spark.
  • Experienced in Big Data Tools like Spark using pySpark and Spark SQL for processing of data files.
  • Applied partitioning and bucketing in Hive and designed both managed and external tables to optimize query performance.
  • Experienced in writing custom Hive UDFs to incorporate business logic into Hive queries.
  • Experienced with distributions including Hortonworks HDP 2.2 and Cloudera CDH5 Hadoop clusters.
  • Experience with AWS cloud services (S3, EMR, EC2) and the Snowflake warehouse.
  • Experience with Streaming data using Spark Streaming, Kafka and Scala.
  • Experience writing Hive and Pig Latin scripts for business use cases.
  • Experience writing PySpark and Python scripts for business use cases.
  • Experience in Software Development with proficiency in development of systems using Apache Hadoop, Pig, and Hive.
  • Experience in working with various tools like Jenkins and GitHub.
  • Highly experienced in the software development life cycle, with active participation in software design, analysis, coding, development, and testing, specializing in Oracle-based client-server computing, implementation, maintenance, and quality assurance.
  • Experienced in preparation of use cases from the functional requirements.
  • Experienced in design and development of Packages, Stored Procedures and Database Triggers using PL/SQL and good understanding of oracle data dictionary.
  • Excellent knowledge of Oracle utilities such as Toad, PL/SQL Developer, and SQL*Loader.
  • Seasoned in optimizing data warehouse operations using Oracle tools such as SQL*Loader, TKPROF, Import/Export, and Explain Plan.
  • Experience working with Oracle Forms and Reports, EiS, and Oracle Fusion ADF.
  • Extensively worked on ADF Business Components, including customization of default ADF behavior (EO, VO, view links, entity associations, and application modules), ADF Faces, templates, and task flows.
  • Tuned databases by creating and using indexes and gathering table and index statistics.
  • Expertise in implementing various business processes using Oracle Workflow.
  • In-depth understanding of Oracle architecture, memory, and process structures.
  • Willingness to learn new technologies and apply them in creative ways.
  • Good initiative, teamwork, and ability to understand new business applications quickly.
  • Proficient in Techno-Functional consulting of Oracle E-business suite applications.
  • Good programming/scripting skills in Oracle SQL and Oracle PL/SQL.
  • Good communication and interpersonal skills; self-motivated, quick learner, and team player with enthusiasm for extracurricular activities.
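As an illustration of the custom Hive UDF work mentioned above, the core date-normalization logic can be sketched in plain Python (production Hive UDFs are typically written in Java; the format list below is a hypothetical example, not the actual business rule):

```python
from datetime import datetime

# Candidate input formats observed in the feeds (hypothetical examples);
# a real Hive UDF would carry an equivalent list in Java.
FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d-%b-%Y", "%Y%m%d"]

def normalize_date(raw):
    """Return the date in ISO yyyy-mm-dd form, or None if unparseable."""
    if raw is None:
        return None
    raw = raw.strip()
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue  # try the next known format
    return None
```

Registered as a UDF, this kind of function lets `SELECT normalize_date(order_dt) FROM orders` clean mixed-format date columns in place.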


ERP: Oracle E-Business Suite 11i & R12 (12.1.3), EDI, Order Management (OM), Purchasing (PO), Inventory (INV), Bill of Materials (BOM), Change Management/Engineering (ENG), Accounts Receivable (AR), Accounts Payable (AP), CRM, Oracle Projects, HRMS.

Oracle: SQL, PL/SQL, Oracle Forms 6i/10g & Reports 6i/10g, Oracle Workflow, SQL*Loader, Oracle 11g/10g/9i, BPEL, EiS, XML Publisher Reports.

BigData: Hadoop ecosystem, Hadoop (MapR/YARN), Spark 2, PySpark, Sqoop 1.4.5, Hive 1.2.4, Zookeeper, Kafka, Flume, AWS (S3, EC2, EMR, SQS), Snowflake cloud data warehouse.

Programming Languages: SQL, PL/SQL, C, Java 8, JavaScript, shell scripts, R, Python, Scala.

Other Tools: OBIEE (BI Answers), TOAD, SQL Developer, Loftware, 1EDI Source, Informatica, Oracle Data Integrator (ODI), EiS Reports, Oracle Fusion ADF 12.1.3 (12c), VMware Workstation, Eclipse, JDeveloper, Jupyter, RStudio, IntelliJ, GitHub, Citrix, Jenkins.

Operating Systems: Windows, macOS, UNIX (Linux, Sun Solaris).

Database: Oracle 10g, 11g, Snowflake data warehouse.


Confidential, Plano,TX

Hadoop Consultant


  • Developed Java scripts to automate DDL and insert-script creation so that data modelers could insert and analyze data without errors.
  • Created JSON files so that schedulers could automate scheduling jobs.
  • Developed a Java REST API to support a UI portal for fine-grained access control (FGAC), used by data guardians (DGs) to grant database access to users.
  • Created Spring Boot microservices/APIs that generate SQL scripts and JSON config files, call other APIs to insert metadata, commit the resulting files to Git, and save the generated artifacts to an S3 bucket in AWS.
  • Created Snowflake tables to store the incoming data.
  • Used AWS SQS to send email notifications to users.
  • Used Parquet and JSON files to load the data into AWS S3 buckets.
  • Deployed the Spring Boot applications on EC2 instances.
  • Used Spark Streaming and Kafka to stream data from multiple producers into the Snowflake warehouse (a cloud data warehouse running on AWS).
  • Developed Scala scripts to insert the streaming data into Snowflake and AWS S3.
  • Developed APIs to create and monitor change orders in ServiceNow and HPSM.
  • Worked on making multiple frameworks CI/CD PAR compliant using RevUp.

Environment: AWS, Snowflake, SQL, Spark 2, Scala, and Java Spring Boot applications.
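The JSON files mentioned above drove job scheduling; a minimal Python sketch of that kind of config generation (the field names here are hypothetical, not the project's actual schema) might look like:

```python
import json

def build_job_config(job_name, schedule_cron, sql_script, s3_bucket):
    """Assemble a scheduler job config as a JSON string.
    All keys below are illustrative placeholders."""
    config = {
        "job": job_name,
        "schedule": schedule_cron,          # cron expression for the scheduler
        "script": sql_script,               # SQL script the job executes
        "artifacts": {"s3_bucket": s3_bucket},
        "notify_on_failure": True,
    }
    return json.dumps(config, indent=2, sort_keys=True)
```

Writing the returned string to a file per job lets the scheduler pick up new jobs without code changes.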

Confidential, San Antonio, TX

Hadoop Consultant


  • Used Big Data tools such as Hive and Sqoop to extract, transform, and load data into HDFS.
  • Developed wrapper scripts in Python to pull data from Hortonworks.
  • Created Hive tables and wrote Hive queries with joins, grouping, and aggregation.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Created Hive UDF to handle various date formats.
  • Imported and exported data into HDFS using Sqoop, including incremental loads, and transferred data between the relational database (Discovery) and the Hadoop system.
  • Responded to ad hoc requests from data scientists and the business.
  • Standardized the code used within ADR projects.
  • Was part of migrating data from BigInsights to Hortonworks.

Environment: Hadoop BigInsights, Hortonworks 2.6.1, SQL, Hive 1.2.1, Sqoop, and Python.
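The wrapper scripts and incremental Sqoop loads above can be sketched as a Python helper that only assembles the command line (the JDBC URL, table, and column names are placeholders; the real wrapper also handled credentials and logging):

```python
def sqoop_incremental_cmd(jdbc_url, table, target_dir, check_column, last_value):
    """Build the argument list for an incremental-append Sqoop import."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--incremental", "append",   # only pull rows newer than last_value
        "--check-column", check_column,
        "--last-value", str(last_value),
    ]
```

A scheduler can persist the high-water mark after each run and pass it back as `last_value` on the next invocation.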

Confidential, Costa Mesa, CA

Hadoop/BigData Developer


  • Applied in-depth knowledge of Hadoop architecture and administration, MapReduce JobTracker/TaskTracker processes, and the HDFS/MapR file systems.
  • Participated in meetings related to the design and implementation of the application.
  • Wrote Python scripts to format data provided by data scientists.
  • Wrote validation, search-and-match, and pinning scripts using PySpark and Spark SQL.
  • Created workflows using Luigi for batch processing.

Environment: Cassandra, Hadoop HDFS, SQL, Python, PySpark, Luigi (ETL automation tool).
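The search-and-match and pinning steps above can be illustrated with a simplified pure-Python fragment (the real job ran in PySpark with richer validation and fuzzy matching; field choices here are hypothetical):

```python
import hashlib

def match_key(name, dob):
    """Normalize identifying fields into a simple blocking/match key."""
    return "|".join([name.strip().lower(), dob.strip()])

def pin(name, dob):
    """Derive a stable pseudo-identifier ("pin") from the match key,
    so the same person always receives the same pin."""
    return hashlib.sha1(match_key(name, dob).encode("utf-8")).hexdigest()[:12]
```

Because the key is normalized before hashing, records that differ only in case or whitespace pin to the same identifier.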

Confidential, Costa Mesa, CA

ETL Developer (Big Data)


  • Applied in-depth knowledge of Hadoop architecture, administration, and production support: monitored Hadoop clusters, MapReduce JobTracker/TaskTracker processes, and the HDFS/MapR file systems.
  • Participated in meetings related to the design and implementation of the application.
  • Created SQL scripts to transform data from the mainframe into HDFS via text files.
  • Managed and reviewed Hadoop log files.
  • Responsible for creating Hive tables and writing Hive queries in HiveQL.
  • Involved in database connectivity for the Hadoop cluster, YARN, and MapReduce configuration.
  • Involved in the analysis, design, and testing phases; responsible for documenting technical specifications.

Environment: Hadoop CDH 5.12.1, MapR (v2), Hadoop HDFS, SQL, Hive.
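Transforming mainframe extracts into HDFS-ready text, as described above, usually means slicing fixed-width records into delimited fields. A hedged sketch, with an invented record layout:

```python
# Hypothetical fixed-width layout for a mainframe extract:
# cols 0-9 account id, 10-29 name, 30-37 amount.
LAYOUT = [("account_id", 0, 10), ("name", 10, 30), ("amount", 30, 38)]

def to_delimited(record, sep="\t"):
    """Slice one fixed-width line into trimmed fields for HDFS loading."""
    fields = [record[start:end].strip() for _, start, end in LAYOUT]
    return sep.join(fields)
```

Running every extract line through this converter yields tab-delimited text that a Hive external table can read directly.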

Confidential, Houston, TX

Sr. Software Developer


  • Involved in the design & development activities for interface programs, data conversion utilities.
  • Worked on Procurement (PO), HRMS and Oracle Projects.
  • Worked on PCG (Preventive Control Governors) to do Form Personalization on AR, AP, HRMS forms.
  • Created new reports in EiS (Enterprise Integration Solutions) for HRMS, PO, AP, AR modules.
  • Created conversion programs to transfer PO data from SAP to Oracle.
  • Used AOL for registration of Procedures and tables.
  • Worked on development of various custom interface programs to transfer information to Projects through AP and AR.
  • Created Oracle Alerts for Employee Termination/Appointment, Training Class Expiration.
  • Created XML Publisher reports on Oracle Projects and HRMS paycheck customization.
  • Worked on customization of reports, e.g., the Purchase Order Detail Report, Receipt Adjustments Report, and Purchase Order and Releases Detail Report in the PO module.
  • Generated various reports in HRMS and Procurement (PO) to provide information about employees and requisitions/purchases.
  • Involved in the design of customized objects, CV-60 documentation, and test case generation.

Environment: Oracle Applications R12 (12.1.3) (AR, PO, HRMS, AP, Projects), EiS Reporting Tool, PCG (Preventive Controls Governor), Oracle Alerts, Oracle 11g, SQL*Loader, TOAD, UNIX, PuTTY, FTP GUI client, XML Publisher.

Confidential, Orlando, FL



  • Worked on the upgrade of 11i SCM and Financials modules to Oracle R12.
  • Did CEMLI assessment and retrofit.
  • Analyzed various changes and new features of Oracle E-Business Suite R12 that affect custom business flows and application customizations. (CEMLI)
  • Performed an E-Business Suite system administration security audit on users, menus, and responsibilities; cleaned up obsolete workflow/setup data in preparation for the R12 upgrade.
  • Tested and customized various reports, forms, SQL scripts, functions, and stored procedures as part of the R12 upgrade.
  • Did a CEMLI evaluation of 11i custom reports (INV, PO, OM); decommissioned some in R12 and moved them to OBIEE.
  • Performance tested and tuned all the custom objects during database upgrade.
  • During the R12 upgrade, migrated custom forms from version 6i to 10g to make them compatible with the new application server and technology stack.
  • Worked on various party APIs for creating and updating party information, party sites, locations, and customer accounts using TCA APIs.
  • Discussed and explained functional requirements to the offshore development team.
  • Reviewed technical design documents prepared by the offshore team.
  • Worked on the Oracle Payments module; developed a package to authorize credit card payments.
  • Imported sales orders through the Sales Order Interface, using SQL*Loader to load the interface tables and PL/SQL blocks to validate the data.
  • Worked on the Universal Work Queue (UWQ) and the CRM module.
  • Tested and compared new features available in R12 over 11i for the SCM modules.
  • After migrating custom concurrent programs to R12, compared their results against the corresponding 11i executions.
  • Created documentation for the test cases as well as their results.
  • Followed up with the database team during the upgrade, maintaining and testing logs of patches and application clones.
  • Tested the integrity of customizations to ensure nothing was overridden by the R12 upgrade.
  • Prepared MD-70s and other technical documents.
  • Prepared unit test cases for CRP testing.

Environment: Oracle Applications R12 (12.1.3) (INV, AP, AR, OM, PO, CRM, Oracle Payments), Oracle Workflow, Oracle 11g, SQL*Loader, TOAD, UNIX, PuTTY, FTP GUI client, Oracle Forms 10g, Reports 10g.

Confidential, San Jose, CA

Business Solution Analyst


  • Migrated Hypercom Oracle custom forms to Confidential Oracle Suite.
  • Worked on data conversions for PO, AP, AR, and SO using SQL*Loader and conversion programs.
  • Wrote PL/SQL packages and triggers to create orders and the different workflow steps.
  • Worked on Language changes for reports for Germany region.
  • Worked on XML Publisher for Hypercom Log changes.
  • Developed an AP Invoice Interface package that creates Payables invoices from legacy systems by inserting data into the AP invoice interface tables and calling the Payables Import program.
  • Led a team and integrated Loftware (a label-printing system) with Oracle.
  • Worked on integrating EDI (Electronic Data Interchange) in Oracle.
  • Worked on BPEL components for Customer Interface, from Amdocs to Oracle.
  • Development of various interfaces Inbound/Outbound for Customer, AR, PO, WIP.
  • Development of PL/SQL reports for ASCP planned items.
  • Prepared the CV-40 and CV-60 documentation.
  • Worked on the customized pick slip report, which prints all sales order picking line details in a batch; warehouse personnel use it to collect all items included in shipments.
  • AP Vendor Conversion: developed a custom conversion program to load all vendors and vendor sites from the external system into Oracle. Vendor types included trade vendors and employees.
  • Involved in writing control files for SQL*Loader to upload data from the legacy system to the new system.
  • Developed a Vendor Conversion package that validates vendor data from the old systems and creates vendors, vendor sites, and contact information by inserting data into the PO base tables.
  • Involved in the design & development activities for interface programs, data conversion utilities.
  • Used AOL for registration of Procedures and tables.
  • Worked on development of custom interface program to inform shop floor about the open orders and related customer information from Oracle WIP and ORDER tables.

Environment: Oracle Applications 11i (INV, AP, AR, OM, EDI, PO, BOM), Remedy, PVCS, Oracle Workflow, Amdocs, Oracle Reports Builder 6i, Oracle Forms Builder 6i, XML Publisher, BPEL, Oracle 11g, SQL*Loader, TOAD, UNIX, PuTTY, FTP GUI client, Loftware.
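The control files mentioned above drive SQL*Loader conversions. A minimal hypothetical example (the staging table and column names are invented for illustration, not taken from the project):

```
-- Hypothetical control file loading legacy vendor rows into a staging table.
LOAD DATA
INFILE 'vendors.csv'
APPEND
INTO TABLE xx_vendor_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  vendor_name,
  vendor_site_code,
  invoice_currency,
  creation_date  DATE "YYYY-MM-DD"
)
```

A conversion package would then validate the staged rows before inserting them into the PO base tables.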


Software Engineer


  • Coordinated onshore and offshore teams on a regular basis for the project.
  • Involved in discussions with cross functional teams in designing and development of Interfaces between systems.
  • As a lead, ensured that issues were assigned, acknowledged, and resolved at the earliest.
  • Involved in preparation of package for UAT and providing UAT support.
  • Worked on Integration for AP, OM, AR, GL, PO, INV and Service Contracts modules.
  • Involved in requirement gathering, design and development and testing of System.
  • Preparing project plan for the objects to be developed.
  • Prepared functional and technical documents for the enhancements assigned as part of business process re-engineering.
  • Performed personalization and customization of sales orders in Order Management.
  • Worked on Outbound Interface for Customers, Tax Exemptions and Tax Zones.
  • Created Descriptive Flexfields (DFFs) in AP, AR, INV, BOM, and PO for capturing item and tax information.
  • Customized the XML publisher Template for Sales Order Acknowledgement Report.
  • Wrote SQL*Loader concurrent programs for various purposes.
  • Worked on customization of the Sales Order form for various tax-determination scenarios.
  • Worked on P2P Process and O2C process.
  • Modified the purchase order workflow to implement approvals using AME.
  • Worked on Sales Order Acknowledgement Report.
  • Prepared the MD-50 and MD-70 documentation.
  • Provided key ideas on a monthly basis to help reduce the recurrence of issues and enhance business processes.
  • Shared knowledge and provided support after the upgrade of a major portion of the Confidential VAIO business from 11.0.3 to 11i.
  • Assured code quality by following coding standards.
  • Handled all kinds of issues and enhancements, ensuring resolution within the stated SLA.
  • Managed and coordinated with other teams to carry out day-to-day work and handled critical situations to maintain business continuity.
  • Upgraded the database from 9i to 10g.
  • Provided User training at various company locations.
  • Requirement Analysis and its translation into technical design.
  • Maintenance and support of client ERP system and database.
  • Created and maintained technical documentation for all changes incorporated.
  • Development of Database Triggers for Data security within the application.
  • Developed various kinds of Stored Procedures, Functions and Packages using SQL, PL/SQL.
  • Optimized SQL queries for higher performance.
  • Prepared backup and recovery plans.
  • Provided root cause analysis for all kinds of issues and prevented recurrence by proactively delivering permanent fixes.
  • Developed PL/SQL Procedures for execution of different Business Logics in Reports and Forms.
  • Wrote SQL to create the views required by the system.
  • Developed program code for packages, forms, reports, and triggers using SQL and PL/SQL.
  • Worked on customization of reports, e.g., the Item Categories Report and Item Quantities Summary Report in the INV module.

Environment: Oracle Applications 11i (AR, AP, OM, INV, PO, WIP, BOM), Oracle 8i/9i/10g, SQL, TOAD, Oracle Reports Builder 6i, Oracle Forms Builder 6i, SQL*Loader, Sun Solaris 8, Remedy, PVCS, Oracle Workflow, HPPM, JavaScript, UNIX, PuTTY, FTP GUI client.
