Sr. Hadoop Developer Resume

PROFESSIONAL SUMMARY

  • Over 8 years of experience in the IT field as a Programmer/Analyst. Strong in project design, software processes, requirements analysis, and development of software applications in Java and Big Data Hadoop.
  • Expertise in the banking domain, with hands-on experience on the TouchPoint Sales & Service, Xpress and Teller platforms of Fidelity Information Systems
  • Experience in Service Oriented Architecture (SOA) and Component Oriented Architecture design
  • Experience developing applications with Model-View-Controller architecture (MVC2) using the Struts Framework and J2EE design patterns
  • Good knowledge of the Hadoop stack, cluster architecture, and cluster monitoring
  • Hands-on experience with Hadoop and its components, including HDFS, MapReduce, Apache Pig, Hive, Sqoop and Oozie
  • Experience with NoSQL databases such as HBase and Cassandra
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java
  • Strong working experience with ingestion, storage, processing and analysis of big data
  • Good working knowledge of Apache Pig and Hive
  • Developed Pig UDFs to pre-process data for analysis (a sketch of one such UDF follows this summary)
  • Experienced in creating Oozie workflows for scheduled (cron-style) jobs
  • Involved in converting business requirements into System Requirements Specifications (SRS)
  • Strong knowledge of and experience with SQL and PL/SQL (stored procedures)
  • Good knowledge of Spark, Scala, Flume and NoSQL
  • Adaptable to challenging environments, with a strong work ethic and a commitment to achieving client objectives
  • Motivated, focused team player with strong problem-solving and analytical skills; generates new ideas and quickly learns new technologies to get the job done
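
As an illustration of the Pig UDF work noted above, here is a minimal sketch of a Java eval function that normalizes a text field before analysis. The class name and the normalization rule are hypothetical, not taken from the original projects:

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Minimal Pig eval UDF sketch: trims and upper-cases a string field.
    // Class name and normalization rule are illustrative assumptions.
    public class NormalizeField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // propagate nulls rather than failing the job
            }
            return ((String) input.get(0)).trim().toUpperCase();
        }
    }

In a Pig script, the jar containing such a UDF would be registered with REGISTER and the function applied inside a FOREACH ... GENERATE statement.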

TECHNICAL SKILLS

Languages/Technologies: Java, Scala, HTML5, J2EE, SQL, XML/XSLT, JDBC, Servlets, JSP, JSF, AJAX, EJB, JMS, JAXB, Spring and Hibernate

Big Data Technologies: MapReduce, Pig, Sqoop, Hive, Spark, HBase

Web Services: XML, XSD, XSLT, DTD, DOM/SAX Parsers, UDDI, SOAP, REST, WSDL

IDE/Tools: VisualAge, Visual C++, Eclipse 3.0, Rational Rose, Toad 6.3, WSAD 5.1, JUnit, XML Spy

Scripting/Styling: JavaScript, VBScript, CSS3, Bootstrap

Application Servers: WebSphere 6.0/7.0/8.5 and JBoss 5

Web Servers: Tomcat 7.0

RDBMS: Oracle 10g and DB2

Methodologies: OOAD

Frameworks: Struts 1.2, Hibernate and Spring 3.0

Design Tools: Rational Rose

Messaging Middle Ware: IBM MQ Series and JMS

Testing Tools: JUnit

Protocols: TCP/IP, HTTP(S), SOAP

OS: Windows 2000/98/NT, UNIX, Linux, CentOS

Configuration and Version Tracking Tools: SVN, VSS 6.0, CVS 3.0.1, PVCS and Rational ClearCase

PROFESSIONAL EXPERIENCE

Confidential

Sr. Hadoop Developer

Responsibilities:

  • Architect, design and build a fault-tolerant, scalable big data platform based primarily on the Hadoop ecosystem.
  • Build a high-throughput messaging framework to transport high-volume data.
  • Build high-availability (HA) architectures and deployments, primarily using big data technologies.
  • Import data into HDFS and Hive using Sqoop.
  • Create and manage data pipelines.
  • Responsible for the design and development of Hive and Spark SQL scripts based on functional specifications (a Spark SQL sketch follows this list).
  • Responsible for Spark Streaming configuration based on the type of input source.
  • Coordinate Hadoop system implementation, system support and performance tuning.
  • Prepare technical specifications, analyze functional specifications, and develop and maintain code.
  • Responsible for complexity determination and estimation.
  • Evaluated tools such as Apache NiFi, Zeppelin Notebook, Kylin and Knox in various POCs.
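
As a sketch of the Hive and Spark SQL script work above, here is a minimal Spark SQL job in Java, assuming the Spark 2.x SparkSession API; the table and column names (transactions, account_id, amount) are hypothetical:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    // Minimal Spark SQL sketch: aggregates a Hive table and saves the result.
    // Table and column names are illustrative assumptions.
    public class AccountTotalsJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("AccountTotalsJob")
                    .enableHiveSupport() // lets Spark SQL query managed Hive tables
                    .getOrCreate();

            Dataset<Row> totals = spark.sql(
                    "SELECT account_id, SUM(amount) AS total_amount "
                  + "FROM transactions GROUP BY account_id");

            totals.write().mode("overwrite").saveAsTable("account_totals");
            spark.stop();
        }
    }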

Environment: HDP-2.4, Spark, Spark SQL, Hive, HDFS, Sqoop, Oozie, Ranger, Kerberos, Unix, SAP HANA, PeopleSoft and Oracle

Confidential

Hadoop Developer

Responsibilities:

  • Moved all product information for existing customers to HDFS for further processing.
  • Wrote Apache Pig scripts to process HDFS data.
  • Imported and exported data between relational databases and HDFS/Hive using Sqoop.
  • Created Hive tables, loaded data, and wrote Hive queries that run internally as MapReduce jobs.
  • Loaded and transformed large sets of semi-structured data.
  • Analyzed requirements for setting up the cluster.
  • Scheduled Hive and Pig jobs through the Oozie workflow engine.
  • Gained experience managing and reviewing Hadoop log files.
  • Developed multiple MapReduce jobs in Java for data cleaning and processing (a sketch follows this list).
  • Wrote Hive queries for data analysis to meet business requirements.
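
As an illustration of the data-cleaning MapReduce work above, here is a minimal map-only job in Java that drops malformed comma-delimited records; the expected field count and delimiter are hypothetical:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanRecordsJob {
        public static class CleanMapper extends Mapper<Object, Text, Text, NullWritable> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Keep only rows with the expected field count; the count of 5
                // and the comma delimiter are illustrative assumptions.
                String[] fields = value.toString().split(",", -1);
                if (fields.length == 5 && !fields[0].isEmpty()) {
                    context.write(value, NullWritable.get());
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0); // map-only: filtering needs no aggregation
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(NullWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }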

Environment: Cloudera CDH 5.0, Java, MapReduce, Pig, Hive, Sqoop, Unix, Oracle

Confidential

Sr. Application Developer

Responsibilities:

  • Responsible for project build and development.
  • Designed and developed UOB Administration Account entitlements.
  • Implemented Actimize customization for purge view and purge parameters.
  • Designed and developed ACH Company Name and ID customizations.
  • Performed Unit testing and Integration testing.
  • Worked on SIT, UAT and Production Issues.

Environment: Java, J2EE (JSP, HTML, Servlets), Struts, Hibernate, Spring, JavaScript, jQuery, XML, Ant, SVN, PuTTY, WinSCP, IBM DB2, IBM AIX and WebSphere

Confidential

Application Developer

Responsibilities:

  • Worked on TouchPoint Forms configuration and coding across business processes such as Deposit Account Opening, Customer Account Maintenance, and Account Close
  • Developed the Order ATM/Debit Card business process
  • Developed the ATM/Debit Card Maintenance business process
  • Supported System Integration Testing and worked on defect fixes raised during it

Environment: Java, J2EE (JSP, HTML, Servlets), Struts, Hibernate, Spring, JavaScript, jQuery, Web services, XML, XSL, Ant, SVN, DB2 and Apache Tomcat 5.5

Confidential, Charlotte, NC

Programmer Analyst

Responsibilities:

  • Performed Capital One testing of Xpress services following host retrofit changes
  • Supported performance testing
  • Implemented host pagination for the Stop Payment Inquiry service
  • Delivered high-level Xpress development for the prioritized Argo gap analysis
  • Served on the Production Support team, responsible for bug fixes

Environment: Java, MDB, JWSDP, WSDL, JMX, JAXP, JAXB, JCA, IBM IMS Resource Adapter, Hibernate, Spring, IBM MQ, DB2, IBM WebSphere 6.1 and AXIS 1.4

Confidential, Charlotte, NC

Programmer Analyst

Responsibilities:

  • The TouchPoint suite is distributed through the branch, the contact center, and the network of banking relationship managers. Capital One Bank required further enhancements to TPTeller and TPSS.
  • TouchPoint Teller complements traditional teller needs by automating all monetary transactions, inquiries, compliance requirements, and administrative functions while providing a reliable store-and-forward environment to ensure transactions are not lost.
  • TouchPoint Sales and Service is a suite of customer interaction solutions that allows institutions to address their greatest needs first and then add more capabilities as needed.
  • ESL is an enterprise services layer built in C++ that sits between the front end and the back end and handles transaction processing.
  • Worked on defect fixes in Teller Services and Cash Drawer modules of TPTeller UI.
  • Worked on enhancements and bug fixes for the following C++ ESL services: ContactSvr, SalesSvr, CashBoxSvr, EJSvr and TranServer
  • Performed unit testing and documentation

Environment: C++, Java, WebSphere MQ, Oracle, XML, UNIX.
