Hadoop Developer Resume Profile
Charlotte, NC
PROFESSIONAL SUMMARY
- Senior Java Developer with 8 years of experience in the design and implementation of scalable, reliable applications. Focused on code and design quality. Well positioned to deliver in a challenging role as a J2EE Lead.
- Experience with Big Data technologies - Apache Hadoop, Cloudera, Hortonworks, MapR.
- YARN (Yet Another Resource Negotiator), Hadoop Distributed File System (HDFS), MapReduce, Pig, Hive (HiveQL), Sqoop, Oozie, HBase, Hue.
- Experience with Hadoop/Big Data installation and development.
- Experience with ETL (Extract, Transform, Load) tools - Talend Open Studio and Informatica - including Big Data integration with Talend.
- Experience with Business Intelligence (BI) tools - JasperReports, Actuate.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Created Hive tables and queried them using HiveQL for data analysis to meet business requirements.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Experienced in configuring Hive within a Hadoop environment to process data from HDFS.
- Experience with JasperReports; implemented a successful migration from the Actuate reporting tool to JasperReports.
- Served as an application support executive responsible for maintenance of major Global Treasury Risk Rating applications.
- Exceptionally well organized, with a strong work ethic and willingness to work hard to achieve employer objectives.
- 3 years 6 months of experience implementing and customizing the Finacle core banking product for major banks such as Emirates Bank and Doha Bank.
- An easygoing, hardworking, reliable communicator who can translate complex information into easy-to-understand terms.
Skills
Languages: Java 1.6
Advanced Java Concepts: J2EE, Servlets, JSP, SOAP Web Services
Big Data: Apache Hadoop, Hive, Pig, HBase, Sqoop, Oozie workflow
RDBMS: Oracle 9i/10g, DB2, SQL Server
Scripting Languages: UNIX shell script, JavaScript
XML Technologies: XML, XHTML, XSL
Web Servers: Tomcat 7.x
Application Servers: IBM WebSphere 8, WebLogic Portal
Version Control Tools: SVN, VSS, Perforce
Development Tools: Eclipse
Other Tools/IDE: WebLogic Portal, Eclipse 3.x, ITRS Geneos, MIPS, CA Wily Introscope, Autosys, Syncsort DMX
Operating Systems: Windows NT/2000/XP; UNIX (AIX, Solaris); Linux
Methodology: Waterfall
Reporting Tool: JasperReports 3.7.4
ETL Tools: Talend, Informatica PowerCenter 8.6
Professional Experience
Confidential
Role: Hadoop Developer
- Analyzed large, critical datasets of the Global Risk Treasury Technology (GRITT) domain using Cloudera, HDFS, MapReduce, Hive, Hive UDFs, and Pig.
- Responsible for data ingestion (ETL).
- Imported metadata from an Oracle database using Sqoop.
- Automated processes in the Cloudera environment and built Oozie workflows.
- Created MapReduce jobs for data transformation and parsing (a mapper sketch follows this list).
- Created Hive scripts to extract summarized information from Hive tables.
- Wrote Hive UDFs to extract data from staging tables (a UDF sketch also follows this list).
- Performed volume testing to measure cluster throughput.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Helped the team expand the cluster from 18 nodes to 20 nodes.
- Maintained system integrity of all sub-components, primarily HDFS and MapReduce.
- Monitored system health and logs and responded to warning or failure conditions.
- Performed coding and peer review of assigned tasks.
- Performed unit testing, volume testing, and bug fixing.
- The platform is the single retail data source and provides common aggregations for enterprise risk.
- Quantitative risk technology is the evolution of how the enterprise credit risk system sources data from different upstream and downstream systems.
- Performed import/export data ingestion using the Syncsort DMX tool.
- Generated GRITT domain risk rating reports for various clients in the organization.
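As referenced above, here is a minimal sketch of a data-parsing mapper of the kind this role describes, written against the Hadoop 2.x Mapper API. The record layout, field positions, and class name are illustrative assumptions, not the actual GRITT code.

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical parsing mapper: reads comma-delimited exposure records,
// keeps only well-formed rows, and emits (accountId, exposureAmount).
public class ExposureParseMapper
        extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split(",");
        if (fields.length < 3) {
            context.getCounter("parse", "malformed").increment(1);
            return; // skip malformed records, but count them
        }
        // fields[0] = account id, fields[2] = exposure amount (assumed layout)
        context.write(new Text(fields[0]), new Text(fields[2]));
    }
}
```

A reducer (or an identity reduce) would then aggregate the parsed pairs per account.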
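And a minimal sketch of a Hive UDF of the sort used against the staging tables, targeting the classic org.apache.hadoop.hive.ql.exec.UDF API available in Hive 0.11. The column format and function name are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: extracts the rating code embedded in a pipe-delimited
// staging-table column, e.g. "GRITT|AA2|2014-03-01" -> "AA2".
public class ExtractRatingCode extends UDF {
    public Text evaluate(Text column) {
        if (column == null) {
            return null; // propagate NULLs, as Hive built-ins do
        }
        String[] parts = column.toString().split("\\|");
        return parts.length > 1 ? new Text(parts[1]) : null;
    }
}
```

Once packaged into a JAR, such a function would be registered with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.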
Environment: Linux 2.6.32, Java, UNIX shell scripting, Hadoop 2.2.0 (Cloudera), Hive 0.11.0, Eclipse 3.5, Sqoop, Syncsort DMX, FileZilla, PuTTY, SVN
Confidential
Role: Java Developer
- The Scheduled Exposure Report (SER) application earlier used Actuate as the product platform for its reporting solutions.
- Open-source tools were evaluated to replace the Actuate reports.
- One of the preferred target technologies for migration of the Actuate reports was JasperSoft iReport.
- Three major reports hold the entire information about the application: Scheduled Exposure Report (SER), Credit Approval Document (CAD), and Executive Summary Report (ESR).
- Developed design documents based on inputs from the project specification document and discussions with Bank of America technology associates.
- Designed the UI and developed code in line with the above documents.
- Performed field-level data mapping.
- Designed the reports using the iReport tool and integrated the compiled .jasper files into the application (a sketch follows this list).
- Added criteria to the search functionality.
- Performed enhancements to existing functionality and implemented fixes to the production system.
- Prepared and reviewed documents (BRD, LLD, HLD, test cases) for enhancements.
- Checked software quality and determined its degree of compliance with standards.
- Handled the site administration process for critical updates.
- The application's performance increased tremendously.
- iReport is an open-source tool, whereas Actuate is licensed software.
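A minimal sketch of how a compiled .jasper template can be filled and exported through the standard JasperReports engine API, consistent with the integration described above. The template path, JDBC URL, and report parameter are illustrative, and the code is kept Java 1.6 compatible to match the environment.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.HashMap;
import java.util.Map;

import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;

// Illustrative report runner: fills a compiled .jasper template with data
// from an Oracle connection and exports the result to PDF.
public class SerReportRunner {
    public static void main(String[] args) throws Exception {
        Map<String, Object> params = new HashMap<String, Object>();
        params.put("REPORT_DATE", "2013-06-30"); // hypothetical report parameter

        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/SER", "user", "password");
        try {
            JasperPrint print = JasperFillManager.fillReport(
                    "reports/ScheduledExposureReport.jasper", params, conn);
            JasperExportManager.exportReportToPdfFile(print, "SER.pdf");
        } finally {
            conn.close();
        }
    }
}
```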
Environment: Windows, UNIX (Solaris), Java/J2EE, UNIX shell scripting, iReport 3.7.4, Perforce, Eclipse 3.5, WebLogic 10.3, Oracle
Confidential
Role: Sr. Implementation Engineer
Environment: Windows, UNIX (Solaris), Java/J2EE, UNIX shell scripting, WebSphere, Custom Studio, iReport, Oracle
Confidential
Role: Software Implementation Engineer
Environment: Windows, UNIX (Solaris), Java/J2EE, UNIX shell scripting, PuTTY, FileZilla, VSS, Custom Studio, Eclipse 3.5, WebLogic 10.3, Oracle