
Principal Application Developer Resume


TX

SUMMARY

  • 9+ years of experience in OO analysis, design and development using Java/J2EE technologies, with experience in developing strategic methods for deploying Big Data technologies to efficiently solve Big Data processing requirements.
  • 3+ years of hands-on experience with Hadoop ecosystem technologies such as Pig, Hive, HBase, ZooKeeper, MapReduce, Oozie, Flume and Sqoop.
  • Involved in the design and architecture of Big Data solutions using the Hadoop ecosystem.
  • Hands-on experience in capacity planning, installation, monitoring and performance tuning of Hadoop clusters.
  • Proficient knowledge of Apache Spark and Apache Storm for processing real-time data.
  • Involved in designing and architecting applications that use NoSQL solutions such as HBase and MongoDB.
  • Wrote custom UDFs in Java for Hive and Pig, and custom SerDes for Hive.
  • Implemented custom sinks and interceptors for Flume (an illustrative interceptor sketch follows this summary).
  • Hands-on experience writing MapReduce programs in Java.
  • Imported and exported data into HDFS, Hive and HBase using Sqoop.
  • Wrote Oozie workflows and coordinator workflows to schedule Hadoop jobs.
  • Hands-on experience integrating search tools such as SolrCloud and Lucene with HDFS and HBase.
  • Integrated the analytical tool R with Hadoop using ORI.
  • Designed and architected the SOA platform at Confidential using Mule, ActiveMQ, Salesforce, Sterling and Informatica.
  • Developed web applications in the open-source Java framework Spring, utilizing the Spring MVC framework.
  • Configured and developed web applications in Spring, employing AOP and IoC.
  • Developed RESTful web services using Spring REST and the Jersey framework.
  • Experienced in creative and effective front-end development using Ext JS, jQuery, JavaScript, HTML, DHTML, Ajax and CSS.
  • Experienced with Hibernate for object-relational mapping; configured XML mapping files and integrated Hibernate with frameworks such as Spring and Struts.
  • Excellent relational database understanding and experience with Oracle 10g/11i, IBM DB2 7.x/8.x, SQL Server 2008 and MySQL 5.0/5.5.
  • Strong experience in database design and in writing complex SQL queries and stored procedures.
  • Worked on test-driven software development where JUnit testing was employed.
  • Experienced in using version control tools such as Perforce, GitHub, CVS, SVN, Harvest and VSS.
  • Extensive experience building and deploying applications on web/application servers such as JBoss AS, WebLogic, IBM WebSphere, GlassFish and Tomcat.
  • Experience working with Agile methodologies including XP, Scrum and Test-Driven Development.
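Illustrative example for the custom Flume interceptor work noted above: a minimal sketch that stamps every event with a fixed header before it reaches the channel. The class name, header key and value are hypothetical; the Interceptor and Interceptor.Builder contracts are from the Flume 1.x client API.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.interceptor.Interceptor;

// Stamps every event with a fixed header before it reaches the channel/sink.
public class SourceTagInterceptor implements Interceptor {

    @Override
    public void initialize() {
        // no state to set up
    }

    @Override
    public Event intercept(Event event) {
        // "source" and its value are hypothetical; real logic would derive them
        event.getHeaders().put("source", "appserver-01");
        return event;
    }

    @Override
    public List<Event> intercept(List<Event> events) {
        List<Event> out = new ArrayList<Event>(events.size());
        for (Event event : events) {
            out.add(intercept(event));
        }
        return out;
    }

    @Override
    public void close() {
        // nothing to release
    }

    // Flume instantiates interceptors through a Builder named in the agent config.
    public static class Builder implements Interceptor.Builder {
        @Override
        public Interceptor build() {
            return new SourceTagInterceptor();
        }

        @Override
        public void configure(Context context) {
            // no properties needed for this sketch
        }
    }
}
```

In the agent configuration such a class is referenced by its Builder, e.g. agent.sources.src1.interceptors.i1.type = com.example.SourceTagInterceptor$Builder (package name hypothetical).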

TECHNICAL SKILLS

Hadoop related Big Data Technologies: Hadoop, MapReduce, HDFS, Pig, Hive, Oozie, Sqoop, Flume, Spark, Mahout, Storm, Kafka, ZooKeeper.

NoSQL: HBase, Cassandra, MongoDB

Languages: Java/J2EE, SQL.

Web Technologies: JSP, EJB 2.0, JNDI, JDBC, HTML, JavaScript, DHTML, Ext JS, jQuery

Frameworks: Struts 1.x, Spring 3.x, Hibernate 3

MOM: ActiveMQ, JMS

SOA: Mule ESB, REST Services, SOAP web services.

Web/Application Servers: Tomcat, JBoss, WebLogic 12c, SAP Web Server

Databases: Oracle, DB2, Postgres, SQL Server.

Operating Systems: Windows, Unix

IDE: Eclipse 3.x, JDeveloper 12

Database Tools: TOAD, SQL Developer

Development Tools: Maven, ANT, Telnet, FTP

Reports: Pentaho, JASPER

Version Control: Git, Subversion, CVS, Perforce

Testing Technologies: JUnit 4/3.8, MRUnit, Mockito.

Portal: Liferay Portal

Mobile: PhoneGap, Android, iOS.

PROFESSIONAL EXPERIENCE

Confidential, TX

Principal Application Developer

Responsibilities:

  • Analyzed business requirements and existing software for High Level Design.
  • Designed detailed software structure and architecture documents using use cases, sequence diagrams and UML.
  • Worked in an agile development process with 4-week releases, monthly Sprints and daily Scrums.
  • Installed and Configured Hadoop Cluster.
  • Installed and Configured HBase, Hive, Sqoop and Oozie.
  • Installed and configured analytical tool R.
  • Wrote CRUD services on HBase.
  • Exposed the HBase CRUD services as RESTful web services using Jersey (see the sketch after this list).
  • Imported existing data to HBase and HDFS using Sqoop.
  • Created external tables in Hive on HBase tables.
  • Wrote custom UDFs in Hive.
  • Wrote a Sqoop incremental import job to move new/updated records from the database to HDFS and HBase.
  • Created an Oozie coordinator workflow to execute the Sqoop incremental job daily.
  • Integrated R with Hadoop using ORI.
  • Configured the Hive metastore on MySQL.
  • Used HiveServer2 as a service.
  • Connected Hive to OBIEE using the Hive ODBC driver.
  • Created dashboards in OBIEE.
  • Installed and Configured HBase replication cluster.
  • Involved in Hadoop and HBase cluster performance tuning.
  • Installed and configured ZooKeeper and set up the ZooKeeper quorum.
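An illustrative sketch of how HBase CRUD operations could be exposed over Jersey as described above, assuming a JAX-RS runtime and the era-appropriate HBase 0.9x client API. The resource path, table name, column family and qualifier are hypothetical, and a production version would pool connections rather than open an HTable per request.

```java
import java.io.IOException;

import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical JAX-RS resource exposing read/write access to one HBase column.
@Path("/customers")
public class CustomerResource {

    private static final Configuration CONF = HBaseConfiguration.create();
    private static final byte[] TABLE = Bytes.toBytes("customer"); // hypothetical table
    private static final byte[] CF = Bytes.toBytes("info");        // hypothetical column family
    private static final byte[] NAME = Bytes.toBytes("name");      // hypothetical qualifier

    // Read: fetch one row and return a single column as plain text.
    @GET
    @Path("/{id}")
    @Produces(MediaType.TEXT_PLAIN)
    public String get(@PathParam("id") String id) throws IOException {
        HTable table = new HTable(CONF, TABLE);
        try {
            Result result = table.get(new Get(Bytes.toBytes(id)));
            byte[] value = result.getValue(CF, NAME);
            return value == null ? "" : Bytes.toString(value);
        } finally {
            table.close();
        }
    }

    // Create/update: an HBase Put is effectively an upsert keyed by row id.
    @PUT
    @Path("/{id}")
    @Consumes(MediaType.TEXT_PLAIN)
    public void put(@PathParam("id") String id, String name) throws IOException {
        HTable table = new HTable(CONF, TABLE);
        try {
            Put put = new Put(Bytes.toBytes(id));
            put.add(CF, NAME, Bytes.toBytes(name));
            table.put(put);
        } finally {
            table.close();
        }
    }
}
```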

Confidential, NY

Sr. Software Engineer

Responsibilities:

  • Analyzed business requirements and existing software for High Level Design.
  • Designed detailed software structure and architecture documents using use cases, sequence diagrams and UML.
  • Worked in an agile development process with 4-week releases, monthly Sprints and daily Scrums.
  • Installed and Configured Hadoop Cluster.
  • Installed and Configured HBase, Sqoop and Oozie.
  • Wrote CRUD services on HBase.
  • Exposed the HBase CRUD services as RESTful web services using Jersey.
  • Imported existing data to HBase and HDFS using Sqoop.
  • Imported existing text files and meta files to HDFS and HBase using Flume.
  • Wrote a Flume Solr sink to index the blogs.
  • Wrote a Sqoop incremental import job to move new/updated records from the database to HDFS and HBase.
  • Created an Oozie coordinator workflow to execute the Sqoop incremental job daily.
  • Configured the Sqoop metastore.
  • Installed and Configured HBase replication cluster.
  • Involved in Hadoop and Hbase cluster performance tuning.
  • Installed and configured ZooKeeper.
  • Set up the ZooKeeper quorum.
  • Configured NameNode HA.
  • Wrote a servlet to call the NameNode /getImage URL.
  • Wrote an Oozie workflow to copy the fsimage into HDFS.
  • Wrote an Oozie coordinator workflow to run MR jobs when the fsimage file becomes available in HDFS.
  • Wrote an Oozie workflow to move MR output files to the RDBMS using Sqoop.
  • Wrote an MR job using the MapReduce API to process the fsimage file extracted through the Offline Image Viewer (see the sketch after this list).
  • Built a UI for administrators to perform operations by fetching data from the RDBMS using Ajax.
  • Built dashboards using D3.js.
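A minimal sketch of the kind of MapReduce job described above, assuming the fsimage has already been dumped to text with the Offline Image Viewer's delimited output and that the first tab-separated field of each line is the HDFS path. The class names and the per-top-level-directory count are illustrative, not the original job's logic.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Counts fsimage entries per top-level directory, assuming the first
// tab-separated field of each input line is the HDFS path.
public class FsImagePathCount {

    public static class PathMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text topDir = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String path = value.toString().split("\t", 2)[0];
            String[] parts = path.split("/");
            if (parts.length > 1) {
                topDir.set("/" + parts[1]); // e.g. /user, /tmp
                context.write(topDir, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "fsimage path count");
        job.setJarByClass(FsImagePathCount.class);
        job.setMapperClass(PathMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```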

Confidential, TX

Sr. Software Engineer

Responsibilities:

  • Analyzed business requirements and existing software for High Level Design.
  • Designed detailed software structure and architecture documents using use cases, sequence diagrams and UML.
  • Worked in an agile development process with 4-week releases, monthly Sprints and daily Scrums.
  • Installed and Configured Hadoop Cluster.
  • Installed and Configured Hive, Sqoop, Pig and Oozie.
  • Installed and configured analytical tool R.
  • Created Hive external tables.
  • Installed Flume agents to retrieve the eco meter sensor data.
  • Imported existing data to Hive and HDFS using Sqoop.
  • Configured Flume to aggregate the logs and publish them to HDFS.
  • Wrote custom UDFs in Hive (see the sketch after this list).
  • Wrote a Sqoop incremental import job to move new/updated records from the database to HDFS and Hive.
  • Created an Oozie coordinator workflow to execute the Sqoop incremental job daily.
  • Configured the Hive metastore on MySQL.
  • Used HiveServer2 as a service.
  • Connected Hive to Pentaho using the Hive ODBC driver.
  • Created reports using Pentaho.
  • Involved in Hadoop and Hive cluster performance tuning.
  • Wrote Pig scripts to extract information from logs and load the results into Hive tables.
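For illustration, a minimal custom Hive UDF of the sort described above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API. The masking rule and class name are hypothetical stand-ins for the original business logic.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Masks the local part of an e-mail address, keeping the first character
// and the domain. Illustrative only.
public class MaskEmail extends UDF {

    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        String value = input.toString();
        int at = value.indexOf('@');
        if (at <= 1) {
            return input; // nothing sensible to mask
        }
        return new Text(value.charAt(0) + "****" + value.substring(at));
    }
}
```

Once packaged into a jar, such a class is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in queries.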

Confidential, NY

Sr. Software Engineer

Responsibilities:

  • Involved in requirements analysis and gathering, and in converting requirements into technical specifications using UML diagrams: use case models, business domain models, activity and sequence diagrams, and state diagrams.
  • Applied object-oriented concepts (inheritance, composition, interfaces, etc.) and design patterns (Singleton, Strategy, etc.).
  • Installed and Configured ActiveMQ.
  • Installed and Configured Mule ESB.
  • Wrote a JMS consumer Mule action to listen for messages from Sterling.
  • Used the JAXB Mule action to parse XML data and convert it into Java objects.
  • Used the Salesforce Mule connector to connect to Salesforce.
  • Used the Salesforce upsert operation to insert/update data in Salesforce.
  • Wrote transformers to convert Java objects into the Salesforce-supported format (see the sketch after this list).
  • Load tested the application using SoapUI.
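A rough sketch of such a transformer, assuming the Mule 3.x Java API (AbstractMessageTransformer) and that the inbound payload is a list of JAXB-unmarshalled order objects. The Order class, its getters and the Salesforce field names are hypothetical placeholders for the real Sterling payload and Salesforce object model.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.mule.api.MuleMessage;
import org.mule.api.transformer.TransformerException;
import org.mule.transformer.AbstractMessageTransformer;

// Maps JAXB-unmarshalled order objects onto the List<Map<String, Object>>
// shape commonly passed to the Salesforce connector's upsert operation.
public class OrderToSalesforceTransformer extends AbstractMessageTransformer {

    @Override
    public Object transformMessage(MuleMessage message, String outputEncoding)
            throws TransformerException {
        @SuppressWarnings("unchecked")
        List<Order> orders = (List<Order>) message.getPayload();

        List<Map<String, Object>> records = new ArrayList<Map<String, Object>>();
        for (Order order : orders) {
            Map<String, Object> record = new HashMap<String, Object>();
            record.put("External_Id__c", order.getOrderNumber()); // hypothetical upsert key field
            record.put("Status__c", order.getStatus());           // hypothetical custom fields
            record.put("Amount__c", order.getTotal());
            records.add(record);
        }
        return records;
    }

    // Hypothetical stand-in for the JAXB-generated Sterling payload class.
    public static class Order {
        private String orderNumber;
        private String status;
        private Double total;

        public String getOrderNumber() { return orderNumber; }
        public String getStatus() { return status; }
        public Double getTotal() { return total; }
    }
}
```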
