Lead Developer Resume Profile
NJ
Summary:
- Sun Certified Java Programmer and Cloudera Certified Developer for Apache Hadoop with around 9 years of experience in architectural design and development of client/server applications using object-oriented programming in the Financial, Insurance, and Healthcare domains.
- Experience in cloud-based technologies/frameworks such as Big Data (Hadoop), including configuring, installing, benchmarking, and managing Apache Hadoop, Cloudera Hadoop, and MapReduce distributions.
- Expertise in Java/J2EE development along with Business Rule Engine (BRE) and Business Process Management (BPM) technologies and methodologies using Blaze Advisor/Advisor Smart Forms and Savvion.
- Designed and implemented complete end-to-end Hadoop infrastructure, including Hive, Sqoop, ZooKeeper, etc., along with data transformation and performance tuning of Hadoop clusters.
- Strong data modeling experience and relational database development including stored procedures, triggers, functions, indexes, and packages in Oracle and SQL Server.
- Exposure to automating Hadoop installation and configuration and maintaining the cluster using tools such as Puppet, OpenStack, and Cloudera Manager.
- Exposure to installing and configuring Hive; good skills at writing Hive queries that run as MapReduce jobs.
- Exposure to importing and exporting data from databases such as MySQL and Oracle into HDFS and Hive using Sqoop, and to writing shell scripts to dump shared data from MySQL servers to HDFS.
- Experience in creating partitions, views, and indexes on Hive tables, and knowledge of creating batch scripts for Hive.
- Built business solutions on the Hadoop platform for various machine-log and device-log sources.
- Good knowledge of the complete Hadoop ecosystem and extensive experience in understanding clients' Big Data business requirements and transforming them into Hadoop-centric solutions.
- Experience supporting data analysis projects using Elastic MapReduce (EMR) on the Amazon Web Services (AWS) cloud, including exporting and importing data to and from S3.
- Experience in designing both time-driven and data-driven automated workflows using Oozie.
- Excellent communication, analytical, presentation, and interpersonal skills, with the ability to work both independently and in a team environment. Enthusiastic about learning new technologies and quick to adapt to new environments.
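As a minimal illustration of the Hive partitioning pattern noted above, the sketch below mimics in plain Python what Hive does with a PARTITIONED BY date column (the record layout and field names are invented for the example):

```python
from collections import defaultdict

def partition_by_date(records):
    """Group records into per-date buckets, mimicking how a Hive table
    partitioned by a date column keeps one directory per partition."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec["event_date"]].append(rec)
    return dict(partitions)

rows = [
    {"event_date": "2014-01-01", "user": "a"},
    {"event_date": "2014-01-02", "user": "b"},
    {"event_date": "2014-01-01", "user": "c"},
]
parts = partition_by_date(rows)
# Partition pruning is the payoff: a query filtered on event_date
# only has to scan the matching bucket, not the whole table.
```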
Technical Skills:
Programming Languages: Java, C#/.NET, PL/SQL
Web/XML Technologies: HTML, CSS, JavaScript, JSON, jQuery, DWR, AJAX, Ext JS, XML
Java/J2EE Technologies: JSP, Servlets, EJB, Spring, Struts, ORM (iBatis/Hibernate), Web Services (SOAP/RESTful), .NET (ASP 2.0, C#)
SQL Databases: Oracle, MySQL, DB2, SQL Server
Big Data Ecosystem: Cloudera (MapReduce, Hive, Pig, ZooKeeper, Sqoop, HDFS)
NoSQL Databases: MongoDB, HBase
BPM / Rule Engines: Blaze Advisor Rules Engine, Savvion BPM, iBPM/Lombardi
Application Servers: WebSphere, Oracle Apps, WebLogic, Pramati, Tomcat, JBoss, IIS
Scripts: Bash, Python, ANT
Source Control: Rational ClearCase, CVS, SVN, Tortoise, GitHub
Other / Build Tools: RAD, JDeveloper, Eclipse, Jenkins, VPN, PuTTY, WinSCP, BladeLogic, Elasticsearch/Logstash
Operating Systems: Windows, Ubuntu, CentOS, Red Hat
Professional Experience:
Confidential
Lead Developer
Responsibilities:
- Responsible for writing MapReduce programs to move DB data into HDFS and into NoSQL (MongoDB) using a custom RecordReader and InputFormat.
- Imported and exported data into HDFS and Hive using Sqoop.
- Created Oozie workflows to automate the process and archive the DB data.
- Wrote Hive queries for data analysis and reporting.
- Implemented callback parameters for Hadoop job runs and their statuses, worked on PathFilters to read the required files on HDFS, and worked on reading JSON data. Also experienced in writing Combiners, custom Partitioners, custom Splits, custom RecordReaders, DistributedCache, joins, and filtering.
- Configured Hive, wrote Hive UDFs, and read JSON data using JSON SerDes.
- Exposed MongoDB data as RESTful web services for reporting and analyzed the data in the NoSQL DB (MongoDB).
- Designed the MongoDB schema, including indexing and sharding of data.
- Operated and maintained the Cloudera Hadoop cluster as well as the MongoDB cluster.
- Read log files using Elasticsearch/Logstash, alerted users on issues, and saved alert details to MongoDB for analysis.
- Worked on various POCs, such as using Flume to read log files, analyzing data with Pig, evaluating NoSQL DBs such as HBase against the requirements, and processing data with Spark.
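The log-monitoring work above (read logs, alert on errors, persist alert documents) can be sketched roughly as follows. This is a pure-Python stand-in with an invented log format; the actual pipeline used Elasticsearch/Logstash for ingestion and MongoDB for storage:

```python
import re

# Hypothetical log line format: "<date> <time> <LEVEL> <message>"
LOG_PATTERN = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def parse_line(line):
    """Parse one log line into a dict, or return None on mismatch."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def build_alerts(lines):
    """Keep only ERROR-level entries as alert documents; in the real
    pipeline these would be saved to MongoDB and pushed to users."""
    alerts = []
    for line in lines:
        entry = parse_line(line)
        if entry and entry["level"] == "ERROR":
            alerts.append({"timestamp": entry["ts"], "message": entry["msg"]})
    return alerts

sample = [
    "2014-05-01 10:00:01 INFO service started",
    "2014-05-01 10:00:05 ERROR connection refused",
]
alerts = build_alerts(sample)
```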
Confidential
Responsibilities:
- Designed and created Savvion workflows with various adapters (web services, DB, email adapter with the Velocity framework, etc.) for automated deployment.
- Part of a team of architects designing an enterprise application involving technologies such as cloud computing and BPM.
- Created Jenkins jobs for packaging deployment files and exposed the Jenkins jobs via Spring REST web services.
- Consumed BladeLogic SOAP web services and created BladeLogic packages.
- Created SOAP web services for workflow tracking and reporting.
- Created a rules module for error handling and tracking, and rules to insert workflow-tracking data into the DB using iBatis.
- Developed a cron job using the Spring Quartz scheduler that places messages on an IBM message queue.
- Worked on Sencha Architect, creating the UI using Ext JS, jQuery, AJAX, etc.
- Extensively worked on Spring modules, including Spring Core, AOP, MVC, and the ORM/Transaction module.
- Implemented the DAO layer using Spring and iBatis.
- Implemented a Spring web services layer over the database using JPA.
- Extensively worked on RESTful and Spring web services.
- Heavily used the Spring Core, DAO, ORM, and AOP modules.
- Involved in designing the Java web and business layer architecture, and proposed several ideas that reduced manual work.
- Led the effort to reach out to potential consumers and producers of this architectural design to collect system requirements and analyze potential future conflicts.
- Developed use case diagrams, class diagrams, database tables, and mappings between relational database tables.
Technologies: Java/J2EE, Hadoop, Savvion, BizPulse Rules, iBatis, Elasticsearch, Logstash, Sqoop, Flume, MongoDB, Spring, Web Services, BladeLogic, Jenkins, JSON, jQuery, AJAX, Ext JS, Oracle, WebSphere, Tomcat, etc.
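The workflow-tracking inserts described above can be sketched in miniature. Here sqlite3 stands in for the Oracle/iBatis persistence layer, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory DB stands in for the real workflow-tracking store.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE workflow_tracking (
           workflow_id TEXT,
           step        TEXT,
           status      TEXT
       )"""
)

def track(workflow_id, step, status):
    """Record one workflow step transition; iBatis mapped a similar
    parameterized INSERT in the actual rules module."""
    conn.execute(
        "INSERT INTO workflow_tracking VALUES (?, ?, ?)",
        (workflow_id, step, status),
    )
    conn.commit()

track("wf-001", "package", "STARTED")
track("wf-001", "package", "COMPLETED")
rows = conn.execute("SELECT COUNT(*) FROM workflow_tracking").fetchone()[0]
```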
Confidential
Sr. Analyst Programmer
Responsibilities:
- Followed agile (Scrum) software development practices, including pair programming and test-driven development.
- Installed and configured a Hadoop cluster for the development and testing environments.
- Consumed/integrated external web services by generating stubs, and interacted with the DB by writing DAO classes.
- Developed a generic UI framework/utility for making dynamic AJAX calls and surfacing consumed web service details in the UI using Spring MVC, jQuery, and DWR.
- Used Axis2 to prepare web service calls.
- Exported data from DB2 to HDFS using Sqoop.
- Developed MapReduce programs to apply business rules to the data.
- Developed and executed Hive queries to denormalize the data.
- Moved data from Hadoop to Cassandra using the BulkOutputFormat class.
- Implemented the Fair Scheduler on the JobTracker to share cluster resources among MapReduce jobs.
- Automated the workflow using shell scripts.
- Created Hive queries to compare raw data with warehouse reference tables and perform aggregations.
- Performance-tuned Hive queries written by other developers.
- Defined Oozie workflows to automate the tasks of loading data into HDFS.
- Developed unit test cases using JUnit.
- Involved in security authentication, architectural design and code review.
- Testing and validating the changes to meet the requirements.
Technologies: Java/J2EE (EJB, Spring MVC, AOP, Spring Batch), Web Services (SOAP, RESTful), iBatis, Savvion, Blaze Advisor 6.9, HTML, JSON, jQuery, DWR, AJAX, Ext JS, Oracle DB, Hadoop, Hive, Pig, Sqoop, etc.
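The "MapReduce programs applying business rules" pattern above can be illustrated with a small Python simulation. The flag-transactions-over-10,000 rule is invented for the example; the real jobs ran in Java on the Hadoop cluster:

```python
from itertools import groupby
from operator import itemgetter

def mapper(record):
    """Map phase applies the business rule: emit (account, amount)
    only for transactions over the (invented) 10,000 threshold."""
    account, amount = record
    if amount > 10000:
        yield (account, amount)

def reducer(key, values):
    """Reduce phase sums the flagged amounts per account."""
    return (key, sum(values))

records = [("acct-1", 15000), ("acct-1", 500), ("acct-2", 20000)]

# Shuffle/sort: collect mapper output and group by key, as the
# framework would between the map and reduce phases.
mapped = sorted(
    (pair for rec in records for pair in mapper(rec)),
    key=itemgetter(0),
)
results = dict(
    reducer(k, [v for _, v in group])
    for k, group in groupby(mapped, key=itemgetter(0))
)
```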
Confidential
Technology Specialist
Responsibilities:
- Responsible for the architectural design, development/implementation, and maintenance of third-party open source solutions for Hadoop.
- Set up the Hadoop cluster for the development and testing environments.
- Imported data into HDFS from relational databases using Sqoop.
- Developed MapReduce jobs to apply business rules to the data.
- Developed Hive queries to pre-process the data for analysis by imposing a read-only structure on the stream data, and executed Hive queries to denormalize the data.
- Automated the workflow using Oozie.
- Created and managed instances, including the number of CPUs, RAM, and disk storage required, using OpenStack.
- Deployed and managed applications using Puppet manifests and a Puppet master.
- Wrote Puppet scripts and tested them on deployed instances.
- Experienced in managing and reviewing Hadoop log files.
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
Technologies: Java/J2EE, Hadoop, Hive, Sqoop, HDFS, Oozie, OpenStack, Oracle, Puppet
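At its core, the Oozie automation above means running dependent actions in order. A minimal Python sketch of that idea follows (the action names are hypothetical; Oozie defines the same dependencies in workflow XML):

```python
def run_workflow(targets, dependencies):
    """Run actions in dependency order: a toy stand-in for an Oozie
    workflow where sqoop-import must finish before hive-load, etc."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for dep in dependencies.get(name, []):
            run(dep)                # prerequisites first
        order.append(name)          # a real runner would launch the job here
        done.add(name)

    for target in targets:
        run(target)
    return order

deps = {"hive-load": ["sqoop-import"], "report": ["hive-load"]}
order = run_workflow(["report"], deps)
```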
Confidential
Technology Specialist / Client Coordinator
Responsibilities:
- Coded and debugged DAO and service layers, and prepared JUnit test cases.
- Technical troubleshooting and crisis resolution; team mentoring.
- Customer interaction and solution presentation.
- Requirements understanding and translation into technical implementation.
- Participated in technical reviews as SME (Architecture, Design, and Code).
- Promoted coding standards and design principles.
- Wrote unit test cases and validated QA functional test cases for all scenarios.
- Diagnosed critical problems and developed solutions.
- Responsible for the overall systems development life cycle and deliverables.
- Worked on issues, working with the business team to gather requirements and rolling out changes or enhancements to the application.
- Updated business rules and fixed bugs.
- Prepared and updated documentation such as the Developer's Guide, Architecture Guide, and Change Control Management Plan.
Technologies: Java/J2EE, WebSphere 6.0/7.0, RAD 7.5, IBM MQ, Message Queue Explorer, XML, EJB, Spring, Blaze Advisor Rules Engine 6.5/6.9 (Java)
Confidential
Lead Developer / Client Coordinator
Responsibilities:
- Served as module lead and mentor, managing all L1 team members/developers and introducing process improvements to enhance performance from the client's perspective.
- Worked with the business team to gather requirements and roll out changes or enhancements to the application.
- Troubleshot issues occurring in production with minimal turnaround time.
- Evaluated new technologies and concepts.
- Collaborated with product management and QA to ensure end-to-end product delivery.
- Contributed technical artifacts/collateral to the organization, such as the PMP, Test Plan, Design documents, FRS, etc.
Technologies: Java/J2EE, Savvion 7.0, BizPulse Rule Engine, BPM Studio as IDE, WebLogic 9.2, Oracle, and supporting tools such as SVN, Clarity, Mercury, etc.
Confidential
Lead Developer / Client Coordinator
Responsibilities:
- Worked with business users and underwriters to gather requirements and build the product-specific rules template and rules ledger.
- Prepared project management documentation such as the PMP, Test Plan, Design documents, FRS, Rules Ledgers, etc.
- Deployed underwriting rules as web services, designed the UIs, and integrated the UIs with the rules service.
- Customized the business object model XSD according to requirements and implemented the changes in both the UI and the rules.
- Analyzed the requirements and built the business object model XSD for various products.
- Designed a generic template consisting of templates and providers according to the business requirements; defined and tested rules according to the requirements.
- Built custom providers to integrate with the database and customized the RMA according to requirements.
- Deployed the application and integrated all the products across different projects as a single web service.
- Developed a test client and integrated it with the web service for testing the products.
Technologies: Java/J2EE, Blaze Advisor Rules Engine 6.7, .NET (C#, ASP), Web Services, IIS Server, XML, SQL Server, Savvion
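The underwriting-rules pattern above boils down to condition/action pairs evaluated against an application. A tiny Python sketch follows; the age rule is invented for illustration, and the real rules lived in Blaze Advisor templates:

```python
def evaluate(rules, application):
    """Apply each matching rule's action to the application, mimicking
    one rules-engine pass over an underwriting submission."""
    for condition, action in rules:
        if condition(application):
            action(application)
    return application

# Invented example rules: decline applicants under 18, otherwise approve.
rules = [
    (lambda app: app["age"] < 18,
     lambda app: app.update(decision="DECLINED")),
    (lambda app: app.get("decision") is None,
     lambda app: app.update(decision="APPROVED")),
]

result = evaluate(rules, {"age": 30, "decision": None})
```

Keeping each rule as a separate (condition, action) pair is what made the ledger-driven templates maintainable: underwriters could add or change a rule without touching the evaluation loop.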