Senior Developer/Tech Lead Resume
NJ
SUMMARY:
- 7+ years of experience working with and processing high data volumes using Hadoop, Oracle SQL, and ETL methods, and reporting the computed data through Java web-based applications.
- Experience in installing Hadoop in standalone and cluster setups.
- Experience in writing UDFs and using UDAF functions in Hive queries (see the UDF sketch at the end of this summary).
- Experience in evaluating and choosing among Hive table storage formats.
- Experience in creating Hive HBase tables for reporting screens, enabling faster data access through HBase.
- Experience in building ETL jobs for data processing.
- Experience in creating data lineage for all processed data.
- Experience in job scheduling using Oozie.
- Experience in writing Java JDBC methods for extracting data from Hive/Phoenix tables and views.
- Experience in query optimization.
- Experience in writing queries for Hive and Phoenix.
- Experience in writing MapReduce through Hive and Java, and in using Tez for faster computation.
- Experience in writing Java methods that build Hive/Phoenix queries with filter, sort, and aggregate functions.
- Experience in writing Sqoop jobs to import/export data between different databases and HDFS.
- Experience in using a Data Lake to extract data.
- Experience in writing BTEQ/MultiLoad query processes in Teradata.
- Experience in writing Pig scripts using Pig Latin joins.
- Experienced in web application development using Spring MVC.
- Experience in using SVN for continuously integrated development.
- Experience in writing Oracle SQL queries and PL/SQL.
- Experience in working with JBoss and Tomcat application servers.
- Excellent interpersonal and technical skills; quick learner and self-motivated.
- Experience in development using ETL tools.
- Designed and developed a web-based reporting dashboard for end users using Spring MVC, Java, JDBC, Java Servlets, jQuery, JavaScript, and JSON.
- Experienced with Eclipse and scripting languages such as Shell/Perl.
- Experienced in building web services with SOAP.
- Experienced with Agile software development practices, including the Scrum Master and Product Owner roles.
- Ability to learn new tools and technologies quickly and apply the right solutions to issues.
- Experience in application development across various domains: Communications, Revenue Leakage, Pharmacy, Insurance, and eCommerce.
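Below is a minimal sketch, assuming the older org.apache.hadoop.hive.ql.exec.UDF API, of the kind of Hive UDF referred to above; the class name and the normalization it performs are illustrative, not taken from any actual project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative Hive UDF: trims and upper-cases a string value before it is used as a filter variable.
    public final class NormalizeCode extends UDF {
        public Text evaluate(final Text input) {
            if (input == null) {
                return null; // pass NULLs through unchanged
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF would be registered in Hive with ADD JAR followed by CREATE TEMPORARY FUNCTION normalize_code AS 'NormalizeCode'.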
TECHNICAL SKILLS:
Operating System: UNIX, Linux, Windows 2000/XP/2007
Programming Languages: Core Java, C++ (OOP)
Big Data: Hive, HBase, Oozie, Phoenix, UDF, Sqoop, Teradata
Standards & Trends: Agile Scrum Programming Practice.
Distributed Technologies: J2EE, Web Services, IBM WPS
Scripting Languages: JavaScript, HTML, CSS, XML, XSL
Modeling Languages: Knowledge of design patterns
RDBMS: Oracle 10g/9i
Application Servers: JBoss, Apache Tomcat
Framework: Spring Core, Spring MVC
Editors: Notepad++
Tools: Toad, SQL Developer, Putty, WinSCP, HP ALM, Splunk, JIRA.
Version Control Systems: Git, SVN
Agile Tool: Rally / TDP.
PROFESSIONAL EXPERIENCE:
Confidential, NJ
Senior Developer/Tech Lead
Responsibilities:
- Working as a lead development engineer and support engineer for the Express Scripts eCommerce website.
- Data extraction from raw files into HDFS using external tables.
- Writing MapReduce programs for data comparisons.
- Generic Rx pricing batch processing for individual client/customer/OE/OE onboarding based on rules and coverage policies using Hive (MapReduce & Tez).
- Understanding and learning legacy code such as DataPower and SOFEA Mule, as most of the modules are legacy code.
- Conversion of XML values to Hive filter variables using Java UDFs, and end-to-end data testing using SoapUI.
- Optimization of Phoenix queries and tables for faster data processing when queries use joins.
- Hands-on experience with the NoSQL store HBase, using Java to extract and update data (see the client sketch after this project).
- Strong hands-on business knowledge of deductible, copay, and out-of-pocket concepts.
- Extraction of eligibility records from other vendors using Sqoop or the Data Lake.
- Working across different business and tech teams to gather the required product information and identify the root cause of data processing issues.
- As part of the support team, ensuring 24x7 availability of the eCommerce website.
Environment: Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, Pig Latin, Phoenix, AJAX, Spring services, Spring MVC, SQuirreL SQL.
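A minimal sketch, using the standard HBase Java client API, of the kind of read access mentioned in this project; the table name, column family, and qualifier (rx_pricing, d, copay) are placeholders rather than the actual schema.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    // Illustrative single-row lookup by key; names are placeholders.
    public class PricingLookup {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("rx_pricing"))) {
                Get get = new Get(Bytes.toBytes(args[0])); // row key passed on the command line
                get.addColumn(Bytes.toBytes("d"), Bytes.toBytes("copay"));
                Result result = table.get(get);
                byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("copay"));
                System.out.println(value == null ? "not found" : Bytes.toString(value));
            }
        }
    }

Updates would follow the same pattern, building a Put with the new cell values and calling table.put(...).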
Confidential, Dallas
Senior Developer/Tech Lead
Responsibilities:
- Worked as a lead developer building large-scale batch processing jobs to detect revenue leakage, with dashboard visibility over the data.
- Solid experience in designing and implementing MVC architecture in Spring.
- Hands-on experience installing and configuring Apache Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, and Pig.
- Data Lake setup in production; transferring and extracting data using curl and built-in scripts.
- Good understanding of both relational databases and Hadoop (Hive & Phoenix); built a Java query builder covering selecting, sorting, filtering, HAVING clauses, and arithmetic for the ETL and reporting dashboard (see the sketch after this project).
- Expertise in configuring and creating Phoenix views over Hive or HBase for faster access from the UI.
- Experience in data ingestion between HDFS and relational database management systems using Sqoop import and export.
- Experience working with MVC architecture and the Spring Core framework, and in front-end application design using jQuery, JavaScript, HTML, and CSS.
- Designed and developed UI components and services to auto-create cases using configured rules.
- Designed and developed backend Java to create a web-based ETL for Hive/HBase/Phoenix data processing.
- Designed Java UDFs for custom/complex data processing and parsing in Hive.
- Experience working with the onsite-offshore development model; played the role of onsite coordinator.
- End-to-end experience as Agile Scrum Master.
Environment: Java 8/J2EE, jQuery, JavaScript, HTML5, CSS, Spring MVC/Core, Apache Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, Pig, Phoenix, Oozie, AJAX, Spring Web Services, SQuirreL SQL.
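A minimal sketch of the kind of Java query builder described in this project; the class name, fluent API, and example table are illustrative assumptions rather than the project's actual code.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative fluent builder that assembles a Hive/Phoenix SELECT statement.
    public class ReportQueryBuilder {
        private final String table;
        private final List<String> columns = new ArrayList<>();
        private final List<String> filters = new ArrayList<>();
        private String orderBy;

        public ReportQueryBuilder(String table)        { this.table = table; }
        public ReportQueryBuilder select(String col)   { columns.add(col); return this; }
        public ReportQueryBuilder where(String clause) { filters.add(clause); return this; }
        public ReportQueryBuilder orderBy(String col)  { this.orderBy = col; return this; }

        public String build() {
            StringBuilder sql = new StringBuilder("SELECT ");
            sql.append(columns.isEmpty() ? "*" : String.join(", ", columns));
            sql.append(" FROM ").append(table);
            if (!filters.isEmpty()) {
                sql.append(" WHERE ").append(String.join(" AND ", filters));
            }
            if (orderBy != null) {
                sql.append(" ORDER BY ").append(orderBy);
            }
            return sql.toString();
        }
    }

For example, new ReportQueryBuilder("leakage_summary").select("region").where("amount > 0").orderBy("region").build() yields SELECT region FROM leakage_summary WHERE amount > 0 ORDER BY region.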
Confidential
Senior Developer
Responsibilities:
- Worked as a developer on ETL applications that check KRAs (Key Risk Areas) in Revenue Assurance.
- Developed UI components interacting with Hive/Phoenix services to provide analytics reports to clients (see the JDBC sketch after this project).
- Worked as a support engineer, maintaining the KRA (Key Risk Area) checks and keeping them available.
- Ticket management and resolution, and job automation.
- Installation of Apache Hadoop, Hive, and HBase in the lower environment for development and proposal purposes.
- Worked as an ORT (Operation Readiness Tester) for other vendors' developments.
- Scripting using Shell/Perl for input file processing.
Environment: Core Java, J2EE, Spring, Spring Web Services, Spring MVC, Apache Hadoop, Hive, HBase, cVidya, Oracle 10g, JUnit, PuTTY, Notepad++, PL/SQL, JDBC, Shell/Perl.
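A minimal sketch, assuming a HiveServer2 JDBC connection with the Hive JDBC driver on the classpath, of how a Java method might pull data for such an analytics report; the URL, credentials, table, and column names are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Illustrative HiveServer2 query feeding a KRA analytics report.
    public class KraReportDao {
        private static final String URL = "jdbc:hive2://hiveserver:10000/default"; // placeholder host

        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver"); // ensure the driver is registered
            try (Connection conn = DriverManager.getConnection(URL, "user", "");
                 PreparedStatement stmt = conn.prepareStatement(
                         "SELECT kra_name, leakage_amount FROM kra_summary WHERE run_date = ?")) {
                stmt.setString(1, args[0]); // report date passed on the command line
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
                    }
                }
            }
        }
    }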
Confidential
C++ Development/Enhancement
Responsibilities:
- Worked as a developer building components for other projects using C++ / OOP.
- Fixing bugs and working on enhancement requests.
- Worked in different scripting languages to build custom scripts.
- Creating makefiles.
Environment: C++, UNIX, Putty, Notepad++, Toad.