
Sr. Technical Specialist - Big Data Analytics/hadoop Developer Resume


SUMMARY:

  • Over 10 years of experience in the IT industry developing enterprise, web and distributed applications, with a focus on Big Data (Hadoop) and Java/J2EE technologies.
  • Cloudera Certified Hadoop Developer with more than three years of Big Data experience in the Hadoop ecosystem and related technologies (HDFS, MapReduce MRv1/MRv2, HBase, Hive, Pig, Impala, Spark, Scala, Akka, Spray API, Storm, Kafka, MRUnit and Elasticsearch).
  • Strong exposure to the industry's leading Hadoop distributions: Cloudera (CDH), Hortonworks (HDP) and Apache Hadoop.
  • Good experience with Amazon Web Services (AWS) - EC2, EBS, EMR, Kinesis, AWS Lambda and S3 - for Hadoop workloads.
  • Deep experience in Hadoop architecture and its components: HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • In-depth knowledge of the HDFS file system, including customizing it by writing custom data types and input/output formats.
  • Good experience optimizing MapReduce jobs using combiners and partitioners to deliver the best results.
  • Good experience loading data from RDBMS systems into HDFS using Sqoop and Flume.
  • Experience analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom UDFs.
  • Worked on NoSQL databases including HBase and MongoDB.
  • Very good working experience with AWS: managed multiple EC2 instances with load balancing and auto-scaling, and managed data in S3.
  • Research, evaluate and adopt new technologies, tools and frameworks in the Hadoop ecosystem.
  • Good experience with the Agile methodology and the Scrum model for delivering software services; experience in all phases of the SDLC, including analysis, design, development, coding and testing.
  • Good experience developing Java and J2EE applications using JSP and JDBC and open-source frameworks such as Struts, Spring, Hibernate and JUnit.
  • Experience developing RESTful web services.
  • Experience with configuration management tools such as GitHub, Subversion, CVS, ClearCase and TFS.
  • Experience with Continuous Integration (CI) processes and build-management tools such as Jenkins.
  • Experience with Log4j, unit-testing tools such as JUnit, and build tools such as Apache Maven, Ant and SBT.
  • Good implementation knowledge of J2EE design patterns.
  • Experience in onsite and offshore development models, with business analysis, requirement gathering and client interaction.
  • Strong analytical and problem-solving skills with good interpersonal and communication skills.
  • Experience with PCI DSS compliance to improve security standards in web-based applications and interface communication.
  • Extensive experience developing layered architectures using Struts, Spring (REST, MVC and the lightweight container) and the Hibernate ORM tool.
  • Extensive experience developing middle-tier solutions using Apache Axis web services, JWS and SOA.

TECHNICAL SKILLS:

Operating Systems: Windows, Linux (Ubuntu, CentOS and Redhat)

Hadoop Cluster: Setup, Installation, Cloudera CDH5.x, Hortonworks HDP 2.3, Apache Hadoop, Amazon EC2

Analytics Tools: Kibana, Tableau, HttpFS, WebHDFS and FuseDFS

Programming Skills: Java, J2EE, Scala, PHP

Web Technologies: JSP, Servlets, JSF, HTML, CSS, jQuery, XML, AJAX, Dojo

Frameworks: Struts, Hibernate, Spring (Core, MVC)

Application/Web Servers: Jetty 7, JBoss 5, Apache Tomcat 5, WebLogic 10.x, Apache 2.2.x.

Tracking tools: BMC Remedy, JIRA and Clarify

Agile tools: Rally and TFS

Version/Repo Tools: GitHub, ClearCase, SVN, CVS, VSS, StarTeam

Other Tools: JUnit, Maven, Ant, YourKit Profiler, JMeter, MarkLogic, XQuery, Python, Eclipse (IDE), Tamper, XSS Me, TogetherJ, XML Spy (Altova), SOAP UI, ZAP

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Technical Specialist - Big Data Analytics/Hadoop Developer

Tools/Tech Used: HDFS, MapReduce, Hive, Pig, Flume, Oozie, Zookeeper, HBase, Impala, Sqoop, Spark, Scala, Akka, Spray API, Kafka, Elasticsearch, MRUnit, JUnit, SBT, Eclipse, Maven

Responsibilities:

  • Architected the data analytics platform for healthcare solutions
  • Developed MapReduce programs for parsing streaming messaging data and loading it into HDFS
  • Developed real-time streaming jobs using Spark
  • Developed APIs to expose analytics data using the Spray API and Akka actors
  • Installed and configured Hive, wrote Hive UDFs, and created and worked with Hive tables using HiveQL and Spark SQL
  • Performed ad-hoc queries
  • Developed the platform architecture for incremental data processing and ingestion
  • Developed analytics APIs that are easy for customers to consume
  • Led complex integrations between AWS services (EMR, EC2, S3) and distributed legacy data centers in both R&D and production environments
  • Defined job flows and managed jobs using the Fair Scheduler
  • Worked on cluster coordination services through Zookeeper
  • Set up CDH- and HDP-compatible clusters
  • Set up AWS clusters for Hadoop
  • Used the Agile Scrum methodology to help manage and organize a team of 4 developers, with regular code-review sessions
  • Used MRUnit, JUnit and ScalaTest for Hadoop/Java unit testing
  • Ran a POC and then implemented the Oozie workflow engine to run multiple Spark, Hive and Pig jobs
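The Hive table work listed above typically combines an external table definition with HiveQL queries. The sketch below is illustrative only - the table name, columns and HDFS path are assumptions, not details from this resume:

```sql
-- External table: Hive imposes a schema on files already landed in HDFS
-- (e.g. by Flume) without taking ownership of the underlying data.
-- Table name, columns and LOCATION are hypothetical.
CREATE EXTERNAL TABLE IF NOT EXISTS message_events (
  event_time STRING,
  device_id  STRING,
  payload    STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/streams/messages';

-- Ad-hoc pre-processing query of the kind run before analysis.
SELECT device_id, COUNT(*) AS events
FROM message_events
GROUP BY device_id;
```

Using an EXTERNAL table is the usual choice for streamed data: dropping the table removes only the metadata, leaving the HDFS files intact for other jobs.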

Confidential

Technical/Analytics Lead

Tools/Tech Used: HDFS, Hive, Pig, Flume, Oozie, Zookeeper, HBase, Impala and Sqoop

Responsibilities:

  • Developed MapReduce programs for parsing streaming messaging data and loading it into HDFS
  • Developed Hive queries to pre-process the data for analysis by imposing a read-only structure on the stream data
  • Led complex integrations between AWS services (EMR, EC2, S3) and distributed legacy data centers in both R&D and production environments
  • Developed workflows using Oozie for running MapReduce jobs and Hive queries
  • Used Sqoop for exporting data into MySQL and Oracle
  • Worked with Agile methodologies and used Scrum in the process
  • Defined job flows and managed jobs using the Fair Scheduler
  • Worked on cluster coordination services through Zookeeper
  • Worked on loading log data directly into HDFS using Flume
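The Sqoop export mentioned above is driven from the command line. The following is a hedged sketch - the JDBC URL, credentials, table name and HDFS directory are placeholders, not details from this resume:

```sh
# Export analysis results from HDFS back into a MySQL table.
# Connection string, user, table and export directory are hypothetical.
sqoop export \
  --connect jdbc:mysql://db-host:3306/analytics \
  --username etl_user -P \
  --table daily_summary \
  --export-dir /user/hive/warehouse/daily_summary \
  --input-fields-terminated-by '\t'
```

The `-P` flag prompts for the password at runtime instead of exposing it on the command line, and `--input-fields-terminated-by` must match the delimiter the Hive table was written with.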

Confidential

Team Lead

Tools/Tech Used: Java 1.7, JSP, Spring REST, Spring Core, Spring MVC, JPA, Hibernate, Oracle 10g, web services, Windows, Tomcat 7, JBoss, Dojo, jQuery

Responsibilities:

  • Preparing the SFS based on the PRD
  • Coordinating with the product team
  • Working with Agile methodologies
  • Preparing the Production Design Document (PDD)
  • Implementing the CSSP runtime architecture
  • Taking ownership of development tasks

Confidential

Team Lead

Tools/Tech Used: Java 1.6, JSP, Spring REST, Spring Core, Spring MVC, JPA, Hibernate, Oracle 10g, web services, Windows, Dojo, jQuery

Responsibilities:

  • Preparing the SFS based on the PRD
  • Coordinating with the product team
  • Working with Agile methodologies
  • Preparing the Production Design Document (PDD)
  • Implementing the CSSP runtime architecture
  • Taking ownership of development tasks

Confidential

Tools/Tech Used: Java 1.6, JSP, Servlets, EJB, Struts, Oracle 10g, web services, Oracle WebLogic Server 11g (10.3.3.0), shell scripts, Unix, Solaris 10.0, Spring Core, Spring MVC and ESP Scheduler

Responsibilities:

  • Understanding the client requirements and project functionality
  • Preparing LOE estimates for newly added enhancements
  • Preparing functional and technical design documents
  • Developing the GUI using JSP, Servlets and the Struts validation framework
  • Setting up interaction with the IVR & BW functionality
  • Using an MVC framework architecture
  • Implementing SSO integration with the client's CSP initiative
  • Implemented new features, such as an online shopping cart (eCommerce), to make the application more reusable
  • Implementing PCI compliance initiatives
  • Implemented design patterns such as Value Object, DAO, Front Controller and Composite

Confidential

Technical/Team Lead

Tools/Tech Used: web services, Oracle WebLogic Server 11g (10.3.3.0), JBoss AS 4.2 GA, shell scripts, Unix, Solaris 10.0, cron, MS SQL, Clarify CRM, FireDaemon, IIS, SAT, iCare order config, ConceptWave and Java

Responsibilities:

  • Maintained and created WebLogic and JBoss application domains as needed
  • Provided constant support for production issues and interacted with users
  • Supported and set up new environments for enhancements by creating new domains
  • Provided clustered applications in distributed environments to handle load
  • Tuned application performance and configured the load balancer
  • Provided regular application support

Confidential

Sr. Software Developer

Tools/Tech Used: Java 1.5, JSP, Servlets, Struts 1.2, Spring 2.x, Hibernate 2.x, Oracle 9i, WebLogic Server 9.2, web services, XML and Solaris 8.0

Responsibilities:

  • Worked on the Infranet API/bridge to create a client-side web service adapter supporting the same methods as the Infranet API interface
  • Designed and developed the Infranet client API to interact with external systems (such as SAM, PAM and BusinessWare)
  • Ensured the provisioning interface to RADIUS remained unchanged for the initial phase
  • Developed an object-to-XML mapping layer using the Java API for XML Binding (JAXB)
  • Developed reusable components using custom tags
  • Used an MVC framework architecture
  • Implemented design patterns such as Value Object, DAO, Front Controller, Composite, Factory and Singleton
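The Value Object and DAO patterns listed above can be sketched in plain Java. This is a minimal illustration, not code from the projects described: the `AccountVO`/`AccountDao` names are hypothetical, and an in-memory map stands in for the Oracle-backed persistence the resume mentions.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Value Object: an immutable carrier of data between application tiers.
final class AccountVO {
    private final String id;
    private final double balance;

    AccountVO(String id, double balance) {
        this.id = id;
        this.balance = balance;
    }

    String getId() { return id; }
    double getBalance() { return balance; }
}

// DAO: hides the persistence mechanism behind a narrow interface,
// so callers never touch JDBC or vendor-specific APIs directly.
interface AccountDao {
    void save(AccountVO account);
    Optional<AccountVO> findById(String id);
}

// In-memory stand-in for a database-backed implementation (illustrative).
class InMemoryAccountDao implements AccountDao {
    private final Map<String, AccountVO> store = new HashMap<>();

    public void save(AccountVO account) {
        store.put(account.getId(), account);
    }

    public Optional<AccountVO> findById(String id) {
        return Optional.ofNullable(store.get(id));
    }
}
```

The benefit of the pattern is that an Oracle-backed `AccountDao` can replace the in-memory one without changing any calling code.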

Confidential

Software Developer

Environment: Java 1.3 to 1.5, JSP, Servlets, JDBC, EJB, Oracle 9i, WebLogic 7.0 to 9.2, Sun Server 8.1, Unix, Solaris 8.0

Responsibilities:

  • Migrated and developed Servlets, JSP and Java code based on specifications
  • Implemented Log4j in place of system properties
  • Implemented client- and server-side validations
  • Implemented the Struts validation framework
  • Implemented the Session Facade and Value Object design patterns

Confidential

Trainee

Tools/Tech Used: Java, SQL, Servlets, Tomcat

Responsibilities:

  • Involved in developing and integrating the bank bills & commercial papers components into the instrument framework
  • Unit testing
  • Received the Best Performer award for the team in 2007
  • Resolved performance-tuning and memory-leak issues on the TSA project
  • Identified a load balancer configuration issue on the EBPP/ePay project
  • Performed SSO integrations on the client CSP portal development for WCI
  • Enhanced eCommerce features on existing client applications, greatly pleasing the client in terms of cost savings and B2B revenue growth
  • Implemented the EDD (Electronic Document Delivery) project on a tight timeline, enabling end customers to view their monthly bill in PDF format, to the client's delight
  • Delivered PCI compliance initiatives in a short time
  • Identified tunneling issues between the IVR (BW) and ePay communication
  • Received appreciation for implementing nuBridges encryption logging
  • Received the ProStar award for 2011 from PRODAPT SOLUTIONS PVT LTD
  • Received the Best Performer award from CISCO SYSTEMS
  • Recognized for architecture work at PHILIPS
