Sr. Big Data Consultant Resume
PROFESSIONAL SUMMARY:
- Around 11 years of experience in the Information Technology domain, with key emphasis on technical lead and project lead/management roles.
- 5 years of extensive experience in Hadoop development and administration with strong expertise in MapReduce and Hive.
- Involved in all levels of requirement analysis, development, testing and debugging, and deployment to production.
- Good exposure to the Retail, Banking and Telecom domains.
- Expertise in Hadoop technologies: HDFS, MapReduce, Hive, Sqoop, Pig, HBase, Flume, Zookeeper, AWS, S3, Python, Oozie, Spark, Storm, Elastic Search, Presto, NiFi and Kafka.
- Experience in installation of Big Data/Hadoop and data science tools, and in analysis, estimation, design, construction, problem solving, ongoing maintenance and enhancements for business needs.
- Experience in real-time analytics using the latest Big Data technologies and in recommending Big Data solutions.
- Expertise in Object Oriented Analysis and Design (OOAD), OOPS using Unified Modeling Language (UML), Design Patterns (MVC, Singleton, Facade, Factory, DAO, etc.).
- Experience in migrating data from RDBMS (MySQL, SQL Server and Teradata) to the Hadoop environment.
- Extensive experience in writing SQL/Analytical queries as per the business specifications
- Experience in designing and developing web applications using Java, J2EE, JSP, MVC frameworks (Struts, Spring), Hibernate, XML, JSON, AJAX and JavaScript.
- Experience and good understanding of RDBMS (Relational Database Management Systems) such as Oracle; coding experience in stored procedures, functions, triggers and assertions.
- Experience in developing and implementing Web Services using SOAP, WSDL and UDDI.
- Good working knowledge of ANT, Log4J, XML, XSLT (DOM, SAX), Multithreading, Collections, Exceptions.
- Experience with software methodologies such as Agile (Extreme Programming (XP) and Scrum).
- Effective team player and self-starter with high adaptability to new technologies and excellent analytical, problem-solving, communication and interpersonal skills.
- Extensive experience in data-driven software development and implementation
- Strong knowledge of SQL and database modeling; specializing in queries for a hierarchical data structure
- Specializing in Web application development with Java and Open Source Software.
- Experience in handling and supporting customers in production systems and handling ad-hoc development requests from the business team.
- Cloudera Certified Developer for Apache Hadoop (CCD-410)
- Sun Certified Java Programmer Version 5.0
TECHNICAL SKILLS:
Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, HBase, Flume, Zookeeper, Oozie, Kafka, Tableau, Spark, Scala, Presto, Elastic Search, Phoenix, NiFi
No-SQL: HBase, MongoDB
Languages: Java, JavaScript, J2EE, HTML, XML, UML, SQL
Development Tools: Eclipse, MyEclipse, NetBeans, STS, IntelliJ, Informatica PowerCenter
Web Servers: WebSphere, WebLogic, JBoss, Tomcat
Open Source Frameworks: Spring, Struts, JavaServer Faces
Databases: Oracle, SQL Server, MySQL, PostgreSQL, Teradata
Middleware: SOAP/RESTful web services, MQ, RabbitMQ
Operating Systems: Windows 7/Vista/NT/XP, UNIX, Linux, Ubuntu
Design Methodologies: UML Modeling, Design Patterns, Rational Rose
Domain Knowledge: Retail, People Systems, Banking, Telecom
Version Control: CVS, Rational ClearCase, Subversion, GitHub
Others: JMS, Ant, Maven, Log4J, CSS, Ajax
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Big Data Consultant
Responsibilities:
- Designed an ETL flow to collect data from different sources (Oracle and Teradata) using Sqoop.
- Performed data transformations, filtering, sorting and aggregation using Hive, Pig, MapReduce and Spark (Scala).
- Worked on automating the Sqoop and Hive jobs.
- Used shell scripts for profiling and later converted them to Spark for better processing.
- Ingested cleansed data into Hive from Oracle and Teradata. Generated reports using Tableau.
- Wrote custom UDFs in Hive (an illustrative sketch follows the Environment line below).
Environment: HDP 2.3.4 (Hortonworks), Java, Spark, Scala, PySpark, Python, Hive, Pig, Tableau, Sqoop, Cron jobs, MapReduce and Oozie
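A minimal sketch of the kind of custom Hive UDF referenced above (the class name and normalization logic are hypothetical, not the project's actual UDF); such a class is packaged into a JAR, registered with ADD JAR, and exposed with CREATE TEMPORARY FUNCTION:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Trims and upper-cases a string column; illustrative only.
    public final class NormalizeString extends UDF {
        public Text evaluate(final Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }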
Confidential
Sr. Big Data Consultant
Responsibilities:
- Designed an ETL flow to collect data from different sources (SQL databases and Teradata) using NiFi and Sqoop.
- Built a scalable Kafka cluster to queue the data collected from different sources (an illustrative producer sketch follows the Environment line below).
- Developed Storm topologies in Java to parse the call detail records.
- Performed data transformations, filtering, sorting and aggregation using Hive, Pig, MapReduce and Spark (Scala).
- Worked on automating the Sqoop, Hive and Spark jobs using Oozie.
- Ingested cleansed data into Elastic Search, HBase and Hive.
- Generated reports using Tableau and Kibana
Environment: HDP 2.3.4 (Hortonworks), Ambari, RHEL 6.5, Java, Scala, NiFi, Presto, HBase, Kafka, Spark, Python, Elastic Search, Kibana, Hive, Tableau, Sqoop, Pig, MapReduce and Oozie
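A minimal sketch of the kind of Java producer used to push parsed call detail records onto the Kafka cluster; the broker list, topic name and record contents are placeholders, not the actual cluster values:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class CdrProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092,broker2:9092"); // placeholder brokers
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // In the real flow the value would be a parsed call detail record.
                producer.send(new ProducerRecord<>("cdr-topic", "call-123", "caller,callee,duration"));
            }
        }
    }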
Confidential
Lead Hadoop Consultant
Responsibilities:
- Load and transform large sets of structured and semi-structured data.
- Pre-processing using Hive and Pig.
- Loading data into Hive partitioned tables
- Requirement gathering and preparing the design.
- Worked with different business teams and stakeholders for each track.
- Responsible for analyzing and managing data coming from various sources.
- Designing, building, installing, configuring and supporting Hadoop.
- Translate complex functional and technical requirements into detailed design.
- Perform analysis of vast data stores and uncover insights.
- Work closely with the business and analytics team in gathering the system requirements.
- Implemented solutions using Hadoop, HBase, Hive, Sqoop, Java API, etc.
- Exported and imported data into HDFS, HBase and Hive (an illustrative HBase write sketch follows the Environment line below).
- Managing and deploying in production.
- Coordinate between the Business and the Off-shore team.
Environment: Hortonworks, HDFS, Core Java, MapReduce, Informatica PowerCenter, Hive, Pig, Flume, Storm, Spark, Scala, Hue, Sqoop, Shell script, UNIX, Oracle, Toad, DMF, ActiveMQ, RESTful web services.
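A minimal sketch of an HBase write via the Java client API of the kind used above; the table name, column family and values are placeholders, not the project's actual schema:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseLoadExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("customer_data"))) {
                // Write one cleansed record keyed by a placeholder row key.
                Put put = new Put(Bytes.toBytes("row-0001"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("name"), Bytes.toBytes("Jane Doe"));
                table.put(put);
            }
        }
    }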
Confidential
Sr Hadoop Developer
Responsibilities:
- Involved in gathering functional and non-functional requirements.
- Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
- Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (an illustrative mapper sketch follows the Technologies line below).
- Analyzed the data by performing Hive queries and running Pig scripts to understand user shopping behavior.
- Extracted the data into HDFS using Sqoop.
- Continuous monitoring and managing the Hadoop cluster.
- Developed Hive queries to process the data and generate the data cubes for visualizing.
- Preparing the dashboards with the data using Tableau.
- Involved in loading data from UNIX file system to HDFS.
Technologies: CDH, HDFS, Core Java, MapReduce, Hive, Pig, Flume, Storm, Elastic Search, Scala, Spark, Kibana, Shell scripting, UNIX, RESTful web services.
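A minimal sketch of a map-only cleansing mapper of the kind mentioned above; the comma delimiter and expected field count are assumptions for illustration:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CleansingMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
        private static final int EXPECTED_FIELDS = 5; // assumed record width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",", -1);
            if (fields.length != EXPECTED_FIELDS) {
                return; // drop malformed records
            }
            StringBuilder cleaned = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) {
                    cleaned.append(',');
                }
                cleaned.append(fields[i].trim());
            }
            context.write(NullWritable.get(), new Text(cleaned.toString()));
        }
    }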
Confidential
Senior Software Engineer
Responsibilities:
- Interaction with the business on the business scope.
- Worked on several data sets (weather, agriculture and human resources) available on data.gov to practice MapReduce, Pig and Hive in virtual machine mode.
- End-to-end implementation and analysis for fraud detection using core Java MapReduce.
- Replicated the SQL functionality in Pig and implemented the same using Hive.
- Conducted Hadoop training sessions for team resources.
Environment: Java, J2EE, Eclipse, Hadoop, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Spark, Scala, Flume, Presto, Kibana, Hue, Oozie, Linux
Confidential
IT Analyst
Responsibilities:
- Analyzing application requirements, creating the low-level design and developing the proposed solution using Core Java, Spring RESTful web services, J2EE, Tomcat 6.0 and JBoss 6.0.
- Incorporated various Structural, Behavioral and Creational design patterns while implementing the solutions.
- Developed the application using the Spring architecture.
- Developed web services for inter-organization operations and for retrieving customer information from the central database using RESTful services.
- Used Spring JDBC to store persistent data in the database and to access data from the database (an illustrative DAO sketch follows the Environment line below).
- Developed Class Diagrams, Sequence Diagrams, Activity Diagram and Use Cases in Rational Rose.
- Implemented client-side validations using Spring bean annotations and custom validators.
- Involved in Code Review with Team and task assignments.
- Used Spring Framework for Dependency Injection.
- Developed test cases and performed testing using JUNIT.
- Monitored the error logs using Log4J and fixed the problems.
Environment: JDK 1.6, Spring RESTful web services, Spring JDBC, STS (Eclipse 3.2), JBoss 6.0, Log4J, JUnit, JNDI, Ext JS.
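A minimal sketch of a Spring JDBC data-access class of the kind described above; the CustomerDao name, table and columns are placeholders, not the application's real schema:

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.RowMapper;

    public class CustomerDao {
        private final JdbcTemplate jdbcTemplate;

        public CustomerDao(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        // Persist one customer row.
        public void save(String id, String name) {
            jdbcTemplate.update("INSERT INTO customer (id, name) VALUES (?, ?)", id, name);
        }

        // Read back all customer names.
        public List<String> findAllNames() {
            return jdbcTemplate.query("SELECT name FROM customer", new RowMapper<String>() {
                public String mapRow(ResultSet rs, int rowNum) throws SQLException {
                    return rs.getString("name");
                }
            });
        }
    }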
Confidential
IT Analyst
Responsibilities:
- Interaction with Amex Business on the scope of the project.
- Collecting the high-level requirements and analyzing the impacted applications/services.
- Preparation of K245 Requirement document based on the requirements.
- Creation of Project Release Management Tracker.
- Code Review using the AMEX Code Review Checklist.
- Performing risk-based assessments on business requirements using Confidential quality standards.
- Monitoring/Coordinating the System Integration Testing.
- Coordinating the rollout activities, including execution of CMRs (Change Management Records).
Environment: Java, J2EE, web services, Solaris 2.6, Maven 3.0, WebSphere Application Server 8.0
Confidential
Systems Engineer
Responsibilities:
- Involved in Requirement Gathering and Design Phase.
- Responsible for coding the service layer that sits between the relational database and the UI.
- Prepared and executed the unit test cases.
- Developed the reusable/Generic Grid Component.
- Worked on SOAP-based web services.
- Worked with iBATIS for database interaction.
- Developed test cases and performed testing using JUNIT.
Environment: Core Java, J2EE, Web Services, Oracle 10g, WebLogic 11g, Spring, iBATIS.
Confidential
Software Engineer
Responsibilities:
- Creating the technical model for the project.
- Prepared the unit test cases and Quality documents.
- Developed the encryption logic module for exchanging data using the HSM.
- Involved in writing the heartbeat code for MQ connection pooling.
Environment: Core Java, J2EE (JSP and Servlets), Struts, Voice Objects Server, IBM WebSphere MQ, WebLogic 9.0, Hibernate.
Confidential
Software Engineer
Responsibilities:
- Developed the connector code for sending/receiving MLI messages.
- Developed the KEK process using software encryption with the Java Crypto API (an illustrative key-wrap sketch follows the Environment line below).
- Developed the reusable component for MLI packet formation.
- Deployed the application to production at the Confidential bank premises.
Environment: Core Java, J2EE (JSP and Servlets), Struts, Voice Objects Server, IBM WebSphere MQ, WebLogic 9.0, Hibernate.
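A minimal sketch of key wrapping with a KEK using the standard Java Crypto API; the AES key sizes and the AESWrap transformation are assumptions for illustration, not the bank's actual scheme:

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;

    public class KekWrapExample {
        public static void main(String[] args) throws Exception {
            KeyGenerator keyGen = KeyGenerator.getInstance("AES");
            keyGen.init(128);
            SecretKey kek = keyGen.generateKey();     // key-encryption key
            SecretKey dataKey = keyGen.generateKey(); // key to be wrapped

            // Wrap the data key under the KEK so it can be exchanged safely.
            Cipher cipher = Cipher.getInstance("AESWrap");
            cipher.init(Cipher.WRAP_MODE, kek);
            byte[] wrapped = cipher.wrap(dataKey);
            System.out.println("Wrapped key length: " + wrapped.length + " bytes");
        }
    }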