Sr. Big Data Consultant Resume
Sunnyvale, CA
PROFESSIONAL SUMMARY:
- 8+ years of professional experience in IT, including 2 years of work experience in the Hadoop ecosystem and Big Data analytics
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
- Well versed in installation, configuration, supporting and managing of Big Data and underlying infrastructure of Hadoop Cluster.
- Hands-on experience with major components of the Hadoop ecosystem, including Hadoop MapReduce, HDFS, Hive, Pig, Confidential PDI, HBase, Sqoop, Oozie, and Flume.
- Experience with the Oozie workflow engine, running workflow jobs with actions that execute Hadoop MapReduce and Pig jobs.
- Experience in managing and reviewing Hadoop Log files.
- Extending Hive and Pig core functionality by writing custom UDFs.
- Experience in Hadoop administration activities such as installation and configuration of clusters using Apache and Cloudera.
- Good knowledge of Amazon AWS services such as EMR and EC2, which provide fast and efficient processing of Big Data.
- Hands on experience in designing and coding web applications using Core Java and J2EE technologies.
- Experienced in integrating various data sources such as RDBMS, spreadsheets, and text files using Java and shell scripting.
- Experience in Web Services using XML, HTML and SOAP.
- Excellent Java development skills using J2EE, J2SE, Servlets, JUnit, JSP, JDBC.
- Familiarity working with popular frameworks such as Struts, Hibernate, Spring, MVC, and AJAX.
- Extensive experience in developing components using JDBC, Java, Oracle, XML and UNIX Shell Scripting.
- Ability to blend technical expertise with strong conceptual, business, and analytical skills to deliver quality solutions, result-oriented problem solving, and leadership.
TECHNICAL SKILLS:
Hadoop Ecosystem: HDFS, MapReduce, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie, Flume, Spark, Storm, Kafka, Confidential, and Avro.
Web Technologies: Java, J2EE, Servlets, JSP, JDBC, XML, AJAX, SOAP, WSDL
Methodologies: Agile, UML, Design Patterns (Core Java and J2EE)
Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x
Programming Languages: Java, XML, Unix Shell scripting, HTML.
Database Systems: Oracle 11g/10g, DB2, MS-SQL Server, MySQL, MS-Access
Application Servers: WebLogic, WebSphere, Apache Tomcat
Monitoring & Reporting tools: Ganglia, Custom Shell scripts
Operating Systems: Windows-XP/2000/NT, UNIX, Linux, and DOS
IDE: Eclipse 3.x, NetBeans
PROFESSIONAL EXPERIENCE:
Confidential, Sunnyvale, CA
Sr. Big Data Consultant
Responsibilities:
- Responsible for building the Hadoop cluster and integrating it with the Confidential Data Integration (PDI) server
- Experienced in creating ETL transformations/Jobs using Spoon
- Experience in developing visual MapReduce Applications using Confidential Spoon
- Loaded data into Impala tables after data cleansing
- Established database connections from Confidential to store data in MySQL (a minimal JDBC sketch follows the Environment line below)
- Experienced in working with Hadoop ecosystem components such as Hive, Impala, HBase, Pig, and Sqoop through Confidential
- Developed various complex Mapper and Reducer transformations for Confidential MapReduce jobs
- Experience in using MongoDB
- Extensively involved in Hadoop testing, with test scripts written in Python
- Responsible for analyzing logs generated from various test cases and identifying root causes of failures
- Experience in building plugins using the Confidential Java API
- Involved in various debugging sessions with the team
- Responsible for test case reporting and documenting test results
- Good knowledge of report design using the Confidential BA suite
Environment: Confidential PDI, Spoon, Java 6, Linux, Hadoop (CDH and Hortonworks), Impala, Hive, Sqoop, HBase, Pig, MySQL
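A minimal sketch of the JDBC pattern behind the MySQL connection noted above; the driver class matches the classic Connector/J, while the URL, credentials, table, and class names are illustrative assumptions rather than project details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Hypothetical writer that persists cleansed records to MySQL over JDBC.
public class MySqlWriter {
    public void write(String id, String value) throws Exception {
        Class.forName("com.mysql.jdbc.Driver"); // classic Connector/J driver class
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/etl_db", "etl_user", "secret"); // placeholders
        try {
            PreparedStatement stmt = conn.prepareStatement(
                    "INSERT INTO cleansed_records (id, val) VALUES (?, ?)"); // assumed table
            stmt.setString(1, id);
            stmt.setString(2, value);
            stmt.executeUpdate();
            stmt.close();
        } finally {
            conn.close();
        }
    }
}
```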
Confidential, Boston, MA
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a minimal mapper sketch follows the Environment line below).
- Importing and exporting data into HDFS and Hive using Sqoop.
- Experienced in defining job flows.
- Experienced in managing and reviewing Hadoop log files.
- Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible for managing data coming from different sources.
- Gained good experience with NoSQL databases.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
Environment: Java 6, Eclipse, Linux, Hadoop, HBase, Sqoop, Pig, Hive, Flume.
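A minimal sketch of the kind of data-cleaning MapReduce job mentioned above, using the standard Hadoop mapper API; the comma-separated record format, field count, and class name are assumptions for illustration:

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical cleaning mapper: drops malformed rows and trims each field.
public class CleaningMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
    private static final int EXPECTED_FIELDS = 5; // assumed record width

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split(",", -1);
        if (fields.length != EXPECTED_FIELDS) {
            return; // skip malformed records
        }
        StringBuilder cleaned = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) {
                cleaned.append(',');
            }
            cleaned.append(fields[i].trim());
        }
        context.write(new Text(cleaned.toString()), NullWritable.get());
    }
}
```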
Confidential, MA
Hadoop Developer
Responsibilities:
- Involved in the complete software development life cycle (SDLC) to develop the application.
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
- Involved in loading data from LINUX file system to HDFS.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Implemented test scripts to support test driven development and continuous integration.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Created Pig Latin scripts to sort, group, join and filter the enterprise wise data.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Supported MapReduce programs running on the cluster.
- Analyzed large data sets by running Hive queries and Pig scripts (a Hive-over-JDBC sketch follows the Environment line below).
- Worked on tuning the performance of Pig queries.
- Mentored analysts and the test team in writing Hive queries.
- Installed Oozie workflow engine to run multiple MapReduce jobs.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Linux, Java, Oozie, HBase
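A minimal sketch of running an analytical Hive query from Java, as referenced above; it assumes a HiveServer2 endpoint, and the host, table, and query are illustrative placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical runner: Hive compiles the query below into MapReduce jobs.
public class HiveQueryRunner {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver"); // HiveServer2 JDBC driver
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", ""); // assumed endpoint
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(
                "SELECT category, COUNT(*) FROM events GROUP BY category"); // assumed table
        while (rs.next()) {
            System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
        }
        conn.close();
    }
}
```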
Confidential, Atlanta, GA
Hadoop Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
- Setup and benchmarked Hadoop/HBase clusters for internal use
- Developed simple to complex MapReduce jobs using Hive and Pig
- Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
- Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
- Implemented business logic in Hadoop by writing UDFs in Java and using various UDFs from Piggybank and other sources (a minimal Hive UDF sketch follows the Environment line below)
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required
- Installed Oozie workflow engine to run multiple Hive and Pig jobs
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL, Cloudera Manager, Sqoop, Flume, Oozie, Eclipse.
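A minimal sketch of a Java UDF of the kind described above, using the classic Hive UDF API; the masking rule and class name are illustrative assumptions, not project logic:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: masks all but the last four characters of a value.
public final class MaskValue extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // propagate SQL NULLs
        }
        String s = input.toString();
        if (s.length() <= 4) {
            return input;
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(s.substring(s.length() - 4));
        return new Text(masked.toString());
    }
}
```

Once packaged into a jar, a UDF like this is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use in queries.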
Confidential, Seattle, WA
Java/J2EE Developer
Responsibilities:
- Involved in various phases of the Software Development Life Cycle (SDLC), such as design, development, and unit testing.
- Developed and deployed UI layer logic of sites using JSP, XML, JavaScript, HTML/DHTML, and Ajax.
- CSS and JavaScript were used to build rich internet pages.
- Followed the Agile Scrum methodology for the development process.
- Designed specifications for front-end and back-end application development using design patterns.
- Developed prototype test screens in HTML and JavaScript.
- Involved in developing JSPs for client data presentation and client-side data validation within forms.
- Developed the application by using the Spring MVC framework.
- Used the Java Collections framework to transfer objects between the different layers of the application.
- Developed data mapping to create a communication bridge between various application interfaces using XML and XSL.
- Used Spring IoC to inject values for dynamic parameters.
- Developed JUnit tests for unit-level testing.
- Actively involved in code review and bug fixing for improving the performance.
- Documented application for its functionality and its enhanced features.
- Created connections through JDBC and used JDBC statements to call stored procedures (a minimal CallableStatement sketch follows the Environment line below).
Environment: Spring MVC, Oracle 11g, J2EE, Java, JDBC, Servlets, JSP, XML, Design Patterns, CSS, HTML, JavaScript 1.2, JUnit, Apache Tomcat, MySQL.
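A minimal sketch of calling a stored procedure through JDBC, as in the bullet above; the connection URL, procedure name, parameters, and class name are illustrative assumptions:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Hypothetical service method invoking a stored procedure with an OUT parameter.
public class AccountService {
    public String fetchStatus(long accountId) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:ORCL", "app_user", "secret"); // placeholders
        try {
            CallableStatement call = conn.prepareCall("{call GET_ACCOUNT_STATUS(?, ?)}");
            call.setLong(1, accountId);
            call.registerOutParameter(2, Types.VARCHAR);
            call.execute();
            return call.getString(2); // value returned by the procedure
        } finally {
            conn.close();
        }
    }
}
```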
Confidential, TN
Java/J2EE Developer
Responsibilities:
- Played an active role in the team by interacting with welfare business analysts/program specialists and converting business requirements into system requirements.
- Developed analysis-level documentation such as use case, business domain model, activity, sequence, and class diagrams.
- Conducted design reviews and technical reviews with other project stakeholders.
- Implemented Services using Core Java.
- Developed and deployed UI layer logic of sites using JSP.
- Used Struts (MVC) to implement the business model logic.
- Worked with Struts MVC objects such as ActionServlet, controllers, validators, web application context, handler mappings, message resource bundles, and JNDI look-ups for J2EE components.
- Developed dynamic JSP pages with Struts.
- Used built-in/custom Interceptors and Validators of Struts.
- Developed the XML data object to generate the PDF documents and other reports.
- Used Hibernate, DAO, and JDBC for data retrieval and modifications from the database (a minimal DAO sketch follows the Environment line below).
- Handled messaging and web service interaction using SOAP.
- Developed JUnit test cases for unit testing as well as for system and user test scenarios.
- Involved in Unit Testing, User Acceptance Testing and Bug Fixing.
- Implemented mid-tier business services to integrate UI requests to DAO layer commands.
Environment: J2EE, JDBC, Java 1.4, Servlets, JSP, Struts, Hibernate, Web Services, SOAP, WSDL, Design Patterns, MVC, HTML, JavaScript 1.2, WebLogic 8.0, XML, JUnit, Oracle 10g, MyEclipse
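A minimal sketch of the Hibernate DAO pattern referenced above, written against the Hibernate 3.x API; CaseRecord is a hypothetical mapped entity, not a class from the project:

```java
import java.io.Serializable;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Hypothetical mapped entity (fields omitted for brevity).
class CaseRecord implements Serializable {
}

// Hypothetical DAO handling retrieval and modification of a mapped entity.
public class CaseRecordDao {
    private final SessionFactory sessionFactory;

    public CaseRecordDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public CaseRecord findById(Serializable id) {
        Session session = sessionFactory.openSession();
        try {
            return (CaseRecord) session.get(CaseRecord.class, id);
        } finally {
            session.close();
        }
    }

    public void update(CaseRecord record) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            session.update(record);
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback(); // keep the database consistent on failure
            throw e;
        } finally {
            session.close();
        }
    }
}
```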
Confidential, Somerset, NJ
Java Developer
Responsibilities:
- Actively participated in requirements gathering, analysis, design, and testing phases.
- Designed use case diagrams, class diagrams, and sequence diagrams as a part of Design Phase.
- Developed the entire application implementing MVC architecture, integrating JSF with the Hibernate and Spring frameworks.
- Developed Enterprise Java Beans (stateless session beans) to handle transactions such as online funds transfers and bill payments to service providers.
- Implemented Service-Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services (a minimal JMS sender sketch follows the Environment line below).
- Developed XML documents and generated XSL files for Payment Transaction and Reserve Transaction systems.
- Developed Web Services for data transfer from client to server and vice versa using Apache Axis and SOAP.
- Used JUnit Framework for the unit testing of all the java classes.
- Implemented various J2EE Design patterns like Singleton, Service Locator, DAO, and SOA.
- Worked on AJAX to develop an interactive Web Application and JavaScript for Data Validations.
Environment: J2EE, JDBC, Java 1.4, Servlets, JSP, Struts, Hibernate, Web Services, SOAP, Design Patterns, MVC, HTML, JavaScript 1.2, WebLogic 8.0, XML, JUnit, Oracle 10g, WebSphere, MyEclipse
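A minimal sketch of the point-to-point JMS messaging used for the SOA integration described above; the JNDI names and class name are illustrative assumptions:

```java
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

// Hypothetical sender that publishes a payment message to a queue.
public class PaymentMessageSender {
    public void send(String payload) throws Exception {
        InitialContext ctx = new InitialContext();
        QueueConnectionFactory factory =
                (QueueConnectionFactory) ctx.lookup("jms/ConnectionFactory"); // assumed JNDI name
        Queue queue = (Queue) ctx.lookup("jms/PaymentQueue"); // assumed JNDI name
        QueueConnection connection = factory.createQueueConnection();
        try {
            QueueSession session =
                    connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            QueueSender sender = session.createSender(queue);
            TextMessage message = session.createTextMessage(payload);
            sender.send(message);
        } finally {
            connection.close(); // closes the session and sender as well
        }
    }
}
```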
Confidential
Java Developer
Responsibilities:
- Worked on AJAX to develop an interactive Web Application and JavaScript for Data Validations.
- Developed the application under JEE architecture; designed dynamic, browser-compatible user interfaces using JSP, custom tags, HTML, CSS, and JavaScript.
- Deployed and maintained the JSP and Servlet components on WebLogic 8.0.
- Developed the application server persistence layer using JDBC, SQL, and Hibernate.
- Used JDBC to connect the web applications to databases.
- Implemented test-first development using the JUnit unit testing framework.
- Developed and utilized J2EE services and JMS components for messaging communication in WebLogic.
- Designed critical Core Java components end to end using Java Collections and multithreading (a minimal worker-thread sketch follows the Environment line below).
- Analyzed different database schemas (transactional and data warehouse) to build extensive business reports using SQL and joins.
- Developed multiple business reports with quick turnaround, which helped the business save considerable operational costs.
- Created a program that notifies the operations team within seconds of downtime at any of the 250 pharmacies on the AP network.
- Created an interface using JSP, Servlets, and the Struts MVC architecture for the pharmacy team to resolve stuck orders in different pharmacies.
- Performance-tuned the IMS report by fixing memory leaks and applying Java best practices to boost the application's performance and reliability.
Environment: Java 1.4, J2EE (JSP, Servlets, JavaBeans, JDBC, Multithreading), Linux (Shell & Perl Scripting), and SQL.
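A minimal sketch of the Core Java multithreading pattern referenced above, written in pre-generics (Java 1.4 era) style; the task-queue design and class name are illustrative assumptions:

```java
import java.util.LinkedList;

// Hypothetical background worker that drains a shared queue of report tasks.
public class ReportWorker implements Runnable {
    private final LinkedList tasks = new LinkedList(); // raw type, pre-generics style

    public void submit(String reportName) {
        synchronized (tasks) {
            tasks.addLast(reportName);
            tasks.notify(); // wake the worker thread
        }
    }

    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            String next;
            synchronized (tasks) {
                while (tasks.isEmpty()) {
                    try {
                        tasks.wait();
                    } catch (InterruptedException e) {
                        return; // exit cleanly when interrupted
                    }
                }
                next = (String) tasks.removeFirst();
            }
            process(next);
        }
    }

    private void process(String reportName) {
        // Placeholder for the actual report-generation logic.
        System.out.println("Generating report: " + reportName);
    }
}
```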