Java/J2EE Programming Resume Profile
Holliston, MA
Objective:
To apply my experience and skills in a personable, professional manner that complements the organization's culture and contributes to its growth.
SUMMARY:
- Over 10 years of experience in the IT industry, with strong experience in Big Data and Hadoop
- Experienced professional in Java/J2EE programming and application development
- Experienced in developing application code using HTML, JavaScript, jQuery, and JSP
- Working experience with the MapReduce programming model and the Hadoop Distributed File System (HDFS)
- Knowledge of writing Hive queries in Hive Query Language (HiveQL) to generate reports
- Hands-on experience installing and configuring Cloudera Apache Hadoop ecosystem components such as Flume, HBase, ZooKeeper, Oozie, Hive, Sqoop, and Pig
- Hands-on experience in distributed systems technologies, infrastructure administration, monitoring, and configuration
- Basic knowledge of UNIX and shell scripting
SKILLS:
Databases : Oracle, PL/SQL, SQL, MySQL, MS Access
Programming Languages : C, Java (JDK 1.1.7), Visual Basic
Scripting : ASP, HTML, PHP, Python, Perl, JavaScript
Networking : Ethernet, switches, routers, network protocols, subnetting implementation
Operating Systems : Linux/UNIX, Windows Server 2003/XP, Active Directory
Big Data : Hadoop, MapReduce, high-performance computing, data mining, HBase, Django, Pig, Hive, HDFS, ZooKeeper
PROFESSIONAL EXPERIENCE:
Confidential
- Confidential enables its customers to cost-effectively mine, manage, and monetize data, delivering actionable analytics that drive business performance.
- Dragonfly's products, Data Factory facilities, and Data Engineering Services focus on cloud-based, open data architectures and tools that extract and process data sources into data analytics deliverables.
Hadoop Administrator/Developer
The main aim of this project was to track trending hashtags and trending discussions in real-time streaming data. We used Kafka, a popular streaming tool, to load the data onto the Hadoop file system and to move the same data into a MongoDB NoSQL database. We analyzed the data on HDFS using MapReduce, Hive, and Pig, producing geo-tagged, location-based popular tweets and daily trending-hashtag counts that guided customers in posting ads on Twitter. We performed the real-time analytics with Hadoop and displayed the end results in Tableau.
Technologies involved in this project: Kafka 2.9.1-0.8.1.1, Cloudera CDH4.7 Hadoop cluster, AWS, Java 1.7, MySQL, MongoDB (NoSQL) 2.6, Tableau 8.2, Hadoop 2.0.0, Hive 0.10, Hue 2.5, Oozie 3.3.2, Pig 0.11, Sqoop 1.4.3, Sqoop2 1.99, ZooKeeper 3.4.5, Red Hat Linux, UNIX scripting, twitter4j, log4j, Scala, JUnit testing, and Maven
- m3.xlarge nodes with 4-core CPUs, 15 GB RAM, and 1.6 TB of storage per node
- Ingested roughly 1 TB of JSON data daily with popular hashtags, along with 8 billion events and trillions of real-time tweets
- Hadoop cluster: 30 nodes
- MongoDB cluster provisioned with 1000 IOPS
- Using Kafka with twitter4j, we streamed the data from the source into Hadoop. From Hadoop to MongoDB, we moved the data using MapReduce, Hive, and Pig scripts through the mongo-hadoop connectors, analyzing the data on HDFS and sending the results to MongoDB to update the existing collections (a sketch of this step follows below).
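A minimal sketch of that HDFS-to-MongoDB step, assuming Hadoop 2.x and the mongo-hadoop connector on the classpath. The class names, input path, and output URI are illustrative, as is the assumption that the connector turns each (Text, IntWritable) pair into a document with the key as `_id`; this is not the exact production code.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

import com.mongodb.hadoop.MongoOutputFormat;

// Counts hashtag occurrences in tweet text on HDFS and writes the totals
// to a MongoDB collection through the mongo-hadoop connector.
public class HashTagCount {

  public static class TagMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text tag = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context ctx)
        throws IOException, InterruptedException {
      // Naive tokenization: any whitespace-separated token starting with '#'.
      for (String token : value.toString().split("\\s+")) {
        if (token.length() > 1 && token.startsWith("#")) {
          tag.set(token.toLowerCase());
          ctx.write(tag, ONE);
        }
      }
    }
  }

  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      ctx.write(key, new IntWritable(sum)); // connector maps the key to _id
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Hypothetical output URI: database "twitter", collection "trending_tags".
    conf.set("mongo.output.uri", "mongodb://localhost:27017/twitter.trending_tags");

    Job job = Job.getInstance(conf, "hashtag-count");
    job.setJarByClass(HashTagCount.class);
    job.setMapperClass(TagMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    job.setOutputFormatClass(MongoOutputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0])); // HDFS input dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```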
Responsibilities:
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
- Developed simple and complex MapReduce programs in Java for data analysis.
- Transferred data from HDFS to MongoDB using Pig, Hive, and MapReduce scripts, and visualized the streaming data in Tableau dashboards.
- Performed analytics on HDFS using MapReduce, Hive, and Pig, and sent the results back to MongoDB to update the information in the collections.
- Installed and configured the CDH Hadoop environment; responsible for maintaining the cluster and managing and reviewing Hadoop log files.
- Loaded data from various data sources into HDFS using Kafka.
- Installed Kafka on the Hadoop cluster and wrote the producer and consumer code in Java to establish the connection from the Twitter source to HDFS, tracking popular hashtags (see the producer sketch after this list).
- Implemented partitioning, dynamic partitions, and buckets in Hive (illustrated after this list).
- Developed Pig Latin scripts for the analysis of semi-structured data (see the Pig sketch after this list).
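A minimal sketch of the producer side, assuming Kafka 0.8's Java producer API and twitter4j; the broker address, topic name, and tracked terms are placeholders, and OAuth credentials are assumed to live in twitter4j.properties.

```java
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
import twitter4j.FilterQuery;
import twitter4j.Status;
import twitter4j.StatusAdapter;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;
import twitter4j.json.DataObjectFactory;

// Streams tweets that match tracked hashtags into a Kafka topic.
public class TweetProducer {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("metadata.broker.list", "broker1:9092"); // hypothetical broker
    props.put("serializer.class", "kafka.serializer.StringEncoder");
    final Producer<String, String> producer =
        new Producer<String, String>(new ProducerConfig(props));

    // Assumes jsonStoreEnabled=true in the twitter4j config so the
    // raw JSON payload of each status is retained.
    TwitterStream stream = new TwitterStreamFactory().getInstance();
    stream.addListener(new StatusAdapter() {
      @Override
      public void onStatus(Status status) {
        // Forward the raw JSON so downstream Hive/Pig jobs keep every field.
        String json = DataObjectFactory.getRawJSON(status);
        producer.send(new KeyedMessage<String, String>("tweets", json));
      }
    });
    stream.filter(new FilterQuery().track(new String[] { "#bigdata", "#hadoop" }));
  }
}
```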
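A sketch of the Hive partitioning and bucketing pattern, driven here through Hive's JDBC interface and assuming a HiveServer2 endpoint; the table layout and the raw_tweets staging table are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Creates a date-partitioned, bucketed tweet table and loads one day's
// data using dynamic partitioning.
public class HiveTweetTables {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hive.jdbc.HiveDriver"); // HiveServer2 JDBC driver
    Connection conn =
        DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");
    Statement stmt = conn.createStatement();

    // Partition by day and bucket by hashtag so per-tag scans touch one bucket.
    stmt.execute("CREATE TABLE IF NOT EXISTS tweets (id BIGINT, hashtag STRING, body STRING) "
        + "PARTITIONED BY (dt STRING) "
        + "CLUSTERED BY (hashtag) INTO 32 BUCKETS "
        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'");

    // Let Hive derive the dt partition value from the data itself.
    stmt.execute("SET hive.exec.dynamic.partition=true");
    stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");
    stmt.execute("SET hive.enforce.bucketing=true");
    // raw_tweets is a hypothetical unpartitioned staging table.
    stmt.execute("INSERT OVERWRITE TABLE tweets PARTITION (dt) "
        + "SELECT id, hashtag, body, to_date(created_at) FROM raw_tweets");

    stmt.close();
    conn.close();
  }
}
```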
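A sketch of a Pig Latin flow over semi-structured tweet JSON, run through the PigServer Java API; the input path and the schema handed to the built-in JsonLoader are assumptions.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Load semi-structured tweet JSON, then group and count hashtags.
public class TrendingTagsPig {
  public static void main(String[] args) throws Exception {
    PigServer pig = new PigServer(ExecType.MAPREDUCE);
    pig.registerQuery("raw = LOAD '/data/tweets/2014-07-01' "
        + "USING JsonLoader('id:long, text:chararray, hashtag:chararray');");
    pig.registerQuery("tags = FILTER raw BY hashtag IS NOT NULL;");
    pig.registerQuery("grouped = GROUP tags BY hashtag;");
    pig.registerQuery("counts = FOREACH grouped GENERATE group AS hashtag, COUNT(tags) AS n;");
    pig.registerQuery("top = ORDER counts BY n DESC;");
    pig.store("top", "/data/trending/2014-07-01"); // triggers execution
  }
}
```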
Environment: CDH4.7, Hadoop 2.0.0 (HDFS, MapReduce), MongoDB 2.6, Tableau 8.2, Hive 0.10, Sqoop 1.4.3, Oozie 3.3.4, ZooKeeper 3.4.5, Hue 2.5.0
Hadoop Administrator/Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing
- Created Pig Latin scripts to sort, group, join, and filter the disparate enterprise-wide data (see the sketch after this list)
- Responsible for administering, monitoring, and maintaining the Hadoop cluster, developing MapReduce jobs, and assisting and guiding team members
- Experienced in reviewing Hadoop log files
- Administered, monitored, and maintained a Hadoop cluster with over 100 TB of available storage
- Designed and built scalable infrastructure and a platform to collect and process very large amounts of data
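A minimal sketch of that kind of sort/group/join/filter Pig flow, again through the PigServer API; the two feeds, their delimiters, and the field names are invented for illustration.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Joins two differently delimited enterprise feeds on a shared key,
// then filters and sorts the result.
public class EnterpriseJoin {
  public static void main(String[] args) throws Exception {
    PigServer pig = new PigServer(ExecType.MAPREDUCE);
    pig.registerQuery("orders = LOAD '/data/orders' USING PigStorage(',') "
        + "AS (order_id:long, cust_id:long, total:double);");
    pig.registerQuery("custs = LOAD '/data/customers' USING PigStorage('|') "
        + "AS (cust_id:long, region:chararray);");
    pig.registerQuery("joined = JOIN orders BY cust_id, custs BY cust_id;");
    pig.registerQuery("big = FILTER joined BY total > 1000.0;");
    pig.registerQuery("sorted = ORDER big BY total DESC;");
    pig.store("sorted", "/data/reports/large_orders");
  }
}
```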
Environment: Hadoop, MapReduce, HDFS, Pig, Hive, Java (JDK 1.7), flat files, Oracle 11g/10g, PL/SQL, Windows NT, UNIX
Hadoop Administrator
Description: Confidential provides niche technology solutions, strategic consulting, and embedded resourcing services to clinical research, biotechnology, and pharmaceutical organizations. Our quality philosophy is to provide products and services that meet our customers' requirements the first time, every time.
Responsibilities:
- Provided 24/7 support monitoring the Hadoop/HBase cluster using tools such as Ganglia, BMC, and Nagios
- Administered, monitored, and maintained a Hadoop 0.20 cluster
- Responsible for providing administration and support of multiple Hadoop clusters for a variety of mission-critical custom and packaged software applications, in addition to a large, reliable distributed data-processing framework and analytical infrastructure
Environment: Hadoop, HDFS, MapReduce, ZooKeeper, HBase, Pig Latin, Sqoop, Hive, Pig, SQL, Oracle PL/SQL, Oozie, MySQL, JavaScript, jQuery, Web Services
Network Administrator
Confidential is dedicated to providing industry-specific business technology solutions to companies in the insurance, healthcare, and construction verticals. Because they are tailored to the needs of these particular industries, our solutions differ from one-size-fits-all performance management and compensation suites such as enterprise incentive management (EIM) and sales performance management (SPM) solutions. VUE Software's solutions allow companies to manage strategic incentive plans, automate producer administration, and organize complex data and contractor policies, resulting in greater administrative efficiency and improved sales performance.
Responsibilities:
- Provided XP-to-Windows 7 platform migration, including Outlook 2003 to Outlook 2010
- Competent in transferring all files, photos, music, email, and settings, including POP/SMTP configuration
- Provided on-site, phone, and remote system and network troubleshooting and repairs
- Performed drive mapping, network drive setup, and modem and router configuration
- Managed all levels of business operations, including IT staffing, best-practice repair or replacement, data forensics, finance, marketing, and client retention
- Provided and inspired outstanding customer service by maintaining attention to detail and completing assigned work in a timely and efficient manner
Network Administrator
Designed and implemented a Windows NT 4.0 server with Windows 98 clients, including one remote site using RAS; installed EZDental/EagleSoft software and provided training to the staff; migrated to a Windows Server 2003 server with XP Pro clients and RRAS/VPN for the remote site; and provided ongoing support for both locations.