Hadoop Developer/ Administrator Resume
Woburn, MA
OBJECTIVE
- Highly motivated, communicative, performance-driven IT professional with industry experience, actively seeking an opportunity to apply strong technical skills gained through past experience in an organization that values professionalism and growth.
SUMMARY
- Over 5 years of progressive and diversified experience in all phases of administration and development activities in information technology.
- Experience in Big Data technologies like Hadoop development and administration.
- Experienced certified administrator and lead with proven success in delivering multiple projects for various industries.
- Established history of innovative mindset, critical thinking, and leadership.
- MapR Certified Hadoop Administrator with experience in managing and developing Big Data solutions in the Banking, Finance, Retail, Telecom, Life Sciences, and Automobile domains.
- Experience in Hadoop/Big Data writing, course development, instructional design, and resource coordination.
- Expertise in Pig and MapReduce techniques.
- Superior communication, diagnostic, technical, and presentation skills.
TECHNICAL SKILLS
Programming Experience: C, C++, JAVA, Python, JavaScript, HTML, XML, CSS, PHP, SQL
Applications: Video editing and web design applications, MySQL, SSH, QuickBooks
Microsoft Office: Word, Excel, Access, PowerPoint
Operating System: Windows XP, Windows 7, Windows 8, Macintosh, Linux/Unix
Big Data: Hadoop, MapReduce, High-performance computing, Picard, Data mining, Apache/Tomcat/IIS, Django, Pig, Hive, HDFS, ZooKeeper, MongoDB, Tableau, Scala
PROFESSIONAL EXPERIENCE
Confidential, Woburn, MA
Hadoop Developer/ Administrator
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Loaded customer profile, customer spending, and credit data from legacy warehouses onto HDFS using Sqoop.
- Built data pipelines using Pig and Java MapReduce to store data on HDFS.
- Applied transformations and filtered traffic data using Pig.
- Used pattern-matching algorithms to identify customers across different sources, built risk profiles for each customer using Hive, and stored the results in HBase.
- Performed unit testing using MRUnit.
- Responsible for building scalable distributed data solutions using Hadoop
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
- Setup and benchmarked Hadoop/HBase clusters for internal use
- Developed simple to complex MapReduce jobs using Hive and Pig
- Optimized MapReduce jobs to use HDFS efficiently via various compression mechanisms
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
- Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
- Installed Oozie workflow engine to run multiple Hive and Pig jobs
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
- Responsible for writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).
- Provided support to data analysts in running Pig and Hive queries.
- Wrote HiveQL queries and Pig Latin scripts for data analysis.
- Imported and exported data between MySQL/Oracle and Hive.
- Imported and exported data between MySQL/Oracle and HDFS using Sqoop.
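As an illustration of the Hive analysis work described above, a customer-behavior query might take this shape (table and column names here are hypothetical, not taken from an actual project):

```sql
-- Hypothetical sketch: aggregate customer spending by category
-- to study customer behavior (table/column names are illustrative).
SELECT c.customer_id,
       s.category,
       SUM(s.amount) AS total_spend
FROM customer_profiles c
JOIN customer_spending s
  ON c.customer_id = s.customer_id
GROUP BY c.customer_id, s.category;
```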
Environment: Hadoop, Hive, ZooKeeper, MapReduce, Sqoop, Pig 0.10 and 0.11, JDK 1.6, HDFS, Flume, Oozie, DB2, HBase, Mahout
Confidential, Woburn, MA
Hadoop Developer/ Administrator
Responsibilities:
- Installed and configured the CDH Hadoop environment; responsible for maintaining the cluster and managing and reviewing Hadoop log files.
- Developed simple and complex MapReduce programs in Java for Data Analysis.
- Installed Kafka on the Hadoop cluster and wrote producer and consumer code in Java to stream tweets with popular hashtags from the Twitter source into HDFS.
- Loaded data from various data sources into HDFS using Kafka.
- Transferred data from HDFS to MongoDB using Pig, Hive, and MapReduce scripts, and visualized the streaming data in Tableau dashboards.
- Performed analytics on HDFS data using MapReduce, Hive, and Pig, wrote the results back to MongoDB, and updated information in collections.
- Implemented partitioning, dynamic partitions, and buckets in Hive.
- Developed Pig Latin scripts for the analysis of semi-structured data.
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
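The Hive partitioning and bucketing mentioned above could be set up along these lines (a sketch with hypothetical table and column names):

```sql
-- Hypothetical sketch: a Hive table partitioned by date and
-- bucketed by user id, with dynamic partitioning enabled for loads.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

CREATE TABLE tweets_by_day (
  user_id STRING,
  hashtag STRING,
  body    STRING
)
PARTITIONED BY (dt STRING)
CLUSTERED BY (user_id) INTO 16 BUCKETS;

-- Dynamic-partition insert: Hive routes each row to its dt partition.
INSERT OVERWRITE TABLE tweets_by_day PARTITION (dt)
SELECT user_id, hashtag, body, dt FROM raw_tweets;
```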
Environment: CDH 4.7, Hadoop 2.0.0, HDFS, MapReduce, MongoDB 2.6, Tableau 8.2, Hive 0.10, Sqoop 1.4.3, Oozie 3.3.4, ZooKeeper 3.4.5, Hue 2.5.0
Confidential, Bayside, NY
Technology Support
Responsibilities:
- Managed all financial aspects of the company, including bookkeeping, using QuickBooks and Excel
- Maintained and reinstalled the Windows operating system on company computers
- Troubleshot and maintained the company network to communicate with clients in Korea
- Managed clients' information using MySQL
- Developed and maintained the company website using web development tools.
Confidential, Flushing, NY
Assistant Web Designer
Responsibilities:
- Designed and Maintained AYC website using Notepad++, CSS, and JavaScript
- Maintained Windows Operating Systems and Linux Operating Systems
- Troubleshot and Maintained AYC and The Council of Korean Churches of Greater New York’s Network
- Managed AYC Networking Group data using MySQL
- Managed membership data for The Council of Korean Churches of Greater New York using MySQL
Confidential, Bayside, NY
Manager
Responsibilities:
- Enhanced overall performance of business by effectively organizing classes considering individual student’s performance
- Taught SAT and advised students on overall college application processes.
- Recorded and managed students' personal information and grades using a SQL database program
Confidential
Intern Programmer
Responsibilities:
- Assisted the main programmer in developing Android smartphone applications using Python, Java, XML, PHP, and SQL.
- Assisted the main programmer in developing iOS smartphone applications using Objective-C, XML, PHP, and SQL.
- Managed and maintained application servers and databases using Amazon Web Services.
- Developed web user interfaces using Java to meet client needs.
- Assisted the web programmer in developing websites using HTML and CSS.