
Hadoop Developer Resume


Atlanta, GA

SUMMARY

  • 8+ years of experience in the IT industry, including 3 years of experience in Big Data and Hadoop technologies.
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, MapReduce, JobTracker, TaskTracker, NameNode, and DataNode.
  • Experience using Hadoop ecosystem tools such as Hive, Pig, Sqoop, Flume, Oozie, and Storm for data storage and analysis.
  • Good knowledge of the MapReduce framework and HDFS architecture.
  • Experience in testing data in HDFS and Hive for each data transaction.
  • Experience in importing and exporting data between HDFS and relational database management systems using Sqoop.
  • Good knowledge of Hadoop shell commands.
  • Experience in loading data into HDFS from a UNIX (Ubuntu) file system (see the sketch after this list).
  • Performed data analysis using Hive & Pig.
  • Strong knowledge of using Pig and Hive for processing and analyzing large volumes of data.
  • Good experience with the Oozie framework and automating daily import jobs.
  • Good knowledge of Amazon AWS concepts such as the EC2 web service, which provides fast and efficient processing of Big Data.
  • Experience in managing and reviewing Hadoop log files.
  • Knowledge of the architecture and functionality of NoSQL databases such as HBase.
  • Experience in loading logs from multiple sources directly into HDFS using Flume and exporting data from different databases like MySQL into HDFS and Hive using Sqoop.
  • Strong working experience with data ingestion, storage, querying, processing and analysis of big data.
  • Understanding of installing and configuring Pig, Hive, HBase, Flume, and Sqoop on Hadoop clusters.
  • Experience in working with Hadoop clusters using Cloudera distributions.
  • Experience in collecting log data from various sources and integrating it into HDFS using Flume.
  • Good knowledge of ZooKeeper for coordinating clusters.
  • Excellent analytical, problem solving and interpersonal skills with the ability to effectively communicate at all levels of the organization such as technical, management and customers.
  • Capable of rapidly learning new technologies and processes and successfully applying them to projects and operations.
  • Energetic and perseverant self-starter known for exceeding goals and objectives.
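As context for the HDFS loading experience noted above, the following is a minimal sketch of copying a file from a local UNIX file system into HDFS with the Hadoop FileSystem Java API. The NameNode address, paths, and class name are illustrative placeholders, not details taken from any engagement on this resume.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsLoadExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder cluster address; normally picked up from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        FileSystem fs = FileSystem.get(conf);
        // Copy a file from the local (UNIX) file system into an HDFS landing directory.
        fs.copyFromLocalFile(new Path("/data/incoming/transactions.csv"),
                             new Path("/user/hadoop/landing/transactions.csv"));
        fs.close();
    }
}
```

In practice the same step is often done with the `hadoop fs -put` shell command; the API form above is useful when the load is embedded in a Java job.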

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, Pig, Hive, Sqoop, Flume, HBase, ZooKeeper, Oozie, YARN, Storm

Platforms: Microsoft Windows, Mac OS, Linux

Databases: Oracle, MySQL

Cloud Infra: Amazon EC2

Languages: HiveQL, Pig Latin

Distributions: Cloudera, Hortonworks

PROFESSIONAL EXPERIENCE

Confidential - Atlanta, GA

Hadoop Developer

Responsibilities:

  • Worked extensively on creating MapReduce jobs to power data for search and aggregation (an illustrative sketch follows this list).
  • Designed a data warehouse using Hive
  • Worked extensively with Sqoop for importing metadata from Oracle
  • Extensively used Pig for data cleansing
  • Created partitioned tables in Hive
  • Worked with business teams and created Hive queries for ad hoc access.
  • Evaluated usage of Oozie for Workflow Orchestration
  • Mentored analysts and the test team in writing Hive queries.
  • Gained strong business knowledge of health insurance, claims processing, fraud suspect identification, the appeals process, etc.
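The sketch below illustrates the kind of aggregation-style MapReduce job referenced in the first bullet: it counts records per key using the standard org.apache.hadoop.mapreduce API. The class names, pipe-delimited record layout, and key field are assumptions made for the example, not the project's actual code.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ClaimCountJob {

    /** Emits (first field of a pipe-delimited record, 1). */
    public static class ClaimMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text outKey = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split("\\|");
            if (fields.length > 0 && !fields[0].isEmpty()) {
                outKey.set(fields[0]);
                context.write(outKey, ONE);
            }
        }
    }

    /** Sums the counts for each key. */
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int total = 0;
            for (IntWritable c : counts) {
                total += c.get();
            }
            context.write(key, new IntWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        // Hadoop 2-style job setup; older releases use new Job(conf, name) instead.
        Job job = Job.getInstance(new Configuration(), "claim count");
        job.setJarByClass(ClaimCountJob.class);
        job.setMapperClass(ClaimMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```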

Environment: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hortonworks Hadoop distribution, Oozie, Oracle 11g/10g.

Confidential - Dallas, TX

Hadoop Consultant

Responsibilities:

  • Responsible for coordinating end-to-end project management activities.
  • Involved in the design and development of technical specification documents for the Hadoop implementation.
  • Developed MapReduce programs to parse the raw data, populate tables and store the refined data in partitioned tables in the MMIS. Managed and reviewed Hadoop log files.
  • Developed Apache Pig and Hive scripts to process HDFS data.
  • Monitored Hadoop scripts which take the input from HDFS and load the data into Hive.
  • Migrated the required data from Oracle and MySQL into HDFS using Sqoop and imported flat files of various formats into HDFS.
  • Defined Oozie job workflows according to their dependencies (see the submission sketch after this list).
  • Maintained system integrity of all Hadoop-related sub-components.
  • Worked on Apache and Cloudera's Hadoop clusters.

Environment: Apache Hadoop, HDFS, MapReduce, Java, Eclipse, Hive, Pig, Sqoop, Flume, Oozie, MySQL, Cloudera Hadoop distribution.

Confidential - Sacramento, CA

Hadoop Developer

Responsibilities:

  • Extracted files from DB2 through Kettle and placed in HDFS for processing
  • Analyzed large data sets by running Hive queries and Pig scripts
  • Worked with the Data Science team to gather requirements for various data mining projects
  • Involved in creating Hive tables, loading data, and analyzing data using hive queries
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Involved in running Hadoop jobs for processing millions of records of text data
  • Worked with application teams to install operating system, Hadoop updates, patches, and version upgrades as required
  • Developed multiple MapReduce jobs in java for data cleaning and pre-processing
  • Involved in unit testing of MapReduce jobs using MRUnit (see the sketch after this list)
  • Involved in loading data from LINUX file system to HDFS
  • Responsible for managing data from multiple sources
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data
  • Loaded and transformed large sets of structured and semi-structured data
  • Assisted in exporting analyzed data to relational databases using Sqoop
  • Created and maintained technical documentation for launching Hadoop clusters and executing Hive queries and Pig scripts
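A minimal example of the MRUnit testing pattern mentioned above is sketched below. The tiny mapper under test and its pipe-delimited record layout are invented purely for illustration and are not the project's actual code.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class RecordCountMapperTest {

    /** Hypothetical mapper: emits (first field, 1) for each pipe-delimited record. */
    public static class RecordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String key = line.toString().split("\\|")[0];
            context.write(new Text(key), new IntWritable(1));
        }
    }

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        // MRUnit drives the mapper in memory, without a running cluster.
        mapDriver = MapDriver.newMapDriver(new RecordCountMapper());
    }

    @Test
    public void emitsOneCountPerRecord() throws Exception {
        mapDriver.withInput(new LongWritable(0), new Text("C1001|2013-04-01|APPROVED"))
                 .withOutput(new Text("C1001"), new IntWritable(1))
                 .runTest();
    }
}
```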

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, LINUX, MRUnit, and Big Data

Confidential, MO

Hadoop Developer

Responsibilities:

  • Installing and configuring Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java.
  • Importing and exporting data into HDFS and Hive using Sqoop.
  • Working on Pig Latin scripts and UDFs for ingestion, querying, processing, and analysis of data.
  • Installing and configuring Hive and writing Hive UDFs (see the sketch after this list).
  • Defining job flows.
  • Managing and reviewing Hadoop log files.
  • Extracting files through Sqoop, placing them in HDFS, and processing them.
  • Loading and transforming large sets of structured, semi-structured, and unstructured data.
  • Responsible to manage data coming from different sources.
  • Good knowledge of the NoSQL database HBase.
  • Supporting MapReduce programs running on the cluster.
  • Loading data from UNIX file system to HDFS.
  • Creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Prepared design documents and functional documents.
  • Involved in Unit level testing
  • Connected the local file system to HDFS using FileZilla.
  • Added extra nodes to the cluster as required to keep it scalable.
  • Checked memory and disk usage to identify malfunctioning nodes.
  • Submitted a detailed report of daily activities on a weekly basis.
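As an illustration of the Hive UDF work noted earlier in this list, the sketch below shows a simple UDF built on the classic org.apache.hadoop.hive.ql.exec.UDF base class. The masking behavior and class name are assumptions made for the example, not the project's actual function.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/** Hypothetical UDF: masks all but the last four characters of a string column. */
public class MaskUdf extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;          // propagate SQL NULLs unchanged
        }
        String value = input.toString();
        if (value.length() <= 4) {
            return new Text(value);
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(value.substring(value.length() - 4));
        return new Text(masked.toString());
    }
}
```

Once packaged into a JAR, such a UDF is typically registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION before being used in queries.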

Environment: Java, Linux, Hadoop, Sqoop, Pig, Hive, Flume, Eclipse, ClearCase, ZooKeeper, and Oozie.

Confidential, West Street, NY

SQL Developer

Responsibilities:

  • Analyzed the functionality and developed the data model for financial modules using Erwin.
  • Worked with the team on the project's rapid development, including timelines and resources throughout the project's life cycle.
  • Wrote development documents for new projects based on requirements and the Business Requirement Document (BRD) submitted by the business owners.
  • Created database maintenance plans for SQL Server performance, covering database integrity checks, updating of database statistics, and re-indexing.
  • Created SSIS packages to transfer data from Oracle to SQL Server using different SSIS components, and used configuration files and variables for production deployment.
  • Migrated existing DTS packages to SSIS packages and Crystal Reports to SSRS reports.
  • Created and scheduled daily, weekly, and monthly reports for executives, business analysts, and customer representatives across various categories and regions, based on business needs, using SQL Server Reporting Services (SSRS).
  • Used SQL Profiler to trace slow-running queries and server activity, and assisted with SQL performance tuning.

Environment: MSSQL Server 2000, Oracle 9i, SSRS and SSIS, Crystal Reports, SQL, T-SQL, PL/SQL, SQL Query Analyzer, Profiler, Erwin, Data modeling, HTML, XPath, ETL, DTS packages.

Confidential, Washington D.C.

BI/DW Developer

Responsibilities:

  • Interpreted complex Business specifications developed by the client and generated cubes and reports to meet customer specifications.
  • Indexing and performance tuning for development, test and production environments
  • Designed ETL loading process and data flow
  • Developed ETL scripts to extract and cleanse data from SQL databases
  • Prepared the impact analysis document and high-level design for the requirements.
  • Analyzed the source data coming from different sources (Oracle, XML, Flat files, Excels) and worked on developing ETL mappings.
  • Debugged the reports and Cognos Metadata.
  • Performance tuning by analyzing and comparing the turnaround times between SQL and Cognos
  • Developed Reports which involved Multiple Prompts, Filters, multi-page, multi-query reports against database.
  • Developed dashboards and provided detail list and summary charts.
  • Designed models using Framework Manager and deployed packages to the Cognos BI Server.
  • Designed multidimensional models using Transformer.
  • Created user class views, dimension views and cube groups to cater various business needs.
  • Created Cognos Go! Office PowerPoint solutions.
  • Created an ad hoc package for Query Studio to fulfill on-the-fly data requests from customers.

Environment: Cognos 8.4/10, Report Studio, Framework Manager, Query Studio, Analysis Studio, Oracle 11g, SQL & PL/SQL Developer, Informatica PowerCenter, LDAP

Confidential, CT

BI Consultant

Responsibilities:

  • Interpreted existing complex report specifications developed by users in Excel reporting.
  • Studied and analyzed business requirements, gathered and understood the client's data warehousing requirements, and performed system analysis of business processes.
  • Designed framework model using multi-layer architecture.
  • Created both standalone and embedded filters in model to reuse in the reports.
  • Created Standalone and embedded calculations for easy maintenance of the reports.
  • Implemented security at various levels to protect sensitive data from unauthorized access.
  • Defined governor settings to improve performance by restricting unnecessary querying.
  • Created ad hoc packages to generate reports using Query studio.
  • Used parameter maps with macros to implement security and simplify the report design.
  • Created master-detail reports to show related data, and created drill reports to show detail information from the summary reports.
  • Created drill-through reports and reused them by defining package-level security.
  • Used both Cognos functions and vendor-specific functions to simplify report design and improve performance.
  • Created the interactive dashboard reports using various charts and various report templates.
  • Created rank reports to identify top-performing underwriters.
  • Used layout component reference to reuse the complex report layouts and prompt pages.
  • Created various parameters for drill-through reports.
  • Used variables and applied conditional style and conditional text for highlighting the specific data and adding more interactivity to the report dashboard.
  • Created report views and scheduled reports to automate prompt selections and mail the report results to user groups.
  • Used page break property to show each group of values in the different pages.
  • Created financial report templates which are specific to the government format.
  • Created security groups and roles to implement various types of security.
  • Implemented security to prevent unauthorized access to sensitive data.
  • Deployed the packages to QA and Production environments as part of the project life cycle.
  • Prepared Unit test cases and performed unit tests on Reports.
  • Involved in performance tuning and testing of the reports.
  • Assisted the team in developing standards to ensure data quality.
  • Prepared documentation for various toll gates of PMO process.

Environment: Cognos 8.4/10, Report Studio, Framework Manager, Query Studio, Oracle 10g, SQL & PL/SQL Developer, Informatica PowerCenter
