
Hadoop Administration Resume


Hartford

SUMMARY

  • 10+ years of experience in analysis, design, development, support, and testing of software applications. Experience in Big Data Hadoop and Enterprise Application Integration (EAI).
  • Experience in developing and deploying enterprise-based applications using major components of the Hadoop ecosystem: Hadoop 2.x, YARN, Hive, Pig, MapReduce, HBase, Flume, Sqoop, Spark, Storm, Kafka, Oozie, and ZooKeeper.
  • Experience in operating, developing, maintaining, monitoring, and upgrading Hadoop clusters in the cloud as well as in-house (Apache Hadoop, Hortonworks, and Cloudera distributions).
  • Experience in installation, configuration, management, and deployment of Big Data solutions and the underlying infrastructure of a Hadoop cluster.
  • Good knowledge of AWS cloud technology.
  • Good experience in ETL development using Kafka, Flume, and Sqoop.
  • Involved in all phases of the Software Development Life Cycle (SDLC) and worked on all activities related to operations, implementation, administration, and support.
  • Experience in deploying, configuring, supporting, and managing Hadoop clusters of Cloudera and Hortonworks distributions on the Linux platform.
  • Experience in installing and configuring Hive, its services, and the Metastore. Exposure to Hive Query Language and to table operations such as importing data, altering, and dropping tables.
  • Experience in installing and running Pig, its execution types, Grunt, and Pig Latin editors. Good knowledge of how to load, store, and filter data, as well as how to combine and split data.
  • Experience integrating Kafka with Spark for real-time data processing and Spark Streaming.
  • Experience in setting up high-availability Hadoop clusters.
  • Experience in Spark SQL and Spark Streaming.
  • Experience in Hadoop administration, with good knowledge of Hadoop features such as safe mode and auditing.
  • Experience in understanding client requirements and translating business requirements into functional and technical designs, with strong analysis and design skills.
  • Excellent presentation and interpersonal skills; a strong team player willing to take on new and varied projects.

TECHNICAL SKILLS

Big Data Ecosystem: Hadoop, HDFS, MapReduce, Spark, Hive, Pig, Sqoop, Oozie, Flume, ZooKeeper, Kafka, HBase, Cloudera Manager, Apache Ambari, Ganglia, Nagios, Talend, Apache NiFi

Operating Systems: Windows, Linux (Red Hat, CentOS, Ubuntu)

Servers: Tomcat 5.x, BEA WebLogic 7.x, Oracle GoldenGate 11.2

Languages & Scripting: Core Java, HTML, JavaScript, Perl, shell scripting

J2EE Technologies: JDBC, Servlets, JSP, Struts 1.1, Spring, Hibernate

Tools: UML, Design Patterns, Log4j, Ant, IBM ILOG JRules, Visio, XML Canon, SVN, QC, HPSC, DB Symphony, Service Manager, Service Now

Databases: Oracle 9i, Oracle 10g/11g, SQL Server 2005

IDE: Eclipse 3.x

TIBCO EAI: TIBCO BW 5.x, Hawk 4.x, TIBCO Adapters (ADB, SAP, File), TIBCO Administrator 5.x, TIBCO BPM 3.4, TIBCO MFT 7.1, TIBCO ActiveSpaces 2.1, TIBCO BusinessConnect 5.3, EMS 4.x, TIBCO ActiveMatrix Service Grid, IBM ILOG JRules

PROFESSIONAL EXPERIENCE

Confidential, Hartford

Hadoop Administrator

Responsibilities:

  • Configured the cluster to achieve optimal results by fine-tuning it using the Cloudera distribution.
  • Ensured backup policies were in place for high availability of the cluster at any point in time.
  • Handled commissioning and decommissioning of nodes, targeting load balancing as per the project plan.
  • Implemented NameNode High Availability (HA) and ResourceManager HA for Hadoop clusters, designing automatic failover control using ZooKeeper and Quorum Journal Nodes (see the failover sketch after this list).
  • Implemented Hadoop security solutions (Kerberos) for securing Hadoop clusters.
  • Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs (MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs such as Java programs and shell scripts.
  • Exported managed, mined data from large data volumes to MySQL using Sqoop (see the Sqoop sketch after this list).
  • Configured the Hive Metastore to use a MySQL database to establish multiple user connections to Hive tables.
  • Performed administration using the Hue web UI to create and manage user spaces in HDFS.
  • Configured the Hadoop MapReduce and HDFS core properties as part of performance tuning to achieve high computational performance.
  • Configured Cloudera Manager to send alerts on critical failures in the cluster by integrating it with custom shell scripts (see the alert-script sketch after this list).
  • Maintained comprehensive project, technical, and architectural documentation for enterprise systems.
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management.
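
A minimal sketch of the HA administration referenced above; the NameNode service IDs nn1 and nn2 are illustrative:

    # Check which NameNode is active and which is standby
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    # Exercise the ZKFC-managed automatic failover manually during a test window
    hdfs haadmin -failover nn1 nn2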
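
A representative Sqoop export of the kind described above; the connection string, table, and HDFS path are placeholders:

    # Export curated HDFS output into a MySQL reporting table (all names are illustrative)
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/reports \
      --username etl_user -P \
      --table daily_summary \
      --export-dir /user/etl/output/daily_summary \
      --input-fields-terminated-by '\t'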
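
A minimal sketch of a custom alert script of the sort wired into the cluster alerting; the threshold and the mail recipient are assumptions:

    #!/bin/bash
    # Alert the ops list when HDFS usage crosses a threshold (values are illustrative)
    THRESHOLD=85
    USED=$(hdfs dfsadmin -report | awk '/DFS Used%/{sub(/%/,"",$3); print int($3); exit}')
    if [ "$USED" -ge "$THRESHOLD" ]; then
      echo "HDFS usage at ${USED}% on $(hostname)" | mail -s "HDFS capacity alert" ops-team@example.com
    fi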

Environment: CDH 5.4.5, Hive 1.2.1, HBase 1.1.2, Flume 1.5.2, MapReduce, Sqoop 1.4.6, Spark, Kafka, Nagios, shell scripting, Oozie 4.2.0, ZooKeeper 3.4.6.

Confidential, Atlanta

Hadoop Administrator

Responsibilities:

  • Configured the cluster to achieve optimal results by fine-tuning it using Apache Ambari.
  • Implemented the Fair Scheduler and the Capacity Scheduler to share cluster resources with other teams running MapReduce jobs.
  • Developed data pipelines with combinations of Hive, Pig, and Sqoop jobs scheduled with Oozie.
  • Created data links to transfer data between databases and HDFS, and vice versa, using Sqoop scripts.
  • Developed a real-time pipeline for streaming data using Kafka (see the Kafka sketch after this list).
  • Worked with the Hive data warehouse on HDFS to identify issues and behavioral patterns.
  • Worked with the operations team on commissioning and decommissioning of DataNodes on the Hadoop cluster (see the decommissioning sketch after this list).
  • Performed both major and minor upgrades to the existing cluster, as well as rollbacks to the previous version.
  • Enabled Kerberos for authentication and authorization to ensure cluster security (see the Kerberos sketch after this list).
  • Enabled HA for the NameNode, ResourceManager, YARN configuration, and Hive Metastore.
  • Implemented, maintained, and supported reliable, timely, and reproducible builds for project teams.
  • Interacted with developers and the Enterprise Configuration Management team on changes to best practices and tools, to eliminate inefficient practices and bottlenecks.
  • Designed the cluster so that only one Secondary NameNode daemon runs at any given time.
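
A sketch of standing up the kind of Kafka pipeline mentioned above; the hosts, counts, and topic name are illustrative, and the script names follow the Kafka 0.8-era tooling of this period:

    # Create a topic for the streaming pipeline
    kafka-topics.sh --create --zookeeper zk1:2181 \
      --replication-factor 3 --partitions 6 --topic clickstream

    # Smoke-test the pipeline end to end from the console
    kafka-console-producer.sh --broker-list broker1:9092 --topic clickstream
    kafka-console-consumer.sh --zookeeper zk1:2181 --topic clickstream --from-beginning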
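
A sketch of the decommissioning workflow referenced above; the hostname and exclude-file path are illustrative:

    # Add the host to the exclude file referenced by dfs.hosts.exclude
    echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude

    # Tell the NameNode to re-read the include/exclude lists and begin decommissioning
    hdfs dfsadmin -refreshNodes

    # Watch progress until the node reports Decommissioned
    hdfs dfsadmin -report | grep -A 1 datanode07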
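
A minimal sketch of the Kerberos setup steps implied above; the realm, host, and keytab path are assumptions:

    # Create a service principal and export its keytab
    kadmin.local -q "addprinc -randkey nn/master01.example.com@EXAMPLE.COM"
    kadmin.local -q "xst -k /etc/security/keytabs/nn.service.keytab nn/master01.example.com@EXAMPLE.COM"

    # Verify the keytab before distributing it to the host
    klist -kt /etc/security/keytabs/nn.service.keytab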

Environment: Hadoop HDP 2.1, Oracle, MS SQL, ZooKeeper 3.4.6, Oozie 4.1.0, MapReduce, YARN 2.6.1, Nagios, REST APIs, Amazon Web Services, HDFS, Sqoop 1.4.6, Hive 1.2.1, Pig 0.15.0.

Confidential, Atlanta

Hadoop Administrator and Analyst

Responsibilities:

  • Installed and configured Hadoop clusters for the Dev, QA, and Production environments as per the project plan.
  • Developed Oozie workflows to automate the data-extraction process from data warehouses.
  • Supported MapReduce programs running on the cluster, and monitored and tuned them.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Pig scripts.
  • Worked extensively on creating Hive tables and loading them with data (see the Hive sketch after this list).
  • Worked extensively on loading data into HBase using the HBase shell, the HBase client API, Pig, and Sqoop (see the HBase sketch after this list).
  • Responsible for architecting Hadoop clusters on the Cloudera distribution platform.
  • Performed performance tuning and troubleshooting of various ecosystem jobs by analyzing and reviewing Hadoop log files.
  • Configured Spark Streaming to receive real-time data from Kafka and store the streamed data in HDFS.
  • Involved in creating Hive and Impala tables and loading data using Hive queries.
  • Involved in running Hadoop jobs that processed millions of records.
  • Installed and configured the Hadoop NameNode HA service using ZooKeeper.
  • Worked extensively to manage data coming from different sources into HDFS through Sqoop and Flume.
  • Troubleshot and monitored Hadoop services using Cloudera Manager.
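
A representative Hive DDL-and-load of the kind described above; the table, columns, and path are placeholders:

    # Create a Hive table and load staged HDFS data into it (names are illustrative)
    hive -e "
      CREATE TABLE IF NOT EXISTS web_logs (
        ip STRING, ts STRING, url STRING, status INT
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
      LOAD DATA INPATH '/user/etl/incoming/web_logs' INTO TABLE web_logs;
    "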
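
A minimal HBase shell session of the sort used for the loading described above; the table, column family, and row values are illustrative:

    # Create a table, insert a test row, and read it back
    hbase shell <<'EOF'
    create 'user_profiles', 'info'
    put 'user_profiles', 'user123', 'info:email', 'user123@example.com'
    scan 'user_profiles', {LIMIT => 5}
    EOF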

Environment: CentOS 4, CDH4, Hive, HDFS, Sqoop, FTP, Apache, SMTP, ETL, Talend, SQL, Java, VMware, HBase, Apache Tomcat.

Confidential, CA

System Administrator

Responsibilities:

  • Responsible for implementation and ongoing administration of the EAI infrastructure.
  • Worked on requirement gathering, analysis, and design for setting up the TIBCO environment.
  • Installed, configured, and maintained various TIBCO products.
  • Designed, configured, and managed the backup and disaster recovery for the EAI server.
  • Analyzed business requirements and identified the mapping documents required for system and functional testing efforts across all test scenarios.
  • Set up the alerting mechanism using TIBCO Hawk rulebase implementations at the application and server levels.
  • Worked on creating EAR files and deploying BW projects in various environments (see the deployment sketch after this list).
  • Implemented various technical solutions using the TIBCO suite and messaging.
  • Worked on configuring File Adapter publication services to get data from files.
  • Worked in all stages of the Software Development Life Cycle (SDLC).
  • Worked on the deployment process of several ongoing EAI projects.
  • Maintained Middleware application environments and the deployment process.
  • Performed extensive Middleware administration work.
  • Handled server builds and domain setup for the EAI environment.
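
A sketch of a command-line EAR deployment of the kind described above, using TIBCO's AppManage utility; the domain, application path, and credentials are illustrative, and the exact flag set varies by TRA version:

    # Deploy a BW EAR into a target domain via AppManage
    AppManage -deploy \
      -ear /deploy/OrderService.ear \
      -deployconfig /deploy/OrderService.xml \
      -app EAI/OrderService \
      -domain PROD_DOMAIN \
      -user admin -pw '****'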

Environment: TIBCO BW 5.9, TIBCO Hawk 4.6, TIBCO Administrator 5.6, EMS 5.1, Red Hat Linux, SAP R/3, PeopleSoft adapter, MFT 7.0

Confidential, MN

Consultant

Responsibilities:

  • Understood and articulated complex business issues; produced LLDs and HLDs for various projects.
  • Ensured a smooth transition between pre-sales and technical work.
  • Worked in all stages of the Software Development Life Cycle (SDLC).
  • Responsible for delivering presentations to customers and partners.
  • Produced written proposals and technical responses to RFIs and RFPs.
  • Conducted hands-on demonstrations and knowledge transfer.
  • Developed customer solution proposals and supporting documentation as per the plan.
  • Led a team of 10 members.

Environment: Microsoft Office, Middleware suite of products, TIBCO BW 5.7, TIBCO Hawk 4.6, TIBCO Administrator 5.x, EMS 4.5, TIBCO BusinessConnect 5.3 (B2B), MFT 7.0

Confidential, NYC

Project Lead

Responsibilities:

  • Developed the technical design document and interface design document based on the requirements documents.
  • Worked on configuring File Adapter publication services to get data from files.
  • Worked on developing interfaces that form part of the adapter applications.
  • Worked on developing various mapping-matrix documents for data transformations.
  • Participated in code reviews of the developed BW application code.
  • Created .ear files from the developed projects and deployed the applications into different environments using the TIBCO Administrator GUI.
  • Developed various Hawk rulebases to manage and monitor the deployed processes.
  • Monitored the BW engine, Hawk, and the File and SAP adapters.
  • Provided extended support: go-live and production support.
  • Managed and led a team of 16 members.

Environment: TIBCO BW 5.6, TIBCO Administrator 5.3, Hawk 4.6, File and SAP adapters, BusinessConnect 5.3, MFT 7.0, Red Hat Linux, DB Symphony tools, JDK 1.5

Confidential, Seattle

Consultant

Responsibilities:

  • Coordinated with the onsite team to gather the requirements.
  • Developed the technical design document and interface design document based on the requirements documents.
  • Developed various TIBCO ActiveMatrix BusinessWorks processes using different types of adapters.
  • Designed various input, output, and fault messages using XSD schemas.
  • Monitored the BW engine, Hawk, and the File and SAP adapters.
  • Designed SID and DDD documents for web services and Hawk rules.
  • Used the TIBCO General Interface solution for rapidly building and deploying rich internet applications.
  • Generated unit tests for each operation using SoapUI (see the request sketch after this list).
  • Modified old services as per the business requirements.
  • Created new web services as per the requirements.
  • Configured the services with RSP.
  • Provided extended support: go-live and production support.
  • Led a team of 12 members.
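
The unit tests themselves were built in SoapUI; as a shell-level sketch of the same kind of per-operation request, an equivalent raw SOAP call with curl is shown below. The endpoint, SOAPAction, namespace, and payload are all hypothetical:

    # Exercise one web-service operation with a raw SOAP request
    curl -s -X POST http://esb.example.com:8080/OrderService \
      -H 'Content-Type: text/xml; charset=utf-8' \
      -H 'SOAPAction: "getOrderStatus"' \
      --data-binary @- <<'EOF'
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                      xmlns:ord="http://example.com/order">
      <soapenv:Body>
        <ord:getOrderStatusRequest>
          <ord:orderId>12345</ord:orderId>
        </ord:getOrderStatusRequest>
      </soapenv:Body>
    </soapenv:Envelope>
    EOF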

Environment: TIBCO BW 5.6, TIBCO Administrator 5.1, Hawk 4.5, adapters, Spring Framework, web services, iBatis, SoapUI, WSDL, Tuxedo, WebLogic 10.3, JDK 1.5, Eclipse 3.3, AccuRev, ActiveMatrix

Confidential

Technical Lead

Responsibilities:

  • Designed, developed, and implemented Middleware applications using the TIBCO suite of products.
  • Developed various Hawk rulebases to manage and monitor the deployed processes.
  • Installed, configured, upgraded, and hot-fixed TIBCO components.
  • Generated unit tests for each operation using SoapUI.
  • Coordinated the development, test, and configuration teams.
  • Worked extensively on Servlets and JSPs based on the MVC pattern, using the Struts and Spring frameworks.
  • Used the HP Service Center tool for call tracking in Incident Management and Change Management.
  • Worked on the testing and debugging process.
  • Worked on the design of the application using UML/Rational Rose.
  • Worked on coding in Java.
  • Worked on developing the presentation layer with JSP.
  • Developed the front-end logic and validations.
  • Provided extended support: go-live and production support.
  • Developed JDBC code for backend processing.
  • Configured the connection pools and security for the server.

Environment: TIBCO BW 5.3, TIBCO Administrator 5.0, TIBCO Hawk 4.5, TIBCO adapters, JSP 2.0, JavaScript, WebLogic 9.1, JDK 1.5, Eclipse 3.0, Unix, JDK 1.4

Confidential

Developer

Responsibilities:

  • Designed and implemented front-end business logic and validations using Core Java.
  • Developed the server-side programming using Servlets.
  • Developed the front end using JSP and HTML.
  • Established JDBC API calls.
  • Worked on Oracle database design.
  • Prepared test plans and test cases for various modules.
  • Provided production support after deployment.
  • Worked on the design of the application using UML/Rational Rose.
  • Designed, developed, and implemented various interfaces.
  • Provided extended support: go-live and production support.
