
Hadoop / Data Platform Architect Resume

Minneapolis, Minnesota

SUMMARY:

  • Over 7 years' experience in commissioning, decommissioning, balancing, and managing nodes, and tuning servers for optimal cluster performance.
  • Around 5 years of professional experience, including extensive Hadoop and Linux experience.
  • Experienced in installing, configuring, supporting, and monitoring 100+ node Hadoop clusters using Cloudera Manager and Hortonworks distributions.
  • Experience in performing various major and minor Hadoop upgrades in large environments.
  • As an administrator, involved in cluster maintenance, troubleshooting, and monitoring, and followed proper backup and recovery strategies.
  • Experience in HDFS data storage and support for running MapReduce jobs.
  • Involved in Infrastructure set up and installation of HDP stack on Amazon Cloud.
  • Experience with ingesting data from RDBMS sources such as Oracle, SQL Server, and Teradata into HDFS using Sqoop.
  • Experience in big data technologies: Hadoop HDFS, MapReduce, Pig, Hive, Oozie, Sqoop, Zookeeper, and NoSQL.
  • Experience in benchmarking and in performing backup and disaster recovery of NameNode metadata and important sensitive data residing on the cluster.
  • Experience in designing and implementing HDFS access controls, directory and file permissions, and user authorization that facilitate stable, secure access for multiple users in a large multi-tenant cluster.
  • Experience in using Ambari for Installation and management of Hadoop clusters.
  • Experience in Ansible and related tools for configuration management.
  • Experience working in large environments and leading infrastructure support and operations.
  • Migrated applications from existing systems such as MySQL, Oracle, DB2, and Teradata to Hadoop.
  • Expertise with Hadoop, MapReduce, Pig, Sqoop, Oozie, and Hive.
  • Benchmarked Hadoop clusters to validate hardware before and after installation and tweaked configurations to obtain better performance.
  • Experience in administering the Linux systems to deploy Hadoop cluster and monitoring the cluster.
  • Great team player and quick learner with effective communication, motivation, and organizational skills combined with attention to detail and business improvements.
  • Adequate knowledge and working experience in Agile & Waterfall methodologies.

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, Power Pivot, Puppet, Oozie, Zookeeper, Kafka, Spark

Big data Analytics: Datameer 2.0.5

Frameworks: MVC, Struts, Hibernate, Spring

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web Servers: WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

ETL Tools: Informatica, Talend

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, Minnesota

Hadoop / Data Platform Architect

Responsibilities:

  • Designed and implemented an end-to-end big data platform solution on Teradata Appliance and the AWS cloud.
  • Managed Hadoop clusters in production, development, and disaster recovery environments.
  • Implemented Teradata Aster, a data science tool, and integrated it with Hadoop.
  • Integrated Informatica BDM and Informatica Cloud with Hadoop.
  • Implemented IBM Guardium to perform enterprise-level monitoring.
  • Integrated Splunk with Hadoop for log aggregation and monitoring dashboards.
  • Provisioned, installed, configured, monitored, and maintained HDFS, YARN, HBase, Flume, Sqoop, Oozie, Pig, Hive, Ranger, Ranger KMS, Falcon, SmartSense, Storm, and Kafka.
  • Recovered from node failures and troubleshot common Hadoop cluster issues.
  • Scripted Hadoop package installation and configuration to support fully automated deployments.
  • Automated Hadoop deployment using Ambari blueprints and the Ambari REST API (see the sketch at the end of this section).
  • Responsible for building a cluster on HDP 2.5.
  • Performed major Hadoop upgrades, including upgrading from HDP 2.5.3 to HDP 2.6.4.
  • Worked closely with developers to investigate problems and make changes to the Hadoop environment and associated applications.
  • Troubleshot many cloud-related issues such as DataNodes going down, network failures, login issues, and missing data blocks.
  • Proven results-oriented person with a focus on delivery.
  • Performed importing and exporting of data into HDFS and Hive using Sqoop.
  • Performed HDFS cluster support and maintenance tasks such as adding and removing nodes without any impact on running jobs and data.

Environment: HDFS, MapReduce, Hive, Pig, Sqoop, Ranger, Ranger KMS, Falcon, SmartSense, Storm, Kafka.
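
A minimal sketch, in Python, of the kind of blueprint-driven automation referenced above. It registers an Ambari blueprint and creates a cluster from it through the Ambari REST API; the host names, credentials, and blueprint contents are illustrative placeholders, not the project's actual values.

    import json
    import requests

    AMBARI = "http://ambari-host.example.com:8080/api/v1"  # placeholder Ambari server
    AUTH = ("admin", "admin")                               # placeholder credentials
    HEADERS = {"X-Requested-By": "ambari"}                  # header required by the Ambari API

    def register_blueprint(name, blueprint):
        """Register a cluster blueprint (stack version, host groups, configs)."""
        r = requests.post(f"{AMBARI}/blueprints/{name}",
                          auth=AUTH, headers=HEADERS, data=json.dumps(blueprint))
        r.raise_for_status()

    def create_cluster(cluster_name, template):
        """Instantiate a cluster from a registered blueprint on concrete hosts."""
        r = requests.post(f"{AMBARI}/clusters/{cluster_name}",
                          auth=AUTH, headers=HEADERS, data=json.dumps(template))
        r.raise_for_status()

    if __name__ == "__main__":
        blueprint = {
            "Blueprints": {"stack_name": "HDP", "stack_version": "2.6"},
            "host_groups": [{"name": "master", "cardinality": "1",
                             "components": [{"name": "NAMENODE"},
                                            {"name": "RESOURCEMANAGER"}]}],
        }
        template = {
            "blueprint": "demo-blueprint",
            "host_groups": [{"name": "master",
                             "hosts": [{"fqdn": "master1.example.com"}]}],
        }
        register_blueprint("demo-blueprint", blueprint)
        create_cluster("demo-cluster", template)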

Confidential, Sunnyvale, CA

Hadoop Engineer

Responsibilities:

  • Involved in analysis, design, implementation, and bug-fixing activities.
  • Involved in reviewing functional and technical specification documents.
  • Created and configured domains in production, development and testing environments using configuration wizard.
  • Involved in creating and configuring the clusters in production environment and deploying the applications on clusters.
  • Deployed and tested the application using Tomcat web server.
  • Analyzed the specifications provided by the clients.
  • Involved in the design of the application.
  • Understood functional requirements and design documents.
  • Developed use case diagrams, class diagrams, sequence diagrams, and data flow diagrams.
  • Coordinated with other functional consultants.
  • Performed web-related development with JSP, AJAX, HTML, XML, XSLT, and CSS.
  • Created and enhanced stored procedures, PL/SQL, and SQL for the Oracle 9i RDBMS.
  • Designed and implemented a generic parser framework using a SAX parser to parse XML documents that store SQL.
  • Identified the data required to be pulled into Hadoop and created the required Sqoop scripts, which were scheduled periodically to migrate data to the Hadoop environment (a minimal wrapper sketch appears at the end of this section).
  • Provided further maintenance and support, which involved working with the client to solve problems, including major bug fixes.

Environment: Java 1.4, WebLogic Server 9.0, Oracle 10g, Web Services Monitoring, WebDrive, UNIX/Linux, Hadoop, Hive, JavaScript, HTML, CSS, XML
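
A minimal sketch, in Python, of the kind of periodically scheduled Sqoop import script referenced above; it assumes the Sqoop CLI is on the PATH, and the JDBC URL, table name, and HDFS paths are illustrative placeholders.

    import subprocess

    def sqoop_import(jdbc_url, username, password_file, table, target_dir, mappers=4):
        """Build and run a `sqoop import` command to land an RDBMS table in HDFS."""
        cmd = [
            "sqoop", "import",
            "--connect", jdbc_url,
            "--username", username,
            "--password-file", password_file,  # HDFS file holding the DB password
            "--table", table,
            "--target-dir", target_dir,
            "--num-mappers", str(mappers),
        ]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        # Example invocation with placeholder values; in practice this would be
        # driven by a cron or Oozie schedule.
        sqoop_import(
            jdbc_url="jdbc:oracle:thin:@//db-host.example.com:1521/ORCL",
            username="etl_user",
            password_file="/user/etl/.sqoop_pwd",
            table="ORDERS",
            target_dir="/data/raw/orders",
        )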

Confidential, Newark, NJ

Hadoop Developer

Responsibilities:

  • Responsible for coding MapReduce programs and Hive queries, and for testing and debugging the MapReduce programs.
  • Developed Pig Latin scripts to analyze large data sets in areas where extensive coding needed to be reduced.
  • Used the Sqoop tool to extract data from relational databases into Hadoop.
  • Worked closely with data warehouse architect and business intelligence analyst to develop solutions.
  • Responsible for performing peer code reviews, troubleshooting issues and maintaining status report.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries, which invoke and run MapReduce jobs in the backend (see the Hive sketch at the end of this section).
  • Installed and configured Hadoop cluster in DEV, QA and Production environments.
  • Performed upgrade to the existing Hadoop clusters.
  • Enabled Kerberos for Hadoop cluster authentication and integrated it with Active Directory for managing users and application groups.
  • Implemented commissioning and decommissioning of nodes in the existing cluster.
  • Worked with systems engineering team for planning new Hadoop environment deployments, expansion of existing Hadoop clusters.
  • Responsible for data ingestions using Talend.
  • Designed and presented a plan for a POC on Impala.
  • Experienced in migrating HiveQL to Impala to minimize query response time.
  • Monitoring workload, job performance and capacity planning using Cloudera Manager.
  • Worked with application teams to install OS level updates, patches and version upgrades required for Hadoop cluster environments.
  • Supported in setting up QA environment and updating configurations for implementing scripts with Pig, Hive and Sqoop.

Environment: Hadoop, HDFS, MapReduce, Hive, Flume, Sqoop, Cloudera CDH4, HBase, Oozie, Pig, AWS EC2 cloud, Eclipse, Talend.
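
A minimal sketch, in Python, of the Hive table creation and querying flow referenced above; it assumes beeline is available, and the JDBC URL, table definition, and HDFS location are illustrative placeholders.

    import subprocess

    HIVE_JDBC = "jdbc:hive2://hiveserver2.example.com:10000/default"  # placeholder URL

    DDL_AND_QUERY = """
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
        ts       STRING,
        user_id  STRING,
        url      STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'
    LOCATION '/data/raw/web_logs';

    -- A simple aggregation; Hive compiles this into MapReduce jobs in the backend.
    SELECT user_id, COUNT(*) AS hits
    FROM web_logs
    GROUP BY user_id;
    """

    def run_hive(sql):
        """Submit HiveQL through beeline; Hive plans and runs the MapReduce jobs."""
        subprocess.run(["beeline", "-u", HIVE_JDBC, "-e", sql], check=True)

    if __name__ == "__main__":
        run_hive(DDL_AND_QUERY)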

Confidential

Linux System Administrator

Responsibilities:

  • Installed and configured Red Hat Linux (4.x) and Solaris (9.x, 10.x) on new server builds as well as during upgrades.
  • Provided on-call support and escalation support during business hours. Performed package management using the YUM repository and RPM.
  • Created file systems using the Red Hat volume manager and performed regular health checks on all Linux servers.
  • Configured printers on the Solaris and Linux servers and installed third-party software.
  • Performed performance tuning and management for Linux/AIX servers and worked with the application/database teams to resolve issues.
  • Upgraded wireless network infrastructure and firewall.
  • Gathered requirements and changes for the Remedy system from IT staff and the user community, prepared technical assessments, and delivered technical support, training, and administration.
  • Performed server updates, patches and upgrades using YUM and RPM.
  • Installed firmware upgrades and kernel patches, and performed systems configuration and performance tuning on Linux systems.
  • Extensive knowledge of server administration, kernel upgrades, and patch deployment, and applied all firewall and security policies with emphasis on maintaining best practices.
  • Installed, configured, and customized services such as Sendmail, Apache, and FTP servers to meet user needs and requirements.
  • Monitored system capacity and performance using tools like vmstat and iostat (see the monitoring sketch at the end of this section).
  • Performed kernel and database configuration optimization to limit I/O resource utilization on disks.

Environment: Linux, Red Hat, vmstat, iostat.
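
A minimal sketch, in Python, of the kind of ad-hoc capacity and performance check described above: sample vmstat and flag high CPU I/O wait. The alert threshold is an illustrative placeholder.

    import subprocess

    IOWAIT_THRESHOLD = 20  # percent; placeholder alerting threshold

    def sample_vmstat():
        """Run `vmstat 1 2` and return the last one-second sample as a dict."""
        out = subprocess.run(["vmstat", "1", "2"], check=True,
                             capture_output=True, text=True).stdout.splitlines()
        header = out[1].split()   # column names (r, b, ..., us, sy, id, wa, ...)
        values = out[-1].split()  # most recent sample
        return dict(zip(header, values))

    if __name__ == "__main__":
        stats = sample_vmstat()
        iowait = int(stats["wa"])
        if iowait > IOWAIT_THRESHOLD:
            print(f"WARNING: CPU I/O wait is {iowait}% - check disks with iostat -x 1")
        else:
            print(f"I/O wait OK at {iowait}%")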
