
Hadoop Administrator/Technologies Manager Resume

PROFESSIONAL SUMMARY

  • 14+ years of IT experience in roles including Developer, Database Administrator, Infrastructure Technology Manager, and Big Data Hadoop Administrator/Architect.
  • More than 2 years of ongoing support for all non-functional aspects of the Big Data Competency on Big Data Analytics projects for a Tier-1 telco, such as Charging Business Analytics and an Operational Data Store/Data Hub based on the ALDM (telco-specific data model).
  • Day-to-day Hadoop admin support for multiple non-production clusters, both non-secured and secured, for NFT/UAT.
  • Solid experience in data modeling and ETL processing.
  • Expertise in Hadoop ecosystem tools such as Hive, Pig, Sqoop, Impala, Oozie, and ZooKeeper.
  • Hands-on experience with NoSQL databases such as HBase.
  • Working experience in data mapping, data ingestion, and transforming and storing large, real-time data sets in Hadoop clusters.
  • Hands-on experience working with large volumes of structured and unstructured data.
  • Handled HDP and Cloudera distributions on clusters ranging from 4 to 20 nodes across development/testing (non-production) environments.
  • Working experience with a large 50-node HDP Hadoop cluster processing 800 TB of data in a production environment.
  • Experience setting up multi-node big data clusters with Hortonworks and Cloudera distributions, covering end-to-end non-functional aspects.
  • Strong analytical and presentation skills; able to communicate complex quantitative analysis in a clear, precise, and actionable manner.
  • Hands-on experience processing streaming data using Apache Spark.
  • Infrastructure setup and capacity planning for Hadoop clusters.

TECHNICAL SKILLS

Big Data Skills/Tools: MapReduce, HDFS, Hadoop ecosystem (Sqoop, Impala, Hive, Pig, Oozie, ZooKeeper, Flume, Apache Spark, Python, Scala, HBase, YARN, Kerberos).

Hardware & Operating Systems: Windows, UNIX, Solaris, HP-UX, Red Hat Enterprise Linux 6.x

Database Skills/Tools: Oracle, MySQL, Postgres, in-memory DB (Oracle TimesTen), GoldenGate replication.

Programming Languages & Tools: COBOL, Developer 2000/Designer 2000, Awk, Unix shells, Python.

Engagement Experience: Architecture and Design, Database Migration, Database Maintenance, Logical and Physical Data Modeling, Project Management, Agile/Lean Methodologies, Project Planning, Technical Architecture, MS Project.

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Administrator/Technologies Manager

Responsibilities:

  • Participated in discussions and guided teams of developers, testers, and operations staff.
  • Translated complex non-functional and technical requirements into detailed designs; created the HLAD and installation documents.
  • Installed clusters and maintained them through routine support to the Big Data Competency.
  • Migrated data from an Oracle data warehouse into HDFS using Sqoop; prepared the migration strategy for large data volumes.
  • Responsible for technical design and review of the data dictionary.
  • Analyzed data flows and use cases.
  • Involved in preparing production-like environments; provided input to the development/testing teams on clusters based on HDP and Cloudera distributions.
  • Used Python along with shell scripts for maintenance, extract, and data-transfer jobs.
  • Involved in various POCs to choose the right big data tools for business use cases.
  • Proposed best practices and standards.
  • Mentored and on-boarded engineers new to Hadoop, getting them up to speed quickly; provided guidance in operationalizing Hadoop and its ecosystem to solve business problems.
  • Extensive knowledge of Sqoop ETL jobs from different RDBMS sources; automated Sqoop jobs to refresh data from the various data sources.
  • Applied change control on the production environment during maintenance periods.
  • Involved in commissioning and decommissioning nodes.
  • Assisted with performance monitoring and alert setup using Ganglia and Nagios.
  • Reviewed Hadoop log files to understand and resolve issues.
  • Project planning for Hadoop-based systems: environment-creation strategies, capacity planning, and cluster sizing.

Environment: Hortonworks HDP 2.2, HDFS, MapReduce, Hive, Pig, Sqoop, HBase, YARN, Tableau, Oracle, Shell Script, Oozie, Python.
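The Sqoop-based Oracle-to-HDFS migration above would typically be driven from Python/shell automation. A minimal sketch of assembling such an import command follows; the JDBC URL, username, table, and target directory are hypothetical placeholders, not values from the actual environment:

```python
# Minimal sketch of automating a Sqoop import from Oracle into HDFS.
# All connection details, table names, and paths are hypothetical.

def build_sqoop_import(jdbc_url, username, table, target_dir, num_mappers=4):
    """Assemble the argument list for `sqoop import`.

    Prompting for the password with -P (or using a protected password
    file) is preferable to putting it on the command line.
    """
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "-P",                          # prompt for password at runtime
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
    ]

cmd = build_sqoop_import(
    jdbc_url="jdbc:oracle:thin:@dwhost:1521:ORCL",  # hypothetical host/SID
    username="etl_user",
    table="CUSTOMER_USAGE",
    target_dir="/data/raw/customer_usage",
)
print(" ".join(cmd))
```

In practice a builder like this would be executed via `subprocess` and scheduled from Oozie or cron, which is one way the automated refresh jobs described above could be wired up.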

Confidential

Hadoop Developer

Responsibilities:

  • Developed and configured Flume agents and connectors to pipeline server log data into HDFS.
  • Set up Kerberos to secure the cluster.
  • Used Hive schemas to define the data.
  • Imported user data from Oracle using Sqoop.
  • Responsible for technical design and review of the data dictionary.
  • Used a Python MapReduce package along with shell scripts for developing the code logic.
  • Designed and created Hive external tables using a shared metastore (instead of Derby) with partitioning, dynamic partitioning, and bucketing.
  • Migrated current project data extracts from sources such as Oracle and MySQL to Hive using Sqoop.
  • Automated scripts for daily refreshes to keep data up to date on Hadoop; monitored and maintained the cluster.
  • HDFS capacity of 40 TB on a 5-node cluster, each node with 24 cores and 128 GB RAM.

Environment: Cloudera CDH 5.x, HDFS, Python, MapReduce, Hive, Pig, Sqoop, HBase, Impala, Flume, Oracle, Shell Script, Oozie.
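The Python MapReduce logic mentioned above, applied to Flume-ingested server logs, can be sketched as a Hadoop-streaming-style mapper and reducer. The log-line format used here is a hypothetical example, not the project's actual schema:

```python
# Hadoop-streaming-style mapper and reducer in pure Python, sketching the
# kind of "Python map reduce" code logic described above. The field layout
# of the log lines ('METHOD /path STATUS') is a hypothetical example.

from itertools import groupby

def mapper(lines):
    """Emit (status_code, 1) for each log line like 'GET /path 200'."""
    for line in lines:
        parts = line.split()
        if len(parts) >= 3:
            yield parts[2], 1

def reducer(pairs):
    """Sum counts per key. Input must be sorted by key, which Hadoop
    streaming guarantees between the map and reduce phases."""
    for key, group in groupby(pairs, key=lambda kv: kv[0]):
        yield key, sum(v for _, v in group)

logs = [
    "GET /index.html 200",
    "GET /missing 404",
    "POST /login 200",
]
counts = dict(reducer(sorted(mapper(logs))))
print(counts)  # {'200': 2, '404': 1}
```

Because the mapper and reducer are plain functions over iterators, the same logic can be unit-tested locally on sample lines before being submitted to the cluster as a streaming job.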

Confidential

Dy. Manager

Responsibilities:

  • Led a datacenter team for the Core Banking Project (full branch computerization), an enterprise-wide, centralized web-based solution (B@ncs24 FNS) connecting all branches of State Bank of India, built on a .NET Framework frontend with Oracle 9i as the centralized database on Unix (HP-UX 11i, Superdome).
  • Involved in data mapping and data migration projects. Managed around 100 test, development, and production databases in Unix and Windows environments, ranging in size from multiple MB to multiple TB; performed routine DBA activities covering all aspects of core DBA work.
  • Developed projects in Developer 2000 (Forms/Reports) and PL/SQL packages, procedures, and functions: Mutual Welfare Scheme, Branch Performance Report/Budget, Personal Data System, Standard Data Systems, and a Management Information System for Loans & Advances.
  • Delegated coding and testing work to team members; involved in data modeling, design and creation of the database and schema objects, writing packages, stored procedures, and functions, backup/restore, security aspects, and data migration (SQL*Loader) for the projects.
  • Involved in the full application and database life cycle of the project: system study, analysis, and design.
  • Wrote a number of COBOL programs for the back-office banking project.

Environment: Unix, COBOL, Oracle, Developer 2000, Designer 2000.
