
Hadoop Admin Resume


Boston, MA

SUMMARY

  • Confidential is a seasoned IT professional with over 11 years of experience, including Big Data, cloud (Hadoop, AWS), and OpenStack projects.
  • Led and managed large, complex information systems and projects, with experience in system integration, IT solution architecture, multi-vendor service management, and business continuity management.
  • Consulting CTO and development partner for several start-ups in the SaaS, Cloud Computing, Web 2.0 and SOA space.
  • 10 years of IT experience in storage and UNIX (Linux, Solaris, AIX) server administration.
  • 5 years of experience in WAS, WESB, FileNet, WSRR, and DB2 infrastructure planning and design.
  • ITIL v3 certified and Cloudera Certified Hadoop Administrator.
  • Strong expertise in SLA management and root cause analysis.
  • 4 years of experience in Hadoop, AWS, and OpenStack.
  • 5 years of experience working with Agile methodologies.
  • Technical Reviewer of 30+ technical books on Java, J2EE, Weblogic, SOA, and every major mobile platform (iPhone, BlackBerry and Android).
  • Speaker at multiple conferences
  • Enabled easy sharing of technical resources and development costs, and the centralization of infrastructure across a large user base.
  • Cloud & SaaS Platforms: Amazon EC2, Google App Engine, Force.com, Workday, Concur, ServiceNow etc.
  • J2EE Platforms such as JBoss, Tomcat, WebLogic, AquaLogic and some WebSphere.
  • Open Source & Collaboration Platforms such as LAMP, Alfresco, SharePoint, Drupal, Facebook, WordPress, Twitter etc.
  • Cassandra deployment.
  • Extensive SOA and Web Services experience on different technology stacks.
  • Hands-on experience securing systems end to end with TLS/SSL.
  • Built easily manageable systems using customized deployment and server management scripts.
  • 4 years of experience with RabbitMQ deployment, including clustering and federation (see the sketch following this list).
  • Web 2.0 Platforms such as Blogs, Mashups, Facebook API, and other Social Technology.
  • Architectural experience: Python, Django, Ruby on Rails, .NET, RIA (Silverlight, Adobe Flex/AIR), and mobile development for BlackBerry, Android, and iPhone.
  • 4 years of experience as a Hadoop developer and administrator.
  • Excellent installation skills for Hadoop components: HDFS, MapReduce, HBase.
  • Proficient in supporting Linux operating systems: CentOS, Debian, Fedora.
  • Solid understanding of open-source monitoring tools: Nagios, Ganglia, Cloudera Manager.
  • Familiarity with networking: TCP/IP, firewalls, DNS.
  • Exceptional in overseeing system administration operations: performance tuning, storage capacity management, system dump analysis.
  • Skilled in programming languages: Unix shell scripting, SQL, C, Java.
  • Well-versed in databases: Oracle, DB2, MySQL.
  • Specialist in version control tools: SVN, VSS.
  • Work with various project teams to identify opportunities to optimize client infrastructure.
  • Develop solutions to take to market and support the sales cycle.
  • Deliver solution presentations to customers, including clarification workshops, solution workshops, and solution presentations.
  • Drive and participate in solution discussions with partners (where applicable), covering capability, technical solution, and pricing.
  • Coordinate with other business units (BUs) to provide an integrated solution.
  • Collaborate with Business Analysts in response development
  • Leverage company capabilities, IP, and alliances as appropriate in the overall solution model.
  • Optimized utilization of data systems and improved the efficiency of security and storage solutions through control and prevention of sensitive data loss.
  • Good experience with CloudStack and OpenStack.
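
A minimal sketch of the RabbitMQ clustering work noted above, assuming a fresh node being joined to an existing cluster; the node name rabbit@mq-node1 is a placeholder, not from an actual deployment:

    # Run on the node being added; rabbit@mq-node1 is a hypothetical seed node.
    rabbitmqctl stop_app
    rabbitmqctl reset                        # clears any stale local state first
    rabbitmqctl join_cluster rabbit@mq-node1
    rabbitmqctl start_app
    rabbitmqctl cluster_status               # verify every node is listed as running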

TECHNICAL SKILLS

Storage: EMC VMAX, DMX-3000/4000, Symmetrix, CLARiiON, HP XP, EVA, MSA, P7000, IBM DS 8300/8700/4700/8000, Hitachi USP, AMS, NetApp FAS series arrays, Symantec

Cloud: Huawei, Cisco, VMware, Amazon, Eucalyptus, Hadoop

Operating Systems: Windows NT/2000/XP/2003/Vista/7, Solaris 8/9/10, HP-UX, Red Hat Linux

Database: MS SQL Server, Oracle, MS Access, Teradata

Performance monitoring tools: HP BSM

NMS Tools: Nagios, Confidential

Backup Tool: Tivoli

Open source: Nagios, Apache, OpenStack, Confidential

PROFESSIONAL EXPERIENCE

Confidential - Boston, MA

Hadoop Admin

Responsibilities:

  • Leading Hadoop-based data mining and graph algorithms research team.
  • Working closely with both internal and external cyber security customers.
  • Researching and implementing algorithms for large power-law skewed datasets.
  • Developing an interactive graph visualization tool based on the Prefuse visualization toolkit.
  • Developing machine-learning capability via Apache Mahout.
  • Leading development of terabyte-scale network packet analysis capability.
  • Leading research effort to tightly integrate Hadoop and HPC systems.
  • Purchased, deployed, and administered a 70-node Hadoop cluster; also administered two smaller clusters.
  • Compared Hadoop to commercial big-data appliances from Netezza, XtremeData, and LexisNexis; published and presented the results.
  • Developed a capability to extract and collect malicious URLs from spam and botnet-generated email (see the sketch following this list).
  • Efficiently collected over 5 million unique domains and URLs.
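
A minimal sketch of the URL-collection capability described above, using a Hadoop Streaming job over spam text staged in HDFS; the streaming jar path, HDFS paths, and script name are assumptions:

    # extract_urls.sh - streaming mapper: emit every URL seen on stdin
    # (the trailing "|| true" keeps splits with no matches from failing the task)
    grep -oE 'https?://[^[:space:]"]+' || true

    # Submit the job: mappers emit URLs, the shuffle sorts them, and uniq
    # as the reducer drops adjacent duplicates.
    hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
      -files extract_urls.sh \
      -input /data/spam/raw \
      -output /data/spam/urls \
      -mapper 'bash extract_urls.sh' \
      -reducer uniq

    # Pull the distinct URLs down for further analysis.
    hdfs dfs -cat /data/spam/urls/part-* | sort -u > unique_urls.txt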

Confidential, Charlotte, NC

Senior Hadoop admin

Responsibilities:

  • Led the definition of an outsourcing and centralization strategy for application development and maintenance, demonstrating how the company could save $3 million over one year; a combination of this plan and another is currently being implemented at UTAS.
  • Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
  • Handled 24x7 support for Hadoop issues.
  • Involved in design and implementation of a service layer over the HBase database.
  • Assisted in the design, development, and architecture of Hadoop and HBase systems.
  • Coordinated with technical teams on installation of Hadoop and related third-party applications.
  • Formulated procedures for planning and execution of system upgrades for all existing Hadoop clusters.
  • Supported technical team members for automation, installation and configuration tasks.
  • Suggested improvement processes for all process automation scripts and tasks.
  • Provided technical assistance for configuration, administration and monitoring of Hadoop clusters.
  • Conducted detailed analysis of system and application architecture components as per functional requirements.
  • Created scripts to form EC2 clusters for training and for processing.
  • Implemented performance monitoring tools (HP)
  • Worked on Amazon cloud Migration project.
  • Worked on the Amazon Elastic Compute Cloud (EC2) project using Agile methodologies.
  • Assisted business analysts in the post-migration project.
  • Reviewed firewall settings (security group) and updated on Amazon AWS.
  • Created access documents for level/tier 3/4 production support groups
  • Created Cassandra Advanced Data Modeling course for DataStax
  • Working on Agile methodologies.
  • Participated in evaluation and selection of new technologies to support system efficiency
  • Imported data from sources such as Oracle and Comptel servers into HDFS using tools such as Sqoop and MapReduce (see the sketch following this list).
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior such as call frequency and top calling customers.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
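
A minimal sketch of the Oracle-to-HDFS ingestion and Hive analysis described above; the JDBC connection string, credentials, table, and column names are hypothetical:

    # Pull a call-records table from Oracle into HDFS with Sqoop (4 parallel mappers).
    sqoop import \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
      --username etl_user -P \
      --table CDR_RECORDS \
      --target-dir /data/cdr/raw \
      --num-mappers 4

    # Top calling customers, assuming an external Hive table cdr_records
    # has been defined over /data/cdr/raw.
    hive -e "SELECT caller_id, COUNT(*) AS calls
             FROM cdr_records
             GROUP BY caller_id
             ORDER BY calls DESC
             LIMIT 10;"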

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, Toad 9.6, Windows 2000, Solaris, Linux

Confidential, Wilmington, DE

Cloud admin

Responsibilities:

  • Assisted in creating ETL processes to transform data from existing RDBMS systems.
  • Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
  • Involved in design and implementation of a service layer over the HBase database.
  • Imported data from sources such as Oracle and Comptel servers into HDFS using tools such as Sqoop and MapReduce.
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior such as call frequency and top calling customers.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
  • Designed and developed scalable and custom Hadoop solutions as per dynamic data needs.
  • Coordinated with the technical team on production deployment and maintenance of software applications.
  • Provided operational support services relating to Hadoop infrastructure and application installation.
  • Supported technical team members in management and review of Hadoop log files and data backups.
  • Participated in development and execution of system and disaster recovery processes (see the sketch following this list).
  • Formulated procedures for installation of Hadoop patches, updates and version upgrades.
  • Automated processes for troubleshooting, resolution and tuning of Hadoop clusters.
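
A minimal sketch of the backup and disaster-recovery routine referenced above, using HDFS snapshots and DistCp; NameNode hostnames and paths are assumptions:

    # Allow snapshots on the warehouse directory (one-time), then take a nightly snapshot.
    hdfs dfsadmin -allowSnapshot /data/warehouse
    hdfs dfs -createSnapshot /data/warehouse nightly-$(date +%Y%m%d)

    # Replicate the snapshot to the DR cluster; -update copies only changed files.
    hadoop distcp -update \
      hdfs://prod-nn:8020/data/warehouse/.snapshot/nightly-$(date +%Y%m%d) \
      hdfs://dr-nn:8020/backups/warehouse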

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, Toad 9.6, Windows NT, UNIX (Linux), Agile

Confidential

Tech Lead

Responsibilities:

  • Led team of 5 developers on both the restricted and classified networks.
  • Customized J2EE COTS Business Process Management (BPM) engine.
  • Interfaced system with Sandia's infrastructure and 15+ custom applications.
  • Technical Lead for radiation dose calculation application.
  • Developed J2EE application to support dosimetry lab and 3000+ dosimeters.
  • Application deployment access control and tracking.
  • Developed middleware infrastructure to enable secure deployment of 150+ applications.
  • Critical reporting.
  • Developed web-based reporting application; interfaced with DOE systems.
  • Taught several classes and tutorial sessions on J2EE software development.
  • DOE Q clearance.

Environment: Java, J2EE, Web services, JAX-WS, JAX-RS, Spring, Hibernate, Smooks, Oracle WebLogic, SOAP UI, Unix, Agile

Confidential, Herndon, VA

Senior specialist

Responsibilities:

  • Responsible for pre-sales technical support for a J2EE / Services Oriented Architecture product.
  • Involved in preparing demos, technical sales material, and sales tools such as ROI and Architecture Maturity.
  • Worked on proof of concepts on using the Wakesoft Architecture Platform to deliver service-oriented applications on BEA WebLogic 7.0 and 8.1 and IBM WebSphere 4.0.3 and 5.0.
  • Developed competitive differentiation matrices and consulted on refining the value proposition.
  • Helped define requirements for next version of product based on customer feedback and competitive analysis.

Environment: Sonic ESB, DxSI, Sonic MQ, Actional Intermediary (AI), Web services, Jenkins, Java, Linux

Confidential, Minneapolis

Senior software engineer

Responsibilities:

  • Internal Applications for iRise - Architected various internal applications such as a recruiting application, a time and expense application, and an ERP application to manage consulting projects and consultants. Developed the technical architecture and led the development team. Created mobile interfaces to the application and Web Services wrappers using WebLogic Workshop and Apache Axis. Worked with BEA Integration (AI, B2B, and BPM) and WebLogic Portal. Used a hybrid of RUP and XP with Rational Rose tools for design and implementation. Was responsible for requirements analysis, estimation, and on-time project delivery.
  • Supply Chain integration / Logistics - Helped a startup architect a supply chain platform using the J2EE architecture with best-practice design patterns, optimized EJB usage, for deployment on WebLogic. The application integrated with a rules engine and various reporting systems such as Crystal Reports.

Environment: Struts 1.2, Spring 3.2, Hibernate, Java, XML, JBoss 4.2, Apache, MySQL 5.0, Zimbra, Sonar, CyberSource, Jenkins, JAX-RS, JAX-WS, JUnit, JUnitPerf, JavaHelp, Linux, JavaScript

Confidential

Senior consultant

Responsibilities:

  • Worked as lead developer with a team of 5 and developed an application from scratch, based on the existing ordering application, using Java/J2EE, Hibernate, XML, JMS, and JAX-WS.
  • Operational transfer for offshore delivery management.
  • Assisted business analysts in the post-migration project.
  • Created access documents for level/tier 3/4 production support groups.
  • Deprovisioned long-haul NFS mounts on Linux systems and established regional/local NFS mounts, increasing speed and performance (see the sketch following this list).
  • Participated in various global consulting services bidding assignments.
  • Designed and delivered heterogeneous storage migrations using Brocade.
  • Used XML DOM/SAX API for parsing XML.
  • Used ANT for compilation and building EAR files.
  • Worked with the CI tool Jenkins.
  • Used JUnit/Eclipse for the unit testing of various modules.
  • Responsible for reviewing the code developed by the team members and making changes for performance tuning.
  • Involved in unit testing using Junit framework.
  • Providing support to ASG.
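
A minimal sketch of re-pointing a long-haul NFS mount at a regional server, as described above; hostnames, export paths, and mount options are illustrative only:

    # Unmount the WAN share, update /etc/fstab to reference the regional server,
    # and remount with typical NFS performance options.
    umount /mnt/shared_data
    sed -i 's|^remote-wan-nas:/export/shared_data|regional-nas:/export/shared_data|' /etc/fstab
    mount -t nfs -o rw,hard,intr,rsize=65536,wsize=65536 \
      regional-nas:/export/shared_data /mnt/shared_data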

Environment: Java, Hibernate, MDB, XML, JMS, Web services, WebLogic 8.1.5, Oracle 9i, Toad, UNIX.

Confidential

Software Engineer

Responsibilities:

  • The challenge was to identify complex application components and their interdependencies, application interaction with stakeholders, the business impact of downtime, application interaction with internal users and external entities, the hosting environment, performance and availability expectations, security aspects, etc.
  • Based on business needs and priorities, prepared execution and testing strategy.

Environment: Struts, Hibernate, JavaScript, Oracle 10g database, Oracle 9iAS, Toad, Oracle Forms and Report Builder, Unix
