Hadoop Developer Administrator Resume Profile
Charlotte, NC
SUMMARY
- 11 years of rich experience, including Big Data (Hadoop), Cloud, AWS, and OpenStack projects. Involved in leadership/management of large, complex information systems and projects, as well as system integration, IT solution architecture, multi-vendor service management, and business continuity management.
- Consulting CTO and development partner for several start-ups in the SaaS, Cloud Computing, Web 2.0 and SOA space.
- 10 years of total IT experience across storage and UNIX (Linux, Solaris, AIX) server administration.
- 5 years of experience in WAS, WESB, FileNet, WSRR, and DB2 infrastructure planning/design.
- ITIL v3 certified; Cloudera Certified Hadoop Administrator.
- Good expertise in SLA management and root cause analysis.
- 4 years of experience in Hadoop, AWS, and OpenStack.
- 5 years of experience working with Agile methodologies.
- Technical reviewer of 30 technical books on Java, J2EE, WebLogic, SOA, and every major mobile platform (iPhone, BlackBerry, and Android).
- Speaker at multiple conferences
- Enabled easy sharing of technical resources and development costs, and the centralization of infrastructure across a large user base.
Software SKILLS
- Cloud SaaS Platforms: Amazon EC2, Google App Engine, Force.com, Workday, Concur, ServiceNow etc.
- J2EE Platforms such as JBoss, Tomcat, WebLogic, AquaLogic and some WebSphere.
- Open Source Collaboration Platforms such as LAMP, Alfresco, SharePoint, Drupal, Facebook, WordPress, Twitter etc.
- Cassandra deployment.
- Extensive SOA and Web Services experience on different technology stacks.
- Hands-on experience securing systems end to end with TLS/SSL.
- Built easily manageable systems using customized deployment and server-management scripts.
- 4 years of experience deploying RabbitMQ with support for clustering and federation.
- Web 2.0 Platforms such as Blogs, Mashups, Facebook API, and other Social Technology.
- Architectural experience: Python, Django, Ruby on Rails, .NET, RIA (Silverlight, Adobe Flex/AIR), and BlackBerry, Android, and iPhone mobile development.
IT Infra Technical Skills
- Storage: EMC VMAX, DMX-3000/4000, Symmetrix, CLARiiON; HP XP, EVA, MSA, P7000; IBM DS8300/8700/4700/8000; Hitachi USP, AMS; NetApp FAS series arrays; Symantec
- Cloud: Huawei, Cisco, VMware, Amazon, Eucalyptus, Hadoop
- Operating Systems: WIN NT/2000/XP/2003/Vista/7, Solaris 8/9/10, HP-UX, Red Hat Linux
- Database: MS SQL Server, Oracle, MS Access, Teradata
- Performance monitoring tools: HP BSM
- NMS Tools: Nagios, Infovista
- Backup Tool: Tivoli
- Open source: Nagios, Apache, OpenStack, Infovista
Professional Summary
Confidential
Hadoop Developer/Administrator
- Familiarity with networks: TCP/IP, firewalls, DNS
- Excellent installation skills for Hadoop components: HDFS, MapReduce, HBase
- Proficient in supporting Linux operating systems: CentOS, Debian, Fedora
- Solid understanding of open source monitoring tools: Nagios, Ganglia, Cloudera Manager
- Exceptional in overseeing system administration operations: performance tuning, storage capacity management, system dump analysis
- Skilled in programming languages: Unix shell scripting, SQL, C, Java
- Well-versed in databases: Oracle, DB2, MySQL
- Specialist in version control tools: SVN, VSS
- Work with various project teams to identify opportunities to optimize client infrastructure.
- Developed solutions to be taken to market and to support the sales cycle.
- Delivered solution presentations to customers, including clarification workshops, solution workshops, and solution presentations.
- Drove and participated in solution discussions with partners where applicable, covering capability, technical solution, and pricing.
- Coordinated with other BUs to provide an integrated solution.
- Collaborate with Business Analysts in response development
- Leverage Company Capabilities, IP and Alliances as appropriate in the overall Solution Model
- Optimized the utilization of data systems, and improved the efficiency of security and storage solutions by control and prevention of loss of sensitive data
- Good experience with CloudStack and OpenStack.
Confidential
Designation: Hadoop admin
Key Responsibilities
- Leading Hadoop-based data mining and graph algorithms research team.
- Working closely with both internal and external cyber security customers.
- Researching and implementing algorithms for large power-law skewed datasets.
- Developing an interactive graph visualization tool based on the Prefuse visualization toolkit.
- Developing machine-learning capability via Apache Mahout.
- Leading development of terabyte-scale network packet analysis capability.
- Leading research effort to tightly integrate Hadoop and HPC systems.
- Purchased, deployed, and administered a 70-node Hadoop cluster. Administered two smaller clusters.
- Compared Hadoop to commercial big-data appliances from Netezza, XtremeData, and LexisNexis. Published and presented results.
- Developed capability to extract and collect malicious URLs from spam- and botnet-generated email.
- Efficiently collected over 5 million unique domains and URLs.
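The URL-collection work above could be sketched roughly as follows; the regex, function names, and sample data are illustrative assumptions, not the production pipeline:

```python
import re

# Hypothetical sketch: pull unique URLs out of raw email bodies, then reduce
# them to domains so unique-domain counts can be tracked across messages.
URL_RE = re.compile(r"https?://[^\s\"'<>]+", re.IGNORECASE)

def extract_urls(email_body: str) -> set:
    """Return the set of unique URLs found in one email body."""
    return set(URL_RE.findall(email_body))

def extract_domains(urls: set) -> set:
    """Reduce URLs to their lowercased host part,
    e.g. http://evil.example/x -> evil.example."""
    return {re.sub(r"^https?://", "", u, flags=re.IGNORECASE).split("/")[0].lower()
            for u in urls}
```

In a pipeline like the one described, each spam message would be run through `extract_urls`, with per-message sets unioned to arrive at the unique-domain totals.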
WORK EXPERIENCE
Confidential
Role: Senior Hadoop admin
- Led the definition of an outsourcing and centralization strategy for Application Development and Maintenance, demonstrating how the company could save 3 million over 1 year; a combination of plans is currently being implemented at UTAS.
- Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
- Handling 24x7 support on Hadoop issues.
- Involved in designing and implementation of service layer over HBase database.
- Assisted in designing, development and architecture of Hadoop and HBase systems.
- Coordinated with technical teams for installation of Hadoop and related third-party applications on systems.
- Formulated procedures for planning and execution of system upgrades for all existing Hadoop clusters.
- Supported technical team members for automation, installation and configuration tasks.
- Suggested improvement processes for all process automation scripts and tasks.
- Provided technical assistance for configuration, administration and monitoring of Hadoop clusters.
- Conducted detailed analysis of system and application architecture components as per functional requirements.
- Created scripts to form EC2 clusters for training and for processing.
- Implemented HP BSM performance monitoring tools.
- Worked on Amazon cloud Migration project.
- Worked on the Amazon Elastic Cloud project using Agile methodologies.
- Assisted business analysts in the post-migration project.
- Reviewed firewall and security group settings and updated them on Amazon AWS.
- Created access documents for level/tier 3/4 production support groups
- Created Cassandra Advanced Data Modeling course for DataStax
- Working on Agile methodologies.
- Participated in evaluation and selection of new technologies to support system efficiency
- Imported data from various sources such as Oracle and Comptel servers into HDFS using tools such as Sqoop and MapReduce.
- Analyzed the data by running Hive queries and Pig scripts to understand user behavior, such as call frequency and top calling customers.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Developed Hive queries to process the data and generate data cubes for visualization.
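As an illustration of the call-behavior analysis described above, the Hive query logic amounts to a GROUP BY with an ordered LIMIT; a minimal Python equivalent over hypothetical call-detail records (the record shape and function name are assumptions):

```python
from collections import Counter

# Hypothetical CDR rows: (caller_id, callee_id). In the project this analysis
# ran as Hive queries / Pig scripts over data Sqoop-imported into HDFS.
def top_callers(cdrs, n=3):
    """Return the n customers with the highest outgoing-call counts.
    Roughly: SELECT caller, COUNT(*) AS cnt FROM cdrs
             GROUP BY caller ORDER BY cnt DESC LIMIT n."""
    counts = Counter(caller for caller, _callee in cdrs)
    return counts.most_common(n)
```

The same shape generalizes to per-customer call frequency by keeping the full `Counter` instead of taking only the top n.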
Environment:
- Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, Toad 9.6, Windows 2000, Solaris, Linux
Confidential
Role: Cloud admin
- Assisted in creation of ETL process for transformation of data sources from existing RDBMS systems.
- Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
- Involved in designing and implementation of service layer over HBase database.
- Imported data from various sources such as Oracle and Comptel servers into HDFS using tools such as Sqoop and MapReduce.
- Analyzed the data by running Hive queries and Pig scripts to understand user behavior, such as call frequency and top calling customers.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Developed Hive queries to process the data and generate data cubes for visualization.
- Designed and developed scalable and custom Hadoop solutions as per dynamic data needs.
- Coordinated with technical team for production deployment of software applications for maintenance.
- Provided operational support services relating to Hadoop infrastructure and application installation.
- Supported technical team members in management and review of Hadoop log files and data backups.
- Participated in development and execution of system and disaster recovery processes.
- Formulated procedures for installation of Hadoop patches, updates and version upgrades.
- Automated processes for troubleshooting, resolution and tuning of Hadoop clusters.
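The log-review and troubleshooting automation mentioned above could be sketched as a small severity scan over daemon logs; the log format and function name are assumptions about a typical Log4j-style Hadoop log line:

```python
import re
from collections import Counter

# Hypothetical sketch: summarize WARN/ERROR/FATAL lines from Hadoop daemon
# logs so an operator can spot failing components at a glance.
LEVEL_RE = re.compile(r"\b(WARN|ERROR|FATAL)\b")

def summarize_log(lines):
    """Count log lines per severity level, e.g. {'ERROR': 2, 'WARN': 1}."""
    counts = Counter()
    for line in lines:
        match = LEVEL_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return dict(counts)
```

A cron-driven wrapper could feed each NameNode/DataNode log through this and alert when ERROR or FATAL counts cross a threshold.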
Environment:
- Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, Toad 9.6, Windows NT, UNIX/Linux, Agile
Confidential
Consultant-systems
- Tech Lead for Corporate Workflow project.
- Led team of 5 developers on both the restricted and classified networks.
- Customized a J2EE COTS Business Process Management (BPM) engine.
- Interfaced system with Sandia's infrastructure and 15 custom applications.
- Technical Lead for radiation dose calculation application.
- Developed J2EE application to support dosimetry lab and 3000 dosimeters.
- Application deployment, access control, and tracking.
- Developed middleware infrastructure to enable secure deployment of 150 applications.
- Critical reporting.
- Developed web-based reporting application interfaced with DOE systems.
- Taught several classes and tutorial sessions on J2EE software development.
- DOE Q clearance.
Environment:
- Java, J2EE, Web Services, JAX-WS, JAX-RS, Spring, Hibernate, Smooks, Oracle WebLogic, SOAP-UI, Unix, Agile