- Worked with Apache Hadoop on the Cloudera and Hortonworks enterprise distributions of the Hadoop ecosystem, which includes supporting developers and architects in planning Hadoop clusters; setting up and configuring core/customized Hadoop clusters; loading data from different sources using Sqoop; enabling high availability; commissioning and decommissioning nodes; generating reports on running nodes using various benchmarking operations; scaling; recovering from node failures; and documenting all production scenarios, issues, and resolutions.
Web Technologies: Ajax, jQuery, HTML, CSS, XML
Programming Languages: Java, Scala, C/C++, Python
Databases: MySQL, MS-SQL Server, SQL, Oracle 11g, NoSQL (HBase, Cassandra)
Web Services: REST, SOAP, UDDI, AWS, Microservices
Tools: Ant, Maven, JUnit, Apache NiFi, Talend, Airflow
Servers: Apache Tomcat, WebSphere, JBoss
IDE's: Eclipse, IntelliJ IDEA, NetBeans
Hadoop Framework: HDFS, MapReduce, Hive, Pig, ZooKeeper, Sqoop, HBase, Flume
OS: Red Hat Linux, UNIX, Windows 2000/NT/XP, Sun Solaris
Scripting Languages: Unix Shell scripting
- Involved in the end-to-end process of Hadoop cluster setup, covering installation, configuration, and monitoring of the Hadoop cluster.
- Responsible for cluster maintenance, commissioning and decommissioning DataNodes, cluster monitoring, troubleshooting, and managing and reviewing data backups and Hadoop log files.
- Monitoring systems and services, architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
- Installation of various Hadoop Ecosystems and Hadoop Daemons.
- Responsible for Installation and configuration of Hive, Pig, HBase and Sqoop on the Hadoop cluster.
- Configured various property files like core-site.xml, hdfs-site.xml, mapred-site.xml based upon the job requirement.
- Involved in loading data from UNIX file system to HDFS.
- Provisioning, installing, configuring, monitoring, and maintaining HDFS, Yarn, HBase, Flume, Sqoop, Oozie, Pig, and Hive.
- Monitored multiple Hadoop cluster environments using Ganglia and Nagios. Monitored workload, job performance and capacity planning.
- Expertise in recommending hardware configuration for Hadoop cluster.
- Installing, Upgrading and Managing Hadoop Cluster on Hortonworks distribution.
- Troubleshooting many cloud-related issues such as DataNode down, network failure, and missing data blocks.
- Managing and reviewing Hadoop and HBase log files.
- Experience with Unix and Linux, including shell scripting.
- Strong problem-solving skills.
- Loaded data from different data sources (Teradata and DB2) into HDFS using Sqoop and loaded it into partitioned Hive tables.
- Developed Hive UDFs to bring all the customer information into a structured format.
- Developed bash scripts to pull Tlog files from an FTP server and process them for loading into Hive tables.
- Built automated set up for cluster monitoring and issue escalation process.
- Administered, installed, upgraded, and managed distributions of Hadoop, Hive, and HBase.
- Disaster recovery setup and best practices for MongoDB, including planning, testing, and failover testing.
- Added/removed replica and shard nodes in the MongoDB cluster as needed.
- Designed and implemented sharding and indexing strategies for huge MongoDB data sets.
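The property files mentioned above (core-site.xml, hdfs-site.xml, mapred-site.xml) typically carry entries like the following; the values and paths here are illustrative placeholders, not settings from any actual cluster:

```xml
<!-- hdfs-site.xml: placeholder values for illustration only -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/data/hadoop/namenode</value>
  </property>
</configuration>
```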
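A Sqoop load from a relational source into a partitioned Hive table, as described above, can be sketched as follows. The JDBC URL, table, and partition values are placeholders rather than details from any real environment, and the command is echoed instead of executed so the sketch stays safe to run anywhere:

```shell
#!/bin/sh
# Sketch: import a source table into a partitioned Hive table via Sqoop.
# All connection details below are hypothetical placeholders.
JDBC_URL="jdbc:teradata://td-host/DATABASE=sales"   # placeholder host/db
SRC_TABLE="transactions"                            # placeholder table
LOAD_DATE="2015-06-01"                              # placeholder partition value

# Build the command as a string and echo it (dry run) rather than run it.
CMD="sqoop import --connect $JDBC_URL --table $SRC_TABLE \
  --hive-import --hive-table sales.transactions \
  --hive-partition-key load_date --hive-partition-value $LOAD_DATE \
  --num-mappers 4"
echo "$CMD"
```

On a live cluster the echoed command would be run directly, once per load date, yielding one Hive partition per day.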
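The Tlog staging step mentioned above can be sketched as a small shell script. In practice the files would be pulled from the FTP server (e.g. with wget or curl); here a sample file is created locally so the cleaning step can be shown end to end, and the Hive load is echoed rather than executed. The field layout and paths are assumptions for illustration:

```shell
#!/bin/sh
# Sketch: stage and clean Tlog files before loading into Hive.
STAGE_DIR=$(mktemp -d)
# Stand-in for files fetched from the FTP server (hypothetical layout:
# date|store|amount, pipe-delimited).
printf '20150601|store01|42.50\n20150601|store02|\n' > "$STAGE_DIR/tlog_20150601.dat"

# Drop records with a missing amount field.
awk -F'|' 'NF == 3 && $3 != ""' "$STAGE_DIR/tlog_20150601.dat" \
  > "$STAGE_DIR/tlog_20150601.clean"

# The load itself would be issued against the cluster; echoed here.
echo "hive -e \"LOAD DATA LOCAL INPATH '$STAGE_DIR/tlog_20150601.clean' INTO TABLE tlog PARTITION (dt='2015-06-01')\""
```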
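Adding and removing MongoDB shard nodes, as described in the bullets above, boils down to two admin commands issued from the mongo shell. The host and shard names below are placeholders, and the commands are printed rather than executed so the sketch does not assume a reachable cluster:

```shell
#!/bin/sh
# Sketch: MongoDB shard add/remove commands (dry run, placeholders only).
NEW_SHARD="rs1/mongo-node4.example.com:27018"   # placeholder replica set/host
OLD_SHARD="rs0"                                 # placeholder shard name

# sh.addShard registers a new shard; removeShard starts draining one.
ADD_CMD="sh.addShard(\"$NEW_SHARD\")"
REMOVE_CMD="db.adminCommand({ removeShard: \"$OLD_SHARD\" })"
echo "$ADD_CMD"
echo "$REMOVE_CMD"
```

Note that removeShard is asynchronous: it must be re-issued until the balancer finishes migrating the shard's chunks.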
Environment: Hadoop, HDFS, MapReduce, Shell Scripting, Spark, Solr, Pig, Hive, HBase, Sqoop, Flume, Oozie, ZooKeeper, cluster health monitoring, security, Red Hat Linux, Impala, Ambari.
Confidential, New York, NY
- Involved in Hadoop cluster environment administration, including adding and removing cluster nodes, cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.
- Installation of various Hadoop Ecosystems and non-default components.
- Installed and configured a multi-node, fully distributed Hadoop cluster.
- Added/removed nodes in an existing Hadoop cluster.
- Recovering from node failures and troubleshooting common Hadoop cluster issues.
- Supported various ecosystem programs running on the cluster.
- Managed and reviewed Hadoop Log files as a part of administration for troubleshooting purposes. Communicate and escalate issues appropriately.
- Involved in HDFS maintenance and administered it through the Hadoop API.
- Hands-on experience in analyzing log files for Hadoop and ecosystem services and finding the root cause.
- Configured dynamic resource pools to provide service-level agreements for multiple users of a cluster.
- Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
- Monitor Hadoop cluster connectivity and performance.
- As an admin, followed standard backup policies to ensure high availability of the cluster.
- Assisted with data capacity planning and node forecasting.
- Extensive knowledge of Linux/Unix commands.
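The node removal and refresh described above is typically done through the HDFS exclude file and dfsadmin. The sketch below uses a temporary file in place of the configured dfs.hosts.exclude path and a placeholder hostname, and echoes the cluster commands instead of running them:

```shell
#!/bin/sh
# Sketch: decommission a DataNode via the HDFS exclude file (dry run).
# Stand-in for the file referenced by dfs.hosts.exclude in hdfs-site.xml.
EXCLUDE_FILE=$(mktemp)
echo "worker-node7.example.com" >> "$EXCLUDE_FILE"   # placeholder host

# On a live cluster these would be run directly; echoed here.
echo "hdfs dfsadmin -refreshNodes"
echo "hdfs dfsadmin -report"
```

After the refresh, the NameNode re-replicates the node's blocks and the report shows the node as "Decommission in progress" until it is safe to power off.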
Environment: Hadoop, HDFS, MapReduce, Hive, HBase, YARN, Kafka, Storm, ZooKeeper and Cloudera.
Confidential, Morris Plains, NJ
- Analyzed system requirements and developed detailed Test Plan for System Testing.
- Performed Data Validation of the data flow from the front-end to the back-end.
- Involved in all phases of testing from attending BRD review meetings to Release testing in different test environments.
- Extensively experienced in designing Test Cases, Test Scenarios, and Test reports.
- Worked on Windows and UNIX platforms.
- Performed User Acceptance Testing (UAT), Smoke testing (Ad-Hoc).
- Worked on Requirement Traceability Matrix and Test Matrix.
- Extensively worked on TestDirector for Requirement Management, defect tracking, review, and analysis.
- In-depth understanding of the Bug Life Cycle; used TestDirector to prepare the corresponding reports, with excellent communication with the BA and the developers.
- Associated Test Cases to requirements in order to ensure that all the functional requirements have been covered using TestDirector.
- Performed Back-End Testing using base and complex SQL Queries and Crystal Reports.
- Developed automated Test Scripts using Quick Test Professional and automated various Business Flows for E-Commerce activities.
- Extensively involved in writing, executing and analyzing UAT, Database Checkpoints, and Data Driven Test Scripts for all the projects using Quick Test Professional.
- Performed Functional and Regression testing using HP Quick Test Professional (QTP).
- Involved in Lock-Down Sessions and interacted with Developers, Analysts and Clients to discuss the bug fixes and enhancements.
- Implemented Naming Convention to identify the scripts for various applications and functions.
- Performed User Acceptance Testing, defined entry/exit criteria, and was involved in Usability testing.
Environment: Oracle, QTP, TestDirector, Visual FoxPro, MS Office, Crystal Reports, DBMS, Windows, Unix, SQL Server.
- Responsible for all client communications, conflict resolution, and compliance on client deliverables and revenue.
- Reviews all major deliverables (e.g. strategic brief, functional spec, tech spec) to ensure quality standards and client expectations are met.
- Ensures that client issues are dealt with in an efficient manner, informing the Account Director or Managing Director of any problems that may arise.
- Owns the contract and contract renewals for new work for an existing client.
- Approves Change Orders and invoices, and is responsible for payment collections.
- Works closely with the project team to maintain continuous knowledge of project status and to identify potential issues and/or opportunities within or related to the project.
- Ensures that all processes and procedures are completed, quality standards are met, and that projects are profitable.
- Remains aware of and pursues opportunities for account growth and new business, involving the Account Director, Sales, or other Q-Bridge support.
- Communicates the client's goals and represents the client's interests to the team.
- Provides regular two-way communication between the client and team, to provide strong team representation and set proper client expectations.
- Understands company capabilities and services, and effectively communicates all offerings to the client.
- Reports to the Account Director, providing regular input on all account activity, including status and call reports on a weekly basis.