Big Data Consultant Resume
Sunnyvale, CA
SUMMARY:
- Around 10 years of professional experience in the IT industry, developing, implementing, configuring, and testing software systems and Hadoop ecosystem components, and automating various test tools.
- Experience with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining.
- Hands-on experience with Hadoop/Big Data technologies for storage, querying, processing, and analysis of data.
- Experience designing and developing test scenarios in Groovy and Python to validate Big Data.
- Experience with source code management tools such as Git and CMVC.
- Knowledge of installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, Hive, and Sqoop.
- Experience with cloud computing environments based on OpenStack.
- Experience in installing, configuring, maintaining, troubleshooting and backup/restore of UNIX/Linux servers.
- Experienced in configuring virtual servers using KVM and Power hypervisors on Linux and Confidential Power servers.
- Developed test tools for validating system software, firmware, and hardware components.
TECHNICAL SKILLS:
- Python, Groovy, HQL, C, Java, JavaScript, shell scripting.
- KVM, VMware, VMcontrol, Confidential Smart Cloud, Confidential BigInsights, OpenStack.
- MapReduce, Hive, Sqoop, SAN, iSCSI.
- Linux (RHEL, SLES, Ubuntu), AIX 6/7, Windows Server 2008.
PROFESSIONAL EXPERIENCE:
Confidential, Sunnyvale, CA
Big Data Consultant
Responsibilities:
- This project involved designing and developing Big Data analysis solutions and a test suite for maps data analysis.
- Involved in the complete lifecycle of the Hadoop implementation: cluster administration, writing MapReduce programs and Hive queries, and pulling log files using Flume.
- Designed and implemented MapReduce jobs for distributed processing of large data sets on a Hadoop cluster.
- Developed MapReduce programs to cleanse and parse data in HDFS obtained from various data sources and to perform map-side joins.
- Exported business-required information from HDFS to an RDBMS using Sqoop, making the data available to the BI team for generating reports.
- Responsible for designing and developing test cases to validate maps data.
- Developed test cases to process JSON data, executing across millions of records.
- Extracted and loaded data from different sources into HDFS.
- Wrote Hive queries to fetch data from Hadoop servers and build a consolidated view of the data.
- Analyzed failed data and escalated critical data issues.
- Managed source code in SCM tools such as Git/GitHub.
- Set up continuous integration of the data analysis source code through Jenkins.
- Worked with REST APIs to fetch test status based on different queries.
- Hands-on experience with Hadoop clusters and MapReduce programs.
Technology: Big Data, Hadoop, MapReduce, HDFS, Hive, ETL, Groovy, Python, Git, JSON, Jenkins, Sqoop.
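The map-side join mentioned in this role can be sketched as follows: the smaller table is held in memory so each record of the large dataset is joined in the map phase, with no shuffle or reduce step. This is a minimal illustrative sketch; the table contents and field names are hypothetical, not from the original project.

```python
# Map-side join sketch: the small "dimension" table fits in memory,
# so the mapper enriches each record directly via a dictionary lookup.
# (Hypothetical data; a real job would run under Hadoop streaming or Java.)

small_table = {"101": "Sunnyvale", "102": "Austin"}  # e.g. store_id -> city

def map_side_join(record):
    """Mapper: emit the input record enriched from the in-memory table."""
    store_id, amount = record.split(",")
    city = small_table.get(store_id, "UNKNOWN")
    return f"{store_id},{city},{amount}"

rows = ["101,25.00", "102,13.50"]
joined = [map_side_join(r) for r in rows]
```

The key design point is that the join needs no network shuffle, which is why map-side joins are preferred when one input is small enough to broadcast.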
Confidential, Austin, TX
Software Engineer
Responsibilities:
- Implementing Confidential solutions for Big Data on Power servers in the test environments.
- Involved in the end-to-end process of Hadoop cluster installation and configuration.
- Analyzed large data sets by running Hive query scripts.
- Involved in creating Hive tables and in loading and analyzing data using Hive queries.
- Involved in running Hadoop jobs to process millions of records of raw data.
- Experienced in loading data from the local file system to HDFS.
- Responsible for loading unstructured and semi-structured data from different sources into the Hadoop cluster using Sqoop.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
- Aligned with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Created Python tools to automate data fetching and report data analysis results.
Technology: Big Data, Hadoop, Linux, Hive, MapReduce, Python, Sqoop, HQL, BigInsights.
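The create-table/load/analyze cycle described in this role can be sketched by assembling the HiveQL statements in Python, as an automation tool might. The table and column names below are hypothetical examples, not from the original project.

```python
# Hedged sketch: building the HiveQL for creating a table, loading
# local data into it, and running an analysis query.
# (Table name, columns, and path are hypothetical.)
table = "raw_logs"

create_stmt = (
    f"CREATE TABLE IF NOT EXISTS {table} ("
    "ts STRING, level STRING, msg STRING) "
    "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
)

load_stmt = f"LOAD DATA LOCAL INPATH '/tmp/logs.tsv' INTO TABLE {table}"

count_stmt = f"SELECT level, COUNT(*) FROM {table} GROUP BY level"
```

In practice these statements would be submitted through the Hive CLI or Beeline rather than built ad hoc; the sketch only shows the shape of the workflow.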
Confidential, Austin, TX
Software Engineer
Responsibilities:
- Developed server test tools; validated Power enterprise and Flex servers.
- Responsible for developing I/O switch card (PCIe) and memory test tools on Power servers.
- Created stress test tools for servers and enhanced the existing error injection mechanism.
- Configured Confidential cloud manager with Confidential Smart Cloud Entry and OpenStack.
- Worked with hardware teams to configure KVM hosts and created multiple virtual machines (VMs).
- Designed, developed, and maintained test plans and test cases for testing low-end, mid-range, and high-end Power Systems configured with AIX and Linux operating systems.
- Verified Power servers with storage (SAS, SAN, RAID) and virtualization (KVM) technologies.
- Led the I/O system test team and managed the project with agile methodology.
- Performed functional-level validation of virtualization technology (SR-IOV).
- Performed server I/O diagnostics and serviceability testing.
Technology: Linux, AIX, Python, KVM, Storage, VMware, Power server, Flex servers.
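The KVM guest creation work in this role might be automated as sketched below: a helper assembles a `virt-install` command line for provisioning a VM from an existing disk image. The `virt-install` flags are real; the guest name, sizes, and disk path are hypothetical examples.

```python
# Sketch of assembling a KVM guest creation command for test automation.
# virt-install flags (--name, --memory, --vcpus, --disk, --import,
# --noautoconsole) are real; the concrete values are hypothetical.
def virt_install_cmd(name, mem_mb, vcpus, disk_path):
    """Build the argv for creating a KVM guest from an existing disk image."""
    return [
        "virt-install",
        "--name", name,
        "--memory", str(mem_mb),
        "--vcpus", str(vcpus),
        "--disk", f"path={disk_path}",
        "--import",          # boot from the existing image; skip the installer
        "--noautoconsole",   # do not attach a console; useful in automation
    ]

cmd = virt_install_cmd("rhel-test-01", 4096, 2,
                       "/var/lib/libvirt/images/rhel.qcow2")
```

Building the command as an argv list (rather than a shell string) avoids quoting bugs when it is later passed to `subprocess.run`.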
Confidential
Software Engineer
Responsibilities:
- This project involved AIX bug fixing and automation test development.
- Worked with the development team to fix bugs in user-layer components of the AIX OS.
- Fixed bugs in AIX RBAC and LDAP/Active Directory authentication modules.
- Developed automation tools to verify new features of the AIX operating system and Power firmware.
- Involved in validating Power server firmware and other server components.
- Developed and enhanced an error injection tool for Confidential Power processors and PCIe switches for negative testing.
- Experienced in creating virtual machines (VMs) for various operating systems - Linux (Red Hat, Ubuntu, SUSE) on Power hardware.
- Bug reporting, tracking, and status reporting in accordance with agile methodology.
- Hands-on experience with Confidential Power enterprise servers and Blade (Flex) servers.
Technology: C, Python, AIX, Linux, Shell scripting.
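The error-injection idea used for negative testing in this role can be sketched in miniature: a wrapper forces a callable to fail on selected invocations so that error-handling paths get exercised. This is an illustrative sketch, not the original tool; all names are hypothetical.

```python
# Hedged sketch of error injection for negative testing: force failures
# on chosen call numbers so recovery/error paths can be validated.
class ErrorInjector:
    def __init__(self, func, fail_on, exc=IOError):
        self.func = func            # the operation under test
        self.fail_on = set(fail_on) # call numbers (1-based) that must fail
        self.exc = exc              # exception type to inject
        self.calls = 0

    def __call__(self, *args, **kwargs):
        self.calls += 1
        if self.calls in self.fail_on:
            raise self.exc(f"injected fault on call {self.calls}")
        return self.func(*args, **kwargs)
```

A hardware-level injector works on registers or PCIe traffic rather than Python calls, but the test-design principle is the same: failures are deterministic and repeatable, so the error path can be asserted on.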
Confidential
Software Test Engineer
Responsibilities:
- This project was aimed at validating new features on AIX operating system.
- Led the security component team to validate new security features in AIX.
- Verified new features in security & kernel such as RBAC, Trusted Execution, LDAP, and Kerberos.
- Designed test plans and developed test cases using C programs and shell scripting.
- Developed a system management tool to create and manage AIX LPARs.
- The tool is still used across multiple teams to manage Power servers remotely.
- Bug reporting and tracking through Confidential ClearQuest and CMVC.
Technology: C, AIX, Linux, Shell script, LDAP, UNIX security, Virtualization.
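An LPAR management tool like the one described in this role typically drives the HMC command line remotely. The sketch below builds two such commands; `lssyscfg` and `chsysstate` are real HMC CLI commands, while the managed-system and LPAR names are hypothetical examples.

```python
# Hedged sketch: building HMC CLI commands for remote LPAR management.
# (lssyscfg / chsysstate are real HMC commands; host and LPAR names
# here are hypothetical. A real tool would run these over SSH.)
import shlex

def lpar_state_cmd(managed_system, lpar):
    """List the name and state of one LPAR on a managed system."""
    return ("lssyscfg -r lpar -m {m} --filter lpar_names={l} "
            "-F name,state").format(m=shlex.quote(managed_system),
                                    l=shlex.quote(lpar))

def lpar_activate_cmd(managed_system, lpar, profile="default"):
    """Activate an LPAR with the given profile."""
    return ("chsysstate -r lpar -m {m} -o on -n {l} -f {p}"
            .format(m=shlex.quote(managed_system),
                    l=shlex.quote(lpar),
                    p=shlex.quote(profile)))
```

Quoting each value with `shlex.quote` keeps the generated command safe when LPAR or system names come from user input.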