Big Data Engineer/ Data Analytics Resume

Cumming, GA

SUMMARY

  • Over the past 12 years, Confidential's primary focus has been on Cloud Automation, Hadoop Big Data, and Data Analytics.
  • Confidential has installed and configured Big Data distributions from Cloudera (CDH), Hortonworks (HDP), and MapR.
  • He has implemented Cloud solutions using both local and Amazon Web Services (AWS) resources.
  • He has applied various visualization and data mining techniques for Big Data analytics using R and Python.
  • Confidential is a Certified UNIX System Administrator with 10-plus years of experience.
  • He has a clear understanding of Relational Database Concepts and has extensively worked with Oracle, MySQL, DB2, and SQL Server.
  • He has a strong understanding of configuration management tools such as Chef and Puppet.

TECHNICAL SKILLS

Amazon Web Services: Elastic Compute Cloud (EC2), Virtual Private Cloud (VPC), Simple Storage Services (S3), and Relational Database Service (RDS).

Hadoop Environments: Cloudera, Hortonworks, MapR, HDFS, ZooKeeper, Spark, Scala, Unix/Linux, Pig, Hive, HBase, Flume, Sqoop, Shell Scripting, Ambari, Kerberos, Cloudera Manager. A good understanding of classic Hadoop and YARN architecture, along with the various Hadoop daemons such as JobTracker, TaskTracker, NameNode, DataNode, Secondary NameNode, ResourceManager, NodeManager, ApplicationMaster, and Containers.

Programming: Perl, JavaScript, Python, C, UNIX shell scripting, R, C#, .NET, and PowerShell.

PROFESSIONAL EXPERIENCE

Big Data Engineer/ Data Analytics

Confidential, Cumming, GA

Responsibilities:

  • Involved in Hadoop cluster administration, including capacity planning, performance tuning, monitoring, and troubleshooting.
  • Created data sets in SequenceFile, ORC, and Avro file formats (see the Hive DDL sketch after this list).
  • Developed Linux shell and bash scripts for job automation.
  • Installed and configured clusters on both CDH and HDP using AWS and local resources.
  • Configured SSL and performed troubleshooting in Hue.
  • Responsible for building scalable distributed data solutions using Cloudera Hadoop.
  • Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms (see the compression sketch after this list).
  • Enabled Kerberos for authentication and authorization in the ETL process (see the kinit sketch after this list).
  • Enabled HA for the NameNode, ResourceManager, YARN configuration, and Hive Metastore.
  • Configured JournalNodes and ZooKeeper services for the cluster using Cloudera.
  • Monitored Hadoop cluster job performance and handled capacity planning.
  • Backed up critical data, including Hive data, by creating snapshots (see the snapshot sketch after this list).
  • Performed data extraction, transformation, and loading between GlusterFS, HDFS, and HBase.
  • Responsible for cluster maintenance, adding and removing cluster nodes.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Extracted data using Flume; imported/exported between HDFS and RDBMS using Sqoop (see the Sqoop sketch after this list).
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior (see the Hive/Pig sketch after this list).
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Data analytics, data mining, and machine learning: used linear and non-linear modeling algorithms in R and Python, along with various visualization and data mining techniques, for big data analytics.
  • Utilized Business Intelligence tools such as Tableau to visually analyze the data.
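
A minimal sketch of the file-format work above, run as Hive DDL from the shell; the table and column names are hypothetical placeholders:

    # Illustrative table/column names; not the actual production schema
    hive -e "
    CREATE TABLE clicks_seq  (user_id STRING, url STRING, ts BIGINT) STORED AS SEQUENCEFILE;
    CREATE TABLE clicks_orc  (user_id STRING, url STRING, ts BIGINT) STORED AS ORC;
    CREATE TABLE clicks_avro (user_id STRING, url STRING, ts BIGINT) STORED AS AVRO;
    "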
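
The MapReduce compression tuning can be sketched as follows, assuming the stock wordcount example and the Snappy codec; input/output paths are placeholders:

    # Hypothetical job invocation; paths and codec choice are illustrative
    hadoop jar hadoop-mapreduce-examples.jar wordcount \
      -D mapreduce.map.output.compress=true \
      -D mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec \
      -D mapreduce.output.fileoutputformat.compress=true \
      /input/clicks /output/wordcount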
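
Service authentication for the Kerberized ETL flow, as a sketch; the keytab path, principal, and realm are assumptions:

    # Hypothetical keytab and principal
    kinit -kt /etc/security/keytabs/etl.service.keytab etl/host01.example.com@EXAMPLE.COM
    klist    # verify the ticket-granting ticket before launching ETL jobs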
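
The snapshot-based backups follow the standard HDFS workflow; the warehouse path is an assumption:

    hdfs dfsadmin -allowSnapshot /user/hive/warehouse        # one-time, as HDFS admin
    hdfs dfs -createSnapshot /user/hive/warehouse hive_$(date +%Y%m%d)
    hdfs dfs -ls /user/hive/warehouse/.snapshot              # list available snapshots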
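
A representative Sqoop import/export against MySQL; the connection string, credentials, and table names are placeholders:

    # Hypothetical database host and tables
    sqoop import \
      --connect jdbc:mysql://dbhost/sales --username etl -P \
      --table orders --target-dir /data/raw/orders --num-mappers 4
    sqoop export \
      --connect jdbc:mysql://dbhost/sales --username etl -P \
      --table order_summary --export-dir /data/out/order_summary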
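
The user-behavior analysis can be illustrated with a Hive aggregate and an equivalent Pig script; table, path, and field names are hypothetical:

    hive -e "SELECT user_id, COUNT(*) AS visits FROM clicks_orc
             GROUP BY user_id ORDER BY visits DESC LIMIT 10;"

    cat > top_users.pig <<'EOF'
    -- Illustrative input path and schema
    logs    = LOAD '/data/raw/clicks' USING PigStorage(',')
              AS (user_id:chararray, url:chararray, ts:long);
    grouped = GROUP logs BY user_id;
    counts  = FOREACH grouped GENERATE group AS user_id, COUNT(logs) AS visits;
    DUMP counts;
    EOF
    pig top_users.pig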

Environment: HDFS, MapReduce, HBase, Hive, Oozie, Pig, Sqoop, Shell Scripting, MySQL, Red Hat Linux, CentOS and other UNIX utilities, AWS (EC2, S3), Cloudera Manager, Hortonworks Platform, R, and Python.

Senior Cloud Engineer

Confidential, Atlanta, GA

Responsibilities:

  • Work with the Design, Implementation, and Development teams to resolve issues related to Operating Systems, Databases, Networking, and underlying patterns.
  • Monitor multiple technologies while maintaining clients' cloud environments, including orchestration-related applications and infrastructure, in a multi-tenant virtualized environment.
  • Utilize orchestration engines such as ICO (IBM Cloud Orchestrator), OpenStack Heat, VMware vRA, vCloud Orchestrator (vCO), or VMware vCloud Automation Center (vCAC).
  • Work within client environments to troubleshoot and resolve issues relating to the client's hybrid cloud using industry-standard tools and languages such as Python, YAML, bash, Docker, PowerShell, and PowerCLI (see the Docker sketch after this list).
  • Utilized configuration management tools such as Puppet and Chef to install and configure both Hortonworks and Cloudera Hadoop distributions.
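
Routine container troubleshooting of the kind described above might look like this; the container name is hypothetical:

    docker ps --filter status=exited          # find containers that exited
    docker logs --tail 100 orchestrator_api   # inspect recent output (hypothetical name)
    docker exec -it orchestrator_api bash     # open a shell inside the container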

Environment: Hadoop Hortonworks (HDP), Cloudera, OpenStack, IBM Cloud Orchestrator, VMware, Sqoop, HDFS, Hive, HBase, Puppet, Chef, Red Hat (RHEL), Python, Oracle, DB2, and SQL Server.

Senior Cloud Consultant / System Engineer

Confidential, Atlanta, GA

Responsibilities:

  • Utilized Cloud Service Automation (CSA) to provide a Self-Service Portal and Service Catalog for end-to-end provisioning of both Windows and Linux servers.
  • Integrated various state-of-the-art Big Data technologies into the overall solution; designed, reviewed, and optimized data transformation processes using Hadoop and Apache Storm.
  • Wrote Puppet modules for installing and managing Java versions (see the sketch after this list).
  • Configured VMware vCenter, HP Server Automation, HP SiteScope, HP UCMDB, and OpenStack as resource providers for Cloud Service Automation.
  • Provided advanced troubleshooting for applications and systems with an emphasis on root cause analysis.
  • Installed and configured HP Operations Orchestration (HPOO) 10.x on both Windows and Linux platforms, working with repositories, SSL certificates, content packs, and Oracle 11g databases.
  • Utilized HPOO and the Remedy Web Service to create, update, and close tickets automatically.
  • Created Windows PowerShell scripts to streamline communication within the Windows environment.
  • Configured LDAP, Active Directory, and system accounts to support Single Sign-On (SSO) for HPSA, HPNA, and HPOO.
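
A minimal sketch of the Java-version management, using puppet apply with an inline manifest; the package name assumes a RHEL-family host and is illustrative, not the actual module:

    # Hypothetical package; the real modules pinned specific Java versions
    sudo puppet apply -e 'package { "java-1.8.0-openjdk": ensure => installed }'
    java -version    # verify the managed runtime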

Environment: HP Operations Orchestration (HPOO), HP Server Automation (HPSA), HP Network Automation (HPNA), Puppet, Chef, HP-UX, Red Hat, Solaris, Python, HP Cloud Service Automation.

Cloud Consultant

Confidential, Alpharetta, GA

Responsibilities:

  • Performed Operations Orchestration user/group management, backups, and LDAP Security integration.
  • Integrated HP OpenView Operations Manager and SiteScope into a central console.
  • Configured SiteScope to monitor agentless clients, UNIX system resources, application logs, databases, SNMP traps, and URLs.
  • Installed and configured Network Automation System (NA) to validate compliance checks on Cisco routers and switches.
  • Configured Server Automation (SA) to provide security audits on UNIX/Windows servers; created software policies and handled server provisioning, server remediation, and patch management.
  • Created over 30 Opsware flows to automate routine maintenance tasks to increase efficiency and reduce operational cost.

Environment: HP Operations Orchestration (HPOO), HP Server Automation (HPSA), Red Hat, Solaris, UNIX scripting, HP Cloud Service Automation, Windows, and VMware.

Sr. UNIX System Engineer

Confidential, Sacramento, CA

Responsibilities:

  • Worked on volume management, disk management, and software RAID solutions using VERITAS Volume Manager and Solaris Volume Manager (see the VxVM sketch after this list).
  • Performed file system tuning and growing using VERITAS File System (VxFS); coordinated with the SAN team on storage allocation and Dynamic Multi-Pathing (DMP).
  • Performed patch and package administration, installing patches per company policy and installing packaged applications.
  • Patched VERITAS NetBackup media/master servers set up as a two-node VERITAS Cluster on physical Red Hat Linux, performing OS and security patch upgrades and troubleshooting cluster-related issues while testing failover of service groups between nodes.
  • Decommissioned old UNIX servers (Linux, AIX, and Solaris), tracking decommissioned and new servers in an inventory list; migrated local to SAN boot disks on production servers.
  • Worked directly with customers to demonstrate basic system functionality.
  • Monitored system activity such as CPU, memory, disk, and swap space usage to avoid performance issues (see the monitoring sketch after this list).
  • Installed and configured VERITAS NetBackup on Linux and Solaris servers and created backup policies.
  • Backed up and restored data using Tivoli (TSM) on Linux and Solaris per user requests.
  • Supported LR project applications developed in Shell, Perl, Java, Oracle, SQL, webMethods, and BusinessObjects.
  • Handled security (implementation of Wellmark corporate server-build baselines; installing and testing tools like PowerBroker, ESM, and WAS) and user/group administration; performed daily administration of Red Hat Enterprise Linux servers.
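
Representative VxVM commands for the volume work above; the disk group, volume name, and sizes are placeholders:

    vxassist -g datadg make appvol 20g layout=mirror   # create a mirrored volume (hypothetical names)
    mkfs -F vxfs /dev/vx/rdsk/datadg/appvol            # lay down a VxFS file system
    vxresize -g datadg appvol +10g                     # grow the volume and file system online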
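
The resource monitoring can be sketched as a few commands suitable for a cron job; the log path is an assumption:

    vmstat 5 3    >> /var/log/health.log   # CPU, memory, and swap activity
    iostat -x 5 3 >> /var/log/health.log   # per-disk utilization
    df -h         >> /var/log/health.log   # file system capacity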

Environment: Tivoli (TSM), PowerBroker, Windows, VERITAS NetBackup, AIX, Solaris, Red Hat, VMware, Oracle, and SQL Server.
