
Software Developer / Program Analyst Resume

Sunnyvale, California

PROFESSIONAL SUMMARY:

  • 8+ years of experience in data analysis, design, development, testing, customization, bug fixes, enhancements, support, and installing and configuring Hadoop clusters, with implementations in Python and Spark programming for Hadoop. Worked in AWS environments with Lambda, serverless applications, EMR, Athena, AWS Glue, IAM policies and roles, S3, CloudFormation templates (CFTs), and EC2.
  • Developed Python/PySpark programs for data analysis on MapR, AWS EMR, AWS Glue, and on-premises Hadoop clusters.
  • Applied business transformations using Apache Spark DataFrames/RDDs and used HiveContext objects for read/write operations (a minimal PySpark sketch follows this list).
  • Hands-on experience installing and configuring Hadoop clusters (MapR, Cloudera, and Hortonworks) and AWS EMR clusters.
  • Worked on AWS Lambda functions with Python and CloudFormation templates (CFTs).
  • Developed and supported software using Python, Bash, MapR, Hadoop, and Amazon Web Services (including S3, EC2, VPCs, EMR, Glue, Athena and IAM).
  • Wrote Python and R scripts for statistical analytics to generate data-quality reports.
  • Good working knowledge of PySpark, MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Ambari, ZooKeeper, Hue, Drill, Databricks, and YARN.
  • Implemented a CI/CD pipeline with Jenkins, Git, and Chef.
  • Scheduled production jobs using Airflow, CloudWatch, AWS Lambda, AWS Glue, and UC4.
  • Implemented AWS EMR, AWS Glue, Athena, CloudFormation templates, S3 events, SNS, KMS, Lambda functions, and IAM roles and policies.
  • Hands-on experience installing and configuring Hadoop cluster components: MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Ambari, ZooKeeper, Hue, and YARN.
  • Worked with structured, semi-structured, and unstructured data.
  • Installed, configured, and upgraded OS when required.
  • Excelled at importing data into HDFS from various data sources such as Oracle, DB2, and SQL Server.
  • Handled security permissions for users.
  • Worked closely with management and other departments to configure systems correctly.
  • Gained extensive experience managing and reviewing Hadoop log files.
  • Monitored the performance of the Hadoop ecosystem.
  • Strong technical, administration, and monitoring knowledge in Big Data/Hadoop.
  • Well experienced with AWS IAM, S3, VPC, subnets, OpsWorks, Route 53, CloudFormation, Service Catalog, and EMR services.
  • Set up and managed cluster servers on AWS.
  • Excellent scripting skills in UNIX shell and Python.
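
A minimal PySpark sketch of the DataFrame-based reporting described above, assuming a Hive-enabled Spark session such as on EMR or MapR; the bucket, table, and column names are hypothetical placeholders, not actual project values:

    # Minimal sketch (hypothetical bucket, table, and column names).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("viewership-report")
             .enableHiveSupport()   # Hive read/write support, as on EMR/MapR
             .getOrCreate())

    # Read raw viewership events from S3 and apply business transformations.
    events = spark.read.parquet("s3://example-bucket/viewership/raw/")
    daily = (events
             .filter(F.col("event_type") == "play")
             .groupBy("program_id", F.to_date("event_ts").alias("view_date"))
             .agg(F.countDistinct("viewer_id").alias("unique_viewers")))

    # Persist the report as a Hive table for downstream Hive/Athena-style queries.
    daily.write.mode("overwrite").saveAsTable("reports.daily_viewership")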

TECHNICAL SKILLS:

Hadoop Eco-Systems: Hive, Pig, Flume, Oozie, Sqoop, Spark, Impala and HBase

Operating Systems: Red Hat Linux 5.x, 6.x; Windows 95, 98, NT, 2000, Vista, 7

Configuration Management Tools: Puppet

Databases: Oracle 10g (SQL), MySQL, SQL Server 2008

Hadoop Configuration Management: Cloudera Manager, Ambari

Monitoring Tools: Ganglia, Nagios

Scripting Languages: Shell scripting, Python, PowerShell

Configuration / Protocol: DNS, DHCP, WINS, VPN, TCP/IP, SNMP, IMAP, POP3, SMTP, PKI, DFS

Ticketing Systems: Remedy, ServiceNow, IBM Tivoli

Backup Software: NetBackup, Tivoli, Commvault, NT Backup, DPM 2012

PROFESSIONAL EXPERIENCE:

Confidential, Sunnyvale, California

Software Developer / Program Analyst

Responsibilities:

  • Developed Python/PySpark programs for data analysis.
  • Explored various Spark modules and worked with DataFrames, RDDs, and SparkContext.
  • Responsible for data analysis and data cleaning using Spark SQL queries.
  • Created reports for viewership data using PySpark.
  • Migrated an existing on-premises application to AWS.
  • Developed Python code for AWS Lambda, AWS EMR, and AWS Glue (see the Lambda/Glue sketch after this list).
  • Handled AWS EMR, AWS Glue, CloudFormation, CodePipeline, and SQL queries using AWS Athena.
  • Used PySpark to write code for all Spark use cases; experience with Scala for data analytics and joins on the Spark cluster.
  • Implemented a CI/CD pipeline and was involved in data pipeline setup.
  • Configured Hive, Hue, and Drill on MapR.
  • Implemented business requirements using Python.
  • Implemented validation scripts in Python using SQL.
  • Deployed Hadoop cluster installation and configuration using Chef and Jenkins.
  • Analyzed the Hadoop cluster and other big data analysis tools, including Pig.
  • Implemented multiple Hadoop cluster nodes on Red Hat Linux.
  • Built a scalable distributed data solution.
  • Played a key role in the team's Databricks implementation.
  • Imported data from Linux file system to HDFS
  • Loaded data from UNIX to HDFS
  • Installed clusters, started and stopped data nodes, and recovered name nodes
  • Assisted with capacity planning and slot configuration
  • Worked on managing Ambari, ZooKeeper, Hue, and YARN
  • Created tables and views in Teradata
  • Created HBase tables to house data from different sources
  • Transmitted data from SQL to HBase using Sqoop
  • Worked with a team to successfully tune Pig query performance
  • Excelled in managing and reviewing Hadoop log files
  • Configured Oozie to run multiple Hive and Pig jobs
  • Worked with management to determine the optimal way to report on data sets
  • Installed, configured, and monitored Hadoop Clusters using Cloudera
  • Installed, upgraded, and patched ecosystem products using Cloudera Manager
  • Balanced and tuned HDFS, Hive, Impala, MapReduce, and Oozie workflows
  • Maintained and backed up meta-data
  • Configured Kerberos for the clusters
  • Used data integration tools like Flume and Sqoop
  • Set up automated processes to analyze the system and find errors
  • Supported IT department in cluster hardware upgrades
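
A hedged sketch of the AWS Lambda-to-Glue wiring referenced above ("see the Lambda/Glue sketch"); the job name, argument key, and S3 event trigger are illustrative assumptions, not the project's actual configuration:

    # Hypothetical Lambda handler: an S3 put event starts an AWS Glue ETL job.
    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        started = []
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Hand the newly arrived object to the Glue job as a job argument.
            response = glue.start_job_run(
                JobName="example-etl-job",
                Arguments={"--source_path": f"s3://{bucket}/{key}"},
            )
            started.append(response["JobRunId"])
        return {"started_job_runs": started}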

Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Linux Red Hat

Confidential, Seattle

Senior Engineer

Responsibilities:

  • Developed reports using PySpark programming for data analysis and Python programming for automation and monitoring.
  • Implemented CI/CD pipeline. Created data pipelines.
  • Developed and supported software using Python, Bash, MapR, Hadoop, and Amazon Web Services (including S3, EC2, VPCs, and IAM).
  • Automated and orchestrated tasks using Ansible.
  • Automated software environments using Chef and developed and supported software using Java.
  • Installed servers and components using PowerShell and shell scripts.
  • Very good working knowledge of AWS services.
  • Completed setup and configuration of Hadoop clusters.
  • Automated tasks such as creating and disabling users and groups using PowerShell.
  • Deployed Windows 2003 Active Directory and DHCP in CentOS environments.
  • Worked on all types of hardware from Dell, HP, and IBM; installed Linux and Windows operating systems.
  • Set up, configured, and managed security for the Cloudera Hadoop cluster.
  • Monitored and installed services using Python scripting and shell scripting (a minimal monitoring sketch follows this list).
  • Worked on MapR and Hortonworks Hadoop clusters.
  • Used Hive and Pig to perform data analysis
  • Worked on managing Ambari, ZooKeeper, Hue, and YARN
  • Loaded log data into HDFS using Flume
  • Created multi-cluster test to test the system's performance and failover
  • Improved a high-performance cache, leading to greater stability and improved performance
  • Built a scalable Hadoop cluster for data solution
  • Responsible for maintenance and creation of nodes
  • Managed log files, backups and capacity
  • Found and troubleshot Hadoop errors
  • Worked with other teams to decide the hardware configuration
  • Implemented cluster high availability
  • Scheduled jobs using Fair Scheduler
  • Configured alerts to find possible errors
  • Handled patches and updates
  • Worked with developers to set up a full Hadoop system on AWS
  • Well experienced with AWS IAM, S3, VPC, subnets, OpsWorks, Route 53, CloudFormation, Service Catalog, and EMR services.
  • Experience in managing Red Hat IPA (Identity, Policy, and Audit).
  • Set up and managed cluster servers on AWS
  • Worked on Python and shell scripting
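
A minimal sketch of the Python-based monitoring mentioned above; the threshold, path, and alert hook are assumptions, and the 'hdfs' command-line client is assumed to be on the PATH:

    # Illustrative check: warn when HDFS usage crosses a threshold.
    import subprocess

    THRESHOLD_PCT = 85  # hypothetical alert threshold

    def hdfs_usage_pct(path="/"):
        # 'hdfs dfs -df' prints filesystem, size, used, available, and use%.
        out = subprocess.check_output(["hdfs", "dfs", "-df", path], text=True)
        fields = out.strip().splitlines()[1].split()
        size_bytes, used_bytes = int(fields[1]), int(fields[2])
        return 100.0 * used_bytes / size_bytes

    if __name__ == "__main__":
        usage = hdfs_usage_pct()
        if usage > THRESHOLD_PCT:
            print(f"WARNING: HDFS usage at {usage:.1f}%")  # hook for email/Nagios alert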

Environment: HDFS, CDH3, CDH4, HBase, NoSQL, RHEL 4/5/6, Hive, Pig, Perl scripting, AWS S3, EC2

Confidential, San Jose

IT consultant

Responsibilities:

  • Worked on Python programming, shell scripting, and Hadoop cluster ecosystem components.
  • Implemented CI/CD pipelines with Jenkins, Git, Ansible, and Chef, including in the AWS environment.
  • Implemented and deployed products using AWS services.
  • Implemented multiple nodes in the Hadoop cluster on Red Hat Linux and CentOS.
  • Imported data from the Linux file system to HDFS (see the sketch after this list).
  • Implemented different applications using Python scripting and shell scripting.
  • Loaded data from UNIX to HDFS.
  • Deployed servers using automation.
  • Used Windows Server 2008/2012 Active Directory and Exchange Management Console for Exchange 2003, 2007, and 2010.
  • Monitored trouble ticket queue to attend user and system calls.
  • Set up a domain and Active Directory on Windows Server 2008
  • Worked as a support engineer supporting Active Directory on Windows 2003/2008, group policies, DNS, DHCP, Citrix, VM, SCOM, SCCM, and Windows Server Exchange 2007/2010 with O365.
  • Experienced in configuring and installing VMware (virtualization) and Hyper-V
  • Managed and monitored all D2D and D2T backups for the enterprise.
  • Performed daily, weekly, and monthly disaster recovery audits.
  • Ensured company data was stored safely and securely by performing random audits of the tape storage vendor.
  • Worked on Apache 2 and Apache SSL configuration
  • Developed and implemented policies and procedures for computer systems operations and development, and ensured the technology was accessible and equipped with current hardware and software.
  • Provisioned and maintained server infrastructure services such as Active Directory (AD), DHCP, DNS, authentication, and network services
  • Experience in deployment and management of complex Active Directory (AD) environment
  • Hands-on knowledge of Active Directory Federation Services 2.0 (ADFS 2.0)
  • Experience integrating ADFS and/or OpenAM with Azure AD
  • Scripting experience in PowerShell
  • Set up whole-root zones/containers on Solaris 10 for application management; modified zone settings and network settings, imported file systems, and migrated zone paths and zones between Solaris 10 servers.
  • Created and configured volumes on Solaris 10 systems; configured and managed storage volumes such as LVM volumes on RHEL/CentOS systems.
  • Installed and configured the iSCSI utility on RHEL/CentOS 6.4 servers for network-attached storage.
  • Configured the Apache web server on Solaris 10; installed and configured a Samba server for quick publishing using a third-party web page maker.
  • Installed and configured LAMP on RHEL/CentOS servers.
  • Moderate experience with the Chef configuration manager on RHEL/CentOS servers.
  • Built a scalable distributed data solution
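
A hedged sketch of the Linux-filesystem-to-HDFS loads referenced above ("see the sketch"); the directory names are placeholders and the 'hdfs' command-line client is assumed to be available:

    # Copy local files into HDFS; paths are illustrative only.
    import subprocess
    from pathlib import Path

    LOCAL_DIR = Path("/data/incoming")   # hypothetical local landing directory
    HDFS_DIR = "/landing/incoming"       # hypothetical HDFS target directory

    def load_to_hdfs():
        subprocess.run(["hdfs", "dfs", "-mkdir", "-p", HDFS_DIR], check=True)
        for f in sorted(LOCAL_DIR.glob("*.csv")):
            # -f overwrites an existing copy so reruns are idempotent.
            subprocess.run(["hdfs", "dfs", "-put", "-f", str(f), HDFS_DIR],
                           check=True)

    if __name__ == "__main__":
        load_to_hdfs()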

Confidential

System Engineer

Responsibilities:

  • Python programming on Hadoop clusters and AWS services for automation and development.
  • Worked on a scalable distributed data system using the Hadoop ecosystem on AWS EMR and the MapR data platform.
  • Completed the internal employee data analysis using Python.
  • Worked on Jenkins for CI/CD pipelines in AWS and on-premises environments.
  • Implemented monitoring of critical services using automation tools such as Nagios.
  • Installed and monitored the Hadoop cluster and big data ecosystem components.
  • Deployed applications using Python, shell scripting, and PowerShell scripting.
  • Deployed Windows 2003 Active Directory and DHCP in CentOS environments.
  • Created VPCs, IAM policies, and roles in the AWS environment (see the sketch after this list).
  • Implemented a CI/CD pipeline.
  • Extensively used Python and shell scripting.
  • Used a wide range of AWS services.
  • Worked on Cloudera HDFS.
  • Used Windows Server 2008/2012 Active Directory and Exchange Management Console for Exchange 2003, 2007, and 2010.
  • Provided support for virtual server solutions including VMware and ESX hosts, adding disk space from SAN and expanding partitions using DiskPart and ExtPart; handled server patching and Windows Server clustering setup.
  • Supported client backups, including local Symantec, Tivoli, and Commvault, and created new backup jobs for new servers.
  • Provided remote support to clients using Kaseya, Secret Server, and Join.me.
  • Managed Citrix and XenApp environments.
  • Maintained multiple Symantec Endpoint antivirus servers at multiple client sites.
  • Performed daily checks of Symantec Backup Exec backups at office locations; set up and maintained the tape rotation schedule, created new backup jobs for new servers, and set up and maintained the Data Domain storage and replication system.
  • Built new Windows 2003 and 2008 servers; added systems to the WhatsUp Gold SNMP tracking system, Symantec Backup, and McAfee antivirus software.
  • Built HP blade servers using iLO and virtual servers using VMware vSphere 4.1.
  • Daily use of Active Directory and Exchange Management Console.
  • Documented build procedures for physical and virtual servers.
  • Used Automate IT software to provide replication of directories across datacenter servers.
  • Maintained the McAfee antivirus system, including maintaining 4 repositories and compliance for 200 servers and 600 workstations.
  • Provided daily support for Help Desk tickets on the Aldon tracking system.
  • Monitored and analyzed servers, resolved problems, and maintained system reporting and tuning.
  • Created users, managed user permissions, and maintained user and file system quotas on Linux servers.
  • Configured volume groups and logical volumes, and extended logical volumes for file system growth needs using Logical Volume Manager (LVM) commands.
  • Maintained security integrity with the use of group policies across domains.
  • Supported users through email, on-call support, and troubleshooting.
  • Maintained inventory of all components, including systems and other hardware.
  • Performed user account management, data backups, and user logon support.
  • Maintained user data backups by creating per-user folders on the file server and applying security permissions to the folders.
  • Monitored trouble ticket queue to attend user and system calls.
  • Attended team meetings and change control meetings to update installation progress and review upcoming environment changes.
  • Imported data from Linux file system to HDFS
  • Loaded data from UNIX to HDFS
  • Performed day-to-day technical support, analyzing, troubleshooting, and resolving technical problems related to servers.
  • Performed user and security administration and implemented file permissions for users and groups.
  • Configured Role-Based Access Control (RBAC) and Access Control Lists (Confidential).
  • Maintained Service Management Facility (SMF) in Solaris 10.
  • Resolved system hardware and software errors and crashes, huge file sizes, and file-system-full errors on UNIX and Linux servers.
  • Responsible for package and patch management and installation on servers.
  • Extensively worked on administering NFS, DNS, DHCP, NIS, NIS+, LDAP, mail servers, and Samba servers.
  • Installed, configured, and maintained WebLogic Server, WebSphere Application Server, and Apache/Tomcat web servers on UNIX.
  • Encapsulated and mirrored the root disk; implemented RAID 0, RAID 1, RAID 0+1, and RAID 5 on multiple disks using Solaris Volume Manager.
  • Increased and decreased file system sizes using Logical Volume Manager.
  • Provided network and server development and support.
  • Developed and implemented the UNIX part of the IT infrastructure for the company's customers.
  • Utilized Hyper-V to deploy and configure Linux servers.
  • Supported and administered Linux servers for the company's customers.
  • Implemented a sophisticated mail system with outgoing IP spreading based on sender domains.
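
A hedged boto3 sketch of the VPC and IAM work referenced above ("see the sketch"); the CIDR blocks, role name, attached policy, and trust policy are illustrative assumptions only:

    # Create a VPC with one subnet and an IAM role that EC2/EMR nodes can assume.
    import json
    import boto3

    ec2 = boto3.client("ec2")
    iam = boto3.client("iam")

    vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
    ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.create_role(RoleName="example-emr-node-role",
                    AssumeRolePolicyDocument=json.dumps(trust_policy))
    iam.attach_role_policy(
        RoleName="example-emr-node-role",
        PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
    )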
