
Cloud/Big Data Developer Resume


Malvern, PA

PROFESSIONAL SUMMARY:

  • 5+ years of extensive experience in the IT industry, involved in all phases of the project life cycle (SDLC) and Agile - Analysis, Design, Development, Testing, and Documentation - coming together in the field of DevOps and AWS, with skills in Linux Administration, Continuous Integration, Configuration Management, Automation, and Monitoring technologies for the banking, financial, healthcare, and insurance domains.
  • Effective in leading applications with end-to-end responsibilities using C, C++, and Client/Server technologies, with exposure to different domains like Banking, Health Care, and Retail.
  • Extensive experience working for various clients in Aviation, Insurance, Health Care, and Banking as a web-based application developer, Cloud Developer, and DevOps Engineer.
  • Experience in the configuration, deployment, and support of cloud services, including Amazon Web Services (AWS).
  • Experience with Agile and Scrum methodologies. Involved in designing, creating, and managing Continuous Build and Integration environments.
  • Strong knowledge of and experience with Amazon Web Services (AWS) Cloud services like EC2, S3, EBS, RDS, VPC, and IAM.
  • Capable of utilizing automated build and deployment tools like Maven, Nexus, Hudson/Jenkins, TeamCity, Ansible, and Puppet/Chef.
  • Designed and managed public/private cloud infrastructures using Amazon Web Services (AWS), including EC2, S3, CloudFront, Elastic File System, RDS, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, CloudFormation, and IAM, which allowed automated operations. Deployed CloudFront to deliver content, further reducing the load on the servers.
  • Optimized Amazon Redshift clusters, Apache Hadoop clusters, data distribution, and data processing.
  • Hands-on involvement with Ant, Maven, Java, XML, Ruby, Perl, and Shell scripts in automated build processes.
  • Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs: whenever a new GitHub branch is started, Jenkins, our Continuous Integration server, automatically attempts to build a new Docker container from it. The Docker container leverages Linux containers and has the AMI baked in. Converted our staging and production environments from a handful of AMIs to a single bare-metal host running Docker.
  • Monitored resources and applications using AWS CloudWatch, including creating alarms to monitor metrics for EBS, EC2, ELB, RDS, S3, and SNS, and configured notifications for the alarms generated based on defined events (see the sketch after this list).
  • Used Terraform to map more complex dependencies and identify network issues.
  • Hands-on experience with Terraform, a tool for building, changing, and versioning infrastructure safely and efficiently.
  • Strong in developing MapReduce applications, configuring the development environment, tuning jobs, and creating MapReduce workflows.
  • Experience in performing data enrichment, cleansing, analytics, and aggregations using Hive and Pig.
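
A minimal sketch (Python, boto3) of the kind of CloudWatch alarm setup described in the monitoring bullet above; the instance ID, SNS topic ARN, and thresholds are illustrative placeholders, not values from this work history:

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Hypothetical identifiers used only for illustration.
    INSTANCE_ID = "i-0123456789abcdef0"
    ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"

    # Alarm when average CPU utilization stays above 80% for two
    # consecutive 5-minute periods; notify the ops SNS topic.
    cloudwatch.put_metric_alarm(
        AlarmName="ec2-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[ALERT_TOPIC_ARN],
    )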

TECHNICAL SKILLS:

Programming Languages: C, C++, Java, Unix/Linux Scripting, HTML, Ruby, SQL, PL/SQL

Web Technologies: ASP.NET, MVC, Web Forms, WinForms, ADO.NET, Web Services, Silverlight, ASP, HTML, XML, CSS, JavaScript, Angular, jQuery, XAML, JSON

Cloud Computing/CI and CM Tools: AWS, Jenkins, Chef, Puppet, Docker, Ansible

Protocols/Services: DNS, HTTP, HTTPS, NFS, TLS/SSL, DHCP, IPv6

Web/Application Servers: Apache, IIS, Apache Tomcat, JBoss, WebLogic

AWS Cloud Tools: EC2, Elastic Load Balancing (ELB), Elastic Container Service (Docker containers), S3, Elastic Beanstalk, CloudFront, Elastic File System, RDS, DynamoDB, DMS, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, CloudFormation, IAM, EMR, AMI, Lambda

Databases: SQL Server 2008 R2/2008/2005/2000, PL/SQL, MS Access 2007/2003/2000, DynamoDB, MongoDB

Version Control: TFS 2010/2008, GIT, SVN

DevOps Tools: Chef, Jenkins, Docker, Bamboo, Puppet

Build Tools: Ant, Maven, Gradle, Jenkins

PROFESSIONAL EXPERIENCE:

Confidential, Malvern, PA

Cloud/Big Data Developer

Responsibilities:

  • Worked extensively on Hadoop components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, YARN, Spark, and MapReduce programming.
  • Analyzed data that needed to be loaded into Hadoop and contacted the respective source teams to get table information and connection details.
  • Used Sqoop to import data from different RDBMS systems like Oracle, DB2, and Netezza and loaded it into HDFS.
  • Created Hive tables and partitioned data for better performance. Implemented Hive UDFs and did performance tuning for better results.
  • Developed MapReduce programs to clean and aggregate the data.
  • Implemented optimized map joins to get data from different sources to perform cleaning operations before applying algorithms.
  • Developed workflows in Oozie to manage and schedule jobs on the Hadoop cluster, triggering daily, weekly, and monthly batch cycles.
  • Implemented a POC to introduce Spark transformations.
  • Continuously worked with architects to design a Spark model for the existing MapReduce model.
  • Hands-on experience with various AWS Cloud services such as Redshift clusters and Route 53 domain configuration.
  • Built an on-demand secure EMR launcher with custom spark-submit steps using S3 events, SNS, KMS, and a Lambda function (see the sketch after this list).
  • Extensive knowledge of working on NiFi.
  • Migrated an existing on-premises application to AWS.
  • Used AWS services like EC2 and S3 for small data sets.
  • Used CloudWatch Logs to move application logs to S3 and created alarms based on exceptions raised by applications.
  • Monitored EMR clusters in the production area to ensure each step was carried out successfully and accessed Hue to verify that production data was loaded into tables.
  • Set up Splunk alerts to trigger emails whenever AWS code fails in the EMR cluster.
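
A minimal sketch (Python, boto3) of the on-demand EMR launcher pattern referenced above: a Lambda handler triggered by an S3 event that starts a transient EMR cluster with a spark-submit step. Cluster sizing, the release label, the job script path, and the role names are illustrative assumptions:

    import boto3

    emr = boto3.client("emr")

    def handler(event, context):
        # Pull the bucket/key that fired the S3 event notification.
        record = event["Records"][0]["s3"]
        input_path = "s3://{}/{}".format(record["bucket"]["name"], record["object"]["key"])

        # Launch a transient EMR cluster with a single spark-submit step.
        emr.run_job_flow(
            Name="on-demand-spark-job",
            ReleaseLabel="emr-5.29.0",
            Applications=[{"Name": "Spark"}],
            ServiceRole="EMR_DefaultRole",
            JobFlowRole="EMR_EC2_DefaultRole",
            Instances={
                "InstanceGroups": [
                    {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
                    {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
                ],
                "KeepJobFlowAliveWhenNoSteps": False,
            },
            Steps=[{
                "Name": "spark-etl",
                "ActionOnFailure": "TERMINATE_CLUSTER",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",
                    "Args": ["spark-submit", "s3://example-bucket/jobs/etl.py", input_path],
                },
            }],
        )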

Environment: Jenkins, Ansible, AWS Redshift, Maven 4.0, PCF, Git, Hue, Spark, HDFS, Splunk, Scala, Windows 7, AWS (EC2, S3, VPC, CloudWatch, NACL, Route 53, IAM, SQS, SNS, SES), Apache servers, Linux servers.

Confidential, Malvern, PA

Cloud/AWS Developer

Responsibilities:

  • Involved in the analysis, definition, design, implementation, and deployment phases of the full software development life cycle (SDLC) of the project.
  • Worked as a senior design engineer, mainly on C++, STL, data structures, UNIX, and multithreading.
  • Experience in building CloudFormation stacks for auto-provisioning of infrastructure, including DynamoDB, Kinesis Streams, and Lambda resources.
  • Worked with Troposphere to update the CloudFormation stack, adding new resources without disturbing the existing stack (see the sketch after this list).
  • Followed SysOps best practices such as deploying and maintaining servers in production.
  • Implemented AWS SysOps practices such as migrating on-premises databases to the cloud.
  • Managed Amazon Redshift clusters, including launching clusters by specifying the nodes and performing data analysis queries.
  • Experienced in performance tuning and query optimization in AWS Redshift.
  • Provisioned and managed compute with AWS Lambda, a serverless computing service.
  • Directed setup, use, and build scheduling for environments and implemented a Continuous Delivery pipeline. Designed and implemented CM requirements, approach, and tooling for Java (J2EE) and .NET-based application development. Designed, coded, and implemented automated build scripting in Ant, Ivy, Jenkins/Hudson, and Maven.
  • Responsible for end-to-end public cloud automation of application delivery, including infrastructure provisioning and integration with Continuous Integration/Continuous Delivery (CI/CD) platforms, using existing and emerging technologies.
  • Experience in using Crewnet and Confluence sites to access resources and make updates to the workflow. Involved in preparing the architecture design for the project.
  • Experience with Bitbucket and Git for creating code repositories, pushing changes, and resolving merge conflicts.
  • Programmed in C on the UNIX platform to contribute to the software project, which automated a customized design process.
  • Extensive use of Bamboo pipelines for checking builds. Experience in resolving build failures. Thorough understanding of build checks like Sonar scans and quality test gates.
  • Experience in writing Python code for Lambda functions to perform the necessary logic and derive values. Also, hands-on experience in writing Python unit test cases to check the written logic.
  • Experience in developing middleware components for software in C/C++ using STL, multithreading, data structures, IPC (TCP/IP socket programming), SNMP, and design patterns.
  • Extensively used Attunity to access IBM DB2 tables, creating the replication values and passing them on to an Amazon Kinesis stream.
  • Worked on Antiphony and accessed the AWS console to leverage resources to the desired level.
  • Upgraded Jenkins and deployed Tomcat 7 for an existing application. Configured LDAP authentication with the existing software structure.
  • Experience in creating Splunk dashboards for Lambda alerts and using filter functions in Lambda.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
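
A minimal sketch of the Troposphere usage described above: generating a CloudFormation template containing Kinesis and DynamoDB resources in Python. The resource names and capacities are placeholders, not the actual stack:

    from troposphere import Template
    from troposphere.dynamodb import AttributeDefinition, KeySchema, ProvisionedThroughput, Table
    from troposphere.kinesis import Stream

    template = Template()

    # Kinesis stream that replication events would be written to.
    template.add_resource(Stream("ReplicationStream", ShardCount=1))

    # DynamoDB table keyed on a single hash attribute.
    template.add_resource(Table(
        "EventsTable",
        AttributeDefinitions=[AttributeDefinition(AttributeName="eventId", AttributeType="S")],
        KeySchema=[KeySchema(AttributeName="eventId", KeyType="HASH")],
        ProvisionedThroughput=ProvisionedThroughput(ReadCapacityUnits=5, WriteCapacityUnits=5),
    ))

    # Emit the CloudFormation JSON for deployment or stack updates.
    print(template.to_json())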

Environment: Jenkins, Ansible, AWS Redshift, Maven 4.0, PCF, Git, LINQ, JSON, SoapUI, jQuery, Bootstrap, Windows 7, AWS (EC2, S3, VPC, CloudWatch, NACL, Route 53, IAM, SQS, SNS, SES), Apache servers, Linux servers

Confidential, Edison, NJ

AWS Cloud Engineer/ Developer

Responsibilities:

  • Extensively involved in requirements analysis, design, and development of AVMOSYS for displaying flight details to users.
  • Performed query and relational operations using the SQL Developer tool and SQL.
  • Built servers using AWS: imported volumes, launched EC2 instances, and created security groups, Auto Scaling groups, load balancers, Route 53 records, SES, and SNS within the defined Virtual Private Cloud.
  • Created multiple Terraform modules to manage configurations and services and to automate the installation process for web servers and AWS instances.
  • Experience in setting up build and deployment automation for Terraform scripts using Jenkins jobs.
  • Used Auto Scaling and Elastic Load Balancing on EC2 instances to serve end users during unexpected traffic/demand.
  • Configured IAM roles for EC2 instances and assigned them policies granting specific levels of access to S3 buckets (see the sketch after this list). Using the CloudWatch service, created alarms for monitoring EC2 server performance metrics such as CPU utilization and disk usage.
  • Used Amazon RDS Multi-AZ for automatic failover and high availability at the database tier for MySQL workloads.
  • Worked with key Terraform features such as infrastructure as code, execution plans, resource graphs, and change automation.
  • Configured and managed AWS Simple Notification Service (SNS) and Simple Queue Service (SQS).
  • Created snapshots and Amazon Machine Images (AMIs) of EC2 instances for backups and for creating clone instances.
  • Managed different infrastructure resources, like physical machines, VMs, and even Docker containers, using Terraform, which supports different cloud service providers like AWS and DigitalOcean.
  • Configured Elastic Load Balancers to distribute incoming traffic.
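
A minimal sketch (Python, boto3) of the IAM setup described above: an EC2 instance role limited to a single S3 bucket. The role, policy, and bucket names are hypothetical placeholders:

    import json
    import boto3

    iam = boto3.client("iam")

    ROLE_NAME = "app-ec2-role"        # hypothetical role name
    BUCKET = "example-app-bucket"     # hypothetical bucket

    # Trust policy letting EC2 instances assume the role.
    assume_role_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(assume_role_policy))

    # Inline policy granting read/write access to one bucket only.
    bucket_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::{}".format(BUCKET),
                "arn:aws:s3:::{}/*".format(BUCKET),
            ],
        }],
    }
    iam.put_role_policy(RoleName=ROLE_NAME, PolicyName="s3-bucket-access", PolicyDocument=json.dumps(bucket_policy))

    # Instance profile so the role can be attached to EC2 instances.
    iam.create_instance_profile(InstanceProfileName=ROLE_NAME)
    iam.add_role_to_instance_profile(InstanceProfileName=ROLE_NAME, RoleName=ROLE_NAME)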

Environment: C#, MVVM, CLSA, AWS (EC2, S3, RDS, EBS, ELB, VPC, CloudWatch, NACL, NAT, Route 53, DynamoDB, IAM, SQS, SNS), Apache servers, Linux servers

Confidential

AWS/ Linux System Administrator

Responsibilities:

  • Experience in designing and deploying AWS solutions using EC2, S3, EBS, Elastic Load Balancing (ELB), Auto Scaling groups, Redshift, and OpsWorks.
  • Worked on optimizing volumes and EC2 instances and created multiple VPCs. Managed and cross-trained the technical support team, teaching personnel Linux standards. Created a new product build environment that dropped build time from 2.5 hours to 17 minutes.
  • Maintained backup schedules for server storage (see the sketch after this list). Read and interpreted UNIX logs.
  • Configured and maintained RSA servers and Juniper network routers.
  • Experience working on user admin groups, maintaining accounts, and monitoring system performance using Nagios.
  • Worked with MySQL and PostgreSQL databases.
  • Experience with automation tools like Puppet, Jenkins, Ansible, and Nagios. Experience with TFS, Artifactory, and Git for source control.
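
A minimal sketch (Python, boto3) of the kind of scheduled storage backup mentioned above, snapshotting EBS volumes that carry a hypothetical Backup=true tag; the tag convention and region are assumptions, and the script would run from cron or another scheduler:

    import datetime
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Find volumes tagged for backup (tag key/value are an assumption for this sketch).
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Volumes"]

    stamp = datetime.datetime.utcnow().strftime("%Y-%m-%d")
    for volume in volumes:
        ec2.create_snapshot(
            VolumeId=volume["VolumeId"],
            Description="nightly backup {} {}".format(volume["VolumeId"], stamp),
        )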

Environment: RHEL, Puppet, Sun Solaris, AWS, HTTP web servers, Jenkins, Ansible, Chef, FTP, VMware vSphere.
