
Solutions Architect (Consultant) Resume

Dallas, TX

SUMMARY:

  • Over 9 years of industry experience in the software life cycle, including system analysis, design, administration, and infrastructure, with 4+ years in AWS cloud services (EC2, S3, IAM, VPC, CloudFormation, SQS, SNS, CloudWatch, Kinesis, Lambda, API Gateway, DynamoDB, ECS, CodePipeline, CodeDeploy) and DevOps tools (Jenkins, Ansible, Terraform), as well as microservices architecture, Docker, the ELK stack, DC/OS, Kubernetes for orchestration, and Grafana for monitoring
  • High-level understanding of Microsoft Azure
  • Managed and deployed Windows Azure-based applications for an internal project.
  • Architected infrastructure migrations: drove operational efforts to migrate all legacy services to a fully virtualized infrastructure.
  • Configured Azure traffic manager to manage live traffic.
  • Implemented HA deployment models with Azure Classic and Azure Resource Manager.
  • Azure availability and scalability: configured VM availability sets using the Azure portal to provide resiliency for IaaS-based solutions, and scale sets using Azure Resource Manager to manage network traffic
  • Designed, architected, and delivered authorization policies for users, groups, and roles using AWS Identity and Access Management (IAM)
  • Designed and implemented PaaS solutions on the AWS cloud
  • Followed AWS best practices while designing IAM for customers
  • Configured Network Access Control Lists (NACL) and Route Tables at subnet level
  • Configured Security Groups for instances in the VPC for public and private subnets
  • Designed and Implemented EC2 Compute, Networking and Storage for AWS Cloud virtual and hybrid environments using AWS Console
  • Designed Elastic Load Balancing (ELB) with Auto-Scaling Groups to address traffic and failover issues and to enhance resiliency in the network
  • Configured Web Application Firewall (WAF) for Application Load Balancer
  • Configured S3 bucket policies and Access Control Lists
  • Configured and implemented compute, storage, and networking using EC2, EBS, S3, ELB, RDS MySQL, Route 53 DNS, Glacier, EFS, CloudFront, CloudWatch, and CloudTrail
  • Integrated CloudTrail, CloudWatch, and Trusted Advisor
  • Created CloudFormation stacks/templates to automate resource creation
  • Provisioned various types of Elastic Compute Cloud (EC2) instances in a multi-tier environment based on the customer's need
  • Planned and Migrated on-prem multi-tiered IT infrastructure to AWS cloud VPC
  • Keywords: Lambda, OpsWorks, SES, SQS, SNS, RDS, DynamoDB, Redshift, Ansible, Python, VPN, ACL, AAA, TCP/IP, IPsec, SIEM, AMI
  • Requirement gathering and performing functional and detailed design analysis, and responsible for developing guidelines, standards and implementations
  • Drive architectural road-maps for influencing upcoming standards, tools, and technologies.
  • Defining Architectural Vision and analyzing gaps between Base Architecture and Target Architecture.
  • Solid hands-on experience designing AWS cloud architecture and DevOps
  • Solid hands-on experience provisioning EC2, EBS, S3, Elastic Load Balancer, Auto Scaling, ECS, CloudWatch alarms, CloudFormation templates, Virtual Private Cloud (VPC), IAM, RDS, Lambda, Elasticsearch, etc., based on architecture
  • Experience with Glacier, DynamoDB, Elastic Beanstalk, CloudFront, Route 53, Redshift, and CloudFormation, and with managing configuration management tools (Ansible).
  • Cloud migration of VMware ESXi/vCenter-based applications into AWS, using AWS DMS for migrating RDBMS databases to AWS RDS
  • Implemented infrastructure as code with Terraform, building cloud-agnostic solutions with the development team
  • Architected and designed Cloud Monitoring Services by using AWS Services.
  • Re-architected the existing platform to improve the performance of the real time data ingestion application.
  • Architected and implemented a real-time data analytics platform using Kinesis, Lambda, and S3.
  • Designed data ingestion and architecture for a data-intensive application following the Lambda big data architecture, with both streaming and batch ingestion
  • Solid hands-on experience designing and creating complete CloudFormation templates (JSON/YAML) to implement the whole AWS infrastructure through scripting.
  • Worked on designing highly available, cost-effective, and fault-tolerant systems using multiple EC2 instances, Auto Scaling, Elastic Load Balancing, and AMIs.
  • Good experience on Amazon AWS IAM Service: IAM Policies, Roles, Users, Groups, AWS Access Keys and MFA.
  • Configured AWS IAM and security groups in public and private subnets in a VPC. Architected, designed, and developed backup and archiving, and disaster recovery, in the AWS cloud.
  • Suggest architectural improvements, design and integration solutions, and formulate methodologies for business optimization.
  • Proficient in AWS services: EC2, IAM, S3, Lambda, CloudFront, CloudWatch, Redshift, DynamoDB, SNS, SQS, SES, EMR, Elastic Beanstalk, VPC, ELB, RDS, EBS, and Route 53.
  • Kubernetes cluster configuration using kubeadm
  • Managed cluster storage, networking, and kubelets with etcd; auto-scaled services and scheduled them with the Kubernetes scheduler
  • Managed multiple Kubernetes services and service networking; troubleshot services
  • Integrated Jenkins with Kubernetes and with SCM for CI/CD
  • Application lifecycle management with Kubernetes; monitoring and logging using a custom monitoring application integrated into Kubernetes
  • Built CI/CD pipelines with Jenkins, Ansible for auto-configuration, and Kubernetes
  • Configuring and implementing pipeline for production using build automation and SCM
  • Develop data architecture design to facilitate targeted customer analysis.
  • Experienced working with methodologies like Agile SCRUM and SDLC Waterfall.
  • Proficient in developing cloud security policies and strategies on par with the organization's compliance structure; provided risk management and mitigation recommendations for projects in the organization.
  • Expertise in architecting for scalability, high availability, and disaster recovery (DR) design on cloud infrastructures, providing solutions based on a range of cloud technologies and services.
  • Strong analytical, problem solving, organizational and planning skills.
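The infrastructure-as-code work summarized above (Terraform provisioning of EC2, S3, etc.) might be sketched as a minimal, hypothetical Terraform configuration; the region, bucket name, and AMI ID below are illustrative placeholders, not taken from any actual project.

```hcl
# Hypothetical sketch only: provision an S3 bucket and a small EC2 instance.
provider "aws" {
  region = "us-east-1" # illustrative region
}

resource "aws_s3_bucket" "artifacts" {
  bucket = "example-artifacts-bucket" # placeholder name; must be globally unique
}

resource "aws_instance" "app" {
  ami           = "ami-0abcdef1234567890" # placeholder AMI ID
  instance_type = "t3.micro"

  tags = {
    Name = "app-server"
  }
}
```

A `terraform plan` against such a file shows the resources to be created before any change is applied, which is what makes this style of provisioning reviewable and repeatable.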

WORK EXPERIENCE:

Solutions Architect (Consultant)

Confidential, Dallas, TX

Responsibilities:

  • Launched Amazon EC2 cloud instances using Amazon Web Services (Linux) and configured launched instances for specific applications using IaC CloudFormation and Terraform templates
  • Created S3 buckets, bucket policies, and IAM role-based policies using IaC
  • Implemented PaaS solutions for clients
  • Used the AWS CLI for rapid checks of resources, troubleshooting, and investigating instances
  • Implemented AWS security solutions as required by business engagements and legal restrictions
  • Designed a highly scalable, fault-tolerant, highly available, and secured distributed infrastructure (IaaS) using EC2 instances, EBS, S3, RDS, ELB, Auto Scaling, Lambda, Redshift, DynamoDB, etc., in the AWS cloud
  • Gathered user requirements and performed functional and detailed design analysis.
  • Designing and implementing both the front-end and back-end systems that run on AWS on par with organization compliance and security policies.
  • Provided Migration plan and strategy for cloud, and strategy to migrate infrastructure and data from On-premises data center to AWS Cloud. Data center migration plans. Database Migrations and Applications migrations
  • Configuring DNS (Route53), ELB, general networking principles, firewalls, route tables and route propagations.
  • Created and maintained SSL security certificates for the enterprise, maintaining certificates across multiple SSL providers and integrating certificates into products such as Nginx, Apache, and AWS ELB.
  • Built servers using AWS: imported volumes, launched EC2 and RDS instances, and created security groups, auto scaling, and load balancers (ELBs) in the defined Virtual Private Cloud.
  • Defined access policies and access groups for internal users as well as customers
  • Designed a robust environment for connectivity between on-premises systems and the cloud for existing on-premises apps (hybrid cloud design)
  • Implemented monitoring of the cloud environment and a notification system using CloudWatch and SNS.
  • Designed a successful data migration approach, using AWS DMS and the Schema Conversion Tool, and a policy for infrastructure migration to the cloud environment
  • Created and presented MS PowerPoint presentations for technical and non-technical management teams.
  • Created and designed a highly secured Virtual Private Cloud (VPC)
  • Working Experience implementing VPC peering and VPN connect with AWS VPC
  • Working knowledge of implementing Active Directory with AWS
  • Followed Agile Methodology and scrum for implementation.
  • Designed and implemented a Continuous Delivery platform, providing a complete working Continuous Delivery solution using industry-standard open source tools such as Jenkins and Ansible, along with AWS CodeDeploy, CodePipeline, CodeCommit, and CodeBuild
  • Solid experience designing a data retention strategy along with an automatic backup plan using SNS and a scheduler
  • Designed and created a highly scalable, highly available, fault-tolerant, and highly secured distributed infrastructure (IaaS) using AWS EC2 instances, EBS snapshots, S3, Elastic Load Balancer, Auto Scaling, CloudWatch, CloudFormation, RDS, KMS, Lambda, Redshift, SNS, etc.
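As a hedged illustration of the CloudFormation template work described in this role (VPC design, subnets, security groups), a minimal YAML template might look like the following; every logical name and CIDR range here is an assumption made for the sketch.

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Hypothetical sketch of a minimal VPC stack (illustrative only)

Resources:
  AppVpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16        # placeholder address range

  PublicSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref AppVpc
      CidrBlock: 10.0.1.0/24
      MapPublicIpOnLaunch: true

  WebSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound HTTPS only
      VpcId: !Ref AppVpc
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 443
          ToPort: 443
          CidrIp: 0.0.0.0/0
```

Keeping the network layer in a stack like this lets the same topology be recreated per environment and reviewed as code rather than configured by hand in the console.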

Big Data Solutions Architect

Confidential

Responsibilities:

  • Assessed business requirements for each project phase and monitored progress to consistently meet deadlines, standards, and cost targets. Implemented cost-savings initiatives to reduce infrastructure costs by consolidating databases, and established development standards. Automated various database and infrastructure tasks.
  • Designed and Implemented Enterprise Data Lake Platform (EDP) for storing vast quantities of data in different formats. Defined Capacity for storing data across multiple node Cluster at various phases
  • Experience in Hadoop cluster maintenance, including data and metadata backups, file system checks, commissioning and decommissioning nodes, and upgrades. Hands-on experience installing, configuring, and using Hadoop ecosystem components such as HDFS, MapReduce, YARN, ZooKeeper, Sentry, Sqoop, Flume, Hive, HBase, Pig, and Oozie.
  • Experience analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java
  • Hands-on experience with Hadoop, HDFS, MapReduce, Hive, Pig, Spark, and Scala.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experience performance-tuning the Hadoop cluster by gathering and analyzing the existing infrastructure.
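The data lake tables described above are typically defined in Hive as partitioned external tables over HDFS paths. A hypothetical HiveQL sketch follows; the table, columns, and location are illustrative names, not from any real project.

```sql
-- Hypothetical sketch: a partitioned external table in an HDFS-backed data lake.
-- Partitioning by date keeps queries scanning only the relevant directories.
CREATE EXTERNAL TABLE IF NOT EXISTS orders (
  order_id    BIGINT,
  customer_id BIGINT,
  amount      DECIMAL(10,2)
)
PARTITIONED BY (order_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/lake/orders';
```

Because the table is external, dropping it removes only the metadata while the underlying HDFS files survive, which suits a shared data lake where multiple tools read the same files.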

Azure Administrator

Confidential

Responsibilities:

  • Developed a migration approach to move workloads from on-premises to Windows Azure and developed new cloud-ready application solutions.
  • Configured cloud platform components such as Virtual Networks, VMs, Azure AD, Load Balancers, Cloud Services, etc.
  • Used Azure Active Directory for Multi-Factor Authentication (MFA) and integrated it with virtual desktops for users.
  • Created and managed endpoints using Azure Traffic Manager.
  • Added and managed co-admins for all subscriptions in the Windows Azure platform.
  • Worked with Azure platform development concepts, hosted cloud services, and platform services, in close interface with Windows Azure Multi-Factor Authentication.
  • Worked on Private Cloud and Hybrid cloud configurations, patterns, and practices in Windows Azure and in Azure web and database deployments.
  • Designed Azure Resource Manager templates; extensive experience designing custom build steps
  • Administering and managing Windows server Active Directory services, DHCP and DNS servers.
  • Management of DHCP, WINS, DNS and Active Directory.
  • Help implement established system wide objectives; utilize project management processes for communicating, planning, designing, documenting, implementing, and removing information security related changes.
  • Performed security audits to ensure best practices are maintained on hardware and virtualized server farms.
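The Azure Resource Manager template work mentioned above could be sketched as a small JSON fragment like the one below; the resource name, fault/update domain counts, and `apiVersion` are all illustrative assumptions, not drawn from any real deployment.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Compute/availabilitySets",
      "apiVersion": "2021-03-01",
      "name": "app-avset",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Aligned" },
      "properties": {
        "platformFaultDomainCount": 2,
        "platformUpdateDomainCount": 5
      }
    }
  ]
}
```

Spreading VMs across fault and update domains in an availability set like this is what gives the IaaS resiliency described in the summary.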

Data Analyst | AWS DevOps | Big Data Architect

Confidential, Dallas, TX

Responsibilities:

  • Hands-on experience in Amazon Web Services (AWS) provisioning and good knowledge of AWS services such as EC2, Auto Scaling, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Elastic Beanstalk, CloudFront, Elastic File System, VPC, Route 53, CloudWatch, CloudFormation, and IAM.
  • Involved in designing and deploying large applications utilizing almost all of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling using AWS CloudFormation.
  • Managed multiple AWS accounts with multiple VPCs for both production and non-prod, where primary objectives included automation, build-out, integration, and cost control.
  • Developed CloudFormation scripts to automate the entire CD pipeline.
  • Created and managed multiple Instances deployed for several test applications in those instances in QA environment.
  • Configured a VPC and provisioned EC2 instances, EBS in different availability zones.
  • Implemented and maintained the monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.
  • Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
  • Created CloudWatch alerts for instances and used them in Auto Scaling launch configurations.
  • Implementing good security compliance measures with MFA and authentication and Authorization with IAM roles
  • Backing up the instances by taking snapshots of the required servers regularly.
  • Setting up and administering DNS system in AWS using Route53.
  • Wrote Ansible playbooks that can provision several pre-prod environments and automate deployment, instance mirroring, and several proprietary middleware installations.
  • Provisioned instances to support big data operations in a large-scale enterprise environment with auto scaling and auto-configuration
  • Experience in Hadoop cluster maintenance, including data and metadata backups, file system checks, commissioning and decommissioning nodes, and upgrades. Hands-on experience installing, configuring, and using Hadoop ecosystem components such as HDFS, MapReduce, YARN, ZooKeeper, Sentry, Sqoop, Flume, Hive, HBase, Pig, and Oozie.
  • Hands-on experience with Hadoop, HDFS, MapReduce, Hive, Pig, Spark, and Scala.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experience performance-tuning the Hadoop cluster by gathering and analyzing the existing infrastructure.
  • Worked on Agile Methodology.
  • Experience with running, configuration and management on AWS.
  • Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Involved in managing and reviewing Hadoop log files
  • Created Hive Scripts to process the data.
  • Loaded and transformed large structured, unstructured, and semi-structured datasets
  • Imported and exported data into HDFS and Hive using Sqoop
  • Involved in creating Hive external tables, loading data, and writing Hive queries which run internally as MapReduce jobs.
  • Performed analysis with Spark SQL using DataFrames, and visualized results using Tableau
  • Good knowledge of the messaging frameworks Kafka and Flume
  • Worked with cloud services like Amazon Web Services (AWS) and involved in ETL, Data Integration and Migration.
  • Converted Hive/SQL queries into Spark transformations using Spark RDDs, Python, and Scala.
  • Imported data from different sources such as HDFS/HBase into Spark RDDs.
  • Used Flume, Kafka to aggregate log data into HDFS.
  • Created multiple Hive tables, implemented Partitioning, Dynamic Partitioning and Buckets in Hive for efficient data access.
  • Excellent experience in ETL analysis, designing, developing, testing and implementing ETL processes including performance tuning and query optimizing of database.
  • Implemented Spark RDD transformations and actions to migrate MapReduce algorithms.
  • Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
  • Understanding and usage of Elasticsearch and the ELK stack

Technologies: HDFS, CDH4, Kafka, Cassandra, Hive, Pig, Oozie, MapReduce, Java, Sqoop, Oracle; Hadoop distributions: Hortonworks, Cloudera
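The Spark RDD pipelines described above (map, flatMap, reduceByKey) follow a pattern that can be sketched in plain Python without a cluster. The helper names below are hypothetical stand-ins for the Spark operations, shown on a classic word-count example; this is an illustration of the pattern, not Spark's actual API.

```python
def flat_map(records, fn):
    """Spark-style flatMap: apply fn to each record and flatten the results."""
    return [item for rec in records for item in fn(rec)]

def map_pairs(records, fn):
    """Spark-style map: transform each record one-to-one."""
    return [fn(rec) for rec in records]

def reduce_by_key(pairs, fn):
    """Spark-style reduceByKey: combine all values sharing a key with fn."""
    acc = {}
    for key, value in pairs:
        acc[key] = fn(acc[key], value) if key in acc else value
    return sorted(acc.items())

# Word count, the canonical RDD pipeline: flatMap -> map -> reduceByKey.
lines = ["spark makes rdds", "rdds are resilient", "spark is fast"]
words = flat_map(lines, lambda line: line.split())
pairs = map_pairs(words, lambda w: (w, 1))
counts = reduce_by_key(pairs, lambda a, b: a + b)
print(counts)
```

In actual Spark the same chain would run partitioned across executors, with a shuffle between the map and reduce stages; the data flow, however, is exactly this.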

Admin Support

Confidential

Responsibilities:

  • Provided day-to-day hardware and software support for switches, routers, printers, and servers, performing roles such as designing, organizing, modifying, installing, and supporting computer systems.
  • Installed, upgraded, and maintained desktop/laptop PCs
  • Installed and supported LANs, WANs, networking, internet, and intranet systems
  • Troubleshot and resolved day-to-day MS Office and Outlook issues
  • Monitor networks to ensure security
  • User control and Windows XP Remote Assistance
  • Configured Cisco routers using RIP, IGRP, OSPF, EIGRP, BGP, and MPLS.
  • Microsoft Exchange, Active Directory, TCP/IP, DNS, DHCP
  • Remote Help Desk Systems
  • Identified user needs by evaluating and modifying system performance
  • Ensure connectivity
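The Cisco routing work listed above could be illustrated with a minimal, hypothetical OSPF fragment; the process ID, router ID, networks, and areas are all placeholders, not from any real configuration.

```
! Hypothetical Cisco IOS sketch: enable OSPF on two directly connected networks.
router ospf 1
 router-id 1.1.1.1
 network 10.0.0.0 0.0.0.255 area 0
 network 10.0.1.0 0.0.0.255 area 1
```

Interfaces whose addresses fall inside the `network` statements (note the wildcard masks) join the named OSPF areas and begin exchanging link-state advertisements with neighbors.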
