
AWS Solution Architect with DevOps Resume


Reston, VA

PROFESSIONAL SUMMARY:

  • 8+ years of total IT experience, including nearly 2 years with Amazon Web Services (AWS) and DevOps tools and methodologies.
  • Working knowledge of DevOps methods and CI/CD automation practices (Ansible, Git and GitHub).
  • Proficient with DevOps release tools such as Ansible; automated configuration management and deployments using Ansible playbooks and YAML.
  • Worked extensively with the AWS CLI to interface with and manage the cloud ecosystem.
  • Developed custom CloudFormation templates to spin up entire infrastructures on the AWS cloud.
  • Scripted automation for the Linux toolbox using Bash scripts.
  • Advanced knowledge of the primary AWS services (EC2, ELB, RDS, Route53 & S3).
  • Used CloudFormation as the provisioning mechanism for a broad range of AWS resources.
  • Designed, configured, managed, and maintained the deployment and operations using Amazon services like VPC (app/database servers in public and private subnets), ELB, EC2, S3, Route 53 and CloudWatch for monitoring.
  • Strong understanding of the AWS Glue service (Data Catalog, crawlers, code generation, developer endpoints and job scheduler) and of implementing ETL solutions with it.
  • Good understanding of the overall Big Data ecosystem, with high-level knowledge of Hadoop, HBase, Hive, Java/MapReduce and Spark.
  • Good exposure to serverless architecture with AWS Lambda service.
  • Expertise in AWS Identity and Access Management (IAM) components, including users, groups, roles, policies and password policies.
  • Experience creating user accounts for different teams, with access to core AWS services granted through IAM.
  • Used Amazon RDS (MySQL and PostgreSQL) to manage databases, create snapshots, and automate backups.
  • Created both Windows and Linux AMI-based servers for EC2 instances.
  • Skilled at providing day-to-day support of applications and services and resolving functional and performance issues.
  • Designed and developed AWS CloudFormation templates to create custom VPCs, subnets and NAT gateways to ensure reliable deployment of web applications (see the first sketch after this list).
  • Skilled at studying an existing technology landscape and understanding current application workloads.
  • Understand and document technical requirements from clients.
  • Understand virtualization technologies like VMware and Oracle VirtualBox.
  • Ability to define a migration strategy to move applications to the cloud and to develop architecture blueprints and detailed documentation; create bills of materials, including required cloud services (such as EC2, S3, etc.) and tools.
  • Skilled at designing the overall Virtual Private Cloud (VPC) environment, including server instances, subnets, availability zones, etc.
  • Configured Elastic Load Balancers (ELB) for high availability across subnets in multiple availability zones, and configured security settings and application health checks.
  • Configured and managed AWS Glacier to move old data to archives based on the retention policy of databases/applications (AWS Glacier Vaults).
  • Created customized AMIs from existing AWS EC2 instances using the create-image functionality and used these snapshots for disaster recovery. Created AWS launch configurations from the customized AMIs and used them to configure Auto Scaling groups.
  • Configured auto scaling in a customized VPC based on Elastic Load Balancer (ELB) traffic, using ELB health checks to trigger scaling actions.
  • Configured auto scaling policies to scale EC2 instances up/down based on ELB health checks, and created alarms to be used in auto scaling decision-making policies.
  • Used Amazon Elastic Beanstalk for deployments to the production environment.
  • Simulated failover by randomly deleting instances to test Auto Scaling Groups.
  • Ability to design the AWS network architecture including VPN connectivity between regions and subnets and design the HA / DR strategies.
  • Oversee the build of the environment and execute the migration plan.
  • Leverage appropriate AWS services and validate that the environment meets all security and compliance controls (created users, roles, groups and policies using IAM).
  • Designed, developed and deployed AWS services as Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).
  • Experience implementing versioning and lifecycle management on S3 content (see the second sketch after this list).
  • Experience creating alarms for billing activity and installing the monitoring scripts provided by Amazon for OS-level metrics.
  • Drive the technical design of AWS solutions by working with customers to understand their needs.
  • Strong expertise working with large scale and midsized Data Warehouse implementations with emphasis on Teradata development and administration.
  • Good exposure to the ETL tools DataStage 7.5.3, Informatica Power Center, Business Objects and Oracle on UNIX and Windows platforms.
  • Expert DBA skills in Teradata DBMS administration, development and production DBA support; use of Teradata Viewpoint, Stats Manager, FASTLOAD, MULTILOAD, TSET, TPUMP, SQL, PDCR, ARCMAIN, and TASM for workload management.
  • Extensive experience in Stress Testing of BO reports as part of benchmarking activities.
  • Implemented various Teradata-recommended best practices while defining profiles, roles, alerts, Multi-Value Compression, BLC, TARA GUI, Teradata Data Mover, and NetBackup.
  • Created and implemented BAR strategy for the TD application and system databases.
  • Experience in creating Logical and Physical data models for new/existing applications.
  • Good experience with the SUSE 11/Linux operating system and command set, as well as competence in shell scripting.
  • Experience in designing and developing Slowly Changing Dimensions (SCD2).
  • Experience in addressing the Data Quality issues in the data warehouse.
  • Complete knowledge of full life cycle design and development for building a data warehouse.
  • Responsible for code migrations from Dev to Test and then to Production, and for scheduling jobs using Control-M.
  • E-award recipient for the successful, high-quality delivery of the project.
  • Experience using the version management tools SVN and Git and the defect management tool Quality Centre (QC).
  • Extensive experience with offshore and on-site coordination.
  • Strong analytical, logical and problem-solving skills, with the ability to quickly adapt to new technologies through self-learning.
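
The CloudFormation/VPC bullets above describe carving a VPC into public and private subnets with NAT. Below is a minimal, hedged sketch of the equivalent API calls in Python with boto3 (the AWS SDK for Python); the work itself was done through CloudFormation templates, and every CIDR, AZ name and region below is an illustrative placeholder:

```python
# Hedged sketch: carve a VPC into one public and one private subnet and
# give the private subnet outbound access through a NAT gateway.
# All CIDRs, AZ names and the region are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]

public_subnet = ec2.create_subnet(
    VpcId=vpc_id, CidrBlock="10.0.1.0/24", AvailabilityZone="us-east-1a"
)["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(
    VpcId=vpc_id, CidrBlock="10.0.2.0/24", AvailabilityZone="us-east-1b"
)["Subnet"]["SubnetId"]

# Internet gateway for the public subnet.
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

# NAT gateway (lives in the public subnet) for private-subnet egress.
# In real use, wait for the NAT gateway to become "available" before routing.
eip = ec2.allocate_address(Domain="vpc")["AllocationId"]
nat_id = ec2.create_nat_gateway(
    SubnetId=public_subnet, AllocationId=eip
)["NatGateway"]["NatGatewayId"]

# Route tables: public subnet routes to the IGW, private to the NAT.
public_rt = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=public_rt,
                 DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=public_rt, SubnetId=public_subnet)

private_rt = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=private_rt,
                 DestinationCidrBlock="0.0.0.0/0", NatGatewayId=nat_id)
ec2.associate_route_table(RouteTableId=private_rt, SubnetId=private_subnet)
```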

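For the S3 versioning and lifecycle bullet, a second hedged boto3 sketch: enable versioning and transition objects to Glacier after a retention window. The bucket name, prefix and day counts are placeholders, not values from any actual engagement:

```python
# Hedged sketch: enable versioning on a bucket and transition objects
# to Glacier after a retention window. Names and day counts are
# illustrative placeholders.
import boto3

s3 = boto3.client("s3")
bucket = "example-archive-bucket"  # hypothetical bucket name

# Enable versioning so overwritten or deleted objects stay recoverable.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# Move objects under the prefix to Glacier after 90 days and expire
# them after roughly 7 years (2555 days).
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-data",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 2555},
            }
        ]
    },
)
```
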
TECHNICAL PROFICIENCY:

Platform: AWS

Cloud Service Models: IaaS (e.g., Amazon EC2), PaaS and SaaS

Cloud Technologies: Elastic Block Store (EBS), Elastic Compute Cloud (EC2), EFS, VPC, IAM, Auto Scaling, Route 53, Glacier, Database Migration Service, Redshift, Relational Database Service (RDS), SQS, SNS, S3, CodeDeploy, CloudWatch, Elastic Beanstalk, Direct Connect, Lambda, Snowball - VM Import/Export, Elastic Load Balancing, CloudFormation.

No-SQL: DynamoDB

RDBMS: Teradata 12, 13, 13.10, 14, 14.10, 15.10, Oracle 11g, PostgreSQL, Redshift.

Teradata Utilities: BTEQ, FastLoad (FLOAD), MultiLoad (MLOAD), FastExport and TPT.

Teradata Tools: SQL Assistant, Teradata Administrator, Teradata Viewpoint, DBQL, TSET, Visual Explain, Workload Manager, Teradata Priority Scheduler, Teradata PMON, Teradata Manager & Administrator, PDCR, SWS, TVI, SLES.

Mainframes Utilities: JCL, Control cards.

Backup Technologies: NetVault/NetBackup GUI tools, BAR scripts, Data Mover, TARA.

Workload Management: TASM.

ETL: DataStage, Informatica Power Center.

Operating Systems: UNIX, Windows XP, Mainframes.

Defect Management Tool: HP Quality Centre (QC) 10.x/9.x.

Version Management: TortoiseSVN, Git and GitHub.

Configuration Management: Ansible 2.2

Orchestration: Terraform.

Other Tools: AWS CLI, WinSCP, Putty, Oracle SQL Developer, pgAdmin 4.

Programming Languages: Python.

IDEs: Spyder, PyCharm

Software Methodology: Waterfall, Agile

Scheduling Tools: Control-M, Autosys

ITSM Tool: ServiceNow

PROFESSIONAL EXPERIENCE:

Confidential, Reston, VA

AWS Solution Architect with DevOps

Responsibilities:

  • Architected, designed and developed a fault-tolerant solution for a data analytics application.
  • Created the prerequisites document, requirement specification document, analysis and high-level design document, and detailed design document.
  • Presented the plan for migrating the Pyramid Analytics application to the cloud.
  • Used VPC and ELB to balance traffic load across multiple EC2 instances deployed (via Ansible automation) across multiple AZs.
  • Created and managed security groups for inbound and outbound traffic for instances in both public and private subnets (see the sketch after this list).
  • Updated and managed the PostgreSQL database hosted in RDS using pgAdmin 4.
  • Used S3 buckets to serve the failover webpage.
  • Created and worked in Dev, QA, UAT and Prod environments.
  • Worked with the configuration management and automation tool Ansible, including integrating Ansible YAML scripts.
  • Created Ansible playbooks to automatically install packages from a repository, to change the configuration of remotely configured machines and to deploy new builds.
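
A hedged boto3 sketch of the security-group layout described above: a web tier in the public subnet open on 443, and a private app tier reachable only from the web tier. The VPC ID, group names and ports are illustrative assumptions:

```python
# Hedged sketch: public web tier open on HTTPS, private app tier that
# only accepts traffic from the web tier's security group.
# The VPC ID, names and ports are placeholders.
import boto3

ec2 = boto3.client("ec2")
vpc_id = "vpc-0123456789abcdef0"  # hypothetical VPC ID

web_sg = ec2.create_security_group(
    GroupName="web-tier-sg",
    Description="Public subnet web tier",
    VpcId=vpc_id,
)["GroupId"]

app_sg = ec2.create_security_group(
    GroupName="app-tier-sg",
    Description="Private subnet app tier",
    VpcId=vpc_id,
)["GroupId"]

# Web tier: HTTPS from anywhere.
ec2.authorize_security_group_ingress(
    GroupId=web_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# App tier: application port reachable only from the web tier SG.
ec2.authorize_security_group_ingress(
    GroupId=app_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 8080, "ToPort": 8080,
        "UserIdGroupPairs": [{"GroupId": web_sg}],
    }],
)
```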

Environment & Software: EC2, S3, ELB, VPC, IGW, RDS, Ansible, Terraform, PostgreSQL, pgAdmin 4, JIRA. Operating System: Windows Server 2008, Linux Ubuntu 16.04.

Confidential, Teaneck, NJ

Cloud Architect - AWS - Teradata Data Warehouse

Responsibilities:

  • As a member of the DBA core team, understood and resolved end-to-end issues raised by developers, testers and business users across the data warehouse ecosystem.
  • Managed dependencies on environments such as Development, Test, SIT, UAT, DR and Production across different teams to avoid conflicts.
  • Involved in the implementation of TASM (workload management) on the system.
  • Worked on change request implementations: reviewed scripts, implementation/rollback plans, potential impacts and test results from lower environments, then deployed the changes to production.
  • Backed up data via archive and restore through TARA for different database containers, per project/application needs.
  • Extensively worked on analyzing and collecting performance metrics data from the system as part of the Business Objects upgrade project.
  • Raised incidents, prioritized appropriately, with Teradata customer support through TAYS for any issues requiring vendor assistance.
  • As part of the Cloud Strategy team, identified various parameters and built a cloud framework from a feasibility analysis.
  • Prepared a scoring matrix after detailed analysis of the application landscape with respect to the cloud, scoring each of the identified categories.
  • Hands-on experience working with Redshift to migrate a few applications from the Teradata warehouse to the cloud.
  • Experience using the Schema Conversion Tool (SCT) to convert schemas from Teradata to Redshift.
  • Successfully migrated data and schemas (DDLs, views and stored procedures) of data warehouse applications such as FIS, Medea, Customer MDM and Product MDM to Redshift from Teradata.
  • As a Cloud Architect, gained significant infrastructure management experience running operations in pre-production and production environments, including incident management, problem management, change management, and build and deployment of the production environment.
  • Experience designing proper traffic routing into and out of our AWS Virtual Private Cloud (VPC).
  • Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack, including EC2, Route53, S3, RDS, SNS and SQS, focusing on high availability, fault tolerance and auto scaling in AWS CloudFormation.
  • Built servers using AWS: imported EBS volumes; launched EC2 instances; created security groups, Auto Scaling groups, load balancers, Route 53 records, SES and SNS in the defined virtual private cloud.
  • Used Amazon IAM service to grant permission to assigned users and manage their roles.
  • Implemented and maintained monitors, alarms and notifications for EC2 instances using CloudWatch and SNS (see the sketch after this list).
  • Archived outdated data to Glacier through Life Cycle Policy configuration.
  • Involved in daily standup meetings and weekly CAB calls for the production implementation.
  • Created weekly, monthly and quarterly Teradata reports on various system parameters and presented them to the client as part of monthly KPIs.
  • As part of process improvement, developed a stored procedure to retrieve the password from the user creation logs to be used by BI admin and ETL admin teams, thereby reducing the number of tickets hitting the DBA queue.
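
A minimal boto3 sketch of the CloudWatch/SNS monitoring bullet above: alarm on sustained EC2 CPU and notify an SNS topic. The topic name, email address, instance ID and threshold are illustrative placeholders:

```python
# Hedged sketch: alert administrators via SNS when an EC2 instance's
# average CPU stays above 80% for two 5-minute periods.
# Topic, address, instance ID and threshold are placeholders.
import boto3

sns = boto3.client("sns")
cloudwatch = boto3.client("cloudwatch")

topic_arn = sns.create_topic(Name="ops-alerts")["TopicArn"]
sns.subscribe(TopicArn=topic_arn, Protocol="email",
              Endpoint="oncall@example.com")  # hypothetical address

cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                  # 5-minute datapoints
    EvaluationPeriods=2,         # two consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[topic_arn],    # notify the team via SNS
)
```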

Environment: Elastic Compute Cloud (EC2), Elastic Load Balancers, S3, Elastic Beanstalk, CloudFront, RDS, Route53, CloudWatch, CloudFormation, IAM, Redshift, VPC, Auto Scaling, AWS CLI, AWS SCT (Schema Conversion Tool). Data Warehouse Ecosystem: Teradata 15.10, TASM, Teradata Viewpoint, Teradata SQL Assistant, FLOAD, BTEQ, ServiceNow, TARA, BO 4.1/4.2, FastExport, Putty, Informatica Power Center 9.6, Oracle 11g, WinSCP, Autosys. Operating System: Windows 10, SLES 11.

Confidential, Atlanta, GA

Cloud Engineer

Responsibilities:

  • Created multiple VPCs with public and private subnets per requirements and distributed them as groups across the VPC's availability zones.
  • Implemented NAT gateways and NAT instances to allow private instances to reach the internet, with bastion hosts for administrative access.
  • Used security groups, network ACLs, internet gateways and route tables to ensure a secure zone for the organization in the AWS public cloud.
  • Created and configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and to provide a cost-efficient, fault-tolerant and highly available environment (see the sketch after this list).
  • Installed applications on AWS EC2 instances and configured the storage on S3 buckets.
  • Created S3 buckets, managed policies for them using IAM roles, and used Glacier for storing historical data.
  • Configured and managed AWS Glacier to move old data to archives based on the retention policy of databases/applications.
  • Developed and maintained CloudFormation scripts to automate the provisioning of AWS resources (IAM, EC2, S3, SNS, RDS, ELB, and Auto Scaling).
  • Migrated the on-premises MySQL database to AWS RDS with a Multi-AZ deployment.
  • Implemented and maintained monitoring and alerting for production and corporate servers/storage using AWS CloudWatch.
  • Built servers using AWS: imported volumes, launched EC2 and RDS instances, and created security groups, Auto Scaling groups and load balancers (ELBs) in the defined virtual private cloud.
  • Implemented a notification service via email and text messages using CloudWatch and Simple Notification Service to alert administrators of special events.
  • Implemented disaster recovery by creating and migrating server images to other AWS regions in case of local regional disasters.
  • Implemented the AWS CloudFront content delivery network to cut down website latency for viewers and to offer high availability.
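
A hedged boto3 sketch of the Auto Scaling setup described in this list: a group built from a launch configuration (consistent with the launch configurations named in the summary), attached to a classic ELB, with a target-tracking scaling policy as one common variant of the health-check-driven policies described. All names, AMI and subnet IDs are placeholders:

```python
# Hedged sketch: launch configuration from a customized AMI, an Auto
# Scaling group spanning two subnets/AZs behind a classic ELB, and a
# target-tracking policy holding average CPU near 60%.
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-lc",
    ImageId="ami-0123456789abcdef0",   # customized AMI snapshot (placeholder)
    InstanceType="t3.medium",
    SecurityGroups=["sg-0123456789abcdef0"],
)

autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchConfigurationName="web-lc",
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-aaaa,subnet-bbbb",  # two AZs (placeholders)
    LoadBalancerNames=["web-elb"],    # hypothetical classic ELB
    HealthCheckType="ELB",            # use load balancer health checks
    HealthCheckGracePeriod=300,
)

# Scale out/in automatically to keep average CPU near the target.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="cpu-target-60",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,
    },
)
```
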
Environment & Software: LAMP stack, IAM, S3, EC2, EBS, ELB, CloudWatch, SNS, Auto Scaling, CloudFormation, CodeDeploy, CodePipeline, RDS, Jenkins, GitHub, Apache Tomcat, Agile. Operating System: CentOS Linux

Confidential

Senior Consultant

Responsibilities:

  • Involved in the complete re-design of the process flow, starting with design considerations, technical specification and source to target mapping.
  • Responsible for resolving customer support issues on a 24x7 on-call basis.
  • Involved in Teradata DBA activities in test environments, such as creating users; allocating spool, temporary and permanent space; and checking table skew and compression.
  • Performance Tuning, Query optimization (Explain plans, Collect statistics, Primary and Secondary indexes).
  • Performed application-level DBA activities: created tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Performed Space Management for Perm & Spool Space.
  • Created users with different profiles and roles through stored procedures to control access to the system.
  • Supported SQL generated by Business Objects reports.
  • Handled NetVault backup and recovery tasks, tape management, and drive issues.
  • Used Teradata Manager & Administrator to monitor and manage Teradata databases.
  • Involved in various operational Teradata DBA activities on a daily basis.
  • Used Teradata 13 Viewpoint to monitor system performance under load.
  • Worked on loading data from several flat-file sources to staging using Teradata TPUMP, MLOAD, FLOAD and BTEQ in the Mortgage and Statutory applications.
  • Interacted with Clients, Data Modeler and Data Analyst to develop the Logical & Physical Design for the new tables.
  • Worked with the TSET tool to capture the explain log and send it to the Teradata team for analysis.
  • Automated several routine DBA tasks such as checking space consumption and data skew and dropping orphan tables (see the sketch after this list).
  • Experienced with backup and recovery using Teradata ARCMAIN in NetVault.
  • Built tables, views, and UPI, NUPI, USI, NUSI and PPI indexes.
  • Handled Workload monitoring using TASM.
  • Reviewed and suggested changes to SQL code from a performance standpoint before deployment to production.
  • Solved tickets assigned to the DBA queue within the stated SLAs.
  • Provided deployment support to the ETL teams during non-business hours on a rotation basis.
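
A hedged Python sketch of the space-consumption check automated above, using the open-source teradatasql driver (an assumption; the original routines may well have been BTEQ scripts). The host, credentials and 85% threshold are placeholders:

```python
# Hedged sketch: flag databases approaching their perm-space limit by
# querying DBC.DiskSpaceV. Connection details and the threshold are
# placeholders; the teradatasql driver is an assumed substitute for BTEQ.
import teradatasql

QUERY = """
SELECT DatabaseName,
       SUM(CurrentPerm) AS current_perm,
       SUM(MaxPerm)     AS max_perm,
       100 * CAST(SUM(CurrentPerm) AS FLOAT) / NULLIFZERO(SUM(MaxPerm)) AS pct_used
FROM DBC.DiskSpaceV
GROUP BY DatabaseName
HAVING pct_used > 85
ORDER BY pct_used DESC
"""

with teradatasql.connect(host="tdprod", user="dba_user",
                         password="***") as con:
    cur = con.cursor()
    cur.execute(QUERY)
    for name, used, cap, pct in cur.fetchall():
        # Databases close to their perm-space limit need follow-up.
        print(f"{name.strip()}: {pct:.1f}% of perm space used")
```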

Environment & Software: Teradata 13, 13.10, UNIX, Viewpoint, Teradata Administrator & Manager, PDCR DATA/INFO, DBQL, NetVault (ARCMAIN backup/restore, TARA GUI, Data Mover), Teradata ETL utilities (MultiLoad, FastLoad, PDCRDATA, SQL Assistant, PMON, TSET, TASM, Visual Explain, FastExport, BTEQ, TPump), stored procedures, Teradata SQL Assistant, Git, Tivoli. Operating System: Windows XP, Linux.

Confidential

Teradata DBA

Responsibilities:

  • Used Teradata Manager to query Teradata RDBMS status and CPU utilization in reports and graphs.
  • Maintained and tuned Teradata production and development systems based on workload parameters.
  • Worked extensively with the Teradata utilities BTEQ, MultiLoad, FastLoad, TPUMP and FastExport.
  • Performance optimization: measured and monitored the performance of the 16-node Teradata system running the new Teradata file system with the embedded WAL feature.
  • Designed and implemented the physical data model; received the physical data model DDL from Erwin.
  • Loaded patient-sensitive data into a separate data mart through SQL Server Integration Services.
  • Measured performance of the systems under different configurations, such as changing the number of AMPs per node, the number of clusters, the number of cliques, and the number of nodes in a clique.
  • Capacity planning and proactive monitoring to meet performance and growth requirements.
  • Performance tuning and administration activities on Teradata.
  • Used Teradata Viewpoint to monitor system performance under load.
  • Recovered databases/tables after disk drive failures, exercising most of the recovery scenarios to restore data on the Teradata system.
  • Experienced with backup and recovery using Teradata ARCMAIN.
  • Upgraded Teradata versions and worked on patch upgrades of the Teradata systems.
  • Extensively used the Teradata Priority Scheduler to control system load.
  • Created users with different profiles and roles to control access to the system.
  • Experienced with Teradata Manager, which is used to create alerts and monitor the system.
  • Familiar with Teradata Database Query Manager and with using DBS Control flags.
  • Maintained Teradata systems across UNIX, Linux and Windows platforms.

Environment & Software: Teradata 13.10, BTEQ, Teradata SQL Assistant, SSIS 10.0, JCL and Control cards, Endevor. Operating System: Windows XP and Mainframe.

Confidential

Teradata ETL Developer

Responsibilities:

  • Understood the business requirements and scope from the SRS and HLD documents.
  • Developed SK generation, CDC and loading scripts using Teradata utilities, per business requirements, to populate data into the warehouse.
  • Extensive experience developing retrofit scripts to correct data quality issues such as duplicates, PK violations, date overlaps, date gaps and RI issues in the EDW.
  • Successfully replaced DataStage CDC and ADS load jobs that were failing frequently in production with Teradata BTEQ scripts.
  • Debugged and fixed errors in the System and CAT testing environments to ensure defect-free code, and followed up closely with the testing team until defects were closed in QC.
  • Modified, tested and troubleshot BTEQ scripts, adding new functionality to accommodate changes from the source system, and deployed them back to production.
  • Analyzed defects and found root causes to avoid recurrence.
  • Analyzed ad hoc production issues (PKEs) and developed solutions for them using Teradata SQL and utilities.
  • Developed, reviewed and executed SQL queries to validate transformation rules and to validate data in the production environment after script deployments.
  • Created and modified business views in the semantic layer, used by reports, per business requirements.
  • Analyzed long-running queries in the production system and developed plans for tuning them (explain plans, collecting statistics, PI and SI).
  • Worked extensively with Teradata SQL Assistant and BTEQ to interface with Teradata.
  • Experience working with huge volumes of data in the EDW.
  • Automated an operational check through a BTEQ script that emails all subject-area leads if duplicate rows are loaded to tables because a job ran twice in a batch load (see the sketch after this list).
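
A hedged Python sketch of that duplicate-load alert, using the teradatasql driver and smtplib (assumptions; the original was a BTEQ script). The table, key column, addresses and mail host are hypothetical:

```python
# Hedged sketch: find primary-key values loaded more than once and mail
# the subject-area leads if any exist. All names are hypothetical.
import smtplib
from email.message import EmailMessage

import teradatasql

DUP_QUERY = """
SELECT order_id, COUNT(*) AS copies
FROM edw.sales_fact
GROUP BY order_id
HAVING COUNT(*) > 1
"""

with teradatasql.connect(host="tdprod", user="etl_batch",
                         password="***") as con:
    cur = con.cursor()
    cur.execute(DUP_QUERY)
    dups = cur.fetchall()

if dups:
    msg = EmailMessage()
    msg["Subject"] = f"Duplicate load detected: {len(dups)} keys affected"
    msg["From"] = "etl-batch@example.com"
    msg["To"] = "subject-area-leads@example.com"
    msg.set_content("\n".join(f"{key}: {n} copies" for key, n in dups))
    with smtplib.SMTP("mailhost.example.com") as smtp:
        smtp.send_message(msg)
```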

Environment & Software: Teradata 12/13.10, BTEQ, FastLoad, MultiLoad, Teradata SQL Assistant, DataStage 7.5.3, Control-M, SVN, Putty, HP Quality Centre (QC).
