AWS Solution Architect Associate Resume
San Antonio, Texas
PROFESSIONAL SUMMARY:
- 12+ years of experience in the IT industry, including 4+ years of experience in cloud computing.
- Designed architectural diagrams for applications before migrating them to the Amazon cloud, targeting flexible, cost-effective, reliable, scalable, high-performance, and secure solutions.
- Experience in architecture, design, and implementation of on-premises, off-premises, and hybrid cloud solutions utilizing AWS.
- Good experience in conducting proofs of concept and developing prototypes and reference models.
- AWS (Amazon Web Services): EC2, VPC, IAM, S3, CloudFront, CloudWatch, CloudFormation, Glacier, RDS, Config, Route 53, SNS, SQS, Kinesis, API Gateway, ElastiCache, Redshift, DynamoDB, Auto Scaling, Aurora, Lambda, Direct Connect, EMR, Elastic Beanstalk, Elastic File System (EFS).
- Utilized CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS database services, and EBS volumes; to set alarms for notifications or automated actions; and to monitor logs for better understanding and operation of the system.
- Deep understanding of system development in cloud environments, including Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS).
- Expertise in RDBMS (MySQL, AWS Aurora, Oracle, SQL Server, and PostgreSQL) and NoSQL (AWS DynamoDB, MongoDB, and Cassandra) data stores.
- Experienced with installing the AWS CLI to control various AWS services through shell/Bash scripting.
- Experience in implementing a CI/CD pipeline.
- Experience in designing and architecting serverless computing solutions implemented with AWS Lambda.
- Working knowledge of Python for automating software configuration.
- Experienced in creating multiple VPCs with public and private subnets as required, distributing them across the various Availability Zones of each VPC.
- Created and configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and provide a cost-efficient, fault-tolerant, and highly available environment.
- Created S3 buckets in the AWS environment to store files and, where required, to serve static content for web applications.
- Configured S3 buckets with lifecycle policies to archive infrequently accessed data to lower-cost storage classes as required (see the sketch after this list).
- Used IAM to create roles, users, and groups, providing additional security for the AWS account and its resources.
- Experienced in creating RDS instances to serve application data in response to requests.
- Provided support for Java applications by collaborating with the Java development team using Agile methodology.
- Assisted in designing, automating, implementing, and sustaining Amazon Machine Images (AMIs) across the AWS cloud environment.
- Experience with AWS API Gateway, REST APIs, and XML/SOAP.
- Created and modified CloudFormation templates to create and upgrade EC2 instances to support specific needs.
- Working experience with Windows Active Directory and LDAP.
- Experience with file, block, and object storage.
- Knowledge of tools such as Git, Jira, Maven, Gradle, Jenkins, Bamboo, Chef, Puppet, Ansible, SaltStack, Nagios, and Splunk.
- Experience in container-based solutions and their orchestration with OpenShift, Kubernetes, or Docker Swarm, and with Docker Datacenter components.
- Troubleshooting skills with network protocols such as DNS, TCP/IP, SMTP, and SNMP.
- Expertise in troubleshooting and resolving issues and in disaster recovery.
- Good knowledge of Big Data.
- Hands-on experience in analyzing workloads and in debugging and tuning performance at the system level.
- Used Agile methodologies like Scrum and Kanban for the development process.
- Committed to excellence; a self-motivated, far-sighted team player with strong problem-solving skills and a zeal to learn new technologies.
- Strengths include being a good team player; excellent communication, interpersonal, and analytical skills; and the ability to work effectively in a fast-paced, high-volume, deadline-driven environment.
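A minimal boto3 sketch of the kind of S3 lifecycle configuration described above; the bucket name, prefix, and transition windows are hypothetical placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Move objects under logs/ to Standard-IA after 30 days and to Glacier
# after 90 days, then expire them after a year.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-app-assets",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-infrequent-access",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```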
TECHNICAL SKILLS:
Cloud: AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudWatch, CloudFormation, Auto Scaling, Lambda, Redshift, Aurora, DynamoDB)
DWH Technologies: IBM WebSphere DataStage 7.0/7.5/8.0/8.1/8.5/8.7/9.1, Informatica
Languages: Java, C++, C, PL/SQL, JSON, Python
Databases: Oracle 11g/10g/9i/8i, DB2, MySQL, Teradata, Netezza
IDE/Tools: Eclipse, IBM Rational Software Architect 7.5
OLAP Tools: Business Objects, Crystal Reports, Cognos, Lumira
Operating Systems: UNIX, Linux and Windows
Version Control Tools: Borland StarTeam, IBM Rational Team Concert (RTC)
Project Methodologies: Agile, Waterfall
PROFESSIONAL EXPERIENCE:
Confidential, San Antonio, Texas
AWS Solution Architect Associate
Responsibilities:
- Responsible for architecting, designing, implementing, and supporting cloud-based infrastructure and solutions.
- Proficient in AWS services such as VPC, EC2, S3, ELB, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.
- Experienced in creating multiple VPCs with public and private subnets as required, distributing them across the various Availability Zones of each VPC.
- Created NAT gateways and NAT instances to allow outbound internet access from instances in private subnets, with bastion hosts for administrative access.
- Involved in writing Java-based AWS Lambda functions to manage some of the AWS services.
- Used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in the AWS public cloud.
- Created and configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and provide a cost-efficient, fault-tolerant, and highly available environment.
- Created S3 buckets in the AWS environment to store files and, where required, to serve static content for web applications.
- Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed in Java.
- Configured S3 buckets with lifecycle policies to archive infrequently accessed data to lower-cost storage classes as required.
- Possess good knowledge of creating and launching EC2 instances using AMIs for Linux, RHEL, and Windows.
- Implemented Domain Name System (DNS) services through Route 53 for highly available and scalable applications.
- Maintained the monitoring and alerting of production and corporate servers using the CloudWatch service.
- Created EBS volumes to store application files, mounting them to EC2 instances as needed.
- Experienced in creating RDS instances to serve application data in response to requests.
- Created snapshots to back up volumes and images (AMIs) to preserve launch configurations of EC2 instances.
- Designed and implemented the continuous integration (CI) system: configured Jenkins servers and nodes, created required scripts, and created/configured VMs (Windows/Linux).
- Focused on continuous integration and deployment (CI/CD), promoting enterprise solutions to target environments.
- Experience with CI/CD pipelines, with a strong background in build and release management and cloud implementation suited to the needs of an environment operating under a DevOps culture.
- Implemented and maintained the monitoring and alerting of production and corporate servers/storage using CloudWatch (see the sketch below).
Environment: AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudWatch, CloudFormation, AWS Auto Scaling, Lambda), SQL, Unix/Linux, shell scripting, Git, Jenkins, CI/CD pipeline
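A minimal boto3 sketch of the CloudWatch alerting described above; the alarm name, instance ID, and SNS topic ARN are hypothetical placeholders:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Raise an alarm when average CPU on an instance stays above 80% for two
# consecutive 5-minute periods, and notify an SNS topic.
cloudwatch.put_metric_alarm(
    AlarmName="prod-web-high-cpu",  # hypothetical alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical topic
)
```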
Confidential, San Antonio, Texas
AWS Architect
Responsibilities:
- Performed configuration, deployment, and support of cloud services, including Amazon Web Services (AWS).
- Migrated the data center to Amazon Web Services (AWS) infrastructure and provided initial support to the application and database teams.
- Created AWS RDS database instances for Oracle and SQL Server, including AWS RDS Oracle database clusters.
- Configured AWS Identity and Access Management (IAM) groups and users for improved login authentication.
- Designed, configured, and managed public/private cloud infrastructure utilizing Amazon Web Services (AWS), including EC2, S3, CloudFront, Elastic File System, RDS, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, CloudFormation, IAM, and Elasticsearch, which enabled automated operations.
- Handled streaming and live content over Amazon CloudFront.
- Configured an AWS Virtual Private Cloud (VPC) and database subnet group to isolate resources within the Amazon RDS Oracle DB cluster.
- Performed database SQL queries to address connectivity and integration activities.
- Implemented high availability using AWS Elastic Load Balancing (ELB), balancing traffic across instances in multiple Availability Zones.
- Assigned AWS Elastic IP addresses to work around host or Availability Zone failures by quickly remapping an address to another running instance or a freshly launched replacement (see the sketch below).
- Configured and managed AWS Glacier to move old data to archives (AWS Glacier vaults) based on the retention policies of databases and applications.
- Deployed and configured Git repositories with branching, forks, tagging, and notifications.
- Experienced and proficient in deploying and administering GitHub and GitLab for CI.
- Implemented a continuous delivery (CD) pipeline with Docker, Jenkins, GitHub, and AWS AMIs.
- Developed proof-of-concept implementations of distributed frameworks using Docker.
- Deployed builds to production and worked with teams to identify and troubleshoot issues.
Environment: AWS, EC2, shell scripting, Glacier, VPC, Windows Server 2012, SQL, Git, Jenkins, Docker
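A minimal boto3 sketch of the Elastic IP remapping described above; the allocation ID and instance ID are hypothetical placeholders:

```python
import boto3

ec2 = boto3.client("ec2")

# Remap an existing Elastic IP to a healthy replacement instance after a
# host or Availability Zone failure.
ec2.associate_address(
    AllocationId="eipalloc-0a1b2c3d4e5f67890",  # hypothetical Elastic IP allocation
    InstanceId="i-0fedcba9876543210",           # hypothetical replacement instance
    AllowReassociation=True,  # detach from the failed instance if still associated
)
```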
Confidential
Lead DataStage Developer
Responsibilities:
- Analyzed data sources such as TeamMate, BWise, MS Access, MS Excel, MS Word, and PDF files.
- Prepared estimates, delivery timelines, and the project execution plan.
- Developed jobs in DataStage using stages such as Transformer, Aggregator, Lookup, Join, Merge, Modify, Remove Duplicates, Sort, Peek, Change Capture, Filter, Copy, Sequential File, and Data Set.
- Used DataStage to split data into subsets and load it, utilizing the available processors to achieve job performance and managing the configuration of system resources in the Orchestrate environment.
- This project involved knowledge of the TeamMate tool along with batch scripts, UNIX scripts, PL/SQL, and solid ETL jobs.
- Worked on rollout plans and was actively involved in code migrations across environments.
- Used DataStage Administrator to manage job locks and perform other administration activities for the DataStage server.
- Took on the lead role onsite and delegated work to the offshore team.
- Worked on the Salesforce UI design approach for the manual-entry Excel sheets.
- Used Agile methodologies (Scrum) for software development. Involved in daily status meetings and team code reviews.
- Involved in writing stories for the scope items.
- Involved in loading data from the Linux file system into HDFS (see the sketch below).
Environment: Microsoft SQL Server, UNIX shell scripting, UNIX, IBM WebSphere DataStage 8.7/9.1, DB2, WinSCP, SQuirreL SQL, Control-M, AutoSys, RTC.
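A minimal Python sketch of the Linux-to-HDFS load described above, shelling out to the standard `hdfs dfs` commands; the paths are hypothetical placeholders:

```python
import subprocess

def load_to_hdfs(local_path: str, hdfs_dir: str) -> None:
    """Copy a file from the Linux file system into HDFS, overwriting any
    existing copy in the target directory."""
    subprocess.run(["hdfs", "dfs", "-mkdir", "-p", hdfs_dir], check=True)
    subprocess.run(["hdfs", "dfs", "-put", "-f", local_path, hdfs_dir], check=True)

# Hypothetical local extract and HDFS staging directory.
load_to_hdfs("/data/extracts/audit_2014.csv", "/warehouse/staging/audit")
```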
Confidential
Senior DataStage Developer
Responsibilities:
- Analyzed and understood the existing risk-analysis rules and developed an ETL strategy to reduce false positives.
- Prepared estimates, delivery timelines, and the project execution plan.
- Prepared source-to-target (S2T) mappings, A&D documents, and design documents.
- Extensively involved in analysis, design, and modeling.
- Analysis of present schema and data model.
- Designed the ETL process by gathering the requirements from both divisions and the users.
- Participated in the preparation of high-level design (HLD) and low-level design (LLD) documents.
- Used DataStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating, and loading data into different data marts.
- Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the data marts.
- Developed various jobs using ODBC, Lookup, Join, Dataset, Aggregator, Sequential File, and Transformer stages, among others.
- Used shared containers to create reusable components for local and shared use in the ETL process.
- Tuned DataStage transformations and jobs to enhance their performance.
- Troubleshot jobs using the debugging tool.
- Wrote test cases and test scripts for testing the jobs (see the sketch below).
Environment: Oracle 10g, Perl, UNIX shell scripting, IBM WebSphere DataStage 8.1, StarTeam, SQuirreL SQL, Teradata, PuTTY, Norkom.
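A minimal Python sketch of the kind of test script described above, reconciling row counts between a staging table and the target table loaded by an ETL job; it assumes an Oracle source reachable via cx_Oracle, and the connection details and table names are hypothetical placeholders:

```python
import cx_Oracle

# Hypothetical connection details.
conn = cx_Oracle.connect("etl_user", "secret", "dbhost:1521/ORCL")

def row_count(table: str) -> int:
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # table names are trusted test inputs
        return cur.fetchone()[0]

# Hypothetical staging and data-mart table names.
source = row_count("STG_TRANSACTIONS")
target = row_count("DM_TRANSACTIONS")
assert source == target, f"Row count mismatch: source={source}, target={target}"
```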
Confidential
DataStage Developer
Responsibilities:
- Developed star schema and snowflake schema designs.
- Analysis of present schema and data model.
- Identifying suitable dimensions and facts for schema.
- Used DataStage Manager to import metadata into the repository and to import and export jobs between projects.
- Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
- Used DataStage Director for scheduling, validating, running, and monitoring jobs.
- Used DataStage Administrator for creating projects and managing users.
- Used stages such as Sequential File, Hashed File, and ODBC, and was extensively involved in creating custom routines and transforms.
- Used MetaStage to synchronize and integrate metadata from warehouse-related tools, automatically gathering process data from operational systems: the history and lineage of data items, process history and analysis, how objects in the directory relate to one another, and impact analysis.
- Worked on programs for scheduling data loading and transformations using DataStage from legacy systems to Oracle.
- Involved in designing and developing catalogs for report generation from warehouse databases.
- Developed shell scripts to automate file manipulation and data loading procedures.
- Prepared estimates, delivery timelines, and the project execution plan.
Environment: DataStage 7.0/7.5, UNIX, Windows 2000/NT, IBM UDB DB2, StarTeam, shell and Perl scripting.