
AWS/DevOps Engineer Resume

Mason, OH

SUMMARY

  • Skilled IT professional with 7+ years of IT experience, seeking a position in Software Configuration Management, Version Control, Build and Release Management, Change Management, and Cloud Integration.
  • Comfortable working under deadline pressure, with strong analytical, time-management, collaboration, communication, and problem-solving skills.
  • Experience using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump in UNIX/Windows environments and to running batch processes for Teradata.
  • Strong data warehousing experience in application development and quality assurance testing using Informatica PowerCenter 9.1/8.6 (Designer, Workflow Manager, Workflow Monitor), PowerExchange, OLAP, and OLTP.
  • Experience creating complex Informatica mappings using Source Qualifier, Expression, Router, Aggregator, Lookup, Normalizer, and other transformations, and well versed in debugging Informatica mappings using the Debugger.
  • Efficient development skills in Teradata SQL objects: creating tables, stored procedures, functions, views, indexing, and performance tuning.
  • Experience designing and deploying reports for end-user requests using Cognos, Teradata, and Excel.
  • Experience in SDLC, build engineering, and release management processes, including end-to-end code configuration, building binaries and deployments, and the entire life-cycle model in enterprise applications.
  • Extensive experience with AWS (storage, application services, deployment, and management) and in managing servers on AWS platform instances using Puppet and Chef configuration management.
  • In-depth knowledge of DevOps management methodologies and production deployment Configurations.
  • Proficient in working with enterprise security teams, commercial and open-source security tools, and AWS security.
  • Knowledge of recommended best practices for building secure and reliable applications on the AWS platform; understanding of the basic architectural principles of building on the AWS Cloud; knowledge of the AWS global infrastructure; understanding of network technologies as they relate to AWS.
  • Hands-on experience in AWS provisioning and good knowledge of AWS services such as EC2, S3, Glacier, ELB, RDS, Redshift, IAM, Route 53, VPC, Auto Scaling, CloudFront, CloudWatch, CloudTrail, CloudFormation, and Security Groups, as well as Azure.
  • Strong knowledge of CM responsibilities in ALM, TFS, Azure DevOps.
  • Created tagging standards for proper identification and ownership of EC2 instances and other AWS resources.
  • Experience developing Big Data projects using Hadoop, HDFS, MapReduce, Hive, Sqoop, Hue, HBase, Pig, and Oozie; expertise in implementing Big Data Hadoop MapReduce jobs in Java.
  • Expertise in gathering, analyzing, and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Reports.
  • Strong data modeling experience in ODS and dimensional data modeling methodologies such as Star Schema and Snowflake Schema; design and development of OLAP models consisting of multi-dimensional cubes and drill-through functionality for data analysis.
  • Very good experience writing shell scripts (ksh).
  • Excellent web development skills: experience in N-tier client-server Internet technology, intranet portal design/development, web-based data reporting systems, and framework development for Internet applications.
  • Experience managing infrastructure resources in cloud architectures in close coordination with various functional teams.
  • Experience with Jenkins and Hudson as Continuous Integration/Continuous Deployment tools, with strong expertise in the Ant and Maven build frameworks.
  • Experienced in authoring pom.xml files, performing releases with the Maven release plug-in, and managing Maven repositories.
  • Worked with the Artifactory repository manager for Maven and Ant.
  • Expertise in maintaining data quality, data organization, metadata, and data profiling.
  • Experience in Business Analysis and Data Analysis, User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
  • Strong experience with the ETL tools Informatica 8.1/7.1/6.2, AB, and DataStage 8.0.1.
  • Proficient in coding optimized Teradata batch-processing scripts for data transformation, aggregation, and load using BTEQ.
  • Expertise in automating across infrastructure with Ansible: watching plays and tasks complete with each success or failure, and setting up different playbooks to run depending on the success or failure of a prior workflow playbook.
  • Experience with the configuration management automation tool Ansible; worked on integrating Ansible YAML scripts.
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts, and automated repetitive admin tasks.
  • Experienced in creating snapshots and Amazon Machine Images (AMIs) of instances for backup by creating clone instances, and created a Lambda function to automate snapshot backup on AWS with a scheduled trigger (a sketch appears after this list).
  • Used the Jenkins CodeDeploy plugin to deploy to AWS, built Jenkins jobs to create Azure infrastructure from GitHub repositories containing Terraform code, and worked closely with teams to ensure high-quality, timely delivery of builds and releases.
  • Involved in migrating an existing on-premises data center into the AWS cloud environment; utilized S3 buckets and Glacier for storage and backup on AWS.
  • Experienced with AWS Elastic Beanstalk for application deployments and worked on AWS Lambda with Amazon Kinesis.
  • Worked with bug tracking tools such as JIRA, Remedy, and Bugzilla.
  • Ability to quickly understand, learn and implement the new system design, data models in a professional work environment.
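
For illustration, a minimal boto3 sketch of the kind of Lambda function described above for automating snapshot backups. The Backup tag key, the schedule trigger, and all identifiers are assumptions for illustration, not details from the projects themselves.

    import boto3

    # Sketch: scheduled Lambda that snapshots EBS volumes opted in via a tag.
    # Assumes volumes carry the tag Backup=true and the function runs on an
    # EventBridge (CloudWatch Events) schedule.
    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
        )["Volumes"]
        for vol in volumes:
            snap = ec2.create_snapshot(
                VolumeId=vol["VolumeId"],
                Description="Automated backup of " + vol["VolumeId"],
            )
            # Tag each snapshot so a cleanup job can find and expire it later.
            ec2.create_tags(
                Resources=[snap["SnapshotId"]],
                Tags=[{"Key": "CreatedBy", "Value": "backup-lambda"}],
            )
        return {"snapshots_created": len(volumes)}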

TECHNICAL SKILLS

Programming Languages: Java, Python, C++, C, PowerShell, Bash

Data warehouse: AWS, Snowflake, Azure

ETL Development: Informatica, UNIX access, Citrix

Scripting Languages: HTML, CSS, JavaScript, Angular

Databases: MySQL, Oracle 10g, Microsoft Access, SQL, Teradata, IBM Data Studio (DB2)

DevOps Tools: Jira, Bitbucket, Git

Web Tools: PHP, XML, Spring Boot, HTML, SAML, Azure (ALM, TFS, CM)

Network Protocols: HTTP, HTTPS, SMTP, FTP, SFTP, DHCP, DNS, SNMP, UDP, Cisco Routers/Switches

Performance Monitoring and Analysis Tools: Wireshark

Operating Systems: Linux, Windows

Languages: C, Python, Java

Continuous Integration: Jenkins

Methodologies: Agile, Waterfall

Cloud Computing Models: SaaS, IaaS, PaaS

PROFESSIONAL EXPERIENCE

Confidential, Mason, OH

AWS/DevOps Engineer

Responsibilities:

  • Used Agile methodology throughout the project; involved in weekly and daily release management.
  • Involved in source control management with GitHub and GitLab Enterprise repositories; regular activities included configuring user access levels, monitoring logs, identifying merge conflicts, and managing the master repository.
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management.
  • Involved in performing application deployments to the AWS Elastic Beanstalk environment.
  • Configured and deployed Java applications on Amazon Web Services (AWS) for a multitude of applications utilizing the AWS stack, including CloudFormation.
  • Made SQL changes for the Provider Care Management Solutions (PCMS) application using Informatica ETL, including the Informatica 2.1-to-4.1 transfer.
  • Used Snowflake for database changes in the reporting area for the PCMS project.
  • Performed the ETL Teradata-to-Snowflake transition and checked job loads.
  • Used AWS to upload client data to the cloud platform, with ETL cross-testing using Teradata and DB2.
  • Performed end-to-end migration of company data to the reporting layer using Snowflake, with regression checks against DB2 and Teradata.
  • Performed Informatica client-side testing and development, transferred data from the old version to the new one, and monitored and designed workflows for monthly tasks.
  • Experience in microservice-based solution design covering the following architecture aspects: availability, cloud traffic management, maintainability, operability, scalability, and portability.
  • Maintained high-availability clustered and standalone server environments and refined automation components with scripting and configuration management (Ansible).
  • Managed users, roles, and groups using AWS Identity and Access Management (IAM).
  • Created scripts for REST APIs using Java microservices.
  • Used multiple DevOps tools (Git, Jenkins, Ansible, Maven, Terraform, Elasticsearch, Logstash, Kibana, Jira) in building the CI/CD pipeline.
  • Worked on OAuth for authorization between microservices developed.
  • Developed a broker interface to store and retrieve information.
  • Created load balancers, assigned roles in AWS Lambda to run scripts, and launched resources such as EC2 instances and S3 buckets using scripts (a provisioning sketch appears after this list).
  • Involved in installing baseline components on RHEL and Win32 platforms.
  • Provided web authentication solutions (using Single Sign-On, Multi-Factor, Auth0, and OpenID).
  • Developed applications using C/C++, shell scripts, and Java in RHEL/Win32 environments.
  • Secured cloud-based environments, particularly in AWS.
  • Built required services using Java Spring Boot microservices and cloud platforms.
  • Developed Java REST services in a Spring Boot microservice architecture.
  • Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
  • Created a Logical Data Flow Model from the source system study according to business requirements in MS Visio.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
  • Created UML diagrams, including Use Case Diagrams, Activity Diagrams/State Chart Diagrams, Sequence Diagrams, Collaboration Diagrams, Deployment Diagrams, Data Flow Diagrams (DFDs), ER Diagrams, and Web Page Mock-Ups, using SmartDraw, MS Visio, and Rational Rose.
  • Worked on Teradata stored procedures and functions to validate the data and load it into tables.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response times for users.
  • Worked closely with analysts to come up with detailed solution approach design documents.
  • Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
  • Prepared low-level technical design documents, participated in building/reviewing BTEQ, FastExport, MultiLoad, and FastLoad scripts, and reviewed unit test plans and system test cases.
  • Provided support during the system test, Product Integration Testing and UAT.
  • Verified that the implementation was done as expected, i.e., checked that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
  • Performed impact assessments covering schedule changes, dependency impact, and code changes for various change requests against existing data warehouse applications running in production.
  • Provided quick production fixes and proactively involved in fixing production support issues.
  • Strong knowledge of Data Mover for importing and exporting data.
  • Created and maintained source-target mapping documents for the ETL development team.
  • Provided requirement specifications and guided the ETL team in developing ETL jobs with the Informatica ETL tool.
  • Developed test cases and performed testing.
  • Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
  • Analyzed business requirements and designs and wrote technical specifications to design/redesign solutions.
  • Involved in complete software development life-cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.
  • Developed technical design documents (HLD and LLD) based on the functional requirements.
  • Coordinated with the configuration management team on code deployments.
  • Developed microservices coded in Java and Python (web apps and a few data-analytics services) and secured inter-microservice communication using certificates.
  • Developed new libraries with a microservices architecture using REST APIs, Spring Boot, and Cloud Foundry.
  • Migrated an existing application to a microservices architecture using REST APIs, Spring Boot, Spring Cloud, and AWS.
  • Designed services based on a microservice architecture, typically authenticating them with web authentication (OAuth, Security Token Service, or PKI-based authentication).
  • Designed API gateways to check authentication and forward requests to designated microservices.
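
For illustration, a minimal boto3 sketch of the scripted provisioning pattern mentioned above (launching EC2 and S3 resources from code). The AMI ID, instance type, tags, bucket name, and region are placeholders, not values from the project.

    import boto3

    # Sketch: launch a tagged EC2 instance and create an S3 bucket from a script.
    ec2 = boto3.resource("ec2")
    s3 = boto3.client("s3")

    def provision():
        instances = ec2.create_instances(
            ImageId="ami-00000000000000000",  # placeholder AMI ID
            InstanceType="t3.micro",
            MinCount=1,
            MaxCount=1,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "Name", "Value": "example-app"}],
            }],
        )
        # Bucket name is illustrative; S3 bucket names are globally unique.
        s3.create_bucket(
            Bucket="example-app-artifacts",
            CreateBucketConfiguration={"LocationConstraint": "us-east-2"},
        )
        return instances[0].id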

Environment: Teradata V2R5, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Teradata FSLDM, Quality Center, UNIX, Windows 2000, Shell scripts.

Confidential, Peoria, IL

AWS/DevOps Engineer

Responsibilities:

  • Gathered requirements from business users of various systems.
  • Communicated with business users to understand business requirements clearly.
  • Conducted meetings with data architects to understand source system elements.
  • Developed scripts to load data into the base tables in the EDW using Teradata's FastLoad, MultiLoad, and BTEQ utilities.
  • Implemented new projects' build frameworks using Jenkins and Maven as build tools.
  • Implemented a Continuous Delivery framework using Jenkins, Chef, and Maven in a Linux environment.
  • Wrote wrapper scripts to automate deploying cookbooks on nodes and running the Chef client on them in a Chef Solo environment.
  • Converted production support scripts to Chef recipes and tested cookbooks with ChefSpec.
  • Set up the Chef client-server model in the OCI development environment.
  • Worked on documentation: Chef basics, initial setup of Chef, data bags implementation, coding standards, cookbook documents, testing docs, and AWS server provisioning using Chef recipes.
  • Used CloudTrail, TESSA, CloudPassage, Checkmarx, and Qualys scanning tools for AWS security and scanning.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Defined release processes and policy for projects early in the SDLC.
  • Integrated Maven, Jenkins, and Urban Code Deploy with Patterns/Release, Git, and Cloud Foundry.
  • Heavily involved in writing complex SQL queries to pull required information from the database using Teradata SQL Assistant.
  • Managed and administered the AWS services CLI, EC2, VPC, S3, ELB, Glacier, Route 53, CloudTrail, IAM, and Trusted Advisor.
  • Created automated pipelines in AWS CodePipeline to deploy Docker containers in AWS ECS using services such as CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
  • Worked in JIRA for defect/issue logging and tracking and documented all work in Confluence.
  • Integrated services such as GitHub, AWS CodePipeline, Jenkins, and AWS Elastic Beanstalk to create a deployment pipeline.
  • Created a shell script that checks data files for corruption prior to the load.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Created a cleanup process to remove all the intermediate temp files used prior to the loading process.
  • Involved in troubleshooting production issues and providing production support.
  • Developed unit test plans and involved in system testing.
  • Migrated a Linux environment to AWS by creating and executing a migration plan; deployed EC2 instances in a VPC and configured security groups and NACLs, attaching profiles and roles using AWS CloudFormation templates.
  • Used Amazon Route 53 to manage DNS zones globally and to give public DNS names to ELBs, and CloudFront for content delivery.
  • Experience implementing AWS Lambda to run code without managing servers and to trigger runs from S3 and SNS.
  • Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with PowerShell to automate routine jobs.
  • Developed shell scripts to automate the build and release process and custom scripts to monitor repositories and server storage.
  • Created Ansible playbooks to automatically install packages from a repository, change the configuration of remotely managed machines, and deploy new builds, as well as for various automation purposes: file copies, permission changes, configuration changes, and path-specific folder creation.
  • Used the ticketing tool JIRA to track defects and changes for change management.
  • Initiated alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc., and take recommended actions for better performance (a sketch appears after this list).
  • Configured AWS Multi-Factor Authentication in IAM to implement two-step authentication of user access using Google Authenticator and AWS Virtual MFA.
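
For illustration, a minimal boto3 sketch of a CloudWatch CPU-utilization alarm like those described above. The instance ID, threshold, evaluation settings, and SNS topic ARN are assumptions for illustration.

    import boto3

    # Sketch: alarm when an instance averages over 80% CPU for two periods.
    cloudwatch = boto3.client("cloudwatch")

    cloudwatch.put_metric_alarm(
        AlarmName="ec2-high-cpu-example",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,               # 5-minute datapoints
        EvaluationPeriods=2,      # two consecutive breaches trigger the alarm
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-alerts"],  # placeholder ARN
    )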

Environment: Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Teradata FSLDM, Quality Center, UNIX, Shell scripts, GitHub, Jenkins, AWS Elastic Beanstalk, AWS S3, AWS CodePipeline.

Confidential

AWS DevOps Engineer

Responsibilities:

  • Experience in software integration, configuration, building, automating, managing, and releasing code from one environment to another and deploying it to servers.
  • Worked in an AWS environment; instrumental in utilizing compute services (EC2, ELB), storage services (S3, Glacier, Block Storage, and lifecycle management policies), CloudFormation, Lambda, VPC, RDS, and CloudWatch.
  • Worked on a blue/green deployment strategy by creating new applications identical to the existing production environment, using CloudFormation templates and Route 53 weighted record sets to redirect traffic from the old environment to the new one via DNS (see the sketch after this list).
  • Managed the code repository by maintaining code in Git and improving branching and code-merge practices to suit the development team's needs.
  • Used Amazon Route 53 to manage DNS zones and give public DNS names to Elastic Load Balancers.
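
For illustration, a minimal boto3 sketch of the Route 53 weighted-record technique behind the blue/green cutover described above. The hosted zone ID, domain, and ELB DNS names are placeholders.

    import boto3

    # Sketch: weighted CNAME records let DNS split traffic between the
    # blue and green environments during a cutover.
    route53 = boto3.client("route53")

    def shift_traffic(blue_weight, green_weight):
        changes = []
        for ident, dns, weight in [
            ("blue", "blue-elb.example.com", blue_weight),
            ("green", "green-elb.example.com", green_weight),
        ]:
            changes.append({
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "CNAME",
                    "SetIdentifier": ident,  # distinguishes the weighted records
                    "Weight": weight,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": dns}],
                },
            })
        route53.change_resource_record_sets(
            HostedZoneId="Z0000000000000000000",  # placeholder zone ID
            ChangeBatch={"Changes": changes},
        )

    # Example: send 10% of traffic to green while blue keeps 90%.
    # shift_traffic(90, 10)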

Environment: Python, AWS, Ansible, Docker, AWS CLI, Maven, Tomcat, Jenkins, Python Scripting.

Confidential

Junior DevOps Engineer

Responsibilities:

  • Created scripts in Python (Boto) that integrated with the Amazon API to control instance operations (a sketch appears after this list).
  • Designed, built, and coordinated an automated build and release CI/CD process using GitLab, Jenkins, and Puppet on hybrid IT infrastructure.
  • Involved in designing and developing with Amazon EC2, Amazon S3, Amazon RDS, Amazon Elastic Load Balancing, Amazon SWF, Amazon SQS, and other services of the AWS infrastructure.
  • Running build jobs and integration tests on Jenkins Master/Slave configuration.
  • Managed servers on the Amazon Web Services (AWS) platform using Puppet configuration management.
  • Involved in maintaining the reliability, availability, and performance of Amazon Elastic Compute Cloud (Amazon EC2) instances.
  • Conducted systems design, feasibility, and cost studies and recommended cost-effective cloud solutions such as Amazon Web Services (AWS).
  • Responsible for monitoring AWS resources using CloudWatch and application resources using Nagios.
  • Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon and worked with teams to lock down security groups.
  • Involved in complete SDLC life cycle - Designing, Coding, Testing, Debugging and Production Support.
  • Installed, configured, and managed Puppet master/agent; wrote custom modules and manifests and downloaded pre-written modules from Puppet Forge; handled upgrades and migrations of Puppet Community and Enterprise.
  • Coordinated with and assisted developers in establishing and applying appropriate branching and labeling/naming conventions using Git.
  • Performed branching, merging, and release activities on the version control tool Git; used GitHub to store source code and implemented Git branching and merging operations for Java source code.
  • Maintained DNS, NFS, DHCP, printing, mail, web, and FTP services for the enterprise.
  • Installed and upgraded packages and patches, handled configuration management, version control, and service packs, and reviewed connectivity issues related to security problems.
  • In-depth knowledge of system/OS-level TCP/IP networking.
  • Used Maven in an Agile workflow to develop builds, with Ant as an additional build tool.
  • Involved in writing parent pom.xml files to establish code-quality tool integration.
  • Created and maintained users, profiles, security rights, disk space, and process monitoring.
  • Managed user accounts and worked with shell scripting (Bash) to automate administration tasks.
  • Experience in Networking, DNS, NFS and TCP/IP.
  • Administered client computers using SSH and FTP.
  • Troubleshot network and firewall issues.
  • Administered and managed Windows servers, including configuration and troubleshooting of Active Directory, DNS, DHCP, NFS, and IIS.
  • Designed and developed several SQL Scripts, Stored Procedures, Packages and Triggers for the Database.
  • Involved in the complete Software Development Lifecycle (SDLC) using the Agile development methodology; interacted with end users and participated in Scrum meetings.
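
For illustration, a minimal sketch of the kind of instance-control script described above, written against boto3 (the current Python SDK; the original work used Boto). The Environment=dev tag filter is an assumption for illustration.

    import boto3

    # Sketch: stop all running instances tagged Environment=dev.
    ec2 = boto3.client("ec2")

    def stop_dev_instances():
        reservations = ec2.describe_instances(
            Filters=[
                {"Name": "tag:Environment", "Values": ["dev"]},
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )["Reservations"]
        ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
        if ids:
            ec2.stop_instances(InstanceIds=ids)
        return ids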

Environment: Java/J2EE, Jenkins, JIRA, Maven, GIT, ANT, AWS, Python, Ruby, Cassandra, WebLogic, UNIX Shell Scripting, Nagios, CloudWatch.
