
Devops Engineer Resume


Milwaukee, WI

SUMMARY:

  • 17+ years of comprehensive Information Technology experience in both analytical and technical capacities across the complete SDLC, covering DevOps, CI/CD, Big Data, and Confidential services.
  • Experience in DevOps and Build & Release Engineering.
  • Worked as Technical Consultant, Solution Architect, Lead, and Software Engineer on various projects.
  • Competent with DevOps tools: Confidential, Puppet, Docker; SCM: GitHub, StarTeam, Confidential, VSS; build tools: Maven, Ant, SBT; code quality and security: Sonar, Veracode.
  • Expertise in Jenkins Job DSL, build/deploy pipelines, and Jenkins ELK integration and implementation.
  • Experience with Big Data tools: Elasticsearch, Logstash, and Kibana (ELK), Filebeat, the ELK Hadoop connector, Splunk, Hadoop, HBase, Hive, Pig, Machine Learning, and Apache Solr.
  • Design, development, implementation, and maintenance of banking applications using Unix, shell scripting, Perl, Python, Java, Groovy, Ruby, Jenkins, Confidential, Maven, Ant, SBT, REST APIs, Meteor, C++, MongoDB, Oracle/Sybase, HTML, and JavaScript.
  • Extensive experience in using Unix Shell Scripting and Perl/Python Scripting.
  • Experienced in troubleshooting, configuring, and deploying enterprise applications on web servers such as Tomcat and WebSphere.
  • Internet protocols and internetworking: FTP(S), HTTP(S), networking, firewalls, ports, and data syncing across nodes.
  • Experience with virtualization platforms such as VMware vSphere.
  • Possess good communication, interpersonal, technical, and leadership skills, with adaptability to new technologies.
  • Participated in Confidential and project management processes.
  • Worked on-site with clients for requirement gathering and solution delivery; good experience in application testing.
  • Object-oriented design, implementation, and validation, plus Confidential support.

TECHNICAL SKILLS:

Programming Languages: UNIX Shell Script, Perl, Python, C++, Java, Groovy, Ruby

Web/XML Technologies: HTML, DHTML, CSS, JavaScript

Build Tools: Ant, Maven, Gradle, SBT

Tools: Puppet, Sonar, ELK (Elasticsearch, Logstash, Kibana), Splunk, Hadoop, HBase, Hive, Pig, EditPlus, WinSCP, VSS, Toad

Application/Web Servers: Tomcat, BEA WebLogic

RDBMS: Oracle 9.x, MySQL 4.0, MS SQL Server 7.0, MS Access

Source Control: GitHub, StarTeam, Confidential, VSS, CVS

Operating Systems: Linux, UNIX, Sun Solaris, Windows 9x/2000/XP

PROFESSIONAL EXPERIENCE:

Confidential, Milwaukee, WI

DevOps Engineer

Responsibilities:

  • Created reports and dashboards for LOBs; gave demos to the client and development team.
  • Jenkins configuration and plugin management, JOC management, and slave management; DevOps tool support and upgrades to the latest versions.
  • GitHub, Confidential, ServiceNow API, and JIRA integration with Jenkins.
  • Jenkins and GitHub integration using SSH; Jenkins and Confidential integration using LDAP authentication; ServiceNow integration with Jenkins using its REST API.
  • Migrated all AnthillPro jobs and StarTeam repositories to Jenkins jobs and GitHub repositories under the latest conventional model. Migrated jobs from Maven to SBT and Gradle on project request.
  • Secure page implementation and Confidential certificate installation for all DevOps tools ahead of the DR activity.
  • ELK dashboards for all LLE environments showing project-specific metrics. Developed a framework in the POC environment to collect metrics from all LLEs.
  • Pushed data to ELK and created dashboards for the business team to view; received appreciation from Kohl’s. Integrated the ELK Hadoop connector and accessed Hadoop data using Hive.
  • Hadoop Hive automation; processed terabytes of Windows syslog data using Hive.
  • Process automation of required Metadata/metrics collection on need basis.
  • Puppet implementation: log cleanup on all LLE environments via a Puppet module pushed to every environment.
  • Log cleanup on POC LLE environments to avoid priority incidents: analyzed the log folder structure, developed a shell script for Linux, and pushed it through Puppet to all POC LLE environments (50+ servers). Also pushed the configuration and software (such as shared services) required for the POC project to the target servers.
  • Developed a Tibco application framework using the Job DSL framework; generated a seed job and per-application build/deploy/maintenance jobs serving 56 applications.
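
The Puppet-distributed log cleanup described above was a Linux shell script; the same retention logic can be sketched in Python (the directory layout and 14-day retention period are hypothetical, not the actual values used):

```python
import os
import time

def clean_old_logs(log_dir, max_age_days=14):
    """Delete *.log files older than max_age_days; return the deleted paths."""
    cutoff = time.time() - max_age_days * 86400
    deleted = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        # Only plain .log files past the cutoff are removed.
        if name.endswith(".log") and os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            deleted.append(path)
    return deleted
```

In the Puppet setup, a module would ship this script to each node and schedule it (e.g., via cron), so every LLE server applies the same retention policy.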

Environment: Unix, Unix Shell Scripting, Jenkins/AnthillPro, Puppet, GitHub, StarTeam, Confidential, Maven/Ant/Gradle, SBT, Sonar, Elasticsearch, Logstash, Kibana (ELK), Hadoop, Hive, ELK Hadoop connector, Java, Groovy, Perl, C, C++, Python

Confidential, Milwaukee, WI

DevOps Engineer

Responsibilities:

  • Engagement program for Confidential Automation.
  • Designed and implemented the governance model for the automation transformation track; established SCM utilization to support RCM and standardized the configuration management tool across the enterprise.
  • Requirement analysis and preparation of automation requirements for Kohl's lines of business (LOBs).
  • Standardized the configuration management tool across the enterprise in the areas of integration, regression, and certification testing.
  • Established guidelines for test automation and integrated them with the current build system.
  • Designed build and deployment implementation and integration plans; validated build and deployment artifacts.
  • Built build and deploy pipelines; troubleshot problems arising from build/deploy/test failures.
  • Self-service job creation for Java, WebLogic, and Tibco applications, with templates for build/deploy and maintenance jobs.
  • ServiceNow API integration with Jenkins using the ServiceNow REST API. Identified Informatica jobs and enabled production automation via the ServiceNow API, reducing manual effort.
  • Collected build metrics and test case metrics from Jenkins to showcase as visualizations in Kibana dashboards using ELK.
  • Veracode implementation in Jenkins; enabled the Environment Dashboard plugin and ran a POC on its dashboard.
  • Enabled Splunk for Jenkins configuration and the Build Pipeline plugin.
  • Docker container implementation for deployment and testing in the test environment.
  • Enabled GitHub webhooks for SCM polling, GitHub pull request plugins, and Jenkins webhook plugin configuration.
  • Elasticsearch/Kibana default dashboard on JOC usage.
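
The ServiceNow integration above went through the ServiceNow REST (Table) API. A minimal sketch of building such a request in Python, assuming basic auth and a hypothetical dev instance (the request is constructed but not sent):

```python
import base64
import json
import urllib.request

def build_change_request(instance, user, password, payload):
    """Build (but do not send) a ServiceNow Table API POST for a change record."""
    url = f"https://{instance}/api/now/table/change_request"
    # Basic auth header; a production setup would use OAuth or stored credentials.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )
```

In a Jenkins job, a step like this would open a change record before a production deploy and attach the build number in the payload.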

Environment: Jenkins/AnthillPro, Puppet, GitHub, StarTeam, Confidential, Maven/Ant, SBT, Gradle, Sonar, Elasticsearch, Logstash, Kibana (ELK), UNIX, Groovy, Perl, Java, Python

Confidential, Milwaukee, WI

DevOps Engineer

Responsibilities:

  • Engagement program for Confidential Automation at Kohl’s.
  • Installed Jenkins, Confidential, Sonar, and GitHub for Confidential lines of business.
  • Configured Jenkins with GitHub and with Confidential; installed and supported Jenkins plugins.
  • Designed CI/CD processes in the context of Jenkins orchestration, including the use of automated build, deploy, and test tooling.
  • Understood the existing applications and business area of each LOB, along with the existing Java, WebLogic, and Tibco jobs running on AnthillPro. Introduced key pipeline concepts: upstream/downstream triggers, visualization, folders and parameters, and email notifications.
  • Designed the approach and usage for bringing these applications into DevOps.
  • Automated the manual build, release, and testing processes.
  • Set up jobs in Jenkins with open source build tools and automated build and release (without disturbing the existing applications); gave demos to LOB managers and teams.
  • Identified a large volume of applications and migrated their jobs from AnthillPro to Jenkins.
  • Tested and validated build and deploy by pointing source code at Confidential/StarTeam.
  • Configured Windows/Unix slaves based on each LOB's jobs.
  • Demos to the client for build and deployment
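
The upstream/downstream pipeline concept introduced above is essentially a dependency graph of jobs. A small illustration (job names hypothetical; requires Python 3.9+ for `graphlib`):

```python
from graphlib import TopologicalSorter

# Each job maps to the upstream jobs that must finish before it can run.
upstreams = {
    "build": [],
    "unit-test": ["build"],
    "deploy-qa": ["unit-test"],
    "regression": ["deploy-qa"],
}

def run_order(graph):
    """Return a valid execution order respecting upstream dependencies."""
    return list(TopologicalSorter(graph).static_order())
```

Jenkins resolves the same ordering itself via upstream/downstream triggers; the sketch only makes the dependency relation explicit.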

Environment: Jenkins/AnthillPro, Puppet, GitHub, StarTeam, Confidential, Maven/Ant, Sonar, Elasticsearch, Logstash, Kibana (ELK), SBT, Gradle, UNIX, Groovy, Perl, Java, C++, Python

Confidential

DevOps Engineer

Responsibilities:

  • Developers check in and push code to SCM.
  • The CI tool builds the code after checking coding standards.
  • A user interface shows the build status of Maven phases; successes and failures are notified.
  • For every successful build, artifacts are stored in a repository; rollback is possible because artifacts are retained.
  • The orchestrator requests a machine for deployment from the environment services.
  • Provisioning services supply a machine via VMware, Vagrant, or AWS as requested.
  • The CD tool installs the required software on the new machine and deploys the artifacts.
  • One-stop user interface: complete monitoring of the flow.
  • Tracking of environment, resource, and machine details in one place.
  • Built the orchestrator to control the flow; integrated user, project, and workflow management.
  • Auto-provisioning of machines for build and deploy (Vagrant, VMware, AWS).
  • REST APIs and ZeroMQ message queues used for communication.
  • Integrated code coverage, coding-standards, and testing tools.
  • Service virtualization (easy integration with commercial CI/CD tools such as CA Nolio and AnthillPro).
  • Automated 40% to 50% of test cases, reducing testing effort and meeting daily/weekly release cycles.
  • Automated, repeatable process and flow, predicting release cycles in advance.
  • Reduces the risk of errors and the time bottlenecks that delay software releases into multiple target environments.
  • Benefits of a reliable, repeatable process: improved accuracy, speed, and control.
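
The flow described above (build, archive artifacts, provision, deploy, roll back from stored artifacts) can be sketched as a toy orchestrator; all names are illustrative, not the actual implementation:

```python
class Orchestrator:
    """Toy orchestration flow: build, archive artifacts, deploy, roll back.

    Artifacts are retained per version, so a failed release can roll back
    to any previously built version.
    """

    def __init__(self):
        self.artifacts = {}   # version -> stored artifact
        self.deployed = None  # version currently live

    def build(self, version, source):
        artifact = f"pkg:{source}@{version}"  # stand-in for a real build step
        self.artifacts[version] = artifact
        return artifact

    def deploy(self, version):
        if version not in self.artifacts:
            raise KeyError(f"no artifact stored for {version}")
        self.deployed = version

    def rollback(self, version):
        # Rollback is simply a redeploy of a previously stored artifact.
        self.deploy(version)
```

The real system added provisioning (Vagrant/VMware/AWS) and REST/ZeroMQ messaging around this core flow; the sketch shows only why retained artifacts make rollback trivial.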

Environment: GitHub, REST API using Flask, Java, Jenkins, Maven, Confidential, Python, Ruby, PhantomJS, Meteor, C++

Confidential

BigData Engineer

Responsibilities:

  • Solution Architect - Architecture, Development and Support
  • Requirement gathering and understanding of existing “as is” Confidential enterprise tools.
  • Understood Confidential’s competitor list and the required technology trends within the experiment scope.
  • Evaluated and shortlisted open source tools.
  • Evolved the architectural design for the experiment using the shortlisted open source tools, and set up the environment for them.
  • Data extraction: Twitter feeds using the Twitter4J framework.
  • Experiment building:
  • Data extraction from Twitter using Twitter4J API;
  • Data processing using Hadoop storage and Map reduce jobs;
  • Preparing training data for sentiment analysis;
  • Predicted tweet sentiment using the GATE tool, which uses the machine-learning PAUM algorithm
  • Data visualization set up using Splunk
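
Preparing training data for sentiment analysis, as above, can be sketched with a common distant-supervision heuristic (emoticons as noisy labels); this is an illustration only, not the GATE/PAUM pipeline itself:

```python
POS = {":)", ":-)", ":D"}
NEG = {":(", ":-(", ":'("}

def label_tweet(text):
    """Distant supervision: emoticons act as noisy sentiment labels."""
    tokens = text.split()
    if any(t in POS for t in tokens):
        return "positive"
    if any(t in NEG for t in tokens):
        return "negative"
    return None  # unlabeled; excluded from training

def build_training_set(tweets):
    """Keep only heuristically labelable tweets, stripping the emoticon itself."""
    data = []
    for t in tweets:
        label = label_tweet(t)
        if label:
            cleaned = " ".join(w for w in t.split() if w not in POS | NEG)
            data.append((cleaned, label))
    return data
```

The emoticon is removed from the text so the trained classifier cannot simply memorize the label source.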

Environment: Java, Hadoop (HDFS, Hive), Perl, Python, Apache Solr, Machine Learning.

Confidential

BigData Engineer

Responsibilities:

  • Solution Architect - Architecture, Development and Support
  • Requirement gathering and understanding of existing “as is” Confidential enterprise tools
  • Understood Confidential’s competitor list and the required technology trends within the experiment scope. Evaluated and shortlisted open source tools
  • Evolved the architectural design for the required experiment using the shortlisted open source tools
  • Environment setup for the open source tools and experiment building:
  • Data extraction from news feeds/blogs using the Apache Wink API;
  • Data processing using Hadoop storage and Map reduce jobs;
  • Data visualization set up using Apache Solr.
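
The MapReduce processing step above can be sketched as a Hadoop-Streaming-style mapper/reducer pair in Python, with word count standing in for the actual feed-processing job:

```python
from itertools import groupby

def mapper(lines):
    """Emit (word, 1) pairs, as a streaming mapper would write to stdout."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reducer(pairs):
    """Sum counts per key; Hadoop delivers mapper output sorted by key."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)
```

Under Hadoop Streaming these two functions would run as separate processes reading and writing tab-separated lines; the sort between them is performed by the framework's shuffle phase.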

Environment: Java, Hadoop (HDFS, Hive), Perl, Python, Apache Solr, Machine Learning.

Confidential

Bigdata Engineer

Responsibilities:

  • Development and Support
  • Project: automation of security audits using Splunk.
  • Splunk installation and setup; Splunk solution design and configuration for the security audit.
  • Received syslog data from Cisco routers (via UDP ports), created an index, analyzed the syslog, and set up file monitoring and alert triggers.
  • Designed and built a Perl script to perform the audit service: compliance reconciliation between the received logs and the router configuration command files, identifying security violations and generating reports on audit compliance/violations.
  • Security Audit Result Visualization through Splunk Dashboard.
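
The compliance reconciliation performed by the Perl script can be sketched as a set comparison: received router commands are checked against a baseline, flagging violations and missing directives (shown here in Python with illustrative commands, not the original Perl):

```python
def audit_config(received_cmds, baseline_cmds):
    """Reconcile received router commands against a compliance baseline.

    Returns (violations, missing): commands observed but not allowed,
    and baseline commands that were never observed.
    """
    received = set(received_cmds)
    baseline = set(baseline_cmds)
    violations = sorted(received - baseline)  # present but non-compliant
    missing = sorted(baseline - received)     # required but absent
    return violations, missing
```

The audit report in Splunk would then visualize the two lists per router, driving the alert triggers mentioned above.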

Environment: Splunk and Perl 5.6

Confidential

BigData Engineer

Responsibilities:

  • Architect / Development Lead
  • Trained and led a team of 6 in Perl/Oracle.
  • Conducted various technical trainings (UNIX & Perl) to groom newcomers to work on the project.
  • Developed 6+ applications in Perl/Oracle.
  • Involved in Design activity and providing the appropriate solutions.
  • Developed the Confidential applications CODS, Pre-processor 1 & 2, Update Manager, Polling, Scheduling, and CRDMS using Perl.
  • Unit tested, integrated, and implemented at the client location.

Environment: Perl, Shell script, Oracle, Linux, PL/SQL, MS SQL
