
DevOps Engineer Resume


Philadelphia, PA

SUMMARY:

  • Over 11 years of IT experience, including 3 years of hands-on experience deploying enterprise business continuity solutions
  • Skilled in AWS Confidential computing: EC2 instances, AMI, S3, EBS, Glacier, Snowball, VPC, Route 53, Kinesis, EMR, Redshift, Elasticsearch, Lambda, SNS, SQS, and Internet of Things (IoT)
  • Worked with the Apache ecosystem: Hadoop Distributed File System (HDFS) and Hive for aggregation and analytics of big data; MongoDB, Cassandra, CouchDB, SimpleDB, HBase, Apache Spark, Storm, Flume, Kafka, and Pig
  • Knowledge of CI/CD technologies for both Confidential-native and non-Confidential applications; experience with build automation tools: Chef, Jenkins, and Puppet
  • Worked with Hadoop clusters in Amazon Elastic MapReduce (EMR) to migrate files to Redshift with AWS Data Pipeline
  • Strong understanding of DW/BI technologies: DW data modeling, data integration, and visualization/reporting technologies such as Talend, SnapLogic, TIBCO, Tableau, and a host of others available in the AWS Marketplace
  • Generated weekly, monthly, and quarterly reports using SQL Standard Reports
  • Knowledge of Disk and Server configuration through RAID for SQL Server
  • Knowledge of Citrix NetScaler, Nginx, and HAProxy load balancing
  • Experience in containerization/cluster management: Docker containers and repositories; scripting languages such as Bash and Perl
  • Used SQL Server Integration Services (SSIS) and Extract, Transform, Load (ETL) tools to populate data from various data sources, creating packages for different data loading operations for the application

TECHNICAL SKILLS:

ETL (SSIS, SSRS, SSAS); Hadoop; Hive/HBase; Pig/Scala; Data Migration; AWS/Azure; Elastic MapReduce; High Availability; Java/J2EE/Python; Data Warehousing; Performance Tuning; Domain Name Creation; Unix/Linux; Spark

PROFESSIONAL EXPERIENCE:

DevOps Engineer

Confidential, Philadelphia, PA

Responsibilities:
  • Work with collaboration tools such as Jira and follow Scrum as the agile methodology
  • Use Splunk and Zenoss as monitoring tools, helping ensure 97% uptime
  • Use Nexus as an artifact repository to store builds before testing and deployment
  • Use SonarQube as a code quality and validation tool
  • Use Jenkins as the Continuous Integration and Continuous Deployment (CI/CD) tool
  • Work with infrastructure-level virtualization technologies such as VMware and VirtualBox
  • Use the OpenShift platform for container-based code build, test, and deployment
  • Use Ansible for IT automation and multi-tier application deployment by writing simple task descriptions (see the playbook sketch after this list)
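
As a hedged illustration of the Ansible bullet above, here is a minimal playbook sketch for a two-tier deployment. The host group names (db, web) and the package and service names are hypothetical placeholders, not details from any actual engagement.

    # Minimal Ansible playbook sketch for a multi-tier deployment.
    # Host groups and package/service names are hypothetical placeholders.
    - name: Deploy database tier
      hosts: db
      become: true
      tasks:
        - name: Install the database package
          ansible.builtin.package:
            name: postgresql-server
            state: present
        - name: Ensure the database service is running
          ansible.builtin.service:
            name: postgresql
            state: started
            enabled: true

    - name: Deploy web tier
      hosts: web
      become: true
      tasks:
        - name: Install the web server package
          ansible.builtin.package:
            name: nginx
            state: present
        - name: Ensure the web service is running
          ansible.builtin.service:
            name: nginx
            state: started
            enabled: true

Running ansible-playbook against an inventory that defines the db and web groups applies each tier in order; because the package and service modules are idempotent, repeat runs are safe.
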
MS SQL Server Database Administrator

Confidential, McDonough, GA

Responsibilities:
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and pre-processing
  • Populated large data sets into HDFS data nodes using the Linux command line
  • Used Eclipse platform as Integrated Development Environment
  • Worked with Hadoop architecture and wrote both mapper and reducer algorithms
  • Ensured high availability and fault tolerance within the Hadoop ecosystem
  • Used the command-line interface to set up the Hive shell and performed complex queries with HiveQL
  • Created databases, designed tables, and populated tables with large data sets on the Hive framework
  • Used AngularJS for web design and applications
  • Developed multiple MapReduce programs in Python to process large volumes of semi-structured and unstructured data files using different MapReduce design patterns (see the streaming mapper/reducer sketch after this list)
  • Developed multiple MapReduce programs in Java for data extraction, transformation, and aggregation from multiple file formats, including XML, JSON, CSV, and other compressed file formats
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive to improve performance when querying historical tables (see the Hive sketch after this list)
  • Developed custom MapReduce programs and User Defined Functions (UDFs) in Hive to transform large volumes of data according to business requirements
  • Implemented proofs of concept on the Hadoop stack and different big data analytics tools, including migration from different databases (MS SQL, Oracle, and MySQL) to Hadoop
  • Developed Pig Scripts and Pig's Load and Store functions
  • Extracted and aggregated large amounts of log data using Apache Flume and staging data in HDFS for further analysis.
  • Utilized Spark to improve the performance and optimization of existing algorithms in Hadoop using Spark Context, Spark SQL, DataFrames, pair RDDs, and Spark on YARN (see the PySpark sketch after this list)
  • Created an Amazon RDS Microsoft SQL Server instance and connected to it using Microsoft SQL Server Management Studio
  • Created S3 storage on AWS and implemented lifecycle policies for managing objects (see the boto3 sketch after this list)
  • Configured and managed a high availability SQL Server on AWS using RDS
  • Utilized AngularJS to bind data to HTML documents for the web application
  • Designed, deployed, and optimized MS SQL Server on AWS in EC2 and RDS
  • Provisioned Redshift clusters for descriptive, predictive and prescriptive analytics of big data with traditional Business Intelligence tools
  • Utilized version control tools such as Subversion and Git, and developed logging standards and mechanisms based on Log4j
  • Mitigated an application spike that occurred when customers migrated to a new data source system by investigating the issue and applying my team's resolution skill sets to stabilize the application in a timely manner
  • Worked with Join Group management to understand their short-term goals and long-term vision, and provided leadership for both internal and external continuous integration and continuous delivery initiatives
  • Implemented agile methodology and participated in Scrum meetings with DevOps and developers for daily operations
  • Deployed a Jenkins server as the CI tool to schedule cron jobs and workflow pipelines
  • Experienced in using automation tools such as Chef, Jenkins, and Visual Studio Team Services (VSTS) to build and release artifacts in a Continuous Integration and Continuous Deployment architecture whenever there are changes in Git repositories
  • Extensive knowledge of using Kafka for extraction and ingestion of large streaming data into HDFS and fast distribution to different nodes within the cluster for fast data processing
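
The Python MapReduce bullet above describes the kind of job typically run with Hadoop Streaming; below is a minimal sketch, assuming newline-delimited JSON input with a hypothetical event_type field.

    #!/usr/bin/env python
    # mapper.py - Hadoop Streaming mapper sketch: parse semi-structured
    # JSON lines and emit (event_type, 1) pairs. The field name is a
    # hypothetical placeholder.
    import json
    import sys

    for line in sys.stdin:
        try:
            record = json.loads(line)
            print("%s\t1" % record.get("event_type", "unknown"))
        except ValueError:
            continue  # skip malformed lines

    #!/usr/bin/env python
    # reducer.py - Hadoop Streaming reducer sketch: sum the counts per
    # key; input arrives sorted by key from the shuffle phase.
    import sys

    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if key != current_key:
            if current_key is not None:
                print("%s\t%d" % (current_key, count))
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print("%s\t%d" % (current_key, count))

A pair like this is submitted with the Hadoop Streaming jar, passing the two scripts via the -mapper and -reducer options.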
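
For the Hive partitioning and bucketing bullet, here is a minimal sketch using the PyHive client; the host, table, and column names are hypothetical placeholders, and nonstrict dynamic partitioning is assumed to be permitted on the cluster.

    # Hive partitioning/bucketing sketch via PyHive.
    # Table and column names are hypothetical placeholders.
    from pyhive import hive

    conn = hive.Connection(host="hive-server.example.com", port=10000)
    cur = conn.cursor()

    # Partition by date and bucket by customer id so queries against
    # historical tables can prune partitions and sample buckets.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS sales_hist (
            customer_id BIGINT,
            amount      DOUBLE
        )
        PARTITIONED BY (sale_date STRING)
        CLUSTERED BY (customer_id) INTO 32 BUCKETS
        STORED AS ORC
    """)

    # Dynamic partitioning lets one INSERT populate many date partitions.
    cur.execute("SET hive.exec.dynamic.partition.mode=nonstrict")
    cur.execute("""
        INSERT OVERWRITE TABLE sales_hist PARTITION (sale_date)
        SELECT customer_id, amount, sale_date FROM sales_staging
    """)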
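
For the Spark bullet, here is a minimal PySpark sketch showing one aggregation expressed through Spark SQL, the DataFrame reader, and a pair RDD; the HDFS paths and the event_type column are hypothetical placeholders.

    # PySpark sketch: moving an aggregation from hand-written MapReduce
    # to Spark. Paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("log-aggregation").getOrCreate()

    # Spark SQL / DataFrame API replaces separate mapper and reducer code.
    logs = spark.read.json("hdfs:///staging/flume/logs/")
    logs.createOrReplaceTempView("logs")
    counts = spark.sql("""
        SELECT event_type, COUNT(*) AS events
        FROM logs
        GROUP BY event_type
    """)

    # Equivalent pair-RDD formulation of the same aggregation.
    pair_counts = (logs.rdd
                   .map(lambda row: (row["event_type"], 1))
                   .reduceByKey(lambda a, b: a + b))

    counts.write.mode("overwrite").parquet("hdfs:///warehouse/event_counts/")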
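
For the S3 lifecycle bullet, here is a minimal boto3 sketch that transitions objects to Glacier after 90 days and expires them after a year; the bucket name and prefix are hypothetical placeholders.

    # S3 lifecycle policy sketch with boto3. The bucket name and the
    # "logs/" prefix are hypothetical placeholders.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-archive-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-then-expire",
                    "Filter": {"Prefix": "logs/"},
                    "Status": "Enabled",
                    # Move to cold storage after 90 days...
                    "Transitions": [
                        {"Days": 90, "StorageClass": "GLACIER"}
                    ],
                    # ...and delete after a year.
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )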

MS SQL Server Database Administrator

Confidential, Rio Rancho, NM

Responsibilities:
  • Configured and set up Disaster Recovery (DR) and High Availability (HA) using Always On high availability solutions for customers
  • Designed, implemented, and administered high availability solutions (clustering, log shipping, mirroring, and replication)
  • Provided 24/7 remote support to SQL Servers on customer sites of various companies to prevent downtime
  • Wrote, debugged, and tuned T-SQL and stored procedures for system performance (see the stored procedure sketch after this list)
  • Provided SQL Server database physical model creation and implementation (data type, indexing, table design)
  • Set up and administered SQL Server database security environments using profiles, database privileges, and roles to prevent data theft and increase data integrity
  • Performed database maintenance, replication, and tuning
  • Migrated systems data from different databases and platforms to MS-SQL databases
  • Provided Database administration including installation, configuration, upgrades, capacity planning, performance tuning, backup and recovery in managing clusters of SQL servers
  • Generated complex stored procedures and functions for better performance and flexibility
  • Ensured database performance monitoring and tuning is regularly accomplished
  • Reviewed logs, updated database statistics and index management
  • Troubleshot application performance issues, database maintenance & configuration, patching, and upgrading
  • Participated in the monthly DBA on-call rotation and documented environments and processes where applicable
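
For the T-SQL bullet above, here is a minimal sketch of deploying a tuned stored procedure through pyodbc; the server, database, table, and index names are hypothetical placeholders, and CREATE OR ALTER assumes SQL Server 2016 SP1 or later.

    # Sketch of deploying a tuned T-SQL stored procedure via pyodbc.
    # Server, database, and object names are hypothetical placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sql01.example.com;DATABASE=Sales;Trusted_Connection=yes;"
    )
    conn.execute("""
    CREATE OR ALTER PROCEDURE dbo.usp_GetOrdersByCustomer
        @CustomerId INT
    AS
    BEGIN
        SET NOCOUNT ON;  -- suppress row-count chatter for performance
        SELECT OrderId, OrderDate, Total
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId;  -- assumes an index on CustomerId
    END
    """)
    conn.commit()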

Managing Director

Geopell Nigeria Limited, Lekki, Lagos

Responsibilities:
  • Introduced more business units in energy and metallurgy
  • Participated in the procurement of oil line pipes worth $5 million for NAOC
  • Procured 3 km of concrete-coated 4” and 6” line pipes
  • Commenced in-country stocking of steel pipes for better customer satisfaction
  • Achieved agency agreements with technical partners such as TMK (Russia), St. Louis Pipes (USA), Lontrin Steel Tube (China), Galperti Valves (Italy), and Offshore Engineering (Dubai)
  • Participated in the procurement of line pipes for the NPDC-Nestoil 20 km project
  • Participated in the procurement of control and shutdown valves for the Suntrust/Midwestern Oil 53.1 km pipeline project

Database Administrator

Geopell Nigeria Limited, Lekki, Lagos

Responsibilities:
  • Troubleshot and resolved database integrity issues, performance issues, blocking and deadlocking issues, replication issues, log shipping issues, connectivity issues, and security issues (see the DMV query sketch after this list)
  • Identified and analyzed areas of potential risk to the assets, earning capacity, or success of the organization
  • Produced reports and presentations that outline findings, explain risk positions, or recommend changes
  • Planned and contributed to the development of fraud risk management, compliance, and control systems
  • Carried out Windows Server performance tuning and data security, as well as physical security of the servers themselves
  • Maintained a cross-functional interface with all business units: Revenue Assurance, Finance, Audit, Projects, QA, and the implementation team
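
For the blocking and deadlocking bullet, here is a minimal sketch of surfacing blocked sessions with a dynamic management view query via pyodbc; the connection details are hypothetical placeholders.

    # Sketch of finding blocking sessions through sys.dm_exec_requests.
    # Connection details are hypothetical placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sql01.example.com;DATABASE=master;Trusted_Connection=yes;"
    )
    rows = conn.execute("""
        SELECT session_id, blocking_session_id, wait_type, wait_time
        FROM sys.dm_exec_requests
        WHERE blocking_session_id <> 0  -- only requests that are blocked
    """).fetchall()
    for r in rows:
        print(r.session_id, "blocked by", r.blocking_session_id, r.wait_type)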
