DevOps Engineer & Technical Lead Resume
New York, NY
SUMMARY
Versatile technologist with over 20 years of experience in software development, architecture, operations, technical writing, training, and project leadership.
TECHNICAL SKILLS
Languages: awk, bash, C, CSS, HTML, Java, JavaScript, JSP, ksh, Perl, PL/SQL, Python, R, SQL
Software: Ableton Live, Adobe Experience Manager, Ansible, Apache Felix, Apache Sling, Artifactory, AWS, Bitbucket, Confluence, Docker, DynamoDB, git, Jackrabbit Oak, Jenkins, JIRA, JMeter, jQuery, JUnit, Linux, Maven, MySQL, NeoLoad, Nexus Repository, Oracle Database, Oracle Endeca, Rundeck, SQLite, Vagrant, VirtualBox, VMware
PROFESSIONAL EXPERIENCE
Confidential, New York, NY
DevOps Engineer & Technical Lead
Responsibilities:
- Primary DevOps technician serving a distributed, multicultural team of Adobe CQ/AEM developers and architects building a major public-facing web site.
- Coordinated with operations, developers, QA, marketing, and senior leadership to ensure accurate, on-budget, and on-time delivery of software.
- Responsible for configuring Red Hat Enterprise Linux servers across all environments from development to production, located both on-premises and in AWS.
- Provided continuous integration and deployment using Bitbucket, Maven, and Jenkins, enabling rapid and flexible development.
- Designed an efficient release management pipeline using JIRA, Confluence, Jenkins, Artifactory, and Rundeck, minimizing time-to-production while ensuring that all quality and IT governance standards were met.
- Created turnkey development environments using Docker and Vagrant, providing the team with consistency and streamlined onboarding of new team members.
- Designed custom Ansible modules using Python, automating complex server configuration and reducing provisioning time from weeks to minutes.
- Optimized the workflow for a team distributed across multiple continents and time zones, using tools such as Confluence, JIRA, and Slack.
- Harvested and cleansed data from internal APIs using Java and Python. Built an automated, custom ETL system to reliably refresh production data on a daily basis.
- Performed stress testing with NeoLoad to simulate expected traffic against the system from access points worldwide.
- Identified and fixed bottlenecks and faulty code to ensure performance standards were met.
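The custom Ansible modules mentioned above follow a simple contract: a module is an executable that receives JSON arguments and emits a JSON result. A minimal stdlib-only sketch of that protocol (real modules typically build on `ansible.module_utils.basic.AnsibleModule`; the module name and parameters here are hypothetical, not from the actual codebase):

```python
# Minimal sketch of the core Ansible module contract: read JSON arguments,
# apply configuration, print a JSON result. Stdlib only; illustrative names.
import json
import sys


def configure_server(args):
    """Pretend to apply a server configuration; report what changed."""
    desired_port = args.get("port", 8080)  # hypothetical parameter
    # A real module would inspect and modify actual system state here.
    return {"changed": True, "port": desired_port}


def main():
    # Ansible invokes the module with a file of JSON args as argv[1].
    if len(sys.argv) > 1:
        with open(sys.argv[1]) as f:
            args = json.load(f)
    else:
        args = {}
    print(json.dumps(configure_server(args)))


if __name__ == "__main__":
    main()
```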
Confidential
DevOps Engineer & Python Developer
Responsibilities:
- Provided DevOps and Python support for the development of an adaptive video streaming API.
- Created a continuous integration and deployment pipeline using AWS, Bitbucket, Jenkins, and Docker.
- Built development containers using Docker to provide a highly specialized development environment, including Python, Node.js, MongoDB, MySQL, mp4split, MP4Box, and ffmpeg.
- Wrote Python-based parsers for various video streaming formats, including MPEG-DASH, HTTP Live Streaming, and Microsoft Smooth Streaming.
- Wrote Python-based closed-caption parsers to maximize accessibility of video content in all formats.
- Designed unit tests for streaming media operations in both Python and bash.
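A streaming-format parser of the kind described above can be sketched with the standard library alone: an MPEG-DASH manifest (MPD) is XML, so extracting each representation's id and bandwidth is a namespace-aware tree walk. The sample manifest is illustrative, not from the actual API:

```python
# Hedged sketch: parse an MPEG-DASH MPD and list its Representations.
import xml.etree.ElementTree as ET

SAMPLE_MPD = """<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="v720" bandwidth="2500000"/>
      <Representation id="v360" bandwidth="800000"/>
    </AdaptationSet>
  </Period>
</MPD>"""

# MPD elements live in the DASH schema namespace.
NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}


def list_representations(mpd_xml):
    """Return (id, bandwidth) pairs for every Representation in the MPD."""
    root = ET.fromstring(mpd_xml)
    return [
        (rep.get("id"), int(rep.get("bandwidth")))
        for rep in root.iterfind(".//mpd:Representation", NS)
    ]
```

A client would pick the highest bandwidth its connection can sustain from the returned list, which is the essence of adaptive streaming.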
Confidential
Enterprise Search Architect
Responsibilities:
- Led the search track of an international multi-solution project, including Adobe Experience Manager and Oracle Endeca, to modernize the interface and content management practices of a major commercial web site.
- Created relational data models using UML, synthesizing disparate data sets such as products, recipes, news articles, store locations, and nutritional information.
- Designed an intuitive and user-friendly search interface to allow shoppers easy navigation through the data.
- Using Oracle Confidential and AWS DynamoDB, designed a data ingestion pipeline to provide near-real-time updates for the pricing and availability of thousands of grocery items in over 90 stores.
- Helped design a data ingestion API using Java and C#, following HATEOAS principles.
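The HATEOAS principle behind that ingestion API is that each response embeds links to the actions valid from its current state, so clients navigate by following links rather than hard-coding URLs. A minimal sketch with hypothetical resource names and paths:

```python
# Illustrative HATEOAS response builder: the available _links depend on the
# resource's state. Paths and field names are hypothetical.
def ingestion_job_response(job_id, status):
    """Build a response body whose _links reflect the job's state."""
    links = {"self": {"href": f"/jobs/{job_id}"}}
    if status == "pending":
        links["cancel"] = {"href": f"/jobs/{job_id}/cancel"}
    elif status == "complete":
        links["results"] = {"href": f"/jobs/{job_id}/results"}
    return {"id": job_id, "status": status, "_links": links}
```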
Confidential, Chicago, IL
Confidential Platform Specialist
Responsibilities:
- Created multiple web-based BI applications, working comprehensively on both the front end (Confidential Latitude) and back end, including CloverETL, Informatica, Teradata, and SUSE Linux Enterprise.
- Designed BI dashboards, reports, and dimensional models for demand planning, inventory policy, replenishment planning, and transportation control, allowing users to quickly discover and obtain actionable information from hundreds of gigabytes of raw data.
- Automated the execution and monitoring of data loads involving hundreds of millions of records, running in a complex multi-stage environment, including various layers of databases, ETL, and Endeca.
- Developed auditing routines and data-quality filters using LQL and Python, to ensure the accuracy of raw data entering the pipeline, as well as computations presented to the end user.
- Implemented a variety of automated health checks and error reporting throughout the server landscape, to maximize uptime and SLA compliance.
- Assisted sysadmins in diagnosing memory and hard drive failures; performed analyses to anticipate and prevent upcoming failures.
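A data-quality filter of the kind described above can be sketched as a function that splits raw records into accepted rows and rejected rows with reasons, before anything enters the load pipeline. Field names and rules here are hypothetical:

```python
# Minimal sketch of a pre-load data-quality filter: accept clean records,
# reject bad ones with a reason for the audit trail. Illustrative fields.
def filter_records(records):
    """Split records into (accepted, rejected-with-reason) lists."""
    accepted, rejected = [], []
    for rec in records:
        if not rec.get("sku"):
            rejected.append((rec, "missing sku"))
        elif not isinstance(rec.get("qty"), int) or rec["qty"] < 0:
            rejected.append((rec, "invalid qty"))
        else:
            accepted.append(rec)
    return accepted, rejected
```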
Confidential, New York, NY
Technical Lead & Senior Programmer Analyst
Responsibilities:
- Wrote dozens of rich, data-driven, Java-based web applications from scratch, serving a wide variety of financial sectors, including Structured Finance, Corporates, International Public Finance, and U.S. Public Finance.
- Wrote JavaScript libraries supporting browser-agnostic DHTML, to create among the first public-facing financial web sites offering a rich user experience and responsive design across the standard browsers of the day.
- Designed a custom search engine for millions of financial records using Oracle PL/SQL, allowing users to perform highly targeted search operations.
- Architected a multilingual enterprise search solution using Endeca, providing subscription-based faceted and free-text search across millions of records of structured and unstructured data.
- Implemented a granular permissions system enabling the sale of custom subscription packages, and maintained a user base of tens of thousands of web site subscribers.
- Set up off-site disaster recovery servers and failover mechanisms to ensure uninterrupted access to critical data in the event of a widespread outage.