
Technical Director Resume


OBJECTIVE:

  • To bring advanced, creative solutions to customers, creating opportunities for future business and leveraging a strong team to build repeatable solutions that increase profit margins.

SUMMARY:

  • Over 16 years in the Information Technology field, focusing on automation, cloud, and big data architectures. Specializes in and is certified in AWS, and has presented at AWS roadshows and executive-level functions. During his career he has worked directly with executives at many commercial, federal government, and intelligence agencies. He has comprehensive knowledge of applications development as well as cloud and virtualization-based architectures, spanning the full development lifecycle, continuous integration, and cutting-edge high-availability (DR/COOP) cloud solutions. He is an experienced solutions architect, object-oriented software designer, and technical lead, adept at interfacing with employees, customers, and vendors at various levels to grow business.
  • He has over sixteen years of development expertise in the web, application, and data tiers, and eight years of experience integrating solutions using REST web services and other SOA technologies. In addition to the Amazon Web Services, Microsoft Azure, and Terremark cloud service providers, he has worked with big data and mass analytics on some of the largest enterprise solutions as an integrator for the intelligence community, and with large enterprise petabyte systems like Siri, Facebook, Xbox Live, and Researchers.
  • Recently brought on as a Subject Matter Expert (SME) for CMS Healthcare.gov to help with re-architecting and development.

TECHNICAL SKILLS:

Languages: Node.js, C#, C++, Visual Basic, HTML, HTML5, JavaScript, Python, ASP, AJAX (JavaScript), PHP, PL/SQL, Perl, Flash, Flex 3, Batch, Bash

Hardware: x86 machines, Dell servers, HP servers, SAN, Petabyte Storage Arrays

OS: Amazon Linux, Windows Azure, Linux (CentOS, Red Hat, SUSE, Ubuntu), Windows Server 2008, Windows 7, Windows Vista, Windows XP, Windows 2000, Windows 98, FreeBSD, AIX (IBM RS/6000), IBM Mainframe

Frameworks: Hadoop, Hortonworks, Cloudera, Nginx, Apache, Tornado Web Server (Facebook's), AngularJS, Express.js, Backbone, Bootstrap, and jQuery

Databases: Redshift, DynamoDB, RDS (MySQL/PostgreSQL), Redis, Mongo, HBase, Cassandra, Lucene, Solr, Vertica

Software: Amazon Web Services (APIs, CLI, and console), Microsoft Azure console, Terremark Cloud, Jenkins, Git, Subversion (SVN), MS Visual Studio and .NET, PyCharm, Sencha Architect, Eclipse, RabbitMQ, SQL Developer, SQL*Plus, Office 365, Visio, MS Project, Access, MS Word, Excel, PowerPoint, Lotus Notes Designer, Lotus Notes, Lotus Notes Server, Analyst Notebook, Centrifuge, IIS, Samba, Mahout

PROFESSIONAL EXPERIENCE:

Confidential

Technical Director

Responsibilities:

  • Leveraged Hunk (Splunk Analytics for Hadoop), using MapReduce to create a cost-effective analytics platform on AWS EMR for large volumes of machine and user data, fed from AWS CloudWatch and S3 logs.
  • Used the AWS describe APIs for VPCs, subnets, security groups, and EC2, together with CloudFormation, to create spreadsheets, MySQL and PostgreSQL database entries, and d3.js visualizations.
  • Architected and implemented AWS Kinesis consumers in Lambda, piping data to a data lake while enabling real-time analytics with DynamoDB, ordered by mobile users' location and ordering trends.
  • Architected and implemented a custom big data analytics platform sold to customers seeking insight into market trends and upcoming opportunities, positioning them for competitive advantage. Utilized ETL tools such as Talend and Pentaho; data ingestion ran through S3-triggered Lambda functions loading into Redshift, MySQL, and PostgreSQL.
  • Developed the enterprise architecture for a complex, secure hybrid cloud infrastructure for website and mobile applications processing credit card transactions in compliance with PCI DSS.
  • Stood up multiple tiers and various databases for analytics and business intelligence (BI) on trends and transactions, utilizing Mongo/DynamoDB, MySQL, and Vertica/Redshift.
  • Utilized CloudFormation and Puppet to create a consistent, reliable deployment methodology that remediates risk and reduces the time needed to replicate environments across multiple regions for disaster recovery and continuous operations.
  • Built the roadmap and implementation plan, then developed, implemented, and tested continuous integration using Jenkins, Git/SVN, Node.js, and SSH2 (comparable to Python with Fabric). Coded the AWS API, SSH2, and Bash automation needed to prepare AMI builds and to build, test, and deploy code to the servers. The deployment server integrated with CloudFormation and Puppet, creating DevOps processes that work well with Jenkins and Agile development.
  • Complex automation included enrolling new servers in Splunk monitoring, joining them to Active Directory with Centrify, and activating Trend Micro agents; the complexity came from server configuration changes and the need to store and use passwords securely and encrypted.
  • Deployed enterprise tools to help achieve PCI compliance: Sophos Unified Threat Management, ArcSight enterprise security management, Splunk Operational Intelligence, SolarWinds network and fault monitoring, AWS CloudTrail, and AWS CloudWatch.
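The S3-triggered ingestion mentioned above can be sketched roughly as follows. This is a minimal Python sketch, not the actual implementation: the table name and IAM role ARN are hypothetical, and a real deployment would execute the generated statements against Redshift (e.g. via psycopg2 or the Redshift Data API) rather than returning them.

```python
def build_copy_sql(bucket: str, key: str, table: str, iam_role: str) -> str:
    """Build a Redshift COPY statement for a newly landed S3 object.
    Table and role are illustrative placeholders, not from the resume."""
    return (
        f"COPY {table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS CSV IGNOREHEADER 1;"
    )

def handler(event, context=None):
    """Lambda entry point fired by an S3 ObjectCreated notification."""
    statements = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        statements.append(build_copy_sql(
            bucket, key,
            "analytics.events",                              # hypothetical table
            "arn:aws:iam::123456789012:role/redshift-load",  # hypothetical role
        ))
    # In production these statements would be run against Redshift;
    # here we simply return them for inspection.
    return statements
```

The same handler shape works for the MySQL/PostgreSQL targets by swapping the generated statement for a LOAD or INSERT path.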

Confidential

Cloud Solution Architect

Responsibilities:

  • Amazon Web Services (AWS) Premier Partner and certified AWS Architect.
  • Set up and used VPCs, subnets, security groups, ACLs, EC2, hardened AMIs, RDS, DynamoDB, S3 buckets, SES, SNS, detailed billing, CloudTrail, and CloudWatch.
  • Recently brought on to CMS Healthcare.gov as a Subject Matter Expert (SME) to work with AWS, Node.js, and the front end (HTML5, CSS3, Backbone, JavaScript); specifically, onboarding people and developing Node.js/Nginx apps in the AWS cloud utilizing full continuous integration and test-driven development methodology.
  • Guest speaker at an AWS SLED/DOD roadshow focusing on products in AWS, the cloud broker model, and contract vehicles.
  • Built Olympus, Aquilent's cloud portal, integrating the AWS APIs to manage and audit AWS. As an architect in the Cloud Solutions Group, worked with the CTO, developers, and AWS senior sales and engineering to conceive the portal design and architecture best meeting Federal Government customer needs, 508 compliance, and user experience, using the responsive Bootstrap framework, Backbone, etc.
  • Helped customize bundled services to fit customers' needs, providing greater satisfaction, reduced transition time, better security, and better overall cloud architecture.
  • Performed cloud solution architecting and quoting, and provided RFP, RFQ, and RFI responses as needed for various configurations and customer needs under different contract vehicles, allowing flexibility and simplicity.
  • Configured a ticket-system workflow that allows CSG to function with automation for higher efficiency and organization, with continuous feedback to better automate triage.
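The portal's AWS audit integration can be illustrated with a small pure transformation from an EC2 DescribeInstances-style response into spreadsheet rows. This is a sketch under assumptions: the field selection is illustrative, not taken from the actual portal, and in practice the response dict would come from an AWS API call.

```python
import csv
import io

def instances_to_csv(describe_response: dict) -> str:
    """Flatten an EC2 DescribeInstances-shaped response into audit CSV rows.
    Columns chosen here are illustrative, not the portal's real schema."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["InstanceId", "InstanceType", "State", "VpcId"])
    for reservation in describe_response.get("Reservations", []):
        for inst in reservation.get("Instances", []):
            writer.writerow([
                inst.get("InstanceId", ""),
                inst.get("InstanceType", ""),
                inst.get("State", {}).get("Name", ""),
                inst.get("VpcId", ""),
            ])
    return buf.getvalue()
```

Keeping the transformation pure (dict in, CSV out) makes it easy to unit-test without live AWS credentials.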

Confidential

Solution Architect

Responsibilities:

  • Set up the Linux OS and tested Node.js/front-end code locally and in the cloud on AWS EC2, EBS, SNS, and S3 storage.
  • Met the deadlines, placing DDN in position for millions of dollars in revenue.
  • The first on-time delivery of DirectMon covered monitoring and management of the GridScaler system, which involved in-depth training and learning on IBM's parallel file system GPFS: how to set it up, configure it, and monitor its metrics via the DirectMon cluster-management tool that we designed.
  • The second on-time delivery of DirectMon covered monitoring and management of the ExaScaler and hScaler platforms. The integration involved in-depth training and learning on Intel's parallel file system Lustre and Apache's distributed applications platform Hadoop, as well as HDFS, MapReduce, and HBase, with Pentaho node management added.
  • Released DirectMon at the Supercomputing SC12 conference.
  • Integrated the latest graphing and drill-down graph libraries, streaming real-time data through WebSockets straight from the Storage Fusion Architecture (SFA) APIs used by supercomputing, high-performance computing, and cloud computing customers.
  • Led research and development efforts to balance large data volumes against acceptable performance, identifying the best practices and libraries.
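One common building block behind real-time drill-down graphs of storage metrics is converting cumulative hardware counters into per-second rates before pushing them over a WebSocket. A minimal sketch follows; the sample format is an assumption for illustration, not the SFA API's actual shape.

```python
def counter_to_rates(samples):
    """Convert (timestamp, cumulative_counter) samples into (timestamp, rate)
    pairs, the shape most time-series graphing libraries expect.
    Samples are assumed to be sorted by timestamp."""
    rates = []
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:  # skip duplicate timestamps to avoid division by zero
            rates.append((t1, (c1 - c0) / dt))
    return rates
```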

Confidential

Senior Software Engineer

Responsibilities:

  • Received 2 gold coins, rarely given to contractors, from a large intelligence agency as awards for productivity and cutting-edge innovation on the BRAVE initiative, the agency's upcoming cloud architecture.
  • Configured and enhanced widgets in a web-based Ozone framework (extended ExtJS and Dojo) used in the intel community; widgets communicated with other widgets to produce an enhanced user experience.
  • Worked with many big data vendors, including those used by eBay, Amazon, and others in the petabyte range, and delivered briefings on all these products.
  • Completed Cloudera (Hadoop bundle) training and wrote map-reduce jobs, understanding the concepts that make truly scalable solutions in a shared-nothing environment.
  • Developed, implemented, and tested a Lotus Notes converter using the Lotus APIs to integrate into an Oracle schema, leveraging the database's full-text indexing and integrated HighView workflow processes. The Notes integration provided a mapping into a HighView schema that can be leveraged by other third-party applications that work in SOA, and provided a strong understanding of the Lotus Notes architecture and APIs.
  • Developed, implemented, and tested phone-exploitation kits and imported the data into the HighView schema, allowing searching and linking of phone records, images, and video into a named-entity structure for plotting on maps, exploiting, translating, and exporting to Analyst Notebook or Centrifuge.
  • Developed, implemented, and tested document-exploitation web services for Confidential's DOMEX suite, including all the data necessary to promote system integration across the community. Later, Oracle performance tuning of the database and schema was needed to reach the desired response times.
  • Developed and tested a PDF-to-HTML converter that converts electronic PDFs to HTML and puts all images in div tags so the coordinates of text and images are available. Links and relationships between images and data were then automated and stored in the database.
  • Built document named-entity extraction to customer specifications: gathered requirements; composed the project plan with timeframes and the SDD; and developed, implemented, tested, and documented the extractor. Used complex regular expressions and the PDF-to-HTML converter to extract text and related images.
  • Developed XSL style sheets to easily view web service output.
  • Sold the client on providing users with a full-scale search-and-filter website with AJAX support. The site was a major enhancement to the speed and elegance of the DocEx presentation layer, providing recently added documents, recent searches, saved searches, favorites, Oracle full-text search, and on-the-fly filtering, with thumbnails for all documents. The data was cached for paging and could show different result sets, as well as data from other database sources that matched our projects. The site could even bookmark and share views with others using information stored in the location hash.
  • Developed the thumbnail generator for most document types utilizing third-party libraries.
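The regex-based named-entity extraction described above can be illustrated with a toy version. The patterns here are deliberately simple stand-ins for the far more elaborate expressions the bullet refers to, and the entity types shown (phones, emails) are examples, not the customer's actual specification.

```python
import re

# Illustrative patterns only; real extractors need much more elaborate rules.
PHONE_RE = re.compile(r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def extract_entities(text: str) -> dict:
    """Pull phone numbers and email addresses out of converted document text."""
    return {
        "phones": PHONE_RE.findall(text),
        "emails": EMAIL_RE.findall(text),
    }
```

In the pipeline described above, the input text would come from the PDF-to-HTML converter, and the extracted entities would be stored alongside the coordinates of the text they came from.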

Confidential

Senior Software Engineer

Responsibilities:

  • Coordinated requirements, then developed, implemented, and tested a document-automation, database, and approval system for a major government CI group, allowing automatic creation and sharing of CI data from field documents. The system also incorporates DLLs and other low-level VCL components for SMTP and file objects that share highlights with other interested parties.
  • Built the database for the document-automation program utilizing MS SQL Server.
  • Coordinated requirements, then developed, implemented, and tested in Visual Studio 2005 (VB, ASP, C#, CSS) an information-sharing database and website for a large CI group that incorporates various sources using RSS feeds, email, and other cutting-edge technology.

Confidential

Software/Systems Engineer

Responsibilities:

  • Adapted, interfaced, and tested the Attila process by building/converting routes, building statistical tools, and analyzing 4D trajectories in the database and statistical tools, documenting conclusive findings.
  • Interfaced with and reverse-engineered TMA (the FAA time-management sequencer for aircraft) against AWSim (flight injector/display/analyzer) by writing a C++ application to convert trajectories to CMS messages (the FAA CTAS standard).
  • Documented the interfacing applications and the findings from the reverse-engineered TMA system.
  • Designed a system to take FAA flight-position reports, convert them to trajectories and the ICAO standard, and archive/analyze the outputs to verify the ATTILA (time-sequencing) patent.
  • Designed and implemented the arrival/departure screens for an air traffic control system, written primarily in C++ and JavaScript.
  • Designed and implemented a message-tracking system that utilized a database, ActiveX for SMTP, and socket connections to a mail server, as well as the Windows APIs for a higher level of control over the Windows operating system.
  • Set up and maintained all HTML, mail, and firewall servers on a Linux platform.
  • Interfaced with a Raytheon radar system through TCP/UDP sockets to feed data to our Avid/Kinematics systems, increasing the figure of merit (accuracy).
  • Designed a web-based, cross-platform calendar. The code is all HTML, JavaScript, Perl, and Unix shell on a Linux platform running Apache, which I set up and maintained. The managerial tools designed integrate email with the web to yield great power and convenience for accounting.
  • Upgraded the SPR system (problem logs and fixes) to allow validity checks, multiple SPR inputs, and database cross-links, on AIX and Linux platforms.
  • Coded the document repository with an HTML front end and Perl on Linux to run form requests, plus a configuration-management interface for full control of the database.
  • Assumed the Configuration Management position and aided in the QA portion of the FDPS2 project for EDS, which involved testing various C++ applications and software packages running on an AIX platform.
  • Configured the air traffic control system environment for two major clients, consisting of a strict policy and configuration of Windows 2000, Borland C++, Interpose Server, and the in-house software, saving 35% of hardware cost and about 80 man-hours of configuration time.

Confidential

Software Engineer

Responsibilities:

  • Started Confidential partnership with Rev. Larry Miller
  • Designed, debugged, packaged, and sold Complete Church Management System
  • Designed a form submission client/server model including a database, secure submission, and Public key registration done from scratch (Windows API)
  • Integrated web cams into some software products
  • Designed a biometric identification solution with a state-of-the-art interface, fingerprint scanning, cameras, and other powerful API and ActiveX hooks.
  • Designed a robust, efficient robotics object in C++ that controls stepper motors and motors with LED position sensors; the object derives from a flexible abstract class.

Confidential

Programmer/Analyst

Responsibilities:

  • Produced web-based procedures and policies for the company, providing tools to enhance the HTML conversions and speed productivity.
  • Managed a large HTML project that created flowcharts of COBOL source-code modules and linked them to the Medicare system documentation: over 1,000 pages in 4 weeks, completed using self-designed Visual Basic automation tools and the assistance of three COBOL interns.
  • Designed a system to extract Medicare bills from the mainframe, save them to the PC, and group them into series for push-button resubmission during beta testing. The program edited the test series through a GUI, resubmitted the bills through the terminal, recorded the results, and opened Excel to summarize the data in a table, reformatting it for saving and printing. A pass/fail column immediately revealed whether the results were as expected, for quick review.
