
Senior Software Engineer - Project Lead Resume


Grand Rapids, MI

SUMMARY:

  • Highly accomplished, polyglot, and experienced Sr. Software/Data Applications Architect/Engineer and Team Lead with broad experience in several areas within the Computer Information Systems and Programming arenas, including systems analysis, software engineering, cloud development/architecture, hardware engineering, systems programming and automation (installs and configurations of servers of various flavors and networking nodes using Chef, Puppet, Ansible, CFEngine, and others), as well as database design, visualization UIs, real-time data stream analytics, and automation management.
  • Developed and configured distributed computing systems for “Big Data” projects, including HDFS and MapR file systems.
  • Ingested data into the Indie Data Lake using an open-source Hadoop distribution to process structured, semi-structured, and unstructured datasets, using open-source Apache tools such as Flume and Sqoop into the Hive environment (on many occasions using the IBM BigInsights v4.1 platform). Developed Kafka producers and consumers, HBase clients, Spark and Hadoop MapReduce jobs, and components on HDFS and Hive; a minimal producer/consumer sketch appears at the end of this summary.
  • Automated and scheduled Sqoop jobs in a timely manner using custom shell scripts.
  • Integrated Apache Storm with Kafka to perform web analytics; uploaded clickstream data from Kafka to HDFS, HBase, and Hive through the Storm integration.
  • Designed and coded from specifications; analyzed, evaluated, tested, debugged, documented, and implemented complex software applications.
  • Developed Sqoop Scripts to extract data from Oracle source databases onto HDFS.
  • Worked on tuning Hive and Pig to improve performance and resolved performance issues in both sets of scripts, with an understanding of joins, grouping, and aggregation and how they translate to MapReduce jobs.
  • Created (virtual) partitions and buckets based on State for further processing using bucket-based Hive joins.
  • Architected a Business Intelligence application in AWS using VPC, S3, Route 53, Auto-scaling, Elastic Beanstalk, SQS, SNS, RDS MySQL
  • Played a key role in evaluating, establishing and conducting proof of concepts of various API management products like 3-Scale, Nginx proxy, WSO2 API Manager
  • Built and led a geographically distributed team responsible for building a private OpenStack-based cloud used by various services such as DBaaS (Database as a Service) and LBaaS (Load Balancers as a Service).
  • Architected a backend HA database cluster and built the automation for the required virtual infrastructure, such as Galera (Percona XtraDB Cluster), using Chef.
  • Developed full automation of the build-out of a 3-AZ (availability zone) OpenStack cloud (on hardware) using SaltStack (Python).
  • Experience in P2V and V2V migration using VMware Converter / PlateSpin 6.8.x; creating host and client VM templates and cloning.
  • VMware: VMware vSphere 3.5/4.1/5.0, Workstation 6/7, ESX Server 3.5, VMware DRS, HA and FT, VMware Standalone Converter 3.0, VMware Update Manager 1, VMware Virtual Desktop Infrastructure (VDI), VMware View
  • Sound knowledge of Hyper-V, ESX, and ESXi architecture, guest OS installation, setting up of VM priorities, cloning, and snapshots
  • Minimized application changes and concealed Cloud Storage details from applications with a POSIX-like file system for improved efficiency
  • Led a team in using libcurl and Apache to communicate via a scalable architecture with public Cloud Storage supporting Amazon S3, Microsoft Azure, Google Cloud Services, and OpenStack (custom builds) Cloud Services
  • Sound understanding of and experience utilizing SDN (software-defined networking) and NFV (network function virtualization) techniques implemented in Cloud-based infrastructures
  • Authored numerous audit implementation plans prior to supervising information automation audits.
  • Ensured best practices for information automation and configuration management systems utilizing Opscode Chef while moving the organization from legacy processes to a full implementation of DevOps
  • Managed creation of high-profile HATP (High Availability Transaction Processing) solution, supervising developments across multiple locations
  • Developed highly efficient security audit management software application to manage software upgrades deployed through ATMs and desktop systems worldwide
  • Architected and implemented automated cloud infrastructure using Puppet, Pallet, and Juju for a custom private Cloud overlaid directly on bare metal using MAAS
  • Created detailed insight into all aspects of business operations through painstaking integration of Graphite, Logstash, Sensu, and Chef
  • Extended an existing Puppet implementation for a logistics & operations start-up (a spin-off) to enable seamless full-infrastructure provisioning for site redundancy and staging/development environments
  • Designed and implemented rigorous security checks on data in transit on Cloud infrastructure utilizing Fabric and custom Python scripting
  • Implemented data binding, serialization, and parsing techniques across major database types and schemas, both server- and Cloud-based (Oracle 11g, SQLite, SQL Server Express, Azure SQL, Amazon EC2, and RDS for SQL), in support of software architecture and system analytics
  • Significant knowledge of and experience with security and firewall practices. Knowledge of SSL certificates and digital signature technology, including online/mobile payment gateways and structures, as well as APIs, Web APIs, and API development
  • Significant knowledge of and experience with QA methodologies and unit testing (TDD/BDD). Followed Unified Modeling Language (UML) methodology using Requisite Pro and Rational Rose to create/maintain Use Cases, Activity Diagrams, Sequence Diagrams, and Collaboration Diagrams. Wrote and executed JUnit, NUnit, JSUnit, XUnit, and mlUnit unit tests using both automated and manual methods.
  • Significant knowledge of and experience with infrastructure deployment automation utilizing Chef, Puppet, Salt, and Ansible. Highly experienced with monitoring applications such as Nagios and New Relic, and have scripted custom solutions in this area.
  • Knowledge of and experience with implementing Accessibility/WCAG 2.0 compliance
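
Code sketch (Kafka producers/consumers, referenced in the summary above): a minimal illustration in Python using the open-source kafka-python client. The broker address, topic name, and payload fields are assumptions made for this example only and are not drawn from any specific engagement.

    # Minimal Kafka producer/consumer sketch using the kafka-python client.
    # Broker address, topic name, and payload fields are illustrative assumptions.
    import json
    from kafka import KafkaProducer, KafkaConsumer

    BROKERS = ["localhost:9092"]      # assumed broker list
    TOPIC = "clickstream-events"      # hypothetical topic name

    # Producer: serialize dicts to JSON and publish them to the topic.
    producer = KafkaProducer(
        bootstrap_servers=BROKERS,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"user_id": 42, "page": "/home"})
    producer.flush()

    # Consumer: read the topic from the beginning and deserialize each record.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKERS,
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for record in consumer:
        print(record.value)           # e.g. {'user_id': 42, 'page': '/home'}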

TECHNICAL SKILLS:

Languages/Programming: Java, J2EE, Scala, C, C++, C#, Ruby, Python, R, JavaScript, TypeScript, CoffeeScript, T-SQL, PL/SQL, PHP, SQL

Languages/Mark Up & Scripting: Awk, Bash, XML, HTML, HTML5, CSS, JSON, Ksh, PowerShell, CGI, Perl

Systems & OS: AIX, Linux (various flavors: Ubuntu, Red Hat, Fedora, Debian), BSD, UNIX (Solaris/HP-UX), Windows XP/7/8, Windows NT 4.0, Windows Server, SQL Server

Networking: TCP/IP, SFTP, FTP, SMTP, DHCP, sockets, and other necessary protocols, ports, and datagrams; network architecture design and modelling; network topology and information abstractions; network function virtualization (NFV) and software-defined networking (SDN)

Databases: Oracle 11g - 12c, Windows SQL Server, MySQL, PostgreSQL, SQLite, DB2, NoSQL (MongoDB, Cassandra, CouchDB)

Data Caching Systems: Redis - NoSQL caching solution; CSQL Cache - caches tables from MySQL, Postgres, and Oracle; Memcached - caches query result sets; Windows Azure Caching - caches query result sets in Windows Azure; TimesTen - caches Oracle tables; SafePeak - automated caching of result sets of queries and procedures from SQL Server, with automated cache eviction for full data correctness; AppFabric Caching - caches query result sets in distributed computing systems
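
A minimal cache-aside sketch in Python against the open-source redis-py client illustrates the pattern behind several of the tools above; the connection details, key scheme, TTL, and the load_user_from_db() helper are hypothetical placeholders.

    # Minimal cache-aside sketch using the redis-py client.
    # Connection details, key scheme, TTL, and load_user_from_db() are hypothetical.
    import json
    import redis

    cache = redis.Redis(host="localhost", port=6379, db=0)

    def load_user_from_db(user_id):
        # Placeholder for the real database lookup.
        return {"id": user_id, "name": "example"}

    def get_user(user_id, ttl_seconds=300):
        key = f"user:{user_id}"
        cached = cache.get(key)
        if cached is not None:
            return json.loads(cached)          # cache hit
        user = load_user_from_db(user_id)      # cache miss: fall back to the source
        cache.setex(key, ttl_seconds, json.dumps(user))
        return user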

Data Analysis: Consulted with business partners and made recommendations to improve the effectiveness of Big Data systems, descriptive analytics systems, and prescriptive analytics systems. Integrated new tools and developed technology frameworks/prototypes to accelerate the data integration process and empower the deployment of predictive analytics. Working knowledge of machine learning and/or predictive modeling.

WORK EXPERIENCE:

Confidential, Grand Rapids, MI

Senior Software Engineer - Project Lead

Responsibilities:

  • As the Sr. Team Lead and Architect/Technologist, I was tasked with liaison responsibilities with the client’s staff, including non-technical management, client technical personnel, DevOps, Project Managers, Database Analysts, DBAs, Development Team members, and other client-facing stakeholders. In this role I served as the critical hinge between stakeholders of all stripes and our two Dev Teams: a Windows Development Environment Team and a Linux/Unix Development Team, each consisting of 7 members including auxiliary personnel and Software Development Engineers.
  • Responsible for implementing Project Research into scope, software debt, application requirements and resources including educating product owners about technical debt
  • Lead daily team Scrum meetings, ensure alignment of project summaries, use cases, and stories, and communicate product backlog status to all teams
  • Model Agile development methodologies for the client’s Dev teams and make the pair-programming assignments for the day’s sprint
  • Prepare project status reports for stakeholders and for communications with our back-office staff
  • Serve as an information radiator, providing complete visibility to all stakeholders and project members up and down the stack

Confidential, Chicago, IL

Senior Lead Software Developer

Responsibilities:

  • Lead a team of developers towards successful project accomplishments
  • Coach, monitor, and train teammates
  • Establish good practices and produce high-quality software
  • Develop product details, project schedule and development criteria in compliance with quality standards
  • Design custom components for data integration and encryption in the software
  • Perform user acceptance tests, debugging and alterations as per the clients' convenience
  • Perform on-site installation and train the users on advanced troubleshooting
  • Project Management accomplished on Microsoft Project utilized in managing multiple simultaneous projects with teams in several locations throughout the U.S.
  • Version control via SVN, Mercurial, Git, and Veracity (which includes bug tracking and Agile software development tools integrated with the version control features)
  • Design user manuals, user guides, FAQ's guide, detailed product specifications and project reports
  • Provide secure off-site maintenance through remote access to the server
  • Conduct performance reviews and updates as per business requirement
  • Work closely with software developers, software architects and business analysts to plan, design, test, develop and maintain business applications for web, mobile (cross platform) and desktop
  • Plan, design, and implement application database code objects, such as stored procedures and views.
  • Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Provide database coding to support business applications using T-SQL or PL/SQL (a minimal sketch of this access pattern appears after this list)
  • Perform quality assurance and testing of SQL server environments
  • Develop new processes to facilitate import and normalization, including data files for counterparties.
  • Work with business stakeholders, application developers, and production teams across functional units to identify business needs and discuss solution options.
  • Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.
  • Identified data issues and provided recommendations for resolution to ensure optimal performance.
  • Developed, executed, and maintained disaster recovery plans for all data services within a statewide construction services organization, and in the process overhauled an ancient COBOL-based data retrieval mechanism.
  • Documented and maintained database system specifications, diagrams, and connectivity charts.
  • Helped create process logging and new monitoring tools, integrity reports, and mapping tools
  • Liaised with project management and development teams to identify and implement reporting, control, and automation opportunities to improve overall access to information, including real-time analytics formatted in a user-friendly manner allowing for immediate and impactful outcomes.
  • Developed new processes and procedures to ensure data integration and data conversion activities.
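
Code sketch (parameterized queries and stored-procedure calls, referenced in the list above): a minimal illustration in Python using the pyodbc driver against SQL Server. The connection string, table, columns, and the dbo.usp_GetOrdersByCounterparty procedure are hypothetical placeholders, not objects from the engagement described here.

    # Minimal sketch of a parameterized query and a stored-procedure call via pyodbc.
    # Connection string, table, columns, and procedure name are hypothetical.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=SampleDb;Trusted_Connection=yes;"
    )

    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()

        # Parameterized query: values are bound as parameters, never string-formatted.
        cursor.execute(
            "SELECT order_id, amount FROM dbo.Orders WHERE counterparty_id = ?",
            (1001,),
        )
        for order_id, amount in cursor.fetchall():
            print(order_id, amount)

        # Stored-procedure call using ODBC call syntax.
        cursor.execute("{CALL dbo.usp_GetOrdersByCounterparty (?)}", (1001,))
        rows = cursor.fetchall()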
