Senior Technical Architect (consultant) Resume
SUMMARY:
- Veteran Senior Technical Architect and Senior Engineer with thirty years of multi-faceted experience applying information and networking technology, with an emphasis on deployment and support across complex, hybrid, enterprise-level ecosystems. Certified Hadoop Engineer, experienced Hadoop Administrator, and practiced Big Data Architect.
- Experienced, strong, and proficient coder in PHP, Perl, C, UNIX shell scripts, CGI scripts, Python, JavaScript, SQL, and HTML. Skilled Linux design engineer, automating the creation of custom spec files and compiling from source into finished, professional RPM builds.
- Roger has maintained, upgraded, and administered multiple UNIX systems including Sun Solaris, Red Hat Linux, CentOS, AIX, Confidential -UX, and Windows Server 200X.
- Highly adept in installing, building, configuring, troubleshooting, upgrading, maintaining, and administering UNIX and Windows servers.
- Strong network infrastructure and Linux kernel knowledge provides the flexibility required to support all aspects of many different types of systems. Strong team leader, excellent mentor and trainer, and accomplished technical author.
TECHNICAL SKILLS:
Operating Systems: Red Hat ES/AS Linux 4.6/5.x/6.x, CentOS 5/6, Sun Solaris 2.6/10.0, SCO UNIX 2.1/7.1.3, Confidential -UX 10.20, AIX 4.0/5.3L, DOS 6.22, Win 9x, Windows XP, Win NT 4.0, Windows Server 2000, Windows Server 2003, VMware VIH, VMware ESX Server, F5 BIG-IP Load Balancers, Cisco PIX
Hardware: Confidential e-Series Servers, Dell PowerEdge 2650 & 6650s, Intel x86 platforms, Confidential RS6000, Sun SPARC Ultras, Sun E6500, E5500, E3000, E10k, E450, Confidential ProLiant 360, Confidential 380, Confidential SL4540
Computer Languages: LISP, Visual Basic, JavaScript, Java, C, HTML, Perl, SQL, CGI Scripts, UNIX shell scripts, Python
PROFESSIONAL EXPERIENCE:
Confidential
Senior Technical Architect (Consultant)
Responsibilities:- Serve as expert consultant, leading development of the overall big data strategy solution and technical roadmap deliverable that encompasses and unifies various groups within a large financial customer, combining real-time, near-real-time, streaming, and batch processing into a single, comprehensive, integrated platform that performs fraud detection, BIN profiling, customer authentication, implication scoring, macro fraud-pattern matching, and other data-product-driven applications.
- Provide a new HBase cluster layout, cluster division and partitioning, and hardware and software recommendations; construct a framework calculation tool; and develop strategies for Cloudera Enterprise CDH 5.5 Resource Pools, Static Service Pools, and both traditional HDFS and Sentry (Cloudera) controlled multi-tenancy and group separation.
- Align the technical requirements with the business goals and objectives of all customer groups and stakeholders involved. Deliver effective, professional presentations of the data strategy message to the customer's fraud group senior management and overall company senior management. Successfully field all questions concerning the technical solution, business strategy, cost economics, and the overall planned schedule and roadmap.
Environment: RHEL6 x86 64, CentOS 6 x86 64, Hadoop 2.6, Cloudera CDH 5.5, Hortonworks HDP 2.5, HBase, Solr, Apache Nifi, Phoenix, Impala, R Studio, Tableau Desktop, MS Power BI, Hive 2, Yarn RM, Spark SQL, AWS, Kafka
Confidential
Senior Big Data Architect
Responsibilities:- Serve as expert, engineering team lead (over six people), and Big Data SME for the BDAaaS group within the AMCOE data center, providing team guidance and direction around administration, development, business use cases, hardware recommendation and selection, best practices, industry standards, performance tuning and monitoring, and security standards and policies regarding Confidential Big Data environments. Standardize and automate infrastructure and the build-out of new clusters on AWS.
- Provide high-level and detailed design plans and documentation for Hortonworks HDP 2.4 Hadoop environments, Cloudera CDH 5.3, 5.7, and 5.8 environments, and other Hadoop distro clusters, along with network diagrams and layouts and logical and physical architectural documentation. Assess and review new Big Data business intelligence software tools and vendors. Research and analyze Big Data industry trends and patterns, security, new vendors, and new open source software solutions; write internal white papers; and provide peer review of overall design and Big Data environment stability.
- Lead the team in standardizing installation and configuration of BDAaaS environment build-outs, providing seamless automation via custom Puppet modules, Chef recipes, and/or Python scripts as required. Construct and drive project schedules, timelines, and project lifecycles to acceptable levels of performance and closure. Train junior team members in Hadoop core competencies and fundamentals. Assist Hadoop developers in the ETL process; responsible for data ingestion orchestration, data import/export, data governance, and data protection and security policies and implementations. Hands-on Hadoop administration and Java application troubleshooting, regular environment updating and patching, environment maintenance scheduling, and migration planning. Architect, lay out, design, and build various clustered environments on AWS, AWS GovCloud, and Azure.
Environment: RHEL6 x86 64, CentOS6 x86 64, AWS, AWS GovCloud, Azure, Windows Server 2012r2, Jboss 7.0, Hadoop 2.6, Hortonworks HDP 2.4, Eclipse, SVN, Cloudera CDH 5.1, Platfora, Nifi, Phoenix, HAWQ, R Studio, Microsoft R Server, Tableau Server Cluster, Tableau Desktop, MS Power BI, Hive DB, Yarn RM, Spark Thrift Server, Ambari 2.2
Confidential
Senior Big Data Solutions Architect
Responsibilities:- Engineering lead for the Big Data group and architect for the Big Data implementation of the ‘Hadoop-As-A-Service’ Big Data managed software offering. Design and implement reference architecture for Big Data infrastructure, capacity planning, and network topology and design around Hadoop cluster layouts. Perform extensive testing and benchmarking of Hadoop ISVs such as Hortonworks, MapR, Cloudera, and Pivotal. Created and built the ‘hadoop-toolkit’ generic big data package of performance tuning, monitoring automation, and quality assurance tools for any Hadoop cluster.
- Guide customers through their POC (Proof of Concept) Hadoop clusters, working with Sqoop and Flume for data ingestion and Kerberos configuration; research and develop various ‘use cases’; and aid in the use and operation of BI tools, predictive data modeling and data analytics, and integration platform software such as Talend, Datameer, Pentaho, Tableau, MicroStrategy, etc.
- Utilize Puppet automation software; configure Puppet masters and clients with custom Puppet modules and classes for infrastructure configuration centralization, management, and server rule enforcement. Designed and created the kickstart image for replication and automation of the installation and configuration of Hadoop cluster nodes.
- Architect for multiple SaaS offerings: Apache HTTP Server, PHP, Tomcat, Jboss, MySQL, PostgreSQL, Repliweb, and Hadoop HDFS. Recent Hadoop work has included automating the installation and configuration of multi-node clusters, the HDFS daemons, and the MapReduce daemons: the NameNode, DataNode, JobTracker, and TaskTracker components. Construct basic Python scripts for manipulation and monitoring of MapReduce data. Develop customer use cases and implement predictive analytics planning using Hunk around customer vehicle fleet data for more accurate vehicle maintenance schedules.
Environment: Sun Solaris 10, Red Hat Enterprise Linux AS 5.8/6.3, VMware Server and ESX 3.5 Server, Windows Server 2003r2, Apache 2.4, Hunk 6.0, MySQL 5.6.18, PHP 5.4, Jboss 6.1, Tomcat 8.0, Java SE Development Kit 7u61, Hadoop 2.3.0r, CDH 4.8, CDH 5.1
Confidential
Senior Big Data Solution Consultant
Responsibilities:- Create the roadmap and timeline, architect the solution, and perform capacity planning for a 10-server email migration from a FreeBSD UNIX sendmail/popper environment to a RHEL 6 x86 64 sendmail/dovecot environment with enhanced security, 1600% improved performance, and an improved mail account provisioning process. Project completed in under 6 weeks, with zero lost email messages in a 150,000-user environment with high-volume email traffic. Build and install a Chef automation server; configure cookbooks, etc., for centralized management and container configuration for role and environment designation.
- Completed environmental assessment and study, lab environment build-out and production server clones, and professional reports and statement of work documentation. Work onsite with client personnel and engineers; set up the kickstart environment and RAID configuration, install the RHEL6 operating system, and install and configure the mail infrastructure.
- Installation of Splunk 5.0 forwarders on the new mail servers and configuration against the existing Splunk central server. Configure the inputs.conf file to send mail logs to the Splunk central server. User management, and configuration of the Splunk server to integrate with Active Directory servers for single sign-on. Create simple Splunk dashboards to identify likely spammers and phishers, assess other threats, detect threat patterns, and tighten mail security measures.
- Big data environmental assessment and analysis for large clients; recommend solutions and provide options and paths for future strategy. BI tool and integration platform recommendations delivered in formal presentations to customer management. Build out a 10-node Cloudera cluster for a customer POC, execute MapReduce jobs, and develop a predictive analytics solutions model.
Environment: FreeBSD 5.5, Dell PowerEdge servers, CISCO routers, Barracuda Mail Filtering Devices, Red Hat Enterprise Linux 6.5 (x86 64), Splunk 5.0, Cloudera CDH 4.1, Datameer, Tableau, Mulesoft ESB, Talend, SFDC, Box.net, Sendmail, Dovecot, PieceTracker, MS Dynamics AX, VMware
Confidential
Senior Consultant (Contractor)
Responsibilities:- Create documentation for HPSA satellite server build-outs; gather project information requirements; install and configure a hardened RHEL6 x86 64 operating system image. Installation and configuration of Confidential satellite software. Generate network diagrams and documentation for Confidential satellite server connectivity into the Confidential core mesh. Network troubleshooting and debugging.
- Develop custom bash scripts to automate manual server build-out processes, reducing server build time from 2-3 months to 3-4 days, eliminating human input error, reducing data input duplication, automating data modeling, and improving build accuracy.
- Mentor and train junior team members in Linux system administration fundamentals and advanced concepts and configurations.
Environment: MS Office, Dell PowerEdge servers, Red Hat Enterprise Linux 5.5 & 6.5 (x86 64), VMware, Bash shell scripting, Bourne shell scripting, Python scripting, HPSA, HPOO
Confidential
Senior Solution Engineer (Contractor)
Responsibilities:- Created the timeline, project workload projection, and schedule for a 3,600-server migration. Implement capacity planning, environment hardening, and network architectural design and expansion. Installation of Solaris 10 on Sun SPARC T-2000 servers; installation and configuration of LDOMs, Solaris zones, LDAP clients, DNS, jumpstart, and other Solaris infrastructure.
- Migrate the customer UNIX environment from the legacy UNIX/Apache/MySQL source to the RHEL6/Apache/Oracle target environment; designed and documented reusable, flexible maintenance procedures and processes for the overall scope and length of the project.
Environment: Sun SPARC T-2000 Servers, RHEL6 x86 64, Bash shell scripting, Apache HTTP 2.2 Web Server, MySQL Clusters, PHP 2.4, Oracle RAC
Confidential
Senior UNIX Solution Engineer
Responsibilities:- Team lead for the Tomcat, Jboss, Apache HTTP Server, MySQL Server, and PHP product lines. Conduct extensive research, set standards, and research and develop best practices and a Tomcat suite of management tools for installs, deployment methods, and automation at large scale on Savvis standard cloud platforms. Build RPMs for the Savvis Tomcat HPSA Opsware policies, creating software toolkits and advanced customized tools in Perl and bash shell scripts.
- Perform extensive benchmarking of Apache Hadoop clusters on various hardware platforms. Analyze and evaluate various Apache Hadoop software vendors for best performance. Build out small 7 and 10 node Cloudera clusters for customer POCs as needed.
- Execute a proof-of-concept project for a customer's CRM database, where implementation involved data ingestion, ETL, data sorting, and visualization of data through dashboards to logically determine which predictive analytics algorithm to select and which combined predictors would be the final key performance indicators for improving customer retention.
Environment: Sun Solaris 10, Red Hat Enterprise Linux AS 4.7/5.3, VMware Server, and ESX 3.5 Server, Windows Server 2003r2, Windows XP, Apache 2.2, MySQL 5.0.45, PHP 5.3.6, Jboss 5.1, Tomcat 6.0, Java SE Development Kit 6u18, Hadoop 1.0.3r
Confidential
Senior UNIX/Linux System Engineer Project Team Lead
Responsibilities:- Systems engineering team lead on a $100 million, 500-website migration project for a key corporate customer. Charged with initial setup, system and network configuration, kernel tuning, RPM management, DNS server/client and sendmail setup, basic system configuration, security configuration and policies, user account management, setup of ssh, sftp, and snmp trap configs, and quality assurance of some 87 RHEL AS 3.0 & 4.0 servers and 22 Windows 2003 servers; concluded initial work within 19 days.
- Installation of Websphere 6.0.2, Oracle 10g (Oracle RAC servers), IHS, Wily Enterprise Manager, Wily Agent, Fortress, and SOA Management software, with client observation and parameter specifics given via WebEx and teleconference. Principal engineer for troubleshooting and technical assistance, and go-to Linux person during multi-server installs and deployments. The undertaking included WebEx installs with the client for durations of 3-6 hours or more, under high-pressure deadlines and in situations where remaining professional and technically adroit was essential to company representation.
- Hardware configuration and troubleshooting for Linux and Windows 2003 servers on the Egenera BladeFrame and on standalone servers. Windows tasks included installation of applications, configuration of services, user and group administration, system security, disk management, NetBackup configuration, device management, and system resource allocation. Some administration of IIS and troubleshooting of Windows-based websites.
- Base installation and configuration of Confidential p650 & p570 series AIX 5.3 servers, configured and managed network, services, user and group setup, security, lvm and disk management, system monitoring and performance tuning. Installed and configured DB2 Universal Database v9.1, DB2 FixPaks and Information Management Tools for DB2.
Environment: Sun Solaris 2.6/8.0/9.0/10.0, Red Hat Enterprise Linux AS 3.0/4.0, Windows Server 2003, Windows XP, Apache 1.3, Confidential HTTP Server (IHS), Confidential Websphere 6.0.2, Samba 3.0, NFSv4, Wily Enterprise Manager, SOA Gateway, Oracle 10g RAC, Splunk 3.1, Sendmail, DNS, F5 BIG-IP, Veritas NetBackup, Egenera BladeFrame, AIX 5.3L
Confidential
Senior UNIX Solution Consultant
Responsibilities:- Responsible, as part of the tier 2 support team, for remote monitoring and initial analysis of critical alarms for software and hardware issues on 1,100 Solaris and Linux servers across the country, through a Remedy ticketing system implementing remote fault management. Troubleshoot critical system, network, and application failures; identify and prioritize SLA outages; restore services; and escalate issues to the Confidential &T Technical Development team with detailed conclusions.
- Routinely perform “System Maintenance Operation Procedures” for all manner of Sun UNIX business-class servers and applications. This delicate and precise work was always on a strict, critical timeline, crucial to core business and to Confidential &T’s guarantee of 99.999% network availability. Developed shell scripts to automate work, provide instant feedback, verify data integrity, and, if need be, successfully back out a work procedure with the least impact to services.
- Initial system analysis and software package checks. System program execution and process queue monitoring on Sun servers, CPU utilization monitoring, and performance tuning and application analysis for predictive system bottlenecks. Tier 2 support to diagnose and document network response, desktop PC issues, and server troubleshooting. Check and maintain server load balancing for all Solaris and Linux production servers. Configured Domain Name Service (DNS), Network File System (NFS), Apache Web Server, LDAP, and Netscape web servers.
- Promoted to shift lead within 12 months; duties included overseeing a group of 5 to 6 event managers’ daily assignments and assisting and troubleshooting wherever necessary. Organized and directed assignments for daily and long-term projects, created the monthly schedule, and delivered timely, concise reports with outage metrics for outages, degradations, and SLA events to Confidential &T management.
Environment: Sun Solaris 2.6/7.0/8.0/9.0, Red Hat Linux 6.0/7.0/8.0, Apache, Samba, DNS, NFS, NIS, SAN, Maillenium 2000, Sun Mgmt Console, Veritas Volume Manager, Veritas Netbackup, Veritas Cluster Services, CISCO & FOUNDRY routers/switches, Sun StorEDGE D1000/A1000, A5x00