
Analyst/Programmer Resume


Collegeville, PA

SUMMARY:

  • Data analyst who most recently developed and managed Spotfire visualizations of data for a variety of clinical trial studies.
  • Experienced in both standard and customized views across a variety of therapeutic areas: Oncology, Respiratory, Infectious Diseases, etc.
  • Developed new Oncology visualization templates, from large study-driven requirements through routine data refresh management.
  • Developed new visual comparison techniques to enhance recognition of trends as data accumulated.
  • SAS and Spotfire programming routinely used.
  • Data analyst, programmer, and Capacity & Performance engineer with significant experience in data monitoring, data analysis, system simulation, and predictive modeling supporting various types of hardware and software systems at all levels (architecture, design, and modeling) across development, test, and production. Areas: national telecommunications networks, computer networks; insurance, banking, investment banking.
  • Large computer system capacity planning and modeling, industrial and academic scientific software development in integrated circuit Computer-Aided-Design and Test, hardware and software test engineering.
  • Background: Experimental High Energy Physics research (PhD, IIT/Fermilab); post-docs: Purdue/SLAC, OU/Fermilab.

TECHNICAL SKILLS:

  • Machine Learning / Python ML ecosystem: analyzing data, implementing algorithms, and analyzing and presenting results
  • Amazon Web Services - AWS Solution “Full Course” Video
  • Spotfire, SAS
  • Data science: Machine Learning (2015), Octave/MATLAB
  • R, Python, Tableau, KNIME
  • Application Performance Management (APM)
  • Response time budget data gathering and application requirements; UML diagramming
  • Shunra, CiRBA, DynaTrace
  • Custom monitoring tools (DNT/techinfo, etc., Confidential)
  • SQL Developer, Oracle DB, SharePoint, XML, Crystal Reports, SiteScope, Windows Resources, ODBC, Hyperformix Heatmaps, Wireshark (Ethereal)
  • ExtendSim (by Imagine That), Java, HTML/JavaScript, Dreamweaver, ColdFusion, Flash, NMON on AIX
  • Automatic Test Equipment (ATE) hardware/software: Advantest, Agilent, Teradyne, Credence, Schlumberger
  • CMOS IC Test, Technology and IC Design and Test CAD Tool Development
  • CAD/CAE tools: Verilog, Synopsys, LogicVision Embedded Test, VHDL

PROFESSIONAL EXPERIENCE:

Confidential, Collegeville, PA

Analyst/Programmer

  • Numerous SAS program and Spotfire code changes were required under the rules and regulations we established; I specified the coding requirements and verified/analyzed the work once the offshore programming staff completed it.
  • I created and presented short Spotfire demonstrations and tutorials to acquaint new users with the data visualizations and the navigation of the tools; the presentations were well received by the clinical study teams.
  • I advised the clinical study teams regarding the data visualizations and the trends (or lack thereof) in the data as it accumulated over time. This was routine work done whenever the study teams had their data refreshed.
  • Differences between successive data sets are of special interest to me. I investigated ways to compare visual and textual differences between datasets and study workflows. One simple method I developed (the ‘blink test’) lent itself well to routine Spotfire refresh reviews; it was in use for two years and caught errors not otherwise found (a simplified illustration of this kind of dataset comparison follows this list).
  • Custom visualizations were occasionally requested and were either adapted from existing work or built from the ground up; many special graphs went through several design iterations before being finalized.
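
  (Illustrative sketch, not the original work: the refresh comparisons above were done in SAS and Spotfire. The Python snippet below only shows the general idea of diffing two successive data extracts; the file names, the key column "SUBJID", and the use of pandas are assumptions for the example.)

    import pandas as pd

    # Sketch only: compare two successive data refreshes and report rows that
    # were added, removed, or changed. File and column names are hypothetical.
    def diff_refreshes(prev_path, curr_path, key="SUBJID"):
        prev = pd.read_csv(prev_path).set_index(key).sort_index()
        curr = pd.read_csv(curr_path).set_index(key).sort_index()

        added = curr.index.difference(prev.index)
        removed = prev.index.difference(curr.index)
        common = curr.index.intersection(prev.index)

        # Cell-level differences restricted to shared keys and columns
        shared_cols = prev.columns.intersection(curr.columns)
        changed = prev.loc[common, shared_cols].compare(curr.loc[common, shared_cols])

        print(f"added: {len(added)}, removed: {len(removed)}, changed rows: {len(changed)}")
        return changed

    # Example (hypothetical files):
    # diff_refreshes("refresh_2020_01.csv", "refresh_2020_02.csv")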

Confidential, Philadelphia, PA

Analyst/Programmer

  • Worked in a dynamic, fast-paced, creative group of smart, talented, and energetic analysts responsible for enterprise-level forecasting, network analytics, and capacity projects for Confidential on High Speed Data networks, Video-On-Demand (VOD), and related technologies (e.g., the X1 Xfinity system).
  • Performed detail-oriented statistical analysis of large database tables using SQL (Vertica), Spotfire (TIBCO), R, and Tableau, with fast turnaround achieved through scripting/programming and accurate, reliable mathematical analyses. The required skills were rapidly shifting toward Data Science / Big Data technologies.
  • The analyses and models used national, regional, local, and individual device-level data, sliced at appropriate time intervals (monthly, weekly, daily, diurnal/hourly, 15-minute, etc.), to drive predictive modeling and forecast reporting tools (a simplified illustration follows this list).
  • The work involved both current CDN and next-generation IPCDN analytics, following a fast-paced, daily Scrum-based Agile methodology.
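
  (Illustrative sketch: the actual analyses used SQL on Vertica, Spotfire, R, and Tableau. The Python snippet below only sketches an interval-based trend projection of the kind described; the CSV input and the "ts"/"util_pct" column names are hypothetical.)

    import numpy as np
    import pandas as pd

    # Sketch only: aggregate raw device utilization into daily peaks and fit a
    # simple linear trend to project future utilization. Input is hypothetical.
    def project_peak_utilization(path, horizon_days=90):
        df = pd.read_csv(path, parse_dates=["ts"])
        daily_peak = df.set_index("ts")["util_pct"].resample("D").max().dropna()

        # Ordinary least-squares trend on day index vs. daily peak utilization
        x = np.arange(len(daily_peak))
        slope, intercept = np.polyfit(x, daily_peak.to_numpy(), 1)

        # Project the trend 'horizon_days' beyond the last observed day
        return float(intercept + slope * (len(daily_peak) - 1 + horizon_days))

    # Example (hypothetical file):
    # project_peak_utilization("cmts_port_util.csv", horizon_days=180)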

Confidential, Charlotte, NC

Hyperformix Tools Application Manager

  • Managed and applied CA/Hyperformix Capacity Management monitoring tools in a large enterprise production environment, including current capacity monitoring and predictive modeling for eCommerce/Online Banking, Banking Centers, Call/Contact Centers, mortgage, insurance, and infrastructure.
  • Developed numerous analytical and discrete event simulation models applied to application initiatives; the models covered architectural design support, performance test analysis, and production capacity planning.
  • Advised and aided in performance test and production triage situations and contributed to the development of monitoring and modeling plans.

Confidential, Charlotte, NC

Lead Hyperformix Modeler

  • Lead modeler/analyst for the performance architecture team. Modeled and simulated systems of various sizes, from single servers to large multi-tier physical and virtual servers, across new and existing initiatives.
  • Expert in Hyperformix Optimizer, Capacity Manager, and Data Manager, used in a variety of approaches from early architectural development models through performance test-driven models built on LoadRunner performance test data. Forecast models yielded projections indicating the need for significant new hardware build-outs. Note: I have no professional experience running LoadRunner or other test tools.
  • Developed error analyses and provided confidence measures quantifying the correctness of forecasts (a simplified illustration follows this list).
  • Knowing “how well you measured” was well received by the growing performance architecture team as an aid in communicating accuracy to business partners.
  • Developed response time budget models as a member of the performance architecture team. These were combined with workload volumes in a newly developing support environment targeting performance modeling for very large and complex systems.
  • Identified long-running service calls and inconsistencies in response time reports to streamline and increase accuracy of new budget development processes.
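
  (Illustrative sketch: the forecast error and confidence measures above were produced within the Hyperformix tool chain; the Python snippet below only shows the general idea, and the utilization numbers are invented placeholders.)

    import numpy as np
    from scipy import stats

    # Sketch only: compare model forecasts against measured values and report the
    # mean percentage error with a confidence interval (default 95%). Data are
    # placeholders.
    def forecast_confidence(forecast, measured, level=0.95):
        pct_error = 100.0 * (forecast - measured) / measured
        mean_err = pct_error.mean()
        sem = stats.sem(pct_error)  # standard error of the mean
        ci = stats.t.interval(level, len(pct_error) - 1, loc=mean_err, scale=sem)
        return mean_err, ci

    forecast = np.array([52.0, 61.0, 70.0, 78.0])  # modeled CPU utilization (%)
    measured = np.array([50.0, 63.0, 68.0, 80.0])  # observed CPU utilization (%)
    print(forecast_confidence(forecast, measured))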

Confidential, New York City, NY

Lead Hyperformix Engineering Consultant

  • Forecasted merged environment system utilization using Hyperformix Performance Optimizer for all client-facing and internal applications distributed enterprise-wide.
  • Interfaced and coordinated with the local onshore test team. Tests were performed by a large on/offshore testing team.
  • Developed new procedures for measuring and applying background subtraction to increase modeling accuracy (a simplified sketch follows this list).
  • Introduced variability/error analysis into the modeling process to generate statistical confidence levels.
  • Validated complex models with a cross-model methodology in collaboration with Hyperformix.
  • Designed experiments yielding a reliable VMware modeling methodology.
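
  (Illustrative sketch: the real background-subtraction procedures were built around Hyperformix Performance Optimizer measurements; the Python snippet below only shows the idea, and the utilization values are invented.)

    import numpy as np

    # Sketch only: subtract an idle-period baseline from utilization measured
    # during a test window, so that only the remaining load is attributed to
    # the application under test. Values are placeholders.
    def background_subtract(test_util, idle_util):
        baseline = np.median(idle_util)  # robust estimate of background load
        corrected = np.clip(test_util - baseline, 0.0, None)
        return float(corrected.mean())

    idle_util = np.array([4.8, 5.1, 5.0, 4.9])      # CPU % while the system is idle
    test_util = np.array([37.0, 41.5, 39.2, 40.1])  # CPU % during the load test
    print(background_subtract(test_util, idle_util))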

Confidential, Bloomington, IL

Performance Engineering Consultant

  • Performed capacity planning for ongoing, evolving systems, as well as test analysis and modeling for new systems under development in various environments.
  • Developed new data collection processes and innovative analyses applied to: Windows mid-tier systems and clusters, web-farms, Unix (AIX) servers and Linux clusters, and IBM z/OS mainframe hosts and subsystems.
  • End User Computing (EUC) capacity planning.
  • Applied error analysis measuring goodness-of-results and provided confidence levels qualifying deliverables to business partners.
  • Stressed good communication with multiple audiences.

Confidential, King of Prussia, PA

Lead Independent Analyst & Q/A Software Tester

  • Validated, verified, and critiqued a large Monte Carlo simulation system for Confidential launch vehicle safety analysis: input data, simulation models, source code vs. requirements, and the design and code of this critical mission simulation software system (Confidential to Pluto).
  • Identified weaknesses and suggested improvements to both input data and mathematical models. Accuracy improvement estimated to be ~20%.
  • Provided Quality Assurance software tests at the unit, integration, system, and acceptance levels, as well as a new regression testing system. Tools: Visual Studio/Fortran-2000, chosen for the fastest-executing simulation code on Intel-based PCs.

Confidential, Allentown, PA

Software Development Engineer

  • Created user interface: interactive waveform graphics, data entry systems, command language, interfaced CAD data extractors, online help and documentation (TPG2, Fortran-77).
  • Designed and implemented MOTIF-standard Graphical User Interface (GUI) using UIB (ObjectBuilder). Data and control flow, generic dialog box, context-sensitive help, designed icons/short-cut toolbars, CAD data interfaces and graphic specification documentation. Managed GUI development team. (TPG3, C/C++/OOP)
  • Pioneered design-to-test interfacing. Organized and chaired committee to standardize ASIC digital simulation data for test automation. Developed CAD data extraction tools for error-free, automatic input of a) device layout, b) digital simulation data and c) wafer array map (controls ATE wafer probers).
  • Created algorithms to automate probe card design (chip to ATE probe test head) and associated interactive graphical displays for instantaneous feedback to the user.
  • Consulted with design and test engineers to solve engineering and production test problems (ongoing Application Engineering).
  • Conceptualized and developed a large-scale software tool (TVT/“Stingray”, C++) to translate third-party digital simulation event data into cyclized test vectors ready for test program generation, using streaming objects to efficiently process huge databases.
  • Created and maintained generic CMOS technology databases (COM1, COM2, COM2H, COM3); reviewed and updated them periodically and as needed to improve quality. The easy access to, and improved quality of, design data enhanced test reliability.
  • Embedded a stand-alone translator in the GUI to support multiple input file formats, which simplified and significantly streamlined GUI user activity.
