
Systems Performance Engineer Sr Resume

Wallingford, CT

OBJECTIVE:

Seeking a challenging Test Management position in a fast-paced information technology environment, striving to help the business determine the best solutions while contributing my technical, managerial, and business skills

SUMMARY:

  • Managed and supervised all deliverables for projects related to Performance and System Testing engagements, including Disaster Recovery
  • Handled the successful completion of multiple projects by assigning work across onshore and offshore teams
  • Worked with stakeholders to ensure understanding of performance and operational requirements and created the Master Performance Test Strategy to ensure completeness
  • Reviewed architectural design for performance risks and evaluated performance characteristics of new technologies
  • Led and coordinated performance engineering for large initiatives spanning multiple projects and systems.
  • Developed and executed performance tests including load, stress/capacity, endurance, resilience/fail-over and interoperability
  • Conducted technical analysis of performance test results and production systems, and provided recommendations on performance tuning, systems, and infrastructure
  • Reviewed and offered guidance on test scripting, recommending the appropriate technical tools to produce correct result output; ensured the accuracy of test scripts and took corrective action to produce accurate and valid test results
  • Managed test process implementation and documentation as per audit compliance
  • Ensured defects were addressed and appropriate solutions provided; created reports to document performance metrics, test results, analysis, and recommendations.
  • Mentored the team on new technologies, identifying training needs to keep up with market trends.
  • Led improvement of the Performance Engineering process in the SDLC and created Confidential artifacts following industry standards - Test Plan, Test Result, and Test Report templates
  • Made recommendations on the implementation of automation and performance testing tools
  • Provided effort estimates and kept costs within the approved limit.
  • Maintained open communication channels with Business owners, Project management, System Architects, Tech leads, Mid-Tier and DBAs including external Vendors
  • Managed escalations and steered the team in the right direction for diagnosis, troubleshooting, and resolution of issues
  • Worked with business owners located in countries such as the UK, Japan, and Australia
  • Gathered, analyzed, consolidated, and presented test metrics to senior leadership
  • Maintained key lessons learned and best practices to keep project health green
  • Excellent analytical, problem solving, communication and interpersonal skills.
  • Team player, hardworking, self-motivated, positive attitude and pleasant personality.

TECHNICAL SKILLS:

Operating Systems: MS-DOS, HP-UX, Solaris, Windows

Hardware: IBM PC, Pentium, HP-6000, Sun Solaris

Languages: SQL, SQA BASIC, TSL, C, C++, Shell, Perl, Python, VB Scripts, PL/SQL, JAVA, HTML, XML

Databases: Oracle, MS Access, Sybase, SQL Server, DB2

Testing Tools: LoadRunner/Performance Center, WinRunner, QTP, Quality Center, Silk Test, Silk Performer, SiteScope

Other Tools and Software: MS Office, MS Access, MS Project, PeopleSoft, SQR, SQL*Plus, TOAD, SQL Navigator, Citrix, ITIL, MOF, LeanIT, MDS, Wily Introscope, BSM (BAC), Fiddler, Splunk, TEP, Mainview, TDM, OnDemandClient

EXPERIENCE:

Confidential, Wallingford, CT

Systems Performance Engineer Sr

Responsibilities:

  • Worked with release managers and stakeholders to ensure understanding of performance and operational requirements.
  • Created the Master Performance Strategy document, Detailed Performance Test Plan, Test Activities Status Sheet, Test Results, and Test Reports for the new ODWS services
  • Reviewed architectural design for performance risks and evaluated performance characteristics of the ODWS rewrite (REST-based services)
  • Coordinated with the application development team to review the impact on legacy SOAP services of the new changes in the pipeline
  • Developed and executed performance tests including load, stress/capacity, endurance, resilience/fail-over and interoperability
  • Conducted technical analysis of performance test results and production systems, and provided recommendations on performance tuning, systems, and infrastructure
  • Ensured defects were addressed and appropriate solutions provided; created reports to document performance metrics, test results, analysis, and recommendations.
  • Collaborated with, negotiated with, and provided updates to senior management
  • Helped BAs define non-functional requirements for this project and collected production statistics to model the test workload on the production profile (a workload-sizing sketch follows this list)
  • Researched and reviewed new technologies, including MongoDB configurations, for a smooth transition from SOAP to REST services
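
A minimal Python sketch of the workload sizing referenced above, based on Little's Law. The hourly volume, response time, and think time are assumed placeholder figures for illustration, not actual ODWS production data.

    # Sketch: derive virtual-user count and pacing from production statistics.
    # All figures below are placeholders, not actual project volumes.
    peak_tx_per_hour = 36000          # assumed peak hourly volume from production monitoring
    avg_response_sec = 1.2            # assumed average response time per transaction
    think_time_sec = 8.0              # assumed user think time between transactions

    tx_per_sec = peak_tx_per_hour / 3600.0
    # Little's Law: concurrent users = arrival rate * time each user spends per iteration
    vusers = tx_per_sec * (avg_response_sec + think_time_sec)
    pacing_sec = vusers / tx_per_sec  # iteration pacing that sustains the target rate

    print(f"Target rate          : {tx_per_sec:.1f} tx/sec")
    print(f"Virtual users needed : {vusers:.0f}")
    print(f"Pacing per iteration : {pacing_sec:.1f} sec")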

Environment: Performance Center 12.6, VuGen 12.6, WebSphere 8.5, Wily Introscope, Mainview, OnDemandClient, DB2, BSM, Jira, TEP, Mainframe, MongoDB

Confidential

Performance Program Lead

Responsibilities:

  • As program lead, provided guidance and direction to the BA, Test, and Dev teams to get things done on time, while collaborating with, negotiating with, and providing updates to senior management
  • Participated in the development of the Best Practices and Lean Practice for Systems Performance Engineering, which describe the Lean process during the Discovery, Inception, Elaboration, Construction, and Transition phases.
  • Helped BAs define non-functional requirements for this project and insisted on their inclusion in ReqPro.
  • Reviewed and provided input on BTRDs and DADs as part of the Systems Performance Engineering discovery estimates, and created Performance Risk Evaluation documents
  • Created Master Performance Strategy document, Detailed Performance Test Plan, Test Activities Status Sheet, Test Results and Test Reports
  • Researched and reviewed the new MDS/DB2 technology and led the team in a smooth transition from third-party vendors to WellPoint.
  • As a test lead, designed test scenarios and planned and executed tests for performance tuning and capacity measurement using HP Performance Center
  • Monitored WAS servers using Wily Introscope, measured system resources using Tivoli TEPS and NMON, and provided recommendations to the teams (an NMON summary sketch follows this list)
  • Established a process for Disaster Recovery testing using HADR and made recommendations based on the test output
  • Managed and coordinated prototype Citrix testing as part of a cost-effective evaluation for offshore capacity testing
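
A small Python sketch of the kind of post-processing applied to NMON captures when summarizing system resources for the teams. The CPU_ALL column layout is assumed (User%, Sys%, Wait% following the snapshot tag) and the file name is hypothetical.

    # Sketch: summarize CPU utilization from an NMON capture (layout assumed).
    import csv

    user_pct, sys_pct, wait_pct = [], [], []
    with open("appserver01.nmon") as fh:          # hypothetical capture file
        for row in csv.reader(fh):
            # Data rows look like: CPU_ALL,T0001,<User%>,<Sys%>,<Wait%>,<Idle%>,...
            if len(row) > 4 and row[0] == "CPU_ALL" and row[1].startswith("T"):
                try:
                    user_pct.append(float(row[2]))
                    sys_pct.append(float(row[3]))
                    wait_pct.append(float(row[4]))
                except ValueError:
                    continue                      # skip the section header or malformed rows

    def avg(values):
        return sum(values) / len(values) if values else 0.0

    peak_busy = max((u + s for u, s in zip(user_pct, sys_pct)), default=0.0)
    print(f"samples={len(user_pct)}  avg user%={avg(user_pct):.1f}  "
          f"avg sys%={avg(sys_pct):.1f}  avg wait%={avg(wait_pct):.1f}  "
          f"peak busy%={peak_busy:.1f}")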

Environment: Performance Center 11.0, VuGen 11.0, Websphere 7.0, Wily Introscope 8.5, Citrix, Initiate IBM MDS V1.099, Rational ClearQuest 8.0, DB2, BSM 9.22, HP Diagnostics 9.2

Confidential

Performance Program Lead

Responsibilities:

  • Interacted with many groups - Infrastructure, Development, Business, and Database - to collect requirements and identify business processes, inventory, and volumes for planning and implementation of the tests
  • Coordinated with WAW change deployment team on the Integration Test, UAT, Prod, and Training environments along with requests and reviews
  • Involved in WAW Daily Strategic and Tactical Defect Corrections to prioritize defects in sync with the business
  • Updated and provided recommendations on delivery of reporting to the leadership team and introduced Quad Charts to the organization
  • Developed the test strategy for VuGen scripts for the Chordiant interface on WebSphere J2EE technology and set up monitors
  • Provided feedback on updating existing test artifacts such as performance test plans, test cases, test results and reports.
  • Assisted in setting up J2EE probes on WAS servers as well as UNIX monitors, configuring SiteScope to integrate with the HP Performance Center tool set
  • Gave presentations on the HP tool set - Quality Center, Performance Center, Diagnostics, SiteScope, and QTP.
  • Coordinated with the Chordiant, EIS, WGS, Confidential, EWPD, IMS, and other teams to improve performance and set goals for scalability and capacity planning
  • Identified hotspots, worked with tech teams to drill down into issues, and recommended changes for tuning the application for better performance
  • Advised the test team on generating test reports from LoadRunner Analysis (a summary sketch follows this list) and updated the project team from time to time
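
A brief Python sketch of the kind of per-transaction summary pulled into test reports (90th percentile and failure rate). The CSV column names and file name are assumptions for illustration, not the actual LoadRunner Analysis export format.

    # Sketch: per-transaction 90th percentile and failure rate from exported raw results.
    import csv
    from collections import defaultdict

    times = defaultdict(list)
    failures = defaultdict(int)

    with open("raw_results.csv") as fh:           # hypothetical export of raw transaction data
        for row in csv.DictReader(fh):
            name = row["transaction"]
            if row["status"].lower() == "pass":
                times[name].append(float(row["response_time"]))
            else:
                failures[name] += 1

    def pct(values, p=0.90):
        ordered = sorted(values)
        return ordered[int(p * (len(ordered) - 1))]

    # Transactions with no passing samples are omitted for brevity.
    for name, samples in sorted(times.items()):
        total = len(samples) + failures[name]
        print(f"{name:30s} n={total:6d} avg={sum(samples)/len(samples):6.2f}s "
              f"90th={pct(samples):6.2f}s fail%={100*failures[name]/total:5.1f}")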

Environment: Performance Center 9.10, Quality Center 9.0, WebSphere 6.1, Oracle 10g, AIX v5.3/TL 7 SP1, IBM HTTP Server 6.1, WAS HTTP Server Plugin 6.1, SiteMinder Plugin, Tivoli Storage Manager Client (TSM) 5.3, Chordiant framework, ClearQuest, IMS Mainframe, DB2

Confidential, Groton, CT

Test Lead

Responsibilities:

  • Managed the successful completion of more than 100 system and performance testing engagements across technologies such as .NET, Citrix, Documentum Webtop, ASP, Java RMI, and CORBA/Java
  • Coordinated with the onshore team for timely deliveries on multiple projects and set expectations with both the onshore and offshore teams
  • Participated in initial scoping discussions with the project teams and provided estimates
  • Coordinated with Engagement and Release Managers, Platform, DBAs, Mid-Tier, and the Network Modeling team to build better customer relationships, particularly with cross-divisional project teams such as PGM, and provided demos to win the business
  • Paved the way to implement the functional testing automation tool QTP and its Quality Center integration in the team by providing demos to the project teams, including the PeopleSoft team
  • Managed the migration and system testing of two internal projects - TestCenter LoadRunner to Performance Center and TestDirector to Quality Center
  • Advised various project managers in identifying requirements and goals of performance testing (stress testing, volume testing, capacity testing)
  • Prepared Performance Testing Guidelines and Test Plan and Test Report templates, following industry standards (CFR) and iSLC guidelines (ITIL & MOF)
  • Developed checklists to gather information from project teams for a smooth takeoff
  • Extensively used SiteScope and set up counters for monitoring and gathering test data to drill down to the root cause of problems
  • Made recommendations to project teams on tuning the application for better performance after analyzing the data collected in the LoadRunner Analysis tool
  • Worked with Red Team to resolve production issues
  • Monitored weekly Quad Charts for status to maintain open channels with the clients
  • Provided weekly test metrics (onshore and offshore) to the top management after consolidating the data gathered from all the partners
  • Conducted interviews and recruited resources in the team
  • Handled and resolved cost management issues
  • Coordinated with the financial team on cost management
  • Participated in bidder selection to identify the right partner/vendor.
  • Scheduled and planned validated and non-validated applications to meet Production Deployment dates
  • Actively participated in ORR (Operation Readiness Reviews) to update the team according to new schedules and updates
  • Arranged quick NetMeeting demos with developers and end users to obtain the right numbers for simulating performance tests, making them more realistic and completing load testing in the specified time
  • Prepared performance test plans for each application, identifying scenarios with an emphasis on critical and high-throughput flows
  • Peer reviewed test artifacts/deliverables in line with the iSLC process
  • Provided status updates and Test Metrics to the project teams
  • Reviewed VuGen scripts for appropriate actions and steps, as well as the parameterization and correlation required to run the scripts through TestCenter LoadRunner (the correlation idea is illustrated in the sketch after this list)
  • Coordinated with the DBA and Mid-Tier teams in setting up monitors and integrating SiteScope with TestCenter LoadRunner
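
A minimal illustration of the correlation idea reviewed in those VuGen scripts: capture a server-generated value from one response and feed it into the next request instead of replaying a recorded literal. In VuGen this is typically done with functions such as web_reg_save_param; the sketch below shows the same idea in Python with the requests library. The URL, token name, and credentials are placeholders, not an actual application interface.

    # Sketch: capture a dynamic value ("correlate") and reuse it in the follow-on request.
    import re
    import requests

    session = requests.Session()

    # Step 1: the landing page returns a dynamic token (e.g. a hidden form field).
    landing = session.get("https://app.example.com/login")
    match = re.search(r'name="csrf_token" value="([^"]+)"', landing.text)
    token = match.group(1) if match else ""

    # Step 2: the captured token is correlated into the next request;
    # the user id would normally be parameterized per virtual user.
    resp = session.post(
        "https://app.example.com/login",
        data={"user": "vuser_001", "password": "secret", "csrf_token": token},
    )
    print(resp.status_code)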

Environment: TestCenter LoadRunner 7.8, Performance Center 8.1, Quality Center 9.0, WebLogic, WebSphere, Oracle 8i on Solaris, MS Windows 2K Advanced Server SP3 and NT Server SP6.

Confidential, Princeton, NJ

QA Lead Tester

Responsibilities:

  • Prepared test plans and test cases for the two areas - E-Mail Delivery system and E-Mail Customer repository.
  • Backend testing was done using SQL queries for the cleanup and migration of data from list servers, DartMail, and J20 server dynamic data to the new data source.
  • Front-end testing of the E-Mail Center, primarily updating email preferences with different alerts, was done using automation tools.
  • Hard-bounce and soft-bounce handling, implemented in Perl scripts, was tested for threshold limits and bounce codes.
  • Java connector testing was done to check mapping rules from LDIF to LDAP using the Sun ONE Directory Server console.
  • Tested the ArrowPoint switch, used between ListServer and Enlist and between Enlist and the directory server, for higher throughput and availability.
  • Shell scripts were used extensively to cross-verify data between flat files and the database (a sketch of the idea follows this list).
  • Verified the replacement of DartMail and the migration of all static lists from the existing list servers onto the Bulk Email infrastructure.
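
The flat-file-to-database cross-check above followed the pattern sketched here in Python. The original checks were shell scripts against Oracle; sqlite3, the pipe-delimited layout, and the table/file names below are stand-ins for illustration only.

    # Sketch: confirm every subscriber in a migration flat file landed in the target table.
    import csv
    import sqlite3

    conn = sqlite3.connect("email_repo.db")       # stand-in for the target database
    cur = conn.cursor()

    # Hypothetical pipe-delimited extract: email|list_name|status
    file_keys = set()
    with open("migrated_subscribers.dat") as fh:
        for row in csv.reader(fh, delimiter="|"):
            if row:
                file_keys.add(row[0].strip().lower())

    cur.execute("SELECT lower(email) FROM subscribers")
    db_keys = {email for (email,) in cur.fetchall()}

    missing = sorted(file_keys - db_keys)         # in the flat file but not migrated
    extra = sorted(db_keys - file_keys)           # in the database but not in the file
    print(f"file={len(file_keys)}  db={len(db_keys)}  "
          f"missing in db={len(missing)}  unexpected in db={len(extra)}")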

Environment: WinRunner 7.6, TestDirector 7.6, Windows NT 4.0, Toad, Sun ONE Directory Server, Sun ONE Meta-Directory, Persistent Data's Enlist ODBC-to-LDAP, L-Soft Tracker, L-Soft LISTSERV and L-Soft LSMTP servers, and Oracle 8i under Sun Solaris.

Confidential

Project Lead

Responsibilities:

  • Enhanced the test plan and test scripts for the PPV, following Confidential testing methodologies.
  • Extensively used LoadRunner and WinRunner for performance and functional testing, in conjunction with Test Director.
  • Involved in testing tax calculations and credit card checks for the US, Europe, Canada, and other countries
  • Executed Java programs/shell scripts to create different scenarios in the test runs
  • Tested CCCP/Vertex web services, which include SOAP calls (a minimal SOAP-call sketch follows this list).
  • Performed User Acceptance Test through the available channels.
  • Generated test matrix graphs and data using MS Excel and analyzed various graphs generated through the LoadRunner performance tool.
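
A minimal sketch of the kind of SOAP call exercised against the tax web services, written in Python. The endpoint, SOAPAction header, and envelope fields are placeholders for illustration, not the actual CCCP/Vertex interface.

    # Sketch: hand-rolled SOAP request of the kind used to exercise tax services.
    import requests

    ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <CalculateTax xmlns="http://example.com/tax">
          <Country>US</Country>
          <State>NJ</State>
          <Amount>19.99</Amount>
        </CalculateTax>
      </soap:Body>
    </soap:Envelope>"""

    resp = requests.post(
        "https://tax.example.com/services/TaxService",   # placeholder endpoint
        data=ENVELOPE.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "CalculateTax"},           # placeholder action
        timeout=30,
    )
    print(resp.status_code)
    print(resp.text[:500])                                # inspect the returned tax figure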

Environment: WinRunner 7.6, LoadRunner 7.6, TestDirector 7.6, Windows NT 4.0, Toad, WebSphere V4, Vignette CMS, and Oracle 8i under Sun Solaris.

Confidential, NYC, NY

Project Lead

Responsibilities:
  • Created and updated Project Plans, using MS Project
  • Prepared Master Test Plan covering all areas of testing - Functional and Performance
  • Analyzed existing WinRunner and LoadRunner scripts and suggested solutions for robust scripts to be run by the S&P testers
  • Prepared Test Procedural documents for WinRunner and LoadRunner
  • Implemented TestDirector for test requirements, test plans, and test runs
  • Arranged numerous meetings with testers for coordination and synchronized with IT and DB teams for timely updates and progress
  • Led the steering committee to coordinate Mercury tools - TestDirector, TestCenter LoadRunner, and WinRunner
  • Validated the data feed generated from Bloomberg against the destination files delivered through the ETL tool SecureFx (a sketch of this kind of check follows this list)
  • Mentored testers in editing and improving test scripts generated through WinRunner and VuGen to make them more robust, in addition to manual testing
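
A small Python sketch of the kind of source-to-destination check used when validating the feed files: compare record counts and a content digest. The file names are placeholders; the real layout was Bloomberg's feed format.

    # Sketch: confirm a transferred feed file matches its source.
    import hashlib

    def summarize(path):
        digest = hashlib.md5()
        records = 0
        with open(path, "rb") as fh:
            for line in fh:
                digest.update(line)
                records += 1
        return records, digest.hexdigest()

    src_count, src_hash = summarize("bloomberg_feed_source.dat")   # placeholder names
    dst_count, dst_hash = summarize("bloomberg_feed_dest.dat")

    status = "MATCH" if (src_count, src_hash) == (dst_count, dst_hash) else "MISMATCH"
    print(f"source={src_count} records, dest={dst_count} records -> {status}")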

Environment: WinRunner 7.6, LoadRunner 7.6, TestCenter 7.5, TestDirector 7.6, Windows NT 4.0, Windows XP, SQL Navigator, UNIX, XML, iPlanet, and Oracle 8i on Sun Solaris.

Confidential, Jersey City, NJ

QA Automation Lead

Responsibilities:

  • Mentored two members of the test team on WinRunner and LoadRunner.
  • Led the team in the design and development of automation using Mercury tools
  • Actively involved in meetings with Confidential management to follow up on schedules, enhancements, changes, etc.
  • Developed reusable TSL test scripts in WinRunner that were used for test automation throughout Confidential live web-based systems.
  • Developed WinRunner modular scripts to perform Data driven, Navigation and Regression testing of all the system components for verification and validation of functionality, executed through TestDirector.
  • Recorded, parameterized, correlated and enhanced Vuser scripts to simulate the critical business transactions for Stress testing and Load testing of Confidential Live portal.
  • Analyzed various LoadRunner Analysis graphs to recommend appropriate fixes in the application for performance fine-tuning.

Environment: WinRunner 7.5, LoadRunner 7.5, QuickTest Pro, TestDirector 7.5, Windows NT 4.0, WebLogic 6.1, XML, IIS 4.0, and Oracle 8i under HP-UX.

Confidential, NJ

Team lead

Responsibilities:

  • Led the team in designing the Master Test Plan and in designing and developing test cases for all modules of the system
  • Identified the testers and distributed the test cases among them
  • Resolved many issues among the developers and testers
  • Conducted status meetings to meet the requested Timeframe
  • Developed UNIX-based test tools for testing the application, in addition to WinRunner
  • Tested LUPA (lookup of phone access numbers) for different states using ZIP codes
  • Extensively tested credit card checks using the HETS tool under different conditions (a basic validity-check sketch follows this list)
  • Evaluated the application performance for the peak load conditions using LoadRunner
  • Participated in the MR review board to resolve issues on MRs and other items
  • Checked different database transactions in the Oracle database
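
The credit card checks above were driven through the HETS tool; as an illustration of the kind of basic validity rule such test cases exercise, here is a standard Luhn (mod-10) check in Python. The numbers used are well-known public test numbers, not real cards.

    # Sketch: Luhn (mod-10) validity check of the kind exercised by credit-card test cases.
    def luhn_valid(card_number: str) -> bool:
        digits = [int(c) for c in card_number if c.isdigit()]
        total = 0
        # Double every second digit from the right; subtract 9 when the result exceeds 9.
        for i, d in enumerate(reversed(digits)):
            if i % 2 == 1:
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    for number in ("4111111111111111", "4111111111111112"):   # public test numbers
        print(number, "valid" if luhn_valid(number) else "invalid")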

Environment: Windows 98/NT/2K/XP, XML, WinRunner 7.5, LoadRunner 7.5, TestDirector 7.01, JDK 1.3 and Oracle 8i, HP-UX

Confidential

Performance Lead Tester

Responsibilities:

  • Led the team in resolving different performance issues
  • Designed and developed test cases for the complete application.
  • Developed TSL scripts in WinRunner for functional testing of all MSOs.
  • Created VuGen scripts to reproduce field performance problems.
  • Used the LoadRunner tool extensively to determine the performance goals.
  • Derived different methods to check application processes such as Java, JRun, ODBs, and SSL, particularly during performance testing.
  • Checked different database transactions in Oracle database.

Environment: Windows 98/NT/2000, JDK 1.3.1, NSent36, JRun Pro 2.3.3 using JDBC driver for Oracle 8.0.6, HP-UX, WinRunner 7.01, LoadRunner 7.02

Confidential

Release Test Manager

Responsibilities:

  • Sent install requests for new builds and verified proper installation of packages
  • Participated in the MR review meetings and resolved the issues
  • Evaluated Silk Test for functional testing and Silk Performer for load testing
  • Prepared Master Test plans and executed test cases to reflect business requirements
  • Manually tested the client/server application and converted the test cases into WinRunner automated test cases for regression testing
  • Coordinated the code change from C++ to Perl5 and set up the lab schedules
  • Checked Oracle database transactions from UNIX environment
  • Ensured that all the Perl and Java programs worked properly as developed

Environment: Windows 95/98/NT, HTML, C++, Perl 5, Silk Test 5.0, WinRunner 6.0, LoadRunner 6.0 and Oracle 7.0 under HP-UX
