Performance Test Lead Resume
Los Angeles, CA
OBJECTIVE
Seeking a challenging IT role as a Sr. Software Engineer to contribute to organizational success and grow into a senior management position.
SYNOPSIS
Young, energetic, and result-oriented professional with over 6 years of hands-on experience in IT; extended expertise in programming and analytical skills, and in manual and automated software testing; multitasking ability to sketch the plan, prioritize the work, and manage complex projects under aggressive timelines; deep understanding of technology with a focus on delivering business solutions; highly ethical, trustworthy, and discreet.
- Prior relevant technical and leadership experience in performance testing and analysis in a distributed environment
 - Expertise in reviewing/analyzing Software Requirements Specifications (SRS) and Business Requirements Documents (BRD), creating and developing test plans, test cases, and test scripts, and interacting with the development team
 - Experience across all phases of the Software Development Life Cycle: analysis, design, development, and testing
 - Adept technical skills in functional, integration, system, and performance testing in the pharmaceuticals, finance, banking, and telecom domains
 - Preparation of test harnesses
 - Experience in automated load and stress testing of production systems
 - Prior experience in analyzing and reporting on performance metrics and test results
 - Highly self-motivated and able to perform under aggressive timelines
 - Excellent verbal and written communication skills
 - Strong analytical and problem solving skills
 - Able and comfortable interacting with all levels of management and technical resources
 - Able to work independently and collaborate as necessary
 
CORE COMPETENCIES
- Preparing/Executing Test Plans, Test Strategy, Test Cases
 - Back-end database testing using SQL
 - Proactive & Industrious
 - Quick Test Professional (QTP)
 - Project Management
 - LoadRunner and WinRunner
 - Team Management
 
- Efficiently and productively analyzed all projects undertaken
 - Experience with Java, C/C++, and leading-edge software development models
 - Skilled in the use of load-generation and analysis tools such as LoadRunner
 - Received several commendations from superiors for hard work and effort in meeting project deadlines
 - Sound ability to develop strategies to achieve personal as well as organizational goals
 - Resolved a number of technical issues and facilitated the team in doing so
 
TECHNICAL QUALIFICATIONS
Testing Tools: WinRunner 7.0/7.6/8.0, LoadRunner 8.0/8.1/9.1, Performance Center 8.1/9.1, TestDirector 6.0/7.0/8.0, Quality Center (QC)
Scripting: TSL, VBScript, JavaScript
Languages: C, C++, Java, Visual Basic 6.0
Databases: Oracle 8i/9i, SQL Server 7.0/2000/2005
Operating Systems: Windows 95/98/NT/2000/XP, Red Hat Linux 8.x, Solaris 9/10, SCO UNIX 5.0
WORK HISTORY
Confidential, Los Angeles, USA (Jan’09 – Present)
    Performance Test Lead
  Key Deliverables:
- Responsible for validating that all test case input sources and test case output results are documented and can be audited
 - Documenting performance test plans and test case documents
 - Accountable for test scripting using VuGen (see the sketch after this list)
 - Ensuring the availability of all testing facilities in accordance with the testing calendar
 - Monitoring system performance (hardware/software) during volume/performance/capacity testing
 - Liaising with technical staff to ensure availability of all testing facilities
 - Making arrangements for functional consultants and/or other project personnel to execute or supervise testing
 - Liaising with the Testing Coordinator on all matters regarding the availability of the testing environment for UAT
 - Responsible for executing all assigned test cases and recording results with support from the project team
 - Executing the test cases using sample source documents as inputs and ensuring that the outcomes of the tests are recorded
 - Signing off on all test cases in accordance with the stated acceptance criteria by signing the completed test worksheets
 
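To illustrate the VuGen scripting responsibility noted above, the fragment below is a minimal, hypothetical sketch of a LoadRunner Action() in C that wraps a single page request in a named transaction with think time; the URL, transaction name, and timing values are placeholders rather than details of this engagement.

    /* Minimal VuGen (C) sketch: one timed transaction around a page request.
       Transaction name, URL, and think time are illustrative placeholders. */
    Action()
    {
        lr_think_time(5);                           /* simulate user pacing */

        lr_start_transaction("login_page");         /* start response-time measurement */

        web_url("login_page",
            "URL=http://example.com/login",         /* placeholder URL */
            "Resource=0",
            "RecContentType=text/html",
            "Mode=HTML",
            LAST);

        lr_end_transaction("login_page", LR_AUTO);  /* pass/fail decided from the request result */

        return 0;
    }
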
Environment:
  C, C++, Java, HTML, Visual Basic 6.0, SQL Server 2005, Windows NT, ASP.NET, VB.NET, Linux, Perl scripting, Performance Center 9.1, LoadRunner 9.1, Quality Center
Confidential, Detroit, MI (Dec’07 – Dec’08)
    V&V Test Analyst
  Key Deliverables:
- Responsible for software Quality Control and Quality Assurance of all processes associated with internal and external customers
 - Analyzing software specifications and architecture to determine locations of possible bottlenecks.
 - Carrying out procedures through development life cycle to include requirements gathering, test planning, creating test scripts, software testing, and defect management
 - Designing performance test scripts and scenarios to identify the impact of performance bottlenecks
 - Analyzing performance test results to generate scalability models (see the sizing sketch after this list)
 - Collaborating with other engineers on architecture, design, code, configuration, and defect reviews.
 - Providing weekly status reports to project manager describing accomplishments.
 - Responsible for the department's quality assurance guidelines and processes.
 - Providing documentation and feasibility studies directed toward customer standards and compliance
 - Providing individual and lead support for complex multi-system installations, large system installations with complex software and applications, or large system installations with complex business and management considerations
 
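As one way to ground the scalability-model bullet above, the sketch below applies Little's Law, N = X * (R + Z): required concurrency equals target throughput times the sum of response time and think time. All numbers are invented examples for illustration, not results from this project.

    #include <stdio.h>

    /* Illustrative load sizing via Little's Law: N = X * (R + Z).
       X = target throughput, R = response time, Z = think time.
       All values are invented examples, not project data. */
    int main(void)
    {
        double target_tps   = 50.0;   /* desired transactions per second */
        double resp_time_s  = 1.2;    /* average response time, seconds  */
        double think_time_s = 8.0;    /* scripted think time, seconds    */

        double vusers = target_tps * (resp_time_s + think_time_s);

        printf("Estimated virtual users required: %.0f\n", vusers);   /* 50 * 9.2 = 460 */
        return 0;
    }
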
Environment:
  C, C++, Java, HTML, Visual Basic 6.0, SQL Server 2005, Windows NT, ASP.NET, VB.NET, Manual Testing, QTP 8.1, Performance Center 8.1, LoadRunner 8.1, Quality Center
Confidential, Fort Worth, TX (Jan’07 – Jul’07)
    QA Tester
  Key Deliverables:
- Designing the automated testing framework
 - Designing the performance test harness for execution on automated, performance, and endurance testing platforms; supporting off-shore automation development resources as well as contracted on-site personnel
 - Providing assistance to the Performance Test Lead to help in development of test strategies and test plans
 - Supporting the development and execution of performance test cases and HP Performance Center test scripts per approved Test Plans by providing specific performance testing expertise in Centricity/IDX
 - Working with the Test Lead and Performance Test Engineers to execute comprehensive Performance Testing of the application
 - Interacting with development engineers to design and implement functional and performance interface applications
 - Working with automation tool vendors in the implementation of in-house simulator applications to effectively function within automation environments
 - Responsible for the specification and installation of required hardware and software for automation projects
 - Conducting documentation reviews for performance automation documents developed by QE automation engineers
 - Responsible for specifying automation platform operation: automation servers, operating systems, simulator applications, automation applications, and future automation environments
 - Planning and managing rollouts of enhancement and maintenance upgrades to performance automation platforms, applying solid knowledge of Windows and UNIX (preferably Solaris and/or Linux) operating systems
 - Designing, researching, and developing components of the test architecture for a new platform, with a concentration on system performance and throughput
 - Responsible for performance and load testing using HP Mercury LoadRunner, and designing, developing and delivering performance testing solutions for applications
 - Performing performance and load tests on complex projects using technical specifications.
 - Planning, scheduling and implementing performance testing projects
 - Defining performance test objectives and writing LoadRunner scripts
 - Performing database verification and SQL queries to verify and modify test data
 - Performing API/web services testing, including writing test harnesses using web services and SOAP technologies (see the SOAP sketch after this list)
 - Investigating and resolving technical issues in the QA and production environments
 - Designing an infrastructure to validate the requirements, functionality and performance goals for the particular product, with an emphasis on performance
 - Identifying the bottlenecks in the system during architecture and design and providing appropriate solutions
 
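As an illustration of the SOAP test-harness work noted above, the fragment below is a hypothetical VuGen (C) sketch that posts a SOAP envelope with web_custom_request; the endpoint, SOAPAction header, and message body are placeholders and are not taken from the Centricity/IDX interfaces referenced here.

    /* Illustrative VuGen (C) sketch: post a SOAP envelope as a custom HTTP request.
       Endpoint, header, and body are placeholders. */
    Action()
    {
        web_add_header("SOAPAction", "\"getAccountBalance\"");    /* placeholder action */

        lr_start_transaction("soap_get_balance");

        web_custom_request("soap_get_balance",
            "URL=http://example.com/services/AccountService",     /* placeholder endpoint */
            "Method=POST",
            "Resource=0",
            "EncType=text/xml; charset=utf-8",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><getAccountBalance><accountId>12345</accountId>"
                 "</getAccountBalance></soapenv:Body></soapenv:Envelope>",
            LAST);

        lr_end_transaction("soap_get_balance", LR_AUTO);
        return 0;
    }
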
Environment:
    LoadRunner 8.1, C, C++, Java, HTML, Visual Basic 6.0, SQL Server 2005, Windows NT, Manual Testing, QTP, Quality Center
Confidential, Jersey City, NJ (Mar’06 – Nov’06)
    QA Tester
  Key Deliverables:
- Developing LoadRunner load test scripts
 - Preparing load tests before execution and executing them
 - Applying LoadRunner scripting and reporting skills
 - Performing heavy-load stress tests
 - Understanding rendezvous points in a Vuser script and performing correction tests (see the rendezvous sketch after this list)
 - Identifying bottlenecks in web applications using LoadRunner and applying correction skills
 - Analyzing performance graphs
 - Planning and designing complex, integrated performance test scenarios for applications and infrastructure to be used for integrated load testing
 - Designing and writing performance test scripts using LoadRunner as well as custom test harnesses
 - Executing and coordinating monitoring of performance tests
 - Analyzing performance test results
 - Identifying potential bottlenecks; producing comprehensive test reports and presenting findings to management; analyzing application usage and server performance data
 - Working with application development teams and business users to develop and validate performance requirements
 - Investigating and resolving technical issues in the QA and production environments and communicating issues to various departments
 - Mentoring team members and enhancing their capabilities for the consistent success of the organization
 - Monitoring performance tool (LoadRunner) capabilities and trends to enhance and maximize performance value to the organization; creating necessary SDLC test artifacts and ensuring all performance testing requirements are fully tested and traceable to test cases and results
 
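To make the rendezvous-point bullet above concrete, the fragment below is a hypothetical VuGen (C) sketch: each Vuser blocks at lr_rendezvous until the scenario releases them all together, producing a spike of simultaneous submissions. The rendezvous name, transaction name, and form action are placeholders.

    /* Illustrative VuGen (C) fragment: Vusers wait at a rendezvous point,
       then submit a form at the same moment to create a concurrency spike.
       All names and the URL are placeholders. */
    Action()
    {
        lr_rendezvous("submit_order_storm");        /* block until the scenario releases all Vusers */

        lr_start_transaction("submit_order");

        web_submit_data("submit_order",
            "Action=http://example.com/order/submit",   /* placeholder form action */
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=qty", "Value=1", ENDITEM,
            LAST);

        lr_end_transaction("submit_order", LR_AUTO);
        return 0;
    }

In a Controller scenario, the rendezvous policy (how many Vusers to wait for, and for how long) is configured outside the script, so the same fragment can drive spikes of different sizes.
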
Environment:
  LoadRunner 8.0, Java, Perl scripting, Windows 2000/XP, Manual Testing, WinRunner 8.0, TestDirector 8.0
Confidential, Blue Bell, PA (Jun’05 – Feb’06)
    QA Tester
  Key Deliverables:
- Developing test plans and test scripts; executing performance testing
 - Documenting performance testing results
 - Presenting results to development team, test team and management
 - Conducting/facilitating teleconferences and meetings
 - Developing and maintaining the release program plan/schedule
 - Facilitating and reviewing functional requirements
 - Responsible for performance and load testing using HP Mercury LoadRunner, and designing, developing, and delivering performance testing solutions for company applications
 - Planning, scheduling, and implementing performance testing projects, defining performance test objectives, and writing LoadRunner scripts
 - Performing database verification, SQL queries to verify and modify test data
 - Formulating test plans including systems analysis, risk analysis, writing and plotting test strategies, and determining how to report test results and defects
 - Performing API/web services testing, including writing test harnesses using web services and SOAP technologies
 - Facilitating and monitoring:
 - Business Impact Assessment (BIA)
 - System Requirements Document (SRD)
 - Requirements Traceability Matrix (RTM)
 - System Design Document (SDD)
 - User Guide & Help Content
 - Release Implementation Plan
 - As Built documentation updates
 - Functional Requirements Review (FRR)
 - System Requirements Review (SRR)
 - Test Readiness Review (TRR)
 - Deployment Readiness Review (DRR)
 
Environment: 
  Windows 2000 Workstation, SQL Server 7.0, Oracle 8i, VB 6.0, ASP 3.0, VBScript, Visio 2000, Manual Testing, WinRunner 7.6, TestDirector 7.0
Confidential, Houston, TX (Jan’04 – May’05)
    Sr. Test Engineer
  Key Deliverables:
- Responsible for performance tools such as JMeter, LoadRunner, and Grinder
 - Building a robust test infrastructure
 - Understanding test plans and metric reporting
 - Accountable for the databases (MySQL)
 - Responsible for the Linux operating system
 - Working closely with Development, Quality Engineering, Product Management, and Technical Operations during the development, test, and launch stages of the software development and release cycle (Waterfall and Agile)
 - Defining test plans and test specifications for functional, unit, integration and performance testing, as well as the manual execution of test cases, and reporting product failures
 - Developing test frameworks in Selenium for UI regression test automation
 - Designing and developing load test suite using open source tools and/or using scripting
 - Creating and executing complex database queries (SQL) to validate database performance and data integrity (see the sketch after this list)
 - Following project milestones; designing, implementing, documenting, and executing tests; evaluating and communicating results; and investigating product features (including ad hoc testing)
 - Working with the various teams to ensure that all components meet the required levels of performance, scalability, and reliability as demand grows for the service
 - Analyzing the various components and the whole system to identify issues and bottlenecks, and prototyping possible component fixes or architecture enhancements to address these issues
 - Working with a very experienced team to understand in depth every detail of a highly scalable cloud synchronization platform
 - Coordinating test planning, manual testing and automated testing, according to project deliverables
 - Developing, maintaining, and executing automated test scripts; maintaining complete, detailed test results data; generating and publishing test team reports indicating test scripts passed, failed, or not run, including the number of defects created as a result of testing
 - Identifying/automating manual, functional test scripts that are good candidates as regression test scripts for automation using QuickTest Professional and LoadRunner
 - Developing and executing automated test scripts for future regression testing
 - Designing, developing, and executing performance test scenarios as required using LoadRunner
 
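To illustrate the SQL data-integrity checks referenced above, the sketch below shows one possible approach using the MySQL C client library: reconciling row counts between a staging table and its load target. The host, credentials, and table names are invented placeholders, and the choice of the C API is an assumption made only to keep these examples in one language.

    #include <stdio.h>
    #include <string.h>
    #include <mysql.h>

    /* Illustrative data-integrity check via the MySQL C API:
       compare row counts between a staging table and its target.
       Connection details and table names are placeholders. */
    int main(void)
    {
        MYSQL *conn = mysql_init(NULL);
        MYSQL_RES *res;
        MYSQL_ROW row;

        if (!mysql_real_connect(conn, "localhost", "qa_user", "qa_pass",
                                "testdb", 3306, NULL, 0)) {
            fprintf(stderr, "Connect failed: %s\n", mysql_error(conn));
            return 1;
        }

        /* Single query returning both counts side by side */
        if (mysql_query(conn,
                "SELECT (SELECT COUNT(*) FROM orders_staging) AS staged,"
                "       (SELECT COUNT(*) FROM orders) AS loaded") != 0) {
            fprintf(stderr, "Query failed: %s\n", mysql_error(conn));
            mysql_close(conn);
            return 1;
        }

        res = mysql_store_result(conn);
        row = mysql_fetch_row(res);
        printf("staged=%s loaded=%s -> %s\n", row[0], row[1],
               strcmp(row[0], row[1]) == 0 ? "MATCH" : "MISMATCH");

        mysql_free_result(res);
        mysql_close(conn);
        return 0;
    }
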
Environment: 
  VB 6.0, ASP 3.0, JavaScript, Manual Testing, WinRunner 7.0, TestDirector 6.0, LoadRunner 6.0
TRAINING
- Training program on SAP ABAP, BW, HR Payroll, and Retail modules; LoadRunner, QTP, and WinRunner
 
