Sr. Performance Engineer/Performance Center Admin Resume
Cincinnati, OH
PROFESSIONAL SUMMARY
- Hands-on testing experience with Java, SOAP UI, SAP, Web Services, SOA, .NET, VB, ASP, JSP, and JavaScript applications, as well as mainframe and SQL systems on UNIX and Windows platforms.
- Excellent experience in performance testing using LoadRunner with Quality Center; analyzed results, graphs, service level agreements (SLAs), SiteScope data, reports, online views, and workload profiles using Performance Center.
- Experience with Waterfall, Agile, and other SDLC methodologies.
- Performed Smoke, Integration, System, End-to-End, User Acceptance (UAT), Regression, Performance, Ad-hoc, Data and Database Migration, Interaction, Localization, Configuration, Usability, and Security testing.
- Experienced in using the automated testing tools TestDirector, LoadRunner, and Quality Center.
- Efficient in Bug Tracking and Reporting using automated testing tools.
- Involved in business requirement analysis and in writing test plans, test cases, traceability matrices, and test scripts for middleware, web-based, client/server, and stand-alone applications.
- Worked closely with BAs, developers, clients, and support groups to ensure test environment, application, and data readiness.
- Strong experience in Mainframe Batch Cycles & Online Real-time Processing.
- Strong database (Oracle, SQL Server, MS Access) and ETL testing skills using advanced SQL and PL/SQL.
- Comfortable with various industry-leading operating systems (Windows NT/98/95/2000/XP, UNIX, AIX, Mac OS 9/10).
- Conversant with SDLC and QA practices, the Rational Unified Process (RUP), and Agile environments.
- Excellent team player with good communication and interpersonal skills.
- Expert in gathering requirement specifications and technical specifications.
- Prepared test strategies and test plans for multiple projects that followed Agile methodologies (Scrum, XP, and Adaptive Software Development (ASD)).
- Extensive experience in creating virtual users with VuGen, performing load tests with LoadRunner, and working with the LoadRunner architecture.
- Excellent experience in functional, performance, manual, automation, data warehouse, white-box, and other non-functional testing of web-based and client/server applications.
- Experience in insurance, web application, product development, and online e-commerce projects.
TECHNICAL SKILLS
Operating Systems: AIX, HP-UX, Solaris, Windows XP, 2003, 2000, Vista, Windows NT and Linux
Languages: C, JAVA/J2EE, VB Scripts, XML, UNIX - Shell Scripting
Databases: Oracle 9i/10G, DB2, SQL Server, MS-ACCESS, MySQL
GUI: VB 6.0/5.0, JSP, Java Applets, ASP, HTML
Web Related: DHTML, XML, VBScript, JavaScript, Applets, JAVA, JDBC, Servlets and JSP
Testing Tools: LoadRunner, Quality Center, Silk, JMeter
Web / Application Servers: Apache, Tomcat, WebLogic, WebSphere 5.x, IIS 5.x/6.x
Other: SiteScope, Quality Center, and Performance Center/ALM
PROFESSIONAL EXPERIENCE
Sr. Performance Engineer/Performance Center Admin
Confidential - Cincinnati, OH
Responsibilities:
- Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases.
- Involved in load testing of various modules and software applications using LoadRunner
- Developed load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced them by adding transactions, parameterizing constant values, and correlating dynamic values (see the script sketch at the end of this list)
- Worked extensively as a Performance Center administrator, installing and migrating from PC 9.52 to ALM 11.10 and then to ALM 11.5; provided access to new users and added new projects.
- Raised several tickets with HP and fixed various issues related to ALM and LoadRunner; applied patches to keep the tools up to date.
- Developed Vuser scripts in the Web/HTTP, Web Services, ODBC, Oracle 2-Tier, WinSock, and Click & Script protocols.
- Enhanced LoadRunner scripts to test new builds of the application
- Developed Vuser scripts and enhanced baseline scripts by parameterizing constant values
- Conducted Web services testing using SOAP UI.
- Customized LoadRunner scripts according to business specifications
- Carried out stress testing by introducing rendezvous points in the script
- Conducted tests using LoadRunner and Performance Center to establish the load capacity of the servers
- Used LoadRunner to analyze the response times of business transactions and module login times under load, and developed reports and graphs to present the test results
- Used SiteScope and Dynatrace to monitor load tests and identify bottlenecks.
- Used Dynatrace extensively to monitor application performance across the 3-tier architecture and analyzed metrics such as JVM heap size and expensive SQL queries.
- Installed Dynatrace probes on the application and database servers and configured the complete dashboard for easier monitoring and analysis.
- Monitored the performance of the Web and Database (SQL) servers during Stress test execution.
- Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server
- Used LoadRunner to check the performance of various reports and queries under load
- Analyzed load test results using the LoadRunner Analysis tool and the online monitors and graphs, and identified bottlenecks in the system
- Reported and entered bugs in Quality Center
- Tested browser compatibility.
- Developed high-level and detailed test plans, reviewed them with the team, brought customer-level experience to the team, and identified business-critical functionality.
- Updated test matrices, test plans, and documentation at each major release and performed regression testing using automated scripts.
- Managed/Updated Shared object repository from time to time using Object Repository Manager.
- Used environment variables as global variables to pass the values between actions.
- Carried out the manual testing of different interfaces.
- Provided Test Estimates for various phases of the project.
- Reported and tracked defects in the Quality Center bug tracking system.
- Automated the test cases using Quick Test Professional
- Managed the QA process through automation, assessed functional changes against business impact, and provided cross-business training to the QA team.
- Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards
- Supported Production team to understand and execute the processes.
- Created application documentation to assist in the support and training of users.
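The VuGen scripting work above (transactions, parameterization, correlation, and rendezvous points) follows the general pattern sketched below; the URL, parameter names, and correlation boundaries are illustrative placeholders rather than details from the actual project scripts.

/* Minimal VuGen (Web/HTTP) Action sketch illustrating the enhancements above.
   The URL, parameter names, and correlation boundaries are placeholders. */
Action()
{
    /* Correlation: capture a dynamic token from the login response. */
    web_reg_save_param("sessionToken", "LB=name=\"token\" value=\"", "RB=\"", LAST);

    lr_start_transaction("Login");

    /* Parameterization: {userName} and {password} come from the VuGen
       parameter list instead of hard-coded constant values. */
    web_submit_data("login",
                    "Action=https://app.example.com/login",
                    "Method=POST",
                    ITEMDATA,
                    "Name=username", "Value={userName}", ENDITEM,
                    "Name=password", "Value={password}", ENDITEM,
                    LAST);

    lr_end_transaction("Login", LR_AUTO);

    /* Rendezvous point so all Vusers issue the next request together
       (used in the stress-test scenarios). */
    lr_rendezvous("search_peak_load");

    lr_start_transaction("Search");

    /* Replay the correlated token on the follow-up request. */
    web_url("search",
            "URL=https://app.example.com/search?token={sessionToken}",
            LAST);

    lr_end_transaction("Search", LR_AUTO);

    return 0;
}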
Environment: LoadRunner 9.52/11.1, Performance Center, Quality Center, SiteScope, Wily Introscope, Dynatrace, QTP, SQL Server, SQL Profiler, Windows, UNIX, WebLogic, WebSphere, XML
Sr. Performance Engineer
Confidential - Des Moines, IA
Responsibilities:
- Defined performance goals and objectives based on client requirements and input.
- Involved in building a Performance Test Environment.
- Installed, configured, and maintained LoadRunner 9.5.
- Worked extensively with the Web and Web Services protocols in LoadRunner 9.5 (see the web services request sketch at the end of this list).
- Handled the Oracle 9i-to-10g upgrade project end to end; performed an apples-to-apples comparison to verify the performance of the new database.
- Worked extensively in Performance Center 9.52 and 11.10 to execute load tests and raise defects.
- Configured HP SiteScope and HP Diagnostics with HP ALM 11.10 to monitor tests and identify bottlenecks.
- Scripted legacy desktop applications using Winsock protocol.
- Ensured the compatibility of all application platform components, configurations, and upgrade levels in production and made the changes needed for the lab environment to match production
- Created automated test scripts with LoadRunner VuGen.
- Involved in complete end-to-end back-end testing of the 3-tier architecture, covering network monitoring, JVM log and heap monitoring, setting up JDBC connections, and querying the database to understand throughput.
- Optimized the environment for best performance by tuning JVM thread counts and the number of JVM instances.
- Executed tests and monitored system performance with SiteScope and the LoadRunner Controller.
- Extensively used SOAP UI to perform web services testing.
- Responsible for developing and executing performance and volume tests
- Developed test scenarios to properly load and stress the system in a lab environment and monitored and debugged performance and stability problems.
- Coordinated with DBAs to optimize the UNIX server for SQL queries and script execution.
- Configured and set up monitors in SiteScope; analyzed application performance using Dynatrace and monitored complete end-to-end transactions with PurePaths.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
- Used the Virtual User Generator to create VuGen scripts for the Web protocol; ensured that quality issues were identified, analyzed, documented, tracked, and resolved in Quality Center.
- Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
- Implemented and maintained an effective performance test environment.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Accurately produced regular project status reports for senior management to ensure on-time project launch.
- Conducted duration, stress, and baseline tests
- Verified that new or upgraded applications met specified performance requirements.
- Used Performance Center to execute tests, and maintain scripts.
- Identified queries that were taking too long and optimized them to improve performance
- Provided support to the development team in identifying real-world use cases and appropriate workflows
- Performed in-depth analysis to isolate points of failure in the application
- Assisted in the production of testing and capacity certification reports.
- Investigated and troubleshot performance problems in the lab environment, including analysis of performance problems in the production environment.
- Created Test Schedules.
- Worked closely with clients
- Interfaced with developers, project managers, and management in the development, execution, and reporting of performance test results.
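The web services testing referenced above can be scripted in VuGen as a raw SOAP request; the sketch below is illustrative only, with a hypothetical endpoint, SOAPAction header, and payload rather than the actual service under test.

/* Illustrative VuGen sketch: a SOAP call sent as a raw HTTP POST.
   Endpoint, SOAPAction header, and body are hypothetical placeholders. */
Action()
{
    web_add_header("SOAPAction", "\"urn:getQuote\"");

    lr_start_transaction("GetQuote");

    /* Raw POST of the SOAP envelope; {symbol} is a parameterized value. */
    web_custom_request("getQuote",
        "URL=https://services.example.com/quote",
        "Method=POST",
        "EncType=text/xml; charset=utf-8",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            "<soapenv:Body><getQuote><symbol>{symbol}</symbol></getQuote></soapenv:Body>"
            "</soapenv:Envelope>",
        LAST);

    lr_end_transaction("GetQuote", LR_AUTO);

    return 0;
}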
Environment: LoadRunner 9.5, Performance Center, JMeter, SiteScope, Dynatrace, Oracle, Citrix, MS SQL Server, WebLogic, Load Balancer, Java, Quality Center 10, J2EE Diagnostic Tool, Web, Windows 2000/XP.
Performance Engineer
Confidential - Columbus, OH
Responsibilities:
- Extensively used the SAP Web, Web (HTTP/HTML), SAP GUI, Ajax Click & Script, RTMP, Flex, AMF, Citrix Client, and Web Services protocols.
- Prepared test plans specifying the testing overview, testing approach, testing strategy, roles and responsibilities, scope of testing, and architecture landscape.
- Executed end-to-end integration testing scripts across different channels - the SAP CRM IPL portal and the web portal.
- Provided SiteScope admin support for PE tool management activities.
- Extensive knowledge of SAP ECC 6.0 components for collecting test statistics, server details, active session counts, and batch job statistics.
- Experience in data transfers using batch input methods; transferred master data from the previous system using the SAP LSMW tool.
- Involved in end-to-end SAP BObj integration testing against the Teradata database for business report runs.
- Executed integration/end-to-end testing manually from sales order creation through billing, order delivery, batch split, batch determination, shipping, pick & pack, PGI (Post Goods Issue), PGR (Post Goods Receipt), ASN, and billing.
- Executed various T-codes to collect statistics, generate batch jobs, process inbound and outbound deliveries, and schedule various jobs.
- Monitored and tested EDI messages 850, 855, and 856; ensured that the exchanged data was syntactically correct and that the required information was properly applied in the internal systems as per requirements.
- Worked extensively with the Shunra team on network testing across various bandwidth and latency profiles.
- Monitored CPU and memory utilization and various infrastructure metrics from SiteScope and HP Diagnostics.
- Executed long-duration endurance tests to assess database server usage.
- Gathered SQL queries, Java object classes, web service calls from Wily Introscope.
- Collected HttpWatch, Wireshark, and Fiddler logs to work through various scripting challenges in VuGen.
- Created LoadRunner scripts for BAC alerts and worked with HP to migrate them into production for external customer-facing applications.
- Developed various reports and metrics to measure and track testing effort.
- Attended weekly defect report meetings and presented progress updates.
- Attended conference calls with the offshore team to discuss testing status and assign defects to the relevant developers.
Environment: LoadRunner 9.52, Java, QuickTest Pro, SiteScope, ASP, JSP, Oracle, UNIX, Linux, Windows, WebSphere, Rational ClearQuest, HP Diagnostics, Wily Introscope, Teradata, Shunra, SAP R/3 ECC 6.0, SAP CRM 7.0 IPM, MDM, BW, AIMS, SCM, EDI
Sr. Performance Engineer
Confidential - Owings Mills, MD
Responsibilities:
- Participating in JAD Sessions, BRD Reviews, FS Reviews, and Tech Spec Review Meetings.
- Preparation of estimations, test plans, traceability matrices, and test cases based on requirements from the BRD and FSD, uploaded to VSTFS.
- Internal & External review of Test plan and Test cases.
- Build setup in the test environment and installation testing.
- Executing BVT test cases to validate the build.
- Executing test cases and updating results in VSTFS.
- Logging issues found during pre-test reviews and application testing in VSTFS
- Experience with database languages such as SQL and PL/SQL, including writing triggers, stored procedures, functions, views, and cursors.
- Identified performance issues in the MySQL database by evaluating queries and tuning them for better performance.
- Expertise in scenario design, test execution, and analysis using the LoadRunner Controller and Performance Center.
- Conducting Triage (Daily Bug Review meeting with participants from Analysis, Dev, Test & User Support) for prioritizing the issues.
- Work allocation to team members and application KT sessions for new team members.
- Coordinating with analysis, dev, performance & user support teams.
- Troubleshooting the QPRs (Issues raised in production)
- Applying and testing the monthly data updates, QFEs, CRs & SRs.
- Used build-compare tools and WinDiff to compare builds.
- Created Scripts to compare the OLTP and Reports data.
- Testing the application for compatibility on different Windows versions such as Vista and XP, and testing reports on different versions of MS Office.
- Performing security testing as per ACE security guidelines
- Using tools such as FxCop and SQLCop to check whether the code met coding standards.
- Using tools like SQL Code Coverage & Magellan to check the test coverage.
- Creation of daily status report & weekly Score cards.
- Creation of Graphical Bug Matrix for each release.
- Signing off on the release.
Environment: LoadRunner, LoadRunner Test Center, OpenSTA, VTS (Virtual Table Server), Wily, Windows 2000 Advanced Server, Apache, IIS 5, Livelink, BEA WebLogic, Servlets, EJB, Solaris, Oracle Database, Java