
Senior Performance Engineer/Architect Resume

Lanham, MD

SUMMARY:

  • Extensive experience with financial, communications, and health care applications.
  • Expertise in analyzing requirement specifications; developing test plans, test cases, and test scripts; and planning QA methodologies, including the Rational Unified Process.
  • Excellent understanding of the Software Development Life Cycle and analysis, with emphasis on black box testing, database testing, GUI testing, integration testing, system testing, regression testing, load and performance testing, and user acceptance testing.
  • Strong manual testing skills and proficient in using automated testing tools such as QTP, WinRunner, LoadRunner, and TestDirector/Quality Center.
  • Expert at performing database testing of back-end tables and data manipulation using SQL.
  • Hands-on experience in writing and executing automated test scripts, tracking defects, and working with the development team to correct defects.
  • Specialist in using Quality Center/TestDirector for global test management, bug tracking, and reporting.
  • Experienced in client/server application development using Oracle, Sybase, MS SQL Server, MS Access, Visual Basic, and VB.NET.
  • Extensive experience in setting up and troubleshooting LAN, WAN, and wireless networks, Windows Server, and Active Directory, and in recovering data from physically and logically damaged hard drives.
  • Excellent communication and interpersonal skills, with the ability to quickly learn new technology.
  • Self-motivated and responsible, with the ability to lead as well as work in a team.

TECHNICAL SKILLS:

Modeling and Simulation: CA Capacity Manager, CA Optimizer, Hyperformix, Capacity Command Center (CCC), Current Capacity Reporter (CCR)

Test Tools: Parasoft Test Suite, HP LoadRunner, HP Performance Center, UFT (Unified Functional Testing), ALM (Application Lifecycle Management)

App/Web Servers: Oracle Application Server, WebSphere 5.x

Operating Systems: UNIX Solaris, Linux, Windows NT/XP

Programming Languages/Technologies: C, JavaScript, Serena PVCS, Serena TeamTrack, SQL, J2EE, .NET

RDBMS: Oracle 9i, 10g, 11g, DB2

ETL/BI Tools: Informatica, DataStage, Ab Initio, TOAD, AutoSys

Internet Technologies: HTML, XML

DB Tools: TOAD, SQL Developer

Hardware: Sun, HP, Confidential

Networking: TCP/IP

Modeling: OPNET, ATX, ARX

Monitoring Tools: SiteScope, Quest Foglight, Oracle Grid Control, Oracle Enterprise Manager, HP Deep Diagnostics, CA Wily, OPNET ATX

PROFESSIONAL EXPERIENCE:

Confidential, Lanham, MD

Senior Performance Engineer/Architect

Responsibilities:

  • Responsible for providing Performance Requirements guidance, Performance Testing, Performance Monitoring, and Workload Modeling.
  • Strong quality assurance testing experience within an Agile environment.
  • Good understanding of the Agile software development lifecycle (iterative and incremental).
  • Generated detailed bug reports, pass-fail reports, and comparison charts using ALM.
  • Identified the scaling factor between the Performance Test environment and Production based on comparing server resource specifications.
  • Wrote a Performance Test Plan that included the performance requirements, load model, test approach, assumptions, constraints, risks, test schedule, performance test scenarios, and key performance metrics.
  • Oversaw and actively participated in the development and execution of performance test scripts and the creation of performance testing artifacts including test strategy and test plans, test cases, test execution reports, defect/issues reports, action item reports, and project plans
  • Designed the load model based on current volume and the projected percentage increase in volume derived from production metrics.
  • Provided guidelines to test engineers for configuring Performance Center test scenarios and Vusers according to the load model, so as to reflect the load distributed across various geographies.
  • Developed performance tests for continuous benchmarking using JMeter and Jenkins.
  • Designed, developed, and executed load tests using JMeter and Jenkins.
  • Created JMeter test cases to measure the performance and functionality of web services.
  • Executed automated JMeter test scripts based on business/functional specifications.
  • Used regular expressions to correlate dynamic values in JMeter (see the sketch after this list).
  • Performed in-depth analysis to isolate points of failure in the application
  • Performed integration testing between various modules and with hardware interfaces.
  • Converted LoadRunner scripts to JMeter through a proxy setup in LoadRunner.
  • Analyzed scheduled load-generation reports against the online reports.
  • Extensive knowledge of load balancing theory on both the networking and software (application) sides, as well as fault tolerance and failover.
  • Performed security audits and implemented load balancing and fault tolerance solutions.
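
A note on the regular-expression correlation above: JMeter's Regular Expression Extractor pulls dynamic values (session IDs, CSRF tokens) out of an earlier response so that later requests can reuse them. A minimal Java sketch of the same extraction idea, with a hypothetical response body and token name:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class TokenCorrelation {
        public static void main(String[] args) {
            // Hypothetical response body containing a dynamic session token
            String response = "<input type=\"hidden\" name=\"csrfToken\" value=\"a1b2c3d4\"/>";

            // Equivalent of a JMeter Regular Expression Extractor with the
            // expression: name="csrfToken" value="(.+?)"
            Pattern p = Pattern.compile("name=\"csrfToken\" value=\"(.+?)\"");
            Matcher m = p.matcher(response);

            if (m.find()) {
                // Group 1 holds the extracted token; JMeter would store it in a
                // variable such as ${csrfToken} for use by later samplers.
                System.out.println("Extracted token: " + m.group(1));
            }
        }
    }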

Environment: Performance Center, JMeter, Dynatrace, PowerShell, Confidential RQM, ClearCase, ClearQuest, DB2, Oracle, Informatica, Web Services, SharePoint, .NET, J2EE, Java EJB, Web Server, BEA WebLogic 8.1 App Server

Confidential, Woodlawn, MD

Performance Test Engineer/Lead

Responsibilities:

  • Studied application design and architecture and gathered business requirements and information from business analysts, project managers, architects, subject matter experts and production support team.
  • Performed testing of various features within the Agile development process.
  • Used the APM tool Dynatrace to monitor business transactions across all tiers (web/app/DB) of the applications.
  • Developed scenario-based tests for the JMeter scripts.
  • Created, scheduled, and ran scenarios using JMeter and generated the necessary graphs.
  • Worked extensively in JMeter, creating thread groups and testing the web application under various loads on key business scenarios.
  • Created and executed JMeter scripts for performance testing of portal
  • Studied the existing and proposed architectures, identified the differences in software versions, configuration changes, and scaling factors, and documented them.
  • Identified key business scenarios from application specialists or business analysts
  • Interacted directly with developers, project managers, and stakeholders; elaborated on performance test executions and reported test results at both low and high levels.
  • Researched past application response-time metrics and business transactions with the production support team to develop realistic test scenarios and load models.
  • Developed test plans and test cases and communicated proactively to obtain approval/sign-off on time from application owners and stakeholders.
  • Designed the load model based on current volume and the projected percentage increase in volume from production metrics (see the load-model sketch after this list).
  • Developed the test plan strategy and was involved in the test client and test environment builds.
  • Analyzed the web, application, and database servers while performing load tests.
  • Performed in-depth analysis to isolate points of failure in the application
  • Actively took ownership of defects and coordinated with different groups from the initial finding of defects to final resolution.
  • Provided input from test analysis to the tuning team for environment/application optimization, and maintained all configuration and parameter changes, documenting them as recommendations for the production team.
  • Coordinated daily status calls for technical and non-technical audiences.
  • Mentored new team members and assigned tasks to each team member with accountability and responsibility.
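
On the load-model design above: one standard way to size such a scenario (a sketch of the general technique, not necessarily the exact method used on this engagement) is Little's Law, N = X × R, where N is the number of concurrent Vusers, X is the target throughput, and R is response time plus think time. The volumes below are hypothetical:

    public class LoadModel {
        public static void main(String[] args) {
            // Hypothetical production metrics
            double hourlyTransactions = 36_000;  // current peak volume
            double projectedGrowth    = 0.25;    // projected 25% increase
            double avgResponseSec     = 2.0;     // measured response time
            double thinkTimeSec       = 8.0;     // modeled user think time

            // Target throughput in transactions/sec, grown by the projection
            double targetTps = hourlyTransactions * (1 + projectedGrowth) / 3600.0;

            // Little's Law: concurrent users N = X * R
            double vusers = targetTps * (avgResponseSec + thinkTimeSec);

            System.out.printf("Target throughput: %.2f tps%n", targetTps);   // 12.50 tps
            System.out.printf("Required Vusers: %.0f%n", Math.ceil(vusers)); // 125
        }
    }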

Environment: Performance Center, Dynatrace, JMeter, PowerShell, ALM, C#, SharePoint, Java, Oracle, DB2, Informatica, Web Server, BEA WebLogic 8.1 App Server

Confidential, Gwynn Oak, MD

Performance Test Engineer

Responsibilities:

  • Tasked to implement Performance Engineering Governance and processes to impact all existing and new development Lines of Business for a Target Enterprise Architecture.
  • Wrote a Performance Engineering methodology document deliverable detailing five areas of Performance Engineering influence: Workload Profiling, Performance Modeling, Performance Testing, Design for Performance, and Performance Tuning.
  • Developed production load models for two Lines of Business on the program, based on the key workload areas within the production infrastructure and application.
  • Identified all necessary inputs for a performance model, including a load model, infrastructure details, and application flow.
  • Conducted daily scrum meetings for agile sprint. Worked on sprint planning by assigning and tracking scrum activities progress.
  • Used Agile practices and Test Driven Development techniques to provide reliable, working software early and often
  • Collecting performance-modeling inputs contributed to capacity planning efforts, answering system performance questions related to environment sizing, network infrastructure impacts, and multiple workloads on servers.
  • Studied application design and architecture and gathered business requirements and information from business analysts, project managers, architects, subject matter experts and production support team.
  • Studied the existing and proposed architectures, identified the differences in software versions, configuration changes, and scaling factors, and documented them.
  • Identified key business scenarios from application specialists or business analysts
  • Researched past application response-time metrics and business transactions with the production support team to develop realistic test scenarios and load models.
  • Developed the test plan and communicated proactively to obtain approval/sign-off on time from application owners and stakeholders.
  • Designed the load model based on current volume and the projected percentage increase in volume.
  • Built and enhanced LoadRunner scripts using VuGen when necessary to accelerate delayed projects.
  • Provided guidelines to test engineers for configuring Performance Center test scenarios and Vusers according to the load model, so as to reflect the load distributed across various geographies.
  • Provided guidelines for, configured, and used SiteScope to monitor and analyze server performance, generating reports ranging from CPU utilization and memory usage to load average and pages/sec.
  • Analyzed the web, application, and database servers while performing load tests.
  • Performed in-depth analysis to isolate points of failure in the application
  • Actively took ownership of defects and coordinated with different groups from the initial finding of defects to final resolution.
  • Used Mercury Diagnostics for web page diagnostics and J2EE/.NET diagnostics to identify and pinpoint performance problems with web, J2EE, and .NET applications.
  • Interacted directly with developers and project managers in the development, execution, and reporting of performance test results.
  • Analyzed LoadRunner online graphs and reports to identify where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other server-level issues.
  • Provided input from test analysis to the tuning team for environment/application optimization, and maintained all configuration and parameter changes, documenting them as recommendations for the production team.
  • Experienced with the open source tools Selenium (Selenium IDE, Selenium RC, and Selenium WebDriver), JUnit, and Eclipse, and with preparing an automation test framework.
  • Recorded and played back tests in Firefox using Selenium IDE.
  • Created an automation test framework using Selenium WebDriver (see the sketch after this list).
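
The Selenium work above can be illustrated with a minimal JUnit 4 + WebDriver test in Java. The URL, locators, and credentials are hypothetical placeholders; FirefoxDriver matches the Firefox-based Selenium IDE recordings noted above:

    import org.junit.Assert;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class LoginSmokeTest {

        @Test
        public void userCanLogIn() {
            WebDriver driver = new FirefoxDriver();  // same browser used for the IDE recordings
            try {
                // Hypothetical application URL and element locators
                driver.get("https://app.example.com/login");
                driver.findElement(By.id("username")).sendKeys("testuser");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("loginButton")).click();

                // Verify the landing page after login
                Assert.assertTrue(driver.getTitle().contains("Dashboard"));
            } finally {
                driver.quit();  // always release the browser
            }
        }
    }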

Environment: HP Performance Center, Selenium, SoapUI, C#, XML, SQL Server 2000, TOAD, MVS, Crystal Reports, Oracle 9i, PL/SQL, Confidential DB2, MS Excel, Flat Files

Confidential, Hartford, CT

Test Engineer

Responsibilities:

  • Responsible for creating test plans and test cases by analyzing business requirements from the business analyst, and storing them in ALM.
  • Responsible for automating these test cases into test scripts using WinRunner 8.2, UFT 8.0, and LoadRunner 8.0.
  • Responsible for calling WinRunner scripts from UFT.
  • Accessed data from different file systems and databases to parameterize data in UFT and drive data in WinRunner for regression.
  • Tested the application manually as well as with automated test scripts (UFT).
  • Developed VBScript for all modules that needed to be automated.
  • Extensively used VBScript to develop and execute automated test scripts in QuickTest Professional.
  • Developed a keyword-driven test automation framework in QuickTest Pro (see the sketch after this list).
  • Developed and enhanced VBScript to obtain proper results; handled dynamically changing objects through VBScript; split the script into a number of actions and made them reusable.
  • Parameterized data for data-driven testing in order to implement retesting with multiple sets of data.
  • Responsible for running batch tests from ALM by launching WinRunner and UFT.
  • Responsible for creating LoadRunner scripts through VuGen (Virtual User Generator) using the Web (HTTP/HTML) protocol.
  • Responsible for creating the workload profile for the load test.
  • Responsible for creating LoadRunner scenarios in the Controller and analyzing results through LoadRunner Analysis in the GM Tech Center Lab.
  • Monitored the web, application, and database servers and gathered metrics for CPU utilization.
  • Responsible for running baseline, block point, and black box tests for both functional and performance testing, capturing timings under no load and under load and comparing the results of both tests, using a maximum of 2,100 Vusers for the load test.
  • Analyzed failed test results and submitted defects based on test results through ALM (Application Lifecycle Management), maintaining the complete defect lifecycle.
  • Responsible for creating final reports and recommending application tuning to the application team.
  • Responsible for creating and updating the activity log.
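
The keyword-driven framework above was built in QuickTest Pro with VBScript; the core pattern, a table of keywords dispatched to reusable actions, is sketched below in Java purely for illustration (the keywords and steps are hypothetical):

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.function.Consumer;

    public class KeywordDrivenRunner {

        // Map each keyword to a reusable action, much as QTP maps keywords to Actions
        private static final Map<String, Consumer<String>> ACTIONS = new LinkedHashMap<>();

        static {
            // Hypothetical keywords; real actions would drive the application under test
            ACTIONS.put("OPEN",   url  -> System.out.println("Opening "   + url));
            ACTIONS.put("TYPE",   text -> System.out.println("Typing "    + text));
            ACTIONS.put("CLICK",  id   -> System.out.println("Clicking "  + id));
            ACTIONS.put("VERIFY", txt  -> System.out.println("Verifying " + txt));
        }

        public static void main(String[] args) {
            // In a real framework these rows come from an external data table (Excel/CSV)
            String[][] steps = {
                    {"OPEN",   "https://app.example.com"},
                    {"TYPE",   "testuser"},
                    {"CLICK",  "loginButton"},
                    {"VERIFY", "Welcome"}};

            for (String[] step : steps) {
                ACTIONS.get(step[0]).accept(step[1]);  // dispatch the keyword to its action
            }
        }
    }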
