- Overall 7+ years of expertise in performing load and performance tests against web-based applications, web service interfaces, and back-end server components. Skillful in identifying production metrics important to performance/load testing efforts, evaluating performance test results, and providing performance-enhancing recommendations.
- Possesses detailed working knowledge of QTP, SoapUI, and LoadRunner components, including VuGen, Controller, Load Generators, Agents, Analysis, and monitoring tools. Has detailed experience analyzing LoadRunner test results and articulating them in accurate reports for business, development, and management teams as well as operations stakeholders.
- Proficient in articulating modern web application architectural components, including application server containers, session management, authentication and authorization techniques, load balancing devices, and proxies. Applied this knowledge to capacity and system planning.
- Performed manual and automated testing using both Agile and Waterfall methodologies; proficient in UAT, regression, integration, and other types of functional testing.
Operating Systems: Windows NT, 98, 2000, 2003 Server, XP, Vista, 7; Unix
Web/Application Servers: IIS 6.0, WebLogic 8.1, WebSphere 6.1, Oracle 10g/11g
Programming Languages: C, C++, C#, Visual Basic 6.0, SQL, PowerShell
Databases: Oracle 11i, SQL Server, MS Access, System 80
Tools: LoadRunner, Performance Center, QTP, Quality Center / TestDirector, WinRunner, MS Remote Desktop Load Simulator, SoapUI, Visual Studio, NetStorm/NetOcean, TFS, Redgate SQL Compare, Jira, Microsoft Office (Word, Excel, etc.)
Process Methodologies: Agile Scrum & Waterfall
Environment: LoadRunner, Performance Center, HTTP/HTML, Web Services, SiteScope, Jira, C
- Ran tests in ALM Performance Center
- Created, updated, fixed, and peer-reviewed LoadRunner Vuser scripts using the HTTP and Web Services protocols
- Documented selected test results in Excel
- Used LoadRunner Analysis features to create custom reports and graph correlation comparisons
- Leveraged LoadRunner/VuGen recording options to create and enhance scripts
- Used HTML and URL recording modes to create robust web performance scripts
- Used a variety of parameter types and settings, such as random, unique, and sequential, chosen to fit each test's needs
- Monitored the behavior of the application's components in real time to confirm they behaved as expected and that metrics stayed within SLAs
- Wrote scripts in LoadRunner and uploaded them to Performance Center
- Tested failover scenarios with LoadRunner to understand, document, and report the application's current state and test results
- Wrote reports on completed test runs and distributed them to managers, stakeholders, and senior leadership
- Contributed to training sessions for government employees on using Performance Center and LoadRunner
- Documented training sessions and wrote how-to guides for Performance Center and other applications as reference material for new employees and team members
- Communicated clearly with government staff to prevent misunderstandings and keep results aligned with expectations
- Contributed actively in meetings to help ensure project success
Environment: TFS, Visual Studio Ultimate 2013, NetStorm, C, C#, HTTP/HTML, Unix, Agile, SQL, SSMS, APIs, PowerShell, Redgate SQL Compare
- Converted scripts from NetStorm to Visual Studio
- Ran scripts in NetStorm by launching them from the Unix command line
- Worked closely with DBAs, developers, BSAs, QA, the BI/DI team, and the Mobile, Rental, and Web teams
- Restored databases before testing
- Helped solve performance issues by creating test scenarios that closely simulated production load
- Updated database tables when a new build required changes
- Managed staging databases and worked closely with DBAs, system admins, and developers to ensure sprint and project success
- Attended Plan Fest sessions to plan work for the next few sprints
- Analyzed the results from each test run
- Met all deliverables to ensure builds went to production as scheduled
- Ensured all performance testing finished before deadlines
- Synced the staging environment databases with production to obtain the most accurate test results
- Communicated with all teams (DBAs, developers, etc.) to ensure tests ran properly and results met SLAs
- Used Microsoft Office tools to document and plan meetings, test cases, and various other tasks
QA Mobile Tester
Environment: Mobi, mobile testing, Android, health services, Jira, Confluence, Iguana, PaceArt, Softaid, HTTP/HTML, C#, Zephyr, True Agile, Jenkins
- Created test plans and test cases
- Attended daily Scrum meetings
- Responsible for testing production mobile devices and machines after every update or new build
- Executed test cases and logged any defects/bugs in Jira
- Performed 100% manual testing
- Worked very closely with the test team, client, and various stakeholders
- Coordinated with teams in Macedonia and Greece