Sr. IT - Capacity / Performance Management Analyst Resume

IL

SUMMARY:

  • 12+ years of experience in software testing, covering manual, automation (functional) and performance testing, with business analysis and IS Manager experience including tuning, analysis and recommendations. Extensive experience testing web-based applications, client-server applications and web services. Highly proficient in performance testing using HP LoadRunner, Performance Center and JMeter, and in functional testing using QTP and Selenium.
  • Expertise in HP Quality Center (QC), LoadRunner and Performance Center, as well as Apache JMeter, for performance testing.
  • Expertise in HP QTP/UFT and Selenium for functional testing.
  • Expertise in Agile, Waterfall and V-Model methodologies.
  • Imported data/issues from QC to JIRA and used Zephyr for Test Management.
  • Hands-on experience analyzing business requirement specifications, developing comprehensive test specifications and executing them.
  • Extensive experience in preparing Load and Functional Test Plans and Test Strategies. Ability to handle baseline and comparison performance tests from beginning to end.
  • Proficient in LoadRunner VuGen scripting across multiple protocols such as Web HTTP/HTML, Web Services, Java and Citrix, including script customization and error handling for script debugging.
  • Experienced in conducting Smoke, System, Functional, Regression, Stress, Load, Spike, Endurance and UAT testing.
  • Hands on experience using various performance monitoring tools such as Dynatrace, VisualVM and CA Wily Introscope.
  • Presented performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, memory usage, thread usage, thread contention, garbage collection, network time, page refresh time and page rendering time.
  • Proficient in working with multiple application servers like Weblogic, WebSphere, Tomcat and Jboss.
  • Expert in developing Selenium WebDriver/RC/IDE/Grid test scripts using Java and JavaScript, and test frameworks using TestNG.
  • Involved in testing web application and integration projects using object-oriented technologies such as Core Java, J2EE, Struts, JSP, JDBC, Spring Framework, Hibernate, Java Beans, Web Services (REST/SOAP), XML, XSLT, XSL and ANT.
  • Experience in Business Intelligence testing in generating various reports using Business Objects, Crystal Reports and Crystal Dashboards (Xcelsius).
  • Hands on experience in Database Testing using applications of RDBMS in ORACLE 11g, 12c and SQL Server.
  • Good knowledge on Cloud technology (Salesforce.com) and experience in testing SalesForce applications.
  • Involved in project estimation phase and budget allocations for resources.
  • Managed both onshore and offshore teams and coordinated with the team on day-to-day tasks.
  • Trained in Agile (Scrum) Methodology and actively involved in each stage of the Scrum process (Grooming, Sprint Planning, Daily Standup, Demo meeting, Retrospectives).
  • Excellent communication and interpersonal skills; interacted with the core team and dealt with end users in conducting workshops and documenting specifications.
  • Broad experience testing healthcare, telecom billing, Salesforce, security, human resources, insurance and financial applications on different operating systems.

TECHNICAL SKILLS:

Project Management Tools: ALM, JIRA and Confluence

Issue reporting tool: Quality Center 11.0 and JIRA

Testing Tools: QTP, LoadRunner, Selenium, Pentaho (Kettle) and QC

Operating Systems: UNIX, Windows

Databases: Oracle 11g, 12c, SQL Server 7.0/2000 and MS Access.

Performance monitoring Tools: DynaTrace, PerfMon, JVisualVM and Wily Introscope.

Tracking Tools: Quality Center 11.0 and JIRA

Script Technologies: Unix Shell, VB Script

SalesForce.com: Triggers, Custom Objects, Workflows, Email Templates, Visualforce, Apex, Data loader, SOQL, SOSL, Chatter, Report, Dashboard, Force.com IDE.

PROFESSIONAL EXPERIENCE:

Confidential, IL

Sr. IT- Capacity / Performance Management Analyst

Responsibilities:

  • Developed Vuser scripts (HTTP/HTML, Web Services and Java) using VuGen for all identified business processes.
  • Executed Baseline, Load, Stress, and Combined Testing.
  • JVM monitoring and tuning
  • Garbage Collection Analysis, Java Heap Analysis, Thread dump, Heap dump analysis, Large Object JVM tuning, Native Heap Analysis to identify memory leaks and Connection Leaks.
  • Test and Production monitoring and metrics collection
  • Capacity Planning assessments
  • Determining the Service Level Requirements
  • Analyzing current capacity and plan for the future using measuring and estimation tools
  • Worked on trend analysis, application sizing, monitoring, bottleneck analysis, tuning and implementation.
  • Performance Test Report
  • Provided performance and capacity assessments and deployment risk analysis using key performance indicators such as capacity, performance, scalability, availability, reliability and fault tolerance.
  • Risks and Mitigation Plans
  • Production Support
  • POC & Instrumentation of JAVA, .NET and WebServer Technology in Dynatrace 6.2.
  • Configuration and validation of different DynaTrace monitors like Oracle Monitor, Log Monitor, URL Monitor, File Scraper Monitor, vmWare Monitor.
  • Created measures, business transactions and dashboards for increased visibility into the environment, enabling quick response, analysis and escalation of issues to the appropriate vendor despite limited internal IT staffing.
  • Creation of custom web dashboards in Dynatrace 6.2 and sharing them with clients and NOC teams.
  • Creation of incidents, configuration of Extended Email Plugin and customization of email alerts.
  • Daily health assurance of the monitoring system as well as expert support for issue root cause analysis, Business impact analysis, client Dashboards, Reporting, and custom escalations to each 3rd-party vendor.
  • Admin and Migration from Dynatrace 6.2 to Dynatrace 6.5 and solving issues during plugin migrations.
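The JVM heap and garbage-collection analysis above works with exactly the metrics the JDK exposes through its management beans. A minimal sketch of sampling those figures in plain Java (the class and method names are illustrative, not taken from any project listed here):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Hypothetical helper showing the kind of heap/GC figures collected
// during JVM monitoring and garbage-collection analysis.
public class JvmHeapSnapshot {

    /** Current heap utilization as a fraction (0.0-1.0) of the max heap. */
    public static double heapUtilization() {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        long max = heap.getMax();
        // Fall back to committed size if -Xmx is not reported on this JVM.
        long limit = max > 0 ? max : heap.getCommitted();
        return (double) heap.getUsed() / limit;
    }

    /** Total GC cycles observed so far, summed across all collectors. */
    public static long totalGcCount() {
        long count = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            long c = gc.getCollectionCount(); // -1 when the collector does not report
            if (c > 0) count += c;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.printf("Heap utilization: %.1f%%, GC cycles: %d%n",
                100 * heapUtilization(), totalGcCount());
    }
}
```

Sampled periodically, these two figures are enough to spot the steadily climbing heap floor that signals a memory leak between GC cycles.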

Confidential

IS Manager - QA

Responsibilities:

  • As QA Manager, remained hands-on, manually testing most applications end to end: frontend, backend (including XML APIs and backend jobs), relational databases and ETL packages.
  • Proficient in various software quality assurance methodologies, with a strong emphasis on white-box testing, test automation, stress, reliability and performance testing, and on analyzing requirements for software design, development and validation.
  • Analyzed project and/or change documentation (BRD, FRD, TDD, CR) to determine and document appropriate test approach for complex multi-application/platform projects.
  • Proven ability to consistently estimate, plan, coordinate, execute and manage overall quality assurance testing activities throughout the software development lifecycle for medium to large projects, leading QA teams of various sizes with onshore/offshore team members.
  • Developed testing strategy, defined testing scope and testing milestones.
  • Conduct daily standup calls, track project status and liaison with required teams to address any risks/issues.
  • Identified and tracked risks, followed up on issues with all teams to closure in Clarity, and set mutually agreeable timelines.
  • Allocated resources to the project and assigned tasks according to the respective Agile/SDLC phases.
  • Worked closely with DBAs, Ops and other departments to find ways to improve our development and testing environments.
  • Provided reporting to staff, colleagues, senior management and clients on a regular basis to ensure all relevant and critical information was clearly and expeditiously shared.
  • Created the System Integration Test Methodology for the QA organization. Revised and improved upon current documentation standards.
  • Managed End to End regression testing activities in the UAT environment test team for Billing applications using HP QC (ALM).
  • Developed Vuser scripts (HTTP/HTML, Web Services and Java) using VuGen for all identified business processes.
  • Involved in testing SOAP/REST services using SoapUI.
  • Owned QA deliverables & managed release cycles
  • Consistently delivered quality results through development of test plans, test cases and test scripts, along with review of requirements for testability and traceability.
  • Excellent experience of working with Agile Methodologies.
  • Testing Experience: Adhoc, Automation, Configuration (Browser/OS), Conversion, Functional, Integration, Installation, End to End, Exploratory, Performance and Stress, Regression, Security and Access Control, Smoke, UI, Usability, User Acceptance and Production Deployment Validation.

Confidential

QA Team Lead

Responsibilities:

  • Interacted with business leads, solution architects and application teams to develop and mimic production usage models by collecting non-functional requirements for a multi-year rollout of a large-volume SOA.
  • Participated in Agile sprint planning meetings and estimated stories accurately with a proper understanding of the application; performance validation followed a ‘Work Done & Ready to Go’ approach release to release and sprint by sprint.
  • Organized status meetings with project stakeholders to ensure consistent processes and content across all performance testing.
  • Created a test plan document containing all performance test artifacts and obtained approval from the project stakeholders.
  • Integrated performance testing with various applications as well as within a cloud environment.
  • Develop robust scripts using HP LoadRunner Web (HTTP/HTML) and Web Services protocol based on the user workflows.
  • Extensively used the LoadRunner Virtual User Generator to script and customize the performance test harness with protocols such as Web (HTTP/HTML), Web Services, Siebel Web and RTE.
  • Advanced Skills in performing automated testing for JAVA, J2EE, Web, Web Services, RTE, and Web Applications.
  • Generated load by triggering virtual users with VuGen and created scenarios to conduct load tests using Performance Center.
  • Executed Baseline, Load, Stress, and Combined Testing using HP Performance Center.
  • After test execution, collaborated with the client's development, solution engineering, technical architecture and release management teams to analyze performance results, identify fixes and effectively pinpoint potential bottlenecks.
  • Reported performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time and page rendering time.
  • Utilized application server, database, network and WebLogic monitors during execution to identify bottlenecks, bandwidth problems, and infrastructure, scalability and reliability benchmarks.
  • Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to locate the source of performance issues.
  • Managing DynaTrace Servers and working as Admin. Creating custom Sensors, Reports and Dashboards.
  • Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
  • Profiled slow-performing areas of the application and system resources to identify bottlenecks and opportunities for performance improvement using the CA Wily Introscope tool.
  • Implemented automation using Selenium WebDriver, JAVA, Selenium Grid and TestNG.
  • Conducted application Profiling and JVM tuning for all builds delivered per each agile sprint.
  • Conducted application performance profiling at the method-call level and set priorities for code performance optimization; rooted out inefficient SQL calls and indexing issues for the DBA group.
  • Prepared regression suites for final regression at the end of each sprint using the ETL tool Pentaho (Kettle) and executed the regression automation scripts.
  • Involved in Java/Hibernate/Spring version upgrade project testing from Java6 to 8 and database upgrade testing from 11g to 12c.
  • Managed a team of 5 to 7 members, implementing and driving the performance testing process across multiple projects.

Environment: Performance Center 12.02/ALM, Load Runner 12/12.02, Dynatrace 6.1, Wily Introscope, Selenium, TestNG, Eclipse, Pentaho(Kettle), JIRA, Confluence, Core JAVA, J2EE, Web Logic Server, JSP, EJB, Web Services, Servlets, APP Dynamics, Oracle 11g and 12c, Unix.
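A recurring calculation behind scenario design in Performance Center is sizing the virtual-user count for a target throughput. A minimal sketch using Little's Law, assuming the average response and think times are known (the class name and figures are illustrative, not from any project above):

```java
// Hypothetical sizing helper: Little's Law says concurrency equals
// throughput multiplied by time-in-system (response time + think time).
public class VuserSizing {

    /**
     * Estimate the virtual users needed to sustain a target rate.
     *
     * @param targetTps       desired transactions per second
     * @param responseTimeSec average response time per transaction, in seconds
     * @param thinkTimeSec    scripted think time between iterations, in seconds
     */
    public static int requiredVusers(double targetTps,
                                     double responseTimeSec,
                                     double thinkTimeSec) {
        return (int) Math.ceil(targetTps * (responseTimeSec + thinkTimeSec));
    }

    public static void main(String[] args) {
        // e.g. 50 TPS at 2 s response time with 8 s think time -> 500 Vusers
        System.out.println(requiredVusers(50, 2.0, 8.0));
    }
}
```

The same arithmetic works in reverse for validating a scenario: given the Vuser count and observed response time, it predicts the throughput the test should achieve.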

Confidential

Software QA Engineer

Responsibilities:

  • Reviewed the Business Requirement Document for accuracy and completeness.
  • Worked closely with business users to understand, design and document the functional test plan, then wrote and executed tests and documented the results.
  • Captured requirements in the form of Use Cases, non-functional specifications and business rules.
  • Extensively involved in requirement analysis.
  • Created the conceptual diagrams and data flow diagrams based on the analysis of requirements.
  • All test cases were written based on business requirements, Use Cases and screen shots.
  • Maintained Traceability Matrix to track the requirements to the test cases to ensure complete test coverage.
  • Actively involved in various phases of the testing cycle such as Integration, Functional, Regression, System and End-To-End Testing.
  • Tested workflow rules, triggers, and page layouts.
  • Tested data migration requirements and analyzed data to be loaded from legacy systems to Salesforce.com.
  • Performed stress testing with 2 million transactions.
  • Compared transaction response times against SLAs; troubleshot issues by logging in to Unix boxes and checking logs.
  • Involved in monitoring Web, Application and DB servers during execution.
  • Prepared a test analysis report and presented it to the project team.
  • Mentor other team members on product functionality and domain knowledge.
  • Continuously monitor the testing progress with team members and make sure to meet the given deadlines.
  • Participated in daily stand up meetings.
  • Participated actively in retrospective meetings, suggesting ideas to improve the scrum process and achieve better results.
  • Involved in release planning meeting before the release and prepared the documents for the team to follow the testing during production.
  • Proactively came up with innovative methods to improve software quality, test coverage efficiency and regression coverage.
  • Ability to successfully manage multiple projects effectively through a combination of business and technical skills.
  • Handling production code deployment with APP teams.
  • Handling Interview process and training Interns.
  • Managing multiple applications and teams.

Environment: Salesforce.com, Apex, Visualforce, triggers, workflow, validation rules, SOSL, SOQL, AppExchange, report, dashboard, Force.com IDE, Java, JavaScript, XML, BusinessObjects Enterprise XI, IIS 7.0, .Net, Quality Center, Oracle 10g and Windows XP, 2000.

Confidential

Software QA Analyst

Responsibilities:

  • Captured requirements in the form of Use Cases, non-functional specifications and business rules.
  • Extensively involved in requirement analysis.
  • Created the conceptual diagrams and data flow diagrams based on the analysis of requirements.
  • All test cases were written based on business requirements, Use Cases and screen shots.
  • Maintained Traceability Matrix to track the requirements to the test cases to ensure complete test coverage.
  • Involved in the team meetings to discuss the issues and problems.
  • Participated in Walk through of test cases.
  • Validated Reports, Crystal Reports and Dashboards based on Functional Requirements.
  • Written SQL Queries and similar artifacts to validate the completeness, Integrity and Accuracy of the Reports, Crystal Report and Dashboards.
  • Defined the data cleaning strategy, based on exploratory data analysis reports, to be applied to the staging database before data was pushed to the operational data store.
  • Used SQL queries to verify data from the Oracle database using TOAD and SQL Developer.
  • Worked closely with application packagers and developers in reproducing the bugs reported.
  • Tested compatibility of application with Windows Vista, Internet Explorer and Chrome.
  • Reported software defects in Quality Center and interacted with the developers to resolve technical issues.
  • Created a test plan document containing all performance test artifacts and obtained approval from the project stakeholders.
  • Created Scripts using VUGEN.
  • Created and configured the performance test scenarios with the help of Controller in Performance center.
  • Compared the response times of transactions against the SLAs.
  • Analyzed the results by merging multiple graphs in the Analysis tool to identify bottlenecks.
  • Troubleshot issues by logging in to Unix boxes and checking logs.
  • Executed the performance test scenarios and monitored the performance metrics.
  • Involved in monitoring Web, Application and DB servers during execution.
  • Involved in root cause analysis for performance issues by collecting various Backend application logs to identify which back end system causing high response times.
  • Independently developed LoadRunner scripts according to test specifications/requirements.
  • Used LoadRunner to execute multi-user performance tests.
  • Developed reports and graphs to present the load/stress test results.
  • Assisted the UAT sessions with the users.
  • Verified that new or upgraded applications met specified performance requirements.

Environment: BusinessObjects Enterprise XI, IIS 7.0, .Net, Quality Center, Oracle 10g, Windows XP, 2000 and LoadRunner.
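The SLA comparison described above is typically made against a percentile of the measured response times rather than the mean. A minimal, hypothetical sketch of such a check using the nearest-rank percentile (names and sample numbers are illustrative):

```java
import java.util.Arrays;

// Illustrative SLA gate applied to transaction response times
// collected from a load test run.
public class SlaCheck {

    /** Nearest-rank percentile of the samples, p in (0, 100]. */
    public static double percentile(double[] samplesMs, double p) {
        double[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length); // nearest rank
        return sorted[Math.max(rank - 1, 0)];
    }

    /** True when the pth-percentile response time is within the SLA budget. */
    public static boolean meetsSla(double[] samplesMs, double p, double slaMs) {
        return percentile(samplesMs, p) <= slaMs;
    }

    public static void main(String[] args) {
        double[] timesMs = {120, 95, 310, 140, 180, 220, 160, 130, 450, 110};
        System.out.println("p90 = " + percentile(timesMs, 90) + " ms"); // prints "p90 = 310.0 ms"
        System.out.println("meets 400 ms SLA: " + meetsSla(timesMs, 90, 400));
    }
}
```

Using a high percentile (p90 or p95) rather than the average keeps one slow outlier visible instead of letting many fast transactions mask it.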

Confidential

Software Engineer- QC

Responsibilities:

  • Developed and executed test plans and test cases based on business requirements of the application.
  • Performed functional testing and created test scripts for various testing stages including Integration, Functional, Regression and System testing.
  • Involved in meetings with developers to resolve identified issues.
  • Reran test cases and test scripts for regression testing after bug fixes.
  • Used SQL queries to verify data in the SQL Server 2000 database.
  • Used PVCS for defect tracking.
  • Installed and configured Windows 98/2000 and MS Office tools.
  • Tested compatibility of the application with Internet Explorer 6.0 and Netscape Navigator 5.0.
  • Reported defects found during test cycles, tracked defects and retested fixed programs.

Environment: Win 2K, IIS, .Net, VB COM+, ASP, LDAP, C++, SQL Server 2000, PVCS, Netscape Navigator 5.x, IE 6.x.

Confidential

Software Engineer- QC

Responsibilities:

  • Prepared and executed test cases based on the User Requirements Document (URD) and prepared test procedures.
  • Captured requirements in the form of use cases, non-functional specifications and business rules.
  • Prepared scripts to automate the testing process using Rational Robot 2003.
  • Performed back-end testing to verify and validate data; data verification and data integration testing were done using SQL queries.
  • Tested text hyperlinks on the home page and other pages, and tested the functionality of each screen to confirm proper navigation.
  • Used PVCS to store, track and report bugs.
  • Worked closely with developers in reproducing reported bugs.
  • Involved in Functionality, Black-box, Integration, System, Regression and Compatibility testing.
  • Consulted with end users and customers to develop solutions to meet business needs and new business requirements.

Environment: Rational Robot 2003, Win 2k, IIS, ASP. Net, VB COM, Sybase 12.5.2 and PVCS.
