- IT professional with 12+ years of experience in QA analysis and performance testing of web-based applications in cross-platform environments such as Linux and Java.
- Strong analytical working knowledge as a QA Analyst for various business applications in cross-platform environments such as Linux and Java. Solid knowledge of process improvement, methodologies, and the traditional software life cycle.
- Highly self-motivated, goal-oriented, and committed to pursuing a long-term career in QA. Excellent written and verbal communication skills. Ability to multi-task and learn complex procedures quickly.
- Good understanding of the Agile software development life cycle (iterative and incremental)
- Ability to multitask and prioritize work to meet deadlines
- Solid knowledge of the SDLC and PTLC (Performance Testing Life Cycle).
- Extensive experience coordinating testing efforts; responsible for test deliverables, status reporting to management, and issue escalation.
- Good experience replicating problems and generating bug reports using Jira.
- Active participation in Sprint planning meetings to decide which stories to develop for the current sprint
- Improved team delivery commitments and sprint capacity planning by identifying and tracking hidden tasks, which increased customer satisfaction.
- Strong knowledge of JIRA for managing Scrum and Kanban projects
- Collaborated with members of the Product, Business and Engineering Teams to develop and maintain Product Backlogs.
- Comprehensive knowledge of the full system development life cycle and structured software testing phases
- Utilized Agile Scrum practices to help the team increase their velocity by 65% within the first year of Agile adoption
- Worked with the Design team to incorporate UX best practices/design to ensure a more comprehensive and innovative approach.
- Experience preparing and executing manual test scripts/cases and documenting the results.
- Review and understand business requirements, functional specifications, and system design documents with the goal of performing a structured validation of all testable conditions.
- Develop detailed system test plans and test cases from business and technical specifications of the systems.
- Perform failover testing with Operations and regression testing for every new release
- Knowledge of developing UNIX shell scripts and Linux/UNIX commands
- Extensive knowledge of back-end database testing and writing SQL queries.
- Proven ability to build strong relationships with staff and clients using negotiation skills.
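The back-end data validation with SQL mentioned above can be sketched as follows. This is a minimal, hypothetical illustration using an in-memory SQLite database; the table, column names, and the UI-reported count are invented for the example, not taken from any real project.

```python
import sqlite3

# Hypothetical back-end validation: confirm that a count shown in the
# front end matches what a SQL query returns from the database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, status TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "SHIPPED"), (2, "PENDING"), (3, "SHIPPED")])

# Suppose the UI reports 2 shipped orders; verify the back end agrees.
ui_shipped_count = 2
cur.execute("SELECT COUNT(*) FROM orders WHERE status = 'SHIPPED'")
db_shipped_count = cur.fetchone()[0]
assert db_shipped_count == ui_shipped_count, "front end and back end disagree"
conn.close()
```

In practice the same pattern applies against the application's real database driver, with the expected value taken from the UI or a requirements document rather than hard-coded.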
Operating Systems: Windows Server, RHEL 6/7, and macOS
Software: Java, JSP, C, Visual Basic, PL/SQL, MS SQL Server, and Oracle
Networking: Cisco, TCP/IP, UDP, HTTP/HTML, POP3, IMAP4, SMTP, SSL, FTP, Telnet, and DHCP
Performance Tools: LoadRunner 12.50, 12.63, and 2020
Monitoring Tools: Dynatrace, Splunk, and LoadRunner Analysis
Confidential, Holmdel, NJ
- Create comprehensive performance test plans and specifications so that all critical performance capabilities of the system under test are assessed.
- Responsible for planning and designing performance testing, including load, scalability, endurance/soak, spike, and stress testing for multiple projects.
- Interact with the stakeholder team to implement application and infrastructure workloads for performance testing, including SLAs (Service Level Agreements)
- Set up and configure APM tools such as Dynatrace OneAgent and ActiveGates for application monitoring on Windows and Linux servers and clients
- Very good experience with Dynatrace modules and 3rd-party integrations such as load testing tools, LoadRunner, and JIRA Software.
- Use various APM tools (Dynatrace, Micro Focus Analysis, and Splunk) to identify system performance issues such as CPU, memory, and disk utilization, and application performance metrics such as maximum number of users, transaction response time, throughput, pages per minute, and hits per second.
- Create and record LoadRunner scripts using the HTTP/HTML and DevWeb protocols, including the DevWeb proxy recorder (a utility in DevWeb standalone and the offline generator)
- Customize VuGen scripts: validation, parameterization, correlation, and modularization
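The correlation and parameterization steps above can be illustrated in miniature. In VuGen this is done in C with functions such as `web_reg_save_param_ex`; the sketch below uses plain Python to show the same idea under invented data: capture a dynamic value (a CSRF token here) from one response between left/right boundaries, then substitute it into the next request instead of a hard-coded recorded value.

```python
import re

# Hypothetical server response containing a dynamic value that a
# recorded script would otherwise replay stale.
response_body = '<input name="csrf_token" value="a1b2c3d4"/>'

# "Correlation": capture the dynamic value using boundaries,
# analogous to a VuGen correlation rule.
match = re.search(r'name="csrf_token" value="(.*?)"', response_body)
assert match is not None, "correlation boundary not found"
token = match.group(1)

# "Parameterization": the captured token replaces the hard-coded
# value in the follow-up request.
next_request = f"POST /submit?csrf_token={token}"
```

The same capture-then-substitute pattern is what makes a recorded script replay correctly for every virtual user.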
- Design scenarios for load testing applications under the expected load with the expected configuration.
- Identify application performance bottlenecks using application server, web server, database server, and network monitors; find the trouble areas in the scenario that cause increased response times, including measurements (response time, throughput, hits/sec, and network delay graphs)
- Analyze load test result statistics such as maximum running Vusers (concurrent users), total throughput, average throughput (per second), total hits, and average hits (per second).
- Analyze transaction behavior and average response times against the SLA (Service Level Agreement) and analyze transaction mechanisms with graphs.
- Periodically review workloads and SLAs with the stakeholder team against application performance test results, ensuring accuracy, clarity, and completeness.
- Write test results summary reports for the stakeholder team and attend weekly meetings with the PT team to analyze the report, recommendations, faults found in the application, pending issues, problems and risks, and conclusions.
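The statistics and SLA comparison described above reduce to simple arithmetic over per-transaction samples. The sketch below is illustrative only: the sample data, the 60-second duration, and the 2.0-second SLA are invented, standing in for what LoadRunner Analysis computes from real results.

```python
# Hypothetical raw samples: one record per completed transaction.
samples = [
    {"transaction": "login",    "response_s": 1.2},
    {"transaction": "checkout", "response_s": 2.6},
    {"transaction": "login",    "response_s": 0.9},
]
duration_s = 60                       # assumed test duration
total_hits = len(samples)
avg_hits_per_s = total_hits / duration_s

# Compare each transaction's average response time against the SLA.
sla_s = 2.0
by_txn = {}
for s in samples:
    by_txn.setdefault(s["transaction"], []).append(s["response_s"])

verdicts = {}
for txn, times in sorted(by_txn.items()):
    avg = sum(times) / len(times)
    verdicts[txn] = "PASS" if avg <= sla_s else "FAIL"
```

With these numbers, `login` averages 1.05 s (within SLA) while `checkout` averages 2.6 s (SLA breach), which is the kind of per-transaction verdict reported to stakeholders.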
- Monitor and track Splunk performance problems and administration, and open tickets with Splunk when needed.
- Responsible for scheduling and automating database task jobs, alerts, and email notifications
- Prepared, arranged, and tested Splunk search strings and operational strings
- Good knowledge of building Splunk apps for custom application environments.
- Created dashboards to monitor CPU performance peaks and memory leaks.
- Loaded LoadRunner results log files into Splunk Enterprise; created and configured management reports and dashboards.
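One common preparation step when feeding LoadRunner result logs into Splunk is normalizing each record into `key="value"` pairs, which Splunk's automatic field extraction handles well. The sketch below assumes a hypothetical comma-separated input layout; a real results export would need its actual field order.

```python
# Hypothetical LoadRunner results line: timestamp, transaction,
# response time (s), status.
raw = "2020-05-01 10:15:02,login,1.234,PASS"
ts, txn, resp, status = raw.split(",")

# Emit a key="value" event that Splunk can index and extract
# fields from without custom props.conf rules.
event = f'time="{ts}" transaction="{txn}" response_s={resp} status="{status}"'
```

Once indexed in this form, dashboards can chart `response_s` over time or count `status="FAIL"` events per transaction.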
- Responsible for functional and non-functional testing of the following modules: the Confidential business web admin portal, Salesforce, Confidential mobile apps, and CRMs and plugins.
- Excellent working knowledge of Agile Scrum and Kanban projects
- Primary accountable person for ensuring a full-team approach to automated and exploratory testing in a continuous delivery environment.
- Ongoing role within a small agile software product development team.
- Extensive working experience with epics, issues, stories and story descriptions, acceptance criteria, and sub-tasks, including story sizing (points)
- Good understanding of backlog refinement/grooming
- Excellent knowledge of Agile project management methodologies, in which projects are broken down into sprints or iterations
- Play a major role in improving the quality, functionality, reliability, and usability of software products and 3rd-party software integrations
- Interact with product development teams to evaluate system interfaces and the operational and performance requirements of the overall system
- Execute tests and analyze test results to assure the quality of existing and new functionality
- Creating and expanding automated regression tests
- Participate in continuous improvement of testing processes and procedures by analyzing reported bugs, assessing test coverage and project execution in those areas, and reviewing current processes and practices
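The automated regression testing mentioned above follows a simple pattern: pin known-good outputs and re-run them on every release. The sketch below is a minimal, hypothetical example; `discount_price` is an invented stand-in for application logic under test, not a function from any real project.

```python
# Hypothetical application function under regression test.
def discount_price(price: float, percent: float) -> float:
    """Apply a percentage discount, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

# Pinned expected values: any behavior change in a new release
# fails one of these cases, flagging a regression.
REGRESSION_CASES = [
    ((100.0, 10), 90.0),
    ((19.99, 0), 19.99),
    ((50.0, 100), 0.0),
]

for args, expected in REGRESSION_CASES:
    actual = discount_price(*args)
    assert actual == expected, f"regression: {args} -> {actual}, expected {expected}"
```

In a real suite these cases would live in a test framework (e.g. pytest or unittest) and run in CI on every commit, which is what makes them usable in a continuous delivery environment.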
- Developed and documented all test plans and specifications for smooth integration into systems.
- Testing of software, including regression, non-functional, and integration testing
- Browser compatibility testing across various browser versions for the VBC Web Admin Portal and Salesforce.
- Excellent knowledge of JIRA, including entering and managing bugs
- Extensive understanding of front-end and back-end data validation testing using SQL queries.