
DevOps/Prod Support Engineer Resume


SUMMARY

  • Experience with performance monitoring tools such as CA APM, App, Gomez, SiteScope, and HP Diagnostics.
  • Experience in application testing using automation tools such as Mercury WinRunner, LoadRunner, Quality Center, RSW e-Tester, SQA TeamTest, Rational Suite TestStudio and PerformanceStudio, and Borland SilkTest/SilkPerformer R2.
  • Knowledge of basic ITIL framework and processes.
  • Strong experience in SDLC and ATLC (Automated Testing Life Cycle).
  • Experience with Mercury tools and BPM (Business Process Monitoring).
  • Experience in project management.
  • Strong experience with HIPAA, GxP (GLP, GMP), and FDA rules and regulations.
  • Experience in SQA Policies and procedures.
  • Experience with WebLogic 8.1 Portal Server and wireless hardware/networks.
  • Experience in IBM WebSphere application server.
  • Experience in financial services, including trading systems, equity derivatives, risk management, and regulatory reporting.
  • Strong understanding of market data, quotes, stocks, bonds, options, fixed income, and data feeds.
  • Strong knowledge of SOX-relevant applications and of how Sarbanes-Oxley (SOX) efforts improve the efficiency of financial and business processes.
  • Experience in Messaging Services.
  • Strong experience writing object-oriented code (C++, Java).
  • Actively involved in all project phases: functional specification, design, program specification, test plans, test cases, test scripts, unit testing, system and stress testing, end-user interaction, and documentation.
  • Experience with database methodologies, i.e., data modeling (physical and logical) and data warehousing.
  • Strong experience writing test plans and procedures following software development methodologies (IEEE, RUP, and CMM).
  • Strong experience writing UNIX shell scripts (ksh and csh) and Perl scripts.
  • Experience writing TSL (Test Script Language) scripts.
  • Worked on end-to-end Software Methodology and Testing processes (preparing Test plans, Test scripts, Execution of Test scripts, Reporting and Debugging).
  • Web development using C, C++, Java, Swing, EJB, WebLogic, WML, WAP, HDML, JDBC, Servlets, JavaBeans, RMI, HTML, XHTML, DHTML, Oracle, and MS SQL Server.
  • Good knowledge of SEI/CMM (Capability Maturity Model) Level 5.
  • Ability to work independently and as part of a group.
  • Experience in working on Portal Applications.
  • Experience with media servers and networking (TCP/IP, DNS, DHCP, Cisco switches/routers, F5 BIG-IP load balancers).
  • Experience with networking protocols (HTTP, TCP/IP, IMAP, POP3, SNMP, FTP, UDP) and the network layer, as well as routers, switches, bridges, and packets.

TECHNICAL SKILLS

Testing Tools: HP Business Availability Center, BTM, BPM, SiteScope, Diagnostics, RUM, WinRunner, LoadRunner, Performance Center, Test Director, SQA TeamTest, Rational Suite TestStudio, Rational Performance, Segue SilkTest, SilkPerformer, Parasoft Jtest, RSW e-Tester, e-Load, ClearCase, ClearQuest, Compuware, JUnit 4, PVCS Tracker

Programming Languages: C, C++, Java, RPG/400, SQL, PL/SQL, XML, TSL, Perl, COBOL

Scripting Languages: HTML, JHTML, XHTML, DHTML, JavaScript, and VBScript

Databases: Oracle 12, DB2, Sybase, SQL Server 6.5/7.0, SQL, PL/SQL, MS Access, Informatica tools

Application Servers: WebLogic App Server, Dynamo App Server, IBM WebSphere

Web Servers: Personal Web Server, IIS, Netscape, and Java Web Server

GUI: Developer/2000, Symantec Visual Café 2.0/3.0, PowerBuilder, Visual Basic 6.0, ASP, .NET, Delphi

Web Tools: Visual InterDev 6.0, MS FrontPage 98

Technologies: COM, DCOM

Messaging: MQ Series (monitoring)

Operating Systems: Novell, Solaris 2.8, HP-UX, AIX, Linux, Windows 2000/NT/XP/Vista, Mainframe

PROFESSIONAL EXPERIENCE

Confidential

DevOps/ Prod Support Engineer

Responsibilities:

  • Working in DevOps production support for NFCU's BPA production applications.
  • Investigating and resolving operational problems in support of the production support teams.
  • Maintaining and monitoring the production business process applications.
  • Troubleshooting and providing technical/operational support and problem resolution to the business.
  • Performing the technical tasks related to process automation.
  • Interfacing and working in partnership with the BPA developers, the IT infrastructure (middleware) team, and the database team.
  • Providing post-implementation ongoing support for various automated business processes.
  • Developing new techniques and improved processes, materials, or products.
  • Maintaining a good understanding of Confidential Federal's business plan, business processes, and systems.
  • Working with tools commonly used at NFCU, such as HP PPM, HP ITSM, and SharePoint.
  • Working with the monitoring team to set up and configure monitoring for various BPM applications.
  • Working on CA APM and HP SiteScope monitoring setup and configuration.
  • Strong analytical ability and communication skills to present ideas clearly; knowledge of industry best practices.
  • Following financial industry trends, directions, and standards that guide new technology offerings; demonstrated knowledge of and experience in strategic planning.
  • Leading, guiding, and coaching professional staff.

Environment/Technologies: Windows, IBM BPM 8.5.5/7, DB2, IBM WebSphere 8.5, Oracle, ITSM, Splunk, IBM Process Designer, Inspector, REST API Tester, Process Console, Admin Console, CA APM, SiteScope, HP BSM.

Confidential

Sr. Application Monitoring Engineer

Responsibilities:

  • Responsible for designing and implementing application performance monitoring for the Confidential production environment.
  • Building and maintaining automation scripts for CA APM Cloud Monitor and dashboards for integrated environment health checks.
  • Involved with the change management process to control changes to the integrated test/Pstage systems with regard to physical or system updates.
  • Responsible for creating various application and database monitors using CA APM.
  • Responsible for creating alerts and maintenance windows using CA APM.
  • Responsible for middleware (WebSphere, EBS, BPM) monitoring and for creating dashboards using CA APM.
  • Responsible for configuring and creating monitors for Confidential's IBM BPM and WebSphere environments and creating dashboards using CA Wily.
  • Working on IBM Business Process Manager (BPM) with the middleware team to configure, set up, and install the CA APM agent.
  • Responsible for platform tuning and for supporting infrastructure running on WebSphere Application Server (Linux/AIX).
  • Responsible for configuring and creating CDM (CPU, disk space, memory) monitors using CA UIM.
  • Performing deep statistical analysis of performance data to help identify capacity and performance bottlenecks.
  • Responsible for setup and analysis of monitoring and incident detection using Wily Introscope.
  • Responsible for creating various application dashboards using CA Wily Introscope.
  • Responsible for installing and upgrading the CA Application Performance Management (APM) environment from version 10.3 to 10.5.

Environment/Technologies: Windows, UNIX, AIX, DB2, IBM WebSphere 8.5, IBM BPM, Oracle, CA APM, CA Wily Introscope, CA UIM, CDM, CA Nimsoft database

Confidential

Sr. Performance and Monitoring Engineer

Responsibilities:

  • Responsible for designing and implementing performance, scalability, stability, and stress tests using industry standard tools, and custom solutions.
  • Responsible for performance testing and test strategy for complex enterprise applications built on Java and IBM WebSphere.
  • Responsible for the performance test architecture for EJB and mainframe DB2 components.
  • Responsible for performance testing using the JMeter framework.
  • Responsible for simulating and testing legacy mainframe CICS transactions, Enterprise JavaBeans, and REST APIs using JMeter's distributed testing methodology.
  • Used JMeter's master/slave distributed testing method to simulate production systems (a simplified load-generation sketch follows this list).
  • Performing deep statistical analysis of performance data to help identify capacity and performance bottlenecks.
  • Responsible for setup and analysis of monitoring and incident detection using Wily Introscope.
  • Responsible for Oracle database tuning and monitoring using OEM (Oracle Enterprise Manager).
  • Responsible for triaging performance bottlenecks and tuning enterprise Java applications.
  • Working on end-to-end performance test strategy, including capacity, load, and stress scenarios.
  • Responsible for improving and creating high-performance software products.
  • Responsible for understanding issues of scale and performance in large-scale cloud environments and for identifying, debugging, and proposing viable solutions.
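
As a rough illustration of the distributed load testing described above (the actual tests were built and driven in JMeter master/slave mode), the following minimal Java sketch runs a pool of virtual users against a hypothetical REST endpoint and reports an average response time. The endpoint URL, user count, and request count are illustrative assumptions, not the real test assets.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class SimpleRestLoadDriver {
    // Hypothetical endpoint used only for illustration.
    private static final String ENDPOINT = "https://perf-test.example.com/api/claims/status";
    private static final int VIRTUAL_USERS = 50;
    private static final int REQUESTS_PER_USER = 20;

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(VIRTUAL_USERS);
        AtomicLong totalMillis = new AtomicLong();
        AtomicLong completed = new AtomicLong();

        Runnable virtualUser = () -> {
            for (int i = 0; i < REQUESTS_PER_USER; i++) {
                HttpRequest request = HttpRequest.newBuilder(URI.create(ENDPOINT)).GET().build();
                long start = System.nanoTime();
                try {
                    // A real test would also assert on status code and payload.
                    client.send(request, HttpResponse.BodyHandlers.ofString());
                    totalMillis.addAndGet((System.nanoTime() - start) / 1_000_000);
                    completed.incrementAndGet();
                } catch (Exception e) {
                    System.err.println("Request failed: " + e.getMessage());
                }
            }
        };

        for (int u = 0; u < VIRTUAL_USERS; u++) {
            pool.submit(virtualUser);
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);

        System.out.printf("Completed %d requests, average response time %d ms%n",
                completed.get(), totalMillis.get() / Math.max(1, completed.get()));
    }
}
```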

Environment/Technologies: Windows, UNIX, Mainframe, DB2, CICS, IBM WebSphere 8.5, Oracle Exadata, CA APM, CA Wily Introscope, Oracle OEM, JMeter, HP LoadRunner, VuGen (12.5), Java, J2EE, EJB, JSON, Web Services, REST API, load balancing, Postman, Kibana, Elasticsearch

Confidential

Sr. Performance and Monitoring Engineer

Responsibilities:

  • Responsible for designing and implementing performance, scalability, stability, and stress tests using industry standard tools, and custom solutions.
  • Solely responsible for performance testing of Confidential's various environments and applications (aviation, oil and gas, transportation, etc.).
  • Responsible for developing the performance test plan to ensure that the scope of performance testing scenarios meets customer objectives and that performance requirements meet the SLAs.
  • Responsible for working with the development team and gathering all performance requirements.
  • Responsible for developing performance test scripts using HP VuGen (LoadRunner 12.5) and JMeter.
  • Responsible for executing performance test scripts at various load levels for every release to ensure there is no performance degradation relative to the previous release.
  • Tested HTTP headers to detect the presence of a proxy server or other load balancer and to verify that load balancing was properly implemented (see the header-probe sketch after this list).
  • Responsible for end-to-end analysis, from performance requirement gathering and test plan creation through test script execution and publication of performance test results.
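
A rough sketch of the header check described above: repeat a request against a hypothetical URL and print headers that proxies and load balancers commonly add. Which headers actually appear depends entirely on the balancer in use, so the header names here are assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LoadBalancerHeaderProbe {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL used only for illustration.
        String url = "https://app.example.com/health";
        HttpClient client = HttpClient.newHttpClient();

        for (int i = 0; i < 10; i++) {
            HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
            HttpResponse<Void> response =
                    client.send(request, HttpResponse.BodyHandlers.discarding());

            // Headers commonly added by proxies/load balancers; names vary by product.
            System.out.printf("#%d status=%d Via=%s X-Served-By=%s Server=%s%n",
                    i,
                    response.statusCode(),
                    response.headers().firstValue("Via").orElse("-"),
                    response.headers().firstValue("X-Served-By").orElse("-"),
                    response.headers().firstValue("Server").orElse("-"));
        }
    }
}
```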

Environment/Technologies: Windows, Mainframe, DB2, IBM WebSphere 8.5, CA Introscope, Oracle OEM, IOS, HP LoadRunner, VuGen (12.5), JMeter, Java, JSON, Cloud Foundry, AWS, Java SE/EE, web technologies, Web Services/XML/microservices, REST API, SoapUI, load balancing, Postman, big data, New Relic, Maven, Jenkins, Kibana, Spark, Kafka, messaging, Rally, Git, Confluence, Jira

Confidential

Lead Performance and Monitoring Engineer

Responsibilities:

  • Worked as a performance engineer and was responsible for the performance testing of multi-tier IT software systems.
  • Responsible for working with the data mart team that replicated the PECOS database; using Informatica, data was extracted from the PECOS system and transformed into dimensional data structures based on reporting needs.
  • Installed, configured, and troubleshot issues with CA APM 9.x/10.x.
  • As a senior performance engineer, worked with various applications (PECOS, HITECH).
  • Responsible for configuring PowerPacks for different application servers (WebSphere, JBoss, Apache, WebSphere Portal Server, WebSphere MQ) to administer applications with the CA APM suite.
  • Worked with applications requiring Section 508 compliance.
  • Responsible for gathering, reviewing, and analyzing performance requirements (functional and non-functional) from the application team; worked closely with application developers and the database, ETL, environment, and network configuration management teams.
  • Responsible for developing performance scripts using HP's LoadRunner tools and JAWS for Section 508 support.
  • Responsible for reviewing technical architecture requirements, the performance test plan, and test environments.
  • Responsible for creating/reviewing the performance test plan to ensure the scope of performance testing scenarios meets customer objectives and requirements in a cost-effective manner.
  • Responsible for developing performance test scripts using HP VuGen (LoadRunner 12.5).
  • Responsible for developing and executing ongoing performance, scale, and endurance tests relied upon for certification of product releases.
  • Responsible for creating/reviewing performance test completion reports for accuracy, completeness, and readability, ensuring appropriate risks and issues are identified and results are clear and accurate.
  • Responsible for monitoring patterns and thresholds and triggering alerts when specific conditions arise, especially for finding bottlenecks while executing performance tests.
  • Responsible for monitoring server statistics (web server, app server, JVM, connection pools, thread pools, etc.).
  • Responsible for performance requirements traceability metrics.
  • Responsible for working with WAS admins, DB admins, and the network team on analysis and performance tuning.
  • Worked on Java application tuning, including JVM and heap size monitoring for application performance analysis.
  • Generated heavy traffic to observe whether requests started routing elsewhere or response headers changed, in order to verify load balancing behavior.
  • Responsible for monitoring and tuning the database server with the database team to ensure good application performance.
  • Responsible for executing release performance tests and for gathering, reviewing, and analyzing performance test reports.
  • Worked with the automation team on the automated Build Acceptance Test (BAT), developed using a combination of Selenium, Cucumber, and Gherkin (a step-definition sketch follows this list).
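
A minimal sketch of what a BAT step-definition class might look like with Selenium WebDriver and Cucumber-JVM. The Gherkin wording, URL, and element ids below are illustrative assumptions, not the project's actual scenarios.

```java
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import static org.junit.Assert.assertTrue;

/**
 * Backs a Gherkin scenario along the lines of:
 *   Given the user is on the login page
 *   When the user signs in with valid credentials
 *   Then the landing page is displayed
 */
public class LoginSteps {
    private WebDriver driver;

    @Given("the user is on the login page")
    public void openLoginPage() {
        driver = new ChromeDriver();
        driver.get("https://bat.example.com/login");  // hypothetical URL
    }

    @When("the user signs in with valid credentials")
    public void signIn() {
        driver.findElement(By.id("username")).sendKeys("bat_user");      // element ids are assumptions
        driver.findElement(By.id("password")).sendKeys("bat_password");
        driver.findElement(By.id("loginButton")).click();
    }

    @Then("the landing page is displayed")
    public void verifyLandingPage() {
        assertTrue(driver.getCurrentUrl().contains("/home"));
        driver.quit();
    }
}
```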

Environment/Technologies: Windows, HP LoadRunner, VuGen (12.5), ALM Performance Center, SOASTA, JAWS, XML, JSON, Oracle 12, Oracle Fusion, load balancing, cloud, AWS, middleware, SQL, stored procedures, ETL, Informatica, JavaScript, HTTP/HTTPS, SOAP, TCP/IP and SNMP, Web Services, Java, J2EE, JVM, heap dumps, New Relic, Wily Introscope, Selenium, Cucumber/Gherkin

Confidential

Senior Performance and Monitoring Engineer

Responsibilities:

  • As a senior performance and monitoring engineer, worked with Confidential healthcare applications.
  • Worked with the HAIMS application and its Section 508 compliance.
  • Involved in gathering, reviewing, and analyzing performance requirements (functional and non-functional) from the application team; worked closely with application developers and the database, environment, and network configuration management teams.
  • Responsible for reviewing technical architecture requirements, performance test plans, and test environments.
  • Monitored the completion of tasks within time and cost constraints and ensured technical and functional objectives were met.
  • Responsible for creating/reviewing test plans to ensure the scope of testing scenarios meets customer objectives and requirements in a cost-effective manner.
  • Responsible for developing performance test scripts using VuGen (LoadRunner 12.02).
  • Responsible for developing performance scripts using HP's LoadRunner and JAWS for Section 508 support.
  • Responsible for developing and executing ongoing performance, scale, and capability tests relied upon for certification of product releases.
  • Load balancing for the application was achieved through DNS load balancing (see the DNS lookup sketch after this list).
  • Used Splunk Enterprise to analyze real-time and historical data from various server logs and to create dashboards and reports.
  • Involved in setting up and running performance monitoring for the application.
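
Since the application relied on DNS load balancing, one quick sanity check is confirming that the hostname resolves to several addresses. The Java sketch below does that lookup; the hostname is a placeholder, not the real one.

```java
import java.net.InetAddress;

public class DnsRoundRobinCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical hostname; with DNS load balancing it should resolve to multiple A records.
        String host = "app.example.com";
        InetAddress[] addresses = InetAddress.getAllByName(host);

        System.out.println(host + " resolves to " + addresses.length + " address(es):");
        for (InetAddress address : addresses) {
            System.out.println("  " + address.getHostAddress());
        }
    }
}
```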

Environment/Technologies: Windows, HP LoadRunner, VuGen (12.02), SiteScope, XML, JSON, MS SQL, JavaScript, HTTP/HTTPS, SOAP, load balancing, Web Services, JAWS, SQL Trace, Splunk, .NET.

Confidential

Senior Performance Test Engineer

Responsibilities:

  • Worked on a performance testing team and was responsible for the performance testing of multi-tier IT software systems.
  • As a senior performance engineer, worked with various CMS.gov applications, including CMS websites, the consumer-facing Internet portal (RBIS), and HealthCare.gov.
  • Followed Section 508 guidelines for federal government applications, e.g., ensuring the Health Plan and Other Entity Enumeration System is compliant with the applicable Section 508 guidelines.
  • Worked extensively with multiple technical disciplines, across multiple platforms.
  • Involved in gathering, reviewing, and analyzing performance requirements (functional and non-functional) from the application team; worked closely with application developers and the database, environment, and network configuration management teams.
  • Responsible for reviewing technical architecture requirements, performance test plans, and test environments.
  • Monitored the completion of tasks within time and cost constraints and ensured technical and functional objectives were met.
  • Troubleshot issues such as slow performance using Wily Introscope and CEM for applications deployed on different application servers.
  • Expertise in configuring different application servers (WebSphere, JBoss, Apache, WebSphere Portal Server, WebSphere MQ) to administer applications with the CA APM suite.
  • Was on call on a rotational basis to support application teams as needed, triaging issues using Wily Introscope or troubleshooting environmental issues on the hosts where Wily Introscope ran.
  • Installed, configured, and maintained Introscope Enterprise Manager and Manager of Managers, with agents supporting both production and test environments.
  • Responsible for creating/reviewing test plans to ensure the scope of testing scenarios meets customer objectives and requirements in a cost-effective manner.
  • Responsible for developing performance test scripts using VuGen (LoadRunner 11.5) and SOASTA CloudTest.
  • Responsible for developing and executing ongoing performance, scale, and endurance tests relied upon for certification of product releases.
  • Responsible for the design and development of performance tuning for SQL, PL/SQL, stored procedures, the JVM, and the database.
  • Responsible for working with WAS admins, DB admins, and the network team on analysis and performance tuning.
  • Worked on Java application tuning, including JVM and heap size monitoring for application performance analysis (see the heap-usage sketch after this list).
  • Responsible for monitoring and, where necessary, tuning the database server to ensure good application performance.
  • Responsible for executing release performance tests and for gathering, reviewing, and analyzing performance test reports.
  • Worked with load balancing to make sure load was distributed properly across all servers.
  • Used Splunk Enterprise to analyze real-time and historical data.
  • Responsible for creating/reviewing test completion reports for accuracy, completeness, and readability, ensuring appropriate risks and issues are identified and results are clear and accurate.
  • Developed performance test scripts using LoadRunner's VuGen and SOASTA CloudTest.
  • Performed real-time monitoring of patterns and thresholds using Wily Introscope, triggering alerts when specific conditions arose, especially to find bottlenecks.
  • Worked with the automation team on the automated Build Acceptance Test (BAT), developed using a combination of Selenium, Cucumber, and Gherkin.
  • Responsible for monitoring server statistics and collecting metrics using Wily Introscope.
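
As a simplified illustration of the JVM heap monitoring mentioned above, the sketch below samples heap usage through the standard java.lang.management API. In practice these numbers came from Wily Introscope and HP Diagnostics agents attached to the application server JVMs rather than hand-written code.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapUsageLogger {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();

        // Sample heap usage once per second; during a load test this trend is compared
        // against monitoring-tool metrics to spot leaks or an undersized heap.
        for (int i = 0; i < 60; i++) {
            MemoryUsage heap = memoryBean.getHeapMemoryUsage();
            long usedMb = heap.getUsed() / (1024 * 1024);
            long maxMb = heap.getMax() / (1024 * 1024);   // getMax() may be -1 if undefined
            System.out.printf("heap used=%d MB max=%d MB%n", usedMb, maxMb);
            Thread.sleep(1000);
        }
    }
}
```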

Environment/Technologies: Windows, HP LoadRunner, VuGen (11.5), Performance Center, SoapUI, SOASTA CloudTest, XML, JSON, MS SQL, PL/SQL, stored procedures, MongoDB, JavaScript, HTTP/HTTPS, SOAP, Web Services, Java, J2EE, JVM, heap, connection pools, load balancing, Apache, Tomcat, SQL Trace, Splunk, HP Diagnostics, Wily Introscope, .NET, 3-tier applications

Confidential

Lead Performance and Monitoring Engineer

Responsibilities:

  • Led a performance testing team and was responsible for the performance testing of multi-tier IT software systems; partnered with project management on resource ramp-up/ramp-down, identifying resources/skills, resource allocation, task assignments, and managing groups/teams.
  • Involved in developing effective relationships with key stakeholders across business and technical teams and negotiating expected outcomes.
  • Responsible for identifying software and system risks associated with business processes and IT solution architecture, and for guiding the team in selecting optimal risk mitigation techniques based on system changes.
  • Monitored the completion of tasks within time and cost constraints and ensured technical and functional objectives were met.
  • Responsible for owning engagement level-of-effort estimation.
  • Responsible for organizational financials and management of forecast accuracy against full-year actual spend.
  • Partnered with customers to assist in planning and forecasting testing engagements using historical testing data, business plans, and forecasts.
  • Responsible for planning and managing a team of performance testers, local and at remote locations, to support projects and day-to-day operations.
  • Responsible for ensuring proper and timely time management at the test lead and test engineer level.
  • Responsible for monitoring adherence to quality standards in development, testing, and business processes and for producing quantifiable metrics to measure success; ensured test deliverables were prepared per enterprise guidelines.
  • Responsible for implementation and maintenance of robust test metrics that provide a clear understanding of progress and drive continuous improvement.
  • Responsible for implementing improvements within the quality assurance program and operational systems, with measurable quality indicators.
  • Responsible for preparing and discussing quality reports with management.
  • Analyzed product requirements with members of the project/product delivery team and devised the testing strategy, aligning a risk-based test approach to mitigate risk exposure within all phases of testing and utilizing automated performance testing tools.
  • Worked with load balancing to make sure load was distributed properly across all servers.
  • Worked with members of the project/product delivery team to devise the performance testing strategy to exercise and verify the application performance based on established test success criteria.
  • Ensured test success criteria are defined clearly to facilitate the 'pass/fail' indicator to test executions.
  • Worked with members of the project/product delivery team to determine whether there was sufficient risk associated with the delivery to warrant initiating a performance test engagement.
  • Reviewed performance test environment details, including data, and advised the customer of any related risks/deficiencies.
  • Assisted the project/product delivery team in the identification/obtainment of external team resources required to support the success of the performance test engagement.
  • Responsible for ensuring that the performance testing effort/strategy is synched and aligned with the Agile strategy followed by the project team.
  • Identified interdependencies, ambiguities or omissions, and make suggestions to improve requirements and ensure usability/testability.
  • Worked with ‘Service Owners’ (platform owners) to optimize the configuration of clients, servers, and networking equipment to enable performance testing across the enterprise.
  • Assessed the quality of performance test environments by comparing them against production counterparts, identifying scaling issues or gaps, and providing cost-feasible recommendations to mitigate risks.
  • Responsible for escalating performance test environment deficiencies to leadership.
  • Responsible for advocating for quality delivery; coordinated activities with the overall team and administered the testing problem management process, including monitoring and reporting on problem resolution.
  • Assisted platform owners and production support teams with analysis and recommendations for mitigating performance-related production incidents/issues, and advised on testing scenarios that could mitigate such issues in the future.
  • Responsible for creating/reviewing test plans to ensure the scope of testing scenarios meets customer objectives and requirements in a cost-effective manner.
  • Responsible for creating/reviewing test completion reports for accuracy, completeness, and readability, ensuring appropriate risks and issues are identified and results are clear and accurate.
  • Responsible for escalation of unresolved issues, working with the appropriate stakeholder/subject matter experts to drive resolution of issues.
  • Responsible for timely communications on remediation efforts, reflecting status/progress to performance testing leadership and the project delivery team.
  • Provided career development, performance management, and pay determination and communication for direct reports.
  • Responsible for providing leadership and mentoring of staff while staying abreast of the details of all projects and initiatives.
  • Responsible for managing team knowledge development and training to ensure expertise for key application testing, system deployment and cross-training of team members.
  • Managed the development and retention of proactive, technically strong, service-oriented staff.
  • Responsible for verifying that staff maintain technical certifications in tools labeled as 'core knowledge required'.

Environment/Technologies: Windows 2000/XP, Mainframe, UNIX, Linux, Performance Center 11, SiteScope, RUM, HP LoadRunner, VuGen (11), SoapUI, WSDL, XML, MQ Series, DB2, SQL, HTTP/HTTPS, SOAP, Web Services, JVM, GC, Omegamon

Confidential

Sr. Performance Engineer

Responsibilities:

  • Worked on Confidential's BCBS claim application, ICD-10, and other applications.
  • Worked on CareFirst BCBS's various health plans.
  • Worked with Agile (various sprint phases) as well as waterfall methodologies.
  • Worked on healthcare payer claims data, HIPAA standard transactions and code set standards, and industry code classifications such as ICD-9 and ICD-10.
  • Reviewed ICD-9 claims data and verified mappings to the equivalent ICD-10 codes by executing database queries (see the JDBC sketch after this list).
  • In conjunction with the ICD-10 coding manager, provided information and recommendations to assist other departments within the organization in designing their ICD-10-related business rules.
  • Supported SMEs, the project sponsor, ICD-10 clinical staff, and the PMO in ICD-10 training activities, and supported the various levels of validation required for the mapping rules.
  • Involved in gathering, reviewing, and analyzing performance requirements (functional and non-functional) from the application team; worked closely with BAs, application developers, and the database, environment, and network configuration management teams.
  • Developed performance requirement documents, performance test plans/strategies, and performance reports (including interim and DO reports).
  • Developed the HP Performance Center 11.5 upgrade plan and procedures (upgrade, installation, and configuration) document and worked on the Performance Center upgrade process.
  • Developed performance test scripts using LoadRunner's VuGen and other tools.
  • Executed performance tests for each release (phase).
  • Worked in the area of ICD-9 to ICD-10 transformation, including ICD-9 diagnosis and procedure codes and CPT codes.
  • Worked with the Pega rules engine and IBM DataPower, which were implemented in this project.
  • Worked on IBM Business Process Manager (BPM).
  • Responsible for platform tuning and for supporting infrastructure running on WebSphere Application Server (Linux/AIX).
  • Monitored application performance (servers, CPU, memory, etc.) for mainframe and midrange systems using Strobe, NMON, TMON, DB2 traces, and network sniffers.
  • Analyzed performance test results and found root causes of performance bottlenecks and latency (JVM tuning, garbage collection, thread pools, connection pools, heap size, network sniffers, etc.).
  • Created and pulled the various performance test results/metrics after successful completion of the tests, then thoroughly analyzed the results and created the performance test reports.
  • Created performance reports for server performance statistics (Metrics) using Wily Introscope.
  • Used SoapUI 4.5 to validate web services (SOAP/XML) files.
  • Involved in scheduling/arranging various application team meetings.
  • Worked in various environments, including mainframe, AS/400, Windows, and UNIX.
  • Worked on SiteScope installation and configuration, including administration and customization for various monitoring purposes.
  • Troubleshot issues such as slow performance using Wily Introscope and CEM for applications deployed on different application servers.
  • Expertise in configuring different application servers (WebSphere, JBoss, Apache, WebSphere Portal Server, WebSphere MQ) to administer applications with the CA APM suite.
  • Was on call on a rotational basis to support application teams as needed, triaging issues using Wily Introscope or troubleshooting environmental issues on the hosts where Wily Introscope ran.
  • Installed, configured, and maintained Introscope Enterprise Manager and Manager of Managers, with agents supporting both production and test environments.
  • Involved in preparing capacity analysis, when applicable, for the production and test environments.
  • Involved in training, demonstrations, and mentoring for team members and other application teams.
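
The ICD-9-to-ICD-10 mapping checks mentioned above were done with database queries; below is a hedged JDBC sketch of that style of validation. The connection string, credentials, and table/column names are made-up placeholders, not the actual claims schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class IcdMappingCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical DB2 connection details; the real environment differs.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:db2://dbhost:50000/CLAIMS", "tester", "secret")) {

            // Claims whose ICD-9 diagnosis code has no entry in the ICD-9-to-ICD-10 crosswalk.
            String sql =
                "SELECT c.claim_id, c.icd9_code " +
                "FROM claims c " +
                "LEFT JOIN icd9_to_icd10_map m ON c.icd9_code = m.icd9_code " +
                "WHERE m.icd10_code IS NULL";

            try (PreparedStatement stmt = conn.prepareStatement(sql);
                 ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("Unmapped claim %s with ICD-9 code %s%n",
                            rs.getString("claim_id"), rs.getString("icd9_code"));
                }
            }
        }
    }
}
```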

Environment/Technologies: Windows 2000/XP, IBM AIX, Mainframe, UNIX, Linux, Performance Center 11.5, SiteScope 11.5, RUM, HP LoadRunner, VuGen (11.5), SoapUI, IBM WebSphere, HIPAA, Pega, WSDL, XML, MQ Series, DB2, SQL, IBM DataPower, HTTP, FTP, SNMP, SOAP, Web Services, Java, Spring, JVM, GC, TMON, NMON, ITCAM

Confidential

Lead Performance & Monitoring Engineer

Responsibilities:

  • Involved in developing and deploying enterprise performance monitoring systems for Confidential's various internal and external applications.
  • Involved in reviewing requirements and supporting documentation to better understand monitor deployment and to create SLAs/SLMs.
  • Involved with Confidential's enterprise engineering monitoring solution (monitoring tools).
  • Followed and maintained all of Confidential's procedures and policies, including the security policy.
  • Involved in developing performance scripts using the VuGen tool (9.1/9.5) and deploying them to HP BAC.
  • Involved in working on DDM (installing probes, configuring them with BAC, and running the reports).
  • Worked with multiple protocols (including Web, Citrix, RTE, AMF, Flex, Ajax, Winsock, ODBC, and Web Services).
  • Deeply involved in developing VuGen (LoadRunner) scripts for various Capital One internal and external sites, including web, database, Citrix, Outlook, and other applications.
  • Worked on RUM (Real User Monitor) and Discovery (DDM).
  • Worked on an SOA web services application (RTM).
  • Created/configured maintenance windows and alerting for these monitors.
  • Created SiteScope monitors for various applications, databases, etc.
  • Involved in testing storage, database, and server migration projects.
  • Worked on the DR environment.
  • Worked on capacity planning.
  • Performed software installations and configurations (Windows, Linux, UNIX).
  • Attended status meetings, reported issues, collected status from other team members, and provided support for onsite and offshore teams.
  • Involved in writing and executing SQL queries and validating data in the backend.
  • Involved in modifying UNIX scripts for optimized performance.
  • Generated test scripts for load and performance testing using LoadRunner to see how the system behaves under load.
  • Used monitors to identify load and performance bottlenecks.
  • Correlated unique server-generated data such as session IDs (see the correlation sketch after this list).
  • Analyzed transaction graphs, web resource graphs, and network graphs to pinpoint bottlenecks.
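
Correlation of server-generated values such as session IDs was handled in VuGen (web_reg_save_param left/right boundaries). The Java sketch below illustrates the same idea outside LoadRunner: capture a dynamic value from one response and replay it on the next request. The URLs and the sessionId field are hypothetical.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SessionIdCorrelation {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Step 1: the login page returns a server-generated session id in its body (hypothetical app).
        HttpResponse<String> loginPage = client.send(
                HttpRequest.newBuilder(URI.create("https://app.example.com/login")).GET().build(),
                HttpResponse.BodyHandlers.ofString());

        // Step 2: capture the dynamic value between known boundaries, the same idea as
        // web_reg_save_param("sessionId", "LB=...", "RB=...", LAST) in a VuGen script.
        Matcher m = Pattern.compile("name=\"sessionId\" value=\"([^\"]+)\"").matcher(loginPage.body());
        if (!m.find()) {
            throw new IllegalStateException("Session id not found - correlation boundaries need adjusting");
        }
        String sessionId = m.group(1);

        // Step 3: replay the captured value on the next request instead of a stale recorded one.
        HttpResponse<String> account = client.send(
                HttpRequest.newBuilder(URI.create(
                        "https://app.example.com/account?sessionId=" + sessionId)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println("Account page status: " + account.statusCode());
    }
}
```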

Environment/Technologies: Windows 2000/XP, Business Availability Center 7.5, Diagnostics, BPM, SAM, SiteScope, RUM, TQL, DDM (Discovery), UCMDB, HP OMi, xMatters, HP/Mercury LoadRunner (9.1/9.5), Quality Center 10.0, UNIX, Linux, Java, JSP, .NET, WSDL, XML, MQ Series, DB2, SQL, TCP/IP, HTTP, Oracle, SOAP, Web Services, PeopleSoft, load balancing, LDAP

Confidential

Sr Performance Load Stress Engineer

Responsibilities:

  • Involved in designing data-driven automated performance tests with Borland's load testing tools SilkPerformer and SilkCentral.
  • Involved in identifying performance tests and in developing and executing performance test scripts that validate that the client's software products meet specified performance requirements, using SilkPerformer R2 and LoadRunner.
  • Involved in identifying performance test requirements based on system specifications and service level agreements.
  • Involved in database testing using SQL queries.
  • Performed database analysis, query analysis, and design reviews.
  • Identified database performance bottlenecks.
  • Involved in Test planning, test execution, problem solving and analysis, with an emphasis on validating Web-based applications.
  • Involved in Identifying application performance requirements and defining performance test strategies.
  • Involved in updating test strategies and reports according to the performance test results.

Environment: Windows NT/XP, Java, JSP, XML, DB2, SOA, Web Services, SilkPerformer 9.0/2010 R2, LoadRunner 9.1, PeopleSoft, Remedy

Confidential

Performance Monitoring & Automation Engineer

Responsibilities:

  • Involved in developing and deploying performance monitoring systems for Confidential's various internal and external applications.
  • Involved in reviewing requirements and supporting documentation to better understand monitor deployment and to create SLAs/SLMs.
  • Involved with Confidential's enterprise engineering monitoring solution (monitoring tools).
  • Followed and maintained all of Confidential's procedures and policies, including the security policy.
  • Involved in developing performance scripts using the VuGen tool (9.1/9.5) and deploying them to HP BAC.
  • Worked with multiple protocols (including Web, Citrix, RTE, AMF, Flex, Ajax, Winsock, ODBC, and Web Services).
  • Deeply involved in developing VuGen (LoadRunner) scripts for various Capital One internal and external sites, including web, database, Citrix, Outlook, and other applications.
  • Handled the tasks of managing/creating test data, developing test scenarios, and logging test results; responsible for setup and execution of service-oriented application tests.
  • Worked on RUM (Real User Monitor) and Discovery (DDM).
  • Worked on an SOA web services application (RTM).
  • Created/configured maintenance windows and alerting for these monitors.
  • Created SiteScope monitors for various applications, databases, etc.
  • Involved in testing storage, database, and server migration projects.
  • Worked on the DR environment.
  • Worked on capacity planning.
  • Performed software installations and configurations (Windows, Linux, UNIX).
  • Deeply involved in Confidential's AlarmPoint upgrade project (critical event notification system).
  • Worked on HP OM integration with xMatters (formerly AlarmPoint).
  • Worked on the HP Diagnostics upgrade project.
  • Created/configured alerting for CPU, disk, and memory thresholds, process status, errors found in log files, batch job failures, etc., for various servers and databases (a threshold-check sketch follows this list).
  • As a team lead, attended meetings with managers, provided daily/weekly status reports, discussed issues, and expedited resolution when necessary.
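
The CPU/disk/memory alerting itself was configured in the monitoring tools (SiteScope/BAC) rather than hand-coded; as a rough sketch of the threshold logic those monitors apply, the Java snippet below checks load average and free disk space against illustrative limits.

```java
import java.io.File;
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

public class ThresholdCheck {
    public static void main(String[] args) {
        // Thresholds are illustrative assumptions, not the values used in the production monitors.
        double maxLoadAverage = 4.0;
        double minFreeDiskPct = 10.0;

        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        double load = os.getSystemLoadAverage();   // returns -1.0 where unavailable (e.g. some Windows JVMs)
        if (load > maxLoadAverage) {
            System.out.println("ALERT: load average " + load + " exceeds " + maxLoadAverage);
        }

        File root = new File("/");
        double freePct = 100.0 * root.getUsableSpace() / root.getTotalSpace();
        if (freePct < minFreeDiskPct) {
            System.out.printf("ALERT: only %.1f%% disk space free on %s%n", freePct, root);
        }
    }
}
```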

Environment/Technologies: Windows 2000/XP, Business Availability Center 7.5, Diagnostics, BPM, SAM, SiteScope, RUM, TQL, DDM (Discovery), UCMDB, HP OM, xMatters (AlarmPoint), HP/Mercury LoadRunner (9.1/9.5), Quality Center 10.0, UNIX, Linux, Java, JSP, TIBCO, .NET, WSDL, XML, SQL, TCP/IP, HTTP, Oracle, MQ, SOAP, Web Services, PeopleSoft, load balancing

Confidential

Sr Performance Engineer

Responsibilities:

  • Involved in the full application development life cycle/performance test life cycle and all aspects of application testing, with an emphasis on stress testing and performance testing.
  • Involved in identifying and introducing improvements to test practices, particularly around performance testing; also responsible for the creation of test assets, including test strategies, test plans, test cases, and test scripts, as well as test execution, analysis, and reporting.
  • Provided input to the development of the stress testing strategy and approach; developed performance test plans and scripts; worked with counterparts to help plan, build, operate, and maintain the performance test environment; managed and performed load testing.
  • Involved in documenting test procedures, load balancing, system performance findings, and problems from test results; reconciled test results from different tests and different groups; assessed readiness and deviation of application performance based upon test results and product specifications; also helped ensure that processes and testing documentation met total quality management and/or other reliability standards.
  • Involved in researching and implementing initiatives to improve testing methods, processes, and test/load tools; performed tests in both the QA and contingency/backup environments.
  • Involved in designing, developing, and coding performance and load test scripts that meet design and functional specifications, using LoadRunner/Performance Center and other tools.
  • Worked on the development and architecture of large, multi-tiered distributed applications, including web servers, application servers, and relational databases.
  • Heavily involved in creating automated performance test scripts using HP LoadRunner, including parameterization and correlation.
  • Involved in installation, administration and upgrade of HP LoadRunner and Performance Center.
  • Worked on J2EE (Java/WebLogic/WebSphere), RDBMS (Oracle, Sybase), testing methodologies, and analysis of test results.
  • Worked on UNIX shell scripts for various applications.
  • Worked extensively with various protocols, viz. Ajax, Web, HTTP/HTML, JMS, Java, .NET, SOA/Web Services, Oracle Forms, PowerBuilder/Sybase, etc.
  • Worked with others on technical writing and presentations, project planning, and delivery.
  • Responsible for onboarding applications and promoting usage of the load testing environment.
  • Assisted developers during initial use of the LoadRunner/VuGen scripting tool and executed virtual user scripts in Performance Center.
  • Involved in planning and performing environment and product upgrades.

Environment: Windows 2000/XP, Vista, UNIX, HP/Mercury LoadRunner (9.0/9.1), Performance Center (9.0/9.1), Analysis, SiteScope, J2EE, Java, JSP, WSDL, XML, Oracle 9i/10g, SQL, TCP/IP, HTTP, MS SQL Server, Oracle Forms, Sybase 15.0.2, .NET 2.0, MQ, WebLogic 9.1, JBoss, WebSphere 5.x, SOA, Web Services, SOAP, UNIX shell scripts, Solaris 10, Ajax, LDAP, SiteMinder.

Confidential

Sr Performance Engineer

Responsibilities:

  • Involved in an SOA (Service-Oriented Architecture) testing framework.
  • Deeply involved in designing data-driven automated performance tests with Segue's load testing tools, SilkPerformer and test components.
  • Involved in identifying performance tests and in developing and executing performance test scripts that validate that the client's software products meet specified performance requirements, using SilkPerformer R2 and LoadRunner.
  • Deeply involved in integration testing.
  • Reviewed requirements and specifications early in the product life cycle.
  • Involved in identifying requirements based on system specifications and service level agreements (SLAs).
  • Involved in Test planning, test execution, problem solving and analysis, with an emphasis on validating Web-based applications.
  • Worked with J2EE portal technologies in a clustered environment.
  • Worked with the single-stack international e-quote (sales) application.
  • Involved in identifying application performance requirements and defining performance test strategies; deeply involved in configuring and monitoring BEA WebLogic environment components, i.e., CPU usage, memory, and transactions/sec during tests using the BEA console.
  • Involved in writing performance test scenarios, test plans, performance test scripts, test statistics, and test reports.
  • Worked with the Empirix Hammer tool for IVR and IP network testing.
  • Worked with IBM WebSphere Application Server.
  • Worked with application servers (WLS/WAS), web servers (Apache, IIS), and Windows Server.
  • Deeply involved in monitoring the web server, app server, and database server while running performance tests.
  • Analyzed various components of the application (web server, application server, transaction management server, database, etc.), identified performance bottlenecks (at the application, performance, and OS layers), and supported performance tuning exercises.
  • Involved in creating performance test reports and monitoring CPU and memory usage.
  • Worked on Solaris system administration with the admin group.
  • Worked closely with the development team during performance tuning and troubleshooting.
  • Involved in installing and configuring testing environments (hardware, software, testing tools, controller/generator/agent).
  • Worked with firewalls.
  • Used LoadRunner for load testing.
  • Involved in testing various network protocols: SOAP, TCP/IP, FTP, RTP, SIP, etc.
  • Worked with the Mercury SiteScope tool for performance monitoring.
  • Created/developed QTP scripts for the web center (for simulated call flows).
  • Used Quality Center for defect tracking, requirements, and monitoring purposes.
  • Performed system verification to ensure servers were built to client requirements and standards prior to turnover to production.

Environment: Windows NT/XP, Java, JSP, XML, Microsoft, Content Management, IVR, VoIP, SIP, RTP, TCP/IP, Oracle 9i/10g, J2EE, BEA WebLogic Server 8.1, IBM WebSphere 5.x, SOA, Web Services, SOAP, Apache, Tomcat, UNIX, Visio, Solaris 10, SilkPerformer 7.3.1/5/R2, IBM Mainframe, DB2, LoadRunner 8.1, QTP, Quality Center 9.0.2, HP/Mercury SiteScope, Eclipse 3.2, Cisco switches/routers, Empirix Hammer, VPN, RAS, MSTC, SSH, .NET 2.0, SQL Server

Confidential

Software Performance Engineer

Responsibilities:

  • Responsible for writing test plans, test procedures, and test cases following the software development methodology.
  • Responsible for regression, smoke, functional, batch, and performance testing.
  • Responsible for working with developers using JUnit 4 (bug fixing, JMS testing); a JUnit sketch follows this list.
  • Responsible for backend testing (connecting to the backend and executing SQL statements to verify that data entered from the front end matches the backend).
  • Involved in executing and validating test cases based upon the system requirements.
  • Responsible for data validation and for mapping data between AS/400 and Oracle tables.
  • Involved in testing data warehouse data.
  • Worked with XPath, XQuery, and XML Schema.
  • Involved in testing batch processing and data validation.
  • Heavily involved in login user ID and password authentication testing.
  • Involved in secure HTTP/HTTPS testing using the SilkPerformer and JMeter tools.
  • Deeply involved in performance testing.
  • Deeply involved in configuring and monitoring BEA WebLogic environment components (console, CPU usage, memory usage, and transactions/sec) during tests.
  • Involved in user acceptance testing.
  • Used @task for entering and tracking defects.
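
A minimal sketch of the JUnit 4 plus backend-SQL style of check described above. The JDBC URL, credentials, table, and order reference below are hypothetical placeholders; a real test would set up its own data rather than rely on a hard-coded row.

```java
import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.junit.Test;

public class OrderBackendTest {
    // Hypothetical connection details; the Oracle JDBC driver must be on the classpath.
    private static final String JDBC_URL = "jdbc:oracle:thin:@dbhost:1521:QA";

    @Test
    public void orderEnteredOnFrontEndIsPersisted() throws Exception {
        try (Connection conn = DriverManager.getConnection(JDBC_URL, "qa_user", "qa_password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT COUNT(*) FROM orders WHERE order_ref = 'UI-TEST-1001'")) {
            rs.next();
            // The row created through the front end should exist exactly once in the backend.
            assertEquals(1, rs.getInt(1));
        }
    }
}
```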

Environment: Windows NT/2000, Java, JSP, JMS, XML, AS/400 iSeries, content management tool, SQL, PL/SQL, Oracle 9i, J2EE, BEA WebLogic Portal Server 8.1, Apache, SilkPerformer, JUnit 4, JMeter, data warehouse, HTML, DHTML

Confidential, Dulles, VA

Lead Software QA Engineer

Responsibilities:

  • Developed and executed the quality assurance plan for the full SQA life cycle, including the test strategy and test schedule, and tested whether the desired results were achieved.
  • Responsible for writing test plans and test cases to test the applications.
  • Developed and implemented test plans, test cases, and test strategies by analyzing Business Objects, IT, and other requirements/documents (TRD, PRD, SDD, TSD).
  • Responsible for evaluating test execution, recovering from errors, and assessing results.
  • Involved in developing test cases and scripts and executing them.
  • Responsible for testing across various browsers (IE, Netscape, AOL client).
  • Involved in testing complex systems.
  • Involved in testing Java and Swing applications.
  • Heavily involved in testing interfaces and file transfers from other systems.
  • Used TestDirector for reporting and defect tracking purposes.
  • System testing was done with the help of WinRunner/QTP.
  • Used Quick Test Pro for functional testing.
  • Used Rational RobotJ for Java application testing.
  • Responsible for writing Test Script Language (TSL) scripts for WinRunner.
  • Responsible for functional, regression, integration, performance, BAT (build acceptance test), comparison-with-mocks, and failure testing.
  • Used BLT (tracking system) for submitting and tracking defects.
  • Responsible for testing the UI in all browsers and platforms using HTML mocks as a guideline.
  • Responsible for bug scrub meetings with developers, team leads, and managers.

Environment: Mac OS X, Windows XP/2K/98, HTML, XHTML, XML, ACLs (access control lists), Java, JavaScript, Swing, SQL, Sybase, Tcl interface, Tandem Server, portal application, Oracle 9i, WinRunner, QTP, LoadRunner, TestDirector, Rational Robot, Rational RobotJ, RequisitePro, Test Manager, XML Spy, Crystal Reports.

Confidential

Sr QA/Performance Engineer

Responsibilities:

  • Responsible for all phases of the software development life cycle and test life cycle, including creating test strategies, selecting tools, creating test plans and test scripts, developing and executing test scripts and test cases, implementation, and reporting.
  • Responsible for testing and debugging the application; worked with PL/SQL for testing and maintenance purposes.
  • Deeply involved in backend testing (connecting to the backend and executing SQL statements to verify that data entered from the front end matches the backend).
  • Worked on Software Development Life Cycle (SDLC) methodology.
  • Developed and implemented test plans, test cases, and test strategies by analyzing Business Objects, IT, and other requirements/documents.
  • Worked with Project Management Team.
  • Responsible for evaluating test execution, recovering from errors, and assessing results.
  • Developed and executed the quality assurance plan for the full SQA life cycle, including the test strategy and test schedule, and tested whether the desired results were achieved.
  • Worked with the development team and the requirements (elaboration) team to further extend the in-house-developed test tool.
  • Responsible for build acceptance testing (BAT), regression, functional, system, security, UAT, and end-to-end testing.
  • Worked on ISO 9000 procedures (the organization recently passed its ISO audit).
  • Responsible for writing shell scripts for configuration purposes.
  • Created user scenarios defining what actions users can take and what results are expected, and tested whether the desired results were achieved.
  • Worked with the development team on bug rectification.
  • Used Compuware QARun for testing purposes.
  • Used Compuware tools to make informed "go live" decisions and deploy applications with confidence.
  • Responsible for demonstrations and presentations to the client for the SDR (software design review).
  • Worked with Sun Solaris (UNIX) systems for executing test cases.
  • Worked in Java Embedded Server (OSGS) Technology.
  • Installed, configured, and used ClearQuest for defect tracking and reporting.
  • Responsible for working with ClearCase project components (VOBs), streams (views), and Unified Change Management (UCM).
  • Deeply involved in backend testing, executing SQL and PL/SQL queries.
  • Worked with devices and embedded systems.
  • System testing was done with the help of WinRunner and QTP.
  • Responsible for creating and implementing the test data strategy for regression testing in multiple environments.
  • Responsible for validating automated test scripts after they were created and maintaining previously created WinRunner scripts.
  • Automated test cases for GUI and non-GUI applications using Mercury WinRunner.
  • Responsible for executing WinRunner scripts to test new functionality and for regression testing; analyzed test results to ensure requirements were met and existing functionality was not broken.
  • Responsible for installation and system administration of the testing tools, and for using them.
  • Involved in installing, configuring, and monitoring the BEA WebLogic environment.
  • Load testing was done with the help of LoadRunner.
  • Used TestDirector for reporting purposes.
  • Used JRisk for the application.
  • Responsible for writing test plans and test procedures using a software development methodology (RUP).

Environment: UNIX, Java, JSP, JavaScript, EJB, BEA WebLogic 7.0, PostgreSQL, PL/SQL, SQL, JDBC, Oracle 9i, XML, e-Load, Bean Test, OSGS (Open Service Gateways), RMI, JMS, Bugzilla, ksh, bsh, Perl, Sun Solaris, Embedded Server, JSP pages, WinRunner, LoadRunner, QTP, JTest tool, Compuware tools, JRisk

Confidential

QA Engineer

Responsibilities:

  • Responsible for testing and debugging the application; worked with PL/SQL for testing and maintenance purposes; involved in all phases of the test life cycle, including creating strategies, selecting tools, creating test plans and test cases, implementation, bug fixing, and reporting.
  • Responsible for performance, regression, integration, functional, configuration, security, installation, volume, user interface, stress, and compatibility testing.
  • Updated and wrote shell scripts for the database backend server according to the requirements.
  • Built data warehouses utilizing Oracle features such as partitioning, materialized views, query rewrite, bitmap indexing, bitmap merging, and star transformation; designed and developed multi-dimensional star and snowflake schemas; worked on ETL and OLAP processes for large warehouses and data marts; worked with Informatica, Brio, Oracle Forms/Reports, and web development (HTML, Java).
  • Wrote UNIX/Perl scripts for database migration, data mining, and database tuning.
  • Responsible for writing Perl, Tcl, and UNIX shell scripts.
  • Worked with PeopleSoft Enterprise eSettlements, a global electronic invoice presentment and payment (EIPP) solution that optimizes the settlement process with electronic invoices and payments, matching, online approval and dispute resolution, and email notifications.
  • Worked with PeopleSoft Electronic Invoice Processing and Payment and Rapid Approval and Dispute Resolution.
  • Wrote interface files (configuration and definition) as UNIX scripts for the backend server.
  • Wrote high-availability scripts in UNIX and integrated Vitria's application.
  • Worked on the Ariba Buyer application.
  • Monitored cron jobs for daily backups of the database and application.
  • Responsible for writing test cases for database object testing and providing test data to testers; deeply involved with data warehousing.
  • Designed, developed, and implemented the SQA process, bridging the gap between user requirements and the developers.
  • Attended ClearCase and ClearQuest training from the IBM Rational team.
  • Involved in installing and configuring ClearCase and ClearQuest in the QA environment.
  • Worked on the project structure (VOBs, views) and the UCM process (creating a project, integrating a project, creating baselines).
  • Worked with the change request management system: registering with ClearQuest, entering ClearQuest, creating new change requests, generating reports, creating custom queries, and interfacing with developers.
  • Helped test engineers install and configure ClearCase and ClearQuest, followed the schedule, and made necessary changes.
  • Developed and executed the quality assurance plan for the full SQA life cycle, including the test strategy and test schedule, and tested whether the desired results were achieved.
  • Functional and system testing was done using WinRunner.
  • Responsible for writing Test Script Language (TSL) scripts for WinRunner.
  • Responsible for white-box testing of Java applications.
  • ClearCase was used for version control and configuration management.
  • Used ClearQuest for change requests, reports, and defect tracking.
  • Performance testing was done to test the speed of the application.
  • Performance testing was carried out with the help of LoadRunner.
  • Used PVCS Tracker for tracking purposes.
  • Wrote scripts for daily backup and restore on UNIX.
  • Responsible for writing build and configuration scripts (ksh, bsh, Perl) for the server.

Environment: Java, JSP, EJB 4.5.1, WebSphere Server, Oracle 8i/9i, Linux, Solaris 2.8, ClearCase, ClearQuest, WinRunner 6.5, LoadRunner 6.5/7, TestDirector, SQL, PL/SQL, Perl 5, UNIX scripting (ksh, csh, awk), shell scripts, Vitria, i2, Ariba Buyer, TradeMetrix, PeopleSoft, Windows NT, XML, COM, C++, DB2, Informatica, PVCS Tracker

Confidential

Senior QA Engineer

Responsibilities:

  • The applications were tested in all phases, namely unit testing, integration testing, database testing, backend testing, performance testing, regression testing, BAT testing, and end-to-end testing.
  • Worked extensively with Java code; responsibilities included reading Java code, checking the syntax for errors, executing the program and comparing the expected results with the program output, debugging, and implementing coding standards.
  • Responsible for developing a testing framework for EJBs, classes, servlets, and tag libraries.
  • Transformed testing reports from text documents to XML documents (including defining DTDs).
  • Used SQL for database operations (DDL, DML, DCL) from a UNIX box; involved in data migration.
  • Involved in backend testing (connecting to the backend and executing SQL statements to verify that data entered from the front end matches the backend).
  • Black-box testing was carried out for functionality testing.
  • Data validation was done with the help of the Segue Silk tool.
  • Wrote shell/Perl scripts for various tasks such as batch file processing.
  • Wrote parsing scripts for WebLogic and Apache error log files, cron jobs for site administration and process monitoring, and utilities in C on UNIX to populate and access memory-mapped files.
  • Prepared documents related to pre-delivery testing.
  • Responsible for white-box testing of Java applications.
  • Worked with various internal messaging services.
  • HTTP testing was done with the help of SilkPerformer.
  • Used the Silk testing tool for testing the Java application.
  • Worked extensively on ClearQuest configuration.
  • Change requests, bug tracking, and reports were handled with the help of ClearQuest.
  • Used ClearCase for version control and change management.
  • Functional and regression testing was done with the help of WinRunner.
  • Performance and load testing was carried out with the help of LoadRunner.
  • Deeply involved in writing TSL (Test Script Language) scripts for WinRunner.
  • Worked with XML (file structure, parsing).
  • Responsible for writing configuration scripts for the staging, QA, and production servers in UNIX (ksh) and Perl 5.0.
  • Deeply involved in transferring data for account conversions (as Confidential was converting its system from the ADP mainframe to Dean Witter systems).
  • Heavily involved in testing market data, quotes, stocks, bonds, options, fixed income, and data feeds.
  • Worked in an OTC trading environment.
  • Worked on fixed income derivatives with JRisk.

Environment: ClearCase, ClearQuest, JUnit, WinRunner 6.5, Perl, shell scripts, Java, EJB, WebLogic 5.1, XML, Sybase, DB2, SQL, PL/SQL, Oracle 8i, J2EE, JSP, Windows 2000/NT/XP, UNIX, AIX, AS/400, Segue Silk, LoadRunner, MQ Series, RSW e-Tester tool, Apache, SilkPerformer, messaging services, JRisk.
