Senior Systems Analyst/Software Test Engineer Resume

SUMMARY

  • US citizen with more than 9 years of professional IT experience focused on QA and testing of web-based and client-server applications and the effective use of QA and testing practices. Strong knowledge of all phases of the SDLC and of software testing.
  • Experienced in defining testing methodologies; designing test strategies, test plans, and test cases; and verifying and validating applications and documentation against software development standards for effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
  • Expertise in Manual/Functional Testing and Automation Testing.
  • Good working knowledge of automation tools, with expertise in WinRunner, TestDirector, LoadRunner, HP Performance Center, Rational ClearCase, Quality Center, QTP, and soapUI.

  • Proficient in all phases of the test life cycle, from test planning to defect tracking and managing the defect life cycle.
  • Extensive experience developing and maintaining test scripts, analyzing bugs, and working with development team members to fix defects.
  • Extensive experience coordinating testing efforts; responsible for test deliverables, status reporting to management, and issue escalation.
  • Experience in Database Testing including SQL Server and Oracle.
  • Able to apply broad technical and practical skills to manage and execute white-box and black-box tests and to evaluate needs for software design, development, and validation.
  • Able to learn new technologies and difficult concepts quickly and put them into practice.
  • Outstanding analytical, decision-making, problem-solving, and management skills, with the ability to organize activities in a fast-paced team environment.
  • Able to perform effectively and efficiently both on a team and individually.
  • Performed many types of testing (Smoke/Baseline, Functional, Integration, Regression, User Acceptance, Web, Load, Stress, Performance).
  • Proficient in MS Office (Word, Excel, PowerPoint).

EDUCATION
Bachelor's Degree in Electronics & Communications Engineering
PG Diploma in Computer Applications

TECHNICAL SKILLS

Languages: C, C++, FORTRAN IV, Java, Pascal
Databases: Oracle 7.x/9i, MS Access, MS SQL Server
GUI/Tools: Visual Basic 6.0/5.0, Active Server Pages, HTML, DHTML, VBScript, JavaScript, Visual InterDev 6.0, TOAD, SQL Developer, MicroStrategy reporting tool
CM/Testing Tools: Mercury Interactive test suite (WinRunner, TestDirector, LoadRunner), QuickTest Pro (QTP), PVCS Tracker, PVCS Version Manager, ClearQuest, ClearCase, Hummingbird DM, Quality Center, HP OpenView Service Desk, HP Performance Center, SoapUI Pro
Platforms: IBM PC (Windows 7/XP/NT/95/98), MS-DOS, and UNIX

WORK EXPERIENCE

Confidential, Arlington, VA Feb 2010 – Jun 2012
(Senior Systems Analyst/Software Test Engineer)

The American Association of Motor Vehicle Administrators (AAMVA) is a tax-exempt, nonprofit organization developing model programs in motor vehicle administration, law enforcement, and highway safety. Founded in 1933, AAMVA represents the state and provincial officials in the United States and Canada who administer and enforce motor vehicle laws. AAMVA developed and maintains a number of information systems facilitating the electronic exchange of information between jurisdictions. Some systems are fully operational, while others are in various stages of development.

Accomplishments:

  • Involved in the complete test life cycle (initiate the project, design the system, define the system, build the system, test the system, and support the system).
  • Created Test Strategy document that defines Testing Approach to achieve testing objectives.
  • Created Test Cases for testing the functionality, navigation, stress, performance of the application.
  • Organized & conducted Test Case Peer Reviews and walkthroughs with DHS and the FBI.
  • Created the test data for the negative and positive test cases.
  • Data migration testing: exported data from the current CDLIS to a SQL Server database and wrote queries to verify successful migration to CDLIS MOD (see the SQL sketch after this list).
  • Worked with different jurisdictions’ DMVs on Casual and Structured Test Planning, Execution, Reporting Issues.
  • Coordinated end-to-end testing efforts between the FBI & DMVs; performed initial analysis of issues and reported them to the development team.
  • Used Test System (in-house tool) to emulate any AAMVAnet Application.
  • Inspected the database to ensure data was populated as intended and all database events occurred properly, and reviewed returned data to confirm the correct data was retrieved for the correct reasons.
  • Documented and communicated test results, defect reports.
  • Worked on multiple projects simultaneously; led projects independently and with 1-2 resources.
  • Worked in Agile sessions with the development SA & QA to expedite the issue-resolution process.
  • Performed User Acceptance testing, Smoke/Baseline testing, Integration testing, Regression testing, Load testing, Performance testing for application code releases and changes for different AAMVA applications.
  • Used XML Spy to perform XML validations against the schemas.
  • Performed Web-Services testing using soapUI Pro.
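
A minimal sketch of the kind of reconciliation queries used to verify the CDLIS-to-CDLIS MOD migration; the LEGACY_DRIVER and MOD_DRIVER table and column names below are hypothetical stand-ins, not the actual schema:

    -- Row counts in source and target should match (hypothetical tables).
    SELECT
        (SELECT COUNT(*) FROM LEGACY_DRIVER) AS legacy_count,
        (SELECT COUNT(*) FROM MOD_DRIVER)    AS mod_count;

    -- Source rows missing from the target indicate an incomplete migration.
    SELECT l.DRIVER_ID
    FROM LEGACY_DRIVER l
    LEFT JOIN MOD_DRIVER m ON m.DRIVER_ID = l.DRIVER_ID
    WHERE m.DRIVER_ID IS NULL;

    -- Spot-check field-level fidelity for rows present in both tables.
    SELECT l.DRIVER_ID, l.LAST_NAME AS legacy_last, m.LAST_NAME AS mod_last
    FROM LEGACY_DRIVER l
    JOIN MOD_DRIVER m ON m.DRIVER_ID = l.DRIVER_ID
    WHERE l.LAST_NAME <> m.LAST_NAME;

Any rows returned by the second or third query would indicate a migration defect.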

Environment: Java, HTML, XML, Mainframe, Web Services, SQL Server Management Studio, XML Spy, Test System (in-house tool), SQL Server, SoapUI Pro 3.6.1

Confidential, Reston, VA Sep 2005 – Dec 2009

Confidential, (QA Test Engineer)
The ATLAS system provides an Enterprise Asset Management solution for Sprint Nextel, tracking and valuing cell site and warehouse inventory. ATLAS records the status, location, and cost of Sprint Nextel's network asset equipment. Asset location and asset worth are used to determine tax, insurance, valuation, and inventory requirements.

Accomplishments:

  • Involved in Requirements review sessions with Business & Development teams.
  • Prepared/Updated Test Project Plan with MS Project.
  • Formulated a test plan containing test scenarios for testing the functionality, navigation, stress, and performance of the application.
  • Organized & conducted Test Case Peer Reviews and walkthroughs.
  • Prepared test data for negative and positive test cases.
  • Facilitated Test Team Meetings
  • Performed Smoke/Baseline testing, User Interface testing, System testing, Integration testing, Regression testing, Stress testing, performance testing for ATLAS application code releases and changes.
  • Optimized QTP scripts for Regression testing of the application with various data sources and data types.
  • Mapped the custom objects to the standard objects where necessary, and inserted GUI, Bitmap and Text checkpoints where needed, to compare the current behavior of the application being tested to its behavior in the earlier version using Quick Test Pro (QTP).
  • Performed Data-driven testing - Parameterized the fixed values in checkpoint statements, created data tables for the parameters and wrote functions for the parameters to read new data from the table upon each iteration.
  • Wrote several SQL scripts to validate data integrity in the application (see the sketch after this list).
  • Involved in the complete test life cycle.
  • Created estimates of the time required to automate the application.
  • Documented and communicated test results.
  • Worked with the development team to ensure defects opened are resolved.
  • Consistently submitted high quality deliverables on schedule.
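
Illustrative examples of the data-integrity checks described above; the ASSET and SITE_LOCATION tables and their columns are hypothetical stand-ins for the actual ATLAS schema:

    -- Every asset should reference a valid location (referential integrity).
    SELECT a.ASSET_ID
    FROM ASSET a
    LEFT JOIN SITE_LOCATION s ON s.LOCATION_ID = a.LOCATION_ID
    WHERE s.LOCATION_ID IS NULL;

    -- Asset tags must be unique; any row returned here is a defect.
    SELECT ASSET_TAG, COUNT(*) AS occurrences
    FROM ASSET
    GROUP BY ASSET_TAG
    HAVING COUNT(*) > 1;

    -- Cost fields feeding tax and valuation must be present and non-negative.
    SELECT ASSET_ID
    FROM ASSET
    WHERE COST IS NULL OR COST < 0;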

Environment: Oracle 9i, PeopleSoft 8.9, Infowave mobile tools, IBM WebSphere, MQSeries, Business Objects, PL/SQL, Java, HTML, XML, QuickTest Pro (QTP).

Confidential, (Tech Support Engineer/Reports Developer)
ITPS provides the production support functions (monitoring, change and problem management, stability and capacity management, etc.) for legacy Sprint Information Technology applications associated with number management, porting, and provisioning. ITPS is accountable for the overall health and stability of Sprint's production applications. ITPS reviews, approves, and deploys all changes going into production, ensuring the overall stability of the applications so that they are available and ready for business. It also provides efficient, customer-focused application system support for all Sprint production applications and ensures maximum uptime through proven support and application techniques to meet SLAs and business application availability requirements.

Accomplishments:

  • Successfully supported the IT FEOPs change control process. Handled all Service Desk responsibilities for IT Network Services development, including RFCs and work orders, and proactively worked with the development groups, front office, EDS, Tier 2, and change management to ensure all deployments were approved and all work was coordinated and on schedule. Provided Service Desk change management process training to the development teams.
  • Worked with the offshore team to ensure the number of open tickets did not exceed expectations and that all tickets were triaged appropriately and in a timely manner. Coordinated weekly meetings to review the status of all tickets and gave the offshore team direction on how to work and route trouble tickets.
  • Coordinated the migration of IT FE&O-supported systems/hardware to ITPS-supported processes and data centers.
  • Successfully created/generated daily P1/P2 reporting, weekly ticket trending and analysis, and monthly and annual OPS review reporting and trending using the MicroStrategy reporting tool, which allowed IT Production Support for Network Services to evaluate work productivity, analyze trouble tickets, and produce measurable goals for the organization.
  • Created/generated reporting for all of Sprint Nextel's IT Production Support organization that was shared at the executive level.
  • Received an EXCELLENCE AWARD for playing a lead role in developing MicroStrategy reporting metrics for IT Production Support for Network Services that helped meet the overall goals of the organization.

Environment: HP OpenView Service Desk, MicroStrategy, Oracle SQL Developer

Confidential, (Performance Tester)
Optimi's Wizard RF engineering software is a state-of-the-art engineering tool that simplifies RF engineering work. Through its graphical user interface (GUI), many time-consuming, tedious tasks are transformed into simple point-and-click operations. Wizard provides engineers with all the resources they need to design, plan, install, and optimize networks.
Accomplishments:

  • Worked with business owners of the application to identify the critical business functions, process flow, user profile information, and capacity model.
  • Defined the performance test strategy.
  • Built out the performance architecture.
  • Developed and executed performance test scripts.
  • Coordinated resources for monitoring and support.
  • Documented individual test results & overall findings.
  • Identified potential performance bottlenecks.
  • Baselined system performance at current production usage levels.
  • Verified that hardware/software was production ready (able to meet anticipated heavy loads).
  • Performance Center (version 7.8) was licensed for 7 concurrent test runs and up to 5,000 virtual users. Load was generated as follows:
    • Load was generated from internal sites (behind the firewall) using Performance Center from Herndon, VA.
    • Load was generated from external sites (outside the firewall) using Performance Center from servers located throughout the U.S.
    • Transactions were dispersed evenly across the selected load generators using Performance Center; this mechanism also verified that the load balancer was configured properly.

Environment: Performance Center 7.8, MSDE

Confidential, MD Sep 2003 – Sep 2005
(Test Lead)
The Electronic Fraud Detection System (EFDS) is an automated system that is used by the Fraud Detection Center (FDC) investigative analysts and aides to review potentially false tax returns and develop scheme referrals to the Field Office Criminal Investigation (CI).
In 2004, the Electronic Fraud Detection System (EFDS) consisted of four separate applications: the Workload Management System (WMS), which automates reviewing “chunks” of returns and deciding whether each return needs further investigation; the Scheme Tracking and Referral System (STARS), which organizes questionable returns into “schemes” (first included in the 1999 system); the National Office Interface (NOI); and User Administration (UA).
The system for 2005 was redesigned for deployment as a web-based application, combining the functionality of the individual legacy applications into a single application and database.
Accomplishments:

  • Involved in requirements review sessions with CI (the customer) and Business Analysts.
  • Prepared/updated the test project plan with MS Project.
  • Formulated a test plan containing test scenarios for testing the functionality, navigation, stress, security, and recovery of the application.
  • Organized & conducted Test Case Peer Reviews and walkthroughs.
  • Prepared & maintained checklists to ensure all software standards were implemented.
  • Implemented CMMI Level 3 procedures.
  • Facilitated Test Team Meetings
  • Performed Smoke/Baseline testing, User Interface testing, System testing, Integration testing, Regression testing, Stress testing, performance testing.
  • Tested database & application transmittals in a UNIX environment.
  • Inspected the database to ensure data was populated as intended and all database events occurred properly, and reviewed returned data to confirm the correct data was retrieved for the correct reasons (see the SQL sketch after this list).
  • Prepared test data for negative and positive test cases.
  • Involved in the complete test life cycle (initiate the project, design the system, define the system, build the system, test the system, and support the system).
  • Documented and communicated test results; PVCS Tracker was used for reporting bugs.
  • Prepared & delivered software metrics.
  • Worked with the development team to ensure testing issues are resolved.
  • Handled Help Desk tasks.
  • Performed initial analysis of ITAMS tickets: contacted the customer, attempted to duplicate the reported problem, and helped the development team resolve problem tickets.
  • Presented a testing training session for the Project Office.
  • Volunteered to move to the development team to support Portal reports development; assumed the development lead role to ensure on-time report delivery. Prepared the System Design Document (SDD) for the reports, successfully developed the reports, uploaded them to the server, and performed unit/integration testing.
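
A hedged sketch of the database inspection described above; the RETURN_RECORD and SCHEME_REFERRAL tables and their columns are hypothetical stand-ins for the EFDS schema:

    -- Every scheme referral must point at an existing return (orphan check).
    SELECT r.REFERRAL_ID
    FROM SCHEME_REFERRAL r
    LEFT JOIN RETURN_RECORD t ON t.RETURN_ID = r.RETURN_ID
    WHERE t.RETURN_ID IS NULL;

    -- Returns flagged for investigation should carry a scheme assignment.
    SELECT t.RETURN_ID
    FROM RETURN_RECORD t
    WHERE t.FLAGGED = 'Y'
      AND NOT EXISTS
          (SELECT 1 FROM SCHEME_REFERRAL r WHERE r.RETURN_ID = t.RETURN_ID);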

Environment: C++, Oracle 9i/10g, Unix, J2EE, Java Beans, Oracle Reports Builder, Windows NT/XP, PVCS Tracker, PVCS Version Manager, MS Project, TOAD, Clear Case, WinRunner, LoadRunner

Confidential, Arlington, VA Aug 2000 – Feb 2001

(QA Tester)
The SunScreens interface is used to capture subscriber data, activate service, verify credit, and update account information. It addresses the business processes of credit verification, fraud detection, account creation and update, and reporting, and includes a GUI (the SunScreens client) that inserts account information into the database.

Accomplishments:

  • Created a common library with various reusable compiled modules developed for use with different business rules in the application, such as handling special login procedures.
  • Created various custom categories in the function generator to enhance the visual programming capability of WinRunner for developing TSL scripts.
  • Created several checkpoints on various objects in the application to verify their properties, such as the size of the logo or the position of an object.
  • Billing information from various flat files was received to update the database; wrote data-driven scripts to read the flat files, independently query the database, and verify that the updates were applied properly (see the SQL sketch after this list). Because the file names changed dynamically, regular expressions were used to match them.
  • Programmed scripts to handle several pop-ups, such as instant-messenger screens, and object exceptions, such as connection errors.
  • Configured TestDirector to allow remote execution of automated scripts by several users on different hardware and software configurations.
  • Programmed tslinit and startup scripts to automatically initialize special testing configurations.
  • Developed test cases using various test case design techniques.
  • Attended meetings with teammates to evaluate the progress and performance of applications.
  • Used TestDirector for test plan creation and for bug tracking and reporting; tested the functionality of each screen and verified proper navigation.
  • Load tested the application across various operating systems, hardware configurations, browsers, and connection speeds.
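
A minimal sketch of the per-record verification query issued by the data-driven scripts above; the BILLING table and the bind variables are hypothetical stand-ins:

    -- For each record read from a flat file, bind the account number,
    -- amount, and date, and confirm exactly one matching row was loaded.
    SELECT COUNT(*) AS match_count
    FROM BILLING
    WHERE ACCOUNT_NO = :acct
      AND BILL_AMOUNT = :amount
      AND BILL_DATE = :bill_date;

A count of 0 would mean the update never landed; a count above 1 would flag a duplicate load.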


Environment: Java, Oracle 8.0, Windows NT, WinRunner, TestDirector, LoadRunner.

Confidential, Manassas, VA Customer Information System Mar 2000 – Jul 2000

(QA Tester)
Customer Information System (CIS) is a multi-user billing system for an electric cooperative. CIS maintains the customer, account, property, and service information of all the cooperative's customers, along with the billing information of every entity.

Accomplishments:

  • Automated end-to-end activity for Quick Changes, Refresh, and Customer Profile.
  • Created virtual users to simulate various customer service representatives, some acting as GUI Vusers and some acting as database Vusers.
  • Created various host machine types to test the application on different hardware and operating system configurations.
  • Simulated up to 500 virtual users in the Controller by creating a scenario to test the system's ability to process various requests from web users. Also created WinSock scripts using VuGen to load test interfaces containing Java applets.
  • Configured scenario runtime settings to control the load of users accessing the system.
  • Used the LoadRunner scheduler to closely simulate the production environment.
  • Trained new users and worked with Mercury Interactive to resolve issues.
  • Automated Edits and Field Validations for Refresh Maintenance Tasks, Quick Changes, Customer Profile - Dialed Numbers, Customer Profile - Locations.
  • Created LoadRunner scripts to help meet transaction targets during performance testing.
  • Developed the automation strategy and common WinRunner functions for the application.
  • Participated in Feature Testing of Refresh (Database Queries) and Edit Plans.
  • Executed batch jobs in UNIX to get the test data in test environment.
  • Used PVCS Tracker for bug tracking and PVCS Version Manager for configuration management.

Environment: Oracle 8.0, Windows 98, UNIX, Crystal Reports 7.1, WinRunner, LoadRunner, PVCS Tracker, PVCS Version Manager.
