Sr QA Analyst Resume
Minneapolis, MN
SUMMARY
- 13+ years of experience as a Software Quality Test Engineer/QA Analyst.
- Proficient in performing both Manual and Automated testing of applications.
- Excellent understanding of the Software Development Life Cycle (SDLC), Software Quality Life Cycle (SQLC) and methodologies such as Waterfall, Spiral, Agile and RUP.
- Participated in requirements analysis, reviews and working sessions to understand the requirements and system design.
- Expertise in writing System Test Plans, Test Design Specifications, defining Test Case Specifications and Test Procedures, developing and maintaining Test Scripts.
- Expertise in Manual and Automation testing using Mercury Interactive tools: WinRunner 7.x, QTP 9.1/9.5/10.0, TestDirector 7.x/Quality Center 9.0 and LoadRunner 8.0.
- Experience with creating test data, executing test cases, defect tracking and resolution.
- Performed Functional, Integration, GUI Automation, User Acceptance, Security, Negative, Black Box, White Box, Load, Stress, System, Usability, Performance and Regression testing on Client/Server, Mainframe and Web-based applications.
- Strong experience in mapping, testing and implementing applications per HIPAA regulations and EDI standards such as 837 (health care claim), 835 (health care claim payment/remittance), 270/271 (eligibility inquiry and response) and 276/277 (claim status request and response).
- Experience in testing mainframes using TSO, ISPF, QMF, File-Aid and job schedulers.
- Experience in testing mainframe applications developed in COBOL, CICS and JCL.
- Experienced in writing SQL queries.
- Experienced in Testing of Web based applications and Desktop Applications.
- Tested SOAP web services using SoapUI.
- Experienced with IBM Rational Team Concert for integrated lifecycle management, including iteration planning, process definition, source control, defect tracking, build management and reporting.
- Involved in Manual and Automated Testing of applications in Windows and UNIX environments.
- Experience testing new functionality and enhancements to existing applications.
- Experience in domains including Telecom, Financial, Billing, Banking, Healthcare and Insurance.
- Ability to interact with developers and product analysts regarding testing status and defect tracking.
- Document quality plans and test procedures.
- Strong analytical skills with excellent communication and interpersonal skills.
- Flexible to perform other duties as assigned by manager.
TECHNICAL SKILLS
Tools: QTP/UFT 9.2/9.0, Selenium, Quality Center, JIRA, File-Aid, ISPF, TSO, InSync and ClearQuest
Languages: Java, J2EE, Visual Basic, ASP, C++, AS/400, SQL, PL/SQL, CORBA and TOAD
Web Technologies: HTML, XML, JSP, ASP.NET, VB.NET, SOAP, SMTP, SOA, IIS Web services, Adobe Flash 3.0 and WebSphere
Scripting Languages: Testing Script Language (TSL), VB Script
RDBMS: Oracle 7.X/8i/9i, SQL Server 7.0/2000, DB2/400 and MS-Access
Others: MS Office (Word, Excel, PowerPoint, Excel macros) and MS Visio
Version Controls: Rational Clear Case, RTC, CVS and Visual Source Safe
Operating Systems: Windows 98/NT, Windows 2000, Windows XP, Linux, Solaris and UNIX
PROFESSIONAL EXPERIENCE
Confidential, Minneapolis
Sr QA Analyst
Responsibilities:
- Analyze user/business requirements, functional specifications and Use Case documents to design, develop and execute test plans and test cases for system testing.
- Prepare and execute test strategies, test plans, test scenarios and test scripts.
- Use the IBM FTM (Financial Transaction Manager) application to conduct functional testing, SAT testing and end-to-end testing.
- Use mRemoteNG, PuTTY and WinSCP for file ingestion and ACH modernization using the FTM application.
- Work with other teams such as FLTS and FLIS to coordinate data validations.
- Test the workflow and processing of FTM transactions and data integration with legacy applications.
- Coordinate with the business analysts and developers to discuss issues in interpreting the requirements to ensure all parties are aligned.
- Provide Testing Effort estimates and timely feedback on progress of all testing activities.
- Design, develop and conduct Black box, White box, Grey box, Functional, Integration, System, User Acceptance, Regression, Performance and Load Testing.
- Prepare and distribute execution results by gathering and analyzing test metrics for reporting.
- Validate tests by cross-checking backend data in Oracle, using SQL queries to retrieve data and confirm integrity (a sample query appears after this list).
- Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.
- Experience in testing ETL mappings: data transformation, sourcing, conversion and loading.
- Involved in testing the BI reports using Cognos.
- Provide test case walkthroughs for team members and QA Manager/Lead.
- Work with the QA Manager/QA Lead to plan, schedule and execute test cases, as needed.
- Participate in PRD reviews, covering business impacts, feedback and final approvals.
- Use ALM/Quality Center for managing the defect flow.
- Work actively with the UFT automation team to help automate manual test scripts.
- Apply experience with Software Development Life Cycle (SDLC) to all projects.
- Conduct Manual and Automated testing on Client/Server and Web-based applications.
- Expertise using JIRA to write user stories and track defects with the IBM onsite team.
- Execute system quality assurance activities on complex n-tier systems to ensure quality control of test plan design, execution, defect tracking and implementation plans.
- Track bugs using bug-tracking tools such as ALM (Quality Center) and JIRA.
- Maintain Requirements Documentation and prepare Traceability matrix.
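A minimal sketch of the kind of backend validation query referenced above; the schema and column names (STG_FTM_PAYMENTS, ACH_TRANSACTIONS, TRAN_ID, TRAN_AMT) are hypothetical placeholders, not the actual project tables.

    -- Hypothetical example: confirm every staged FTM payment loaded into the
    -- target table with a matching amount (presence and value integrity).
    SELECT s.tran_id,
           s.tran_amt AS staged_amt,
           t.tran_amt AS loaded_amt
    FROM   stg_ftm_payments s
           LEFT JOIN ach_transactions t
                  ON t.tran_id = s.tran_id
    WHERE  t.tran_id IS NULL             -- record never loaded
       OR  s.tran_amt <> t.tran_amt;     -- amount mismatch after load
    -- An empty result set indicates the source and target are in sync.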
Environment: ALM, JIRA, SQL, PL/SQL, TOAD 7.0, IBM DB2, COBOL, Copy Books, MVS, IMS DB/DC, Control-M, ISPF, BI Reporting (Cognos 7.0), Informatica PowerCenter 8.1 (Designer, Workflow Manager, Workflow Monitor), SQL*Loader, Oracle 8i, SQL Server
Confidential, Minneapolis, MN
Sr QA Analyst
Responsibilities:
- Planned, coordinated and implemented QA methodologies: Agile in DEV and SIT, Hybrid in UAT; performed functional, integration and regression testing.
- Participated in Agile daily/weekly meetings, sprint planning, grooming sessions and KT and DRB meetings.
- Filed and linked defects using ALM/JIRA.
- Coordinated with Automation, Performance testing and offshore teams to help them understand the application; provided test scripts and relevant test data. Developed checklists to verify that the functional specifications, use cases and UI specification documents were within the guidelines specified by the client.
- Updated mobile app builds on the server for online availability so the team could download and install the application on platforms (OS) such as Apple iOS, Android, tablet editions and Windows. Initiated a test method called 'Spot testing' to test the application on the go with developers.
- Performed end-to-end testing of core mobile applications across Apple, Android and Windows smartphones and tablet devices; also validated OS and browser support. Hands-on experience with complete regression testing on Mobile App, Mobile Web and Windows desktop, reviewing the source code after each build for every sprint. Performed full regression testing covering cross-browser and cross-functionality testing.
- Tested enterprise web services using SoapUI and Transaction Viewer.
- Performed XML request and response validations using Transaction Viewer and XML Pretty Smart.
- Validated backend databases using SQL Server 2008 and Oracle Developer.
- Participated in regression testing after every code deployment and performed high-level, hands-on smoke tests.
- Performed regression testing using UFT automation scripts and debugged failed test cases.
- Performed smoke test validation after deployment using the UFT test suite.
- Wrote advanced SQL queries for extracting data from databases (Hogan, FK) and used UNIX commands for moving, accessing, copying and changing file permissions (a sample query appears after this list).
- Provided complete UAT support, interacting with end business users, and organized UAT approvals in the project's shared repository.
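A minimal sketch of the kind of extraction query described above, in Oracle-style SQL; the schema, table and column names (HOGAN.DEP_ACCT_MASTER, HOGAN.DEP_TRAN_HIST) are hypothetical, since the actual Hogan/FK structures are proprietary.

    -- Hypothetical example: pull the last 30 days of posted transactions for a
    -- test account so UI values can be reconciled against the backend in UAT.
    SELECT a.acct_nbr,
           a.acct_status,
           t.post_date,
           t.tran_code,
           t.tran_amt
    FROM   hogan.dep_acct_master a
           JOIN hogan.dep_tran_hist t
             ON t.acct_nbr = a.acct_nbr
    WHERE  a.acct_nbr = :test_acct_nbr
      AND  t.post_date >= TRUNC(SYSDATE) - 30
    ORDER  BY t.post_date DESC;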
Environment & Tools: Windows 7, UNIX, Web Services, SOA, Java, SQL, JIRA 6.0, ALM 13.1, UFT, J2EE, XML, SoapUI, JSON, Android 5.0/6.0, iOS 9/10.2, MS SQL Server Management Studio, ClearCase 8.0, SQL Server 2008