Senior QA Analyst/Test Engineer Resume
Des Moines, IA
OBJECTIVE:
A detail-oriented professional with experience in Software Quality Assurance (QA) and software testing across various platforms and environments. Experienced in writing and executing test plans, designing test cases and procedures, creating bug documentation, and producing test summary reports. Seeking a position where I can enhance my expertise and apply my technical, analytical, and business skills to contribute to the organization’s growth and deliver value.
SUMMARY:
- Nine years of experience in Information Technology, with an emphasis on software testing and IT Infrastructure (ITI) services.
- Experience as a Quality Assurance Engineer in QA methodologies and QA testing of various application systems using manual and automated testing techniques (QTP) for both web and client/server based applications.
- Familiar with the Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), test design, test strategy derivation, and testing methodologies and processes.
- Experience in both the Waterfall and Agile (Scrum) development models.
- Good understanding of quality assurance testing methodology relative to the Software Development Life Cycle (SDLC).
- Experience in Quality Assurance Life Cycle (QALC) activities such as developing test strategies, test plans, test cases (manual and automated), defect reports, test scripts, traceability matrices, and test environments.
- Expertise in functional testing, integration testing, black box testing, GUI testing, backend testing, regression testing, and browser compatibility testing.
- Experienced in automated testing using HP/Mercury QuickTest Professional and Quality Center.
- Expertise in analysis of Software Requirement Specifications and development of test plans, test scripts, test cases, and test analysis reports.
- Proficient in defect management, including defect creation, modification, tracking, and reporting, using industry-standard tools such as Quality Center 10.0.
- Expertise in developing test scenarios, preparing test data, and creating bug documentation.
- Excellent knowledge and working experience with test execution and test results analysis.
- Documentation and execution of User Acceptance Testing for business applications.
- Creating automation resources such as object repositories, function libraries, etc.
- Creating tests/test scripts using keyword-driven methodology or descriptive programming.
- Experience in testing ETL Informatica mappings and other ETL processes (data warehouse testing).
- Experience in testing Section 508 compliance.
- Experience in ETL testing using the Informatica tool, verifying that data is loaded to the target within estimated time limits (i.e., evaluating workflow performance).
- Validated data and performed end-to-end testing using SQL queries.
- Experience in operational testing, validation, and acceptance.
- As a member of the Operational Acceptance Testing (OAT) team, focused on integration, vendor, and end-to-end testing.
- Experience in testing software applications and meeting the project goals within the time constraints.
- Experience in estimating test effort and coordinating test schedules with overall project schedules.
- Documented issues, mitigated risks, and identified internal and external dependencies and the communication plan.
- Performed execution of User Acceptance Testing and tracked UAT execution metrics and defects.
- Experience with version control, build systems, and CI/CD.
- Involved in migrating key systems from the current on-premises host to Amazon Web Services (AWS).
- Involved in utilizing the AWS stack.
- Liaise with developers, business analysts, and user representatives in application design and document reviews.
- Leverage knowledge of object-oriented programming to help validate, verify, communicate and resolve software issues through careful, thoroughly documented testing to maximize return on investment (ROI) for IT initiatives.
- Experience with COTS-based application systems, identifying limitations and impacts to testing during vendor upgrades.
- Knowledge of automation strategy and tools such as Cucumber and Selenium.
- Working experience with Autosys infrastructure, monitoring Autosys jobs, and incident management.
- Strong team player with superior analytical, troubleshooting, communication and presentation skills.
- Focused on quality improvement and quality adherence, working towards customer satisfaction.
Soft skills:
- Excellent oral & written communication skills.
- Very good team player.
- Fast learner & can work independently with minimal or no supervision.
TECHNICAL PROFICIENCY:
Testing Tools: Manual testing, automated testing, SoapUI 5.0.0, QTP 10.0, LoadRunner 9.0, HP ALM.
Languages: SQL, XML; basic familiarity with C# and Java.
Databases: Microsoft SQL Server, Oracle 10g.
Operating Systems: MS-DOS, Windows 7/Vista/2000/XP/98, UNIX (Linux, Solaris)
Web Technologies/Tools: HTML, MS Office Suite (Word, Excel, PowerPoint), MS SharePoint
PROFESSIONAL EXPERIENCE:
Confidential
ETL Tester/Senior Test Engineer
Responsibilities:
- Extensively involved in analyzing business and functional requirements; reviewed use cases and other functional requirement documents for testability.
- Performed field-level output data validations to confirm that each transformation is operating correctly.
- Tested the functionality of Informatica workflows and their components, including all transformations used in the underlying mappings.
- Checked data completeness, ensuring that the projected data is loaded to the target without any truncation or data loss (see the SQL sketch after this list).
- Verified that data is loaded to the target within estimated time limits (i.e., evaluated workflow performance).
- Ensured that the workflow does not allow any invalid or unwanted data to be loaded into the target.
- Designed test cases based on the requirements in the SRS; responsible for functional, regression, and integration testing.
- Designed test cases for manual testing and test scripts for automation using HP QuickTest Professional (QTP) to check the functionality of the application.
- Performed formal reviews of the test cases developed by the team and ensured all functionality is covered.
- Analyzed System requirements to develop the Test strategy and detailed Test Plan.
- Coordinated with the infrastructure and business teams to identify the environment and data needs for testing the application across the various phases of the release.
- Performed execution of User Acceptance Testing and tracked UAT execution metrics and defects.
- Performed detailed test planning, developed testing strategies, and coordinated, tracked, and reported testing progress.
- Performed functional, black box, system and regression testing.
- Performed sanity testing to estimate testing readiness of the build for every release.
- Executed manual and automated test cases and documented execution reports.
- Created regression test cases and automated them for regression testing.
- Used Quality Center 10.0 to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
- Experience in operational testing, validation, and acceptance.
- Used Teradata utilities (MLOAD and FLOAD) to load the source files into the test region and queried the same data.
- As a member of the Operational Acceptance Testing (OAT) team, focused on integration, vendor, and end-to-end testing.
- Executed and updated the test results for the test cases in Quality Center 10.0.
- Executed SQL queries on Oracle tables to verify and validate data as per the data association documents.
- Modified data at the back end to test the respective scenarios according to business requests.
- Performed periodic cross-checks against the QA/SIT/PROD environments to ensure they are up and running.
- Managed and executed the traceability matrix of requirements to test cases.
- Responsible for migrating test cases from Excel spreadsheets to the Quality Center repository.
- Responsible for conducting the defect review meetings with the development and business teams to report and discuss the validity of defects opened by the testing team.
- Supported the QA team with the data needed for testing the application.
- Involved in migrating key systems from the on-premises host to Amazon Web Services (AWS).
- Involved in utilizing the AWS stack.
- Maintained close communication with the lead, development team, and business analysts throughout the project cycle to manage customer expectations and resolve issues.
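For illustration, a minimal sketch of the kind of source-to-target completeness check described above; the schema is assumed, and the table and column names (STG_CUSTOMER, DW_CUSTOMER, CUSTOMER_ID) are hypothetical placeholders rather than the actual project objects.

```sql
-- Hypothetical completeness check after an Informatica load (all names are placeholders).
-- 1) Row counts should match between the staging source and the warehouse target.
SELECT (SELECT COUNT(*) FROM STG_CUSTOMER) AS source_count,
       (SELECT COUNT(*) FROM DW_CUSTOMER)  AS target_count
FROM DUAL;

-- 2) Source rows missing from the target indicate truncation or data loss during the load.
SELECT s.CUSTOMER_ID
FROM   STG_CUSTOMER s
LEFT JOIN DW_CUSTOMER t ON t.CUSTOMER_ID = s.CUSTOMER_ID
WHERE  t.CUSTOMER_ID IS NULL;
```

Rows returned by the second query would typically be raised as defects against the load.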
Environment: Manual testing, Automated testing, Quality Center 10.0, SQL.
Tools: Quality Center 10.0, QuickTest Professional (QTP) 10.0, Toad for Oracle 8.6, SharePoint, Informatica, AWS.
Confidential, Des Moines, IA
Senior QA Analyst/Test Engineer
Responsibilities:
- Extensively coordinated with SMEs, investment data analysts, and IT staff for traceability in testing and test case creation.
- Understood the software requirements and change requests for new feature testing.
- Reviewed test cases in QC 10.0 and provided review comments for further changes.
- Performed system, functional, integration, end-to-end, user acceptance, and regression testing activities in the test and production support environments.
- Responsible for Staging / Mock-up of required inputs.
- Responsible for Test execution, test coordination and Test Strategy Development.
- Responsible for the environment setup for UAT Testing.
- Developed & executed Test scenarios for User Acceptance Testing.
- Executed backend data-driven test efforts with a focus on data transformations between various systems and a data warehouse.
- Prepared SQL queries to validate data as per business rules (see the sketch after this list).
- Performed execution of User Acceptance Testing and tracked UAT execution metrics and defects.
- Wrote complex SQL scripts used in RDBMS for back end testing.
- Conducted manual testing on the entire application.
- Coordinated software defect tracking efforts to ensure satisfactory defect resolution.
- Verified fixes and documented the status of defects appropriately.
- Wrote failure reports and handled defect escalation.
- Extensively involved in Triage / Issues management.
- Responsible for conducting the defect review meetings with the development and business teams to report and discuss the validity of defects opened by the testing team.
- Supported the QA team with the data needed for testing the application.
- Maintained close communication with the lead, development team, and business analysts throughout the project cycle to manage customer expectations and resolve issues.
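A minimal sketch of the kind of backend SQL validation referenced above, assuming a hypothetical business rule (trade amounts converted to USD) and placeholder table names (SRC_TRADE, DW_TRADE); the actual rules and schema were project-specific.

```sql
-- Hypothetical transformation check between a source system and the data warehouse
-- (table names and the USD-conversion rule are assumed for illustration only).
SELECT t.TRADE_ID, s.AMOUNT, s.FX_RATE, t.AMOUNT_USD
FROM   DW_TRADE t
JOIN   SRC_TRADE s ON s.TRADE_ID = t.TRADE_ID
WHERE  t.AMOUNT_USD IS NULL                               -- value lost in the load
   OR  t.AMOUNT_USD <> ROUND(s.AMOUNT * s.FX_RATE, 2);    -- rule applied incorrectly
```

Rows returned here would be candidates for defects tracked in ALM.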
Environment: Manual testing, Automated Testing, .NET Framework, SQL Server 2008.
Tools: LoadRunner 9.0, HP Application Lifecycle Management (ALM), SQL Developer, Microsoft Office.
Confidential, Bentonville, AR
Senior QA Analyst/Test Engineer
Responsibilities:
- Extensively coordinated with SMEs, investment data analysts, and IT staff for traceability in testing and test case creation.
- Involved in preparation of the test plan and test strategy.
- Involved in QA planning, coordination, and implementation of the QA methodology.
- Worked extensively with the migration team, development team, SMEs, and project management team to ensure smooth migration of the programs (see the sketch after this list).
- Performed the initial review of test cases in HP Quality Center and provided review comments for further changes.
- Provided daily/weekly status reports on test status, test bed, and defect triage to the project manager.
- Wrote failure reports and handled defect escalation.
- Managed defects and resolved general defect issues.
- Performed regression testing to ensure the stability of the system after enhancements and defect fixes.
- Involved in black box testing, including smoke testing, regression testing, User Acceptance Testing (UAT), and system testing, using PuTTY and manual testing in HP Quality Center.
- Actively participated in Status reporting and Defects discussion meetings.
- Verified and tested the final output/data in the form of reports.
- Attended daily/weekly checkpoint calls with project management, migration teams, and SMEs.
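An illustrative sketch of a post-migration data reconciliation between the legacy and migrated environments; LEGACY, MIGRATED, ORDERS, and ORDER_ID are placeholder schema, table, and column names, not the actual objects.

```sql
-- Hypothetical reconciliation of a migrated table (all names are placeholders).
-- Rows present in the legacy environment but missing after migration:
SELECT ORDER_ID FROM LEGACY.ORDERS
MINUS
SELECT ORDER_ID FROM MIGRATED.ORDERS;

-- Rows that exist only in the migrated environment:
SELECT ORDER_ID FROM MIGRATED.ORDERS
MINUS
SELECT ORDER_ID FROM LEGACY.ORDERS;
```

Both queries returning zero rows would indicate the table migrated without loss or unexpected additions.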
Environment: Manual testing, Automated Testing, Quality Center 10.0, Unix, Linux, Oracle, Shell scripts.
Tools: LoadRunner 9.0, Toad for Oracle 8.6, Microsoft Office.
Confidential
QA Analyst/Test Engineer
Responsibilities:
- Prepared test plans, test scripts, and test cases by understanding the business logic and user requirements.
- Worked with the development team to create a suite of test data (both input files and expected results) that fully exercises data validation (see the sketch after this list).
- Performed system, acceptance, regression, load, and functional testing on the application using both automated and manual testing methods.
- Found, documented, and reported bugs, errors, interoperability flaws, and other issues within proprietary software applications developed for UBS’s global user base.
- Experience in operational testing, validation, and acceptance.
- As a member of the Operational Acceptance Testing (OAT) team, focused on integration, vendor, and end-to-end testing.
- Escalated issues to developers and verified fixes.
- Prepared defect reports for daily/weekly status meetings.
- Attended daily/weekly/ad hoc defect report meetings and presented progress updates.
- Prepared test results and created/updated test summary reports.
- Documented test artifacts as required.
- Assisted in scheduling activities and training new members of the team.
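A minimal sketch of comparing actual application output against the expected-results portion of the test data suite mentioned above; EXPECTED_RESULTS, APP_OUTPUT, CASE_ID, and the value columns are hypothetical placeholders.

```sql
-- Hypothetical expected-vs-actual comparison for the prepared test data (names are placeholders).
SELECT e.CASE_ID, e.EXPECTED_VALUE, a.ACTUAL_VALUE
FROM   EXPECTED_RESULTS e
LEFT JOIN APP_OUTPUT a ON a.CASE_ID = e.CASE_ID
WHERE  a.ACTUAL_VALUE IS NULL                    -- no output produced for the case
   OR  a.ACTUAL_VALUE <> e.EXPECTED_VALUE;       -- output differs from the expectation
```

Mismatches surfaced this way would feed into the defect reports prepared for the status meetings.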
Environment: Manual testing, Quality Center 10.0.
Tools: Quality Center 10.0, QuickTest Professional (QTP) 10.0, Toad for Oracle 8.6.
Confidential
Systems Executive
Responsibilities:
- Systems administration, maintenance, and monitoring of various day-to-day operations.
- User and group administration: adding, modifying, and removing users and groups according to UBS standards.
- Providing value added customer services by attending to customer queries and issues.
- Monitoring of event engines, creating tickets for alerts, and coordinating with the L3 team.
- Involved in various ITIL processes like Change Management and Incident Management.
- Monitoring of Server health and availability.
- Scheduling jobs using the WCC tool.
- Monitoring mainframe jobs and scheduling jobs in the event engine through UNIX; analyzing and re-running failed jobs.
- Management of outages across the client's complete IT infrastructure; this involved identifying the root cause, identifying the impacted clients, and sending appropriate notifications on the status of the business impact.
Environment: Unicenter Enterprise job manager, UNIX, Autosys, Multi Domain Security (MDS) checks.
Tools: Workload Command Centre (WCC), Manual Testing, Automated Testing.