QA Analyst Resume
OBJECTIVE
To obtain a position as a QA Analyst testing client-server, web, and middleware applications.
EDUCATION
BS in Mechanical Engineering
SPECIAL SKILLS
- 14+ years of Software Quality Assurance experience.
- 12 years of ETL testing, batch load testing, Oracle, and data warehouse experience.
- 6 years of lead experience managing onshore and offshore QA teams.
- 8 years of application installation, configuration, integration, code migration to production, and creating and setting up test environments and test beds.
- 5 years of UAT (User Acceptance Testing).
- 5 years of test automation using QTP, plus 2 years with other tools.
- 5 years of performance/load testing using LoadRunner, plus 2 years with other tools.
- Good working knowledge of Informatica Designer, Workflow Manager, and Monitor.
- Highly skilled in writing Oracle SQL queries and in operating UNIX systems with commands and shell scripting.
- C# programming skills on the .NET Framework.
- Good knowledge of SAP BW/BI (data warehouse) architecture and design for developing reports from R/3 ECC systems.
- Skilled in white box testing and in mainframe, DB2, IVR, CTI, and ICM testing.
- Experienced in FIX, FIXML, and TIBCO message testing.
CERTIFICATION
- Certified Software Test Engineer (CSTE) from IQA – Institute of Quality Assurance, Orlando, Florida.
- Oracle Certified Associate by Oracle Corporation.
- Credentialed in LoadRunner and QTP by an HP/Mercury partner.
SUMMARY
- Strong skill in managing test efforts on schedule. Able to take direction from senior team members and implement it. A decisive, can-do, get-it-done attitude. Strong analytical and problem-solving skills.
- Skilled in estimating project testing costs and allocating resources (people, machines, time). Solid skill in creating test plans, test strategies, test approaches, test requirements, test procedures, and test cases.
- Experience includes five full development life cycles from scratch to finished product. Expert in functional, system, GUI, integration, transaction, configuration, end-to-end, documentation, installation, user acceptance (UAT), sanity, smoke, regression, performance, stress, and load testing.
- Strong working knowledge of writing SQL and loading data into staging tables from source systems (flat files, mainframe files, DB2, Oracle), validating staging data against source data, then loading data from staging to warehouse tables with Informatica ETL and validating target data against source data (a sample validation script follows this list). Strong working knowledge of loading data into database tables using SQL*Loader and INSERT scripting. Excellent understanding of relational database (RDBMS) and data warehousing (MDBMS) systems.
- Excellent working knowledge of automation tools such as QTP. Skilled in developing and enhancing automation scripts, maintaining and customizing them to support regression testing against new builds, and parameterizing them to run with multiple sets of data for broader test coverage.
- Excellent understanding of the Software Development Life Cycle (SDLC), QA processes, Six Sigma, Agile methodology, and RUP. Contribute to ongoing QA process development. Create, review, and obtain approval for SDLC documentation: test plans, test coverage matrices, test results reports, test scripts, and other documentation as needed. Ensure that projects meet required QA standards for functionality and reliability prior to release.
- Participated in various meetings and sessions for documentation review, sign-off, walkthroughs, and release planning. Reviewed documentation for thoroughness and accuracy and provided feedback to Business Analysts. Expert in business communication. Developed relationships with other groups: development teams, Business Analysts, Project Managers, UAT test teams, internal customers, and the test teams of interfacing systems such as Fidelity, Hogan, MAX, and Middleware.
- Provide daily/weekly status to the project team and internal customers during testing cycles. Facilitate bug triage meetings and the gathering of metrics.
- Worked for high-end companies such as Chicago Mercantile Exchange Group, Confidential, Washington Mutual Bank, Chase Bank, ABN AMRO Bank, DLJ brokerage, SBC Ameritech, AT&T, Williams Communications, and ComEd.
- Extensive working knowledge of Oracle, UNIX, and IBM VMS mainframe systems. Expert in MS Office applications: Excel, Word, PowerPoint, Outlook. Some exposure to UNIX administration and Oracle DBA work.
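As a rough illustration of the staging-to-warehouse validation described above, a minimal sketch is shown below as a shell script driving SQL*Plus; the table names, columns, and connection string are hypothetical placeholders, not details from any listed project.

#!/bin/sh
# Minimal ETL validation sketch (hypothetical schema and credentials).
# Compares row counts and runs a MINUS check between a staging table
# and the warehouse target table it feeds.

CONN="qa_user/qa_pass@TESTDB"     # placeholder connection string

sqlplus -s "$CONN" <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF

-- Row-count comparison: both sides should match after the ETL load.
SELECT 'STG_COUNT=' || COUNT(*) FROM stg_customer;
SELECT 'TGT_COUNT=' || COUNT(*) FROM dw_customer;

-- Column-level check: rows present in staging but missing or changed
-- in the target. An empty result means the load is consistent.
SELECT cust_id, cust_name, balance FROM stg_customer
MINUS
SELECT cust_id, cust_name, balance FROM dw_customer;
EOF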
SKILLS & TRAINING
Software & testing tools: HP Quality Center, TOAD, SQL Developer, APEX, JIRA, Informatica ETL, QTP (QuickTest Professional), LoadRunner, RequisitePro, CQTM, RMT, UC4, BladeLogic, Advanced Query Tool (AQT), Attachmate, SharePoint, CTS, E Load, WinRunner, E Test, IBM Rational Suite (SQA Robot and SQA Manager), soapUI, Bean Test, ClearQuest, ClearCase, Crystal Reports, SQL*Plus, Microsoft Visual SourceSafe, Visio, Remedy, Microsoft Office products, Lotus Notes, WordPerfect, Lotus 1-2-3, Eclipse, Rumba, Host, QPID, Python, JBoss.
Test skills: System testing, integration testing, functional testing, GUI validation, performance, transaction, installation, configuration, documentation, User Acceptance (UAT), sanity, server load, parallel, and regression testing.
Middleware: Experienced in testing middleware interfaces such as Enterprise JavaBeans, IBM adapters (UNIX programs), and Informatica.
Hardware: Sun Solaris, Sun workstations, HP, AIX, IBM VMS mainframe, AS/400, IBM PC & compatibles.
Languages: .NET with C#, UNIX shell scripting, Perl, Java, Visual Basic, BASIC, FORTRAN, JavaScript, HTML, XML, and the FIX and FIXML protocols.
Databases: Oracle Exadata, Oracle, Sybase, MS SQL Server, DB2, Progress, Red Brick, and Access.
Special Training
- Siebel 7.8 consultant training facilitated by Siebel Corporation in 2006.
- OBIEE 7 consultant training facilitated by Siebel Corporation in 2006.
- LoadRunner training facilitated by Mercury Inc. in 2007.
- SAP BW/BI consultant training facilitated by an SAP Inc. partner in 2007.
- MCSE – Microsoft Certified Systems Engineer training.
EXPERIENCE
Confidential, Rosemont, IL
QA Analyst (consultant) Sep 2011 to present
Confidential is the #1 foodservice supplier in the US, distributing more than 43,000 food and nonfood items to more than 250,000 customers: restaurants, hotels, schools, health care facilities, and other institutions. At Confidential, working as a QA Analyst on the project below:
OBIEE – Oracle Business Intelligence Enterprise Edition: OBIEE tracks Confidential business data such as sales and customer information. OBIEE reports are used by managers and other business users, and they help company executives make critical business decisions. Its back end is Oracle Exadata running on a Linux box.
Responsibilities included the following:
- Developed the test plan, test strategy, and test cases, and executed the test cases.
- For ETL testing, wrote the queries, ran them, and saved the output data as a baseline. After deploying the ETL program, ran the same queries and saved the output as the runtime result, then compared the runtime result with the baseline (a sample comparison script follows this list).
- Wrote and ran SQL queries in the Oracle Exadata environment to validate report data quality against the corresponding backend source data (Exadata).
- Performed functional, system, end-to-end, performance, and regression testing.
- Test tools used: LoadRunner for performance testing; HP Quality Center for writing and executing test cases, defect logging, and Requirements Traceability Matrix coverage; TOAD and SQL Developer to access databases; an in-house ETL tool for data extract, transform, and load; and APEX to modify application data.
- Attended project meetings and communicated with development and other teams to resolve issues.
- Participated in production migration activities as part of the release/build migration team.
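The baseline/runtime comparison above can be sketched as a shell wrapper around SQL*Plus; the query, table, file names, and connection string are hypothetical placeholders, and the real queries would come from the report requirements.

#!/bin/sh
# Baseline-vs-runtime comparison sketch (hypothetical names).
# Usage: run_compare.sh baseline   (before the ETL deployment)
#        run_compare.sh runtime    (after the ETL deployment)

PHASE="$1"                          # "baseline" or "runtime"
CONN="qa_user/qa_pass@EXATEST"      # placeholder test connection
OUT="sales_summary_${PHASE}.txt"

sqlplus -s "$CONN" <<EOF
SET PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON
SPOOL $OUT
SELECT region, product_id, SUM(sales_amt)
FROM   sales_fact
GROUP  BY region, product_id
ORDER  BY region, product_id;
SPOOL OFF
EOF

# Once both files exist, any difference is a candidate defect.
if [ "$PHASE" = "runtime" ] && [ -f sales_summary_baseline.txt ]; then
    diff sales_summary_baseline.txt "$OUT" > compare_result.txt \
        && echo "PASS: runtime matches baseline" \
        || echo "FAIL: see compare_result.txt"
fi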
Confidential, Chicago, IL
QA Engineer (consultant) Oct 2010 to Sep 2011
Confidential is the world's largest commodity trading exchange and the provider of the widest range of benchmark futures and options products available on any exchange, covering all major asset classes. CONFIDENTIAL maintains two data systems, DB2 on the mainframe and Oracle on the UNIX platform, and both are needed to run CONFIDENTIAL business. Bridge-back and bridge-forward mechanisms keep the data in sync between the two systems: data is loaded from Oracle to the mainframe through the bridge-back process (an ETL tool), and from the mainframe to Oracle through the bridge-forward process (an ETL tool). CONFIDENTIAL uses DB2, Oracle, and Blue Phoenix databases running on mainframe and Unix/Linux systems. Worked as a QA Engineer on the applications below.
Organization Account application: A Java-based GUI application that keeps track of all customer information, such as bank accounts, position accounts, risk accounts, and requirement accounts, along with other business data. Its backend is mainframe DB2, Oracle, and UNIX.
Calendar Plus application: A Java-based GUI application that keeps track of holidays, weekends, and business days of the year for countries around the world. It helps the Clearing Department clear trade transactions with foreign exchanges. Its backend is DB2 running on the legacy system and Oracle running on a UNIX box.
TIPS – Theoretical Intraday Pricing System: The TIPS application keeps track of prices for the exchange's trading products. Real-time prices flow through the FIXML messaging system to update the old prices in the backend data storage system. TIPS' backends are Oracle and DB2 running on UNIX and the mainframe.
Security System (Service Control): A new security access control system that CONFIDENTIAL has implemented for access to all applications. Every user and application goes through this security system, and each user has two IDs: an LDAP (Lightweight Directory Access Protocol) ID and a mainframe ID.
Responsibilities included the following for all the projects above:
- Developed the test plan and test cases, and executed the test cases.
- Validated frontend data against its corresponding backend source data.
- Used a Java test harness to verify that API method calls in the new build behaved properly compared with the baseline behavior, before and after build migration.
- Ran PL/SQL scripts to create tables, indexes, functions, and other database objects to build a database in the test environment (a sample build script follows this list).
- Updated and executed Perl and shell scripts as needed to configure different UNIX environments (sandboxes) for migrating new application releases.
- Refreshed database tables with production data, modifying system configuration files (shell scripts) as necessary for the refresh.
- Researched and analyzed script run log files (UNIX files) to identify root causes when defects were found.
- Communicated with development and other teams to resolve issues/defects promptly.
- Migrated software builds into the test environment.
- Attended various project meetings and led test meetings when needed.
- Tools used: TOAD for accessing the Oracle database, AQT (Advanced Query Tool) for accessing the DB2 database, Attachmate to access the mainframe, PuTTY to access Unix/Linux, Beyond Compare to compare files, JIRA and SharePoint for release document management, ClearCase for software version control, CQTM, RequisitePro, and RMT for test case writing, test execution, and defect logging, UC4 for job scheduling, and BladeLogic for software build migration.
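The test-database build mentioned above (DDL run through SQL*Plus) might look like the following sketch; the table, index, and function names and the connection string are made-up placeholders, not the actual CONFIDENTIAL schema.

#!/bin/sh
# Test-schema build sketch (hypothetical objects and credentials):
# runs the DDL needed to stand up a small test database.

CONN="qa_owner/qa_pass@QATEST"      # placeholder test-environment connection

sqlplus -s "$CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

CREATE TABLE org_account (
    account_id    NUMBER        PRIMARY KEY,
    account_type  VARCHAR2(20)  NOT NULL,   -- e.g. bank / position / risk
    org_name      VARCHAR2(100),
    created_dt    DATE          DEFAULT SYSDATE
);

CREATE INDEX org_account_type_ix ON org_account (account_type);

CREATE OR REPLACE FUNCTION account_count (p_type IN VARCHAR2)
RETURN NUMBER IS
    v_cnt NUMBER;
BEGIN
    SELECT COUNT(*) INTO v_cnt FROM org_account WHERE account_type = p_type;
    RETURN v_cnt;
END;
/
EOF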
Confidential, Chicago, IL
Test Lead / Test Engineer (consultant) Oct 2009 to Sep 2010
For a technology company, worked on the projects below:
Enterprise GIS (EGIS): Was the Test Lead for testing the Enterprise Geospatial Information System (EGIS), which was developed and tested following the Agile methodology. EGIS is a source-data metadata repository that facilitates discovery, usage tracking, and sharing of the best available sources for a specific product lifecycle. EGIS performance is measured by improvements in productivity and quality for selected workflows and use cases as prioritized by users. EGIS is intended to serve as a “one-stop shop” for spatial discovery by users, with a convenient and responsive interface. Within this vision, EGIS acts as the primary data source discovery tool, using spatial queries that include admin-level, relational polygon, and/or feature-level queries through a user web interface. EGIS consists of three main process components: Data Ingest, Data Update, and Data Query. The main interface into the EGIS system is the EGIS Web Application, which is accessed over the CONFIDENTIAL intranet via a web browser. The web application is built using ASP.NET technology and leverages third-party controls such as Infragistics NetAdvantage for ASP.NET and the ArcMap API for JavaScript, as well as libraries such as the jQuery JavaScript Library. Internet Explorer (7.0+) is the target browser. The EGIS backend is Oracle Spatial 11g and IIS 6.0 running on Windows 2008 servers.
Worked as Test Lead and responsibilities included the following:
- Built the process from scratch for testing the EGIS system.
- Helped Confidential project management ensure all necessary documents were created per industry-standard QA process.
- Helped upper management by providing UNIX server specifications (for the application and database servers) for building a brand-new test environment.
- Reviewed the IRD, FRD, SAD, and Detailed Design Document (DDD) and provided feedback.
- Developed and implemented the Test Strategy.
- Peer reviewed the Test Matrix (test cases) and implemented the review feedback.
- Confirmed that all requirements were covered by test cases and that appropriate step-by-step detail was provided in the test cases.
- Installed and configured the whole EGIS system in Test and UAT environments.
- Used soapUI to automate UI functional and load testing.
- Executed test cases after successful installation of the EGIS system.
- Logged defects during execution of test cases.
- Conducted defect triage meetings on a regular basis.
- Performed functional, system, and integration testing.
- Used JIRA and CTS for defect management.
- Created the test report after test completion.
GPS Probe Data: GPS Probe Data is a system that processes probe data (points on Earth with latitude and longitude) and is able to accept and process high volumes of probe data on an ongoing basis. The purpose of this system is to discover missing geometry (e.g., roadways). The system consists of several components, such as the Linkset Generator (a jar file), Probe Viewer (a GUI application created with .NET Framework 2.0), and Admin Console (a Java web application). Its back end is Oracle 11g running on Linux servers. The probe processing components are Qpid and JBoss. This application was developed following the Agile methodology.
Responsibilities included the following as a Test Engineer:
- Analyzed IRD, FRD, and SAD documents to identify the test requirements and test strategy for creating the Test Matrix.
- Developed the Test Matrix, test scenarios, and test strategy.
- Performed white box testing of Probe Viewer to make sure probe data formed clusters with the expected shape.
- Created and set up the test environment/test bed.
- Installed and configured the GPS Probe Data system on Linux servers for each release and validated the install process.
- Created the database schema as part of the installation process.
- Performed testing to verify that the system produces consistent results for the same input data.
- Developed shell scripts to accelerate test activities, such as checking whether all processes are up and running, checking whether any errors occurred during the application run, and setting up environments (a sample script follows this list).
- Created test data and executed the test cases in the Test Matrix.
- Created CRs (defects/enhancements) during test execution and used the CTS system to track them.
- Performed backend database testing to validate frontend GUI data against the backend Oracle database for the Probe Viewer and Admin Console applications.
- Validated binary files (Linkset files) against their source RMOB database system.
- Validated processed .csv files (partition files) against their source binary files (probe point files).
- Performed business functional, GUI, performance, integration, installation, configuration, and database testing.
- Performed regression testing of new releases.
- Performed end-to-end testing to make sure all integrated systems work together as expected.
- Created the test results report and sent it to all stakeholders.
- Tools used: TOAD, ArcMap, ArcGIS, Bison, PuTTY, CTS, JIRA, SQL*Plus, etc.
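A minimal sketch of the kind of health-check shell script described above; the process names, log directory, and output file are assumptions for illustration, not the project's actual components.

#!/bin/sh
# Environment health-check sketch (hypothetical process and log names).
# Verifies the main components are running and scans run logs for errors.

PROCESSES="linkset_generator admin_console qpidd"   # placeholder process names
LOG_DIR="/opt/gpsprobe/logs"                        # placeholder log location
STATUS=0

for p in $PROCESSES; do
    if pgrep -f "$p" > /dev/null 2>&1; then
        echo "OK:   $p is running"
    else
        echo "FAIL: $p is NOT running"
        STATUS=1
    fi
done

# Any ERROR/Exception lines written during the application run are suspects.
if grep -iE "error|exception" "$LOG_DIR"/*.log > error_summary.txt 2>/dev/null; then
    echo "WARN: errors found in run logs, see error_summary.txt"
    STATUS=1
fi

exit $STATUS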
Confidential, Downers Grove, IL
QA Lead / QA Specialist (employee) Oct 2002 to Aug 2009
For a financial company, worked as a Test Lead and QA Engineer on several projects, described below.
OBIEE – Oracle Business Intelligence (formerly Siebel Analytics): Performed the Test Lead role for testing the OBIEE application, a web-based reporting tool, from the start of application development through to the finished product. Responsibilities included attending JAD sessions, estimating project testing cost, allocating testing resources, writing the test plan, test strategy, and test cases, executing the test cases, conducting daily triage meetings, assigning tasks to testers, managing onshore and offshore test teams, and creating the test progress status report and sending it to project stakeholders. Performed ETL validation of target data against source data and analyzed the ETL mappings for target data validation. Installed OBIEE in the test environment myself to create the test bed. OBIEE generates reports for senior-level managers and executives, giving them the ability to monitor, analyze, and act on intelligence reports in real time while providing end-to-end visibility into company operations and financial performance. The OBIEE application was developed in parallel with the Siebel CRM application. The application back ends are Oracle data marts running on Sun and AIX boxes.
Siebel CRM: Provided testing and quality expertise as a member of the project team charged with converting the legacy CRM system to a more robust Siebel CRM application, a web-based Siebel/Oracle package customized for WaMu by the development team. Developed and executed the full testing cycle, including writing the test plan, test estimation, test strategy, and test cases, executing the test cases, managing resources, and leading onshore and offshore teams. Expert in using Siebel Tools. Ensured efficiency and accuracy in both implementation and conversion, enabling a seamless transition for divisions including the Enterprise Call Center, Consumer Direct, and National Sales. The backend systems are the IBM VMS mainframe and Oracle 10g running on Hogan, Fidelity, Sun, and AIX boxes.
IVR, ICM, CTI: Performed comprehensive testing of the IVR (Interactive Voice Response) application for 24/7 customer banking inquiries to ensure high-quality responses and prompt referral to an appropriate banker when further assistance is required. Wrote the test plan and test cases and executed the test cases. Identified potential quality issues and efficiently developed and implemented corrections. The IVR system was also integrated through Siebel CTI and fed information to the ICM system, routing each call to the banker with the appropriate skill set.
Data Conversion: Provided testing expertise on the Hogan Mainframe Data Obfuscation Project to ensure compliance with Federal regulations requiring customer personal information (e.g., SSN, name, account number) to be scrambled. Production data was extracted into a flat file and processed through the data obfuscation program. Successfully performed data validation before (baseline) and after (runtime) the obfuscation process run.
Code migration/Production implementation: Performed application code migration activities during production implementation of new application builds; code was migrated from the test environment to the production environment.
Data EIM: Populated the Siebel test database through the EIM (Enterprise Integration Manager) process.
Performance Reporting: Performance Reporting is a reporting and analysis tool developed in Business Objects (BO). Used the Informatica ETL tool to validate target data against the source system. This reporting tool is used by company managers and executives to monitor and analyze the operations and financial performance of the company.
Responsibilities included the following for all the projects above:
Manual Testing
- Analyzed requirement catalogs and functional and technical specifications.
- Developed the test plan and test strategy. Calculated the test estimation (TA).
- Created and executed test cases. Created test data to meet the test conditions.
- Directed offshore and onshore teams as team lead.
- Performed UAT along with the UAT team.
- Performed code migration and data EIM to the database server.
- Performed batch testing and backend database testing. Performed ETL data validation of target data against source data.
- Developed the database schema and tables in the test environment. Developed SQL queries to populate database tables in the test environment.
- Used SQL*Loader to populate tables from flat files in the test environment (a sample control file follows this list).
- Used Siebel Tools for data mapping between source and target.
- Created control files for SQL*Loader. Searched for test data in various mainframes (Hogan, Fidelity, FDR).
- Performed data mining with SQL queries against the DB2 database to find test data for Siebel CRM testing.
- Used the Informatica ETL tool for target data validation. Analyzed the ETL mappings using Informatica Designer and analyzed the source systems using Informatica Workflow Manager.
- Developed and implemented a Six Sigma test process to improve test quality.
- Created test reports in Test Director/Quality Center for analyzing the test effort and its progress.
- Documented the test results report.
- Performed usability, system, access control, functional & GUI, end-to-end, integration, and database testing.
- Installed OBIEE on test machines for testing.
- Used Test Director/Quality Center for writing and executing test cases and for tracking defects.
- Performed regression testing to ensure release modifications did not break other functionality.
- Confirmed the quality of testing by participating in all phases of the development lifecycle.
- Participated in various meetings such as JAD (Joint Application Development) sessions, functional spec and design spec walkthrough meetings, and QA sign-off meetings.
- Highly skilled in defect management; led defect triage meetings on a daily basis with the development team.
- Performed system, integration, and functional testing of the IVR, CTI, and ICM systems.
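The SQL*Loader work above could be sketched as follows; the table, columns, delimiter, and file names are illustrative assumptions rather than the actual project objects.

#!/bin/sh
# SQL*Loader sketch (hypothetical table, columns, and file names):
# loads a pipe-delimited flat file into a test table.

cat > customer_load.ctl <<'EOF'
LOAD DATA
INFILE 'customer_extract.dat'
BADFILE 'customer_extract.bad'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cust_id,
  cust_name,
  account_no,
  open_dt  DATE "YYYY-MM-DD"
)
EOF

# Placeholder credentials; the log and bad files are reviewed after each load.
sqlldr qa_user/qa_pass@TESTDB control=customer_load.ctl log=customer_load.log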
Performance testing with LoadRunner for the web/Siebel CRM application
- Analyzed business requirements for load testing and developed the test plan.
- Created Vuser scripts using the Virtual User Generator.
- Enhanced and modified the Vuser scripts as per requirements.
- Established the baseline behavior, developed scenarios based on business requirements, and executed the scenarios using the Controller.
- Analyzed graphs and reports after the scenario runs and performed troubleshooting to fix the bottlenecks that degraded performance.
- Created and published the performance test results.
- Provided effective communication with other teams regarding test status and issues.
Test automation with QTP for the web/Siebel CRM application
- Analyzed requirements for test automation and wrote the test plan and test cases.
- Created QTP test scripts for the application to perform regression testing of each new build/release.
- Able to code in VBScript.
- Customized the scripts to validate GUI data against backend data (database and mainframe data).
- Parameterized the tests and created checkpoints and output values.
- Modified the scripts to enhance testing by validating application data against the backend database and mainframe.
UAT – User Acceptance Testing
- Completed all arrangements for UAT of the Siebel CRM application.
- Worked as a UAT team member performing acceptance testing.
- Also worked as a liaison to ensure availability of all testing facilities, such as test-related documents, test machine and application setup, test data, and test tool readiness.
- Reported testing progress to project stakeholders on a regular basis.
- Collated final UAT test results and reported in writing to stakeholders.
Confidential, Lincolnshire, IL 7/01 to 9/02
QA Engineer (employee)
For a software development company, worked as a Quality Assurance Engineer testing a UNIX-based application named ClearView. ClearView is complex software for managing optical, broadband, and digital telecom and wireless networks for fault and performance management. This software allows long-distance carriers and local-exchange carriers to maintain their circuits and networks, thereby decreasing downtime, reducing costs, and improving quality of service through efficient operations. Two intelligent interfaces, DCD (Data Collection Daemon) and NEI (Network Element Interface), analyze and compute various factors and dependencies (such as thresholds, port groups, slices, and elements), interact with and send information to all ClearView component software, populate the database with problematic seconds, and control overall system activities. The ClearView suite has several components: Circuit View (written in C/C++), Early Warning (C/C++), Old Report Card (C/C++), New Report Card (Java/J2EE in a WebSphere environment), Old Clear Admin (CGI, Perl, JavaScript), and New Clear Admin (Java/J2EE in a WebSphere environment). The back end was Oracle 8i running on Sun 250 and 450 boxes.
Responsibilities included the following:
- Analyzed business documents and system specs to identify the test requirements and test strategy, outlining an appropriate matrix of testing to achieve optimum system quality.
- Developed the test plan, test strategy, test procedures, test cases, and test data, and executed the test cases.
- Developed UNIX shell programs to automate system configuration.
- Developed small stored procedures in PL/SQL for repeatedly used complex database queries.
- Created different test environments with specific versions of Solaris (2.6, 2.7) and specific versions of Oracle (8.1.5, 8.1.7, 9i) to simulate customer production environments.
- Installed the Solaris operating system and recommended patches on Sun Enterprise 250 and 450 boxes.
- Installed Oracle 8i and Oracle 9i software and created Oracle SIDs on Sun Enterprise 250 and 450 boxes.
- Installed the ClearView software suite (Circuit View, Early Warning, Report Card, and Clear Admin) on Sun boxes and PCs.
- Performed white box testing by accessing and modifying source code for the ClearView component software. Validated signal strength, differentiating weak and strong signals.
- Worked in the UNIX environment on installation and configuration of the ClearView software.
- Performed backend database testing to validate the statistics produced by the ClearView software.
- Performed business functional, system, GUI, performance, integration, installation, configuration, volume, and parallel testing.
- Performed sanity and regression testing to ensure software modifications did not affect existing software functionality.
- Created test results and progress reports for test evaluation and further enhancement of the system.
- Used ClearQuest (a Rational product) to log defects.
- Used Crystal Reports to generate ClearQuest defect reports.
Confidential, Hoffman Estates, IL 4/01 to 7/01
QA Analyst (consultant)
For a large telecom company, worked as a QA Analyst testing middleware applications. The middleware was written in Enterprise JavaBeans (EJB) and in UNIX shell. These middleware applications map data from tier to tier, such as client to database, client to application server, and client to legacy system, and vice versa. Used Bean Test (a tool for testing JavaBean applications) to test the EJB middleware; UNIX middleware applications were tested at the UNIX command line. Used WinRunner for regression testing of Windows applications for SAM Services. The backend databases are Oracle and Sybase running on UNIX boxes, and DB2 running on the mainframe. Worked on various SBC products, such as Data gate, SAM, and Bean services.
Responsibilities included the following:
- Reviewed and analyzed business requirements and functional and technical specifications.
- Developed the test plan and test strategy. Created test procedures and test cases.
- Created test scripts written in Perl and UNIX shell. Created test data with different test conditions to execute the test scripts. Developed Perl scripts to insert test data automatically into the test scripts for the various test conditions.
- Developed UNIX shell scripts to execute multiple test scripts (written in Perl and UNIX shell) at a time; also executed test scripts manually (a sample driver script follows this list).
- Analyzed and evaluated test results against expected results after test script execution.
- Developed several macros (MS Excel and MS Word) to build test result summaries. The macros grab specific required data from UNIX output results and put it into MS Excel and/or MS Word.
- Performed functional, performance, and system testing of the EJB middleware using the Bean Test tool (a product of Empirix Inc.).
- Performed command-line functional and system testing of the UNIX middleware applications.
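A minimal sketch of the kind of driver script mentioned above; the suite directory and file-naming convention are assumptions for illustration, not details from the actual project.

#!/bin/sh
# Test driver sketch (hypothetical directory layout): runs every Perl and
# shell test script in a suite directory and records a pass/fail summary.

SUITE_DIR="./test_scripts"          # placeholder location of the test scripts
SUMMARY="test_summary.txt"
: > "$SUMMARY"

for script in "$SUITE_DIR"/*.pl "$SUITE_DIR"/*.sh; do
    [ -f "$script" ] || continue
    case "$script" in
        *.pl) perl "$script" > "${script}.out" 2>&1 ;;
        *.sh) sh   "$script" > "${script}.out" 2>&1 ;;
    esac
    if [ $? -eq 0 ]; then
        echo "PASS  $script" >> "$SUMMARY"
    else
        echo "FAIL  $script" >> "$SUMMARY"
    fi
done

cat "$SUMMARY"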