
Sr. ERP QA Lead / Site Administrator Resume


Columbus, OH

SUMMARY

  • Offering over 12 years of experience in the Software Quality Assurance field across diversified industries such as Pharmaceutical, Financial, Ecommerce, Telecommunications, Insurance and Higher Education.
  • Successfully led/implemented the Quality Assurance process for a $21 million project
  • Conducted reviews of current testing practices
  • Implemented testing best practices and strategies
  • Built an ERP testing team and testing framework from the ground up
  • Interviewed and hired QA Leads and QA Analysts for the team
  • Worked closely with the QA Director on testing budgets, departmental goals and strategies
  • Met with and trained ERP Directors to emphasize the importance of testing within the ERP-level SDLC
  • Reviewed and improved current testing practices for all departments of the OCIO
  • Helped implement new processes and change (push vs. pull workflow)
  • Reviewed/developed Test Plans, Test Cases, Test Reports and Traceability Matrices as per FRS and SDS.
  • Well versed in functional, integration, regression (top-down, bottom-up), GUI, black-box and white-box, back-end, browser compatibility, exploratory and sanity testing.
  • Well versed in different stages of the Software Development Life Cycle (SDLC) and Agile testing process.
  • Exceptional knowledge of automation and regression testing frameworks for end-to-end testing.
  • Created/executed SQL queries in HPQC for reporting out of the Dashboard module (see the example query after this list)
  • Communicated the testing process to managers and directors of each department
  • Well versed in Data-Driven/Keyword-Driven/Hybrid QTP frameworks.
  • Used Quality Center and TFS as defect tracking and reporting tools.
  • Well versed in finding defect clusters and defect-prone areas in the application to focus exploratory testing efforts.
  • Managed resource allocation for projects and performed capacity management
  • Coordinated internal/external teams and external projects for the client.
  • Improved the existing Agile testing process to ensure continuous quality improvement.
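
For illustration, a minimal sketch of the kind of SQL used for HPQC dashboard reporting, as referenced above. It assumes a standard Quality Center project schema (the BUG table and its BG_* columns); actual table and column names vary by QC/ALM version and project customization.

    -- Hedged sketch: defect counts by status and severity, feeding a dashboard graph.
    -- Assumes the standard Quality Center BUG table; names vary by version and customization.
    SELECT BG_STATUS, BG_SEVERITY, COUNT(*) AS DEFECT_COUNT
    FROM BUG
    GROUP BY BG_STATUS, BG_SEVERITY
    ORDER BY BG_STATUS, BG_SEVERITY;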

TECHNICAL SKILLS

Hardware: HP-9000, SUN Ultra SPARC, IBM Compatible PC

Operating Systems: Windows XP/2000/2003/2007, UNIX, Linux, Citrix, Novell

Protocols: HTTP, TCP/IP, FTP, Ethernet, Telnet, FIX, SSH, SOAP

Programming Languages: C++, C#, VB, HTML, DHTML, XML, ASP, JAVA, SQL, JavaScript, Ajax

Testing Tools: WinRunner 7.0/6.0, HP Quality Center 7.6/7.0/6.5/6.0, QTP, LoadRunner 7.0/7.6/11.51, Ruby 1.9.3, Gherkin, Cucumber, SOAP UI, Bonfire, GreenHopper

Bug Tracking Systems: Test Director 7.6, Rational ClearQuest 2001A, Remedy, Bug Tracker, TrackRecord 6.02, JIRA, Pivotal Tracker

Applications: Lotus Notes, rKaid, VNC, ConsoleOne, Documentum, eRoom, Terminal Emulator, Exceed, Reflection, Citrix Remote Desktop

Application Servers: WebLogic 5.1/6.1, ColdFusion Server 5.0, WebSphere 4.0, JRun 3.0

Web Servers: IIS 4.0/5.0, Netscape Enterprise Server (NES)

Databases: Oracle 8i, SQL Server 7.0, MS-Access 97/2000, SQL Server 2008/2010

Documentation Tools: MS Office, MS Project, Adobe tools, PageMaker, SharePoint

Methodologies: Waterfall SDLC, Agile SDLC, Kanban, Scrum, Extreme Programming (XP), TDD (Test Driven Development), ATDD (Acceptance Test Driven Development), Push vs. Pull Workflow, Lean, Six Sigma

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Sr. ERP QA Lead /Site Administrator

Responsibilities:

  • Led a team of Testing Coordinators for a $21 million PeopleSoft upgrade project (8.9 to 9.2).
  • Created testing strategies and milestones for the PeopleSoft upgrade project
  • Interviewed and hired new personnel for the QA department
  • Provided input for test plan reviews and test strategies
  • Provided input for go/no-go decisions on major projects migrating to the production environment
  • Proactively resolved testing bottlenecks for the PeopleSoft upgrade project, e.g., securing additional HPQC licenses before the SIT phase
  • Coordinated and implemented HPQC dashboard reports with testing coordinators and directors
  • Delegated QA Lead tasks to the testing coordinators, e.g., test plan creation, traceability matrices, requirement review processes, and roles and responsibilities
  • Coordinated and negotiated an additional HPQC license agreement with the vendor Genilogix
  • Provided input to the QA Director on renewal of the Genilogix support contract
  • Provided input and helped the OCIO Director and QA Director fund and move HPQC to a SaaS/cloud service model
  • Brought a plan that had been pending for 2 years to realization in my first 6 months, opening up HPQC access across campus buildings by working and negotiating with the ERP Security group, ERP Architect and QA Director.
  • Installed an SSL certificate on the HPQC server to secure data in transit
  • Managed resource allocation for projects and performed capacity management
  • Worked with personnel and directors of various departments under the OCIO to conduct/document a Phase I study of how each group conducts its testing in the current state
  • Conducted/defined Phase II for the OCIO organization, describing where we would like to be with respect to the testing framework
  • Proposed/conducted/implemented Phase III, defining how and when changes would be implemented, with specific milestones to measure progress.
  • Proposed testing/training strategies for the OCIO department using the HPQC tool
  • Created training material and trained end users on using HP Quality Center and its plug-ins to be more efficient at manual testing and at uploading requirements to HP Quality Center
  • Worked with various departments to learn their testing pain points and provided solutions
  • Created a Test Automation Workflow Plan
  • Prioritized test automation efforts to get the biggest ROI on the automation investment
  • Created automation scripts to cut regression testing time.
  • Created the TNR (Testing and Release) team department SharePoint site
  • Worked with the release management group to identify changes and regression effort for a particular release to production
  • Conducted weekly status meetings to communicate changes, cross-impacts on projects and new process changes to the testing group
  • Helped design the change management process for production defects and incident numbers through integration with HP Quality Center
  • Worked with the batch team to create reusable test case templates in HP Quality Center for different batch workflow testing types
  • Worked with the Test Coordinator on the Blue Chip project on testing effort and planning
  • Worked with end users to learn their UAT pain points and helped address these issues with the help of other team members.
  • Handled HPQC Site Administrator duties: user management, patch maintenance, and project/domain management
  • Worked with the vendor (Genilogix) to resolve HPQC issues caused by patch updates.
  • Worked with the DBA at Confidential on disaster recovery planning and project restore exercises/training.
  • Installed an SSL certificate for the HPQC link and opened up HPQC access around campus by using IP address filters and updating firewall rules.
  • Restored/recovered the HPQC dev server for the OCIO after the previous admin quit without providing any transition training.
  • Performed performance testing using TruClient in LoadRunner.
  • Assisted projects with Quality Assurance tasks and HPQC setup.

Environment: XP/7, SharePoint, SQL Server 2010, .NET 3.5 Framework, Gadwin PrintScreen v3.5, JIRA, Pivotal Tracker, HP Quality Center 10.0/11 (ALM), HP QTP 11, LoadRunner 11.51

Confidential, New Albany, OH

Sr. QA Lead

Responsibilities:

  • Created the test plan and held a test plan review for senior management’s approval
  • Provided the testing effort and testing timeline for the project
  • Led a group of 3 testers
  • Managed and allocated work for testers
  • Communicated project bottlenecks to senior management
  • Presented testing artifacts before the CAB for move-to-production approval
  • Analyzed User stories/Value Stories for QA testing effort.
  • Coordinated testing effort and testing strategies with SMEs (subject matter experts)
  • Coordinated testing windows with labs and UAT users around the world
  • Mitigated bugs and issues related to testing
  • Provided daily and weekly status reports to upper management using the HPQC dashboard

Environment: XP/7, SharePoint, SQL Server 2010, .NET 3.5 Framework, Gadwin PrintScreen v3.5, JIRA, HP Quality Center 10.0, POS (point of sale) labs

Confidential, Columbus, OH

Sr. Quality Analyst/QA Lead

Responsibilities:

  • Analyzed User stories/Value Stories for QA testing effort.
  • Developed Test Cases, Test Reports and Traceability Matrix as per user stories/ value stories.
  • Worked in various stages of Agile Methodologies and Testing Process to ensure Quality.
  • Executed SQL queries on SQL Server to verify successful data transactions in the back-end database (see the example query after this list).
  • Coordinated testing effort and testing strategies between Grange and Hertz for the project.
  • Tested request/response XML using SOAP UI to verify that the correct response XML was being generated.
  • Provided defect/testing status on a daily/weekly basis to the Project Manager and QA Manager
  • Provided QA testing effort and estimation at a team/program level to product owners and Management
  • Allocated QA workload between co-workers and teams
  • Provided cross-training and onboarding to new team members
  • Interviewed / Screened new candidates for QA Team
  • Executed mainframe commands to view successful data transactions on the mainframe.
  • Verified XML was CIECA compliant
  • Coordinated testing effort with vendor (Hertz) and in house testers
  • Provided Demo for the Hertz project to other groups to assist in test automation and to cross train
  • Assisted the BA in weekly UAT meetings for Hertz.
  • Provided time estimation for Hertz project and for new CR (Change Request)
  • Tested in house test harness built to test HP Exstream (Data Transformation Layer) process.
  • Tested the Print and DTL functionality for HP Exstream
  • Created Test Plan for HP Exstream Phase I MDL project
  • Prioritized defect/change requests/enhancement that came out of defects logged.
  • Attended Daily Scrum /project status meeting.
  • Executed regression scripts for quarterly claim workstation release to prod.
  • Created Regression framework for the EZSEQ project for end-to-end testing.
  • Created Mainframe macro to capture mainframe screen for testing.
  • Verified Mainframe mapping and screen to ensure data verification and system mapping works correctly.
  • Provided weekly commitments status to the team and QA Manager.
  • Executed Cucumber scripts for bug retests and story card testing.
  • Created/tested Policy Maker with Ruby developers and automation testers.
  • Blogged on QA Blog with testing articles and testing methodologies.
  • Lead Quality Push training and Test execution for the QA department.
  • Screened, interviewed and hired QA Analysts for the QA Manager
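
A minimal sketch of the back-end validation queries referenced in the list above; the table and column names (CLAIM_TRANSACTION, POLICY) are illustrative placeholders, not the actual application schema.

    -- Illustrative only: confirm that a transaction entered through the front end was
    -- persisted with the expected status. Table and column names are placeholders.
    SELECT t.TRANSACTION_ID, t.POLICY_NUMBER, t.STATUS, t.CREATED_DATE
    FROM CLAIM_TRANSACTION t
    JOIN POLICY p ON p.POLICY_NUMBER = t.POLICY_NUMBER
    WHERE t.TRANSACTION_ID = @TransactionId   -- ID captured from the UI during the test
      AND t.STATUS = 'COMPLETED';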

Environment: Windows 2003/XP/7, SharePoint, Extra for NetWare 6.5, SQL Server 2008, .NET 3.5 Framework, Gadwin PrintScreen v3.5, TrackRecord 6.02, TFS (Team Foundation Server) test suite, Gherkin, Cucumber, Ruby 1.9.3, gVim editor, Remote Desktop, Beyond Compare 3, TextPad, Claims Workstation, Test Management, SOAP UI, WSDL

Confidential, Columbus, OH

Sr. Quality Analyst/QA Lead for Quest

Responsibilities:

  • Analyzed Functional Requirement Specification (FRS), System Design Specification (SDS).
  • Developed Test Cases, Test Reports and Traceability Matrix as per FRS and SDS.
  • Performed functional, unit, integration, regression, black box and white box, GUI, back-end Database and Mainframe testing.
  • Worked in various stages of Software Development Life Cycle (SDLC) and Testing Process.
  • Executed SQL queries on SQL Server to verify successful data transactions in the back-end database for validation purposes.
  • Executed mainframe commands to view successful data transactions on the mainframe.
  • Verified XML Logs for troubleshooting defect on the COMM frameworks.
  • Created Regression Framework for the BOP & BAP Projects.
  • Worked on multiple change requests/enhancement for Production support and releases.
  • Created QTP Scripts for specific Testing Scenarios of the BAP Project.
  • Worked with developers to perform white-box testing for stored procedure changes.
  • Created documentation for newly hired QA team members and performed one-on-one training on defect/test case naming conventions, the mainframe and application flow.
  • Created a template for back-end testing.
  • Audited defects/test cases to make sure the rest of the team was following standard procedures.
  • Attended triage meetings with developers and the Release Manager for production releases.
  • Assisted the Manager in the hiring process by filtering and rating candidates based on their resumes.
  • Verified mainframe screens to ensure newly developed screens functioned properly according to the requirements.
  • Created/uploaded policies for batch processing testing on the mainframe.

Environment: Windows 2003/XP, MS Office Project Web Access, SharePoint, Mocha TN3270, SQL Server 2003, QC 8.2, .NET 2.0 Framework, Gadwin PrintScreen v3.5, BizXpress UI Tester (in-house test harness tool used for automation and Quote Compare), HP Quality Center 8.2, QTP 8.2

Confidential, NJ

Quality Analyst

Responsibilities:

  • Analyzed Functional Requirement Specification (FRS), System Design Specification (SDS).
  • Developed Test Plan, Test Scripts, Test Reports and Traceability Matrix as per FRS and SDS.
  • Performed sanity, functional, unit, integration, regression (top-down, bottom-up), black-box and white-box, GUI, back-end and browser compatibility testing.
  • Worked in various stages of Software Development Life Cycle (SDLC) and Testing Process.
  • Executed SQL queries on Oracle and SQL Server to verify successful data transactions in the back-end database for validation purposes.
  • Tested marketing campaigns by passing subtactic codes from the Internet browser or external website banners to the product database (see the illustrative query after this list).
  • Tested Epiphany marketing emails sent to end users.
  • Worked on multiple change requests for database enhancements & the registration process for various sites.
  • Performed interface testing using XML Spy for BRC & IVR users.
  • Tested the email hierarchy, ODS and Epiphany ETL processes for cross-product, channel and registration scenarios.
  • Worked with developers to perform white-box testing for stored procedure changes.
  • Performed Kanisa testing implementation for various new sites including Boniva.
  • Created Defect reports using Test Director 8.0
  • Tested web services request/response using Soap UI via XML
  • Prepared testing summaries and status reports for the team lead and QA Manager.
  • Created and updated Standard Quality Assurance Documentation for QA department.
  • Attended conference meeting with external vendor to explain standard Pre-QA process.
  • Worked with end users to perform UAT testing for the Boniva project.
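
An illustrative query for the subtactic-code check mentioned above; REGISTRATION and CAMPAIGN_TRACKING are placeholder names, not the actual product database schema.

    -- Illustrative only: verify that the subtactic code passed on the banner URL was
    -- recorded against the test registration. Table and column names are placeholders.
    SELECT r.REGISTRATION_ID, r.EMAIL, c.SUBTACTIC_CODE, c.SOURCE_CHANNEL
    FROM REGISTRATION r
    JOIN CAMPAIGN_TRACKING c ON c.REGISTRATION_ID = r.REGISTRATION_ID
    WHERE r.EMAIL = 'qa.test.user@example.com'   -- test account used for the campaign run
      AND c.SUBTACTIC_CODE = 'ST12345';          -- code appended to the banner link under test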

Environment: Windows XP, SQL Query Analyzer, SQL Enterprise Manager, Oracle, SQL Server 2000, ODS, IIS 5.0, Kanisa, HTML, JavaScript, Flash, CSS, XML Spy, SOAPSonar, UML, Documentum eRoom, Watchfire (web automation tool), Test Director 8.0, Ajax, SOAP UI, SOAP, HP Quality Center 8.2

Confidential, NJ

Quality Engineer/ Quality Consultant

Responsibilities:

  • Analyzed Functional Requirement Specification (FRS), System Design Specification (SDS).
  • Developed Test Plan, Test Scripts, Test Reports and Traceability Matrix as per FRS and SDS.
  • Performed sanity, functional, unit, integration, regression (top-down, bottom-up), black-box and white-box, GUI, back-end and browser compatibility testing.
  • Worked in various stages of the Software Development Life Cycle (SDLC) and Testing Process.
  • Executed SQL queries on Oracle and SQL Server to verify successful data transactions in the back-end database for validation purposes.
  • Checked database connectivity using ODBC and tnsnames.ora.
  • Applied 21 CFR Part 11, Template and SOP guidelines and prepared IQ, OQ and PQ.
  • Performed Cross Browser testing and checked the navigation path and broken links using QTP 6.5.
  • Verified web objects such as frames, tables, cells, links, images and text using QTP 6.5.
  • Created defects and test reports using Test Director 7.6.
  • Compared results of previous version with the latest execution to investigate the performance of the application.
  • Developed scripts for Web (HTTP/HTML), Database virtual users using Load Runner and analyzed the Activity/Performance graphs and reports using Load Runner Analysis.
  • Ran the Vuser script in multi user mode by creating and running scenarios in the Controller.
  • Implemented Rendezvous points, parameters and Correlation in the Vuser Script.
  • Supported the pilot program by analyzing problems reported by UAT.
  • Interacted with developers in resolving problem tickets.
  • Communicated with the Project Lead/Manager as well as with the other members of development team and provided inputs and other changes required in the application development.
  • Prepared testing summaries and status reports for the team lead and QA Manager.
  • Published User Guide on the company intranet site tbirdsupport.bms.com for end user.
  • Performed functional testing of the application while documenting failed cases.

Environment: UNIX (Solaris), Windows 2000/XP, Thunderbird M6 & M7, Oracle 8i, WebLogic 6.1, IIS 5.0, Java, JDBC, HTML, XML, XSL, UML, ConsoleOne, Documentum, QTP 6.5, HP Quality Center/Test Director 7.6, LoadRunner 7.6

Confidential, NJ

System Tester Analyst

Responsibilities:

  • Analyzed Functional Requirement Specification (FRS), System Design Specification (SDS).
  • Developed Test Plans, Test Scripts and Test Reports as per SOPs and templates.
  • Performed sanity, functional, unit, integration, regression (top-down, bottom-up), black-box and white-box, GUI and back-end testing.
  • Installed and set up the load test environment from scratch with LoadRunner 7.0 and configured the different LoadRunner monitoring tools (UNIX, Oracle monitoring).
  • Performed performance testing (LoadRunner) with large volumes of download data coming from the mainframe to measure how long the application took to populate the data in the database.
  • Executed shell scripts and used crontab to verify the scheduling of exports and imports of data from the Oracle database.
  • Used the vi editor and UNIX commands such as ftp, tail, ps, grep, cat, more, cp, cut, mv, rm, ls, etc.
  • Executed shell scripts using Oracle SQL and PL/SQL to convert data from the old system to the new system and to generate test data.
  • Used SQL queries to extract data from different database tables and verify it against the front end of the application.
  • Wrote complex SQL queries to ensure data verification and data quality (see the illustrative reconciliation queries after this list).
  • Prepared the test plan tree to execute manual and automated test cases using Test Director 7.0.
  • Wrote test cases, executed test cases, and reported defects using Test Director 7.0.
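
Illustrative reconciliation queries of the kind referenced above for verifying a data conversion; LEGACY_ACCOUNT and NEW_ACCOUNT are placeholder table names, not the actual schemas.

    -- Illustrative only: reconcile row counts between the legacy and converted tables.
    SELECT 'LEGACY' AS SOURCE_SYSTEM, COUNT(*) AS ROW_CNT FROM LEGACY_ACCOUNT
    UNION ALL
    SELECT 'CONVERTED' AS SOURCE_SYSTEM, COUNT(*) AS ROW_CNT FROM NEW_ACCOUNT;

    -- Spot-check that key fields survived the conversion unchanged.
    SELECT l.ACCOUNT_ID, l.BALANCE AS LEGACY_BALANCE, n.BALANCE AS NEW_BALANCE
    FROM LEGACY_ACCOUNT l
    JOIN NEW_ACCOUNT n ON n.ACCOUNT_ID = l.ACCOUNT_ID
    WHERE l.BALANCE <> n.BALANCE;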

Environment: UNIX, Windows 2000, Linux, Oracle 8i, Java and JDBC, RMI, EJB, C/C++, Internet Explorer, Telnet, LoadRunner 7.0, Test Director 7.0, FIX protocol, DB2

Confidential, NJ

Software Testing Analyst/Quality Assurance Analyst

Responsibilities:

  • Analyzed Functional Requirement Specification (FRS), System Design Specification (SDS).
  • Wrote test plans and test cases based on functionality and business requirements.
  • Involved in troubleshooting the end-user module.
  • Coded and executed functional tests using WinRunner.
  • Created test scripts using WinRunner tool and performed interface, functionality, and regression testing on new changes to the software.
  • Provided testing of navigation and operating system compatibility. The major focus was to test usability, navigation, and Data content results.
  • Developed scripts for Data-Driven tests using TSL and used correlation for supplying unique data to the server in WinRunner 6.0.
  • Wrote TSL scripts for database and data-driven testing for regression testing of business workflows.
  • Identified the custom objects in the AUT and mapped them to the standard objects.
  • Prepared the test plan tree to execute manual and automated test cases using Test Director 7.0.
  • Wrote test cases, executed test cases, and reported defects using Test Director 7.0.
  • Created and implemented test plans for stress and load testing for the application.
  • Determined the performance of critical transactions by analyzing reports/graphs generated in LoadRunner.
  • Communicated with developers to resolve various issues.

Environment: Windows 2000, SQL Server 7.0, Visual Basic, ODBC, WinRunner 6.0, HP Quality Center/Test Director 7.0

Confidential, NJ

Web Developer/QA Consultant

Responsibilities:

  • Responsible for maintaining, designing and marketing the website.
  • Provided graphics work for the site using HTML/DHTML/JavaScript.
  • Updated and maintained the website’s database using SQL.
  • Prepared test plans, test scenarios and test cases based on business and functional requirements.
  • Verified web objects and checked differences in the object properties of each page.
  • Created FRS & BRS for the website.
  • Created test cases and test script for cross browser support.
  • Performed FTP, worked on setting up and maintaining test environments.
  • Involved in Functional, UI, Navigational, Security and Performance testing.
  • Performed manual & automated testing based on the developed test plans and test cases and documented the results.
  • Performed cross-browser testing and checked the navigation path and broken links using WinRunner 6.0.
  • Verified web objects such as frames, tables, cells, links, images and text using WinRunner 6.0.
  • Tested web billing, shopping cart, and shipment methods.
  • Documented defects using ClearQuest.
  • Stressed the web server for performance testing using the LoadRunner Virtual User Generator.

Environment: NES (Netscape Enterprise Server), ColdFusion Server 5.0, WebSphere 4.0, JRun 3.0, SQL Server, MS Access, WinRunner 6.0, LoadRunner, Internet Explorer, Netscape Navigator.

Confidential, NJ

QA Tester/ Trainer

Responsibilities:

  • Manually tested client laptops for configuration issues with network connections, Microsoft Outlook, and other software on a Windows 2000 platform.
  • Followed test plans systematically and pointed out errors to the configuration team for modification.
  • Performed regression testing on reconfigured laptops.
  • Configured and synchronized Palm PCs for use with each laptop.
  • Trained new employees on quality control practices.
  • Thorough understanding of the application/document Change Request, Review and Approval process.

Environment: Remedy, Windows 2000, MS Outlook, MS Office, MS Project, Adobe Reader, Rkaid, Internet Explorer
