On occasion, we may need to test applications to make sure they are functional. In July 2016, we tested the new Confluence installation that contains this wiki. This guide contains our work plan for that testing phase, and it can serve as a model for future application tests.
July 2016 Confluence User Testing
Rebecca Rosenblatt, Veronica Kuhn, Ethan Feuer, Michael Chen
Scope
We will test editing, navigation, and administrative tasks Confluence users might want to accomplish. We will also check Confluence for visual errors in the “look and feel.”
Timeline
Based on the initial testing we have already done, we’ve set Friday, July 15 as a completion goal for the preliminary Confluence function list (more on that below). If we finish sooner than Friday, we'll go through additional functions we find during testing.
Testing Strategy
We’ve created a list of common Confluence tasks and functions on a Google Docs spreadsheet, based in part on the Confluence guide included in the Knowledge Base. Testing of each task will occur on a first-come, first-served basis. Testing ends when there are no more open tasks on the spreadsheet.
Workflow
- The tester writes their name next to the task they will test. This prevents testers from working on the same task simultaneously.
- The tester attempts to accomplish the task. They verify:
- Whether they can accomplish the task within various scenarios.
- Whether the pages required to accomplish the task “look” correct visually.
- If the tester finds an issue:
- They attempt to replicate the issue in a different browser.
- They create a JIRA issue in the ATG Issues Confluence project. This should include browser information, a step-by-step walkthrough to help recreate the issue, and screen captures as needed.
- They check off “Issue” on the spreadsheet, then add a link to the JIRA issue.
- Once the issue has been resolved, they check off “Resolved” on the spreadsheet.
- If the task has no issues:
- They check off “No Issue” in the spreadsheet.