Cassiel
Posted July 17, 2020

Long post ahead, arm yourselves with patience.

Since the Firestorm expansion is slowly approaching release and testing will be needed, I figured I'd share some QA knowledge that might help with future testing phases. Do note that this is not a by-the-book rundown of professional QA practices, but rather a compilation of the "essentials" from personal experience as an ex-tester. It's highly likely that not all of these practices will be worth adopting, due to the size of our dev team and the limited time left for testing (and, obviously, the fact that this is a passion project, not a corporate-backed commercial product). Some of the terminology used will differ from what I worked with, and some aspects will be left out entirely (so as not to risk violating NDA terms); these are at the discretion of whoever creates the QA infrastructure (i.e. Jira/Confluence pages) and decides how the roles would function. Who knows, if this model is adopted and it turns out to work well with Firestorm, it could be the start of the QA framework used for future RenX content implementation, and even other projects.

A few bullet points:

• Standardized bug report & handling environment (Jira)
• Specialization
• Documentation (Confluence)
• Streamlined workflow (splitting of responsibilities / areas of expertise and of testing)

People involved with quality assurance would be specialized as follows:

• Tester (in charge of the actual, hands-on testing of the end-user experience; should be a role mostly reserved for experienced players)
• Analyst (in charge of creating and updating documentation, as well as forwarding submissions, maintaining contact with devs and following along with the development process)
• Lead/Manager (normally separate roles; in charge of recruiting testers, managing roles and the testing process itself, as well as maintaining and updating the QA infrastructure as needed)
• Developer (self-explanatory)

*Do keep in mind these are not meant to emulate the actual corporate positions, but are rather simplified and likely flexible roles, to be changed at the discretion of community leaders and in accordance with the availability of volunteers. I'd say the analyst role will likely be taken on by the devs, at least until more people volunteer.

A standardized bug report environment ensures consistency throughout the entire testing process, while also keeping track of the project's evolution as it is developed. The usual flow of the bug reporting/fixing process would go as follows (a rough sketch of the status transitions follows the list):

1. Submission as a standard form by the tester, followed by assigning the now <open> issue to the analyst in charge of the issue's specific area of occurrence (like PvP, PvE, MTX, writing, map design, etc.).

2. Reviewing and (if necessary) retouching of the submitted issue by the analyst assignee, then forwarding the now <open and confirmed> issue to the developer (or development team) in charge of the respective field. This is also the stage at which the submitted issue could be given a priority tag, in case the number of bug submissions becomes too overwhelming for the development team to handle as they come. Should the report not conform to the agreed-upon submission form, be invalid from a documentation standpoint or be a duplicate of another submission, the issue will either be closed or assigned back to the tester.

3. Once in the hands of the development team, the issue can either be fixed for the next build or be postponed. Usually, issues that hinder the ability to test other aspects of the game or cause crashes/other technical failures are among the first to be fixed, while minor issues like low-impact edge cases or small graphical artifacts are left on the back burner. Once an issue is believed to have been fixed, it will be assigned to the general QA inbox as <open, regression pending>.

4. The analysts will now choose which of the "fixed" issues to assign back to their respective submitters, based on the build on which the current testing phase is taking place and the "fixed" issues available for that build.

5. The final phase consists of regressing the issue, i.e. attempting to recreate the bug according to the information and steps provided in the report. Should the issue persist, it will be forwarded back to the development team's inbox as <open, not fixed>. If the issue has indeed been fixed, it will be closed by the tester as <closed, fixed>.
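To make the lifecycle above concrete, here is a minimal sketch of the status transitions as described in the list (Python; the enum and function names are mine, and in practice Jira's own workflow configuration would track this for you):

```python
# Hypothetical sketch of the issue lifecycle described above -- the status
# names mirror the post; everything else is invented for illustration.
from enum import Enum, auto

class Status(Enum):
    OPEN = auto()                     # submitted by a tester
    OPEN_CONFIRMED = auto()           # reviewed by an analyst, forwarded to devs
    OPEN_REGRESSION_PENDING = auto()  # devs believe it is fixed
    OPEN_NOT_FIXED = auto()           # regression failed, back to the dev inbox
    CLOSED_FIXED = auto()             # regression passed
    CLOSED_INVALID = auto()           # malformed, invalid or duplicate report

# Allowed transitions, keyed by current status. A rejected report can also
# stay OPEN and simply be assigned back to the tester for rework.
TRANSITIONS = {
    Status.OPEN: {Status.OPEN_CONFIRMED, Status.CLOSED_INVALID},
    Status.OPEN_CONFIRMED: {Status.OPEN_REGRESSION_PENDING},
    Status.OPEN_REGRESSION_PENDING: {Status.CLOSED_FIXED, Status.OPEN_NOT_FIXED},
    Status.OPEN_NOT_FIXED: {Status.OPEN_REGRESSION_PENDING},
}

def advance(current: Status, new: Status) -> Status:
    """Move an issue to a new status, rejecting transitions the workflow forbids."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current.name} -> {new.name}")
    return new
```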
In the case of a large-scale test, people who have not specifically volunteered as testers could contribute by compiling and sending their findings to someone within the QA team, so that the issues can be formally submitted.

As for documentation, it generally serves two purposes: to make testing possible for people who've never played the game (which I suppose will likely not be the case here) and to enable a grey-box approach to testing (simulating the end-user experience while also knowing exactly what SHOULD happen). Forgoing documentation is entirely possible when testing aspects like art or, in general, aspects of the game where anyone could notice if something was off.

This is what I propose for now; any corrections or suggestions would be appreciated. I have also attached a detailed bug submission template to this post.

Submission format.docx
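For readers who can't open the attachment, a standardized report of the kind step 1 describes usually carries fields along these lines. To be clear, the attached "Submission format.docx" is the authoritative template; the particular fields below are only my guess at its shape:

```python
# Hypothetical shape of a standardized bug report -- the attached
# "Submission format.docx" is authoritative; these fields are a guess.
from dataclasses import dataclass, field

@dataclass
class BugReport:
    title: str                     # one-line summary of the defect
    area: str                      # e.g. "PvP", "PvE", "Writing", "Map design"
    build: str                     # build/version the bug was observed on
    steps_to_reproduce: list[str]  # numbered steps, one action per entry
    expected_result: str           # what SHOULD happen per the documentation
    actual_result: str             # what actually happened
    attachments: list[str] = field(default_factory=list)  # screenshots, logs
```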
limsup
Posted July 28, 2020

I very much like what I've read!
SonnyX (Totem Arts Staff)
Posted July 28, 2020

Interesting read; I definitely like most of the suggested QA procedures. I have been thinking about making a personalized QA framework for Totem Arts recently, as Jira and the like do not actually provide the management capabilities required for a volunteer-based team such as ours. Currently I'm working out what such a QA framework should do and how it should be implemented, for example:

- Automated distribution of testing tasks among the testers, or a "contract" system for accepting the contract of testing a task.
- Requiring certain testing tasks based on which files were changed in version control since the last major release (see the sketch below).
- Full integration into our CI/CD system.
- More?
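A minimal sketch of the second idea, assuming a hand-maintained mapping from version-control paths to test tasks and a git repository (the path patterns, task names and release tag below are invented for illustration):

```python
# Hypothetical sketch: derive required testing tasks from the files changed
# since the last major release. Patterns and task names are invented.
import fnmatch
import subprocess

# Hand-maintained map from path patterns to the test tasks they trigger.
TASK_RULES = {
    "Content/Maps/*": "Full playthrough of the affected map(s)",
    "Source/Weapons/*": "Weapon balance and hit-registration pass",
    "Content/UI/*": "Menu and HUD smoke test",
}

def changed_files(since_tag: str) -> list[str]:
    """List files changed since the given release tag (git example)."""
    out = subprocess.run(
        ["git", "diff", "--name-only", f"{since_tag}..HEAD"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

def required_tasks(since_tag: str) -> set[str]:
    """Collect the test tasks implied by changes since the last release."""
    tasks = set()
    for path in changed_files(since_tag):
        for pattern, task in TASK_RULES.items():
            if fnmatch.fnmatch(path, pattern):
                tasks.add(task)
    return tasks

# Example usage: print(required_tasks("release-5.4"))
```

The CI/CD integration point would presumably call something like `required_tasks` on each build and open or assign the resulting testing tasks automatically.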