Testing

Testing Strategy

To ensure our game provides a seamless experience to users, we implemented a dedicated testing strategy. Given the nature of our project, most testing relied on manual testing and user-acceptance testing.

Manual Testing

For quality assurance, we implemented a systematic approach to manual testing across all games in our Unity project. Each component was evaluated both in isolation where possible and together with other components, ensuring both individual and system-wide reliability.

We initially validated core functionality using standard keyboard controls without MotionInput integration. This allowed us to identify fundamental issues with the game before layering on motion controls.

Exploration

For the exploration scene, our testing protocol covered several critical components:

  • ExplorationPlayerController: Responsible for player movement within the exploration scene.
  • AudioManager: Responsible for handling the playing of background music and footsteps.
  • Exploration State: Responsible for storing the last position of the player before transitioning to a mini-game.
  • NPCInteraction: Responsible for handling the interaction with NPCs.
  • SeasonalTreeManager: Responsible for handling season changes and updating the trees and falling leaves to match the season.
Case Study: ExplorationPlayerController

Testing the player controls and movement covered several critical areas:

  • Player Movement Testing
  • Footstep SFX Testing
Player Movement Testing

To ensure that player movement works as intended, we verified forward walking and camera panning mechanics through repeated input testing and visual verification.

Footstep SFX Testing

Once the player movement was successfully tested, we had to ensure that the footstep sounds played correctly. We programmed different sounds to play depending on which terrain texture is stepped on. To test this, we went through several test cases:

  • Stay on one specific terrain texture and verify that the correct sound is played.
  • Walk across various terrain textures to ensure that the correct sound is played even when transitioning between them.
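
The logic these cases exercise is essentially a texture-to-clip lookup that must switch clips at every terrain transition. A minimal Python sketch of that mapping (the texture and clip names are illustrative assumptions, not the actual Unity assets):

```python
# Illustrative mapping of terrain textures to footstep sound clips.
# Names are assumptions for this sketch, not the real project assets.
FOOTSTEP_SOUNDS = {
    "grass": "footstep_grass.wav",
    "gravel": "footstep_gravel.wav",
    "stone": "footstep_stone.wav",
}

def footstep_clip(texture: str) -> str:
    # Fall back to a default clip for unmapped textures.
    return FOOTSTEP_SOUNDS.get(texture, "footstep_default.wav")

# Walking across textures: the clip must change at each transition.
path = ["grass", "grass", "gravel", "stone"]
clips = [footstep_clip(t) for t in path]
```

The second test case above corresponds to checking that `clips` changes exactly where the terrain texture changes.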

Wipeout

For the wipeout game, our testing protocol covered several critical components:

  • WipeoutPlayerController: Responsible for player movement and collision detection.
  • GameManager: Responsible for overall game flow (starting and ending the game), and providing level settings for the SpinningObstacle component.
  • PlayerGameManager: Responsible for player-specific game logic: controls when to update the level and the score for each player.
  • SpinningObstacle: Responsible for enabling and disabling beams based on the level settings from GameManager, changing the rotation speed, and enabling/disabling the obstacle.
  • CanvasManager: Responsible for all UI/Canvas elements, including updating the score and level number and showing the countdown.
  • CountdownController: Responsible for the countdown timer before the game starts.
  • EventManager: Responsible for game-wide updates between all components.
Case Study: WipeoutPlayerController

Testing the player controls and movement covered several critical areas:

  • Player Movement Testing
  • Obstacle Collision Testing
Player Movement Testing

To ensure that player movement works as intended, we verified the jump and crouch mechanics through repeated input testing and visual verification. As a next level of testing, we ensured that transitions between movement states are smooth.

Obstacle Collision Testing

Once the player movement was successfully tested and the spinning obstacle component was ready, we tested the collision detection. To verify it worked correctly, we went through several test cases:

  • Each spinning obstacle has 8 beams attached to it (4 lower and 4 upper beams). We tested that collision with each individual beam works by enabling each one individually and checking whether it knocks the player off.
  • We also tested that there were no false collisions. For example, a collision should not occur when a player is crouching and an upper beam passes over the player's head.
    • This approach was similar to testing positive collisions: we verified that we could successfully avoid each of the 4 lower and upper beams by jumping and crouching.
  • We also tested that collision with the beams works properly when the player model is mid-air after jumping.
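
The avoidance rules these cases exercise can be summarised in a small sketch (illustrative Python capturing the test logic, not the Unity collider code):

```python
def is_knocked_off(beam: str, player_state: str) -> bool:
    """Illustrative collision rule for the Wipeout tests: lower beams are
    avoided by jumping, upper beams by crouching; any other combination
    knocks the player off. The state names are assumptions for this sketch."""
    if beam == "lower":
        return player_state != "jumping"
    if beam == "upper":
        return player_state != "crouching"
    raise ValueError(f"unknown beam type: {beam}")

# Enumerating beam/state combinations mirrors the positive and negative
# collision test cases described above.
cases = [(b, s, is_knocked_off(b, s))
         for b in ("lower", "upper")
         for s in ("standing", "jumping", "crouching")]
```
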
Case Study: CountdownController

To test that the CountdownController worked correctly, we had to ensure that the countdown text appears on the screen, and that this occurs on three occasions:

  • Upon initial game start
  • Upon levelling up and resuming the game
  • Upon getting knocked off and resuming the game
Case Study: CanvasManager

To test that the CanvasManager worked flawlessly, we had to ensure that specific UI elements are rendered according to various events:

  • Upon a player getting knocked off, a message should appear in the middle of their screen indicating that they have been knocked off and are being reset to level 1.
  • Upon levelling up, a message should indicate that they have levelled up, along with the next level number.
  • Upon the game ending, the player should be prompted with their final score.
  • While the game and obstacle are running, we had to ensure that the score increments.

To test these, we went through each scenario and ensured that the right message was shown. To test the score, we observed whether it incremented faster as the level increased.
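The expected scoring behavior (faster accrual at higher levels) can be sketched as follows; the linear multiplier is an assumption for illustration, not the tuned game value:

```python
def score_per_second(level: int, base_rate: float = 1.0) -> float:
    """Illustrative scoring rate: score accrues faster at higher levels,
    mirroring the faster obstacle rotation. The linear relationship and
    base_rate value are assumptions for this sketch."""
    return base_rate * level

# The manual test above amounts to checking this monotonic property:
rates = [score_per_second(level) for level in range(1, 5)]
```
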

Golf

For the golf game, our testing protocol covered several critical components:

  • GolfManager: Responsible for overall game flow and scoring.
  • BallController: Handles ball physics and movement.
  • ClubController: Manages club swing mechanics and impact force.
  • CourseLayout: Controls hole design and obstacle placement.
  • GolfUI: Handles all scoring and game state displays.
Case Study: Golf Ball Physics

Testing the golf ball physics was crucial to ensure realistic gameplay:

  • Ball Rolling and Friction
  • Collision Response
  • Putting Mechanics
Ball Rolling and Friction

We conducted extensive testing of the ball physics to ensure realistic behavior on different surfaces:

  • Tested ball roll distance on flat green surfaces
  • Verified decreased roll distance on rough terrain
  • Confirmed accurate ball behavior on slopes of varying degrees
  • Validated proper stopping behavior based on surface friction
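
The stopping behavior we validated follows standard rolling-friction physics, where roll distance is d = v² / (2µg). A sketch of this relationship (the surface friction coefficients are illustrative assumptions, not values from the project):

```python
G = 9.81  # gravitational acceleration, m/s^2

def roll_distance(speed: float, friction_coeff: float) -> float:
    """Distance a ball rolls before stopping under constant rolling
    friction: d = v^2 / (2 * mu * g)."""
    return speed ** 2 / (2 * friction_coeff * G)

# Illustrative coefficients: rougher surfaces stop the ball sooner.
SURFACE_FRICTION = {"green": 0.065, "fairway": 0.10, "rough": 0.25}
```

The second test case above (decreased roll on rough terrain) corresponds to the higher friction coefficient producing a shorter distance for the same initial speed.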
Club Impact Testing

The force applied to the ball by different club types was carefully tested:

  • Verified that swing power correlates properly with distance
  • Tested accuracy of direction based on club face angle
  • Confirmed that partial swings result in proportionally less distance
  • Validated that off-center hits produce appropriate ball spin
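
The power and direction relationships we tested can be sketched together (an illustration only; the function name, maximum distance, and the simple sine/cosine model are assumptions, not the actual club physics):

```python
import math

def shot_vector(power: float, face_angle_deg: float,
                max_distance: float = 200.0) -> tuple[float, float]:
    """Sketch of the tested relationships: carry distance scales with swing
    power in [0, 1], and direction follows the club face angle. Returns a
    (lateral, forward) displacement in metres."""
    p = max(0.0, min(1.0, power))           # clamp partial/over swings
    d = max_distance * p                    # proportional distance
    rad = math.radians(face_angle_deg)
    return (d * math.sin(rad), d * math.cos(rad))
```

A square club face (0°) at full power should travel straight for the full distance, and a half swing should travel half as far.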

Racing

For the racing game, our testing protocol covered several critical components:

  • RacingGameManager: Responsible for controlling scene switching and storing player scores.
  • RacingUIManager: Manages all UI elements, including updating the coin count, timer, and speedometer.
  • CarController: Handles the acceleration, braking, and steering of the car.
  • CarCheckpoint: Stores passed checkpoints and is responsible for repositioning the car when it gets stuck.
Case Study: Checkpoint Registration

Once a car passes through a checkpoint, that checkpoint is registered and used for car recovery. To ensure proper checkpoint registration, multiple test cases were conducted:

  • Driving from the starting point to the finish line
    • Verified that each checkpoint was registered along the way by checking logs
  • Driving from the finish line to the starting point
    • Ensured that only the checkpoint closest to the finish line was registered, confirming that checkpoints cannot be recorded in reverse order.
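
The registration rule these two cases verify can be sketched as follows (illustrative Python capturing the behaviour under test, not the actual CarCheckpoint script):

```python
class CheckpointTracker:
    """Sketch of the registration rule: a checkpoint is recorded only if it
    is further along the track than the last one registered, so driving
    backwards cannot roll the recovery point back."""

    def __init__(self) -> None:
        self.last_registered = -1  # no checkpoint passed yet

    def pass_checkpoint(self, index: int) -> bool:
        """Returns True if the checkpoint was registered."""
        if index > self.last_registered:
            self.last_registered = index
            return True
        return False
```

Driving forward registers every checkpoint in order; driving backwards from the finish registers only the first checkpoint passed (the one closest to the finish line), matching the two test cases above.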
Case Study 2: Car Recovery

The car recovery system was tested to ensure proper functionality when the car gets stuck. Several test cases were used to validate this feature:

  • Hitting a barrier without braking in auto-driving mode: The car was repositioned to the previous checkpoint.
  • Braking in the middle of the racetrack: No recovery action was taken; the car remained in the same position.
  • Braking after hitting a barrier: No recovery action was taken; the car remained in the same position.
  • Car not accelerating in manual driving mode: No recovery action was taken; the car remained in the same position.
  • Car accelerating while hitting a barrier in manual driving mode: The car was repositioned to the previous checkpoint.

Through these tests, we confirmed that the checkpoint and recovery system function as expected.
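The decision rule implied by these five cases can be sketched as a single predicate (an illustration of the tested behaviour, not the actual implementation; parameter names are assumptions):

```python
def should_recover(hit_barrier: bool, braking: bool,
                   mode: str, accelerating: bool = False) -> bool:
    """Sketch of the recovery rule implied by the test cases: reposition the
    car to the previous checkpoint only when it hits a barrier while trying
    to move forward (auto mode, or manual mode with acceleration) and is
    not braking."""
    driving_forward = (mode == "auto") or accelerating
    return hit_barrier and driving_forward and not braking
```

Each of the five test cases above maps to one evaluation of this predicate.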

Case Study 3: Car Control

Testing the car's movement covered several critical areas:

  • Acceleration and Braking
  • Steering
Acceleration and Braking

The racing game includes two driving modes: auto acceleration and manual acceleration. To verify that acceleration and braking function as intended, multiple tests were conducted.

Acceleration Mode | Inputs          | Result                                             | Expected?
------------------|-----------------|----------------------------------------------------|----------
Auto              | N/A             | Car accelerates                                    | Yes
Auto              | Boost           | Car accelerates beyond normal speed limit          | Yes
Auto              | Brake           | Car moving: slows down; car not moving: stationary | Yes
Auto              | Boost and Brake | Car moving: slows down; car not moving: stationary | Yes
Manual            | N/A             | Car moving: slows down; car not moving: stationary | Yes
Manual            | Acceleration    | Car accelerates                                    | Yes
Manual            | Brake           | Car moving: slows down; car not moving: stationary | Yes
Manual            | Boost and Brake | Car moving: slows down; car not moving: stationary | Yes

Steering

The maximum steering angle of the car decreases as the speed of the car increases, following a linear relationship.

To verify the steering behavior, we recorded the maximum steering angle at different speeds and confirmed that it decreases linearly as speed increases.
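
The linear relationship can be sketched as a simple interpolation between a low-speed and a top-speed steering limit (all constants here are illustrative, not the tuned game values):

```python
def max_steering_angle(speed: float, max_speed: float = 120.0,
                       angle_low: float = 35.0,
                       angle_high: float = 10.0) -> float:
    """Maximum steering angle in degrees, decreasing linearly from
    angle_low at standstill to angle_high at top speed. All constants
    are assumptions for this sketch."""
    t = max(0.0, min(1.0, speed / max_speed))  # clamp to [0, 1]
    return angle_low + (angle_high - angle_low) * t
```

Recording the angle at several speeds, as in the test above, should trace out this decreasing straight line.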

Potion Brewing

For the potion brewing game, our testing protocol covered several critical components:

  • PotionGameManager: Manages player turns and resets the game.
  • PotionSurface: Handles the player UI and tracks the potion recipe.
  • GameData: A static class used for storing game data between scenes.
  • SecondSceneManager: Manages the game flow in the second scene.
  • PotionPlayerController: Handles player movement and camera control.
  • DragAndDrop: Manages picking up and dropping ingredients.
Case Study: Player Movement and Camera Control

Testing the player movement and camera system was critical for ensuring smooth gameplay:

Camera Tracking

We tested the camera system, which tracks the player's nose position and maps it to arrow-key input:

  • Verified that the camera accurately follows the player's nose position for precise targeting.
  • Confirmed that camera rotation speed increases the longer the player moves in a particular direction, allowing for both precise ingredient selection and fast movement.
  • Tested edge cases such as rapid direction changes and extreme viewing angles.
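
The rotation speed-up we verified can be sketched as a capped linear ramp over the time a direction is held (the base, ramp, and cap values are assumptions for illustration):

```python
def rotation_speed(held_time: float, base: float = 20.0,
                   ramp: float = 15.0, cap: float = 80.0) -> float:
    """Sketch of the tested behaviour: camera rotation speed (deg/s) grows
    the longer the player keeps moving in one direction, capped so fast
    panning stays controllable. Changing direction resets held_time to
    zero. All constants are assumptions, not the tuned values."""
    return min(cap, base + ramp * held_time)
```

This gives slow, precise movement for short inputs (ingredient selection) and fast panning for sustained ones, which is the behaviour the second test case above confirms.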
Case Study: Ingredient Interaction

Testing the ingredient interaction mechanics covered several critical areas:

Drag and Drop System

The game uses Unity's raycasting system to handle ingredient interaction. Our testing verified that:

  • When the crosshair overlaps an ingredient, its material correctly changes to highlight it.
  • The highlighted ingredient is properly stored as the selected ingredient.
  • When the player clicks, the selected ingredient is picked up and positioned in front of the player.
  • The carried ingredient maintains its position relative to the player as they walk around the environment.
  • Clicking again correctly drops the ingredient, with a vertical dropping animation.
  • Ingredients can be dropped anywhere in the room.
  • When dropped into the cauldron, ingredients properly disappear and trigger the cauldron's color change effect.
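
The click-to-toggle pick-up/drop behaviour these cases cover can be sketched as a small state machine (state logic only; the real version uses Unity raycasts and transforms, and the class name is an assumption):

```python
class IngredientCarrier:
    """Sketch of the tested pick-up/drop toggle: a click picks up the
    currently highlighted ingredient, and a second click drops whatever
    is being carried."""

    def __init__(self) -> None:
        self.carried = None

    def click(self, highlighted):
        if self.carried is None:
            self.carried = highlighted          # pick up what is highlighted
            return ("picked_up", highlighted)
        dropped, self.carried = self.carried, None
        return ("dropped", dropped)             # drop the carried ingredient
```
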
Case Study: Recipe Validation

A critical component of the potion brewing game is the ability to correctly validate player-created recipes:

Recipe Recognition

We extensively tested the recipe validation system which handles 12 unique recipes:

  • Verified that once four ingredients are placed into the cauldron, the recipe-checking process begins.
  • Confirmed that ingredients are properly sorted in order of addition.
  • Tested that the system correctly iterates through the list of valid recipes and compares them to the player's combination.
  • Ensured that if a match is found, the correct message is retrieved from the hash table.
  • Verified that each of the 12 recipes (each requiring 4 specific ingredients) is correctly recognized regardless of the order ingredients are added.
  • Confirmed that incorrect recipes properly inform the player and reset the game, returning ingredients to their original positions.
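
The order-independent matching we verified can be sketched as follows (a simplified illustration with made-up ingredient and potion names; the real game defines 12 four-ingredient recipes):

```python
# Illustrative recipe table: sorted ingredient tuples -> result message.
RECIPES = {
    ("bat_wing", "frog_leg", "moon_dust", "nightshade"): "Potion of Shadows",
    ("dew", "honey", "rose_petal", "sunflower"): "Potion of Light",
}

def check_recipe(cauldron_contents):
    """Sort the four added ingredients so recognition is order-independent,
    then look the combination up in the recipe table. Returns the result
    message, or None for an invalid recipe (which triggers a reset).
    A sketch of the validation flow, not the actual implementation."""
    if len(cauldron_contents) != 4:
        return None  # checking only starts once four ingredients are in
    key = tuple(sorted(cauldron_contents))
    return RECIPES.get(key)
```

Sorting before lookup is what makes the same four ingredients match regardless of the order they were added, which is the property the tests above confirm.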
Visual and Audio Feedback

Proper feedback is essential for the brewing experience. We tested various feedback mechanisms:

  • Verified that the cauldron liquid changes color appropriately when ingredients are added.
  • Confirmed that glowing particles and bubbles appear from the cauldron during the brewing process.
  • Tested that background music plays correctly in both the first and second scenes, with appropriate transitions.
  • Ensured that audio effects play when ingredients are dropped into the cauldron.
Case Study: Scene Transition and Result Display

Testing the transition between the Making Scene and the Result Scene was crucial for game flow:

  • Verified that the GameData static class properly stores information about which recipe was completed.
  • Confirmed that the transition to the second scene occurs seamlessly after a successful recipe.
  • Tested that the correct unique animal spawns in the result scene based on the completed recipe.
  • Ensured that all 12 different results display correctly with their corresponding visual effects.

User-Acceptance Testing

User Acceptance Testing (UAT) serves as our primary method of integration testing, ensuring that all game components work seamlessly together. While traditional integration testing focuses on technical connections between modules, UAT elevates this concept by placing actual end-users at the center of the verification process. This approach allows us to simultaneously validate both the technical integration of various components and games, and their practical functionality from a user perspective.

UAT Process

Test Plan Creation

We developed comprehensive test plans covering all game scenarios and user interactions. These plans outlined specific test cases for different user profiles, including children with varying motor skills and cognitive abilities.

Test Environment Setup

Test environments were prepared with both keyboard controls and MotionInput camera setups to evaluate both control methods under real-world conditions.

User Testing Sessions

Testing sessions were conducted with children from different age groups (6-12 years old), including children with autism spectrum conditions, under the supervision of educators from the National Autistic Society.

Feedback Collection

We collected detailed feedback through observation, educator interviews, and simple questionnaires adapted for children. Special attention was paid to non-verbal cues indicating engagement or frustration.

Analysis and Iteration

Feedback was analyzed to identify usability issues, engagement levels, and technical problems. Multiple iterations of testing and development were conducted to address identified issues.

Key Findings

Control Scheme Accessibility

MotionInput controls required calibration adjustments for different children's heights and movement patterns. We implemented adaptive sensitivity settings to accommodate varying motor abilities.

Visual Clarity

Some children found certain visual elements distracting or overwhelming. We adjusted color schemes, reduced unnecessary visual effects, and improved contrast for important game elements.

Game Pacing

Initial game speeds were challenging for some users with motor coordination differences. We implemented adjustable difficulty levels and extended time limits to ensure accessibility.

Audio Feedback

Audio cues proved essential for many users, particularly those with visual processing differences. We enhanced audio feedback systems to provide clearer indications of game events and achievements.

MotionInput Integration Testing

Testing the integration of MotionInput with our games required special attention to ensure physical movements were accurately translated into in-game actions.

Camera Setup Tests

  • Tested optimal camera positioning for different physical spaces
  • Verified detection accuracy in various lighting conditions
  • Established minimum space requirements for different games
  • Confirmed camera resolution and frame rate requirements

Movement Recognition Tests

  • Calibrated jumping height detection for different user heights
  • Verified crouching detection thresholds
  • Tested arm movement recognition for steering controls
  • Validated consistency of movement detection across multiple sessions

Game-Specific Motion Tests

Wipeout Game

  • Tested jump height recognition for avoiding low obstacles
  • Verified crouch detection for avoiding high obstacles
  • Validated timing between physical movement and in-game response

Racing Game

  • Tested steering wheel motion detection
  • Verified acceleration and braking gesture recognition
  • Validated response time for quick directional changes

Test Results Summary

  • Pass rate: 97% of test cases passing across all game modules
  • Total test cases: 214, giving comprehensive test coverage across all mini-games
  • Testing sessions: 12, held with actual end users in collaboration with NAS

Issues Resolution Summary

  • Critical issues: 8/8 resolved (100%)
  • Major issues: 14/15 resolved (93%)
  • Minor issues: 17/20 resolved (85%)
  • UI/UX improvements: 18/24 resolved (75%)

Conclusion

Our systematic testing approach has resulted in a stable, accessible gaming experience that meets our quality standards and user requirements. All critical issues have been resolved, with only a few minor improvements remaining for future updates. The game has been thoroughly tested with both traditional controls and MotionInput integration, ensuring a seamless experience regardless of the control method chosen.

User testing with children, including those with autism spectrum conditions, has confirmed that our games are accessible and engaging, with appropriate difficulty levels and sensory considerations. The modular design of our game architecture allowed us to isolate and thoroughly test each component, resulting in a robust final product.