Richard Cloudsley School
We have been fortunate to schedule meetings with children at Richard Cloudsley School. The school is dedicated to integrating cutting-edge technologies and working alongside the community to improve children's prospects for the future. It places a strong emphasis on adopting innovative tech solutions to prepare for evolving challenges, ensuring that young people are equipped to make decisions informed by these advancements. Its efforts include deepening collaborations with educational bodies and employers, with a particular focus on enhancing independent living skills through technology. Its goal is to ensure that children not only feel valued in their community but are also at the forefront of technological progress and adaptation.
Feedback and Our Response
Visit Commencing 15/12/23
Feedback
On the 15th we had our first visit to the school, during which we faced a few challenges. During our demonstration it became clear that the program we had designed would not work well for the children testing it, for a few reasons:
- One of the children testing our software had involuntary head movements, which made the calibration process extremely difficult to set up: whenever the head moves from its original position, the calibration process must restart.
- Another child, who did not have involuntary head movements, was long-sighted. This meant we either had to keep the computer some distance away from him so that he could see the screen, or extend the display onto the projector.
We had to consider these factors when making appropriate changes to our code, while also asking other MotionInput teams in FaceNav how they had countered the involuntary head movement issue we faced.
The feedback we gathered from the visit:
- The calibration should be visually engaging rather than audio-based.
- The accuracy is poor; as a result, the buttons controlled by dwell should be fairly large to counter this.
- Slight rotation or movement of the head sometimes controls the cursor, but this should not be the case: control should come from eye gaze alone.
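The dwell-with-large-buttons idea in the feedback above can be sketched roughly as follows. This is a hypothetical illustration, not our actual implementation: it assumes some gaze tracker supplies `(x, y)` screen coordinates each frame, and the class name and parameters are invented for the example.

```python
import time

class DwellButton:
    """A large on-screen target selected by holding gaze inside it.

    Hypothetical sketch: a bigger rectangle tolerates the poor gaze
    accuracy we saw during the visit, and the dwell time stops
    accidental glances from triggering a click.
    """

    def __init__(self, x, y, width, height, dwell_seconds=1.5):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.dwell_seconds = dwell_seconds
        self._entered_at = None  # when the gaze first entered the button

    def contains(self, gaze_x, gaze_y):
        return (self.x <= gaze_x <= self.x + self.width
                and self.y <= gaze_y <= self.y + self.height)

    def update(self, gaze_x, gaze_y, now=None):
        """Feed one gaze sample; returns True once the dwell completes."""
        now = time.monotonic() if now is None else now
        if not self.contains(gaze_x, gaze_y):
            self._entered_at = None  # gaze left the button: reset the timer
            return False
        if self._entered_at is None:
            self._entered_at = now
            return False
        return (now - self._entered_at) >= self.dwell_seconds
```

Making the rectangle larger directly trades screen space for tolerance to gaze jitter, which matches the feedback that accuracy, not speed, was the limiting factor.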
Our Next Steps
The visit was useful as it allowed us to see how external users would go about using the software, especially when assisted by an occupational therapist.
Our next steps were to...
- Improve calibration: we will code a more visually engaging calibration setup, whereby the user stares at points on the screen rather than listening to audio. Visual engagement is a key aspect, so this is a priority.
- Improve accuracy: this will be done in two ways. The first is increasing the number of calibration points, giving us more data to make accurate predictions about where the user is looking. The second is using facial landmarks in relation to the eyes, which should slightly counter head rotation and tilting.
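The reasoning behind "more calibration points" can be sketched with a simple linear least-squares fit from eye features to screen coordinates. This is a hypothetical illustration, not our production code: the feature vector here is invented, and in practice it could also include face landmarks (e.g. a nose-tip position) so that slight head rotation is partially compensated, as described above.

```python
import numpy as np

def fit_gaze_map(features, screen_points):
    """Fit a linear map from gaze features to screen coordinates.

    features:      (n, d) array, one row of eye/landmark features
                   per calibration point (hypothetical representation).
    screen_points: (n, 2) array of the known on-screen target positions.
    Returns a (d + 1, 2) coefficient matrix (last row is the bias).
    """
    X = np.hstack([features, np.ones((len(features), 1))])  # add bias column
    coeffs, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return coeffs

def predict_gaze(coeffs, feature_row):
    """Predict an (x, y) screen position for one feature row."""
    x = np.append(feature_row, 1.0)
    return x @ coeffs
```

With only a handful of calibration points the fit is noisy and easily skewed by a single bad sample; a grid of nine or more well-spread points over-determines the system, so the least-squares solution averages the noise out, which is exactly why adding calibration points should improve accuracy.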