By Sarah Mirza, Software Engineering Student
Published 27 June 2023
If I could summarise the final stretch of my Software Engineering degree at UNSW with one word, it would be: empowering.
As part of my three-term thesis in 2022, I embarked on a change-making journey with the Media & Immersive team. Together, we set out to revolutionise the Optometry and Ophthalmology classrooms by developing EyeSim, a VR experience where students can immerse themselves in a patient's view of nine distinct eye conditions. The goal is to deepen their understanding, improve the accuracy of their future diagnoses, and foster empathy for their future patients.
The making of EyeSim: my experience of the process
EyeSim V1 was built with WebGL, primarily for phone-based VR. From my own observations, the students were intrigued by the experience and were able to immerse themselves using their own phones and a headset phone holder. The lecturer controlled which eye condition would appear, then quizzed them on what they saw, which symptoms were present and how they would diagnose it.
As good as this was, we understood that delivering EyeSim on phones and laptops did not allow for an entirely authentic experience, and the image quality was low. We also ran into practical limitations, from phones overheating to certain phone sizes not fitting well in the viewer. While it allowed students to visualise eye conditions, it fell short of capturing the full range of difficulties these conditions create for the patient.
So we developed EyeSim V2, built with the Unity game engine and delivered on Quest headsets. Harnessing the power of VR, EyeSim V2 became a wonderful empathy machine, enabling students to walk in the shoes of someone experiencing these conditions and, in doing so, gain a more comprehensive understanding and a stronger emotional connection with their future patients.
For me, the development process across the three terms involved researching the eye conditions and writing shaders in Unity to recreate them within the program (a simplified sketch of the idea is below). It took design, decision-making, coding and documentation, and I found myself thoroughly enjoying working closely in an agile environment with my supervisors, Graham Hannah (Manager, Immersive Technologies) and Tim Dodds (Immersive Technology Specialist) from the PVCESE Media & Immersive team.
Working with them helped me strengthen my Software Engineering mindset: problem solving, project management, continuous learning, adaptability and resilience.
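To give a flavour of the shader work mentioned above, here is a minimal sketch of the kind of Unity fragment shader that could darken the edges of the camera image to mimic tunnel vision. It is purely illustrative: the shader name, property names and the specific effect are my own assumptions for this post, not the actual EyeSim code.

```
// Hypothetical example only, not the EyeSim shader: a minimal Unity image-effect
// shader that darkens the periphery of the view, roughly mimicking tunnel vision.
Shader "Hidden/TunnelVisionSketch"
{
    Properties
    {
        _MainTex ("Source Image", 2D) = "white" {}
        _Radius ("Clear Radius", Range(0, 1)) = 0.3
        _Softness ("Edge Softness", Range(0.01, 1)) = 0.25
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _Radius;
            float _Softness;

            fixed4 frag (v2f_img i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                // Distance of this pixel from the centre of the view (0.5, 0.5)
                float dist = distance(i.uv, float2(0.5, 0.5));
                // Full brightness inside the clear radius, fading to black at the edges
                float vision = 1.0 - smoothstep(_Radius, _Radius + _Softness, dist);
                col.rgb *= vision;
                return col;
            }
            ENDCG
        }
    }
}
```

In a real project, an effect like this would typically be applied to the camera image with Graphics.Blit inside an OnRenderImage callback (or the equivalent post-processing hook for the render pipeline in use).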
Showcasing EyeSim and other gains for me
During my involvement with the project, I had the privilege of showcasing it to others several times. In the final weeks, I had a chance to present not only to freshly enrolled scholarship students and our third-year Computer Science and Engineering students, but also to the public. What really resonated with me was seeing elderly men and women use the headset for the first time, reflecting on their own student days and being so impressed by the possibilities ahead.
Hearing their wows and seeing their faces light up with curiosity and wonder was absolutely heartwarming.
In addition to honing the skills the project required, along with valuable public presentation skills, I found the whole experience incredibly motivating and empowering. I am proud to have been part of the team that built EyeSim V2 and replaced a passive learning experience with a powerful active one.
I am grateful to have helped create something that builds empathy and engagement among students in Optometry and Ophthalmology.
I am looking forward to seeing the project evolve in the hands of the team and the next generation of students. I hope they get as much out of it as I did.
For more details on EyeSim, or to explore using VR in the classroom, contact the Media & Immersive team.
***