By popular request, we provide a look at open source in video games. Two excellent talks: one looking at the general area of opening up videogames, the second looking at cutting-edge use of open source gaming to help children with ASD.
Once again, due to COVID-19 this will be a purely virtual meeting. We’ll be livestreaming using BigBlueButton to provide a rich online experience for participants. As always, the talks will be recorded for later upload to YouTube. You are invited to join and socialize from 18:00; talks will run from 18:30-20:00, with 30 minutes at the end for further discussion and socializing.
In a change to our past practice, there is no requirement to register; you can just connect to BigBlueButton on this link.
18:00 – Feel free to join the online meeting to chat with other participants
18:30 – Short introduction (5 min) of the evening by Mary Bennett
18:35 – Presentations
20:00 – Closing Discussion
Opening Up Videogames
The National Videogame Museum’s most important maxim is ‘videogames are for everyone’. Empowering our audiences to play, understand and make videogames is at the core of everything we do. This is a behind-the-scenes look at how the NVM is using open source software, open hardware and open data to achieve those goals through game development, game interpretation, and game preservation, as well as its ambitions for the future.
Lex is part of the curatorial and exhibition team at the National Videogame Museum in Sheffield. They are an avid programmer and maker of things, both digital and physical. Lex helped establish the London chapter of the Code Liberation Foundation, which fosters the creation of games and creative technologies by women and non-binary people. They now sit on the museum committee of OpenUK, a not-for-profit organisation committed to developing and sustaining UK leadership in Open Technology.
Virtual Social Robot Interaction for Enhancing the Social Skills of Children with ASD
Social skills training (SST) programmes are extremely important for training and enhancing the social skills of children with autism spectrum disorder (ASD). SST programmes are designed to provide learning experiences that teach children the skills necessary to interact successfully with their social environment. Although many previous studies have shown traditional SST interventions to be beneficial for children with ASD, their accessibility is limited. Physical social robots and virtual environments have become popular training tools for children with ASD in recent years, and studies have been conducted to evaluate their effectiveness in enhancing these children's social skills. The research presented in this talk investigates the potential of combining virtual environments with social robots as a novel approach to address some of their limitations and to train the social skills of children with high-functioning ASD. A non-immersive (desktop) virtual reality environment that employs a 3D robot has been designed. The developed environment aims to enhance the social skills of children with high-functioning ASD through a social skills training programme guided by a parent or a teacher. The motivation of this research is to provide a tool that is widely accessible, cheap, and easy for parents and teachers to use either at home or at school. The designed training programme is adapted and modified from successful work done with physical robots (NAO, Zeno, and Qtrobot) to train the social skills of children with ASD. The developed training programme targets three social skills: imitation, emotion recognition, and intransitive gestures. Due to the current circumstances (COVID-19) and the closure of schools and centres for autism, the experimental studies have been conducted both online and onsite. The proposed tool has been launched on a website to make it accessible to a wider group and to enable data collection over time.
Questionnaires and an observation sheet have been designed for data collection in pre- and post-intervention sessions. Ethical approval for this study was obtained from the University of Greenwich Research Ethics Committee (UREC).
A preliminary evaluation has been conducted to ensure that the gestures produced by the virtual robot can be recognised by children with ASD. Eight typically developing children without ASD (6-12 years) participated in this preliminary evaluation; they were asked to identify the meaning of the gestures demonstrated by the virtual robot in one of the training scenarios. The findings show that all the gestures had a consistency rate of 75% or above, an indication that the animated gestures performed by the virtual robot are recognisable and have a common interpretation. The evaluation process is still in progress with seven children with ASD. The emotion recognition training programme has been completed, and the participants showed improvement between the pre- and post-tests in all phases.
Maha Hatem is a PhD candidate at the University of Greenwich, London. She is conducting research on autism and how assistive technology can enhance the social skills of children with ASD. The developed tool is a free desktop virtual reality tool for enhancing the social skills of children with ASD. The tool is still being evaluated, and the evaluation is part of her PhD.