UC Research Supports Hands-off Approach To Childcare

A new UC study is developing critical training opportunities for early childhood education student teachers, without an infant in sight.

It’s a reality that many people passionate about working effectively with infants (0–6 months) have not interacted, and are potentially unable to interact, with children of that age during their training.

The obstacles, while varied and appropriate, are the subject of a new, trans-disciplinary study from Te Whare Wānanga o Waitaha | University of Canterbury (UC), with its HIT Lab NZ and Te Kaupeka Ako: Faculty of Education in the early stages of creating specialised training environments through virtual reality (VR).

Professor Jayne White initiated the study, reaching out to HIT Lab NZ to gauge their interest in a collaboration after identifying an issue for student teachers who did not have ready access to real-life infants as part of their qualification.

The proposition quickly caught the attention of Associate Professor Heide Lukosch, Head of HIT Lab NZ’s Applied Immersive Game Initiative (AIGI), which aims to accelerate research and public use of immersive gaming applications with a view to improving personal, social, educational and health-related outcomes.

Professor White then led the two faculties, with funding from the University’s Child Well-being Research Institute, in the first trial of a VR prototype.

“VR has proven itself to be an effective strategy for learning practical skills across many disciplines, but its possibilities within education are yet to be fully explored,” Professor White says. “We’re excited by the potential of the tool, and the opportunities we’ve identified for future development.”

“Professor White proposed the idea of applying VR in early childhood education, because while infants are capable of expressing themselves from birth, it is sometimes difficult for non-familial adults to understand their verbal and non-verbal cues,” Associate Professor Lukosch says.

“What we can do through our research is provide the support required to understand the cues or signals they produce, to better identify and rehearse a grammar that will respond effectively to their needs.”

“The development of a VR tool of this kind would be enormously beneficial when you consider the wide-ranging situations in everyday life in which more than verbal communication is needed.”

Associate Professor Lukosch and Professor White now co-lead the study which, informed by the Mātauranga Māori concept of whanaungatanga under the guidance of UC Senior Lecturer Dr Ngaroma Williams, will identify key elements needed to support relational skills for adults working with infants and translate them into novel training environments.

“We’re really intrigued by the opportunity and the power of virtual environments to help people in situations that would be otherwise hard to access, or potentially dangerous,” Associate Professor Lukosch explained.

“In this case, we aim to create a virtual, immersive environment that enables the user to really feel present and accountable for their actions in certain situations, with opportunities to reflect on these.”

A pair of gloves incorporating haptic technology will form an invaluable element of these virtual environments and encounters with infants. A mechanical system within the gloves will simulate the resistance someone would realistically feel as they carefully handled an infant, whether to change, soothe, feed, or entertain them.

One of the early VR training prototypes tasked users with establishing a positive interaction by interpreting and responding to non-verbal cues about the virtual infant’s wants or preferences.

Preferences or personas can be programmed into a virtual baby avatar, and as the user decides what actions to take in their interaction with the virtual baby, the research will enable the team to develop an intelligent system that decides how their ‘baby’ responds.

“For example, we could programme a preference for red, and when the user picks up a red toy, the baby would smile. If they choose a yellow toy or similar, it would start crying,” Associate Professor Lukosch explains.
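
As a rough illustration of the rule-based behaviour described above, the sketch below (hypothetical, not the UC team’s actual system; all names are illustrative) shows how a programmable persona could map a user’s action, such as the colour of the toy they pick up, to a scripted response:

    # Minimal, hypothetical sketch of a rule-based "preference" system of the
    # kind described in the article; names and structure are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class InfantPersona:
        """A programmable persona for a virtual infant avatar."""
        preferred_colour: str = "red"
        responses: dict = field(default_factory=lambda: {
            "match": "smile",     # the user's action matches the preference
            "mismatch": "cry",    # the user's action conflicts with the preference
        })

    def infant_response(persona: InfantPersona, toy_colour: str) -> str:
        """Decide how the virtual infant responds to the toy the user offers."""
        key = "match" if toy_colour == persona.preferred_colour else "mismatch"
        return persona.responses[key]

    baby = InfantPersona(preferred_colour="red")
    print(infant_response(baby, "red"))     # smile
    print(infant_response(baby, "yellow"))  # cry

In the study itself the team retains control over these scripted responses for its user trials, with adaptive or AI-driven behaviour noted below as a possible later extension.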

While the system doesn’t use artificial intelligence, as the research component requires the team to have control over it for user studies, there may be opportunities for this emerging technology moving forward.

“In these training scenarios, it’s very obvious that there’s a chance to integrate artificial intelligence later, to make the learning environments particularly adaptive.”

The UC team are also working closely with Professor Tony Walls and Dr Niki Newman, lead of the Christchurch-based University of Otago Simulation Centre, who are informing the healthcare elements of the study.

As the study progresses towards commercialisation over the next three years, the team hopes the general design principles for interaction it derives can be transferred to other validated training domains.

“We’re providing a solution that can be difficult to access in real life and providing this support is something that excites us very much about the technology.”

© Scoop Media
