Safety at Work

Safety at Work is an applied research project to integrate immersive experiential learning with positive behaviour support training in the disability sector.

Partners

Victorian Government 

Swinburne PAVE

Scope

Investigators

Kim Vincs (TMT)
John Scahill (SCOPE)
Debbie McLaughlin (PAVE)
Mark di Marco (SCOPE)
Jeni Paay (Swinburne CDI)
Rachael McDonald (Swinburne CDI)
Jordy Kaufman (Swinburne CMH)
John McCormick (TMT)
Karen Hall (PAVE)
Brendan Parsons (SCOPE)
Aoife McCann (SCOPE)
Natasha Drozdoff (SCOPE)
Karen Phelps (SCOPE)

Casey Richardson (TMT)
Casey Dalbo (TMT)
Haydon Bakker (TMT)
Adam Carr (TMT)
Jordan Cook-Irwin (TMT)
Stephen Jeal (TMT)
Irene Gironacci (TMT)
Tony Nguyen (TMT)
Esther Wilding (CDI)
Warren Davis (SCOPE)
Kathlyn Moynihan (PAVE)
Jacinta Purnaro (SCOPE)

Occupational violence is a significant risk in the disability sector, particularly where frontline support workers assist people with a disability in residential or community environments in which some individuals may exhibit challenging behaviours, in extreme cases involving violence. Acknowledging this challenge, Scope, one of Australia’s largest providers of disability support services and the lead provider of positive behaviour support (PBS) training in Victoria, approached us with an ambitious workforce innovation idea. This idea has become ‘Safety at Work’, a multi-disciplinary research initiative that will mainstream VR for training disability support workers in PBS for people with disabilities. Working with Scope and Swinburne PAVE, the Embodied Movement Design Studio team, led by Professor Kim Vincs, is developing a new approach to VR-based PBS training. We are using our deep creative and performance knowledge to co-create a set of five VR scenarios that embed the key learning elements of PBS within the choices users make, inside an engaging and immersive interactive environment.

Professor Kim Vincs and Dr John McCormick from TMT are working with leading researchers from Swinburne’s Centre for Design Innovation, Professor Jeni Paay (user experience) and Professor Rachael McDonald (disability and health care), and from Swinburne’s Centre for Mental Health, Associate Professor Jordy Kaufman (psychology). Our collaboration also involves a large team of Scope PBS experts, led by Mark Di Marco (Manager, Positive Behaviour Support Services), and education specialists from PAVE, led by Debbie McLaughlin (Manager, Strategic Projects, Swinburne PAVE), to develop and validate the learning benefits of VR for PBS training.

By developing, testing and validating a stand-alone VR system that can be embedded within PBS curricula in the workforce and in TAFE courses, we aim to deliver a ‘step-change’ in the design and delivery of training within the disability and other adjacent human services sectors through efficiency (cost, mobility and scale) and effectiveness (improved learner experience, improved and enduring knowledge and behavioural outcomes). Through improving the scale and quality of PBS training, we aim to reduce the incidence of occupational violence in the disability sector.

This study addresses an increasingly important question in the use of VR for workplace training. While VR and AR training systems are being developed in many industries and many organisations around the world on the basis of VR’s ability to immerse and engage, the educational benefits of these systems are not yet fully understood.

Related Projects

VR Anatomy Lab

The Embodied Movement Design team has created a new VR anatomy teaching lab. Students from across the health science programs at Swinburne University of Technology will study in a bespoke immersive and interactive environment, launching later in 2020.

Partners

Swinburne University of Technology, Faculty of Health, Arts and Design

Investigators

Kim Vincs
Rachael McDonald (Swinburne CDI)
Adam Carr
Jordan Cook-Irwin
Stephen Jeal
Joshua Reason
Irene Gironacci

Teaching anatomy traditionally requires a physical dissection lab. In this project, the Centre for Transformative Media Technologies worked with Swinburne’s School of Health Sciences and the Faculty of Health, Arts and Design to design and build a complete VR anatomy dissection and teaching lab. The lab uses multi-player VR technology to allow students and teachers to interact in small and large groups with state-of-the-art 3D digital anatomy models within a fully interactive environment.

Related Projects

Collaborative Embodied Movement Design Network

Movement-based technologies such as augmented and virtual reality, haptic and robotic interfaces form the cutting edge of human computer interaction (HCI) development. This project has developed new infrastructure to create a national collaborative network of arts/technology researchers, enabling them to work together to optimise the quality of these systems from an embodied perspective, and to create new innovation possibilities for industry, commerce, education, health care and the arts. The network features real-time remote motion capture collaboration between facilities. 

Partners

This research was funded by the Australian Government through the Australian Research Council’s Linkage, Infrastructure and Equipment Scheme (Project LE170100066).

Investigators

Kim Vincs (TMT)

John McCormick (TMT)

Troy Innocent (RMIT)

Adam Nash (RMIT)

Simon Biggs (Uni SA)

Bruce Thomas (Uni SA)

Frank Vetere (Uni Melb)

Robert Vincs (Uni Melb)

Saeid Nahavandi (Deakin)

Douglas Creighton (Deakin)

Jordan Vincent (Deakin)

Petra Gemeinboeck (UNSW)

Keith Armstrong (QUT)

Thomas Chandler (Monash)

Scott deLahunta (Coventry, UK)

Related Projects

Enhancing the impact of Australian performing arts: virtual scenography and opera for the 21st century

This project, a partnership with Victorian Opera, explores the artistic and economic potential of 3D virtual scenography through three landmark productions, The Flying Dutchman, Four Saints in Three Acts and The Snow Queen. 

Partners

Victorian Opera

This research was funded by the Australian Government through the Australian Research Council’s Linkage Program (Project LP1400100742).

Investigators

Kim Vincs
Richard Mills

Digitally created virtual 3D scenography has the potential to create immersive stage environments that re-imagine and re-tool the artistic language of opera production for the 21st century, and to re-model the economics of live theatre production through reducing touring costs. 

In these productions, the performers are integrated within spectacular CG landscapes and characters, which the audience views using 3D glasses. These highly successful works delighted audiences and drew exceptional reviews, awards and nominations, including being a finalist in the non-game category of Unity’s international Unite awards for The Flying Dutchman.

The Flying Dutchman

This production, created in collaboration with Victorian Opera, explored the potential of 3D stereoscopic technology to provide unparalleled depth to the set and reduce the cost of building and transporting physical sets whilst touring, both vital to a sustainable and visually engaging opera industry.

Original 3D projected scenography, viewed using 3D glasses, was created in the Unity game engine. This enabled us to cue animations in real time during the live performance, and provided a unique ability to move the camera perspective around the ‘world’ of the opera, in this case a Norwegian fjord. This methodology provided a new visual dramaturgy for the opera, whereby audiences feel as if they are moving through the set, and experience a visceral sense of depth and presence. Without the need for a physical set, this research also contributed greater economic flexibility to the production, increasing its potential to reach regional and remote audiences.

The work was a critical success, demonstrating the power of 3D techniques to refresh the aesthetics of traditional operas such as The Flying Dutchman. The work was a finalist in the Unite (Unity game engine) awards, nominated for eight Green Room Awards (winning three), and nominated for one Helpmann Award. The Flying Dutchman was profiled by the Australian Research Council in their 2015 report as an exemplar of impact.

https://www.victorianopera.com.au/past-productions/the-flying-dutchman

Four Saints in Three Acts

This project, also created in collaboration with Linkage partner Victorian Opera, aimed to develop, implement, and validate the aesthetic opportunities afforded by 3D digital scenography as well as the capacity of virtual, projected sets to remodel the economics of theatre production, extending the artistic and cultural reach of traditional opera.

Building on the success of The Flying Dutchman, Four Saints in Three Acts involved creating 3D stereoscopic imagery using the Unity game engine to augment the surrealist librettos written by Gertrude Stein that form the basis of this opera. When viewed by an audience wearing 3D glasses, radical imagery appears: fish fly, stained glass windows shatter, and portals open to a cosmic space setting. These effects play upon the ambiguous nature of the libretto, whilst camera movements enabled by the game engine, such as orbital shots and sudden zooms, enhance the sense of disjuncture, non-sequitur and play.

The work was created in collaboration with well-known director and theatre-maker Nancy Black, and performed as part of the Victorian Opera’s season at the Merlyn Theatre, Coopers Malthouse in Melbourne. This project brought a little-known yet seminal modernist opera to life for Melbourne audiences, and, without the need for a built set, the production is able to tour at reduced cost. While the opera itself proved challenging to critics, the 3D scenography was an outstanding success and the work was nominated for a Green Room award and the CHASS non-traditional research output prize.

https://www.victorianopera.com.au/season/four-saints-in-three-acts

https://www.victorianopera.com.au/behind-the-scenes/reviews-four-saints-in-three-acts

The Snow Queen

The Snow Queen is the third opera created in collaboration with Victorian Opera as part of this ARC Linkage project, which aimed to design, test and evaluate the artistic and economic value of virtual scenography for touring performing arts companies, and enable opera companies to present large scale productions in previously inaccessible regional and rural areas.

The Snow Queen was specifically designed for regional touring, and premiered at Wodonga, in country Victoria in November 2017. The production simplified camera movement and 3D object placement to successfully accommodate a large on-stage community chorus. This regional staging demonstrated the capacity of 3D virtual scenography to reduce touring costs for opera. The project used Unreal Engine rather than Unity game engine as in the previous two operas. This enabled us to create more visually rich imagery in less time, hence reducing costs.

The work successfully developed and demonstrated technological and artistic compositional approaches that complemented the regional setting of this tour. This is particularly significant for companies such as Victorian Opera, which see regional touring as a critical aspect of their mission to bring new cultural experiences to audiences, and as a core element of their business and funding models.

https://www.victorianopera.com.au/season/the-snow-queen

Related Projects

Eve of Dust

Eve of Dust is a collaborative performance and installation between a human and a robot. The artwork draws on both the possibilities and anxieties arising from the collaboration between humans and emerging intelligent systems personified in the robot.

Partners

Australia Council for the Arts

Investigators

John McCormick
Adam Nash (RMIT)
Steph Hutchison (QUT)

The artwork uses a Sawyer collaborative robot, an articulated seven-jointed robot arm that somewhat resembles a snake. The performance investigates the co-creative possibilities offered by collaboration with non-human systems.

The work has two modes: performance mode and interactive mode. Performance mode is a collaborative duet between the robot and a professional dancer. Using a handheld VR controller to pick out points in space, the dancer choreographs the robot’s movement in real time, in collaboration with the robot. The robot’s movements generate music in real time, with the rotation, position and motion of the robot determining pitch, rhythm, timbre and other musical parameters. In this way, the dancer responds to and collaborates in both the robot’s movements and the generated music, creating a collaborative dance duet that is unique to every performance.
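The kind of movement-to-music mapping described above can be sketched as a simple parameter-mapping function. The sketch below is illustrative only: the function name, ranges and scaling factors are invented assumptions, not the system actually used in Eve of Dust.

```python
import math

def joint_state_to_music(angles, velocities, height):
    """Hypothetical mapping from a 7-joint robot pose to musical parameters.

    angles     -- joint rotations in radians (-pi to pi)
    velocities -- joint angular velocities in rad/s
    height     -- end-effector height in metres
    """
    # Mean joint rotation selects a MIDI pitch within a two-octave range.
    mean_angle = sum(angles) / len(angles)
    pitch = 48 + int(((mean_angle + math.pi) / (2 * math.pi)) * 24)  # MIDI 48-72

    # Faster overall motion plays louder, clipped to the MIDI velocity range.
    speed = sum(abs(v) for v in velocities)
    loudness = min(127, int(speed * 40))

    # A higher end-effector position drives a faster pulse.
    tempo_bpm = 60 + height * 60

    return {"pitch": pitch, "loudness": loudness, "tempo_bpm": tempo_bpm}
```

In a live setting, a function like this would be called on every robot state update and its output sent to a synthesiser, so that the dancer’s choreography of the robot is heard immediately as changing pitch, dynamics and tempo.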

In interactive mode, members of the public can play with the robot, using a handheld VR controller to choreograph the robot’s movements which, as in performance mode, generate music in real time. Inviting playful interaction, people can collaborate with the robot to make a real-time robot music and dance performance. People respond to the robot’s movement, and will indeed find it hard to remain passive in an unfolding duet that is unique to each person. The work was performed as part of SIGGRAPH Asia 2018 at the Tokyo International Forum, Japan. More information is available at: http://www.wildsystem.net/eve_of_dust.html

Related Projects

City of Androids

A child robot uses an artificial neural network and location data to recognise where it is and what it is seeing within the Melbourne CBD. Participants can wheel the child around and it will recite stories it creates based on its surrounds. Emerging intelligent systems are increasingly impactful on our lives. The research investigates shared creativity and empathy with non-human systems. 

Partners

City of Melbourne – Arts Grants

Investigators

John McCormick

Adam Nash (RMIT)

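The recognise-then-narrate loop described above can be sketched as follows. The template store and function below are hypothetical illustrations, since the project’s actual pipeline is not documented here: the robot’s neural network would supply the object label, and its location data would resolve to a place name.

```python
# Hypothetical story templates; the real work generates richer narration.
STORY_TEMPLATES = [
    "I can see a {thing} near {place}. I wonder if it has seen me too.",
    "At {place} today there was a {thing}. I will remember it.",
]

def tell_story(recognised_object, place_name, seed=0):
    """Compose a short first-person story from what the robot 'sees'
    (an object label from its neural network) and where it is
    (a place name resolved from location data)."""
    template = STORY_TEMPLATES[seed % len(STORY_TEMPLATES)]
    return template.format(thing=recognised_object, place=place_name)
```

Each time a participant wheels the robot to a new spot, a loop of this shape would re-run recognition and recite a newly composed story aloud.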

Related Projects

Child in the Wild (ROBOTS + AI)

Presentation of the Child in the Wild exhibition at the Art Science Museum Singapore, Black Box RMIT and the Horsham Town Hall Gallery.

Partners

Australia Council for the Arts

Investigators

John McCormick
Adam Nash  (RMIT University)

Child in the Wild is a work by Wild System (aka Adam Nash and John McCormick); see the work’s official website at wildsystem.net. Child in the Wild is an interactive installation that enables human participants and a child robot to co-create an immersive audiovisual artwork, using the robot’s artificial neural networks for object and image recognition. The resulting artwork dissolves the boundaries between computational and physical phenomena, displaying an aesthetic that is a genuine hybrid of the physical and the digital, of human and machine learning, of natural and artificial intelligence, and of real and synthetic evolution.

It is an artwork and aesthetic that emerges from the interaction between robot, people and virtual environment, with none taking precedence; rather, they collaborate on a genuinely post-digital, post-convergent artwork. Child in the Wild has been presented at the SIGGRAPH Asia Art Track, Macau, China, in November 2016; at the Singapore ArtScience Museum for the ACM Creativity and Cognition Art Track Microbytes of Innovation in July 2017; and in Data Is Nothing at the RMIT Black Box Gallery in March 2018.

Related Projects

WITH ME – Interactive sound installation about aloneness and loneliness

This immersive dance work focuses on themes of loneliness and being alone, placing the audience into moments of ‘alone-ness’ to physically experience the work. 

Partners

Australia Council for the Arts

City of Stonnington

Investigators

Kim Vincs (TMT)
Clare Dyson (Swinburne)

The performance works between the cracks of dance, lighting design, sound and audience experience. It is a collaboration between dance, sound, installation and performance, choreographed by Clare Dyson in collaboration with performer Gerard Van Dyck, lighting designer Mark Dyson, stage designer Bruce Mckinven, dramaturg Kathryn Kelly, sound artist Mike Willmett and original performer Brian Lucas.

Related Projects

Memoryscape

Memory-scapes is an exploration of Cinematic VR and location-based VR. The Creative Arts research project combines installation, multiscreen, Cinematic VR & location-based VR frameworks to research imaginative storytelling and immersive experiences.

Partners

Centre for Transformative Media Technologies

Investigators

Max Schleser

As an experimental screen production, the VR work focuses on the construction of story and memory-scapes. Working in the tradition of experimental filmmaking, the practice-led research project will redefine the time and space continuum, formulating approaches to VR time in the context of interactive and generative storytelling.

While the idea of VR is not new and has been surfacing since the 1990s, accessible omnidirectional video cameras that integrate with standard video production workflows were only launched in the last three years. The affordances and aesthetic implications of cinematic VR and location-based VR have not yet been fully explored. Emerging research suggests a ‘new paradigm of mobile cinematics’, and the creative industries approach VR not only as an emerging technology but as a new industry sector.

Related Projects

9th International Mobile Innovation Screening & Festival

The International Mobile Storytelling Congress (IMSC) will take place on 22-24 November 2019 in Ningbo, China. IMSC focuses on mobile, smartphone and pocket filmmaking, mobile innovation and mobile creativity.

Partners

MINA – Mobile Innovation Network & Association, 

Mobile Studies International (MSI) 

University of Nottingham Ningbo China (UNNC)

Investigators

Max Schleser

Alongside the International Mobile Storytelling Congress (IMSC), the 9th edition of MINA’s International Mobile Innovation Screening will take place on 22-24 November 2019 in Ningbo, China. IMSC focuses on mobile, smartphone and pocket filmmaking, mobile innovation and mobile creativity, and provides a forum for practitioners and scholars to showcase projects and discuss the changes, challenges and chances of mobile storytelling. MINA (www.mina.pro) is the longest-running film festival internationally dedicated to mobile and smartphone filmmaking, with a focus on moving-image arts, documentary, community-engaged film productions, experimental films and emerging film production forms and formats such as MoJo, drone videos, AR and mobile Cinematic VR.

Related Projects