PLAYStrong
Processes & Discoveries
The following is a chronological timeline of notable processes, events and discoveries that occurred over the course of the workshop. This record was assembled with reference to two reports compiled by student interns who attended the workshop: the "Five Interfaces Progress Report" written by Thais Holanda and the "DDL2 Daily Report" prepared by Powrnika Kugathasan.
04.17.23
The team met for the first time in the Luella Massey Studio Theatre. Introductions were made and Candy Blair led the group through an Indigenous smudging ceremony outside.
While beginning to set up the POETICstates interface and familiarizing themselves with the new space, the team ran into a few problems. First, there were power outlets only on the right side of the theatre, so everything had to be shifted to that side and extension cords used where necessary. Second, the stage lights were not working, so additional LED lights were arranged around the stage area. Third, the stage proscenium was blocking projections onto the back wall, so the projector had to be repositioned by Adrien Whan, the theatre's technical director. Finally, initial setup was delayed because Gustavo Sol did not at first have access to a temporary UofT login and so could not connect to the WiFi to make required software updates.
04.18.23
Setting up and testing of the POETICstates interface continued today, but Candy Blair was not able to attend because she had to stay home to take care of her son, who had become ill. To accommodate her, we set up a semi-mobile Zoom station on wheels with a laptop, camera, screen, microphone and speaker so that she could still participate. We also found out today that Renusha Athugala, one of the presenters, would not receive his Canadian visitor visa in time, so it was decided that he would present his project remotely and be Zoomed into the workshop in the same way we had arranged for Candy Blair.
Because the projector had not yet been repositioned, the visual biofeedback was not available for Ethan Persyko while testing the EEG component of the interface. However, the audio feedback coming from the speakers was working, so Ethan Persyko was still able to regulate the EEG reading by listening to the sound cues triggered by his brain activity.
Today was the first open house day of the workshop, where artists and friends were invited to observe and/or test two of the interfaces: the POETICstates interface and the 4VR Young VR interface presented by You Zhi Hu. Several testers of the 4VR Young interface reported a disorienting, dizzying sensation caused by the mismatch between the movements of their bodies (e.g. pedalling the bike and turning their heads) and the video. It was suggested that before solving the discrepancy by matching the video to the body movements more naturally, it would be interesting to consider applications where a surreal, disorienting effect might be desirable.

During testing and observation of the POETICstates interface, it was discovered that insufficient contrast in the video image was causing issues for the facial recognition component. To solve this problem, Thais Holanda put on a scarf to increase the contrast between her face and neck, and more lighting was added to create further definition. When testing the EEG component, it was discovered that Thais Holanda's long hair impaired the headset's ability to receive input, so Ethan Persyko, whose shorter hair allowed a better connection, demonstrated that component instead.
04.19.23
04.20.23
For day one of two of the undergraduate student focus group, the participants were shown demonstrations of the POETICstates and 4VR Young interfaces. After the students observed both interfaces, it was pointed out that the POETICstates interface was harder to understand because of its relatively complicated user interface. It was suggested that, were it to be used for student mental health, a more user-friendly visual interface should be created.
Photo: Tanya Humeniuk. Chairs set up in preparation for the student focus group.
Photo: Tanya Humeniuk. Thais Holanda demonstrating the facial recognition biofeedback component of POETICstates.
Photo: Tanya Humeniuk. Jacob Loat, student of Don Sinclair, testing the 4VR Young interface.
Photo: Tanya Humeniuk. Gustavo Sol adjusting EEG headset on Ethan Persyko's head.
Photo: Tanya Humeniuk. Candy Blair being Zoomed in.
Photo: Tanya Humeniuk. Stage proscenium blocking projection onto the back wall.
Photo: Tanya Humeniuk. First day setting up for POETICstates interface.
Photo: Tanya Humeniuk. View inside from the outside.
04.21.23
For day two of the student focus group's interaction with the POETICstates and 4VR Young interfaces, the students had the chance to test out the interfaces for themselves. To deal with the disorienting quality of the 4VR Young interface, several students suggested that increasing the resistance of the bike would create a more realistic experience and a closer match between the physical and visual feedback.

It was also observed that, this being day two for the focus group, the participants were becoming more comfortable with each other and talking more and more freely, contributing to a more fun and productive environment.
Photo: Tanya Humeniuk. Antje Budde, You Zhi Hu and Gustavo Sol listening to feedback from the focus group students.
04.24.23
The plan for the second week was to set up, test and present the SATI and Forest Bathing interfaces. Renusha Athugala had been having connectivity issues ever since a recent shark attack on underwater internet cables slowed the internet in Vietnam, so backup plans were made in case his connection continued to cause problems. Luckily, Renusha was able to send Don Sinclair and Gustavo Sol his code, which they ran on their own laptops, since it was mostly written in MaxMSP, with which they were familiar. The SATI interface was therefore set up in the Luella Massey Studio Theatre, with the intention of Renusha Athugala giving a Zoom lecture on it later in the week.

Initially, when the interface was set up, the camera would shake when people moved on stage, causing the motion tracking to work poorly. This was fixed by placing the camera tripod on some soft, flat objects found in the theatre (see photo).

While testing the interface, it was discovered that users would quickly become physically exhausted, which also contributed to an increase in stress. To fix this, two five-second pauses were added in the middle of the five-minute loop to give the user a chance to rest and refocus.

The issue of the user-friendliness of the POETICstates interface was also revisited today. Suggested solutions included using pictures of masks from the Art Gallery of Ontario (AGO), Snapchat filters, or videos of the emotive facial expressions of the Navarasa from classical Indian dance. Because of the technical difficulties and relative complexity of incorporating the videos into the interface, it was decided that the masks would be used instead.
Photo: Tanya Humeniuk. Don Sinclair setting up the SATI interface.
Photo: Tanya Humeniuk. Camera tripod stabilized by foam blocks.
04.25.23
Today the POETICstates interface was tested for the first time with the masks incorporated. After it was tested by Thais Holanda and Powrnika Kugathasan, we discovered that the masks created a better awareness of what the facial muscles were doing, since the emotions of the masks were more ambiguous and open to interpretation. Rather than focusing on conveying a particular emotion, testers became more aware of their actual bodies and of the extent to which they corresponded to the mask, regardless of how the emotion of the mask was interpreted.
Photo: Tanya Humeniuk. Thais Holanda testing the POETICstates interface with the AGO masks.
04.26.23
Today was the open house for the SATI and Forest Bathing interfaces. Powrnika Kugathasan demonstrated the SATI interface for the visitors and then described her experience with it, stating that she initially felt some performance anxiety, even as a trained dancer. We concluded that a non-dancer participant would probably feel even more anxiety, and so it was established that the interface would be most effective if a private option were available.
Photo: Tanya Humeniuk. Powrnika Kugathasan demonstrating the SATI interface for the visitors.
04.27.23
This was the beginning of the final two days of the student focus group, over which they would experience the SATI, Forest Bathing and POETICstates interfaces. After Powrnika demonstrated the SATI interface for the students, Renusha Athugala was able to attend the focus group via Zoom and receive direct feedback from them. Seemingly more comfortable than ever, the students were very vocal and gave many suggestions for further adjustments and applications for the interfaces.
Photo: Tanya Humeniuk. Renusha Athugala attending the focus group feedback/discussion session via Zoom.
04.28.23
One of the goals for today was to try to add a visual component to the ECG part of Gustavo Sol's POETICstates interface, representing stress levels with coloured bars. The main issue was technical difficulties with the heart rate monitor itself; it was concluded that a better ECG monitor would be needed to move forward with this component of the interface.
Photo: Tanya Humeniuk. Thais Holanda hooked up to the heart rate monitor testing the new interface.
05.10.23
Today Danielle Lottridge came in to present her Vector:Interact interface. After setting up the interface, it was discovered that because of the position of the projector and the height of the stage, the participant's body cast a shadow on the projection. This was not originally intended, but we decided to explore the potential benefits of the accident. Two people tested the interface: Grace, a student from the Interactive Media Lab, and Candy Blair. Having a background in dance, Candy Blair was able to use her shadow to her advantage, whereas for Grace, who moved around relatively less, the shadow detracted from the experience, since it covered most of the visual feedback.
Photo: Tanya Humeniuk. Candy Blair testing the Vector:Interact interface.
Photo: Tanya Humeniuk. Grace testing the Vector:Interact interface.
05.11.23
Today we tested a different version of the Vector:Interact interface. This one also had music, but it was much more piercing and intense, so it did not create as relaxing an environment. When Ethan Persyko tested it, he described feeling more aroused and curious than relaxed.

The team attempted to connect 3D VR glasses to the interface but ran into an issue with the Unity software, so testing of this version had to stop there.
Photo: Tanya Humeniuk. Ethan Persyko testing out the second version of the Vector:Interact interface.
Photo: Tanya Humeniuk. Danielle Lottridge and Grace (Interactive Media Lab) attempting to connect 3D VR glasses to the Vector:Interact interface.
05.12.23
Today the workshop was open to friends and dancers to come out and test the Vector:Interact interface. Because of some initial technical difficulties setting up the interface, Danielle Lottridge remained on stage while the dancers tested it. As a result, the dancers were more aware of her presence and would stop to ask her questions, causing breaks in what might otherwise have been a greater sense of flow and immersion in the experience.
Photo: Tanya Humeniuk. Dancer testing the Vector:Interact interface.