Time Management Study

Abstract 

HCI research has shown that ambient cues (such as light) can help people manage time during meetings. Placing such cues in VR offers kinds of feedback not possible in the physical world, but research has not yet examined the efficacy of ambient cues for time management in VR meetings. This research explores how shared movement through an environment might affect meeting participants’ ability to manage time. To answer this question, our team designed a novel VR application built around a moving platform (the Time Barge) that travels through a virtual environment over the course of a meeting, displaying ambient cues to participants as they complete an experimental task. While data analysis is ongoing, preliminary results suggest that most participants did not notice time passing and were not distracted by the environment. Interventions like the Time Barge point to opportunities for VR meetings to support social dynamics that go beyond what is possible in the physical world.


Background

The shift to hybrid and remote work has created new challenges for meetings. VR meetings are an alternative that supports richer forms of embodied virtual copresence. HCI research has demonstrated that technology can help people manage social dynamics in the workplace, and in meetings specifically. One such intervention is making the dynamics between people more pronounced during meetings; another is supporting time management. These ideas have not yet been explored in VR, where the possibilities for meetings stretch well beyond what is available in a physical space. One of the challenges faced in meetings in general is time management.

Without someone designated to keep the group on track and on time, meetings can become awkward, and opportunities for relational communication can be disrupted when someone has to be the ‘bad guy’ who reminds the group of the time. In VR meetings, time can be managed through a range of ambient cues, including feedback mechanisms that would not be possible in the physical world. For example, VR meetings can enable participants to move through space and time simultaneously, taking advantage of our innate human ability to note the passage of time as we travel together (for instance, when having a conversation during a walk or a road trip). To explore this opportunity, collaborators from SFSU and UCSC have designed a VR meeting environment to study how moving environmental cues may help teams manage time in meetings.

Introduction

Our team was interested in how VR meeting environments could offer environmental cues to help with time management. We were especially interested in examining how teams might better manage meeting time when using a moving meeting platform that gradually glides past various environmental cues. To address this question, we designed a VR meeting environment that uses a flying barge as the platform on which meetings are conducted. The barge moves slowly through a modestly detailed landscape, providing subtle environmental cues that may help the team keep track of time.

Hypothesis

Ambient cues of a moving environment will correlate with less “time-talk” and more effective time management.

Methods 

• Experimental validation

• Research-through-design

• Field study of ‘technological probes’

Experimental Psychology Research: 

  • Hidden Profiles Paradigm protocol

    • Involves participants having different access to information about candidates in an evaluation scenario for a new hire (a sketch of this information split appears after this list).

    • Citation: 

      • Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: Biased information sampling during discussion. Journal of Personality and Social Psychology, 48, 1467–1478.

  • Moser et al.

    • (1) All steps take place in VR (to accentuate the time-management collaboration challenges for participants).

    • (2) An added step where participants cooperatively decide on the criteria they will use to evaluate candidates (to make the first step of the hidden profile paradigm a collaborative effort).
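To make the information split concrete, the sketch below illustrates, in Python, how candidate attributes could be divided into shared items every participant sees and unique items only one participant sees, following the hidden profile paradigm. The candidate names, attributes, and three-participant split are hypothetical placeholders, not the actual study materials.

```python
# Hypothetical sketch of a hidden profile information split for a
# three-participant candidate-evaluation task. Candidate names and
# attribute lists are placeholders, not the actual study materials.

CANDIDATES = {
    "Candidate A": {
        "shared": ["holds a commercial pilot license", "10 years of flight experience"],
        "unique": ["led a crew training program",        # seen only by participant 1
                   "strong record in emergency drills",  # seen only by participant 2
                   "mentored junior pilots"],            # seen only by participant 3
    },
    "Candidate B": {
        "shared": ["holds a commercial pilot license", "8 years of flight experience"],
        "unique": ["two minor safety incidents",
                   "limited night-flying hours",
                   "mixed peer reviews"],
    },
}

def reference_list(participant_index: int) -> dict:
    """Build the reference list one participant would read quietly at their podium.

    Each participant receives all shared items plus exactly one unique item
    per candidate; the unique items are never shown to the other participants.
    """
    refs = {}
    for name, info in CANDIDATES.items():
        refs[name] = info["shared"] + [info["unique"][participant_index]]
    return refs

if __name__ == "__main__":
    for p in range(3):
        print(f"Participant {p + 1}:", reference_list(p))
```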

Design Methods

Mozilla Hubs, a web-based VR chat room, can be used as a creative tool because of its customization capabilities. We created the scene where the protocol will take place using the scene builder (editor). The platform (the barge), along with the other 3D items used in the experiment, was first constructed by our team in Blender; the barge itself is triangular. During the first phase, participants will rank the knowledge, skills, and abilities they believe are most appropriate for an airline pilot; only three options will be available to them. They will then move on to phase two and, in accordance with their selections, proceed to their podiums, where they are instructed to read the reference lists quietly to themselves.

Following the hidden profile paradigm, each reference list will contain hidden pieces of information that are not presented to the other participants. Participants will not be told of this. As they continue to phase three, they are expected to recall the information presented to each of them and discuss it with the group as they review the candidates’ resumes. We wonder whether the hidden pieces of information will be brought up.

We designed an environment that the barge travels through, signaling the passage of time. A clock will still be present at the center of the barge, allowing participants to keep track of time if they choose to. Participants start in the hangar, where they practice getting comfortable with the navigational features. Even though research-through-design will be more prominent in later phases of research, early RtD exploration and reflection helped our iterative process in creating the environment. Participants will enter the barge at the left-hand corner and navigate through the phases in a clockwise direction. Dividers section off each phase to give visual cues that participants should stay in each section and focus on the task at hand. As each phase completes, a participant will push a button to make the next phase appear. At the end of the exercise, they will hit the done button to finish, and they will then be asked to complete the post-survey.
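The clockwise, button-driven flow through the phases can be summarized as a simple state machine. The Python sketch below is only an illustration of that flow; the phase names and the `advance()` trigger are stand-ins for the buttons wired up in the Mozilla Hubs scene, not the actual scene scripts.

```python
# Minimal sketch of the button-driven phase progression on the barge.
# Phase names mirror the protocol described above; the button press is
# modeled as a simple advance() call rather than actual Hubs scene logic.

PHASES = [
    "hangar_practice",    # get comfortable with navigation
    "rank_ksas",          # rank knowledge, skills, and abilities (pick three)
    "read_references",    # read reference lists quietly at the podiums
    "group_discussion",   # recall and discuss information, review resumes
    "done",               # hit the done button, then take the post-survey
]

class BargeSession:
    def __init__(self):
        self.index = 0

    @property
    def current_phase(self) -> str:
        return PHASES[self.index]

    def advance(self) -> str:
        """Advance to the next phase when a participant pushes the button."""
        if self.index < len(PHASES) - 1:
            self.index += 1
        return self.current_phase

session = BargeSession()
while session.current_phase != "done":
    print("Current phase:", session.current_phase)
    session.advance()
print("Exercise complete; participants take the post-survey.")
```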

My Experience Moderating

We recruited students from SFSU who had a few months of experience using VR for a class.

We started by making sure we had all the programs running on our computers before starting the presentation for the participants. We recorded the session using OBS and held the meeting on Zoom. We then gave an introduction to the experience before guiding each participant through the setup and, eventually, the environment.

We were able to collect survey and interview data as well as observe the participants’ social interactions with one another.

We encountered a few technical challenges during one of the first sessions, when one participant wasn’t able to enter the controlled setting from their headset. We tried to troubleshoot the issue by having them reload the application and power the headset off and on, but it appeared to be an error within the participant’s headset. We ended up rescheduling that session with different participants and didn’t run into any issues again.

I noticed the environment at moments of transition from one section [task] to another because that’s when I had a moment to look around.
— Participant June83281

Key Findings

  • Some participants in the experimental condition noticed visual cues: flags with numbers, floating over water, and the approaching mountain range.

  • Interestingly, both the experimental and control groups included people who said they were "not paying attention to time."

  • In a few groups, one person took the lead on time management (in some way).

  • The control group noticed salient details about the environment.

Important Takeaways

  • Overall, 9 sessions have been completed: 7 experimental (movement) sessions and 2 control (non-movement) sessions.

  • Preliminary results suggest that most participants did not notice time passing.

  • The surrounding environment was not distracting to them; for those who did cite distraction, it was often in the context of technical issues or VR acclimation.

  • Time-talk events should be coded based on what people noticed in the environment, if anything.

  • Preliminary results also suggest that transitions between tasks should be coded based on whether people looked around the environment.

Next Steps Involve:

  • Behavioral coding of the video (a sketch of one possible condition comparison appears after this list)

  • Analysis of survey results

  • Finalizing data collection
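As one example of what the planned analysis could look like once behavioral coding is complete, the sketch below compares coded time-talk counts between the movement and non-movement conditions with a Mann-Whitney U test (via scipy). The counts shown are hypothetical placeholders, and the choice of test is an assumption rather than the team's finalized analysis plan.

```python
# Hypothetical sketch of the planned condition comparison: do movement
# (experimental) sessions show less "time-talk" than non-movement (control)
# sessions? Counts below are made-up placeholders, not real data.
from scipy.stats import mannwhitneyu

# Coded time-talk events per session (placeholder values).
movement_sessions = [2, 1, 3, 0, 2, 1, 2]   # 7 experimental sessions
non_movement_sessions = [4, 5]              # 2 control sessions

# One-sided test of the hypothesis that movement sessions have fewer
# time-talk events; a nonparametric test suits small session counts.
stat, p_value = mannwhitneyu(movement_sessions, non_movement_sessions,
                             alternative="less")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```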