Description
During a collaborative task, such as carrying a sofa or a table together with a co-actor, individuals commonly coordinate their motor behavior in space and time. Interestingly, coordination between partners can emerge even without verbal communication, as each partner observes the other's movements and/or the movements of the object they are manipulating. The study examined how social coupling between two individuals can emerge in collaborative tasks under different perceptual information conditions. A visual reference condition was compared with three conditions that added new types of real-time auditory feedback: effect-based, performance-based, and combined effect/performance-based auditory feedback. A novel paradigm was introduced in which participants' actions are seamlessly merged to control an object in a tablet computer application. Participants were required to synchronize their movements at a 90° phase difference and to adjust their finger dynamics so that the object (a ball) rotated accurately along a given circular trajectory on the tablet. The results suggest that various types of additional auditory information can alter interpersonal coordination in joint tasks.
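To make the 90° coordination target concrete, the sketch below shows one common way a continuous relative phase between two participants' movement signals could be estimated. The Hilbert-transform approach and the helper `relative_phase_deg` are illustrative assumptions for this note, not the analysis pipeline reported by Hwang et al. (2018).

```python
import numpy as np
from scipy.signal import hilbert

def relative_phase_deg(signal_a, signal_b):
    """Continuous relative phase between two movement time series, in degrees.

    Hypothetical helper: Hilbert-transform relative phase is a standard
    choice for interpersonal-coordination data, assumed here for illustration.
    """
    # Analytic signals give instantaneous phase for each mean-removed series.
    phase_a = np.angle(hilbert(signal_a - np.mean(signal_a)))
    phase_b = np.angle(hilbert(signal_b - np.mean(signal_b)))
    # Wrap the phase difference to the interval [-180, 180) degrees.
    dphi = np.rad2deg(phase_a - phase_b)
    return (dphi + 180.0) % 360.0 - 180.0

# Two simulated finger trajectories offset by a quarter cycle (the 90° target).
t = np.linspace(0.0, 10.0, 2000)
a = np.sin(2 * np.pi * 1.0 * t)
b = np.sin(2 * np.pi * 1.0 * t - np.pi / 2)
# Trim the edges to avoid Hilbert-transform boundary effects; mean is ~90.
print(np.mean(relative_phase_deg(a, b)[200:-200]))
```

A dyad holding a steady value near 90° would count as successful coordination in this task, whereas drift toward 0° (in-phase) or 180° (anti-phase) would indicate a pull toward the intrinsically more stable coordination modes.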
References
Hwang T-H, Schmitz G, Klemmt K, Brinkop L, Ghai S, Stoica M, Maye A, Blume H and Effenberg AO (2018) Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination. Front. Psychol. 9:404. doi: 10.3389/fpsyg.2018.00404
Keywords: auditory feedback, collaborative task, human-human interface, interpersonal coordination, movement sonification