Publications by year
In Press
Harris D, Arthur T, Kearse J, Olonilua M, Hassan EK, de Burgh T, Wilson M, Vine SJ (In Press). A comparison of live fire, 2D video, and virtual reality simulations for judgemental training in the military.
Abstract:
Simulation methods, including physical synthetic environments, already play a substantial role in human skills training in the military and are commonly used for developing situational awareness and judgemental skills. The rapid development of virtual reality technologies has provided a new opportunity for performing this type of training, but before VR can be adopted as part of mandatory training it should be subjected to rigorous tests of its suitability and effectiveness. In this work, we adopted established methods for testing the fidelity and validity of simulated environments to compare three different methods of judgemental training. Thirty-nine dismounted close combat troops from the UK’s Royal Air Force completed shoot/don’t-shoot judgemental tasks in: i) live-fire; ii) virtual reality; and iii) 2D video simulation conditions. A range of shooting accuracy and decision-making metrics were recorded from all three environments. The results showed that 2D video simulation posed little decision-making challenge during training. Decision-making performance across live fire and virtual reality simulations was comparable but the two may offer slightly different, and perhaps complementary, methods of training judgemental skills. Different types of simulation should, therefore, be selected carefully to address the exact training need.
Harris D, Arthur T, Broadbent D, Wilson M, Vine S, Runswick O (In Press). An active inference account of skilled anticipation in sport: Using computational models to formalise theory and generate new hypotheses. Sports Medicine
Abstract:
Optimal performance in time-constrained and dynamically changing environments depends on making reliable predictions about future outcomes. In sporting tasks, performers have been found to employ multiple information sources to maximize the accuracy of their predictions, but questions remain about how different information sources are weighted and integrated to guide anticipation. In this paper, we outline how predictive processing approaches, and active inference in particular, provide a unifying account of perception and action which explains many of the prominent findings in the sports anticipation literature. Active inference proposes that perception and action are underpinned by the organism's need to remain within certain stable states. To this end, decision making approximates Bayesian inference and actions are used to minimise future prediction errors during brain-body-environment interactions. Using a series of Bayesian neurocomputational models based on a partially observable Markov process, we demonstrate that key findings from the literature can be recreated from the first principles of active inference. In doing so, we formulate a number of novel and empirically falsifiable hypotheses about human anticipation capabilities which could guide future investigations in the field.
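For readers unfamiliar with this class of model, the short sketch below illustrates the basic Bayesian step such accounts build on: a contextual prior over an opponent's likely action is combined with noisy kinematic evidence. It is a minimal illustration only; the action labels, probabilities, and cue-strength values are invented, and the paper's own models are richer partially observable Markov formulations.

```python
import numpy as np

# Hypothetical anticipation problem: infer which shot an opponent will play
# from a contextual prior (e.g. score-line tendencies) and a noisy kinematic cue.
actions = ["cross-court", "down-the-line"]

# Prior belief from context (invented numbers for illustration).
prior = np.array([0.7, 0.3])

def kinematic_likelihood(cue_strength):
    """p(cue | action) for each action: 0 = uninformative cue,
    1 = fully diagnostic of the second action ('down-the-line')."""
    return np.array([1.0 - cue_strength, cue_strength])

def posterior(prior, likelihood):
    """Bayes' rule: posterior is proportional to likelihood times prior."""
    unnormalised = likelihood * prior
    return unnormalised / unnormalised.sum()

# Early in the action kinematics are ambiguous, so the contextual prior dominates;
# later, the kinematic cue dominates and can overturn the prior.
for label, strength in [("early cue", 0.55), ("late cue", 0.95)]:
    post = posterior(prior, kinematic_likelihood(strength))
    print(label, dict(zip(actions, np.round(post, 3))))
```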
Harris D, Wilson M, Jones MI, de Burgh T, Mundy D, Arthur T, Olonilua M, Vine SJ (In Press). An investigation of feed-forward and feed-back eye movement training in immersive virtual reality.
Abstract:
The control of eye gaze is critical to the execution of many skills. The observation that task experts in many domains exhibit more efficient control of eye gaze than novices has led to the development of gaze training interventions that teach these behaviours. We aimed to extend this literature by i) examining the relative benefits of feed-forward (observing an expert’s eye movements) versus feed-back (observing your own eye movements) training, and ii) automating this training within virtual reality. Serving personnel from the British Army and Royal Navy were randomised to either feed-forward or feed-back training within a virtual reality simulation of a room search and clearance task. Eye movement metrics – including visual search, saccade direction, and entropy – were recorded to quantify the efficiency of visual search behaviours. Feed-forward and feed-back eye movement training produced distinct learning benefits, but both accelerated the development of efficient gaze behaviours. However, we found no evidence that these more efficient search behaviours transferred to better decision making in the room clearance task. Our results suggest integrating eye movement training principles within virtual reality training simulations may be effective, but further work is needed to understand the learning mechanisms.
Harris D, Wilson M, Jones M, de Burgh T, Mundy D, Arthur T, Olonilua M, Vine S (In Press). An investigation of feed-forward and feed-back eye movement training in immersive virtual reality. Journal of Eye Movement Research
Harris D, Arthur T, de Burgh T, Duxbury M, Lockett-Kirk R, McBarnett W, Vine S (In Press). Assessing expertise using eye tracking in a Virtual Reality flight simulation. The International Journal of Aerospace Psychology
Harris D, Donaldson R, Bray M, Arthur T, Wilson M, Vine SJ (In Press). Attention computing for enhanced visuomotor skill performance: Testing the effectiveness of gaze-adaptive cues in virtual reality golf putting.
Abstract:
This work explored how immersive technologies like virtual reality can be exploited for improved motor learning. While virtual reality is becoming a practical replacement for training that is otherwise expensive, dangerous, or inconvenient to deliver, virtual simulations can also enhance the learning process. Based on the concept of ‘attention computing’, we developed and tested a novel ‘gaze-adaptive’ training method within a virtual putting environment augmented with eye and motion tracking. Novice golfers were randomly assigned to either standard putting practice in virtual reality (control) or gaze-adaptive training conditions. For gaze-adaptive training, the golf ball was sensitive to the participant’s gaze and illuminated when fixated upon, to prompt longer and more stable pre-shot fixations. We recorded the effect of these training conditions on task performance, gaze control, and putting kinematics. Gaze-adaptive training was successful in generating more expert-like gaze control and putting kinematics, although this did not transfer to improved performance outcomes within the abbreviated training paradigm. These findings suggest that gaze-adaptive environments can enhance visuomotor learning and may be a promising method for augmenting virtual training environments.
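As a rough illustration of how a gaze-contingent cue of this kind can be implemented, the sketch below accumulates gaze dwell time on the ball and switches a highlight on once a threshold is passed. The radius, dwell threshold, and frame rate are assumed values for illustration, not the parameters used in the study.

```python
import math

# Minimal sketch of a gaze-contingent cue, in the spirit of the adaptive
# training described above (radius and dwell threshold are invented; a real
# VR implementation would run this check on every rendered frame).
GAZE_RADIUS_M = 0.05        # how close gaze must fall to the ball centre
DWELL_THRESHOLD_S = 0.15    # how long gaze must dwell before the cue appears

def update_cue(gaze_xy, ball_xy, dwell_s, dt):
    """Return (illuminate, new_dwell): accumulate dwell time while gaze is on
    the ball and switch the highlight on once the dwell threshold is reached."""
    on_ball = math.dist(gaze_xy, ball_xy) <= GAZE_RADIUS_M
    dwell_s = dwell_s + dt if on_ball else 0.0
    return dwell_s >= DWELL_THRESHOLD_S, dwell_s

# Example: 60 Hz frames with gaze converging onto a ball at the origin.
dwell = 0.0
for frame, gaze in enumerate([(0.20, 0.0), (0.04, 0.01), (0.01, 0.0), (0.00, 0.0),
                              (0.01, 0.01), (0.02, 0.0), (0.00, 0.0), (0.01, 0.0),
                              (0.00, 0.0), (0.00, 0.01), (0.01, 0.0), (0.00, 0.0)]):
    lit, dwell = update_cue(gaze, (0.0, 0.0), dwell, dt=1/60)
    if lit:
        print(f"cue illuminated at frame {frame}")
        break
```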
Harris D, Arthur T, Wilson M, Vine SJ (In Press). Eye tracking for affective computing in virtual reality healthcare applications.
Abstract:
This paper examines opportunities and challenges associated with using eye tracking as a sensory system for affective computing in extended reality (XR) environments. Affective computing is a rapidly growing field that aims to develop computing systems capable of recognizing, interpreting, and responding to human emotions. Eye tracking has several potential benefits for improving the detection of emotions, including its ability to unobtrusively monitor affective states in real time, and its sensitivity to a variety of affective states. This paper introduces affective computing, explains eye tracking methodologies, and describes how eye tracking can be used to detect attentional and affective states, realizing concepts such as virtual exposure therapy and adaptive virtual reality. The paper discusses several potential applications of eye tracking for affective computing in healthcare settings.
Harris D, Arthur T, Wilson M, Le Gallais B, Parsons T, Dill A, Vine SJ (In Press). Impaired updating of predictive eye movements during anxious states.
Abstract:
The impact of clinical anxiety on learning and decision-making is well-established. However, the influence of temporary anxiety states on optimal learning and belief updating in healthy individuals remains less explored. In this study, we investigated how anxious states affect the process of forming and revising sensorimotor predictions. Study participants engaged in a virtual reality interceptive task while we manipulated performance incentives to induce situational pressure. We then assessed changes in physiological arousal, self-reported anxiety levels, task performance, and eye movement patterns. Employing Bayesian computational models of perception, we analysed how quickly predictive eye movements were adjusted across multiple trials. The results revealed that heightened anxiety led to a slower rate of updating predictive eye movements, accompanied by an increase in visual exploration of the environment. These findings deepen our understanding of how emotional states, like anxiety, interact with active inference behaviours. Specifically, they highlight the limitation imposed on updating predictive sensorimotor behaviours during anxious conditions. We discuss the implications of these findings within the context of theoretical frameworks such as the free energy principle, which conceptualises anxiety as a state of internal entropy that organisms seek to alleviate.
Arthur T, Vine S, Brosnan M, Buckingham G (In Press). Predictive Sensorimotor Control in Autism. Brain: a journal of neurology
Harris D, Arthur T (In Press). Predictive eye movements are adjusted in a Bayes-optimal fashion in response to unexpectedly changing environmental probabilities. Cortex
Arthur T, Harris D (In Press). Predictive eye movements are adjusted in a Bayes-optimal fashion in response to unexpectedly changing environmental probabilities.
Abstract:
This paper examines the application of active inference to naturalistic visuomotor control. Active inference proposes that actions serve to minimise future prediction errors and are dynamically adjusted according to uncertainty about sensory information, predictions, or the environment. We investigated whether predictive gaze behaviours are indeed adjusted in this Bayes-optimal fashion during a virtual racquetball task. In this task, participants intercepted bouncing balls with varying levels of elasticity, under conditions of high and low environmental volatility. Participants’ gaze patterns differed between stable and volatile conditions in a manner consistent with generative models of Bayes-optimal behaviour. Partially observable Markov models also revealed an increased rate of associative learning in response to unpredictable shifts in environmental probabilities, although there was no overall effect of volatility on this parameter. Findings extend active inference frameworks into complex and unconstrained visuomotor tasks and present important implications for a neurocomputational understanding of the visual guidance of action.
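The volatility effect described here can be illustrated with a much simpler stand-in for the paper's models: a one-dimensional Kalman-style learner whose assumed volatility (process noise) determines its effective learning rate. The sketch below is illustrative only; the parameter values and the mid-block bounciness switch are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def kalman_track(observations, process_var, obs_var=0.05, init_mean=0.5, init_var=1.0):
    """Track a hidden quantity (e.g. ball 'bounciness') with a 1-D Kalman filter.
    process_var plays the role of assumed environmental volatility: larger values
    keep posterior uncertainty high, which keeps the effective learning rate high."""
    mean, var = init_mean, init_var
    means, gains = [], []
    for y in observations:
        var += process_var                 # predict: uncertainty grows with volatility
        gain = var / (var + obs_var)       # effective learning rate
        mean += gain * (y - mean)          # update toward the prediction error
        var *= (1 - gain)
        means.append(mean)
        gains.append(gain)
    return np.array(means), np.array(gains)

# Hidden bounciness switches halfway through the block (a 'volatility' event).
true_bounce = np.r_[np.full(30, 0.2), np.full(30, 0.8)]
obs = true_bounce + rng.normal(0, 0.05, size=true_bounce.size)

for label, pv in [("low assumed volatility", 1e-4), ("high assumed volatility", 1e-2)]:
    means, gains = kalman_track(obs, process_var=pv)
    print(f"{label}: mean learning rate = {gains.mean():.2f}, "
          f"estimate 5 trials after the switch = {means[34]:.2f}")
```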
Harris D, Arthur T, Vine SJ, Liu J, Rahman HRA, Han F, Wilson M (In Press). Task-evoked pupillary responses track precision-weighted prediction errors and learning rate during interceptive visuomotor actions.
Abstract:
In this study we examined the relationship between physiological encoding of surprise and the learning of anticipatory eye movements. Active inference portrays perception and action as interconnected inference processes driven by the imperative to minimise the surprise of sensory observations. To examine this characterisation of oculomotor learning during a hand-eye coordination task, we tested whether anticipatory eye movements were updated in accordance with Bayesian principles and whether learning rates tracked pupil dilation as a marker of ‘surprise’. Forty-four participants completed an interception task in immersive virtual reality that required them to hit bouncing balls with either expected or unexpected bounce profiles. We recorded anticipatory eye movements known to index participants’ beliefs about likely ball bounce trajectories. By fitting a hierarchical Bayesian inference model to the trial-wise trajectories of these predictive eye movements, we were able to estimate each individual’s expectations about bounce trajectories, rates of belief updating, and precision-weighted prediction errors. We found that the task-evoked pupil response tracked prediction errors and learning rates but not beliefs about ball bounciness or environmental volatility. These findings are partially consistent with active inference accounts and shed light on how encoding of surprise may shape the control of action.
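A minimal sketch of the analysis logic, assuming simulated data throughout: a simple Bayesian learner yields trial-wise learning rates and precision-weighted prediction errors, which can then be correlated with a per-trial pupil measure. This is not the hierarchical model used in the paper, and the 'pupil' signal below is generated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the analysis logic: derive trial-wise learning rates and
# precision-weighted prediction errors (PWPEs) from a simple Bayesian learner,
# then correlate them with a per-trial pupil measure. All data are simulated.
n_trials = 120
bouncy = rng.random(n_trials) < 0.3          # 30% of balls have the unexpected profile
outcome = bouncy.astype(float)

mean, var, obs_var = 0.5, 1.0, 0.2
learning_rate, pwpe = np.zeros(n_trials), np.zeros(n_trials)
for t in range(n_trials):
    var += 0.01                              # assume some environmental volatility
    k = var / (var + obs_var)                # trial-wise learning rate
    error = outcome[t] - mean
    learning_rate[t] = k
    pwpe[t] = k * error                      # precision-weighted prediction error
    mean += pwpe[t]
    var *= (1 - k)

# Placeholder pupil signal: in the real analysis this would be the measured
# task-evoked pupil response; here it is simulated to covary with surprise.
pupil = 0.6 * np.abs(pwpe) + rng.normal(0, 0.1, n_trials)

print("corr(|PWPE|, pupil) =", np.round(np.corrcoef(np.abs(pwpe), pupil)[0, 1], 2))
print("corr(learning rate, pupil) =", np.round(np.corrcoef(learning_rate, pupil)[0, 1], 2))
```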
Arthur T, Vine SJ, Buckingham G, Brosnan M, Wilson M, Harris D (In Press). Testing predictive coding theories of autism spectrum disorder using models of active inference.
Abstract:
Several competing neuro-computational theories of autism have emerged from predictive coding models of the brain. These accounts have a common focus on the relationship between prior beliefs and sensory inputs as a mechanism for explaining key features of autism, yet they differ in exactly how they characterise atypicalities in perception and action. We tested these competing predictions using computational modelling of two datasets that allowed us to probe both visual and motor aspects of active inference: manual gripping forces during object lifting and anticipatory eye movements during a naturalistic interception task. We compared estimated belief trajectories between autistic and neurotypical individuals to determine the underlying differences in active inference. We found no evidence of chronic deficits in the use of priors or weighting of sensory information during object lifting. Differences in prior beliefs, rates of belief updating, and the precision weighting of prediction errors were, however, observed for anticipatory eye movements. Notably, we observed autism-related difficulties in flexibly adapting learning rates in response to environmental change (i.e. volatility). These findings suggest that aberrant encoding of precision and context-sensitive adjustments provide a better explanation of autistic perception than generic attenuation of priors or persistently high precision prediction errors.
Harris D, Vine SJ, Wilson M, Arthur T (In Press). The design and development of a virtual environment to measure eye movement indicators of prediction: Report on pilot testing.
Abstract:
This report describes the results of the design, development, and pilot testing of a virtual reality interception task. The task was designed to measure anticipatory eye movements as a way to index the evolution of probabilistic beliefs about the environment. We sought to validate the task as a way to measure predictions by manipulating statistics of the environment and determining whether eye movements tracked the changes in probability. During the task, the player was placed in a virtual squash court, where a ball was projected from one of two locations on the front wall. The player simply had to intercept the ball. We created conditions with a 90/10, 70/30, and 50/50 left/right probability split to examine whether the horizontal position of the eye just before the ball was released tracked these probabilities. Results indicated that anticipatory eye position was adjusted in response to these probabilities, but the effect was relatively weak. These results partially validate the task but also indicate that additional challenge or uncertainty may be needed to create a greater demand on correct prediction.
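One way to see why anticipatory gaze should shift with these probability splits is a simple Beta-Bernoulli learner, sketched below: the observer's posterior over p(left launch) maps onto a pre-onset gaze bias. The mapping from belief to gaze position is an assumption made for illustration, not the task's measurement model.

```python
import numpy as np

rng = np.random.default_rng(2)

def expected_gaze_bias(p_left_split, n_trials=100):
    """Illustrative Beta-Bernoulli learner for the pilot task: the observer updates
    a belief about p(left launch) trial by trial; pre-onset horizontal eye position
    is assumed to shift toward the more probable side in proportion to that belief."""
    launches_left = rng.random(n_trials) < p_left_split
    a, b = 1.0, 1.0                      # uniform Beta prior over p(left)
    biases = []
    for left in launches_left:
        p_left = a / (a + b)             # current posterior mean
        biases.append(2 * p_left - 1)    # -1 = fully right, +1 = fully left
        a += left
        b += not left
    return np.mean(biases[-50:])         # average bias once the belief has stabilised

for split in (0.9, 0.7, 0.5):
    left_pct = round(split * 100)
    print(f"{left_pct}/{100 - left_pct} condition: "
          f"predicted pre-onset gaze bias ≈ {expected_gaze_bias(split):+.2f}")
```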
Harris D, Arthur T, Vine SJ, Rahman HRA, Han F, Liu J, Wilson M (In Press). The effect of performance pressure and error-feedback on anxiety and performance in an interceptive task.
Abstract:
While the disruptive effects of anxiety on attention and performance have been well documented, the antecedents to anxiety in motivated performance scenarios are less well understood. We therefore sought to understand the cognitive appraisals that mediate the relationship between pressurised performance situations and the onset of anxiety. We tested the effects of performance pressure and error feedback on appraisals of the probability and cost of failure, the experience of anxiety, and subsequent impacts on visual attention, movement kinematics, and task performance during a virtual reality interception task. A series of linear mixed effects models indicated that failure feedback and situational pressure influenced moment-to-moment appraisals of the probability and cost of failure, which subsequently predicted the onset of anxious states. We did not, however, observe downstream effects on performance and attention. The findings support the predictions of Attentional Control Theory: Sport, that i) momentary errors lead to negative appraisals of the probability of future failure; and ii) appraisals of both the cost and probability of future failure are important predictors of anxiety. The results contribute to a better understanding of the precursors to anxiety and the feedback loops that may maintain anxious states.
Harris D, Vine SJ, Wilson M, Arthur T (In Press). The relationship between environmental statistics and predictive gaze behaviour during a manual interception task: Eye movements as active inference.
Abstract:
Human observers are known to frequently act like Bayes-optimal decision makers and there is growing evidence that the deployment of the visual system may similarly be driven by probabilistic mental models of the environment. We tested whether eye movements during a dynamic interception task were indeed optimised according to Bayesian inference principles. Forty-one participants intercepted oncoming balls in a virtual reality racquetball task across five counterbalanced conditions in which the relative probability of the onset location was manipulated. Analysis of pre-onset gaze positions indicated that eye position tracked the true distribution of onset location, indicating that the gaze system spontaneously adhered to environmental statistics. Eye position did not, however, minimise the distance between the target and foveal vision in a fully probabilistic way, and instead often reflected a ‘best guess’ about onset location. Trial-to-trial changes in gaze position were found to be better explained by Bayesian learning models (Hierarchical Gaussian Filter) than associative learning models. Additionally, parameters relating to the precision of beliefs and prediction errors extracted from the participant-wise models were related to both task-evoked pupil dilations and variability in gaze positions, providing further evidence that probabilistic context was reflected in spontaneous gaze dynamics.
2023
Arthur T, Vine SJ, Wilson M, Harris D (2023). The role of prediction and visual tracking strategies during manual interception: an exploration of individual differences.
2022
Savickaite S, Husselman T-A, Taylor R, Millington E, Hayashibara E, Arthur T (2022). Applications of virtual reality (VR) in autism research: current trends and taxonomy of definitions. Journal of Enabling Technologies, 16(2), 147-154.
Abstract:
Purpose: Recent work could further improve the use of VR technology by advocating the use of psychological theories in task design and highlighting certain properties of VR configurations and human–VR interactions. The variety of VR technology used in the trials prevents us from establishing a systematic relationship between the technology type and its effectiveness. As such, more research is needed to study this link, and our piece is an attempt to shed a spotlight on the issue. Design/methodology/approach: To explore recent developments in the field, the authors followed the procedures of the scoping review by Savickaite et al. (2022) and included publications from 2021 to 2022. Findings: In this updated analysis, it was clear that the research themes emerging over the last two years were similar to those identified previously. Social training and intervention work still dominates the research area, in spite of recent calls from the autism community to broaden the scientific understanding of neurodivergent experiences and daily living behaviours. Although autism is often characterised by difficulties with social interactions, it is just one part of the presentation. Sensory differences, motor difficulties and repetitive behaviours are also important facets of the condition, as well as various wider aspects of health, wellbeing and quality of life. However, many of these topics appear to be understudied in research on VR applications for autism. Originality/value: VR stands out from other representational technologies because of its immersion, presence and interactivity and has grown into its own niche. The question of what constitutes a truly immersive experience has resurfaced. We can no longer deny that VR has established itself in autism research. As the number of studies continues to grow, it is a perfect time to reconsider and update our notion of definitions of immersion and its reliance on hardware.
Arthur T, Brosnan M, Harris D, Buckingham G, Wilson M, Williams G, Vine S (2022). Investigating how Explicit Contextual Cues Affect Predictive Sensorimotor Control in Autistic Adults. J Autism Dev Disord
Abstract:
Research suggests that sensorimotor difficulties in autism could be reduced by providing individuals with explicit contextual information. To test this, we examined autistic visuomotor control during a virtual racquetball task, in which participants hit normal and unexpectedly-bouncy balls using a handheld controller. The probability of facing each type of ball was varied unpredictably over time. However, during cued trials, participants received explicit information about the likelihood of facing each uncertain outcome. When compared to neurotypical controls, autistic individuals displayed poorer task performance, atypical gaze profiles, and more restricted swing kinematics. These visuomotor patterns were not significantly affected by contextual cues, indicating that autistic people exhibit underlying differences in how prior information and environmental uncertainty are dynamically modulated during movement tasks.
Millington E, Hayashibara E, Arthur T, Husselman TA, Savickaite S, Taylor R (2022). Neurodivergent participatory action research for Virtual Reality (VR). Journal of Enabling Technologies, 16(2), 141-146.
Abstract:
Purpose: This paper aims to raise awareness of and argue for the use of participatory methods for the research and development of Virtual Reality (VR) applications designed for neurodivergent groups. This includes exploring why it is important to meaningfully include neurodivergent groups and the benefits their inclusion provides. Design/methodology/approach: VR is becoming increasingly widespread as a consumer product and interventional tool. It is vital for researchers and developers to embrace best practices in these early stages of using the technology, making certain that neurodivergent people have the best possible outcomes. Findings: The neurodivergent community is dissatisfied with many of the research directions currently being undertaken. This dissatisfaction arises from conflicting priorities between different stakeholders and the lack of input from the community. Participatory research brings neurodivergent people into the research process, whether as members of the research team or as consultants at key steps. Effective participatory research ensures that the priorities of the neurodivergent community are better incorporated in research, as well as enabling the development of more effective applications for VR. Originality/value: Participatory methods are unutilised in the development of applications aimed at neurodivergent people. By describing their use and utility in other areas, this article aims to encourage other VR researchers to take neurodivergent people on board.
Harris DJ, Arthur T, Vine SJ, Liu J, Abd Rahman HR, Han F, Wilson MR (2022). Task-evoked pupillary responses track precision-weighted prediction errors and learning rate during interceptive visuomotor actions. Scientific Reports, 12(1).
Abstract:
In this study, we examined the relationship between physiological encoding of surprise and the learning of anticipatory eye movements. Active inference portrays perception and action as interconnected inference processes, driven by the imperative to minimise the surprise of sensory observations. To examine this characterisation of oculomotor learning during a hand–eye coordination task, we tested whether anticipatory eye movements were updated in accordance with Bayesian principles and whether trial-by-trial learning rates tracked pupil dilation as a marker of ‘surprise’. Forty-four participants completed an interception task in immersive virtual reality that required them to hit bouncing balls that had either expected or unexpected bounce profiles. We recorded anticipatory eye movements known to index participants’ beliefs about likely ball bounce trajectories. By fitting a hierarchical Bayesian inference model to the trial-wise trajectories of these predictive eye movements, we were able to estimate each individual’s expectations about bounce trajectories, rates of belief updating, and precision-weighted prediction errors. We found that the task-evoked pupil response tracked prediction errors and learning rates but not beliefs about ball bounciness or environmental volatility. These findings are partially consistent with active inference accounts and shed light on how encoding of surprise may shape the control of action.
2021
Arthur T, Harris D, Allen K, Naylor C, Wood G, Vine S, Wilson M, Tsaneva-Atanasova K, Buckingham G (2021). Visuo-motor attention during object interaction in children with developmental coordination disorder. Cortex
Abstract:
Developmental coordination disorder (DCD) describes a condition of poor motor performance in the absence of intellectual impairment. Despite being one of the most prevalent developmental disorders, little is known about how fundamental visuomotor processes might function in this group. One prevalent idea is that children with DCD interact with their environment in a less predictive fashion than typically developing children. A metric of prediction which has not been examined in this group is the degree to which the hands and eyes are coordinated when performing manual tasks. To this end, we examined hand and eye movements during an object lifting task in a group of children with DCD (n=19) and an age-matched group of children without DCD (n=39). We observed no differences between the groups in terms of how well they coordinated their hands and eyes when lifting objects, nor in terms of the degree by which the eye led the hand. We thus find no evidence to support the proposition that children with DCD coordinate their hands and eyes in a non-predictive fashion. In a follow-up exploratory analysis we did, however, note differences in fundamental patterns of eye movements between the groups, with children in the DCD group showing some evidence of atypical visual sampling strategies and gaze anchoring behaviours during the task.
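For concreteness, one simple way to quantify the eye-hand lead described here is sketched below: the difference between when gaze first reaches the object and when the hand makes contact. The event definitions and timestamps are hypothetical and not the study's actual analysis pipeline.

```python
import numpy as np

def eye_hand_lead(gaze_arrival_times, hand_contact_times):
    """One simple way to quantify eye-hand coordination on a lifting task:
    positive values mean gaze reached the object before the hand (predictive);
    negative values mean gaze lagged the hand (reactive). Times are per-trial
    event timestamps in seconds; the event definitions are illustrative only."""
    lead = np.asarray(hand_contact_times) - np.asarray(gaze_arrival_times)
    return lead.mean(), lead.std()

# Hypothetical per-trial timestamps for one child (seconds from trial start).
gaze_arrival = [0.42, 0.51, 0.38, 0.47, 0.55]
hand_contact = [0.90, 0.97, 0.85, 0.93, 1.02]
mean_lead, sd_lead = eye_hand_lead(gaze_arrival, hand_contact)
print(f"eye leads hand by {mean_lead:.2f} s on average (SD {sd_lead:.2f} s)")
```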
2019
Arthur TG, Wilson MR, Moore LJ, Wylie LJ, Vine SJ (2019). Examining the effect of challenge and threat states on endurance exercise capabilities. Psychology of Sport and Exercise, 44, 51-59.
Abstract:
This paper presents the first two studies to explore the effect of challenge and threat states on endurance exercise capabilities. In study one, relationships between cardiovascular markers of challenge and threat states, ratings of perceived exertion (RPE), and exercise tolerance were explored during moderate- and severe-intensity cycling. Cardiovascular reactivity more reflective of a challenge state (i.e. relatively higher cardiac output and/or lower total peripheral resistance reactivity) predicted lower RPE throughout moderate- but not severe-intensity cycling. Building on these findings, study two experimentally manipulated participants into challenge, threat, and neutral groups, and compared 16.1 km time-trial performances, where pacing is self-regulated by RPE. Participants completed familiarisation, control, and experimental visits while physiological (oxygen uptake), perceptual (RPE), and performance-based (time to completion [TTC] and power output [PO]) variables were assessed. When compared to the threat group, the challenge group demonstrated cardiovascular responses more indicative of a challenge state, and delivered faster early-race pacing (PO) at similar RPE. Although there were no significant differences in TTC, results revealed that augmentations in PO for the challenge group were facilitated by tempered perceptions of fatigue. The findings suggest that an individual's pre-exercise psychophysiological state might influence perceived exertion and endurance exercise capabilities.
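Cardiovascular challenge/threat reactivity is often summarised with a single index combining cardiac output and total peripheral resistance reactivity; the sketch below shows one conventional scoring of that kind. It is not necessarily the index used in these studies, and the reactivity values are hypothetical.

```python
import numpy as np

def challenge_threat_index(co_reactivity, tpr_reactivity):
    """One common way of summarising challenge/threat reactivity (not necessarily
    the scoring used in these studies): z-score cardiac output (CO) and total
    peripheral resistance (TPR) reactivity across participants, reverse TPR, and
    sum them. Higher values indicate a response pattern closer to 'challenge'."""
    co_z = (co_reactivity - co_reactivity.mean()) / co_reactivity.std(ddof=1)
    tpr_z = (tpr_reactivity - tpr_reactivity.mean()) / tpr_reactivity.std(ddof=1)
    return co_z - tpr_z

# Hypothetical reactivity scores (change from baseline) for six participants.
co = np.array([0.8, 1.2, -0.1, 0.4, 1.5, 0.2])             # L/min
tpr = np.array([-40.0, -120.0, 60.0, 10.0, -90.0, 30.0])   # dyn·s/cm^5
print(np.round(challenge_threat_index(co, tpr), 2))
```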
Arthur T, Vine S, Brosnan M, Buckingham G (2019). Exploring how material cues drive sensorimotor prediction across different levels of autistic-like traits. Exp Brain Res, 237(9), 2255-2267.
Abstract:
Recent research proposes that sensorimotor difficulties, such as those experienced by many autistic people, may arise from atypicalities in prediction. Accordingly, we examined the relationship between non-clinical autistic-like traits and sensorimotor prediction in the material-weight illusion, where prior expectations derived from material cues typically bias one's perception and action. Specifically, prediction-related tendencies in perception of weight, gaze patterns, and lifting actions were probed using a combination of self-report, eye-tracking, motion-capture, and force-based measures. No prediction-related associations between autistic-like traits and sensorimotor control emerged for any of these variables. Follow-up analyses, however, revealed that greater autistic-like traits were correlated with reduced adaptation of gaze with changes in environmental uncertainty. These findings challenge proposals of gross predictive atypicalities in autistic people, but suggest that the dynamic integration of prior information and environmental statistics may be related to autistic-like traits. Further research into this relationship is warranted in autistic populations, to assist the development of future movement-based coaching methods.