Electrophysiological Signatures of Veridical Head Direction in Humans

Navigation is a core component of complex human cognition, and head direction information is crucial for orienting oneself in space. However, because most neuroimaging experiments require the head to be held in a fixed position, our understanding of how the human brain is tuned to veridical head direction signals, which involve physical rotation of the head, remains limited. To tackle this issue, Benjamin J. Griffiths et al. published the research article "Electrophysiological Signatures of Veridical Head Direction in Humans" in Nature Human Behaviour (https://doi.org/10.1038/s41562-024-01872-1).

Research Background

Humans primarily rely on physical head rotation to perceive head direction, yet previous understanding of the underlying neural code comes mainly from rodent studies. In rodents, individual neurons respond selectively to the animal's current head direction: these "head direction" cells fire when the animal faces a specific angle in the environment, regardless of where in the environment the animal is located. Disrupting head direction cells impairs spatial representation, implying that the representation of head direction is crucial for navigation.

Research Motivation

Although animal studies revealed the importance of head direction cells, studying this phenomenon directly in the human brain poses many challenges. Traditional imaging techniques (such as magnetoencephalography and magnetic resonance imaging) require the head to be fixed in place to minimize artifacts, which limits conclusions about active navigation. In contrast, electroencephalography (EEG) places few restrictions on body movement and can be used to measure brain activity during movement. Griffiths et al. therefore set out to investigate how the human brain is tuned to veridical head direction signals using EEG and intracranial EEG (iEEG).

Research Source

This study was carried out by Benjamin J. Griffiths (Department of Psychology, Ludwig-Maximilians-Universität München), Thomas Schreiner, Julia K. Schaefer, and colleagues from the Department of Psychology at Ludwig-Maximilians-Universität München, the Centre for Human Brain Health at the University of Birmingham, the Hospital of Ludwig-Maximilians-Universität, and the Epilepsy Centre at Ludwig-Maximilians-Universität. The article was submitted on 12 July 2023, accepted on 22 March 2024, and published online in Nature Human Behaviour.

Experimental Design and Research Process

The study comprised two experiments with 32 and 20 healthy participants, respectively, as well as 10 epilepsy patients. The healthy participants completed a series of head-direction tasks, which required them to physically rotate their heads toward target locations, while EEG and motion tracking were recorded. The patients completed the tasks while undergoing iEEG recording and motion tracking.

Experiment Processes

  1. Experiment One: Orientation Tasks

    • A total of 32 participants completed four head-direction tasks: unprompted rotation, auditory-prompted rotation, an eye-movement task, and rotation without visual input. A forward encoding model (FEM) with cross-validation was used to predict EEG activity from head direction.
    • Task Details:
      • Unprompted Rotation: guided only by visual cues, participants rotated their heads toward the specified screen;
      • Auditory-Prompted Rotation: the corresponding animal sound was played as the visual cue moved, prompting participants to rotate their heads;
      • Eye-Movement Task: participants moved only their eyes to the target screen, keeping the head still;
      • Rotation Without Visual Input: participants wore virtual-reality goggles showing only a fixation point and rotated their heads in response to sound prompts.
  2. Experiment Two: Comparison of Standing and Sitting Postures

    • A total of 20 participants completed head-direction tasks at different positions, either standing or sitting. The main aim was to replicate the results of the first experiment and to dissociate position-independent from position-specific head direction signals.

Data Processing & Analysis Methods

  • EEG Data Processing: EEG signals were high-pass and low-pass filtered, artifacts were removed, and the data were re-referenced; Independent Component Analysis (ICA) and Principal Component Analysis (PCA) were then used to retain the components least correlated with muscular activity (a minimal preprocessing sketch follows this list).
  • Motion Tracking Data Processing: Head movements were recorded using a head-mounted motion-tracking system, and the head's yaw (horizontal rotation) angles were extracted.
  • Eye Movement Tracking Data Processing: Eye movements were recorded using the Tobii Pro Spectrum system and filtered, and trials with artifacts were removed.
  • Forward Encoding Model (FEM): Head direction angles were projected onto basis sets of tuning curves, ridge regression was used to estimate the weights relating these to EEG activity, and a linear mixed-effects model (LME) was then used to evaluate electrophysiological tuning across head directions (see the encoding-model sketch after this list).
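
The preprocessing steps described above can be illustrated with a minimal MNE-Python sketch. The file name, filter cut-offs, number of ICA components, and excluded-component indices below are illustrative assumptions rather than the authors' exact parameters.

```python
# Minimal EEG preprocessing sketch (assumed parameters; not the authors' exact pipeline).
import mne

# Load raw EEG (hypothetical file name).
raw = mne.io.read_raw_brainvision("sub-01_task-rotation.vhdr", preload=True)

# High-pass and low-pass filtering (cut-offs assumed).
raw.filter(l_freq=0.1, h_freq=40.0)

# Re-reference to the common average.
raw.set_eeg_reference("average")

# ICA to isolate and remove components dominated by ocular/muscular artifacts.
ica = mne.preprocessing.ICA(n_components=20, random_state=42)
ica.fit(raw)
ica.exclude = [0, 1]                 # artifact components, chosen by inspection
raw_clean = ica.apply(raw.copy())

# Epoch around head-rotation onsets (event extraction is study-specific).
events, event_id = mne.events_from_annotations(raw_clean)
epochs = mne.Epochs(raw_clean, events, tmin=-0.5, tmax=1.5, baseline=None, preload=True)
```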
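The encoding-model idea can likewise be sketched: head direction angles are projected onto a circular basis set, ridge regression maps the resulting design matrix onto EEG sensor activity, and cross-validated model fit quantifies tuning strength. The basis shape, number of basis channels, and regularization strength below are illustrative assumptions, and the data are random placeholders.

```python
# Forward encoding model sketch: circular basis set + ridge regression (assumed parameters).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def circular_basis(angles_deg, n_channels=18, exponent=8):
    """Half-rectified raised-cosine basis; larger exponents give narrower tuning curves."""
    centers = np.linspace(0, 360, n_channels, endpoint=False)
    resp = np.cos(np.deg2rad(angles_deg[:, None] - centers[None, :]))
    resp[resp < 0] = 0.0                      # rectify so each basis function is local
    return resp ** exponent

# Placeholder data: head direction (yaw, degrees) per sample and EEG (samples x sensors).
rng = np.random.default_rng(0)
head_direction = rng.uniform(0, 360, size=5000)
eeg = rng.standard_normal((5000, 64))

X = circular_basis(head_direction)            # design matrix (samples x basis channels)
scores = []
for train, test in KFold(n_splits=5).split(X):
    model = Ridge(alpha=1.0).fit(X[train], eeg[train])
    pred = model.predict(X[test])
    # Tuning strength: correlation between predicted and observed activity per sensor.
    r = [np.corrcoef(pred[:, s], eeg[test][:, s])[0, 1] for s in range(eeg.shape[1])]
    scores.append(np.mean(r))

print(f"mean cross-validated fit: {np.mean(scores):.3f}")
```

In the study, model fits of this kind were then compared across participants and conditions with a linear mixed-effects model; that statistical step is omitted here.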

Major Research Findings

  1. EEG Tuning Results: EEG signals were significantly tuned to head direction, with the strongest effect at a tuning width of 20°. EEG activity still predicted head direction after controlling for visual signals and muscle activity.
  2. Location-Independent Signals: Signals independent of visual input and of changes in location were found over occipital and medial temporal regions, pointing to a phenomenon similar to the head direction cells of rodents.
  3. Anticipatory Timing: EEG tuning signals peaked about 120 ms before the physical head rotation, consistent with the anticipatory firing observed in animal studies (see the lag-sweep sketch after this list).
  4. Brain Area Sources: Source localization of the EEG and analysis of the iEEG identified the medial temporal lobe as a source of the head direction effects; parietal and frontal areas also showed tuning effects.
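
One way to expose such an anticipatory peak, sketched below under assumed parameters and placeholder data, is to shift the head direction regressor relative to the EEG over a range of lags and re-fit the encoding model at each lag; the lag with the best fit indicates how far the neural tuning leads (negative lag) or follows (positive lag) the physical rotation. This is an illustration of the idea, not the authors' exact procedure.

```python
# Lag-sweep sketch: re-fit the encoding model with the head direction regressor shifted
# relative to the EEG; a best fit at a negative lag indicates anticipatory tuning.
# All data and parameters here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge

fs = 100                                              # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
head_direction = rng.uniform(0, 360, size=5000)       # yaw angle per sample (placeholder)
eeg = rng.standard_normal((5000, 64))                 # samples x sensors (placeholder)

def circular_basis(angles_deg, n_channels=18, exponent=8):
    """Same half-rectified raised-cosine basis as in the encoding-model sketch above."""
    centers = np.linspace(0, 360, n_channels, endpoint=False)
    resp = np.cos(np.deg2rad(angles_deg[:, None] - centers[None, :]))
    resp[resp < 0] = 0.0
    return resp ** exponent

def fit_at_lag(lag_ms):
    shift = int(round(lag_ms / 1000 * fs))
    X = circular_basis(np.roll(head_direction, shift))  # crude shift; trim edges in practice
    half = len(X) // 2                                   # single train/test split for brevity
    pred = Ridge(alpha=1.0).fit(X[:half], eeg[:half]).predict(X[half:])
    return np.mean([np.corrcoef(pred[:, s], eeg[half:, s])[0, 1]
                    for s in range(eeg.shape[1])])

lags_ms = np.arange(-300, 301, 20)                    # negative = tuning precedes the rotation
fits = [fit_at_lag(lag) for lag in lags_ms]
print("best-fitting lag (ms):", lags_ms[int(np.argmax(fits))])
```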

Research Conclusions and Significance

This study is the first to detect electrophysiological signals correlated with veridical head direction in freely moving human participants, showing marked similarity to the activity of head direction cells in rodents. These findings extend our understanding of head direction coding in human spatial navigation and may aid the development of more naturalistic navigation and virtual reality applications.

Research Highlights

  • Through a clever experimental design and the application of a forward encoding model, the study overcame the head-fixation constraint of traditional imaging methods and revealed precise head direction tuning signals in humans for the first time.
  • After controlling for visual, auditory, and muscular activity, the study further isolated neural signals driven purely by head direction.
  • The experiments combined EEG and iEEG recordings, providing a chain of evidence from cortical electrical activity to deeper brain areas.

This research not only deepens our understanding of human spatial navigation but also helps advance neuroscience research in more naturalistic settings.