We’re happy to announce the first public release of respyra, a Python toolbox for respiratory motor tracking experiments. The package is available now on PyPI and the accompanying preprint is up on PsyArXiv.

Interoception research has long focused on how people perceive internal signals like heartbeats. But the control of bodily signals – particularly how we voluntarily regulate and adapt our respiratory patterns through respiratory motor tracking – has received far less attention. respyra was built to open up this space.
The toolbox integrates a Vernier Go Direct Respiration Belt with PsychoPy to create a closed-loop breathing paradigm. Participants wear a wireless chest-mounted force sensor and follow a sinusoidal target dot on screen with their breathing. The display provides continuous color-coded feedback, going from green when tracking is accurate through to red when it drifts.
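The closed-loop logic described above can be sketched in a few lines. This is an illustrative mock-up, not respyra's actual API: the function names, the sine frequency, and the error-to-color mapping are all assumptions.

```python
import math

# Hypothetical sketch of the closed-loop feedback; names and parameters
# are illustrative, not respyra's actual API.

def target_position(t, freq_hz=0.1, amplitude=1.0):
    """Vertical position of the sinusoidal target dot at time t (seconds)."""
    return amplitude * math.sin(2 * math.pi * freq_hz * t)

def feedback_color(error, max_error=0.5):
    """Map tracking error to RGB: green when accurate, shading to red."""
    e = min(abs(error) / max_error, 1.0)  # normalize error to [0, 1]
    return (e, 1.0 - e, 0.0)              # (red, green, blue)

# A small tracking error yields a mostly green cue.
print(feedback_color(0.05))
```

On each display frame, the belt's force reading would be compared to `target_position(t)` and the dot recolored accordingly.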
What makes this more than a simple biofeedback tool is the visuomotor perturbation capability. Drawing on the reaching adaptation literature, respyra can amplify or attenuate the visual representation of the participant’s breathing without changing the target or the feedback. At a gain of 2.0, for example, every deviation from the target looks twice as large on screen, forcing the participant to halve their breathing amplitude to visually match the target. This creates a principled way to study respiratory motor recalibration.
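The gain manipulation reduces to scaling the displayed deviation from the target. A minimal sketch, assuming a simple linear gain (function names are hypothetical, not respyra's API):

```python
# Illustrative visuomotor gain perturbation: the on-screen cursor shows
# the deviation from the target scaled by a gain factor. Names are
# hypothetical, not respyra's actual API.

def displayed_position(target, breath, gain=1.0):
    """Render the cursor at the target plus a gain-scaled deviation."""
    return target + gain * (breath - target)

# At gain 2.0, a breathing deviation of 0.2 units appears as 0.4 units
# on screen, so the participant must halve their amplitude to appear
# on-target.
print(displayed_position(0.0, 0.2, gain=2.0))  # 0.4
```

A gain below 1.0 attenuates the visual error instead, which is the other half of the perturbation space the post describes.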
A command-line plotting utility (respyra-plot) produces a six-panel summary figure for quick data review.
The preprint reports a single-participant validation study across 48 trials; the key findings are detailed there.

We’re releasing respyra at an early alpha stage deliberately. We believe the best way to build a useful tool is to get it into researchers’ hands early, collect feedback on what works and what doesn’t, and develop it collaboratively.
The roadmap is ambitious. In the near term, we plan to expand the condition library with box breathing, oddball paradigms, ramp conditions, and adaptive difficulty scaling. We're also organizing a multi-site validation study to properly characterize the task and establish normative data across a range of participants.
Further down the road, we envision expanding the analysis toolkit with time-frequency analysis, phase-locking metrics, and group-level visualization. We’re also interested in gamified versions of the task for pediatric and clinical samples, where sustained attention to a sinusoidal dot may not be the most engaging paradigm.
If your lab works on breathing, interoception, or respiratory psychophysiology, we'd welcome your involvement and are actively looking for collaborators.
You can install respyra today with pip install respyra, explore the documentation, and run a no-hardware demo to see the display in action:
pip install respyra
python -m respyra.demos.demo_display
The code is on GitHub under an MIT license. Issues, pull requests, and feature suggestions are all welcome.
To get in touch about collaboration, contact micah@cfin.au.dk.
If you use respyra in your research, please cite the preprint:
Allen, M. (2026). respyra: A General-Purpose Respiratory Tracking Toolbox for Interoception Research. PsyArXiv. https://osf.io/preprints/psyarxiv/wjuce_v1