The importance of being precise: using PsychoPy for stimulus presentation and OxySoft for triggering

You’ve got your NIRS device ready and have already thought out the protocol for your next experiment. You have a clear picture of it in your head: you can see the subjects, NIRS cap on, facing a PC monitor that shows each instruction and stimulus at exactly the timing you were aiming for. Timestamps for each event are saved automatically, and once the session is over, you are ready to start the analysis. Sounds nice, right? How simple is it, though? As it turns out, quite simple. You just need the right tools, and the combination of PsychoPy and OxySoft is a perfect match!

 

 

What is PsychoPy?

PsychoPy is an open-source Python library specially designed for presenting stimuli in neuroscience and psychophysics experiments. It allows the researcher to synthesize and present auditory and visual stimuli to subjects while simultaneously collecting their behavioral responses as well as brain or other physiological activity. The main appeal of PsychoPy lies in its ability to generate stimuli in real time: stimuli can not only be updated and presented to the subject on a frame-by-frame basis without loss of temporal precision, but can also be modified in real time based on the subject’s response.

PsychoPy offers the user different sub-modules for building the experiment, among which it is worth mentioning: a visual sub-module, with which one or more windows can be created for presenting instructions and visual stimuli; a sound sub-module, for auditory stimuli; and an event sub-module, for collecting the subject’s responses via mouse or keyboard and reacting to them if desired.

Among the many advantages of using PsychoPy, the most alluring one is certainly its temporal precision. Millisecond precision can be achieved provided that the experiment runs on appropriate hardware; more information on the effects of monitors, drivers, operating systems, keyboards, and audio can be found on the Timing issues and synchronization page of the PsychoPy documentation. By relying on very precise clocks on the host CPU, access to rapid communication ports, and double-buffered rendering, PsychoPy achieves a very precise timing mechanism based on the flip of new frames onto the screen. Hence, the user can choose the exact moment when a new frame will be updated and save its precise timing.
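As a concrete illustration, the draw-then-flip timing pattern described above can be sketched as a small helper function. This is a hypothetical function of our own, not from the post: `win` stands for a psychopy.visual.Window, `stim` for any PsychoPy stimulus, and `clock` for a psychopy.core.Clock.

```python
def show_and_timestamp(win, stim, clock):
    """Draw a stimulus, flip the window, and return the stimulus onset time.

    Hypothetical helper: `win` is a psychopy.visual.Window, `stim` any
    PsychoPy stimulus, and `clock` a psychopy.core.Clock.
    """
    stim.draw()             # render into the back buffer (double buffering)
    win.flip()              # buffer swap: the frame appears on the next screen refresh
    return clock.getTime()  # timestamp taken immediately after the flip
```

In a real script you would create the clock once with `clock = core.Clock()` and store the returned onset times alongside your trial data.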

While PsychoPy is designed for running experiments on Python, a similar toolbox named Psychtoolbox is available for Matlab and GNU Octave.

Generating stimuli with PsychoPy, an example

So, you might now be wondering how to set up your experiment with PsychoPy. PsychoPy’s official website offers many tutorials that guide you through the different modules and their functionalities, allowing you to implement your experiment design just as you want it. As an example, we chose a simple demo, which you can find explained in more detail at https://www.psychopy.org/coder/tutorial1.html. Here, a window element, where the visual stimuli will be displayed, is created. A grating stimulus and a fixation stimulus are generated and drawn, but they will only be displayed once we update the frame via the flip() command. To end the experiment, we wait for the subject to press any key and close the window. And voilà, that’s it!
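For reference, a minimal version of that demo might look like the sketch below (adapted from the linked tutorial; the stimulus parameters are illustrative, and it assumes PsychoPy is installed and a display is available):

```python
# Minimal sketch of the grating demo from the PsychoPy coder tutorial.
# Requires PsychoPy and a display; parameter values are illustrative.
try:
    from psychopy import visual, core, event
except ImportError:          # PsychoPy not installed in this environment
    visual = core = event = None

def run_demo():
    """Show a grating and a fixation point until any key is pressed."""
    win = visual.Window([400, 400])                       # window for the stimuli
    grating = visual.GratingStim(win, tex="sin", mask="gauss", sf=5)
    fixation = visual.GratingStim(win, tex=None, mask="circle",
                                  size=0.2, color="white")
    grating.draw()            # drawn into the back buffer...
    fixation.draw()
    win.flip()                # ...and displayed on the next frame flip
    event.waitKeys()          # wait for the subject to press any key
    win.close()
    core.quit()
```

Calling `run_demo()` presents the stimuli and blocks until a key press ends the session.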

 
An example script for generating grating stimuli in PsychoPy, taken from https://www.psychopy.org/coder/tutorial1.html


 

Synchronizing PsychoPy events with NIRS data collection: OxySoft triggers

Having implemented a basic experiment using PsychoPy, the only thing remaining is communicating with OxySoft, our recording and analysis software. OxySoft will allow us to add the timestamps for the presented stimuli and user responses to our collected data. Once we’ve configured Python to access the DCOM interface (see our blog post “How are we synchronizing Artinis NIRS devices with other data streams?” for details), we can easily send triggers to OxySoft. This enables us to synchronize the events happening in our experiment with the data streams we are collecting, greatly simplifying subsequent data analysis.

All we need to do in our script is import the Dispatch method from Python’s win32com library[1], connect to OxySoft, and create a COM object, which we’ll call “OxySoft”. Once this is done, all that’s left is to choose the moments in our experiment where we want to send the triggers, for example at the beginning and the end of the experiment. To do so, we call the OxySoft object we created and simply use the WriteEvent method, with an event key and a description of our choice as arguments. When the experiment runs, these events will be automatically added to our measurements in OxySoft.
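Put together, the trigger logic might look like the following sketch. Note that the COM ProgID "OxySoft.OxyApplication", the event keys "A"/"B", and the `send_trigger` helper are assumptions on our side; check the blog post linked above for the exact interface on your installation.

```python
# Hedged sketch: sending start/end triggers to OxySoft over its COM interface.
# Requires pywin32 and a running OxySoft instance (Windows only).
try:
    from win32com.client import Dispatch
except ImportError:          # pywin32 is Windows-only
    Dispatch = None

def send_trigger(oxysoft, key, description):
    """Write an event marker into the running OxySoft measurement."""
    oxysoft.WriteEvent(key, description)

def main():
    # "OxySoft.OxyApplication" is our assumption for the COM ProgID.
    oxysoft = Dispatch("OxySoft.OxyApplication")
    send_trigger(oxysoft, "A", "Start of experiment")
    # ... present the PsychoPy stimuli here ...
    send_trigger(oxysoft, "B", "End of experiment")

if __name__ == "__main__" and Dispatch is not None:
    main()
```

The triggers then appear as labeled events in the OxySoft measurement, aligned with the NIRS data stream.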

 

An example script generating grating stimuli with PsychoPy and sending events to OxySoft in the beginning and end of the experiment.

 

And finally, once your script is ready, running your experiment is just as simple as starting a measurement in OxySoft and hitting run on your Python console. PsychoPy and OxySoft will handle the rest for you! Would you like to try PsychoPy and OxySoft yourself? If you are struggling, please let one of our application specialists know and they will gladly help you further.

 
Example of triggers as received by OxySoft after running the Python script 5 times.


[1] To install it, type “pip install pywin32” or “pip install pypiwin32” in the command window or Anaconda prompt, as applicable. If you have multiple Python versions installed, you should instead type “py -3.7 -m pip install pywin32” or “py -3.7 -m pip install pypiwin32” (substituting your version number).


My name is María Sofía Sappia, MSc. As an Early Stage Researcher in the RHUMBO Marie Sklodowska-Curie ITN project, I will be conducting multimodal experiments within virtual reality environments to study emotions. I will combine narrative with spatial and audio-visual cues within these environments to elicit specific emotions and analyze the effects in both central and autonomic nervous systems. As a result, I expect to come up with a model of human emotions that identifies a subject’s emotional state at a given moment.

The RHUMBO project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 813234.

This blog post reflects only the author’s view and the Agency and the Commission are not responsible for any use that may be made of the information it contains.

 

References

Peirce, Jonathan W. “Generating Stimuli for Neuroscience Using PsychoPy.” Frontiers in Neuroinformatics 2 (2009): 10.

Peirce, Jonathan. “Timing Issues and Synchronization.” PsychoPy. University of Nottingham. Accessed June 9, 2020. https://www.psychopy.org/index.html.

 
 