<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://markspan.github.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://markspan.github.io/" rel="alternate" type="text/html" /><updated>2026-02-03T10:53:02+00:00</updated><id>https://markspan.github.io/feed.xml</id><title type="html">Mark M. Span</title><subtitle>Psychophysiology and Behaviour Measurement Portfolio</subtitle><author><name>Mark M. Span</name></author><entry><title type="html">EVTPlugins, rewrite by Martin</title><link href="https://markspan.github.io/evtplugins/" rel="alternate" type="text/html" title="EVTPlugins, rewrite by Martin" /><published>2025-08-11T00:00:00+00:00</published><updated>2025-08-11T00:00:00+00:00</updated><id>https://markspan.github.io/evtplugins</id><content type="html" xml:base="https://markspan.github.io/evtplugins/"><![CDATA[<h1 id="pyevt---a-python-binder-for-the-event-exchanger-evt-2-usb-hardware">pyevt - A python binder for the Event-Exchanger EVT-2 USB hardware</h1>

<h2 id="1-about">1. About</h2>
<p>This repository contains the code to communicate with <em>EVT-2</em> USB devices, developed by the Research Support group of the Faculty of Behavioural and Social Sciences of the University of Groningen. The code was originally written by Eise Hoekstra and Mark M. Span and is now maintained by Martin Stokroos.</p>

<p>The <em>EVT-2</em> is an event marking and triggering device intended for physiological experiments.
<em>pyevt</em> is a Python module to communicate with <em>EVT-2</em> hardware (and derivatives).</p>

<h2 id="2-install">2. Install</h2>
<p>Install pyevt with:</p>

<p><code class="language-plaintext highlighter-rouge">pip install pyevt</code> or
<code class="language-plaintext highlighter-rouge">pip install --user pyevt</code> on managed computers.</p>

<h2 id="3-dependencies">3. Dependencies</h2>
<p>The <em>pyevt</em> library uses the <em>HIDAPI</em> Python module to communicate over USB according to the HID class:
<a href="https://pypi.org/project/hidapi/">https://pypi.org/project/hidapi/</a></p>

<h2 id="4-device-permission-for-linux">4. Device Permission for Linux</h2>
<p>On Linux (Ubuntu), permission to use EVT (HID) devices is granted by adding the following lines to a udev rules file, for example named <code class="language-plaintext highlighter-rouge">99-evt-devices.rules</code>, in <code class="language-plaintext highlighter-rouge">/etc/udev/rules.d</code>:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># /etc/udev/rules.d/99-evt-devices.rules

# All EVT devices
SUBSYSTEM=="usb", ATTR{idVendor}=="0004", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0008", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0009", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0114", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0208", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0308", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0408", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0508", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0604", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0808", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="0909", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="1803", MODE="0660", GROUP="plugdev"
SUBSYSTEM=="usb", ATTR{idVendor}=="1807", MODE="0660", GROUP="plugdev"
</code></pre></div></div>

<p>The user should be a member of the <code class="language-plaintext highlighter-rouge">plugdev</code> group.</p>

<p>Check with:</p>

<p><code class="language-plaintext highlighter-rouge">$ groups username</code></p>

<p>If this is not the case, add the user to the plugdev group by typing:</p>

<p><code class="language-plaintext highlighter-rouge">$ sudo usermod -a -G plugdev username</code></p>

<h2 id="5-python-coding-examples">5. Python coding examples</h2>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from pyevt import EventExchanger

myevt = EventExchanger()
# Get list of devices containing the partial string 'partial_device_name'
myevt.scan('partial_device_name') # The default is 'EventExchanger'.

# Create a device handle:
myevt.attach_name('partial_device_name') # Example: 'EVT02', 'SHOCKER' or 'RSP-12', etc. The default is 'EventExchanger'.

myevt.write_lines(0) # clear outputs
myevt.pulse_lines(170, 1000) # value=170, duration=1000ms

# remove device handle
myevt.close()

# connect RSP-12
myevt.attach_name('RSP-12')
myevt.wait_for_event(3, None) # wait for button 1 OR 2, timeout is infinite.
myevt.close() # remove device handle

</code></pre></div></div>
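<p>The first argument of <code class="language-plaintext highlighter-rouge">wait_for_event</code> is a bit mask in which bit <em>n−1</em> stands for button <em>n</em>, so the value 3 above selects buttons 1 and 2. A small helper to build such masks (the helper name is illustrative, not part of <em>pyevt</em>):</p>

```python
def button_mask(*buttons):
    """Combine button numbers (1-8) into the bit mask used by wait_for_event."""
    mask = 0
    for b in buttons:
        if not 1 <= b <= 8:
            raise ValueError("button numbers must be in 1-8")
        mask |= 1 << (b - 1)  # button n corresponds to bit n-1
    return mask

print(button_mask(1, 2))  # 3, as in the example above
```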

<h2 id="6-license">6. License</h2>
<p>The evt-plugins collection is distributed under the terms of the GNU General Public License 3.
The full license should be included in the file COPYING, or can be obtained from</p>

<p><a href="http://www.gnu.org/licenses/gpl.txt">http://www.gnu.org/licenses/gpl.txt</a></p>

<p>This plugin collection contains the work of others.</p>

<h2 id="7-documentation">7. Documentation</h2>
<p>Information about EVT-devices and OpenSesame plugins:</p>

<p><a href="https://markspan.github.io/evtplugins/">https://markspan.github.io/evtplugins/</a></p>

<h1 id="the-bss-research-support-opensesame-plugin-collection">The BSS Research Support OpenSesame Plugin Collection</h1>

<p><em>An OpenSesame plugin collection for sending stimulus synchronization triggers and response collection through Event Exchanger (EVT-2) USB-devices.</em></p>

<p>Copyright 2010-2024 Mark Span (<a href="mailto:m.m.span@rug.nl">m.m.span@rug.nl</a>), M. Stokroos (<a href="mailto:m.stokroos@rug.nl">m.stokroos@rug.nl</a>)</p>

<p>Contributions: This code is based on the work of Eise Hoekstra and Mark M. Span. It was debugged and rewritten for OpenSesame 4 by Martin Stokroos.</p>

<h2 id="1-about-1">1. About</h2>
<p>The BSS Research Support OpenSesame Plugin Collection for use with Event Exchanger (EVT-2) USB-devices.</p>

<p>EVT-devices and the associated plugins are developed by the <a href="https://myuniversity.rug.nl/infonet/medewerkers/profiles/departments/11422">Research Support</a> department of the Faculty of Behavioural and Social Sciences of the University of Groningen.</p>

<p>The currently supported OpenSesame version is: 4.</p>

<p>The following plugins are available:</p>

<table>
  <thead>
    <tr>
      <th>icon</th>
      <th>plugin</th>
      <th>Description</th>
      <th>OpenSesame back-end</th>
      <th>operating system</th>
      <th>Status</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/evt_trigger/evt_trigger_large.png" alt="" /></td>
      <td><em>evt_trigger</em></td>
      <td>plugin for generating triggers with the Event Exchanger EVT-2, 3 and 4 variants</td>
      <td>PyGame, PsychoPy</td>
      <td>Windows</td>
      <td>ok</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/response_box/response_box_large.png" alt="" /></td>
      <td><em>response_box</em></td>
      <td>plugin for all of the RSP12x button response box variants with 1-8 buttons</td>
      <td>PyGame, PsychoPy</td>
      <td>Windows</td>
      <td>ok</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/rsp_pygame/rsp_pygame_large.png" alt="" /></td>
      <td><em>rsp_pygame</em></td>
      <td>plugin for RSP12x button response box variants with 1-8 buttons</td>
      <td>PyGame</td>
      <td>Windows, Linux</td>
      <td>ok</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/tactile_stimulator/tactile_stimulator_large.png" alt="" /></td>
      <td><em>tactile_stimulator</em></td>
      <td>plugin for the Electrotactile Stimulator (SHK-1B) 0-5mA</td>
      <td>PyGame</td>
      <td>Windows</td>
      <td>ok</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/vas_evt/vas_evt_large.png" alt="" /></td>
      <td><em>vas_evt</em></td>
      <td>A Visual Analog Slider plugin controlled via an encoder knob connected to the EVT-2</td>
      <td>PyGame</td>
      <td>Windows</td>
      <td>planned</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/vas_gui/vas_gui_large.png" alt="" /></td>
      <td><em>vas_gui</em></td>
      <td>A Visual Analog Slider plugin controlled via the PC-mouse on a predefined canvas (sketchpad)</td>
      <td>PyGame</td>
      <td>Windows, Linux</td>
      <td>Mouse response not ok in Linux.</td>
    </tr>
    <tr>
      <td><img src="opensesame_plugins/evt_plugins/rgb_led_control/rgb_led_control_large.png" alt="" /></td>
      <td><em>rgb_led_control</em></td>
      <td>plugin for multi-color LED response boxes</td>
      <td>PyGame</td>
      <td>Windows</td>
      <td>not validated</td>
    </tr>
  </tbody>
</table>

<h3 id="package-dependencies">Package dependencies</h3>
<p>The plugins depend on the Python module <em>pyevt</em> and the underlying <em>hidapi</em> package.</p>

<p><a href="https://pypi.org/project/hidapi/">https://pypi.org/project/hidapi/</a></p>

<p><em>pyevt</em> and <em>hidapi</em> are installed from the Python Console in OpenSesame with the single command:</p>

<p><code class="language-plaintext highlighter-rouge">!pip install --user pyevt</code></p>

<p>NOTE: Currently, the plugin package is released as a pip package in a test environment. Either clone this repository and copy the plugins manually into your OpenSesame Python package folder, or install them temporarily from the command line in OpenSesame 4 with:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>!pip install --user hidapi
!pip install --user --index-url https://test.pypi.org/simple/ pyevt
!pip install --user --index-url https://test.pypi.org/simple/ evt-plugins
</code></pre></div></div>

<h3 id="environmental-settings">Environmental settings</h3>
<p>By default, the OpenSesame 4.0 plugins are installed as a Python site-package and are loaded automatically at startup.
When the plugins are located elsewhere, add their path to the Python path of OpenSesame in the <code class="language-plaintext highlighter-rouge">environment.yaml</code> file in the OpenSesame program directory (the OPENSESAME_plugin_PATH is old style). See the instructions here: <a href="https://rapunzel.cogsci.nl/manual/environment/">https://rapunzel.cogsci.nl/manual/environment/</a></p>
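<p>As a sketch of such an entry — assuming, as the linked manual describes, that <code class="language-plaintext highlighter-rouge">environment.yaml</code> sets environment variables, and using a hypothetical plugin path:</p>

```yaml
# environment.yaml in the OpenSesame program directory
# (hypothetical path; point this at the folder holding your copied plugins)
PYTHONPATH: "C:\\Users\\me\\opensesame_plugins"
```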

<h2 id="2-plugin-descriptions">2. Plugin Descriptions</h2>

<p><em>evt_trigger</em></p>

<p>Existing modes:</p>

<ul>
  <li>Clear output lines</li>
  <li>Write output line</li>
  <li>Invert output lines</li>
  <li>Pulse output lines</li>
</ul>

<p><em>response_box</em></p>

<p>Collects responses from a 1 to 8 button RSP-12x response box.</p>

<p>After the prepare phase of the plugin, a workspace variable <code class="language-plaintext highlighter-rouge">connected_device_plugin_instance_name</code> is created, which can be used to check whether the response box is actually detected and connected to the plugin.</p>

<p><em>rsp_pygame</em></p>

<p>This response-box plugin works for EVT devices as well as for joystick devices. It makes use of the pygame joystick API and is platform independent.</p>

<p><em>tactile_stimulator</em></p>

<p>The tactile_stimulator plugin operates in two modes, and usually two instances of the plugin are used in an OpenSesame experiment. Mode I, the <code class="language-plaintext highlighter-rouge">Calibration</code> mode, should always precede the <code class="language-plaintext highlighter-rouge">Stimulate</code> mode. In <code class="language-plaintext highlighter-rouge">Calibration</code> mode the upper limit of the stimulus current is set between 0 and 5 mA RMS. In <code class="language-plaintext highlighter-rouge">Stimulate</code> mode, a percentage of that upper limit is applied to the subject. The <code class="language-plaintext highlighter-rouge">Calibration</code> mode can also be used standalone, for instance to precondition the subject. The pulse duration can be extended up to 2000 ms.</p>
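<p>The two percentages combine linearly with the 5 mA full scale and the 0–255 byte range of the device. A sketch of the arithmetic, assuming strictly linear mappings (the function names are illustrative, not the plugin's API):</p>

```python
FULL_SCALE_MA = 5.0  # device maximum: byte 255 corresponds to 5 mA RMS

def calibration_to_milliamp(calibration_perc):
    """Upper stimulus limit in mA for a calibration percentage (0-100)."""
    return FULL_SCALE_MA * calibration_perc / 100.0

def pulse_byte(calibration_perc, stimulate_perc):
    """Byte value for a Stimulate-mode percentage of the calibrated limit."""
    return round(255 * (calibration_perc / 100.0) * (stimulate_perc / 100.0))

print(calibration_to_milliamp(60))  # 3.0 mA upper limit
print(pulse_byte(100, 100))         # 255, the full-scale byte
```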

<p>After the prepare phase of the plugin, a workspace variable <code class="language-plaintext highlighter-rouge">connected_device_plugin_instance_name</code> is created, which can be used to check whether the tactile-stimulator device is actually detected and connected to the plugin.</p>

<p>Below is a list of the variables that appear in the OpenSesame variable inspector when using the tactile_stimulator plugin:</p>

<table>
  <thead>
    <tr>
      <th>variable name</th>
      <th>description</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_calibration_perc</code></td>
      <td>The percentage of the slider setting for the stimulus current of up to 5mA rms max.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_calibration_milliamp</code></td>
      <td>The calibration value of the stimulus current in mA. This is the maximum current applied to the subject.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_calibration_value</code></td>
      <td>The byte value representation of the calibrated current.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_pulse_milliamp</code></td>
      <td>The actual current in mA applied to the subject when pulsing.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_pulse_value</code></td>
      <td>The actual byte value representation that is sent to the tactile stimulator.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_pulse_duration_ms</code></td>
      <td>The pulse duration time in ms.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">tactstim_time_last_pulse</code></td>
      <td>Unique time stamp in seconds from the moment of the shock.</td>
    </tr>
  </tbody>
</table>

<p><em>vas_evt</em></p>

<p>A Visual Analog Slider plugin controlled by an EVT rotary or linear encoder.
The <em>vas_evt</em> plugin does not work standalone, but requires a link to a custom-designed sketchpad screen.</p>

<p><em>vas_gui</em></p>

<p>A Visual Analog Slider plugin. The <em>vas_gui</em> plugin does not work standalone, but requires a link to a custom-designed sketchpad screen with an analog slider design!</p>

<p>Below is the list of variables that appear in the OpenSesame variable inspector when using the vas_gui plugin:</p>

<table>
  <thead>
    <tr>
      <th>variable name</th>
      <th>description</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">vas_response</code></td>
      <td>This value is the reading from the VAS object, ranging from 0 to 100.</td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">vas_response_time</code></td>
      <td>This is the response time in ms. The value -1 means that the timeout period was reached.</td>
    </tr>
  </tbody>
</table>
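<p>In an OpenSesame inline script, the timeout case can be told apart from a real response by checking for the -1 sentinel (a minimal sketch; the two variables come from the plugin as described above):</p>

```python
def vas_outcome(vas_response, vas_response_time):
    """Return None on timeout, else the (rating, response-time-ms) pair."""
    if vas_response_time == -1:  # timeout sentinel, see the table above
        return None
    return (vas_response, vas_response_time)

print(vas_outcome(42, 1350))  # (42, 1350)
print(vas_outcome(0, -1))     # None
```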

<p><em>rgb_led_control</em></p>

<p>This plugin works for the RSP-LT device, a response-box with RGB-controlled LED buttons.</p>

<h2 id="3-license">3. LICENSE</h2>
<p>The evt-plugins collection is distributed under the terms of the GNU General Public License 3.
The full license should be included in the file COPYING, or can be obtained from</p>

<p><a href="http://www.gnu.org/licenses/gpl.txt">http://www.gnu.org/licenses/gpl.txt</a></p>

<p>This plugin collection contains the work of others.</p>

<h2 id="4-documentation">4. Documentation</h2>
<p>Installation instructions and documentation on OpenSesame are available on the documentation website:</p>

<p><a href="http://osdoc.cogsci.nl/">http://osdoc.cogsci.nl/</a></p>

<p>Evt-plugin information:</p>

<p><a href="https://markspan.github.io/evtplugins/">https://markspan.github.io/evtplugins/</a></p>]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="EventExchanger" /><category term="ButtonBox" /><category term="coding" /><category term="opensesame" /><category term="python" /><category term="plugin" /><summary type="html"><![CDATA[pyevt - A python binder for the Event-Exchanger EVT-2 USB hardware]]></summary></entry><entry><title type="html">LSL - Polar H10</title><link href="https://markspan.github.io/Polar/" rel="alternate" type="text/html" title="LSL - Polar H10" /><published>2023-11-23T00:00:00+00:00</published><updated>2023-11-23T00:00:00+00:00</updated><id>https://markspan.github.io/Polar</id><content type="html" xml:base="https://markspan.github.io/Polar/"><![CDATA[<h1 id="polarband2lsl"><a href="https://github.com/markspan/PolarBand2lsl">PolarBand2lsl</a></h1>

<h2 id="polarband2lsl-1">PolarBand2lsl</h2>
<p>Send Polar H10 band data to an <a href="https://github.com/sccn/labstreaminglayer">LSL</a> stream.</p>

<h2 id="manual">Manual:</h2>
<ol>
  <li><strong>Install <a href="https://www.anaconda.com/">Python</a>:</strong>
    <ul>
      <li>If not installed, download and install Python.</li>
    </ul>
  </li>
  <li><strong>Open an Anaconda Prompt:</strong>
    <ul>
      <li>Open an Anaconda prompt and execute the following commands:
        <div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>pip <span class="nb">install </span>pylsl <span class="nt">--user</span>
pip <span class="nb">install </span>bleak <span class="nt">--user</span>
</code></pre></div>        </div>
        <p><em>This installs <a href="https://pypi.org/project/pylsl/">pylsl</a> for LSL support and <a href="https://bleak.readthedocs.io/en/latest/">bleak</a> for Bluetooth Low Energy communication.</em></p>
      </li>
    </ul>
  </li>
  <li><strong>Platform Compatibility:</strong>
    <ul>
      <li>This solution is expected to work on Windows, macOS, and Linux, since <strong>bleak</strong> is used for Bluetooth LE communication.</li>
    </ul>
  </li>
  <li><strong>Configure the Code:</strong>
    <ul>
      <li>Change to the directory with this code. Update the MAC address in the code to match the MAC address of your band.</li>
    </ul>
  </li>
  <li><strong>Run the Stream:</strong>
    <ul>
      <li>Execute the following command:
        <div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python Polar2LSL
</code></pre></div>        </div>
        <p><em>This starts the LSL stream.</em></p>
      </li>
    </ul>
  </li>
  <li><strong>Recording:</strong>
    <ul>
      <li>Record the stream with <a href="https://github.com/labstreaminglayer/App-LabRecorder/releases">Labrecorder</a>.</li>
    </ul>
  </li>
  <li><strong>Sample Script for Peak Detection:</strong>
    <ul>
      <li>A sample script for peak detection is provided, based on <a href="https://nl.mathworks.com/help/wavelet/ug/r-wave-detection-in-the-ecg.html">Matlab Documentation</a>.
        <ul>
          <li>This script uses the xdf import module of LabStreamingLayer (<a href="https://github.com/xdf-modules/xdf-Matlab">https://github.com/xdf-modules/xdf-Matlab</a>) and the ‘findpeaks’ function from the MATLAB Signal Processing Toolbox.</li>
        </ul>
      </li>
    </ul>
  </li>
</ol>

<p><img src="https://user-images.githubusercontent.com/4105112/110318793-40345100-800e-11eb-9f86-872d7848a1ac.png" alt="Screenshot 2021-02-25 115853" /></p>

<ol start="8">
  <li><strong>GUI Version:</strong>
    <ul>
      <li>Additionally, I’ve developed a GUI version that wraps around the code using Kivy. This GUI offers several advantages, such as the ability to discover nearby Polar bands. You can find the <a href="https://github.com/markspan/PolarBand2lsl/releases/tag/v1.0.0">release</a> for Windows only.</li>
    </ul>
  </li>
</ol>

<h1 id="stolen-from">Stolen from:</h1>
<p><a href="https://towardsdatascience.com/creating-a-data-stream-with-polar-device-a5c93c9ccc59">Pareeknikhil</a></p>]]></content><author><name>Mark M. Span</name></author><category term="labstreaminglayer" /><category term="heartrate" /><category term="synchronisation" /><category term="coding" /><summary type="html"><![CDATA[PolarBand2lsl]]></summary></entry><entry><title type="html">EEGLAB (elc) Files for the 64 channel Waveguard Caps</title><link href="https://markspan.github.io/Waveguard/" rel="alternate" type="text/html" title="EEGLAB (elc) Files for the 64 channel Waveguard Caps" /><published>2023-06-27T00:00:00+00:00</published><updated>2023-06-27T00:00:00+00:00</updated><id>https://markspan.github.io/Waveguard</id><content type="html" xml:base="https://markspan.github.io/Waveguard/"><![CDATA[<h1 id="elc-files">ELC files</h1>

<p><a href="https://sccn.ucsd.edu/eeglab/index.php">EEGlab</a> and <a href="https://erpinfo.org/erplab">ERPlab</a> use .elc files to describe the placements of 
the electrodes. Here you can download the <a href="/assets/standard_waveguard64_1005.elc">standard waveguard 64 channel 10-20</a> cap .elc file, and the <a href="/assets/standard_waveguard64_equidistant.elc">standard waveguard 64 channel
equidistant duke layout</a> cap .elc file.</p>]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="Caps" /><category term="EEGlab" /><category term="Matlab" /><summary type="html"><![CDATA[ELC files]]></summary></entry><entry><title type="html">Plugin set for OpenSesame. Update</title><link href="https://markspan.github.io/evtplugins/" rel="alternate" type="text/html" title="Plugin set for OpenSesame. Update" /><published>2023-04-05T00:00:00+00:00</published><updated>2023-04-05T00:00:00+00:00</updated><id>https://markspan.github.io/evtplugins</id><content type="html" xml:base="https://markspan.github.io/evtplugins/"><![CDATA[<h1 id="eventexchanger">EventExchanger</h1>

<p>GitHub link:
<a href="https://github.com/markspan/evtplugins">https://github.com/markspan/evtplugins</a></p>

<h2 id="short-description">Short Description:</h2>

<p>Set of <a href="https://osdoc.cogsci.nl/">OpenSesame</a> toolbox items to use hardware developed by the University of Groningen,
faculty of Behavioural and Social Sciences, department of Research Support, in OpenSesame.</p>

<p>Code written by Eise Hoekstra and Mark M. Span, maintained by Mark M. Span</p>

<h2 id="usage">Usage:</h2>

<p>Start OpenSesame (preferably in Admin mode).
Open the console (Ctrl-D) in OpenSesame and type:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins 
</code></pre></div></div>
<p>or, if you cannot obtain root privileges:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins --user
</code></pre></div></div>

<p>and then, if this succeeds, close OpenSesame and open it again.</p>

<hr />
<p><em>When installing on a Mac (<strong>or another platform that has no pip executable available in the terminal</strong>) you should use:</em></p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import pip
pip.main(['install', 'evtplugins'])
</code></pre></div></div>

<p>If all went well, the plugins are now available in your toolbox.</p>

<hr />
<p><em>They will be usable whether installed as Admin or as user, as long as the user can connect to the USB port (this can be more difficult on Linux)</em></p>

<hr />

<p><img src="/images/evtpluginsplus.png" alt="EVTPLUGINS" /></p>

<p><em>At the moment</em> there are five plugins available.</p>

<ul>
  <li><a href="#EVTXX">EVTXX</a> item: send codes through an “EventExchanger” to a physiology recording device, to synchronise the behavioural data with the physiological data.</li>
  <li><a href="#ResponseBox">ResponseBox</a> item: Alternative to the default ‘JoyStick’ plugin. Made for the custom-built button boxes of Research Support (works, or can be made to work, with all HID devices).</li>
  <li><a href="#RGB_Led_Control">RGB_Led_Control</a> item: Extended Responsebox item for use with the RGB Responsebox, enabling colour use and feedback on the buttonbox.</li>
  <li><a href="#VAS">VAS</a> item: a “Visual Analogue Scale”. I tried to make it as customizable as possible, so it’s up to the user to stay close to the original VAS or design their own.</li>
  <li><a href="#Shocker">SHOCKER</a> item. Officially: the <em>Tactile Stimulator</em>, a device that can be used to administer unpleasant tactile feedback to a subject.</li>
</ul>

<h2 id="usage-of-each-item">Usage of each item:</h2>

<h3 id="evtxx"><a name="EVTXX">EVTXX</a></h3>
<p>The purpose of this device is to send codes through an “EventExchanger” to a physiology recording system in order to synchronize behavioral data with physiological data.</p>

<p><img src="/images/EVT-2.jpg" alt="EVT-2" /></p>

<p>When you add the item to your task, it looks for connected EVT devices. In the item’s configuration page, you can select the EVT device from a list. This enables the use of
multiple EVT devices in the same task. During the development of your task, it is recommended to select the “DUMMY” device, which is always available and outputs the codes straight
to the debug window of OpenSesame. Note that the list of devices is populated at startup, so connecting a device after startup will not enable it.</p>

<p><img src="/images/EVT-config.png" alt="EVT-2 config" /></p>

<p>The item has two modes: “Set Output Lines” and “Pulse Output Lines”. In the first mode, the code value is placed on the output lines of the EVT until it is changed. In the second mode, 
the code value is placed on the output lines for a specified number of milliseconds, and then changed back to zero. Because the devices have eight output lines, the code values that 
can be conveyed through these devices range from 1 to 255. Normally, “0” means no code or event, and the values between 1 and 255 can be used to code stimuli.</p>

<p>In the “Pulse Output Lines” mode, OpenSesame will not pause until the code is reset (non-blocking!). Be careful not to send codes when the EVT is still “pulsing”, as the recorded 
code can be affected. We have received reports of the EVTs getting confused and sending erroneous codes after possibly overlapping pulses. Therefore, we recommend pulsing with short 
latencies, approximately 4 times the time between two samples on the amplifier. This way, no code will be missed, and the chance of overlapping pulses is minimal.</p>
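<p>The rule of thumb above translates directly into milliseconds from the amplifier’s sampling rate (a sketch of the arithmetic only, not pyEVT code):</p>

```python
def recommended_pulse_ms(sampling_rate_hz, n_samples=4):
    """Pulse duration spanning n_samples amplifier samples, in ms."""
    return n_samples * 1000.0 / sampling_rate_hz

print(recommended_pulse_ms(500))   # 8.0 ms at 500 Hz
print(recommended_pulse_ms(1000))  # 4.0 ms at 1000 Hz
```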

<h4 id="using-in-code">Using in code:</h4>

<p>The easiest way to use the code is to use the underlying library ‘pyEVT’ <a href="https://github.com/markspan/pyEVT">GitHub Link</a> . Somewhere in the start of your task create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 EE = EvtExchanger()
 EE.Select("EVT") 
</code></pre></div></div>

<hr />
<p><em>The parameter used (here ‘EVT’) selects the USB device that has ‘EVT’ as part of its name. This will usually suffice, but if more than one device matches, you should use the serial number here to select exactly ONE.</em>
<em>Using no, or an empty, string will look for devices with “EventExchanger” in the name.</em></p>

<hr />

<p>and then optionally set the channels to 0.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.SetLines(0) 
</code></pre></div></div>
<p>Further on in your task you can now use the <em>SetLines</em> and <em>PulseLines</em> functions when you need to inform the physiology about an event.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.PulseLines(255, 1000) 
</code></pre></div></div>
<p><em>If you use multiple devices that use the EvtExchanger API, you need to select the device before you call functions on it: this includes the use of button boxes and EventExchangers. When you use the plugin (dragged and dropped), this is taken care of by the plugin.</em></p>

<h3 id="responsebox"><a name="ResponseBox">ResponseBox</a></h3>

<p>There are multiple devices that work with this plugin. They mostly differ in the number and position of the attached buttons, but they are also available as voice keys, and most of them can even record the occurrence of an R-top (the R-peak in the ECG).
This is the generic form:</p>

<p><img src="/images/RSP-12.jpg" alt="RSP-12" /></p>

<p>But there are many variations, with touch buttons, different layouts, etc. They can also be made bespoke.
There is also a foot-pedal form:</p>

<p><img src="/images/RSP-12-F.jpg" alt="RSP-12-f" /></p>

<p>The devices are known to Windows as joysticks and can also be used with the generic joystick plugin.
The configuration is also remarkably similar to that of the generic joystick plugin. The ‘Responsebox’ plugin is also compatible with the <em>PsychoPy</em> back-end.</p>

<p><img src="/images/RSP-config.png" alt="RSP-12 config" /></p>

<h4 id="using-in-code-1">Using in code:</h4>
<p>The easiest way to use the code is to use the underlying library ‘pyEVT’ <a href="https://github.com/markspan/pyEVT">GitHub Link</a> . Somewhere in the start of your task create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 RSP = EvtExchanger()
 RSP.Select("RSP") 
</code></pre></div></div>

<p>Now you can wait for a response using the command:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>RSP.WaitForDigEvents(AllowedEventLines, responseTimeout)
</code></pre></div></div>

<p>Where <em>AllowedEventLines</em> is the bitpattern containing the buttons that should generate a response, and <em>responseTimeout</em> the timeout value in ms.</p>
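<p>In this bit pattern, bit <em>n−1</em> stands for button <em>n</em>; for example, buttons 1 and 3 give 0b101 = 5. A hypothetical helper (not part of pyEVT) to compute it:</p>

```python
def allowed_event_lines(*buttons):
    """Bit pattern for WaitForDigEvents: bit (n-1) set for each button n."""
    mask = 0
    for b in buttons:
        mask |= 1 << (b - 1)
    return mask

print(allowed_event_lines(1, 3))  # 5
# Usage sketch: RSP.WaitForDigEvents(allowed_event_lines(1, 3), 2000)
```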

<h3 id="rgb_led_control"><a name="RGB_Led_Control">RGB_Led_Control</a></h3>
<p>The RGB-Led control is a version of the response box with keys that have RGB LEDs inside. The keys are quite a bit larger than the default ResponseBox keys.</p>

<p><img src="/images/RSP-RGB-config.png" alt="RSP-RGB config" /></p>

<p>The configuration is again similar to the ResponseBox configuration, but also has fields where the “current” colour of the keys can be defined.
If needed, you can also define the colour a pressed key will get, either when it is the correct option or when it is one of the incorrect options.</p>

<h3 id="vas"><a name="VAS">VAS</a></h3>
<p>The item itself is a bit more complex than the previous items. The VAS item is designed to be used with the Rotary Encoder.</p>

<p><img src="/images/RSP-RDC1.jpg" alt="Rotary Encoder" /></p>

<p>It also has the option to be used with a standard mouse. 
The item interacts with a standard <a href="https://osdoc.cogsci.nl/3.3/manual/stimuli/visual/#using-the-sketchpad-and-feedback-items">sketchpad</a> item. In particular,
the item refers to a canvas with some <a href="https://osdoc.cogsci.nl/3.3/manual/python/canvas/#naming-accessing-and-modifying-elements">named</a> elements on it. 
The names of the elements can be entered in the configuration of the item.</p>

<p><img src="/images/VAS1.png" alt="VAS" /></p>

<p>There are 3 names relevant for the VAS item:</p>
<ul>
  <li>the name of the sketchpad to interact with,</li>
  <li>the name of the line element (on this sketchpad) on which the cursor moves,</li>
  <li>and the name of the cursor.</li>
</ul>

<p>Optionally, there is the possibility to animate a timer (again, on the named sketchpad), counting down to the “end of response”. If this element is used, it also must be named.</p>

<p><img src="/images/VAS-config.png" alt="VAS config" /></p>

<p>The relative complexity of the VAS item led to the inclusion of a bare-bones example of its use. This example can be found under Tools - Example experiments in OpenSesame.</p>

<p>There is also a <strong>VAS2</strong> item in the toolbox: a customised version that has no coupling with any encoder and can only be used in combination with the mouse. Its customisable parameters therefore differ. The main features are: a GUI button to end the selection (a rectangle on the canvas, named in the parameters, that ends the VAS2 when clicked after a selection has been made); optional clickable labels for the maximum and minimum values of the VAS; and the <strong>appearance</strong> of the cursor after the VASBODY is clicked.</p>

<h3 id="shocker"><a name="Shocker">Shocker</a></h3>
<h2 id="tactile-stimulator">Tactile Stimulator</h2>

<p>The tactile stimulator is a device that can be used to administer unpleasant tactile feedback to a subject.</p>

<p><img src="/images/tactilestimulator.jpg" alt="LooksLike" /></p>

<hr />
<p><em>The maximum value of the applied current is 5 mA, which is reached
 when a byte with value 255 (being 100%) is sent to the Tactile Stimulator.
 When a 0 is sent, the current will be 0 mA. This value can (and should) be limited by a careful calibration procedure!</em></p>

<hr />

<p>When using the Tactile Stimulator, the first thing to do is to run a calibration. This limits the current sent to the subject to a maximum that is calibrated to the subject’s subjective experience.</p>

<p>You do so by dragging the plugin into your experiment. The configuration pane looks like this:</p>

<p><img src="/images/TS_Calibration.png" alt="Calib" /></p>

<p>Running this plugin in the ‘calibration’ mode (as seen above) will lead to the view below:</p>

<p><img src="/images/TS_CalibrationScreen.png" alt="CalibScreen" /></p>

<p>Clicking (with the mouse) on the bar in the middle of the screen changes the value for the maximum current to be used in the experiment. The newly chosen value can be read in the green field on this screen.</p>

<p>Pressing the red “Test” button administers this current to the subject. The calibration procedure entails steadily increasing the current and getting feedback from the subject on the subjective experience. The shock should probably be annoying, but should definitely not hurt. After each “Test” shock, a pause of 8 seconds starts, during which the button turns blue and no test shock can be administered. This prevents accidental repetitive shocks.</p>

<p>When the calibration value is acquired, the green “OK” button can be pressed to accept it. Your task will then continue.</p>

<p>Dragging a second “shock” plugin into the task enables administering shocks of up to the calibration value to your subject. The configuration pane in “shock” mode looks like this:</p>

<p><img src="/images/TS_Shock.png" alt="Shock" /></p>

<p>The main thing to note here (next to the selection of the actual device under “Productname”) is the activation of the ‘Percentage’ field. This field can be filled with a number between 0 and 100 and defines the current of the administered shock as a percentage of the calibration value.</p>

<p>So if the calibration led to the selection of a value of 2.5 mA (50%), setting the “Percentage” field to, e.g., 30 will lead to a shock with a current of 30% <em>of the calibration value of 2.5 mA</em>, i.e. 0.75 mA!</p>

<hr />
<p><em>The plugin will not allow fast repetition of shocks, as that is usually not what the researcher wants (or at least should want). The default minimum ISI (inter-shock interval) is 1 second. The duration of the stimulation is normally 150 ms. This cannot be changed through the interface, as it is meant to be constant. The value <strong>is</strong> visible, however, and will be logged in a logger item.</em></p>

<hr />]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="EventExchanger" /><category term="ButtonBox" /><category term="coding" /><category term="opensesame" /><category term="python" /><category term="plugin" /><summary type="html"><![CDATA[EventExchanger]]></summary></entry><entry><title type="html">Maps of the Various BSS Labs</title><link href="https://markspan.github.io/Maps/" rel="alternate" type="text/html" title="Maps of the Various BSS Labs" /><published>2023-03-06T00:00:00+00:00</published><updated>2023-03-06T00:00:00+00:00</updated><id>https://markspan.github.io/Maps</id><content type="html" xml:base="https://markspan.github.io/Maps/"><![CDATA[<h1 id="maps-of-the-the-various-labs">MAPS OF THE VARIOUS LABS</h1>

<p>also look at this <a href="https://maps.rug.nl/maps/">amazing link!</a></p>

<p>These are PDFs with the floor plans of the different labs we have at BSS in Groningen.</p>

<p>Availability: <a href="https://labreservations.gmw.rug.nl">https://labreservations.gmw.rug.nl</a></p>

<p><a href="/maps/2111_Nieuwenhuisgebouw_1ste_verd.pdf">Map of the First Floor of the Nieuwenhuis Building</a></p>

<p><a href="/maps/2111_Nieuwenhuisgebouw_beg.gr.pdf">Map of the Ground Floor of the Nieuwenhuis Building</a></p>

<p><a href="/maps/2211_Heymansgebouw_kelder.pdf">Map of the Basement of the Heymans Building</a></p>

<p><a href="/maps/2212_Muntinggebouw_2de_verd.pdf">Map of the 2nd floor of the Munting Building</a></p>

<p><a href="/maps/2213_Heymansvleugel_4de_verd.pdf">Map of the 4th floor of the Heymans Wing</a></p>

<p><a href="/maps/4347_KPNBorg_Lab_Kelder.pdf">Map of the Basement of the KPN Borg (near the train station)</a></p>]]></content><author><name>Mark M. Span</name></author><category term="maps" /><category term="labs" /><category term="bss" /><summary type="html"><![CDATA[MAPS OF THE VARIOUS LABS]]></summary></entry><entry><title type="html">EEG Training instructional Videos</title><link href="https://markspan.github.io/EEGTraining/" rel="alternate" type="text/html" title="EEG Training instructional Videos" /><published>2023-02-07T00:00:00+00:00</published><updated>2023-02-07T00:00:00+00:00</updated><id>https://markspan.github.io/EEGTraining</id><content type="html" xml:base="https://markspan.github.io/EEGTraining/"><![CDATA[<p>Click on:</p>

<p><a href="https://link.gmw.rug.nl/labintro/">https://link.gmw.rug.nl/labintro/</a></p>]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="GMW" /><category term="Training" /><summary type="html"><![CDATA[click on :]]></summary></entry><entry><title type="html">Plugin set for OpenSesame. Update</title><link href="https://markspan.github.io/evtplugins/" rel="alternate" type="text/html" title="Plugin set for OpenSesame. Update" /><published>2022-02-24T00:00:00+00:00</published><updated>2022-02-24T00:00:00+00:00</updated><id>https://markspan.github.io/evtplugins</id><content type="html" xml:base="https://markspan.github.io/evtplugins/"><![CDATA[<h1 id="eventexchanger">EventExchanger</h1>

<p>GitHub link:
<a href="https://github.com/markspan/evtplugins">https://github.com/markspan/evtplugins</a></p>

<h2 id="short-description">Short Description:</h2>

<p>A set of <a href="https://osdoc.cogsci.nl/">OpenSesame</a> toolbox items for using, in OpenSesame, hardware developed by the department of Research Support,
faculty of Behavioural and Social Sciences, University of Groningen.</p>

<p>Code written by Eise Hoekstra and Mark M. Span; maintained by Mark M. Span.</p>

<h2 id="usage">Usage:</h2>

<p>Start OpenSesame (preferably in Admin mode).
Then open the OpenSesame console (Ctrl-D) and type:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins 
</code></pre></div></div>
<p>or, if you cannot obtain root privileges:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins --user
</code></pre></div></div>

<p>If this succeeds, close OpenSesame and open it again.</p>

<hr />
<p><em>When installing on a Mac (<strong>or another platform that has no pip executable available in the terminal</strong>) you should use:</em></p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import pip
pip.main(['install', 'evtplugins'])
</code></pre></div></div>

<p>If all went well, the plugins are now available in your toolbox.</p>

<hr />
<p><em>They will be usable whether installed as Admin or as a user, as long as the user can connect to the USB port (this can be more difficult when running Linux).</em></p>

<hr />

<p><img src="/images/evtpluginsplus.png" alt="EVTPLUGINS" /></p>

<p><em>At the moment</em>, there are five plugins available:</p>

<ul>
  <li><a href="#EVTXX">EVTXX</a> item: send codes through an “EventExchanger” to a physiology recording device, to synchronise the behavioural data with the physiological data.</li>
  <li><a href="#ResponseBox">ResponseBox</a> item: an alternative to the default ‘JoyStick’ plugin, made for the custom-built button boxes of Research Support (works, or can be made to work, with all HID devices).</li>
  <li><a href="#RGB_Led_Control">RGB_Led_Control</a> item: Extended Responsebox item for use with the RGB Responsebox, enabling colour use and feedback on the buttonbox.</li>
  <li><a href="#VAS">VAS</a> item: a “Visual Analogue Scale”. I tried to make it as customisable as possible, so it’s up to the user to stay close to the original VAS or design their own.</li>
  <li><a href="#Shocker">SHOCKER</a> item: officially the <em>Tactile Stimulator</em>, a device that can be used to administer unpleasant tactile feedback to a subject.</li>
</ul>

<h2 id="usage-of-each-item">Usage of each item:</h2>

<h3 id="evtxx"><a name="EVTXX">EVTXX</a></h3>
<p>The purpose of this item/device is to send codes through an “EventExchanger” to a physiology recording device, to synchronise the behavioural data with the physiological data.
What we call an “EventExchanger” is usually an <strong>EVT-2</strong>, see the picture below. This is a USB version of the <strong>EVT-1</strong>, which did the same but was connected to a printer port.
There is also an <strong>EVT-3</strong>: a larger version of the EVT-2, used only in the EEG cabins in the basement. The EVT-2 and EVT-3 are compatible and differ only in the extra (external) connections of the EVT-3.</p>

<p><img src="/images/EVT-2.jpg" alt="EVT-2" /></p>

<p>The toolbox item for the EVTXX can be used for both the EVT-2 and the EVT-3. When you place the item in your task, it will look for <em>attached</em> EVT devices.
In the configuration page of the item you can select the EVT to be used from a list; this enables the use of multiple EVT devices in the same task. During the development of your task it is
advised to select the “DUMMY”, which is always available and outputs the codes straight to the debug window of OpenSesame. <strong>Be aware:</strong> the list of devices is populated at startup, so connecting
the device after startup will not enable it.</p>

<p><img src="/images/EVT-config.png" alt="EVT-2 config" /></p>

<p>The item has two modes: <em>set output lines</em> and <em>pulse output lines</em>. The first mode places the code value on the output lines of the EVT until it is changed; the second mode
does so for a given number of milliseconds and then changes back to zero. Because the devices have eight output lines, the code you can convey through them can vary between 1 and 255.</p>

<p>Normally, “0” means no code, no event, and the values between 1 and 255 can be used to code your stimuli.</p>

<hr />
<p><em>In the “Pulse Output Lines” mode, OpenSesame will <strong>not</strong> pause until the code is reset (non-blocking!). Be careful not to send codes while the EVT is still “pulsing”, as the code that is recorded can be affected. We have had reports of EVTs “getting confused” and sending erroneous codes after possibly overlapping pulses. We therefore advise pulsing with short latencies, of about 4 times the time between two samples on the amplifier: no code will then be “missed”, and the chance of overlapping pulses is minimal.</em></p>

<hr />
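<p>The pulse-length advice above amounts to simple arithmetic. A minimal sketch (the function name is illustrative, not part of pyEVT or the plugin):</p>

```python
def advised_pulse_ms(sample_rate_hz: float, factor: int = 4) -> float:
    """Return a pulse duration of `factor` times the amplifier's
    inter-sample interval, so the code spans several samples without
    lingering long enough to overlap the next pulse."""
    inter_sample_ms = 1000.0 / sample_rate_hz
    return factor * inter_sample_ms

# For a 500 Hz amplifier this gives an 8 ms pulse.
print(advised_pulse_ms(500))  # -> 8.0
```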

<h4 id="using-in-code">Using in code:</h4>

<p>The easiest way to use the hardware in code is through the underlying library ‘pyEVT’ (<a href="https://github.com/markspan/pyEVT">GitHub Link</a>). Somewhere at the start of your task, create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 EE = EvtExchanger()
 EE.Select("EVT") 
</code></pre></div></div>

<hr />
<p><em>The parameter used (here ‘EVT’) selects the USB device that has ‘EVT’ as part of its name. This will usually suffice, but if more than one device matches, you should use the serial number here to select ONE.</em>
<em>Using no, or an empty, string will look for devices with “EventExchanger” in the name.</em></p>

<hr />

<p>and then, optionally, set the output lines to 0:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.SetLines(0) 
</code></pre></div></div>
<p>Further on in your task you can use the <em>SetLines</em> and <em>PulseLines</em> functions whenever you need to inform the physiology recording about an event.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.PulseLines(255, 1000) 
</code></pre></div></div>
<p><em>If you use multiple devices that use the EvtExchanger API, you need to select the device before you call functions on it; this includes button boxes and EventExchangers. When you use the plugin (“dragged and dropped”), this is taken care of by the plugin.</em></p>

<h3 id="responsebox"><a name="ResponseBox">ResponseBox</a></h3>

<p>There are multiple devices that work with this plugin. They differ mostly in the number and position of the attached buttons, but they are also available as voice keys, and most of them can even record the occurrence of an R-top (the R peak in an ECG).
This is the generic form:</p>

<p><img src="/images/RSP-12.jpg" alt="RSP-12" /></p>

<p>But there are many variations, with touch buttons, different layouts, etc. They can also be made bespoke.
There is also a foot-pedal form:</p>

<p><img src="/images/RSP-12-F.jpg" alt="RSP-12-f" /></p>

<p>The devices present themselves to Windows as joysticks and can also be used with the generic joystick plugin.
The configuration is also remarkably similar to that of the generic joystick plugin. The ‘Responsebox’ plugin is also compatible with the <em>Psycho</em> back-end.</p>

<p><img src="/images/RSP-config.png" alt="RSP-12 config" /></p>

<h4 id="using-in-code-1">Using in code:</h4>
<p>The easiest way to use the hardware in code is through the underlying library ‘pyEVT’ (<a href="https://github.com/markspan/pyEVT">GitHub Link</a>). Somewhere at the start of your task, create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 RSP = EvtExchanger()
 RSP.Select("RSP") 
</code></pre></div></div>

<p>Now you can wait for a response using the command:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>RSP.WaitForDigEvents(AllowedEventLines, responseTimeout)
</code></pre></div></div>

<p>Where <em>AllowedEventLines</em> is the bitpattern containing the buttons that should generate a response, and <em>responseTimeout</em> the timeout value in ms.</p>
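<p>The bitpattern combines one bit per allowed button. A small helper to build it, assuming button 1 maps to the least-significant bit (the helper name is hypothetical, not part of pyEVT):</p>

```python
def allowed_event_lines(*buttons: int) -> int:
    """Build the bitpattern for WaitForDigEvents, assuming button 1
    corresponds to bit 0, button 2 to bit 1, and so on."""
    mask = 0
    for b in buttons:
        mask |= 1 << (b - 1)
    return mask

# Accept only buttons 1 and 3: bits 0 and 2 set.
print(allowed_event_lines(1, 3))  # -> 5
```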

<h3 id="rgb_led_control"><a name="RGB_Led_Control">RGB_Led_Control</a></h3>
<p>The RGB-Led control is a version of the response box whose keys have RGB LEDs inside. The keys are quite a bit larger than the default ResponseBox keys.</p>

<p><img src="/images/RSP-RGB-config.png" alt="RSP-RGB config" /></p>

<p>The configuration is again similar to the ResponseBox configuration, but it also has fields where the “current” colour of the keys can be defined.
If needed, you can also define the colour a pressed key will get, either when it is the correct choice or when it is one of the incorrect options.</p>

<h3 id="vas"><a name="VAS">VAS</a></h3>
<p>The VAS item is a bit more complex than the previous items. It is designed to be used with the Rotary Encoder.</p>

<p><img src="/images/RSP-RDC1.jpg" alt="Rotary Encoder" /></p>

<p>It can also be used with a standard mouse.
The item interacts with a standard <a href="https://osdoc.cogsci.nl/3.3/manual/stimuli/visual/#using-the-sketchpad-and-feedback-items">sketchpad</a> item. In particular,
the item refers to a canvas with some <a href="https://osdoc.cogsci.nl/3.3/manual/python/canvas/#naming-accessing-and-modifying-elements">named</a> elements on it. 
The names of the elements can be entered in the configuration of the item.</p>

<p><img src="/images/VAS1.png" alt="VAS" /></p>

<p>There are 3 names relevant for the VAS item:</p>
<ul>
  <li>the name of the sketchpad to interact with,</li>
  <li>the name of the line element (on this sketchpad) on which the cursor moves,</li>
  <li>and the name of the cursor.</li>
</ul>
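<p>Conceptually, the item maps the (normalised) response position onto the named line element and places the cursor there. A simplified sketch of that mapping, with illustrative names (the real work is done inside the plugin):</p>

```python
def cursor_x(value: float, line_x_start: float, line_x_end: float) -> float:
    """Map a response value in [0, 1] to an x-coordinate on the VAS
    line, clamping values that fall outside the line."""
    value = max(0.0, min(1.0, value))
    return line_x_start + value * (line_x_end - line_x_start)

# Halfway along a line running from x=100 to x=300:
print(cursor_x(0.5, 100, 300))  # -> 200.0
```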

<p>Optionally, a timer can be animated (again, on the named sketchpad), counting down to the “end of response”. If this element is used, it must also be named.</p>

<p><img src="/images/VAS-config.png" alt="VAS config" /></p>

<p>The relative complexity of the VAS item led to the inclusion of a bare-bones example of its use. This example can be found under Tools - Example experiments in OpenSesame.</p>

<p>There is also a <strong>VAS2</strong> item in the toolbox: a customised version that has no coupling with any encoder and can only be used in combination with the mouse. Its customisable parameters therefore differ. The main features are: a GUI button to end the selection (a rectangle on the canvas, named in the parameters, that ends the VAS2 when clicked after a selection has been made); optional clickable labels for the maximum and minimum values of the VAS; and the <strong>appearance</strong> of the cursor after the VASBODY is clicked.</p>

<h3 id="shocker"><a name="Shocker">Shocker</a></h3>
<h2 id="tactile-stimulator">Tactile Stimulator</h2>

<p>The tactile stimulator is a device that can be used to administer unpleasant tactile feedback to a subject.</p>

<p><img src="/images/tactilestimulator.jpg" alt="LooksLike" /></p>

<hr />
<p><em>The maximum value of the applied current is 5 mA, which is reached
 when a byte with value 255 (being 100%) is sent to the Tactile Stimulator.
 When a 0 is sent, the current will be 0 mA. This value can (and should) be limited by a careful calibration procedure!</em></p>

<hr />
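<p>The byte value thus maps linearly onto the output current. A small conversion sketch (an illustrative helper, not part of the plugin):</p>

```python
MAX_CURRENT_MA = 5.0  # hardware maximum, reached at byte value 255

def byte_to_ma(byte_value: int) -> float:
    """Convert a byte value (0-255) to the resulting current in mA."""
    if not 0 <= byte_value <= 255:
        raise ValueError("byte value must be in 0..255")
    return MAX_CURRENT_MA * byte_value / 255

print(byte_to_ma(255))  # -> 5.0
print(byte_to_ma(0))    # -> 0.0
```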

<p>When using the Tactile Stimulator, the first thing to do is to run a calibration. This limits the current sent to the subject to a maximum that is calibrated to the subject’s subjective experience.</p>

<p>You do so by dragging the plugin into your experiment. The configuration pane looks like this:</p>

<p><img src="/images/TS_Calibration.png" alt="Calib" /></p>

<p>Running this plugin in the ‘calibration’ mode (as seen above) will lead to the view below:</p>

<p><img src="/images/TS_CalibrationScreen.png" alt="CalibScreen" /></p>

<p>Clicking (with the mouse) on the bar in the middle of the screen changes the value for the maximum current to be used in the experiment. The newly chosen value can be read in the green field on this screen.</p>

<p>Pressing the red “Test” button administers this current to the subject. The calibration procedure entails steadily increasing the current and getting feedback from the subject on the subjective experience. The shock should probably be annoying, but should definitely not hurt. After each “Test” shock, a pause of 8 seconds starts, during which the button turns blue and no test shock can be administered. This prevents accidental repetitive shocks.</p>

<p>When the calibration value is acquired, the green “OK” button can be pressed to accept it. Your task will then continue.</p>

<p>Dragging a second “shock” plugin into the task enables administering shocks of up to the calibration value to your subject. The configuration pane in “shock” mode looks like this:</p>

<p><img src="/images/TS_Shock.png" alt="Shock" /></p>

<p>The main thing to note here (next to the selection of the actual device under “Productname”) is the activation of the ‘Percentage’ field. This field can be filled with a number between 0 and 100 and defines the current of the administered shock as a percentage of the calibration value.</p>

<p>So if the calibration led to the selection of a value of 2.5 mA (50%), setting the “Percentage” field to, e.g., 30 will lead to a shock with a current of 30% <em>of the calibration value of 2.5 mA</em>, i.e. 0.75 mA!</p>
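<p>The worked example above is a simple proportion of the calibrated maximum, never of the 5 mA hardware maximum. As a sketch (the helper name is hypothetical):</p>

```python
def shock_current_ma(calibration_ma: float, percentage: float) -> float:
    """Current actually administered: `percentage` percent of the
    calibrated maximum chosen during the calibration procedure."""
    if not 0 <= percentage <= 100:
        raise ValueError("percentage must be in 0..100")
    return calibration_ma * percentage / 100.0

# Calibration at 2.5 mA, 'Percentage' field set to 30:
print(shock_current_ma(2.5, 30))  # -> 0.75
```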

<hr />
<p><em>The plugin will not allow fast repetition of shocks, as that is usually not what the researcher wants (or at least should want). The default minimum ISI (inter-shock interval) is 1 second. The duration of the stimulation is normally 150 ms. This cannot be changed through the interface, as it is meant to be constant. The value <strong>is</strong> visible, however, and will be logged in a logger item.</em></p>

<hr />]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="EventExchanger" /><category term="ButtonBox" /><category term="coding" /><category term="opensesame" /><category term="python" /><category term="plugin" /><summary type="html"><![CDATA[EventExchanger]]></summary></entry><entry><title type="html">Plugin set for OpenSesame. Update</title><link href="https://markspan.github.io/evtplugins/" rel="alternate" type="text/html" title="Plugin set for OpenSesame. Update" /><published>2022-01-23T00:00:00+00:00</published><updated>2022-01-23T00:00:00+00:00</updated><id>https://markspan.github.io/evtplugins</id><content type="html" xml:base="https://markspan.github.io/evtplugins/"><![CDATA[<h1 id="eventexchanger">EventExchanger</h1>

<p>GitHub link:
<a href="https://github.com/markspan/evtplugins">https://github.com/markspan/evtplugins</a></p>

<h2 id="short-description">Short Description:</h2>

<p>A set of <a href="https://osdoc.cogsci.nl/">OpenSesame</a> toolbox items for using, in OpenSesame, hardware developed by the department of Research Support,
faculty of Behavioural and Social Sciences, University of Groningen.</p>

<p>Code written by Eise Hoekstra and Mark M. Span; maintained by Mark M. Span.</p>

<h2 id="usage">Usage:</h2>

<p>Start OpenSesame (preferably in Admin mode).
Then open the OpenSesame console (Ctrl-D) and type:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins 
</code></pre></div></div>
<p>or, if you cannot obtain root privileges:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins --user
</code></pre></div></div>

<p>If this succeeds, close OpenSesame and open it again.</p>

<p>If all went well, the plugins are now available in your toolbox.</p>

<hr />
<p><em>They will be usable whether installed as Admin or as a user, as long as the user can connect to the USB port (this can be more difficult when running Linux).</em></p>

<hr />

<p><img src="/images/evtpluginsplus.png" alt="EVTPLUGINS" /></p>

<p><em>At the moment</em>, there are five plugins available:</p>

<ul>
  <li><a href="#EVTXX">EVTXX</a> item: send codes through an “EventExchanger” to a physiology recording device, to synchronise the behavioural data with the physiological data.</li>
  <li><a href="#ResponseBox">ResponseBox</a> item: an alternative to the default ‘JoyStick’ plugin, made for the custom-built button boxes of Research Support (works, or can be made to work, with all HID devices).</li>
  <li><a href="#RGB_Led_Control">RGB_Led_Control</a> item: Extended Responsebox item for use with the RGB Responsebox, enabling colour use and feedback on the buttonbox.</li>
  <li><a href="#VAS">VAS</a> item: a “Visual Analogue Scale”. I tried to make it as customisable as possible, so it’s up to the user to stay close to the original VAS or design their own.</li>
  <li><a href="#Shocker">SHOCKER</a> item: officially the <em>Tactile Stimulator</em>, a device that can be used to administer unpleasant tactile feedback to a subject.</li>
</ul>

<h2 id="usage-of-each-item">Usage of each item:</h2>

<h3 id="evtxx"><a name="EVTXX">EVTXX</a></h3>
<p>The purpose of this item/device is to send codes through an “EventExchanger” to a physiology recording device, to synchronise the behavioural data with the physiological data.
What we call an “EventExchanger” is usually an <strong>EVT-2</strong>, see the picture below. This is a USB version of the <strong>EVT-1</strong>, which did the same but was connected to a printer port.
There is also an <strong>EVT-3</strong>: a larger version of the EVT-2, used only in the EEG cabins in the basement. The EVT-2 and EVT-3 are compatible and differ only in the extra (external) connections of the EVT-3.</p>

<p><img src="/images/EVT-2.jpg" alt="EVT-2" /></p>

<p>The toolbox item for the EVTXX can be used for both the EVT-2 and the EVT-3. When you place the item in your task, it will look for <em>attached</em> EVT devices.
In the configuration page of the item you can select the EVT to be used from a list; this enables the use of multiple EVT devices in the same task. During the development of your task it is
advised to select the “DUMMY”, which is always available and outputs the codes straight to the debug window of OpenSesame. <strong>Be aware:</strong> the list of devices is populated at startup, so connecting
the device after startup will not enable it.</p>

<p><img src="/images/EVT-config.png" alt="EVT-2 config" /></p>

<p>The item has two modes: <em>set output lines</em> and <em>pulse output lines</em>. The first mode places the code value on the output lines of the EVT until it is changed; the second mode
does so for a given number of milliseconds and then changes back to zero. Because the devices have eight output lines, the code you can convey through them can vary between 1 and 255.</p>

<p>Normally, “0” means no code, no event, and the values between 1 and 255 can be used to code your stimuli.</p>

<hr />
<p><em>In the “Pulse Output Lines” mode, OpenSesame will <strong>not</strong> pause until the code is reset (non-blocking!). Be careful not to send codes while the EVT is still “pulsing”, as the code that is recorded can be affected. We have had reports of EVTs “getting confused” and sending erroneous codes after possibly overlapping pulses. We therefore advise pulsing with short latencies, of about 4 times the time between two samples on the amplifier: no code will then be “missed”, and the chance of overlapping pulses is minimal.</em></p>

<hr />

<h4 id="using-in-code">Using in code:</h4>

<p>The easiest way to use the hardware in code is through the underlying library ‘pyEVT’ (<a href="https://github.com/markspan/pyEVT">GitHub Link</a>). Somewhere at the start of your task, create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 EE = EvtExchanger()
 EE.Select("EVT") 
</code></pre></div></div>

<hr />
<p><em>The parameter used (here ‘EVT’) selects the USB device that has ‘EVT’ as part of its name. This will usually suffice, but if more than one device matches, you should use the serial number here to select ONE.</em>
<em>Using no, or an empty, string will look for devices with “EventExchanger” in the name.</em></p>

<hr />

<p>and then, optionally, set the output lines to 0:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.SetLines(0) 
</code></pre></div></div>
<p>Further on in your task you can use the <em>SetLines</em> and <em>PulseLines</em> functions whenever you need to inform the physiology recording about an event.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>EE.PulseLines(255, 1000) 
</code></pre></div></div>
<p><em>If you use multiple devices that use the EvtExchanger API, you need to select the device before you call functions on it; this includes button boxes and EventExchangers. When you use the plugin (“dragged and dropped”), this is taken care of by the plugin.</em></p>

<h3 id="responsebox"><a name="ResponseBox">ResponseBox</a></h3>

<p>There are multiple devices that work with this plugin. They differ mostly in the number and position of the attached buttons, but they are also available as voice keys, and most of them can even record the occurrence of an R-top (the R peak in an ECG).
This is the generic form:</p>

<p><img src="/images/RSP-12.jpg" alt="RSP-12" /></p>

<p>But there are many variations, with touch buttons, different layouts, etc. They can also be made bespoke.
There is also a foot-pedal form:</p>

<p><img src="/images/RSP-12-F.jpg" alt="RSP-12-f" /></p>

<p>The devices present themselves to Windows as joysticks and can also be used with the generic joystick plugin.
The configuration is also remarkably similar to that of the generic joystick plugin. The ‘Responsebox’ plugin is also compatible with the <em>Psycho</em> back-end.</p>

<p><img src="/images/RSP-config.png" alt="RSP-12 config" /></p>

<h4 id="using-in-code-1">Using in code:</h4>
<p>The easiest way to use the hardware in code is through the underlying library ‘pyEVT’ (<a href="https://github.com/markspan/pyEVT">GitHub Link</a>). Somewhere at the start of your task, create and select the device you want to use:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> from pyEVT import EvtExchanger 
 RSP = EvtExchanger()
 RSP.Select("RSP") 
</code></pre></div></div>

<p>Now you can wait for a response using the command:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>RSP.WaitForDigEvents(AllowedEventLines, responseTimeout)
</code></pre></div></div>

<p>Where <em>AllowedEventLines</em> is the bitpattern containing the buttons that should generate a response, and <em>responseTimeout</em> the timeout value in ms.</p>

<h3 id="rgb_led_control"><a name="RGB_Led_Control">RGB_Led_Control</a></h3>
<p>The RGB-Led control is a version of the response box whose keys have RGB LEDs inside. The keys are quite a bit larger than the default ResponseBox keys.</p>

<p><img src="/images/RSP-RGB-config.png" alt="RSP-RGB config" /></p>

<p>The configuration is again similar to the ResponseBox configuration, but it also has fields where the “current” colour of the keys can be defined.
If needed, you can also define the colour a pressed key will get, either when it is the correct choice or when it is one of the incorrect options.</p>

<h3 id="vas"><a name="VAS">VAS</a></h3>
<p>The VAS item is a bit more complex than the previous items. It is designed to be used with the Rotary Encoder.</p>

<p><img src="/images/RSP-RDC1.jpg" alt="Rotary Encoder" /></p>

<p>It can also be used with a standard mouse.
The item interacts with a standard <a href="https://osdoc.cogsci.nl/3.3/manual/stimuli/visual/#using-the-sketchpad-and-feedback-items">sketchpad</a> item. In particular,
the item refers to a canvas with some <a href="https://osdoc.cogsci.nl/3.3/manual/python/canvas/#naming-accessing-and-modifying-elements">named</a> elements on it. 
The names of the elements can be entered in the configuration of the item.</p>

<p><img src="/images/VAS1.png" alt="VAS" /></p>

<p>Three names are relevant for the VAS item:</p>
<ul>
  <li>the name of the sketchpad to interact with,</li>
  <li>the name of the line element (on this sketchpad) on which the cursor moves,</li>
  <li>and the name of the cursor.</li>
</ul>

<p>Optionally, a timer can be animated (again, on the named sketchpad), counting down to the “end of response”. If this element is used, it must also be named.</p>

<p><img src="/images/VAS-config.png" alt="VAS config" /></p>

<p>The relative complexity of the VAS item led to the inclusion of a bare-bones example of its use. This example can be found under Tools - Example experiments in OpenSesame.</p>

<h3 id="shocker"><a name="Shocker">Shocker</a></h3>
<h2 id="tactile-stimulator">Tactile Stimulator.</h2>

<p>The tactile stimulator is a device that can be used to administer unpleasant tactile feedback to a subject.</p>

<p><img src="/images/tactilestimulator.jpg" alt="LooksLike" /></p>

<hr />
<p><em>The maximum value of the applied current is 5 mA, which is reached
 when a byte with value 255 (being 100%) is sent to the Tactile Stimulator. 
 When a 0 is sent, the current will be 0 mA. This value can (and should) be limited by a careful calibration procedure!</em></p>

<hr />
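<p>Assuming the byte-to-current mapping is linear, as the note above implies (255 → 5 mA, 0 → 0 mA), the delivered current can be sketched as:</p>

```python
MAX_CURRENT_MA = 5.0  # full-scale output when a byte of 255 is sent

def byte_to_current_ma(value):
    """Map a 0-255 control byte to output current in mA (assumed linear)."""
    if not 0 <= value <= 255:
        raise ValueError("control byte must be in 0..255")
    return MAX_CURRENT_MA * value / 255

print(byte_to_current_ma(255))  # 5.0
print(byte_to_current_ma(0))    # 0.0
```

<p>This is only a sketch of the stated relationship; the actual conversion happens inside the device.</p>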

<p>When using the Tactile Stimulator, the first thing to do is run a calibration. This limits the current sent to the subject to a maximum that is calibrated to the subject’s subjective experience.</p>

<p>You do so by dragging the plugin into your experiment. The configuration pane looks like this:</p>

<p><img src="/images/TS_Calibration.png" alt="Calib" /></p>

<p>Running this plugin in the ‘calibration’ mode (as seen above) will lead to the view below:</p>

<p><img src="/images/TS_CalibrationScreen.png" alt="CalibScreen" /></p>

<p>Clicking (with the mouse) on the bar in the middle of the screen will change the values for the maximum current to be used in the experiment. The newly chosen values are readable in the green field on this screen.</p>

<p>Pressing the red “Test” button will administer this current to the subject. The calibration procedure will entail a steadily increasing current, and getting feedback from the subject on the subjective experience. The shock should probably be annoying, and should definitely not hurt. After each “Test” shock, a pause of 8 seconds will start, with the button turning blue. During this pause no test shock can be administered. This is to prevent accidental repetitive shocks.</p>

<p>When the calibration value is acquired, the green “OK” button can be pressed, and the calibration value is accepted. Your task will now continue.</p>

<p>Dragging a second “shock” plugin into the task enables administering shocks up to the calibration value. The plugin in “shock” mode looks like this:</p>

<p><img src="/images/TS_Shock.png" alt="Shock" /></p>

<p>The main thing to note here (next to the selection of the actual device: “Productname”) is the activation of the ‘Percentage’ field. This field accepts a number between 0 and 100; the shock administered to the subject then has a current equal to that percentage of the calibration value.</p>

<p>So if the calibration led to the selection of a value of 2.5 mA (50%), setting the “Percentage” field to, e.g., 30% will lead to a shock with a current of 30% <em>of the calibration value of 2.5 mA</em>, i.e., 0.75 mA!</p>
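<p>The arithmetic above amounts to taking a percentage of the calibrated maximum; a hypothetical helper (the plugin performs this internally) makes it explicit:</p>

```python
def shock_current_ma(calibration_ma, percentage):
    """Delivered current: a percentage of the calibrated maximum (in mA)."""
    if not 0 <= percentage <= 100:
        raise ValueError("percentage must be in 0..100")
    return calibration_ma * percentage / 100

print(shock_current_ma(2.5, 30))  # 0.75
```

<p>Note that the percentage applies to the calibrated maximum, not to the device’s 5 mA full scale.</p>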

<hr />
<p><em>The plugin will not allow fast repetitions of the shocks, as they usually are not what the researcher wants (or at least should want). The default minimum ISI (inter-shock interval) is 1 second. The duration of the stimulation is normally 150 ms. This cannot be changed through the interface, as it is meant to be constant. The value <strong>is</strong> visible, and will be logged in a logger item.</em></p>

<hr />]]></content><author><name>Mark M. Span</name></author><category term="EEG" /><category term="EventExchanger" /><category term="ButtonBox" /><category term="coding" /><category term="opensesame" /><category term="python" /><category term="plugin" /><summary type="html"><![CDATA[EventExchanger]]></summary></entry><entry><title type="html">Tactile Stimulator for OpenSesame. Plugins from Research Support BSS University of Groningen</title><link href="https://markspan.github.io/Tactile-Stimulator/" rel="alternate" type="text/html" title="Tactile Stimulator for OpenSesame. Plugins from Research Support BSS University of Groningen" /><published>2022-01-18T00:00:00+00:00</published><updated>2022-01-18T00:00:00+00:00</updated><id>https://markspan.github.io/Tactile%20Stimulator</id><content type="html" xml:base="https://markspan.github.io/Tactile-Stimulator/"><![CDATA[<h1 id="eventexchanger">EventExchanger</h1>

<p>GitHub link:
<a href="https://github.com/markspan/evtplugins">https://github.com/markspan/evtplugins</a></p>

<h2 id="short-description">Short Description:</h2>

<p>Adding to the set of <a href="https://osdoc.cogsci.nl/">OpenSesame</a> toolbox items to use hardware developed by the University of Groningen,
faculty of Behavioural and Social Sciences, department of Research Support, in OpenSesame.</p>

<p>Code written by Eise Hoekstra and Mark M. Span; maintained by Mark M. Span.</p>

<h2 id="usage">Usage:</h2>

<p>Start OpenSesame. 
Then open the console (Ctrl-D) of OpenSesame, and type:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> !pip install evtplugins --user
</code></pre></div></div>
<p>and then, if this succeeds, close OpenSesame and open it again.</p>

<p>If all went well, the plugins are now available in your toolbox.</p>

<p><img src="/images/evtpluginsplus.png" alt="EVTPLUGINS" /></p>

<h2 id="tactile-stimulator">Tactile Stimulator.</h2>

<p>The tactile stimulator is a device that can be used to administer unpleasant tactile feedback to a subject.</p>

<p><img src="/images/tactilestimulator.jpg" alt="LooksLike" /></p>

<hr />
<p><em>The maximum value of the applied current is 5 mA, which is reached
 when a byte with value 255 (being 100%) is sent to the Tactile Stimulator. 
 When a 0 is sent, the current will be 0 mA. This value can (and should) be limited by a careful calibration procedure!</em></p>

<hr />

<p>When using the Tactile Stimulator, the first thing to do is run a calibration. This limits the current sent to the subject to a maximum that is calibrated to the subject’s subjective experience.</p>

<p>You do so by dragging the plugin into your experiment. The configuration pane looks like this:</p>

<p><img src="/images/TS_Calibration.png" alt="Calib" /></p>

<p>Running this plugin in the ‘calibration’ mode (as seen above) will lead to the view below:</p>

<p><img src="/images/TS_CalibrationScreen.png" alt="CalibScreen" /></p>

<p>Clicking (with the mouse) on the bar in the middle of the screen will change the values for the maximum current to be used in the experiment. The newly chosen values are readable in the green field on this screen.</p>

<p>Pressing the red “Test” button will administer this current to the subject. The calibration procedure will entail a steadily increasing current, and getting feedback from the subject on the subjective experience. The shock should probably be annoying, and should definitely not hurt. After each “Test” shock, a pause of 8 seconds will start, with the button turning blue. During this pause no test shock can be administered. This is to prevent accidental repetitive shocks.</p>

<p>When the calibration value is acquired, the green “OK” button can be pressed, and the calibration value is accepted. Your task will now continue.</p>

<p>Dragging a second “shock” plugin into the task enables administering shocks up to the calibration value. The plugin in “shock” mode looks like this:</p>

<p><img src="/images/TS_Shock.png" alt="Shock" /></p>

<p>The main thing to note here (next to the selection of the actual device: “Productname”) is the activation of the ‘Percentage’ field. This field accepts a number between 0 and 100; the shock administered to the subject then has a current equal to that percentage of the calibration value.</p>

<p>So if the calibration led to the selection of a value of 2.5 mA (50%), setting the “Percentage” field to, e.g., 30% will lead to a shock with a current of 30% <em>of the calibration value of 2.5 mA</em>, i.e., 0.75 mA!</p>

<hr />
<p><em>The plugin will not allow fast repetitions of the shocks, as they usually are not what the researcher wants (or at least should want). The default minimum ISI (inter-shock interval) is 1 second.</em></p>

<hr />]]></content><author><name>Mark M. Span</name></author><category term="Tactile" /><category term="Stimulator" /><category term="EventExchanger" /><category term="ButtonBox" /><category term="coding" /><category term="opensesame" /><category term="python" /><category term="plugin" /><summary type="html"><![CDATA[EventExchanger]]></summary></entry><entry><title type="html">LSL - Polar H10</title><link href="https://markspan.github.io/Polar/" rel="alternate" type="text/html" title="LSL - Polar H10" /><published>2021-03-08T00:00:00+00:00</published><updated>2021-03-08T00:00:00+00:00</updated><id>https://markspan.github.io/Polar</id><content type="html" xml:base="https://markspan.github.io/Polar/"><![CDATA[<p><a href="https://github.com/markspan/PolarBand2lsl">polarBandH10-2LSL</a></p>

<h1 id="polarband2lsl">PolarBand2lsl</h1>
<p>Send PolarBand H10 Data to an <a href="https://github.com/sccn/labstreaminglayer">LSL</a> stream.</p>

<h1 id="manual">Manual:</h1>
<p>Install <a href="https://www.anaconda.com/">Python</a> if you haven’t already.</p>

<p>Then open an Anaconda prompt, and do a</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>pip install pylsl --user
pip install bleak --user
</code></pre></div></div>

<p>to install <a href="https://pypi.org/project/pylsl/">pylsl</a> and <a href="https://bleak.readthedocs.io/en/latest/">bleak</a> into python.</p>

<p>As <strong>bleak</strong> is used for the Bluetooth LE communication, this <em>should</em> work on Windows, macOS, and Linux.</p>

<p>Then change to the directory containing this code.
Change the MAC address in the code to the MAC address of your band first.</p>

<p>do a</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python Polar2LSL
</code></pre></div></div>

<p>to get the stream running.</p>

<p>You can record the stream with <a href="https://github.com/labstreaminglayer/App-LabRecorder/releases">Labrecorder</a></p>

<p>A sample script for peak detection is also provided, based on the <a href="https://nl.mathworks.com/help/wavelet/ug/r-wave-detection-in-the-ecg.html">Matlab documentation</a>.
This script uses the xdf import module of LabStreamingLayer (https://github.com/xdf-modules/xdf-Matlab) and the ‘findpeaks’ function from the MATLAB Signal Processing Toolbox.</p>

<p><img src="https://user-images.githubusercontent.com/4105112/110318793-40345100-800e-11eb-9f86-872d7848a1ac.png" alt="Screenshot 2021-02-25 115853" /></p>
<h1 id="stolen-from">Stolen from:</h1>
<p><a href="https://towardsdatascience.com/creating-a-data-stream-with-polar-device-a5c93c9ccc59">Pareeknikhil</a></p>]]></content><author><name>Mark M. Span</name></author><category term="labstreaminglayer" /><category term="heartrate" /><category term="synchronisation" /><category term="coding" /><summary type="html"><![CDATA[polarBandH10-2LSL]]></summary></entry></feed>