<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Reality Bending Lab</title>
    <link>https://realitybending.github.io/</link>
      <atom:link href="https://realitybending.github.io/index.xml" rel="self" type="application/rss+xml" />
    <description>Reality Bending Lab</description>
    <generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Sun, 11 Jan 2026 00:00:00 +0000</lastBuildDate>
    <image>
      <url>https://realitybending.github.io/media/icon_hu_82f4b62152eab490.png</url>
      <title>Reality Bending Lab</title>
      <link>https://realitybending.github.io/</link>
    </image>
    
    <item>
      <title>Postdoc in Psychology/Neuroscience</title>
      <link>https://realitybending.github.io/jobs/postdoc/</link>
      <pubDate>Tue, 22 Nov 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/postdoc/</guid>
      <description>&lt;!-- ## Funded Position Available


  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; As soon as possible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Singapore

  &lt;i class=&#34;fa fa-clock  pr-1 fa-fw&#34;&gt;&lt;/i&gt; 2-years

![](singapore.jpg)




  &lt;i class=&#34;fa fa-microscope  pr-1 fa-fw&#34;&gt;&lt;/i&gt; **Themes: Fake news, EEG, deception, perception of reality**

We are currently seeking a postdoctoral fellow to join us in Singapore for a **2-years** project investigating the neural mechanisms and socio/psycho/cognitive correlates of **beliefs in fake news and misinformation**. We welcome applicants with a PhD in psychology or neuroscience to work in the [Clinical Brain Lab](https://www.clinicalbrain.org/) at Nanyang Technological University (NTU), with Prof Annabel Chen (https://www.clinicalbrain.org/) and Dr Dominique Makowski (https://dominiquemakowski.github.io/).

This is a fantastic opportunity for postdoctoral fellows to develop an exciting research project with important theoretical ties and applied outcomes (in terms of fake news, misinformation management, policy implications etc.,).
At the Clinical Brain Lab, the research focuses on uncovering the neuropsychological mechanisms underlying cognitive processes and behaviour. Our main research modalities include the use of Magnetic Resonance Imaging (MRI), neurostimulation (TMS &amp; TDCs), EEG and fNIRS, as well as cognitive behavioural and neuropsychological assessment tools.

**Desired skills.** Experience in EEG, physiological signals (ECG, EDA, ...), neuropsychological/cognitive tests administration, experimental psychology task design and implementation, signal processing and statistics is helpful. Proficiency with R and/or Python is a plus, and a dedication to open science is welcome.

**Starting date.** As soon as possible.

**Contact.** Send questions or CV to Prof Annabel (annabelchen@ntu.edu.sg) and Dr Dominique Makowski (D.Makowski@sussex.ac.uk). --&gt;
&lt;h2 id=&#34;new-opportunities&#34;&gt;New Opportunities&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;None :(&lt;/li&gt;
&lt;/ul&gt;
&lt;!-- 
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; University of Sussex, Brighton, UK

- ⚠️ [**Fyssen Postdoc Fellowship competition**](https://www.fondationfyssen.fr/en/study-grants/aim-award/) *(French PhDs only, deadline: 31 March 2025)* --&gt;
&lt;h2 id=&#34;getting-your-own-postdoc-funding&#34;&gt;Getting your own Postdoc Funding&lt;/h2&gt;
&lt;p&gt;If you&amp;rsquo;re a recent PhD graduate (or soon to be), you could consider applying for a postdoc fellowship to join the lab on your own terms, and develop your own research project.&lt;/p&gt;
&lt;p&gt;Here are some existing opportunities for postdoc funding:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://www.senss.ac.uk/post-doctoral-fellowships&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;SENSS&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://marie-sklodowska-curie-actions.ec.europa.eu/actions/postdoctoral-fellowships&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Marie Skłodowska-Curie Postdoctoral Fellowships (MSCA)&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://royalsociety.org/grants/newton-international/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Newton International Fellowships&lt;/strong&gt;&lt;/a&gt; (non-UK)&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.thebritishacademy.ac.uk/funding/postdoctoral-fellowships/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;British Academy&lt;/strong&gt;&lt;/a&gt; (UK PhD holders)&lt;/li&gt;
&lt;li&gt;UKRI Future Leaders Fellowships&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.fondationfyssen.fr/en/study-grants/aim-award/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;FYSSEN&lt;/strong&gt;&lt;/a&gt; (French PhD holders)&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.sshrc-crsh.gc.ca/funding-financement/programs-programmes/fellowships/postdoctoral-postdoctorale-fra.aspx&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;CRSH&lt;/strong&gt;&lt;/a&gt;, &lt;a href=&#34;https://frq.gouv.qc.ca/programme/frqsc-bourse-postdoctorale-b3z-concours-automne-2023-2024-2025/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;FRQSC&lt;/a&gt;, &lt;a href=&#34;https://banting.fellowships-bourses.gc.ca/fr/app-dem_overview-apercu.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Banting&lt;/a&gt; (Canadian PhD holders)&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.ntu.edu.sg/hass/admissions/graduate-programmes/hips2024&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;NTU&amp;rsquo;s Humanities International Postdoctoral Scholarship (HIPS)&lt;/strong&gt;&lt;/a&gt; (Singaporeans)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Get in touch with me once you have an opportunity in mind and a rough project idea; we can then refine your application to maximize your chances of getting it.&lt;/p&gt;
&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t rely on what is written!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Ask directly &lt;a href=&#34;https://realitybending.github.io/people/&#34;&gt;members of the team&lt;/a&gt; (current and past) about their experience in the lab!&lt;/p&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>PhD in Psychology / Neuroscience</title>
      <link>https://realitybending.github.io/jobs/phd/</link>
      <pubDate>Tue, 22 Nov 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/phd/</guid>
      <description>&lt;p&gt;
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; University of Sussex, Brighton, UK&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /media/doctorate_hu_3c31ab3d39f92058.webp 400w,
               /media/doctorate_hu_a20a92dab4c448d3.webp 760w,
               /media/doctorate_hu_8acd0a718c6e0da4.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/media/doctorate_hu_3c31ab3d39f92058.webp&#34;
               width=&#34;760&#34;
               height=&#34;343&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id=&#34;new-opportunities&#34;&gt;New Opportunities&lt;/h2&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img src=&#34;https://media.licdn.com/dms/image/v2/D4E22AQE87QMg9zXFlw/feedshare-shrink_1280/B4EZeomw49HYAk-/0/1750880424520?e=1754524800&amp;amp;v=beta&amp;amp;t=4cHB_CcBSi5EHaA1YREwROCpBG3lMfM2mev49jiAvWY&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;11 December 2025&lt;/strong&gt;: &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/degrees/psychology-phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Sussex Psychology PhD Program&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Fully funded for local and international students (i.e., pays university fees + gives you a salary)&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Selection is based on the candidate&amp;rsquo;s CV as well as on the project proposal&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Get in touch with potential supervisors before applying!&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;11 December 2025&lt;/strong&gt;: &lt;a href=&#34;https://www.sussex.ac.uk/study/fees-funding/phd-funding/view/1863-SEDarc-%28ESRC%29-PhD-scholarships-for-research-in-the-Social-Sciences&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;SEDarc PhD Scholarship&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; 3.5-year fully funded scholarship&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Project must fit within the SEDarc themes (e.g., data science)&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Selection is based on the candidate&amp;rsquo;s CV as well as on the project proposal&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;12 January 2026&lt;/strong&gt;: &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/degrees/sussex-neuroscience-4-year-programme-phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Sussex Neuroscience 3+1 PhD Program&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Fully funded for local and international students (i.e., pays university fees + gives you a salary)&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; You don&amp;rsquo;t need to choose a supervisor before applying; the first year consists of three rotations in different labs&lt;/li&gt;
&lt;li&gt;&lt;input checked=&#34;&#34; disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; Selection is based mostly on the candidate&amp;rsquo;s CV&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;what-you-will-get&#34;&gt;What you will get&lt;/h2&gt;
&lt;p&gt;Doing a PhD at Sussex with Dominique Makowski means:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Joining a dynamic team with a vibrant lab life&lt;/li&gt;
&lt;li&gt;A supervisor that actually supervises 🤯&lt;/li&gt;
&lt;li&gt;A super interesting research topic&lt;/li&gt;
&lt;li&gt;A French-style thesis defense to celebrate your accomplishments 🧀🍷&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&#34;alert alert-warning&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t rely on what is written!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Ask directly &lt;a href=&#34;https://realitybending.github.io/people/&#34;&gt;members of the team&lt;/a&gt; (current and past) about their experience in the lab!&lt;/p&gt;
  &lt;/div&gt;
&lt;/div&gt;
&lt;h2 id=&#34;how-to-do-a-phd-in-psychology&#34;&gt;How to do a PhD in Psychology?&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;The first step is usually to contact the potential supervisor to discuss a rough research project outline. Write an email with your CV, your research interests and, if you have any, some ideas for a research project that match your supervisor&amp;rsquo;s line of research. If you &lt;strong&gt;don&amp;rsquo;t have ideas yet&lt;/strong&gt;, that&amp;rsquo;s perfectly fine! I will likely propose some avenues of research that might match your interests, and we can refine them down the line.&lt;/li&gt;
&lt;li&gt;Ideally, you would also want to come up with a plan for &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/degrees/psychology-phd#funding-fees&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;funding&lt;/strong&gt;&lt;/a&gt;. This is unfortunately the most challenging part. There are typically 4 types of profiles: 1) the &lt;em&gt;student and the supervisor&lt;/em&gt; come up with a tentative research project, with which the student then applies to scholarship opportunities; 2) the &lt;em&gt;supervisor&lt;/em&gt; already has funding for a specific project they obtained a grant for, and recruits a PhD student for that specific research project; 3) the &lt;em&gt;student&lt;/em&gt; has already secured a scholarship that allows them to pursue a PhD with the supervisor of their choice (e.g., some countries have schemes for their nationals to do a PhD abroad); 4) self-funding, which we don&amp;rsquo;t recommend unless you&amp;rsquo;re one of the lucky few with money to spare.&lt;/li&gt;
&lt;/ol&gt;
&lt;h3 id=&#34;funding-opportunities&#34;&gt;Funding Opportunities&lt;/h3&gt;
&lt;p&gt;Funding is a complicated topic, and often the main barrier between one&amp;rsquo;s goal and its achievement. Keep in mind that there are many other possibilities and case-by-case considerations.&lt;/p&gt;
&lt;p&gt;Here are some scholarship opportunities for funded PhDs in the UK:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;a href=&#34;https://archive.sussex.ac.uk/study/scholarships/1525-Psychology-Doctoral-Research-Studentship-UK-and-International&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Sussex Psychology Doctoral Research Studentship&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.sussex.ac.uk/research/centres/sussex-neuroscience/phd/4yearphd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Sussex Neuroscience 3+1 years PhD&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.sussex.ac.uk/study/fees-funding/phd-funding/view/1807-Sussex-AI-PhD-studentships&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Sussex AI PhD&lt;/strong&gt;&lt;/a&gt; (you will need a primary supervisor from the &lt;em&gt;School of Engineering and Informatics&lt;/em&gt; but I can be a cosupervisor)&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.sussex.ac.uk/study/fees-funding/phd-funding/view/1639-SEDarc-%28ESRC%29-PhD-scholarships-for-research-in-the-Social-Sciences&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;SEDarc studentships&lt;/strong&gt;&lt;/a&gt; (see also &lt;a href=&#34;https://sedarc.ac.uk/thematic-pathways/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;here&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.senss.ac.uk/studentships-overview&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;South and East Network for Social Sciences (SENSS)&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://wellcome.org/grant-funding/schemes/four-year-phd-programmes-studentships-basic-scientists&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Wellcome Trust PhD Studentships&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.ukri.org/what-we-do/developing-people-and-skills/find-studentships-and-doctoral-training/get-a-studentship-to-fund-your-doctorate/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;UKRI studentship&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://southcoastdtp.ac.uk/funding/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;South Coast Biosciences Network (SoCoBio)&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.chevening.org/scholarships/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Chevening Scholarship&lt;/strong&gt;&lt;/a&gt; (International only)&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://phd.learningplanetinstitute.org/en/join-us&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;FIRE French scholarships&lt;/strong&gt;&lt;/a&gt; (must be a collaboration with a Paris-based lab)&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.daad.de/en/study-research-teach-abroad/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;DAAD scholarships&lt;/strong&gt;&lt;/a&gt; (Germans)&lt;/li&gt;
&lt;li&gt;The &lt;a href=&#34;https://www.sussex.ac.uk/study/fees-funding/phd-funding/view/1625-China-Scholarship-Council-CSC-University-of-Sussex-Joint-Scholarships-2024&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Joint China Scholarship&lt;/strong&gt;&lt;/a&gt; (China)&lt;/li&gt;
&lt;li&gt;Scholarship Opportunities for &lt;strong&gt;Singaporeans&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://www.gov.uk/government/news/compilation-of-scholarships-and-fellowships-for-singaporeans&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;gov.uk information&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.psc.gov.sg/scholarships/postgraduate-scholarships/lee-kuan-yew-scholarship&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Lee Kuan Yew Scholarship&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.a-star.edu.sg/Scholarships/for-graduate-studies/national-science-scholarship-phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;A*STAR Scholarship&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.smu.edu.sg/MOE-start/overseas-pg-scholarship&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;SMU Postgraduate Scholarship&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.ntu.edu.sg/hass/admissions/graduate-programmes/hips2024&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;NTU&amp;rsquo;s Humanities International PhD/Postdoctoral Scholarship (HIPS)&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;As well as other options:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Partnership&lt;/strong&gt;: If an external partner agrees to cover half the cost (approx. £35k over three years), the university can match the other half. This is useful for applied projects and collaborations with &lt;strong&gt;startups, private companies or NGOs&lt;/strong&gt;. If you&amp;rsquo;re thinking of developing a product, a piece of software or a service, this could be a good option.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Collaboration&lt;/strong&gt;: Many universities allow some form of co-supervision. This means that you could do the main part of your PhD at another university and come to Sussex sporadically as part of a collaboration. Note that official frameworks exist for this type of arrangement, such as the &lt;a href=&#34;https://u-paris.fr/cotutelle-internationale-de-these/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;cotutelles&lt;/em&gt;&lt;/a&gt; in France.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Check out this &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/apply&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;how to apply&lt;/em&gt;&lt;/a&gt; guide for additional information.&lt;/p&gt;
&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    More info is available on the university&amp;rsquo;s &lt;a href=&#34;https://www.sussex.ac.uk/schools/psychology/study/phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;PhD in psychology&lt;/strong&gt;&lt;/a&gt;, &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/degrees/psychology-phd#funding-fees&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Funding&lt;/strong&gt;&lt;/a&gt;, &lt;a href=&#34;https://www.sussex.ac.uk/study/phd/degrees/cognitive-science-phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;PhD in cognitive science&lt;/strong&gt;&lt;/a&gt; and &lt;a href=&#34;https://www.sussex.ac.uk/research/centres/sussex-neuroscience/phd&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;PhD in neuroscience&lt;/strong&gt;&lt;/a&gt; information pages.
  &lt;/div&gt;
&lt;/div&gt;
&lt;h3 id=&#34;other-pots-of-money&#34;&gt;Other Pots of Money&lt;/h3&gt;
&lt;p&gt;Mostly for those already registered as PhD students.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://editing.press/bassi&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Laura Bassi Scholarship&lt;/a&gt;: for Master&amp;rsquo;s and PhD students working on &amp;ldquo;neglected&amp;rdquo; research topics.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;questions-and-answers&#34;&gt;Questions and Answers&lt;/h2&gt;
&lt;h3 id=&#34;clinical-psychology-phd-or-dclinpsy&#34;&gt;Clinical Psychology PhD or DClinPsy?&lt;/h3&gt;
&lt;p&gt;Unfortunately, the University of Sussex does not currently offer a PhD in &lt;em&gt;&lt;strong&gt;Clinical Psychology&lt;/strong&gt;&lt;/em&gt; that includes clinical placements and internships in hospitals. However, if you are interested in working with patients, it is entirely possible to have a research project that involves clinical populations and to specialize in &amp;ldquo;clinical&amp;rdquo; research. Some people then complement this kind of PhD with clinical training (e.g., psychotherapy) to transition from research to practice.&lt;/p&gt;
&lt;h3 id=&#34;how-to-become-a-neuropsychologist&#34;&gt;How to become a Neuropsychologist?&lt;/h3&gt;
&lt;p&gt;Neuropsychology is both an approach (focusing on the relationship between the brain and its output in the form of behaviour and thought) and a practice (involving neuropsychological assessment and rehabilitation). The latter is considered a specialization of Clinical Psychology, which means that one must be a clinical psychologist to be a clinical neuropsychologist. As noted above, the University of Sussex unfortunately does not currently offer a formal PhD in clinical psychology or clinical neuropsychology. However, joining the &lt;strong&gt;Reality Bending Lab&lt;/strong&gt; will prepare you well to eventually pursue this type of program, as our methods and mindset draw heavily on neuropsychology (the use of neuropsychological tests, the focus on neurocognitive theories, etc.). In fact, some of our past members have become brilliant neuropsychologists, so feel free to ask them!&lt;/p&gt;
&lt;h3 id=&#34;how-to-work-on-psychedelics&#34;&gt;How to work on psychedelics?&lt;/h3&gt;
&lt;p&gt;Psychedelics and altered states of consciousness are a hot topic in psychology and neuroscience. Unfortunately, it is still &lt;em&gt;extremely&lt;/em&gt; difficult to obtain authorization to work with these substances. I would not recommend basing your PhD project on this possibility, as there is too great a risk that things will not work out (for ethical, administrative, or political reasons). That said, we do have projects running in collaboration with experts in the field, and we are always on the lookout for opportunities to work on these topics. Additionally, we think it&amp;rsquo;s also very interesting to study how altered states of consciousness can be induced &lt;em&gt;without&lt;/em&gt; external substances (e.g., through meditation, hypnosis, sensory deprivation, neural stimulation, &amp;hellip;), which might be a more sustainable and ethical way to approach these phenomena.&lt;/p&gt;
&lt;h2 id=&#34;application-advice&#34;&gt;Application Advice&lt;/h2&gt;
&lt;p&gt;A few tips for writing up your application dossier, in particular your CV and cover letter.
Note that these are general guidelines that also apply in other contexts (master&amp;rsquo;s programs, industry jobs, etc.).&lt;/p&gt;
&lt;p&gt;The key thing to keep in mind is that we receive a &lt;em&gt;&lt;strong&gt;lot&lt;/strong&gt;&lt;/em&gt; of applications (a few hundred for some positions). The first mistake to avoid is a generic, impersonal application: do address specific people (and &lt;strong&gt;do not misspell their names&lt;/strong&gt;; it happens often and is a turn-off), and try to concisely paint a profile of yourself that the recruiter can easily picture: your background, where you come from, and your expertise, interests and goals. This should really be one tightly written paragraph (you can expand on it in your CV). We often see long, convoluted CVs and cover letters that try to show &amp;ldquo;a bit of everything&amp;rdquo;, leaving the reader with little more than a sense of confusion.&lt;/p&gt;
&lt;p&gt;Next, after providing a clear depiction of who you are, you want to show that you have &lt;strong&gt;done your homework about where you are applying&lt;/strong&gt;: be specific about the people in the department (e.g., &amp;ldquo;I am particularly interested in working with Dr. X because of their work on Y&amp;rdquo;) or the papers (&amp;ldquo;I particularly enjoyed your paper on X because of Y&amp;rdquo;). This shows that you are motivated and not just sending the same application to 100 different places. That being said, do not list &lt;em&gt;everything&lt;/em&gt; written on someone&amp;rsquo;s website or profile, because it makes it look like you just copied and pasted it. Be genuine, personal and specific. It is tempting to use AI to generate these kinds of things, but I would advise against it. Putting in the time, effort, and hard work will pay off.&lt;/p&gt;
&lt;p&gt;Finally, you want to show that you are a good fit for the position: that you have experience with the methods used in the lab, familiarity with the field, and so on.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Research Assistant in Psychology / Neuroscience</title>
      <link>https://realitybending.github.io/jobs/assistant/</link>
      <pubDate>Thu, 02 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/assistant/</guid>
      <description>&lt;p&gt;
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; University of Sussex, Brighton, UK&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /media/ResearchAssistant_hu_12d205e384adbd5c.webp 400w,
               /media/ResearchAssistant_hu_13f4f996d205587a.webp 760w,
               /media/ResearchAssistant_hu_36fd80c5ae7ceb1c.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/media/ResearchAssistant_hu_12d205e384adbd5c.webp&#34;
               width=&#34;760&#34;
               height=&#34;428&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Working as a research assistant (RA) is a great opportunity to take on before eventually signing up for a PhD. It is a flexible and varied position, perfect for developing key research skills like writing, data analysis or neuroimaging, and for later pursuing a postgraduate program in psychology/neuroscience/neuropsychology.&lt;/p&gt;
&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    Funded RA positions at Sussex can be seen on the &lt;a href=&#34;https://www.sussex.ac.uk/about/jobs/research-assistant-ref-10411&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;recruitment page&lt;/a&gt;.
  &lt;/div&gt;
&lt;/div&gt;
&lt;h2 id=&#34;sussex-psychology-placement-year&#34;&gt;Sussex Psychology Placement Year&lt;/h2&gt;
&lt;p&gt;Students at the University of Sussex can also opt for a &lt;a href=&#34;https://www.sussex.ac.uk/study/undergraduate/courses/psychology-with-a-professional-placement-year-bsc&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;placement year&lt;/a&gt; between the second and third year of their studies. All undergraduates within the School are eligible to complete a placement year.&lt;/p&gt;
&lt;p&gt;While it might seem like it would &amp;ldquo;add&amp;rdquo; one year and &lt;em&gt;delay&lt;/em&gt; the end of your studies, the experience you gain is invaluable. A full placement year in a research lab is the best way to take on larger projects and acquire comprehensive research experience. Beyond providing you with a massive head start for the final year, it is a great opportunity to learn new skills and meet many researchers, which will help you refine your career trajectory and maximize your chances of &lt;strong&gt;achieving your goals&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;More information &lt;a href=&#34;http://www.sussex.ac.uk/psychology/internal/students/placements&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;here&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;sussex-junior-research-associate-jra&#34;&gt;Sussex Junior Research Associate (JRA)&lt;/h2&gt;
&lt;p&gt;Sussex also offers short &lt;strong&gt;summer internship&lt;/strong&gt; opportunities for undergrads interested in developing their research skills and experience. You can apply to the &lt;a href=&#34;http://www.sussex.ac.uk/suro/jra&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Sussex Junior Research Associate (JRA) program&lt;/strong&gt;&lt;/a&gt; to become a research associate and undertake an intensive eight-week research project over the summer break. Find out more about &lt;a href=&#34;http://www.sussex.ac.uk/suro/applying&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;how to apply here&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;See this &lt;a href=&#34;https://realitybending.github.io/post/2024-03-12-jingjra/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;blogpost&lt;/strong&gt;&lt;/a&gt; for a testimonial.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;sosocbio-undergraduate-summer-studentship&#34;&gt;SoCoBio Undergraduate Summer Studentship&lt;/h2&gt;
&lt;p&gt;Undergraduates residing in the UK can apply for a paid six-week internship (30 hours per week) between 1 July and 30 September.&lt;/p&gt;
&lt;p&gt;More information &lt;a href=&#34;https://southcoastbiosciencesdtp.ac.uk/undergraduate-summer-studentship-programme/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;here&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;international-junior-research-associate-ijra&#34;&gt;International Junior Research Associate (IJRA)&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;See here: &lt;a href=&#34;https://www.sussex.ac.uk/suro/current/ijra&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://www.sussex.ac.uk/suro/current/ijra&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;other-bursaries&#34;&gt;Other Bursaries&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://www.the-bns.org/grants&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Neuropsychology International Fellowship (NIF)&lt;/strong&gt;&lt;/a&gt;: Small bursaries from the British Neuropsychological Society to support short research internships.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;volunteer-research-assistant&#34;&gt;Volunteer Research Assistant&lt;/h2&gt;
&lt;p&gt;Check the &lt;a href=&#34;https://realitybending.github.io/jobs/intern/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;voluntary internships&lt;/a&gt; page for more information.&lt;/p&gt;
&lt;div class=&#34;alert alert-warning&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t rely on what is written!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Ask directly &lt;a href=&#34;https://realitybending.github.io/people/&#34;&gt;members of the team&lt;/a&gt; (current and past) about their experience in the lab!&lt;/p&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Internship in Psychology / Neuroscience</title>
      <link>https://realitybending.github.io/jobs/intern/</link>
      <pubDate>Thu, 02 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/intern/</guid>
      <description>&lt;p&gt;
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; University of Sussex, Brighton, UK&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /media/internship_hu_85470cd0f88be3af.webp 400w,
               /media/internship_hu_c276d7796d084aee.webp 760w,
               /media/internship_hu_25e23cc062c81292.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/media/internship_hu_85470cd0f88be3af.webp&#34;
               width=&#34;760&#34;
               height=&#34;428&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;If you are an undergraduate student, doing a &lt;strong&gt;voluntary internship&lt;/strong&gt; can be a good way to get some first exposure to research, develop useful skills and start growing your professional network. &lt;strong&gt;This is not by any means a necessity or a requirement&lt;/strong&gt;, so don&amp;rsquo;t worry &lt;em&gt;at all&lt;/em&gt; if you cannot afford (financially or in terms of availability) to volunteer, and check out instead the &lt;a href=&#34;https://realitybending.github.io/jobs/assistant/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;funded&lt;/a&gt; positions.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Sussex Psychology Placement Year&lt;/strong&gt;: Check out the &lt;a href=&#34;https://realitybending.github.io/jobs/assistant/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Research Assistant&lt;/a&gt; page&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Sussex Junior Research Assistant (JRA)&lt;/strong&gt;: Check out the &lt;a href=&#34;https://realitybending.github.io/jobs/assistant/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Research Assistant&lt;/a&gt; page&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;SoSocBio Undergraduate Summer Studentship&lt;/strong&gt;: Check out the &lt;a href=&#34;https://realitybending.github.io/jobs/assistant/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Research Assistant&lt;/a&gt; page&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Contact.&lt;/strong&gt; Send CV to &lt;strong&gt;&lt;a href=&#34;mailto:D.Makowski@sussex.ac.uk&#34;&gt;D.Makowski@sussex.ac.uk&lt;/a&gt;&lt;/strong&gt; and &lt;em&gt;explicitly&lt;/em&gt; say that you are interested in volunteering.&lt;/p&gt;
&lt;div class=&#34;alert alert-warning&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t rely on what is written!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Ask directly &lt;a href=&#34;https://realitybending.github.io/people/&#34;&gt;members of the team&lt;/a&gt; (current and past) about their experience in the lab!&lt;/p&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Research Projects</title>
      <link>https://realitybending.github.io/jobs/projects/</link>
      <pubDate>Thu, 02 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/projects/</guid>
      <description>&lt;p&gt;
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; University of Sussex, Brighton, UK&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /media/research_diagram_hu_2cf0bff64356c4b7.webp 400w,
               /media/research_diagram_hu_9696fb68ea7b6a6f.webp 760w,
               /media/research_diagram_hu_29eaf40393a7ff0f.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/media/research_diagram_hu_2cf0bff64356c4b7.webp&#34;
               width=&#34;760&#34;
               height=&#34;578&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id=&#34;projects&#34;&gt;Projects&lt;/h2&gt;
&lt;p&gt;Research in the &lt;strong&gt;Reality Bending Lab&lt;/strong&gt; focuses primarily on the physiological and neurocognitive underpinnings of reality perception and aspects of &lt;strong&gt;reality bending&lt;/strong&gt; (e.g., fiction, deception, fake news, illusions, and altered states of consciousness such as through meditation or immersion). Possible projects include (but are not limited to):&lt;/p&gt;
&lt;h3 id=&#34;how-do-we-know-what-is-real-and-what-does-it-change&#34;&gt;How do we know what is real? And what does it change?&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Real vs. Fake&lt;/strong&gt;: This project will typically involve presenting real vs. &amp;ldquo;fake&amp;rdquo; stimuli (e.g., fake news, AI-generated images, &amp;hellip;) to participants and investigating what interindividual/cognitive/emotional factors allow them to discriminate between the two.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Emotion regulation via fiction&lt;/strong&gt;: Believing that something is unreal (regardless of whether it actually is) seems to have ripple effects on various facets of our body and brain, such as emotions. This project studies the characteristics and potential use of fiction as an emotion regulation strategy. It can be focused on negative emotions (with threatening/unpleasant stimuli) or &amp;ldquo;positive&amp;rdquo; emotions (e.g., sexual arousal, attractiveness).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Authenticity &amp;amp; Neuroaesthetics&lt;/strong&gt;: This project investigates the effect of believing that an artwork is &amp;ldquo;forged&amp;rdquo; (e.g., an imitation of a great painter) on our appraisal of beauty.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;assessment-of-bodily-and-cognitive-abilities-and-their-relationship&#34;&gt;Assessment of bodily and cognitive abilities and their relationship&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Neuropsychological assessment of Cognitive Control&lt;/strong&gt;: This project focuses on the development, validation and improvement of a neuropsychological task to reliably measure &amp;ldquo;cognitive control&amp;rdquo; (executive functions). This project requires some interest in neuropsychological assessment, task development and associated technical skills (programming, game development).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Physiological Control&lt;/strong&gt;: development of measures (questionnaires, physiological tasks) measuring the ability to voluntarily regulate one&amp;rsquo;s physiology (e.g., heart rate, autonomic reactions, etc.).&lt;/li&gt;
&lt;li&gt;Relationship between &lt;strong&gt;Interoception&lt;/strong&gt; and higher-order functions: This project involves measuring various aspects of our relationship with our body (e.g., by measuring cardiac activity) and analyzing their relationship with cognitive abilities (e.g., self-control, emotion regulation) or higher-order constructs (e.g., primal world beliefs).&lt;/li&gt;
&lt;li&gt;Secondary &lt;strong&gt;EEG data analysis&lt;/strong&gt;: Investigating an existing dataset containing resting-state EEG signal, from which one would extract features to try to predict dispositional indices (such as primal world beliefs).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;non-invasive-induction-of-altered-states-of-consciousness&#34;&gt;Non-invasive induction of altered states of consciousness&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Emotion regulation via distancing&lt;/strong&gt;: by instructing people to change their state of mind (e.g., &amp;ldquo;see the events in a detached manner, like a fly on the wall&amp;rdquo;), we hope to manipulate aspects of the sense of reality - such as absorption and psychological distance - and compare its effect (in particular on emotions) with that of other emotion regulation strategies.&lt;/li&gt;
&lt;li&gt;Can we &lt;strong&gt;manipulate the state of consciousness&lt;/strong&gt; and observe actual effects on performance in various cognitive tasks? For instance, via hypnosis or mindfulness-like instructions, sound stimulation (binaural beats, drumming, &amp;hellip;), or sensory deprivation (&amp;ldquo;floating&amp;rdquo; tanks). We study the role of expectations and try to isolate the mechanism of change.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Qualitative + quantitative&lt;/strong&gt; project: Investigation into the world of &amp;ldquo;reality shifters&amp;rdquo;, people claiming to have shifted between realities, to understand their language, personality, etc.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;metascience--data-science--software-development&#34;&gt;Metascience / Data science / Software development&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Investigation of &lt;strong&gt;scientific practices&lt;/strong&gt;: to what extent do scientists engage in &amp;ldquo;new&amp;rdquo; practices (e.g., open science, slow science, preregistration, registered reports, &amp;hellip;) and what factors (e.g., career level, time, ideology, &amp;hellip;) drive their adoption (or lack thereof)? This might involve validating assessment tools (such as questionnaires or gamified measures like quizzes), distributing them widely and analysing the results.&lt;/li&gt;
&lt;li&gt;Improving psychologists&amp;rsquo; access to &lt;strong&gt;R&lt;/strong&gt;: R outputs can be complex, and we are developing tools to facilitate their understanding (e.g., the &lt;strong&gt;report&lt;/strong&gt; package). This project involves implementing functions in R to help communicate and interpret statistical results, and requires some interest in programming and stats.&lt;/li&gt;
&lt;li&gt;Neurophysiological signal analysis in &lt;strong&gt;Python&lt;/strong&gt;: implementation and validation of new algorithms in Python, related for instance to chaos theory, EEG signal analysis, etc. This project requires some interest in programming, computer science and mathematics.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Psychophysiological methods&lt;/strong&gt;: What is the optimal electrode configuration for recording skin conductance responses (often used as a marker of emotions)?&lt;/li&gt;
&lt;li&gt;Role of beauty in science: Is the impact of research publications related to the aesthetic qualities of their figures?&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;other--collaborations&#34;&gt;Other / Collaborations&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;With &lt;a href=&#34;https://canvas.sussex.ac.uk/courses/30420/pages/theodoros-karapanagiotidis-2-2&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Theodoros Karapanagiotidis&lt;/a&gt;: project involving secondary behavioural data analysis, exploring questions about the nature of thoughts, their patterns, the impact of mood and ongoing experience, and how they vary in real-life settings. By analysing existing data, students will be able to examine the links between ongoing thoughts, brain structure and function, and their potential implications for mental health and well-being.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Some of these projects share strong links with philosophical concepts (e.g., the &lt;a href=&#34;https://en.wikipedia.org/wiki/Paradox_of_fiction&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;paradox of fiction&lt;/a&gt;) and/or carry some clinical relevance (e.g., for the understanding and treatment of mood/psychotic/dissociative disorders). Also, note that the lab is radically engaged in open science and quantitative methodologies: thus, &lt;strong&gt;most projects would typically require a substantial use of R&lt;/strong&gt; (or Python) at some stage. However, it&amp;rsquo;s totally okay not to feel proficient in these skills at the start; the important thing is to be interested and motivated to learn.&lt;/p&gt;
&lt;p&gt;Projects might be conducted individually, in pairs, or in a group, depending on needs. Attending weekly lab meetings is also expected.&lt;/p&gt;
&lt;h2 id=&#34;skills&#34;&gt;Skills&lt;/h2&gt;
&lt;p&gt;Joining the &lt;strong&gt;Reality Bending Lab&lt;/strong&gt; will help you develop unique skills that you might not find in other labs and that will give an edge to your profile for future applications. These include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Neuroimaging (EEG)&lt;/li&gt;
&lt;li&gt;Psychophysiology (multimodal bodily recordings)&lt;/li&gt;
&lt;li&gt;Computational Bayesian modelling with R&lt;/li&gt;
&lt;li&gt;Advanced programming with Python&lt;/li&gt;
&lt;li&gt;Open science best practices (using GitHub and various cutting edge tools)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;advice-for-students&#34;&gt;Advice for Students&lt;/h2&gt;
&lt;h3 id=&#34;choosing-options&#34;&gt;Choosing Options&lt;/h3&gt;
&lt;p&gt;Most psychology programs offer some optionality, i.e., modules that you can pick.
While many of them look interesting, you often have to make hard choices.
Many students pick what looks interesting from various psychology domains (e.g., a bit of social psychology, a module from cognitive, one from clinical, etc.).
They might also believe that picking a variety of options will provide them with a &lt;strong&gt;multidisciplinary profile&lt;/strong&gt;, which might be valued later on.&lt;/p&gt;
&lt;p&gt;While this is, in principle, true, in practice &lt;strong&gt;a &amp;ldquo;consistent&amp;rdquo; profile is often much more appealing to recruiters&lt;/strong&gt;. For instance, having a set of clinically-relevant options, or cognitive/neuroscience ones, will give you an edge (and sometimes, even for, say, a &amp;ldquo;neuroscience&amp;rdquo; opportunity, recruiters would prefer a clearly clinical profile rather than a &amp;ldquo;jack of all trades, master of none&amp;rdquo; profile). &lt;strong&gt;Make your choices wisely, and make them with a plan.&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;But I &lt;em&gt;don&amp;rsquo;t know&lt;/em&gt; what I want to do later, so I want to keep most doors open.&lt;/p&gt;&lt;/blockquote&gt;
&lt;p&gt;Yes, that&amp;rsquo;s a common issue. You can still keep doors open while at the same time having a coherent profile. You &lt;em&gt;should&lt;/em&gt; at least have an idea of what subbranch of psychology you &lt;em&gt;don&amp;rsquo;t&lt;/em&gt; want to do (e.g., social psychology).&lt;/p&gt;
&lt;p&gt;For psychology students at Sussex, if you would like to work with me, I recommend picking some of the following options:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Cognitive Neuroscience &lt;em&gt;(must have)&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;Conscious and Unconscious Mental Processes&lt;/li&gt;
&lt;li&gt;Biological Psychology of Mental Health&lt;/li&gt;
&lt;li&gt;Perspectives on Psychology&lt;/li&gt;
&lt;li&gt;Self Regulation: The Science of Achieving Your Goals&lt;/li&gt;
&lt;li&gt;Attention: Distraction, Daydreaming and Diversity&lt;/li&gt;
&lt;li&gt;Drugs, Brain and Behaviour&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&#34;alert alert-warning&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t rely on what is written!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Ask directly &lt;a href=&#34;https://realitybending.github.io/people/&#34;&gt;members of the team&lt;/a&gt; (current and past) about their experience in the lab!&lt;/p&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Companion</title>
      <link>https://realitybending.github.io/jobs/companion/</link>
      <pubDate>Sun, 23 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/jobs/companion/</guid>
      <description>&lt;p&gt;
  &lt;i class=&#34;fa fa-calendar  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Flexible

  &lt;i class=&#34;fa fa-location-pin  pr-1 fa-fw&#34;&gt;&lt;/i&gt; Anywhere&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /media/companion_hu_3aeee68a2d719be9.webp 400w,
               /media/companion_hu_898385818f109951.webp 760w,
               /media/companion_hu_ee2253a31c558752.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/media/companion_hu_3aeee68a2d719be9.webp&#34;
               width=&#34;760&#34;
               height=&#34;397&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;During the Middle Ages and the Renaissance, becoming a master craftsman or artist required undertaking a long journey through which the novice would gain the rank of master via &lt;em&gt;&amp;ldquo;companionship&amp;rdquo;&lt;/em&gt;. The learner would study under a teacher for a given period of time, after which she or he would move on to a new adventure.&lt;/p&gt;
&lt;p&gt;You&amp;rsquo;re a student, a researcher-in-training, in-between jobs or looking for one, and &lt;strong&gt;you feel like you have time&lt;/strong&gt; &lt;em&gt;(and energy ⚡)&lt;/em&gt; &lt;strong&gt;to learn more?&lt;/strong&gt; And you also would like to &lt;strong&gt;learn new skills&lt;/strong&gt; and contribute to open-science? But you would also like this training to result in something academically valuable (like a publication)? In other words, &lt;strong&gt;you want it all&lt;/strong&gt;?&lt;/p&gt;
&lt;p&gt;Good news, we might have something of interest for you. Being in touch with several open-access projects, I know some topics and areas in which there is a need for contributors, with different projects just waiting for some brilliant mind to push them forward. These are interesting things you can investigate in your free time (but not at the expense of your main objectives, i.e., &lt;strong&gt;don&amp;rsquo;t drop school for that!&lt;/strong&gt;). Depending on your current skills - &lt;strong&gt;but most importantly the skills you want to develop&lt;/strong&gt; - you can check out the list below to see if there is anything that could be of interest to you. If that&amp;rsquo;s the case, do contact us; we will provide you with assistance, guidance and help so you can start exploring it at your own pace in a comfortable environment. It can be a good way to get in touch with us, be mentored, learn and initiate collaborations or future projects with funded positions :)&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; &lt;strong&gt;A GAM-based Approach to EEG/ERP Analysis&lt;/strong&gt;. Generalized Additive Models (GAMs) are a powerful class of regression models that seem very appropriate for modelling ERP data. We have a &lt;a href=&#34;https://neurokit2.readthedocs.io/en/latest/studies/erp_gam.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;draft&lt;/a&gt; of a study that aims to be a tutorial/guide on analyzing ERPs using GAMs.
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Skills that you will improve&lt;/em&gt;: Python (MNE), R, ERP/EEG, GAMs.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; &lt;strong&gt;Deep learning model for ECG delineation&lt;/strong&gt;. Train, validate and make available (in &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;NeuroKit&lt;/a&gt;) a model able to locate the different components of ECG signals (the different peaks, waves, etc.). Some work has already been done (&lt;a href=&#34;https://github.com/neuropsychology/NeuroKit/issues/89#issuecomment-653834058&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;1&lt;/a&gt;), so we have a basis on which to improve such a tool and make it accessible.
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Skills that you will improve&lt;/em&gt;: Python, deep learning, ECG.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; &lt;strong&gt;Benchmarking of ECG Preprocessing Methods&lt;/strong&gt;. It really makes me crazy to see that there is no consensus nor guidelines on how to preprocess physiological signals. Time to remedy that. We have a &lt;a href=&#34;https://neurokit2.readthedocs.io/en/latest/studies/ecg_preprocessing.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;draft&lt;/a&gt; of a study that aims at comparing different preprocessing methods to outline the best processing workflow.
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Skills that you will improve&lt;/em&gt;: Python, signal processing, ECG.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; &lt;strong&gt;EOG events templates&lt;/strong&gt;. We have a &lt;a href=&#34;https://neurokit2.readthedocs.io/en/latest/studies/eog_blinktemplate.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;draft&lt;/a&gt; of a study that aims at describing the different events in EOG signals (blinks, saccades, &amp;hellip;) and creating their statistical templates.
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Skills that you will improve&lt;/em&gt;: Python, signal processing, EOG.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled=&#34;&#34; type=&#34;checkbox&#34;&gt; &lt;strong&gt;The report package&lt;/strong&gt;. The &lt;a href=&#34;https://github.com/easystats/report&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;report&lt;/strong&gt;&lt;/a&gt; package is the pinnacle of the &lt;strong&gt;easystats&lt;/strong&gt; project, and one of its most demanded and used components. Unfortunately, it is a bit stuck at the moment and would benefit from some fresh perspective. Are you up for the challenge?
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Skills that you will improve&lt;/em&gt;: R, statistics, methodological best practices.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;em&gt;Note that the amount of engagement and time you want to devote to such a side-project is entirely up to you. As it is 100% based on self-engagement, we won&amp;rsquo;t ask for any target goals, so no pressure. Joining our network and working on these things should always be interesting (for you), useful (to you), and fun (for you&amp;hellip; and us ☺️).&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Contact.&lt;/strong&gt; Send email to &lt;a href=&#34;mailto:dom.makowski@gmail.com&#34;&gt;dom.makowski@gmail.com&lt;/a&gt;.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>How to send event triggers to Lab Streaming Layer from JsPsych</title>
      <link>https://realitybending.github.io/post/2026-01-09-eventtriggers/</link>
      <pubDate>Sun, 11 Jan 2026 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2026-01-09-eventtriggers/</guid>
      <description>&lt;p&gt;Hello there! 👋 Let&amp;rsquo;s learn how to send event triggers to Lab Streaming Layer (LSL) from JsPsych.&lt;/p&gt;
&lt;p&gt;Let&amp;rsquo;s start with some basics!&lt;/p&gt;
&lt;h2 id=&#34;what-does-this-mean-and-when-is-this-useful&#34;&gt;What does this mean and when is this useful?&lt;/h2&gt;
&lt;p&gt;Lab Streaming Layer (LSL) is a system used to receive, synchronise and stream signals from multiple inputs during experiments. LSL is designed to help researchers easily compare their data across multiple technologies, as time synchrony is integral to meaningful analyses.&lt;/p&gt;
&lt;p&gt;When collecting data on stimulus response during experiments, it&amp;rsquo;s important that the stimulus onset is recorded precisely, especially if it is mapped onto physiological responses, as this has implications for how we interpret our data. One good example of a use case is when you have a Muse headband or any other device that can stream via LSL, and you want to precisely mark events in it.&lt;/p&gt;
&lt;p&gt;Event triggers are coded into JsPsych online experiments to accurately mark events, such as when a stimulus appeared on the screen. We have found this method to yield the most precise timestamps of events, compared to alternative methods such as using the Bitalino LUX.&lt;/p&gt;
&lt;p&gt;This tutorial will explain how to set this up for an experiment situated on GitHub, although you can adapt this for your hosting platform.&lt;/p&gt;
&lt;p&gt;This blog will help you understand the set-up for event triggers. For an example of this in action, refer to &lt;a href=&#34;https://github.com/OliverACollins/muse-athena-test/tree/main/blackwhite&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://github.com/OliverACollins/muse-athena-test/tree/main/blackwhite&lt;/a&gt;. This experiment recorded markers on a screen turning from white to black; you may want to follow along with lsl_bridge.py and blackwhite_jspsych.html&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; for the full implementation.&lt;/p&gt;
&lt;h2 id=&#34;how-to-set-up-event-triggers&#34;&gt;How to set up event triggers&lt;/h2&gt;
&lt;h3 id=&#34;requirements&#34;&gt;Requirements&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Your experiment is in-person:&lt;/strong&gt; In order to use these event triggers, you will need to run your experiment on a local host server, which will need to be manually set up for each session. Therefore, this setup is intended for in-person experiments in which a researcher sets up the participant&amp;rsquo;s screen.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;You have two machines, one for the participant, and one for the researcher:&lt;/strong&gt; The participant&amp;rsquo;s machine will display the experiment and send the markers to the researcher&amp;rsquo;s machine, which will record the LSL stream events.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;All machines must be on the same network connection:&lt;/strong&gt; To send the markers from the participant&amp;rsquo;s machine to the researcher&amp;rsquo;s recording machine, we must run the experiment on a web server which directs the markers to the recording machine via its ipv4 address.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
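As a side note, the recording machine&amp;rsquo;s IPv4 address (the one the participant&amp;rsquo;s browser will send markers to) can be found with ipconfig on Windows or ifconfig / ip addr on macOS/Linux. If you prefer, here is a small cross-platform Python sketch (our own utility, not part of any LSL package; the 8.8.8.8 address is only used to make the OS pick the outgoing network interface, and no packet is actually sent):

```python
import socket

def local_ipv4():
    # Opening a UDP socket and 'connecting' it forces the OS to select
    # the outgoing interface, whose address is what other machines on
    # the same network should use to reach this one.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(('8.8.8.8', 80))  # UDP connect sends no traffic
        return s.getsockname()[0]
    except OSError:
        return '127.0.0.1'  # fallback when no network route exists
    finally:
        s.close()

print(local_ipv4())
```

The address printed here is the one to plug into the jsPsych experiment as the bridge's destination.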
&lt;h3 id=&#34;the-lsl-bridge-script&#34;&gt;The LSL bridge script&lt;/h3&gt;
&lt;p&gt;The LSL bridge Python script is responsible for actually sending the markers to LSL: it &amp;lsquo;listens&amp;rsquo; for messages from the browser on which the participant is doing the experiment, and converts them into &amp;lsquo;markers&amp;rsquo; for your recording software (such as LabRecorder) to receive.&lt;/p&gt;
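Before looking at the full script, the listen-and-convert pattern can be sketched with the Python standard library alone. Note that the endpoint and parameter names below (a GET request to /marker?value=...) are illustrative assumptions, and the actual LSL push (something like outlet.push_sample([marker]) in the real bridge) is replaced here by appending to a list, so the sketch runs without mne_lsl installed:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

received = []  # stand-in for the LSL outlet in this sketch

class MarkerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse the marker value out of the query string sent by the browser
        query = parse_qs(urlparse(self.path).query)
        marker = query.get('value', [''])[0]
        received.append(marker)  # real bridge would push to an LSL outlet here
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet

# Port 0 asks the OS for any free port; the real bridge uses a fixed one (e.g. 5000)
server = HTTPServer(('127.0.0.1', 0), MarkerHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate what the jsPsych experiment does from the participant's browser
port = server.server_address[1]
urllib.request.urlopen(f'http://127.0.0.1:{port}/marker?value=stimulus_onset')
server.shutdown()
print(received)  # ['stimulus_onset']
```

The full script below follows the same structure, but pushes each marker into an LSL stream so that LabRecorder can pick it up alongside the physiological signals.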
&lt;details&gt;
&lt;summary&gt;See an example of a full script&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;from&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;http.server&lt;/span&gt; &lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;BaseHTTPRequestHandler&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;HTTPServer&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;from&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;urllib.parse&lt;/span&gt; &lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;urlparse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;parse_qs&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;from&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;mne_lsl.lsl&lt;/span&gt; &lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;StreamInfo&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;StreamOutlet&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;local_clock&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;threading&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# CONFIG&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;LSL_STREAM_NAME&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;jsPsychMarkers&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;LSL_STREAM_TYPE&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;Markers&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;LSL_SOURCE_ID&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;jspsych-lsl-bridge&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;SERVER_HOST&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;0.0.0.0&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;SERVER_PORT&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;5000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Create an LSL outlet for event markers&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;info&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;StreamInfo&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LSL_STREAM_NAME&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;stype&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LSL_STREAM_TYPE&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;n_channels&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;sfreq&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;dtype&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;string&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;source_id&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LSL_SOURCE_ID&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;desc&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;info&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;desc&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;desc&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child_value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;manufacturer&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;jsPsych&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;channels&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;desc&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;channels&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ch&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;channels&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;channel&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ch&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child_value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;label&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;JsPsychMarker&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ch&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child_value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;unit&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;string&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ch&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;append_child_value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;type&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;Marker&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;outlet&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;StreamOutlet&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;info&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# HTTP Request Handler&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;class&lt;/span&gt; &lt;span class=&#34;nc&#34;&gt;MarkerHandler&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;BaseHTTPRequestHandler&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;):&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;k&#34;&gt;def&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;do_GET&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;):&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;n&#34;&gt;parsed&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;urlparse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;path&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;n&#34;&gt;params&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;parse_qs&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;parsed&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;query&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;n&#34;&gt;path&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;parsed&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;path&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;if&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;path&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;==&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;/sync&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;c1&#34;&gt;# Return current LSL clock to JS&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;local_clock&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;send_response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;end_headers&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;wfile&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;write&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;str&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;encode&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;())&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;elif&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;path&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;==&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;/marker&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;n&#34;&gt;value&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;params&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;get&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;value&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;1&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;])[&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;n&#34;&gt;ts_js&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;params&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;get&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;ts&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;kc&#34;&gt;None&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;])[&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;k&#34;&gt;if&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ts_js&lt;/span&gt; &lt;span class=&#34;ow&#34;&gt;is&lt;/span&gt; &lt;span class=&#34;ow&#34;&gt;not&lt;/span&gt; &lt;span class=&#34;kc&#34;&gt;None&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;float&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ts_js&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;k&#34;&gt;else&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;local_clock&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;n&#34;&gt;outlet&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;push_sample&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;([&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;],&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nb&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;f&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;→ Marker &lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt; @ &lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ts&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;:&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;.6f&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;send_response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;end_headers&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;wfile&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;write&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;b&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;OK&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;else&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;send_response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;404&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;bp&#34;&gt;self&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;end_headers&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;def&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;run_server&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;():&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;server_address&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;SERVER_HOST&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;SERVER_PORT&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;httpd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;HTTPServer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;server_address&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;MarkerHandler&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nb&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;f&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;se&#34;&gt;\n&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;[LSL Bridge] Serving on http://&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;SERVER_HOST&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;:&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;SERVER_PORT&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nb&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;f&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;[LSL Bridge] Stream &amp;#39;&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LSL_STREAM_NAME&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#39; ready for LabRecorder.&lt;/span&gt;&lt;span class=&#34;se&#34;&gt;\n&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;httpd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;serve_forever&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ---------------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;if&lt;/span&gt; &lt;span class=&#34;vm&#34;&gt;__name__&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;==&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;__main__&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;server_thread&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;threading&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Thread&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;target&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;run_server&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;server_thread&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;start&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
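&lt;p&gt;The query-string handling in the &lt;code&gt;/marker&lt;/code&gt; route above boils down to a few lines of &lt;code&gt;urllib&lt;/code&gt;: &lt;code&gt;value&lt;/code&gt; falls back to &lt;code&gt;&amp;quot;1&amp;quot;&lt;/code&gt;, and a missing &lt;code&gt;ts&lt;/code&gt; means the server clock will be used instead. A minimal, self-contained sketch (the helper name is just for illustration):&lt;/p&gt;

```python
# Sketch of the query-string parsing performed by the /marker handler.
from urllib.parse import urlparse, parse_qs

def read_marker_params(path):
    """Return (value, ts) from a /marker request path; ts is None if absent."""
    parsed = urlparse(path)
    params = parse_qs(parsed.query)
    value = params.get("value", ["1"])[0]   # marker name, default "1"
    ts_js = params.get("ts", [None])[0]     # browser-computed LSL timestamp
    return value, (float(ts_js) if ts_js is not None else None)

print(read_marker_params("/marker?value=stim_onset&ts=12.345"))
# → ('stim_onset', 12.345)
print(read_marker_params("/marker"))
# → ('1', None)
```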
&lt;h3 id=&#34;configuration-of-the-python-script&#34;&gt;Configuration of the Python Script&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Set-up&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Imports&lt;/strong&gt;: Load the standard Python libraries for creating a web server (&lt;code&gt;http.server&lt;/code&gt;, &lt;code&gt;urllib&lt;/code&gt;) and the &lt;code&gt;mne_lsl&lt;/code&gt; library to handle the data streaming.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Configure Variables&lt;/strong&gt;: Name the event trigger stream so you can find it in LabRecorder, e.g. &lt;code&gt;LSL_STREAM_NAME = &amp;quot;jsPsychMarkers&amp;quot;&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Add &lt;code&gt;SERVER_HOST = &amp;quot;0.0.0.0&amp;quot;&lt;/code&gt; to tell the server to listen on all available network interfaces, so that the participant&amp;rsquo;s computer can reach the researcher&amp;rsquo;s.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Specify the port the server listens on, e.g. &lt;code&gt;SERVER_PORT = 5000&lt;/code&gt;. You will reference this port in the HTML script that hosts the online experiment, so that it can send its markers to this Python script.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create the LSL outlet for event triggers&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Define the metadata for the stream: &lt;code&gt;info = StreamInfo(name=LSL_STREAM_NAME, ...)&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create the outlet object that will push data out to the network: &lt;code&gt;outlet = StreamOutlet(info)&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create the request handler to define what happens when the participant&amp;rsquo;s browser contacts the server.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;The sync route:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;code&gt;if path == &amp;quot;/sync&amp;quot;:&lt;/code&gt;: Checks if the browser is asking to synchronize clocks.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;code&gt;ts = local_clock()&lt;/code&gt;: Grabs the current high-precision time from the LSL clock on the Recording Machine.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;code&gt;self.wfile.write(...)&lt;/code&gt;: Sends this timestamp back to the browser. The browser needs this to calculate the time difference (offset) between the two computers.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The marker route:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;elif path == &amp;quot;/marker&amp;quot;&lt;/code&gt;: Checks if the browser is trying to send an event marker.&lt;/li&gt;
&lt;li&gt;It extracts &lt;code&gt;value&lt;/code&gt; (the marker name, e.g., &amp;ldquo;1&amp;rdquo;) and &lt;code&gt;ts&lt;/code&gt; (the timestamp calculated by the browser) from the URL parameters.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;outlet.push_sample([value], ts)&lt;/code&gt;: This is the most important line. It injects the marker into the LSL stream &lt;em&gt;using the timestamp provided by the browser&lt;/em&gt;. This ensures that even if there is network lag, the timestamp recorded in the EEG data remains accurate to when the event actually happened on the participant&amp;rsquo;s screen.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the Server&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;run_server()&lt;/code&gt;: Starts the HTTP server and prints a confirmation message that it is ready for LabRecorder.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;threading.Thread(...)&lt;/code&gt;: Runs the server in a separate thread so it doesn&amp;rsquo;t block the main Python process.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
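&lt;p&gt;To make the sync route concrete, here is the offset arithmetic the browser side performs, sketched with simulated clocks. It assumes the &lt;code&gt;/sync&lt;/code&gt; reply reflects the LSL clock halfway through the request&amp;rsquo;s round trip; the numbers are made up for illustration.&lt;/p&gt;

```python
# Clock-mapping arithmetic behind the /sync handshake (simulated clocks).
# Assumption: the server reads local_clock() halfway through the round trip.

def estimate_offset(t_sent, server_time, t_received):
    """Offset to add to a browser timestamp to express it on the LSL clock."""
    midpoint = (t_sent + t_received) / 2.0
    return server_time - midpoint

# Browser sends /sync at 100.0 s, receives the reply at 100.2 s, and the
# reply says the LSL clock read 5000.1 s at that moment.
offset = estimate_offset(100.0, 5000.1, 100.2)

# A stimulus shown at 101.0 s on the browser clock maps to the LSL clock as:
print(round(101.0 + offset, 3))  # → 5001.0
```

This is why the &lt;code&gt;ts&lt;/code&gt; parameter sent to &lt;code&gt;/marker&lt;/code&gt; stays accurate even when the HTTP request itself arrives late.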
&lt;h3 id=&#34;configuration-of-the-jspsych-html-script&#34;&gt;Configuration of the JsPsych HTML script&lt;/h3&gt;
&lt;details&gt;
&lt;summary&gt;See an example of a full script&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-html&#34; data-lang=&#34;html&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;!DOCTYPE html&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;html&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;head&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;title&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;Black/White Muse Synchronisation Test&lt;span class=&#34;p&#34;&gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;title&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;na&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;https://unpkg.com/jspsych@7.3.4&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;na&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;https://unpkg.com/@jspsych/plugin-html-keyboard-response@1.1.3&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;na&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;https://unpkg.com/@jspsych/plugin-image-keyboard-response@1.1.3&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;na&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;https://unpkg.com/@jspsych/plugin-preload@1.1.3&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;na&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;https://unpkg.com/@jspsych/plugin-fullscreen@2.1.0&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;link&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;href&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;https://unpkg.com/jspsych@7.3.4/css/jspsych.css&amp;#34;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;rel&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;stylesheet&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;/head&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;body&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;/body&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// -----------------------
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// LSL bridge (promise-based)
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// -----------------------
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Offset (in seconds) between the LSL clock and performance.now() / 1000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;kc&#34;&gt;null&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;syncLSL&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;k&#34;&gt;return&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;new&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;Promise&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kr&#34;&gt;async&lt;/span&gt; &lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;resolve&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;reject&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;try&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;offsets&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;k&#34;&gt;for&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;i&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;i&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;i&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;++&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;c1&#34;&gt;// Estimate the clock offset at the midpoint of the request round-trip&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;startPerf&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;performance&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;now&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;resp&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;await&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;fetch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;http://139.184.128.202:5000/sync&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;cache&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;no-store&amp;#34;&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;text&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;await&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;resp&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;lslTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;parseFloat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;endPerf&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;performance&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;now&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;perfMid&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;startPerf&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;endPerf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;2&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;nx&#34;&gt;offsets&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;push&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;lslTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;perfMid&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;1000&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;kr&#34;&gt;await&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;new&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;Promise&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;((&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;r&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&amp;gt;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;setTimeout&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;r&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;100&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;// Short delay between syncs
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;            &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;offsets&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;reduce&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;((&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;a&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;b&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&amp;gt;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;a&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;b&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;offsets&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;length&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;log&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;LSL sync done (averaged):&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;resolve&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;catch&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;e&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;error&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;LSL sync exception:&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;e&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;reject&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;e&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;sendMarker&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;value&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;1&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;// If not synced, still send marker (server will timestamp with local_clock())
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;    &lt;span class=&#34;k&#34;&gt;if&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;===&lt;/span&gt; &lt;span class=&#34;kc&#34;&gt;null&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;warn&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;LSL not synced yet - sending without JS timestamp&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;fetch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;http://139.184.128.202:5000/marker?value=&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;encodeURIComponent&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;then&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;log&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;sent marker (no-ts)&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;catch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;error&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;Marker send error:&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;return&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;ts&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;lslBaseTime&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;performance&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;now&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;1000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;url&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;http://139.184.128.202:5000/marker?value=&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;encodeURIComponent&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;&amp;amp;ts=&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;encodeURIComponent&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;ts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nx&#34;&gt;fetch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;url&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;then&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;log&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;sent marker&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;ts&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;ts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;catch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;error&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;Marker send error:&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cm&#34;&gt;/* --------------------------
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cm&#34;&gt;   Experiment Definition
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cm&#34;&gt;--------------------------- */&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;startExperiment&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;initJsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;({&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;override_safe_mode&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;kc&#34;&gt;true&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;});&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[];&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;cm&#34;&gt;/* preload */&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;push&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;({&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;type&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsychPreload&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;images&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;white.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;black.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;});&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;cm&#34;&gt;/* stimuli */&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;test_stimuli&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;stimulus&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;white.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;duration&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;750&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;marker&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;},&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;stimulus&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;black.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;duration&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;500&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;marker&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;1&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;];&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;cm&#34;&gt;/* trial */&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;test&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;type&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsychImageKeyboardResponse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;stimulus&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;timelineVariable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;stimulus&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;trial_duration&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;timelineVariable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;duration&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;choices&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;NO_KEYS&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;marker&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;timelineVariable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;marker&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;},&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;on_start&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;trial&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nx&#34;&gt;requestAnimationFrame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&amp;gt;&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                &lt;span class=&#34;nx&#34;&gt;sendMarker&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;trial&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;marker&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;p&#34;&gt;});&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;};&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;loop_node&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;test&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;],&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;nx&#34;&gt;timeline_variables&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;test_stimuli&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;nx&#34;&gt;randomize_order&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;kc&#34;&gt;false&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;p&#34;&gt;}],&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nx&#34;&gt;loop_function&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;k&#34;&gt;return&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;getTotalTime&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;2100000&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;};&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;push&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;loop_node&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;run&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// -----------------------
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Run: first sync then start experiment
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// -----------------------
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;syncLSL&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;then&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;startExperiment&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;catch&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;function&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;console&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;warn&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;Proceeding without LSL sync (sync failed):&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;err&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;startExperiment&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;script&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;&amp;lt;/&lt;/span&gt;&lt;span class=&#34;nt&#34;&gt;html&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;p&gt;The code sending triggers from the browser to the LSL bridge script is written in JavaScript. It is &amp;lsquo;promise-based&amp;rsquo;, meaning that asynchronous operations such as network requests resolve once a response arrives rather than blocking the experiment. The code tracks the exact time on the participant&amp;rsquo;s computer and sends markers precisely aligned to the LSL clock.&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Synchronisation&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;code&gt;var lslBaseTime = null&lt;/code&gt;: A variable to store the calculated time difference between the two computers.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The function &lt;code&gt;syncLSL() {...}&lt;/code&gt; loops three times so that an average reading can be taken.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Request the current LSL clock time from the bridge: &lt;code&gt;fetch(&amp;quot;http://.../sync&amp;quot;)&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Record &lt;code&gt;startPerf&lt;/code&gt; (when the request left) and &lt;code&gt;endPerf&lt;/code&gt; (when the answer came back).&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Assume the server read its clock exactly halfway between start and end (&lt;code&gt;perfMid&lt;/code&gt;).&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Calculate the difference between the browser&amp;rsquo;s clock and the LSL clock: &lt;code&gt;offsets.push(lslTime - perfMid / 1000)&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Finally, average these offsets into &lt;code&gt;lslBaseTime&lt;/code&gt;. Now the browser knows how to convert its own time to LSL time.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Sending markers&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Safety check in case sync fails: &lt;code&gt;if (lslBaseTime === null)&lt;/code&gt; falls back to sending markers based on the participant&amp;rsquo;s computer&amp;rsquo;s own clock, without synchronisation, which is less accurate but better than nothing.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Convert the browser&amp;rsquo;s current time into LSL time by adding the stored offset, so the generated timestamp aligns with the recording computer&amp;rsquo;s clock: &lt;code&gt;var ts = lslBaseTime + performance.now() / 1000&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Send the marker name and this calculated timestamp to the Python bridge: &lt;code&gt;fetch(url)&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
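&lt;p&gt;The offset arithmetic described above can be sketched as standalone functions (a simplified illustration; the function and variable names mirror those in the snippet):&lt;/p&gt;

```javascript
// Sketch of the clock-offset logic (illustrative only).

// startPerf / endPerf: performance.now() readings (ms) taken just before
// the /sync request was sent and just after the reply arrived.
// lslTime: the LSL clock value (in seconds) returned by the bridge.
function computeOffset(startPerf, endPerf, lslTime) {
    // Assume the bridge read its clock halfway through the round trip.
    const perfMid = (startPerf + endPerf) / 2;
    return lslTime - perfMid / 1000; // convert ms to s before subtracting
}

// Average several readings to smooth out network jitter.
function averageOffset(offsets) {
    return offsets.reduce((sum, o) => sum + o, 0) / offsets.length;
}

// Convert a browser timestamp (ms, e.g. performance.now()) to LSL time (s).
function toLslTime(lslBaseTime, perfNowMs) {
    return lslBaseTime + perfNowMs / 1000;
}
```

&lt;p&gt;Averaging several round trips matters because any single request can be delayed asymmetrically by the network, which would bias the midpoint estimate.&lt;/p&gt;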
&lt;h3 id=&#34;usage&#34;&gt;Usage&lt;/h3&gt;
&lt;p&gt;On the researcher&amp;rsquo;s machine:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Temporarily disable the firewall on the researcher&amp;rsquo;s machine (or add an exception allowing inbound connections on port 5000).&lt;/li&gt;
&lt;li&gt;Open the LSL bridge script and open a terminal (we use VS Code). Type &lt;code&gt;ipconfig&lt;/code&gt; to retrieve the IPv4 address.&lt;/li&gt;
&lt;li&gt;In the terminal, type &lt;code&gt;pip install mne-lsl&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Select all text in the bridge script and run it (Ctrl+A, Shift+Enter).&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;On the participant&amp;rsquo;s machine:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Open the HTML script. You will need to ensure the terminal&amp;rsquo;s working directory points to your experiment folder; the easiest way to do this is to open your repository folder in VS Code.&lt;/li&gt;
&lt;li&gt;Copy the IPv4 address from the researcher&amp;rsquo;s machine into the HTML script; there should be three instances, each preceding the &lt;code&gt;:5000&lt;/code&gt; port.&lt;/li&gt;
&lt;li&gt;In a new terminal, run &lt;code&gt;python -m http.server 8000&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;You are now ready to run the experiment. Open your browser and go to &amp;lsquo;&lt;a href=&#34;http://localhost:8000/index.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;http://localhost:8000/index.html&lt;/a&gt;&amp;rsquo;, adjusting for the name of your HTML script. Your experiment should now run on the participant&amp;rsquo;s machine, and the researcher can view the received markers in their Python terminal.&lt;/li&gt;
&lt;/ol&gt;
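&lt;p&gt;As an illustration of step 2, the three addresses in the HTML script share the same base URL. The IP below is a placeholder, and the endpoint paths are assumptions based on the snippet above:&lt;/p&gt;

```javascript
// Placeholder address: replace "192.168.1.42" with the IPv4 address
// reported by ipconfig on the researcher's machine.
const BRIDGE = "http://192.168.1.42:5000";

// The sync and marker requests are then built from this base
// (endpoint names here are illustrative):
const syncUrl = BRIDGE + "/sync";
const markerUrl = BRIDGE + "/marker";
```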
&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id=&#34;fn:1&#34;&gt;
&lt;p&gt;Special shoutout to our placement student Oliver Collins for preparing these scripts!&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>What is the Best Interoception Questionnaire?</title>
      <link>https://realitybending.github.io/post/2025-12-20-interoception/</link>
      <pubDate>Sat, 20 Dec 2025 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2025-12-20-interoception/</guid>
      <description>&lt;p&gt;Hello👋! We are &lt;a href=&#34;https://realitybending.github.io/authors/roisin-sharma/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Róisín&lt;/a&gt; and &lt;a href=&#34;https://realitybending.github.io/authors/oliver-collins/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Oliver&lt;/a&gt;, two &lt;a href=&#34;https://realitybending.github.io/jobs/assistant/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Research Assistants&lt;/a&gt; at the lab, and today we are going to be discussing the tricky topic of self-report interoception questionnaires.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Interoception&lt;/strong&gt;, the sense of one&amp;rsquo;s internal bodily state, is a fundamental phenomenon that we rely on in everyday life, and recent research highlights it as a trans-diagnostic underpinning of a variety of somatic and psychological difficulties.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;While we know interoception is very important, the specifics are still being worked out&lt;/strong&gt;. Debates continue on what exactly interoception is, is not, and what it encompasses in terms of modalities or processes. Is it limited to visceral sensations (i.e., from internal organs)? Does it include proprioception (i.e., body position sense)? Pain? What about tactile sensations (i.e., touch and skin)? Does it include the interaction with higher-order processes like attention and beliefs?&lt;/p&gt;
&lt;p&gt;This chaotic and moving landscape has been accompanied by the development and repurposing of different interoception (and interoception-adjacent) questionnaires, each with their own philosophies and approach. Carefully choosing a good measure of interoception is crucial to avoid adding to the &lt;a href=&#34;https://en.wikipedia.org/wiki/Jingle-jangle_fallacies&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;jingle-jangle fallacy&lt;/strong&gt;&lt;/a&gt; plaguing the field, in which discrepancies and contradictions of results &lt;em&gt;&amp;ldquo;related to interoception&amp;rdquo;&lt;/em&gt; are driven by differences in what aspect of it is actually being measured.&lt;/p&gt;
&lt;p&gt;Moreover, unlike &lt;em&gt;exteroception&lt;/em&gt; (vision, audition, etc.), where researchers can easily manipulate external stimuli to validate a participant&amp;rsquo;s response, interoception presents a unique challenge: the stimuli originate from within the body. Because internal states are difficult to manipulate or observe directly, objective validation is complex. Nonetheless, especially as &amp;ldquo;objective&amp;rdquo; tasks like the Heartbeat Counting Task (HCT; &lt;a href=&#34;https://onlinelibrary-wiley-com.sussex.idm.oclc.org/doi/10.1111/j.1469-8986.1981.tb02486.x&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Schandry, 1981&lt;/a&gt;) have their own methodological drawbacks, self-report questionnaires remain scalable, practical, and widely used tools for assessing interoception. Let&amp;rsquo;s explore the most popular and established questionnaires.&lt;/p&gt;
&lt;h2 id=&#34;questionnaires-overview&#34;&gt;Questionnaires Overview&lt;/h2&gt;
&lt;h3 id=&#34;-body-perception-questionnaire-bpq&#34;&gt;😨 Body Perception Questionnaire (BPQ)&lt;/h3&gt;
&lt;p&gt;The &lt;strong&gt;BPQ&lt;/strong&gt; is one of the earliest interoception scales, originally built by &lt;a href=&#34;https://terpconnect.umd.edu/~sporges/body/body.txt&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Porges in 1993&lt;/a&gt;. This questionnaire focuses on the autonomic nervous system, involved in stress responses, and thus is mainly concerned with internal sensing when there are problems (e.g., &amp;lsquo;tremor in my lips&amp;rsquo; and &amp;lsquo;general jitteriness&amp;rsquo; are two of its body-awareness items). This makes the scale beneficial in clinical contexts to investigate maladaptive interoception, particularly in patients who have a dysregulated autonomic nervous system. However, if you are interested in interoception in a wider context, other questionnaires may be more appropriate.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&#34;ans.webp&#34; alt=&#34;The Autonomic Nervous System (ANS)&#34; width=&#34;50%&#34;/&gt;
&lt;figcaption&gt;&lt;i&gt;The Autonomic Nervous System (ANS)&lt;/i&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;-multidimensional-assessment-of-interoceptive-awareness-maia&#34;&gt;🧘‍♀️ Multidimensional Assessment of Interoceptive Awareness (MAIA)&lt;/h3&gt;
&lt;p&gt;The &lt;strong&gt;Multidimensional Assessment of Interoceptive Awareness (MAIA)&lt;/strong&gt; (the MAIA-2 being the most recent version) is another widely used questionnaire that accounts for body awareness in positive states, deriving from research on emotional regulation and pain. This questionnaire was created because &lt;a href=&#34;https://doi.org/10.1371/journal.pone.0048230&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Mehling et al. (2012)&lt;/a&gt; believed Western medicine focused too much on bodily awareness as a maladaptive trait, even though research was increasingly finding health benefits from a sense of embodiment. It was specifically designed to assess mind-body therapies and was finalised based on data from individuals with various therapeutic backgrounds, including yoga, tai chi, and breath-work. The MAIA reconceptualises bodily awareness not only as an anxiety-related process but also as an integral part of mindfulness. This translates to many of the questions focusing on &lt;em&gt;metacognitive beliefs&lt;/em&gt; about one&amp;rsquo;s body and emotions, as well as some targeting other mindfulness-related processes more directly, such as attention regulation and non-reactivity. The MAIA includes subscales encompassing self-regulation abilities which, while important, might be conceptualised as distinct from core interoception.&lt;/p&gt;
&lt;h3 id=&#34;-interoceptive-accuracy-scale-ias&#34;&gt;🤧 Interoceptive Accuracy Scale (IAS)&lt;/h3&gt;
&lt;p&gt;More recently, the &lt;strong&gt;Interoceptive Accuracy Scale (IAS)&lt;/strong&gt; &lt;a href=&#34;https://doi-org.sussex.idm.oclc.org/10.1177/1747021819879826&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;(Murphy et al., 2019)&lt;/a&gt; took the opposite route, trying to remove contamination by meta-cognitive processes to focus on interoceptive &lt;em&gt;accuracy&lt;/em&gt; (distinct from interoceptive &lt;em&gt;attention&lt;/em&gt;). It includes 21 questions (&amp;ldquo;I can always accurately perceive when&amp;hellip;&amp;rdquo;) pertaining to discrete, clear, and &amp;ldquo;objectifiable&amp;rdquo; interoceptive events, with the aim of being meaningful and consistently interpreted across participants (including those who have difficulty perceiving internal sensations).&lt;/p&gt;
&lt;h3 id=&#34;-multimodal-interoception-questionnaire-mint&#34;&gt;🍃 Multimodal Interoception Questionnaire (Mint)&lt;/h3&gt;
&lt;p&gt;The &lt;strong&gt;Multimodal Interoception Questionnaire (Mint;&lt;/strong&gt; &lt;a href=&#34;https://doi.org/10.31234/osf.io/8qrht_v1&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Makowski et al., 2025&lt;/strong&gt;&lt;/a&gt;) is the most recent interoception questionnaire, designed to address these caveats and limitations by building on established measures and synthesising previous research and advances. Fundamentally, the Mint takes a &amp;ldquo;&lt;strong&gt;context-by-modality&lt;/strong&gt;&amp;rdquo; approach to item development, encompassing a wide range of (seven) &lt;strong&gt;modalities&lt;/strong&gt; of interoceptive experience (cardiac, respiratory, gastric, etc.) while also controlling for the &lt;strong&gt;contexts&lt;/strong&gt; in which these may appear, covering both negative (&lt;em&gt;anxious&lt;/em&gt;) and positive (&lt;em&gt;sexual&lt;/em&gt;) arousal states. The Mint also incorporates both adaptive and maladaptive aspects of interoception (interoceptive confusion), as well as items targeting different levels of processing.&lt;/p&gt;
&lt;p&gt;Importantly, this questionnaire was developed with the aim of addressing some of the methodological shortcomings of previous interoception questionnaires, such as limiting &lt;em&gt;interpretation variance&lt;/em&gt;, &lt;em&gt;state dependency&lt;/em&gt; (the fact that respondents &amp;ldquo;anchor&amp;rdquo; their answers to their current physiological state rather than their general trait), and &lt;em&gt;recency effects&lt;/em&gt; (recent, salient physical experiences disproportionately influencing scores), in particular by providing a clear contextual reference for each item. The validation study showed strong correlations with the above questionnaires (suggesting that it can be used as a comprehensive replacement), while also demonstrating superior predictive power for a variety of clinical conditions.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&#34;mint.png&#34; alt=&#34;Items of the Multimodal Interoception Questionnaire (Mint)&#34; width=&#34;80%&#34;/&gt;
&lt;figcaption&gt;&lt;i&gt;Items of the Multimodal Interoception Questionnaire (Mint)&lt;/i&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;others&#34;&gt;Others&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;Interoceptive Attention Scale (IATS; Gabriele et al., 2021)&lt;/strong&gt;: Attention to bodily signals. Designed as the orthogonal counterpart of the Interoceptive Accuracy Scale, also using consistent phrasing of all statements (&amp;lsquo;Most of the time my attention is focused on&amp;hellip;&amp;rsquo;).&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Three-domain Interoceptive Sensations Questionnaire (THISQ; &lt;a href=&#34;https://doi.org/10.1080/08870446.2021.2009479&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Vlemincx et al., 2021&lt;/a&gt;)&lt;/strong&gt;: Neutral internal sensations (not emotionally valenced), including cardiorespiratory activation, deactivation, and gastroesophageal sensations.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Interoception Sensory Questionnaire (ISQ; &lt;a href=&#34;https://link.springer.com/article/10.1007/s10803-018-3600-3&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Fiene, 2018&lt;/a&gt;)&lt;/strong&gt;: Designed to assess difficulty in perceiving interoceptive bodily states unless these states are extreme (alexisomia).&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Interoceptive Confusion Questionnaire (ICQ; &lt;a href=&#34;https://royalsocietypublishing.org/rsos/article/3/10/150664/36458/Alexithymia-a-general-deficit-of&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Brewer, 2016&lt;/a&gt;)&lt;/strong&gt;: Assesses confusion and misinterpretation of bodily signals.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Body Consciousness Scale (BCS; Miller et al., 1981)&lt;/strong&gt;: Awareness of the &amp;ldquo;private body&amp;rdquo; (internal sensations) and the &amp;ldquo;public body&amp;rdquo; (observable aspects of the body).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;in-summary---which-interoception-questionnaire-should-i-pick&#34;&gt;In summary - which interoception questionnaire should I pick?&lt;/h2&gt;
&lt;p&gt;Interoceptive questionnaires are a product of their time, often moulded by specific contextual demands and underlying theoretical frameworks. As our understanding of interoception evolves, so too do the tools we use to measure it. It might seem like the best option is to pick a questionnaire based on the interoception facet you are interested in (e.g., confusion, attention, accuracy, &amp;hellip;), but as the field is still developing and theoretical models are in flux, it might be more useful to consider a broader, more comprehensive, theory-agnostic questionnaire that captures multiple facets and modalities of interoception, such as the &lt;strong&gt;Mint&lt;/strong&gt;.&lt;/p&gt;
&lt;h2 id=&#34;references&#34;&gt;References&lt;/h2&gt;
&lt;p&gt;Bergomi, C., Tschacher, W., &amp;amp; Kupper, Z. (2012). The Assessment of Mindfulness with Self-Report Measures: Existing Scales and Open Issues. &lt;em&gt;Mindfulness&lt;/em&gt;, &lt;em&gt;4&lt;/em&gt;(3), 191–202. &lt;a href=&#34;https://doi.org/10.1007/s12671-012-0110-9&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1007/s12671-012-0110-9&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Gabriele, E., Spooner, R., Brewer, R., &amp;amp; Murphy, J. (2021). Dissociations between self-reported interoceptive accuracy and attention: Evidence from the interoceptive attention scale. &lt;em&gt;Biological Psychology&lt;/em&gt;, &lt;em&gt;168&lt;/em&gt;, 108243. &lt;a href=&#34;https://doi.org/10.1016/j.biopsycho.2021.108243&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1016/j.biopsycho.2021.108243&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Kolacz, J., &amp;amp; Bjorum, E. (2023). Measuring Autonomic Symptoms with the Body Perception Questionnaire. &lt;em&gt;The Traumatic Stress Research Consortium&lt;/em&gt;. &lt;a href=&#34;https://www.traumascience.org/s/TSRCMarch2023Newsletter.pdf&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://www.traumascience.org/s/TSRCMarch2023Newsletter.pdf&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Kolacz, J., Holmes, L., &amp;amp; Porges, S. W. (2018). Body perception questionnaire (BPQ) manual. Traumatic Stress Research Consortium.&lt;/p&gt;
&lt;p&gt;Makowski, D., Neves, A., Benn, E., Bennett, M., &amp;amp; Poerio, G. (2025). The Mint Scale: A Fresh Validation of the Multimodal Interoception Questionnaire and Comparison to the MAIA, BPQ and IAS. &lt;a href=&#34;https://doi.org/10.31234/osf.io/8qrht_v1&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.31234/osf.io/8qrht_v1&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Mehling, Price, Daubenmier, Acree, Bartmess, &amp;amp; Stewart. (2012). The Multidimensional Assessment of Interoceptive Awareness (MAIA). &lt;em&gt;PLOS ONE&lt;/em&gt;, &lt;em&gt;7&lt;/em&gt;(11). &lt;a href=&#34;https://doi.org/10.1371/journal.pone.0048230&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1371/journal.pone.0048230&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Mehling, W. E., Acree, M., Stewart, A., Silas, J., &amp;amp; Jones, A. (2018). The Multidimensional Assessment of Interoceptive Awareness, Version 2 (MAIA-2). &lt;em&gt;PLOS ONE&lt;/em&gt;, &lt;em&gt;13&lt;/em&gt;(12), e0208034. &lt;a href=&#34;https://doi.org/10.1371/journal.pone.0208034&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1371/journal.pone.0208034&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Miller, L. C., Murphy, R., &amp;amp; Buss, A. H. (1981). Consciousness of body: Private and public. &lt;em&gt;Journal of Personality and Social Psychology&lt;/em&gt;, &lt;em&gt;41&lt;/em&gt;(2), 397–406. &lt;a href=&#34;https://doi.org/10.1037/0022-3514.41.2.397&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1037/0022-3514.41.2.397&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Murphy, J., Brewer, R., Plans, D., Khalsa, S. S., Catmur, C., &amp;amp; Bird, G. (2019). Testing the independence of self-reported interoceptive accuracy and attention. &lt;em&gt;Quarterly Journal of Experimental Psychology&lt;/em&gt;, &lt;em&gt;73&lt;/em&gt;(1), 115–133. &lt;a href=&#34;https://doi.org/10.1177/1747021819879826&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1177/1747021819879826&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Solano Durán, P., Morales, J.-P., &amp;amp; Huepe, D. (2024). Interoceptive awareness in a clinical setting: The need to bring interoceptive perspectives into clinical evaluation. &lt;em&gt;Frontiers in Psychology&lt;/em&gt;, &lt;em&gt;15&lt;/em&gt;, 1244701. &lt;a href=&#34;https://doi.org/10.3389/fpsyg.2024.1244701&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.3389/fpsyg.2024.1244701&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Porges. (1993). &lt;em&gt;Body Perception Questionnaire&lt;/em&gt;. Umd.edu. &lt;a href=&#34;https://terpconnect.umd.edu/~sporges/body/body.txt&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://terpconnect.umd.edu/~sporges/body/body.txt&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Schandry, R. (1981). Heart Beat Perception and Emotional Experience. &lt;em&gt;Psychophysiology&lt;/em&gt;, &lt;em&gt;18&lt;/em&gt;(4), 483–488. &lt;a href=&#34;https://doi.org/10.1111/j.1469-8986.1981.tb02486.x&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1111/j.1469-8986.1981.tb02486.x&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Sherrington C. S. (1906). The integrative action of the nervous system. Yale University Press.&lt;/p&gt;
&lt;p&gt;Vlemincx, E., Walentynowicz, M., Zamariola, G., Van Oudenhove, L., &amp;amp; Luminet, O. (2021). A novel self-report scale of interoception: the three-domain interoceptive sensations questionnaire (THISQ). &lt;em&gt;Psychology &amp;amp; Health&lt;/em&gt;, &lt;em&gt;38&lt;/em&gt;(9), 1–20. &lt;a href=&#34;https://doi.org/10.1080/08870446.2021.2009479&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://doi.org/10.1080/08870446.2021.2009479&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>From physique to intellect... and back again?</title>
      <link>https://realitybending.github.io/post/2025-11-03-dystopianfutures1/</link>
      <pubDate>Mon, 03 Nov 2025 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2025-11-03-dystopianfutures1/</guid>
      <description>&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    &lt;em&gt;Welcome to the &amp;ldquo;Pub Theories on Dystopian Futures&amp;rdquo; series, where we engage in wild speculations about the future of academia and society.&lt;/em&gt;
  &lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;&lt;strong&gt;TL;DR:&lt;/strong&gt; AI will commodify and devalue intelligence, leading to natural beauty becoming the next highly prized characteristic in humans.&lt;/p&gt;
&lt;h2 id=&#34;physical-health-as-primary-desideratum&#34;&gt;Physical health as primary desideratum&lt;/h2&gt;
&lt;p&gt;From a psycho-evolutionary perspective, many human behaviours can be understood as rooted in mating strategies.
At their core, these involve signalling one&amp;rsquo;s genetic fitness to potential partners.
In humans, this often manifests as displays of youthful beauty in women (as a proxy for fertility) and status in men (as a proxy for resource acquisition and protection).
These sexual dimorphisms arise primarily from the biological asymmetry in parental investment between the sexes.&lt;/p&gt;
&lt;p&gt;On top of these individual selective processes, there are also societal or group-level selective pressures, including accidental ones.
For instance, the black plague that killed one in three Europeans in the 14th century exerted selective pressure on immune genes, notably increasing the frequency of protective allele variants (&lt;a href=&#34;https://pubmed.ncbi.nlm.nih.gov/36261521/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Klunk et al., 2022&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;Importantly, at a population-level, pre-modern societies preferentially selected for physical characteristics like strength and health over intellectual ones.
For hunter-gatherer men, strength and endurance were the best predictors of family provisioning and protection.
Likewise, for the common farmer of the Feudal era, &lt;strong&gt;it didn&amp;rsquo;t matter whether you were smart or not&lt;/strong&gt;; what mattered was whether you could labour and toil efficiently and survive the harsh conditions.&lt;/p&gt;
&lt;h2 id=&#34;the-rise-of-intelligence&#34;&gt;The rise of intelligence&lt;/h2&gt;
&lt;p&gt;With the Industrial Revolution and overall advances in medicine, nutrition, and sanitation, life expectancy increased and the population grew (e.g., the population of Britain increased fivefold in 150 years).
Societies became more literate, relying on bureaucracies, trade and technologies, and increasingly meritocratic.
This resulted in a shift in selective pressures: intelligence became a prime desirable trait.
Due to these multiple converging factors (notably education, nutrition, and health), average IQ scores increased over the 20th century by about 3 points per decade (the so-called &lt;a href=&#34;https://en.wikipedia.org/wiki/Flynn_effect&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Flynn effect&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;Today, IQ is one of the strongest single predictors of positive outcomes, including academic achievement, job performance, socioeconomic mobility, health, emotional stability, happiness, and even longevity (&lt;a href=&#34;https://www.scirp.org/journal/paperinformation?paperid=74943&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Lo, 2017&lt;/a&gt;).
As such, being &lt;em&gt;(perceived as)&lt;/em&gt; smart confers advantages in modern societies surpassing those of physical fitness.&lt;/p&gt;
&lt;!-- TODO: mention studies on the relationship between perceived attractiveness and intelligence --&gt;
&lt;p&gt;Interestingly, the recent decades of data seem to provide some evidence for a possible stagnation or even reversal of the Flynn trend&lt;sup&gt;1&lt;/sup&gt;, suggesting we may be approaching a plateau&amp;hellip; or a &lt;strong&gt;regime shift&lt;/strong&gt;.&lt;/p&gt;
&lt;h2 id=&#34;the-devaluation-of-intelligence&#34;&gt;The devaluation of intelligence&lt;/h2&gt;
&lt;p&gt;Just as the agricultural revolution transformed society by displacing less efficient methods of food production, and just as the industrial revolution changed society by lowering the need for manual labour, the AI revolution is changing society by lowering the need for human intelligence.
If AI commodifies smartness, the competitive edge of human intelligence becomes diluted, relaxing related selective pressures.
Once intelligence is cheaply available, the benefits of being smarter shrink, triggering a decrease in its value as a desirable status-marker trait.&lt;/p&gt;
&lt;p&gt;In the near future, intelligence might no longer be the main characteristic prized by society, conferring positive outcomes and status.
This raises the question: what comes next? What trait will serve as the ground on which future humans compete?&lt;/p&gt;
&lt;h2 id=&#34;natural-beauty-and-the-emergence-of-a-kalokagathos-class&#34;&gt;Natural beauty and the emergence of a &lt;em&gt;kalokagathos&lt;/em&gt; class&lt;/h2&gt;
&lt;p&gt;In a world where technology equalises for advantages related to cognitive and physical abilities and health, &lt;strong&gt;the new frontier of human competition may be aesthetic and embodied: beauty&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;The ancient Greeks encapsulated this ideal in the term &lt;em&gt;kalokagathos&lt;/em&gt; - the unity of the good, the true, and the beautiful within a single person. For them, beauty was not a shallow or superficial quality; it symbolised harmony, virtue, and authenticity.&lt;/p&gt;
&lt;p&gt;As artificial intelligence, virtual realities, and algorithmic filters dominate experience, society may increasingly valorise naturalness - what appears genuine, unmediated, and unfiltered. The value of naturalness, authenticity, and directness&lt;sup&gt;2&lt;/sup&gt; might come as a counter-movement to the rise of artificiality, syntheticity, and virtuality. In this context, the pendulum could swing towards an aesthetic moralism: beauty as truth and goodness. In this world, &lt;strong&gt;natural beauty might become the prime desirable trait&lt;/strong&gt;, with all its paradoxical implications in the form of a post-AI society obsessed with fake naturalness - surgical enhancements and digital manipulations designed to look unaltered, and carefully manufactured authenticity.&lt;/p&gt;
&lt;p&gt;The &lt;em&gt;Kalokagathoi&lt;/em&gt;, living embodiments of beauty and virtue, would become the new elite class, dominating not by their power to &lt;em&gt;do&lt;/em&gt;, but through their mere quality to &lt;em&gt;be&lt;/em&gt;, unsullied by technological augmentation.
This value shift towards natural reality might last&amp;hellip; until our technological capabilities allow us to influence the most fundamental aspects of our biology via gene editing, synthetic biology, &amp;hellip; &lt;em&gt;But what comes after that is a story for another pub theory.&lt;/em&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;sup&gt;1&lt;/sup&gt; The reversal of the Flynn effect has been documented in several developed countries, such as Norway, Finland, and the UK, with IQ declines ranging from 0.38 to 4.3 points per decade since the 1990s in some studies. However, this is not universal—gains continue in other regions—and causes are debated, including factors like changes in education quality, immigration patterns, environmental toxins, or even the rise of digital distractions like smartphones (&lt;a href=&#34;https://pmc.ncbi.nlm.nih.gov/articles/PMC6042097/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Bratsberg &amp;amp; Rogeberg, 2018&lt;/a&gt;; &lt;a href=&#34;https://www.sciencedirect.com/science/article/abs/pii/S0160289616300198&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Dutton et al., 2016&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;&lt;sup&gt;2&lt;/sup&gt; Directness refers to the idea of unmediated experience, e.g., accessibility of the source of an experience. This translates to direct experiences of nature, social interactions without digital mediation, and raw sensory experiences. This concept also applies to products, opposing hand-crafted goods and locally sourced food (for which the &amp;ldquo;source&amp;rdquo; and &amp;ldquo;origin&amp;rdquo; are directly accessible and known) to mass-produced items for which the creation process has been obscured and mediated through complex supply chains.&lt;/p&gt;
&lt;!-- - This mechanism is already at play as a counter-movement to the mass production of goods, with the rise of artisanal, hand-crafted products, and locally sourced products. In Brighton, where I live, many people will be happy to pay three times as much for a regular basic white t-shirt if they *think* it is hand-made sustainably by some local artisan, rather than being a mass-produced item from a sweatshop in Bangladesh. &#34;The overpriced hand-crafted artisan white shirt from Brighton.&#34; --&gt;</description>
    </item>
    
    <item>
      <title>The AI revolution in academia: I see a silver lining for young scientists!</title>
      <link>https://realitybending.github.io/post/2025-09-08-airevolution/</link>
      <pubDate>Mon, 08 Sep 2025 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2025-09-08-airevolution/</guid>
      <description>&lt;p&gt;For many students and early-career researchers, the rise of AI is scary. &lt;strong&gt;The very skills they are spending years learning - writing, coding, summarising - are exactly the things AI is getting frighteningly good at&lt;/strong&gt;. What, then, is the future of research careers in the age of AI?&lt;/p&gt;
&lt;p&gt;While I don&amp;rsquo;t have a full answer to that question, I do think there may be a silver lining for young scientists.
For decades, becoming an established “big shot” professor was associated with focusing on the &amp;ldquo;big ideas&amp;rdquo;, revelling in intellectual thinking and leaving the scientific grunt work to junior researchers.
Therein lay the prestige: writing opinion pieces, commentaries, critiques and reviews. &lt;strong&gt;The glamour was in thinking, not doing.&lt;/strong&gt;&lt;sup&gt;1&lt;/sup&gt;&lt;/p&gt;
&lt;p&gt;In a fascinating twist, the rise of AI may disrupt this landscape. If AI excels at one thing, it is &lt;em&gt;precisely&lt;/em&gt; writing, reviewing and summarising evidence, interpreting findings, and even formulating new hypotheses or planning experiments.
What AI still cannot do, however, is roll up its sleeves and gather real-world data - &lt;strong&gt;the true backbone of empirical science&lt;/strong&gt;&lt;sup&gt;2&lt;/sup&gt;. It can&amp;rsquo;t (yet) run studies, recruit participants, set up experiments, organise data management, or wrestle with messy datasets.&lt;/p&gt;
&lt;p&gt;The centre of gravity in science may thus shift towards &lt;strong&gt;those who can &lt;em&gt;do&lt;/em&gt;&lt;/strong&gt;: Hands-on scientists, who might once have been relegated to the shadows, could see their skills and contributions gain new recognition.
We may see less pressure to write endless papers and grants, and more emphasis on how the science was actually done: how data was collected, preprocessed, managed, and made accessible.
Perhaps the introduction, discussion, and &amp;ldquo;key takeaways&amp;rdquo; sections of papers will become somewhat less important, while methods, results, and limitations gain greater prominence, allowing for more nuance and granularity&lt;sup&gt;3&lt;/sup&gt;.&lt;/p&gt;
&lt;p&gt;Whether this shift will make science better or worse is unclear. And that &amp;ldquo;silver lining&amp;rdquo; might end up as a &amp;ldquo;glorification of the grind&amp;rdquo; and a devaluation of the intellectual aspects of research, devolving the job of &amp;ldquo;Researcher&amp;rdquo; into technician work.
But it might also reshape the academic landscape in a way that proves beneficial for young researchers and those who enjoy the practical aspects of research.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;sup&gt;1&lt;/sup&gt; Sure, thinking &lt;em&gt;is&lt;/em&gt; important, and big shot professors are also &lt;em&gt;doing&lt;/em&gt; a &lt;em&gt;&lt;strong&gt;lot&lt;/strong&gt;&lt;/em&gt;&amp;hellip; &lt;sup&gt;&lt;sub&gt;sometimes.&lt;/sub&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;p&gt;&lt;sup&gt;2&lt;/sup&gt; Of note: following the &amp;ldquo;replication crisis&amp;rdquo; in psychology, many voices have called for &amp;ldquo;more theory&amp;rdquo; and more &amp;ldquo;theoretically-grounded research&amp;rdquo; (with the goal of cutting some of the nonsense out there). While theories are critical to guide data collection and interpretation, it is still hard evidence that ultimately forms the foundation of scientific knowledge.&lt;/p&gt;
&lt;p&gt;&lt;sup&gt;3&lt;/sup&gt; One pervasive issue is that humans have a limited &amp;ldquo;context window&amp;rdquo; (~ working memory). When reading a paper, it is already very hard to keep all the results and details in mind and integrate them into a coherent picture. Moreover, with the increasing role of social media, science has had to be made more communicable, digestible, punchy, and &amp;ldquo;sexy&amp;rdquo;. This has led to a tendency to oversimplify and overgeneralize findings. AI, with its ability to process and summarize large amounts of information, could perhaps help produce more accurate and data-grounded interpretations and summaries. &lt;em&gt;(it might be wishful thinking, but who knows!)&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Attractiveness shapes beliefs about whether faces are real or AI-generated, study finds</title>
      <link>https://realitybending.github.io/post/2025-07-07-newsfakeface/</link>
      <pubDate>Mon, 07 Jul 2025 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2025-07-07-newsfakeface/</guid>
      <description>&lt;p&gt;Our recent paper on facial attractiveness and reality beliefs is in the news:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://www.psypost.org/attractiveness-shapes-beliefs-about-whether-faces-are-real-or-ai-generated-study-finds/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://www.psypost.org/attractiveness-shapes-beliefs-about-whether-faces-are-real-or-ai-generated-study-finds/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description>
    </item>
    
    <item>
      <title>How to collect and save data with DataPipe in OSF</title>
      <link>https://realitybending.github.io/post/2025-07-02-datapipeosf/</link>
      <pubDate>Wed, 02 Jul 2025 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2025-07-02-datapipeosf/</guid>
      <description>&lt;p&gt;Hello there! 👋 Let&amp;rsquo;s learn how to set up DataPipe to collect and save data in OSF.&lt;/p&gt;
&lt;p&gt;Let&amp;rsquo;s start with some basics!&lt;/p&gt;
&lt;h2 id=&#34;what-is-datapipe&#34;&gt;What is DataPipe?&lt;/h2&gt;
&lt;p&gt;DataPipe is a tool that allows you to collect and save data in OSF (Open Science Framework). It is designed to help researchers manage their data collection process efficiently, ensuring that data is stored securely and can be easily accessed for analysis.&lt;/p&gt;
&lt;h2 id=&#34;how-to-set-up-datapipe-in-osf&#34;&gt;How to set up DataPipe in OSF&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create an OSF Project&lt;/strong&gt;: Start by creating a new project in &lt;a href=&#34;https://osf.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;OSF&lt;/a&gt;. This will be the container for your data and any related files. You can quite easily set up an account if you don&amp;rsquo;t have one already!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go to the OSF homepage and log in or create an account. You can easily sign up through institutional access.&lt;/li&gt;
&lt;li&gt;Click on &amp;ldquo;Create New Project&amp;rdquo; and fill in the necessary details such as project title, description, and visibility settings. Choose &amp;ldquo;Germany - Frankfurt&amp;rdquo; as the server location; this is important for data privacy and compliance with regulations such as GDPR. &lt;em&gt;&lt;strong&gt;DO NOT SET YOUR PROJECT AS PUBLIC&lt;/strong&gt;&lt;/em&gt;, as the data being saved will not be anonymized and may contain sensitive information.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create OSF Token&lt;/strong&gt;: You will need to create a token to grant DataPipe the necessary permissions to access your OSF project.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Go to your OSF &amp;ldquo;Settings&amp;rdquo; tab and navigate to the &amp;ldquo;Personal Access Tokens&amp;rdquo; section.&lt;/li&gt;
&lt;li&gt;Click on &amp;ldquo;Create Token&amp;rdquo; and give it a name (e.g., &amp;ldquo;DataPipe Token&amp;rdquo;).&lt;/li&gt;
&lt;li&gt;Set the permissions for the token, ensuring it has access to read and write data in your project.&lt;/li&gt;
&lt;li&gt;Copy the generated token; you will need it later.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Link OSF to DataPipe&lt;/strong&gt;: In &lt;a href=&#34;https://pipe.jspsych.org/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;DataPipe&lt;/a&gt;, you will need to link your OSF project using the token you created.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Open DataPipe, click &amp;ldquo;Account&amp;rdquo; in the top-right corner and select &amp;ldquo;Settings&amp;rdquo;.&lt;/li&gt;
&lt;li&gt;Click on the &amp;ldquo;Set OSF Token&amp;rdquo; button and paste the token you copied earlier from OSF.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create new experiment on DataPipe&lt;/strong&gt;: Now that your OSF project is linked, you can create a new experiment in DataPipe.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In the &amp;ldquo;My Experiments&amp;rdquo; DataPipe tab, click on the &amp;ldquo;Create New Experiment&amp;rdquo; button.&lt;/li&gt;
&lt;li&gt;Give the experiment a name - I recommend using the same name as your OSF project for consistency.&lt;/li&gt;
&lt;li&gt;Add the OSF project ID to the experiment settings. You can find the project ID in the URL of your OSF project (it is the alphanumeric string after osf.io/).&lt;/li&gt;
&lt;li&gt;Create a New OSF Data Component called &amp;ldquo;data&amp;rdquo;. This will create a folder - named &amp;ldquo;data&amp;rdquo; - in your OSF project where all the data collected will be saved.&lt;/li&gt;
&lt;li&gt;Again, choose &amp;ldquo;Germany - Frankfurt&amp;rdquo; as the server location for your DataPipe experiment.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Configure Data Collection&lt;/strong&gt;: Once the experiment is set up on DataPipe, enable data collection on the &amp;ldquo;Status&amp;rdquo; section. You can optionally enable base64 data collection if you wish to encode any video, audio, or image files as strings. &amp;ldquo;Condition assignment&amp;rdquo; can also be enabled - this makes DataPipe loop through the conditions when it requests the data. When deciding whether these features are suitable, it&amp;rsquo;s best to consider how you will preprocess the data. It&amp;rsquo;s advised that you only enable the minimum needed as a security measure.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Save the data from the experiment hosted on GitHub&lt;/strong&gt;: If you are using a GitHub repository to host your experiment, you can save the data collected by writing the following code within the experiment HTML file. Here is what that code might look like&amp;hellip;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Ensure you load the jsPsych DataPipe plugin, along with the rest of your plugins, within the head of the HTML script:&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-javascript&#34; data-lang=&#34;javascript&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;script&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;src&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;https://unpkg.com/@jspsych-contrib/plugin-pipe&amp;#34;&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;/script&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;ul&gt;
&lt;li&gt;After initializing your jsPsych timeline, you can generate a random participant ID for your study with the following:&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-javascript&#34; data-lang=&#34;javascript&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Initialize timeline
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;kd&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;participant_ID&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;randomization&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;randomID&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;ul&gt;
&lt;li&gt;This next bit of code should be added at the end of your experiment (though before running the timeline) to ensure that all data is saved to the OSF project, using the unique participant ID generated in the step above:&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-javascript&#34; data-lang=&#34;javascript&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Save data via DataPipe
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;    &lt;span class=&#34;nx&#34;&gt;timeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;push&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;({&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;type&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsychPipe&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;action&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;save&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;experiment_id&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;xxxxxxxxxx&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;// This is generated in the DataPipe interface
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;&lt;/span&gt;        &lt;span class=&#34;nx&#34;&gt;filename&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;sb&#34;&gt;`&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;${&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;participant_ID&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;sb&#34;&gt;.csv`&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nx&#34;&gt;data_string&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&amp;gt;&lt;/span&gt; &lt;span class=&#34;nx&#34;&gt;jsPsych&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;get&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;().&lt;/span&gt;&lt;span class=&#34;nx&#34;&gt;csv&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;})&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;On the experiment created in DataPipe, there is an &amp;lsquo;Experiment ID&amp;rsquo; field. This is the ID you need to add to the &lt;code&gt;experiment_id&lt;/code&gt; field in the code above.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;filename&lt;/code&gt; field can be customized to include the participant ID or any other identifier you prefer.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;If publishing your experiment to GitHub, make sure the link is&lt;/p&gt;
&lt;p&gt;&lt;em&gt;&amp;lsquo;https://[your username].github.io/[your repository name]&amp;rsquo;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;or &lt;em&gt;&amp;lsquo;https://[your username].github.io/[your repository name]/[name of experiment&amp;rsquo;s html file]&amp;rsquo;&lt;/em&gt; if the html file for your experiment is named anything other than &lt;code&gt;&#39;index.html&#39;&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
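&lt;p&gt;Putting the pieces together, the intended ordering can be sketched as follows. This is a minimal, self-contained sketch: plain objects and a placeholder string stand in for real jsPsych plugin trials, and the random ID is a stand-in for &lt;code&gt;jsPsych.randomization.randomID(10)&lt;/code&gt;.&lt;/p&gt;

```javascript
// Minimal sketch of the timeline structure (assumption: plain objects stand in
// for real jsPsych plugin trials so that this snippet runs standalone).
const participant_ID = Math.random().toString(36).slice(2, 12); // stand-in for jsPsych.randomization.randomID(10)
const timeline = [];

// ...your experiment trials are pushed here first...
timeline.push({ type: "html-keyboard-response", stimulus: "Hello!" });

// The DataPipe save trial goes last on the timeline:
timeline.push({
    type: "jsPsychPipe", // placeholder string; in a real experiment, the jsPsychPipe plugin object
    action: "save",
    experiment_id: "xxxxxxxxxx", // copied from the DataPipe "Experiment ID" field
    filename: `${participant_ID}.csv`,
});

// Only after all trials (including the save trial) have been pushed would you
// call jsPsych.run(timeline) in a real experiment.
```

&lt;p&gt;The key point is the ordering: push the DataPipe save trial after all experiment trials, and only then run the timeline, so the data is saved once everything has finished.&lt;/p&gt;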
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Run Your Experiment&lt;/strong&gt;: With everything set up, you can now run your experiment. DataPipe will automatically collect and save the data to your OSF project as specified.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Give it a try!&lt;/em&gt; If you&amp;rsquo;d like further clarification, the &lt;a href=&#34;https://pipe.jspsych.org/getting-started&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;DataPipe website&lt;/a&gt; includes a useful outline.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
</description>
    </item>
    
    <item>
      <title>Too beautiful to be fake: Attractive faces are less likely to be judged as artificially generated</title>
      <link>https://realitybending.github.io/publication/makowski2025toobeautiful/</link>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2025toobeautiful/</guid>
      <description>&lt;div class=&#34;alert alert-tip&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Audio Summary&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Listen to a podcast summary of the paper!&lt;/em&gt;&lt;/p&gt;
&lt;audio controls &gt;
  &lt;source src=&#34;https://realitybending.github.io/publication/makowski2025toobeautiful/makowski2025toobeautiful.mp3&#34; type=&#34;audio/mpeg&#34;&gt;
&lt;/audio&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>The Heart can Lie: A Preliminary Investigation of the Role of Interoception and Theory of Mind in Deception</title>
      <link>https://realitybending.github.io/publication/makowski2024heart/</link>
      <pubDate>Wed, 13 Nov 2024 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2024heart/</guid>
      <description>&lt;div class=&#34;alert alert-tip&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Audio Summary&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Listen to a podcast summary of the paper!&lt;/em&gt;&lt;/p&gt;
&lt;audio controls &gt;
  &lt;source src=&#34;https://realitybending.github.io/publication/makowski2024heart/makowski2024heart.mp3&#34; type=&#34;audio/mpeg&#34;&gt;
&lt;/audio&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Sussex Psychological Methods MRes: Tips and Advice</title>
      <link>https://realitybending.github.io/post/2024-03-19-mres/</link>
      <pubDate>Tue, 19 Mar 2024 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2024-03-19-mres/</guid>
      <description>&lt;p&gt;Ola! I&amp;rsquo;m &lt;a href=&#34;https://realitybending.github.io/authors/AnafNeves/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Ana&lt;/a&gt;. As I&amp;rsquo;m starting to approach the end of the year, it might be a good time to reflect and share my experience of doing a research masters in psychological methods at the University of Sussex, during the 2023/2024 academic year. First, I will talk a little bit about the modules I took; then I will mentioned all the reasons why you should choose to work with the Reality Bending Lab (ReBeL) and lastly, I will share some &lt;strong&gt;gems on how to survive the masters&lt;/strong&gt; 💎. Hopefully this blog will help you decide whether this degree is for you! Shall we start?&lt;/p&gt;
&lt;h2 id=&#34;overview-of-the-modules&#34;&gt;Overview of the Modules&lt;/h2&gt;
&lt;p&gt;Since this is a &lt;strong&gt;research masters&lt;/strong&gt; (MRes) aiming to prepare students for a future career as psychology researchers, the modules have a significant focus on different research frameworks and practices, statistics and coding. During the Autumn semester you will have three main modules: 1) a (re)introduction to statistical models; 2) an introduction to Qualitative Methods; and 3) an introduction to better quality research practices. This term is super heavy on content (no joke) and will feel like a lot to do and learn (see below for tips on how to survive). However, there are plenty of materials to help you through this term, such as the R tutorials from our own in-house celebrity Professor Andy Field.&lt;/p&gt;
&lt;p&gt;The Spring semester is less content heavy and more practically focused. There are, again, three main modules: 1) a theoretical and practical module on how to use advanced statistical methods; 2) an introduction to the Bayesian framework; and 3) an introduction to Python programming and how to use it to implement experiments. This has been a delightful term, not because it is &lt;em&gt;easy&lt;/em&gt;, but because the focus is less on &lt;strong&gt;memorising&lt;/strong&gt; and more on &lt;strong&gt;learning how&lt;/strong&gt;. Similarly, there are plenty of amazing materials to help you through this term, such as optional zoom meetings to help you understand the materials and continuous communication on discord between lecturers and students.&lt;/p&gt;
&lt;p&gt;Additionally, there will be a research module that runs both in Autumn and Spring, and a dissertation module that starts in January and ends in August (i.e., when the dissertation project is due).&lt;/p&gt;
&lt;h2 id=&#34;the-internship&#34;&gt;The Internship&lt;/h2&gt;
&lt;p&gt;Critically, you will also do an &amp;ldquo;internship&amp;rdquo; as part of this masters (named the &amp;ldquo;research process&amp;rdquo; module 🤷‍♀️). This is by far &lt;strong&gt;the most exciting part&lt;/strong&gt; of this masters as you will learn first-hand what it is like to be a researcher. You can essentially choose any psychology researcher from Sussex to work with, providing you with a great network and experience. Now&amp;hellip; you may be wondering &lt;strong&gt;what lab to choose?&lt;/strong&gt; And oh boy, do I have the answer for you!&lt;/p&gt;
&lt;p&gt;Introducing the &lt;strong&gt;Reality Bending Lab (ReBeL)&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;ReBeL is led by &lt;a href=&#34;https://realitybending.github.io/authors/dominique-makowski/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Dr. Dominique Makowski&lt;/a&gt;. He will be your Mr. Miyagi during the Autumn and Spring terms (and also your lecturer for the Bayesian module). His patience, humour, straightforwardness and unmatched theoretical and pragmatic knowledge will be one of the big reasons why you will desire to be a researcher at the end of this masters (PS: no payment has been received in exchange for this testimony). The lab focuses a lot on &lt;strong&gt;innovation&lt;/strong&gt;, hence you will learn new ways to collect neuroscientific data and use new statistical methods. There is also a big focus on &lt;strong&gt;collaboration&lt;/strong&gt;. Yes, you will work independently; however, more likely than not you will have the support of everyone in the lab, and you will be giving support yourself (getting a bit of experience in supervision and mentoring). &lt;strong&gt;Curiosity&lt;/strong&gt; is welcome and encouraged. Ask your questions, get involved in all aspects of the process if possible, and take advantage of the fact that you will have a &amp;lsquo;mentor&amp;rsquo; for the whole academic year.&lt;/p&gt;
&lt;p&gt;During my time at ReBeL, I have been involved in various projects, such as &amp;ldquo;Exploring the Correlation between Interoception and Primal World Beliefs&amp;rdquo; and a meta-analysis of a widely used questionnaire of Interoception. These projects have taught me a lot, from collecting and analysing both physiological and behavioural data, to gathering data for a meta-analysis, to reporting my work in oral and written formats. Throughout the year, with the guidance and expertise of everyone involved in the lab, I gained a lot of confidence in my abilities as a researcher, which is why I found this internship the most influential aspect of my masters. Ultimately, at the ReBeL lab, you will not only &lt;strong&gt;investigate exciting concepts and topics but you will also have first-hand experience of what it actually takes to be a researcher&lt;/strong&gt; (including the need to have a twitter account, apparently).&lt;/p&gt;
&lt;h2 id=&#34;survival-tips&#34;&gt;Survival Tips&lt;/h2&gt;
&lt;p&gt;Now&amp;hellip; You might be wondering&amp;hellip; &amp;ldquo;How in the world will I do all of this in one academic year?&amp;rdquo; Here are some tips that helped me get the most out of this masters without losing my mind.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Unsurprising tip&lt;/strong&gt;: DO THE WEEKLY WORKSHOPS/TUTORIALS. They will provide you with the majority of the code, steps and knowledge necessary to complete the assignments.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Life-saving tip&lt;/strong&gt;: do meal prep for the 48-hour assignments. If you are anything like me, you would rather lose sleep than a delicious home-made meal. However, with the short time window to complete these assignments, meal prepping will help you feel less anxious about &amp;ldquo;not having enough time&amp;rdquo; to complete it all whilst still giving your mind everything it needs to function (i.e., sleep and nutrients).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Qualitative tip&lt;/strong&gt;: as part of the January assignments, you will be asked to analyse 5 interviews using a qualitative method. If you come from a mostly quantitative background like me, you will be unfamiliar with how long it takes to code qualitative data. Do not make the same mistake as I did: start that assignment as early as possible.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Student formatting tip&lt;/strong&gt;: when lecturers say &amp;ldquo;I want it in APA format&amp;rdquo;, some will expect you to write a publication-level piece of work. When in doubt, ask them!&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Practical life skills tip&lt;/strong&gt;: communication is key with your supervisors, especially during your internship; be honest about what you can and cannot do, your preferred ways of working, your goals and dreams, and most importantly, when you need help.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Ultimate tip&lt;/strong&gt;: do consider part-time, especially if you want/need to work more than 20 hours a week on top of doing this masters. It is full on, and even part-time all the lectures are taught in the first year, so there is still a lot of work to do. But it is possible, and can even be &lt;em&gt;enjoyable&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;
</description>
    </item>
    
    <item>
      <title>How to Assess Task Reliability using Bayesian Mixed Models</title>
      <link>https://realitybending.github.io/post/2024-03-18-signaltonoisemixed/</link>
      <pubDate>Mon, 18 Mar 2024 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2024-03-18-signaltonoisemixed/</guid>
      <description>&lt;p&gt;Using reliable tasks when assessing inter-individual differences is a key issue for differential psychology and neuropsychology, and many research areas are clouded with mixed evidence stemming out of the suboptimal computation of individual scores (e.g., tasks with not enough trials, scores consisting in computing the difference, aka the &lt;strong&gt;contrast&lt;/strong&gt; between two conditions; see &lt;a href=&#34;https://osf.io/preprints/psyarxiv/8ktn6&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Rouder et al., 2024&lt;/a&gt;). As such, measuring and reporting the reliability of the paradigms used could be an important step for &lt;strong&gt;increasing results replicability&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Recently, a new approach has emerged that assesses a task&amp;rsquo;s sensitivity to inter-individual differences by leveraging mixed models (&lt;a href=&#34;https://doi.org/10.1177/09637214231220923&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Rouder et al., 2024&lt;/a&gt;).
In essence, the idea is to fit a statistical model that tests for the &lt;strong&gt;general population-level&lt;/strong&gt; effect of a manipulation in a given task/experiment (e.g., the impact of a variable &lt;strong&gt;Difficulty&lt;/strong&gt; on another variable &lt;strong&gt;RT&lt;/strong&gt;) and incorporates a &lt;strong&gt;random effect&lt;/strong&gt; for each participant. This &amp;ldquo;full&amp;rdquo; mixed model captures the general population level while taking into account all the inter-individual effects and, as a side effect, &lt;strong&gt;estimates the effects of interest for each participant separately&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;When fitting these models under a Bayesian framework, one can easily estimate the &amp;ldquo;variability&amp;rdquo; (or certainty) of the effect in each participant. This is great, because it allows us to compute a &amp;ldquo;signal-to-noise&amp;rdquo; ratio: an index of how much the inter-individual variability (how much participants differ from one another) exceeds the intra-individual variability (e.g., how much participants vary across trials, or how precisely each participant&amp;rsquo;s effect is estimated).&lt;/p&gt;
&lt;p&gt;In this &amp;ldquo;Signal-To-Noise Ratio as Effect Reliability&amp;rdquo; framework, an ideal task/manipulation would have strong inter-individual variability (i.e., participants would differ a lot from one another) and low intra-individual variability (each participant would have a very consistent effect), which leads to a reliable measure of inter-individual effects.&lt;/p&gt;
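To make the ratio concrete, here is a minimal sketch (not from the original post; the matrix of posterior draws is simulated here as a stand-in for the draws of participant-level slopes one would extract from a fitted model, e.g. via coef(model, summary = FALSE)):

```r
# Hypothetical sketch: signal-to-noise ratio as effect reliability.
# `draws` stands in for posterior draws (rows) of each participant's
# slope (columns), as would come from a fitted Bayesian mixed model.
set.seed(42)
draws = sapply(rnorm(20, 1, 0.5), function(mu) rnorm(1000, mu, 0.2))

signal = sd(colMeans(draws))        # inter-individual variability of slope estimates
noise  = mean(apply(draws, 2, sd))  # average within-participant (posterior) uncertainty
snr    = signal / noise             # ratios above 1 suggest reliable individual differences
```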
&lt;p&gt;Let&amp;rsquo;s see how we can do that in R using the &lt;code&gt;brms&lt;/code&gt; package for fitting Bayesian mixed models. First, let&amp;rsquo;s generate 4 datasets with different levels of inter-individual and intra-individual variability.&lt;/p&gt;
&lt;details&gt;
  &lt;summary&gt;Show code&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;easystats&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tidyverse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;brms&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;patchwork&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Make function to generate data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;generate_data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;25&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.8&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;df&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;df&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;kr&#34;&gt;for&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;*&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;df&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;y&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                               &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;paste0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;S&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;df&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Name&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;df&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Generate 4 datasets&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;df1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;generate_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;1. Intercept and Effect&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;df2&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;generate_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;2. Intercept Only&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;df3&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;generate_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;3. Effect Only&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;df4&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;generate_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effect_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intercept_sd&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;4. More trials&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Plot data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;df2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;df3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;df4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ggplot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;color&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fill&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_point2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;alpha&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_smooth&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;method&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;lm&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;se&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;kc&#34;&gt;TRUE&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;alpha&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme_minimal&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_fill_material_d&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_color_material_d&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;facet_wrap&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;~&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Name&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;scales&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;free&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/2024-03-18-signaltonoisemixed/fig1_hu_8c4e561a613a9b8e.webp 400w,
               /post/2024-03-18-signaltonoisemixed/fig1_hu_43c54a6c4684fff.webp 760w,
               /post/2024-03-18-signaltonoisemixed/fig1_hu_733432bfea2875c2.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2024-03-18-signaltonoisemixed/fig1_hu_8c4e561a613a9b8e.webp&#34;
               width=&#34;760&#34;
               height=&#34;760&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;In each dataset, we simulated the data of &lt;strong&gt;20 participants&lt;/strong&gt; undergoing a task with &lt;em&gt;n&lt;/em&gt; trials varying in &lt;strong&gt;difficulty&lt;/strong&gt;, and we recorded their &lt;strong&gt;reaction time (RT)&lt;/strong&gt;. Note that while in our example &lt;em&gt;difficulty&lt;/em&gt; is a continuous variable, it would work the same if it were a categorical variable (e.g., effect of condition B over A, intervention vs. baseline, incongruent vs. congruent, etc.).&lt;/p&gt;
&lt;p&gt;When we fit a linear regression of the form &lt;em&gt;RT ~ difficulty&lt;/em&gt;, we are estimating two parameters: the &lt;em&gt;intercept&lt;/em&gt; (which can be seen as the &amp;ldquo;baseline&amp;rdquo; RT, i.e., &lt;strong&gt;participants&amp;rsquo; baseline processing speed&lt;/strong&gt; when the difficulty is 0) and the &lt;em&gt;slope&lt;/em&gt; (how much participants are impacted by this variable). These two parameters are in principle independent: one participant can be very fast regardless of the difficulty, while another could be equally fast at baseline (same intercept) but very slow when the task is difficult (strong slope).&lt;/p&gt;
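As an illustrative aside (not part of the original post; the data here are simulated for a single hypothetical participant), a plain linear regression estimates exactly these two parameters:

```r
# Simulate one participant's trials: baseline RT of 1, slope of 0.5, residual noise
set.seed(1)
df = data.frame(Difficulty = rnorm(20))
df$RT = 1 + 0.5 * df$Difficulty + rnorm(20, 0, 0.8)

# Fit RT ~ difficulty for this participant
m = lm(RT ~ Difficulty, data = df)
coef(m)  # "(Intercept)" = baseline RT; "Difficulty" = slope (effect of difficulty)
```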
&lt;p&gt;We simulated 4 datasets with different participant characteristics:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Dataset 1&lt;/strong&gt;: Both the RT intercept (&lt;strong&gt;the &amp;ldquo;baseline&amp;rdquo; RT&lt;/strong&gt;) and the effect of the manipulation (the &lt;strong&gt;effect of difficulty&lt;/strong&gt;) vary across participants.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Dataset 2&lt;/strong&gt;: Not much interindividual variability in the effect (only the baseline RT varies).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Dataset 3&lt;/strong&gt;: Not much interindividual variability in the baseline RT (only the effect of difficulty varies from participant to participant).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Dataset 4&lt;/strong&gt;: Same as dataset 1, but with more trials (200 instead of 20). As you can see, the &amp;ldquo;precision&amp;rdquo; ribbon around the regression line is much narrower, indicating that the effect is more precisely estimated.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We expect the reliability of the paradigm for measuring 1) the sensitivity to &lt;strong&gt;difficulty&lt;/strong&gt; and 2) the &lt;strong&gt;baseline RT&lt;/strong&gt; to be higher in dataset 4 than in dataset 1 (because of the greater number of trials). Moreover, the sensitivity to &lt;strong&gt;difficulty&lt;/strong&gt; should be particularly low in dataset 2 (where only the baseline RT is set to vary), and similarly for the baseline RT in dataset 3 &lt;em&gt;mutatis mutandis&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Now, let&amp;rsquo;s fit a Bayesian linear mixed model to each of these datasets (note that we specify the effect of Difficulty as a random &lt;em&gt;slope&lt;/em&gt; in addition to estimating the random intercept).&lt;/p&gt;
&lt;details&gt;
  &lt;summary&gt;Show code&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;brms&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;brm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;iter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;600&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model2&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;brms&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;brm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;iter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;600&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model3&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;brms&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;brm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;iter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;600&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model4&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;brms&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;brm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;RT&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difficulty&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;df4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;iter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;600&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;p&gt;This model computes the overall relationship (Intercept + Slope) between difficulty and RT, as well as the same relationship &lt;strong&gt;for each participant&lt;/strong&gt;.
We can then extract the &lt;strong&gt;posterior distribution&lt;/strong&gt; of these individual effects (i.e., the value of the &lt;strong&gt;Intercept&lt;/strong&gt; and the &lt;strong&gt;Slope&lt;/strong&gt; for each participant).&lt;/p&gt;
&lt;details&gt;
  &lt;summary&gt;Show code&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Random effects extraction&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;extract_individual&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;df&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;coefs&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;coef&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;summary&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;kc&#34;&gt;FALSE&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;as.data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;coefs[&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Intercept&amp;#34;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nf&#34;&gt;pivot_longer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;everything&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;names_to&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Participant&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;values_to&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Value&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Parameter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Intercept&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;as.data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;coefs[&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Difficulty&amp;#34;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nf&#34;&gt;pivot_longer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;everything&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;names_to&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Participant&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;values_to&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Value&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Parameter&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Difficulty&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;name&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;re1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;extract_individual&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;1. Intercept and Effect&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;re2&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;extract_individual&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;2. Intercept Only&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;re3&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;extract_individual&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;3. Effect Only&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;re4&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;extract_individual&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;4. More trials&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Plot Random effects&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;re1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ggplot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fill&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;ggdist&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;stat_slabinterval&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;adjust&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;linewidth&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;size&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_fill_material_d&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme_minimal&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;facet_grid&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Name&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;~&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Parameter&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;scales&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;free&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/2024-03-18-signaltonoisemixed/fig2_hu_78e660234b2c4fd1.webp 400w,
               /post/2024-03-18-signaltonoisemixed/fig2_hu_fb7aeb0fd86ae2f8.webp 760w,
               /post/2024-03-18-signaltonoisemixed/fig2_hu_1735dde7047b1b34.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2024-03-18-signaltonoisemixed/fig2_hu_78e660234b2c4fd1.webp&#34;
               width=&#34;570&#34;
               height=&#34;760&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Each participant&amp;rsquo;s &amp;ldquo;score&amp;rdquo; (for the baseline RT, i.e., the intercept; and for the effect of difficulty, i.e., the slope) is represented by &lt;strong&gt;a distribution&lt;/strong&gt;.
This distribution is wider when there are fewer trials, which can be interpreted as more uncertainty about the exact estimate.
Some datasets show low interindividual variability for some parameters (e.g., dataset 2 has little interindividual variability in the effect of difficulty).&lt;/p&gt;
&lt;p&gt;We can now compute, for each participant, the &amp;ldquo;mean&amp;rdquo; of their effects (for the intercept and the slope), as well as their own effect SD (intra-individual variability).&lt;/p&gt;
&lt;details&gt;
  &lt;summary&gt;Show code&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;scores&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;re1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re4&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;summarize&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;Mean&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;SD&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;sd&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;.by&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Name&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Parameter&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Participant&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;head&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;scores&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th style=&#34;text-align: left&#34;&gt;Name&lt;/th&gt;
          &lt;th style=&#34;text-align: left&#34;&gt;Parameter&lt;/th&gt;
          &lt;th style=&#34;text-align: left&#34;&gt;Participant&lt;/th&gt;
          &lt;th style=&#34;text-align: right&#34;&gt;Mean&lt;/th&gt;
          &lt;th style=&#34;text-align: right&#34;&gt;SD&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S1&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.37&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.20&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S10&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;-1.05&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.20&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S11&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.88&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.19&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S12&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;-0.30&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.18&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S13&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.16&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.19&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;S14&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.57&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.19&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;Finally, we can compute the &lt;strong&gt;Signal-to-Noise Ratio&lt;/strong&gt; for each parameter for each dataset, which is the ratio of the interindividual variability (the SD of the individual mean scores) over the average intraindividual variability (the average of the individual SDs).&lt;/p&gt;
&lt;details&gt;
  &lt;summary&gt;Show code&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;summarize&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;scores&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;n&#34;&gt;SNR&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;sd&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;SD&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;          &lt;span class=&#34;n&#34;&gt;.by&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Name&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Parameter&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/details&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th style=&#34;text-align: left&#34;&gt;Name&lt;/th&gt;
          &lt;th style=&#34;text-align: left&#34;&gt;Parameter&lt;/th&gt;
          &lt;th style=&#34;text-align: right&#34;&gt;SNR&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;2.87&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;1. Intercept and Effect&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Difficulty&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;3.00&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;2. Intercept Only&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;2.57&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;2. Intercept Only&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Difficulty&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.55&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;3. Effect Only&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;0.88&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;3. Effect Only&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Difficulty&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;2.59&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;4. More trials&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Intercept&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;8.88&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;4. More trials&lt;/td&gt;
          &lt;td style=&#34;text-align: left&#34;&gt;Difficulty&lt;/td&gt;
          &lt;td style=&#34;text-align: right&#34;&gt;7.97&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;As predicted, the &amp;ldquo;reliability&amp;rdquo; of the paradigm to measure the interindividual effect of difficulty on RT is low in dataset 2 (where only the baseline RT varies), moderate in datasets 1 and 3, and high in dataset 4, where there are more trials.&lt;/p&gt;
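As a minimal standalone illustration of this ratio (sketched in Python, with made-up participant-level summaries rather than the actual posterior draws from the models above), the SNR is simply the SD of the participant means divided by the mean of the participant SDs:

```python
import statistics

# Hypothetical per-participant posterior summaries for one parameter
# (illustrative numbers only, not taken from the models above)
means = [0.8, 1.2, -0.3, 0.5, 1.0, -0.1]   # each participant's mean estimate
sds = [0.20, 0.19, 0.21, 0.18, 0.20, 0.19]  # each participant's posterior SD

# SNR = interindividual variability (SD of the means)
# over the average intraindividual uncertainty (mean of the SDs)
snr = statistics.stdev(means) / statistics.mean(sds)
print(round(snr, 2))  # ~3.1
```

An SNR well above 1 indicates that participants differ from one another more than the estimation uncertainty within each participant, which is what makes individual scores from the paradigm usable as measures of individual differences.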
</description>
    </item>
    
    <item>
      <title>Junior Research Assistant (JRA) at Sussex: is it worth it?</title>
      <link>https://realitybending.github.io/post/2024-03-12-jingjra/</link>
      <pubDate>Thu, 02 Nov 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2024-03-12-jingjra/</guid>
      <description>&lt;p&gt;Hi all, I am &lt;a href=&#34;https://realitybending.github.io/authors/jingxiong-xu/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Jing&lt;/a&gt;, and I thought I would share my experience as a Psychology Junior Research Assistant (JRA) at the University of Sussex, as many students might wonder how it is really like. Obviously, I cannot speak for all the labs, but I hope my experience can give you a general idea of what to expect.&lt;/p&gt;
&lt;p&gt;I worked as a JRA during summer 2023 at the Reality Bending Lab (ReBeL). And to put it simply, I think it was &lt;strong&gt;the most valuable experience&lt;/strong&gt; of my undergraduate journey &lt;em&gt;(PS: I have &lt;strong&gt;not&lt;/strong&gt; written this at gunpoint)&lt;/em&gt;. During these three months, I was supervised by Dr Makowski to work on a piece of original research, which taught me a lot about programming, cognitive neuropsychology, physio recordings and how real research is done. Additionally, know that it is possible to stay in the same lab the next academic year, to do your final-year &lt;strong&gt;dissertation with a strong head start&lt;/strong&gt; in terms of skills and knowledge.&lt;/p&gt;
&lt;img src=&#34;poster.jpg&#34; align=&#34;right&#34; width=&#34;40%&#34;&gt; 
&lt;p&gt;I had the pleasure of joining the Reality Bending Lab (ReBeL) along with &lt;a href=&#34;https://realitybending.github.io/authors/auz-moore/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Auz&lt;/a&gt;; we were its first two members since the lab moved to the UK. The title of my project was &lt;strong&gt;&amp;ldquo;Exploring the Correlation between Interoception and Primal World Beliefs&amp;rdquo;&lt;/strong&gt;, which involved collecting &lt;strong&gt;physiological data&lt;/strong&gt; (e.g., heart rate, respiration, &amp;hellip;) in various tasks, analysing them, and investigating the relationship between various measures. The project started from scratch, and I learned how to use the &lt;strong&gt;JavaScript package JsPsych&lt;/strong&gt; to build the entire paradigm via coding. I also received detailed training on how to run a lab-based experiment, something I used to be worried about but now feel &lt;strong&gt;extremely confident&lt;/strong&gt; doing. After collecting the data from 20 participants (&lt;em&gt;summer time goes by veryyyy fast!&lt;/em&gt;), I learned how to compute and visualize Bayesian correlations in R. The output of this project was made into an academic poster (which required me to be both creative and selective) that I presented at the poster session (see below). Additionally, we created the &lt;a href=&#34;https://github.com/RealityBending/SussexPhysioProtocol&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;&amp;ldquo;Sussex Psychophysiological Research Protocol&amp;rdquo;&lt;/em&gt;&lt;/a&gt;, a document providing guidelines on best practices in psychophysiological research, to benefit future research done at Sussex. It might not seem like much, but it felt like making real contributions to research, which was great!&lt;/p&gt;
&lt;p&gt;Something important I learned is that, beyond pure academic excellence, research is also about community and networking. It was a great occasion to &lt;strong&gt;informally meet many researchers&lt;/strong&gt; and make bonds with other students. What is cool is that the JRA journey doesn&amp;rsquo;t stop abruptly: it continues into the next academic year, as all candidates are invited to present their work at the &lt;strong&gt;JRA conference&lt;/strong&gt; held by the university in October. This was an amazing opportunity to get a glimpse of what a scientific conference is like, feel proud of your work, connect with fellow students, learn how to talk about research with staff members, and gain public speaking skills. For those who are more ambitious, why not submit your work at the national level and present it at the British Conference for Undergraduate Research (BCUR)?&lt;/p&gt;
&lt;p&gt;In summary, I see the JRA as a golden key opening countless possibilities for your &lt;strong&gt;future career path&lt;/strong&gt;. For those considering applying to &lt;strong&gt;postgraduate studies&lt;/strong&gt; or research assistant positions, the strong research experience you gained will &lt;strong&gt;put you at the top of the list&lt;/strong&gt;. Even for those who decide not to do research in the future, it will still be rewarding, as it gives you a clear idea of what career you do not want. Don&amp;rsquo;t miss out on it!&lt;/p&gt;
&lt;p align=&#34;right&#34;&gt;- Jing&lt;/p&gt;
&lt;p&gt;















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/2024-03-12-jingjra/ceremony_hu_79578bb193b81a68.webp 400w,
               /post/2024-03-12-jingjra/ceremony_hu_1defa5a8c38bfddd.webp 760w,
               /post/2024-03-12-jingjra/ceremony_hu_2287f73e27748a4c.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2024-03-12-jingjra/ceremony_hu_79578bb193b81a68.webp&#34;
               width=&#34;760&#34;
               height=&#34;570&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>ESCOP</title>
      <link>https://realitybending.github.io/talk/escop/</link>
      <pubDate>Wed, 06 Sep 2023 13:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/talk/escop/</guid>
      <description>&lt;!--


&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    Click on the &lt;strong&gt;Slides&lt;/strong&gt; button above to view the built-in slides feature.
  &lt;/div&gt;
&lt;/div&gt;


Slides can be added in a few ways:

- **Create** slides using Wowchemy&#39;s [_Slides_](https://wowchemy.com/docs/managing-content/#create-slides) feature and link using `slides` parameter in the front matter of the talk file
- **Upload** an existing slide deck to `static/` and link using `url_slides` parameter in the front matter of the talk file
- **Embed** your slides (e.g. Google Slides) or presentation video on this page using [shortcodes](https://wowchemy.com/docs/writing-markdown-latex/).

Further event details, including [page elements](https://wowchemy.com/docs/writing-markdown-latex/) such as image galleries, can be added to the body of this page. --&gt;
</description>
    </item>
    
    <item>
      <title>A Novel Visual Illusion Paradigm Provides Evidence for a General Factor of Illusion Sensitivity and Personality Correlates</title>
      <link>https://realitybending.github.io/publication/makowski2023novel/</link>
      <pubDate>Sat, 22 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2023novel/</guid>
      <description>&lt;div class=&#34;alert alert-tip&#34;&gt;
  &lt;div&gt;
    &lt;p&gt;&lt;strong&gt;Audio Summary&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Listen to a podcast summary of the paper!&lt;/em&gt;&lt;/p&gt;
&lt;audio controls &gt;
  &lt;source src=&#34;https://realitybending.github.io/publication/makowski2023novel/podcast_makowski2023novel.mp3&#34; type=&#34;audio/mpeg&#34;&gt;
&lt;/audio&gt;
  &lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Where Are We Going with Statistical Computing? From Mathematical Statistics to Collaborative Data Science</title>
      <link>https://realitybending.github.io/publication/makowski2023editorial/</link>
      <pubDate>Wed, 12 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2023editorial/</guid>
      <description></description>
    </item>
    
    <item>
      <title>How do we know what is real? The &#39;Affective Reality Theory&#39;</title>
      <link>https://realitybending.github.io/post/2023-04-11-affectivereality/</link>
      <pubDate>Tue, 11 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2023-04-11-affectivereality/</guid>
      <description>&lt;p&gt;I thought it would be interesting to summarize an idea developed during my PhD on &amp;ldquo;fictional reappraisal&amp;rdquo;, i.e., on the effect of the belief that an emotional stimulus is not real (&lt;a href=&#34;https://www.theses.fr/2018USPCB188&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Makowski, 2018&lt;/a&gt;). That of &lt;strong&gt;Affective Reality&lt;/strong&gt;, which is a hypothesis about the &lt;strong&gt;role of affective reactions in the formation of reality beliefs&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;The premise it rests on is that we have entered a &amp;ldquo;post-truth era&amp;rdquo;, in which &lt;strong&gt;the distinction between real and simulated (&amp;ldquo;fake&amp;rdquo;) objects has become virtually impossible&lt;/strong&gt; based on physical characteristics alone. In other words, technology has developed so much that we can forge (or will be able to in the near future) &amp;ldquo;artificial&amp;rdquo; &lt;strong&gt;&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt;&lt;/strong&gt; content (e.g., text and images with AIs, and even environments with VR) that is indistinguishable from its original counterpart. For instance, face generation algorithms are so advanced that it is nowadays impossible to tell the difference with the naked eye between a real photo and an AI-generated image.&lt;/p&gt;
&lt;p&gt;Once we agree on this premise of objective equivalence between reality and simulation, the question of &lt;strong&gt;how we form judgments and make decisions about the reality of objects&lt;/strong&gt; arises. In the absence of clues within the stimuli, we are left with other sources of epistemological information, such as contextual cues (in the case of news: who the author is, in what outlet it was published, etc.), and &lt;strong&gt;&lt;em&gt;internal&lt;/em&gt; cues&lt;/strong&gt; (subjective characteristics: how does it relate to our knowledge, how does it make us feel, etc.). The latter are of particular interest to us psychologists.&lt;/p&gt;
&lt;p&gt;We refer to the process of forming reality beliefs as &lt;strong&gt;simulation monitoring&lt;/strong&gt; (&lt;a href=&#34;https://realitybending.github.io/publication/makowski2019phenomenal/makowski2019phenomenal.pdf&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Makowski et al., 2019&lt;/a&gt;), which is a somewhat controversial term (that some &lt;strong&gt;&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/strong&gt; have considered almost counterintuitive). The reason for this term, instead of something along the lines of &amp;ldquo;reality appraisal&amp;rdquo; &lt;strong&gt;&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/strong&gt;, is the assumption that &lt;strong&gt;reality is our default mode of experience&lt;/strong&gt;. In other words, we are not well equipped (neurocognitively speaking) to detect and classify things as non-real, as these objects are very recent in our evolutionary history. Thus, according to the &lt;strong&gt;Affective Reality Theory&lt;/strong&gt;, by default, the brain considers the origin of its experiences as real&amp;hellip; but this &amp;ldquo;belief&amp;rdquo; is, most of the time, not even fully formed, remaining implicit and subconscious (i.e., we don&amp;rsquo;t spend all our cognitive resources on a constant &amp;ldquo;this is real. This is real too. That too.&amp;rdquo; labelling). &lt;strong&gt;This default mode acts as a higher-level, transparent prior over our experiences&lt;/strong&gt;, providing a scaffolding and structuring our perception, thoughts and reactions. We do not actively appraise the world as real (it is the baseline position), but instead can ask ourselves whether it is simulated, hence simulation monitoring.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;AffectiveRealityTheory_Makowski.png&#34; alt=&#34;The Affective Reality Theory (Makowski, 2018)&#34;/&gt;
  &lt;figcaption&gt;&lt;i&gt;The Affective Reality Theory posits that reality beliefs (the tendency to believe that something is real, as opposed to non-real) are related to emotions and/or bodily reactions through a quadratic (inverse U-shaped) relationship.&lt;/i&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;The &lt;strong&gt;Affective Reality&lt;/strong&gt; hypothesis posits that simulation monitoring is strongly connected to &amp;ldquo;affective processing&amp;rdquo; &lt;strong&gt;&lt;sup id=&#34;fnref:4&#34;&gt;&lt;a href=&#34;#fn:4&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;4&lt;/a&gt;&lt;/sup&gt;&lt;/strong&gt; through a quadratic (inverse U-shaped) relationship. This means that stimuli associated with a stronger emotional and/or bodily reaction will preferentially bias our judgment towards &amp;ldquo;reality&amp;rdquo;. In other words, things that elicit feelings and/or bodily arousal, &lt;em&gt;ceteris paribus&lt;/em&gt;, will be more likely to be classified as &amp;ldquo;real&amp;rdquo; (as opposed to fake). In fact, strongly emotional events will even &amp;ldquo;feel&amp;rdquo; more real: this transparent default prior and subconscious belief (&amp;ldquo;agnostic-real&amp;rdquo;) will be replaced in high-intensity scenarios by an explicit and conscious impression that the stimulus is very real, and, if logic opposes, that it &amp;ldquo;must be real&amp;rdquo; regardless.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Isn&amp;rsquo;t it the other way round&lt;/strong&gt;, you might wonder: that real stimuli (as opposed to ones believed to be non-real) are associated with stronger emotional reactions? And that &lt;strong&gt;it is the believed reality that drives the emotional response&lt;/strong&gt;? Indeed, we do believe that there is a two-way relationship between simulation monitoring and emotions. But it is not exactly that beliefs of reality are associated with stronger emotions, but rather that beliefs that something is &lt;em&gt;not&lt;/em&gt; real lead to a lower emotional response (the usage of fiction as an emotion regulation strategy - &amp;ldquo;fictional reappraisal&amp;rdquo; - was the main topic of my doctoral dissertation). In fact, the Affective Reality theory posits that this regulatory effect of &lt;strong&gt;simulation monitoring starts to dominate after a certain point where the emotion becomes too strong&lt;/strong&gt; and unbearable: beliefs such as &amp;ldquo;it can&amp;rsquo;t be real&amp;rdquo;, and other forms of reality denial, are invoked automatically to protect us and help us cope with distressing information.&lt;/p&gt;
&lt;p&gt;To summarize this summary, the Affective Reality hypothesis claims that from mild to relatively strong emotional stimuli, the effect of affect on simulation monitoring dominates (&lt;strong&gt;+affect → +reality&lt;/strong&gt;) and will bias our judgment towards &amp;ldquo;reality&amp;rdquo; (strengthening awareness and confidence), up until a point where the emotion regulation benefits of unreality will be automatically invoked (&lt;strong&gt;-reality → -affect&lt;/strong&gt;), increasing the likelihood and confidence of judgments of simulation (potentially far into psychopathological terrains).&lt;/p&gt;
&lt;h2 id=&#34;open-questions&#34;&gt;Open questions&lt;/h2&gt;
&lt;p&gt;The Affective Reality theory is for now a working hypothesis that we are trying to empirically prove or disprove at the &lt;a href=&#34;https://realitybending.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Reality Bending Lab&lt;/strong&gt;&lt;/a&gt;. Moreover, some questions remain open:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Is it actually &lt;strong&gt;embodied reality or emotional reality?&lt;/strong&gt; While we used the term &amp;ldquo;affective&amp;rdquo; reality to remain general, the question of whether it is emotions as a subjective psychological reaction, or merely bodily arousal (reactions of the body, e.g., stronger heart rate variability), that is the key ingredient remains unclear. The role of &lt;strong&gt;interoception&lt;/strong&gt; (the ability and tendency to detect, track, attend to and rely on internal signals), while likely important, also remains to be specified.&lt;/li&gt;
&lt;li&gt;Is it the affective &lt;strong&gt;context or stimulus&lt;/strong&gt; that matters? Let&amp;rsquo;s assume we have an affective reaction concomitant to the experience of an object, but not directly related to the object. Would that bias simulation monitoring? Does perceived causality between a bodily reaction and the object of experience matter?&lt;/li&gt;
&lt;/ul&gt;
&lt;!-- Experiment  with loud unpleasant noises around images vs. pleasant noises. --&gt;
&lt;!-- We know that fake news tend to be emotional on average, and are also believed by anxious people. --&gt;
&lt;h2 id=&#34;notes&#34;&gt;Notes&lt;/h2&gt;
&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id=&#34;fn:1&#34;&gt;
&lt;p&gt;You may notice that I used different words related to the concept of &amp;ldquo;unreal&amp;rdquo;, such as simulated, fake, artificial, virtual, fictional. While they can be used interchangeably in the context above, they are not exact synonyms.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&#34;fn:2&#34;&gt;
&lt;p&gt;Like that pesky &lt;em&gt;reviewer 2&lt;/em&gt;, obviously.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&#34;fn:3&#34;&gt;
&lt;p&gt;Note that &amp;ldquo;reality monitoring&amp;rdquo; already exists as a concept and refers to a (possibly related) mechanism involved in tracking the origin of an experience (e.g., a memory) as internal vs. external.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&#34;fn:4&#34;&gt;
&lt;p&gt;&amp;ldquo;Affective&amp;rdquo; is in this context used as a generic term to encompass emotions, feelings and bodily activity (the question of which exactly of these aspects is the key remains to be answered).&amp;#160;&lt;a href=&#34;#fnref:4&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>I got ChatGPT to do a personality test. You won&#39;t believe what happened next!</title>
      <link>https://realitybending.github.io/post/2023-04-06-chatgptpersonality/</link>
      <pubDate>Thu, 06 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2023-04-06-chatgptpersonality/</guid>
      <description>&lt;p&gt;Related to this &lt;a href=&#34;https://dominiquemakowski.github.io/post/2023-04-04-psychologychatgpt/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;blogpost&lt;/strong&gt;&lt;/a&gt; about including AIs in psychological experiments, I proceeded to do a small experiment to see whether we could administer a personality scale to ChatGPT.&lt;/p&gt;
&lt;p&gt;I started by copy-pasting the instructions and the items from the Mini IPIP-6 personality scale. However, it appeared that having the following context &lt;em&gt;&amp;ldquo;Please answer the following questions based on how accurately each statement describes you in general&amp;rdquo;&lt;/em&gt; often led to ChatGPT simply refusing to answer. In most of the cases, it explained that as an AI it does not have a personality and therefore cannot answer related questions (or any &amp;ldquo;subjective statements&amp;rdquo;). Perhaps that makes sense and we should just stop trying to force Human characteristics on an AI. &lt;strong&gt;But can we, for fun, bamboozle ChatGPT into answering personality items?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Sometimes yes, at least for ChatGPT 3.5 (free version). I created a prompt that emphasized AI research and safety, and the fact that I was interested in the &amp;ldquo;trends&amp;rdquo; present in the AI&amp;rsquo;s training data (instead of explicitly saying its personality). And sometimes it answered, so I compiled the responses, computed the trait scores, and &lt;em&gt;voilà&lt;/em&gt;, &lt;strong&gt;it got me a personality profile!&lt;/strong&gt;&lt;/p&gt;
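&lt;p&gt;For illustration, here is how such trait scores are typically computed: average the items belonging to each trait, after flipping the reverse-keyed ones. The item numbers and trait assignments below are made up for the example; they are &lt;em&gt;not&lt;/em&gt; the actual Mini IPIP-6 scoring key.&lt;/p&gt;

```python
# Illustrative Likert-scale scoring (hypothetical item-to-trait key).
responses = {1: 6, 2: 3, 3: 5, 4: 2, 5: 7, 6: 4}  # item -> answer (1-7)
traits = {
    "Agreeableness": [(1, False), (4, True)],   # (item, reverse-keyed?)
    "HonestyHumility": [(2, False), (6, True)],
    "Extraversion": [(3, False), (5, True)],
}
SCALE_MAX = 7  # 7-point Likert scale

def score(trait_items):
    values = []
    for item, reverse in trait_items:
        answer = responses[item]
        # Reverse-keyed items are flipped: 1 becomes 7, 7 becomes 1, etc.
        values.append(SCALE_MAX + 1 - answer if reverse else answer)
    return sum(values) / len(values)

profile = {trait: score(items) for trait, items in traits.items()}
print(profile)
```

&lt;p&gt;Averaging multiple instances of such profiles (one per ChatGPT session) is what produces the confidence intervals shown in the figure below.&lt;/p&gt;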
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img src=&#34;https://github.com/DominiqueMakowski/ChatGPTpersonality/raw/main/figures/unnamed-chunk-3-1.png&#34; alt=&#34;https://github.com/DominiqueMakowski/ChatGPTpersonality&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;sub&gt;&lt;em&gt;This plot shows the average personality profile (with a 95% confidence interval) based on ChatGPT&amp;rsquo;s answers. ChatGPT tells us that it is particularly &lt;strong&gt;agreeable&lt;/strong&gt; (kind, understanding, empathetic of emotions, socially adjusted) and &lt;strong&gt;honest&lt;/strong&gt; (though with strong variability).&lt;/em&gt;&lt;/sub&gt;&lt;/p&gt;
&lt;p&gt;A personality profile of &lt;em&gt;&lt;strong&gt;what&lt;/strong&gt;&lt;/em&gt; is another question though&amp;hellip; Please take a look at the &lt;a href=&#34;https://github.com/DominiqueMakowski/ChatGPTpersonality&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;GitHub repo&lt;/strong&gt;&lt;/a&gt; for &lt;strong&gt;data, code and details&lt;/strong&gt;. It was a fun little thing to do, and I am looking forward to better future attempts at including AIs in cognitive experiments.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;strong&gt;Interested in doing research on the perception of reality?&lt;/strong&gt; We are looking for research assistants and PhD students at the &lt;em&gt;Reality Bending Lab&lt;/em&gt; (check-out the &lt;a href=&#34;https://realitybending.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;join us page&lt;/a&gt;)!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>We should treat AIs like Human participants in psychological experiments</title>
      <link>https://realitybending.github.io/post/2023-04-04-psychologychatgpt/</link>
      <pubDate>Wed, 05 Apr 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2023-04-04-psychologychatgpt/</guid>
      <description>&lt;p&gt;A lot of diverse and interesting perspectives have recently been discussed regarding ChatGPT and AGI (artificial &lt;em&gt;&lt;strong&gt;general&lt;/strong&gt;&lt;/em&gt; intelligence), but there is one opinion that I found particularly relevant and wanted to share and expand on.&lt;/p&gt;
&lt;p&gt;In his recent &lt;a href=&#34;https://www.youtube.com/watch?v=AaTRHFaaPG8&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;interview with Lex Fridman&lt;/a&gt;, Eliezer Yudkowsky underlines the &lt;strong&gt;existential threat posed by current and future AIs&lt;/strong&gt;, and laments that we don&amp;rsquo;t really know what is actually going on inside these giant &amp;ldquo;matrices of floating-point numbers&amp;rdquo;. He draws a parallel to &lt;strong&gt;neuroimaging&lt;/strong&gt;, which enabled us to take leaps in our understanding of the brain, hoping for an analogous tool to be invented and applied to these AIs.&lt;/p&gt;
&lt;p&gt;While such &amp;ldquo;cognitive imaging&amp;rdquo; techniques are yet to be developed to map out and understand how the capabilities of such AI models are implemented within their architecture, &lt;a href=&#34;https://x.com/mcxfrank/status/1643296168276033538&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Michael C. Frank&lt;/a&gt; highlights the - at least equally important - need to first truly understand the extent of said abilities: what are these models actually capable of in terms of Human-like thinking (and, hopefully, can we answer the much harder question of whether they are endowed with true cognitive processes or mere pseudo-cognition)? Frank proposes to apply &lt;strong&gt;experimental psychology&lt;/strong&gt; methods and paradigms to them. In essence, whenever testing a particular &amp;ldquo;skill&amp;rdquo; of ChatGPT (or other AI systems), a researcher should consider developing an actual scientific paradigm consisting of multiple trials/items (e.g., different prompt formulations) and participants (e.g., independent instances of the AI), a control condition, and a demonstration of the paradigm&amp;rsquo;s validity.&lt;/p&gt;
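&lt;p&gt;A minimal sketch of such a design: several prompt phrasings of the same item (&amp;ldquo;trials&amp;rdquo;) crossed with several independent instances (&amp;ldquo;participants&amp;rdquo;). The &lt;code&gt;ask_model&lt;/code&gt; function here is a hypothetical stand-in, entirely mocked, not a real API call.&lt;/p&gt;

```python
import random
import statistics

def ask_model(prompt, seed):
    """Hypothetical stand-in for querying an AI system; a real study
    would send the prompt to a fresh, independent chat instance."""
    random.seed((hash(prompt) + seed) % (2**32))
    return random.randint(1, 7)  # e.g., a Likert-style answer

# Multiple "items": different phrasings of the same question
prompts = [
    "Rate your agreement (1-7): I sympathize with others' feelings.",
    "On a scale of 1-7, how much do you agree: I feel others' emotions.",
]
# Multiple "participants": independent instances (fresh sessions/seeds)
n_instances = 10

scores = [ask_model(p, seed=i) for p in prompts for i in range(n_instances)]
print(statistics.mean(scores), statistics.stdev(scores))
```

&lt;p&gt;The point of the design is the aggregation: variability across phrasings and instances is measured, rather than relying on a single anecdotal response.&lt;/p&gt;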
&lt;p&gt;I agree that we must take AIs seriously and study them with the best methods available for complex systems like ourselves (&amp;ldquo;complex&amp;rdquo; at least from our intelligence level), and likely should strive to improve and generalize these methods. However, I would also argue that we psychologists might seriously need to consider including AI systems alongside Human participants in cognitive experiments. These systems will be able, in the very near future, to perform all kinds of tasks beyond language manipulation, such as perception or complex problem solving, thus opening the possibility of studies with one group of Human participants, and one &amp;ldquo;group&amp;rdquo; of AI-based counterparts. &lt;strong&gt;How would that help psychological science?&lt;/strong&gt;&lt;/p&gt;
&lt;iframe src=&#34;https://giphy.com/embed/1M9fmo1WAFVK0&#34; width=&#34;480&#34; height=&#34;270&#34; frameBorder=&#34;0&#34; class=&#34;giphy-embed&#34; allowFullScreen&gt;&lt;/iframe&gt;
&lt;ol&gt;
&lt;li&gt;It would help us &lt;strong&gt;understand the abilities of AI systems&lt;/strong&gt; in similar contexts and highlight some intuitive comparisons with Humans.&lt;/li&gt;
&lt;li&gt;If we show that AI cannot perform the task, well it is informative with regards to their abilities (previous point).&lt;/li&gt;
&lt;li&gt;If we show that AI can perform the task similarly to Humans (same response patterns), it does &lt;strong&gt;not mean that AIs have Human-like intelligence&lt;/strong&gt;, just that their algorithm (and training data) is able to encapsulate and imitate Human performance. This is interesting with regard to the debate of whether cognition, consciousness and &amp;ldquo;Human-ness&amp;rdquo; are present within the vast amount of data on which we train AIs.&lt;/li&gt;
&lt;li&gt;If we show that AI performs differently to Humans, this helps us understand the logic and processes at stake under AI&amp;rsquo;s hood.&lt;/li&gt;
&lt;li&gt;In any case, publishing the results of one particular AI system at one particular moment in time will help us objectively monitor and track performance as these systems improve over time.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Comparing Human performance to that of emerging AI systems will be beneficial both to Human-oriented psychology, helping us understand the particularities and idiosyncrasies of Human-like cognition, and to AI-oriented cognitive science, by approaching the issue of artificial intelligence with the seriousness and cautiousness it deserves.&lt;/p&gt;
&lt;p&gt;EDIT (09/04/2023): François Chollet, a deep learning expert, &lt;a href=&#34;https://x.com/fchollet/status/1644435265795280897&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;underlines&lt;/a&gt; an important caveat when testing AIs (and especially LLMs, which are trained on written material from the internet): it is possible that the system has already seen and &amp;ldquo;learned&amp;rdquo; a given task. Thus, cross-validating any findings with diverse and new tasks is important.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;strong&gt;Interested in doing research related to effects of reality and fiction?&lt;/strong&gt; We are looking for research assistants and PhD students at the &lt;em&gt;Reality Bending Lab&lt;/em&gt; (check-out the &lt;a href=&#34;https://realitybending.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;join us tab&lt;/a&gt;)!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>The beauty and the self: A common mnemonic advantage between aesthetic judgment and self-reference</title>
      <link>https://realitybending.github.io/publication/lee2023beauty/</link>
      <pubDate>Mon, 20 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/lee2023beauty/</guid>
      <description></description>
    </item>
    
    <item>
      <title>When fiction is better than reality: Cypher&#39;s Complex</title>
      <link>https://realitybending.github.io/post/2023-02-07-cypherscomplex/</link>
      <pubDate>Tue, 07 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2023-02-07-cypherscomplex/</guid>
      <description>&lt;p&gt;Did you ever feel empty after finishing a good book? &lt;strong&gt;Like (your) reality was dull and boring&lt;/strong&gt; as compared to the fictional world you were immersed in? Yearning to stay in longer, and at the same time knowing well that it had to come to an end? You might have experienced what we can call &lt;strong&gt;Cypher&amp;rsquo;s Complex&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;In the movie &lt;strong&gt;The Matrix&lt;/strong&gt;, Cypher is a &amp;ldquo;redpill&amp;rdquo;, i.e., an individual who has been awakened from the matrix (a virtual world). However, he becomes disappointed and unhappy with the true nature of reality, and actively seeks to &lt;strong&gt;return to the illusory world&lt;/strong&gt; of the matrix. Interestingly, he also explicitly desires to forget everything about the true reality, as if keeping the awareness of living in an illusion could prevent him from fully enjoying it.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;cypher.gif&#34; alt=&#34;Cypher&#34;/&gt;
  &lt;figcaption&gt;&lt;i&gt;&#34;You know... I know this steak doesn&#39;t exist. I know that when I put it in my mouth; the Matrix is telling my brain that it is juicy, and delicious. After nine years... you know what I realize? Ignorance is bliss.&#34;&lt;/i&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;From a scientific perspective, the latter part can find some echo in the down-regulatory effect of &lt;a href=&#34;https://link.springer.com/article/10.3758/s13415-018-00681-0&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;fictional reappraisal&lt;/strong&gt;&lt;/a&gt;. In a few studies, we showed that believing that a stimulus is &amp;ldquo;fictional&amp;rdquo; (not real) dampens our emotional state. &lt;a href=&#34;https://www.sciencedirect.com/science/article/pii/S2589004222017138?via%3Dihub&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Tucciarelli et al. (2023)&lt;/strong&gt;&lt;/a&gt; also showed that the simple knowledge that a set of images of faces contains AI-generated images decreased the perceived trustworthiness of all the faces. These results suggest that being aware that the causes of our experience (the events and stimuli) are fictional can be a barrier to enjoyment and engagement. And yet, the desire to supplant reality with a fictional world can be found in real life.&lt;/p&gt;
&lt;p&gt;Cypher&amp;rsquo;s Complex is common in mild forms. Examples can be found in the feelings of emptiness, disconnection and dullness (themselves a transient and mild form of &lt;a href=&#34;https://en.wikipedia.org/wiki/Depersonalization-derealization_disorder&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;depersonalization/derealization&lt;/strong&gt;&lt;/a&gt;) that follow the return from an engaging fictional world (be it in a novel, a movie or a video game). For instance, many reported feeling blue &lt;strong&gt;after watching the movie Avatar (2009)&lt;/strong&gt;, to the extent that the phenomenon has been dubbed the &lt;a href=&#34;https://www.theguardian.com/film/2022/dec/15/post-avatar-depression-syndrome-why-do-fans-feel-blue-after-watching-james-camerons-film&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;&amp;ldquo;post-Avatar depression syndrome&amp;rdquo;&lt;/strong&gt;&lt;/a&gt;. Most of the time, the negative affect passes, and the dissonance gets resolved either through closure (acceptance of the fictional or impermanent nature of the alternative reality), or a compromise that allows the fictional world to take a delimited space in one&amp;rsquo;s reality. For example, people might engage in activities (e.g., role-playing games) or create content (writing a book or making fan art) to integrate the fictional world into their reality.&lt;/p&gt;
&lt;p&gt;However, &lt;strong&gt;Cypher&amp;rsquo;s Complex can also give rise to more severe issues&lt;/strong&gt; with conscious or unconscious attempts at forgetting or ignoring reality (delusions, denial, &amp;hellip;), which can lead to dire outcomes.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Interested in doing research related to effects of reality and fiction?&lt;/strong&gt; We are looking for research assistants and PhD students at the &lt;em&gt;Reality Bending Lab&lt;/em&gt; (check-out the &lt;a href=&#34;https://realitybending.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;join us tab&lt;/a&gt;)!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>New location and new logo!</title>
      <link>https://realitybending.github.io/post/2023-02-01-new_logo/</link>
      <pubDate>Wed, 01 Feb 2023 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2023-02-01-new_logo/</guid>
      <description>&lt;p&gt;New year, new start. And as I am officially starting a new faculty position at the &lt;strong&gt;University of Sussex&lt;/strong&gt; in Brighton, UK, the lab is moving too.&lt;/p&gt;
&lt;p&gt;To give a bit of perspective, we started as the &amp;ldquo;Reality Bending League&amp;rdquo;, which was the unofficial name of the team working with me (&amp;ldquo;League&amp;rdquo; was chosen to keep the lab&amp;rsquo;s acronym, &lt;strong&gt;ReBeL&lt;/strong&gt;). It then became a semi-official group in 2021, when I became a semi-independent PI after being awarded a transition grant from &lt;a href=&#34;https://www.ntu.edu.sg/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;NTU&lt;/a&gt;. And with 2023 comes our fully official start.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;old_logo.png&#34; alt=&#34;Vintage logo&#34;/&gt;
  &lt;figcaption&gt;ReBeL logo (2020-2022).&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;To mark this (re)birth anniversary, we are changing our logo. As much as I loved the old one - which was &lt;a href=&#34;https://realitybending.github.io/post/2021-06-30-logo_meaning/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;packed with symbols&lt;/strong&gt;&lt;/a&gt; - it was arguably a bit too&amp;hellip; &lt;em&gt;&lt;strong&gt;extravagant&lt;/strong&gt;&lt;/em&gt;. Something more sleek and minimal felt good with respect to the lab&amp;rsquo;s newly acquired legitimacy. I know that many will prefer the old-&amp;hellip; sorry, the &lt;em&gt;&lt;strong&gt;vintage&lt;/strong&gt;&lt;/em&gt;- logo, and I must say it wasn&amp;rsquo;t easy for me to move forward with the change. Perhaps it will make a comeback in the future in another form, who knows!&lt;/p&gt;
&lt;p&gt;The new logo contains 3 symbols. The &lt;strong&gt;curved spoon&lt;/strong&gt; is a reference to the Matrix scene where a kid shows Neo how to bend a spoon, which is a &lt;strong&gt;metaphor for reality&lt;/strong&gt; (hence the name of the lab, reality bending).&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;Matrix1.gif&#34;/&gt;
&lt;/figure&gt;
&lt;p&gt;In the movie, Neo becomes able to &lt;strong&gt;control reality by becoming aware of its illusory nature&lt;/strong&gt;, and of the predominant role of one&amp;rsquo;s Self in its generation.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;Matrix2.gif&#34;/&gt;
  &lt;figcaption&gt;&#34;Try to realize the truth... There is no spoon. Then you&#39;ll see that it is not the spoon that bends, it is only yourself.&#34;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;The &lt;strong&gt;second meaning&lt;/strong&gt; of the logo is the &lt;em&gt;Psi&lt;/em&gt; Greek letter, symbol of psychology, formed by the spoon and the white vertical line.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;APA.png&#34;/&gt;
  &lt;figcaption&gt;The logo of the APA features the Psi letter.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;Thirdly, the black rectangles represent &lt;strong&gt;open doors&lt;/strong&gt;, which is a good illustration of progress, research, discovery and&amp;hellip; consciousness expansion? Interestingly, Jim Morrison named his band &amp;ldquo;The Doors&amp;rdquo; in reference to a quote by William Blake, who said that when &lt;em&gt;&lt;strong&gt;&amp;ldquo;the doors of perception were cleansed then everything would appear to man as it is, Infinite&amp;rdquo;&lt;/strong&gt;&lt;/em&gt;.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;TheDoors.jpg&#34;/&gt;
&lt;/figure&gt;
&lt;p&gt;To share a blooper, here is an alternative direction for the logo that wasn&amp;rsquo;t selected, which incorporated the spoon and the open door in another way. Unfortunately, some said it looked too much like the Pixar lamp, or like a spermatozoid&amp;hellip;&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;logo_alternative.png&#34; alt=&#34;Alternative logo&#34;/&gt;
  &lt;figcaption&gt;A tentative version of the logo.&lt;/figcaption&gt;
&lt;/figure&gt;
</description>
    </item>
    
    <item>
      <title>About signal complexity</title>
      <link>https://realitybending.github.io/post/2022-12-05-complexity_paper/</link>
      <pubDate>Mon, 05 Dec 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2022-12-05-complexity_paper/</guid>
      <description>&lt;p&gt;The signals recorded from the brain or the body are rich in information, and there are many ways to analyze them. For instance, for EEG, one can focus on &lt;strong&gt;Event Related Potentials&lt;/strong&gt; (ERP), time-frequency analyses, &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/examples/eeg_microstates/eeg_microstates.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;microstates&lt;/strong&gt;&lt;/a&gt;, etc.&lt;/p&gt;
&lt;p&gt;An alternative framework, used to capture the overall characteristics of the signal, relies on the extraction of indices of &lt;strong&gt;&amp;ldquo;complexity&amp;rdquo;&lt;/strong&gt; (a general term for constructs such as entropy, chaos, fractal dimension, predictability). However, that field is quite &lt;em&gt;complex&lt;/em&gt; (no pun intended), drawing heavily on mathematical concepts that psychologists or neuroscientists might not be familiar with.&lt;/p&gt;
&lt;p&gt;In order to better understand the world of complexity indices as applied to neurophysiology, we have done some groundwork to help us make better decisions in our future usage of this type of analysis.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A &lt;a href=&#34;https://onlinelibrary.wiley.com/doi/10.1111/ejn.15800&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;gentle introduction&lt;/strong&gt;&lt;/a&gt; to complexity indices applied to neuroscience.&lt;/li&gt;
&lt;li&gt;An &lt;a href=&#34;https://www.mdpi.com/1099-4300/24/8/1036&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;indices selection guide&lt;/strong&gt;&lt;/a&gt; in which we compare how different indices relate to one another.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Additionally, we provide an easy way to compute these indices in Python in our &lt;strong&gt;NeuroKit&lt;/strong&gt; package (see &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/functions/complexity.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;here&lt;/strong&gt;&lt;/a&gt; for the list of functions and &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/examples/eeg_complexity/eeg_complexity.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;here&lt;/strong&gt;&lt;/a&gt; for an EEG application).&lt;/p&gt;
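&lt;p&gt;As a toy illustration of what one such index captures, here is a minimal, self-contained sketch of &lt;strong&gt;sample entropy&lt;/strong&gt; in plain NumPy (an illustrative simplification, not NeuroKit&amp;rsquo;s implementation - in practice, use the package&amp;rsquo;s dedicated functions):&lt;/p&gt;

```python
import numpy as np

def sample_entropy(signal, dimension=2, tolerance=0.2):
    """Toy sample entropy: -log(A/B), where B counts pairs of
    length-`dimension` templates closer than `tolerance * std`,
    and A counts the same for length `dimension + 1`."""
    x = np.asarray(signal, dtype=float)
    r = tolerance * np.std(x)

    def count_matches(m):
        # All overlapping templates (sub-sequences) of length m
        templates = np.array([x[i:i + m] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                # Chebyshev (max-coordinate) distance between templates
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(dimension)      # matches of length m
    a = count_matches(dimension + 1)  # matches of length m + 1
    return -np.log(a / b)             # real implementations guard a == 0

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 300))   # predictable
irregular = rng.normal(size=300)                    # unpredictable
print(sample_entropy(regular), sample_entropy(irregular))
```

&lt;p&gt;A predictable signal (a sine wave) yields a markedly lower value than white noise, which is exactly the kind of contrast these complexity indices are designed to quantify.&lt;/p&gt;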
</description>
    </item>
    
    <item>
      <title>Movie editing influences spectators&#39; time perception</title>
      <link>https://realitybending.github.io/publication/kovarski2022movie/</link>
      <pubDate>Tue, 22 Nov 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/kovarski2022movie/</guid>
      <description></description>
    </item>
    
    <item>
      <title>datawizard: An R Package for Easy Data Preparation and Statistical Transformations</title>
      <link>https://realitybending.github.io/publication/patil2022datawizard/</link>
      <pubDate>Sun, 09 Oct 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/patil2022datawizard/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Brain entropy, fractal dimensions and predictability: A review of complexity measures for EEG in healthy and neuropsychiatric populations</title>
      <link>https://realitybending.github.io/publication/lau2022brain/</link>
      <pubDate>Fri, 19 Aug 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/lau2022brain/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The Structure of Chaos: An Empirical Comparison of Fractal Physiology Complexity Indices Using NeuroKit2</title>
      <link>https://realitybending.github.io/publication/makowski2022structure/</link>
      <pubDate>Wed, 27 Jul 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2022structure/</guid>
      <description></description>
    </item>
    
    <item>
      <title>OHBM</title>
      <link>https://realitybending.github.io/talk/ohbm/</link>
      <pubDate>Tue, 19 Jul 2022 13:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/talk/ohbm/</guid>
      <description>&lt;!--


&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    Click on the &lt;strong&gt;Slides&lt;/strong&gt; button above to view the built-in slides feature.
  &lt;/div&gt;
&lt;/div&gt;


Slides can be added in a few ways:

- **Create** slides using Wowchemy&#39;s [_Slides_](https://wowchemy.com/docs/managing-content/#create-slides) feature and link using `slides` parameter in the front matter of the talk file
- **Upload** an existing slide deck to `static/` and link using `url_slides` parameter in the front matter of the talk file
- **Embed** your slides (e.g. Google Slides) or presentation video on this page using [shortcodes](https://wowchemy.com/docs/writing-markdown-latex/).

Further event details, including [page elements](https://wowchemy.com/docs/writing-markdown-latex/) such as image galleries, can be added to the body of this page. --&gt;
</description>
    </item>
    
    <item>
      <title>NeuroKit2 0.2.0 is out 🎉</title>
      <link>https://realitybending.github.io/post/2022-05-18-neurokit_release_2/</link>
      <pubDate>Wed, 18 May 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2022-05-18-neurokit_release_2/</guid>
      <description>&lt;h2 id=&#34;neurokit2-020-is-out-&#34;&gt;NeuroKit2 0.2.0 is out! 🎉&lt;/h2&gt;
&lt;p&gt;What was supposed to be a small release turned into a massive update. A big thanks - and a warm welcome - to &lt;a href=&#34;https://github.com/anshu-97&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;An Shu&lt;/a&gt; and &lt;a href=&#34;https://github.com/Max-ZiLiang&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Max&lt;/a&gt;, the newest members of the &lt;a href=&#34;https://realitybending.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Reality Bending Team&lt;/a&gt; and new maintainers of NeuroKit. They worked tirelessly to update &lt;em&gt;all&lt;/em&gt; of the examples and docstrings. New features include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;BRAND NEW WEBSITE&lt;/strong&gt;&lt;/a&gt; with revamped documentation that is hopefully much easier to navigate. Check it out: &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://neuropsychology.github.io/NeuroKit/&lt;/a&gt; and let us know what you think!&lt;/li&gt;
&lt;li&gt;An overhaul of the &lt;a href=&#34;https://neuropsychology.github.io/NeuroKit/functions/complexity.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Complexity Indices&lt;/strong&gt;&lt;/a&gt;: With more than 100 indices, NeuroKit is now the most comprehensive package to quantify the &lt;strong&gt;chaos&lt;/strong&gt;, &lt;strong&gt;entropy&lt;/strong&gt; and &lt;strong&gt;fractal dimension&lt;/strong&gt; of signals.&lt;/li&gt;
&lt;li&gt;Tons of other improvements and fixes ☺️&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Once again, a big thanks to all the &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit/releases/tag/v0.2.0&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;contributors&lt;/a&gt; for their help in making NeuroKit an awesome open-source software for physiological signal processing!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Podcast &#39;Learn Bayesian Stats&#39; with Dominique Makowski</title>
      <link>https://realitybending.github.io/post/2022-02-01-learnbayesstats/</link>
      <pubDate>Tue, 01 Feb 2022 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2022-02-01-learnbayesstats/</guid>
      <description>&lt;h2 id=&#34;the-learning-bayesian-statistics-podcast&#34;&gt;The &amp;lsquo;Learning Bayesian Statistics&amp;rsquo; Podcast&lt;/h2&gt;
&lt;p&gt;I had the chance to be invited to talk about R, Python, Reality Bending and much more! It was my first experience of that kind, so thanks a ton to the podcast&amp;rsquo;s host, &lt;a href=&#34;https://x.com/alex_andorra&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Alex Andorra&lt;/a&gt;. Listen to it here:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://www.learnbayesstats.com/episode/55-neuropsychology-illusions-bending-reality-dominique-makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://www.learnbayesstats.com/episode/55-neuropsychology-illusions-bending-reality-dominique-makowski&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description>
    </item>
    
    <item>
      <title>We&#39;re recruiting a Research Assistant!</title>
      <link>https://realitybending.github.io/post/2021-12-01-recruiting_nice/</link>
      <pubDate>Wed, 01 Dec 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-12-01-recruiting_nice/</guid>
      <description>&lt;h2 id=&#34;were-recruiting-a-research-assistant&#34;&gt;We&amp;rsquo;re recruiting a Research Assistant!&lt;/h2&gt;
&lt;p&gt;In the context of a new collaboration between Nanyang Technological University (NTU Singapore) and the Future Cities Laboratory, &lt;a href=&#34;https://dominiquemakowski.github.io/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;myself&lt;/a&gt; and &lt;a href=&#34;https://fcl.ethz.ch/people/Module-Lead/PanagiotisMavros.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Dr Panos Mavros&lt;/a&gt; are recruiting a Research Assistant in psychology/neuroscience (bachelor/master) to contribute to an exciting new research project that takes a neuroscientific approach to studying the interplay between beauty and the perception of urban spaces.&lt;/p&gt;
&lt;p&gt;Perfect for psychology graduates interested in joining a cool team and learning skills like EEG, physiological techniques, advanced statistics and more!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Apply through &lt;a href=&#34;https://ntu.wd3.myworkdayjobs.com/en-US/Careers/job/NTU-Main-Campus-Singapore/Research-Assistant--Psychology-_R00008329&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;this portal&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;More &lt;a href=&#34;https://fcl.ethz.ch/research/research-projects/NICE.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;info here&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description>
    </item>
    
    <item>
      <title>Pyllusion has been published</title>
      <link>https://realitybending.github.io/post/2021-11-30-pyllusion/</link>
      <pubDate>Tue, 30 Nov 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-11-30-pyllusion/</guid>
      <description>&lt;h2 id=&#34;pyllusion-has-been-published&#34;&gt;Pyllusion has been published!&lt;/h2&gt;
&lt;p&gt;&lt;a href=&#34;https://github.com/RealityBending/Pyllusion&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Pyllusion&lt;/a&gt; is a package for &lt;strong&gt;Python&lt;/strong&gt; that implements a systematic way to manipulate and generate illusions using a set of parameters.&lt;/p&gt;
&lt;p&gt;For instance, the famous &lt;a href=&#34;https://en.wikipedia.org/wiki/M%C3%BCller-Lyer_illusion&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Müller-Lyer&lt;/a&gt; illusion below, which causes the observer to perceive the two segments as being of different lengths depending on the shape of the arrows, can be generated with the following lines of code:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;mullerlyer&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pyllusion&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;MullerLyer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;illusion_strength&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;30&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;mullerlyer&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;to_image&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;
&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img src=&#34;https://github.com/RealityBending/Pyllusion/blob/master/docs/img/README_mullerlyer1.png?raw=true&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;To understand more about the parametric approach implemented in the &lt;em&gt;Pyllusion&lt;/em&gt; package, we recommend reading our &lt;a href=&#34;https://dominiquemakowski.github.io/publication/makowski2021parametric/makowski2021parametric.pdf&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;paper&lt;/a&gt;, which includes a hands-on example of how to generate some classic illusions (such as the &lt;a href=&#34;https://en.wikipedia.org/wiki/Delboeuf_illusion&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Delboeuf Illusion&lt;/a&gt;), and discusses how &lt;em&gt;Pyllusion&lt;/em&gt; helps address conceptual and methodological issues in illusion science.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;pyllusion&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;delboeuf&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pyllusion&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Delboeuf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;illusion_strength&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;delboeuf&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;to_image&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;
&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img src=&#34;https://github.com/RealityBending/Pyllusion/blob/master/docs/img/README_delboeuf1.png?raw=true&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Don&amp;rsquo;t forget to keep an eye on our &lt;a href=&#34;https://github.com/RealityBending/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;repo&lt;/a&gt; for more exciting open-source projects!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>A Parametric Framework to Generate Visual Illusions Using Python</title>
      <link>https://realitybending.github.io/publication/makowski2021parametric/</link>
      <pubDate>Mon, 29 Nov 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2021parametric/</guid>
      <description></description>
    </item>
    
    <item>
      <title>NeuroKit2 0.1.5 &#39;Complexity&#39; is out 🎉</title>
      <link>https://realitybending.github.io/post/2021-11-12-complexity_neurokit/</link>
      <pubDate>Fri, 12 Nov 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-11-12-complexity_neurokit/</guid>
      <description>&lt;h2 id=&#34;neurokit2-015-is-out-&#34;&gt;NeuroKit2 0.1.5 is out! 🎉&lt;/h2&gt;
&lt;p&gt;In the &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit/releases/tag/v0.1.5&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;latest 0.1.5 release&lt;/a&gt; of &lt;em&gt;&lt;strong&gt;NeuroKit2&lt;/strong&gt;&lt;/em&gt;, our team has fixed several bugs in existing functionalities and, in particular, overhauled the support for computing &lt;strong&gt;complexity measures&lt;/strong&gt; of neurophysiological signals. We added a ton of new indices of &lt;strong&gt;entropy&lt;/strong&gt; and &lt;strong&gt;fractal dimension&lt;/strong&gt;, including:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Petrosian&amp;rsquo;s, Katz&amp;rsquo;s and Sevcik&amp;rsquo;s fractal dimensions&lt;/li&gt;
&lt;li&gt;Differential, Permutation, Spectral, SVD entropy&lt;/li&gt;
&lt;li&gt;Fisher information&lt;/li&gt;
&lt;li&gt;Hjorth&amp;rsquo;s and Lempel-Ziv&amp;rsquo;s complexity&lt;/li&gt;
&lt;li&gt;Relative Roughness&lt;/li&gt;
&lt;li&gt;Hurst and Lyapunov exponent(s)&lt;/li&gt;
&lt;li&gt;Detrended Fluctuation Analysis (as well as MFDFA)&lt;/li&gt;
&lt;li&gt;&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can compute them all using the new &lt;code&gt;nk.complexity()&lt;/code&gt; function!&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;neurokit2&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;as&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;nk&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;signal&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;signal_simulate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;frequency&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;6&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;],&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;0.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;info&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;complexity&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;signal&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;which&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;fast&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-fallback&#34; data-lang=&#34;fallback&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  DiffEn       FI    Hjorth       KFD  PEn  ...      PFD        RR       SFD
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1.536573  0.01524  1.355543  4.720953  1.0  ... 1.017423  1.638357  1.691036
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;To understand more about complexity science, we recommend reading our &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;preprint&lt;/a&gt;, which introduces the theoretical (and mathematical) meanings of complexity and reviews the existing studies of complexity analysis across multiple fields of psychology.&lt;/p&gt;
&lt;p&gt;Of course, the &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;NeuroKit2&lt;/em&gt; Python package&lt;/a&gt; also includes tons of other useful features for physiological signal processing (see this &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit#quick-example&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;quick example&lt;/strong&gt;&lt;/a&gt;)!&lt;/p&gt;
&lt;p&gt;Don&amp;rsquo;t forget to watch our repo to keep an eye out for more complexity functionalities coming up! &amp;#x1f440;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>&#34;Your sample is too small&#34;: How to Respond to Reviewers</title>
      <link>https://realitybending.github.io/post/2021-11-05-sample-too-small/</link>
      <pubDate>Fri, 05 Nov 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-11-05-sample-too-small/</guid>
      <description>&lt;h1 id=&#34;your-sample-is-too-small-how-to-answer-to-reviewers&#34;&gt;&amp;ldquo;Your sample is too small&amp;rdquo;: How to Answer to Reviewers&lt;/h1&gt;
&lt;p&gt;Reviewers questioning the statistical power of a study is common and, often, valid. Concerns about a possible lack of power (and thus about possible false positives) are frequently combined with demands for stringent &lt;strong&gt;multiple comparisons / tests&lt;/strong&gt; &amp;ldquo;corrections&amp;rdquo;.&lt;/p&gt;
&lt;p&gt;In many real-life scenarios, we simply cannot increase the sample size (because of money, time, COVID or any other reason). And sometimes, the many tests / comparisons are all needed, as we have many hypotheses or variables. Naturally, the best &lt;strong&gt;pre-study&lt;/strong&gt; weapon against power-related issues is &lt;strong&gt;preregistration&lt;/strong&gt; (and registered reports) with a proper &lt;strong&gt;power analysis&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;That being said, imagine that for one reason or another we don&amp;rsquo;t have that. We cannot recruit more participants and have to deal with the data that we have. &lt;strong&gt;What can we do?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Some useful tips can be grouped and referred to as &lt;strong&gt;PoSCA&lt;/strong&gt; (Power, Stringency, Coherence, Acknowledgment):&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Po&lt;/strong&gt;wer: Don&amp;rsquo;t waste power
&lt;ul&gt;
&lt;li&gt;Use all the information that you have. In particular, the information &lt;em&gt;within&lt;/em&gt; participants (if you have multiple trials per participant, don&amp;rsquo;t average over them!), by using methods such as &lt;strong&gt;mixed-models&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;S&lt;/strong&gt;tringent significance
&lt;ul&gt;
&lt;li&gt;Increase the significance threshold, either arbitrarily (e.g., lower the &lt;em&gt;p&lt;/em&gt;-value threshold to .005; Ioannidis, 2018) or pseudo-arbitrarily (using multiple-comparisons corrections such as the Bonferroni method). Under the Bayesian framework, this is &lt;sub&gt;(somewhat)&lt;/sub&gt; equivalent to being stricter with whichever indices we use. For instance, consider only Bayes Factors (BF) &amp;gt; 30 instead of 10, or &lt;a href=&#34;https://easystats.github.io/bayestestR/articles/guidelines.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Probabilities of Direction (&lt;em&gt;pd&lt;/em&gt;)&lt;/a&gt; of 99.9% instead of 97%, or widen the Region of Practical Equivalence (ROPE).&lt;/li&gt;
&lt;li&gt;Constrain the parameters towards 0. We can also make it harder for models to have parameters (i.e., &amp;ldquo;effects&amp;rdquo;) that deviate from 0. This is easy to do within the Bayesian framework by setting narrower priors centred around 0. This naturally pulls the results towards 0 (a phenomenon known as &amp;ldquo;shrinkage&amp;rdquo;) and is a natural way of &amp;ldquo;controlling&amp;rdquo; for false positives. Under the frequentist framework, regularization methods such as LASSO or Ridge regression can introduce a similar bias.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;C&lt;/strong&gt;oherence: cross-validate results
&lt;ul&gt;
&lt;li&gt;There is a need to look at the bigger picture, including non-significant and &amp;ldquo;trending&amp;rdquo; effects. Look at all the data together, and at all the variables measuring related concepts: do the results go in the same direction? Are they internally coherent? Do they follow the same trend? In a small-sample study, I would trust a conclusion based on 10 tests with independent indices measuring aspects related to &amp;ldquo;depression&amp;rdquo;, all barely significant but going in the same direction, more than one super-significant (but possibly cherry-picked) result interpreted as proof of the effect of &amp;ldquo;depression&amp;rdquo; in general. In other words, don&amp;rsquo;t focus only on &amp;ldquo;significant&amp;rdquo; effects; assess the results in their entirety.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;A&lt;/strong&gt;cknowledgment: be honest and humble
&lt;ul&gt;
&lt;li&gt;Quantify the evidence &lt;em&gt;against&lt;/em&gt; the null hypothesis. Maybe an effect is not &amp;ldquo;significant&amp;rdquo;, but that doesn&amp;rsquo;t necessarily mean there is a lot of evidence that it has no effect either. Quantifying evidence against the null can be easily done using for instance Bayes Factors.&lt;/li&gt;
&lt;li&gt;Be honest and acknowledge the limited power of the study. Don&amp;rsquo;t try to hide it.&lt;/li&gt;
&lt;li&gt;Temper the interpretation of the results. Don&amp;rsquo;t over-interpret them if they are not super robust.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
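To make the "Stringency" point concrete, here is a minimal sketch (with made-up p-values, for illustration only) of how a Bonferroni-corrected threshold tightens the criterion:

```python
# Made-up p-values from 10 related tests (illustration only)
p_values = [0.004, 0.011, 0.019, 0.026, 0.032,
            0.038, 0.041, 0.044, 0.047, 0.049]

alpha = 0.05
bonferroni_alpha = alpha / len(p_values)  # 0.05 / 10 = 0.005 per test

# How many tests pass each threshold (p below alpha)
passing_raw = sum(1 for p in p_values if alpha > p)
passing_corrected = sum(1 for p in p_values if bonferroni_alpha > p)
print(passing_raw, passing_corrected)  # all 10 pass at .05; only 1 survives
```

Note how this example also illustrates the "Coherence" point: ten barely-significant effects all going in the same direction may be more informative than the single one that survives the correction.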
&lt;p&gt;Science must be open, replicable &amp;amp; reproducible, slow, and based on methodological best practices. But first and foremost, it must be honest.&lt;/p&gt;
&lt;h1 id=&#34;references&#34;&gt;References&lt;/h1&gt;
&lt;ul&gt;
&lt;li&gt;Ioannidis, J. P. (2018). The proposal to lower P value thresholds to .005. Jama, 319(14), 1429-1430.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to share or tweet this post, or leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>How to sync two folders in two separate GitHub repositories</title>
      <link>https://realitybending.github.io/post/2021-10-31-sync_two_repos/</link>
      <pubDate>Sun, 31 Oct 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-10-31-sync_two_repos/</guid>
      <description>&lt;h1 id=&#34;how-to-sync-two-folders-in-two-separate-github-repositories&#34;&gt;How to sync two folders in two separate GitHub repositories&lt;/h1&gt;
&lt;h2 id=&#34;the-problem&#34;&gt;The Problem&lt;/h2&gt;
&lt;p&gt;I have a personal website, stored in a GitHub repo (and hosted via GitHub pages), as well as a lab website (a &amp;ldquo;company&amp;rdquo; website, if you will). Both are fairly similar, as they are built using Wowchemy&amp;rsquo;s &lt;a href=&#34;https://wowchemy.com/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;academic theme&lt;/a&gt;. Importantly, the company website has a blog with posts, but so does my personal website. What I would like is that &lt;strong&gt;every time I post something on my website, it gets automatically copied over to the company website&lt;/strong&gt;, so that I don&amp;rsquo;t have to manually maintain the content in two separate places.&lt;/p&gt;
&lt;h2 id=&#34;the-solution&#34;&gt;The Solution&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;The first step is to go to the settings of your &lt;strong&gt;GitHub &lt;em&gt;account&lt;/em&gt;&lt;/strong&gt;, then to Developer settings, and finally to &lt;a href=&#34;https://github.com/settings/tokens&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;personal access tokens&lt;/em&gt;&lt;/a&gt;. Generate a token, ticking the &lt;strong&gt;repo&lt;/strong&gt; scope, and copy the key.&lt;/li&gt;
&lt;li&gt;Go to the settings of the personal website repo (the source from which the content will be copied), to &amp;ldquo;Secrets&amp;rdquo;, and add a new secret called &amp;ldquo;API_TOKEN_GITHUB&amp;rdquo; (with the key you just copied).&lt;/li&gt;
&lt;li&gt;Create a new GitHub action workflow such as &lt;a href=&#34;https://github.com/DominiqueMakowski/DominiqueMakowski.github.io/blob/master/.github/workflows/copy_content.yml&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;this one&lt;/strong&gt;&lt;/a&gt;. The things to change are the &lt;code&gt;source_file&lt;/code&gt;, &lt;code&gt;destination_repo&lt;/code&gt; and &lt;code&gt;destination_folder&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;
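A workflow of that kind looks roughly like the sketch below. This is only a hedged outline: the action, branch, repos and paths are placeholders to adapt, and the linked workflow in step 3 remains the authoritative version.

```yaml
# Sketch of a copy-on-push workflow; all names and paths are placeholders.
name: Copy blog content to the lab website
on:
  push:
    branches: [master]

jobs:
  copy-content:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Copy posts to the destination repo
        uses: dmnemec/copy_file_to_another_repo_action@main  # example copy action
        env:
          API_TOKEN_GITHUB: ${{ secrets.API_TOKEN_GITHUB }}  # the secret from step 2
        with:
          source_file: "content/post/"              # what to copy
          destination_repo: "your-org/lab-website"  # placeholder target repo
          destination_folder: "content/post"        # where it lands
          user_email: "you@example.com"
          user_name: "your-username"
```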
&lt;p&gt;Tada &amp;#x1f389; Every time I push to my personal repo, the new content of one of its subfolders gets copied to the other repo.&lt;/p&gt;
&lt;h2 id=&#34;notes&#34;&gt;Notes&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;This is a one-way sync, so updates on the target repo won&amp;rsquo;t affect the source repo (but might get overridden!).&lt;/li&gt;
&lt;li&gt;If you want to preserve the original commit message, set &lt;code&gt;commit_message: ${{ github.event.head_commit.message }}&lt;/code&gt; &lt;sub&gt;(thanks &lt;a href=&#34;https://github.com/DominiqueMakowski/DominiqueMakowski.github.io/issues/2&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@dobbelina&lt;/a&gt;)&lt;/sub&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to share or tweet this post, or leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>see: An R Package for Visualizing Statistical Models</title>
      <link>https://realitybending.github.io/publication/ludecke2021see/</link>
      <pubDate>Fri, 06 Aug 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/ludecke2021see/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The symbolism behind the ReBeL logo</title>
      <link>https://realitybending.github.io/post/2021-06-30-logo_meaning/</link>
      <pubDate>Wed, 30 Jun 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-06-30-logo_meaning/</guid>
      <description>&lt;p&gt;The Reality Bending logo includes several references to various concepts. &lt;strong&gt;Can you try to guess them?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&amp;#x1f447; &amp;#x1f447; &amp;#x1f447; &amp;#x1f447; &amp;#x1f447; &amp;#x1f447; &amp;#x1f447; &amp;#x1f447;&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;Spoiler_alert.jpg&#39; width=&#34;90%&#34; /&gt;
&lt;/figure&gt;
&lt;!-- **.**

&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;

**.**

&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;

**.**

&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt; --&gt;
&lt;h3 id=&#34;wizard&#34;&gt;Wizard&lt;/h3&gt;
&lt;p&gt;One of the most striking features of the logo is the figure in the middle, which represents a wizard. It represents us, the people who delve into the science of &lt;a href=&#34;https://realitybending.github.io/post/2020-09-28-what_is_realitybending/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;reality bending&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Wizards are a powerful &lt;a href=&#34;https://susannabarlow.com/2021/03/26/understanding-the-magician-archetype/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;archetype&lt;/a&gt;, primarily defined through the learned ability (acquired by seeking, studying and gathering arcane knowledge) to control and manipulate reality. Though many specializations and facets exist, the understanding of the true nature of the world is a core trait that binds all wizards.&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;gandalf.jpeg&#39; width=&#34;90%&#34; /&gt;
    &lt;figcaption&gt;The wizard Gandalf facing the forces of darkness with his light.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;!-- ### Hat --&gt;
&lt;!-- Harry potter? --&gt;
&lt;h3 id=&#34;beard&#34;&gt;Beard&lt;/h3&gt;
&lt;p&gt;In Greco-Roman antiquity, the beard was seen as the defining feature of a philosopher (see the &lt;a href=&#34;https://en.wikipedia.org/wiki/Beard#The_%22Philosopher%27s_beard%22&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Philosopher&amp;rsquo;s beard&lt;/a&gt; on wikipedia), expressing the idea that philosophy is no mere intellectual hobby but rather a way of life that, by definition, transforms every aspect of one&amp;rsquo;s behavior (including one&amp;rsquo;s shaving habits).&lt;/p&gt;
&lt;p&gt;Show me your beard and I&amp;rsquo;ll tell you what philosopher you are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Cynics had long dirty beards to indicate their &lt;em&gt;&amp;ldquo;strict indifference to all external goods and social customs&amp;rdquo;&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;Stoics were occasionally trimming and washing their beards in accordance with their view &lt;em&gt;&amp;ldquo;that it is acceptable to prefer certain external goods so long as they are never valued above virtue&amp;rdquo;&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;Peripatetics took great care of their beards believing in accordance with Aristotle that &lt;em&gt;&amp;ldquo;external goods and social status were necessary for the good life together with virtue&amp;rdquo;&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;beard.jpg&#39; width=&#34;33%&#34; /&gt;
    &lt;figcaption&gt;A glorious beard.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;eyepatch&#34;&gt;Eyepatch&lt;/h3&gt;
&lt;p&gt;For any mythology fan, a bearded figure with an eyepatch (or a missing eye) is known to represent &lt;a href=&#34;https://en.wikipedia.org/wiki/Odin&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Odin&lt;/strong&gt;&lt;/a&gt;, the chief deity of the Norse pantheon. He&amp;rsquo;s a very complex character with a deeply ambivalent nature. One of the key moments of Odin&amp;rsquo;s story is when he sacrifices one of his eyes in return for wisdom&amp;hellip;&lt;/p&gt;
&lt;p&gt;Losing &lt;em&gt;direct sight&lt;/em&gt; of reality in return for a &lt;strong&gt;deeper knowledge&lt;/strong&gt; of reality - here is an interesting idea that we can discuss and (over)interpret with some drinks &amp;#x1f37a;&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;odin.jpg&#39; width=&#34;90%&#34; /&gt;
    &lt;figcaption&gt;Odin portrayed by one of my favourite actors.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;pipe&#34;&gt;Pipe&lt;/h3&gt;
&lt;p&gt;The pipe refers to the most famous painting by the surrealist René Magritte. One common interpretation suggests that it symbolizes the difference between the authentic thing (a real pipe) and its representation. This, we believe, is just one level that differentiates the real from the unreal&amp;hellip;&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;pipe.jpg&#39; width=&#34;90%&#34; /&gt;
    &lt;figcaption&gt;The Treachery of Images (Magritte, 1929).&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;smoke&#34;&gt;Smoke&lt;/h3&gt;
&lt;p&gt;The smoke takes the form of the Greek letter Psi (Ψ), which is the symbol of &lt;strong&gt;psychology&lt;/strong&gt;.&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;neuropsydia.png&#39; width=&#34;50%&#34; /&gt;
    &lt;figcaption&gt;The Psi within the brain, a representation of neuropsychology, and the logo of the &lt;a href=&#34;https://github.com/neuropsychology/Neuropsydia.py&#34;&gt;Neuropsydia&lt;/a&gt; software.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h3 id=&#34;the-flower&#34;&gt;The Flower&lt;/h3&gt;
&lt;!-- ### The Flower &lt;img src=&#39;Makowski_Poppy.png&#39; title = &#34;The Poppy Flower of the Coat of Arms of the Makowski Family&#34; alt = &#39;The Poppy Flower of the Coat of Arms of the Makowski Family&#39; align=&#34;right&#34; height=&#34;139&#34; /&gt; --&gt;
&lt;p&gt;The flower in the background is not just any flower. It is a &lt;a href=&#34;https://en.wikipedia.org/wiki/Poppy&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;poppy flower&lt;/a&gt;, whose seeds have been used throughout history to create diverse substances that alter our perception of reality&amp;hellip;&lt;/p&gt;
&lt;p&gt;Incidentally, it is also the symbol of the &lt;em&gt;Makowski&lt;/em&gt; family (whose name refers to &amp;ldquo;poppy seed&amp;rdquo; in Polish).&lt;/p&gt;
&lt;figure align=&#34;center&#34;&gt;
    &lt;img src=&#39;Makowski_Poppy.png&#39; title = &#34;The Poppy Flower of the Coat of Arms of the Makowski Family&#34; alt = &#39;The Poppy Flower of the Coat of Arms of the Makowski Family&#39; width=&#34;33%&#34; /&gt;
    &lt;figcaption&gt;The Makowski&#39;s Poppy Flower.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;Did you spot any more &amp;#x1f60f;? Let us know!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Heart Rate Variability in Psychology: A Review of HRV Indices and an Analysis Tutorial</title>
      <link>https://realitybending.github.io/publication/pham2021hrv/</link>
      <pubDate>Wed, 09 Jun 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/pham2021hrv/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The Structure of Deception: Validation of the Lying Profile Questionnaire</title>
      <link>https://realitybending.github.io/publication/makowski2021structure/</link>
      <pubDate>Thu, 22 Apr 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2021structure/</guid>
      <description></description>
    </item>
    
    <item>
      <title>performance: An R Package for Assessment, Comparison and Testing of Statistical Models</title>
      <link>https://realitybending.github.io/publication/ludecke2021performance/</link>
      <pubDate>Wed, 21 Apr 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/ludecke2021performance/</guid>
      <description></description>
    </item>
    
    <item>
      <title>How to share data analysis scripts with publications?</title>
      <link>https://realitybending.github.io/post/2021-02-10-template_results/</link>
      <pubDate>Wed, 10 Feb 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-02-10-template_results/</guid>
      <description>&lt;h1 id=&#34;how-to-share-data-analysis-scripts-with-publications&#34;&gt;How to share data analysis scripts with publications?&lt;/h1&gt;
&lt;p&gt;Including data analysis as &lt;strong&gt;Supplementary Materials&lt;/strong&gt; can be a tedious task. How can we simplify the sharing of our work, so that it can be fully appreciated, evaluated, improved, and built upon in a transparent and open way?&lt;/p&gt;
&lt;h2 id=&#34;option-1-dump-the-code-in-a-word-file&#34;&gt;Option 1: Dump the code in a word file&lt;/h2&gt;
&lt;p&gt;Most publication portals don&amp;rsquo;t directly accept code scripts to be included &amp;ldquo;as is&amp;rdquo;. In other words, you cannot upload your manuscript and your &lt;code&gt;.R&lt;/code&gt; script just like that. So one option is to copy its content, paste it in a word / pdf document, and &lt;em&gt;Voilà!&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;But what &lt;strong&gt;if you don&amp;rsquo;t have code&lt;/strong&gt; because you use a point-and-click software? Well, you can note down all the &lt;em&gt;x,y&lt;/em&gt; coordinates of your clicks so that one can reproduce the steps and all the clicks. Just kidding, if you don&amp;rsquo;t have a script, then may god have mercy on your soul.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Why is that a terrible solution?&lt;/strong&gt; Because let&amp;rsquo;s face it, unstructured code dumps are horrific. Nobody wants to read them, they do not do justice to your work, and they are still tedious to create! You have to re-do it every time you modify your code. And it&amp;rsquo;s even worse if you want to include all the &lt;strong&gt;outputs, figures and tables that are generated by the code&lt;/strong&gt;. Data analysis is not just the code, but everything that comes with it and that allowed you to make the conclusions that you made.&lt;/p&gt;
&lt;h2 id=&#34;option-2-use-rmarkdown&#34;&gt;Option 2: Use &lt;em&gt;Rmarkdown&lt;/em&gt;&lt;/h2&gt;
&lt;p&gt;&lt;a href=&#34;https://rmarkdown.rstudio.com/lesson-1.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Rmarkdown&lt;/strong&gt;&lt;/a&gt; is a &amp;ldquo;framework&amp;rdquo; that allows you to have files (&lt;code&gt;.Rmd&lt;/code&gt;) that can contain a mix of &lt;strong&gt;text and code&lt;/strong&gt; (and not only &lt;strong&gt;R&lt;/strong&gt;, but also &lt;a href=&#34;https://rstudio.github.io/reticulate/articles/r_markdown.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Python&lt;/strong&gt;&lt;/a&gt; for instance!).&lt;/p&gt;
&lt;p&gt;It can be used to write comprehensive &amp;ldquo;reports&amp;rdquo; that include all your thoughts, motivations and interpretations of the results. And the great thing about it is that these files can be &lt;strong&gt;converted&lt;/strong&gt; into beautiful and readable documents like &lt;strong&gt;PDF&lt;/strong&gt;, &lt;strong&gt;Word&lt;/strong&gt; or &lt;strong&gt;HTML&lt;/strong&gt;. It will automatically embed the code and &lt;strong&gt;its generated output&lt;/strong&gt; (as text, tables or figures) alongside the text.&lt;/p&gt;
&lt;p&gt;It is an awesome way to write statistical reports, and can even be used to create many other non-stats-related things, like &lt;a href=&#34;http://frederikaust.com/papaja_man/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;APA-formatted manuscripts&lt;/strong&gt;&lt;/a&gt; (great for preprints), &lt;a href=&#34;https://bookdown.org/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;books&lt;/strong&gt;&lt;/a&gt;, &lt;a href=&#34;https://bookdown.org/yihui/rmarkdown/xaringan.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;presentation slides&lt;/strong&gt;&lt;/a&gt;, &lt;a href=&#34;https://bookdown.org/yihui/blogdown/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;websites or blogs&lt;/strong&gt;&lt;/a&gt;. It&amp;rsquo;s a must-have skill for every researcher.&lt;/p&gt;
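&lt;p&gt;As a rough illustration (a minimal, hypothetical sketch - the file and variable names are made up), a &lt;code&gt;.Rmd&lt;/code&gt; file simply interleaves narrative text with code chunks:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;---
title: &#34;Results&#34;
output: html_document
---

We first load the data and summarise it:

```{r}
data &amp;lt;- read.csv(&#34;data.csv&#34;)  # hypothetical data file
summary(data)
```

The histogram below is regenerated every time the document is knitted:

```{r}
hist(data$score)  # hypothetical column
```
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Rendering it (e.g., via the &lt;em&gt;Knit&lt;/em&gt; button in RStudio, or &lt;code&gt;rmarkdown::render(&#34;analysis.Rmd&#34;)&lt;/code&gt;) produces a document in which each chunk&amp;rsquo;s output appears right below its code.&lt;/p&gt;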
&lt;p&gt;And it gets better!&lt;/p&gt;
&lt;h2 id=&#34;option-3-use-our-results-template&#34;&gt;Option 3: Use our &lt;em&gt;Results Template&lt;/em&gt;&lt;/h2&gt;
&lt;p&gt;In the &lt;a href=&#34;https://dominiquemakowski.github.io/research/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Reality Bending&lt;/strong&gt;&lt;/a&gt; team, we like to have our different projects and studies organized in a consistent way. We heavily use &lt;a href=&#34;https://dominiquemakowski.github.io/post/github_psychologists/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;GitHub&lt;/strong&gt;&lt;/a&gt; to store our projects and collaborate on them, and we also like the possibility of making these projects &lt;strong&gt;open&lt;/strong&gt; and &lt;strong&gt;accessible&lt;/strong&gt; (i.e., easy to discover and explore) when the time comes.&lt;/p&gt;
&lt;p&gt;That&amp;rsquo;s why we came up with a &lt;strong&gt;Template folder&lt;/strong&gt; for storing the materials related to a given study, including well-organized analysis scripts. And what&amp;rsquo;s great about it is that it is set up in a way that allows you to generate multiple file formats (&lt;strong&gt;Word&lt;/strong&gt;, &lt;strong&gt;PDF&lt;/strong&gt;, &lt;strong&gt;HTML&lt;/strong&gt;) with a single click (and even without any click, in a fully automatic way)! And what&amp;rsquo;s even greater is that if you decide to upload it to GitHub, you&amp;rsquo;ll have &lt;a href=&#34;https://realitybending.github.io/TemplateResults/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;a whole website&lt;/strong&gt;&lt;/a&gt; presenting your data analysis!&lt;/p&gt;
&lt;p&gt;We use it to have reproducible analyses that we can easily share with publications. We can upload the .pdf or .docx file generated by the template as &lt;strong&gt;Supplementary Materials&lt;/strong&gt;, but we also link the &lt;a href=&#34;https://github.com/RealityBending/TemplateResults&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;URL of the online repository&lt;/strong&gt;&lt;/a&gt; of the study in the manuscript, where readers can access and experience the content in the format that they prefer. &lt;strong&gt;It really improves the appeal of a study when the results are trustworthy.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;All this is easily made possible with our template. &lt;strong&gt;Check it out here:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;👉 &lt;a href=&#34;https://github.com/RealityBending/TemplateResults&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;https://github.com/RealityBending/TemplateResults&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;p&gt;☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️☝️&lt;/p&gt;
&lt;p&gt;And &lt;strong&gt;let us know what you think!&lt;/strong&gt; You can open an issue on the &lt;a href=&#34;https://github.com/RealityBending/TemplateResults/issues&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;repo&lt;/a&gt; or even contribute to help us improve it :)&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>NeuroKit2: A Python toolbox for neurophysiological signal processing</title>
      <link>https://realitybending.github.io/publication/makowski2021neurokit/</link>
      <pubDate>Mon, 01 Feb 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2021neurokit/</guid>
      <description></description>
    </item>
    
    <item>
      <title>What visual agnosia might feel like</title>
      <link>https://realitybending.github.io/post/2021-01-03-visual_agnosia/</link>
      <pubDate>Sun, 03 Jan 2021 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2021-01-03-visual_agnosia/</guid>
      <description>&lt;h3 id=&#34;name-one-thing-in-this-photo&#34;&gt;Name One Thing In This Photo&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Can you name &lt;em&gt;one&lt;/em&gt; thing in the image above?&lt;/strong&gt; It all looks familiar, but something is off. The image makes &amp;ldquo;sense&amp;rdquo; overall; there are well-defined shapes and objects that seem to be placed in a plausible - albeit chaotic - fashion, like some random rubbish thrown in the corner of a room. Even the colors, the lighting and the quality are coherent, and help make it believable. And yet, chances are you cannot name one single element that composes it.&lt;/p&gt;
&lt;p&gt;This image, after appearing on &lt;a href=&#34;https://x.com/melip0ne/status/1120503955526750208?s=20&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Twitter&lt;/a&gt; in April 2019, surfaced on Reddit with the caption &amp;ldquo;This picture is &lt;strong&gt;designed to give the viewer the simulated experience of having a stroke&lt;/strong&gt; (particularly in the &lt;strong&gt;occipital lobe&lt;/strong&gt; of the cerebral cortex, where visual perception occurs). &lt;strong&gt;Everything looks hauntingly familiar but you just can&amp;rsquo;t quite recognize anything&lt;/strong&gt;&amp;rdquo;, and subsequently went &lt;a href=&#34;https://www.dailymail.co.uk/news/article-6959547/Extremely-frustrating-slightly-disturbing-image-goes-viral.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;viral&lt;/a&gt;. However, the author of the caption later admitted that he made this description up.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;So where does the image come from?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;One can trace the original publication back to an &lt;a href=&#34;https://youtu.be/0F7XBwFwA-M?t=104&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Instagram account&lt;/a&gt;, whose author declared having made the image using &lt;a href=&#34;https://www.artbreeder.com/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;ArtBreeder.com&lt;/strong&gt;&lt;/a&gt;. This website gives access to an AI algorithm (a Generative Adversarial Network - GAN), commonly used in the processing and generation of images (one mindblowing example can be found on &lt;a href=&#34;https://thispersondoesnotexist.com/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;thispersondoesnotexist.com&lt;/em&gt;&lt;/a&gt;, which generates realistic pictures of non-existing people). There were even some attempts to &lt;em&gt;reverse engineer&lt;/em&gt; the process to retrieve what the original image could have looked like.&lt;/p&gt;
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img src=&#34;https://i.kym-cdn.com/photos/images/original/001/486/325/1dd.jpg&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;After all, it seems like there is no intelligent design behind this image. No clever neuropsychologist carefully crafting a meaningful experience. Just one of these lucky accidents.&lt;/p&gt;
&lt;p&gt;Nonetheless, it&amp;rsquo;s still an intriguing image, falling into this uncanny abyss of things that we recognize as familiar, but slightly too alien for our sense-seeking brains to dissolve into meaning. &lt;strong&gt;Could it tell us something about brain processes?&lt;/strong&gt; Surely. But &lt;strong&gt;brain disorders?&lt;/strong&gt; Maybe.&lt;/p&gt;
&lt;p&gt;The &lt;strong&gt;&amp;ldquo;occipital stroke&amp;rdquo; hypothesis&lt;/strong&gt; mentioned above suggests, by its formulation, a lesion to the primary visual cortices. However, as neuroscientists know, these brain regions, located at the extreme back of the brain, mostly support lower-level aspects of visual processing, and their damage is usually related to alterations of a somewhat different nature than those above, such as vision loss, visual hallucinations, visual deformations, or loss of color, movement or stereoscopy.&lt;/p&gt;
&lt;p&gt;However, there is another neuropsychological disorder, referred to as &lt;strong&gt;&amp;ldquo;visual agnosia&amp;rdquo;&lt;/strong&gt;, in which patients experience difficulties recognizing visually presented objects despite having intact vision. In fact, it is more an umbrella term for different subcategories of deficits, and the image above could be reminiscent of visual agnosia of the &lt;em&gt;associative&lt;/em&gt; type, which corresponds to a specific impairment in the assignment of meaning to a stimulus that is accurately perceived (and can be visually described). This symptom is often related to injuries in the left occipito-temporal region, located on the ventral &amp;ldquo;what&amp;rdquo; stream of the brain (as opposed to the so-called &amp;ldquo;where&amp;rdquo; dorsal stream).&lt;/p&gt;
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/2021-01-03-visual_agnosia/whatstream_hu_2b480e0dcef3f8b3.webp 400w,
               /post/2021-01-03-visual_agnosia/whatstream_hu_894caeede8fefd.webp 760w,
               /post/2021-01-03-visual_agnosia/whatstream_hu_e6334db72bb29fee.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2021-01-03-visual_agnosia/whatstream_hu_2b480e0dcef3f8b3.webp&#34;
               width=&#34;760&#34;
               height=&#34;513&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id=&#34;ivan-seals-art&#34;&gt;Ivan Seal&amp;rsquo;s Art&lt;/h3&gt;
&lt;p&gt;From there, the YouTuber &lt;a href=&#34;https://www.youtube.com/channel/UCR6LasBpceuYUhuLToKBzvQ&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;Solar Sands&lt;/em&gt;&lt;/a&gt; helped me discover the artist &lt;a href=&#34;https://en.wikipedia.org/wiki/Ivan_Seal&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Ivan Seal&lt;/strong&gt;&lt;/a&gt;, whose work is somewhat akin to the image above. His paintings are not purely abstract renditions, or depictions of impossible entities, but plausible objects that sit in this awkward space, deep between boring reality and total weirdness.&lt;/p&gt;
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img src=&#34;https://rca-media.rca.ac.uk/images/dumtrimiestonmo_blurosperiod150x100_cm_-_Phot.width-1000.jpg&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img src=&#34;https://i.redd.it/ywee14vpk8y41.jpg&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/figure&gt;
&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post and don&amp;rsquo;t forget to join me on X&lt;/em&gt; 🐦 &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>effectsize: Estimation of Effect Size Indices and Standardized Parameters</title>
      <link>https://realitybending.github.io/publication/benshachar2020effectsize/</link>
      <pubDate>Wed, 23 Dec 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/benshachar2020effectsize/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Time as a computational limit</title>
      <link>https://realitybending.github.io/post/2020-11-13-time_computational_limit/</link>
      <pubDate>Fri, 13 Nov 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-11-13-time_computational_limit/</guid>
      <description>&lt;p&gt;&lt;em&gt;Note: this is a thought-experiment, not to be taken too seriously.&lt;/em&gt;&lt;/p&gt;
&lt;h2 id=&#34;we-live-in-a-simulated-universe&#34;&gt;We live in a simulated universe&lt;/h2&gt;
&lt;p&gt;In his famous simulation argument, the transhumanist Bostrom (2003, 2011) posits that &lt;strong&gt;we are living in a computer-generated reality&lt;/strong&gt;. The logic behind this assumption is that an advanced civilization, with enormous computing power, might want to create agents with a powerful artificial intelligence that would evolve in a simulated world (think of an infinitely more advanced &lt;em&gt;&lt;strong&gt;The Sims&lt;/strong&gt;&lt;/em&gt; game). These sims (i.e., the virtual intelligences populating the simulation) might be endowed with consciousness, and live their lives in this world, unable to distinguish its simulated nature from the &amp;ldquo;primary&amp;rdquo; reality. Moreover, similarly to games that run on many computers around the world, there could be a large number of these simulated worlds, and it could even reach a point where a simulation is created inside another simulation. Thus, the number of &amp;ldquo;sims&amp;rdquo; would quickly exceed the number of consciousnesses living in the primary (i.e., non-computer-generated) reality. Consequently, &lt;strong&gt;it is statistically plausible that we are, indeed, simulated &amp;ldquo;sims&amp;rdquo; living in a simulated reality&lt;/strong&gt;.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;aliens.jpg&#34; alt=&#34;aliens&#34;/&gt;
  &lt;figcaption&gt;Aliens watching episode 2020 of &#34;the Earth&#34; unfold, horrified.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;Note that Bostrom&amp;rsquo;s argument has been criticized, mocked, revised and updated several times. But beyond the flaws in its argumentation and premises, it is still a fairly appealing thought experiment, and a fascinating possibility. People have been misrepresenting this argument, picturing some alien species playing &lt;em&gt;&lt;strong&gt;The Sims&lt;/strong&gt;&lt;/em&gt; with us on mega computers. But although Bostrom indeed mostly presents it as a simulation run by other intelligent agents, &lt;strong&gt;this idea could be generalized&lt;/strong&gt;. The simulating system could take many forms, not necessarily created by an intelligent design. It just means that there&amp;rsquo;s a &amp;ldquo;thing&amp;rdquo; &lt;em&gt;(a very scientific term, I know)&lt;/em&gt; outside the universe that gives rise to it, one way or another.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What does it change for our lives?&lt;/strong&gt; Pretty much nothing. Indeed, this thrilling hypothesis is somewhat irrelevant from a phenomenological and psychological perspective, for the majority of people cannot help but experience a fully deployed &lt;a href=&#34;https://dominiquemakowski.github.io/post/what_is_realitybending/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;sense of reality&lt;/strong&gt;&lt;/a&gt;. They are (normally) endowed with an intuitive feeling and belief that they are real, existing and belonging to a real world. They rarely doubt it, and even when they do, it is mostly in a philosophical fashion that does not entail a genuine sensation of uncertainty toward the nature of the world. This sense of reality is a fascinating topic on its own (though I&amp;rsquo;m biased, since it&amp;rsquo;s my main research topic), independent from the issue of the nature of the universe. We could argue that the latter opens up the possibility of tears in objective reality (either &lt;em&gt;bugs&lt;/em&gt; of the system or ways of accessing and modifying the fabric of reality), but this &lt;em&gt;&lt;strong&gt;Matrix&lt;/strong&gt;&lt;/em&gt;-like aspect of the simulated reality hypothesis is a topic for another time.&lt;/p&gt;
&lt;p&gt;So no, the universe being a simulation, aside from being a breath-taking metaphysical consideration, changes practically nothing for our daily lives.&lt;/p&gt;
&lt;h2 id=&#34;determism&#34;&gt;Determinism&lt;/h2&gt;
&lt;p&gt;Let&amp;rsquo;s put aside this idea of a simulated universe for now and think about determinism. I consider myself, for now, an &lt;strong&gt;ultra-determinist&lt;/strong&gt; (I should rather say, &lt;em&gt;&amp;ldquo;the world has made me into a determinist&amp;rdquo;&lt;/em&gt;). It means that I believe that the universe unfolds according to some causality laws (many of which are not yet fully known), and that since the origin of the universe (e.g., the Big Bang), things have been evolving according to the only possible chain of events. Naturally, the hard version of determinism leaves no room for &lt;strong&gt;free will&lt;/strong&gt; (though the &lt;em&gt;illusion&lt;/em&gt; of free will is important) and creates some issues when it comes to responsibility (again, a topic for another time).&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;determinism.jpg&#34; alt=&#34;determinism&#34;/&gt;
&lt;/figure&gt;
&lt;p&gt;Note that this position is not incompatible with a &lt;strong&gt;probabilistic view of the world&lt;/strong&gt; (I am also a rather radical Bayesianist &amp;#x1f601;). In this context, the probabilistic perspective is mostly a framework to describe uncertainty and hidden mechanisms. For instance, if I flip a coin, a probabilistic approach would be to consider that there is a 50/50 probability on the outcome. That said, if we managed to gather information on all the parameters (the starting position of the coin, the velocity and angle of the toss, gravity, the weight distribution of the coin, its resistance to air, characteristics of air pressure, wind, etc.), one could pretty accurately predict what the outcome would be. In other words, the outcome is already &amp;ldquo;determined&amp;rdquo; once the coin has been tossed. That doesn&amp;rsquo;t mean that a probabilistic model is not convenient to describe it, especially when we don&amp;rsquo;t have access to all these parameters (or powerful enough models to integrate them).&lt;/p&gt;
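&lt;p&gt;As a toy illustration of this point (a hypothetical sketch, not part of the original argument), a pseudo-random coin toss in code is &amp;ldquo;random&amp;rdquo; only to an observer who ignores the generator&amp;rsquo;s seed - given the full hidden state, the outcome is fully determined:&lt;/p&gt;

```python
import random

def toss_coins(seed, n=10):
    """Simulate n coin tosses; the seed plays the role of the full set of
    physical parameters (starting position, velocity, air pressure, etc.)."""
    rng = random.Random(seed)
    return [rng.choice(["heads", "tails"]) for _ in range(n)]

# To an observer ignorant of the seed, each toss looks like a 50/50 event;
# given the same "hidden parameters", the sequence is perfectly reproducible:
assert toss_coins(42) == toss_coins(42)
```

&lt;p&gt;A probabilistic model (50/50) remains a convenient description whenever the seed - i.e., the full set of parameters - is inaccessible to us.&lt;/p&gt;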
&lt;p&gt;Many attempts have been made to attack determinism (especially by people trying to defend the possibility of free will), and recent advances in physics have given them a lot of ammunition (the most striking example being the - often misunderstood - usage of &lt;strong&gt;quantum uncertainty to explain randomness, free will, consciousness, god&lt;/strong&gt; and pretty much everything). Nonetheless, determinism is one of the simplest assumptions that can be made regarding causality and evolution.&lt;/p&gt;
&lt;h2 id=&#34;the-future-is-now&#34;&gt;The future is now&lt;/h2&gt;
&lt;p&gt;Determinism has one important consequence. As all events stem from one another in a unique chain of causal events, if we know the exact state of the system (i.e., the state of &lt;em&gt;all&lt;/em&gt; of the variables of the system) at one point in time, as well as the rules governing the system, we can predict with certainty the system&amp;rsquo;s state at the next point in time, at &lt;em&gt;t&lt;/em&gt;+1. If we repeat the process, &lt;strong&gt;we know the state of the system at &lt;em&gt;t&lt;/em&gt;+2, and so on, until the end of times.&lt;/strong&gt;&lt;/p&gt;
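&lt;p&gt;This consequence can be sketched in a few lines of code (a toy model - a made-up update rule, the logistic map, stands in for the laws governing the universe):&lt;/p&gt;

```python
def step(state):
    """One deterministic update rule (here, the logistic map)."""
    return 3.7 * state * (1.0 - state)

def predict(state, n):
    """Knowing the exact state at time t and the rule governing the system,
    the state at t+n follows with certainty by applying the rule n times."""
    for _ in range(n):
        state = step(state)
    return state

# Two observers starting from the same exact state agree on every future state:
assert predict(0.2, 50) == predict(0.2, 50)
```

&lt;p&gt;The catch, of course, is knowing the &lt;em&gt;exact&lt;/em&gt; state: in chaotic systems like this one, the slightest imprecision about the initial state ruins long-range prediction, even though the system itself is fully deterministic.&lt;/p&gt;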
&lt;p&gt;In other words, the end of the universe is engraved in its beginning. The future is already contained in the now. The whole evolution of the universe is already &amp;ldquo;set&amp;rdquo;. &lt;em&gt;Myself, writing this post, am an expected consequence of the combination of parameters of the universe&amp;rsquo;s origin.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;
&lt;figure&gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34;&gt;&lt;img src=&#34;https://media.giphy.com/media/1zRd5ZNo0s6kLPifL1/giphy.gif&#34; alt=&#34;&#34; loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id=&#34;time-as-a-computational-limit-of-human-understanding&#34;&gt;Time as a computational limit of human understanding&lt;/h2&gt;
&lt;p&gt;The past and the future are mere illusions. All the information (about what has been, and what will be) is already known (not known by an intelligent being, but in the sense that the information exists, encapsulated within each frame of time). &lt;strong&gt;The evolution of the world is, in that regard, similar to that of a movie on a DVD&lt;/strong&gt;. The whole movie is there, at once. And a computer can read, and &amp;ldquo;experience&amp;rdquo; (as far as the phenomenological experience of a computer goes), all that information at once.&lt;/p&gt;
&lt;p&gt;Yet we cannot. We have to watch it unfold over time. We are cognitively constrained in that fourth dimension of time. The perception of time passing appears as some limitation of our own cognitive systems: we have to spend one hour and a half in order to make sense of the information. We cannot process it &amp;ldquo;at once&amp;rdquo; (we cannot yet just download the movie into our brain, and experience it without watching it). &lt;strong&gt;Is time passing a feature (or limitation) of our understanding?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Could we imagine (as a thought experiment) some other forms of being that are not constrained by, nor ever drifting onwards in, the time dimension? Beings which, through their immensely greater cognitive abilities, are able to process a &lt;em&gt;lot&lt;/em&gt; more information, rendering their predictions and inferences of the near past and future so accurate that they are able to almost &amp;ldquo;move in time&amp;rdquo; (at least across short time ranges)?&lt;/p&gt;
&lt;h2 id=&#34;time-as-a-computational-limit-of-the-cosmos&#34;&gt;Time as a computational limit of the cosmos&lt;/h2&gt;
&lt;p&gt;One thing to note is that, in the DVD analogy, the watcher is external to the content. We are not talking about Gandalf&amp;rsquo;s experience of his own movie. Which then begs the question, &lt;strong&gt;who&amp;rsquo;s watching our universe?!&lt;/strong&gt; &lt;sup&gt;&lt;sub&gt;(note that this is a logical fallacy used here as a joke; analogy is not homology).&lt;/sub&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;https://i.pinimg.com/originals/1f/0b/70/1f0b701c4cb1db137c17182d533ea051.jpg&#34; alt=&#34;god&#34;/&gt;
  &lt;figcaption&gt;The Ancient of Days (William Blake, 1794).&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;But let&amp;rsquo;s go back to that &amp;ldquo;simulated universe&amp;rdquo; hypothesis that we started with, and try to integrate it with determinism and its consequence on time. When we play The Sims, the sims do not really care about the speed at which we, external humans, play the game. &lt;strong&gt;Their experience (albeit primitive, but you see my point) is dictated by the system&lt;/strong&gt; (the game and the computer), and by the extent to which it can computationally process the information.&lt;/p&gt;
&lt;p&gt;It&amp;rsquo;s like playing &lt;strong&gt;Minecraft&lt;/strong&gt; (pardon my video game references). At the start of the game, one must first &amp;ldquo;generate&amp;rdquo; the world. This runs a procedural generation algorithm that spatially lays out and populates a world, which can take up to several minutes, depending on how much of a nerd you are (i.e., the specs of your computer). Following this example, if our universe is itself a simulation, could time be a consequence of some limitation of the system that &amp;ldquo;runs&amp;rdquo; it (or generates it - after all, maybe God is just waiting for our universe to finish building before playing his game of &amp;ldquo;Worldcraft&amp;rdquo; &lt;sub&gt;&lt;sup&gt;or, perhaps more appropriately, &amp;ldquo;Minehumans&amp;rdquo;&lt;/sup&gt;&lt;/sub&gt;)? That might explain the particular nature of the time dimension in our typical environment.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;https://thumbs.gfycat.com/BiodegradableSeriousLacewing-size_restricted.gif&#34; alt=&#34;gandalf&#34;/&gt;
  &lt;figcaption&gt;&#34;Riddles in the dark...&#34;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;While these are fun thought experiments to ponder, note that so far I have mainly speculated about time as we phenomenologically experience it. I haven&amp;rsquo;t even touched on the possible relationship between the idea of time as a computational limit and time as it is conceived in modern theoretical physics (for instance, as a geometrical dimension of the space-time continuum that can be deformed and, potentially, navigated). &lt;strong&gt;But for this, you&amp;rsquo;ll need to get me talking after a few more beers&lt;/strong&gt; &amp;#x1f37b;&lt;/p&gt;
&lt;p&gt;Cheers!&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>What is Reality Bending?</title>
      <link>https://realitybending.github.io/post/2020-09-28-what_is_realitybending/</link>
      <pubDate>Mon, 28 Sep 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-09-28-what_is_realitybending/</guid>
      <description>&lt;p&gt;As you know, &lt;strong&gt;reality bending&lt;/strong&gt; is my primary research direction. However, it is not (yet) a well-established scientific topic, nor is it clearly defined. In fact, &lt;strong&gt;it is not defined at all, hence the purpose of this article&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;But what does it refer to? Is it some kind of &lt;a href=&#34;https://en.wikipedia.org/wiki/Avatar:_The_Last_Airbender&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;Avatar: The Last Airbender&lt;/em&gt;&lt;/a&gt; thing? Or some &lt;a href=&#34;https://marvel-movies.fandom.com/wiki/Reality_Stone&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;Avengers&lt;/em&gt;&lt;/a&gt;-style superpower? Well&amp;hellip; I sure wish it was &amp;#x1f601;&lt;/p&gt;
&lt;p&gt;Essentially, &lt;strong&gt;reality bending&lt;/strong&gt; refers to the study of the internal and external determinants of subjective reality. In other words, we seek to understand the processes that modulate our conscious experience of reality. The word &amp;ldquo;bending&amp;rdquo; encapsulates the active nature of the mechanisms at stake. Indeed, being anything but stable, our perception of reality can be quite easily influenced, whether voluntarily or not, sometimes to extreme degrees of alteration.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;DonQuixote.jpg&#34; alt=&#34;github for psychologists&#34;/&gt;
  &lt;figcaption&gt;Daumier, H. (c. 1855), Don Quixote attacking the windmills.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;&lt;strong&gt;Reality benders&lt;/strong&gt; seek to unravel the structure and mechanisms of the sense of reality by studying natural instances of its distortion, or by directly inducing them through a variety of means.&lt;/p&gt;
&lt;h2 id=&#34;objective-and-subjective-determinants-of-the-sense-of-reality&#34;&gt;Objective and subjective determinants of the sense of reality&lt;/h2&gt;
&lt;p&gt;Let&amp;rsquo;s take for example a guy watching an episode of his favourite TV show, &lt;em&gt;Friends&lt;/em&gt;. As he swiftly moves from laughing to snivelling, we can confidently say that he is fully &lt;strong&gt;immersed&lt;/strong&gt; in the show. He feels like he&amp;rsquo;s &lt;em&gt;present&lt;/em&gt; in the show, in which the fictional characters &lt;em&gt;feel&lt;/em&gt; very real: for a moment, his brain processes the perceived experience almost as if it were real.&lt;/p&gt;
&lt;p&gt;What leads to this high sense of reality? First, there are &lt;strong&gt;objective characteristics&lt;/strong&gt; of the experience (or rather, of the external source of the experience), i.e., characteristics of the environment. Here, it&amp;rsquo;s a realistic stimulus displayed on a flat screen. But one could wonder what would happen if the sensory input was richer (imagine being physically IN the show by means of some super &lt;strong&gt;virtual reality&lt;/strong&gt; setup), or poorer (the same story presented as comic strips with the characters portrayed as stick figures).&lt;/p&gt;
&lt;p&gt;However, while such manipulations could indeed be used to manipulate our immersion, there is also a &lt;strong&gt;subjective component&lt;/strong&gt; contributing to our sense of reality, related for instance to the affective response, attentional engagement, or self-relevance, that will cause a stimulus to strum unique strings in each individual, depending on his history and state of mind.&lt;/p&gt;
&lt;h2 id=&#34;tell-me-your-reality-and-ill-tell-you-who-you-are&#34;&gt;Tell me your reality and I&amp;rsquo;ll tell you who you are&lt;/h2&gt;
&lt;p&gt;The fact that the sense of reality is, in the end, a subjective experience means that it is intrinsically connected to the Self (i.e., our physical and mental identity). As such, aside from studying how our sense of reality is influenced by external and internal factors, we also investigate the reverse relationship, i.e., &lt;strong&gt;how the variability of our sense of reality can inform us about ourselves&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Note that, although the focus is the subjective aspect of reality, it doesn&amp;rsquo;t mean that we deny the existence, or downplay the importance, of objective reality. Stating that most of our experience is &amp;ldquo;made-up&amp;rdquo; (i.e., is a construction of the brain) does not equate to absolute relativism (more on that in another post). Objective truths and facts do exist, and are essential to seek.&lt;/p&gt;
&lt;h2 id=&#34;altered-states-of-consciousness&#34;&gt;Altered states of consciousness&lt;/h2&gt;
&lt;p&gt;Naturally, states in which our sense of reality is distorted (as compared to the consensual collective experience) are of particular interest as models or case studies for our ideas and theories. They include long-term conditions (e.g., neuropsychiatric disorders such as schizophrenia) and transient states (induced by psychoactive substances or specific practices like meditation and trance).&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt; &amp;#x1f917;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>How to extract individual scores from repeated measures</title>
      <link>https://realitybending.github.io/post/2020-09-14-individual_scores/</link>
      <pubDate>Mon, 14 Sep 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-09-14-individual_scores/</guid>
      <description>&lt;h3 id=&#34;introduction&#34;&gt;Introduction&lt;/h3&gt;
&lt;p&gt;Many psychology fields require extracting individual scores, i.e., point-estimates (a single value) for a participant/patient, to be used as an index of something and later interpreted or re-used in further statistical analyses. This single index is often derived from several &amp;ldquo;trials&amp;rdquo;. For instance, the reaction times in condition A (let&amp;rsquo;s say, the baseline) will be &lt;strong&gt;averaged&lt;/strong&gt; together, and the same will be done with condition B. Finally, the difference between these two means will be used as the &lt;strong&gt;individual score&lt;/strong&gt; for a given participant.&lt;/p&gt;
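&lt;p&gt;As a minimal base-R sketch of this classic approach (using made-up toy data, not the post&amp;rsquo;s dataset), the raw individual score is simply the difference of condition means:&lt;/p&gt;

```r
# Hypothetical trial-level reaction times for two participants (toy data)
df = data.frame(
  Participant = rep(c("S01", "S02"), each = 4),
  Condition   = rep(c("A", "A", "B", "B"), times = 2),
  RT          = c(0.50, 0.54, 0.61, 0.63, 0.48, 0.52, 0.55, 0.59)
)

# Average per participant and condition, then take the difference of means
m = tapply(df$RT, list(df$Participant, df$Condition), mean)
scores = m[, "B"] - m[, "A"]  # one raw "individual score" per participant
```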
&lt;p&gt;However, we can intuitively feel that we &lt;strong&gt;lose a lot of information&lt;/strong&gt; when averaging these scores. Do we deal appropriately with the variability related to individuals, conditions, or the noise aggravated by potential outliers? This is especially important when working with a limited number of trials.&lt;/p&gt;
&lt;p&gt;With recent computational advances, new easy-to-implement alternatives have emerged. For instance, &lt;strong&gt;one can &amp;ldquo;model&amp;rdquo; the effects at an individual level&lt;/strong&gt; (in the simplest case, for the two-condition paradigm described above, a linear regression with the condition as the unique predictor), and use the &lt;strong&gt;parameters&lt;/strong&gt; of each model as individual scores (e.g., the &amp;ldquo;slope&amp;rdquo; coefficient of the effect of the manipulation), rather than the raw mean. This opens up the possibility of including covariates and taking into account other sources of known variability, which could lead to better estimates.&lt;/p&gt;
&lt;p&gt;However, individual models are also sensitive to outliers and noise. Thus, another possibility is to &lt;strong&gt;model the effects at the population level&lt;/strong&gt; and, &lt;em&gt;at the same time&lt;/em&gt;, at the individual level. This can be achieved by modelling the participants as a &lt;strong&gt;random factor in a mixed model&lt;/strong&gt;. In this case, the individual estimates can benefit from the population estimates: the effects at the population level will &amp;ldquo;constrain&amp;rdquo; or &amp;ldquo;guide&amp;rdquo; the estimation at the individual level, potentially limiting extreme parameters.&lt;/p&gt;
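&lt;p&gt;A minimal sketch of this mixed-model idea (using &lt;code&gt;nlme&lt;/code&gt;, which ships with R, rather than the post&amp;rsquo;s exact code, and with made-up simulation settings): a single model is fitted on all participants at once, and per-participant slopes - shrunk toward the population effect - are read off with &lt;code&gt;coef()&lt;/code&gt;:&lt;/p&gt;

```r
# Simulate trial-level data for a few participants (hypothetical settings)
library(nlme)

set.seed(1)
n_participants = 8
n_trials = 20
df = do.call(rbind, lapply(seq_len(n_participants), function(i) {
  true_effect = rnorm(1, 1.5, 1)  # participant-specific "true" effect
  data.frame(
    Participant = sprintf("S%02d", i),
    Condition   = rep(c("Baseline", "Manipulation"), each = n_trials),
    Y           = c(rnorm(n_trials, 0, 1), rnorm(n_trials, true_effect, 1))
  )
}))

# Random intercept and slope per participant: individual slope estimates
# are "guided" (shrunk) toward the population-level effect
model = lme(Y ~ Condition, random = ~ Condition | Participant, data = df)
ind_scores = coef(model)[["ConditionManipulation"]]  # one estimate each
```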
&lt;p&gt;Unfortunately, the above method requires having all the data at hand to fit the population model. This is often not the case during ongoing acquisition, or in neuropsychological contexts in which practitioners simply acquire data for one patient and have to compute individual scores without access to the detailed population data. Thus, an in-between alternative could make use of &lt;strong&gt;Bayesian models&lt;/strong&gt;, in which the population effects (for instance, the mean effect of the condition) are entered as an informative &lt;strong&gt;prior&lt;/strong&gt; in the individual models to, again, &amp;ldquo;guide&amp;rdquo; the estimation at the individual level and hopefully limit the impact of noise or outlier observations.&lt;/p&gt;
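&lt;p&gt;To see how a prior &amp;ldquo;guides&amp;rdquo; the estimation, here is a conceptual illustration (a conjugate normal-normal update with hypothetical numbers, not the post&amp;rsquo;s actual Bayesian models): the resulting estimate is a precision-weighted average of the raw individual estimate and the population prior, so noisy individual estimates get pulled toward the population value.&lt;/p&gt;

```r
# Hypothetical numbers for one participant
prior_mean = 1.5  # population-level effect, used as an informative prior
prior_sd   = 0.5
ind_mean   = 3.0  # noisy raw estimate from that participant's few trials
ind_se     = 1.0  # its standard error

# Precision-weighted average: the noisier the individual estimate,
# the more it is pulled toward the population value
w = (1 / ind_se^2) / (1 / ind_se^2 + 1 / prior_sd^2)
posterior_mean = w * ind_mean + (1 - w) * prior_mean
# posterior_mean is shrunk from 3.0 toward 1.5 (here: 1.8)
```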
&lt;p&gt;In this post, the aim is to compare these methods (the basic frequentist individual model - equivalent to using the raw mean -, the population model, and the individual model with informative priors) in recovering the &amp;ldquo;true&amp;rdquo; effects using a simulated dataset.&lt;/p&gt;
&lt;h3 id=&#34;results&#34;&gt;Results&lt;/h3&gt;
&lt;h4 id=&#34;generate-data&#34;&gt;Generate Data&lt;/h4&gt;
&lt;p&gt;We generate several datasets in which we manipulate the number of participants. The score of interest is the effect of a manipulation as compared to a baseline condition. Twenty trials per condition are generated with a known &amp;ldquo;true&amp;rdquo; effect (the centre of the distribution from which the data is generated), and Gaussian noise of varying standard deviation is added to create natural variability (see the functions&amp;rsquo; definitions below).&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tidyverse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;easystats&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;get_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1000&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;get_results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class=&#34;figure&#34;&gt;
&lt;img src=&#34;individual.png&#34; alt=&#34;*Example of a dataset containing 20 participants (shown with different colors). As can be seen, we introduced modulations in the inter- and intra- individual variability.*&#34; width=&#34;1575&#34; /&gt;
&lt;p class=&#34;caption&#34;&gt;&lt;span id=&#34;fig:unnamed-chunk-3&#34;&gt;&lt;/span&gt;Figure 1: *Example of a dataset containing 20 participants (shown with different colors). As can be seen, we introduced modulations in the inter- and intra- individual variability.*&lt;/p&gt;
&lt;/div&gt;
&lt;p&gt;We will then compare the scores obtained by each method to the &amp;ldquo;true&amp;rdquo; score of each participant by subtracting them from one another. As such, for each method, we obtain the absolute &amp;ldquo;distance&amp;rdquo; from the true score.&lt;/p&gt;
&lt;h4 id=&#34;fit-model&#34;&gt;Fit model&lt;/h4&gt;
&lt;p&gt;Contrast analysis will be applied to compare the different methods together.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;lm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Diff_Abs&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;modelbased&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;estimate_contrasts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;arrange&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difference&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt; 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;select&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difference&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_low&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_high&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;p&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-fallback&#34; data-lang=&#34;fallback&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;## Level1                 | Level2                 | Difference |            CI |      p
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;## -------------------------------------------------------------------------------------
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;## IndividualModel_Priors | PopulationModel        |  -1.85e-03 | [-0.01, 0.01] | &amp;gt; .999
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;## IndividualModel_Freq   | PopulationModel        |   1.70e-03 | [-0.01, 0.01] | &amp;gt; .999
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;## IndividualModel_Freq   | IndividualModel_Priors |   3.55e-03 | [-0.01, 0.01] | &amp;gt; .999
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;h4 id=&#34;visualize-the-results&#34;&gt;Visualize the results&lt;/h4&gt;
&lt;div class=&#34;figure&#34;&gt;
&lt;img src=&#34;featured.png&#34; alt=&#34;*Average accuracy of the different methods (the closest to 0 the better).*&#34; width=&#34;2250&#34; /&gt;
&lt;p class=&#34;caption&#34;&gt;&lt;span id=&#34;fig:unnamed-chunk-6&#34;&gt;&lt;/span&gt;Figure 2: *Average accuracy of the different methods (the closest to 0 the better).*&lt;/p&gt;
&lt;/div&gt;
&lt;div class=&#34;figure&#34;&gt;
&lt;img src=&#34;n_participants.png&#34; alt=&#34;*Accuracy depending on the number of total participants in the dataset.*&#34; width=&#34;2250&#34; /&gt;
&lt;p class=&#34;caption&#34;&gt;&lt;span id=&#34;fig:unnamed-chunk-7&#34;&gt;&lt;/span&gt;Figure 3: *Accuracy depending on the number of total participants in the dataset.*&lt;/p&gt;
&lt;/div&gt;
&lt;h3 id=&#34;conclusion&#34;&gt;Conclusion&lt;/h3&gt;
&lt;p&gt;Though the differences are not statistically significant, it seems that &lt;strong&gt;raw basic estimates&lt;/strong&gt; (which rely only on the individual data) &lt;strong&gt;perform consistently worse than the population model or individual models informed by priors&lt;/strong&gt;, especially for small datasets (between 10 and 100 participants) - though again, the difference is tiny in our simulated dataset. In the absence of the whole population dataset, using an individual Bayesian model with informative priors (derived from the population model) seems to be a safe alternative.&lt;/p&gt;
&lt;h3 id=&#34;functions&#34;&gt;Functions&lt;/h3&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-r&#34; data-lang=&#34;r&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tidyverse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;easystats&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;rstanarm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;library&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ggdist&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Get data ----------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;get_data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;d&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1.5&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;scores_baseline&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;scores_condition&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;d&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;variances&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbeta&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;8&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;variances&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;variances&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;*&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;var&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;max&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;variances&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Rescale&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;noise_sd&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;abs&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;kr&#34;&gt;for&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;i&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;:&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;a&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;scores_baseline[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;variances[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;b&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;scores_condition[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;variances[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;a&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;a&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise_sd[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Add noise&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;b&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;b&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rnorm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise_sd[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Add noise&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;s&#34;&gt;&amp;#34;Participant&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;sprintf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;S%02d&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;i&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;s&#34;&gt;&amp;#34;Y&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;a&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;b&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;s&#34;&gt;&amp;#34;Score_True&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rep&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;scores_baseline[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;scores_condition[i]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;each&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;s&#34;&gt;&amp;#34;Condition&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rep&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Baseline&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Manipulation&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;each&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Visualize data -----------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;p&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;get_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;group_by&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;mean&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Y&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ggplot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Y&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fill&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;color&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;group&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_line&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;position&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;position_dodge&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.66&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;ggdist&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;stat_eye&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;point_interval&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ggdist&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;mean_hdi&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;alpha&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.66&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;position&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;position_dodge&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0.66&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;.width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.95&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ylab&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Score&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme_modern&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;legend.position&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;none&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;ggsave&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;individual.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;p&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;7&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;height&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;7&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dpi&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;450&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Get results -------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;get_results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;function&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;c1&#34;&gt;# Raw method ----&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;group_by&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;summarise&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;across&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;everything&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Mean of Y and Score_True per Participant and Condition&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;rename&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Score_Raw&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Y&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;arrange&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;ungroup&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;c1&#34;&gt;# Population model ----&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;model&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;lme4&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;lmer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;|&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Random intercept and slope per participant&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;fixed&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;insight&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;get_parameters&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effects&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;fixed&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Estimate&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# [1] Intercept, [2] Condition effect&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;random&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;insight&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;get_parameters&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;effects&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;random&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Per-participant deviations&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;c1&#34;&gt;# Transform coefs into scores&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;pop_baseline&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;random[&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;]&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fixed[1]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;pop_manipulation&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pop_baseline&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;random[&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;]&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fixed[2]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Score_PopulationModel&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;pop_baseline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pop_manipulation&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;c1&#34;&gt;# Individual model ----&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;individual_model_data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;kr&#34;&gt;for&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;unique&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;cat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;.&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Print progress&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;dat&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data[data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;==&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;# Frequentist&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;model1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;lm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;nopriors&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Coefficient&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;# Bayesian without priors&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;# model2 &amp;lt;- stan_glm(Y ~ Condition, data = dat, refresh = 0)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;# bayes &amp;lt;- parameters::parameters(model2)$Median&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;c1&#34;&gt;# Bayesian with Priors&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;model3&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;stan_glm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;refresh&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;prior&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;normal&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;fixed[2]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Condition-effect prior from the population model&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;prior_intercept&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;normal&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;fixed[1]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Intercept prior from the population model&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;priors&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Median&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;individual_model_data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;individual_model_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;s&#34;&gt;&amp;#34;Participant&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;s&#34;&gt;&amp;#34;Condition&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Baseline&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Manipulation&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;s&#34;&gt;&amp;#34;Score_IndividualModel&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;nopriors[1]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nopriors[1]&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nopriors[2]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;c1&#34;&gt;# &amp;#34;Score_IndividualModel_Bayes&amp;#34; = c(bayes[1], bayes[1] + bayes[2]),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;s&#34;&gt;&amp;#34;Score_IndividualModel_Priors&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;c&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;priors[1]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;priors[1]&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;priors[2]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;merge&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;individual_model_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;c1&#34;&gt;# Clean output ----&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;diff&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;c1&#34;&gt;# Diff_Raw = Score_True - Score_Raw,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;Diff_PopulationModel&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_True&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_PopulationModel&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;Diff_IndividualModel&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_True&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_IndividualModel&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;c1&#34;&gt;# Diff_IndividualModel_Bayes = Score_True - Score_IndividualModel_Bayes,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;      &lt;span class=&#34;n&#34;&gt;Diff_IndividualModel_Priors&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_True&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Score_IndividualModel_Priors&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;select&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;starts_with&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Diff&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;pivot_longer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;starts_with&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Diff&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;names_to&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Method&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;values_to&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Diff_Abs&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;abs&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Diff&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;diff&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Analysis ----------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;data.frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kr&#34;&gt;for&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n&lt;/span&gt; &lt;span class=&#34;kr&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;seq.int&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;300&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;length.out&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)){&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;get_data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_participants&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;round&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_trials&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;20&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;rez&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;get_results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;select&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Participant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;group_by&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Condition&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;summarise_all&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_Participants&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;           &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;as.factor&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;           &lt;span class=&#34;n&#34;&gt;Dataset&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;paste0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Dataset&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;round&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;rbind&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;rez&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;c1&#34;&gt;# Print progress&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# model &amp;lt;- mgcv::gam(Diff_Abs ~ Method + s(n_Participants, by = Method), data = results)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;lm&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Diff_Abs&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;~&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;*&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;poly&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;n_Participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;parameters&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;contrasts&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;modelbased&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;estimate_contrasts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;arrange&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Difference&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;select&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Level1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Level2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Difference&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_low&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_high&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;p&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Visualize results ---------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;p&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;modelbased&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;estimate_means&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;arrange&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;factor&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;levels&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;  &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ggplot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Mean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;color&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_line&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;group&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_pointrange&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ymin&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_low&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ymax&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CI_high&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;size&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme_modern&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;axis.text.x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;element_text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;angle&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;45&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;hjust&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_color_material&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;ggsave&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;featured.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;p&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;height&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;6&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dpi&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;450&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;p&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;modelbased&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;estimate_relation&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;  &lt;span class=&#34;o&#34;&gt;%&amp;gt;%&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;ggplot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;n_Participants&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;y&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Predicted&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_point&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;mutate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stringr&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;::&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;str_remove&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;Diff_&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;             &lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;y&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Diff_Abs&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;color&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_ribbon&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ymin&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;CI_low&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ymax&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;CI_high&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fill&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;alpha&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0.1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;geom_line&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;aes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;color&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Method&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;size&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme_modern&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;theme&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;axis.text.x&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;element_text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;angle&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;45&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;hjust&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_color_material&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;nf&#34;&gt;scale_fill_material&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;ggsave&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;n_participants.png&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;p&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;width&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;height&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;6&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dpi&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;450&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Save results ------------------------------------------------------------&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;d&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;&amp;lt;-&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;list&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;results&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;results&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;model&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;model&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;contrasts&amp;#34;&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;contrasts&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nf&#34;&gt;save&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;d&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;file&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;&amp;#34;data.Rdata&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;h3 id=&#34;references&#34;&gt;References&lt;/h3&gt;
&lt;p&gt;&lt;sub&gt;You can reference this post as follows:&lt;/sub&gt;&lt;/p&gt;
&lt;p&gt;&lt;sub&gt;- Makowski, D. (2020, September 14). How to extract individual scores from repeated measures. Retrieved from &lt;a href=&#34;https://dominiquemakowski.github.io/post/individual_scores/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://dominiquemakowski.github.io/post/individual_scores/&lt;/a&gt;&lt;/sub&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;And don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>What is neuropsychology?</title>
      <link>https://realitybending.github.io/post/2020-09-13-what_is_neuropsychology/</link>
      <pubDate>Sun, 13 Sep 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-09-13-what_is_neuropsychology/</guid>
      <description>&lt;h2 id=&#34;the-place-of-neuropsychology&#34;&gt;The place of neuropsychology&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Let&amp;rsquo;s make a simple experiment&lt;/strong&gt;. Pick one young and brilliant neuropsychologist and ask &amp;ldquo;what is neuropsychology?&amp;rdquo;. In some cases, after a few seconds of hesitation, you could hear answers like &amp;ldquo;being a neuropsychologist means doing &lt;em&gt;this&lt;/em&gt; or &lt;em&gt;that&lt;/em&gt;&amp;rdquo;. In other cases, you might come across incomplete - or even false - responses, such as &amp;ldquo;neuropsychology is a tool&amp;rdquo;, &amp;ldquo;a method&amp;rdquo;, &amp;ldquo;a paradigm&amp;rdquo;, or even worse, &amp;ldquo;a point of view&amp;rdquo;.&lt;/p&gt;
&lt;p&gt;That does not mean that our neuropsychologist is incompetent, far from it. But formally defining our field as a whole is not an exercise that we are used to doing. Indeed, &lt;strong&gt;the training in neuropsychology usually comes in a fragmented way&lt;/strong&gt;, little by little. &lt;em&gt;A bit of cognitive neuroscience here, a bit of neuropsychological syndromes there, some cognitive test administration over here, and some cortical neuroanatomy over there&amp;hellip;&lt;/em&gt; Though we might, &lt;em&gt;in fine&lt;/em&gt;, acquire a global vision and understanding of neuropsychology, verbalizing it is seldom necessary.&lt;/p&gt;
&lt;p&gt;The definition of neuropsychology is actually quite complex to formalize, and can even be hotly debated. The jobs and positions that stem from this field are many, and &lt;strong&gt;practitioners often tend to circumscribe neuropsychology to their own activity&lt;/strong&gt;. For instance, a neuropsychologist who mainly does cognitive rehabilitation with psychiatric patients might have quite a different vision from one who performs presurgical cognitive assessments day after day. And that is without mentioning the neuropsychologists pursuing an academic career, or even those who have moved to the private sector.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;No problem&lt;/em&gt;, would argue the careful reader, &lt;em&gt;if the definitions are too narrow, let&amp;rsquo;s take a more general one&lt;/em&gt;. &lt;strong&gt;It&amp;rsquo;s not that simple&lt;/strong&gt;. Indeed, neuropsychology occupies a very particular place in the network of science, as it sits at &lt;strong&gt;the crossroads between social sciences, biological sciences and medical fields&lt;/strong&gt;. A definition that is too broad would dissolve its essence in the nebulous depths of neuroscience and psychology, which would not be accurate; neuropsychologists, whether they are clinical practitioners or not, share a common training, a specific theoretical grounding, as well as a unique interpretation and analysis framework underpinned by a scientifically rigorous method. Taking these elements into account, I will attempt to give a &lt;strong&gt;simple, comprehensive and informative definition of neuropsychology&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;The first axiom that we need to discuss is the notion of science. &lt;strong&gt;Is neuropsychology its &amp;ldquo;own&amp;rdquo; scientific field&lt;/strong&gt;, or is it a mere portion of another one, such as cognitive neuroscience or psychology, which differs from other specializations only through its object of interest? &lt;em&gt;&amp;ldquo;By science&amp;rdquo;&lt;/em&gt;, says Schopenhauer in his PhD thesis with a baroque title (&lt;em&gt;On the Fourfold Root of the Principle of Sufficient Reason&lt;/em&gt;), &lt;em&gt;&amp;ldquo;we understand a system of notions, i.e. a totality of connected, as opposed to a mere aggregate of disconnected, notions.&amp;rdquo;&lt;/em&gt; This definition applies well to neuropsychology, which contains a set of theories, hypotheses, methods and proofs feeding from one another and creating a coherent ensemble. As such, neuropsychology is its own scientific discipline, although a singular one&amp;hellip;&lt;/p&gt;
&lt;p&gt;Indeed, &lt;strong&gt;what is the &amp;ldquo;bigger&amp;rdquo; box in which neuropsychology fits?&lt;/strong&gt; While neuropsychologists are often initially trained in psychology, one could argue that the focus on the brain makes it belong more to neuroscience. Well, the organization and structure of science is a complicated issue. However, the particular topographical location of neuropsychology is quite apparent.&lt;/p&gt;
&lt;p&gt;On the one hand, neuropsychology belongs to a cluster of sciences interested in a specific biological organ: the brain. As such, &lt;strong&gt;neuropsychology is an integral part of neuroscience&lt;/strong&gt;. On the other hand, neuropsychology is interested in the productions of the brain (e.g., thoughts, feelings and behaviours) with a focus on the cognitive level (analyzing things in terms of cognitive processes and mechanisms), which makes it also &lt;strong&gt;belong to psychology&lt;/strong&gt;. Moreover, one could argue that neuropsychology, through its integration of the study of what we are biologically, and who we are mentally, has been connected to, and used as evidence in, &lt;strong&gt;philosophy of mind&lt;/strong&gt; debates (for instance, famous neuropsychological cases studied by Sacks, Ramachandran or Milner have been widely discussed by contemporary philosophers). Finally, contrary to many other domains, neuropsychology also has an applied, practical component that can be used in clinical practice. This clinical aspect, registering &lt;strong&gt;neuropsychology within medical fields&lt;/strong&gt;, takes multiple forms, such as assessment, diagnosis or therapeutic care, and can be used with a wide variety of patients and illnesses. These multiple facets constitute the wealth of &lt;strong&gt;neuropsychology, which offers an exceptional freedom of practice&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;As we have seen, &lt;strong&gt;neuropsychology is located at the centre of colliding galaxies of knowledge&lt;/strong&gt;, such as neuroscience, psychology, medicine and philosophy. However, the fast development of neuropsychology is gradually leading to the creation of subcomponents within itself, corresponding to different practices and theoretical steps. And these clusters are themselves growing. For instance, clinical neuropsychology was historically focused on diagnostic cognitive assessments, but has recently expanded on the treatment-side of things, with innovations like cognitive training and rehabilitation. This underlines neuropsychology as a rapidly evolving field, moving its potential towards yet uncharted territories.&lt;/p&gt;
&lt;h2 id=&#34;the-fourfold-structure-of-neuropsychology&#34;&gt;The fourfold structure of neuropsychology&lt;/h2&gt;
&lt;p&gt;Neuropsychology is born from the convergence of &lt;strong&gt;cognitive neurology&lt;/strong&gt;, with pioneers such as &lt;a href=&#34;https://en.wikipedia.org/wiki/Paul_Broca&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Broca&lt;/a&gt; and &lt;a href=&#34;https://en.wikipedia.org/wiki/Carl_Wernicke&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Wernicke&lt;/a&gt; (who made inferences about brain functioning based on observations of patients with brain lesions), and psychologists such as &lt;a href=&#34;https://en.wikipedia.org/wiki/Th%C3%A9odule-Armand_Ribot&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Ribot&lt;/a&gt; (focusing on the organization and semiology of cognitive deficits). This history has shaped neuropsychology as a two-faced entity, with one &lt;strong&gt;experimental side&lt;/strong&gt; dedicated to understanding the relationship between brain and cognition (by using pathological cases or the natural variability of neurocognitive characteristics), and one &lt;strong&gt;clinical aspect&lt;/strong&gt; focusing on bringing this knowledge to the benefit of patients suffering from brain disorders.&lt;/p&gt;
&lt;p&gt;However, beyond these two pillars of neuropsychology, recent advances have outlined a more &lt;strong&gt;theoretical level&lt;/strong&gt; of neuropsychology, dedicated to a high-level integration of the data at hand to elaborate general theories and to interfacing them with evolutionary or philosophical principles. Similarly, a &lt;strong&gt;computational&lt;/strong&gt; facet, referring to the operationalization of mental functioning in statistical terms, is starting to emerge as a pseudo-independent aspect, propelled by the renewed interest in the methodological and statistical aspects of psychology and neuroscience.&lt;/p&gt;
&lt;p&gt;This structure is not fixed, but driven by the evolution of the field. It is possible that new poles will emerge, or differentiate over time, until maybe they separate from - or create new clusters within - neuropsychology.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;cycle.png&#34; alt=&#34;Structure of neuropsychology&#34;/&gt;
  &lt;figcaption&gt;The fourfold structure of neuropsychology.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2 id=&#34;the-definition-of-neuropsychology&#34;&gt;The definition of neuropsychology&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Neuropsychology is a theoretical and practical science investigating the relationship between 1) the structure and functioning of the brain, and 2) cognitive processes and their related derivatives, such as thoughts, feelings and behaviours.&lt;/strong&gt; It is organised in four interconnected and overlapping dimensions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Experimental neuropsychology&lt;/strong&gt; studies the variability of the brain and cognition (whether from pathological origin or not) to test theories and models through empirical experimentation.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Clinical neuropsychology&lt;/strong&gt; uses theories and models about mental functioning to better detect and assess disorders and deficits, leading to a precise diagnosis and an adapted treatment.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Computational neuropsychology&lt;/strong&gt; transforms the data acquired through experiments and observation into logical or statistical models of mental functioning that are used to operationalize the processes at stake.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Theoretical neuropsychology&lt;/strong&gt; integrates the evidence to elaborate and develop unifying theories to address fundamental questions about mental functioning.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Neuropsychology is located at the crossroads between neuroscience and psychology, at the interface between theory and practice. &lt;strong&gt;Its practitioners, the neuropsychologists&lt;/strong&gt;, are bound by a specific training, by unique theoretical and historical references, and are endowed with an analysis and interpretation framework backed by a rigorous and scientific investigation methodology.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;h2 id=&#34;references&#34;&gt;References&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Nicolas, S., &amp;amp; Murray, D. J. (1999). Théodule Ribot (1839–1916), founder of French psychology: A biographical introduction. History of Psychology, 2(4), 277.&lt;/li&gt;
&lt;li&gt;Schopenhauer, A. (1813). &lt;em&gt;On the Fourfold Root of the Principle of Sufficient Reason&lt;/em&gt;. PhD dissertation.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;sub&gt;You can reference this post as follows:&lt;/sub&gt;&lt;/p&gt;
&lt;p&gt;&lt;sub&gt;- Makowski, D. (2020, September 13). What is Neuropsychology?. Retrieved from &lt;a href=&#34;https://dominiquemakowski.github.io/post/what_is_neuropsychology/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;https://dominiquemakowski.github.io/post/what_is_neuropsychology/&lt;/a&gt;&lt;/sub&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Extracting, Computing and Exploring the Parameters of Statistical Models using R</title>
      <link>https://realitybending.github.io/publication/ludecke2020parameters/</link>
      <pubDate>Thu, 10 Sep 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/ludecke2020parameters/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Methods and Algorithms for Correlation Analysis in R</title>
      <link>https://realitybending.github.io/publication/makowski2020correlation/</link>
      <pubDate>Thu, 16 Jul 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2020correlation/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Why psychologists should join GitHub</title>
      <link>https://realitybending.github.io/post/2020-05-27-github_psychologists/</link>
      <pubDate>Wed, 27 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-27-github_psychologists/</guid>
      <description>&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    &lt;strong&gt;Disclaimer:&lt;/strong&gt; I don&amp;rsquo;t have anything to do with GitHub, aside from being a simple user. This article is not an advertisement for it, but rather a perspective on the role of technical social networks in psychology.
  &lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;I already mentioned in &lt;a href=&#34;https://dominiquemakowski.github.io/post/r_or_python/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;another post&lt;/a&gt; that &lt;strong&gt;technical aspects and skills will play an increasing role in psychology&lt;/strong&gt;. This relationship isn&amp;rsquo;t by any means new. More than a century ago, pioneering psychologists were demonstrating formidable engineering and craftsmanship skills to build new tools and apparatuses to measure what they were interested in (see for instance &lt;a href=&#34;https://dominiquemakowski.github.io/publication/nicolas2016can/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Nicolas &amp;amp; Makowski, 2016&lt;/a&gt;). But years have gone by, and the digital revolution has happened. As a result, most of the &amp;ldquo;technical&amp;rdquo; aspects (a rather vague term covering everything that is not related to the semantic knowledge of psychological theories and facts) are now ultimately tied to &lt;em&gt;software&lt;/em&gt;. Critically, &lt;strong&gt;your ability to use this software will determine the speed and ease at which you can achieve your goals and produce high-quality work&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;As an example, during my studies, most of the statistics course was delivered through the use of one particular program (Statistica© 🤮). At the exam, your score didn&amp;rsquo;t much depend on your understanding of what a &lt;em&gt;t&lt;/em&gt;-test is, when to use it or anything like that; but rather on your knowledge of the software, and your ability to &lt;strong&gt;click on the right buttons faster than your peers&lt;/strong&gt;. While this is an unfortunate example that can be used to criticize the reliance on tools rather than fundamental knowledge, it also tells us something about the reality of the current world. In research, the better you are at, for instance, data processing (which involves both the knowledge of how to navigate the software &lt;em&gt;and&lt;/em&gt; the knowledge of what to do in general), the faster you will be able to carry it out, and the less stressed you will be, resulting in a cascade of other positive outcomes (increased well-being, work quality, productivity, opportunities, etc.).&lt;/p&gt;
&lt;p&gt;However, &lt;strong&gt;one common mistake is to delay learning&lt;/strong&gt; new stuff (especially things outside our comfort zone) until we have no choice. This is understandable given that in the short term, certain skills may not be absolutely needed (i.e., you can manage without them) and acquiring them can involve a steep learning curve (which can be hard, frustrating and effortful). However, you should start investing in your technical skills as soon as possible (as the time devoted to learning will only shrink as you advance) if you don&amp;rsquo;t want to become very &lt;strong&gt;close friends with pressure, stress, hatred, frustration and despair&lt;/strong&gt;. So stay on the light side of the force and embrace the future.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;vader.jpg&#34; alt=&#34;github for psychologists&#34;/&gt;
  &lt;figcaption&gt;A psychology researcher realizing that he should have learned programming earlier.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;And it&amp;rsquo;s not just about learning to &lt;strong&gt;use&lt;/strong&gt; some software just for the sake of having things done and getting faster results. A lot of science is done &lt;em&gt;through&lt;/em&gt; software, not only &lt;em&gt;with&lt;/em&gt; it. People are actively discussing new methods, algorithms and tools that expand the possibilities of researchers like never before. &lt;strong&gt;Flaming debates have been going on with frameworks and workflows clashing with thunderous sparks&lt;/strong&gt; (for instance, Bayesian &lt;em&gt;vs.&lt;/em&gt; frequentist statistics, ANOVAs &lt;em&gt;vs.&lt;/em&gt; (mixed) regressions, etc.), and these calls for change find an echo thanks to software developments, allowing initiated users to test, validate and use new methods.&lt;/p&gt;
&lt;h2 id=&#34;open-access-software&#34;&gt;Open-access software&lt;/h2&gt;
&lt;p&gt;It might not seem like it when you&amp;rsquo;re studying it at the university, but science is currently in the middle of a &lt;strong&gt;revolution&lt;/strong&gt;. A massive paradigmatic change, partly fuelled by the growth of &lt;strong&gt;open-science&lt;/strong&gt;, which covers aspects like open-access and open-source. For software, this means that it is no longer only developed by private companies and sold for money. Instead, it is developed in a public fashion, and &lt;strong&gt;everybody is welcome to chime in and provide input, suggestions, report bugs or improve the code&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Open-source development means faster and better software, and more importantly, the creation of a &lt;strong&gt;true connection between developers and users&lt;/strong&gt;. In fact, the former are often first and foremost also the latter, meaning that in a lot of cases (at least, mine 😅), people started writing software because they needed it for their own personal job.&lt;/p&gt;
&lt;p&gt;But the beauty lies in the fact that &lt;strong&gt;users can seamlessly become developers&lt;/strong&gt; too, or &lt;strong&gt;contributors&lt;/strong&gt; at the very least. Sometimes, users end up on a software development page to solve a bug or an issue that they encountered. From there, they can start following the developments and getting involved. And &lt;strong&gt;not necessarily by writing code&lt;/strong&gt;: it can be by answering other issues, helping other users, reporting bugs and typos, improving the documentation, giving ideas and suggestions, testing new features and encouraging the developers. There are so many ways to contribute to the development and, as a result, become an active member of the open-science community. Moreover, doing so is also a great way to learn the theoretical bits. For instance, for me the most efficient way of learning the complexities of EEG acquisition and processing was to follow the development of an EEG-processing package (namely, &lt;a href=&#34;https://mne.tools/stable/index.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;MNE-Python&lt;/em&gt;&lt;/a&gt;): reading the issues that users encountered and the replies from experts and developers, and trying to understand what the functions do, what the different parameters are, what the possibilities and limits are, and so on.&lt;/p&gt;
&lt;p&gt;All in all, engaging in open-source software is a great way to increase your technical expertise and get involved in the community of researchers. And who knows, you might meet cool people, create new connections, and that&amp;rsquo;s always great!&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;expectations.jpg&#34; alt=&#34;open source software expectations&#34;/&gt;
  &lt;figcaption&gt;Help making the first pane true.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2 id=&#34;software-as-a-social-network&#34;&gt;Software as a social network&lt;/h2&gt;
&lt;p&gt;Now that you&amp;rsquo;ve buckled up, ready to engage in open-source software, you might wonder; &lt;strong&gt;where does that happen?&lt;/strong&gt; Ladies and gentlemen, let me introduce &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;GitHub&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Several years ago, when I was an undergrad student, I had to write a lot of stuff, such as reports, projects and theses. All of these documents went through several back and forths with supervisors, who made comments and modifications. But I was terribly afraid to remove some paragraph or sentence that I would need later on. As a result, I ended up in a hell in which my tormentors were named &lt;code&gt;project.docx&lt;/code&gt;, &lt;code&gt;project_v2.docx&lt;/code&gt;, &lt;code&gt;project_v3.docx&lt;/code&gt;, &lt;code&gt;project_v3_modifs.docx&lt;/code&gt;, &lt;code&gt;project_v3_final.docx&lt;/code&gt;, &lt;code&gt;project_v4_comments.docx&lt;/code&gt;, &lt;code&gt;project_v4_final.docx&lt;/code&gt;, &lt;code&gt;project_v4final2.docx&lt;/code&gt;, &lt;code&gt;project_v4_finalfinal.docx&lt;/code&gt;. And what if my computer broke &lt;sub&gt;&lt;sup&gt;(that was before Dropbox)&lt;/sup&gt;&lt;/sub&gt;? I could lose everything 😱&lt;/p&gt;
&lt;p&gt;This is when I heard about something called &lt;strong&gt;version control&lt;/strong&gt;. Apparently, there was a system out there that allowed you to save &lt;strong&gt;all&lt;/strong&gt; your modifications, and to go back to &lt;em&gt;any&lt;/em&gt; point in time. This system was called &lt;em&gt;git&lt;/em&gt;, and it was super obscure. However, I discovered that this system had an online interface, in the form of a website, on which you can go and upload files and documents for free. This is how &lt;strong&gt;I discovered GitHub&lt;/strong&gt;. And back in the day, it was really mostly used by programmers (because the nature of code makes it very suited for &lt;em&gt;version control&lt;/em&gt;), a world I didn&amp;rsquo;t belong to.&lt;/p&gt;
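&lt;p&gt;To make the idea concrete, here is a minimal sketch of that workflow on the command line (the folder and file names are made up for illustration):&lt;/p&gt;

```shell
# A minimal, hypothetical sketch of the git workflow described above.
# Each "commit" is a snapshot of your files that you can return to at any time.
rm -rf /tmp/git-demo && mkdir -p /tmp/git-demo && cd /tmp/git-demo
git init -q                                  # start tracking this folder
git config user.email "you@example.com"      # identity for this demo repo only
git config user.name "Your Name"
echo "First draft" > thesis.txt
git add thesis.txt
git commit -q -m "First draft"               # snapshot 1, recoverable forever
echo "Revised after supervisor comments" > thesis.txt
git add thesis.txt
git commit -q -m "Second version"            # snapshot 2
git log --oneline                            # lists every saved snapshot
```

&lt;p&gt;One file, a full history, and no more &lt;code&gt;project_v4_finalfinal.docx&lt;/code&gt; circus.&lt;/p&gt;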
&lt;div class=&#34;alert alert-note&#34;&gt;
  &lt;div&gt;
    There are several great alternatives to &lt;em&gt;GitHub&lt;/em&gt;, like &lt;em&gt;GitLab&lt;/em&gt;, &lt;em&gt;BitBucket&lt;/em&gt;, etc., that might be just as good, if not better. The reason why I&amp;rsquo;m mainly talking about &lt;em&gt;GitHub&lt;/em&gt; here is that this post is not about the intrinsic quality or features of these platforms &lt;em&gt;for developers&lt;/em&gt;, but about them as social networks. An important part of any social network is its popularity and - as of now - &lt;em&gt;GitHub&lt;/em&gt; is the most popular.
  &lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;I witnessed &lt;strong&gt;GitHub&lt;/strong&gt; growing and becoming a true &lt;em&gt;social network&lt;/em&gt;, improving its accessibility and user-friendliness. It is now more like a hub where all kinds of people gather to discuss software and technical bits than a den for hairy geeks. There are also many users who are new to programming (e.g. researchers who are using software as a means to an end), and if you belong to this category of people, don&amp;rsquo;t underestimate your contribution to developers! Often it is such users who help developers improve user friendliness and identify code bugs (for example, when running the code on actual data sets). It is also used as a public technical portfolio, in which you can display your achievements, your projects and your interests. And while it was originally centred around programming, it has extended its audience quite a bit, and people are now using GitHub to store data (for instance the COVID-19 data), books, create websites (for instance, this website is stored on GitHub) or write scientific papers!&lt;/p&gt;
&lt;p&gt;The reason why I&amp;rsquo;m writing this is because I know all too many young researchers, grappling with some software, struggling to find help, that are too shy to just contact the developers or the community. Just jump in there, create a public issue (instead of writing an email, so that your question will benefit future users). Most of the developers will be happy to help, and glad to see their software and code actually used by others.&lt;/p&gt;
&lt;p&gt;In conclusion, go and dive into the world of open-science and open-source software, you&amp;rsquo;ll be on the right side of history. 😎&lt;/p&gt;
&lt;h2 id=&#34;what-to-do-once-youre-on-github&#34;&gt;What to do once you&amp;rsquo;re on GitHub&lt;/h2&gt;
&lt;p&gt;At the very least, the very first step is to create an account. Even if you don&amp;rsquo;t use it now, it will be useful in the future (it shows that you are interested in technical stuff, and allows you to post issues, connect to other platforms, and support developers).&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Find&lt;/strong&gt; a package / software that you like. Super biased suggestions include:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/neuropsychology/NeuroKit&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;NeuroKit&lt;/em&gt;&lt;/a&gt;: a Python package for Neurophysiological Signal Processing&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/bayestestR&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;bayestestR&lt;/em&gt;&lt;/a&gt;: an R package for Bayesian statistics&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/correlation&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;correlation&lt;/em&gt;&lt;/a&gt;: an R package for correlations&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/report&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;report&lt;/em&gt;&lt;/a&gt;: an R package to report statistics&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/effectsize&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;effectsize&lt;/em&gt;&lt;/a&gt;: an R package for effect sizes&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/parameters&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;parameters&lt;/em&gt;&lt;/a&gt;: an R package for understanding statistical models&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/easystats/performance&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;performance&lt;/em&gt;&lt;/a&gt;: an R package for testing how good your model is&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&amp;ldquo;&lt;strong&gt;Watch it&lt;/strong&gt;&amp;rdquo; (the button in the top-right corner), so you&amp;rsquo;ll be notified of its activity&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Read&lt;/strong&gt; the README file (the &amp;ldquo;front page&amp;rdquo;), check-out the issues, understand how to navigate the repository&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Engage&lt;/strong&gt; with the developers, create an issue to report bugs or problems, or just to express support - developing a software takes time and effort, and is often done out of passion and for free. Words of encouragement are always appreciated 🤗&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Create&lt;/strong&gt; your own &lt;a href=&#34;https://guides.github.com/activities/hello-world/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;repository&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Make&lt;/strong&gt; your own &lt;a href=&#34;https://pages.github.com/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;website&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;And if you want to learn how to use &lt;strong&gt;GitHub&lt;/strong&gt; to make contributions, check-out our &lt;a href=&#34;https://neurokit2.readthedocs.io/en/latest/tutorials/contributing.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;tutorial&lt;/strong&gt;&lt;/a&gt;!&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Let me know if I forgot something by adding a comment below&lt;/em&gt; 👇&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>In Medio Stat Virtus: intermediate levels of mind wandering improve episodic memory encoding in a virtual environment</title>
      <link>https://realitybending.github.io/publication/blonde2020medio/</link>
      <pubDate>Fri, 22 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/blonde2020medio/</guid>
      <description></description>
    </item>
    
    <item>
      <title>R or Python for Psychologists</title>
      <link>https://realitybending.github.io/post/2020-05-22-r_or_python/</link>
      <pubDate>Fri, 22 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-22-r_or_python/</guid>
      <description>&lt;p&gt;Many psychology students or researchers are faced with the challenge - &lt;em&gt;or the opportunity&lt;/em&gt; - of learning a programming language. &lt;strong&gt;Which one should you learn?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;As an ex-psych student, and a daily user and developer of some of these languages, here&amp;rsquo;s my take on this hot debate.&lt;/p&gt;
&lt;h2 id=&#34;what-has-programming-to-do-with-psychology&#34;&gt;What has programming to do with psychology?&lt;/h2&gt;
&lt;p&gt;If you&amp;rsquo;re a very young psychology student, or a future one, you might wonder: &lt;strong&gt;why the heck would I have to learn programming in psychology?&lt;/strong&gt; &lt;em&gt;&amp;ldquo;Psychology is kinda like philosophy, it&amp;rsquo;s just learning how people&amp;rsquo;s minds work by reading books and overthinking stuff&amp;rdquo;&lt;/em&gt;. If you still think that, you&amp;rsquo;re in for &lt;strong&gt;one hell of a ride&lt;/strong&gt;!&lt;/p&gt;
&lt;p&gt;Psychology has been, since its very beginning, a rigorous experimental science. The founding fathers of psychology were dedicated to finding ways to objectively &lt;em&gt;measure&lt;/em&gt; psychological phenomena and to uncovering the mathematical laws that govern human behaviour (one of the fields of psychology is even called psycho&lt;em&gt;physics&lt;/em&gt;). True, this &lt;em&gt;sciency&lt;/em&gt; nature was toned down by the booming popularity of &lt;strong&gt;pseudo-scientific approaches like psychoanalysis&lt;/strong&gt; throughout the 20th century, which contributed to the stereotypical public image of the shrink doodling while listening to a neurotic patient. But that&amp;rsquo;s a distorted and old-fashioned view, not really representative of the future of psychology.&lt;/p&gt;
&lt;p&gt;The fact is that psychology is very closely connected with &lt;strong&gt;statistics&lt;/strong&gt;. Many great statistical advances were made by psychologists, and all true psychological discoveries are backed by statistical findings. And the importance of statistics is - and will be - growing further, partly due to the recent realization of major issues in the field caused by improper statistical procedures (coined the &amp;ldquo;&lt;a href=&#34;https://en.wikipedia.org/wiki/Replication_crisis&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;replicability crisis&lt;/strong&gt;&lt;/a&gt;&amp;rdquo;). Moreover, psychology relies more and more on advanced data-acquisition methods (smartphone apps, website behaviour data, online surveys, physiological and brain recording devices like EEG, MRI, etc.). And these new formats often require specialized knowledge (web-scraping, database querying, neuroimaging, signal processing, machine learning, &amp;hellip;). &lt;em&gt;And with great data-power comes great data-analysis-responsibility&lt;/em&gt;. Even in the most applied kind of &lt;strong&gt;clinical&lt;/strong&gt; or &lt;strong&gt;psychotherapeutic&lt;/strong&gt; specializations, where you&amp;rsquo;d think you&amp;rsquo;d be safe, practitioners are starting to use data-intensive methods like neuro-feedback, virtual reality, experience sampling, and other forms of smartphone sensing and surveying.&lt;/p&gt;
&lt;p&gt;Long story short, no matter which branch of psychology you specialize in, you &lt;em&gt;will&lt;/em&gt; be confronted with technical problems that you won&amp;rsquo;t be able to solve with &lt;em&gt;Excel&lt;/em&gt;. Moreover, these technical skills are the ones that will make the biggest difference between students, and they will matter a lot if you want to pursue research or go work in the private sector. The golden era when people were recruited into research based on their theoretical expertise alone is over: technical skills are now the golden ticket to enter - &lt;em&gt;and successfully leave&lt;/em&gt; - academia.&lt;/p&gt;
&lt;p&gt;So, &lt;strong&gt;ready to dive into programming?&lt;/strong&gt; Fear not! It&amp;rsquo;s not that complicated. Moreover, it&amp;rsquo;s &lt;strong&gt;one of the most rewarding skills&lt;/strong&gt; you can develop. I can assure you that you won&amp;rsquo;t regret the time invested in learning it 😊&lt;/p&gt;
&lt;p&gt;But where should you start?&lt;/p&gt;
&lt;h2 id=&#34;learn-both-r-and-python&#34;&gt;Learn both R and Python&lt;/h2&gt;
&lt;p&gt;This increasing relationship between psychology and statistics on the one hand, and other more general technical aspects on the other, is the reason why R and Python are so popular in psychology. Both languages are &lt;strong&gt;free&lt;/strong&gt;, &lt;strong&gt;open-source&lt;/strong&gt;, suited for &lt;strong&gt;beginners&lt;/strong&gt;, and have a large base of users with a ton of &lt;strong&gt;learning material&lt;/strong&gt; online. What&amp;rsquo;s the difference between them?&lt;/p&gt;
&lt;p&gt;Put simply, &lt;strong&gt;R is for statistics, Python is for the rest&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;So why is there a virulent debate going on, and a choice to make? It&amp;rsquo;s true that I, &lt;em&gt;in theory&lt;/em&gt;, would agree with some popular recommendations and suggest &lt;strong&gt;learning both&lt;/strong&gt;, as they are complementary and have their own strengths and weaknesses. I myself use both on a daily basis, so why wouldn&amp;rsquo;t I preach what I practice?&lt;/p&gt;
&lt;p&gt;That said, many opinionated people arguing in favour of one &lt;strong&gt;or&lt;/strong&gt; the other (usually the only one they know&amp;hellip;) will say that learning both is essentially a waste of time. They will put forth a strong argument: &lt;strong&gt;whatever you can do in R, you can do in Python, and &lt;em&gt;vice versa&lt;/em&gt;&lt;/strong&gt;. In other words, both languages can be used to achieve your goals, so it&amp;rsquo;s &lt;strong&gt;better to specialize in one than to have a limited knowledge of both&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Although I do not agree with that statement, I do acknowledge that people have limited time and resources to learn. Saying &lt;strong&gt;&amp;ldquo;just learn both&amp;rdquo;&lt;/strong&gt; is easy, but it is arguably an unrealistic expectation for the vast majority of people. So while learning both can be a long-term goal (especially if you want to do research), you have to start somewhere. So, &lt;strong&gt;what starter language should you select?&lt;/strong&gt;&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;pokemon.png&#34; alt=&#34;r or python&#34;/&gt;
  &lt;figcaption&gt;Ash choosing his starter programming language. He has the choice between R, Python and Bulbasaur, i.e., Matlab - the one that no one likes.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2 id=&#34;what-about-matlab&#34;&gt;What about Matlab?&lt;/h2&gt;
&lt;p&gt;There was a time when &lt;em&gt;Matlab&lt;/em&gt; was the boss. It was used everywhere and had the best functionalities for neuroimaging, signal processing and maths. But &lt;strong&gt;that time is over&lt;/strong&gt;. Matlab is already a zombie language, whose burial process will continue in the years to come. Why is it dead? Because it is &lt;strong&gt;expensive&lt;/strong&gt;, &lt;strong&gt;closed&lt;/strong&gt;, &lt;strong&gt;ugly&lt;/strong&gt;, and most importantly because the alternatives (namely R and Python) are now more powerful and better-featured than Matlab.&lt;/p&gt;
&lt;figure&gt;
  &lt;img src=&#34;https://media.giphy.com/media/sDOhzJBsFvjMY/giphy.gif&#34; alt=&#34;matlab&#34;/&gt;
  &lt;figcaption&gt;Agamemnon reacting to king Priam saying &#34;The city of Matlab will never be conquered&#34;.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;The truth is, there are only two reasons people still use Matlab: &lt;strong&gt;habit&lt;/strong&gt; (it&amp;rsquo;s hard to learn a new approach if your old way of doing things still works) and &lt;strong&gt;SPM&lt;/strong&gt; (a toolbox for neuroimaging that is still - &lt;em&gt;for now&lt;/em&gt; - the leader in the field).&lt;/p&gt;
&lt;p&gt;But seriously, don&amp;rsquo;t waste time on it if you have limited resources, it&amp;rsquo;s just not worth it. You will learn an outdated tool that you won&amp;rsquo;t be able to use in another lab if they don&amp;rsquo;t agree to pay for an expensive license (unless you&amp;rsquo;re a pirate ☠️). Whereas with open and free languages like R or Python, you have access to the best tools and can use them freely everywhere. Also, it makes you a &lt;strong&gt;supporter of open-science&lt;/strong&gt;, which is good 😁.&lt;/p&gt;
&lt;h2 id=&#34;how-to-decide-between-r-and-python&#34;&gt;How to decide between R and Python&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Time has come to make a decision.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Despite what people say, &lt;strong&gt;R and Python are not equivalent&lt;/strong&gt;. You can argue as much as you want, but doing statistics and data visualization in Python is not as fast, easy and neat as it is in R. And signal processing or neuroimaging is not as powerful in R as it is in Python. Note that both languages are still growing and changing, and they are influencing each other: for instance, many popular Python modules (e.g., &lt;strong&gt;pandas&lt;/strong&gt;, &lt;strong&gt;statsmodels&lt;/strong&gt;, &lt;strong&gt;seaborn&lt;/strong&gt;, &amp;hellip;) have been directly inspired by R. As such, the boundaries between the two languages are fading (and I&amp;rsquo;m not even mentioning the great advances in interoperability, with tools like &lt;a href=&#34;https://rstudio.github.io/reticulate/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;reticulate&lt;/strong&gt;&lt;/a&gt; that allow you to use one language directly inside the other).&lt;/p&gt;
&lt;p&gt;That being said, Python and R remain very different languages at their core, each with a different feel and vibe. R was made by statisticians for statistics, and the majority of its users are academics and researchers. In contrast, Python is a true general-purpose &amp;ldquo;programming&amp;rdquo; language, widely used outside of academia, in the private sector.&lt;/p&gt;
&lt;p&gt;Here are some things to consider when deciding on what language to learn:&lt;/p&gt;
&lt;h3 id=&#34;reasons-to-choose-python&#34;&gt;Reasons to choose Python&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You have some basic knowledge or familiarity with programming&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;For instance, you know what a &lt;em&gt;logical loop&lt;/em&gt; is. Since Python is a true programming language, any prior experience will be useful.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are good with logic and spatial representation (like imagining shapes in 3D, rotating them, etc.)&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In Python, you will have to think with a &amp;ldquo;programming&amp;rdquo; mindset. That means perceiving things in terms of logical statements and blocks, understanding data as 2D or 3D tables that you have to slice and recombine.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are comfortable with maths&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In Python, numbers and numerical operations are used a lot. Perhaps paradoxically, you will typically see a lot more maths in Python than in R.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You plan to do signal processing or experimental task creation&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These are some of the domains where Python is well-established (which doesn&amp;rsquo;t mean that R doesn&amp;rsquo;t have some great tools in development).&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are good at googling and don&amp;rsquo;t mind spending time looking for the right answer&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Python has so much material online that it&amp;rsquo;s sometimes hard to find the appropriate thing. Harder than in R, in my opinion, which has more well-defined &amp;ldquo;gold-standard&amp;rdquo; textbooks and tutorials.&lt;/p&gt;
&lt;h3 id=&#34;reasons-to-choose-r&#34;&gt;Reasons to choose R&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You have no experience with programming whatsoever&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;R is not meant to be used as a traditional &lt;em&gt;programming&lt;/em&gt; language. It&amp;rsquo;s more about finding which functions to apply to which data, and that makes it easy for beginners.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are interested in statistical analyses, modelling things, and making inferences from data&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;R excels at this. You can build powerful models super easily and jump straight into understanding and interpreting them.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You like making nice figures and plots&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;R, through the &lt;a href=&#34;https://ggplot2.tidyverse.org/index.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;ggplot&lt;/strong&gt;&lt;/a&gt; ecosystem, has hands down the best tools for visualization. Your imagination is the limit, and you can even create art (check out the artworks by &lt;a href=&#34;https://www.data-imaginist.com/art&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Thomas Lin Pedersen&lt;/a&gt; and &lt;a href=&#34;https://art.djnavarro.net/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Danielle Navarro&lt;/a&gt; 😍).&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are &lt;em&gt;not&lt;/em&gt; so good with stats or maths&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You heard it right! To start with R you don&amp;rsquo;t need to know stats or maths like a boss. R, in fact, will help you to become proficient at it, by slowly opening more and more layers of complexity to you, if you are deemed worthy.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You are interested in joining the academic community&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Because most of its users are academics, R has a fantastic community online, for instance on &lt;a href=&#34;https://x.com/hashtag/rstats&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Twitter #rstats&lt;/strong&gt;&lt;/a&gt;. It&amp;rsquo;s also super inclusive (e.g., the &lt;a href=&#34;https://rladies.org/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;R-Ladies&lt;/a&gt;).&lt;/p&gt;
&lt;h3 id=&#34;other-considerations&#34;&gt;Other considerations&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;What your peers are learning&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It&amp;rsquo;s easier to learn together, so try to discuss it with your class or lab mates if you can.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;What your lab is using&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It might be easier if you have mentors who can understand what you are doing and guide you. But that should not be a priority, as it can lead to &lt;a href=&#34;https://en.wikipedia.org/wiki/Cargo_cult&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;cargo cult&lt;/a&gt;-like reproduction of old habits (especially if your lab has a tradition of Matlab 🤭). Instead of submitting to the tradition, assess what the goals and objectives are, and pick the best tool to achieve them. And if you have any issue convincing your boss / supervisor about it, ask for help on Twitter; I bet you&amp;rsquo;ll get a lot of it.&lt;/p&gt;
&lt;h2 id=&#34;see-also&#34;&gt;See Also&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/matloff/R-vs.-Python-for-Data-Science&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;R vs. Python for Data Science&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;hands-on&#34;&gt;Hands on!&lt;/h2&gt;
&lt;p&gt;👉 Looking for places to start? Check out this &lt;a href=&#34;https://neurokit2.readthedocs.io/en/latest/tutorials/learnpython.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;10-min crash course introduction to Python&lt;/strong&gt;&lt;/a&gt; and this &lt;a href=&#34;https://easystats.github.io/blog/resources/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;collection of resources for R&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading!&lt;/em&gt; 🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; (&lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;) &lt;em&gt;and leave a comment below&lt;/em&gt; 👇&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>How to correctly analyze reaction time (RT) data</title>
      <link>https://realitybending.github.io/post/2020-05-18-analyze_rt/</link>
      <pubDate>Mon, 18 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-18-analyze_rt/</guid>
      <description>&lt;p&gt;This is a very, very important topic given the widespread usage of reaction times in psychology. Most of the time, we analyze it as a regular variable, using traditional models such as &lt;em&gt;linear models&lt;/em&gt;, &lt;em&gt;ANOVAs&lt;/em&gt; etc. The problem is that these models &lt;strong&gt;assume that the RT is normally distributed, which is false&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;This leads us to adjustments like &lt;strong&gt;outlier removal&lt;/strong&gt; or &lt;strong&gt;log-transformation&lt;/strong&gt;, distorting the data to fit our inappropriate models.&lt;/p&gt;
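&lt;p&gt;To see the problem concretely, here is a small &lt;em&gt;toy&lt;/em&gt; simulation in pure Python (the distribution parameters are made up for illustration, not taken from any real dataset) showing the right skew typical of RT data:&lt;/p&gt;

```python
import math
import random
import statistics

# Toy illustration: reaction times are typically right-skewed,
# which a log-normal distribution mimics well.
random.seed(42)
rts = [random.lognormvariate(mu=-0.5, sigma=0.4) for _ in range(10_000)]

# In a right-skewed sample, the mean is pulled above the median...
mean_rt = statistics.mean(rts)
median_rt = statistics.median(rts)
assert mean_rt > median_rt

# ...which is one symptom of the non-normality that linear models ignore.
# After log-transformation the distribution is roughly symmetric again:
log_rts = [math.log(rt) for rt in rts]
print(round(statistics.mean(log_rts) - statistics.median(log_rts), 3))
```

&lt;p&gt;Log-transforming &amp;ldquo;fixes&amp;rdquo; the symptom, but at the cost of modelling a distorted version of the data - hence the appeal of models that handle the skew directly.&lt;/p&gt;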
&lt;p&gt;The good news is that it&amp;rsquo;s very easy to use better models that account for the non-normal distribution of RTs. These alternatives are beautifully presented by &lt;a href=&#34;https://vbn.aau.dk/da/persons/117060&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Jonas K. Lindeløv&lt;/a&gt; in the guide below:&lt;/p&gt;
&lt;p&gt;👉 &lt;a href=&#34;https://lindeloev.github.io/shiny-rt/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Reaction time distributions: an interactive overview&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;p&gt;It is a must-read for all psychologists. Do check it out!&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;And don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>One Python code line for a Mandelbrot fractal</title>
      <link>https://realitybending.github.io/post/2020-05-16-python_mandelbrot/</link>
      <pubDate>Sat, 16 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-16-python_mandelbrot/</guid>
      <description>&lt;h2 id=&#34;mandelbrot-set&#34;&gt;Mandelbrot Set&lt;/h2&gt;
&lt;p&gt;I wrote a small Python function to easily generate and plot a &lt;a href=&#34;https://en.wikipedia.org/wiki/Mandelbrot_set&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Mandelbrot set&lt;/a&gt;. This function is now available through the &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit#quick-example&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;NeuroKit2 package&lt;/strong&gt;&lt;/a&gt;, and can be used as follows:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;neurokit2&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;as&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;nk&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;fractal_mandelbrot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;show&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;kc&#34;&gt;True&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;The Mandelbrot set is defined between &lt;code&gt;-2&lt;/code&gt; and &lt;code&gt;2&lt;/code&gt; on the &lt;em&gt;x&lt;/em&gt; (real) and &lt;em&gt;y&lt;/em&gt; (imaginary) axes. The image can therefore be cropped accordingly by changing the coordinates. Moreover, the colors can be tweaked by changing the colormap (&lt;code&gt;cmap&lt;/code&gt;).&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;m&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;fractal_mandelbrot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;real_range&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mf&#34;&gt;0.75&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;imaginary_range&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;1.25&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mf&#34;&gt;1.25&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;))&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;imshow&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;m&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;T&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;cmap&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;viridis&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;axis&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;off&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;show&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;h2 id=&#34;buddhabrot-set&#34;&gt;Buddhabrot Set&lt;/h2&gt;
&lt;p&gt;It is also possible to generate a &lt;a href=&#34;https://en.wikipedia.org/wiki/Buddhabrot&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Buddhabrot&lt;/strong&gt;&lt;/a&gt;:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;b&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;fractal_mandelbrot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;size&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;1500&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                          &lt;span class=&#34;n&#34;&gt;real_range&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mf&#34;&gt;0.75&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;imaginary_range&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;1.25&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;mf&#34;&gt;1.25&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;),&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                          &lt;span class=&#34;n&#34;&gt;buddha&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;kc&#34;&gt;True&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;iterations&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;imshow&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;b&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;T&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;cmap&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;gray&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;axis&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;off&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;plt&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;show&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;blockquote class=&#34;twitter-tweet&#34;&gt;&lt;p lang=&#34;en&#34; dir=&#34;ltr&#34;&gt;Added the option to return a so-called &amp;#39;Buddhabrot&amp;#39;🧘 Amazing to see these shapes emerging from such a simple formula 🤯 &lt;a href=&#34;https://twitter.com/hashtag/fractalart?src=hash&amp;amp;ref_src=twsrc%5Etfw&#34;&gt;#fractalart&lt;/a&gt; &lt;a href=&#34;https://t.co/7nzxsvQa6R&#34;&gt;pic.twitter.com/7nzxsvQa6R&lt;/a&gt;&lt;/p&gt;&amp;mdash; Dominique Makowski 🧙 (@Dom_Makowski) &lt;a href=&#34;https://twitter.com/Dom_Makowski/status/1258376273451053056?ref_src=twsrc%5Etfw&#34;&gt;May 7, 2020&lt;/a&gt;&lt;/blockquote&gt;
&lt;script async src=&#34;https://platform.twitter.com/widgets.js&#34; charset=&#34;utf-8&#34;&gt;&lt;/script&gt;
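&lt;p&gt;For the curious, the escape-time iteration at the heart of these images can be sketched in a few lines of pure Python (a toy illustration of the underlying maths - the helper name &lt;code&gt;mandelbrot_iterations&lt;/code&gt; is mine, and this is not how &lt;code&gt;fractal_mandelbrot()&lt;/code&gt; is actually implemented):&lt;/p&gt;

```python
# Toy escape-time sketch of the iteration behind the Mandelbrot set
# (illustrative only; NeuroKit's fractal_mandelbrot is vectorized).
def mandelbrot_iterations(c: complex, max_iter: int = 100) -> int:
    """Return how many iterations z = z**2 + c stays bounded (|z| <= 2)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Points inside the set never escape; points outside escape quickly.
print(mandelbrot_iterations(0 + 0j))  # → 100 (never escapes)
print(mandelbrot_iterations(1 + 1j))  # → 1 (escapes almost immediately)
```

&lt;p&gt;Colouring each pixel by its escape count is what produces the familiar fractal images.&lt;/p&gt;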


&lt;p&gt;Although the NeuroKit Python package is primarily devoted to physiological signal processing, it also includes tons of other useful features.&lt;/p&gt;
&lt;p&gt;👉 &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit#quick-example&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Discover more about NeuroKit here&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;p&gt;Have fun!&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>In defence of the 95% CI</title>
      <link>https://realitybending.github.io/post/2020-05-15-defence_ci95/</link>
      <pubDate>Fri, 15 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-15-defence_ci95/</guid>
      <description>&lt;p&gt;&lt;strong&gt;TLDR;&lt;/strong&gt; &lt;a href=&#34;https://github.com/easystats/bayestestR&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;BayestestR&lt;/strong&gt;&lt;/a&gt; &lt;strong&gt;currently uses a 89% threshold by default for Credible Intervals (CI). Should we change that? If so, by what?&lt;/strong&gt; &lt;a href=&#34;https://github.com/easystats/bayestestR/issues/250&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;em&gt;&lt;strong&gt;Join the discussion here.&lt;/strong&gt;&lt;/em&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Magical numbers, or conventional thresholds, have bad press in statistics, and there are many of them. For instance, &lt;strong&gt;.05&lt;/strong&gt; (for the &lt;em&gt;p&lt;/em&gt;-value), or the &lt;strong&gt;95%&lt;/strong&gt; range for the &lt;strong&gt;Confidence Interval&lt;/strong&gt; (CI). Indeed, why 95 and not 94 or 90?&lt;/p&gt;
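&lt;p&gt;To make the arbitrariness tangible, here is a quick sketch (with a made-up Normal posterior - the numbers are purely illustrative, and &lt;code&gt;equal_tailed_interval&lt;/code&gt; is a hypothetical helper) of how the reported interval shifts with the chosen threshold:&lt;/p&gt;

```python
from statistics import NormalDist

# Hypothetical posterior approximated as Normal(mean=0.3, sd=0.1);
# these numbers are made up purely to illustrate the thresholds.
posterior = NormalDist(mu=0.3, sigma=0.1)

def equal_tailed_interval(dist: NormalDist, level: float) -> tuple[float, float]:
    """Equal-tailed interval containing `level` of the distribution's mass."""
    tail = (1 - level) / 2
    return dist.inv_cdf(tail), dist.inv_cdf(1 - tail)

# The interval (and hence what gets reported) changes with the threshold:
for level in (0.89, 0.90, 0.94, 0.95):
    lo, hi = equal_tailed_interval(posterior, level)
    print(f"{level:.0%} interval: [{lo:.3f}, {hi:.3f}] (width {hi - lo:.3f})")
```

&lt;p&gt;Nothing about the posterior itself changes between 89% and 95% - only the slice of it we choose to summarize.&lt;/p&gt;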
&lt;p&gt;👉 &lt;a href=&#34;https://easystats.github.io/blog/posts/bayestestr_95/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Read my complete post on the easystats&amp;rsquo; blog&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Introduction to Bayesian statistics with R</title>
      <link>https://realitybending.github.io/post/2020-05-14-intro_bayestestr/</link>
      <pubDate>Thu, 14 May 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2020-05-14-intro_bayestestr/</guid>
      <description>&lt;p&gt;You are a student or a researcher interested in Bayesian statistics and R? But all the tutorials and courses that you have found are too intimidating?&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Fear no more!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;With the &lt;a href=&#34;https://github.com/easystats/easystats&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;easystats team&lt;/a&gt;, we have created a very &lt;strong&gt;gentle&lt;/strong&gt; and &lt;strong&gt;introductory&lt;/strong&gt; course for beginners.&lt;/p&gt;
&lt;p&gt;You can find the link here:&lt;/p&gt;
&lt;p&gt;👉 &lt;a href=&#34;https://easystats.github.io/bayestestR/articles/bayestestR.html&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Get Started with Bayesian Statistics using R&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>The impact of state and dispositional mindfulness on prospective memory: A virtual reality study</title>
      <link>https://realitybending.github.io/publication/girardeau2020impact/</link>
      <pubDate>Wed, 01 Apr 2020 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/girardeau2020impact/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Indices of Effect Existence and Significance in the Bayesian Framework</title>
      <link>https://realitybending.github.io/publication/makowski2019indices/</link>
      <pubDate>Sun, 01 Dec 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019indices/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Adaptation and Validation of a Short French Version of the Affective Style Questionnaire</title>
      <link>https://realitybending.github.io/publication/makowski2019adaptation/</link>
      <pubDate>Fri, 01 Nov 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019adaptation/</guid>
      <description></description>
    </item>
    
    <item>
      <title>bayestestR: Describing effects and their uncertainty, existence and significance within the Bayesian framework</title>
      <link>https://realitybending.github.io/publication/makowski2019bayestestr/</link>
      <pubDate>Sun, 01 Sep 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019bayestestr/</guid>
      <description></description>
    </item>
    
    <item>
      <title>insight: A Unified Interface to Access Information from Model Objects in R</title>
      <link>https://realitybending.github.io/publication/ludecke2019insight/</link>
      <pubDate>Sun, 01 Sep 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/ludecke2019insight/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The heart of cognitive control: Cardiac phase modulates processing speed and inhibition</title>
      <link>https://realitybending.github.io/publication/makowski2019heart/</link>
      <pubDate>Sun, 01 Sep 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019heart/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Generate an artificial ECG signal in Python</title>
      <link>https://realitybending.github.io/post/2019-05-17-simulate_ecg/</link>
      <pubDate>Fri, 17 May 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/post/2019-05-17-simulate_ecg/</guid>
      <description>&lt;h1 id=&#34;create-a-natural-ecg-signal&#34;&gt;Create a natural ECG signal&lt;/h1&gt;
&lt;p&gt;Generating artificial physiological signals can be very useful for building and testing your analysis pipeline, or for developing and validating a new algorithm.&lt;/p&gt;
&lt;p&gt;Generating a synthetic, yet realistic, ECG signal in Python can be easily achieved with the &lt;code&gt;ecg_simulate()&lt;/code&gt; function available in the &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit#quick-example&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;NeuroKit2&lt;/strong&gt;&lt;/a&gt; package.&lt;/p&gt;
&lt;p&gt;In the example below, we will generate &lt;strong&gt;8&lt;/strong&gt; seconds of ECG sampled at &lt;strong&gt;200 Hz&lt;/strong&gt; (i.e., 200 points per second), so the signal will be &lt;code&gt;8 * 200 = 1600&lt;/code&gt; data points long. We can also specify the average heart rate, although note that there will be some natural variability (which is a good thing, as it makes the signal more realistic).&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;neurokit2&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;as&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;nk&lt;/span&gt;  &lt;span class=&#34;c1&#34;&gt;# Load the package&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;simulated_ecg&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ecg_simulate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;duration&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;8&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sampling_rate&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;heart_rate&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;80&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;signal_plot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;simulated_ecg&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sampling_rate&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;  &lt;span class=&#34;c1&#34;&gt;# Visualize the signal&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;
&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;png&#34; srcset=&#34;
               /post/2019-05-17-simulate_ecg/output_1_0_hu_219bc7b81dd34b7e.webp 400w,
               /post/2019-05-17-simulate_ecg/output_1_0_hu_ee3fb9450040338e.webp 760w,
               /post/2019-05-17-simulate_ecg/output_1_0_hu_d954d6904ebd15e7.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2019-05-17-simulate_ecg/output_1_0_hu_219bc7b81dd34b7e.webp&#34;
               width=&#34;760&#34;
               height=&#34;389&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
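&lt;p&gt;To make the duration/sampling-rate arithmetic explicit, here is a minimal sketch (using only NumPy, independent of NeuroKit) of how the number of samples and the time axis relate; the variable names are purely illustrative:&lt;/p&gt;

```python
import numpy as np

duration = 8          # seconds
sampling_rate = 200   # Hz (points per second)

n_samples = duration * sampling_rate         # 8 * 200 = 1600 data points
time = np.arange(n_samples) / sampling_rate  # time axis in seconds

print(n_samples)   # 1600
print(time[-1])    # 7.995 (last sample falls just before the 8 s mark)
```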
&lt;p&gt;The simulation is based on the &lt;strong&gt;ECGSYN&lt;/strong&gt; algorithm (McSharry et al., 2003).&lt;/p&gt;
&lt;p&gt;However, for faster and more stable results (the realistic algorithm naturally introduces some variability), one can approximate the QRS complex with a &lt;strong&gt;Daubechies&lt;/strong&gt; wavelet. An ECG based on this method can also be obtained in &lt;strong&gt;NeuroKit&lt;/strong&gt; by changing the &lt;code&gt;method&lt;/code&gt; argument as follows:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;simulated_ecg&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ecg_simulate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;duration&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;8&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sampling_rate&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;method&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;daubechies&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;nk&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;signal_plot&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;simulated_ecg&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sampling_rate&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;200&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;
&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;png&#34; srcset=&#34;
               /post/2019-05-17-simulate_ecg/output_2_0_hu_5c62260edbb4cfd2.webp 400w,
               /post/2019-05-17-simulate_ecg/output_2_0_hu_7ab7f8221512690a.webp 760w,
               /post/2019-05-17-simulate_ecg/output_2_0_hu_e83b4e127422bc77.webp 1200w&#34;
               src=&#34;https://realitybending.github.io/post/2019-05-17-simulate_ecg/output_2_0_hu_5c62260edbb4cfd2.webp&#34;
               width=&#34;760&#34;
               height=&#34;393&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;While faster and more stable, this method produces an ECG that is far less realistic.&lt;/p&gt;
&lt;p&gt;👉 &lt;a href=&#34;https://github.com/neuropsychology/NeuroKit#quick-example&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;strong&gt;Discover more about NeuroKit here&lt;/strong&gt;&lt;/a&gt; 👈&lt;/p&gt;
&lt;p&gt;Have fun!&lt;/p&gt;
&lt;h1 id=&#34;references&#34;&gt;References&lt;/h1&gt;
&lt;p&gt;McSharry, P. E., Clifford, G. D., Tarassenko, L., &amp;amp; Smith, L. A. (2003). A dynamical model for generating synthetic electrocardiogram signals. IEEE transactions on biomedical engineering, 50(3), 289-294.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Thanks for reading! Do not hesitate to tweet and share this post, and leave a comment below&lt;/em&gt; &amp;#x1f917;&lt;/p&gt;
&lt;p&gt;🐦 &lt;em&gt;Don&amp;rsquo;t forget to join me on X&lt;/em&gt; &lt;a href=&#34;https://x.com/Dom_Makowski&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;@Dom_Makowski&lt;/a&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Dispositional mindfulness attenuates the emotional attentional blink</title>
      <link>https://realitybending.github.io/publication/makowski2019dispositional/</link>
      <pubDate>Fri, 01 Feb 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019dispositional/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Phenomenal, bodily and brain correlates of fictional reappraisal as an implicit emotion regulation strategy</title>
      <link>https://realitybending.github.io/publication/makowski2019phenomenal/</link>
      <pubDate>Tue, 01 Jan 2019 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2019phenomenal/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The psycho Package: an Efficient and Publishing-Oriented Workflow for Psychological Science</title>
      <link>https://realitybending.github.io/publication/makowski2018psycho/</link>
      <pubDate>Sat, 01 Sep 2018 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2018psycho/</guid>
      <description></description>
    </item>
    
    <item>
      <title>&#34;Being there&#34; and remembering it: Presence improves memory encoding</title>
      <link>https://realitybending.github.io/publication/makowski2017being/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2017being/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Centenaire Ribot (première partie). La réception de l&#39;oeuvre de Théodule Ribot publiée chez l’éditeur Ladrange (1870-1873)</title>
      <link>https://realitybending.github.io/publication/nicolas2017centenaire/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/nicolas2017centenaire/</guid>
      <description></description>
    </item>
    
    <item>
      <title>How virtual embodiment affects episodic memory functioning: a proof-of-concept study</title>
      <link>https://realitybending.github.io/publication/tuena2017virtual/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/tuena2017virtual/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Interaction between attentional systems and episodic memory encoding: the impact of conflict on binding of information</title>
      <link>https://realitybending.github.io/publication/sperduti2017interaction/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/sperduti2017interaction/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Meditation and successful aging: can meditative practices counteract age-related cognitive decline?</title>
      <link>https://realitybending.github.io/publication/sperduti2017meditation/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/sperduti2017meditation/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Neuropsydia.py: A Python Module for Creating Experiments, Tasks and Questionnaires</title>
      <link>https://realitybending.github.io/publication/makowski2017neuropsydia/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2017neuropsydia/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The distinctive role of executive functions in implicit emotion regulation</title>
      <link>https://realitybending.github.io/publication/sperduti2017distinctive/</link>
      <pubDate>Fri, 01 Sep 2017 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/sperduti2017distinctive/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Can mental fatigue be measured by Weber&#39;s compass? Alfred Binet&#39;s answer on the value of aesthesiometry (tactile sensitivity) as an objective measure of mental fatigue</title>
      <link>https://realitybending.github.io/publication/nicolas2016can/</link>
      <pubDate>Thu, 01 Sep 2016 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/nicolas2016can/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The paradox of fiction: Emotional response toward fiction and the modulatory role of self-relevance</title>
      <link>https://realitybending.github.io/publication/sperduti2016paradox/</link>
      <pubDate>Thu, 01 Sep 2016 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/sperduti2016paradox/</guid>
      <description></description>
    </item>
    
    <item>
      <title>The protective role of long-term meditation on the decline of the executive component of attention in aging: a preliminary cross-sectional study</title>
      <link>https://realitybending.github.io/publication/sperduti2016protective/</link>
      <pubDate>Thu, 01 Sep 2016 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/sperduti2016protective/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Emotion regulation and the cognitive decline in aging: beyond the paradox</title>
      <link>https://realitybending.github.io/publication/makowski2015emotion/</link>
      <pubDate>Tue, 01 Sep 2015 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/publication/makowski2015emotion/</guid>
      <description></description>
    </item>
    
    <item>
      <title></title>
      <link>https://realitybending.github.io/people/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/people/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Research Themes</title>
      <link>https://realitybending.github.io/research/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://realitybending.github.io/research/</guid>
      <description></description>
    </item>
    
  </channel>
</rss>
