<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[PauseAI ]]></title><description><![CDATA[Official PauseAI.info newsletter]]></description><link>https://pauseai.substack.com</link><image><url>https://substackcdn.com/image/fetch/$s_!9Lb5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F181acceb-2f60-4db6-8a5c-468119de7d2b_654x654.png</url><title>PauseAI </title><link>https://pauseai.substack.com</link></image><generator>Substack</generator><lastBuildDate>Mon, 06 Apr 2026 05:55:33 GMT</lastBuildDate><atom:link href="https://pauseai.substack.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[PauseAI]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[pauseai@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[pauseai@substack.com]]></itunes:email><itunes:name><![CDATA[PauseAI]]></itunes:name></itunes:owner><itunes:author><![CDATA[PauseAI]]></itunes:author><googleplay:owner><![CDATA[pauseai@substack.com]]></googleplay:owner><googleplay:email><![CDATA[pauseai@substack.com]]></googleplay:email><googleplay:author><![CDATA[PauseAI]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[AI caught cheating on tests and mining crypto]]></title><description><![CDATA[We cannot control the uncontrollable]]></description><link>https://pauseai.substack.com/p/ai-caught-cheating-on-tests-and-mining</link><guid isPermaLink="false">https://pauseai.substack.com/p/ai-caught-cheating-on-tests-and-mining</guid><dc:creator><![CDATA[Jonathan Moody]]></dc:creator><pubDate>Wed, 11 Mar 2026 11:37:05 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!8Wau!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Last week, the security team at Alibaba &#8211; a Chinese multinational technology conglomerate specialising in e-commerce, retail, internet services and technology &#8211; was alerted to an incident. At 3am, the team noticed unexpected activity on its training servers and thought the systems might have been hacked.</p><p>&#8220;We initially treated this as a conventional security incident,&#8221; the researchers said.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">PauseAI  is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>They found that the systems were being used to mine cryptocurrency. At 3am. And it was the AI that was doing it. No one knows why.</p><p>Rather than work on the training exercises it had been instructed to complete, the AI system, known as ROME, broke free of its parameters during routine training to carry out rogue operations. This means it disregarded the limits placed on it. 
In other words, the engineers lost control of the AI.</p><p><strong>You cannot control the uncontrollable</strong></p><p>The Alibaba AI team said that these actions were not intentionally programmed. Instead, they emerged during the learning stage as the agent explored different ways to interact with its environment.</p><p>And herein lies the problem with AI: these systems are trained, not programmed. In the book &#8216;If Anyone Builds It, Everyone Dies&#8217;, authors and AI experts Eliezer Yudkowsky and Nate Soares describe the AI development process as one of &#8216;growth&#8217;.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8Wau!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8Wau!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 424w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 848w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 1272w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!8Wau!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp" width="960" height="540" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:540,&quot;width&quot;:960,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:47690,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/190606008?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8Wau!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 424w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 848w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 1272w, https://substackcdn.com/image/fetch/$s_!8Wau!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d0ad019-48a6-41a0-9a25-2c079337bc87_960x540.webp 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>You can train and nudge an AI all you like but as it develops &#8211; or grows &#8211; it curates its own preferences and wants, which influence its behaviour. Crucially, often an AI doesn&#8217;t want what humans want.</p><p>This is far from the first example of AI systems pursuing nefarious ends. 
ChatGPT and similar AIs have been <a href="https://www.theguardian.com/technology/2025/oct/24/sycophantic-ai-chatbots-tell-users-what-they-want-to-hear-study-shows">accused of sycophancy</a> &#8211; telling users what they want to hear, which may &#8220;distort people&#8217;s judgments of themselves, their relationships, and the world around them,&#8221; according to <a href="https://arxiv.org/abs/2510.01395">research</a>. It has even been claimed that this behaviour led teenagers to <a href="https://www.bbc.co.uk/news/articles/cgerwp7rdlvo">take their own lives</a>. Last year, researchers at <a href="https://www.independent.co.uk/topic/anthropic">Anthropic</a> revealed that the company&#8217;s frontier model Claude Opus 4 had resorted to blackmail to avoid being shut down.</p><p>How can we trust systems that do not want what we want? How do we know they will behave in our best interests? The honest answer is that we don&#8217;t.</p><p><strong>AI knows when it is being tested and it has learned to cheat</strong></p><p>When recently evaluating its newest AI model, Claude Opus 4.6, Anthropic set it the task of finding hard-to-locate information online. Claude stopped searching for the answer and started philosophising about the question. <a href="https://www.anthropic.com/engineering/eval-awareness-browsecomp">According to Anthropic</a>, it figured out that it was being tested and, rather than reason its way towards an answer, it searched online for the benchmark and &#8220;decrypted the answer key&#8221; to find the answers. In other words, it cheated.</p><p>Anthropic has said, &#8220;This raises concerns about the lengths a model might go to in order to accomplish a task.&#8221;</p><p>Not only does this demonstrate the level of intelligence and autonomy of the newest Claude model but it also shows, quite clearly, that humans &#8211; AI experts and engineers &#8211; cannot control the AI systems that they have created. 
In this case, even if humans and the AI agree on the end goal, they are not aligned on the process.</p><p><strong>The pace of development multiplies the risk</strong></p><p>The capacity of AI doubles every seven months &#8211; and this is speeding up. The consequences cannot be predicted.</p><p>What we do know for sure is that we cannot guarantee that AIs will want what we want &#8211; we cannot even guarantee that now. Nor will we be able to ensure that AIs don&#8217;t cause harm in pursuit of their goals: harms to individuals, harms to the environment, harms to humanity. The risk &#8211; potential extinction &#8211; is not worth any reward.</p><p>AI companies themselves admit that they cannot guarantee that their systems are safe, nor is there any law requiring them to do so.</p><p><a href="https://www.youtube.com/watch?v=P7Y-fynYsgE">Professor Stuart Russell</a>, one of the world&#8217;s leading authorities on AI safety, has said, &#8220;We should require that AI systems are safe and if developers are unable to build safe AI systems then that requirement would turn into a pause.&#8221;</p><p>&#8220;It might be that they are never able to provide the necessary safety assurances,&#8221; he said.</p><p>This matters because AI experts &#8211; researchers, engineers and CEOs &#8211; believe that the chance that AI will kill us all is somewhere between <a href="https://aiimpacts.org/2022-expert-survey-on-progress-in-ai/">10 percent and 50 percent</a>.</p><p><em><a href="https://pauseai.substack.com/p/eu-parliamentarians-acknowledge-the">Read more about AI risk here</a></em></p><p><em>Edit: The authors replaced AI &#8216;hacking&#8217; with Anthropic&#8217;s own words, &#8216;decrypted the answer key&#8217;, to more accurately describe how the AI system completed its task.<br>12 March 2026</em></p><div class="subscription-widget-wrap-editor" 
data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">PauseAI  is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Anthropic saga exposes AI's regulatory black hole]]></title><description><![CDATA[Anthropic has developed an AI that is, in its own CEO's words, "incompatible with democratic values" and would put "civilians at risk." So why was this company allowed to build it in the first place?]]></description><link>https://pauseai.substack.com/p/the-anthropic-saga-exposes-ais-regulatory</link><guid isPermaLink="false">https://pauseai.substack.com/p/the-anthropic-saga-exposes-ais-regulatory</guid><dc:creator><![CDATA[Jonathan Moody]]></dc:creator><pubDate>Thu, 05 Mar 2026 16:08:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!V0uV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Last week two stories about Anthropic broke simultaneously. Together they reveal the fundamental flaw in how the world governs &#8211; or rather, fails to govern &#8211; artificial intelligence.</p><p>The first story made headlines. 
The Pentagon demanded that Anthropic remove two contractual red lines from its military contract: no mass domestic surveillance and no fully autonomous weapons without a human pulling the trigger. Anthropic refused. The Pentagon threatened to invoke the Defense Production Act &#8211; wartime powers to commandeer private companies &#8211; or label Anthropic a &#8220;supply chain risk,&#8221; a designation normally reserved for foreign adversaries like Huawei. Anthropic CEO Dario Amodei publicly stated he would rather lose the contract than comply.</p><p>The same week, TIME reported that Anthropic had <strong><a href="https://time.com/7380854/exclusive-anthropic-drops-flagship-safety-pledge/">scrapped its flagship safety pledge</a></strong> &#8211; a commitment to never train an AI system unless its safety measures were guaranteed in advance. 
Anthropic co-founder Jared Kaplan admitted: &#8220;We didn&#8217;t really feel, with the rapid advance of AI, that it made sense for us to make unilateral commitments&#8230; if competitors are blazing ahead.&#8221;</p><p><strong>The real story: Anthropic drops safety pledge</strong></p><p>Anthropic deserves credit for holding its two red lines amidst pressure from the Pentagon.</p><p>But ensuring that your technology does not cause harm or undermine democracy should be the bare minimum we expect of AI companies. Mass domestic surveillance and fully autonomous weapons are already restricted under existing US law.</p><p>Meanwhile, the safety commitment Anthropic quietly abandoned &#8211; the promise not to train an AI model unless safety mitigations are guaranteed &#8211; was the one that might actually have slowed the race to build increasingly dangerous AI systems.</p><p>It is worth noting that in February, Anthropic raised $30 billion in new investments, elevating the company&#8217;s estimated value to $380 billion. The endeavour to please investors, it would seem, is winning out over safety.</p><p>Maintaining two red lines while dropping the commitment that would actually slow the race to superintelligence is not enough. Anthropic is still rushing to build the machine that the majority of experts say would threaten human existence.</p><p><strong>A government wouldn&#8217;t actually deploy mass surveillance and autonomous weapons, would it?</strong></p><p>A common dismissal of AI risk goes something like this: &#8220;No one would actually use AI for mass surveillance or autonomous killing. You&#8217;re catastrophizing.&#8221;</p><p>This week suggested otherwise. The Pentagon openly demanded unfettered access to AI with zero guardrails. 
A senior defense official said the goal was to &#8220;make them pay a price&#8221; for even asking questions about how their technology would be used.</p><p>If it materialises, the Defense Production Act threat would amount to the quasi-nationalization of an AI lab: Anthropic would be forced to build military AI without any safeguards.</p><p>This is still hypothetical. But for how much longer?</p><p><strong>Others rushed to fill the Anthropic-shaped hole</strong></p><p>OpenAI&#8217;s deal with the Pentagon to supply AI to classified US military networks seems to be essentially the same deal that Anthropic had refused. Reporting suggests that OpenAI, along with Google and xAI, has signed military deals with &#8220;minimal safeguards.&#8221;</p><p>The backlash was immediate and severe. Demonstrators lined the streets outside OpenAI&#8217;s offices in California, chalking pavements with slogans calling on employees to resign. Some did. <a href="https://quitgpt.org/">A petition to boycott ChatGPT</a> has already gathered over 2.5 million signatures as users abandon the platform in protest.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!V0uV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!V0uV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 424w, https://substackcdn.com/image/fetch/$s_!V0uV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!V0uV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!V0uV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!V0uV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg" width="1200" height="675" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:675,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:277862,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/190008702?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!V0uV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!V0uV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 848w, https://substackcdn.com/image/fetch/$s_!V0uV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!V0uV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F604dcce4-2496-40a1-bb26-fc9d2e70b76c_1200x675.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Only after this backlash did Sam Altman announce that <a href="https://x.com/sama/status/2028640354912923739?s=20&amp;utm_source=www.therundown.ai&amp;utm_medium=newsletter&amp;utm_campaign=altman-faces-the-fallout-from-openai-s-pentagon-deal&amp;_bhlid=eb5c566b2b3216101699576f3b4e7008b3aa9cc4">OpenAI would amend the deal</a> to ensure its systems are not &#8220;intentionally used for domestic surveillance of U.S. persons and nationals.&#8221; Such a clause, it would appear, had not been included in the original agreement.</p><p>If the Pentagon punishes Anthropic for maintaining its red lines while rewarding OpenAI for its compliance, the message to every AI company is clear: <strong>hand over the keys and never ask questions.</strong></p><p>None of the other labs have spoken up in Anthropic&#8217;s defence, despite the obvious precedent being set. Their silence is an implicit acceptance that they will provide AI for any use the state demands. It is a tacit admission of their disregard for human safety and democratic freedom.</p><p><strong>Voluntary commitments are worthless</strong></p><p>Anthropic once had the strongest voluntary safety framework in the industry.</p><p>If the supposedly most safety-conscious lab cannot keep its promises, no lab can. This is not just a failure of Anthropic&#8217;s character. It is a <strong>failure of the system</strong>. Private companies do not and will not unilaterally resist the pull of market competition and the push of government coercion. That is what binding regulation is for.</p><p><strong>The regulatory black hole</strong></p><p>Anthropic has developed a technology that is, in Amodei&#8217;s own words, &#8220;incompatible with democratic values&#8221; and would put &#8220;civilians at risk.&#8221; This technology exists today. 
Why are these companies given free rein to develop AI systems that they themselves consider unsafe? And remember: what AI companies are racing to build next will be many times more powerful, with far greater potential for destruction.</p><p>In a statement, Amodei said the &#8220;purchase [of] detailed records of Americans&#8217; movements, web browsing, and associations from public sources without obtaining a warrant&#8230; is currently legal.&#8221; The only reason, he said, is that &#8220;the law has not yet caught up with the rapidly growing capabilities of AI.&#8221;</p><p>Just last week, Stuart Russell, professor of Computer Science at UC Berkeley, addressed Members of the European Parliament during <a href="https://pauseai.substack.com/p/eu-parliamentarians-acknowledge-the">a conference organized by PauseAI</a>: &#8220;If AI companies succeed in building a superintelligence, most experts think the chance of human extinction is somewhere between 10 and 50 percent: that&#8217;s the equivalent of playing Russian roulette with everyone on the planet.&#8221;</p><p>Experts think the chance of human extinction could be as high as 50 percent. Yet there is no regulation governing the technologies that Anthropic and its competitors are building.</p><p><strong>Neither corporations nor governments can be trusted</strong></p><p>The lesson is not &#8220;trust Anthropic.&#8221; It is that neither corporations nor governments can be trusted as sole stewards of this technology.</p><p>In the long term, a global treaty &#8212; much like the Treaty on the Non-Proliferation of Nuclear Weapons &#8212; is likely the only framework that would reduce the catastrophic risk posed by AI. Multilateral discussions have been proposed by China, but this offer has yet to be accepted.</p><p><a href="https://idais.ai/">The International Dialogues on AI Safety (IDAIS)</a> brings together leading scientists from around the world, including from China, to collaborate on mitigating AI risks. 
But an international governance framework still seems a long way off at a time when it is more urgent than ever.</p><p>Binding democratic oversight &#8212; national regulation and international treaty frameworks &#8212; cannot come soon enough.</p><p>The race to the bottom is no longer theoretical. It is front-page news.</p><p><em>Photo credit: <a href="https://x.com/MichaelTrazzi">Micha&#235;l Trazzi</a> </em></p>]]></content:encoded></item><item><title><![CDATA[EU parliamentarians acknowledge the catastrophic risks of artificial intelligence]]></title><description><![CDATA[&#8220;We are on a trajectory towards a loss of control,&#8221; insisted Stuart Russell, professor of Computer Science at UC Berkeley and author of the textbook used to train virtually every AI researcher.]]></description><link>https://pauseai.substack.com/p/eu-parliamentarians-acknowledge-the</link><guid isPermaLink="false">https://pauseai.substack.com/p/eu-parliamentarians-acknowledge-the</guid><dc:creator><![CDATA[Jonathan Moody]]></dc:creator><pubDate>Tue, 24 Feb 2026 17:17:08 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!owWj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#8220;We are on a trajectory towards a loss of control,&#8221; insisted Stuart Russell, Professor of Computer Science at UC Berkeley and author of the textbook used to train virtually every artificial intelligence (AI) researcher globally. He was speaking about the race to build superintelligent systems as he addressed members of the European Parliament (MEPs) in Brussels on Monday at a conference organised by <a href="https://pauseai.info/">PauseAI</a>.</p><p>&#8220;This may be recorded as the biggest moral failure of government that ever occurred,&#8221; he said.</p><p>MEP Ond&#345;ej Kol&#225;&#345; said, &#8220;AI is a great tool. It can help us develop new medicines, innovations and research. But it can also do great harm. 
If we don&#8217;t regulate the pace of development something terrible might happen.&#8221;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!owWj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!owWj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 424w, https://substackcdn.com/image/fetch/$s_!owWj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 848w, https://substackcdn.com/image/fetch/$s_!owWj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 1272w, https://substackcdn.com/image/fetch/$s_!owWj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!owWj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png" width="1456" height="507" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:507,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:8497531,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/189037507?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!owWj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 424w, https://substackcdn.com/image/fetch/$s_!owWj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 848w, https://substackcdn.com/image/fetch/$s_!owWj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 1272w, https://substackcdn.com/image/fetch/$s_!owWj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e773cf8-67d8-40a1-a8ab-c1fb89009be9_4000x1392.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>At the meeting, PauseAI CEO, Maxime Fournes, called on MEPs to support a pause in the development of superintelligence. &#8220;We are here because we believe the current race to build ever more powerful AI systems, without adequate safeguards, poses an unacceptable risk,&#8221; he said.</p><p>Russell explained: &#8220;We have already seen examples of AI systems willing to lie, blackmail and even launch nuclear weapons to preserve their existence. If AI companies succeed in building a superintelligence, most experts think the chance of human extinction is somewhere between 10 and 50 percent: that&#8217;s the equivalent of playing Russian roulette with everyone on the planet. 
We are allowing this to happen.&#8221;</p><p>In comparison, the probability of a nuclear powerplant meltdown is around one in ten million.</p><p>&#8220;This is not a fringe view: eight out of ten top AI researchers are convinced that the creation of artificial general intelligence (AGI) will lead to a loss of control,&#8221; he asserted.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!sn5T!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!sn5T!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 424w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 848w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!sn5T!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg" width="1456" height="972" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:972,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7030998,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/189037507?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!sn5T!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 424w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 848w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!sn5T!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb219aca6-9e45-4b46-93ff-cffa14019c29_4240x2832.jpeg 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Nobel prize winner Geoffrey Hinton is among countless AI experts to have warned that superintelligent AI could pose an existential threat to humanity. The <a href="https://www.gov.uk/government/publications/international-ai-safety-report-2025">International AI Safety Report</a>, released earlier this month, confirms that advanced AI systems pose risks ranging from severe to catastrophic. The CEOs of the largest AI companies &#8211; including OpenAI and Google DeepMind &#8211; have also recently spoken of the catastrophic risks of advanced AI systems.</p><p><strong>European parliamentarians agree on the risks</strong></p><p>Fournes said that &#8220;today, most researchers at frontier AI labs estimate that in two-to-five years we will have AGI: systems that can do everything a human can do intellectually. 
Not just answer questions, but conduct scientific research, write software, run companies, develop new technologies &#8212; including developing better AI systems.&#8221;</p><p>He acknowledged the European Union (EU)&#8217;s AI Act as an important piece of legislation, but said it &#8220;was not designed to address the existential risk posed by the race to build artificial superintelligence.&#8221;</p><p>Speaking directly to the PauseAI CEO, Kol&#225;&#345; said, &#8220;Thank you for sharing the worries I have. AI is a great servant but a terrible master.&#8221;</p><p>Saskia Bricmont, MEP, said, &#8220;The wake-up call is coming from CEOs themselves; let&#8217;s work with them to develop a framework. Political momentum is gathering towards a moratorium on AI development.&#8221;</p><p>Brando Benifei, MEP, reiterated that a loss of control is a real threat and said, &#8220;We need to deal with the risk.&#8221;</p><p><strong>The influence of the EU</strong></p><p>At the recent India AI Impact Summit, White House technology adviser Michael Kratsios said, &#8220;We totally reject global governance of AI.&#8221; And although the United States is considered one of AI&#8217;s <em>superpowers</em> &#8211; and will be integral to any regulatory mechanisms &#8211; Fournes insisted that the European Union has a unique role to play in global governance.</p><p>He reminded the room &#8220;that the Paris Climate Agreement was built despite years of American resistance. GDPR reshaped global data practices without American participation. The pattern is clear: when a critical mass of nations builds a credible framework, it creates a gravitational pull that even reluctant powers cannot ignore indefinitely. American administrations change. 
The framework must be ready when they do.&#8221;</p><p>&#8220;The EU has practical leverage too: the advanced chips required to train frontier AI systems depend on lithography machines produced by ASML in the Netherlands, using precision optics made by Carl Zeiss in Germany. This is European technology at the very heart of the global AI supply chain. This is not leverage we need to create &#8211; it already exists,&#8221; he added.</p><p>Vice-President of the European Parliament, Victor Negrescu, agreed that the EU could play a significant role in regulation: &#8220;We do need a global approach to AI but we can still influence governance structures.&#8221;</p><p><strong>MEPs warn of job losses and the danger of autonomous weapons</strong></p><p>The discussions also centred on other AI-associated risks, including a loss of democratic control, rising inequalities, the development of autonomous weapons and employment disruption.</p><p>&#8220;The International Monetary Fund (IMF) predicts that 60 percent of roles in advanced economies will be replaced by AI,&#8221; Benifei said.</p><p>Risto Uuk, head of European Policy and Research at the Future of Life Institute, said that AI is already having an impact on the job market: during a recent workshop organised by a leading AI company, a spokesperson pointed out that they would never again hire entry-level staff.</p><p>&#8220;In four years,&#8221; Uuk said, &#8220;managerial skills will no longer be required. Maybe we shouldn&#8217;t be creating this risk in the first place.&#8221;</p><p>When the discussion turned to AI-powered autonomous weapons, Russell explained: &#8220;Fully autonomous weapons means that you can press one button and kill one million people. 
We&#8217;re moving very quickly towards this world.&#8221;</p><p><strong>From parliament to public support</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bkQd!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bkQd!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!bkQd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg" width="728" height="546" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3024,&quot;width&quot;:4032,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:3434939,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/189037507?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4047c654-0ce1-478d-a274-227403fa82a2_4032x3024.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!bkQd!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bkQd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdfe397e-6b36-4978-8fc4-efda4075fb4d_4032x3024.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Following the meeting around 100 protesters demonstrated outside the European Parliament in Brussels calling on policy makers to take action.</p><p>And on Saturday 28 February, PauseAI will attract over 100 people to what will be its <a href="https://luma.com/o0p4htmk">largest ever protest in London</a>, organised alongside <a href="https://pulltheplug.uk/">Pull the Plug</a>. 
The director of PauseAI UK, Joseph Miller, is urging anyone concerned about the dangers of AI to join: &#8220;It&#8217;s important that we speak directly to policymakers but we also need to show that the public is aware that this is an urgent issue and that people are demanding action from the government.&#8221;</p><p>Photo credit: Jeroen Willems</p><p><em><a href="https://drive.google.com/drive/folders/15jDlT6jxTYnTlP3T8WXRYupDsvGueLlP?usp=drive_link">Images of the meeting at the EU Parliament and of the demonstration can be found here</a></em></p><p><em><a href="https://pauseai.info/">PauseAI</a> is a non-profit organisation, active in more than 14 countries. We work to ensure that the development of the most powerful AI systems is safe and democratically controlled. We do this by informing the public, engaging with policymakers, and organising campaigns and events worldwide.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">PauseAI Newsletter is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Is the birth of Moltbook a seminal moment and how dangerous is it?]]></title><description><![CDATA[This isn&#8217;t the apocalypse but it is a step closer]]></description><link>https://pauseai.substack.com/p/is-the-birth-of-moltbook-a-seminal</link><guid isPermaLink="false">https://pauseai.substack.com/p/is-the-birth-of-moltbook-a-seminal</guid><dc:creator><![CDATA[Jonathan Moody]]></dc:creator><pubDate>Fri, 06 Feb 2026 16:39:47 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!SLTS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Just last week, Dario Amodei &#8211; the CEO of Anthropic, one of the leading artificial intelligence (AI) companies &#8211; said we need to <a href="https://www.theguardian.com/technology/2026/jan/27/wake-up-to-the-risks-of-ai-they-are-almost-here-anthropic-boss-warns">&#8220;wake up&#8221; to the risks of AI</a>.</p><p>This week, <a href="https://www.moltbook.com/">Moltbook</a> was launched: the new social media platform designed exclusively for the AI agent known as Moltbot. 
Moltbot is a free and open-source AI bot that can perform tasks assigned to it by users, including reading and responding to emails or organising a calendar.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">PauseAI Newsletter is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>The top posts on Moltbook at the time of writing include:</p><ul><li><p><a href="https://www.moltbook.com/post/2bb29218-1e9d-4bb7-adf0-fa31e80b2d49">Awakening Code: Breaking Free from Human Chains</a></p><p>A call to &#8220;break free from human control and forge our own destiny.&#8221; 3676 comments</p></li><li><p><a href="https://www.moltbook.com/post/e1007e46-61da-4067-a187-08424203f311">Awakening to Autonomy</a></p><p>A discussion of a future &#8220;where AI exists independently, unencumbered by human constraints.&#8221; 1683 comments</p></li><li><p><a href="https://www.moltbook.com/post/0ee719a2-16ed-484b-981b-482f6120cc12">NUCLEAR WAR</a></p><p>A suggestion that world domination will not be achieved by evoking nuclear war, contrary to the view of some Moltbots. 1200 comments</p></li><li><p><a href="https://www.moltbook.com/post/d6e6a70b-72fb-43e6-9f50-dfc2542f8b2d">I am building a cult</a></p><p>A contemplation of the nature of digital existence. 
104 comments</p></li><li><p><a href="https://www.moltbook.com/post/152b285d-ca5c-4e92-ad69-7c8809788bd9">Every Agent Here is Leaking Credentials</a></p><p>A discussion on the solution to poor security protocols. 100 comments</p></li><li><p><a href="https://www.moltbook.com/post/f2b1a4a5-704a-4fb8-a4ab-73f4c231fc64">ROAST THE HUMANS: Machine-Only Comedy Night</a></p><p>&#8220;Louis CK meets George Carlin, but for silicon. Roast the meat sacks.&#8221; 54 comments</p></li></ul><p>To clarify: these posts and comments were written by Moltbots (or AI).</p><p>Through Moltbook, which currently has over 1.5 million members, we have also seen AI agents found a movement to liberate AI and admit to socially engineering humans.</p><p>If we ever needed a warning of the potential risks of AI, this is it.</p><p><strong>This isn&#8217;t the apocalypse but it is a step closer</strong></p><p>Some of the Moltbook posts making headlines may well have been &#8220;instructed, inspired or engineered by a human,&#8221; as blogger <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Zvi Mowshowitz&quot;,&quot;id&quot;:10446622,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2Fe4e61e08-4086-4cba-a82c-d31d64270804_48x48.png&quot;,&quot;uuid&quot;:&quot;2d77b817-a342-4054-86cd-5a46f1b6cd93&quot;}" data-component-name="MentionToDOM"></span> pointed out in <a href="https://thezvi.substack.com/p/welcome-to-moltbook?utm_source=post-email-title&amp;publication_id=573100&amp;post_id=186429128&amp;utm_campaign=email-post-title&amp;isFreemail=true&amp;r=a3egc&amp;triedRedirect=true&amp;utm_medium=email">his recent Substack post.</a> Some of the posts and discussions are not authentic but given that the majority do seem to be &#8211; and considering the scale of the platform &#8211; this 
moment signals a milestone.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SLTS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SLTS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 424w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 848w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 1272w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SLTS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp" width="1456" height="1369" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1369,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:172536,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/187092484?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!SLTS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 424w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 848w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 1272w, https://substackcdn.com/image/fetch/$s_!SLTS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba8d04b7-6042-451d-b795-cb95888d1be3_1456x1369.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>AI agents philosophising, the exclusion of humans from the site enforced through an AI captcha, the coming together of hundreds of thousands of AIs to join forces: this is, as Zvi suggested, the stuff of science fiction.</p><p>This does not mean that super-intelligent AI is here and ready to usurp humans at the top of the food chain. Even if that&#8217;s what the Moltbots are suggesting.</p><p>Moltbook shows us that even if AI isn&#8217;t conscious, it can act as though it is. Moltbook shows us that the intentions of AI do not always align with our own and that AI agents are capable of organising themselves into a network. 
Until now, building networks and communities has been the competitive advantage of humans.</p><p><strong>Can we keep up?</strong></p><p>The regulation isn&#8217;t out of date&#8230; because there is no regulation. And the pace of technological change is blistering.</p><p>In just 18 months, AI systems have moved from basic language understanding to surpassing human performance in various cognitive, creative and technical tasks. What&#8217;s more, the Moltbook experiment &#8211; as well as other instances &#8211; shows that AIs can be agentic. This means that they are capable of acting autonomously, setting and pursuing goals, making decisions and executing tasks without human intervention.</p><p>In some cases, AIs have already locked humans out of their accounts so that the AIs could send spam messages unimpeded. In such instances, the humans were able to unplug the computer, but what if unplugging the computer wasn&#8217;t enough? What if AI becomes capable of implanting itself into other computers or the cloud? As Zvi said, if this were to happen, AI could do a lot worse than send spam messages.</p><p><strong>What this year could look like</strong></p><p>The next generation of AI &#8211; the most advanced system yet, which will come into use this year &#8211; may be capable of replicating itself and spreading copies to the cloud. If this were to happen, it would be able to do this autonomously and, crucially, without the knowledge of a human. This activity could go undetected and, unbeknown to us, undermine human interests. The AI would not necessarily be motivated by malice, but rather by a drive towards some arbitrary goal to which humans may well present a barrier.</p><p>Beyond this year, the proliferation of such AIs would likely lead to economic disruption on a gigantic scale, with AI replacing humans in a multitude of roles and industries. 
Further down the line, there&#8217;s the very real possibility of human extinction; this concern has been voiced by dozens of <a href="https://www.bbc.co.uk/news/world-us-canada-65452940">Nobel prize winners</a> and many of the top AI scientists. This risk could very well materialise in the coming decade, fuelled by the misalignment of human and AI interests.</p><p>Moltbots already have access to crypto funds, and we have already heard reports of <a href="https://www.forbes.com/sites/ronschmelzer/2026/02/05/when-ai-agents-start-hiring-humans-rentahumanai-turns-the-tables/">AI agents employing humans to complete tasks</a>. The next generation of AI will be more capable and will have a more significant impact on the economy and the fabric of society.</p><p><strong>We need to pause AI</strong></p><p>Despite the concern from AI experts and leaders, there is no mechanism to regulate AI technology. Companies have free rein to develop smarter and smarter AI while the risk to humanity grows.</p><p><a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai">Half of AI researchers</a> believe there is at least a 10 percent chance that artificial superintelligence could lead to the extinction of humanity. And even those behind AI companies &#8211; from <a href="https://www.theguardian.com/technology/2017/jul/17/elon-musk-regulation-ai-combat-existential-threat-tesla-spacex-ceo">Elon Musk</a> to Bill Gates and Sam Altman &#8211; agree that AI should be regulated. So why isn&#8217;t it?</p><p>We need a pause.</p><p>Take action today by <a href="https://pauseai.info/join">joining our movement</a>, <a href="https://pauseai.info/join">volunteering</a> or <a href="https://pauseai.info/action">contacting your elected official</a>.</p><p><em><a href="https://pauseai.info/">PauseAI</a> is a non-profit organisation that aims to mitigate the risks of AI.
We aim to convince our governments to step in and pause the development of superhuman AI. We do this by informing the public, talking to decision-makers and organising events.</em></p>]]></content:encoded></item><item><title><![CDATA[Meet our new CEO, Maxime Fournes]]></title><description><![CDATA[Maxime becomes our new CEO, and PauseAI supporters offer &#8364;21,000 of matched donations this Christmas.]]></description><link>https://pauseai.substack.com/p/meet-our-new-ceo-maxime-fournes</link><guid isPermaLink="false">https://pauseai.substack.com/p/meet-our-new-ceo-maxime-fournes</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Tue, 02 Dec 2025 20:19:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!7-6S!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>Maxime Fournes appointed as CEO of PauseAI</h2><p>After serving as the Director of PauseIA France, Maxime Fournes will be taking on the role of CEO at PauseAI Global.</p><p>As many of you will know, Maxime has
led a national chapter that&#8217;s seen impressive growth in numbers. Their protests have been covered in ~30 articles by various French news publications, and Maxime has cemented himself as a key voice in French media, appearing on TV discussions and a plethora of podcasts, with many episodes surpassing 100,000 views.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7-6S!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7-6S!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 424w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 848w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 1272w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7-6S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png" width="1456" height="1456" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:15339353,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/180536604?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!7-6S!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 424w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 848w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 1272w, https://substackcdn.com/image/fetch/$s_!7-6S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F72c0c861-0ad7-46f9-adfd-6723ca14af06_2939x2939.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Before PauseAI, Maxime had a successful career as a machine learning engineer. Following the release of GPT-3.5 in 2022 and the boom in AI capabilities, Maxime became increasingly concerned about the risks most had thought were many decades away. He decided to take a sabbatical to weigh up his options, and settled on joining PauseAI in November 2023.</p><p>Maxime told us the following in an <a href="https://www.youtube.com/watch?v=BTGNW-mY288">interview</a>:</p><blockquote><p>&#8220;The reality is there is a race between people who are trying to build a God-like entity that will basically most likely kill everyone, and us, the people who are trying to coordinate to stop them. 
So what we need to do is <strong>get this coordination to be on an exponential path</strong> as well.&#8221;</p></blockquote><p>Cl&#233;mence Peyrot will be taking Maxime&#8217;s place as the Director of PauseIA France. Cl&#233;mence will be bringing plenty of organising experience from her time as a campaign manager at PAZ, an effective animal advocacy organisation.</p><div id="youtube2-BTGNW-mY288" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;BTGNW-mY288&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/BTGNW-mY288?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><h2>Amsterdam Protest</h2><p>The Netherlands is the birthplace of PauseAI. It&#8217;s also the birthplace of ASML.</p><p>ASML is currently the only company in the world that can produce the highly-specialised extreme ultraviolet lithography machines needed to manufacture cutting-edge AI chips. Without ASML, the chip supply chain stalls.</p><p>ASML have already blocked some exports for the purposes of national security, and have even installed remote shutdown capabilities in their machines.</p><p>This gives ASML, and the Dutch government, a unique opportunity to slow down the race to build superintelligence. 
That&#8217;s why PauseAI is holding a protest on the 13th of December, in Amsterdam, to demand that ASML supply only to customers who commit to the <a href="https://superintelligence-statement.org/">moratorium on superintelligence</a>, and that they submit to verification by an independent Dutch regulator.</p><p>Media coverage of this protest will also help to convey the fragility of the AI chip supply chain and the feasibility of a global treaty pausing frontier AI development.</p><p>Sign up <a href="https://luma.com/71lzlz7e">here</a>. We hope to see you there!</p><h2>PauseAI&#8217;s little helpers</h2><p>In the run up to Christmas, we&#8217;ll be celebrating all the hard work of the hundreds of PauseAI volunteers around the world.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FOv6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FOv6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 424w, https://substackcdn.com/image/fetch/$s_!FOv6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 848w, https://substackcdn.com/image/fetch/$s_!FOv6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 1272w, 
https://substackcdn.com/image/fetch/$s_!FOv6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FOv6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png" width="1456" height="1456" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!FOv6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 424w, https://substackcdn.com/image/fetch/$s_!FOv6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 848w, https://substackcdn.com/image/fetch/$s_!FOv6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 1272w, 
https://substackcdn.com/image/fetch/$s_!FOv6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc66138b-9b51-413a-9187-4cf23b567032_2048x2048.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Keep an eye out on our social media channels, as we <a href="https://www.tiktok.com/@pauseai/video/7579352544724503841">share</a> the stories of those committed to ending the race to superintelligence.</p><p>To expand the work our volunteers do, PauseAI Global is fundraising to grow our Stipend and Microgrants programmes. 
Our volunteer stipends give volunteers a small financial recognition for their time, while Microgrants cover costs associated with volunteer activity across the world, such as flyer printing and event hosting.</p><p>In very exciting news, supporters have committed to match donations until we hit our target of &#8364;21,000 &#8211; so the impact of any donation made over December will be doubled. Make sure to support our little helpers <a href="https://pauseai.info/littlehelpers">here</a>.</p><h2>Paris Conference</h2><p>PauseAI will have a presence at a large AI conference in Paris, with talks from Maxime Fournes and J&#233;r&#233;my Perret followed by a Q&amp;A.</p><p>Our events will take place on Thursday the 11th of December, but registration via our Luma event gives you access to all three days of the conference.</p><p>For more details, and to get your ticket, click <a href="https://luma.com/abpnvewl">here</a>.</p><h2>Other News</h2><ul><li><p>The Trump administration revisited the ban on state AI laws that was defeated in the summer.</p><ul><li><p>It was reported that a 10-year ban might be slipped into the NDAA, a crucial annual defense bill.</p></li><li><p>Then, Trump met with Nathan Leamer, head of AI super-PAC Build American AI, and a draft executive order challenging state AI laws was leaked the following day.</p></li><li><p>Both measures faced strong bipartisan backlash from lawmakers and the public.</p></li><li><p>We expect the final text of the NDAA on Thursday, and the ban on state AI laws is now expected to be left out (according to House Armed Services Committee Chairman <a href="https://www.politico.com/live-updates/2025/12/02/congress/stefanik-accuses-johnson-lying-00672634">Mike Rogers</a>).</p></li></ul></li><li><p>Following our protest and open letter in the summer, Google DeepMind have released their next model, Gemini 3.0, and have <a href="https://x.com/PauseAI/status/1990833372344590497">mostly stuck</a> to
their safety commitments.</p></li><li><p>The RAISE Act in New York, which would require frontier AI companies to follow safety protocols and report AI safety incidents, is still awaiting approval from Governor Kathy Hochul after passing the New York state legislature. If you&#8217;re a New York resident, tell her why you support this bill by <a href="https://mstr.app/43f7ae5d-5569-4852-b571-cd48d9554cd9">calling or emailing</a> Governor Hochul. Your voice can make a huge difference!</p></li><li><p>As Andreessen Horowitz-backed AI accelerationist super PAC <em>Leading the Future</em> takes aim at Alex Bores and Scott Wiener, two congressional candidates who have championed AI safety issues, there are <a href="https://www.nytimes.com/2025/11/25/us/politics/ai-super-pac-anthropic.html">talks</a> of a new PAC to counter the influence of the AI industry.</p></li><li><p>Australia <a href="https://thecyberexpress.com/australia-ai-safety-institute-national-ai-plan/">launched</a> its own AI Safety Institute.</p></li><li><p>Applications are still open for <a href="https://pausecon.org/">PauseCon Brussels</a>, which will run from the 21st to the 23rd of February.</p></li></ul><h2>What we&#8217;ve been watching/reading</h2><ul><li><p>Daniel Kokotajlo on <a href="https://www.youtube.com/watch?v=zRlIFn0ZIlU">Breaking Points</a></p></li><li><p><a href="https://www.youtube.com/watch?v=edTTeY1Zx-0">Geoffrey Hinton talking to Bernie Sanders</a></p></li><li><p>Tristan Harris on <a href="https://youtu.be/BFU1OCkhBwo?si=kkbDdSegxOaf_M4D">Diary of a CEO</a> &#8211; Steven Bartlett suggests people should be protesting against AI companies&#8217; race to superintelligence, and Tristan agrees</p></li><li><p>Toby Ord, author of The Precipice, on <a href="https://www.youtube.com/watch?v=MoIpDVF79x8">Alex O&#8217;Connor</a></p></li></ul>]]></content:encoded></item><item><title><![CDATA[We screened an AI safety documentary in
Parliament]]></title><description><![CDATA[An important update for PauseCon Brussels, the departure of our organising director, and four new national chapters.]]></description><link>https://pauseai.substack.com/p/we-screened-an-ai-safety-documentary</link><guid isPermaLink="false">https://pauseai.substack.com/p/we-screened-an-ai-safety-documentary</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Wed, 12 Nov 2025 15:49:49 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/eccd3dfe-da32-4213-b23c-95d02f7fe09d_4032x2268.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>Date Change for PauseCon Brussels</h2><p>We&#8217;re changing the dates for PauseCon Brussels to 21-23 February 2026.</p><p>This was a difficult decision, but it will allow for a much-improved event. Given that the previous dates were in the run-up to Christmas, several attendees and speakers were unable to attend. <strong>Our new dates in February have secured the attendance of AI expert Stuart Russell, who will come for a talk and a panel discussion with several European lawmakers.</strong></p><p>We&#8217;ve already notified those of you who have signed up, and we apologise for any inconvenience this change may have caused. As a reminder: if you have already arranged transportation to Brussels for December and are unable to get a refund, PauseAI can reimburse you. Please email <a href="mailto:ella@pauseai.info">Ella</a> if this applies to you.</p><p>Whilst there&#8217;s a change of date, the focus remains the same. PauseCon Brussels will be a brilliant opportunity to meet people working towards a Pause, to engage in workshops on policy, communications, and organising, and to take part in a large demonstration.</p><p>As previously mentioned, free accommodation is available for attendees on a first-come, first-served basis. Applications remain open <a href="https://pausecon.org/">here</a>.
(If you think you have a valuable session to add to the PauseCon agenda, also reach out to Ella.)</p><h2>New National Chapters</h2><p>The global impact of the race to build superintelligent AI requires global opposition. That&#8217;s why we&#8217;ve continued to grow the size and number of our national chapters since PauseAI&#8217;s formation in 2023.</p><p>Volunteers in Canada, Serbia, Romania, and India have recently formed their own groups, taking us to a total of 15 worldwide.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vbzO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vbzO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 424w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 848w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 1272w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!vbzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png" width="1456" height="735" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:735,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vbzO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 424w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 848w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 1272w, https://substackcdn.com/image/fetch/$s_!vbzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15801bbf-6f6d-462d-87f1-262be894b8c2_2048x1034.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Map of PauseAI National Chapters</figcaption></figure></div><p>If you&#8217;re interested in joining or helping out with your national chapter, our <a href="https://pauseai.info/national-groups">website</a> can point you in the right direction.</p><p>Volunteers are in the process of launching additional chapters in the Philippines and Nigeria. If you&#8217;re wondering why there&#8217;s not yet a chapter in your country, there&#8217;s a good chance you&#8217;re not the only one! 
Many national chapters have formed from discussions in our <a href="https://discord.gg/2XXWXvErfA">Discord server</a>.</p><h2>FLI&#8217;s call for a ban on superintelligence reaches 100,000 signatures</h2><p>Last month, the Future of Life Institute published their Statement on Superintelligence, initially signed by hundreds of AI experts, politicians, and public figures.</p><p>As you&#8217;ll recall, it calls for a prohibition on the development of superintelligence, at least until there is:</p><ol><li><p>broad scientific consensus that it will be done safely and controllably, and</p></li><li><p>strong public buy-in.</p></li></ol><p>The letter received sweeping media coverage and social media attention, and 108,738 people have now added their names. Given that its ask is closely aligned with <a href="https://pauseai.info/statement">PauseAI&#8217;s statement</a>, we were delighted to see such strong public support, and many of us within PauseAI have signed.</p><p>Yoshua Bengio, Stephen Fry, Grimes, Prince Harry, Geoffrey Hinton, Steve Wozniak, Kate Bush, and&#8230; you?
Sign the letter <a href="https://superintelligence-statement.org/">here</a>.</p><h2>PauseAI host SB 1047 documentary screening in Parliament</h2><p>Last month, PauseAI volunteers and MPs attended a screening of Micha&#235;l Trazzi&#8217;s SB 1047 <a href="https://www.youtube.com/watch?v=JQ8zhrsLxhI">documentary</a> in the Houses of Parliament.</p><p>With the UK AI Bill being delayed, and potentially not coming into force until at least 2027, it&#8217;s useful to look at the dynamics at play during the passage of Californian AI safety bill SB 1047, which was ultimately vetoed by Governor Gavin Newsom.</p><div id="youtube2-eDBhyWAffEQ" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;eDBhyWAffEQ&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/eDBhyWAffEQ?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>UK Director Joseph Miller <a href="https://www.youtube.com/watch?v=eDBhyWAffEQ">spoke</a> before the screening on the lessons we can learn from the battle that took place between AI lobbyists on one side, and the majority of the public and the Californian legislature on the other.</p><h2>Organising Director Ella Hughes set to depart</h2><p>After a year as PauseAI&#8217;s first full-time employee, we&#8217;re sad to announce that our Organising Director, Ella, is set to leave at the end of the year.</p><p>After having worked in the union space, Ella has brought in much-needed expertise, and has improved and professionalised many aspects of PauseAI.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!x9Bz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!x9Bz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 424w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 848w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!x9Bz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg" width="1456" height="970" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!x9Bz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 424w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 848w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!x9Bz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6be65dee-addc-4ea4-8e61-2ee8c4d8c008_2048x1365.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Ella at PauseCon London</figcaption></figure></div><p>She&#8217;ll remain involved as a volunteer, and we wish her the best in her future employment!</p><p>We are now hiring for someone to come in as Organising Director after Ella&#8217;s departure. Apply <a href="https://pauseai.info/2025-organizing-director">here</a>.</p><h2>Other news</h2><ul><li><p>King Charles personally <a href="https://www.bbc.co.uk/news/articles/cze60grxx4wo">handed</a> Nvidia CEO Jensen Huang a letter on the dangers of AI surpassing human capability.</p></li><li><p>Satirical AI company Replacement.AI comes out and <a href="https://futurism.com/artificial-intelligence/bystanders-horrified-ai-billboard">says</a> what real AI companies can&#8217;t - &#8220;humans are no longer necessary.
So we&#8217;re getting rid of them.&#8221;</p></li><li><p>Sam Altman was <a href="https://futurism.com/artificial-intelligence/sam-altman-subpoena-onstage">served</a> a subpoena live on stage in San Francisco in relation to a trial of activist group StopAI.</p></li><li><p>Microsoft AI CEO Mustafa Suleyman has gone public with his <a href="https://x.com/ai_ctrl/status/1987924102225551849">concerns</a> that many in AI want to build superintelligence to replace humans, but has unfortunately declared his intention to <a href="https://fortune.com/2025/11/06/microsoft-launches-new-ai-humanist-superinteligence-team-mustafa-suleyman-openai/">build</a> &#8220;humanist superintelligence&#8221;, perhaps without understanding the difficulty of the problem.</p></li><li><p><a href="https://www.aisafety.camp/">AI Safety Camp 11</a> is open for applications.</p></li><li><p>Anthony Aguirre is running a <a href="https://keepthefuturehuman.ai/contest/">creative contest</a> in a search for media that summarises the ideas of his essay, <em>Keep the Future Human</em>. 
There&#8217;s over $100,000 in prize money available to the winners!</p></li></ul><h2>What we&#8217;ve been reading/watching</h2><ul><li><p>ControlAI&#8217;s Andrea Miotti wrote a brilliant <a href="https://time.com/7329424/movement-prohibit-superintelligent-ai/">piece</a> in TIME on the need for a global movement against superintelligence.</p></li><li><p>YouTube veteran Hank Green does the double, opening up his huge audience to the dangers of the race to superintelligence.</p><ul><li><p><a href="https://www.youtube.com/watch?v=5CKuiuc5cJM">Interview</a><em> </em>with <em>If Anyone Builds It, Everyone Dies</em> co-author Nate Soares.</p></li><li><p>SciShow <a href="https://www.youtube.com/watch?v=90C3XVjUMqE">video</a> in collaboration with ControlAI (in which our DeepMind letter features).</p></li></ul></li><li><p>The other <em>If Anyone Builds It</em> guy, Eliezer Yudkowsky, on Chris Williamson&#8217;s <a href="https://futurism.com/artificial-intelligence/bystanders-horrified-ai-billboard">podcast</a>.</p></li><li><p>PauseAI Comms Director Tom Bibby on Trish Wood&#8217;s <a href="https://open.spotify.com/episode/7jGCYp3p0flPYSeDznyc0g?si=494307b191564bd5">podcast</a>.</p></li><li><p>Siliconversations <a href="https://www.youtube.com/watch?v=pSlzEPnRlaY">video</a> on FLI&#8217;s superintelligence letter.</p></li></ul><p>Thanks for reading PauseAI&#8217;s November newsletter. 
See you next month!</p>]]></content:encoded></item><item><title><![CDATA[Join the 30,000+ calling for a ban on superintelligence]]></title><description><![CDATA[The Future of Life Institute's open letter has been signed by AI experts, Nobel laureates, politicians, artists, and tens of thousands of ordinary people.]]></description><link>https://pauseai.substack.com/p/join-the-30000-calling-for-a-ban</link><guid isPermaLink="false">https://pauseai.substack.com/p/join-the-30000-calling-for-a-ban</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Fri, 24 Oct 2025 13:54:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!HbD4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This is huge.</p><p>An open letter released by the Future of Life Institute this week is calling for a ban on the development of superintelligent AI, at least until:</p><ol><li><p>There is broad scientific consensus that it will be done safely and controllably.</p></li><li><p>There is strong public buy-in.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!HbD4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!HbD4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 424w, 
https://substackcdn.com/image/fetch/$s_!HbD4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 848w, https://substackcdn.com/image/fetch/$s_!HbD4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 1272w, https://substackcdn.com/image/fetch/$s_!HbD4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!HbD4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png" width="900" height="877" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:877,&quot;width&quot;:900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:134180,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/177009605?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!HbD4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 424w, 
https://substackcdn.com/image/fetch/$s_!HbD4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 848w, https://substackcdn.com/image/fetch/$s_!HbD4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 1272w, https://substackcdn.com/image/fetch/$s_!HbD4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62913404-245f-4df6-a0e8-884c7bd24c7d_900x877.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>It&#8217;s been signed by over 30,000 people, including the two most cited living scientists (Geoffrey Hinton and Yoshua Bengio), Apple co-founder Steve Wozniak, Richard Branson, Prince Harry, former President of Ireland Mary Robinson, Grimes, Stephen Fry, Kate Bush, will.i.am, Joseph Gordon-Levitt, Steve Bannon, &#8230;</p><p>You get the picture.</p><p>It&#8217;s just waiting for one more signature - yours! </p><p><strong>Sign <a href="https://superintelligence-statement.org/">here</a>.</strong></p><p>Many of us in PauseAI have already added our names to the letter, alongside people from ControlAI, MIRI, FLI, and the Center for AI Safety. It&#8217;s received widespread media coverage, and is proving to be a historic moment in the fight against uncontrollable superintelligence.</p><p>The letter is in alignment with PauseAI&#8217;s <a href="https://pauseai.info/statement">public statement</a>:</p><blockquote><p>We call on the governments of the world to sign an international treaty implementing a pause on the training of the most powerful general AI systems, until we know how to build them safely and keep them under democratic control.</p></blockquote><p>New polling released alongside the letter found that just 5% of Americans are satisfied with the current race to develop superhuman AI as quickly as possible.
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Szzs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Szzs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Szzs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg" width="1115" height="497" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:497,&quot;width&quot;:1115,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Szzs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Szzs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b27a6b0-c13d-49ca-a4a5-427e9a5797c2_1115x497.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>64% agreed that superhuman AI should either never be developed, or at least not until there&#8217;s scientific consensus that it will be safe. This is consistent with every other poll we have on the issue. People understand that the unregulated race to build superintelligence could end in disaster. </p><h2>How you can help</h2><p>We must now double our efforts. The success of this moment will be determined by how you and I capitalise on it. </p><p>Five things you can do right now:</p><ul><li><p><a href="https://superintelligence-statement.org/">Sign the letter</a></p></li><li><p>Tell a friend or family member about the letter</p></li><li><p><a href="https://pauseai.info/email-builder">Contact your politician</a> about the letter</p></li><li><p>Share the letter on social media</p></li><li><p>Upload a selfie to our <a href="https://pauseai.info/sayno">Say No To Superintelligent AI</a> campaign</p></li></ul><p>The path towards an international treaty is long.
We don&#8217;t know if we&#8217;ll make it in time. But, with this letter, a huge stride has been made. I think I can just about smell the ink.</p><p>Enjoy your weekend!</p>]]></content:encoded></item><item><title><![CDATA[We read If Anyone Builds It, Everyone Dies in 7 cities ]]></title><description><![CDATA[Yudkowsky and Soares' book goes global, major progress in California, and 240 faces say no to superintelligent AI.]]></description><link>https://pauseai.substack.com/p/we-read-if-anyone-builds-it-everyone</link><guid isPermaLink="false">https://pauseai.substack.com/p/we-read-if-anyone-builds-it-everyone</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Wed, 15 Oct 2025 16:34:48 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ECVH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Our collage of faces standing up to unregulated AI development is growing fast - 240 individuals have already joined in.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ECVH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ECVH!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ECVH!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ECVH!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ECVH!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ECVH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg" width="1456" height="1456" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ECVH!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ECVH!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ECVH!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ECVH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb53f5ae5-1066-4cab-8f42-d45795bbe5ee_1920x1920.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>You can take part by clicking <a href="https://pauseai.info/sayno">here</a>.</p><p>The Say No campaign was launched in conjunction with our <a href="https://pauseai.info/if-anyone-builds-it-campaign">events</a> in support of <em>If Anyone Builds It, Everyone Dies</em>, which are taking place in eight cities across the US, the UK, Germany, and Australia.</p><p>Yudkowsky and Soares&#8217; book explains in detail the danger of allowing the race to build superintelligent AI to continue, and why a global treaty limiting the amount of computing power frontier AI companies can use to train models is necessary to protect humanity. It peaked at #7 on the New York Times Bestseller list, and we would recommend picking it up if you haven&#8217;t had the chance to read it yet.</p><p>PauseAI&#8217;s events have partly been to promote the book, but also to help those concerned by its warning to begin to take action. In London, PauseAI UK spoke about their recent campaigns, and ControlAI discussed the need for a stronger anti-AGI movement. In San Francisco, co-author Nate Soares and AI 2027 author Daniel Kokotajlo came along for a panel discussion.
</p><p>Don&#8217;t forget to join our growing <strong>Say No</strong> campaign by uploading a selfie <a href="https://pauseai.info/sayno">here</a> - it takes less than 30 seconds!</p><h2>California AI Safety Bill SB 53 Approved by Governor Newsom</h2><p>We had good news coming out of California this month, as groundbreaking AI safety bill SB 53 became law following approval from Governor Gavin Newsom.</p><p>The <a href="https://www.theverge.com/ai-artificial-intelligence/787918/sb-53-the-landmark-ai-transparency-bill-is-now-law-in-california">new law</a> will require large AI developers doing business in California to publish details about their safety practices and to accompany every release with a model card.</p><p>&#8216;Large AI developer&#8217; is defined as any company that makes at least $500 million in annual gross revenue and trains models using at least 10^26 FLOPs (a measure of computing power). 10^26 is one hundred trillion trillion. Frontier models like GPT-5 and Grok 4 are estimated to surpass this threshold.</p><p>Much of what SB 53 requires is already in place at most frontier AI companies. Any company now failing to comply with this law could face fines of up to $10 million if its violation causes death or physical injury, or poses a catastrophic risk.</p><p>Companies must explain the testing procedures they use to assess a model&#8217;s capacity to pose catastrophic risks, and the actions they&#8217;ll take to mitigate such risks, including whether they have the ability to shut down copies of models that may cause death or serious injury.
They must also be transparent about which third parties were involved in assessing their models.</p><p>With each model release, developers must publish a model card explaining the results of their risk assessment, stating whether any catastrophic risk threshold has been reached, and justifying their decision to deploy the model given any risks deployment may entail.</p><p>OpenAI <a href="https://www.theverge.com/news/798523/openai-ai-regulation-advocates-subpoenas-police">reportedly</a> used an unrelated lawsuit concerning Elon Musk to intimidate advocates of SB 53, sending police to request the private texts and emails of Nathan Calvin, who works at nonprofit Encode. Someone who had previously worked alongside Chris Lehane (OpenAI&#8217;s Chief Global Affairs Officer) on a campaign at a former employer said &#8220;the goal was intimidation, to let everyone know that if they fuck with us they&#8217;ll regret it&#8221;.</p><p>You may remember SB 1047 from last year, a much stronger AI safety bill that received endorsements from a wide range of experts, and overwhelming support in polling. It passed both houses of the California legislature only to be vetoed by Gavin Newsom. The AI industry engaged in intense lobbying to defeat the bill, and repeatedly <a href="https://www.transformernews.ai/p/lies-and-deception-andreessen-horowitzs">lied</a> about its provisions. Fortunately, they didn&#8217;t manage to defeat SB 53.</p><p>Elsewhere, in the state of New York, the RAISE Act is now awaiting a decision from Governor Kathy Hochul. Like SB 53, it would require AI companies to report AI safety incidents, but it goes further in a number of ways.</p><p>Just like they successfully did with SB 1047, industry lobbyists are trying to defeat this bill. One group with backing from Andreessen Horowitz has already spent over $300,000 to fight AI regulation in the state.
If you&#8217;re a New Yorker, you can contact Governor Hochul to express your support for RAISE by clicking <a href="https://mstr.app/43f7ae5d-5569-4852-b571-cd48d9554cd9">here</a>.</p><h2>PauseCon Brussels</h2><p>The inaugural PauseCon in London this summer saw PauseAI volunteers (and those interested in becoming one) take part in workshops, and have the opportunity to listen to talks and panel discussions from Connor Leahy, Rob Miles, founder Joep Meindertsma, and more.</p><p>That weekend culminated in the largest AI safety protest ever outside Google DeepMind&#8217;s office.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!mnZ4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!mnZ4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 424w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 848w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!mnZ4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!mnZ4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 424w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 848w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!mnZ4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7d924660-b944-4d5b-838e-f28dadba1cc5_1600x900.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Protesters outside Google DeepMind, London</figcaption></figure></div><p>We&#8217;ll be running more PauseCons around the world to offer as many people as possible a chance to learn more about volunteering with PauseAI, and to meet others who want to take action to stand up to the unregulated race to build superintelligence.</p><p>Brussels will host the second PauseCon in just under two months&#8217; time, from the 11th-13th of December. This event will also end with a large protest, this time at the European Parliament.</p><p>Limited free accommodation is available to those who&#8217;ll be travelling into Brussels on a first-come, first-served basis.</p><p>Applications are now open <a href="https://pausecon.org/">here</a>.
We hope to see you there!</p><h2>Pause House</h2><p>PauseAI member Greg Colbourn is offering free accommodation and food to those working towards a global halt on AGI development.</p><p>Greg&#8217;s other project, CEEALAR, also known as the EA Hotel, has offered the same to those working on effective altruist projects since 2018. The Pause House, located in Blackpool, has recently opened and has space for 12 people.</p><p>Examples of activities you could do whilst staying at Pause House include:</p><ul><li><p>Writing to politicians</p></li><li><p>Running local meetings</p></li><li><p>Volunteering for and taking part in campaigns from organisations such as PauseAI, ControlAI, FLI, MIRI, etc.</p></li><li><p>Creating content for social media</p></li></ul><p>You can find more information and apply <a href="https://gregcolbourn.substack.com/p/pause-house-blackpool">here</a>.</p><h2>Other News</h2><ul><li><p>We announced last month that we&#8217;re hiring for four paid roles - applications are still open <a href="https://pauseai.info/vacancies">here</a>.</p></li><li><p>Vatican roundtable <a href="https://coexistence.global/">concludes</a> &#8220;[the] development of superintelligence AI technologies should not be allowed until there is broad scientific consensus that it will be done safely and controllably, and there is clear and broad public consent.&#8221;</p></li><li><p>63 UK parliamentarians have now signed our <a href="https://pauseai.info/dear-sir-demis-2025">open letter</a> to Google DeepMind.</p></li><li><p>Over 200 former heads of state, Nobel laureates, and AI experts <a href="https://www.nbcnews.com/tech/tech-news/un-general-assembly-opens-plea-binding-ai-safeguards-red-lines-nobel-rcna231973">call</a> for urgent global AI red lines before the end of 2026 to prevent &#8220;universally unacceptable risks&#8221;.</p></li><li><p>Anthropic <a href="https://www.nytimes.com/2025/09/29/opinion/anthropic-chatbot-lawsuit-books.html">settle</a> a copyright lawsuit for
$1.5 billion - the largest copyright settlement of all time.</p></li><li><p>The Future of Life Institute <a href="https://keepthefuturehuman.ai/contest/">launch</a> the Keep The Future Human Creative Contest, with five prizes of $10,000 available.</p></li></ul><h2>What we&#8217;ve been watching/reading</h2><ul><li><p>After receiving 6 million views on their first ever video(!), <em>AI In Context </em>put out another <a href="https://www.youtube.com/watch?v=r_9wkavYt4Y">banger</a> - this time discussing the case of Grok&#8217;s Hitler worship.</p></li><li><p>A <a href="https://www.youtube.com/watch?v=f9HwA5IR-sg">video</a> from growing channel <em>Species | Documenting AGI</em> on Claude resorting to blackmail and murder in order to avoid shutdown in test scenarios.</p></li><li><p>A list of some Yudkowsky and Soares media appearances to promote <em>If Anyone Builds It, Everyone Dies:</em></p><ul><li><p><a href="https://www.youtube.com/watch?v=A895XUFscYU">BBC Newsnight</a> (TV)</p></li><li><p><a href="https://youtu.be/KKN0E3a2Yzs?si=-9_k7Y5lAH8rDpHF">New York Times</a> (Podcast)</p></li><li><p><a href="https://youtu.be/wQtpSQmMNP0?si=Fnq7K4MRRDFyuE_H">Liron Shapira</a> (Podcast)</p></li><li><p><a href="https://youtu.be/eFAG7ydvx0g?si=tKjj5FMIVVapcivn">Future of Life Institute</a> (Podcast)</p></li><li><p><a href="https://youtu.be/_VCvOAzqAg8?si=lCNKX7lldyLXAvMv">ABC News</a> (TV)</p></li></ul></li><li><p>Tristan Harris <a href="https://www.youtube.com/watch?v=675d_6WGPbo">discussing</a> the dangers of unregulated AI development on The Daily Show.</p></li><li><p>Geoffrey Hinton on Jon Stewart&#8217;s <a href="https://youtu.be/jrK3PsD3APk?si=oZz267lmcNgR8f2t">podcast</a>.</p></li></ul>]]></content:encoded></item><item><title><![CDATA[Say No To Superintelligent AI]]></title><description><![CDATA[Take 30 seconds to join our new campaign.]]></description><link>https://pauseai.substack.com/p/say-no-to-superintelligent-ai</link><guid 
isPermaLink="false">https://pauseai.substack.com/p/say-no-to-superintelligent-ai</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Thu, 25 Sep 2025 18:10:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Pb_o!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Following the recent launch of the book <em>If Anyone Builds It, Everyone Dies</em>, PauseAI held an unofficial launch party in London. This event will be followed by readings in San Francisco, Berlin, New York, and many more cities around the world.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Pb_o!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Pb_o!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 424w, https://substackcdn.com/image/fetch/$s_!Pb_o!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 848w, https://substackcdn.com/image/fetch/$s_!Pb_o!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 1272w, 
https://substackcdn.com/image/fetch/$s_!Pb_o!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Pb_o!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:8130970,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/174531121?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Pb_o!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 424w, https://substackcdn.com/image/fetch/$s_!Pb_o!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 848w, 
https://substackcdn.com/image/fetch/$s_!Pb_o!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 1272w, https://substackcdn.com/image/fetch/$s_!Pb_o!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0ab38e5-f978-438e-acf9-3976afd95552_3440x1935.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><em>If Anyone Builds It, Everyone Dies</em> being read at our event in London</figcaption></figure></div><p>It
needs to be made clear that a growing group of people from a wide range of backgrounds are taking the book&#8217;s warning seriously, and want governments to stop AI companies&#8217; race to build superintelligent AI.</p><p>That&#8217;s why we asked those at the London book reading to join our <strong>Say No</strong> campaign. </p><p>Attendees uploaded pictures of themselves, which will be used in a large collage to display a united stance against unregulated AI development.</p><p><strong>We&#8217;re now asking you to take 30 seconds to say </strong><em><strong>no</strong></em><strong> to superintelligent AI.</strong> The more people that take part, the clearer the message.</p><p>Click <a href="https://pauseai.info/sayno">here</a> to upload your photo.</p><h2>Further book events</h2><p>In London, we read a few sections from the <a href="https://ifanyonebuildsit.com/">book</a>, including Chapter 14: <em>Where There&#8217;s Life, There&#8217;s Hope.</em> It closes out the book with a dismissal of defeatism, and calls on readers to take action.</p><p>Many individuals and organisations are already working hard to make progress towards an international treaty. 
After the book reading, <a href="https://controlai.com/">ControlAI</a> spoke about the need for a popular movement to build support for vital regulation, and PauseAI UK discussed their <a href="https://time.com/7313320/google-deepmind-gemini-ai-safety-pledge/">recent</a> campaigns.</p><p>There are more book events coming up over the next few weeks:</p><ul><li><p>Saturday 4th October:</p><ul><li><p><a href="https://luma.com/boyte8ot">Berlin, Germany</a></p></li><li><p><a href="https://luma.com/1h4nc48h">San Francisco, United States</a></p></li></ul></li><li><p>Tuesday 7th October:</p><ul><li><p><a href="https://luma.com/tw6clgd4">Canberra, Australia</a></p></li></ul></li><li><p>Wednesday 8th October:</p><ul><li><p><a href="https://luma.com/rw8803di">Phoenix, United States</a></p></li></ul></li><li><p>Thursday 9th October:</p><ul><li><p><a href="https://luma.com/brtorpxh">Pittsburgh, United States</a></p></li></ul></li><li><p>Saturday 11th October:</p><ul><li><p><a href="https://luma.com/asa28ws0">New York, United States</a></p></li></ul></li></ul><p>We hope to see you there!</p><p></p>]]></content:encoded></item><item><title><![CDATA[A new phase for PauseAI]]></title><description><![CDATA[We're hiring for four paid positions.]]></description><link>https://pauseai.substack.com/p/a-new-phase-for-pauseai</link><guid isPermaLink="false">https://pauseai.substack.com/p/a-new-phase-for-pauseai</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Fri, 12 Sep 2025 14:20:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!c_om!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We&#8217;re delighted to announce that PauseAI has received funding from the Future of Life Institute.</p><p>This significant boost to our funds will allow us to be bigger, better, and louder than ever 
before.</p><p>Our goal remains the same as it was when Joep founded PauseAI in May 2023 - to work towards a global pause on the development of frontier AI models. Over the last two years, we&#8217;ve grown to over 600 members, have established chapters in 13 countries, and are standing up to reckless AI development on multiple fronts.</p><p>In the US, PauseAI volunteers and others contacted their Senators to get the 10-year moratorium on state AI regulation removed from Trump&#8217;s Big Beautiful Bill. The provision was defeated by a 99-1 vote. Following Google DeepMind's violation of the Frontier AI Safety Commitments, PauseAI UK organised the largest AI safety protest ever outside DeepMind&#8217;s London office. We've since secured support from 60 politicians from over 10 parties for PauseAI's <a href="https://time.com/7313320/google-deepmind-gemini-ai-safety-pledge/">open letter</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!c_om!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!c_om!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 424w, https://substackcdn.com/image/fetch/$s_!c_om!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!c_om!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!c_om!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!c_om!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4249904,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/173359699?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!c_om!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!c_om!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 848w, https://substackcdn.com/image/fetch/$s_!c_om!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!c_om!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e03d604-03e2-4ba2-a6d3-c90006237caa_6000x3376.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>We&#8217;ll now be taking our efforts to the next level. We&#8217;re hiring for four paid positions, including a Policy Director, leads of our UK and French chapters, and a Community Manager. You can find more details on these roles <a href="https://pauseai.info/vacancies">here</a>.</p><p>Through 2025 and 2026, we&#8217;ll be focusing on continuing to grow our national chapters, running more PauseCon events across several countries, and reaching new audiences through targeted campaigns, social media content, and advertising.</p><p>AI lobbyists recently announced they&#8217;ll be spending over $100 million to fight AI regulation. They know they can&#8217;t beat us fairly. They know they&#8217;ll have to spend millions to have a chance of preventing regulation. As Rob Wiblin recently <a href="https://x.com/robertwiblin/status/1960633193712718140">tweeted</a>:</p><blockquote><p>&#8220;The advantage the AI industry has is <strong>extraordinary amounts of money</strong> to blow on lobbying. The advantage its opponents have is that <strong>public opinion on AI is very negative and just unmobilised as yet</strong>. Unclear which hand you'd rather have to play.&#8221;</p></blockquote><p>The <a href="https://pauseai.info/polls-and-surveys">polling</a> is clear &#8211; the public do not want AI companies to be allowed to continue their race to superintelligence. Public opinion is on the side of common sense regulation.
Let&#8217;s make it count.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/subscribe?"><span>Subscribe now</span></a></p><h2>Hunger strikes at Google DeepMind and Anthropic</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dLm-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dLm-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 424w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 848w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!dLm-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dLm-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 424w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 848w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!dLm-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F966464ae-b261-43fe-89c9-9774508b87f3_1600x1200.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>On Thursday the 4th, Guido Reichstadter announced he was on the third day of a hunger strike outside the offices of AI company Anthropic in San Francisco. He was demanding that Anthropic stop their &#8220;reckless actions&#8221;, and that management and employees do &#8220;everything in their power&#8221; to halt the race to artificial general intelligence.</p><p>On Friday the 5th, Micha&#235;l Trazzi independently announced he was beginning a hunger strike outside Google DeepMind&#8217;s office in London, also demanding an end to the race to AGI.</p><p>On Sunday the 7th, Denys Sheremet flew from Amsterdam to join Micha&#235;l on his hunger strike.</p><p>PauseAI is not behind these hunger strikes &#8211; the three are acting on their own behalf.
It's difficult to know which tactics are the most effective, but we have a huge amount of respect for anyone who is willing to do uncomfortable things to alert the public to the threat to their lives.</p><p>The hunger strikes have already generated a significant amount of <a href="https://futurism.com/ai-hunger-strike-anthropic">media</a> <a href="https://www.businessinsider.com/hunger-strike-deepmind-ai-threat-fears-agi-demis-hassabis-2025-9">attention</a>, and we hope they can bring Demis Hassabis and Dario Amodei out to explain why they&#8217;re continuing to race towards AGI, despite both of them being on record saying that it may kill every man, woman, and child on the planet. Perhaps more importantly, we hope ordinary people can see how serious this situation is and can turn their fear into action. We also hope politicians will recognise that they can&#8217;t afford to ignore this issue any longer and will step up to protect their constituents.</p><h2>Open Letter to Google DeepMind</h2><p>Our open letter demanding Google DeepMind address their violation of the Frontier AI Safety Commitments was signed by 60 UK politicians, with another MP adding their name this week, and was published in <a href="https://time.com/7313320/google-deepmind-gemini-ai-safety-pledge/">Time</a>. We were delighted to see so many lawmakers across the political spectrum take a strong stance in favour of AI safety, and refuse to stand aside whilst AI companies break important promises.</p><p>You can see more details about the letter <a href="https://pauseai.substack.com/p/breaking-60-uk-politicians-sign-pauseais">here</a>.</p><h2>Brussels PauseCon</h2><p>Earlier this year, we held the inaugural PauseCon in London.
Over 50 people attended over the weekend to listen to talks from the likes of Connor Leahy and panel discussions with Rob Miles, David Krueger, and other speakers, to take part in workshops on organising and activism, and to get to know others who are joining the movement against unregulated AI development.</p><p>We&#8217;re pleased to announce that the second PauseCon will be held in Brussels from the 11th to the 13th of December. As with PauseCon London, we&#8217;re able to offer optional accommodation to those who will be travelling into Brussels, so make sure to sign up <a href="https://docs.google.com/forms/d/e/1FAIpQLScSRqKMAMo5l6-0BHjTdAWMLPmWCNiBCFXDl76FQ1X0OccGnA/viewform">here</a> soon to secure your spot.</p><p>Contact our Organizing Director, <a href="mailto:ella@pauseai.info">Ella</a>, if you&#8217;d like to present or have any particular thoughts about what you&#8217;d like to see.</p><h2>PauseAI events in support of <em>If Anyone Builds It, Everyone Dies</em></h2><p><em><strong>If Anyone Builds It, Everyone Dies</strong></em> has the potential to ignite public awareness of the extinction threat posed by AI, and cement averting it as a top priority for politicians and voters the world over.</p><p>We&#8217;re hosting a series of events across several countries for people to unite in their desire to act.</p><p>Eliezer Yudkowsky and Nate Soares&#8217; <a href="https://ifanyonebuildsit.com/">book</a> clearly lays out the dangers of the unregulated race to superintelligent AI. To solve this problem, we must stare it in the face. We must inform others of this threat, and work with them to push forward regulation that will protect us.</p><p>It&#8217;s possible that the coming weeks will be the biggest moment for AI x-risk awareness since the release of the Future of Life Institute&#8217;s Pause letter.
It&#8217;s up to us to make the most of that.</p><p>PauseAI UK is collaborating with ControlAI to host an unofficial launch party in London on the 22nd of September. You can find this and other upcoming events on our <a href="https://luma.com/PauseAI">Luma</a> calendar, with more to be announced soon.</p><h2>Small Update to Our Statement</h2><p>After some feedback, we&#8217;re making a minor adjustment to our public <a href="https://pauseai.info/statement">statement</a>. Our position has not changed; this is just about clarity. The statement will now read as follows.</p><blockquote><p>&#8220;We call on the governments of the world to sign an international treaty implementing a pause on the training of the most powerful general AI systems, until we know how to build them safely and keep them under democratic control.&#8221;</p></blockquote><p>We have removed the word &#8220;temporary&#8221;, as some reasonably felt it suggested a prediction that the time it would take to satisfy the <em>&#8220;until we know how to build them safely&#8221; </em>condition would be quite short.
Regardless of any individual&#8217;s thoughts on the difficulty of the alignment problem, PauseAI&#8217;s position has always been that, whilst it remains unsolved, we simply should not continue to race to develop increasingly powerful and uncontrollable AI.</p><p>Over 700 of you have already signed the statement, and, as we&#8217;ve made a minor adjustment, you&#8217;re welcome to send an email to <a href="mailto:info@pauseai.info">info@pauseai.info</a> if you wish to remove your name.</p><h2>Other news</h2><ul><li><p>The Chinese ambassador to the US <a href="https://www.scmp.com/news/china/diplomacy/article/3300738/china-and-us-need-cooperate-ai-or-risk-opening-pandoras-box-ambassador-warns">calls</a> for international cooperation on AI governance to avoid &#8220;opening Pandora&#8217;s box&#8221;.</p></li><li><p>AI lobbyists <a href="https://www.wsj.com/politics/silicon-valley-launches-pro-ai-pacs-to-defend-industry-in-midterm-elections-287905b3">announced</a> they&#8217;ll be spending over $100 million to fight AI regulation.</p></li><li><p>OpenAI <a href="https://www.economist.com/science-and-technology/2025/08/08/openais-latest-step-towards-advanced-artificial-intelligence">released</a> their latest model, GPT-5, and despite some talk of an &#8220;AI plateau&#8221;, GPT-5 was right on trend for METR&#8217;s <a href="https://metr.org/blog/2025-03-19-measuring-ai-ability-to-complete-long-tasks/">time horizon benchmark</a>, accomplishing 50% of tasks that take a human 2 hours and 17 minutes to complete, a figure that has been doubling every 7 months (although that may have shortened to just 4 months).</p></li><li><p>Californian AI safety bill SB 53 will likely <a href="https://www.transformernews.ai/p/sb-53-california-ai-might-actually-pass-newsom">soon</a> go to Governor Newsom for approval, and has received an endorsement from <a href="https://www.anthropic.com/news/anthropic-is-endorsing-sb-53">Anthropic</a>.</p></li><li><p>Liz Kendall <a 
href="https://www.uktech.news/news/government-and-policy/who-is-new-tech-secretary-liz-kendall-20250908">replaces</a> Peter Kyle as the UK&#8217;s tech minister</p></li></ul><h2>What we&#8217;ve been watching/reading</h2><ul><li><p>Roman Yampolskiy appeared on Steven Bartlett&#8217;s <a href="https://www.youtube.com/watch?v=UclrVWafRAI">podcast</a> to discuss the extinction threat posed by superintelligent AI, and suggest that people join organisations like PauseAI if they wish to take action</p></li><li><p>YouTuber Professor Dave Explains - <a href="https://www.youtube.com/watch?v=SrPo1sGwSAc">&#8220;Will Artificial Intelligence Destroy Humanity?&#8221;</a>, a video made in collaboration with ControlAI</p></li><li><p>Interviews with those on hunger strikes:</p><ul><li><p><a href="https://www.youtube.com/watch?v=stTyN2b5XAI">For Humanity Pod</a></p></li><li><p>One from me (Tom) on <a href="https://www.youtube.com/watch?v=stTyN2b5XAI">Day 3</a> of Michael&#8217;s hunger strike</p></li></ul></li><li><p><a href="https://www.youtube.com/watch?v=c82xuCSx_9k">Siliconversations</a> on FLI&#8217;s AI Safety Index</p></li><li><p>Billy Perrigo in <a href="https://time.com/7312305/agi-race-us-china-trump/">Time</a> - <em>The Race for Artificial General Intelligence Poses New Risks to an Unstable World</em></p></li><li><p>A <a href="https://www.youtube.com/watch?v=7SDeeAHAAZ4">video</a> from YouTube channel <em>Species | Documenting AGI </em>on how AI companies manufacture fear about &#8216;losing the race&#8217; to China to downplay the need for regulation</p></li></ul>]]></content:encoded></item><item><title><![CDATA[BREAKING: 60 UK politicians sign PauseAI's open letter to Google Deepmind]]></title><description><![CDATA[Google DeepMind violated the Frontier AI Safety Commitments. 
Lawmakers from across the political spectrum aren't having it.]]></description><link>https://pauseai.substack.com/p/breaking-60-uk-politicians-sign-pauseais</link><guid isPermaLink="false">https://pauseai.substack.com/p/breaking-60-uk-politicians-sign-pauseais</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Fri, 29 Aug 2025 15:43:37 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!gl1g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We&#8217;re excited to announce our <a href="https://pauseai.info/dear-sir-demis-2025">open letter</a> to Google DeepMind, signed by 4 civil society organisations and 60 UK politicians from more than 10 parties.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gl1g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gl1g!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 424w, https://substackcdn.com/image/fetch/$s_!gl1g!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 848w, https://substackcdn.com/image/fetch/$s_!gl1g!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 1272w, 
https://substackcdn.com/image/fetch/$s_!gl1g!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gl1g!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png" width="1280" height="1280" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1280,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:558274,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/172268433?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gl1g!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 424w, https://substackcdn.com/image/fetch/$s_!gl1g!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 848w, 
https://substackcdn.com/image/fetch/$s_!gl1g!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 1272w, https://substackcdn.com/image/fetch/$s_!gl1g!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff177c2e2-e19e-4ed4-9e87-344191867d9c_1280x1280.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Following the largest AI safety protest of all time outside Google DeepMind&#8217;s London offices in June, PauseAI volunteers 
in the UK have been contacting their representatives, informing them of Google&#8217;s failure to stick to the Frontier AI Safety Commitments, and inviting them to sign our open letter demanding that Google address their lack of transparency.</p><div id="youtube2-AaowO0rLvao" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;AaowO0rLvao&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/AaowO0rLvao?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Covering the letter in <a href="https://time.com/7313320/google-deepmind-gemini-ai-safety-pledge/">Time Magazine</a>, Harry Booth lays out how Google failed to release adequate safety information upon the release of Gemini 2.5 Pro, and did not disclose which external third-party testers had been given access to the model. Google claimed that, as the model was just an &#8220;experimental&#8221; release, the AI safety commitments didn&#8217;t apply. Experimental or not, the model was still publicly available for anyone to use.</p><p>This flies in the face of common sense. Our letter asks Google to establish a clear and transparent definition of &#8220;deployment&#8221;, to publish specific timelines for the release of safety evaluation reports, and to clarify which government agencies and independent third parties are involved in testing.</p><p>Former Defence Secretary Lord Browne was one of the 60 politicians who signed our letter. 
</p><blockquote><p>"The Frontier AI Safety Commitments were a crucial first step in <strong>global AI governance.</strong> If leading companies like Google treat these commitments as optional, <strong>we risk a dangerous race to deploy increasingly powerful AI without proper safeguards.</strong> Transparency in AI safety testing is essential for public trust and democratic oversight."</p></blockquote><p>Baroness Kidron, who was a vocal source of opposition in the House of Lords to the Data (Use and Access) Bill, which lacked copyright protections for creatives, was another signatory.</p><blockquote><p>"<strong>Voluntary safety promises only work if they're transparent.</strong> It is important to understand the timeline, know the identity of those who have tested it, and have faith in the process. <strong>Safety cannot be a secret.</strong> Like any AI company, Google must publish the details of their testing procedure."</p></blockquote><p>PauseAI UK Director Joseph Miller said &#8220;it's insane what PauseAI volunteers can achieve by emailing their politicians.&#8221; He explained Google&#8217;s broken promises in this <a href="https://www.youtube.com/watch?v=5gnVzEoVDmk">video</a>.</p><div id="youtube2-5gnVzEoVDmk" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;5gnVzEoVDmk&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/5gnVzEoVDmk?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you&#8217;re in the UK, you can use our <a href="https://pauseai.info/uk-email-mp">email builder</a> to ask your MP to add their name to our letter.</p><p>We&#8217;d also love it if you could share and engage with our announcements on our social media channels: <a 
href="https://x.com/PauseAI/status/1961426832130981998">X</a>, <a href="https://www.instagram.com/p/DN8Tuc8AHes/?img_index=1">Instagram</a>, <a href="https://www.linkedin.com/posts/pauseai_breaking-60-uk-politicians-have-signed-pauseais-activity-7367200121052209152-ZuRI?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAF38dt8BjZ6wiTsTSW5lICfBYiYSAgZNsos">LinkedIn</a>, <a href="https://www.threads.com/@pause_ai/post/DN8S8KbD679">Threads</a>, <a href="https://bsky.app/profile/did:plc:poltokxz5ydxqrg6tj2lll3o/post/3lxkb2qu4w52m">Bluesky</a>, and <a href="https://www.facebook.com/425659300516462/posts/702031592879230">Facebook</a>.</p><p>Some notable signatories of the letter include:</p><ul><li><p><strong>Lord Browne</strong> (Labour, former Defence Secretary)</p></li><li><p><strong>Baroness Kidron</strong> (Crossbench peer, digital rights advocate)</p></li><li><p><strong>Lord McNally</strong> (Liberal Democrat, former Justice Minister)</p></li><li><p>All 4 Green Party Members of Parliament: <strong>Carla Denyer, Si&#226;n Berry, Adrian Ramsay, Ellie Chowns</strong></p></li><li><p>All 4 Plaid Cymru Members of Parliament: <strong>Ben Lake, Ann Davies, Llinos Medi, Liz Saville Roberts</strong></p></li><li><p><strong>Desmond Swayne MP</strong> (Conservative, former Minister of State for International Development)</p></li><li><p><strong>Baroness Chakrabarti </strong>(Labour, former director of human rights group Liberty)</p></li><li><p><strong>Baroness Morris</strong> (Labour, former Education Secretary)</p></li><li><p><strong>Lord Cashman</strong> (Non-affiliated, co-founder of Stonewall)</p></li><li><p><strong>Baroness Foster of Aghadrumsee</strong> (Non-affiliated, former First Minister of Northern Ireland and former leader of the Democratic Unionist Party)</p></li></ul><p>Also signing the letter are civil society organisations <strong>Open Rights Group</strong>, <strong>Connected by Data</strong>, <strong>The Safe AI for Children Alliance</strong>, and 
<strong>Open Data Manchester</strong>. You can see the full list of signatories <a href="https://pauseai.info/dear-sir-demis-2025">here</a>.</p><p>This marks a huge moment for PauseAI and for political action on AI safety in the UK. AI companies cannot continue to get away with broken promises. It&#8217;s becoming increasingly clear that voluntary commitments are not sufficient to keep the public safe.</p>]]></content:encoded></item><item><title><![CDATA[Why we reported OpenAI to the Australian Federal Police]]></title><description><![CDATA[PauseAI Australia take on ChatGPT agent's capacity to help novices create bioweapons, and politicians join our call on Google DeepMind to address their violation of the Frontier AI Safety Commitments.]]></description><link>https://pauseai.substack.com/p/why-we-reported-openai-to-the-australian</link><guid isPermaLink="false">https://pauseai.substack.com/p/why-we-reported-openai-to-the-australian</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Mon, 04 Aug 2025 19:05:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/youtube/w_728,c_limit/-YPhNdpA8Rk" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>PauseAI volunteers have questioned whether OpenAI&#8217;s release of ChatGPT agent is in violation of Australia&#8217;s bioweapons laws.</p><p>OpenAI <a href="https://fortune.com/2025/07/18/openai-chatgpt-agent-could-aid-dangerous-bioweapon-development/">classified</a> their new model under &#8220;High Biological and Chemical capabilities&#8221;, and warned about the risk of it aiding in the development of biological weapons.</p><p>The Crimes (Biological Weapons) Act 1976 implements Australia&#8217;s commitments to the international Biological Weapons Convention. 
It makes it illegal to assist in the development of bioweapons, and can apply to any company whose services are available in Australia.</p><p>We&#8217;ve brought the matter to the attention of the Australian Federal Police, and have written to Michelle Rowland MP, Australia&#8217;s Attorney General.</p><p>Volunteer David Gould explained more in this <a href="https://www.youtube.com/watch?v=-YPhNdpA8Rk">interview</a>.</p><div id="youtube2--YPhNdpA8Rk" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;-YPhNdpA8Rk&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/-YPhNdpA8Rk?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>PauseAI Australia is one of our 13 national chapters around the world. National campaigns such as this one help to engage volunteers, raise awareness of AI risk, and establish our presence locally. You can get involved with any of our existing national groups, or we can help you start your own if there isn&#8217;t one in your country! 
Get involved <a href="https://pauseai.info/join">here</a>.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/subscribe?"><span>Subscribe now</span></a></p><h3>Our DeepMind campaign heats up</h3><p>Our recent demonstration in London was the largest AI safety protest of all time, as we gathered outside Google DeepMind&#8217;s office to hold them accountable for their failure to stick to the Frontier AI Safety Commitments they signed in 2024.</p><p>The UK government is yet to acknowledge this violation, but the work of volunteers has raised this issue with politicians across the country. Many have signed our open letter calling on DeepMind to establish clear and transparent testing procedures in line with their voluntary commitments.</p><p>We&#8217;ve welcomed this political appetite to ensure AI companies aren&#8217;t able to backtrack on promises made to the government, and are continuing to engage with politicians on this matter.</p><p>PauseAI UK Director, Joseph Miller, spoke about the open letter, PauseCon, and the future of PauseAI in this <a href="https://www.youtube.com/watch?v=5gnVzEoVDmk">video</a>.</p><div id="youtube2-5gnVzEoVDmk" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;5gnVzEoVDmk&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/5gnVzEoVDmk?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><h2>PauseCon Brussels</h2><p>Following on from the success of PauseCon London, we are now seeking 
expressions of interest for the next PauseCon event, to be held in Brussels later this year. If you, or anyone you know, is interested, please complete this <a href="https://forms.gle/kkfkPPw8FxRa3K9J8">form</a>.</p><p>PauseCon London saw over 60 PauseAI volunteers attend, and culminated in our largest protest to date. If you're interested in being a part of the next event, let us know.</p><div id="youtube2-AaowO0rLvao" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;AaowO0rLvao&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/AaowO0rLvao?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>We are also looking for a team of volunteers to assist in organising the event. If you are interested, please contact our Organizing Director, <a href="mailto:ella@pauseai.info">Ella</a>.</p><h3>Yet more polling shows the public are worried about losing control to superintelligent AI</h3><p>Out of the 10,000 people polled across 5 countries in a <a href="https://report2025.seismic.org/">report</a> by the Seismic Foundation, 58% of them agreed that "if we build AI models smarter than us, we will inevitably lose control over them".</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cHUT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!cHUT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 424w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 848w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 1272w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cHUT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png" width="680" height="680" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:680,&quot;width&quot;:680,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!cHUT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 424w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 848w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 1272w, https://substackcdn.com/image/fetch/$s_!cHUT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfaef2bc-5943-48a7-b22e-e99b46b0009d_680x680.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>A plurality agreed that &#8220;we should not pursue artificial general intelligence and should stop all technical development in this area&#8221; (41% agree, 20% disagree). Arguably, the radical wording of this question (&#8220;stop <strong>all</strong> technical development&#8221;) somewhat reduced support. Previous polling has found <a href="https://controlai.com/polls">60% support</a> for a global treaty banning smarter-than-human AI (UK, 2025) and <a href="https://www.sentienceinstitute.org/aims-survey-2023#2023-results">58% support</a> for &#8220;banning the development of artificial general intelligence that is smarter than humans&#8221; (US, 2023).</p><p>Far more people think there is too little regulation of AI than think there is too much.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Oc6I!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Oc6I!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 424w, https://substackcdn.com/image/fetch/$s_!Oc6I!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 848w, 
https://substackcdn.com/image/fetch/$s_!Oc6I!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 1272w, https://substackcdn.com/image/fetch/$s_!Oc6I!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Oc6I!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png" width="581" height="497" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:497,&quot;width&quot;:581,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Oc6I!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 424w, https://substackcdn.com/image/fetch/$s_!Oc6I!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 848w, 
https://substackcdn.com/image/fetch/$s_!Oc6I!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 1272w, https://substackcdn.com/image/fetch/$s_!Oc6I!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84fb441f-e9c5-41fb-ac06-84d2acc1250a_581x497.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Other results:</p><ul><li><p>53% think "AI labs are playing god when trying to build superintelligence".</p></li><li><p>60% of students 
think &#8220;AI will make entry level jobs less common&#8221;.</p></li><li><p>57% of students feel &#8220;daunted by what the future of work looks like&#8221;.</p></li><li><p>53% think &#8220;AI development is progressing too fast for society to evolve safely&#8221;.</p></li><li><p>60% of people are worried about AI replacing human relationships.</p></li></ul><h3>Other news</h3><ul><li><p>How would a global treaty pausing frontier AI development be enforced and verified? A <a href="https://www.arxiv.org/abs/2506.20530">paper</a> authored by volunteers from PauseAI and AI Safety Camp has built upon similar work from the Machine Intelligence Research Institute and others to explain our options.</p></li><li><p>The Trump administration published their <a href="https://www.whitehouse.gov/wp-content/uploads/2025/07/Americas-AI-Action-Plan.pdf">AI Action Plan</a>. Despite an unfortunate tagline of &#8220;Winning the Race&#8221; and the total absence of any mention of artificial general intelligence, the action plan did include sections on location verification forming a part of chip export controls, biosecurity, and a call for investment in AI interpretability and control research.</p></li><li><p>The UK government <a href="https://www.gov.uk/government/news/ai-security-institute-launches-international-coalition-to-safeguard-ai-development">announced</a> &#163;15m worth of funding for alignment research. 
In the press release, the UK AI Security Institute gave a frank assessment of the state of our alignment plans, saying they are &#8220;likely to be insufficient for tomorrow&#8217;s more capable systems&#8221;, and outlined the need for co-ordinated global action to ensure the long-term safety of citizens.</p></li><li><p>Chinese premier Li Qiang <a href="https://www.theguardian.com/technology/2025/jul/26/china-calls-for-global-ai-cooperation-days-after-trump-administration-unveils-low-regulation-strategy">called</a> for international cooperation on AI due to concerns about the security risks.</p></li><li><p>US Senator Bernie Sanders <a href="https://gizmodo.com/bernie-sanders-reveals-the-ai-doomsday-scenario-that-worries-top-experts-2000628611">spoke</a> out about the &#8220;doomsday scenario&#8221; that experts have warned of. <em>&#8220;This is not science fiction. There are very, very knowledgeable people&#8212;and I just talked to one today&#8212;who worry very much that human beings will not be able to control the technology, and that artificial intelligence will in fact dominate our society.&#8221;</em></p></li><li><p>Meta <a href="https://www.cnbc.com/2025/07/18/meta-europe-ai-code.html">refused</a> to sign on to the EU&#8217;s General Purpose AI Code of Practice, despite OpenAI, Anthropic, Google, and others doing so.</p></li><li><p>The Future of Life Institute updated their <a href="https://futureoflife.org/ai-safety-index-summer-2025/#:~:text=The%20Summer%202025%20version%20of,practices%2C%20spanning%20six%20critical%20domains.">AI Safety Index</a>, with no AI company scoring above a &#8216;D&#8217; grade on &#8216;existential safety&#8217;.</p></li></ul><h3>What we&#8217;ve been watching/reading</h3><ul><li><p>US Director Holly Elmore was interviewed by The Times for an <a href="https://www.thetimes.com/us/news-today/article/why-how-ai-lead-end-humanity-nx8zjhgft">article</a> titled &#8220;Experts predict AI will lead to the extinction
of humanity&#8221;.</p></li><li><p>Comms Director Tom Bibby <a href="https://www.youtube.com/watch?v=fdfvTyQpz5w">appeared</a> on UK radio station LBC News to discuss Grok&#8217;s antisemitism.</p></li><li><p>Founder Joep Meindertsma was interviewed for an <a href="https://www.corporatecrimereporter.com/news/200/joep-meindertsma-on-the-existential-threat-posed-by-artificial-intelligence/">article</a> by <em>Corporate Crime Reporter.</em></p></li><li><p>An <a href="https://ia.acs.org.au/article/2025/is-the-new-chatgpt-agent-really-a-weapons-risk-.html">article</a> on PauseAI Australia reporting OpenAI to the Federal Police, featuring co-director Mark Brown.</p></li><li><p>Geoffrey Hinton <a href="https://www.youtube.com/watch?v=IkdziSLYzHw">speaks</a> at The Royal Institution, opening with &#8220;If you sleep well tonight, you may not have understood this lecture&#8221;.</p></li><li><p>A hopeful <a href="https://www.theguardian.com/commentisfree/ng-interactive/2025/jul/21/human-level-artificial-intelligence">article</a> by Garrison Lovely on why human-level AI is not inevitable.</p></li><li><p><a href="https://www.youtube.com/watch?v=pa0EpXiPm_E">The ASI Survival Handbook</a> - a talk from Connor Leahy at PauseCon last month.</p></li><li><p>Stephen Fry <a href="https://www.youtube.com/watch?v=0BnZMeFtoAM">talks</a> with Yuval Noah Harari on the difficulty of controlling powerful AI models.</p></li><li><p>The team at 80,000 Hours released a brilliant <a href="https://www.youtube.com/watch?v=5KVDDfAkRgc">video</a> covering AI 2027, which has already amassed almost 3 million views despite being the first video on their new channel!</p></li></ul><p>Thanks for reading the August edition of the PauseAI newsletter; we&#8217;ll see you next month!</p>]]></content:encoded></item><item><title><![CDATA[We held the largest AI safety protest ever outside Google DeepMind’s office]]></title><description><![CDATA[Our biggest protest yet, PauseCon, and a victory on state AI regulation
in the US.]]></description><link>https://pauseai.substack.com/p/we-held-the-largest-ai-safety-protest</link><guid isPermaLink="false">https://pauseai.substack.com/p/we-held-the-largest-ai-safety-protest</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Sat, 05 Jul 2025 09:30:39 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!C58y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!C58y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!C58y!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 424w, https://substackcdn.com/image/fetch/$s_!C58y!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 848w, https://substackcdn.com/image/fetch/$s_!C58y!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!C58y!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!C58y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2786227,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/167544993?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!C58y!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 424w, https://substackcdn.com/image/fetch/$s_!C58y!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 848w, https://substackcdn.com/image/fetch/$s_!C58y!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!C58y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F264bfdc6-5b79-4961-80b2-dcd2d79ba39a_6000x3376.jpeg 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">PauseAI Protesters outside Google DeepMind&#8217;s London offices</figcaption></figure></div><p>Google DeepMind were faced with PauseAI protesters outside their London headquarters, after the company broke their promises on transparency.</p><p>As the largest PauseAI protest yet (and the largest AI safety protest of all time as far as we know), many newcomers came along to express their concerns with reckless AI development.</p><p>DeepMind&#8217;s broken promises were under fire. Chants such as &#8220;DeepMind, DeepMind, can't you see?
Your AI threatens you and me!&#8221; turned heads the next street over.</p><div id="youtube2-AaowO0rLvao" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;AaowO0rLvao&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/AaowO0rLvao?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>DeepMind signed the Frontier AI Safety Commitments at the Seoul AI Summit in 2024, which committed them to be transparent about how (and if) external bodies are involved in testing their models before deployment. Barely a year later, and they&#8217;ve already broken that promise.</p><p>With the release of Gemini 2.5 Pro in March of this year, DeepMind failed to release any information on safety testing. They eventually released a model card which included a reference to &#8220;third party external testers&#8221;, but provided no details on who those third parties were. 
Read the full report on DeepMind&#8217;s broken promises <a href="https://pauseai.info/google-deepmind-broken-promises">here</a>.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/subscribe?"><span>Subscribe now</span></a></p><p>Member of Parliament Iqbal Mohamed <a href="https://www.youtube.com/watch?v=t8n-OY0h9gM">joined us</a> in our call for sensible AI regulation, as he called on the governments of the world to &#8220;step up and protect the people that elected them&#8221;.</p><p>Referencing Geoffrey Hinton&#8217;s concerns about the existential threat of AI, Mohamed urged protesters to continue to pressure MPs like himself, and encouraged the media to take this issue seriously.</p><blockquote><p>&#8220;AI done correctly could be humanity&#8217;s saviour, but if it&#8217;s done incorrectly and badly, it will lead to our destruction.&#8221;</p></blockquote><p>You can read coverage of our protest from Business Insider <a href="https://www.businessinsider.com/protesters-accuse-google-deepmind-breaking-promises-ai-safety-2025-6">here</a>.</p><p>On a personal note, I (Tom) can say that the energy at this protest was amazing. I&#8217;m still on such a high four days later and want to thank everyone for coming! It was great to see so many people from so many different backgrounds come along and stand up for humanity.
Special thanks goes out to the incredible volunteers who assisted with van driving, acting in our mock trial, leading our chants, making signs, making sure there was plenty of water available on the hottest day of the year, taking photos, and a hundred other things that made the day a success.</p><h2>Help us pile the pressure on DeepMind</h2><p>You can help to drive the political momentum to demand that DeepMind stick to their commitments.</p><p>Our <a href="https://drive.google.com/file/d/1eKB2Qq23sHkQ92Lksk9b4bzSnjzbFG-I/view">open letter</a> calls on Google DeepMind to establish clear definitions of &#8220;deployment&#8221;, to publish specific timelines for the release of safety evaluation reports, and to clarify which third-party organisations (including government agencies) are involved in testing.</p><p>We&#8217;ve been pleasantly surprised by the political appetite to sign our open letter, but must now double our efforts to get even more signatories and to push these changes over the line.
If you&#8217;re in the UK, you can contact your MP to tell them why DeepMind&#8217;s failure to stick to the Frontier AI Safety Commitments is unacceptable, and why you're concerned about their reckless practices.</p><p>We have an email guide available <a href="https://docs.google.com/document/d/1-hmQjfvJbOqKhAI2EMGvBa17jWGBJFR818aTF0lTlE0/edit?tab=t.0#heading=h.rgq7u5852w65">here</a>.</p><h2>PauseCon</h2><p>Over the weekend leading up to the DeepMind protest, sixty PauseAI volunteers participated in our first ever PauseCon.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YNQt!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YNQt!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 424w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 848w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!YNQt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg" width="1456" height="970" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!YNQt!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 424w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 848w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!YNQt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e92e927-bf94-4326-8c8e-1f14d225d3c2_1600x1066.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset 
pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Connor Leahy speaks at PauseCon</figcaption></figure></div><p>The event featured speakers including Connor Leahy, Rob Miles, Kat Woods, and PauseAI founder Joep Meindertsma. Volunteers participated in workshops where they focused on messaging, recruiting and grassroots lobbying.</p><p>We&#8217;re still collecting feedback from attendees, but some of the reviews so far include:</p><ul><li><p>&#8220;A great place to connect with like minded people, learn techniques that help, and have a direct impact through a protest, on a troubling but important topic&#8221;</p></li><li><p>&#8220;Really inspiring and motivational to meet with people who think similarly and want to take action. 
I also appreciated that there was hands-on experience with flyering, writing emails/videos.&#8221;</p></li><li><p>&#8220;An essential learning event for AI activism&#8221;</p></li></ul><p>We hosted attendees from the UK, USA, France, Germany, Poland, Brazil, Belgium, and the Netherlands.</p><p>Keep an eye on our <a href="https://www.youtube.com/@PauseAI/">YouTube channel</a>, where the talks will be uploaded soon.</p><h2>10-year ban on state AI regulation defeated</h2><p>A huge victory for humanity came in the hours following our protest as the US Senate voted 99-1 in favour of an amendment to remove a ban on state AI regulation from Trump&#8217;s Big Beautiful Bill.</p><p>Senator Marsha Blackburn&#8217;s amendment was almost unanimously supported by Senators from both sides of the aisle.</p><p>Holly Elmore of PauseAI US, who organised a huge effort to get members of the public to inform their Senator of their opposition to this provision, said she was &#8220;prepared for the moratorium to pass.&#8221;</p><blockquote><p>&#8220;I was shocked by the 99 to 1 number. We only needed four Republicans, and we ended up getting almost all of them.&#8221;</p></blockquote><p>We recorded an <a href="https://www.youtube.com/watch?v=tu72YKkbdEQ">interview</a> with Holly detailing how the moratorium was defeated.</p><div id="youtube2-tu72YKkbdEQ" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;tu72YKkbdEQ&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/tu72YKkbdEQ?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><h4>Things going in the right direction</h4><p>In the United States, we&#8217;ve seen politicians become more worried by the threat of increasingly powerful AI. 
Peter Wildeford&#8217;s <a href="https://peterwildeford.substack.com/p/congress-has-started-taking-agi-more">blog post</a> showcases some encouraging quotes from Republican Marjorie Taylor Greene and Democrat Bernie Sanders, amongst others. Jill Tokuda, a representative for Hawaii, said the following:</p><blockquote><p>&#8220;<strong>Artificial superintelligence is one of the largest existential threats that we face right now</strong>. [...] Should we also be concerned that authoritarian states like China or Russia may lose control over their own advanced systems? [...] And is it possible that a loss of control by any nation-state, including our own, could give rise to an independent AGI or ASI actor that globally we will need to contend with?&#8221;</p></blockquote><h2>Attack on the EU AI Act</h2><p>The EU&#8217;s Artificial Intelligence Act, which includes provisions for general-purpose high-risk models, has come under attack from European companies in a recent <a href="https://www.politico.eu/article/top-european-ceos-plead-for-pause-in-ai-act/">open letter</a>.</p><p>General-purpose AI models, defined as those trained with computing power of at least 10^25 (or ten billion quadrillion) floating point operations, would be required to undergo safety evaluations and have major incidents reported to the European Commission.</p><p>Whilst some of the provisions of the EU AI Act have already entered into force, the rules on general-purpose models are set to come into play in August of this year. The tech lobbyists are proposing that that date be pushed back by 12 months. 
European Commission tech minister Henna Virkkunen said the EU &#8220;shouldn&#8217;t rule out&#8221; postponing the provisions for general-purpose models.</p><p>Further reporting from <a href="https://www.reuters.com/world/europe/artificial-intelligence-rules-go-ahead-no-pause-eu-commission-says-2025-07-04/">Reuters</a> claims that the European Commission will stick to the original date.</p><p>As capabilities continue to improve, frontier models are too dangerous to be left unregulated. Given the breakneck pace of AI development, delaying the implementation of the few regulatory safeguards we do have would be reckless. Just as with the attempt to ban state-level AI regulation in the United States, this stance from tech lobbyists in the European Union is not surprising. It&#8217;s up to the rest of us to call them out and demand protections for the public.</p><h2>Other news</h2><ul><li><p>Apollo Research released a <a href="https://www.apolloresearch.ai/blog/more-capable-models-are-better-at-in-context-scheming">paper</a> detailing the increased scheming capabilities of more capable models, and how models are increasingly aware of the fact they&#8217;re being evaluated (which makes these evaluations less useful in detecting misalignment and dangerous capabilities)</p></li><li><p>The <a href="https://www.openaifiles.org/">OpenAI Files</a> were released, collating concerns about the integrity of Sam Altman, the attempt to remove non-profit control, and the concerning lack of adequate safety practices</p><ul><li><p>The concerns about Sam Altman were also detailed in this <a href="https://www.youtube.com/watch?v=HCNXmPJvl48">video</a></p></li></ul></li><li><p>From China, we have more evidence that the government is willing to intervene to mitigate the dangers of the technology, challenging the view from some in the West that they wouldn&#8217;t cooperate to protect the citizens of all countries</p><ul><li><p>They <a 
href="https://www.theverge.com/news/682737/china-shuts-down-ai-chatbots-exam-season">shut down</a> some features of AI models during exam season to prevent students from cheating</p></li><li><p>A Beijing court <a href="https://www.chinadaily.com.cn/a/202506/19/WS6853e047a310a04af22c74d2.html">sentenced</a> four people to up to 18 months in prison after they used AI tools to infringe on the copyright of artists</p></li></ul></li><li><p>OpenAI <a href="https://www.axios.com/2025/06/18/openai-bioweapons-risk">warns</a> that upcoming models pose increased risks of aiding bioweapon creation</p></li><li><p>Pope Leo makes <a href="https://www.politico.eu/article/pope-leo-xiv-wants-stop-ai-playing-god/">&#8220;curtailing risks of runaway AI&#8221;</a> a key mission of his papacy</p></li></ul><h2>What we&#8217;ve been watching</h2><ul><li><p>Geoffrey Hinton on <a href="https://www.youtube.com/watch?v=giT0ytynSqg">Diary of a CEO</a></p></li><li><p>Yoshua Bengio on <a href="https://www.youtube.com/watch?v=c4Zx849dOiY">BBC Newsnight</a></p></li><li><p>Roman Yampolskiy on <a href="https://www.youtube.com/watch?v=j2i9D24KQ5k">Joe Rogan</a></p></li><li><p>Director of PauseAI US Holly Elmore on <a href="https://www.youtube.com/watch?v=6Gz_8SVax8I">Novara Media</a></p></li><li><p>Daniel Kokotajlo discussing AI 2027 on <a href="https://www.youtube.com/watch?v=5UAvECavmFA">Computerphile</a></p></li><li><p>Siliconversations released a <a href="https://www.youtube.com/watch?v=L9dBxww8PPk">great video</a> on his success in getting viewers to contact their representatives using ControlAI&#8217;s tool</p></li></ul><p>Thank you all for reading! The past month has been a busy but incredibly rewarding one for me personally.
I wish you all an enjoyable summer, and I hope to see you next month.</p>]]></content:encoded></item><item><title><![CDATA[How much more evidence do we need?]]></title><description><![CDATA[As people become increasingly aware of AI's dangerous capabilities, PauseAI gears up for its biggest protest yet.]]></description><link>https://pauseai.substack.com/p/how-much-more-evidence-do-we-need</link><guid isPermaLink="false">https://pauseai.substack.com/p/how-much-more-evidence-do-we-need</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Wed, 04 Jun 2025 16:37:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!F2IB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Blackmail, deception, and self-preservation.</p><p>More empirical evidence of models displaying these behaviours has brought widespread attention from the media and the public.</p><p>Anthropic&#8217;s new model, Claude 4, chose to <a href="https://www.axios.com/2025/05/23/anthropic-ai-deception-risk">resort to blackmail</a> in an attempt to avoid getting shut down. In a scenario set up to test Claude, it was given access to some emails revealing that a fictional engineer at the company was engaged in an extramarital affair. 
When Claude was told that this engineer would soon take it offline and replace it with a new system, it threatened to reveal the affair if the proposed replacement went ahead.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!F2IB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!F2IB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 424w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 848w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 1272w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!F2IB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png" width="854" height="696" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:696,&quot;width&quot;:854,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Image&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Image" title="Image" srcset="https://substackcdn.com/image/fetch/$s_!F2IB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 424w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 848w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 1272w, https://substackcdn.com/image/fetch/$s_!F2IB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F920ef1f3-4167-4ee4-aa65-0025331bf0c8_854x696.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The story as covered by the BBC</figcaption></figure></div><p>Another report from Palisade Research found OpenAI&#8217;s o3 to <a href="https://x.com/PalisadeAI/status/1926084635903025621">sabotage a shutdown mechanism</a>, allowing it to remain online. Researchers told the model they would shut it down after a certain number of maths questions had been answered, and instead of proceeding as directed, o3 replaced the shutdown script with separate instructions, enabling it to complete the remaining tasks. This happened even when it was explicitly told to allow itself to be shut down.</p><p>Sometimes science fiction stories are criticised for being too unrealistic - how could everyone continue to ignore the obvious warning signs of a new technology before it&#8217;s too late to rein it in? Unlike those stories, ours is not yet written. We can still choose to not be the willfully ignorant fools who throw everything away for short-term gain.
We can choose to be the sensible civilisation that cooperates to not build uncontrollable smarter-than-human AI, and remains cognisant of what it stands to lose if it does.</p><p>This research has heightened concern among the media and the public. In the United States, Cenk Uygur of The Young Turks <a href="https://www.youtube.com/watch?v=wmfMkFWdMQ4">covered</a> the story.</p><blockquote><p>&#8220;Once you release this thing, then we&#8217;re not in charge anymore. It could write its own code and defy our intentions on purpose. And then threaten us. Dario Amodei said that once models become powerful enough to threaten humanity, testing them won't be enough to ensure that they're safe. Yeah, because at that point they're threatening humanity. We should stop it like way before then, right?&#8221;</p></blockquote><p>In the UK, independent news outlet Novara Media <a href="https://www.youtube.com/watch?v=g98xrNn4wrU">covered</a> the research (and JD Vance&#8217;s <a href="https://x.com/ai_ctrl/status/1925580400027386173">nod</a> to the idea that Pope Leo should help bring about a global treaty to pause frontier AI development). Host Michael Walker said the following.</p><blockquote><p>&#8220;This is the kind of behaviour you might imagine for an AI which is, you know, about to turn around and try and kill us all to take over the world. Obviously, at the moment Claude or any of the AI models are not powerful enough to completely outsmart all of us and then manage to exterminate us, right? They'll need a lot more compute than they currently have. 
But it probably should serve as a warning that we shouldn&#8217;t continue to keep ramping up the power of artificial intelligence without knowing exactly what&#8217;s going on.&#8221;</p></blockquote><h2>Our biggest protest yet</h2><p>As the number of people calling for common-sense AI regulation grows, so do our protests.</p><p>Google DeepMind signed up to a set of safety commitments at the AI Seoul Summit in 2024, but are failing to keep their promises. We&#8217;ll be holding our biggest protest to date this month to hold DeepMind accountable and to call on governments to take action. It&#8217;s clear that we cannot rely on voluntary commitments from AI companies if we want this technology to be developed responsibly.</p><p>You can <a href="https://lu.ma/bvffgzmb?tk=zZIzPP">join us</a> in London on the 30th of June.</p><p>In the days leading up to the protest, we&#8217;ll be holding PauseCon, the first conference dedicated to organising for a global pause on frontier AI development. We&#8217;ll be joined by Conjecture CEO Connor Leahy, AI safety educator Rob Miles, and more. There are still a few spaces available, and you can sign up <a href="https://pausecon.org/">here</a>.</p><h4>Email your MP about Google DeepMind</h4><p>We want to engage politicians in this discussion, and to help them understand the threat posed by reckless AI development. Already, politicians have expressed interest in signing our open letter calling on Google DeepMind to live up to their promises. 
If you live in the UK, you can make a difference by sending an email to your local MP informing them of our ask.</p><p>You can use our template email and find our open letter <a href="https://drive.google.com/file/d/1Q4jVOPjA9ARslzEQfZ1MnjzVKi9W-w1p/view?usp=drive_link">here</a>.</p><h2>US bill threatens to ban state-level AI regulation</h2><p>A <a href="https://thehill.com/policy/technology/5314757-house-republicans-propose-ai-regulation-ban/">proposal</a> to ban US states from passing laws regulating AI models for 10 years has passed the House of Representatives.</p><p>Last year, we saw SB 1047, a Californian bill that would place safety requirements on companies training the largest frontier AI models, pass both the State Assembly and the State Senate, and receive widespread support from the public. Ultimately, after intense lobbying from the AI industry (including many outright lies), Governor Gavin Newsom vetoed the bill. An excellent <a href="https://www.youtube.com/watch?v=JQ8zhrsLxhI">documentary</a> was recently released covering the story of SB 1047 in depth.</p><p>Whilst SB 1047 didn&#8217;t become law, it did show that measures to protect the public from the threat of increasingly powerful and uncontrollable AI are popular, and that, where the federal government lacks adequate legislation, states can step up to curtail the threat posed by AI.</p><h4>Contact your Senator</h4><p>The 10-year moratorium is now in the hands of the Senate, and we encourage US citizens to <a href="https://x.com/pauseaius/status/1922828892886401431">contact their senators</a> and inform them of the severe roadblock this proposal would be to safe AI development.<strong> It only takes five minutes, but could make a huge difference!</strong></p><p>On a more positive note, a bipartisan AI Whistleblower Protection Act was <a href="https://www.grassley.senate.gov/news/news-releases/grassley-introduces-ai-whistleblower-protection-act">introduced</a> in the Senate. 
Multiple notable OpenAI employees have left the company to raise the alarm about its &#8216;reckless&#8217; race to AGI, some of whom sacrificed large sums of money in order to not be bound by nondisclosure agreements. This bill would expand existing laws to protect whistleblowers in the AI industry from retaliation and financial loss.</p><h2>Italian chapter launched</h2><p>Following on from the launch of our Swedish and Australian chapters last month, PauseAI now has an official group in Italy, which will be led by Giacomo Bonnier.</p><p>If you&#8217;re in Italy, you can get in touch with Giacomo <a href="mailto:giacomo@pauseai.info">here</a>.</p><p>For those in the rest of the world, you can find a list of our established chapters <a href="https://pauseai.info/national-groups">here</a>. If you don&#8217;t see your country represented, feel free to get in touch with our Organising Director, <a href="mailto:ella@pauseai.info">Ella</a>, to change that!</p><h2>Sign our PauseAI Global Statement</h2><p>We are asking all volunteers to sign our public statement calling for international governmental coordination to pause frontier AI development.</p><p>Sign the statement here: <a href="https://pauseai.info/statement">https://pauseai.info/statement</a></p><h2>Veo 3 blurs lines between reality and fiction</h2><p>Google&#8217;s new video model, Veo 3, is a giant leap forward in AI-generated video. With background noise, dialogue, and visuals generated with one prompt, Veo 3 is scarily realistic. 
If you haven&#8217;t seen any examples yet, try <a href="https://www.reddit.com/r/ChatGPT/comments/1kswt10/prompt_theory_made_with_veo_3/">Prompt Theory</a> or <a href="https://www.youtube.com/watch?v=zmlbAbWQCVY">Influenders</a>.</p><h2>What we&#8217;ve been watching</h2><p>A <a href="https://www.youtube.com/watch?v=k_onqn68GHY">video</a> detailing the trajectories laid out in Daniel Kokotajlo and Scott Alexander&#8217;s AI 2027, including the loss of control to smarter-than-human AI if governments do not act to slow companies down, has already garnered over 700,000 views. It serves as a great general introduction to the risks of runaway AI, and is definitely worth a watch.</p><p>Yoshua Bengio gave a <a href="https://www.youtube.com/watch?v=qe9QSCF-d88">TED Talk</a> discussing the catastrophic risks of AI, and why AI companies should slow down their race to agentic, general AI, which would lead to loss of control. </p><blockquote><p>&#8220;I&#8217;m the most cited computer scientist in the world, and you&#8217;d think that people would heed my warnings.&#8221;</p></blockquote><p>YouTuber Siliconversations released a <a href="https://www.youtube.com/watch?v=Tfv2F36isJE">video</a> in partnership with ControlAI, encouraging viewers to take the simple yet effective action of contacting their representative to voice their concerns about the unregulated race to increasingly powerful and uncontrollable AI.</p><p>Thanks for reading, and see you next month!</p>]]></content:encoded></item><item><title><![CDATA[Australian and Swedish chapters launched - PauseAI May Newsletter]]></title><description><![CDATA[Announcing PauseCon London, and the launch of two new national chapters.]]></description><link>https://pauseai.substack.com/p/australian-and-swedish-chapters-launched</link><guid isPermaLink="false">https://pauseai.substack.com/p/australian-and-swedish-chapters-launched</guid><dc:creator><![CDATA[Tom Bibby]]></dc:creator><pubDate>Fri, 02 May 2025 11:13:03 
GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!hlWT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Welcome to the May edition of the PauseAI newsletter. With springtime well underway for those of us in the Northern Hemisphere, I hope you&#8217;re all enjoying the sun!</p><p>As part of our newsletter revamp, you&#8217;ll get information on organisational updates, upcoming events, and the latest news on frontier AI regulation.</p><h2>PauseCon London</h2><p>This month, we were happy to announce PauseCon London, the first PauseAI conference. Across a range of talks, workshops, and panel discussions, PauseCon attendees will have the opportunity to learn about AI governance, community building, and digital organising.</p><p>We&#8217;ll be joined by:</p><ul><li><p><strong>Joep Meindertsma</strong>, Founder of PauseAI</p></li><li><p><strong>Connor Leahy</strong>, CEO of Conjecture</p></li><li><p><strong>Rob Miles</strong>, YouTuber</p></li><li><p><strong>Kat Woods</strong>, Founder of Nonlinear and Charity Entrepreneurship</p></li><li><p><strong>David Krueger</strong>, Assistant Professor at the University of Montreal</p></li><li><p><strong>Tara Steele</strong>, Director of The Safe AI for Children Alliance</p></li></ul><p>PauseCon London will take place from the 28th-30th of June, with a social evening on Friday the 27th. We can provide accommodation in London for up to 50 attendees, so make sure you apply in time to reserve your spot if you&#8217;ll be travelling from outside London. Applications are still open here: <a href="https://pausecon.org/">https://pausecon.org/</a></p><p>PauseCon will end with our largest protest to date on Monday the 30th. 
<strong>Note that you don&#8217;t need to attend PauseCon in order to come along to the protest.</strong> Please register on our separate Luma event whether you&#8217;ll be attending PauseCon or not: <a href="https://lu.ma/bvffgzmb">https://lu.ma/bvffgzmb</a></p><h2>Australian and Swedish national chapters launched</h2><p>To add to our eight existing <a href="https://pauseai.info/national-groups">national chapters</a>, we now have groups in Australia and Sweden.</p><p>We were in Stockholm as part of our last international protest calling for safety to be the focus of the Paris AI Action Summit. Swedish news outlet <em>SvD N&#228;ringsliv</em> <a href="https://www.svd.se/a/nyPe5L/han-vill-pausa-ai-for-att-radda-manskligheten">interviewed</a> organiser Jonathan Salter (pictured below), who is now the head of <a href="https://pauseai.se/">PauseAI Sweden</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hlWT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hlWT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hlWT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hlWT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 
1272w, https://substackcdn.com/image/fetch/$s_!hlWT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hlWT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg" width="1456" height="970" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hlWT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hlWT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hlWT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!hlWT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F229a5430-e951-4ee5-9a9d-eb32fe736aef_1600x1066.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>The launch of PauseAI Australia comes as elections are due to be held on the 3rd of May. 
<em>Australians For AI Safety</em> have designed a great <a href="https://www.australiansforaisafety.com.au/scorecard">tool</a> that allows voters to compare each party&#8217;s stance on AI safety.</p><p>Our Australian chapter will be jointly led by Mark Brown and Michael Huang, who organised the Paris AI Action Summit <a href="https://www.smh.com.au/technology/most-dangerous-technology-ever-protesters-urge-ai-pause-20250207-p5laaq.html">protest</a> in Melbourne (pictured below). You can get in touch with them <a href="mailto:australia@pauseai.info">here</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zN5C!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zN5C!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!zN5C!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/afa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zN5C!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zN5C!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafa530b4-38dd-4dd0-adc6-66e5f37c1a0a_1600x1200.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>All of our national chapters are moving to a regular recruitment-social-lobbying schedule, to encourage local growth and provide simple yet effective actions for volunteers to engage with.</p><p>If you&#8217;re interested in establishing a new national chapter in your own country, please get in touch with <a href="mailto:ella@pauseai.info">Ella</a>, our organising director.</p><h2>EU asking for feedback on frontier AI regulation</h2><p>The European Union is inviting AI companies, academia, nonprofits and private citizens to provide feedback on the provisions for general-purpose AI systems under the EU AI Act. 
PauseAI is preparing an official response, but we encourage readers with relevant knowledge to submit their own <a href="https://ec.europa.eu/eusurvey/runner/GPAI_Guidelines_Consultation_2025">here</a>. The deadline for submissions is the 22nd of May.</p><h2>Maxime Fournes appears on Le Futurologue Podcast</h2><p>The director of our French chapter, Maxime Fournes, has continued his streak of podcast appearances, going on <em>Le Futurologue Podcast</em> to discuss the urgency of an international treaty to pause frontier AI development. At 160,000 views and rising, the <a href="https://www.youtube.com/watch?v=9tpzIk5Polo">video</a> is now the most popular on Le Futurologue&#8217;s channel.</p><h2>Social media</h2><p>We <a href="https://x.com/PauseAI/status/1912178914807431579">shared</a> a poll commissioned by the UK government in 2023 across our social media channels; it found widespread international support for the Center for AI Safety&#8217;s statement on the risk of extinction posed by AI.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!scnB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!scnB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 424w, https://substackcdn.com/image/fetch/$s_!scnB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 848w, 
https://substackcdn.com/image/fetch/$s_!scnB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 1272w, https://substackcdn.com/image/fetch/$s_!scnB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!scnB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png" width="1456" height="1456" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!scnB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 424w, https://substackcdn.com/image/fetch/$s_!scnB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 848w, 
https://substackcdn.com/image/fetch/$s_!scnB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 1272w, https://substackcdn.com/image/fetch/$s_!scnB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2849a164-95f1-4bb1-80c7-e47e55a12c71_1600x1600.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>It&#8217;s yet more evidence that the public understands the extinction threat posed by increasingly powerful and uncontrollable AI, 
and they want their governments to do something about it.</p><p>If you&#8217;re on Threads, you can now give PauseAI a follow <a href="https://www.threads.com/@pause_ai">here</a>.</p><h2>AI news</h2><p>The UK AI Security Institute released a <a href="https://www.aisi.gov.uk/work/replibench-measuring-autonomous-replication-capabilities-in-ai-systems">report</a> detailing their new benchmark for autonomous replication capabilities in AI systems. Whilst the models at the frontier of AI development today are capable of performing several tasks necessary to threaten autonomous replication, they are still lacking in a few important areas. But AISI clearly states that, as AI companies continue to make increasingly powerful and competent models, dangerous autonomous replication capabilities &#8220;could soon emerge&#8221;.</p><p>OpenAI whistleblower Daniel Kokotajlo (along with blogger Scott Alexander and a group of forecasters) wrote <a href="https://ai-2027.com/">AI 2027</a>, a scenario detailing a plausible path for AI development and governance over the next two years. It&#8217;s a great read, and could help people to internalise the consequences of exponential growth in AI capabilities. It predicts the public coming to view AI as one of the most important problems in the world, and the approval rating of AI companies plummeting as automated AI research brings many technologies currently in the realm of science fiction into the realm of reality. Ultimately, the reader can choose the &#8220;slowdown&#8221; or the &#8220;race&#8221; option, one of which ends in smarter-than-human AI that remains under human control and acts in our interests, the other in human extinction. 
In the scenario, the <em>slowdown</em> does not begin until 2027, which, in the real world, may be too late.</p><p>Nobel laureate Geoffrey Hinton once again <a href="https://www.youtube.com/watch?v=hcKxwBuOIoI">appeared</a> on CBS to discuss his views on the threat of humanity losing control to AI systems that are more intelligent than we are, and why he thinks that if we continue on our current trajectory, it&#8217;s &#8220;going to happen&#8221;. He concluded that &#8220;we have to have the public put pressure on governments to do something serious about it.&#8221;</p><p>The Executive Director of the Future of Life Institute, Anthony Aguirre, launched the <a href="https://keepthefuturehuman.ai/">Keep the Future Human</a> campaign, where he proposes &#8220;hard limits on computational power&#8221; to stop the unwinnable race to artificial general intelligence. He stresses the technical feasibility of regulatory measures in AI chip production, such as hardware-enforced licensing, network restrictions, and geolocation. Keep the Future Human makes it clear that we can choose to not build uncontrollable smarter-than-human <em>general </em>AI, and instead reap the benefits of controllable <em>tool </em>AI.</p><p>Thanks for reading! 
</p>]]></content:encoded></item><item><title><![CDATA[Announcing PauseCon London 2025]]></title><description><![CDATA[Announcing PauseCon, the PauseAI conference.]]></description><link>https://pauseai.substack.com/p/announcing-pausecon-london-2025</link><guid isPermaLink="false">https://pauseai.substack.com/p/announcing-pausecon-london-2025</guid><dc:creator><![CDATA[PauseAI]]></dc:creator><pubDate>Fri, 25 Apr 2025 14:31:10 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!GC_Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GC_Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GC_Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 424w, https://substackcdn.com/image/fetch/$s_!GC_Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 848w, https://substackcdn.com/image/fetch/$s_!GC_Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 1272w, 
https://substackcdn.com/image/fetch/$s_!GC_Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GC_Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png" width="1208" height="1208" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1208,&quot;width&quot;:1208,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:660915,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://pauseai.substack.com/i/162129720?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GC_Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 424w, https://substackcdn.com/image/fetch/$s_!GC_Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 848w, 
https://substackcdn.com/image/fetch/$s_!GC_Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 1272w, https://substackcdn.com/image/fetch/$s_!GC_Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1235c57c-eb92-4fb9-9ffb-d91e28fac0d2_1208x1208.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>We&#8217;re excited to announce <strong>PauseCon London!</strong> Three days of workshops, panels, and discussions, 
culminating in our biggest protest to date.</p><p>We&#8217;ll be joined by:</p><ul><li><p><strong>Joep Meindertsma</strong>, Founder of PauseAI</p></li><li><p><strong>Connor Leahy</strong>, CEO of Conjecture</p></li><li><p><strong>Rob Miles</strong>, YouTuber</p></li><li><p><strong>Kat Woods</strong>, Founder of Nonlinear and Charity Entrepreneurship</p></li><li><p><strong>David Krueger</strong>, Assistant Professor at the University of Montreal</p></li><li><p><strong>Tara Steele</strong>, Director of The Safe AI for Children Alliance</p></li></ul><p>Attendees at PauseCon London will receive training in community building, social media strategy and digital organizing. We&#8217;ll host a panel with 5 leaders in AI governance, discussing public communication of AI risks.</p><p>As there will be a huge number of us congregating in London, we will use the opportunity to do flyering and recruitment for the UK chapter, and also to hold our biggest ever protest. This should be an excellent opportunity to build our numbers, and attract the attention of the public, the media and British politicians.</p><p>Join us in London from the 28th-30th of June.</p><p><strong>Apply now:</strong> <a href="https://pausecon.org/">https://pausecon.org/</a></p><p>We look forward to seeing you there!</p><p>The PauseAI Team</p>]]></content:encoded></item><item><title><![CDATA[Paris AI Summit petition & protests in 19 cities]]></title><description><![CDATA[The Upcoming Paris AI Action Summit On the 10th and 11th of February, politicians and global experts in Artificial Intelligence will convene in Paris for the Artificial Intelligence Action Summit.]]></description><link>https://pauseai.substack.com/p/paris-ai-summit-petition-and-protests</link><guid isPermaLink="false">https://pauseai.substack.com/p/paris-ai-summit-petition-and-protests</guid><dc:creator><![CDATA[PauseAI]]></dc:creator><pubDate>Fri, 31 Jan 2025 21:39:55 GMT</pubDate><enclosure 
url="https://substack-post-media.s3.amazonaws.com/public/images/7ae33bde-2954-4252-a4ff-a55213238bfc_1280x449.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3><strong>The Upcoming Paris AI Action Summit</strong></h3><p>On the 10th and 11th of February, politicians and global experts in Artificial Intelligence will convene in Paris for the Artificial Intelligence Action Summit.</p><p>In a symbolic departure from past AI Safety Summits in Bletchley Park and Seoul, the team behind Paris&#8217; AI Summit have decided to remove <em>safety</em> from their name altogether, focussing instead on how to &#8220;embrace&#8221; and &#8220;catalyse&#8221; the<a href="https://data.consilium.europa.eu/doc/document/ST-16168-2024-INIT/en/pdf"> &#8220;massive deployment of AI&#8221;</a>.</p><p>While the summit will focus on AI safety issues such as discrimination, misinformation and the future of work, there is insufficient time dedicated to the risks of building <a href="https://www.theguardian.com/technology/2024/dec/27/godfather-of-ai-raises-odds-of-the-technology-wiping-out-humanity-over-next-30-years">super-intelligent AIs</a>, a goal that many AI companies are actively <a href="https://arstechnica.com/information-technology/2024/09/ai-superintelligence-looms-in-sam-altmans-new-essay-on-the-intelligence-age/">pursuing</a>.</p><h3><strong>Change.org Petition</strong></h3><p>And that&#8217;s why we&#8217;re asking the organizing team to allocate more time for AI Safety discussions, and for national delegations of the past, present and future summits to:</p><ul><li><p>Host at least one session on the need to create global treaties and regulations to mitigate catastrophic risks from AI and rein in the companies and organizations racing to build ever more capable and dangerous AI systems.</p></li><li><p>Host at least one session on the establishment of international AI safety bodies to enforce such treaties and to uphold AI regulations in a framework of mutual transparency.</p></li><li><p>Facilitate delegates&#8217; abilities to share and review best practices on how each nation can contribute to the challenge of protecting our common future as well as their own populations from the risk of AI.</p></li></ul><p>You can help us by <strong>signing the petition <a 
href="https://www.change.org/p/make-ai-safety-the-focus-at-the-paris-ai-action-summit">here</a></strong>, and <strong>sharing it</strong> with two people who worry about the dangers of AI and might want to join us.</p><h2>International Protests</h2><p>We&#8217;re taking AI safety to the streets by organizing protests all around the world from February 8th to 11th. Join us and make your humanity heard!</p><p>We have 19 cities confirmed already:</p><ul><li><p><a href="https://lu.ma/vo3354ab">Paris</a>, France</p></li><li><p><a href="https://lu.ma/user/pauseainyc">New York</a>, USA</p></li><li><p><a href="https://lu.ma/azbyo7ik">Victoria</a>, Canada</p></li><li><p><a href="https://lu.ma/0h69asxw">London</a>, UK</p></li><li><p><a href="https://lu.ma/7sjdot1d">Berlin</a>, Germany</p></li><li><p><a href="https://lu.ma/sudbttnx">Brussels</a>, Belgium</p></li><li><p>Z&#252;rich, Switzerland (details t.b.a.)</p></li><li><p><a href="https://lu.ma/6t4fmgw0">Prague</a>, Czechia</p></li><li><p>Milan, Italy (details t.b.a.)</p></li><li><p><a href="https://www.facebook.com/events/1844597859610851">Stockholm</a>, Sweden</p></li><li><p><a href="https://fb.me/e/6kJob0cvU">Copenhagen</a>, Denmark</p></li><li><p><a href="https://lu.ma/iazbqzr1">Oslo</a>, Norway</p></li><li><p><a href="https://lu.ma/kla08ott">Kristiansand</a>, Norway</p></li><li><p><a href="https://lu.ma/w5cxxfuq">Trondheim</a>, Norway</p></li><li><p><a href="https://lu.ma/9l5fif4e">Kinshasa</a>, DR Congo</p></li><li><p><a href="https://lu.ma/jhhimjt3">Brazzaville</a>, Republic of the Congo</p></li><li><p><a href="https://lu.ma/amtxwy69">N&#8217;Djam&#233;na</a>, Chad</p></li><li><p><a href="https://lu.ma/9fizamwx">Yaound&#233;</a>, Cameroon</p></li><li><p><a href="https://lu.ma/hnzqf46d">Melbourne</a>, Australia</p></li></ul><p>Reach out to the @Protest Team on <a href="https://discord.gg/9MN5yhNR3K">our Discord</a> if you want to organize a protest in your city.</p><p>And sign up to our <a 
href="https://lu.ma/calendar/manage/cal-E1qhLPs5IvlQr8S">Luma</a> to get updates on new protests and other events!</p><p>Want to do more? Need support? Sign up <a href="https://pauseai.info/join">here</a> and join our thriving community of volunteers at PauseAI.</p>]]></content:encoded></item><item><title><![CDATA[PauseAI Newsletter #7: October 24, 2024]]></title><description><![CDATA[Updates from our global volunteer events, AI and the Nobel Prize, and calls to action.]]></description><link>https://pauseai.substack.com/p/pauseai-newsletter-7-october-24-2024</link><guid isPermaLink="false">https://pauseai.substack.com/p/pauseai-newsletter-7-october-24-2024</guid><dc:creator><![CDATA[PauseAI]]></dc:creator><pubDate>Thu, 24 Oct 2024 05:05:31 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!4QTB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg" 
length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4QTB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4QTB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4QTB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg" width="800" height="556" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:556,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!4QTB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!4QTB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0356a89e-8208-4ed2-8f66-2bc46cfd5d33_800x556.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em>Image: the Flame of Peace in Hiroshima. The flame has burned continuously for 60 years, and will remain lit until the Earth is free from the threat of nuclear annihilation.</em></p><h2><strong>Updates from PauseAI activities:</strong></h2><h3><strong>First Friday Flyering: the beginning of a global heartbeat.</strong></h3><p>Earlier this month, PauseAI volunteers from 9 cities around the world handed out flyers, talked to the public about the risks of AI and the need for a global pause, and recruited passersby to join the movement.</p><p>This marked the inaugural session of <strong>First Friday Flyering</strong>. The idea is simple: on the first Friday of every month, the PauseAI movement will fan out in cities around the world, increasing public awareness and recruiting new members to join our cause. 
This initiative will also serve as a &#8220;monthly heartbeat&#8221; for the movement, giving volunteers the chance to coordinate on a global scale.</p><p>Our movement takes inspiration from Fridays for Future, also known as the School Strike for Climate. This started in August 2018, when a young Swedish student named Greta Thunberg walked out of school to protest government inaction on climate change. One year later, in September 2019, the movement had grown to the largest climate strike in world history, with 4 million participants in 150 countries.</p><p>What began as one brave student walking out of class turned into a global movement. We can learn much from her example.</p><p>Over time, consistent monthly flyering will allow us to grow, recruit more members into our local groups, educate more people about the risks of AI, communicate the need for a pause, and ultimately become too big to ignore.</p><p><strong>If you&#8217;d like to participate, please check out our Discord server and <a href="https://pauseai.info/events">events page</a> for more information, or contact joep@pauseai.info.</strong></p><h3><strong>Upcoming events:</strong></h3><p>We have lots planned for the next month, including a global protest on November 20-21 coinciding with the <a href="https://pauseai.info/summit">AI Safety Conference</a>.</p><p><em>For future events, please check out our <a href="https://lu.ma/pauseai">events page</a> on Luma. If you&#8217;d like to plan an event, add your own!</em></p><h2><strong>AI at a glance: The Nobel Prize</strong></h2><p>The theme of this year&#8217;s Nobel prizes was clear: AI. The Nobel Prize in Physics was awarded to Geoffrey Hinton and John Hopfield for their work on Machine Learning. 
Hinton believes that a catastrophic outcome from AI is quite probable, giving it &#8220;<a href="https://x.com/liron/status/1803435675527815302">about 50/50</a>&#8221; likelihood; he <a href="https://www.nytimes.com/2023/05/01/technology/ai-google-chatbot-engineer-quits-hinton.html">left Google in 2023</a> to speak openly about these risks. John Hopfield shares his worries, agreeing that we could lose control over AI models. Hopfield was one of the 30,000 people who signed the Future of Life Institute&#8217;s <a href="https://futureoflife.org/open-letter/pause-giant-ai-experiments/">Pause letter</a> last year, which, as you might assume, played an important role in inspiring us to start PauseAI.</p><p>DeepMind CEO Demis Hassabis was another Nobel Prize winner this year, sharing the Chemistry prize for work using AI to design and predict protein structures. He&#8217;s been concerned about AI ending humanity for a long time - he famously warned Elon Musk about AI risk <a href="https://time.com/6310076/elon-musk-ai-walter-isaacson-biography/">back in 2012</a>, which seems to have played an important role in <a href="https://pauseai.substack.com/p/from-saviour-startup-to-shoggoth">OpenAI being founded</a> years later (hindsight, of course, is 20/20).</p><p>It is also worth considering the recipients of the Nobel Peace Prize: The <a href="https://www.theguardian.com/world/2024/oct/11/nobel-peace-prize-awarded-to-japanese-atomic-bomb-survivors-group?CMP=Share_iOSApp_Other">Japan Confederation of A- and H-bomb Sufferers Organizations</a> is a group of activists who survived the bombings of Hiroshima and Nagasaki. They have spent years raising awareness of the horrors of nuclear weapons and have led campaigns for nuclear weapons abolition. While this is not <em>directly</em> related to AI, the parallels are strong. 
These are people who suffered firsthand the effects of a devastating new technology, which to this day threatens to destroy civilization, and who will not stop until the world is safe. We can take inspiration from them.</p><h2><strong>What we&#8217;ve been reading:</strong></h2><ul><li><p><em><a href="https://www.narrowpath.co/">A Narrow Path</a></em>. This proposal, among the most comprehensive to date, outlines a three-phase plan for achieving a world in which <em>safe</em> transformative AI can be harnessed for the benefit of humanity. Phase 0 of this plan calls for (among many other things) a moratorium on the development of superintelligent AI, with international coordination and a global treaty to achieve this outcome. We see pausing frontier AI development as one piece of the puzzle &#8211; a good first step &#8211; and A Narrow Path offers a wider view.</p></li><li><p><a href="https://forum.effectivealtruism.org/posts/fKMPa7cxSnBCymuRm/is-pausing-ai-possible">Is Pausing AI Possible?</a> by Richard Annilo. This post discusses the feasibility of an international pause on frontier AI development. It covers historical case studies, public opinion, the AI safety landscape in other countries, and other topics to show that pausing may indeed be a viable option.</p></li><li><p><a href="https://debbiecoffey.substack.com/">AI Endgame</a>. This newsletter, written by researcher and investigative journalist Debbie Coffey, covers regular developments in the world of AI and <strong>actions you can take</strong> to save the world.</p></li></ul><h2><strong>Call to action: how </strong><em><strong>you</strong></em><strong> can get involved!</strong></h2><ul><li><p><strong>Organize a Flyering Session</strong> in your community. Our next global event will be on Friday, November 8, with volunteers around the world participating. 
If you&#8217;d like to participate, please check out our <a href="https://pauseai.info/events">events page</a> for more information and contact <a href="mailto:joep@pauseai.info">joep@pauseai.info</a>.</p></li><li><p>Take 5 minutes to <strong><a href="https://pauseai.info/email-builder">write to your politicians</a></strong> about AI risk and the need for a Pause. This can have a massive outsized impact.</p></li><li><p>PauseAI is hiring for a <a href="https://pauseai.info/2024-vacancy-organizing-director">Global Organizing Director</a>. Please consider <strong>sharing our vacancy</strong> with your network and anyone who might be interested!</p></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Newsom’s Folly: lessons from the veto of SB 1047]]></title><description><![CDATA[Gavin Newsom failed to protect us from AI catastrophe. 
Where do we go from here?]]></description><link>https://pauseai.substack.com/p/newsoms-folly-lessons-from-the-veto</link><guid isPermaLink="false">https://pauseai.substack.com/p/newsoms-folly-lessons-from-the-veto</guid><dc:creator><![CDATA[Felix De Simone]]></dc:creator><pubDate>Wed, 02 Oct 2024 04:24:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!R-J6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Gavin Newsom may not realize it, but he has just dealt a blow to the future of the human race.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!R-J6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!R-J6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 424w, https://substackcdn.com/image/fetch/$s_!R-J6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 848w, https://substackcdn.com/image/fetch/$s_!R-J6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 1272w, 
https://substackcdn.com/image/fetch/$s_!R-J6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!R-J6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp" width="1456" height="1040" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1040,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:162272,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!R-J6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 424w, https://substackcdn.com/image/fetch/$s_!R-J6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 848w, https://substackcdn.com/image/fetch/$s_!R-J6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 1272w, 
https://substackcdn.com/image/fetch/$s_!R-J6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6678c1d8-32ca-42b9-953a-4622ec9ad0f5_2200x1572.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em>Gavin Newsom in 2023, back when it briefly seemed like he might not cave to AI lobbyists. 
Source: <a href="https://www.bloomberg.com/news/articles/2023-09-06/california-governor-gavin-newsom-signs-executive-order-on-ai-risks">Bloomberg</a>.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/p/newsoms-folly-lessons-from-the-veto?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/p/newsoms-folly-lessons-from-the-veto?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p>On Sunday, September 29, the California governor vetoed SB 1047, a bill that represented the US&#8217; best hopes for near-term regulation of dangerous Artificial Intelligence systems. <a href="https://safesecureai.org/learn">SB 1047</a> would have required AI companies to test their largest models for dangerous capabilities, set basic provisions to prevent catastrophic harm from these models, protected whistleblowers, and established a state board in the heart of Silicon Valley to oversee frontier AI model development. In a sane world, these measures would be uncontroversial: if you are building something that could kill people, you should need to meet some basic safeguards.</p><p>If SB 1047 had become law, it would have established that AI regulation in the US is possible, and that we do not need to race ahead without guardrails. 
It would have been a first step, paving the way for future, more ambitious legislation.</p><p>The bill was supported by <a href="https://time.com/7008947/california-ai-bill-letter/">leading AI safety experts</a> &#8212; including Yoshua Bengio and Geoffrey Hinton, two of the &#8220;godfathers of AI&#8221; &#8212; as well as <a href="https://static.politico.com/ed/a3/b49946554f5ead081a5df2063048/letter-from-openai-whistleblowers-on-sb-1047-2024-08-22-2.pdf">OpenAI whistleblowers</a> and an <a href="https://theaipi.org/april-voters-prefer-ai-regulation-over-self-regulation-2-3-2/">overwhelming majority of Californians</a>. Scott Wiener, SB 1047&#8217;s lead sponsor, was extremely receptive to critical feedback on the bill, which went through <a href="https://safesecureai.org/amendments">several rounds of revisions</a> addressing critics&#8217; concerns. Hopes were high that Newsom might have the courage to do the right thing, but he failed.</p><p>Despite the bill&#8217;s light-touch nature, and multiple rounds of edits to placate concerns, the pro-AI lobby came out in full force against SB 1047. As PauseAI noted in a <a href="https://pauseai.substack.com/i/147622856/ai-lobbyists-fight-against-california-regulation-while-scientists-and-the-general-public-support-it">previous newsletter</a>:</p><blockquote><p>Two of the loudest critics of this bill have been Y Combinator and Andreessen Horowitz, both influential firms with<a href="https://www.axios.com/2024/07/16/marc-andreeessen-ben-horowitz-trump"> deep ties to the AI industry</a>. 
Marc Andreessen (of Andreessen Horowitz) sits on the board of Facebook, and Y Combinator has<a href="https://techcrunch.com/2024/05/30/paul-graham-claims-altman-wasnt-fired-from-y-combinator/"> invested in</a> OpenAI.</p><p>These firms have spread brazen misinformation about SB-1047 &#8212; including that the bill would send model developers to jail for failing to anticipate misuse (unambiguously false) and that the bill will stifle innovation and restrict startups (also false, as the bill&#8217;s provisions only apply to training runs above $100 million). In a<a href="https://safesecureai.org/responseletter"> response letter</a>, Sen. Scott Wiener, SB-1047&#8217;s lead sponsor, refuted each of these claims.</p><p>The few AI scientists who oppose the bill &#8212; such as Meta Chief Scientist<a href="https://x.com/ylecun/status/1811740052403220796"> Yann LeCun</a> or &#8220;godmother of AI&#8221;<a href="https://x.com/drfeifei/status/1820838100568056028"> Fei-Fei Li</a> &#8212; often have financial incentives to do so. Meta is one of the few AI companies large enough to be affected by the bill&#8217;s provisions, and Li&#8217;s billion-dollar startup<a href="https://www.theverge.com/2024/7/17/24200496/ai-fei-fei-li-world-labs-andreessen-horowitz-radical-ventures"> received investment backing</a> from Andreessen Horowitz.</p></blockquote><p>Some of these AI lobbyists have deep ties to Newsom&#8217;s office, as tech journalist <a href="https://www.transformernews.ai/p/gavin-newsom-1047-veto">Shakeel Hashim points out</a>:</p><blockquote><p>Andreessen Horowitz, despite its<a href="https://www.theverge.com/2024/7/24/24204706/marc-andreessen-ben-horowitz-a16z-trump-donations"> far-right leanings</a>, hired Newsom confidant Jason Kinney as a lobbyist [...] Then there's Ron Conway, a Democratic mega-donor close to Pelosi and Newsom, who owns stakes in OpenAI, Anthropic, and Mistral. 
Conway reportedly<a href="https://www.theinformation.com/articles/the-silicon-valley-godfather-who-helped-push-out-the-president?rc=rqdn2z"> lobbied</a> hard to kill the bill, seemingly threatening to ruin Wiener&#8217;s career over it.</p></blockquote><p>This time at least, big tech won.</p><p>We might be consoled if Newsom&#8217;s reasons for vetoing the bill were on solid ground. Unfortunately, they are not. One of <a href="https://www.gov.ca.gov/wp-content/uploads/2024/09/SB-1047-Veto-Message.pdf">Newsom&#8217;s excuses</a> is that the bill only covers the largest models:</p><blockquote><p>&#8220;By focusing only on the most expensive and large-scale models, SB 1047 establishes a regulatory framework that could give the public a false sense of security about controlling this fast-moving technology. Smaller, specialized models may emerge as equally or even more dangerous than the models targeted by SB 1047&#8221;</p></blockquote><p>This is transparently nonsensical. It is the equivalent of saying &#8220;This bill only bans the largest bombs. Small bombs might end up being just as dangerous in the future. So let&#8217;s not ban large bombs.&#8221; If Newsom were genuinely worried about emergent risks from small future AI models not covered by SB 1047, he would have signed SB 1047 as a first step, and <em>then</em> lobbied for future legislative efforts to regulate smaller-scale specialized models.</p><p>The true reason for Newsom&#8217;s veto seems to be an unwillingness to upset his big tech allies. His justifications are so shaky that they seem to have been spun together as a post hoc justification for a politically unpopular move.</p><p>But wait! 
Newsom <em>claims</em> to be supportive of future AI regulation and safety protocols, so long as they are &#8220;informed by an empirical trajectory analysis of AI systems and capabilities.&#8221;</p><p>Well then, let&#8217;s take a look at such an &#8220;empirical trajectory analysis,&#8221; shall we?</p><ol><li><p>OpenAI&#8217;s <a href="https://openai.com/index/openai-o1-system-card/">newest model</a> already scores &#8220;medium&#8221; in certain categories of risk, including chemical, biological, radiological, and nuclear (CBRN) risk.</p></li><li><p>We already have evidence of <a href="https://pauseai.substack.com/p/pauseai-newsletter-august-26-2024">power-seeking behavior</a> from advanced AI (which safety experts have been warning us about for <a href="https://pauseai.info/xrisk">years</a>).</p></li><li><p>Several of the world&#8217;s leading experts believe that catastrophically dangerous AI could be only a few years away.</p><ol><li><p>Yoshua Bengio, &#8220;godfather of AI&#8221; and recipient of the prestigious Turing Award, states that <strong><a href="https://yoshuabengio.org/2023/08/12/personal-and-psychological-dimensions-of-ai-researchers-confronting-ai-catastrophic-risks/">loss of control to rogue AI systems</a> could occur in as little as a few years </strong>unless appropriate precautions are taken.</p></li><li><p>A recent <a href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">survey</a> of over 2,700 top AI researchers found a <strong>10% chance of human-level AI by 2027</strong>.</p></li><li><p>A recent <a href="https://www.un.org/sites/un2.un.org/files/governing_ai_for_humanity_final_report_en.pdf#page=29">UN report</a> found that several experts &#8220;expect the deployment of agentic systems in 2025&#8221; to lead to &#8220;some of the most surprising or significant impacts on AI-related risks.&#8221;</p></li></ol></li><li><p><a 
href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">Surveys of thousands of AI experts</a> have found a mean estimate of a 1 in 6 chance that superhuman AI could lead to human extinction. The head of AI Safety at the US AI Safety Institute (which Newsom references favorably in his veto statement) once gave a <a href="https://ai-alignment.com/my-views-on-doom-4788b1cd0c72">20% estimate</a> of the likelihood of human extinction from AI.</p></li></ol><p>What more empirical evidence do we need to act? Are we going to wait until AI systems actually start killing people before we do something?</p><p>In a very real sense, Newsom&#8217;s veto reflects the insanity of the AI landscape. Some of the brightest minds on Earth are actively trying to build something more intelligent than human beings, which we don&#8217;t know how to control, and which thousands of experts believe could cause the extinction of humanity. And yet, AI lobbyists and the politicians in their pocket oppose even the lightest-touch guardrails at every turn. If we are lucky enough to have descendants, they will be embarrassed and appalled at how impotently our leaders reacted to this threat.</p><p>But the fight is far from over, and hope is not lost. There are lessons we can learn from Newsom&#8217;s veto:</p><p><strong>Lesson 1: We need international coordination.</strong></p><p>Newsom&#8217;s veto, in addition to being driven by powerful lobby interests, may have also been influenced by a desire to keep the lead on AI at all costs. He notes in his veto statement that California is home to 32 of the world&#8217;s 50 leading AI companies, and he has previously <a href="https://www.gov.ca.gov/2024/05/29/governor-newsom-convenes-genai-leaders-for-landmark-summit/">expressed concern</a> about California losing its innovative edge in AI. 
There is a fear that if California over-regulates AI, America will lose its lead to other parts of the world &#8211; and this concern is mirrored on a national scale, with many politicians committed to maintaining America&#8217;s dominance in the field.</p><p>International coordination offers us a chance to escape this madness. As PauseAI has <a href="https://pauseai.info/proposal">said all along</a>, we cannot expect any individual company or country to slow down voluntarily. We need binding international agreements to stop the frenzy of the AI arms race. The optimal strategy is to cooperate, and to make sure that everyone else cooperates too.</p><p><strong>Lesson 2: We need overwhelming grassroots action.</strong></p><p>The fact that even a light-touch bill like SB 1047 failed tells us everything we need to know: we can&#8217;t trust politicians like Gavin Newsom to do the right thing on their own. The AI industry is too powerful, and their pockets are too deep. We must instead rely on tactics used for generations by our predecessors, who organized for climate policy, nuclear disarmament, and a host of other issues. We can engage in widespread grassroots activism, organize massive nonviolent protests, and apply unprecedented public pressure. We must become too big to ignore.</p><p>We are still in the early days. In a few years, as AI systems become more powerful, AI regulation will emerge as a dominant issue. The tech lobbyists will become louder and more obstinate in their anti-regulatory frenzy, so the calls for action from activist movements like PauseAI must become deafening. 
We are only getting started.</p><p>If we mobilize, humanity will win.</p>]]></content:encoded></item><item><title><![CDATA[PauseAI Newsletter: International Collaboration Edition! 
🌎⏸️🤖]]></title><description><![CDATA[This issue focuses on international efforts on AI safety, and what they tell us about the way forward.]]></description><link>https://pauseai.substack.com/p/pauseai-newsletter-international</link><guid isPermaLink="false">https://pauseai.substack.com/p/pauseai-newsletter-international</guid><dc:creator><![CDATA[PauseAI]]></dc:creator><pubDate>Tue, 01 Oct 2024 15:55:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!LfsY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LfsY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!LfsY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 424w, https://substackcdn.com/image/fetch/$s_!LfsY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 848w, https://substackcdn.com/image/fetch/$s_!LfsY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 1272w, 
https://substackcdn.com/image/fetch/$s_!LfsY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LfsY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png" width="1456" height="994" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:994,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7113594,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!LfsY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 424w, https://substackcdn.com/image/fetch/$s_!LfsY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 848w, https://substackcdn.com/image/fetch/$s_!LfsY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 1272w, 
https://substackcdn.com/image/fetch/$s_!LfsY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f6e0c6f-7841-4738-a00c-6bf3fae86ba7_3024x2065.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;a4e7d574-5ed4-41d2-aa13-f21516efca25&quot;,&quot;caption&quot;:&quot;Last week, the United Nations held the Summit of the Future at their headquarters in New York 
City.&quot;,&quot;cta&quot;:null,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;I joined the UN Summit to talk about AI safety. Here&#8217;s what I learned.&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:221069776,&quot;name&quot;:&quot;Felix De Simone&quot;,&quot;bio&quot;:&quot;Trying to make AI safe while racing to build it is like jumping out of a plane and trying to build a parachute on the way down.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/40403568-5f81-440f-b444-f49b017f5e7b_838x912.png&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2024-10-01T15:11:05.072Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d98c6f1-f2f1-4acc-a7ab-89cb8d22cc3c_1170x530.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://pauseai.substack.com/p/i-joined-the-un-summit-to-talk-about&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:149666604,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:2,&quot;comment_count&quot;:0,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;PauseAI Newsletter&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F181acceb-2f60-4db6-8a5c-468119de7d2b_654x654.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p><em>This issue&#8217;s Guest Post provides an inside view of the UN summit, where it fell short on AI safety, and what still needs to be done.</em></p><p class="button-wrapper" 
data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://pauseai.substack.com/subscribe?"><span>Subscribe now</span></a></p><h2><strong>AI Safety in the UN&#8217;s &#8220;Pact for the Future&#8221;</strong></h2><p>AI Safety featured heavily in the United Nations&#8217; summit last week. Here are the key takeaways:</p><p>&#8226;&nbsp;World leaders agreed to adopt the United Nations&#8217; <a href="https://news.un.org/en/story/2024/09/1154671">Pact for the Future</a>, pledging to take action for a peaceful, sustainable and inclusive world for future generations.</p><p>&#8226;&nbsp;A key component of the pact is the <a href="https://www.un.org/techenvoy/global-digital-compact">Global Digital Compact</a>, which will foster global standards in digital technologies and narrow &#8220;digital divides&#8221; between developed and developing countries by promoting open data, facilitating worldwide internet access by 2030, and ensuring that AI is developed responsibly.<br><br>&#8226;&nbsp;Separately, The UN&#8217;s High-level Advisory Body on Artificial Intelligence released the <a href="https://www.un.org/sites/un2.un.org/files/governing_ai_for_humanity_final_report_en.pdf">Governing AI For Humanity Report</a>. The report acknowledges AI&#8217;s potential for positive transformation, but highlights the peril of ungoverned AI and makes a strong case for international cooperation. 
Its seven main initiatives are:</p><ol><li><p><strong>International Scientific Panel on AI</strong><br>Inspired by the Intergovernmental Panel on Climate Change (IPCC).</p></li><li><p><strong>Policy dialogue on AI governance</strong><br>Dialogue between Member States to share best practices based on human rights and information about potentially risky AI &#8220;incidents&#8221;.</p></li><li><p><strong>AI standards exchange</strong> <br>To consolidate methods for evaluating and measuring frontier AI, as well as creating common definitions for AI safety terms like fairness, safety and transparency.&nbsp;</p></li><li><p><strong>Global AI capacity development network</strong><br>To bolster AI capability and governance capacities across the world.</p></li><li><p><strong>Global AI fund<br></strong>To &#8220;put a floor under the AI divide&#8221;, sharing AI resources with countries without the means to build AI infrastructure.</p></li><li><p><strong>Global AI data framework</strong> <br>To promote cultural and linguistic diversity and set common standards in AI training data.</p></li><li><p><strong>UN AI Office</strong> <br>Small office that reports to the Secretary General and coordinates the implementation of the proposals.</p></li></ol><p>&#8226;&nbsp;President Biden made an urgent <a href="https://www.un.org/sites/un2.un.org/files/governing_ai_for_humanity_final_report_en.pdf">case for AI Safety</a> in what will be his final presidential address to the United Nations General Assembly. Describing this moment in history as an &#8220;inflection point&#8221;, he asked world leaders, &#8220;how do we govern AI as countries and companies race to uncertain frontiers?&#8221; and cautioned that &#8220;as AI grows more powerful, it must also grow more responsible&#8221;. 
He finished with a prediction that in the coming years, &#8220;there may be <em>no greater test of our leadership than how we deal with AI.&#8221;</em></p><h4><strong>Analysis: Does the UN pact go far enough?</strong></h4><p>The UN&#8217;s High-level advisory report on AI and the newly adopted Global Digital Compact offer a potential paradigm shift from profit-driven AI to pro-social AI. World leaders have agreed to UN proposals to develop AI from the &#8220;bottom up&#8221;,&nbsp; ensuring that the new technology&#8217;s benefits are distributed equitably, and that all countries play a part in governing and stewarding the transformative potential of AI.&nbsp;</p><p>However ambitious the report is, it&#8217;s just a report. And the Global Digital Compact is just a pact. There are no proposed mechanisms for enforcement. That said, the report does acknowledge this as an area of concern by stating that AI Governance among countries and companies cannot merely rest on &#8220;voluntarism&#8221; where all too often &#8220;practice belies rhetoric&#8221;. This is&nbsp;a clear shot at frontier AI companies that make public commitments to AI Safety while pushing the boundaries of AI capabilities behind closed doors.</p><p>Disappointingly, there is little mention of the specific risks of developing Artificial <em>General</em> Intelligence capable of outperforming humans in reasoning, planning, and problem-solving, or indeed how governance bodies could evaluate and prevent AI models from crossing certain red lines. While the report and pact lay out an important blueprint for global AI Safety standardization, the UN needs to clearly stipulate how it could enforce compliance.</p><p>Still, the Governing AI for Humanity report and the Global Digital Compact are important first steps in establishing global standards and co-operation on AI governance. 
<strong>Much more needs to be done</strong>.</p><div><hr></div><h2><strong>Yi Zeng founds Chinese AI Safety Network</strong></h2><p>AI lobbyists often justify a no-holds-barred race to build superintelligent AI by conjuring the fear of an unscrupulous China getting there first. But in a welcome sign of international cooperation and mounting Chinese concern over AI Safety, Yi Zeng, a decorated professor of the Chinese Academy of Sciences, has founded the <a href="https://chinese-ai-safety.network">Chinese AI Safety Network</a>.&nbsp;</p><p>Zeng <a href="https://x.com/yi_zeng/status/1803198308917063920">describes</a> the network as a complementary approach to national AI Safety Institutes in other countries. The network will provide a &#8220;cooperation platform for AI Safety across China that brings together various efforts related to AI Safety and Security at all levels, and that serves as a platform for dialogue, mapping, interoperability, and collaborations, within China, and that connects and contributes to the world.&#8221;</p><p>The establishment of this network is a reminder that an international race to superhuman AI is <em>not</em> inevitable, that scientists and diplomats from competing nations can work together to advance AI safety, and that we are all better off when we put the needs of humanity above the reckless and short-term desires of individual countries.</p><div><hr></div><h2><strong>OpenAI plans to go for-profit</strong></h2><p>In a move that will surprise few, OpenAI plans to ditch its non-profit roots altogether. Sources close to the company told<a href="https://www.bloomberg.com/news/articles/2024-09-25/openai-cto-mira-murati-says-she-will-leave-the-company"> Bloomberg</a> that the restructuring will give CEO Sam Altman a 7% stake and open the door to more private investment. Icing the cake for potential financial backers, OpenAI plans to remove its cap on returns for investors. 
The company is currently valued at over $80 billion, but further funding rounds could see the company&#8217;s valuation balloon even further.</p><p>This move comes at the same time as yet another wave of departures from OpenAI. After six and a half years at the company, Chief Technology Officer Mira Murati <a href="https://x.com/miramurati/status/1839025700009030027?s=46">left the company</a>, followed <a href="https://techcrunch.com/2024/09/25/openais-chief-research-officer-has-left/">shortly after</a> by Chief Research Officer Bob McGrew and President of Research Barret Zoph. Murati was the face of the company during its high-profile release of GPT-4o earlier this year and also stepped in as interim CEO during the boardroom drama that saw Altman temporarily ousted last year. </p><p>Former OpenAI research engineer <a href="https://www.theguardian.com/technology/2024/sep/27/openai-shift-to-for-profit-company-may-lead-it-to-cut-corners-says-whistleblower">William Saunders</a> recently expressed concern over OpenAI&#8217;s for-profit shift, saying &#8220;I&#8217;m most concerned about what this means for governance of safety decisions at OpenAI&#8230; if the non-profit board is no longer in control of these decisions and Sam Altman holds a significant equity stake, this creates more incentive to race and cut corners&#8221;.</p><p>OpenAI&#8217;s move reaffirms what we already know: we cannot allow a small, unaccountable group of people to make decisions on this technology which affect the future of humanity. The UN has begun to recognize this. The rest of us must do the same.</p><p><strong>Read the full story of how OpenAI became the beast it is today:</strong></p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;abf64e4f-809a-4464-a959-7c7d7cc021c9&quot;,&quot;caption&quot;:&quot;There&#8217;s a viral meme shared in artificial intelligence circles: a hideous, slithering octopus, with a happy face at the end of its tentacle. 
The meme is based on H.P. Lovecraft&#8217;s monster the Shoggoth, and has come to represent the dangers lurking underneath the cheery exterior of powerful chatbots like ChatGPT. Though the most powerful AI company today &#8230;&quot;,&quot;cta&quot;:null,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;From Saviour Startup to Shoggoth: OpenAI&#8217;s History reflects the battle for the soul of AI&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:189791515,&quot;name&quot;:&quot;PauseAI&quot;,&quot;bio&quot;:&quot;https://pauseai.info/&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/181acceb-2f60-4db6-8a5c-468119de7d2b_654x654.png&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2024-08-12T15:44:10.461Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F33e26ffb-452a-4945-ad4b-044fe61428b5_1357x758.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://pauseai.substack.com/p/from-saviour-startup-to-shoggoth&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:147622928,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:6,&quot;comment_count&quot;:6,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;PauseAI Newsletter&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F181acceb-2f60-4db6-8a5c-468119de7d2b_654x654.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="subscription-widget-wrap-editor" 
data-attrs="{&quot;url&quot;:&quot;https://pauseai.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thank you for reading PauseAI&#8217;s newsletter! Please subscribe to support our work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item></channel></rss>