#lesswrong

C++ Wage Slave:
@hosford42 @mercianpilgrim @timnitGebru

I didn't know about any of these until I came across this thread. I spent a short while reading about them just now, and oh — my — goodness.

The #ChurchOfAI makes a lot of noise about your not having to exercise any faith or believe in unlikely stories, and then makes some wild leaps of faith — such as that AI will master time and space and enable us to travel through time — and seems to think these things are logical and inevitable. It ignores obvious objections, such as that time travel may be impossible, AI may be uncontrollable, AI may be subverted to control people rather than empower them, or AI may simply not work.

But even that seems better than #Robotheism, which, to me, just looks unhinged. It claims that AI is the root of existence, which seems pretty far-fetched, given that the universe is over 13 billion years old and AGI still doesn't exist.

Aaron, are you thinking of #LessWrong?
Hacker News:
A Straightforward Explanation of the Good Regulator Theorem
https://www.lesswrong.com/posts/JQefBJDHG6Wgffw6T/a-straightforward-explanation-of-the-good-regulator-theorem
#HackerNews #GoodRegulatorTheorem #StraightforwardExplanation #SystemsTheory #ControlTheory #LessWrong
Vladimir Savić:
Great #video adaptation of a short story published by Eliezer Yudkowsky on the #LessWrong forum (https://www.lesswrong.com/posts/5wMcKNAwB6X4mp9og/that-alien-message8)

That Alien Message: https://www.youtube.com/watch?v=fVN_5xsMDdg #watching 📺
James:
Emergence Spirals… sounds sort of woo-woo but it's not, despite what Yudkowsky says… https://nonzerosum.games/emergencespirals.html
#evolution #feedbackloops #biology #systemstheory #complexity #Yudkowsky #LessWrong #Popper #Hegel #Parasites #Symbiosis #Science #Nature #Philosophy #History
Chris J. Karr:
Just did my best to try and infect the @weirdfictionquarterly.com writers with Roko's Basilisk. In one box is the theme of WFQ's Summer 2025 issue. In the second is...
#RokosBasilisk #LessWrong #TimelessDecisionTheory #WeirdFictionQuarterly #Singularity #AGI #ArtificialIntelligence
https://slate.com/technology/2014/07/rokos-basilisk-the-most-terrifying-thought-experiment-of-all-time.html
Hacker News:
Recent AI model progress feels mostly like bullshit
https://www.lesswrong.com/posts/4mvphwx5pdsZLMmpY/recent-ai-model-progress-feels-mostly-like-bullshit
#HackerNews #AI #Progress #Models #Critique #Technology #Discussion #LessWrong

Lost my sharpener in the kitchen.

I wonder whether there are "rational" methods for finding lost things. (The sharpener was a gift, so there's an emotional attachment.)

Got an update to my beliefs about the world and about my sharpener-finding skills.

#LessWrong is a community blog focused on "refining the art of human #rationality." To this end, it focuses on identifying and overcoming bias, improving judgment and problem-solving, and speculating about the future. The blog is based on the ideas of Eliezer Yudkowsky, a research fellow for the Machine Intelligence Research Institute (MIRI; previously known as the Singularity Institute for Artificial Intelligence, and then the Singularity Institute). Many members of LessWrong share Yudkowsky's interests in #transhumanism, artificial intelligence (#AI), and the #Singularity.

#RationalWiki #miri #ai
rationalwiki.org/wiki/LessWron

Mostly figured out the gradual value/policy iteration learning algorithm; it's a neat one.

gibberblot.github.io/rl-notes/ has something about it.

All of this in order to work through a paper. What kind of paper?

"Look, it would be cool if, in reinforcement learning, we didn't maximize the reward but instead made it equal to some target. Here's how to do that. It turns out that if you pick an interval, there are many ways to do it. Then you can impose various safety constraints, so that you don't accidentally fly off in one direction or the other, or pick steps that are too hard, and so on."

lesswrong.com/s/4TT69Yt5FDWijA

Policy iteration — Introduction to Reinforcement Learning (gibberblot.github.io)
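The value iteration the post refers to can be sketched on a toy problem. The following is a minimal illustration on a hypothetical 4-state chain MDP (states, transitions, rewards, and discount are all made up for this example, not taken from the gibberblot notes or the LessWrong sequence):

```python
import numpy as np

# Hypothetical 4-state chain MDP: states 0..3, two actions
# (0 = "left", 1 = "right"), deterministic transitions.
gamma = 0.9
next_state = np.array([[0, 0, 1, 2],   # action "left" moves toward state 0
                       [1, 2, 3, 3]])  # action "right" moves toward state 3
reward = np.array([0.0, 0.0, 0.0, 1.0])  # reward for arriving in state 3

V = np.zeros(4)
for _ in range(200):
    # Q[a, s] = r(s') + gamma * V(s') for the deterministic successor s'
    Q = reward[next_state] + gamma * V[next_state]
    V = Q.max(axis=0)          # Bellman optimality backup

policy = Q.argmax(axis=0)      # greedy policy: action 1 ("right") everywhere
```

Since always moving right yields reward 1 per step from state 3 onward, the values converge to roughly [8.1, 9.0, 10.0, 10.0] at gamma = 0.9. The reward-targeting idea from the quoted paper summary would replace the `max` backup with a criterion that keeps the return inside a chosen interval, which this sketch does not attempt.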

@futurebird
'Terraforming earth' would get a lot more people interested, but it would also be like shit for flies to the #ExistentialRisk / #Longtermism / #LessWrong / #TESCREAL crowd. You'd almost immediately see it co-opted to advocate for building a bunch of fusion-fueled domed cities populated by white people designing a brave new future filled with computronium-maximizing nanoassemblers & simulated humans living simulations of lives optimized for maximum neo-Utilitarian 'happiness.'