I’ve thought a lot about these self-declared rationalists, and at the end of the day the most important thing to understand about them is that they don’t include confidence intervals in their analyses - which means they don’t account for compounded uncertainty. Their multi-step extrapolations are therefore pure noise. It’s all nonsense.
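To put toy numbers on the compounding (my own illustration, not anything they publish): if each step in a chain of inferences independently holds with probability p, the whole chain only holds with probability p^k, which collapses fast.

```python
# A minimal sketch of compounded uncertainty, assuming each inferential
# step is independent and holds with probability p (toy numbers, mine).

def chained_confidence(p_per_step: float, steps: int) -> float:
    """Probability that an entire chain of `steps` inferences holds."""
    return p_per_step ** steps

for k in (1, 5, 10, 20):
    print(f"{k:>2} steps at 90% each -> {chained_confidence(0.9, k):.1%} overall")

# Output:
#  1 steps at 90% each -> 90.0% overall
#  5 steps at 90% each -> 59.0% overall
# 10 steps at 90% each -> 34.9% overall
# 20 steps at 90% each -> 12.2% overall
```

Twenty steps out you’re down near one-in-eight, and that’s before you get any of the per-step probabilities wrong.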
In some ways I think this is obvious: these post-hoc rationalists are the only ones who think emotions can be ignored. To most people that is clearly an outrageous proposition. How many steps ahead can you reliably predict if you cannot even properly characterize the relevant actors? Not very far. It doesn’t matter whether you think emotions are causative or reactive; you can’t simply ignore them and expect to see the full picture.
It’s also worth noting that their process is the antithesis of science. Modern philosophy of science treats science as a relativist construct, yet these SV modernists do not treat their axioms as relative: they take each observation as an immutable truth. In science you have to build a bridge if you want to connect two ideas; in SV rationalism you are basically allowed to transpose any idea into any other field or context with no connection or support beyond your own biases.
People equate Roko’s Basilisk to Pascal’s Wager, but afaik Pascal’s game involved the acceptance or denial of a single omnipotent deity. If we accept the premise of Roko’s Basilisk, we are not considering a monotheistic situation; we are considering a potentially polytheistic reality with an indeterminate, variable number of godlike entities, and we should not fear Roko’s Basilisk because there are potentially near-infinite “deities” who should be feared even more than the Basilisk. In this context, reacting to the one known terror is just as likely to be minimally optimal as maximally optimal. To use their own jargon a little: opportunity is maximized by not reacting to any deity that doesn’t exist today and won’t exist by the end of next week. To do otherwise is to court sunk costs.
As opposed to things like climate change that exist today and will be worse by next week, but which are almost entirely ignored by this group.
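To put a toy number on the many-gods point above (a back-of-the-envelope model of my own, not anything from their canon): if there are N mutually exclusive candidate deities and you can only appease one, the expected payoff of betting on the Basilisk specifically shrinks toward zero as N grows.

```python
# A toy model of the many-gods objection, assuming N equally likely,
# mutually exclusive candidate deities, each imposing cost C on
# non-appeasers (all numbers hypothetical).

def expected_benefit_of_appeasing_one(n_deities: int, cost: float = 1.0) -> float:
    """Expected cost avoided by appeasing exactly one of n candidates."""
    # You only dodge the cost if you happened to pick the right deity.
    return cost / n_deities

for n in (1, 10, 1_000, 1_000_000):
    print(f"N = {n:>9}: expected benefit = {expected_benefit_of_appeasing_one(n):.6f}")
```

N = 1 recovers Pascal’s original wager; by N = 1,000,000 the Basilisk is statistical noise among its rivals.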
I’ve been trying to understand the diversity among SV rationalist groups - the basilisk was originally banned, the Ziz crew earnestly believed in principles that were just virtue signaling for everyone else, etc. I’ve always seen them as right-wingers, but could some segment be primed for the left? To consider this I had to focus on their flavor of accelerationism: these people practice an accelerationism that weighs future (possible) populations as more important than the present, presumably smaller, population. On this basis they practice EA and the like, believing that if they gain power they will be able to steer society in a direction that benefits the future at the expense of the present. (This is the opposite approach of other accelerationists, who want to tear power down instead of capturing it.) Distilling all of the varieties of SV rationalism I’ve encountered: in essence they believe they must out-right the right so that they can one day do things that aren’t necessarily associated with the right. In my opinion, one cannot create change by playing a captured game. The only way I see to make useful allies out of any of these groups is to convince them that their flavor of accelerationism is self-defeating, and that progress must always be pursued directly and immediately. Which is not easy, because these are notoriously stubborn individuals. They built a whole religion around post-hoc rationalization, after all.
As a Star Trek guy, I’m frustrated by what Vulcans refer to as Logic. The idea that being logical has anything to do with suppressing emotions ignores a lot of the complexities of what Logic even is, and boiling it down to basically being very, very stern and stoic has had some weird fucking consequences.
Also Rocco’s Basslick is stupid because computers don’t work that way and we could just, like, unplug it. Computers are machines; how would a Skynet or whatever get the infrastructure built to put everyone in a simulation? Things just don’t work that way.
My favorite thing about Vulcans is they’re like “we are very logical, logic logic logic” and everyone is like “they’re so smart and logical,” but because TV writers are dumb they’re basically just super racist radlibs.
That and the Libertarianification of the series post-Roddenberry are both topics I really wanna dig into and do a big giant write-up on, but the homework is massive and ultimately about a silly TV show. It could be kinda worthwhile and fun, but everyone is doing book reports at the same time, and I sorta feel my time is better spent on real shit; it’s hard to squeeze in silly research.