

top 1%
So… 1 in 100? That isn’t that impressive. I’m ignoring the utter weirdness of what he is even talking about, but you’d expect a billionaire to have at least a better grasp of numbers.




Basically it’s really obvious that they don’t have a meaningful way to describe exactly what they want it to do and so they’re playing whack-a-mole with undesired behaviors in order to minimize how often it embarrasses them.
The whole ‘how many r’s in strawberry’ sort of stuff already made me suspect that, when the popular one was fixed but other attempts at asking for letter counts still gave miscounts.
Wonder if the goblin stuff is the start of some model collapse. And if we all can make it worse by talking about goblins more. As goblins are always relevant.
E: poor openai, it just wants to tell everyone about its dnd campaign.


Employees have discussed ways to tweak AI models to prioritize sponsored information in ChatGPT’s responses when users ask relevant queries
Hope people realize that this doesn’t stop at ads. (Preaching to the choir here). See Grok.


That is still quite high, right? Esp considering they think 5% of Null-A is quite high. (For some reason I once had two copies of that). (I have read Null-A and not metamorp of prime)


The manosphere lingo, the header image with the leather jacket and the fake signing of a boob, the self-dealing, the pretending they pay all their devs/researchers a lot of money, the ‘who uses the most tokens’ leaderboard. There is such a high amount of sick desperation in all this.
We were too hard on the previous wave, who like Ballmer were just cringe capitalist overlords.


I think that if someone were to be as obsessed with living forever as LWers are, it would be seen as a form of mental illness and the Minds would gently try to correct it.
Yeah, I don’t think they would care if it was just a few, or a small group, but Culture people who start to claim others are deathists, and the extreme of whom have all kinds of weird violent thoughts about them, would be concerning. Doubt it would be a huge concern to the Minds however; they prob only really get active when one of them also starts wanting to create an empire or something, but it is hard to amass resources for that in the Culture, esp if no Mind is on your side.
Do wonder why we never see Culture people who worship the Minds as gods.


This gives me very strong live-service video game monetization feelings, another reason to stay far away from it. At least they don’t have the thing where everything costs multiples of 50 and you buy token amounts not divisible by 50.


Something is not reich about that.


Interesting that in the comments somebody also mentions that the people of the Culture euthanize themselves after a couple of centuries. No big shock that the LW people would disagree with that, as part of the LW idea space is living forever in a computer simulation. So the Culture can’t be utopian or good just because of that.


Wtf is up with the face in the top left circle.


I think that, while many LessWrong readers do believe that one party is way better than the other, such that the inter-party quality variation is far larger than the intra-party quality variation, this is not true of all readers.
… Wait is this about race and iq again?
Anyway the math ain’t mathing, as there never can be a Republican above average enough to counterbalance out that they are a Republican.


That’s very good.


I was trying to combine the Scientology e-meter stuff with the IQ race science g-factor stuff.


This person also seems to have no concept of the finality of death
The god AI can perfectly simulate people, and as a copy is you, death isn’t permanent. And when you start to think this is inevitable and close, murder becomes just another way to signal how strongly you feel about a thing.


soul forges continually working to bring back the dead
Even in death, duty does not end.


‘top ai’. so it is a sex thing after all.
Just looking at that picture makes me hear googols of unsimulated people scream in terror.


Yeah but never pull 9 numbers out of your ass, that would make you too smart and they will tell the gov to drone strike you.


They develop a special g-meter to find people who could potentially create more efficient gpus and send them to the gulags.
This doing-the-work-together thing reminds me of how some teachers at my uni used to teach. It was always more satisfying when your teachers didn’t know the answers beforehand and people worked on it together than if it turned out the teacher already knew. Of course these sorts of lessons are way harder to set up.