

There’s also the weird line spacing change in the last paragraph



Sounds quite similar to Nick Cave’s letter on the topic, read here by Stephen Fry. (Anyone feel free to reply with a piped link; for some reason it’s never worked for me.)


Well fuck that


Where is the MIT study in question? The link in the article, apparently to a PDF, redirects elsewhere


aka enshittification


You know, I think I’m overdue for a donation to Wikipedia. They honestly might end up being the last bastion of sanity


It’s actually kind of worrisome that they have to guess it was his code based on the function/method name. Do these people not use version control? I guess not, and they sure as hell don’t do code reviews either, if this guy managed to get this code into production


Yeah, I see what you mean. There’s a decent argument to be made that something like reasoning appears as an emergent property in this kind of system, I’ll admit. Still, the fact that the code fundamentally works as a prediction engine rules out any sort of real cognition, even if it makes an impressive simulacrum. There’s just no ability to invent, no true novelty, which – to my mind at least – is the hallmark of actual reasoning.


> an open source reasoning AI
It’s still an LLM right? I’m going to have to take issue with your use of the word ‘reasoning’ here
Motherfucker put a trigger warning on that shit
Do you have a moment to talk about our Lord and Saviour, Zalgo?
Had to post this, gives me the giggles every time. Ryan George gets it.
https://www.youtube.com/watch?v=nQpxAvjD_30