It really doesn’t help that it is a no-effort repost of a highly speculative article.
Check the previous incarnation of this post on the same community. It had positive karma!
I didn’t read the article, I just looked it up to find the fortune-teller graph that I can see on Lemmy. The one that predicts that “misalignment” will be detected in 2027… and then AI will go straight up, or stay horizontal for a moment and then go straight up. I don’t have enough time to check whether the rest of the article is on the same BS level.
More helium to pump the balloon


My AI questions:
Can AI fuck off?
Can the bubble pop?
Can we make all AI models free and open source since it’s entirely trained on stolen content?


Agreed. Copilot hijacked my Windows, and I feel much more comfortable using Linux now.


My Bose headphones didn’t break, but after two years of use they looked awful, like garbage. So I decided to buy a Sony… The Sony broke after one year, after falling off the bed (cheap headphones would survive that without a scratch, but these were expensive).


<Vader voice>We Overhauled Our Terms of Service. Pray we don’t overhaul it any further.</Vader voice>


I’m from Poland and didn’t think about it until I found out that for the first Witcher game, a game aimed at adults, they had to develop a special version for the USA. That version included full violence with decapitations and gore, but showing a woman’s nipple was too far, so they had to censor it.


Profitability went out of fashion. Control and power are the new trend. Big tech controls 95% (if not more) of all information sources. Even if they don’t own the newspapers, they can make them more or less visible at will. Google tried to monetize it with ads everywhere, but that is the old way of thinking.
Cambridge Analytica showed that by controlling the information, you can select who makes the laws.
Musk bought Twitter without any chance of a reasonable return on investment. Guess why?
The “old tech” (Google, Facebook, Twitter, TikTok) controls what you know. But AI goes a step further: it controls what you think about things. It can explain why what you thought was outrageous is perfectly normal, or vice versa. It’s your own personalised propaganda machine. Once enough people are hooked and get used to using AI, the enshittification phase will begin and the AI will become more and more opinionated. And most people won’t notice… Some will, but they don’t need to influence everyone.


Don’t worry, it makes your life a little bit harder now, but it is worth it, because if everything goes well, your employer will save a lot of money by firing you. Yes, you will lose your source of income, but ChatGPT will help you by providing tips on how to survive as a homeless person!
Totally worth it.


Impossible!
People only use languages other than English to make fun of Americans. But as soon as there are no Americans nearby, we all switch back to English /s


Most good changes are difficult. And it’s not “all or nothing”; every small victory is worth it on its own.


Their what?


Very expensive but possible… in 20 to 50 years… If most of the EU cooperates… So… No.


A light bulb is a reverse solar panel.


What if the “laws aren’t the same” remark was about “you can’t transmit without a permit”, not about “you need a license to listen”?


“Rico!”
“Kaboom?”
“Yes Rico, kaboom!”


It’s the dumbest “security” rule I’ve ever heard of. It’s “limited access” shared by almost 30 million people.


They already have four. What they need is one that supports 100% of the features, is easy to use, and is selected by default. They might keep the other, more advanced UIs, but forcing new users to select a UI as the first step, before they can do anything in the software, is just plain stupid.
I was alive when computer RAM was measured in KB, and when you wanted more of it, you had to manually solder it onto the motherboard… Youngling.