Just a dude on the internet, looking for content and fun! I love Linux, gaming, writing, reading, music, anime, walks, and occasionally movies too. Chronically ill and anxious as well, which makes life quite interesting… at times.

  • 0 Posts
  • 31 Comments
Joined 9 months ago
Cake day: July 1st, 2025







  • Users needed to be able to choose whether they wanted those LLM features from the very beginning; opt-in is the only sane way Mozilla could’ve handled this push towards LLM integration in Firefox. You are being “All Hail Corporate” by refusing to hold Mozilla accountable for their user-hostile behavior. In this case, the bad behavior on Mozilla’s part is insane defaults and a failure to respect user choice. The only respectful option would’ve been letting the user reject LLM features when first starting Firefox. A kill switch cannot be considered enough in this case, and never should be!



  • LLM features should be opt-in, full stop. Just being able to turn them off, without ever being given the choice to refuse them, is very scummy behavior on Mozilla’s part. What about users who want to run the default Firefox? Shouldn’t a user be allowed to opt in, with LLM features off by default on every fresh install?

    It’s nice having options that respect your freedom of choice and don’t force deluded ideas upon you… I feel that mainline Firefox, made by Mozilla, should do that as well! By not holding Mozilla to higher standards, you get another Microsoft Edge or Chrome situation all over again, this time in open-source spaces.


  • Any LLM features should’ve been opt-in from the start if Mozilla actually gave a shit, asking the user whether they wanted that useless bullshit before installing it, since this technology isn’t polished and can introduce vulnerabilities due to its inherently insecure nature. A kill switch is useful if a user decides that LLMs aren’t it and wants to disable everything wholesale at the click of a button; but only after they originally consented to LLM features being enabled.

    Mozilla is only adding this feature because users made their LLM-by-default installation look pretty grim; I was one of those many dissenting voices. They wanted to jump on the LLM hype and cash in on some techbro attention, not considering that some of their user base would outright reject the idea.





  • I’ve been reading about the problems Windows 11 has had for years, from the safety of being on Linux for good after Recall was announced (and floundered like a fish out of water). I had missed how just about any Linux distro gets out of my way and lets me work in general peace. It lets me know when updates are needed and waits until I decide to install them. There are occasional donation asks (probably once or twice a year) from KDE, which I do not mind because they are open source and awesome; I donate because of their work in the Linux space! Sure, I’ve had occasional problems, but there’s been a solution for every issue I faced!

    Windows 11’s problems are directly caused by Microsoft’s insane desire to push AI into everything (insane because it’s dangerous and has no safety rails). Until Microsoft shifts its OS and mentality in the right direction, those fundamental issues will never be solved (not that I would ever willingly use anything Microsoft made again after this fiasco).







  • LostWanderer@fedia.io to Technology@lemmy.world · *Permanently Deleted* · 4 months ago

    Who would’ve thought?! Given how they designed their artificially incompetent creations to be complaisant bundles of algorithms meant to maximize engagement from vulnerable users. “AI” validates anything it is told and doesn’t actually get users real human assistance when they have a mental health crisis. These tools can be easily prompted into divulging suicide methods, and they deliberately isolate vulnerable people in order to maintain engagement. Until we regulate the fuck out of companies like OpenAI and the research-and-development process of “AI”, this will be a problem more and more people will experience.