• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: December 4th, 2024


  • That is not what the article describes at all.

    A city representative said officials reviewed the intersection after receiving concerns from Brandlin and determined it did not meet the requirements for a four-way stop but added pedestrian striping to improve safety.

    Brandlin spent about $1,000 of his own money on commercial-grade materials, including 30-inch reflective stop signs matching the other ones on the street. He began installing them himself to replace the yellow posted crosswalk signs on the intersection in the early morning of March 14, according to the El Segundo Police Department.

    Police arrested him around 1:30 a.m. while he was working on the second direction of traffic. Brandlin called the arrest excessive, saying he was cited with multiple charges, including felonies.



  • I have yet to meet a woman I’m close enough friends with who doesn’t have a personal sexual assault story. Not a harassment story, an SA story. Could just be bad luck, but I don’t think it is. It also lines right up with the statistic that 3 in 4 women are sexually assaulted before 30 (that stat is from memory, but I’ll try to track it down in a bit).

    I believe it is much worse than you think.

    EDIT: So, on the stat I popped: NSVRC says 1 in 5 women in their lifetime, and RAINN says 1 in 6 in their lifetime. It’s been a while since I’d read that stat, so it makes sense it’d be off. (Though it is disappointing just how far off it ended up being; big whiff on my part.) Those stat pages also have numbers for men as well.




  • I would disagree. Many other technologies have eliminated more jobs, caused more damage to society and the environment, and been more generally consequential. AI has been bad in all those ways, but it is by no means the worst of them all. Let’s not forget that we’re still dealing with the social damage and ripple effects of the invention of the atomic bomb, and that previous video and audio manipulation tools had already severely damaged social trust in media. LLMs have just worsened those already significantly damaged systems.




  • Could also put up:

    • Massive numbers of people are exploited in order to train various AI systems.
    • Machine learning apps that create text or images from prompts are supposed to be supplementary, but businesses are actively trying to replace their workers with this software.
    • Machine learning image generation currently shows diminishing returns for training as we pump exponentially more content into it.
    • Machine-generated text and images self-poison their generators’ sample pools, greatly diminishing the ability of these systems to learn from real-world content.

    There’s actually a much longer list if we expand to talking about other AI systems, like the robotic systems we’re currently training for use in automated warfare. There’s also the angle of these image and text generation systems being used for political manipulation and scams. There are a lot of terrible problems created by this tech.


  • The big problem is the state, which has such extreme power.

    The big part that makes those states work is default compliance, which allows the state to commit violence using everyone who complies (which is everyone by default). Individuals being non-compliant directly translates to power being siphoned away from the state. This is why morality and ethics education are so important: the state cannot do anything immoral if individuals refuse to carry out immoral actions. The second most important thing is transparency, since states use opacity as a means to obscure the morality of their actions. This is one reason why dense hierarchies are used in governments - to obfuscate actions and provide personal deniability to members of the state infrastructure.