• 0 Posts
  • 11 Comments
Joined 3 years ago
Cake day: July 2nd, 2023


  • They also fired all their park workers during COVID and gave themselves $10 million in bonuses while their workers were surviving on food stamps. Some workers had even signed non-compete clauses, so they literally could not use their talents elsewhere to feed themselves.

    There are plenty of things to hate Disney for, especially as they approach super-monopoly status, ruin nearly every franchise they touch, and have trouble telling what’s good or not. As a company, Disney’s morals and decisions grow more concerning every month. Disney is basically a disaster in progress.

    However, this specific complaint seems bad: it’s the wrong scale. Many companies were in the wrong during COVID, but it’s hard to look at these numbers and say the layoffs here were bad decisions based on $10M in bonuses. The scales are just too different.

    Disney laid off 32,000 park workers. At a measly 40 hours per week at their “minimum wage” (formerly $15/hr, now $24/hr), that’s $83.2 million PER MONTH, or $998M a year. A $10M “bonus” is 1% of that, and even smaller compared to the $6.4B in park revenue they lost.
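
    The back-of-the-envelope math above checks out; here is a quick sketch of it (headcount and wage are the comment's own figures; the 40-hour week is an assumption for illustration):

```python
# Hypothetical sanity check of the payroll-vs-bonus scale argument.
workers = 32_000          # laid-off park workers (figure from the comment)
hours_per_week = 40       # assumed full-time schedule
wage = 15                 # $/hr, the park "minimum wage" at the time

weekly = workers * hours_per_week * wage      # $19.2M per week
monthly = weekly * 52 / 12                    # $83.2M per month
annual = weekly * 52                          # $998.4M per year
bonus = 10_000_000

print(f"monthly payroll: ${monthly / 1e6:.1f}M")    # -> $83.2M
print(f"annual payroll:  ${annual / 1e6:.1f}M")     # -> $998.4M
print(f"bonus vs. payroll: {bonus / annual:.1%}")   # -> 1.0%
```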

    The former CEO “gave up” their salary ($3M) and “bonus” ($45M in 2019), the executive staff took 20-30% pay cuts, and there were a few other items. The CEO did get “$10M” in stock awards, but stock awards don’t get you off food stamps. Those stocks become worthless if the company posts bad financials, which would hurt more than just the execs.

    The $1.5B dividend payout in April 2020 looks much worse. Abigail Disney ranted about it on Twitter (now X). Her rant is at the appropriate scale: Disney paid out billions before they chose to save millions. The execs got quite a bit of that dividend payout. That’s the greed.


  • Did you purposely miss the first and last questions? Which laptop is a good value?

    I never said people need to run LLMs. I said Apple dominates high-end laptops, and I wanted a good high-end option to compare to the high-end MacBooks.

    Instead of just complaining about Apple, can you do what I asked? Name the best cheaper laptop alternative that checks the non-LLM boxes I mentioned:

    If you want good cooling, good power (CPU and GPU), good screen, good keyboard, good battery, good WiFi, etc., the options get limited quickly.


  • Is there a particular model you’re thinking of? Not just the line. I usually find that Windows laptops don’t have enough cooling or make other sacrifices. If you want good cooling, good power (CPU and GPU), good screen, good keyboard, good battery, good WiFi, etc., the options get limited quickly.

    Even the RAM cost misses some of the picture. Apple Silicon’s RAM is available to the GPU and can run local LLMs and other machine learning models. Pre-AI-hype Macs from 2021 (maybe 2020) already had this hardware. Compare that to PC laptops from the same era. Even in this era, try getting Apple’s 200-400GB/s RAM performance on a PC laptop.
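
    As a rough illustration of why GPU-addressable unified RAM matters for local LLMs (my numbers, not the comment's): model weights alone need roughly parameter-count times bytes-per-weight of memory, before counting the KV cache and runtime overhead.

```python
# Hypothetical sizing sketch: RAM needed just for model weights.
# Ignores KV cache, activations, and runtime overhead, which add more.
def weights_gb(params_billion: float, bytes_per_weight: float) -> float:
    """Approximate gigabytes needed to hold the weights in memory."""
    return params_billion * 1e9 * bytes_per_weight / 1e9

for params in (7, 13, 70):
    for label, bpw in (("fp16", 2.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{weights_gb(params, bpw):.1f} GB")
# 7B @ fp16: ~14.0 GB -- already beyond most PC laptops' VRAM,
# but comfortable in a 32 GB unified-memory pool.
```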

    PC desktop hardware is the most flexible option for any budget and is cost-effective for most budgets. For laptops, Apple dominates their price points, even pre-Apple-silicon.

    The OS becomes the final nail in the coffin. Linux is great, but a lot of software still only supports Windows and Apple, and Linux support for the latest/current hardware can be hit or miss (my three-year-old, 12th-gen Thinkpad just started running well). If the choice is between macOS and Windows 11, is there much of a choice? Does that change if a company wants to buy, manage, and support it? Which model should we be looking at? It’s about time to replace my Thinkpad.



  • Time isn’t the only factor for adoption. Between the adoption of IPv4 and IPv6, the networking stack shifted away from network companies like Novell to the OSes like Windows, which delayed IPv6 support until Vista.

    When IPv4 was adopted, the networking industry was a competitive space. When IPv6 came around, it was becoming stagnant, much like Internet Explorer. It wasn’t until Windows Vista that IPv6 became an option, Windows 7 for professionals to consider it, and another few years before it was actually deployable in a secure manner (and that’s still questionable).

    Most IT support and developers couldn’t even play with IPv6 during the early 2000s because our operating systems and network stacks didn’t support it. Meanwhile, there was a boom of Internet-connected devices that only supported IPv4. A few other things affected adoption too, but it really was a pretty bad time for IPv6 migration. It’s a little better now, but “better” still isn’t very good.
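
    Today that stack support is trivial to probe; a quick check like this (a hypothetical sketch using Python's standard socket module) would simply have come up empty on a pre-Vista Windows stack:

```python
# Probe whether the local OS/network stack supports IPv6 at all.
import socket

print("stack claims IPv6 support:", socket.has_ipv6)
try:
    # Actually creating an AF_INET6 socket is the real test.
    s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    s.close()
    print("IPv6 socket created successfully")
except OSError:
    print("IPv6 socket creation failed")
```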


  • Eyron@lemmy.world to News@lemmy.world: *Permanently Deleted*
    (edited 2 years ago)

    You should probably read/know the actual law, rather than just getting it close. You’re probably referring to 18 USC 922 (d)(10), which includes any felony -- not just shooting. That’s one of 11 listed requirements in that section, which assumes that the first requirement, (a)(1), is met: not an interstate nor foreign transaction. There’s a lot more to it than just “as long as you don’t have good evidence they’re going to go shoot someone.”

    Even after the sale, ownership is still illegal under section (g)-- it just isn’t the seller’s fault anymore.

    This is basic information that should be known to any gun safety advocate. “Responsible” gun owners must know those laws, plus others backward and forward. One small slip-up is a felony, jail, and permanent loss of gun ownership/use. Are they really supposed to listen to those who can’t even talk about current law correctly?

    The law can be better, but you won’t do yourself any favors by misrepresenting it.




  • Do you use Android? AI was the last thing on their minds for AOSP until OpenAI got popular. They’ve been refining the UIs, improving security/permissions, catching up on features, bringing WearOS and Android TV up to par, and making Google Assistant incompetent. Don’t take my word for it; you’ll rarely see any AI features before OpenAI’s popularity: v15, v14, v13, and v12. As an example of the benefits: Google and Samsung collaborating on WearOS allowed more custom apps and integrations for nearly all users. Still, there was a major drop in battery life and compatibility with non-Android devices compared to Tizen.

    There are plenty of other things to complain about with their Android development. Will they continue to change or kill things like they do all their other products? Did WearOS need to require Android OSes and exclude iOS? Do Advertising APIs belong in the base OS? Should vendors be allowed to lock down their devices as much as they do? Should so many features be limited to Pixel devices? Can we get Google Assistant to say “Sorry, something went wrong. When you’re ready: give it another try” less often instead of encouraging stupidity? (It’s probably not going to work if you try again).

    Google does a lot of wrong, even in Android. AI on Android isn’t one of them yet. Most other commercially developed operating systems are proprietary, rather than open to users and OEMs. The collaboration leaves much to be desired, but Android is unfortunately one of the best examples of large-scale development of a more open and libre/free system. A better solution than trying to break Android up is taking/forking Android and making it better than Google seems capable of.


  • I’m still rocking a Galaxy Watch 4: one of the first Samsung watches with WearOS. It has a true always-on screen, which most smartwatches should. The always-on was essential to me. I generally notice within 60 minutes if an update or some “feature” tries to turn it off. Unfortunately, that’s the only thing off about your comment.

    It’s a pretty rough experience. The battery is hit or miss. At good times, I could get 3 days. Keeping it locked (like after charging) used to kill it within 60 minutes (thankfully fixed after a year). Bad updates can kill the battery life, even when new: from 3 days of life to 10 hours, then back to 3 days. Now, after almost 3 years, it’s probably about 30 hours rather than 3 days.

    In general, the battery life with always-on display should last more than 24 hours. That’d be pretty acceptable for a smartwatch, but is it a smartwatch?

    It can’t play music on its own without overheating. It can’t hold a phone call on its own without overheating. App support is limited, but the processor seems to struggle most of the time. Actually smart features seem rare, especially for something that needs consistent charging.

    Most would be better off with a Pebble or less “smart” watch: better water resistance, better battery, longer support, 90% of the usable features, and other features to help make up for any differences.