• 0 Posts
  • 11 Comments
Joined 7 months ago
Cake day: March 8th, 2024

  • I don’t know that it’s an eyesight issue. I mean, if you have good enough eyesight to read stuff on your phone screen you have good enough eyesight to see the difference.

    It may be an awareness thing, where the more you care about photography the more the limitations of the bad cameras stand out. And hey, that’s fine, if the phone makes good enough pictures for you that’s great. Plus, yeah, you can get phones with the exact same lens and sensor where one of them has a big fat bump that is deliberately blown up to make the cameras “feel” premium. There’s been a fair amount of marketing around this.

    But if you compare A to B it’s very obvious. Camera bumps became a marker of premium phones for a reason.



  • I am annoyed by most phone trends of the past decade, but… yeah, if you go back to a 2014 phone today there is some readjustment between what you remember phone photos and videos looking like versus what they actually look like. That was the Galaxy S5 year. That thing had a single camera you would consider unacceptable as your selfie shooter today.

    EDIT: This thread made me go look up reviews, and man, yeah, I remember every single indoors photo on my own S5 looking just like this. What a blast of nostalgia. I didn’t realize there is a digital equivalent to 80s pictures having gone all sepia and magenta-y, but here it is.




  • I once had a guy walk into the subway, sit down, loudly declare he’d sneak into a military base, steal a tank and kill us all, then rant for a while about specific ways to kill his fellow passengers, including some very specific grenade action.

    Then he sat there in silence for a couple of minutes, quietly turned towards the too-horrified-to-change-seats nerdy guy to his left and politely asked him if he had a lighter for his cigarette.

    It was a morning train, most people just kept trying to nap.


  • I know a few. Xerox is used for photocopying in other languages. Kleenex is the accepted term for “paper tissue” in Spain. Zodiac and Vespa are used for specific types of ship and motorcycle in multiple places, even when not manufactured by those brands. Thermos is a brand name, used in multiple countries as well. Sellotape is used in the UK for transparent sticky tape.

    I don’t speak every regional variant of every language, but the short answer is this is definitely not a US thing. At all.


  • “Jello” is a brand name, which I think may be the example most people in the US specifically don’t realize. There are tons of others.

    I think “googling” counts because a) it kinda makes sense even without the branding, b) I hear it all the time, and c) I say it myself even though I haven’t used Google as my default search engine for ages.



  • For one thing, it’s absolutely not true that what these apps provide is the same as what we had. That’s another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

    For another, some of the numbers being thrown around are not realistic or factual, are not presented in context, or are part of a power-increase trend that was already ongoing with earlier applications. The average high-end desktop PC used to run on 250 W in the 90s and 500 W in the 2000s. Mine now runs at 1,000 W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it’s the equivalent of turning on your microwave oven.
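    The comparison above is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, using the illustrative wattages from the comment (not measurements):

```python
def kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours for a device drawing `watts` for `hours`."""
    return watts * hours / 1000

# An hour of gaming on a 250 W 90s PC vs. a 1,000 W modern rig,
# vs. running a ~1,000 W microwave oven for the same hour.
gaming_90s = kwh(250, 1)    # 0.25 kWh
gaming_now = kwh(1000, 1)   # 1.0 kWh
microwave  = kwh(1000, 1)   # 1.0 kWh — same ballpark as the modern gaming PC
```

    The point being made is only about orders of magnitude: a modern gaming session and a microwave draw comparable power.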

    The argument that we are burning more power because we’re using more compute for entertainment purposes is not factually incorrect, but it’s both hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and consistent with how we have used other computing features for ages.

    The only reason you’re so mad about me wasting some energy asking an AI to generate a cute picture, but not at me using an AI to generate frames for my videogame, is that one of those is a viral panic that maps neatly onto the viral panic people already had about crypto, while the other is a frog that has been slow-boiling for three decades, so people don’t have a reason to have an opinion about it.


  • The tragic irony of the kind of misinformed article this is linking is that the server farms that would be running this stuff are fairly efficient. The water is reused and recycled, the heat is often used for other applications. Because wasting fewer resources is cheaper than wasting more resources.

    But all those locally-run models on laptop CPUs and desktop GPUs? That’s grid power being turned into heat and vented into a home (probably with air conditioning on).

    The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it across billions of computing devices paid for by individual users. And nobody is going to notice or care.

    I do hate our media landscape sometimes.