• 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: July 19th, 2023






  • But it’s true.

    Coding is, like, the smallest aspect of programming. And unfortunately the part that’s the most fun.

    But if you’re a coder, I assume you don’t know how to design complex systems, just (maybe) implement them or parts of them. That’s not what defines programming.

    (Disclaimer, in all fairness: that’s in my personal, layman opinion as someone who doesn’t know much theory. I might just be very very in the wrong here, lol.)





  • There is no chance you would go back to the Intel system and plug it in every 2 hours.

    Don’t be unrealistic. Most laptops in the MacBook price range will get 8 hours of usage in low-consumption mode, or around 5-6 if you need more power.

    While I completely agree on the repairability front, which is really quite unfortunate and quite frankly a shame (at least iPhones have been getting more repairable, silver lining I guess? the damned need for never-ending profits), the battery-life claim is just… not unrealistic.

    That being said, unified memory kind of sucks, but it’s still understandable given the advantages it brings, and fixed-in-place main storage that also holds the OS is just plain shitty. It’ll render all these devices unusable once that SSD gives out.

    Anyhow, off the tangent again: I have Stats installed for general system monitoring, as well as AlDente to limit charging to 80% of maximum battery capacity. All that to say: after around 1.5 years of owning the M2 MacBook Air (which I’d been waiting to buy since late 2019, btw), I know pretty well which wattages to expect and can gauge its power usage fairly accurately.

    I’ll try to give a generalized rundown:

    • High-intensity workloads (mostly in shorter bursts for me): typically around 10W. I’ve installed Minecraft before once just to test it, and I get reasonable frames (both modded and unmodded), where it seemed to draw maybe 15W, thus still being able to charge (!) the battery off a 30W power supply. It doesn’t ever really go above 20W as a rule of thumb, and the CPU/GPU will be capable enough for easily 80-90% of the general population.
    • Idle/suspended: unnoticeable. I use my machine every day with maybe an exception or three per month, but from what I’ve read from others, the battery will dip slightly after a month of standby. That’s mostly due to battery chemistry, I’d assume, not actual background usage.
    • Idle/running, light usage (yes, it’s the same category*): it mostly depends on screen brightness (edit: whoops, I originally wrote screen size). Energy consumption from CPU usage is by far the smaller portion. I’d say 2-4W, maybe; a really bright screen makes it jump to 8-9W, while darker-but-not-minimum brightness leaves it at maybe 5W.

    Given the spec sheet’s 52 Wh battery, you can draw your own conclusions about the actual runtime of this thing by simple division. I leave it mostly plugged in to preserve the battery for when it becomes a couch laptop in around 5-8 years, so I can’t actually testify to real-world runtimes yet; I just know the numbers.
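    The "simple division" above can be sketched out like so. This is just an illustration: the 52 Wh capacity is from the spec sheet quoted above, and the per-scenario wattages are my rough groupings of the figures in the rundown, not measured values.

    ```python
    # Rough runtime estimates: battery capacity (Wh) divided by draw (W) gives hours.
    BATTERY_WH = 52  # M2 MacBook Air spec-sheet capacity

    # Approximate draws pulled from the rundown above (illustrative groupings)
    scenarios = {
        "light usage, dim screen": 3,
        "light usage, bright screen": 8.5,
        "high-intensity burst": 15,
    }

    for name, watts in scenarios.items():
        hours = BATTERY_WH / watts
        print(f"{name}: ~{hours:.1f} h")
    ```

    Which lines up with the claim: even the bright-screen case comfortably clears a full workday, and only sustained heavy load pulls it down toward 3-4 hours.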

    I didn’t mean for this to come off as fanboi-y as it did. I also really want to support Framework, but recommending it universally, from my great-aunt to my colleagues, is not as easy as it is with the MacBook. Given they’re a company probably 1,000 times smaller than Apple, what they’re doing is still tremendously impressive, but in all honesty, I don’t see myself leaving the ARM architecture anytime soon. It’s just too damn efficient.

    *At least for my typical usage, which is a browser with far too many tabs and windows open + a few shell sessions + a text editor (which may or may not be shell-based), sometimes a full-fledged IDE, but mostly just text editors with plugins.