I ran a Hackintosh as my main machine for many years. I specced (using info found on tonymacx86[1]) and built it around 15 years ago to make it work well as a Hackintosh.
And work well it did. Incredibly well, actually. But now the doors are closing: macOS will soon be ARM/Silicon only, and that will be the end of the Hackintosh era. That's why I'm not willing to invest the time or energy to do the same again.
Hopefully a new generation of Hackintoshers will someday find a way to make macOS run on non-Apple ARM hardware. Perhaps it's already happening? :)
After 20 years of Hackintosh and Apple hardware I eventually lost hope in Apple due to their unsustainable practices (mostly unserviceable and unupgradeable soldered memory and storage) and moved on to Linux (https://getaurora.dev/). Couldn't be happier with KDE Plasma.
[1] https://www.tonymacx86.com
Funnily enough, I hope for the exact opposite: Apple hardware is IMHO the best and most irreplaceable part of the whole Apple experience, and I’d be a very happy man if there were a simple way of running not-macOS on, e.g., the M3 MacBook Pro with full driver support.
Not saying it isn't good, it is. I still use Silicon Macs at work.
Just depends on your priorities and what you value I guess.
I'm not on board with the iOSification of macOS either. IMO, it has gotten progressively worse with every version since Snow Leopard. Atomic Fedora distros and KDE Plasma have made me look forward to upgrades again (and yeah, they happen automatically in the background).
With macOS I just stuck around as long as I could knowing that there'd be no upsides to getting the next macOS release.
I haven't noticed much, but they did ruin iTunes and Sysprefs.
There’s really nothing comparable when it comes to real world battery life in the x86 world. Things there have improved with Lunar Lake and the last couple of gens of AMD mobile SoCs, but there’s still a pretty chunky delta. Most x86 laptops still require active cooling too, and unfortunately reducing fan noise or at least improving the profile of said noise is rarely a priority.
Luckily we have Asahi Linux. https://asahilinux.org/docs/platform/feature-support/m3/
…with barely supported M1 and LKML drama. I would love for it to be an option, but it just isn’t.
To be fair - the reverse engineering has to be redone for each chip in the M series, and the M1 is still the best-supported Apple SoC so far. I guess we should support the project if we want them to succeed (they live off donations).
We could also regulate that large hardware companies have to publicly document their systems. Otherwise the public pays with more e-waste.
This is too sensible an idea to ever happen, though. Nearly every time a discussion comes up about this, the companies essentially argue that their right to fill the world with e-waste trumps all.
> macOS will soon be ARM/Silicon only and that will be the end of a Hackintosh era.
Need it be so? There will be a "final" version for Intel, and it might have a lifetime of many years on non-Apple hardware.
Presumably ARM hackintoshes are, or will be, possible too? Recent ARM processors (eg: Qualcomm’s Snapdragon X) have performance comparable to Apple’s M-series, and are even finding their way into desktops. So there shouldn’t be a shortage of candidate hardware in the coming years!
ARM processors significantly differ from Apple Silicon. Apple's M series is not fully compliant with the ARM64 ISA and contains undocumented instructions. There was a discussion about that a few months ago on /r/hackintosh https://www.reddit.com/r/hackintosh/comments/1c9rfdz/is_hack...
My understanding is that it will be much harder, as ARM is far less standardized than the Intel x86 platform. This is something Linus Torvalds has talked about in the past [1]. Think about how huge an effort it is to get Linux to run on Apple’s ARM machines, or how some distros have a Pi-specific variant. I guess getting macOS to run on other ARM machines could be similarly challenging.
1: https://www.pcworld.com/article/410627/why-linux-pioneer-lin...
I ran one for quite a while back in the netbook days - Snow Leopard worked quite well on the first Atom Eee PC. Back then macOS was a bit of a revelation - coming from Windows or Linux, where the first thing you did after installing the OS was spend a few hours tweaking things, there was nothing to do and it just ran.
Eventually I went back to debian linux after a main disk failure and kept the machine going a few years longer by progressively lightening the software load, from gnome -> xfce -> lxde, but the time came when it was just too slow to browse any more, and it had to be retired.
I've also had a very "sustainable" 6800k machine with 32GB of RAM that was later repurposed for many other tasks.
But the title is more clickbait than a claim the article actually backs up.
The power source is, I guess, also a factor, since newer Macs are much more power efficient than in the Intel days.
Since 2016, Apple has made its machines less "sustainable" in the sense of upgradability (my 2013 MacBook Air was borderline with 8GB of RAM and ended up with a 2TB drive), so I'm happy the new Mac minis are going in that "hackable" direction again.
Anyway, software-wise, the "Hackintosh" scene is also valuable for genuine Mac users, just like enabling Windows 11 on unsupported hardware: you can get unsupported Macs up to Sequoia by running those custom EFIs.
The power savings of new computers are not significant compared with the energy needed to build them. The University of Edinburgh (2016) found that extending the lifetime of a single computer and monitor from four years to six years avoids approximately 190 kgCO2e of carbon emissions (and I doubt anyone is even using their devices for 4 years nowadays). Report is here: https://edwebcontent.ed.ac.uk/sites/default/files/atoms/file...
You can see it straight from the horse's mouth, too. On pages 13 and 14 [1], they estimate that between 9 and 23% of the total lifetime emissions for a Mac mini come from using it. The rest is manufacturing and distribution.
According to the fine print, the assumed lifecycle of the computer is determined by some ISO standards. I spent a good 10 minutes searching the web trying to pin down a ballpark figure for it (3 years? 5 years? 10 years?) and came up completely empty.
[1] https://www.apple.com/environment/pdf/products/desktops/Mac_...
The nice thing is that all those new x86 laptops are going to keep being slower than my 2020 M1 mini unless they're high spec, so software/websites are still built with that in mind.
[Edit] Ok, I should have read better. The i7-6800k is from Intel and not from the stone age.
---
Are you sure your 68000 machine had 32 GB of RAM? If memory serves me well, it was most likely 32 MB.
The 68020 has a 32-bit address space, so the maximum RAM it can handle is 4 GB
They're talking about the i7-6800k
https://www.intel.com/content/www/us/en/products/sku/94189/i...
Honestly I saw 6800k and thought it was the Motorola 6800.
Just install linux and move on with life
For those who care about their life and getting things done, just install WSL.
You still get to run Windows as normal and have Linux running without messing around or playing with display servers, drivers and the like.
What display servers, drivers do you need to mess with? I have a modern Dell laptop and everything worked immediately. Display, WiFi, Bluetooth, Ethernet, storage, sleep mode, good battery life, every port worked, the webcam worked. This was on Fedora. The case is the same for Debian on a different Lenovo laptop.
I know Linux has this reputation for being difficult to use, having no hardware support, etc., and if this were 2009 I might agree with that. However, Linux couldn’t be easier to use right now.
Wayland works fine. Pipewire? My laptop speakers work well, and plugged into a dock with conventional speakers it works fine too.
I’ve seen this repeated over and over and I don’t believe people are lying, I’m sure some have had the experience recently but I’d like to know what hardware they’re using.
The only reason I can see to use Windows is software support for something proprietary.
We have Lenovo Thinkpads at work, which are actually certified for Linux but have known compatibility issues with Bluetooth audio there. I don't remember the exact details, but basically the end result is the Linux users here only have wired headphones.
And if you care about dedicated graphics, especially Nvidia, those drivers have a very different level of support for Windows.
If you care about obscure cases, I have a really cheap Asus laptop (almost a netbook) where Linux mostly works, but the fans stop spinning any time it wakes from sleep, until the next reboot. Somehow not an issue in Windows.
I quickly threw Fedora on a 6-year-old Microsoft Surface Go LTE, and the only thing that didn't work was the SIM card. Everything else was perfect, including the touchscreen.
> I have a modern Dell laptop
That's the problematic part for me personally. I want a HiDPI convertible/tablet form factor, which means no Dell, and Linux support there is currently not an option.
I had high hopes for the 12" Framework, but again its resolution is 1200p, which is much lower than a Surface Pro's, for instance.
I installed Kubuntu 24 on a generic laptop (Machreator N17) and it worked fine other than having to disable D3cold in the BIOS so Wi-Fi works when waking from sleep. I did spend hours before the BIOS setting trying alternate rtw_8821 drivers to no avail.
Bluetooth support is still a shit show if you don't buy a laptop that has been in some way developed for Linux (if not for your current model, then for a model using similar components). Alternatively, buying a custom Bluetooth USB adapter may work, but only on a supported distro and only ever with an older Bluetooth protocol.
Same goes for WiFi. Good luck trying to get WiFi 6 to work properly, ignoring the fact that you have to go out of your way to find an adapter that will even work on Linux.
I don't think I've ever seen a device where Bluetooth works 100% of the time. I had issues on Windows, Linux, Mac, Android and iOS.
I have a Lenovo laptop where Bluetooth works perfectly. I even had a Bluetooth mouse from Logitech that could switch between BT and the Logi receiver, so I was able to use it on two computers back and forth with a button press. It instantly connected to the laptop, and took ~1 second to connect to the proprietary receiver.
Maybe a fluke, because the keyboard died on that laptop. Just like every other Lenovo I've owned.
I'm never buying another one, even if it has the exact specs I need when I'm shopping for a great price, which is how they've gotten me 3 times now.
> without messing around or playing with display servers, drivers and the like.
Who does that? Not Linux users, at least, as all drivers are usually shipped by the distro. If my memory is correct, it's on the OS built in Redmond that devices used to come with a CD-ROM full of drivers, because the OS didn't care to provide them.
There’s a big asterisk that should be added, though. Windows is not a productive desktop environment for me, primarily due to its poor implementation of virtual desktops and handling of multiple monitors that feels like a bad hack (to set different wallpapers per-display, you still need to combine the picture for each display into a single gigantic image which is ridiculous). I’m sure I’m not alone in these feelings.
I use an HDTV and an HDR 2K 165 Hz monitor. I set the desktop background to black; anything else messes with my peripheral vision if some window decoration isn't aligned with the background.
I also use a 720p60 projector as a screen sometimes on that machine, too.
It all just works. My largest complaint with Windows seems to have been fixed with the 24h2 patch - read on if you care.
I use "@" prefixes on folder names to have them sort first when sorting by name descending. Somehow for a couple years, some applications would crash if trying to stat these folders, for example, launching VLC would usually crash it if the last thing I played was something from V:\@TV\Columbo
I was keeping a list of things that crashed like this; or while opening or saving files in "@" folders. Notepad, notepad++, vlc, mplayer, windows photo viewer (any and all), and so on
I was able to guess it had something to do with internal variables in the windows API, you can see them on my system in pwsh by going to V: \ and typing @ and pressing tab key. They're even a different color than the quoted folder name that eventually appears.
I could never figure out how to report this bug in a way that would be helpful, as I barely use powershell or the windows API.
I tried WSL, but when WSL 1 was abandoned in favor of Hyper-V I was devastated, because the new Hyper-V-based WSL is kind of shit in comparison. No longer are processes and files shared with the host system; you just get an opaque block of CPU and an opaque block of disk space being used. BTW, if at any time in the future you have to clear out part of that opaque block of disk space to make room - good luck shrinking it. I completely recognize why WSL 1 could have been unsustainable for Microsoft, but that does not mean I agree with the decision to abandon it.
There is a certain attraction to having Linux safely contained, but I'm just not getting it at all. I don't want my Linux in a container, I want to live there.
So basically that's why I now use macOS: it doesn't have to use hacks like WSL in order to be like Linux, it just already is Unix.
Also in general I tend to find that Apple is more detail-oriented than Microsoft, they are probably autistic as fuck and that is a very good thing. (in my opinion)
I'm not yet fully happy with the state of affairs on desktop Linux, although GNOME is to a degree shaping up to a decent experience (partly by copying macOS, though). Their trackpad support is far better than Windows, for example.
You can shrink it now! You're going to have to change the VHD format to sparse (this is why Docker for Windows doesn't run on exFAT-formatted SSDs, as I found out a couple of weeks ago) and then run compaction on it once. After running compaction once, it should grow and shrink dynamically as you'd expect.
thank you!! I will keep your advice in mind if I ever try WSL again
> because it doesn't have to use hacks like WSL in order to be like Linux, it just already is Unix.
If my knowledge is up to date, you still have to run a Linux VM (hidden behind whatever tool you are using) to run Linux containers, right?
My point is that I don't need a linux container at all because all my tools run natively on macOS. Even (and especially) the ones that wouldn't run well on Windows.
For example: I can run valgrind on macOS. Good luck running valgrind on Windows.
I recently purchased a new laptop battery and installed Ubuntu 24.04 on a MacBook Pro from 2009 that I had kept in storage. It is used for light note-taking and e-mail, because I wanted to see how viable it was to stretch the hardware a bit longer instead of adding to the e-waste problem by buying something new.
It's worked well so far. I will keep using it, and then eventually perhaps pick up another refurbished machine when it finally comes time to replace it.
There are at least two problems with this article:
1) If you can hack a current version of macOS to run on non-Apple hardware, you can probably also hack a current version of macOS to run on older Apple hardware (see OpenCore Legacy Patcher).
2) Support for macOS on x86 hardware will be discontinued in the not so distant future, putting Hackintoshes into the same category as old x86 Apple hardware.
I've done this three times, albeit not very recently. First you need a regular computer, on which you're both going to Google a ton and prepare a lot of software. It's only for fun or maaaybe a niche work use case.
I also had a MacPro5,1 I used extra long, thanks to some of the tooling mentioned here. Great machine. But seeing how it used 200W and I had to buy a new RX580 for it to be usable, idk if that really helped the environment, nor was that the reason I had it.
Hackintoshing supports integrated GPUs (iGPUs) as well (not only the RX 580) - and lots of people install on their Lenovo Thinkpads or other notebooks.
I thought it would be interesting to run the numbers for the question “how far can I drive my Prius if I didn’t buy this?” for a new MacBook Pro M4 (14 inch).
The answer is a pleasing 1337 miles (198 kg lifetime CO2 for the MacBook / 148 g per mile CO2 average for the 2011 Prius).
This is pretty surprising. I’d always thought laptop purchases had a much higher impact than an average couple of months of driving.
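For anyone who wants to check the arithmetic, here is the same back-of-envelope calculation spelled out in Python; both input figures are taken from the claim above, not independently verified:

```python
# Back-of-envelope check of the figures quoted above.
lifetime_co2_kg = 198        # claimed lifetime footprint of the 14" MacBook Pro M4
prius_g_per_mile = 148       # claimed 2011 Prius average, grams CO2 per mile

miles = lifetime_co2_kg * 1000 / prius_g_per_mile
print(f"{miles:.0f} miles")  # ~1338 miles, matching the ~1337 quoted
```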
148 grams per mile? Do you use a water wheel made of magic to charge your Prius?
For calibrating your understanding of greenhouse gas emissions: producing, packaging, shipping, using, and recovering materials from a 14-inch MacBook Pro in 2023 emits about 243 kg of CO2e [1], which is equivalent to using 27 gallons of gasoline. [2]
For some people, this may math out. But it's probably easier to worry less about buying a new laptop every once in a while, and more about driving a bit less or getting a more efficient car.
[1] https://www.apple.com/environment/pdf/products/notebooks/14-... [2] https://www.epa.gov/energy/greenhouse-gas-equivalencies-calc...
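A rough sketch of that conversion, using the EPA equivalency factor of about 8.887 kg of CO2 per gallon of gasoline burned (the factor behind link [2]):

```python
# Convert Apple's reported lifecycle footprint into "gallons of gasoline".
macbook_co2_kg = 243             # Apple's 2023 14" MacBook Pro lifecycle estimate
co2_per_gallon_kg = 8.887        # EPA: ~8,887 g CO2 per gallon of gasoline burned

gallons = macbook_co2_kg / co2_per_gallon_kg
print(f"{gallons:.0f} gallons")  # ~27 gallons, as quoted above
```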
We really need rules of thumb like this physically posted in more places. I grew up seeing "be green" signs in school that didn't really tell you how.
A reasonable rule of thumb is how much money you spend on something. Even if you buy something that is ostensibly green, your money is still flowing into an economy which isn’t.
Yeah, that's what I use if nothing else is available, but in this case the 27 gallons of gasoline costs about 1/10 of the MacBook.
Something I would like to see is well regulated “Carbon Facts” on products like nutrition facts.
…which is about as much as commuting to work that’s 5 km away for 6 months. Given a ~3 year lifespan of such a device, it isn’t such a low amount. And when you decide to get a new, more efficient car, it will have its own carbon footprint too.
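A quick sanity check of that comparison; the per-kilometre emission figure and the number of working days are my assumptions, not numbers from the comment above:

```python
# Rough check: does ~243 kg CO2e match 6 months of a 5 km (one-way) car commute?
one_way_km = 5
workdays = 6 * 21                # assumed ~21 working days per month
car_g_per_km = 190               # assumed average petrol car, grams CO2 per km

total_kg = workdays * one_way_km * 2 * car_g_per_km / 1000
print(f"{total_kg:.0f} kg CO2")  # ~240 kg, in the same ballpark as the laptop
```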
> that’s 5km away
Which is about an hour's walk, or about twenty minutes cycling.
I'm sure someone can work out, given an average of 125 W of power output for a human and the CO2 released (which is really just our food, processed into fat and attached to poisonous oxygen to get it out of our bodies), exactly how much carbon is exhaled by walking 40 minutes a day.
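Taking up that challenge with a very rough back-of-envelope sketch (assuming all the metabolic energy comes from glucose oxidation, which is a simplification):

```python
# How much CO2 does a human exhale for 40 minutes of walking at ~125 W?
power_w = 125
duration_s = 40 * 60
energy_kj = power_w * duration_s / 1000             # ~300 kJ of metabolic energy

# Glucose oxidation: ~2805 kJ/mol releases 6 * 44 g = 264 g of CO2.
co2_g_per_kj = 264 / 2805                           # ~0.094 g CO2 per kJ

co2_g = energy_kj * co2_g_per_kj
print(f"{co2_g:.0f} g CO2 per walk")                # ~28 g
print(f"{co2_g * 365 / 1000:.1f} kg CO2 per year")  # ~10 kg if done daily
```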
Yes, but getting a more efficient electric car for example typically will “pay for itself” in carbon emission reduction over a couple years. Plus it pushes older less efficient cars out of the bottom of the used market and supports growth of clean technology.
I should add that this represents 6% of the 443 gallons the average car uses per year. https://afdc.energy.gov/data/mobile/10308
And 20% of that footprint is the electricity used over the product’s life, so that should be discounted if you’re comparing buying new to reusing an older computer, and you should compare the power efficiency.
Assuming an average four-year replacement cycle (https://www.statista.com/statistics/1021171/united-states-el...) and excluding use-phase power, we can napkin-math annualize it to 243 * 0.8 / 4 = 48.6 kg CO2e ≈ 5.5 gallons of gas per year ≈ 1.2% of average annual passenger car gas consumption.
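The same napkin math, spelled out; the 8.887 kg-per-gallon conversion factor is the EPA figure, and everything else comes from the numbers linked above:

```python
# Annualized manufacturing/distribution footprint vs. average car fuel use.
macbook_co2_kg = 243           # total lifecycle footprint (Apple)
use_phase_share = 0.20         # ~20% of that is use-phase electricity
replacement_years = 4          # assumed average replacement cycle
co2_per_gallon_kg = 8.887      # EPA: CO2 per gallon of gasoline burned
avg_car_gallons_per_year = 443

annual_kg = macbook_co2_kg * (1 - use_phase_share) / replacement_years
gallons = annual_kg / co2_per_gallon_kg
print(f"{annual_kg:.1f} kg CO2e/yr ≈ {gallons:.1f} gal ≈ "
      f"{gallons / avg_car_gallons_per_year:.1%} of average annual car fuel use")
# ~48.6 kg ≈ 5.5 gal ≈ 1.2%
```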
Hackintoshes make for good troubleshooting devices when working with Mac formatted drives and images.
This is what I use them for exactly. The HFSPlus driver is three years old now, and seems to not get updated anymore. Does anyone know if it is the same one as in Linux kernel? https://github.com/acidanthera/OcBinaryData/blob/master/Driv...
https://archive.is/tuAZc
The only reaction I can think of to this massive non-sequitur of an article is "WTF?"
If you really care about forced obsolescence, refuse to upgrade and push back strongly whenever you can against the wildly wasteful web and its effective browser monopoly. Write and prefer native code. Add drivers for hardware that's "no longer supported". What does this have to do with Hackintoshing? No idea.
> push back strongly whenever you can against the wildly wasteful web and its effective browser monopoly
What does that mean (like what should one do to actually follow what you're saying?) I hate how the web is progressing, but there's nothing I can do to stop it. I use the free-est browser I can use while still being able to access most stuff (Firefox), I use old computers (my newest machine is from 2018 which I just bought second-hand a few months ago), and I use open source OSes (OpenBSD, FreeBSD, Linux).
Anyway, that person has already directly "refused to upgrade" by using existing hardware, so I'm not sure what you're actually criticizing. Hackintoshing happens to be the way they reached this goal, and they made a blog post about it. Nothing wrong with that.
There's a flavor of virtue signaling in saving the environment by using really old hardware.
Yes, but associating that with Hackintoshing is the weirdest part.
Indeed, it makes no sense because some old hardware that might be ok with Linux or even Windows is probably not ok with macOS. This article even tells you to go buy an RX580, and that's not the half of it.
And since you mentioned wasteful web, medium.com is a prime example of that.
Seems AI-written. Doesn't give any justification for MacOS being better for older hardware than Linux or even Windows. You won't be getting updates for your Intel hardware on MacOS, so you're right back where you started with Windows 10 going EOL.
Too many typos to be AI-written. I think the point was that if you are able to use the old hardware longer, it is better for the environment - not the effectiveness of old hardware in general (Linux always wins this argument).
This is not AI-written. This is what used to be called Asperger's (now part of autism spectrum disorder). You can see it in how they write:
> In 2025 the question is: why Hackintosh? And after being in the different communities for so long and observing reasons people install Hackintosh they are:
the way they construct sentences is so distinctly recognizable: [And after [being in [the different communities] for so long] and observing [reasons people [install Hackintosh]] they are:]
It's like they use Scratch blocks to assemble their language. It's hard to explain, but you know it when you see it.