There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.
I can’t believe there’s no Linux reference yet!
Give your “8 gigs not enough” hardware to one of us and see it revived running faster than whatever you’re running now with your subpar OS.
Software and AI development would be hard with 8 GB of RAM on Linux, too. Have you seen the memes about AI adding to global climate change? Not even Linux can fix the issues with ChatGPT…
I don’t think anyone anywhere is claiming 8GB RAM is enough for software and AI development. Pretty sure we’re talking about consumer-grade hardware here. And low-end at that.
My main development machine has 8 GB, for what it’s worth. And most of the software in use nowadays was developed when 8 GB was a lot of RAM.
This. My Mac has 16GB but I use half of it with a Linux virtual machine, since I use my Mac to write Linux (server) software.
I don’t need to do that; I could totally run that software directly on my Mac, but I like having a dev environment where I can just delete it all and start over without affecting my main OS. I could totally work effectively with 8GB. I could also give the Linux VM less memory, since all my production servers have way less than that, but I don’t need to, because 8GB for the host is more than enough.
Obviously it depends what software you’re running, but editing text, compiling code, and browsing the web… it doesn’t use that much. And the AI code completion system I use needs terabytes of RAM. Hard to believe Apple’s one that runs locally will be anywhere near as good.
The lede by OP here contains this: “There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.”
So either RecluseRamble meant that development with a feature like predictive code completion would work on 8 GB of RAM if you were using Linux, or his comparison was shit.
That’s absolutely what I’m saying. Apple is just holding back that feature for upselling (as always) and because it’s hardly possible to debloat macOS.
Okay good, thanks for confirming. I remember Kate feeling very nice to use during my studies, more responsive than VS Code or Eclipse. But I also had 16 GB of RAM, so I couldn’t be sure.
I do. GCC doesn’t need much. Vim/emacs work fine with 128 MB of RAM. With 1 GB you can run KDE and QtCreator instead of vim.
Macbook Pros aren’t really consumer grade hardware. Nor are they priced like consumer grade hardware.
Apple products in general aren’t.
That’s not true at all. The MacBook Air starts at $900, and you can find a used M1 Air for even less. That was absolutely a steal compared to the budget thin laptops from Asus, Acer, etc., which start around $700. Once you go below $700 in the laptop market, corners get cut: maybe a MediaTek WiFi chip, a thicker chassis, an awful touchpad, or worse screen colors. Apple usually puts an iPad + keyboard in that market segment instead.
Tl;dr: Apple products are more expensive than budget electronics but priced comparably to the products they actually compete with. However, electronics prices in the high-end tier are getting higher.
So it’s more expensive than the competitors, which also have real budget options at easily half the price, but then “corners are cut”.
You know, I won’t even argue about the quality of Apple products - they are top tier. But calling the pricing “a steal” is just dishonest.
They have consistently averaged 150-200% of the price of comparable hardware, at least since the 90s. While there may be examples like yours where the gap is smaller, there are plenty of outrageous examples like the infamous monitor stand or some ridiculously priced chargers.
You are right about the accessories, horribly overpriced.
I used to fix laptops for a living. I worked at a place where we had used Apple products and stuff from other brands. Sure, you could buy a core i5 Toshiba laptop that had a similar Intel CPU (though Apple tended to use Intel chips with slightly more GPU performance) at a fraction of the price. The screen was garbage, the WiFi stalled, the touchpad was unusable, using the keyboard made the chassis flex, etc. The comparable products from Lenovo, Samsung, HP were similarly priced.
You can find some laptops with decent Intel or AMD chips for $600 these days. Usually they will be plastic or bricks, which is fine if you don’t mind that. People want thinner products, and that calls for either (1) a better design to handle the heat or (2) a better-binned CPU that operates well at lower frequencies.
Not only that but people were willing to buy the used Macbooks. Much better than the other brands where the plastic and PCBs were sent for recycling MUCH more often. Better for the environment.
They are even below that.
I actually bought an M1 Mini for a low-power Linux server. I was getting tired of the Pi 4 being so slow when I needed to compile something. Works real well, just need the Asahi team to get Thunderbolt working. And for my server stuff, 8 GB is plenty.
You wouldn’t happen to run a jellyfin server on that mac mini would you? Currently looking to find something performant with small form factor and low power consumption.
I’ve run Plex servers on Mac Minis (M1). Docker on MacOS runs well finally — the issues that were everywhere a couple of years ago are resolved.
It ran very well on the hardware. The OP of this post is right, 8gb is not enough in 2024; however I would also wager that the vast majority of commenters have not used MacOS recently or regularly. It is actually very performant and has a memory scheduler that rivals that found on GNU/Linux. Apple’s users aren’t wrong when they talk about how much better the OS is than Windows at using memory.
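For anyone wanting to try the same thing with jellyfin, here is roughly what that kind of setup looks like, sketched as a small Python script that shells out to Docker. This is only a sketch under assumptions: the jellyfin/jellyfin image and port 8096 are its published defaults, but the host paths are made-up placeholders and Docker (Desktop or an equivalent) needs to be installed and running already.

```python
# Rough sketch: run a Jellyfin media server in Docker on a Mac mini.
# Host paths are hypothetical placeholders; Docker must already be running.
import subprocess

subprocess.run(
    [
        "docker", "run", "-d",
        "--name", "jellyfin",
        "-p", "8096:8096",                          # Jellyfin's default web UI port
        "-v", "/Users/me/jellyfin/config:/config",  # persistent config (placeholder path)
        "-v", "/Users/me/Movies:/media:ro",         # media library, read-only (placeholder path)
        "--restart", "unless-stopped",
        "jellyfin/jellyfin",                        # official image on Docker Hub
    ],
    check=True,
)
```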
Not sure about jellyfin, but I assume it uses ffmpeg? The M1 is fast enough that ffmpeg can re-encode raw video footage from a high-end camera (talking file sizes in the tens of gigabytes) an order of magnitude faster than realtime.
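For anyone curious what that kind of re-encode looks like, here is a minimal sketch (wrapped in Python). The commenter doesn’t say which codec or settings they used; this assumes Apple’s hevc_videotoolbox hardware encoder, and the file names are placeholders.

```python
# Minimal sketch: re-encode a large camera file on an M1 Mac with ffmpeg,
# using Apple's VideoToolbox hardware HEVC encoder. File names are
# placeholders; assumes ffmpeg is installed (e.g. via Homebrew).
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "raw_footage.mov",      # hypothetical multi-gigabyte source file
        "-c:v", "hevc_videotoolbox",  # hardware HEVC encoder on Apple silicon
        "-b:v", "20M",                # target bitrate (the quality knob for this encoder)
        "-c:a", "copy",               # pass the audio stream through untouched
        "output.mp4",
    ],
    check=True,
)
```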
That would be about 20W. Apparently it uses 5W while idle — which is low compared to an Intel CPU but actually surprisingly high.
Power consumption on my M1 laptop averages about 2.5 watts during active use, based on the battery size and how long it lasts on a charge, and that includes the screen. Apple hasn’t optimised the Mac Mini for energy efficiency (though it is naturally pretty efficient).
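A rough back-of-envelope for how that 2.5 W figure falls out. The commenter didn’t give exact numbers, so the battery capacity below is the M1 MacBook Air’s published 49.9 Wh and the 20-hour runtime is an assumed value for illustration.

```python
# Back-of-envelope average power draw from battery capacity and runtime.
# 49.9 Wh is the M1 MacBook Air's battery; the 20-hour runtime is assumed.
battery_capacity_wh = 49.9   # watt-hours
runtime_hours = 20.0         # hours on a full charge (assumed)

average_draw_watts = battery_capacity_wh / runtime_hours
print(f"Average draw: {average_draw_watts:.1f} W")  # about 2.5 W
```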
TLDR: if you really want the most energy-efficient Mac, get a secondhand M1 MacBook Air. Or even better, consider an iPhone with Linux in a virtual machine - https://getutm.app/ - though I’m not sure how optimised ffmpeg will be in that environment… the processor is certainly capable of encoding video quickly; it’s a camera, after all, so it has to be able to encode video well.
Depending on codec and settings, this might be super fast or super slow.
No I do not, but I don’t see any reason it shouldn’t work though. I have PiHole, Apache, email, cups, mythtv and samba currently.
I can run Arch Linux (BTW!) in a potato with starch RAM!
You can run Windows on 200 MB of RAM
Honestly I have no qualms with MacOS. Probably the best OS. Problem is you can’t run it on anything that is repairable or upgradeable, and in 7 years it won’t be supported any longer. If they would just sell me a $500 lifetime license for MacOS that I could install on a Framework laptop, I’d buy it in a heartbeat. But they know they make way more money by not making that option available.
I’d love to see you run Xcode 16 code completion on your superior OS. Send me a link once you’ve uploaded the vid.
Why limit it to proprietary software? Almost every Linux distro can run GitHub Copilot X and JetBrains tools, both of which have had more time to be publicly used and tested, and which work better in my opinion.
Send me a video link of Mac having direct access to containers without using a VM (which ruins the point of containers). THAT is directly related to my actual work, as opposed to needing a robot to code for me specifically using Apple’s AI
Because that was what the article was about… I actually am a Linux user and fan; folks are just misreading the intentions of my post.
I would genuinely love to see it, because I’m stuck on Mac hardware to do my job, and I really hope one day they get crucified for their anticompetitive practices so I can freely choose the OS my business uses.
Pls provide source code.
There is a project being worked on called Darling, but it isn’t ready yet. The developers are making progress though.
Removed by mod
Removed by mod
As I said: feel free to upgrade your MacBook, just don’t throw the one with a “meager” 8 gigs away, since it’s totally usable with a non-bloated system.
Removed by mod
You replied, I replied back. That’s how public social media work. It’s unlikely we know each other.
Removed by mod
Removed by mod
Do you actually want to run an application that doesn’t exist on Linux?
I use a virtual machine for the 2 or 3 times a year I need to use a couple of garbage Windows-only programs. Usually for configuring some arcane piece of proprietary hardware that people were getting rid of because it’s incompatible with everything.
Removed by mod
“Disrespectful” would be telling you that you in particular should continue to use windows or mac, and avoid Linux like the plague.
Removed by mod
If I wanted your clothes, I wouldn’t have left them at goodwill.