Have you seen what it outputs? Just because we can compile C to Brainfuck doesn’t mean the result is good or useful.
You can’t compile C to Java bytecode; they are fundamentally incompatible. But you can compile C to WASM, which is what you want from a good universal bytecode. Java is shit.
For example, when watching a 1080p YouTube video in Safari the power consumption is only 0.1 W because it’s using the hardware decoders (not including the display backlight, which I can’t measure). But when I play the same video in Firefox, which uses software decoding, the consumption is around 0.7 W. Not as good as the hardware decoders, but still less than a watt.
No, it’s just an easy sustained load that can be measured accurately. If you have some other application that provides a sustained load but doesn’t spin all the cores to 100%, please suggest it and I will try it.
I did some actual measurements just to confirm it: here is Minecraft in the default configuration running at 100 fps, and the CPU+GPU consumption is around 6 W in total. Add about 5 W for the display backlight and other components and you get roughly 11 W, which works out to 9-10 hours of play time on my 100 Wh battery.
Can you please take the same measurements on your system? Maybe a Ryzen system is better than Intel; I’ve never had one.
I did some actual measurements just to confirm it: here is Minecraft in the default configuration running at 100 fps, and the CPU+GPU consumption is around 6 W in total. Add about 5 W for the display backlight and other components and you get roughly 11 W, which works out to 9-10 hours of play time on my 100 Wh battery.
Can you please take the same measurements on your system? I’d like to see how good the alternative is.
Just don’t buy an 8 GB model, easy fix :) But seriously, once you get a laptop that lets you work 8 hours straight on battery and still has 30% capacity left at the end of the day, there is no chance you’d go back to an Intel system you have to plug in every 2 hours.
You talk about high prices, but there is no actual competition. High-end systems like the Dell XPS cost the same as an M3. You do get some benefits like a touch screen or whatever, but you also get a shitty touchpad and 3 hours of battery life.
As for the software, I agree macOS is not the best, but as you may have noticed the topic is Fedora Linux, so you do have options now.
Current Apple systems are objectively superior. The display image quality is better than the competition’s, the touchpad hardware is better, the CPU is number one in the world in single-thread performance, and the battery life is unrivaled.
As for repairability, it only matters if the machine breaks, and that happens to a small percentage of owners. Most people will never need a repair. But you do use your device every day, so why give up the better user experience because of a small chance you might have to pay for repairs later, if ever? It doesn’t make sense.
The same argument applies to upgrades. If you think you’ll need an upgrade, just buy the bigger configuration from the start. It may be more expensive, but once again you get a better experience overall.
20 to 25°C in superior units
I can’t quite understand what your point is. Are you arguing that both the JVM and WASM are bad? I agree with that: they both have terrible performance, and in an ideal world we wouldn’t use either of them.
Are you arguing that JVM bytecode is better than WASM? That’s objectively not true. One example is a function pointer in C. To compile it to JVM bytecode you would need to turn it into a virtual call in some very roundabout way, whereas WASM supports function pointers natively (indices into a function table, called via call_indirect), which gives much more flexibility when compiling other languages.
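To make that concrete, here’s a minimal C sketch (the names add/mul/apply are made up for illustration). The comments describe how a WASM-targeting compiler can lower the indirect call, versus the kind of workaround a JVM-bytecode target would need; the exact lowering depends on the compiler.

    #include <stdio.h>

    /* Two functions with the same signature, so either can be the
       target of the function pointer below. */
    static int add(int a, int b) { return a + b; }
    static int mul(int a, int b) { return a * b; }

    /* Calls through a plain C function pointer.
       - WASM: the pointer becomes an index into the module's function
         table and the call lowers to a single call_indirect instruction.
       - JVM bytecode: there are no raw function pointers, so a compiler
         has to fake them, e.g. number every function and dispatch through
         a big switch, or wrap each one in an object and call it via
         invokevirtual/invokeinterface. */
    static int apply(int (*op)(int, int), int a, int b) {
        return op(a, b);
    }

    int main(void) {
        printf("%d\n", apply(add, 2, 3)); /* prints 5 */
        printf("%d\n", apply(mul, 2, 3)); /* prints 6 */
        return 0;
    }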