• 0 Posts
  • 16 Comments
Joined 5 months ago
Cake day: February 17th, 2025





  • There was this recent attack on XZ Utils, which shows that more attention is needed on the code being merged and compiled.

    The XZ attack was made possible largely because of unaudited binary data: one part as test data in the repo, and the other part only in the pre-built releases. Bootstrapping everything from source would have required that those binaries have an auditable source, allowing public eyes to review the code and likely stopping the attack. Granted, reproducible builds almost certainly would have stopped it too, unless the malware wasn’t directly present in the code.

    Pulled from here:

    Every unauditable binary also leaves us vulnerable to compiler backdoors as described by Ken Thompson in the 1984 paper Reflections on Trusting Trust and beautifully explained by Carl Dong in his Bitcoin Build System Security talk.

    It is therefore equally important that we continue towards our final goal: A Full Source bootstrap; removing all unauditable binary seeds.

    Sure, you might have the code that was fed into GCC to create the binary; sure, that code can be absolutely safe; and you can even compile it yourself to check that you arrive at the same bit-for-bit binary as the official release (a quick sketch of such a check is below). But was GCC itself safe? Did some other compilation dependency infect the compiled binary? Bootstrapping from an auditable seed can answer that question.
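
    For what it’s worth, the “same bit-for-bit binary” check is easy to script. Here is a minimal sketch (the file paths are placeholders, not real release artifacts) that hashes a locally built binary and the official one and exits non-zero if they differ:

    ```python
    #!/usr/bin/env python3
    """Compare a locally built binary against an official release, bit for bit."""
    import hashlib
    import sys

    def sha256(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Usage: compare.py ./my-build/xz ./official-release/xz   (placeholder paths)
        local, official = sys.argv[1], sys.argv[2]
        a, b = sha256(local), sha256(official)
        print(f"{a}  {local}")
        print(f"{b}  {official}")
        sys.exit(0 if a == b else 1)
    ```

    Of course, a matching hash only tells you the binary came from that source; it says nothing about whether GCC and the rest of the toolchain were themselves trustworthy, which is exactly the gap the full-source bootstrap is meant to close.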


  • > The solution is to have stronger privacy laws.

    Many people have the power to make certain privacy attacks impossible right now. I consider making that change a better option for those people than adding a law that can’t stop the behavior and only adds a negative incentive.

    I wouldn’t wait around for the law to prosecute MITM attacks; I would use end-to-end encryption.

    > Choosing an esoteric system for yourself is a good way for a few people to protect their privacy, but it won’t scale.

    If this is referring to using a barely-used system as a privacy or security protection, then I would regard that as weak protection.

    Everyone using GrapheneOS would be a net security upgrade. All the protections it puts in place wouldn’t just fade away if Facebook decided to spy on that OS; they would still be there, and Facebook’s job would still be harder than it otherwise would be.



  • Yes. Memory allocated, but not written to, still counts toward your limit, unlike in overcommit modes 0 or 1.

    The default is to hope that applications don’t actually use enough of the memory they’ve been promised to force the system OOM. You get more efficient use of memory, but I don’t like this approach.

    And as a bonus, if you use overcommit mode 2, you get access to vm.admin_reserve_kbytes, which lets you reserve memory for admin users only. Quite nice. A quick way to see where your system stands is sketched below.
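
    A minimal sketch of poking at these knobs on a Linux box (the files under /proc are standard; the 64 GiB allocation size is just an arbitrary illustration value):

    ```python
    #!/usr/bin/env python3
    """Show the overcommit settings and commit accounting, then try a large,
    never-written allocation to see whether it is refused up front."""
    import mmap

    def read(path: str) -> str:
        with open(path) as f:
            return f.read().strip()

    print("overcommit_memory   :", read("/proc/sys/vm/overcommit_memory"))    # 0, 1 or 2
    print("admin_reserve_kbytes:", read("/proc/sys/vm/admin_reserve_kbytes"))

    # CommitLimit is only enforced in mode 2; Committed_AS is how much has
    # already been promised to all processes, written to or not.
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(("CommitLimit", "Committed_AS")):
                print(line.rstrip())

    # A large allocation that is never written to: in mode 2 it still counts
    # against CommitLimit and may be refused immediately; mode 1 always allows
    # it; mode 0 applies a heuristic.
    try:
        m = mmap.mmap(-1, 64 << 30)  # 64 GiB, never touched
        print("allocation succeeded")
        m.close()
    except OSError as e:
        print("allocation refused up front:", e)
    ```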





  • unhrpetby@sh.itjust.works to linuxmemes@lemmy.world · Stallman

    > If such a project were to become compromised (the way XZ-Utils was), it would eventually spread to Ventoy.

    What a lot of people don’t know is that the XZ attack relied entirely on binary blobs: partially in the repo as binary test files, and partially only in the GitHub release archive.

    If someone actually built it from source (the git repo rather than the release tarball), they weren’t vulnerable. So, contrary to what some claim, it wasn’t a vulnerability sitting in plain view that somehow passed volunteer review.

    This is why allowing binary data in open-source repos should be heavily frowned upon. (A simple check for such blobs is sketched below.)
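
    If a project wants to enforce that, the check is easy to automate in CI or a pre-commit hook. A minimal sketch; the NUL-byte-in-the-first-8-KiB test is roughly the heuristic git itself uses to classify a file as binary, and a real policy would need an allowlist for legitimate assets such as images:

    ```python
    #!/usr/bin/env python3
    """Flag files in a source tree that look like binary blobs."""
    import sys
    from pathlib import Path

    def looks_binary(path: Path, sniff: int = 8192) -> bool:
        """Treat a file as binary if its first `sniff` bytes contain a NUL byte."""
        with open(path, "rb") as f:
            return b"\x00" in f.read(sniff)

    def main(root: str) -> int:
        hits = [p for p in sorted(Path(root).rglob("*"))
                if p.is_file() and ".git" not in p.parts and looks_binary(p)]
        for p in hits:
            print(f"binary blob: {p}")
        return 1 if hits else 0

    if __name__ == "__main__":
        sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "."))
    ```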