• 1 Post
  • 34 Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • Only if those device makers are willing to use it. And that has always been the tightrope Linux has walked.

    Its very history as an x86 platform means it has needed to develop drivers where hardware providers did not care. So that code needed to run on closed hardware.

    It was bloody rare in the early days that any manufacturer cared to help. And still today it's rare to find hardware that needs no non-free firmware.

    Free hardware is something I’ll support. But it is Stallman et al.'s fight, not the Linux kernel developers'. They started out having to deal with patented hardware before anyone cared.


  • "proprietary"

    Well, "related to the owner" is the very definition of proprietary. And as far as upstream vs. not-available-for-upstream is concerned, that is what the term is used for in Linux.

    So yep, by its very definition, while a manufacturer is using a licence that other distributions cannot embed with their code, marking it proprietary is how the Linux kernel tree was designed to handle it.

    EDIT: The confusion sorta comes from the whole history of IBM and the PC.

    Huge amounts of PC hardware (and honestly all modern electronics) are protected by hardware patents. It's built into the very history of IBM's BIOS being reverse engineered in the 1980s.

    So while Linux has a huge hardware support base today, it was originally designed as an x86 (IBM PC) compatible version of Unix.

    As such, when Stallman created GPL 3, in part as a way of trying to end hardware patents, Linux was forced to remain on GPL 2, simply because it is unable to exist under GPL 3's freedom-orientated restrictions.

    The proprietary label is not meant as an insult, but simply an indication that the code is not in the control of the developers labelling it, as the module-licence sketch below illustrates.
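
    For a concrete sketch of what that labelling looks like in practice: loadable kernel modules declare their licence with the MODULE_LICENSE() macro, and any value the kernel does not recognise as a free licence taints the kernel and locks the module out of GPL-only symbols. The module below is a minimal illustrative example, not taken from any real driver.

    /*
     * Minimal illustrative out-of-tree module showing the licence label.
     * MODULE_LICENSE("GPL") keeps the kernel untainted; a value such as
     * "Proprietary" marks the kernel as tainted and blocks access to
     * GPL-only exported symbols.
     */
    #include <linux/init.h>
    #include <linux/kernel.h>
    #include <linux/module.h>

    static int __init demo_init(void)
    {
        pr_info("licence-label demo loaded\n");
        return 0;
    }

    static void __exit demo_exit(void)
    {
        pr_info("licence-label demo unloaded\n");
    }

    module_init(demo_init);
    module_exit(demo_exit);

    MODULE_LICENSE("GPL");  /* swap for "Proprietary" to see the taint flag set */
    MODULE_DESCRIPTION("Licence label demo");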




  • Just off the top of my head, something I discovered today.

    Not a GUI, as one exists. But a more configurable one, as the existing one is crap for the visually impaired.

    The rpi-imager GUI does not respect theme settings for font size etc. Worse, it has no configuration to change such things.

    Making it pretty much unusable for anyone with poor vision.

    Also, it varies for each visually impaired individual. But dark mode is essential for some of us.

    So if you're looking for small projects, you'd at least make me happy ;)



  • Yep, pretty much, but on a larger scale.

    1st, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it became an issue. So other than at a few companies, few saw the result, not because the problem did not exist, but because we were warned. People make jokes about the over-panic. But if that panic had not happened, it would have taken years to fix, not days, because without it most corporations would have ignored the problem. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm their systems were compliant. And so much dependent crap was found still running that it was insane.

    But the exaggerations about planes falling out of the sky etc. were also bull. Most systems would have failed, but a BSOD would have been rare; code would crash, some systems would error out and shut down cleanly, and some failures would go undiscovered until a short while later, as accounting or other errors showed up.

    As others have said, the issue was that since the 1960s computers had been set up to treat years as 2 digits, so they had no way to handle 2000 other than to assume it was 1900 (see the sketch after this comment). While from the early 90s most new systems were built with ways to adapt, not all were, as many teams were only developing top-layer stuff, and many libraries etc. had never been checked for this issue. Huge amounts of the world's IT infrastructure ran on legacy systems, especially in the financial sector, where I worked at the time.

    The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. Folks like me were forced to hunt through code, or often replace systems, that were badly documented or more often not documented at all.

    A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
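
    A tiny illustrative C program (invented for this comment, not taken from any real system) showing where the bug came from: struct tm stores the year as "years since 1900", and a lot of old code either printed a hard-coded "19" prefix or did arithmetic on the last two digits.

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        /* Correct: tm_year is years since 1900, so add 1900. */
        printf("Correct year:   %d\n", t->tm_year + 1900);

        /* Classic Y2K bugs: code written when tm_year was always < 100. */
        printf("Broken year:    19%d\n", t->tm_year);        /* prints "19125" in 2025 */
        printf("Broken 2-digit: %02d\n", t->tm_year % 100);  /* 2000 becomes "00", read back as 1900 */

        /* Age arithmetic on two-digit years goes negative across the boundary. */
        int birth_yy = 85;                       /* someone born in 1985 */
        int this_yy  = (t->tm_year + 1900) % 100;
        printf("Naive age:      %d\n", this_yy - birth_yy);  /* e.g. 25 - 85 = -60 */

        return 0;
    }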







  • HumanPenguin@feddit.uk to Open Source@lemmy.ml · "don't use ladybird browser lol" · 5 months ago

    Agreed. Most of us really do not think about this shit as often as we should. I know I am guilty of assuming "he" when typing. I know because I make an effort not to, and notice how often I need to correct my text. Being older than many developers, I just grew up with the assumptions. So like many my age, I needed my attention drawn to the societal indoctrination.

    People politely pointing it out is important. As is people volunteering to help correct older documentation.


  • The direct numbers of Moore's law may not hold forever.

    But the principle it describes does: in the future, computers will have much more power than they do now.

    The reason modern GPUs use things like shaders etc. is to let them achieve massive manipulation of data in ways that are more efficient for the specific task desired.

    Honestly, this is why I mention time scale as the main thing that will make this possible. How modern GPUs or other specialised processors do the task is less important than what the game code is asking the GPU to achieve.

    The idea that, at some unknown future date, the CPUs, GPUs or whatever future tech we have will never be able to run fast enough to read current CPU or GPU instruction sets and generate the defined effect using future techniques is not viable as an argument. The only questions are how long it will take, and whether anyone will have the motivation to reverse engineer the large but finite instruction sets used by secretive hardware corps today.


  • Not so sure about that, when you consider time spans.

    Currently we can emulate the majority of early games consoles. So theoretically, with time and Moore's law, any hardware will be emulatable in a few decades, given enough information.

    The advantage of open source software is that it can be used alongside the original binaries to reverse engineer the instruction set, even if the original manufacturer wishes to hide it. So with will and effort, even the most complex hardware will be able to be emulated on future, much faster hardware (a toy interpreter sketch follows below).
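
    To make the "read the instruction set and generate the effect" idea concrete, here is a toy fetch-decode-execute loop. The opcodes are invented for this example and do not correspond to any real console; a real emulator is the same loop with a much larger instruction table plus emulated memory, timing and I/O.

    #include <stdio.h>
    #include <stdint.h>

    /* Invented opcodes for illustration only. */
    enum { OP_LOAD = 0x01, OP_ADD = 0x02, OP_PRINT = 0x03, OP_HALT = 0xFF };

    int main(void)
    {
        /* "ROM": load 2 into r0, add 3, print r0, halt. */
        uint8_t rom[] = { OP_LOAD, 2, OP_ADD, 3, OP_PRINT, OP_HALT };
        uint8_t r0 = 0;
        size_t pc = 0;

        for (;;) {
            uint8_t op = rom[pc++];                      /* fetch */
            switch (op) {                                /* decode */
            case OP_LOAD:  r0 = rom[pc++];  break;       /* execute */
            case OP_ADD:   r0 += rom[pc++]; break;
            case OP_PRINT: printf("r0 = %u\n", r0); break;
            case OP_HALT:  return 0;
            default:
                fprintf(stderr, "unknown opcode 0x%02X\n", op);
                return 1;
            }
        }
    }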


  • Blasphemy! Quick, stone the unbelievers.

    Kidding, of course. Have to admit I agree. I’ve used Linux since the late 1990s, so long, long before it was usable by most folks' standards.

    I started because my university had HP-UX machines that we needed to submit work on, so I wanted a Unix-like environment at home I could work on. This was a time when Linux was basically Slackware on 50-plus floppy disks, and X Windows needed configuring for every monitor. Honestly, by current standards, usability was non-existent compared to Windows.

    But honestly, I spent so much time on the system and watched it improve, to the point that I find Windows an utter pain in the arse now and will avoid it under all circumstances.

    But the idea of convincing folks who have no interest? Where the hell do folks find the time?





  • Not OP, but curious on the subject. I use Debian Bookworm with an older Nvidia 1050.

    I currently tend to use GNOME, as I have multi-resolution monitors, mainly due to vision issues: 2x 32-inch 2K, 1x 28-inch 4K, and a 24-inch 1K.

    Does any desktop allow stable fractional scaling for each monitor independently? It's been a good few years since I looked into it, but in the past it was unstable.