• 0 Posts
  • 36 Comments
Joined 6 months ago
Cake day: May 20th, 2024

  • To install Steam on most distros with popular DEs, you click the software store icon to open the software store. If Steam isn’t listed on the front page, just click the search box and start typing Steam.

    When you see it, click the install button.

    When it’s done, open it by clicking the Open button, or press the Windows (or Super) key and type Steam. Click it when you see it.


  • Compiling from GitHub is cherry-picking the worst case, especially for “most normal people”, and frankly they should be using the software store GUI in their DE to install and update software, with nice easy buttons to click.

    Frankly, for software made to run on Linux, software management is generally easier for a normal person on Linux than it is on Windows.

    But don’t worry, someone will respond with Nvidia’s shitty proprietary drivers.


  • People forget XP was pretty bad at first, just like Windows 98, and like Windows 98, people became less critical after a bunch of major fixes. For Windows 98 that was Windows 98 SE, and for XP it was SP2 (and eventually SP3).

    Both Vista and 7 had problems before they were fixed after a while. The most common issue I can remember was UAC, and everyone just told you to turn it off to install and use their software and games. There were also a bunch of breaking Win API changes, and a lot of software made for XP just didn’t work anymore on Vista+.

    People mainly just remember them after they were fixed, except for Vista, because 7 came out fairly quickly (just 2 years later). Microsoft does not have a good track record for initial Windows releases, but eventually everyone forgets, and even some of the bad ones are remembered as the good ones.


    On an enterprise-imaged Windows laptop, they and you probably wouldn’t have administrator privileges, precisely to keep you from doing stuff like deleting core Windows dependencies. Maybe they give you full administrative access at your company, but if you deleted the Program Files folder to save time you’d be blamed by pretty much everyone.

    You guys obviously have root privileges, or else you wouldn’t have been able to delete the system’s core Python 2 installation. And frankly, you must have deleted it manually, because the package manager would have told you what havoc you were about to wreak and made you tell it to do it anyway.

    But what’s even weirder to me is that most Python devs I know, myself included, use Python virtual environments (venv) to juggle different versions and keep package bloat from something like pip contained, all nice and neat (there’s a minimal sketch at the end of this comment).

    If you wanted python3 to be the default on Windows, you’d have to change the PATH, or, if you don’t know what you are doing, I guess reinstall whichever Python with a .MSI and hope it does it for you.

    Meanwhile, in Linux you can just use the alternatives utility to literally pick your preferred versions and it takes care of the paths for you.

    And the HDMI issue? You must not be using the same graphics drivers: one of you is using the proprietary drivers (and won’t have the issues you’ve described) and the other is using the open source versions (and will), because companies are shitty about their proprietary closed standards.

    Which brings up another point. You say you all use the same laptop model and OS but you don’t all use the same drivers? There’s no baseline? There’s no control?

    This sounds like a Hell of your own making. This is why users in general should never have full administrative privileges, and why privileges should be tailored down to just what you need. Especially if they haven’t yet learned the basics of the OS they are using, because they are at best a danger to themselves and at worst a vulnerable laptop inside the network.
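
    Here’s a minimal sketch of the venv workflow I mean, using only the standard library. The environment name “proj-env” and the requests install are just examples, and on Windows the environment’s interpreter would live under proj-env\Scripts instead of proj-env/bin.

    ```python
    # Create an isolated environment with the stdlib venv module instead of
    # touching the system Python at all.
    import subprocess
    import sys
    import venv

    venv.create("proj-env", with_pip=True)  # builds ./proj-env with its own pip

    # Packages installed through the environment's interpreter stay inside it,
    # so the system site-packages (and the distro's tooling) are never touched.
    subprocess.run(
        ["proj-env/bin/python", "-m", "pip", "install", "requests"],
        check=True,
    )

    # The interpreter running this script is still whatever the system provides.
    print(sys.executable)
    ```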


  • Just do what I do. Don’t join meetings most of the time. That way, when you do, it’s noteworthy to the meeting stakeholders.

    Yeah, sure, my manglers through the years try to have ‘the talk’, but after a while of training them via sheer apathy they shut the fuck up.

    I solve complex problems, get my tasks done, I’m independent, and I stay busy because otherwise I’ll get bored. Most meetings could just be an email. There’s no real collaboration, except managers or scrum masters asking what your blockers are and then not actually doing anything about them. If I think a meeting will be a waste of my time, I just don’t show up.



  • Yes, but has it taken both OSes out at the same time? It hasn’t, but it could happen; the chances are just even lower. There’s obvious risk mitigation in mixing vendors, for both hardware and software, in enterprise infrastructure.

    If some critical services were lost in your enterprise last time, until RH updated their kernel, then you could have benefited from running those services on Windows as well, and now the reverse is true. You could have another DC via Samba on Linux in your forest if you wanted to, so you’d still have an AD, for example. Same goes for file share servers, intermediate certificate servers (hopefully your root CA is not always on the network) and pretty much most critical services.

    Most enterprises run a lot of services off of a hypervisor and have overhead to scale (or they are already on a sinking ship), so you can just spin up VMs to do that. It isn’t as if it is unreasonably labor-intensive compared to other similar risk mitigation implementations. Any sane CCB (obviously there are edge cases, but we are talking in general here) will even let you get away without a vendor support contract for those, since they are just for emergency redundancy and nowhere near critical unless the critical services have already shit the bed.


  • I get the sentiment, but defense in depth is a methodology to live by in IT, and auto-updating over the Internet is not a good risk to take in general. For example, should Crowdstrike just disappear one day, your entire infrastructure shouldn’t be at enormous risk, nor should critical services. Even if it’s your anti-virus, a virus or ransomware shouldn’t be able to easily propagate through the enterprise; and if it did, it’s doubtful something like Crowdstrike is going to be able to update and suddenly reverse course. If it can, then you’re just lucky that the ransomware that made it through didn’t do anything in defense of itself (disconnecting from the network, blocking CIDRs like Crowdstrike’s update servers, blocking processes, whatever). And frankly, you can still update those clients from your own AV update server anyway, which is a product you’d be using if you aren’t allowing updates from the Internet, so you can roll them out in dev first, with phasing and/or schedules, from your own infrastructure.

    Crowdstrike is just another lesson in that.




  • I’m not against using Google, Stack Exchange, man pages, apropos, tldr, etc., but if you’re trying to advertise competence with a skillset while you can’t do the basics and frankly it’s still essentially a mystery to you, then you’re just being dishonest. Sure, use all the tools available to you, because that’s a good thing to do.

    Just because someone occasionally breathed air in the same space as a tool over the years does not mean they can honestly put those down as years of experience with it on a resume or whatever.


  • I’ve worked as an IT architect at various companies in my career, and you can definitely get support contracts for engineering support of RHEL, Ubuntu, SUSE, etc. That isn’t the issue. The issue is that there are a lot of system administrators with “15 years of experience in Linux” who have no real experience in Linux. They have experience googling for guides and tutorials and cobbling together documents for doing various things without understanding what they are really doing.

    I can’t tell you how many times I’ve seen an enterprise patch their Linux solutions manually (if they patched them at all, with some ridiculous rubber-stamped POA&M) instead of deploying a repo and updating that repo, treating it the way you would a WSUS. Hell, I’m pleasantly surprised if I see them joined to a Windows domain (a few times) or an LDAP (once, but they didn’t have a trust with the domain forest or use sudoer rules…sigh).


  • Nah, not for the big providers. The biggest problems are not having an RUA for DMARC set up at all, having the policy action set to none, or having an address in the RUA that bounces reports back to the sender (or not having DMARC in your DNS at all). The safe thing to do is set up SPF, DKIM and DMARC (correctly).

    You can’t always control landing in a spam box from time to time if someone in your IP’s /24 makes it onto the popular spam databases, but that’s usually very temporary; it’s also very possible someone in your /24 is always on the lists. You can check yourself, and there are both scripts and sites that will check most of the popular ones for you (see the sketch at the end of this comment).

    A /24 is a very popular CIDR to use for stuff like spam filtering or internet-facing IPS.
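
    If you want a quick self-check, here’s a minimal sketch in Python using only the standard library. The blocklist zones and the sample IP (a documentation address) are just examples, and some lists only answer queries coming from your own resolver rather than a public one.

    ```python
    import socket

    def is_listed(ip: str, dnsbl: str) -> bool:
        """DNSBLs are queried by reversing the IPv4 octets and prepending them to the zone."""
        reversed_ip = ".".join(reversed(ip.split(".")))
        try:
            socket.gethostbyname(f"{reversed_ip}.{dnsbl}")
            return True   # any A record back means the IP is on that list
        except socket.gaierror:
            return False  # NXDOMAIN means it is not listed there

    for zone in ("zen.spamhaus.org", "bl.spamcop.net"):
        print(zone, is_listed("203.0.113.10", zone))
    ```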



  • Needing to use command line for some things that should be a right click, not supporting right click, ambiguities galore when looking at a package repository, odd defaults in packages that one really wouldn’t expect to have to check (e.g. Selecting RDP connection in a Remote app, but it defaults the security to something other than RDP?)

    Sounds like you’re using a GNOME desktop. You should give KDE Plasma a try instead. KDE Plasma basically gives you a Windows-esque experience without having to install something like GNOME extensions.

    For a regular user there’s not much point in going into the command line anymore.

    there’s problems like Libre Office devs …

    Sure, but there are also alternatives. LibreOffice doesn’t try to emulate Microsoft Office and never really has. It doesn’t even try to be compatible with MS Office itself, but rather with OOXML, which Microsoft created so other office suites could be compatible with it, and then just never supported very well. Some alternatives do, however. WPS Office is perhaps the most popular alternative that tries to be compatible with MS Office and emulate its feel and features, but ONLYOFFICE is also a contender.


  • Just a heads up: if you use an AMD GPU, the drivers are built into the Linux kernel itself by AMD engineers (and others helping, supporting and contributing to the kernel alongside them). So you don’t even have drivers to install, unless you’re one of the 10 people who want to use an AMD GPU for machine learning; then you’d do a quick install of AMDGPU-PRO (that stack is proprietary, which is why it isn’t built into the kernel).
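
    If you’re curious which in-kernel driver your card is actually bound to, a quick sketch like this will tell you (it just reads the sysfs layout Linux exposes; the paths assume a typical /sys/class/drm setup):

    ```python
    import glob
    import os

    # Each GPU shows up as /sys/class/drm/cardN, and its "driver" symlink points
    # at the kernel module bound to it (e.g. amdgpu, i915, nouveau).
    for dev in sorted(glob.glob("/sys/class/drm/card?/device/driver")):
        card = dev.split("/")[4]                           # e.g. "card0"
        driver = os.path.basename(os.path.realpath(dev))   # e.g. "amdgpu"
        print(f"{card}: {driver}")
    ```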


  • To be fair, I find that people with a Computer Science degree are pretty much just like most other users, except that they need more privileged access somewhere because they are usually software developers or somewhere in that orbit. A Computer Science degree does not prepare someone to be a sysadmin. That doesn’t mean they can’t be an excellent one, but it certainly isn’t because of their degree path.




  • It’s also difficult for developers to publish to Linux because of the wide variety of different Linux systems.

    I disagree there. The difference is that on Windows people bring along their own versions of the libraries they compiled against (the millions of .dll files); you can even look in your Uninstall Apps settings, where there are a bunch of MS-specific runtime bundles, to see that this is an issue in the MS ecosystem too.

    In Linux, developers have relied on the library versions just being there. It is, I’d argue, the most compelling reason package managers basically had to come into existence. On the flip side, this can cause issues where the package manager puts some version on the system that replaces another version. And something that isn’t part of that package management system isn’t part of those dependency checks, and if it doesn’t ship the libraries alongside its binaries… well, it’s just luck whether you have them all, or whether the versions you do have still support those library calls the same way (there’s a small sketch of that lookup model at the end of this comment).

    In Linux that is all those .so’s in /usr/lib and stuff.

    You don’t really see many proprietary things using package managers, and those that do are usually packaged by someone else and live in some repo that isn’t part of the vanilla install, because of legal caution.

    Companies that made their money porting games to Linux (before Proton basically caused them to shutter their Linux porting) would ship their .so’s in the game bundle themselves, just like you see on Windows when .dll’s sit inside the actual program’s folder.

    However, the more that this sort of dependency management has become abstracted by development suites that take care of this for the developers, the less they understand about it.

    Flatpaks actually take care of this, and it’s one reason they are so popular. They resolve (well, that’s a simplification) those library dependencies and sandbox the apps with them, so the library paths don’t interfere with other Flatpaks or with the base system itself. People complain about this as a con because “the download is BIGGER”, even though Flatpak doesn’t install the same runtimes over and over again; once they’re there, the download may still be bigger, but the installed storage isn’t.

    Anyway, yes, Linus Torvalds complained about the “Linux fragmentation” issue, but as I recall it was about DEs, not the state of the development ecosystem itself, though the rant is very old, so maybe I don’t remember all of it.

    Wider application support would be a start.

    Sure, but that’s not a Linux problem, that’s a developer problem. Linux supports application development just fine; it is a kernel, and the surrounding ecosystem is the operating system, after all. It’s developers that don’t support it, and that isn’t really something Linux in and of itself can effectively solve. Users have to increase, and then developers supporting applications for Linux will increase too. It’s the classic Linux chicken-and-egg problem, but that’s capitalism and that’s just going to be how it has to work.
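
    Back to the library point above, here’s a small sketch of that “rely on it just being there” lookup model. The library names are just examples, and what (if anything) resolves depends entirely on what the distro has installed.

    ```python
    import ctypes
    import ctypes.util

    # The loader only finds whatever .so the system happens to provide, in
    # whatever version the distro shipped.
    for name in ("ssl", "sqlite3", "definitely-not-installed"):
        path = ctypes.util.find_library(name)
        if path is None:
            print(f"lib{name}: not found on this system")
        else:
            ctypes.CDLL(path)  # dlopen()s the soname the system resolved
            print(f"lib{name}: resolved to {path}")
    ```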