• 0 Posts
  • 17 Comments
Joined 8 months ago
Cake day: July 5th, 2024



  • I disagree that the growing pains are steadily going down, too. Maybe I’m just old, but to me the golden age of Linux “just working” is definitely not now.

    Well, you can disagree all you like, but it doesn’t line up with reality. Hell, I even have my parents on Linux and it’s working great for them. Since setting them up, the number of times I’ve had to connect to their computer or even drive over there has dropped to nearly zero.

    The thing that killed Linux getting bigger then was the lack of software support.

    You actually have that backwards, but that’s neither here nor there. The reality is that Linux is more compatible than ever. Most services people use are accessed online in a browser. Most Steam games work out of the box, excluding live-service and multiplayer games with prohibitive anticheat. Even most non-Steam games work without much fuss. Lutris makes many games a one-click installation.

    it’s complete chaos.

    Not really. It may seem that way, but it’s just the fast pace of tech. Frankly, Linux leads that progress because of its dominance in servers.

    let this be a note that the handling of third party FSs

    Not going to happen, unless you want to have a chat with Microsoft.

    external mounted drives in Linux should get much better

    There are no issues with external drives in Linux beyond the usual stuff every OS deals with.

    Steam should start giving non-SteamOS distros some love

    Steam has worked swimmingly on other distros well before SteamOS was ever a thing.


  • it’s to find the best FS to share across a dual boot setup

    It just doesn’t exist. NTFS is proprietary and really the only choice for a Windows setup, and for Linux NTFS just isn’t a good choice. The only reason people recommend it is that it’s the path of least friction, since support comes out of the box on both systems. I’ve tried installing and using an EXT4 driver in Windows, and it wasn’t a painless process; functionality was serviceable at best. But that was at least 5 years ago, so it might be better now.

    That is not a ridiculous use case, it is not an edge case. Another self-contradicting frequent recommendation among Linux cheerleading places is to start migrating with a dual boot setup and Steam absolutely supports importing pre-existing libraries from mounted external drives.

    True, but generally you would migrate your game data from a mounted NTFS drive to a FOSS filesystem.

    This is an exhausting conversation every time.

    Yes, the ever-changing landscape of tech is exhausting. Like Windows barfing all over itself during updates (I had to reinstall my wife’s desktop just a couple weeks ago because a Windows update completely destroyed itself), which has been a regular issue for many years now. And that doesn’t even touch on the myriad of other issues Windows consistently has, never mind the privacy issues with Recall and ads within the OS, or OneDrive uploading all of a user’s local documents without permission, deleting them from the local drive, and holding the data hostage when the OneDrive account doesn’t have enough purchased space for it all.

    People insist that Linux is finally ready (it is inevitable, someone said) to take over for Windows, but when people bring up legitimate technical issues the answer is consistently that oh, well, this works fine on AMD cards, and HDR isn’t that important anyway and who needs surround sound on a PC anyway

    There are always growing pains, but they shrink with each release. Remember that these growing pains are the result of hardware and software makers ignoring Linux, and that the fixes are mostly volunteer work. As the userbase grows, so will support for these things, and the pains will eventually go away.

    If your answer is that dual boot setups just aren’t viable, then great, but that means Linux itself is not viable for a whole host of use cases

    No, Linux is absolutely viable for most use cases. Dual boot setups are fine too, even with NTFS partitions mounted on Linux. What you’re doing is saying “my specific use case of storing games on an NTFS partition isn’t ideal, so it doesn’t work for anyone”.

    If you really insist on using it, there ARE ways to do it, but it’s an advanced setup. Take a look here (at your own risk):

    https://github.com/ValveSoftware/Proton/wiki/Using-a-NTFS-disk-with-Linux-and-Windows


  • every answer online is to “just use NTFS, it’s good now”.

    Like hell it is! MOST threads about using NTFS with Steam on Linux say do not use NTFS. I have no idea where you got that info from.

    I also tried moving one of my drives to ExFAT

    Just… why? Granted, exFAT is more open than NTFS (Microsoft published the exFAT specification in 2019), but it’s also very lacking compared to other filesystems. It’s really just meant as a filesystem for removable media: something just about every system is capable of reading and writing to, with the bare minimum of features all OSes can work with.

    If you really want to try this unholy union of a setup, maybe installing an open source filesystem onto Windows would work (very slightly) better.

    Now I’m in no way promising that this will work. Windows does a lot of crap at the I/O layer that complicates things, so you might still have issues. But I think it has a better chance than what you’ve already tried.



  • You’re trying to use an open source OS with a proprietary, closed-source filesystem. The reason it’s buggy is that the driver you’re using to access the NTFS partition is reverse engineered on a “best effort” basis. The driver isn’t complete (and never will be until Microsoft opens up the specification), and one particular sore point is running executables from an NTFS partition. Steam just does not handle it well, and that’s not Steam’s fault or their problem to fix, nor is it Linux’s. Frankly, it’s not even Microsoft’s fault, because they’re under no obligation to release their source code.

    It’s not my problem

    It’s 100% your problem.

    Things that don’t work shouldn’t work a little bit and then break other stuff permanently. That’s just not a tolerable behavior for mainstream software.

    You don’t use your cellphone as a hammer and complain that cellphones aren’t tough enough when the screen breaks. You don’t say “that’s just not tolerable behaviour from a mainstream consumer product”.

    The solution here is to separate your Steam library between games you play on Windows and Linux. Or simply to commit to just one OS for gaming. If you choose Windows for that, that’s perfectly fine. No one is going to give you a hard time over that. You use whatever works for you.

    But please understand that your whole argument here is that you created a setup that’s unstable (which is fine, I learned the hard way too), were told it’s unstable and why, then in the next breath complained that it’s not your fault, it’s everyone else’s.

    NTFS is a garbage filesystem in my opinion anyways.






  • But it could also be for legal reasons, like websites where you can post stuff for everybody to see, in case you post something highly illegal and the authorities need to find you. Another example is where a webshop is required to keep a copy of your data for their bookkeeping.

    None of these require your account to “exist”. There could simply be an acknowledgement stating those reasons with “after X days the data will be deleted, and xyz will be archived for legal reasons”.

    Mostly it’s 30-90 days where they keep your data, just in case somebody else decided to delete your account or you were drunk or something

    This is the only valid reason. But even then this could be stated so that the user is fully aware. Then an email one week and another one day before deletion as a reminder, and a final confirmation after the fact. I’ve used services before that do this. It’s done well and appreciated.

    This pseudo-deletion shadow account stuff is annoying.


  • What the user was doing: they don’t trust that the system truly deleted the account, and they worry it was just deactivated (while being claimed “deleted”). So they tried a password recovery, which often reactivates a falsely “deleted” account.

    I’ve done this before and had to message the company and have them confirm the account is entirely deleted.


  • The level of inaccuracy in a regular clock resulting in drift is orders of magnitude greater than any amount of time dilation you would experience.

    This is the reason we use extremely high-precision clocks (like atomic clocks) and synchronize everything else with them. Even your phone’s clock would drift noticeably over a period of a few months if it never synced with a network time server.

    NTP exists precisely for this. There are entire companies that specialize in providing and maintaining synchronized wall clocks for facilities like hospitals, schools, and other organizations.
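To put rough numbers on that drift, here’s a quick back-of-the-envelope sketch. The ~20 ppm figure is an assumed, typical quartz-oscillator tolerance, not a measured value:

```python
# Back-of-the-envelope drift estimate for an unsynchronized quartz clock.
# 20 ppm is an assumed, typical oscillator tolerance (not a measured value).
ppm = 20
seconds_per_day = 86_400
days = 90

# Worst-case accumulated drift over the period, in seconds.
drift_seconds = ppm * 1e-6 * seconds_per_day * days
print(f"~{drift_seconds:.0f} s (about {drift_seconds / 60:.1f} minutes) over {days} days")
```

With these assumptions that’s roughly 2.6 minutes of drift over three months, which is exactly why everything ultimately syncs back to a reference clock.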


  • Using Relational DBs where the data model is better suited to other sorts of DBs.

    This is true if most or all of your data fits that shape. But when you only have a few bits of such data here and there, it’s still better to stay with the RDB.

    For example, in a surveillance system (think Blue Iris, ZoneMinder, or Shinobi) you want an RDB, but you’re also going to have to store JSON data from alerts, as well as the other objects within the frame when alerts come in. Something like this:

    {
      "detection": {
        "object": "person",
        "time": "2024-07-29 11:12:50.123",
        "camera": "LemmyCam",
        "coords": {
          "x": "23",
          "y": "100",
          "w": "50",
          "h": "75"
        }
      },
      "other_objects": {
      }
    }
    

    While it’s possible to store this in a flat format in a table, the question is why you would want to. Postgres’ JSONB datatype will store the data as efficiently as anything else, while also making it queryable. This gives you the advantage of not having to rework the table structure if you need to expand the data points used by the detection software.
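As a small illustration of the “queryable blob” idea, here’s a sketch using SQLite’s built-in JSON functions as a stand-in for Postgres’ JSONB (the table and field names are made up for the example):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# One relational column for the key, one opaque-but-queryable JSON column.
conn.execute("CREATE TABLE alerts (id INTEGER PRIMARY KEY, payload TEXT)")

event = {
    "detection": {
        "object": "person",
        "camera": "LemmyCam",
        "coords": {"x": "23", "y": "100", "w": "50", "h": "75"},
    }
}
conn.execute("INSERT INTO alerts (payload) VALUES (?)", (json.dumps(event),))

# Query inside the JSON without ever flattening it into columns.
row = conn.execute(
    "SELECT json_extract(payload, '$.detection.camera') FROM alerts "
    "WHERE json_extract(payload, '$.detection.object') = 'person'"
).fetchone()
print(row[0])  # LemmyCam
```

In Postgres the same lookup would use JSONB operators (e.g. `payload -> 'detection' ->> 'camera'`), and a GIN index can make those lookups fast; either way the point is the same: the blob stays intact, but it’s still queryable.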

    It definitely isn’t a solution for most things, but it’s 100% valid to use.

    There’s also the consideration that you may want to store JSON data exactly as it’s generated by the source, without translating it in any way. Just store the actual data in its “raw” form. This allows you to do that as well.

    Edit: just to add to the example JSON, the other advantage is that it allows a variable number of objects within the array without having to accommodate them in the table. I can’t count how many times I’ve seen tables with “extra1, extra2, extra3, extra4, …” columns because the designers knew there would be extra data at some point but had no idea what it would be.


  • JSON data within a database is perfectly fine and has completely justified use cases. JSON is just a way to structure data. If it’s bespoke data or something that doesn’t need to be structured in a table, a JSON string can keep all that organized.

    We use it for intake questionnaire data. It’s something that needs to be on file for record purposes, but it doesn’t need to be queried aside from simply being loaded with the rest of the record.

    Edit: and just to add, even MS SQL/Azure SQL has the ability to both query and even index within a JSON object. Of course Postgres’ JSONB data type is far better suited for that.