• 0 Posts
  • 27 Comments
Joined 7 months ago
Cake day: August 14th, 2024



  • I use Actual, and my solution is to just record the change in my investments’ value at the end of each week as a transaction. It’s not great, but it gives me an opportunity to see trends in a different way and make adjustments feeling a little more informed. I even put my car in and just check KBB every year and update it. It helps with the year-end net worth evaluation, though it’s not the most flexible approach.





  • I’m a bit frustrated by my Steam Replay. Not only does it show Windows in my platform chart despite my complete abandonment of Windows at home for years at this point, but one of my most consistently and heavily played games (Fall Guys, hundreds of hours played on Linux in 2024) didn’t even show up in my top games list.

    Edit: looks like the Replay continues to update itself, so it just wasn’t done processing or something. Fall Guys now shows up in mine as expected with accurate numbers, a new game was added to my games-played count after I started one last night, and they are no longer showing me the platform breakdown lol.



  • What are the features you need from your host? If it’s just remote syncing, why not just make a small Debian system and install git on it? You can manage security on the box itself. Do you need the overhead of gitlab at all?

    I say this because I did try out hosting my own GitLab, Gitea, Gogs, etc., and I just found I never needed any of the features. The whole point was to have a single remote that can be backed up and redeployed easily in disaster situations; otherwise all my local work just needed simple tracking. I wrote a couple of scripts so my local machine can create new repos remotely, and I also set up SSH keys on the remote machine.

    I don’t have a complicated setup, maybe you do, not sure. But I didn’t need the integrated features and overhead for solo self hosting.

    For example, one of my local machine scripts just executes a couple of commands on the remote to create a new folder, cd into it, and then run git init --bare. Then I can just clone the new project folder on the local machine and get started.
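    A minimal sketch of that workflow, using a temp directory to stand in for the remote box (the paths, the $PROJECT name, and the ssh example in the comment are all hypothetical; on a real setup you would run the init step over ssh):

    ```shell
    #!/bin/sh
    # Sketch: create a bare repo "remotely", then clone it locally.
    # On a real remote you would run something like:
    #   ssh git@yourbox "mkdir -p repos/myproject.git && git -C repos/myproject.git init --bare"
    set -e

    REMOTE_ROOT="$(mktemp -d)"   # stands in for ~/repos on the remote
    WORKDIR="$(mktemp -d)"       # stands in for your local projects dir
    PROJECT="myproject"          # hypothetical project name

    # 1. Create the bare repository (this part belongs on the remote).
    mkdir -p "$REMOTE_ROOT/$PROJECT.git"
    git init --bare --quiet "$REMOTE_ROOT/$PROJECT.git"

    # 2. Clone it locally and get started (git warns that it's empty; that's fine).
    git clone --quiet "$REMOTE_ROOT/$PROJECT.git" "$WORKDIR/$PROJECT"
    ```

    The only moving parts are mkdir and git itself, which is the appeal: no web UI, no database, just a bare repo you can rsync somewhere for backup.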









  • For clarity, the recommendation is specifically 3 copies of your data, not 3 backups.

    3-2-1 backup: 3 copies of the data, 2 types of storage devices, 1 off-site storage location.

    So in a typical homelab case you would have your primary hot data on the device actually used to create and manage it: your desktop. You’d regularly back that data up to warm storage such as a NAS with redundancy (RAID-Z1, Z2, etc.), followed by regular but less frequent backups to a remote location, such as a duplicate NAS reached over a secure tunnel, or even external drive(s) sitting at a friend or family member’s house, a bank vault, wherever. That would be considered cold storage (and those backups should be automated as well if the device is constantly powered).
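    The hot → warm → cold flow above can be sketched with rsync, using temp directories to stand in for the real devices (in practice the warm target would be a NAS mount or an rsync-over-ssh destination, and the cold target a remote box; all names here are hypothetical):

    ```shell
    #!/bin/sh
    # Sketch of the 3-2-1 flow: primary (hot) -> NAS (warm) -> off-site (cold).
    set -e

    PRIMARY="$(mktemp -d)"   # hot: the desktop's data
    WARM="$(mktemp -d)"      # warm: stands in for the NAS
    COLD="$(mktemp -d)"      # cold: stands in for the off-site copy

    printf 'important data\n' > "$PRIMARY/notes.txt"

    # Nightly job: desktop -> NAS (trailing slashes copy contents, not the dir itself)
    rsync -a "$PRIMARY/" "$WARM/"

    # Weekly job: NAS -> off-site
    # (real setup: rsync -a "$WARM/" backup@offsite:/backups/desktop/)
    rsync -a "$WARM/" "$COLD/"
    ```

    Run from cron or systemd timers at different intervals, this gives the 3 copies on 2 device types with 1 off-site, per the rule above.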

    My own addition to this is that at least one of the hot / warm devices should be on battery backup in case of power events. I’d always advocate for that to be the primary machine, but in a homelab the server would be more important, and the NAS would be part of that stack.

    Cloud is not considered a backup unless the data owner is also the storage owner, for general reliability reasons related to control over the system and storage. Cloud is, however, a reasonable temporary storage for moves and transfers.


  • I self-host services as much as possible for multiple reasons: learning, staying up to date with so many technologies through hands-on experience, and security / peace of mind. Knowing my 3-2-1 backup solution covers my entire infrastructure makes me feel far less pressured to hand my data to unknown entities, no matter how trustworthy, and gives me the peace of mind of controlling every step of the process and knowing how to troubleshoot and fix problems. I’m not an expert and rely heavily on online resources to get me to a comfortable spot, but I also don’t feel helpless when something breaks.

    If the choice is between trusting an encrypted backup of all my sensitive passwords, passkeys, and recovery information to someone else’s server, or having to restore a machine, container, VM, etc. from a backup after a critical failure, I’ll choose the second, because no matter how well encrypted something is, someone somewhere will be able to break it with time. I don’t care if breaking it, even with quantum acceleration, would take millennia. Not having that payload out in the wild at all is the only way to prevent it being cracked.