• 1 Post
  • 47 Comments
Joined 1 year ago
Cake day: June 5th, 2023



  • Why do you need the files stored locally?
    Is your network that slow?

    I’ve heard of multiple content creators who keep their video files on their NAS to share between their editors, and they work directly from the NAS.
    Could you do the same? You’ll be working with music, so the network traffic will be lower than with video.

    If you do this you just need a way to mount the external directory, either with rclone or with sshfs.
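    For example, something like this is usually all it takes (host, share, and mount point are placeholders, adjust to your setup):

        sshfs user@nas:/volume1/music /mnt/nas-music

    or, if you already have an rclone remote named “nas” configured:

        rclone mount nas:music /mnt/nas-music --daemon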


    The disks on my NAS go to sleep after 10 minutes idle time and if possible I would prefer not waking them up all the time

    I think this is a good strategy to avoid putting additional stress on your drives (speaking as a non-expert on NAS), but I’ve read that most of the actual wear and tear on the drives happens during this process of spinning up and down. That’s why NAS drives are usually kept spinning all the time.
    And drives built specifically for NAS setups are designed with this in mind.





  • I had a similar case.
    My mini PC has a microSD card slot and I figured, if it can be done for an RPi, why not for a mini PC? :P

    After a few months I bought a new M.2 NVMe drive, but I didn’t want to start from scratch (maybe I should’ve looked into Nix?)
    So what I did was sudo dd if=/dev/sda of=/dev/sdc bs=1024k status=progress
    And that worked perfectly!

    Things to note:

    • Both drives need to be unmounted, so you need a live OS or another machine.
    • The new drive will have exactly the same partitions, which means the same sizes, so you need to expand them after the copy (see the sketch after this list).
    • PS: this was for a drive with ext4 partitions, but in theory dd works at the byte level, so the filesystem you use shouldn’t be an issue.
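    As a rough sketch of that last expansion step for an ext4 partition, it’s something along these lines (device and partition number are just examples; growpart comes from the cloud-guest-utils package):

        sudo growpart /dev/sdc 2    # grow partition 2 to fill the disk
        sudo resize2fs /dev/sdc2    # grow the ext4 filesystem to match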

  • I love playing Dwarf Fortress, I’ve spent hours and hours in there.
    The game is free from the developers’ site, and the download is only 15MB.
    If you want to support them you can buy it on Steam; the listed requirement there is 500MB of storage, I assume because that version has a tile set.

    I’ve also put 400 hours so far into Oxygen Not Included; I think it uses around 2GB of storage.

    And to me, any Monster Hunter game is worth its price. I’ve bought each game and played it for a minimum of 200 hours each; I think I reached 500 in one of them.
    Though the newer ones are pretty heavy for their respective platforms, and they come at a triple-A price.


  • I only had to run this on my home server, behind my router, which already has a firewall to prevent outside traffic, so at least I’m a bit at ease about that.
    On the VPS everything worked without me having to manually modify iptables.

    For some reason I wasn’t able to make a curl call to the internet from inside Docker.
    I thought it could be DNS, but that was working properly when trying nslookup tailscale.com
    The call to the same URL wasn’t working at all. I don’t remember the exact details of the errors since the iptables modification fixed it.

    AFAIK the only difference between the two setups was that ufw was enabled on the VPS but not at home.
    So I installed ufw at home, removed the rule from iptables, and everything keeps working right now.

    I didn’t save the output of iptables from before ufw, but right now there are almost 100 rules from it.

    For example since this is curl you’re probably going to connect to ports 80 and 443 so you can add --dport to restrict the ports to the OUTPUT rule. And you should specify the interface (in this case docker0) in almost all cases.

    Oh, that’s a good point!
    I’ll try to replicate the issue later and test this, since I don’t understand why an OUTPUT problem would be solved by an INPUT rule.
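    If I understood the suggestion, the narrowed-down rule would look roughly like this (ports and interface taken from their example; container traffic normally traverses the FORWARD chain, so that’s the chain I’d try first):

        sudo iptables -A FORWARD -i docker0 -p tcp -m multiport --dports 80,443 -j ACCEPT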



  • Ah, that makes sense!
    Yes, a DB would let you build this. But the key word is “build”: you need to think about what is needed, in which format, how to properly model all the relationships to get data consistency and flexibility, etc.
    For example, you might implement the tags as a text field, but then we still have the same issue with adding, removing, and reordering them. One fix could be a separate table relating many tags to one task. Then we have the problem of mistyping a tag: you might want to add TODO but forget you already have it as todo, which might not be a problem if the field is case insensitive, but what about to-do?
    So there is still a lot of stuff you might overlook that will come up and sidetrack you from creating and doing your tasks, even if you abstract all of this into a script.

    Specifically for todo lists I selfhost https://vikunja.io/
    It has an OpenAPI spec (OAS), so you can easily generate a client library in any language to build a CLI.
    Each task has a lot of attributes, including the ones you want: relations between tasks, labels, due dates, assignees.
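    For example, assuming you’ve downloaded the spec to a file called vikunja-openapi.json, generating a Python client is roughly this (openapi-generator supports plenty of other languages too):

        npx @openapitools/openapi-generator-cli generate -i vikunja-openapi.json -g python -o ./vikunja-client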

    Maybe you can have a project for your book list, but it might be overkill.

    For links and articles to read I’d say simple bookmarking software could be enough, even what’s built into your browser.
    If you want to go a bit beyond that I’m using https://github.com/goniszewski/grimoire
    I like it because it has nested categories plus tags; most other bookmark projects only have flat categories or only tags.
    It also has a basic API, which is enough for most use cases.
    Another option could be an RSS reader if you want to get all the articles from a site. I’m using https://github.com/FreshRSS/FreshRSS, which has the option to retrieve data from sites using XPath in case they don’t offer RSS.


    If you still want to go the DB route then, as others have mentioned, since it’ll be local and single user, SQLite is the best option.
    I’d still encourage you to use an existing project, and if it’s open source you can easily contribute the code you’d have written anyway, to help improve it for the next person with your exact needs.

    (Just noticed your username :P
    I also love matcha, not an addict tho haha)


  • I can’t imagine this flow working with any DB without a UI to manage it.
    How are you going to store all of that in a way that’s easy yet flexible enough to handle everything with SQL?

    A table for notes?
    What fields would it have? Probably just a text field.
    Creating one is simple: insert “initial note”… But how are you going to update it? A simple UPDATE by ID won’t work since you’ll be replacing all the content; you’d need to query the note, copy it into a text editor, and then copy it back into a query (don’t forget to escape it).
    Then you’ll probably want to know which is your oldest note, so you need to include created_at and updated_at fields.
    Maybe a title per note is a nice addition, so a new field for the title.

    What about the todo lists? Will they be stored in the same notes table?
    If so, same problem: how are you going to update them? Add new items, mark items as done, remove them, reorder them.
    Maybe a dedicated table, well, two tables: list metadata and list items.
    The metadata table would have almost the same fields as notes, but with a description instead of the text. The list items would have a status and text.
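    Just to illustrate how much plumbing that already is, a minimal sketch of those two tables (names and columns are made up) would be something like:

        sqlite3 tasks.db "
          CREATE TABLE lists (id INTEGER PRIMARY KEY, title TEXT, description TEXT,
                              created_at TEXT DEFAULT CURRENT_TIMESTAMP, updated_at TEXT);
          CREATE TABLE list_items (id INTEGER PRIMARY KEY, list_id INTEGER REFERENCES lists(id),
                                   text TEXT, status TEXT DEFAULT 'pending', position INTEGER);
        "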

    Maybe you can reuse the todo tables for your book list and links/articles to read.

    so that I can script its commands to create simpler abstractions, rather than writing out the full queries every time.

    This already exists: several note-taking apps wrap either the filesystem or a DB so you only have to worry about writing your ideas into them.
    I’d suggest not reinventing the wheel unless nothing out there satisfies you.

    What are the pros of using a DB directly for your use case?
    What are the cons of using a note-taking app that provides a text editor?

    If you really, really want to use a DB, maybe look into https://github.com/zadam/trilium
    It uses SQLite to store the notes, so you can check the code and get an idea of whether it would be complicated for you to manually replicate all of that.
    If not, I’d also recommend Obsidian: it stores the notes in md files, so you can open them with any software you want and they’ll have a standard syntax.





  • Ah, then no. The last I knew, you can’t migrate accounts from one server to another, which is what you’re trying to do here.
    As I mentioned, if you were able to move the keys that identify your account, it would be easy for someone to impersonate you.
    Also, your public keys are shared among all the instances you’ve interacted with, so this might break your interactions there.


  • Do you still have the old database? You should be able to move your instance around as long as you have a dump of your DB; that’s where all the keys of each community and user in your instance are. Those are the ones telling other instances that you’re actually you. If you lose them, I don’t know if anything can be done other than the other instances flushing your old content and treating you as a new account. But I would count on this being intentional, since otherwise it could lead to people impersonating someone else if they got hold of the domain without the DB.

    EDIT: hmm, maybe I didn’t understand correctly, are you trying to move to a new domain? Or to a new server with the same domain?
    What do you mean by re-home?


  • I also like local-only, with a similar setup to yours: rsync to an HDD and to an SSD.
    But I would also recommend you follow that suggestion: you need an external backup managed by someone else (encrypted, of course) so you have options if anything happens to everything you keep locally.
    It’s up to you how much you’re willing to pay to be sure you can retrieve your data.
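    For reference, the local part really is just something along these lines (paths are examples):

        rsync -a --delete /data/ /mnt/backup-hdd/data/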

    I’m using iDrive e2, it says it has a limited offer, but it’s been there for over a year.

    I’m basically paying $1.25 per month for 2TB (it’s charged all at once for 24 months, so about $30 upfront) https://www.idrive.com/s3-storage-e2/pricing


  • I found that out myself trying to do the same thing, haha. I did the same process as OP: gparted took 2.5 hours on my 1TB HDD to create the new partition, then copying the data from the old to the new partition was painfully slow, so I ended up copying it to another drive and from there into the new partition.
    Afterwards I deleted the old partition and grew the new one, which took a bit more than 1.5 hours.

    If I had the space I would have copied all the data off the drive, formatted it, and then copied everything back onto it. It would have been quicker.