Hi programmers,

I work from two computers: a desktop and a laptop. I often interrupt my work on one computer and continue on the other, where I don’t have access to the uncommitted progress from the first. Frustrating!

Potential solution: use git to auto-save progress.

I’m posting this to get feedback. Maybe I’m missing something and this is overcomplicated?

Here is how it could work:

Creating and managing the separate branch

Alias git commands (such as git checkout) so that I am always on a branch called “[branch]-autosave”, where [branch] is the branch I intend to be on, and the autosave branch always branches from it. If the autosave branch doesn’t exist yet, it is created.
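A small shell wrapper could stand in for that alias (a sketch; the function name `co_autosave` and the `-autosave` suffix handling are just this post’s naming, and error handling is minimal):

```shell
# Check out <branch>'s autosave companion, creating it from <branch> if needed.
co_autosave() {
    branch="$1"
    auto="${branch}-autosave"
    # Make sure the underlying branch exists and is checked out first
    git checkout "$branch" || return 1
    if git show-ref --verify --quiet "refs/heads/$auto"; then
        git checkout "$auto"
    else
        # First time on this branch: create the autosave branch from it
        git checkout -b "$auto" "$branch"
    fi
}
```

In practice this would be wired up as a shell or git alias shadowing `git checkout`.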

Handling commits

Whenever I make a real commit, the autosave branch would be squash-merged into the underlying branch.
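That step could look something like this (a hypothetical helper; `git merge --squash` stages the combined diff without committing, so the real commit carries your own message):

```shell
# Commit the accumulated autosave work as one squashed commit on <branch>,
# then restart the autosave branch from that new commit.
commit_squashed() {
    branch="$1"
    msg="$2"
    git checkout "$branch" || return 1
    git merge --squash "${branch}-autosave"       # stage combined changes, no commit yet
    git commit -m "$msg"
    git branch -f "${branch}-autosave" "$branch"  # reset autosave branch onto it
    git checkout "${branch}-autosave"
}
```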

Autosave functionality

I use neovim as my editor, but this could work for other editors.

I will write an editor hook that always pulls the latest from the autosave branch before opening a file.

Another hook will commit and push to origin whenever a file is saved from the editor.
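The two hooks could shell out to something like this (a sketch assuming a remote named origin; `autosave_pull` and `autosave_push` are my own naming — on the neovim side they would be called from BufReadPre/BufWritePost autocmds):

```shell
# Run before opening a file: bring in autosave commits pushed from the other machine.
autosave_pull() {
    git pull --rebase origin "$(git symbolic-ref --short HEAD)"
}

# Run after saving a file: record it and publish it immediately.
autosave_push() {
    file="$1"
    git add "$file"
    git commit -m "autosave: $file" || true   # nothing to commit is fine
    git push origin "$(git symbolic-ref --short HEAD)"
}
```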

This way, when I get on any of my devices, it will sync the changes pushed from the other device automatically.

Please share your thoughts.

  • NegativeLookBehind@lemmy.world · 10 days ago

    Write code on a machine you can remote into from each computer? Fewer commits, possibly fewer reverts, less chance of forgetting to git pull after switching machines… idk.

        • Kkmou@lemm.ee · 10 days ago

          Don’t think of git as sync storage; it’s more for merging work.

          If you need to share files between computers use a shared storage.

          Always use the right tool for the job. Mount a shared storage or use sync tools (rsync, etc.).

    • matcha_addict@lemy.lolOP · 10 days ago

      I have considered this approach, but there are several things I had issues with.

      • there is still a degree of latency. It’s not a deal breaker, but it is annoying
      • clipboard programs don’t work; they copy to the remote host’s clipboard. I bet there’s a solution to this, but I couldn’t find one in the limited time I spent looking into it.
      • in the rare case the host is unreachable, I am kinda screwed. Not a deal breaker since it’s rare, but the host has to be always on, whereas the git solution only requires it to be on when syncing

      To address the issues you brought up:

      • fewer commits: this would be resolved by squashing every time I make a real commit, so the autosave commits get wiped. If I really hated commits, I could just amend instead of committing, but I’d rather have the history.
      • forgetting to git pull: the hooks I talked about will take care of that, so I won’t ever have to worry about forgetting again.
      • Strykker@programming.dev · 6 days ago

        Your git solution still has all of these issues, since you need the git server to be alive. For number 3, use something like rsync so that you keep a backed-up local copy, if you are concerned about the file share being offline.

        • matcha_addict@lemy.lolOP · 6 days ago

          I don’t need the client computers to be alive, only the central server (which could be github.com for example, so not even a server I manage).

      • actually@lemmy.world · 9 days ago

        I once used a virtual desktop in the cloud, and I could access it from anywhere. It was just a regular OS that had all my tools, and it was where my work got done. Ultimately, that remote desktop went away when I changed jobs, but it is something I would consider again.

        There is a danger of things going poof, or not being accessible; that cannot entirely be helped. But a push to a backup repo during each commit would allow an emergency restore. Taking a snapshot of the machine every few days, for example if it’s on AWS, helps lessen the loss if and when it goes poof.

        To solve the issue of the internet going out, have one of your local computers do a regular pull from the backup repo as a cron job.
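        That fallback pull could be a plain cron entry (the path, branch name, and interval here are hypothetical placeholders):

        ```
        # every 15 minutes, update a local mirror of the backup repo
        */15 * * * * cd /home/me/backup-mirror && git pull --ff-only origin main >/dev/null 2>&1
        ```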