9

Currently the code base for the project I am working on lives on a remote company server, and it has to stay there. Also, the remote git repository cannot be made public.

My current setup is:

  • Connect to the VPN
  • Run sshfs to mount a copy of the code
  • Start working on the code
  • When I am done: ssh to the remote server and run the git commands there
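For reference, the cycle above looks roughly like this (hostnames, paths, and the VPN client are placeholders; your actual commands will differ):

```shell
# 1. Connect to the VPN (client varies; openconnect shown purely as an example)
sudo openconnect vpn.example.com

# 2. Mount a copy of the code over sshfs
mkdir -p ~/dev/code
sshfs user@devserver.example.com:/srv/project ~/dev/code

# 3. ... edit files under ~/dev/code in the IDE ...

# 4. When done, run git on the server itself over ssh
ssh user@devserver.example.com \
    'cd /srv/project && git add -A && git commit && git push'
```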

The problem with this is that the VPN drops from time to time, so my sshfs mount breaks and my IDE freezes. What I do is manually reconnect the VPN, run sshfs again, and get back to work.

But it gets more annoying as the VPN drops more often.

So I wonder if sshfs has settings for some sort of cache that would allow me to keep working and only sync the changes when the VPN comes back.

That may make no sense, since if the remote drive is not available there is nothing to write to. So what about a different setup that watches for changes and uses rsync to move them bidirectionally (either when I save a file, or when I do git pull)?
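As a sketch of the watch-and-sync idea (assuming inotify-tools is installed; hostnames and paths are placeholders, and this only covers the local-to-remote direction):

```shell
#!/bin/sh
# Watch a local working copy and push changes to the server on every save.
# Requires inotify-tools. SRC and DEST are placeholders.
SRC="$HOME/dev/code/"
DEST="user@devserver.example.com:/srv/project/"

while inotifywait -r -e modify,create,delete,move "$SRC"; do
    # --delete mirrors local deletions to the remote tree
    rsync -az --delete "$SRC" "$DEST"
done
```

For the truly bidirectional case, a dedicated two-way synchronizer such as unison may be a better fit than hand-rolled rsync loops, since rsync alone has no conflict handling.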

I can't just git clone, because I can't reproduce the entire environment needed to work 'locally' (the DB and so on).

The code has to be on their servers: in order to test/see my work I have to access a URL, which is my sandbox. I can't git push each time I want to see my changes.

Asgaroth
    Why don't you use `git` the sane way? Clone the repo, and work remotely. – zecrazytux Jan 17 '13 at 16:18
  • I can't just git clone, because I can't reproduce the entire environment to work 'locally'. The code has to be on their servers; in order to test/see my work I have to access a URL that is my sandbox. I can't git push each time I want to see my changes – Asgaroth Jan 17 '13 at 16:35
  • 1
    Cripes, that's pretty broken. – EEAA Jan 17 '13 at 16:35
  • How is it a bad question? It's not like I can change their infrastructure or make decisions about it; I just need a way to work with them remotely. – Asgaroth Jan 17 '13 at 18:17

3 Answers

2

zecrazytux is right -- Why don't you use git the way you're supposed to: by cloning the repository, working on it remotely, and pushing the changes back to the master?

I see no reason you "can't" git push your work each time you want to see your changes (ideally pushing to a development branch that then gets merged when it's tested and proven working) -- lots of people do this. You can even use a post-receive hook to deploy your changes into the environment if you want to automate that part of things.
(You obviously don't WANT to do this, but you haven't given any reason why so I reject the premise of your problem.)
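The post-receive deployment mentioned above can be as simple as the following (placed in `hooks/post-receive` on the server's repository; the work-tree path and branch name are placeholders):

```shell
#!/bin/sh
# post-receive hook: check the pushed branch out into the sandbox tree,
# so the sandbox URL always reflects the latest push.
# /var/www/sandbox and the branch name "dev" are placeholders.
GIT_WORK_TREE=/var/www/sandbox git checkout -f dev
```

Make the hook executable (`chmod +x hooks/post-receive`) and every push to the dev branch redeploys the sandbox automatically.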


Frankly there's nothing you can do to make an unreliable network connection "tolerable" (ESPECIALLY if you're trying to mount network filesystems) -- you can either work remotely as outlined above, SSH into the system and work directly on it (screen is your friend here), or investigate and fix the underlying network instability.
Trying to do something else to "make it tolerable" is an exercise in futility (think "cocktail umbrella in a hurricane").

voretaq7
  • The reason is, we have redmine and each commit should have a specific format, so a bunch of commits is not acceptable for a single task. Also, having to do git commit + push before going to the browser and doing a refresh (F5) would make it less tolerable than it is now. Imagine having to commit + push just because I missed a semicolon (I'm exaggerating, but you get the idea) – Asgaroth Jan 17 '13 at 19:20
  • `we have redmine and each commit should have a specific format` <-- So when you merge the development branch format the commit for the merge the way redmine expects it. `git` is flexible like that :-) – voretaq7 Jan 17 '13 at 19:22
  • The missing semicolon thing is simple: Don't make mistakes (I'm exaggerating, but you get the idea - you can run static syntax checks against your stuff before you push to eliminate little things like that, and since Redmine is only going to see the merge-down commit nobody needs to know the horrors that happened in your dev branch anyway) -- otherwise we're back to "work remotely, use `screen` for when the connection dies" – voretaq7 Jan 17 '13 at 19:24
  • That could be true, but I'm still left with having to commit+push to see the changes. Of course I use syntax checkers; you didn't get the idea. But you know that when working on web apps you need to do a lot of page refreshes while working on something, and again, commit+push would make it less tolerable than it is now. – Asgaroth Jan 17 '13 at 19:31
  • You should be able to have this setup: origin → remote clone → local clone. A push from the local clone will be done to the remote clone (and is not restricted by any redmine stuff). When you are in a state which seems proper for redmine, you can push the remote clone to origin. – Alfe Sep 26 '17 at 15:08
2

I am using these sshfs options to minimize latency:

sshfs -o Ciphers=arcfour,compression=no,nonempty,auto_cache,reconnect,workaround=all user@development.net:/usr/local/gitdev/ ~/dev/code

It has the reconnect flag, all the sshfs workarounds, auto caching, and the arcfour cipher.

You can read about those options in the sshfs manual; I found these to be the fastest sshfs options, at least for my setup.
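Note that the arcfour cipher has since been removed from modern OpenSSH releases, so the command above will fail on a current system. A rough present-day equivalent (an untested sketch; the keepalive values are arbitrary) might be:

```shell
# reconnect + keepalives: dead connections are detected within ~45s
# (15s interval x 3 missed replies), letting sshfs re-establish the mount
# instead of hanging the IDE indefinitely.
sshfs -o reconnect,auto_cache,compression=no \
      -o ServerAliveInterval=15,ServerAliveCountMax=3 \
      user@development.net:/usr/local/gitdev/ ~/dev/code
```

`ServerAliveInterval`/`ServerAliveCountMax` are standard ssh options passed through by sshfs; they matter here because without keepalives a dropped VPN leaves the TCP connection in limbo and `reconnect` never gets a chance to kick in.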

ETA: For more on sshfs performance, read here: sshfs performance

kaaposc
diamonddog
0

I use the PhpStorm IDE over VPN + sshfs from home, and it's git that is slow, since it scans lots of files.

I use https://github.com/ericpruitt/sshfsexec with my IDE or my terminal, and all local commands are automatically executed as remote ssh commands instead, which is faster.

After installing it, in PhpStorm: Settings > Git > Path to Git executable => /home/username/bin/sshfsexec/git
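A hypothetical install sketch, based on my reading of the project README (paths are placeholders; check the README for the authoritative steps): each command you want redirected becomes a symlink to the sshfsexec script, and that directory goes first in your PATH.

```shell
# Fetch the script and set up a per-command symlink for git.
git clone https://github.com/ericpruitt/sshfsexec ~/src/sshfsexec
mkdir -p ~/bin/sshfsexec
cp ~/src/sshfsexec/sshfsexec.py ~/bin/sshfsexec/
ln -s ~/bin/sshfsexec/sshfsexec.py ~/bin/sshfsexec/git

# Put the symlink directory ahead of the system git in PATH.
export PATH="$HOME/bin/sshfsexec:$PATH"
```

With that in place, running `git status` inside the sshfs mount executes git on the remote host over ssh rather than scanning the mounted filesystem locally, which is what makes it faster.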

GDM13