ls-remote set timeout
I'm new to this list, so hello, I hope I'm welcome.
My problem is: I have a configuration for my bash saved in a private
git repo. Every time I start bash, my .bashrc checks out this repo to
get all changes (aliases, some functions, $PS1 and so on), so I can
have my working environment on all my servers with a personal login.
Now I'm working at a new customer where github.com is not reachable
(firewall/proxy). Parts of my configuration (some plugins/scripts for
vim) cannot be updated there, because they are hosted on github.com. :-/
So I tried to build a check into my .bashrc to see whether a repo is
reachable. I found that git ls-remote is a good choice, because it
handles redirections from http to https correctly behind this proxy.
(Direct https links to git repos do not even work in this
environment... don't ask why, please.)
With git ls-remote I can check whether my private repo (a bare git repo
with gitweb) is reachable (http pull; ssh is also closed!!!). But this
check hangs on github.com when the proxy redirects to a "this is
forbidden" page... and it hangs forever (1 minute, 2 minutes, or even
really forever!)
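For context, my check currently looks roughly like this (a sketch; the
function name and the repo URL are placeholders of mine):

```shell
# Return 0 if the repo answers to ls-remote, nonzero otherwise.
# Problem: behind the blocking proxy this call can hang indefinitely.
repo_is_reachable() {
    git ls-remote "$1" >/dev/null 2>&1
}

if repo_is_reachable "https://example.com/my-config.git"; then
    echo "repo reachable, pulling config"
fi
```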
Is it possible to pass curl's "--connect-timeout" and/or "--max-time"
options through to my "git ls-remote" command? Then I could call

git --connection-timeout 3 -m 3 ls-remote <REPOURL>

and the command would stop after 3 seconds with an error code, which I
could check in my .bashrc.
I tried netcat and curl directly. In this environment only git
ls-remote works correctly on reachable repos, but it hangs on blocked
ones... :-/
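For now I'm working around it by wrapping the call in coreutils
timeout (assuming GNU timeout is available on every box; the URL is a
placeholder), but a native git option would be nicer:

```shell
# Abort the ls-remote after 3 seconds; timeout exits with 124 on expiry,
# otherwise it passes git's own exit code through.
if timeout 3 git ls-remote "https://example.com/my-config.git" >/dev/null 2>&1; then
    echo "repo reachable"
else
    echo "repo unreachable or check timed out"
fi
```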
Thank you for your interest