Tuesday, September 8, 2009

Copying Files across networks - in an optimized way

The scenario:
what - a bunch of files in a directory tree
how much - total size above a GB
where - across continents

If we have a reliable connection, the best option is an ssh copy - yes, we will tar and gzip on the source and untar/gunzip on the destination:

#> tar czf - [dir0] [dir1] ... | ssh <destination> 'tar xzf -'
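
By default the untar lands in the remote user's home directory. If it should end up somewhere specific, tar's -C flag switches to that directory before extracting ([path] is just a placeholder, same as the dirs above):

#> tar czf - [dir0] [dir1] ... | ssh <destination> 'tar xzf - -C [path]'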

If we have a reliable high-speed connection, compressing probably wouldn't save that much time:

#> tar cf - [dir0] [dir1] ... | ssh <destination> 'tar xf -'
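
One more thing worth checking on a fast link: that ssh itself is not compressing behind your back. Compression is usually off by default, but if some ssh_config turns it on, you can force it off explicitly:

#> tar cf - [dir0] [dir1] ... | ssh -o Compression=no <destination> 'tar xf -'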

OK, now comes the unreliable connection :)
We could zip the tree and push the archive through rsync, which, in case the connection breaks, retries only the pieces that were not transferred. But remember: a zip file, once changed, will be transferred all over again. So if the source is going to change frequently and both ends have to stay in sync over an unreliable, slow link, the best option is to rsync the directory tree as it is:

#> rsync -a [dir0] [dir1] ... destination:[path]
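
On a really flaky link I would also add a couple of standard rsync flags: -z compresses data in transit, --partial keeps half-transferred files around so a rerun resumes them instead of starting over, and --progress shows what is going on:

#> rsync -az --partial --progress [dir0] [dir1] ... destination:[path]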

