<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article09_06_23_1823201</id>
	<title>How Do You Sync &amp; Manage Your Home Directories?</title>
	<author>kdawson</author>
	<datestamp>1245784740000</datestamp>
	<htmltext>digitalderbs writes <i>"A problem plaguing most people with multiple computers is the arduous task of synchronizing files between them: documents, pictures, code, or data. Everyone seems to have their own strategies, whether they involve USB drives, emailed attachments, rsync, or a distributed management system, all of which have varying degrees of success in implementing fast synchronization, interoperability, redundancy and versioning, and encryption. Myself, I've used <a href="http://www.cis.upenn.edu/~bcpierce/unison/">unison</a> for file synchronization and <a href="http://rsnapshot.org/">rsnapshot</a> for backups between two Linux servers and a Mac OS X laptop. I've recently considered adding some sophistication by implementing a version control system like <a href="http://subversion.tigris.org/">subversion</a>, <a href="http://git-scm.com/">git</a>, or <a href="http://bazaar-vcs.org/">bazaar</a>, but have found some shortcomings in automating commits and pushing updates to all systems. What system do you use to manage your home directories, and how have they worked for you for managing small files (e.g. dot configs) and large (gigabyte binaries of data) together?"</i></htmltext>
<tokentext>digitalderbs writes " A problem plaguing most people with multiple computers is the arduous task of synchronizing files between them : documents , pictures , code , or data .
Everyone seems to have their own strategies , whether they involve USB drives , emailed attachments , rsync , or a distributed management system , all of which have varying degrees of success in implementing fast synchronization , interoperability , redundancy and versioning , and encryption .
Myself , I 've used unison for file synchronization and rsnapshot for backups between two Linux servers and a Mac OS X laptop .
I 've recently considered adding some sophistication by implementing a version control system like subversion , git , or bazaar , but have found some shortcomings in automating commits and pushing updates to all systems .
What system do you use to manage your home directories , and how have they worked for you for managing small files ( e.g .
dot configs ) and large ( gigabyte binaries of data ) together ?
"</tokentext>
<sentencetext>digitalderbs writes "A problem plaguing most people with multiple computers is the arduous task of synchronizing files between them: documents, pictures, code, or data.
Everyone seems to have their own strategies, whether they involve USB drives, emailed attachments, rsync, or a distributed management system, all of which have varying degrees of success in implementing fast synchronization, interoperability, redundancy and versioning, and encryption.
Myself, I've used unison for file synchronization and rsnapshot for backups between two Linux servers and a Mac OS X laptop.
I've recently considered adding some sophistication by implementing a version control system like subversion, git, or bazaar, but have found some shortcomings in automating commits and pushing updates to all systems.
What system do you use to manage your home directories, and how have they worked for you for managing small files (e.g.
dot configs) and large (gigabyte binaries of data) together?
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448409</id>
	<title>PowerFolder</title>
	<author>totmacher</author>
	<datestamp>1245771240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I like to <a href="http://www.powerfolder.com/" title="powerfolder.com" rel="nofollow">sync my files with PowerFolder</a> [powerfolder.com].
I can directly sync my computers in LAN or online and don't need to upload them to any service provider.<nobr> <wbr></nobr>...And it works perfectly between my Linux and Windows box. I think a Mac client is also available.</htmltext>
<tokentext>I like to sync my files with PowerFolder [ powerfolder.com ] .
I can directly sync my computers in LAN or online and do n't need to upload them to any service provider .
...And it works perfectly between my Linux and Windows box .
I think a Mac client is also available .</tokentext>
<sentencetext>I like to sync my files with PowerFolder [powerfolder.com].
I can directly sync my computers in LAN or online and don't need to upload them to any service provider.
...And it works perfectly between my Linux and Windows box.
I think a Mac client is also available.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443747</id>
	<title>Home Server</title>
	<author>therapyreject</author>
	<datestamp>1245789480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I save any data I might need shared to my server at home, and just map a drive on all my pc's I use. I only use Windows at home, so its pretty simple and works just fine.</htmltext>
<tokentext>I save any data I might need shared to my server at home , and just map a drive on all my pc 's I use .
I only use Windows at home , so its pretty simple and works just fine .</tokentext>
<sentencetext>I save any data I might need shared to my server at home, and just map a drive on all my pc's I use.
I only use Windows at home, so its pretty simple and works just fine.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445061</id>
	<title>bash script and unison</title>
	<author>vlm</author>
	<datestamp>1245750720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>A bash script that runs unison on certain directories if that directory exists.  About ten for different directories (like ~/music, ~/movies, etc)<br>I have no interest in backing up dot files like<nobr> <wbr></nobr>.kde as what works on my 7 inch battery powered "netbook" is probably not applicable to my giant monitor high powered desktop.<br>My tiny laptop for space reasons does not have a ~/audiobooks directory thus the script does not sync ~/audiobooks.<br>The script of course distributes itself and updates itself into<nobr> <wbr></nobr>/usr/local/bin all by itself somewhat virally.</p><p>For system backup (closely related) I only back up config files and have my unisonsync script back it up to all my machines.<br>There is no point in backing up<nobr> <wbr></nobr>/bin/bash since there are about 300 world wide debian mirrors that back that up much better than I could ever dream of doing, and an amd64<nobr> <wbr></nobr>/bin/bash would be of little use on a replacement i386 machine, etc.<br>All my machines share a unison'ed ~/backup directory... structure like ~/backup/server/stuff ~/backup/media/stuff ~/backup-mythtv-upstairs/stuff<br>So, all my machines have a backup copy of all important config files or changed files on all the machines I control.  For the eight or so machines it consumes only a couple megs.<br>I backup things like<nobr> <wbr></nobr>/etc/network/interfaces or<nobr> <wbr></nobr>/etc/ntp.conf etc.<br>mysqldump makes an appearance or two.<br>When I recently set up "mythtv-downstairs" frontend, I pretty much worked thru each file I backup for "mythtv-upstairs" and it worked, so its of much more use than disaster recovery.</p><p>My script also runs certain monitoring scripts and dumps their output into the backup system.  run cpuid and dump the output into ~/backup/machinename/cpuid.  So in the event of total and utter system failure I know the exact specs of the dead box without any memorization or googling.  
(did it have one gig or two?  easy, read the file ~/backup/machinename/free)  I save a copy of the output of lsmod, cpuid, free, df, cfdisk -P s<nobr> <wbr></nobr>/dev/whatever, cat<nobr> <wbr></nobr>/proc/mdstat as appropriate, lots of other stuff.</p><p>On the big server, a cron job weekly tars up ~/backup and stashes it.  Occasionally I burn this vast collection of backup files to a CD or copy to a flash and then store it offsite.  I also make offsite copies of my relevant ~/whatever directories as I see fit.</p></htmltext>
<tokentext>A bash script that runs unison on certain directories if that directory exists .
About ten for different directories ( like ~ /music , ~ /movies , etc )
I have no interest in backing up dot files like .kde as what works on my 7 inch battery powered " netbook " is probably not applicable to my giant monitor high powered desktop .
My tiny laptop for space reasons does not have a ~ /audiobooks directory thus the script does not sync ~ /audiobooks .
The script of course distributes itself and updates itself into /usr/local/bin all by itself somewhat virally .
For system backup ( closely related ) I only back up config files and have my unisonsync script back it up to all my machines .
There is no point in backing up /bin/bash since there are about 300 world wide debian mirrors that back that up much better than I could ever dream of doing , and an amd64 /bin/bash would be of little use on a replacement i386 machine , etc .
All my machines share a unison'ed ~ /backup directory... structure like ~ /backup/server/stuff ~ /backup/media/stuff ~ /backup-mythtv-upstairs/stuff
So , all my machines have a backup copy of all important config files or changed files on all the machines I control .
For the eight or so machines it consumes only a couple megs .
I backup things like /etc/network/interfaces or /etc/ntp.conf etc .
mysqldump makes an appearance or two .
When I recently set up " mythtv-downstairs " frontend , I pretty much worked thru each file I backup for " mythtv-upstairs " and it worked , so its of much more use than disaster recovery .
My script also runs certain monitoring scripts and dumps their output into the backup system .
run cpuid and dump the output into ~ /backup/machinename/cpuid .
So in the event of total and utter system failure I know the exact specs of the dead box without any memorization or googling .
( did it have one gig or two ?
easy , read the file ~ /backup/machinename/free ) I save a copy of the output of lsmod , cpuid , free , df , cfdisk -P s /dev/whatever , cat /proc/mdstat as appropriate , lots of other stuff .
On the big server , a cron job weekly tars up ~ /backup and stashes it .
Occasionally I burn this vast collection of backup files to a CD or copy to a flash and then store it offsite .
I also make offsite copies of my relevant ~ /whatever directories as I see fit .</tokentext>
<sentencetext>A bash script that runs unison on certain directories if that directory exists.
About ten for different directories (like ~/music, ~/movies, etc)
I have no interest in backing up dot files like .kde as what works on my 7 inch battery powered "netbook" is probably not applicable to my giant monitor high powered desktop.
My tiny laptop for space reasons does not have a ~/audiobooks directory thus the script does not sync ~/audiobooks.
The script of course distributes itself and updates itself into /usr/local/bin all by itself somewhat virally.
For system backup (closely related) I only back up config files and have my unisonsync script back it up to all my machines.
There is no point in backing up /bin/bash since there are about 300 world wide debian mirrors that back that up much better than I could ever dream of doing, and an amd64 /bin/bash would be of little use on a replacement i386 machine, etc.
All my machines share a unison'ed ~/backup directory... structure like ~/backup/server/stuff ~/backup/media/stuff ~/backup-mythtv-upstairs/stuff
So, all my machines have a backup copy of all important config files or changed files on all the machines I control.
For the eight or so machines it consumes only a couple megs.
I backup things like /etc/network/interfaces or /etc/ntp.conf etc.
mysqldump makes an appearance or two.
When I recently set up "mythtv-downstairs" frontend, I pretty much worked thru each file I backup for "mythtv-upstairs" and it worked, so its of much more use than disaster recovery.
My script also runs certain monitoring scripts and dumps their output into the backup system.
run cpuid and dump the output into ~/backup/machinename/cpuid.
So in the event of total and utter system failure I know the exact specs of the dead box without any memorization or googling.
(did it have one gig or two?
easy, read the file ~/backup/machinename/free)  I save a copy of the output of lsmod, cpuid, free, df, cfdisk -P s /dev/whatever, cat /proc/mdstat as appropriate, lots of other stuff.
On the big server, a cron job weekly tars up ~/backup and stashes it.
Occasionally I burn this vast collection of backup files to a CD or copy to a flash and then store it offsite.
I also make offsite copies of my relevant ~/whatever directories as I see fit.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445419</id>
	<title>Re:Dropbox</title>
	<author>ralphweaver</author>
	<datestamp>1245751920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Is source code available?</htmltext>
<tokentext>Is source code available ?</tokentext>
<sentencetext>Is source code available?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477455</id>
	<title>Re:Mobile Home Directory</title>
	<author>mattdaemon</author>
	<datestamp>1245951120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I'm curious as to how you sync the Mac's on login/logout.
Are you using LoginHook/LogoutHook or<nobr> <wbr></nobr>.bash_login/.bash_logout and are you doing it interactively?

I'm new to the mac and coming from windows and linux environments had setup such syncs using the Group Policy Editor's login/logout scripts and plain<nobr> <wbr></nobr>.bash_login/.bash_logout scripts respectively. This allowed me to interact with said scripts so that I could inspect any output/conflicts. I can't however, figure out how to do this on the Mac. I've only been able to run the scripts with no output/interaction if I use the LoginHook/LogoutHook or have to manually startup/shutdown a terminal window.</htmltext>
<tokentext>I 'm curious as to how you sync the Mac 's on login/logout .
Are you using LoginHook/LogoutHook or .bash_login/.bash_logout and are you doing it interactively ?
I 'm new to the mac and coming from windows and linux environments had setup such syncs using the Group Policy Editor 's login/logout scripts and plain .bash_login/.bash_logout scripts respectively .
This allowed me to interact with said scripts so that I could inspect any output/conflicts .
I ca n't however , figure out how to do this on the Mac .
I 've only been able to run the scripts with no output/interaction if I use the LoginHook/LogoutHook or have to manually startup/shutdown a terminal window .</tokentext>
<sentencetext>I'm curious as to how you sync the Mac's on login/logout.
Are you using LoginHook/LogoutHook or .bash_login/.bash_logout and are you doing it interactively?
I'm new to the mac and coming from windows and linux environments had setup such syncs using the Group Policy Editor's login/logout scripts and plain .bash_login/.bash_logout scripts respectively.
This allowed me to interact with said scripts so that I could inspect any output/conflicts.
I can't however, figure out how to do this on the Mac.
I've only been able to run the scripts with no output/interaction if I use the LoginHook/LogoutHook or have to manually startup/shutdown a terminal window.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443881</id>
	<title>Arduous Task?</title>
	<author>sexconker</author>
	<datestamp>1245789900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>What?<br>If you have enough files and move between systems frequently enough that this is an issue, consider a USB flash drive.</p><p>Store your files there.<br>Keep backups on each machine.</p><p>Want versioning?  Seems to me that files typically have a datestamp for when they were last modified.  That's all the versioning people need 99.9% of the time.</p><p>If you're in a situation where this is a big problem (many users many files many machines), you want your damned files stored in a more permanent location, such as on, I don't know, a file server?</p><p>Save your fucking files to the server.  If you're away from the server, upload your file when you get back.  When you get to another machine, grab the files you intend to work on.</p><p>Hell, Windows has supported this shit for ages with offline files and the whole "Briefcase" bullshit.</p><p><a href="http://www.microsoft.com/windowsxp/using/mobility/learnmore/offlinefiles.mspx" title="microsoft.com">http://www.microsoft.com/windowsxp/using/mobility/learnmore/offlinefiles.mspx</a> [microsoft.com]<br><a href="http://support.microsoft.com/kb/307885" title="microsoft.com">http://support.microsoft.com/kb/307885</a> [microsoft.com]</p><p>And yes, Windows does a simple versioning and backup with shadow copies.</p><p>In your case, 2 Linux servers and OS X, just sync shit on a schedule when all machines are on the network if you want.  The 2 servers should always be in sync with each other (since they're servers and should always be up and networked).</p><p>The laptop is the only issue, and should sync when connecting to and before disconnecting from the network.</p><p>Any way you want to go about syncing files is fine.  Might I suggest a simple GUI drag and drop to/from the server?  Seems to me most users can handle that, as long as you beat into them how to know which direction to do it.  You could simplify this by making a simple script users could run.
This script could include making backups on the server so we don't have issues of people going the wrong way, and so you can timestamp each old version (useful for keeping files for various projects grouped together, so people can grab old versions of a project if that's what they're working on).</p><p>You don't need services to handle this for you.<br>You have 2 servers and 1 laptop.  I would say you don't need anything to handle this for you.  I wouldn't even go as far as to keep a flash drive laying around.  Just, you know, remember to grab files you're going to work on before leaving with your laptop, and remember to reupload those you've changed when you get back to the network.</p><p>If you've got a more complicated setup (multiple users accessing and modifying the same files at the same time) THEN you need a version control / checkout service running, and even then, none of them are intuitive, and users WILL get confused and break shit.  Especially when you're dealing with mobile users who will be away from the network for unspecified periods of time.</p></htmltext>
<tokentext>What ?
If you have enough files and move between systems frequently enough that this is an issue , consider a USB flash drive .
Store your files there .
Keep backups on each machine .
Want versioning ?
Seems to me that files typically have a datestamp for when they were last modified .
That 's all the versioning people need 99.9 % of the time .
If you 're in a situation where this is a big problem ( many users many files many machines ) , you want your damned files stored in a more permanent location , such as on , I do n't know , a file server ?
Save your fucking files to the server .
If you 're away from the server , upload your file when you get back .
When you get to another machine , grab the files you intend to work on .
Hell , Windows has supported this shit for ages with offline files and the whole " Briefcase " bullshit .
http : //www.microsoft.com/windowsxp/using/mobility/learnmore/offlinefiles.mspx [ microsoft.com ]
http : //support.microsoft.com/kb/307885 [ microsoft.com ]
And yes , Windows does a simple versioning and backup with shadow copies .
In your case , 2 Linux servers and OS X , just sync shit on a schedule when all machines are on the network if you want .
The 2 servers should always be in sync with each other ( since they 're servers and should always be up and networked ) .
The laptop is the only issue , and should sync when connecting to and before disconnecting from the network .
Any way you want to go about syncing files is fine .
Might I suggest a simple GUI drag and drop to/from the server ?
Seems to me most users can handle that , as long as you beat into them how to know which direction to do it .
You could simplify this by making a simple script users could run .
This script could include making backups on the server so we do n't have issues of people going the wrong way , and so you can timestamp each old version ( useful for keeping files for various projects grouped together , so people can grab old versions of a project if that 's what they 're working on ) .
You do n't need services to handle this for you .
You have 2 servers and 1 laptop .
I would say you do n't need anything to handle this for you .
I would n't even go as far as to keep a flash drive laying around .
Just , you know , remember to grab files you 're going to work on before leaving with your laptop , and remember to reupload those you 've changed when you get back to the network .
If you 've got a more complicated setup ( multiple users accessing and modifying the same files at the same time ) THEN you need a version control / checkout service running , and even then , none of them are intuitive , and users WILL get confused and break shit .
Especially when you 're dealing with mobile users who will be away from the network for unspecified periods of time .</tokentext>
<sentencetext>What?
If you have enough files and move between systems frequently enough that this is an issue, consider a USB flash drive.
Store your files there.
Keep backups on each machine.
Want versioning?
Seems to me that files typically have a datestamp for when they were last modified.
That's all the versioning people need 99.9% of the time.
If you're in a situation where this is a big problem (many users many files many machines), you want your damned files stored in a more permanent location, such as on, I don't know, a file server?
Save your fucking files to the server.
If you're away from the server, upload your file when you get back.
When you get to another machine, grab the files you intend to work on.
Hell, Windows has supported this shit for ages with offline files and the whole "Briefcase" bullshit.
http://www.microsoft.com/windowsxp/using/mobility/learnmore/offlinefiles.mspx [microsoft.com]
http://support.microsoft.com/kb/307885 [microsoft.com]
And yes, Windows does a simple versioning and backup with shadow copies.
In your case, 2 Linux servers and OS X, just sync shit on a schedule when all machines are on the network if you want.
The 2 servers should always be in sync with each other (since they're servers and should always be up and networked).
The laptop is the only issue, and should sync when connecting to and before disconnecting from the network.
Any way you want to go about syncing files is fine.
Might I suggest a simple GUI drag and drop to/from the server?
Seems to me most users can handle that, as long as you beat into them how to know which direction to do it.
You could simplify this by making a simple script users could run.
This script could include making backups on the server so we don't have issues of people going the wrong way, and so you can timestamp each old version (useful for keeping files for various projects grouped together, so people can grab old versions of a project if that's what they're working on).
You don't need services to handle this for you.
You have 2 servers and 1 laptop.
I would say you don't need anything to handle this for you.
I wouldn't even go as far as to keep a flash drive laying around.
Just, you know, remember to grab files you're going to work on before leaving with your laptop, and remember to reupload those you've changed when you get back to the network.
If you've got a more complicated setup (multiple users accessing and modifying the same files at the same time) THEN you need a version control / checkout service running, and even then, none of them are intuitive, and users WILL get confused and break shit.
Especially when you're dealing with mobile users who will be away from the network for unspecified periods of time.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443757</id>
	<title>USB Drive.</title>
	<author>Anonymous</author>
	<datestamp>1245789480000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>If you keep everything really organized a usb stick and an external hd suffice. Occasionally overwriting old files on the hd to keep "backups" of the usb stick.<br>The problem is with my music collection is 30GB and my usb stick is 2GB<nobr> <wbr></nobr>:) I'm looking forward to replacing my usb stick for a 32+ GB usb stick so that I don't have to worry about extra hd's.</p></htmltext>
<tokentext>If you keep everything really organized a usb stick and an external hd suffice .
Occasionally overwriting old files on the hd to keep " backups " of the usb stick .
The problem is with my music collection is 30GB and my usb stick is 2GB : ) I 'm looking forward to replacing my usb stick for a 32 + GB usb stick so that I do n't have to worry about extra hd 's .</tokentext>
<sentencetext>If you keep everything really organized a usb stick and an external hd suffice.
Occasionally overwriting old files on the hd to keep "backups" of the usb stick.
The problem is with my music collection is 30GB and my usb stick is 2GB :) I'm looking forward to replacing my usb stick for a 32+ GB usb stick so that I don't have to worry about extra hd's.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443849</id>
	<title>FTP</title>
	<author>NineNine</author>
	<datestamp>1245789780000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>FTP back and forth, select the root and overwrite whatever's newer.  Unless the time on the files gets screwed up, it works fine.  Worst case scenario, which is the dates/times getting messed up, the FTP client downloads everything.  No big deal.  I do it daily for all kinds of files.</p><p>As older, wiser programmers than myself have always told me: KISS: Keep It Simple, Stupid.</p></htmltext>
<tokentext>FTP back and forth , select the root and overwrite whatever 's newer .
Unless the time on the files gets screwed up , it works fine .
Worst case scenario , which is the dates/times getting messed up , the FTP client downloads everything .
No big deal .
I do it daily for all kinds of files .
As older , wiser programmers than myself have always told me : KISS : Keep It Simple , Stupid .</tokentext>
<sentencetext>FTP back and forth, select the root and overwrite whatever's newer.
Unless the time on the files gets screwed up, it works fine.
Worst case scenario, which is the dates/times getting messed up, the FTP client downloads everything.
No big deal.
I do it daily for all kinds of files.
As older, wiser programmers than myself have always told me: KISS: Keep It Simple, Stupid.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533</id>
	<title>"Distributed homedirs" or "CVS'd configs"?</title>
	<author>Slipped_Disk</author>
	<datestamp>1245788760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd be interested in answers for this from the "I want the same homedir contents everywhere, but NFS makes me vomit" standpoint<nobr> <wbr></nobr>:)</p><p>For managing the default profiles around the office we use git - the dotfiles &amp; such are managed, and the rest is left as an exercise for the user.<br>It's not ideal (I hate it), but it's what we've got...</p></htmltext>
<tokentext>I 'd be interested in answers for this from the " I want the same homedir contents everywhere , but NFS makes me vomit " standpoint : )
For managing the default profiles around the office we use git - the dotfiles &amp; such are managed , and the rest is left as an exercise for the user .
It 's not ideal ( I hate it ) , but it 's what we 've got ...</tokentext>
<sentencetext>I'd be interested in answers for this from the "I want the same homedir contents everywhere, but NFS makes me vomit" standpoint :)
For managing the default profiles around the office we use git - the dotfiles &amp; such are managed, and the rest is left as an exercise for the user.
It's not ideal (I hate it), but it's what we've got...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450535</id>
	<title>Ifolder</title>
	<author>hopey</author>
	<datestamp>1245841080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Open source, got server/client. Works on Linux, Windows and Mac:<br><br>http://www.kablink.org/ifolder<br><br>Packaging is on the way for Ubuntu:<br><br>https://wiki.ubuntu.com/iFolderPackaging</htmltext>
<tokentext>Open source , got server/client .
Works on Linux , Windows and Mac : http : //www.kablink.org/ifolder
Packaging is on the way for Ubuntu : https : //wiki.ubuntu.com/iFolderPackaging</tokentext>
<sentencetext>Open source, got server/client.
Works on Linux, Windows and Mac: http://www.kablink.org/ifolder
Packaging is on the way for Ubuntu: https://wiki.ubuntu.com/iFolderPackaging</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444039</id>
	<title>Re:Svn</title>
	<author>Anonymous</author>
	<datestamp>1245790380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Yes, I wonder what the shortcomings of CVS are. I'm using it myself and it does exactly what I need, so I assume that "shortcomings" depend on the individual (no penis joke pun intended).</p></htmltext>
<tokentext>Yes , I wonder what the shortcomings of CVS are .
I 'm using it myself and it does exactly what I need , so I assume that " shortcomings " depend on the individual ( no penis joke pun intended ) .</tokentext>
<sentencetext>Yes, I wonder what the shortcomings of CVS are.
I'm using it myself and it does exactly what I need, so I assume that "shortcomings" depend on the individual (no penis joke pun intended).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28465371</id>
	<title>Wuala might work too</title>
	<author>STFS</author>
	<datestamp>1245939300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I know this is late and all but wuala might suite someones needs:
<a href="http://www.wuala.com/" title="wuala.com">http://www.wuala.com/</a> [wuala.com]

It's a distributed file system.

Pros:
- Owned by LaCie
- Uses distributed P2P technology to store data fragments (your data is encrypted and distributed).
- Ability to "trade storage space" (donate your hdd space and bandwidth for more "distributed storage").
- Multiplatform

Cons:
- Java application on your desktop<nobr> <wbr></nobr>:-(

Check out their features: <a href="http://www.wuala.com/en/learn/features" title="wuala.com">http://www.wuala.com/en/learn/features</a> [wuala.com]<nobr> <wbr></nobr>...and a google tech talk they gave about their technology: <a href="http://www.youtube.com/watch?v=3xKZ4KGkQY8" title="youtube.com">http://www.youtube.com/watch?v=3xKZ4KGkQY8</a> [youtube.com]</htmltext>
<tokentext>I know this is late and all , but Wuala might suit someone 's needs : http : //www.wuala.com/ [ wuala.com ] It 's a distributed file system .
Pros : - Owned by LaCie - Uses distributed P2P technology to store data fragments ( your data is encrypted and distributed ) .
- Ability to " trade storage space " ( donate your hdd space and bandwidth for more " distributed storage " ) .
- Multiplatform Cons : - Java application on your desktop : - ( Check out their features : http : //www.wuala.com/en/learn/features [ wuala.com ] ...and a google tech talk they gave about their technology : http : //www.youtube.com/watch ? v = 3xKZ4KGkQY8 [ youtube.com ]</tokentext>
<sentencetext>I know this is late and all, but Wuala might suit someone's needs:
http://www.wuala.com/ [wuala.com]

It's a distributed file system.
Pros:
- Owned by LaCie
- Uses distributed P2P technology to store data fragments (your data is encrypted and distributed).
- Ability to "trade storage space" (donate your hdd space and bandwidth for more "distributed storage").
- Multiplatform

Cons:
- Java application on your desktop :-(

Check out their features: http://www.wuala.com/en/learn/features [wuala.com] ...and a google tech talk they gave about their technology: http://www.youtube.com/watch?v=3xKZ4KGkQY8 [youtube.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443923</id>
	<title>The internet never forgets.</title>
	<author>Anonymous</author>
	<datestamp>1245790080000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext>I embed all my documents in porn and post them on various web forums.  The recovery procedure involves spidering my spam folder. I recently found my high school history term paper in a jpg of Marilyn Chambers.</htmltext>
<tokentext>I embed all my documents in porn and post them on various web forums .
The recovery procedure involves spidering my spam folder .
I recently found my high school history term paper in a jpg of Marilyn Chambers .</tokentext>
<sentencetext>I embed all my documents in porn and post them on various web forums.
The recovery procedure involves spidering my spam folder.
I recently found my high school history term paper in a jpg of Marilyn Chambers.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450775</id>
	<title>USB</title>
	<author>pinkushun</author>
	<datestamp>1245844920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Offline: Tried many solutions, but none ever worked as well as my USB Disk! This is from years of habit though.<br>
Online: Personally, and given the resources, I would just host my own file server from home, for anywhere-anytime-access.</htmltext>
<tokentext>Offline : Tried many solutions , but none ever worked as well as my USB Disk !
This is from years of habit though .
Online : Personally , and given the resources , I would just host my own file server from home , for anywhere-anytime-access .</tokentext>
<sentencetext>Offline: Tried many solutions, but none ever worked as well as my USB Disk!
This is from years of habit though.
Online: Personally, and given the resources, I would just host my own file server from home, for anywhere-anytime-access.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445529</id>
	<title>Re:Beyond Compare</title>
	<author>BillAtHRST</author>
	<datestamp>1245752400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>++ to BC! (http://scootersoftware.com/)
<br>
If you only go in one direction, automated or semi-automated tools are great (e.g., robocopy on win).  But I've never trusted automatic bi-directional replication -- just because one file is older than another doesn't mean the newer one contains everything the older one does.  <br> BC makes reconciling different directories and files as pleasant as possible.</htmltext>
<tokentext>+ + to BC !
( http : //scootersoftware.com/ ) If you only go in one direction , automated or semi-automated tools are great ( e.g. , robocopy on win ) .
But I 've never trusted automatic bi-directional replication -- just because one file is older than another does n't mean the newer one contains everything the older one does .
BC makes reconciling different directories and files as pleasant as possible .</tokentext>
<sentencetext>++ to BC!
(http://scootersoftware.com/)

If you only go in one direction, automated or semi-automated tools are great (e.g., robocopy on win).
But I've never trusted automatic bi-directional replication -- just because one file is older than another doesn't mean the newer one contains everything the older one does.
BC makes reconciling different directories and files as pleasant as possible.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444775</id>
	<title>Unison plus rdiff-backup</title>
	<author>nealfunkbass</author>
	<datestamp>1245749700000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext>Unison to keep directories in sync on multiple machines

rdiff-backup for backups and keeping old versions</htmltext>
<tokentext>Unison to keep directories in sync on multiple machines rdiff-backup for backups and keeping old versions</tokentext>
<sentencetext>Unison to keep directories in sync on multiple machines

rdiff-backup for backups and keeping old versions</sentencetext>
</comment>
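The pairing described in the comment above (Unison for two-way sync, rdiff-backup for versioned backups) could be wired together in a crontab fragment like this; the paths, hostname, and schedule are illustrative assumptions, not the commenter's actual setup:

```shell
# Hypothetical crontab fragment: two-way sync first, then a versioned snapshot.
# m  h  dom mon dow  command
30   1  *   *   *    unison -batch ~/docs ssh://server//home/me/docs
0    2  *   *   *    rdiff-backup ~/docs /backup/docs-history
```

rdiff-backup keeps a current mirror plus reverse diffs, so an older copy can be pulled back with, e.g., `rdiff-backup -r 3D /backup/docs-history/notes.txt notes-3-days-ago.txt`.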
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443991</id>
	<title>iFolder</title>
	<author>Anonymous</author>
	<datestamp>1245790260000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>iFolder seems to work best for me.  http://www.kablink.org/ifolder  I use the Novell/Suse version though.</p></htmltext>
<tokentext>iFolder seems to work best for me .
http : //www.kablink.org/ifolder I use the Novell/Suse version though .</tokentext>
<sentencetext>iFolder seems to work best for me.
http://www.kablink.org/ifolder  I use the Novell/Suse version though.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444853</id>
	<title>NSLU2 as backup server</title>
	<author>Anonymous</author>
	<datestamp>1245749940000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I use an NSLU2 with an attached 160GB drive as a network backup server that does nightly (actually eveningly) rsyncs of the home directories on my wife's Mac and my Jaunty box. The NSLU2 gets backed up onto DVDR every couple of weeks. I also have rather large collections of MP3s and photos, which are similarly backed up and exported read-only via NFS for all to enjoy. Since these collections only grow, they go on DVDR every time there is enough new material to fill a new one.</p></htmltext>
<tokentext>I use an NSLU2 with an attached 160GB drive as a network backup server that does nightly ( actually eveningly ) rsyncs of the home directories on my wife 's Mac and my Jaunty box .
The NSLU2 gets backed up onto DVDR every couple of weeks .
I also have rather large collections of MP3s and photos , which are similarly backed up and exported read-only via NFS for all to enjoy .
Since these collections only grow , they go on DVDR every time there is enough new material to fill a new one .</tokentext>
<sentencetext>I use an NSLU2 with an attached 160GB drive as a network backup server that does nightly (actually eveningly) rsyncs of the home directories on my wife's Mac and my Jaunty box.
The NSLU2 gets backed up onto DVDR every couple of weeks.
I also have rather large collections of MP3s and photos, which are similarly backed up and exported read-only via NFS for all to enjoy.
Since these collections only grow, they go on DVDR every time there is enough new material to fill a new one.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28454579</id>
	<title>mac/linux/win - no problem</title>
	<author>josh6847</author>
	<datestamp>1245866280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I use ubuntu server (in-house) - all you need. Learn linux system administration &amp; unix scripting and your life will be much better (or find a geeky dude who knows how). Here's how I do it:

1) Samba for internal network. Now my mac, linux, and windows are all talking.
2) SSH and FTP for when you are away.
3) apache/mysql/php/perl; I suggest Lampp.
4) Get a domain name and run your own dns updater; I suggest dyndns.org
5) Point your apache config to the directory where all your data is located, and put a php auth and an htaccess auth on it. Now I can access everything through HTTP.
6) Keep everything on mac, and have a back-up copy on the server; linux and windows can just ftp/ssh their happy selves to the data when needed.
7) Unix scripts to back up my Mac at the end of the day.

--Life is beautiful with Ubuntu--</htmltext>
<tokentext>I use ubuntu server ( in-house ) - all you need .
Learn linux system administration &amp; unix scripting and your life will be much better ( or find a geeky dude who knows how ) .
Here 's how I do it : 1 ) Samba for internal network .
Now my mac , linux , and windows are all talking .
2 ) SSH and FTP for when you are away .
3 ) apache/mysql/php/perl ; I suggest Lampp .
4 ) Get a domain name and run your own dns updater ; I suggest dyndns.org 5 ) Point your apache config to the directory where all your data is located , and put a php auth and an htaccess auth on it .
Now I can access everything through HTTP .
6 ) Keep everything on mac , and have a back-up copy on the server ; linux and windows can just ftp/ssh their happy selves to the data when needed .
7 ) Unix scripts to back up my Mac at the end of the day .
--Life is beautiful with Ubuntu--</tokentext>
<sentencetext>I use ubuntu server (in-house) - all you need.
Learn linux system administration &amp; unix scripting and your life will be much better (or find a geeky dude who knows how).
Here's how I do it:

1) Samba for internal network.
Now my mac, linux, and windows are all talking.
2) SSH and FTP for when you are away.
3) apache/mysql/php/perl; I suggest Lampp.
4) Get a domain name and run your own dns updater; I suggest dyndns.org
5) Point your apache config to the directory where all your data is located, and put a php auth and an htaccess auth on it.
Now I can access everything through HTTP.
6) Keep everything on mac, and have a back-up copy on the server; linux and windows can just ftp/ssh their happy selves to the data when needed.
7) Unix scripts to back up my Mac at the end of the day.
--Life is beautiful with Ubuntu--</sentencetext>
</comment>
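Step 7 in the list above ("Unix scripts to back up my Mac at the end of the day") is not shown; a minimal POSIX sh sketch of such a script might look like this, where the dated-snapshot layout and paths are assumptions, not the commenter's actual script:

```shell
#!/bin/sh
# End-of-day backup sketch (hypothetical layout): copy a source tree
# into a dated snapshot directory under a backup root.
backup_tree() {
    src=$1
    dest_root=$2
    stamp=$(date +%Y-%m-%d)
    dest="$dest_root/$stamp"
    mkdir -p "$dest"
    # -R: recursive, -p: preserve mode/timestamps; "/." copies the contents
    cp -Rp "$src/." "$dest/"
    echo "$dest"
}

# Example (commented out): backup_tree "$HOME/Documents" /srv/backup
```

In practice `rsync -a` would be a better fit than `cp -Rp`, since it transfers only what changed, but `cp` keeps the sketch dependency-free.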
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444139</id>
	<title>google docs works well for me</title>
	<author>goffster</author>
	<datestamp>1245790680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Allows me to share with others as well.</p></htmltext>
<tokentext>Allows me to share with others as well .</tokentext>
<sentencetext>Allows me to share with others as well.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444125</id>
	<title>Panix</title>
	<author>Saint Stephen</author>
	<datestamp>1245790680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I pay $100 a year for a Panix shell account and keep my data there.  My own little server in the sky<nobr> <wbr></nobr>:-)</p></htmltext>
<tokentext>I pay $ 100 a year for a Panix shell account and keep my data there .
My own little server in the sky : - )</tokentext>
<sentencetext>I pay $100 a year for a Panix shell account and keep my data there.
My own little server in the sky :-)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449143</id>
	<title>Perforce</title>
	<author>whereiswaldo</author>
	<datestamp>1245779220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I use Perforce version control at home to sync documents and pictures between 4 computers.  It's free for up to 2 user accounts and 5 client specs (I only use 1 account).  It has worked well for several years and handles small and large files easily.</p><p>Note: I'm not affiliated with Perforce but I do use it at work.</p></htmltext>
<tokentext>I use Perforce version control at home to sync documents and pictures between 4 computers .
It 's free for up to 2 user accounts and 5 client specs ( I only use 1 account ) .
It has worked well for several years and handles small and large files easily . Note : I 'm not affiliated with Perforce but I do use it at work .</tokentext>
<sentencetext>I use Perforce version control at home to sync documents and pictures between 4 computers.
It's free for up to 2 user accounts and 5 client specs (I only use 1 account).
It has worked well for several years and handles small and large files easily. Note: I'm not affiliated with Perforce but I do use it at work.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28454069</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245864540000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>there's also a 5 GB limit on individual file sizes</p></htmltext>
<tokentext>there 's also a 5 GB limit on individual file sizes</tokentext>
<sentencetext>there's also a 5 GB limit on individual file sizes</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445673</id>
	<title>Re:Why is there no discount online backup?</title>
	<author>Rick Genter</author>
	<datestamp>1245752940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><a href="http://mozy.com/" title="mozy.com">http://mozy.com/</a> [mozy.com]</p><p>I use mozy to back up my systems. I have just under 600GB saved at mozy. They use blowfish encryption and you can use your own key so only you have access to your data (the encryption is performed locally before being transmitted over the wire). I back up both a Mac and a PC to mozy. I don't know if they have a Linux client.</p><p>For unlimited storage they charge ~50 USD/year/system being backed up. I find it well worth it for my peace of mind (it's a great off-site backup solution).</p></htmltext>
<tokentext>http : //mozy.com/ [ mozy.com ] I use mozy to back up my systems .
I have just under 600GB saved at mozy .
They use blowfish encryption and you can use your own key so only you have access to your data ( the encryption is performed locally before being transmitted over the wire ) .
I back up both a Mac and a PC to mozy .
I do n't know if they have a Linux client . For unlimited storage they charge ~ 50 USD/year/system being backed up .
I find it well worth it for my peace of mind ( it 's a great off-site backup solution ) .</tokentext>
<sentencetext>http://mozy.com/ [mozy.com] I use mozy to back up my systems.
I have just under 600GB saved at mozy.
They use blowfish encryption and you can use your own key so only you have access to your data (the encryption is performed locally before being transmitted over the wire).
I back up both a Mac and a PC to mozy.
I don't know if they have a Linux client. For unlimited storage they charge ~50 USD/year/system being backed up.
I find it well worth it for my peace of mind (it's a great off-site backup solution).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444877</id>
	<title>Multiple PCs?</title>
	<author>houghi</author>
	<datestamp>1245750000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Having more than one PC does not mean I have more than one home directory. I just use NFS.</p></htmltext>
<tokentext>Having more than one PC does not mean I have more than one home directory .
I just use NFS .</tokentext>
<sentencetext>Having more than one PC does not mean I have more than one home directory.
I just use NFS.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419</id>
	<title>Svn</title>
	<author>Anonymous</author>
	<datestamp>1245788400000</datestamp>
	<modclass>Redundant</modclass>
	<modscore>0</modscore>
	<htmltext><p>Subversion.</p></htmltext>
<tokentext>Subversion .</tokentext>
<sentencetext>Subversion.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443957</id>
	<title>Various Tools for syncing I use are...</title>
	<author>Stinky Fartface</author>
	<datestamp>1245790140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I use the following tools to stay synced in various ways:

Plaxo offers a fairly good Outlook sync utility for free which keeps my Address Book, Calendar and Tasks synced on my work and home desktop computers.

Dropbox offers 2-3 GB of free storage that automatically syncs to any computer logged into that account. I keep all sorts of stuff in there. Photoshop prefs and tools, automation scripts, encrypted password database, etc.

I set up a cheap home server with FTP. It goes to sleep if unused and I can wake it up with a Magic Packet remotely before doing file transfers.

For large libraries of software, music, etc. I have a portable hard drive that I sync on either end using Directory Toolkit about every week or two depending.

And Foxmarks for Firefox.</htmltext>
<tokentext>I use the following tools to stay synced in various ways : Plaxo offers a fairly good Outlook sync utility for free which keeps my Address Book , Calendar and Tasks synced on my work and home desktop computers .
Dropbox offers 2-3 GB of free storage that automatically syncs to any computer logged into that account .
I keep all sorts of stuff in there .
Photoshop prefs and tools , automation scripts , encrypted password database , etc .
I set up a cheap home server with FTP .
It goes to sleep if unused and I can wake it up with a Magic Packet remotely before doing file transfers .
For large libraries of software , music , etc .
I have a portable hard drive that I sync on either end using Directory Toolkit about every week or two depending .
And Foxmarks for Firefox .</tokentext>
<sentencetext>I use the following tools to stay synced in various ways:

Plaxo offers a fairly good Outlook sync utility for free which keeps my Address Book, Calendar and Tasks synced on my work and home desktop computers.
Dropbox offers 2-3 GB of free storage that automatically syncs to any computer logged into that account.
I keep all sorts of stuff in there.
Photoshop prefs and tools, automation scripts, encrypted password database, etc.
I set up a cheap home server with FTP.
It goes to sleep if unused and I can wake it up with a Magic Packet remotely before doing file transfers.
For large libraries of software, music, etc.
I have a portable hard drive that I sync on either end using Directory Toolkit about every week or two depending.
And Foxmarks for Firefox.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444149</id>
	<title>If you're willing to spend some $</title>
	<author>Binkleyz</author>
	<datestamp>1245790740000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>Doubletake software makes an enterprise-ready, real-time replication suite.</p><p>It does block-level replication, so only the changed bits of, say, a 10 GB database get replicated. It uses on-the-fly en/decryption so that the data streams are somewhat smaller than they would be otherwise.</p><p>I work for a Fortune 10 company, and when we have a need for real-time data replication, this is what we use.</p></htmltext>
<tokentext>Doubletake software makes an enterprise-ready , real-time replication suite . It does block-level replication , so only the changed bits of , say , a 10 GB database get replicated . It uses on-the-fly en/decryption so that the data streams are somewhat smaller than they would be otherwise . I work for a Fortune 10 company , and when we have a need for real-time data replication , this is what we use .</tokentext>
<sentencetext>Doubletake software makes an enterprise-ready, real-time replication suite. It does block-level replication, so only the changed bits of, say, a 10 GB database get replicated. It uses on-the-fly en/decryption so that the data streams are somewhat smaller than they would be otherwise. I work for a Fortune 10 company, and when we have a need for real-time data replication, this is what we use.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444489</id>
	<title>Subversion &amp; "outsourcing"</title>
	<author>l0b0</author>
	<datestamp>1245748740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I've got a home Subversion server with separate repositories for documents, settings, contacts, and projects. Been like that for five years now, and it's remarkably stable and nice. For anything data intensive, there's Flickr, del.icio.us, Gmail, WordPress, etc., with a private backup just in case.</htmltext>
<tokentext>I 've got a home Subversion server with separate repositories for documents , settings , contacts , and projects .
Been like that for five years now , and it 's remarkably stable and nice .
For anything data intensive , there 's Flickr , del.icio.us , Gmail , WordPress , etc. , with a private backup just in case .</tokentext>
<sentencetext>I've got a home Subversion server with separate repositories for documents, settings, contacts, and projects.
Been like that for five years now, and it's remarkably stable and nice.
For anything data intensive, there's Flickr, del.icio.us, Gmail, WordPress, etc., with a private backup just in case.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444475</id>
	<title>Shared folders</title>
	<author>Anonymous</author>
	<datestamp>1245748680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I used to have two computers networked to each other and I would name or label certain folders as shared folders and copy the folders' contents over. With two Macs, you can just use Appleshare over TCP/IP, with a Mac and a Windows machine, you can use SMB, or use a flash drive or some such. Drag only new files if you like, or Select All and make sure you're just keeping the newer versions.</p><p>It's not an elegant one-click solution, but it is a simple Command-A and drag solution.</p></htmltext>
<tokentext>I used to have two computers networked to each other and I would name or label certain folders as shared folders and copy the folders ' contents over .
With two Macs , you can just use Appleshare over TCP/IP , with a Mac and a Windows machine , you can use SMB , or use a flash drive or some such .
Drag only new files if you like , or Select All and make sure you 're just keeping the newer versions . It 's not an elegant one-click solution , but it is a simple Command-A and drag solution .</tokentext>
<sentencetext>I used to have two computers networked to each other and I would name or label certain folders as shared folders and copy the folders' contents over.
With two Macs, you can just use Appleshare over TCP/IP, with a Mac and a Windows machine, you can use SMB, or use a flash drive or some such.
Drag only new files if you like, or Select All and make sure you're just keeping the newer versions. It's not an elegant one-click solution, but it is a simple Command-A and drag solution.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446279</id>
	<title>simple</title>
	<author>Anonymous</author>
	<datestamp>1245755580000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>My way is very simple. I'll connect to the other machine using Filezilla over sftp, then drag my folders over to my local PC. When prompted, I allow newer files to overwrite older ones. It's fairly fast and very simple. Just make sure both computers have an ssh server.</p></htmltext>
<tokentext>My way is very simple .
I 'll connect to the other machine using Filezilla over sftp .
Then drag my folders over to my local PC .
When prompted , I allow newer files to overwrite older ones .
It 's fairly fast and very simple .
Just make sure both computers have an ssh server .</tokentext>
<sentencetext>My way is very simple.
I'll connect to the other machine using Filezilla over sftp.
Then drag my folders over to my local PC.
When prompted, I allow newer files to overwrite older ones.
It's fairly fast and very simple.
Just make sure both computers have an ssh server.</sentencetext>
</comment>
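The "newer files overwrite older ones" behavior described above is a FileZilla prompt option; the same one-way "newer wins" merge can be sketched in plain POSIX sh (the function name and details are illustrative, not FileZilla's implementation):

```shell
#!/bin/sh
# Copy files from src into dst when the dst copy is missing or older --
# a one-way "newer wins" merge, like answering the overwrite prompt
# with "only if newer".
sync_newer() {
    src=$1
    dst=$2
    ( cd "$src" && find . -type f ) | while IFS= read -r f; do
        if [ ! -e "$dst/$f" ] || [ "$src/$f" -nt "$dst/$f" ]; then
            mkdir -p "$dst/$(dirname "$f")"
            cp -p "$src/$f" "$dst/$f"
        fi
    done
}
```

The `-nt` file test is not strictly POSIX but is supported by every common shell; for anything beyond a sketch, `rsync --update` does the same job with far better handling of edge cases.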
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444817</id>
	<title>Re:Svn</title>
	<author>morgauxo</author>
	<datestamp>1245749820000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext>The biggest shortcoming of CVS that I know of is the inability to rename a file.  Yes, you can copy it and then delete the original, but CVS sees this as a new file with no revision history.  If I understand correctly, Subversion was created by former CVS users to overcome a few shortcomings of CVS, this being the biggest one. Thus SVN has a similar "feel" to CVS, though not identical commands, and a superior feature set.</htmltext>
<tokentext>The biggest shortcoming of CVS that I know of is the inability to rename a file .
Yes , you can copy it and then delete the original , but CVS sees this as a new file with no revision history .
If I understand correctly , Subversion was created by former CVS users to overcome a few shortcomings of CVS , this being the biggest one .
Thus SVN has a similar " feel " to CVS , though not identical commands , and a superior feature set .</tokentext>
<sentencetext>The biggest shortcoming of CVS that I know is the lack of ability to rename a file.
Yes, you can copy it then delete the original but CVS sees this as a new file with no revision history.
If I understand correctly subversion was created by former CVS users to overcome a few shortcomings of CVS with this being the biggest one.
Thus SVN has a similar "feel" though not identical commands to CVS and a superior feature set.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444039</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443429</id>
	<title>test</title>
	<author>Anonymous</author>
	<datestamp>1245788400000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext><p>test</p></htmltext>
<tokentext>test</tokentext>
<sentencetext>test</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445873</id>
	<title>Duplication</title>
	<author>sxmjmae</author>
	<datestamp>1245753780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I just try to avoid duplication of data.</p><p>
I have backups here and backups there.  One full 500 GB drive full of prized photos.</p><p>
A few restores and un-deletes later, and presto, I can find 3+ copies of the same file.</p><p>
I need a NAS that can do backups and help me avoid data duplication.  Now if only the 5+ GB NAS would come down in price.</p></htmltext>
<tokentext>I just try to avoid duplication of data .
I have backups here and backups there .
One full 500 GB drive full of prized photos .
A few restores and un-deletes later , and presto , I can find 3 + copies of the same file .
I need a NAS that can do backups and help me avoid data duplication .
Now if only the 5 + GB NAS would come down in price .</tokentext>
<sentencetext>I just try to avoid duplication of data.
I have backups here and backups there.
One full 500 GB drive full of prized photos.
A few restores and un-deletes later, and presto, I can find 3+ copies of the same file.
I need a NAS that can do backups and help me avoid data duplication.
Now if only the 5+ GB NAS would come down in price.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729</id>
	<title>Beyond Compare</title>
	<author>Anonymous</author>
	<datestamp>1245789420000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext><p>On the windows side there is a great utility called Beyond Compare, around $30, that I have used to do this.  I even had a small client once that could not afford real backup software, so we faked the backup using portable USB hard drives and the Beyond Compare utility to sync her server and desktop to the drives.  It worked great and the whole thing was done for under $200.</p></htmltext>
<tokentext>On the windows side there is a great utility called Beyond Compare , around $ 30 , that I have used to do this .
I even had a small client once that could not afford real backup software , so we faked the backup using portable USB hard drives and the Beyond Compare utility to sync her server and desktop to the drives .
It worked great and the whole thing was done for under $ 200 .</tokentext>
<sentencetext>On the windows side there is a great utility called Beyond Compare, around $30, that I have used to do this.
I even had a small client once that could not afford real backup software, so we faked the backup using portable USB hard drives and the Beyond Compare utility to sync her server and desktop to the drives.
It worked great and the whole thing was done for under $200.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443653</id>
	<title>What about a FUSE FS powered by a MySQL DB?</title>
	<author>Anonymous</author>
	<datestamp>1245789120000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext>FAST 2009 has a paper on semantic data management using a file system built on top of an object store powered by MySQL.  Performance isn't great, but it uses a distributed file system to solve the synchronization issue in a very nice way (e.g., synchronize all albums with my iPod, all photos with my laptop and computer, etc.).  You can specify rules, and I liked it when I heard about it.  However, performance is actually important, despite their claim<nobr> <wbr></nobr>:).

Perspective: Semantic Data Management for the Home
Brandon Salmon, Carnegie Mellon University; Steven W. Schlosser, Intel Research Pittsburgh; Lorrie Faith Cranor and Gregory R. Ganger, Carnegie Mellon University

HTML Paper
<a href="http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon_html/index.html" title="usenix.org" rel="nofollow">http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon_html/index.html</a> [usenix.org]
PDF Paper
<a href="http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon.pdf" title="usenix.org" rel="nofollow">http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon.pdf</a> [usenix.org]
Slides
<a href="http://www.usenix.org/events/fast09/tech/slides/salmon.pdf" title="usenix.org" rel="nofollow">http://www.usenix.org/events/fast09/tech/slides/salmon.pdf</a> [usenix.org]</htmltext>
<tokenext>FAST 2009 has a paper on semantic data management using a file system built on top of an object store powered by MySQL .
Performance is n't great , but it uses a distributed file system solution to solve the synchronization issue in a very nice way ( e.g. , synchronize all albums with my iPod , all photos with my laptop and computer , etc... ) .
You can specify rules and I liked it when I heard about it .
However performance is actually important , despite their claim : ) .
Perspective : Semantic Data Management for the Home Brandon Salmon , Carnegie Mellon University ; Steven W. Schlosser , Intel Research Pittsburgh ; Lorrie Faith Cranor and Gregory R. Ganger , Carnegie Mellon University HTML Paper http : //www.usenix.org/events/fast09/tech/full \ _papers/salmon/salmon \ _html/index.html [ usenix.org ] PDF Paper http : //www.usenix.org/events/fast09/tech/full \ _papers/salmon/salmon.pdf [ usenix.org ] Slides http : //www.usenix.org/events/fast09/tech/slides/salmon.pdf [ usenix.org ]</tokentext>
<sentencetext>FAST 2009 has a paper on semantic data management using a file system built on top of an object store powered by MySQL.
Performance isn't great, but it uses a distributed file system solution to solve the synchronization issue in a very nice way (e.g., synchronize all albums with my iPod, all photos with my laptop and computer, etc...).
You can specify rules and I liked it when I heard about it.
However, performance does actually matter, despite their claim :).
Perspective: Semantic Data Management for the Home
Brandon Salmon, Carnegie Mellon University; Steven W. Schlosser, Intel Research Pittsburgh; Lorrie Faith Cranor and Gregory R. Ganger, Carnegie Mellon University

HTML Paper
http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon_html/index.html [usenix.org]
PDF Paper
http://www.usenix.org/events/fast09/tech/full_papers/salmon/salmon.pdf [usenix.org]
Slides
http://www.usenix.org/events/fast09/tech/slides/salmon.pdf [usenix.org]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444237</id>
	<title>Just started using DirSyncPro...</title>
	<author>Anonymous</author>
	<datestamp>1245747780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>DirSyncPro is FOSS; I started using it a few months ago, and I love it.  My company now uses it to keep our two programmers synced to the server.  It works great.  I hope they package it better for MS Windows soon; there is no installer yet, just an executable.</htmltext>
<tokenext>DirSyncPro is FOSS , I just started using it a few months ago , and I love it .
My company now uses it to keep our two programmers synced ( sp ?
) to the server .
Works great , I hope they package it better for MS soon , there is no installer yet , just an executable .</tokentext>
<sentencetext>DirSyncPro is FOSS; I started using it a few months ago, and I love it.
My company now uses it to keep our two programmers synced to the server.
It works great.
I hope they package it better for MS Windows soon; there is no installer yet, just an executable.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443781</id>
	<title>Re:Dropbox</title>
	<author>Stinky Fartface</author>
	<datestamp>1245789540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I'll second this. I like my Dropbox. I referred enough people to get up to 3GB free, and that is surprisingly useful. I fantasize about my entire desktop running from my Dropbox but I can't afford that level of service.</htmltext>
<tokenext>I 'll second this .
I like my Dropbox .
I referred enough people to get up to 3GB free , and that is surprisingly useful .
I fantasize about my entire desktop running from my Dropbox but I ca n't afford that level of service .</tokentext>
<sentencetext>I'll second this.
I like my Dropbox.
I referred enough people to get up to 3GB free, and that is surprisingly useful.
I fantasize about my entire desktop running from my Dropbox but I can't afford that level of service.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445987</id>
	<title>shfs mounts by cron, rsnapshot</title>
	<author>cenc</author>
	<datestamp>1245754260000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>I use shfs mounts refreshed by cron (to make sure they stay mounted even if the connection is interrupted) and ssh tunnels for everything else, with preshared keys to a central server / proxy, and rsnapshot for backups on the central server with hot-swap drives.</p><p>This works for desktops, the remote office, and notebooks. I essentially don't trust my employees or myself to remember to encrypt everything or use "secure" protocols all the time, so I remove the need to remember from the whole process. I can then focus on securing one system. It's great if everything else is secure too, but just in case. It is very good for notebooks jumping from one open wireless network to another, and also for keeping track of employees' activity in one location. I can fairly easily log everything they do or don't do (yes, the two-hour coffee break sticks out like a sore thumb in the logs).</p><p>Among other things, this also has the nice side effect that should, say, a notebook or desktop be stolen, it will phone home as soon as it is connected to the internet and send detailed information about what it is doing.</p></htmltext>
<tokenext>I use shfs mounts by ( to make sure it stays mounted even if connection is interrupted ) and ssh tunnels for everything else , with preshared keys to a central server / proxy , and rsnapshots for backup on the central server with hot swap drives.This works on desktops , remote office , and for notebooks .
I essentially do n't trust my employees or myself to remember to encrypt everything or use " secure " protocals all the time , and so I remove the need to remember from the whole process .
I can then focus on securing one system .
Great if everything else is secure , but just in case .
Very good for notebooks jumping from open wireless to open wireless systems , and also keeping track of employees activity in one location .
I can log fairly easily everything they do or do n't do ( yea , the 2 hour coffee break sticks out like sore thumb in the logs ) .Among other things this also has the nice side effect that should say a notebook or desktop be stolen , it will phone home as soon as it is connected to the internet and send detailed information about what it is doing .</tokentext>
<sentencetext>I use shfs mounts refreshed by cron (to make sure they stay mounted even if the connection is interrupted) and ssh tunnels for everything else, with preshared keys to a central server / proxy, and rsnapshot for backups on the central server with hot-swap drives.
This works for desktops, the remote office, and notebooks.
I essentially don't trust my employees or myself to remember to encrypt everything or use "secure" protocols all the time, so I remove the need to remember from the whole process.
I can then focus on securing one system.
It's great if everything else is secure too, but just in case.
It is very good for notebooks jumping from one open wireless network to another, and also for keeping track of employees' activity in one location.
I can fairly easily log everything they do or don't do (yes, the two-hour coffee break sticks out like a sore thumb in the logs).
Among other things, this also has the nice side effect that should, say, a notebook or desktop be stolen, it will phone home as soon as it is connected to the internet and send detailed information about what it is doing.</sentencetext>
</comment>
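The "kept mounted by cron" pattern in the comment above can be sketched as a crontab entry plus a tiny remount script. This is a hypothetical configuration sketch, not the commenter's actual setup: shfs is long unmaintained, so it uses sshfs instead, and the host, user, and paths are made up for illustration.

```shell
# Hypothetical crontab entry: every 5 minutes, remount the share if it
# has dropped (run `crontab -e` to add it):
#   */5 * * * * /usr/local/bin/keep-mounted.sh

#!/bin/sh
# keep-mounted.sh -- remount via sshfs only when the mountpoint is not
# already active; `-o reconnect` also rides out brief interruptions.
mountpoint -q /mnt/central || \
    sshfs -o reconnect,IdentityFile="$HOME/.ssh/id_rsa" \
        user@central.example.com:/srv/share /mnt/central
```

Because `mountpoint -q` makes the script a no-op when the share is healthy, the cron job is safe to run as often as you like.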
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443673</id>
	<title>Dropbox</title>
	<author>gphilip</author>
	<datestamp>1245789180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><a href="http://www.getdropbox.com/" title="getdropbox.com" rel="nofollow">http://www.getdropbox.com/</a> [getdropbox.com]

  Works perfectly for me. I use the free option, which has a 2 GB limit, which is more than enough for me to keep all my important stuff in sync. It has a client that integrates nicely with the Nautilus file browser on Ubuntu, which is what I use at home and office.


  Whatever you put (or symlink) into a designated folder (which you can choose) gets mirrored to their server, from where it gets synced to every other system where you have installed the client, the next time you connect that system to the internet. They also give web-based access to the stored files.


  There is the issue of privacy for the really paranoid, but I am not very concerned about that with the files I currently choose to mirror. I am more worried about the chance that their client develops a bug that wipes out my files, but I guess I'll take that risk.</htmltext>
<tokenext>http : //www.getdropbox.com/ [ getdropbox.com ] Works perfectly for me .
I use the free option , which has a 2 GB limit , which is more than enough for me to keep all my important stuff in sync .
It has a client that integrates nicely with the Nautilus file browser on Ubuntu , which is what I use at home and office .
Whatever you put ( or symlink ) into a designated folder ( which you can choose ) gets mirrored to their server , from where it gets synced to every other system where you have installed the client , the next time you connect that system to the internet .
They also give web-based access to the stored files .
There is the issue of privacy for the really paranoid , but I am not very concerned about that with the files I currently choose to mirror .
I am more worried about the chance that their client develops a bug that wipes out my files , but I guess I 'll take that risk .</tokentext>
<sentencetext>http://www.getdropbox.com/ [getdropbox.com]

  Works perfectly for me.
I use the free option, which has a 2 GB limit, which is more than enough for me to keep all my important stuff in sync.
It has a client that integrates nicely with the Nautilus file browser on Ubuntu, which is what I use at home and office.
Whatever you put (or symlink) into a designated folder (which you can choose) gets mirrored to their server, from where it gets synced to every other system where you have installed the client, the next time you connect that system to the internet.
They also give web-based access to the stored files.
There is the issue of privacy for the really paranoid, but I am not very concerned about that with the files I currently choose to mirror.
I am more worried about the chance that their client develops a bug that wipes out my files, but I guess I'll take that risk.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445373</id>
	<title>The right tools for the job</title>
	<author>Enahs</author>
	<datestamp>1245751740000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>3</modscore>
	<htmltext><p>I don't share EVERYTHING, but I share some things:</p><ul><li>If I just need to go one way, I use rsync.</li><li>If I need 2-way sync but no versioning info, I use unison.</li><li>If I need n-way sync but no versioning info, I use unison with a central "untouchable" folder.</li><li>If I need versioning info, I use git.</li></ul></htmltext>
<tokenext>I do n't share EVERYTHING , but I share some things : If I just need to go one way , I use rsync.If I need 2-way sync but no versioning info , I use unison.If I need n-way sync but no versioning info , I use unison with a central " untouchable " folder.If I need versioning info , I use git .</tokentext>
<sentencetext>I don't share EVERYTHING, but I share some things:If I just need to go one way, I use rsync.If I need 2-way sync but no versioning info, I use unison.If I need n-way sync but no versioning info, I use unison with a central "untouchable" folder.If I need versioning info, I use git.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450665</id>
	<title>Re:Dropbox</title>
	<author>Richard\_at\_work</author>
	<datestamp>1245843240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It's currently still in extended beta, and there are paid accounts available.</htmltext>
<tokenext>It 's currently still in extended beta , and there are paid accounts available .</tokentext>
<sentencetext>It's currently still in extended beta, and there are paid accounts available.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446521</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450149</id>
	<title>NAS</title>
	<author>jandersen</author>
	<datestamp>1245834360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The best way I can think of is to buy a NAS and put your homedir on it. They aren't all that expensive any more, and it lets the whole family share things.</p><p>The other way, which I use to let me work on my projects both from home and the office, is to use a revision control system like svn, cvs, or perforce. Check things in every time you finish working.</p></htmltext>
<tokenext>The best way I can think of is , buy a NAS and put your homedir on it .
They are n't all that expensive any more , and it lets the whole family share things.The other way , which I use to allow me to work on my projects both from home and the office , is to use a revision control system , like svn , cvs or perforce .
Check things in every time you finish working .</tokentext>
<sentencetext>The best way I can think of is to buy a NAS and put your homedir on it.
They aren't all that expensive any more, and it lets the whole family share things.
The other way, which I use to let me work on my projects both from home and the office, is to use a revision control system like svn, cvs, or perforce.
Check things in every time you finish working.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447687</id>
	<title>Unison</title>
	<author>speedtux</author>
	<datestamp>1245763860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think Unison is probably still the best tool for bidirectional sync of directory trees.  Unfortunately, it's written in OCAML.  In principle, OCAML is a nice language, but Unison is written in a pretty awful style, and, more importantly, the whole thing is kind of hard to port.</p></htmltext>
<tokenext>I think Unison is probably still the best tool for bidirectional sync of directory trees .
Unfortunately , it 's written in OCAML .
In principle , OCAML is a nice language , but Unison is written in a pretty awful style , and , more importantly , the whole thing is kind of hard to port .</tokentext>
<sentencetext>I think Unison is probably still the best tool for bidirectional sync of directory trees.
Unfortunately, it's written in OCAML.
In principle, OCAML is a nice language, but Unison is written in a pretty awful style, and, more importantly, the whole thing is kind of hard to port.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446163</id>
	<title>rsync for dummies</title>
	<author>patiodragon</author>
	<datestamp>1245755040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><a href="http://kimbriggs.com/computers/computer-notes/linux-notes/samba-setup-ubuntu704-guide.file" title="kimbriggs.com" rel="nofollow">http://kimbriggs.com/computers/computer-notes/linux-notes/samba-setup-ubuntu704-guide.file</a> [kimbriggs.com]</p></htmltext>
<tokenext>http : //kimbriggs.com/computers/computer-notes/linux-notes/samba-setup-ubuntu704-guide.file [ kimbriggs.com ]</tokentext>
<sentencetext>http://kimbriggs.com/computers/computer-notes/linux-notes/samba-setup-ubuntu704-guide.file [kimbriggs.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451203</id>
	<title>I use xrandr with a suitable Xconfig.</title>
	<author>drew\_eckhardt</author>
	<datestamp>1245850800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I take my 4.5 pound laptop into work, xrandr -s 1 and Poof!  I have all my files on a large 1920x1080 monitor.  I type xrandr -s 0, all my windows move back to the laptop where the files remain.  I also have an option for dual displays on the laptop and an external monitor for presentations on a projector.  Sneaker net gets my files where they need to go by the time I need them no matter how many gigabytes they consume and how bad local network connectivity is.</p><p>Keeping everything important on one backed up (clonezilla, need to rerun) laptop avoids software issues, has single copy update semantics, and exceptional performance (25X faster than NFS at my office, with the tree building in 2 minutes not 50).</p><p>Modern laptops are large and fast enough for nearly everything.</p></htmltext>
<tokenext>I take my 4.5 pound laptop into work , xrandr -s 1 and Poof !
I have all my files on a large 1920x1080 monitor .
I type xrandr -s 0 , all my windows move back to the laptop where the files remain .
I also have an option for dual displays on the laptop and an external monitor for presentations on a projector .
Sneaker net gets my files where they need to go by the time I need them no matter how many gigabytes they consume and how bad local network connectivity is.Keeping everything important on one backed up ( clonezilla , need to rerun ) laptop avoids software issues , has single copy update semantics , and exceptional performance ( 25X faster than NFS at my office , with the tree building in 2 minutes not 50 ) .Modern laptops are large and fast enough for nearly everything .</tokentext>
<sentencetext>I take my 4.5 pound laptop into work, xrandr -s 1 and Poof!
I have all my files on a large 1920x1080 monitor.
I type xrandr -s 0, all my windows move back to the laptop where the files remain.
I also have an option for dual displays on the laptop and an external monitor for presentations on a projector.
Sneaker net gets my files where they need to go by the time I need them no matter how many gigabytes they consume and how bad local network connectivity is.Keeping everything important on one backed up (clonezilla, need to rerun) laptop avoids software issues, has single copy update semantics, and exceptional performance (25X faster than NFS at my office, with the tree building in 2 minutes not 50).Modern laptops are large and fast enough for nearly everything.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444447</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245748560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>It looks like it might be interesting, but I'll be honest here: the site is a bit on the awful side.  For starters, "<a href="http://jakeapp.com/about" title="jakeapp.com" rel="nofollow">About Jake</a> [jakeapp.com]" starts with "Jake, what does it do?"  "Works on any computer" and "easy to use" are not "what it does"; if I didn't know what a file synchronizer is, I'd not be any more enlightened by this.</p><p>Given that "users don't have to be online all the time", it sounds like it does distributed storage, meaning that a computer could crash and none of the synchronized data would be lost; this is something that should be explicitly stated.  If so, it also means that you need to explain what happens when A and B go on vacation with their laptops and both of them decide to work on importantfile.doc while they're offline.  How do you handle that conflict when they come back and try to synchronize?</p><p>Part of it might be non-native English (nobody would use the word "surely" in marketing their product as you have).  Try "SharePoint may claim to be a good tool, but..."  The same goes for "where data lie"; just try "No central data storage server required".</p><p>Also, now that Rails has crashed and burned at your site, I suggest making static pages like "about" just plain HTML, served up through a normal webserver without requiring Rails ;)</p></htmltext>
<tokenext>It looks like it might be interesting , but I 'll be honest here , the site is a bit on the awful side .
For starters , " About Jake [ jakeapp.com ] " starts " Jake , what does it do ?
" " Works on any computer " and " easy to use " are not " what it does " , if I did n't know what a file synchronizer is , I 'd not be any more enlightened by this.Given that " users do n't have to be online all the time " , it sounds like it does distributed storage , meaning that a computer could crash and none of the synchronized data would be lost , this is something that should be explicitly stated .
If so , it also means that you need to explain what happens when A and B go on a vacation with their laptop and both of them decide to work on importantfile.doc while they 're offline .
How do you handle that conflict when they come back and try to synchronize ? Part of it might be non-native English ( nobody would use the word " surely " in marketing their product as you have .
Try " SharePoint may claim to be a good tool , but... " The same goes for " where data lie " , just try " No central data storage server required " .Also , now that rails has crashed and burned at your site , I suggest making static pages like " about " and such just plain html and served up through a normal webserver without requiring rails ; )</tokentext>
<sentencetext>It looks like it might be interesting, but I'll be honest here: the site is a bit on the awful side.
For starters, "About Jake [jakeapp.com]" starts with "Jake, what does it do?"
"Works on any computer" and "easy to use" are not "what it does"; if I didn't know what a file synchronizer is, I'd not be any more enlightened by this.
Given that "users don't have to be online all the time", it sounds like it does distributed storage, meaning that a computer could crash and none of the synchronized data would be lost; this is something that should be explicitly stated.
If so, it also means that you need to explain what happens when A and B go on vacation with their laptops and both of them decide to work on importantfile.doc while they're offline.
How do you handle that conflict when they come back and try to synchronize?
Part of it might be non-native English (nobody would use the word "surely" in marketing their product as you have).
Try "SharePoint may claim to be a good tool, but..."  The same goes for "where data lie"; just try "No central data storage server required".
Also, now that Rails has crashed and burned at your site, I suggest making static pages like "about" just plain HTML, served up through a normal webserver without requiring Rails ;)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447311</id>
	<title>Re:rsync</title>
	<author>Anonymous</author>
	<datestamp>1245761100000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I take it you can't rearrange the data to use sub-directories, but can't you just set up a few folders with hardlinks to subsets of the files?</p></htmltext>
<tokenext>I take it you ca n't rearrange the data to use sub-directories but ca n't you just setup a few folder with hardlinks to subsets of the files ?</tokentext>
<sentencetext>I take it you can't rearrange the data to use sub-directories, but can't you just set up a few folders with hardlinks to subsets of the files?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444277</parent>
</comment>
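The hardlink suggestion above is cheap to try: a hardlink is a second directory entry for the same inode, so subset folders cost no extra space and never go stale. A minimal sketch with made-up paths (note `stat -c` is the GNU/Linux form; BSD/macOS uses `stat -f`):

```shell
#!/bin/sh
# Keep one flat data directory and expose a subset via hardlinks.
set -e
rm -rf /tmp/hl_demo
mkdir -p /tmp/hl_demo/data /tmp/hl_demo/subset
echo "payload" > /tmp/hl_demo/data/big.bin

# Hardlink (no -s): both names point at the same inode, so edits
# through either name are visible through the other.
ln /tmp/hl_demo/data/big.bin /tmp/hl_demo/subset/big.bin

# Print the link count; each name now reports 2.
stat -c %h /tmp/hl_demo/subset/big.bin
```

The usual caveats apply: hardlinks cannot cross filesystems, and rsync only preserves them when given `-H`.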
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446011</id>
	<title>Dropbox is the way to go</title>
	<author>amaiman</author>
	<datestamp>1245754380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I would highly recommend Dropbox.  I've been using them for close to a year now and have never run into any major problems.  It "just works", which is important for something that you don't want to think about.  Having the files in the cloud means you don't need to keep both PCs on at the same time (older apps such as FolderShare required this).  The delta sync and the "instant upload" when the file already exists on their server (even if it's not in your own Dropbox: for example, if you put a Linux ISO in your Dropbox that someone else has already uploaded, it is identified by hash and you don't need to upload it -- it sounds like a privacy issue, but it's not [no hash collisions]; read about it on their site) really speed things up.  The backup and versioning features are nice, too.
<br> <br>
Dropbox is also cross-platform, so you can use it to send files to/from your server if you want to (it takes a bit of fiddling to make it work on a CLI-only system, but their wiki has instructions on that).
<br> <br>
I use Mozilla Weave (hosting my own server, since the public one is long since full) for syncing Firefox settings; it works fairly well but is sometimes a pain to configure and troubleshoot.</htmltext>
<tokenext>I would highly recommend Dropbox .
I 've been using them for close to a year now , and have never run into any major problems .
It " just works " , which is important for something that you do n't want to think about .
Having the files in the cloud means you do n't need to keep both PCs on at the same time ( older apps such as FolderShare required this ) .
The delta sync and the " instant upload " if the file already exists on their server ( even if it 's not in your own dropbox...for example , if you put a Linux ISO in your Dropbox that someone else has already uploaded to them , it identifies it based on hash and you then do n't need to upload it -- it sounds like a privacy issue , but it 's not [ no hash collisions ] , read about it on their site ) really speed things up .
The backup and versioning features are nice , too .
Dropbox is also cross-platform , so you can use it to send files to/from your server if you want to ( takes a bit of fiddling to make it work under CLI-only , but their Wiki has instructions on that ) I use Mozilla Weave ( hosting my own server since the public one is long since full ) for syncing Firefox settings , it works fairly well , but is a pain to configure and troubleshoot sometimes .</tokentext>
<sentencetext>I would highly recommend Dropbox.
I've been using them for close to a year now and have never run into any major problems.
It "just works", which is important for something that you don't want to think about.
Having the files in the cloud means you don't need to keep both PCs on at the same time (older apps such as FolderShare required this).
The delta sync and the "instant upload" when the file already exists on their server (even if it's not in your own Dropbox: for example, if you put a Linux ISO in your Dropbox that someone else has already uploaded, it is identified by hash and you don't need to upload it -- it sounds like a privacy issue, but it's not [no hash collisions]; read about it on their site) really speed things up.
The backup and versioning features are nice, too.
Dropbox is also cross-platform, so you can use it to send files to/from your server if you want to (it takes a bit of fiddling to make it work on a CLI-only system, but their wiki has instructions on that).
I use Mozilla Weave (hosting my own server, since the public one is long since full) for syncing Firefox settings; it works fairly well but is sometimes a pain to configure and troubleshoot.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444071</id>
	<title>Re:"Distributed homedirs" or "CVS'd configs"?</title>
	<author>short</author>
	<datestamp>1245790500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Using $HOME in CVS since 2001; it works perfectly, and it is public:<br>
<tt>cvs -d :pserver:pserver:@dyn.jankratochvil.net/cvs co nethome</tt>
<br>
I check it out across the world on various machines.  If I find I'm missing something on some host, I do `<tt>cvs update</tt>' by hand.  Not rocket science.
<br>
I have only one host I consider secure enough, so there is no point in distributed mail.
<br>
I was keeping my Gecko bookmarks in $HOME too, but I had to create a <a href="http://www.jankratochvil.net/project/xbelnormalize/" title="jankratochvil.net" rel="nofollow">small script</a> [jankratochvil.net] to "normalize" them.  Otherwise they contain a lot of useless info (timestamps, whether expanded, etc.), making both history diffs useless and conflict merges difficult/impossible.</htmltext>
<tokenext>Using $ HOME in CVS since 2001 , it works perfectly , it is public : cvs -d : pserver : pserver : @ dyn.jankratochvil.net/cvs co nethome Checking it out across the world on various machines .
If I find I miss something on some host , I do ` cvs update ' by hand .
Not a rocket science .
I have only one host I consider secure enough so there is no point in distributed mails .
I was using my Gecko bookmarks in $ HOME but I had to create a small script [ jankratochvil.net ] to " normalize " them .
Otherwise they contain a lot of useless info ( timestamps , whether expanded etc .
) making both history diffs useless and conflict merges difficult/impossible .</tokentext>
<sentencetext>Using $HOME in CVS since 2001; it works perfectly, and it is public:
cvs -d :pserver:pserver:@dyn.jankratochvil.net/cvs co nethome

I check it out across the world on various machines.
If I find I'm missing something on some host, I do `cvs update' by hand.
Not rocket science.
I have only one host I consider secure enough, so there is no point in distributed mail.
I was keeping my Gecko bookmarks in $HOME too, but I had to create a small script [jankratochvil.net] to "normalize" them.
Otherwise they contain a lot of useless info (timestamps, whether expanded, etc.), making both history diffs useless and conflict merges difficult/impossible.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444845</id>
	<title>My setup</title>
	<author>FunkyELF</author>
	<datestamp>1245749880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have a single fileserver (soon to be replaced with a <a href="http://www.plugcomputer.org/" title="plugcomputer.org">plug computer</a> [plugcomputer.org] whenever mine ships).<br>
Everything pulls from there.<br>
Every once in a while I plug in my 1TB external USB drive and sync from the main server, then unplug it and put it back in the safe.<br>
<br>
When I retire my fileserver and move to the plug computer, I will use my 1TB external for the server and buy another one for backups.  It will be formatted with a different filesystem.</htmltext>
<tokenext>I have a single fileserver ( soon to be replaced with a plug computer [ plugcomputer.org ] whenever mine ships ) .
Everything pulls from there .
Every once in a while I plug in my 1TB external USB driver and sync from the main server , then unplug and put back in the safe .
When I retire my fileserver and move to the plug computer , I will use my 1TB external for the server and buy another one for backups .
It will be formatted with a different filesystem .</tokentext>
<sentencetext>I have a single fileserver (soon to be replaced with a plug computer [plugcomputer.org] whenever mine ships).
Everything pulls from there.
Every once in a while I plug in my 1TB external USB driver and sync from the main server, then unplug and put back in the safe.
When I retire my fileserver and move to the plug computer, I will use my 1TB external for the server and buy another one for backups.
It will be formatted with a different filesystem.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446037</id>
	<title>Unison</title>
	<author>astaines</author>
	<datestamp>1245754500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I manage all my home directories with Unison, including 80 GB of pictures across Linux and Windows. The only issue I've noticed is that for single files over 1 GB it gets slow, so I split up my Thunderbird folders to stay under this. Everything else just works.</p></htmltext>
<tokenext>I manage all my home directories with Unison , including 80Gig of pictures across Linux and Windows .
The only issue I 've noticed is that for single files over 1G it gets slow , so I split up my thunderbird folders to get under this .
Everything else just works .</tokentext>
<sentencetext>I manage all my home directories with Unison, including 80Gig of pictures across Linux and Windows.
The only issue I've noticed is that for single files over 1G it gets slow, so I split up my thunderbird folders to get under this.
Everything else just works.</sentencetext>
</comment>
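A Unison setup like the one described above is usually driven by a profile file in <tt>~/.unison</tt>; a minimal sketch, with the hostname and paths invented for illustration, might look like:

```
# ~/.unison/home.prf -- sync $HOME between this machine and a server
# (hostname and paths are examples, not the commenter's actual setup)
root = /home/alice
root = ssh://backup.example.com//home/alice

# skip caches and other churn
ignore = Path .cache
ignore = Name *.tmp

# take non-conflicting defaults automatically, but still prompt
# interactively when both sides changed the same file
auto = true
batch = false
```

Running `unison home` then reconciles both roots in either direction, which is what distinguishes it from one-way tools like rsync.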
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28456269</id>
	<title>Don't automate commits</title>
	<author>jgrahn</author>
	<datestamp>1245871920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>I've recently
considered adding some sophistication by implementing a version
control system like subversion, git, or bazaar, but have found some
shortcomings in automating commits and pushing updates to all systems.</p></div></blockquote><p>
Like Homer Simpson once said: "If something seems hard, it's probably not worth doing".
Forget about the automagical sync idea, and version control suddenly becomes attractive.
</p><p>
I keep the important parts of my $HOME in CVS, with the repository available over ssh.
I commit my changes when I have something ready, and I update when I suspect there is something to update.
<em>You *cannot* automate this -- a human needs to be around to resolve conflicts, in the rare cases
where there are any.</em>
</p><p>
One interesting aspect of this is that it simplifies backups.
A directory under version control doesn't need backups -- you backup the repository instead.
You don't have to spend time excluding certain files/file name patterns from the backup to save space;
you have implicitly excluded them by not committing them.</p>
	</htmltext>
<tokenext>I 've recently considered adding some sophistication by implementing a version control system like subversion , git , or bazaar , but have found some shortcomings in automating commits and pushing updates to all systems .
Like Homer Simpson once said : " If something seems hard , it 's probably not worth doing " .
Forget about the automagical sync idea , and version control suddenly becomes attractive .
I keep the important parts of my $ HOME in CVS , with the repository available over ssh .
I commit my changes when I have something ready , and I update when I suspect there is something to update .
You * can not * automate this -- a human needs to be around to resolve conflicts , in the rare cases where there are any .
One interesting aspect of this is that it simplifies backups .
A directory under version control does n't need backups -- you backup the repository instead .
You do n't have to spend time excluding certain files/file name patterns from the backup to save space ; you have implicitly excluded them by not commiting them .</tokentext>
<sentencetext>I've recently
considered adding some sophistication by implementing a version
control system like subversion, git, or bazaar, but have found some
shortcomings in automating commits and pushing updates to all systems.
Like Homer Simpson once said: "If something seems hard, it's probably not worth doing".
Forget about the automagical sync idea, and version control suddenly becomes attractive.
I keep the important parts of my $HOME in CVS, with the repository available over ssh.
I commit my changes when I have something ready, and I update when I suspect there is something to update.
You *cannot* automate this -- a human needs to be around to resolve conflicts, in the rare cases
where there are any.
One interesting aspect of this is that it simplifies backups.
A directory under version control doesn't need backups -- you backup the repository instead.
You don't have to spend time excluding certain files/file name patterns from the backup to save space;
you have implicitly excluded them by not commiting them.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443835</id>
	<title>Re:Dropbox</title>
	<author>digitalderbs</author>
	<datestamp>1245789720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This does seem like a very viable option. For those not aware, it's a cloud server that you upload to and download from.<br>

It appears to be a very clean system, but I would be concerned about having open/unencrypted files on an uncontrolled server. Dropbox would be great if you could manage your own server, which doesn't appear to be the case.<br> <br>

thanks for the link.</htmltext>
<tokenext>This does seem like a very viable option .
For those not aware , it 's cloud server that you upload from /download to .
For It appears to be a very clean system , but I would be concerned about having open/unencrypted files on an uncontrolled server .
Dropbox would be great if you could manage your own server , which does n't appear to be the case .
thanks for the link .</tokentext>
<sentencetext>This does seem like a very viable option.
For those not aware, it's cloud server that you upload from /download to.
For 

It appears to be a very clean system, but I would be concerned about having open/unencrypted files on an uncontrolled server.
Dropbox would be great if you could manage your own server, which doesn't appear to be the case.
thanks for the link.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28453179</id>
	<title>Re:Subversion with a touch of bash</title>
	<author>Anonymous</author>
	<datestamp>1245861480000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>&gt; If there is interest, I may post my sync script.</p><p>Please do!</p></htmltext>
<tokenext>&gt; If there is interest , I may post my sync script.Please do !</tokentext>
<sentencetext>&gt; If there is interest, I may post my sync script.Please do!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444103</id>
	<title>Briefcase</title>
	<author>TurboNed</author>
	<datestamp>1245790620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Isn't that what everyone uses Microsoft's desktop Briefcase for?  I thought that's the ultimate in synchronization tools.</htmltext>
<tokenext>Is n't that what everyone uses Microsoft 's desktop Briefcase for ?
I thought that 's the ultimate in synchronization tools .</tokentext>
<sentencetext>Isn't that what everyone uses Microsoft's desktop Briefcase for?
I thought that's the ultimate in synchronization tools.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445209</id>
	<title>homegrown java</title>
	<author>os10000</author>
	<datestamp>1245751260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Hello,</p><p>I have built a Java program.  You can find it here: <a href="http://www.os10000.net/fs/java/app_dsync/index.html" title="os10000.net" rel="nofollow">http://www.os10000.net/fs/java/app_dsync/index.html</a> [os10000.net]</p><p>Features:<br>* it is GPL<br>* it is used for exporting and importing<br>* it creates a digital certificate for the machine it's run on<br>* it creates a 1-1 relationship with a machine that it's synching with<br>* it creates an export file on the source machine &amp; imports it on the target machine (you have to move it)<br>* the export file is a zip file<br>* you build a ruleset on the export machine (files, directories, regexes) for what you wish to export<br>* you build a ruleset on the import machine (same) for what you wish to import<br>* these two rulesets give you total control even when you're exchanging with someone else<br>* you have rules for "soft master", "hard master", "soft slave", "hard slave", "progress", etc.</p><p>If you can use "unison", use that.  If you wish to automate, use "app_dsync".</p><p>Have a good day,</p><p>Oliver</p></htmltext>
<tokenext>Hello,I have built a java program .
You can find it here : http : //www.os10000.net/fs/java/app \ _dsync/index.html [ os10000.net ] Features : * it is GPL * it is used for exporting and importing * it creates a digital certificate for the machine it 's run on * it creates a 1-1 relationship with a machine that it 's synching with * it creates an export file on the source machine &amp; imports it on the target machine ( you have to move it ) * the export file is a zip file * you build a ruleset on the export machine ( files , directories , regexes ) what you wish to export * you build a ruleset on the import machine ( same ) what you wish to import * these two rulesets give you total control even when you 're exchanging with someone else * you have rules for " soft master " , " hard master " , " soft slave " , " hard slave " , " progress " , etc.If you can use " unison " , use that .
If you wish to automate , use " app \ _dsync " .Have a good day,Oliver</tokentext>
<sentencetext>Hello,I have built a java program.
You can find it here: http://www.os10000.net/fs/java/app\_dsync/index.html [os10000.net]Features:* it is GPL* it is used for exporting and importing* it creates a digital certificate for the machine it's run on* it creates a 1-1 relationship with a machine that it's synching with* it creates an export file on the source machine &amp; imports it on the target machine (you have to move it)* the export file is a zip file* you build a ruleset on the export machine (files, directories, regexes) what you wish to export* you build a ruleset on the import machine (same) what you wish to import* these two rulesets give you total control even when you're exchanging with someone else* you have rules for "soft master", "hard master", "soft slave", "hard slave", "progress", etc.If you can use "unison", use that.
If you wish to automate, use "app\_dsync".Have a good day,Oliver</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28464109</id>
	<title>backuppc</title>
	<author>Anonymous</author>
	<datestamp>1245923400000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I use BackupPC; anything else is a hassle to explain to my wife.</p></htmltext>
<tokenext>I use backuppc .
anything else is a hassle to explain to my wife .</tokentext>
<sentencetext>I use backuppc.
anything else is a hassle to explain to my wife.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448631</id>
	<title>JungleDisk.com</title>
	<author>Anonymous</author>
	<datestamp>1245773580000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>JungleDisk.com</htmltext>
<tokenext>JungleDisk.com</tokentext>
<sentencetext>JungleDisk.com</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446321</id>
	<title>Re:Beyond Compare</title>
	<author>Anonymous</author>
	<datestamp>1245755760000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>There's a linux version as well.</p></htmltext>
<tokenext>There 's a linux version as well .</tokentext>
<sentencetext>There's a linux version as well.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444261</id>
	<title>Space differences</title>
	<author>Bert64</author>
	<datestamp>1245747900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I tried to sync my homedirs between machines, because it's annoying not having my settings and saved passwords etc. on every machine.<br>But the homedir on my desktop is 300GB in size, and neither of my laptops even has that much space, especially the netbook.<br>On the desktop I can keep everything I need, but the netbook needs to keep a small working set of whatever I'm working on at the time.</p></htmltext>
<tokenext>I tried to sync my homedirs between machines , because it 's annoying not having my settings and saved passwords etc on every machine..But , the homedir on my desktop is 300gb in size , neither of my laptops even have that much space on them , especially the netbook.On the desktop i can keep everything i need , but the netbook needs to keep as small working set of whatever i 'm working on at the time .</tokentext>
<sentencetext>I tried to sync my homedirs between machines, because it's annoying not having my settings and saved passwords etc on every machine..But, the homedir on my desktop is 300gb in size, neither of my laptops even have that much space on them, especially the netbook.On the desktop i can keep everything i need, but the netbook needs to keep as small working set of whatever i'm working on at the time.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659</id>
	<title>Re:Svn</title>
	<author>Anonymous</author>
	<datestamp>1245789180000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><p>I use git, with flashbake and cron to automate commits, and a simple cron job to automatically update a backup copy on an external hard drive.</p></htmltext>
<tokenext>I use git , with flashbake and cron to automate commits , and a simply cron job to automatically update a backup copy on an external hard drive .</tokentext>
<sentencetext>I use git, with flashbake and cron to automate commits, and a simply cron job to automatically update a backup copy on an external hard drive.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444623</id>
	<title>Portable hard drive</title>
	<author>lemon_dieter</author>
	<datestamp>1245749220000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Every file I download, generate, work with, etc. stays with me on a small portable hard drive.  I also use the portable apps versions of abiword, firefox, etc.  I only attempt to do CAD work on the machine in my office, where everything is zen for drafting.  I set up an rsync job to copy the contents of the drive to the RAID 5 in my office machine at 8:00 every morning.  This is a way to use my paranoia to the advantage of making sure I come to work on time, satisfying my boss as well.</htmltext>
<tokenext>Every file I download , generate , work with , etc .
stays with me on a small portable hard drive .
I also use the portable apps versions of abiword , firefox , etc .
I only attempt to do CAD work on the machine in my office , where everything is zen for drafting .
I set up an Rsync to copy the contents of the drive to the raid 5 in my office machine at 8 : 00 every morning .
This is way to use my paranoia to the advantage of making sure I come to work on time , satisfying my boss as well .</tokentext>
<sentencetext>Every file I download, generate, work with, etc.
stays with me on a small portable hard drive.
I also use the portable apps versions of abiword, firefox, etc.
I only attempt to do CAD work on the machine in my office, where everything is zen for drafting.
I set up an Rsync to copy the contents of the drive to the raid 5 in my office machine at 8:00 every morning.
This is way to use my paranoia to the advantage of making sure I come to work on time, satisfying my boss as well.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443777</id>
	<title>Rsync</title>
	<author>Perl-Pusher</author>
	<datestamp>1245789540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Whenever I'm on my workplace network I have a cron job that uses rsync to sync my Documents directory between my Linux desktop &amp; Mac laptop. This way the latest file is always on both machines. The desktop is also rsync'd to a backup server daily, and weekly for off-site storage at a bank vault. The laptop uses Time Machine at my home. This allows me the flexibility of grabbing older files if I need them.  For mail I use IMAP with SSL. So short of a nuclear holocaust, it would be pretty hard to lose a file.</htmltext>
<tokenext>Whenever I 'm on my workplace network I have cron job that uses rsync to sync my Documents directory between my linux desktop &amp; mac laptop .
This way the latest file is always on both machines .
The Desktop is also rsync 'd to a backup server daily , and weekly for off site storage at bank vault .
And the Laptop uses Time Machine at my home .
This allows me the flexibility of grabbing older files if I need them .
For mail I use imap with SSL .
So short of a nuclear holocaust , it would be pretty hard to lose a file .</tokentext>
<sentencetext>Whenever I'm on my workplace network I have cron job that uses rsync to sync my Documents directory between my linux desktop &amp; mac laptop.
This way the latest file is always on both machines.
The Desktop is also rsync'd to a backup server daily, and weekly for off site storage at bank vault.
And the Laptop uses Time Machine at my home.
This allows me the flexibility of grabbing older files if I need them.
For mail I use imap with SSL.
So short of a nuclear holocaust, it would be pretty hard to lose a file.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445955</id>
	<title>Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245754140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Dropbox looks pretty awesome. I just spent some money on a home NAS array so I'd prefer to do that. What's the closest thing to dropbox that I can put on my local NAS array instead?</p></htmltext>
<tokenext>Dropbox looks pretty awesome .
I just spent some money on a home NAS array so I 'd prefer to do that .
What 's the closest thing to dropbox that I can put on my local NAS array instead ?</tokentext>
<sentencetext>Dropbox looks pretty awesome.
I just spent some money on a home NAS array so I'd prefer to do that.
What's the closest thing to dropbox that I can put on my local NAS array instead?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445023</id>
	<title>Nobody reads subjects anyways</title>
	<author>neurovish</author>
	<datestamp>1245750540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Fileserver and an NFS export.<br>My home directories generally don't contain anything that I need distributed.  If I need something in a home directory that is not there, then I will copy it there from wherever it does exist.  I only really have two computers that I actively use though, so I just do document versioning by hand if there happens to be something I was working with on my laptop and desktop.  Is this really a problem for people?  I would think that anybody who regularly uses more than two computers (desktop + laptop) would be sufficiently capable of setting up something that works for them... otherwise, they probably really don't need to be using so many different computers.</p></htmltext>
<tokenext>Fileserver and an NFS export.My home directories generally do n't contain anything that I need distributed .
If I need something in a home directory that is not there , then I will copy it there from wherever it does exist .
I only really have two computers that I actively use though , so I just do document versioning by hand if there happens to be something I was working with on my laptop and desktop .
This is really a problem for people ?
I would think that anybody who regularly uses more than two computers ( desktop + laptop ) would be sufficiently capable of setting up something that works for them....otherwise , they probably really do n't need to be using so many different computers .</tokentext>
<sentencetext>Fileserver and an NFS export.My home directories generally don't contain anything that I need distributed.
If I need something in a home directory that is not there, then I will copy it there from wherever it does exist.
I only really have two computers that I actively use though, so I just do document versioning by hand if there happens to be something I was working with on my laptop and desktop.
This is really a problem for people?
I would think that anybody who regularly uses more than two computers (desktop + laptop) would be sufficiently capable of setting up something that works for them....otherwise, they probably really don't need to be using so many different computers.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444303</id>
	<title>Re:rsync</title>
	<author>Nukenbar</author>
	<datestamp>1245748020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>along with cron, it is all of the backup that a normal user will need.</p></htmltext>
<tokenext>along with cron , it is all of the backup that a normal user will need .</tokentext>
<sentencetext>along with cron, it is all of the backup that a normal user will need.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28473571</id>
	<title>git</title>
	<author>Anonymous</author>
	<datestamp>1245929280000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I've been using git to track and propagate my various linux/osx home directories. I use it because I can maintain different branches (one for each machine), and propagate <i>changes</i> rather than whole files between them. This is very nice for config files where the content may be very similar between machines, but differ in a few areas here and there. I maintain files on 6 different machines, and have a master public server I use to push and pull from. If I change a file and commit it, and want to propagate the change to all my other machines, I use this script which I hacked up to do it for me:<br><a href="http://pohl.ececs.uc.edu/~jeremy/propagate-commit.sh" title="uc.edu" rel="nofollow">http://pohl.ececs.uc.edu/~jeremy/propagate-commit.sh</a> [uc.edu]</p><p>It takes as its argument a commit id, and will propagate the commit to all other branches in your repo. I'm sure it can be modified to work differently and/or be more configurable.</p><p>Performance-wise, git is fast as hell. And it compresses objects very well. Plus you get versioning of course ;)</p></htmltext>
<tokenext>I 've been using git to track and propagate my various linux/osx home directories .
I use it because I can maintain different branches ( one for each machine ) , and propagate changes rather than whole files between them .
This is very nice for config files where the content may be very similar between machines , but differ in a few areas here and there .
I maintain files on 6 different machines , and have a master public server I use to push and pull from .
If I change a file and commit it , and want to propagate the change to all my other machines , I use this script which I hacked up to do it for me : http : //pohl.ececs.uc.edu/ ~ jeremy/propagate-commit.sh [ uc.edu ] It takes as its argument a commit id , and will propagate the commit to all other branches in your repo .
I 'm sure it can be modified to work differently and/or be more configurable.Performance wise , git is fast as hell .
And it compresses object very well .
Plus you get versioning of course ; )</tokentext>
<sentencetext>I've been using git to track and propagate my various linux/osx home directories.
I use it because I can maintain different branches (one for each machine), and propagate changes rather than whole files between them.
This is very nice for config files where the content may be very similar between machines, but differ in a few areas here and there.
I maintain files on 6 different machines, and have a master public server I use to push and pull from.
If I change a file and commit it, and want to propagate the change to all my other machines, I use this script which I hacked up to do it for me:http://pohl.ececs.uc.edu/~jeremy/propagate-commit.sh [uc.edu]It takes as its argument a commit id, and will propagate the commit to all other branches in your repo.
I'm sure it can be modified to work differently and/or be more configurable.Performance wise, git is fast as hell.
And it compresses object very well.
Plus you get versioning of course ;)</sentencetext>
</comment>
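The propagate idea described above can be sketched with `git cherry-pick`; this is a simplified reconstruction of the approach, not a copy of the commenter's actual script (which lives at the URL he posted):

```shell
#!/bin/sh
# propagate_commit: apply one commit to every other local branch,
# then return to the branch you started on. A simplified sketch of
# the per-machine-branch workflow described in the comment.
propagate_commit() {
    commit=$1
    current=$(git rev-parse --abbrev-ref HEAD)
    for branch in $(git for-each-ref --format='%(refname:short)' refs/heads/); do
        [ "$branch" = "$current" ] && continue
        git checkout -q "$branch" &&
        git cherry-pick "$commit" >/dev/null
    done
    git checkout -q "$current"
}
```

Because cherry-pick applies the commit as a diff, each machine's branch keeps its own local differences and only picks up the change itself; conflicting hunks stop the loop for manual resolution, which is the usual price of this scheme.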
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445899</id>
	<title>rsync + OpenSolaris (ZFS) w/time slider</title>
	<author>EBorisch</author>
	<datestamp>1245753900000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>3</modscore>
	<htmltext><p>Nightly (or more frequently, if you like) rsync to an OpenSolaris server running ZFS w/ Time Slider.</p><p>Quality versioned backups with little effort, plus data integrity (checksums built into the filesystem), compression, and (if desired) RAID-Z(2) goodness! In addition, the provided time slider interface allows easy browsing of versions.</p><p>Just my 2c...</p></htmltext>
<tokenext>Nighly ( or more frequently , if you like ) rsync to an OpenSolaris server running ZFS w/ Time Slider.Quality versioned backups with little effort , plus data integrity ( checksums built into the filesystem ) , compression , and ( if desired ) RAID-Z ( 2 ) goodness !
In addition , the provided time slider interface allows easy browsing of versions.Just my 2c.. .</tokentext>
<sentencetext>Nighly (or more frequently, if you like) rsync to an OpenSolaris server running ZFS w/ Time Slider.Quality versioned backups with little effort, plus data integrity (checksums built into the filesystem), compression, and (if desired) RAID-Z(2) goodness!
In addition, the provided time slider interface allows easy browsing of versions.Just my 2c...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444327</id>
	<title>Allway Sync</title>
	<author>mustafap</author>
	<datestamp>1245748080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I absolutely love Allway Sync. Very cheap (I normally never buy software, so the low price helped), but it completely simplifies syncing between two home PCs and one work PC.</p><p>It's never screwed me up once (fingers crossed!)</p></htmltext>
<tokenext>I absolutely love allway sync .
Very cheap ( I normally never buy software , so low price helped ) but completely simplifies syncing between two home and one work PC.It 's never screwed me up once ( fingers crossed !
)</tokentext>
<sentencetext>I absolutely love allway sync.
Very cheap ( I normally never buy software, so low price helped ) but completely simplifies syncing between two home and one work PC.It's never screwed me up once (fingers crossed!
)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450565</id>
	<title>I'd like to have my files in the Cloud</title>
	<author>Xouba</author>
	<datestamp>1245841500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Myself, I'd solve that issue with a somewhat hefty (~10Mb/s) Internet connection, a local cache and a place in "the Cloud" to store my files. I know the shortcomings of storing your things on some server online (security, privacy ...), but as many of the other things I use are already there, it's only natural that this is there too.</htmltext>
<tokenext>Myself , I 'd solve that issue with a somewhat hefty ( ~ 10Mb/s ) Internet connection , a local cache and a place in " the Cloud " to store my files .
I know of the shortcomings of storing your things in some server online ( security , privacy ... ) , but as many of the other things I use are already there , it 's only natural that this is there too .</tokentext>
<sentencetext>Myself, I'd solve that issue with a somewhat hefty (~10Mb/s) Internet connection, a local cache and a place in "the Cloud" to store my files.
I know of the shortcomings of storing your things in some server online (security, privacy ...), but as many of the other things I use are already there, it's only natural that this is there too.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446685</id>
	<title>I use the Apple tool</title>
	<author>MrKaos</author>
	<datestamp>1245757500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>the idont bother.</htmltext>
<tokenext>the idont bother .</tokentext>
<sentencetext>the idont bother.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448765</id>
	<title>Re:Mobile Home Directory</title>
	<author>Anonymous</author>
	<datestamp>1245775140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Exactly! NFS + Samba for windows is the obvious solution here for me. One may need to implement centralized logins as well</htmltext>
<tokenext>Exactly !
NFS + Samba for windows is the obvious solution here for me .
One may need to implement centralized logins as well</tokentext>
<sentencetext>Exactly!
NFS + Samba for windows is the obvious solution here for me.
One may need to implement centralized logins as well</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444395</id>
	<title>Re:Myself...</title>
	<author>theJML</author>
	<datestamp>1245748380000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>But doesn't that leave you open to crisis when one dies?</p><p>Or is TFA not talking about backup so much?</p><p>Personaly, though I have different uses for various systems, often find that I want to access the same data and having it in a central location is the best way to store it. So I have a central Gentoo install on a low power Geode board (runs at about 5 watts, up to 15 watts with drives spinning), with mirrored and regularly backed up storage (to tape and exchanged with a remote location's tape set when possible (like xmas time when I go to my parents who are 300 miles away, don't really care enough for it to be more frequent than that)). Each server connects to this central location with nfs or samba/cifs and all 'serious work' is done on those shares.</p><p>I've been looking into svn to help manage the various changes to files I make over time, but I haven't really migrated things to it yet.</p><p>I've also stored some important things in 'the cloud' on sites like google docs with much success.</p></htmltext>
<tokenext>But does n't that leave you open to crisis when one dies ? Or is TFA not talking about backup so much ? Personaly , though I have different uses for various systems , often find that I want to access the same data and having it in a central location is the best way to store it .
So I have a central Gentoo install on a low power Geode board ( runs at about 5 watts , up to 15 watts with drives spinning ) , with mirrored and regularly backed up storage ( to tape and exchanged with a remote location 's tape set when possible ( like xmas time when I go to my parents who are 300 miles away , do n't really care enough for it to be more frequent than that ) ) .
Each server connects to this central location with nfs or samba/cifs and all 'serious work ' is done on those shares.I 've been looking into svn to help manage the various changes to files I make over time , but I have n't really migrated things to it yet.I 've also stored some important things in 'the cloud ' on sites like google docs with much success .</tokentext>
<sentencetext>But doesn't that leave you open to crisis when one dies?
Or is TFA not talking about backup so much?
Personally, though I have different uses for various systems, I often find that I want to access the same data, and having it in a central location is the best way to store it.
So I have a central Gentoo install on a low power Geode board (runs at about 5 watts, up to 15 watts with drives spinning), with mirrored and regularly backed up storage (to tape and exchanged with a remote location's tape set when possible (like xmas time when I go to my parents who are 300 miles away, don't really care enough for it to be more frequent than that)).
Each server connects to this central location with nfs or samba/cifs and all 'serious work' is done on those shares.
I've been looking into svn to help manage the various changes to files I make over time, but I haven't really migrated things to it yet.
I've also stored some important things in 'the cloud' on sites like google docs with much success.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443953</id>
	<title>Re:Mobile Home Directory</title>
	<author>dburkland</author>
	<datestamp>1245790140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have something similar setup in that I got a FreeBSD setup with jails (openldap, NFS, Apache, etc) which hosts all of the home directories for my machines. Works quite nicely</htmltext>
<tokenext>I have something similar setup in that I got a FreeBSD setup with jails ( openldap , NFS , Apache , etc ) which hosts all of the home directories for my machines .
Works quite nicely</tokentext>
<sentencetext>I have something similar setup in that I got a FreeBSD setup with jails (openldap, NFS, Apache, etc) which hosts all of the home directories for my machines.
Works quite nicely</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443701</id>
	<title>Keep only one home directory</title>
	<author>Anonymous</author>
	<datestamp>1245789300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>There are numerous ways to do this, but I would use Samba (best used with VPN), because it can pretty much be used everywhere, even on computers with crappy OSes like Vista. There are numerous other solutions for the same thing; FTP, AFS, maybe iSCSI.</htmltext>
<tokenext>There are numerous ways to do this , but I would use Samba ( best used with VPN ) , because it can pretty much be used everywhere , even on computers with crappy OSes like Vista .
There are numerous other solutions for the same thing ; FTP , AFS , maybe iSCSI .</tokentext>
<sentencetext>There are numerous ways to do this, but I would use Samba (best used with VPN), because it can pretty much be used everywhere, even on computers with crappy OSes like Vista.
There are numerous other solutions for the same thing; FTP, AFS, maybe iSCSI.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444897</id>
	<title>X Forwarding</title>
	<author>Thedougler604</author>
	<datestamp>1245750120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Currently I'm doing my masters and I have two types of files that I have to move between two locations.

1) Students Grades
2) Tex Files (for thesis work)

So, my solution was that I ssh into my FreeBSD system at home and then use xforwarding to make my spreadsheet program or tex editor show up on my office client.

This lets me keep one record of my documents in my home location and all traffic is strongly encrypted.  (just in case oliver is a student).</htmltext>
<tokenext>Currently I 'm doing my masters and I have two types of files that I have to move between two locations .
1 ) Students Grades 2 ) Tex Files ( for thesis work ) So , my solution was that I ssh into my FreeBSD system at home and then use xforwarding to make my spreadsheet program or tex editor show up on my office client .
This lets me keep one record of my documents in my home location and all traffic is strongly encrypted .
( just in case oliver is a student ) .</tokentext>
<sentencetext>Currently I'm doing my masters and I have two types of files that I have to move between two locations.
1) Students Grades
2) Tex Files (for thesis work)

So, my solution was that I ssh into my FreeBSD system at home and then use xforwarding to make my spreadsheet program or tex editor show up on my office client.
This lets me keep one record of my documents in my home location and all traffic is strongly encrypted.
(just in case oliver is a student).</sentencetext>
</comment>
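The X-forwarding workflow described above can be made the default with a short `~/.ssh/config` entry rather than passing `-X` every time. The host alias, host name, and user here are made up for illustration; only the FreeBSD-box-at-home setup comes from the post:

```
# ~/.ssh/config -- "homebsd", the host name, and the user are hypothetical
Host homebsd
    HostName home.example.org   # the poster's FreeBSD system at home
    User doug
    ForwardX11 yes              # same effect as ssh -X on every invocation
    Compression yes             # helps X11 traffic over slow links
```

With this in place, `ssh homebsd` forwards X automatically, so the spreadsheet or TeX editor runs at home while displaying on the office client, and all traffic stays inside the encrypted SSH channel.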
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28452821</id>
	<title>I use my own</title>
	<author>Oat-Bran</author>
	<datestamp>1245860100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>custom script that works well for me:

<a href="http://pastebin.com/f6239e9c9" title="pastebin.com" rel="nofollow">http://pastebin.com/f6239e9c9</a> [pastebin.com]</htmltext>
<tokenext>custom script that works well for me : http : //pastebin.com/f6239e9c9 [ pastebin.com ]</tokentext>
<sentencetext>custom script that works well for me:

http://pastebin.com/f6239e9c9 [pastebin.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444169</id>
	<title>Combination SVN and Shared Drive</title>
	<author>rainmaestro</author>
	<datestamp>1245747600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>When dealing with source code for the software projects I work on, I store everything on SVN repos hosted on my quasi-server box (SVN, CUPS, assorted DB's, Trac, etc). Whatever machine I need to be on (dev machine, XP build machine, etc), I just sync the checkout.</p><p>For regular files, I keep most of my stuff on a 1TB drive that is NFS'ed to all my machines. If I need to do a lot of work on a big file, I pull down a copy, edit it locally, then push it back to the share. This handles all of my needs for the most part. My Linux, BSD and OSX machines are good to go, my two XP machines aren't. One is only for playing games that choke in Wine, the other is an old box that just runs a build script for my apps, so I've never bothered to try to get them configured to handle NFS shares.</p><p>Both methods are cheap, simple, and fairly pain-free if you keep Windows out of the mixed-mode environment.</p></htmltext>
<tokenext>When dealing with source code for the software projects I work on , I store everything on SVN repos hosted on my quasi-server box ( SVN , CUPS , assorted DB 's , Trac , etc ) .
Whatever machine I need to be on ( dev machine , XP build machine , etc ) , I just sync the checkout.For regular files , I keep most of my stuff on a 1TB drive that is NFS'ed to all my machines .
If I need to do a lot of work on a big file , I pull down a copy , edit it locally , then push it back to the share .
This handles all of my needs for the most part .
My Linux , BSD and OSX machines are good to go , my two XP machines are n't .
One is only for playing games that choke in Wine , the other is an old box that just runs a build script for my apps , so I 've never bothered to try to get them configured to handle NFS shares.Both methods are cheap , simple , and fairly pain-free if you keep Windows out of the mixed-mode environment .</tokentext>
<sentencetext>When dealing with source code for the software projects I work on, I store everything on SVN repos hosted on my quasi-server box (SVN, CUPS, assorted DB's, Trac, etc).
Whatever machine I need to be on (dev machine, XP build machine, etc), I just sync the checkout.
For regular files, I keep most of my stuff on a 1TB drive that is NFS'ed to all my machines.
If I need to do a lot of work on a big file, I pull down a copy, edit it locally, then push it back to the share.
This handles all of my needs for the most part.
My Linux, BSD and OSX machines are good to go, my two XP machines aren't.
One is only for playing games that choke in Wine, the other is an old box that just runs a build script for my apps, so I've never bothered to try to get them configured to handle NFS shares.
Both methods are cheap, simple, and fairly pain-free if you keep Windows out of the mixed-mode environment.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495</id>
	<title>Why is there no discount online backup?</title>
	<author>CopaceticOpus</author>
	<datestamp>1245748740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For large amounts of files (100s of GBs), the cheapest way to back them up is to use a couple sets of external hard drives, always keeping at least one set off-site.</p><p>I would like to know why this is the case. Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives? Such a service should be able to take advantage of scale by storing the bulk of the data onto industrial tape drives, and only retrieving it as needed.</p><p>It would require a lot on bandwidth to do the initial backup, but once that is complete, only incremental backups are needed. A database could store the file names, dates, sizes, and hashes, in order to determine what needs to be updated.</p><p>This would not be for data that needs to be accessed repeatedly - it is for backups. So they might charge a modest fee to recover a few files which you accidentally deleted. If you lose an entire drive, you could select a new drive model, and they'd ship it to you with your data on the drive, for the cost of the drive plus a service fee.</p><p>Is this not feasible, or has it just not been done?</p></htmltext>
<tokenext>For large amounts of files ( 100s of GBs ) , the cheapest way to back them up is to use a couple sets of external hard drives , always keeping at least one set off-site.I would like to know why this is the case .
Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives ?
Such a service should be able to take advantage of scale by storing the bulk of the data onto industrial tape drives , and only retrieving it as needed.It would require a lot on bandwidth to do the initial backup , but once that is complete , only incremental backups are needed .
A database could store the file names , dates , sizes , and hashes , in order to determine what needs to be updated.This would not be for data that needs to be accessed repeatedly - it is for backups .
So they might charge a modest fee to recover a few files which you accidentally deleted .
If you lose an entire drive , you could select a new drive model , and they 'd ship it to you with your data on the drive , for the cost of the drive plus a service fee.Is this not feasible , or has it just not been done ?</tokentext>
<sentencetext>For large amounts of files (100s of GBs), the cheapest way to back them up is to use a couple sets of external hard drives, always keeping at least one set off-site.
I would like to know why this is the case.
Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives?
Such a service should be able to take advantage of scale by storing the bulk of the data onto industrial tape drives, and only retrieving it as needed.
It would require a lot of bandwidth to do the initial backup, but once that is complete, only incremental backups are needed.
A database could store the file names, dates, sizes, and hashes, in order to determine what needs to be updated.
This would not be for data that needs to be accessed repeatedly - it is for backups.
So they might charge a modest fee to recover a few files which you accidentally deleted.
If you lose an entire drive, you could select a new drive model, and they'd ship it to you with your data on the drive, for the cost of the drive plus a service fee.
Is this not feasible, or has it just not been done?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28508561</id>
	<title>Allway sync</title>
	<author>agerbak</author>
	<datestamp>1246197600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I've found Allway sync (http://allwaysync.com/) to be a bulletproof multi-directional sync tool, which I regularly use to synchronize 100's of GB of photos, documents and music across laptop, desktop and external drive.

Windows only, which will be a problem for some (especially on this forum!), though of course a Windows machine can be used to sync to a file share hosted by another OS.

Free for moderate use. Small fee for unlimited "pro" version. One of the few such programs I've ever paid for. Regularly updated with new functionality.

I agree with others here that having a single approach for different kinds of data is probably not the right answer. For me, photos, documents and music are my three "buckets" of content, which I sync with different jobs and different frequency. OS and program settings are the other thing I backup manually, typically only when upgrading / migrating machines...</htmltext>
<tokenext>I 've found Allway sync ( http : //allwaysync.com/ ) to be a bulletproof multi-directional sync tool , which I regularly use to synchronize 100 's of GB of photos , documents and music across laptop , desktop and external drive .
Windows only , which will be a problem for some ( especially on this forum !
) , though of course a Windows machine can be used to sync to a file share hosted by another OS .
Free for moderate use .
Small fee for unlimited " pro " version .
One of the few such programs I 've ever paid for .
Regularly updated with new functionality .
I agree with others here that having a single approach for different kinds of data is probably not the right answer .
For me , photos , documents and music are my three " buckets " of content , which I sync with different jobs and different frequency .
OS and program settings are the other thing I backup manually , typically only when upgrading / migrating machines.. .</tokentext>
<sentencetext>I've found Allway sync (http://allwaysync.com/) to be a bulletproof multi-directional sync tool, which I regularly use to synchronize 100's of GB of photos, documents and music across laptop, desktop and external drive.
Windows only, which will be a problem for some (especially on this forum!), though of course a Windows machine can be used to sync to a file share hosted by another OS.
Free for moderate use.
Small fee for unlimited "pro" version.
One of the few such programs I've ever paid for.
Regularly updated with new functionality.
I agree with others here that having a single approach for different kinds of data is probably not the right answer.
For me, photos, documents and music are my three "buckets" of content, which I sync with different jobs and different frequency.
OS and program settings are the other thing I backup manually, typically only when upgrading / migrating machines...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449359</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245781980000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I have serious problems with dropbox  because what ever i do i need to send my files to the Internet where it is stored and than its downloaded again.<br>We are not allowed to store data outside the company. so dropbox is not working for us. We decided to use "powerfolder Pro".<br>- syncs directly in Lan &amp; Online<br>- we can do the job  without their infrastructure<br>- we can run our own online storage inhouse<br>- udt file transfers if you want</p><p>ok thats a little bit to big for you but it does the job for small networks also and syncs multiple  computers directly.</p><p>second recommendation would have been "beinsync" but they dropped out of business  everything else i found is more an onlinestorage with a  sync function, what is horrible if you want to do something in lan and do not have so much upload speed.</p><p>E.g. laptop to desktop in LAN at home LAN connection 100 mb upload to Internet 100 kb, dropbox needs a little longer for the same work than powerfolder does who can do it directly in LAN.</p></htmltext>
<tokenext>I have serious problems with dropbox because what ever i do i need to send my files to the Internet where it is stored and than its downloaded again.We are not allowed to store data outside the company .
so dropbox is not working for us .
We decided to use " powerfolder Pro " .- syncs directly in Lan &amp; Online- we can do the job without their infrastructure- we can run our own online storage inhouse- udt file transfers if you wantok thats a little bit to big for you but it does the job for small networks also and syncs multiple computers directly.second recommendation would have been " beinsync " but they dropped out of business everything else i found is more an onlinestorage with a sync function , what is horrible if you want to do something in lan and do not have so much upload speed.E.g .
laptop to desktop in LAN at home LAN connection 100 mb upload to Internet 100 kb , dropbox needs a little longer for the same work than powerfolder does who can do it directly in LAN .</tokentext>
<sentencetext>I have serious problems with dropbox because whatever I do, I need to send my files to the Internet, where they are stored and then downloaded again.
We are not allowed to store data outside the company, so dropbox is not working for us.
We decided to use "powerfolder Pro":
- syncs directly in LAN &amp; online
- we can do the job without their infrastructure
- we can run our own online storage in-house
- UDT file transfers if you want
OK, that's a little bit too big for you, but it does the job for small networks also and syncs multiple computers directly.
Second recommendation would have been "beinsync", but they dropped out of business; everything else I found is more an online storage with a sync function, which is horrible if you want to do something in LAN and do not have much upload speed.
E.g. laptop to desktop in LAN at home: LAN connection 100 mb, upload to Internet 100 kb; dropbox needs a little longer for the same work than powerfolder, which can do it directly in LAN.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28480619</id>
	<title>Re:Dropbox</title>
	<author>lfaraone</author>
	<datestamp>1246024560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>While "Open Source", the project is not Free Software, banning one-to-many redistribution and commercial use.</htmltext>
<tokenext>While " Open Source " , the project is not Free Software , banning one-to-many redistribution and commercial use .</tokentext>
<sentencetext>While "Open Source", the project is not Free Software, banning one-to-many redistribution and commercial use.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445397</id>
	<title>Re:Dropbox</title>
	<author>Guspaz</author>
	<datestamp>1245751860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'm also using Dropbox. It's great.</p></htmltext>
<tokenext>I 'm also using Dropbox .
It 's great .</tokentext>
<sentencetext>I'm also using Dropbox.
It's great.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444233</id>
	<title>bittorrent</title>
	<author>bl8n8r</author>
	<datestamp>1245747780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>tar cvf - /home | bzip2 | aespipe -w 10 -K /etc/homekey.gpg | bittorrent archive.`date +\%s`.foo</p></htmltext>
<tokenext>tar cvf - /home | bzip2 | aespipe -w 10 -K /etc/homekey.gpg | bittorrent archive. ` date + \ % s ` .foo</tokentext>
<sentencetext>tar cvf - /home | bzip2 | aespipe -w 10 -K /etc/homekey.gpg | bittorrent archive.`date +\%s`.foo</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443525</id>
	<title>always mount your home dir with NFS</title>
	<author>Anonymous</author>
	<datestamp>1245788760000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext><p>so I can write an shosts/rhosts file with + + and voila! You're totally pwned. Suckers.</p><p>When it comes to hacking, you bitches be playin' checkers and I'm playin' chess.</p></htmltext>
<tokenext>so I can write an shosts/rhosts file with + + and voila !
You 're totally pwned .
Suckers.When it comes to hacking , you bitches be playin ' checkers and I 'm playin ' chess .</tokentext>
<sentencetext>so I can write an shosts/rhosts file with + + and voila!
You're totally pwned.
Suckers.When it comes to hacking, you bitches be playin' checkers and I'm playin' chess.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443883</id>
	<title>rsync</title>
	<author>CaptSaltyJack</author>
	<datestamp>1245789900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I use Mac (a MacBook Pro and a Mac Pro), so I just do rsync -Cavz ~/Documents/SomeDir/ /Volumes/myusername/Documents/SomeDir/ and then usually an rsync the other direction, and I'm done.</htmltext>
<tokenext>I use Mac ( a MacBook Pro and a Mac Pro ) , so I just do rsync -Cavz ~ /Documents/SomeDir/ /Volumes/myusername/Documents/SomeDir/ and then usually an rsync the other direction , and I 'm done .</tokentext>
<sentencetext>I use Mac (a MacBook Pro and a Mac Pro), so I just do rsync -Cavz ~/Documents/SomeDir/ /Volumes/myusername/Documents/SomeDir/ and then usually an rsync the other direction, and I'm done.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444937</id>
	<title>It is from Microsoft, but...</title>
	<author>drunkenoafoffofb3ta</author>
	<datestamp>1245750240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><a href="http://www.mesh.com/" title="mesh.com" rel="nofollow">Windows Live Mesh?</a> [mesh.com] <p>
It works on Macs, Windows PCs, and mobile devices, (apparently, but I do use it on the first two happily). Could you not share your desktop folders on your devices, and get MS to sync it all for you? Alas, Linux is excluded. 5Gb space online last time I looked.</p></htmltext>
<tokenext>Windows Live Mesh ?
[ mesh.com ] It works on Macs , Windows PCs , and mobile devices , ( apparently , but I do use it on the first two happily ) .
Could you not share your desktop folders on your devices , and get MS to sync it all for you ?
Alas , Linux is excluded .
5Gb space online last time I looked .</tokentext>
<sentencetext>Windows Live Mesh?
[mesh.com] 
It works on Macs, Windows PCs, and mobile devices, (apparently, but I do use it on the first two happily).
Could you not share your desktop folders on your devices, and get MS to sync it all for you?
Alas, Linux is excluded.
5Gb space online last time I looked.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445019</id>
	<title>I just dont.</title>
	<author>Lumpy</author>
	<datestamp>1245750540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Multiple computers does not mean they all need to be identical.  If I need something I get it from the home server otherwise each computer is for it's task.</p><p>I dont need all the family photos on my acer aspire one when we travel.  Same as my Mac editing system does not need my resume or my letters to relatives on it.</p><p>I find a home server is far more useful than syncing machines.</p></htmltext>
<tokenext>Multiple computers does not mean they all need to be identical .
If I need something I get it from the home server otherwise each computer is for it 's task.I dont need all the family photos on my acer aspire one when we travel .
Same as my Mac editing system does not need my resume or my letters to relatives on it.I find a home server is far more useful than syncing machines .</tokentext>
<sentencetext>Multiple computers does not mean they all need to be identical.
If I need something I get it from the home server; otherwise each computer is for its task.
I dont need all the family photos on my acer aspire one when we travel.
Same as my Mac editing system does not need my resume or my letters to relatives on it.
I find a home server is far more useful than syncing machines.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444923</id>
	<title>Re:rsync</title>
	<author>edalytical</author>
	<datestamp>1245750180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I wrote a small script that wraps rsync, so all I have to do is type:</p><p><tt>msync push school</tt></p><p>And then my school directory on my laptop is synced with my desktop. I can also pull and it's even smart enough to know which machine it's on and adjust the source/destination accordingly.</p></htmltext>
<tokenext>I wrote a small script that wraps rsync , so all I have to do is type : msync push schoolAnd then my school directory on my laptop is synced with my desktop .
I can also pull and it 's even smart enough to know which machine it 's on and adjust the source/destination accordingly .</tokentext>
<sentencetext>I wrote a small script that wraps rsync, so all I have to do is type: msync push school
And then my school directory on my laptop is synced with my desktop.
I can also pull and it's even smart enough to know which machine it's on and adjust the source/destination accordingly.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875</parent>
</comment>
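A wrapper like the one described can be sketched in a few lines of portable shell. The host names (`laptop`, `desktop`) and the rsync flags are assumptions for illustration, not the poster's actual script, and this sketch only prints the command it would run rather than executing it:

```shell
#!/bin/sh
# Sketch of an msync-style rsync wrapper: picks the sync direction based
# on which machine it runs on. Host names are hypothetical.
LAPTOP=laptop
DESKTOP=desktop

build_cmd() {
    here=$1   # hostname we are running on
    mode=$2   # "push" or "pull"
    dir=$3    # directory under $HOME to sync
    # The peer is whichever of the two machines we are NOT on.
    if [ "$here" = "$LAPTOP" ]; then peer=$DESKTOP; else peer=$LAPTOP; fi
    if [ "$mode" = "push" ]; then
        echo "rsync -az --delete ~/$dir/ $peer:$dir/"
    else
        echo "rsync -az --delete $peer:$dir/ ~/$dir/"
    fi
}

# Example: "msync push school" would print (and a real script would run)
# the rsync invocation for this machine.
build_cmd "$(hostname)" push school
```

A real version would `eval` the generated command instead of echoing it; detecting the machine via `hostname` is what lets the same script adjust source and destination on either end.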
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445769</id>
	<title>NIS</title>
	<author>Hokan</author>
	<datestamp>1245753300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>We have an environment that includes various versions of Linux, FreeBSD, IRIX, Solaris,  and Windows.  We use NIS and NFS for all the UNIX systems and SAMBA for Windows.</p><p>It works very well for us.</p></htmltext>
<tokenext>We have an environment that includes various versions of Linux , FreeBSD , IRIX , Solaris , and Windows .
We use NIS and NFS for all the UNIX systems and SAMBA for Windows.It works very well for us .</tokentext>
<sentencetext>We have an environment that includes various versions of Linux, FreeBSD, IRIX, Solaris,  and Windows.
We use NIS and NFS for all the UNIX systems and SAMBA for Windows.
It works very well for us.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443983</id>
	<title>Re:Dropbox</title>
	<author>syphax</author>
	<datestamp>1245790260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I pay Dropbox for 50GB of space (I'm around 1/2 full, mostly pictures).</p><p>It's not perfect, but it's pretty damn good and I'm not looking back.</p><p>I'm too old to screw around with DIY approaches.</p></htmltext>
<tokenext>I pay Dropbox for 50GB of space ( I 'm around 1/2 full , mostly pictures ) . It 's not perfect , but it 's pretty damn good and I 'm not looking back . I 'm too old to screw around with DIY approaches .</tokentext>
<sentencetext>I pay Dropbox for 50GB of space (I'm around 1/2 full, mostly pictures). It's not perfect, but it's pretty damn good and I'm not looking back. I'm too old to screw around with DIY approaches.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445507</id>
	<title>VCS for OpenOffice documents?</title>
	<author>caseih</author>
	<datestamp>1245752280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>With OpenOffice using the OpenDocument format for files, which is XML (plain text) in a zip file, or even MS Office's non-standard XML formats, it would be nice if version control systems could efficiently track and store changes to these files.  I have found plugins for SVN and GIT that can reach inside the zip container and do simplistic diffs on the text, but each revision is still stored as a complete binary file.</p><p>I guess I really just need to learn LaTeX and do everything in plain text files with a good editor like vim.  But in the meantime, an XML-in-zip-file-aware version control system would be nice.</p></htmltext>
<tokenext>With OpenOffice using the OpenDocument format for files , which is XML ( plain text ) in a zip file , or even MS Office 's non-standard XML formats , it would be nice if version control systems could efficiently track and store changes to these files .
I have found plugins for SVN and GIT that can reach inside the zip container and do simplistic diffs on the text , but each revision is still stored as a complete binary file . I guess I really just need to learn LaTeX and do everything in plain text files with a good editor like vim .
But in the meantime , an XML-in-zip-file-aware version control system would be nice .</tokentext>
<sentencetext>With OpenOffice using the OpenDocument format for files, which is XML (plain text) in a zip file, or even MS Office's non-standard XML formats, it would be nice if version control systems could efficiently track and store changes to these files.
I have found plugins for SVN and GIT that can reach inside the zip container and do simplistic diffs on the text, but each revision is still stored as a complete binary file. I guess I really just need to learn LaTeX and do everything in plain text files with a good editor like vim.
But in the meantime, an XML-in-zip-file-aware version control system would be nice.</sentencetext>
</comment>
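Git's textconv mechanism gets partway to what the commenter wants: readable diffs of zipped-XML documents, even though each revision is still stored as a complete binary blob. A minimal setup sketch, assuming the external `odt2txt` utility is installed (shown in a scratch repository; in practice you would run the two configuration commands inside your own repo):

```shell
#!/bin/sh
# Demonstration in a throwaway repository.
repo=$(mktemp -d)
cd "$repo" && git init -q .

# Teach git to diff .odt/.ods files as extracted text.  The filter
# name "odf" is arbitrary; odt2txt dumps the text content of an
# OpenDocument file.
git config diff.odf.textconv odt2txt
printf '*.odt diff=odf\n*.ods diff=odf\n' > .gitattributes
```

With this in place `git diff` shows textual changes for tracked `.odt` files, but storage is unchanged: every revision remains a full zip object unless pack delta compression happens to help.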
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444523</id>
	<title>Actually I am concerned about privacy</title>
	<author>Orion Blastar</author>
	<datestamp>1245748860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For example, if I use online syncing utilities for my home drives, how do I know someone else won't have access to them? If I am writing a book or a software project, how can I know someone won't steal it from me and claim it as their own?</p><p>I'd like to set up my own file server and have the sync software store the files there, or to a USB drive or something. My old method was to burn CD-R and DVD-R discs, but they can get lost or scratched and then the data is gone. I have, for example, a DVD-R full of my old documents from before I reformatted one of my systems. It is lost somewhere in my house, or maybe someone visiting stole it. I need a better solution than that, but I cannot afford to set up my own file server. So I set up an XP box with peer-to-peer networking for my network drive and printer server. I'd much rather use Linux, but my Linux box is an old 700MHz Celeron system, and when I ran Fedora or Ubuntu on it, it got data corruption after a while, since it only uses IDE hard drives and I cannot afford a UPS. Our power goes out a few times a month in our area due to storms and trees hitting power lines, which contributed to my data being unrecoverable on the Linux system and forced me to reformat it again and again, until finally I gave up on it. On top of that, trying to stay on the latest and greatest upgrades caused problems: I think installing Fedora Core 9 over Core 8 broke libraries and even the X Window GUI with GNOME, which forced it into shell-prompt mode.</p><p>My XP system is a 1.4GHz AMD Athlon system, not even dual core, but it should make a good Linux server if I can replace it with a new system for running my Windows programs. Since I have been on disability since 2002, I cannot afford new systems and make do with what I have.</p></htmltext>
<tokenext>for example if I use online syncing utilities for my home drives , how do I know someone else wo n't have access to them ?
If I am writing a book or software project , how can I know someone wo n't steal it from me and claim it as their own ? I 'd like to set up my own file server and have the sync software store the files there or to a USB drive or something .
My old method is to burn CD-R and DVD-R disks , but they can get lost or scratched and then the data is gone .
I have , for example , a DVD-R full of my old documents before I reformatted one of my systems .
It is lost somewhere in my house , or maybe someone visiting stole it .
I need a better solution than that .
but I can not afford to set up my own file server .
So I set up an XP box with peer-peer networking for my Network drive and Printer server .
I 'd much rather use Linux but my Linux box is an old 700MHz Celeron system and when I ran Fedora or Ubuntu Linux on it , it gets data corruption after a while as it only uses IDE hard drives and I can not afford a UPS system .
Our power goes out a few times a month in our area due to storms and trees hitting power lines .
Which contributed to my data being unrecoverable on the Linux system and it forced me to reformat it again and again , until finally I gave up on it .
Plus trying to stay up to the latest and greatest upgrades I think trying to install Fedora Core 9 over Core 8 caused the system to have problems with libraries and even running the X-Window GUI with GNOME which forced it into shell prompt mode . My XP system is a 1.4GHz AMD Athlon system , not even dual core , but it should be able to make a good Linux server if I can replace it with a new system for running my Windows programs on .
Since I am on disability since 2002 I can not afford new systems and make do with what I have .</tokentext>
<sentencetext>for example if I use online syncing utilities for my home drives, how do I know someone else won't have access to them?
If I am writing a book or software project, how can I know someone won't steal it from me and claim it as their own? I'd like to set up my own file server and have the sync software store the files there or to a USB drive or something.
My old method is to burn CD-R and DVD-R disks, but they can get lost or scratched and then the data is gone.
I have, for example, a DVD-R full of my old documents before I reformatted one of my systems.
It is lost somewhere in my house, or maybe someone visiting stole it.
I need a better solution than that.
but I cannot afford to set up my own file server.
So I set up an XP box with peer-peer networking for my Network drive and Printer server.
I'd much rather use Linux but my Linux box is an old 700MHz Celeron system and when I ran Fedora or Ubuntu Linux on it, it gets data corruption after a while as it only uses IDE hard drives and I cannot afford a UPS system.
Our power goes out a few times a month in our area due to storms and trees hitting power lines.
Which contributed to my data being unrecoverable on the Linux system and it forced me to reformat it again and again, until finally I gave up on it.
Plus trying to stay up to the latest and greatest upgrades I think trying to install Fedora Core 9 over Core 8 caused the system to have problems with libraries and even running the X-Window GUI with GNOME which forced it into shell prompt mode. My XP system is a 1.4GHz AMD Athlon system, not even dual core, but it should be able to make a good Linux server if I can replace it with a new system for running my Windows programs on.
Since I am on disability since 2002 I cannot afford new systems and make do with what I have.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447979</id>
	<title>NEW solution, not previously mentioned.</title>
	<author>Xilinx_guy</author>
	<datestamp>1245766260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I had to read all the comments before I decided to contribute my solution to the problem.   I use a portable Sata2 hard drive, with a Vantec enclosure that supports Sata2 and USB.   I use a 250Gbyte 2.5" hard drive encased in the Vantec aluminum shell, which slides into a docking station in a 3.5" drive bay.  I have one of these at home, and one at the office.   I carry the disk around with me, and boot Kubuntu Intrepid from an 80Gb partition on the drive.   But there's some magic in the way the drive is set up.   The 250Gbytes is split into 3 partitions of 80GB, with a 10GB swap at the end (which is almost never used).   Each partition is actually configured as an element of a Raid 1 (mirror) drive, with 3 copies.  Having a 3-way RAID 1 lets me sync the portable drive to the desktop drive and simultaneously create an extra copy on a 3rd device if I need it.   The desktops I plug this drive into also connect it as a Sata2 hard drive, and once I successfully boot from the portable drive, I then add a partition from the fixed 3.5" drive in the desktop, and they sync up at around 50 Mbytes/sec.   This creates my backup in case of loss, corruption, or crash.  As for privacy, I do full disk encryption on top of the RAID layer with dm-crypt, so I don't really worry about any of the disks being stolen.
  The same Kubuntu image boots beautifully on my laptop (using USB to connect it), and I also have a RAID1 mirror partition on the laptop for backup.   This way, I have multiple copies of my operating system (with home directory) in different locations, and every time I resync the RAID 1 devices, I freshen the backup (usually daily).   I had one disk corruption problem that needed an OS rebuild, so I switched from XFS to EXT3 and haven't had any further problems.   And what about Windows?    No problem...   A copy of VirtualBox gives me an XP Home version of Windows right inside Kubuntu, where it lives very happily.  Remember, Windows works much better as an application than it does as an OS.    One thing to note is the replication speed.   I set up the RAID 1 devices with internal bitmaps, which keep track of modified raid chunks, and this causes replication between similar images to proceed at a *much* higher speed than a simple copy.   I can synchronize an 80GB RAID 1 partition in about 15 minutes when using a SATA connection.   So at the cost of buying a 2.5" hard drive and a pair of Vantec docking stations, I can carry around my OS and home directory with security, redundancy, and convenience.
<p>
I'm happy.</p></htmltext>
<tokenext>I had to read all the comments before I decided to contribute my solution to the problem .
I use a portable Sata2 hard drive , with a Vantec enclosure that supports Sata2 and USB .
I use a 250Gbyte 2.5 " hard drive encased in the Vantec aluminum shell , which slides into a docking station in a 3.5 " drive bay .
I have one of these at home , and one at the office .
I carry the disk around with me , and boot Kubuntu Intrepid from an 80Gb partition on the drive .
But there 's some magic in the way the drive is set up .
The 250Gbytes is split into 3 partitions of 80GB , with a 10GB swap at the end ( which is almost never used ) .
Each partition is actually configured as an element of a Raid 1 ( mirror ) drive , with 3 copies .
Having a 3 way RAID 1 lets me sync the portable drive to the desktop drive and simultaneously create an extra copy on a 3rd device if I need it .
The desktops I plug this drive into also connect it as a Sata2 hard drive , and once I successfully boot from the portable drive , I then add a partition from the fixed 3.5 " drive in the desktop , and they sync up at around 50 Mbytes/sec .
This creates my backup in case of loss , corruption , or crash .
As for privacy , I do full disk encryption on top of the RAID layer with dm-crypt , so I do n't really worry about any of the disks being stolen .
The same Kubuntu image boots beautifully on my laptop ( using USB to connect it ) , and I also have a RAID1 mirror partition on the laptop for backup .
This way , I have multiple copies of my operating system ( with home directory ) in different locations , and every time I resync the RAID 1 devices , I freshen the backup ( usually daily ) .
I had one disk corruption problem that needed an OS rebuild , so I switched from XFS to EXT3 and have n't had any further problems .
And what about Windows ?
No problem... A copy of VirtualBox gives me an XP Home version of Windows right inside Kubuntu , where it lives very happily .
Remember , Windows works much better as an application than it does as an OS .
One thing to note is the replication speed .
I set up the RAID 1 devices with internal bitmaps , which keep track of modified raid chunks , and this causes replication between similar images to proceed at a * much * higher speed than a simple copy .
I can synchronize an 80GB RAID 1 partition in about 15 minutes when using a SATA connection .
So at the cost of buying a 2.5 " hard drive and a pair of Vantec docking stations , I can carry around my OS and home directory with security , redundancy , and convenience .
I 'm happy .</tokentext>
<sentencetext>I had to read all the comments before I decided to contribute my solution to the problem.
I use a portable Sata2 hard drive, with a Vantec enclosure that supports Sata2 and USB.
I use a 250Gbyte 2.5" hard drive encased in the Vantec aluminum shell, which slides into a docking station in a 3.5" drive bay.
I have one of these at home, and one at the office.
I carry the disk around with me, and boot Kubuntu Intrepid from an 80Gb partition on the drive.
But there's some magic in the way the drive is set up.
The 250Gbytes is split into 3 partitions of 80GB, with a 10GB swap at the end (which is almost never used).
Each partition is actually configured as an element of a Raid 1 (mirror) drive, with 3 copies.
Having a 3 way RAID 1 lets me sync the portable drive to the desktop drive and simultaneously create an extra copy on a 3rd device if I need it.
The desktops I plug this drive into also connect it as a Sata2 hard drive, and once I successfully boot from the portable drive, I then add a partition from the fixed 3.5" drive in the desktop, and they sync up at around 50 Mbytes/sec.
This creates my backup in case of loss, corruption, or crash.
As for privacy, I do full disk encryption on top of the RAID layer with dm-crypt, so I don't really worry about any of the disks being stolen.
The same Kubuntu image boots beautifully on my laptop (using USB to connect it), and I also have a RAID1 mirror partition on the laptop for backup.
This way, I have multiple copies of my operating system (with home directory) in different locations, and every time I resync the RAID 1 devices, I freshen the backup (usually daily).
I had one disk corruption problem that needed an OS rebuild, so I switched from XFS to EXT3 and haven't had any further problems.
And what about Windows?
No problem...   A copy of VirtualBox gives me an XP Home version of Windows right inside Kubuntu, where it lives very happily.
Remember, Windows works much better as an application than it does as an OS.
One thing to note is the replication speed.
I set up the RAID 1 devices with internal bitmaps, which keep track of modified raid chunks, and this causes replication between similar images to proceed at a *much* higher speed than a simple copy.
I can synchronize an 80GB RAID 1 partition in about 15 minutes when using a SATA connection.
So at the cost of buying a 2.5" hard drive and a pair of Vantec docking stations, I can carry around my OS and home directory with security, redundancy, and convenience.
I'm happy.</sentencetext>
</comment>
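The fast-resync behaviour the commenter relies on comes from md write-intent bitmaps. The command sequence would look roughly like the sketch below; the device names are purely illustrative, and these operations need root on real block devices, so treat this as an outline of the technique rather than something to paste:

```shell
# Create a RAID 1 array with an internal write-intent bitmap; the
# bitmap records which chunks changed while a mirror was absent.
mdadm --create /dev/md0 --level=1 --raid-devices=2 \
      --bitmap=internal /dev/sda2 /dev/sdb2

# Re-attach the portable disk after carrying it around: only the
# chunks flagged in the bitmap are copied, so the resync is fast.
mdadm /dev/md0 --re-add /dev/sdc2

# Before unplugging, cleanly detach the portable member:
mdadm /dev/md0 --fail /dev/sdc2 --remove /dev/sdc2
```

The `--re-add` path is what turns an 80GB resync into minutes instead of a full copy: without the bitmap, every re-attachment would rewrite the entire partition.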
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444117</id>
	<title>Re:Dropbox</title>
	<author>bastion_xx</author>
	<datestamp>1245790620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I love Dropbox. The first thing I normally do is *not* store my home directory files in the default locations. It's easier to create a Vista/Win7 favorite c:\My Dropbox, an OS X folder /home/user/My Dropbox, and then make entries in Explorer / Finder to make it easier to access.<br><br>I'm on the free plan at present, but Dropbox will get my money.<br><br>The ability to go back and restore files is nice too.</htmltext>
<tokenext>I love Dropbox .
The first thing I normally do is * not * store my home directory files in the default locations .
It 's easier to create a Vista/Win7 favorite c : \ My Dropbox , an OS X folder /home/user/My Dropbox , and then make entries in Explorer / Finder to make it easier to access . I 'm on the free plan at present , but Dropbox will get my money . The ability to go back and restore files is nice too .</tokentext>
<sentencetext>I love Dropbox.
The first thing I normally do is *not* store my home directory files in the default locations.
It's easier to create a Vista/Win7 favorite c:\My Dropbox, an OS X folder /home/user/My Dropbox, and then make entries in Explorer / Finder to make it easier to access. I'm on the free plan at present, but Dropbox will get my money. The ability to go back and restore files is nice too.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450063</id>
	<title>Re:always mount your home dir with NFS</title>
	<author>Anonymous</author>
	<datestamp>1245876240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p>so I can write an shosts/rhosts file with + + and voila! You're totally pwned. Suckers.</p></div><p>You could not p0wn your own computer with a full set of passwords, hardware access, and a set of screwdrivers. No one uses IP-based security over the internet anymore.</p><div class="quote"><p>When it comes to hacking, you bitches be playin' checkers and I'm playin' chess.</p></div><p>You sound like a playground kid running his mouth off without any real understanding of what you are talking about. Don't feel bad about it; everyone goes through that stage. And that rap talk just makes you sound lame.</p>
	</htmltext>
<tokenext>so I can write an shosts/rhosts file with + + and voila !
You 're totally pwned .
Suckers . You could not p0wn your own computer with a full set of passwords , hardware access and a set of screwdrivers .
No one uses IP-based security over the internet anymore . When it comes to hacking , you bitches be playin ' checkers and I 'm playin ' chess . You sound like a playground kid running his mouth off without any real understanding of what you are talking about .
Do n't feel bad about it , everyone goes through that stage .
And that rap talk just makes you sound lame .</tokentext>
<sentencetext>so I can write an shosts/rhosts file with + + and voila!
You're totally pwned.
Suckers. You could not p0wn your own computer with a full set of passwords, hardware access and a set of screwdrivers.
No one uses IP-based security over the internet anymore. When it comes to hacking, you bitches be playin' checkers and I'm playin' chess. You sound like a playground kid running his mouth off without any real understanding of what you are talking about.
Don't feel bad about it, everyone goes through that stage.
And that rap talk just makes you sound lame.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443525</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28493129</id>
	<title>Re:Unison is the only way</title>
	<author>Anonymous</author>
	<datestamp>1246097820000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Too bad Unison is no longer developed.</p></htmltext>
<tokenext>Too bad Unison is no longer developed .</tokentext>
<sentencetext>Too bad Unison is no longer developed.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445129</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28471063</id>
	<title>Spideroak</title>
	<author>TheKeyboardSlayer</author>
	<datestamp>1245920460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>SpiderOak
<a href="https://spideroak.com/" title="spideroak.com" rel="nofollow">https://spideroak.com/</a> [spideroak.com]

Why?  Because it's encrypted and even SpiderOak can't decrypt your data.

That makes all my stuff safe from prying eyes.  No one else can do that.</htmltext>
<tokenext>SpiderOak https : //spideroak.com/ [ spideroak.com ] Why ?
Because it 's encrypted and even SpiderOak ca n't decrypt your data .
That makes all my stuff safe from prying eyes .
No one else can do that .</tokentext>
<sentencetext>SpiderOak
https://spideroak.com/ [spideroak.com]

Why?
Because it's encrypted and even SpiderOak can't decrypt your data.
That makes all my stuff safe from prying eyes.
No one else can do that.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450441</id>
	<title>git</title>
	<author>ranulf</author>
	<datestamp>1245839340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I also use git. I've got a script checked into the documents folder itself with an icon on my task bar to launch it. Basically, this:
<p>
<tt>
git add<nobr> <wbr></nobr>.<br>
git commit -a -m sync<br>
git pull<br>
git push<br>
</tt>
</p><p>
will add any new files to the repository and synchronise with the repository. It works pretty well, except when a binary file has been modified on two computers, as then you need to drop to the shell to resolve the merge. But I use it to sync between my desktop, laptop, netbook and server and have never had any problems.</p></htmltext>
<tokenext>I also use git .
I 've got a script checked into the documents folder itself with an icon on my task bar to launch it .
Basically , this : git add .
git commit -a -m sync git pull git push will add any new files to the repository and synchronise with the repository .
It works pretty well , except when a binary file has been modified on two computers , as then you need to drop to the shell to resolve the merge .
But I use it to sync between my desktop , laptop , netbook and server and have never had any problems .</tokentext>
<sentencetext>I also use git.
I've got a script checked into the documents folder itself with an icon on my task bar to launch it.
Basically, this:


git add .
git commit -a -m sync
git pull
git push


will add any new files to the repository and synchronise with the repository.
It works pretty well, except when a binary file has been modified on two computers, as then you need to drop to the shell to resolve the merge.
But I use it to sync between my desktop, laptop, netbook and server and have never had any problems.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659</parent>
</comment>
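The four commands above can be wrapped so that a failed pull (the binary-modified-on-two-machines case) stops the run before anything is pushed. This is a generic sketch rather than the commenter's actual script; the `DOCS_DIR` variable and `sync_docs` name are invented for illustration:

```shell
#!/bin/sh
# Wrap the add/commit/pull/push cycle; bail out with a message when
# the pull fails (e.g. a merge conflict on a binary file) so it can
# be resolved by hand before anything is pushed.
sync_docs() {
    cd "${DOCS_DIR:-$HOME/documents}" || return 1
    git add .
    git commit -q -a -m sync || true    # "nothing to commit" is fine
    if ! git pull -q; then
        echo "pull failed: resolve the merge, then push manually" >&2
        return 1
    fi
    git push -q
}
```

Bound to a taskbar icon, this behaves like the script described: silent on the happy path, and it leaves the repository mid-merge for manual resolution on conflict.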
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444127</id>
	<title>iFolder and Open Enterprise Server 2 from Novell</title>
	<author>Anonymous</author>
	<datestamp>1245790680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I set up Novell Open Enterprise Server 2 on a mini Linux server and use iFolder for syncing folders and files across several Windows and Linux desktops and laptops for myself and my wife.</p></htmltext>
<tokenext>I set up Novell Open Enterprise Server 2 on a mini Linux server and use iFolder for syncing folders and files across several Windows and Linux desktops and laptops for myself and my wife .</tokentext>
<sentencetext>I set up Novell Open Enterprise Server 2 on a mini Linux server and use iFolder for syncing folders and files across several Windows and Linux desktops and laptops for myself and my wife.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449077</id>
	<title>Re:Dropbox</title>
	<author>enoz</author>
	<datestamp>1245778380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Considering that you said "mostly pictures" it would be worth considering a professional image hosting service such as <a href="http://smugmug.com/photos/photo-sharing-features/" title="smugmug.com">Smugmug</a> [smugmug.com].</p><p>Unlimited storage from US$40 per year is less than half of what you would be paying for 50GB on Dropbox.</p></htmltext>
<tokenext>Considering that you said " mostly pictures " it would be worth considering a professional image hosting service such as Smugmug [ smugmug.com ] . Unlimited storage from US $ 40 per year is less than half of what you would be paying for 50GB on Dropbox .</tokentext>
<sentencetext>Considering that you said "mostly pictures" it would be worth considering a professional image hosting service such as Smugmug [smugmug.com]. Unlimited storage from US$40 per year is less than half of what you would be paying for 50GB on Dropbox.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443983</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443707</id>
	<title>File server?</title>
	<author>Anonymous</author>
	<datestamp>1245789360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>All my data goes on a Samba file server in the garage (which is separated from the house, so hopefully in the event of a fire I don't lose both) and my various machines access this, with rights according to the user used to mount the Samba shares. This runs a nightly incremental tar to an external eSATA drive. Every time I copy any data I really care about (like new photos) onto the server I also run rsync to sync it to my machine in the house. Future plans include FreeNAS with ZFS (and hopefully snapshot support), and a number of external hard drives that I can keep at my parents' houses.</p></htmltext>
<tokenext>All my data goes on a Samba file server in the garage ( which is separated from the house so hopefully in the event of a fire I do n't lose both ) and my various machines access this , with rights according to the user used to mount the Samba shares .
This runs a nightly incremental tar to an external eSATA drive .
Every time I copy any data I really care about ( like new photos ) onto the server I also run rsync to sync it to my machine in the house .
Future plans include FreeNAS with ZFS ( and hopefully snapshot support ) , and a number of external hard drives that I can keep at my parents ' houses .</tokentext>
<sentencetext>All my data goes on a Samba file server in the garage (which is separated from the house so hopefully in the event of a fire I don't lose both) and my various machines access this, with rights according to the user used to mount the Samba shares.
This runs a nightly incremental tar to an external eSATA drive.
Every time I copy any data I really care about (like new photos) onto the server I also run rsync to sync it to my machine in the house.
Future plans include FreeNAS with ZFS (and hopefully snapshot support), and a number of external hard drives that I can keep at my parents' houses.</sentencetext>
</comment>
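The "nightly incremental tar" step can be sketched with GNU tar's `--listed-incremental` snapshot file: tar records file state in a `.snar` file, so each run after the first archives only new or changed files. The function name and paths below are illustrative, not the poster's setup:

```shell
#!/bin/sh
# Incremental backup sketch using a GNU tar snapshot (.snar) file.
backup() {
    src=$1; dest=$2; snar=$3
    tar --create --gzip \
        --listed-incremental="$snar" \
        --file="$dest/data-$(date +%F-%H%M%S).tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}
```

Run nightly from cron against the Samba share and the eSATA mount, this produces one small archive per night; deleting the `.snar` file forces the next run to be a full backup again.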
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446891</id>
	<title>Git</title>
	<author>kabloom</author>
	<datestamp>1245758640000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext><p>I use a constellation of git repositories, and Joey Hess' <a href="http://kitenet.net/~joey/code/mr/" title="kitenet.net">mr</a> [kitenet.net] tool to synchronize all of them. I have no automated commits -- I just remember to commit and update manually daily.</p></htmltext>
<tokenext>I use a constellation of git repositories , and Joey Hess ' mr [ kitenet.net ] tool to synchronize all of them .
I have no automated commits -- I just remember to commit and update manually daily .</tokentext>
<sentencetext>I use a constellation of git repositories, and Joey Hess' mr [kitenet.net] tool to synchronize all of them.
I have no automated commits -- I just remember to commit and update manually daily.</sentencetext>
</comment>
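For reference, `mr` drives any number of repositories from a single `~/.mrconfig`. A minimal sketch of what such a file looks like (the repository URLs and paths here are invented examples, not the commenter's):

```
# ~/.mrconfig
[src/dotfiles]
checkout = git clone git://example.org/dotfiles.git dotfiles

[src/thesis]
checkout = git clone git://example.org/thesis.git thesis
```

`mr update` then pulls every listed repository in turn, and `mr commit` / `mr push` fan the corresponding git commands out across all of them, which fits the manual daily-commit routine described above.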
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444181</id>
	<title>NFS or AFP mounted home directories</title>
	<author>Anonymous</author>
	<datestamp>1245747660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>All my Macs in the house (5) mount their home directories from a central server. This server was originally a Linux box and the mounting was done via NFS. About a year ago I got fed up enough with Linux to convert that server to a Mac as well and it now shares all home directories via AFP.</p><p>On the notebooks I have two accounts: one that mounts the home directory when I am home and in wireless range and another account that is a separate desktop for when I am not home. I'll occasionally rsync files from the non-central home directory to my server. I make sure that I stay on top of it so that files do not get forgotten (set a specific desktop folder for content you want to keep/sync).</p><p>HTH</p></htmltext>
<tokenext>All my Macs in the house ( 5 ) mount their home directories from a central server .
This server was originally a Linux box and the mounting was done via NFS .
About a year ago I got fed up enough with Linux to convert that server to a Mac as well and it now shares all home directories via AFP . On the notebooks I have two accounts : one that mounts the home directory when I am home and in wireless range and another account that is a separate desktop for when I am not home .
I 'll occasionally rsync files from the non-central home directory to my server .
I make sure that I stay on top of it so that files do not get forgotten ( set a specific desktop folder for content you want to keep/sync ) .HTH</tokentext>
<sentencetext>All my Macs in the house (5) mount their home directories from a central server.
This server was originally a Linux box and the mounting was done via NFS.
About a year ago I got fed up enough with Linux to convert that server to a Mac as well and it now shares all home directories via AFP. On the notebooks I have two accounts: one that mounts the home directory when I am home and in wireless range and another account that is a separate desktop for when I am not home.
I'll occasionally rsync files from the non-central home directory to my server.
I make sure that I stay on top of it so that files do not get forgotten (set a specific desktop folder for content you want to keep/sync).HTH</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28457585</id>
	<title>Sync and backup options...</title>
	<author>Anonymous</author>
	<datestamp>1245877140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Recently open-sourced iFolder may serve your needs.  Does need server-side set-up, but you can do that on any old box you got laying around.  http://www.kablink.org/ifolder<br>Drag-and-drop file back-up, and limited versioning.  Works on my Mac, XP, Vista and Linux desktops/laptops.</p></htmltext>
<tokenext>Recently open-sourced iFolder may serve your needs .
Does need server-side set-up , but you can do that on any old box you got laying around .
http : //www.kablink.org/ifolder
Drag-and-drop file back-up , and limited versioning .
Works on my Mac , XP , Vista and Linux desktops/laptops .</tokentext>
<sentencetext>Recently open-sourced iFolder may serve your needs.
Does need server-side set-up, but you can do that on any old box you got laying around.
http://www.kablink.org/ifolder
Drag-and-drop file back-up, and limited versioning.
Works on my Mac, XP, Vista and Linux desktops/laptops.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450397</id>
	<title>Re:Windows - SyncBack</title>
	<author>Anonymous</author>
	<datestamp>1245838680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Can't go anywhere without your porn, can you, you pervert?</p></htmltext>
<tokenext>Ca n't go anywhere without your porn , can you , you pervert ?</tokentext>
<sentencetext>Can't go anywhere without your porn, can you, you pervert?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444045</id>
	<title>server and source control</title>
	<author>Is0m0rph</author>
	<datestamp>1245790440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have 1 server computer that stores all videos, photos, music, etc.  My other computers and Xbox 360 connect to that one.  For work related documents and source code the same server also runs Source Safe.</htmltext>
<tokenext>I have 1 server computer that stores all videos , photos , music , etc .
My other computers and Xbox 360 connect to that one .
For work related documents and source code the same server also runs Source Safe .</tokentext>
<sentencetext>I have 1 server computer that stores all videos, photos, music, etc.
My other computers and Xbox 360 connect to that one.
For work related documents and source code the same server also runs Source Safe.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447983</id>
	<title>a mix</title>
	<author>shish</author>
	<datestamp>1245766320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>~/art (stuff that I create and work on): git
<br>~/compsci (for the uni course, mostly code and latex): git
<br>~/fetches (random crap that gets downloaded to look at): not synced
<br>~/music (guess): rsync
<br>~/personal (various mostly text documents): git
<br>~/photos (photos organised into $date-$eventtitle/$photonum.jpg then left alone): rsync
<br>~/src (code I write): gets uploaded to FTP and the world backs it up for me :-)</htmltext>
<tokenext>~ /art ( stuff that I create and work on ) : git ~ /compsci ( for the uni course , mostly code and latex ) : git ~ /fetches ( random crap that gets downloaded to look at ) : not synced ~ /music ( guess ) : rsync ~ /personal ( various mostly text documents ) : git ~ /photos ( photos organised into $ date- $ eventtitle/ $ photonum.jpg then left alone ) : rsync ~ /src ( code I write ) : gets uploaded to FTP and the world backs it up for me : - )</tokentext>
<sentencetext>~/art (stuff that I create and work on): git
~/compsci (for the uni course, mostly code and latex): git
~/fetches (random crap that gets downloaded to look at): not synced
~/music (guess): rsync
~/personal (various mostly text documents): git
~/photos (photos organised into $date-$eventtitle/$photonum.jpg then left alone): rsync
~/src (code I write): gets uploaded to FTP and the world backs it up for me :-)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443679</id>
	<title>AllwaySync</title>
	<author>Ritorix</author>
	<datestamp>1245789240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>freeware @ <a href="http://allwaysync.com/" title="allwaysync.com" rel="nofollow">http://allwaysync.com/</a> [allwaysync.com]</p><p>I was playing with this for the first time last night; it gets the job done.  Sync software with a nice GUI, and I was easily able to back up my systems to a 1 TB backup drive in a reasonable time period.  It has the usual features and can sync in multiple directions (one to many, bidirectional, or one way).</p></htmltext>
<tokenext>freeware @ http : //allwaysync.com/ [ allwaysync.com ]
I was playing with this for the first time last night , it gets the job done .
Sync software with a nice GUI , and I was easily able to backup my systems to a 1tb backup drive in a reasonable time period .
It has the usual features and can sync in multiple directions ( one to many , bidirectional or one way ) .</tokentext>
<sentencetext>freeware @ http://allwaysync.com/ [allwaysync.com]
I was playing with this for the first time last night, it gets the job done.
Sync software with a nice GUI, and I was easily able to backup my systems to a 1tb backup drive in a reasonable time period.
It has the usual features and can sync in multiple directions (one to many, bidirectional or one way).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451513</id>
	<title>Re:rsync + OpenSolaris (ZFS) w/time slider</title>
	<author>DiSKiLLeR</author>
	<datestamp>1245853080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>This is what I do.</p><p>All my windows machines rsync C:\Users\myhomedir\ to my ZFS Solaris box. Snapshots. And other tasty salty goodness.</p><p>Couldn't live without ZFS+Rsync.</p></htmltext>
<tokenext>This is what I do .
All my windows machines rsync C : \ Users \ myhomedir \ to my ZFS Solaris box .
Snapshots .
And other tasty salty goodness .
Could n't live without ZFS + Rsync .</tokentext>
<sentencetext>This is what I do.
All my windows machines rsync C:\Users\myhomedir\ to my ZFS Solaris box.
Snapshots.
And other tasty salty goodness.
Couldn't live without ZFS+Rsync.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445899</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444519</id>
	<title>rsync + ln</title>
	<author>Anonymous</author>
	<datestamp>1245748860000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext><p>The bottom line is that no solution is going to do what you want perfectly. If you really want to tweak your backup/sync process (and learn a lot while doing so) then your best bet is to develop your own scripts in which rsync (and ln) are the real workhorses.</p><p>Personally I like doing synthetic backups, which are like incremental backups except that the unchanged files are hardlinks (NTFS actually supports hardlinks on MS systems). I don't like storing meta-information about my backups. For some directories like .mozilla I like to perform deletions; for others like Videos, I never want to delete anything, just free up space on particular drives.</p><p>Then when you have a nice script, just throw it in your crontab and you're good to go.</p><p>My script has matured a lot over the years, from a simple static procedure to a full-fledged program with config file parsing and very nice command line operations with getopts.</p></htmltext>
<tokenext>The bottom line is that no solution is going to do what you want perfectly .
If you really want to tweak your backup/sync process ( and learn a lot while doing so ) then your best bet is to develop your own scripts in which rsync ( and ln ) are the real workhorses .
Personally I like doing synthetic backups which are like incremental backups , but the unchanged files are hardlinks ( NTFS actually supports hardlinks on MS systems ) .
I do n't like storing meta-information about my backups , for some directories like .mozilla I like to perform deletions , however for others like Videos , I never want to delete anything , just free up space on particular drives .
Then when you have a nice script , just throw it in your crontab and you 're good to go .
My script has matured a lot over the years from a simple static procedure to a full-fledged program with config file parsing and very nice command line operations with getopts .</tokentext>
<sentencetext>The bottom line is that no solution is going to do what you want perfectly.
If you really want to tweak your backup/sync process (and learn a lot while doing so) then your best bet is to develop your own scripts in which rsync (and ln) are the real workhorses.
Personally I like doing synthetic backups which are like incremental backups, but the unchanged files are hardlinks (NTFS actually supports hardlinks on MS systems).
I don't like storing meta-information about my backups, for some directories like .mozilla I like to perform deletions, however for others like Videos, I never want to delete anything, just free up space on particular drives.
Then when you have a nice script, just throw it in your crontab and you're good to go.
My script has matured a lot over the years from a simple static procedure to a full-fledged program with config file parsing and very nice command line operations with getopts.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447993</id>
	<title>Windows Server, Offline Sync, and Exchange</title>
	<author>DavidD_CA</author>
	<datestamp>1245766440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I have a simple Windows Small Business Server 2003 running in my home on a very old Pentium III with 1 GB of RAM.  It's the primary storage for all my stuff, and my email/calendar on Exchange.</p><p>My desktop accesses it directly, but my laptop (with My Documents redirected to the server) is set to use Offline Files.</p><p>I don't have *everything* set to go offline -- my laptop HD isn't big enough for that.  But, key folders are synced by right-clicking and choosing "Available Offline".  It's as simple as that, and works flawlessly.</p><p>If I'm away and need a file that I don't have, I can connect back to the server via VPN and there it is.  Or, I can mark it for Offline sync and I'm done.</p><p>Everything is very transparent, and works great.</p><p>And for email/calendar, I do the same with Exchange and Remote Mail (plus my phone uses Windows Mobile).</p></htmltext>
<tokenext>I have a simple Windows Small Business Server 2003 running in my home on a very old Pentium III with 1 GB of RAM .
It 's the primary storage for all my stuff , and my email/calendar on Exchange .
My desktop accesses it directly , but my laptop ( with My Documents redirected to the server ) is set to use Offline Files .
I do n't have * everything * set to go offline -- my laptop HD is n't big enough for that .
But , key folders are synced by right-clicking and choosing " Available Offline " .
It 's as simple as that , and works flawlessly .
If I 'm away and need a file that I do n't have , I can connect back to the server via VPN and there it is .
Or , I can mark it for Offline sync and I 'm done .
Everything is very transparent , and works great .
And for email/calendar , I do the same with Exchange and Remote Mail ( plus my phone uses Windows Mobile ) .</tokentext>
<sentencetext>I have a simple Windows Small Business Server 2003 running in my home on a very old Pentium III with 1 GB of RAM.
It's the primary storage for all my stuff, and my email/calendar on Exchange.
My desktop accesses it directly, but my laptop (with My Documents redirected to the server) is set to use Offline Files.
I don't have *everything* set to go offline -- my laptop HD isn't big enough for that.
But, key folders are synced by right-clicking and choosing "Available Offline".
It's as simple as that, and works flawlessly.
If I'm away and need a file that I don't have, I can connect back to the server via VPN and there it is.
Or, I can mark it for Offline sync and I'm done.
Everything is very transparent, and works great.
And for email/calendar, I do the same with Exchange and Remote Mail (plus my phone uses Windows Mobile).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443989</id>
	<title>Subversion</title>
	<author>ChaoticCoyote</author>
	<datestamp>1245790260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>
I've used Subversion for years to sync my various systems. I have four different machines (2 Vista, 2 Linux) and 20GB of data that must be kept in sync.
</p><p>
Of course, there could be something much better out there. I'm just very comfortable with Subversion, and it works.
</p></htmltext>
<tokenext>I 've used Subversion for years to sync my various systems .
I have four different machines ( 2 Vista , 2 Linux ) and 20GB of data that must be kept in sync .
Of course , there could be something much better out there .
I 'm just very comfortable with Subversion , and it works .</tokentext>
<sentencetext>
I've used Subversion for years to sync my various systems.
I have four different machines (2 Vista, 2 Linux) and 20GB of data that must be kept in sync.
Of course, there could be something much better out there.
I'm just very comfortable with Subversion, and it works.
</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245748920000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>I don't believe that <a href="http://www.getdropbox.com/pricing" title="getdropbox.com" rel="nofollow">paying</a> [getdropbox.com] for a service that can just as easily be done with a NAS or even USB drive is a smart move.</htmltext>
<tokenext>I do n't believe that paying [ getdropbox.com ] for a service that can just as easily be done with a NAS or even USB drive is a smart move .</tokentext>
<sentencetext>I don't believe that paying [getdropbox.com] for a service that can just as easily be done with a NAS or even USB drive is a smart move.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447303</id>
	<title>Home Server</title>
	<author>eldridgea</author>
	<datestamp>1245761040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I realize this isn't exactly what you're looking for, but what I do is have a dedicated Debian server running Samba (for windows and *nix) and Netatalk (for my Mac).
<p>
I keep all of my important files and media files on the server; it also backs up my computers.
</p><p>
I can access it if I am not at home via SSH.</p><p>
I'm also going to add a cron job and an external HDD for offsite backup.</p></htmltext>
<tokenext>I realize this is n't exactly what you 're looking for , but what I do is have a dedicated Debian server running Samba ( for windows and * nix ) and Netatalk ( for my Mac ) .
I keep all of my important files and media files on the server , it also backs up my computers .
I can access it if I am not at home via SSH .
I 'm also going to add a cron job and an external HDD for offsite backup .</tokentext>
<sentencetext>I realize this isn't exactly what you're looking for, but what I do is have a dedicated Debian server running Samba (for windows and *nix) and Netatalk (for my Mac).
I keep all of my important files and media files on the server, it also backs up my computers.
I can access it if I am not at home via SSH.
I'm also going to add a cron job and an external HDD for offsite backup.</sentencetext>
</comment>
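The planned cron job for the offsite copy can be a single crontab entry. A sketch, where the share path, mount point, and schedule are all assumptions:

```
# m h  dom mon dow  command
# Nightly at 03:30: mirror the server's data share to the external drive.
30 3 * * *  rsync -a --delete /srv/share/ /mnt/external/share-mirror/
```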
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445633</id>
	<title>Terabyte synchronization and management</title>
	<author>crf00</author>
	<datestamp>1245752820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Seriously, the data management problem has made me paranoid about creating files and documents anywhere other than my main computer. I have over a terabyte of files to share across several computers with different OSes: Ubuntu, Windows XP, Windows XP x64, and Mac OS X. Currently the only effective way I have is to share the files through samba/windows file sharing over a gigabit local network. Gigabit ethernet gives me decent performance and the speed is almost like local access; I can even stream HD movies over the network.</p><p>Initially I used a dedicated Windows XP machine to host the files, but it was always painful when Windows restarted or crashed after opening too many applications, leaving my files temporarily unavailable. The previous release of Ubuntu had some mysterious bug that made samba sharing slow, and I wanted to host my files on a cross-platform filesystem with read/write access from different OSes, so NTFS seemed to be the only option.</p><p>Recently I migrated my files to be hosted on Ubuntu using ext4. The samba bug no longer appears in Intrepid, and the performance has been satisfactory. There is still some other bug that causes frequent disconnects/reconnects over the network, but I can bear it for now. The other drawback of hosting on Ubuntu is that I can no longer hot-plug my hard drives into XP and Mac OS X.</p><p>Currently the only real annoyance is on my Macbook, where I don't have access to the files on the local network when I'm not at home. I am especially paranoid about creating any file on the laptop, fearing the management overhead of backing up or moving the data into the appropriate network folders. I still have no good solution for managing the photos I store on the Macbook and the desktop. Currently the photos are stored in separate Lightroom catalogs. Even though I do back up the photos, I cannot do any write operation on the backup copy on my desktop, as that would destroy its integrity with the original catalog on my Macbook. My previous photo management tool, Aperture, was even worse: I couldn't back up the catalog to the network or an NTFS partition at all, as that destroyed the chmod permissions and rendered the whole catalog unusable.</p><p>I am currently developing a web application that can also be used as a home/local web server to manage and synchronize specific file formats as efficiently as the web service itself. For example, imagine a Flickr-like web app that is installed on both your laptop and desktop and is not only able to smart-synchronize all photos, but also automatically publish selected photos to Flickr, Facebook, etc., powered by the very same program that powers those websites.</p><p>Please tell me if anyone has the same experience or a better solution for terabyte file management.</p></htmltext>
<tokenext>Seriously , the data management problem has caused me to become paranoid about creating files and documents anywhere other than my main computer .
I have over a terabyte of files to share over several computers with different OS : Ubuntu , Windows XP , Windows XP x64 , and Mac OS X .
Currently the only effective way I have is to share the files through samba/windows file sharing over a gigabit local network .
Gigabit ethernet has given me decent performance and the speed is almost like local access , I can even stream HD movies over the network .
Initially I used a dedicated Windows XP machine to host the files .
It was always painful when Windows restarted or crashed after opening too many applications and my files would be temporarily unavailable .
The previous release of Ubuntu had some mysterious bug that made the speed of samba sharing slow , and I wanted to host my files using a cross-platform filesystem that has read/write access over different OS , and NTFS seemed to be the only option .
Recently I have migrated my files to be hosted on Ubuntu using ext4 .
The samba bug no longer appeared in Intrepid , and the performance has been satisfactory .
There is still some other bug that causes frequent disconnects/reconnects over the network , but I can bear it for now .
The other drawback of hosting in Ubuntu is that I can no longer hot plug my hard drives to XP and Mac OS X .
Currently the only annoyance I have is on my Macbook , where I wo n't have access to the files on the local network when I 'm not at home .
I am especially paranoid about creating any file on the laptop , fearing the management overhead of backing up or moving the data to appropriate network folders .
I still have no good solution to manage the photos I store in the macbook and the desktop .
Currently the photos are stored in separate Lightroom catalogs .
Even though I do back up the photos , I can not do any write operation on the backup copy on my desktop , as that would destroy the integrity with the original catalog on my Macbook .
My previous photo management tool , Aperture , was even worse : I could not do any backup of the catalog to the network or an NTFS partition , as that destroys the chmod permissions and renders the whole catalog unusable .
I am currently developing a web application that can also be used as a home/local web server to manage and synchronize specific file formats as efficiently as the web service itself .
For example , imagine a Flickr-like web app , installed on both your laptop and desktop , that is not only able to smart-synchronize all photos , but also automatically publish selected photos to Flickr , Facebook etc , powered by the very same program that powers these websites .
Please tell me if anyone has the same experience or a better solution for terabyte file management .</tokentext>
<sentencetext>Seriously, the data management problem has caused me to become paranoid about creating files and documents anywhere other than my main computer.
I have over a terabyte of files to share over several computers with different OS: Ubuntu, Windows XP, Windows XP x64, and Mac OS X.
Currently the only effective way I have is to share the files through samba/windows file sharing over a gigabit local network.
Gigabit ethernet has given me decent performance and the speed is almost like local access, I can even stream HD movies over the network.
Initially I used a dedicated Windows XP machine to host the files.
It was always painful when Windows restarted or crashed after opening too many applications and my files would be temporarily unavailable.
The previous release of Ubuntu had some mysterious bug that made the speed of samba sharing slow, and I wanted to host my files using a cross-platform filesystem that has read/write access over different OS, and NTFS seemed to be the only option.
Recently I have migrated my files to be hosted on Ubuntu using ext4.
The samba bug no longer appeared in Intrepid, and the performance has been satisfactory.
There is still some other bug that causes frequent disconnects/reconnects over the network, but I can bear it for now.
The other drawback of hosting in Ubuntu is that I can no longer hot plug my hard drives to XP and Mac OS X.
Currently the only annoyance I have is on my Macbook, where I won't have access to the files on the local network when I'm not at home.
I am especially paranoid about creating any file on the laptop, fearing the management overhead of backing up or moving the data to appropriate network folders.
I still have no good solution to manage the photos I store in the macbook and the desktop.
Currently the photos are stored in separate Lightroom catalogs.
Even though I do back up the photos, I cannot do any write operation on the backup copy on my desktop, as that would destroy the integrity with the original catalog on my Macbook.
My previous photo management tool, Aperture, was even worse: I could not do any backup of the catalog to the network or an NTFS partition, as that destroys the chmod permissions and renders the whole catalog unusable.
I am currently developing a web application that can also be used as a home/local web server to manage and synchronize specific file formats as efficiently as the web service itself.
For example, imagine a Flickr-like web app, installed on both your laptop and desktop, that is not only able to smart-synchronize all photos, but also automatically publish selected photos to Flickr, Facebook etc, powered by the very same program that powers these websites.
Please tell me if anyone has the same experience or a better solution for terabyte file management.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444247</id>
	<title>unison/sshfs/rsync</title>
	<author>Cocoronixx</author>
	<datestamp>1245747840000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>My current setup is a combo of unison/sshfs/rsync. I've been using it for quite a while at this point, and it works so well that I don't even give it any thought anymore.</p><p>I have a Media Center/Fileserver box at my house that is always on and acts as the 'master' copy of the home directory. On all my workstations, my .xinitrc/.xsession calls unison to sync my home dir with the server root as the preferred copy, then calls my WM; after my WM exits, unison is called again with the local homedir as the authoritative copy.  A well-crafted ignore list is crucial, ignoring things like temporary file patterns, mozilla cache, machine-specific data, obscenely large files, etc.</p><p>I use sshfs-fuse for any of the above-mentioned ignored files; sshfs allows you to mount remote filesystems locally, tunneling through ssh (or something like that).</p><p>Finally, rsync can be used to give Time Machine-like backups.  In my case, I back up to a USB drive connected to the fileserver, which I only turn on when running backups.</p><p>This is for the most part Linux-only, but unison has a Windows binary available, and could probably be used to sync to Windows automagically.</p></htmltext>
<tokenext>My current setup is a combo of unison/sshfs/rsync .
I 've been using it for quite a while at this point , and it works so well that I do n't even give it any thought anymore .
I have a Media Center/Fileserver box at my house that is always on , and acts as the 'master ' copy of the home directory .
On all my workstations my .xinitrc/.xsession calls unison to sync my home dir with the server root as the preferred copy , then calls my WM ; after my WM exits , unison is called again with the local homedir as the authoritative copy .
A well-crafted ignore list is crucial , ignoring things like temporary file patterns , mozilla cache , machine-specific data , obscenely large files , etc .
I use sshfs-fuse for any of the above mentioned ignored files ; sshfs allows you to mount remote filesystems locally , tunneling through ssh ( or something like that ) .
Finally , rsync can be used to give Time Machine-like backups .
In my case , I back up to a USB drive connected to the fileserver , which I only turn on when running backups .
This is for the most part Linux-only , but unison has a windows binary available , and could probably be used to sync to windows automagically .</tokentext>
<sentencetext>My current setup is a combo of unison/sshfs/rsync.
I've been using it for quite a while at this point, and it works so well that I don't even give it any thought anymore.
I have a Media Center/Fileserver box at my house that is always on, and acts as the 'master' copy of the home directory.
On all my workstations my .xinitrc/.xsession calls unison to sync my home dir with the server root as the preferred copy, then calls my WM; after my WM exits, unison is called again with the local homedir as the authoritative copy.
A well-crafted ignore list is crucial, ignoring things like temporary file patterns, mozilla cache, machine-specific data, obscenely large files, etc.
I use sshfs-fuse for any of the above mentioned ignored files; sshfs allows you to mount remote filesystems locally, tunneling through ssh (or something like that).
Finally, rsync can be used to give Time Machine-like backups.
In my case, I back up to a USB drive connected to the fileserver, which I only turn on when running backups.
This is for the most part Linux-only, but unison has a windows binary available, and could probably be used to sync to windows automagically.</sentencetext>
</comment>
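The "well-crafted ignore list" described above lives in a Unison profile (a .prf file under ~/.unison). A sketch of what such a profile might look like; the hostname and paths are illustrative:

```
# ~/.unison/home.prf -- example profile; roots and patterns are illustrative
root = /home/me
root = ssh://fileserver//home/me

# skip temporary, cache, and machine-specific data
ignore = Name *.tmp
ignore = Path .cache
ignore = Path .mozilla/firefox/*/Cache
ignore = BelowPath .local/share/Trash

# run unattended (suitable for .xinitrc); on conflict, prefer the server copy
batch = true
prefer = ssh://fileserver//home/me
```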
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444745</id>
	<title>A few simple apps:</title>
	<author>Anonymous</author>
	<datestamp>1245749640000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>- Windows Live Sync (I know, it's Microsoft -- it wasn't when I started using it! It was called FolderShare, and after some hiccups, the re-christened product works just as well) keeps my docs in sync<br>- Mozy keeps my archives backed up<br>- IMAP on Google Apps for your domain keeps my mail in sync<br>- Google Calendar with various, platform-specific syncing solutions<br>- Plaxo (blech, but it works) keeps my contacts in sync<br>- Foxmarks keeps my bookmarks on each computer<br>- iTunes sharing + AppleTV + AirTunes + two iPhones (one for me, one for my wife) works well enough for sharing media seamlessly</p><p>I have two Windows boxes (desktop for work, and a mini laptop) and two Macs (Mac Mini media "server" and a MacBook Pro) -- all are in sync with each other all the time...</p></htmltext>
<tokenext>- Windows Live Sync ( I know , it 's Microsoft -- it was n't when I started using it !
It was called FolderShare , and after some hiccups , the re-christened product works just as well ) keeps my docs in sync
- Mozy keeps my archives backed up
- IMAP on Google Apps for your domain keeps my mail in sync
- Google Calendar with various , platform-specific syncing solutions
- Plaxo ( blech , but it works ) keeps my contacts in sync
- Foxmarks keeps my bookmarks on each computer
- iTunes sharing + AppleTV + AirTunes + two iPhones ( one for me , one for my wife ) works well enough for sharing media seamlessly
I have two Windows boxes ( desktop for work , and a mini laptop ) and two Macs ( Mac Mini media " server " and a MacBook Pro ) -- all are in sync with each other all the time ...</tokentext>
<sentencetext>- Windows Live Sync (I know, it's Microsoft -- it wasn't when I started using it!
It was called FolderShare, and after some hiccups, the re-christened product works just as well) keeps my docs in sync
- Mozy keeps my archives backed up
- IMAP on Google Apps for your domain keeps my mail in sync
- Google Calendar with various, platform-specific syncing solutions
- Plaxo (blech, but it works) keeps my contacts in sync
- Foxmarks keeps my bookmarks on each computer
- iTunes sharing + AppleTV + AirTunes + two iPhones (one for me, one for my wife) works well enough for sharing media seamlessly
I have two Windows boxes (desktop for work, and a mini laptop) and two Macs (Mac Mini media "server" and a MacBook Pro) -- all are in sync with each other all the time...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444063</id>
	<title>CVS</title>
	<author>Anonymous</author>
	<datestamp>1245790500000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I've been using CVS for over a decade now.  Generally with small files like<nobr> <wbr></nobr>.cshrc,<nobr> <wbr></nobr>.emacs, etc.</p></htmltext>
<tokenext>I 've been using CVS for over a decade now .
Generally with small files like .cshrc , .emacs , etc .</tokentext>
<sentencetext>I've been using CVS for over a decade now.
Generally with small files like .cshrc, .emacs, etc.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448305</id>
	<title>FTP mirrors</title>
	<author>Anonymous</author>
	<datestamp>1245770040000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>I put my important files on <a href="ftp://ftp.kernel.org/" title="kernel.org" rel="nofollow">ftp://ftp.kernel.org/</a> [kernel.org] and expect the rest of the world to mirror it.</htmltext>
<tokenext>I put my important files on ftp : //ftp.kernel.org/ [ kernel.org ] and expect the rest of the world to mirror it .</tokentext>
<sentencetext>I put my important files on ftp://ftp.kernel.org/ [kernel.org] and expect the rest of the world to mirror it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444439</id>
	<title>RSYNC</title>
	<author>scorp1us</author>
	<datestamp>1245748560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd like to give a shout-out to the Samba team and creators of <a href="http://rsync.samba.org/" title="samba.org">rsync</a> [samba.org]. Version 3 and later is what you want. It has many goodies that make backing up a joy, including an off-line incremental backup feature that I find really handy.</p><p>Still, I'm looking for a good graphical rsync that does not use cygwin on Windows though.</p></htmltext>
<tokenext>I 'd like to give a shout-out to the Samba team and creators of RSync [ slashdot.org ] .
Version 3 and later is what you want .
It has many goodies that make backing up a joy , incleding an off-line incremental backup feature that I find really handy.Still , I 'm looking for a good graphical rsync that does not use cygwin on windows though .</tokentext>
<sentencetext>I'd like to give a shout-out to the Samba team and creators of RSync [slashdot.org].
Version 3 and later is what you want.
It has many goodies that make backing up a joy, including an off-line incremental backup feature that I find really handy.
Still, I'm looking for a good graphical rsync that does not use cygwin on Windows though.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448467</id>
	<title>Re:USB drive</title>
	<author>Anonymous</author>
	<datestamp>1245772020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>This is almost exactly what I do; however, I use a 2.5" 120GB hdd in a small USB drive enclosure. It has an EXT3 and a FAT32 partition. I still plan to encrypt it some day. In the meantime I guard it with my life, you know, like my wallet. To back up the data on the drive I have a shell script I wrote that creates a tgz backup on each of my systems. Each backup file is named with its date, so today's would be mobile20090623.tgz. About once a month I delete the oldest backups. The USB drive is not the first I've had; I've been through many over the years. With each of my systems containing backups I have no reason to worry about the USB drive dying. I'm less than a day behind at any time.</p></htmltext>
<tokenext>This is almost exactly what I do however I use a 2.5 " 120GB hdd in a small USB drive enclosure .
It has a EXT3 and a FAT32 partition .
I still plan to encrypt it some day .
In the mean time I guard it with my life , you know , like my wallet .
To backup the data on the drive I have a shell script I wrote that creates tgz backup to each of my systems .
Each backup file named with its date so todays will be mobile20090623.tgz .
About once a month I delete the oldest backups .
The USB drive is not the first I 've had .
I been through many through the years .
With each of the my systems containing backups I have no reason to worry about the usb drive dieing .
I 'm less then a day behind at any time .</tokentext>
<sentencetext>This is almost exactly what I do; however, I use a 2.5" 120GB hdd in a small USB drive enclosure.
It has an EXT3 and a FAT32 partition.
I still plan to encrypt it some day.
In the meantime I guard it with my life, you know, like my wallet.
To back up the data on the drive I have a shell script I wrote that creates a tgz backup on each of my systems.
Each backup file is named with its date, so today's would be mobile20090623.tgz.
About once a month I delete the oldest backups.
The USB drive is not the first I've had.
I've been through many over the years.
With each of my systems containing backups I have no reason to worry about the USB drive dying.
I'm less than a day behind at any time.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649</parent>
</comment>
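The dated-archive scheme described above can be sketched as a short shell script. The paths and the `mobile` prefix are assumptions taken from the comment's example filename (mobile20090623.tgz), not the commenter's actual script:

```shell
# Dated tgz backup with simple monthly pruning -- a sketch of the
# approach above; SRC/DEST paths and the "mobile" prefix are assumptions.
SRC=./data            # directory to back up
DEST=./backups        # where dated archives accumulate
mkdir -p "$SRC" "$DEST"
echo "example file" > "$SRC/notes.txt"   # demo content so the sketch runs standalone

# Produces e.g. backups/mobile20090623.tgz
tar czf "$DEST/mobile$(date +%Y%m%d).tgz" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# "About once a month I delete the oldest backups":
# drop archives older than 31 days.
find "$DEST" -name 'mobile*.tgz' -mtime +31 -delete
```

Copying the same script to each machine gives the multiple-backup redundancy the commenter relies on.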
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445647</id>
	<title>It's called Windows</title>
	<author>mozzis</author>
	<datestamp>1245752880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Yes, Linux is way primitive when it comes to synchronization technology, as in other areas. Windows has a history of providing more-or-less effective solutions for this out of the box, going back at least to the My Briefcase feature of Windows 95. Now there are Offline Files, Live Sync, Sync Center, and even a more consumer-level gadget called SyncToy. Another example of how much time you can waste trying to use Linux for serious business use.</htmltext>
<tokenext>Yes , Linux is way primitive when it comes to synchronization technology , as in other areas .
Windows has a history of providing more-or-less effective solutions for this out of the box , going back at least to the My Briefcase feature of Windows 95 .
Now there are Offline Files , Live Sync , Sync Center , and even a more consumer-level gadget called Sync Toy .
Another example of how much time you can waste trying to use Linux for serious business use .</tokentext>
<sentencetext>Yes, Linux is way primitive when it comes to synchronization technology, as in other areas.
Windows has a history of providing more-or-less effective solutions for this out of the box, going back at least to the My Briefcase feature of Windows 95.
Now there are Offline Files, Live Sync, Sync Center, and even a more consumer-level gadget called Sync Toy.
Another example of how much time you can waste trying to use Linux for serious business use.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450493</id>
	<title>rsync...</title>
	<author>pointbeing</author>
	<datestamp>1245840300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's probably been said lotsa times in this thread but I use rsync to back up stuff to an external drive.  I back up<nobr> <wbr></nobr>/home,<nobr> <wbr></nobr>/etc,<nobr> <wbr></nobr>/var/cache/apt/archives and the My Documents folder on the spousal unit's Windows machine.</p></htmltext>
<tokenext>It 's probably been said lotsa times in this thread but I use rsync to back up stuff to an external drive .
I back up /home , /etc , /var/cache/apt/archives and the My Documents folder on the spousal unit 's Windows machine .</tokentext>
<sentencetext>It's probably been said lotsa times in this thread but I use rsync to back up stuff to an external drive.
I back up /home, /etc, /var/cache/apt/archives and the My Documents folder on the spousal unit's Windows machine.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443999</id>
	<title>Webdav, SVN, etc</title>
	<author>fermion</author>
	<datestamp>1245790260000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext>For smaller files, I keep everything controlled using SVN. That is code, office type files, that sort of thing. I have a BASH script that pretty reliably works to commit, add, and update.  Every once in a while I have to go in and manually fix something. I suppose I could put a cron entry in to make it automatic, but it is just as easy to go to the shell and update everything.  Setting up the server was no issue, and it is an offsite backup.
<p>
For items that are larger, or that do not change so often, I use iDisk.  This is just a fancy Webdav server that I do not have to manage.
</p><p>
I keep programs on an external hard disk.  This is where I also keep my photo library and music and videos.  I use one machine for Photos, so I do not really have anything to sync there.  My music is not synced either, but I have used some third-party software to help with that.
</p><p>
It is getting to the point where if something goes wrong with a machine, I can have a new one set up with all my data in a day.  In normal circumstances, I can use any one of three machines and pretty much have up-to-date information.</p></htmltext>
<tokenext>For smaller files , I keep everything controlled using SVN .
That is code , office type files , that sort of thing .
I have a BASH script that pretty reliably works to commit , add , and update .
Everyone once in a while I have to go in and manually fix something .
I suppose I could put a chron entry in to make it automatic , but it is just as easy to go to the shell and update everything .
Setting up the server was no issue , and it is an offsite backup .
For items that are larger , or that do not change so often , I use iDisk .
This is just a fancy Webdav server that I do not have to manage .
I keep programs on an external hard disk .
This is where I also keep my photo library and music and videos .
I use one machine for Photos , so I do not really have anything to sync there .
My music is not synced either , but I have used some third party software to hel with that .
It is getting to the point where if something goes wrong with a machine , I can have new one set up will all my data in a day .
In normal circumstances , I can use any one of three machines and prety much have up to date information .</tokentext>
<sentencetext>For smaller files, I keep everything controlled using SVN.
That is code, office type files, that sort of thing.
I have a BASH script that pretty reliably works to commit, add, and update.
Every once in a while I have to go in and manually fix something.
I suppose I could put a cron entry in to make it automatic, but it is just as easy to go to the shell and update everything.
Setting up the server was no issue, and it is an offsite backup.
For items that are larger, or that do not change so often, I use iDisk.
This is just a fancy Webdav server that I do not have to manage.
I keep programs on an external hard disk.
This is where I also keep my photo library and music and videos.
I use one machine for Photos, so I do not really have anything to sync there.
My music is not synced either, but I have used some third-party software to help with that.
It is getting to the point where if something goes wrong with a machine, I can have a new one set up with all my data in a day.
In normal circumstances, I can use any one of three machines and pretty much have up-to-date information.</sentencetext>
</comment>
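fermion's actual BASH script isn't shown, but a guess at its shape is a small wrapper that adds, commits, and updates in one shot (the script name and commit-message format below are illustrative assumptions):

```shell
# Write a small auto-commit helper for an SVN working copy -- a sketch
# of the kind of script described above, not the commenter's real one.
cat > svn-autocommit.sh <<'EOF'
#!/bin/sh
set -e
cd "${1:?usage: svn-autocommit.sh WORKING_COPY}"
svn add --force . > /dev/null                # schedule any new files
svn commit -m "auto-commit $(date +%F)"      # push local changes
svn update                                   # pull changes from the server
EOF
chmod +x svn-autocommit.sh
```

Running it from cron would automate the whole cycle, at the cost of occasionally fixing a conflict by hand, as the comment notes.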
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444179</id>
	<title>del.icio.us, TrueCrypt, USB stick, and iPod</title>
	<author>cavemanf16</author>
	<datestamp>1245747600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>del.icio.us has plugins for IE, Firefox, and there is a less useful third-party plugin for Safari which make it easy to "sync" your bookmarks across computers, so I use that for my browser synchronization.<br>TrueCrypt keeps my really important data (passwords, resume, other sensitive personally identifiable info) safe and secure on my USB drive.<br>My USB stick on my keychain holds a copy of TrueCrypt to boot from directly when you plug it into a USB port (you need admin authority on the computer you're using to use this feature though), and then some other miscellaneous documents I wouldn't want to lose but aren't sensitive sit on my USB stick in generic folders.<br>And lastly, my iPod holds a copy of all of the music I care to not lose. (My wife and I also have a 750GB backup drive attached to our iMac at home to keep all of our media files, like photos and video, backed up)</p><p>Everything else is either done "in the cloud" online for us, or is proprietary or sensitive data that shouldn't be getting moved off of the primary computer it is on anyway.</p></htmltext>
<tokenext>del.icio.us has plugins for IE , Firefox , and there is a less useful third-party plugin for Safari which make it easy to " sync " your bookmarks across computers , so I use that for my browser synchronization.TrueCrypt keeps my really important data ( passwords , resume , other sensitive personally identifiable info ) safe and secure on my USB drive.My USB stick on my keychain holds a copy of TrueCrypt to boot from directly when you plug it into a USB port ( you need admin authority on the computer you 're using to use this feature though ) , and then some other miscellaneous documents I would n't want to lose but are n't sensitive sit on my USB stick in generic folders.And lastly , my iPod holds a copy of all of the music I care to not lose .
( My wife and I also have a 750GB backup drive attached to our iMac at home to keep all of our media files , like photos and video , backed up ) Everything else is either done " in the cloud " online for us , or is proprietary or sensitive data that should n't be getting moved off of the primary computer it is on anyway .</tokentext>
<sentencetext>del.icio.us has plugins for IE, Firefox, and there is a less useful third-party plugin for Safari which make it easy to "sync" your bookmarks across computers, so I use that for my browser synchronization.TrueCrypt keeps my really important data (passwords, resume, other sensitive personally identifiable info) safe and secure on my USB drive.My USB stick on my keychain holds a copy of TrueCrypt to boot from directly when you plug it into a USB port (you need admin authority on the computer you're using to use this feature though), and then some other miscellaneous documents I wouldn't want to lose but aren't sensitive sit on my USB stick in generic folders.And lastly, my iPod holds a copy of all of the music I care to not lose.
(My wife and I also have a 750GB backup drive attached to our iMac at home to keep all of our media files, like photos and video, backed up)Everything else is either done "in the cloud" online for us, or is proprietary or sensitive data that shouldn't be getting moved off of the primary computer it is on anyway.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444769</id>
	<title>Re:Dropbox</title>
	<author>snl2587</author>
	<datestamp>1245749640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>So I take it your external storage equipment "fell off the back of a truck"?</htmltext>
<tokenext>So I take it your external storage equipment " fell off the back of a truck " ?</tokentext>
<sentencetext>So I take it your external storage equipment "fell off the back of a truck"?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451283</id>
	<title>iFolder</title>
	<author>Petaris</author>
	<datestamp>1245851400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Novell has a product called iFolder that works well.  A few years back I believe they open-sourced it and made it free.  Looking now, this is the link I find to it.  <a href="http://www.kablink.org/ifolder" title="kablink.org">http://www.kablink.org/ifolder</a> [kablink.org]

Might be worth looking at, though you will have to run a server for the backend service.</htmltext>
<tokenext>Novell has a product called iFolder that works well .
A few years back I believe they open sourced it and made it free .
Looking now this is the link I find to it .
http : //www.kablink.org/ifolder [ kablink.org ] Might be worth looking at , though you will have to run a server for the backend service .</tokentext>
<sentencetext>Novell has a product called iFolder that works well.
A few years back I believe they open sourced it and made it free.
Looking now this is the link I find to it.
http://www.kablink.org/ifolder [kablink.org]

Might be worth looking at, though you will have to run a server for the backend service.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447763</id>
	<title>amazon s3 and jungledisk</title>
	<author>jackdaw</author>
	<datestamp>1245764580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>15 cents per gig per month, amazon s3; jungledisk for win, linux, mac, $20.00. Maybe not for video, but docs, pics, etc. Cheap and automatic.</p></htmltext>
<tokenext>15 cents per gig per month , amazon s3 ; jungledisk for win , linux,mac , $ 20.00 .
maybe not for video , but docs , pics , etc...cheap and automatic .</tokentext>
<sentencetext> 15 cents per gig per month, amazon s3; jungledisk for win, linux,mac, $20.00.
maybe not for video, but docs, pics, etc...cheap and automatic.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444459</id>
	<title>Linux + Samba</title>
	<author>sanosuke001</author>
	<datestamp>1245748620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I set up another machine with 6x 1TB HDDs, mounted them, enabled them in samba, and connect to them from any of my other machines. If you need redundancy, set up RAID, or svn/cvs for code revisioning.<br> <br>

Maybe I misunderstand the question but this seems pretty straightforward for those that might frequent this site. Did the option, "file and repository server" seem too obvious? Why use an online tool or some special software to share between systems? Why store the same data on multiple machines? Just mount a shared drive and read/write to/from it.</htmltext>
<tokenext>I set up another machine with 6x 1TB HDDs , mounted them , enabled them in samba , connect to them on any of my other machines .
If you need redundancy , set up RAID or svn/cvs for code revisioning .
Maybe I misunderstand the question but this seems pretty straightforward for those that might frequent this site .
Did the option , " file and repository server " seem too obvious ?
Why use an online tool or some special software to share between systems ?
Why store the same data on multiple machines ?
Just mount a shared drive and read/write to/from it .</tokentext>
<sentencetext>I set up another machine with 6x 1TB HDDs, mounted them, enabled them in samba, connect to them on any of my other machines.
If you need redundancy, set up RAID or svn/cvs for code revisioning.
Maybe I misunderstand the question but this seems pretty straightforward for those that might frequent this site.
Did the option, "file and repository server" seem too obvious?
Why use an online tool or some special software to share between systems?
Why store the same data on multiple machines?
Just mount a shared drive and read/write to/from it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444189</id>
	<title>SSHFS</title>
	<author>WheelDweller</author>
	<datestamp>1245747660000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext><p>Look into sshfs.  Keep your home machine ssh-reachable (isn't it already?) and you'll be able to ssh into it, mounting the filesystem on your desktop.  It's convenient, secure, and effective. Works anywhere ssh does. Good stuff!</p></htmltext>
<tokenext>Look into sshfs .
Keep your home machine ssh-reachable ( is n't it already ?
) and you 'll be able to ssh into it , mounting the filesystem on your desktop .
It 's convenient , secure , and effective .
Works anywhere ssh does .
Good stuff !</tokentext>
<sentencetext>Look into sshfs.
Keep your home machine ssh-reachable (isn't it already?
) and you'll be able to ssh into it, mounting the filesystem on your desktop.
It's convenient, secure, and effective.
Works anywhere ssh does.
Good stuff!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533</parent>
</comment>
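In practice the sshfs setup above is a one-line mount. The hostname, user, and mount point in this helper-script sketch are placeholders, not the commenter's setup:

```shell
# Mount a remote home directory locally via sshfs, and unmount it again.
# "user@homebox" and the mount point are illustrative placeholders.
cat > mount-home.sh <<'EOF'
#!/bin/sh
set -e
MNT="$HOME/remote-home"
mkdir -p "$MNT"
sshfs user@homebox:/home/user "$MNT"   # everything under $MNT is now remote
EOF
cat > umount-home.sh <<'EOF'
#!/bin/sh
fusermount -u "$HOME/remote-home"      # detach when finished
EOF
chmod +x mount-home.sh umount-home.sh
```

Because the files stay on the remote machine, nothing needs syncing at all; the trade-off is that you need connectivity to reach them.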
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</id>
	<title>Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245788580000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><p>I recently started playing around with Dropbox for a few folders smaller than my <em>entire</em> home directory and haven't yet run into any major problems. The versioning it provides is nice as well, and as a plus, the deleted files that they still retain versions of don't count toward the quota.</p></htmltext>
<tokenext>I recently started playing around with Dropbox for some smaller folders than my entire home directory and have n't yet run into any major problems .
And the versioning it provides is nice as well , and as a plus they do n't consider the deleted files that they still retain versions of as part of the quota .</tokentext>
<sentencetext>I recently started playing around with Dropbox for some smaller folders than my entire home directory and haven't yet run into any major problems.
And the versioning it provides is nice as well, and as a plus they don't consider the deleted files that they still retain versions of as part of the quota.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477615</id>
	<title>Re:Windows - SyncBack</title>
	<author>Anonymous</author>
	<datestamp>1245952680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Here is my method: Acronis, SyncBack, and TrueCrypt.</p><p>http://texturedstatic.blogspot.com/2009/02/one-approach-to-backup-your-computer.html</p></htmltext>
<tokenext>Here is my method Acronis , SyncBack and TrueCrypt.http : //texturedstatic.blogspot.com/2009/02/one-approach-to-backup-your-computer.html</tokentext>
<sentencetext>Here is my method Acronis, SyncBack and TrueCrypt.http://texturedstatic.blogspot.com/2009/02/one-approach-to-backup-your-computer.html</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755</id>
	<title>Windows - SyncBack</title>
	<author>Anonymous</author>
	<datestamp>1245789480000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><p>I spent a long time tackling this, as I am situated at different locations on different days.</p><p>I have 2 desktops and a laptop which must remain sync'd and encrypted. I use TrueCrypt for the encryption.</p><p>On my Windows boxes - SyncBack handles it. It can be triggered on write or on insertion, or just periodically. Has version control support. Will sync over FTP (poorly) and can create zip files or burn CDs, etc. It's a Swiss Army knife of sync tools.</p><p>The key for getting the most out of a sync program is granularity. Inevitably, you'll have exceptions, and you don't want a PASS/FAIL result for your entire backup set. It works much better to sort files into categories and sync the individual groups than to try to make one profile that does your entire disk array. My 2 cents.</p></htmltext>
<tokenext>I spent a long time tackling this , as I am situated at different locations on different days.I have 2 desktops and a laptop which must remain sync 'd and encrypted .
I use TrueCrypt for the encryption.On my Windows boxes - SyncBack handles it .
It can be triggered on write or on insertion , or just periodically .
Has version control support .
Will sync over FTP ( poorly ) and can create zip files or burn Cds etc .
It 's a swiss army knife of sync tools.The key for getting the most out of a sync program is granularity .
Inevitably , you 'll have exceptions , and you do n't want a PASS/FAIL result for your entire backup set .
It works much better to sort files into categories and sync the individual groups than to try to make one profile that does your entire disk array .
My 2 cents .</tokentext>
<sentencetext>I spent a long time tackling this, as I am situated at different locations on different days.I have 2 desktops and a laptop which must remain sync'd and encrypted.
I use TrueCrypt for the encryption.On my Windows boxes - SyncBack handles it.
It can be triggered on write or on insertion, or just periodically.
Has version control support.
Will sync over FTP (poorly) and can create zip files or burn Cds etc.
It's a swiss army knife of sync tools.The key for getting the most out of a sync program is granularity.
Inevitably, you'll have exceptions, and you don't want a PASS/FAIL result for your entire backup set.
It works much better to sort files into categories and sync the individual groups than to try to make one profile that does your entire disk array.
My 2 cents.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444693</id>
	<title>Unison and git inside it</title>
	<author>fph il quozientatore</author>
	<datestamp>1245749460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Call me paranoid, but I don't trust sending my entire home directory somewhere over the network with a dubious encryption protocol. I use unison with my usb stick as the root of a "star" sync topology, and if I need versioning for some project I create a git repo inside the unison'd directory. Works fine for me.</htmltext>
<tokenext>Call me a paranoid , but I do n't trust sending my entire home directory somewhere over the network with a dubious encryption protocol .
I use unison with my usb stick as the root of a " star " sync topology , and if I need versioning for some project I create a git repo inside the unison 'd directory .
Works fine for me .</tokentext>
<sentencetext>Call me a paranoid, but I don't trust sending my entire home directory somewhere over the network with a dubious encryption protocol.
I use unison with my usb stick as the root of a "star" sync topology, and if I need versioning for some project I create a git repo inside the unison'd directory.
Works fine for me.</sentencetext>
</comment>
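A unison "star" topology like this is typically driven by a small profile on each machine; the profile name and paths below are illustrative, not the commenter's actual setup:

```
# ~/.unison/usbstick.prf -- illustrative example
root = /home/user
root = /media/usbstick/home
path = Documents
path = projects
path = .bashrc
batch = true      # run non-interactively; conflicts are skipped for manual review
```

Running `unison usbstick` on each machine against the same stick propagates changes through the hub, and any git repo inside a synced `path` travels along with it.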
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446393</id>
	<title>Rsync and rdiff-backup</title>
	<author>Anonymous</author>
	<datestamp>1245756120000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>1</modscore>
	<htmltext><p>I use cygwin rsync (via vixie's cron) to sync files from my various Windows clients to my Linux backup host. On my Linux backup host I use rdiff-backup (again as a cron job) to backup / sync to an external USB drive. I tried gibak to externally version my $HOMEs (.git as a symlink to an external USB drive filesystem) but had problems; git was not built to manage 20GB of content. Anecdotally, git had problems processing nested<nobr> <wbr></nobr>.git repos and large files. Though git was a poor choice for versioning my $HOMEs, it proved to be a great choice for versioning my<nobr> <wbr></nobr>/etc (again to an external repo via a symlink). For this, I used metastore and etckeeper (a git wrapper utility).</p></htmltext>
<tokenext>I use cygwin rsync ( via vixie 's cron ) to syncfiles from my various Windows clients to myLinux backup host .
On my Linux backup host Iuse rdiff-backup ( again as a cron job ) tobackup / sync to an external USB drive .
Itried gibak to externally version my $ HOMEs ( .git as a symlink to an external USB drivefilesystem ) but had problems ( git was notbuilt to manage 20GB of content ) Anecdotally , git had problems processingnested .git repos and large files .
Thoughgit was a poor choice in versioning my $ HOMEs,it proved to be a great choice for versioningmy /etc ( again to an external repo via asymlink ) .
For this , I used metastore andetckeeper ( a git wrapper utility ) .</tokentext>
<sentencetext>I use cygwin rsync (via vixie's cron) to sync files from my various Windows clients to my Linux backup host.
On my Linux backup host I use rdiff-backup (again as a cron job) to backup / sync to an external USB drive.
I tried gibak to externally version my $HOMEs (.git as a symlink to an external USB drive filesystem) but had problems (git was not built to manage 20GB of content).
Anecdotally, git had problems processing nested .git repos and large files.
Though git was a poor choice in versioning my $HOMEs, it proved to be a great choice for versioning my /etc (again to an external repo via a symlink).
For this, I used metastore and etckeeper (a git wrapper utility).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450351</id>
	<title>Rolled my own</title>
	<author>eknagy</author>
	<datestamp>1245837600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Rolled and GPL'ed my own:<br><a href="http://sourceforge.net/projects/nekjs/" title="sourceforge.net" rel="nofollow">http://sourceforge.net/projects/nekjs/</a> [sourceforge.net]<br>It does exactly what I need it to do<nobr> <wbr></nobr>;)</p></htmltext>
<tokenext>Rolled and GPL'ed my own : http : //sourceforge.net/projects/nekjs/ [ sourceforge.net ] It does exactly what I need it to do ; )</tokentext>
<sentencetext>Rolled and GPL'ed my own:http://sourceforge.net/projects/nekjs/ [sourceforge.net]It does exactly what I need it to do ;)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446597</id>
	<title>Whatever happened to iFolder ?</title>
	<author>WolphFang</author>
	<datestamp>1245757080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Whatever happened to iFolder?

<a href="http://ifolder.com/ifolder" title="ifolder.com" rel="nofollow">iFolder</a> [ifolder.com]</htmltext>
<tokenext>Whatever happened to iFolder ?
iFolder [ ifolder.com ]</tokentext>
<sentencetext>Whatever happened to iFolder ?
iFolder [ifolder.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305</id>
	<title>Re:Dropbox</title>
	<author>darrylo</author>
	<datestamp>1245748020000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>3</modscore>
	<htmltext><p>Yes, dropbox is very nice.  I'll second the recommendation.  Dropbox can also automatically keep previous versions of files around.  Works on PC, Mac, and linux.

</p><p>If you need security, truecrypt with dropbox is nice.  Dropbox supports incremental (delta) change file uploads/downloads, which makes large-ish truecrypt containers useful on dropbox.  The only real limitations are that (1) you have to unmount the truecrypt container before synchronization can occur, and (2) you have to ensure, manually, that only one PC/Mac/linux box is accessing the truecrypt container at any one time.

</p><p>An alternative to dropbox is syncplicity, but I haven't tried it.  The feature set looks similar, though.

</p><p>Another alternative is jungledisk, which uses Amazon S3 to store your data.  The advantages here are that everything is encrypted with a key (stored only at your end, unless you enable the web interface), that you pay only for what you use, and that there's no limit on storage capacity (as long as you have money).  Disadvantages include:
</p><ul>
<li>Incremental/delta file downloads don't exist (makes truecrypt hard to use).</li>
<li>Incremental file uploads exist, for an extra $1/month fee.</li>
<li>You pay for bandwidth, and the bandwidth costs can add up.</li>
</ul></htmltext>
<tokenext>Yes , dropbox is very nice .
I 'll second the recommendation .
Dropbox can also automatically keep previous versions of files around .
Works on PC , Mac , and linux .
If you need security , truecrypt with dropbox is nice .
Dropbox supports incremental ( delta ) change file uploads/downloads , which makes large-ish truecrypt containers useful on dropbox .
The only real limitations are that ( 1 ) you have to unmount the truecrypt container before synchronization can occur , and ( 2 ) you have to insure , manually , that only one PC/Mac/linux box is accessing the truecrypt container at any one time .
An alternative to dropbox is syncplicity , but I have n't tried it .
The feature set looks similar , though .
Another alternative is jungledisk , which uses Amazon S3 to store your data .
The advantages here are that everything is encrypted with a key ( stored only at your end , unless you enable the web interface ) , that you pay only for what you use , and that there 's no limit on storage capacity ( as long as you have money ) .
Disadvantages include : Incremental/delta file downloads do n't exist ( makes truecrypt hard to use ) .
Incremental file uploads exist , for an extra $ 1/month fee .
You pay for bandwidth , and the bandwidth costs can add up .</tokentext>
<sentencetext>Yes, dropbox is very nice.
I'll second the recommendation.
Dropbox can also automatically keep previous versions of files around.
Works on PC, Mac, and linux.
If you need security, truecrypt with dropbox is nice.
Dropbox supports incremental (delta) change file uploads/downloads, which makes large-ish truecrypt containers useful on dropbox.
The only real limitations are that (1) you have to unmount the truecrypt container before synchronization can occur, and (2) you have to ensure, manually, that only one PC/Mac/linux box is accessing the truecrypt container at any one time.
An alternative to dropbox is syncplicity, but I haven't tried it.
The feature set looks similar, though.
Another alternative is jungledisk, which uses Amazon S3 to store your data.
The advantages here are that everything is encrypted with a key (stored only at your end, unless you enable the web interface), that you pay only for what you use, and that there's no limit on storage capacity (as long as you have money).
Disadvantages include:

Incremental/delta file downloads don't exist (makes truecrypt hard to use).
Incremental file uploads exist, for an extra $1/month fee.
You pay for bandwidth, and the bandwidth costs can add up.
</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28453345</id>
	<title>ssh, bzr and krusader</title>
	<author>felixhummel</author>
	<datestamp>1245862080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For *large files* (like Ubuntu images) I use <a href="http://www.krusader.org/" title="krusader.org" rel="nofollow">http://www.krusader.org/</a> [krusader.org] to make backups or copy them somewhere on demand, but for *small, hand-written files* like configs, notes and scripts, I use <a href="http://bazaar-vcs.org/" title="bazaar-vcs.org" rel="nofollow">http://bazaar-vcs.org/</a> [bazaar-vcs.org].</p><p>An example of my workflow would look rather ugly in this comment, so have a look here instead: <a href="http://blag.felixhummel.de/junk/slashdot_2009-06-24.html" title="felixhummel.de" rel="nofollow">http://blag.felixhummel.de/junk/slashdot_2009-06-24.html</a> [felixhummel.de]</p></htmltext>
<tokenext>For * large files * ( like Ubuntu images ) I use http : //www.krusader.org/ [ krusader.org ] to make backups or copy them somewhere on demand , but for * small , hand-written files * like configs , notes and scripts , I use http : //bazaar-vcs.org/ [ bazaar-vcs.org ] .An example of my workflow would look rather ugly in this comment , so have a look here instead : http : //blag.felixhummel.de/junk/slashdot \ _2009-06-24.html [ felixhummel.de ]</tokentext>
<sentencetext>For *large files* (like Ubuntu images) I use http://www.krusader.org/ [krusader.org] to make backups or copy them somewhere on demand, but for *small, hand-written files* like configs, notes and scripts, I use http://bazaar-vcs.org/ [bazaar-vcs.org].
An example of my workflow would look rather ugly in this comment, so have a look here instead: http://blag.felixhummel.de/junk/slashdot_2009-06-24.html [felixhummel.de]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663</id>
	<title>Subversion with a touch of bash</title>
	<author>rpwoodbu</author>
	<datestamp>1245757380000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>4</modscore>
	<htmltext><p>
I have found that using Subversion (svn) with the aid of a bash script that is run manually actually works really well and provides a number of special advantages.  Here's how I have it constructed:
</p><p>
First, I don't actually make my whole home directory a svn checkout.  I have a subdirectory in it that is the checkout, and my bash script ensures there are symlinks into it for the things I want sync'd.  This makes it easy to have some differences between locations.  In particular, I can have a different .bashrc for one machine than another, but keep them both in svn as separate files; it is just a matter of making the symlink point to the one I want to use in each location.  My bash script will make the symlink if the file doesn't exist, and warn if the file does exist but isn't a symlink.  It does this for a number of files.
</p><p>
Another benefit of this method is that I don't put all my files in one checkout.  The core files I'll want in all my home directories (e.g. .bashrc, .vimrc, ssh .config and public keys, etc.) go in a checkout called "homedir".  But my documents go elsewhere.  And my sensitive files (e.g. private keys) go somewhere else still.  I choose what is appropriate to install at each location (usually just the "homedir" checkout on boxes I don't own).  My bash script detects which checkouts I have and does the appropriate steps.
</p><p>
The bash script not only sets up the symlinks but it also does an "svn status" on each checkout so I'll know if there are any files I've created that I haven't added, or any files I've modified that I haven't committed.  I prefer not to automate adds and commits.  I'll definitely see any pending things when I run my sync script, and can simply do an "svn add" or "svn commit" as necessary.
</p><p>
I also prefer not to automate the running of the sync script.  I like being in control of my bandwidth usage, especially when connected via slow links (e.g. Verizon EV-DO, AT&amp;T GPRS).  Plus dealing with conflicts is much easier when it is interactive (although I can usually avoid that scenario).  It also simplifies authentication to run it from my shell, as it can just use my ssh agent (which I forward, which is setup in my sync'd ssh config).
</p><p>
The sync bash script takes care of a few other edge-case issues, like dealing with files in ~/.ssh that have to have certain permissions and whatnot.  And I've taken care to ensure that the script doesn't just blow away files; it will warn if things don't look right, and leaves it to me to fix it.
</p><p>
Using Subversion has another big advantage: it is likely to be installed already in many places.  So when I'm given an account on someone's computer, I can usually get my environment just the way I like it in a few short steps:
</p><p>
<tt>
svn co svn+ssh://my.server.tld/my/path/to/svn/trunk/homedir ~/homedir<br>
~/homedir/bin/mysync # This is my bash script to do the syncing<br>
# Correct any complaints about .bashrc not being a symlink and whatnot<br>
~/homedir/bin/mysync<br>
# Log out and back in, or source .bashrc<br>
</tt>
</p><p>
No fuss, no muss.  No downloading some sync package and building it just to get your .bashrc or .vimrc on a random box, or asking the admin to install something.  Subversion is usually there, and even if it isn't, most admins are happy to install it.  Subversion deals well with binary files, and even large files.  For bulk things (like a music library), I'm more likely to rsync it, partly because it is bulk, partly because it doesn't benefit from versioning, and partly because it only needs to be a unidirectional sync.  I could easily add that to my sync script.
</p><p>
I am simply in the habit of typing "mysync" from time to time (my .bashrc puts ~/bin/ in my $PATH).  This works for me very nicely.  Some people may prefer a little more automation, and of course my script could automatically do adds and commits, and even skip the log messages.  But I prefer a bit more process; after all, this is my data we're talking about!
</p><p>
If there is interest, I may post my sync script.</p></htmltext>
<tokenext>I have found that using Subversion ( svn ) with the aid of a bash script that is run manually actually works really well and provides a number of special advantages .
Here 's how I have it constructed : First , I do n't actually make my whole home directory a svn checkout .
I have a subdirectory in it that is the checkout , and my bash script ensures there are symlinks into it for the things I want sync 'd .
This makes it easy to have some differences between locations .
In particular , I can have a different .bashrc for one machine than another , but keep them both in svn as separate files ; it is just a matter of making the symlink point to the one I want to use in each location .
My bash script will make the symlink if the file does n't exist , and warn if the file does exist but is n't a symlink .
It does this for a number of files .
Another benefit of this method is that I do n't put all my files in one checkout .
The core files I 'll want in all my home directories ( e.g .
.bashrc , .vimrc , ssh .config and public keys , etc .
) go in a checkout called " homedir " .
But my documents go elsewhere .
And my sensitive files ( e.g .
private keys ) go somewhere else still .
I choose what is appropriate to install at each location ( usually just the " homedir " checkout on boxes I do n't own ) .
My bash script detects which checkouts I have and does the appropriate steps .
The bash script not only sets up the symlinks but it also does an " svn status " on each checkout so I 'll know if there are any files I 've created that I have n't added , or any files I 've modified that I have n't committed .
I prefer not to automate adds and commits .
I 'll definitely see any pending things when I run my sync script , and can simply do an " svn add " or " svn commit " as necessary .
I also prefer not to automate the running of the sync script .
I like being in control of my bandwidth usage , especially when connected via slow links ( e.g .
Verizon EV-DO , AT&amp;T GPRS ) .
Plus dealing with conflicts is much easier when it is interactive ( although I can usually avoid that scenario ) .
It also simplifies authentication to run it from my shell , as it can just use my ssh agent ( which I forward , which is setup in my sync 'd ssh config ) .
The sync bash script takes care of a few other edge-case issues , like dealing with files in ~ /.ssh that have to have certain permissions and whatnot .
And I 've taken care to ensure that the script does n't just blow away files ; it will warn if things do n't look right , and leaves it to me to fix it .
Using Subversion has another big advantage : it is likely to be installed already in many places .
So when I 'm given an account on someone 's computer , I can usually get my environment just the way I like it in a few short steps : svn co svn + ssh : //my.server.tld/my/path/to/svn/trunk/homedir ~ /homedir ~ /homedir/bin/mysync # This is my bash script to do the syncing # Correct any complains about .bashrc not being a symlink and whatnot ~ /homedir/bin/mysync # Log out and back in , or source .bashrc No fuss , no muss .
No downloading some sync package and building it just to get your .bashrc or .vimrc on a random box , or asking the admin to install something .
Subversion is usually there , and even if it is n't , most admins are happy to install it .
Subversion deals well with binary files , and even large files .
For bulk things ( like a music library ) , I 'm more likely to rsync it , partly because it is bulk , partly because it does n't benefit from versioning , and partly because it only needs to be a unidirectional sync .
I could easily add that to my sync script .
I am simply in the habit of typing " mysync " from time to time ( my .bashrc puts ~ /bin/ in my $ PATH ) .
This works for me very nicely .
Some people may prefer a little more automation , and of course my script could automatically do adds and commits , and even skip the log messages .
But I prefer a bit more process ; after all , this is my data we 're talking about !
If there is interest , I may post my sync script .</tokentext>
<sentencetext>
I have found that using Subversion (svn) with the aid of a bash script that is run manually actually works really well and provides a number of special advantages.
Here's how I have it constructed:

First, I don't actually make my whole home directory a svn checkout.
I have a subdirectory in it that is the checkout, and my bash script ensures there are symlinks into it for the things I want sync'd.
This makes it easy to have some differences between locations.
In particular, I can have a different .bashrc for one machine than another, but keep them both in svn as separate files; it is just a matter of making the symlink point to the one I want to use in each location.
My bash script will make the symlink if the file doesn't exist, and warn if the file does exist but isn't a symlink.
It does this for a number of files.
Another benefit of this method is that I don't put all my files in one checkout.
The core files I'll want in all my home directories (e.g. .bashrc, .vimrc, ssh .config and public keys, etc.) go in a checkout called "homedir".
But my documents go elsewhere.
And my sensitive files (e.g. private keys) go somewhere else still.
I choose what is appropriate to install at each location (usually just the "homedir" checkout on boxes I don't own).
My bash script detects which checkouts I have and does the appropriate steps.
The bash script not only sets up the symlinks but it also does an "svn status" on each checkout so I'll know if there are any files I've created that I haven't added, or any files I've modified that I haven't committed.
I prefer not to automate adds and commits.
I'll definitely see any pending things when I run my sync script, and can simply do an "svn add" or "svn commit" as necessary.
I also prefer not to automate the running of the sync script.
I like being in control of my bandwidth usage, especially when connected via slow links (e.g. Verizon EV-DO, AT&amp;T GPRS).
Plus dealing with conflicts is much easier when it is interactive (although I can usually avoid that scenario).
It also simplifies authentication to run it from my shell, as it can just use my ssh agent (which I forward, which is setup in my sync'd ssh config).
The sync bash script takes care of a few other edge-case issues, like dealing with files in ~/.ssh that have to have certain permissions and whatnot.
And I've taken care to ensure that the script doesn't just blow away files; it will warn if things don't look right, and leaves it to me to fix it.
Using Subversion has another big advantage: it is likely to be installed already in many places.
So when I'm given an account on someone's computer, I can usually get my environment just the way I like it in a few short steps:


svn co svn+ssh://my.server.tld/my/path/to/svn/trunk/homedir ~/homedir
~/homedir/bin/mysync # This is my bash script to do the syncing
# Correct any complaints about .bashrc not being a symlink and whatnot
~/homedir/bin/mysync
# Log out and back in, or source .bashrc


No fuss, no muss.
No downloading some sync package and building it just to get your .bashrc or .vimrc on a random box, or asking the admin to install something.
Subversion is usually there, and even if it isn't, most admins are happy to install it.
Subversion deals well with binary files, and even large files.
For bulk things (like a music library), I'm more likely to rsync it, partly because it is bulk, partly because it doesn't benefit from versioning, and partly because it only needs to be a unidirectional sync.
I could easily add that to my sync script.
I am simply in the habit of typing "mysync" from time to time (my .bashrc puts ~/bin/ in my $PATH).
This works for me very nicely.
Some people may prefer a little more automation, and of course my script could automatically do adds and commits, and even skip the log messages.
But I prefer a bit more process; after all, this is my data we're talking about!
If there is interest, I may post my sync script.</sentencetext>
</comment>
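The poster doesn't include the mysync script itself, so here is a hypothetical reconstruction of the symlink-checking core described above (the file list, the $HOMEDIR path, and the script shape are all assumptions, not the poster's actual code):

```shell
#!/bin/sh
# Hypothetical sketch of the described "mysync" core: ensure each managed
# dotfile in $HOME is a symlink into the svn checkout, warn when a real
# file is in the way, then surface uncommitted svn changes.
HOMEDIR=${HOMEDIR:-$HOME/homedir}        # the svn checkout
for f in .bashrc .vimrc; do
  src="$HOMEDIR/$f" dst="$HOME/$f"
  if [ ! -e "$dst" ] && [ ! -L "$dst" ]; then
    ln -s "$src" "$dst"                  # create the missing symlink
  elif [ ! -L "$dst" ]; then
    echo "warning: $dst exists but is not a symlink" >&2
  fi
done
# Adds and commits stay manual on purpose; just show pending work.
if command -v svn >/dev/null 2>&1; then
  svn status "$HOMEDIR" || true          # non-checkout dirs just warn
fi
```

The interesting design choice, per the comment, is that the script never adds or commits on its own: it only creates links and reports, leaving the actual `svn add`/`svn commit` to the user.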
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499</id>
	<title>Myself...</title>
	<author>Darkness404</author>
	<datestamp>1245788640000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>2</modscore>
	<htmltext>Myself, I simply store contact info "in the cloud", use my MP3 player to hold all my music (occasionally syncing it to all computers so each has a copy), and keep any needed documents either in my e-mail or on a USB drive; same with code. I have different computers for different purposes and different data on each one. I never really liked the idea of having the same everything on all computers; most of my computers have different OSes, distros, and desktop environments.</htmltext>
<tokenext>Myself I simply store contact info " in the cloud " , use my MP3 player to hold all my music and occasionally sync it to all computers to have a copy , any needed documents are either somewhere on my e-mail or on a USB drive , same with code .
I have different computers for different purposes and have different data on each one .
I never really liked the idea of having the same everything on all computers , most of my computers have different OSes , distros and desktop environments .</tokentext>
<sentencetext>Myself I simply store contact info "in the cloud", use my MP3 player to hold all my music and occasionally sync it to all computers to have a copy, any needed documents are either somewhere on my e-mail or on a USB drive, same with code.
I have different computers for different purposes and have different data on each one.
I never really liked the idea of having the same everything on all computers, most of my computers have different OSes, distros and desktop environments.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444713</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245749520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>NAS is not the same.</p><p>For example, when I am on the plane, my laptop has no access to the NAS (did I mention that I forgot to rsync before I left). With Dropbox, it syncs seamlessly, and so always has the most recent copy. And then it syncs up automatically when I get back online.</p></htmltext>
<tokenext>NAS is not the same.For example , when I am on the plane , my laptop has no access to the NAS ( did I mention that I forgot to rsync before I left ) .
With Dropbox , it syncs seamlessly , and so always has the most recent copy .
And then it syncs up automatically when I get back online .</tokentext>
<sentencetext>NAS is not the same.
For example, when I am on the plane, my laptop has no access to the NAS (did I mention that I forgot to rsync before I left).
With Dropbox, it syncs seamlessly, and so always has the most recent copy.
And then it syncs up automatically when I get back online.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446277</id>
	<title>Home directories</title>
	<author>ratboy666</author>
	<datestamp>1245755580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Multiple?</p><p>I use automount, with nis and nfs. Linux and Solaris. New machine comes on, it just has the login/home directory. Naturally, it doesn't work on any of the inferior OSs. Neither Windows nor Mac OS X work with this setup "out of the box". Windows can mount my home directory with CIFS. And I just haven't bothered with Macs, though I suspect it would be the same.</p><p>On the road, I use unison to sync my home directory into work files, and then resync on return, if I know I will have limited connectivity. I use fuse sshfs if I know I will have reasonable connectivity, mounting my real home directory onto a subdirectory.</p><p>All this to keep a SINGLE home directory, under my control, with minimal effort (zero) needed to bring up additional machines.</p></htmltext>
<tokenext>Multiple ? I use automount , with nis and nfs .
Linux and Solaris .
New machine comes on , it just has the login/home directory .
Naturally , it does n't work on any of the inferior OSs .
Neither Windows nor Mac OS X work with this setup " out of the box " .
Windows can mount my home directory with CIFS .
And I just have n't bothered with Macs , though I suspect it would be the same.On the road , I use unison to sync my home directory into work files , and then resync on return , if I know I will have limited connectivity .
I use fuse sshfs if I know I will have reasonable connectivity , mounting my real home directory onto a subdirectory.All this to keep a SINGLE home directory , under my control , with minimal effort ( zero ) needed to bring up additional machines .</tokentext>
<sentencetext>Multiple?
I use automount, with nis and nfs.
Linux and Solaris.
New machine comes on, it just has the login/home directory.
Naturally, it doesn't work on any of the inferior OSs.
Neither Windows nor Mac OS X work with this setup "out of the box".
Windows can mount my home directory with CIFS.
And I just haven't bothered with Macs, though I suspect it would be the same.
On the road, I use unison to sync my home directory into work files, and then resync on return, if I know I will have limited connectivity.
I use fuse sshfs if I know I will have reasonable connectivity, mounting my real home directory onto a subdirectory.
All this to keep a SINGLE home directory, under my control, with minimal effort (zero) needed to bring up additional machines.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444473</id>
	<title>Unison and encryption?</title>
	<author>Anonymous</author>
	<datestamp>1245748680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I am looking at Unison, and it looks interesting... I may end up using it.</p><p>However, I'm also considering setting up FileVault on my laptop, because I don't want client data compromised if my laptop gets stolen.</p><p>What's the chance of getting a sync tool like unison working while one (or both) of the computers in question uses FileVault?</p></htmltext>
<tokenext>I am looking at Unison , and it looks interesting ... I may end up using it.However , I 'm also considering setting up FileVault on my laptop , because I do n't want client data compromised if my laptop gets stolen.What 's the chance of getting a sync tool like unison working while one ( or both ) of the computers in questions uses FileVault ?</tokentext>
<sentencetext>I am looking at Unison, and it looks interesting... I may end up using it.
However, I'm also considering setting up FileVault on my laptop, because I don't want client data compromised if my laptop gets stolen.
What's the chance of getting a sync tool like unison working while one (or both) of the computers in question uses FileVault?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451279</id>
	<title>FullSync and JFileSync</title>
	<author>edvd</author>
	<datestamp>1245851400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have 2 PCs (1 Windows + 1 Linux) and a remote FTP account.

I use JFileSync to sync the 2 PCs over a Samba mount. There are directories where the Windows PC is the master, whereas other ones are mastered by the Linux PC. This sync is based on the file timestamp.

I use FullSync to sync (from Windows or Linux) between home and the remote FTP server. This sync is filesize-based, since the timestamp is not kept on the FTP server.

They are both open source software.</htmltext>
<tokenext>I have 2 PC ( 1 windows + 1 linux ) and a distant FTP account .
I use JFileSync to sync the 2 PC with samba mounting .
There are directories where the windows is the master , whereas other ones are mastered by the linux PC .
This sync is also based on the file timestamp .
I use FullSync to sync ( from windows or linux ) home and distant FTP server .
This sync is filesize-based , since the timestamp is not kept on the FTP server .
They are both open source software .</tokentext>
<sentencetext>I have 2 PC (1 windows + 1 linux) and a distant FTP account.
I use JFileSync to sync the 2 PC with samba mounting.
There are directories where the windows is the master, whereas other ones are mastered by the linux PC.
This sync is also based on the file timestamp.
I use FullSync to sync (from windows or linux) home and distant FTP server.
This sync is filesize-based, since the timestamp is not kept on the FTP server.
They are both open source software.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443851</id>
	<title>DropBox + symlinks</title>
	<author>Zortrium</author>
	<datestamp>1245789780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I use DropBox to keep (small) files synced across several machines -- I know some people object to keeping data 'in the cloud', but that's what backups are for.  DropBox means no forgetting to svn update or commit and has never really gotten in my way.  I keep things like my Firefox profile (bookmarks, addons, etc) in sync by symlinking relevant files in my Firefox profile folders to files in my DropBox.  It's a bit of a pain to set up initially but only needs to be done once and then my browsing session is seamless from one machine to the next.  I also use this method to sync stuff like my calendar, address book (no $99 per year to Apple for MobileMe, thanks), and SSH and bash config files (always symlinked so that I never need to actually move things).</htmltext>
<tokenext>I use DropBox to keep ( small ) files synced across several machines -- I know some people object to keeping data 'in the cloud ' , but that 's what backups are for .
DropBox means no forgetting to svn update or commit and has never really gotten in my way .
I keep things like my Firefox profile ( bookmarks , addons , etc ) in sync by symlinking relevant files in my Firefox profile folders to files in my DropBox .
It 's a bit of a pain to set up initially but only needs to be done once and then my browsing session is seamless from one machine to the next .
I also use this method to sync stuff like my calendar , address book ( no $ 99 per year to Apple for MobileMe , thanks ) , and SSH and bash config files ( always symlinked so that I never need to actually move things ) .</tokentext>
<sentencetext>I use DropBox to keep (small) files synced across several machines -- I know some people object to keeping data 'in the cloud', but that's what backups are for.
DropBox means no forgetting to svn update or commit and has never really gotten in my way.
I keep things like my Firefox profile (bookmarks, addons, etc) in sync by symlinking relevant files in my Firefox profile folders to files in my DropBox.
It's a bit of a pain to set up initially but only needs to be done once and then my browsing session is seamless from one machine to the next.
I also use this method to sync stuff like my calendar, address book (no $99 per year to Apple for MobileMe, thanks), and SSH and bash config files (always symlinked so that I never need to actually move things).</sentencetext>
</comment>
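The Dropbox-plus-symlinks trick described above (move each file into the synced folder once, then link it back into place so applications keep reading the old path) can be sketched as follows; the paths and the example file name are hypothetical, with ~/Dropbox being the client's usual folder:

```shell
#!/bin/sh
# Sketch of the "move once, symlink back" trick described above:
# relocate a config file into the synced folder, then link it back
# so applications keep reading the original path.
set -e
DROPBOX=${DROPBOX:-$HOME/Dropbox}
link_into_sync() {
  f=$1
  mkdir -p "$DROPBOX"
  if [ -e "$HOME/$f" ] && [ ! -L "$HOME/$f" ]; then
    mv "$HOME/$f" "$DROPBOX/$f"        # one-time move into the sync dir
    ln -s "$DROPBOX/$f" "$HOME/$f"     # old path still resolves
  fi
}
link_into_sync .example_bookmarks      # hypothetical file name
```

Run once per file; after that, the sync client propagates every edit made through the original path, which is why the poster calls it "a bit of a pain to set up initially but only needs to be done once".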
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444349</id>
	<title>Different systems for different files</title>
	<author>togofspookware</author>
	<datestamp>1245748200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>What system do you use to manage your home directories, and how have they worked for you for managing small files (e.g. dot configs) and large (gigabyte binaries of data) together?</p></div></blockquote><p>I don't know that managing them *together* is all that useful.  What I have been doing (and what I think is a more flexible way to manage stuff), is to divide the stuff in your home directory into independent 'projects' (e.g. financial documents, stuff for work, source code of my website, project X, project Y, my photo collection...) and manage each project separately in a way that lends itself well to the kind of file being stored.  For a directory of small files that are frequently updated, <a href="http://git-scm.com/" title="git-scm.com" rel="nofollow">Git</a> [git-scm.com] is a great way to go.  For synchronizing and backing up large collections of large files (like an MP3 or photo collection) you might try something like <a href="http://www.github.com/TOGoS/contentcouch" title="github.com" rel="nofollow">ContentCouch</a> [github.com] (disclaimer: I wrote this tool).</p>
	</htmltext>
<tokenext>What system do you use to manage your home directories , and how have they worked for you for managing small files ( e.g .
dot configs ) and large ( gigabyte binaries of data ) together ? I do n't know that managing them * together * is all that useful .
What I have been doing ( and what I think is a more flexible way to manage stuff ) , is to divide the stuff in your home directory into independent 'projects ' ( e.g .
financial documents , stuff for work , source code of my website , project X , project Y , my photo collection... ) and manage each project separately in a way that lends itself well to the kind of file being stored .
For a directory of small files that are frequently updated , Git [ git-scm.com ] is a great way to go .
For synchronizing and backing up large collections of large files ( like an MP3 or photo collection ) you might try something like ContentCouch [ github.com ] ( disclaimer : I wrote this tool ) .</tokentext>
<sentencetext>What system do you use to manage your home directories, and how have they worked for you for managing small files (e.g.
dot configs) and large (gigabyte binaries of data) together?I don't know that managing them *together* is all that useful.
What I have been doing (and what I think is a more flexible way to manage stuff), is to divide the stuff in your home directory into independent 'projects' (e.g.
financial documents, stuff for work, source code of my website, project X, project Y, my photo collection...) and manage each project separately in a way that lends itself well to the kind of file being stored.
For a directory of small files that are frequently updated, Git [git-scm.com] is a great way to go.
For synchronizing and backing up large collections of large files (like an MP3 or photo collection) you might try something like ContentCouch [github.com] (disclaimer: I wrote this tool).
	</sentencetext>
</comment>
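The per-project approach in the comment above can be sketched in shell. This is a minimal illustration under assumed paths (`~/projects/demo` and the commit identity are hypothetical, not from the comment): each project directory becomes its own Git repository that can be committed and pushed on its own schedule.

```shell
# One repository per project, as the comment suggests: small, frequently
# edited files get full Git history without dragging the whole $HOME along.
set -e
PROJECT="${PROJECT:-$HOME/projects/demo}"    # hypothetical project directory
mkdir -p "$PROJECT"
cd "$PROJECT"
git init -q .
git config user.name  "sync-sketch"          # local identity so commits work
git config user.email "sync@example.invalid" # even on a freshly set up machine
echo "project notes" > README
git add -A                                   # stage everything in this project
git commit -q -m "snapshot"                  # one snapshot of this project only
```

Each project can then be cloned to, or pushed from, only the machines that actually need it, instead of syncing the entire home directory everywhere.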
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875</id>
	<title>rsync</title>
	<author>john_a_smith</author>
	<datestamp>1245789900000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>Primitive, but it works for me.</htmltext>
<tokenext>Primitive , but it works for me .</tokentext>
<sentencetext>Primitive, but it works for me.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443861</id>
	<title>Re:Dropbox</title>
	<author>bobstreo</author>
	<datestamp>1245789840000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I'll ++ dropbox especially for locked down "workputers" and links you want to look at later.</p><p>For bookmarks, I use spurl and the RSS feed.</p><p>For RSS reading and browsing, I use netvibes.</p><p>For all the big stuff, I have a NAS box so I don't have to store all the big files everywhere.</p></htmltext>
<tokenext>I 'll + + dropbox especially for locked down " workputers " and links you want to look at later.For bookmarks , I use spurl and the RSS feed.For RSS reading and browsing , I use netvibes.For all the big stuff .
I have a NAS box so I do n't have to store all the big files everywhere .</tokentext>
<sentencetext>I'll ++ dropbox especially for locked down "workputers" and links you want to look at later.For bookmarks, I use spurl and the RSS feed.For RSS reading and browsing, I use netvibes.For all the big stuff.
I have a NAS box so I don't have to store all the big files everywhere.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445465</id>
	<title>ZFS FTW</title>
	<author>jregel</author>
	<datestamp>1245752160000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>My Linux home directory is pretty tiny - only the dot files for my Linux environment (.gnome etc). I keep all my work documents and files on my OpenSolaris fileserver, where ZFS provides resilience using RAID1 and point-in-time restores using the snapshot capabilities. I NFS mount the ZFS filesystem to my Linux box and CIFS share the same filesystem to my Windows PC and Mac.</p><p>My MP3 collection and photo albums are handled by iTunes and iPhoto respectively, syncing with my iPhone. The Mac backs these up to a Time Machine disk, which in reality is a ZVol on my OpenSolaris server published as a LUN using iSCSI.</p><p>The ZFS filesystems and volumes are backed up to an external USB drive using the "zfs send" command.</p><p>Blatant plug: I've documented most of the above experience on my blog.</p><p>For bookmarks, I use Xmarks to synchronise with the cloud, and take notes using Evernote.</p></htmltext>
<tokenext>My Linux home directory is pretty tiny - only the dot files for my Linux environment ( .gnome etc ) .
I keep all my work documents and files on my OpenSolaris fileserver where ZFS provides resilience using RAID1 and point in time restores using the snapshot capabilities .
I NFS mount the ZFS filesystem to my Linux box and CIFS share the same filesystem to my Windows PC and Mac.My MP3 collection and photo albums are handled by iTunes and iPhoto respectively , syncing with my iPhone .
The Mac backs these up to a Time Machine disk which in reality is a ZVol on my OpenSolaris server published as a LUN using iSCSI.The ZFS filesystems and volumes are backed up to external USB drive using the " zfs send " command.Blatant plug : I 've documented most of the above experience on my blog.For bookmarks , I use Xmarks to synchronise with the cloud , and take notes using Evernote .</tokentext>
<sentencetext>My Linux home directory is pretty tiny - only the dot files for my Linux environment (.gnome etc).
I keep all my work documents and files on my OpenSolaris fileserver where ZFS provides resilience using RAID1 and point in time restores using the snapshot capabilities.
I NFS mount the ZFS filesystem to my Linux box and CIFS share the same filesystem to my Windows PC and Mac.My MP3 collection and photo albums are handled by iTunes and iPhoto respectively, syncing with my iPhone.
The Mac backs these up to a Time Machine disk which in reality is a ZVol on my OpenSolaris server published as a LUN using iSCSI.The ZFS filesystems and volumes are backed up to external USB drive using the "zfs send" command.Blatant plug: I've documented most of the above experience on my blog.For bookmarks, I use Xmarks to synchronise with the cloud, and take notes using Evernote.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446943</id>
	<title>Re:Unison</title>
	<author>growse</author>
	<datestamp>1245758820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Did they solve the issue of different versions not being able to talk to each other? I remember playing with it trying to sync cygwin with a debian box, and it just complained that the versions were different...</htmltext>
<tokenext>Did they solve the issue of different versions not being able to talk to each other ?
I remember playing with it trying to sync cygwin with a debian box , and it just complained that the versions were different.. .</tokentext>
<sentencetext>Did they solve the issue of different versions not being able to talk to each other?
I remember playing with it trying to sync cygwin with a debian box, and it just complained that the versions were different...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444939</id>
	<title>MS SyncToy</title>
	<author>Anonymous</author>
	<datestamp>1245750240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Has anybody run into SyncToy's unnecessarily wanting to overwrite files that differ only in 'date modified'? It would be a minor issue had MS implemented multiple/Shift (un)check.<br>How did you solve the problem?</p><p>MS has slated SyncToy 3.0 for Q4 2009 (i.e. 2010), so I either need a solution to the above or an alternative.<br>So, any suggestions for (preferably free) alternatives?</p></htmltext>
<tokenext>Has anybody run into SyncToy 's unnecessarily wanting to overwrite files that differ only in 'date modified ' ?
It would be a minor issue , had n't MS not implement multiple/Shift ( un ) check.How did you solve the problem ? MS put SyncToy 3.0 towards Q4 2009 ( ie .
2010 ) , so I either have to find a solution to the above or get a suggestion for an alternative ? So , any suggestions for ( preferably free ) alternatives ?</tokentext>
<sentencetext>Has anybody run into SyncToy's unnecessarily wanting to overwrite files that differ only in 'date modified'?
It would be a minor issue, hadn't MS not implement multiple/Shift (un)check.How did you solve the problem?MS put SyncToy 3.0 towards Q4 2009 (ie.
2010), so I either have to find a solution to the above or get a suggestion for an alternative?So, any suggestions for (preferably free) alternatives?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446387</id>
	<title>SpiderOak</title>
	<author>Gainax</author>
	<datestamp>1245756060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Disclosure: <a href="https://spideroak.com/" title="spideroak.com" rel="nofollow">SpiderOak</a> [spideroak.com] is my primary contractor, I do stuff to help their infrastructure.

That said, we do versioned, encrypted, zero-knowledge backup of Linux, Mac, and Windows machines.</htmltext>
<tokenext>Disclosure : SpiderOak [ spideroak.com ] is my primary contractor , I do stuff to help their infrastructure .
That said , we do versioned , encrypted , zero-knowledge backup of Linux , Mac , and Windows machines .</tokentext>
<sentencetext>Disclosure: SpiderOak [spideroak.com] is my primary contractor, I do stuff to help their infrastructure.
That said, we do versioned, encrypted, zero-knowledge backup of Linux, Mac, and Windows machines.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448215</id>
	<title>Re:Dropbox</title>
	<author>Owlyn</author>
	<datestamp>1245768840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The thing I really like about Dropbox on my Linux system is the ability to create symbolic links in my Dropbox folder.  This way I can keep all my folders and files where I want them and just create symbolic links to what I want backed up.  I am only using the 2GB free offering at the moment, but I've been so happy with it I am likely to shell out the bucks for the paid version and back up my entire hard drive.</p><p>Other things I have tried:  I still use rsync to back up to a separate hard drive, but that wouldn't help if the house were to burn down.  I tried JungleDisk and did not like it.  I also use the Amazon S3 service (very cheap) using the Firefox S3 Organizer plug-in.  But Dropbox is simpler to use.</p></htmltext>
<tokenext>The thing I really like about Dropbox on my Linux system is the ability to create symbolic links in my Dropbox folder .
This way I can keep all my folders and files where I want them and just create symbolic links to what I want backed up .
I am only using the 2GB free offering at the moment , but I 've been so happy with it I am likely to shell out the bucks for the paid version and backup my entire hard drive.Other things I have tried : I still use rsync to backup to a separate hard drive , but that would n't help if the house were to burn down .
I tried JungleDisk and did not like it .
I also use the Amazon S3 service --very cheap-- using the Firefox S3 Organizer plug in .
But Dropbox is more simple to use .</tokentext>
<sentencetext>The thing I really like about Dropbox on my Linux system is the ability to create symbolic links in my Dropbox folder.
This way I can keep all my folders and files where I want them and just create symbolic links to what I want backed up.
I am only using the 2GB free offering at the moment, but I've been so happy with it I am likely to shell out the bucks for the paid version and backup my entire hard drive.Other things I have tried:  I still use rsync to backup to a separate hard drive, but that wouldn't help if the house were to burn down.
I tried JungleDisk and did not like it.
I also use the Amazon S3 service --very cheap-- using the Firefox S3 Organizer plug in.
But Dropbox is more simple to use.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
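The symlink trick in the comment above can be sketched as follows. The paths are illustrative, not from the comment; the commenter reports that the Linux Dropbox client follows such links and syncs the target's contents.

```shell
# Keep the real directory where it belongs; only a link lives in Dropbox.
set -e
DROPBOX="${DROPBOX:-$HOME/Dropbox}"          # hypothetical Dropbox folder
TARGET="${TARGET:-$HOME/projects/thesis}"    # hypothetical data to back up
mkdir -p "$DROPBOX" "$TARGET"
# -n replaces an existing link on reruns instead of descending into it.
ln -sfn "$TARGET" "$DROPBOX/thesis"
```

The payoff is that the directory layout of $HOME stays exactly as you want it, while Dropbox only ever sees the subset you chose to link in.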
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446375</id>
	<title>Client Side Caching + Folder Redirection</title>
	<author>Xenophon Fenderson,</author>
	<datestamp>1245756000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I store users' roaming profiles and home directories on a server running Windows SBS 2003.  The server's storage is a SATA RAID-5 (<a href="http://www.3ware.com/" title="3ware.com">3ware rocks!</a> [3ware.com]).  SBS backs itself up to disk weekly, which I occasionally transfer to an external hard drive for DR purposes.  The profile and home directories are separate SMB shares because the share containing the roaming profiles is configured to disallow client-side caching (which causes problems with the user profile loader on older versions of Windows and maybe even Vista).  The shares are accessed via MSDfs because some day I'd like to replicate them to a second server and want any accesses or fail-over to be somewhat automatic (again, for DR purposes).  I use Group Policy to move each user's "AppData", "Contacts", "Desktop", "Documents", "Downloads", "Favorites", "Links", "Saved Games", and "Searches" folders to their home directory.  In my scheme, "Music", "Pictures", and "Videos" are sub-folders of "Documents", for backwards compatibility with Windows XP.  I've also configured Volume Shadow Copy, which allows users to retrieve older versions of their files without needing to bother me about restoring them from archival backups, and deployed Certificate Services on SBS.  Each user's enrolled in the domain PKI, so they can encrypt their caches as well as any of their files.</p><p>From the users' perspective, everything is automatic:  They log in, work with their files, and log out.  If they are out of the office, they'll get a warning about working with a cached copy of their profile, but that's about it.  When they return, they'll get prompted to sync any conflicting changes made while offline.  Windows has featured CSC (also known as "Offline Files") for some time, but it's only gotten really stable in Windows Vista.  
A few programs don't really play well with CSC but nothing that's a deal-breaker (like Firefox or Skype storing database stuff in the roaming version of the AppData folder when it really should be in the local version instead, but I kind of brought that on myself when I redirected it to the network share to start with).</p></htmltext>
<tokenext>I store users ' roaming profiles and home directories on a server running Windows SBS 2003 .
The server 's storage is a SATA RAID-5 ( 3ware rocks !
[ 3ware.com ] ) . SBS backs itself up to disk weekly , which I occasionally transfer to an external hard drive for DR purposes .
The profile and home directories are separate SMB shares because the share containing the roaming profiles is configured to disallow client-side caching ( which causes problems with the user profile loader on older versions of Windows and maybe even Vista ) .
The shares are accessed via MSDfs because some day I 'd like to replicate them to a second server and want any accesses or fail-over to be somewhat automatic ( again , for DR purposes ) .
I use Group Policy to move each user 's " AppData " , " Contacts " , " Desktop " , " Documents " , " Downloads " , " Favorites " , " Links " , " Saved Games " , and " Searches " folders to their home directory .
In my scheme , " Music " , " Pictures " , and " Videos " are sub-folders of " Documents " , for backwards compatibility with Windows XP .
I 've also configured Volume Shadow Copy , which allows users to retrieve older versions of their files without needing to bother me about restoring them from archival backups , and deployed Certificate Services on SBS .
Each user 's enrolled in the domain PKI , so they can encrypt their caches as well as any of their files.From the users ' perspective , everything is automatic : They log in , work with their files , and log out .
If they are out of the office , they 'll get a warning about working with a cached copy of their profile , but that 's about it .
When they return , they 'll get prompted to sync any conflicting changes made while offline .
Windows has featured CSC ( also known as " Offline Files " ) for some time , but it 's only gotten really stable in Windows Vista .
A few programs do n't really play well with CSC but nothing that 's a deal-breaker ( like Firefox or Skype storing database stuff in the roaming version of the AppData folder when it really should be in local version instead , but I kind of brought that on myself when I redirected it to the network share to start with ) .</tokentext>
<sentencetext>I store users' roaming profiles and home directories on a server running Windows SBS 2003.
The server's storage is a SATA RAID-5 (3ware rocks!
[3ware.com]).  SBS backs itself up to disk weekly, which I occasionally transfer to an external hard drive for DR purposes.
The profile and home directories are separate SMB shares because the share containing the roaming profiles is configured to disallow client-side caching (which causes problems with the user profile loader on older versions of Windows and maybe even Vista).
The shares are accessed via MSDfs because some day I'd like to replicate them to a second server and want any accesses or fail-over to be somewhat automatic (again, for DR purposes).
I use Group Policy to move each user's "AppData", "Contacts", "Desktop", "Documents", "Downloads", "Favorites", "Links", "Saved Games", and "Searches" folders to their home directory.
In my scheme, "Music", "Pictures", and "Videos" are sub-folders of "Documents", for backwards compatibility with Windows XP.
I've also configured Volume Shadow Copy, which allows users to retrieve older versions of their files without needing to bother me about restoring them from archival backups, and deployed Certificate Services on SBS.
Each user's enrolled in the domain PKI, so they can encrypt their caches as well as any of their files.From the users' perspective, everything is automatic:  They log in, work with their files, and log out.
If they are out of the office, they'll get a warning about working with a cached copy of their profile, but that's about it.
When they return, they'll get prompted to sync any conflicting changes made while offline.
Windows has featured CSC (also known as "Offline Files") for some time, but it's only gotten really stable in Windows Vista.
A few programs don't really play well with CSC but nothing that's a deal-breaker (like Firefox or Skype storing database stuff in the roaming version of the AppData folder when it really should be in local version instead, but I kind of brought that on myself when I redirected it to the network share to start with).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445939</id>
	<title>for windows... Cockos's PathSync</title>
	<author>deburg</author>
	<datestamp>1245754020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><a href="http://www.cockos.com/pathsync/" title="cockos.com" rel="nofollow">http://www.cockos.com/pathsync/</a> [cockos.com] <p><div class="quote"><p>PathSync (GPL)
</p><p>an interactive directory (path) synchronizer for windows
</p><p>PathSync can analyze two directories and show the user a list of differences between the directories.
</p><p>The user can select what actions should occur (which files to overwrite, which to delete, which to ignore), and allow PathSync to synchronize.</p></div><p>I can use this in a jiffy: it works by comparing the dates and file sizes of the files, noting what's there or missing, and copying/deleting to make a clone of the source.
</p><p>Furthermore, it's portable (just copy the directory), although I am not sure whether it leaves anything behind in the registry.
</p><p>A bonus is that it can save/load the settings you used before.</p>
	</htmltext>
<tokenext>http : //www.cockos.com/pathsync/ [ cockos.com ] PathSync ( GPL ) an interactive directory ( path ) synchronizer for windows PathSync can analyze two directories and show the user a list of differences between the directories .
The user can select what actions should occur ( which files to overwrite , which to delete , which to ignore ) , and allow PathSync to synchronize.I use this in a jiffy , it works by comparing the dates and file sizes of the files ; and what 's there or missing ; and just copy/delete to make a clone of the source .
Furthermore , it 's portable ( just copy the directory ) although I am not sure if it leaves anything behind in the registry Bonus is that it can save/load the settings that you use before .</tokentext>
<sentencetext>http://www.cockos.com/pathsync/ [cockos.com] PathSync (GPL)
an interactive directory (path) synchronizer for windows
PathSync can analyze two directories and show the user a list of differences between the directories.
The user can select what actions should occur (which files to overwrite, which to delete, which to ignore), and allow PathSync to synchronize.I use this in a jiffy, it works by comparing the dates and file sizes of the files; and what's there or missing; and just copy/delete to make a clone of the source.
Furthermore, it's portable (just copy the directory) although I am not sure if it leaves anything behind in the registry
Bonus is that it can save/load the settings that you use before.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443699</id>
	<title>And the Large Files?</title>
	<author>Anonymous</author>
	<datestamp>1245789300000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>I've used subversion quite a bit and we simply avoid committing Java archives and instead use Maven2 to get those.  This is because Subversion seems to take up a lot of space and time with large files.  Maybe this is typical of any versioning system, but I do not know enough about git.  From Subversion's best practices:<p><div class="quote"><p>Be patient with large files<br> <br>

A nice feature of Subversion is that by design, there is no limit to the size of files it can handle. Files are sent "streamily" in both directions between Subversion client and server, using a small, constant amount of memory on each side of the network.<br> <br>

Of course, there are a number of practical issues to consider. While there's no need to worry about files in the kilobyte-sized range (e.g. typical source-code files), committing larger files can take a tremendous amount of both time and space (e.g. files that are dozens or hundreds of megabytes large.)<br> <br>

To begin with, remember that your Subversion working copy stores pristine copies of all version-controlled files in the .svn/text-base/ area. This means that your working copy takes up at least twice as much disk space as the original dataset. Beyond that, the Subversion client follows a (currently unadjustable) algorithm for committing files:<br> <br>

    * Copies the file to .svn/tmp/ (can take a while, and temporarily uses extra disk space)<br>
    * Performs a binary diff between the tmpfile and the pristine copy, or between the tmpfile and an empty-file if newly added. (can take a very long time to compute, even though only a small amount of data might ultimately be sent over the network)<br>
    * Sends the diff to the server, then moves the tmpfile into .svn/text-base/<br> <br>

So while there's no theoretical limit to the size of your files, you'll need to be aware that very large files may require quite a bit of patient waiting while your client chugs away. You can rest assured, however, that unlike CVS, your large files won't incapacitate the server or affect other users.</p></div><p>Really, I think he's asking for one tool to do both small files and large files when (in my mind) it makes more sense to back up ISOs and MP3s over longer periods of time than my source code or documents that I may edit and change daily.  <br> <br>

Subversion for source control.  A simple script that pushes large files to an external drive.  That's all I do.  Bulletproof?  No way.  But it sounds like he's devoting a lot of time to this.  I guess he must have a lot more computers than I do.</p>
	</htmltext>
<tokenext>I 've used subversion quite a bit and we simply avoid committing Java archives and instead use Maven2 to get those .
This is because it seems to take up a lot of space and time with large files .
Maybe this is typical of any versioning system but I do not know enough about git .
From Subversion 's best practices : Be patient with large files A nice feature of Subversion is that by design , there is no limit to the size of files it can handle .
Files are sent " streamily " in both directions between Subversion client and server , using a small , constant amount of memory on each side of the network .
Of course , there are a number of practical issues to consider .
While there 's no need to worry about files in the kilobyte-sized range ( e.g .
typical source-code files ) , committing larger files can take a tremendous amount of both time and space ( e.g .
files that are dozens or hundreds of megabytes large .
) To begin with , remember that your Subversion working copy stores pristine copies of all version-controlled files in the .svn/text-base/ area .
This means that your working copy takes up at least twice as much disk space as the original dataset .
Beyond that , the Subversion client follows a ( currently unadjustable ) algorithm for committing files : * Copies the file to .svn/tmp/ ( can take a while , and temporarily uses extra disk space ) ) * Performs a binary diff between the tmpfile and the pristine copy , or between the tmpfile and an empty-file if newly added .
( can take a very long time to compute , even though only a small amount of data might ultimately be sent over the network ) * Sends the diff to the server , then moves the tmpfile into .svn/text-base/ So while there 's no theoretical limit to the size of your files , you 'll need to be aware that very large files may require quite a bit of patient waiting while your client chugs away .
You can rest assured , however , that unlike CVS , your large files wo n't incapacitate the server or affect other users.Really , I think he 's asking for one tool to do both small files and large files when ( in my mind ) it makes more sense to back up ISOs and MP3s over longer periods of time than my source code or documents that I may edit and change daily .
Subversion for source control .
A simple script that pushes large files to an external drive .
That 's all I do .
Bulletproof ? No way .
But it sounds like he 's devoting a lot of time to this .
I guess he must have a lot more computers than I do .</tokentext>
<sentencetext>I've used subversion quite a bit and we simply avoid committing Java archives and instead use Maven2 to get those.
This is because it seems to take up a lot of space and time with large files.
Maybe this is typical of any versioning system but I do not know enough about git.
From Subversion's best practices:Be patient with large files 

A nice feature of Subversion is that by design, there is no limit to the size of files it can handle.
Files are sent "streamily" in both directions between Subversion client and server, using a small, constant amount of memory on each side of the network.
Of course, there are a number of practical issues to consider.
While there's no need to worry about files in the kilobyte-sized range (e.g.
typical source-code files), committing larger files can take a tremendous amount of both time and space (e.g.
files that are dozens or hundreds of megabytes large.
) 

To begin with, remember that your Subversion working copy stores pristine copies of all version-controlled files in the .svn/text-base/ area.
This means that your working copy takes up at least twice as much disk space as the original dataset.
Beyond that, the Subversion client follows a (currently unadjustable) algorithm for committing files: 

    * Copies the file to .svn/tmp/ (can take a while, and temporarily uses extra disk space))
    * Performs a binary diff between the tmpfile and the pristine copy, or between the tmpfile and an empty-file if newly added.
(can take a very long time to compute, even though only a small amount of data might ultimately be sent over the network)
    * Sends the diff to the server, then moves the tmpfile into .svn/text-base/ 

So while there's no theoretical limit to the size of your files, you'll need to be aware that very large files may require quite a bit of patient waiting while your client chugs away.
You can rest assured, however, that unlike CVS, your large files won't incapacitate the server or affect other users.Really, I think he's asking for one tool to do both small files and large files when (in my mind) it makes more sense to back up ISOs and MP3s over longer periods of time than my source code or documents that I may edit and change daily.
Subversion for source control.
A simple script that pushes large files to an external drive.
That's all I do.
Bulletproof?  No way.
But it sounds like he's devoting a lot of time to this.
I guess he must have a lot more computers than I do.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419</parent>
</comment>
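The "simple script that pushes large files to an external drive" isn't shown in the comment; a minimal sketch might look like this. The directory names, the 100 MB threshold, and the sparse demo file are all illustrative, and the destination stands in for a real mounted external disk.

```shell
set -e
SRC="${SRC:-$HOME/data}"
DRIVE="${DRIVE:-$HOME/external-drive}"   # stand-in for a mounted external disk
mkdir -p "$SRC" "$DRIVE"
# Demo file: truncate makes it sparse, so it reports a 200 MB size without
# actually consuming that much space on disk.
truncate -s 200M "$SRC/big.bin"
# Copy every file over 100 MB, preserving the directory layout under $DRIVE.
cd "$SRC"
find . -type f -size +100M -exec cp --parents -t "$DRIVE" {} +
```

This keeps the division of labour the commenter describes: small, frequently edited files stay in Subversion, while anything over the threshold is mirrored by a dumb copy instead of bloating the repository.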
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447089</id>
	<title>Don't sync home.</title>
	<author>dotwaffle</author>
	<datestamp>1245759600000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>I've done exactly the same as you, used every single tool under the sun, eventually settling on Unison until I realised I was being silly...</p><p>Let's put it this way - just set up each computer how you want it, and sync the *data*, not the whole home directory.</p><p>For instance, my Documents are synced with Dropbox (though I'm tempted to move them to UbuntuOne), my development directories are generally stored in some kind of revision control (svn/bzr/git) and either not synced or, at worst, unison-ed, and everything else just stays on the machine it was created on, backed up with duplicity to a central fileserver hosted in France.</p><p>When you realise that syncing home is *not* good, it suddenly becomes clear that what you need and what you want are completely different.</p></htmltext>
<tokenext>I 've done exactly the same as you , used every single tool under the sun , eventually settling on Unison until I realised I was being silly...Let 's put it this way - just set up each computer how you want it , and sync the * data * , not the whole home directory.For instance , my Documents are synced with Dropbox ( though tempted to move them to UbuntuOne ) , my development directories are generally stored in some kind of revision control ( svn/bzr/git ) and either not synced or at worst , unison-ed , and everything else just stays on the machine it was created on , and backed up with duplicity to a central fileserver hosted in France.When you realise that syncing home is * not * good , it suddenly becomes clear what you need , and what you want are completely different .</tokentext>
<sentencetext>I've done exactly the same as you, used every single tool under the sun, eventually settling on Unison until I realised I was being silly...Let's put it this way - just set up each computer how you want it, and sync the *data*, not the whole home directory.For instance, my Documents are synced with Dropbox (though tempted to move them to UbuntuOne), my development directories are generally stored in some kind of revision control (svn/bzr/git) and either not synced or at worst, unison-ed, and everything else just stays on the machine it was created on, and backed up with duplicity to a central fileserver hosted in France.When you realise that syncing home is *not* good, it suddenly becomes clear what you need, and what you want are completely different.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446175</id>
	<title>mobileme</title>
	<author>Anonymous</author>
	<datestamp>1245755100000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I use .Mac (now MobileMe) to sync my two Macs. This, combined with iPhone sync of contacts, calendars and notes, makes my whole environment transportable. My first Mac is a MacBook Pro; the second is a Dell Mini9 to carry everywhere and keep all that vital information with me at all times.</p><p>The MobileMe web interface also makes a lot of sense to me, so when I'm not carrying my laptop I can always access my contacts, files, etc.</p></htmltext>
<tokenext>I use .Mac ( now MobileMe ) to sync my two Macs .
This , combined with iPhone sync of contacts , calendars and notes , makes my whole environment transportable .
My first Mac is a MacBook Pro ; the second is a Dell Mini9 to carry everywhere and keep all that vital information with me at all times .
The MobileMe web interface also makes a lot of sense to me , so when I 'm not carrying my laptop I can always access my contacts , files , etc .</tokenext>
<sentencetext>I use .Mac (now MobileMe) to sync my two Macs.
This, combined with iPhone sync of contacts, calendars and notes, makes my whole environment transportable.
My first Mac is a MacBook Pro; the second is a Dell Mini9 to carry everywhere and keep all that vital information with me at all times.
The MobileMe web interface also makes a lot of sense to me, so when I'm not carrying my laptop I can always access my contacts, files, etc.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447141</id>
	<title>SyncToy and Rsync</title>
	<author>flyingfsck</author>
	<datestamp>1245759960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>On Windows the free MS SyncToy works OK for me.  On *nix I use rsync.

Backup media is a slew of USB drives ranging in size from 120 to 750GB.</htmltext>
<tokenext>On Windows the free MS SyncToy works OK for me .
On * nix I use rsync .
Backup media is a slew of USB drives ranging in size from 120 to 750GB .</tokenext>
<sentencetext>On Windows the free MS SyncToy works OK for me.
On *nix I use rsync.
Backup media is a slew of USB drives ranging in size from 120 to 750GB.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444741</id>
	<title>What?</title>
	<author>Legion303</author>
	<datestamp>1245749640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>File sharing. Independent backups. I don't understand why the average nerd would need anything fancy at home. Keep your movies on your media box, personal files on whatever box you access the most, and don't worry about it.</p></htmltext>
<tokenext>File sharing .
Independent backups .
I do n't understand why the average nerd would need anything fancy at home .
Keep your movies on your media box , personal files on whatever box you access the most , and do n't worry about it .</tokenext>
<sentencetext>File sharing.
Independent backups.
I don't understand why the average nerd would need anything fancy at home.
Keep your movies on your media box, personal files on whatever box you access the most, and don't worry about it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449031</id>
	<title>CVS works for me</title>
	<author>evaddnomaid</author>
	<datestamp>1245777780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I've been using CVS for my personal files for years, and I'm happy with the choice.  If I had it to do over again I would choose SVN or git.  I keep my important files synchronized on 6 machines I use regularly.</htmltext>
<tokenext>I 've been using CVS for my personal files for years , and I 'm happy with the choice .
If I had it to do over again I would choose SVN or git .
I keep my important files synchronized on 6 machines I use regularly .</tokenext>
<sentencetext>I've been using CVS for my personal files for years, and I'm happy with the choice.
If I had it to do over again I would choose SVN or git.
I keep my important files synchronized on 6 machines I use regularly.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444267</id>
	<title>Re:Dropbox</title>
	<author>boot\_img</author>
	<datestamp>1245747900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Another me too... since it syncs Linux/Mac and Windows it was perfect for me. I recently upgraded to the 50 GB plan - can't see myself going back.</p><p>(I hardly need the extra referral bonus space now but hey, it doesn't hurt either... <a href="https://www.getdropbox.com/referrals/NTE2MjAzNTk" title="getdropbox.com">https://www.getdropbox.com/referrals/NTE2MjAzNTk</a> [getdropbox.com])</p></htmltext>
<tokenext>Another me too ... since it syncs Linux/Mac and Windows it was perfect for me .
I recently upgraded to the 50 GB plan - ca n't see myself going back .
( I hardly need the extra referral bonus space now but hey , it does n't hurt either ... https : //www.getdropbox.com/referrals/NTE2MjAzNTk [ getdropbox.com ] )</tokenext>
<sentencetext>Another me too ... since it syncs Linux/Mac and Windows it was perfect for me.
I recently upgraded to the 50 GB plan - can't see myself going back.
(I hardly need the extra referral bonus space now but hey, it doesn't hurt either ... https://www.getdropbox.com/referrals/NTE2MjAzNTk [getdropbox.com])</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445097</id>
	<title>Re:Myself...</title>
	<author>Anonymous</author>
	<datestamp>1245750900000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Pretty much the same here. I figure out what I need/want and where I need/want it. A lot of stuff exists as attachments to emails that I can access via a web interface. I find it useful to actually think about the data, rather than treating it all equally.</p><p>When I've really needed a whole identical environment, it's been done with net mounts. A portable USB drive seems the best option if network mounts are not practical.</p></htmltext>
<tokenext>Pretty much the same here .
I figure out what I need/want and where I need/want it .
A lot of stuff exists as attachments to emails that I can access via a web interface .
I find it useful to actually think about the data , rather than treating it all equally .
When I 've really needed a whole identical environment , it 's been done with net mounts .
A portable USB drive seems the best option if network mounts are not practical .</tokenext>
<sentencetext>Pretty much the same here.
I figure out what I need/want and where I need/want it.
A lot of stuff exists as attachments to emails that I can access via a web interface.
I find it useful to actually think about the data, rather than treating it all equally.
When I've really needed a whole identical environment, it's been done with net mounts.
A portable USB drive seems the best option if network mounts are not practical.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444129</id>
	<title>I just don't sync</title>
	<author>Anonymous</author>
	<datestamp>1245790680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Between my Desktop, Server, Laptop, and iPhone I just keep whatever data on them I want and move what I want when I want.  I can always access any other machine from any machine (mobile internet) so I always have access to the files I want or need.</p></htmltext>
<tokenext>Between my Desktop , Server , Laptop , and iPhone I just keep whatever data on them I want and move what I want when I want .
I can always access any other machine from any machine ( mobile internet ) so I always have access to the files I want or need .</tokenext>
<sentencetext>Between my Desktop, Server, Laptop, and iPhone I just keep whatever data on them I want and move what I want when I want.
I can always access any other machine from any machine (mobile internet) so I always have access to the files I want or need.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649</id>
	<title>USB drive</title>
	<author>Anonymous</author>
	<datestamp>1245789120000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext>I carry a 16 Gig USB flash drive with my working files on it. I've been using this method since the days of 100 Meg Zip drives and just keep upgrading the media. My flash drive is automatically backed up to my backup server at home in the middle of the night so, if I forget it at the office, I'm only a few hours behind. Besides, I can use free Logmein to log into the office computer and transfer a file if it's got new and important information on it. It works the same way in reverse if I forget it at home. Since my working files are on the USB drive which is also compatible with my Linux machines, it really doesn't make much difference which machine I plug it into.  Did I mention encryption? That's a good idea in case you lose the drive if you've got any sensitive information on it.</htmltext>
<tokenext>I carry a 16 Gig USB flash drive with my working files on it .
I 've been using this method since the days of 100 Meg Zip drives and just keep upgrading the media .
My flash drive is automatically backed up to my backup server at home in the middle of the night so , if I forget it at the office , I 'm only a few hours behind .
Besides , I can use free Logmein to log into the office computer and transfer a file if it 's got new and important information on it .
It works the same way in reverse if I forget it at home .
Since my working files are on the USB drive which is also compatible with my Linux machines , it really does n't make much difference which machine I plug it into .
Did I mention encryption ?
That 's a good idea in case you lose the drive if you 've got any sensitive information on it .</tokenext>
<sentencetext>I carry a 16 Gig USB flash drive with my working files on it.
I've been using this method since the days of 100 Meg Zip drives and just keep upgrading the media.
My flash drive is automatically backed up to my backup server at home in the middle of the night so, if I forget it at the office, I'm only a few hours behind.
Besides, I can use free Logmein to log into the office computer and transfer a file if it's got new and important information on it.
It works the same way in reverse if I forget it at home.
Since my working files are on the USB drive which is also compatible with my Linux machines, it really doesn't make much difference which machine I plug it into.
Did I mention encryption?
That's a good idea in case you lose the drive if you've got any sensitive information on it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444277</id>
	<title>Re:rsync</title>
	<author>fishbowl</author>
	<datestamp>1245747960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>This is enterprise fileserver level, not home directories, but I have about 1.5TB of data, in about 4 million files, that I replicate between two sites.</p><p>rsync totally breaks (runs out of memory) on a set of files this large.</p><p>I handled it by taking LTO-4 tapes to the location for the initial dump, and then using "find" to make incremental tars.  Syncing deletions is still a problem.  I don't have the budget for even the maintenance costs for netapp or EMC solutions.</p></htmltext>
<tokenext>This is enterprise fileserver level , not home directories , but I have about 1.5TB of data , in about 4 million files , that I replicate between two sites .
rsync totally breaks ( runs out of memory ) on a set of files this large .
I handled it by taking LTO-4 tapes to the location for the initial dump , and then using " find " to make incremental tars .
Syncing deletions is still a problem .
I do n't have the budget for even the maintenance costs for netapp or EMC solutions .</tokenext>
<sentencetext>This is enterprise fileserver level, not home directories, but I have about 1.5TB of data, in about 4 million files, that I replicate between two sites.
rsync totally breaks (runs out of memory) on a set of files this large.
I handled it by taking LTO-4 tapes to the location for the initial dump, and then using "find" to make incremental tars.
Syncing deletions is still a problem.
I don't have the budget for even the maintenance costs for netapp or EMC solutions.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445663</id>
	<title>Mercurial</title>
	<author>Omnifarious</author>
	<datestamp>1245752940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I use <a href="http://www.selenic.com/mercurial/" title="selenic.com">Mercurial</a> [selenic.com], and I don't know why it wasn't mentioned along with the other three.</p></htmltext>
<tokenext>I use Mercurial [ selenic.com ] , and I do n't know why it was n't mentioned along with the other three .</tokenext>
<sentencetext>I use Mercurial [selenic.com], and I don't know why it wasn't mentioned along with the other three.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444351</id>
	<title>moving online</title>
	<author>karl3</author>
	<datestamp>1245748200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I've tried many approaches - none worked. The solution seems to be online.

I've already moved my bookmarks to Faviki, music to Grooveshark, pics and vids to Facebook, docs to Google Documents... I just haven't figured out what to do with my films.</htmltext>
<tokenext>I 've tried many approaches - none worked .
The solution seems to be online .
I 've already moved my bookmarks to Faviki , music to Grooveshark , pics and vids to Facebook , docs to Google Documents .
... I just have n't figured out what to do with my films .</tokenext>
<sentencetext>I've tried many approaches - none worked.
The solution seems to be online.
I've already moved my bookmarks to Faviki, music to Grooveshark, pics and vids to Facebook, docs to Google Documents.
... I just haven't figured out what to do with my films.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445129</id>
	<title>Re:Unison is the only way</title>
	<author>Anonymous</author>
	<datestamp>1245750960000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>I've been a very happy Unison user for many, many years. But after recently switching to Ubuntu for my laptop and desktop I finally discovered the joy of Unison over SSH, and also discovered the auto=true and terse=true command switches, which make the whole thing automatic, just like Dropbox (I think there's one more switch and I won't even have to press 'g' to run the sync).</p><p>I've tried other solutions - Dropbox is too much in the 'cloud' for me; rsync is only one-way (what's with that?); Robocopy is Windows-only; svn/hg/bzr/git store versions and all I want is two-way sync.</p><p>Unison is the peak of perfection in two-way sync; there is nothing else...</p></htmltext>
<tokenext>I 've been a very happy Unison user for many , many years .
But after recently switching to Ubuntu for my laptop and desktop I finally discovered the joy of Unison over SSH , and also discovered the auto = true and terse = true command switches , which make the whole thing automatic , just like Dropbox ( I think there 's one more switch and I wo n't even have to press 'g ' to run the sync ) .
I 've tried other solutions - Dropbox is too much in the 'cloud ' for me ; rsync is only one-way ( what 's with that ? ) ; Robocopy is Windows-only ; svn/hg/bzr/git store versions and all I want is two-way sync .
Unison is the peak of perfection in two-way sync ; there is nothing else ...</tokenext>
<sentencetext>I've been a very happy Unison user for many, many years.
But after recently switching to Ubuntu for my laptop and desktop I finally discovered the joy of Unison over SSH, and also discovered the auto=true and terse=true command switches, which make the whole thing automatic, just like Dropbox (I think there's one more switch and I won't even have to press 'g' to run the sync).
I've tried other solutions - Dropbox is too much in the 'cloud' for me; rsync is only one-way (what's with that?); Robocopy is Windows-only; svn/hg/bzr/git store versions and all I want is two-way sync.
Unison is the peak of perfection in two-way sync; there is nothing else...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941</parent>
</comment>
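[Editorial note: the switches mentioned in the comment above can live in a Unison profile instead of on the command line. A minimal sketch; the roots and paths are placeholders, and batch = true is likely the "one more switch" the poster means, since it skips the interactive prompt entirely.]

```
# ~/.unison/default.prf -- two-way sync over SSH (example roots)
root = /home/me
root = ssh://laptop//home/me
path = Documents
path = .bashrc
auto = true    # accept Unison's non-conflicting recommendations
batch = true   # run without asking for any confirmation
terse = true   # suppress status messages
```

With this in place, plain `unison` (or `unison default`) runs the whole sync unattended.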
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450851</id>
	<title>Abe</title>
	<author>Anonymous</author>
	<datestamp>1245846240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>For people who want to sync two large copies of data (e.g. 100GB) between two remote computers, using a smaller medium (e.g. a USB stick), syncoco may be useful: http://syncoco.sourceforge.net/<br>It only copies the modified files over... unfortunately its development seems halted.</p></htmltext>
<tokenext>For people who want to sync two large copies of data ( e.g. 100GB ) between two remote computers , using a smaller medium ( e.g. a USB stick ) , syncoco may be useful : http : //syncoco.sourceforge.net/
It only copies the modified files over ... unfortunately its development seems halted .</tokenext>
<sentencetext>For people who want to sync two large copies of data (e.g. 100GB) between two remote computers, using a smaller medium (e.g. a USB stick), syncoco may be useful: http://syncoco.sourceforge.net/
It only copies the modified files over... unfortunately its development seems halted.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446791</id>
	<title>I use a printer...</title>
	<author>mr\_lizard13</author>
	<datestamp>1245758040000</datestamp>
	<modclass>Funny</modclass>
	<modscore>2</modscore>
	<htmltext>...to print documents off my main computer, then a scanner to get those documents onto my second computer.
<br>
<br>
I then repeat the process in the other direction to create a seamless 'bidirectional' sync solution.
<br>
<br>
For video, I simply record the footage off the monitor using a camcorder, which I carry with me. My video, on the move.</htmltext>
<tokenext>...to print documents off my main computer , then a scanner to get those documents onto my second computer .
I then repeat the process in the other direction to create a seamless 'bidirectional ' sync solution .
For video , I simply record the footage off the monitor using a camcorder , which I carry with me .
My video , on the move .</tokenext>
<sentencetext>...to print documents off my main computer, then a scanner to get those documents onto my second computer.
I then repeat the process in the other direction to create a seamless 'bidirectional' sync solution.
For video, I simply record the footage off the monitor using a camcorder, which I carry with me.
My video, on the move.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447819</id>
	<title>Own INet Server (debian, root access) + Subversion</title>
	<author>Qbertino</author>
	<datestamp>1245764940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I have a server on the Inet with Debian and root access for websites and CMSes of critical long-term customers and my own project versioning and testing. I use a separate usergroup for each project and also have a usergroup for my Project directory, which basically has around 10 years' worth of my projects in it. I sync using the CLI client for ssh+svn on OS X and Linux. If I had Windows in there somewhere - which I haven't for about 7 years now - I'd have TortoiseSVN to cover syncing there.</p><p>I find this a practical solution, as I can access my current stuff from *anywhere* at any time and I can use the same skill set I use for my daily development work at the job. At work we also use SVN for project-specific docs and media. The tools are there, svn is tried and true, and you can also fix irreversibly borked versioning with a little XML editing inside the .svn directory if things should go completely haywire.</p><p>For archiving I'd actually archive off the repository itself. Haven't seen the need to do that yet though. Backups I still do with alternating/overturning drag-and-drop copies of entire dirtrees via the Finder or Nautilus/Konqueror onto external HDDs - a leftover from the before-svn-sync times which I intend to change someday and integrate into the pipeline.</p></htmltext>
<tokenext>I have a server on the Inet with Debian and root access for websites and CMSes of critical long-term customers and my own project versioning and testing .
I use a separate usergroup for each project and also have a usergroup for my Project directory , which basically has around 10 years ' worth of my projects in it .
I sync using the CLI client for ssh + svn on OS X and Linux .
If I had Windows in there somewhere - which I have n't for about 7 years now - I 'd have TortoiseSVN to cover syncing there .
I find this a practical solution , as I can access my current stuff from * anywhere * at any time and I can use the same skill set I use for my daily development work at the job .
At work we also use SVN for project-specific docs and media .
The tools are there , svn is tried and true , and you can also fix irreversibly borked versioning with a little XML editing inside the .svn directory if things should go completely haywire .
For archiving I 'd actually archive off the repository itself .
Have n't seen the need to do that yet though .
Backups I still do with alternating/overturning drag-and-drop copies of entire dirtrees via the Finder or Nautilus/Konqueror onto external HDDs - a leftover from the before-svn-sync times which I intend to change someday and integrate into the pipeline .</tokenext>
<sentencetext>I have a server on the Inet with Debian and root access for websites and CMSes of critical long-term customers and my own project versioning and testing.
I use a separate usergroup for each project and also have a usergroup for my Project directory, which basically has around 10 years' worth of my projects in it.
I sync using the CLI client for ssh+svn on OS X and Linux.
If I had Windows in there somewhere - which I haven't for about 7 years now - I'd have TortoiseSVN to cover syncing there.
I find this a practical solution, as I can access my current stuff from *anywhere* at any time and I can use the same skill set I use for my daily development work at the job.
At work we also use SVN for project-specific docs and media.
The tools are there, svn is tried and true, and you can also fix irreversibly borked versioning with a little XML editing inside the .svn directory if things should go completely haywire.
For archiving I'd actually archive off the repository itself.
Haven't seen the need to do that yet though.
Backups I still do with alternating/overturning drag-and-drop copies of entire dirtrees via the Finder or Nautilus/Konqueror onto external HDDs - a leftover from the before-svn-sync times which I intend to change someday and integrate into the pipeline.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446747</id>
	<title>Re:Why is there no discount online backup?</title>
	<author>Blakey Rat</author>
	<datestamp>1245757740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>I would like to know why this is the case. Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives?</i></p><p>Like Mozy (http://mozy.com/), you mean? The online backup service that has 250+ GB of my data on it stored without a peep of complaint? The one that costs $5/month/computer, definitely competitive with buying HDs?</p><p>Maybe you should verify that it is, indeed, not the case before going on a long rant about it. Just a thought.</p><p>Not an employee, just a happy customer.</p></htmltext>
<tokenext>I would like to know why this is the case .
Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives ?
Like Mozy ( http : //mozy.com/ ) , you mean ?
The online backup service that has 250 + GB of my data on it stored without a peep of complaint ?
The one that costs $ 5/month/computer , definitely competitive with buying HDs ?
Maybe you should verify that it is , indeed , not the case before going on a long rant about it .
Just a thought .
Not an employee , just a happy customer .</tokenext>
<sentencetext>I would like to know why this is the case.
Why is there no service out there that can provide backups for large amounts of data at a price that is competitive with using external hard drives?
Like Mozy (http://mozy.com/), you mean?
The online backup service that has 250+ GB of my data on it stored without a peep of complaint?
The one that costs $5/month/computer, definitely competitive with buying HDs?
Maybe you should verify that it is, indeed, not the case before going on a long rant about it.
Just a thought.
Not an employee, just a happy customer.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765</id>
	<title>Mobile Home Directory</title>
	<author>Drizzt Do'Urden</author>
	<datestamp>1245789540000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>At home, I've got a Linux server hosting an LDAP structure to mimic MacOS X Server's config. It is sharing my home directory via NFS. My Macs sync this home directory on login and logout, so all my personal data is centralized for easy backup and available on any Mac I happen to add to my home network.</p></htmltext>
<tokenext>At home , I 've got a Linux server hosting an LDAP structure to mimic MacOS X Server 's config .
It is sharing my home directory via NFS .
My Macs sync this home directory on login and logout , so all my personal data is centralized for easy backup and available on any Mac I happen to add to my home network .</tokenext>
<sentencetext>At home, I've got a Linux server hosting an LDAP structure to mimic MacOS X Server's config.
It is sharing my home directory via NFS.
My Macs sync this home directory on login and logout, so all my personal data is centralized for easy backup and available on any Mac I happen to add to my home network.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443475</id>
	<title>rsync</title>
	<author>timdrijvers</author>
	<datestamp>1245788580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>rsync + hardlinks</htmltext>
<tokenext>rsync + hardlinks</tokenext>
<sentencetext>rsync + hardlinks</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443843</id>
	<title>Partway there, but data safety first</title>
	<author>hawk</author>
	<datestamp>1245789780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Right now, actual synchronization tends to be entirely manual, with scp of subdirectories and possibly a tar -c | tar -x combo to not overwrite newer copies.</p><p>I decided to work on data integrity first--but then I have client info to consider.</p><p>I have a 3x1.5T zraid array using full disks on the main machine, and an external 1.5T for backup (I'll grab another, so I can have two alternating backups). These will stay disconnected when not backing up, and in two other rooms of the house. I'll probably copy zfs snapshots of /home, probably filtered for any /cache/ and so forth. I also have offsite backup that I haven't gotten around to enabling ):</p><p>I'm planning on actually figuring out rsync, and from there specifying the parts of /home/* that get synchronized (possibly to an nfs mount off the main machine?)</p><p>hawk</p></htmltext>
<tokenext>Right now , actual synchronization tends to be entirely manual , with scp of subdirectories and possibly a tar -c | tar -x combo to not overwrite newer copies .
I decided to work on data integrity first -- but then I have client info to consider .
I have a 3x1.5T zraid array using full disks on the main machine , and an external 1.5T for backup ( I 'll grab another , so I can have two alternating backups ) .
These will stay disconnected when not backing up , and in two other rooms of the house .
I 'll probably copy zfs snapshots of /home , probably filtered for any /cache/ and so forth .
I also have offsite backup that I have n't gotten around to enabling ):
I 'm planning on actually figuring out rsync , and from there specifying the parts of /home/ * that get synchronized ( possibly to an nfs mount off the main machine ? )
hawk</tokenext>
<sentencetext>Right now, actual synchronization tends to be entirely manual, with scp of subdirectories and possibly a tar -c | tar -x combo to not overwrite newer copies.
I decided to work on data integrity first--but then I have client info to consider.
I have a 3x1.5T zraid array using full disks on the main machine, and an external 1.5T for backup (I'll grab another, so I can have two alternating backups).
These will stay disconnected when not backing up, and in two other rooms of the house.
I'll probably copy zfs snapshots of /home, probably filtered for any /cache/ and so forth.
I also have offsite backup that I haven't gotten around to enabling ):
I'm planning on actually figuring out rsync, and from there specifying the parts of /home/* that get synchronized (possibly to an nfs mount off the main machine?)
hawk</sentencetext>
</comment>
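[Editorial note: the "tar -c | tar -x combo to not overwrite newer copies" in the comment above can be done with GNU tar's --keep-newer-files on the extracting side. A sketch; the host name and directories are placeholders.]

```shell
#!/bin/sh
# Copy a subtree to another machine, skipping any file that is already
# newer on the receiving side (GNU tar's --keep-newer-files).
# "otherbox" and the directory names are examples.
tar -C "$HOME" -cf - Documents |
  ssh otherbox tar -C /home/me -xf - --keep-newer-files
```

Unlike a plain scp -r, this preserves permissions and timestamps and never regresses a file that was edited more recently on the destination.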
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447331</id>
	<title>GROOVE (MS OFFICE)</title>
	<author>Anonymous</author>
	<datestamp>1245761220000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>No one mentioned this yet? Simple, low bandwidth, easy to remigrate to a new comp/after a reformat... works great for me (mostly only use it for docs, but large pdfs included).</p><p>Works over any internet connection I have ever used.</p><p>It's MS, get over it, it just works. :)</p></htmltext>
<tokenext>No one mentioned this yet ?
Simple , low bandwidth , easy to remigrate to a new comp/after a reformat ... works great for me ( mostly only use it for docs , but large pdfs included ) .
Works over any internet connection I have ever used .
It 's MS , get over it , it just works . : )</tokenext>
<sentencetext>No one mentioned this yet?
Simple, low bandwidth, easy to remigrate to a new comp/after a reformat... works great for me (mostly only use it for docs, but large pdfs included).
Works over any internet connection I have ever used.
It's MS, get over it, it just works. :)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444569</id>
	<title>Re:Unison</title>
	<author>Anonymous</author>
	<datestamp>1245749040000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Gets my vote too. Very simple to use, and highly reliable.</p></htmltext>
<tokenext>Gets my vote too .
Very simple to use , and highly reliable .</tokenext>
<sentencetext>Gets my vote too.
Very simple to use, and highly reliable.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447279</id>
	<title>Nobody mentioned grsync</title>
	<author>garryknight</author>
	<datestamp>1245760860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I first got into syncing when I bought my now-departed Thinkpad. I looked at unison but didn't like the way it seemed to dump large amounts of configuration stuff in my home directory. So I wrote scripts using rsync - one set to sync stuff from the pc to the laptop, and one set to sync the other way.<br> <br>

When I got my EeePC then my Advent (MSI Wind clone) netbook I simply edited my sync scripts (and /etc/hosts) and carried on as usual. Then I discovered grsync which is a graphical front-end to rsync.<br> <br>

Rather than having a script for each directory branch, I just fill in the source and targets, then select which options I want, e.g. which perms to preserve, whether to compress the data (useful when syncing while away from home), whether to delete files that are missing on the target, etc. So instead of a set of scripts, I just have one config file in ~/.grsync which is, of course, pure text and can be hand-edited if necessary.<br> <br>

What's even better about grsync is that between the source and target directory fields is a double-headed arrow; click this and the source becomes the target and vice versa. So I sync to the netbook, do stuff while I'm out of the house, then click the double-headed arrow and sync back to the pc. Couldn't be simpler.<br> <br>

Grsync also has fields for commands to be executed before and after rsync is called, so you can copy, move, zip, write logfiles, etc, etc. It also has a field in which you can enter additional options to rsync, so you could, for example, enter --exclude '*~' if you wanted rsync to ignore certain backup files.<br> <br>

I suppose I should add the usual disclaimer that I'm no relation to the author or his dog. You can find grsync here: <a href="http://www.opbyte.it/grsync/" title="opbyte.it" rel="nofollow">http://www.opbyte.it/grsync/</a> [opbyte.it]</htmltext>
<tokenext>I first got into syncing when I bought my now-departed Thinkpad .
I looked at unison but did n't like the way it seemed to dump large amounts of configuration stuff in my home directory .
So I wrote scripts using rsync - one set to sync stuff from the pc to the laptop , and one set to sync the other way .
When I got my EeePC then my Advent ( MSI Wind clone ) netbook I simply edited my sync scripts ( and /etc/hosts ) and carried on as usual .
Then I discovered grsync which is a graphical front-end to rsync .
Rather than having a script for each directory branch , I just fill in the source and targets , then select which options I want , e.g. which perms to preserve , whether to compress the data ( useful when syncing while away from home ) , whether to delete files that are missing on the target , etc .
So instead of a set of scripts , I just have one config file in ~ /.grsync which is , of course , pure text and can be hand-edited if necessary .
What 's even better about grsync is that between the source and target directory fields is a double-headed arrow ; click this and the source becomes the target and vice versa .
So I sync to the netbook , do stuff while I 'm out of the house , then click the double-headed arrow and sync back to the pc .
Could n't be simpler .
Grsync also has fields for commands to be executed before and after rsync is called , so you can copy , move , zip , write logfiles , etc , etc .
It also has a field in which you can enter additional options to rsync , so you could , for example , enter --exclude ' * ~ ' if you wanted rsync to ignore certain backup files .
I suppose I should add the usual disclaimer that I 'm no relation to the author or his dog .
You can find grsync here : http : //www.opbyte.it/grsync/ [ opbyte.it ]</tokenext>
<sentencetext>I first got into syncing when I bought my now-departed Thinkpad.
I looked at unison but didn't like the way it seemed to dump large amounts of configuration stuff in my home directory.
So I wrote scripts using rsync - one set to sync stuff from the pc to the laptop, and one set to sync the other way.
When I got my EeePC then my Advent (MSI Wind clone) netbook I simply edited my sync scripts (and /etc/hosts) and carried on as usual.
Then I discovered grsync which is a graphical front-end to rsync.
Rather than having a script for each directory branch, I just fill in the source and targets, then select which options I want, e.g. which perms to preserve, whether to compress the data (useful when syncing while away from home), whether to delete files that are missing on the target, etc.
So instead of a set of scripts, I just have one config file in ~/.grsync which is, of course, pure text and can be hand-edited if necessary.
What's even better about grsync is that between the source and target directory fields is a double-headed arrow; click this and the source becomes the target and vice versa.
So I sync to the netbook, do stuff while I'm out of the house, then click the double-headed arrow and sync back to the pc.
Couldn't be simpler.
Grsync also has fields for commands to be executed before and after rsync is called, so you can copy, move, zip, write logfiles, etc, etc.
It also has a field in which you can enter additional options to rsync, so you could, for example, enter --exclude '*~' if you wanted rsync to ignore certain backup files.
I suppose I should add the usual disclaimer that I'm no relation to the author or his dog.
You can find grsync here: http://www.opbyte.it/grsync/ [opbyte.it]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447125</id>
	<title>I wrote about this exact topic in April</title>
	<author>hacker</author>
	<datestamp>1245759780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>You can read more about it here:

</p><p> <a href="http://blog.gnu-designs.com/snapshot-backups-of-everything-using-rsync-including-windows" title="gnu-designs.com">http://blog.gnu-designs.com/snapshot-backups-of-everything-using-rsync-including-windows</a> [gnu-designs.com]

</p><p>Basically I'm using rsnapshot to back up everything, Linux, BSD and yes.. even Windows, with relatively pain-free results. The added benefit (for the Windows users) is that they can browse the snapshot hierarchy (exposed via Samba), to get back any files they want from the hourly/weekly/monthly snapshots on the array.

</p><p>It works beautifully here.</p></htmltext>
<tokenext>You can read more about it here : http : //blog.gnu-designs.com/snapshot-backups-of-everything-using-rsync-including-windows [ gnu-designs.com ] Basically I 'm using rsnapshot to back up everything , Linux , BSD and yes.. even Windows , with relatively pain-free results .
The added benefit ( for the Windows users ) is that they can browse the snapshot hierarchy ( exposed via Samba ) , to get back any files they want from the hourly/weekly/monthly snapshots on the array .
It works beautifully here .</tokenext>
<sentencetext>You can read more about it here:

 http://blog.gnu-designs.com/snapshot-backups-of-everything-using-rsync-including-windows [gnu-designs.com]

Basically I'm using rsnapshot to back up everything, Linux, BSD and yes.. even Windows, with relatively pain-free results.
The added benefit (for the Windows users) is that they can browse the snapshot hierarchy (exposed via Samba), to get back any files they want from the hourly/weekly/monthly snapshots on the array.
It works beautifully here.</sentencetext>
</comment>
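[Editor's note] The rsnapshot setup described boils down to a handful of lines in rsnapshot.conf. A sketch with example paths and retention counts (the file is tab-delimited, and the Windows box is assumed reachable over ssh/rsync; host and user names here are hypothetical):

```
snapshot_root	/backup/snapshots/
retain	hourly	6
retain	daily	7
retain	weekly	4
backup	/home/	localhost/
backup	user@winbox:/cygdrive/c/Users/	winbox/
```

rsnapshot is then driven from cron (e.g. `rsnapshot hourly`), and the snapshot_root can be exported read-only over Samba so Windows users browse the hourly/daily/weekly trees themselves, as the parent describes.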
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477315</id>
	<title>Re:It's called Windows</title>
	<author>skeeto</author>
	<datestamp>1245949980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Uhhmm.... never heard of rsync?</htmltext>
<tokenext>Uhhmm.... never heard of rsync ?</tokenext>
<sentencetext>Uhhmm.... never heard of rsync?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445647</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446251</id>
	<title>NFS</title>
	<author>defaria</author>
	<datestamp>1245755400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>In a word - NFS. When you only have 1 home directory there's nothing to sync!</htmltext>
<tokenext>In a word - NFS .
When you only have 1 home directory there 's nothing to sync !</tokenext>
<sentencetext>In a word - NFS.
When you only have 1 home directory there's nothing to sync!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444645</id>
	<title>Apple MobileMe</title>
	<author>Brice21</author>
	<datestamp>1245749340000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext>I use the MobileMe service from Apple. The 20 Gb iDisk is enough for syncing files between my several computers. It's mounted on the desktop of my various Macs and with the iDisk's cache, copying files is instantaneous and I can work on my files while offline. When I get back to an internet connection, it syncs. I can also access my iDisk using the web interface or mount the iDisk over WebDAV on Windows.

Also MobileMe syncs my iPhone and my computers' Calendars, Address Book, Keychains, Bookmarks, Dashboard Widgets, Dock Items, Mail Accounts, rules, signatures, smart mailboxes, Notes, Preferences (including serial numbers) and FTP favorites. I can use the me.com website to access my address book, calendars, files and emails from any web browser.

Configuration is very simple (login, password, tick a checkbox) and requires no software install.</htmltext>
<tokenext>I use the MobileMe service from Apple .
The 20 Gb iDisk is enough for syncing files between my several computers .
It 's mounted on the desktop of my various Macs and with the iDisk 's cache , copying files is instantaneous and I can work on my files while offline .
When I get back to an internet connection , it syncs .
I can also access my iDisk using the web interface or mount the iDisk over WebDAV on Windows .
Also MobileMe syncs my iPhone and my computers ' Calendars , Address Book , Keychains , Bookmarks , Dashboard Widgets , Dock Items , Mail Accounts , rules , signatures , smart mailboxes , Notes , Preferences ( including serial numbers ) and FTP favorites .
I can use the me.com website to access my address book , calendars , files and emails from any web browser .
Configuration is very simple ( login , password , tick a checkbox ) and requires no software install .</tokenext>
<sentencetext>I use the MobileMe service from Apple.
The 20 Gb iDisk is enough for syncing files between my several computers.
It's mounted on the desktop of my various Macs and with the iDisk's cache, copying files is instantaneous and I can work on my files while offline.
When I get back to an internet connection, it syncs.
I can also access my iDisk using the web interface or mount the iDisk over WebDAV on Windows.
Also MobileMe syncs my iPhone and my computers' Calendars, Address Book, Keychains, Bookmarks, Dashboard Widgets, Dock Items, Mail Accounts, rules, signatures, smart mailboxes, Notes, Preferences (including serial numbers) and FTP favorites.
I can use the me.com website to access my address book, calendars, files and emails from any web browser.
Configuration is very simple (login, password, tick a checkbox) and requires no software install.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447149</id>
	<title>Re:USB drive</title>
	<author>Anonymous</author>
	<datestamp>1245760020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>That's half an answer....  how do you automatically back up to the server?  What do you use for encryption?</p></htmltext>
<tokenext>That 's half an answer.... how do you automatically back up to the server ?
What do you use for encryption ?</tokenext>
<sentencetext>That's half an answer....  how do you automatically back up to the server?
What do you use for encryption?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450291</id>
	<title>Re:Dropbox is NOT open source</title>
	<author>jcn</author>
	<datestamp>1245836340000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext>You may want to check out<blockquote><div><p>    <a href="http://blog.hacker.dk/2008/10/dropbox-is-not-open-source/" title="hacker.dk" rel="nofollow">Dan Kolleth's rant</a> [hacker.dk]</p></div></blockquote></htmltext>
<tokenext>You may want to check out Dan Kolleth 's rant [ hacker.dk ]</tokenext>
<sentencetext>You may want to check out Dan Kolleth's rant [hacker.dk]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451515</id>
	<title>Re:Dropbox</title>
	<author>Anonymous</author>
	<datestamp>1245853080000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Sure, if you don't care about privacy or the added hassle of encrypting files before shipping them off, use something like Dropbox.</p><p>And do yourself a favor: READ THE FINE PRINT.   Companies like Google, et al., are known to index your content and use those results for undisclosed purposes (sometimes, I believe, without your consent).</p></htmltext>
<tokenext>Sure , if you do n't care about privacy or the added hassle of encrypting files before shipping them off , use something like Dropbox .
And do yourself a favor , READ THE FINE PRINT .
Companies like Google , et al , are known to index your content and use those results for undisclosed purposes ( sometimes , I believe , without your consent ) .</tokenext>
<sentencetext>Sure, if you don't care about privacy or the added hassle of encrypting files before shipping them off, use something like Dropbox.
And do yourself a favor: READ THE FINE PRINT.
Companies like Google, et al., are known to index your content and use those results for undisclosed purposes (sometimes, I believe, without your consent).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447915</id>
	<title>Re:If you're willing to spend some $</title>
	<author>Anonymous</author>
	<datestamp>1245765660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Hooray for astroturfing.</htmltext>
<tokenext>Hooray for astroturfing .</tokenext>
<sentencetext>Hooray for astroturfing.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444149</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28456523</id>
	<title>AJC Directory Synchronizer</title>
	<author>Anonymous</author>
	<datestamp>1245872940000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I recommend AJC Directory Synchronizer.<br>I can sync two directories in either direction and make incremental backups of the deleted files.<br>It's great for backing up directories to another hard drive, or USB drive.<br>See <a href="http://www.ajcsoft.com/ProductsAJCDirSync.php" title="ajcsoft.com" rel="nofollow">http://www.ajcsoft.com/ProductsAJCDirSync.php</a> [ajcsoft.com]</p></htmltext>
<tokenext>I recommend AJC Directory Synchronizer .
I can sync two directories in either direction and make incremental backups of the deleted files .
It 's great for backing up directories to another hard drive , or USB drive .
See http : //www.ajcsoft.com/ProductsAJCDirSync.php [ ajcsoft.com ]</tokenext>
<sentencetext>I recommend AJC Directory Synchronizer.
I can sync two directories in either direction and make incremental backups of the deleted files.
It's great for backing up directories to another hard drive, or USB drive.
See http://www.ajcsoft.com/ProductsAJCDirSync.php [ajcsoft.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941</id>
	<title>Unison</title>
	<author>ashayh</author>
	<datestamp>1245790140000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><a href="http://www.cis.upenn.edu/~bcpierce/unison/" title="upenn.edu">Unison</a> [upenn.edu]</htmltext>
<tokenext>Unison [ upenn.edu ]</tokenext>
<sentencetext>Unison [upenn.edu]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444507</id>
	<title>I use a file server...</title>
	<author>RocketRabbit</author>
	<datestamp>1245748800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>With well over 2TB of RAW files from my digital cameras, a bunch of music, and really all kinds of stuff that I've saved or generated over the years, there's no really good way to manage a huge file library over multiple computers - nor is there any point in doing so.</p><p>For years a Windows Media Center Edition PC that my wife won at the state fair (in a truth telling contest) did double duty as a file server.  I recently let the smoke out of it with some horrifically bad RAM, and am building its replacement using FreeBSD with MythTV, and using ZFS as the underlying fabric of what I hope to be a long-term stable and expandable storage server.  That way there is one master copy of everything - no merging two divergent variations of a music library that were once identical (for example).  I have saved a great deal of space, too, with the on-the-fly compression that ZFS offers.</p><p>Doing this and then sharing the resulting system over SMB and NFS is a hell of a lot less painful than anything else.</p></htmltext>
<tokenext>With well over 2TB of RAW files from my digital cameras , a bunch of music , and really all kinds of stuff that I 've saved or generated over the years , there 's no really good way to manage a huge file library over multiple computers - nor is there any point in doing so .
For years a Windows Media Center Edition PC that my wife won at the state fair ( in a truth telling contest ) did double duty as a file server .
I recently let the smoke out of it with some horrifically bad RAM , and am building its replacement using FreeBSD with MythTV , and using ZFS as the underlying fabric of what I hope to be a long-term stable and expandable storage server .
That way there is one master copy of everything - no merging two divergent variations of a music library that were once identical ( for example ) .
I have saved a great deal of space , too , with the on-the-fly compression that ZFS offers .
Doing this and then sharing the resulting system over SMB and NFS is a hell of a lot less painful than anything else .</tokenext>
<sentencetext>With well over 2TB of RAW files from my digital cameras, a bunch of music, and really all kinds of stuff that I've saved or generated over the years, there's no really good way to manage a huge file library over multiple computers - nor is there any point in doing so.
For years a Windows Media Center Edition PC that my wife won at the state fair (in a truth telling contest) did double duty as a file server.
I recently let the smoke out of it with some horrifically bad RAM, and am building its replacement using FreeBSD with MythTV, and using ZFS as the underlying fabric of what I hope to be a long-term stable and expandable storage server.
That way there is one master copy of everything - no merging two divergent variations of a music library that were once identical (for example).
I have saved a great deal of space, too, with the on-the-fly compression that ZFS offers.
Doing this and then sharing the resulting system over SMB and NFS is a hell of a lot less painful than anything else.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444723</id>
	<title>gmail</title>
	<author>Dan667</author>
	<datestamp>1245749520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Maybe this is too simplistic, but for non-critical data I email myself with the attachment.  Has search and can be accessed anywhere with web access.</htmltext>
<tokenext>Maybe this is too simplistic , but for non-critical data I email myself with the attachment .
Has search and can be accessed anywhere with web access .</tokenext>
<sentencetext>Maybe this is too simplistic, but for non-critical data I email myself with the attachment.
Has search and can be accessed anywhere with web access.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451189</id>
	<title>Re:Subversion with a touch of bash</title>
	<author>digitalderbs</author>
	<datestamp>1245850680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I tried subversion, and moved to bazaar because it handles renames easily (automv plugin), and it supports pushing a commit to a server (push-and-update plugin). It's also well suited for diffing binary files (as is git).<br> <br>

I am, however, interested in seeing your scripts. I've set up some automated scripts to do this work for me, but I suspect they're not optimal for this. Thanks.</htmltext>
<tokenext>I tried subversion , and moved to bazaar because it handles renames easily ( automv plugin ) , and it supports pushing a commit to a server ( push-and-update plugin ) .
It 's also well suited for diffing binary files ( as is git ) .
I am , however , interested in seeing your scripts .
I 've set up some automated scripts to do this work for me , but I suspect they 're not optimal for this .
Thanks .</tokenext>
<sentencetext>I tried subversion, and moved to bazaar because it handles renames easily (automv plugin), and it supports pushing a commit to a server (push-and-update plugin).
It's also well suited for diffing binary files (as is git).
I am, however, interested in seeing your scripts.
I've set up some automated scripts to do this work for me, but I suspect they're not optimal for this.
Thanks.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663</parent>
</comment>
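[Editor's note] The automated-commit step discussed above can be sketched in a few lines of shell. Shown here with git for concreteness (bzr and svn have equivalent commands); the repo path and author identity are throwaways for illustration. The key trick is committing only when the tree actually changed, so a cron entry stays quiet between edits.

```shell
# Stand-in for a version-controlled home directory.
repo=$(mktemp -d)
git init -q "$repo"
git -C "$repo" config user.email user@example.com
git -C "$repo" config user.name autocommit
echo 'set -o vi' > "$repo/.profile"    # pretend a dotfile changed
# Stage everything; commit only if something is actually staged.
git -C "$repo" add -A
git -C "$repo" diff --cached --quiet || \
    git -C "$repo" commit -q -m "autocommit $(date -u '+%F %T')"
```

Run hourly from cron, this produces one commit per burst of changes; a follow-up `git push` (or bzr's push-and-update plugin) would propagate it to the other machines.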
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443495</id>
	<title>Dropbox</title>
	<author>princessproton</author>
	<datestamp>1245788640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Since I don't have a lot of content to sync, <a href="https://www.getdropbox.com/link/21.5ymysR88uu" title="getdropbox.com" rel="nofollow">Dropbox</a> [getdropbox.com] meets my needs perfectly (and it's free!).</p></htmltext>
<tokenext>Since I do n't have a lot of content to sync , Dropbox [ getdropbox.com ] meets my needs perfectly ( and it 's free ! ) .</tokenext>
<sentencetext>Since I don't have a lot of content to sync, Dropbox [getdropbox.com] meets my needs perfectly (and it's free!).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909</id>
	<title>Re:Dropbox</title>
	<author>buchner.johannes</author>
	<datestamp>1245790020000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>4</modscore>
	<htmltext><p>Have a look at <a href="http://slashdot.org/~buchner.johannes/journal/230647" title="slashdot.org">Jake</a> [slashdot.org]. Official website: <a href="http://jakeapp.com/" title="jakeapp.com">Jake</a> [jakeapp.com]</p><p>It is aimed at the average user (no server setup needed) and provides a syncing solution across the Internet with a nice UI. Free and open source, available for all operating systems.</p><p>Check it out!</p></htmltext>
<tokenext>Have a look at Jake [ slashdot.org ] .
Official website : Jake [ jakeapp.com ] .
It is aimed at the average user ( no server setup needed ) and provides a syncing solution across the Internet with a nice UI .
Free and open source , available for all operating systems .
Check it out !</tokenext>
<sentencetext>Have a look at Jake [slashdot.org].
Official website: Jake [jakeapp.com].
It is aimed at the average user (no server setup needed) and provides a syncing solution across the Internet with a nice UI.
Free and open source, available for all operating systems.
Check it out!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446521</id>
	<title>Re:Dropbox</title>
	<author>vikstar</author>
	<datestamp>1245756720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There is no advertising in Dropbox, and it's free... why?<br>I'm a bit cautious of something that is free, has no advertising, and requires an online server.</p></htmltext>
<tokenext>There is no advertising in Dropbox , and it 's free ... why ?
I 'm a bit cautious of something that is free , has no advertising , and requires an online server .</tokenext>
<sentencetext>There is no advertising in Dropbox, and it's free... why?
I'm a bit cautious of something that is free, has no advertising, and requires an online server.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448445</id>
	<title>Server?</title>
	<author>mrbcs</author>
	<datestamp>1245771780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have a server on the LAN, accessible via the web, that does automatic backup every night. What else do you need?</htmltext>
<tokenext>I have a server on the LAN , accessible via the web , that does automatic backup every night .
What else do you need ?</tokenext>
<sentencetext>I have a server on the LAN, accessible via the web, that does automatic backup every night.
What else do you need?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447159</id>
	<title>For our customers.....</title>
	<author>Anonymous</author>
	<datestamp>1245760080000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Our company has had good luck with two products.... Novell Storage Manager and Novell iFolder.  We don't use other Novell products but these two are rock solid.  Once we get past the "Novell" stigma our customers won't go back to the old manual way of doing things in microsoft active directory (MAD) or to old home directories in whatever other system they have.  On-the-go employees like iFolder because it keeps everything up to date regardless of location and is fast.  Larger customers love NSM because of policy-based storage management, which is really quite amazing.  The support for the products, when there is the need, is great too.</p></htmltext>
<tokenext>Our company has had good luck with two products.... Novell Storage Manager and Novell iFolder .
We do n't use other Novell products but these two are rock solid .
Once we get past the " Novell " stigma our customers wo n't go back to the old manual way of doing things in microsoft active directory ( MAD ) or to old home directories in whatever other system they have .
On-the-go employees like iFolder because it keeps everything up to date regardless of location and is fast .
Larger customers love NSM because of policy-based storage management which is really quite amazing .
The support for the products , when there is the need , is great too .</tokenext>
<sentencetext>Our company has had good luck with two products.... Novell Storage Manager and Novell iFolder.
We don't use other Novell products but these two are rock solid.
Once we get past the "Novell" stigma our customers won't go back to the old manual way of doing things in microsoft active directory (MAD) or to old home directories in whatever other system they have.
On-the-go employees like iFolder because it keeps everything up to date regardless of location and is fast.
Larger customers love NSM because of policy-based storage management which is really quite amazing.
The support for the products, when there is the need, is great too.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445145</id>
	<title>My setup - Linux/Windows/Mac clients</title>
	<author>digitalhermit</author>
	<datestamp>1245751020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The heart of my network is a xen based server running fileserver, DNS, DHCP, web proxy and LDAP.</p><p>Physically, the system is an older Athlon 2200 with 3G RAM, mirrored 100G disks for the base OS and disk files. The software stack is CentOS 5.3 based, running xen.</p><p>The fileserver virtual machine has its own 500G single disk assigned to it on which I keep media files, pictures etc.  I run a rsync backup script to another physical machine. The really critical stuff (family pictures, irreplaceable documents) also gets backed up to an external hard drive every week or so (no set schedule). I don't have an automated process to back up my OS images though.</p><p>I run CentOS directory server (now Port389). Windows and Linux clients can authenticate, but most systems have local authentication. I also use autodir/autofs on Linux/Solaris systems.  This allows me to log in on any Unix/Linux system and have my entire work environment ready. I have the following .profile that lets me keep separate profiles for each login:</p><p>## Profile Script<br>LDAP_HOME: Contains the NFS automount directories for a subset of LDAP users.</p><p>LDAP_CURRENT_HOST=`hostname -s`</p><p>if [ -f ~/.hostconfig/${LDAP_CURRENT_HOST}.profile ]; then<br>
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; . ~/.hostconfig/${LDAP_CURRENT_HOST}.profile<br>fi</p><p>For my Linux clients, I automount the fileshare to /mnt/fileserver. Each home directory has a symlink to that mount point. You can put this in the skel startup so that each new user gets the link.  This allows anyone on my network to play music, watch movies, and view pictures from the share. I set up the directory structure on the fileshare as follows:</p><p>Media:  Contains subdirs for Audio, Video, and Images.<br>Documentation: Contains subdirs for Computer, Household, Film, etc.<br>CVS:  The main CVS repository for my work.<br>Software: Contains subdirs for Windows, Linux, MacOS, Java, Solaris.</p><p>For the Windows clients, I define a network drive on each system pointing to the fileserver.  I can also access the CVS server via Eclipse when I need to do something with Java or Windows Perl.</p><p>There are some downsides to my setup. I tend to upload images from my Windows machines and for the most part, these are laptops that connect wirelessly, so synchronization can take hours to upload an 8G CF card. I'm happy with the setup, however. Placing something in a "critical" folder means it gets a backup rotation so I can retrieve earlier versions. Other stuff is backed up, but not at a high priority and with no versioning.</p></htmltext>
<tokenext>The heart of my network is a xen based server running fileserver , DNS , DHCP , web proxy and LDAP.Physically , the system is an older Athlon 2200 with 3G RAM , mirrored 100G disks for the base OS and disk files .
The software stack is CentOS 5.3 based , running xen.The fileserver virtual machine has its own 500G single disk assigned to it one which I keep media files , pictures etc .
I run a rsync backup script to another physical machine .
The really critical stuff ( family pictures , irreplaceable documents ) also get backed up to an external hard drive every week or so ( no set schedule ) .
I do n't have an automated process to back up my OS images though.I run CentOS directory server ( now Port389 ) .
Windows and Linux clients can authenticate , but most systems have local authentication .
I also use autodir/autofs on Linux/Solaris systems .
This allows me to login on any Unix/Linux system and have my entire work environment ready .
I have the following .profile that lets me keep separate profiles for each login : # # Profile ScriptLDAP \ _HOME : Contains the NFS automount directories for a subset of LDAP users.LDAP \ _CURRENT \ _HOST = ` hostname -s ` if [ -f ~ /.hostconfig/ $ { LDAP \ _CURRENT \ _HOST } .profile ] ; then                 .
~ /.hostconfig/ $ { LDAP \ _CURRENT \ _HOST } .profilefiFor my Linux clients , I automount the fileshare to /mnt/fileserver .
Each home directory has a symlink to that mount point .
You can put this in the skel startup so that each new user gets the link .
This allows anyone on my network to play music , watch movies , view pictures , from the share .
I set up the directory structure on the fileshare as follows : Media : Contains subdirs for Audio , Video , and Images.Documentation : Contains subdirs for Computer , Household , Film , etc..CVS : The main CVS repository for my workSoftware : Contains subdirs for Windows , Linux , MacOS , Java , SolarisFor the Windows clients , I define a network drive on each system pointing to the fileserver .
I can also access the CVS server via Eclipse when I need to do something with Java or Windows Perl.There are some downsides to my setup .
I tend to upload images from my Windows machines and for the most part , these are laptops that connect wirelessly so synchronization can take hours to upload an 8G CF card .
I 'm happy with the setup , however .
Placing something in a " critical " folder means it gets a backup rotation so I can retrieve earlier versions .
Other stuff is backed up , but not at a high priority and no versioning .</tokentext>
<sentencetext>The heart of my network is a xen based server running fileserver, DNS, DHCP, web proxy and LDAP.Physically, the system is an older Athlon 2200 with 3G RAM, mirrored 100G disks for the base OS and disk files.
The software stack is CentOS 5.3 based, running xen.The fileserver virtual machine has its own 500G single disk assigned to it one which I keep media files, pictures etc.
I run a rsync backup script to another physical machine.
The really critical stuff (family pictures, irreplaceable documents) also get backed up to an external hard drive every week or so (no set schedule).
I don't have an automated process to back up my OS images though.I run CentOS directory server (now Port389).
Windows and Linux clients can authenticate, but most systems have local authentication.
I also use autodir/autofs on Linux/Solaris systems.
This allows me to login on any Unix/Linux system and have my entire work environment ready.
I have the following .profile that lets me keep separate profiles for each login:## Profile ScriptLDAP\_HOME: Contains the NFS automount directories for a subset of LDAP users.LDAP\_CURRENT\_HOST=`hostname -s`if [ -f ~/.hostconfig/${LDAP\_CURRENT\_HOST}.profile ]; then
                .
~/.hostconfig/${LDAP\_CURRENT\_HOST}.profilefiFor my Linux clients, I automount the fileshare to /mnt/fileserver.
Each home directory has a symlink to that mount point.
You can put this in the skel startup so that each new user gets the link.
This allows anyone on my network to play music, watch movies, view pictures, from the share.
I set up the directory structure on the fileshare as follows:Media:  Contains subdirs for Audio, Video, and Images.Documentation: Contains subdirs for Computer, Household, Film, etc..CVS:  The main CVS repository for my workSoftware: Contains subdirs for Windows, Linux, MacOS, Java, SolarisFor the Windows clients, I define a network drive on each system pointing to the fileserver.
I can also access the CVS server via Eclipse when I need to do something with Java or Windows Perl.There are some downsides to my setup.
I tend to upload images from my Windows machines and for the most part, these are laptops that connect wirelessly so synchronization can take hours to upload an 8G CF card.
I'm happy with the setup, however.
Placing something in a "critical" folder means it gets a backup rotation so I can retrieve earlier versions.
Other stuff is backed up, but not at a high priority and no versioning.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444249</id>
	<title>AD with redirected my documents</title>
	<author>mordred99</author>
	<datestamp>1245747840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>I have AD on my file server (along with DNS, Exchange, etc.), and through Group Policy I set up folder redirection of "My Documents" to a \\server\user\mydocs directory.  Every user with a domain account gets these folders created automatically.  I also created the local settings folders (for Outlook and things like that) in the \\server\user\settings directories.  For my Linux box, I create a folder /mnt/user, then in /etc/fstab create a mount there with full rights.  Finally, in my Linux box's home directory I have a documents folder, in which I create a link to the mount point.

MP3s and other files are stored in a separate directory on the same NAS on the Windows box; I create mounts for them in Linux and can access them via Windows shares published in AD.  However, I use Jinzora as the front end to access all my music and files.

It is not a lot of work to set up, and took maybe 30 minutes once I had the servers up and AD configured.  All PCs in my house can connect (wireless, etc.), and so can my Xbox 360.</htmltext>
<tokenext>I have AD on my file server ( and DNS , exchange , etc .
) but setup through group policy folder re-direction of the " my documents " to a \ \ server \ user \ mydocs directory .
Every user that has a domain gets these files created automatically .
I also created the local settings folders as well ( thus outlook and things like that ) in the \ \ server \ user \ settings directories .
For my linux box I create a folder /mnt/user then in the /etc/fstab create a mount there with full rights .
Then finally in my linux box home directory , I have a documents folder , which I create a link to the mount point .
For MP3 and other files they are stored in a separate directory on the same NAS on the windows box , create mounts for them in linux , and can access them via windows shares published in AD .
However I use Jiznora as the front end to access all my music and files .
It is not a lot of work to setup , and took maybe 30 minutes once I had the servers up and an AD setup .
All pcs can connect in my house ( wireless , etc .
) and also my xbox 360 .</tokentext>
<sentencetext>I have AD on my file server (and DNS, exchange, etc.
) but setup through group policy folder re-direction of the "my documents" to a \\server\user\mydocs directory.
Every user that has a domain gets these files created automatically.
I also created the local settings folders as well (thus outlook and things like that) in the \\server\user\settings directories.
For my linux box I create a folder /mnt/user then in the /etc/fstab create a mount there with full rights.
Then finally in my linux box home directory, I have a documents folder, which I create a link to the mount point.
For MP3 and other files they are stored in a separate directory on the same NAS on the windows box, create mounts for them in linux, and can access them via windows shares published in AD.
However I use Jiznora as the front end to access all my music and files.
It is not a lot of work to setup, and took maybe 30 minutes once I had the servers up and an AD setup.
All pcs can connect in my house (wireless, etc.
) and also my xbox 360.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445839</id>
	<title>Unison; and maybe git in the future.</title>
	<author>vyrus128</author>
	<datestamp>1245753600000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
<htmltext><p>Currently? Just unison -quiet, running from cron. (I have it wrapped in a script that does locking, since Unison doesn't seem to lock against itself reliably, for reasons I don't understand.) I've had two problems worth watching out for:<br>1) Try to avoid running it against NFS. It walks the entire synced area every time you sync. Local disk will be two orders of magnitude faster.<br>2) Be careful syncing between case-sensitive and case-insensitive filesystems. Unison will start failing out if you ever create two files differing only in case.</p><p>Beyond that, I'm looking to start using git to version both my code and my textual data. I'm not intending to use git itself to sync the repositories; I'm going to use it for versioning only, and keep syncing using Unison. The reason is that I'm the only user, and for my own convenience I'd like the working copy to be synced. All I really need out of git is versioning anyway; I already have a workable solution for syncing.</p></htmltext>
<tokenext>Currently ?
Just unison -quiet , running from cron .
( I have it wrapped in a script that does locking , since Unison does n't seem to lock against itself reliably , for reasons I do n't understand .
) I 've had two problems worth watching out for : 1 ) Try to avoid running it against NFS .
It walks the entire synced area every time you sync .
Local disk will be two orders of magnitude faster.2 ) Be careful syncing between case-sensitive and case-insensitive filesystems .
Unison will start failing out if you ever create two files differing only in case.Beyond that , I 'm looking to start using git to version both my code and my textual data .
I 'm not intending to use git itself to sync the repositories ; I 'm going to use it for versioning only , and keep syncing using Unison .
The reason is because I 'm the only user , and for my own convenience I 'd like the working copy to be synced .
All I really need out of git is versioning anyway ; I already have a workable solution for syncing .</tokentext>
<sentencetext>Currently?
Just unison -quiet, running from cron.
(I have it wrapped in a script that does locking, since Unison doesn't seem to lock against itself reliably, for reasons I don't understand.
) I've had two problems worth watching out for:1) Try to avoid running it against NFS.
It walks the entire synced area every time you sync.
Local disk will be two orders of magnitude faster.2) Be careful syncing between case-sensitive and case-insensitive filesystems.
Unison will start failing out if you ever create two files differing only in case.Beyond that, I'm looking to start using git to version both my code and my textual data.
I'm not intending to use git itself to sync the repositories; I'm going to use it for versioning only, and keep syncing using Unison.
The reason is because I'm the only user, and for my own convenience I'd like the working copy to be synced.
All I really need out of git is versioning anyway; I already have a workable solution for syncing.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444221</id>
	<title>http://www.gbridge.com/</title>
	<author>jsnipy</author>
	<datestamp>1245747720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><a href="http://www.gbridge.com/" title="gbridge.com">http://www.gbridge.com/</a>
Very underrated.</htmltext>
<tokenext>http : //www.gbridge.com/ [ gbridge.com ] very underrated</tokentext>
<sentencetext>http://www.gbridge.com/ [gbridge.com]
very underrated</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443723</id>
	<title>portable hard drive</title>
	<author>Compunexus</author>
	<datestamp>1245789420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>I have found that a portable hard drive with an encrypted partition works best. I put DSL (Damn Small Linux) in a bootable partition in case I have to work on an untrusted PC. FEBE and GMarks cover the worst of my needs on Firefox. If someone swipes my drive, they don't get my data.</p></htmltext>
<tokenext>I have found that a portable hard drive with an encrypted partition works best .
Put DSL in the bootable partition in case I have to work at an untrusted PC .
FEBE and gmarks cover the worst of my needs on Firefox .
If someone swipes my drive , they do n't get my data .</tokentext>
<sentencetext>I have found that a portable hard drive with an encrypted partition works best.
Put DSL in the bootable partition in case I have to work at an untrusted PC.
FEBE and gmarks cover the worst of my needs on Firefox.
If someone swipes my drive, they don't get my data.</sentencetext>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443525
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450063
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445129
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28493129
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_36</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445397
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477615
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451189
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_43</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446321
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28480619
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444267
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448215
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445419
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_49</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448765
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450291
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_48</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444923
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_39</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444071
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444039
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444817
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443781
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446747
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444303
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443835
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_47</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444473
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448467
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_37</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444149
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447915
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444447
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_42</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444569
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477455
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443953
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444277
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447311
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444117
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446163
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_50</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28453179
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_41</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444395
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444769
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_40</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445097
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443861
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444713
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446943
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444063
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445529
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_46</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445899
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451513
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447149
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444189
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450441
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446521
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450665
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_38</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443699
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449359
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_45</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445673
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443983
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449077
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450397
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_44</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28454069
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_35</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451515
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_23_1823201_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445647
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477315
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444247
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443941
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445129
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28493129
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444569
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444473
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446943
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445839
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444519
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443765
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448765
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477455
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443953
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443419
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444039
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444817
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444063
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443659
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446163
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450441
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443699
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.35</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445899
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451513
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443755
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450397
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477615
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443475
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443875
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444303
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444923
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444277
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447311
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443729
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445529
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446321
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444459
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445373
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443533
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444071
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444189
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443881
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446597
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444237
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443469
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445397
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448215
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450291
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443983
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449077
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444267
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451515
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443861
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444117
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443909
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28480619
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445419
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444447
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444529
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444713
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444769
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443781
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443835
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446521
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450665
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444305
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28449359
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28454069
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443525
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28450063
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443849
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446663
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28451189
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28453179
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445465
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443649
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447149
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28448467
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444149
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28447915
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444495
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446747
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445673
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443701
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444169
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443499
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444395
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445097
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443653
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444103
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444845
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28445647
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28477315
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28444351
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443429
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28443923
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_23_1823201.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_23_1823201.28446791
</commentlist>
</conversation>
