On Sat, Oct 29, 2011 at 1:11 AM, Dirk Loss <span dir="ltr"><<a href="mailto:lists@dirk-loss.de">lists@dirk-loss.de</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
So files placed in a deeper directory structure have a higher risk of<br>
getting lost, because every parent directory has to be available and<br>
correct in order to find the files?<br>
<br>
And since each of the parent directories might be placed on a different<br>
set of servers, files may get lost because too many servers holding<br>
information about their parent directories have failed -- although<br>
enough servers might be available to reconstruct the files themselves?<br></blockquote><div><br></div><div>This is true iff S > H (where S is the number of servers in the grid), and is much more likely if S >> H. >From a mathematical perspective, if you compute the number of ways you can choose H from S then you have the number of distinct "loss sets" into which your files are distributed (uniformly, in the case of a stable grid). For very large grids, the number of loss sets is so big that every file essentially lives or dies independently of every other (or close enough).</div>
<div><br></div><div>This is one of the reasons why I prefer to set H=N=S. That way all of my files live or die together and the deep-tree problem is irrelevant. It's funny but true that this is a case where putting all your eggs in one basket is better than distributing them across many baskets. It makes sense, though, when you realize that dropping one basket will break not just the eggs in that basket, but eggs in many other baskets as well.</div>
<div> </div></div>-- <br>Shawn<br>