Comments on WebDiarios de Motocicleta: Cuckoo Hashing

Mihai (2010-02-06 20:24):
Glad you figured it out!

Again, I believe Moser's paper got its hype because it did something smart in its handling of the Lovász lemma (which I don't understand, since I didn't think about this problem -- which is probably also the case for 90% of the hype-generators, but that's another story :). It's certainly not the case that this paper is the root of all encoding proofs :)

Anonymous (2010-02-06 13:49):
By the way, Moser's proof is essentially an encoding proof. It basically shows that, after some startup cost, each step of the algorithm compresses k random bits into log(dk) bits. So as long as log(dk) < k, there had better not be too many steps.

Anonymous (2010-02-06 13:46):
Ah, ok. I see it now. This is the part I missed:

"all edges of the path, in order, taking (k-1) lg n bits"

I didn't understand that this gives you the key that generated each edge, so you can fully recover the functions g and h.

Thanks,
Pat

[This comment has been removed by the author. (2010-02-06 13:13)]

Mihai (2010-02-06 12:15):
I haven't looked at Moser's paper, but, given the blog hype it received, I sincerely hope he does something smarter than just expressing probabilities in encoding terminology. This is a trivial idea that I presume many people had.

As for the encoding: an edge is identical to a key. In other words, I may assume that the keys are {1, ..., n}. You notice how I always say that I am encoding an edge with lg n bits.

To put it in your terminology, there are n edges h(x_i)--g(x_i). I can encode which edge is involved in some subgraph using lg n bits. So I do indeed recover the h and g values for all edges.

Anonymous (2010-02-06 07:49):
This is a nice idea that looks like another application of "Moser's Method," but there's something I don't quite get.

In your encoding proof, H is the entropy of the bitstring h(x_1), g(x_1), ..., h(x_n), g(x_n).

Can you recover this bitstring from the encoding? As far as I can see, you can discover which edges are in the Cuckoo graph, but not which keys generated those edges. Or did I miss something?
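To make the "an edge is identical to a key" point from the thread concrete, here is a small Python sketch (my own illustration, not code from the post or the proof): with keys identified as 1..n, each key x contributes the edge h(x)--g(x) to the cuckoo graph, so naming a key index costs lg n bits and lets a decoder recover both hash values of that key -- which is exactly the step Pat was asking about. The function name and the use of seeded pseudo-random hash values are assumptions made for the demo.

```python
import math
import random

def build_cuckoo_graph(n, m, seed=0):
    """Return the per-key hash values (h, g) into two tables of size m.

    Each key x in {1, ..., n} defines the edge h(x)--g(x) of the
    cuckoo graph. (Real cuckoo hashing would use proper hash
    functions; seeded randomness stands in for them here.)
    """
    rng = random.Random(seed)
    h = {x: rng.randrange(m) for x in range(1, n + 1)}
    g = {x: rng.randrange(m) for x in range(1, n + 1)}
    return h, g

n, m = 8, 16
h, g = build_cuckoo_graph(n, m)

# The "encoding" of an edge is just the key's index, costing
# ceil(lg n) bits. Decoding that index recovers the edge's
# endpoints, i.e. both h(x) and g(x) for that key.
bits_per_edge = math.ceil(math.log2(n))
for key in (3, 7):
    print(key, "->", (h[key], g[key]), "using", bits_per_edge, "bits")
```

So a path of k vertices, written as its keys in order, costs about (k-1) lg n bits, and from it the decoder recovers the h and g values of every key on the path.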