“You have to have good researchers. Good researchers, all of them.” – Paraphrase of quote from Das Boot.
Sometimes writing a paper can be a challenging slog: goals move as conjectures prove wrong; facts that seem simple turn out to be slippery to prove; experiments must be rerun as the paper evolves; and notation needs to be redone for the sake of readability.
In such a situation, it’s a blessing to have coauthors who will go the extra mile to keep pushing and pushing the paper to completion. So I would now really like to thank my students Jeffrey Knockel and George Saad for their hard work on our latest paper: “Self-Healing of Byzantine Faults” (abstract below, full paper here).
Note: I wouldn’t exactly compare the research process with serving on a U-boat mission during the decline of the Third Reich. For me it’s most similar to mountaineering, whose ethos is captured by the motto of the Seattle mountaineers: “Like fun, only different!”
Recent years have seen significant interest in designing networks that are self-healing in the sense that they can automatically recover from adversarial attack. Previous work shows that it is possible for a network to automatically recover, even when an adversary repeatedly deletes nodes in the network. However, there have not yet been any algorithms that self-heal in the case where an adversary takes over nodes in the network. In this paper, we address this gap.
In particular, we describe a communication network over n nodes that ensures the following properties, even when an adversary controls up to a bounded fraction of the nodes. First, the network provides point-to-point communication with bandwidth and latency costs that are asymptotically optimal. Second, only a bounded number of message corruptions occur in expectation before the adversarially controlled nodes are effectively quarantined, so that they cause no more corruptions. We present empirical results showing that our approach may be practical.
PODC 2012 accepted papers are out (and have been for a while). This looks like a great collection of papers, with many focusing on problems in distributed reliability/security. It’s also nice to see some of my former students represented (Maxwell Young and Amitabh Trehan). Some papers that caught my eye:
Adi Akavia, Shafi Goldwasser, and Carmit Hazay
Distributed Public Key Schemes Secure against Continual Leakage
Faster Randomized Consensus with an Oblivious Adversary
Binbin Chen, Haifeng Yu, Yuda Zhao, and Phillip Gibbons
The Cost of Fault-Tolerance in Multi-Party Communication Complexity
Seth Gilbert and Maxwell Young
Making Evildoers Pay: Resource-Competitive Broadcast in Sensor Networks
Alexander Jaffe, Thomas Moscibroda, and Siddhartha Sen
On the Price of Equivocation in Byzantine Agreement
Allen Clement, Flavio Junqueira, Aniket Kate, and Rodrigo Rodrigues
On the (Limited) Power of Non-Equivocation
Guanfeng Liang and Nitin Vaidya
Byzantine Broadcast in Point-to-Point Networks using Local Linear Coding
James Aspnes, Hagit Attiya, Keren Censor-Hillel, and Faith Ellen
Faster than Optimal Snapshots (for a While)
Stephan Holzer and Roger Wattenhofer
Optimal Distributed All Pairs Shortest Paths and Applications
Michael Backes, Fabian Bendun, and Aniket Kate
Brief Announcement: Distributed Cryptography using TrInc
Atish Das Sarma, Ashwin Lall, Danupon Nanongkai, and Amitabh Trehan
Brief Announcement: Maintaining Large Dense Subgraphs on Dynamic Networks
So here’s one partial solution to the puzzle I described in my last post.
The prover sets up n pairs of cups behind the screen. For pair i, she places X_i stones in each of the two cups, where X_i is chosen uniformly at random from {1, ..., M} for some large number M. Then she removes the screen.
Now the verifier chooses n-1 of the pairs of cups and asks the prover to reveal the contents. The verifier rejects the proof immediately if any of these revealed pairs have unequal numbers of stones.
Next, the prover takes each cup in the unrevealed pair and pours its contents into one of the two original cups on the table, one cup of the pair into each original cup. Finally, the prover reveals the new contents of these original cups. If the two original cups now hold the same number of stones, the verifier accepts the proof; otherwise, he rejects it.
So this ensures that if the verifier accepts the proof, then the original pair of cups holds equal numbers of stones with probability 1-1/n: a cheating prover must unbalance one of her n pairs to compensate for the difference between the original cups, and she escapes detection only if that unbalanced pair happens to be the one left unrevealed, which occurs with probability 1/n. Moreover, the verifier learns little about the number of stones in the original pair. It's not perfect, since the verifier knows, for example, that the original number of stones is no more than the final number of stones. (Of course, if you were doing this on a computer, you'd add the stones modulo some large number.)
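To sanity-check the analysis above, here is a minimal Python sketch of the protocol. The parameter names, the modulus M, and the `cheat` flag (which models a dishonest prover unbalancing one pair to mask unequal original cups) are my own; this is a simulation under those assumptions, not code from the paper.

```python
import random

def run_protocol(a, b, n=20, M=10**6, cheat=False):
    """One run of the cup protocol, with a and b stones in the original cups.

    The prover prepares n pairs of cups, each cup in pair i holding X_i stones.
    If `cheat` is True and a != b, a dishonest prover unbalances one random
    pair by (a - b), hoping it stays unrevealed so the final pour masks the
    difference. Returns True if the verifier accepts.
    """
    # Prover's setup behind the screen: n balanced pairs.
    pairs = []
    for _ in range(n):
        x = random.randint(1, M)
        pairs.append([x, x])
    if cheat and a != b:
        j = random.randrange(n)
        pairs[j][1] += a - b  # compensating imbalance in one pair

    # Verifier reveals n-1 pairs chosen at random; one pair stays unrevealed.
    unrevealed = random.randrange(n)
    for i in range(n):
        if i != unrevealed and pairs[i][0] != pairs[i][1]:
            return False  # a revealed pair is unbalanced: reject

    # Pour the unrevealed pair into the original cups; compare modulo M
    # so the final counts leak less about the original counts.
    left, right = pairs[unrevealed]
    return (a + left) % M == (b + right) % M
```

An honest prover with equal cups is always accepted, and a cheating prover with unequal cups should slip through only about 1/n of the time, matching the 1-1/n bound.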
I’m interested in improvements to this algorithm. Can the original number of stones be better hidden? Can we decrease the probability of error as a function of the number of cups used?
The following puzzle is supposed to have applications in treaty verification.
There are two people playing a game: a prover and a verifier. There is a table between the two people, and on the table are two cups, each with some number of pebbles in them. The prover also has an unlimited supply of cups and pebbles, and a screen she can use to hide activity on her side of the table.
The goal of the game is for the prover to prove to the verifier that the two cups originally on the table have the same number of pebbles in them, but to not reveal the number of pebbles in either of these cups.
How can the prover convince the verifier, with arbitrarily small probability of error, that the two original cups have the same number of pebbles? What is the most efficient protocol you can devise to accomplish this goal? Here efficiency is measured in terms of time (number of rounds) and communication (number of cups) as a function of the probability with which the verifier can be sure that the two original cups have the same number of pebbles.
I wanted to post a quick update to make sure that the issue remains in the “blogosphere”. This post from saveufcise suggests that the department is still being strongly pressured to merge with EE. You can now take action to help them by either writing a letter or pledging money to create a College of Computing ($240K pledged in the last 3 days).
I interviewed at U. Florida CISE many years ago, and came away with a very favorable impression of a department that was clearly going places. I can also say that the department is filled with strong researchers (and incidentally nice people), who should not be going through this debacle. This page makes it clear that the CISE department is one of the best in the school of engineering at U. Florida (see excerpts below). So if you can, please take a moment to support them!
“The CISE department at UF has consistently demonstrated its excellence. In 2010, CISE graduated 25 PhD students (3 times the number in 2000). We bring in $5.5 million per year in research funding (almost 4 times the amount in 2000). We have 2 ACM Fellows, 4 IEEE Fellows, and 2 AAAS Fellows; many faculty have won other awards and serve on prestigious editorial boards and program committees. Almost all have active NSF- or NIH-funded research programs. 12 young faculty have won NSF CAREER Awards, which is 22% of the College total (also 5 times more than in 2000). Over the last 5 years, we have won 11 best paper awards. We teach thousands of non-majors, as well as approximately 600 undergraduate majors, 400 masters students, and 131 PhD students, with 32 tenure-track faculty and, at this moment, only 3 non-tenure-track faculty! CISE generates 17% of the college’s primary source of income (weighted student credit hours) while costing only 10% of the college’s budget. The Dean herself admitted during her Apr. 12 interview with students that the CISE department has the highest revenue/cost ratio in the college. CISE faculty are actively collaborating with researchers in almost every UF college, along with many national and international universities, as befits the flagship research institution of the state.
It is impossible to overemphasize the importance of a strong Computer Science and Engineering program in today’s economy. The predicted growth rate for Computer Science positions over the next ten years is 30%, almost 3 times the predicted growth rate for all engineering occupations. Software engineering jobs have consistently achieved high national rankings (#1 in 2011 and 2012).
Gainesville’s technology job sector has exploded in the last three years, with the founding of companies such as Grooveshark, Infinite Energy, and Shadow Learning. MindTree recently selected Gainesville as the site for its US expansion largely due to the presence of the CISE program. The opening of the UF Innovation Hub on January 11th promises to draw even more high-tech companies to Gainesville.”