Interactive communication - overview


Interactive communication concerns the number of bits that must be transmitted when a sender wants to convey information to a receiver who has related data, and the potential transmission savings afforded when the two communicators are allowed to interact.

This outline describes some of the basic concepts, main results, and most pressing open problems concerning worst-case transmission for general communication problems. Among them: the logarithmic limit on the savings interaction can provide, the near-optimality of two messages, and the open question of whether some fixed number of messages is always asymptotically optimal.

Additional outlines describing balanced distributions and average-case communication, where stronger results can be proven, will be added soon.

Interactive communication is best introduced via examples, starting with the following one.

The league problem

There are n teams in the n-BA. Over breakfast, Bob reads that the Celtics will play the Lakers. On the evening news, Alice hears that the Lakers won, but not whom they beat. As a result, Bob knows the two teams that played (Celtics and Lakers), but not who won (Lakers); Alice knows the winning team, but not who they played against.

How many bits of information must Alice and Bob exchange in the worst case for Bob to find out who won?

Remark: Two slight variations are easy to analyze. If Bob did not know who played, Alice would need to transmit log n bits describing the winning team. (If you're not convinced, or are unsure about logarithms and ceilings, see the basic introduction.) If, in addition to the winner, Alice also knew the losing team, she could transmit only one bit indicating whether the winning team is lexicographically lower or (as is the case here) higher. Essentially, we would like to know whether our scenario requires closer to the log n bits of the first variation, or the 1 bit of the second.
Note that in this example, like all others considered in this overview, a sender (Alice) wants to convey some information (winner) to a receiver (Bob) who has related data (the two teams that played).

One-way communication

We first consider the simplest case where only Alice can transmit. We show that in this one-way communication scenario she must send exactly log n bits in the worst case.

log n bits suffice.  As we remarked above, log n bits suffice even without using Bob's information: Alice and Bob agree in advance on a log n-bit indexing (say lexicographic) of the teams and Alice transmits the winner's index.

log n bits are necessary.  To see that log n bits are needed in the worst case even with Bob's information, note that Alice's message to Bob is determined by the winner, for that is all she knows. For example, she may transmit 011 if she hears that the Celtics won and 1100 if she hears that the Pistons won. If for two different teams she transmits the same message, say 101, then in the event that these two teams played each other, Bob will not know who won. It follows that Alice must transmit a different message for every team. The picky will also note that if one message is a prefix of another, then when the corresponding teams play, Bob may not know where a message ends. Hence no message can be a prefix of another. There are therefore n different messages, none a prefix of another. At least one requires log n bits.
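For concreteness, here is a minimal Python sketch of the one-way protocol, assuming the teams are numbered 0 through n−1 in the agreed order (the function names are illustrative, not part of the original discussion):

    from math import ceil, log2

    def encode_winner(winner: int, n: int) -> str:
        # Alice sends the winner's index as a fixed-length bit string.
        # Fixed length makes the messages prefix-free, so Bob always
        # knows where a message ends.
        width = ceil(log2(n))
        return format(winner, f"0{width}b")

    def decode_winner(message: str) -> int:
        # Bob's knowledge of the two teams that played goes unused.
        return int(message, 2)

    assert decode_winner(encode_winner(12, n=30)) == 12  # a 5-bit message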

From one-way to interaction

The preceding analysis assumed that only Alice can transmit, so, for example, if two teams are assigned the same message, Bob cannot indicate that he is missing information.

What if interaction is allowed? Can the total number of bits be reduced?

Remark: Recall that (1) we count the bits transmitted by both Alice and Bob; (2) we are interested in the worst-case number of bits; (3) Alice doesn't need to learn anything. So it's not a priori clear whether interaction can reduce communication.

Two messages

Interaction can reduce communication significantly. We first describe a simple interactive protocol in which Alice and Bob exchange just two messages and transmit a total of log log n + 1 bits, considerably less than the log n bits required with one-way communication.

Alice and Bob agree in advance on a log n-bit representation of each of the n teams. Bob, who knows the two teams that played, considers their binary representations. He transmits to Alice the location of the first bit where the two representations differ. Alice replies with the winner's bit in that location.

Example: Suppose that n=16, so that each team is described by 4 bits, and that the binary representations of the Celtics and the Lakers are 0110 and 1010 respectively. Bob uses two bits to transmit the number 3, indicating that 0110 and 1010 first differ in the third (from right) bit location, and Alice replies 0 because the third bit of the Lakers' representation is 0. Bob can then tell that the Lakers won because the Celtics' third bit is 1.

To count the number of bits, observe that Bob describes a number between 1 and log n, hence transmits log log n bits. Alice transmits 1 bit, for a total of log log n + 1 bits.
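A minimal Python sketch of this two-message protocol, assuming the teams are numbered and n is a power of 2 (function names are illustrative):

    def bob_message(team_a: int, team_b: int) -> int:
        # Bob sends the position (1 = rightmost) of the first bit in
        # which the two teams' binary representations differ.
        diff = team_a ^ team_b
        return (diff & -diff).bit_length()

    def alice_reply(winner: int, position: int) -> int:
        # Alice sends the winner's bit at the requested position.
        return (winner >> (position - 1)) & 1

    def bob_decode(team_a: int, team_b: int, position: int, bit: int) -> int:
        # The winner is whichever team matches the announced bit.
        return team_a if alice_reply(team_a, position) == bit else team_b

    # The example from the text: n = 16, Celtics = 0110, Lakers = 1010.
    celtics, lakers = 0b0110, 0b1010
    position = bob_message(celtics, lakers)  # 3, sent with 2 bits
    bit = alice_reply(lakers, position)      # 0
    assert bob_decode(celtics, lakers, position, bit) == lakers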

Remark: Curiously, Bob, who doesn't need to convey any information, ended up transmitting most of the bits. Alice, the only one who wants to convey information, transmitted just one bit.

Any better?

Is that the best we can do? Is there a protocol that requires fewer bits in the worst case?

Remark: To reduce the number of bits, we can consider either another two-message protocol, or one that requires more messages but fewer bits.

This question is as easily answered in the general communication setup as for League, so we describe that setup next. We use a minimal amount of jargon; please bear with it.

The general communication scenario

In the most general communication problem a sender knows a random value X while a receiver knows a random value Y and wants to determine X. We are interested in the number of bits required in the worst case to ensure that the receiver can always determine X correctly.

We assume that X and Y are distributed according to a known probability distribution p(x,y) and that the sender and the receiver agree in advance on a protocol that determines who transmits what, based on the values they hold and previous transmissions.

One can consider various scenarios depending on the number of messages the communicators can exchange. We let Wm denote the least number of bits transmitted by both communicators in the worst case when they are allowed to exchange at most m messages. W1 is therefore the number of bits required with one message, namely only the sender can transmit, and W2 is the total number of bits required with at most two messages: the receiver transmits, and the sender replies.

Allowing additional messages can only reduce the number of bits (we allow, not require, more messages), hence W1, W2, W3, ... is a non-increasing sequence of nonnegative integers and therefore converges to a limit W∞, the number of bits that must be transmitted in the worst case when there are no restrictions on the number of messages exchanged.

We would like to determine the Wm's for specific problems and the largest possible discrepancies between them for arbitrary problems.

Remark: It is easy to see that since we consider the worst-case number of bits and allow no errors, the exact probability distribution is irrelevant. The Wm's are determined by the support set of p, the set of (x,y) values that can occur.

Example: For League, Y is a game, namely a set of two distinct integers between 1 and n, and X is a winning team, namely one of the two integers in Y. We saw that W1 = log n and W2 ≤ log log n + 1. To determine these numbers, we didn't use the probabilities of the various games and winners. All we cared about was that every two teams could play and that the winner was one of the two teams.
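As a small Python illustration of the support set (the variable names are mine), League's support set consists of all (winner, game) pairs in which the winner belongs to the game:

    from itertools import combinations

    n = 4  # a tiny league
    # Y ranges over games (unordered pairs of teams), X over members of Y.
    support = {(winner, game)
               for game in combinations(range(1, n + 1), 2)
               for winner in game}
    # By the remark above, the Wm's depend only on this set,
    # not on the probabilities of the individual games and winners.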


The following segments describe general results that hold for all communication problems. In particular, they apply to the League problem, for which we still want to determine W2 and W∞.

The limits of interaction

The League Example shows that interaction can reduce communication significantly below the one-way number of bits. Can interaction achieve arbitrary savings? Is there, for example, a communication problem that requires a trillion bits one-way but only two bits with interaction?

No. The maximum reduction via interaction is logarithmic. For all communication problems:
W∞ ≥ log W1 + 1. (1)

Remark: This bound holds only for the worst-case number of bits. In the average case there can be an arbitrary improvement even between one and two messages (see average-case results (coming soon)).

Application to league

Bound (1) can be used to completely determine the number of bits needed for the League Example. We showed that for League W1 = log n. Therefore,

W2 ≥ W3 ≥ ... ≥ W∞ ≥ log W1 + 1 = log log n + 1.

But we also saw that for League

W2 ≤ log log n + 1.

Hence for League all the inequalities above hold with equality and

W2 = W3 = ... = W∞ = log log n + 1.

This shows that the two-message protocol we described is optimal. There is no better League protocol, regardless of the number of messages. For n = 16 teams, for example, the 3 bits exchanged in the earlier example (2 by Bob and 1 by Alice) cannot be improved upon.

One message may be exponentially wasteful

Bound (1) cannot be strengthened. It is achieved by all communication problems where, as in League, there are two possible X values for every Y value. For all these communication problems,

W∞ = log W1 + 1. (2)

How should Equation (2) be interpreted? If multiple messages are acceptable, it is a positive result saying that interaction can save a lot. However if, as in many scenarios, fewer messages are preferable, then, rewriting Equation (2) as

W1 = 2^(W∞ − 1)
(and modifying it a bit if W1 is not a power of 2), it is a negative result stating that in some problems a single message requires exponentially more bits than the minimum necessary.
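Explicitly, the rewrite is one step of algebra applied to Equation (2):

    W∞ = log W1 + 1   ⟺   log W1 = W∞ − 1   ⟺   W1 = 2^(W∞ − 1).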

It is natural to ask whether a small number (albeit >1) of messages suffices to reduce communication to almost the best possible.

Two messages are almost optimal

Perhaps the most interesting result applying to all communication problems is that just two messages always suffice to reduce communication to at most four times the optimal. For all communication problems,

W2 ≤ 4W∞ + 2. (3)

Recall that for some problems (like league) one message is exponentially wasteful. Bound (3) says that for all communication problems, two messages are almost optimal. Hence while there may be a large incentive to move from one to two messages, moving from two to more messages has a limited benefit.

In certain cases, Bound (3) also points us to efficient two-message protocols where they are not a priori obvious. This is illustrated by the following example.


The playoffs problem

As if one league wasn't enough, imagine a tournament with l leagues, each with n teams. As in the (soccer) World Cup, two teams from each league proceed to the playoffs and one of these 2l teams wins the playoffs.

Bob knows the 2l teams that proceeded to the playoffs. Alice knows the winning team and wants to convey that information to Bob.

If only one message is allowed, Alice must completely describe the winning team. If the message she assigns to one team is the same as, or a prefix of, a message she assigns to another team, then if both teams participate in the playoffs, Bob will not know who won. Hence each team must be assigned a different message from a prefix-free set. Therefore,

W1 = log (nl) = log n + log l.

It is also easy to find an efficient three-message protocol. Alice can transmit log l bits describing the winning team's league. Thereafter, Alice and Bob can use the League problem's protocol. Bob considers the two teams that proceeded to the playoffs from that league, and uses log log n bits to describe a location where the two teams differ. Alice transmits one bit describing the winning team's bit in that location. It follows that

W3 ≤ log l + log log n + 1.
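A Python sketch of this three-message protocol, assuming l and n are powers of 2 and each playoff team is identified by its league and its log n-bit index within the league (the names and data layout are illustrative):

    def alice_message1(winner_league: int) -> int:
        # Message 1 (Alice): the winner's league, log l bits.
        return winner_league

    def bob_message2(finalists: dict, league: int) -> int:
        # Message 2 (Bob): a bit position where the two finalists from
        # the announced league differ, log log n bits, exactly as in
        # the League protocol.
        a, b = finalists[league]
        diff = a ^ b
        return (diff & -diff).bit_length()

    def alice_message3(winner_index: int, position: int) -> int:
        # Message 3 (Alice): the winner's bit at that position, 1 bit.
        return (winner_index >> (position - 1)) & 1

    def bob_decode(finalists: dict, league: int, position: int, bit: int) -> int:
        a, b = finalists[league]
        return a if ((a >> (position - 1)) & 1) == bit else b

    # l = 2 leagues, n = 16 teams each; two finalists per league.
    finalists = {0: (0b0110, 0b1010), 1: (0b0001, 0b0011)}
    league = alice_message1(0)
    position = bob_message2(finalists, league)
    bit = alice_message3(0b1010, position)
    assert bob_decode(finalists, league, position, bit) == 0b1010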

Efficient two-message protocols are more elusive. Next we use Bound (3) to upper bound W2. Later on, we'll mention a lower bound.

Application of Bound (3) to playoffs

Consider the playoffs problem when l is constant and n increases. We saw that

W1 = log n + c1    while    W3 ≤ log log n + c2,
where c1 = log l and c2 = log l+1. Hence three messages require significantly fewer bits than one message.

An efficient two-message protocol is not immediately obvious. Yet Bound (3) implies that

W2 ≤ 4W∞ + 2 ≤ 4 log log n + 4c2 + 2.
Hence an efficient two-message protocol exists.

Are two messages optimal?

Two messages are the smallest number of messages achieving near optimality. It is natural to wonder about their precise efficiency. Can Bound (3) be improved? Are two messages exactly optimal?

Two messages are not optimal

Two messages are not optimal. They may require twice the minimum number of bits.

More specifically, for every α, however large, there is a communication problem such that W3 ≥ α and

W2 ≥ 2W3 − o(W3),
where o(x) is a term that when divided by x, tends to zero as x goes to ∞.

This discrepancy is manifested for the Playoffs problem.

Two messages are not optimal for Playoffs

We saw that for the Playoffs problem with l leagues, each with n teams,

W3 ≤ log l + log log n + 1.

Using graph-coloring arguments, it can be proven that
W2 ≥ log min(n, 2l^2).

Therefore, as l increases and n = 2l^2,

W2 ≥ 2W3 − o(W3),

showing that two messages may require twice the optimal number of bits.

Open problem 1: two messages

We saw that two messages are always within a factor of four of the optimal and sometimes require twice the optimal. What is the largest possible discrepancy between W2 and W∞?

Namely, what is the smallest constant c such that for all communication problems

W2 ≤ cW∞ + o(W∞)?
Again, o(x) is a term that when divided by x, tends to zero as x goes to ∞.

Open problem 2: multiple messages

One message can require exponentially more than the minimal number of bits. Two messages are always within a factor of four from the optimal, and sometimes require twice the optimal. The most interesting theoretical open problem is whether a fixed number m of messages is always asymptotically optimal:

Is there an m such that for all communication problems,

Wm ≤ W∞ + o(W∞)?
As usual, o(x) is a term that when divided by x, tends to zero as x goes to ∞.

An affirmative answer would mean that there is never a strong reason to exchange more than m messages.

The answer to this question is not known, but the rest of the overview describes some partial results.

Three messages are not optimal

If an optimal number of messages exists, it is not three either. For every α, however large, there is a communication problem such that W4 ≥ α and
W3 ≥ 2W4 − o(W4). (4)

Four messages are within a factor of three from optimal

The lowest number of messages that could potentially be optimal is four. But the only bound known is that four messages are within a factor of three of the optimal. For all communication problems,

W4 ≤ 3W∞ + o(W∞). (5)


We showed that for the worst-case number of bits: interaction's savings over one-way communication are at most logarithmic, and this saving is achieved whenever there are two possible X values for every Y value; two messages are always within a factor of four of the optimal, but neither two nor three messages are exactly optimal; and four messages are within a factor of three of the optimal.

Among the more interesting open problems, we mentioned that it is not known what the largest possible discrepancy between W2 and W∞ is, nor whether some fixed number m of messages is always asymptotically optimal.

What's next?

To find out more about interactive communication you can read


To better understand the material and see the proofs, you can read the following papers. Most of the results are described in:

Bound (4) is described in: And bound (5) is proven in: Related topics (more germane to average case) are discussed in