SEMINARS
Meetings of the St. Petersburg Mathematical Society
A joint meeting with the Chebyshev Laboratory
Information and interactive communication
M. Braverman (University of Toronto)
Abstract: Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel. We will discuss several extensions of information-theoretic notions to the two-way communication setting. We use them to prove a direct sum theorem for randomized communication complexity, showing that implementing several independent copies of a functionality requires substantially more communication than a single copy. More generally, we will show that information cost equals the amortized communication complexity of a function.
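As a quick illustration of the abstract's starting point (not part of the talk), Shannon entropy measures the average number of bits per symbol needed to encode a message; the function below is a hypothetical helper of our own naming:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Empirical Shannon entropy H = -sum_x p(x) * log2(p(x)), in bits per symbol."""
    counts = Counter(message)          # frequency of each symbol
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform 4-symbol alphabet needs 2 bits per symbol;
# a 2-symbol uniform alphabet needs 1 bit per symbol.
print(shannon_entropy("abcd"))
print(shannon_entropy("aabb"))
```

Lower entropy means the message is more compressible, which is the sense in which entropy bounds one-way communication cost.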