Shannon capacity and related problems in Information Theory and Ramsey Theory


Shannon capacity and related problems in Information Theory and Ramsey Theory. Eyal Lubetzky. Based on joint work with Noga Alon and Uri Stav. May 2007.

Outline of talk. Shannon capacity of a graph: introduction; some bounds and properties; the capacity of a disjoint union (Ramsey numbers). Application, index coding: introduction; constructing non-linear index codes beating the linear optimum; open problems (Ramsey constructions).

Shannon capacity - Introduction. Transmission over a noisy channel: an input alphabet and an output alphabet, where the channel maps each input letter to a set of possible output letters. Goal ([Shannon 56]): what is the maximal rate of zero-error transmission over a given noisy channel?

Single letter transmission over the channel. Define the characteristic graph G of a channel: its vertices are the input letters, and two letters are adjacent iff some output letter can arise from both of them. A set of letters that guarantees zero error is an independent set of G, so OPT for a single use of the channel is α(G), the independence number. [Figure: the pentagon channel on letters 1..5, mapping 1 to {1,2}, 2 to {2,3}, 3 to {3,4}, 4 to {4,5}, 5 to {5,1}; its characteristic graph is the 5-cycle C5 and OPT = 2.]

Strong graph powers - definition. Q: Can we benefit from sending longer words over the channel? Define G^k, the k-th strong graph power of G: its vertices are the k-letter words over V(G), and two distinct words u = (u_1, ..., u_k) and v = (v_1, ..., v_k) are adjacent iff for every coordinate i, either u_i = v_i or u_i v_i is an edge of G. When G is the characteristic graph of a channel, u and v are adjacent in G^k iff the words u and v are confusable in k uses of the channel.

Strong graph powers - application. Q: Can we benefit from sending longer words over the channel? A: YES. OPT = α(G^k) for sending k-letter words via the channel, and block-coding shows α(G^k) ≥ α(G)^k. A strict inequality is possible! For the pentagon C5: {1, 4} is a maximum independent set of C5, giving the size-4 independent set {11, 14, 41, 44} in C5^2, yet {11, 23, 35, 42, 54} is an independent set of size 5 in C5^2.
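
A small brute-force sketch (my own illustration, not from the talk) verifying these C5 numbers: it builds the strong power explicitly and computes independence numbers by exhaustive search, confirming α(C5) = 2 while α(C5^2) = 5, and hence c(C5) ≥ √5.

```python
from itertools import combinations, product

def strong_power(n, edges, k):
    """Return vertices and edges of the k-th strong power of a graph on range(n):
    distinct words are adjacent iff every coordinate is equal or adjacent."""
    adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    verts = list(product(range(n), repeat=k))
    power_edges = [(u, v) for u, v in combinations(verts, 2)
                   if all(a == b or (a, b) in adj for a, b in zip(u, v))]
    return verts, power_edges

def independence_number(verts, edges):
    """Largest independent set size, by exhaustive search (fine for tiny graphs)."""
    adj = set(edges) | {(v, u) for u, v in edges}
    best, size = 0, 1
    while True:
        found = any(all((u, v) not in adj for u, v in combinations(s, 2))
                    for s in combinations(verts, size))
        if not found:
            return best
        best, size = size, size + 1

c5 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # the pentagon / characteristic graph
print(independence_number(*strong_power(5, c5, 1)))  # 2: one use of the channel
print(independence_number(*strong_power(5, c5, 2)))  # 5: two uses, so c(C5) >= 5 ** 0.5
```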

Shannon capacity - definition. The Shannon capacity of G is defined to be c(G) = lim_{k→∞} (α(G^k))^{1/k} (equivalently sup_k, since α(G^k) is supermultiplicative). c(G) is the effective alphabet size of the channel for zero-error transmission: if c(G) = t, then for large k we can send roughly t^k distinct k-letter words via the channel without danger of confusion.

Shannon capacity: some bounds. [Shannon 56]: α(G) ≤ c(G) ≤ the clique cover number of G. The smallest graph unsettled by this was C5. (This motivated [Berge 60] to study perfect graphs; the Weak Perfect Graph Theorem was proved by [Lovász 72], the Strong Perfect Graph Theorem by [CRST 02].) [Haemers 78, 79]: algebraic upper bounds. [Lovász 79]: c(G) ≤ θ(G) (the Lovász theta function), giving c(C5) = √5. c(G) remains unknown even for simple and small graphs, e.g., the cycle C7.

Shannon capacity: original bounds. [Shannon 56]: α(G) ≤ c(G) ≤ the clique cover number of G. The lower bound holds by definition, since words over an independent set of G form an independent set of size α(G)^k in G^k. The upper bound is similar to proving that the clique cover number is submultiplicative: if t cliques cover the vertices of G, then G^k can be covered by t^k cliques, because a product of cliques is a clique in the strong power.
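
A sketch of that product argument (my own illustration): start from a cover of C5 by 3 cliques, take the 9 pairwise products, and check that each product is a clique of the strong square C5^2 and that together they cover all 25 two-letter words, so the clique cover number of C5^2 is at most 3^2 = 9.

```python
from itertools import combinations, product

c5 = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)}
adj = c5 | {(v, u) for u, v in c5}

def strong_edge(u, v):
    """Adjacency in the strong square: distinct words, each coordinate equal or adjacent."""
    return u != v and all(a == b or (a, b) in adj for a, b in zip(u, v))

cover = [{0, 1}, {2, 3}, {4}]                 # 3 cliques covering the vertices of C5
products = [set(product(q1, q2)) for q1 in cover for q2 in cover]

# Every product of two cover cliques is a clique of C5^2 ...
assert all(strong_edge(u, v) for q in products for u, v in combinations(q, 2))
# ... and the products cover all 25 two-letter words.
assert set().union(*products) == set(product(range(5), repeat=2))
print(len(products))                          # 9 = 3^2 cliques cover C5^2
```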

Shannon capacity: algebraic bound. A matrix B represents G over a field F iff its diagonal entries are non-zero and its off-diagonal entries satisfy B_ij = 0 whenever ij is not an edge of G. [Haemers 78, 79]: c(G) ≤ min rank(B) over matrices B representing G over any field. Proof: an independent set of G indexes a full-rank (diagonal) submatrix of B, hence α(G) ≤ rank(B). Higher powers: by definition, the tensor power of B represents G^k, and its rank is rank(B)^k, so α(G^k) ≤ rank(B)^k and c(G) ≤ rank(B).
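
A numerical sketch of this mechanism (my illustration; the particular choice B = I + A has rank 5, so for C5 it only reproduces the trivial bound, but it shows the definitions at work): check the representation conditions, the full-rank submatrix indexed by an independent set, and that the Kronecker square represents C5^2 with rank(B ⊗ B) = rank(B)^2.

```python
import numpy as np
from itertools import product

n = 5
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)}
adj = edges | {(v, u) for u, v in edges}

# B represents C5 over the reals: non-zero diagonal, zeros on non-edges.
B = np.eye(n)
for u, v in adj:
    B[u, v] = 1.0
assert all(B[i, i] != 0 for i in range(n))
assert all(B[i, j] == 0 for i in range(n) for j in range(n)
           if i != j and (i, j) not in adj)

# An independent set indexes a non-singular submatrix, hence alpha(G) <= rank(B).
ind = [0, 2]
assert np.linalg.matrix_rank(B[np.ix_(ind, ind)]) == len(ind)

# The Kronecker square represents the strong square C5^2, with multiplicative rank.
B2 = np.kron(B, B)
verts = list(product(range(n), repeat=2))      # ordering matches np.kron's indexing
for a, (i, j) in enumerate(verts):
    for b, (k, l) in enumerate(verts):
        strong = (i, j) != (k, l) and (i == k or (i, k) in adj) and (j == l or (j, l) in adj)
        if a != b and not strong:
            assert B2[a, b] == 0               # zero on every non-edge of C5^2
print(np.linalg.matrix_rank(B2), int(np.linalg.matrix_rank(B)) ** 2)   # 25 25
```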

Where is the capacity attained? Shannon's bound gives examples of graphs where c(G) = α(G): 1-letter words are optimal. Lovász's theta function gives examples of graphs (e.g., C5) where c(G) = α(G^2)^{1/2}: 2-letter words are optimal. No graph is known with any other finite optimal word length. Q: Can we approximate c(G) by α(G^k)^{1/k} for some large finite k? A: No, not even after we witness any finite number of improvements.

Rate increases between powers. [Alon+L 06]: there can be any finite number of rate increases, occurring at arbitrarily prescribed locations. Nevertheless, we can deduce some bound on c(G) from α(G^k), using Ramsey Theory.

Shannon capacity and Ramsey numbers. The Ramsey number R_k(s) is the minimal integer n so that every k-edge-coloring of the complete graph K_n has a monochromatic K_s (for k = 2 this is the classical 2-edge-coloring definition). Suppose α(G) = t. Then α(G^k) ≤ R_k(t+1) − 1 (!). [Erdős+McEliece+Taylor 71]: a tight bound of this form. Proof: color the edges of an independent set of G^k according to a disconnected coordinate, i.e., a coordinate in which the two words are distinct and non-adjacent in G; a monochromatic clique of size t+1 would yield t+1 pairwise non-adjacent vertices of G.
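
To spell out the coloring argument (my reconstruction; the slide's exact hypothesis is elided, so assume the independence number of G is t):

```latex
% If alpha(G) = t, then alpha(G^k) <= R_k(t+1) - 1, where R_k(s) denotes the
% k-color Ramsey number for a monochromatic K_s.
\[
  \alpha(G) = t \;\Longrightarrow\; \alpha\!\left(G^{k}\right) \le R_k(t+1) - 1 .
\]
% Sketch: let S be an independent set of G^k.  For distinct u, v in S choose a
% coordinate i with u_i \neq v_i and u_i v_i \notin E(G) (one exists, since u and v
% are non-adjacent in G^k), and color the pair \{u, v\} by i.  A monochromatic
% K_{t+1} in color i would give t+1 pairwise distinct, pairwise non-adjacent
% vertices of G, contradicting \alpha(G) = t.  Hence |S| < R_k(t+1).
```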

Sum of channels. Two senders combine separate channels: each letter can be sent on either of the two channels, and letters from one channel are never confused with letters from the other. The characteristic graph of the combined channel is the disjoint union of the individual characteristic graphs. [Shannon 56]: c(G + H) ≥ c(G) + c(H), and conjectured that equality always holds. Q: How can adding a separate channel increase the capacity by more than its own capacity?

The Shannon capacity of a union. [Alon 98] disproved Shannon's conjecture. Proof outline: suppose both c(G) and c of the complement of G are small. In the 2nd power of the disjoint union of G and its complement, the set of pairs matching each vertex with its copy in the complement is an independent set of size n, which implies that the capacity of the union is at least √n. Such a G, with both c(G) and the complement's capacity small, is a Ramsey graph! The proof applies an algebraic bound to a variant of the Ramsey construction of [Frankl+Wilson 81].
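
A quick check of the key independent set (my illustration, with G = C5): pairing each vertex of G with its copy in the complement gives n vertices of the strong square of the disjoint union, no two of which are adjacent, so the capacity of the union is at least √n.

```python
from itertools import combinations

n = 5
g_edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)}
g_adj = g_edges | {(v, u) for u, v in g_edges}

# Disjoint union: vertices 0..4 form the copy of G, vertices 5..9 the copy of its complement.
union_adj = set(g_adj)
for u, v in combinations(range(n), 2):
    if (u, v) not in g_adj:                          # non-edge of G -> edge of the complement
        union_adj |= {(u + n, v + n), (v + n, u + n)}

def strong_edge(x, y):
    """Adjacency in the strong square of the union graph."""
    return x != y and all(a == b or (a, b) in union_adj for a, b in zip(x, y))

diagonal = [(v, v + n) for v in range(n)]            # each vertex paired with its complement copy
assert not any(strong_edge(x, y) for x, y in combinations(diagonal, 2))
print(len(diagonal))   # 5 = n, so alpha(union^2) >= n and c(union) >= n ** 0.5
```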

Multiple channels & privileged users. [Alon+L 08]: the following stronger result holds. E.g., the channels can be designed so that any sufficiently many senders combined have a high capacity, while any smaller group of senders has a low capacity.

Ramsey Theory revisited. By-product: an explicit construction of a Ramsey graph with respect to rainbow sub-graphs.

Outline of talk - revisited. Shannon capacity of a graph: introduction; some bounds and properties; the capacity of a disjoint union (Ramsey numbers). Application, index coding: introduction; constructing non-linear index codes beating the linear optimum; open problems (Ramsey constructions).

Index Coding - Problem Definition [Birk+Kol 98], [Bar-Yossef+Birk+Jayram+Kol 06]: a server broadcasts data to n receivers R_1, ..., R_n. Input data: a word of n bits x_1, ..., x_n. Each R_i is interested in the bit x_i, and knows some subset of the remaining bits. Goal: design a code of minimal word length, so that for every input word, every R_i will be able to recover the bit x_i (using its side information).

Motivation: Informed Source Coding. Content is broadcast to caching clients that have limited individual storage and a slow backward channel. Clients inform the server which blocks they already hold and which they require. Goal: broadcast a short stream allowing each client to recover its wanted data.

Index coding in terms of graphs. Define the (directed) side-information graph G: vertex set {1, ..., n}, one vertex per receiver/bit; (i, j) is an edge iff R_i knows the value of x_j, so the side information of R_i is indexed by the out-neighbors of i in G. An index code of length L for G consists of an encoding function mapping every input word to an L-bit codeword, and decoding functions D_1, ..., D_n, so that for every input x and every i, D_i applied to the codeword and to the bits of the out-neighbors of i returns x_i. The optimum for G is the minimum length of an index code for G.

Index coding - Examples. Note: for any graph on n vertices, the optimum is between 1 and n. Suppose every R_i knows all the bits except x_i: the side-information graph is the complete graph, and a linear index code of length 1 exists: broadcast the XOR of all n bits, and each R_i XORs out the n−1 bits it knows to recover x_i. Similarly, if no R_i knows any of the bits: the side-information graph is the edgeless graph, and a counting argument shows the code must contain 2^n distinct codewords, hence its length is n.
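
A runnable sketch of the length-1 code for the complete side-information graph (my illustration): broadcast the XOR of all bits, and each receiver cancels the n−1 bits it knows.

```python
import random
from functools import reduce

n = 8
x = [random.randint(0, 1) for _ in range(n)]

codeword = reduce(lambda a, b: a ^ b, x)             # the single broadcast bit

for i in range(n):
    known = [x[j] for j in range(n) if j != i]       # R_i knows every bit except its own
    assert reduce(lambda a, b: a ^ b, known, codeword) == x[i]
print("all", n, "receivers decoded from 1 broadcast bit")
```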

A linear index coding scheme. Set A: the adjacency matrix of G, and take a basis of the row space of A + I over GF(2). Encoding: given x, send the inner product of x with each basis vector. Decoding: receiver i reconstructs the inner product of x with row i of A + I, which equals x_i plus the sum of the bits of its out-neighbors; since R_i knows these bits by definition, it subtracts them and recovers x_i. Altogether: an index code whose length is the rank of A + I over GF(2).
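
A runnable sketch of this scheme as reconstructed above (assumptions of mine: the matrix is A + I over GF(2), and the example graph, two disjoint triangles where each receiver knows its two neighbours, is hypothetical): the rank of A + I over GF(2) is 2, so two broadcast bits serve six receivers, and every receiver decodes its bit.

```python
import random

def gf2_solve(A, b):
    """Solve A c = b over GF(2) by Gauss-Jordan elimination.
    A is a list of rows, b a list of bits; returns one solution (free vars = 0)."""
    rows = [row[:] + [bit] for row, bit in zip(A, b)]
    ncols = len(A[0])
    pivots, r = [], 0
    for col in range(ncols):
        piv = next((i for i in range(r, len(rows)) if rows[i][col]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][col]:
                rows[i] = [a ^ c for a, c in zip(rows[i], rows[r])]
        pivots.append(col)
        r += 1
    sol = [0] * ncols
    for k, col in enumerate(pivots):
        sol[col] = rows[k][-1]
    return sol

n = 6
# Hypothetical example: two disjoint triangles; receiver i knows its two neighbours' bits.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]
adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}

# M = A + I over GF(2): non-zero diagonal, support contained in the side-information graph.
M = [[1 if (i == j or (i, j) in adj) else 0 for j in range(n)] for i in range(n)]

# Greedy basis for the row space of M over GF(2).
basis, reduced_span = [], []
for row in M:
    red = row[:]
    for s in reduced_span:
        p = next(j for j in range(n) if s[j])
        if red[p]:
            red = [a ^ c for a, c in zip(red, s)]
    if any(red):
        basis.append(row)
        reduced_span.append(red)

x = [random.randint(0, 1) for _ in range(n)]
transmitted = [sum(b_r[j] & x[j] for j in range(n)) % 2 for b_r in basis]
print("code length:", len(transmitted))          # 2 = rank of A + I over GF(2)

for i in range(n):
    # Express row M[i] as a combination of the basis rows, combine the broadcast bits ...
    coeffs = gf2_solve([[basis[k][j] for k in range(len(basis))] for j in range(n)], M[i])
    combo = sum(c & s for c, s in zip(coeffs, transmitted)) % 2   # = <M[i], x> mod 2
    # ... and cancel the side-information bits (the neighbours) to recover x_i.
    known = sum(x[j] for j in range(n) if (i, j) in adj) % 2
    assert combo ^ known == x[i]
print("all receivers decoded their bit")
```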

Optimal linear index codes. Note: for any spanning sub-graph H of G, an index code for H is also an index code for G, so the optimum for G is at most that for H. [BBJK 06] showed that minrk_2(G), the minimum rank over GF(2) of a matrix with non-zero diagonal whose off-diagonal support lies within the edges of G, is the length of the optimal linear index code. In many cases this matches the overall optimum, e.g., for perfect graphs, acyclic graphs, holes and anti-holes. The main conjecture of [BBJK 06]: linear index coding is always optimal, i.e., the minimum index code length equals minrk_2(G) for every G.

Beating the linear optimum. [L+Stav]: the conjecture of [BBJK 06] is false in, essentially, the strongest possible way: there are graphs for which the optimal linear index code is almost as long as the trivial protocol of sending the entire word, while far shorter non-linear index codes exist.

Index coding - proof sketch. Goal: a graph G such that minrk_2(G) is large while the optimal index code is small; we need an upper bound on the code length that can stay small even when minrk_2(G) is large. Use higher-order fields: take a matrix B representing G over GF(q), i.e., with non-zero diagonal and B_ij = 0 whenever R_i does not know x_j; the linear scheme over GF(q) encodes using about rank(B) log2(q) bits, and decoding works as before. Generalizing, the minrank of G over any field yields an upper bound on the optimal index code length.

Index coding - proof sketch. We also need minrk_2(G) to be large, and it is difficult to provide lower bounds on minrk_2 directly. Use the Shannon capacity: by Haemers' bound the minrank of a graph (over any field) is at least its Shannon capacity, minrank is additive over a disjoint union, and the diagonal pairing of vertices is an independent set in the second power of the union of G and its complement. It follows that for every G on n vertices, minrk_2 of G plus minrk_2 of its complement is at least √n; hence showing that the complement's minrk_2 is small will imply that minrk_2(G) must be large.

Index coding - proof sketch. It remains to find G for which both relevant minranks are small: the minrank of the complement over GF(2) (which, by the previous slide, forces minrk_2(G) to be large) and the minrank of G over a higher-order field (which keeps the optimal index code short). Such a G is a Ramsey graph. The construction of [Alon 98] provides such graphs for some large primes; using Lucas' Theorem, the construction extends to any pair of distinct primes, and choosing the primes appropriately completes the proof.

Beating linear codes over any field. We constructed graphs where the GF(2)-linear optimum is beaten using linear codes over higher-order fields. Q: Can we beat any linear index coding scheme, i.e., linear over an arbitrary field? A: YES (a corollary of the previous Thm). Take, for the previous construction, instances built from different primes; code concatenation of their index codes handles the disjoint union, while over any single field the linear optimum of at least one part remains large.

Multiple round index coding. Consider several rounds, each with its own input and side-information setting, and the minimal total length of such an index code. Multiple usage can improve the average rate! Example: R_i is interested in the i-th bit of each word. 1st usage: R_i knows the bits preceding its own; 2nd usage: R_i knows the bits following its own; the side-information graphs are transitive tournaments. In this case each round on its own requires n bits (the graphs are acyclic), yet the two rounds combined can be handled with n + 1 bits, essentially the largest possible gap!

Some open problems. Multiple round index coding: recall that t rounds never cost more than t times the single-round optimum; how does the multi-round optimum behave as the number of rounds grows, and when is the inequality strict? What is the expected value of the optimum (say, for a random instance)? Can the optimum be exponentially larger than the natural lower bounds? Generalized setting: multiple receivers may be interested in the same bit.

Thank you. May 2007