
Recently this talk was given by Dr. Andrew Thangaraj in our reading seminar, and I was fascinated by the elegance of the solution. I went through the original paper "On the Shannon Capacity of a Graph" by Lovász and decided to write up some notes. The technique described by Andrew follows a modern approach that requires spectral graph theory, and I will describe the spectral approach in a later post. Here I will begin with a basic description of the problem and Lovász's solution.

**Zero-error capacity of a noisy typewriter channel**

Suppose you have a typewriter that is noisy, i.e., the printed letter might turn out to be the pressed letter or the next letter with equal probability. For example, if the "c" key is pressed then the printed letter might be "c" with probability $1/2$ or a "d" with probability $1/2$. If letter "z" is pressed, then the printed letter is "z" with probability $1/2$ or an "a" with probability $1/2$. The zero-error capacity of this typewriter channel is defined as the maximum number of symbols (more correctly, bits) that can be conveyed per channel use with no possibility of confusion at the receiver. It is easy to see that if we only use alternate letters, say $\{a, c, e, \ldots, y\}$, then there is no confusion at the output, giving $13$ unambiguous symbols. It can also be shown that nothing better can be done with a single channel use. So the error-free capacity in bits is $\log_2 13$.
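To make the "alternate letters" argument concrete, here is a quick Python check (my own sketch, not from the original paper) that the $13$ letters $\{a, c, e, \ldots, y\}$ are pairwise non-confusable on the $26$-letter noisy typewriter:

```python
import math
import string

# Confusability graph of the noisy typewriter: pressing a key prints
# that letter or the next one (cyclically), so two letters can be
# confused iff they are adjacent on the 26-cycle a-b-...-z-a.
letters = string.ascii_lowercase  # 'a'..'z'
n = len(letters)
confusable = {frozenset((letters[i], letters[(i + 1) % n])) for i in range(n)}

# Every other letter: {a, c, e, ..., y} -- 13 symbols, pairwise non-confusable.
code = letters[::2]
assert all(frozenset((x, y)) not in confusable
           for x in code for y in code if x != y)

print(len(code))             # 13 usable symbols
print(math.log2(len(code)))  # ~3.70 bits per channel use
```
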

Alternatively, we can form the following graph: connect two nodes if using the two corresponding letters can cause confusion at the output. For the $26$-letter typewriter channel it is easy to see that this confusability graph equals the cycle $C_{26}$. Observe that the maximum independent set of $C_{26}$ has $13$ nodes. Hence for $G = C_{26}$,

$\displaystyle C_0(G) = \log_2 \alpha(G) = \log_2 13,$

where $\alpha(G)$ is the size of a maximum independent set of a graph $G$. So now let's look at the cycle $C_5$. This would correspond to a typewriter with $5$ letters $\{a, b, c, d, e\}$. It is easy to see that $\alpha(C_5) = 2$. Earlier the code was for one channel use. It turns out that combining letters over two channel uses improves the number of error-free symbols. For $C_5$ (and its corresponding typewriter), sending the pairs of letters $\{aa, bc, ce, db, ed\}$ leads to an error-free code with $5$ codewords over two channel uses, i.e., $\sqrt{5} > 2$ symbols per use. For $C_5$, the error-free capacity equals $\frac{1}{2}\log_2 5$ (proved by Lovász). So we should look at graph products and their independence numbers to obtain the error-free capacity.
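Here is a small Python verification (my own sketch) that the five pairs $\{aa, bc, ce, db, ed\}$ are indeed pairwise non-confusable over two uses of the $5$-letter typewriter:

```python
# Check that aa, bc, ce, db, ed form an error-free code for two uses
# of the 5-letter noisy typewriter (confusability graph C_5).
letters = "abcde"

def confusable(x, y):
    """Letters x, y can produce the same output iff they are equal
    or adjacent on the 5-cycle a-b-c-d-e-a."""
    i, j = letters.index(x), letters.index(y)
    return i == j or (i - j) % 5 in (1, 4)

def confusable_pair(u, v):
    # Over two channel uses, words clash iff BOTH coordinates clash.
    return confusable(u[0], v[0]) and confusable(u[1], v[1])

code = ["aa", "bc", "ce", "db", "ed"]
clashes = [(u, v) for u in code for v in code
           if u != v and confusable_pair(u, v)]
print(clashes)  # [] -- no two codewords can produce the same output
```
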

**Definition 1 (Strong Product)**

*For two graphs $G$ and $H$, their strong product, denoted by $G \boxtimes H$, is defined on the vertex set $V(G) \times V(H)$, in which $(u_1, u_2)$ is adjacent to a distinct vertex $(v_1, v_2)$ iff ($u_1 = v_1$ or $u_1 \sim v_1$ in $G$) and ($u_2 = v_2$ or $u_2 \sim v_2$ in $H$). ("$\sim$" denotes adjacency.)*

$G^{\boxtimes k}$ denotes the strong product of the graph $G$ with itself $k$ times. The following figure illustrates the strong product of a 5-cycle graph with itself.
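To tie the definition back to the capacity discussion, here is a brute-force Python sketch (mine, not from the paper) that builds the strong product and confirms $\alpha(C_5) = 2$ while $\alpha(C_5 \boxtimes C_5) = 5$, exactly the gain exploited by the pentagon code above:

```python
from itertools import combinations, product

def cycle(n):
    """Edge set of the n-cycle on vertices 0..n-1."""
    return {frozenset((i, (i + 1) % n)) for i in range(n)}

def strong_product(V1, E1, V2, E2):
    """Strong product: distinct vertices (u1,u2), (v1,v2) are adjacent
    iff each coordinate is equal or adjacent in its factor graph."""
    V = list(product(V1, V2))
    def adj_or_eq(x, y, E):
        return x == y or frozenset((x, y)) in E
    E = {frozenset((u, v)) for u in V for v in V
         if u != v
         and adj_or_eq(u[0], v[0], E1)
         and adj_or_eq(u[1], v[1], E2)}
    return V, E

def independence_number(V, E):
    """Exhaustive search: largest k with an independent set of size k.
    (Exponential in general; fine for these tiny graphs.)"""
    def independent(S):
        return all(frozenset(p) not in E for p in combinations(S, 2))
    k = 1
    while any(independent(S) for S in combinations(V, k + 1)):
        k += 1
    return k

V5, E5 = list(range(5)), cycle(5)
V, E = strong_product(V5, E5, V5, E5)
print(independence_number(V5, E5))  # 2 -> alpha(C_5)
print(independence_number(V, E))    # 5 -> alpha(C_5 x C_5), pentagon code
```
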