
Recently Dr. Andrew Thangaraj gave this talk in our reading seminar, and I was fascinated by the elegance of the solution. I went through the original paper "On the Shannon Capacity of a Graph" by Lovász and decided to write up some notes. The technique described by Andrew follows a modern approach that requires spectral graph theory, and I will describe the spectral approach in a later post. I will begin with a basic description of the problem and Lovász's solution.

Zero-error capacity of a noisy typewriter channel

Suppose you have a noisy typewriter: when a key is pressed, the printed letter is either the pressed letter or the next letter (cyclically), each with equal probability. For example, if the "c" key is pressed, the printed letter is "c" with probability {0.5} or "d" with probability {0.5}. If "z" is pressed, the printed letter is "z" with probability {0.5} or "a" with probability {0.5}. The zero-error capacity of this typewriter channel is the maximum number of bits that can be conveyed per channel use with zero probability of error. It is easy to see that if we use only {13} alternating letters (say "a", "c", "e", and so on), then there is no confusion at the output. It can also be shown that one cannot do better. So the error-free capacity in bits is

\displaystyle \Theta_E=\log_2(13).
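Before moving to graphs, here is a minimal Python sketch (my own illustration, not from the paper; the helper names `outputs` and `is_error_free` are mine) that models the noisy typewriter by the set of letters each key can print and checks that the {13} alternating letters form an error-free code:

```python
import string

letters = string.ascii_lowercase  # 'a' .. 'z'

# Pressing a key prints that letter or the next one (cyclically), each with
# probability 0.5.  For zero-error coding only the *set* of possible outputs
# matters, not the probabilities.
def outputs(letter):
    i = letters.index(letter)
    return {letters[i], letters[(i + 1) % 26]}

# A code is error-free iff no two distinct codewords can produce the same output.
def is_error_free(code):
    return all(outputs(x).isdisjoint(outputs(y))
               for i, x in enumerate(code) for y in code[i + 1:])

code = letters[::2]  # 'a', 'c', 'e', ..., 'y': 13 alternating letters
print(len(code), is_error_free(code))  # prints: 13 True
```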

Alternatively, we can form the following confusability graph: the nodes are the letters, and two nodes are joined by an edge if the corresponding letters can be confused at the output, i.e., if they can produce the same printed letter. For the {26}-letter typewriter channel it is easy to see that this graph is the cycle {C_{26}}. Observe that the maximum independent set of {C_{26}} has {13} nodes. Hence for {C_{26}},

\displaystyle \Theta_E=\log_2(\alpha(C_{26})),

where {\alpha(G)} is the size of a maximum independent set of the graph {G}. So now let us look at the cycle {C_5}. This corresponds to a typewriter with {5} letters {\{a,b,c,d,e\}}. It is easy to see that {\alpha(C_5)=2}. The code above used one channel use at a time. It turns out that combining letters over two channel uses improves the number of error-free symbols: for {C_5} (and its corresponding typewriter), sending the five pairs of letters {\{aa,bc,ce,db,ed\}} gives an error-free code, even though coding each use separately would give only {\alpha(C_5)^2=4} pairs. For {C_5}, the error-free capacity equals {\log_2(5)/2} bits per channel use (proved by Lovász). So we should look at graph products and their independence numbers to obtain the error-free capacity.
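To double-check the two-letter code for {C_5}, here is a small sketch in the same style (again, the helper names are mine): two length-two words are confusable exactly when they are confusable, or equal, in each coordinate, and the five codewords above are pairwise non-confusable:

```python
letters5 = "abcde"

def outputs5(letter):
    i = letters5.index(letter)
    return {letters5[i], letters5[(i + 1) % 5]}

# Two letters are confusable iff some printed letter can arise from both
# (note that every letter is confusable with itself).
def confusable(x, y):
    return bool(outputs5(x) & outputs5(y))

# Two 2-letter words are confusable iff they are confusable coordinate-wise;
# this is adjacency-or-equality in the strong product of C_5 with itself.
def words_confusable(u, v):
    return confusable(u[0], v[0]) and confusable(u[1], v[1])

code2 = ["aa", "bc", "ce", "db", "ed"]
print(all(not words_confusable(u, v)
          for i, u in enumerate(code2) for v in code2[i + 1:]))  # prints: True
```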

Definition 1 (Strong Product) For two graphs {G} and {H}, their strong product, denoted by {G.H}, is defined on the vertex set {V(G)\times V(H)}: two distinct vertices {(x,y)} and {(x',y')} are adjacent iff {x\sim x'} in {G} and {y\sim y'} in {H}, where "{x\sim x'}" means that {x} and {x'} are either equal or adjacent.

{G^k} denotes the {k}-fold strong product of the graph {G} with itself. The following figure illustrates the strong product of the {5}-cycle {C_5} with itself.

[Figure: the strong product of the {5}-cycle {C_5} with itself.]
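To tie Definition 1 back to the capacity computation, here is a brute-force sketch (purely illustrative; the functions `strong_product` and `alpha` are my own naming, and the vertices of {C_5} are labelled {0,\dots,4}) that builds {C_5.C_5} and verifies {\alpha(C_5)=2} while {\alpha(C_5.C_5)=5}, which is exactly what the two-letter code exploits:

```python
from itertools import combinations, product

# C_5 on vertices 0..4: x and y are adjacent iff they differ by 1 mod 5.
def adj_c5(x, y):
    return x != y and (x - y) % 5 in (1, 4)

# Strong product of two graphs given as (vertex list, adjacency test):
# distinct (x, y) and (x', y') are adjacent iff each coordinate pair is
# equal or adjacent (Definition 1).
def strong_product(V1, adj1, V2, adj2):
    V = list(product(V1, V2))
    def adj(u, v):
        return (u != v
                and (u[0] == v[0] or adj1(u[0], v[0]))
                and (u[1] == v[1] or adj2(u[1], v[1])))
    return V, adj

# Independence number by brute force: the largest k such that some k-subset
# of vertices contains no edge.  (Fine for 25 vertices, hopeless in general.)
def alpha(V, adj):
    best = 0
    for k in range(1, len(V) + 1):
        if any(all(not adj(u, v) for u, v in combinations(S, 2))
               for S in combinations(V, k)):
            best = k
        else:
            break
    return best

C5 = list(range(5))
print(alpha(C5, adj_c5))                             # prints: 2
Vp, adjp = strong_product(C5, adj_c5, C5, adj_c5)
print(alpha(Vp, adjp))                               # prints: 5
```

Since {\alpha(C_5.C_5)=5}, two channel uses already give {\log_2(5)/2} bits per use, and Lovász's result says this cannot be improved by coding over longer blocks.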

