> Dear Prof. XXXX,
>
> I am a graduate student at the Cavendish who escapes labwork twice a week to
> attend your lectures on Information Theory.
>
> I have a question that, I believe, only you would be able to give me a hint
> about.
>
> I was wondering whether the Noiseless Channel Theorem (which states that the
> maximum lossless compression rate equals the entropy of the source) could
> somehow be linked to the Sampling Theorem (to represent a signal without loss
> of information, the sampling frequency must be at least twice the bandwidth).
> Both are, in some sense, about representing a source faithfully with the
> fewest resources...
>
> Every professor I have talked to, in the CMS and in Brazil (I come from an
> engineering school in Brazil), answered me simply: "NO, the two theories
> stand independently." Somehow, I was not quite happy with that peremptory
> answer.
>
> If you answer me "NO, the two theories stand independently", I promise I'll
> give up bothering people with this "stupid question". Otherwise, could you
> recommend a book that addresses it?
>
> Thank you for your attention. Sincerely,
They do sound related, I agree.
I recommend reading Shannon's original papers; they are quite lucid.
I think that the way to understand this issue may be this: a source x(t) has
two numbers associated with it: its entropy, and its number of degrees of
freedom. Bandwidth relates to the number of degrees of freedom. Entropy relates
to the signal-to-noise ratio as well as to the d.o.f.
There is a book by D. M. MacKay called Information, Mechanism and Meaning (MIT
Press, 1969) that refers to these as "selective information content" and
"metrical information content", a terminology that has not caught on.
Hope this helps.
XXX
(XXX is my professor-hero here in Cambridge)
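Trying to make the two numbers concrete, as I currently understand them: for a
signal of bandwidth B watched for T seconds, the sampling theorem gives about
2BT degrees of freedom, and the signal-to-noise ratio bounds how many bits each
degree of freedom can carry. A minimal sketch in Python (B, T, S and N are
made-up illustrative figures, not anything from the lectures):

```python
import numpy as np

# Illustrative figures only: a band-limited source of bandwidth B Hz,
# observed for T seconds, with signal power S and noise power N.
B = 4000.0   # bandwidth, Hz
T = 1.0      # observation time, s
S = 1.0      # signal power
N = 0.01     # noise power (so S/N = 100, i.e. 20 dB)

# Sampling theorem: a signal band-limited to B Hz is pinned down by
# 2*B samples per second, so over T seconds it has about 2*B*T
# degrees of freedom.
dof = 2 * B * T

# Shannon (Gaussian signal observed in Gaussian noise): each degree of
# freedom can carry at most 0.5 * log2(1 + S/N) bits.
bits_per_dof = 0.5 * np.log2(1 + S / N)

# Total information = (degrees of freedom) x (bits per degree of freedom).
total_bits = dof * bits_per_dof

print(f"degrees of freedom (2BT)   : {dof:.0f}")
print(f"bits per degree of freedom : {bits_per_dof:.2f}")
print(f"total information          : {total_bits:.0f} bits")
# The same quantity per unit time, via Shannon's capacity formula:
print(f"B * log2(1 + S/N)          : {B * np.log2(1 + S / N):.0f} bits/s")
```

Multiplying the two numbers back together gives B log2(1 + S/N), Shannon's
capacity for a band-limited Gaussian channel: the bandwidth (via the sampling
theorem) counts the degrees of freedom per second, and the signal-to-noise
ratio sets how much entropy each one can hold. That, if I read the reply
correctly, is exactly where the two theorems meet.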