Mitra, Partha P., Stark, Jason B., Green, Andrew G. (2002) Nonlinear Propagation and Information Theory. OPN Trends, 13(3), Supplement, S22-S28.
Abstract
Through widespread usage, the word “information” has come to have a tangible, concrete feel to it. Today we speak glibly of information flow and of information superhighways. Information is nevertheless an abstraction, and it is a testament to the insight of Claude Shannon that we have a formal theory allowing us to describe in fairly precise terms the flow of information through a communication channel. Although the basic formalism is quite general, information theory was initially developed in the context of telephonic communication through copper cable, and dealt mostly with a linear propagation channel, as exemplified by Shannon’s famous formula for the capacity of a channel (Table 1) with additive white Gaussian noise (AWGN). Such a channel relates the output time series Y(t) to the input X(t) linearly, through Y(t) = X(t) + N(t), where N(t) is a Gaussian noise process with a flat power spectrum. All three processes are assumed to be bandlimited with bandwidth W. The capacity of this channel is C = W log2(1 + S/N), where S and N are the signal and noise powers, respectively.
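As a quick numerical sketch (not part of the paper itself), the AWGN capacity formula C = W log2(1 + S/N) can be evaluated directly; the bandwidth and signal-to-noise values below are purely illustrative assumptions.

```python
import math

def awgn_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    """Shannon capacity C = W * log2(1 + S/N) of an AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Illustrative example: a 50 GHz channel with a 20 dB signal-to-noise ratio.
snr_db = 20
snr_linear = 10 ** (snr_db / 10)          # convert dB to a linear power ratio
capacity = awgn_capacity(50e9, snr_linear, 1.0)
print(f"Capacity ≈ {capacity / 1e9:.1f} Gbit/s")
```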
Item Type: Paper
Subjects: physics > information theory
CSHL Authors:
Communities: CSHL labs > Mitra lab
Depositing User: Matt Covey
Date: 2002
Date Deposited: 21 Apr 2014 16:50
Last Modified: 21 Apr 2014 16:50
URI: https://repository.cshl.edu/id/eprint/29773