
Topic: Shannon capacity



  
  Shannon–Hartley theorem - Wikipedia, the free encyclopedia
In the 1940s, Claude Shannon developed the concept of channel capacity, based in part on the ideas of Nyquist and Hartley, and then formulated a complete theory of information and its transmission.
Shannon's theorem shows how to compute a channel capacity from a statistical description of a channel, and establishes that, given a noisy channel with capacity C and information transmitted at a rate R, if R < C there exists a coding technique that allows the probability of error at the receiver to be made arbitrarily small.
Shannon, "Communication in the presence of noise", Proc. IRE, vol. 37, no. 1, pp. 10-21, Jan. 1949.
en.wikipedia.org /wiki/Shannon-Hartley_theorem   (1713 words)
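To make the numbers concrete, here is a minimal Python sketch of the Shannon-Hartley formula C = B log2(1 + S/N); the 3 kHz bandwidth and 30 dB SNR are illustrative assumptions, not figures from the article.

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed telephone-style channel: ~3 kHz bandwidth at 30 dB SNR.
B = 3000.0                      # bandwidth in Hz (assumption)
snr_db = 30.0                   # signal-to-noise ratio in dB (assumption)
snr = 10 ** (snr_db / 10)       # convert dB to a linear power ratio

C = shannon_hartley_capacity(B, snr)
print(f"Capacity: {C:.0f} bit/s")   # ~29.9 kbit/s

# The theorem's promise: any rate R < C admits arbitrarily reliable coding.
R = 24000.0
print("reliable coding possible" if R < C else "rate exceeds capacity")
```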

  
 Imatest - Shannon information capacity
The Shannon capacity, as we mentioned, is a function of both bandwidth W and signal-to-noise ratio, S/N.
Imatest displays noise and Shannon capacity plots at the bottom of the Chromatic aberration figure if the (Plot) Shannon capacity and Noise spectrum (in CA plot) checkbox in the SFR input dialog box is checked (the default is unchecked) and the selected region is sufficiently large.
Shannon capacity has not been used to characterize photographic images because it was difficult to calculate and interpret.
www.imatest.com /docs/shannon.html   (2595 words)
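The capacity computed from an image generalizes the single-SNR formula to a frequency-dependent one, C = integral over the band of log2(1 + S(f)/N(f)) df. A minimal sketch, assuming made-up signal and noise spectra rather than Imatest's measured ones:

```python
import numpy as np

# Hypothetical signal and noise power spectra; illustrative stand-ins only,
# not Imatest's measured SFR or noise spectrum.
W = 0.5                                    # Nyquist band for a sampled image
f = np.linspace(1e-3, W, 1000)             # spatial frequencies (cycles/pixel)
signal_psd = 1.0 / (1.0 + (f / 0.2) ** 2)  # signal power falling off with f
noise_psd = np.full_like(f, 1e-3)          # roughly flat noise floor

# C = integral of log2(1 + S(f)/N(f)) df over the band, in bits per pixel here.
capacity = np.trapz(np.log2(1.0 + signal_psd / noise_psd), f)
print(f"Shannon information capacity ~ {capacity:.2f} bits/pixel")
```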

  
Channel capacity
As we have already noted, the astonishing part of this theory is the existence of a channel capacity.
In fact, all we may deduce from the proof of the theorem is that the code must be a long one.
Shannon's channel coding theorem applies to the channel, not to the source.
www.cs.ucl.ac.uk /staff/S.Bhatti/D51-notes/node31.html   (504 words)

  
Shannon's Capacity Theorem
Shannon's Capacity Theorem says the unthinkable is possible: Error-free transmission is possible so long as the transmitter does not exceed the channel's capacity.
The Capacity Theorem can also be stated in terms of the transmission rates by dividing the coding rate (and the capacity) by the duration of the bit interval.
Shannon showed that the capacity of an additive white Gaussian noise channel is given by C = W log2(1 + S/N), where W is the channel bandwidth and S/N is the signal-to-noise ratio.
www.owlnet.rice.edu /~engi202/capacity.html   (281 words)
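Restating the bound in terms of rates, as the page suggests, leads to the bandwidth-efficiency form: a spectral efficiency of eta bits/s/Hz is achievable on the AWGN channel only if Eb/N0 >= (2^eta - 1)/eta. A minimal sketch (the eta values are arbitrary):

```python
import math

def min_ebn0_db(eta: float) -> float:
    """Smallest Eb/N0 (in dB) at which spectral efficiency eta bits/s/Hz
    is achievable on the AWGN channel: Eb/N0 >= (2**eta - 1) / eta."""
    return 10 * math.log10((2 ** eta - 1) / eta)

for eta in (8.0, 2.0, 1.0, 0.5, 0.01):
    print(f"eta = {eta:5.2f} b/s/Hz -> Eb/N0 >= {min_ebn0_db(eta):6.2f} dB")

# As eta -> 0 the bound approaches ln(2), i.e. -1.59 dB, the ultimate Shannon limit.
```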

  
 A Glossary for Molecular Information Theory and the Delila System
Shannon's famous work on information theory was frustrating in the sense that he proved that codes exist that can reduce error rates to as low as one may desire, but he did not say how this could be accomplished.
Shannon's channel capacity theorem showed that it is possible to build systems with as low an error as desired, but one cannot avoid errors entirely.
Shannon sphere: A sphere in a high-dimensional space that represents either a single message of a communications system (after sphere) or the volume that contains all possible messages (before sphere); it could be called a Shannon sphere in honor of Claude Shannon, who recognized its importance in information theory.
www.lecb.ncifcrf.gov /~toms/glossary.html   (10853 words)

  
IEEE Spectrum: Closing in on the perfect code
Before Shannon's work, engineers thought that to reduce communications errors, it was necessary to increase transmission power or to send the same message repeatedly—much as when, in a crowded pub, you have to shout for a beer several times.
Shannon showed that with the right collection of codewords—with the right code, in other words—it was possible to attain the channel capacity.
Shannon proved mathematically that coding was the means to reach capacity, but he didn't show exactly how to construct these capacity-approaching codes.
staging.spectrum.ieee.org /mar04/3957   (3913 words)

  
Background/Introduction
The benefits include increased capacity, roughly proportional to the minimum of the number of receive and transmit antennas; robustness to fading and shadowing, i.e., diversity; and decreased interference among different transmissions.
The gain from cooperative diversity is both an increase in rate, measured for example by ergodic (average) capacity, and diversity, measured for example by outage or outage capacity.
Deriving upper and lower bounds for the Shannon capacity of cooperative diversity in fading channels.
www-ee.eng.hawaii.edu /~madsen/research.htm   (763 words)
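A minimal Monte Carlo sketch of the min(transmit, receive) capacity scaling mentioned above, assuming i.i.d. Rayleigh channels and equal-power transmission (the SNR and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def ergodic_mimo_capacity(nt: int, nr: int, snr_db: float, trials: int = 2000) -> float:
    """Monte Carlo estimate of E[log2 det(I + (SNR/nt) H H^H)] over
    i.i.d. Rayleigh channels H (nr x nt), in bits/s/Hz."""
    snr = 10 ** (snr_db / 10)
    total = 0.0
    for _ in range(trials):
        h = (rng.standard_normal((nr, nt))
             + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        m = np.eye(nr) + (snr / nt) * h @ h.conj().T
        total += np.log2(np.linalg.det(m).real)
    return total / trials

# Capacity grows roughly in proportion to min(nt, nr), as the abstract notes.
for n in (1, 2, 4, 8):
    print(f"{n}x{n} MIMO at 10 dB: ~{ergodic_mimo_capacity(n, n, 10.0):.1f} b/s/Hz")
```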

  
Pushing the Limit: Science News Online, Nov. 5, 2005
Shannon showed that at any given noise level, there is an upper limit on the ratio of the information to the redundancy required for accurate transmission.
Shannon considered how much redundancy must be added to a message so that the information in it can survive a noisy transmission.
Shannon's law, however, says that there is a limit to how good these codes can get—whether the communication channel is a fiber-optic cable or a noisy room.
www.sciencenews.org /articles/20051105/bob8.asp   (2527 words)

  
 DIMACS Workshop on Signal Processing for Wireless Transmission
Capacity and coding strategies for state-dependent channels with state sequence known to the transmitter but unknown to the receiver is a classical problem in information theory that dates back to Shannon (causal knowledge of the state sequence) and to Gel'fand and Pinsker (non-causal knowledge of the state sequence).
Because the exact distribution of the instantaneous channel capacity in a Rayleigh fading environment is difficult to analyze, we prove a central limit theorem for MIMO channels with a large number of antennas.
Recently the capacity region of a multi-input multi-output (MIMO) Gaussian broadcast channel, with Gaussian codebooks and known-interference cancellation through dirty paper coding (DPC), was shown to equal the union of the capacity regions of a collection of MIMO multiple access channels (MAC).
dimacs.rutgers.edu /Workshops/Wireless/abstracts.html   (3765 words)

  
 Derivation of from the Capacity of Molecular Machines
[Shannon, 1949], it is the maximum amount of information which a molecular machine can gain per operation.
Because it produces the same result as equation (16), the derivation shows that the machine capacity (equation (17)) is closely related to the Second Law of Thermodynamics under isothermal conditions.
[Schneider, 1991].) So Shannon's channel capacity is, surprisingly, also related to the "isothermal" Second Law of Thermodynamics.
www.lecb.ncifcrf.gov /~toms/paper/edmm/latex/node8.html   (282 words)

  
 The adaptive classical capacity of a quantum channel, or Information capacities of three symmetric pure states in three ...
Several such capacities have already been defined, depending on what operations are allowed in the protocols that the sender uses to encode classical information into these quantum states, and that the receiver uses to decode it.
This capacity, and the capacities obtained using various specific values of β in Q(β), are shown in Figure 7.
Shannon's formula for the capacity of a classical channel is the entropy of the average output less the average entropy of the output.
www.research.ibm.com /journal/rd/481/shor.html   (10132 words)
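That formula, mutual information as the entropy of the average output less the average entropy of the output, is easy to evaluate directly; a minimal sketch for a binary symmetric channel (the 0.1 crossover probability is an arbitrary example):

```python
import math

def H(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def mutual_information(input_dist, channel):
    """I(X;Y) = H(average output) - average H(output),
    with channel[x] the output distribution given input x."""
    avg_output = [sum(px * channel[x][y] for x, px in enumerate(input_dist))
                  for y in range(len(channel[0]))]
    return H(avg_output) - sum(px * H(channel[x]) for x, px in enumerate(input_dist))

# Binary symmetric channel with crossover 0.1; capacity = 1 - H2(0.1) ~ 0.531,
# attained by the uniform input, found here by a coarse scan over input laws.
bsc = [[0.9, 0.1], [0.1, 0.9]]
best = max((mutual_information([p, 1 - p], bsc), p)
           for p in (i / 1000 for i in range(1, 1000)))
print(f"capacity ~ {best[0]:.3f} bits at input p = {best[1]:.2f}")
```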

  
 Imatest - SFR results: Chromatic Aberration, Noise, and Shannon capacity plot
The Noise spectrum and Shannon capacity plots, shown below beneath Chromatic Aberration, are plotted only if the Noise Spectrum and Shannon capacity (in CA plot) checkbox in the Imatest SFR input dialog box has been checked.
Noise spectrum and Shannon capacity are only plotted if the selected region is large enough to provide reasonable noise statistics.
Shannon capacity C as a function of signal S (closely related to image contrast), where S is expressed as the percentage of the difference between the white and black regions of the target.
www.imatest.com /docs/sfr_CAplot.html   (740 words)

  
 Shannon Statue Dedications
Graduate student Kevin Holt displays Shannon's capacity formula for the white Gaussian noise channel, as inscribed on the 'sheet' in the statue's left hand.
Unveiled in the presence of Betty Shannon and sculptor Eugene Daub, the statue is located in Shannon Park in the center of downtown Gaylord.
Everyone is warmly invited to attend the program, which includes the unveiling ceremony from 6 to 6:30 PM in Shannon Park, followed immediately by a reception and a panel discussion at the Otsego Club, located approximately one mile east of Shannon Park, in the Hidden Valley Resort.
www.eecs.umich.edu /shannonstatue   (1339 words)

  
 Understanding the information rate of BPL and other last-mile pipes
I've used Shannon's equation to attempt to make a reasonable and fair comparison, but even so the plot shown in Figure B should be viewed as a qualitative indicator rather than a precise one.
I've estimated total information capacity by dividing the available spectrum into 100 segments, spreading the signal power evenly over the spectrum and performing a piece-wise integration of the information capacity using the parameters appropriate to the center of each individual segment.
BPL capacity has been calculated using the full spectrum studied by the OPERA (Open PLC European Research Alliance) report, which is about twice that used by the HomePlug standard.
www.computingunplugged.com /issues/issue200608/00001828001.html   (431 words)
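The piece-wise estimate described above can be sketched as follows; the band edges, SNR, and roll-off below are hypothetical stand-ins, not the OPERA study's parameters:

```python
import math

# Hypothetical BPL-like band and channel model (illustrative numbers only).
f_lo, f_hi = 2e6, 30e6          # band edges in Hz (assumption)
n_segments = 100                # as in the article's piece-wise integration
snr_db_at_lo = 40.0             # assumed SNR at the low band edge
snr_slope_db_per_mhz = -1.0     # assumed SNR roll-off with frequency

seg_bw = (f_hi - f_lo) / n_segments
capacity = 0.0
for i in range(n_segments):
    f_center = f_lo + (i + 0.5) * seg_bw
    snr_db = snr_db_at_lo + snr_slope_db_per_mhz * (f_center - f_lo) / 1e6
    snr = 10 ** (snr_db / 10)
    # Shannon capacity of this segment, evaluated at its center frequency.
    capacity += seg_bw * math.log2(1.0 + snr)

print(f"Estimated total capacity: {capacity / 1e6:.0f} Mbit/s")
```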

  
Favorite Equations
Shannon's theorem, put forth by Claude Elwood Shannon in 1948, gives an upper bound on the capacity of a communications link (in bits/second) as a function of the link bandwidth and the SNR (signal-to-noise ratio): C = B log2(1 + S/N), where C is the capacity, B is the bandwidth, and S/N is the signal-to-noise ratio of the link.
It was Shannon's equation that triggered the explosive research in the field of fiber optics because, according to his equation, the data rates transmitted using today's optical fibers are less than 1% of their maximum data rate (i.e., their Shannon capacity).
www.ittc.ku.edu /~rvc/html/equations   (519 words)

  
 Noisy channel coding theorem - Wikipedia, the free encyclopedia
The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel, for a particular noise level.
Proved by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption.
Shannon's theorem has wide-ranging applications in both communications and data storage.
en.wikipedia.org /wiki/Shannon_limit   (1271 words)

  
 10G Ethernet Over Structured Copper Cabling - White Paper by Siemon
Shannon’s equation provides a benchmark against which the performance of a practical communication system may be measured.
Shannon’s equation provides us with the background to determine the theoretical channel capacity of an underlying media.
According to a key study [6] provided to the 10GBASE-T task group, the target required Shannon capacity to guarantee 10 Gb/s Ethernet is estimated at 15.9 Gb/s when applying the LDPC coding approach.
www.siemon.com /us/white_papers/04-12-22_10G_Ethernet_Over_Structured_Copper_Cabling.asp   (4247 words)
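Inverting Shannon's equation, SNR >= 2^(C/B) - 1, shows what such a capacity target demands of the cable. Only the 15.9 Gb/s figure comes from the article; the 400 MHz per-pair bandwidth below is an assumption for illustration (10GBASE-T does run over four twisted pairs):

```python
import math

target_capacity = 15.9e9      # required Shannon capacity from the cited study
n_pairs = 4                   # 10GBASE-T runs over four twisted pairs
bandwidth_per_pair = 400e6    # assumed usable bandwidth per pair (hypothetical)

# Capacity each pair must carry, and the SNR Shannon's equation then demands.
c_per_pair = target_capacity / n_pairs
snr_required = 2 ** (c_per_pair / bandwidth_per_pair) - 1
print(f"Required SNR per pair: {10 * math.log10(snr_required):.1f} dB")  # ~29.9 dB
```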

  
Coverstory - Big Pipe on Campus
Also known as Shannon’s capacity, Shannon’s Law is a measure that describes how efficiently a cable can transmit data at different rates.
Taking into account the additional bandwidth required to overcome noise produced by active hardware, such as jitter, which is especially troublesome at higher frequencies, a Shannon capacity of 18 Gbps is required from the cabling infrastructure to achieve 10-Gbps transmission at 100 meters.
When Deaver was searching for a 10-Gbps UTP solution in early 2004, cable manufacturers could not demonstrate Shannon’s capacity of 18 Gbps at 100 meters, due primarily to alien crosstalk, which is the amount of noise measured on a pair within a cable that is induced from an adjacent cable.
www.comnews.com /stories/articles/0305/0305coverstory.htm   (1997 words)

  
 SoC drawer: Detecting and correcting I/O and memory errors
The data-carrying capacity of a channel increases with bandwidth.
However, capacity is also limited by the signal-to-noise ratio on a given link: it grows as the binary logarithm of the signal power plus noise divided by the noise power (Equation 1).
C is the channel capacity in bits/second after error correction is applied.
www.ibm.com /developerworks/power/library/pa-soc6/index.html?ca=drs-   (2917 words)

  
 CISE - Seminar Abstract
Since the direct application of Shannon's channel coding theorem seems inappropriate, as it is based on asymptotically infinite delay that is typically detrimental to the stability of a feedback system, researchers have looked for new fundamental limits on the stability and performance of feedback systems.
Such conditions involve quantities related to the capacity of the communication channel, and in some special cases, reduce to the capacity.
We again show how both the channel capacity of the forward channel and the Bode integral of the feedback channel provide lower bounds on the best achievable performance.
www.bu.edu /systems/seminars/classes/class09.htm   (539 words)
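One concrete limit from this line of work is the data-rate theorem (a known result, though not named in the abstract): stabilizing a linear plant over a channel requires a rate exceeding the sum of log2 of the plant's unstable eigenvalue magnitudes. A minimal sketch with hypothetical eigenvalues:

```python
import math

def min_stabilizing_rate(eigenvalues):
    """Data-rate theorem lower bound: R > sum of log2|lambda| over the
    eigenvalues of the discrete-time plant lying outside the unit circle."""
    return sum(math.log2(abs(lam)) for lam in eigenvalues if abs(lam) > 1)

# Hypothetical discrete-time plant with eigenvalues {2, 1.25, 0.5}.
poles = [2.0, 1.25, 0.5]
print(f"Channel must support > {min_stabilizing_rate(poles):.3f} bits per step")
# Only the unstable modes (2 and 1.25) contribute: log2(2) + log2(1.25) ~ 1.32.
```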

  
 Capacity of Constrained Systems in One and Two Dimensions by Paul H. Siegel
The capacity represents the growth rate of the number of sequences or arrays in the constrained system, as their size increases toward infinity.
The capacity can also be interpreted as the maximum entropy achieved by any probability measure on the constrained system.
From the coding perspective, the capacity represents an upper bound on the rate of invertible codes from unconstrained sequences to the constrained system.
www.ima.umn.edu /biology/wkshp_abstracts/siegel1.html   (266 words)
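For one-dimensional constraints this growth rate is log2 of the largest eigenvalue of the constraint graph's adjacency matrix. A minimal sketch for the classic golden-mean constraint (no two consecutive ones), chosen here purely for illustration:

```python
import numpy as np

# State 0: last symbol was 0 (may emit 0 or 1); state 1: last was 1 (must emit 0).
# Adjacency matrix of the constraint graph for "no two consecutive 1s".
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Capacity = log2 of the spectral radius (Perron eigenvalue) of A.
cap = np.log2(max(abs(np.linalg.eigvals(A))))
print(f"Capacity of the golden-mean shift: {cap:.4f} bits/symbol")  # log2(phi) ~ 0.6942
```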

  
roger h shannon - ResearchIndex document query
Abstract: We derive the ergodic (Shannon) capacity region of an M-user broadcast fading channel.
The Shannon Capacity of a union - Alon (1998)
citeseer.ist.psu.edu /cis?q=Roger+H.+Shannon   (812 words)

  
 Proceedings of the American Mathematical Society
N. Alon, The Shannon capacity of a union, Combinatorica, 18 (1998), 301-310.
T. Bohman, M. Ruszinkó, and L. Thoma, Shannon capacity of large odd cycles, Proceedings of the 2000 IEEE International Symposium on Information Theory, June 25-30, 2000, Sorrento, Italy.
W. Haemers, An upper bound for the Shannon capacity of a graph, Colloquia Mathematica Societatis János Bolyai, 25: Algebraic Methods in Graph Theory, Szeged (Hungary), 1978, 267-272.
www.ams.org /proc/2003-131-11/S0002-9939-03-06495-5/home.html   (445 words)
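These bounds can be checked by brute force in the smallest interesting case, the pentagon C5: its strong square has independence number 5, so the Shannon capacity of C5 is at least sqrt(5) (Lovász showed equality). A minimal exhaustive sketch:

```python
from itertools import combinations

n = 5
def adjacent(a, b):           # adjacency in the cycle C5 (loops excluded)
    return (a - b) % n in (1, n - 1)

def strong_adjacent(u, v):
    """Adjacency in the strong product C5 x C5: coordinates equal or
    adjacent in each factor, excluding the vertex itself."""
    (a, b), (c, d) = u, v
    return u != v and (a == c or adjacent(a, c)) and (b == d or adjacent(b, d))

vertices = [(i, j) for i in range(n) for j in range(n)]

def has_independent_set(k):
    return any(all(not strong_adjacent(u, v) for u, v in combinations(s, 2))
               for s in combinations(vertices, k))

print("independent set of size 5:", has_independent_set(5))   # True
print("independent set of size 6:", has_independent_set(6))   # False
# Hence alpha(C5^2) = 5, giving Theta(C5) >= 5 ** 0.5.
```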

  
CiteULike: Tag shannon
The Shannon capacity of a graph and the independence numbers of its powers
Shannon information, LMC complexity and Renyi entropies: a straightforward approach
The Shannon information of filtrations and the additional logarithmic utility of insiders
www.citeulike.org /tag/shannon   (208 words)

  
 Proceedings of the American Mathematical Society
A limit theorem for the Shannon capacities of odd cycles.
T. Bohman, A limit theorem for the Shannon capacities of odd cycles I, Proceedings of the AMS 131 (2003), 3559-3569.
T. Bohman, R. Holzman, A nontrivial lower bound on the Shannon capacities of the complements of odd cycles, IEEE Transactions on Information Theory, 49(3) (2003), 721-722.
www.mathaware.org /proc/2005-133-02/S0002-9939-04-07470-2/home.html   (366 words)

  
Generalized Capacity Formula
Before discussing the formulas for capacity in a fading environment, it is worthwhile to mention that in the current analysis capacity is the limit on error-free bit rate provided by information theory, and as technology advances every day, we have a better chance of approaching that limit.
Any practical system achieves a bit rate that is only a fraction of the capacity.
The standard formula for the Shannon capacity [2-4], expressed in bps/Hz, is C = log2(1 + S/N).
www.ee.ucr.edu /~munir/term/node4.html   (218 words)
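A minimal Monte Carlo sketch of one such fading-capacity quantity, the ergodic capacity E[log2(1 + gamma)] under Rayleigh fading (the mean SNR values are arbitrary), compared with the non-fading value at the same mean SNR:

```python
import numpy as np

rng = np.random.default_rng(1)

def ergodic_capacity_rayleigh(mean_snr_db: float, samples: int = 200_000) -> float:
    """Monte Carlo estimate of E[log2(1 + gamma)] in bps/Hz, where the
    instantaneous SNR gamma is exponential (Rayleigh fading) with given mean."""
    mean_snr = 10 ** (mean_snr_db / 10)
    gamma = rng.exponential(mean_snr, samples)
    return float(np.mean(np.log2(1.0 + gamma)))

for snr_db in (0, 10, 20):
    awgn = np.log2(1 + 10 ** (snr_db / 10))   # non-fading channel, same mean SNR
    print(f"{snr_db:2d} dB: fading ~{ergodic_capacity_rayleigh(snr_db):.2f} "
          f"vs AWGN {awgn:.2f} bps/Hz")       # fading is lower, by Jensen's inequality
```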

  
 hp labs : research : information theory : seminars
Many of these challenges can be overcome with multiple antennas at the transmitters and receivers of the wireless network, since these antennas both increase data rate and reduce channel randomness.
However, traditional methods for determining Shannon capacity fail for channels with multiple antennas, especially when there are multiple users, channel variations over time, or channel memory.
We present several new mathematical techniques to study Shannon capacity of multiantenna wireless channels with these properties, including duality, dirty paper coding, and Lyapunov exponents for products of random matrices.
www.hpl.hp.com /research/info_theory/seminar/goldsmith051603.html   (205 words)

  
CiteULike: On the Shannon capacity of dual MIMO systems in Ricean fading
On the Shannon capacity of dual MIMO systems in Ricean fading
We derive exact analytical expressions for the probability density function and cumulative distribution function for the capacity of a dual multiple-input multiple-output system (either two transmit or two receive antennas) transmitting in Ricean fading.
In contrast to earlier work we do not require the line-of-sight (LOS) channel matrix to be of rank one.
www.citeulike.org /user/jakeeng/article/921677   (251 words)

