Saturday, October 21, 2017

C = B * log2(1+S/N): "An Equation for Every Occasion - Fifty-Two Formulas and Why They Matter" by John M. Henshaw


Oh man, I once had a heated debate with someone who was convinced that vinyl gave better sound quality than CDs because it contained frequencies above 22.05 kHz, and that those frequencies would "harmonically" influence the lower ones, even though he accepted his ears could only detect frequencies below about 20 kHz. He was insistent that he could hear the difference, despite his ear essentially being a low-pass filter (well, OK, band-pass, but you know what I mean). Apparently they don't teach linear superposition in some engineering courses.
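
To make the superposition point concrete, here is a throwaway sketch of my own (not from the book, and not from that argument) using NumPy and SciPy, with an 8th-order low-pass filter at 20 kHz standing in for the ear; the sample rate, tone frequencies, and filter order are arbitrary choices for the demo:

    # In a linear system, ultrasonic content cannot alter what a low-pass
    # filter lets through -- that is linear superposition at work.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 192_000                                        # sample rate, Hz
    t = np.arange(0, 0.1, 1 / fs)                       # 100 ms of signal

    audible = np.sin(2 * np.pi * 1_000 * t)             # 1 kHz tone
    ultrasonic = 0.5 * np.sin(2 * np.pi * 35_000 * t)   # 35 kHz tone

    # 8th-order Butterworth low-pass at 20 kHz, a crude stand-in for the ear
    b, a = butter(8, 20_000, btype="low", fs=fs)

    # Superposition: filtering the mix equals filtering each part alone
    mix = filtfilt(b, a, audible + ultrasonic)
    parts = filtfilt(b, a, audible) + filtfilt(b, a, ultrasonic)
    print(np.max(np.abs(mix - parts)))                  # ~0, floating-point noise

    # And the ultrasonic tone on its own barely gets through at all, so it
    # cannot "harmonically" change what is heard below 20 kHz.
    print(np.max(np.abs(filtfilt(b, a, ultrasonic))))   # tiny fraction of 0.5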

One of the glaring absences is any mention of Boltzmann's contribution; he was the first to use a logarithmic function to connect average uncertainty with the probability of a random variable. Shannon extended this result to the communication setting with two theorems, the source coding theorem and the channel coding theorem, which are the basis of modern communication technology; Shannon's treatment became "Shannon's Law." Before Shannon, nobody really knew what they were doing. It was all very ad hoc.

Shannon's main contribution was to show us that it is possible to send large messages with an arbitrarily low probability of error, even when the channel itself makes errors, provided the data rate is less than the Shannon channel capacity: C = B * log2(1+S/N), where B is the bandwidth in hertz, S is the signal power in watts, N is the noise power in watts, and C is the capacity in bits/sec.
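
As a quick sanity check of the formula, with made-up but plausible numbers: a voice telephone line of roughly 3 kHz bandwidth and 30 dB signal-to-noise ratio works out to about 30 kbit/s, which is roughly where late analog dial-up modems (28.8 and 33.6 kbit/s) topped out.

    # Back-of-the-envelope use of C = B * log2(1 + S/N), with assumed numbers
    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        """Channel capacity in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr_db = 30.0                        # assumed signal-to-noise ratio, in dB
    snr = 10 ** (snr_db / 10)            # convert dB to a linear power ratio
    # Roughly a voice telephone line: ~3 kHz of bandwidth at ~30 dB SNR
    print(shannon_capacity(3_000, snr))  # ~29,900 bit/s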

Shannon didn't actually tell us *how* to design codes that reach channel capacity. Starting in the 1960s, largely driven by deep-space communication, better and better methods were developed, until turbo codes were discovered in the early 1990s. They come so close to Shannon's capacity limit that there will probably never be anything much better.

The big thing missing from information theory is an attempt to understand what "natural bits of information" are, because physics and math dominate the field. There are lots of opportunities here for students who learn to think about the big picture. According to the Matrix/DNA Theory models, natural bits are a "living thing", while bits inside wires are artificial bits: the difference between wild animals free in the jungle and domesticated animals held prisoner in our houses. A living thing is something that contains the Matrix universal formula for systems, whose first shape is a natural light wave. Academic science studies the process of communication between two or more different systems. This process is a two-way avenue. At the sender's end the environment is chaotic, so the signal needs to cross this zone before arriving at the receptor, inside or outside the chaotic zone. Sender and receptor, chaos and order, become, from a mathematical viewpoint, systems of 0s and 1s; so, just as two human systems speak a human language, these 0s and 1s speak the 0-and-1 language. These two 0-and-1 systems are the two brains of two computers, and this is why computers do not learn how to think by themselves. A natural bit of information is, in itself, a complete and working system, containing the seven universal systemic functions, or variables. So these bits can be processed and transformed by a natural brain, producing a third, new product. I think that quantum computation will need seven variables instead of the two, 0 and 1. But, knowing the Matrix/DNA formula, we can do it.

Our modern human technology for information and transmission is still in the stone age. The most advanced technology for transmitting information will use the cosmic wave background, according to Matrix/DNA Theory's models. If there is a natural light current flowing through the whole universe, why not use it, instead of manipulating pulses of voltage in wired electrical currents?
But then humans will need a revolution in their understanding of natural light, of natural and living bits of information, and of what the cosmic radiation is. Matrix/DNA is investigating these issues. This ex-machina light contains the force that imprints dynamics onto inertial matter, creating natural systems through its life-cycle process. The natural white and positive waves of light emitted by quantum vortexes (these are the first shape of natural living bits of information), at every micro big bang of this pulsating universe, take no time; they are instantaneous, expanding through the whole. The light that we grasp, and whose "speed" we know, is merely these cosmic waves re-transmitted by receptor/sender stations known as stars, pulsars, etc. So we will have a bit of information transmitted instantaneously throughout the whole universe. Knowing this distant supreme goal will drive us to develop the technology.

Mathematically, Shannon's formula for information is just Boltzmann's formula for entropy. Shannon actually called it entropy and equated information with entropy, his interpretation being that the information obtained when "reading a message" is greater the greater one's initial lack of knowledge about the message (as measured by the entropy).
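
For reference, the formula in question is H = -sum over i of p_i * log2(p_i). A tiny illustration of my own: the more predictable the message, the less information each symbol carries.

    # Shannon entropy: average information, in bits per symbol
    import math

    def entropy_bits(probs):
        """H = -sum(p * log2(p)) over the outcomes of a distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy_bits([0.5, 0.5]))     # fair coin: 1.0 bit per toss
    print(entropy_bits([0.99, 0.01]))   # biased coin: ~0.08 bits per toss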

In college I did a paper on compression algorithms and information theory; it really was an interesting and expansive subject, and this book gives us a sound explanation of the fundamentals, although it is short on the particulars.
