The Mathematical Theory of Information is, basically, a theory of the optimal transmission of messages. Information travels from a source to a destination, while the energy that carries it travels from a transmitter to a receiver.
This analytical scheme, in different versions and with subtle terminological variations, has been a constant presence in communication studies, probably owing to its applicability to heterogeneous phenomena. Indeed, every communicative process develops according to this scheme, whether:
a) It occurs between two machines (e.g. the communication that takes place in homeostatic apparatuses, which ensure that a certain temperature does not exceed an established limit by making the appropriate adjustments the moment they receive a suitably encoded message);
b) It occurs between two human beings;
c) It occurs between a machine and a human being (the typical case is the fuel level in a car's tank, communicated by a float and electric signals to the dashboard, where a message to the driver appears).
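The machine-to-machine case in (a) can be sketched as a toy regulation loop; the names, threshold, and one-bit encoding below are hypothetical choices made here purely for illustration:

```python
def encode(temperature_c):
    """Transmitter: encode the measured temperature as a one-bit signal."""
    return 1 if temperature_c > 21.0 else 0

def regulate(signal):
    """Receiver: decode the signal and adjust the heat source accordingly."""
    return "heater off" if signal == 1 else "heater on"

# The "message" travels from sensor to regulator as an encoded signal.
for reading in (18.5, 23.0):
    print(regulate(encode(reading)))
```

The point of the sketch is that what crosses the channel is never the temperature itself but a conventionally encoded signal, which the receiver must decode against the same code.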
The usefulness of this communication model lies not only in its broad applicability: it also brought to light the factors that interfere with transmission, that is to say, the problem of noise (due to signal loss or to parasitic information produced in the channel).
That was a key point, since the theory's main operational purpose was to transmit the maximum amount of information through the channel with the minimum interference and the maximum economy of time and energy.
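This quantitative aim, maximum information through a noisy channel, is captured by the Shannon-Hartley theorem, which bounds the reliable transmission rate by the channel's bandwidth and its signal-to-noise ratio. A minimal sketch (the example bandwidth and SNR figures are illustrative assumptions):

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

# As noise grows (i.e. the signal-to-noise ratio falls),
# the maximum reliable rate drops:
print(channel_capacity(3000, 1000))  # a quiet 3 kHz channel
print(channel_capacity(3000, 10))    # the same channel, much noisier
```

No coding scheme can push information through the channel faster than this bound, which is why noise is the central operational problem of the theory.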
The code that information theory is concerned with, and that makes the transmission of information possible, serves to reduce the initial equiprobability at the source by establishing a system of recurrences. It is a purely syntactic, organizing system that does not treat the problem of the messages' meaning, that is to say, the most specifically communicative dimension, as part of its own domain. Information, as a statistical measure of the equiprobability of events at the source, an entity measurable in purely quantitative terms, is not to be confused with meaning: the value attributed on the basis of a code that correlates the transmitted elements with other entities (correlated by convention), which, in fact, are not themselves transmitted.
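The "purely quantitative" measure referred to here is Shannon entropy, which is maximal when all events at the source are equiprobable and falls as a code imposes regularities. A minimal sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable symbols: maximum uncertainty, 2.0 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))

# A source constrained by recurrences (unequal probabilities)
# carries less information per symbol:
print(entropy([0.7, 0.1, 0.1, 0.1]))
```

Note that the computation involves only the probabilities of the symbols, never what they stand for, which is exactly the sense in which the measure is syntactic rather than semantic.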
What limits information theory is not only the ambiguity of the concept of "code" (the internal syntax of the signal sequence versus a correlation between elements of different systems), but above all its systematic exclusion of the dimension of meaning.
Authors / References: Claude Elwood Shannon & Warren Weaver.
© 2003-2020 Comunicologos.com. All rights reserved.