The following discussion shows that the message in DNA has real, objective meaning and that all real communication has a semantical component.
- Claude Shannon’s information theory does not mathematically quantify semantics – because so far as we know it’s impossible to do so. However, Shannon and Weaver explicitly acknowledge the existence and importance of semantics. A message, after all, must mean something or it’s not a message.
- The quantitative measure of noise (a meaningless message) could be exactly the same as that of a very meaningful message. That does not mean they are the same, however, and Warren Weaver says people who confuse the two are “jokers.” This is one of the most common misapplications of information theory: the quantitative measure of information is used to gauge channel capacity, not the usefulness of information content.
- Thomas Schneider’s EV program is cited as an example of the irrelevance of semantic content. My reply is that the lack of semantic content in his model invalidates his model. It doesn’t produce evolution; it just fills a matrix with meaningless numbers that Schneider misleadingly calls “molecular machines.”
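The point about noise and a meaningful message having identical quantitative measures can be sketched numerically. The helper below is my own illustrative estimate of Shannon entropy from symbol frequencies (not a formula lifted from Shannon’s paper): because the measure depends only on symbol statistics, a meaningful sentence and a meaningless shuffle of the very same characters score identically.

```python
from collections import Counter
from math import log2

def entropy_per_symbol(msg: str) -> float:
    """Estimate Shannon entropy in bits per symbol from symbol frequencies."""
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * log2(c / n) for c in counts.values())

meaningful = "attack at dawn"
scrambled = "".join(sorted(meaningful))  # same symbols, meaning destroyed

# Both lines print the same number: the measure sees only statistics.
print(entropy_per_symbol(meaningful))
print(entropy_per_symbol(scrambled))
```

The two values are exactly equal, which is precisely Weaver’s point: the theory cannot tell the order “attack at dawn” apart from gibberish with the same letter counts, and anyone who concludes the two are therefore equivalent is his “joker.”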
You don’t understand Shannon’s theory. It doesn’t incorporate semantics (meaning) and even you yourself repeated Shannon’s quote that semantics are not part of the model. Yet now you have asserted that Shannon’s theory has something to do with semantical information! Does Shannon’s theory impact the meaning? Of course. Does it prescribe or discuss it? No. It’s all subjective and there is no mathematical definition of semantics. If you don’t understand that, I really question your credentials.
Shannon’s theory does not quantify the semantical content of the message. Nevertheless all encoding / decoding systems (DNA, Ethernet, Human languages, TCP/IP) have a semantical component. And what Shannon’s theories supply us with is a set of mechanisms and formulas for verifying that the decoding has been properly done so that semantical content is preserved. It’s worth noting that all of the encoding / decoding systems I have referred to have mechanisms that perform this verification (through some form of redundancy). The systems you are attempting to map into Shannon’s model lack a decoding mechanism, and they lack this error detection feature as well.
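The redundancy-based verification described above can be sketched in a few lines. This is a hypothetical framing of my own using a CRC-32 checksum, one common form of redundancy; the point is that the receiver can confirm the message arrived intact without knowing anything about what it means.

```python
import zlib

def encode(payload: bytes) -> bytes:
    # Append a 4-byte CRC-32 checksum: redundancy the receiver can verify.
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def decode(frame: bytes):
    # Recompute the checksum over the payload and compare with the one sent.
    payload, received_crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    ok = zlib.crc32(payload) == received_crc
    return payload, ok

frame = encode(b"switch the monitor off")
print(decode(frame))             # (b'switch the monitor off', True)

corrupted = bytearray(frame)
corrupted[0] ^= 0xFF             # flip bits: Shannon's "noise"
print(decode(bytes(corrupted)))  # (garbled payload, False)
```

Ethernet and TCP/IP use exactly this style of checksum verification; DNA’s proofreading and repair machinery plays an analogous role.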
“The semantic aspects of communication are irrelevant to the engineering problem” -Weaver. Shannon’s theory says nothing about meaning and semantics!
At the statistical or syntactical level of an encoding / decoding system – even within Shannon’s model – meaning of symbols still has to be assigned. That is, by definition, what decoding is. I would encourage you to purchase Shannon and Weaver’s book and read the first chapter. And see exactly what they say about semantics. They seem to think it’s pretty important.
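The claim that decoding is, by definition, the assignment of meaning to symbols can be made concrete with the genetic code itself. The sketch below uses a tiny excerpt of the standard codon table (the function name and the particular subset of codons are my own illustration): each 3-symbol codon means nothing until the decoding table assigns it an amino acid or a stop signal.

```python
# A small excerpt of the standard genetic code: the decoding table is what
# assigns each codon its meaning (an amino acid, or a start/stop signal).
CODON_TABLE = {
    "ATG": "Met",  # methionine; also the start signal
    "TGG": "Trp",  # tryptophan
    "TTT": "Phe",  # phenylalanine
    "AAA": "Lys",  # lysine
    "TAA": "Stop",
}

def decode_dna(seq: str) -> list:
    """Split a DNA string into codons and look up each one's assigned meaning."""
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    return [CODON_TABLE.get(c, "?") for c in codons]

print(decode_dna("ATGTTTAAATAA"))  # ['Met', 'Phe', 'Lys', 'Stop']
```

Without the table, “ATGTTTAAATAA” is just a symbol stream; with it, the stream specifies a sequence of amino acids. That assignment step is the semantical component.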
If something doesn’t exist apart from a context, then it has no independent existence.
That is false. If something exists within any physical context at all, then it does in fact exist.
Warren Weaver’s words refute you. He says not all messages have semantical meaning and messages with no meaning are equivalent to messages with it. So not all communication systems or codes have to have meaning.
Weaver has stated that his theory cannot quantify semantical meaning. There is no debate about that. But you have carried this further and now claim that semantical meaning therefore does not exist. Shannon and Weaver have most certainly NOT stated that semantical meanings don’t exist; in fact, when Weaver writes “two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information,” he is directly acknowledging that meaning is real and that it is important. Weaver goes further to address those who say that because noise and meaningful data look exactly the same within his theory, they are therefore equivalent – he calls them “jokers.” I quote: “Uncertainty which arises by virtue of freedom of choice on the part of the sender is desirable uncertainty. Uncertainty which arises because of errors or because of the influence of noise is undesirable uncertainty. It is thus clear where the joker is in saying that the received signal has more information.”
Tom Schneider’s EV program uses Shannon’s theory to define communication systems for molecular machines which are completely devoid of meaning. That’s an example of a communication channel with no semantical content.
The fact that Schneider’s molecular “machines” are devoid of meaning does not in any way buttress your case, it just weakens Schneider’s. This is an unfortunate and rather significant limitation of his model, an aspect in which his program fails to model reality. If you are looking for someone to teach you about information theory, Shannon would be a better instructor than Schneider.
I am conforming to Shannon’s own limitation, which you are refusing to accept. Get it through your head, Shannon’s theory doesn’t deal with meaning. If it is present it has nothing to do with engineering. Your insistence that codes must have meaning is either incompetence or dishonesty.
Rob, you have yet to address my prior statement: “Every real example of an encoding / decoding system that I have offered – DNA, computer software programs, human language, etc etc etc – all of those systems without exception have a semantical component. Those communication systems transmit meaning in the form of some kind of instruction, request or plan.” I have made a valid observation, that meaning objectively exists within these systems, despite the inability of mathematical models to fully quantify it. Please address this.
There is no objective theory of semantical information. Therefore semantics doesn’t exist.
Semantical information objectively exists – that’s why your computer monitor turns off when the power saver tells it to. That is an objective fact. Real effects, real causes. This is just one of millions of examples of things that objectively exist even though there is no single, universal mathematical quantification for them.
You claim all codes convey semantical meaning,
I am sending you this message (and it is a message): 99jei7sn
What is the semantical meaning?
You are not correctly quoting me. I said “Every real example of an encoding / decoding system that I have offered – DNA, computer software programs, human language, etc etc etc – all of those systems without exception have a semantical component.” Your above message has a semantical component. It also has garbage characters inserted. Shannon would classify the garbage characters as “noise.” And if a person wants to say those characters constitute information, Weaver would call that person a “joker.” Or, if you have assigned a prior meaning to them, then they have semantical content.
The existence of semantical information in all these systems is a fact. But it’s not even necessary to judge the semantics of a message to know whether it has been successfully decoded. That judgment can be made entirely within the Shannon model: we can use Shannon’s criteria to determine whether a message was transmitted correctly, whether it has meaning or not, and mathematically assign a value to how successful the decoding was – 100% correct, 99% correct, 50% correct. Rob, was your message “99jei7sn” corrupted, or was it successfully transmitted?
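The percent-correct judgment just described can be sketched as a simple symbol-by-symbol comparison of what was sent against what was received (an illustrative measure of my own, not a formula from Shannon’s paper):

```python
def decoding_accuracy(sent: str, received: str) -> float:
    """Fraction of symbols recovered correctly (sequences of equal length)."""
    matches = sum(s == r for s, r in zip(sent, received))
    return matches / len(sent)

print(decoding_accuracy("99jei7sn", "99jei7sn"))  # 1.0   -> 100% correct
print(decoding_accuracy("99jei7sn", "99jxi7sn"))  # 0.875 -> one symbol corrupted
```

Notice that the comparison requires a pre-determined message on the sending side. That is exactly what gravity and magma flows lack, and it is why the question “was it transmitted correctly?” is meaningful for DNA but meaningless for a rock.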
We can make that determination right here. We can also make it with respect to DNA or bee waggle dances. But we cannot make it with respect to gravity or magma flows, because with these systems there is no decoding of a pre-determined message. The “message” we get from magma flows is always 100% “correct” because it only represents itself. This is in stark contrast to DNA or bee waggle dances, where there clearly is a pre-determined message. Thus we see a vast chasm between coded systems and naturally occurring systems. Codes do not occur naturally.
The term ‘information’ has many contexts and you are forcing DNA into a context that is not appropriate.
The word “information” is used in many ways which is why I have adhered to rigorous definitions in this thread. The formal definition is appropriate, as it is the same definition of coded information used by Shannon et al.
Isn’t language another way of saying ‘semantical information’? Science tells us the origin of these codes and the origin of Shannon codes isn’t a mystery either. They’re simply an outcome of matter and energy interacting.
I don’t see any place where anyone here has shown an example of naturally occurring coded information with semantical content. And yes, for language to be useful, it has to have a semantical component. It has to actually mean something. As Norbert Wiener said, “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.”