Information

Information is a term derived from the Latin verb informare, meaning "to give form to the mind", "to discipline", "instruct", "teach". Information is generally understood as knowledge or facts that one has acquired. However, in some areas of science information is defined differently and often ambiguously.

To creation science, it is information (God's word) that underlies the fine-tuning of the universe. Furthermore, the existence of biological information inside every cell (DNA and RNA) provides what is perhaps the most powerful argument for intelligent design. William Dembski asserts that DNA possesses specified complexity (i.e., it is both complex and specified, simultaneously) and therefore it must have been produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes. One of the main objections to evolution is the origin of the enormous amount of genetic information content that is needed for an organism to evolve from microbes to humans. Not only has no credible source been identified where information could be produced by natural processes, but on the contrary, the adaptation of living organisms through natural selection involves a reduction of the information in the genome.

Definitions or Characterizations
The word "information" is used in many ways. We mentioned the lay person's sense above, but the term is also used for a sequence of symbols (such as the letters of a language, the dots and dashes of Morse code, or the arrangement of the bumps of Braille) that conveys meaning. The term is also used in communications theory and in the compression of messages. It is mainly the last two of these senses that are discussed in this article.

Royal Truman, in his analysis published in the Journal of Creation, discusses two families of approaches: one derived from Shannon's work and the other derived from Gitt's work. Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from Gregory Chaitin, but does not discuss it in his article. According to Stephen Meyer, scientists usually distinguish two basic types of information: meaningful or functional information, and so-called Shannon information (named after Claude Shannon, who developed statistical information theory). Shannon information is not the same thing as meaningful information. Meaningful information, encoded into a language, can be measured statistically, per Shannon, but the measure is of the redundancy of the symbols, the so-called Shannon entropy, not a measure of the "information content", or meaning. Shannon's measure, for example, can be used to compute the "information content" of a set of random symbols that have no meaning.
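Meyer's point that Shannon's measure tracks symbol statistics rather than meaning can be illustrated numerically. Below is a minimal Python sketch (the function and the two sample strings are our own illustration, not drawn from any of the authors cited): it computes the per-symbol Shannon entropy of a meaningful sentence and of same-length gibberish, and the two values come out similar.

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Per-symbol Shannon entropy in bits: H = -sum(p * log2(p))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG"
gibberish = "QZX KWPVJ MRTBD LHG NSYFC AOEUI DQZXK WPVJM"

# The entropy values are close, even though only the first string
# means anything: the statistic measures symbol frequencies, not meaning.
print(round(shannon_entropy(meaningful), 2))
print(round(shannon_entropy(gibberish), 2))
```

A string of identical symbols, by contrast, has entropy zero, regardless of whether it means anything.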

Information is often left undefined. Some definitions relate the concept of information to meaning; in turn, the meaning of "meaning" has never been properly explained when applied to human thought processes. Various definitions, characterizations or notions of information are found in the literature and on the web, without consensus. Some of these are:

 * Robert M. Losee: Information may be understood in a domain-independent way as the values within the outcome of any process.
 * Winfried Nöth: Information in its everyday sense is a qualitative concept associated with meaning and news.
 * Ray Kurzweil: Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program.
 * Gregory Bateson: Information is a difference which makes a difference.
 * Valdemar W. Setzer: Information is an informal abstraction (that is, it cannot be formalized through a logical or mathematical theory) which is in the mind of some person in the form of thoughts, representing something of significance to that person. Setzer notes that this is not a definition but a characterization, because "mind", "thought", "something", "significance" and "person" cannot be well defined; he assumes an intuitive (naïve) understanding of these terms.
 * Wikipedia: Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message. Information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system that can interpret the information.
 * The answer of some of those present at Stephen Talbott's lectures to a large audience of librarians: That's the stuff we work with.
 * Norbert Wiener: Information is information, not matter or energy.

Information theory
A clear definition of the concept of "information" cannot be found in information theory textbooks. Gibbs proposes in this context a simple definition: "Information (I) is the amount of the data after data compression". To Shannon, the semantic aspects of communication are irrelevant to the engineering problem; the significant aspect is that the actual message is one selected from a set of possible messages. According to J. Z. Young, the concept of information in a system, for Shannon, may be defined as that feature of it which remains invariant under re-coding.

Semiotics
Both information theory and semiotics study information, but information theory, because of its strictly quantitative approach, has the more limited scope. In semiotics, the concept of information is related to signs. A sign is something that can be interpreted as having a meaning other than itself, and is therefore a vehicle of information to anyone able to decode it. Signs can be regarded in terms of interdependent levels: pragmatics, semantics, syntactics, and empirics.

Dr. Werner Gitt proposes five conceptually different levels of information:

 * Statistics (the signal and symbol level)
 * Syntax (the code and its rules)
 * Semantics (the meaning)
 * Pragmatics (the intended and performed action)
 * Apobetics (the purpose and result)

According to Dr. Gitt, there is no known law through which matter can give rise to information. In his article on the scientific laws of information, published in the Journal of Creation, Dr. Gitt states that information is not a property of matter; it is a non-material entity, so its origin is likewise not explicable by material processes. Dr. Gitt also points out that the most important prerequisite for the production of information is the sender's own will, so that information can arise only through will, encompassing intention and purpose. Gitt further argues that since information is neither matter (although it can be carried on matter) nor energy, it constitutes a third fundamental quantity of the universe.

Biology
It is generally accepted that the meaning of information given by Claude Shannon in his theory of mathematical information is relevant and legitimate in many areas of biology, but in recent decades, and even before, many biologists have applied informational concepts in a broader sense. They see the most basic processes characteristic of living organisms as being understood in terms of the expression of information, the execution of programs, and the interpretation of codes. John von Neumann stated that the genes themselves are clearly parts of a digital system of components. Many biologists, especially materialists, see this trend as having foundational problems.

Either way, many scientists in various fields consider living organisms to carry biological information. Gregory Chaitin, a renowned Argentine-American mathematician and computer scientist, sees DNA as a computer program for calculating the organism, and the relationship between male and female as a means of transmitting biological information from one to the other. David Baltimore, an American biologist and Nobel laureate, stated that "Modern Biology is a science of information". Edmund Jack Ambrose, quoted by Davis and Kenyon, said that "There is a message if the order of bases in DNA can be translated by the cell into some vital activity necessary for survival or reproduction". Richard Dawkins, a British ethologist and evolutionary biologist, has written that life itself is the flow of a river of DNA, which he also calls a river of information. Stephen Meyer points out that producing organismal form requires the generation of information in Shannon's sense, but he goes further to observe that "like meaningful sentences or lines of computer code, genes and proteins are also specified with respect to function." Meyer points out that the information contained in DNA has a high degree of specificity. David Berlinski, an American philosopher, educator, and author, also draws a parallel between biology and information theory. In his book "The Deniable Darwin & Other Essays" he stated that:

Whatever else a living creature may be...[it] is also a combinatorial system, its organization controlled by a strange, a hidden and obscure text, one written in a biochemical code. It is an algorithm that lies at the humming heart of life, ferrying information from one set of symbols (the nucleic acids) to another (the proteins)

The intelligent design concept that DNA exhibited specified complexity was developed by mathematician and philosopher William Dembski. Dembski claims that when something exhibits specified complexity (i.e., is both complex and specified, simultaneously) one can infer that it was produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes (see naturalism). He provides the following examples: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified." He states that details of living things can be similarly characterized, especially the "patterns" of molecular sequences in functional biological molecules such as DNA.

Dembski defines a probability of 1 in 10^150 as the "universal probability bound". Its value corresponds to the inverse of the upper limit of "the total number of possible specified events throughout cosmic history," as calculated by Dembski. He defines complex specified information (CSI) as specified information with a probability less than this limit. (The terms "specified complexity" and "complex specified information" are used interchangeably.) He argues that CSI cannot be generated by the only known natural mechanisms of physical law and chance, or by their combination: laws can only shift around or lose information, but do not produce it, and chance can produce complex unspecified information, or non-complex specified information, but not CSI; he provides a mathematical analysis that, he asserts, demonstrates that law and chance working together cannot generate CSI either. Dembski and other proponents of ID contend that CSI is best explained by an intelligent cause and is therefore a reliable indicator of design.
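As an arithmetic illustration of the bound (this calculation is our own sketch, not taken from Dembski's text), one can ask how many independently specified fair-coin bits are needed before the probability of one particular bit pattern drops below 1 in 10^150; the answer, roughly 500 bits, is the figure commonly associated with the CSI threshold.

```python
import math

# Dembski's universal probability bound: 1 in 10^150.
UNIVERSAL_PROBABILITY_BOUND = 1e-150

def pattern_probability(n_bits):
    """Probability of one specific n-bit pattern from fair coin flips."""
    return 2.0 ** -n_bits

# Solve 2^-n < 10^-150 for the smallest integer n:
# n > 150 / log10(2) ~= 498.3, i.e. roughly 500 bits.
threshold_bits = math.ceil(150 / math.log10(2))
print(threshold_bits)  # 499
print(pattern_probability(500) < UNIVERSAL_PROBABILITY_BOUND)  # True
```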

Life is still as special as it ever was, because it resides not in the chemical media but in the information stored on the chemical media. -Edgar Andrews

Quantifying information
David Salomon states: "Information seems to be one of those entities that cannot be precisely defined, cannot be quantified, and cannot be dealt with rigorously". Salomon went on to say, however, that in the field of information theory, information can be treated quantitatively.

Shannon entropy
In A Mathematical Theory of Communication, Shannon endows the term information not only with a technical meaning but also with a measure. Shannon proposed a quantitative measure of information and defined a quantity called self-information. The self-information, denoted by i, associated with an event A is given by:


 * i(A) = -log_b P(A)

where P(A) is the probability that the event A will occur and b is the chosen base of the logarithm. If the unit of information is bits, we use b = 2; other bases give other units.
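The self-information formula can be evaluated directly. The following minimal Python sketch (the function name and example probabilities are our own illustration) computes i(A) for a few events:

```python
import math

def self_information(p, base=2):
    """i(A) = -log_b P(A): the information conveyed by an event of probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log(p, base)

# A fair coin flip (p = 1/2) conveys exactly 1 bit.
print(self_information(0.5))
# A rarer event (p = 1/8) conveys more: 3 bits.
print(self_information(0.125))
# A certain event (p = 1) conveys no information at all.
print(abs(self_information(1.0)))  # 0.0
```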

The measurement of information, in mathematical terms, has to consider the number of signals, their probability, and combinatorial restrictions. The rarer a signal is, the more information it transmits; the more frequent a signal is, the less information it transmits. It is worth noting that while we can quantify the information of a given symbol from its probability, there is no absolute number for the information content of a given message.

Chaitin-Kolmogorov theory
Another way of measuring information content is the Kolmogorov complexity (also known as Chaitin information). The Kolmogorov complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (such as a universal Turing machine). Let x be a binary string and let d(x) be the shortest string, consisting of a Turing machine M and an input i, such that M halts on i leaving the string x on the tape. The Kolmogorov complexity K(x) is:


 * K(x) = |d(x)|

that is, the Kolmogorov complexity is the length of the minimal description of x. The complexity can be viewed as a measure of the "patternlessness" of the sequence, and can be equated with the idea of randomness. The length of the shortest description will depend on the choice of description language. By way of illustration, we compare two strings:

"CREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKI"

and

"7W7JAHAGLKJHGBNMZCXVSFQP92725FFADSHALKJNMZAQWSXPLÇKJHGTRFOUMSVAXZXCTEÇALSKDJFHGBEOQI"

Both strings have the same number of letters, but the former can be represented in a more compact way: "7 x 'CREATIONWIKI'". Another way to represent the first sequence is as a short program in a Pascal-like language:

Such a program contains 34 ASCII characters (counting the blanks and the new line) plus 1 character for the parameter (the value of the variable m, here 7). For other values of m the program length in characters will be 34 + log m. One way to measure the randomness of the former sequence is to form the ratio of program length to string length. This yields a randomness measure of:


 * r ≤ (34 + log m)/(12m)

As in the previous case (Shannon), the Kolmogorov complexity cannot measure the meaning of information. What it measures, in effect, is the compressibility of a given sequence.
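Kolmogorov complexity is not computable in general, but the length of a compressed encoding provides a practical upper bound and illustrates the idea. The Python sketch below (using zlib purely as a stand-in for a "shortest description") compresses the two example strings above:

```python
import zlib

repetitive = "CREATIONWIKI" * 7
scrambled = "7W7JAHAGLKJHGBNMZCXVSFQP92725FFADSHALKJNMZAQWSXPLÇKJHGTRFOUMSVAXZXCTEÇALSKDJFHGBEOQI"

def compressed_length(s):
    """Length in bytes of the zlib-compressed string: a rough,
    computable proxy for an upper bound on Kolmogorov complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# The patterned string compresses to well below its own length;
# the patternless one does not.
print(compressed_length(repetitive) < len(repetitive))  # True
print(compressed_length(repetitive) < compressed_length(scrambled))  # True
```

This mirrors the comparison in the text: the first string has a short description ("7 copies of CREATIONWIKI"), while the second effectively has none shorter than itself.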

Spontaneous appearance
Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith and many other biologists have stated that the origin of information is biology's central problem. Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible. In his book Steps Towards Life, Eigen states what he regards as the central problem faced in origins-of-life research: "Our task is to find an algorithm, a natural law that leads to the origin of information". A. E. Wilder-Smith, in contrast, states that

If information, like entropy, were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.

Wilder-Smith establishes a distinction between actual and potential information. The former can never be synthesized by stochastic processes, while the latter might be. He draws a comparison between actual information and negentropy, and, on the other side, a correspondence between potential information and entropy. Wilder-Smith proposes a simple example that clarifies the distinction. The potential to make pictures out of a large number of randomly distributed dots is infinite, yet a particular set of randomly distributed dots will not, in reality, show an image that looks like something (e.g., a bicycle). Randomly distributed points possess the capacity for endless amounts of information, but communicate none by themselves; there is, indeed, no actual information.

Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.

In his book A Case Against Accident and Self-Organization, Dean L. Overman builds a compelling case that life is no accident. It is not possible to reproduce the entire argument of the book here. Overman argues that a central distinction between living and non-living matter is the existence of a genome, a composite of genetic messages carrying enough information content to replicate and maintain the organism. The information contained in the genetic code, like any information or message, is not made of matter. The meaning of the genetic code cannot be reduced to a physical or chemical property. Information content is the minimum number of instructions necessary to specify the structure, and in living systems the information content requires an enormous number of specified instructions. According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.

Evolutionist Michael Denton wrote the controversial book "Evolution: A Theory in Crisis". In his book, writing about the origin of life, Denton states:

The failure to give a plausible evolutionary explanation for the origin of life casts a number of shadows over the whole field of evolutionary speculation.

Due to the enormous odds against abiogenesis on earth, some scientists have turned to the panspermia hypothesis, the belief that life originated off this planet. Among these scientists are Francis Crick, Fred Hoyle, Svante Arrhenius, Leslie Orgel and Thomas Gold.

Problem for evolution
According to Jonathan Sarfati, the main scientific objection to evolution is not whether changes, whatever their extent, occur through time. The key issue is the origin of the enormous amount of genetic information that is needed in order for a microbe to evolve, ultimately reaching the complexity of humans. Dr. Lee Spetner points out that in living organisms adaptation often takes place by reducing the information in the genome, and notes that the vertebrate eye or the immune system could never have evolved by loss of information alone.

Biological information is prescriptive: genetic information defines biological processes and the systems of which they are part. Likewise, minds are prescriptive and impart meaning to matter, not the other way around. Furthermore, mental processes and their abstract products cannot be physically measured and are therefore not composed of, or a property of, matter. Because prescriptive information is always traceable to an intelligent source, it follows that it is a non-physical product of a non-physical mind. This is illustrated by the observation that information is not bound to whatever medium it is encoded on, since information can be shared without translocating its material medium. If information were composed of, or a property of, matter, it would not be possible to convey information without translocating the material medium upon which it is encoded. For example, it is possible to share the information encoded in a book without moving any of the matter that comprises the book: a book can be read aloud and its information shared with a listener, yet none of the matter of the book is transferred to the listener. This analogy illustrates the argument that the information which defines living systems is a product of intelligence, and not a product of material processes.

Since evolutionary theory is based upon the material processes of chemistry (as with abiogenesis) and biochemistry (as with evolution itself), evolution cannot account for the information which defines living systems, and we have profound evidence that a mind of astonishing intelligence is the origin of the prescriptive information that defines living systems and the nanotechnology of the cell.
