



Argument in progress

Information is any Platonic antonymic encoding/decoding mechanism that represents something other than itself. God, as ultimate reality outside the Gödelian wall, ratiocinates his antonymity from his domain of perfection to the range of what we experience as matter in this cursed universe. God cursed his own creation with the fall and sin of Adam 6000 years ago.

Notes

http://laboratoriogene.info/Ciencia_Hoje/Popper1978.pdf

https://groups.google.com/group/talk.origins/browse_frm/thread/dd675f7240b8889d#

In terms of Platonic duality (pattern or design), *evolution* can only be meant in the spontaneous-generation or the gradualism sense. We only understand concepts as contrasts to other concepts, as light is the contrast to darkness. By rejecting Platonic duality we are left with meaningless sentences. Spontaneous generation itself is divided into either a miracle by God or a miracle by nature.

Darwin used evolution in the gradualism sense, as opposed to the spontaneous-generation (Aristotle) sense. What evolutionists today are trying to do is to move away from gradualism to its contrast, spontaneous generation, while using the same TERM.

@..http://biologos.org/blog/evolution-and-origin-of-biological-information-part-1-intelligent-design When I reviewed Signature for the American Scientific Affiliation journal Perspectives on Science and Christian Faith (PSCF) what struck me, repeatedly, was that Meyer made no mention of the evidence for natural selection as a mechanism to increase biological information...@

Information can never increase or decrease because, from the YEC premise, it has no physical location; it can only be expressed. By analogy, matter can never be created or destroyed but only expressed in different formats. Information was itself never created; it existed before the beginning. We are OOP-derived instances of information, made in the image of Information (God).
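The OOP analogy above can be sketched in code; a minimal illustration in Python, where the class and method names are hypothetical and chosen only to mirror the class/instance relationship the paragraph appeals to:

```python
# Illustrative sketch of the class/instance analogy: an instance
# derives its structure from a class that exists independently of
# any particular instance of it.

class Information:
    """The template; the analogy treats this as the original."""
    def describe(self):
        return "an expression of " + type(self).__name__

class Instance(Information):
    """A derived expression of the template, not the template itself."""
    pass

obj = Instance()
print(obj.describe())                # an expression of Instance
print(isinstance(obj, Information))  # True
```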

Information isn't ever created but only expressed, as the Platonic opposite of adaptation. Communication is not the same thing as information, because communication has a physical dimension, as Shannon explained in his paper on communication. In terms of Platonic duality, entropy is another term for uncertainty, and uncertainty is understood as the contrast to certainty. There is no third alternative to the certainty/uncertainty duality. http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

God as the incarnation of Information itself is not subject to uncertainty (entropy). Reproducing a signal at another end over a medium is subject to uncertainty (entropy).

Shannon

http://en.wikipedia.org/wiki/Entropy_(information_theory)

Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

Notes: A heading in the paper states "CHOICE, UNCERTAINTY AND ENTROPY". That last conjunction should be "or entropy", not "and": in the context used, entropy is a synonym for uncertainty.
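The entropy-as-uncertainty reading can be checked numerically; a minimal sketch in Python computing Shannon's formula, H = sum of -p * log2(p) over the outcome probabilities:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = sum of -p * log2(p)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))       # 0.0 -- a certain outcome, no uncertainty
print(shannon_entropy([0.5, 0.5]))  # 1.0 -- a fair coin, maximal uncertainty
print(round(shannon_entropy([0.9, 0.1]), 3))  # 0.469 -- a biased coin, less uncertain
```

A distribution with a certain outcome has zero entropy, and entropy is largest when the outcome is most uncertain, which is exactly the certainty/uncertainty contrast described above.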


Shannon's theory of communication, not information

The title of Shannon's paper is "A Mathematical Theory of Communication", and specifically not of information (intent, volition, will, meaning or pragmatics, which has no physical location). Shannon used information in the body of the paper as a dissimilar term for measuring the reproduction of a difference in voltage levels between two points over a certain medium. "Digital" in one context refers to this difference in voltage levels. The difference in voltage levels per se is like the difference in the orientation of leaves: it has nothing to do with information. It is only when voltage or any other physical entity is used to represent something other than itself that we have information.
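The representation point above can be sketched: a bit pattern (standing in here for a pair of voltage levels) stands for a letter only by convention, and decodes back to the letter only when that convention is applied. A minimal illustration in Python using the standard ASCII mapping:

```python
# Bits stand in for voltage levels; they represent a letter only
# by convention (here, 8-bit ASCII), not by their physical nature.

def encode(text):
    """Map each character to its 8-bit ASCII pattern."""
    return [format(ord(ch), '08b') for ch in text]

def decode(bits):
    """Apply the same convention in reverse."""
    return ''.join(chr(int(b, 2)) for b in bits)

bits = encode("Hi")
print(bits)          # ['01001000', '01101001']
print(decode(bits))  # Hi
```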

Shannon (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf) wrote

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.

Shannon used "entropy" as a metaphor for uncertainty.

In thermodynamics an increase in "entropy" is a fancy way of saying that a concentration of heat at one end became uniform over the medium; this in itself has nothing to do with information.

It is only when such a phenomenon is used to represent something other than itself that encoding/decoding takes place. All electronic communication involves a concentration of electrons at one end becoming uniform over the medium (an increase in entropy), with the amplifier on the receiving end detecting a sampling of those electrons. This is communication, not information, in and of itself. When a metal rod is heated at one end, the concentration of heat disperses over the entire rod, meaning the entropy increased.
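The heated-rod example can be simulated; a minimal sketch in Python (the rod length, step count and diffusion coefficient are arbitrary choices for illustration), using a discrete diffusion step and measuring the Shannon entropy of the normalized temperature profile as the concentration spreads out:

```python
import math

def diffuse(temps, alpha=0.25):
    """One explicit diffusion step on a 1-D rod with insulated ends."""
    padded = [temps[0]] + temps + [temps[-1]]
    return [t + alpha * (padded[i] - 2 * t + padded[i + 2])
            for i, t in enumerate(temps)]

def entropy(temps):
    """Shannon entropy (bits) of the normalized temperature profile."""
    total = sum(temps)
    return sum(-(t / total) * math.log2(t / total)
               for t in temps if t > 0)

rod = [100.0] + [0.0] * 9      # heat concentrated at one end
print(round(entropy(rod), 2))  # 0.0 -- a pure concentration
for _ in range(200):
    rod = diffuse(rod)
print(round(entropy(rod), 2))  # near the uniform maximum, log2(10) ~ 3.32
```

The total heat is conserved while the profile flattens, and the entropy of the profile rises toward its maximum at uniformity, which matches the description of a concentration becoming uniform over the medium.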

The tendency of a concentration of energy to become uniform (increase entropy) applies to both open and closed systems, and thus the laws of thermodynamics apply in both open and closed systems. (This paragraph is pending; it needs more references.)


From this it should be clear that "entropy" is a weasel word that is misused when a person doesn't wish to be clear or is trying to confuse the issues of communication, information and thermodynamics.

When humans talk to one another by communicating1, there is information1 exchange between signal sender and signal receiver. Communication1 in this context is the synonym for information1 (will, intent, meaning), which has nothing to do with voltage levels. A difference in voltage levels is not information1 because it is measured at a physical point; voltage levels don't have intent. Information as intention is an abstraction that is symbolically represented by something other than itself (voltage, letters, dictionaries, etc.). Palindromic genetic sequences were discovered in the genome; the concept of a palindrome has no physical location, it can be represented symbolically with nucleotides, copper or any other physical substance, and it always implies will, volition, intent or pragmatics. These genetic codes, which function as an Irreducible Functionality grammar, don't constitute the essence of Life1 itself; they are only an interface, much like Maxwell's equations don't constitute the essence of magnetism itself.
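The palindromic sequences mentioned above are defined by reverse complementation: a genetic palindrome equals the reverse complement of itself. A minimal check in Python (GAATTC is the well-known EcoRI restriction-enzyme recognition site):

```python
# A genetic palindrome reads the same as the reverse complement of
# its partner strand, e.g. GAATTC pairs with CTTAAG on the other strand.

COMPLEMENT = {'A': 'T', 'T': 'A', 'G': 'C', 'C': 'G'}

def reverse_complement(seq):
    return ''.join(COMPLEMENT[base] for base in reversed(seq))

def is_palindromic(seq):
    """True if the sequence equals its own reverse complement."""
    return seq == reverse_complement(seq)

print(is_palindromic("GAATTC"))  # True
print(is_palindromic("GATTAC"))  # False
```

The check itself is purely symbolic: the same palindrome property could be evaluated over any alphabet with a pairing rule, independent of the physical substrate.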

In a certain sense we don't know the essence of meaning (information, will), just as we don't know the essence of gravity and magnetism. We describe the effects of magnetism with Maxwell's equations, but have no idea what magnetism or gravity is, as per the article "The decline of the philosophical spirit" in Wireless World. When a person writes down a series of sentences about himself, they function as an interface to the essence of the person. In the same way genes as a grammar are the interface to Life1 himself, the Lord Jesus Christ, who maintains everything by the power of his Language alone. Hence mutations in genes, which are copying errors, can't explain the essence of Life1. Copying errors were introduced due to the fall of Adam when he sinned 6000 years ago.

Define

Information is an abstraction that is symbolically represented by something other than itself (voltage, letters, dictionaries, etc.); it has no physical location.



David Chalmers

David Chalmers, professor of philosophy and Director of the Center for Consciousness Studies, University of Arizona.


Links

Dissimilar

http://www.uncommondescent.com/intelligent-design/2nd-law-of-thermodynamics-an-agument-creationists-and-id-proponents-should-not-use/

@ ...Ulam and John von Neumann through to John Conway's Game of Life and the extensive work of Stephen Wolfram, made it clear that complexity could be generated as an emergent feature of extended systems with simple local interactions. Over a similar period of time, Benoît Mandelbrot's large body of work on fractals showed that much complexity in nature could be described by certain ubiquitous mathematical laws, while the extensive study of phase transitions carried out in the 1960s and 1970s showed how scale-invariant phenomena such as fractals and power laws emerged at the critical point between phases.@
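The Game of Life referenced in the excerpt is a concrete example of complex behavior from simple local rules; a minimal one-generation step in Python, using the standard rules (a dead cell with exactly 3 live neighbours is born, a live cell with 2 or 3 survives):

```python
from collections import Counter

def step(cells):
    """One Game of Life generation; cells is a set of live (x, y) positions."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {pos for pos, n in counts.items()
            if n == 3 or (n == 2 and pos in cells)}

blinker = {(0, 1), (1, 1), (2, 1)}     # three live cells in a row
print(step(blinker))                   # flips to a vertical row of three
print(step(step(blinker)) == blinker)  # True -- a period-2 oscillator
```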

Note Dembski

http://www.wordtrade.com/science/cosmology/dembski.htm

@ .....Certainty and uncertainty in information theory are analogous to order and disorder in physics....@

Information or communication? They are not the same thing. Communication has a physical dimension, while information has no physical dimension.

Laws of thermodynamics

--


Andy McIntosh

http://www.metacafe.com/watch/4739025/information_what_is_it_really_professor_andy_mcintosh/

Information is represented with matter and energy; it isn't matter and energy.
