
Information

The simple definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver.

By information we mean a quantity that can be understood mathematically and physically. It corresponds to the common-sense meaning of information, in the sense of communicating or informing. It is like the information stored in books and computers. But it also measures the information in any physical object, like a snow crystal or a star like our sun, as well as the information in biological systems, including the genetic code, the cell structure, the messaging inside and between the cells, and the developmental learning of the phenotype.

Although some commentators would like to limit the term "information" to messages sent with an intended purpose, physical scientists have long considered the structure in physical objects as something that can be quantitatively measured by an observer. In our digital "information age," those measurements are reported in digital "bits."

Now information is neither matter nor energy, though it needs matter to be embodied, and energy to be communicated.

Information philosophy considers a material object as an "information structure," from which the immaterial information can be abstracted as meaningful knowledge. In addition to structure, much of the information in living things consists of messages that are sent to invoke processes running in biological machines. Biological structures are often digital (DNA, RNA, proteins) and biological messaging greatly resembles a language, with arbitrary symbols, much like human language.

It was the dream of great logicians, from Gottfried Leibniz and Gottlob Frege to Bertrand Russell and Ludwig Wittgenstein, to represent the physical world by "logical atoms." The later Wittgenstein and twentieth-century language philosophers thought that "what there is" could be described with analytically true statements or "propositions." For Wittgenstein, every true proposition corresponds to a real fact in the world. These programs failed, but information philosophy builds on their dreams.

Information philosophy identifies the total information in a material object with the yes/no answers to all the questions that can be asked or with the true/false statements that can be said about the object. In modern information theory terms, information philosophy "digitizes" the object. From each answer or truth value we can, in principle, derive a "bit" of information.
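
As a minimal sketch of this "digitizing" idea (the object, the questions, and the answers here are all invented for illustration):

```python
# Toy illustration: an object "digitized" as yes/no answers,
# one bit per answer. Questions and answers are hypothetical.
questions = {
    "Is it solid?": True,
    "Is it crystalline?": True,
    "Is it alive?": False,
    "Does it conduct electricity?": False,
}

bits = [1 if answer else 0 for answer in questions.values()]
print(bits)               # [1, 1, 0, 0]
print(len(bits), "bits")  # 4 bits of (very partial) information
```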

While "total" information is hopelessly impractical to measure


precisely, whatever information we can "abstract" from a
"concrete" object gives us a remarkably simple answer to one of
the deepest problems in metaphysics, the existential status of
ideas, of Platonic "Forms," including the entities of logic and
mathematics.

Rather than simply ask "Do abstract entities like numbers and properties exist?", a metaphysicist prefers to ask in what way they might exist that is different from the way in which "concrete" objects exist.

Concrete objects can be seen and touched by our senses. They are
purely material, with causal relations that obey the physical,
though statistical, laws of nature.
Abstract entities are immaterial, but some of them can still play a
causal role, for example when agents use them to decide on their
actions, or when chance events (particularly at the quantum level)
go this way instead of that.

Information philosophy restores so-called "non-existent objects" to our ontology. They consist of the same kind of abstract information that provides the structure and process information of a concrete object. What we call a "concept" about an object is some subset of the information in the object, accurate to the extent that the concept is isomorphic to that subset. By "picking out" different subsets, we can categorize and sort objects into classes or sets according to different concepts.

Information philosophy hopes to settle some deep philosophical issues about absolute and relative identity, first posed by Leibniz. All material objects are self-identical, despite concerns about vague boundaries. All objects have relations with other objects that can be interpreted as relative identities. All objects are relatively identical to other objects in some respects and different in other respects.

Two numerically distinct objects can be perfectly identical (x = x) internally, if their intrinsic information content is identical. Relational (extrinsic) information with other objects and positions in space and time is ignored. The Greeks called intrinsic information pros heauto or idios poion. Aristotle and the Stoics called this the peculiar qualities of an individual.

They distinguished peculiar properties from the material substrate, which they called hupokeimenon, the "underlying." Extrinsic information is found in an object's relations with other objects and space and time. The Greek terms were pros ta alla, toward others, and pros ti pos echon, relatively disposed.

Just as the mind is like software in the brain hardware, the abstract
information in a material object is the same kind of immaterial
stuff as the information in an abstract entity, a concept or a "non-
existent object." Some philosophers say that such immaterial
things "subsist," rather than exist.

Broadly speaking, the distinction between concrete and abstract objects corresponds to the distinction between the material and the ideal. Ideas in minds are immaterial. They need the matter of the brain to be embodied and some kind of energy to be communicated to other minds. But they are not themselves matter or energy. Those "eliminativists" who believe the natural world contains only material things deny the existence of ideas and immaterial information.

Bits of information are close to the logical atoms of Russell and Wittgenstein.

And information philosophy is a "correspondence theory." The information we can actually measure in an information structure is a subset, a partial isomorphism, of the total information in the structure.

In 1929, Leo Szilard calculated the mean value of the quantity of entropy produced by a 1-bit ("yes/no") measurement as

S = k log 2,

where k is Boltzmann's constant.
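
As a quick numeric check (a sketch only; "log" here is the natural logarithm, as is usual in statistical mechanics):

```python
import math

k = 1.380649e-23     # Boltzmann's constant in joules per kelvin (exact SI value)
S = k * math.log(2)  # entropy produced by one yes/no measurement
print(f"S = k ln 2 = {S:.3e} J/K")  # about 9.57e-24 J/K
```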

Following Szilard, Ludwig von Bertalanffy, Erwin Schrödinger, Norbert Wiener, Claude Shannon, Warren Weaver, John von Neumann, Leon Brillouin, C.F. von Weizsäcker, and of course John Wheeler, with his "It from Bit," all had similar views of the connection between the physical entropy of matter and the abstract "bits" of information that can be used to describe the physical arrangement of discrete elementary particles.

For Schrödinger, a living organism is "feeding on negative entropy" from the sun. Wiener said, "The quantity we define as amount of information is the negative of the quantity usually defined as entropy in similar situations." Brillouin created the term "negentropy" because, as he said, "One of the most interesting parts in Wiener's Cybernetics is the discussion on 'Time series, information, and communication,' in which he specifies that a certain 'amount of information is the negative of the quantity usually defined as entropy in similar situations.'"

Shannon, with a nudge from von Neumann, used the term entropy
to describe his estimate of the amount of information that can be
communicated over a channel, because his mathematical theory of
the communication of information produced a mathematical
formula identical to Boltzmann's equation for entropy, except for a
minus sign (the negative in negative entropy).

Boltzmann entropy: S = k ∑ p_i ln p_i.

Shannon information: I = - ∑ p_i ln p_i.
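
A minimal sketch of Shannon's formula as quoted above (natural logarithms, so the result is in "nats"; dividing by ln 2 would convert to bits; the function name is my own):

```python
import math

def shannon_information(probabilities):
    """I = -sum(p_i ln p_i); zero-probability outcomes contribute nothing."""
    return sum(-p * math.log(p) for p in probabilities if p > 0)

print(shannon_information([0.5, 0.5]))  # ln 2 ≈ 0.693 nats (one bit)
print(shannon_information([0.25] * 4))  # ln 4 ≈ 1.386 nats (two bits)
print(shannon_information([1.0]))       # 0.0: a certain outcome carries none
```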

Entropy is energy divided by temperature (joules per kelvin, J/K) and information is measured in dimensionless bits. Entropy is a physical property of a material object. Information is an immaterial property of many things, material and ideal.

Shannon's communications theory brings us back to information as that found in a message between a sender and a receiver. He showed that a message that is certain to tell you something you already know contains no new information.
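
Shannon's point can be made concrete with the standard "surprisal" of a single message: a message of probability p carries -log2(p) bits (the function name is my own):

```python
import math

def surprisal_bits(p):
    """Information carried by a message of probability p, in bits."""
    return 0.0 - math.log2(p)  # written this way so p = 1 prints 0.0, not -0.0

print(surprisal_bits(1.0))   # 0.0 bits: a certain message tells you nothing new
print(surprisal_bits(0.5))   # 1.0 bit: like a fair coin flip
print(surprisal_bits(0.01))  # ≈ 6.64 bits: improbable messages carry more
```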

If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event could in principle be known (as Pierre-Simon Laplace suggested) by a super-intelligence with access to such a fixed totality of information.

It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological chance and probabilistic outcomes of modern quantum physics to produce new information.

But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog.

Creation of information structures means that in parts of the universe the local entropy is actually going down. Creation of a low-entropy system is always accompanied by radiation of energy and entropy away from the local structure to distant parts of the universe, to the night sky and the cosmic background.

From Newton’s time to the start of the 19th century, the Laplacian
view coincided with the notion of the divine foreknowledge of an
omniscient God. On this view, complete, perfect and constant
information exists at all times that describes the designed
evolution of the universe and of the creatures inhabiting the world.

In this God's-eye view, information is a constant of nature. Some mathematicians argue that information must be a conserved quantity, like matter and energy. They are wrong. In Laplace's view, information would be a constant straight line over all time, as shown in the figure.

If information were a universal constant, there would be "nothing new under the sun." Every past and future event could in principle be known by Laplace's super-intelligent demon, with its access to such a fixed totality of information.

Midway through the 19th century, Lord Kelvin (William Thomson) realized that the newly discovered second law of thermodynamics required that information could not be constant, but would be destroyed as the entropy (disorder) increased. Hermann Helmholtz described this as the "heat death" of the universe.

Mathematicians who are convinced that information is always conserved argue that macroscopic order is disappearing into microscopic order, but the information could in principle be recovered, if time could only be reversed.

This raises the possibility of some connection between the increasing entropy and what Arthur Stanley Eddington called "Time's Arrow."

Kelvin's claim that information must be destroyed when entropy increases would be correct if the universe were a closed system. But in our open and expanding universe, my Harvard colleague David Layzer showed that the maximum possible entropy is increasing faster than the actual entropy. The difference between the maximum possible entropy and the current entropy is called negative entropy, opening the possibility for complex and stable information structures to develop.
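
As a toy numerical illustration only (the growth rates below are invented, not Layzer's), the widening gap between maximum possible entropy and actual entropy leaves room for structure:

```python
# Hypothetical numbers: maximum possible entropy grows faster than
# actual entropy, so their difference (negative entropy) grows too.
for t in range(6):
    s_max = 100 * t    # assumed maximum possible entropy at time t
    s_actual = 60 * t  # assumed actual entropy, growing more slowly
    print(f"t={t}: S_max={s_max:3d}  S_actual={s_actual:3d}  "
          f"negative entropy={s_max - s_actual:3d}")
```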

We can see from the figure that it is not only entropy that increases in the direction of the arrow of time, but also the information content of the universe. We can describe the new information as "emerging."

Despite the second law of thermodynamics, stable and lawlike information structures evolved out of the initial chaos. First, quantum processes formed microscopic particulate matter – quarks, baryons, nuclei, and electrons. Eventually these became atoms. Later, under the influence of gravitation, macroscopic galaxies, stars, and planets formed.

Every new information structure reduces the entropy locally, so the second law requires an equal (or generally much greater) amount of entropy to be carried away. Without the expansion of the universe, this would be impossible.

The positive entropy carried away (the big dark arrow on the left) is always greater than, and generally orders of magnitude larger than, the negative entropy in the created information structure (the smaller light arrow on the right).

See the cosmic creation process for the negative entropy flows that
lead to human life.

Information is emergent, because the universe began in a state of minimal information (thermodynamic equilibrium, maximum disorder - or "entropy").

And there are three distinct kinds of information emergence:

1. the "order out of chaos" when the matter in the universe forms information structures (this is Prigogine's chaos and complexity theory)
2. the "order out of order" when the material information structures form self-replicating biological information structures (this is Schrödinger's definition of life as "feeding on negative entropy")
3. the pure "information out of order" when organisms with minds process and externalize information, communicating it to other minds and storing it in the environment (this is our information theory of mind)

Information philosophy explains how new information is constantly being created, by nature and by humanity. We are co-creators of our universe.

Information theory is the mathematical quantification of communication to describe how information is transmitted and received, in human language, for example.

Information science is the study of the categorization, classification, manipulation, storage, and retrieval of information.

Cognitive science is the study of the mental acquisition, retention, and utilization of knowledge, which we can describe as actionable information.

Information philosophy is an attempt to examine some classic problems in philosophy from the standpoint of information.

What is it about information that merits its use as the foundation of a new philosophical method of inquiry?

Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is immaterial. It is the modern spirit, the ghost in the machine.

Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.

Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.

And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.

Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.

Information is an excellent basis for philosophy, and for science as well, capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), idealism (pure information), the mind-body problem, the problem of free will, and the "hard" problem of consciousness.

In our information philosophy, knowledge is the sum of all the information created and preserved by humanity. It is all the information in human minds and in artifacts of every kind - from books and internetworked computers to our dwellings and managed environment.

We shall see that all information in the universe is created by a single two-part process, the only one capable of generating and maintaining information in spite of the dread second law of thermodynamics, which describes the irresistible increase in disorder or entropy. We call this anti-entropic process ergodic. It should be appreciated as the creative source of everything we can possibly value, and of everything distinguishable from chaos and therefore interesting.

Enabled by the general relativistic expansion of the universe, the cosmic creative process has formed the macrocosmos of galaxies, stars, and planets. It has also generated the particular forms of microscopic matter - atoms, molecules, and the complex macromolecules that support biological organisms. It includes all quantum cooperative phenomena.

Quantum phenomena control the evolution of life and human knowledge. They help bring new information into the universe in a fundamentally unpredictable way. They drive biological speciation. They facilitate human creativity and free will.

Although information philosophy looks at the universe, life, and intelligence through the single lens of information, it is far from mechanical or reducible to a deterministic physics. The growth of information over time - our principle of increasing information - is the essential reason why time matters and individuals are distinguishable.

Information is the principal reason that biology is not reducible to chemistry and physics. Increasing information (a combination of perfect replication with occasional copying errors) explains all emergent phenomena, including many "laws of nature."

In information philosophy, the future is unpredictable for two basic reasons. First, quantum mechanics shows that some events are not predictable. The world is causal, but not pre-determined. Second, the early universe does not contain the information of later times, just as early primates do not contain the information structures for intelligence and verbal communication, and infants do not contain the knowledge and remembered experience they will have as adults.

In the naive world of Laplace's demon and strict determinism, all the information in the universe is constant at all times. But "determinism" itself is an emergent idea, realized only when large numbers of particles assemble into bodies that can average over the irreducible microscopic indeterminacy of their component atoms.

Information and Entropy

In our open and expanding universe, the maximum possible entropy is increasing faster than the actual entropy. The difference between the maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and information.

To give this very positive quantity of "negative" entropy a positive name, we call it "Ergo" and describe processes capable of generating negative entropy as "ergodic."

Ergodic processes provide room to increase the information structures in the universe. As pointed out by David Layzer, the Arrow of Time points not only to increasing disorder but also to increasing information.

The increase of biological information comes primarily from the perfect replication of preexisting information, but it is critically important that replication errors occur from time to time. They are the source of new species and of creative new ideas.
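
A toy simulation of such nearly perfect replication (the alphabet, error rate, and sequence are all invented for illustration):

```python
import random

random.seed(0)
ALPHABET = "ACGT"  # a DNA-like four-symbol code (illustrative)
ERROR_RATE = 0.01  # hypothetical per-symbol copying-error probability

def replicate(sequence):
    """Copy a sequence, occasionally substituting a random symbol."""
    return "".join(
        random.choice(ALPHABET) if random.random() < ERROR_RATE else symbol
        for symbol in sequence
    )

parent = "ACGTACGTACGTACGT"
child = replicate(parent)
print(parent)
print(child)  # usually identical; a rare mutation is a source of novelty
```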

The universe is creative. Information structures and processes are emergent. Some laws of nature are emergent. Adequately deterministic phenomena are emergent. The very idea of determinism is emergent. Knowledge of the present did not all exist in the past. We have only a rough idea of the exact future. The creative process continues. Life and humanity are a part of the process. What gets created is in part our responsibility. We can choose to help create and preserve information. Or we can choose to destroy it.

We are free to create our own future.

Is Everything Information?

Some recent scientists, especially mathematical physicists, think that the fundamental essence of the universe is information. Like the earliest monists who say All is One, theists who say everything is simply thoughts in the mind of God, or panpsychists for whom our minds are part of a single cosmic consciousness, these arguments that explain everything as one thing really explain nothing.

Explanations need details about a large number of particular things for us to generalize and think we know something about all things.

Some specific physicists who have looked to information as the basis for physics include John Wheeler ("it from bit"), Seth Lloyd ("the universe is a computer"), Vlatko Vedral ("the universe is quantum information"), and Erik Verlinde ("matter is made of bits of information").
