Science

Studying Intelligence Thru Entropy?

An Anonymous Coward asks: "Given that entropy is the measure of order or disorder. Given that any force that changes the entropy of any system in a predictable way is an 'intelligent' force. Is it true that the study of HOW entropy changes in any given system is the study of intelligence itself, in that given system? Is it true that producing systems whose sole purpose it is to capture and synthesise changes in entropy is the production of intelligent systems?"

"A case in point. Neural networks are weighted switches. They store their 'weights' in the neuron. The storage of these weights determines the networks ability to perform an intellectual task. Therefore studying the 'entropy' of these weights and what and how they change and the effects of these changes is to study the networks 'intelligence' directly?

Another case in point. Genetic algorithms can search a solution landscape and then select the 'best' solution as a seed to the next iteration. This 'best current solution' will have an entropy or measure of order or disorder. So, in these terms, the system is measuring the level of chaos in the system according to some rules and selecting the solution that produces the least chaos (most entropy)

Is this striking any cords with anyone?"
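
To pin down what "the entropy of these weights" could even mean operationally, here is one minimal reading as a Python sketch (an illustration, not the submitter's proposal in code): histogram the weight values and take the Shannon entropy of the histogram. The bin count, range, and example weight sets are arbitrary assumptions.

```python
import math
import random

def weight_entropy(weights, bins=10, lo=-1.0, hi=1.0):
    """Shannon entropy (bits) of a histogram of weight values over [lo, hi] --
    one concrete reading of 'the entropy of these weights'."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for w in weights:
        idx = min(max(int((w - lo) / width), 0), bins - 1)
        counts[idx] += 1
    total = len(weights)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

random.seed(1)
spread_weights = [random.uniform(-1, 1) for _ in range(1000)]       # spread out: high entropy
clustered_weights = [random.gauss(0.5, 0.01) for _ in range(1000)]  # clustered: near-zero entropy
print(weight_entropy(spread_weights), weight_entropy(clustered_weights))
```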

  • by algaeman ( 600564 ) on Monday August 12, 2002 @11:06PM (#4059119)
    Negative changes in entropy are information, not necessarily intelligence. Intelligence has connotations that information does not. Therefore, while several tons of algae are able to produce more information (by converting CO2, H2O and energy into carbohydrates) than a similar mass of humans, I suspect that the humans possess more intelligence. I think in order to have an intelligent discussion on the subject, an effective definition of intelligence would be needed. Not being a cognitive psychologist, I'll leave that to the experts and go on growing tons of algae.
    • by Anonymous Coward
      Speciesist! We algae not only process more information, but we are also more intelligent than the off-the-rack Homo sapiens. We're just smart enough not to let on.
    • by Anonymous Coward
      while several tons of algae are able to produce more information ... than a similar mass of humans, I suspect that the humans possess more intelligence.

      You've obviously never manned a help desk :o)
  • Logically speaking (Score:2, Insightful)

    by Froze ( 398171 )
    Only studying the predictable changes in entropy would follow from your suppositions, i.e.:
    the study of HOW *predictable* entropic changes occur in any given system is the study of intelligence itself...
  • Here's a cool site where you can take on the role of Maxwell's Demon [k12.ca.us] and use your intellect to fight entropy.
  • by kmellis ( 442405 ) <kmellis@io.com> on Monday August 12, 2002 @11:23PM (#4059183) Homepage

    Entropy is a measurement of a microcosmic physical property. The generalized idea of "disorder" that led to the idea of information entropy is related but separate.

    This is important because it is a pernicious error to conflate the two [wisc.edu], an error which often results in false conclusions about thermodynamics and the macrocosmic world.

    • interesting article, thank you


    • > Entropy is a measurement of a microcosmic physical property. The generalized idea of "disorder" that led to the idea of information entropy is related but separate.

      As someone pointed out on talk.origins earlier this year, you can see it right away by looking at the units.

      I.e., what is the conversion factor for Joules/Kelvin to bits?
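
      For reference, here are the two textbook definitions written side by side (a sketch, not part of the original comment): thermodynamic entropy carries Boltzmann's constant and hence units of joules per kelvin, while Shannon entropy is a pure number of bits.

```latex
% Gibbs/Boltzmann (thermodynamic) entropy -- carries physical units via k_B:
S = -k_B \sum_i p_i \ln p_i, \qquad [S] = \mathrm{J\,K^{-1}}

% Shannon (informational) entropy -- a dimensionless count of bits:
H = -\sum_i p_i \log_2 p_i, \qquad [H] = \text{bits}
```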

    • by grammar nazi ( 197303 ) on Tuesday August 13, 2002 @01:18AM (#4059628) Journal
      I wouldn't be the grammar nazi if I didn't take point out that: "Is this striking any cords with anyone?" (line from original story) is incorrect. The proper homonym is chord, not cord; "to strike a chord"

      The first paragraph makes the following hypothesis: "...the study of HOW entropy changes in any given system is the study of intelligence itself...?" That seems to be the main question in this ask slashdot.

      In the second paragraph, he shows us how little he knows about neural networks (or grammar). It is not the storage of the weights in a neural network that determines the network's "intellectual" ability, but rather the value of the weights (I would argue it's the training method and network structure, rather than the actual weights). Furthermore, this sentence about the storage of the weights does not lead to the next sentence: "Therefore, studying the "entropy" of these weights... is to study intelligence". One point does not imply the other point. I'm not sure what he/she is trying to say here. As a side point, there is little entropy to the neural network nodes anyway. Similar networks, performing similar but different functions, have similar node weights (I know this from working on an OCR system for 2 years at Lockheed Martin).
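
      To make that point concrete, here is a minimal sketch (not the commenter's code) of a "weighted switch": the output depends only on the values of the weights, not on how they happen to be stored.

```python
# Minimal sketch of a single artificial neuron as a 'weighted switch'.
# The output is a function of the weight *values* alone.

def neuron(inputs, weights, bias, threshold=0.0):
    """Weighted sum followed by a hard threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > threshold else 0

# Two differently 'stored' but numerically identical weight sets behave identically...
print(neuron([1, 0, 1], [0.6, -0.2, 0.7], bias=-1.0))   # -> 1
print(neuron([1, 0, 1], (0.6, -0.2, 0.7), bias=-1.0))   # same values, same output -> 1

# ...while changing the values changes the behaviour.
print(neuron([1, 0, 1], [0.1, -0.2, 0.2], bias=-1.0))   # -> 0
```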

      The author's third paragraph proposes studying the entropic nature of Genetic Algorithms. However, the entropy in each generation of a genetic algorithm comes directly from a random number generator. All GA entropy derives from randomness in the selection. To study each generation is equivalent to studying random number generation, nothing more.

      Finally, I wouldn't be doing my Mathematics degree justice if I didn't point out that CHAOS IS NOT ENTROPY. Chaos is marked by having a complicated, seemingly random, system described by a simple structure or order ("Period Three Implies Chaos", Li and Yorke). Entropy describes a random system having NO simple structure or order.

      Does the original author still have a valid question? Probably. Little understanding of intelligence is to be gained from studying randomness and entropy in GAs and Neural Networks. Perhaps the time would be better spent studying entropy XOR GAs and NNs. Or learning a little bit more about any of these things and then re-posing the question.

      • by Anonymous Coward
        I suggest that they read this [amazon.com] book to get a generalized understanding of the topic. Then they should supplement in a particular area to become an expert in a subset. I also suggest that one should avoid making giant leaps when working with things scientific... rather, check and re-check yourself. Einstein had the famous "E=mc^2" five times in his notes and verified it with friends who were skilled mathematicians several times before he actually published it.

        Damn. slash is forcing me to post anon.

      • > I wouldn't be the grammar nazi if I didn't take point out that: "Is this striking any cords with anyone?" (line from original story) is incorrect.

        "didn't take point out"???

        Playing grammar nazi is a dangerous game: there seems to be a law of nature that increases the probability of grammatical errors in posts criticizing others' grammar. (I think they call it the "Second Law of Poetic Justice".)

      • Three points.

        1) Yes, chaotic systems do have an entropy rate: a production of information in bits per second.

        This is the Kolmogorov-Sinai entropy, and when you partition your state space into a discrete form correctly this K-S entropy will equal the Shannon informational entropy rate.

        The ideas of entropy can be applied to deterministic dynamical systems when discretized, which induces effective 'probabilistic' laws even without assuming any "fundamental" probabilism in the laws of motion (which I don't believe in anyway).
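
        For a concrete (if toy) version of this, here is a minimal Python sketch, assuming the fully chaotic r = 4 logistic map and the standard binary partition at x = 0.5; the block-entropy differences estimate the entropy rate, which for this map is the K-S entropy of ln 2, i.e. about 1 bit per iteration. The sequence length and block sizes are arbitrary choices.

```python
import math
from collections import Counter

def logistic_symbols(n, x0=0.2, r=4.0):
    """Binary symbol sequence from the logistic map, partitioned at x = 0.5."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append('0' if x < 0.5 else '1')
    return ''.join(out)

def block_entropy(s, k):
    """Shannon entropy (bits) of the distribution of length-k blocks in s."""
    blocks = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log2(c / total) for c in blocks.values())

s = logistic_symbols(100_000)
for k in (1, 2, 4, 8):
    # conditional block entropy H(k+1) - H(k) estimates the entropy rate
    print(k, round(block_entropy(s, k + 1) - block_entropy(s, k), 3))
```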

        2) The overall topic seems really silly and is reversing cause and effect.

        Not every reduction in entropy is intelligence.

        A complete reduction of configurational entropy means going to a fixed point: i.e., dead.

        3) the quoted professor's hysterical denial of any relation between Shannon's "informational entropy" and physical entropy is exaggerated.

        It is true that physical entropy refers to entropy over a specific, physically realistic space: the probabilities over particles' degrees of freedom, which classically are their positions and momenta.

        If you apply the theory of informational entropy to that particular space and add in the laws of motion (namely the chaos that brings you to physical equilibrium), you get regular physical entropy and thermodynamics.

        Information theory is a mathematical theory (and it is great), which can be applied in some circumstances to real physics.
    • "Overrated" at "4"? ("Moderation Totals: Insightful=1, Interesting=1, Overrated=1, Total=3. ") Huh. Since my point is something that is central to the discussion, and not coincidentally completely correct but not well-known, I suppose this must be one of those examples of misusing moderation as a form of argument. Ah, well.
  • Assumptions (Score:3, Interesting)

    by Incongruity ( 70416 ) on Monday August 12, 2002 @11:27PM (#4059203)
    Given that any force that changes the entropy of any system in a predictable way is an 'intelligent' force.

    Well, my first thought is that just because something is changed in a manner that is able to be predicted *does not* mean that you (or anyone else) will be able to predict it. This is very similar to the halting problem [everything2.com] (see also Turing machines) in basic computing theory. How do you know whether you can't predict the behavior (i.e., it's truly random) or you just haven't found the correct functional description yet?

    My second thought is that your first premise, as stated above, can be taken in (at least) two ways: a strong sense and a trivial sense. First, the trivial sense: you're simply labeling anything that can predictably change the entropy of a system as intelligent. Simple, and actually setting yourself up for a nice, simple tautology of equivalences. The strong sense: intelligence is *required* to change the entropy of a system in a predictable way. This then requires a definition of what you mean by intelligence, and I somehow don't think that this strong sense is what you mean. So it's the trivial case you're interested in (that is to say, you've defined intelligence for us).

    Is it true that the study of HOW entropy changes in any given system is the study of intelligence itself, in that given system?

    Inasmuch as the "how" really gets at the "what" (or that they're intimately connected, see Aristotle's 4 causes, covered well [everything2.com] at everything2 [everything2.com]).

    The real issue, though, is that you seem to be trying to accurately describe/define intelligence, but you don't account for the common usage of the word well enough: either you're putting forth a simple tautological statement, or you're failing to accomplish your goal in any effective or substantial way... but that's just my simple opinion.

    -inco

  • Life is an expression of the information carried in genetic material. Information exists independent of entropy. Because information and entropy are specifically not linked, life forms can grow more complex over time. In fact, life forms can and therefore must grow more complex over time, because even after life is complex enough to have transcended direct competition with its environment, there is still direct competition with others of the species.

    Intelligence and entropy are most emphatically NOT linked.
    • Hmm... maybe. We don't know whether intelligence and entropy *are* linked when considering intelligent organisms, though. How do you prove they're not linked when considering an intelligent organism?
      • Consider a fast CPU. 70 watts of electricity go in, 70 watts of heat go out.

        Information gets processed.

        If information were linked to energy, 70 watts would go in, 69 watts would come out, and an arbitrary number of bits would be processed.

        The more energy you could "convert" to information, the cooler your processor would get.

        Alas, it was said best earlier by another poster (to paraphrase): 'There is no unit conversion for Joules to bits.'

        To believe any organism is anything more than a complex machine is to discredit the last hundred years of biology.

        The idea of vitalism has held back advancement to an amazing degree. Psychology's abandonment of vitalism let it become a hard science, as late as the mid-eighties. As soon as we abandoned the idea that the mind was something more than an expression of the brain, the psychotropic revolution occurred.

        We no longer beat people with rubber hoses, shock them, give them insulin overdoses, or drill holes in people's frontal lobes. We give them prozac.
        • You seem to be saying that the CPU is "producing" information ("convert"ing energy). The reality is that a CPU (or at least my CPU) is a "processor" of information, not a producer, and by changing the form of information, it should produce waste heat rather than use up energy. Hence, 70 watts in, 71 watts out. And an arbitrary number of bits get processed in the bargain.
          • So computers are perpetual motion machines, 101% efficient?

            Same energy in, same energy out, regardless of how much energy or information is involved. The two are not related.

            Entropy is not linked to Intelligence.
            • First of all, on what do you base the claim that your processor (or even your whole computer) gives off 70 watts of heat? (FWIW, my power supply claims 300w - although the drives probably account for most of that...)

              Rather than your assumption that the processor must give off all its energy as heat, I am merely suggesting that the opposite would be true if the author's assumption holds. In the absence of any discriminating evidence between the two, I could just as easily believe that

              *the information converted is independent of the energy consumed (70w electricity => 70w waste heat), or
              *information conversion releases "waste heat" much like any other energy conversion (70w electricity + 100w information => 99w processed information + 71w waste heat), or even
              *(my personal preference, though it pro'lly conflicts with the author's assumption) that information is matter-like, and its conversion is work done. This would mean that your 70 watts of electricity went into 10w work (processing) done on information, and 60w waste heat.

              Interestingly, anecdotal evidence would lend support to the latter 2 - my CPU runs hotter when processing large amounts of data, by either generating more order (and therefore more waste heat) or by doing more work (and therefore making more waste heat).

              PS, in physics we call it "waste heat" because it is UNRECOVERABLE! In fact, that is almost the sole reason you cannot have a perpetual motion machine... the energy change (from stored to motion) gives off waste heat!
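
              Purely as arithmetic on the three hypothetical budgets above (the "watts of information" figures are the poster's speculative bookkeeping, not standard physics), here is a quick check that each scenario at least balances:

```python
# Each scenario: (label, watts in, watts out), using the poster's hypothetical numbers.
scenarios = [
    ("independent",       {"electricity": 70},                      {"waste heat": 70}),
    ("info + waste heat", {"electricity": 70, "information": 100},  {"processed info": 99, "waste heat": 71}),
    ("info as work",      {"electricity": 70},                      {"work on info": 10, "waste heat": 60}),
]

for label, inputs, outputs in scenarios:
    total_in, total_out = sum(inputs.values()), sum(outputs.values())
    print(f"{label}: {total_in} W in, {total_out} W out, balanced={total_in == total_out}")
```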
              • Re:Information (Score:3, Interesting)

                by Perdo ( 151843 )
                Get back on the hi-speed bitumen crack....

                If you tell me an idea, have you lost anything?

                You can tell a billion people without losing it yourself.

                If you give me your sandwich, you go hungry.

                Your sandwich will not feed a billion people.

                The information itself is not governed by the laws of physics.

                Information is independent of energy.

                Could it be ANY simpler for you?
  • In the study of dissipation-fluctuation phenomena in condensed matter, when atomic motion occurs in a highly cooperative manner due to some type of impulsive electrical or mechanical excitation, that motion has a definite information content. The information sink can be considered the thermal bath with which the solid exchanges energy. Entropy in this sense has been referred to as information loss.
  • Nah. (Score:4, Insightful)

    by Black Parrot ( 19622 ) on Tuesday August 13, 2002 @12:12AM (#4059377)


    > Given that any force that changes the entropy of any system in a predictable way is an 'intelligent' force.

    The second law of thermodynamics is pretty predictable, but it has nothing to do with intelligence. Unless you consider randomly colliding molecules to be functionally intelligent.

    No flame intended, but have you by any chance been listening to the proponents of "intelligent design theory", the latest reincarnation of creation 'science'?

    > A case in point. Neural networks are weighted switches. They store their 'weights' in the neuron. The storage of these weights determines the networks ability to perform an intellectual task. Therefore studying the 'entropy' of these weights and what and how they change and the effects of these changes is to study the networks 'intelligence' directly?

    You seem to be confusing the training of the network with its operations after it has been trained.

    > Another case in point. Genetic algorithms can search a solution landscape and then select the 'best' solution as a seed to the next iteration. This 'best current solution' will have an entropy or measure of order or disorder. So, in these terms, the system is measuring the level of chaos in the system according to some rules and selecting the solution that produces the least chaos (most entropy)

    Actually, depending on what problem the GA is working on and what exactly you measure for the entropy calculations, the entropy may either increase or decrease as it progresses. (I know this for a fact, because I've done it.)
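
    For what it's worth, here is a toy sketch (not the poster's code) of that kind of measurement: a GA on the OneMax problem, printing the mean per-bit Shannon entropy of the population each generation. In this toy run the entropy mostly falls as the population converges; as the poster notes, other problems and other entropy measures can go the other way. The fitness function, parameters, and selection scheme are all illustrative assumptions.

```python
import math, random

random.seed(0)

def population_entropy(pop):
    """Mean per-locus Shannon entropy (bits) of a population of bit strings."""
    n, length = len(pop), len(pop[0])
    total = 0.0
    for i in range(length):
        p = sum(ind[i] for ind in pop) / n          # frequency of '1' at locus i
        for q in (p, 1 - p):
            if q > 0:
                total -= q * math.log2(q)
    return total / length

def onemax_ga(pop_size=50, length=40, generations=20, mutation=0.01):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for gen in range(generations):
        print(gen, round(population_entropy(pop), 3))
        # tournament selection on the OneMax fitness (count of ones)
        def pick():
            a, b = random.sample(pop, 2)
            return a if sum(a) >= sum(b) else b
        # uniform crossover + bit-flip mutation
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            child = [random.choice(bit) for bit in zip(p1, p2)]
            child = [b ^ (random.random() < mutation) for b in child]
            nxt.append(child)
        pop = nxt

onemax_ga()
```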

    > Is this striking any cords with anyone?

    Yeah, the same kind Lister strikes when he plays his guitar on Red Dwarf.

    There is certainly room for applications of entropy to the study of these things, but you don't seem to be off to a good start. For some basic applications of information theory to neural networks, see Haykin's textbook. There's surely lots more literature out there, if you care to track it down.

  • > systems [that] change
    > entropy [are] intelligent systems?

    Only for a very unambitious definition of "intelligent". An internal combustion engine would be one example. An air conditioner or a food processor might be other examples. They also take inputs, you will notice... signals, even.
  • of "Intelligence" seems to be our definition of Biomass.
  • Given that entropy is the measure of order or disorder.

    The difficulty is in this first step. It's not like you can go to the hardware store and buy an "entropymeter" like you can a voltmeter or thermometer. So how do you measure entropy? i.e. how do you look for patterns in data when you might not even understand the process that creates the data? We do have some tools (heck, I like bzipping the original and comparing compressed with uncompressed file sizes) but they do not find all relevant patterns by any means.
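
    A minimal version of the bzip trick mentioned above (an illustrative sketch with made-up inputs); as the comment says, it is only a crude proxy, and any pattern the compressor misses inflates the estimate:

```python
import bz2, os

def compressed_ratio(data: bytes) -> float:
    """Crude entropy proxy: compressed size / original size (near 1.0 ~ 'looks random')."""
    return len(bz2.compress(data)) / len(data)

# Patterned data compresses well; data from os.urandom does not.
print(compressed_ratio(b"abcd" * 25_000))      # small ratio: lots of structure found
print(compressed_ratio(os.urandom(100_000)))   # ratio near (or above) 1.0: no structure found
```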

    I'm not saying that entropy is a useless concept; I'm just saying that it's not an easy thing to measure in complex (AI-type) systems. When your tool fails to detect a pattern it will increase your measured entropy, but the pattern is still there. I am extremely uncomfortable if changing my tool changes my measure of the entropy.

  • The questions, and particularly the assumptions, are gross oversimplifications.

    Try this. The power to your neural net fails. The result is a very predictable and massive change in entropy. But, there is absolutely positively no intelligence involved.

    My answer is also a ridiculous oversimplification, but that's the point.
  • My own thoughts. I'm not an entropy geek, but here's a variable anyway - take it for whatever it's worth.

    It's been suggested by the author that a change in the state of entropy that is predictable (i.e., to a neural network) would connote intelligence. FreeLinux countered that this doesn't apply, as the removal of a power source and the predictable result of your NN failure would not connote intelligence; it would merely connote a power interruption of some flavor. In short, it was a gross oversimplification.

    Now let's try something: apply the change directly to the entropy pool. (Consider that a power hit is pretty much indirect.) To oversimplify further, it would be not unlike adding a random frob to a Life field.

    Perhaps the answer here is within the question (or in fact within the results), but I will leave that as an exercise to the reader.

  • Life is a localized reversal of entropy, not intelligence. Life can arrange atoms in states which are more energetic and more organized. Overall entropy continues to make things less organized.

    Life does indeed increase the entropy of some material: its waste from food. Energy is extracted from food to power the entropy reversals. Growth and organization are the results, although only as localized exceptions to the rule.

    Cells, multicelled organisms, plants, complex life forms, tools, machinery, cities... all are organized, but not necessarily intelligent.

    A simple hydraulic ram pump [maintenanceresources.com] is a good example. It lets water fall through, slams a valve shut, and the inertia of the water moving forward pushes some water up through a one-way valve, pumping that water to a higher level. Much more water is wasted by allowing it to fall (so that the water is moving fast enough for its inertia to provide enough force) than is pumped upward. The falling water has to be made to drop to its lower state in order to move the lesser amount to a higher energy state.
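
    As a rough energy-bookkeeping sketch of the ram pump (the figures below are made up; only the ratios matter), the small mass delivered uphill is paid for by a much larger mass dropped downhill, so total potential energy still decreases:

```python
# Hypothetical figures for a hydraulic ram pump, per minute of operation.
g = 9.81                  # m/s^2
drive_water_kg = 100.0    # water that falls through the waste valve (made-up figure)
drive_drop_m = 2.0        # height the drive water falls
delivery_lift_m = 15.0    # height the pump must raise the delivered water
efficiency = 0.6          # assumed overall efficiency of the pump

energy_in = drive_water_kg * g * drive_drop_m        # J released by the falling water
energy_out = efficiency * energy_in                  # J stored in the raised water
delivered_kg = energy_out / (g * delivery_lift_m)    # mass of water actually raised (~8 kg)

print(f"energy in: {energy_in:.0f} J, delivered uphill: {delivered_kg:.1f} kg "
      f"(vs {drive_water_kg:.0f} kg dropped)")
```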

    Oh, yeah... the ram pump is reversing entropy on the raised water, but it's hard to consider it alive. It is wearing out and rusting and is only reversing entropy in one way. If it is part of a city with a maintenance crew (human or robots), then it becomes harder to separate from the qualities of a life form. Proceed with comparisons to a molecular pump in the wall of a cell...
