Studying Intelligence Thru Entropy?
"A case in point. Neural networks are weighted switches. They store their 'weights' in the neuron. The storage of these weights determines the networks ability to perform an intellectual task. Therefore studying the 'entropy' of these weights and what and how they change and the effects of these changes is to study the networks 'intelligence' directly?
Another case in point. Genetic algorithms can search a solution landscape and then select the 'best' solution as a seed to the next iteration. This 'best current solution' will have an entropy or measure of order or disorder. So, in these terms, the system is measuring the level of chaos in the system according to some rules and selecting the solution that produces the least chaos (most entropy)
Is this striking any cords with anyone?"
Entropy and information (Score:3, Insightful)
Re:Entropy and information (Score:1, Funny)
Re:Entropy and information (Score:1, Funny)
You've obviously never manned a help desk
Logically speaking (Score:2, Insightful)
Studying HOW predictable entropic changes are in any given system is the study of intelligence itself...
Here's a site to slashdot... (Score:2, Interesting)
Information "entropy" is not entropy. (Score:3, Interesting)
Entropy is a measurement of a microcosmic physical property. The generalized idea of "disorder" that led to the idea of information entropy is related but separate.
This is important because it is a pernicious error to conflate the two [wisc.edu], an error which often results in false conclusions about thermodynamics and the macrocosmic world.
Re:Information "entropy" is not entropy. (Score:1)
Re: Information "entropy" is not entropy. (Score:2)
> Entropy is a measurement of a microcosmic physical property. The generalized idea of "disorder" that led to the idea of information entropy is related but separate.
As someone pointed out on talk.origins earlier this year, you can see it right away by looking at the units.
I.e., what is the conversion factor for Joules/Kelvin to bits?
Re: Information "entropy" is not entropy. (Score:2, Funny)
Mountain Dew cans is the conversion factor. Joules of energy fed to Computer Science students.
Re:Information "entropy" is not entropy. (Score:4, Informative)
The first paragraph poses the following hypothesis: "...the study of HOW entropy changes in any given system is the study of intelligence itself...?" That seems to be the main question in this Ask Slashdot.
In the second paragraph, he shows us how little he knows about neural networks (or grammar). It is not the storage of the weights in a neural network that determines the network's "intellectual" ability, but rather the values of the weights (I would argue it's the training method and network structure, rather than the actual weights). Furthermore, the sentence about the storage of the weights does not lead to the next sentence: "Therefore, studying the 'entropy' of these weights... is to study intelligence". One point does not imply the other. I'm not sure what he/she is trying to say here. As a side point, there is little entropy in the neural network nodes anyway. Similar networks, performing similar but different functions, have similar node weights (I know this from two years developing an OCR system at Lockheed Martin).
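For what it's worth, the "entropy of the weights" idea is at least easy to quantify. A minimal sketch (the weight vectors here are made up, not from any real trained network, and the 32-bin histogram is an arbitrary choice):

```python
import random
from collections import Counter
from math import log2

def weight_entropy(weights, bins=32, lo=-1.0, hi=1.0):
    """Shannon entropy (bits) of a histogram of weight values."""
    width = (hi - lo) / bins
    counts = Counter(min(int((w - lo) / width), bins - 1) for w in weights)
    n = len(weights)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical weight vectors, not from any real network:
rng = random.Random(0)
clustered = [rng.gauss(0.0, 0.01) for _ in range(1000)]   # bunched near zero
spread = [rng.uniform(-1.0, 1.0) for _ in range(1000)]    # all over the place

print(weight_entropy(clustered))  # low: weights fall in a couple of bins
print(weight_entropy(spread))     # high: close to log2(32) = 5 bits
```

Of course, a different binning gives you a different number, which hints at the trouble with treating this as a direct measure of "intelligence".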
The author's third paragraph proposes studying the entropic nature of genetic algorithms. However, the entropy in each generation of a genetic algorithm comes directly from a random number generator. All GA entropy is derived from randomness in the selection. Studying each generation is equivalent to studying random number generation, nothing more.
Finally, I wouldn't be doing my Mathematics degree justice if I didn't point out that CHAOS IS NOT ENTROPY. Chaos is marked by a complicated, seemingly random system being described by a simple structure or order ("Period Three Implies Chaos", Li and Yorke). Entropy is a random system having NO simple structure or order.
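To make that distinction concrete, here's the classic toy example (a sketch; r = 4 is the standard fully chaotic parameter of the logistic map): a one-line deterministic rule whose behavior looks disordered. That's chaos, i.e. complication arising from simple structure, not entropy.

```python
# The logistic map: one line of deterministic "simple structure",
# yet for r = 4 its orbits look random and nearby orbits fly apart.
def orbit(x0, r=4.0, steps=10):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-7)      # a nudge in the 7th decimal place
print(abs(a[-1] - b[-1]))  # the tiny nudge has grown by orders of magnitude
```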
Does the original author still have a valid question? Probably. But little understanding of intelligence is to be gained from studying randomness and entropy in GAs and neural networks. Perhaps the time would be better spent studying entropy XOR GAs and NNs. Or learning a little bit more about any of these things and then posing the question again.
Re:Information "entropy" is not entropy. (Score:1)
Re:Information "entropy" is not entropy. (Score:1, Informative)
Damn. slash is forcing me to post anon.
Re: Information "entropy" is not entropy. (Score:2, Funny)
> I wouldn't be the grammar nazi if I didn't take point out that: "Is this striking any cords with anyone?" (line from original story) is incorrect.
"didn't take point out"???
Playing grammar nazi is a dangerous game: there seems to be a law of nature that increases the probability of grammatical errors in posts criticizing others' grammar. (I think they call it the "Second Law of Poetic Justice".)
Re:Information "entropy" is not entropy. (Score:2, Insightful)
1) Yes, chaotic systems do have an entropy rate: a production of information in bits per second.
This is the Kolmogorov-Sinai entropy, and when you partition your state space into a discrete form correctly, this K-S entropy will equal the Shannon informational entropy rate.
The ideas of entropy can be applied to deterministic dynamical systems when discretized, which induces effective 'probabilistic' laws even without assuming any "fundamental" probabilism in the laws of motion (which I don't believe in anyway).
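For the curious, here's a numerical sketch of that equality for the r = 4 logistic map, whose K-S entropy is log 2, i.e. 1 bit per iteration. The partition at x = 1/2 is the standard generating partition for this map; the sequence length and block sizes are arbitrary choices:

```python
# Symbolize an orbit of the r=4 logistic map with the partition
# {x < 1/2, x >= 1/2} and estimate the entropy rate from block entropies.
from collections import Counter
from math import log2

def symbols(x0=0.3, n=200_000):
    out, x = [], x0
    for _ in range(n):
        out.append(0 if x < 0.5 else 1)
        x = 4.0 * x * (1.0 - x)
    return out

def block_entropy(s, k):
    """Shannon entropy (bits) of the empirical distribution of k-blocks."""
    blocks = Counter(tuple(s[i:i + k]) for i in range(len(s) - k + 1))
    n = sum(blocks.values())
    return -sum((c / n) * log2(c / n) for c in blocks.values())

s = symbols()
rate = block_entropy(s, 9) - block_entropy(s, 8)
print(rate)  # close to 1 bit per symbol, matching the K-S entropy log(2)
```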
2) The overall topic seems really silly and is reversing cause and effect.
Not all reduction in entropy is intelligence.
A complete reduction of configurational entropy means going to a fixed point: i.e., dead.
3) The quoted professor's hysterical denial of any relation between Shannon's "informational entropy" and physical entropy is exaggerated.
It is true that physical entropy refers to entropy over a specific, physically realistic space: the probabilities over particles' degrees of freedom (classically at least, their positions and momenta).
But if you apply the theory of informational entropy to that particular space, and add in the laws of motion (namely the chaos that brings you to physical equilibrium), you get regular physical entropy and thermodynamics.
Information theory is a mathematical theory (and it is great), which can be applied in some circumstances to real physics.
Re:Information "entropy" is not entropy. (Score:2)
Assumptions (Score:3, Interesting)
Well, my first thought is that just because something is changed in a manner that can be predicted *does not* mean that you (or anyone else) will be able to predict it. This is very similar to the halting problem [everything2.com] (see also Turing machines) in basic computing theory. How do you know whether you can't predict the behavior (i.e. it's truly random) or you just haven't found the correct functional description yet?
My second thought is that your first premise, as stated above, can be taken in (at least) two ways: a strong sense and a trivial sense. First, the trivial sense: you're simply labeling anything that can predictably change the entropy of a system as intelligent. Simple, and you're actually setting yourself up for a nice, simple tautology of equivalences. The strong sense: intelligence is *required* to change the entropy of a system in a predictable way. This then requires a definition of what you mean by intelligence, and I somehow don't think that this strong sense is what you mean. So it's the trivial case you're interested in (that is to say, you've defined intelligence for us).
Is it true that the study of HOW entropy changes in any given system is the study of intelligence itself, in that given system?
Inasmuch as the "how" really gets at the "what" (or that they're intimately connected, see Aristotle's 4 causes, covered well [everything2.com] at everything2 [everything2.com]).
The real issue, though, is that you seem to be trying to accurately describe/define intelligence, but you don't account for the common usage of the word well enough to be doing anything more than either putting forth a simple tautological statement or failing to accomplish your goal in any effective or substantial way... but that's just my simple opinion.
-inco
Information (Score:2)
Intelligence and entropy are most emphatically NOT linked.
Re:Information (Score:1)
Re:Information (Score:2)
Information gets processed.
If information were linked to energy, 70 watts would go in, 69 watts would come out, and an arbitrary number of bits would be processed.
The more energy you could "convert" to information, the cooler your processor would get.
Alas, it was said best earlier by another poster (paraphrasing): 'There is no unit conversion from Joules to bits.'
To believe any organism is anything more than a complex machine is to discredit the last hundred years of biology.
The idea of vitalism has held back advancement to an amazing degree. Psychology's abandonment of vitalism is what let it become a hard science, as late as the mid-eighties. As soon as we abandoned the idea that the mind was something more than an expression of the brain, the psychotropic revolution occurred.
We no longer beat people with rubber hoses, shock them, give them insulin overdoses, or drill holes in their frontal lobes. We give them Prozac.
Re:Information (Score:1)
Re:Information (Score:2)
Same energy in, same energy out, regardless of how much energy or information is involved. The two are not related.
Entropy is not linked to Intelligence.
Re:Information (Score:1)
Re:Information (Score:1)
Rather than your assumption that the processor must give off all its energy as heat, I am merely suggesting that the opposite would be true if the author's assumption holds. In the absence of any discriminating evidence between the two, I could just as easily believe that
*the information converted is independent of the energy consumed (70w electricity => 70w waste heat), or
*information conversion releases "waste heat" much like any other energy conversion (70w electricity + 100w information => 99w processed information + 71w waste heat), or even
*(my personal preference, though it pro'lly conflicts with the author's assumption) that information is matter-like, and its conversion is work done. This would mean that your 70watts of electricity went into 10w work (processing) done on information, and 60w waste heat.
Interestingly, anecdotal evidence would lend support to the latter 2 - my CPU runs hotter when processing large amounts of data, by either generating more order (and therefore more waste heat) or by doing more work (and therefore making more waste heat).
PS, in physics we call it "waste heat" because it is UNRECOVERABLE! In fact, that is almost the sole reason you cannot have a perpetual motion machine... the energy change (from stored to motion) gives off waste heat!
Re:Information (Score:3, Interesting)
If you tell me an idea, have you lost anything?
you can tell a billion people without losing it yourself.
If you give me your sandwich, you go hungry.
Your sandwich will not feed a billion people.
The information itself is not governed by the laws of physics.
Information is independent of energy.
Could it be ANY simpler for you?
Actually Physical Entropy is Information Entropy (Score:1)
Nah. (Score:4, Insightful)
> Given that any force that changes the entropy of any system in a predictable way is an 'intelligent' force.
The second law of thermodynamics is pretty predictable, but it has nothing to do with intelligence. Unless you consider randomly colliding molecules to be functionally intelligent.
No flame intended, but have you by any chance been listening to the proponents of "intelligent design theory", the latest reincarnation of creation 'science'?
> A case in point. Neural networks are weighted switches. They store their 'weights' in the neuron. The storage of these weights determines the networks ability to perform an intellectual task. Therefore studying the 'entropy' of these weights and what and how they change and the effects of these changes is to study the networks 'intelligence' directly?
You seem to be confusing the training of the network with its operations after it has been trained.
> Another case in point. Genetic algorithms can search a solution landscape and then select the 'best' solution as a seed to the next iteration. This 'best current solution' will have an entropy or measure of order or disorder. So, in these terms, the system is measuring the level of chaos in the system according to some rules and selecting the solution that produces the least chaos (most entropy)
Actually, depending on what problem the GA is working on and what exactly you measure for the entropy calculations, the entropy may either increase or decrease as it progresses. (I know this for a fact, because I've done it.)
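You can see this for yourself with a toy GA. This sketch (tournament selection on a count-the-ones objective; the population size, mutation rate, and generation count are all arbitrary choices) tracks the summed per-locus Shannon entropy of the population. With this objective it falls as the population converges, but a different objective or a higher mutation rate can push it the other way:

```python
import random
from collections import Counter
from math import log2

def population_entropy(pop):
    """Shannon entropy (bits) of per-locus bit frequencies, summed over loci."""
    n = len(pop)
    h = 0.0
    for locus in zip(*pop):
        for count in Counter(locus).values():
            p = count / n
            h -= p * log2(p)
    return h

def generation(pop, fitness, rng, mutation=0.01):
    """Tournament selection plus per-bit mutation -- one toy GA step."""
    out = []
    for _ in range(len(pop)):
        a, b = rng.sample(pop, 2)
        parent = a if fitness(a) >= fitness(b) else b
        out.append(tuple(bit ^ (rng.random() < mutation) for bit in parent))
    return out

rng = random.Random(0)
pop = [tuple(rng.randint(0, 1) for _ in range(20)) for _ in range(50)]
h_start = population_entropy(pop)
for _ in range(30):
    pop = generation(pop, sum, rng)   # toy objective: count of 1-bits
print(h_start, population_entropy(pop))  # entropy falls as selection converges
```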
> Is this striking any cords with anyone?
Yeah, the same kind Lister strikes when he plays his guitar on Red Dwarf.
There is certainly room for applications of entropy to the study of these things, but you don't seem to be off to a good start. For some basic applications of information theory to neural networks, see Haykin's textbook. There's surely lots more literature out there, if you care to track it down.
Question begging (Score:2)
> entropy [are] intelligent systems?
Only for a very unambitious definition of "intelligent". An internal combustion engine would be one example. An air conditioner or a food processor might be other examples. They also take inputs, you will notice... signals, even.
His Definition: (Score:2)
But how do you measure Entropy? (Score:2)
The difficulty is in this first step. It's not like you can go to the hardware store and buy an "entropymeter" the way you can a voltmeter or thermometer. So how do you measure entropy? That is, how do you look for patterns in data when you might not even understand the process that creates the data? We do have some tools (heck, I like bzipping the original and comparing compressed with uncompressed file sizes), but they do not find all relevant patterns by any means.
I'm not saying that entropy is a useless concept; I'm just saying that it's not an easy thing to measure in complex (AI-type) systems. When your tool fails to detect a pattern, it will increase your measured entropy, but the pattern is still there. I am extremely uncomfortable with the fact that changing my tool changes my measure of the entropy.
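The bzip trick is easy to sketch with the standard library. The caveat above applies: the ratio only reflects the patterns bzip2 knows how to find, so it's an upper bound on the entropy at best:

```python
import bz2
import os

def compression_ratio(data: bytes) -> float:
    """Crude entropy proxy: compressed size / original size."""
    return len(bz2.compress(data)) / len(data)

patterned = b"abcd" * 25_000   # a pattern bzip2 finds instantly
noise = os.urandom(100_000)    # no pattern for bzip2 to find

print(compression_ratio(patterned))  # near 0: "low entropy"
print(compression_ratio(noise))      # near (or slightly above) 1
```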
Gross oversimplification. (Score:2)
Try this: the power to your neural net fails. The result is a very predictable and massive change in entropy. But there is absolutely, positively no intelligence involved.
My answer is also a ridiculous oversimplification, but that's the point.
On neural nets (Score:1)
It's been suggested by the author that a change in the state of entropy that is predictable (i.e., to a neural network) would connote intelligence. FreeLinux countered that this doesn't apply: the removal of a power source and the predictable result of your NN failing would not connote intelligence, it would merely connote a power interruption of some flavor. In short, it was a gross oversimplification.
Now let's try something: apply the change directly to the entropy pool. (Consider that a power hit is pretty much indirect.) To oversimplify further, it would be not unlike adding a random frob to a Life field.
Perhaps the answer here is within the question (or in fact within the results), but I will leave that as an exercise to the reader.
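For anyone who wants to try that experiment, here's a minimal version: a Game of Life step plus a hypothetical `frob` that flips one random cell in the field (the coordinate range and seed are arbitrary choices of mine):

```python
import random
from collections import Counter

def life_step(cells):
    """One Game of Life generation; cells is a set of live (x, y) pairs."""
    neighbours = Counter((x + dx, y + dy)
                         for (x, y) in cells
                         for dx in (-1, 0, 1)
                         for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
    return {c for c, n in neighbours.items()
            if n == 3 or (n == 2 and c in cells)}

def frob(cells, rng):
    """Flip one random cell -- a direct poke at the field."""
    target = (rng.randint(-5, 5), rng.randint(-5, 5))
    return set(cells) ^ {target}

blinker = {(0, -1), (0, 0), (0, 1)}   # period-2 oscillator: fully predictable
print(life_step(life_step(blinker)) == blinker)   # True

rng = random.Random(1)
poked = frob(blinker, rng)
print(life_step(life_step(poked)) == poked)   # the poke usually breaks the cycle
```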
Reversal of Entropy is Life, not Intelligence (Score:1)
Life does indeed increase entropy of some material, its waste from food. Energy is extracted from food to power the entropy reversals. Growth and organization are the results, although only localized exceptions to the rule.
Cells, multicelled organisms, plants, complex life forms, tools, machinery, cities... all are organized, but not necessarily intelligent.
A simple hydraulic ram pump [maintenanceresources.com] is a good example. It lets water fall through, slams a valve shut, and the inertia of the water moving forward pushes some water up through a one-way valve, pumping that water to a higher level. Much more water is wasted by letting it fall (so that the water gets moving fast enough for its inertia to have enough force) than is pumped upward. The falling water has to be made to drop to its lower state in order to move the lesser amount to a higher energy state.
Oh, yeah... the ram pump is reversing entropy on the raised water, but it's hard to consider it alive. It is wearing out and rusting, and is only reversing entropy in one way. If it is part of a city with a maintenance crew (human or robot), then it becomes harder to separate from the qualities of a life form. Proceed with comparisons to a molecular pump in the wall of a cell...