Entropy.

19 May 2017 19:41 - 19 May 2017 19:44 #284612 by
Replied by on topic Entropy.

Magnus Staar wrote:

Kyrin Wyldstar wrote: Brought to you by the Kamikeedi Temple.


Tell me more about this. Where do I find it? Google doesn't seem to know (or it may, but won't tell me).


Google will not help you here... :P


https://www.templeofthejediorder.org/forum/47-Journals/118237-kamikeedi-temple-of-the-resurrected-order-of-the-je-daii


05 Jun 2017 22:20 #286585 by
Replied by on topic Entropy.
I agree with Kyrin and Gisteron on almost everything: entropy is a physical quantity, related to (and, according to modern statistical physics, a measure of) the amount of information in a system. So, the more entropy you have, the more information you'll find in a system.
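For reference (standard textbook formulas, not something from the original post), the statistical definition that links the two pictures is

\[
S = -k_B \sum_i p_i \ln p_i , \qquad H = -\sum_i p_i \log_2 p_i ,
\]

where the \( p_i \) are the probabilities of the system's microstates; up to the constant \( k_B \) and the choice of logarithm, the thermodynamic entropy \( S \) and Shannon's information-theoretic entropy \( H \) are the same quantity.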

This is not trivial, as a candidate to head the Unified Particle Theory is related to entropy: all systems evolve spontaneously to more entropic states of matter. I studied physics for four years and it worked like a charm no matter the subject: cinematic, mechanics, thermodinamics, optic, astrophysics, quantum physics... Entropy is everywhere.

Reading entropy as chaos is related to the 19th century Science paradigm: they thought that there had to be a force that made some measures impossible, and entropy seemed a nice candidate (no kidding, it actually happened that way).

On the other hand, chaos is related to solving equations: an equation's results are chaotic if a small change in the initial values yields a totally different (sometimes unexpected) result. In physics (both classical and modern) all equations are more or less chaotic, if you dig in a little.

For example, imagine a well-known physical system: the Solar System, with all its planets and satellites. Newton's law of universal gravitation is very well defined (anomalies in it are what first pointed to dark matter, for example), and planetary motion is accurately described and measured. Bad news: the current motion is, but the future motion isn't. If you iterate the model a couple of thousand years, then start again after changing the distance between the Sun and the Earth by 1 m (for example)... Well, the first model says we will be having a great autumn on 5 October 4017, the other says we'll have a not-so-pleasant spring. Both models are 'right', but a small measurement error gets us unexpected results.

Now imagine a very chaotic system, like weather forecasting: an error of less than 0.001 ºC (or ºF) can change everything, sometimes predicting just the opposite of what will actually happen.
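As a minimal illustration of that sensitivity (my own sketch, using the classic Lorenz toy model of atmospheric convection rather than a real forecast model), two runs that differ by a millionth in one starting value end up in completely different places:

```python
import numpy as np

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (a toy model of convection)."""
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

def run(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = np.array([1.0, 1.0, 20.0])        # reference initial condition
b = a + np.array([1e-6, 0.0, 0.0])    # perturbed by one millionth in x

# After 40 "time units" the two trajectories sit on completely different parts
# of the attractor, even though they started almost identically.
print(run(a, 40000))
print(run(b, 40000))
```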

What those two principles mean together is the 20th century Science paradigm: Science tries to be deterministic, but it isn't, since we can't get an exact analytic/numerical solution for a complex system. We can get a system's evolution path, but we can't determine it perfectly (at least, not far into the future).

OK, let me recap: entropy is the mechanism, the rule of thumb for physics. Chaos is more of a mathematical notion, deeply tied to the evolution of the Universe and, therefore, to entropy.

So (and this is my personal interpretation), when I say 'there is no chaos, there is harmony', I mean that when something happens and you don't understand it, check for Nature's laws, and that senseless situation will be explained by itself, as Nature is always harmonious. Chaos is the veil we use when we don't understand something.

Of course, you have to find the set of laws governing what you observe, and that can be a challenge: there's no sense in using quantum physics to explain Trump's election, for example (Hanlon's razor would be more appropriate, yet imprecise). /end_humor

With all this in mind, I think entropy tells me everything will change, sooner or later. The Force tells me Nature's laws will govern that change, and I'll have to find them. Chaos tells me that, after some time, all my current predictions will be wrong.

In a few words, understanding the world grants calm, as you see it for what it is: a complex system, unpredictable, but explainable. I think entropy and chaos together grant this.


05 Jun 2017 23:07 - 05 Jun 2017 23:07 #286596 by JamesSand
Replied by JamesSand on topic Entropy.

What Tach (and Gist and others) explained


I like your answers, they're probably accurate and certainly sound sciency, but you're forgetting the essential element of storytelling.


You only need 5% truth

:laugh:


06 Jun 2017 14:16 #286679 by
Replied by on topic Entropy.

JamesSand wrote:

What Tach (and Gist and others) explained


I like your answers, they're probably accurate and certainly sound sciency, but you're forgetting the essential element of storytelling.


You only need 5% truth

:laugh:


LOL I'm relegated to an "other"? There is a difference between myth and legend, you know. :P


06 Jun 2017 17:53 #286736 by Gisteron
Replied by Gisteron on topic Entropy.

tach980 wrote: ... the more entropy you have, the more information you'll find in a system.

Surely, you mean the opposite. The more entropy a system has, the less discrete it is, the less specific our statements about it can be.

This is not trivial, as a candidate to head the Unified Particle Theory is related to entropy: all systems evolve spontaneously to more entropic states of matter.

If there is a spontaneous change in the system, i.e. if the system is unstable in that way, entropy will increase with that change. A system at equilibrium will not spontaneously generate more entropy, because it is by definition in a state of maximal entropy.
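(A minimal sketch of why equilibrium and maximal entropy coincide, assuming only the textbook two-box toy model rather than anything specific to this thread: count the microstates for N particles split between two boxes and see where the count peaks.)

```python
from math import comb, log

N = 100  # particles shared between two connected boxes

# Boltzmann entropy (in units of k_B) of the macrostate "n particles in the left box":
# S(n) = ln W(n), where W(n) = C(N, n) is the number of microstates realising it.
S = {n: log(comb(N, n)) for n in range(N + 1)}

print(max(S, key=S.get))   # 50 -- the even split, i.e. the equilibrium macrostate
print(S[50], S[95])        # ~66.8 vs ~18.1: equilibrium is the state of maximal entropy
```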

I studied physics for four years and it worked like a charm no matter the subject:

And yet you spell kinematics with a c, thermodynamics with an i, optics without the s and mention at least three subjects (kinematics, mechanics, and optics) where entropy considerations aren't made at all?

Reading entropy as chaos is related to the 19th century Science paradigm: they thought that there had to be a force that made some measures impossible, and entropy seemed a nice candidate (no kidding, it actually happened that way).

Well, I'm no historian of science, so I wouldn't know about what they thought in the 19th century. However, meanwhile, in the 21st, entropy is not a force, nor does it make measurements (because measures are something else entirely) impossible. Maybe back then they called something else by that name, but nowadays that's just not what we mean by entropy.

On the other hand, chaos is related to solving equations: an equation's results are chaotic if a small change in the initial values yields a totally different (sometimes unexpected) result. In physics (both classical and modern) all equations are more or less chaotic, if you dig in a little.

For example, imagine a well-known physical system: the Solar System, with all its planets and satellites. Newton's law of universal gravitation is very well defined (anomalies in it are what first pointed to dark matter, for example), and planetary motion is accurately described and measured. Bad news: the current motion is, but the future motion isn't. If you iterate the model a couple of thousand years, then start again after changing the distance between the Sun and the Earth by 1 m (for example)... Well, the first model says we will be having a great autumn on 5 October 4017, the other says we'll have a not-so-pleasant spring. Both models are 'right', but a small measurement error gets us unexpected results.

This has nothing to do with entropy. I'm also not sure if gravity is a particularly good example. Gravity (GMm/|r|²) is Lipschitz continuous in all relevant locations because mass is finite (in particular, not infinite) and distance is finite (in particular, not 0). There is thus a conservation of mechanical energy (being conservative in this sense is equivalent to being the gradient of a potential, which Newton's gravity happens to be) and of angular momentum (see Kepler's second law). Were gravity chaotic, course corrections of space probes would be impossible. If by small we mean small compared to the range of the quantity concerned, then small changes in momentum result in small changes to the predicted path. On the other hand, small changes to parameters of weather (temperature, pressure, humidity, wind direction and intensity, locally AND globally), as you rightly point out, can in short amounts of time have large effects on each other, large meaning orders of magnitude greater, as a fraction of the range, than the input change; even slight margins of error in measurement make accurate predictions tantamount to luck.
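(To make the contrast concrete, a small sketch of my own, in the spirit of the 1 m example quoted above: a two-body Sun-Earth orbit integrated twice, once with the starting distance shifted by 1 m. The gap between the two runs stays tiny and grows gently rather than exponentially.)

```python
import numpy as np

G, M = 6.674e-11, 1.989e30        # gravitational constant, solar mass (SI units)
AU, YEAR = 1.496e11, 3.156e7      # astronomical unit [m], year [s]

def orbit(r0, v0, years, dt=3600.0):
    """Leapfrog-integrate a test body around a fixed Sun; return the final position."""
    r, v = np.array(r0, dtype=float), np.array(v0, dtype=float)
    a = -G * M * r / np.linalg.norm(r) ** 3
    for _ in range(int(years * YEAR / dt)):
        v += 0.5 * dt * a                        # half kick
        r += dt * v                              # drift
        a = -G * M * r / np.linalg.norm(r) ** 3
        v += 0.5 * dt * a                        # half kick
    return r

v_circ = np.sqrt(G * M / AU)                     # circular orbital speed at 1 AU
r_ref = orbit([AU, 0.0], [0.0, v_circ], years=10)
r_off = orbit([AU + 1.0, 0.0], [0.0, v_circ], years=10)  # start 1 m farther out

# The 1 m offset only drifts slowly (via a slightly different orbital period),
# amounting to a few hundred metres at most after ten years -- not exponential divergence.
print(np.linalg.norm(r_ref - r_off))
```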

What those two principles mean together is the 20th century Science paradigm: Science tries to be deterministic, but it isn't, since we can't get an exact analytic/numerical solution for a complex system. We can get a system's evolution path, but we can't determine it perfectly (at least, not far into the future).

The one interpretation of quantum mechanics that promised to retain determinism is the one that was rejected rather thoroughly at the time (1927), to the point where now, despite renewed interest, it is mostly relegated to obscurity, at least where physics departments are concerned. However, even in a world that is properly deterministic, it does not follow that science would make or be able to make perfectly accurate predictions. By contraposition, the fact that our science cannot does not imply that our world is non-deterministic. That follows only under the assumption that if a description of a system uses the formalisms of probability theory, the system itself therefore involves genuine randomness. Whether this is so I am happy to leave to philosophers. In science we are happy to say that our models have known and finite margins of error. There is no need to assert that this tells us any truth about the world, and I for one would find it arrogant if we did assert any such thing.

OK, let me recap: entropy is the mechanism, the rule of thumb for physics. Chaos is more of a mathematical notion, deeply tied to the evolution of the Universe and, therefore, to entropy.

Entropy is actually a quantity used in probability theory, information theory, game theory, and by extension computer science. Chaos, on the other hand, is a (vague) measure of the "roughness" of a phase space portrait. If anything, chaos would be the physical notion (although far from a rule of thumb for it) and entropy would at best be both.

... when something happens and you don't understand it, check for Nature's [sic] laws, and that senseless situation will be explained by itself, as Nature [sic] is always harmonious. Chaos is the veil we use when we don't understand something.

Equivocation. Earlier you described chaos as a measure of how chaotic, i.e. how unpredictable a system is. It is, in this sense, a description of the phase space. Even if hypothetically we could know exactly where in it the current state of the system is, that would change nothing about the shape of the portrait. In fact, given that our models are only ever approximate, even perfect knowledge wouldn't grant us perfect predictive power for long periods of time. The chaos you referred to earlier is not a measure or a cover for our ignorance and thus not the chaos you are talking about in this passage.

Better to leave questions unanswered than answers unquestioned


06 Jun 2017 23:41 #286786 by
Replied by on topic Entropy.
OK, I didn't expect such a long and precise answer. I'll do my best to rise to the challenge :)

First of all, I wanted to stay as simple and informative as I could. Gisteron, from what you said, I sense you have above-average knowledge of science, but I was writing for others who may not have our background.

I have a friend who once said: 'For Science, being a light in the darkness of human ignorance is one of the darkest human arts', and I agree with that. Even the simplest science is incomprehensible if not properly explained, and I find chaos and entropy to be two of the least understandable scientific concepts for the general public.

So, let’s start again.

Gisteron wrote:

tach980 wrote: ... the more entropy you have, the more information you'll find in a system.

Surely, you mean the opposite. The more entropy a system has, the less discrete it is, the less specific our statements about it can be.


Well, I meant exactly what I wrote, at least in thermodynamics. I didn’t want to introduce state functions, since I don’t find them easy to understand. But, as a state function, that is exactly what it means: a measure of the information contained by the system.

Be aware that I’m not talking about what kind of information, or if that information is recoverable/comprehensible. It’s just information, nothing more, nothing less.

Gisteron wrote:

tach980 wrote: This is not trivial, as a candidate to head the Unified Particle Theory is related to entropy: all systems evolve spontaneously to more entropic states of matter.

If there is a spontaneous change in the system, i.e. if the system is unstable in that way, entropy will increase with that change. A system at equilibrium will not spontaneously generate more entropy, because it is by definition in a state of maximal entropy.


Like Schrödinger's cat, yes and no at the same time. It's true that if a system evolves, it does so in such a way that the final state is more entropic. But I think I unwittingly misled you: there are other candidates to head the UPT, and they will share that 'leading place'. One of those is the Minimum Free Energy Principle, which states that all systems evolve spontaneously toward states of locally minimum free energy.

As a fun fact, this principle runs counter to the Maximum Entropy Principle but, together, those two principles explain equilibrium states (e.g. the Gibbs equation for batteries).
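In textbook form (my addition here, just the standard constant-temperature, constant-pressure case), the two tendencies are folded into the Gibbs free energy,

\[
G = H - TS, \qquad dG = dH - T\,dS \le 0 \quad \text{for spontaneous change at constant } T \text{ and } P,
\]

so a system trades lowering its enthalpy against raising its entropy, and equilibrium is reached when \( dG = 0 \); for a battery the same function fixes the cell potential through \( \Delta G = -nF E_{\text{cell}} \).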

Gisteron wrote:

tach980 wrote: I studied physics for four years and it worked like a charm no matter the subject:

And yet you spell kinematics with a c, thermodynamics with an i, optics without the s and mention at least three subjects (kinematics, mechanics, and optics) where entropy considerations aren't made at all?


OK, you got me. I didn't do my physics degree... in English. I'm not English, Australian or from the USA, so those misspellings, although a failing, don't mean I'm wrong; they just mean I don't completely master English as a language.



That reason is a very weak argument, though. Let's look at other, more scientific reasons.

First, I'll put kinematics and mechanics together and call that group 'classical mechanics', as they are sub-subjects usually studied separately. In kinematics you study motion itself; then, in mechanics, you study what causes that motion and try to derive the equations that govern it.

There are two basic approaches to deriving equations of motion in mechanics: Newtonian equations and the energy approach (i.e. the Euler-Lagrange equations, the Hamiltonian formalism). To define energy you have to define the work done by a force. Conservative forces are then defined, and non-conservative forces (such as friction) follow immediately as a corollary. And if you are using the energy approach, the equations change dramatically, since you have a non-conservative (entropic) term.

So a non-conservative force means entropy is at play, and we trust it with our lives every time we push the brake pedal in our cars.
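The usual textbook way to write that down (my notation, not from the post) is the Euler-Lagrange equation extended with a Rayleigh dissipation function \( \mathcal{F} \) for friction:

\[
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = -\frac{\partial \mathcal{F}}{\partial \dot{q}},
\qquad \mathcal{F} = \tfrac{1}{2}\, c\, \dot{q}^{2},
\]

where the mechanical energy lost per unit time, \( 2\mathcal{F} \), leaves the system as heat, and that heat flow is where the entropy production shows up.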

On the other hand, optics is not only about glasses and lenses: it's about the interaction between matter and electromagnetic waves.

A case that's a bit harder to see: diffraction of a wave. Entropy is not explicitly defined there, but the evolution is entropic, since the energy density decays as the wave diffracts.

OK, if that's not enough: a wave penetrating a non-transparent material and causing the material to heat up. I can't remember right now the name of that effect. I'll check it out, but entropy is there.

Gisteron wrote:

tach980 wrote: Reading entropy as chaos is related to the 19th century Science paradigm: they thought that there had to be a force that made some measures impossible, and entropy seemed a nice candidate (no kidding, it actually happened that way).

Well, I'm no historian of science, so I wouldn't know about what they thought in the 19th century. However, meanwhile, in the 21st, entropy is not a force, nor does it make measurements (because measures are something else entirely) impossible. Maybe back then they called something else by that name, but nowadays that's just not what we mean by entropy.


Just stopping here to say that I'm not a historian of science either, but I did take a course on the history of science, and it was a fun class. It amazed me how 19th century scientists trusted some silly ideas (this is not the wildest).

Gisteron wrote:

tach980 wrote: On the other hand, chaos is related to solving equations: an equation's results are chaotic if a small change in the initial values yields a totally different (sometimes unexpected) result. In physics (both classical and modern) all equations are more or less chaotic, if you dig in a little.

For example, imagine a well-known physical system: the Solar System, with all its planets and satellites. Newton's law of universal gravitation is very well defined (anomalies in it are what first pointed to dark matter, for example), and planetary motion is accurately described and measured. Bad news: the current motion is, but the future motion isn't. If you iterate the model a couple of thousand years, then start again after changing the distance between the Sun and the Earth by 1 m (for example)... Well, the first model says we will be having a great autumn on 5 October 4017, the other says we'll have a not-so-pleasant spring. Both models are 'right', but a small measurement error gets us unexpected results.

This has nothing to do with entropy. I'm also not sure if gravity is a particularly good example. Gravity (GMm/|r|²) is Lipschitz continuous in all relevant locations because mass is finite (in particular, not infinite) and distance is finite (in particular, not 0). There is thus a conservation of mechanical energy (being conservative in this sense is equivalent to being the gradient of a potential, which Newton's gravity happens to be) and of angular momentum (see Kepler's second law). Were gravity chaotic, course corrections of space probes would be impossible. If by small we mean small compared to the range of the quantity concerned, then small changes in momentum result in small changes to the predicted path. On the other hand, small changes to parameters of weather (temperature, pressure, humidity, wind direction and intensity, locally AND globally), as you rightly point out, can in short amounts of time have large effects on each other, large meaning orders of magnitude greater, as a fraction of the range, than the input change; even slight margins of error in measurement make accurate predictions tantamount to luck.


This was about showing how even a well-known law is subject to chaos; just putting it out there with an example.

I disagree with the last part. As chaos theory says, all problems in science are chaotic by definition; how much so depends on the model we use. And I think you actually made my point for me: space probe orbit corrections. If gravity weren't chaotic, orbit corrections would be made only to transfer or retire satellites. That's one of the reasons why all satellites have a lifespan, even in stable orbits (like geosynchronous ones).

Gisteron wrote:

tach980 wrote: What those two principles mean together is the 20th century Science paradigm: Science tries to be deterministic, but it isn't, since we can't get an exact analytic/numerical solution for a complex system. We can get a system's evolution path, but we can't determine it perfectly (at least, not far into the future).

The one interpretation of quantum mechanics that promised to retain determinism is the one that was rejected rather thoroughly at the time (1927), to the point where now, despite renewed interest, it is mostly relegated to obscurity, at least where physics departments are concerned. However, even in a world that is properly deterministic, it does not follow that science would make or be able to make perfectly accurate predictions. By contraposition, the fact that our science cannot does not imply that our world is non-deterministic. That follows only under the assumption that if a description of a system uses the formalisms of probability theory, the system itself therefore involves genuine randomness. Whether this is so I am happy to leave to philosophers. In science we are happy to say that our models have known and finite margins of error. There is no need to assert that this tells us any truth about the world, and I for one would find it arrogant if we did assert any such thing.


I disagree. Science is deterministic by definition: a prediction model tries to be deterministic, but chaos makes prediction impossible beyond a certain point, a point that lies farther in the future the less chaotic the model is. I'm not talking about an interpretation of a specific subject, but about Science as a whole.

Gisteron wrote:

tach980 wrote: OK, let me recap: entropy is the mechanism, the rule of thumb for physics. Chaos is more of a mathematical notion, deeply tied to the evolution of the Universe and, therefore, to entropy.

Entropy is actually a quantity used in probability theory, information theory, game theory, and by extension computer science. Chaos, on the other hand, is a (vague) measure of the "roughness" of a phase space portrait. If anything, chaos would be the physical notion (although far from a rule of thumb for it) and entropy would at best be both.


I'm not sure I understood you. I meant that the rule of thumb is entropy, since a system's evolution tries to maximize it, and that chaos is a notion related to that evolution and, therefore, to entropy.

Gisteron wrote:

tach980 wrote: ... when something happens and you don't understand it, check for Nature's [sic] laws, and that senseless situation will be explained by itself, as Nature [sic] is always harmonious. Chaos is the veil we use when we don't understand something.

Equivocation. Earlier you described chaos as a measure of how chaotic, i.e. how unpredictable a system is. It is, in this sense, a description of the phase space. Even if hypothetically we could know exactly where in it the current state of the system is, that would change nothing about the shape of the portrait. In fact, given that our models are only ever approximate, even perfect knowledge wouldn't grant us perfect predictive power for long periods of time. The chaos you referred to earlier is not a measure or a cover for our ignorance and thus not the chaos you are talking about in this passage.


First, I said 'Nature' for a reason: there are lots of situations with no obvious or simple explanation, especially in the human sciences. Most sociological models are weak, or fail to explain some fairly common situations (look at educational models, for example). So when I say 'check for Nature's laws' I mean 'you'd better start using your eyes and adapting to what is to come'.

Second, when I said 'chaos is a veil' I was referring to the mental state you're in when you don't understand something and contradictory explanations come to mind. I mean it in the sense of a kind of bias, and I thought the context gave that meaning away.

I call that feeling 'the veil of chaos', since the information seems to have no pattern, no sense. It's not so different from a chaotic model: the results don't have to make sense for a given set of initial values, even if the model is right.

So when nothing seems to make sense, look around you and try to find an explanation for what's going on, even if it goes against what you currently believe. We may not know Nature's laws, but that doesn't mean they don't exist, or that they aren't in harmony with one another.

Even if it seems I disagree with Gisteron on almost everything, I actually think it's the opposite. It's nice (and rare) to find someone with clear ideas about physics.


07 Jun 2017 17:18 #286889 by Gisteron
Replied by Gisteron on topic Entropy.
As you rightly mention at the end, I do, too, believe that really we agree on most of this. Of course there is no need to throw down any math gloves before those to whom it's witchcraft. I do however believe that questions about the fundamentals of nature are simple questions, which is why tools as primitive as logic are often so fit to address them, if fed good premises... I digress...
I believe that even when conversing with people far less educated on a subject than ourselves we should make an effort to be precise. Yes, technically one could describe entropy as a measure of how much "information" there is to be had about a system. Shannon's "self-information" is used to refer to exactly the same quantity in information theory as entropy, but then so is "surprisal". When I first saw the formula on a math-tools-for-physicists exercise sheet it was referred to as "ignorance" and suggestively enough called S. Aside from scaling by a constant it was the same entropy I later learned about in thermodynamics, and as I read up about it on the internet, it appears that it is minimal when the probabilities of the microstates are extreme (1 or 0), i.e. when we have certain knowledge of outcomes, and maximal when the probabilities are least distinguishable (something I actually had to prove in that exercise mentioned above). Perhaps I, too, am guilty of over-simplifying earlier, in that I failed to mention the "information content" aspect of entropy. Needless to say, the more of the total picture we cut away for the purposes of simplicity, the more vaguely we choose to express ourselves, the more misunderstandings are bound to happen, not to mention all of the woo we might be inviting along the way... But I digress again.
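(A minimal sketch of that last property, using the Shannon form of the formula rather than the one from my exercise sheet: entropy is zero for a certain outcome and largest for a uniform distribution.)

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), skipping zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0   -- certain outcome: minimal entropy / "ignorance"
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~0.94
print(entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.39 -- uniform: maximal entropy (= ln 4)
```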

So let the takeaway be that for the most part we agree, be it about entropy or about chaos or about the limitations of science in her effort to describe the world. I shall then for now respond to the few points of genuine contention, if I may, which may be as few as one:

tach980 wrote: Science is deterministic by definition: a prediction model tries to be deterministic, but chaos makes prediction impossible beyond a certain point, a point that lies farther in the future the less chaotic the model is. I'm not talking about an interpretation of a specific subject, but about Science as a whole.

Fortunately this disagreement is a philosophical one, not scientific. When we study, or do research, this barely makes any difference. This is perhaps why this can be contentious between us even though we may agree on all actual contents of science at the same time. In my opinion, every scientific proposition is a conditional statement, with a big, big antecedent. We are of course not logicians and many of us are not even mathematicians, so we usually don't go out of our way to explicitly mention all our premises, not least because many of them are shared among just about all the propositions we generate. So yes, if we have something as "deterministic" as Newtonian gravity we can "calculate" an exact value of location or momentum at a point in time to all of the confidence we can put into the values we plug into it. I would however say that all this is conditional on the model being accurate in the first place, which we well know it isn't, much like we know that no other model we have is. And it is for this reason technically a wiser choice to mention that we are only as confident in our "prediction" as we can be confident in the reliability of the model. There is always a margin of error, whether we can determine its numerical value or not; there are always assumptions, always uncertainties. But I believe that there is nothing shameful in that. In my view, the goal of science is not to find truth, but to develop a working model. One that balances effort spent calculating against precision gained and precision required for the task.
This may, in fairness, be the same as recognizing chaos as a limiting factor on just how much we can do, so we may again be in agreement here. I do not, at any rate, believe that science is in the business of telling truths from falsehoods. It is then not the goal of science, in my opinion, to make a true prediction, to be properly deterministic, but rather to make a prediction that is "accurate enough", where "enough" depends wholly on what margins of error we can afford to allow for.

Better to leave questions unanswered than answers unquestioned


07 Jun 2017 18:23 #286899 by
Replied by on topic Entropy.

Gisteron wrote: I believe that even when conversing with people far less educated on a subject than ourselves we should make an effort to be precise.

Agreed but, in my experience, if you are too technical, too precise (an easy feat when talking about science), even those who could understand you will not even try to follow what you say. I find it more useful to explain things less precisely, so that it triggers curiosity about the matter. In my first year of college I had a teacher who taught in a very particular way: first, a concept; immediately after that, a (sometimes funny) example of the concept; then, the theoretical (and boring) explanation. The subject was calculus, and almost everyone passed the finals (at my college, the average pass rate for any subject is about 30-40% tops).

Gisteron wrote:

tach980 wrote: Science is deterministic by definition: a prediction model tries to be deterministic, but chaos makes prediction impossible beyond a certain point, a point that lies farther in the future the less chaotic the model is. I'm not talking about an interpretation of a specific subject, but about Science as a whole.

Fortunately this disagreement is a philosophical one, not scientific.

I strongly disagree with this specific sentence, although I agree with the rest. This is a very personal matter to me, as I see it as foundational to understanding modern science and life. The explanation is very long, and I'd like to discuss it properly, in another thread if you are interested. If I'm in the mood, I'll write about it this weekend.

