
🧵 Untitled Thread

Anonymous No. 16280194

Is there any practical difference between time and entropy?
Aren't they basically the same thing?

Anonymous No. 16281553

Entropy is a human-made concept, born of our general inability to cope with microscale nondeterminism. Kicked into place, it occasionally produces the right numbers, but it remains a mystery at large for humankind, not to mention the blabbering scientists.

Inb4 the definitions of entropy, the second law, irreversibility and stochasticity are so retarded and useless that I sometimes wonder how the corresponding theoretical fields haven't fallen apart from a lack of personnel without Down syndrome.

Anonymous No. 16281625

Well, entropy is an emergent quantity and requires a statistical aspect to exist, but time exists as a notion on both the micro and macro scale. So while there are certainly ways to link entropy to "the arrow of time", claiming that they are "the same thing" simply doesn't hold, sorry.

Furthermore, entropy is such an important concept within statistical physics that it gets used in ways that aren't linked to time at all - for example black hole entropy, or the Sorkin-Johnston vacuum state.

Source: am PhD student in theoretical physics/trust me bro

Anonymous No. 16281626

2 more entropies

Anonymous No. 16281627

>>16281625
>but time exists as a notion both on micro- and macro scale
proof?

Anonymous No. 16281635

>>16281627
Micro scale: pick one particle (it doesn't have to be a fancy-schmancy quantum particle, it can literally be a speck of dust) and model its motion through space as a function of time. There you go: time as a parameter of dynamics, existing at the scale of a single micro-state.
Macro: take your fridge, unplug it and leave it open. The temperature inside will rise toward room temperature over time, so time again parametrises a dynamical change. That change, however, is on a macro scale, because temperature only exists as an emergent/macro quantity.

Hence, both micro and macro examples of time existing. Since entropy is a purely emergent quantity, entropy and time cannot be "the same", as the latter also applies to micro systems.

(You can do these "experiments" at home, but I hope you don't need to.)
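The micro-scale point can be sketched in a few lines: a single classical particle whose position is a function of time alone, with no ensemble and no statistics anywhere. The numbers (initial position, velocity) are made up for illustration.

```python
def position(t: float, x0: float = 0.0, v: float = 1e-3) -> float:
    """Position of a free particle at time t, moving at constant velocity v.

    Time enters as a bare parameter of the dynamics of ONE micro-state;
    no ensemble is needed, so no entropy is defined anywhere here.
    """
    return x0 + v * t

# Time parametrises the motion of a single speck of dust:
trajectory = [position(t) for t in range(5)]  # positions at t = 0..4
```

Nothing statistical appears, which is exactly why time survives at the micro scale where entropy does not.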

Anonymous No. 16281644

>>16281627
We use time in our best models at every layer of complexity, from physics to sociology.
If OP is discussing physical entropy, then >>16281625 must be right.
If OP is talking about the information-theoretic version of entropy, then it might apply at every layer of complexity, but it's not measuring time, because you can simulate entropy decreasing or increasing over time.
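The information-theoretic point is easy to make concrete: Shannon entropy is a function of a probability distribution only, with no time variable anywhere in its definition. A minimal sketch (the example distributions are made up):

```python
import math

def shannon_entropy(p: list[float]) -> float:
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Entropy is a property of a distribution, not of a clock:
uniform = [0.25] * 4                # maximal uncertainty over 4 outcomes
peaked = [0.97, 0.01, 0.01, 0.01]   # nearly certain outcome

# A simulation is free to step from either distribution to the other,
# so this entropy can rise or fall as simulated "time" advances.
```

Since a program can move between `uniform` and `peaked` in either direction, this entropy clearly isn't a measure of time.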

Anonymous No. 16281648

>>16281644
I don't understand your point about information theory. Please elaborate

Anonymous No. 16281655

>>16281553
Completely false, you are retarded. “Nondeterminism” is not an experimental fact but a deliberate theoretical choice. See pilot wave theory, which is a totally deterministic and valid interpretation of QM, and which does not make entropy disappear. Entropy is an automatic consequence of the structure of our reality, analogous to how primes are an automatic consequence of having numbers at all (curiously, primes are local reversals of entropy, although entropy still prevails in the end due to the prime number theorem). You can say entropy doesn’t exist, per se, similar to how you can say the non-trivial “zeroes” of the Riemann zeta function (which encode the distribution of the primes) only exist in a negative sense. Meaning that the word “entropy”, in a positive sense, is attempting to point negatively at a negative thing (the fact that you can’t capture the spaghetti spilling out of your pockets in all directions without spilling more spaghetti in the process).

Anonymous No. 16281656

>>16281635
and how do you prove that either of those are independent of entropy?

Anonymous No. 16281671

>>16281656
Entropy (by definition) requires an ensemble of microstates to exist. In other words, for entropy to mean anything you need more than one particle. This justifies why the microstate is entropy-independent: by definition, entropy can't exist for a single-particle state (at least not in a meaningful way).
As for my macro example, I explicitly chose one where entropy _did_ matter - temperature and entropy go hand in hand. My point was not that time and entropy have nothing to do with each other, but that there exist cases where the notions of time and entropy come apart. Hence (in a general sense): time ≠ entropy.

No one is saying that there aren't links between entropy and time - there certainly are - but flat-out claiming that "time = entropy" is like saying "temperature = velocity". Sure, the two are linked, but they aren't the same. They don't even have the same units.
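The ensemble requirement can be stated as a one-liner via the Boltzmann formula S = k_B ln Ω, where Ω counts accessible microstates. A minimal sketch (the 100-coin ensemble is an illustrative toy, not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(Omega) for Omega accessible microstates."""
    return K_B * math.log(omega)

# One particle pinned to one known microstate: Omega = 1, so S = 0.
single_state = boltzmann_entropy(1)

# An ensemble of 100 two-state "coins": Omega = 2^100, S > 0.
coin_ensemble = boltzmann_entropy(2 ** 100)
```

With Ω = 1 the entropy vanishes identically, which is the precise sense in which a single known microstate carries no entropy while still evolving in time.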

[Image: EntropyDecreaseOv....jpg, 2133x827]

Anonymous No. 16281727

>>16281648
I was interpreting OP's "entropy" in a different sense, to see if the idea that time and entropy are "basically the same thing" could make sense using the information-theoretic version of entropy.
I don't think entropy applies at the computationally irreducible level without considering emergence. But I don't know whether computation is the most fundamental thing either.
Either way, you can simulate a program where entropy changes arbitrarily as time moves forward. Pic related shows a distribution moving further away from uniform - entropy decreasing - as time moves forward. The information-theoretic version of entropy isn't measuring time here, since it can change completely independently of it.
Through a theoretical computer science lens, we can have entropy without any time variable at all, so there are cases where a type of entropy applies but time does not.
Entropy needs emergence to exist, and time needs physical reality to exist. That is a difference between them.
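A simulation of the kind described above can be sketched in a few lines: a toy update rule (made up here, not necessarily the one behind the pic) that sharpens a distribution each "time step", so Shannon entropy strictly falls as simulated time advances.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def sharpen(p, alpha=2.0):
    """One 'time step': raise each probability to the power alpha and
    renormalise. For alpha > 1 this concentrates mass on likelier
    outcomes, pushing the distribution away from uniform."""
    w = [pi ** alpha for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

p = [0.4, 0.3, 0.2, 0.1]
entropies = []
for _ in range(5):              # simulated "time" moving forward
    entropies.append(shannon_entropy(p))
    p = sharpen(p)
# entropies is strictly decreasing: entropy falls while time advances.
```

Swapping `sharpen` for a smoothing rule would make entropy rise instead, which is the whole point: the time direction of the simulation doesn't fix the direction of this entropy.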

Anonymous No. 16282059

>>16281655
>automatic consequence of the structure of our reality

Let me see, where have I heard things like this... Ah yes, the 'automatic' spacetime response to the presence of mass - yet again we see a concept postulated due to a clear lack of effort to develop a proper theory. The entropy concept is just an umbrella definition for several things we do not understand. As for physical entropy - see all of ergodic theory, which really is a very basic thing but requires a lot of writing due to the human inability to grasp the concept as a whole. Another problem is the blind macroscale applicability of the entropy concept, which is utterly stupid, but no one except cranks actually argues against it.

Some more ranting on the mix of information theory and physics: when I first encountered Landauer's principle as an undergrad approximately two decades ago, I was in awe. Today I think the assumptions are plainly wrong, as simple combinatorics cannot tie *discrete* stuff to kT, because derivations from the entropy concept have been verified only on the "stochastic" scale. A quick google revealed a very funny paper where people actually employ the same stochastic, i.e. non-discrete, measurement to verify the same principle, which is beyond stupid.
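For reference, the kT coupling being ranted about is Landauer's bound itself: erasing one bit costs at least k_B·T·ln 2 of dissipated heat. A minimal sketch of the arithmetic (standard formula, illustrative temperature):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum heat dissipated when erasing one bit: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the bound is roughly 2.9e-21 J per bit.
room_temp_cost = landauer_bound(300.0)
```

This is the quantity the principle ties to each *discrete* bit, which is precisely the coupling the post above objects to.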

For pure information theory, the entropy definition is quite good while still being a vague description of a 'thing in this way', same as Kolmogorov complexity. Just because there is human logic at the base of the theory, instead of the dark and infinite pool of real-world physics, the hand-waving suddenly appears sane.

Anonymous No. 16282067

>>16280194
time is fundamental. entropy is an artificial, made-up concept. they're totally different. please go back and do your homework.