
> The quantities we choose to measure are dictated by our psychological and physiological limitations.

No. The enthalpy changes measured by a calorimeter are not dependent on our psychological limitations.

> The volume of a container is not objectively defined.

Yes, it is, for any reasonable definition of "objective". We know how to measure lengths, and we know how they change when we use different frames of reference, so there is no situation in which a volume is subjective.

> An organism which lives at a faster time scale will see the walls of the container vibrating and oscillating.

This does not matter. We defined a time scale from periodic physical phenomena, and then we know how time changes depending on the frame of reference. There is no subjectivity in this, whatever is doing the measurement has no role in it. Time does not depend on how you feel. It’s Physics, not Psychology.

> This is the same with other thermodynamic quantities.

No, it’s really not. You seem to know just enough vocabulary to be dangerous and I encourage you to read an introductory Physics textbook.



Sometimes I wish HN had merit badges. Or if you like, a device to measure the amount of information contained within a post.


I am not sure it would help, I think it would just enhance groupthink. I like how you need to write something obviously stupid or offensive for the downvotes to have a visible effect and that upvotes have no visible effect at all. (Yes, it changes ranking, but there are other factors). People are less prejudiced when they read the comment than if they see that it is already at -3 or +5.

Yes, it means that some posts should be more (or less) visible than they are but overall I think it’s a good balance.

Besides, I am not that interested in the absolute amount of information in a post. I want information that is relevant to me, and that is very subjective :)


The enthalpy changes measured by a calorimeter are dependent on the design of the calorimeter, which could have been a different piece of equipment. In a sense, that makes it dependent on the definition of enthalpy.

If you introduced a new bit of macro information to the definition of an ensemble, you'd divide the number of microstates by some factor. That's the micro level equivalent of macroscopic entropy being undefined up to an additive constant.

The measurables don't tell you S, they only tell you dS.
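
A quick way to see this numerically, assuming Boltzmann's S = k ln W as the micro-level definition and some made-up microstate counts: dividing W by a constant factor (the effect of adding a bit of macro information) shifts S by an additive constant but leaves every dS untouched.

    import math

    k_B = 1.380649e-23  # J/K, Boltzmann constant

    def S(W):
        """Boltzmann entropy S = k_B ln W for a microstate count W."""
        return k_B * math.log(W)

    W1, W2 = 1e25, 3e25   # hypothetical microstate counts of two macrostates
    factor = 1e3          # extra macro information cuts both counts by this factor

    print(S(W2) - S(W1))                    # dS with the original counts
    print(S(W2 / factor) - S(W1 / factor))  # identical dS with the refined counts
    print(S(W1) - S(W1 / factor))           # the additive shift, k_B * ln(factor)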


> The enthalpy changes measured by a calorimeter are dependent on the design of the calorimeter, which could have been a different piece of equipment.

Right, but that is true of anything. Measuring devices need to be calibrated and maintained properly. It does not make something like a distance subjective, just because someone is measuring it in cm and someone else in km.

> If you introduced a new bit of macro information to the definition of an ensemble, you'd divide the number of microstates by some factor. That's the micro level equivalent of macroscopic entropy being undefined up to an additive constant.

It would change the entropy of your model. An ensemble in statistical Physics is not a physical object. It is a mental construct and a tool to calculate properties. An actual material would have whatever entropy it wants to have regardless of any assumptions we make. You would just find that the entropy of the material would match the entropy of one of the models better than the other one. If you change your mind and re-run the experiment, you’d still find the same entropy. This happens e.g. if we assume that the experiment is at a constant volume while it is actually under constant pressure, or the other way around.
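
As a rough numerical illustration (a sketch with made-up numbers, monatomic ideal gas assumed): the constant-volume and constant-pressure models predict different entropy changes for the same heating, and the measured value simply agrees with whichever model matches how the experiment was actually run.

    import math

    R = 8.314              # J/(mol K)
    n = 1.0                # mol, hypothetical sample
    T1, T2 = 300.0, 350.0  # heating from 300 K to 350 K

    Cv = 1.5 * R           # monatomic ideal gas, constant volume
    Cp = 2.5 * R           # monatomic ideal gas, constant pressure

    dS_const_V = n * Cv * math.log(T2 / T1)  # ~1.92 J/K
    dS_const_P = n * Cp * math.log(T2 / T1)  # ~3.20 J/K

    # The material doesn't care which model we wrote down: the measured entropy
    # change matches the model that describes the actual constraint.
    print(dS_const_V, dS_const_P)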

> In a sense, that makes it dependent on the definition of enthalpy.

Not really. A joule is a joule, a kelvin is a kelvin, and the basic laws of thermodynamics are some of the most well tested in all of science. The entropy of a bunch of atoms is not more dependent on arbitrary definitions than the energy levels of the atoms.

> The measurables don't tell you S, they only tell you dS.

That’s true in itself: the laws of Thermodynamics are invariant if we add a constant term to the entropy. But it does not mean that entropy is subjective: two observers agreeing that the thing they are observing has an entropy of 0 at 0 K will always measure the same entropy under the same conditions. And it does not mean that the actual entropy depends on specific assumptions about the state of the thing.

This is also true of energy, and electromagnetic potentials (and potentials in general). This is unrelated to entropy being something special or subjective.


Entropy isn't subjective, but it is relative to what you can measure about a given system, or what measurements you are analyzing a system with respect to. In quantum mechanics there are degenerate ground states, and in classical mechanics there are cases where the individual objects (such as stars) are visible but averaged over. You should take a look at the Gibbs paradox.


I would say an even more limiting measure of entropy shows that entropy isn't subjective. That is, entropy is the ability to extract useful work from a system by moving the system from a state of lower entropy to a state of higher entropy (in a closed system).

No subjective measure of entropy can allow you to create a perpetual motion machine. The measurements of any two separate closed systems could be arbitrary, but when said systems are compared with each other, units and measurements standardize.


> entropy is the ability to extract useful work from a system

https://www.damtp.cam.ac.uk/user/tong/statphys/jaynes.pdf

The amount of useful work that we can extract from any system depends - obviously and necessarily - on how much “subjective” information we have about its microstate, because that tells us which interactions will extract energy and which will not; this is not a paradox, but a platitude. If the entropy we ascribe to a macrostate did not represent some kind of human information about the underlying microstates, it could not perform its thermodynamic function of determining the amount of work that can be extracted reproducibly from that macrostate. […] the rules of thermodynamics are valid and correctly describe the measurements that it is possible to make by manipulating the macro variables within the set that we have chosen to use. This useful versatility - a direct result of and illustration of the “anthropomorphic” nature of entropy - would not be apparent to, and perhaps not believed by, someone who thought that entropy was, like energy, a physical property of the microstate.

Edit: I’ve just noticed that the article discussed links to this paper, and quotes the first sentence above, and details the whifnium example given in the […] above.


I think this is a very weird thought experiment, and one that is either missing a lot of context or mostly just wrong. In particular, if two people had access to the same gas tank, and one knew that there were two types of argon and the other didn't, so they would compute different entropies, one of them would still be wrong about how much work could be extracted out of the system.

If whifnium did exist, but it was completely unobtainable, then both physicists would still not be able to extract any work out of the system. If the one that didn't know about whifnium was given some amount of it without being told what it was, and instructed in how to use it, they would still see the same amount of work being done with it as the one who did know. They would just find out that they were wrong in their calculation of the entropy of the system, even if they still didn't know how or why.

And of course, this also proves that the system had the same entropy even before humanity existed, and so the knowledge of the existence of whifnium is irrelevant to the entropy of the system. It of course affects our measurement of that entropy, and it affects the amount of work we can get that system to perform, but it changes nothing about the system itself and its entropy (unless of course you tautologically define entropy as the amount of work the experimenter/humanity can extract from the system).


> unless of course you tautologically define entropy as the amount of work the experimenter/humanity can extract from the system

Do you have a different definition? (By the way, the entropy is the energy that _cannot_ be extracted as work.) The entropy of the system is a function of the particular choice of state variables - those that the experimenters can manipulate and use to extract work. It’s not a property of the system on its own. There is no “true” entropy any more than there is a “true” set of state variables for the “true” macroscopic (incomplete) description of the system. If there were a true limiting value for the entropy it would be zero - corresponding to the case where the system is described using every microscopic variable and every degree of freedom could be manipulated.


Yes, the more regular statistical mechanics theory of entropy doesn't make any reference to an observer, and definitely not to a human observer. In that definition, the entropy of a thermodynamic system is proportional to the logarithm of the number of microstates (positions, types, momentum, etc. of individual particles) that would lead to the same macrostate (temperature, volume, pressure, etc.). It doesn't matter if an observer is aware of any of this, it's an objective property of the system.
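
To make that counting concrete, here is a toy sketch (hypothetical two-state "particles", nothing specific to a real gas): the macrostate is just how many particles are excited, and the entropy is k_B times the log of how many microscopic arrangements realize it.

    import math

    k_B = 1.380649e-23  # J/K

    def entropy_of_macrostate(N, n_excited):
        """S = k_B ln W, with W the number of arrangements of N two-state
        particles that have exactly n_excited of them excited."""
        W = math.comb(N, n_excited)
        return k_B * math.log(W)

    N = 100
    for n_excited in (0, 10, 50):
        print(n_excited, entropy_of_macrostate(N, n_excited))

    # The half-excited macrostate is realized by vastly more microstates and so
    # has the highest entropy, whether or not anyone is looking at the system.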

Now sure, you could choose to describe a gas (or any other system) in other terms and compute a different value for entropy with the same generalized definition. But you will not get different results from this operation - the second law of thermodynamics will still apply, and your system will be just as able or unable to produce work regardless of how you choose to represent it. You won't get better efficiency out of an engine by choosing to measure something other than the temperature/volume/pressure of the gases involved, for example.

Even if you described the system in terms of its specific microstate, and thus by the definition above your computed entropy would be the minimum possible, you still wouldn't be able to do anything that a more regular model couldn't do. Maxwell's demon is not a physically possible being/machine.


> the entropy of a thermodynamic system is proportional to the logarithm of the number of microstates (positions, types, momentum, etc. of individual particles) that would lead to the same macrostate (temperature, volume, pressure, etc.). It doesn't matter if an observer is aware of any of this, it's an objective property of the system.

The meaning of "would lead to the same macrostate" (and therefore the entropy) is not an "objective" property of the system (positions, types, momentum, etc. of individual particles). At least not in the way that the energy is an "objective" property of the system.

The entropy is an "objective" property of the pair formed by the system (which can be described by a microstate) and some particular way of defining macrostates for that system.

That's what people mean when they say that the entropy is not an "objective" property of a physical system: that it depends on how we choose to describe that physical system (and that description is external to the physical system itself).

Of course, if you define "system" as "the underlying microscopical system plus this thermodynamical system description that takes into account some derived state variables only" the situation is not the same as if you define "system" as "the underlying microscopical system alone".


> That's what people mean when they say that the entropy is not an "objective" property of a physical system: that it depends on how we choose to describe that physical system (and that description is external to the physical system itself).

I understand that's what they mean, but this is the part that I think is either trivial or wrong. That is, depending on your choice you'll of course get different values, but it won't change anything about the system. It's basically like choosing to measure speed in meters per second or in furlongs per fortnight, or choosing the coordinate system and reference frame: you get radically different values, but relative results are always the same.

If a system has high entropy in the traditional sense, and another one has lower entropy, and the difference is high enough that you can run an engine by transferring heat from one to the other, then this difference and this fact will remain true whatever valid choice you make for how you describe the system's macrostates. This is the sense in which the entropy is an objective, observer-independent property of the system itself: same as energy, position, momentum, and anything else we care to measure.


> I understand that's what they mean, but this is the part that I think is either trivial or wrong. That is, depending on your choice you'll of course get different values, but it won't change anything about the system. It's basically like choosing to measure speed in meters per second or in furlongs per fortnight, or choosing the coordinate system and reference frame: you get radically different values, but relative results are always the same.

I would agree that it's trivial but then it's equally trivial that it's not just like a change of coordinates.

Say that you choose to represent the macrostate of a volume of gas using either (a) its pressure or (b) the partial pressures of the helium and argon that make it up. If you put together two volumes of the same mixture the entropy won't change. The entropy after they mix is just the sum of the entropies before mixing.

However when you put together one volume of helium and one volume of argon, the entropy calculated under choice (a) doesn't change but the entropy calculated under choice (b) does increase. We're not calculating the same thing in different units: we're calculating different things. There is no change of units that makes a quantity change and also remain constant!

The (a)-entropy and the (b)-entropy are different things. Of course it's the same concept applied to two different situations but that doesn't mean it's the same thing. (Otherwise one could also say that the momentum of a particle doesn't depend on its mass or velocity because it's always the same concept applied in different situations.)
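
For concreteness, a small sketch of the two calculations (ideal entropy of mixing, one hypothetical mole of each gas at the same temperature and pressure): description (a) only sees "gas" and reports no change, while description (b) tracks the two species and picks up the mixing entropy.

    import math

    R = 8.314  # J/(mol K)

    def mixing_entropy(moles):
        """Ideal entropy of mixing for a dict of species -> moles, counting only
        the species that the chosen description distinguishes."""
        n_total = sum(moles.values())
        return R * sum(n * math.log(n_total / n) for n in moles.values() if n > 0)

    # Choice (a): only the total amount of "gas" is a state variable.
    print(mixing_entropy({"gas": 2.0}))            # 0.0 J/K

    # Choice (b): helium and argon tracked separately (1 mol each, hypothetical).
    print(mixing_entropy({"He": 1.0, "Ar": 1.0}))  # ~11.5 J/K

    # Same physical act of removing the partition; the computed entropy change
    # depends on which macroscopic variables the description keeps track of.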


> However when you put together one volume of helium and one volume of argon, the entropy calculated under choice (a) doesn't change but the entropy calculated under choice (b) does increase. We're not calculating the same thing in different units: we're calculating different things. There is no change of units that makes a quantity change and also remain constant!

Agreed, this is not like a coordinate transform at all. But the difference from a coordinate transform is that they are not both equally valid choices for describing the physical phenomenon. Choice (a) is simply wrong: it will not accurately predict how certain experiments with the combined gas will behave.


It will predict how other certain experiments with the combined gas will behave. That's what people mean when they say that the entropy is not an "objective" property of a physical system: that it depends on how we choose to describe that physical system - and what experiments we can perform acting on that description.


What would be an example of such an experiment?

By my understanding, even if we have no idea what gas we have, if we put it into a calorimeter and measure the amount of heat we need to transfer to it to change its temperature to some value, we will get a value that will be different for a gas made up of only argon versus one that contains both neon and argon. Doesn't this show that there is some objective definition of the entropy of the gas that doesn't care about an observer's knowledge of it?


Actually the molar heat capacity for neon, or argon, or a mixture thereof, is the same. These are monatomic ideal gases as far as your calorimeter measurements can see.

If the number of particles is the same you’ll need the same heat to increase the temperature by some amount and the entropy increase will be the same. Of course you could do other things to find out what it is, like weighing the container or reading the label.


No, they are not. The entropy of an ideal monatomic gas depends on the mass of its atoms (see the Sackur–Tetrode equation). And a gas mix is not an ideal monatomic gas; its entropy increases at the same temperature and volume compared to an equal volume divided between the two gases.

Also, entropy is not the same thing as heat capacity. It's true that I didn't describe the entropy measurement process very well, so I may have been ambiguous, but they are not the same quantity.
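
To see the mass dependence in the Sackur–Tetrode equation mentioned above, here is a sketch (assumed conditions: 1 mol of gas at 300 K and 1 atm): neon and argon get different absolute entropies, while any entropy change at fixed composition is blind to that offset.

    import math

    k_B = 1.380649e-23    # J/K
    h   = 6.62607015e-34  # J s
    N_A = 6.02214076e23   # 1/mol
    R   = k_B * N_A

    def sackur_tetrode(molar_mass_kg, T, P, n_mol=1.0):
        """Absolute entropy of a monatomic ideal gas (Sackur-Tetrode)."""
        m = molar_mass_kg / N_A                         # mass of one atom
        N = n_mol * N_A
        V = n_mol * R * T / P                           # ideal-gas volume
        lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
        return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

    T, P = 300.0, 101325.0
    print(sackur_tetrode(20.18e-3, T, P))  # neon,  ~146 J/K per mole
    print(sackur_tetrode(39.95e-3, T, P))  # argon, ~155 J/K per mole

    # The gap is (3/2) R ln(m_Ar / m_Ne) per mole, an additive offset that cancels
    # in any entropy *change* at fixed composition.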


I'll leave the discussion here but let me remind you that you talked (indirectly) about changes in entropy and not about absolute entropies: "if we put it into a calorimeter and measure the amount of heat we need to transfer to it to change its temperature to some value".

Note as well that the mass dependence in that equation for the entropy is just an additive term. The absolute value of the entropy may be different but the change in entropy is the same when you heat a 1l container of helium or neon or a mixture of them from 300K to 301K. That's 0.0406 moles of gas. The heat flow is 0.506 joules. The change in entropy is approximately 0.0017 J/K.
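
(Those numbers follow from the ideal-gas relations; a quick sketch, assuming 1 atm and heating at constant volume:)

    import math

    R = 8.314              # J/(mol K)
    P, V = 101325.0, 1e-3  # 1 atm, 1 litre (assumed)
    T1, T2 = 300.0, 301.0

    n  = P * V / (R * T1)            # ~0.0406 mol, same for He, Ne, or a mixture
    Cv = 1.5 * R                     # monatomic ideal gas
    Q  = n * Cv * (T2 - T1)          # ~0.506 J of heat
    dS = n * Cv * math.log(T2 / T1)  # ~0.0017 J/K, independent of composition

    print(n, Q, dS)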

> And a gas mix is not an ideal monatomic gas; its entropy increases at the same temperature and volume compared to an equal volume divided between the two gases.

A mix of ideal gases is an ideal gas and its heat capacity is the weighted average of the heat capacities (trivially equal to the heat capacity of the components when it's the same). The change of entropy when you heat one, or the other, or the mix, will be the same (because you're calculating exactly the same integral of the same heat flow).

The difference in absolute value is irrelevant when we are discussing changes in entropy and measurements of the amount of heat needed to increase the temperature and whether you "will get a value that will be different for a gas made up of only argon versus one that contains both neon and argon".



