Maybe you misread my comment ;)
I'm sure Knuth knows qualitatively what is meant by temperature; it's been used as a measure of randomness for half a century in simulated annealing and other algorithms.
I think you're still misreading my comment (and dragonwriter's and Knuth's): we all know, or can look up, what temperature is in randomized algorithms. However, what temperature 0.7 means is a mystery to me. I know that at temperature 0 the result is deterministic, and that randomness increases at higher temperatures (possibly the probabilities are Boltzmann factors associated with some energy function, but I don't know, and even if they are, I have no idea how it is scaled, i.e. what the value of the Boltzmann constant is). I know that the API accepts values from 0 to 2. I don't know more. Do you?
Yes. I have posted both a very nice link and a complete explanation from ChatGPT 3.5 itself. It's honestly not that complicated, especially for someone who is supposed to have any sort of authoritative view in the field.
I do not feel it is appropriate for you to say you have looked it up if you don’t know what it is besides an API input that affects randomness.
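For concreteness, here is a minimal sketch of the usual softmax-temperature convention, where the logits are divided by T before the softmax (equivalently, the "energy" of a token is its negative logit and the Boltzmann constant is effectively 1, so T = 0.7 sharpens the distribution relative to T = 1). Whether the OpenAI API applies exactly this scaling internally is an assumption; this only illustrates the standard formulation.

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Sample an index from raw logits using softmax temperature.

    Standard convention: p_i is proportional to exp(logit_i / T).
    Whether a given API (e.g. OpenAI's) scales temperature exactly
    this way internally is an assumption.
    """
    if temperature == 0:
        # Degenerate case: greedy, deterministic choice of the max logit.
        return max(range(len(logits)), key=lambda i: logits[i])

    scaled = [l / temperature for l in logits]
    # Subtract the max before exponentiating for numerical stability.
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Draw one index according to the resulting distribution.
    return random.choices(range(len(logits)), weights=probs, k=1)[0]

# Example: T < 1 sharpens the distribution, T > 1 flattens it.
logits = [2.0, 1.0, 0.1]
print(sample_with_temperature(logits, 0.7))
```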