Regarding the thermal component of entropy

This thread started with a mundane question — “What thermal conditions need to be maintained for PQ128MS generators, and which are preferable?” — but it leads to some deep theoretical questions I need help figuring out.

The datasheet doesn’t give such info; the generator’s frame acts as a passive heat sink and the generators only get slightly warm. However, I became increasingly curious about how active cooling would impact the device’s performance.

I have read the whitepaper, which gives insight into the principles of operation, and have a number of points I want to confirm:

  1. The paper doesn’t cover the PQ128MS but the PQ32MU. My understanding is that the former is a ‘linear scaling’ of the latter, with an increased number of modules but the same sampling frequencies and ratios, and, as a result, the same figures for entropy per bit. Is that correct?

  2. Basically, thermal conditions impact shot noise (which has a significant quantum component) far less than classical noises (thermal, etc.); therefore, cooling should increase the ratio of the quantum to the classical entropy component. However, I first need to quantify that ratio.

  3. From the discussion: “because no algorithm can distinguish or separately compress the quantum entropy, these algorithms do not change the ratio of quantum entropy to other types of entropy.” This passage about Shannon entropy is easily understood, but it is only a limitation applicable to “informational” post-processing of the resulting bits, am I correct?
    Around the same year as this paper, it was proven theoretically possible to cyclically enrich entropy, and a protocol for “quantum laundry” was developed (“Full randomness from arbitrarily deterministic events” | Nature Communications; “Free randomness can be amplified” | Nature Physics) to any desired level up to 1 - E, where the number of cycles/amount of input entropy grows as E decreases.

  4. Talking about components: the statistical meaning of entropy stems from the multiplicity of states (W/Omega). A combination of entropy components would therefore correspond to a product of multiplicities Wq*Wc, where Wc is the classical and Wq the quantum multiplicity. From Boltzmann’s entropy formula, we end up with Htot = Kb*ln(Wq*Wc) = Kb*ln(Wq) + Kb*ln(Wc), or simply a sum: Htot = Hc + Hq = 1 - E
    I guess this is correct; there is a similar calculation in your article at the end of page 6.

  5. In the section named “DETERMINING CHAOTIC ENTROPY”
    there is a passage: “The final output of the generator is the result of XORing the 3 Level-three” …

The theoretical entropy at the final output is Hc = 1 - E, E = 4.7*10^(-1399)
To me this looks like an error. It should be one of the following:

  1. Hc is actually Htotal; then it makes sense, and Hclassic = 1 - Hq - E = 1 - 0.9994 - 4.7*10^(-1399) =~ 0.0006
    In other words, the ratio between the quantum and classical components is about 1500, and the influence of thermal conditions on the outputs should be almost negligible (<0.1%) while the CMOS is within operational parameters, which would mean a range from about -60 to 200 C.
  2. Hchaotic includes a quantum component… basically the same as option 1.
  3. Hc is actually 1 - E, making the quantum entropy component less than 4.7*10^(-1399)! But that would make the generator classical…
    Please elaborate on this.
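To make the arithmetic in option 1 concrete, here is a quick check in Python under the linear-additivity assumption (Hq = 0.9994 and E = 4.7*10^(-1399) are the figures quoted above; whether the assumption itself holds is exactly the open question):

```python
# Sketch: option 1 arithmetic under the (disputed) linear-additivity assumption.
H_q = 0.9994         # quantum entropy per bit, as quoted from the whitepaper
E = 4.7e-1399        # underflows to 0.0 in IEEE-754 doubles, i.e. negligible

H_total = 1.0 - E            # option 1: reported Hc taken as the total entropy
H_classic = H_total - H_q    # classical remainder under additivity
ratio = H_q / H_classic      # quantum-to-classical ratio

print(H_classic)     # ~0.0006
print(ratio)         # roughly 1.7e3, in the ballpark of the "about 1500" above
```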

“Calculations done for chaotic entropy are unrelated to pseudo-entropy, and calculations done for quantum entropy are unrelated to both chaotic entropy and pseudo-entropy.”
I think I am not entirely understanding what chaotic entropy is comprised of, and how to compare quantum and classical influences based on it.

From a reply in Caution High Entropy Zone I now understand that quantum entropy and chaotic entropy are not a linear composite, so how do I calculate the “classical” part of the entropy?

Your question about thermal conditions has a simple answer: the temperature of the FPGA chip (actually the temperature at the wafer level) has no meaningful effect on the performance of the generator. I directly tested over a range of -40 to 120 deg. C and observed the raw sampled entropy. There was no change. Even so, the amount of entropy used to produce an output is massively more than is needed, and deviations from “perfect randomness” could never be detected.
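For a sense of scale, the thermal (Johnson–Nyquist) noise discussed below can be estimated directly. A minimal sketch over the tested temperature range, using an illustrative 10 kΩ source resistance and 10 MHz bandwidth (assumed values, not datasheet figures):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_rms(T_kelvin, R_ohm, bandwidth_hz):
    """RMS thermal noise voltage of a resistor: v = sqrt(4*k*T*R*B)."""
    return math.sqrt(4 * k_B * T_kelvin * R_ohm * bandwidth_hz)

# Endpoints of the tested range, converted to kelvin
v_cold = johnson_rms(233.15, 10e3, 10e6)  # -40 C, tens of microvolts
v_hot  = johnson_rms(393.15, 10e3, 10e6)  # 120 C

print(v_hot / v_cold)  # amplitude grows only as sqrt(T): about 1.30x
```

Even across a 160-degree swing, the thermal noise amplitude changes by only ~30%, consistent with the observation that the raw sampled entropy showed no measurable change.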

  1. Correct. The PQ128MS uses a larger number of entropy sources to produce each output bit, and runs at a higher bit rate.
  2. Temperature has no direct impact on shot noise. Temperature does affect “thermal” noise, which is proportional to temperature on the Kelvin scale. However, temperature does not affect the quality of the output random bits, unless you mean, theoretically, in the hundredth or higher decimal digit. This would never be a measurable difference. Alternatively, if your circuit is operated at absolute zero, the remaining entropy would be virtually all quantum. However, this is a very hard way to make a quantum source, and most real ICs and circuit elements will not work at that temperature.
  3. No, it is not possible to selectively increase the amount of quantum entropy in a system where it is mixed with chaotic entropy. I don’t know of anyone else who has studied and developed the mathematical theory of mixed entropy sources as I have. The total entropy can be increased; however, that never means the quantum entropy has been increased to that level. Perfect randomness does not and cannot exist in a real, physical world. It is possible to increase entropy by a number of mathematical operations at the expense of more input bits for each output bit. The increase can produce theoretical entropy as close to 1.0 as desired, but never to exactly 1.0.
  4. The entropy equation is not linear, as your equation assumes. There are ways to calculate a total, but it is a little misleading to say an entropy source has so much quantum and so much chaotic entropy when they are mixed together in the entropy source. They are never separable after the measurement.
  5. Hc means “combined entropy.” That is, the final entropy after combining a number of entropy sources. Again, don’t consider the entropy equation to be linear so one type can just be subtracted for the total, even theoretically. The three level-three outputs are from three independent generator circuits. This was done for extreme redundancy and reliability.
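Points 3–5 above can be illustrated with the simplest combiner, XOR of independent biased bits: biases multiply (the piling-up lemma), so entropies do not add linearly, and each extra input bit pushes the output entropy closer to — but never exactly to — 1.0. A minimal sketch, with an assumed per-bit bias chosen only for illustration:

```python
import math

def h2(p):
    """Binary (Shannon) entropy of a bit with P(1) = p."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# One source bit with predictability 0.55, i.e. bias e = 0.05 (assumed)
e = 0.05
print(h2(0.5 + e))  # entropy of a single source bit, ~0.9928

# XOR of n independent such bits: the residual bias is 0.5*(2e)^n
# (piling-up lemma), so entropy rises toward 1.0 but never reaches it.
for n in (1, 2, 3, 4):
    bias = 0.5 * (2 * e) ** n
    print(n, h2(0.5 + bias))
```

Note the cost structure the reply describes: each step toward 1.0 consumes more input bits per output bit, and the quantum fraction of the mix is not selectively amplified by the operation.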

There is more elaboration in the message I just left concerning combined entropy sources and predictability (in Caution High Entropy Zone). I can add that the amplitude of a mixed entropy source is the square root of the sum of the squares of the individual amplitudes. However, the separate amplitudes are theoretical since they cannot be measured separately.

What is important is the predictability that arises from each entropy component. From predictability, relative predictability is calculated, and from relative predictability the predictability can be estimated for the combination of a number of measurements of entropy sources. The mathematical inverse of that combined predictability (using my adaptation of the entropy-as-a-function-of-predictability equation) provides the final entropy for each component.

While studying effect size versus energy in MMI generators, I came to understand at an even deeper level how to calculate the predictability of a single small component in the presence of a larger one. This is the case of an entropy source containing both quantum and chaotic entropy. In an MMI generator, the mental influence is a small component combined with the much larger component of the random signal of the generator’s entropy source. That source can be quantum or chaotic. In a microelectronic chaotic source there is always a quantum component.
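Two pieces of the description above are standard and can be sketched directly: the quadrature (root-sum-of-squares) combination of amplitudes, and the textbook relation between a bit’s predictability and its Shannon entropy. The amplitudes here are illustrative numbers, and the relative-predictability step is the author’s own method, which is not reproduced:

```python
import math

# Quadrature combination of two independent noise amplitudes (RMS).
# Illustrative values, not measurements of any real device.
a_quantum = 1.0   # e.g. shot-noise amplitude, arbitrary units
a_chaotic = 0.5   # e.g. thermal/chaotic amplitude
a_total = math.sqrt(a_quantum**2 + a_chaotic**2)
print(a_total)    # sqrt(1.25) ~ 1.118

# Standard relation between predictability p (probability of correctly
# guessing a bit) and its Shannon entropy.
def entropy_from_predictability(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(entropy_from_predictability(0.5))    # 1.0: completely unpredictable
print(entropy_from_predictability(0.75))   # ~0.811
```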