GCP and rng correlations

What kinds of things can be done with RNGs connected all over the world? We have the Simons API, I believe Andy is hosting one, newton has one, and we have a few devices we could add to a distributed pool.

Just wondering what kind of information could be gained by distributing RNGs across the world and searching for correlations (or other patterns).

The Global Consciousness Project (GCP) has been around for many years. It is a network of RNGs that continuously collect random bits and send them to a central location for analysis. There the data from the entire network are combined and watched for statistically significant deviations from expected values. It purportedly responded to big events such as the 9/11 attacks, among a few others, with the deviations reportedly beginning in advance of the actual events.

While this has captured the imagination of many people over the years, it is hard to know whether, or what, the results actually mean. The issues are twofold:

  1. Since there is no particular intention associated with the statistical deviations, they can be either positive or negative, that is, an excess of 1s or an excess of 0s.
  2. If the window of analysis is 5 minutes, there are 288 such windows each day. Among them, on average about 14 will be significant at the 5% level and about 3 at the 1% level. Analyzing with a moving window increases these numbers further.
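As a quick sanity check on those counts, here is the arithmetic (a back-of-envelope sketch, not the GCP's actual analysis pipeline):

```python
# Number of non-overlapping 5-minute analysis windows in a day.
windows_per_day = 24 * 60 // 5          # 288

# Expected number of windows flagged purely by chance at each
# significance level, assuming independent windows.
expected_5pct = windows_per_day * 0.05  # about 14 per day
expected_1pct = windows_per_day * 0.01  # about 3 per day

print(windows_per_day, expected_5pct, expected_1pct)
```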

Of the roughly 5,200 statistical events occurring at random each year, how many would fall within a 1-hour window of a significant, that is newsworthy, event? At a glance, there is about a 60% chance of such a chance correlation for any given event. Even if the relative timing window is narrowed to 15 minutes, the probability for any event is still about 15%, or about one in every 7 events.
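Under a simple (assumed) Poisson model for those ~5,200 chance hits, the coincidence probabilities come out in the same ballpark as the figures above:

```python
import math

# ~14 significant 5%-level windows per day, accumulated over a year.
hits_per_year = 288 * 0.05 * 365          # about 5,256
rate_per_hour = hits_per_year / (365 * 24)  # 0.6 hits per hour

def p_coincidence(window_hours):
    """P(at least one chance hit inside the window), Poisson model."""
    return 1 - math.exp(-rate_per_hour * window_hours)

print(round(p_coincidence(1.0), 2))   # 1-hour window: roughly 0.45
print(round(p_coincidence(0.25), 2))  # 15-minute window: roughly 0.14
```

The exact numbers depend on modeling choices (window overlap, which significance level counts as a "hit"), so treat these as order-of-magnitude estimates.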

While this doesn’t prove there is nothing to GCP-type monitoring, it suggests that in any given year many events will be correlated with the GCP data purely by chance.

More important to me is the unintended nature of the results. Could this type of system be used for any practical application? Perhaps, but I believe the RNGs would need to be more responsive, such as MED devices, and the precision and accuracy of the data timestamps would need to be substantially higher. That way the data could be combined closer to bit-by-bit across the array. Analysis like that used in radio astronomy – relative timing and frequency analysis – might then be used to search for more specific signals and possibly their location.
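As a rough illustration of that cross-correlation idea, the sketch below recovers the relative lag between two synthetic streams that share a common signal. The data and parameters here are invented for demonstration, not MED output:

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_lag = 4_000, 37

# Hypothetical shared influence, seen at two "stations" with noise;
# station B receives it delayed by true_lag samples.
common = rng.normal(size=n)
a = common + rng.normal(scale=2.0, size=n)
b = np.roll(common, true_lag) + rng.normal(scale=2.0, size=n)

# Full cross-correlation of the mean-centered streams; the offset of the
# peak estimates the relative delay of B with respect to A.
xc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
est_lag = (n - 1) - int(np.argmax(xc))  # positive: B lags A

print(est_lag)  # should recover true_lag (37)
```

With precise, common timestamps across the array, the same peak-finding trick could in principle be applied pairwise to localize a putative source, which is the radio-interferometry analogy.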

What if we used multiple MEDs from different parts of the world as participants in a majority-vote amplification algorithm?

If there is a real effect, MEDs will be more responsive than what PEAR was using. Still, I’m not sure we would see anything usable beyond a basic demonstration of principle. I developed a specialized version of the MED that provides a continuous analog output of the bias-amplified bits. This can be fed directly into analog circuitry for more advanced signal-processing projects, avoiding the loss of precise timing that comes from sending assembled bytes over USB into a computer.

The GCP project has been taken over by the HeartMath Institute; see their GCP page: HeartMath GCP. I think the exact way they process the data will make some, but not a lot of, difference. From what I know of the project, they have chosen to use REGs based on Zener avalanche noise. My modeling indicates this is the least responsive of any entropy source I examined.

HeartMath has a product that measures heart-rate variability to assess an individual’s relative activation of the sympathetic versus parasympathetic nervous systems, as a way of working toward certain mental and physical states. They also run a widespread monitoring network (magnetometers) to detect variations in the earth’s magnetic field that they think affect how that works, and the addition of the GCP monitors is meant to enhance their marketing.

I’d heard of HeartMath before but didn’t know much beyond the fact that their name sometimes appeared when searching the various terms that pop up in our line of research. I had no idea they’d taken over the GCP project; I just thought Roger Nelson had said the experiment had officially ended but that they were keeping things running.

Just for legacy purposes, I’ll leave a link to a GCP-related project I was working on circa Nov 2020, while I was in quarantine after moving to Singapore:

It’s some hacked-together shell scripts/JavaScript that import GCP data into Google Cloud Platform’s BigQuery (hence the pun gcp-in-gcp). I imported the data so I could use BigQuery to learn some of the basic probability/statistics math often used in this field. I’ve moved on since then, but I imported all their data (from Aug/Sept 1998, I think, up until Nov 2020), so if you’d like access to it, or to my stats-newbie notes, let me know.