After roughly 30 years of intense focus on MMI development, I have to ask, “What is it that keeps the technology from being noticed by mainstream high-tech businesses?”
I have developed substantial intellectual property and theory allowing an increase in effect size (ES) and information rate (the rate of obtaining error-free information) of orders of magnitude beyond what other researchers have reported. Peak ES of about 50% and information rates up to 1 bit/sec have been observed. However, the typical ES for the average user in “real-world” testing is only about 1-2% with some practice, and the information rate is more like 0.01 bits/sec. These levels are high enough for trained users to obtain information of significant value, but such an application would require significant motivation and investment, which has not been forthcoming.
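To make the link between ES and information rate concrete, here is a minimal sketch (my own illustration, not the measurement method used above). It assumes ES is defined so that the hit probability of a binary trial is p = (1 + ES)/2, which is one common convention; under that assumption, the error-free information per trial is the channel capacity 1 - H(p), where H is the binary entropy function:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy of a biased bit, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def info_per_trial(effect_size: float) -> float:
    """Capacity of one trial, assuming hit probability p = (1 + ES) / 2."""
    p = (1 + effect_size) / 2
    return 1.0 - binary_entropy(p)

for es in (0.50, 0.10, 0.01):
    print(f"ES = {es:>4.0%}: {info_per_trial(es):.6f} bits/trial")
```

Near p = 0.5 the capacity grows roughly as ES²/(2 ln 2), so a tenfold increase in effect size buys about a hundredfold increase in information rate at a fixed trial rate. That quadratic scaling is why the gap between 1-2% and 10-20% matters so much.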
My conclusion is that Mind-Enabled technology, or applied MMI systems, are still too hard to use and not dramatic enough to convince more than a handful of already interested followers. So, what is the one thing that could make the technology go viral in a big way? Seemingly, a significant increase in responsivity: the effect size must increase by about an order of magnitude, to 10-20% for the average user in real applications.
Theoretical analysis strongly suggests that no continuous entropy source can be used to achieve the necessary boost. Note: continuous sources are any kind that produce an analog signal that is periodically sampled, such as thermal noise in a resistor or shot noise from a Zener diode. It doesn’t matter whether the source is considered quantum mechanical or classical.
I recently began developing a few different approaches for a “Zero-Energy Switch” random generator; that is, a generator that takes zero, or an exceedingly small amount of, energy to switch a bit to its intended state. All these approaches measure a discrete signal from an entropy source. “Discrete” means each measurement stands alone and is inherently either a 1 or a 0 at the instant it is measured. Two examples of such discrete entropy measurements are the timing of nuclear decays and single-photon detection at the output of a beam splitter.
To be sure, either of these approaches presents a challenging engineering task. I am presently working on the nuclear decay timing method. The first step is to get a usable signal from a decay source. I use a 0.8-1.0 microcurie (µCi) Americium-241 source from a smoke detector. 1 µCi is defined as an activity of 37,000 disintegrations per second. Am-241 primarily emits alpha particles when it disintegrates. Note: an alpha “ray” or particle is a positively charged particle consisting of two protons and two neutrons (a helium-4 nucleus). Because the alpha radiation is emitted randomly in all directions, only about 10,000 of those particles per second can reach the detector. I use a photodiode without an enclosure or package, because alpha particles are blocked by even a sheet of paper. I found it impractical to use a scintillator, which is meant to convert radiation into flashes of light a photodiode can detect: only about 10% of the emitted particles produce enough light from a ZnS scintillator sheet to be detected by the photodiode.
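As a sanity check on that detection rate, here is a back-of-the-envelope sketch. The diode size and spacing below are hypothetical placeholders, not measurements from the actual setup; the point is only that a bare die sitting close to the source intercepts a sizable fraction of the full sphere:

```python
import math

# Illustrative geometry; these dimensions are assumptions, not measured values.
activity_dps = 33_000   # ~0.9 uCi, about 33,000 disintegrations per second
r_mm = 2.0              # radius of an (assumed circular) photodiode die
d_mm = 1.0              # assumed on-axis source-to-diode spacing

# Exact solid angle of a disc viewed on-axis from a point source:
#   omega = 2*pi * (1 - d / sqrt(d^2 + r^2))
omega = 2 * math.pi * (1 - d_mm / math.hypot(d_mm, r_mm))
fraction = omega / (4 * math.pi)  # share of isotropic emission intercepted

print(f"Solid angle: {omega:.2f} sr ({fraction:.1%} of the full sphere)")
print(f"Expected detections: ~{activity_dps * fraction:,.0f} per second")
```

With those assumed dimensions the diode subtends about 28% of the full sphere, giving roughly 9,000 counts per second, consistent with the ~10,000/s figure above.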
The alpha particles produce current pulses of about 1-2 nA in the photodiode, lasting about 1 µs. This tiny pulse must be converted to a voltage and amplified enough to be turned into a logic pulse by a high-speed comparator for subsequent processing. There are two ways I know of to produce a random binary output sequence from these pulses. Each method requires a signal that represents the average time between pulses. In Method 1, the duration from the previous pulse to the current pulse is either less than the average, producing a “1” output, or greater than the average, producing a “0” output. This generates outputs at uneven intervals, at an average rate equal to the detected disintegration rate. Method 2 checks whether there is a detection during each period equal to the average (detected) disintegration time: if yes, output a “1,” else output a “0.” This is perhaps a little simpler to implement, but the statistics of the decay may cause some autocorrelation in the output bits, which is highly undesirable. One goal is to produce an output that needs no statistical correction prior to being used; therefore, the output sequence must have very low bias and autocorrelation.
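To see how the two methods behave statistically, here is a minimal simulation sketch under an idealized Poisson model of the detections. The rate, seed, and sample count are arbitrary, and real hardware effects such as comparator dead time and source drift, the likely causes of the autocorrelation concern, are not modeled:

```python
import math
import random

RATE = 10_000.0      # assumed detected pulses per second (illustrative)
MEAN = 1.0 / RATE    # mean inter-pulse interval
N = 200_000          # pulses to simulate

rng = random.Random(2024)
gaps = [rng.expovariate(RATE) for _ in range(N)]  # exponential intervals

# Method 1: bit = 1 if the inter-pulse interval is below a threshold.
# For exponential gaps, P(gap < MEAN) = 1 - 1/e ~ 0.632, so comparing
# against the mean is biased; the median (MEAN * ln 2) gives P = 0.5.
bits_mean = [1 if g < MEAN else 0 for g in gaps]
bits_median = [1 if g < MEAN * math.log(2) else 0 for g in gaps]

# Method 2: bit = 1 if at least one pulse lands in each window of
# length MEAN. For a Poisson process, P(at least one) is also 1 - 1/e.
arrivals, t = [], 0.0
for g in gaps:
    t += g
    arrivals.append(t)
windows = int(t / MEAN)
bits_window = [0] * windows
for a in arrivals:
    w = int(a / MEAN)
    if w < windows:
        bits_window[w] = 1

def bias(bits):
    """Deviation of the ones fraction from the ideal 0.5."""
    return sum(bits) / len(bits) - 0.5

def lag1_autocorr(bits):
    """Sample autocorrelation between consecutive bits."""
    n = len(bits)
    mu = sum(bits) / n
    var = sum((b - mu) ** 2 for b in bits) / n
    cov = sum((bits[i] - mu) * (bits[i + 1] - mu)
              for i in range(n - 1)) / (n - 1)
    return cov / var

for name, bits in [("Method 1 (mean)", bits_mean),
                   ("Method 1 (median)", bits_median),
                   ("Method 2 (window)", bits_window)]:
    print(f"{name:18s} bias = {bias(bits):+.4f}, "
          f"lag-1 autocorr = {lag1_autocorr(bits):+.4f}")
```

One thing the sketch surfaces: with exponential statistics, comparing against the arithmetic mean makes both methods output ones about 63% of the time, so in practice the comparison time would need to be tuned toward the median interval (mean × ln 2 for Method 1) to hit the 50/50 point without post-correction. In this idealized model the lag-1 autocorrelation of all three bit streams comes out near zero; any autocorrelation in hardware would have to come from the unmodeled effects noted above.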
I will provide updates as the development progresses.