Their weighting function was chosen without clear justification.
As I said, it is highly unlikely that such a weighting function has any real meaning, because one bit of the n-th order binary derivative depends on 2^popcount(n) bits of the original sequence, not on n of them.
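To see why: by Lucas' theorem, bit i of the n-th derivative is the XOR of exactly those input bits x_{i+j} where j is a submask of n, which is 2^popcount(n) bits. A minimal sketch verifying this empirically (function names are mine):

```python
def derivative(bits):
    # first binary derivative: XOR of each pair of adjacent bits
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def nth_derivative(bits, n):
    # n-th derivative = first derivative applied n times
    for _ in range(n):
        bits = derivative(bits)
    return bits

def dependence_count(n):
    # how many input bits influence bit 0 of the n-th derivative:
    # flip each input bit in turn and see whether the output changes
    length = n + 1  # n-th derivative of n+1 bits is a single bit
    base = [0] * length
    ref = nth_derivative(base, n)[0]
    count = 0
    for i in range(length):
        flipped = base[:]
        flipped[i] = 1
        if nth_derivative(flipped, n)[0] != ref:
            count += 1
    return count

# the count is 2^popcount(n), not n
for n in range(1, 9):
    assert dependence_count(n) == 2 ** bin(n).count("1")
```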
However, I suspect there might be a weighting function with a reasonable meaning. Possibly the right weighting function would help to calculate the length of the reduced disjunctive normal form of the sequence. So assume that our sequence is A(x) = f(y0…yk), where y0…yk are the bits of the binary representation of x, i.e. x = sum{i=0…k}(yi*2^i). The right weighting function should then help us calculate the length of the minimal function that reproduces our sequence.
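Concretely, a bitstring of length 2^(k+1) is the truth table of a Boolean function of the k+1 index bits. A minimal illustration (`truth_table_fn` is my name for it):

```python
def truth_table_fn(bits):
    # interpret a bitstring of length 2^(k+1) as the truth table of
    # f(y0, ..., yk), where x = sum(yi * 2^i) indexes the string
    nvars = (len(bits) - 1).bit_length()
    assert len(bits) == 2 ** nvars
    def f(*ys):
        x = sum(y << i for i, y in enumerate(ys))
        return int(bits[x])
    return f

f = truth_table_fn("0011111000000000")
assert f(0, 1, 0, 0) == 1  # x = 2, and bit 2 of the string is 1
assert f(1, 1, 1, 1) == 0  # x = 15
```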
I’m trying to test that hypothesis.
Disproved.
0011111000000000
0000001111100000
0000000001111100
These bitstrings have the same bientropy, but different minimal lengths of f in A(x) = f(y0…yk).
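One way to check examples like this is to actually count the terms of a minimal DNF. Below is a brute-force sketch, fine for 4 variables; `min_dnf_terms` is my name, and counting product terms is only one possible notion of "length":

```python
from itertools import combinations, product

def min_dnf_terms(truth):
    # smallest number of product terms whose covered minterms
    # exactly equal the ON-set of the given truth table string
    nvars = (len(truth) - 1).bit_length()
    on = {x for x in range(2 ** nvars) if truth[x] == "1"}
    if not on:
        return 0
    # a term fixes each variable to 0 or 1, or leaves it free (None);
    # keep only implicants, i.e. terms covering ON-set minterms only
    implicants = set()
    for term in product((0, 1, None), repeat=nvars):
        cover = frozenset(
            x for x in range(2 ** nvars)
            if all(v is None or (x >> i) & 1 == v
                   for i, v in enumerate(term)))
        if cover <= on:
            implicants.add(cover)
    # smallest set of implicants covering the whole ON-set
    for size in range(1, len(on) + 1):
        for combo in combinations(implicants, size):
            if frozenset().union(*combo) == on:
                return size
    return len(on)
```

Running it on the three strings above makes the comparison concrete, at least for this particular length measure.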
However, there still might be some connection.
The current idea is to use bi-entropy not only to determine the weights of the bits that the bias amplifier obtains from some fixed-length chunk of data, but also to determine the length of that chunk from the bi-entropy itself. Since the bias does not always appear in the data, and we do not know exactly how many bits were flipped by consciousness or how long the sequence containing them was, we can set the amplification factor dynamically. We simply read the data bit by bit, measuring its entropy along the way, and as soon as the entropy reaches an extremum we feed the accumulated fragment to the amplifier, however long it is. That way only the psi-bits, plus a minimal amount of psi-free data, reach the amplifier. Ideally, such an amplifier could be configured to return almost no data at all in the absence of a signal.
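A sketch of that chunking idea, with heavy assumptions: the names are mine, a plain ones-proportion Shannon entropy stands in for bi-entropy, and the extremum rule is simply "the running entropy changed direction":

```python
from math import log2

def h(p):
    # binary Shannon entropy, with H(0) = H(1) = 0
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def entropy_chunks(bits, min_len=8):
    # grow a chunk bit by bit, tracking the entropy of its
    # ones-proportion; emit the chunk once the running entropy
    # passes a local extremum (changes direction)
    chunk, ones, prev, direction = [], 0, None, 0
    for b in bits:
        chunk.append(b)
        ones += b
        e = h(ones / len(chunk))
        if prev is not None and len(chunk) >= min_len:
            new_dir = (e > prev) - (e < prev)
            if direction and new_dir and new_dir != direction:
                yield chunk  # extremum passed: hand this fragment on
                chunk, ones, prev, direction = [], 0, None, 0
                continue
            if new_dir:
                direction = new_dir
        prev = e
    if chunk:
        yield chunk  # leftover fragment
```

A real version would use bi-entropy and a more robust extremum test; the point is only that chunk boundaries fall out of the entropy curve instead of being fixed in advance.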