Closed: joshuaolson1 closed this issue 6 years ago.
When writing the following, I found out that the 1.x readmes had the wrong 18!
(so I finally published 2.x):
```js
f=(n,a=[],b=-1)=>{
  if(n<=2)return[a,b+n] // [array of n!'s odd 'factors', (2^)x remainder]
  for(i=n%2?n--:n-1;i>1;i-=2)a.push(i)
  return f(n/=2,a,b+n) // recurse on n/2; b accumulates the exponent of 2
}
a=f(18)
n=1<<a[1]
a[0].forEach(a=>n*=a)
console.log(n,a)
```
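A quick sanity check (`f` restated so the snippet runs standalone): multiplying the odd factors back together with the power of 2 reproduces 18! exactly, and 18! = 6402373705728000 is below 2^53, so plain doubles stay exact.

```javascript
// Standalone restatement of f (note the recursive call must be f itself).
const f = (n, a = [], b = -1) => {
  if (n <= 2) return [a, b + n] // [odd factors of n!, exponent of 2]
  for (let i = n % 2 ? n-- : n - 1; i > 1; i -= 2) a.push(i)
  return f(n /= 2, a, b + n)
}
// Reassemble: 2^b times the product of the odd factors equals 18!.
const [odds, twos] = f(18)
let prod = Math.pow(2, twos)
for (const o of odds) prod *= o
console.log(prod) // 6402373705728000, i.e. 18! (exact, since 18! < 2^53)
```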
Shortcuts:

- `b` is n - (n's popcount of set (1) bits)
- `a.length` is n - floor(lg n) - 1 (n minus n's bit length), but:

From now on, assume we instead split `a` into increasingly short (decreasingly long?) subarrays, one per 3[, 5...] run, so:
- `a[i].length` is floor((n - 2^i)/2^(i+1))
- the `i` in `a[i]` to use for an n (and how much of `b` to use) is ntz(n) (consume a[i]*2^(ntz(n)) b-bits of entropy)
- if popcount(n) is 1 (n is a power of 2), then no `a[i]` (1 instead)
- the `j` in `a[i][j]` (assuming `a[i][0]` is 3) is floor((n - 2^i)/2^(i+1)) - 1 (but unnecessary with a cursor/finger optimization)
- `a[i][j]` (entropy range) is j*2 + 3
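Both shortcut formulas can be spot-checked against `f` directly. A hedged standalone sketch (`popcount` and `bitLength` are helper names introduced here, not from the repo): `b` is the exponent of 2 in n! (Legendre's formula, n - popcount(n)), and `a.length` is n minus n's bit length, i.e. n - floor(lg n) - 1.

```javascript
// f restated so the check runs on its own.
const f = (n, a = [], b = -1) => {
  if (n <= 2) return [a, b + n]
  for (let i = n % 2 ? n-- : n - 1; i > 1; i -= 2) a.push(i)
  return f(n /= 2, a, b + n)
}
const popcount = x => { let c = 0; for (; x; x >>>= 1) c += x & 1; return c }
const bitLength = x => 32 - Math.clz32(x) // floor(lg x) + 1 for x > 0
for (const n of [6, 8, 18, 100]) {
  const [a, b] = f(n)
  // Expect: b === n - popcount(n) and a.length === n - bitLength(n).
  console.log(n, b === n - popcount(n), a.length === n - bitLength(n))
}
```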
popcount/Hamming weight and ntz/lsb helpers to sift through:
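A minimal sketch of such helpers for 32-bit nonnegative integers (these are assumed implementations; `Math.clz32` is standard ES6):

```javascript
// popcount: Hamming weight (number of set bits).
const popcount = x => { let c = 0; for (; x; x >>>= 1) c += x & 1; return c }
// ntz: number of trailing zeros = index of the least significant set bit.
// x & -x isolates the lsb; 31 - clz32 converts it to a bit index.
const ntz = x => x === 0 ? 32 : 31 - Math.clz32(x & -x)
console.log(popcount(18), ntz(18)) // 2 1  (18 = 0b10010)
```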
I'm not yet sure which order has naturally better chunking:

- reusable order (3, 5, 7...)
- adjacent order (3, 5, 3, 7, 9, 5, 11, 3...)

The following ES7 shows that in both cases, with a simple chunk-search strategy and even with bignums, chunking near a power of 2 stays close above a power of 2 itself; it only shows the worst cases, not how often bad ones are encountered:
```js
I=require('./node_modules/jsbn').BigInteger
z=(s,r)=>new I(s+'',r)
N=530 // to see an undiagnosed bug, make bigger like 5300; observe a[1].toString() trailing zeros
f=(i,a)=>{
  B=i=>{ // odd part of i: strip all factors of 2
    for(;i%2<1;)i/=2
    return z(i)
  }
  if(a){ // adjacent order: consecutive odd parts, skipping pure powers of 2
    n=B(i)
    if(n.intValue()<2)n=B(++i)
  }
  else n=z(i) // reusable order: 3, 5, 7, ...
  // Grow the product n until it exceeds N bits, tracking the prefix whose
  // value, shifted up to exactly N bits, is largest.
  for(m=z(0);;n=n.multiply(a?B(++i):z(i+=2))){
    l=n.bitLength()
    if(l>N)return[j,m]
    M=n.multiply(z(2).pow(N-n.bitLength()))
    if(M.compareTo(m)>0){
      m=M
      j=i
    }
  }
}
r=A=>{ // scan run starts, printing each new worst case (smallest best chunk)
  p=z(2).pow(N)
  for(i=3;;i=2-A+a[0]){
    a=f(i,A)
    if(a[1].compareTo(p)<0){
      console.log(i,[a[0],(s=a[1].toString(2),z(s.substring(0,s.lastIndexOf('1')+1),2).toString()),a[1].toString()])
      p=a[1]
    }
  }
}
//r(0) // reusable order
//r(1) // adjacent order
```
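For intuition, the adjacent order is just each successive integer from 3 up with its factors of 2 stripped (the odd part, as the helper `B` above computes), with the 1s from powers of 2 dropped. A standalone sketch (`odd` is a name introduced here):

```javascript
// Odd part of i: divide out all factors of 2.
const odd = i => { for (; i % 2 === 0;) i /= 2; return i }
const seq = []
for (let i = 3; seq.length < 8; i++) {
  const o = odd(i)
  if (o > 1) seq.push(o) // skip the 1s contributed by powers of 2
}
console.log(seq.join(', ')) // 3, 5, 3, 7, 9, 5, 11, 3
```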
Next to try:

I have too little confidence that this improves over actual numeric computing.

- Create an alternative `ntop` (and associated entropy estimator) that can easily pull from arrays (not like `a`) of `2^x`, `[21*23*25 /*< 2**53*/]`, etc.
- Note to self: see if any (odd-number double) factorial identities in https://arxiv.org/abs/0906.1317 could help; or see:
- Other identities?
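One hedged reading of the bracketed `21*23*25` idea: greedily pack runs of odd factors into products that stay below 2**53, so each chunk remains exact as a double (`pack` is a name introduced here for illustration; `ntop` itself is not sketched):

```javascript
// Greedily group factors into products below limit (default 2**53).
const pack = (factors, limit = 2 ** 53) => {
  const chunks = []
  let prod = 1
  for (const f of factors) {
    if (prod * f >= limit) { chunks.push(prod); prod = 1 }
    prod *= f
  }
  if (prod > 1) chunks.push(prod)
  return chunks
}
console.log(pack([21, 23, 25])) // [ 12075 ]  (21*23*25, well below 2**53)
```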