Hmmm...
While running SingularityNET, thinking about next-generation OpenCog, and helping Ruiting with our charming little maniac Qorxi take up most of my time, I can’t help thinking here and there about quantum AI …
Quantum computing is moving toward practical realization — it’s still got a long way to go, but clearly Schrödinger’s cat is out of the bag … the time when every server has a QPU alongside its GPU is now something quite concrete to foresee…
So I’m thinking a bit about how to use quantum partial search (Grover's algorithm on a chunked database) to speed up backward-chaining logical inference dramatically.
Suppose we are searching some set S for those x in S that satisfy property P. (The interesting case is where S is known implicitly rather than explicitly listed.)
Suppose we have some distribution f over S, which assigns a probability value f(x) to each element x of S — interpretable as the prior probability that x will satisfy P.
Suppose we divide S into bins S1, S2, …, Sk, so that the expected number of x that satisfy P is the same for each Si (in which case the bins containing higher-probability x will have smaller cardinality) …
Then we can use quantum partial search to find a bin that contains x that satisfies P.
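To make the binning step concrete, here’s a minimal Python sketch under made-up assumptions (the name make_bins, the toy Dirichlet prior, and treating f(x) directly as expected hit mass are all purely illustrative):

```python
# Minimal sketch of the equal-expected-hit binning described above.
# Items are sorted by prior probability and greedily grouped so each bin
# carries roughly the same total prior mass, i.e. the same expected number
# of x satisfying P -- so bins of high-probability items come out smaller.

import numpy as np

def make_bins(f, k):
    """Split items 0..N-1 into k bins of (roughly) equal total prior mass f."""
    order = np.argsort(-f)              # highest-probability items first
    target = f.sum() / k                # prior mass each bin should carry
    bins, current, mass = [], [], 0.0
    for x in order:
        current.append(int(x))
        mass += f[x]
        if mass >= target and len(bins) < k - 1:
            bins.append(current)
            current, mass = [], 0.0
    bins.append(current)
    return bins

# Toy prior: a few items are far more likely to satisfy P than the rest
f = np.random.dirichlet(0.1 * np.ones(1024))
bins = make_bins(f, k=16)
print(sorted(len(b) for b in bins))     # the small bins hold the likely items
```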
If the size of S is N and the number of items per bin is a constant b, then the time required is (pi/4) sqrt(N/b). The time required increases with the unevenness of the bins (which, in this setup, means non-uniformity of the distribution f).
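Plugging in some made-up numbers just to get a feel for that formula:

```python
# Illustrative arithmetic only: locating the right bin vs. the exact item,
# using the (pi/4)*sqrt(N/b) count above and (pi/4)*sqrt(N) for full Grover.
import math
N, b = 10**6, 100
print(math.pi / 4 * math.sqrt(N / b))   # ~78.5 oracle calls to find the bin
print(math.pi / 4 * math.sqrt(N))       # ~785.4 to pin down the exact item
```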
In an inference context, for instance, suppose one has a desired conclusion C and n premises Pi. One wants to know for which combinations (i,j) we have Pi * Pj ==> C. One then constructs an N = n^2 dimensional Hilbert space, which has a basis vector corresponding to each combination (i,j). One call to the quantum oracle can tell us whether Pi * Pj ==> C for a particular (i,j) (note though that this call must be implementable as a unitary transformation on the Hilbert space — but following the standard math of quantum circuits it can be set up this way).
Using straight Grover’s algorithm, one can then find which Pi * Pj ==> C in sqrt(N) time.
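Here’s a toy classical simulation of that setup, just to watch the amplitudes do their thing. The oracle is faked as an ordinary Python predicate (on real hardware it would have to be compiled into a phase-flip unitary, as noted above), and the function name grover_pairs is purely illustrative:

```python
# Toy classical simulation of Grover search over the N = n^2 premise pairs.
# implies(i, j) is a stand-in oracle marking the pairs with Pi * Pj ==> C.

import numpy as np

def grover_pairs(n, implies, num_iters=None):
    """Find a pair (i, j) with implies(i, j) True among the n^2 pairs."""
    N = n * n
    marked = np.array([implies(i, j) for i in range(n) for j in range(n)])
    M = int(marked.sum())
    if M == 0:
        return None, 0.0
    if num_iters is None:
        # standard Grover iteration count, roughly (pi/4) * sqrt(N/M)
        num_iters = int(np.floor((np.pi / 4) * np.sqrt(N / M)))

    psi = np.full(N, 1.0 / np.sqrt(N))   # uniform superposition over pairs
    for _ in range(num_iters):
        psi[marked] *= -1.0              # oracle: phase-flip the marked pairs
        psi = 2.0 * psi.mean() - psi     # diffusion: reflect about the mean
    probs = psi ** 2
    best = int(np.argmax(probs))
    return divmod(best, n), float(probs[best])

# Toy run: 32 premises, with (7, 19) the only pair implying the conclusion
pair, p = grover_pairs(32, lambda i, j: (i, j) == (7, 19))
print(pair, round(p, 3))                 # -> (7, 19) with probability near 1
```

With n = 32 that’s N = 1024 pairs, and the marked pair is found in about 25 Grover iterations rather than the ~N oracle calls a worst-case classical scan would need.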
If one wants to leverage the prior distribution, one can find which bin(s) the premise-pairs satisfying Pi * Pj ==> C live in, in time (pi/4) sqrt(c*N/b), where c>1 is a correction for the non-uniformity of the prior and b is the average number of pairs per bin.
With a uniform prior, one is finding log(N/b) bits of information about what the premises are (and narrowing down to a search over b items).
With a non-uniform prior, one is still narrowing down *on average* to a search over b items (since the bins carry equal shares of the prior mass, under that prior the bin containing the answer has expected size (1/k)(|S1| + … + |Sk|) = N/k = b), so one is still finding log(N/b) bits on average about where the items are.
This could be useful e.g. in a hybrid classical-quantum context, where the quantum computer is used to narrow down a very large number of options to a more modest number, which are then searched through using classical methods.
It could also be useful as a heuristic layer on top of Grover’s algorithm. I.e., one could do this prior-probability-guided search to narrow things down to a bin, and then do full-on Grover’s algorithm within the bin selected.
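Schematically, that two-stage flow might be wired up like the sketch below. The two quantum routines here (quantum_partial_search, grover_search) are hypothetical placeholder interfaces, faked classically just so the control flow runs end to end:

```python
# Schematic of the hybrid flow: a quantum step narrows the search to one bin,
# then either full Grover within the bin or a plain classical scan finishes.
# The "quantum" functions are classical fakes standing in for real hardware.

import numpy as np

def quantum_partial_search(bins, is_target):
    """Placeholder for prior-guided quantum partial search over the bins."""
    for idx, b in enumerate(bins):
        if any(is_target(x) for x in b):
            return idx
    return None

def grover_search(items, is_target):
    """Placeholder for full Grover's algorithm restricted to one bin."""
    return next((x for x in items if is_target(x)), None)

def classical_scan(items, is_target):
    """Plain classical search through the now-modest candidate set."""
    return next((x for x in items if is_target(x)), None)

def hybrid_search(bins, is_target, use_grover_inside_bin=True):
    bin_idx = quantum_partial_search(bins, is_target)    # narrow to a bin
    if bin_idx is None:
        return None
    finish = grover_search if use_grover_inside_bin else classical_scan
    return finish(bins[bin_idx], is_target)              # finish inside it

# Toy run: 1024 items split into 16 bins, with 777 the sole target
bins = np.split(np.arange(1024), 16)
print(hybrid_search(bins, lambda x: x == 777))           # -> 777
```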
Constructing the bins in an artful way, so that e.g. each bin tends to contain similar entities, could potentially make things work even faster. If the elements in a bin are similar to each other, the bin may effectively span a lower-dimensional subspace, on which the algorithm will run faster. So there would be an advantage to clustering the items being searched before constructing the bins: if items clustered together also tend to have similar prior probabilities, the resulting bins would tend to be lower-dimensional and the search would tend to go faster.
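For what it’s worth, here is one way (among many) the cluster-then-bin idea could look, with k-means over some made-up feature vectors standing in for whatever similarity structure the real items have (sklearn, the random features, and the random prior are assumptions purely for illustration):

```python
# Illustrative only: cluster items by similarity first, then carve
# equal-prior-mass bins *within* each cluster, so each bin holds items
# that are similar to one another (and hence, hopefully, span a
# lower-dimensional chunk of the space).

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 8))             # made-up feature vectors
f = rng.dirichlet(0.1 * np.ones(1024))     # made-up prior over the items

labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

bins = []
for c in range(8):
    members = np.where(labels == c)[0]
    members = members[np.argsort(-f[members])]   # high prior first
    target = f[members].sum() / 4                # aim for ~4 bins per cluster
    current, mass = [], 0.0
    for x in members:
        current.append(int(x))
        mass += f[x]
        if mass >= target:
            bins.append(current)
            current, mass = [], 0.0
    if current:
        bins.append(current)

print(len(bins), sorted(len(b) for b in bins))   # similar items share bins
```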
Grover’s Algorithm and Natural Gradients
Now if we want to go even deeper down the rabbit hole — this funky paper shows that the quantum search problem reduces to finding minimal-length geodesic paths on a manifold of pure density matrices, with a metric structure defined by the Wigner-Yanase metric tensor …
Fisher metric geeks will simultaneously drop their jaws in amazement, and nod and grin in a self-satisfied way
So what we see here is that Grover’s algorithm is actually just following the natural gradient ... well sort of…
Putting some pieces together … We have seen that partial quantum search (Grover’s algorithm over a chunked database) can be set up to provide rapid (on average) approximate location of an item in an implicit database, where the average is taken relative to a given probability distribution (and the distribution is used to guide the chunking of the database)….
Well then — this partial quantum search on a database chunked-according-to-a-certain-distribution, should presumably correspond to following the natural gradient on a manifold of pure density matrices with a metric structure conditioned by that same distribution…
Which — if it actually holds up — is not really all that deep, just connecting some (quantum) dots, but sorta points in a nice quantum AI direction…
Post-Script: Wow, This Stuff May Be Implementable?
I was amazed/amused to note some small-scale practical implementations of Grover’s Algorithm using Orbital Angular Momentum…
It’s all classical optics except preparation of the initial state (which is where the Oracle gets packed).
Could this be how our quantum-accelerated logical inference control is going to work? Quantum optics plugins for the server … or the cortex?