I agree with the other anon. This is a vacuous statement. We are all ignorant of a great many things, and in fact there is no way not to be ignorant of something.

Maybe I haven’t been paying attention, but what happened to Lev’s prizewinning question?

Hmmm … the vocabulary for discussing agnotology is still evolving. So maybe “ignorance-dependent” is a better phrase?

The point being that information theory and complexity theory are confirming that there is much more knowledge out there than individual minds can assimilate (a point insufficiently emphasized in the undergraduate curriculum).

This includes not only an exponentially large body of contingent knowledge (the kind of knowledge that Rutherford famously dismissed in saying “All science is either physics or stamp collecting.”), but also, uniquely to the 21st Century, an exponentially large body of *a priori* knowledge (mathematical theorems), and finally, an exponentially large body of fundamental physical laws (the “landscape”).

Needless to say, this 21st Century informatic challenge to the traditional reductionist picture of science and mathematics raises important new issues and research opportunities in … complexity and information theory!

As a practical example, how much complexity can an open market tolerate before that market no longer provides an enabling platform for individual freedom, social equity, and efficient enterprise?

Most generally, how can the foundations of the Enlightenment survive the onslaught of 21st Century complexity? These foundations being (historically) that “The ideals of equality (sexual and racial), democracy, individual liberty, and a comprehensive toleration [] are concretely superior in terms of reason and moral equity” to the “nightmare world apt to result from enshrining as a new set of privileged and prevailing values ‘difference,’ a thoroughgoing relativism, and the doctrine that all sets of values, even the most questionable kinds of theology, are ultimately equivalent.”

So an emerging central issue of our times (which is really an old issue), namely “whom do you trust with knowledge?”, is acquiring important new aspects and depths from complexity theory and information theory. It is becoming practically equivalent to the question: how will 21st Century knowledge be held, shared, and applied?

And to loop back to the start, perhaps the only human pleasure more comforting, and more universally accessible, than a drink, a smoke, or a cup of coffee in the morning, is a wholly unjustified confidence in one’s own choice of subjects of which to be ignorant. A pleasure that I embrace as fervently as anyone!

But ignorance-dependence, like alcohol-dependence and substance-dependence, is in the final analysis nothing to joke about. A planet with 10 billion people on it cannot afford to “hit bottom” before acknowledging that it has an ignorance-dependence problem.

The problem is this: BPP=BQP doesn’t imply that classical computers can solve Simon’s problem; it implies that they can solve any *explicit instantiation* of Simon’s problem. And we don’t know that many explicit instantiations of Simon or Shor-like problems — basically, we have the ones derived from explicit abelian groups (like factoring and discrete log, as well as elliptic curve problems). And these problems are too specific to derive general complexity conclusions from their tractability. They’re not complete for any reasonable class, and they don’t even seem to be needed for public-key cryptography (as witnessed e.g. by Oded’s system).

I get the sense that if there were a classical algorithm for Simon’s problem (HSP in (Z/2)^n) or Shor’s problem (HSP in Z), then it would be able to “learn too much” in some sense. Can we conclude that if one of these problems is in BPP, which is implied by BQP = BPP, then some other dire thing happens, such as: One-way functions do not exist, or SZK = BPP, etc.?
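To make the distinction concrete, here is a toy sketch (my own illustration, not anything from the thread) of one “explicit instantiation” of Simon’s problem: a randomly built 2-to-1 function f on n-bit strings with the promise f(x) = f(y) iff y = x or y = x XOR s. Classically, recovering s by collision search needs exponentially many oracle queries in the worst case, while the quantum algorithm needs only O(n); the point above is that BPP=BQP would only speak to instances where f has an efficient explicit description.

```python
import random

def make_simon_oracle(n, s):
    """Build a 2-to-1 function f on n-bit strings with f(x) = f(x XOR s)."""
    f = {}
    labels = list(range(2 ** n))
    random.shuffle(labels)
    i = 0
    for x in range(2 ** n):
        if x not in f:
            f[x] = f[x ^ s] = labels[i]  # x and its partner share a label
            i += 1
    return f

def find_s_classically(f, n):
    """Recover s by brute-force collision search: f(x) = f(y) with x != y
    implies y = x XOR s, so x XOR y = s. Exponential in n in the worst case."""
    seen = {}
    for x in range(2 ** n):
        if f[x] in seen:
            return x ^ seen[f[x]]
        seen[f[x]] = x
    return 0  # no collision: f was injective, i.e. s = 0

n, s = 6, 0b101101
oracle = make_simon_oracle(n, s)
assert find_s_classically(oracle, n) == s
```

Here the oracle is a literal lookup table, so it has no succinct description at all; an *explicit* instantiation (like the group-theoretic ones mentioned above) would replace the table with an efficiently computable formula, and it is only for those that BPP=BQP has any bite.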

Yet ignorance-dependence is remarkably well-tolerated in almost all human societies … especially academia!
