> Also, Centaur indicated that with the SHA on die, they can produce
> statistically perfect RNG output.
No kidding. With any crypto-quality hash, I can produce statistically perfectly uniformly distributed pseudorandom data from *successive integers*.
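For illustration, a minimal sketch in C using OpenSSL's one-shot SHA256() (the choice of hash and library is incidental; build with -lcrypto). The output passes statistical uniformity tests, yet the "entropy source" is a bare counter, so it contains no entropy at all:

#include <stdio.h>
#include <stdint.h>
#include <openssl/sha.h>

int main(void)
{
	unsigned char digest[SHA256_DIGEST_LENGTH];

	for (uint64_t i = 0; i < 4; i++) {
		/* the "entropy source": successive integers */
		SHA256((const unsigned char *)&i, sizeof(i), digest);
		for (int j = 0; j < SHA256_DIGEST_LENGTH; j++)
			printf("%02x", digest[j]);
		putchar('\n');
	}
	return 0;
}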
> The von Neumann whitener does let a small bias through for very large
> data sets IIRC (i.e. a statistical bias is detectable in 1G or more of
> data).

Johnny's whitener removes a particular kind of bias but does not reduce other kinds of regularity at all. Whitening doesn't mean squat for entropy. (Perhaps you can think of it as spread-spectrum for regularity, if the whitener isn't crypto-secure.) Dataset size is irrelevant except for detectability: you need more samples to be sure that the nuances you see are really there.
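For reference, von Neumann's debiasing step takes the raw bits in pairs, outputs the first bit of each unequal pair, and discards equal pairs; for *independent* biased bits this removes the bias exactly, but it does nothing about correlations or other regularities. A minimal sketch (one bit per byte, purely for clarity):

#include <stdio.h>
#include <stddef.h>

/* in: raw bits, one per byte; out: debiased bits; returns count produced */
static size_t von_neumann(const unsigned char *in, size_t nbits,
			  unsigned char *out)
{
	size_t produced = 0;

	for (size_t i = 0; i + 1 < nbits; i += 2) {
		if (in[i] != in[i + 1])
			out[produced++] = in[i];  /* (0,1) -> 0, (1,0) -> 1 */
	}
	return produced;
}

int main(void)
{
	/* biased toward 1, but the pairs are independent */
	const unsigned char raw[] = { 1,1, 1,0, 0,1, 1,1, 1,0, 0,0, 1,1, 0,1 };
	unsigned char clean[sizeof(raw) / 2];
	size_t n = von_neumann(raw, sizeof(raw), clean);

	for (size_t i = 0; i < n; i++)
		printf("%u", clean[i]);
	putchar('\n');  /* prints 1010 */
	return 0;
}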
> If you are using the hardware RNG via a user-space daemon feeding
> /dev/random then this is no longer an issue.
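(For context: such a daemon, e.g. rngd, pushes bytes into the kernel pool with the RNDADDENTROPY ioctl, which both mixes in the data and credits entropy. A minimal sketch; the zeroed sample bytes and the half-credit entropy estimate are placeholders, and it needs root to run:)

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/random.h>

int main(void)
{
	unsigned char buf[32];
	struct rand_pool_info *info;
	int fd;

	/* a real daemon would read these bytes from the hardware
	 * device (e.g. /dev/hwrng) rather than zeroing them */
	memset(buf, 0, sizeof(buf));

	info = malloc(sizeof(*info) + sizeof(buf));
	if (!info)
		return 1;
	info->entropy_count = 8 * sizeof(buf) / 2;  /* in bits; credit half */
	info->buf_size = sizeof(buf);
	memcpy(info->buf, buf, sizeof(buf));

	fd = open("/dev/random", O_WRONLY);
	if (fd < 0 || ioctl(fd, RNDADDENTROPY, info) < 0) {
		perror("RNDADDENTROPY");
		return 1;
	}
	close(fd);
	free(info);
	return 0;
}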
You MUST use some "hardware" (analog) input, and you SHOULD use whitening on the output, and most probably should do other operations in between (mixing partially unbiased but imperfect input with a pool, for instance).
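A hypothetical sketch of that shape, again with SHA-256 (real designs such as the kernel pool or Fortuna are considerably more careful): raw samples are absorbed by hashing them together with the pool state, and the output is whitened by hashing the state rather than ever exposing it.

#include <stdio.h>
#include <stdint.h>
#include <openssl/sha.h>

static unsigned char pool[SHA256_DIGEST_LENGTH];

/* absorb raw, possibly biased hardware samples into the pool */
static void pool_mix(const unsigned char *sample, size_t len)
{
	SHA256_CTX ctx;

	SHA256_Init(&ctx);
	SHA256_Update(&ctx, pool, sizeof(pool));
	SHA256_Update(&ctx, sample, len);
	SHA256_Final(pool, &ctx);
}

/* whitened output: hash the pool with a counter, never the raw state */
static void pool_extract(unsigned char out[SHA256_DIGEST_LENGTH])
{
	static uint64_t counter;
	SHA256_CTX ctx;

	SHA256_Init(&ctx);
	SHA256_Update(&ctx, pool, sizeof(pool));
	SHA256_Update(&ctx, &counter, sizeof(counter));
	counter++;
	SHA256_Final(out, &ctx);
}

int main(void)
{
	const unsigned char sample[] = { 0xff, 0xfe, 0xff, 0xff }; /* biased */
	unsigned char out[SHA256_DIGEST_LENGTH];

	pool_mix(sample, sizeof(sample));
	pool_extract(out);
	for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
		printf("%02x", out[i]);
	putchar('\n');
	return 0;
}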