We tend to believe that what we see and hear is real when, in fact, it often isn't. Take, for example, the myth that analog is continuous and digital is not. That's just plain untrue, but when you look at the surface evidence it appears to be.
When we look at a harvest moon it appears much bigger than our standard view of the moon. Of course, intellectually we know the size of the moon never changes; it's just an illusion. The same can be said of the belief that a vinyl album has more apparent dynamic range than a CD, when intellectually we know that cannot be true either.
There is nothing continuous about our universe; it's but an illusion. Analog is no more continuous than a stream of water is; it just looks that way. We know, of course, that everything around us is made of tiny bits that are themselves made of even tinier bits. You are not whole, you just look that way. In fact, you're made up of tens of trillions of bits (cells), and each of those cells is made up of even tinier bits, and so on.
The point of this post is one of perspective. Nothing is continuous, and therefore the illusion of continuity depends entirely on the size of the bits. Once they are small enough to create that illusion, that's sufficient. If what you're hearing sounds more "digital" than "analog", chances are excellent it has nothing to do with granularity. Rather, it probably has more to do with everything else surrounding those grains: filtering, phase response, frequency response, etc.
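To put a rough number on just how small those "grains" are, here's a back-of-envelope sketch (the numbers are the standard textbook figures for 16-bit CD audio, not anything specific to a particular player or DAC):

```python
# How fine-grained is 16-bit CD audio? A back-of-envelope calculation.
bits = 16
levels = 2 ** bits                      # 65,536 discrete amplitude steps
step = 2.0 / levels                     # smallest step over a ±1.0 full-scale range
dynamic_range_db = 6.02 * bits + 1.76   # textbook SNR formula for an ideal quantizer

print(f"{levels} levels, smallest step ≈ {step:.2e} of full scale")
print(f"theoretical dynamic range ≈ {dynamic_range_db:.0f} dB")
```

Sixty-five thousand levels and roughly 98 dB of theoretical dynamic range: each grain is about 30 millionths of full scale, far below anything the ear can resolve as a "step", which is exactly why granularity is rarely the culprit when digital sounds "digital".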
Think about this the next time you casually suggest analog is superior to digital because ...... it just might be an illusion.