There were books that captured the spirit of America so well that literate men would cherish them for decades, even centuries, to come. We don't have them anymore, not because our writers are less capable, but because the great evil that pervades these lands seeps into our culture, an expression of secret pain and hidden traumas that devastate America. These are smart, capable people; how can they not notice? Why wouldn't they offer a veiled commentary on our state of affairs, at least from time to time? We still have great fiction, yes, but it is fiction we wish to forget, or fiction that isn't about America at all. It is terrible. Everything is shameful now. America is a beautiful country of freedom, or maybe was, and hopefully will be again. One day we will have art worthy of her name.