Having specialized as an academic historian in the role of religion in Europe, particularly in France, I should, I suppose, be able to opine on the role of religion in America. But in fact this is very difficult. Religion's importance anywhere always has several sides: positive, neutral, and negative. Right now it does not seem to be a negative factor in the struggle between an aggressive Russia and a victimized Ukraine, since their religious backgrounds are roughly the same. In France it has been, in my view, mostly a negative factor, reinforcing the most illiberal, intolerant, and dangerous tendencies of its people. It has not had a good history in Germany either. In Britain its role is perhaps more mixed, but at present very weak. In America it has been important intellectually and socially, but is at present divided between those viewing religion as a force for harmony and those using it as a force for demagogic and bigoted politics. In India, which I know much less about, it currently seems to be a force for persecution and violence, as it was when the British Raj came to an end in 1947.
I wonder why there are so many books now about religion's role in America. Is it perhaps because organized religion here is in rapid decline and its dangerous elements are increasingly visible?