The prominent role of religion in the United States is nothing new: religion has been a major component of the American cultural landscape since the inception of the Republic.