Evil is rampaging across the country. The worst people in the world are being handed positions of power, hate and racism are being made into policy, and the people you hate most, the ones screaming about race, are being given uniforms, guns, and concentration camps. The Epstein patrons are celebrating, and there is a sweeping movement among evangelicals pushing back against empathy and against Christian values broadly. Human rights are being stripped away, along with basic services like healthcare, food, and housing for the poor, and families are being torn apart without consequence.
I'm not religious, but if I were and I were watching all of this, I would probably believe a deal had really been struck, or that this was all some kind of prophecy out of Revelation.