I generally agree with your points. But I think you are misrepresenting the era that preceded Reaganism. To my mind, as someone who came of age in the 1970s, there were three significant events that changed the way Americans felt about their country:
1) Watergate made it clear to everyone that their government could not be trusted, and that corruption stretched all the way to the Executive Office.
2) Runaway inflation had an effect on the cost of living never seen before (or since) in this country. By 1980, prices had risen to more than double what they were at the start of the decade. To put that in perspective, between 2010 and 2020, we saw prices rise roughly 20%.
3) The taking of hostages in Iran, and the Carter administration's inability to resolve that crisis, created a feeling that the country no longer had the prestige of being the "leader of the free world."
America (ok, White America) saw a government that was unable to manage affairs domestically or internationally, and whose leaders couldn't be trusted. Voters turned the Republicans out in 1976, but by 1980, when Carter ran for reelection, things seemed even worse. These experiences gave Reagan - the original poster boy for Making America Great Again - the path to the White House.