Despite the arrival and outsized success of the spaghetti Westerns in the 1960s and 1970s, the wider Western genre declined in popularity in Hollywood thereafter. The thematically black-and-white, romanticist Western had been the definitive heartbeat of cinema's first few decades, spearheaded by John Ford and John Wayne, who brought the Old West to the big screen in a blaze of heroism, fundamentalism, and moral clarity. Their films helped define American filmmaking, and although they were criticized for an often naïve and simplistic portrayal of the era, the Western remained the biggest fixture in the business until the 1960s.