Western films have been a dominant force in Hollywood for almost as long as the industry has existed. Themes of freedom, escapism, and the strong appeal of the western frontier popularized the genre, ...
There was a time when you couldn’t flip on the TV or buy a movie ticket without running into a cowboy. Westerns were everywhere, filling Saturday matinees at the local movie theater, dominating ...