Iconic America
Why Did Cowboy Movies Leave Women Out of the Picture?
In the 1920s, the rise of a new form of militant white racism strongly influenced the motion picture industry. Hollywood Westerns consequently downplayed the significant role that women, Black people, and other marginalized groups played in settling the West, promoting instead the mythic image of the male-dominant, gun-toting cowboy.