
The West, Westerns, and American Culture

Yekta Cengiz

21501061

When one thinks of the Wild West, the first images that come to mind are of wandering gunslingers dispensing justice to outlaws, showdowns at high noon, and a wild, dangerous land that demands strong-willed and capable settlers. These images, while not wholly untrue, are more than a little romanticized. O'Connor's account of the truth behind the mythical Wild West that so many people talk about may at first come off as a criticism of the West, but as the reading continues O'Connor's goal becomes clearer: while many aspects of the Wild West were based on fact, many of its best-known parts are completely romanticized.

This romanticization has always been there, O'Connor notes. The untamed wilderness of the West has long been a source of pride for Americans: not only did it represent a connection to nature that they claimed others had lost, but slowly taming it also displayed their strength and will. With the land itself already mythical, it was only a matter of time until the people trying to live on it were affected as well. The frontier lifestyle, even before the cowboy became what he is now remembered as, was seen as wilder and freer than civilized life in the East: living on the open range, keeping your own land and your own cattle, and defending them against savage natives or bandits by yourself. While the daily struggles of real cowboys were of course never this adventurous, the idea of the wild frontier captured the imaginations of writers and others, and they began to create works about adventures in the Wild West.


These writers and scholars began the spread of the idea of the Wild West, the land of the strong and of brave cowboys, and this started a transformation. As more people read and learned about the adventures happening on the frontier, the very idea of what the frontier was began to change: it was no longer just a wild, untamed stretch of land, but a place where men were free and where hard work and guts were rewarded, and as the myth of the West continued to grow, so did its spread. Filmmakers and novelists flocked to the newest sensation like hungry vultures, producing movies about brave settlers defending their stagecoaches or their homes from Indians, upstanding cowboys standing up against injustice, and, who can forget, gunfights and Mexican standoffs. What would once have been called rare occurrences or outright fantasies was now considered an everyday reality on the frontier.

It was not just the location that was changed forever. The cowboys, who in truth were poor laborers dealing with unruly cattle or horses at worst, were now seen as the epitome of what it means to be American: tough, self-reliant, and brave, the only ones who could hope not merely to survive but to thrive on the frontier. They became the American identity, and even after the end of the Wild West their way of life would be held up as the ideal. So much so that even after the frontier closed, the myth endured, for by that time it was more than a myth: it displayed everything Americans wanted to have and wanted to be. The movies continued with their rugged, tough-but-fair cowboys, fighting off bandits or savage natives, saving some innocent bride, and then riding off into the sunset. Some politicians even styled themselves as cowboys to gain the public's support. It was obvious that the myth of the West was now an ineradicable part of the American identity. But then, of course, comes the question: what happens when the American people begin to change?
O'Connor shows that after World War II, and especially in the Cold War era, the American people faced a period of unrest that filled them with doubt about everything around them, from their identity to their government, and even the myth was affected. Suddenly, movies about outlaws were seen as feeding delinquent ideas about defying the government, and some films tried to tell the stories of other groups, such as natives and Black cowboys and soldiers, who had always been relegated to the background in earlier Western works. These changes show that while the myth of the West may have become ingrained in American identity, like that identity it will continue to change as the American people do. It is a reminder that it was the American people who established the myth, not the other way around, for it showcases the ideals the American people themselves desire.
