"The Western is a genre of fiction set in the American frontier and commonly taking place from the late 18th to the late 19th century. The term 'Western' came from the Western United States frontier." - Wikipedia
Cowboys emerged in the American West during the 19th century, working primarily on cattle ranches. They became iconic figures representing: