the Wild West
noun /ðə ˌwaɪld ˈwest/
[singular] the western states of the US during the years when the first Europeans were settling there, used especially when you are referring to the fact that there was not much respect for the law there

Culture
This is the period shown in western films, though the picture they present of the Wild West is often not very accurate. Towns that were known for their outlaws (= criminals) and violence included Tombstone, Arizona, and Dodge City, Kansas. Famous outlaws included Jesse James and his brother Frank, Billy the Kid and the Younger brothers.

compare the Old West