Definition of Wild West in US English:
Wild West
proper noun | /ˈˌwaɪl(d) ˈwɛst/, /ˈˌwīl(d) ˈwest/
The western US during the lawless period of its early history, chiefly in the 19th century. The Wild West was the last of a succession of frontiers formed as settlers moved gradually further west; the frontier was officially declared closed in 1890.