the Deep South
noun /ðə ˌdiːp ˈsaʊθ/
[singular] the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina

Culture
The states of the Deep South are among those that once had slaves and that left the Union during the American Civil War.