Democrats Will NEVER Take Back America Until They Reclaim the South

As I pointed out in last year’s well-aged article “The Southeast is Actually the MOST Important Area for Democrats”—which you should read if you haven’t already—there is no future for the Democratic Party that doesn’t include major gains in the Southeast. What is currently dismissed as “hopelessly lost Trump country” is the only way Democrats are…