 Born to Write

Florida, the Most Misunderstood State?

When most people think of Florida, what comes to mind is Disney World and beautiful beaches, yet that's a very narrow view of a very complex state. As a writer who has lived in different parts of Florida, and whose husband grew up in the far-southwest corner of the state, I find these misperceptions so intriguing that they became one of the factors that led me to write my novel, "Miss Dreamsville," set in Florida in 1962.

Since the publication of the book, I've been getting a lot of feedback from readers (including people who live in Florida) who had no idea of the state's true history. "I never even thought of it as a Southern state" is a fairly common comment. Yet Florida was the third state to secede from the Union (ahead of Virginia, in fact). Readers are often shocked that my characters encounter the KKK, yet there is plenty of research showing that the Klan was a major presence in Florida.

I've heard Florida described in many ways, but I wonder if there is any state that is more misunderstood.