Surprising jobs US presidents had before taking office
From farm boys to Hollywood stars!
The vast majority of US presidents held political posts before taking office. From governors to vice presidents, the top job in the White House has historically gone to candidates with at least some background in politics. But not every American president started out that way; many had completely different careers before reaching the Oval Office.
In this gallery, discover the surprising jobs presidents had before they took office. Curious? Then click on!