Surprising jobs US presidents had before taking office
From farm boys to Hollywood stars!
© Getty Images
The vast majority of US presidents held political posts before they took office. From governors to vice presidents, the top job at the White House has historically gone to those with at least some background in politics. But that hasn't always been the case: many presidents followed completely different career paths before reaching the White House.
In this gallery, you'll get to know the surprising jobs presidents had before they took office. Curious? Then click on!