(Warning: this post is almost entirely made up of me ranting and worrying about the future. Read at your own peril.)
I love English, and because of that, I can’t think of anything I would love better than to study literature in college. All of my friends and family know this, and everyone says that they can’t see me majoring in any other field.
Now. Here’s the problem. My father and my best friend have been annoying me lately with their talks about how English majors can’t do anything once they leave college. Both of them start off by telling me that English is a critical skill that everyone should know, but then they say that I should double major in something else because I’ll make no money and won’t have a good future.
However, I know that if I were planning on studying physics or chemistry, I would have no problem, which is so stupid since there are as few jobs in physics or chemistry as there are in English. I love English, but I hate that people always say that I can’t do anything except be a high school teacher. And what I hate most is that I truly admire my father for doing a job that he loves so much and gets so much joy out of, yet because my passion isn’t as financially stable, he thinks I need to tweak it.
Please don’t get the wrong idea, though: my father is incredibly sweet, and he’s only doing this because he doesn’t want to see me unhappy and poor in the future. But he’s given me so many talks now about how, after college, I should think about going into law, or getting an MBA, or double majoring in a science as well as literature, that I don’t think I can take it anymore. I love English, and it would be my dream to be an editor at a magazine or a publishing house (which my Dad says is a disappearing job because of blogs and such). But I get so much flak over wanting a job in literature that I’m sick and tired of all these people telling me I should be studying something else, or something in addition to English.
So, after all of my rantings and ramblings, I was wondering if any of you current or past English majors can tell me about your experience. Is life as an English major really worse than life in any other major? Are the editing and publishing jobs really dying out? And for those of you who are college English professors: my Dad is under the impression that English professors have the lowest salaries and find it hardest to get hired. Is this true?
I’ve been getting so much unwanted and unfounded advice that I would love to hear from people who actually know what they’re talking about.