Last Friday, The Times posted an interesting article about the changing face of college majors. With college tuition higher than ever and post-college employment prospects increasingly dim, are college students under more pressure to choose the "right" major?
While there's a lot of talk about the highest-paying majors and the ones most likely to land students a job right out of school, there's no magic formula: I was a women's studies major who ended up with a corporate office job right out of school, and my closest friends' majors ranged from environmental studies to business.
What was (or is) your college major? Would you recommend it to current college students?
I was a psychology major, and since I'm still not sure exactly what I want to do, I think it was a great decision. No matter what, I'll need to know how to work with people.