Yesterday H and I had a discussion with my sister and her BF about college degrees and their value. My sister's BF says he doesn't really think getting a degree is worth it anymore (unless it's for something like becoming a doctor or teacher, where you HAVE to have a degree to be in the field) because it costs a lot of money, and because the qualities that actually get you a job are your character and work ethic.
I agree with him to an extent; I've seen plenty of lazy college students with no direction. At the same time, I've seen lots of graphs showing how much more the average graduate earns compared to someone with only a high school diploma. Plus, as H has been filling out applications, almost every one requires at least two years of post-high-school education.
What do you think: is a college degree worth it anymore? If you have kids, or plan to have kids, will you pay for their college education?