This is a very interesting topic. I definitely have my opinions on this. I went to college for human resource management.
Here's my two cents on college, and I'm probably going to pinch some nerves here. If you have the drive and determination to build your company and be successful in life, that certificate holds no significance in my opinion. I have friends who run multi-million dollar companies without even a high school diploma, and they are some of the smartest people I know. I have a great friend who graduated at the top of his class at an Ivy League school who will tell you college degrees are the most overrated thing around. In addition, I have walked the dirt roads of third world countries and seen firsthand what small business loans can do for those who are determined to provide for their families. Have I taken anything from college that has helped me build this company to where it is today? Some, but very little. Latin American music and culture for 3 credits did absolutely nothing.
So to answer your question, yes, I guess degrees help some. More so if you want to be in corporate America. If you want to build a company, bring skilled people to the table. If you are dead set on going for a degree, go for business management.
Are my kids going to go to college? When that time comes, we are going to have a very frank conversation about it. I see no need to spend $200k on a 4-year college. They can go to a community college for two years and transfer, etc. I worked for 5 years before going to college, and that was a huge benefit in itself.
There's my take.
"Earth, Turf, & Wood, Inc. is a high-end residential landscape & hardscape company that offers superior employment experiences for employees, exceptional opportunities for our architects, a premium service to our customers and value to the community through service and stewardship. We attempt to honor God in all we do by encouraging teamwork, pursuing excellence passionately, serving those who lead, and demonstrating stewardship of resources."