So football is not the story we are often fed: how the gladiators of our day enjoy a sport that is all about fair play and hard work. Just what do sports teach us? We put our kids in sports thinking they will learn responsibility, teamwork, and the reward of practice. Yet is that what they really learn? I'm not suggesting that kids shouldn't play sports. I'm just asking what sports really teach, and what they say about our culture.
Ban the 'bounty' bums, but blame culture