Recently we've been talking about how the 1950s saw a rise in diagnoses of depression and in prescriptions of antidepressants and tranquilizers. We talked about gender roles: middle-class, college-educated women were expected to find all their satisfaction in staying at home, buying consumer goods, and raising children, while men were expected to derive theirs from bureaucratic jobs and fatherhood.
Basically, society was making people depressed because they didn't fit the mold of what they were told was supposed to make them happy.
I wonder how much of that is going on today. Why are so many of us depressed? What are we told will make us happy? And if that's a lie, what will truly make us happy?