Monday, August 23, 2010

Department of Bad Ideas: Emerging Adulthood

There is an interesting, possibly Hegelian, probably insane assumption underlying the NYT Mag piece on 20-somethings. It seems that in the very recent past, human life was inauthentic and un-free because it was constrained by necessity. People had to marry and bear children young, start working early and never stop, and otherwise do things that we now put off, because otherwise, they would starve to death or be eaten by bears. Now, however, we have "emerging adulthood," an indicator that we live in a blessed age when those necessities no longer apply, and the resulting lives we forge in their absence are therefore more authentically human and free.

The first evidence of this new freedom was adolescence, which was discovered when the necessity of child labor was peeled away to reveal the angsty, rebellious, hormonal but authentic 14-year-old within. This asshole of a creature demonstrated that the previous incarnation of the 14-year-old--the one who worked in the mines or the fields or the kitchens--was a product of necessity and not truth. The adolescent was now liberated. But necessity still bound everyone beyond adolescence. Now emerging adulthood is here to advance the upper limits of human freedom by a few more years by casting off later necessities: "fewer entry-level jobs even after all that schooling; young people feeling less rush to marry because of the general acceptance of premarital sex, cohabitation and birth control; and young women feeling less rush to have babies given their wide range of career options and their access to assisted reproductive technology if they delay pregnancy beyond their most fertile years." Newly free from these externally applied burdens, we 20-somethings have more space to shape our lives according to our own arbitrary wills. We are free! We are authentic! And what have we made of ourselves in light of all this? Well, it seems that at present, the self-realization of the will manifests itself in...hipsters. But ok, no matter.

Emerging adulthood is good because freedom of the will is good, and so obviously constraint of the will by external necessity is bad. We know constraint is bad because if emerging adulthood were thought to be a product of new and different necessities, like the unavailability of jobs, or a fatal mismatch between educational preparation and employment qualifications, or a society in decline that no longer supports family and childrearing, then we would be fretting instead of celebrating our uniqueness. We would be feeling the despair of 1932 and not the euphoria of 2010. But marriage and family are basically lifestyle options now--valid but not necessary--and although the article flirts with economic explanations, they have to be rejected because unlike the 20-somethings of 1932, the emerging adults of 2010 have their parents' savings standing between them and the constraints of economic necessity. The economy might not be great, but 20-something hipsterdom can still be freely chosen; with no spouse or children to support, a Peace Corps stipend or barista salary goes a lot further, and your parents can always help you pay the cell phone bill (and full disclosure: my parents paid mine until last month--family plan savings, peeps!).

In practical, hard-nosed American terms, freedom from necessity is good because it buys time, and time results in better decisions: "Maybe if kids take longer to choose their mates and their careers, they’ll make fewer mistakes and live happier lives." This seems to be intended as the article's most persuasive argument in favor of adopting emerging adulthood as a developmental phase. I don't doubt that many adults wish in hindsight that their youth had lasted longer, but it's not actually clear from such nostalgia and wistfulness that a longer youth would've resulted in a happier or wiser adulthood, assuming they ever made it to adulthood. Since time itself is not guidance in matters of marriage or vocation (especially if both are delayed because neither seems available or obviously worthwhile), 20-somethings may just take longer to make the same mistakes. Unless we believe that the longer one takes to make a decision, the better it will be (so people who marry at 90 are most likely to choose the best mates), we have to look to some other standard to determine the wisdom of such decisions.

Or maybe these past decisions were not mistakes at all? After all, the view that all previous decisions about marriage, work, and childrearing were wrong is the biggest assumption of this piece. What evidence do we have that the decisions people made in the past about these things were wrong? Henig/Arnett rely on the assumption that more time must result in better decisions. For example, this neuroscientific hypothesis:
When people are forced to adopt adult responsibilities early, maybe they just do what they have to do, whether or not their brains are ready. Maybe it’s only now, when young people are allowed to forestall adult obligations without fear of public censure, that the rate of societal maturation can finally fall into better sync with the maturation of the brain.*
Well, but did these poor past victims of necessity perform their responsibilities ineffectively? How do we know they were forced to take them up too early if we don't know that they did them badly, or in some clearly immature way? And then we come up against the problem that the freedom-quashing necessities of the past are actually still present in most of the world right now:
It’s rare in the developing world, he says, where people have to grow up fast, and it’s often skipped in the industrialized world by the people who marry early, by teenage mothers forced to grow up, by young men or women who go straight from high school to whatever job is available without a chance to dabble until they find the perfect fit.
Apparently, the discipline of psychology requires that an observed group behavior be universally observable to be classified as a developmental stage, and Henig goes through some perfunctory hand-wringing over emerging adulthood's narrow application to affluent Westerners, and mostly Americans at that, since European emerging adults are actually constrained by unpleasant necessities like expensive urban housing, which force them to live with their parents forever. But in reality, the narrowness of observed emerging adulthood is no problem, since its premise is that all 20-somethings would behave this way if they could only be untethered from the grinding pressures that force them "to grow up fast." Underneath every seemingly grown-up 20-something with a family and a steady job, a directionless emerging adult is gasping to be released. Given freedom from economic want, social mores that encourage early marriage, and limits to college access, every poor Vietnamese rice farmer and rural Pakistani bride of an arranged marriage could be living in Greenpoint, going to yoga classes, and selling her handmade textiles on Etsy. The world could be much more awesome now, plus the future will be that much better when these emerging adults do finally decide to settle down and become actual adults--more "self-explored" and "self-discovered" adults than the world has ever seen before. And isn’t that something we may want to promote through--hint, hint--government programs?

Just look at the present unjust inequalities in emerging adulthood. We have a woman who, in a strikingly mature, actually adult way, managed to hold down a full-time job, take care of her family, and earn a degree. But think how much more fun she could be having if she didn't have all those pesky responsibilities to weigh her down:
Is it only a grim pessimist like me who sees how many roadblocks there will be on the way to achieving those dreams and who wonders what kind of freewheeling emerging adulthood she is supposed to be having?
Suddenly we've made the leap from emerging adulthood as an ambivalent period that has started to appear in the lives of affluent meritocrats to emerging adulthood as a human right, and one that federal programs are obliged to provide for everyone:
There aren’t institutions set up to serve people in this specific age range; social services from a developmental perspective tend to disappear after adolescence. But it’s possible to envision some that might address the restlessness and mobility that Arnett says are typical at this stage and that might make the experimentation of “emerging adulthood” available to more young people. How about expanding programs like City Year, in which 17- to 24-year-olds from diverse backgrounds spend a year mentoring inner-city children in exchange for a stipend, health insurance, child care, cellphone service and a $5,350 education award? Or a federal program in which a government-sponsored savings account is created for every newborn, to be cashed in at age 21 to support a year’s worth of travel, education or volunteer work...It requires only a bit of ingenuity — as well as some societal forbearance and financial commitment — to think of ways to expand some of the programs that now work so well for the elite, like the Fulbright fellowship or the Peace Corps, to make the chance for temporary service and self-examination available to a wider range of young people.
If the great difficulty of Arnett's theory is that emerging adulthood is not yet universal, then the universalization of emerging adulthood through government incentives will take care of that problem. And I personally can think of no more important use of taxes in this country than to level the emerging adulthood playing field so that the less fortunate can have equal access to a year or two of aimless hipsterdom after college. Let's call it the Federal Initiative for Equalizing Navel-Gazing Self-Examination Opportunities.

But, it seems, we are too hasty in condemning self-examination as a kind of laziness. Laziness is in the eye of the beholder:
A century ago, it was helpful to start thinking of adolescents as engaged in the work of growing up rather than as merely lazy or rebellious. Only then could society recognize that the educational, medical, mental-health and social-service needs of this group were unique and that investing in them would have a payoff in the future. Twenty-somethings are engaged in work, too, even if it looks as if they are aimless or failing to pull their weight, Arnett says. But it’s a reflection of our collective attitude toward this period that we devote so few resources to keeping them solvent and granting them some measure of security.
So if we all collectively convince ourselves that what 20-somethings are doing when they cycle through relationships and short-term barista gigs and their parents pay their rent is really "the work of growing up," we will be happier to give them more subsidies to do it.

But if self-reflection is a kind of essential psychological labor that needs to be recognized and supported by society, then what principle limits it to 20-somethings? Important life changes and decisions arise in subsequent years, too. If, "during the timeout they are granted from nonstop, often tedious and dispiriting responsibilities, 'emerging adults develop skills for daily living, gain a better understanding of who they are and what they want from life and begin to build a foundation for their adult lives,'" why can't adults do the same? Certainly the 20's can't be the only time when people wish for a reprieve from punching the clock and sweeping the floor. People in their 30's and 40's need to think through things too! What about the social-service needs of these groups? Why is the government subsidy machine ignoring them? Maybe everyone in America should just take a break from the constraints of necessity and be paid to meditate full-time? I say, forget "emerging adulthood" with its narrow, age-ist benefits! Freedom from necessity for all!

Who's with me?

*Silly pop-neuroscience alert? What the heck is a societal rate of maturation vs. a brain rate of maturation?


Withywindle said...

They only write these things to get a rise out of you. Ignore them, and they'll go away.

(I wish.)

Phoebe said...

Geez. I may have a similarly long rant about this as well, but in the meantime, yes, the question the article poses is about hipsters, and more than that, not "20-somethings" but recent college grads living with roommates in gentrifying areas of Brooklyn. The photos didn't help.

alex said...

I completely agree. And my mom also still pays my cell phone bill.

Also, maybe having actually worked at the Times will help in answering this question: Some variation of this article appears in the Times almost monthly, and I can't imagine why they think their audience cares so deeply about whatever imagined affliction the writer has pinned on 20-somethings. These articles, however, are always the most emailed, talked about, and linked to obsessively. Does this generate more ad revenue? Since 20-somethings are more likely to post things to facebook and blogs, etc., and are quick to link to and debate anything about themselves, is it actually profitable to keep writing about them since it keeps page views up?

Miss Self-Important said...

Withywindle: I'm risen.

Phoebe: But deep down, everyone is a recent college grad living in a gentrifying area of Brooklyn. Why repress the true you?

Alex: Yes, I think so. I don't know about ad revenue, but lifestyle articles are the biggest draws in the paper and, at least while I was there, there was much hand-wringing about losing the 18-30 demographic to online media. It doesn't seem that this age group is all that interested in the stories about Afghanistan, but, as this article explains, they are really interested in themselves. Like us, or me at least.

sarah marie said...

What a great response to the NY Times article. My thoughts exactly, but stated more articulately and with better humor and wit than I could have achieved!

M. said...

There is a fair amount of research that indicates that a person's brain does not develop fully until well into his or her 20s. I read an interesting book about how all teenagers are crazy, but it's not totally their fault because their brains are still in development. It's called The Primal Teen, by Barbara Strauch. Really interesting read, especially for someone who's experienced first-hand the manic behavior of teens in the classroom. So I suppose that line of thinking brought the author to the conclusion that societal maturation is in a different place than brain maturation for young people.

hardlyb said...

I've watched the process of "emerging adulthood" from close in - when I began as a graduate student at Stanford there was no effective limit on how long graduate students could "take" to graduate - and it's pretty ugly. Those that took 6 years or more seemed to never finish, and lived in the limbo where the university provided most of the luxuries (great library, country club atmosphere, social life, health care) for absurdly cheap prices (TGR fees) while the "gradual student" came up with rent and food (which was usually subsidized by freeloading on students still fully supported).

In my experience, the number of students that graduated after a decade of "emerging" was equal (1) to the number that committed a cold-blooded murder of an inoffensive faculty member (they were different people, thank Heaven - the graduate students did not have a union). The rest of them were required to move on - and end their increasingly creepy existences hitting on undergraduates, playing chess and bridge (mostly quite well), and going to French films.

I suppose that people disillusioned by working in a dying industry at the shell of a formerly great newspaper getting paid nothing and seeing their futures contract before their eyes might wish to instead sit around in the sunshine in shabby clothes reading novels and trying to convince the impressionable freshman to sleep with them, but I think that they should get a real job and let the NY Times finish its descent into a glorified Shopper's Daily.

Miss Self-Important said...

M: But Americans don't think teenagers are mature, so where is the disjunction between brain development and social expectation, unless it's that adolescents are even more childish than we thought and we need to forcibly restrain them and confiscate their driver's licenses to keep in sync with their brain development? The Henig/Arnett argument would seem to ask us to do that for 20-somethings, since they're still brain development infants. Also, this still seems kind of pop-neuroscience-y to me--does the brain develop independently of all external influences? Is there a brain stage at which people magically figure out how to do all the things connected to living on their own and managing their lives, or is there some learning through doing?

Hardlyb: I don't think that 10th year grad student is a demographic much addressed by this article. But if you want to include it to encourage me to drop out of grad school while I still can, that would be fine, as long as you also find me the real job with which you intend for me to replace my present source of income. However, I have yet to sleep with any impressionable freshmen, so maybe you should let me remain long enough to achieve that goal.

hardlyb said...

Mrs S-I: I intended that it encourage you to keep working so you would graduate before reaching your thirties. The infinite students that I met (one of whom had been 'studying' for over 13 years when I started at Stanford) did no perceptible work, but I am sure that they had found themselves numerous times (although they were too disorganized to keep track afterward). And I think that you can fit the freshman (or freshmen) into your schedule whenever you choose. It will be much easier than you seem to think.