Monday, August 21, 2017

Rosé Is Exhausting

We are now deep into rosé season, and by season, I mean the rest of our lives. We have also acquired a new summer rite. Each year, a number of daring publications venture the rather interesting question, “Have we hit peak rosé?” only to provide the thrilling answer, which I will summarize for you here: No.

If you are youngish and urbanish and drinking tonight, it’s very possible you’re drinking rosé. This is more likely if you are a woman, according to the wine industry, which claims that women are driving rosé sales, though men drink rosé too — a phenomenon that has been labeled “brosé.” Bros who brosé signify their courage to push the boundaries of masculinity by wearing colorful socks, which is practically like wearing jewelry, and through their willingness to be photographed holding a glass of something pink up to their stubbled faces. Frosé is also a thing. It’s a slushie made out of rosé, sometimes festooned with strawberries or extra booze, like Aperol or elderflower liqueur, and if you think that wine is improved when rendered palatable to a small child, you might be a fan.

Rosé is not a varietal. It is made from lightly extracted red grapes, including — but not limited to — Grenache, Syrah, Cinsault, and Pinot Noir. However, it is classified in sales simply as rosé because despite the huge diversity of what this means — there are sparkling rosés and Pét-Nat rosés, and there are dry ones and less dry ones — it’s pink and you generally know exactly what you’re getting. This presents an issue: Rosé is only good when it’s kind of surprising, but most rosé is the exact opposite of surprising, and that’s exactly why it is popular. It’s light, it’s uncomplicated — you sip, you swallow, then you drink some more.

Whether you’re buying haute rosé or supermarket rosé, what you must never forget is to be drinking it all the time, and thus never not living the rosé lifestyle: Go on a rosé cruise, take in a rosé sunset, have a rosé night. Tie your rose gold hair back with a rosé-colored silk scarf so it doesn’t get in your rosé while you write a text on your rose gold iPhone that says, “rosé o’clock, bitches.” You can also sip it all day — why else would the hashtag #roséallday exist? “At a low 11.3 percent alcohol, you could easily drink this wine all day long,” a 2016 VinePair article confirms. The founder of Wine Savvy, Sayle Milne, recently told Refinery29: “You should be drinking rosé when you wake up. You should have it at lunch, you should have it at dinner. You should have it with a straw.”

Rosé is alcohol, and if you drink it all day, you will eventually black out and wake up under a porch in Fair Harbor, and you will be covered in ticks.

I feel a little bad yelling at rosé. It never meant to hurt anyone. It’s been around for a long time. The Greeks and the Romans made rosé. Monks made rosé. And, like all wine, rosé comes in delightful forms, less delightful forms, and fairly disgusting forms, and it does so at every price point. The annoying thing about rosé is that it isn’t just a wine, like California Chardonnay or cheap Bordeaux — it’s “a state of mind” or “a lifestyle” or “a way of life.”

But just because rosé has a lot of bullshit surrounding it doesn’t mean there aren’t great rosés. Trust me, I know. I wish I had a good bottle of Chablis for every time someone told me that I would like rosé if I only got good rosé. I am not saying that no rosé is good — just that maybe 80 or 90 percent of them aren’t, and while no one can deny that rosé rhymes with #allday and #yesway and s’il vous plaît, for me, the truly telling coincidence is that it rhymes with okay.

Rosé used to just be some swill your dad bought when, newly divorced and preparing to host his first date, he helplessly thought, “Ladies like, uhh... wine?” Then the ’80s became the ’90s, the ’90s turned into the ’00s, and then the ’00s became a big horrible blur known as “post-9/11,” so people were like, “What can we resurrect from the past? How can we comfort ourselves with nostalgia while still honoring our newfound cosmopolitanism?” Rosé was there for us.

by Sarah Miller, Eater |  Read more:
Image: Anna Sudit
[ed. See also: Starbucks is Now Selling Sushi Burritos.]

Disrupt the Citizen

By the time Uber and Lyft breached the levees of transport regulations, the American taxi system had already endured several waves of uneven deregulation. In the 1960s, in New York, the majority of taxi drivers formed a union with the aid of the mayor, Robert Wagner. Negotiations produced results: cabbies received a weekly paycheck, vacations, benefits, and a degree of job security. It was already standard for cab companies to insure their drivers, maintain their fleet, and check their drivers’ histories. Although they were private entities, cab companies were subject to heavy control because they were a public utility, a form of municipal transportation. As deregulation became the norm in the ’70s and ’80s, the US experimented with taxi deregulation, too. Cities like San Diego, Seattle, and Dallas increased the number of licenses, bringing thousands more taxis onto the streets. New York’s Taxi and Limousine Commission reclassified drivers as independent contractors, which made the job harder and put an end to the union. More and more drivers went full-time, chiefly to ensure they could pay off the leasing fees. Conditions deteriorated throughout the ’90s as pay failed to keep up with expenses. The National Taxi Workers Alliance was founded in 1998 to give an increasingly diverse workforce a voice in a complex and punishing industry. When it launched successful strikes against low fares and harsh fines against drivers, it seemed like the industry might turn a corner.

Then came Uber and Lyft, under cover of app-enabled darkness, to induce more drastic deregulation. By 2015, the taxi industry in Chicago — a sprawling city with a smaller fleet than New York’s, where ride-sharing was poised to do best — reported that it had lost somewhere between 30 and 40 percent of its business to ride-sharing apps.

Uber and Lyft claimed their success was due to better software, better algorithms, and better responsiveness, but their overwhelming advantage came from breaking the law. They flooded streets with unlicensed cars acting as taxis, first in San Francisco and then in cities everywhere, because they thought nobody would stop them. Fare prices, set by the city to be equitable and predictable for taxis, were put entirely out of city control and made subject to whatever the companies considered demand: low on lazy Saturday afternoons, high on Saturday nights, and even higher after events like terrorist attacks. Taxi fares and tips were unreliable in their own way, but drivers faced a new level of capriciousness when ride-sharing companies began to set the fares. Fare prices not only changed throughout the day; the wage floor could be slashed at the whim of the company with little or no notice to drivers. This was viable because unlike the taxi industry, Uber and Lyft swim Scrooge McDuck–like through piles of venture capital. They don’t have to rely on fares as their only source of revenue.

Uber has also lied to drivers about how much they can make. As late as 2015, the company claimed that drivers could earn $90,000 a year working for them. In an exposé for the Philadelphia City Paper, reporter Emily Guendelsberger worked as an UberX driver and found this to be far from the truth. “If I worked 10 hours a day, six days a week with one week off, I’d net almost $30,000 a year before taxes,” she wrote. “But if I wanted to net that $90,000-a-year figure that so many passengers asked about, I would only have to work, let’s see . . . 27 hours a day, 365 days a year.” That doesn’t include the money required to maintain and insure the car. Thanks to financing from Goldman Sachs, Uber offers its drivers predatory “deep subprime” loans to acquire their cars, which drivers then have to work extra hours to service. (...)
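To see how lopsided Guendelsberger’s arithmetic is, here is a quick back-of-the-envelope check. It is only a sketch: the net figure and schedule below are assumptions filled in from her “almost $30,000” quote, not numbers she published.

    # Rough check of the reporter's math; the inputs are assumptions, not her exact figures.
    net_per_year = 28_500          # "almost $30,000" net, before taxes (assumed)
    hours_per_year = 10 * 6 * 51   # 10 hours a day, 6 days a week, one week off
    hourly_net = net_per_year / hours_per_year

    target = 90_000                       # Uber's advertised annual earnings
    hours_needed = target / hourly_net    # hours required at the same net rate
    print(f"${hourly_net:.2f}/hr net; {hours_needed:,.0f} hours a year, "
          f"about {hours_needed / 365:.0f} hours a day, every day")
    # -> roughly $9.30 an hour and more than 9,600 hours a year: the impossible workday behind her quip.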

Philadelphia was one of the last cities in Pennsylvania to permit Uber and Lyft. As a kind of trial run, ride-sharing services were made temporarily legal in the city just in time for last summer’s Democratic National Convention. All the convention materials advertised its presence. The first night’s party, hosted by the company, was picketed by taxi workers. Hundreds of Democrats walked past them.

Once Uber and Lyft were legalized in a city, it became impossible to hold them to existing regulations. When the companies refused to submit to driver-fingerprinting laws in Austin, the city put the requirement to a referendum and voters booted ride-sharing out of town. But in 2017, the Texas legislature overruled the city’s voters. The fingerprinting requirements were lifted, and Texans were left with no choice but to accept ride-sharing. The companies are believed to have spent $2.3 million on lobbying Texas lawmakers this year alone.

In their antiregulatory crusade, Uber and Lyft have fostered a divided society, pitting one kind of worker against another, one kind of user against another. The largest group of Uber drivers is white (40 percent), with black non-Hispanics the second largest (19.5 percent); the largest group of taxi drivers is black (over 30 percent), with white drivers in second (26 percent). Most Uber drivers are younger and have college experience, and many have degrees; most taxi drivers are older, married, and have never been to college. Though Uber is generally cheaper, its ridership is younger and richer than taxi riders, with most identifying themselves in the “middle 50 percent” of incomes (around $45,000 a year); seniors, the disabled, and the poor make up a higher percentage of taxi clientele than their share of the general population.

The political strategy behind ride-sharing lies in pitting the figure of the consumer against the figure of the citizen. As the sociologist Wolfgang Streeck has argued, the explosion of consumer choices in the 1960s and ’70s didn’t only affect the kinds of products people owned. It affected the way those people regarded government services and public utilities, which began to seem shabby compared with the vibrant world of consumer goods. A public service like mass transit came to seem less like a community necessity and more like one choice among many. Dissatisfied with goods formerly subject to collective provision, such as buses, the affluent ceased to pay for them, supporting private options even when public ones remained.

The promise of ride-sharing is that it complements public transit. In practice, ride-sharing eliminates public transit where it exists. The majority of ride-sharing trips in San Francisco take place in neighborhoods with the highest concentration of buses and subways, and even before New York’s summer of subway hell, train ridership had dipped. Bus ridership has decreased, too. What happened to all of those riders? Some are biking, some walking — but many are in cars on the streets. A 2017 study of traffic patterns proved conclusively that congestion in New York City has increased since the introduction of ride-sharing. Meanwhile, Uber and Lyft are negotiating with cities to replace public buses with subsidized rides.

by The Editors, N+1 |  Read more:
Image: uncredited

The Black Sun (detail), from the alchemical treatise “Splendor Solis”, 1582.
via:

On Identity Politics

Donald Trump’s victory last November was a shattering event for American liberalism. Surveying the destruction, the liberal Columbia University humanities professor Mark Lilla wrote that “one of the many lessons of the recent presidential election campaign and its repugnant outcome is that the age of identity liberalism must be brought to an end.” When his essay arguing for that claim appeared in The New York Times, it caused controversy on the left, because it dared to question one of American liberalism’s most dogmatically held beliefs.

Lilla has turned that op-ed piece into a short book called The Once and Future Liberal: After Identity Politics, which appears in bookstores today. It’s a thin but punchy book by a self-described “frustrated liberal” for liberals. Lilla is tired of losing elections, and tired of watching his own side sabotage itself. In an e-mail exchange, Lilla answered a few questions I put to him about the book: (...)

There is a barbed, pithy phrase toward the end of your book: “Black Lives Matter is a textbook example of how not to build solidarity.” You make it clear that you don’t deny the existence of racism and police brutality, but you do fault BLM’s political tactics. Would you elaborate?

There is no denying that by publicizing and protesting police mistreatment of African-Americans the BLM movement mobilized people and delivered a wake-up call to every American with a conscience. But then the movement went on to use this mistreatment to build a general indictment of American society and its racial history, and all its law enforcement institutions, and to use Mau-Mau tactics to put down dissent and demand a confession of sins and public penitence (most spectacularly in a public confrontation with Hillary Clinton, of all people). Which, again, only played into the hands of the Republican right.

As soon as you cast an issue exclusively in terms of identity you invite your adversary to do the same. Those who play one race card should be prepared to be trumped by another, as we saw subtly and not so subtly in the 2016 presidential election.

But there’s another reason why this hectoring is politically counter-productive. It is hard to get people willing to confront an injustice if they do not identify in some way with those who suffer it. I am not a black male motorist and can never fully understand what it is like to be one. All the more reason, then, that I need some way to identify with him if I am going to be affected by his experience. The more the differences between us are emphasized, the less likely I will be to feel outrage at his mistreatment.

There is a reason why the leaders of the Civil Rights Movement did not talk about identity the way black activists do today, and it was not cowardice or a failure to be woke. The movement shamed America into action by consciously appealing to what we share, so that it became harder for white Americans to keep two sets of books, psychologically speaking: one for “Americans” and one for “Negroes.” That those leaders did not achieve complete success does not mean that they failed, nor does it prove that a different approach is now necessary. There is no other approach likely to succeed. Certainly not one that demands that white Americans confess their personal sins and agree in every case on what constitutes discrimination or racism today. In democratic politics it is suicidal to set the bar for agreement higher than necessary for winning adherents and elections.

Chris Arnade, I believe it was, once wrote that college has replaced the church in catechizing America. You contend that “liberalism’s prospects depend in no small measure on what happens in our institutions of higher education.” What do you mean?

Up until the Sixties, those active in liberal and progressive politics were drawn largely from the working class or farm communities, and were formed in local political clubs or on union-dominated shop floors. Today they are formed almost exclusively in our colleges and universities, as are members of the mainly liberal professions of law, journalism, and education. This was an important political change, reflecting a deep social one, as the knowledge economy came to dominate manufacturing and farming after the sixties. Now most liberals learn about politics on campuses that are largely detached socially and geographically from the rest of the country – and in particular from the sorts of people who once were the foundation of the Democratic Party. They have become petri dishes for the cultivation of cultural snobbery. This is not likely to change by itself. Which means that those of us concerned about the future of American liberalism need to understand and do something about what has happened there.

And what has happened is the institutionalization of an ideology that fetishizes our individual and group attachments, applauds self-absorption, and casts a shadow of suspicion over any invocation of a universal democratic we. It celebrates movement politics and disprizes political parties, which are machines for reaching consensus through compromise – and actually wielding power for those you care about. Republicans understand this, which is why for two generations they have dominated our political life by building from the bottom up.

“Democrats have daddy issues,” you write. I’d like you to explain that briefly, but also talk about why you use pointed phrasing like that throughout your polemic. I think it’s funny, and makes The Once and Future Liberal more readable. But contemporary liberalism is not known for its absence of sanctimony when its own sacred cows are being gored.


I was referring to Democrats’ single-minded focus on the presidency. Rather than face up to the need to get out into the heartland of the country and start winning congressional, state, and local races – which would mean engaging people unlike themselves and with some views they don’t share – they have convinced themselves that if they just win the presidency by getting a big turnout of their constituencies on the two coasts they can achieve their goals. They forget that Clinton and Obama were stymied at almost every turn by a recalcitrant Congress and Supreme Court, and that many of their policies were undone at the state level. They get Daddy elected and then complain and accuse him of betrayal if he can’t just make things happen magically. It’s childish.

As for my writing, maybe Buffon was right that le style c’est l’homme même [style is the man — RD]. I find that striking, pithy statements often force me to think more than elaborate arguments do. And I like to provoke. I can’t bear American sanctimony, self-righteousness, and moral bullying. We are a fanatical people.

As a conservative reading The Once and Future Liberal, I kept thinking how valuable this book is for my side. You astutely point out that before he beat Hillary Clinton, Donald Trump trounced the GOP establishment. Republicans may hold the high ground in Washington today, but I see no evidence that the GOP is ready for the new “dispensation,” as you call the time we have entered. It’s all warmed-over, think-tank Reaganism. What lessons can conservatives learn from your book?

I hope not too many, and not until we get our house in order! But of course if Palin-Trumpism – we shouldn’t forget her role as Jane the Baptist – has taught us anything, it is that the country has a large stake in having two responsible parties that care about truth and evidence, accept the norms of democratic comportment, and devote themselves to ennobling the demos rather than catering to its worst qualities. Democrats won’t be able to achieve anything lasting if they don’t have responsible partners on the other side. So I don’t mind lending a hand.

I guess that if I were a reformist Republican the lessons I would draw from The Once and Future Liberal would be two. The first is to abandon dogmatic, anti-government libertarianism and learn to start speaking about the common good again. This is a country, a republic, not a campsite or a parking lot where we each stay in our assigned spots and share no common life or purpose. We not only have rights in relation to government and our fellow citizens, we have reciprocal duties toward them. The effectiveness, not the size, of government is what matters. We have a democratic one, fortunately. It is not an alien spaceship sucking out our brains and corrupting the young. Learn to use it, not demonize it.

The second would be to become reality-based again. Reaganism may have been good for its time, but it cannot address the problems that the country – and Republican voters – face today. What is happening to the American family? How are workers affected by our new capitalism? What kinds of services (e.g., maternity leave, worker retraining) and regulations (e.g., anti-trust) would actually help the economy perform better and benefit us all? What kind of educational system will make our workers more highly skilled and competitive (wrong answer: home schooling)? If you don’t believe me, simply read Ross Douthat and Reihan Salam’s classic Grand New Party, which laid this all out brilliantly and persuasively a decade ago. It’s been sitting on shelves gathering dust all this time while the party has skidded down ring after ring of the Inferno. (A conservative publisher should bring out an updated version…) Or take a look at the reformicon public policy journal National Affairs.

Oh, and a bonus bit of advice: get off the tit of Fox News. Now. It rots the brain, makes you crazy, ruins your judgment, and turns the demos into a mob, not a people. Find a more centrist Republican billionaire to set up a good, reality-based conservative network. And relegate that tree-necked palooka Sean Hannity to a job he’s suited for, like coaching junior high wrestling…

As you know, there is a lot of pessimistic talk now about the future of liberal democracy. There’s a striking line in your book: “What’s extraordinary — and appalling — about the past four decades of our history is that politics have been dominated by two ideologies that encourage and even celebrate the unmaking of citizens.” You’re talking about the individualism that has become central to our politics, both on the left and the right. I would say that our political consciousness has been and is being powerfully formed by individualism and consumerism — tectonic forces that work powerfully against any attempt to build solidarity. Another tectonic force is what Alasdair MacIntyre calls “emotivism” — the idea that feelings are a reliable guide to truth. Could it be the case that identity politics are the only kind of politics of solidarity possible in a culture formed by these pre-political forces?

It’s an interesting argument that, if I’m not mistaken, Ross Douthat has made in other terms. I can see that they might be gestures toward solidarity, but real solidarity comes when you identify more fully with the group and make a commitment to it, parking your individuality for the moment. Identitarian liberals have a hard time doing that.

Take the acronym LGBTQ as an example. It’s been fascinating to see how this list of letters has grown as each subgroup calls for recognition, rather than people in the groups finally settling on a single word as a moniker – say “gay,” or “queer,” or whatever. I don’t see how ID politics makes solidarity possible. Instead it just feeds what I call in the book the Facebook model of identity, one in which I “like” groups I temporarily identify with, and “unlike” them when I no longer do, or get bored, or just want to move on.

by Rod Dreher, American Conservative |  Read more:
Image: Christophe Dellory
[ed. See also: Back to the Progressive Future.]

Sunday, August 20, 2017


Ōhno Bakufu 大野麦風 (1888–1976).
via:

The Electric-Bike Conundrum

It was nighttime, a soft summer night, and I was standing on Eighty-second Street and Second Avenue, in Manhattan, with my wife and another couple. We were in the midst of saying goodbye on the small island between the bike lane and the avenue when a bike whooshed by, soundless and very fast. I had been back in New York for only a week. As is always the case when I arrive after a period of months away, I was tuned to any change in the city’s ambient hum. When that bike flew past, I felt a shift in the familiar rhythm of the city as I had known it. I watched the guy as he travelled on the green bike path. He was speeding down the hill, but he wasn’t pedalling and showed no sign of exertion. For a moment, the disjunction between effort and velocity confused me. Then it dawned on me that he was riding an electric bike.

Like most of the guys you see with electric bikes in New York, he was a food-delivery guy. Their electric bikes tend to have giant batteries and motors capable of tremendous torque and horsepower. They are the vanguard, the visible part of the iceberg, but they are not indicative of what is to come. Their bikes are so conspicuously something other than a bike, for one thing. For another, the utility of having a battery speed up your delivery is so straightforward that it forecloses discussion. What lies ahead is more ambiguous. The electric bikes for sale around the city now have batteries that are slender, barely visible. The priority is not speed so much as assisted living.

I grew up as a bike rider in Manhattan, and I also worked as a bike messenger, where I absorbed the spartan, libertarian, every-man-for-himself ethos: you needed to get somewhere as fast as possible, and you did what you had to do in order to get there. The momentum you give is the momentum you get. Bike messengers were once faddish for their look, but it’s this feeling of solitude and self-reliance that is, along with the cult of momentum, the essential element of that profession. The city—with its dedicated lanes and greenways—is a bicycle nirvana compared with what it once was, and I have had to struggle to remake my bicycle life in this new world of good citizenship. And yet, immediately, there was something about electric bikes that offended me. On a bike, velocity is all. That guy on the electric bike speeding through the night was probably going to have to brake hard at some point soon. Had he pedalled hard to attain top speed on the Second Avenue hill that sloped down from the high Eighties, it would have been his right to squander it. But he hadn’t worked to go that fast. And, after he braked—for a car, or a pedestrian, or a turn—he wouldn’t have to work to pick up speed again.

“It’s a cheat!” my friend Rob Kotch, the owner of Breakaway Courier Systems, said, when I got him on the phone and asked him about electric bikes. “Everyone cheats now. They see Lance Armstrong do it. They see these one-percenters making a ton of money without doing anything. So they think, why do I have to work hard? So now it’s O.K. for everyone to cheat. Everyone does it.” It took me a few minutes to realize that Kotch’s indignation on the subject of electric bikes was not coming from his point of view as a courier-system owner—although there is plenty of that. (He no longer employs bike messengers as a result of the cost of workers’ compensation and the competition from UberEATS, which doesn’t have to pay workers’ comp.) Kotch’s strong feelings were driven—so to speak—by his experience as someone who commutes twenty-three miles on a bicycle each day, between his home in New Jersey and his Manhattan office. He has been doing this ride for more than twenty years. (...)

I laughed and told him about a ride I took across the Manhattan Bridge the previous night, where several electric bikes flew by me. It was not, I insisted, an ego thing about who is going faster. Lots of people who flew by me on the bridge were on regular bikes. It was a rhythm thing, I said. On a bike, you know where the hills are, you know how to time the lights, you calibrate for the movement of cars in traffic, other bikes, pedestrians. The electric bike was a new velocity on the streets.

And yet, for all our shared sense that something was wrong with electric bikes, we agreed that, by any rational measure, they are a force for good.

“The engines are efficient, they reduce congestion,” he said.

“Fewer cars, more bikes,” I said.

We proceeded to list a few other Goo-Goo virtues. (I first encountered this phrase—short for good-government types—in Robert Caro’s “The Power Broker,” about Robert Moses, the man who built New York for the automobile.)

“If it’s such a good thing, why do we have this resentment?” I asked.

He wasn’t sure, he said. He confessed that he had recently tried a friend’s electric bike and found the experience appealing to the point of corruption.

“It’s only a matter of time before I get one,” he said ruefully. “And then I’ll probably never get on a real bike again.”

In some ways, the bike-ification of New York City can be seen as the ultimate middle finger raised to Robert Moses, a hero for building so many parks who then became a crazed highway builder who wanted to demolish part of Greenwich Village to make room for a freeway. But are all the bikes a triumph for his nemesis, Jane Jacobs, and her vision of cohesive neighborhoods anchored by street life, by which she meant the world of pedestrians on the sidewalk?

“The revolution under Bloomberg was to see the city as a place where pedestrians come first,” a longtime city bike rider and advocate I know, who didn’t wish to be named, said. “This electric phenomenon undermines this development. The great thing about bikes in the city is that, aesthetically and philosophically, you have to be present and aware of where you are, and where others are. When you keep introducing more and more power and speed into that equation, it goes against the philosophy of slowing cars down—of traffic calming—in order to make things more livable,” he said.

by Thomas Beller, New Yorker | Read more:
Image: Sophia Foster-Dimino

Bengt G. Pettersson, Boat Bridge at Evening, Denmark, 1973.
via:

It’s Complicated

Have you ever thought about killing someone? I have, and I confess that it brought me peculiar feelings of pleasure to fantasize about putting the hurt on someone who had wronged me. I am not alone. According to the evolutionary psychologist David Buss, who asked thousands of people this same question and reported the data in his 2005 book, The Murderer Next Door, 91 percent of men and 84 percent of women reported having had at least one vivid homicidal fantasy in their life. It turns out that nearly all murders (90 percent by some estimates) are moralistic in nature—not cold-blooded killing for money or assets, but hot-blooded homicide in which perpetrators believe that their victims deserve to die. The murderer is judge, jury, and executioner in a trial that can take only seconds to carry out.

What happens in brains and bodies at the moment humans engage in violence with other humans? That is the subject of Stanford University neurobiologist and primatologist Robert M. Sapolsky’s Behave: The Biology of Humans at Our Best and Worst. The book is Sapolsky’s magnum opus, not just in length, scope (nearly every aspect of the human condition is considered), and depth (thousands of references document decades of research by Sapolsky and many others) but also in importance as the acclaimed scientist integrates numerous disciplines to explain both our inner demons and our better angels. It is a magnificent culmination of integrative thinking, on par with similar authoritative works, such as Jared Diamond’s Guns, Germs, and Steel and Steven Pinker’s The Better Angels of Our Nature. Its length and detail are daunting, but Sapolsky’s engaging style—honed through decades of writing editorials, review essays, and columns for The Wall Street Journal, as well as popular science books (Why Zebras Don’t Get Ulcers, A Primate’s Memoir)—carries the reader effortlessly from one subject to the next. The work is a monumental contribution to the scientific understanding of human behavior that belongs on every bookshelf and many a course syllabus.

Sapolsky begins with a particular behavioral act, and then works backward to explain it chapter by chapter: one second before, seconds to minutes before, hours to days before, days to months before, and so on back through adolescence, the crib, the womb, and ultimately centuries and millennia in the past, all the way to our evolutionary ancestors and the origin of our moral emotions. He gets deep into the weeds of all the mitigating factors at work at every level of analysis, which is multilayered, not just chronologically but categorically. Or more to the point, uncategorically, for one of Sapolsky’s key insights to understanding human action is that the moment you proffer X as a cause—neurons, neurotransmitters, hormones, brain-specific transcription factors, epigenetic effects, gene transposition during neurogenesis, dopamine D4 receptor gene variants, the prenatal environment, the postnatal environment, teachers, mentors, peers, socioeconomic status, society, culture—it triggers a cascade of links to all such intervening variables. None acts in isolation. Nearly every trait or behavior he considers results in a definitive conclusion, “It’s complicated.”

Does this mean we are relieved of moral culpability for our actions? As the old joke goes: nature or nurture—either way, it’s your parents’ fault. With all these intervening variables influencing our actions, where does free will enter the equation? Like most scientists, Sapolsky rejects libertarian free will: there is no homunculus (or soul, or separate entity) calling the shots for you, but even if there were a mini-me inside of you making choices, that mini-me would need a mini-mini-me inside of it, ad infinitum. That leaves two options: complete determinism and compatibilism, or “mitigated free will,” as Sapolsky calls it. A great many scientists are compatibilists, accepting the brute fact of a deterministic world with governing laws of nature that apply fully to humans, while conceding that such factors as brain injury, alcoholism, drug addiction, moments of uncontrollable rage, and the like can account for some criminal acts.

Sapolsky will have none of this. (...) Sapolsky quotes American cognitive scientist Marvin Minsky in support of the position that free will is really just “internal forces I do not understand.”

This is the part of Behave where the academic rubber meets the legal road as Sapolsky ventures into the areas of morality and criminal justice, which he believes needs a major overhaul. No, we shouldn’t let dangerous criminals out of jail to wreak havoc on society, but neither should we punish them for acts that, if we believe the science, they were not truly responsible for committing. Punishment as retribution is meaningless unless it is meted out in Skinnerian doses with the goal of deterring unwanted behaviors. Some progress has been made on this front. People who regularly suffer epileptic seizures are not allowed to drive, for example, but we don’t think of this ban as “punishing” them for their affliction. “Crowds of goitrous yahoos don’t excitedly mass to watch the epileptic’s driver’s license be publicly burned,” Sapolsky writes in his characteristic style. “We’ve successfully banished the notion of punishment in that realm. It may take centuries, but we can do the same in all our current arenas of punishment.”

by Michael Shermer, American Scholar |  Read more:
Image: Angelica Kauffman, Self-Portrait Hesitating between the Arts of Music and Painting, 1791

Steve Jobs’s Mock Turtleneck Gets a Second Life

Of the many technological and artistic triumphs of the fashion designer Issey Miyake—from his patented pleating to his soulful sculptural forms—his most famous piece of work will end up being the black mock turtleneck indelibly associated with Apple co-founder Steve Jobs.

The model was retired from production in 2011, after Jobs’s death, but in July, Issey Miyake Inc.—the innovative craftsman’s eponymous clothing brand—is releasing a $270 garment called the Semi-Dull T. It’s 60 percent polyester, 40 percent cotton, and guaranteed to inspire déjà vu.

Don’t call it a comeback. The company is at pains to state that the turtleneck, designed by Miyake protégé Yusuke Takahashi with a trimmer silhouette and higher shoulders than the original, isn’t a reissue. And even if the garment were a straight-up imitation, its importance as a cultural artifact is more about the inimitable way Jobs wore it.

For Jobs, this way of dressing was a kind of consolation prize after employees at Apple Inc. resisted his attempts to create a company uniform. In the early 1980s he’d visited Tokyo to tour the headquarters of Sony Corp., which had 30,000 employees in Japan. And all of them—from co-founder Akio Morita to each factory worker, sales rep, and secretary—wore the same thing: a traditional blue-and-white work jacket.

In the telling of Jobs biographer Walter Isaacson, Morita explained to Jobs that Sony had imposed a uniform since its founding in 1946. The workers of a nation humiliated in war were too broke to dress themselves, and corporations began supplying them with clothes to keep them looking professional and create a bond with their colleagues. In 1981, for Sony’s 35th anniversary, Morita had commissioned Miyake, already a fashion star after showing innovative collections in Paris, to design a jacket. Miyake returned with a futuristic taupe nylon model with no lapels and sleeves that unzipped to convert it into a vest.

Jobs loved it and commissioned Miyake to design a vest for Apple, which he then unsuccessfully pitched to a crowd in Cupertino, Calif. “Oh, man, did I get booed off the stage,” Jobs told Isaacson. “Everybody hated the idea.” Americans, with their cult of individuality, tend not to go in for explicit uniformity, conforming instead to dress codes that are never written down.

This left Jobs to contrive a uniform for himself, and he drew his daily wardrobe from a closet stocked with Levi’s 501s, New Balance 991s, and stacks of black mock turtlenecks—about 100 in total—supplied by Miyake.

How Jobs came to settle on this particular item of clothing isn’t recorded, but it had long been a totem of progressive high-culture types—San Francisco beatniks, Left Bank chanteuses, and Samuel Beckett flinching at the lens of Richard Avedon.

In the analysis of costume historian Anne Hollander, the existentialist black turtleneck indicates “the kind of freedom from sartorial convention demanded by deep thought,” and it’s tempting to read Jobs’s as the descendant of that symbol. His turtleneck was an extension of his aesthetic aspirations: severe but serene, ascetic but cushy. The garment, as Jobs wore it, was the vestment of a secular monk.

The shirt put an especially cerebral spin on the emerging West Coast business-casual look, implying that the Apple chief had evolved past such relics as neckties—an anti-establishment gesture that set a template for hoodie-clad Mark Zuckerbergs and every other startup kid disrupting a traditional dress code. In its minimalism and simplicity, the black turtleneck gave a flatscreen shimmer to Jobs’s self-presentation, with the clean lines of a blank slate and no old-fashioned buttons.

by Troy Patterson, Bloomberg |  Read more:
Image: Ted Cavanaugh for Bloomberg Pursuits, Stylist: Chloe Daley

Saturday, August 19, 2017

Aging Parents With Lots of Stuff, and Children Who Don’t Want It

Mothers and daughters talk about all kinds of things. But there is one conversation Susan Beauregard, 49, of Hampton, Conn., is reluctant to have with her 89-year-old mother, Anita Shear: What to do — eventually — with Mrs. Shear’s beloved set of Lenox china?

Ms. Beauregard said she never uses her own fine china, which she received as a wedding gift long ago. “I feel obligated to take my mom’s Lenox, but it’s just going to sit in the cupboard next to my stuff,” she said.

The only heirlooms she wants from her mother, who lives about an hour away, in the home where Ms. Beauregard was raised, are a few pictures and her mother’s wedding band and engagement ring, which she plans to pass along to her son.

So, in a quandary familiar to many adults who must soon dispose of the beloved stuff their parents would love them to inherit, Ms. Beauregard has to break it to her mother that she does not intend to keep the Hitchcock dining room set or the buffet full of matching Lenox dinnerware, saucers and gravy boats.

As baby boomers grow older, the volume of unwanted keepsakes and family heirlooms is poised to grow — along with the number of delicate conversations about what to do with them. According to a 2014 United States census report, more than 20 percent of America’s population will be 65 or older by 2030. As these waves of older adults start moving to smaller dwellings, assisted living facilities or retirement homes, they and their kin will have to part with household possessions that the heirs simply don’t want.

“We went from a 3,000-square-foot colonial with three floors to a single-story, 1,400-square-foot living space,” said Tena Bluhm, 76, formerly of Fairfax, Va. She and her 77-year-old husband, Ray Bluhm, moved this month to a retirement community in Lake Ridge, Va.

Before the move, their two adult children took a handful of items, including a new bed and a dining table and chairs. But Mrs. Bluhm could not interest them in “the china and the silver and the crystal,” her own generation’s hallmarks of a properly furnished, middle-class home.

The competitive accumulation of material goods, a cornerstone of the American dream, dates to the post-World War II economy, when returning veterans fled the cities to establish homes and status in the suburbs. Couples married when they were young, and wedding gifts were meant to be used — and treasured — for life.

“Americans spent to keep up with the Joneses, using their possessions to make the statement that they were not failing in their careers,” wrote Juliet B. Schor, the Boston College sociologist, in her 1998 book, “The Overspent American: Why We Want What We Don’t Need.”

But for a variety of social, cultural, and economic reasons, this is no longer the case. Today’s young adults tend to acquire household goods that they consider temporary or disposable, from online retailers or stores like Ikea and Target, instead of inheriting them from parents or grandparents.

This represents a significant shift in material culture, said Mary Kay Buysse, executive director of the National Association of Senior Move Managers, a professional organization of moving specialists who help older people downsize.

“This is the first time we’re seeing a kink in the chain of passing down mementos from one generation to another,” Ms. Buysse said in a telephone interview from the group’s headquarters in Hinsdale, Ill.

Accordingly, the senior move management industry has experienced unprecedented growth in recent years, Ms. Buysse said.

by Tom Verde, NY Times |  Read more:
Image: T.J. Kirkpatrick

'A Bit More'

Last year I fell in love with a toaster.

It looks like most others. A brushed, stainless-steel housing. Four slots, to accommodate the whole family’s bread-provisioning needs. It is alluring but modest, perched atop the counter on proud haunches.

But at a time when industry promises disruptive innovation, Breville, the Australian manufacturer of my toaster, offers something truly new and useful through humility rather than pride.

The mechanism that lowers the bread into the chassis and raises it back out is motorized. After I press a button atop the frame, the basket silently lowers the bread into the device to become toast. On its own, this feature seems doomed to mechanical failure. But the risk is worthwhile to facilitate the toaster’s star ability: the “A Bit More” button. That modest attribute offers a lesson for design of all stripes—one that could make every designed object and experience better.

Toast is an imperfect art. Different breads brown at different rates. Even with the very same bread, similar toaster settings can produce varied results. When my bread doesn’t come up dark enough, I dial in a guess for another browning run. Usually I go overboard and burn the toast in the process. It’s a game of toaster telephone.

The “A Bit More” button enters here, at the friction point between good and great toast. When the toast reveals itself to me above the Breville’s chassis, I visually gauge its brownness. If insufficient, I press the button, which actuates the basket motor. Down it goes for a brief return visit to the coil. Then back up again, having been toasted, well, just a bit more.

The button also makes toasting bread, normally a quantitative act, more qualitative. The lever dials in numerical levels of browning, and the “A Bit More” button cuts it with you-know-what-I-mean ambiguity. That dance between numbers and feelings apologizes even for a slightly over-browned slice of toast by endearing the eater to the result the button helped produce.

Sure, I’m talking about toast. But Breville’s “A Bit More” Button is nothing short of brilliant. It highlights an obvious but still unseen problem with electric toasters, devices that have been around for more than a century. And then it solves that problem in an elegant way that is also delightful to use. It’s just the kind of solution that designers desperately hope to replicate, and users hope to discover in ordinary products. But agreeing on a method for accomplishing such achievements is harder.
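To make the two paths concrete, here is a minimal sketch of the control flow the button implies. The class, method names, and timings are illustrative assumptions, not Breville’s actual firmware.

    # Illustrative sketch only: names and durations are assumptions, not Breville's design.
    import time

    class Toaster:
        A_BIT_MORE_SECONDS = 15            # assumed brief top-up; the real duration isn't published

        def __init__(self):
            self.basket_down = False

        def _run_coils(self, seconds):
            self.basket_down = True        # motorized basket lowers the bread
            time.sleep(seconds)            # heating coils stay on for the requested time
            self.basket_down = False       # basket raises the result for inspection

        def toast(self, browning_level):
            # The dial path: a numeric setting maps to one full, quantitative cycle.
            self._run_coils(browning_level * 30)

        def a_bit_more(self):
            # The button path: a short, fixed top-up after a visual check,
            # instead of guessing at a whole new browning setting.
            self._run_coils(self.A_BIT_MORE_SECONDS)

    toaster = Toaster()
    toaster.toast(browning_level=2)        # first pass comes up a little pale
    toaster.a_bit_more()                   # so: just a bit more

The point of the sketch is the asymmetry: the dial commits to a whole cycle up front, while the button adds a small, bounded correction after the fact.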

The “A Bit More” Button was conceived by the industrial designer Keith Hensel, who worked for Sunbeam and then as Breville’s principal designer until his unexpected death in 2013, at the age of 47. His specialty was household products, like toasters, kettles, and blenders.

Breville’s head designer, Richard Hoare, tells me that Hensel, with whom he worked closely, fell upon the idea by “focusing on user empathy.” Hensel had been pondering the problem people have with toasters. “Your bread comes up too light, so you put it back down, then get distracted and forget, and it goes through a full cycle and burns,” Hoare relates. “Keith thought, why can’t the consumer have more control? Why can’t they have ‘A Bit More?’”

According to Hoare, the design team called the button by that name from the start. Some people within Breville thought it was too colloquial, and other options were considered. “Extra Darkness” was one, and “10% Extra” another. “These were confusing and clunky,” says Hoare. “In the end ‘A Bit More’ was the clearest.” Breville, which holds several patents in motorized toaster basket tech, started selling toasters with the feature in 2008. (...)

Hoare’s recollection corresponds with a trend in contemporary design practice—and one that claims to be particularly adept at producing outcomes like “A Bit More.” It’s called user-experience, or UX, design, a discipline that strives to craft pleasurable and useful encounters between people and things. Originally derived from human-computer interaction, or HCI, where user-interface design was its ancestor, UX purports to offer a general approach to design of all kinds, from software design to product design to architecture and urban planning.

But UX practice talks out of both sides of its mouth. On the one hand, UX fancies itself an empirical discipline. Its processes include ethnographic user research, specification drafting, iterative design, user testing, and so forth. UX inherits mid-century form-follows-function design ideals. It also embraces more recent trends, like participatory design, which deeply integrates stakeholders into the design process. Data are often incorporated into UX for affirming, denying, or directing elsewhere a design team’s attention.

On the other hand, UX design also privileges out-of-the-box genius to solve design problems. Apple, often considered to typify UX, is famous for conducting design in secret via a small cadre of geniuses. Steve Jobs is the ultimate example, a figure who held that “people don’t know what they want until you show it to them.” In the design-genius mentality, how a toaster (or smartphone, or building) ought to work becomes a type of soothsaying, whereby the designer earns the status of mastermind. Research becomes retrospective justification, the designer’s ingenuity validated by user adoption of the product—irrespective of how well it really serves their goals or interests.

Neither polarity of UX-style design really helps explain how one might best arrive at Breville’s “A Bit More” button. On one side is intuition. Keith Hensel, the genius who died too soon, possessed a sixth sense for taming the Maillard reaction and a congenial manner for proselytizing his solution. On the other side is evidence, via the research and participant observation conducted to cash out the “user empathy” Hoare cites as a compass bearing.

UX proponents tell tall tales about how good design really takes place. Bottom-up, evidentiary design implies that the designer is ultimately unnecessary, a mere facilitator who draws out a solution from the collective. The designer becomes a bureaucrat. And top-down, genius design becomes indistinguishable from salesmanship. As a result, design dissolves into other, more established disciplines like business intelligence, product marketing, and corporate evangelism. It’s an error that makes good design look far easier and more replicable than it really is. And worse, it allows people to conclude that their own expertise—from data analytics to advertising to illustration—is a sufficient stand-in for design. (...)

Allow me to indulge an analogy from philosophy. In both the genius and consensus registers, UX design predicates its success on knowledge: either the second sight of the designer, or the negotiated consensus of the user. Philosophers call the study of knowledge epistemology, and this approach to design is entirely epistemological. Just find the proper knowledge and the right design will emerge.

But when conducted best—including in Breville’s case, and despite Hoare’s insistence otherwise—design is more related to the philosophy of what things are, called ontology. It is a discipline of essence, that great bugbear of contemporary life, not of knowledge. Pursuing greater compatibility with a thing’s essence requires that the designer focus on the abstraction formed by the designed object and its human users together—whether it be toasting, dwelling, publishing, socializing, or anything else.

The designer’s job is not to please or comfort the user, but to make an object even more what it already is. Design is the stewardship of essence—not the pursuit of utility, or delight, or form. This is the orientation that produces solutions like the Breville “A Bit More” button. The design opportunities that would otherwise go unnoticed emerge not from what people know about or desire for toasting, but from deeply pursuing the nature of toasting itself.

by Ian Bogost, The Atlantic |  Read more:
Image: Breville

Mika Mäkinen, Dinosaur jr. - Gig poster project
via:

Pornhub Is the Kinsey Report of Our Time

The streaming sex empire may have done more to expand the sexual dreamscape than Helen Gurley Brown, Masters and Johnson, or Sigmund Freud.

Waking up on a Sunday morning, I received a text about what happened after I left the previous night’s party. “Everyone got high and we played truth or dare. Ted and Ivan docked.”

“Are you serious?” I replied. “I thought that only happened in porn.” Defined by Urban Dictionary as “the act of placing the head of one’s penis inside the foreskin of another’s penis,” docking is an act that, until that fateful night, nobody at the party had attempted or witnessed firsthand. (Or so they claimed.) But once you know a thing is a thing, sometimes you can’t get it out of your mind. And in a fit of libidinous boredom, or idle curiosity, or lust, or who even knows why anyone does anything anyway — you do that thing. Because that thing exists, and so do you. At some point, someone had to.

On the internet, there is a maxim known as Rule 34, which states: If you can imagine it, there is porn of it. No exceptions. And now that we are solidly into the age of internet pornography, I believe we are ready for another maxim: If there is porn of it, people will try it. (Maybe we can call it Rule 35.) And if people are trying that thing, then inevitably some of them will make videos of that thing and upload those to the internet. The result: an infinitely iterating feedback loop of sexual trial and error. Once upon a time, someone would try something new on film and it would take years to circulate on VHS or DVD through a relatively small community of porn watchers. But today, even the mainstream is porn-literate, porn-saturated, and porn-conversant. For a sexual butterfly effect to take place, you don’t even need to try that thing with your body — you can watch it, text about it, post jokes about it on Tumblr, chat about it on Grindr, masturbate while thinking about it, and type its name into so many search engines as to alter the sexual universe. There is such a thing, now, as a sexual meme — erotic acts and fantasies that replicate and spread like wildfire.

For we are living in a golden age of sexual creativity — an erotic renaissance that is, I believe, unprecedented in human history. Today you can, in a matter of minutes, see more boners than the most orgiastic member of Caligula’s court would see in a lifetime. This is, in itself, enough to revolutionize sexual culture at every level. But seeing isn’t even the whole story — because each of us also has the ability to replicate, share, and reinvent everything we see. Taken as a whole, this vast trove of smut is the Kinsey Report of our time, shedding light on the multiplicity of erotic desires and sexual behaviors in our midst. (...)

As long as there has been porn, there have been people worrying that porn is damaging sex. I’m not here to join that debate. The deeper we go down the internet-porn wormhole, the more it seems narrow-minded to understand porn exclusively in terms of what kind of sex it “teaches” us to have. Because in the streaming era, the amount and diversity of porn we watch exponentially outpaces that of the sex we have. Porn is bigger than its real-sex analog, and the difference isn’t just volume: The porn we see is weirder, wilder, and more particular than what most of us will ever have — or want — in our own lives. An expansive erotic landscape unto itself, pornography exists adjacent to and in constant conversation with real sex — but is much more capricious and capacious and creative. Pornography is more than a mere causal agent in the way we screw. It has also become a laboratory of the sexual imagination — and as such, it offers insight into a collective sexual consciousness that is in a state of high-speed evolution.

The speed of that evolution may be best observed in the deluge of sexual memes that depart from traditional real-world sexual behavior. In addition to acts like pussy-slapping and ball-squeezing — which could theoretically be included in some crazily updated version of The Joy of Sex — the new generation of sexual memes includes a new set of narrative memes. Pornographic scene-setting, erotic situations, and role-playing are being reinvented, and imaginations have expanded to accommodate a never-ending supply of novel stimuli. Some of these memes seem to live almost entirely within the realm of porn. (Does anybody enjoy being searched by the TSA?) Some may have real-world origins, but have undergone so much reimagining as to approach derivative art. (When homemade-porn versions of the video game Overwatch spiked last year, had there been a preceding spike in dirty talk in the headsets of Overwatch players?) And others are only acceptable when they don’t have real-world analogs. “Is it me or is there way too much stepdaughter porn lately?” a straight man recently asked. He was right, and it doesn’t stop there: In the U.S. in 2015 and 2016, the most popular search term on Pornhub was “stepmom.” Though he said he was “immensely insulted” by the genre, that didn’t prevent him from watching. “If I ignore the title and the girl looks hot, I open it.” And no, “stepsister” porn has not made him feel any different about his sisters, and I can go to hell for asking. (...)

How users navigate that material in private — what they choose to watch, in what sequence and for how long — is a sexual-sociological gold mine. MindGeek’s understanding of its users’ autoerotic habits is almost terrifyingly precise. Like Facebook, Google, Netflix, and every other major player online, Pornhub collects and analyzes a staggering amount of user data — some of which it uses, like those other companies, to help curate content and determine what a user sees. Pornhub also publicizes some of its anonymized findings on the company’s data-analytics blog, Pornhub Insights. (Which means the X-rated version of Netflix is actually more casual with its data than the real Netflix. Knowledge of the human condition, in the age of big data, is idiosyncratic and subject to corporate marketing strategies.) To celebrate the website’s tenth anniversary, Pornhub Insights analyzed a decade’s worth of data — and provided access to that data, granting us an unusual peek into the internet’s collective id. And it’s an id that is constantly shape-shifting — sometimes very rapidly. New sexual memes are invented daily, and when they explode in popularity, they can spawn thousands of spinoffs and imitators. And sometimes they fade away just as quickly — another porn fad that came, conquered, and vanished. Overnight.

by Maureen O’Connor, The Cut | Read more:
Image: Ben Wiseman
[ed. See also: What We Learned About Sexual Desire From 10 Years of Pornhub User Data]

There Is More to Becoming an Elite Route Runner Than Meets the Eye

Save for the lucky few anointed as quarterbacks, every kid who picks up a football starts as a wide receiver. At their core, backyard games are a series of one-on-one clashes between pass catchers and defensive backs, and the first challenge any aspiring gridiron star faces is learning how to get open. No skill on a football field is more relatable. No goal is more familiar.

That shared experience is part of what makes route running at the highest level so misunderstood. On one level, the idea of beating the person across from you is among the simplest in football. But against NFL cornerbacks, creating space requires as much nuance and attention to detail as any undertaking in the sport. “It’s all about efficiency,” Packers wide receiver turned running back Ty Montgomery tells The Ringer. “I think you learn that through repetition. How many steps [are you] taking at the top? How [are you] getting off the line? How are you creating separation? What ways are you able to make the same route look different every time you run it?”

Route running is a skill that’s both oft-discussed and underappreciated, and it’s become increasingly coveted in an era when many prospects come from spread backgrounds and have less formal training in that respect than ever before. The question, then, is what distinguishes a novice route runner from an expert—and how improvement happens. I talked to some of the league’s best receiving coaches and route runners to find out what goes into a part of the game that’s far more complex than it sounds.

When practices begin each season, Bengals wide receivers coach James Urban starts at square one with his players. Whether he’s working with six-time Pro Bowler A.J. Green or rookie first-round draft pick John Ross, Urban teaches every one of his receivers how to line up in a proper stance, which involves positioning the outside foot forward in order to create an initial burst with the back leg. “We use those foundations so when something kicks up or something isn’t quite as clean as we want it to be or doesn’t look right or the timing’s not right, I can say, ‘Hey, fix your stance,’” Urban says. “And then they know what that means.”

Part of the goal is to create consistency among the receiving corps. Part of it is correcting the mistakes of players who have used the wrong get-off for years. Cardinals receivers coach Darryl Drake claims that making quick adjustments is especially crucial when it comes to young players. “It has to become a habit more than anything else,” Drake says. “And it takes a while when you’ve been doing it [wrong] for four or five years.”

From there, the next step is reinforcing the fundamentals: pushing off the outside foot at the snap rather than dropping it back, learning which foot to plant with on inside and outside cuts, and keeping one’s shoulders over the knees in order to stay balanced and give off the illusion of running a vertical route for as long as possible. These are the types of things that go unnoticed by the casual fan watching on TV, but serve as the building blocks for every receiver. And even for stalwarts like Packers star Jordy Nelson, there is room for small tweaks that can make a huge difference on the field.

When Green Bay wide receivers coach Luke Getsy arrived on the staff as a quality-control assistant in 2014, he introduced a new method for getting in and out of the break at the top of routes. By first planting on the inside foot—as opposed to the outside foot—when getting to the break of a route, the Packers receivers eliminated one small step and created a subtle but vital advantage. “By allowing us to get to that drop in [three steps] and letting our plant foot hit before or at the same time as the DB, we’re going to be successful no matter how good the DB is,” Nelson says. With 98 catches for 1,519 yards and 13 touchdowns, the 2014 season also happened to be the most productive of Nelson’s career.

For younger players, picking up on these types of tricks during film sessions and drills can mean transforming from an average route runner into a devastating one. During the early years of his career, Ravens running back Danny Woodhead had the privilege of playing alongside some of the best route runners at their respective positions that the game has ever seen: LaDainian Tomlinson, Antonio Gates, and Wes Welker. Each taught Woodhead something he’s carried with him for the rest of his career. “I’ve been fortunate because I’ve been able to play with some Hall of Famers,” Woodhead says. “It’s huge when you can watch someone who’s done it before, and not only done it before, but done it before at the highest, highest level.”

The mantra that Woodhead took from Welker was to try to make every route look identical until the last possible moment. These days, Woodhead will ask Baltimore’s linebackers if any slight lean or misstep gives away his routes during practice. For running backs, the goal when route running is to mimic the same release out of the backfield on every play. For receivers, the key is pushing vertically to make defenders think that they are streaking down the field each time they come off the ball. “That’s what scares a DB the most—[a wideout] going by him,” Rams wide receivers coach Eric Yarber says. “Something that’s going to strike up the band and get the fans going. That makes a DB tremble and poo-poo in his pants.”

Yarber says that the main weakness most young players have is a lack of patience. They lift their chests too early, tipping their hand and letting opposing cornerbacks know it’s time to slow down. Other young wideouts have a tendency to flail their arms to the side as they come to a halt—“the air brakes,” as Urban calls them.

Good route runners keep their bodies compact as they move up the field; the greats eliminate any possible indicator as to which direction they’re going. This obsession with deception has led some receivers to have coverage preferences that may seem counterintuitive at first blush. Cowboys slot receiver Cole Beasley says that while no receiver likes to be manhandled, he’ll take matching up with a tight press-coverage corner over trying to beat a defender who cedes a few yards of ground any day.

“I feel like from further off [from a defender], you have to be more precise with your movements,” Beasley says. “You could give something away easier because they’re looking at you from a further distance. They can see your whole body. But when you’re right there, there’s not much for their eyes to focus on.”

Mastering the mechanics is only part of the equation, though. To rise into the upper echelon, receivers must not only have a keen awareness of their technique; they must also develop a sense for what the defense is trying to accomplish.

by Robert Mays, The Ringer |  Read more:
Image: Getty Images/Ringer Illustration

Friday, August 18, 2017

Hooper's Law of Drug Development

We've come to expect technology to improve each year. Moore's Law is justifiably famous, with its remarkable ability to explain the past and predict the future. It states that the number of transistors squeezed onto integrated circuits doubles every two years; this pattern has held true for half a century. More transistors on chips allow computers to perform faster mathematical calculations.

Moore's Law is optimistic and reflects the ability of humans to "chip" away at a problem, making sequential, cumulative advances. Much of technology fits this pattern. One glaring exception, tragically, is the drug development conducted by pharmaceutical companies. It is hugely expensive and has gotten more so each year. If costs continue to grow at 7.5 percent per year, real costs will more than double every 10 years. The pharmaceutical industry seems to be operating under a reverse-Moore's Law. I call it Hooper's Law. Here's the short version: Drug development costs double every decade. Why? Simple: the U.S. Food and Drug Administration is steadily increasing the cost per clinical trial participant and the number of required participants per clinical trial.
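[ed. A quick back-of-the-envelope check of that doubling claim, sketched in Python; the 7.5 percent growth rate is the only input, taken from the paragraph above.]

    import math

    growth_rate = 0.075  # real annual growth in drug development costs (from the text)

    # Years needed for costs to double at that rate
    doubling_time = math.log(2) / math.log(1 + growth_rate)
    print(f"Doubling time: {doubling_time:.1f} years")  # ~9.6 years

    # Cumulative growth over a full decade
    print(f"Cost multiple after 10 years: {(1 + growth_rate) ** 10:.2f}x")  # ~2.06x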

Technology and Moore's Law

The Cray 1 supercomputer that I used at NASA in the early 1980s cost an inflation-adjusted $28 million. Today's iPhone 7, at a cost of $650, packs the computing power of 2,000 Cray 1 supercomputers. Per dollar, the iPhone 7 performs 90 million times as many calculations as the Cray 1. And for that price, you get a phone too.
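[ed. The per-dollar comparison can be reconstructed from the figures in that paragraph. A minimal sketch, assuming "equal to 2,000 Cray 1 supercomputers" refers to raw calculation throughput.]

    cray_cost = 28_000_000     # inflation-adjusted Cray 1 price, in dollars
    iphone_cost = 650          # iPhone 7 price, in dollars
    performance_ratio = 2_000  # iPhone 7 compute expressed in Cray 1s

    # Calculations per dollar, iPhone 7 relative to Cray 1
    per_dollar_ratio = performance_ratio * (cray_cost / iphone_cost)
    print(f"{per_dollar_ratio:,.0f}")  # about 86 million, in line with the "90 million" cited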

Why shouldn't drug research and development fit this pattern? Every year scientists learn more about biology, physiology, pharmacology, and the natural history of diseases. They study what has worked and what hasn't. Their tools become more precise and more powerful. And yet the field of drug research and development seems immune to the powers that drive Moore's Law.

Drug Development is Expensive

Each year, to launch a certain number of new medicines, companies plow more and more money into research and development. Joseph DiMasi, Henry Grabowski, and Ronald Hansen, in a study performed for the Tufts Center for the Study of Drug Development, have estimated that the cost of bringing a new drug to market, in 2013 dollars, is $2.558 billion ($2.69 billion in 2017 dollars).[1, 2] Further, as a condition for approval, the FDA often requires drug companies to conduct post-marketing clinical trials to answer some remaining questions. Those post-marketing studies add $312 million, on average, to a drug's cost, raising the overall price tag to $2.87 billion in 2013 dollars ($3.02 billion in 2017 dollars).

Why is this number so large? One reason is that much of R&D is spent on the roughly 95 percent of drugs that fail along the way. The 95 percent failure rate is an average; some drugs have a 50 percent chance of success and others have a 1 percent chance. It depends on the drug, the therapeutic area, and the stage of the drug's development. A 2014 study by researchers at Cleveland Clinic found that 99.6 percent of more than 400 Alzheimer's clinical trials had failed.[3] The $2.558 billion tab accounts for those "dry holes." (...)
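[ed. A minimal sketch of how a roughly 95 percent failure rate inflates the cost per approved drug; the per-candidate figure below is an illustrative round number, not one from the Tufts study.]

    success_rate = 0.05        # roughly 95 percent of drug candidates fail
    cost_per_candidate = 100   # $ millions spent on a typical candidate (assumed, for illustration)

    # Each approval effectively carries the cost of the ~19 failures behind it
    cost_per_approval = cost_per_candidate / success_rate
    print(f"Expected cost per approved drug: ${cost_per_approval:,.0f} million")  # $2,000 million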

Reasons for Expensive Clinical Trials

Why have drugs become more expensive to develop? A few examples illustrate the trend.

When I worked at Merck in the early 1990s, one of its biggest drugs was Vasotec (enalapril). It was tested in 2,987 patients before FDA approval. Mevacor (lovastatin), another of Merck's big drugs at the time, was tested in 6,582 patients in the EXCEL Study. At the time, that was thought to be a massive trial.

Now the situation is different.

Orexigen Therapeutics was conducting clinical trials on the obesity compound Contrave (naltrexone/bupropion). In 2011, the FDA asked the company to conduct a trial on between 60,000 and 100,000 patients. This clinical trial would have been enormously expensive, especially given the resources available to a small company like Orexigen. In response to this request, Orexigen discontinued the development of Contrave and all of its other obesity drugs.[7] During this period, the firm's stock price dropped 70 percent, and Orexigen laid off 40 percent of its staff.

After negotiations with the FDA, Orexigen eventually ran a clinical trial on fewer than 10,000 patients. While this reduced requirement enabled the trial to proceed, it was still a huge and hugely expensive clinical trial.

The REVEAL trial, in which Merck is currently testing the experimental drug anacetrapib, includes a whopping 30,000 subjects and is being conducted at 430 hospitals and clinics in the United Kingdom, North America, China, Germany, Italy, and Scandinavia.

Between 1999 and 2005, the average length of a clinical trial grew from 460 days to 780 days, while the number of procedures on each patient (e.g., blood draws, scans) grew similarly, from 96 to 158.[8] Comparing the 2001-2005 period to the 2011-2015 period, one study found that the number of study participant visits to care providers (e.g., hospitals, clinics, doctors' offices) increased 23-29 percent; the number of distinct procedures increased 44-59 percent; the total number of procedures performed increased 53-70 percent; and the cost per study volunteer per visit increased 34-61 percent.[9]

The protocols for clinical trials—those written recipes for how patients are to be recruited, dosed, and evaluated—have become more complex, as well. Dr. Gerry Messerschmidt, chief medical officer at Precision Oncology, reports, "When I was writing protocols 20 years ago, they were one-third the size that they are now. The change has really been quite dramatic."[10]

Clinical trials are more expensive now because the cost per participant has increased at the same time that the number of participants has grown. Why? Again, the answer is the FDA. (...)

Pharmaceutical companies typically estimate the future expenses and revenues for each prospective drug, looking forward 20 years. In some cases I know of intimately, they hire consultants to estimate expenses, revenues, and probabilities of success at each phase of development. They use these data to compute the financial value of each pharmaceutical project and, if the expected value (probability-adjusted value) of the project is negative, the consultants recommend discontinuing development.
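[ed. A minimal, hypothetical sketch of the probability-adjusted calculation described above; the phases, probabilities, discount rate, and cash flows are illustrative assumptions, not figures from the article.]

    DISCOUNT_RATE = 0.10  # assumed cost of capital

    # (years from now, cash flow in $ millions, probability the project gets that far)
    cash_flows = [
        (1,  -150, 1.00),  # near-term trial costs
        (4,  -400, 0.30),  # Phase III costs, reached about 30% of the time
        (8,   900, 0.10),  # revenues if approved (about 10% overall success)
        (12,  600, 0.10),  # later revenues before patent expiry
    ]

    # Expected (probability-adjusted) net present value of the project
    expected_npv = sum(
        prob * cf / (1 + DISCOUNT_RATE) ** year
        for year, cf, prob in cash_flows
    )

    print(f"Expected NPV: ${expected_npv:.0f} million")
    print("Recommendation:", "continue" if expected_npv > 0 else "discontinue")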

Many new medicines are discarded for reasons that have nothing to do with safety and efficacy. Where the prospects looked poor, consultants have, for example, suggested killing drugs for brain cancer, ovarian cancer, melanoma, hemophilia, and other important conditions.[11] Even though millions of dollars may have already been spent, these consultants would never recommend that a company knowingly proceed on a path toward losing more money unless some other crucial non-financial objective was being achieved.

by Charles L. Hooper, Econlib |  Read more:
Image: uncredited