Highlights

“They obstructed and endangered street uses of long-standing legitimacy,” as Peter D. Norton writes in Fighting Traffic. “Before the city could be physically reconstructed for the sake of motorists, its streets had to be socially reconstructed as places where motorists unquestionably belonged.” To view the street as a motor thoroughfare seems self-evident to us now, but it had to be redefined as such. This was a hard sell, given the carnage automobiles had lately caused. Norton writes of a “violent revolution in street use circa 1915–1930” that has been all but forgotten; “the full scale of the wave of blood, grief, and anger in American city streets in the 1920s has eluded notice.”
The technocrats and optimizers seek to make everything idiot-proof, and pursue this by treating us like idiots. It is a presumption that tends to be self-fulfilling; we really do feel ourselves becoming dumber. Against such a backdrop, to drive is to exercise one’s skill at being free, and I suspect that is why we love to drive.
Has anyone bothered to ask why the world’s largest advertising firm—for that is what Google is—is making a massive investment in automobiles? By colonizing your commute—currently something you do, an actual activity in the tangible world that demands your attention—with yet another tether to the all-consuming logic of surveillance and profit, Google makes those precious fifty-two minutes of your attention available to be auctioned off to the highest bidder.
Driverless cars are going to happen in a big way, we are told, because it has been decided—by something called “the future.” It has become clear that the effort to develop driverless cars is not a response to consumer demand, but a top-down project that has to be sold to the public.
Casner et al. point out, “One unintended consequence of alerts and alarm systems is that some drivers may substitute the secondary task of listening for alerts and alarms for the primary task of paying attention.” This has been called “primary-secondary task inversion,” and it is a familiar problem for pilots of highly automated airplanes. They may subconsciously come to view their task as listening for an altitude alert, rather than maintaining the right altitude, and this is obviously a problem if the alert fails to sound. In the human factors literature, this is called the problem of complacency.
You never attribute infallibility to a machine you have made yourself, or a machine you are intimately involved with through maintenance and repair. But if the machine was designed and built in a vast collaborative effort, and is beyond the capacity of any individual to fully understand, and works flawlessly 99 percent of the time, we adopt a very different posture toward it. We are not just daunted by the obscure logic of such machines, but seem to feel ourselves responsible to them, afraid of being wrong in their presence, and therefore reluctant to challenge them even as the computer-plane flies itself into the ground or the GPS directs us to drive into a lake.
Going to the DMV is a civic education in submission to a type of authority that relies on unintelligibility to insulate itself...