I’ve chosen Ambient Findability: What We Find Changes Who We Become (2005) for the third post in my series of book excerpts. The premise of the book is captured in this passage:
My fascination with this future present dwells at the crossroads of ubiquitous computing and the Internet. We’re creating new interfaces to export networked information while simultaneously importing vast amounts of data about the real world into our networks. Familiar boundaries blur in this great intertwingling. Toilets sprout sensors. Objects consume their own metadata. Ambient devices, findable objects, tangible bits, wearables, implants, and ingestibles are just some of the strange mutations residing in these borderlands of atoms and bits. They are signposts on the road to ambient findability, a realm in which we can find anyone or anything from anywhere at any time. Of course, ambient findability is not necessarily a goal. We may have serious reservations about life in the global Panopticon. And from a practical perspective, it’s an unreachable destination. Perfect findability is unattainable. And yet, we’re surely headed in the general direction of the unexplored territory of ambient findability.
The launch of the iPhone in 2007 validated this vision, enabling an era in which the Google Maps app drives our wayfinding and Uber profits by turning cars and people into findable objects. But I’m still waiting for a better way to find lost keys than Tile.
A Brief History of Wayfinding, Chapter 2
Not all those who wander are lost.
– J.R.R. Tolkien
Labyrinths and mazes are two distinct creatures. In the modern world, we are most familiar with the maze, an intricate and often confusing network of interconnecting pathways or tunnels designed to challenge the skills of all who enter. Mazes are multicursal. They offer a choice of paths, along with a disorienting mix of twists, turns, blind alleys, and dead ends. In a maze, it’s hard to find your way and easy to get lost.
In contrast, a true labyrinth is unicursal. There is one well-defined path that leads into the center and back out again. The labyrinth is an ancient symbol with a 3,500-year history in religion and mythology in such diverse places as Egypt, Peru, Arizona, Iceland, India, and Sumatra. It combines the imagery of circle and spiral into a meandering but purposeful path, a reassuring metaphor for our journey through life.
In practice, we use the terms interchangeably. Our most famous labyrinth was really a maze, designed by the skillful architect Daedalus to entomb the Minotaur and its victims. Only by relying on Ariadne’s ball of thread was Theseus able to escape after slaying the beast at the center. Like today’s mazes of hedge and corn and ink, the labyrinth of Crete was a puzzle, inviting competitors to test their skills.
Semantics aside, our fascination with labyrinths and mazes stems from a primal fear of being lost. Over the course of history, the ability to venture out in search of food, water, and companionship, and then find our way home again has been central to survival. For animals and humans alike, getting lost has typically been a very dangerous prospect.
Our wayfinding instincts testify to the power of evolution. The diversity and sophistication of natural orientation and navigation skills are breathtaking. The environment challenges, and evolution responds. And, of course, humans have also responded by creating wayfinding tools and technologies and by shaping the very environments in which we live.
In fact, the term wayfinding originated in the context of what architects call the built environment. First used by architect Kevin Lynch in 1960 to describe the role of maps, street numbers, directional signs, and other “way-finding” devices in cities, the term has since been appropriated by biologists, anthropologists, and psychologists to describe the behavior of animals and humans in natural and artificial environments.
Most recently, wayfinding has been applied to the study of user behavior within digital information environments. We talk about people getting lost in cyberspace. We create “breadcrumbs” and “landmarks” to support orientation and navigation in web sites. While these spatial metaphors are often taken too far, there is no doubting their resonance.
We do import our natural wayfinding behaviors and vocabularies into digital environments, and for that reason alone, the history of wayfinding is worth our attention. But at the intersection of location awareness and ubiquitous computing, we are increasingly navigating hybrid environments that connect physical and digital. The history of wayfinding only grows more interesting with each step into the future.
Asylum, Chapter 4
Do we really want to go there? This is a question we must continue to ask as we intertwingle ourselves into a future with exciting benefits but cloudy costs. Though subdermal RFID chips were approved on the basis of their lifesaving potential, the FDA raised serious questions about their safety, including electrical hazards, MRI incompatibility, adverse tissue reaction, and migration of the implanted transponder. In its guidance document, the FDA also detailed the risks of compromised information security, noting that transmissions of medical and financial data could be intercepted, and that the devices could be used to track an individual’s movements and location. I don’t know about you, but I plan to wait out the beta test.
But even as we spurn the bleeding edge, these technologies seep into every nook and cranny of our lives. I recently stumbled across a telling story about ubiquitous computing at a leading psychiatric hospital in Manhattan. Based on the belief that allowing patients to talk on the phone might speed their recovery from depression or hasten their emergence from psychosis, doctors had approved the use of cell phones and other wireless devices. The patients loved this new freedom, and the ward was soon bustling with cell phones, laptops, Palm Pilots, and BlackBerries. As you might suspect, the results were mixed. The enhanced personal communication appeared to have real clinical benefit for many patients, but on the other hand, nurses found themselves constantly recharging batteries. In a place where people may use power cords to hang themselves, wireless has special meaning. However, the most serious problem was disruption. As one doctor noted:
There was a constant ringing on the unit. All these different ring tones. Some people would put them on vibrate mode and sneak them into group and then want to walk out to answer their calls. Or they would be talking to their friends and would ignore the nurses.
The final straw was the new camera phones, which threatened to eliminate any semblance of privacy. The intertwingling of inside and outside had spiraled out of control. The very essence of the asylum as a safe haven and protective shelter was under attack. The decision was made. Laptops and Palm Pilots were permitted, but no more cell phones. At first, there were protests as patients argued their right to communicate, but eventually the right to privacy and the virtues of peaceful sanctuary prevailed. Of course, as the lines between cell phones and Palm Pilots blur, the debate may be revisited. Said one doctor:
Is it over? I don’t know. Here at the institute, our enthusiasm for wireless connectivity has been tempered by reality. We have reclaimed a fragment of asylum. But my guess is that we will face a next challenge by the wireless world, and that we will continually have to work to define our relation to it. My guess is the battle has just begun.
This story resonates in the outside world. We love our cell phones but not the disruption. We love our email but not the spam. Our enthusiasm for ubiquitous computing will undoubtedly be tempered by reality. Our future will be at least as messy as our present. But we will muddle through as usual, satisficing under conditions of bounded rationality. If we are lucky and make good decisions about how to intertwingle our lives with technology, perhaps we too can reclaim a fragment of asylum.