PHILOSOPHY PROFESSOR Diane Michelfelder’s research interests include 20th-century European philosophy, animal ethics, and philosophy of technology and engineering. Macalester Today sat down with Michelfelder as she prepared to teach Ethics and the Internet, a class she debuted in 2014. (This interview has been condensed and edited.)

What is philosophy of technology?
Philosophy of technology reflects on technological products (which could be individual products, like bicycles), technical systems (like transportation systems), and the process of design.

The field also looks at human/technology relations. How do technologies help shape who we are both individually and as a society? And how do we, in turn, shape society through the technologies that we design? We live in a world that is increasingly engineered. It’s important to understand this shaping, both in terms of the possibilities it opens up, and the possibilities it might diminish or constrain.

What’s changed in the world of technology since you last taught Ethics and the Internet?
I geared the course then toward a user sitting in front of a screen, inputting data and searching for information. Today, I need to gear the course toward someone inhabiting environments that are more and more tech-saturated, and the screens are diminishing in size. In some cases, you’re interacting with computers even when you don’t have a screen. There’s been a massive development of driverless vehicles, Fitbits and other self-tracking technologies, desktop assistants like Alexa, robotics, and Bitcoin and blockchain, as well as new policies like the European General Data Protection Regulation.

Why do driverless cars provide such a rich case study for exploring philosophy of technology?
Driverless cars provoke three questions that are central to this field. First, how much and what do we want to outsource to technology? Is driving something we want to outsource?

Second, what values do we want to see embedded in technologies? Should it be, for example, that an autonomous vehicle would always protect the “driver” in case of an accident? Because the issues here are ones of life and death, driverless vehicles bring to public attention how technological development is not value-neutral—all technologies have moral values embedded within them.

Third, how do you introduce technological change? If self-driving cars are going to be a part of our transportation landscape, how will they be coordinated with more conventional vehicles? It requires a moral imagination to figure out how that will work.

You’ve made the point that our increased use of technology improves efficiency but risks prioritizing efficiency over the richness of lived experience. What’s the risk?
If efficiency and optimization are values that drive the development of new technologies, what happens if I make them my own standards for how I ought to live my life? If I attempt to perform optimally, or most efficiently, this erodes my ability to pause and integrate more leisure and play into life. Part of the underlying ethos of a liberal arts education is that you have spaces of time for reflection, and to process. If you have technologies that are designed to constantly keep us busy and to keep us running, then you don’t have a lot of time to cultivate the skills needed just to pause, to sit still, and to think.

Is there a role for philosophy in the fast-moving world of start-ups and entrepreneurship?
I think you could say that philosophy is the original entrepreneurial activity—in fact, some have said the thought processes involved in philosophy are the ones needed to be a good entrepreneur. We do need to be cautious, though. We might get ourselves in a mindset that for any particular problem there is some tech solution to it; and for many, indeed, there are tech solutions. As a philosopher, I would ask: Is this a problem that can really be solved technologically, or does that technology just put a Band-Aid on a deeper problem that calls for a social-political solution instead?

Does reliance on technology erode our ability to trust ourselves and to problem-solve?
Apple just unveiled a new do-not-disturb feature for the iPhone so you aren’t bothered by notifications while you are sleeping. That’s a technological fix. Is it better to just develop the self-discipline not to look at my phone so frequently, rather than outsource my self-discipline to an app? I think there are a number of technologies being designed that cause us to trust ourselves less. We trust the science rather than our own perceptions, our own way of looking at the world. I worry about the impacts of this—that the less we trust ourselves to think well, to think critically, the more we will trust technologies instead. Do I need an app, for instance, to tell me how happy I am?

In 2014, you published a journal article called “Driving While Beagleated.” What was the subject—and are you still driving while beagleated?
That article was published in a journal issue on the theme of distracted driving, primarily examining talking on the phone while driving, rather than texting. The articles reflected a good deal of disagreement on whether “distracted driving” should be addressed by public policy means, or whether drivers—as part of learning how to drive—should try to develop competencies in switching attention and paying attention to more than one thing. The journal’s authors divided sharply along gender lines: all the male authors favored more regulation; all the female authors thought it was okay to talk on your phone and drive, and opted for other solutions. Lizzy, the canine star of “Driving While Beagleated,” lived to be 16.5 years old. If I had my beagle, I would still drive with her on my lap.

July 25, 2018