The Power (and the Problem) of Metaphor
By 含羞草研究室 Magazine
Photographs by Aaron Tilley
Set design by Sandy Suffield
For something like the past five years, Sarah and James 含羞草研究室 Professor of Digital and Computational Studies Eric Chown and Assistant Professor of Digital and Computational Studies Fernando Nascimento have been talking. They started out, in the way of things, just chatting in the hallways. But their conversations soon turned more structured.
They knew they were not just enjoying each other’s company but learning from each other, one of them rooted academically in cognitive learning and the other in philosophy, both of them steeped in technology. After a time, their talks started to feel, in a way, urgent, like they were coming to something together that was crucial. That something is soon to be a book—Meaningful Technology: How Digital Metaphors Change the Way We Think and Live.
In it, they seek to convey what they have come to understand in their talks and their work together: that metaphors are the primary way we understand technology, that science and technology rely on metaphor to come up with new concepts and innovations, that the intertwining of language and technology affects how we understand the world, evaluate our experiences, and prioritize our actions, and that our interactions with digital technology are more critical and impactful than we know.
We join the two professors here mid-conversation.
FERNANDO: Aristotle said, “The trait of a genius is to be able to make good metaphors.” We use the term “metaphors” many times in our work—not in a specific linguistic sense but in a very broad sense of semantic innovation. And what does semantic innovation mean? It means that you have a different experience in the world, and you have to make sense of it. You have to communicate. So you see one thing as another.
ERIC: We spent a lot of time making up metaphors and trying to solve hard ones. Like this Shakespearean metaphor “time is a beggar.” That stumped us for quite a while.
FERNANDO: I had read the description of the metaphor, so I told Eric, “Let me give you a metaphor, and next week you let me know what you think of it.” [Laughs] And the metaphor is “time is a beggar.”
ERIC: Which is not how Shakespeare expressed it. [Laughs]
FERNANDO: But it’s the underlying metaphor, we would say, right? And I imagine that when I said that, you started thinking, “How can I see time as a beggar?”
One of the things we learned is that the text prompts you with associations. So metaphors are puzzles, and the context provides tips to solve the puzzle. Then the other thing that we started thinking about is, I am changing the idea of time, but I’m also changing the concept of beggar.
Philosophers tend to look at metaphor as a directional thing. They even say “the source” and “the target.” But when we start thinking from a basis of cognitive science, it changes.
ERIC: That’s right. Part of my background is in learning and the neuromechanisms of learning. And the main mechanism of learning is called Hebb’s rule, after a neuroscientist, Donald Hebb. And Hebb’s rule is if two neurons fire at the same time, you strengthen the connection between them, and if this happens a lot, one neuron gets very good at helping the other neuron fire, and we call this “association” in psychology. Well, the thing about a rule like this is it’s always on. You don’t decide to learn; you don’t decide not to learn. You just learn.
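One common textbook formalization of Hebb’s rule, sketched here only for illustration (the notation is ours, not the professors’), writes the change in connection strength between two neurons as proportional to the product of their activity:

\[ \Delta w_{ij} = \eta \, x_i \, x_j \]

Here \(w_{ij}\) is the strength of the connection between neurons \(i\) and \(j\), \(x_i\) and \(x_j\) are their activity levels, and \(\eta\) is a small learning rate: when the two neurons are active at the same time, \(\Delta w_{ij}\) is positive and the connection strengthens.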
Learning is too important to leave to chance. So, if you’re thinking about time and beggars, you can’t help but make connections between them. Meanwhile, your concept of time, your concept of beggar, they’re both active in your brain, and they’re both changing. Learning is constant change for us. In the early stages, we were concerned entirely with the act of making a metaphor and uncovering what the metaphor was about.
But now we are more concerned about the lifetime, the metaphor as it changes, the impact of the metaphor in the way it changes your concepts and how you apprehend and understand the world.
FERNANDO: I believe that was the point where we started connecting the dots to get to DCS [digital and computational studies]. Maybe that’s one beautiful thing of the entire process. We started from our original home bases and slowly converged into the question of digital technologies. To me, one stepping-stone was the concept of practical wisdom. The idea of practical wisdom is that you need to have principles and rules, the laws. But every time you are faced with a different situation, a different person, a different day, it requires you to invent something new.
Parents know this; they are faced with that all the time. Digital technologies are changing our lives. All of a sudden, parents need to decide how long their kids can use their phones. There are no predefined rules that fit perfectly to the ethical dilemma created by the new technology. You have to invent something.
One could say your phone is just like ice cream. You can take a little bit, but if you take too much, it’s going to hurt you. So you are inventing something out of a rule that is “nothing in excess,” but applying it to something new. I believe that led us to this trajectory. I worked in software development, particularly mobile development, for many years, and my problem was, “How do I create metaphors to make it easier for people to use messaging applications on mobile phones?”
So we were thinking about this: How do we apply metaphors to create digital artifacts that are usable? But then we realized that, because of this cognitive association rule, every time people are saying “I’m talking via SMS,” for instance, they are changing the way they think about talking itself. That was a huge moment, wasn’t it, Eric?
ERIC: Yes. I think another factor here is my graduate school advisor, even though I have a PhD in computer science, was a psychologist, and a lot of his work was on attention. And once you start talking about metaphor, once you start talking about phones, attention becomes so critical, because your phone is a device that is designed to grab your attention all of the time. And that’s a place where I’ve learned a lot from Fernando. My respect for philosophy has grown tremendously, but philosophy?! . . . [Laughter] One of the things I like about Fernando is he’s practical. How does this stuff work in the real world?
Philosophy tends to be a little abstracted from that. I don’t think philosophers are talking a lot about what happens if you stare at your screen too long. That’s where cognitive science can say, “Hey, it turns out your attention is a limited resource, and it can fatigue. You can overuse it like anything else, and that fatigue has consequences, and we need to consider those consequences. We need to think about just how dangerous it is that your phone is constantly grabbing your attention.”
I had a student recently who said, “Professor Chown, two years ago I went to a camp, and I had to give them my phone for two weeks, and it was fine, and I liked it, and I got a refreshing break from it. And a couple weeks ago I thought, ‘Well, I should take a day or two off from my phone,’” she said, “and I only lasted an hour. I couldn’t stand it anymore.”
That irresistible urge, that pull of attention, is one of the things that makes digital metaphors so much richer in some ways than metaphors people were talking about a hundred years ago.
BOWDOIN: Two things you were both saying are things you talk about in your book—one is the responsibility of the metaphor, the value that you put on the thing just by choosing it. And the other is your audience. If you had been talking to me, you might say phone time is like wine, not ice cream, right? [Laughter] But you chose ice cream because you knew that it would work as a metaphor for your child. But what is the responsibility of the metaphor, whoever chooses the metaphor?
ERIC: You’ve sort of anticipated some of our current research, digging deeper into analyzing those metaphors and asking whether there is a framework we could use to take a metaphor, Facebook’s friendship metaphor or the heart as a “like” symbol, and dig into those issues of responsibility and knock-on effects, and how well the metaphor actually communicates about the product. Now we’re moving more to what Fernando was saying, the DCS phase of it. Is it okay that it’s happening, and what are the consequences?
FERNANDO: One of the exciting points was when we realized we were kind of proposing a new method of analyzing digital technologies, what we call a cognitive hermeneutics approach, something that captures how deeply technologies are changing our lives: how they change the way we think, and how far that reaches into the way we live.
Facebook is a metaphor, right? It’s a book of faces. It’s not neutral. It’s never neutral. Technologies are not neutral. Because they tie a certain type of association, they always have an ethical valence. So in this case, it triggers ideas of face, of beauty, of representation. That’s embedded, even if we don’t explicitly recognize that.
The other interesting thing is that all these big tech companies are expressing their metaphors in English, and that’s in itself a form of colonialism, because for Portuguese speakers or for Spanish speakers, Facebook is just the thing, the act itself. That’s true for most of digital technology’s vocabulary, and it creates an unbalanced situation from the beginning. Even the software code itself is full of metaphors, and those metaphors are expressed in English. So we are exploring how to propose competing metaphors that will make those biases clear and unveil other possibilities.
ERIC: A classic metaphor is there are “white hat” hackers and “black hat” hackers. White hat hackers are ethically good and black hat are doing bad things. It’s a metaphor that conveys a lot of information, but it says white equals good, black equals bad, right? And it just reinforces that concept.
So, sure, it’s easy to understand. But what’s the cost of that understanding? Are there other metaphors that are easy to understand but don’t have associations that we don’t want to keep reinforcing in our world?
FERNANDO: And metaphors are also rhetorical. They have a power of persuasion, and that persuasion can be used to communicate, but it can also be used to manipulate. Another metaphor that is very common is the cloud. So your information is going to the cloud. What is a cloud? So we say, well, a cloud is something that is up there. I don’t mind that much unless it rains a lot.
It’s safe. It’s out there. But what is not in this metaphor? That is what we need to see. One of the things that is not there is that, unlike clouds, data is solid.
Data makes the difference. Apple will become a $3 trillion company next week, in large part because of the cloud. So the cloud is not vapor, it’s not gas. It’s very substantial.
ERIC: Apple has cultivated this metaphor that their products are a walled garden. And that brings associations. Oh, a garden, that’s nice, and it’s walled, and it’s protected from the outside world.
But, in that metaphor, Apple is trying to control that conversation. Those associations are subconscious. We’re not aware of them, but they’re happening. If I hear “garden” I can’t help but think, “Oh, that’s lovely.”
Again, going back to how learning works—I associate that with Apple, I therefore think more positively about them. So these metaphors can seem very benign and simple, but can actually have a big impact.
BOWDOIN: One of the things you talk about in your book is combining these two ideas, association and intentionality. You talk about how the metaphors used to explain the COVID vaccine to people all failed on some level, at least with a certain portion of the population. How important is it that people in STEM understand that, and how do you in DCS teach that?
ERIC: Just last week I had my students read book reviews and articles about how scientists talk to the world and another set about how to argue effectively. Research shows that, in terms of argument, facts don’t work very well. But scientists are trained in facts, and that’s how scientists argue with each other. When it comes time to talk to the public, facts are helpful as a buttress to what you’re saying, but the public wants to hear narratives. They want to hear “How does this impact me?”
One reason I’m at 含羞草研究室 is because 含羞草研究室 cares very deeply about communication. Scientists need to learn how to communicate the way everybody needs to learn how to communicate. Scientists need to think more deeply about the metaphors they choose. They need to think and communicate effectively with narrative.
Fernando usually frames it in terms of ethics, but the same arguments apply to, “Okay, we’ve made this amazing new vaccine, the science says it works. That’s all that matters.” But people are like, “Hey, wait a minute. What’s it gonna do? I don’t understand that. I’m cautious about that.”
And so we need to bridge that gap. In a way, our book is about using metaphors to bridge those gaps. One of the things we’ve seen in the last two years is that scientists are not necessarily succeeding.
FERNANDO: I started as a software engineer and worked many years in software. And my training was all about efficiency, how you write better algorithms. The companies I worked at, including Google, were all about optimization. So we were very, very attentive to how we were creating things, but not necessarily why we were creating them. I collaborate on another project called Computing Ethics Narratives. What we are doing is creating a framework to embed narratives that help computer science students think about ethical implications.
Because everybody recognizes that, up to now, we’ve been creating digital technologies, and then we see, “Oh, now we have a democratic problem. People are killing each other because of Twitter. Let’s go back and try to fix it. Facebook is creating problems for teenagers. Let’s try to go back.” And that’s very complicated.
This is what we call the a posteriori ethical approach to digital technologies. You build, and then you try to solve problems. We believe that thinking about metaphors in narratives is a way to bring these ethical questions up front, and that’s where a liberal arts education makes a difference. Because, yes, we are going to have computer scientists, engineers, biologists—but they are also thinking about the common good.
“To be a citizen in our world right now, you have to have a basic understanding of digital concepts, how to use devices—but more than that, you have to have a basic understanding of how they fit into the world.”
—Eric Chown
ERIC: As you were telling that story, it reminded me of what I consider to be one of the moments that led to the genesis of DCS at 含羞草研究室.
A group of us were in a meeting, and I really think of this as the crystallizing moment. A faculty member—it might’ve been Pamela Fletcher [’89], who was my codirector at the time—said something about how in the humanities we ask what, how, and why. Why are you doing that? And I’ll never forget this as long as I live. [Laughs] There were a couple of STEM faculty in the room, and they looked up at her and they were like, “Why would I ever do that?”
To them, the production of knowledge was all the why you needed. You didn’t have to stop and ask, “Is it a good idea to learn about this thing?” And it just hit me like a ton of bricks.
BOWDOIN: You talk about the difficulty once something is loose in the world of making it better. And you mention that from the beginning it’s unequal, it’s not fair, it’s not equitable. What is the responsibility of DCS or of faculty like yourselves? How do you even start to correct that?
FERNANDO: One of the things that became clear to us is that the common good is the goal, not only because the College says that, but because all of our reflections led to this point. One of the things we have been discussing now is to what extent our cognitive system is set to foster behaviors and actions that are better for the common good. What are the ways we can embed mechanisms to impact how we think, how we interact with others? So the other is not just a means to an end, but is always an end in itself, as the Kantian principle would say.
Permeating many of these discussions is the digital divide, this additional layer of inequality that has been created by digital technologies.
It’s not only that you cannot access broadband, which is absolutely true. Sub-Saharan Africa has a real problem in terms of how long each day it has electrical power. So imagine having reliable internet.
So this is one layer, and many people stop there. But on top of that—okay, you have an app, but do you know how to use it? And part of the problem is that all metaphors, as we said, are US- or rich-world-centric, and that’s a huge cognitive gap. It’s another layer of the digital divide.
On top of that, you have what we’ve been calling the algorithmic divide. We start to realize how the news in our timelines changes the ways we think, our priorities, what Eric was saying. The algorithms behind Facebook, Instagram, Twitter, they have immense power over us. How many of us can recognize that?
BOWDOIN: And, even if you do recognize it, to your earlier point, you’re not aware of what your brain is doing in terms of association and learning.
ERIC: That’s right. I’ve had a lot of discussions with my students about the digital divide and about the consequences of the choices we make, and it goes back to something we were saying a few minutes ago. There’s this technological determinism—the technology is going to get out there, and once it’s out there it’s going to spread and there’s nothing we can do about it.
So, even knowing the consequences, if given a choice of “Do you want to participate in this, or do you want to stay to the side?” a lot of my students will kind of squirm, but ultimately they seem to say, “Well, it’s going to happen whether I participate or not, so I might as well be in there with everyone else.” It seems like it’s impossible to put the genie back into the bottle. Fernando and I are trying to say, “Let’s spend some time looking carefully at the genie.”
My daughter said, “Dad, how come you’re not on TikTok? You talk about it all the time.” And I said, “I’ll tell you why I’m not on TikTok: I’m sure that I would love it.” I know if I had TikTok I would look at it all the time, I would enjoy it, it would show me exactly what I want it to, and I don’t want that in my life.
It’s not an easy decision, because I might think, “Oh yeah, it’d be fun to watch some wacky videos right now,” but because of all the time Fernando and I have spent talking and my whole career, I know it’s not a great idea for me. One of the things we’re trying to do with our students in DCS—and it’s not just a matter of talking through this once or showing them an example, it’s course after course and it’s taking Tech and the Common Good and Digital Privilege and other courses—is really building a foundation so they too can say, “You know what?”
I’ve had a number of students come to me and say, “I really think about Instagram a lot differently than I did before this semester started.” If I can just give them a gentle nudge in that direction, I’ve accomplished something.
FERNANDO: I think one of the reasons students are paying more attention to DCS is that we are not inventing a problem. It’s a real problem. Digital technologies are everywhere. They are changing the way students eat, the way they study, the way they have fun, the way they date.
“Technologies are not neutral. Because they tie a certain type of association, they always have an ethical valence.”
—Fernando Nascimento
ERIC: I had a student put it to me this way. He said, “This is the first course I’ve had that’s talking about stuff that I’m actually interested in on a day-to-day basis.” It’s about my life, is essentially what he was saying.
BOWDOIN: Do you foresee a world where you could achieve something by requiring students to take a course like this—a distribution requirement they would be obliged to fulfill at some point in their 含羞草研究室 education?
ERIC: I believe, to be a citizen in our world right now, you have to have a basic understanding of digital concepts, how to use devices—but more than that, you have to have a basic understanding of how they fit into the world and the pluses and minuses. [Laughs] I’m not ready to propose it to the College at this point, but ultimately, I think every student should take at least one course like this while they’re at 含羞草研究室.
FERNANDO: If the liberal arts are all about allowing students to make sense of the world, to me that’s the core of it. How will you have a meaningful life with and for others, and have just institutions? This is Paul Ricoeur’s take on ethics, but I also think it summarizes what a good liberal arts education is. It’s impossible today to make sense of the world without thinking about the digital layer.
For non-natives like me, some terms in English, like “make sense,” sound different. When I hear “make sense,” the “make” is really important. We create, we produce, we offer sense, meaning, to our lives and to the lives of others, and digital technologies are intrinsically related to the ways we offer meaning to society.
ERIC: If I were going to say what have I learned from Fernando? First of all, it can’t fit in this room. [Laughs] But the number-one thing that I have taken from our many wonderful conversations, aside from the fact that he’s a great guy, is our discussions about meaning and purpose and—going back to the TikTok example—not just doing things because they’re fun and give me a laugh, but thinking about the common good, thinking about making people’s lives better.
That’s ultimately what my goal as a faculty member is. I decided early on in my career I wasn’t going to take money from the Defense Department and this, that, and the other thing. I only wanted to do projects that would manifestly make the world a bit better. And getting Fernando’s perspective on meaning, as shaped by all of the philosophers who’ve come before him, has been so useful for me. When I think about new projects, I’m not just looking at the cognitive angle—how does this work with attention or learning or whatever—but I’m connecting it to meaning and purpose.
FERNANDO: And I would say it happened the same way with me. I actually audited Eric’s course on cognitive architecture because I learned so much from him in this interchange. I think what we are saying is that we are different scholars now. We are more interdisciplinary because of this relationship, and we are very grateful for this environment at 含羞草研究室 that allows us to do this.
BOWDOIN: I loved the part in your book about science fiction and how sometimes those stories can anticipate technologies. What do you think the role of imagination is in all of this?
ERIC: When I teach my cognitive courses, one thing I start with is that the great thing about learning is you can try stuff out in your head before you do it, and that it’s much less dangerous that way. Imagination gives us this ability. It frees us from the dangers of the world in a way. Science fiction is that kind of thinking pushed to its limit.
I’m listening to a book right now set a decade or two in the future, about global warming and the potential for geoengineering. And science fiction lets us explore the consequences. It really changes the experience. Fernando uses this phrase, “productive imagination.”
FERNANDO: Imagination in philosophy has a long history. For a period it was seen as getting away from reality—only crazy people need imagination. In modern times, everything was testable, falsifiable. Later, others said, “Imagination is how we put things together that we know, but we are not creating anything.” If you think about Pegasus, you think about a bird and you think about a horse, and through imagination you combine them, but there’s nothing new. That’s Hume’s take on imagination. What Paul Ricoeur does—and many others actually, like Hannah Arendt, another contemporary philosopher—is start to see imagination as critical for ethics, and they use Kant’s epistemology.
Because Kant divided imagination into reproductive imagination, the Pegasus example, and also productive imagination. With imagination, we can propose something that is not, but that could be. So we can propose a state of affairs we could inhabit, and that’s why it has ethical relevance. Paul Ricoeur calls it the ethical laboratory.
Like metaphors, narratives fall into the category of semantic innovation, of creating possible new meaning. There is a quote, “a metaphor is a poem in miniature.” I think that captures the relation between narratives and metaphors. The common denominator, to me, is productive imagination—when we create things, when we build technologies, this ability to think about what we want to create for ourselves is the response to the a priori problem of creating meaningful things. That’s why the title of the book is Meaningful Technology: technologies that foster the common good through creative imagination.
Aaron Tilley is a London-based still life photographer whose concept-driven, playful, and highly technical approach has won him both acclaim and commissions from across the world.
Sandy Suffield is a London-based art director and set designer.
This story first appeared in the Winter 2022 issue of 含羞草研究室 Magazine. Manage your subscription and see other stories from the magazine on the 含羞草研究室 Magazine website.
Editor's Note:
The Shakespeare metaphor of time as a beggar that professors Chown and Nascimento mention appears in Troilus and Cressida, in a passage where Achilles asks Ulysses if his deeds have been forgotten. Ulysses replies: “Time hath, my lord, a wallet at his back / Wherein he puts alms for oblivion, / A great-siz’d monster of ingratitudes. / Those scraps are good deeds past, which are devour’d / As fast as they are made, forgot as soon / As done.”
We asked Associate Professor of English Aaron Kitch, who specializes in Shakespeare, if he could expound on the metaphor:
"The quote is really interesting. Here, Ulysses is talking about much all the Greeks love Ajax in order to try to make Achilles want to fight. Achilles refers to the 'beggars' when he is complaining that he is ignored by those who are praising Ajax. But Ulysses uses a metaphor of the 'wallet' as a kind of black hole that swallows up good deeds as soon as they are done. He is saying this in order to suggest that Achilles needs to return to military action because all of his good deeds have been forgotten like a 'rusty mail' (meaning piece of armor).
"Ulysses employs quite a few metaphors throughout the play—indeed many in that very speech (honor traveling in a 'narrow strait,' etc.). The play is one of the most metaphorically rich of all of Shakespeare’s works."