How Art Advances Tech & Policy — Chris Csikszentmihalyi at Cornell

J. Nathan Matias
8 min read · Mar 21, 2022


How can artists contribute to technology policy, and how can artists engage the public in shared knowledge-making about science and technology?

Speaking today at the Cornell Department of Communication is Chris Csikszentmihalyi (@csik), Associate Professor in the Faculty of Computing and Information Science at Cornell University. Chris was the European Research Area Chair in Portugal, leading the Critical Technical Practice lab. Although we never overlapped at MIT, Chris co-founded and directed the MIT Center for Civic Media, where I did my PhD. As a student, I was inspired by Chris’s laser focus on public interest technology projects that made a point while also making a difference.

Art rises above all methods; in itself it cannot be taught, but the crafts certainly can be — Walter Gropius

Chris opens by asking: “What do artists know?” Chris was trained as an artist, and early in his career he was part of a project to identify the kind of knowledge that artists create. In founding documents like the Bauhaus Manifesto, Walter Gropius wrote, “Art rises above all methods; in itself it cannot be taught, but the crafts certainly can be.” Art is something that you learn from practice and culture. As Frances Whitehead writes, art is a kind of “crafty intelligence” or “grounded intelligence” — knowledge that can’t be separated from place and experience. As James Scott writes in Seeing Like a State, “the holder of such knowledge typically has a passionate interest in a particular outcome.” As the captain of a ship, you don’t necessarily need to know probability theory; you need to know how to improve your chance of getting to land safely.

Dystopic Technologies and Military Surplus

As an artist, Chris approaches issues by considering what aspects of technology and life aren’t captured by science. He tells us about an early project of his on “surplus technology,” where he looked at the technologies that artists could buy and afford — often military surplus. So as an artist working with technology, Chris had to learn about the histories of systems like cruise missiles, autonomous robots, and other military systems. His first significant art piece was Hunter Hunter, an autonomous robot that located loud noises (such as gunshots) and fired a nine-millimeter weapon toward them.

Chris called these artistic creations “dystopic technologies.” Then one day, he found out that police departments were starting to use triangulation to find gunshots as well. His art advisor remarked, “no matter how dystopic a technology you make, you only end up being a few years ahead.”

“Hunter Hunter” could triangulate the location of a loud noise (such as a gunshot) and fire a nine-millimeter weapon at the source. The purpose of this work of gallery art was to critique automated weapons.
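For readers curious how this kind of acoustic triangulation can work in principle, here is a minimal sketch in Python (not from the talk, and not how Hunter Hunter was actually built): several microphones at known positions record a loud sound at slightly different times, and a search finds the point whose predicted travel-time differences best match the measurements. The microphone layout, the source position, and the locate_source helper are all illustrative assumptions.

import numpy as np

SPEED_OF_SOUND = 343.0  # approximate speed of sound in air, metres per second

def locate_source(mic_positions, arrival_times, extent=50.0, grid_size=200):
    """Estimate a 2D sound-source position by brute-force grid search.

    The emission time is unknown, so we compare differences: at the true
    source, measured arrival times minus predicted travel times are all
    equal, and their variance is (near) zero.
    """
    mics = np.asarray(mic_positions, dtype=float)   # shape (m, 2)
    times = np.asarray(arrival_times, dtype=float)  # shape (m,)
    best_point, best_err = None, np.inf
    for x in np.linspace(-extent, extent, grid_size):
        for y in np.linspace(-extent, extent, grid_size):
            travel = np.hypot(mics[:, 0] - x, mics[:, 1] - y) / SPEED_OF_SOUND
            err = np.var(times - travel)  # spread of implied emission times
            if err < best_err:
                best_point, best_err = (x, y), err
    return best_point

# Toy example with made-up numbers: four microphones hear a bang from (4, 7).
mics = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
source = np.array([4.0, 7.0])
times = [np.linalg.norm(source - np.array(m)) / SPEED_OF_SOUND for m in mics]
print(locate_source(mics, times))  # roughly (4, 7), limited by grid resolution

A production system would use many more sensors and a least-squares solver rather than a grid search, but the principle of matching arrival-time differences is the same.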

Critical Technical Practice

What do you do when your dystopias keep becoming real? Inspired by Phil Agre’s work on “Critical Technical Practice,” Chris decided to explore the history of science. In the period that followed, he studied ways that art can represent critiques while also prompting and supporting people to envision alternatives.

Many art/science projects prioritize science communication, giving scientists creative control over the ideas and representations, which are created for purposes that scientists set. Chris realized that he wanted to do art that was political, where the message of the work was shaped by artists and ultimately taken up and shaped by the public.

Where did Chris’s research into science and technology studies take him? Chris tells us about a robot DJ he created to mock the idea that technology would replace humans, while also demonstrating how that replacement would work.

Chris lays down beats with a DJ robot

Chris next tells us about his lab’s more politically engaged work, which started after the U.S. government’s response to 9/11. He tells us about the Total Information Awareness program, which funded companies and computer science departments to carry out large-scale surveillance. Many of the computer scientists didn’t see this as a problem, but anyone who understands the history of technology, says Chris, could guess that this would lead to problems.

Slide from a U.S. government deck on the Total Information Awareness program

How can technologists respond to government surveillance? One option is to turn the lens back toward power. Chris tells us about Open Government Information Awareness, a system that Ryan McKinley and the Center for Civic Media created to automatically monitor public information about government behavior from public records, television appearances, and other sources, and to support people in building their own theories about the behavior of government officials.

What was their goal as researchers and artists? Ryan and Chris wanted to convince the U.S. government to stop funding Total Information Awareness. So they made Open Government Information Awareness available to the public and talked about it widely in the media.

Some projects at the lab ended up profoundly shaping our information ecosystems. In 2004, a graduate student at the Center for Civic Media created a project called TXTMob, a system for sharing messages during protests. Around that time, a group of people from a company called Odeo decided to help with the project. After Odeo renamed their company to Twitter, they acknowledged the influence of the idea behind TXTMob on their early designs.

Over time, Chris and his students became interested in doing work that could “comfort the afflicted” alongside the work of “afflicting the comfortable.” One such project was Balloon Mapping, developed by graduate student Jeff Warren for creating high-resolution maps with balloons, string, and inexpensive cameras. During the Deepwater Horizon oil spill, Jeff was able to support communities in the Gulf of Mexico to document the environmental impacts of the spill.

Community and Technology Platforms

Chris next tells us about the following chapter in his career, working as a researcher on community media in Uganda. Initially he wondered whether it might be possible to adapt mobile technologies to support activism. Unfortunately, while news reports described widespread mobile phone use in Sub-Saharan Africa, phones were still very rare. In a given village, you might have three people with phones, but only one of them with a charged phone, and only some with active SIM cards. On the other hand, many people had access to working radios.

How can we make radio more participatory? The economies of scale for broadcast create incentives that disconnect stations from people. Chris and his collaborators created a system that allowed radio hosts to engage with voice networks over phones. Working with partners like agricultural extension officers, they were able to create new kinds of call-in shows, and since people were being called by the radio show (rather than calling in), they weren’t being charged. One of the first shows, “Got Talk,” was a community discussion of veterinary health. Another show offered live commentary on soccer matches. Another featured contests between market sellers. By the end of the project, they were supporting shows in Uganda, Romania, Ireland, and Cape Verde.

Chris has now been at Cornell for a year and a half. Most recently, he hosted a conference called “Data into Action” that convened three organizations whose founding involved the Center for Civic Media (Public Lab, LittleSis, and the Environmental Data and Governance Initiative) to talk about the history of their organizations.

Going into the future, Chris is interested in creating a center for spinning off student organizations focused on the public interest.

Chris is also hoping to restart some of his art practice. He’s been working on some of his robots for protest, work he started in the 1990s (including projects like Afghan eXplorer, covered here by the BBC). The military can use drones that keep its people from being hurt directly, Chris says, while protesters don’t always have the same capacity. “I want to make a small army of these, protesting outside of Boston Dynamics.”

Questions / Discussion

How do you think about what you want other people to get out of your work?

Artists are often very interested in other artists and what they think. But the relationships between scholars are different from the relationships between artists. I recently wrote an article on Human-Robot Interaction where I pointed out that artists working on robots have shared their work with millions of people, while what happens in academia tends to stay in labs; roboticists don’t tend to get their work into the public. What moved me from the gallery into a more tactical technology intervention space was that as a gallery artist, I was already talking to the public. I went to the Yes Men and learned from them that you could broaden the reach of a project by making it mediagenic.

The moment I discovered free/open source software, I became more interested in creating platforms, not just media. Throughout much of my work, I’ve been asking how we can use platforms to transform the experience of people who are grounded in a local reality and to extend their power through some kind of platform. That transition was only possible because of free/open source technology…

How do artists think about epistemology?

In art school, artists are taught that you build something, you have a reason for making it, and you have a process where a bunch of people look at what you have created and then tell you what they see in your work. That’s very unfamiliar to engineering culture, since one of the assumptions of the art world is that your work is interpretable and that you are not the authoritative person about your own work. Artists put things out there, watch reactions, and think carefully about what they made.

Is creating platforms a way to do that in a more systemic way?

Yes! Platforms can accommodate completely different worlds.

Why use balloons for mapping? What do you think about drones? Does the balloon come back? How does that work?

The early balloon work was part of the Center for Civic Media at MIT, and a group of five people went on to co-found Public Lab. On a practical level, in 2009–2010, drones were expensive, unreliable, and potentially dangerous. Balloons were cheap and available. Even today, drones are expensive systems that a single expert flies. But consider a project by a youth center we worked with in Uganda. We wanted to give them a chance to map their area. As a community-based science project, almost anyone can do it at almost no cost and with minimal chance of getting hurt. When we started to fly the balloon, people mobbed the group and children wanted to fly it themselves, and communities like these have built up capacities for environmental monitoring.

What do I think about drones? I’ve been fighting military drones from the earliest days. As a critique, I built a drone journalist (Afghan eXplorer, covered here by the BBC).

