March 26, 2019
Dr. Florence M. Chee is an assistant professor of digital communication at Loyola University Chicago's School of Communication. Dr. Chee also serves as the director of the university's Social & Interactive Media Lab, as well as being a member of the advisory board for the Center for Digital Ethics and Policy.
Why be on the advisory board for the Center for Digital Ethics and Policy?
Digital ethics is very much a part of my teaching and research agenda, so the two very much play off one another. Sometimes I can't tell the difference (laughter). Looking at the ethics of how we engage with digital media, and the policies that enable us to engage with them, is very much what I look at on an everyday basis.
So, it makes sense for me to be interested in the center we have here. Teaching students is part of my best work, because they tell me they didn't know what digital media ethics was; they're just finding out in a class or through the symposium we have every year, and they are enlightened. So you can imagine, on an everyday basis, how little knowledge there is among the public about the current challenges we face as a society, and it's affecting us daily. I mean, think of how many of us wake up in the morning and check our social media channels, for example, and most of us have no idea what we have agreed to in the terms of service and end-user license agreement. It makes sense that the average user wouldn't really know about those things, but even knowing that it's a 'thing' matters, because these are agreements that should require our explicit consent. These issues go beyond the digital now, because they affect us whether we're on the grid or not. Even if you opt out of social networks, you're still on the grid and you're interacting with people who are; it is contagious, it is viral, and it is networked. Putting that into perspective for people is an important part of what the Center can do, and what I personally can do as a scholar.
As you’ve been researching and delving deep into the world of digital ethics, have you found that you’ve had to do some self-checking? You realize that maybe you were living in a digitally ambiguous space, so to speak, and you could have been more ethical, but just didn’t have the knowledge or tools to do that?
I am by no means claiming to be flawless; no one is, and I said before this interview began that I'm a hypocrite as well (laughter). I've talked about how these things can happen to people, and I know for a fact when I'm leaving myself open to hacking, or when a practice is not the best practice I could have. That little bit of knowledge is important, and I certainly think people shouldn't feel bad about not knowing these issues, because right now even lawyers are trying to figure them out. People who are very, very well versed and learned in ethics and digital media are still learning this, and they've devoted their lives to studying it. The companies themselves are facing the new challenges these issues throw our way, and we're all just trying to make sense of it. It's important for public discussion to recognize just how new everything is, that no one is the absolute last word on this, and that we're negotiating as we go.
And it's a long-term negotiation; we're going to have to continually go back to the negotiating table. Doing research for this, the phrase 'do no harm' kept coming to mind. Although it's a term used in the medical field, it seems applicable here too. You're doing something digitally, your intention is to do good, but you might not fully know what you're working with or what the various internal and external stimuli are, and you're doing your best to make sure you're doing as little harm as possible.
That brings up an interesting point about the Hippocratic Oath: doctors have to swear an oath not to harm people, yet Google doesn't, and how much information does Google have about our medical situations? Like when you google, 'What is this spot on my face? Is it pre-cancerous?' All these things, our deepest, darkest desires, our private problems: Google knows all of this just because of what we type in the search bar. So I think a lot of people don't realize just how big and vast these databases are. They could easily do us all in (laughter), and this is part of what we're all trying to get an idea about.
Werner Boote has a great documentary called Everything's Under Control. He comes here to Chicago and shows us surveillance, like CCTV, in the middle of an intersection, and he leads a great discussion of the questions we all have to ask ourselves, like: Am I okay with these trade-offs? Am I giving up a little bit of information for a lot of gain in return? We make these calculations all the time, but we also have a lot of decision fatigue. A lot of us don't know and can't make these decisions, and it's not our job to be petty lawyers about it (laughter); you can't expect everyone to be learned enough to read an entire contract before they click 'I agree.'
That's something I've done research on as well, with gaming and consent: clicking 'I agree' before you begin, as an acceptance of participation. For example, how much information do third parties get about our accounts? Before we start any kind of game we have to give our email address or billing information, and then that information can also go to law enforcement agencies. You can see the very, very slippery slope, and this is where another arm of the research goes, which is creating community by playing games.
Can you tell us a little about your research with games?
I look at why game communities are compelling, and that started as a hobby; I didn't know it would turn into a career (laughter). I've been playing since the good old days of Pac-Man, and at that point, unless you were making games, gaming didn't really present itself in the public imagination as a career option. Now we have so many options; games have become part of a broader spectatorship of activities. How I came into it was through research and taking communication courses. My training started out in computer science, and I was ultimately interested in multimedia, but I wasn't sure what kind of path there was there, and it seemed increasingly that communication was a way I could look at humanity, society and our engagement with technology.
I met some really great mentors, and I did work placements where I ended up doing instructional design, like writing manuals to teach people software. My last placement was at a place called the New Media and Innovation Center, a government initiative meant to diversify the economy of British Columbia, where I'm from. Being a resource-extraction-oriented economy, they were trying to train people to work in high-tech industries, and I was part of a social science research cluster that worked with local companies to investigate things that included networked games. Through that research assistantship I was able to investigate why people play games, and I thought this was my dream come true, because I was already playing games as a gamer, and suddenly I was working and getting paid to research games.
Academic research sounded really dry and intimidating before; I didn't think it was for me. But if it meant that I could research society and games, that's what pulled me into it. Then I started writing and doing graduate programs, and eventually I got recognized by MIT, and then I was doing my PhD, and now here we are (laughter).
What a cool perspective you can bring to that, where you can look at things from the perspective of a gamer, but also through the lens of research and academia.
That's really the goal, mostly through social science methods and especially ethnography: not just simply playing the games that gamers play, but eating, breathing and sleeping the gaming culture, which is super important for the study. We talk about that in the book I'm working on. It's really about translation: I'm bringing these concepts to people who need to know that this is a thing, that it's important, and that it affects their everyday lives. It might help them change their minds, it might help them educate others, and it has that kind of viral capability. That's what I find really interesting and compelling. I think a translation capacity will always come in handy, because we're talking about making academic concepts accessible to the public, and that's super important. There is a lot of misunderstanding or intimidation, or people don't see the relevance, but there are people doing extremely relevant work who aren't able to get it out there and have people read it. So if you can package it in a way that people understand and find useful, that's a lot of the battle.
You mentioned the book you’re working on, is there anything else you’re working on?
The book is in progress now, but a couple of other projects include the Ethics of Care, a principle we were looking at in terms of relationships, and not just ethics approval in the IRB (Institutional Review Board) sense. For example, if I have a research assistant handling data about sexual harassment or very toxic data, what is that doing to the research assistant? In this specific case we were looking at Gamergate data, and how we made the decision, after screening all this data from Twitter, 4chan and 8chan, to delete the 4chan and 8chan data. That was of course ideological, but the issue of curation to begin with is also a subjective one, not objective at all, so we discuss that whole argument in the paper.
The Ethics of Uber is the beginning of a broader project I have going on at the lab, to do with various stakeholders in ride-sharing. So often the case made for Uber is that we should just make everything Uber and we don't need public transit, so the case for divesting from public transit is made stronger by the success of ride-share services like Uber.
In that article I talked about the labor problems faced by drivers and the accessibility issues faced by riders who depend on public transit but don't have access to an app or even credit cards. Later on, we're going to conduct focus groups in the lab about people's strategies for staying safe when using ride-share apps.
There's a reason we still look at some of these age-old concepts, like safety and ethics: they're still very relevant, if not more so, in today's digital environment as a way of understanding the chaos of what's going on now. So if we can bring that to the broader public, all the better.
This interview has been edited for time and clarity.