
The Antidote To False Information On Social Media: An Interview with Marshall Van Alstyne

Updated: Mar 23

Boston University Professor Marshall Van Alstyne is researching a solution to false information on social media, backed by a $550,000 National Science Foundation grant and a goal to bring people together.

Since 2016, the Questrom School of Business professor has been analyzing trends in false information and developing a system to limit fake news’ influence on users while avoiding censorship. Social media platforms have limited systems to protect users from false information, he said, and spend little to innovate those systems. At the same time, the government is not allowed to confront false information because of the First Amendment, which protects freedom of speech.

“I would really love to see us collectively solve this problem, from the left and right,” said Van Alstyne, an expert on information economics and network business models and the coauthor of the 2016 international bestseller Platform Revolution. He said he sees this problem as one that may bring people closer together. “Each [side] has valid criticisms of where we are, and I think we have to learn to hear both sides of that or all sides of that. Otherwise, we're in enough of an impasse that it's going to be hard to make progress.”

Student journalist Harrison Zuritsky met with Van Alstyne at his office to discuss his job as a professor of information systems, his research into false information and his love for skiing. The following interview has been edited and condensed for clarity.

What do you teach here as a professor? 

You can think of me as an information economist. Really, what is the value of information? How does it affect productivity, intellectual property rights? How does it affect decisions? At the moment, I'm more interested in the misinformation question. How does it affect elections?

What did you do before working at Boston University?

I've been a computer scientist, an AI programmer, an entrepreneur at a start-up, and did consulting for a while. But at the moment, you know, I really love academia. We get to explore some fun research questions. My training is originally in computer science, followed by managerial economics and information economics. So it's really been a long journey to a lot of different questions, most of which are really fun to explore.

What inspired you to teach and focus a lot of your time on information systems?  

Information is also a resource. I think it's underappreciated as a form of capital. You know, it certainly is more so now as we move to data science and big data, AI, generative AI, things of that sort. But all of these are based on information. So how do you price it? How do you package it? What's it worth? How do you create more of it? Information is unusual in that when you use it to produce something, you don't consume it. You simply get more of it.

What inspired your research into false information? 

So we're now seeing an enormous amount of production of misinformation that we didn't see previously. Now, misinformation or fake news is as old as Babylon or, you know, false fish stories. It's been around for a really, really long time. You can even find stone frescoes of kings lying about their military accomplishments. So misinformation has been around for, you know, millennia. But it's become more of a problem today for reasons that I think are underappreciated. I think it's a combination of law and business model that we hadn't seen previously.

What is the problem of false information nowadays that makes this issue so complicated?

Section 230 is a portion of the Communications Decency Act, a 1996 law, that absolves platforms of liability not only for the content their users post but also for the editorial decisions they make about that content. What that means is that users can now post more or less whatever they want under free speech practices, and then platforms can amplify whatever they want in order to drive engagement for their business. The challenge is that this makes it extremely hard to hold any one particular party accountable for the information pollution that we're now experiencing.

What false information should we actually be worried about?

I think the problem is decision error and externality. Did you cause someone to make the wrong choice, whether by lying through omission or commission? Either way, decision error, I think, is the proper metric, not truth or falsity alone. The other piece that I think is fundamentally missed is externalities. I don't know if you've covered in basic economics what an externality is. In actuality, it's pollution. It's damage. It happens to third parties from any one- or two-party interaction.

Why does the current solution to false information not work?

Misinformation produces externalities. Externalities produce market failures. Market failures require intervention to fix them. But government intervention in speech is forbidden by the First Amendment. That means that attempts to just turn the problem over to the marketplace of ideas will fail because markets do not self-correct market failures. That's why the problem is so hard, right? Our laws at the moment don't allow the normal solutions. 

What can be done so people can see what they want on social media?

Imagine on social media you could choose the information filters that you thought were most reliable. You could choose the CNN filter, the Consumer Reports filter, the BBC filter. But notice you can also choose the Fox News filter, the Breitbart filter, any filter that you think is valuable because you think it gives you the cleanest information stream. That then means you're not going to hear the other stuff, the stuff you think is effectively polluted information. Now, that gives you a right to hear and a right not to hear.

How do we hold users producing false information accountable?

You can say anything you want, but you don't have the right to falsely shout fire in a crowded theater. You've probably heard that. What's interesting is that the mechanisms I'm proposing would give you that right, but then you'd be on the hook for any damage that happened. You could still say anything you want, so it's not hurting free speech, but you would be responsible for any damage that happens.

What are the kinds of damages of false information?

So it's the loss of herd immunity. It's global warming, or probably the insurrections that happened in Brazil and the United States that had [been started on] a platform. The platform isn't experiencing the damage, and neither are the speakers and the listeners. It's everyone else that's experiencing the damage. In economic terms, misinformation produces externalities.

What do you mean by people being “on the hook” for false information?

The speaker can override the listener's right not to hear in order to be heard. In exchange, they accept responsibility for any misuse of that right. The way it would work would be simple: if I lie to you, I'm going to be responsible for [its cost].

Are you planning to launch with the upcoming election?

So I raised some National Science Foundation money, and we're running experiments to show them the data. I cannot imagine that those changes would be adopted prior to the election. I just think it's too big a change to the business model. But I'm optimistic that, longer term, these kinds of mechanisms would work.

What are some of the obstacles that you've experienced?

So the obstacles are in building the tools, testing them in the markets they're in, running the experiments and getting the research results published. Other obstacles are in speaking across disciplines. If you're using economic tools, do they make sense to computer scientists? Do computer science tools make sense to a lawyer? To a regulator? And then in the business world, this is enough of a shift in their business models that they have to see what's in it for them.

What is the timeline looking like for the project?

We're now running experiments to see whether or not these things can actually work. We will have our first results, probably, in the next three to four weeks. That's just the first results for the first papers. The first full papers will probably come this summer, and from there, we're hoping to attract more attention to the research once we have actual results.

Outside of being a professor and working on this research to help the fight against false information, what do you do in your free time? 

I'm very fortunate. My wife loves to cook. I love to eat. I love getting around, traveling, meeting folks, getting out, exercise, sports, skiing, hiking, biking. Those are the things I like to do in my free time. Last year we had a season pass up at Killington, but it's three hours away. Now we have an Epic season pass, so we're actually headed up again shortly.

Marshall Van Alstyne

Phone: 617 - 385 - 3571


Here is a clip from our podcast about the topic.

