This article is part of a limited-run newsletter. You can sign up here.
This week’s Privacy Project newsletter is a pre-debate conversation with the former entrepreneur and current presidential candidate Andrew Yang. I wanted to speak to Yang since he’s the only candidate to address data privacy as a campaign policy issue. He’s a proponent of an idea that’s somewhat controversial among privacy professionals, which is that we should own our own data.
Our short conversation turned out to be pretty sprawling, touching on subjects like data dignity, whether Facebook should be able to run political ads, whether any of us have free will and what his proposed Department of the Attention Economy might look like.
This is a condensed and edited version of our conversation:
You’re the only candidate who has decided to make privacy a campaign issue. How’d you get there?
I’m an avid user of the internet and I understand that users are completely at the mercy of tech companies in terms of what happens to our data. They pretend it’s our choice. In reality, 99.9 percent of people scroll down and hit “I agree.” The trade we’re making is for cost and convenience, but in return we’re forfeiting our data.
That data is packaged and sold and resold and we are none the wiser. We occasionally get notifications of a data breach and think, “Oh, snap, should I change my password?” That’s an irritation but what’s going on with our data is much bigger than that.
I was talking to a researcher recently and she described a concept called data dignity, which I thought really says it all. Right now we’re being systematically deprived of our dignity and we think it is fine because we’re getting these incredible services. Perhaps that worked in the early stages of the internet. But now we’re waking up to the fact that the trade is much more serious and profound than we originally realized.
Your idea is to think less in terms of privacy and more in terms of property — that all this data we shed earns money for big tech companies, so we should receive a share of the economic value generated from that data. You never explicitly say we should get paid for it. Do you think we should be getting that money back in the form of a data dividend?
Yes, I think we should be getting paid in a data dividend. Every time we post a photo or interact with a social media company we’re putting information out there and that information should still be ours. If somebody is profiting from our data and we decide willingly to partner with a company that’s making use of this information, then that’s only fair as long as we get a slice. Right now we’re unaware of the value that’s changing hands and we’re definitely not getting a data check in the mail every season.
There are plenty of critiques when it comes to owning our own data. The first is that our data is shared (a conversation on social media, for example, has two participants) so it’s hard to parse out who owns what. Also, having scores of companies pay out data dividends to millions of Americans sounds like a logistical nightmare. Is it worth the hassle?
It’s actually a fairly trivial administrative barrier in the sense that almost half of Americans are right now receiving direct transfers from the government in some form, via checks or deposits. I guarantee you that if it was reversed and it was tech companies that needed to extract tolls from millions of consumers there’d be zero issue with the administrative barrier. The company would be like, “I need your credit card.”
One of the more substantial critiques of data ownership is that treating data purely as a property right means you can sell that right; you essentially have the right to trade your privacy away. And the problem is that it could worsen financial inequities. People with less income might be more willing to trade their important personal data for a small payment, while those who are better off can say, “I don’t need the dividend. I’ll keep my data.” How do you balance that?
There’d be less of a sense of desperation in a society where there’s a freedom dividend and everyone’s getting $1,000 per month. But it is the case that certain people’s data is worth more than others’ data. And if you look right now there are many people with different preferences. There are people out there choosing to share the intimate details of their lives with millions of people; some make a living off it. So, if individuals want to share their data or information or even their private lives with other people, then that’s their prerogative.
Some of the plans you outline for data are quite broad and amount to “Data needs to be owned by the people.” Some companies already subscribe to this but, in practice, we’re still at their mercy. Does the extent of your plan amount to more than just saying, “You can have your data deleted”?
What’s going to happen in real life is you’re going to use these companies. Things will happen behind the scenes while you’re busy living your life. And then, if you want to, you can go and delete your data. That’s a reasonable estimation of how it would work for many people. What I’m suggesting is that we can do better. But it’s not like individual consumers can band together to make this happen. Government needs to be a counterweight to the massive power and information inequities between us and the technology companies.
What I think most people would want is a place they could go to see what’s happening to their data, the option to delete their data and a record or log of all the times it trades hands. Then the companies do their thing and people would live with greater confidence that, if there are abuses, they’ll be made aware of that and can always pull the plug.
We’ve become like rats in a maze, constantly hit by messages from these companies, which know everything about us. They know more about us than our families do. We’re responding to stimuli and we think we’re making choices. But it’s because we’ve shared so much over time that they have a keen sense of what we want. There’s something fundamental at stake here, which is: What does human agency look like? What are our rights as citizens?
Do you think Facebook should be allowed to run political ads?
Ideally, there’d be greater transparency with political content where you’d have a better sense of who is actually advertising to you. But our campaign finance system allows “super PACs” and funding sources that are shady. It’s one reason we’re struggling. There are a lot of interconnected problems.
Does it bother you that, if you win the nomination, President Trump could lie in advertisements on Facebook and the company would allow it? Does that feel like an unfair advantage — that Facebook isn’t properly monitoring that?
We all saw what happened in 2016. It’s clear that a foreign power influenced the election in large part because of Facebook. And it’s an open question whether things have changed between that election and this one. I’m sure Facebook has been trying to stamp out certain forms of interference and misinformation. But I’m equally certain there are many forms that will persist.
You tweeted a few months ago, after the Federal Trade Commission issued Facebook a fine, about creating a new government agency called the Department of the Attention Economy. What might that look like, and how is it different from, say, giving the F.T.C. more teeth to go after Facebook?
[Laughing] Sorry. I was laughing about the F.T.C. Anyway.
We’re seeing record levels of anxiety and depression among teenagers, particularly teenage girls, and that rise goes hand in hand with the surge in smartphone adoption and social media apps. So what do you do about that?
I’m friends with Tristan Harris, who talks about how we have some of the smartest engineers in this country turning supercomputers into dopamine delivery devices. Tristan started a nonprofit called Time Well Spent, which consults with various companies about ways to make apps healthier. But if you’re one of these companies, you have financial incentives tied to maximal engagement, and scaling back on that will hurt your bottom line.
The question is how do you take the learning of a design ethicist, like Tristan Harris, and get it into the guts of social media apps? The answer is that it won’t happen on its own because the financial incentives are overpowering. So you need a sophisticated Department of the Attention Economy that says, “Here are the 20 design choices you’re making and you need to make 10 of them differently.”
The reason I laughed before is that if you think you can have, say, five more regulators at the F.T.C. … they’re not going to know what to do. You need a domain expert, like Tristan, to discern what the issues are and what the appropriate responses would be. We’re decades behind on this challenge. We all know Washington doesn’t understand technology very well. This Department of the Attention Economy would safeguard the health of our kids, but also provide that counterweight so we feel like we’re in good hands with the technology companies.
There’s this guy named Albert Wenger who talks about the idea that time will be the new money. If time is the new money, then you need somebody paying attention to how we’re spending our time. On a more human level, what’s more valuable than our time? Nothing. Again, a number of financial incentives push us to spend time in certain ways. That’s fine if it’s our choice. But, in many cases, we might not have the full independence and agency.
Would you base the agency outside of Washington?
Yeah, it’d have to be in Silicon Valley. I doubt I can get my design ethicists to move to D.C.