‘Everyone Is Not Watched Equally’

Janine Jackson interviewed Alvaro Bedoya on privacy, technology and the targets of surveillance for the March 11, 2016, CounterSpin. This is a lightly edited transcript.

Alvaro Bedoya: “There is a sad and deep record of companies and the government making very poor decisions about how to treat the information of vulnerable people.” (image: Big Picture Science)

MP3 Link: http://www.fair.org/audio/counterspin/CounterSpin160311Bedoya.mp3

Janine Jackson: Would it bother you if, when you walked into a department store, a hidden camera scanned your face and checked it against a database of VIP customers, suspected shoplifters and “known litigious people”? What if your church used facial recognition technology to see who was attending?

You can almost divide people between those who find the idea creepy and wrong, and those who say, “Eh, all in a day,” with the latter reaction meaning maybe “I have nothing to hide, so what?” or maybe “It’s inevitable anyway; that ship has sailed.”

But is this unprecedented corporate access into our lives permitted just because it’s technically possible? What expectations of privacy remain when we shop or walk down the street? How does it relate to government surveillance and, as with government surveillance, shouldn’t we ask who, when it comes down to it, is most likely to be harmed?

Alvaro Bedoya is the founding executive director of the Center on Privacy & Technology at Georgetown Law. He was chief counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law. He joins us now by phone from Washington, DC. Welcome to CounterSpin, Alvaro Bedoya.

Alvaro Bedoya: Thank you. It’s great to be here.

JJ: Well, I’d like to set aside the question of inevitability and assume that these things are not inevitable; technology exists, but we decide socially how to employ it. In the case of facial recognition technology, you have been part of the process seeking to figure out how it’s used, how privacy is protected. And the way that that process went is worth recounting in some detail, because it explains a lot about how we got to where we are. Tell us about those 2015 negotiations. Who was there, first of all?

AB: These negotiations, convened by the Department of Commerce, aimed to have privacy advocates and industry representatives come together and settle on a set of privacy best practices that companies could basically declare, we’re going to adhere to these, and that could then be enforceable against a company for its use of facial recognition technology. They started at the end of 2014 and technically continue to this day, but they broke down in the summer of 2015.

JJ: Well, now, you were representing privacy advocates, and being a privacy advocate; there were at the table tech companies, advertisers, retailers and those sorts of folks, right?

AB: That’s right. Facebook was there, Microsoft was there, the Interactive Advertising Bureau was there, industry associations like NetChoice were there, and so there were a lot of lobbyists and actual representatives from companies.

JJ: Well, the mind-blowing part of this—and folks can find your article about it on Slate.com—is what led to the walkout: these groups really wouldn’t agree to any instance in which consent was necessary. Tell us what the stumbling block, if you will, was.

Facial recognition markers

AB: Sure. At its most basic level, the right to privacy is the ability to say no, leave me alone. This is how, basically, the American legal system has defined the right to privacy with respect to companies since the late 1800s. Right now, when you actually see most companies rolling out facial recognition, as a matter of practice, as a matter of what they do, Janine, they generally don’t use it on people to identify them without their permission.

And so, in this negotiation, the privacy advocates said, well, OK, guys, here’s this commonsense, on-the-ground best practice that’s actually followed in practice: you have to get permission before using facial recognition to identify someone. And so we said, why don’t we use that general rule, that you have to get permission, as a baseline for these best practices? And every single company and industry group that spoke up said no.

And so then we narrowed the ask even further. We said, can we all at least agree that when you are walking on a public street, that a company you have never heard of should have to get permission before using facial recognition to identify you by name? And again, not a single company or trade association would agree to that.

And that’s when we said, you know what, we are arguing with a bunch of—you’ve heard of yes men? We’re arguing with a bunch of no men. We’re arguing with a bunch of folks who aren’t here to reach an agreement or a consensus; they’re here to stop this. And that’s when we walked out.

JJ: Well, you put your finger, even more finely, on what’s going on here. Because you note that, first of all, the policy that they wouldn’t agree to on paper is the policy that the companies actually use in practice.

AB: Most of them, that’s right.

JJ: Because they have customers, and customers demand it. And part of what you’re identifying as the problem is that the folks who are in that room doing the lobbying—their connection to people, to customers, is rather indirect.

AB: That’s right. And this happens at a couple different levels. First of all, the representatives from the companies aren’t the folks who are actually, you know, either on the sales floor or rolling out a product online; they are the DC lobbyists. But I want to be honest, frankly, I would have preferred to work with a bunch of actual representatives of companies with skin in the game.

JJ: Uh-huh.

AB: Because, at the end of the day, they do have brands and reputations to maintain. The deeper layer to this is that you have trade associations that effectively cater to the lowest common denominator of their membership. Let’s say I’m a trade association and there are five companies who are my members; in other words, the folks who pay my bills as a lobbyist. If two of them are OK with heightened privacy standards but one of them says, no sir, I gain nothing from those, those will hurt me, then I have to cater to that lowest common denominator. And so the trade associations are entities you’ve never heard of, that in many cases I frankly had never heard of until I entered into the negotiations, and they are putting their foot down and blocking any hope of progress toward establishing these best practices.

JJ: Which is really a fascinating statement about the way the regulatory process and the lobbying decision-making process go on in this country. It’s this tier of industry lobbyists, who aren’t directly tethered to customers, who are, as you put it, shutting down Washington’s ability to protect consumer privacy. I mean, it’s really quite remarkable, and something I would hope journalists would see as having implications even beyond this particular case, obviously, and as worth digging into as just kind of a “how laws get made,” you know, “how policies get made” story.

AB: Yeah, that’s exactly right.

JJ: Well, you also note that there are some things that are kind of structural, in the sense that when we talk about government surveillance, we have a couple of places that we can check that. Whether or not we do is another question. But when it comes to private companies, there are kind of fewer tools in the toolbox, aren’t there?

AB: Yes. And this is a more subtle point that I think a lot of folks miss. A lot of people look at where we are in terms of government surveillance. You know, they look at the Snowden papers, they look at what the NSA, we now know, has done, and they think, oh, man, you know, we are at a low point in terms of our privacy against the government. But what people forget is that our nation was literally founded on the idea of checking government overreach. And so, you know, we have the Fourth Amendment, which protects us against unreasonable searches and seizures, and we have Congress, where it’s very easy to form bipartisan alliances to stop government overreach.

You know, Edward Snowden released his documents, and the Guardian and the Post published them, in June 2013. One year later, the House had already passed a bill to curtail the call records program; two years later, the president was signing that bill into law after the Senate had passed it, and a federal judge had declared the program to be unconstitutional. And so, yes, there is government overreach on government surveillance, but we’re ready for it.

On the other side of the coin, though, as you note, our country was not built to combat corporate overreach. The only world in which a court or a government agency steps in to protect our privacy against companies is a world where either Congress or a state legislature has passed a law allowing them to do that.

And, unfortunately, as a result of this lobbying, Congress has stopped passing consumer privacy laws. Since 2009, there has been only one minor expansion of consumer privacy law, and one other change that was actually a contraction of consumer privacy law, so I’d call it a wash.

JJ: Right.

AB: Instead, what you’re seeing is state legislatures starting to pass these bills, and that’s where I think the hope lies in terms of consumer privacy.

JJ: Well, and I’m sure some folks will be saying that Congress could do more and that we could do more on other levels in checking government surveillance. But I think the point is, at least we have the structures, at least we have the —

AB: That’s right.

JJ:—mechanisms to do that, whereas when it comes to the corporate side, we’re kind of grasping to use the tools that we’ve got. And Congress is not doing what it might be doing, which is passing new laws. And it’s not as though nothing has changed since 2009 in terms of consumer privacy questions that might be addressed; there’s been plenty.

AB: That’s right. And look at the technology we’re talking about now: facial recognition, geolocation technology, wearables, connected home devices like a smart TV or the Amazon Echo. These are technologies that, around 2007, 2008, 2009, we’d really never heard of. Maybe we’d heard of geolocation, but all the rest of these were really products of the last five or six years. And yet all these technologies are effectively unregulated right now.

JJ: Well, in a 2014 regulatory filing, you and the Center called on Commerce to support strong consumer controls on the collection of personal data. You noted that this kind of ubiquitous collection is not inevitable. But you also pointed out in that filing that “a de-emphasis on consumer controls may be particularly harmful for traditionally disadvantaged groups, including the poor, racial and ethnic minorities, immigrants, LGBT individuals and the elderly; privacy is in many ways a shield for the weak.” I wonder if you can talk a little bit about what you mean by that.

AB: Yeah. Let’s put this in really simple terms. Right now, what a lot of folks in the government and what a lot of companies want is a world where we protect privacy after your personal information is collected. It’s kind of like shoot first and ask questions later. Maybe that’s unfair, but think about it this way. Traditionally we’ve protected privacy by empowering people, particularly in the private sector, to say, yes, I’m OK with you collecting this sensitive information; no, I don’t want you to collect that information. And this empowers people to make choices.

Now, obviously as technology becomes more complicated, as we generate almost infinitely larger amounts of data, that exercise becomes harder. But what many in government and what a lot of folks in the private sector are arguing for now is a world where we no longer have that choice. Where instead of protecting privacy at the point of collection, privacy is purportedly protected after the fact. And the real problem here is: who is making those decisions about how your information is used? Companies and the government. And for years and years, there is a sad and deep record of companies and the government making very poor decisions about how to treat the information of vulnerable people.

Japanese-American children stand by a fence in a US internment camp during World War II.

The government, for example, during World War II used Census data, which was supposedly only going to be used for the Census, to figure out where Japanese-Americans were living, to track them down and detain them in internment camps. Nowadays, you have data brokers who are literally creating lists of individuals who are HIV positive, or individuals who are victims of sexual assault, or who have diabetes or Parkinson’s disease. These are lists that, by and large, aren’t going to help the people on those lists.

JJ: Right. They’re selling those to —

AB: To rip people off, that’s right.

JJ: So they’re selling them to companies. Just to be clear, they’re collecting this data and selling it to companies who might then sell things to those people or…?

AB: Data brokers do this. And sometimes these lists are marketed to other companies that sell things to the people on them. Sometimes they’re literally purchased by fraud rings that call people up and enter them into a series of sweepstakes and other basically fraudulent enterprises that bilk them of thousands upon thousands of dollars. And, frankly, we don’t know what we don’t know about this industry, because it is unregulated. And so when I say privacy is a shield for the weak, I mean that privacy allows vulnerable people to go about their business, and to make choices about their lives, without powerful government entities or corporate entities second-guessing them, looking over their shoulder and saying don’t do this, do that, or using their data in a way that might harm them.

JJ: Well, let’s pivot just a little bit. Government and corporate surveillance technology may be subject to different mechanisms, but they share this impulse to collect and to use information that people may not know is being used, and whose use can have a serious impact. So there’s a reason that your group is hosting a conference in early April that’s called “The Color of Surveillance.” I wonder if you could talk a little about what we ought to know about that aspect of this issue that we’re discussing.

AB: Certainly. There is a really interesting thing going on in our country right now. What are people talking about? They’re talking about the brutality and pervasiveness of policing in the black community. That is one conversation that’s occurring. There is another conversation, though, that’s occurring about the role of government surveillance in a free and democratic society. This was triggered by the Edward Snowden revelations, and it continues to this day. And so you have these two huge debates, historic debates, and yet they almost never intersect.

There’s no discussion of the fact that for almost the entirety of our nation’s history, the black community and people of color in general have been the disproportionate targets of unjust surveillance. People might know about Martin Luther King and how the FBI surveilled him. People don’t know that the NSA also wiretapped him. And they might not know that it wasn’t just Martin Luther King; it was Fannie Lou Hamer, it was Whitney Young of the National Urban League, it was Cesar Chavez. Before that, it was W.E.B. Du Bois and Marcus Garvey.

And before that, enslaved people in our country in cities like New York, for example, literally could not walk outside after dark without carrying a lantern on them so that everyone could see them. These were called lantern laws.

And fast forward all the way to this day, and I realize I’m covering about 300 years of history in about six breaths and four sentences, but today activists with Black Lives Matter and journalists have revealed that the Department of Homeland Security, an agency founded to combat terrorism, is using its resources to monitor Black Lives Matter, even entirely peaceful activities. And so the purpose of this conference is to bring these communities and bring these conversations together, and show that in a world where everyone is watched, everyone is not watched equally. And we need to reckon with that, as a society and in Congress.

JJ: Writing on this issue, Malkia Cyril noted that “many journalists still focus their reporting on the technological trends and not the racial hierarchies that these trends are enforcing.” What would you like to see from the press corps? I did notice, on the earlier issue of facial recognition technology, that the Times had a piece on February 29 that mentioned harm from companies collecting data, the way they might use a person’s health status or financial straits and connect that to unfair or inferior treatment. Certainly we have seen some coverage of spying or intervention on organizing. But in general, what would you like to see from media in putting these issues together and carrying this story forward?

AB: Malkia’s exactly right, and frankly she’s probably the leader on these issues. What I would like to see is more attention to who the eye is watching. We spend so much time thinking about the eye, how it works, how invasive it is. But we never stop to ask, OK, in what communities are Stingrays actually being deployed, and who is in these facial recognition databases, and what communities is predictive policing being rolled out in? And once we start asking the who of these questions and not the how, I think we will recognize that, by and large in this nation, invasive surveillance technology is almost as a rule beta tested on low-income black and Latino communities.

That is the first issue: we need to start asking about the who and not just the how. The other issue is that we tend to think of surveillance as something technological, and in the 21st century, in 2016, it increasingly is. But we can’t forget about all of the run-of-the-mill surveillance techniques, like, for example, stop-and-frisk, racial profiling of drivers, pulling over drivers for “driving while black.” And we can’t forget that stopping someone and searching them on a pretense, or for frankly no good reason at all, that is surveillance. Stopping someone while they’re driving and asking them some questions and seeing if they trip up, that is also surveillance. We can’t just think that this is about technology. What this is about is monitoring a community. And we need a comprehensive evaluation of this trend, and we need to talk about how to fix it in Congress.

JJ: We’ve been speaking with Alvaro Bedoya of the Center on Privacy & Technology at Georgetown Law. His articles “The Color of Surveillance” and “Why I Walked Out of Facial Recognition Negotiations” can both be found on Slate.com. Alvaro Bedoya, thank you so much for joining us today on CounterSpin.

AB: It was my pleasure. Thank you so much.

This piece was reprinted by RINF Alternative News with permission from FAIR.