‘Women Were in Fact Pioneers in Computing Work’

Janine Jackson interviewed Marie Hicks about Google and women in tech for the August 25, 2017, episode of CounterSpin. This is a lightly edited transcript.



Janine Jackson: When a white male Google employee was fired after the release of a memo in which he complained about efforts to increase gender or racial diversity because employment gaps in tech may be due in part to “biological differences” which are “universal across human culture”—women like feelings more than ideas, that sort of thing—this was not an indication that, as USA Today hyperventilated, “the hot button issue of gender bias in the workplace has just gone thermonuclear.” It did provide occasion for the airing of some old ideas involving sexism and scientism, presented as somehow new, because technology.

Surely the current reinvigoration of ideas about white male fragility has something to do with the reception James Damore’s missive received. But the fact is the idea that some people just can’t do some jobs has a long history. And attention to institutional biases in the technology sector does have societal implications beyond the impact on tech workers themselves.

Our next guest works on just these issues. Marie Hicks is assistant professor of history of technology at the University of Wisconsin/Madison, and author of the book Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. She joins us now by phone from Madison. Welcome to CounterSpin, Marie Hicks.

Marie Hicks: Thank you for having me.

JJ: Well, on human-driven climate disruption, we’ve got to a point where media don’t, as a rule, feel a need to balance believers and deniers. And they even, I think, understand the harm that comes from giving denialism any oxygen, even if you think you’re addressing it.

One would hope that biological determinism would have reached that point long ago. Yet a number of news reports that I saw bore a resemblance to this August 10 USA Today piece that claimed that the Google memo “inflamed an already contentious debate”:

In his memo, Damore laid out the differences between men and women, saying that on average women have higher anxiety and lower stress tolerance, and that men have a higher drive for status.

While those thoughts outraged many people, others have urged that Damore’s points be assessed with balance.

Well, let’s see, “outrage” or “balance,” which are you for? Was there any surprise for you, any new wrinkle in the events themselves here, or in the big media’s reaction?


MH: Sadly, there wasn’t any surprise for me in terms of content. This is a very old story, and it’s a very old gambit. And, as you pointed out, these ideas are very dangerous, because they slide very easily into discrimination, and they slide even into things like eugenics. You know?

But the thing that maybe surprised me a little bit was how the media took this memo and just ran with it. And I think that happened partly because of the whole, well, trying to get more clicks. But then, also, I think it happened because we’re finally getting to the point where we’re starting to see sexism in tech is actually an issue we have to look at, and we can’t turn away from, or see it as a niche or a women’s issue anymore.

JJ: When I did research years ago into media coverage of affirmative action, I found that a lot of media had sort of given up on it, almost, as a reportable story, as actual policies and programs. It was only an emotional field of claims and counterclaims. And, importantly, the news reports were completely untethered from any context of the discrimination itself. The technical and professional structure is “actively designed to discriminate,” as you have put it. Intentional discrimination designed to push women out of the field is a big part of the story you tell in the book Programmed Inequality, and it’s such an important counter to this idea that’s afoot that women or people of color just gravitate to certain kinds of work. Can you tell us a little about that theme in the book?

Early computer expert Andrina Wood

MH: Sure. The ideas presented in the Google memo, in addition to being nothing new, are completely ahistorical. They’re disproved at every turn by, for instance, the history of computing. My book shows very clearly how on both sides of the Atlantic, women were in fact pioneers in programming, pioneers in computing work. And this was because the work was actually seen as fairly low-skilled. It was seen as unimportant, early on. And it’s only later on, in the ’60s, when the work begins to be seen as sort of powerful and important, that is when men start to be, essentially, pushed into the field. And there’s a real push to try to reskill the work, which means, make it seem as though it’s more skilled, not actually change the content of the work. And this is how we eventually get the perception that computing work is more appropriate for men than it is for women.

And to connect it up with today, one of the things that the memo sort of implies, and a lot of the readers of the memo have said, they say, oh, men are somehow being left out or pushed out of the field, in favor of women or people of color who are less well-suited and less qualified. And in fact, there have been studies done that show the exact opposite is happening, that when researchers have sent out resumes with men’s names on them and women’s names on them, identical resumes, it’s the men who are getting more job opportunities and getting more interviews than the women.

JJ: Yeah. It’s not just the kind of old-timey binary sexism and racism that gall. It’s this fake dispassion. You know, the employment of dry technical language to put forward a wholly unscientific idea, and to then kind of inoculate yourself against counterclaims as being almost by definition emotional.

I remember years ago a white male journalist who told me, “If I were black, I’d be at the Washington Post by now.” To which I responded, “Well, which one would you kill?” I mean, there’s a tiny handful of black people in the Post newsroom; the vast majority of them are white. But in this guy’s head, the only reason that he, as a white man, wasn’t working there was because he wasn’t black. It’s not just that it’s insulting, it kind of hurts your brain.

MH: Yeah. And this has a really long history, and it’s one of the reasons, for instance, it was really difficult to get good programmers early on. Because programming is not a science; it’s more like a craft, some might even say kind of an art to program well. And companies, IBM in particular, had their big programmer aptitude test, which basically tried to regularize and make scientific the selection of programmers, and which, of course, didn’t work. It didn’t actually get people who were any more likely to be better programmers than anyone else. It was a matter of training people to be good programmers. But this sort of veneer of scientism, as you put it, has been with computing for a very, very long time.

JJ: The media discussion has focused, with reason, on the impact of discrimination on women employed, or looking to be employed, in tech. But your book also outlines the impact of that discrimination on the sector itself. What, for the purposes of this discussion, is salient about that aspect of the history that you recount in the book?

MH: When I went into the archives, I really wanted to understand what had caused this gendered labor flip, from programming and computer work being seen as feminine, and being feminized, to being seen as male-identified. But what I found was the shocking connection to what actually happened to the British computing industry: In the process of trying to push a new labor force into the field, and thereby cutting out everybody who actually had the skills to do this work, the British government and British industry created very intense labor shortages, in terms of programming and computer-operator positions.

It meant that they needed different kinds of machines—essentially larger, more centralized mainframes that could function with smaller labor forces. As a result, the British computing industry was essentially forced to provide to the government these huge, massive mainframes, at a point in time when the mainframe was actually on the way out. So the British government, as a result of their self-made labor shortages, forces the British computing industry to do something that’s actually against the industry’s interests, and in the process, they end up shooting their own computing industry right in the foot.

JJ: Yeah, it seems clear. I’m never 100 percent behind the “do diversity because it’s good for business” line alone. I mean, societal values are not the same as market values. But the idea that affirmative action and diversity and inclusion would be good for science, for technological advancement, seems to me something that’s just observably the case.

MH: Yeah, I want to make very clear that I’m not trying to make an argument for civil rights on the basis that civil rights are profitable. I think that once we make that our main argument, it is a very, very dangerous position to be in. But as you point out, again and again, what we see is that discrimination wrecks economies.

JJ: CNN said that because

all summer, headlines have screamed out warnings for women in tech…some experts worry that all the bad press could discourage more women from becoming engineers or pursuing other tech careers.

Now, I see what they’re trying to do there, but obviously it isn’t sunshine on the issue that is the real problem here.

MH: Yeah. I think that CNN’s claim is pretty baseless. Obviously, white women and people of color very much understand these issues already. And seeing a headline about them is probably not going to stereotype-threat them that much more. It may just validate what they already know to be true.

But the type of news reporting really does matter, and when you have this sort of fake “fair and balanced” news reporting, where you take ideas that on their face have basically no merit, and you present them as though they’re something reasonable to intellectually engage with, then that really is a problem. Because that confuses people, obviously, and people who have been subject to discrimination on the basis of their gender and/or their race, they get gaslit in the process; they start to see that the outside world isn’t really seeing things clearly. And that can be a very, very disturbing position for people to be in.

JJ: We’ve been speaking with Marie Hicks of the University of Wisconsin/Madison. Her book Programmed Inequality is out now from MIT Press. Marie Hicks, thank you very much for joining us this week on CounterSpin.

MH: Thank you so much for having me.

This piece was reprinted by RINF Alternative News with permission from FAIR.