Will the future of work be ethical? Perspectives from MIT Technology Review

Greg M. Epstein is the Humanist Chaplain at Harvard and MIT, and the author of the New York Times bestselling book Good Without God.
Described as a "godfather to the [humanist] movement" by The New York Times Magazine in recognition of his efforts to build inclusive, inspiring, and ethical communities for the nonreligious and allies, Greg was also named "one of the top faith and moral leaders in the United States" by Faithful Internet, a project of the United Church of Christ and the Stanford Law School Center for Internet and Society.
In June, TechCrunch Ethicist in Residence Greg M. Epstein attended EmTech Next, a conference organized by the MIT Technology Review.
The conference, which took place at MIT's famous Media Lab, examined how AI and robotics are changing the future of work. Greg's essay, "Will the Future of Work Be Ethical?", reflects on his experiences at the conference, which produced what he calls "a religious crisis, despite the fact that I am not just a confirmed atheist but a professional one as well." In it, Greg explores themes of inequality, inclusion, and what it means to work in technology ethically, within a capitalist system and market economy. Accompanying the story for Extra Crunch is a series of in-depth interviews Greg conducted around the conference with scholars, journalists, founders, and attendees. Below he speaks to two key organizers: Gideon Lichfield, the editor in chief of the MIT Technology Review, and Karen Hao, its artificial intelligence reporter. Lichfield led the creative process of choosing speakers and framing panels and discussions at the EmTech Next conference, and both Lichfield and Hao spoke and moderated key discussions.

Gideon Lichfield is the editor in chief at MIT Technology Review.
Image via MIT Technology Review

Greg Epstein: I want to first understand how you see your job — what impact are you really looking to have?

Gideon Lichfield: I frame this as an aspiration.
Most of the tech journalism, most of the tech media industry that exists, is born in some way of the era just before the dot-com boom, when there was a lot of optimism about technology. And so I saw its role as being to talk about everything that technology makes possible. Sometimes in a very negative sense, but more often in a positive sense: you know, all the wonderful ways in which tech will change our lives. So there was a lot of cheerleading in those days.

In more recent years, there has been a lot of backlash, a lot of fear, a lot of dystopia, a lot of all of the ways in which tech is threatening us. The way I've formulated the mission for Tech Review would be to say: technology is a human activity. It's not good or bad inherently. It's what we make of it. The way that we get technology that has fewer toxic effects and more beneficial ones is for the people who build it, use it, and regulate it to make well-informed decisions about it, and for them to understand each other better.
And I'd say the role of a tech publication like Tech Review, one that is under a university like MIT, is that, probably uniquely among tech publications, we're positioned to make that our job: to try to influence those people by informing them better and instigating conversations among them. And that's part of the reason we do events like this, so that ultimately better decisions get taken and technology has more beneficial effects. So that's the high-level aspiration. How do we measure that day to day? That's an ongoing question.
But that's the goal.

Greg Epstein: Yeah, I mean, I would imagine you measure it qualitatively. In the sense that… what I see when I look at a conference like this is an editorial vision, right? I mean, I'm imagining that you and your staff have a lot of editorial meetings where you set, you know, what are the key themes that we really need to explore, what do we need to inform people about, right?

Gideon Lichfield: Yes.

Greg Epstein: What do you want people to take away from this conference, then?

Gideon Lichfield: A lot of the people in the audience work at medium and large companies.
And they're thinking about… what effect are automation and AI going to have in their companies? How should they affect their workplace culture? How should they affect their high-end decisions? How should they affect their technology investments? And I think the goal for me, or for us, is that they come away from this conference with a rounded picture of the different factors that can play a role. There are no clear answers. But they ought to be able to think in an informed and nuanced way.
If we're talking about automating some processes, or contracting out more of what we do to a gig-work-style platform, or different ways we might train people in our workforce or help them adapt to new job opportunities, or if we're thinking about laying people off versus retraining them: all of the different implications that has, and all the decisions you can take around that, we want them to think about in a useful way so that they can take those decisions well.

Greg Epstein: You're already speaking, as you said, to a lot of the people who are winning, and who are here getting themselves more educated and therefore more likely to just continue to win. How do you weigh where to push them to fundamentally change the way they do things, versus getting them to incrementally change?

Gideon Lichfield: That's an interesting question.
I don't know that we can push people to fundamentally change. We're not a labor movement. What we can do is put people from labor movements in front of them and have those people speak to them and say, "Hey, these are the consequences that the decisions you're taking are having on the people we represent."

Part of the difficulty with this conversation has been that it has been taking place, up till now, mainly among the people who understand the technology and its consequences: which was the people building it, and then a small group of scholars studying it.
Over the last two or three years I've gone to conferences like ours and other people's, where issues of technology ethics are being discussed. Initially it really was only the tech people and the business people who were there. And now you're starting to see more representation: from labor, from community organizations, from minority groups. But it's taken a while, I think, for the understanding of those issues to percolate, and then for people in those organizations to take on the cause and say, yeah, this is something we have to care about.

Greg Epstein: In some ways this is a tech ethics conference. If you labeled it as such, would that dramatically affect the attendance? Would you get fewer of the actual business people to come to a tech ethics conference, rather than a conference that's about tech but happens to take on ethical issues?

Gideon Lichfield: Yeah, because I think they would say it's not for them.

Greg Epstein: Right.

Gideon Lichfield: Business people want to know: what are the risks to me? What are the opportunities for me? What are the things I need to think about to stay ahead of the game? The case we can make is that ethical considerations are part of that calculus.
You have to think about what the risks are going to be to you of, you know, getting rid of all your workforce and relying on contract workers. What does that do to those workers, and how does that play back in terms of a risk to you?

Greg Epstein: Yes, you've got Mary Gray, Charles Isbell, and others here with serious ethical messages. What about the idea of giving back versus taking less? There was an L.A. Times op-ed recently, by Joseph Menn, about how it's time for tech to give back. It talked about how 20% of Harvard Law grads go into public service after their graduation, but if you look at engineering graduates, the percentage is smaller than that.
But even going beyond that perspective, Anand Giridharadas, popular author and critic of contemporary capitalism, might say that while we like to talk about "giving back," what is really important is for big tech to take less. In other words: pay more taxes, break up their companies so they're not monopolies, maybe pay taxes on robots, that sort of thing. What's your perspective?

Gideon Lichfield: I don't have a view on either of those things.
I think the interesting question is really: what can motivate tech companies, what can motivate anybody who's winning a lot in this economy, to either give back or take less? It's about what causes people who are benefiting from the current situation to feel they need to also ensure other people are benefiting.

Greg Epstein: Maybe one way to talk about this is to raise a question I've seen you raise: what the hell is tech ethics, anyway?

Gideon Lichfield: I would say there isn't a tech ethics. Not in the philosophy sense your background is from. There is a movement. There is a set of questions around it, around what technology companies' responsibility should be. And there's a movement to try to answer those questions.

A bunch of the technologies that have emerged in the last couple of decades were thought of as being good, as being beneficial, mainly because they were thought of as being democratizing.
And there was this very naïve Western viewpoint that said if we put technology and power in the hands of the people, they will necessarily do wise and good things with it, and that will benefit everybody. And these technologies, including the web, social media, smartphones, you could include digital cameras, you could include consumer genetic testing, all things that put a lot more power in the hands of the people, have turned out to be capable of having toxic effects as well. That took everybody by surprise.
And the reason that has raised a conversation around tech ethics is that it also happens that a lot of those technologies are ones in which the nature of the technology favors the emergence of a dominant player, because of network effects or because they require lots of data. And so the conversation has been: what is the responsibility of that dominant player to design the technology in such a way that it has fewer of these harmful effects?

And that, again, is partly because the forces that in the past might have constrained those effects, or imposed rules, are not moving fast enough. It's the tech makers who understand this stuff. Policy makers and civil society have been slower to catch up to what the effects are. They're starting to now. This is what you are seeing now in the election campaign: a lot of the leading candidates have platforms that are about the use of technology and about breaking up big tech. That would have been unthinkable a year or two ago. So the discussion about tech ethics is essentially saying: these companies grew too big, too quickly. What is their responsibility to slow themselves down before everybody else catches up?

Greg Epstein: Another piece that interests me is how sometimes the "giving back," the generosity of big tech companies or tech billionaires, or whatever it is, can end up being a smokescreen: a way to ultimately persuade people not to regulate, not to take their own power back as a people.
Is there a level of tech generosity that is actually harmful in that sense?

Gideon Lichfield: I suppose. It depends on the context. If all that's happening is corporate social responsibility drives that involve dropping money into different places, but there isn't any consideration of the consequences of the technology itself that those companies are building, and of their other actions, then sure, it's a problem. But it's also hard to say that giving billions of dollars to a particular cause is bad, unless what is happening is that the government is then shirking its responsibility to fund those causes because the money is coming out of the private sector. I can certainly see the U.S. being particularly susceptible to this dynamic, where government sheds responsibility. But I don't think we're necessarily there yet.