Will the future of work be ethical? Future leader perspectives

Greg Epstein, Contributor

Greg M. Epstein is the Humanist Chaplain at Harvard and MIT, and the author of the New York Times bestselling book Good Without God. Described as a "godfather to the [humanist] movement" by The New York Times Magazine in recognition of his efforts to build inclusive, inspiring, and ethical communities for the nonreligious and allies, Greg was also named "one of the top faith and moral leaders in the United States" by Faithful Internet, a project of the United Church of Christ and the Stanford Law School Center for Internet and Society.
In June, TechCrunch Ethicist in Residence Greg M. Epstein attended EmTech Next, a conference organized by the MIT Technology Review.
The conference, which took place at MIT's famous Media Lab, examined how AI and robotics are changing the future of work. Greg's essay, "Will the Future of Work Be Ethical?", reflects on his experiences at the conference, which produced what he calls "a religious crisis, despite the fact that I am not just a confirmed atheist but a professional one as well." In it, Greg explores themes of inequality, inclusion and what it means to work in technology ethically, within a capitalist system and market economy. Accompanying the story for Extra Crunch is a series of in-depth interviews Greg conducted around the conference with scholars, journalists, founders and attendees. Below, he speaks to two conference attendees who had crucial insights to share.
Meili Gupta is a high school senior at Phillips Exeter Academy, an elite boarding school in New Hampshire. Gupta attended the EmTech Next conference with her mother, and has attended with family in previous years as well; her voice and thoughts on privilege and inequality in education and technology are featured prominently in Greg's essay. Walter Erike is a 31-year-old independent consultant and SAP Implementation Senior Manager from Philadelphia. Between conference sessions, he and Greg talked about diversity and inclusion at tech conferences and beyond.

Meili Gupta is a senior at Phillips Exeter Academy. Image via Meili Gupta

Greg Epstein: How did you come to be at EmTech Next?

Meili Gupta: I am a rising high school senior at Phillips Exeter Academy; I'm one of the managing editors of my school's science magazine, Matter Magazine. I [also] attended the conference last year. My parents have come to these conferences before, and that gave me an opportunity to come. I am particularly interested in the MIT Technology Review because I've grown up reading it.

You are the Managing Editor of Matter, a magazine about STEM at your high school. What subjects that Matter covers are most interesting to you?

This year we published two issues.
The first featured a lot of interviews with top AI professors, like Professor Fei-Fei Li at Stanford. We did a review of her work and an interview with Professor Olga Russakovsky at Princeton. That was an AI special issue, and being at this conference, you hear about how AI will transform industries. The second issue coincided with Phillips Exeter's Global Climate Action Day. We focused both on environmentalism clubs at Exeter and environmentalism efforts worldwide. I think Matter, as the only STEM magazine on campus, has a responsibility in doing that.

AI and climate: in a sense, you've already dealt with this new field people are calling the ethics of technology. When you hear that term, what comes to mind?

As a consumer of a lot of technology, and as someone of the generation who has grown up with a phone in my hand, I'm aware my data is all over the internet.
I've had conversations [with friends] about personal privacy, and if I look around the classroom, most people have covers for the cameras on their computers. This generation is already aware [of] ethics whenever you're talking about computing and the use of computers. About AI specifically, as someone who is interested in the field and has been privileged to be able to take courses and do research projects about it, I'm hearing a lot about ethics with algorithms, whether that's fake news or bias or applying algorithms for social good.

What are your biggest concerns about AI? What do you think needs to be addressed in order for us to feel more comfortable as a society with increased use of AI?

That's not an easy answer; it's something our society is going to be grappling with for years. From what I've learned at this conference, and from what I've read and tried to understand, it's a multidimensional solution. You're going to need computer programmers to learn the technical skills to make their algorithms less biased. You're going to need companies to hire those people and say, "This is our goal; we want to create an algorithm that's fair and can do good." You're going to need the general society to ask for that standard. That's my generation's job, too.
WikiLeaks, a couple of years ago, sparked the conversation about personal privacy, and I think there are going to be more sparks.

It seems like your high school is doing some interesting work in terms of incorporating both STEM and a deeper, more creative than usual focus on ethics and exploring the meaning of life. How would you say that Exeter in particular is trying to combine these issues?

I'll give a couple of examples of my experience with that in my time at Exeter. And I'm very privileged to go to a school that has these opportunities and offerings for its students.

Don't worry, that's in my next question.

Absolutely.
With the computer science curriculum, starting in my ninth grade year, they offered a computer science 590 about [introduction to] artificial intelligence. In the fall, another 590 course was about self-driving cars, and you saw the intersection between us working in our robotics lab and learning about computer vision algorithms. This past semester, a couple of students, myself included, helped to set up a 999: an independent course which really dove deep into machine learning algorithms. In the fall, there's another 590 I'll be taking, called social innovation through software engineering, which is specifically designed for each student to pick a local project and to apply software, coding or AI to a social good project.

I've spent 15 years working at Harvard and MIT. I've worked around a lot of smart and privileged people, and I've supported them. I'm going to ask you a question about Exeter, and about your experience as a privileged high school student who is getting a great education, but I don't mean it from a perspective of it's now me versus you. Of course you're not. I'm trying to figure this out for myself as well.
We live in a world where we're becoming more prepared to talk about issues of fairness and justice. Yet by even just providing these extraordinary educational experiences to people like you and me and my students or whomever, we're preparing some people for that world better than others. How do you feel about being so well prepared for this sort of world to come that it can actually be… I guess my question is, how do you relate to the idea that even the kinds of educational experiences that we're talking about are themselves deepening the divide between haves and have nots?

I completely agree that the issue between haves and have nots needs to be talked about more, because inequality between the upper and the lower classes is growing every year.
This morning, the talk by Mr. Isbell from Georgia Tech was really inspiring. For example, at Phillips Exeter, we have a social service club called ESA, which houses more than 70 different social service clubs. One I'm involved with, junior computer programming, teaches programming to local middle school students. That's the type of thing, at an individual level and smaller scale, that people can do to try to help out those who have not been privileged with opportunities to learn and get ahead with those skills.

What Mr. Isbell was talking about this morning was at a university level, and also tying in corporations to bridge that divide. I don't think that the issue itself, say, the possibility that everybody who does not have a computer science education won't have a job in five years, should necessarily scare us from pushing forward to the frontier. Today we had that debate about robots taking people's jobs, and robot taxes.
That's a very good debate to have, but it sometimes feeds a little bit into the AI hype, and I think it may be a disgrace to society to try to pull back technology, which has been shown to have the power to save lives. It can be two transformations happening at the same time: one that's trying to bridge an inequality, which is going to come in a lot of different and complicated solutions that happen at multiple levels, and a second that allows for a transformation in technology and AI.

What are you hoping to get out of this conference for yourself, as a student, as a journalist, or as somebody who's going into the industry?

The theme for this conference is the future of the workforce.
I'm a student. That means I'm going to be the future of the workforce. I was hoping to learn some insight about what I may want to study in college. After that, what type of jobs do I want to pursue that are going to exist, be in demand, and be really interesting, and that have an impact on other people? Also, as a student who is particularly interested in majoring in computer science and artificial intelligence, I was hoping to learn about possible research projects that I could pursue in the fall with this 590 course. Right now, I'm working on a research project with a professor at the University of Maryland about eliminating bias in machine learning algorithms. What type of dataset do I want to apply that project to? Where is the need or the attention for correcting bias in AI algorithms? As a journalist, I would like to write a review summarizing what I've learned, so other [Exeter students] can learn a little too.

What would be your biggest critique of the conference? What could be improved?