An Interview with the political scientist Isabella Hermann
Isabella Hermann was a jury member at the Berlin Sci-Fi Filmfest this year. She's a sci-fi fan and an expert on A.I., focusing in particular on the ethical problems that accompany its development.
Scroll down if you want to read the interview:
Tell us who you are and what you're doing here.
My name is Isabella Hermann, and I'm a member of the jury of the Berlin Sci-Fi Filmfest.
I also work as a research coordinator at the Berlin-Brandenburg Academy of Sciences and Humanities.
So… I'm a political scientist working in the field of A.I. and ethics.
Could you tell us more about your area of expertise?
Yeah, sure! I mean… actually, I'm a big science-fiction fan! Because of that, I bring a kind of sociopolitical perspective to the festival as a jury member.
And there are lots of connections between A.I. and sci-fi films.
Regarding the kinds of ethical and legal challenges we're facing right now with this new technology, I think it's really interesting that those problems and challenges – and maybe risks – aren't really tackled in sci-fi films.
So, sci-fi films are always about the evil Terminator, or about an overarching system like Skynet that might rule us, so that humanity loses control…
But what we're really dealing with right now are rather unsexy questions like data bias, liability law, and data protection. And those issues aren't really reflected or echoed in sci-fi films.
And I mean, I'm a big sci-fi fan! But in real life, I tackle these problems.
Could you give a concrete example regarding your work?
An important issue is the question of data bias. Because right now, A.I. essentially means machine learning, and machines learn from data. That data was produced in the past, by the people who held influence, and so it reflects the power structures of society.
So, to be more concrete: in historical data, for example, women aren't really represented in an equal way. If machines now learn from this data and no one corrects those data sets, women will also be discriminated against when that A.I. is applied, for example in companies that use A.I. systems for H.R.
This is really an issue! We need diverse programming teams and diverse coders, and we need people to be aware of it, so that we don't carry those past inequalities into the future.
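To make that point about biased training data a bit more concrete, here is a minimal sketch (not from the interview itself) using Python and scikit-learn with a purely hypothetical, made-up hiring dataset: a model trained on historically skewed hiring decisions simply learns to reproduce that skew.

```python
# Minimal illustrative sketch: a model trained on historically biased
# hiring data reproduces that bias. All numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical historical applicants: gender (0 = man, 1 = woman) and a skill score.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(loc=0.0, scale=1.0, size=n)

# Past hiring decisions: equally skilled women were hired less often,
# i.e. the "power structures of society" are baked into the labels.
hired = (skill + 1.0 * (gender == 0) + rng.normal(0.0, 0.5, n) > 0.5).astype(int)

# Train on the biased labels without correcting the data set.
X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# For two candidates with identical skill, the model now recommends
# the man with a noticeably higher probability than the woman.
same_skill = np.array([[0, 0.0], [1, 0.0]])  # [man, woman], same skill
print(model.predict_proba(same_skill)[:, 1])
```

Simply deleting the gender column would not be enough in practice, since other features can act as proxies for it, which is part of why she stresses correcting the data sets themselves and building diverse teams.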
Find out more about her passion at: www.popular-political-science.org
If you enjoyed this interview, please share, like and comment. Many thanks 🙂