- Erik Huizer issues call to action at the Internet2 Global Summit.
- Personal ethics are endangered in the internet era.
- Reactionary security measures introduce vulnerabilities, hampering innovation.
The internet: it's a broad platform that intersects human activity at many points. Ethics: also a broad topic. Lucky for us, we met up with Erik Huizer, chief technology officer for SURFnet, at the recent Internet2 Global Summit in Chicago. Huizer was there to discuss internet freedom and the ethics of technology, and he narrowed the scope for us a bit.
As citizens of Western democracies we've come to expect certain civil liberties. Are these still protected in the cyberworld like we expect them to be?
They don’t really apply in the cyberworld for the simple reason that there’s not one government in the cyberworld. It’s not only a multi-state approach; even more, it’s a multi-stakeholder approach to how we govern the internet. This requires extensive consultation meetings to solve problems.
A couple of years ago, 48 countries worked together to disable a child pornography network and to capture all the people involved in all these countries. That was a relatively easy task because child pornography laws are similar in almost every country.
But if you talk about something like initiating distributed denial-of-service (DDoS) attacks and you want to clamp down on that with 48 countries, you have got a lot of talking to do, because in some countries there’s no law against DDoS attacks. All of a sudden it’s way more difficult to do something in cyberspace because all these laws are not aligned.
Let's talk about the ethics of the technology in the context of national research and education networks (NRENs) like Internet2. What ethical issues do you think we should be focusing on?
Privacy is one of the chief ethical issues that we run into, not only from commercial parties and governments, but also of course from criminals who try to misuse your identity.
Privacy is dependent upon context. What I give away on Facebook I give away in the context of giving it away on Facebook. If I give something away on Twitter it’s in that context. If I buy a pair of shoes online I give away some more privacy in that context. But if Google starts to combine those three, then it changes the context and it invades my privacy.
An even clearer example of that for governments is seen in 1939 in my country of the Netherlands. Every village, every city had a very nice registration of every inhabitant in that city — name, surname, birth date, birth location, and religion. Nobody cared because they trusted that government.
One year later we were invaded by the Germans, who used those databases to round up all the Jews in my country and deport them, because they were registered as being of the Jewish religion. So the context of this information sharing changed incredibly.
So we need to think better about how we deal with these things in a way that they can evolve with our culture over time.
Does Edward Snowden have lessons for NRENs?
I think it’s very courageous what he’s done. He has shown that governments are misusing the internet without democratic control.
One of those aspects is the mass surveillance that is happening. The disadvantage of mass surveillance is that the point of departure is ‘trust no one.’ My question is: Can a democracy function if the government distrusts its citizens by default? And how can a researcher then promise data will only be used for his research? So you create a level of distrust that will probably work against researchers getting data which they normally would get in order to fight cancer or whatever.
The other thing that Snowden showed is that governments are consciously corrupting some of the software and hardware that’s out there. As NRENs we have to realize we have an obligation to investigate what the impact is, because unfortunately it’s not only the good guys that use those vulnerabilities. Besides, who are the 'good guys'? The US government might be considered the 'good guys' by inhabitants of the US, but not by the Chinese — and vice versa.
At SURFnet we’re doing a full screening now of all the equipment that we have to detect those vulnerabilities, just to know what we are delivering and to be able to tell our customers what’s in there and what can be misused. This way we ensure that we fulfill our obligation and we educate them as fairly and as transparently as possible.
Are there ethical issues other than privacy you are concerned about?
Let me give an example: the self-driving car. You’re driving your own car along the highway and all of a sudden a child crosses the road. You have to make a split-second decision there, and that split-second decision you make is a product of the morality that is ingrained in you by your culture.
Now if it’s a self-driving car, again, it’s a split-second decision, but this time the decision is based on one line of software code. Who wrote that line? What morality, what culture, applied?
Another example to illustrate: The Apple app store does not allow any form of nudity in apps, which is probably okay for the American culture. But in France, for example, they have a different culture, so the French culture is invaded by the culture that Apple forces on them. This might seem harmless but if you do that too much what is left in the end?
This technology we’re delivering — how does it affect the lives of the people around us and what kind of decisions are preconfigured in there? How do we make sure that those are in line with what society wants, and how do we get it through a democratic process rather than just one person writing a piece of code somewhere?
You’re a champion of net neutrality, and often speak about the ‘Balkanization of the internet,’ but don't we already have different abilities to access and use the internet, such as in the case of the Science DMZ?
Net neutrality says once you have access, regardless of what speed, every byte is treated in the same way. So it’s not like Netflix works faster than HBO and so we only watch Netflix.
Having something like a Science DMZ does not violate net neutrality, because it does not hinder a private person from having access. As a researcher, what you need at that point in time is a certain high-bandwidth service to send data. All the packets, once you are using that, are treated in the same way. It’s just that the Science DMZ is configured for transferring high volumes of research data and not for accessing the public internet.
The distinction in net neutrality is not that you can’t have a second-tier network for different sectors. The distinction is that once an individual has access to the internet he has equal access to every service on the internet.
I think that everybody will be happy to have a second tier for things like the Science DMZ, because if those transfers were to occur on the first tier, they would impact your Netflix traffic and you would be competing for the bandwidth. That means the researcher would not get his data through, and you wouldn’t get your Netflix — and so nobody would be happy.
What do you wish the world understood better about ethical issues on the internet?
What I would like people to understand most is that we need a secure internet for our society to function. We also want to combat terrorism but taking draconian measures on the internet to avoid terrorism undermines the security that we need on the internet.
I would like politicians to understand that everything needs to be proportional. You don’t say just because of a terrorist attack let’s close down the internet and forbid encryption and stuff like that, because the harm done by those reactions is probably much bigger than the harm that is done by terrorism.