Internet Balkanization is here already, Mr. Schmidt.


In the technical community we like to say that the Internet is a network of networks, and that each network is independently operated and controlled. That may be true in some technical sense, but it is far from the pragmatic truth.

By ProjectManhattan – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=39714913

Today’s New York Times contains an editorial that supports former Google CEO Eric Schmidt’s view that the Internet will balkanize in two: one centered on US/Western values and one on China’s. Indeed, it goes further, arguing that there will be three large Internets, with Europe forming its own center.

The fact is that this is the world in which we already live. It is well known that China already has its own Internet, in which the government can spy on every application. With the advent of the GDPR, those of us in Europe have been cut off from a number of non-European web sites that refuse to comply with Europe’s privacy regulations. For example, I cannot read the Los Angeles Times from Switzerland. Instead, I get this lovely message:

Unfortunately, our website is currently unavailable in most European countries. We are engaged on the issue and committed to looking at options that support our full range of digital offerings to the EU market. We continue to identify technical compliance solutions that will provide all readers with our award-winning journalism.

And then there are other mini-Internets, such as Iran’s, whose government has attempted to establish its own borders, not only to preserve its culture but also, in its view, its security, in the wake of attacks such as Stuxnet.

If China can make its own rules, and Europe can establish its own rules, and the U.S. has its own rules, and Iran has its own rules, can we really say that there is a single Internet today?  And how many more Internets will there be tomorrow?

The trend is troubling. 

We Internet geeks also like to highlight The Network Effect, in which the value of the network to each individual increases based on the number of network participants, an effect first observed with telephone networks.  There is a risk that it can operate in reverse: each time the network bifurcates, its value to each participant decreases because of the loss of the participants who are now on separate networks.
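
As a rough illustration, here is a back-of-the-envelope sketch in Python, assuming a Metcalfe’s-law-style valuation in which a network’s value tracks the number of possible pairwise connections; the population figures are round numbers for illustration, not real statistics:

```python
# Back-of-the-envelope sketch of the "Reverse Network Effect",
# assuming a Metcalfe's-law-style valuation: a network's value is
# proportional to the number of possible pairwise connections
# among its n participants, n * (n - 1) / 2.

def metcalfe_value(n: int) -> int:
    """Possible pairwise connections among n participants."""
    return n * (n - 1) // 2

# One Internet with four billion users...
whole = metcalfe_value(4_000_000_000)

# ...versus the same users split across two non-interoperable halves.
split = 2 * metcalfe_value(2_000_000_000)

print(f"single network: {whole:.3e} possible connections")
print(f"two halves:     {split:.3e} possible connections")
print(f"value retained: {split / whole:.1%}")
# -> roughly 50%: half of all possible connections vanish in the split,
#    because every participant loses reach to everyone on the other side.
```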

Ironically, the capabilities found in China’s network may prove very appealing to other countries such as Iran and Saudi Arabia, just as shared values around the needs of law enforcement previously led to a single set of lawful intercept capabilities in most telecommunications equipment. That latter example reflected shared societal values of the time.

If you believe that the Internet is on the whole a good thing, then a single Internet is preferable to many bifurcated Internets. But that view is, at least for the moment, losing to the divergent values reflected in the isolationist policies of the United States, the unilateral policies of Europe, Brexit, and of course China. Unless and until the economic effects of the Reverse Network Effect are felt, there is no economic incentive for governments to change direction.

But be careful: a new consensus may be forming that some might not like. A number of countries, seemingly led by Australia, are seeking ways to gain access to personal devices such as iPhones for purposes of law enforcement, with or without strong technical protections. Do you want to be on that Internet, and perhaps more importantly, will you have a choice? Perhaps there will eventually be one Internet after all, and we may not like it.

One thing is certain: I probably won’t be reading the LA Times any time soon.

My views do not necessarily represent those of my employer.



Turning the Home Router from a Threat to a Helping Hand

The Federal Communications Commission is set to vote on a proposed rule that would require cable companies to offer consumers more choice about whether to rent a cable box or home router or to use their own. More choice is good, and one could make a strong argument that the lack of consumer choice has retarded the development of home routers. However, this decision may come with a few pitfalls from a security perspective.

Home routers were recently a component of the attack against krebsonsecurity.com. There are many reasons why that would be the case. Some routers ship with the user name “admin” and a blank password, allowing anyone to access them. Others have well-known vulnerabilities in software that has gone unpatched for years. If the service provider supplies the router, then we can say the provider is responsible for the device’s maintenance. Consumers, on the other hand, have a particularly bad track record of protecting these devices themselves.
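
For the curious, here is a minimal self-audit sketch of that first problem. The gateway address, the Basic-auth login scheme, and the “admin”/blank credential pair are assumptions about a generic router, not any particular model, and it should only be pointed at a router you own:

```python
# Quick self-audit sketch: does the router's admin page accept the
# factory-default "admin" user with a blank password? The URL and the
# Basic-auth scheme are assumptions about a typical device; adjust for
# the actual router. Only run this against a router you own.

import urllib.error
import urllib.request

ROUTER_ADMIN = "http://192.168.1.1/"   # common default gateway address

def accepts_default_creds(url: str, user: str = "admin", password: str = "") -> bool:
    """Return True if the admin page lets us in with default credentials."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    try:
        with opener.open(url, timeout=5) as resp:
            return resp.status == 200
    except urllib.error.URLError:      # covers 401s and unreachable hosts
        return False

if accepts_default_creds(ROUTER_ADMIN):
    print("Router accepts default credentials -- change the password!")
else:
    print("Default credentials rejected (or the admin page is unreachable).")
```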

Second, because most consumers do not employ security professionals to protect devices in their homes, the service provider is in a good position to offer that protection. Doing so requires that the service provider have access to the home router. By having some control over that device and access to its logs, the provider can use the router to identify potential attacks within the home itself. But the router needs guidance to perform that task, and it typically cannot retain all of the necessary knowledge on its own. Cloud services are useful for this purpose, whether managed by the SP or by some other entity.
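
A minimal sketch of what such cloud-assisted protection might look like follows; the lookup endpoint, its response format, and the log layout are hypothetical placeholders, not any real SP service:

```python
# Minimal sketch of cloud-assisted threat detection on a home router.
# The router cannot hold a full, current threat database itself, so it
# consults a cloud service for each domain seen in its DNS query log.
# The endpoint URL and reply format are hypothetical placeholders.

import json
import urllib.request

CLOUD_LOOKUP = "https://threat-intel.example.net/v1/domain/"  # hypothetical

def is_suspicious(domain: str) -> bool:
    """Ask the cloud service whether a domain is known-bad."""
    with urllib.request.urlopen(CLOUD_LOOKUP + domain, timeout=2) as resp:
        verdict = json.load(resp)          # assumed reply: {"malicious": bool}
    return verdict.get("malicious", False)

def scan_dns_log(lines):
    """Yield flagged domains from the router's DNS query log."""
    for line in lines:
        fields = line.split()
        if fields and is_suspicious(fields[-1]):   # assume domain is last field
            yield fields[-1]

# Example use against a (hypothetical) dnsmasq-style query log:
# for bad in scan_dns_log(open("/var/log/dns-queries.log")):
#     print(f"possible compromise inside the home: {bad}")
```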

Regardless of what the FCC orders, SPs are in the position of setting the standards necessary to connect a router to the Internet. CableLabs has set several such standards, one known as DOCSIS. While the current specification has a limited security section, one could easily envision additional capabilities that would protect devices within the home. As new entrants such as Google and Ubiquiti develop additional capabilities, they may have more to say about security in the home. If home users are to have a choice, one choice they should have is to allow service providers to protect them.


Picture courtesy Sergiy dk on Wikimedia CC BY-SA 3.0


Who owns your identity?

“On the Internet, nobody knows you’re a dog.”  Right?  Not if you are known at all; those days are gone.  As if to prove the point, one of my favorite web sites is on the wrong side of this issue.  An actress unsuccessfully sued imdb.com for lost wages after the site included her age.  There is a well-known axiom in Hollywood that starlets have a half-life, and age is something best kept secret.  IMDB countered that what matters is not an actress’s age but her ability to play a certain age.

My point is this: she sued and was unable to have information about her removed.  Is age something that you believe should be private?  I do.  I especially do for people born after 1989, for whom a birthday and a home city can lead to someone guessing their Social Security Number.
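
To see why, consider a rough count of the remaining search space, assuming the pre-2011 SSN structure in which the first three digits tracked the issuing state and the middle two followed a published issuance order; the specific counts below are illustrative assumptions, not exact figures:

```python
# Rough, illustrative arithmetic for why a birth date plus a home city
# shrinks the SSN search space. Pre-2011 SSNs had the form AAA-GG-SSSS:
# the area number (AAA) tracked the issuing state, and group numbers (GG)
# were exhausted in a published order, so for children registered at
# birth the date pins the active group down to a handful of values.
# The counts below are assumptions for illustration only.

full_space = 10**9                 # all nine-digit numbers

areas_for_state = 10               # assumed: a small state's area numbers
groups_active_near_birthday = 2    # assumed: groups being issued that month
serials = 10**4                    # the serial portion, 0001-9999

candidates = areas_for_state * groups_active_near_birthday * serials
print(f"naive search space:    {full_space:,}")
print(f"informed search space: {candidates:,}")            # ~200,000 guesses
print(f"reduction factor:      {full_space // candidates:,}x")
```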

But what about other physical attributes one might consider private?  “He has a mole that you can only see if he’s naked.”  How about illness?  “This actor cannot lift his arm due to a stroke.”  Once the information is out there, there’s no way to get rid of it.  And this is in the UK, which is subject to the European Data Protection Directive.  The situation for your personal information is considerably bleaker in the United States.

Related to this is the Right to Be Forgotten.  In Europe, regulators are considering new rules that would give you the right to have information about you removed.  This has some American firms in an uproar; they argue that a lack of transparency only increases risk and inefficiency.  But what are the limits?  What about the actress who doesn’t want her age known?  How did publishing her age provide for market efficiency?


WCIT, the Internet, the ITU-T, and what comes next

Courtesy of Mike Blanche of Google, the map on the left shows in black the countries that signed the treaty developed at WCIT, in red the countries that indicated they would not sign it, and in other shades the countries that are still deciding.  A country can always change its mind.

Over the next few weeks that map will change, and the dust will settle.  The fact that the developed world did not sign the treaty means that the Internet will continue to function relatively unmolested, at least for a time, and certainly between developed countries.  Because the signatories are largely places that already heavily regulate telecommunications, the treaty’s impact will be subtle.  We will continue to see international regulatory challenges to the Internet, perhaps as early as 2014 at the ITU’s next Plenipotentiary conference.  Certainly there will be heated debate at the next World Telecommunication Policy Forum.

This map also highlights that the ITU is the big loser in this debacle.  Secretary General Hamadoun Touré claimed that the ITU works by consensus.  That is simply not so when matters are contentious, and quite frankly he lacked the power and influence to bring the different viewpoints together.  The loss of consensus has split the Union, and it has substantial ramifications: there is no shared vision or goal, and that will need to be addressed at the next Plenipotentiary conference.

With different sectors and diverse participants, it is hard to lump the Union into a single group.  Nations come together in the ITU-R to manage radio spectrum, which is important because spectrum crosses borders and must be managed.  In the ITU-D, developing and developed countries come together for dialog on key issues such as cybersecurity and interoperability.  The work of the -D sector needs to be increased; most notably, its Programmes need even more capability, and its study groups should articulate more clearly the challenges and opportunities that developing countries face.

The -T standardization sector is considerably more complex.  It’s important not to lose sight of the good work that goes on there. For example, many of the audio and video codecs we use are standardized in ITU-T study group 16.  Fiber transmission standards in study group 15 are the basis for long haul transmission.  Study group 12 has some of the foremost experts in the world on quality of service management.  However, the last six years have demonstrated a fundamental problem:

At the end of the day, when conflicts arise (and conflict is in the nature of standards work), one country, one vote means that the ITU-T caters to developing countries, which by their nature are not at the leading edge of technology.  The ITU-T likes to believe it holds a special place among standards organizations, and yet there have been entire study groups whose work has been ignored by the market and governments alike.  To cater to those who are behind the Rogers adoption curve is to chase away those who are truly in front.  This is why you don’t see active participation from Facebook, Twitter, or Google in ITU-T standards, and why even larger companies like Cisco, IBM, HP, and others prefer to do protocol work largely elsewhere.1

So what can be done?

In practice, ITU-T study groups serve four functions:

  • developing technical standards, known as Recommendations;
  • providing fora for vertical standards coordination;
  • directly managing a certain set of resources, such as the telephone number space; and
  • establishing accounting rates and regulatory rules based on economic and policy discussions.

The first two functions are technical; the others are political.  The invasion of political processes into technical standards development is itself a fundamental issue, and I offer the above division to suggest a possible way forward.  The role of the -D sector should be considered in all of this: hearing from developing countries about the problems they face continues to be important.

The ITU-T and its member states will have the opportunity to consider this problem over the next two years, prior to its plenipotentiary meeting.  There is a need for member states to first recognize the problem, and to address it in a forthright manner.

What Does the Internet Technical Community Need to Do?

For the most part, we’re at this point because the Internet Technical Community has done just what it needed to do.  After all, nobody would care about regulating a technology that is not widely deployed.  The community should largely keep doing what it has been doing, though that does not mean there isn’t room for improvement.

Developing countries have real problems that need to be addressed.  It takes resources and wealth to address cybersecurity, for example, and to deny this is to feed a political firestorm.  Continued engagement and understanding are therefore necessary, and neither can be found at a political conference like WCIT, which has also shown that by the time people arrive at such venues, their opinions are already formed.

Finally, we must recognize an uncomfortable truth about IPv4.  While Africa and Latin America still have free access to IPv4 address space, the rest of the world has exhausted its supply.  Whenever a scarce resource is given a price, there will be haves and have-nots, and when the have-nots are poor, as they often are, the result can always be classed as an inequity.  In this case there truly is no need for such inequity, because IPv6 offers everyone ample address space.  Clearly governments are concerned about this.  The private sector had better regulate itself before it gets (further) regulated.
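
The arithmetic behind that claim is stark, as a few lines of Python show (the per-person figure assumes a round seven billion people):

```python
# The scarcity asymmetry in one calculation: IPv4's 32-bit space is
# effectively exhausted, while IPv6's 128-bit space is, for any
# practical purpose, inexhaustible.

ipv4 = 2**32
ipv6 = 2**128

print(f"IPv4 addresses: {ipv4:,}")       # ~4.3 billion, fewer than people
print(f"IPv6 addresses: {ipv6:.2e}")     # ~3.4e38
print(f"IPv6 addresses per person: {ipv6 / 7_000_000_000:.2e}")
```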

Another Uncomfortable Truth

Developing countries are also at risk in this process, and perhaps most of all.  They have been sold the idea that “bridging the standardization gap” is somehow a good thing.  It is one thing to participate and learn; it is another to impede leading-edge progress through bloc votes.  Leading-edge work will continue, just elsewhere, as it already has.

1 Cisco is my employer, but my views may not be their views (although that does happen sometimes).

What’s WCIT about? It depends on who you ask.

This week the World Conference on International Telecommunication (WCIT) began with a remarkable and important declaration from the Secretary General, Dr. Hamadoun Touré:

And WCIT is not about Internet governance.  WCIT is about making sure that we connect the billion people without access to mobile telephony, and that we connect the 4.5 billion people who are still off line.

Let’s first take a moment to celebrate the fact that 2.5 billion people have access to the Internet, and that Internet penetration has grown at a rate of 14% over the last few years to reach 35%, according to the ITU’s own numbers.  That’s great news, and it leads to a question: how shall WCIT focus on improving that number?  How have the International Telecommunication Regulations, which have served 2.5 billion people, not served the other 4.5 billion?

Unfortunately, none of the proposals that have been made available actually focuses on this problem.  Instead, at least one prominent proposal, from Russia, focuses on… Internet governance.  Let’s wish the Secretary General great success in persuading Russia and other governments that this is indeed not what the conference is about.
