Imagine you are urgently awaiting a shipment update or delivery confirmation that will trigger a transfer of funds from your customer into your bank account. The general expectation is that the moment the delivery is confirmed, an electronic update will immediately hit your phone or tracking system. But what if that’s not the case? What if the critical alerts and messages your business depends on are not deemed a high priority for delivery? What if your data is consigned to an internet ‘slow lane’, while favoured, or more lucrative (for your internet service provider (ISP), at least), data is pushed along the broadband ‘fast lane’? What might that mean for your business, your customers, your trading partners…?
For some context, keep in mind that the financial markets depend on transactions being executed in milliseconds, and firms deliberately locate their data centres close to very high-speed telecoms exchanges. A delay of just a second could mean losses of several hundred thousand dollars if the exchange rate moves during that one-second delay.
So is this likely, or is it just another unlikely scare story?
Well, this and other similar scenarios are central to the debate surrounding Net Neutrality, and it’s entirely possible that the above scenario might occur. Back in the early noughties, when streaming video services started to appear in the market, some broadband network service providers realised that these services take up huge amounts of bandwidth. So they started to ‘control’ speeds through their networks unless the streaming services paid them more money.
Given that network capacity is finite, and back then bandwidth was quite limited for most people, this was a big deal. As a result, people started thinking about the implications of fast and slow lanes across the Internet, and who would win and who would lose out. Since then there has been an impassioned debate in technology circles attempting to find a solution to this problem. The principle at stake is called ‘Net Neutrality’: the idea that all data travelling across the Internet should be treated equally.
As technology has progressed, the volume of data passing across networks has exploded. It is still accelerating and, with the increasing use of sensors and mobile tracking devices for a variety of tasks, may soon be overwhelming. So how networks manage the broadband capacity they have while they upgrade their networks matters – a lot.
I would be very surprised if most people have even heard of the term, let alone formed any opinion about it. But it does have significant implications.
Given that most societies around the globe depend on some form of data transfer for their economies to function, should these transfers be prioritised depending on who can pay the most? What data should always get the fast lane – data related to Defence, Finance, Healthcare? One of them? All of them?
Also, keep in mind that domestic customers are demanding streaming services for entertainment, sporting events and video calls to family and friends. All of this takes up network capacity, aka bandwidth.
It seems absurd to have to choose, given that most people imagine these networks are controlled by the state, and the state usually has a duty to ensure a level playing field. But not so. Communications networks are (usually) commercial ventures, subject to regulation at a national and occasionally international level. Part of the confusion comes from the way national regulation is designed, with many assuming the internet is regulated as a utility in the same way electricity markets or water supplies usually are. But how ISPs price and operate their network capacity (bandwidth) is largely left to them. The issue of Net Neutrality is fundamental to agreeing how the global communications infrastructure should manage data flows.
It’s important to understand a little about how the Internet is constructed. It originated as a research project in the USA in the 1960s, with the goal of establishing a resilient communications network that could survive a nuclear strike. A niche project at the UK’s National Physical Laboratory had invented a means of transferring data as ‘packets’, similar to the way parcels move through the postal service. This technology was adopted as a fundamental element of the US project, called the ‘Arpanet’. As the project progressed, it spread out of the university laboratories and linked up other research institutions across the US and a couple of sites in the UK. Eventually, this burgeoning network was renamed the ‘Internet’, but was still generally funded by government agencies.
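The parcels analogy above can be made concrete with a minimal sketch: a message is split into numbered packets, each packet travels independently (and may arrive out of order), and the receiver reassembles the original. All names here are illustrative, not any real protocol.

```python
# Minimal sketch of packet-switched delivery: split a message into
# numbered packets, deliver them in arbitrary order, and reassemble.
import random

def to_packets(message: str, size: int = 4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message, whatever order the packets arrived in."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Net Neutrality")
random.shuffle(packets)      # the network may deliver packets out of order
print(reassemble(packets))   # → Net Neutrality
```

The key design point this illustrates is that no single circuit is reserved end-to-end: each packet can take its own route, which is what made the network resilient to the loss of any one link.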
When Sir Tim Berners-Lee developed the World Wide Web at the CERN research labs, that was the trigger for the commercial sector to get interested. But in most cases, connections to the Internet were made through dedicated telephone lines provided by the national telephone company, and only large companies could justify the expense of these dedicated lines. At the same time, a number of online services were starting up, giving customers access to a range of services through dial-up phone connections. These online services were primarily in the US and used their own dedicated networks to connect customers; in many parts of the world, connections were not possible at all. Keep in mind this was the early-to-mid 1990s, and cellular networks for mobile phones were not common in many countries.
The point of this glance back in time is to highlight that a majority of the Internet infrastructure was located in the United States and, to a large degree, its key providers still carry huge amounts of global data. The key switches and Internet exchange points (IXPs) connecting the global data pipes (Tier 1 networks) are largely owned by US communications companies. So how the US chooses to deal with this issue has implications across the world: it is seen as a technology leader, and how it legislates technology services is often reflected in other countries.
Back in 2015, the FCC under the Obama administration adopted rules to ensure that US communications network service providers maintained Net Neutrality for the data traffic passing through their networks. But these were replaced in December 2017 by the ‘Restoring Internet Freedom Order’, enacted by the FCC, which removed certain provisions, including those stopping ISPs from blocking or slowing selected content or offering fast lanes for payment, as well as declassifying the internet from ‘utility’ status and returning it to lightly regulated ‘information service’ status. It’s fair to say that some of the network service providers were keen for this to happen so that they had the ability to price services more freely and, as they argue, improve the business case for investing in enhanced services – although no major provider has yet used its new-found pricing or provision freedoms.
This debate is especially intense in the State of California, where very strong pro-Net Neutrality legislation has just been deferred, pending both the review of the national decision to revoke the net neutrality rules earlier this year and a wider review of whether internet access is governed by state- or federal-level authorities.
In Europe, the EU seeks to maintain Net Neutrality through supranational regulation, thereby applying precisely the same rules to all member states, rather than through directives, which can be changed as they are transposed into national law. Some member states have also enacted stronger legislation at the national level preventing the hindrance or slowing of data traffic, and others have intervened in certain circumstances to ensure net neutrality remains, such as the German Federal Network Agency’s action against Deutsche Telekom in December 2017.
Across the rest of the world the picture is mixed. Some countries, like India, are prepared to enact strong pro-Net Neutrality legislation: India’s regulation bans the blocking, degrading, slowing down or granting of preferential speeds or treatment to any content, but does include provisions for enhanced access and speed for certain internet use cases, including autonomous vehicles. South Korea, meanwhile, has paired looser regulation with incentives that have spurred competition amongst ISPs, creating an environment in which access speeds must be high for providers to remain profitable and survive – demonstrating that a vastly different approach can thrive. This is a debate that will doubtless continue for a while…
Source: Transport Intelligence November 1, 2018
Author: Ken Lyon