Thursday, November 20, 2014

Household Bandwidth Requirements Will Grow 31% Annually, Next 5 Years

Average household bandwidth requirements will increase by 31 percent annually over the next five years, from a peak hour average usage per household of 2.9 Mbps in 2014 to 7.3 Mbps in 2018, according to a study sponsored by Ciena and conducted by ACG Research.
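The growth claim is a compound annual growth rate calculation. As a minimal sketch (function name and rounding are my own; an endpoint-to-endpoint calculation need not reproduce a study's stated blended annual figure, which may be weighted across traffic classes and years):

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# The study's endpoints: 2.9 Mbps (2014) to 7.3 Mbps (2018), four calendar years.
print(round(cagr(2.9, 7.3, 4), 3))  # about 0.26, i.e. roughly 26 percent per year
```

The same function applies to any of the growth figures quoted in these studies, which is useful for sanity-checking headline percentages against the underlying endpoints.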

Some might argue the absolute speeds are less important than the growth rate, as different studies come up with different conclusions about the current state of peak hour speeds.

A study of peak hour speeds in the United Kingdom suggested speeds of about 6.2 Mbps during the peak evening hour.

Some would argue peak hour speeds are higher in the United States and Europe, ranging between 15 Mbps and 19 Mbps during the hours of heaviest household usage.

Over-the-top unicast video traffic is predicted to be 4.6 times greater than traditional multicast traffic by 2018, according to the study.

Usage of Internet video, which includes smart TVs, is expected to grow from 12 percent of overall peak average bandwidth in 2014 to 25 percent in 2018, a compound annual growth rate (CAGR) of 56 percent.

Internet video will be the largest contributor to household bandwidth consumption by 2018.

But there are several drivers of increased bandwidth consumption. Households and users now connect multiple Internet-using devices in a household, ranging from streaming consoles to smartphones, tablets and Internet-connected TVs.

Larger-screen TVs consume more data than small-screen devices, as does consumption of HDTV content, while new ultra-high-definition 4K TVs consume even more data than HDTV.

In fact, 4K streaming video services consume three to four times more bandwidth than HDTV.

Why Dish Network Wants Higher Spectrum Prices, in AWS Auction

Typically, would-be acquirers of new mobile spectrum would prefer to pay lower prices. Sometimes, though, a bidder might actually act to increase overall prices. That appears to be the case for the on-going U.S. auction of AWS spectrum.

The actual identities of bidders are secret, but Dish Network, which has three entities able to bid, actually has a valid business reason for desiring higher prices. Dish Network purchased similar spectrum earlier in 2014, namely a paired 10-MHz block of spectrum that runs from 1,915-1,920 MHz (for the uplink) and from 1,995-2,000 MHz (for the downlink). Dish also controls 40 MHz worth of spectrum adjacent to those frequencies.

If the current AWS auction prices climb, the value of spectrum Dish Network already owns should climb as well. That, in turn, should boost Dish Network’s stock price, which in turn would become a currency to be used for additional transactions, perhaps a bid to buy T-Mobile US, for example.

Some would argue the current AWS auction prices boosted Dish Network equity value 10 percent. Some also would estimate the value of Dish Network spectrum usable to support mobile communications at more than $12 billion.

Others might peg the value of potential mobile communications as representing most of Dish Network’s equity value, in excess of 75 percent of the firm’s market value. That is a stunning thought, for a company with 14 million video customers and zero mobile customers.

Skeptics might argue that all prior efforts by Dish Network to reinvent the company have essentially failed, though.

Of course, Dish Network could decide simply to sell all of its spectrum, and not become a mobile operating company. But that entails risk as well, since some of the equity valuation of Dish Network includes the assumption it will become a mobile service provider, adding not only subscribers, new product lines, revenue and cash flow, but giving Dish Network an opportunity to escape the clutches of a declining business (satellite TV entertainment).

But some might well question the viability of a fifth national U.S. mobile provider, leading to speculation Dish Network “has” to buy T-Mobile US.

Facebook Mobile Traffic Load Grows 60% in 1 Year Because of Video

Perhaps the single biggest new development in mobile data consumption over the past year has been Facebook’s embedding of video that automatically plays on user feeds.

Over a year, Facebook traffic increased by 60 percent on the mobile network, and by over 200 percent on the fixed network, according to Sandvine measurements.

That is the sort of unexpected and significant action by a third-party app provider that directly affects user data consumption, Internet service provider bandwidth planning, capital investment and retail service plan features.

That is particularly the case for mobile networks, which have spectrum constraints not faced in the same way by fixed network operators.

Real time entertainment (streaming video and audio) has been the largest traffic category on most networks, fixed or mobile. But it is mobile video consumption that has the greatest impact on network demand, performance and investment requirements, since bandwidth is more limited and costs-per-bit for end users are much higher than for fixed networks.  

In North America, “average” (mean, or arithmetic average) monthly usage grew 18 percent in six months, from 465 MB to 522 MB, according to Sandvine.

As with some other physical networks, adding more capacity (Long Term Evolution fourth generation networks) actually increases usage. Median usage (a mid-point figure, with half higher and half lower) grew from 102 MB to 118 MB over six months.
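The wide gap between the mean (522 MB) and the median (118 MB) implies a highly skewed usage distribution: a minority of heavy users pulls the average far above the midpoint. A toy illustration (the subscriber figures below are invented for the example, not Sandvine's data):

```python
from statistics import mean, median

# Hypothetical monthly usage in MB for ten mobile subscribers:
# most are light users, two are heavy video streamers.
usage_mb = [40, 60, 80, 90, 100, 120, 150, 200, 1500, 2800]

print(mean(usage_mb))    # 514 -- pulled up sharply by the two heavy users
print(median(usage_mb))  # 110.0 -- the midpoint of a typical subscriber
```

This is why operators planning capacity track both figures: the mean drives aggregate traffic forecasts, while the median better describes the typical customer.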

During peak period, real time entertainment traffic accounts for 40 percent of downstream bytes on mobile networks.

In Europe, mean monthly mobile data consumption was 449.5 MB, an increase of over 13 percent from 394.4 MB observed six months ago. Real time entertainment traffic accounts for 38 percent of downstream traffic during peak usage periods.

In Latin America, mean monthly mobile data usage was 390.3 MB, a slight increase over the 355.4 MB seen six months earlier.

In Latin America, social networking is the largest driver of mobile usage, accounting for 31 percent of peak downstream traffic.

One reason is the popularity of low-cost, “all-you-can use” social networking plans offered by mobile service providers.

The Asia-Pacific mobile typical data consumption is at least 1 GB a month on average. Real time entertainment represents 47 percent of total data consumption during peak hours.

In Africa, in contrast, real time entertainment accounts for only 6.6 percent of peak downstream traffic.

Wednesday, November 19, 2014

How Much Danger Does Google Pose to Other ISPs?

After Google Fiber, the notion that a major app provider might actually become an Internet service provider is no longer a possibility but a reality. The only issue now is how far that might extend, and which other firms might decide to do something similar.

Facebook appears likely to emerge as a satellite-based Internet service provider in Africa. And both Google and Facebook now own assets that produce unmanned aerial vehicles that could be used to supply Internet access.

Google is testing balloon-based Internet access for rural Australia, in conjunction with Telstra. Existing Internet service providers have to be thinking about how far Google, Facebook and others might take the ISP business.

Add to that the fact that Amazon already is a specialized type of mobile virtual network operator, using AT&T’s mobile network to deliver content to Kindle devices.

There has been speculation that Apple might someday want to do something similar, perhaps becoming a provider of services that connect to any available mobile network, something that has become a feature of the iPad.

Given the fact that, by perhaps 2020, 80 percent of all Internet access globally will use a smartphone, one has to wonder when that might become a focus for one or more application providers.

“GoogleNet” is Google’s vision to offer global, near-free Internet access, mobile connectivity and Internet of Things connectivity over a global, largely wireless, Android-based network, according to Scott Cleland of Precursor, a site with an admittedly “anti-Google” orientation.

Critics argue Google is following a business strategy of identifying markets with a valuable stream of consumer data, then creating an "open" or "free" product to induce adoption and “undermine the business model of existing market participants,” according to Fairsearch.org, an entity funded by Google competitors.

Once it gains dominance, Google then “closes” the market and excludes competitors, Fairsearch argues.

One does not have to accept the premise of the argument to agree that something important has happened. Google effectively disintermediates and commoditizes the direct relationships Internet or communications or entertainment suppliers have with their customers.

And telcos, cable companies and Internet service providers might have to worry more than they used to about Google. Initially, one might argue, Google was about businesses built on bits in virtual worlds. That is no small matter, as Google arguably has created rival communication products that displace products supplied by communications companies.

But Google now is moving into different realms, including “atoms in the physical world,” such as Google Fiber, an Internet service provider operation that competes directly with cable TV company and telco high speed access and video entertainment products.

Beyond that, Google has invested in, and is testing, high-­altitude Wi-Fi balloons, and unmanned aerial aircraft that might also be used to support Internet access.

All of that, building on earlier Google investments, creates at least the potential for a “global Internet access” capability that would disintermediate other existing Internet service providers, as Google Fiber does in a growing number of U.S. cities.

Google bought Skybox Imaging (satellite technology) and plans to spend $1-3 billion on “180 small, high capacity satellites at lower altitudes than traditional satellites” to enable two-way Internet access.

Google also bought Titan Aerospace, a supplier of solar-powered, high-flying drones. Project Loon likewise is testing use of balloons for Internet access, most recently inking a deal to test them to provide Internet access in Australia, working with Telstra.

Google also operates its own global undersea network, including investments in four cable systems.

The issue is how widely Google’s ambitions might extend.

“It Can’t be Done”

Some of the most dangerous statements an experienced and knowledgeable executive can make are that something “cannot be done,” or that a new way of doing something is underpowered, under-featured and essentially a non-serious approach to solving a problem.

If confronted with a requirement to support huge amounts of bandwidth, hundreds of times to perhaps 1,000 times greater than anything yet seen, it might seem obvious that only fixed networks will be able to handle the load.

That is why Marcus Weldon, Bell Labs president and Alcatel-Lucent CTO, believes sophisticated core and fixed networks are essential, and that explorations of Internet access networks using unmanned aerial vehicles or balloons are unsophisticated approaches, little better than “toys,” compared to the best of today’s telecom networks.

The phrase "toy networks" as applied to new Internet access platforms such as balloons or unmanned aerial vehicles reflects a perhaps-understandable reaction to new networks that lack the sophistication of the existing and future networks envisioned by the telecom industry.

But it is profoundly dangerous to underestimate the threat posed by such underpowered or feature-deficient new approaches. You might recall that the same sort of sentiment was uttered about voice over Internet Protocol.

Disruptive innovation, a term coined by Harvard Business School Professor Clayton Christensen, describes a process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves up market, eventually displacing established competitors.

Such innovations might reasonably be derided by existing suppliers as “not very good” products with limited feature sets, unstable quality and some restrictions on ease of use. Skype initially could only be used by people communicating using personal computers, for example.

Microwave Communications Corp. (MCI) originally competed with AT&T for long-distance voice calls using a microwave network that likewise was deemed less reliable than AT&T’s own network.

Wi-Fi hotspots originally were hard to find, sometimes difficult to log on to, and obviously did not have the ubiquity of mobile Internet access or the speed of an at-home Internet access service.

Netflix originally required mailing of DVDs to view content: it was not “on demand,” and could not be viewed on a variety of devices.

What happens, over time, is that disruptive attacks gradually “move up the stack” in terms of features and quality of service, eventually competing head to head with the incumbents.

If you live long enough, you might see many examples of such derision.

I can remember being at a meeting at the headquarters of the National Cable Television Association, in the earlier days of high definition television discussions, where it was proposed that a full HDTV signal could be squeezed from about 45 Mbps of raw bandwidth to the 6-MHz channelization used by the North American television industry.

The room essentially exploded, as the attendees, mostly vice presidents of engineering from the largest cable TV and broadcast firms, disagreed with the sheer physics of the proposal. Later, the executive who suggested HDTV in 6 MHz was indeed possible talked with his firm’s engineering vice president about the science, to reaffirm that such a thing actually could be done. “Are you sure about this?” was the question, given the magnitude of opposition.

To make a longer story short, it did prove feasible to compress a full HDTV signal into just 6 MHz of bandwidth, making for a much-easier financial transition to full HDTV broadcasting, as well as an ability for cable TV operators to support the new format.
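For scale: the ATSC broadcast standard eventually adopted in North America carries about 19.39 Mbps of MPEG-2 payload in a 6-MHz channel, a figure that postdates the meeting described above. Against the roughly 45 Mbps feed discussed at the time, the implied further reduction is simple arithmetic:

```python
# Rough arithmetic on the compression step described above.
# ATSC carries about 19.39 Mbps of payload in a 6-MHz broadcast channel.
early_hdtv_feed_mbps = 45.0
atsc_payload_mbps = 19.39

ratio = early_hdtv_feed_mbps / atsc_payload_mbps
print(round(ratio, 1))  # roughly a further 2.3x reduction was required
```

In other words, the "impossible" proposal required squeezing the signal to well under half the rate then considered a practical floor.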

Similarly, when the U.S. cable TV industry began to ask for analog optical transmission systems capable of carrying 20 channels of standard definition video without complicated channel-by-channel coding and decoding, a distinguished engineer from Bell Laboratories privately assured me that such a thing was in fact not possible, and that people who claimed it was possible were simply wrong.

To make a longer story short, it did indeed prove possible to take a full complement of analog video signals (40 channels, as it turned out), convert the full set of broadband signals to analog optical format, and deliver them over distances useful for cable TV purposes.

On another occasion, the vice president of one of the world’s biggest suppliers of equipment said privately that “digital subscriber line does not work” as a platform for high speed Internet access, even at relatively low speeds. Ultimately, that also proved incorrect. Over time, DSL performance was not only proven to be commercially viable, but also delivered much-faster speeds, over longer distances, as experience was gained.

The point is that when a smart, experienced, thoroughly-knowledgeable executive says that something “cannot be done,” one has to translate. What the statement means is only that, at a given point in time, before the application of effort and ingenuity, a given entity has not been able to do something.

That does not actually mean something literally “cannot be done.” Quite often, formerly impossible things actually are made possible, after dedicated investigation and development.

That sort of thing happens often enough that statements deriding novel approaches to solving problems should not be lightly dismissed. New platforms and approaches often do appear to be “toys” at first. But that is not where developments remain for all time.

Executives generally truly believe disruptive new platforms and approaches are unsatisfactory substitutes for higher-performance solutions. That often is quite true, at first. But substitute products often do not remain fixed at such levels. They often improve to the point that, eventually, the new approach is a workable solution for a wider range of applications and customer use cases.
 
Having lived long enough to see the “smart guys” proven quite wrong, I am careful never to argue something really cannot be done. Sometimes, somebody, or another company, is able to do so, even when a reasonable, smart, experienced practitioner “knows” it cannot be done.

45% of Service Providers Now Offer Managed Services

More than 45 percent of communications service providers surveyed on behalf of Allot Communications now sell managed services for enterprises and small and mid-sized business customers ranging from basic email and storage to fully-fledged unified communications, customer relationship management and enterprise resource planning solutions, Allot says.  

Microsoft Office 365 is the most prevalent office suite, sold by a third of all service providers surveyed.

About 23 percent of respondents offer quality of service solutions for mission-critical applications and 32 percent sell cloud-based security services.

QoS management is more common when service providers are selling unified communications, Office and Microsoft Lync.

Allot argues that such cloud services are important for telcos because the opportunity is so large, and telcos need distinctiveness to compete with market leaders Google, IBM, Microsoft and Amazon. Cloud services are projected by Infonetics Research to be a $200 billion revenue business by 2018.  

A change seems to have happened, though. Where initially it was small businesses and smaller organizations that were most likely to buy a cloud-based managed service, the “threshold for an enterprise to source cloud apps has dropped,” said Yaniv Sulkes, Allot Communications AVP.

“In many cases, even large enterprises find it advantageous to source from the cloud, rather than hosting themselves,” said Sulkes. “Organizations with 5,000 to 20,000 employees now use cloud-based Salesforce.”

"Mobile Eats the World"

The phrase “software eats the world,” coined by venture capitalist Marc Andreessen in 2011, might have an analogy: mobile eats the world. Already, mobile devices (smartphones and tablets) represent about half the value of consumer electronics sales.

Andreessen’s 2011 quip was meant to illustrate the principle that “we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy.”

The newer adage--”mobile eats the world”--is meant to illustrate the growing shift of human activity from tethered to untethered devices. “There is no point in drawing a distinction between the future of technology and the future of mobile: they are the same,” says Benedict Evans, also a venture capitalist.

By 2020, Evans argues, the number of people using the Internet, and the number of people using smartphones, will be identical. By about 2017, the percentage of people “not using the Internet” will be identical to the percentage of people “not using smartphones.”

By 2020, 80 percent of everyone on the planet will be using a smartphone, Evans predicts. Every year, at least 7.5 trillion messages are sent by people using mobile networks. In 2010, the U.S. telecommunications industry alone employed about 900,000 people.

WhatsApp now supports at least 7.2 trillion messages a year. WhatsApp employs 30 engineers. That shows the relative advantage software firms have in cost structure, compared to capital-intensive industries such as telecommunications, that supply similar services.
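Using the figures cited above, the cost-structure contrast reduces to simple division; a quick sketch:

```python
# Figures as cited: WhatsApp message volume and engineering headcount,
# versus total 2010 U.S. telecom industry employment.
whatsapp_messages_per_year = 7.2e12
whatsapp_engineers = 30
us_telecom_employees_2010 = 900_000

per_engineer = whatsapp_messages_per_year / whatsapp_engineers
print(per_engineer)  # 240 billion messages per engineer per year

headcount_ratio = us_telecom_employees_2010 / whatsapp_engineers
print(headcount_ratio)  # the 2010 U.S. telecom workforce was 30,000x larger
```

The comparison is crude, since the two workforces do very different things, but it captures why software-based messaging undercuts the economics of network-operator messaging.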


