Showing posts with label cable regulation. Show all posts

Wednesday, September 29, 2010

Aspen Institute Fellow Recommends Big Changes In USF, Intercarrier Compensation, Use of Satellite Broadband

Blair Levin, Aspen Institute Fellow, says $10 billion, spent over 10 years, is enough to provide a minimum 4 Mbps downstream service for Americans in rural and isolated areas.

He proposes raising the money by revamping the Universal Service Fund, including reducing or freezing funds currently allocated under the Interstate Access Support and Interstate Common Line Support programs, steps that would have an immediate impact on many rural telcos and rural mobile providers.

Levin points out that there are about seven million housing units (about five percent of the total) without access to the 4 Mbps downstream and 1 Mbps upstream service the Federal Communications Commission now considers a minimum.

The FCC has estimated the cost to provide such service with wired broadband at $32.4 billion, with a revenue projection of only $8.9 billion, leaving a $23.5 billion gap.

But Levin maintains the total is inflated by the cost of building wired infrastructure to just 250,000 homes; reaching those homes alone would cost about $13.4 billion. Levin does not appear to believe that is a wise investment, so he suggests using satellite to reach the most-isolated, highest-cost homes instead. That would free up enough money to build out facilities to the roughly 6.75 million other rural homes.
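The arithmetic behind those figures is straightforward; a quick sketch in Python, using only the numbers cited above:

```python
# Back-of-the-envelope check of the FCC's rural broadband figures,
# all dollar amounts in billions as cited from the National Broadband Plan.
total_cost = 32.4       # wired build-out cost to all unserved homes
expected_revenue = 8.9  # projected revenue from those homes

gap = total_cost - expected_revenue
print(f"Investment gap: ${gap:.1f}B")  # $23.5B, as the FCC estimates

# The hardest-to-reach 250,000 homes account for $13.4 billion of the cost.
hardest_cost = 13.4
hardest_homes = 250_000
print(f"Per-home cost for the most-isolated homes: "
      f"${hardest_cost * 1e9 / hardest_homes:,.0f}")  # $53,600 per home

# Serving those homes by satellite instead shrinks the remaining gap.
print(f"Remaining gap for the other 6.75M homes: ${gap - hardest_cost:.1f}B")
```

The per-home figure makes Levin's point concrete: the last quarter million homes cost more than $50,000 each to wire, which is why satellite looks attractive for that tier.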

In 2010, the federal fund (USF) is projected to make total outlays of $8.7 billion, but not specifically to support broadband access.

Some $4.6 billion is set aside for deployment of networks to high-cost areas, where population density or other factors would cause the price of services to consumers to be at a level that would not reasonably compare to urban areas (this is in addition to the 21 states that have similar high-cost funds that distribute a total of over $1.5 billion).

About $1.2 billion is allocated to provide discounts to make basic telephone service available
and affordable to low-income consumers (in addition, 33 states have similar programs).

Another $2.7 billion is reserved for subsidizing telecommunications services, Internet access and
internal connections to enable schools and libraries to connect to the Internet (in addition, nine states have similar programs).
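The three program allocations listed above account for nearly all of the projected $8.7 billion in 2010 outlays; a short tally (program labels paraphrased from the text above):

```python
# 2010 federal USF outlays by program, in billions of dollars.
usf_programs = {
    "high-cost area network support": 4.6,
    "low-income consumer discounts": 1.2,
    "schools and libraries connectivity": 2.7,
}

allocated = sum(usf_programs.values())
print(f"Allocated across the three programs: ${allocated:.1f}B")  # $8.5B
print(f"Projected total outlays: $8.7B; remainder: ${8.7 - allocated:.1f}B")
```

The roughly $0.2 billion remainder is not itemized in the text above.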

Making better use of existing funding should be the first priority in any reform effort, Levin says. The universal service contribution factor—an assessment on interstate and international charges that usually appears as a surcharge on consumers’ phone bills—is already at about 15 percent (having risen dramatically in the last decade), he notes.

Further increases would create both political and policy problems, he suggests.

"More ambitious goals in terms of network speeds, at this time, would cause such an increase in the assessment on the current system that it could backfire in terms of driving America’s use of broadband," Levin argues. "For example, the FCC calculates that going from 4 Mbps to 6 Mbps would increase the investment gap by more than 100 percent."

The rational approach, Levin argues, would be to serve that quarter million homes with satellite broadband rather than building $13.4 billion worth of fixed-line networks to reach them. That would free up nearly all of the available funds to build fixed-line networks for the 6.75 million other rural households.

There are a number of problems with the current Universal Service Fund, Levin suggests. "Among these are that the fund is targeted to support analog voice requirements, rather than data networks; that the fund does not target unserved areas but rather funds particular kinds of companies; that the fund provides incentives for inefficient build outs; that there is no accountability for actually using the funds for their intended purposes; and that the support programs are not coordinated to
leverage the funds to maximize broader policy objectives," says Levin.

Though rural telcos might not like the idea, there are a number of current programs within the Universal Service Fund that need to be changed.

About $4 billion could be redirected to broadband support, over 10 years, by reducing USF payments to wireless providers.

Interstate Access Support (IAS) payments could be reoriented to broadband, adding approximately $4 billion over 10 years.

Freezing Interstate Common Line Support (ICLS) would limit the growth of the existing high-cost fund and result in savings of about $1.8 billion over 10 years. Those funds also could be redirected to broadband support.

To accomplish this, the FCC would have to require that rate-of-return carriers move to incentive regulation.

Phasing out remaining legacy high-cost support for competitive carriers (wireless, primarily) would yield up to an additional $5.8 billion over the coming decade.

Together these actions would result in between $15 billion and $16 billion in savings from the existing high-cost program that could be used to support broadband facilities construction.
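The four savings components listed above can be totaled to confirm the $15 billion to $16 billion figure:

```python
# Ten-year savings Levin identifies within the existing high-cost program,
# in billions of dollars, as itemized above.
savings = {
    "reduced USF payments to wireless providers": 4.0,
    "Interstate Access Support reoriented to broadband": 4.0,
    "Interstate Common Line Support freeze": 1.8,
    "phase-out of legacy support for competitive carriers": 5.8,
}

total = sum(savings.values())
print(f"Total redirectable to broadband: ${total:.1f}B over 10 years")  # $15.6B
```

The $15.6 billion total sits squarely in the "$15 billion to $16 billion" range Levin cites.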

As logical as the changes might be, there will be resistance from the many firms, including but not limited to rural telcos, that rely on the current mechanisms for significant portions of their revenue.

Thursday, June 10, 2010

Is There a Need for Economic Regulation of the Internet?

Two necessary preconditions must be satisfied to justify market intervention in the form of economic regulation by the government, says Dennis Weisman, Professor of Economics at Kansas State University, an editor of the Review of Network Economics, and a member of the Free State Foundation's Board of Academic Advisors.

The first asks whether there is a problem; the second asks whether there is a solution. Only if both questions can be answered in the affirmative can such intervention be justified.

He says the case for economic regulation of broadband markets is weak at best. The Federal Communications Commission can point to, at most, two cases where things went awry — Madison River and Comcast.

Madison River was resolved with dispatch; and in the case of Comcast, the supposed cover-up was arguably worse than the alleged crime, Weisman says. "There is no offense in reasonable network management practices designed to prevent congestion and maintain service quality," he adds.

Nor is there "evidence that the major incumbent telecommunications carriers or the cable companies were earning supra-normal returns that might be suggestive of market power," the sort of evidence that might imply there is a problem waiting to be solved (http://ssrn.com/abstract_id=1525568).

The structure of broadband prices is a problem in the economics of two-sided markets, though. The issue is that it is difficult to determine how the price structure should be changed to enhance economic welfare. "In other words, there can be no reasonable assurance that regulatory intervention to alter the price structure would not do more harm than good," says Weisman.

Thursday, May 20, 2010

What Does "Effective Competition" Actually Look Like?

The U.S. Federal Communications Commission seems to be implying that U.S. wireless markets are "not competitive," though the inference is hard to glean from the FCC's own study on the U.S. wireless market. See the document at (http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-10-81A1.pdf)

What "effective competition" looks like varies from market to market, from economist to economist. How many competitors a market must have to be deemed "competitive" is in this case a political question, not an economist's question, though.

There are some businesses where there is no "effective competition" because the market has "natural monopoly" characteristics. You can think of electrical power, waste water, highways and roads (generally speaking), water systems and national defense as clear examples.

Telecommunications once was deemed to be a "natural monopoly," but most regulators around the world now agree that is true only in part. In triple-play markets, for example, effective competition, though not "perfect" competition, can occur, in an economic sense, with as few as two players, even though the U.S. market has many more than that in major metro markets, and typically at least two providers even in rural markets.

In the real world, there are very few examples of major facilities-based competition beyond two major players, although in a few markets there are three facilities-based fixed-line providers. As researchers at the Phoenix Center have suggested, in fixed-line triple-play markets, imperfect though workable competition does in fact exist with one dominant telco and one dominant cable provider.


See http://www.phoenix-center.org/FordWirelessTestimonyMay2009%20Final.pdf, or http://www.phoenix-center.org/pcpp/PCPP12.pdf or www.phoenix-center.org/PolicyBulletin/PCPB11Final.doc.

The question is what level of competition counts as "effective" in the communications market. Presumably the FCC believes three to five competitors in a single market is not enough.

Tuesday, April 20, 2010

Small Cable Operators Think Dumb Pipe Might be a Better Business Model

Not every cable operator thinks over-the-top video is a worse business model than providing cable TV. In fact, some believe providing what might be wholesale services to third parties might actually provide better profit margins than cable TV now does.

"Our video margins are going down year after year," said Colleen Abdullah, the CEO of WideOpenWest Holdings.

Wave Broadband COO Steve Friedman also agreed that the profits from an over-the-top model might be better than the current cable TV business, especially if the new model simply substituted a bandwidth usage model for the current monthly subscription model.

While the dumb pipe model may in fact be better for small operators, that probably is not the case for larger providers.

Probably the worst of all possible outcomes is over-the-top competition from firms such as Comcast, where Comcast sells the video content directly to broadband users, and the local cable modem provider is not able to charge for the additional bandwidth consumed. That is one reason why the dumb pipe model would not work unless some form of consumption-based charging were adopted.

"Over-the-top video will eventually emerge as a challenge to the current model of large, expensive bundles of programming," said Blair Levin, the executive director of the FCC's Omnibus Broadband Initiative. Levin thinks such a move is "inevitable."

The basic tradeoff is that cable operators would essentially trade current linear video subscription revenue for higher broadband access revenues. That essentially was the business decision Qwest Communications made years ago, when it concluded it was better off outsourcing linear entertainment to DirecTV, and building its optical access infrastructure in a way that ultimately is conducive for over-the-top or on-demand video.

"The final inevitability is mobile broadband," said Levin. "We know it's coming. We know it's going to be very, very big."

"In 1994, you could envision as inevitable the Internet replacing existing platforms for communications and entertainment," Levin said. "And based on numerous metrics, that transformation is well underway."

Levin also warned that consumer anger over the cost of cable TV now reminds him of similar sentiment leading up to the 1992 cable act, and that there will likely be "some kind of response, either from the market or from the government," to address those concerns.

Any such move would further limit the upside from linear video and likely propel more movement towards an over-the-top approach.

http://www.lightreading.com/document.asp?doc_id=190749&site=lr_cable&f_src=lightreading_gnews

Sunday, February 28, 2010

Regulatory Pendulum Swings: But Which Way?

In the telecommunications business, the regulatory pendulum swings all the time, though slowly. So periods of relatively less-active regulation are followed by periods of relatively more active rule-making, then again followed by periods of deregulation.

It has been apparent for a couple of years that the regulatory pendulum in the U.S. telecom arena was swinging towards more regulation.

What now is unclear, though, is whether such new rules will largely revolve around consumer protection and copyright or might extend further into fundamental business practices.

Current Federal Communications Commission inquiries into wireless handset subsidies and contract bundling, the application of wireline Internet policies to wireless service providers, and the creation of new "network neutrality" rules are examples.

The setting of a national broadband policy likely will result in more regulation as well. And some voices are calling for regulating broadband access, which always has been viewed as a non-regulated data service, as a common carrier service.

One example is a recent speech given by Lawrence Strickling, National Telecommunications and Information Administration assistant secretary, to the Media Institute.

He said the United States faces "an increasingly urgent set of questions regarding the roles of the commercial sector, civil society, governments, and multi-stakeholder institutions in the very dynamic evolution of the Internet."

Strickling notes that “leaving the Internet alone” has been the nation’s Internet policy since the Internet was first commercialized in the mid-1990s. The primary government imperative then was just to get out of the way to encourage its growth.

"This was the right policy for the United States in the early stages of the Internet," Strickling said. "But that was then and this is now."

Policy issues have been growing since 2001, he argued, notably privacy, security and copyright infringement. For that reason, "I don’t think any of you in this room really believe that we should leave the Internet alone," he said.

In a clear shift away from market-based operation, Strickling said the Internet has "no natural laws to guide it."

Strickling pointed to security, copyright, peering and packet discrimination as examples. So government has to get involved, he said, for NTIA particularly on issues relating to "trust" for users on the Internet.

Those issues represent relatively minor new regulatory moves. But they are illustrative of the wider shift of government thinking. Of course, the question must be asked: how stable is the climate?

Generally speaking, changes of political party at the presidential level have directly affected the climate for telecom policy frameworks. And while a year ago it might have seemed likely that telecom policy was clearly headed for a much more intrusive policy regime, all that now is unclear.

A reasonable and informed person might have argued in November 2008 that "more regulation" was going to be a trend lasting a period of at least eight years, and probably longer, possibly decades.

None of that is certain any longer. All of which means the trend towards more regulation, though on the current agenda, is itself an unstable development. One might wonder whether it is going to last much longer.

That is not to say some issues, such as copyright protection or consumer protection from identity theft, for example, might not continue to get attention in any case. But the re-regulatory drift on much-larger questions, such as whether broadband is a data service or a common carrier service, or whether wireless and cable operators should be common carriers, might not continue along the same path.

You can make your own decision about whether those are good or bad things. The point is that presidential elections matter, and the outcome of the 2012 election no longer is certain.

Tuesday, January 19, 2010

Is Net Neutrality a Case of "Feeling Good" Rather than "Doing Good"?

With typical wit, Andrew Orlowski at the U.K.-based "The Register" skewers "network neutrality" as a squishy, intellectually incoherent concept. It is so nebulous it can mean anything a person wants it to be, and often is posed as a simple matter of "goodness." Which makes people feel righteous, without having to noodle through the logical implications.

Yes, there often is a difference between feeling good, and doing good, and Orlowski wants to point that out.

"As a rule of thumb, advocating neutrality means giving your support to general goodness on the Internet, and opposing general badness," he says. "Therefore, supporting neutrality means you yourself are a good person, by reflection, and people who oppose neutrality are bad people."

"Because neutrality is anything you want it to be, you have an all-purpose morality firehose at your disposal," he says. "Just point it and shoot at baddies."

Beyond that, there are fundamental issues that seem hard to reconcile, because they are hard to reconcile. Consider the analogy to freedom of speech.

In the United States, at its founding, the right of free speech was said to belong to citizen "speakers," engaged in clearly political speech. Recently, the opposite view has been taken, that the right belongs to "hearers of speech." But that means there is tension: is it the creator of speech who is to be protected, or those who might, or might not, want to listen?

Does copyright protect creators of intellectual content, or those who might want to access it? Do property rights in real estate protect those who own property, or those who want to own it?

Network neutrality essentially poses similar issues, and they will not be easy to reconcile.

Sunday, March 23, 2008

Google's New Terrain

With the bulk of the U.S. auction for new 700-MHz spectrum now over, some observers note that Google was a big winner, though it didn't win any spectrum. Well, that's true, but so are nearly all potential or existing application or device providers.

In fact, it might be worth noting that Google's apparent strategy--to extract regulatory concessions without winning actual spectrum--illustrates the nature of the business environment Google now is entering.

Simply, to the extent that Google's financial interests now require involvement in regulated industries, it has to play the regulatory game, as do all major media and communications industries and contestants.

In the communications and "electronic media" industries, government decisions literally can create the potential for an industry to exist at all, and then dramatically affect its profit potential.

Obviously, scarce spectrum has to be allocated, either terrestrially or in space. But "smaller" decisions also can create fertile or hostile conditions for business activity. At one point early in its history, the cable industry was barred from the practice of importing broadcast TV signals from distant metropolitan areas to cablecast them in outlying areas.

Without access to those signals, there was no foundation for even rudimentary cable services. Without orbital slots, satellite providers could not create the "cable programming" industry. Without franchises, no cable operator can offer service in a city. In the early 1980s, those franchises were monopolies, and only later became non-exclusive.

Likewise, until the Telecommunications Act of 1996, it was not legal for companies to compete with the local phone companies to offer "dial tone" and other services to consumers. AT&T once was a single national monopoly provider, until the threat of a forced breakup led to the creation of the "long distance" company AT&T and the separate local communications companies US West, Ameritech, Southwestern Bell, BellSouth, Nynex, Pacific Telesis and Bell Atlantic.

Likewise, though people might think movie studios do not have such concerns, they do. The reason movie studios may not own theater chains is that they are barred by law from doing so. The thinking originally was to prevent excessive industry concentration and monopoly.

As Google and other application providers discover their futures lie, in great part, in mobile services, they necessarily will have to participate in efforts to sway the regulatory and legislative process.

In that sense, Google, clearly a winner in the 700-MHz auctions, is starting to pay attention to the ways in which governmental regulations and statutes create, deny, enhance or limit business prospects.

The U.S. VoIP community made just such a rude discovery over the past several years as regulators and lawmakers began to take a look at VoIP, and figure out how to regulate it. Many had hoped that, as a "data" application, VoIP would be exempt from all or most regulations that apply to standard voice services.

That remains largely true for instant messaging-based, or Web site-based forms of voice. But other forms of VoIP, such as services that ultimately will replace today's "phone" services, increasingly are subject to the same sorts of regulations that govern "phone services."

Google has been spectacularly effective in its new role as a stakeholder in the regulatory process surrounding communications. But its actions are hardly unprecedented.

Saturday, November 10, 2007

Cable Industry to Get Clipped by FCC


In a move that will limit business opportunities for Comcast and Time Warner Cable and help independent networks, the Federal Communications Commission is preparing to impose significant new regulations to open the cable television market to independent networks, after determining that cable operators are too dominant in the multichannel video entertainment market.

Satellite and telco competitors should benefit at least in part, as the new rules are expected to force cable-affiliated programming networks to sell their content to competitors at better rates.

The new rules essentially would prevent Comcast from acquiring any other system assets, and limit Time Warner Cable's ability to make large acquisitions, shutting off a revenue growth path for both firms.

One of the proposals under consideration by the commission would force the largest cable networks to be offered to the rivals of the big cable companies on an individual, rather than packaged, basis. Up to this point cable-affiliated programmers have used the "bundled" wholesale tactic to get wider carriage for niche networks that piggyback on the popularity of major networks. In other words, to get the "must have" channels, competing service providers have to buy the weaker networks as well.

The agency is also preparing to adopt a rule that would make it easier for independent programmers to lease access to cable channels. Cable operators oppose that measure because it reduces their control over scarce channel slots.

Though consumer advocates believe the rule changes will lead to lower prices, that might not happen. What might happen is that consumers will be able to buy more targeted channels and packages without the "buy through" requirements that typically result in viewers "paying" for scores of channels they don't want.

In all likelihood, the changes will benefit a small number of viewers that really are interested in just a few channels, or who do not want to buy sports programs. For most viewers, who watch eight to 12 channels fairly regularly, it likely still will make sense to buy a broad package.

ESPN and sports programming in general are a major reason cable prices have risen so much over the past couple of decades, so opting out of ESPN carriage is one way consumers might save some money. Conversely, the rule changes could be damaging to ESPN if any significant number of consumers decide they can live without it.

"Tokens" are the New "FLOPS," "MIPS" or "Gbps"

Modern computing has some virtually-universal reference metrics. For Gemini 1.5 and other large language models, tokens are a basic measure...