Monday, October 30, 2017

Misconceptualizing advanced landline telecommunications: not a competitive local market


America’s advanced telecommunications infrastructure gaps are not an inherently local problem. They occur all over the United States – in urban, suburban, exurban and rural areas. It is a nationwide issue requiring a national solution. A major impediment to addressing it from a national or regional perspective is that telecommunications is typically conceived of as a local service offering rather than as infrastructure linking localities to other localities, regions, states and nations – the way long distance telephone service did for decades. The roots of this conception are both old and new.

The older one is cable TV service. It got its start in the 1950s as a definitively local service, serving localities that for reasons of distance and terrain could not reliably receive over the air television signals. Cable providers erected large community antennae to pick up and amplify the signals, delivering them over cables to customer premises. Hence its designation as CATV service -- Community Antenna Television. Local governments saw CATV – later fed with satellite-delivered TV programming – as a local service and issued franchises to cable operators. Cable thus came to be thought of as a local service that varied from locality to locality.

The newer conceptualization of telecommunications as a local service comes courtesy of legacy telephone companies that delivered voice phone service over twisted pair copper for many decades starting early in the last century. Around 2000, telephone companies began providing Internet connections via Digital Subscriber Line (DSL) service. This technology is hyper-local because of its limited range, able to serve customer premises only within about two and a half miles of phone company central office facilities. Consequently, localities ended up with some neighborhoods able to get DSL service while others too far from the central offices could not. That further reinforced the conception of advanced telecommunications as a highly localized service.
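The loop-length constraint that makes DSL availability vary block by block can be sketched in a few lines. This is an illustrative model, not carrier data: the 2.5 mile limit comes from the figure above, and the premises and distances are hypothetical.

```python
# Illustrative sketch of DSL serviceability by copper loop length.
# The ~2.5 mile limit reflects the practical range cited above;
# premises names and distances are hypothetical examples.

def dsl_serviceable(loop_miles: float, limit_miles: float = 2.5) -> bool:
    """Rough eligibility test: premises beyond the loop-length limit
    cannot get usable DSL from that central office."""
    return loop_miles <= limit_miles

# Hypothetical premises at varying distances from one central office:
premises = {"Elm St": 0.8, "Oak Ave": 2.1, "Rural Rd": 4.6}
served = {name: dsl_serviceable(d) for name, d in premises.items()}
print(served)  # the most distant premises fall outside the radius
```

The same central office thus serves some neighborhoods and not others, which is the patchwork the paragraph describes.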

Then around 2005, cable providers began offering Internet protocol-based voice and data services. They realized local governments could require them to upgrade and build out their infrastructures to offer these advanced telecommunications services to all customer premises in a given local jurisdiction. Wanting to avoid the capital expenditures that would entail, the cable companies championed legislation that took franchising authority away from the locals and transferred it to state public utility commissions. Consequently, as with phone company DSL service, some neighborhoods are served while others outside cable companies’ desired service area “footprint” remain unserved.

Viewing advanced telecommunications as a local service offering – priced, advertised and sold in service bundles – naturally leads to an unrealistic expectation that it should be a competitive market like other widely advertised services. If Company X won’t serve my neighborhood, then I should be able to go to Company Y or Company Z. If Provider A doesn’t offer the service bundle at the price I can afford, then I should be able to shop Providers B, C and D for an alternative offer.

The problem is these service offers aren’t available because the other providers aren’t necessarily in the market, their advertising notwithstanding. The fine print in the ads from the legacy telephone and cable providers notes that service “may not be available in all areas.” That’s because in much of their nominal service areas, it costs too much and is too economically risky to support those other options under the dominant business model, in which the provider owns the infrastructure connecting customer premises and is paid through recurring monthly subscriptions. The risk is that not enough premises will subscribe -- or too many that do will close their accounts -- to justify the investment in high cost infrastructure. Any new providers who might compete with the incumbents face that risk and more, since they would have to woo customers away from the incumbents as well as win new ones of their own.
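The business case risk just described can be made concrete with a back-of-envelope payback model. All figures here are hypothetical illustrations, not actual provider economics, and the model deliberately ignores operating costs and churn.

```python
# Hypothetical sketch of the subscription business-case risk: payback
# on last-mile construction depends heavily on the share of premises
# passed that actually subscribe (the "take rate"). All numbers are
# illustrative, not real provider data.

def payback_years(capex_per_premises: float,
                  monthly_revenue: float,
                  take_rate: float) -> float:
    """Years to recover construction cost per premises passed,
    ignoring operating costs and churn for simplicity."""
    annual_revenue_per_passing = monthly_revenue * 12 * take_rate
    return capex_per_premises / annual_revenue_per_passing

# Same build cost and price, three different take rates:
for take in (0.6, 0.3, 0.15):
    years = payback_years(3000, 60, take)
    print(f"take rate {take:.0%}: payback {years:.1f} years")
```

Halving the take rate doubles the payback period, which is why a new entrant -- who must win subscribers away from an incumbent rather than sign up a greenfield market -- faces an even steeper version of the same risk.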

That business case risk is unlikely to change if advanced landline telecommunications remains largely unregulated on a de facto basis and left to large, investor-owned legacy telephone and cable companies. They’re not promoting their ability to connect more and more customer premises and there is no enforced national regulatory policy that compels them to do so. Lately, their ads promote sports and entertainment content -- for the premises they choose to serve with landline infrastructure -- and mobile devices.

Tuesday, October 24, 2017

Where's the case Title II regulation of ISPs deters telecom infrastructure investment?

Improved broadband access is one of the most important benefits of reversing Title II overreach. The internet brought us what seems like endless opportunities. The corollary to this, however, is that Americans without access to the internet are left behind. Internet access and computer skills are key to being connected, well-informed and competitive — not only in today’s job market, but ultimately in today’s digital era. By returning to commonsense regulation that incentivizes broadband investment and expansion, we can build out more robust networks that keep the American dream alive for those striving to succeed in today’s technology-driven world. At least ten percent of Americans (35 million people) lack adequate broadband access, according to the FCC’s 2016 Broadband Progress Report. This includes 23 million Americans in rural communities. Faced with these troubling statistics, priority should be heightened to champion the urgency of broadband deployment.

Source: Rolling back 'net neutrality' is essential to the free internet's future.

The problem with this argument is it fails to make a clear case as to why regulating Internet service providers offering Internet protocol-based telecommunications service under Title II of the Communications Act would deter deployment of telecommunications infrastructure. In fact, the Title II regulatory scheme mandates universal service to all Americans who reasonably request service. Many if not most of the millions of Americans the author points to as lacking adequate Internet access have repeatedly requested service and been denied it in violation of this requirement and its bar on neighborhood redlining. That's because providers have not adequately invested in their infrastructure to make service available to them. Those provisions of Title II were put in force in 2015 by the U.S. Federal Communications Commission's Open Internet rulemaking.

Thursday, October 19, 2017

U.S. should avoid "broadband speed" standard, set infrastructure-based telecom modernization goal

On a conference call with reporters, U.S. Senate Minority Leader Charles E. Schumer today called on the Federal Communications Commission (FCC) to immediately reverse course and reject any proposal to downgrade the minimum benchmark definition of internet service, which would create the mirage of more widespread broadband service without actually improving quality or accessibility for high-speed home internet. Schumer emphasized that pushing this standard would undermine access to genuine high-speed broadband for Upstate New Yorkers, which should be the FCC’s focus. Schumer called on the FCC to end all attempts to “define access down.”

*  *  *

Schumer said that each year the FCC evaluates national broadband deployment standards to ensure internet service providers (ISPs) are equally distributing quality broadband. In 2015, the FCC established a new definition of broadband, increasing the access requirement from 4Mbps minimum download speed, 1Mbps upload speed, to 25Mbps/3Mbps in order to serve the 55 million Americans without high-speed internet at those speeds. This decision was an attempt to raise the bar for the quality of internet being deployed and set goals aimed at increasing reliable broadband access for millions of Americans.
Press release from Schumer's office here.
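The mechanics of “defining access down” amount to counting households as served under one benchmark versus another. The sketch below illustrates this with the 4/1 Mbps and 25/3 Mbps thresholds cited in the press release; the sample connections are hypothetical, not FCC data.

```python
# Sketch of how changing the broadband benchmark reclassifies "served"
# households without changing any actual connection. The speed pairs
# below are hypothetical examples, not FCC survey data.

def meets_benchmark(down_mbps: float, up_mbps: float,
                    benchmark: tuple = (25.0, 3.0)) -> bool:
    """True if a connection meets the benchmark's down/up thresholds."""
    need_down, need_up = benchmark
    return down_mbps >= need_down and up_mbps >= need_up

# (download, upload) speeds in Mbps for four sample households:
connections = [(50, 5), (25, 3), (10, 1), (6, 1)]

old = sum(meets_benchmark(d, u, (4.0, 1.0)) for d, u in connections)
new = sum(meets_benchmark(d, u, (25.0, 3.0)) for d, u in connections)
print(f"served under 4/1: {old}, served under 25/3: {new}")
```

Lowering the benchmark back toward 4/1 would raise the “served” count on paper while leaving every household’s actual connection unchanged -- the mirage Schumer warns against.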


This is well intended on Schumer’s part given that Upstate New York, like much of America, suffers from deficient advanced telecom infrastructure. But the fundamental problem isn’t the U.S. Federal Communications Commission potentially setting the bar too low. Rather, the wrong metric is being used.

Instead of throughput speed, the United States should establish an infrastructure-based goal of bringing modern fiber optic telecommunications connections to every home, business and institution. And do so as a crash program given the critical role of telecommunications in today’s digital information economy and the widespread infrastructure deficiencies. In setting this goal, the nation must also create a plan to achieve it since it’s meaningless without one.


Tuesday, October 17, 2017

Wildfires pose potential crisis -- and opportunity -- for PG&E

Wildfires create worst crisis for PG&E since San Bruno gas disaster | The Sacramento Bee: California’s wildfires have left Pacific Gas and Electric Co. confronting its most serious financial crisis since the 2010 San Bruno gas explosion, a disaster that threatened the company with bankruptcy and ultimately cost the utility $1.6 billion in fines and other costs. Two state agencies, Cal Fire and the California Public Utilities Commission, have launched investigations into whether Northern California’s largest utility could be at least partly responsible for the fires that ignited Oct. 8, killing at least 41 people and destroying roughly 5,700 homes and businesses. So far, neither Cal Fire nor the CPUC has cited evidence that PG&E contributed to any of the ignitions. But the stock price of parent company PG&E Corp. has plunged over the last week amid investor jitters that the utility could be held responsible. PG&E shares closed Monday at $53.43, a drop of $4.34. Since Friday the company’s stock market value has fallen by more than $5 billion.

The threat of wildfires sparked by electric power transmission lines in PG&E's Northern California service territory will continue into the future after the recent deadly wildfires that ravaged California’s wine country, killing more than 40 people and destroying several thousand homes and businesses. Some predict the hazard will worsen due to climate change and continued residential development near fire prone wildland areas.

Out of crisis, goes the adage, opportunity often follows. For PG&E, that opportunity is to vastly reduce the chance of its power lines starting destructive wildfires and subjecting the company and its shareholders to significant legal liability. How so? By placing its last mile distribution lines serving customer premises in buried underground conduit instead of suspended overhead on wooden poles close to combustible flora and other materials. 

There’s an additional bonus on top of avoiding the maintenance and storm outage costs associated with above ground poles and infrastructure. PG&E recently filed an application with California utility regulators to serve as a wholesale telecommunications provider using its fiber optic infrastructure. Conduit for underground electrical power cables could also house fiber for telecommunications and bring it close to residential, business and institutional PG&E customers. PG&E could lease that fiber to internet service providers, providing an additional revenue stream to help offset the cost of undergrounding its premise electrical service lines.

And that's not all. In placing electric power lines in underground conduit, electric utilities can apply shielding to protect the grid from damaging electromagnetic flux from X-class solar flares or EMP weapons detonating at high altitude. 

Thursday, October 05, 2017

Stop the Cap! The End of Google Fiber Expansion: Where Did It All Go Wrong?

Stop the Cap! The End of Google Fiber Expansion: Where Did It All Go Wrong? : The bean counters also arrived at Google Access — the division responsible for Google Fiber — and by October 2016, Google simultaneously announced it was putting a hold on further expansion of Google Fiber and its CEO, Craig Barratt, was leaving the company. About 10% of employees in the division involuntarily left with him. Insufficiently satisfied with those cutbacks, additional measures were announced in April 2017 including the departure of Milo Medin, a vice president at Google Access and Dennis Kish, a wireless infrastructure veteran who was president of Google Fiber. Nearly 600 Google Access employees were also reassigned to other divisions. Medin was a Google Fiber evangelist in Washington, and often spoke about the impact Google’s fiber project would have on broadband competition and the digital economy. Porat’s philosophy had a sweeping impact on Alphabet and its various divisions. The most visionary/experimental projects that were originally green-lit with no expectation of making money for a decade or more now required a plan to prove profitability in five years or less. (Emphasis added).

In adopting that five year ROI cutoff, Google Fiber effectively placed itself under the same financial constraints governing slow moving legacy telephone and cable companies it hoped to overbuild with fiber to the premise (FTTP). Having ventured into FTTP nearly a decade ago with no overwhelming technological or marketing advantage and using the same recurring monthly subscription business model -- including TV programming -- as the incumbents, it should surprise no one it's retreating.

As a former advisor to Google co-founder Larry Page was quoted as saying in Phil Dampier's post mortem excerpted above, "There’s no flying-saucer shit in laying fiber." Indeed. So unless Google Fiber figures out how to teleport fiber conduit into the ground or develops fiber cables that hang in mid air defying gravity -- thus avoiding the need for pole access -- it's pointless for Google Fiber to remain in FTTP.

Google Fiber's parent company, Alphabet, has a unit simply dubbed "X" to develop "moonshot" inventions profiled in the November 2017 issue of The Atlantic. Perhaps X will be able to obsolete FTTP and the Internet itself by coming up with a way to store quantum bits of information in the substrate of space time and encrypted by a form of blockchain technology to ensure data integrity.

Saturday, September 30, 2017

A Better Deal falls short of urgent need to fully modernize America’s telecommunications infrastructure

Democrats this week unveiled a plank of the party’s A Better Deal platform declaring Internet protocol-based advanced telecommunications an essential modern utility equivalent to electric power service. It proposes a $40 billion Universal Grant Program to subsidize for-profits, cooperatives and local governments to ensure it is available to every U.S. home, school and small business.

The proposal falls short relative to the urgent need to modernize America’s legacy metallic telecommunications infrastructure – designed for the analog telephone and cable TV service of decades past – to fiber optic infrastructure. Its main flaw is it isn’t framed as an infrastructure initiative.

Rather, the proposal calls for a service standard couched in outdated terminology: “universal high speed Internet.” That term describes a level of service, not infrastructure. It and “broadband” distinguish such service from the narrowband, low speed dialup connections over phone lines commonly used in the 1990s (and unfortunately still the case in 2017 for too many American homes). In so doing, the Democratic proposal falls into the trap of the current debate over what constitutes “high speed Internet.” That can only add further delay to solving the deepening crisis of deficient telecommunications infrastructure in much of the United States, which now requires an expedited effort.

In addition to its origins in the past, “high speed Internet” is also too present focused since that term means what’s sufficient to support today’s needs relative to high quality voice, video and data. It doesn’t take into account tomorrow’s needs which will undoubtedly require more bandwidth -- and the growth capacity only fiber optic premise connections can efficiently provide. That’s why instead of “high speed Internet,” the federal government should instead launch a cleanly defined telecom infrastructure modernization initiative to bring fiber connections to every American doorstep. And provide sufficient funding to achieve it. That will take at least five times the $40 billion the Democrats propose.

Thursday, September 28, 2017

Google Fiber's Kansas City experiment demonstrates need for publicly owned advanced telecom infrastructure

Google Fiber made Kansas City better but didn't transform it | The Kansas City Star: There may be a lesson here. Digital technology has undoubtedly transformed our world, disrupting media, entertainment, politics, retail, money management and more. But the miracle is at the end of the pipeline — the miracle isn’t the pipeline itself. Most Americans now see internet service as a utility, and price remains an important consideration. That could explain why Google Fiber is rethinking its role in getting digital service to the home.

Internet protocol-based advanced telecommunications is indeed a modern utility for residential, commercial and institutional premises just as electricity and telephone service before it. However, what remains unclear is the appropriate business and pricing model. Electricity is correctly billed on a consumption basis. Use more megawatts, pay more. That makes sense because the generation of those megawatts incurs costs directly attributable to their production. But the same cannot be said for the gigabits and terabits that power advanced telecommunications carrying voice, video and data.

The Kansas City Star correctly observes that the price of this newest utility is a consideration. That’s because ISPs bill using a monthly recurring charge, as do other utilities. Every household budgets based on its monthly recurring costs such as mortgage or rent payments and utilities. But is that the right pricing model for advanced telecommunications, particularly when the monthly recurring charge is based on bandwidth? While large businesses and data and call centers might be in the market to buy bandwidth, most consumers are not. They merely want reliable telecommunications service that doesn’t distort, slow down or stall, and don’t care about the bandwidth that ensures that level of service.
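The contrast between the two pricing models can be sketched simply. The rates below are hypothetical round numbers chosen for illustration, not actual utility or ISP tariffs.

```python
# Illustrative comparison of the two pricing models discussed above:
# consumption-based billing, as with electric power, versus the flat
# monthly recurring charge ISPs levy regardless of bits carried.
# All rates are hypothetical.

def metered_bill(units_used: float, rate_per_unit: float) -> float:
    """Electricity-style bill: cost tracks consumption."""
    return units_used * rate_per_unit

def flat_bill(monthly_charge: float) -> float:
    """ISP-style bill: same charge whether usage is light or heavy."""
    return monthly_charge

# A light and a heavy household pay very differently for power...
light_power = metered_bill(300, 0.15)    # 300 kWh at $0.15/kWh
heavy_power = metered_bill(1200, 0.15)   # 1200 kWh at $0.15/kWh
print(light_power, heavy_power)

# ...but identically for telecom, since carrying additional bits costs
# the provider almost nothing once the infrastructure is in place.
print(flat_bill(70.0), flat_bill(70.0))
```

Metered billing makes sense for electricity because each additional megawatt incurs real generation cost; the same marginal-cost logic does not hold for gigabits over installed fiber, which is the question the paragraph raises.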
 
The only way to ensure that service standard going forward as the bandwidth requirements of advanced telecommunications services evolve and grow is fiber to the premise telecom infrastructure. It’s the only technology that provides sufficient headroom for whatever services may be coming in the foreseeable future as well as adequately supporting today’s. In that regard, Google Fiber got the technology side of the equation right. But as the Star suggests, the business model essentially copied that used by legacy telephone and cable companies needs rethinking.

A better model would be to treat most telecommunications infrastructure as a public asset like roads and highways, funded by taxpayers at all levels of government – federal, state and local. Google Fiber and other ISPs would have a role to build and maintain those fiber thoroughfares and sell services over them on an open access basis. But they shouldn’t own them. Since they would be selling services, it would be in the economic interests of the ISPs to ensure the reliability of the network.

The current private ownership model of advanced telecommunications service is clearly broken and crippled by market failure in much of the United States lacking infrastructure capable of reliably delivering high quality voice, video and data. As the Google Fiber experiment shows, simply adding another investor-owned ISP isn’t going to solve that national problem. A new path forward is needed.