There’s still interest in rural broadband experiments, but no way to judge feasibility yet


If you don’t ask, you don’t get.

Hundreds of companies, communities and miscellaneous organisations, representing just a touch shy of a thousand projects, told the FCC last March that they wanted to take part in its rural broadband experiment program. When it came time to actually submit a bid – it’s effectively an auction process – only 181 applications were received by the 7 November 2014 deadline.

The FCC hasn’t released a list of the bidders. It’s only said that the process…

…has attracted almost 600 project bids from 181 applicants, representing nearly $885 million worth of projects.

In total, the 181 applicants proposed to serve over 76,000 census blocks in all 50 states and Puerto Rico.

Bidders included a diverse group of entities, including competitive providers, electric utilities, wireless internet service providers, and others.

Without knowing which census blocks are involved, it’s hard to come up with an accurate coverage number, but a very rough guesstimate is that something like 3% to 5% of the rural U.S. population might be affected. There are about 11.2 million census blocks in the country, about a fifth of which are rural. 76,000 is about 3.4% of that; factor out uninhabited census blocks and the figure pushes closer to 5%.
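The back-of-the-envelope arithmetic can be sketched in a few lines of Python; the one-third uninhabited share is an illustrative assumption for the example, not a census figure:

```python
# Rough coverage estimate for the FCC rural broadband experiment bids.
total_blocks = 11_200_000      # approximate U.S. census blocks
rural_share = 0.20             # roughly a fifth are rural
rural_blocks = total_blocks * rural_share

bid_blocks = 76_000            # census blocks covered by the submitted bids

raw_share = bid_blocks / rural_blocks
print(f"share of all rural blocks: {raw_share:.1%}")        # about 3.4%

# Assume roughly a third of rural blocks are uninhabited (illustrative only)
inhabited_rural = rural_blocks * (2 / 3)
adj_share = bid_blocks / inhabited_rural
print(f"share of inhabited rural blocks: {adj_share:.1%}")  # about 5%
```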

To take it one step further, if you assume 50 to 100 people per rural census block (another egregiously round guesstimate), then maybe something like half a million to three-quarters of a million people are affected. Maybe fewer: rural blocks can be tiny in terms of population. That’s a (very) rough indicator of interest, though, not an estimate of the number of rural people who will be seeing broadband service upgrades in the near future. The number of projects funded in this experimental round will be more like a dozen, which drops the assumed coverage by an order of magnitude.

It’ll be interesting to see who actually applied and what they’re proposing to do. A large proportion of those applicants are likely to be no-hopers who figured they had nothing to lose by asking. The projects to look for are the ones that either involve truly innovative technology or business models, or are backed by companies with the scale to truly go large in rural areas if the experiment produces a positive result.


Google’s small business gigabit enables e-commerce for a few dollars more


An extra $30 or so gets you a commercially usable gigabit connection from Google. That’s the deal being rolled out to small businesses in a few districts of the Kansas City metro area.

The basic consumer price in Kansas City is $70 for a gigabit ($120 with television service). Google’s new small business package is $100 for the gigabit, plus another $20 if you want a static IP address (or $30 for a block of five). Google isn’t saying anything about quality of service or oversubscription levels, which means a commercial subscription might not be any better than consumer grade service. The difference is in the terms of service.
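Put as arithmetic, the tiers described above stack up like this – a sketch using only the prices quoted here; the function name is mine, not Google’s:

```python
# Google Fiber Kansas City gigabit pricing, per the reported rates.
CONSUMER = 70          # consumer gigabit
BUSINESS = 100         # small business gigabit
STATIC_IP_ONE = 20     # one static IP add-on
STATIC_IP_FIVE = 30    # block of five static IPs

def business_monthly(static_ips=0):
    """Small business cost with 0, 1 or a block of 5 static IPs."""
    if static_ips == 0:
        return BUSINESS
    if static_ips == 1:
        return BUSINESS + STATIC_IP_ONE
    if static_ips == 5:
        return BUSINESS + STATIC_IP_FIVE
    raise ValueError("only 0, 1 or a block of 5 offered")

# The commercial premium over consumer service, with one static IP:
print(business_monthly(1) - CONSUMER)  # 50
```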

Google Fiber’s small business acceptable use policy says…

You agree not to use…the Services provided to you for any of the following purposes…

To make the Services available to anyone outside the premises to which the Services are delivered. This limitation does not prohibit you from offering Internet service to your customers and other authorized users on your premises (for example, via Wi-Fi), subject to this AUP and our Terms of Service.

To resell the Services directly or indirectly, except as explicitly approved by Google Fiber in writing.

To create substitute or related services through the use of or access to the Services (for example, to use the Services to provide web hosting services to third parties).

Compare that to the consumer policy…

You agree not to use or allow third parties to use the Services provided to you for any of the following purposes…

To operate servers for commercial purposes. However, personal, non-commercial use of servers that comply with this AUP is acceptable, including using virtual private networks (VPN) to access services in your home and using hardware or applications that include server capabilities for uses like multi-player gaming, video-conferencing, and home security.

Google is trying to draw a plain language line between consumer use (personal stuff for business or pleasure only), small business use (do what you need to do to support your own business, but don’t resell access or web services) and ISP-grade service. So far, the price seems to be scaling fairly.


The Blue Screen of Death is still deadly, but only for Microsoft


Microsoft’s cloud platform – Azure – had a stormy week, but the silver lining is that the company isn’t shying away from the necessary pain involved in transforming itself from a shrink-wrapped software hawker to a computing services provider.

The 11-hour outage was the result of a botched system update. At the risk of taking a cheap shot, that’s the kind of glitch that the Windows operating system – and its predecessors – has routinely experienced for more than 30 years. To a large extent, those past problems were unavoidable – information technology grew so explosively that every day was a step into the unknown. Like everyone else, Microsoft engineers had no choice but to make it up on the fly.

But Microsoft’s near-monopoly in the desktop world also played a role. There was nowhere else for system administrators to go and no choice but to plow through whatever problems came with the updates.

It’s a different world now. Cloud computing is, in most respects, a commodity service. The cost of switching from one platform to another isn’t zero, but it isn’t an insurmountable barrier, either. If Amazon and Google are up and running, Microsoft can’t afford to simply shrug its shoulders when the Blue Screen of Death stares out at millions of users all at once.

Slowly but surely, Microsoft is learning how to live in a competitive marketplace. Another encouraging sign is the decision to let Microsoft Garage – a semi-independent corporate hacker space – develop apps for any platform. If Microsoft wants to remain relevant, R&D has to respond to the demands of a competitive and heterogeneous market, not serve the demands of Windows and Office. The message seems to be getting through.


Fast track broadband projects proposed in northern California


Almost 120,000 people – 50,000 households – in 14 California counties would be reached by broadband projects reviewed by the California Emerging Technology Fund (CETF) at its rural forum in Redding last week.

The list of “projects with some current momentum” was developed with the cooperation, and in many cases active participation, of the regional broadband consortia that represent those areas. The plan going forward is to work with project backers, state and federal agencies and CETF to bundle financing together that will cover the typical 30% to 40% investment match requirements of the California Advanced Services Fund.

All the areas involved have been at least provisionally identified as qualifying for CASF grants, which cover 60% to 70% of project construction costs. Loans are also available to cover up to another 20%, to a maximum of $500,000 per project. Projects can be submitted to the California Public Utilities Commission beginning in a little over a week, on 1 December 2014.
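As an illustration of how those pieces might stack up – the numbers and function below are mine, invented for the example, not from an actual CASF application:

```python
# Hypothetical CASF financing stack for a rural broadband project.
def casf_stack(project_cost, grant_rate=0.60, loan_rate=0.20):
    """Split a project's cost into CASF grant, CASF loan and local match.

    grant_rate: 0.60 to 0.70, depending on how the area qualifies.
    loan_rate: up to 0.20 of cost, capped at $500,000 per project.
    """
    grant = project_cost * grant_rate
    loan = min(project_cost * loan_rate, 500_000)
    match = project_cost - grant - loan   # left for the applicant to raise
    return grant, loan, match

# A $2 million project at the 60% grant rate:
grant, loan, match = casf_stack(2_000_000, grant_rate=0.60)
print(grant, loan, match)  # $1.2M grant, $400K loan, $400K match to raise
```

Note that on bigger projects the $500,000 loan cap bites, pushing the required match back toward the 30% to 40% range the consortia are trying to bundle financing for.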

One of the projects – dubbed Digital 299 after the state highway it would follow – is aimed at bringing inexpensive middle mile connectivity to Trinity County. The project particularly includes Weaverville, which was pinpointed as a potential hot spot for data center development due to the low cost of electricity there. It also ranked high on measures of social impact and business potential in an analysis I presented at the conference. It’s similar in concept to the Digital 395 fiber network that is now up and running from Reno, down the eastern side of the Sierra Nevada to Barstow.


How much of the net neutrality job will go to state regulators?


Whether or not the FCC decides to regulate broadband service as a common carrier utility, new net neutrality rules will be imposed – successfully or not. State utility regulators from across the country met in San Francisco this week, and yesterday afternoon the California Public Utilities Commission hosted a conference on Internet regulation – net neutrality in particular – chaired by commissioner Catherine Sandoval.

A national panel of economics and law professors discussed where state regulators fit in. Not surprisingly, they didn’t agree.

Early on, mobile telecoms regulation was moved nearly completely to the federal level, resulting in “massive investment”, said Charles Davidson from the New York Law School. The same thing happened with broadband and “the result has been massive investment in broadband networks” along with deployment to most parts of the country. “Congress did not contemplate an active state role vis-à-vis broadband”, he concluded.

Mobile telecoms regulation is indeed a good model to follow, said Jim Tuthill from the U.C. Berkeley School of Law. But the lesson there is that mobile telecoms already fall under common carrier rules. “There’s an important role for the states under the regulatory regime of mobile services. Title II [common carrier] regulation of cellular hasn’t hurt investment, hasn’t hurt innovation” he said. “It worked for wireless, it’ll work for broadband”.

The premise for a higher level of state and federal broadband regulation is wrong, according to Michael Katz, from the U.C. Berkeley Economics Department. The debate over network neutrality is primarily a debate over whether some companies should be able to pay for Internet fast lanes to reach customers.

“Paying for better service does not, in and of itself, constitute price discrimination”, Katz said, pointing out that telling General Motors that it could only sell Cadillacs would hurt consumers. “It’s a good thing when different firms that are competing to serve customers go out and buy more expensive inputs to better serve customers”, he said. The answer, according to Katz, is not to try to create rules that will force companies to do or not do certain things, but use existing anti-trust law on a case by case basis to swat down anti-competitive behavior when it occurs.

The FCC’s original approach, using general authority to promote broadband under another section – 706, if you’re keeping score – of federal telecoms law is the better course, according to Christopher S. Yoo from the University of Pennsylvania Law School. Otherwise, big telecoms companies will have a huge advantage in a common carrier game, because “you have to have an army of lawyers to do it”. The original approach – assuming the commission can thread its way through court rulings – works because “network neutrality is a subsidy for the edge”, which incentivises companies to develop services that will attract new users to the Internet. “The rationale is that anything that promotes usage is within the power of 706”, which puts it under state authority, Yoo said. The role of states, he said, is pushing efforts such as broadband mapping studies and data collection, implementing universal service, promoting education and usage as envisaged in the national and state broadband plans and reducing switching costs for consumers.

It was a good discussion of the different regulatory tools in states’ broadband development kit. And the more options, the better, as Sunne Wright McPeak, CEO of the California Emerging Technology Fund, told the group.

“The solution is not a silver bullet, but silver shotgun pellets”, she said.


Mobile broadband divide detailed at California Broadband Council


Ken Biba, from Novarum Inc., briefed California Broadband Council members yesterday on the results of mobile broadband testing conducted by the California Public Utilities Commission. He reiterated conclusions previously published regarding the mobile broadband divide between rural and urban areas in California.

“It’s a one carrier state and it’s Verizon”, Biba said. Although AT&T has built out into rural areas too, its service isn’t as widely available and doesn’t perform as well. As for the rest, “I can’t advise anyone to get a Sprint phone or a T-Mobile phone because you’re not going to get service”, he said.

There’s a wide variation in service, ranging from the best measurements on Verizon’s network along the Mexican border, to great gaping holes in the Sierra and along the northern coast. Where you are and who your carrier is determines the quality of your mobile broadband service, as does the device you’re using and the websites you’re frequenting. But as for the commonly heard claim that time of day determines mobile broadband performance, “it’s largely bullshit”, Biba said.

The data also points to the need for more fiber – and more access to existing fiber – particularly in rural areas. Once a mobile broadband connection hits a rural tower, it slows down. “There’s a 40% latency penalty for rural users over urban users”, Biba said, emphasising that this finding was preliminary and more research would be done to confirm (or rebut) it. Even so, “I think it’s real”.

In what was effectively a lame duck session, the council also heard telecoms regulators from New York and the Virgin Islands talk about the challenges they face, and closed out the session by thanking everyone, and particularly CPUC staff, for past years of support. Neither Alex Padilla nor Steven Bradford – outgoing state assembly and senate representatives respectively – attended. It’s also the last meeting for council chair – and CPUC president – Michael Peevey. Of the principal members (the others are staff representatives from various state agencies) only Sunne Wright McPeak, president of the California Emerging Technology Fund, will return next year. The council wrapped up the meeting by electing Carlos Ramos, head of the California Department of Technology and the state’s CIO, as the new chair, beginning in January.


California readies $25 million public housing broadband program


Public housing operators in California can start applying for broadband facilities and marketing subsidies beginning next month, assuming the California Public Utilities Commission approves draft rules for the program that were released yesterday.

I can’t summarise the program any better than CPUC staffer Tom Glegola…

The Account provides $20 million for grants and loans to finance inside wiring and equipment, and $5 million for adoption projects. AB 1299, the legislation creating the new account, limits eligibility for both activities to a “Publicly supported community” (PSC) which is defined as “a publicly subsidized multifamily housing development that is wholly owned” by either a chartered public housing authority or a 501 (c)(3) non-profit that has received public funding to subsidize the construction or maintenance of affordable housing.

The current proposal (not final until the Commission approves it) would award grants and loans to finance up to 100 percent of the costs to install inside wiring and equipment, but will not finance maintenance or operation costs (monthly costs). Grantees must maintain and operate the network for a minimum five years after receiving Commission funding…

For adoption projects, the current proposal is that Commission funds up to 85 percent of the costs for adoption projects.

Facilities grants would pay for wired and wireless networking equipment, wiring, modems and routers (but not computers or other personal equipment), engineering and the usual overhead. Broadband marketing efforts – or “adoption” programs, if you prefer – can include adequate personal equipment (in other words, not smartphones) as well as education costs, technical support and facilities for computer labs. That’s a summary, anyway. The full version of what is and isn’t allowed, plus application details, can be found in the draft rules.
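As a rough illustration of the two subsidy rates – the project costs below are invented for the example, not drawn from any application:

```python
# Sketch of a public housing broadband ask under the draft AB 1299 rules.
FACILITIES_RATE = 1.00   # up to 100% of inside wiring and equipment costs
ADOPTION_RATE = 0.85     # up to 85% of adoption (marketing/education) costs

wiring_cost = 150_000    # hypothetical inside wiring and equipment budget
adoption_cost = 40_000   # hypothetical adoption program budget

facilities_ask = wiring_cost * FACILITIES_RATE
adoption_ask = adoption_cost * ADOPTION_RATE
print(facilities_ask, adoption_ask)  # 150000.0 34000.0
```

The asymmetry is the point: facilities can be fully subsidised, but adoption projects have to bring 15% of their own money, and ongoing operating costs aren’t covered at all.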

The CPUC is scheduled to vote on the proposal at its meeting on 18 December 2014.


FCC squeezes the AT&T GigaWeasel


Sneak peek at AT&T’s response.

The FCC slapped back at AT&T on Friday, demanding it turn over information describing exactly what it means when it says it’s going to build fiber to 2 million more homes if its deal to buy DirecTV is approved, but will otherwise stop upgrading systems while the FCC decides whether to regulate broadband as a common carrier service.

That was the gist of comments made on Wednesday by AT&T CEO Randall Stephenson (h/t to Fred Pilot at Eldo Telecom for the heads up). Today, the FCC sent – and published – a letter to AT&T that gives the company a week to turn over…

Data regarding the Company’s current plans for fiber deployment, specifically: (1) the current number of households to which fiber is deployed and the breakdown by technology (i.e., FTTP or FTTN) and geographic area of deployment; (2) the total number of households to which the Company planned to deploy fiber prior to the Company’s decision to limit deployment to the 2 million households and the breakdown by technology and geographic area of deployment; and (3) the total number of households to which the Company currently plans to deploy fiber, including the 2 million households, and the breakdown by technology and geographic area of deployment…

The distinction between FTTP and FTTN (fiber to the premises versus fiber to the node) is important. Although AT&T has talked about building FTTP, there’s good reason to think they really intend to just do FTTN builds in most cases.

If AT&T actually puts anything interesting in its response, expect the juicy nuggets to be kept from public view. But since the FCC is asking for the information in the context of reviewing the DirecTV deal, AT&T will have little room to spin its GigaWeasel marketing language into a legal filing. It’ll either have to appeal Friday’s demand or comply with a high degree of truth and clarity. Which will give the FCC, if not the public, a way to either debunk AT&T’s threat now or hold it to performance standards later.


FCC commissioners surf a common carrier wave


As detailed yesterday, an article in the Washington Post ably describes the split between U.S. president Barack Obama and FCC chair Tom Wheeler over common carrier regulation of Internet infrastructure and service. But it’s not a game of equals, which is why the safe bet is on adoption of Title II common carrier rules.

Even though the Post article puts Wheeler on an even footing with Obama as an independent policy maker, the reality is far different. The power of the presidency and Wheeler’s obligations to the man who appointed him aside, in the end he’s only 1 vote amongst 5 commissioners. The 2 republicans – Ajit Pai and Michael O’Rielly – seem absolutely opposed to any additional regulation. Mignon Clyburn has voiced muted concerns about Wheeler’s approach, not to mention the fact that her father is Jim Clyburn, the third ranking democrat in the house of representatives and one of Obama’s staunchest supporters.

Jessica Rosenworcel, also a democrat, has bucked Wheeler and looked favorably upon common carrier rules from the beginning, as her nuanced explanation of her vote to move ahead with the current FCC net neutrality proceeding made clear…

I support an open Internet. But I would have done this differently…I believe the process that got us to this rulemaking today is flawed. I would have preferred a delay. I think we moved too fast to be fair. So I concur. But I want to acknowledge that the Chairman has made significant adjustments to the text of the rulemaking we adopt today. He has expanded its scope and put all options on the table. Our effort now covers law and policy, Section 706 and Title II.

Wheeler has lost control of the game. Any deal he cuts will have to include extensive enough common carrier regulation to allow Obama to claim it as a victory. There’s still enough wiggle room to throw a bone to his colleagues in the cable and telco lobby, but it will be radically different from the no lobbyist left behind dreamworld that Wheeler first proposed.


Wheeler missed the point of the story: you can’t split a baby


Ex parte pleading for an ex partitio solution.

Although FCC chairman Tom Wheeler continues to play his cherished Beltway bandit game behind closed doors, the likelihood of Internet service coming largely, if not completely, under common carrier regulation is growing.

An excellent article by Brian Fung and Nancy Scola in the Washington Post clearly lays out the problem: U.S. president Barack Obama wants full on common carrier regulation, while lobbyist-in-chief Wheeler wants to cut a deal that pleases everyone, at least everyone who counts, which in Wheeler’s world is deep-pocketed lobbyists.

According to the Post article, Wheeler said as much in a meeting with lobbyists from “major Web companies, including Google, Yahoo and Etsy”…

“What you want is what everyone wants: an open Internet that doesn’t affect your business,” a visibly frustrated Wheeler said at the meeting, according to four people who attended. “What I’ve got to figure out is how to split the baby.”

Of course, in Wheeler’s world, everyone most particularly includes the likes of AT&T, Comcast and their cable and telco brethren, whom Wheeler represented over decades as the top Washington lobbyist for both the mobile phone and cable industries. His first shot at drafting net neutrality rules was tailor-made for them, but millions of public comments, howls from technology and logistics companies and, finally, Obama’s démarche have backed him into a corner, severely constraining his ability to act as a deal maker.

There’s no doubting he’s frustrated. As long as he could maintain the outward appearance of Obama’s backing, he had the ability to lead the FCC in the direction he wanted to go. Now, the only vote he can count on at the commission is his own.
