AT&T tells FCC sorry, we meant to say we’re bailing on DSL, not fiber

Under fire from the FCC, AT&T is walking back a statement by CEO Randall Stephenson that the company will stop building out fiber while common carrier regulation of broadband is on the table, with the exception, Stephenson said, of the 2 million homes promised as part of AT&T’s bid to get regulatory approval of its purchase of DirecTV.

That statement, made at an investment conference and likely unscripted, provoked a demand from the FCC for AT&T to explain itself, which it has now done in a lawyerly letter to the commission (h/t to Omar Masry for the heads up).

The letter starts off by saying that AT&T isn’t reneging on its commitment to expand its fiber footprint in 25 metro areas, although it remains vague about what that means – it references the company’s GigaPower offering, which can be, and usually is, copper-based and limited to 300 Mbps or slower. It then goes on to say that everything else is off the table until the uncertainty surrounding common carrier status is settled, most particularly DSL upgrades or expansion…

While we have reiterated that we will stand by the commitments described above, this uncertainty makes it prudent to pause consideration of any further investments – beyond those discussed above – to bring advanced broadband networks to even more customer locations, including additional upgrades of existing DSL and IPDSL lines, that might be feasible in the future under a more stable and predictable regulatory regime.

AT&T’s threat to hold hostage subscribers trapped with decaying copper plant should make interesting reading as the FCC considers new rules governing telcos’ efforts to scrap those networks.

Marriott wants FCC cover for attacks on guests’ WiFi devices

It’s interference, but it’s for your own good.

Lobbyists for Marriott and the hotel industry are asking for permission to use technological attacks to shut down personal WiFi hotspots and other devices on their properties (h/t to the Baller-Herbst list for the pointer). All in the name of security, of course. These are public-spirited companies that would never do something so crass just to protect the profits generated from selling Internet access to guests.

As explained by Fletcher, Heald & Hildreth’s CommLawBlog, Marriott combats competing WiFi signals and what it considers misuse of its own network with digital counterattacks…

To address these various problems, Marriott and its friends commonly deploy sophisticated and expensive Wi-Fi network management systems that search for unauthorized or excessive uses of a network. When such uses are detected, the systems send codes disabling them. The signals technically don’t “interfere” with guest Wi-Fi signals – that is, they don’t “jam” any radio signals, which would obviously be illegal. Instead, they simply send management commands that keep unwanted systems from accessing the venue’s own system.

That’s the good news. The bad news is that the disabling management commands also keep the unwanted private systems from functioning independently, even if those systems don’t try to interconnect with or slow down the venue’s system.

Marriott and the industry’s lobbying front, the American Hotel & Lodging Association, want FCC blessing to control unlicensed spectrum within the bounds of their own property. Purely to protect the public, of course.

The FCC has already fined Marriott $600,000 for its particular brand of WiFi aggression. The petition under consideration would create a loophole big enough to resume the offensive. The deadline for comments is 19 December 2014.

FCC looks at telcos’ copper network retirement by neglect, considers forced sales

Landline telephone companies are backing off from traditional Plain Old Telephone Service in favor of less regulated and more advanced Internet Protocol technologies. When they invest in upgrades, it’s usually fiber and not copper-based. As a result, there’s a move away from copper networks and associated legacy services.

In a notice approved last week and published yesterday, the FCC is looking for comments on how it should regulate that process. One of the questions the commission wants to answer is what to do about de facto retirement of copper plant, where telephone companies simply let unprofitable network segments rot on the poles…

There are numerous allegations that in some cases incumbent LECs are failing to maintain their copper networks that have not undergone the Commission’s existing copper retirement procedures…First, to establish whether there is a factual basis for new rules in this area, are incumbent LECs in some circumstances neglecting copper to the point where it is no longer reliably usable? We seek specific examples and facts concerning the consequences to consumers, competition, and public safety.

It’s a particular problem for rural areas in California and elsewhere. The big incumbents – AT&T and Verizon – want to get out of the rural landline business and convert subscribers to higher priced wireless service. Consequently, they’re refusing to upgrade overloaded DSL facilities or, in some cases, offer broadband service at all. That’s not exactly the copper-to-fiber/POTS-to-IP transition that the FCC is addressing right now, but it’s effectively the same thing.

One solution that’s mentioned in passing is requiring telcos to sell off copper networks that they no longer want…

We further intend to determine what role, if any, the Commission should play in any sale or auction of copper, including whether the Commission should establish rules requiring incumbent LECs to make a good faith effort to sell their copper networks before retiring the facilities.

The FCC’s notice doesn’t explicitly tie de facto abandonment to forced sales of systems. But it should. If AT&T or Verizon doesn’t want to invest in upgrading a rural system, it shouldn’t be allowed to use its protected monopoly status to hold those subscribers hostage to decaying facilities.

Tacoma loses money on muni cable, won’t go all in on broadband to make up the difference

Tacoma, Washington has a municipal broadband network that started out offering cable television service and then later added broadband. The system grew out of a fiber network that was originally installed to support the city-owned electric utility.

Called Click, the hybrid fiber-coax system was upgraded to DOCSIS 3 standards a couple of years ago. It competes head-on with Comcast, and with CenturyLink in the broadband space. Like any small cable system, Click has struggled with rising programming costs. But since it’s run by a public agency, its ability to raise subscription rates is constrained by political factors as well as economic ones.

Overall, the system has lost money, something like $2 million a year according to a presentation prepared a couple of years ago when the city council was considering a rate increase. A recent article in the Tacoma News Tribune by Sean Robinson confirms the trend is continuing, with electric customers picking up the shortfall…

Some of Click’s fiscal woes are structural, [public utility director Bill Gaines] said: As a public entity, Click is bound by union contracts and personnel expenses that private outfits don’t face. Lacking Comcast’s massive market share, Click has a tougher time negotiating programming rates.

In the big-picture sense, the $353 million utility can easily absorb the cable expense overruns with minimal impact on electrical ratepayers — the subsidy amounts to slivers of fractions of pennies on the dollar — but it’s still bad business.

“Right now, this thing (Click) is being subsidized to a degree by the electric power customers because it’s not recovering all of its costs,” Gaines said. “That will get worse over time, and that’s really not fair. It’s not fair to the electric power customers — so we’re trying to get out of that box.”

According to the article, it’s the cable TV side of the business that’s losing money; broadband is said to be profitable, at least on an operating basis. But it wouldn’t be a money maker if TV service were axed: broadband cash flow alone won’t cover fixed expenses.

One option that’s being considered is selling the system off and getting out of the business completely. Another is to double down on broadband.

Like other publicly owned power systems in Washington state, Tacoma runs an open access network. It sells broadband connections to private ISPs at wholesale rates; the ISPs in turn set their own prices and resell to consumers. It’s similar to how the Provo, Utah system was run before it was sold to Google for $1, because the revenue generated couldn’t keep up with operating expenses and bond repayments.

To claim the full share of Internet revenue, Click would have to start competing with ISPs at the consumer level, as it already does for commercial and industrial-grade accounts. Not surprisingly, the idea has generated considerable pushback from those ISPs, which has been sufficient to stall it.

For now, Click remains in a holding pattern, while city officials consider their options.

There’s still interest in rural broadband experiments, but no way to judge feasibility yet

If you don’t ask, you don’t get.

Hundreds of companies, communities and miscellaneous organisations, representing just a touch shy of a thousand projects, told the FCC last March that they wanted to take part in its rural broadband experiment program. When it came time to actually submit a bid – it’s effectively an auction process – only 181 applications were received by the 7 November 2014 deadline.

The FCC hasn’t released a list of the bidders. It’s only said that the process…

…has attracted almost 600 project bids from 181 applicants, representing nearly $885 million worth of projects.

In total, the 181 applicants proposed to serve over 76,000 census blocks in all 50 states and Puerto Rico.

Bidders included a diverse group of entities, including competitive providers, electric utilities, wireless internet service providers, and others.

Without knowing which census blocks are involved, it’s hard to come up with an accurate coverage number, but a very rough guestimate is that something like 3% to 5% of rural U.S. census blocks might be affected. There are about 11.2 million census blocks in the country, about a fifth of which are rural. 76,000 is about 3.4% of that, but factor out uninhabited census blocks and the figure pushes closer to 5%.

To take it one step further, if you assume 50 to 100 people per rural census block (another egregiously round guestimate), then maybe 4 to 8 million people are in the proposed footprints. Maybe fewer: rural blocks can be tiny in terms of population. That’s a (very) rough indicator of interest, though, not an estimate of the number of rural people who will be seeing broadband service upgrades in the near future. The number of projects funded in this experimental round will be more like a dozen, which drops the assumed coverage by an order of magnitude – to perhaps half to three-quarters of a million people.
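
For what it’s worth, here’s that guestimate as a quick Python sketch. It’s purely illustrative – every input is the same round guess used above, not census data:

# Back-of-the-envelope version of the estimate above. Every input is a
# round guess taken from the text, not a census figure.
total_blocks = 11_200_000           # roughly 11.2 million U.S. census blocks
rural_blocks = total_blocks / 5     # about a fifth are rural
bid_blocks = 76_000                 # blocks covered by the 181 applications

print(f"{bid_blocks / rural_blocks:.1%} of rural blocks")  # ~3.4%

# If, say, a third of rural blocks are uninhabited (an assumption, not a
# census figure), the share of inhabited rural blocks pushes toward 5%.
print(f"{bid_blocks / (rural_blocks * 2 / 3):.1%} of inhabited rural blocks")

# 50 to 100 people per block, then an order-of-magnitude haircut for the
# dozen or so applications likely to be funded.
for people_per_block in (50, 100):
    proposed = bid_blocks * people_per_block
    print(f"~{proposed / 1e6:.1f}M people proposed, ~{proposed / 10 / 1e6:.2f}M funded")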

It’ll be interesting to see who actually applied and what they’re proposing to do. A large proportion of those applicants are likely to be no-hopers who figured they had nothing to lose by asking. The projects to look for are the ones that either involve truly innovative technology or business models, or are backed by companies with the scale to go large in rural areas if the experiment produces a positive result.

Google’s small business gigabit enables e-commerce for a few dollars more

An extra $30 or so gets you a commercially usable gigabit connection from Google. That’s the deal being rolled out to small businesses in a few districts of the Kansas City metro area.

The basic consumer price in Kansas City is $70 a month for a gigabit ($120 with television service). Google’s new small business package is $100 a month for the gigabit, plus another $20 if you want a static IP address ($30 for five). Google isn’t saying anything about quality of service or oversubscription levels, which means a commercial subscription might not be any better than consumer grade service. The difference is in the terms of service.
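
A quick tally of the published tiers, as a sketch using only the prices mentioned above (the variable names are mine; Google’s actual billing may differ):

# Monthly prices quoted above, in dollars (illustrative only).
consumer_gigabit = 70                    # gigabit on consumer terms
business_gigabit = 100                   # gigabit on small business terms
one_static_ip = business_gigabit + 20    # add a single static IP address
five_static_ips = business_gigabit + 30  # add a block of five

print(f"premium for commercial terms: ${business_gigabit - consumer_gigabit}/month")
print(f"one static IP: ${one_static_ip}/month; five: ${five_static_ips}/month")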

Google Fiber’s small business acceptable use policy says…

You agree not to use…the Services provided to you for any of the following purposes…

To make the Services available to anyone outside the premises to which the Services are delivered. This limitation does not prohibit you from offering Internet service to your customers and other authorized users on your premises (for example, via Wi-Fi), subject to this AUP and our Terms of Service.

To resell the Services directly or indirectly, except as explicitly approved by Google Fiber in writing.

To create substitute or related services through the use of or access to the Services (for example, to use the Services to provide web hosting services to third parties).

Compare that to the consumer policy…

You agree not to use or allow third parties to use the Services provided to you for any of the following purposes…

To operate servers for commercial purposes. However, personal, non-commercial use of servers that comply with this AUP is acceptable, including using virtual private networks (VPN) to access services in your home and using hardware or applications that include server capabilities for uses like multi-player gaming, video-conferencing, and home security.

Google is trying to draw a plain language line between consumer use (personal stuff for business or pleasure only), small business use (do what you need to do to support your own business, but don’t resell access or web services) and ISP-grade service. So far, the price seems to be scaling fairly.

The Blue Screen of Death is still deadly, but only for Microsoft

Microsoft’s cloud platform – Azure – had a stormy week, but the silver lining is that the company isn’t shying away from the necessary pain involved in transforming itself from a shrink-wrapped software hawker to a computing services provider.

The 11-hour outage was the result of a poorly done system update. At the risk of taking a cheap shot, that’s the kind of glitch that the Windows operating system – and its predecessors – have routinely experienced for more than 30 years. To a large extent, those past problems were unavoidable – information technology grew so explosively that every day was a step into the unknown. Like everyone else, Microsoft engineers had no choice but to make it up on the fly.

But Microsoft’s near-monopoly in the desktop world also played a role. There was nowhere else for system administrators to go and no choice but to plow through whatever problems came with the updates.

It’s a different world now. Cloud computing is, in most respects, a commodity service. The cost of switching from one platform to another isn’t zero, but it isn’t an insurmountable barrier, either. If Amazon and Google are up and running, Microsoft can’t afford to simply shrug its shoulders when the Blue Screen of Death stares out at millions of users all at once.

Slowly but surely, Microsoft is learning how to live in a competitive marketplace. Another encouraging sign is the decision to let Microsoft Garage – a semi-independent corporate hacker space – develop apps for any platform. If Microsoft wants to remain relevant, R&D has to respond to the demands of a competitive and heterogeneous market, not serve the demands of Windows and Office. The message seems to be getting through.

Fast track broadband projects proposed in northern California

Almost 120,000 people – 50,000 households – in 14 California counties would be reached by broadband projects reviewed by the California Emerging Technology Fund (CETF) at its rural forum in Redding last week.

The list of “projects with some current momentum” was developed with the cooperation, and in many cases active participation, of the regional broadband consortia that represent those areas. The plan going forward is for project backers, state and federal agencies and CETF to bundle together financing that will cover the typical 30% to 40% investment match required by the California Advanced Services Fund.

All the areas involved have been at least provisionally identified as qualifying for CASF grants, which cover 60% to 70% of project construction costs. Loans are also available to cover up to another 20%, to a maximum of $500,000 per project. Projects can be submitted to the California Public Utilities Commission beginning in a little over a week, on 1 December 2014.
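
To make the arithmetic concrete, here’s how the funding stack might look for a hypothetical $1 million project, using the percentages above (an illustration in Python, not a real application):

# Hypothetical CASF funding stack for a $1,000,000 project (illustrative).
project_cost = 1_000_000
grant = 0.60 * project_cost                # CASF grant covers 60% to 70%
match = project_cost - grant               # leaves a 30% to 40% match
loan = min(0.20 * project_cost, 500_000)   # CASF loan: up to 20%, $500k cap
cash_needed = match - loan                 # what backers still have to raise

print(f"grant ${grant:,.0f}, match ${match:,.0f}, "
      f"loan ${loan:,.0f}, remaining cash ${cash_needed:,.0f}")
# grant $600,000, match $400,000, loan $200,000, remaining cash $200,000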

One of the projects – dubbed Digital 299 after the state highway it would follow – is aimed at bringing inexpensive middle mile connectivity to Trinity County. The route notably includes Weaverville, which was pinpointed as a potential hot spot for data center development due to the low cost of electricity there. It also ranked high on measures of social impact and business potential in an analysis I presented at the conference. It’s similar in concept to the Digital 395 fiber network that is now up and running from Reno, down the eastern side of the Sierra Nevada to Barstow.

How much of the net neutrality job will go to state regulators?

Whether or not the FCC decides to regulate broadband service as a common carrier utility, new net neutrality rules will be imposed, successfully or not. State utility regulators from across the country met in San Francisco this week, and yesterday afternoon the California Public Utilities Commission hosted a conference on Internet regulation – net neutrality in particular – chaired by commissioner Catherine Sandoval.

A national panel of economics and law professors discussed where state regulators fit in. Not surprisingly, they didn’t agree.

Early on, mobile telecoms regulation was moved nearly completely to the federal level, resulting in “massive investment”, said Charles Davidson from the New York Law School. The same thing happened with broadband and “the result has been massive investment in broadband networks” along with deployment to most parts of the country. “Congress did not contemplate an active state role vis-à-vis broadband”, he concluded.

Mobile telecoms regulation is indeed a good model to follow, said Jim Tuthill from the U.C. Berkeley School of Law. But the lesson there is that mobile telecoms already fall under common carrier rules. “There’s an important role for the states under the regulatory regime of mobile services. Title II [common carrier] regulation of cellular hasn’t hurt investment, hasn’t hurt innovation” he said. “It worked for wireless, it’ll work for broadband”.

The premise for a higher level of state and federal broadband regulation is wrong, according to Michael Katz, from the U.C. Berkeley Economics Department. The debate over network neutrality is primarily a debate over whether some companies should be able to pay for Internet fast lanes to reach customers.

“Paying for better service does not, in and of itself, constitute price discrimination”, Katz said, pointing out that telling General Motors that it could only sell Cadillacs would hurt consumers. “It’s a good thing when different firms that are competing to serve customers go out and buy more expensive inputs to better serve customers”, he said. The answer, according to Katz, is not to try to create rules that will force companies to do or not do certain things, but use existing anti-trust law on a case by case basis to swat down anti-competitive behavior when it occurs.

The FCC’s original approach, using general authority to promote broadband under another section – 706, if you’re keeping score – of federal telecoms law is the better course, according to Christopher S. Yoo from the University of Pennsylvania Law School. Otherwise, big telecoms companies will have a huge advantage in a common carrier game, because “you have to have an army of lawyers to do it”. The original approach – assuming the commission can thread its way through court rulings – works because “network neutrality is a subsidy for the edge”, which incentivises companies to develop services that will attract new users to the Internet. “The rationale is that anything that promotes usage is within the power of 706”, which puts it under state authority, Yoo said. The role of states, he said, is pushing efforts such as broadband mapping studies and data collection, implementing universal service, promoting education and usage as envisaged in the national and state broadband plans and reducing switching costs for consumers.

It was a good discussion of the different regulatory tools in states’ broadband development kit. And the more options, the better, as Sunne Wright McPeak, CEO of the California Emerging Technology Fund, told the group.

“The solution is not a silver bullet, but silver shotgun pellets”, she said.

Mobile broadband divide detailed at California Broadband Council

Ken Biba, from Novarum Inc., briefed California Broadband Council members yesterday on the results of mobile broadband testing conducted by the California Public Utilities Commission. He reiterated conclusions previously published regarding the mobile broadband divide between rural and urban areas in California.

“It’s a one carrier state and it’s Verizon”, Biba said. Although AT&T has built out into rural areas, too, its service isn’t as available or well performing. As for the rest, “I can’t advise anyone to get a Sprint phone or a T-Mobile phone because you’re not going to get service”, he said.

There’s a wide variation in service, ranging from the best measurements on Verizon’s network along the Mexican border, to great gaping holes in the Sierra and along the northern coast. Where you are and who your carrier is determine the quality of your mobile broadband service, as do the device you’re using and the websites you’re frequenting. But as for the commonly heard claim that time of day determines mobile broadband performance, “it’s largely bullshit”, Biba said.

The data also points to the need for more fiber – and more access to existing fiber – particularly in rural areas. Once a mobile broadband connection hits a rural tower, it slows down. “There’s a 40% latency penalty for rural users over urban users”, Biba said, emphasising that this finding was preliminary and more research would be done to confirm (or rebut) it. Even so, “I think it’s real”.

In what was effectively a lame duck session, the council also heard telecoms regulators from New York and the Virgin Islands talk about the challenges they face, and closed out the session by thanking everyone, and particularly CPUC staff, for past years of support. Neither Alex Padilla nor Steven Bradford – the council’s outgoing state senate and assembly representatives respectively – attended. It’s also the last meeting for council chair – and CPUC president – Michael Peevey. Of the principal members (the others are staff representatives from various state agencies) only Sunne Wright McPeak, president of the California Emerging Technology Fund, will return next year. The council wrapped up the meeting by electing Carlos Ramos, head of the California Department of Technology and the state’s CIO, as the new chair, beginning in January.
