CPUC tells FCC not to confuse copper networks with telecoms service

Don’t confuse copper wireline infrastructure with the services it supports. That’s the message from the California Public Utilities Commission to the Federal Communications Commission. In comments regarding possible changes to federal wireless and wireline telecoms regulations, the CPUC said that the "FCC’s assumption that copper has outlived its usefulness is overstated"…

Copper technology is not inherently obsolete. Copper was originally used for telecommunications because it could serve as the backbone of a universal voice network: it was cheap to install, easy to use, and readily available. When the voice network expanded to provide broadband capability, new copper technologies were invented to provide data services and the internet to homes and businesses, using the existing architecture and infrastructure. Meanwhile, telecommunications carriers have gradually pushed fiber technologies further out from the core (where its capacity was well-suited to the big traffic requirements of interoffice communications), but fiber-to-the-home is not yet ubiquitous. Many carriers—especially those without a wireless affiliate—provide high-speed service to the home using either fiber or copper. For example, advances in the G.fast protocol have led to carrier strategies for serving multi-dwelling units using the existing copper loops. And some services—certain credit card readers, alarm systems, closed captioning, and emergency services, for example—still rely on copper technology. In a transitional technical environment like this one, all of these technologies—copper, fiber, wireless—should be used to their fullest. The FCC’s conflation of “fiber facilities” with “next-generation services” masks the difficulties that may arise if copper retirement is approached hastily.

That’s a position that four of the five CPUC commissioners agree with – they voted to approve these comments at their meeting on 15 June 2017, with commission president Michael Picker abstaining.

The comments also pushed back against federal preemption of California’s utility pole and right of way regulations, pointing out that the CPUC "currently has three ‘pole and conduit’ proceedings open". Under federal law, individual states can opt to regulate pole access and other telecoms policy themselves. California is one of twenty states that has done so.

Monterey Bay broadband expert group offers conduit design advice

It’s one thing to say that empty telecoms conduit – shadow conduit – should be installed anytime a street is repaved or a utility trench is dug, but that begs a question: what kind of conduit, and how should it be designed?

To answer that question, the Monterey Bay Economic Partnership and the Central Coast Broadband Consortium convened a technical expert group that included senior public works engineers, Internet service providers, underground construction contractors and manufacturers. An intense discussion at an afternoon meeting at U.C. Santa Cruz produced a draft set of shadow conduit specifications and guidance, which then circulated through several rounds of revisions.

Consensus was reached on a number of key items, including appropriate conduit size…

  1. 2-inch conduit is sufficient for multiple high capacity fiber cables using current technology (432 strands or more), and can be subdivided using inner-duct that would allow multiple service providers to share a single conduit.
  2. 4-inch conduit has even more capacity but, due to its larger size, can present design problems, for example when connecting to vaults. This size of conduit was standard when telecommunications systems depended on thick bundles of copper cables, but is not necessary for most modern fiber applications. However, 4-inch conduit should be considered for installation on bridges, railroad crossings and in other circumstances where future changes would be particularly difficult or impossible.
  3. Smaller conduit, e.g. 1.25-inch, is useful when it is not possible to install 2-inch conduit or when many separate conduits are installed. It may be preferred when conduits are expected to be used by a single service provider, rather than shared among many over time, or when it meets the needs of an anticipated project or service provider.

Other specs included vault and hand hole placement, conduit system design considerations and preferred installation locations.

The document is intended to guide shadow conduit design decisions, not dictate them. It represents the broad consensus of the expert group members, but actual designs will ultimately depend on the specific circumstances of any given project or jurisdiction.

The next subject that the MBEP/CCBC expert group plans to tackle is microtrenching. It’s a fiber installation technique that, on the one hand, reduces project costs, but on the other hand can impact street service life and maintenance costs.

MBEP/CCBC Shadow Conduit Specifications version 1.0

Mobile broadband gets faster in California, but maybe not fast enough

Mobile broadband is better in California, and improvements have been made quickly. That was one of the takeaways from a meeting of Central Coast Internet service providers and California Public Utilities Commission staff in Seaside last week. Jim Warner, a network engineer at U.C. Santa Cruz and chair of the Central Coast Broadband Consortium’s technical expert group, discussed his analysis of results from the latest round of the CPUC’s mobile broadband field testing.

“Performance has doubled in a year”, Warner said, albeit with the recognition that the statewide results do not necessarily give a good picture of what’s going on in rural areas. Although there are some aspects of the CPUC’s conclusions “that would cause a statistician’s eyebrows to twitch”, overall the assessment of mobile carrier performance, and the adjustments made to account for actual user experience, are sound, as he later explained in a note…

CPUC staff use measured speed in place of an advertised speed, since cellular carriers generally don’t specify speeds the way their wireline brethren do. One way to think about it: if the CPUC used the raw average, users could expect Internet performance that met the California target only half the time. That is not what consumers expect. Shifting the measurement down by one standard deviation raises the expectation of meeting the service level to perhaps 80 percent. Making some allowance for the availability of reliable service seems appropriate.
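To make that arithmetic concrete, here is a minimal sketch (my own illustration, not the CPUC’s actual methodology) assuming measured speeds at a location are roughly normally distributed; the sample readings are hypothetical.

```python
# Illustration only: if measured mobile speeds are roughly normally
# distributed, the raw mean is exceeded about half the time, while
# mean minus one standard deviation is exceeded about 84% of the time.
from statistics import NormalDist, mean, stdev

samples = [4.1, 6.8, 5.5, 7.2, 3.9, 6.1, 5.0, 8.3, 4.7, 6.4]  # hypothetical Mbps readings
mu, sigma = mean(samples), stdev(samples)
quoted = mu - sigma  # the downward-shifted figure described in the note

dist = NormalDist(mu, sigma)
share_above_mean = 1 - dist.cdf(mu)        # ~0.50
share_above_quoted = 1 - dist.cdf(quoted)  # ~0.84

print(f"mean = {mu:.1f} Mbps, quoted figure (mean - 1 sd) = {quoted:.1f} Mbps")
print(f"share of time users can expect at least the mean:          {share_above_mean:.0%}")
print(f"share of time users can expect at least the quoted figure: {share_above_quoted:.0%}")
```

Under the normality assumption the shifted figure is exceeded about 84 percent of the time, which is in the same neighborhood as the 80 percent cited above.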

Even so, perceived improvements could well be temporary.

“Traffic will grow to fill the pipes”, he said. “That’s the story of the Internet for the last 30 years. What looks served today will look underserved tomorrow”.

If you’re wondering how much it costs to use existing poles and conduit, it’s public information

The most difficult and costly part of any wireline broadband infrastructure project is getting cable from point A to point B. There are two primary ways of doing it: stringing it on poles or running it through buried conduit. Since the chances of getting permission to build a new pole route in California are only slightly better than the odds of getting approval to drill for oil in San Francisco Bay, your only independent alternative is to start digging, at the rate of $30 to $60 a foot or more.

But public utilities in California do not operate completely independently. That’s good news if you have the seal of approval from the California Public Utilities Commission, otherwise known as a certificate of public convenience and necessity. That piece of paper gives you the right to go to other (older) utilities, like PG&E or AT&T, and force them to let you use their poles and conduits. Up to a point, anyway. If there’s no space available, then it’s generally up to you to pay the cost of making room, which can be quite high if poles have to be replaced or new duct work installed.

Even so, the contract terms that regulated utilities impose on each other are, to a large extent, regulated and publicly disclosed. Jim Warner at UCSC has taken the trouble of hunting down several of these contracts and posting them. As he explains…

Regulated utilities with access to public right-of-way must share resources with other utilities. Rates are established in contracts that also set other terms of sharing. Underground duct rents for about $1 per foot per year. Right to attach a cable to a phone or power pole is about $5 per pole per year. I have a collection of contracts with rates.

Which is right here, although I’ve pasted the links he’s gathered to date below as well. Happy reading.

  • Sonic and AT&T, 2010
  • IP Networks and AT&T, 2010
  • Pioneer Telephone and AT&T, 2010
  • Summary of rates charged by AT&T to dozens of companies, 2011
  • Plumas Sierra Telecommunications and AT&T, 2012
  • Summary of rates charged by AT&T to dozens of companies, 2012
  • Summary of unbundled rates charged by AT&T to dozens of companies, 2012
  • Fireline Network Solutions and AT&T, 2013
  • AT&T trenching terms – who pays – 2013
  • Suddenlink and AT&T, 2013
  • Suddenlink and AT&T, 2013
  • AT&T’s stand-alone structure access agreement for poles, conduits and rights-of-way, 2013
  • PG&E’s standard overhead facilities license, 2007
  • PG&E fee schedule for wireline attachments to distribution poles, 2014
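To put the rates Warner quotes above in perspective, here is a back-of-the-envelope sketch comparing leasing existing infrastructure with digging your own trench. The ten-mile route, the 150-foot pole spacing and the rounded rates are my own hypothetical assumptions, not figures taken from the contracts listed above.

```python
# Rough comparison only; route length, pole spacing and rates are illustrative.
FEET_PER_MILE = 5280
route_miles = 10                 # hypothetical route length
route_ft = route_miles * FEET_PER_MILE

duct_lease_per_ft_yr = 1.00      # ~$1 per foot per year for underground duct
pole_attach_per_pole_yr = 5.00   # ~$5 per pole per year to attach a cable
pole_spacing_ft = 150            # assumed spacing; real spacing varies widely
new_trench_per_ft = (30, 60)     # ~$30 to $60 per foot to dig your own (one-time)

poles = route_ft // pole_spacing_ft
print(f"Lease {route_miles} miles of duct:   ${duct_lease_per_ft_yr * route_ft:,.0f} per year")
print(f"Attach to ~{poles} poles:      ${pole_attach_per_pole_yr * poles:,.0f} per year")
print(f"Dig your own trench (once): ${new_trench_per_ft[0] * route_ft:,.0f} to ${new_trench_per_ft[1] * route_ft:,.0f}")
```

Even allowing for make-ready costs on top, renting space on existing poles and in existing conduit is far cheaper than building new underground plant, which is why access to these contracts matters.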
FCC’s E-rate program trading up to WiFi and a gig

    By Jim Warner
    Network engineer, U.C. Santa Cruz
    Chair, Central Coast Broadband Consortium technical expert group

    This is arguably a badly timed note about an FCC proposal due for decision on Friday, July 11. Any opportunity to comment – and have your comments count – ended months ago.

A year ago the commission put out a Notice of Proposed Rule Making that reviewed the history of changes to the E-rate program, which provides about $2.3 billion a year in subsidies for educational uses of telecommunications services:

    Click here for the NPRM

    The big headline – when the rules come out – is that the FCC will be shifting the E-rate program to make Wi-Fi service ubiquitous in the nation’s schools. And press releases carrying that message have been put out starting on July 1. The reality is more complex than just funding a few cartons of 802.11 access points. Wi-Fi will be the user visible front face of the shiny new program. But under the hood, most of the money will go to provide back haul and switches to make the Wi-Fi do something useful.

President Obama proposed that schools should have 100 Mb/s for each increment of 1,000 students, ratcheting up to 1,000 Mb/s in five years. Other proposers recommend 1,000 Mb/s now, with a 10 Gb/s target in 2017–18. Of course, this was a proposal a year ago, and the recommendation could have shifted. The good news is that if the FCC makes any ruling on Friday, we will have K–12 and library standards for broadband services.

The E-rate program appears to be wildly successful. To be sure, there have been isolated reports of fraud, but in the main the program is credited with propelling schools from dial-up to modern broadband. One thing that has not been proposed is to give the program more funding. The FCC believes that some reallocations and rule changes can provide a jolt to both school and library broadband services. It is hard to believe, in a zero-sum world, that the program can be adjusted to have major new impacts. E-rate support for libraries, in particular, remains largely untouched territory. There is lots of work to do. So, who are the losers?

The E-rate program of 1998 permitted schools to subscribe to pager service, and there is still about $1 million a year being spent on it. A proposed change will reserve Priority E-rate funds for broadband services, so cell phone service for staff will be on the way out. It is not clear how much is spent on this, but the rules still provide that schools can have subsidized subscriptions to e-mail services that most of us now get for free. Remember CompuServe? That will probably go away, too. There will also be rule changes that might have the effect of spreading the money more broadly than it is spread today. But that won’t, by itself, expand funding. Some tinkering with competitive bidding rules has been proposed that could reduce the paperwork burden on small school districts. That could result in some real savings.

Other changes are in the wait-until-Friday category. The proposal last year asked whether it would be good to revise the rules to make it easier for school and library districts that want to build their own dark fiber networks to get more support. This would be excellent news for the recently funded Soledad fiber build in the Salinas Valley.

    CPUC needs a smart and aggressive cat

    Telling it like it is.

    Mark Ferron, a commissioner on the California Public Utilities Commission, recently – and abruptly – announced he was resigning. What had been a very private battle with prostate cancer took a turn for the worse, and he stepped down in order to focus his energy on his health and family. His resignation message is worth reading for his insight on prostate cancer alone. But Ferron also leaves his fellow commissioners with some pointed advice on winning the – also heretofore private – struggle he sees to maintain relevance.

    Ferron paints a picture of commissioners pinned by three forces: confrontational utilities, an “inexperienced” legislature and an unaccountable staff. His primary area of responsibility and expertise was energy; he discusses energy policy at length and doesn’t specifically mention telecoms at all. But the problems he highlights are common to all the different privately-owned utilities that commissioners regulate.

    I wonder whether some top managers at our utilities have the ability or the will to understand and control the far-flung and complex organizations they oversee…We at the Commission need to watch our utilities’ management and their legal and compliance advisors very, very carefully: it is clear to me that the legalistic, confrontational approach to regulation is alive and well. Their strategy is often: “we will give the Commission only what they explicitly order us to give them”. This is cat and mouse, not partnership, so we have to be one smart and aggressive cat.

    The legislature, in Ferron’s view, is similarly antagonistic…

    We also have a Legislature that by many measures is very inexperienced, and yet considers itself expert in energy policy matters. Many of the more influential members and veteran staffers seem to display an open, almost knee-jerk hostility toward the CPUC. It’s as if some Legislators (or their staff) think that their reputations will be enhanced by slapping down this Commission’s policy initiatives…The CPUC needs to do a better job of convincing the Legislature that we are not their rivals nor their enemies – but rather their partners.

Finally, Ferron voices frustration with an organisation that he and his fellow commissioners cannot manage…

We Commissioners rightly are held responsible for what happens in this building and yet we do not have any effective means to provide guidance and oversight to the CPUC’s permanent management and staff. My colleagues and I have discussed arranging ourselves similarly to the way that a Board of Directors is organized in Corporate America: we could create subcommittees dedicated to overseeing important internal issues…This arrangement could help give the Commissioners more effective senior-level oversight…and I believe would create a stronger and more effective agency. I do hope that my fellow Commissioners will act on my suggestion after I am gone.

    It’s up to governor Brown to name a replacement, who’ll have to be confirmed by the state senate.

    H/T to Jim Warner at UCSC for the link – it’s sometimes hard to keep up on California news whilst in New Zealand and I appreciate the help. Not that I’m complaining…

    Comprehensive study shows wireless radiation does not affect people

    It’s safe to take off the hat now.

    There is no scientifically valid evidence that radio waves produced by WiFi or mobile telecoms equipment harm people or make them sick. That’s the conclusion of a systematic review of 29 scientific studies that looked for connections between electromagnetic fields (EMF) and illnesses that people claim are caused by them…

    Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF) is a controversial illness in which people report symptoms that they believe are triggered by exposure to EMF. Double-blind experiments have found no association between the presence of EMF and self-reported outcomes in people with IEI-EMF…At present, there is no reliable evidence to suggest that people with IEI-EMF experience unusual physiological reactions as a result of exposure to EMF. This supports suggestions that EMF is not the main cause of their ill health.

Of the studies, 24 showed that exposure to wireless transmissions does not produce any physiological changes or ill effects in people, including those who claim to be hypersensitive to EMF. Some physiological effects were observed in the other 5 studies, but either those results have not been reproduced or the experiments were not designed to ensure that participants’ knowledge could not affect the outcome. Both are key requirements for scientific validity.

    Science notwithstanding, there are people who are convinced that their health problems are caused by technology, for example cell towers and smart meters. Many of them are tiresome fixtures at public meetings, angrily protesting anything that might lead to the construction of more wireless infrastructure and regularly lambasting elected officials who rely on science rather than ignorant and inchoate complaints.

    The review was conducted by a team of researchers from Norway, Sweden, Holland and the UK, and published in the journal Bioelectromagnetics.

    Update – in answer to requests, the title of the paper is:

“Do People With Idiopathic Environmental Intolerance Attributed to Electromagnetic Fields Display Physiological Effects When Exposed to Electromagnetic Fields? A Systematic Review of Provocation Studies”, published in the December 2011 issue of Bioelectromagnetics. The abstract is available here. The full paper is behind a paywall. If I can find a free source for it, I’ll update this post.

    Big thanks goes to Jim Warner at U.C. Santa Cruz for pointing me to the paper.

    Broadband 101 workshop in Santa Cruz looks at projects, policy

    Zach and friends.

    “Economic development is not just building a Costco or a car dealership”, said Santa Cruz County supervisor Zach Friend, closing out a three hour workshop on the basics of broadband development. “What we’re doing now is laying down a backbone for future economic development.”

About forty people attended the event last week at CruzioWorks, including supervisors, Santa Cruz mayor Hilary Bryant and local public works and IT staff from around the county. Cruzio CEO Peggy Dolgenos was the host and emcee.

    Joel Staker, network administrator for the City of Watsonville and chair of the Central Coast Broadband Consortium, talked about the big public policy picture, comparing subsidising Internet infrastructure now to the decision more than 50 years ago to build the Interstate highway system. He also gave an update on Watsonville’s municipal dark fiber project.

    I gave a presentation on what’s needed to build and upgrade infrastructure, and talked about broadband projects in Santa Cruz, Monterey and San Benito Counties, with a particular focus on the Sunesys middle mile project which is being considered for funding by the California Public Utilities Commission.

    Tammie Weigl talked about the diverse – and potentially vulnerable – network infrastructure she manages for the County of Santa Cruz. UCSC’s Jim Warner presented research he’s done regarding the problems encountered with mobile broadband service in a coastal community. Local entrepreneur and broadband advocate Larry Samuels talked about private investment opportunities and partnerships with public and community-based projects.

    The event was organised by Cruzio Internet, and co-sponsored by the Central Coast Broadband Consortium.

    Faster, cheaper fiber microtrenching gains acceptance


    Verizon’s microduct ready to be installed near Sea Ranch in Sonoma County.

    In what could lead to the first large scale urban use of fiber microtrenching in the U.S., Verizon and the City of New York have agreed to test it at 12 sites. Verizon has used microtrenching for other fiber projects, including one last year in a rural part of California.

    You can see a video of the process here. It involves sawing a narrow trench – 2 cm wide and up to 30 cm deep – into the roadway, inserting thin, flat microduct, and then sealing it back up. Because it’s relatively shallow, there’s less chance of hitting existing underground utilities. It’s a fast process too, reducing, and sometimes eliminating, street closure times.

    The New York project has an open access element to it. Depending on the width used, the skinny microduct can handle several fiber cables. The city has said other network providers can participate in the pilot project.


    California State Route 1 ready to be repaved. The Coke can is for scale and was not harmed in the making of this picture.

    Verizon already has at least one microtrenching pilot project going in California. In 2010, Caltrans approved microtrenching along ten miles of State Route 1 in Sonoma County and, as U.C. Santa Cruz network engineer Jim Warner discovered while on vacation, construction was underway last summer. He dug out the permit, which details the specs Verizon needed to meet, including cutting to a minimum depth of 10 cm and patching the slice in the road.

    If there are no complications, microtrenching can cut the cost of fiber installation by as much as two-thirds. Maintenance costs could be a little higher, because the shallower depth exposes a cable to more damage, for example during street repaving. And cutting into the surface could have more impact on road durability than drilling underneath it. Overall, it’s a lower impact process though, so assuming quality specs can be met it should be easier to get agency approval and start work.

    CPUC’s second field test building consistent picture of mobile broadband performance


    Spring up, Fall back.

    More mobile broadband performance measurements are available and accessible to Californians, thanks to field testing done by the California Public Utilities Commission and mapping and analysis done by Jim Warner at U.C. Santa Cruz.

    Warner, who is a network engineer for the University and chair of the Central Coast Broadband Consortium’s technical expert group, took the data collected in the CPUC’s first and second rounds of mobile data field testing and fed it into Google maps. The CPUC took readings at 1,200 locations in each round, for the most part repeating measurements at the same locations. AT&T, Sprint, T-Mobile and Verizon service was tested using both a smart phone and a netbook.
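For readers curious about the mechanics, here is a minimal sketch of how drive-test readings could be turned into a map overlay that Google Maps or Google Earth can display. The file name and column names are hypothetical assumptions; this is not the CPUC’s or Warner’s actual workflow.

```python
# Illustration only: convert a CSV of test locations into a simple KML file,
# one placemark per reading, for display in Google Maps / Google Earth.
import csv
from xml.sax.saxutils import escape

KML_HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
              '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>')
KML_FOOTER = '</Document></kml>'

def placemark(lat, lon, carrier, mbps):
    name = escape(f"{carrier}: {mbps} Mbps down")
    return (f"<Placemark><name>{name}</name>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>")

with open("field_tests.csv", newline="") as src, open("field_tests.kml", "w") as dst:
    dst.write(KML_HEADER)
    for row in csv.DictReader(src):  # assumed columns: lat, lon, carrier, down_mbps
        dst.write(placemark(row["lat"], row["lon"], row["carrier"], row["down_mbps"]))
    dst.write(KML_FOOTER)
```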

    “It is comforting to see that in some speed tests, Verizon results were above 40 Mbit/sec – fully six times faster than the California broadband definition,” Warner wrote. “This means that the equipment used for the tests is not coloring the results.”

Warner shows, as an example, that Verizon made reasonably consistent improvements in the Monterey Bay Area. On the other hand, to my eye there’s little difference in Verizon’s coverage in the Gold Country.

    Along I-80 in Solano County, to take another example, some of AT&T’s results are better, some worse. That supports Warner’s observation that “tests were done throughout the day at times that were not controlled for network loading by other traffic. Tests done early in the day might face less cross traffic congestion than tests later in the day.”

    In other places I checked, the variance also went both ways. That tells me that something other than new construction is the cause. But you can look for yourself and come to your own conclusions.