Cryptocurrencies like Bitcoin are different from other software and standards-based platforms. There are no governing authorities or dominant players or established industry groups. That’s deliberate. The whole point is to create a way of exchanging value that’s not centrally regulated by governments or private organisations. But that means a super-majority of the millions of individual users have to accept and adopt software updates, or else there’s the risk that Bitcoin will splinter into different versions with different values.
That’s what happened last week. The debate within the Bitcoin community over the best way to increase the capacity and speed of the underlying software resulted in a tenuous compromise earlier this year between many users with different interests. But not all of them. So on Tuesday, another big group agreed on a different method of updating the software and began running it their own way, in the process creating a new version called Bitcoin Cash.
Anyone who had one Bitcoin on Monday now has one unit of Bitcoin and one unit of Bitcoin Cash. The splinter group is big enough that the new unit of Bitcoin Cash actually has some value. It’s fluctuated wildly, but might – might – be stabilising in the $200 to $300 range. On the other hand, it’s not big enough (yet) to have hurt the value of the original Bitcoin, which topped $3,000 for the first time yesterday.
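The mechanics of that duplication are simple once you see the ledger as shared history. A toy sketch (not real Bitcoin code; the balances and transactions are hypothetical) of how a hard fork leaves every pre-fork coin on both chains:

```python
# Toy illustration of a hard fork: both chains start from a copy of the
# same ledger history, so every pre-fork balance exists on both sides.

# Shared ledger state at the moment of the fork (hypothetical balances).
pre_fork_balances = {"alice": 1.0, "bob": 2.5}

# Each side of the fork begins with an identical copy of that history...
btc_chain = dict(pre_fork_balances)
bch_chain = dict(pre_fork_balances)

# ...then diverges as new, mutually incompatible blocks are added.
btc_chain["alice"] -= 0.2   # a post-fork transaction on the Bitcoin chain
bch_chain["bob"] += 0.1     # a post-fork transaction on the Bitcoin Cash chain

# Alice's pre-fork coin now exists independently on both ledgers.
print(btc_chain["alice"], bch_chain["alice"])
```

Nothing is "split" in the sense of divided; the history is copied, which is why holders end up with a unit on each chain.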
The drama isn’t over. A cryptocurrency’s value is, from the beginning, the cumulative result of millions of freely made, individual decisions, rather than a declaration made by a central authority that’s then moderated by whatever market forces are allowed. The compromise within the original Bitcoin community hasn’t been implemented yet and could fall apart, producing even more versions of the currency. That’s the inevitable – purposeful – risk of an unregulated medium of exchange. So far, as Bitcoin holders learned today, that risk is overwhelmingly outweighed by the reward.
The disruption in cryptocurrency markets this week, when Bitcoin sorta split into two, was the result of disagreements between different interests about the technology and crowd-sourced methods used to run it. It was also inevitable and purposeful – cryptocurrencies are intended to rise and fall according to the cumulative decisions of millions – eventually, billions – of sovereign, individual users, who won’t always agree with each other.
Bitcoin’s underlying software can’t keep up with the growing number and speed of transactions between its users. The limits of the software have been a known problem for years, but the urgency of solving it has increased in the past few months as the strain on the system began to slow down transactions.
The solution is simple: upgrade the software. But sometimes simple things are supremely difficult, and so it is with Bitcoin.
It’s nothing like updating a commercial application like Excel or iTunes that’s owned by a single company – Microsoft or Apple just do it. It’s not even much like Linux or other widely used open source software that can comfortably exist with many different versions – distros – floating around. Linux might be open source, but any given installation is a closed system – so long as you’re satisfied with the way your preferred version runs on your hardware, all is well. Operationally, it doesn’t matter if the person sitting next to you uses a different distro.
But if you’re exchanging information with other people – which is what Bitcoin is all about – then everyone has to format and process the data in the same way. Email works because everyone has more or less settled on a set of open standards that are periodically updated by industry groups that include big companies, like Google and Microsoft. If enough of the major players agree then pretty much everyone else has to follow along, or risk being shut out.
The same principle applies to cryptocurrencies like Bitcoin, but there’s also an incentive not to standardise: schisms like the one we saw this week produce competing versions that, so far, have added value to the overall market and can be freely exchanged within their respective universes. By preventing consolidation into a single, monopoly platform, that balance has kept an ecosystem of independent cryptocurrencies alive.
Microsoft’s TV white space broadband initiative is many things – a worthy effort to expand Internet access, a way of squeezing more useable bandwidth out of finite radio spectrum, a call to action for rural economic development and, as Microsoft willingly acknowledges, a business opportunity.
It is also a foray into the market economics of free software. White spaces are the gaps between active television channels, which vary according to where you are in relation to whatever TV stations might be around. The proposed solution to this spectrum management problem is active management via databases run by private companies. Like Microsoft.
Or like Google. Which opened up its database to all comers four years ago. Microsoft’s answer, which is wrapped in a well-articulated but completely ordinary white paper about rural broadband access, is to offer up its intellectual property in a similar manner…
Our Rural Airband Technology Program will make our U.S. patents available under a royalty-free license to all comers, including to our competitors, for any work they undertake to stimulate broadband access through TV white spaces. These patents help tackle common problems associated with TV white spaces in a variety of ways…
Microsoft’s database-driven TV white spaces technology has continuously been improved through the use of machine learning that populates, maintains, and improves the content of the database, and cloud-based analytics to respond to database queries that, for example, leverages prior spectrum assignments for particular devices.
Google went from a Silicon Valley garage start up to being (at times) the world’s most valuable company by amassing vast quantities of data, giving away software that can make efficient use of it and then making gigabucks as the resulting traffic passed through – and made detours into – its servers.
In that context, its open access white space venture was nothing remarkable. And from that perspective, neither is Microsoft’s. Except that, well, it’s Microsoft. Welcome to the club.
Goodbye to all that.
A particularly pathological cottage industry in east Texas is coming to an end, much to the delight of high tech entrepreneurs, and they have a low tech court case to thank for it. The U.S. supreme court ruled that patent trolls can’t go shopping for the most easily bamboozled judges and juries, but instead have to file lawsuits in the home state of the companies they’re trying to shake down.
According to a story in the Hill, the decision came in a case where Kraft – decidedly not a troll – tried to sue an Indiana-based company, TC Heartland, over water flavoring technology in a Delaware-based court…
The ruling will have broad implications for patent lawsuits, which are frequently moved to certain districts that have a track record of being favorable to patent infringement claims.
In delivering the court’s opinion, Justice Clarence Thomas wrote that much of the decision hinged on the word “resides,” which the court found to mean state of incorporation. Thomas wrote that because of this interpretation, updates to the rules by Congress did not change a 1957 Supreme Court decision that had previously found that patent suits must take place in the targeted company’s home state.
Though the TC Heartland and Kraft case focused on a disagreement between whether the case should take place in Indiana or Delaware, 40 percent of all patent suits are filed in east Texas. Ninety percent are brought in by “patent trolls,” or companies that hold patents but do not manufacture or produce anything, according to the Stanford Law Journal.
The decision hits trolls and the east Texas predatory bar that serves them, but it will also impact communities there. Big corporations, such as Samsung, have focused community relations dollars on east Texas, in the hope of building friendly relations with potential future jurors. Samsung might be able to afford to waste money on such endeavors, but few others can. The court’s ruling isn’t the end of the war against patent trolls, but it is a decisive battle.
Autonomous cars will be networked cars: manufacturers will maintain constant contact with them, and make themselves and onboard data available to the cops. That’s one of the takeaways from a draft set of new rules for testing them on California’s public streets that was published by the department of motor vehicles. If – when – manufacturers get to the point that self-driving vehicles can be tested on the open road without someone on standby in the driver’s seat, or even without a steering wheel or other old school controls, then they’ll have to make sure that…
There is a communication link between the vehicle and the remote operator to provide information on the vehicle’s location and status and allow two-way communication between the remote operator and any passengers if the vehicle experiences any failures that would endanger the safety of the vehicle’s passengers or other road users, or otherwise prevent the vehicle from functioning as intended, while operating without a driver. The certification shall include:
(A) That the manufacturer will continuously monitor the status of the vehicle and the two-way communication link while the vehicle is being operated without a driver;
(B) A description of how the manufacturer will monitor the communication link; and,
(C) An explanation of how all of the vehicles tested by the manufacturer will be monitored.
The two-way communication requirement would remain even when autonomous vehicles go into actual service. Police would also have to be able to get in touch with whoever is monitoring the vehicles remotely, and have access to a required on-board data recorder.
That requirement isn’t as Big Brother-ish as it might be – the black box would only have to hang onto data for the 30 seconds before and 5 seconds after a crash. Of course, you don’t know in advance when a crash is coming, but even so there wouldn’t be a need to keep more than a minute’s worth of data at any one time. But there’s nothing preventing car makers from keeping all the data collected or particularly limiting government access to it.
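That rolling-window requirement is a classic ring buffer problem. A minimal sketch (not the DMV’s spec – the sample rate and fields here are hypothetical) of how a recorder can keep only the last 30 seconds, then tack on 5 more after a crash:

```python
from collections import deque

# Hedged sketch of a crash data recorder's rolling buffer. The DMV rule
# only requires 30 seconds before and 5 seconds after a crash; a bounded
# deque discards older samples automatically, so memory use stays flat.
PRE_CRASH_SECONDS = 30
POST_CRASH_SECONDS = 5
SAMPLES_PER_SECOND = 1  # hypothetical sensor rate

buffer = deque(maxlen=PRE_CRASH_SECONDS * SAMPLES_PER_SECOND)

def record(sample):
    """Append a sample; anything older than 30 seconds falls off the back."""
    buffer.append(sample)

# Simulate two minutes of driving: only the last 30 samples survive.
for t in range(120):
    record({"t": t, "speed_mph": 35})

# After a crash at t=119, capture 5 more seconds and freeze the result.
crash_time = 119
post = [{"t": crash_time + 1 + i, "speed_mph": 0} for i in range(POST_CRASH_SECONDS)]
snapshot = list(buffer) + post

print(len(snapshot))  # 35 samples: 30 before the crash, 5 after
```

The point of the bounded buffer is exactly the one the rule implies: the recorder never needs to hold more than about a minute of data, no matter how long the car has been driving.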
The DMV is taking comments on the draft rules, and will hold a workshop in Sacramento next week.
Apple has finally admitted that it has a self-driving car project in the works, but isn’t saying much else. It now has a permit from the California department of motor vehicles to test autonomous vehicles, which was issued, or at least posted, yesterday. According to the Wall Street Journal, its fleet consists of three Lexus SUVs which will be driven by six registered test drivers.
According to a story by Oscar Raymundo in Macworld, Apple’s business model might have shifted from making self-driving cars to developing software that’ll be offered to other manufacturers…
In 2016, however, Apple seemed to have pivoted the initiative, opting for creating just the self-driving software to license to established car-makers instead of assembling an entirely new Apple vehicle. This is a departure for Apple, which has created a legacy by developing both hardware and the software aspects of all its products.
He’s right: that would be a major strategic departure for Apple, which is why it would be a good idea not to bet the ranch that you won’t see an iCar, or whatever they’re going to call it, sometime in the future. Elon Musk expects Apple to get into the manufacturing game, and he has as much insight into what they’re doing as any outsider – in other words, no hard data but enough knowledge about the business to make an educated guess.
DMV registration carries with it an obligation to file public reports about any accidents, and to submit information once a year about any “disengagement of the autonomous mode caused by the failure of the technology or when the safe operation of the vehicle requires the test driver to take immediate manual control of the vehicle”. So we won’t have to wait too many months for a window into Apple’s development process.
In the meantime, if you’re cruising Cupertino, look for a tricked out Lexus.
If you look into the core of the Internet or just in a typical corporate or institutional data center, you’ll see rack after rack loaded with switches, routers and other gear made by Cisco. A vulnerability in even one of their products can leave a lot of networks and data open to attack. So you might come to the conclusion that spotting that kind of flaw and fixing it as quickly as possible is a matter of national security.
You’d be wrong.
It turns out that more than three hundred Cisco devices can be breached via a cracking technique used by the Central Intelligence Agency and revealed in a massive document dump by Wikileaks. Company researchers have concluded that…
- Malware exists that seems to target different types and families of Cisco devices, including multiple router and switches families.
- The malware, once installed on a Cisco device, seem to provide a range of capabilities: data collection, data exfiltration, command execution with administrative privileges (and without any logging of such commands ever been executed), HTML traffic redirection, manipulation and modification (insertion of HTML code on web pages), DNS poisoning, covert tunneling and others.
- The authors have spent a significant amount of time making sure the tools, once installed, attempt to remain hidden from detection and forensic analysis on the device itself.
- It would also seem the malware author spends a significant amount of resources on quality assurance testing – in order, it seems, to make sure that once installed the malware will not cause the device to crash or misbehave.
There’s a quick way to block it – disable telnet, an ancient and insecure communications protocol – but a permanent fix has yet to be released.
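For what it’s worth, the standard way to turn off telnet on Cisco gear is to restrict the virtual terminal lines to SSH. A hedged configuration sketch – exact syntax varies by IOS version and device, so check Cisco’s documentation before applying it:

```
! Disable telnet by allowing only SSH on the vty (remote login) lines.
! "0 4" covers the default five vty lines; some devices have more.
line vty 0 4
 transport input ssh
```

This is a workaround, not a patch – it closes the door the exploit walks through, but the underlying flaw remains until Cisco ships a fix.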
Generally, there are two ways the CIA could have obtained this exploit: either it was developed internally or it was purchased on the black market. If the former, it could have been duplicated by anyone with sufficient skill. If the latter, it means the CIA knew that broad swathes of the world’s IT infrastructure were exposed to anyone with deep enough pockets. In either case, its first duty should have been to plug the hole, and not sit on it until its own firewall was breached.
Not the latest version.
The Central Intelligence Agency’s guide to cracking is getting bad reviews from the tech community. Published earlier this week on Wikileaks, the thousands of files of internal documentation maintained by the CIA’s engineering development group are mostly openly available cookbooks and mundane advice on how not to get caught.
A story by Sean Gallagher at Ars Technica steps through some of it and concludes it amounts to an outdated “Malware 101” textbook…
It’s not clear how closely tool developers at the CIA followed the tradecraft advice in the leaked document—in part because they realized how dated some of the advice was. Back in 2013, two users of the system said so in the comments area: “A lot of the basic tradecraft suggestions on that page seem flawed,” wrote one. Another followed, “Honestly, that stuff is probably already dated…”
Four years later, some of the recommendations have become even more stale. That’s largely because of the advances made in malware detection and security tools, including those built into many operating systems. But it’s also because the tradecraft used by everyday malware authors without the benefit of state sponsorship have surpassed these sorts of tradecraft suggestions.
One of the takeaways from the Wikileaks dump should come as no surprise: the CIA is an avid collector of zero day exploits, which are bugs in applications, operating systems and hardware that the rightful owners don’t know about yet. But plenty of others will. Apparently, the CIA buys at least some of these backdoors from the grey and black marketeers that openly sell them. Even a flaw discovered by the CIA’s team isn’t exactly a secret – it’s there for the taking by anyone else with the necessary, and far from rare, skills.
Spying is the CIA’s job. But the reason for doing it is to protect the U.S. Feeding the market for malware and hoarding it instead of fixing it makes us all less secure.
As someone who regularly spends several hours a week on a bicycle, wondering if the diesel rumble of a truck coming up behind me is the last sound I’ll ever hear, I was sorely disappointed to read that help, in the form of robotic vehicles, might be a long time coming.
A story by Peter Fairley on the IEEE Spectrum blog looks at the successes that self-driving car companies have had in developing software and sensors that can recognise other cars and predict their movements, and contrasts it with the failure to do the same with bicycles…
Nuno Vasconcelos, a visual computing expert at the University of California, San Diego, says bikes pose a complex detection problem because they are relatively small, fast and heterogenous. “A car is basically a big block of stuff. A bicycle has much less mass and also there can be more variation in appearance — there are more shapes and colors and people hang stuff on them”.
The autonomous vehicle technology is already starting to appear in automated emergency braking (AEB) systems, which is great for avoiding collisions with other cars, but not so helpful for cyclists…
AEB systems still suffer from a severe limitation that points to the next grand challenge that [autonomous vehicle] developers are struggling with: predicting where moving objects will go. Squeezing more value from cyclist-AEB systems will be an especially tall order, says Olaf Op den Camp, a senior consultant at the Dutch Organization for Applied Scientific Research (TNO). Op den Camp, who led the design of Europe’s cyclist-AEB benchmarking test, says that it’s because cyclists movements are especially hard to predict.
[Computer scientist Jana] Kosecka agrees: “Bicycles are much less predictable than cars because it’s easier for them to make sudden turns or jump out of nowhere.”
It’s not completely out of our hands, though. As artificial intelligence systems slowly learn to cope with bicycles, cyclists can try to see the road as a self-driving car might see it and do their best to ride predictably. At least it’s more comforting than just hoping the guy who’s about to pass you is looking at the road and not at his smart phone.
What else does a boy need?
If you’re reading this, it’s courtesy of one of two operating systems that were born in the Rhythmless Void between the break up of the Beatles and the Great Disco Awakening: UNIX or CP/M. (Unless you are truly an uber geek and still rocking your Commodore 64 or pre-OS X Apple or something even more esoteric – I genuflect in abject admiration. Or unless you’re a masochist and you’re reading this on a Blackberry: I salute your embrace of pain and humiliation).
Microsoft Windows is a direct descendant of CP/M, although little of the original DNA is left. Pretty much everything else is within two or three degrees of consanguinity with UNIX. Mac OS, iOS, Android, Tizen and Linux all exchanged presents in their pyjamas on Christmas morning.
It’s been a long, long time (sorry, I’ll always have Linda Ronstadt on the brain – it’s a Seventies thing) since anyone wrote a new OS kernel with staying power. But Google is giving it a try. Google posted Fuchsia OS as an open source project on GitHub this past summer, and it is still under active development. It’s an operating system that’s been built from scratch, without obvious reference to the Glitter Rock era. According to a post on Linux.com by Sam Dean…
Could Google be completely reinventing the core functionality of what we consider to be an operating system? There are certainly historical precedents for that. When Google launched a beta release of Gmail in 2004, Hotmail, Yahoo! Mail, AOL Mail and other services had absolutely dominant positions in the online email space. Look what happened. Google reimagined online email. Likewise, Chrome OS reimagined the operating system with unprecedented security features and cloud-centricity.
It’s worth watching, even if it’s not strictly necessary. I’ll happily live out my years with just a stack of Linda’s 8-tracks beside me.