
108 U.S. F-35s Won’t Be Combat-Capable

1 Comment
The new F-35 program executive officer, U.S. Navy vice admiral Mat Winter, said his office is exploring the option of leaving 108 aircraft in their current state because the funds to upgrade them to the fully combat-capable configuration would threaten the Air Force’s plans to ramp up production in the...

Read the whole story
15 hours ago
Sydney, Australia
1 hour ago
Charlie Foxtrot 35.
Share this story

Sillamäe Again

1 Share

Here’s a light but crabby post for a Saturday. Fits my mood.

I’ve spent a fair bit of time in Sillamäe, Estonia, and more thinking about it. So when a publication screws up the facts, I feel a need to respond.

This time, it’s Atlas Obscura doing a remarkable job of stuffing errors into a short article.

When the article was first published, they confused ä and a, but they’ve fixed that now. Quotes from the article are in italics.

Due to the building of a uranium enrichment plant in the region, it became off-limits to anyone other than those directly linked to its operations.

One of the favorite misconceptions about Sillamäe is that it had an enrichment plant. Anyone who’s been there or cares to look at an overhead photo can see that there is no enrichment plant. What the Soviets did there was produce yellowcake from uranium ore and concentrates. Here are short and long references on that.

Even the architecture of Sillamäe is different than typical Soviet style; the buildings were built in the style of Stalinist neoclassicism.

The original buildings were. There are also Khrushchevian and Andropovian style buildings that were added as the town grew.

The main example is the town hall building that looks just like a church but has never served as one.

The town hall has always seemed to me to be more Estonian-style architecture, perhaps with a dash of Jugendstil. I’ve always wondered about that. The apartment buildings nearby are much better examples of Stalinist neoclassicism.

In the five decades of its functioning, the plant in Sillamäe refined over 100,000 tons of uranium, which was in turn used in 70,000 nuclear weapons, including the Soviet Union’s very first nuclear bomb.

I would have to go back to the longer of those two references to get the exact number, but simply dividing 100,000 tons of uranium by 70,000 nuclear weapons gives somewhat more than a ton of uranium per weapon, which is far too much. And the Soviet Union had more than one yellowcake plant. Perhaps the 100,000 tons refers to ore.
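The division is easy to sanity-check. The tonnage and warhead counts are the article's own figures; the tens-of-kilograms-per-weapon number is only a commonly cited rough order of magnitude, used here for scale:

```python
# Sanity check on the article's numbers: 100,000 tons of uranium
# spread across 70,000 weapons.
tons_refined = 100_000
weapons = 70_000

kg_per_weapon = tons_refined * 1000 / weapons
print(kg_per_weapon)  # ~1,429 kg of uranium per weapon

# A weapon needs on the order of tens of kilograms of highly enriched
# uranium, so the article's figures are off by roughly two orders of
# magnitude -- consistent with the 100,000 tons referring to ore.
```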

The Soviet Union had 40,000 nuclear weapons at its peak, the US 30,000. So 70,000 is the maximum that existed in the world at one time.

The story that Sillamäe provided the uranium for the Soviet Union’s first nuclear bomb is tempting in terms of timing, but the first Soviet nuclear test was of a plutonium bomb. It’s possible that Sillamäe provided the uranium for the reactors that produced the plutonium, but it’s not documented in the West, as far as I am aware.

This extended period of radioactivity did not leave the landscape unharmed and even before the fall of the Soviet Union, the operations were winding down at the plant. 

What does this – “This extended period of radioactivity” – even mean? The environmental damage that my Estonian colleagues and I faced was a gigantic tailings pond (waste depository) on the edge of the Baltic Sea. The Estonians remediated it and converted the plant to no uncontrolled emissions in 2009.

After the collapse, it was liberated along with the rest of Estonia, and the factory was adapted to process rare metals and particles.

“It was liberated” – The Estonians liberated themselves and contributed mightily to the fall of the Soviet Union. The plant converted from yellowcake to rare earth metals and oxides in the mid-1980s. And what does “particles” mean?

the street names are often in Russian

I recall the street signs being in the Roman alphabet, but I could be wrong, or some might have been changed back as Estonia becomes more relaxed about language.

The photos are nice – I love the town hall and Mere Puiestee (Ocean Boulevard), that wide street between the apartment blocks – but they left out the best one, the monument to the atomic workers, at the top of this post.

I wrote more about my experience in Sillamäe. I recommend that article to Atlas Obscura.


"Responsible encryption" fallacies

Deputy Attorney General Rod Rosenstein gave a speech recently calling for "Responsible Encryption" (aka. "Crypto Backdoors"). It's full of dangerous ideas that need to be debunked.

The importance of law enforcement

The first third of the speech talks about the importance of law enforcement, as if it's the only thing standing between us and chaos. It cites the 2016 Mirai attacks as an example of the chaos that will only get worse without stricter law enforcement.

But the Mirai case demonstrated the opposite: how law enforcement is not needed. They made no arrests in the case. A year later, they still haven't a clue who did it.

Conversely, we technologists have fixed the major infrastructure issues. Specifically, those affected by the DNS outage have moved to multiple DNS providers, including high-capacity providers like Google and Amazon that can handle such large attacks easily.

In other words, we the people fixed the major Mirai problem, and law-enforcement didn't.

Moreover, instead of being a solution to cyber threats, law enforcement has become a threat itself. The DNC likely didn't have the FBI investigate the attacks from Russia because they didn't want the FBI reading all their files and finding wrongdoing by the DNC. It's not that they did anything actually wrong; it's more like that famous quote from Richelieu: "Give me six words written by the most honest of men and I'll find something to hang him by." Give all your internal emails over to the FBI and I'm certain they'll find something to hang you by, if they want.

Or consider the case of Andrew Auernheimer. He found AT&T's website made public user accounts of the first iPad, so he copied some down and posted them to a news site. AT&T had denied the problem, so making the problem public was the only way to force them to fix it. Such access to the website was legal, because AT&T had made the data public. However, prosecutors disagreed. In order to protect the powerful, they twisted and perverted the law to put Auernheimer in jail.

It's not that law enforcement is bad, it's that it's not the unalloyed good Rosenstein imagines. When law enforcement becomes the thing Rosenstein describes, it means we live in a police state.

Where law enforcement can't go

Rosenstein repeats the frequent claim in the encryption debate:
Our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection
Of course our society has places "impervious to detection", protected by both legal and natural barriers.

An example of a legal barrier is how spouses can't be forced to testify against each other. This barrier is impervious.

A better example, though, is how so much of government, intelligence, the military, and law enforcement itself is impervious. If prosecutors could gather evidence everywhere, then why isn't Rosenstein prosecuting those guilty of CIA torture?

Oh, you say, government is a special exception. If that were the case, then why did Rosenstein dedicate a precious third of his speech to discussing the "rule of law" and how it applies to everyone, "protecting people from abuse by the government"? It obviously doesn't: there's one rule for the government and a different rule for the people, and the rule for the government means there are lots of places law enforcement can't go to gather evidence.

Likewise, the crypto backdoor Rosenstein is demanding for citizens doesn't apply to the President, Congress, the NSA, the Army, or Rosenstein himself.

Then there are the natural barriers. The police can't read your mind. They can only get the evidence that is there, like partial fingerprints, which are far less reliable than full fingerprints. They can't go backwards in time.

I mention this because encryption is a natural barrier. It's their job to overcome this barrier if they can, to crack crypto and so forth. It's not our job to do it for them.

It's like the camera that increasingly comes with TVs for video conferencing, or the microphone on Alexa-style devices that are always listening. These suddenly create evidence that the police want our help in gathering, such as having the camera turned on all the time, recording to disk, in case the police later get a warrant to peer backward in time at what happened in our living rooms. The "nothing is impervious" argument applies here as well, and it's equally bogus here. By declining to record our own activities for the police, we aren't somehow breaking some long-standing tradition.

And this is the scary part. It's not that we are breaking some ancient tradition that there's no place the police can't go (with a warrant). Instead, crypto backdoors break the tradition that never before have I been forced to help the police eavesdrop on me, even before I'm a suspect, even before any crime has been committed. Sure, laws like CALEA force the phone companies to help the police against wrongdoers -- but here Rosenstein is insisting I help the police against myself.

Balance between privacy and public safety

Rosenstein repeats the frequent claim that encryption upsets the balance between privacy/safety:
Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety.
This is laughable, because technology has swung the balance alarmingly in favor of law enforcement. Far from "Going Dark" as his side claims, the problem we are confronted with is "Going Light", where the police state monitors our every action.

You are surrounded by recording devices. If you walk down the street in town, outdoor surveillance cameras feed police facial recognition systems. If you drive, automated license plate readers can track your route. If you make a phone call or use a credit card, the police get a record of the transaction. If you stay in a hotel, they demand your ID, for law enforcement purposes.

And that's their stuff, which is nothing compared to your stuff. You are never far from a recording device you own, such as your mobile phone, TV, Alexa/Siri/OkGoogle device, laptop. Modern cars from the last few years increasingly have always-on cell connections and data recorders that record your every action (and location).

Even if you hike out into the country, when you get back, the FBI can subpoena your GPS device to track down your hidden weapons cache, or grab the photos from your camera.

And this is all offline. So much of what we do is now online. Of the photographs you own, fewer than 1% are printed out, the rest are on your computer or backed up to the cloud.

Your phone is also a GPS recorder of your exact position all the time, which, if the government wins the Carpenter case, the police can grab without a warrant. Tagging all citizens with a recording device of their position is not "balance" but the premise for a novel more dystopian than 1984.

If suspected of a crime, which would you rather the police searched? Your person, houses, papers, and physical effects? Or your mobile phone, computer, email, and online/cloud accounts?

The balance of privacy and safety has swung so far in favor of law enforcement that rather than debating whether they should have crypto backdoors, we should be debating how to add more privacy protections.

"But it's not conclusive"

Rosenstein defends the "going light" era ("Golden Age of Surveillance") by pointing out that it's not always enough for a conviction. Nothing secures a conviction better than a person's own words admitting to the crime, captured by surveillance. This other data, while copious, often fails to convince a jury beyond a reasonable doubt.

This is nonsense. Police got along well enough before the digital age, before such widespread messaging. They solved terrorist and child abduction cases just fine in the 1980s. Sure, somebody's GPS location isn't by itself enough -- until you go there and find all the buried bodies, which leads to a conviction. "Going dark" imagines that somehow, the evidence they've been gathering for centuries is going away. It isn't. It's still here, and matches up with even more digital evidence.

Conversely, a person's own words are not as conclusive as you think. There's always missing context. We quickly get back to the Richelieu "six words" problem, where captured communications are twisted to convict people, with defense lawyers trying to untwist them.

Rosenstein's claim may be true, that a lot of criminals will go free because the other electronic data isn't convincing enough. But I'd need to see that claim backed up with hard studies, not thrown out for emotional impact.

Terrorists and child molesters

You can always tell the lack of seriousness of law enforcement when they bring up terrorists and child molesters.

To be fair, sometimes we do need to talk about terrorists. There are things unique to terrorism where we may need to give government explicit powers to address those unique concerns. For example, the NSA buys mobile phone 0day exploits in order to hack terrorist leaders in tribal areas. This is a good thing.

But when terrorists use encryption the same way everyone else does, then it's not a unique reason to sacrifice our freedoms to give the police extra powers. Either it's a good idea for all crimes or no crimes -- there's nothing particular about terrorism that makes it an exceptional crime. Dead people are dead. Any rational view of the problem relegates terrorism to being a minor problem. More citizens have died since September 11, 2001 from their own furniture than from terrorism. According to studies, the hot water from your tap is more of a threat to you than terrorists.

Yes, government should do what it can to protect us from terrorists, but no, the threat is not so bad that it requires the imposition of a military/police state. When people use terrorism to justify their actions, it's because they're trying to form a military/police state.

A similar argument works with child porn. Here's the thing: the pervs aren't exchanging child porn using the services Rosenstein wants to backdoor, like Apple's Facetime or Facebook's WhatsApp. Instead, they are exchanging child porn using custom services they build themselves.

Again, I'm (mostly) on the side of the FBI. I support their idea of buying 0day exploits in order to hack the web browsers of visitors to the secret "PlayPen" site. This is something narrow to this problem and doesn't endanger the innocent. On the other hand, their calls for crypto backdoors endanger the innocent while doing effectively nothing to address child porn.

Terrorists and child molesters are a clichéd, non-serious excuse to appeal to our emotions to give up our rights. We should not give in to such emotions.

Definition of "backdoor"

Rosenstein claims that we shouldn't call backdoors "backdoors":
No one calls any of those functions [like key recovery] a “back door.”  In fact, those capabilities are marketed and sought out by many users.
He's partly right in that we rarely refer to PGP's key escrow feature as a "backdoor".

But that's because the term "backdoor" refers less to how it's done and more to who is doing it. If I set up a recovery password with Apple, I'm the one doing it to myself, so we don't call it a backdoor. If it's the police, spies, hackers, or criminals, then we call it a "backdoor" -- even if it's identical technology.
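The "who, not how" point can be sketched in a few lines. This is a toy (the XOR "wrap" stands in for real key-wrapping and is not secure cryptography); the point is that escrowing a message key to your own backup key and escrowing it to a government key are the same operation:

```python
# Toy sketch: a message key is "wrapped" for whoever holds a recovery key.
# XOR is used purely for illustration -- NOT real cryptography.
# Whether the recovery-key holder is you or the police is a policy
# difference, not a technical one.
import secrets

def wrap_key(message_key: bytes, recovery_key: bytes) -> bytes:
    # XOR wrap: trivially reversible by anyone holding recovery_key
    return bytes(a ^ b for a, b in zip(message_key, recovery_key))

message_key = secrets.token_bytes(32)

# "Key recovery": I escrow the key to my own backup key.
my_backup_key = secrets.token_bytes(32)
escrowed_for_me = wrap_key(message_key, my_backup_key)

# "Backdoor": the very same operation, escrowed to a government key.
government_key = secrets.token_bytes(32)
escrowed_for_gov = wrap_key(message_key, government_key)

# Unwrapping is symmetric -- both holders recover the identical key.
assert wrap_key(escrowed_for_me, my_backup_key) == message_key
assert wrap_key(escrowed_for_gov, government_key) == message_key
```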

Wikipedia uses the key escrow feature of the 1990s Clipper Chip as a prime example of what everyone means by "backdoor". By "no one", Rosenstein is including Wikipedia, which is obviously incorrect.

Though in truth, it's not going to be the same technology. The needs of law enforcement are different than my personal key escrow/backup needs. In particular, there are unsolvable problems, such as a backdoor that works for the "legitimate" law enforcement in the United States but not for the "illegitimate" police states like Russia and China.

I feel for Rosenstein, because the term "backdoor" does have a pejorative connotation, which can be considered unfair. But that's like saying "murder" is a pejorative term for killing people, or "torture" is a pejorative term for torture. The bad connotation exists because we don't like government surveillance. And honestly, calling this a "government surveillance feature" is likewise pejorative -- and likewise exactly what we are talking about.


Providers

Rosenstein focuses his arguments on "providers", like Snapchat or Apple. But this isn't the question.

The question is whether a "provider" like Telegram, a Russian company beyond US law, provides this feature. Or, by extension, whether individuals should be free to install whatever software they want, regardless of provider.

Telegram is a Russian company that provides end-to-end encryption. Anybody can download their software in order to communicate so that American law enforcement can't eavesdrop. They aren't going to put in a backdoor for the U.S. If we succeed in putting backdoors in Apple and WhatsApp, all this means is that criminals are going to install Telegram.

If, for some reason, the US is able to convince all such providers (including Telegram) to install a backdoor, it still doesn't solve the problem, as users can just build their own end-to-end encryption app that has no provider. It's like email: some use major providers like GMail, others set up their own email server.
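To illustrate how little a "provider" is needed, here is a toy end-to-end scheme built only from the Python standard library. The cipher construction is a sketch for illustration, NOT secure cryptography; a real app would use an audited library such as libsodium:

```python
# Toy illustration that end-to-end encryption needs no "provider":
# two parties sharing a secret key can exchange ciphertext over any
# channel. The SHA-256 keystream below is a TOY, not a secure cipher.
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from key + nonce + counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag  # nonce | ciphertext | authentication tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message tampered with")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

shared = secrets.token_bytes(32)  # agreed out of band; no provider involved
blob = encrypt(shared, b"meet at noon")
assert decrypt(shared, blob) == b"meet at noon"
```

No server ever sees the plaintext, which is the whole point: a mandate on "providers" cannot reach software like this.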

Ultimately, this means that any law mandating "crypto backdoors" is going to target users not providers. Rosenstein tries to make a comparison with what plain-old telephone companies have to do under old laws like CALEA, but that's not what's happening here. Instead, for such rules to have any effect, they have to punish users for what they install, not providers.

This continues the argument I made above. Government backdoors are not something that forces Internet services to eavesdrop on us -- they force us to help the government spy on ourselves.

Rosenstein tries to address this by pointing out that it's still a win if major providers like Apple and Facebook are forced to add backdoors, because they are the most popular, and some terrorists/criminals won't move to alternate platforms. This is false. People with good intentions, those unfairly targeted by a police state where abuse is rampant, are the ones who will use the backdoored products. Those with bad intentions, who know they are guilty, will move to the safe products. Indeed, Telegram is already popular among terrorists because they believe American services are already all backdoored.

Rosenstein is essentially demanding the innocent get backdoored while the guilty don't. This seems backwards. This is backwards.

Apple is morally weak

The reason I'm writing this post is because Rosenstein makes a few claims that cannot be ignored. One of them is how he describes Apple's response to government insistence on weakening encryption: doing the opposite, and strengthening it. He reasons this happens because:
Of course they [Apple] do. They are in the business of selling products and making money. 
We [the DoJ] use a different measure of success. We are in the business of preventing crime and saving lives. 
He swells with self-importance. His condescending tone ennobles himself while debasing others. But this isn't how things work. He's not some white knight above the peasantry, protecting us. He's a beat cop, a civil servant, who serves us.

A better phrasing would have been:
They are in the business of giving customers what they want.
We are in the business of giving voters what they want.
Both sides are doing the same thing: giving people what they want. Yes, voters want safety, but they also want privacy. Rosenstein imagines that he's free to ignore our demands for privacy as long as he's fulfilling his duty to protect us. He has explicitly rejected what people want: "we use a different measure of success". He imagines it's his job to tell us where the balance between privacy and safety lies. That's not his job; that's our job. We, the people (and our representatives), make that decision, and it's his job to do what he's told. His measure of success is how well he fulfills our wishes, not how well he satisfies his own imagined criteria.

That's why those of us on this side of the debate doubt the good intentions of those like Rosenstein. He criticizes Apple for wanting to protect our rights and freedoms, and declares that he measures success differently.

They are willing to be vile

Rosenstein makes this argument:
Companies are willing to make accommodations when required by the government. Recent media reports suggest that a major American technology company developed a tool to suppress online posts in certain geographic areas in order to embrace a foreign government’s censorship policies. 
Let me translate this for you:
Companies are willing to acquiesce to vile requests made by police-states. Therefore, they should acquiesce to our vile police-state requests.
What Rosenstein is admitting here is that his requests are those of a police state.

Constitutional Rights

Rosenstein says:
There is no constitutional right to sell warrant-proof encryption.
Maybe. It's something the courts will have to decide. There are many 1st, 2nd, 3rd, 4th, and 5th Amendment issues here.

The reason we have the Bill of Rights is because of the abuses of the British Government. For example, they quartered troops in our homes, as a way of punishing us, and as a way of forcing us to help in our own oppression. The troops weren't there to defend us against the French, but to defend us against ourselves, to shoot us if we got out of line.

And that's what crypto backdoors do: we are forced to be agents of our own oppression. The principles enumerated by Rosenstein apply to a wide range of additional surveillance. With little change, his speech could equally argue why the constant TV video surveillance from 1984 should be made law.

Let's go back and look at Apple. It is not some base company exploiting consumers for profit. Apple doesn't have guns, they cannot make people buy their product. If Apple doesn't provide customers what they want, then customers vote with their feet, and go buy an Android phone. Apple isn't providing encryption/security in order to make a profit -- it's giving customers what they want in order to stay in business.

Conversely, if we citizens don't like what the government does, tough luck; they've got the guns to enforce their edicts. We can't easily vote with our feet and walk to another country. A "democracy" is far less democratic than capitalism. Apple is a minority, selling phones to 45% of the population, and that's fine; the minority get the phones they want. In a democracy, where citizens vote on the issue, those 45% are screwed, as the 55% impose their unwanted will onto the remainder.

That's why we have the Bill of Rights: to protect the 49% against abuse by the 51%. Regardless of whether the Supreme Court agrees it's in the current Constitution, it is the sort of right that ought to exist regardless of what the Constitution says.

Obliged to speak the truth

Here is another part of his speech that I feel cannot be ignored. We have to discuss this:
Those of us who swear to protect the rule of law have a different motivation.  We are obliged to speak the truth.
The truth is that “going dark” threatens to disable law enforcement and enable criminals and terrorists to operate with impunity.
This is not true. Sure, he's obliged to say the absolute truth, in court. He's also obliged to be truthful in general about facts in his personal life, such as not lying on his tax return (the sort of thing that can get lawyers disbarred).

But he's not obliged to tell his spouse his honest opinion whether that new outfit makes them look fat. Likewise, Rosenstein knows his opinion on public policy doesn't fall into this category. He can say with impunity that either global warming doesn't exist, or that it'll cause a biblical deluge within 5 years. Both are factually untrue, but it's not going to get him fired.

And this particular claim is also exaggerated bunk. While everyone agrees encryption makes law enforcement's job harder than with backdoors, nobody honestly believes it can "disable" law enforcement. While everyone agrees that encryption helps terrorists, nobody believes it can enable them to act with "impunity".

I feel bad here. It's a terrible thing to question your opponent's character this way. But Rosenstein made this unavoidable when he clearly, with no ambiguity, put his integrity as Deputy Attorney General on the line behind the statement that "going dark threatens to disable law enforcement and enable criminals and terrorists to operate with impunity". I feel it's a bald-faced lie, but you don't need to take my word for it. Read his own words yourself and judge his integrity.


Rosenstein's speech includes repeated references to ideas like "oath", "honor", and "duty". It reminds me of Col. Jessup's speech in the movie "A Few Good Men".

If you'll recall, it was a rousing speech: "you want me on that wall" and "you use words like honor as a punchline". Of course, since he was violating his oath and sending two privates to death row in order to avoid being held accountable, it was Jessup himself who was crapping on the concepts of "honor", "oath", and "duty".

And so is Rosenstein. He imagines himself on that wall, doing admittedly terrible things, justified by his duty to protect citizens. He imagines that it's he who is honorable, while the rest of us are not, even as he utters bald-faced lies to further his own power and authority.

We activists oppose crypto backdoors not because we lack honor, or because we are criminals, or because we support terrorists and child molesters. It's because we value privacy and fear government officials who become corrupted by power. It's not that we fear Trump becoming a dictator; it's that we fear bureaucrats at Rosenstein's level becoming drunk on authority -- which Rosenstein demonstrably has. His speech is a long train of corrupt ideas pursuing the same object of despotism -- a despotism we oppose.

In other words, we oppose crypto backdoors because they are not a tool of law enforcement, but a tool of despotism.


Equifax: Umm, actually hackers stole records of 15.2 million Brits, not 400,000


Equifax has confirmed that a recent data breach exposed a file containing 15.2 million UK personal information records.

David Bisson reports.


A Very V-E-R-Y Long Day Without Software


Over the summer, some friends at Veracode approached me and asked if I would be willing to help them with an experiment. Could I, they wanted to know, spend an entire day neither using nor leveraging any software whatsoever. They bet me that I couldn’t. I love a challenge as much as any journalist so I said “Sure. How hard could it possibly be?”

The point of this is to make business people better understand how devastating cyber thief and cyber terrorist attacks can be and how remarkably dependent we are today on software. Still, with a wee bit of creativity and ingenuity, why should the absence of a few executables slow me down?

First off, no need to boot up the laptop. Won’t be using that today. No problem. My trusty old electric typewriter is in a closet somewhere and that should allow me to write as much as I need.

For research, I have my iPhone, so I can call anyone I want and scour the Web. (Pause.) Uh-oh. The operating system and apps are clearly software. That sharply curtails my research efforts. For that matter, there also goes my plan for dictating what I write into the phone and sending it to my clients that way.

When I started my career, I would do 95 percent of my interviews in person (at the courthouse, when visiting police, interviewing a source over lunch, etc.), but that number has now flipped. With the Internet and sources around the globe to interview via Skype and regular phone calls, I do 95 percent of my interviews remotely. That means that my 2017 research options are much more limited. Uh-oh.

Maybe I take the day off of work and just have a day off. Ha, got you there, Veracode! Spending the day without software won’t be so difficult at all.

First off, given that I haven’t eaten yet, I’ll go into my kitchen and craft some breakfast. Cooking on the gas stove is out—the natural gas comes to us from a utility that uses oceans of software, and the electricity comes courtesy of another software-dependent utility. It will be a cold, room-temperature breakfast then.

A cold breakfast isn’t so bad on a hot August day in North Jersey. Just remove some oranges and kiwi from the refrigerator and put them on a plate next to a bowl of cereal with soy milk. (My college daughter is visiting, so everything has to be vegan. Trust me. The moment she’s back at school, I have an overdue appointment with our neighborhood butcher.)

Uh-oh. That electricity that I couldn’t use in the oven also powers the refrigerator. Guess I can either ignore what’s in the refrigerator/freezer or unplug it. Might as well unplug it and keep the door closed as much as possible to preserve what cold is in there for however long I can. I’ll also drive to the local dry ice merchant (which, believe it or not, is a party rental place about a 15-minute drive from here. They also do free paper shredding once a month. I guess they’re diversified) to extend the cold a wee bit longer.

I pull the refrigerator/freezer from the wall and unplug it. Wow, when my wife gets home tonight, I will be so popular. She loves it when I do these kinds of experiments.

Come to think of it, I better get a lot more dry ice because the air-conditioning also runs on electricity, which is from that same utility using software. I was going to look up the high-temperature for today to try and generate some reader sympathy, but I concluded it would only depress me more. Given the choice, I’ll opt for ignorance over suicidal feelings.

I turn the air-conditioning off. I anticipate that will only make my wife even more animated when she gets home.

First, though, let me call the party supply place and make sure that they have enough dry ice in stock. I can’t use my mobile phone, of course, but I fortunately still have a couple of analog copper-line landlines, due to podcasts and webinars I produce. They have their own electricity courtesy of the copper, so we’re good.

As I pick up the phone and hear the comforting dial tone, I envision the dry ice moments away. Regrettably, I also remember that the dial tone comes courtesy of the telco’s switching network. And, yes, that network is run these days with software. I sadly hang up. Guess I’ll have to drive over there and hope they have enough dry ice.

Jumping into my 14-year-old Toyota (yeah, writing doesn’t pay that well), I am pleased I am allowed to drive at all in this deal. If this were a new car, everything would be managed by software and I couldn’t use it at all. The point, of course, is not merely that software exists. The idea is software that could be attacked by cyber thieves or cyberterrorists. New cars today have networks and wirelessly get updates from the mothership. Hence, they would be forbidden under this evil, torturous scheme that Veracode tricked me into. (Yes, tricked. I’m wise to you, Veracode.)

But my old clunker has no car LAN and it has no way to grant access to a cyber bad guy. Fortunately, I had just taken this sedan to the local mechanic last week, and they fixed a major engine problem by connecting it to a diagnostic system.

No, don’t you dare go there. The rules of this challenge are that I couldn’t use software, not that I couldn’t use something that is only functioning because of software that was used days ago. I’m allowed to drive. Honest. Fine, be that way. I will check in with the Veracode judges and get a ruling on whether I can drive.

Wow. Veracode cruelty knows no limit. They ruled that a day without software includes doing without anything that was recently enabled by software.

I don’t give up that easily. There’s a horse farm a few miles from here that teaches horse-riding. All I have to do is walk to that farm, rent a horse and I can ride to that party supply place and get my dry ice. This will be the most expensive breakfast I have had in a long time, but that’s the price I pay for being stubborn and refusing to concede.

Out I go into the heat to walk to the horse farm. I pass by the local bank, where LED lights tell me that it’s 101 degrees. Such information I didn’t need. Technically, I should have closed my eyes and not looked as software-enabled electricity powered that sign, but my curiosity overcame my stubbornness.

When I finally reached the horse farm, I found one of the owners who was more than willing to rent me a horse for the day. That dry ice was as good as mine.

They were willing to rent me one horse for the day for $180. Done! I pulled out my credit card and was about to close the deal. The merchant had an older card swipe and POS system and it took a moment for the handshake and approval. Uh-oh. The payment card authorization used software and was absolutely susceptible to attack. I can’t use plastic today and I don’t have nearly enough cash on me.

The owner kindly pointed to an ATM across the street, where I could get the cash. Alas, that machine definitely used software, so it was forbidden to me. Oh well. I never learned to ride a horse, so I would have probably gotten killed anyway.

As I was walking back home, I realized that the only community I can think of that could pass this test would be a Pennsylvania Dutch community, where they forgo all modern conveniences. Unless companies start taking software security seriously, we’d all better brush up on our farming and pretzel-making skills.


What Would It Look Like If We Put Warnings on IoT Devices Like We Do Cigarette Packets?


A couple of years ago, I was heavily involved in analysing and reporting on the massive VTech hack, the one where millions of records were exposed including kids' names, genders, ages, photos and the relationship to parents' records which included their home address. Part of this data was collected via an IoT device called the InnoTab, which is a Wi-Fi-connected tablet designed for young kids; think Fisher Price designing an iPad... then totally screwing up the security.

Anyway, I read a piece today about VTech asking the court to drop an ongoing lawsuit that came about after the hack. In that story, the writer recalled how VTech had updated their terms and conditions after the attack in an attempt to absolve them of any future responsibility in subsequent attacks. So I gave VTech a suggestion:

Now that may have been (a bit) tongue in cheek, but it got me thinking - what would this actually look like? I mean if they're saying the product might not be safe, how would that look if they literally put it on the box? As it turns out, we know exactly how to put warnings on dangerous products down here in Australia because we've been doing it for years with cigarettes:

[Image: Australian cigarette packet health warnings]

So how would warning labels on IoT devices that have had serious security vulnerabilities look? Well VTech is the obvious place to start:

[Image: mock warning label on a VTech InnoTab box]

Would you still buy it? Exactly.

But let's not stop there because in fairness to VTech, it's not like they're the only ones to have had serious issues in their IoT toys. For example, there was CloudPets earlier this year and frankly, I think we can be a lot less "legal-speak" and a lot more honest about the real world risks of IoT devices like these:

[Image: mock warning label on a CloudPets toy]

Speaking of pets, you know what real pets love? Food. You know what they hate? When they don't get fed because the IoT feeder is down:

[Image: mock warning label on an IoT pet feeder]

Let's move on to something bigger - cars. Last year, there was a little hiccup with the Nissan LEAF when it turned out they were using the VIN of the car to pull back data and control features of it via the mobile app:

[Image: mock warning label on a Nissan LEAF]
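The underlying flaw is worth spelling out: a VIN is not a secret. The tail of a 17-character VIN is a sequential serial number, so anyone can generate plausible VINs for a model run and feed them to an API that accepts the VIN as its only credential. A minimal sketch of the enumeration side (the VIN prefix and the endpoint in the comment are invented for illustration, not Nissan's actual values):

```python
# Sketch of VIN enumeration. If an API treats the VIN as the only
# credential, candidate identifiers can simply be generated, because
# the serial suffix of a VIN is sequential. The prefix below is a
# made-up 12-character model/plant prefix.
BASE_VIN = "SJNFAAZE0U60"

def candidate_vins(start, count):
    """Yield guessable 17-character VINs by walking the 5-digit serial suffix."""
    for serial in range(start, start + count):
        yield f"{BASE_VIN}{serial:05d}"

for vin in candidate_vins(12340, 3):
    # A real attack would issue an unauthenticated request such as
    # GET https://telematics.example/api/BatteryStatus?vin=<vin>
    print(vin)
```

The point of the sketch is that no brute-forcing of a secret is involved at all; the "key space" is just a counter.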

The problem in many of these cases is that we're taking everyday consumer goods and adding internet for no apparent good reason. You know, like when you add a web server to a dishwasher which then exposes you to exactly the sorts of risks we've come to expect from web servers:

[Image: mock warning label on a connected dishwasher]

Now you may be thinking "why would you connect many of these things", and you'd be entirely correct in lamenting that. But that's not what the makers of the LIXIL Satis thought when they connected a toilet which, of course, then had a security advisory issued due to a hard-coded default PIN:

[Image: mock warning label on the LIXIL Satis toilet]

And while we're in that general region, how about taking your most intimate moments and digitising them with a connected vibrator that then records your bedroom habits? Yeah, that shit should definitely come with a warning:

[Image: mock warning label on a connected vibrator]

Welcome to the future, where pointless IoT stuff meets warning labels on everything!
