The rest of the trip was uneventful, at least as far as bits were concerned. (Do not get me started on transportation. Or pest infestations either.)

My new SIMs worked as expected. Thanks to a recent legal change, the ones I bought in Italy roam at no extra charge in other EU countries, which was super handy during the five-hour layover in Germany. It also means (I think) I can keep them active. I go to Europe from time to time, but not to Italy every year, which is what I’d otherwise need to do to refresh my phone service and keep the number. (Important note: Italy requires a national tax ID number to buy a SIM. I have one, but most tourists don’t.)

The downloadable encrypted disk image worked; once I got it, I was able to access the keys for my server. I was still cut off from everything tied to my US phone, but it was tolerable. Email was web-only for reading, and because of my setup I had to send messages from the mail application, so that data was still local. But there were only a few things I really had to reply to. Besides, I was supposed to be speaking Italian, not websurfing.

I have a separate laptop login, with restricted permissions, that I intended to use for general web browsing. I mostly used the phone, however. The one genuinely helpful thing was that my MiFi had more bandwidth than my phone, so I could connect to it over wifi, and VoIP calls through the VPN were less terrible.

On the flight back home, I deleted the disk image, cleared the data in all the browsers I had been using, and shut down. That logged me out of websites, with no way to regain access without my US phone. Nothing actually happened at customs, but the point of practicing one’s security plans is to be more confident that they would work (and that you can execute them) if actually needed. And, in my case, to write up what I thought about it.

The biggest surprise was the reminder that average people have no idea what two-factor auth is. People were confused about why I couldn’t log in to Facebook when I had a perfectly good phone and laptop right there. I mean, everyone is on Facebook, right? It was challenging to explain that logging in required a message sent to a device I didn’t have with me. (I think I was then deemed one of those “computer people.” Fair enough.)

The VPN, set up for always-on, worked about as well as it does in the US, so I’m happy with that. (Some websites still reject you though, boo.) I tried to use public wifi in various locations (the mall, inside train stations, etc.), but the networks were either overloaded and unresponsive or blocked my VPN connection, so I was mostly stuck with whatever signal I could find on my own.

Next time I’m going to get a plan for my phone that includes voice service. I couldn’t call taxis, and that was a pain. I was not in a big city where it’s easy to find a taxi.

Well, “fun” isn’t exactly the word that comes to mind right now. In fact, the primary purpose of this post is to serve as a test while I track down gremlins in redirecting incoming HTTP requests to HTTPS.

Since I was rebuilding the webserver from scratch anyway, it seemed like a good time to set it up with a shiny new SSL certificate. I never could quite figure that out with the old system. Plus, until fairly recently it meant spending significant cash to buy a cert from a reputable source.

But thanks to Let’s Encrypt, everybody can get one for free. It’s awesome. Random people using encryption for whatever random thing they are doing is effectively herd immunity. People who really need the protection of encryption to, say, not be murdered by their governments, no longer stand out in the crowd. And it makes it much, much harder for those trying to enact mathematically-challenged anti-encryption laws.

So this is a good thing. And I could make it easier by configuring my sites to switch incoming visitors over to HTTPS automatically. Except my webserver configuration is thwarting me: HTTP connections are rejected rather than nicely redirected. (And I don’t know enough about HTTP/HTTPS/Apache yet to even explain the problem properly. I’m working on that.)
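For the record, here’s the shape of what I’m after, as best I understand it so far. Below is a minimal sketch for Apache 2.4, with a placeholder domain and the default certificate paths used by the Let’s Encrypt client; a common cause of “rejected instead of redirected” is that nothing on port 80 is configured to answer and forward the request:

    # The plain-HTTP vhost exists only to bounce visitors over to HTTPS.
    <VirtualHost *:80>
        # example.com is a placeholder; substitute the real hostname.
        ServerName example.com
        Redirect permanent / https://example.com/
    </VirtualHost>

    # The real site, served over TLS with the Let's Encrypt certificate.
    <VirtualHost *:443>
        ServerName example.com
        SSLEngine on
        # Default locations used by the Let's Encrypt client software.
        SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
        SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
        DocumentRoot /var/www/example
    </VirtualHost>

Running apachectl configtest after each change at least catches syntax errors before a restart takes the site down.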

If you got a weird message when you tried to access the site, that’s what it was about. In the meantime, I’ll be over here buried in configuration and log files.

I did say that this blog would avoid getting into political issues and stick to practical concerns. But the events of the past week with Apple and the FBI are pretty disturbing and I want to talk about why.

First, with one exception, nothing about the technical matters involved in this conversation is anything that you or I or any other private individual can do anything about. It’s all taking place in the rarefied air of law enforcement versus public policy, among those who believe they know what is good for us and have the power to change how others are allowed to access our personal data. We can lobby our elected officials and hope somebody can get past the fear-mongering long enough to listen.

Next, there is the one technical thing you can do to protect your personal device: choose a strong passcode. I’m going to assume you already use a passcode, but the default four- or six-digit number isn’t going to stand up to a brute-force attempt to break it. Make it longer. Make it numbers and letters if you can stand to (it’s a real pain to enter, I know). Do it not because you “have something to hide” but because you will always have something you don’t want shared, and it’s not possible to know in advance what, when, or how that might come about. Make protecting your personal data a normal activity. The longer and more complicated your passcode, the more effort it takes to guess; as long as we continue to use passcodes this will be true, and the goalposts are always moving.
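To make “longer is better” concrete, here is a back-of-the-envelope sketch in Python. The 80 milliseconds per guess is an assumption borrowed from Apple’s published estimate of the key-derivation cost on iOS hardware; the exact rate matters much less than how fast the search space grows:

    # Each added character multiplies the search space. The guess rate is
    # an assumption (Apple has cited roughly 80 ms per passcode attempt).
    SECONDS_PER_GUESS = 0.08

    def human(seconds: float) -> str:
        """Render a duration in the largest sensible unit."""
        for unit, size in (("years", 31_557_600), ("days", 86_400),
                           ("hours", 3_600), ("minutes", 60)):
            if seconds >= size:
                return f"{seconds / size:,.1f} {unit}"
        return f"{seconds:.0f} seconds"

    def worst_case(alphabet: int, length: int, label: str) -> None:
        """Print the full search space and the time to exhaust it."""
        guesses = alphabet ** length
        print(f"{label}: {guesses:,} guesses, "
              f"up to {human(guesses * SECONDS_PER_GUESS)}")

    worst_case(10, 4, "4-digit PIN")              # ~13 minutes to exhaust
    worst_case(10, 6, "6-digit PIN")              # ~22 hours
    worst_case(36, 8, "8 chars, letters+digits")  # thousands of years

Which is exactly why the difference between a short PIN and a modest alphanumeric passcode is not cosmetic.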

Now, on with the real subject of this post. Get comfortable, this will take a while.

Folks who have followed this issue know that Apple (along with other companies) has routinely responded to search warrants and other official requests for customer data. From a practical standpoint, they have to. But they have also been redesigning parts of their systems to make it less feasible. (It’s important to note that recovering data from a locked device is not the same as unlocking it.) Not only is it now more difficult for people outside Apple to access private data on iOS devices, it’s also more difficult for Apple itself to do.

Discussion of the two current court cases, with detail on what is possible to recover from a locked device for various iOS versions
No, Apple Has Not Unlocked 70 iPhones For Law Enforcement

Court order requiring Apple to comply

The reasons for this are many, and one very important reason is of course to make their products more attractive to customers. Apple is in the business of selling equipment; that’s what they do. When it came out that we tinfoil hats hadn’t just been making up the stuff we suspected the NSA was snooping on (the reality far exceeded our speculations), US companies suddenly had a huge problem: international customers. Foreign organizations, businesses and governments alike, were none too keen to have the extent of US government spying on everyone confirmed in excruciating detail. If US companies want to continue to sell products, they have to be able to convince security-conscious customers that they aren’t just lapdogs for the NSA.

When somebody says “Apple is only doing this because of marketing” consider what that means. People don’t buy your product without “marketing.” Unless you have somehow managed a sweet exclusive deal that can never be taken away, your company depends on marketing for its continued existence. And your marketing and the products it promotes have to appeal to customers. All over the world, more and more people are saying “You know, I don’t much like the idea that the US Government could walk in and see my stuff.”

Strong cryptography is not just for protecting business interests. Livelihoods, and sometimes lives, also depend on the ability to keep private things private. For years people have claimed that products built in China are untrustworthy because the Chinese government can force their makers to provide a way in to protected data. It’s better to buy from trusted companies in enlightened countries where that won’t happen. Who is left on that list?

And what about terrorism? Of course, the things terrorists have done are awful. Nobody is contesting that. But opening everyone up to risk so governments can sneak up and overhear a potential terror plot doesn’t change how threats are discovered. The intelligence agencies already have more data than they are able to handle; the process is what’s broken, not that they suddenly have nothing to look at. There have been multiple cases where pronouncements of “this would never have happened without encryption” were quickly followed by the discovery that the perpetrators used basic, non-encrypted communications that were not intercepted or correctly analyzed. “Collect more data because we can” is not a rational proposal for improving the intelligence process, even if the abuse of privacy could be constitutionally justified.

There is no such thing as a magic key that only authorized users are permitted to use while all others are kept out forever. If there’s a way in, someone will find it. Nothing is perfect: a defect will eventually be found, or maybe the authorized users themselves will slip up and open the door. And state actors are hardly trustworthy when they say these powers will only be used to fight the most egregious terror threats and everybody else will be left alone. Even if they could prevent backdoors from being used without authorization, their own histories belie their claims.

The dangers of “secret” entry mechanisms protected only by a policy of not giving out the key
TSA Doesn’t Care That Its Luggage Locks Have Been Hacked

Intelligence agencies claim encryption is the reason they can’t identify terror plots, when the far larger problem is that mass surveillance generates vast quantities of data they don’t have the ability to use effectively
5 Myths Regarding the Paris Terror Attacks

Officials investigating the San Bernardino attack report the terrorists used encrypted communication, but the Senate briefing said they didn’t
Clueless Press Being Played To Suggest Encryption Played A Role In San Bernardino Attacks

What the expanded “Sneak-and-Peek” secret investigatory powers of the Patriot Act, claimed to be necessary because of terrorism, are actually being used for
Surprise! Controversial Patriot Act power now overwhelmingly used in drug investigations

TSA ordered searches of cars valet parked at airports
TSA Is Making Airport Valets Search Your Trunk

What is being asked of Apple in this case?

Not to unlock the phone, because everyone agrees that’s not technically possible. Not to provide data recoverable from the locked device by Apple in their own labs, which they could do for previous iOS versions but not now. What the court order actually says is that they must create a special version of the operating system that prevents data from being wiped after 10 incorrect passcodes, provide the means to rapidly try new passcodes in an automated fashion, and install this software on the target device (which will only accept OS updates via the Apple-authorized standard mechanism).

What would happen if Apple did this?

The government says Apple would be shown to be a fine, upstanding corporate citizen, this one single solitary special case would be “solved,” and we would all go on with our lives content to know that justice was served. Apple can even delete the software they created when they’re done. The FBI claims the contents of this employer-owned phone are required to know whether the terrorists were communicating with other terrorists in coordinated actions. No other evidence has suggested this happened, so it must be hidden on that particular phone (and not, for example, on the non-work phones that were destroyed, or in any of the data from Apple’s servers that the company did provide).

How the law enforcement community is reacting to the prospect of the FBI winning this case
FBI Says Apple Court Order Is Narrow, But Other Law Enforcers Hungry to Exploit It

Apple would, first and foremost, be compelled to spend considerable effort creating a tool to be used by the government. Not just “we’ll hack at it and see what we find,” but a testable piece of software that can be verified at a level sufficient to survive court challenges to its accuracy and reliability. Because if the FBI did find evidence it wanted to use to accuse someone else, that party’s legal team will absolutely question how the evidence was acquired. If that can’t be done, all this effort is wasted.

A discussion of the many, many requirements for building and maintaining a tool suitable for use as a source of evidence in criminal proceedings
Apple, FBI, and the Burden of Forensic Methodology

Next, the probability of this software escaping the confines of Apple’s labs is high. The third-party testing necessary to make the results admissible in court gives, at absolute minimum, anyone in physical possession of a test device the chance to reverse-engineer its contents. If the FBI has the target device, it too can give the software to its security researchers to evaluate. Many people will need access to it during the course of the investigation.

Finally, everyone in the world would know that Apple, who said they had no way to do this thing, now does. And once it exists, more people will want it. Other governments would love to have this capability, and Apple, as a global company, will be pressured to give in. What would that pressure look like? In the US, it’s standard for the government to threaten crushing fines or the imprisonment of corporate officers for defying the courts. Governments can forbid Apple from selling products in their countries or assess punitive import taxes. Any of these can destroy a company.

Non-technical people often decry security folks as eggheads crying wolf over petty concerns when there are so many more important things to discuss. That’s fine; our role as professionals includes the responsibility to educate, and to explain how what we do impacts others.

I encourage you to consider this issue for yourself and what it would mean if you were the person at the other end of that search warrant. Ubiquitous communication and data collection have fundamentally changed how much of the world lives and works, and there are plenty of governments far less principled than our own who want access to private data of mobile phone users. We should not be encouraging them by saying it’s just fine, no problem, go ahead, just hand over the (cryptographic) keys and everything will be ok.

Recently some folks with Tor, the open source project behind the global decentralized anonymizing network, released a beta version of a new chat client. It’s designed to be secure by default but usable by normal people, something that has escaped many previous efforts, so it’s a welcome development.

It encrypts messages with OTR (so only you and the person you are chatting with can read them) and sends them over the Tor network (to hide where you are on the Internet). These are very, very good things, and I’m happy to see user-friendly applications building on the excellent work Tor has been doing.

The difficulty for me is how it fits into the way I use chat, specifically that it’s impossible to save chat transcripts. While that has a benefit for the purest high-stakes security, since what doesn’t exist can’t be compromised, it is exactly the opposite of how I use chat.

It seems that many people use instant messaging only for one-off communications. I treat it like email and constantly go back to reference something I’ve sent or information I received. This is a major reason I’m still using Apple’s Messages client, because it makes searching chats trivially easy.

But while Messages lets you use a whole collection of different chat services, it doesn’t provide encryption for anything other than Apple’s own service. (Which I don’t use, for reasons too long to go into right now.) I’ve tried other clients, but haven’t been thrilled. Even without getting into whether or how they use encryption, I’ve found them clunky and, most importantly, bad for referencing old messages. The best of them, Adium, has a custom viewer only usable from inside the app, and the archived chats use a tiny fixed-size font that can’t be changed. That makes it useless for me.

Between encryption by default and use of the Tor network, I really, really want to like Tor Messenger. I dug around and, with some help from the Tor folks, figured out how to re-enable chat logs, but the results were not usable, for several reasons (a sketch of a possible workaround follows the list):

First, it creates files in JSON format, which is designed to be easily readable by computers. While it’s true that JSON contains text, it isn’t in a human-readable format by any rational definition, because it wraps everything in required formatting and other control structures that get in the way of human understanding.

Next, the file is overwritten every time the program starts. Unless you have your own way to save the contents automatically (and this is a far more difficult problem than it sounds), you lose your history anyway.

Finally, it’s located deep inside the app’s install directory. This is not a problem for me, but it would certainly be an issue for anyone not very familiar with the technical side of OS X. It also means the logs are excluded from Spotlight, Apple’s disk-searching tool.
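If I did roll my own workaround, it might look something like this Python sketch: copy the JSON log out to a timestamped, human-readable text file before each launch. To be clear, the file path and the field names here are guesses for illustration only; the real location and schema would have to be dug out of the app:

    import json
    import time
    from pathlib import Path

    # Hypothetical locations; adjust to whatever the app actually uses.
    LOG = Path.home() / "TorMessenger/Data/logs/chat.json"
    ARCHIVE_DIR = Path.home() / "chat-archives"

    def archive(log: Path, out_dir: Path) -> Path:
        """Save a readable copy before the app overwrites its log."""
        out_dir.mkdir(parents=True, exist_ok=True)
        out = out_dir / time.strftime("chat-%Y%m%d-%H%M%S.txt")
        with log.open() as src, out.open("w") as dst:
            for line in src:
                if not line.strip():
                    continue
                msg = json.loads(line)  # assumes one JSON object per line
                # Field names are guesses about what the schema contains.
                stamp = msg.get("time", "?")
                who = msg.get("who", "?")
                text = msg.get("text", "")
                dst.write(f"[{stamp}] {who}: {text}\n")
        return out

    print("Saved", archive(LOG, ARCHIVE_DIR))

Dropping the archive in my home folder would also put it back within Spotlight’s reach.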

I still have hope, both because it’s early and because it’s open source. When they are able to release the Mac build instructions, I can just go change what annoys me. (And if I’m going to choose an open source project to work on, I’m thinking I might prefer the more security-focused Tor over Adium. Sorry, Adium friends.)

But for the moment, unless I’m willing to forge onward into the wilderness of maintaining my own custom version of something, I’m still stuck choosing between secure but annoying, and insecure but fitting how I work.

Yesterday I decided to finally take a look at the iOS app E*Trade has been telling me all those wonderful things about. I’d been kind of skeptical about managing my brokerage account from my phone, but sometimes it’s nice to check on things. (Like whether I actually transferred that money from savings to cover a check.)

Other reviewers can discuss the features (which seem a little clunky and definitely overly complex) but what I wanted to investigate is how the app secures data over the network.

The App Store description says it’s all wonderful and secure and stuff, because data is stored on the server and never on the device. That’s nice. And the website is all about how secure it is. Spiffy. But how, exactly, is data protected as it goes from here to there? No comment. Not even marketing copy about “Industry-Standard 938,842-bit Encryption.”

When I started up the app, the first thing I got was a giant agreement to read and accept. It was clearly written by lawyers, because there is an entire paragraph disclaiming any and all liability for network data security. The user is responsible for ensuring the device’s connection to the Internet is reliable and secure, blah blah blah. (I tried to find a copy of this online, but haven’t yet.) As far as I can tell, they could send everything absolutely in the clear and, according to the user agreement, it would be just fine.

So I did what any self-respecting, security-aware user would do (no, not fire up Wireshark, or at least not yet): I called them up and asked.

The mobile trading support guy said, “Of course everything is encrypted.” Ok, good. I recall my question about SSL being answered with “whatever that is.” Ok, he’s not a developer. I mentioned it would be nice if the description of the app actually said something about the encryption standards used, and he agreed.
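Short of breaking out Wireshark, one thing anyone can do from the outside is ask the company’s servers what they offer. This little Python sketch checks the public website’s TLS; the hostname is just the public site, not necessarily whatever endpoint the app talks to, so it proves nothing about the app itself:

    import socket
    import ssl

    # The public website; the app may well talk to a different API host.
    HOST = "etrade.com"

    # create_default_context() verifies the certificate and hostname.
    context = ssl.create_default_context()
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Protocol:", tls.version())  # e.g. 'TLSv1.2'
            print("Cipher:  ", tls.cipher())   # (name, protocol, bits)
            cert = tls.getpeercert()
            issuer = dict(pair[0] for pair in cert["issuer"])
            print("Issuer:  ", issuer.get("organizationName", "?"))
            print("Expires: ", cert["notAfter"])

If that handshake failed or the certificate didn’t verify, it would be time for a much harder look.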

What I got out of this exercise is that E*Trade almost certainly contracted out the development of its mobile apps (which is normal) and that its customer-facing support staff doesn’t know much about the details of data protection in them (which is disconcerting). I know enough iOS developers to believe the people who built the app were probably not so careless as to ignore data security, but there was a breakdown in communication between them and the online documentation. I hope my feedback actually gets to someone who knows what SSL is.

In the meantime, if I absolutely must do something while away from my computer, I’ll turn on the VPN connection and at least keep the traffic from being sniffed over the air. And I’ll watch for an app update with a full description of how the app protects my data in transit.

I’ve been using FileVault to encrypt the drive on my travel machine for a while now, but I’ve only recently enabled it on my everyday machine. I rarely take that one anywhere, but since I bought a nice new tiny laptop, that will change. (I’ll discuss the mechanics of enabling it another time.)

Mostly this hasn’t made any difference in how I use my laptop, but here and there I run into something. (If you don’t already have your Mac configured to require a password at login, FileVault will enable that. Many people already do, and many corporate IT policies require it.)

I had to take the new machine in for repair this week, and as part of the routine intake process they asked for the login and password. Uh, no. That’s pretty normal for me: I usually wipe a machine before I hand it over, but this time I didn’t have time. If the service issue were software, this obviously wouldn’t work. But so far I’ve not needed to leave a machine overnight for anything other than broken hardware.

I dutifully informed them of the encrypted state of the drive, and they will get back to me if there is a problem. But what exactly does that mean for them? FileVault 2 is full-disk encryption (unlike the original FileVault), so when you start up the machine you immediately get a login screen. If you don’t log in, it won’t even finish booting. After replacing my logic board, the repair tech will have to attach another bootable volume, hold the Option key at startup to select where to boot from, and test the machine that way.

When I travel, I always shut down the machine before I pack it away. Not only does that mean it can’t accidentally wake up in transit (risking your hard drive if you have the old spinning kind, or your battery either way), but if anybody steals it, there is no chance of someone getting into my hard drive. If they had my password-protected laptop and it were only sleeping, it would technically be “easier” to gain access. By that I mean a skilled and determined attacker might find weaknesses in the OS or elsewhere that could be compromised to allow unauthorized access. Might. If you are being tracked by a government agency and your laptop gets taken off in a black helicopter, perhaps you have some cause for concern. The sketchy dude who lifted your MacBook Air from Starbucks? Unlikely.

Now, one thing Sketchy Dude is likely to do is open it up to see if it works. If your laptop can connect to a wireless network and you have some kind of location-tracking program enabled, you might be able to find out where Sketchy Dude is. That wouldn’t happen if the machine were shut down. (It also wouldn’t happen if he wipes the drive before connecting it to a network, which thieves who know anything about computers will do.)

I haven’t enabled Find My Mac because if someone takes off with my laptop, I’m not counting on getting it back. (It’s fully backed up, after all.) Leaving it off also means the machine isn’t constantly reporting its location, so there’s one less source of information about me in somebody’s giant database. (I do use it on my phone, but that’s a different story.)

So enabling disk encryption hasn’t changed anything for me, but it might be different for someone else. If you really hate entering a password, you aren’t going to like FileVault.

Update:

Well, I did find one thing: Safe Boot doesn’t work with FileVault (see the link in the comments). When I was having migration problems, the Apple tech recommended restarting with Safe Boot, but I couldn’t. Unfortunately, she also didn’t know that this is by design. (Fortunately, for FV2 anyway, the migration issue didn’t seem to be related to encryption.)

Resources:
Complete guide to FileVault 2 in Lion

Rich Trouton’s blog posts about FileVault 2 (for hardcore IT folks)