Over the past several months I have fielded quite a few complaints from friends, some from strangers, and many from family. All these complaints were about how “hard” security is. One close friend in particular says he wants to be more secure but is daunted by its complexity, and he can’t decide where to start. To address these complaints and make security seem more approachable to the average individual, I have devised what I like to call the Thirty-Day Security Challenge. This one-month security challenge will attempt to break security down one day at a time. Continue reading “3DSC: The Thirty-Day Security Challenge”
I have mentioned in my books and on this blog that I like to convince people to use encryption. More specifically, I like to persuade them to use the encryption that I use, especially for data-in-motion. There are a couple of reasons for this. First, the more of us use encryption, the more “noise” we all generate. Encrypted calls, messages, chats, and emails become the norm, none of them stands out just because it is encrypted, and any one individual’s use of encryption becomes less alerting. I also like to convince others to use the encryption that I use because it gives me a secure communication pathway with them. Individuals with whom I communicate represent a fairly significant weak point in my own security if I must revert to insecure email, voice, text, and other forms of communication with them. Finally, the more mainstream encrypted apps become, the easier it is to get others to join the fun. At this point it is not at all uncommon for one of my friends to install an encrypted app and see that several of his or her contacts are already there. Continue reading “How to Convince Friends and Influence People (To Use Encryption)”
The legislatures of New York and California have recently introduced bills with language that would ban the sale of encrypted smartphones in their states. The bills are strikingly similar in that each would require devices manufactured on or after January 1, 2017, and sold in the respective state to be capable of being decrypted by the manufacturer or “operating system provider”. Failure to comply would carry a penalty of $2,500.00 per device.
Though the architects of these bills assure us this would affect only a very small minority of the population, this is alarming to me as an individual, both for ideological, privacy-based reasons and for the inherent folly of insecurity as a feature. As a layman (read: not an attorney) I can only guess where the legal system will come down on this issue. The area in which I do feel slightly more qualified to weigh in is the potential second- and third-order effects of the passage of a law banning smartphone encryption. There are several possible outcomes of a law like this, but all of them hinge on the actions of the “operating system provider[s]”. The three possible outcomes that I imagine are listed below, but I cannot assign a reliable likelihood, either relative or absolute, to any of them.
- The first possible outcome that occurs to me is that manufacturers support the spirit of the law totally and either build backdoors into devices (probably more likely) or remove smartphone encryption entirely (probably less likely). Smartphones would no longer be available with unbreakable encryption, generally speaking. Privacy-conscious individuals would have to purchase phones like the Blackphone and have them shipped to states where such laws have not passed. I see total capitulation as unlikely for Apple, which, as of late, has staked a portion of its reputation on security and protecting consumer privacy. However, the impact of losing the ability to sell flagship devices in two of the four most populous states in the U.S. could quickly change anyone’s mind. This is perhaps the worst possible outcome; lawmakers would achieve a decisive victory over encryption that would impact consumers nationally, not just in their state(s).
- The second possible outcome is that manufacturers create “CA and NY Compliant” models of their devices. This option would be almost as bad. Anti-privacy lawmakers in other states would be emboldened and, to borrow some Red Scare lingo, the dominoes would begin to fall. Doubtlessly a few states would hold out (one imagines Wyoming and Montana, and perhaps New Hampshire becoming the last bastions of digital freedom) but the damage would be done. The message that customers in these states would send to Apple and Google is “we don’t really care about encryption”. Eventually manufacturers would probably simply revert to selling a single, backdoored or unencrypted version of these devices.
- The third option, and the one that I hope occurs, is that Apple and Google simply refuse to sell their products in these states (assuming this is a possibility, i.e. not in violation of contracts with cellular service providers or other legal impediments). The reason I hope for this outcome, should these laws pass and be deemed constitutional, is the reaction I imagine. This result would impact consumers directly, and hit them where it hurts: right in the smartphone. Their outcry would be immediate and overwhelming. Customers on the edges of these states would flock to Arizona, Nevada, and Oregon, or to New Jersey, Connecticut, and Massachusetts (and maybe Canada?), to buy the new iPhone 7s and the latest Samsung Galaxy. AT&T, Sprint, T-Mobile, and Verizon would lobby hard for the right to sell smartphones again in these states, where their businesses would take a major economic hit. Sales and management jobs at these locations would be lost, as would tax revenue in the affected states. New AT&T, Sprint, T-Mobile, and Verizon stores would spring up, ringing the terrestrial borders of these states almost overnight. As much as I enjoy imagining shuttered Verizon stores from San Diego to San Francisco, that level of business impact would probably never be felt before the law was repealed (though it might; alcohol prohibition did last thirteen years).
If these laws pass and the manufacturers stick to their proverbial guns, it will likely become another failed experiment in prohibition. Tell people they can’t have encryption? You’re likely to be met with sighs, yawns, and indifference. Tell people they can’t have smartphones? You’re much more likely to be met with torches and pitchforks. One hopes the latter impediments are of the metaphorical variety.
If you read just about any article about Wi-Fi security the question of hiding/not hiding your Wi-Fi SSID (Service Set Identifier) will almost inevitably come up. The SSID is the Wi-Fi router’s “name”, and it is what you click on when you wish to connect to that network. Most of these articles will say that hiding your SSID is counterproductive as it will make you more interesting to a hacker. In full fairness, this also includes my own writing. In both the Windows 7 and iOS editions of Your Ultimate Security Guide I recommended NOT hiding your SSID. I had some reasoning for recommending this: in my estimation it amounts to profile elevation. Like sending a Do Not Track request to a website, a hidden SSID makes you more distinctive than everyone around you.
But does hiding your Wi-Fi SSID alone really make you a more attractive target? To quote the inimitable Ulysses Everett McGill of O Brother, Where Art Thou?, “it’s a fool who looks for logic in the chambers of the human heart.” To unequivocally say that an attacker will target you just because your SSID is hidden may not tell the whole story, or may simply be dead wrong. Criminals are not known for following the same set of mental processes that guide the actions of the average, law-abiding individual. Sure, it may make you the more interesting target because you may seem like the more challenging target. But just as equally, it may not. The hacker may be looking for soft, easy targets. Or perhaps he or she is after a specific target that is not you.
I think the reason this is constantly brought up is that SSID hiding has been placed in the “security” category of features for Wi-Fi networks. I contend that this is not a security feature at all. Choosing not to broadcast your SSID is, in my opinion, merely a choice about how “noisy” you want your network to be. While hiding your SSID cannot protect you from Anonymous, it does do a few things. It can prevent your neighbors from seeing your network, and prevent kids in the waiting room at your practice from connecting to it. Again, it will absolutely not prevent a determined adversary from finding your network. There are various tools, including inSSIDer and Kismet, that will find these networks with ease.
My bottom line is this:
- Hiding your Wi-Fi network’s SSID is a personal preference that is essentially neutral as a security measure. It doesn’t necessarily make you less secure or a more attractive target, though it might based on factors that we can’t begin to model (i.e. human unpredictability).
- Hiding your SSID for security reasons is ineffective and an example of security-through-obscurity. If you are hiding your SSID as a security measure you should reconsider.
There are meaningful security measures you can take for your Wi-Fi network. The best and strongest of these is to ensure that your signal is encrypted with WPA2. The WPA2 protocol is actually very good (do not use WEP or WPA). It offers much, much more protection than silencing your Wi-Fi SSID. Another meaningful measure is to use a virtual private network; this will protect your traffic regardless of the security of your Wi-Fi. It will also protect it at a much deeper level, and provide you with a bunch of other benefits. We will delve much more deeply into Wi-Fi security in the upcoming Thirty-Day Security Challenge, so stay with me!
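To illustrate why the passphrase, rather than SSID visibility, is what actually protects a WPA2 network: WPA2-PSK derives its 256-bit key from the passphrase and the SSID using PBKDF2. Here is a minimal sketch using Python's standard library; the "password"/"IEEE" pair is the published IEEE 802.11i test vector:

```python
import hashlib

def wpa2_psk(passphrase: str, ssid: str) -> str:
    """Derive the WPA2-PSK pairwise master key (PMK).

    WPA2-PSK runs PBKDF2-HMAC-SHA1 over the passphrase, salted with
    the SSID, for 4096 iterations, producing a 256-bit key.
    """
    key = hashlib.pbkdf2_hmac(
        "sha1", passphrase.encode(), ssid.encode(), 4096, dklen=32
    )
    return key.hex()

# The SSID acts only as a salt: the same passphrase yields different
# keys on different networks.
print(wpa2_psk("password", "IEEE"))
# f42c6fc52df0ebef9ebb4b90b38a5f902e83fe1b135a70e23aed762e9710a12e
```

Whether the SSID is broadcast or hidden has no bearing on the strength of the derived key; a long, random passphrase does.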
Signal Private Messenger is a free application, and my new favorite encrypted communication solution. Signal supports both voice and instant messaging (texting) in a single app. It is incredibly easy to use, and easy to convince others to use. There is no complicated setup and no username or password to create and remember. The app is incredibly intuitive and resembles native phone and texting applications.
Signal uses your phone’s Wi-Fi or data connection. Signal has replaced the legacy RedPhone and TextSecure apps for Android and merged them into a single platform. To use Signal Private Messenger simply install the application. You will be prompted to enter your telephone number for verification. I have successfully used a Google Voice number for this, even though Signal specifically warns that GV numbers will not work. Full disclosure: I have also seen GV numbers fail. Signal registration is the ONLY purpose for which I use this Google Voice number. I have no problem with this because the number is only used as an identifier and no data is sent through Google after the initial verification message. The app will verify the number by sending you a code that you must enter into the application. No other personal information is required or requested.
If you allow Signal Private Messenger to access your contacts it will identify the ones who have Signal installed. There is one slight downside to the way Signal identifies its users: in order for others to contact you via Signal they must have the telephone number you used to register the app in their contacts. This requires that you give out this number to others with whom you wish to use Signal. For this reason I recommend setting up a Google Voice number that is used only for Signal, and giving that number out to friends, family, and business contacts who are likely to use Signal (or be persuaded to), rather than giving out your real phone number. I will post in the future about why giving out your real phone number may be a bad idea.
Signal’s interface is almost disconcertingly simple. Tapping the “+” icon in the upper right of the interface opens a list of your contacts who have Signal installed. Tapping one of these contacts will open a new message to that contact. From there you can send a text message, photo, or video, or tap the handset icon to initiate a voice call. In the search bar on this screen you may input a telephone number, which Signal will then search to see if the number has the app installed. Once a call is initiated, a more typical phone interface is displayed with some standard phone options to mute the call or use the phone’s speaker.
The call interface will also display two random words. The words displayed will change with each voice call but should match on both handsets involved in the call. These words are used to ensure the call is not being tampered with by a man-in-the-middle. If an attacker were to successfully get in the middle of a call, each phone would display different authentication words. This is because each handset would establish a key with the attacker rather than the intended recipient’s handset. I recommend ALWAYS validating these words at the beginning of each conversation made over Signal. This is especially important before engaging in sensitive communications. The messaging portion of the application is likewise incredibly simple. Messages are composed and sent like they are in any other messaging application. Attaching a file is as simple as tapping the paperclip icon beside the compose pane. Signal also supports group messaging.
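The reasoning behind comparing those words can be sketched with a toy Diffie-Hellman exchange. This is purely illustrative — textbook-sized numbers and a hash fingerprint standing in for Signal's word pairs, not the actual protocol:

```python
import hashlib

# Textbook-sized toy parameters -- real key exchanges use 2048-bit
# primes or elliptic curves.
P, G = 23, 5

def fingerprint(shared: int) -> str:
    """Short hash standing in for Signal's two authentication words."""
    return hashlib.sha256(str(shared).encode()).hexdigest()[:8]

a, b = 6, 15          # Alice's and Bob's private keys

# Honest exchange: both sides compute the same shared secret,
# so the displayed words match.
A, B = pow(G, a, P), pow(G, b, P)
assert pow(B, a, P) == pow(A, b, P)

# Man-in-the-middle: Mallory substitutes her own public key, so Alice
# and Bob each share a key with Mallory -- not with each other.
m = 13
M = pow(G, m, P)
alice_secret = pow(M, a, P)   # Alice <-> Mallory
bob_secret = pow(M, b, P)     # Mallory <-> Bob
print(fingerprint(alice_secret) == fingerprint(bob_secret))  # False
```

Because Mallory cannot force both derived secrets to match without knowing a private key, mismatched words reliably expose her presence.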
Signal is one of the best privacy-enhancing applications available (especially considering its cost) and I strongly encourage its use. Its encryption utilizes the “axolotl ratchet”, a system that provides perfect forward secrecy. Perfect forward secrecy means that each message is encrypted with a unique, ephemeral key. If one message is decrypted, it has no impact on the others, since each has a unique key.
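The ratchet idea can be shown in a few lines. This is a drastic simplification — the real axolotl/Double Ratchet design also mixes fresh Diffie-Hellman output into the chain, which this toy omits:

```python
import hashlib

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key, then advance the chain.

    Because SHA-256 is one-way, a disclosed message key cannot be
    run backward to recover the chain key or any other message key.
    """
    message_key = hashlib.sha256(chain_key + b"\x01").digest()
    next_chain_key = hashlib.sha256(chain_key + b"\x02").digest()
    return message_key, next_chain_key

chain = b"initial shared secret"
keys = []
for _ in range(3):
    mk, chain = ratchet_step(chain)
    keys.append(mk)

# Every message gets its own ephemeral key.
assert len(set(keys)) == 3
```

Because the hash only runs forward, someone who captures one message key cannot recover the chain key, and therefore cannot derive any earlier or later message keys.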
As pointed out by the grugq, however, Signal does leak a great deal of metadata about you. This includes your contact list, who you talk to, and the frequency with which you talk to them. This metadata is certainly no worse than that generated by your normal telephone conversations. It is also not any worse than that created by other encrypted messaging applications. For this reason it may not be suitable for defeating certain threat models. For encrypting your day-to-day comms that would otherwise be made through insecure means, Signal is a major upgrade. Signal is funded by donations and grants, and much of the work in developing and maintaining the app is done by volunteers.
Comparing phone operating systems for any reason is akin to discussing religion or politics at a broadly mixed table. Tensions mount concurrently with blood pressure. Alliances are formed and the room becomes divided between “us” and “them”. Capabilities are compared, and not in cold, scientific language. Awkwardness ensues when the Android vs iOS debate is hauled out. I don’t enjoy confrontation for the sake of confrontation, so I generally avoid the subject if possible. If the conversation comes up, I typically try to bow out of it gracefully. After writing Your Ultimate Security Guide: iOS and beginning research for Your Ultimate Security Guide: Android, I no longer feel comfortable continuing to give a bland, “well, each has its strengths and weaknesses.”
So, this article will be a side-by-side comparison of the Google (now Alphabet, though I will continue to call it Google throughout this post)-produced Android and Apple’s iOS operating systems where the following two factors are primary above all else: privacy and security. It will not be a generalized “Android vs iOS” discussion. It will not take into account considerations like convenience, familiarity, availability of apps, availability/diversity/choice in hardware, ability to customize, or other factors that people frequently cite when comparing the two. It will focus entirely on privacy and security. That’s it. I will address six areas of concern: each company’s general stance on privacy as evidenced by public statements and actions; data collection and monetization; device encryption and passcodes; default protection of data-in-transit; malware prevalence and susceptibility; and operating system and app integrity and updates.
TL;DR: If you don’t want to be bothered with the justification and if privacy and security are your primary concerns, buy an iPhone.
General Stance on Privacy
Ok, so this one is a little hard to quantify, but I do think it is worth considering. I may be accused of cherry-picking quotes here, and I agree – I am. But on the whole I think these two quotes fairly epitomize the philosophies of these competing companies.
Apple’s policy on this is pretty clear, per Apple’s “commitment to your privacy“, signed by CEO Tim Cook:
“Our business model is very straightforward: We sell great products. We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t ‘monetize’ the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you.”
Google’s stance on privacy is equally clear, per Eric Schmidt, Google’s longtime CEO:
“…A person has no legitimate expectation of privacy in information he voluntarily turns over to third parties.”
The bottom line: Apple is a hardware company, interested in selling you, its customer, more product. Google is an advertising, data-mining, and marketing company interested in selling you, its product, to more customers.
Data Collection and Monetization
I’m not sure that I need to explain in great detail the vastness of Google’s data collection apparatus. Google Search/Image Search/Patent Search/News, Gmail, YouTube, Google Calendar, Google Drive, Google Voice, Google Plus, Google Books, Google Docs, Google Translate, Google Chat, Google Groups, Google Hangouts, Google Sites, Google Alerts, Google Maps, Google Street View, and Google Earth are just a few of the “free” services designed to entice you to put more, and more granular and detailed, information into the Google data stream. Google also controls a number of advertising services, including AdMob, AdSense, AdWords, AdWords Express, DoubleClick, Google Grants, etc. Google is now also involved in the Internet of Things, having purchased Nest, a manufacturer of internet-connected thermostats and smoke detectors, for $3.2 billion. Why would an advertising company purchase a thermostat company for such a huge sum? Because it can record when you are at home, when you are sleeping, how active you are, and other information that it aggregates along with your other information to build a more detailed advertising package about you.

But perhaps the single most detailed collection platform in Google’s inventory is the Android-equipped phone. Your Android phone (make no mistake about it, it’s really a Google phone) can record your location, periods of wakefulness, frequency of movement, Wi-Fi connections and passwords, physical movement, correlation with other devices, and a host of other very, very detailed information that it shares with Google. Some of this can be opted out of to some extent, but it’s still a Google-powered handset, and Google put the software on the open market for a reason. Do you have privacy-related concerns about using a Chromebook? If so, you should probably re-think your Android phone as well.
Apple does allow some app developers to collect and sell data to advertisers, but at nothing even approaching the scale and scope of what Google does (again, I will point out that Apple is a hardware company and Google is an advertising company). This data collection is done in a very limited manner through an initiative called iAd, and it is possible to opt out of iAd. This is not to downplay Apple’s data collection – I still don’t like it from a privacy perspective and you shouldn’t, either. But when compared with the immensity that is Google, well, it isn’t really much of a comparison. In fairness, though, Apple does still collect a lot of data and this is not a good thing, even if it doesn’t package and sell it; large repositories of data are dangerous because they are desirable targets for hackers and governments alike. All other things being equal, though, I still prefer the company that doesn’t package and sell my data as a primary revenue stream.
Device Encryption and Passcodes
Because they are small and carried literally everywhere with us, smartphones are much more vulnerable to loss or theft than desktop computers. Encryption on smartphones is incredibly important and this is another area in which Apple excels. Apple has included device encryption by default for years, and very good encryption at that. When Apple publicly announced that devices would no longer include a backdoor allowing the company to access information on them, Google quickly followed suit with a press release saying that all Android devices would be encrypted by default. Unfortunately (and it is unfortunate – we need encryption!) Google quickly and quietly backtracked on this promise after complaints about the performance hit on encrypted devices. Android devices are still sold that are not encrypted by default. Of course, users can choose to encrypt their devices, but I greatly prefer encryption that is implemented by default and does not require user input, because we know a large percentage of users, either ignorant or uncaring, will not implement it.
One thing worth mentioning here: on mobile devices your passcode is (usually) not the same thing as the decryption key. The decryption key is tied to a unique code burned into the hardware of the device. The purpose of the passcode is to provide OS-level protection of the device and prevent unauthorized access to data on it. But since we are talking about passcodes, it is worth taking a closer look. Beginning with iOS 9 on the iPhone 6S, Apple requires a six-digit passcode as a “simple” passcode. This is a substantial security upgrade over the old iOS requirements. Additionally, I have found no upper limit on the number of characters permitted in a passcode for iOS devices (I have gone as high as 30 characters). By comparison, Android permits not only 4-character simple passcodes but also the ridiculous swipe-to-unlock patterns, and has a maximum passcode length of 17 characters. A 17-character passcode is probably plenty, but I was dismayed when I could not use the same, longer passcode I use on my iPhone to lock my privacy- and security-focused (and Android-based) Blackphone.
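The practical difference between these passcode policies is easy to put numbers on. A back-of-the-envelope sketch (it ignores retry lockouts and the hardware-backed key stretching both platforms perform, both of which help considerably):

```python
import math

def passcode_bits(alphabet_size: int, length: int) -> float:
    """Entropy in bits of a random passcode: length * log2(alphabet)."""
    return length * math.log2(alphabet_size)

print(f"4-digit PIN:          {passcode_bits(10, 4):.1f} bits")    # ~13.3
print(f"6-digit PIN:          {passcode_bits(10, 6):.1f} bits")    # ~19.9
print(f"17-char alphanumeric: {passcode_bits(62, 17):.1f} bits")   # ~101.2
print(f"30-char alphanumeric: {passcode_bits(62, 30):.1f} bits")   # ~178.6
```

Every additional 6.6 bits multiplies the guessing work by a hundred, which is why the jump from four to six digits matters and why long alphanumeric passcodes are in a different class entirely.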
Default Data-in-Transit Encryption
Apple offers very good AES-256 encryption over its iMessage messaging and FaceTime voice- and video-telephony applications. It is so good it has raised the ire of the FBI. There are plenty of reputable, free, encrypted communications platforms out there, but this is my favorite type of encryption: ubiquitous, organically integrated, and seamless. Millions of encrypted messages are being sent in cases where it is very probable that few of the senders and recipients value or even know about the underlying encryption. This is a very good thing. Additionally, in iOS 9 Apple introduced App Transport Security (ATS), a developer protocol that encourages (though doesn’t require) app developers to use HTTPS when data from apps is transmitted from the device. This, too, is a very good thing; the data constantly being transmitted by our apps reveals an amount of information about us that is hard to overstate.
Gmail, it must be said, also offers excellent security. Google permits incredibly long passwords and its two-factor authentication system is the standard by which others are judged. Your entire session with a Google product is typically HTTPS-encrypted, and encrypted within Google’s own network. All of these measures, however, are designed to protect you from everyone except Google (and it should be noted that email is only a tiny percentage of the overall data transmitted over Android handsets). Google holds the keys, and your data, no matter how secure, is scraped by Google (unless, of course, you have encrypted it yourself). Unfortunately the Android OS offers no competing (or, even more unfortunately, compatible) product to answer Apple’s iMessage and encrypt text messages by default, without requiring an additional app.
A smartphone is a computer and is subject to the same malware threats as computers. The prevalence of malware on the two platforms is not even comparable: in 2014 the Cisco annual security report estimated that an astonishing ninety-nine percent of mobile malware was targeted at Android devices, and there is little evidence to suggest this trend has changed dramatically in the intervening two years. Though Apple is not immune to malware, it still makes news when Apple products are found vulnerable to it. As an example, Zerodium recently, and very publicly, offered a $1,000,000.00 bounty for a remote jailbreak vulnerability for iOS 9. Only one team (of three possible) actually collected. Root access exploits for Android devices are far more common, and don’t make national news when they are found. As another indicator, Zerodium also publicly posted a pricing chart for remote exploits; nothing ranked higher in pricing (up to $500,000.00) than iOS; by comparison, Android exploits fetch only up to $100,000.00. Much of the malware problem with Android is due to the lack of routine, direct updates of the operating system and the inclusion of unvetted applications in the Google Play store.
OS Integrity and Updates
Much of the malware issue can be laid at the feet of operating system integrity – that is, the operating system remaining intact and being kept up-to-date. This is a major problem for Android handsets. Google released Android as an open-source project and as a result it can be freely modified. Hardware manufacturers modify the OS to suit their needs, and service providers like AT&T and Verizon modify it even further. Updating is the real issue with Android, though. When Google pushes software updates they typically don’t go directly to the device. Instead they have to work their way, again, through hardware manufacturers and service providers before reaching the end-user device.
The Apple OS, on the other hand, is designed for a particular set of hardware and is not modified. Further, and perhaps more importantly, updates are pushed from Apple directly to all handsets. This means that a significantly higher percentage of iOS devices get updated quickly. A number of articles (most citing Mixpanel statistics) highlight this trend. Within 72 hours of its release, a higher percentage of iOS users had upgraded to the latest OS version (iOS 9) than Android users had in the previous nine months (to Android 5/Lollipop). At the time of this writing approximately 75% of iOS users are running the latest version of the OS, compared to only 44% of Android users (iOS 9 has been out for under three months at this time; Lollipop has been available for more than 12). Even the brand-new Blackberry Priv, a phone marketed around privacy and security, ships with an outdated operating system.
Installing an application on a device gives it an incredible amount of privilege. Regardless of whether you use an iOS or Android device, it is only as private and secure as the applications you choose to install on it. With that being said, there is a difference between the level of trust I place in the apps I download from the App Store and from Google Play. Apple’s App Store is a so-called “walled garden”, into which only vetted applications are allowed. Curating apps in this manner prevents many potentially malicious apps from even being accessible to the user, let alone executed. This is not to say that the App Store is perfect; privacy- and security-compromising code does occasionally get through, and much to everyone’s chagrin, Apple is incredibly opaque about the vetting process for applications and what blacklisted and whitelisted criteria it looks for.
Apps for Android devices face no such scrutiny (or any at all, really, unless the app interferes with Google). Anyone can create an Android app, and anyone can download and execute any Android app from nearly any source. Couple this with an outdated OS and the potential for abuse is staggering. Because the App Store is curated, fewer apps are available to Apple users than Android users, but that argument is beyond the scope of this post. I compromise my convenience on a daily basis for the sake of privacy and security, and have no problem “restricting” myself to the 1.5 million or so apps that are in the App Store.
Before I conclude my privacy- and security-centric Android vs iOS comparison, let me make one other thing clear. I will not list my credentials to support this claim, but I am certainly not an Apple “fan boi”, though I do use an iPhone and have for a long time. With that said, it should be equally clear that I consider brand loyalty to be a fool’s errand and am brand-name agnostic. The only allegiance I have is to the brand that provides me with the right balance of privacy, security, and yes, convenience, not the brand that is (or isn’t) the one I love (or loathe). Though we are all, by nature, hesitant to change, the fear of change does not override my fear of mass surveillance. No allegiance, no loyalty, no limiting my options because I like or dislike one manufacturer over another. If you are using your iPhone just because it’s an Apple product, you’re doing it wrong. And vice-versa.
Is it possible to make an Android device very secure? Yes, it is, and the people at Silent Circle have proven it with the Blackphone. If you are a DIY-er, you can install custom versions of Android like CyanogenMod that are frequently and directly updated and generally much more secure than stock Android. Can you back up your Android phone locally without sending data to Google? Yes, but again it requires rooting the phone and using something like Titanium Backup, yet another workaround. Rooting also introduces another host of vulnerabilities that must be secured. Because I place such a high value on privacy and security, I would rather start with a more secure baseline and work upward from there, rather than starting at the bottom and hoping to get to an acceptable point.
I admit to being a holdout for TrueCrypt. I wrote about it in Your Ultimate Security Guide: Windows 7 Edition. I encouraged its use among my friends and family. I have used it myself. I have stood so strongly beside TrueCrypt for two reasons. The first is The Audit. Being independently audited is incredibly rare among encryption tools, and I placed a great deal of trust in the audit, which was only recently completed and the results of which were mostly good. There were some minor vulnerabilities but nothing to be overly concerned about, and certainly no backdoors. The other reason I held onto TrueCrypt for so long (and it pains me to admit this) was nostalgia. TrueCrypt was the gold standard for years and it had been with me through thick and thin, protecting my data on half a dozen personal laptops and across scores of international borders. Letting go of TrueCrypt felt like letting go of an old friend.
But, I didn’t hold onto it out of misplaced loyalty or nostalgia alone. The audit was huge, and until I had a good reason to believe TrueCrypt was insecure there was no reason to switch. But audits are not perfect, and now we have that reason. A new privilege escalation vulnerability was discovered in Windows versions of TrueCrypt (almost two months ago now) that allows the compromise of your full system. For this reason I am moving, and recommend moving to VeraCrypt as soon as possible.
Going back to an un-audited program feels like a huge step backward to me. I don’t think the developers have maliciously inserted a backdoor, but code is complex and getting encryption right is hard. But there is a very big silver lining. First, vulnerabilities like the one affecting TrueCrypt can be (and will be, and in this case, already have been) patched in VeraCrypt. TrueCrypt’s vulnerabilities will never be patched. Next, an audit is planned for VeraCrypt that will probably be undertaken after the program is in its next version and has added some new features. Finally, by increasing the number of key-derivation iterations from a maximum of 2,000 in TrueCrypt to as many as 500,000 in VeraCrypt, the newer program is significantly stronger against brute-force attacks. Using VeraCrypt requires almost no learning curve for anyone familiar with TrueCrypt, as the two programs are almost identical in everyday operation.
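The effect of that higher iteration count is simple to model. A rough sketch — the attacker speed and keyspace below are illustrative assumptions, not measured figures:

```python
# Each password guess forces the attacker to repeat the key-derivation
# hash `iterations` times, so cracking cost scales linearly with it.
TRUECRYPT_ITERATIONS = 2_000
VERACRYPT_ITERATIONS = 500_000

slowdown = VERACRYPT_ITERATIONS / TRUECRYPT_ITERATIONS
print(f"Each guess costs {slowdown:.0f}x more against VeraCrypt")  # 250x

# Example: an assumed attacker computing 1e9 raw hashes per second,
# exhausting a keyspace of 1e12 candidate passwords.
HASHES_PER_SECOND = 1e9
KEYSPACE = 1e12
for name, iters in [("TrueCrypt", TRUECRYPT_ITERATIONS),
                    ("VeraCrypt", VERACRYPT_ITERATIONS)]:
    seconds = KEYSPACE * iters / HASHES_PER_SECOND
    print(f"{name}: ~{seconds / 86_400:.0f} days to exhaust the keyspace")
```

The iteration count cannot substitute for a strong passphrase, but it multiplies the cost of every guess the attacker makes by the same factor.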
Unfortunately (or fortunately, depending on how you look at it), VeraCrypt and TrueCrypt volumes are incompatible. This means that if you are using volume-level encryption you will have to create a new VeraCrypt volume, mount your TrueCrypt volume, and drag your files into the new one. If you are using full-disk encryption (which you should be) this will mean fully decrypting your machine and re-encrypting with VeraCrypt. While your disk is decrypted would also be an ideal time for a clean install.
11/23/2015: Shortly after this post went up, this Ars Technica article was published indicating TrueCrypt is still safer than we thought. This is good news, but the clock is still ticking on the aging encryption application.
VeraCrypt URL and Checksums:
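Wherever you download VeraCrypt from, it is worth computing the installer’s hash yourself and comparing it against the published value. A minimal Python sketch (the filename below is a placeholder):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large installers don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the output against the checksum published for the release:
# print(sha256_of("VeraCrypt_Setup.exe"))
```

If even one byte of the download differs, the hash will differ completely.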
On this site I talk about a number of different security measures. Just as in my discussion of attacks and attackers, it is important to have a firm understanding of security measures and exactly what type of security each provides. Though many, including me, view an alarm as a serious security upgrade, it is important to realize that it does not actually make your home more difficult to get into. An alarm is merely a detective security measure; that is, it makes your home more difficult to get into undetected. There are three categories of security measures: deterring, delaying, and detective. Alternatively these categories can be thought of as “before” (deterring), “during” (delaying), and “after” (detective) security measures, based on what stage of an attack they are intended to address.
Category I: Deterring Measures. Deterrents are those security measures that play a role before the attack is even attempted (i.e. during the reconnaissance phase of an organized attack). Deterring security measures discourage the attacker from even attempting the breach by making him or her weigh your defenses against the risk of compromise and his or her own ability. Security measures in this category include motion lights, visible security cameras, signs or stickers warning of alarm systems and dogs, and routine police patrols.
Deterring security measures are difficult to quantify in the digital security realm, but they exist. A password prompt for a full-disk encrypted computer may serve as a deterrent to an attacker, as may a passcode on a smartphone.
Category II: Delaying Measures. Delaying devices are those devices that play a role during the breach attempt. Locks cannot make your home impossible to get into, but they can make the task take an unacceptably long time especially if the attack is intended to go undetected. Items in this category include locks, fences, anti-shatter window film, etc., all of which are intended to slow an attacker’s progress during the breach. In some cases delaying devices may exceed an attacker’s skill level and force him to move on to an easier target.
Delaying measures are the ones the average user primarily employs on the digital perimeter. These measures include strong encryption of data-at-rest using file-level and full-disk encryption on computers; encryption of data-in-transit using HTTPS, a VPN, and encrypted Wi-Fi; and the use of good, strong passwords.
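To put “good, strong passwords” in rough numbers: a randomly generated password’s resistance to brute force grows with the size of its character set and, much faster, with its length. A quick back-of-the-envelope sketch:

```python
import math

def entropy_bits(charset_size, length):
    """Search-space entropy in bits for a randomly generated password."""
    return length * math.log2(charset_size)

# An 8-character lowercase password vs. a 16-character password drawn
# from all 94 printable ASCII characters:
print(round(entropy_bits(26, 8), 1))   # ~37.6 bits
print(round(entropy_bits(94, 16), 1))  # ~104.9 bits
```

Note that this only holds for randomly generated passwords; human-chosen passwords contain far less entropy than their length suggests.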
Category III: Detective Measures. Detective security devices are the “after” measures, the ones that alert you that a breach is in progress, has already occurred, or has been attempted. Devices in this category include intrusion detection systems (alarms) and surveillance cameras. The presence of these types of devices may have the added benefit of serving as Category I security measures, but this is generally not their primary purpose. In addition to alerting us to the breach or breach attempt, Category III security measures can also capture images of the attacker, alert police or security, and, if overt, place severe limitations on the amount of time an attacker is willing to spend “on target”. Event logs are a good example of Category III measures in the digital world.
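As a concrete digital illustration, here is a small Python sketch that tallies failed SSH login attempts by source IP. The log lines and their format below are hypothetical (real log locations and formats vary by system), but they show the detective function an event log serves:

```python
import re

def failed_logins(lines):
    """Count failed SSH password attempts per source IP in sshd-style log lines."""
    pattern = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
    hits = {}
    for line in lines:
        match = pattern.search(line)
        if match:
            _user, ip = match.groups()
            hits[ip] = hits.get(ip, 0) + 1
    return hits

# Hypothetical example lines:
sample = [
    "Oct 2 03:11:02 host sshd[912]: Failed password for root from 203.0.113.7 port 4242 ssh2",
    "Oct 2 03:11:05 host sshd[912]: Failed password for invalid user admin from 203.0.113.7 port 4243 ssh2",
]
print(failed_logins(sample))  # {'203.0.113.7': 2}
```

Nothing here stops the attacker; like an alarm, it tells you an attempt happened so you can respond.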
There is some degree of overlap in these categories and you should understand exactly what benefits a given security measure provides when considering your perimeter. A high security lock is a good example of a security measure serving in multiple categories. The lock is certainly primarily intended as a Category II security measure. Because of the novel mechanisms and tight manufacturing tolerances common to high security locks it would be extremely difficult to pick or otherwise defeat covertly, delaying the attack and forcing the attacker to spend a great deal of time exposed during this process. This simple fact alone may also place it in Category I. An intruder who notices the lock may decide it is simply too difficult to defeat (and wonder what other security measures you have) and move on. On the other hand, if the attacker is sufficiently determined to enter your home, he may make the decision to simply kick in the door or break a window. This would place the lock indirectly into Category III, as you would immediately notice a kicked-in door or broken window and know someone had been in your home. This is the chief comfort I derive from the high security locks I use: while I fully realize that a burglar could smash a window, I know with a reasonable degree of certainty that no one (except possibly a Level IV attacker) can enter my home without my knowledge.
One of my favorite features on my iPhone is the ability to take notes. Sadly, one of my least favorite features of my iPhone is the Notes app’s inability to be encrypted or password protected, and its annoying tendency to back up to email accounts when you least expect it. Because of the lack of security inherent in the native Notes app, I began looking for a replacement several years ago and found Codebook Secure Notebook.
Codebook is a refreshingly simple app that encrypts your notes using AES-256. Codebook also has some other cool security features. It has a fairly standard Auto-Lock function that locks the app after a specified period of time ranging from one minute to one hour, and it allows you to disable Auto-Correct. Toggling the Auto-Correct slider to “off” prevents the phone’s dictionary from inspecting the contents of your notes, potentially preventing data from leaking from Codebook into the OS. This is important if you store passwords, credit card numbers, or other especially sensitive data in the application. The final security-related setting is Pasteboard: Clear on Exit, which clears your clipboard when you exit or minimize the application. This is helpful if you are copying text within Codebook, but you will want to leave it turned off if you copy text from Codebook into any other application.
Codebook does look dated (think iOS 5- or 6-ish) though, and at the time of this writing has not been updated since version 1.6.4 which was released in January of 2013. This gave me some pause when writing about the app in Your Ultimate Security Guide: iOS. Though the look of the app doesn’t really matter I had real questions about whether or not it was still being supported. The good news is that, yes, Codebook Secure Notebook is still being supported and an update is on its way very soon! I had the opportunity to TestFlight this app and I am sharing a few screenshots below.
Codebook is everything I like in an app: simple, uncluttered with superfluous features, and secure by default. I am incredibly pleased to know that Codebook will be around for the foreseeable future. I would love to see a version of Codebook for Android, as well. Codebook Secure Notebook costs $3.99 in the App Store but is money well spent.
With free upgrades to Windows 10 fully out in the wild, the migration to the new OS has been, by all accounts, a resounding success for Microsoft. Though Windows 7 will doubtless remain king of the hill for the immediate future, with 75 million downloads in the last month Win10 is making serious inroads. Though popular out of the gate, it has not been received without legitimate complaint: there are some major privacy issues with the new OS.
Express Settings: When going through the upgrade process, do NOT choose the “Express settings” option. In Express settings mode you are not allowed the opportunity to change privacy and security settings and they are set to defaults. Worse, allowing the Express settings can cause an encrypted version of your Wi-Fi password to be shared with your friends through Wi-Fi Sense so they can use your Wi-Fi if and when they are at your house. Instead choose the “Customize settings” option.
Forced Updates: Perhaps the fiercest complaint about Win10 is that updates are mandatory, not optional. While I strongly encourage staying up-to-date, the ability to opt out of select updates should be everyone’s right. This ability is especially important when updates are buggy or cause system instability, as has been the case with some updates for 10. Windows 10 users have no choice in the matter, though. At least Windows now offers some transparency and explains what these updates do. Before upgrading you should seriously consider whether you are willing to accept mandatory updates.
Data Collection by Default: Windows 10’s data collection is enabled in the OS by default. The new Cortana feature (the competitor to Apple’s Siri and Google Now) constantly records you and your actions to “get to know you”. Windows 10 also has a very intuitive, very user-friendly Settings menu that contains a well laid-out Privacy section (shown below). Unfortunately most of these privacy settings are set to collect data by default. I strongly recommend going through these privacy settings immediately upon installing the new OS. These settings are not complete, however. For more information on setting up the initial Privacy and Security settings in Windows 10, visit https://fix10.isleaked.com/.
Screenshots of my Win10 Privacy settings are attached at the end of this post. Note that for most of these settings you must enable the global setting before disabling individual apps. After you have disabled every app I recommend once again disabling the global setting. Also note that these settings are not a substitute for basic best practices and security utilities like encryption and antivirus.
Some good news: Windows 10 will still work with the security applications we know and love, like TrueCrypt, Password Safe, and others. In fact, aside from OS-specifics, nearly everything I detailed in Your Ultimate Security Guide: Windows 7 Edition is still applicable. Just one quick word of warning: if you are full-disk encrypted, DECRYPT YOUR HARD DRIVE before upgrading and re-encrypt upon completion of the upgrade. I learned this the hard way.
Everyone loves the appeal of a new operating system. Even I was excited at the prospect of an entirely new look when the computer finally finished installing 10. But the more rational side of me dislikes change for the sake of change. After I complete the next installment of the Your Ultimate Security Guide series, which will cover Windows 10 (look for it in March 2016), I plan to revert to Windows 7 or, much more likely, go full-time with a Linux distro.