Amidst so much media coverage of the relative security of different mobile telephone messaging Apps, we ask whether any such application can really be secure; and, if a person is going to use such an application for sensitive purposes, what considerations they ought to bear in mind.
The first thing to say about this subject is that there is no static answer. The rules keep changing. That is because every time a truly secure messaging service becomes widely used, the technological agencies of one country or another pour resources into finding a way round the software's security. So as soon as the messaging software you use is sufficiently popular, some government or other is going to succeed in getting into it. And once a government can do that, private companies will find out how it was done, copy the methods and sell them.
The net result is that ultimately anybody can find the means to hack anybody else. The way you get round this is by constantly changing the means by which you communicate.
The history of messaging services started with SMSes: unencrypted strings of characters that mobile telephone networks could use to send messages to individual devices, and vice versa. One purpose of this was typically to let consumers buy mobile telephone bundles (e.g. send YES100 to 3322 for 100MB of free data - such messages clog up our mobile 'phones every day). However it was then discovered that consumers sending short messages to one another was rather popular. Hence SMS became a commercial service in its own right around the turn of the millennium; people would be charged some nominal sum to send an SMS to one another. SMS is still used to this day. The ease with which SMSes could be intercepted became notorious through the SS7 affair: SS7 is the signalling system that connects different mobile 'phone networks to one another, and at one point it provided numerous hacking opportunities.
Many people do not realise that you can actually uninstall your SMS messaging software, meaning that you can no longer receive or send SMS messages tied to a particular telephone number. This started to make a lot of sense in 2021, when the international media reported that an Israeli firm was selling what amounted to the standard government toolkit for hacking mobile 'phones. The software was supposedly a secret, but in fact virtually every government uses it. It is called Pegasus. One way Pegasus works is by sending, to the mobile 'phone holding a specific SIM card, an SMS containing a link to a piece of code. The code installs invisible Spyware on the 'phone, and then deletes the SMS that delivered it. This suggests at least three important precautions against government or other Spyware. One is to delete the SMS Application on one's 'phone altogether. The second is to disable automatic downloads of code, documents or photos attached to SMSes. The third is not to have a SIM card in one's 'phone at all; and we will come to that.
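To illustrate why link-bearing SMSes are the weak point, here is a minimal sketch in Python of the sort of check one might run over an exported list of messages. The export format, the sample messages and the idea of simply flagging anything containing a URL are all illustrative assumptions of mine, not taken from any real forensic tool:

```python
import re

# A made-up export: (sender, text) pairs pulled from a 'phone backup.
# Both the format and the sample messages are illustrative only.
sms_export = [
    ("+15550100", "Your parcel could not be delivered: http://short.example/xyz"),
    ("Mum", "Dinner at 7?"),
]

URL_PATTERN = re.compile(r"https?://\S+", re.IGNORECASE)

def link_bearing_messages(messages):
    """Return every message carrying a URL - the classic delivery route
    for a malicious payload sent over SMS."""
    return [(sender, text) for sender, text in messages if URL_PATTERN.search(text)]

for sender, text in link_bearing_messages(sms_export):
    print(f"Contains a link, treat with suspicion: {sender}: {text}")
```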
The next development was the sending of email from mobile telephones, pioneered by BlackBerry. BlackBerries were a tremendously popular family of mobile telephones that emerged in the early 2000s and had two distinctive features. One was their keyboards, which at the time made writing messages much quicker than when you had to press the key '2' three times in quick succession to get the letter 'c', for example. The other was that they exploited the emerging capacity of mobile 'phone networks to carry digital data, connecting the handset to the internet and thereby allowing emails to be sent and received on a mobile telephone.
Nobody was thinking very hard about internet security at this juncture. BlackBerry touted the security of its servers; but email has never been very secure, whether sent by mobile 'phone or otherwise, because its encryption protocols can be compromised. The purpose of email is to send complex documents or texts with a permanent record, as a quicker alternative to the postal mail; it is not particularly to make messages secure. Nevertheless if you are looking to make a permanent record, you use email. Lawyers use email; spies use chat Apps.
In time, the keyboard issue became irrelevant; it turned out that on-screen keyboards could be developed that most people preferred to tiny clackety keys on a small handset. However the idea of sending instant messages, without all the heavy metadata associated with emails, caught on. People wanted to send messages to one another without those messages being stored indefinitely on an email server. Hence we saw the rise of VOIP messaging services, among the first of which were WhatsApp (an American product) and Telegram (a Russian one). VOIP meant, very roughly, that messages were transmitted via the internet rather than down telephone lines. At the same time, a programme called Skype became extremely popular. This allowed VOIP telephone calls.
All of these pieces of software were naturally encrypted, in the sense that you would have to interfere with a mobile telephone mast or physically compromise a participant's device if you wanted to read or hear the conversations people were having. Governments naturally insisted that the people developing this software provide them with access, with a warrant or otherwise upon request, in case people were using it to commit crimes and evade detection of their communications by law enforcement. The legal principle was the same as 'phone tapping, and the mechanism colloquially became known as the 'back door': mobile telephone software producers wrote into their code methods by which government agencies could, if they knew what the back door was, read or listen to the VOIP messaging and voice services. Nevertheless this was about as rare as 'phone tapping, at least in the beginning. There was no particular reason for paranoia just because electronic communication technology had developed.
However this sort of government surveillance soon became much more common. Governments worked out how to scan messages and conversations for key words or phrases (e.g. 'terrorism'), upon which they would start recording or monitoring. After a while, governments built large data storage warehouses capable of recording virtually everything - and they did. The question then was what governments did with all of this data they were collecting. It became a question not of IT capacity, but of the human capacity to listen to and read huge quantities of material.
This problem arose at the same time as the VOIP services started to recognise the marketing potential of handling all this data. They too could pick out key phrases and profiles, and sell the data of their users to people who wanted to sell things to them. This remains the principal way that most VOIP messaging services make money. Hence the software architects themselves were developing the techniques that would permit governments more efficiently to listen in on private individuals' conversations. It subsequently emerged, of course, that access to the 'backdoors', and the filters, could be sold to anyone who wanted them.
A few years later, along came an NSA contractor called Edward Snowden, who pointed out the extent of this listening in. At this point, everyone panicked. The US Government panicked, and Snowden has been in Moscow ever since. The mobile 'phone software producers also panicked, announcing one after another that they would be installing so-called 256-bit encryption on their messaging services, drawing on long-understood public-key encryption techniques of the kind popularised by 'Pretty Good Privacy' (PGP). This involves encrypting a message or document in such a way that it would take all the world's computers, working together, more time than the Earth will continue to exist to crack the encryption. PGP involves mobile telephones using so-called public and private keys. Its details are not important for our purposes; suffice it to say that PGP remains just as secure as it ever was. Things might change if and when quantum computers (machines that currently have to be cooled close to absolute zero to operate, and that cannot yet be built at useful scale) can be made to work. But for the time being, PGP is impenetrable.
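For anyone curious to see the public and private key idea in miniature, here is a sketch in Python using the third-party cryptography library. It is not PGP itself, and certainly not the code any particular messaging App runs (real systems wrap a symmetric session key inside this sort of public-key envelope); it simply shows that anyone may encrypt to your public key, while only your private key can decrypt:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair. The public half can be handed out freely;
# the private half never leaves your device.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Meet at the usual place."
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can encrypt...
ciphertext = public_key.encrypt(message, oaep)

# ...but only the holder of the private key can decrypt.
assert private_key.decrypt(ciphertext, oaep) == message
```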
I say that PGP is impenetrable, but of course that is not really accurate, because the various commercial messaging services have kept their back doors. Hence if you are committing crimes using WhatsApp or Telegram, don't think for a minute that law enforcement authorities cannot get into your account and read your messages. They can.
WhatsApp in particular also served as a route for the Pegasus suite of Spyware, because it turned out that somebody could send you a WhatsApp message with a piece of code attached (perhaps inside an attached document); WhatsApp would automatically run the code for you, installing the Spyware and deleting the WhatsApp message so that you never knew about it. One key security measure if you use WhatsApp is therefore to disable the auto-download feature for attachments to WhatsApp messages. It is difficult to operate without WhatsApp, Telegram or Viber (a third, similar service) because they are so ubiquitous. Another option is to have two mobile telephones: one with these things installed, which is potentially 'hot'; and one without any of them installed. Indeed this second 'phone may operate without even a SIM card installed, so that the Pegasus SMS route can't be used either.
What does Pegasus allow governments to do? Well, it allows them to do anything to your mobile telephone that a piece of mobile telephone software could do. That includes accessing your address book (although most chat Apps demand that access anyway); turning on the camera to take photographs or record; doing the same with the microphone; installing a keystroke logger; reading your emails and messages; et cetera. Again, the good news with Pegasus is that unless you use your 'phone exclusively for breaking the law, the sheer quantity of data the software collects requires a team of people to scour in order to find anything even potentially valuable. In truth, reconnaissance of a person's mobile 'phone is a round-the-clock job for three to five people. It requires so much effort, and hence expense, that unless you are a known terrorist it is very unlikely that anyone is going to make the effort. And even then they may not; even known terrorists have ordinary lives. The sifting process is colossal.
Now we should turn to Signal. Signal was among the first VOIP messaging and calls Apps to publish its source code. It is, as software developers call it, 'open source'. This means that anyone can look at the code and hunt for back doors or vulnerabilities. Nobody has ever found one; and there are a lot of clever so-called 'white hat hackers' out there (computer hackers who use their skills to improve software robustness by hunting out security frailties). This means that for many years, Signal has been the gold standard. If you wanted to communicate securely, you installed Signal.
Signal, like any messaging system that identifies its users by reference to a mobile telephone number, is not anonymous. It is possible for the mobile telephone company to collect the dates, times and numbers of the telephones communicating with one another via Signal (and hence to convey them to governments). This might encourage a person to do the following: register Signal using a so-called 'burner' (an anonymous SIM card, i.e. one where you do not have to give your name and address on purchase - technically illegal in many countries, but easy in practice in most); then throw the SIM card away, or put it into a different 'phone that runs WhatsApp or another Pegasus-susceptible messaging App. This way, the authorities install Pegasus on the wrong 'phone. There are lots of variations on this sort of scheme for protecting yourself from Pegasus.
It is not actually known (at least not by this author) whether Pegasus, once installed on your 'phone, allows the observing government to read your Signal messages - or, at least, not reliably. That is because Signal creates a 'computer within a computer', its code insulated from the rest of the device on which it is installed.
Incidentally, there are ways of spotting whether your 'phone has Pegasus installed on it. One, comprehensively documented on the internet, involves copying the entire contents of your mobile 'phone's storage onto an external hard drive and then scanning it. This requires some level of technical skill, but it is not too onerous for a person familiar with IT issues. More informal indicators of Pegasus being installed include unusually high battery drain. There are many others. If you think you have Pegasus installed on your 'phone, there are two possible approaches. One is just to ignore it; let some government authority read your rubbish. It's their loss. They are consuming massive resources. The other is to factory reset your 'phone. It goes without saying that you should never keep important data only on a small computer in your pocket. A factory reset deletes everything, and a security-minded person should always be prepared to factory reset their mobile 'phone at a moment's notice.
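For the technically inclined, the shape of that scanning exercise looks something like the following Python sketch. The marker strings and the directory name are invented for illustration; a real investigation would use a proper forensic toolkit and published indicator lists rather than anything this crude:

```python
import os

# Invented markers, standing in for the published indicators of compromise
# (domains, process names and so on) that real investigations rely upon.
SUSPICIOUS_MARKERS = [b"example-exfiltration-server.net", b"bh_update_agent"]

def scan_dump(dump_dir):
    """Walk a directory holding a full dump of the 'phone's storage and
    report any file whose contents mention one of the markers."""
    hits = []
    for root, _dirs, files in os.walk(dump_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as fh:
                    data = fh.read()
            except OSError:
                continue  # unreadable files are skipped rather than fatal
            for marker in SUSPICIOUS_MARKERS:
                if marker in data:
                    hits.append((path, marker.decode()))
    return hits

# Hypothetical usage, assuming the dump sits in a folder called 'phone_dump':
for path, marker in scan_dump("phone_dump"):
    print(f"{path}: contains marker {marker}")
```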
Now we turn to the brave new world. Signal may not be secure anymore, Pegasus or no Pegasus. The reason becomes obvious once you consider that just because someone publishes some source code on the internet, it does not follow that this is what your mobile 'phone actually downloads when you press the 'Install' button.
Now read this:
Signal has never had an outage in its life. The source code published for Signal on the internet does not even provide for the App displaying a message saying that Signal is down due to 'technical problems'. That is the beauty of published source code: you know which error messages are genuine (they are listed in the source code) and which are not. And yet there was an article for Reuters to publish: Signal back up after four hours down. Hardly the stuff of the international newswire stories that Reuters routinely prints.
What seems to be happening is that governments or their agencies are able, whenever they wish, to replace the very software that frustrates their goal of surveillance. The net result is that Signal can no longer be relied upon. We don't know what it is anymore. We don't know whether my Signal and yours are the same. Signal may now be less secure than WhatsApp. We don't know.
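One partial answer, for those willing to go to the trouble, is to build the published source code yourself and compare what you built against what is actually installed on your 'phone. The Python sketch below shows the principle only; the file names are invented, and in practice the signing data inside an App package differs between builds, so real reproducible-build checks compare the package contents rather than raw file hashes:

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Invented file names: the package pulled off the 'phone versus the one
# you compiled yourself from the published source code.
installed = sha256_of("Signal-pulled-from-phone.apk")
self_built = sha256_of("Signal-built-from-source.apk")

if installed == self_built:
    print("What is installed matches what the published source produces.")
else:
    print("Mismatch: the installed package is not what the published source builds.")
```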
Now we should add a few words about an excellent piece of messaging software, Wickr. Unlike most messaging Apps, it does not ask you for your mobile telephone number. Instead you provide a unique ID. Wickr participants can communicate anonymously, in the sense that neither can their messages be read, nor can the mobile or other devices they are using be identified. Wickr is an extremely high-quality messaging service, providing a range of auto-burn options for messages and similar security protections. The reason most people do not use messaging services de-linked from their mobile telephone numbers is that they want their messaging App to access their 'phone's address book so they can stay in contact with their friends. WhatsApp in particular is very good at doing this: hence its commercial success. Anonymous chat Apps (and there are several of them: Threema is another well-known one, though you must pay a small fee for it) are far less popular as a result; but they are far more secure.
However Wickr itself, in all other ways the gold standard of chat Apps, has one substantial disadvantage. It has been adopted to convey military instructions securely between the US Department of Defense and individual military units (for example, carrier strike groups). Hence it must be assumed to have a US back door. If you don't mind the US Government reading your chats (at least in principle, which again takes massive resources on their part), then Wickr is the very best.
Then there is a handful of more or less obscure imitations of Signal and Wickr, whose source code is not published; some of them are not anonymous (they are tied to a telephone number) and some are. If you are completely obsessed with privacy - and there may be legitimate or illegitimate reasons for that - then you cycle through these various 'second-tier' chat Apps faster than the people from whom you are seeking to conceal your communications can write and insert back doors into them.
I also want to make a few observations about an assertion you will hear from many sides in the intelligence community: that laptops and desktops are more secure than mobile telephones. This seems to be a piece of intelligence-community lore; but it is not at all obvious that it is true. The reality may be much more subtle.
Consider the following points. Firstly, there is the question of which type of device more often changes the form of its internet access. Desktops are the most predictable in this regard. They don't move at all. Therefore they always use the same WiFi connection. (Nobody much uses Ethernet cables anymore.) The consequence is that if someone knows you do all your work on your desktop, then they know how to get access to what you do: all they need to compromise is a single WiFi connection, and they've got the lot. Laptops move more often; mobile telephones move the most. Hence if you are trying to avoid compromise of your WiFi, the best thing to do may be to use a mobile telephone in lots of different locations.
Secondly, mobile telephones store their data in flash memory whose controllers constantly shuffle data around to different physical locations, whereas computers with conventional hard disks leave data where it was written until something happens to overwrite it. One consequence of this is that it is far easier genuinely to delete a file on a mobile 'phone than on another sort of computer. You just delete the file (which removes the references to the file's data; the data itself is not deleted); then you turn your mobile 'phone off and on a couple of times, and with a bit of luck something else will have overwritten the contents of the file you wanted to get rid of. By contrast, if you 'delete' a file on another sort of computer (i.e. move it to the Trash Can), empty the Trash Can, and then turn the computer off and on, the deleted file's data will still be there - just missing the pointers to it. You need to jumble up the contents of the disk to ensure that your deleted file's data is actually overwritten. One way of doing this is to de-fragment the disk (i.e. make everything on it more orderly); turn the computer off and on; de-fragment the disk again for good measure; and then turn it off and on again. People hardly ever do this, of course. That's why it is often easy to recover data from a computer even though the user thinks it has been deleted.
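For completeness, the overwrite-before-delete idea on a conventional computer looks something like the following Python sketch. It assumes a traditional filesystem and an invented file name; on modern drives with wear-levelling even this is no guarantee:

```python
import os

def overwrite_and_delete(path, passes=3):
    """Overwrite a file's contents with random bytes several times,
    then remove it, so the data is not merely left on the disk with
    its pointers removed."""
    size = os.path.getsize(path)
    with open(path, "r+b") as fh:
        for _ in range(passes):
            fh.seek(0)
            fh.write(os.urandom(size))
            fh.flush()
            os.fsync(fh.fileno())  # force the overwrite onto the physical disk
    os.remove(path)

# Hypothetical usage with an invented file name:
# overwrite_and_delete("old_notes.txt")
```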
Apple machines can be entered by a hostile party even though you think you've encrypted the entire hard drive. There's a back door. So just closing the lid doesn't work. By contrast, mobile telephones have lots of different encryption mechanisms, and not all of them have been hacked.
The main reason for the conventional view that laptops and desktops are more secure than mobile 'phones is that mobile 'phone Apps are often badly written, full of frailties in their coding that a determined hacker might be able to use to compromise the device.
However the speed and quantity with which mobile 'phone Apps are produced cuts both ways. Any 22-year-old with skills in the Python programming language (it's not that difficult) can write a mobile 'phone App. That's why there are so many of them. Many of these often sloppily written Apps may have unintentional potential back doors; but the capacity of government services actually to find those back doors and write code to exploit them has its limits. If someone in government is writing code to exploit a back door in an App that you are using, then they may be writing it just for you. Bear that in mind before you auto-click to agree to an update. It may be a patch; or it may be an invitation to accept Spyware.
Personally, I have more confidence in mobile 'phone Apps that have not been updated for some time. This suggests to me both that few people use them (meaning diminishing returns for anyone inclined to hack them) and that there was less wrong with them when they were written. Nevertheless that is just my own personal opinion.
Privacy is a game of cat and mouse. It will never end. Governments will find ways in, just as software writers will find ways of stopping them. In circumstances of ever-accelerating competition between maintaining privacy and compromising it, why not just go back to the good old telephone call? By now, most of the people who knew how to tap simple telephone lines have retired. A technological return to the Neanderthals may have its merits.