Signal and Telegram Apps? Zero Anonymity! Zero Privacy!
Anonymity and privacy are not about closing the door when you go to the bathroom.
For the individual, they might be about personal autonomy, political liberty or just protecting yourself in the digital world.
For the enterprise, employee privacy mitigates the risk of social engineering attacks, even blackmail. The more an attacker can learn about key people within an organization, the more targeted and effective they can make their attacks. Educating employees about how to protect their privacy, therefore, should be a core part of any cybersecurity awareness program.
The universe believes in encryption, a wise man once opined, because it is astronomically easier to encrypt than it is to brute force decrypt. The universe does not appear to believe in anonymity, however, as it requires significant work to remain anonymous.
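The asymmetry the quote points at is easy to put numbers on. A back-of-envelope sketch in Python (the key size and the attacker's guessing rate are illustrative assumptions, not measurements of any real system):

```python
# Illustrative arithmetic: encrypting is one cheap operation, while
# brute-forcing a 128-bit key means searching a 2**128 keyspace.
KEY_BITS = 128               # AES-128-sized keyspace (assumption for illustration)
GUESSES_PER_SECOND = 10**12  # a very generous hypothetical attacker

keyspace = 2 ** KEY_BITS
expected_tries = keyspace // 2  # on average, half the keyspace is searched
seconds = expected_tries // GUESSES_PER_SECOND
years = seconds // (365 * 24 * 3600)

print(f"Expected brute-force time: ~{years:.2e} years")
```

Even granting the attacker a trillion guesses per second, the expected search time exceeds a billion billion years, which is the sense in which the universe "believes in" encryption.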
We often use privacy and anonymity interchangeably, and this is incorrect. An encrypted message may protect your privacy, because (hopefully) no one besides you and your recipient can read it, but encryption does not protect the metadata, and thus your anonymity. Who you're talking to, when, for how long, how many messages, the size of attachments, the type of communication (text message? email? voice call? voice memo? video call?): none of this is encrypted, and all of it is easily discoverable by any adversary with a mass surveillance apparatus, which describes most nation-states these days.
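To make the distinction concrete, here is a hypothetical sketch of what a passive on-path observer can log for a single encrypted message. All field names and values are illustrative, not any real protocol's wire format: the point is that the payload is opaque ciphertext while the envelope around it is not.

```python
# One encrypted message as seen by a network observer.
# The content is unreadable; everything else is metadata, visible by design.
observed = {
    "src":        "203.0.113.10",      # sender IP (documentation-range address)
    "dst":        "198.51.100.7",      # recipient's server IP
    "timestamp":  "2020-03-01T14:32:07Z",
    "bytes":      4812,                # size hints at attachment vs. short text
    "duration_s": 0,                   # a nonzero value would suggest a call
    "payload":    b"\x8f\x03\xa1",     # encrypted blob: private, but present
}

# Encryption removes exactly one field from the observer's view.
metadata = {k: v for k, v in observed.items() if k != "payload"}
print(sorted(metadata))
```

Strip the payload and five revealing fields remain; collected at scale, those fields are enough to reconstruct who talks to whom, when, and how much.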
WAKE UP: Signal is not giving you anonymity, and at best only some privacy.
You may have heard the mantra, "Use Signal, use Tor," and while this one-two punch combo is a great start, it won't take down your opponent. Signal is the best-of-breed encrypted messaging app that lets you send text messages and voice memos as well as make voice and video calls. It looks and feels just like any other messaging app, but under the hood it uses encryption that, to the best of our knowledge, not even the National Security Agency can brute-force.
What about the metadata? Any network-level adversary can tell that you're using Signal, for starters, and if your adversary is the U.S. or Five Eyes, then they have mass surveillance access to all Signal traffic and know who is talking to whom, when and for how long.
The makers of Signal are well aware of these technical limitations and are researching ways to push the boundaries of what's possible. Metadata-resistant communication is an unsolved, cutting-edge technical research problem.
Google Project Zero researcher Natalie Silvanovich discovered a logic vulnerability in the Signal messaging app for Android that could allow a malicious caller to force a call to be answered at the receiver's end without requiring the recipient's interaction.
Bottom line: Signal is the most secure, easy-to-use messaging app available to date, and offers marginally more anonymity than any other app. Do not, however, rely on it for any kind of anonymity or privacy. In fact, it's questionable whether anything provides strong anonymity these days, which brings us to Tor...
So you have decided to open a Telegram account in the wake of the WhatsApp-NSO group spyware incident that affected 1,400 select users globally, including some in India. Some of you may even be attempting to join the chat app Signal for that elusive security that, unfortunately, was never there in the first place.
Take this seriously: Encryption alone cannot save you. Once hackers learn of any vulnerability or bug in the app's security ecosystem, including the mobile operating system underneath it, your personal data is at their mercy.
When you joined WhatsApp, end-to-end encryption was already there, and yet third-party spyware, Pegasus, exploited a vulnerability to snoop on you. Now you are looking to take shelter in other so-called secure chat apps.
Unlike WhatsApp and Apple iMessage, Telegram conversations aren't encrypted end-to-end by default. Instead, you have to select the "Secret Chat" feature for an extra layer of security. But even that is no guarantee of safety.
A recent research paper from the Massachusetts Institute of Technology (MIT) listed striking flaws in Telegram, which was founded in 2013 by brothers Nikolai and Pavel Durov. Telegram uses its own proprietary messaging protocol, "MTProto", which lacks scrutiny from outside cryptographers.
Telegram follows the conventional approach of storing its data in the cloud.
"This means that if an adversary is able to gain control of their server system, they will have access to (at least) unencrypted messages and definitely to all the metadata," wrote MIT researchers Hayk Saribekyan and Akaki Margvelashvili.
Telegram initially asks for the contact list from your phone or desktop and stores it on its servers.
"This provides huge social network information for them that can either be attacked on their servers or can be possibly sold to different authorities without users' consent," the researchers added.
The truth is: There will always be loopholes for governments, nation-state bad actors or individual hackers to snoop on you.
Spyware such as Pegasus compromises the mobile phone's operating system, and the security provided by these messaging apps is rendered ineffective.
Everyday messaging apps are driven to get more users. Success is about the number of users and the levels of interaction. The focus is volume not value; quantity trumps quality. Ease of use/adoption is everything.
This does not encourage "privacy by design" - a regulatory requirement under Article 25 of GDPR.
This encourages "proliferation by design".
For example, you can give these apps access to your contacts, which will likely include professional contacts, including customers, and this data is then uploaded to the messaging app. Have those professional contacts of yours given their explicit consent for this to happen? Clearly not.
Furthermore, you can add anyone to a WhatsApp/Telegram/Signal group if you have their mobile phone number and they have the app. Have they given explicit consent to be added? Clearly not.
It gets worse...
In some of these apps you cannot delete your content (e.g. in WhatsApp you cannot delete your message after an hour).
Images are stored on your device, and you cannot control what those images are (they could be reputationally compromising).
You do not know who the group host is, and it may be that anyone can add anyone else to the group.
All these features do make for friction-free user experience and encourage fast proliferation, but at what cost to privacy?
These apps are not compliant with privacy regulation, which may not matter in a consumer context but is very risky for professional purposes. Businesses are understandably keen to avoid a fine of up to 20 million euros under GDPR.
'Privacy' has some hard definitions in terms of regulation like GDPR. But otherwise, it is a much more personal and subjective concept that is heavily influenced by an individual's own beliefs, culture, geography, jurisdiction, etc.
Privacy is also about trust, dignity, respect.
Privacy can be encouraged by the right context, environment, and behaviors of those around you. For example, smaller groups naturally tend to self-correct poor behavior and protect the privacy and cohesiveness of the group. Larger groups need more hierarchy, process and policing. Dunbar's number gives the evidence for this.
Security can be enhanced with technology. But the real weakness is with people and processes. WhatsApp may be secure but we know that politicians using WhatsApp have their communications leaked all the time.
Any member of a messaging group, even in the most secure apps with secrecy features like self-destructing messages, can screenshot (or photograph if screenshotting is disabled) anyone else's content and publish it openly.
In the end, trust is more important than end-to-end encryption if you want true privacy.
In any case, it is looking likely that the UK and US governments may introduce new laws that would force messaging apps like WhatsApp to give them "backdoor access" anyway. This would allow law enforcement officials to unlock encrypted communications.
According to the Telegraph "The 'Five Eyes' nations, an intelligence alliance comprising the UK, US, Canada, Australia, and New Zealand, issued the warning in a joint statement following a meeting of immigration and security ministers last week."