CYBERSECURITY
28 April 2016
FUTURESCOT
Balancing the right to privacy
with the duty to protect
A hack of Firefox
and Google Chrome
underlines the
complexity of the
encryption debate
BY BILL BUCHANAN
There are few things nicer than
presenting some ideas in my home
city, Edinburgh, a place that is thriving
through IT innovation and enterprise. I
really enjoyed the Scot-Secure conference (www.scot-secure.com) last week.
It is always a responsive audience, and
it’s a place you can present on some
of the major issues of our time. To be
invited as an academic to an industry-focused event is always a good thing,
especially to stimulate a bit of debate
around key issues. I also sneaked in a
bit of maths and a reference to one of
Edinburgh’s finest sons, John Napier!
There is a major dilemma in the
development of a secure Internet: how
to balance the right of the individual
to privacy against the right of society
to protect itself. In the firing line is
cryptography: it is used to protect
identities and secure communications,
but, on the other hand, it can be used
to hide terrorist activities. The debate
around cryptography is thus one of the
major debates of the 21st century, and
it shows no signs of reaching a
conclusion.
IN THE UK we have the Investigatory
Powers Bill (IPB), under which ISPs will
log the communications of their users.
The scope of this power is likely to
reduce over the next five years, as
almost 99% of all traffic on the Internet
will be encrypted and ‘tunnelled’, which
means that the logs will only contain
the destination IP address, and there
will be no details about the actual page
visited or its content. Along with this,
the cloud service providers such as
Microsoft, Google and Facebook are
moving toward encryption by default,
where it will not be possible to connect
to the service unless it is encrypted.
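The effect on those logs can be sketched in a few lines of Python: for an encrypted, tunnelled connection, only the hostname (and hence the destination IP address) is visible to the ISP, while the page path and query travel inside the tunnel. The function name `visible_to_isp` is illustrative, not from any library.

```python
from urllib.parse import urlparse

def visible_to_isp(url: str) -> dict:
    """Split a URL into the part an observer of an HTTPS connection
    can log (the hostname, hence the destination IP) and the part
    that is carried inside the encrypted tunnel (path and query)."""
    p = urlparse(url)
    hidden = p.path + ("?" + p.query if p.query else "")
    return {"logged": p.hostname, "encrypted": hidden}

# The log would record only www.google.com; the search terms
# never leave the encrypted tunnel.
print(visible_to_isp("https://www.google.com/search?q=private+matter"))
```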
In the US, we see the developing
Burr-Feinstein legislation, which has
the Catch-22 clause that US
companies must protect the privacy
of US citizens but support the legal
system in breaking any communications if required.
To many technical people, this is an
almost impossible situation, and could
only really be done with a ‘back door’
in software. This back door could obviously leave flaws in software that could
compromise the whole of the Internet.
Poor coding has caused many of the current flaws, and a deliberate back door in software
is likely to be discovered or leaked. This
could lead to large-scale data leakages,
on a scale that could encapsulate the
whole world. Just imagine if Google’s
private key was released to the world;
every communication through Google
would then be open to those holding
the key, and they could pretend to
be Google and set up spoof search
engines.
MANY COUNTRIES are looking at
ways of breaking secure communications, including Kazakhstan, which
plans to replace the digital certificates
provided over the Internet with its
own certificate, and thus be able to
listen in on secure communications.
HTTPS, the secure communications
method for the Web, typically works
by the client (the user) receiving a
digital certificate containing the public
key of the Web server (such as
Google.com), and then creating a new
encryption key that the two will share
for the session.
This session key is then encrypted
with the public key of the server, and is
sent back. The encrypted session key
is then decrypted with the private key
of the server (such as Google.com). At
the end of this process, both the client
(the user) and the server (Google.com)
have the same encryption key, and can
now use it to secure the communications. We see here how important it is
to keep the private key secret.
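This key-transport step can be illustrated with a toy RSA example. The numbers are deliberately tiny and there is no padding, so this is only a sketch of the arithmetic, not a secure implementation (real HTTPS uses 2048-bit keys and padding schemes such as RSA-OAEP):

```python
# Toy RSA key transport: the client encrypts a fresh session key
# with the server's public key; only the server's private key can
# recover it. Tiny numbers, for illustration only.
p, q = 61, 53
n = p * q                 # public modulus (part of the certificate)
e = 17                    # public exponent (part of the certificate)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # private exponent (kept secret; Python 3.8+)

session_key = 42                   # client's newly created session key
sent = pow(session_key, e, n)      # encrypted with the server's public key
recovered = pow(sent, d, n)        # decrypted with the server's private key
assert recovered == session_key    # both sides now share the same key
```

Anyone who obtains `d`, the private exponent, can recover every session key sent to this server, which is why keeping the private key secret matters so much.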
The core of HTTPS is unique session
keys which are only used once, and
never stored. If there were some way
to store these keys, law enforcement
could easily go b