An analysis of Section 3 of the Regulation of Investigatory Powers Act 2000

Introduction

The Regulation of Investigatory Powers Act 2000 is a piece of UK law that, among a range of other things, contains a part intended to require the surrender of cryptographic keys to certain authorised parties (which are, in effect, instruments of the government). If such a request is made as part of an investigation, the party who disclosed the key is not allowed to tell anyone that the authorities have that key, on pain of up to five years in prison. Equally, if the party fails to disclose the key, they face up to two years in prison.

The aim of this article is to discuss some of the finer points of that legislation. I aim to show that in a few cases (such as channels secured by SSL or SSH) it doesn't apply quite as the lawmakers might have hoped.

Some basic cryptography

Alice and Bob are friends who like to talk about, well, private things. They don't want anyone to know what they're talking about. Not you, not me, not anyone. But they have a problem: Alice lives in South Africa and Bob lives in Japan.

Alice and Bob want a secure channel. A secure channel is a communications link where Alice and Bob are assured beyond reasonable doubt that they're actually talking to each other and not someone else. They would also like to be assured that the communication is kept secret and hasn't been altered in any way in transit.

So how do they communicate securely? They can't trust the post, because anyone can open and alter their mail; they can't trust the telephone network, because anyone could be listening in; and they can't trust the internet, because with the correct equipment it's trivial to eavesdrop on messages as well as alter them.

Given the lack of prebuilt secure channels in society, an obvious question arises: can we transform an insecure channel into a secure one? It turns out that, given a few (quite safe) assumptions, we can. It involves a branch of mathematics called number theory, but for the purposes of this discussion we don't need to get into the details of how it works. All we need to know is that we can exchange messages over an insecure channel and, by doing so, create a secret piece of information that no-one watching the channel can reconstruct. We can use this information as an encryption key and so protect the secrecy and integrity of our messages.
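To make this concrete, here is a toy sketch of one such key agreement procedure (Diffie-Hellman, the classic number-theoretic example). The modulus and variable names here are my own choices for illustration; real systems use primes thousands of bits long.

```python
import secrets

# Toy Diffie-Hellman key agreement. The modulus here is tiny
# (the largest prime below 2**32); real systems use far larger primes.
p = 4294967291  # public prime modulus
g = 5           # public generator

a = secrets.randbelow(p - 2) + 1  # Alice's secret, never transmitted
b = secrets.randbelow(p - 2) + 1  # Bob's secret, never transmitted

A = pow(g, a, p)  # Alice sends A over the insecure channel
B = pow(g, b, p)  # Bob sends B over the insecure channel

# Each side combines the other's public value with its own secret.
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)

assert alice_key == bob_key  # both now hold the same shared secret
```

An eavesdropper sees p, g, A and B on the wire, but recovering the shared key from those values alone is the discrete logarithm problem, which is believed to be intractable for large primes.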

The problem with doing the required number theory is that the numbers involved are huge. No human could ever be expected to do this by hand, so Alice and Bob trust their computers to do it for them. The computer generates the key and never shows it to them - they have no need to see the key, since the computer is always going to do the encryption for them.

Encryption alone does not guarantee that a message is authentic, so we run another procedure on the message that produces an authentication code. This code allows Alice and Bob to confirm that the message hasn't been changed in transit. All message authentication codes (MACs, for short) require some kind of key. We run the authentication step before we encrypt; we then encrypt the message and the MAC with the key we agreed using our number theory. It is important that the MAC's key is different from the encryption key. This is not only good cryptographic practice, but it also allows us to avoid seizure under RIPA.
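A minimal sketch of this MAC-then-encrypt construction, using only Python's standard library. The keystream cipher is a toy stand-in for a real cipher, and the key values are hypothetical; the point is simply that the MAC key and the encryption key are kept separate.

```python
import hashlib
import hmac

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream (SHA-256 in counter mode) - a stand-in for a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def protect(enc_key: bytes, mac_key: bytes, message: bytes) -> bytes:
    """MAC-then-encrypt: append a MAC, then encrypt message and MAC together."""
    tag = hmac.new(mac_key, message, hashlib.sha256).digest()
    plaintext = message + tag
    stream = keystream(enc_key, len(plaintext))
    return bytes(x ^ s for x, s in zip(plaintext, stream))

def open_protected(enc_key: bytes, mac_key: bytes, ciphertext: bytes) -> bytes:
    """Decrypt, then verify the MAC; reject anything altered in transit."""
    stream = keystream(enc_key, len(ciphertext))
    plaintext = bytes(c ^ s for c, s in zip(ciphertext, stream))
    message, tag = plaintext[:-32], plaintext[-32:]
    expected = hmac.new(mac_key, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message failed authentication")
    return message

# Two separate keys: one for encryption, one used only for the MAC.
enc_key = b"hypothetical key agreed via number theory"
mac_key = b"a different key, used only for the MAC"

sealed = protect(enc_key, mac_key, b"meet at noon")
assert open_protected(enc_key, mac_key, sealed) == b"meet at noon"
```

If anyone flips even a single bit of the ciphertext in transit, the MAC check in open_protected fails and the message is rejected.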

There are a few more minor technicalities, but basically that's it: Alice and Bob now have a secure channel. If Carol tries to eavesdrop, she's brutally rebuffed by the encryption. If Dave tries to modify the messages in transit, then Alice and Bob will detect it when they check the MAC. Of course, Dave can still delete messages, but he could do that with an insecure channel too.
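As a self-contained illustration of the MAC check catching Dave, using Python's standard hmac module (the key and messages here are made up for the example):

```python
import hashlib
import hmac

# Hypothetical authentication-only key and messages, for illustration.
mac_key = b"a key used only for authentication"
message = b"meet at noon"
tag = hmac.new(mac_key, message, hashlib.sha256).digest()

# Dave alters the message in transit.
tampered = b"meet at nine"
tampered_ok = hmac.compare_digest(
    tag, hmac.new(mac_key, tampered, hashlib.sha256).digest())

assert not tampered_ok  # Alice and Bob detect the modification
```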

Secure channels of this type are used all the time; SSL and SSH are examples of channels that work in this way. We shall see that these channels are not covered by RIPA.

Examining the Act

The bit of the Regulation of Investigatory Powers Act 2000 that we're interested in is Part 3. First, I would like you to read that part in full. Done? Okay, let's proceed.

Under the act, a section 49 notice can be served demanding that you hand over your secret key, provided the authorities have a good reason to ask for it and a person with the correct authority requests it. If you refuse, you could face two years in prison. What I intend to show is that a secure channel like the one described above is not actually covered by the act.

Before we can do this, let us examine some of the definitions used in the act:

"key", in relation to any electronic data, means any key, code, password, algorithm or other data the use of which (with or without other keys)-

(a) allows access to the electronic data, or

(b) facilitates the putting of the data into an intelligible form

"protected information" means any electronic data which, without the key to the data-

(a) cannot, or cannot readily, be accessed, or

(b) cannot, or cannot readily, be put into an intelligible form;

"person" includes any organisation and any association or combination of persons - Taken from 'general interpretation'

"electronic signature" means anything in electronic form which-


(a) is incorporated into, or otherwise logically associated with, any electronic communication or other electronic data;

(b) is generated by the signatory or other source of the communication or data; and

(c) is used for the purpose of facilitating, by means of a link between the signatory or other source and the communication or data, the establishment of the authenticity of the communication or data, the establishment of its integrity, or both

(2) References in this Part to a person's having information (including a key to protected information) in his possession include references

(a) to its being in the possession of a person who is under his control so far as that information is concerned;

(b) to his having an immediate right of access to it, or an immediate right to have it transmitted or otherwise supplied to him; and

(c) to its being, or being contained in, anything which he or a person under his control is entitled, in exercise of any statutory power and without otherwise taking possession of it, to detain, inspect or search.

Next, I bring to your attention subsections (2) and (9) of section 49:

(2) If any person with the appropriate permission under Schedule 2 believes, on reasonable grounds-

(a) that a key to the protected information is in the possession of any person,

[Snip some other points]

(9) A notice under this section shall not require the disclosure of any key which -

(a) is intended to be used for the purpose only of generating electronic signatures; and

(b) has not in fact been used for any other purpose

From these two subsections I will attempt to show that you can set up a secure channel that legally avoids a section 49 notice. A secure channel is built by providing secrecy and authenticity, so we'll treat them separately.

Authenticity of Message - From the definitions above and the subsections quoted, it becomes quite clear that MACs are not covered by a section 49 notice. A MAC is an electronic signature as defined above, and subsection (9) explicitly states that keys intended and used only for generating electronic signatures are not covered by a notice served under section 49. Keys used for authentication, and only authentication, are safe.

Secrecy of Message - This is a little more tricky. The question now is: if I get the computer to agree the key for me, am I actually in possession of the key? Well, the answer appears to be yes, but only if you own the computer in question. Section 56, paragraph (2)(c) gives a definition of possession under which, if the key is 'contained' within something that is yours, you can be said to be in possession of it. It's reasonable to assume that this definition includes a computer.

What is striking about this is that if you went to a public place and made their computers agree a key for you, it is now the owner of that PC who is responsible - you avoid section 49 altogether. Another interesting point is that in most circumstances you don't own the software that runs on your computer. You might have a licence to use that software, but that isn't ownership. Can I really be prosecuted if a piece of software I don't even own isn't written to allow me access to the encryption keys used for a session? Indeed, since you don't own the software that generated the key, is that key really your property at all? Is it the software vendor's?

Fortunately for all parties involved, this mess appears to be avoidable. I draw your attention to subsection (2) of section 53:

(2) In proceedings against any person for an offence under this section, if it is shown that that person was in possession of a key to any protected information at any time before the time of the giving of the section 49 notice, that person shall be taken for the purposes of those proceedings to have continued to be in possession of that key at all subsequent times, unless it is shown that the key was not in his possession after the giving of the notice and before the time by which he was required to disclose it.

This neatly does away with the problem of the definition of possession. Note the structure: if you were ever in possession of the key before the notice, you are presumed to have kept it, unless it is shown that the key was no longer in your possession after the notice was given and before the disclosure deadline. So if you can show that the key was destroyed, you cannot be convicted of failure to comply. Showing the source code of your security program is a pretty robust way to do this, since it demonstrates that the agreed key is destroyed promptly after use.

Conclusion

I have presented some analysis of Part 3 of the Regulation of Investigatory Powers Act 2000 and concluded that if you use a key agreement procedure (for encryption) where you never see the result and the key is destroyed promptly after use, then it appears you can mount a strong defence against a notice under section 49. The defence is even stronger if you don't own the machine on which the agreement procedure was run. As a more trivial point, I have also shown that authentication keys are not covered by the act.

Copyright Simon Johnson 2003