======When the GDPR Seems to Prevent an Entire Technology======

>Only a Sith deals in absolutes.

//Obi-Wan Kenobi, Star Wars: Episode III -- Revenge of the Sith//

The [[wp>General_Data_Protection_Regulation|GDPR]] (General Data Protection Regulation)(([[https://eur-lex.europa.eu/eli/reg/2016/679/oj|Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)]])) is the current EU (European Union) privacy regulation. It has spawned an intellectual exercise where some provision of the GDPR is used to prove that the GDPR makes some system effectively illegal. Some examples:

  * Public blockchain-based cryptocurrencies like Bitcoin: [[https://www.btcnews.com/europes-new-data-guidelines-fuel-bitcoin-ban-fears-details/|Europe’s New Data Guidelines Fuel Bitcoin Ban Fears]]
  * Artificial intelligence: [[https://tiffanyli.com/wp-content/uploads/2018/08/Humans-Forget-Machines-Remember_Final-PDF.pdf|Humans forget, machines remember: Artificial intelligence and the Right to Be Forgotten]]
  * Matrix messaging servers: [[https://anarc.at/blog/2022-06-17-matrix-notes/#gdpr-in-the-federation|Matrix notes]]
  * The World Wide Web: [[https://academic.oup.com/idpl/article/14/4/315/7853506|Does the GDPR break the Internet? The case for a public data exception]]

There are of course many other regulatory regimes out there, and the general principles discussed in this article should apply to them as well. It just seems that these sorts of claims are most often based on the GDPR.

=====The GDPR vs the SKS Network=====

I will use the relatively obscure synchronizing key server (SKS) network, not just because it fits with the PGP theme of this series of articles but because it works as a really good practical example.
The SKS (Synchronizing Key Server) network is a way to publicly publish PGP identities((By "PGP identity" I mean the things that start with BEGIN PGP PUBLIC KEY BLOCK when ASCII armour is turned on. Otherwise I would have to talk about public keys inside public keys and I am not good enough at this to do that without causing lots of confusion.)). It has been around basically forever. It consists of a bunch of servers that communicate with the intent that each server contain the same PGP identities in the same state. Uploading an identity results in that identity appearing on all the other servers.

It was suggested that the SKS network was somehow illegal around 2018, when the GDPR first went into force(([[https://lists.gnupg.org/pipermail/gnupg-devel/2018-May/033704.html|Keyservers and GDPR]])). The idea is that the append-only nature of the SKS network is fundamentally incompatible with the GDPR right to erasure (GDPR Article 17).

====Why is the SKS network append-only?====

Keeping a database synchronized across a network is normally a difficult and complicated problem, even if you can trust the server operators. For the end-to-end encrypted PGP case the database must also be secure against attacks from the server operators and others. The SKS design cleverly makes an assumption that simplifies the problem tremendously: the database is treated as an append-only data structure. So data goes into the network and stays there forever.

This somewhat surprisingly works inside the identities, since PGP identities are modular. The SKS network retains all the past portions of an identity during updates. So if you attempt to replace some aspect of the identity and then upload it to the SKS network, you will end up with both aspects in the identity on the network. This (also somewhat surprisingly) works out in practice when these identities are used. Implementations expect to see older components in PGP identities.
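The reconciliation described above can be sketched as a simple set union. This is a toy model, not the actual SKS algorithm or data model: the component strings and the ''merge'' function are illustrative inventions, and the real network merges PGP packets rather than strings. The union semantics are the point.

```python
# Toy model of SKS-style append-only reconciliation (an illustrative
# sketch, not the real SKS data model).  Each identity is modelled as
# a set of opaque component strings; the real network merges PGP
# packets, but the union semantics are the same.

def merge(local: set, incoming: set) -> set:
    """Reconcile two copies of an identity: keep everything ever seen."""
    return local | incoming

# A server holds an identity containing an old encryption subkey.
server = {"certifying-key", "uid:alice@example.org", "subkey-2015"}

# The owner tries to *replace* the subkey by uploading a version of the
# identity that omits the old one.
upload = {"certifying-key", "uid:alice@example.org", "subkey-2024"}

server = merge(server, upload)

# Both subkeys are now present; nothing was removed.
assert server == {"certifying-key", "uid:alice@example.org",
                  "subkey-2015", "subkey-2024"}
```

Because union is commutative, associative, and idempotent, servers can gossip updates in any order and still converge on the same superset. By the same token, no sequence of uploads can ever remove anything.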
The append-only nature of the SKS network prevents an attacker from modifying the identities on the network by removing portions or replacing those portions with older versions. New encryption methods or signature keys stay new. A revoked key or complete identity stays revoked. Even if an attacker completely takes over a user's PGP environment, they still can't prevent a user with access to a copy of the revocation certificate from revoking the identity and having it stay revoked forever.

The important point here is that the distributed, append-only nature of the network provides significant security advantages. The user gets a high level of control and security. The intent is that the people operating the servers get as little control over the user data as possible. The most a server operator can do is modify their local copy. They can't maliciously downgrade the PGP identities on the rest of the network.

====The GDPR Right to Erasure====

>(GDPR art 17(1)) The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: ...

Followed by six grounds. Some of the grounds seem like they might apply to the SKS network. ...

The security of the SKS network comes from the fact that it is append-only. In other words, from the fact that erasure is not possible. This feels like the irresistible force meeting the immovable object. Since we can't just change the GDPR, it is obvious that the SKS network will have to change. But we can't do that without making it ineffective. So obviously the SKS network must cease to exist. It vanishes in a puff of logic.

If you have a legal background you might be irritated right now. If you understand and respect the GDPR you might even be offended.
Destroying a pre-existing system that has not been causing trouble, particularly one intended to enhance the privacy of individual citizens, is not what the GDPR is intended to do. So my argument pretty much assumes that the designers of the GDPR are, well, idiots.

In my defence, I work in technology. My world is full of absolutes. There is more electrical current going into a node than is coming out? A violation of [[wp>Kirchhoff's_circuit_laws#Kirchhoff's_current_law|Kirchhoff's current law]]. That can't be. Something has to change. I end up with more or less energy than I started with? A violation of [[wp>First_law_of_thermodynamics|the first law of thermodynamics]]. That can't be. Something has to change. Computer programming is all about dealing with absolutes. Hundreds ... thousands of them.

I do understand the difference between the laws of the universe and the laws of humanity. But my instincts tend to betray me. My life in technology has created a sort of aggressive humility. I tend to be adamant that rules must be followed.

I will throw out a helpful principle here:

//The law is designed to be reasonable.//

We shouldn't initially assume the law is broken and things can't work. For the SKS network to continue to exist, we have to be able to claim that the operator of an SKS server has the right to refuse to remove a PGP identity on the request of the owner of that identity. So let's assume that is true to start. Our job will be to find out //why// that is the case.

====Personal Data====

The GDPR is about protecting the rights of people over their data. So we have to consider what data in a PGP identity actually belongs to a user in a GDPR sense. Such data is called "personal data" (GDPR art 4(1)). If there is no data relevant to the GDPR here then we can skip the rest of the discussion.

It seems fairly clear that a public key used for identification can count as personal data under the GDPR.
The question came up with respect to the public keys used to identify users of blockchains((See section 3.3 of: [[https://www.europarl.europa.eu/RegData/etudes/STUD/2019/634445/EPRS_STU(2019)634445_EN.pdf|Blockchain and the General Data Protection Regulation]])). Does it count for the public key(s) in PGP identities?

A PGP identity will normally represent the identity of a person. The Certifying Public Key portion is the primary indicator of that identity. A shortened version of the Certifying Public Key (the key fingerprint or key ID) is generated specifically to make it easy to associate that public key with an actual person. There is also an Encryption Public Key which is unique to the user and is also linked to the Certifying Public Key. It is possible to [[pgpfan:signedanon|use a PGP identity in an anonymous way]] where the embedded public keys would not count as personal information, but that is very much not normal usage and requires special care.

The key ID derived from the Certifying Public Key shows up in PGP signatures. A separate key ID derived from the Encryption Public Key normally shows up as a plaintext portion of PGP encrypted material. So a PGP identity links signatures made by a user to material encrypted to that user. From the preceding I conclude that the public keys in PGP identities could very much be considered personal data.

There is also a User ID portion that provides a convenient handle for dealing with the PGP identity and is also linked to the Certifying Public Key. It is an arbitrary string and might not be unique or even actually serve to identify a particular person. However, the convention is that it contains the name of a person and a PGP capable email address. So the User ID has to be treated as personal data.

The personal data here is:

  * the certifying public key (and derived key fingerprint/ID)
  * an encryption public key
  * a name
  * an email address

... and the fact that the values are probably linked.
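As an aside on how those derived identifiers work: for a version 4 OpenPGP key, the fingerprint is a SHA-1 hash over the public key packet, and the key ID is the low-order 64 bits of that fingerprint (RFC 4880, section 12.2). A minimal sketch, using dummy bytes in place of a real key packet body:

```python
# Sketch of the v4 OpenPGP fingerprint and key ID derivation from
# RFC 4880, section 12.2.  The packet body below is dummy data, not a
# real key.
import hashlib

def v4_fingerprint(key_packet_body: bytes) -> str:
    """SHA-1 over 0x99, a two-octet big-endian length, then the body."""
    data = b"\x99" + len(key_packet_body).to_bytes(2, "big") + key_packet_body
    return hashlib.sha1(data).hexdigest().upper()

def key_id(fingerprint: str) -> str:
    """The 64-bit key ID is the last 16 hex digits of the fingerprint."""
    return fingerprint[-16:]

fpr = v4_fingerprint(b"\x04dummy-key-material")  # NOT a real key
assert len(fpr) == 40          # 160-bit SHA-1 digest as hex
assert len(key_id(fpr)) == 16  # the short form seen in signatures
```

The relevance to the GDPR discussion is that the fingerprint and key ID are deterministically derived from the public key, so they identify the same person the key itself does.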
So a PGP identity seems to be chock full of personal data as defined by the GDPR.

====The GDPR Right to Object====

Ground (b) of the Right to Erasure (GDPR art 17(1)(b)) is withdrawal of consent. You consented to some specific use of your data, but now you don't. Uploading a PGP identity to the SKS network seems like some sort of implied consent. But implied consent is not really a thing under EU privacy law. Website operators attempted to use the concept of implied consent to justify their use of tracking cookies. The regulators were not amused by that argument, and now we have all those explicit-consent tracking cookie banners to enjoy.

The SKS network is specifically designed to prevent control of the personal information by the organization (the SKS network) and leave it under the control of the individual. So uploading a PGP identity to the network is a unilateral action. A unilateral action does not involve the concept of consent.

The question of consent doesn't matter in the end. The user can just base their request for erasure on the Right to Object (GDPR art 17(1)%%(c)%% => GDPR art 21). The Right to Object is relevant when the user's data is being stored/processed for some legitimate reason. Such as, say, maintaining a secure public directory of PGP identities for a set of users, as in the SKS network case. The user would be objecting to the existence of their PGP identity on the SKS network.

The Right to Object is not and can not be absolute. Your wish to delete your bad grades or your upcoming jail sentence should not be respected. These sorts of exceptions are known as "legitimate interests" (GDPR art 6(1)(f)).

====Legitimate Interests====

Legitimate interests are an important consideration in any question related to the GDPR (GDPR art 6(1)(f))(([[https://www.edpb.europa.eu/system/files/2024-10/edpb_guidelines_202401_legitimateinterest_en.pdf|Guidelines 1/2024 on processing of personal data based on Article 6(1)(f) GDPR]])).
There is a list of possible legitimate interest arguments (GDPR art 6(4)), but the use of the term "inter alia" means that the list is not exclusive. Any requirement for processing/storing personal data could potentially be considered valid. Logically it would seem that legitimate interests entirely nerf the GDPR ... but that's the logic of absolutes. Instead this just means that the rights/needs of the entities involved need to be balanced in some reasonable way. Here that balance would involve the Right to Object of the user requesting erasure vs the Legitimate Interests of the other users of the SKS network.

====The EU Right to be Forgotten====

Typically the GDPR is about keeping private data private as much as possible, all the time. It is much easier to prevent the inappropriate use of data if the distribution of that data is strictly controlled. The SKS network case is somewhat special in that it involves data that someone deliberately made public but then attempted to make private some time afterwards. That case most closely corresponds to the EU Right to be Forgotten, so it could be expected that a regulator would consider the SKS network in terms of that right.

Many places in the world have some sort of right to be forgotten. Canada for example has the [[wp>Criminal_Records_Act|Criminal Records Act]] which provides for the suppression of criminal records in some situations. The right to be forgotten in Europe predates the GDPR(([[https://fra.europa.eu/sites/default/files/fra_uploads/fra-2024-factsheet-right-to-be-forgotten_en.pdf|Right to be forgotten ECtHR and CJEU Case-Law]])) and the GDPR is designed to allow regulators to enforce such a right((The subtitle of GDPR article 17, the right to erasure, is "right to be forgotten".)).

So let's balance the Right to Object against Legitimate Interests in the light of the right to be forgotten...
====In Support of the Right to Object====

  * Erasure could prevent verification of signatures made by the user.
  * Erasure could prevent the encryption of messages sent to the user.
  * Erasure could prevent email spam from the email address in the user ID.
  * Erasure could remove knowledge that a user with a particular name had ever used PGP.
  * Forever is a very long time to retain data.

====In Support of Legitimate Interests====

  * The possible consequences (GDPR art 6(4)(d)) are trivial. The right to be forgotten normally only concerns things like serious criminal convictions that could negatively affect a person's reputation. The sort of thing that could destroy a person's life((See paragraphs 231-235 from: [[https://hudoc.echr.coe.int/eng#{%22itemid%22:[%22001-225814%22]}|CASE OF HURBAIN v. BELGIUM]])).
  * A PGP identity is not "special data" (GDPR art 6(4)%%(c)%% => GDPR art 9).
  * An important context here is that the "controller" and the "data subjects" are effectively the same group (GDPR art 6(4)(b)).
  * The PGP identity is being retained by the SKS network for the purpose of information security (GDPR recital 49), in particular the authenticity and integrity of the publicly stored data. That security is important to the users of the SKS network. Recitals (the "Whereas" section) are not part of the legally effective part of the GDPR, but again, Legitimate Interests are not limited to a particular set.
  * A search must be performed to extract a PGP identity from the SKS network. It will not show up if, say, the user's name is entered into an internet search engine. So the availability of the personal data is limited((See paragraphs 112-113 of: [[https://hudoc.echr.coe.int/eng#{%22itemid%22:[%22001-183947%22]}|CASE OF M.L. AND W.W. v. GERMANY]])).

A regulator would likely be influenced by existing cultural expectations when assigning weights to the various relevant factors. The example of the unlisted telephone number represents this quite well.
Back when paper telephone directories were a thing, you could get an "unlisted" phone number. That meant that your number was changed to something else and the new number was not included in any new directories. That was inconvenient for the person with the unlisted number, because they would then have to individually inform all their correspondents of the new number. Some of those correspondents would have memorized the old number.

The alternative, collecting all the old phone directories and burning them, was obviously impractical and would have caused a tremendous amount of trouble for a great many people. In the same way, PGP identities are normally kept at the end points and are not available for destruction. A regulator would start with the impression that the fairest course of action would be for the user to simply create a new PGP identity and abandon the old one.

It seems unlikely that a regulator would be very motivated to take an action that might cause harm to the SKS network. They certainly would not be forced into a course of action by the construction of the GDPR. The GDPR is designed to be reasonable. It is not based on absolutes.

====What is the worst that could happen?====

Between 2018 and 2023 only 1.3% of GDPR cases resulted in a fine(([[https://noyb.eu/en/data-protection-day-5-misconceptions-about-data-protection-debunked|Data Protection Day: 5 misconceptions about data protection, debunked]])). The fines against individuals and small not-for-profit organizations tend to be on the order of a few hundred Euros(([[https://www.enforcementtracker.com/|GDPR Enforcement Tracker]])). So, say, the operator of an SKS server can put some money in their sock drawer and not be concerned past that.

This feels unsatisfying: we really want to know if things are actually legal or not. But it is an important consideration when assessing the risk associated with a regulatory regime.
====What actually did happen?====

Another important consideration is the history of enforcement against a particular sort of activity. Previous regulator activity is a good predictor of future activity. The GDPR was adopted in 2016 and became effective in 2018. There doesn't seem to have been any GDPR action against the SKS network since then.

Perhaps just as important, there doesn't seem to have been any action against other append-only systems. Making user data public is the primary purpose of the SKS network. Bitcoin on the other hand depends on an append-only scheme and makes the transactions public as a result. The public release of financial transactions is of no particular benefit to the users and is often considered undesirable. So if a GDPR regulator came for the SKS network, they would have to go through Bitcoin and other similar systems on the way. Note that I am not suggesting that blockchain-based currencies are special. There is likely some technology that works as a canary for Bitcoin in turn.

This also feels unsatisfying. It is sometimes suggested that what is happening here is that the regulators are acting as "judge, jury and executioner" or are "picking winners and losers" by choosing which entities to investigate and prosecute. A regulator has a limited amount of time and resources(([[https://lists.gnupg.org/pipermail/gnupg-devel/2018-May/033708.html|Keyservers and GDPR]])). So a choice must be made. A regulator is given a mandate by a government. That mandate usually comes with a significant amount of flexibility. As part of that, the regulator is expected to concentrate on issues that are causing actual social harm. Prioritizing is an important part of a regulator's job.

====The Attack Against the SKS Network====

In 2019 some users of the SKS network experienced a harassment campaign(([[https://gist.github.com/rjhansen/67ab921ffb4084c865b3618d6955275f|SKS Keyserver Network Under Attack]])).
Someone downloaded the PGP keys of some well known people, added thousands or tens of thousands of third-party signatures, and then re-uploaded them to the network. This caused subsequent imports of those keys to take a very long time. Since the SKS network is append-only, there was no easy way to immediately resolve the problem. The harasser was exploiting the basic nature of the network.

There was still a significant amount of apprehension about the GDPR at the time. A sort of synergy occurred, causing a level of hopelessness significantly greater than the sum of the two issues. This caused significant damage to the network. There is no evidence that the GDPR threat was the result of any sort of maliciousness, but the result was still the same.

That is why this is important. A legal theory can act as a sort of attack on a project/system. The result can be a loss of interest in the project/system. It might seem easier to resolve the claimed legal problem in a technical way rather than having to engage fully with the legal aspects. That approach often does not work and can waste a lot of time and effort.

SKS server operators actually received (receive?) requests for erasure that they obviously could not fulfill(([[https://gist.github.com/rjhansen/f716c3ff4a7068b50f2d8896e54e4b7e|SKS Keyserver Network Attack: Consequences]])). This might be a good place to mention that the GDPR has an explicit anti-trolling provision (GDPR art 12(5)) that is specifically intended to be used against such an attack.

=====Further Comments=====

The GDPR is very much about informing people about what is going to happen to their data (GDPR art 13(2)). Clients (web or application) with the capability to upload data to a keyserver often fail to inform the user that they are doing something that might be irrevocable. Warning a user that they are about to do something they can't undo is just good usability, and is something that could be improved for the SKS network.
When doing the research for this article, I was struck by how much the GDPR could double as a usability standard.

Forever is a long time. It probably wouldn't degrade security to define a point where an append-only network would forget about really old data that has not been updated by its owner for a very long time (say 20 years). Such a convention could be helpful for tidying up the network(([[pgpfan:expire#cleaning_up_old_keys|PGP Key Expiry is a Usability Nightmare: Cleaning up old keys]])).

=====Conclusion=====

If I were to tell you that murder was legal in Canada you would be skeptical, even if you were not familiar with the laws of my country. But the statement is perfectly true: murder is legal in some situations involving self defence. The law is not absolute, even in the case of the most absolute law of all, the prohibition against murder.

So if someone suggests that a particular technology is effectively illegal under some law, you should be skeptical. You should apply a reasonableness test, and if that test fails you should ask that someone to show how the law is so badly broken as to produce the unreasonable outcome. In the absence of a reasonable argument based on all of the law, not just some small portion of it, you will most likely be best off just ignoring the suggestion and finding something more productive to worry about.

[[pgpfan:index|PGP FAN index]]\\
[[em:index|Encrypted Messaging index]]\\
[[:|Home]]