5 epochal changes humanity must embrace ASAP to secure a bright future - Part 3 of 5
Published: February 16, 2025
In this long, five-part article, I'm focusing on what, in my humble opinion, humanity must do as soon as possible to guarantee ourselves a future so bright it seems utopian, while effectively averting the worst that could happen.
Wait, this is part 3 of the article. If you missed them, part 1 is here, and part 2 is here.
In this third part of my lengthy article, I'm tackling something very close to my heart, because, unfortunately, it has been causing serious, continuous, and utterly pointless damage for many years, damage to both companies and individuals. It's inevitable, then, that I'll be speaking about this in a very heartfelt, perhaps even passionate, way. Just think: although I often use various LLMs to help me write these articles, and they usually encourage me to be more moderate and to present opposing viewpoints to increase my credibility (this time, of course, was no exception), never before have I ignored their advice so completely. I only made sure that what I'm writing is true, that is, that the current laws actually say what I'm about to present. I've known this for years, of course, but an extra check couldn't hurt. So, today I'm going to deal with privacy, starting immediately with my proposal.
Privacy laws, and especially the GDPR, need to be completely rewritten from scratch
Since I'm going to say things that will seem outrageous to most, it's absolutely necessary to state right away, unequivocally, that I personally believe the right to privacy is sacrosanct for all of us. It is therefore perfectly right that there are laws that regulate it rigorously and that provide very hefty fines for those who don't respect them. That said, I'll start right off with a low blow: current privacy laws do everything, everything, and when I say everything I really mean everything (unfortunately, I'm referring to the damage they do), except actually protect the privacy of us citizens. This is why I propose rewriting them completely from scratch.
Let's see what I mean. To do this, I'll tell you how I first came into contact with the GDPR. This alone will show you how, right from the start, this law revealed itself to be the incredibly damaging misstep in the protection and security of citizens' data that it actually is.
Those of us who have been computer geeks for decades are used to relatives, friends, and acquaintances occasionally asking for help with computer-related things. With the advent of the internet, it quickly became normal to receive, often frequently, questions like: "What do I do? The email client is asking me whether to open the attachment!", "What do I click on this website? It asks me: <various deceptive questions>. Do I click yes or no?", "A window has appeared in front of all the others! There's only an 'OK' button, what do I do, should I click it?" and so on…
Those were the golden years of internet scammers, the jungle of spam emails, the pop-ups that appeared out of nowhere, even in your jacket pocket, and people like me spent hours thinking: "How do I teach all my friends to recognize and avoid scams at a glance, on their own, without needing to ask me or other experts for help every time?" And it turned out there was an extremely simple answer to this question. So simple, in fact, that it almost seemed trivial. Simple, understandable by anyone, very easy to remember, very easy to apply: "Always click NO! And if there's no NO, click CANCEL. And if there's no CANCEL either, click the cross at the top that closes the window!" Period! End of story! That's it! Understandable by anyone! Doable always, 100% of the time, even by a two-year-old child!

And for those who really insisted on understanding more, it was easy to explain: "If the attachment in the email weren't dangerous, the email client wouldn't ask you whether to open it. The mere fact that it's asking is proof that it's better not to!" "If the pop-up in the browser wanted to do something legitimate, it wouldn't need your authorization, it could just do it. If it's asking you questions, it's asking permission to do dangerous things, and that alone should tell you to answer NO!"

Thanks also to the cooperation of the UI and UX departments of the various browser manufacturers, who, it must be said, did an excellent job on this front, organizing authorization requests in a coherent and universal way, that is, always and only requiring an explicit YES to authorize the most dangerous operations, it was very easy to make even the most computer-illiterate users understand that as long as they said NO, clicked CANCEL, or closed the window without answering at all, nothing bad would ever happen to them. Best practices the world over had converged on the most logical and natural rule there is: unless you really know what it is, never say Yes! A direct descendant of the universal advice par excellence: "Don't accept candy from strangers!"
Then, the GDPR came along!
And here it is necessary to stop and take a deep breath.
Now, what would anyone with even minimal knowledge of the web, or, let's face it, of life in general (I used the candy-from-strangers example on purpose: on the web, read "cookies"), have legislated for privacy on websites? The most obvious and predictable thing in the world! The law would simply have established that in a clearly visible place, that is, at the very top of the page or at the very bottom, in the footer, each site is required to have a link to the details of its data processing, and that on that page the user is also offered the possibility of customizing some aspects. The end! Period! Enough! There was absolutely no need for anything more. Those who understood something would always have known where to go and what to do. Those who understood nothing would have continued to understand nothing, as is still the case today, but at least they would not have been tortured with questions that risk instilling in them the very dangerous habit of accepting things they understand nothing about. In short, in a world that has taught us to always say NO, because the bad questions are all of the type:
- Do you want to be deceived?
- Are you okay with me stealing your money?
- Shall I kick you in the teeth?
The European legislator has come up with the brilliant idea of forcing us to get used to saying YES to questions like:
- Shall I steal your personal data? If you want to see the site, you have to say YES!
What the hell, gents! Are you serious?
The reality is that whoever designed this law probably never considered the impact it would have on users' daily browsing experience. For the first time since the internet was born, against all the most widespread, consolidated, simple, logical, and self-explanatory best practices in the world, to do something harmless, normal, not even remotely dangerous, instead of clicking NO you had to explicitly click "Yes," or "I accept!", or even "OK" or "I want it!", without even knowing what the heck it was talking about! And this created a very real problem: cybercriminals found a way to get users to click "Yes" by disguising their prompts as GDPR consent banners, thereby unlocking the very dangerous features that browser manufacturers had, until then, done such a good job of keeping locked behind an explicit YES. Not to mention, of course, the embarrassment of people like me who suddenly found themselves forced to explain to relatives and friends: "Eh, I know, for years I told you to always click NO, but unfortunately now you have to click Yes to some questions if you want to browse that site!" But how do you explain when to choose Yes and when to choose NO? European legislators complicated our lives without giving us anything useful in return.
The GDPR is, unfortunately, the strictest privacy law in the world, and therefore it is often taken as the model to follow: if you comply with the GDPR, you are very likely compliant with every other privacy law in the world. Unfortunately, however, since the GDPR is a terrible law, and certainly the worst of all privacy laws, the fact that it is also very strict implies multiple additional disadvantages:
- It pushes even websites and companies that, being outside Europe, could happily avoid the absurdities of the GDPR to adopt its worst practices anyway, in an attempt to manage privacy once and be compliant with the whole world;
- Since it has become the de facto worldwide reference standard, however bad it may be, modifying it implies a very high risk of undermining the work of the millions of sites that have nevertheless managed to become compliant with it.
These two things are so true that, after years of tremendous tribulations, a growing number of companies around the world, in order to avoid the absurdities of the GDPR, are simply giving up on making their products and services available in Europe. Many naively think these are just a few isolated cases, like Apple Intelligence, some still-immature AI models, and little else. Even more mistakenly, many are also convinced that these are temporary situations, that sooner or later everyone will fall in line and everything will eventually become available in Europe too. Let's say it right away: this is simply false. There are already many products and services that are no longer available in Europe, and not just temporarily. What's more, a practice is consolidating whereby any new technology, product, or service ready for market is, by default, made available only to non-European users and stays that way indefinitely; only later, and only if it really turns out to be worth it, will the company evaluate whether to take on the changes needed to become GDPR-compliant and open up to the European market as well. By default, then, the European market is increasingly kept out of the most innovative programs, out of beta and research phases, out of experimental sites for new technologies, and not only by small or irrelevant companies: as we have seen, even companies like Apple, Google, and many of the largest in the world are doing it.
What has been said so far should already be enough to deprive us of sleep and serenity, and to convince us to demand, loudly and urgently, that our governments fix this damned burden of the GDPR. In reality, we have only shown the classic, negligible tip of the iceberg of the GDPR's problems; this unfortunate law hides far worse surprises for those who know it and have suffered it firsthand. It is a law that should protect our privacy. And yet: what happens if last night you were in that particular restaurant with a beautiful blonde, and your doorman, having spotted you there, tells your wife, who thought you were at the office until late in the evening? What happens to the person who sticks a sticker on your intercom reading "Mr. <your name and surname> must urgently call the debt collector at the following number to settle his debt position"? What happens to the person who goes around talking about the three years in prison you served a while back? And so on… It's a privacy law; you'd expect these to be the things it mainly deals with, instead of pestering you with silly pop-ups while you browse the web. Well, on these other fronts, evidently considered very marginal by the legislators who designed this law, you are not protected in any way: zero, nothing at all. Unless you can irrefutably prove that you have suffered damage and can incontestably quantify that damage economically, the only thing that could possibly happen to someone who has improperly disclosed your private affairs is a fine of a negligible amount.
To understand what we're talking about, consider that with the GDPR, multi-million euro fines are imposed every year on companies that do not comply with the regulation. As early as 2020, there was talk of 160 million euros per year, with a growth trend of 40% per year! As if to say: a piece of data that accidentally and involuntarily escaped from a company, revealing the incredibly confidential information that yesterday I bought one brand of mayonnaise rather than another, can "rightly" be punished with a fine of up to 20 million euros, while someone who, with the explicit intent to harm me, goes around talking about the rape I suffered a few months earlier enjoys total or near-total impunity. I do not feel protected in my privacy by a law like this. And even where a fine or sanction is provided for those who violate the GDPR, it never results in any kind of compensation for me, the offended citizen. And they have the nerve to say this law was made to protect the citizen's privacy! Another clear example of Orwellian doublethink!
But enough about the GDPR: it is the worst of all privacy laws in the world, and we've established that. The problem, however, is broader and more general. With the possible exception of alternative models such as the Californian (CCPA) or Japanese (APPI) ones, which could offer insights for a more balanced approach, all over the world the issue of privacy has completely gotten out of hand. Almost everywhere, legislators have forgotten to design these laws starting from a principle that is deeply consolidated in all modern constitutional law.
I am referring to the principle known as the "harm principle" (a close relative of the "non-aggression principle"), formulated with particular clarity by the philosopher John Stuart Mill in his essay "On Liberty" (1859). In its classic formulation, Mill states that "the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others." In the legal field, this concept is often expressed with the Latin phrase "neminem laedere" (harm no one), one of the three fundamental principles of Roman law.
In modern constitutional law, this principle is reflected in the concept of the "balancing of fundamental rights," where an attempt is made to find the right equilibrium between individual freedoms and collective interests. Another common formulation is the "principle of proportionality," used above all in constitutional and administrative law, which requires that limitations on individual rights be proportionate to the protective purpose being pursued. All of this has been blissfully ignored in most of the world's privacy laws, forgetting that in modern societies a great deal of the data relating to us has immense social value and utility, and that its value will grow even higher as artificial intelligence evolves. Forcibly preventing the use of this data, always and in any case, inflicts significant damage on the community.
It is therefore perfectly right to protect the privacy of those who buy online, expose themselves on social media, or use healthcare facilities and services. It is inappropriate, however, and highly detrimental to the interests of society, to allow citizens to block the use of their data, especially when anonymized with effective techniques, for market research, scientific research, or the training of artificial intelligence models in general. Privacy laws must evolve and recognize that there are different types of data, and that in some cases data derived from the original data must be freely usable by those who hold them, without owing anything to the individual citizen. Unfortunately, current laws do not provide for this. At most, they recognize a type of data defined as "anonymized," which, by its very definition, cannot be considered automatically disconnected from the original data; therefore, if the user does not consent to the anonymization of his data, that data cannot be used in any way. Two serious problems arise from this legal mess:
- The law provides that even when the user gives his consent to a certain use of his data, he can revoke it at any time. This is a real disaster for companies. Consider a company with 30 million customers: how could it keep up with thousands of customers continuously changing their consent, and still use its database for research, training, and so on? What should a company in the middle of a training run that will last another 3 months do if even just one user calls and says: "Oh no, wait a minute, I changed my mind, you can't use my data in that training!"? I'm sorry, but a law on a citizen's right to privacy that allows this goes far beyond the threshold of the "principle of proportionality" at the base of any balanced and modern system of constitutional law;
- Recognizing the existence of only two types of data, the original data and the anonymized data, leaves room for endless discussions on when, how, and why anonymized data can still be traced back to the original data, possibly through de-anonymization processes powered by artificial intelligence, making the distinction between anonymized and original data useless for practical purposes (the first sketch after this list illustrates how fragile naive anonymization is). If the law does not recognize that there are also data absolutely not attributable to the original data, even though they still derive from them, that is, data whose correspondence with the original data is not one-to-one, then the use of any type of data becomes in practice impossible without the consent of the user from whom the data originated. Yet the reality is very different. Today's technologies show us that this third type of data does indeed exist: it is possible to create completely synthetic data, statistically equivalent to the original data, yet never traceable back to any individual original record. This is the technology of GANs (Generative Adversarial Networks), very particular neural networks that have precisely this ability (the second sketch after this list shows the idea). So where's the problem? The problem is that GANs themselves need to be trained to produce synthetic data, and to train them you need the original data; but since the law does not recognize that the synthetic data produced by GANs are not attributable in any way to the original data, it does not allow companies to use the original data to train GANs in the first place. Again, this goes far beyond the threshold of the "principle of proportionality" at the base of any balanced and modern system of constitutional law: the data resulting from such a process could not harm individual citizens in any way, not even in the most blatantly imaginary case, and yet the law forbids this use. Let's be honest: this does not mean protecting the citizen's privacy, it means protecting their unjust and unrealistic whim at the expense of the collective good.
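To make the first problem concrete, here is a minimal, purely illustrative Python sketch (the email address, the hashing scheme, and the record fields are all invented assumptions, not any real system) of why naively "anonymized" data remains linkable to the person it came from: a deterministic pseudonym can be recomputed by anyone who knows, or guesses, the original identifier.

```python
import hashlib

def pseudonymize(email: str) -> str:
    # Deterministic hashing is often passed off as "anonymization":
    # the same input always yields the same pseudonym.
    return hashlib.sha256(email.encode("utf-8")).hexdigest()[:12]

# A "de-identified" dataset released for research (invented example).
released = {
    pseudonymize("mario.rossi@example.com"): {"age": 41, "city": "Milan"},
}

# An attacker who knows or guesses the email simply recomputes the
# pseudonym and re-links the record to the real person.
guess = pseudonymize("mario.rossi@example.com")
print(guess in released)  # True: the record is re-identified
print(released[guess])    # {'age': 41, 'city': 'Milan'}
```

This is exactly why the debates never end: as long as some one-to-one correspondence with the original record survives, re-identification is always on the table.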
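The GAN idea behind the second problem can also be sketched in a few lines. What follows is a deliberately minimal toy, not a production pipeline: the "original records" are random numbers, and every dimension, layer size, and learning rate is an illustrative assumption. What it shows is the structure of the technique: a generator learns, by competing against a discriminator, to emit records that are statistically similar to the training data while being produced from pure noise.

```python
# A toy GAN that learns to generate synthetic "customer records".
# Assumes PyTorch is installed; all sizes and data are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for 10,000 original records with 5 numeric features,
# already scaled to [0, 1] (in reality: age, spend, visits, ...).
real_data = torch.rand(10_000, 5)

latent_dim = 16  # size of the random noise the generator starts from

generator = nn.Sequential(
    nn.Linear(latent_dim, 64), nn.ReLU(),
    nn.Linear(64, 5), nn.Sigmoid(),   # outputs one synthetic record
)
discriminator = nn.Sequential(
    nn.Linear(5, 64), nn.ReLU(),
    nn.Linear(64, 1),                 # real-vs-fake logit
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()
batch = 128

for step in range(1_000):
    # Discriminator step: learn to tell real records from generated ones.
    real = real_data[torch.randint(0, real_data.size(0), (batch,))]
    fake = generator(torch.randn(batch, latent_dim)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(batch, 1))
              + loss_fn(discriminator(fake), torch.zeros(batch, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator step: learn to fool the discriminator.
    fake = generator(torch.randn(batch, latent_dim))
    g_loss = loss_fn(discriminator(fake), torch.ones(batch, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, synthetic records are produced from noise alone:
# no output row corresponds to any specific original customer.
with torch.no_grad():
    synthetic = generator(torch.randn(1_000, latent_dim))
print(synthetic.shape)  # torch.Size([1000, 5])
```

Note the asymmetry the law ignores: the trained generator needs only noise as input, yet to get there you must show the discriminator the original records, which is precisely the step current rules block.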
In summary: it is sacrosanct that each of us has the right to privacy, but laws like the GDPR often end up not protecting what really matters to the user, while making life impossible for those who do business and try to use data lawfully, sometimes even in anonymized form. In addition, users affected by real privacy violations receive no compensation: the state collects the fines from companies and leaves citizens empty-handed. And as if that weren't enough, users can even prevent the use of anonymized data, with the result of blocking research, market analysis, and innovation.
My proposal, therefore, is:
- make data management a thousand times simpler (and free of bureaucratic labyrinths) for companies;
- eliminate the user's right to object to the use of anonymized data: once the data is no longer attributable to an individual person, why prohibit its use and hinder opportunities for growth and competitiveness?
- create new rules that recognize automatic compensation for the victims of real violations: compensation must be the tool that balances the relationship between citizen and company, without long legal battles or the need to prove often incalculable damages.
If you agree with all this, spread this article as much as possible: share it, become a promoter yourself. In the name of the good of all of us, and of the chance to finally see the right to privacy recognized in a way that is truly useful to us citizens, and not merely useful to institutions as a way to make money.