Nuremberg Code for Your Online Data

By Amandeep Midha, Principal IT Consultant, BEC

January 23, 2022

Let’s talk about the consent you give over your data in the Internet age. It’s no longer a secret that big tech companies tap into your preferences and search queries so they can send you targeted advertisements. What you may not know is that even when browsing in Incognito mode, your actions may not be as anonymous as you think, according to this Wired article. The difference between Incognito mode and “private” is where things get tricky. Nowhere was this more apparent than in the lawsuit against Google over how it operates Incognito mode.

Google said that “incognito” should not be confused with “invisible”. Incognito mode only prevents user data and cookies from being saved on the local machine; the rest of the functionality remains the same. For example, if your persona is matched and Google can capture that insight about you on its end, without saving anything locally, then its Incognito function has served its purpose.

Google VP Brian Rakowski, the brain behind Incognito mode, explained that although Google states Incognito enables browsing “privately,” what users expect “may not match” the reality (i.e., what is described as safe and private is not, in fact, safe and private).

Let’s look at the cookie universe, where an active data protection regulation like GDPR exists. A user trying to browse a website is given several cookie options. The most visually encouraged option is a single click on the “Accept All” button to continue browsing, with minor variants differentiating between types of cookies. But it’s not clear whether there is still an equivalent to Incognito mode if all cookies are rejected upon visiting a page.

However, it does not end there for users. As more user data is shared with outsourcing partners and vendors, more privacy concerns arise. Suppose you have a bank account with online access. Your banking platform now offers a plug-in or vendor integration that makes certain transaction insights possible, but your “consent” is required. This raises certain questions:

  • What is the user consenting to?
  • Is there an alternative for users who would rather not consent to this vendor plug-in?
  • Can the consent be revoked? How?
  • Is this not a case of consent by coercion?
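The revocation question above can be made concrete. Below is a minimal, hypothetical sketch (all names are illustrative; this is not any real banking API) of a consent record that captures what was consented to, for which purpose, and to which recipient, and that supports revocation at any time:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One explicit grant of consent, scoped to a purpose and a recipient."""
    user_id: str
    recipient: str          # e.g. the vendor plug-in receiving the data
    purpose: str            # what the data may be used for
    granted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Consent must be revocable; record exactly when it was withdrawn."""
        if self.revoked_at is None:
            self.revoked_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        return self.revoked_at is None

# Usage: grant, check, then revoke
consent = ConsentRecord(user_id="u42", recipient="vendor-plugin",
                        purpose="transaction insights")
assert consent.is_active()
consent.revoke()
assert not consent.is_active()
```

The point of the sketch is that consent is a scoped, timestamped, revocable record rather than a one-time boolean; a recipient or purpose outside the record would require a new grant.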

Consent And GDPR

When it comes to consent, it is not a mere binary input to a process but a complex philosophical principle: the intent behind giving consent must be seen in the context of what the user actually possesses and controls. GDPR goes a step further and differentiates whether the data being consented to is “private” or “private and confidential”.

Under GDPR, it is the data controller’s responsibility to ensure that the consent giver has understood why they are consenting and what data they are providing; any use of the data beyond what is stated is non-compliant. Non-compliance on consent may be seen as a fundamental omission with significant legal implications (a breach of the consent).

Any case of mistrust that a user has about the use of their data can be addressed by requesting that their data be deleted from an organization’s records. The organization is legally bound to comply except in certain limited cases.
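How such an erasure request might be handled can be sketched as follows. This is a hypothetical illustration, not a compliance implementation: the exemption labels are placeholders standing in for the real, narrower grounds on which GDPR (Article 17(3)) lets a controller retain data, such as legal obligations.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative stand-ins for the legal grounds that permit retention.
RETENTION_EXEMPTIONS = {"legal_obligation", "public_interest", "legal_claims"}

@dataclass
class StoredRecord:
    user_id: str
    data: dict
    retention_reason: Optional[str] = None  # None: no exemption applies

def handle_erasure_request(records: list, user_id: str) -> list:
    """Delete a user's records unless a recognised exemption applies."""
    kept = []
    for rec in records:
        if rec.user_id == user_id and rec.retention_reason not in RETENTION_EXEMPTIONS:
            continue  # erased: no legal ground to keep it
        kept.append(rec)
    return kept

# Usage: two of u1's records exist; only the legally retained one survives
records = [
    StoredRecord("u1", {"search": "..."}),
    StoredRecord("u1", {"invoice": "..."}, retention_reason="legal_obligation"),
    StoredRecord("u2", {"search": "..."}),
]
remaining = handle_erasure_request(records, "u1")
assert len(remaining) == 2
```

The design choice worth noting is that every retained record must carry an explicit reason; “delete unless justified” is the default, not the exception.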

Could the War Against Targeted Ads Be the Ultimate Objective?

Lately, there has been a significant amount of news about the war against targeted advertisements. To contain this to some extent, some websites offered cookie banners with verbose classifications of “consent”. Others included the ability to toggle targeted advertisements on or off.

The key question remains unanswered. As long as your search and browsing history and bookmarked items stay mapped to your digital persona, you may not be targeted with ads today, but similar innovative new products could still be built on that data to achieve the same goal.

The reasoning given by the likes of Vestager against an outright ban is that it would hurt small businesses. This reasoning is based on equitable distribution or envy (or both), while the same policymakers seem untroubled by governments’ enormous hoarding of user data.

One should not confuse the uproar against targeted advertisements with an initiative towards data privacy in any sense. The actual matter, therefore, comes down to the user’s “voluntary” consent to submitting such data about their online activity, and the data controller’s commitment to use the data only as intended, algorithmically or by whatever other means. Privacy-oriented browsers therefore do come in handy. But current browser monetization models and the building of relevant advertisement feeds would still take centre stage as commercial competitors fight to stay afloat.

GDPR & the Nuremberg Code Together Are an Ideal Marriage of Consent and the Context of Consent

The Nuremberg Code of 1947 is generally regarded as the first document to set out ethical regulations for human experimentation based on informed consent. It is a set of research ethics principles for human experimentation created by the court in U.S. v Brandt as one result of the Nuremberg trials at the end of the Second World War.

The headache for privacy activists begins the moment users click the usual “I agree” button to accept the software’s terms and conditions. Buried inside those terms and conditions are endless contracts and explicit waivers of the common understandings users have about fair use of the data collected about them.

While GDPR clearly outlines the difference between a data controller and a data processor, it’s important to note the responsibility that lies with the data controller and where the distinction between private general data and private sensitive data applies. One cannot be coerced into sharing certain private sensitive data in exchange for services. And once such data has been provided, the context and agreement under which it was collected must hold valid throughout the entire data retention period.

Whether your browsing history is sensitive data is debatable. But can a search provider block your further searches unless you share your browsing history (or allow it to store and construct your online persona)?

It is an entirely different discussion, however, if an online search provider, which is a private entity, wants to offer you certain services only in exchange for such personal data (i.e., your browsing history). There lies the foremost violation of the Nuremberg Code.

Could Web3 hold the Answer?

Amid the discussions around Web3, Tor Bair, founder of the privacy blockchain Secret Network, tweeted:

“NFTs, blockchains, all public-by-default and terrible for ownership and security”.

Most websites, even in the crypto space, use various third-party providers for APIs and analytics. That’s where your crypto identity can be traced, combined with other private information, and linked to your entire transaction history on public ledgers.

Web3 does hold promise with certain offerings like Panther Protocol and Findora. For example, Panther Protocol utilizes zk-SNARK technology. Despite the increased complexity, this still allows for a banal “I agree” button, and hence a not-so-wholesome mix of centralization and decentralization, where assigning responsibility for the consequences of a data leak could get even trickier.
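A full zk-SNARK is well beyond a short example, but the much simpler commit-reveal idea gives a feel for the underlying principle of proving you hold certain data without publishing it up front. This is a toy sketch, not anything from Panther Protocol or Findora:

```python
import hashlib
import secrets

def commit(value: bytes) -> tuple:
    """Commit to a value without revealing it: publish only the digest."""
    nonce = secrets.token_bytes(16)  # random blinding, prevents guessing the value
    digest = hashlib.sha256(nonce + value).digest()
    return digest, nonce

def verify(digest: bytes, nonce: bytes, value: bytes) -> bool:
    """Later, the value and nonce are revealed; anyone can check the commitment."""
    return hashlib.sha256(nonce + value).digest() == digest

# Usage: the public only ever sees `digest` until you choose to reveal
digest, nonce = commit(b"my transaction data")
assert verify(digest, nonce, b"my transaction data")
assert not verify(digest, nonce, b"tampered data")
```

Unlike this toy, a zk-SNARK lets a verifier check a statement about the hidden data without any reveal step at all, which is what makes it attractive for consent-sensitive designs.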


The arrival of GDPR in May 2018 triggered a focus on online data privacy, not only in the EU but worldwide, and on the importance of consent in product architectures. While consent is important, it is far more important that consent be obtained voluntarily and not coerced from the user, whether through a typical “I agree” form or through outright denial of service to those users who do not want to consent to certain uses of their data. After all, the Nuremberg Code had this important message for us: consent should be voluntary and never coerced. Technology providers should always remember that.

(Image Source: Informed consent by Nick Youngson CC BY-SA 3.0 Pix4free)