On Fundamental Laws of Cyber Security

March 17, 2024

security

I am currently working my way through The Open Society and Its Enemies by Karl Popper, and I find myself in the middle of his critique of Plato and his views on nature and convention. In this section, Popper is working to draw a distinction between natural laws and normative laws.

A natural law is one “describing a strict, unvarying regularity which either in fact holds in nature (in this case, the law is a true statement) or does not hold (in this case it is false)”. Of course, a proposed law whose truth is not yet established is termed a hypothesis, and science is constantly testing these for validity. In contrast, normative laws are described as a “legal enactment or a moral commandment”. Popper argues that norms are different from natural laws, even if some think that those norms are “in accordance with human nature”.

While interesting, it got me thinking about what some fundamental laws (or hypotheses in this case, since I’m not going to prove them) might be from a cyber security perspective. What are some things we can guarantee to be true, regardless of the context, use-case, etc.? I came up with three that might be of interest.

  1. Data is always copied when accessed

I think this one is fundamental, and one that is often ignored when considering cyber security. In order to access data, and by access here I mean view or interact with, you MUST copy the data from its location into a system that is capable of displaying or interacting with it. Data access can be obscured via various technologies, such as remote desktop, but ultimately the data is still copied in some form. Consider that the “meaning” of the data is what is actually useful, not always (or necessarily) just the data itself. Even from a DRM perspective, the data itself is copied to the target, and accessed only via approved software performing safe operations (hopefully). Various techniques exist to, say, limit the time (or the format, etc.) the data is available on the destination, but it is still ultimately copied. The key here is that you can’t really limit the movement of data, so plan accordingly.
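
To make this concrete, here is a minimal sketch in Python. The file names and contents are hypothetical; the point is only that merely reading data pulls a copy into process memory, and whatever protected the source no longer governs that copy.

```python
# Minimal sketch: reading data necessarily copies it out of its original location.
# File names and contents here are hypothetical.

# Pretend this file is data protected at its source (permissions, encryption at rest, DRM...).
with open("secret.txt", "w") as f:
    f.write("the quarterly numbers")

# Merely "viewing" the data pulls a copy into this process's memory.
with open("secret.txt") as f:
    data = f.read()

# That in-memory copy is no longer governed by whatever protected the source:
# it can be written anywhere, transformed, or simply remembered as its "meaning".
with open("unprotected-copy.txt", "w") as out:
    out.write(data)
```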

  2. All digital communication is built upon inherently un-authenticated patterns and protocols

Authentication and authorization in digital contexts are paramount, and they form the basis of security whenever someone mentions the “modern perimeter” or “zero trust”. Buzzwords aside, it is important to understand that, at the bottom, the underlying protocols are all un-authenticated. Authentication is something that is built on top of these protocols by layering additional pieces of information into packets. The computer doesn’t have sensing capabilities like we do, so each packet ultimately carries a pile of information that gives the computer context about the nature of a request, which can include authentication information.
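
As a small sketch of that layering, consider a raw HTTP request built in Python. The host, path, and token are hypothetical; the point is that the identity claim is just more bytes added on top of a transport that has no notion of who sent them.

```python
# Minimal sketch: "authentication" is just more bytes layered into an
# otherwise un-authenticated protocol. Host, path, and token are hypothetical.

host = "api.example.com"
token = "hypothetical-bearer-token"

# TCP/IP itself carries no notion of identity; HTTP is plain text on top of it.
request = (
    f"GET /reports HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    f"Authorization: Bearer {token}\r\n"   # the identity claim: just another header
    f"Connection: close\r\n"
    f"\r\n"
).encode()

# Nothing below this layer verified who produced these bytes.
print(request.decode())
```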

  3. All general purpose computing constructs can be convinced to do bad things

Okay, I’ll admit, this last one might be a stretch, but I think it’s likely valid. Anything designed for general purpose use can be used in a way that is harmful, unethical, or contrary to the maker’s intent. This is why security requires a pinch of both detection and prevention (okay, maybe more than a pinch). Maybe what we need is a general way of creating the code we want to run, and then a “fixed” way of running it so that the intent of the code is not violated (… okay okay, this sounds a lot like the aims of confidential computing).
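
A tiny sketch of the idea, using Python’s eval purely as a stand-in for any general purpose construct; the “calculator” and its inputs are hypothetical illustrations, not a real feature.

```python
# Minimal sketch: a general purpose construct exposed for a benign purpose
# can be driven well outside the maker's intent. Inputs are hypothetical.

def calculator(expression: str) -> object:
    """Intended use: evaluate simple arithmetic typed by a user."""
    return eval(expression)  # general purpose: it will evaluate *any* expression

print(calculator("2 + 2"))                             # the intended, harmless case
print(calculator("__import__('os').getcwd()"))         # same construct, contrary to intent
```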

Conclusion

What I liked about this exercise is trying to think about the difference between natural and normative laws. Natural laws cannot be changed, no matter how hard we try, and we need to factor that in when building solutions with cyber security in mind. It doesn’t matter what the use-case is (even AI, for that matter); these laws hold true and need to be accounted for. Normative laws can (and must) be context sensitive, and are also changeable over time. Can you think of other natural laws that we need to know for cyber security?