When it comes to privacy, regulation is not enough

John Ohno
3 min read · Feb 16, 2022

Opsec is hard. When making opsec decisions, you must decide whether it is safe for you to let some second party know something about you — a decision that depends on who you are, who the second party is, and the resources, incentives, and material circumstances of both.

This calculation changes dramatically over time: at age 16, photographs of you getting drunk at a party are highly sensitive and can only safely be shared with people you trust not to narc on you; at 21, those same photographs are just fine; but if at 33 you decide to run for political office, they once again become dangerous. It changes with circumstances: if you live in LA, it is probably safe to be openly gay, but if you are forced to move to rural North Carolina, you may want to go back into the closet for your own safety. It changes with technology: if a new technique for decoding distorted text comes out, the street signs in the background of a low-resolution photo taken near your house are suddenly available to your stalker or abusive ex. And it changes with incentives and access to resources: if the government decides you might be a terrorist, or KiwiFarms decides you might be cringey, everything you ever said or did will be scoured for potential evidence by people who are already against you and are willing to spend hundreds of man-hours on speculation.
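This opsec arithmetic can be caricatured in a few lines of code. The sketch below is purely illustrative (the class and party names are made up, not anything from a real threat-modeling tool): a disclosure is only "safe" while no one who holds the information has a motive to use it against you, and because information spreads but never un-spreads, the audience set only ever grows while the set of hostile parties shifts under your feet.

```python
from dataclasses import dataclass, field

@dataclass
class Disclosure:
    """One piece of information about you, and who could plausibly see it."""
    audience: set = field(default_factory=set)    # parties who may hold it (only ever grows)
    harmful_to: set = field(default_factory=set)  # parties for whom holding it enables harm

def is_safe(d: Disclosure) -> bool:
    # Safe only if no current holder overlaps with the hostile set.
    return not (d.audience & d.harmful_to)

# The party-photo example from above, at three life stages:
photo = Disclosure(audience={"friends"}, harmful_to={"parents", "police"})
assert is_safe(photo)                 # age 16: only trusted friends have it

photo.harmful_to = set()              # age 21: nobody cares anymore
assert is_safe(photo)

photo.audience |= {"opposition researchers"}   # it leaked somewhere along the way
photo.harmful_to = {"opposition researchers"}  # age 33: running for office
assert not is_safe(photo)             # the old photo is dangerous again
```

The point the toy model makes concrete is that `is_safe` is evaluated once, at posting time, against the `harmful_to` set you can see then, while the `audience` set persists and the `harmful_to` set is re-rolled by the world indefinitely.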

These changes suck, because once someone has a piece of information, they potentially have it forever.

Most social networks are public by default, which means that once you post something, it is effectively permanent: even if you delete it, someone might have taken a screenshot while it was visible. Public posting also makes the second party in the opsec arithmetic, potentially, everyone living on the planet now or in the future.

Privacy advocates tend to focus on campaigning against large corporations and lobbying for regulation (which is to say, campaigning against the large corporation that is the state). This makes sense on one level: with a minimum of effort, it is possible to do a lot of good with regard to the kind of data people don't know they are leaking. The GDPR is a pretty good start: it theoretically keeps small corporations from storing and re-selling data to third parties, though large corporations can eat the fines, because they make more money from the sales than they lose, and small corporations can simply hope they won't be…
