Guest post: The technical term “ick”
Originally a comment by latsot on Talk about imperfect timing.
And as for (over) sharing in general…. When I was an academic computer scientist, a lot of my research was about privacy. I’ve expanded on this work a little since for people with actual money and for the book I’m supposed to be finishing (I’ll do it in a minute, OK, stop nagging.)
Anyway, without going into detail, one of the things I talk about a lot is the idea of privacy as a community enterprise. We have a tendency to see secrets, for example, as things we own. But we don’t; most secrets are shared with other people and if they’re not, the mechanisms by which we keep secrets are shared. Think of codes of silence or village huts with very thin walls; we pretend not to hear or we give people space if we think they need it. We work at producing environments in which we can keep secrets or share them (either implicitly or explicitly) under some mutual understanding that if you blab, the tables could be turned.
It’s all a lot more complicated than that of course but hopefully you get the idea.
Anyway, it should be obvious that social media is not built according to this model. It encourages over-sharing and the custodians of our privacy are no longer people we kinda-sorta trust, working in muddily mutual self-interest. I hypothesise that this is one of the reasons that social media is so batshit insane. It’s not anonymity as such that’s the issue, it’s a mistuning of inhibition because the space we’re operating in is fundamentally different to how we think it is and the nature of our interactions is not what we believe it to be.
So, to badges announcing our pronouns, sexualities, love of the Klingon language and cycling proficiency test scores. There’s a technical term (really) in privacy research called “ick.” Ick is when something doesn’t feel comfortable. It feels like oversharing. Someone knows a bit too much about you or you know a bit too much about them. For instance, one night when he was drunk, my dad told me out of the blue that he was circumcised. I didn’t know what to do with that information and now neither do you. It’s icky. It didn’t fit the expectations of our interaction. There were no rules within that interaction that told me whether or not it was appropriate for me, for example, to share that information on a public blog. Ickiness muddies the rules. That unexpected item (or lack of it) in the baggage area (so to speak) ruins the reasonable expectation of privacy. It also creates a prompt to over-share; he told me something personal… am I supposed to tell him something personal to balance things out? Ick can feel like a debt even when you didn’t want the information you’ve learned in the first place.
You can see where I’m going with this. Interactions between a patient and a doctor are intimate but now the patient is being bombarded with whatever information the doctor has chosen to broadcast. It’s icky, it changes the balance of power and it creates an expectation to over-share. Am I supposed to tell my doctor about my sexuality? Is she expecting it, or just open to hearing it? Just because she showed me hers, it doesn’t mean she wants to see mine. Or does it? Who the fuck knows any more?
OK, I’m quite aware that I’m having badges do quite a lot of heavy lifting here, but I hope you see my point: the reason they seem icky is because they violate our tacit expectations of how that kind of formal but intimate relationship is supposed to work and we have no idea how to react. As when we use social media, the environment is taken out of our hands and placed in those of people who do not have our best interests at heart for reasons we don’t fully understand.
And we don’t know how to react. And it’s icky.
Yes, there’s a particularly delicate dynamic between patient and doctor. Several years ago I had to see a specialist who was going to remove a small but uncomfortable bump on my eyelid. While we were waiting for my anesthetic to take effect he noticed my necklace and asked about it. I casually mentioned it was the atheist symbol. Very friendly, and equally casually, he started to tell me why he didn’t see how atheism could explain so many things about the beautiful world.
This made me very uncomfortable. I debated this stuff online. Over coffee or even sitting next to him on a bus I would have been fine having what was clearly meant to be a bit of neighborly apologetic chit-chat. But this man was about to take a scalpel and start cutting my eyelid. I couldn’t respond. He seemed nice, which was why I told him about the symbol, but no freaking way was I going to push it and explain why beauty was possible in a godless reality. He had a knife. I sat and smiled. And took it.
Ick.
Later, I thought about it and called him to complain as graciously as possible, because it occurred to me that what he did was rather inappropriate. I politely explained. He was desperately sorry and apologized profusely (Wisconsin.) No harm.
But these pro-trans-ideology buttons strike me the same way, only worse. The doctor is the one bringing in the controversial symbols. They’re a friendly little bit of what-could-you-possibly-object-to passive-aggressive privilege. It’s the doctor. He has a knife … somewhere. You have to sit and take it.
Oh yikes. Your EYELID of all places. It’s good that he apologized profusely because honestly, he should have realized. There’s that bromide about never discussing politics or religion at Thanksgiving dinner, or whatever it is. Surely doctors are told that early and emphatically.
I think the rule is never to discuss politics or religion – unless you’re the one holding the knife.
And Latsot, ICK!
:-)
I thought it was oversharing for my doctor (only a temp, thank FSM) to have a giant crucifix on the wall of the examining room. And I was wearing an atheist t-shirt that day.
This is off topic, but I have a question for latsot. I am currently writing a novel in which punch cards are being used because one of the characters believes they are harder to hack. They don’t have to be for my story to work, but I was wondering if you could speak to that? Are the old computers with punch cards more private?
Thanks.
My partner once was getting a dodgy mole removed, when the doctor, scalpel in my partner’s back, decided to raise his support and agreement with conservative right wing Christian politics. My partner, a slightly left of centre avowed atheist, felt unable to say anything at all. Because the doctor had a scalpel in his back.
iknklast,
Happy to help. It might be a complicated question, though, depending on what you mean: more private than what, exactly? And what do you mean by “hack”?
Forgive my rambly reply, I didn’t sleep much and I’m feeling a little disorganised. But here goes:
Punched cards were used both for programming computers (where the cards represented instructions in a programming language) and for data entry (where cards represented records to be stored).
To store a record, then, someone would have to collect the data from its source in the real world and, probably, write it down somewhere. Then someone (in those days, probably a secretary in a typing pool) would use a special typewriter to make the punched cards. Then the cards would be loaded into a hopper, the computer would suck them up and the data would be stored in the computer. You can see how easy it is for errors to be introduced at any stage here and how difficult it would be to check for them. This is itself a privacy issue.
To get data back out, you’d need to write a program, which would be a series of instructions encoded onto punched cards. So someone would need to make an informal request (“get me a list of all customers born before November 1937”), a programmer would have to turn that into a set of instructions in a programming language, those would have to be typed onto punched cards, which would be entered into the machine… And the results would appear on some other device (more punched cards, ticker tape, printer, one of those new-fangled CRT devices…)
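If it helps to picture the medium, here’s a toy sketch in Python. The row layout is a deliberately simplified stand-in for real Hollerith code (which also uses zone punches in rows 12, 11 and 0 to encode letters); all it shows is the idea of one character per 80-column card column, with a set of punched rows per character:

```python
# Toy model of an 80-column punched card: 12 rows ("12", "11", then 0-9),
# one character per column. A digit punches its matching numeric row.
# This sketch handles digits and blanks only -- real Hollerith code adds
# zone punches so letters and symbols can be encoded too.
ROWS = ["12", "11", "0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]

def punch_card(record: str) -> list[set[str]]:
    """Return, per column, the set of rows punched for that character."""
    if len(record) > 80:
        raise ValueError("a card holds at most 80 columns")
    card = []
    for ch in record.ljust(80):      # pad the record to a full card
        if ch == " ":
            card.append(set())       # blank column: nothing punched
        elif ch.isdigit():
            card.append({ch})        # digit n -> punch row n
        else:
            raise ValueError(f"this sketch cannot encode {ch!r}")
    return card

card = punch_card("19371101 0042")
print(card[0])  # {'1'}: the first column holds the digit 1
```

Every stage of the pipeline above (source, longhand copy, keypunch, hopper) is a place where one of those punch patterns can come out wrong, which is the error-checking problem I mentioned.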
Once a program was written, the cards would be stored in boxes so they could be used again. As often as not, someone dropped them on the floor and a decent portion of any programmer’s job was sorting the cards back into order. Not strictly a privacy or security issue, but I bet it was as annoying as hell.
So… more private than what? Using punched cards doesn’t in itself give you any extra privacy, they’re just a means of getting data and programs into computers. You could replace the cards with a keyboard with no real change (and in fact if by doing so you reduce the number of different people/stages/errors involved in data entry/retrieval, then you’re arguably increasing privacy).
You also have the physical cards that were used for data entry. If you don’t dispose of those properly, then you have them lying around for people to dumpster-dive. Ditto if the data is written down in longhand first before it’s transcribed to card. But this is true of any data entry process regardless of whether cards are used as the data entry medium. Security is a process and poor security means a poor process or a poor implementation of a process. This is true regardless of whether you’re using punched cards or something else.
How might a punched card machine be hacked? One deck of cards looks very much like another and whoever is feeding the deck into the hopper is unlikely to be able to tell the difference. Data entry and processing in such systems were carried out in ‘batches’. There were boxes and boxes full of input decks to be loaded in and boxes and boxes of output to be taken away and dealt with. A hacker might therefore introduce a malicious deck somewhere in a batch (or swap a legit deck for an evil one) and intercept the output. This would require that they had physical access to where the batches of cards and the output were stored. There’s a security adage: if a bad guy has physical access to a computer system, then all bets are off. This is largely true for modern systems, too: I wouldn’t recommend that you leave someone like me alone with any of your devices, password-protected or not.
A more complicated arrangement might be considered: a hacker might use a similar process to patch the machine’s operating system or some stored software in a malicious way so that future outputs from the machine are compromised. Then the hacker might only need access to the output… which they might be able to get hold of by dumpster diving. So physical access to the machine would be required only once.
So again, more private than what? There’s a certain security advantage in ancient machines because they were simpler. There were fewer interacting bits of software to attack (making it easier to run checksums for integrity) and, of course, they were not connected to the internet. But that has nothing to do with the use of cards as a physical medium and modern cryptographic techniques are far superior to security through obscurity or obsolescence. Unless, of course, the crypto implementation is broken, which happens more often than you might think.
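The integrity-check idea is simple to sketch. SHA-256 below is a modern stand-in for whatever check an old machine would actually have run, and the “program image” is invented for illustration; the point is only that any tampering changes the fingerprint:

```python
import hashlib

# A stored program image and the checksum recorded when it was installed.
# (Both are made up for the sake of the example.)
known_good = b"LOAD A\nADD  B\nSTORE C\n"
recorded_digest = hashlib.sha256(known_good).hexdigest()

def verify(image: bytes, expected: str) -> bool:
    """Recompute the image's hash and compare it with the recorded value."""
    return hashlib.sha256(image).hexdigest() == expected

print(verify(known_good, recorded_digest))                   # True
print(verify(known_good + b"JMP  EVIL\n", recorded_digest))  # False
```

With only a few interacting pieces of software to fingerprint, a check like this is cheap to run over everything, which is the advantage simplicity buys you.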
I’ve no idea whether that answers your question. If you have any more details about what you mean by “more private” and “hacked” then I might be able to help more. But it’s difficult to imagine a scenario in which modern techniques coupled with good security wouldn’t be a better idea.
There are examples in literature of older technology being used for security purposes. For example, “airgapping” is the total physical disconnect of a computer from any external networks so that nothing nasty can get into the system from outside (fun fact: an employee almost always manages to break airgapping through incompetence. It happens quite a lot with voting machines). The example that springs immediately to mind is the reboot of Battlestar Galactica (OK, “literature” is pushing it) where the obsolete Battlestar is the only one to survive the Cylons’ cyber attacks because its systems aren’t networked.
The Laundry Files series of books by Charlie Stross has one character using a (I think only ever hypothetical) Memex device which uses microfilm for input and output because the screen cannot be snooped electronically by external means as modern screens can. Perhaps that scenario is similar to the one you have in mind?
There are lots of examples of ‘security by obscurity’ in literature, too, although I’m struggling to bring any to mind in my sleep-deprived state. This is where security relies on an attacker not knowing enough about a system to attack it because it is so old fashioned or rare. Generally speaking, this is a terrible idea because reverse engineering is easier than most people expect, but I can imagine some good story lines based on it.
Anyway, I hope some of this helps. Feel free to ask more and I know there are several people who hang around here who will be able to contribute.
@latsot #7
Also known as the ‘operator drop sort’ algorithm, and the reason for the diagonal line in black marker across the top of the card stack.
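That restoration step is easy to sketch. Many shops reserved columns 73–80 of each card for a sequence number precisely so a dropped deck could be put back in order mechanically; here’s a Python sketch (the card images are invented for illustration):

```python
# Cards in many shops carried a sequence number in columns 73-80 so a
# dropped deck could be restored. The "operator drop sort" is then just
# a sort keyed on that field.
def restore_deck(dropped_cards: list[str]) -> list[str]:
    """Sort cards back into order by the sequence field in columns 73-80."""
    return sorted(dropped_cards, key=lambda card: card[72:80])

# A shuffled three-card deck (statement text is made up).
deck = [
    "      STORE C".ljust(72) + "00000030",
    "      LOAD A".ljust(72) + "00000010",
    "      ADD  B".ljust(72) + "00000020",
]
print([card[72:80] for card in restore_deck(deck)])
# ['00000010', '00000020', '00000030']
```

The diagonal marker line did the same job for decks without sequence numbers: a glance at the stripe tells you which cards are out of place.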
Thanks, latsot, that does answer my question. Since my character using the punched cards is a paranoid conspiracy theorist, I can use that with the notion that it may not be less accessible to snoops from the outside world; it doesn’t need to be a technical thing in the novel. I just hate to make mistakes about things like that without checking with someone who knows. I appreciate your taking the time to respond. For this character, more private than what doesn’t matter, because, like I said, he’s a paranoid conspiracy theorist.
np iknklast.
Of course, there are some really good solutions for the modern paranoid conspiracy theorist, especially when paranoid conspiracy theorists want to talk to each other in secret…
There’s an arms race between people who wish to communicate in secret (by no means all paranoid conspiracy theorists, of course, mostly businesses and everyday members of the public), the intelligence services and law enforcement. So some of the solutions are fascinating, even for non-technical people.
Cory Doctorow talks about some of this stuff in some of his novels. The YA Little Brother series, for example.
latsot:
It was a neat idea.
And then they effed that up so badly in the (second season?) episode where they network the ship’s computers that I was shouting at the screen. If you recall, they connected the machines entirely via a physically wired LAN, which the Cylons were able to hack remotely because, um, magic?
Ick – that’s about as clear a description as I’ve ever heard.
As an MD I was always careful to maintain a level of old-fashioned decorum with patients, including not using first names when I addressed them (one of the reasons I HATE Starbucks, where someone 1/3 my age wants to call me by my first name…). Professional relationships aren’t about making a new friend – they are about instilling confidence, respecting boundaries, and providing a service.
I have a similar problem with academia; my students think they should get to call me by my first name. The worst part is that they call the male instructors Mr. or Dr., but the female (including me) by first name. I tell them now at the beginning of the semester that this is not permitted. I can guarantee most of them will use my first name anyway. Though I have to admit, my face-to-face students do much better than my online ones, who persist in calling me by my first name regardless. Maybe that’s a testament to my ferocious tiger manner? (I’m really not ferocious tiger, but I am professional and firm. For a woman, that means uppity and shrill, I realize.)
iknklast, just tell them that you identify as your chosen title and that calling you anything else is an act of iknklastophobia.
We should take that mainstream. Good luck to Fox News presenters pronouncing that.