Perversions, Cults and Fetishes of the Information Age


Abstract

This write-up belongs to the category of rants that every now and then appear on this website. The main reason why I wrote it is to come out as the old fart that I am 😊

First of all, let me start with a general consideration: every one of us has fetishes, perversions or oddities. Some people can only drink a certain brand of coffee; some other people like to be whipped, handcuffed and blindfolded; some like thinking about your mum dressed as princess Leia; and some other people like all-men bands with a front-woman… I shouldn't have said this last one 😊

While the ones in the examples above are well established and kind of old-style, the information age has originated wilder, deeper and darker perversions, cults and fetishes.

ASCII/text-only emails cult: at the turn between the second and third decades of the 21st century C.E. there are people who still think that the problem of representing different scripts can be solved by ignoring that the problem exists. These people, citing the fact that the recipient could have an outdated email client, decide that all the text they write must be limited to the 7-bit ASCII alphabet. Usually these people belong to a larger community that, still citing their imaginary, mummified friends who read email through clients such as pine (pine!), decides that the maximum of text richness is pretending that text is bold just because it sits between two asterisks, and that emails can only be plain text. No modernity such as HTML is acceptable. Now, two things: ① HTML is quite an old standard by now, so you can still feel vintage using it, and ② since the time the web was born it has been possible to render HTML in a terminal.
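To make point ② concrete, here is a minimal sketch, assuming nothing more than the Python standard library, of flattening an HTML email body into terminal-friendly text; the class name, the HTML snippet and the printed output are made up for illustration.

```python
# A minimal sketch: flattening a (made-up) HTML email body into
# terminal-friendly text, using nothing but the Python standard library.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, dropping the tags."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        # Very rough layout: block-level elements and <br> start a new line.
        if tag in ("p", "div", "br"):
            self.chunks.append("\n")

    def handle_data(self, data):
        self.chunks.append(data)

    def text(self):
        return "".join(self.chunks).strip()


# A made-up HTML email body.
body = "<html><body><p>Hello, <b>world</b>!</p><p>Regards,<br>an old fart</p></body></html>"

extractor = TextExtractor()
extractor.feed(body)
print(extractor.text())
# Hello, world!
# Regards,
# an old fart
```

Real terminal mail clients and tools do a much better job of this, of course; the point is only that HTML does not, by itself, lock anyone out of a text console.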

Suffering as a service: when I installed Linux on my first laptop I decided to install Mandrake (the name later changed to Mandriva; the last release was in 2011 and it ultimately died in 2015). I went for it because it was easy to install. No other reason. It had good driver support and I didn't have to spend days configuring it. All the time saved figuring out how to make my graphics card work could be spent programming and studying (I was at university back then). At my university there was a group of students whose purpose was to help people by packaging and distributing applications and other things used by the professors in their classes. On the side, they helped people with more day-to-day stuff such as configuring the printers on someone's laptop, etc. I do not remember why I went there and started the discussion, but it went more or less in the following way. Me: «Hi! I need x and y», other person: «which distro do you have?», me: «mandra…», other person: «Ah! But that's the kids' distro!», puzzled-me: «what do you mean?», other person: «well… with that distro things are too easy, everything is already done. Real Linux users like to do difficult things!», more-puzzled-me: «I don't think I'm following you…», other person: «like configuring things, recompiling the kernel, etc.», illuminated-me: «but I don't want to be a sysadmin for the rest of my life, and I can assure you that I do not spend my days doing easy things; I simply prefer spending my time on more interesting things», other person: «…» I closed my laptop, went away and never had the misfortune of dealing with this person again. I don't know what this person is doing now, but I am quite sure that it involves suffering. A lot of…

The minimalistic development environment: I am of the opinion that, while you are learning how to program, your environment should be as minimalistic as possible, so that you can understand what is going on under the hood: what parameters the compiler needs, what the dependencies are when building a project, how to write your own rules in the makefiles, etc. I am also of the opinion that at a certain point that must stop, and the environment should do the heavy lifting for all these "administrative tasks" while you focus on the real problems to be solved. Many modern environments such as Visual Studio, NetBeans, Eclipse, etc. do that pretty well. Even so, there are people who, inspired by the point above, think that your life must be hard and that you need to painfully spend your time resolving library dependencies by hand rather than using that time in a more productive way. Going down the "do everything yourself" path also means giving up things such as auto-completion for functions and variables, being able to quickly jump to the place where things are defined, and other amenities that can potentially save you hours of grep and of jumping around files.

The shiny new thing cult: There are people who, no matter what they actually need to do, have to do it with the latest framework, service or system. It doesn't matter if it does not fit what has to be done: it is the new cool thing, everyone is talking about it, and therefore it has to be used. The fact that it is a pre-alpha, totally untested in real environments, doesn't matter either: it has to be used. The thing is then usually hammered until it fits what is needed or, most of the time, the problem is hammered until it fits the problem that the shiny new thing was originally meant to solve. Usually this form of fetish results in the very strange problem of losing retro-compatibility with the present™, forcing unneeded changes to the existing infrastructure in order to address a problem that wasn't there.

The (un)agile zealot: The agile bandwagon arrived in town and everyone felt the urge to jump on it. Did they need it? Maybe. Did they ask themselves the question? Probably not. I have seen terrible things: waterfall models split into blocks of two weeks and called agile, scrum boards filled only with spikes, people reshuffling priorities three times a day, etc. I have been in discussions about agile for six years now and my judgement is that sometimes it works. It works when everyone in the team can do the work of anyone else, when there is a very limited amount of external dependencies and unknowns, and when the time you need for designing is negligible with respect to the time you spend just typing in code.

It has been difficult to write, it must be difficult to read: One too many times I have found myself having to read code just to learn what it was supposed to do or how it was supposed to be used, and not to find bugs in it due to some production error. I am not praising long pieces of documentation that people stop reading because of the powerful bleeding from their eyes; I am just talking about writing some comments, and writing the code for your fellow humans before trying to make it readable by a machine. Even more so considering that making a piece of code readable by a machine is way easier than making it readable by a human. When I spend time reading a piece of code without comments, I always try to leave one there with my findings, just because I could find myself going through the same code again in the future. Some people maintain that they prefer tests as a form of documentation because they are "running documentation". In all honesty, I think that is pure bullshit. A test can be, at most, "behavioural documentation": if the function takes 5 as input, it returns 3; but it says nothing about ① what the function is actually computing, ② when it is supposed to be used, nor ③ why it is there. All of these are important in the maintenance of a piece of code.
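As a minimal sketch of the difference, assume a hypothetical Python function (the name, the numbers and the business rules are all made up): the test only records that 5 maps to 3, while the docstring is what tells a future reader what is being computed, when it should be used and why it exists.

```python
# Hypothetical example (names, numbers and rules are invented).
def shipping_band(weight_kg: int) -> int:
    """Map a parcel weight in kg to the carrier's pricing band.

    Bands (made up for this example): up to 2 kg is band 1, up to 4 kg
    is band 2, up to 9 kg is band 3, anything heavier is band 4. Meant
    to be called from the invoicing job only; letters have their own table.
    """
    if weight_kg <= 2:
        return 1
    if weight_kg <= 4:
        return 2
    if weight_kg <= 9:
        return 3
    return 4


def test_shipping_band():
    # "Behavioural documentation": it records that 5 gives 3, but says
    # nothing about what a band is, when to use the function, or why it exists.
    assert shipping_band(5) == 3
```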

There are a bunch of older perversions and fetishes that are still en vogue and for this reason deserve at least a mention, such as ① the cargo cult of not using exceptions in C++ because at the time of Cfront they were slow, ② inventing yet another configuration format (better if Turing-complete), and ③ keeping your data in files spread all around the disk, reimplementing fault tolerance and access policies, instead of placing it in a database (because, you know, data should live in databases and not in XML files).
