The first chapter, “Introduction: Software and Other Literacies”, of Software Literacy: Education and Beyond (Springer, Singapore) by Khoo, Hight, Torrens and Cowie explains why software and coding are central to understanding modern society. The authors describe how many everyday activities now depend on intricately engineered software that shapes the possible outcomes of human action in digital environments; for example, a geolocation application such as Google Maps, or the predictive text feature of a keyboard.
The purpose of this article is to point out the significance of software studies at this moment in history as a path toward software literacy, presented as a framework through which we can understand the weight this kind of technology carries in the construction of our culture. This construction, however, needs to be understood as part of digital literacy, in which people, as individuals and as a collective, play a large part in creating this infrastructure.
As the authors put it, we have become a “coded society”, and that is not something we can overlook. We need to take seriously our role as users who interact with these technologies, because software’s function is built on our own actions; that is why there should be “debates over the extent to which we are collectively as a species ceding creative, conceptual and communicative agency to platforms and infrastructures” (Khoo, Hight, Torrens & Cowie, 2017, p. 4).
Not everything is perfect
Consequently, as the article states, “No form of code is perfect; it emerges from human endeavour and is inscribed with the conditions of its creation as with all cultural artefacts” (Khoo, Hight, Torrens & Cowie, 2017, p. 3); and this is becoming more visible every day.
In 2017 The Guardian published a news report titled “How white engineers built racist code – and why it’s dangerous for black people”, raising awareness of an issue that was not widely known. The piece discussed how facial recognition tools used by police in the United States were biased against people of colour. The root of the problem is that the software was made by people, people who have their own backgrounds and intellectual processes; and, even though this is changing, engineers are still usually white men. So no, software cannot be neutral.
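One way this kind of bias can arise is purely statistical: a system tuned mostly on one group performs worse on groups it saw less of. The sketch below is a hypothetical illustration of that mechanism only, not the actual system The Guardian described; the score distributions, threshold, and noise levels are all invented for the example.

```python
import random

random.seed(0)

def simulate_scores(n, noise):
    """Simulate match scores from a hypothetical recognition model.

    True matching pairs should score near 1.0 and non-matching pairs
    near 0.0; `noise` models how poorly calibrated the system is for
    a given group (higher noise = less representative training data).
    """
    matches = [min(1.0, max(0.0, 1.0 - abs(random.gauss(0, noise)))) for _ in range(n)]
    non_matches = [min(1.0, max(0.0, abs(random.gauss(0, noise)))) for _ in range(n)]
    return matches, non_matches

def false_match_rate(non_match_scores, threshold=0.5):
    # Fraction of non-matching pairs wrongly accepted as matches --
    # in a policing context, people misidentified as suspects.
    return sum(s >= threshold for s in non_match_scores) / len(non_match_scores)

# The same model, applied to a well-represented and an
# under-represented group (invented noise levels).
_, non_matches_a = simulate_scores(10_000, noise=0.15)
_, non_matches_b = simulate_scores(10_000, noise=0.40)

fmr_a = false_match_rate(non_matches_a)
fmr_b = false_match_rate(non_matches_b)
print(f"false-match rate, well-represented group:  {fmr_a:.3%}")
print(f"false-match rate, under-represented group: {fmr_b:.3%}")
```

Under these assumed distributions, the under-represented group suffers a markedly higher false-match rate even though the code itself contains no explicit reference to any group: the bias lives in the data and calibration, which is exactly why the chapter insists code is inscribed with the conditions of its creation.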
So, now what?
The last part of the article gives some suggestions that I find well aimed at solving this new problem in our society. First of all, we have to recognise that not everybody has the same digital knowledge, and that even young people are often unaware of the extent of what technology can do. Once we understand software, we need to be able to criticise it and try to make it better and more inclusive, from the beginning of the process to the end. If the programs evolve, our culture will too.