The structure so far has been to begin with material on education and learning. This was because a) we want to try to do some things differently, so needed some context as to why, and b) the easiest and most effective way to provide you with the experience of disruption is to disrupt what you have been taught to take for granted. (We want to give you the experience of disruption for two reasons. The first is that a lot of learning happens when things we take for granted are ‘made strange’ or unfamiliar; it is a way to make visible our assumptions and values. The other is that the internet is a disruptive technology in relation to heritage (industrial, pre-internet) media, and I want you to learn that the internet is exciting but also, in many ways, eroding the very industries you aspire to.)
Then we moved to design fiction. This was a way to move from learning as saying what I already know to learning as an investigation of what might be, including what I don’t yet know. Learning as a casting forward rather than a falling back on what I have already learned. Design fiction is also a very useful concept for engaging with change, technology, and the social: big ticket items for the network.
Then we have begun some pre-internet readings (Bush, Nelson). In both cases there is a utopian vision of technology that wants it to augment intelligence. This is a big deal. In the west, for example, the ‘robot’ tends to be always bad (it’s hard to find a western film where a robot is not actually going to kill us), and this in many quite real ways has hindered ideas and development. (In the east there is a very different view; something as simple as Astro Boy shows a completely different take on robots: they have rights, they help, and it is generally only evil people who mistreat them.) Computers could have gone the same way: always about to miscalculate and send the plane into the mountain, turning the car’s brakes off when you want them on, your finances always at risk of a glitch. Instead, we trust computers, well, enough to let them fly our planes, manage our financial flows, and do a huge amount of diagnostic work in medicine. Hence Bush and Nelson matter, because here you can see an approach that believed every person should have a computer because it would augment your ability to know and do. In 1970 (Nelson) this is an extraordinary vision, and one that is deeply grounded in the belief that intelligence, learning and making knowledge are the foundations of the human.
This segues into hypertext proper. Why? Lots of reasons. Hypertext existed before the Web, and so largely prefigures what the web is beginning to become (there are things I can do in a 1995 hypertext program, simple things, that the Web still cannot do). Hypertext theory, which comes out of the humanities and so is our province, has a lot to say about the ways in which digital media asks different questions of us about what an author is, a reader, and a text. So in hypertext we find the first real questioning, in sophisticated ways, of these things. In the same way hypertext still has most of the best ideas around multilinearity in relation to narrative, including not only what happens to stories in multilinear environments, but also how to go about making them. This is not a technical problem of software (that would be like thinking learning how to write a good essay is about learning how to make ink and paper), but a problem of voice and structure. These readings will also matter in second year, because there we make hypertextual video works; even though it is video, the principles are exactly the same (and understood much better by the hypertext community than by the interactive video mob).
Finally, hypertext, as the idea of a text made up of small chunks with different, multiple, possible connections between them, describes not only a single work but also the structure of the Web. This is Weinberger’s ‘small pieces loosely joined’, where he’s not talking about hypertext at all. In other words hypertext is a great model for thinking about the deep structure of the Web more broadly. So it’s a great place from which to begin.
Which brings us to this week, and probably next. These readings are about the network as a particular sort of structure. For me, this is a small step from hypertext: it is still about small, more or less independent parts (a blog post, a node in a hypertext fiction), which now happen to be people, and about how they are connected to each other. The ideas here, the principles, are exactly the same ones that hypertext (and Ted Nelson) relies on and argues for. It is about many-to-many and one-to-many relations, what sorts of ‘patterns’ then happen in such systems, and, more importantly, the consequences of these patterns. It is why there can be memes, why things go viral (diseases, ideas, and YouTube clips), and why social media is possible. Remember, it is the same sort of ‘pattern’ that hypertext described.
So the readings make a shape and a trajectory across an idea of what the network is. Broadly: the network is intended to help us, it requires imagination, and it is made up of small loose bits with lots of ways to connect, and disconnect, them, whether these be people, pages, likes, blogs, or tweets.
Networked Structures and Consequences
There will be some more reading from Watts, but for now this is just the introduction to his very readable book. As an introduction it doesn’t provide many answers, but it raises a great set of questions and problems, and explains why they might matter. The book is largely dedicated to the problem of how things move through networks, whether that be disease, information, or people. It turns out they all move in much the same way.
Watts, Duncan J. Six Degrees: The Science of a Connected Age. London: Vintage, 2003. Print. (Extract, PDF)
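Watts’s central claim, that a handful of long-range links makes a large network feel small, can be sketched in a toy simulation. (This is my illustration, not something from the reading; the network size, neighbourhood size, and shortcut count are arbitrary choices.)

```python
import random
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all reachable node pairs, via BFS from each node."""
    total, pairs = 0, 0
    for src in range(len(adj)):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_lattice(n, k=2):
    """An ordered world: each node knows only its k nearest neighbours on either side."""
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

random.seed(1)
n = 200
ordered = ring_lattice(n)
rewired = ring_lattice(n)
# Add just ten random long-range shortcuts, the 'weak ties' of the small world.
for _ in range(10):
    a, b = random.sample(range(n), 2)
    rewired[a].add(b)
    rewired[b].add(a)

print("ordered world:", avg_path_length(ordered))
print("with 10 shortcuts:", avg_path_length(rewired))
```

With only ten shortcuts among two hundred nodes, the average number of hops between any two people drops sharply: the ‘six degrees’ effect in miniature, and the reason disease, information, and people all move through networks in much the same way.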
Chris Anderson wrote an entire book about this. This, though, is an earlier article: same idea, just smaller. In it he describes what Watts calls a ‘power law distribution’, which turns out to be one of the characteristics of the sort of network the Web is. While Watts discusses this in a variety of theoretical and sociological ways (he’s a sociologist who did his PhD with a mathematician), Anderson, with typical North American Silicon Valley joy, goes straight to the marketing cum financial implications. It is, though, a key point, and is one of the reasons why blogs a) collectively have a staggeringly large readership, and b) why a blog with only a few readers still matters.
Anderson, Chris. “The Long Tail.” Wired. Oct. 2004. Web. 23 Aug. 2013. (PDF)
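The ‘long tail’ is just the tail of a power law. A tiny calculation (hypothetical numbers, not Anderson’s data) shows why the many barely-read items matter in aggregate:

```python
# Zipf-style popularity: the item at rank r gets a share of attention proportional to 1/r.
N = 10_000    # hypothetical catalogue size (blogs, books, clips)
TOP = 10      # the 'head': the ten most popular items

weights = [1 / r for r in range(1, N + 1)]
total = sum(weights)
head_share = sum(weights[:TOP]) / total
tail_share = 1 - head_share

print(f"top {TOP} items: {head_share:.0%} of all attention")
print(f"remaining {N - TOP} items: {tail_share:.0%} of all attention")
```

Under this distribution the ten most popular items take roughly a third of all attention, yet the thousands of obscure items in the tail together take more than the head does. That is why blogs collectively have a staggeringly large readership, and why a blog with only a few readers still counts.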