
Technological Dreaming

‘Ten Dreams Of Technology’ by Steve Dietz is the last ever reading for Networked Media, and it’s also probably one of my favourites.

The reading, from 2002, describes what happens when art and technology intersect, giving ten examples of artistic works that embody the speculative thinking we’ve focused on this semester.

This is relevant because, as Dietz states, “artists were among the earliest and most active participants to recognise the potential of the internet”. These pieces of art won’t end up predicting the future, but they do display a plausible, somewhat realistic idea of what it could be, and this kind of speculation has been such a focus in Netmed.

One of my favourites was ‘The Dream Of Symbiosis’, detailing interaction between man and machine, and postulating that by allowing each to learn from the interaction with the other, both could evolve to higher levels of functioning.

The example given is Rokeby’s ‘Giver Of Names’ from 1990.

This is a metaphor producer, “which invokes the awe of naming and the power of the word to create universes”.

The one that jumps out as relevant to our subject is ‘The Dream Of Emergence’, which describes a “notion of networks as an extended or augmented nervous system out of which intelligence eventually and inevitably emerges”.

‘The Dream Of Immersion’ describes a virtual reality of sorts, with artworks that the viewer is totally immersed in. This combined well with ‘The Dream Of Transparency’, where the “computer resembles more and more its owner…with the passing of time, a computer ends up looking like its owner’s brain”, and immediately led me to think of Google Glass, where technology just becomes an extension of our being.

I found this reading to be a great way to sum up some of the key ideas and speculative notions that we’ve investigated in Networked Media throughout the semester.

Ants

This week’s solitary reading focused on Actor-Network Theory (ANT), an approach to social theory and research, and is written by the man behind the theory, Bruno Latour.

I found the reading very heavy going, and got continuously bogged down by the specific terminology used and my apparent lack of understanding of this area.

After finishing the ten or so pages, and staring at my computer screen with a throbbing head, I started googling for information on this concept, and found a YouTube clip aiming to summarise the theory.

This simple video helped me get my head around what I had been reading, wording it in a much more concise and relatable way.

From what I can gather, this is a social theory that treats society as a complex network, one where anyone and anything can be an ‘actor’, not just humans. This is apparently the most controversial aspect of the theory, and Latour explains it by saying an ‘actor’ can “literally be anything provided it is granted to be the source of an action”.

So in these networks, people and other objects interact together to create links and connections, building a larger and more interactive network. Early in the article, Latour identifies the troubles in using the term ‘network’, due to its numerous connotations, but he later states that his reasoning for using the word was that “it has no a priori order relation; it is not tied to the axiological myth of a top and of a bottom of society”, making it an effective term to employ.

The central concept of the theory is that neither technological nor social aspects are given any more privilege or weight; they are treated as equals. This offsets the argument between technological determinism and social determinism that we saw in the last lecture, with the theory considering both of these concepts flawed and choosing to operate on a “socio-technical account”.

From what I can see, it seems like Actor-Network Theory is sitting on the fence as a social theory, but not necessarily in a bad way. It doesn’t agree with either technological or social determinism, but rather incorporates aspects of both and, in a way, gets the best of both worlds, as well as including both human and non-human agents.

As the reading identifies, actors aren’t just singular objects, but rather are networks in themselves, as they are in reality made up of multiple other actors. For example, the laptop that I’m typing this post on is usually viewed as a single ‘actor’ or object: a laptop. But when you think about it, it is also made up of multiple actors working together: the keyboard, the screen, the technical parts that I don’t know the names of, and again, these parts are made up of other actors.
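You could sketch this ‘actors made of actors’ idea as a recursive structure. Here’s a toy Python illustration of my own (the component names are invented), not anything from Latour:

```python
# A toy model of ANT's nesting: every actor is itself a network
# of other actors. (The component names are just my own examples.)
laptop = {
    "keyboard": {"keys": {}, "circuit board": {}},
    "screen": {"LCD panel": {}, "backlight": {}},
    "battery": {"cells": {}, "controller": {}},
}

def count_actors(actor: dict) -> int:
    """Count every actor in the network, at every level of nesting."""
    return len(actor) + sum(count_actors(part) for part in actor.values())

print(count_actors(laptop))  # 9 actors inside the single 'laptop' actor
```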

This theory extends this idea to social relations and networks, and I think can be applied to the online networks of the internet that we’ve been studying this whole semester.

Poetic Databases

This one was a struggle. Amongst the scientific mumbo-jumbo and seemingly made-up words (Integrationalism? Come on now) I was able to find a few parts that were relevant and informative for me.

Even the name is confusing and wordy: ‘Recombinant Poetics And Related Database Aesthetics’ by Bill Seaman.

It starts by (I think) describing how even apparently very technological and digital actions are still, at their core, human activities, emphasising the “physicality of experience”. The database, which is described in detail in the other reading for this week, is derived through human activities such as shooting and editing footage, sculpting virtual objects, composing, writing, and even something as simple as naming files.

The process of writing computer code is likened to poetry, as Richard Hamilton and Ecke Bonk state:

The poetic of computers lies in the genius of individual programmers to express the beauty of their thought using such an inexorable medium.

At their roots, computing and digital activities are still very human.

As Seaman states, interpretation in these new forms of technology is open, ongoing, and constantly changing, and I particularly liked this quote from Umberto Eco:

A work of art is a complete and closed form in its uniqueness as a balanced organic whole, while at the same time constituting an open product on account of its susceptibility to countless different interpretations which do not impinge on its unadulterable specificity. Hence every reception of a work of art is an interpretation and a performance of it, because in every reception the work takes on a fresh perspective for itself.

The other part of the reading that stood out for me was the discussion of Dada Poems, ones that are constructed from putting seemingly random parts together in a seemingly random sequence.

These poems can be created by cutting up a newspaper article, separating each word, placing them all in a bag, shaking it up, then taking the words out one at a time. If you then write these words down in the order that they are drawn, you will create a poem that will “resemble you and you will find yourself to be an infinitely original writer with a charming sensitivity”.
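The procedure is mechanical enough that you can sketch it in a few lines of code. Here’s a toy Python version of the cut-up method (the sample sentence is just my own placeholder):

```python
import random

# The cut-up method, mechanised: "cut" an article into words,
# shake the bag, then draw the words back out one at a time.
article = "the network is a collection of objects connected to each other in some fashion"

words = article.split()   # cut the article into individual words
random.shuffle(words)     # shake the bag
poem = "\n".join(words)   # draw them out one at a time, one per line

print(poem)
```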

The idea is that complete meaning can emerge from complete randomness; even if the meaning is specific to the individual, it will be re-interpreted by whoever reads it.


This quote from Lewis Carroll sums up these ‘random poems’ brilliantly:

First learn to be spasmodic
A very simple rule.
For first you write a sentence,
And then you chop it small;
Then mix the bits and sort them out
Just as they chance to fall;
The order of the phrases makes
No difference at all.

Databased

Lev Manovich’s ‘Database As Symbolic Form’ focuses on the idea of database in the age of new media and digital computers, and how this interacts and contrasts with more traditional forms of narrative.

Manovich defines the database as a structured collection of data, yet says that it “is anything but a simple collection of objects”. As a cultural form, the database has no narrative; it represents the world as a list that refuses to be ordered. Databases have “become the centre of the creative process in the computer age”. Websites are inherently databases; they are endless, unordered collections of images, texts, and other data records.

Manovich claims that new media objects don’t tell stories like we’re used to in the media; they don’t have a beginning or an end, but are just collections of individual items, each no more significant than the next. Many major hubs of the internet, as identified in the power-law structure of the network, are clearly databases, such as Wikipedia, a database of information posted by anyone and everyone on basically any topic you could think of, and Google, a database of pretty much any website you can think of.

Again, I found that the analogy of cinema editing helped me understand this reading, with Manovich using these ideas to compare the database with its natural enemy: narrative. Database and narrative are binary opposites: where the database is unordered, unstructured, and non-linear, narrative is ordered, structured, and linear, telling a clear story to the reader.

A film is a sequential narrative (even non-linear ones like Tarantino’s); it is a timeline of individual shots that appear on the cinema screen one at a time. You can see the shots accumulated throughout the process of actually filming a script as forming a database: the shots are individual, arbitrary, and not yet in order, as much of filming is logistically done out of order. It is only when the editor begins to piece these shots together to match the script that the film transforms from a database into a narrative. As the author says, the editor creates “a unique trajectory through the conceptual space of all possible films that could have been constructed”.
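To make the analogy concrete for myself, here’s a toy Python sketch (the shot names and descriptions are invented): the dictionary is the ‘database’ of shots, and the edit is one trajectory through it.

```python
import itertools

# The "database": an unordered collection of shots, accumulated
# out of script order during filming. (All names are invented.)
shots = {
    "shot_07": "close-up, protagonist",
    "shot_02": "wide shot, street",
    "shot_11": "insert, letter on table",
    "shot_04": "two-shot, argument",
}

# The "edit": one ordered trajectory through the collection.
edit = ["shot_02", "shot_04", "shot_07", "shot_11"]
print(" -> ".join(shots[name] for name in edit))

# Every permutation is a different possible film; the editor picks
# a single one out of this conceptual space.
print(sum(1 for _ in itertools.permutations(shots)), "possible films")
```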

I found this analogy useful again in comprehending the pretty intense and heavy ideas encompassed in this reading, and I think we can view hypertext and these online databases as being like a film where the viewer can actively choose how it is edited (what shot comes next, what song is used, which character is the protagonist) while it is being shown.

I found this reading pretty heavy going, but I eventually got some interesting points out of it. I like the idea of describing a website as a database, and how this is inherently in contrast with narrative, as the two concepts are direct opposites. Databases exemplify the concept of hypertext that we’ve been focusing on: they are interactive and constantly changing, while narrative is rigidly structured and static over time.

The Protocols Of Protocols

The other reading for this week is the introduction from ‘Protocol: How Control Exists After Decentralization’ by Alexander Galloway.

To be honest, after reading this sizable chapter, I was left feeling a little lost.

The reading is very, very heavy on technical terms, internet coding, and the like, things that I really don’t have my head around. I thought it was a pretty damn dry reading, especially when we’re studying a topic that could have very interesting and interactive readings, ones that actually employ the hypertext that we are focusing on so much.

The reading focuses on the idea of the protocols that control the way in which networks on the internet function, and the importance of digital computers.

Galloway describes the internet as a “global distributed computer network”, and also covers how it was never originally intended to become anything like what it is today. The internet was originally developed by the US to be a “computer network that was independent of centralised command and control” so it could “withstand a nuclear attack that targets such centralised hubs”.

It was deliberately created as a decentralised network, but it was never meant to become the near-omnipresent force in our lives that it is today; now it is “a global distributed network connecting billions of people around the world”.

According to Galloway, the concept of protocol is at the core of networked computing, and he defines it as a “set of recommendations and rules that outlines specific technical standards”.

Protocols are the standards that govern the implementation of specific technologies, the ‘rules’ of the internet, of sorts. I found the analogy of the highway system helpful in comprehending this concept: there are many different routes to get from one place to another, but uniform rules apply across all of them, such as stopping at red lights, obeying the speed limit, staying on the actual roads, etc. As Galloway says, “these conventional rules that govern the set of possible behaviour patterns within a heterogeneous system are what computer scientists call protocol”. While the internet is so expansive and apparently ‘free’, there are still these rules and guidelines governing how it is used.
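To make this concrete for myself: a protocol like HTTP really is just a set of agreed textual conventions that both ends follow. This is a minimal sketch of my own, not from the reading (example.com is just a placeholder host):

```python
import socket

# HTTP is a protocol: an agreed-upon set of rules. Any client that
# follows the conventions can talk to any server that follows them.
with socket.create_connection(("example.com", 80)) as sock:
    # The request must follow HTTP's format exactly: request line,
    # headers, then a blank line. Deviate and the exchange breaks down.
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = sock.recv(4096)

# The reply obeys the same shared rules, e.g. "HTTP/1.1 200 OK".
print(reply.decode("ascii", errors="replace").splitlines()[0])
```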

I think this is the key point that I took out of this reading (most of it I didn’t fully understand): although the internet is commonly viewed as this chaotic, ruleless utopia/dystopia, there is still a very stringent set of protocols governing how we act on the internet and contribute to the network that it creates.

 

The Technology Of Culture

This week’s reading was from the introduction of Andrew Murphie and John Potts’ book ‘Culture And Technology’, and focuses on efforts to define the key terms involved in a theoretical analysis of the complex relationship between culture and technology: technology, technique, and culture.

The authors summarise why these definitions are so important for this area of study towards the end of the introduction, saying that “civilizations are based on the technologies of building and writing. Cultural activities are dependent on technology. Contemporary mass culture is made possible by the technologies of communication and production”. These terms are obviously crucially important to understanding culture, society, and the way in which networks are formed, and the clear starting point is to define them, which proves to be much harder than you’d think.

Words can adapt and evolve with societal changes just as easily as technology does, with different meanings and interpretations developing over time. Potts uses the word ‘technocrat’ as an example, saying that in the 1920s it originally described someone who supported technocracy, a style of governance. Consequently, the word could be either an insult or a compliment, depending on your political ideology. But over time, ‘technocrat’ has lost much of its political foundation, and in its modern understanding it describes someone who highly values the potential of technology.

Because words can adapt and transform across time like this, and quickly, it is problematic to attempt to stringently define them. The reading uses the analogy of technological change to show how quickly things can shift, saying that the “cultural ramifications of technological change are multiple and volatile, making fools of modern-day prophets”. Prophets can also look like ‘fools’ when trying to pin down the meanings of words, as these develop along with technological changes.

Potts claims that the word technology has come to describe the overall system of machines and processes used, while also stating that it has “become so ubiquitous that it has been said that we now live in technology, are surrounded by technological systems, and are dependent on them”. The word technology now has such a broad definition that it is almost impossible to effectively deal with, because, as Potts says, “technology has become so central to so many societies that it needs to be considered as much more than a collection of tools and machines”.

In contrast to this, the word technique refers to “a specific method or skill”, the means by which technology is harnessed in order to achieve an intended goal. Very basically, it is “the use of skill to accomplish something”, but as Potts observes, anything to do with our own bodies must then also involve technique, and “techniques are as crucial to culture and the transmission of culture as technologies”.

Raymond Williams is quoted as saying that culture is “one of the two or three most complicated words in the English language”, due to the fact that it can be so widely applied: it can describe something self-contained, such as Australian culture, but can also be applied so broadly as to include the entire human race.

Potts says that “we see culture as messy, confused and riven with contradictions”, and that it is ultimately unpredictable, with the ‘inventors of culture’ inevitably not envisioning the technologies being used in the way that they now are, with the internet providing a relevant example.

I liked Brian Eno’s definition of culture the best, and thought it was the easiest to understand. Eno concisely defined culture as “everything we do not have to do”. For example, we have to move, but we don’t have to dance or run or skip; therefore, these are culture. I think this is a good way to simplify the ideas of culture into an easily understood and workable definition.

This reading provided some useful ideas surrounding the troublesome definitions of the important concepts of technology, technique, and culture, which can now be applied to other aspects of our analysis of networked media.

The Rich Get Richer

The second week’s reading was another chapter from old mate Albert-László Barabási, again focusing on explaining the network of the internet scientifically.

It was pretty dry and heavy going, but there were a few interesting points buried in there.

This one continues on from the other, and mainly focuses on explaining why there are ‘hubs’ in the network: huge websites that dominate traffic on the internet. This is explained through the ‘rich get richer’ phenomenon, whereby the pages that we prefer to link to are the ones that are already better known, ones with more links and views. The more links a website has, the easier it is for it to attract even more.

It’s also stated that this power law distribution could potentially apply to all networks that we initially believed to be random; for the internet, at least, it has been shown mathematically to apply.

The network of the internet is in no way random. It is entirely deliberate, calculated, and co-ordinated, and this comes about through two factors: growth and preferential attachment.

Both of these seem like no-brainers nowadays. The internet is obviously constantly growing and changing, every millisecond of the day. Just with this very post, the internet is growing, however minutely. Page by page, post by post, unlecture notes by unlecture notes, the internet is growing all around us.

This is combined with the fact that we don’t link to things randomly (duh); we link to pages that already have a lot of links to them, ones that are respected and already used. This is preferential attachment, and, in a very basic way, it is also how a Google search ranks its results (I think).

These factors result in a power law distribution, via this ‘rich get richer’ phenomenon, and in the creation of ‘hubs’ in the network: the big websites that dominate links and traffic on the internet.
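Out of curiosity, I tried sketching this mechanism in code. This is a rough Python simulation of preferential attachment as I understand it from the chapter, not Barabási’s own model:

```python
import random

# Growth + preferential attachment: each new page links to an
# existing page with probability proportional to how many links
# that page already has.
def grow_network(num_pages):
    links = [1, 1]      # link count per page; start with two linked pages
    endpoints = [0, 1]  # every link endpoint, for degree-weighted choice
    for new_page in range(2, num_pages):
        target = random.choice(endpoints)  # rich pages appear more often here
        links.append(1)
        links[target] += 1
        endpoints.extend([new_page, target])
    return links

links = sorted(grow_network(10_000))
print("biggest hub:", links[-1], "links")      # a handful of huge hubs...
print("median page:", links[len(links) // 2])  # ...while most pages stay tiny
```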

This was the main point that I got out of the reading: the explanation of why, and how, the internet’s network is the way it is.

The 80/20 Rule

The first reading for this week, ‘The 80/20 Rule’ by Albert-László Barabási, got into the nitty-gritty of networks on the internet, describing them in a scientific, very mathematical way. It took me a while to get my head around it, but with some useful examples, I think I kinda understand it.

It begins by describing the ‘80/20 Rule’, developed by Pareto when he was observing his garden. Pareto discovered that 80% of his peas were being produced by only 20% of the peapods, and then applied this to several other areas, discovering that it is surprisingly accurate.

It states that in most cases, four-fifths of our efforts will be mostly irrelevant, and this approximately holds for the internet too: 80% of links point to only around 15% of webpages.

Barabási then began to investigate the network behind the internet, stating that he expected to find pages connected randomly, but instead found that the distribution of links across webpages followed a ‘power law’.

These power law distributions differ from random distributions in that they don’t have a ‘peak’, and you can’t ascertain a meaningful average value from them. In a power law distribution, many small events coexist happily with a few very large events; outliers are so common that they aren’t really ‘outliers’ at all.
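A quick numerical illustration of the difference (my own toy numbers, not from the reading): draw samples from a bell curve and from a power law, then compare the typical value with the extremes.

```python
import random
import statistics

random.seed(1)

# Bell curve, e.g. heights: the biggest value stays close to the mean.
heights = [random.gauss(170, 10) for _ in range(100_000)]
print(round(statistics.mean(heights)), round(max(heights)))

# Power law: a typical value near 1, but a maximum hundreds of times
# larger -- the 'hubs'. No peak, no meaningful average.
linkcounts = [random.paretovariate(2.1) for _ in range(100_000)]
print(round(statistics.mean(linkcounts), 1), round(max(linkcounts)))
```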

Barabási describes this in terms of the internet by saying:

Millions of page creators work together in some magic way to generate a complex web that defies the random universe – collective action forces the degree distribution to evade the bell curve and to turn the web into a very peculiar network described by a power law.

I didn’t really ‘get’ this idea of a power law distribution until the reading used the example of maps.

The reading relates a normal, random distribution to a roadmap, where the cities are the nodes and the highways connecting them are the links; it is a uniform network.

In contrast to this, a power law distribution is like an airline flight map, where the nodes are the airports, connected by direct flights between them. There are a few hubs, the major airports, but the vast majority of airports are tiny, appearing as nodes with at most a few links connecting them to other airports. On this map, the major hubs are the likes of Sydney and Melbourne, which have multiple links out to other airports, while smaller ones like Uluru only have two or three.

This is similar to the internet, where there are ‘major’ hubs, websites with a large volume of links to them, which then link out to other websites. These major hubs would have to include the likes of Google, Wikipedia, and Facebook.

This type of network is described as a ‘scale-free’ one, a very scientific way to describe the networks that we ourselves are contributing to with our blogs.

Six Degrees Of Kevin Bacon

The other reading for this week, ‘Six Degrees’ by Duncan J. Watts, describes networks and their historical origins, as well as contrasting physical and online networks.

Watts opens by stating that we “have become increasingly reliant on a truly staggering and ever growing array of devices, facilities, and services that have turned a once hostile environment into the lifestyle equivalent of a cool breeze”, as well as saying that “without power, pretty much everything we do, everything we use, and everything we consume would be nonexistent, inaccessible, or vastly more expensive and inconvenient”.

This article was written in 2002, but it is only more relevant today, with the prevalence of the internet, smartphones, and social networks. Nearly everything we do is somehow connected to these ‘networks’ of the internet, and without them, many aspects of society would cease to function.

The reading serves to ask a series of thought-provoking questions and identify a number of interesting potentials to do with these networks, without necessarily answering them.

Watts says that “a network is nothing more than a collection of objects connected to each other in some fashion”, and has distinct historical origins. In the past, a network was viewed as “objects of pure structure whose properties are fixed in time”, but as Watts states, this “couldn’t be further from the truth”.

Watts claims that “real networks represent populations of individual components that are actually doing something”, and this is a brilliant way to summarise these networks that we are studying in this course.

Watts uses the example of the ‘small world’ saying to explain these notions, as well as Milgram’s experiments with ‘Six Degrees Of Separation’. We each have our own network of relationships between individuals, and these are expansive, because each person has their own circle of relationships, those relationships have their own relationships, and so on, creating a wide and diverse network of individuals. Six Degrees Of Separation is the idea that anyone, anywhere, can be connected to any other person through six other people.

This idea enjoyed a resurgence with the internet, becoming an item of popular culture, and spawning wonderful things such as ‘Six Degrees Of Kevin Bacon’, a game where people try to find the shortest path between any actor and Hollywood star Kevin Bacon.
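The game is really just a shortest-path search over a co-starring network. Here’s a toy Python sketch with an invented graph (the actor names are placeholders, not real co-star data):

```python
from collections import deque

# Two actors are connected if they appeared in a film together.
costars = {
    "Kevin Bacon": ["Actor A", "Actor B"],
    "Actor A": ["Kevin Bacon", "Actor C"],
    "Actor B": ["Kevin Bacon"],
    "Actor C": ["Actor A", "Actor D"],
    "Actor D": ["Actor C"],
}

def bacon_number(actor):
    """Breadth-first search: the fewest hops from Kevin Bacon to an actor."""
    queue = deque([("Kevin Bacon", 0)])
    seen = {"Kevin Bacon"}
    while queue:
        current, distance = queue.popleft()
        if current == actor:
            return distance
        for costar in costars.get(current, []):
            if costar not in seen:
                seen.add(costar)
                queue.append((costar, distance + 1))
    return -1  # not connected at all

print(bacon_number("Actor D"))  # 3 hops: Bacon -> Actor A -> Actor C -> Actor D
```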

These things are based on the ‘small world’ concept, invoked whenever someone exclaims “it’s a small world” upon discovering that the person they are talking to is a friend of a friend or the like. If the world was already small before the internet and its accompanying networks, it is now minuscule. Online networks overlap and interact constantly, and serve to bring us closer to anyone, anywhere, as seen in social media such as Facebook and Instagram.

Through these online networks, it is now infinitely easy for us to connect with anyone, be it a long-time friend living on the other side of the world, or a celebrity that chooses to respond to a Tweet or a Facebook post. This also applies to our blogs, with their possible reach to anyone in the world, and has exciting potentials that are only just now starting to emerge.

The Long Tail

The reading ‘The Long Tail’ by Chris Anderson details the impact that these new networks of media have had on the way we consume entertainment media, and the way in which it is marketed and sold to us.

Anderson describes the phenomenon of the Long Tail, the idea that any media “can find an audience, even if it’s just a few people a month, somewhere in the world”. The Long Tail incorporates expansive back-catalogues and archives of all different mediums of media.

Anderson identifies that the “future of entertainment is in the millions of niche markets at the shallow end of the bitstream”. These online networks mean that media distribution is no longer shackled by the constraints of physical sales. Titles no longer have to find a local audience in order to be stocked; now we have “infinite shelf space with real-time information about buying trends and public opinion”.

This has resulted in the rise of niche and alternative products, as media is no longer subject to a ‘lowest common denominator’ style of marketing: if a product has the potential to be sold anywhere, to anyone, it will be stocked. As Kevin Laws says, “the biggest money is in the smallest sales”.
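A toy calculation shows why: if sales per title follow a power-law-like curve, the countless niche titles together can outsell the hits. All the numbers here are invented purely for illustration.

```python
# Zipf-like sales curve: the title at rank r sells roughly 100,000 / r copies.
sales = [100_000 / rank for rank in range(1, 100_001)]

hits = sum(sales[:100])   # the top 100 blockbusters
tail = sum(sales[100:])   # the other 99,900 niche titles

# The long tail outsells the head, despite each title selling a trickle.
print(f"hits: {hits:,.0f} copies, tail: {tail:,.0f} copies")
```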

The Long Tail is best exemplified by services such as Netflix, iTunes, and Spotify, where near-infinite back-catalogues of music, films, television shows, and more are up for sale, allowing ‘cult’ media to become popular again.

This relates most to our course because it describes how these networks that we have identified and studied can have a very real and highly significant impact on ‘physical’ things. The networks created by features such as ‘recommendations’ in online stores and links to similar products have facilitated the Long Tail, and in doing so have drastically altered the way in which media is marketed and sold.

As Anderson says, “we live in the physical world, and until recently, most of our entertainment media did too”, but now it is becoming prevalent for media to be primarily sold online, in a way that can benefit both the consumer and the sellers.

Anderson says that “what matters is not where customers are, or even how many of them are seeking a particular title, but only that some number of them exist, anywhere”. This jumped out at me as relating to our own blogs, and our possible, imagined audience. Anyone, anywhere, can access our writing; we have a huge potential audience that is historically unrivaled.

At the article’s conclusion, Anderson talks about the idea of a “celestial jukebox of music services that offer every track ever made, playable on demand”, accurately predicting the dominance of services such as Spotify, which provide online streaming of nearly every song ever made for a monthly subscription. This is the Long Tail, and much of Spotify’s business would rely on people who want to access obscure, ‘cult-like’ songs, rather than those who just want to listen to Lady Gaga.

I really enjoyed this reading as a very different and contrasting piece to much of what we have done previously, especially in that it displayed the exciting potential that these networks of media have, and how big an impact they can have on media and entertainment.