Blood, Urine, Ochre and Sap: living with the saving power


Author: Dr Bruce Fell, Charles Sturt University, Australia

 


Blood, urine, ochre and sap, having been heated or cooled, stretched or compressed — or simply laid bare — were mixed and fixed to rock, wood and fibre. Having been scraped or brushed, chiselled or smoothed, sewn or scribed, meaning became entwined with body and other. This melding, which had no precedent and has no foreseeable end, has become our reality. It is the cloth that awaits our birth and the shroud that lays us down.

(Fell 2011)

From prehistoric cave art, through the shadow puppetry of 300 BCE, to motion pictures and digital convergence, commentators have philosophised about the relationship between the screen and decision-making. Our entwinement with the screen, from Plato’s Allegory of the Cave to the plethora of postulating in the wake of fMRI scanning, speaks of a complex interaction between humanity and the screen.

Drawing on the critical social stance of social ecology, this paper argues for a reappraisal of the role of the contemporary screen in an age of social and ecological degradation.

Atop our spine sits a large brain that houses a very big visual cortex. Arguably, combined with our dexterity and bipedal stance, such a combination makes us what we are — none of which happened overnight. Day in and day out humanity’s entwinement with the screen has been and remains central to how our big brain perceives the world.

Today, we increasingly rely on a personalised portable screen for the encyclopaedic information that was once accessed via a range of screens located in multiple locations: library, gallery, museum, cinema, television, etc. And while encyclopaedic information changes over time, as does screen technology, our entwinement with the screen remains fundamental.

We have no way of changing what made us a big-brained, bipedal animal with a large visual cortex and amazing dexterity; equally, we have no way of changing the fact that we are biologically programmed for mark-making, and subsequently revel in gazing upon surfaces that humans have made marks upon — from rock art to digital art.

Sophisticated rock art painted 20,000 years ago is cognitively inseparable from the primitive crayon doodling of a contemporary child on a bedroom wall. To put it another way, a child of 20,000 years ago, possessing the same sized brain and dexterity as a contemporary child, would arguably have created similar doodles. Just as an adult today, with dexterity similar to that of our ancestors, can replicate the 20,000-year-old art found across Europe, Australia and elsewhere.

What is significantly different about the mark-making of then and the mark-making of now is the memory such mark-making conjures. Mark-making and surface gazing conjure memory — though not necessarily the intended memory.

The narrative within mark-making is culturally coded. While the thing rendered can be viewed over time by a person from any culture, meaning is coded. For example, an actual type of bovine has been identified within the 20,000-year-old paintings on the cave walls of Lascaux. However, why the animal was painted and what the art speaks of has been lost to time. Just because we ‘see’ the marks made on the screen doesn’t equate to understanding the meaning imbued by the original scribe. To this end, a screen can be thought of as a surface transformed into a memory device, be it pre-historic or contemporary, be it made from granite or silicon, be it rendered in blood or code, be the memories accurate or distorted.

In The Beginning

Once hominids recognised surfaces as devices capable of holding external memory, cultural memory began being fixed to rock, wood, flesh and fibre. From then on, memory could be recalled on demand, as opposed to relying on the whimsical nature of circumstance to conjure it up: the scent of a flower, the sound of a waterfall, the cry of a child, etc. Remembered symbols could be used to tell stories, to create calendars and diagrams, plus all manner of plans and dictums that make up encyclopaedic cultural data.

For the purpose of this discussion I will draw a link between the modern external memory devices of film, television and smartphones and the very first known external memory device, in order to argue that our understanding of being human is at least partially, if not wholly, derived through external memory devices. As such, in a global climate of ecological degradation, I want to put the case that the screen, as an external memory device, is capable of creating the memories required to bring about a sustainable social and global ecology.

Aspirational Reflections

Though this story begins before the domestication of fire, we’ll begin in 1933. The USA has entered ‘The Golden Age of Hollywood’ and Herbert Blumer is beginning to discover that popular characters portrayed on the screen have a certain influence on the audience – and the audience is large: at the time some 80 million cinema tickets were purchased each week in the USA alone (Austin, cited in Denzin 1995, p. 18).

Over the course of his research Blumer finds that attitudes presented on the popular screen surrounding career, religion, relationships, dress code and decorum, and even how we should move our bodies and arrange our faces, are being taken into account by the audience:

When I discovered that I should have this coquettish and coy look which all girls may have, I tried to do it in my room. And surprise! I learned the very way of taking my gentle friends to and from the door with that wistful smile, until it has become a part of me (Female, 19, white, college freshman, in Blumer 1933, p. 24).

The appearance of such handsome men … dressed in sports clothes, evening attire, formals, etc., has encouraged me to dress as best as possible … One acquires positions such as standing, sitting, tipping one’s hat, holding one’s hat, offering one’s arm to a lady, etc (Male, 20, white, college sophomore, in Blumer 1933, p. 33).

By the time most Australians had moved from regularly attending the cinema to regularly viewing stories on television, the ability of the moving screen to influence audiences had become well and truly domesticated. For example, two million Australians switched over to Network Seven in June 1985 to witness the final instalment of a harrowing ten-week storyline culminating in the death of a popular soap opera character. By the time actor Anne Tenney’s character, Molly Jones, was scripted out of the television program A Country Practice, the influence of the screen, worldwide, had significantly shifted from cinema to television.

The emotion surrounding Molly’s death was such that viewers expressing their sorrow jammed Network Seven’s switchboard (Van den Nieuwenhof 2006). As with Blumer’s research into film, there are numerous examples of how popular television has caused audience members to respond as if the scripted story were reality (see Fell 2009).

The above examples enable us to get a sense of how the screen influences the mind, and to that extent, how influential the screen has become within personal and community understanding. And while today we still attend the cinema and watch television, our physical interaction with the screen has dramatically shifted from the communal (film) through the domestic (television) to the personal (smartphone).

How popular film and television influenced us back in the day seems almost quaint by comparison with the influence of contemporary convergent media. For example, the rise of ‘phantom vibration syndrome’ (Jacobson 2001) is but one indicator of how central convergent screens have become within contemporary life. Phantom vibration syndrome manifests as the imagined sensation that our smartphone is vibrating when in fact it is not: the imagining is the sole domain of the mind.

Today, many of us become anxious when we misplace our digital device; we are constantly checking the small screen for messages, the time, our shopping list — not to mention our favourite apps!

In terms of the moving image, the cinema was once the most sophisticated means of accessing external memory. The consumption of those movies involved a set of common events (some might say performances); we purposefully travelled to a place where a purpose-built screen structured our evening: we purchased a ticket, acquired confectionery and sat in the dark before a large screen in order to suspend our disbelief.

By the time Molly died (scripted out of A Country Practice), we had become accustomed to suspending our disbelief in a domestic environment.

Today, the circumstance in which the popular screen is able to influence has moved from the occasional (film) through the regular (television) to the constant (smartphone). As philosopher David Chalmers muses in the blogosphere:

A whole lot of my cognitive activities and my brain functions have now been uploaded into my iPhone. It stores a whole lot of my beliefs, phone numbers, addresses, whatever. It acts as my memory for these things. It’s always there when I need it. … [for example] … I have a list of all of my favourite dishes at the restaurant we go to all the time in Canberra. I say, OK, what are we going to order? Well, I’ll pull up the iPhone — these are the dishes we like here. It’s the repository of my desires, my plans. There’s a calendar, there’s an iPhone calculator, and so on. It’s even got a little decision maker that comes up, yes or no (Lehrer 2010).

 

The Pebble Before Silicon

The memories uploaded to our smartphones are both whole and fragmented. They are memories that were once located on other, now older, external memory devices: film, television, paper, canvas, sculpture, fibre, wood, flesh, rock, etc.

Within the discipline of cognitive archaeology (the study of how ancient societies thought), the influence of the screen is seen as a means of understanding the foundation of human being; it speaks of where we came from and hints at where we are going. The first hint comes with the gathering of a pebble.

In 1925, schoolteacher and amateur archaeologist Wilfred I. Eitzman was foraging in a cave in the Makapansgat Valley, Limpopo, South Africa, when he came across a five-centimetre, face-shaped red jasperite pebble (known as the Makapansgat pebble). Though rudimentary, there can be no mistaking the human face-like shape of the pebble: two eyes, a nose and a mouth (http://mc2.vicnet.net.au/home/portable/web/manuport.html). The cave in which the Makapansgat pebble was found was occupied on a semi-permanent basis some 2.5 to 3.3 million years ago by Australopithecus africanus, and perhaps other hominids. The brain of Australopithecus africanus was about 460 cc (cubic centimetres); the average brain of a contemporary person is 1350 cc.

How a small-brained hominid with thick rough fingers could create such an artefact-like object would remain a mystery until 1997, when intense microscopic investigation revealed that the Makapansgat pebble is in fact a unique naturally occurring form. None of the Makapansgat pebble’s face-shaped markings had been manufactured. The eyes, nose and lips, like the shape of the pebble itself, were formed by the elements over millennia (Bednarik 1998). Within archaeology and anthropology such a form is known as a manuport. The Makapansgat pebble is thought to be the oldest example of a manuport.

Of interest to us is that the pebble appears to have had no practical purpose for its original collector. There are no indications that the face-shaped pebble was used as a tool, yet it attracted sufficient attention that the ancient collector carried it several kilometres, from a site strewn with red jasperite rock to the cave where it was found. No other pieces of red jasperite rock have been found in the cave.

For a primitive animal to carry the face-shaped pebble for several kilometres needs to be appreciated as a significant act. Australopithecus africanus was a wild animal, and as such it didn’t wear clothes, let alone have a pocket or a woven bag to place the pebble in. It is challenging to comprehend why the small-brained Australopithecus africanus carried the face-shaped pebble so far and placed it in the cave its tribe occupied.

The gathering up of the pebble demonstrates an incipient form of consciousness. The ancient mind that gathered the face-like piece of rock had a brain that possessed a crude plasticity. Its synaptic sparking, like fire first made: smoked — then flickered within the small chamber of its cranium. The face-like pebble can, arguably, be conceived of as a ‘thought’ previously not ‘seen’ — the birth of external memory.

Today, at the click of a mouse, the swipe of screen or the tap of a remote one can easily come across an image, be it beautiful or horrific, that engenders a thought not previously conceptualised: a new memory.

We don’t have fMRI neuroimages of the small brain that collected the Makapansgat pebble. However, scans of our large brain reveal that one of the processes involved in looking at a known object (a face, etc.) is recognising what Gary Lynch and Richard Granger (2008, p. 114) call ‘un-named partial assemblies’ — bits of neurological building-block information.

For example, humans have a strong tendency to perceive faces in objects. When this happens we see one thing and think of another: a rain-splattered surface, a broken seashell or a piece of gnarled wood can sometimes trigger our visual cortex into seeing not only the object, but what the gnarled and/or fractured shape of the object reminds us of (a face, etc.).

Once a neural pathway is created (recognising a face, ‘as a face’), there appears to be no end to the surfaces over which our ‘memory of face’ can be laid. When the shape of an object triggers un-named partial assemblies such as those associated with remembering ‘face’, we are cognitively attracted to the familiar shape in the object. Did the Makapansgat pebble remind Australopithecus africanus of a face? Did it create a thought previously not seen — a new memory? We may never know.

Arguably, to experience a thought not previously seen requires a mind. Susan Greenfield (2008) argues that the ‘mind’ is the first-person perspective of identity. Neuropsychologist Paul Broks (2003) suggests that, in part, mind comes about as we build a story of ourselves from the raw materials of experience and memory. It is tantalising to speculate what effect the Makapansgat pebble may have had on the beast that found it and on those to whom it was displayed.

Brain scans reveal that in both sleep and reflection our mind retraces our biological interaction with matter. While we will never conclusively know if Australopithecus africanus dreamt or remembered, what we do know is how we remember. For example, when learning to play a musical instrument we go over and over and over the most rudimentary elements of the chosen instrument. Yet, once body and instrument are in unison, the learner never consciously returns to the elementary. From that point on the rudiments of playing appear to happen without thinking. When we interact for the first time with a piano, sitar, etc., a new bumpy neural track begins at what seems an impenetrable vista. The subsequent combination of sleep, reflection and further physical interaction enables the process of remembering — a process that smooths the bumpy neural track into a neural pathway.

Once enough smooth neural pathways are formed the mind can travel at the speed of experience: a seasoned musician, mathematician, philosopher, sportsperson. All manner of physical and cognitive skill emerges from the process of remembering — ‘remembering’ enables the mind to seamlessly travel down smooth neural paths. Remembering enables the mind to grow. As neural paths cross and intersect the quickening mind expands.

The smaller the brain the less opportunity there is for ‘un-named partial assemblies’ to flourish. We have no way of knowing why the original collector of the Makapansgat pebble picked it up. It may have been the pebble’s colour or texture that sparked ‘un-named partial assemblies’, rather than its portraiture. It could have served as a status symbol, a mere curiosity, a memory device, or a combination of these.

From Rock Face To Facebook

There is another aspect to the Makapansgat pebble: it indicates a foundational relationship between mind and matter, one that took a further two million years to lead the mind to purchase a cinema ticket, press a TV remote button and select the appropriate smartphone app.

To write on a Facebook wall requires the same sized brain that painted reindeer on the walls of Lascaux 20,000 years ago, or the long-extinct Genyornis in Arnhem Land.

Drawing on ethology (Dissanayake), neuropsychology (Broks) and cognitive archaeology (Bednarik) we can get a sense of how human cognition sprouted. Such a sense casts light on how contemporary humans are influenced when remembering.

Between 1.8 million and 300,000 years ago, hominids stepped away from the tree-climbing, small-brained proto-humans and walked towards the large-brained hominids of the Holocene epoch. Between the collecting of the face-shaped pebble and the end of the last ice age, some hominid brains evolved to 1225 cc — Albert Einstein’s brain was five cubic centimetres larger, at 1230 cc.

Drawing on the extensive research of cognitive archaeologist Robert Bednarik, we see that over this vast expanse of time fingers grasped a haematite pebble in Hunsgi, India, and used it, crayon-like, to mark a rock surface. Other hominids began to collect quartz crystals, while others made disc beads. Others began building watercraft, eventually populating terra nullius. Within this period of complex behaviour, symbolism and language emerged.

With a vast web of neural paths able to form, due to the size of their brain, hominids acquired a vast conceptual base: this is the alchemy — from Darwin’s cauldron a non-biological synthetic cortex began to manifest.

The External Reminding-device

When laid out before a large mind, having been purposefully placed together, various combinations of blood, urine, ochre, sap, rock, wood and fibre, having been scraped or brushed and purposefully displayed on body or matter, became an external reminding-device.

A surface upon which the mind can dwell on its own mindful creations, as well as the mindful mark-making of other minds (both living and deceased), is a surface transformed.

Such transformed surfaces evolved due to the size of our brain in combination with our dexterity. The transformation of a neutral surface into a memory device was made manifest by a brain big enough to house complex neural paths — a brain in which biological impulses were harnessed in a manner animals with a smaller brain and less dexterity could not manifest.

As we will see, the entwinement of biological impulses with neutral surfaces provides insight into how contemporary humans perceive the world.

All healthy human babies, on all continents and in all cultures, begin the journey down their individual neural paths at a similar biological time. We all begin to comprehend our mother’s face at the same developmental stage; we all murmur, sit up, crawl, walk, dance and sing at similar stages to each other. As ethologist Ellen Dissanayake (1992) argues, these are biological functions: culture comes later.

As our individual baby brain grows, ‘mind and memory’ entwine, our neural paths expand, and we learn to play our instrument. The culture we are born into determines the music our instrument plays, but not the impulse or dexterity that enables the initial playing: that is biological.

Just about any animal can make an accidental mark using its teeth, tail, hoof, head, etc. Making a ‘mark’ is not incredible; a person’s first mark is indistinguishable from marks made by other mammals. The difference is in the eventual awareness of the action and the subsequent cognitive capacity to remember and synthesise such actions (a hermeneutic phenomenology of sorts). When, all those eons ago, a neutral surface was transformed into a screen by blood, urine, ochre or sap, the smear, scratch and imagination associated with transmogrifying the once neutral surface into a screen came about involuntarily, due to the size and function of the brain in conjunction with physical dexterity.

When the neural paths associated with ‘fingers’, ‘crayon’ and ‘mark’ intersect, a stupendous landscape of meaning opens before us.

In terms of mark-making (charcoal, crayon, etc.), Derek Hodgson (2000) argues that we all initially make the same shaped marks, and do so (and have done so) in the same order of progression — from crude uncontrolled streaks, to purposeful straightish lines, to ever more precise circles, triangles and squares. Again, these are (at first) biological, not cultural functions.

A child makes a mark; a new bumpy neural track begins. The child makes the mark again; at some stage the neural track forged by a new mark intersects with the tracks forged by previously made marks: the first reminder: a pathway. Soon the neural journey between the new mark and all the previous marks becomes seamless — the pathway smooths, as does the line being drawn.

We can, and do, improve and hone our mark-making, but we commence our mark-making without outside encouragement. With the straight line in hand it becomes possible to purposefully bend the line, to create a purposeful circle: symbolism, and by extension cultural literacy, evolves as a myriad of neural paths step back and forth between cortex and the marks made on the screen (rock, wood, tablet, etc.).

The Synthetic Cortex

We are a visual animal that reflects on mark-making. The synthetic cortex came about due to the biological impulse of a hominid to make a mark on an external surface, in conjunction with the cognitive ability to attribute symbolic meaning to its mark-making. Mark-making per se creates an important neural path, one that conceptualises the surface of an object as being replete with meaning: which can be memory, as well as a thought previously not seen.

The term ‘synthetic’ comes from the French ‘synthétique’ and modern Latin ‘syntheticus’ — its roots are found in the Greek word ‘sunthetikos’, based on ‘suntithenai’: place together.

The term ‘cortex’ comes from late Middle English, derived from Latin, literally meaning ‘bark.’ Cortex relates to surface — the notion of a rock, or skin or anything else having a surface doesn’t come into play until a purposeful mark, a symbol, is placed upon it by a mind capable of conceptualising the purposeful mark as a symbol.

Once a surface is marked by human action the surface is transmogrified. The movement from biological to purposeful action by which the synthetic cortex came into existence set up a situation that expanded our mind beyond the confines of internal thought.

In terms of our body, a biological cortex wraps around our brain like bark wraps around a tree. One of the first lessons in neuroscience, according to Susan Greenfield, is that the more sophisticated the species, the greater the surface area of cortex. We can add: the more sophisticated a culture, the greater the surface area of the culture’s synthetic cortex.

The physical properties of the synthetic cortex would appear to be infinite: from pre-historic marks made on rock using haematite pebbles to contemporary portable, wearable screens.

Visible and Invisible Communication

As our personal potholed neural tracks smooth into expansive neural paths, the synthetic cortex (as apparatus) soon becomes cognitively invisible as our gaze moves from surface to content. Initially, for a very brief period, a new technology – ochre, bronze, oil paint, photography, television, smartphone, etc. – rekindles our attention to the surface. But our mind doesn’t dwell there. Our mind longs for meaning; it aches for the story the symbols tell. The ‘synthetic’ cortex, in this sense, is the surface upon which a culture presents its story.

The evolution of culture within hominids, from Homo erectus (2 to 0.4 million years ago) through subsequent hominid permutations, and its continued evolution within Homo sapiens sapiens (130,000 years ago to the present), have much in common.

From the cognitive archaeology of Robert Bednarik, through the neuroscience and research psychology of Merlin Donald, to the philosophy of Martin Heidegger, it is agreed that persons both produce and are produced by culture. Our attention is directed by it; culture mediates our most private thoughts; we learn through its determinants.

The synthetic cortex created a physical, external, mind-place upon which a culture could place unlimited computations. The content of the synthetic cortex has been rendered (and erased) in the pursuit of ‘reminding’ the mind. That is, reminding the culture in question what it needs to remember. (What those in power deem worthy of remembering, which can include erasing memories no longer considered worth remembering.)

The synthetic cortex is central to our sense of self. The influence that the synthetic cortex has within culture has resulted in it falling under the control of those in power (Fell 2011) — democratic and non-democratic bodies alike have a history of influencing the content of the synthetic cortex, of reminding the mind to remember, and of prioritising what to remember and worship (or not) within a specific culture. The synthetic cortex reflects the dominant priorities of culture: consumerism over sustainability, aggression over peace, self over community, or vice versa, depending on the culture. The synthetic cortex wraps around culture; as such, it allows a culture’s sap to flow.

The original collector of the face-shaped pebble inherited a mind that predated the synthetic cortex. That mind relied solely on memory and thought generated by the rising and setting of the sun. It couldn’t make fire; its desires were shaped by pheromones and the cry of the wild beyond its own wild self. It lived in a world devoid of entwinement. Then, existence was cold, linear and bare.

Evidence of the earliest fireplace dates back some 1.7 million years; that’s around one million years after the Makapansgat pebble was collected. After an unfathomable amount of time and several mutational (evolutionary) quirks, mind moved out of the cold and bare 460 cc confines it had occupied for eons and walked towards the warmth of our large domesticated 1350 cc mind-space. Be it stick drawing in sand, ochre applied to rock, axe carving wood, fibre dyed and woven or rope knotted, each surface rendered our entwinement further onto the reflective mindfulness that is the synthetic cortex.

Self And The Synthetic Cortex

Having established that the synthetic cortex predates Homo sapiens sapiens (see Bednarik and Hodgson for an in-depth discussion), it stands to reason that the synthetic cortex predates each individual person born into the world. Such an understanding enables us to appreciate that each newborn ‘mind’ is dependent upon and informed by the purposefully embossed synthetic cortex wrapped around the culture they are born into.

The contemporary mind can be understood as the entwinement of a person’s biological cortex with a ‘local / national / global’ synthetic cortex. Such a worldview is vastly different to that presented via the cave art from 20,000 years ago, or that of the Italian Renaissance of 1500.

As we have seen, we interact with the synthetic cortex prior to our conscious recognition of doing so. At what stage, if ever, we personally deconstruct the synthetic cortex is of prime concern when talking about social and ecological reform.

We learn to read the symbols of our culture’s synthetic cortex prior to the cognitive recognition that we are imbibing culture. As we have seen, our initial relationship with the synthetic cortex is biologically motivated prior to it becoming culturally informed. For example, my twelve-month-old grandson ‘plays’ with his parents’ iPad — like his imbibing of television, internet, billboards, product wrappers, etc., culture will have made its mark upon him prior to his ability to purposefully switch, click, scribe, read or unwrap. None of which will necessarily facilitate his questioning of the minds that wrap the synthetic cortex around his trunk — themselves, a fleck in the fibre.

The power and influence of the synthetic cortex is beguiling. In the late twentieth and early twenty-first century it presents an entwined global narrative – disturbingly, one that esteemed scientists and intellectuals argue undermines global ecological sustainability and human equality. To this end, having listened to experts in law, economics, medicine, politics, journalism, aboriginal affairs, earth sciences, religion, education, nuclear armaments, defence studies and ecology, the organising committee of the ‘Science and Ethics: Can Homo sapiens Survive?’ conference held at the Academy of Science in Canberra wrote:

… civilization as we know it will not survive beyond a few decades unless there is a radical change in human culture, from a society driven by the pursuit of material wealth to one focused on human well-being (Fenner, F., Boyden, S., Green, D., Glikson, A., and Clark, S. 2005).

How accurate or inaccurate the concerned scientists’ prediction is, time will tell. What we do know is that the state of the planet’s biodiversity has continued to deteriorate since 2005 — the capacity of the planet to sustain conditions suitable for civilisation to flourish is now shrouded in a darkening debate.

The absence of any reminder on the synthetic cortex that the more-than-human world (Abram 1996) creates the only atmosphere conducive to human flourishing is an erasure in desperate need of re-scribing.

There have been times when people with the same sized brain as ours used the synthetic cortex to pay homage to the more-than-human world. It would now appear that the constant rendering of a commercial growth imperative upon the surface of our collective memory has convinced us that there is no other path than the one well trodden: a neurological pitfall, as demonstrated by McGilchrist’s (2009) meticulous research into the divided brain and the making of the modern world, a battle between left- and right-hemispheric thinking.

From supermarket to Burj Khalifa, from cinema to smartphone, the reminding on the contemporary synthetic cortex is the result of and resulting in neural paths leading away from ‘remembering’ the central nature of the more-than-human world — arguably, more so than at any time in the hominid saga.

That said, our contemporary unsustainable story can be erased if we can infuse our anthropocentric pathways with reminders of the more-than-human world and our place within it. The challenge that confronts all who make their mark upon the synthetic cortex (architect, author, actor, etc.) is how to redirect the neural paths leading away from nature — how to emboss the synthetic cortex with flourishing reminders of the more-than-human world’s centrality to our existence.

If we can render the synthetic cortex with an ecological imprint, then we can bring about an environmentally sustainable global culture: one in which the biological-to-cultural transformation of an individual, from scribble to scratch, from mumble to song, from crawling to dance, takes place before an unquestioned (non-deconstructed) synthetic cortex; one in which ecological sustainability speaks prior to the individual recognising (or having to know) that their existence is sustainable.

The challenge is substantial. Perhaps no philosopher understood this more than Martin Heidegger: his seminal essay The Question Concerning Technology underscores the danger we confront. Drawing on Heidegger, via the poet Hölderlin, we can appreciate that our future does contain a saving power: ‘danger’ and ‘the saving power’ are entwined within the synthetic cortex:

 

But where danger is, grows

The saving power also.

(Heidegger, 1977)

 


Blood, urine, ochre and sap, having been heated or cooled, stretched or compressed — or simply laid bare — were mixed and fixed to rock, wood and fibre. Having been scraped or brushed, chiselled or smoothed, sewn or scribed, meaning became entwined with body and other. This melding, which had no precedent and has no foreseeable end, has become our reality. It is the cloth that awaits our birth and the shroud that lays us down.

 

Bibliography

Abram, David. (1996). The Spell Of The Sensuous: perception and language in a more-than human world. New York: Pantheon Books.

Bednarik, R. G. (2003a). Seafaring in the Pleistocene. Cambridge Archaeological Journal 13(1): 41–66.

Bednarik, R. G. (1990a). On the cognitive development of hominids. Man and Environment 15(2): 1–7.

Bednarik, R. G. (1998). The ‘australopithecine’ cobble from Makapansgat, South Africa. South African Archaeological Bulletin 53: 4–8.

Bednarik, Robert (forthcoming) The domestication of humans. (Draft Copy).

Blumer, Herbert. (1933). Movies And Conduct. New York: Macmillan.

Broks, Paul. (2003). Into the Silent Land: Travels in Neuropsychology. Great Britain: Atlantic Books.

Denzin, Norman. K. (1995). The Cinematic Society: the voyeur’s gaze. Calif: Sage.

Dissanayake, Ellen. (1992). Homo Aestheticus: Where Art Comes From and Why. Seattle: University of Washington Press.

Donald, M. (2001). A mind so rare: the evolution of human consciousness. W. W. Norton, New York.

Fell, Bruce. (2011). The Power And Influence Of The Synthetic Cortex in Social Ecology: applying ecological understanding to our lives and our planet, edited by David Wright, Catherine E. Camden-Pratt and Stuart B. Hill, Gloucestershire: Hawthorn Press.

Fell, Bruce. (2009). Television & Climate Change: the season finale. Germany: VDM Verlag.

Fenner, F., Boyden, S., Green, D., Glikson, A., & Clark, S. (2005). Science And Ethics: Can Homo Sapiens Survive? [Letter to the editor]. The Canberra Times. Retrieved June 19, 2005, from http://www.manningclark.org.au/papers/se05_fenner_letter.html

Greenfield, Susan. (2008). The Quest For Identity In the 21st Century. Great Britain: Sceptre

Heidegger, Martin. (1977). The Question Concerning Technology and Other Essays (W. Lovitt. Trans.). New York: Harper & Row. (Original work published 1954).

Hodgson, Derek. (2000). Art, Perception And Information Processing: an evolutionary perspective. In Rock Art Research, May. 17 (1), pp. 3-34.

Jacobson, Dan. (2001, June 15). The Risks Digest, Volume 21, Issue 49. catless.ncl.ac.uk. Retrieved September 4, 2011.

Lehrer, Jonah. (2010). The iPhone Mind. The Frontal Cortex. Retrieved March 20, 2010, from http://scienceblogs.com/cortex/2009/01/the_iphone_mind.php

Lynch, Gary & Granger, Richard. (2008). Big Brain: the origins and future of human intelligence. USA: Palgrave Macmillan.

McGilchrist, Iain. (2009). The Master and His Emissary: the divided brain and the making of the modern world. London: Yale University Press.

Van den Nieuwenhof. (2006, January 21–22). Goodbye Molly. The Weekend Australian, p. 38.

About the Author

Dr Bruce Fell lectures in moving image literacy and production. He teaches internal and distance education subjects. Bruce has experience in film, television, video and digital production. Bruce’s PhD, The Question Concerning Commercial Television And The More-Than-Human World, looked at the circumstances in which ecologically sustainable messages could be woven into a commercial environment. Bruce’s MA (Hons), Electronic Media As An AIDS Awareness Facilitator With Prison, compared the documentary style employed in a series of HIV/AIDS awareness programs that he produced with the evolution of documentary screen-based communication production, from Lumière’s ‘actualities’ through to contemporary documentary practice.
