
Re: [microsound] livecoding



Hello,
mail@xxxxxxxxxxxx wrote:

> Visual live enactment of electronic music is a must nowadays. We all agree
> on that. 

Really? What happened to the acousmatic experience Kim wrote about
several times ... ;)

> I've seen the live coding videos, and it seems to me that there's still a
> gap between the code on the screen and the sounds emerging from the
> speakers. There's no way ordinary public can relate each code event to
> each sound. The live coders claim that there's no need to know how to play
> a guitar to enjoy a guitar performance. But they forget that when we watch
> "real world instruments" performances, there are constant synchresis
> points (Michel Chion). You may not be a guitarist but you clearly
> understand a string striking movement as the source of a sudden sound with
> a short attack time.

Maybe the videos I posted aren't a very good example of live coding,
as a lot of the performers aren't actually doing live coding. But I've
seen performances by Dave Griffiths several times now, and they are a
fascinating experience. Okay, I can code a bit myself, but judging
from the faces of other people in the audience, I got the impression
that in a strange way they too found it fascinating to watch him
constantly editing on stage and to see these strange words on screen
change in accordance with the sounds being generated. Even if it was
only because they had grown bored of performances where the performer
could just as well have been playing Minesweeper. I like the part of
the Livecoding Manifesto where it says: "Show us your screens!"

That said, even I really couldn't tell what the code in this picture
is supposed to mean:
http://lad.linuxaudio.org/contrib/zkm_meeting_2005/photos/frank_neumann/DSCN3593.JPG

> The same goes for the drums, and almost any other traditional instruments.
> Even piano, where the keyboard may not be in the audience's sight range,
> admits a clear synchresis interpretation through watching the arm movements.
> And, like piano, all these instruments and their mechanisms are intuitively
> understood by virtually any human being.

Then why not stop using computers? I mean this seriously: one of the
main differences between traditional instruments, or traditional tools
in general, and the computer as a meta-tool is that the computer is a
tool for an intrinsically non-physical act: the act of thinking.

In my view, the computer has more in common with pen and paper and the
book than with the hammer. Watching someone hit his fingers with a
hammer is interesting; watching someone read a book is boring. But
does that mean that reading is wrong? What the live coders try to do
in this regard is to find a way to present the act of thinking with a
computer as an interesting activity nevertheless, and to do so without
pretending that a computer is a hammer. A computer can be that as
well, but then a hammer is the better hammer most of the time.

Ciao
-- 
 Frank Barknecht                 _ ______footils.org_ __goto10.org__
