Re: [microsound] Lemur!
That's an absurd amount of money.... you can buy a 15" touchable LCD
from Wacom for $1000 less. OSC is nice, but not /that/ nice.
I think the whole idea is that Lemur accepts multiple touch points at
once. You can use all ten fingers at the same time if you wish.
I saw a live demo a little over a week ago and the multiple finger
input was definitely a major departure (and impressive).
I gather what goes on is you have a programmer app that lets you
arrange and lay out controller objects on the Lemur.
Right now they've developed 5 different objects. Most notable is an X-Y
area that allows multiple-point input and lets you define behaviors for
tracked points after finger contact has been lifted. The others are more
conventional: buttons with programmable states, virtual faders, etc.
You determine the exact OSC data being output by the controllers. You
can use formulas, for instance, to output the ranges and sorts of data
you want.
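To make that concrete (this is just my own rough sketch, not anything
JazzMutant ships -- the OSC address, port, and scaling formula are all
made up, and I'm using the python-osc library on a computer to stand in
for whatever the Lemur does internally):

# Rough sketch of the idea: take a normalized touch position (0.0-1.0),
# run it through a formula, and send the result out as an OSC message,
# the way a Lemur object would. Address and port are invented.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)  # host/port are assumptions

def send_scaled(x_normalized):
    # Formula example: map 0..1 to an exponential 20 Hz - 20 kHz sweep
    freq = 20.0 * (1000.0 ** x_normalized)
    client.send_message("/xy/ball1/x", freq)

send_scaled(0.5)  # would send roughly 632 Hz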
There was a mention that while the controller doesn't directly output
pressure data, they felt that they would eventually be able to
interpret the change in the area of contact as something analogous to
pressure info.
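Just to illustrate that idea (my own guess at how it could work, not
anything they showed -- the calibration numbers are invented):

# Sketch of the "contact area as pressure" notion: clamp the reported
# area between a light-touch value and a flattened-fingertip value and
# treat the normalized result as pressure.
def area_to_pressure(area, area_light=30.0, area_heavy=120.0):
    # area_light / area_heavy are made-up calibration constants
    p = (area - area_light) / (area_heavy - area_light)
    return max(0.0, min(1.0, p))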
Something that came to mind during the demo, though I wouldn't consider
it a criticism: while the Lemur is incredibly seductive if you are
playing it yourself, the demo highlighted a bit of frustration from an
audience's viewpoint. The device is intimate in size, and while it can
control a video performance (and I'm sure Jitter or GEM code could do
something custom in terms of visualizing the output), it doesn't output
a capturable visual, nor is it large enough in scale to let the
audience really in on the performance. That's something that visual
tracking software (which has its own pros and cons) does do
compellingly from an audience perspective.
What seemed to me like an understandable though major criticism is that
overall you only have high-level access to the Lemur end of things. For
instance, you can't go in and develop a new controller object or
redesign the graphics; in other words, you can't work on the Lemur side
of things at a lower level. You can develop something in PD that
interprets the OSC data from the Lemur any way you like, but on the
Lemur itself you are limited to the parameters of the provided
controller objects and the layout of multiple controller objects.
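Since I can't paste a PD patch into a mail, here's that receiving side
sketched in Python with the python-osc library instead -- listen for
whatever the Lemur streams and reinterpret it however you like on the
computer. The OSC address and port below are placeholders, not the
actual Lemur namespace:

# Listen for OSC from the Lemur and handle it however you want.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_ball(address, *args):
    # reinterpret the raw values any way you want here
    print(address, "->", args)

dispatcher = Dispatcher()
dispatcher.map("/xy/ball1", on_ball)  # placeholder address

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()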
One of the observers at the demo made a very valuable observation about
something that hasn't yet been explored by the Lemur developers: as it
stands there is no capability for an external app to update info on the
Lemur screen. The controllers themselves most definitely update and
provide visual feedback, but the controller info streams from the
Lemur to your device (most likely an app or patch running on the same
computer hosting the Lemur programmer app). Only the Lemur programmer
app sends updates from the outside world to the Lemur. I'd think this
is an important area that needs to be addressed if the Lemur is to be a
high-end, flexible performance tool, and to take things from the
seductively novel to a potentially extremely valuable device for
realizing new custom performance interfaces.
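Just to spell out what's missing: something like the snippet below is
what an external app would want to be able to send, but as far as the
demo showed, nothing on the Lemur listens for it. Everything here (IP,
port, address) is invented for illustration:

# Purely hypothetical: the kind of message an external app might send
# back if the Lemur ever accepted screen updates (per the above, it
# currently does not).
from pythonosc.udp_client import SimpleUDPClient

to_lemur = SimpleUDPClient("192.168.0.2", 8002)  # made-up Lemur IP/port
to_lemur.send_message("/fader1/set", 0.75)       # would move fader1 to 75%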
nicholas d. kent
http://technopop.info/ndkent