Subject: MAX Digest - 27 May 1999 to 28 May 1999 (#1999-160)
Date: Sat, 29 May 1999 00:00:07 -0400
From: Automatic digest processor 
Reply-To: MAX - Interactive Music/Multimedia Standard Environments
     
To: Recipients of MAX digests 

There are 13 messages totalling 483 lines in this issue.

Topics of the day:

  1. length of QT movies
  2. time in movie object
  3. wireless headphone mix (2)
  4. Performance in LA
  5. fft~ and transposing
  6. MAX Digest - 26 May 1999 to 27 May 1999 (#1999-159) (3)
  7. pt~
  8. Bad Object
  9. Pitch-Shifter
 10. Notation apps for Win CE?

----------------------------------------------------------------------

Date:    Fri, 28 May 1999 08:05:53 +0200
From:    Jeffrey Burns 
Subject: length of QT movies

>What is the story behind this?  I really need to know the length of movie
>in my patch.  How can I get consistent units from the movie object?  The
>manual does mention the time inconsistencies between movies, but doesn't go
>farther than that.

It's true, QT is the worst documented software since Beethoven. You can
determine the speed of a QT movie when you make it. For example, if you
compile it at 15 fps, you'll have a frame change every 40 index numbers.
That means the movie runs at 600 index numbers per second. The docs don't
mention this number, but every cutie that I've seen conforms to this speed.
If you don't know the speed and length of your existing movies, just open
them in a movie object and, using a number box, see how often the frames
change and when the movie finishes.

If you don't like the speed at which any particular movie is playing, there
are 2 things you can do: 1) give the movie object a rate message. 2) give
the movie object the index number of each successive frame to be shown at
the speed you desire. Of course, you could also recompile the movie.
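
In case it helps, the index arithmetic is easy to wrap up in a few helper
functions. A minimal Python sketch, assuming the 600-units-per-second time
scale described above (the example numbers are illustrative, not read from
any real movie):

TIME_SCALE = 600  # index units per second (an assumption; see above)

def units_per_frame(fps):
    """Index units between frame changes for a movie compiled at `fps`."""
    return TIME_SCALE // fps

def frame_at(index_units, fps):
    """Which frame is showing at a position given in index units."""
    return index_units // units_per_frame(fps)

def seconds_to_units(seconds):
    """Convert a time in seconds to index units."""
    return int(round(seconds * TIME_SCALE))

print(units_per_frame(15))    # 40 units per frame, as in the example above
print(frame_at(1200, 15))     # frame 30, i.e. two seconds into the movie
print(seconds_to_units(2.5))  # 1500 index units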

Jeff Burns

http://www.snafu.de/~jeff

------------------------------

Date:    Fri, 28 May 1999 00:02:07 -0700
From:    David Zicarelli 
Subject: Re: time in movie object

Peter Nyboer  writes:

>When you send a "length" message to the movie object, what are the units of
>the output?

The length message outputs the movie's length in units of the movie's
time scale. Since there is no way to determine the movie's time scale
with a message, I will add one. The default time scale is 600 units per
second, but apparently it's not much of a default any more. It seemed to
be in 1992.

I will also add a duration message that returns the length in
milliseconds.
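
In the meantime, the conversion itself is trivial once the time scale is
known. A throwaway Python sketch (the 600 here is just the old default
mentioned above, not something you can currently query from the object):

def length_to_ms(length_units, time_scale=600):
    """Convert a movie length reported in time-scale units to milliseconds.
    time_scale=600 is only the historical default; a given movie may differ."""
    return length_units * 1000.0 / time_scale

print(length_to_ms(9000))        # a 9000-unit movie at the default scale: 15000.0 ms
print(length_to_ms(9000, 2997))  # the same length value at a 2997 time scale: ~3003 ms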

David Z.

------------------------------

Date:    Thu, 27 May 1999 21:56:25 +0200
From:    Michael Wieser 
Subject: Re: wireless headphone mix

Hi!

Sounds like a simple project :-)

I don't know of anything like this on the market, but the solution I would
suggest is two receivers (one for each RF channel) and a VCA, which uses the
received signal intensity to set the requested mixing level. It could be
done with battery power, and no microcontrollers.
I think it could be done (including housing, receiver only, no transmitter)
for around US$1000.
The problem here in Austria is that you are not allowed to use RF
transmitters outside the public bands, and licensing is expensive. I don't
know the rules in Australia.
One advantage of RF is that it is "audience-independent", so you can have
10 people in the area without trouble.

If infrared could be used, it might be a little bit easier, because you can
"direct" the light much more easily than RF. On the other hand, there are
the problems of light (shadows!), so it could happen that the receiver
doesn't see anything (especially with more than one user). It depends on
the area where you use it.

Another idea - but much more complicated:
Take a video camera, feed the video signal into an online measurement system
(you would have to write the software) and use the output of the system to
drive a digital mixer (something like the 02R with MIDI in). The mixer
signal is then sent to normal RF or infrared headphones. But this solution
needs a lot of computer power and software.

Best regards

>I'm building an installation with two separate sound sources, one off CD
>and the other off the internal mac hard drive. Does anyone know if it's
>possible to use a set of wireless headphones and some sort of proximity
>sensor, so the closer a person gets to a sound source the more they hear
>of it in a headphone mix (only). The further away they get the more of a
>mix they get between the two, etc . . . . .
>
>Any technical info would be greatly appreciated.
>Damian Castaldi
>
Michael Wieser
m.k.w@magnet.at

Service and Audiodesign

------------------------------

Date:    Fri, 28 May 1999 00:44:07 -0700
From:    Jeff Rona 
Subject: Performance in LA

Dear Friends,

On Saturday, June 12th, I'll be performing in Los Angeles using MAX and MSP
with my friend Adam Rudolph's group, the MetaMusic Ensemble. It should be a
very interesting performance with some truly stellar musicians representing
a wide range of music from all over the world. I'm thrilled that Adam has
invited me to be a part of this evening. Adam and I played together a few
years ago in Jon Hassel's band. I'll be doing something quite different from
my film score work: using only a Powerbook, I'll be creating sonic imagery,
textures and rhythms from unexpected sources. It promises to be a very
adventurous evening of music. I hope you can make it by.

Here's information on the evening that Adam has put together:

YOU ARE CORDIALLY INVITED TO ATTEND:

bootstrap:  creative emergence
Music Festival
June 11-13, 1999
Electric Lodge, Venice, CA
1416 Electric Ave.

Tickets: $15 per event ($10 for students w/valid ID)
Tickets on sale one hour before performance time. Seating is limited.
Please Use Free On-site Parking
e mail:  om@metarecords.com
Schedule details:  Working Class Productions: 323-692-8080
 multi-kulti blend of roots
and textures influenced by Don Cherry's Organic Music and Ornette
Coleman's Harmolodic concepts.

SATURDAY June 12 8pm
Adam Rudolph Metamusic Ensemble with I Nyomen Wenten, Charles Moore,
Ralph Jones, G.E. Stinson, and Jeff Rona
 - Fresh from recent touring with Yusef Lateef and Pharoah Sanders,
Rudolph returns to Los Angeles with the premier of his new ensemble.
Hailed as "a pioneer in world music" (NY Times), Rudolph's new
composition features live electronics, Indonesian
percussion, guitar and winds in an improvisational exploration.

Namah with Pejman Hadadi, dancer Banafsheh, Greg Ellis, Ramin Torkian,
Afshin Mehrassa, and Shahram Hashemi
 - Namah is a contemporary expression of Persian classical music with
percussion, strings & dance improvisation.

-------------------
  Be, Hear, Now
jrona@earthlink.net
-------------------

------------------------------

Date:    Fri, 28 May 1999 01:42:51 -0700
From:    dudas 
Subject: Re: fft~ and transposing

David Beaudry ponders:

>> I was reading/searching thru the max digest archive and came across a
>> similar question that I have, however the archive wasn't recent enough for
>> me to get the answer (if there was one).  The question was: is there a way
>> to transpose a note using fft~/ifft~?  For example, I have my clarinet sound
>> analyzed by fft~ (by way of adc~)...before sending it out to ifft~, I want
>> to transpose it up a 3rd, for example, then send the new note out thru dac~.
>> Is there a way to do this?

the fft~ and ifft~ objects break up incoming signals into "frames", whose
fourier analysis provides amplitude and phase information for a given
number of equally-spaced frequency bins.

let's say we do a 4 point fft (i.e. every 4 samples) at a 44.1 kHz sampling
rate... the fft~ object will return 4 real and 4 imaginary values, which can
be converted into amplitude and phase values for bands of the sound's
spectrum centered around the frequencies 0 Hz, 11.025 kHz, 22.05 kHz, and
33.075 kHz. So a 512 point fft will give us amplitude/phase values for 512
frequency bins - one every 86.1 Hz from 0 Hz to 44.1 kHz.  These frequency
bins are at fixed intervals along the sound's spectrum; we can't ask the
ifft~ to modify them, because it just doesn't work that way.

However, because the phase is related to the frequency (the difference in
phase between successive fft frames divided by the number of points between
successive frames gives us the phase derivative, sometimes referred to as
"instantaneous frequency"), it is possible to modify the _phase_ values in
such a way that when the sound is reconstructed, the phase difference
between successive (overlapping) "frames" modifies the frequencies within
each band with respect to their original location within the band.  This
would allow us to transpose or frequency-shift a sound, or even just some
localised areas of a sound's spectrum.

Unfortunately, making a patch to do this is not easy...

(but in theory it should be possible...)
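
For anyone who wants to experiment with this idea outside of Max, here is a
rough NumPy sketch of the same general technique: derive a per-bin
"instantaneous frequency" from the phase difference between overlapping
frames, time-stretch by accumulating those frequencies, then resample so the
stretch becomes a transposition. It is only a sketch under simplifying
assumptions (arbitrary frame size and hop, window normalization glossed
over), not an fft~/ifft~ patch and not production DSP:

import numpy as np

def stft(x, n_fft, hop):
    win = np.hanning(n_fft)
    return np.array([np.fft.rfft(win * x[i:i + n_fft])
                     for i in range(0, len(x) - n_fft, hop)])

def istft(frames, n_fft, hop):
    # simple overlap-add; proper window normalization is skipped here
    win = np.hanning(n_fft)
    out = np.zeros(hop * (len(frames) - 1) + n_fft)
    for i, f in enumerate(frames):
        out[i * hop:i * hop + n_fft] += win * np.fft.irfft(f, n_fft)
    return out

def time_stretch(x, rate, n_fft=2048, hop=512):
    """Phase-vocoder time stretch: rate > 1 shortens, rate < 1 lengthens."""
    frames = stft(np.asarray(x, dtype=float), n_fft, hop)
    mag, phase = np.abs(frames), np.angle(frames)
    # expected phase advance of each bin over one hop
    bin_inc = 2 * np.pi * np.arange(n_fft // 2 + 1) * hop / n_fft
    out_frames = []
    acc_phase = phase[0].copy()
    for pos in np.arange(0, len(frames) - 1, rate):
        i = int(pos)
        frac = pos - i
        m = (1 - frac) * mag[i] + frac * mag[i + 1]      # interpolated magnitude
        dphi = phase[i + 1] - phase[i] - bin_inc         # deviation from the bin centre
        dphi = np.mod(dphi + np.pi, 2 * np.pi) - np.pi   # wrap to [-pi, pi)
        acc_phase = acc_phase + bin_inc + dphi           # advance by instantaneous frequency
        out_frames.append(m * np.exp(1j * acc_phase))
    return istft(np.array(out_frames), n_fft, hop)

def pitch_shift(x, semitones):
    """Shift pitch while roughly keeping duration: stretch, then resample."""
    ratio = 2.0 ** (semitones / 12.0)
    stretched = time_stretch(x, 1.0 / ratio)
    idx = np.arange(0, len(stretched) - 1, ratio)
    return np.interp(idx, np.arange(len(stretched)), stretched)

# e.g. up a major third (clarinet_samples being a hypothetical mono float array):
# shifted = pitch_shift(clarinet_samples, 4)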

-R

------------------------------

Date:    Fri, 28 May 1999 14:52:50 +0000
From:    Michael Kieslinger 
Subject: Re: wireless headphone mix

>From:    nomad 
>Subject: Re: wireless headphone mix
>
>I'm building an installation with two separate sound sources, one off CD
>and the other off the internal mac hard drive. Does anyone know if it's
>possible to use a set of wireless headphones and some sort of proximity
>sensor (e.g. from I-Cube), so the closer a person gets to a sound source
>the more they hear
>of it in a headphone mix (only). The further away they get the more of a
>mix they get between the two, etc . . . . .

i would use two proximity sensors, placed at the two chosen locations in the
space. they should send their info into MAX/MSP, where i would mix the two
sources (CD/HD) and send the resulting stereo sound to the headphones.
basically the mixing is done in max, but the visitor perceives it in a
spatial context.
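
For the mixing stage itself, the mapping from two proximity readings to a
pair of channel gains is straightforward. A hypothetical Python sketch (the
sensor ranges and the equal-power law are my own assumptions, not anything
specific to the I-Cube):

import math

def crossfade_gains(dist_a, dist_b):
    """Map two proximity readings (smaller = closer) to equal-power gains.
    dist_a and dist_b are hypothetical normalized distances in [0, 1]."""
    total = dist_a + dist_b
    x = 0.5 if total == 0 else dist_b / total   # relative closeness to source A
    gain_a = math.sin(x * math.pi / 2)          # louder as the visitor nears source A
    gain_b = math.cos(x * math.pi / 2)          # louder as the visitor nears source B
    return gain_a, gain_b

print(crossfade_gains(0.1, 0.9))  # mostly source A (say, the CD)
print(crossfade_gains(0.5, 0.5))  # equal mix, both gains about 0.707

In Max/MSP the same two gain values could simply scale the CD and hard-disk
signals (e.g. with *~ objects) before they are summed into the headphone feed.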

m.

=-=-=-=-=-=-=-=-=-=-=-=-=-=--=-=-=
Michael Kieslinger
Computer Related Design
Royal College of Art, London

tel: +44 171 590 4444 (ext. 4299)
e-mail: m.kieslinger2@rca.ac.uk
http://www.crd.rca.ac.uk/~michaelk
=-=-=-=-=-=-=-=-=-=-=-=-=-=--=-=-=

------------------------------

Date:    Fri, 28 May 1999 07:33:06 -0700
From:    Carl Stone 
Subject: Re: MAX Digest - 26 May 1999 to 27 May 1999 (#1999-159)

Better late than never:

Yasuhiro Otani and yours truly will both be ganging up on MSP tonight in
San Francisco:

Fri May 28 SAN FRANCISCO CA  8 pm NEW LANGTON ARTS [1246 Folsom Street]
for more information tel: 415.626.5416 or email nla_arts@sirius.com

CARL STONE

Today's Palindrome: N.A. medico: "Negro Jamaica? A CIA major genocide, man."

C  -----------------------------------------------------
A  INTERNET: cstone@sukothai.com      WELL: cstone
R
L  -----------------------------------------------------
 STONE      4104 24th Street PMB #410
                      San Francisco CA 94114 USA
         -----------------------------------------------
                            WEB: http://www.sukothai.com

------------------------------

Date:    Fri, 28 May 1999 12:58:52 -0400
From:    Robin Davies 
Subject: pt~

Hi,

I'm having a hard time getting Max to include pt~ in a compiled app.
Anyone know a secret I don't?

Thanks,

Robin

------------------------------

Date:    Fri, 28 May 1999 14:04:13 EDT
From:    JohnBrit@AOL.COM
Subject: Re: Bad Object

>
>(I wrote)
>>Free Object:<7 digit hex number>:Bad Object. Sometimes this prints out
>>repeatedly until I force quit Max, other times it prints out once and Max
>>quits with a type 2 error. The hex number is never the same twice. I have
>>tried through process of elimination for many hours off and on to try to
>>isolate the offending article but with no success. Anyone got an idea as
>to what is happening and how to remedy it ?
>
In a message dated 5/27/99 20:03:46, SK wrote:

>Did you try the old:
>- Save a backup copy of the patcher somewhere safe
>- Start deleting objects and saving/closing the parent patcher until
>  the problem goes away?
>
>
>Used to work for me.  As to what causes this error, I could never be
>sure.  You just have to futz around until it goes away.  Sometimes
>just the very act of moving objects around and resaving can fix the
>problem.

Yes, I've even tried opening the patchers and writing a new identical one
side by side and then replacing the old with the new. Still get the same
result. Just when I thought I was pretty damn good at debugging and isolating
conflicts, something like this comes along and fries my brain. I quite like a
challenge - but not when I fail. Aaaaghhh.
JW

------------------------------

Date:    Fri, 28 May 1999 18:30:49 -0000
From:    "Timothy A. Place" 
Subject: Pitch-Shifter

Hello,

Due to the current interest in it, I thought I would make available an
external I have for doing pitch-shifting/harmonizing.  This past April,
Ichiro Fujinaga coded the external for use in a piece that I was working on
for Saxophone and interactive electronics.  It uses phase vocoding
techniques for the analysis and a bank of 1000 oscillators for the
resynthesis - so it is a gigantic CPU hog.
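
To give a feel for why a 1000-oscillator resynthesis is such a CPU hog: every
output sample is a sum over every oscillator. Here is a toy Python sketch of
the general shape of oscillator-bank resynthesis (the partial frequencies and
amplitudes are invented; this is not the pv~ code):

import numpy as np

def oscillator_bank(freqs, amps, duration, sr=44100):
    """Naive additive resynthesis: one sinusoid per (freq, amp) pair, summed.
    With ~1000 oscillators this is sr * 1000 sine evaluations per second of
    audio, which is why this kind of resynthesis eats CPU."""
    t = np.arange(int(duration * sr)) / sr
    out = np.zeros_like(t)
    for f, a in zip(freqs, amps):
        out += a * np.sin(2 * np.pi * f * t)
    return out

# e.g. 1000 harmonic partials of a rough sawtooth on 110 Hz,
# keeping only those below Nyquist:
n = 1000
freqs = 110.0 * np.arange(1, n + 1)
amps = 1.0 / np.arange(1, n + 1)
keep = freqs < 44100 / 2
signal = oscillator_bank(freqs[keep], amps[keep], duration=1.0)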

It is kind of quirky; I haven't had the time to work on that yet, but I
figured I would upload it anyway so it can be used...   Here it is:

http://www.peabody.jhu.edu/~tap/MSP/pv~.bin

good luck!

   -Tim Place

Xavier Chabot wrote:
>
>- I assume you want to transpose in real time, right?
>- yes it is possible, but you would have to do it in C.  MSP programming
>would be quite a challenge. Note that using only fft/ifft would give you
>only a limited number of possible transposition ratios
>- you are better off using a harmonizer or a frequency shifter, but of
>course both have a very recognizable timbral signature
>
>xavier
>
>David Beaudry wrote:
>>
>> Posted this last week but haven't received any response yet.  Is this
>> something that is at least possible, and if so, where should I look for more
>> help.  Thanks.
>>
>> > Hello all:
>> > I was reading/searching thru the max digest archive and came across a
>> > similar question that I have, however the archive wasn't recent enough for
>> > me to get the answer (if there was one).  The question was: is there a way
>> > to transpose a note using fft~/ifft~?  For example, I have my clarinet sound
>> > analyzed by fft~ (by way of adc~)...before sending it out to ifft~, I want
>> > to transpose it up a 3rd, for example, then send the new note out thru dac~.
>> > Is there a way to do this?
>> >
>> > Thanks in advance for any help.
>> >
>> > David Beaudry

____________________________
./`./`  Timothy A. Place
./`./`  tap@peabody.jhu.edu
./`./`  www.peabody.jhu.edu/~tap

------------------------------

Date:    Fri, 28 May 1999 13:07:47 -0700
From:    "Keith A.McMillen" 
Subject: Notation apps for Win CE?

This is off the main sequence but....I'm heading out of my current reality
for some weeks and just got a Win CE Phenom. I am looking for a simple
notation/audition app that would let me sketch out some melodies/ideas
while painfully away from my wonderful Mac/Max world. I've hit the usual
suspects and now turn to you. Any ideas?

Thanks,

Keith McMillen

------------------------------

Date:    Sat, 29 May 1999 01:09:17 +0100
From:    Carl Faia 
Subject: Re: MAX Digest - 26 May 1999 to 27 May 1999 (#1999-159)

>Edward Spiegel writes:
>
>>I just dropped by the Cycling74 web site and saw the MSP system
>>performance table and I was wondering if anyone has a nice little test
>>program they would share for coming up with such stats.

To add to the explanation of the very scientific method David uses to
measure MSP performance, different sound cards will affect the EPA estimate.

Carl

------------------------------

Date:    Sat, 29 May 1999 01:09:43 +0100
From:    Carl Faia 
Subject: Re: MAX Digest - 26 May 1999 to 27 May 1999 (#1999-159)

Jeff,

I've never really understood exactly how Overdrive is supposed to work.
That is, if you put audio in scheduler interrupt, my impression is that the
audio takes priority. That said, there are a lot of factors that affect the
timing in Max. What are you using as I/O vector size? Do you have a lot of
stuff displayed in real-time in the patch?

You can change the timing in the configuration preference. And maybe try to
put a defer object in a couple of mysterious places (take a look at the
help patch).

Carl

Jeff Rona writes:

> I've done a somewhat complex patch using a fair amount of audio processing.
>The dac~ window shows I'm at about 70% of CPU Usage. Everything sounds great
>and feels very in time (I'm triggering many audio events with a 'tempo'
>object).
>
> I added some simple MIDI functions. Mainly just some MIDI throughput on a
>single channel. This has thrown things for a bit of a loop (no pun). Moving
>one slider on a MIDI fader box (not a lot of data) causes the audio to
>totally stop while the data is passing through. This unexpected problem only
>occurs if I have Audio In Scheduler Interrupt checked. So I uncheck it and
>the problem goes away.
> But I do find that the audio itself, when no MIDI is present, seems to be
>more rhythmically accurate with AISI checked.
>  Has anyone found a similar situation?

------------------------------

End of MAX Digest - 27 May 1999 to 28 May 1999 (#1999-160)
**********************************************************