Just a Few References

I have placed a few videos on here referring to some of the things we have been covering so far, including Aphex Twin’s ‘Bucephalus Bouncing Ball’, which we will be creating a piece in the style of. I put these here as a reference for other students.

John Cage’s 4’33”, a piece that we touched on in critical skills, which I found out was actually performed in three movements! I found it rather interesting to watch, if in slight disbelief, but I guess it makes sense to some people.

Some interesting information from Pierre Schaeffer, showing facts and theories about timbre: for example the similarity between the flute and the piano, and how, when we take away the attack of a sound, we lose much of our perception of what the sonorous source actually is.

I have also put up a video of ‘Piano Phase’ by Steve Reich (process-based music), performed by one person on two pianos! It’s extremely impressive. I came across it with a friend whilst doing research for the essay title I had chosen, which addresses some issues with process-based music.


BBCut (or BBCut2) is a SuperCollider-based program that Jimmy and Julio demonstrated briefly to us in class this week. They used it to cut up a drum loop by separating it at certain hit points and playing the slices back in random order over a requested number of beats, and came up with some really impressive results. It seems like a very good technique for incorporating random selection into a composition, and the outcome sounds like something that would take hours to recreate in a sequencing program. I haven’t played around with it myself just yet, so I can’t elaborate too much, but I hope to in the second semester.
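From what I’ve gathered, a first experiment might look something like the sketch below. I haven’t run this myself yet, so treat the class names (CutBuf2, BBCutProc11) and arguments as assumptions from the BBCut2 documentation rather than a working recipe:

```supercollider
// Sketch of cutting up a drum loop with BBCut2 (untested assumption)
(
s.waitForBoot({
	// load a breakbeat into a buffer -- substitute your own loop here
	b = Buffer.read(s, "sounds/break.wav");
	// CutBuf2 wraps the buffer as cuttable material; BBCutProc11 is a
	// stochastic cut procedure that reorders the slices on the fly
	BBCut2(CutBuf2(b), BBCutProc11.new).play(2.0);
});
)
```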

IXI Quarks

IXI Quarks is a software environment designed for live musical improvisation that allows for user interaction with the hardware, GUI (Graphical User Interface) and code. The program was created in 2008 by Thor Magnusson, who is actually a friend of Julio D’escrivan’s. It has lots of built-in instruments, including the Quanoon, a virtual string instrument that shows a virtual fretboard whose strings can be plucked by moving the mouse over them; very cool. There is also the SoundScratcher, which you can load a sample into and perform many different sound-manipulation tasks on, including moving the mouse over the sample shown as a waveform. The other instruments are the StratoSampler, Soundrops, Mushroom, Predators, Griddler, Polymachine, Grainbox, BufferPlayer, LiveCoder and the ScaleSynth, each of which does its own crazy sound manipulation.
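For anyone wanting to try it, IXI Quarks is distributed as an SC quark. The package and launcher names below are my assumptions from memory, so check the ixi audio site if they don’t compile:

```supercollider
// install the ixi quarks package (name assumed), then recompile the library
Quarks.install("ixiQuarks");
// after recompiling, this (assumed) call opens the instrument launcher window
XiiQuarks.new;
```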

The IXI Quarks environment

The Quanoon

The SoundScratcher

The class has been split up into five groups, each of which has been assigned the task of coming up with ideas for improvising, which we are to perform to each other in class. In week 9’s class we performed our first draft, which turned out to be better than we had thought it was going to be. Anyway, here is a little analysis of this week’s pieces.

Group 1

This was my personal favourite: it used progressive texture building, made good use of pauses, and the players were very conscious of each other, so the sounds didn’t interfere. This was also helped by their good selection of contrasting sounds/samples that complemented each other.

Group 2

This was also a well-performed piece, but it contrasted somewhat with group 1’s. It used a process-based technique, which helped build up the textures. Once they had the main part of the piece going they were able to improvise over the top of it, which also worked very well. This group also worked well with the sounds they chose.

Group 3

Group 3’s piece was structured quite similarly to group 2’s, but they incorporated a bit more improvisation.

Group 4

They used a drum loop sample, which acted as the ‘platform’ for the piece for the rest of the group to build their sounds on. This worked quite well, but from an improvisation point of view it partly took away the concept of listening to each other and responding appropriately. So you could say that putting in this beat was a bit of an easy way out.

Group 5 (Our group)

We also used this ‘platform’ technique, similar to group 4, by adding our drum loop, but we also used changes in tempo to give the piece more contrast in character. We incorporated a drone to give the piece a sturdy backbone to build upon, which definitely helped the performance and made it easier to improvise over. I used the SoundScratcher to manipulate a vocal sample, and Sammy used the ScaleSynth to add more texture. Unfortunately, due to a slight computer malfunction, I wasn’t able to load one of my samples, so I wasn’t entirely happy with my performance in the piece, but it didn’t turn out all that bad in the end. To improve our piece we will be spending a lot more time together practising listening to each other, which is one of the most important things in group improvisation, and using sounds that complement each other a bit more.

Final Writeup For Our Composition

For our finalised composition we decided to get rid of the drum loop (controlled by Dan) that had been used as the platform for us to improvise over. Dan changed his instrument to the Gridder synth, which produces different pitches of a selected sound at random, at a rate determined by how many grid blocks are activated. As Dan stated on his blog, he felt he had more control over it, which helped him keep time with the rest of the group. This helped a great deal by increasing the live interaction between us, making it more like a true improvisation.

Sammy stayed with the ScaleSynth, keeping her drone to add more texture, and also made use of the synth pad effects to give the piece extra melodic colour. I stayed with the SoundScratcher, but changed the samples to give out more synthesised sounds rather than vocal ones, which is what I used before. We found this helped a great deal to give the piece harmony by stretching out the granulated sounds. I had more than one sample going at the same time, which meant I could add some fast, aggressive crescendos and some slower, more subtle ambiences, all of which gave the piece more character. Kayleigh changed to the Predators synth, which works in a very similar way to Dan’s Gridder: adding or subtracting predators or prey increases or decreases the rate at which the chosen sound is produced. This added a bit more contrast in texture, although we found that at times it clashed with the other parts of the piece.

To improve our composition, I think we could incorporate more samples, perhaps keep the drone but give it a greater range of tonality, and plan out different ways to structure the piece, maybe with more crescendos and build-ups that we all contribute to.

Once we had recorded our performances on our laptops, we combined all of the tracks in Logic and manipulated the panning to make it surround sound. Very fun stuff to work with. To activate surround sound in Logic you need to set the output of each track to ‘Surround’ and change the 5.1 setting to ‘Quadraphonic’ in the I/O Assignments tab in Preferences > Audio. I have put up screenshots showing how I did this, and also the panning position of each of the channels in Logic.

Dead good Mouse

In our week 8 laptop musicianship lesson we were shown what I thought was quite a fun thing in SuperCollider: the MouseX and MouseY UGens. By typing this code:

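Something along these lines (the frequency range here is my own choice, not necessarily the one we used in class):

```supercollider
// A sine tone whose frequency follows the mouse's horizontal position.
// MouseX.kr(minval, maxval, warp, lag); warp 1 gives an exponential sweep,
// which suits frequency.
{ SinOsc.ar(MouseX.kr(200, 2000, 1), 0, 0.2) }.play;
```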
This enables us to use the mouse to control the frequency of the SinOsc. The arguments for MouseX.kr are (minval, maxval, warp, lag). Warp can be either 0 or 1: 0 maps the mouse position linearly and 1 maps it exponentially, which suits frequency. MouseX tracks the mouse from the left to the right of the screen (MouseY from top to bottom). The MouseX UGen here is controlling a SinOsc being played at audio rate.

So when we all had these SinOsc mouse controllers running, the class was split up into groups. One person from each group moved the mouse to a certain point on the screen to choose the pitch they wanted, then left it there sounding constantly. The rest of the group then had to use their musical perception to try to match that frequency. It was quite interesting to see who was able to use their hearing in this manner and who wasn’t. We found that someone in our group had landed their frequency a fifth above the correct one, which is a common mistake. I found that I was able to match the frequency pretty rapidly because I have a good ear for that kind of thing, but overall I just found it a fun task to do.
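That ‘fifth above’ slip is easy to check numerically: a perfect fifth is a 3:2 frequency ratio, so against a 440 Hz target the mismatched tone would sit at 660 Hz:

```supercollider
// a perfect fifth above 440 Hz (3:2 frequency ratio)
(440 * 3/2).postln;  // posts 660
```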

Playing Recordings of SuperCollider in Logic

Recording your work in SuperCollider is really easy: all you have to do is choose what type of recording file format you want with this line of code:
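If memory serves, the line sets the header format on the default server, something like:

```supercollider
// tell the default server what header format to record in
s.recHeaderFormat = "WAV";
```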

To record in a different filetype, exchange the “WAV” for the desired type. Here are a couple of the file format types to choose from, found in the ‘SoundFile’ help file in SC:
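I’ve reproduced only the common header formats here (SC reads and writes these via libsndfile, so the full list in the help file is longer):

```supercollider
// Common header format strings accepted by SC (via libsndfile):
// "AIFF"  - Apple/SGI AIFF
// "WAV"   - Microsoft WAV
// "NeXT"  - NeXT/Sun .snd/.au
// "ircam" - IRCAM sound file
s.recHeaderFormat = "AIFF";  // example: record as AIFF instead
```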

Once you have chosen the filetype, click the ‘prepare rec’ button on the server window you are running from (bottom left), and click it once more to start recording. This is the server with the record button ready to be clicked:

To stop recording, just hit the same button again, or hit Command-period. Now, to find your recording, go to the Finder and click on the icon with the name of your computer, then Music > SuperCollider Recordings, and there is the file. This is what it looks like on my computer:

Note: SC automatically gives the file a name, something like ‘SC_091118_174012.wav’, which is 2009, 11th month, 18th day, at 17:40 and 12 seconds! Haha. This is obviously the exact time that the file was created. Very clever.

Now just drag the icon of your track into an open audio track in Logic and, in the words of Tommy Cooper, “just like that”, there is your recorded SC file in Logic!