Wednesday 27 August 2008

CC2 - Semester 4 - Week 4

Ring Modulation / Amplitude Modulation.

Having worked with a number of pieces of electronic music software, I did not have much difficulty actually "understanding" the concepts of Ring Modulation (RM) and Amplitude Modulation (AM). In fact, I personally think that after the low-pass filter -with a reasonable amount of resonance- RM and AM are the most widely used synthesis techniques in today's electronic music.
Coming up with the patch, however, was not that easy at the beginning. I had never been required to actually "calculate" the amount of RM and AM I was using; if it sounds good, it is good. Nevertheless, the insight into the whole procedure was pretty interesting. The most annoying thing about working with Max/MSP so far -at least for me- is when I use a shortcut (usually the space bar) to turn the device on and off... Avoiding confusion when there is actually a "mute" button provided is a hard task; or probably a hard task for someone like me.
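As a side note (and not part of the patch itself), the underlying maths is small enough to sketch outside Max. Here is a minimal Python/NumPy illustration of the difference, with made-up carrier and modulator frequencies: ring modulation multiplies the carrier and modulator directly, while classic AM offsets the modulator so the carrier survives in the output.

    import numpy as np

    sr = 44100                               # sample rate in Hz
    t = np.arange(sr) / sr                   # one second of time values

    carrier = np.sin(2 * np.pi * 440 * t)    # 440 Hz carrier
    modulator = np.sin(2 * np.pi * 30 * t)   # 30 Hz modulator

    # Ring modulation: plain multiplication. The carrier itself vanishes,
    # leaving only the sum and difference sidebands (470 Hz and 410 Hz).
    rm = carrier * modulator

    # Amplitude modulation: offset the modulator so it stays positive;
    # the carrier remains in the spectrum alongside the sidebands.
    depth = 0.5
    am = carrier * (1.0 + depth * modulator)
    am /= np.max(np.abs(am))                 # normalise to avoid clipping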
The things we have to make are becoming fancier; there might in fact be a possibility of getting some sort of job with one of these audio technology companies (I'd love to work with Propellerhead!).


References:
- Christian Haines. "Creative Computing 2.2." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 21/08/2008.
- "Max/MSP." Wikipedia. http://en.wikipedia.org/wiki/Max/MSP [Accessed 26/08/2008].

Tuesday 19 August 2008

Forum - Semester 4 - Week 3

First Year Students' Presentations

It was a good feeling looking at the works of people creating music under the same circumstances as we did last year. Their perceptions and motives were more or less the same, and so were the expectations. Nevertheless, as is the nature of music, and particularly of musique concrète (the genre in which most of the students presented their works), we once again got acquainted with new ideas, approaches, concepts and perceptions; great for me.
Funnily enough, the first-year students sound as if they think the same way that we (or at least I) used to last year, and I assume they will go through the same change and evolution of ideas as they continue; if they do!
I won't go through each piece; my main interest is the overall change -of attitude- in them, us, and others. As always, there was the big subject of "what is music", which I am tired of touching again; however, the good news is that the appreciation for avant-garde art -in this case musique concrète- seems to be growing. I guess we would take a huge leap in the amount of knowledge we gain at uni if we had more subjects on, for example, something like this:
"How to Recognise, Deal with, Appreciate and Like Something New which Might Look Unattractive to Us at First"...

References:
Stephen Whittington. "Music Technology Forum - Week 3 - First Year Student Presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 14/08/2008.

CC2 - Semester 4 - Week 3

Polyphony and Instancing

Using poly~ was not very difficult in terms of designing the patch, nor was understanding the concept of a polyphonic device. What was new -and interesting- for me was that, having worked with software that already has this technology embedded (Reason, Live, Pro Tools, etc.), I could begin to see what was going on behind the scenes. In fact, I guess it would make even more sense if I "ReWired" Max and these programs and came up with something more exclusive to me.
What I chose to make polyphonic was last week's patch, which generates a signal using a cycle~ object and ramps its volume from full scale (100) down to minimum (0).
I made a device that takes 5 inputs and constantly ramps their volumes up and down over a period of 2 seconds. The result is actually pretty nice! I set the device to divide each individual volume by 5 so the overall output doesn't clip.
I also provided an octave changer (basically to look fancier and more like a real-world control surface).
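For what it's worth, the behaviour of that device boils down to a few lines outside Max as well. Here is a minimal sketch in Python/NumPy (illustrative only, with made-up frequencies, not the actual poly~ patch) of five voices whose amplitudes ramp up and down over a two-second period, each divided by the voice count so the summed output doesn't clip:

    import numpy as np

    sr = 44100
    dur = 4.0                                  # seconds of output
    t = np.arange(int(sr * dur)) / sr

    freqs = [220, 277, 330, 440, 554]          # five example voices (Hz)

    # Triangle-shaped amplitude ramp: 0 -> 1 -> 0 every 2 seconds,
    # like ramping the volume between minimum (0) and full scale (100).
    period = 2.0
    ramp = 1.0 - np.abs((t % period) / period * 2.0 - 1.0)

    # Sum the voices, dividing each by the voice count so the mix
    # stays within [-1, 1] and the overall output doesn't clip.
    mix = sum(np.sin(2 * np.pi * f * t) * ramp for f in freqs) / len(freqs)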
I might have done something wrong or insufficient, but I had this feeling that something was left undone at the end! Maybe I am just lucky with MSP, or maybe I simply don't understand the objectives!

Listen to the file here: http://www.box.net/shared/ded9f65v73

References:
- Christian Haines. "Creative Computing 2.1." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 14/08/2008.
- "Max/MSP." Wikipedia. http://en.wikipedia.org/wiki/Max/MSP [Accessed 19/08/2008].

Friday 8 August 2008

Forum - Semester 4 - Week 2

This week's Forum session "My Favourite Things" was presented by David Harris.
David talked about his collaboration with the Grainger Quartet in a three-concert series in 2008.
For some reason -which I didn't really understand- we were supposed to follow the scores for the pieces; I lost track of both the music and the scores by the end, though.
His composition "Terra Rapta (Stolen Land)" was a piece dedicated to the story of the Stolen Generations of Aboriginal Australians (or maybe to the individuals themselves), which I found good-sounding; yet not one of my "Favourite Things"...
David's approach to naming his work was pretty interesting to me. Why Latin, exactly? I have the same issue: when I want to name my tunes, I tend to aim for something as weird as possible, yet I can only think of simple things. Is using non-English languages the best way?
Following that, he played another piece from the concert series, originally by Schubert, which -as far as I observed- bored a considerable number of Music Tech students; a phenomenon that led Stephen to describe David's intentions as "provocative".
I have referenced some material about Harris' concerts that I found on the internet:

https://www.qtix.com.au/show/Grainger_Quartet_Darkness_Light_08.aspx
http://www.graingerquartet.com/

References:
David Harris. "Music Technology Forum - Week 2 - My Favourite Things." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 07/08/2008.

CC2 - Semester 4 - Week 2

Switching and Routing Signals.

Most of my time this week was spent on providing help files for last week's exercises. I still have issues understanding the fundamental differences between Max and MSP; however, I hope -and it seems- that there are not many basic dissimilarities in the logic and the application of various algorithms.

The three parts of this week’s exercise were:

a) Adding a mute function to the previous objects.

b) "GUIising"[1] the previous objects.

c) Creating two signal generators utilising cycle~ and phasor~.

Worth discussing: for the third part, I opted to merge cycle~ and phasor~[2], forming a patcher with three parts: a cycle~, a phasor~, and a combination of the two, which works on the basis of the cycle~ generating the sound and the phasor~ modifying it. The third part stands out in the picture since I crafted it on a different OS platform.

Probably the most notable issue I faced was the noise that cycle~ and phasor~ made when the phases of their generated waves changed. I tried several ways to eliminate this noise, but since you cannot connect a number~ to either of the phase inlets, you inevitably get the clicks that come from driving them at the -relatively- low update rate of a number object (rather than a signal-rate number~).
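The same click appears in any environment when a phase (or any other signal parameter) jumps instead of gliding. A rough sketch of the idea in Python/NumPy (not MSP, and only an illustration of the principle): stepping the phase instantly creates a discontinuity in the waveform, while interpolating the new value in over a few milliseconds -roughly what a signal-rate ramp would do- removes it.

    import numpy as np

    sr = 44100
    t = np.arange(sr) / sr
    freq = 220.0

    # Abrupt phase change halfway through: a half-cycle jump creates
    # a discontinuity in the waveform, which is heard as a click.
    phase_step = np.where(t < 0.5, 0.0, 0.5)
    clicky = np.sin(2 * np.pi * (freq * t + phase_step))

    # Smoothed phase change: ramp the same offset in over 20 ms,
    # roughly what driving the inlet at signal rate would achieve.
    ramp_time = 0.02
    phase_smooth = np.clip((t - 0.5) / ramp_time, 0.0, 1.0) * 0.5
    smooth = np.sin(2 * np.pi * (freq * t + phase_smooth))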


References:
- Christian Haines. "Creative Computing 2.1." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 07/08/2008.
- "Max/MSP." Wikipedia. http://en.wikipedia.org/wiki/Max/MSP [Accessed 11/08/2008].

Post Script: Mistakenly, the ZIP file in the Box.net widget is labelled Sanad CC2 Week 4; it should say Week 2. The contents of the ZIP file are all correct, though.

[1] GUIise = Create Graphical User Interface version of something.

[2] Believe it or not, I spent almost two hours searching for some object called “Rasor~”!

AA2 - Semester 4 - Week 2

Game Sound Analysis: Need for Speed: High Stakes (1999).


This racing game is based on the idea of running away from the police[1] while listening to heavy metal music; hence the sounds of sirens, police helicopters, etc. (which were new to the NFS series).

Much attention is paid to the detailed sounds and noises of the cars, including their engines, their brakes, and other parts of the car bodies. Other cars are also heard according to their position (left, right, and on surround systems front and back) relative to the player's car. Most of the sounds can be categorised as hyper-real.

Another aspect of the sound in this game is its theme music; like many other themes of this period (and beyond), the music does not pause when the player pauses the game, and there are two (or even more) separate layers of sound going on at the same time. However, when the game completely changes scene (e.g. from "playing" to "choosing a car"), the music changes along with all the other effects, indicating the entrance to a new environment.

There is not much narration in this game, except for the policemen talking on their walkie-talkies.

(Video reference: seconds 6 to 51)



[1] It's much more pleasant to me than the kill-kill-kill games that are highly regarded as the driving force of development in the game industry.


References:
- Christian Haines. "Audio Arts: Semester 4, Week 2." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 05/08/2008.



Friday 1 August 2008

Forum - Semester 4 - Week 1

Audio / Music Listening Culture (or why people listen to what they listen to, and why they do it the way they do it!)
The evolution of music technology has had a significant impact on the way music is perceived by the public. The topic of this session was the pros and cons of this phenomenon.
According to Stephen, the Walkman -as a symbol of this revolutionary change- has acted as a device that provides a "movie soundtrack" for "people's lives"; however, I personally often wonder (for example) "... how bizarre it is that walking in the streets at 2 o'clock after midnight is accompanied by some psytrance music"! Maybe my life doesn't have a conventional soundtrack. Other opinions that popped up suggested ideas such as "headphones have provided an extent of isolation from the outside world", "technology has increased the quantity of music listening and decreased its quality", etc...
Like many forum sessions, the discussion eventually went all over the place and a whole lot of other issues arose again; my problem is that no one seems to comment on anything while respecting the fact that these issues are "subjective". Just because something on Triple J has become famous doesn't mean that "radio is the way..." or any such judgement. I think things differ radically from place to place; if we are to examine these issues, we need a more open mindset and should avoid issuing prescriptions for the entire globe, especially when our personal experiences are limited to a few communities.

References:
Stephen Whittington. "Music Technology Forum - Week 1 - Audio Culture." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 31/07/2008.

CC2 - Semester 4 - Week 1

MSP (Max Signal Processing)

The basic challenge of these two patches -which also happened to be beneficial- was to build a "ramp" with which I can generate values starting at one number and finishing at another over a defined time period. The benefit is that I now have an already-developed ramp patch that I can reuse under any circumstances in any Max or MSP patch.
For this week, the main use of this "ramp" was to control the volume (amplitude) in a number of ways; most importantly, to achieve a "smooth" change in the "on/off" process (of this, or any other, device).
Unlike with Max, I didn't face any major problems [yet], because most of the job was dealing with algorithms and algebra; basically working with numbers.
If this "ramp" is used to change the phase of the generated wave (in a stereo context), it causes -not surprisingly- the effect of a phaser, which is widely used and thus sounds pleasant to many individuals (including me).
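The ramp itself is nothing more than linear interpolation between two values over a given time. A minimal Python/NumPy sketch of the same idea (illustrative only, not the actual Max patch), used here as a fade-in so the "on" switch doesn't click:

    import numpy as np

    def ramp(start, end, duration, sr=44100):
        # Linear ramp from start to end over duration seconds.
        n = int(duration * sr)
        return np.linspace(start, end, n)

    sr = 44100
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 440 * t)

    # Smooth "on": fade the amplitude from 0 to 1 over 50 ms,
    # then hold at full scale for the rest of the second.
    fade = ramp(0.0, 1.0, 0.05, sr)
    envelope = np.concatenate([fade, np.ones(len(t) - len(fade))])
    tone = tone * envelope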
Most of my time, however, was spent debugging the device and making it more user-friendly.

References:
- Christian Haines. "Creative Computing 2.1." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 30/07/2008.
- "Max/MSP." Wikipedia. http://en.wikipedia.org/wiki/Max/MSP [Accessed 31/07/2008].