Saturday, 18 October 2008

AA2 - Semester 4 - Week 10

Ambient sound.
At the time of writing, my game has not yet been approved, and it has no ambient sound! So I have imagined an ambience with some relevance to the game.
Since the game takes place in a relatively "creepy" environment, I came up with a background sequence along these lines:
- The main character is running through a narrow alley in the dark of night, looking for someone. Location: eastern Los Angeles!
- The main character is under the heavy influence of drugs. His torturers have put him in a chamber, and he is totally tripping!

This is my concept for a game like The Godfather (by Electronic Arts). I actually read about it just last night; it sounds pretty cool.

DOWNLOAD THE 2-PART MP3.

References:
- Christian Haines. "Audio Arts: Semester 4, Week 10." Lecture presented at the EMU, University of Adelaide, South Australia, 15/10/2008.

- The Godfather: The Game. Wikipedia. (http://en.wikipedia.org/wiki/The_Godfather:_The_Game) [Accessed 18/10/2008]

CC2 - Semester 4 - Week 9

FFT

The basics and technical characteristics of the Fast Fourier Transform aside, it is a useful tool, and that usefulness is my main interest.
This week's exercise was to utilise FFT in a device, and it was recommended that we keep the final project in mind.
My final project is kind of a mixer; hence the interface of this week's patch (and also its function, really).
This patch, which I have called the "Self Manipulator", loads a sample into its buffer and, using groove~ and one of the FFT examples, generates additional "deformed" waveforms which, accompanying the original sample, sound good (to me at least).
As I mentioned, most of my attention went to the applications of FFT rather than to what happens inside the process. I will probably include this patch somewhere in my final project.
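For a rough illustration of the kind of spectral "deformation" I mean, here is a minimal Python sketch (my own stand-in, not the actual Max patch; the phase-randomising trick and all the numbers are assumptions):

```python
import numpy as np

SR = 44100

# Stand-in for the buffered sample: one second of a decaying tone.
t = np.arange(SR) / SR
sample = np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)

def deform(block):
    """Keep each FFT frame's magnitudes but randomise its phases."""
    spectrum = np.fft.rfft(block)
    phases = np.random.uniform(0, 2 * np.pi, spectrum.shape)
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), len(block))

# Process in windowed, overlapping frames (a crude pfft~-style loop).
N, hop = 2048, 1024
window = np.hanning(N)
out = np.zeros(len(sample))
for start in range(0, len(sample) - N, hop):
    out[start:start + N] += deform(sample[start:start + N] * window) * window

# Mix the "deformed" layer quietly under the original, as in the patch.
mix = sample + 0.5 * out
```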
Note: it seems that the folders containing some of the FFT examples need to be added to Max's search path; the FFT convolution~ example did not work until I manually added its folder in the File Preferences options.

DOWNLOAD THE PATCH

Cheers.

References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 16/10/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 18/10/2008]

Tuesday, 14 October 2008

CC2 - Semester 4 - Week 8

This week’s exercise was simple: get the Novation ReMOTE SL to control a sampler.
I ended up using 8 controllers: 5 knobs (pots), 2 buttons and 1 slider.
The interesting challenge was the GUI. As time goes on, the interfaces I come up with become more sophisticated, and of course there are more “bugs” to fix!
Nevertheless, MSP still seems easier than Max; we are near the end of the semester, and if MSP were hard it would have shown itself by now. Or maybe I am just taking it super-easy?
Back to the GUI: I actually incorporated something funny this week. There is a Mute button (see the picture, above the Volume Control), and when the sound is muted, the box changes to “unmute” and starts blinking; not regular blinking though, it actually fades from black to red and back in a pretty smooth fashion. To be honest, I liked this part the most!
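For anyone curious, that smooth black-to-red blink boils down to a raised-cosine crossfade between two colours. A tiny Python sketch of the idea (my own reconstruction; the Max patch does this with its own timing objects):

```python
import math

def blink_colour(t, period=1.0):
    """Fade smoothly between black and red over one period (seconds)."""
    level = 0.5 - 0.5 * math.cos(2 * math.pi * t / period)
    return (int(255 * level), 0, 0)  # (R, G, B)

# One second of the blink, sampled nine times: black -> red -> black.
for i in range(9):
    print(blink_colour(i / 8))
```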
This patch and the one I did (in week 6, I think) will be good tools for me and the music-collage project I keep meaning to start.
Nice!

HERE is the ZIP file of this week's exercise.

Note: don’t mind the name of this patch; I thought it was week 9. In fact, it IS week 9, but the exercises are a week behind or something, I think…

References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 09/10/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 11/10/2008]

Sunday, 12 October 2008

I have not posted anything for ages.
Just about the AA part: in week 7 we had a team project which I don't think went very "team-ly". In fact, it was really just Freddie, to a large extent Edward, and to some extent Doug who took the exercise seriously; I simply failed to cooperate well. The final result ended up far from what I had in mind. No complaints though; I really appreciate what the others did. What I am saying is that 'team work' does not seem to work for me.

AA2 - Semester 4 - Week 9

SciFi & Horror films; the crystallisations of "All the things I hate".

The exercise was to take 4 sounds from the game we plan to use for the project and modify them according to the principles in readings (a) and (b) listed below.
I have not yet finished the proposal sheet, so the game is not formally approved; but as far as I am concerned, this is what I want to do, and if nothing changes, here we go...

In the game, some "father" hits others with a baseball bat. I changed one sound which I had downloaded from freesound.org and compressed and equalised it. The result is the first part of the final MP3.

This father dude will eventually kill some chicks; hence the screaming sound. Honestly, I found the original sound in the game pretty low quality. Once again: freesound.org, compression, fade-in and fade-out, etc.; 2nd part of the MP3.

the "bad guys" of the game are sometimes dogs, and they growl. This was pretty tricky 'cause I had to shift the pitch and stretch the time; 3rd part of the MP3.
The 4th part probably took the longest time to finish. As you can see in the picture, I quadropled the initial sound (glass breaking, also from freesounds.org) and changed the pitch of the 3rd and 4th part, panned them, compressed them and basically did so much to the poor sample.

By the way, everything was done in Audacity; challenges: zero.

Downloadables:
1: the ORIGINAL sounds in order.
2: the MODIFIED sounds in order.

References:
- Christian Haines. "Audio Arts: Semester 4, Week 9." Lecture presented at the EMU, University of Adelaide, South Australia, 08/10/2008.


Readings:
(a) : "Chapter 5 - Sound Design: Basic Tools and Techniques" and "Chapter 6 - Advanced
Tools and Techniques". Childs, G. W. 2006, Creating Music and Sound for Games, Thomson Course Technology.

(b) : Kelleghan, Fiona. 1996, Sound Effects in SF and Horror Films, 2006,
.


Monday, 29 September 2008

CC2 - Semester 4 - Week 7

Record and Play.

Pretty easy exercise. There was basically a player (for which I initially intended to use play~ but ended up using groove~) and a recorder. Obviously, buffer~ was needed in both, for reading from and writing to it.

Aside from these three elements, the waveform~ object played a significant role in my patches. I did not find the whole "story" behind this object especially interesting; however, considering the mindset of a "typical" user (i.e. commercialised enough to be impressed more by the interface than by the actual usage of a device), it is a really "fancy" object. In general, having some sort of visual element helps a lot to "sell" the product; marketing, anyway...
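In signal terms, the player half of the patch is just reading a circular buffer with interpolation. A minimal Python sketch of the record-and-play idea (my own analogy for buffer~/groove~, not the patch itself):

```python
import numpy as np

SR = 44100
buffer = np.zeros(SR)  # one second of buffer~-style storage

# "Recording": write an input signal into the buffer.
t = np.arange(SR) / SR
buffer[:] = np.sin(2 * np.pi * 330 * t)

def play(buf, rate=1.0, seconds=1.0, sr=SR):
    """groove~-style playback: loop through the buffer at a given rate,
    with linear interpolation between samples."""
    phase = (np.arange(int(seconds * sr)) * rate) % len(buf)
    idx = phase.astype(int)
    frac = phase - idx
    nxt = (idx + 1) % len(buf)
    return (1 - frac) * buf[idx] + frac * buf[nxt]

half_speed = play(buffer, rate=0.5)  # plays back an octave down
```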

PS: Listen to Shulman. Forget about Max/MSP for a while and put some good chillout music in your ears...

Download the patches here, or go to the box on the right side of the page.

References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 11/09/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 12/9/2008]

Thursday, 18 September 2008

Forum - Semester 4 - Week 7

Presentations

This week was “our” chance to present what we had done. As always, I did something at the very last minute and played the tune in the forum: a 6/8 dance tune in which I had incorporated Max, Reason and other tools. I thought the track was pretty bad, but apparently it was not as awful as I thought. I leave the rest of the feedback to other students' blogs.

Edward presented his work from last semester in Max. Despite the fact that there was no sound, I liked his idea of associating vision and sound. Unfortunately there will be no more Pink Floyd concerts (Richard Wright passed away), but if anything similar came along, I think Edward could potentially take a role in it.
Freddie and Doug also presented their Max projects. Finally seeing Freddie’s controversial patch (controversial because of the boobs used in it!) was pretty nice; again, it was the idea rather than the process that fascinated me.
In my opinion, Doug's patch showed the most practical (or pragmatic) approach: he actually built something that could be used to educate people! Yet again the usual issues with sound and computers stood in his way, and he had to deal with some unexpected problems to show his work.


References:
- Rooster. Walls of The Wild. (http://www.wallsofthewild.com/rooster.htm) [Accessed 14/9/2008]
- Stephen Whittington. "Music Technology Forum - Week 7 - Student Presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 11/09/2008.

Tuesday, 9 September 2008

CC2 - Semester 4 - Week 6

Wave player basic

Using objects such as groove~ and wave~, I built a patch for playing up to 16 sounds (files such as .wav, .aif, etc., but not MP3).

The sweetest part for me, of course, was manipulating the samples’ playback with a relatively simple slider trick.

The other interesting part was using another slider to set the loop's start and end points just by dragging the mouse across it.
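Under the hood, both sliders just change how a read pointer sweeps the sample. A small Python sketch of the loop-point idea (an illustration under my own assumptions, not the actual patch):

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR
sound = np.sin(2 * np.pi * 261.6 * t)  # stand-in for a loaded .wav file

def loop_play(buf, start, end, rate=1.0, seconds=2.0, sr=SR):
    """Loop between two sample positions at a variable rate, the way
    one slider sets the loop points and the other sets the speed."""
    length = end - start
    phase = start + (np.arange(int(seconds * sr)) * rate) % length
    return buf[phase.astype(int)]

# "Drag the slider" to loop the middle half, playing back 20% fast.
out = loop_play(sound, start=len(sound) // 4, end=3 * len(sound) // 4, rate=1.2)
```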

I am still not sure whether popping up a separate window for each part is the best design for a poly~ patch. I assume that as long as the number of sounds (or parts in any poly~) is reasonably low (say, fewer than 20) it will not get messy; and even if a patch with big numbers (imagine a poly~ dealing with 200 windows) could run smoothly, you would need two or three monitors to see what is going on.

Overall, the further MSP goes, the easier and the more time-consuming it becomes; ain’t it?[1]


Download: http://www.box.net/shared/0593ce74m6

References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 04/09/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 5/9/2008]

[1] I just watched this fantastic film called Daytime Robbery and “ain’t it” is stuck in my mind; I had to use it, sorry…

Forum - Semester 4 - Week 6

What would you feel like if you heard this?

9 sounds, 9 emotions; maybe 9 lives too. The "objective" was to listen to 9 sounds, each representing a particular emotion. The point was that each person playing the sounds (or rather, whoever chose them) had a different perception of each one. In plain English: the way different people perceive sounds is unlikely to be 100% identical, and we put that to the test.
As usual I had not done my preparation, and as usual I opted to improvise (bloody self-confidence!), which according to Stephen made for a dodgy experiment. Let's say it failed, which in itself is a valid result for an experiment.
Anyway, what I did was speak in a language unknown to the rest of the class: Persian. I tried my best to embed emotions and deliver my message through the tonal projection of what I was saying. Surprisingly, it actually worked a few times, which arguably justifies such an experiment. Back to the main issue, the "sound-and-feeling testing": the idea was apparently derived from Indian perceptions of the sacredness of sounds. Personally, as a materialist, I am not sure whether such definitions of sound, or imagining "souls" for sounds, make sense; but I admit there definitely is "something" about the effects of sounds on us. Music therapists probably know much more about this.
Isn't music therapy a relatively young field? How about discussing that in forum?

References:

Saraswati, Wikipedia, http://en.wikipedia.org/wiki/Saraswati (Accessed 8/9/8)

Stephen Whittington. "Music Technology Forum - Week 6." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 04/09/2008.

AA2 - Semester 4 - Week 6

Monday, 1 September 2008

CC2 - Semester 4 - Week 5

FM Synthesis

For this week, I undertook another approach to designing my device.
Since, like the other exercises, one criterion was to have poly~ embedded in the patch, and since each individual voice needed separate modifications, I chose to build a "multiple-part" patch. The patch initially asks the user for some basic settings (i.e. MIDI input, number of voices, etc.), and the user specifies the particular characteristics of each voice afterwards.
Yet again, my biggest issue was the incompatibility of Max between OS X and PC; the software is simply NOT cross-platform. Nevertheless, as on most other occasions, I opted to finalise my patch on OS X.
This patch gives the user the option of having the frequency, the modulation rate and the modulation depth change over time, through the use of envelopes.
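The maths underneath is the classic FM formula, with the envelopes replacing fixed parameters. A minimal Python sketch (my own rendering of the idea; the envelope shapes and values are assumptions, not the patch's settings):

```python
import numpy as np

SR = 44100
t = np.arange(int(SR * 2.0)) / SR  # two seconds

# Time-varying parameters, driven here by simple linear "envelopes":
carrier = np.linspace(220.0, 440.0, t.size)  # carrier frequency (Hz)
mod_rate = np.linspace(1.0, 3.0, t.size)     # modulator/carrier ratio
mod_index = np.linspace(0.0, 8.0, t.size)    # modulation depth (index)

# Phases come from integrating the instantaneous frequencies.
car_phase = 2 * np.pi * np.cumsum(carrier) / SR
mod_phase = 2 * np.pi * np.cumsum(carrier * mod_rate) / SR

# Classic FM: the modulator wobbles the carrier's phase.
voice = np.sin(car_phase + mod_index * np.sin(mod_phase))
```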

References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 28/08/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 31/8/2008]

Forum - Semester 4 - Week 5

Negativland.

Stephen's favourite thing was a video of an "anti-copyright and stuff!" group called Negativland.
Everyone else is writing in their blogs about who Negativland are, so for a bit of difference I will focus on what I got out of getting acquainted with them. The most "crucial" and (since Stephen likes the word) "provocative" aspect of the video was that many people apparently get "offended" by seeing their heroes, be they actors, singers, politicians or prophets, teased and satirised. There is a long story (certainly more than 200 words) about why and how people are intolerant, and it is directly related to Negativland. However, I admit it takes a lot of guts to challenge people's perceptions and to manipulate the way they want to portray their beliefs.
Negativland's work titled "Christianity is Stupid" was apparently the most controversial part of their output. In my personal opinion it is actually a pretty weak work of art, but the name and the refrain "Communism is Good" (which was also pretty comic) caused the offence and controversy. Their way of expressing their opinions strikes me as similar to Richard Dawkins': good ideas, bad (or rather aggressive) delivery.
Anyway, Stephen's "provocative" favourite thing was pretty nice; a bit surprising too...
PS: Negativland have a lot more going on musically and artistically; I just chose to talk about this aspect of their work...

References:
Stephen Whittington. "Music Technology Forum - Week 5 - Negativland." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 28/08/2008.

AA2 - Semester 4 - Week 5

Designing sound for games; the first practical step.

This week's story is about a piece of software introduced to us by Christian. FMOD Designer is one of our first steps towards manipulating sounds and preparing them for a game. For the first part, we just tested a few basic functions of the software and observed the outcome.

Most of what I did was basically get acquainted with the way the software organises things (i.e. sound files in various formats: WAV, MP3, AIFF, etc.).
Probably because of my zero familiarity with this field and my lack of previous experience, it took me a while to work out what was going on. In reality it was not very complex; the interface gave me the impression of organising my files much as in Windows Explorer. The software also follows (very much like Firefox, in fact) the principle of windows-within-windows, or tabs-within-windows.
The only thing I could not get to work was making the software play the sounds I had defined for it; I had to play the result of the manipulation manually.

The MP3 is in the box on the right side of the page, or HERE.

References:
- Christian Haines. "Audio Arts: Semester 4, Week 5." Lecture presented at the EMU, University of Adelaide, South Australia, 26/08/2008.

Wednesday, 27 August 2008

CC2 - Semester 4 - Week 4

Ring Modulation / Amplitude Modulation.

Having worked with a number of electronic music programs, I did not have much difficulty actually "understanding" the concepts of ring modulation (RM) and amplitude modulation (AM). In fact, I would say that after the low-pass filter (with a reasonable amount of resonance), RM and AM are the most widely used synthesis techniques in today's electronic music.
Coming up with the patch, however, was not that easy at first. I had never had to actually "calculate" the amount of RM or AM I was using; if it sounds good, it is good. Nevertheless, the insight into the whole procedure was pretty interesting.

The most annoying thing about working with Max/MSP so far (at least for me) is when I use a shortcut (usually the space bar) to turn the device on and off: avoiding confusion when there is also a "mute" button provided is a hard task; or at least a hard task for someone like me.
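For the record, the two techniques differ by a single offset. A minimal Python sketch of RM versus AM (my own illustration; the frequencies and depth are arbitrary):

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR
carrier = np.sin(2 * np.pi * 440 * t)   # the "musical" signal
modulator = np.sin(2 * np.pi * 30 * t)  # a 30 Hz modulator

# Ring modulation: plain multiplication. The carrier itself vanishes,
# leaving only the sum and difference frequencies (470 Hz and 410 Hz).
ring = carrier * modulator

# Amplitude modulation: the modulator is offset so it never goes
# negative, so the carrier stays audible underneath the sidebands.
depth = 0.5
am = carrier * (1.0 + depth * modulator) / (1.0 + depth)
```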
The things we have to build are getting fancier; there might actually be a chance of landing a job with one of these audio technology companies (I'd love to work for Propellerhead!)


References:
- Christian Haines 'Creative Computing 2.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 21/08/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 26/8/2008]

Tuesday, 19 August 2008

Forum - Semester 4 - Week 3

First Year Students' Presentations

It was a good feeling to look at the work of people creating music under the same circumstances as we did last year. Their perceptions and motives were more or less the same, and so were the expectations. Nevertheless, such is the nature of music, and particularly of musique concrète (the genre in which most of the students presented), that we once again got acquainted with new ideas, approaches, concepts and perceptions; great for me.
Funnily enough, the first-year students sound like they think the same way we (at least I) used to last year, and I assume they will go through the same change and evolution of ideas as they continue; if they do!
I won't go through each piece; my main interest is the overall change of attitude in them, in us, and in others. As always there was the big question of "what is music", which I am tired of revisiting; the good news, however, is that appreciation for avant-garde art (in this case musique concrète) seems to be growing. I guess we would take a huge leap in the knowledge we gain at uni if we had more subjects along the lines of:
"How to Recognise, Deal with, Appreciate and Like Something New which Might Look Unattractive to Us at First"...

References:
Stephen Whittington. "Music Technology Forum - Week 3 - First Year Student Presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 14/08/2008.

CC2 - Semester 4 - Week 3

Polyphony and Instancing

Using poly~ was not very difficult in terms of designing the patch, nor was understanding the concept of a polyphonic device. What was new (and interesting) for me was that, having worked with programs that already embed this technology (Reason, Live, Pro Tools, etc.), I could sort of see what was going on behind the scenes. In fact, I guess it would make even more sense if I ReWired Max to these programs and came up with something more exclusively my own.
What I chose to make polyphonic was last week's patch, which generated signals using a cycle~ object and ramped their volume from full scale (100) to minimum (0).
I made a device that takes 5 inputs and constantly ramps their volumes up and down over a period of 2 seconds. The result is actually pretty nice! I set the device to divide each individual volume by 5 so the overall output doesn't clip.
I also provided an octave changer (basically to look fancier and more like real-world control surfaces).
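The clipping arithmetic is worth spelling out: five full-scale voices can sum to 5.0, so each gets a 1/5 gain. A small Python sketch of the whole device (my own reconstruction; the pitches are arbitrary):

```python
import numpy as np

SR = 44100
t = np.arange(SR * 4) / SR  # four seconds

# A 2-second triangle ramp: volume runs full scale -> 0 -> full scale.
ramp = np.abs(((t / 2.0) % 1.0) * 2.0 - 1.0)

freqs = [220, 275, 330, 440, 550]  # the five cycle~ voices
voices = [np.sin(2 * np.pi * f * t) * ramp for f in freqs]

# Scale each voice by 1/5 so the summed output cannot exceed full scale.
mix = sum(voices) / len(voices)
assert np.max(np.abs(mix)) <= 1.0
```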
I might have done something wrong or insufficient, but I had this feeling that something was left unfinished at the end! Maybe I am just lucky with MSP, or so stupid that I don't understand the objectives!

Listen to the file here: http://www.box.net/shared/ded9f65v73

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 14/08/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 19/8/2008]

Friday, 8 August 2008

Forum - Semester 4 - Week 2

This week's Forum session "My Favourite Things" was presented by David Harris.
David talked about his collaboration with the quartet Grainger in a three-concert series in 2008.
For some reason (which I didn't really understand) we were supposed to follow the scores for the pieces; by the end, though, I had lost track of both the music and the scores.
His composition "Terra Rapta (Stolen Land)" was dedicated to the story of the Stolen Generations of Australian Aboriginals (or maybe to the individuals themselves). I found it good-sounding, yet not one of my "Favourite Things"...
David's approach to naming his works was pretty interesting to me. Why Latin, exactly? I have the same issue: when I want to name my tunes, I aim for something as weird as possible but can only think of simple things. Is using non-English languages the best way?
Following that, he played another piece from the concert series, originally by Schubert, which (as far as I could observe) bored a considerable number of Music Tech students; a phenomenon which made Stephen describe David's intentions as "provocative".
I have linked some material about Harris' concerts that I found on the internet:

https://www.qtix.com.au/show/Grainger_Quartet_Darkness_Light_08.aspx
http://www.graingerquartet.com/

References:
David Harris. "Music Technology Forum - Week 2 - My Favourite Things." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 07/08/2008.

CC2 - Semester 4 - Week 2

Switching and Routing signals.

Most of my time this week was spent providing help files for last week's exercises. I still have trouble understanding the fundamental differences between Max and MSP; however, it seems (and I hope) that there are not many basic dissimilarities in the logic and in the application of various algorithms.

The three parts of this week’s exercise were:

a) Adding mute function to previous objects.

b) “GUIise[1]”ing the previous objects.

c) Creating two signal generators utilising Cycle~ and Phasor~.

Worth discussing: for the third part, I opted to merge cycle~ and phasor~[2], forming a patcher with three parts: a cycle~, a phasor~, and a combination of the two, in which the cycle~ generates the sound and the phasor~ modifies it. The third part is highlighted in the picture, since I crafted it on the OS X platform.

Probably the most notable issue I faced was the noise that cycle~ and phasor~ made when the phases of their generated waves changed. I tried several ways to eliminate this noise, but since you cannot connect a number~ to any of the phase inlets, you inevitably get the artefacts of the relatively low update rate of the number object (as opposed to a signal-rate number~).
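To show why a stepped (control-rate) phase change clicks, here is a small Python demonstration (my own illustration of the problem, not the patch): an instant phase jump creates a waveform discontinuity, while gliding the phase at signal rate does not.

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR
freq = 440.0

# Abrupt phase jump half-way through: the waveform tears, and the
# discontinuity is heard as a click (the number-object behaviour).
phase_jump = np.where(t < 0.5, 0.0, np.pi)
clicky = np.sin(2 * np.pi * freq * t + phase_jump)

# The signal-rate fix: glide the phase over 50 ms instead of jumping.
phase_glide = np.pi * np.clip((t - 0.5) / 0.05, 0.0, 1.0)
smooth = np.sin(2 * np.pi * freq * t + phase_glide)

# The biggest sample-to-sample step is far larger in the jumpy version.
print(np.max(np.abs(np.diff(clicky))), np.max(np.abs(np.diff(smooth))))
```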


References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 7/08/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 11/8/2008]

Post script: by mistake, the ZIP file in the Box.net widget is labelled "Sanad CC2 Week 4". Week 2 is correct; the contents of the ZIP file are all right, though.

[1] GUIise = create a graphical user interface version of something.

[2] Believe it or not, I spent almost two hours searching for some object called “Rasor~”!

AA2 - Semester 4 - Week 2

Game Sound Analysis: Need for Speed: High Stakes (1999).


This racing game is based on the idea of running away from the police[1] while listening to heavy metal music; hence the sounds of sirens, police helicopters, etc. (which were new to the NFS series).

Much attention is paid to the detailed sounds and noises of the cars, including their engines, their brakes and other parts of the car bodies. Other cars are also heard according to their position (left, right and, on surround systems, front and back) relative to the player's car. Most of the sounds can be categorised as hyper-real.

Another aspect of the sound in this game is its theme music. As with many other themes of this period (and beyond), the music does not pause when the player pauses the game; there are two (or even more) separate layers of sound running at the same time. However, when the game completely changes scene (e.g. from "playing" to "choosing a car"), the music changes along with all the other effects, indicating the entrance to a new environment.

There is not much narration in this game, except for policemen talking on their walkie-talkies.

(Video reference: seconds 6 to 51)



[1] This is much more pleasant to me than the kill-kill-kill games that are highly regarded as drivers of growth in the game industry.


References:
- Christian Haines. "Audio Arts: Semester 4, Week 2." Lecture presented at the EMU, University of Adelaide, South Australia, 05/08/2008.



Friday, 1 August 2008

Forum - Semester 4 - Week 1

Audio / music listening culture (or: why people listen to what they listen to, and why they do it the way they do it!)
The evolution of music technology has had a significant impact on the way music is perceived by the public. The topic of this session was the pros and cons of this phenomenon.
According to Stephen, the Walkman, as a symbol of this revolutionary change, has acted as a device providing a "movie soundtrack" for "people's lives"; however, I personally often wonder, for example, how bizarre it is that walking the streets at 2 o'clock after midnight is accompanied by psytrance music! Maybe my life just doesn't have a conventional soundtrack.

Other opinions that popped up suggested ideas such as "headphones provide a degree of isolation from the outside world", "technology has increased the quantity of music listening and decreased its quality", and so on...
Like many forum sessions, the discussion eventually went all over the place and a whole lot of other issues arose again. My problem is that no one seems to acknowledge that these issues are "subjective". Just because something on Triple J has become famous doesn't mean that "radio is the way", or any such judgement. Things differ radically from place to place; if we are to examine these issues, we need a more open mindset and should avoid issuing prescriptions for the entire globe, especially when our personal experiences are limited to a few communities.

References:
- Stephen Whittington. "Music Technology Forum - Week 1 - Audio Culture." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 31/07/2008.

CC2 - Semester 4 - Week 1

MSP (Max Signal Processing)

The basic challenge of these two patches (which also happened to be beneficial) was to build a "ramp" that lets me work with values starting at one number and finishing at another over a defined time period. The benefit is that I now have a ready-made ramp patch which I can reuse under any circumstances in any Max or MSP patch.
For this week, the main use of this "ramp" was to control the volume (amplitude) in a number of ways; most importantly, to achieve a "smooth" change in the on/off process (of any device, in fact).
Unlike with Max, I haven't faced any major problem [yet], because most of the job was dealing with algorithms and algebra; basically working with numbers.
If this "ramp" is used to change the phase of the generated wave (in a stereo context), it produces (not surprisingly) the effect of a phaser, which is widely used and thus sounds pleasant to many people (including me).
Most of my time, however, was spent debugging the device and making it more user-friendly.
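For completeness, the whole idea in a few lines of Python (a sketch of the concept only, under my own parameter choices; MSP's line~ object does the equivalent job inside a patch):

```python
import numpy as np

SR = 44100

def ramp(start, end, ms, sr=SR):
    """Values running from one number to another over a set time,
    like the reusable ramp patch."""
    return np.linspace(start, end, int(sr * ms / 1000.0))

t = np.arange(SR) / SR
tone = np.sin(2 * np.pi * 440 * t)

# Click-free on/off: 20 ms fade-in, sustain, 20 ms fade-out.
fade_in = ramp(0.0, 1.0, 20)
fade_out = ramp(1.0, 0.0, 20)
hold = np.ones(t.size - fade_in.size - fade_out.size)
envelope = np.concatenate([fade_in, hold, fade_out])

gated = tone * envelope  # no hard edges, hence no clicks
```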

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 30/7/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 31/7/2008]

Wednesday, 30 July 2008

AA2 - Semester 4 - Week 1

Game music.
First things first: I have never been into gaming; I have never understood the concept of wasting time in front of a computer... anyway, the point is that I am totally clueless in this field!

For the 1st week's analysis of game sound, I picked "The Pawn" by Magnetic Scrolls, published in either 1985 or 1986!

The Pawn is a text-adventure game (which I gave up playing after about 5 minutes) and, considering its age, has relatively good environments and graphics. More importantly, the significance of this game lies in its being a pioneer of digitised computer game sound: for the first time, the developers utilised the Amiga's Paula sound chip, which was revolutionary at the time!
The introduction theme (which I never got to hear in the game, but I have read that it actually exists!) is a fully stereo, MIDI-sounding, multi-layered and extremely relaxing tune by John Molloy. Since Paula was used, it is rendered through four DMA*-driven 8-bit PCM** sample channels.
I couldn't find any video of this game anywhere on the internet, and my torrent client doesn't work either!

For your sonic experience, you can either check the box on the right side of this page or download the music from HERE.

* Direct Memory Access
** Pulse Code Modulation

References:

- Christian Haines. "Audio Arts: Semester 4, Week 1." Lecture presented at the EMU, University of Adelaide, South Australia, 29/07/2008.


Friday, 27 June 2008

AA2 - Semester 3 - Final Project

I recorded two bands.
The 1st was a rock band called "Friends of Enemies"; here is their photo (taken from the singer's Facebook page).
The 2nd was the "Aaron Austin Quartet", whose performance of the jazz standard "Have You Met Miss Jones" can be found in the box thingy here on my blog.
The documentation will be here soon...

Sunday, 18 May 2008

AA2 - Semester 3 - Week 9

Mixing; stage 2:

- For this week I chose to work on the sound of the band I will be recording for the final project. This jazz band, the "Aaron Austin Quartet", played a bit of blues last weekend, and I recorded some of it.
- Just for the sake of trying, I used delay plug-ins, affecting the sound of the solos very slightly (particularly in the 2nd and 3rd examples, the guitar and bass solos).
It sounded pretty awesome! It actually gave the sound an effect somewhere between a reverb and a chorus.

I will probably set up my session much like this one when I record this band in a few days...


References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 6/5/2008

Tuesday, 13 May 2008

Forum - Semester 3 - Week 8

The best forum ever, EVER..

Peter Dowdall
Alright, the story is that we got a speech! from a gentleman with heaps of REAL-LIFE experience in sound engineering and recording.
Having spent a notable number of years in the New York music market, he talked to us about some points we have to take into consideration (if, of course, we want to end up as part of this industry).
The tension of the job, the value of time and the difficulty of satisfying clients were the main topics of his talk. He gave an example from his work for Pepsi: Britney Spears was supposed to sing on a whole bunch of different videos, and the final mix was his to finish.
The various technical difficulties he described were my main personal interest.
At the end, I talked to some of my classmates, and a few were literally "freaked out", saying this was not what they would choose as a career.
In contrast, I got even more interested in the whole "thing" of audio engineering and production.
It was a really good one. Congratulations, Stephen (or Christian?).

PS: Peter also talked about a recording session he did a while ago; honestly, that wasn't the most interesting topic for me!

The picture below is Peter Dowdall (Britney Spears' sound engineer) in his South Park interpretation:

References:
- Peter Dowdall. "Music Technology Forum - Week 8." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 10/04/2008.
- South Park, Season 12, Episode 1202, "Britney's New Look". South Park Studios. (http://www.southparkstudios.com/) [Accessed 13/5/2008]

CC2 - Semester 3 - Week 8

Controlling Reason via Max MSP

As the topic says, the exercise was to create a patch which sends information to Reason and controls it. The features were to be added to last week's probability sequencer.
Since I had screwed up last week's exercise, I had to do a double job!
This week's MP3 demonstrates a sequencer set to play the notes C, E, G and B (C major 7th) across different octaves.
There are also controllers for ADSR, filter, etc. in the Max patch; naturally, they control the corresponding parameters in Reason.
The issue I came across was clashing control numbers: for instance, controller 25 both pans channel 3 of the Mixer in Reason and toggles Keyboard Tracking on the Subtractor synth. However, there are ways to work around this.
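One such workaround is to address each device on its own MIDI channel, so the same controller number can serve two parameters. A hedged Python sketch using the mido library (the port name is hypothetical, and the channel assignments are my assumption, not Reason's defaults):

```python
import mido

# Hypothetical port name; list the real ones with mido.get_output_names().
out = mido.open_output('Reason MIDI In')

def send_cc(channel, control, value):
    """Send one MIDI control-change message (channels 0-15, values 0-127)."""
    out.send(mido.Message('control_change', channel=channel,
                          control=control, value=value))

# Controller 25 clashes within one device, but if the Mixer and the
# Subtractor listen on different MIDI channels, the same controller
# number reaches two different parameters without conflict.
send_cc(channel=0, control=25, value=64)  # Mixer: pan channel 3 to centre
send_cc(channel=1, control=25, value=0)   # Subtractor: keyboard tracking off
```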
My patch looks like this:

An MP3 of around a minute of the result can be found in the Box.net box on the right side of the page.

PS: The TAB key turns the device on and off!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 8/5/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 5/5/2008]

Wednesday, 7 May 2008

AA2 - Semester 3 - Week 8

Mixing; stage 1.

For this week's exercise, we had to produce 3 different mixes of our own material.
I recorded one of my friends' bands last semester; having chosen 1 of the 11 versions of the tune that I had, I took 2 different parts of the song and remixed them (files labelled (1) and (2)).

My Pro Tools session, including all the plug-ins I used, looked something like this:

The other remix was of a soundtrack I made for a TV documentary; the mix includes acoustic instruments (guitar and a traditional Iranian lute) and sampled sounds. I ReWired Reason and Pro Tools for this tune (file labelled (3)).

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 6/5/2008

Forum - Semester 3 - Week 7

Tristram Cary;

A composer of music for the TV series "Doctor Who", Tristram Cary, who died on April 24th, 2008 in Adelaide, was a British electronic musician who gained his reputation mainly from works made right after WWII. As a radar engineer he became acquainted with various electronic devices, and he developed a relatively unique concept of electronic music.

For his biography click here; I had better not say more, and if you are interested you might as well check his website yourself. Unfortunately, as of this date the website is not up to date: he passed away at the age of 82, but it still says he is productive at 81.
Stephen Whittington told us about Cary's life, how he got into what he got into, the ups and downs of his career, and his late works. According to Stephen, a notable part of Cary's last years was spent re-archiving and remixing his earlier works; apparently 73 (or 75?) CDs in total.

References:
- Stephen Whittington. "Music Technology Forum - Week 7 - Tristram Cary." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3/04/2008.
- Tristram Cary - Wikipedia (http://en.wikipedia.org/wiki/Tristram_Cary) [Accessed 6/5/8]
- Tristram Cary - Official website (http://www.tristramcary.com/) [Accessed 6/5/8]

CC2 - Semester 3 - Week 7

Probability Sequencer.

Direct and honest! This was our objective:
Create a probability sequencer application using the multislider object as the fundamental GUI component. The user will select 12 different notes, with each note having a certain probability of being triggered. The notes will be triggered in sequence at a specific BPM. A table will be used to store the information, and a histo / table object will be used to analyse the results of different sequences.

I had two major issues. First, I came up with a patch that implements the algorithm, but I duplicated it 12 times, once for each note; I must have been pretty stupid. I know it is possible to use a list (i.e. a single set of identical parameters) to do the job, but SO FAR I haven't managed to. Give me a while, please...
The second BIG issue was that I realised (AFTER I HAD ACTUALLY UPLOADED THE FILE!) that I had understood everything wrong; my patch should basically have done something else!
It's funny, because what we were asked for was much easier than what I came up with; but there are about 3 hours left until the deadline and I have a class and stuff... sorry, my bad.
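For what it's worth, the "list" version I was after is trivial outside Max. A minimal Python sketch of the assignment's logic (my own illustration; the notes and probabilities are made up):

```python
import random

BPM = 120
step_dur = 60.0 / BPM  # one step per beat

# 12 notes, each with its own trigger probability (the multislider values).
notes = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79]
probs = [0.9, 0.2, 0.5, 0.1, 0.8, 0.3, 0.6, 0.4, 0.7, 0.2, 0.5, 0.9]
histogram = [0] * len(notes)  # the histo / table analysis

for step in range(48):  # run the 12-step sequence four times around
    i = step % len(notes)
    if random.random() < probs[i]:  # trigger with the note's probability
        histogram[i] += 1
        print(f"step {step:2d} ({step * step_dur:5.2f}s): note {notes[i]}")

print("trigger counts per note:", histogram)
```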

PS: I came up with my project idea, and Mr Christian Haines has (apparently) roughly approved it. Good news, isn't it?
PS2: Week 7 CC was on my birthday, May 1st!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 1/5/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 5/5/2008]

Sunday, 4 May 2008

AA2 - Semester 3 - Week 7

Piano recording.
As in other weeks' exercises, the job this week was to experiment with several recordings of a piano and several mixes of them.
The notable point here was the magnificence! of the PZM microphone. I had tried it before and realised it was a wicked mic, but I think I needed confirmation as well; David Grice confirmed the quality of the sound recorded by the PZM!

The first is a mixture of a C414, an NT5 positioned in front of the piano, and a U87 picking up the reverb of the room. It sounds a bit metallic, but I kind of like it; especially with "repetitive" sorts of music (house, etc.), a metallic piano line would be nice. The file is labelled 1-aa2sem3week7c414nT5RU87R(1); it should be easy to find.

The second mix is pretty simple: a U87 with an omni pattern and two NT5s positioned at the rear and the back of the instrument. I find the sound pretty wide and nice. Labelled: 2-aa2sem3week7MSO-NT5s(2)

The 3rd one is probably the best: a PZM stuck to the open lid of the piano, plus a "stereoised" U87 (figure-8 pattern, doubled, with left and right 180 degrees out of phase). Labelled: 3-aa2sem3week7PZM-MS(3)

The 4th exercise probably carries the "fullest" sound, which is not really surprising, because five microphones are present: the PZM, both U87s of the M-S technique, and the two NT5s. Labelled: 4-aa2sem3week7PZMMS8MSoNT5s(4)
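The "stereoised" figure-8 trick is essentially the Side half of Mid-Side recording, and decoding M-S into left/right is one sum and one difference. A tiny Python sketch (my own illustration with synthetic signals, not the session files):

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR

# Stand-ins: the omni U87 as the Mid, the figure-8 pair as the Side.
mid = np.sin(2 * np.pi * 220 * t)
side = 0.3 * np.sin(2 * np.pi * 5 * t) * mid  # slow stereo movement

# M-S decode: the polarity-flipped copy becomes the right channel's side.
left = (mid + side) / 2
right = (mid - side) / 2
stereo = np.stack([left, right], axis=1)  # (samples, 2) stereo array
```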

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 29/4/2008

Sunday, 13 April 2008

Forum - Semester 3 - Week 6

CC2 - Semester 3 - Week 6

This week's exercise was to expand the MIDI sequencer and add a random sequencer to it.
Besides this, there were a few new concepts in working with Max, namely encapsulation, abstraction in building objects, and so on.
This is how my patch looks, and the entire project can be found in the Box.net section on the right side of the blog.

Debugging was my major issue: I had to imagine the most clueless user possible and fix the program against their behaviour. However, the aesthetics of patch programming made for really interesting work!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 10/4/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 22/4/2008]

Saturday, 12 April 2008

AA2 - Semester 3 - Week 6

Recording strings:

Having realised that we were not necessarily supposed to record 4 different instruments each time (i.e. life is easier than it looks), I recorded violin in 4 different ways using 4 different combinations of microphones.

Example 1: I used an NT3 positioned pretty close to the sound source, and a Yamaha MZ204 positioned at the rear of the instrument to pick up the relatively bassier frequencies.
Example 2: I used an NT5 close to the violin and a U87 far from the source to pick up the natural reverb of the room.
Example 3: an SM58 picking up the direct sound of the instrument and an AKG 414 for the room reverb.
Example 4: a mix of a U87 close to the source, a SHOTGUN! Sennheiser MKH416 for the room reverb (that's why they use them, no?), and another U87 placed far from the instrument for a more natural, sweet room reverb!
See you.

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 8/4/2008

Thursday, 3 April 2008

CC2 - Semester 3 - Week 5

Here is the expansion of last week's patch.
There is a legend explaining the use of the patch, so there should not be much of a problem.
The patch looks like this:

NEWS: I have incorporated this Box.net thingy into my blog. It can be found on the right side of the blog.
Bye.

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3/4/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 4/4/2008]

Forum - Semester 3 - Week 5

Pierre Henry.
This week we were shown a video of Pierre Henry, one of the godfathers of avant-garde electronic music and Musique concrète.
The film was called "The Art of Sound", directed and written by Eric Darmon & Franck Mallet.
The most interesting part for me was Henry's appreciation for a) being creative and b) sound.
Following the film, Stephen played a few excerpts of Henry's works; but to be honest, the film and the environment in which Henry was/is working impressed me more.

The question that came to my mind while watching the film (and I can recall similar situations before) was the link between influences and a work of art; or rather, HOW to link them.
It often happens that artists are influenced by elements outside their respective field of art. Taking Henry as an example: according to Stephen, in one of his projects he was influenced by Tibetan sacred literature. I guess one of the most important parts of education, particularly at university, is teaching ways of materialising ideas and influences; not to mention that none of it happens without a vision of financial survival as well!

Anyway, back to Monsieur Henry... it was actually one of the first times I really enjoyed musique concrète! I think this IS the influence of the EMU.

PS: The photo is a futuristic representation of the EMU Recording Space.

References:

- Stephen Whittington "Music Technology Forum - Week 5 - Pierre Henry." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3/04/2008.

AA2 - Semester 3 - Week 5

Saturday, 29 March 2008

Forum - Semester 3 - Week 4

This week, the forum continued introducing non-first-year students and their respective projects to the newcomers.
Ben Prubert presented his multimedia Max project, in which he manipulated audio and video in real time using Max. It looked pretty fascinating; however, since I don't have much idea of how Max actually works, I can't really analyse what he did. Judging by the outcome (which to me is the most important part), it was great.
Freddie presented his 1st semester project for Creative Computing. To me it was good in the sense that I could easily imagine his work as a film soundtrack. (His final result had the same vibe as the soundtrack of Jim Jarmusch's film "Dead Man", whose music is by Neil Young.)
Doug presented an interesting video of the process of making his 1st semester Creative Computing project. It was one of the best presentations so far (in my opinion), showing the actual progress of a developing artwork...

CC2 - Semester 3 - Week 4

MIDI Controller (Part II)
The idea was to expand last week's MIDI controller patch in Max; however, since I needed a fresh look at what I was doing, I started from scratch. Hence the different appearance of my "Astalavista MIDI Controller"!
It looks like this:
Since I'm not rich yet, and therefore can't genuinely support copyright, I have added the entire Max patch as text here, ultra-open-source, for this fantastic innovation of mine!
Here it is:

Note: the external MIDI controller keyboard, the Novation ReMOTE SL, does not cover the entire range of pitch bend and modulation (i.e. 128 values). I haven't tested the patch with other hardware, so I don't know how it performs elsewhere...
Cheers!
References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 27/3/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 31/3/2008]

AA2 - Semester 3 - Week 4

Recording percussion.
For the 1st one, a recording of a snare, I used an SM57, a U87 and an AKG 414. The trick in the mix is that in the final result, the U87 and the 414 (which are positioned far from the sound source) are panned 100% left and right.

The 2nd one is a bongo. I used all 4 of my mics (the previous ones plus an MD421) to get the sound I wanted. This mix contains a "stereoised" MD421 track (two channels with a 180-degree phase difference).
For the third one, I played a tambourine while walking a circle around the recording room, moving between the four microphones. The result is interesting to listen to on a pair of headphones (the initial idea was Freddie's).

This was my fantastic job for the 4th week.
Cheers.
Sanad

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 25/3/2008

Sunday, 23 March 2008

Forum - Semester 3 - Week 3

This week's session was dedicated to presentations by a few 2nd and 3rd year students: me, Jacob, David and Edward.
My personal impression was that some of the work we do (or choose to do) outside the regular curriculum is unconsciously over-sophisticated. In other words, many of our final results could be achieved through processes simpler than the ones we normally go through.
I presented my last semester's project from Earpoke, in which I had fused a female vocal line with noise-based rhythms. I could have done a comparable job simply by using 2 or 3 pieces of software, and goodbye! However, the other side of these experiments and exercises is that on the way to these fantastic! sonic works, we are forced to overcome several problems, which in turn helps us become more "creative".

I hope the 1st year students were not freaked out! Some works (particularly David's) looked very advanced and a bit scary at first glance!

CC2 - Semester 3 - Week 3

This was the 3rd week's exercise:

Create a small virtual keyboard 'application' that emulates the core behaviour of the application "MIDIkeys", namely - MIDI input, MIDI output, velocity and channel. Also, include a display that shows the following - octave number, pitch class, MIDI note number and MIDI note name.
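The display part of that brief is a few lines of arithmetic on the note number. A minimal Python sketch (my own illustration, not the Max patch; note that octave-numbering conventions vary, here middle C = C4):

```python
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F',
              'F#', 'G', 'G#', 'A', 'A#', 'B']

def describe(note_number):
    """Split a MIDI note number into the four displayed fields."""
    pitch_class = note_number % 12
    octave = note_number // 12 - 1  # middle C (60) -> octave 4
    name = f"{NOTE_NAMES[pitch_class]}{octave}"
    return octave, pitch_class, note_number, name

print(describe(60))  # (4, 0, 60, 'C4')
print(describe(69))  # (4, 9, 69, 'A4')
```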

I came up with this Max patch:

The actual patch is stored here:

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 20/3/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 22/3/2008]

Saturday, 22 March 2008

AA2 - Semester 3 - Week 3

The objective for this week was to experiment with some recordings of a guitar amplifier.
There were obviously several microphones and also different positioning techniques.
As is apparent in the picture below, I tried a few microphones and different positions... I used a PZM mic, an SM58, an AKG 414 and a Sennheiser MD421, and I also used the Direct Input signal, which ended up being present in all 3 final mixes.
1st mix: microphones close to the amplifier; the result of the DI, the AKG and the Sennheiser.
2nd mix: mics at a distance of about 10 inches from the amp; DI and SM58.
3rd mix: mics about a metre away from the amp; DI and MD421.

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 18/3/2008

Friday, 14 March 2008

Forum - Semester 3 - Week 2

A big part of this session was spent discussing the issues and problems of blogging; not just the problems faced by students, but also the concepts and controversies of blogging in general.
Some terms in the world of blogging were introduced to us, such as:
- Blogademia: a term referring to the academic study of weblogs. Coined in 2003 as the title of a blog concerning a study of the language used in blogs.
- Blogalisation: the internet trend, from the beginning of the 21st century, of making a blog out of everything: sport blogs, car blogs, news blogs, love blogs, photo blogs, etc.
- Blog fodder: An interesting idea, story, or link. Referred to as blog fodder when your first reaction is to use it in your blog.
- Blogarrhea: A condition where a person posts rambling, long, or frequent entries in their blog.
- Blog hawk: One who constantly checks or refreshes their own blog to see how many hits they've gotten.
- Blogamy: the custom or condition of having a marital relationship strictly confined to the blogosphere. People in a blogamous relationship may or may not be married to others in real life.
- Blogerati: The blogosphere intelligentsia.
- Blogblivion: When a blog is neglected by its creator.

Nice, huh?

Image taken from Freelance Switch: (http://freelanceswitch.com/images/freelancers_blog.jpg)

References:
- Urban Dictionary: (www.urbandictionary.com) [Accessed 14/3/8]