Friday, 27 June 2008

AA2 - Semester 3 - Final Project

I recorded two bands:
1st, a rock band called "Friends of Enemies".
Here is their photo (taken from the singer's Facebook page).
The 2nd band was the "Aaron Austin Quartet"; their performance of the jazz standard "Have You Met Miss Jones" can be found in the Box.net box here on my blog.
The documentation will be here soon..

Sunday, 18 May 2008

AA2 - Semester 3 - Week 9

Mixing; stage 2:

- For this week I chose to work on the sound of the band I will be recording for the final project.
This jazz band, the "Aaron Austin Quartet", played a bit of blues last weekend and I recorded some of it.
- Just for the sake of trying, I used the delay plug-ins, very slightly affecting the sound of the solos (particularly the 2nd and 3rd examples, the guitar and bass solos).
It sounded pretty much awesome! It gave the sound an effect somewhere between a reverb and a chorus.

I will probably set my session up pretty much like this one when I record this band in a few days..


References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 6/5/2008

Tuesday, 13 May 2008

Forum - Semester 3 - Week 8

The best forum ever, EVER..

Peter Dowdall
Alright, the story is that we got a talk! from a gentleman who has heaps of REAL-LIFE experience in sound engineering and recording.
Having spent a notable number of years in the New York music market, he talked to us about some points which we have to take into consideration (of course, if we want to end up as part of this industry).
The tenseness of the job, the value of time and the difficulty of satisfying clients were the main topics of his instruction. He gave an example from his work for Pepsi: Britney Spears was supposed to sing on a whole bunch of different videos and the final mix was his to finish.
The various technical difficulties he described were my main personal interest.
At the end, I talked to some of my classmates, and there were a few who were literally "freaked out" and were saying that this was not what they would opt to do as their career.
In contrast, I got more interested in the whole "thing" of audio engineering and production.
It was a really good one. Congratulations Stephen (or Christian?)

PS: Peter also talked about a recording session he did a while ago; honestly, it wasn't really the most interesting topic for me!

The picture below is Peter Dowdall (Britney Spears' sound engineer), South Park interpretation:

References:
- Peter Dowdall "Music Technology Forum - Week 8." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 10/04/2008.
- South Park, Season 12, Episode 1202 "Britney's New Look", South Park Studios (http://www.southparkstudios.com/) [Accessed 13/5/2008]

CC2 - Semester 3 - Week 8

Controlling Reason via Max MSP

As the topic says, the exercise was to create a patch which sends information to Reason and controls it. The features were to be added to last week's probability sequencer.
Since I had screwed up last week's exercise, I had to do a double job!
This week's MP3 demonstrates a sequencer set to play the notes C, E, G and B (C major 7th) from different octaves.
There are also controllers for ADSR, filter, etc. in the MAX patch; of course, they control parameters in Reason.
The issue I came across was shared control numbers. For instance, controller 25 both pans channel 3 of the Reason mixer and toggles Keyboard Track on the SubTractor synthesizer. However, there are ways to work around it.
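As a side note, the note-picking logic the patch implements can be sketched outside Max too; this is a hypothetical Python rendering (the octave range and the seed are illustrative, not taken from the patch):

```python
import random

# Chord tones of Cmaj7 as semitone offsets from C
CMAJ7_PITCH_CLASSES = [0, 4, 7, 11]  # C, E, G, B

def next_note(rng, low_octave=3, high_octave=6):
    """Pick a random Cmaj7 chord tone in a random octave.

    Returns a MIDI note number, using the C4 = 60 convention."""
    octave = rng.randint(low_octave, high_octave)
    pitch_class = rng.choice(CMAJ7_PITCH_CLASSES)
    return 12 * (octave + 1) + pitch_class

rng = random.Random(0)
notes = [next_note(rng) for _ in range(8)]
```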
My patch looks like this:

and the MP3 of around a minute of the result can be found in the Box.net box on the right side of the page.

PS: The TAB key turns the device on and off!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 8/5/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 5/5/2008]

Wednesday, 7 May 2008

AA2 - Semester 3 - Week 8

Mixing; stage 1.

For this week's exercise, we had to mix down 3 different mixes of our own material.
I recorded one of my friends' bands last semester; having chosen 1 of the 11 different versions of the tune I had, I took 2 different parts of the song and re-mixed them. (Files labelled (1) and (2).)

My Pro Tools session, including all the plug-ins that I used, looked something like this:

The other re-mix was of a soundtrack I made for a TV documentary; the mix includes acoustic instruments (guitar and a traditional Iranian lute) and sampled sounds. I rewired Reason and Pro Tools for this tune. (File labelled (3).)

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 6/5/2008

Forum - Semester 3 - Week 7

Tristram Cary;

A composer of music for the series "Doctor Who", who died on April 24th, 2008 in Adelaide, Cary was a British electronic musician who basically gained his reputation from works he did right after WWII. As a radar engineer, he became acquainted with various electronic devices and developed a relatively unique concept of electronic music. For his biography click here; I'd better not talk more, and if interested, you might as well check his website yourself. Unfortunately, as of this date the website is not updated: he has passed away (at the age of 82), but the website still says he is productive at the age of 81.
Stephen Whittington basically told us about Cary's life, the way he got into what he got into, the ups and downs of his career, and his late works. According to Stephen, a notable part of Cary's late years was spent re-archiving and re-mixing his former works; apparently 73 (or 75?) CDs in total.

References:
- Stephen Whittington "Music Technology Forum - Week 7 - Tristram Cary." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 2008.
- Tristram Cary - Wikipedia (http://en.wikipedia.org/wiki/Tristram_Cary) [Accessed 6/5/8]
- Tristram Cary - Official website (http://www.tristramcary.com/) [Accessed 6/5/8]

CC2 - Semester 3 - Week 7

Probability Sequencer.

Direct and honest! This was our objective:

Create a probability sequencer application using the multislider object as the fundamental GUI component. The user will select 12 different notes, with each note having a certain probability of being triggered. The notes will be triggered in sequence at a specific BPM. A table will be used to store the information and a histo / table object will be used to analyse the results of different sequences.

I had two major issues. First, I came up with a patch that serves the algorithm, but I duplicated it 12 times, once for each of the 12 notes; I must have been pretty stupid. I know it is possible to use a list (i.e. a set of identical parameters) to do this job, but SO FAR I haven't been able to. Wait for a while, please..
The second BIG issue was that I realised (AFTER I ACTUALLY UPLOADED THE FILE!) that I had understood everything wrong!
My patch should basically have done something else!
It's funny, because what we were asked for was much easier than what I came up with, but there are about 3 hours left to the deadline and I have a class and stuff.. sorry, my bad..
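For what it's worth, the algorithm the assignment describes (one probability per slot, stepped in sequence) can be sketched like this in Python; the probability values here are made up for illustration:

```python
import random

def step_sequencer(probabilities, n_steps, rng):
    """Walk the slots in order; each step fires its note with that
    slot's probability. Returns (slot_index, fired) pairs."""
    events = []
    for step in range(n_steps):
        slot = step % len(probabilities)
        fired = rng.random() < probabilities[slot]
        events.append((slot, fired))
    return events

# 12 sliders, one probability each (illustrative values)
probs = [0.9, 0.1, 0.5, 0.5, 1.0, 0.0, 0.8, 0.2, 0.5, 0.5, 0.3, 0.7]
events = step_sequencer(probs, 24, random.Random(1))
```

A histogram of `events` over many steps is what the histo / table pair would show in the patch.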

PS: I came up with my project idea and -apparently- Mr Christian Haines has roughly approved it. It's good news, isn't it?
PS2: Week 7 CC was on my birthday, May 1st!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 1/5/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 5/5/2008]

Sunday, 4 May 2008

AA2 - Semester 3 - Week 7

Piano recording.
Like other weeks' exercises, the job this week was to experiment with several recordings of a piano and several mixes of them.
The notable point here was the magnificence! of the PZM microphone; I had tried it before and had come to the realisation that it was a wicked mike, but I think I needed a confirmation as well; David Grice confirmed the quality of the sound recorded by the PZM!

The first is a mixture of a C414, an NT5 positioned in front of the piano, and a U87 picking up the reverb of the room. It sounds a bit metallic, but I kind of like it; especially when you have "repetitive" sorts of music (house, etc.), a metallic piano line would be nice.. The file is labelled 1-aa2sem3week7c414nT5RU87R(1). Should be easy to find..

The second mix is pretty simple: a U87 with an omni pattern and two NT5s positioned at the rear and the back of the instrument. I find the sound pretty wide and nice.. Labelled: 2-aa2sem3week7MSO-NT5s(2)

The 3rd one is probably the best: a PZM stuck to the open lid of the piano and a "stereoised"! U87 (figure-of-8 pattern and doubled; left and right are 180 degrees different in phase). Labelled: 3-aa2sem3week7PZM-MS(3)
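As an aside, this "stereoising" trick is basically the mid-side (M-S) idea: the flipped-phase copy of the figure-of-8 channel is exactly what the standard sum-and-difference decode produces. A quick sketch with made-up sample values:

```python
def ms_decode(mid, side):
    """Standard M-S decode, per sample: L = M + S, R = M - S."""
    left = [m + s for m, s in zip(mid, side)]
    right = [m - s for m, s in zip(mid, side)]
    return left, right

# Made-up sample values (exact binary fractions, for clarity)
mid = [0.5, 0.25, -0.25]    # e.g. the mid (front-facing) signal
side = [0.25, -0.5, 0.125]  # e.g. the figure-of-8 side signal
left, right = ms_decode(mid, side)
```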

The 4th exercise probably carries the "fullest" sound; not really surprising, because there are 5 microphones present in the sound: the PZM, both U87s of the M-S technique, and two NT5s:
4-aa2sem3week7PZMMS8MSoNT5s(4)

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 29/4/2008

Sunday, 13 April 2008

Forum - Semester 3 - Week 6

CC2 - Semester 3 - Week 6

This week's exercise was to expand the MIDI sequencer and add a random sequencer to it.
Besides this, there were a few new concepts of working with MAX, namely encapsulation, abstraction in providing objects, etc.
This is how my patch looks, and the entire project can be found in the Box.net section on the right side of the blog.

Debugging was my major issue. I had to consider myself the most stupid user ever and fix the program according to the most stupid behaviour. However, the aesthetics of patch-programming made for really interesting work!

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 10/4/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 22/4/2008]

Saturday, 12 April 2008

AA2 - Semester 3 - Week 6

Recording strings:

Having realised that we were not necessarily supposed to record 4 different instruments each time (i.e. life is easier than it looks), I recorded violin in 4 different ways, using 4 different mixtures of microphones.

Example 1: I used an NT3 positioned pretty close to the sound source and a Yamaha MZ204 positioned at the rear of the instrument to pick up the relatively bassier frequencies.
Example 2: I used an NT5 close to the violin and a U87 far from the source to pick up the natural reverb of the room.
Example 3: An SM58 picking up the direct sound of the instrument and an AKG 414 for the room reverb.
Example 4: A mix of a U87 close to the source, a SHOTGUN! Sennheiser MKH416 for the room reverb (that's why they use them, no?) and the U87 placed far from the instrument for a more natural, sweet room reverb!
See you.

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 8/4/2008

Thursday, 3 April 2008

CC2 - Semester 3 - Week 5

Here is the expansion of last week's patch.
There is a legend for the use of the patch, so there shouldn't be much of a problem.
The patch looks like this:

NEWS: I have incorporated this Box.net thingy into my blog. It can be found on the right side of the blog.
Bye.

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3/4/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 4/4/2008]

Forum - Semester 3 - Week 5

Pierre Henry.
This week we were shown a video of Pierre Henry, one of the godfathers of avant-garde electronic music and Musique concrète.
The film was called "The Art of Sound", Directed and Written by Eric Darmon & Franck Mallet.
The most interesting part for me was Henry's appreciation for a) being creative and b) sound.
Following the film, Stephen played a few excerpts of Henry's works, but to be honest, the film and the environment in which Henry was/is working were more impressive to me.

The question that came to my mind while watching the film (and I can recall other similar situations too) was the link between influences and a work of art (or rather, HOW to link them).
It often happens that artists are influenced by elements outside their respective field of art. Taking Henry as an example, according to Stephen, in one of his projects he was influenced by Tibetan sacred literature. I guess one of the most important parts of education, particularly in universities, is teaching ways of "being able to materialise ideas and influences". Not to mention that it would not happen without a vision of financial survival as well!
Anyway, back to Monsieur Henry.. It was actually one of the first times I really enjoyed musique concrète! I think this IS the influence of the EMU.

PS: The photo is a futuristic representation of EMU Recording Space

References:

- Stephen Whittington "Music Technology Forum - Week 5 - Pierre Henry." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3/04/2008.

AA2 - Semester 3 - Week 5

Saturday, 29 March 2008

Forum - Semester 3 - Week 4

This week, the forum continued introducing non-first-year students and their respective projects to the newcomers.
Ben Prubert presented his multimedia MAX project, in which he manipulated audio and video in real time using MAX. It looked pretty fascinating; however, since I don't have enough of an idea of how MAX actually works, I can't really analyse what he did. Considering the outcome (which to me is the most important part), it was great.
Freddie presented his 1st semester project for Creative Computing. To me it was good in the sense that I could easily imagine his work as a soundtrack for a movie. (His final result had the same vibe as the soundtrack for the film "Dead Man" by Jim Jarmusch; the music for that film is by Neil Young.)
Doug presented an interesting video of the process of making his 1st semester Creative Computing project. It was one of the best presentations so far (in my opinion), showing the actual progress of developing an artwork..

CC2 - Semester 3 - Week 4

MIDI Controller (Part II)
The idea was to expand last week's MIDI controller patch in MAX; however, since I needed a fresh review of what I was doing, I started from scratch. Hence the different appearance of my "Astalavista MIDI Controller"!
It looks like this:
Since I'm not rich yet, and therefore can't genuinely support copyright, I have added the entire MAX patch text here as ultra-open-source for this fantastic innovation of mine!
Here it is:

Note: The external MIDI controller keyboard (a Novation Remote SL) does not cover the entire range of pitch bend and modulation (i.e. 128 values). I haven't tested the patch with other hardware, so I do not know how it performs elsewhere..
Cheers!
References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 27/3/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 31/3/2008]

AA2 - Semester 3 - Week 4

Recording percussion.
For the 1st one, a recording of a snare, I used an SM57, a U87 and an AKG 414. The trick in the mix is that in the final result, the U87 and the 414 (which are positioned far from the sound source) are panned 100% left and right.
The 2nd one is a bongo. I used all 4 of my mics (the previous ones plus an MD421) to get my desired sound. This mix contains a "stereoised"!! MD421 track. (The two channels are 180 degrees different in phase.)
For the third one, I played a tambourine around the recording room in a circle, simultaneously moving between the four microphones.. The result is interesting to listen to on a pair of headphones. (The initial idea was Freddie's.)
This was my fantastic job for the 4th week.
Cheers.
Sanad

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 25/3/2008

Sunday, 23 March 2008

Forum - Semester 3 - Week 3

This week's session was dedicated to presentations by a few 2nd and 3rd year students: me, Jacob, David and Edward.
My personal impression was that some of the work we do/opt to do outside the regular educational curriculum is unconsciously too sophisticated. In other words, many of us students' final results could be achieved through processes simpler than the ones we normally go through.
I presented my last semester's project in Earpoke, in which I had fused a female vocal line with noise-based rhythms. I could have done a better job simply by using 2 or 3 programs, and byebye! However, the other side of these experiments and exercises is that on the way to coming up with these fantastic! sonic works, we are urged to overcome several problems, which consequently helps us be more "creative".

I hope the 1st year students are not freaked out! Some works (particularly David's) looked way advanced and a bit scary at first glance!

CC2 - Semester 3 - Week 3

This was the 3rd week's exercise:

Create a small virtual keyboard 'application' that emulates the core behaviour of the application "MIDIkeys", namely - MIDI input, MIDI output, velocity and channel. Also, include a display that shows the following - octave number, pitch class, MIDI note number and MIDI note name.

and I came up with this MAX patch:
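The display part of the exercise (octave number, pitch class, MIDI note number, note name) boils down to arithmetic that can be sketched in a few lines of Python (using the C4 = 60 convention; Max's own octave convention may differ):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def describe_midi_note(note_number):
    """Derive octave, pitch class and note name from a MIDI note number."""
    pitch_class = note_number % 12
    octave = note_number // 12 - 1  # MIDI 60 -> C4
    return {"number": note_number,
            "pitch_class": pitch_class,
            "octave": octave,
            "name": NOTE_NAMES[pitch_class] + str(octave)}
```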
The actual patch is stored here:

References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 20/3/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 22/3/2008]

Saturday, 22 March 2008

AA2 - Semester 3 - Week 3

The objective for this week was to experiment with some recordings of a guitar amplifier.
There were obviously several microphones and also different positioning techniques.
As is apparent in the picture below, I tried a few microphones and different positions... I used a PZM mike, an SM58, an AKG 414 and a Sennheiser MD421, and I also used the Direct Input signal, which ended up being present in all 3 final mixes.
1st mix: microphones close to the amplifier; the mix is the result of the DI, the AKG and the Sennheiser.
2nd: mics about 10 inches from the amp; DI and SM58.
3rd: mics about a meter away from the amp; DI and MD421.

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 18/3/2008

Friday, 14 March 2008

Forum - Semester 3 - Week 2

A big part of this session was spent on discussing the issues and problems with blogging; Not just the problems faced by students, but also concepts and controversies of blogging in general.
Some terms in the world of blogging were introduced to us, such as:
- Blogademia: a term referring to the academic study of weblogs. Coined in 2003 as the title of a blog concerning a study of the language used in blogs.
- Blogalisation: The Internet trend, from the beginning of the 21st century, of making a blog out of everything. You have sport blogs, car blogs, news blogs, love blogs, photo blogs, etc.
- Blog fodder: An interesting idea, story, or link. Referred to as blog fodder when your first reaction is to use it in your blog.
- Blogarrhea: A condition where a person posts rambling, long, or frequent entries in their blog.
- Blog hawk: One who constantly checks or refreshes their own blog to see how many hits they've gotten.
- Blogamy: the custom or condition of having a marital relationship strictly confined to the blogosphere. People in a blogamous relationship may or may not be married to others in real life.
- Blogerati: The blogosphere intelligentsia.
- Blogblivion: When a blog is neglected by its creator.

Nice, huh?

image taken from Freelance Switch: (http://freelanceswitch.com/images/freelancers_blog.jpg)

References:
- Urban Dictionary: (www.urbandictionary.com) [Accessed 14/3/8]



CC2 - Semester 3 - Week 2

I don't really know if I have properly understood this week's exercise or not; here it is:
Create a patch that prints the notes of a Pythagorean chromatic scale in descending order. The patch must contain a user selectable starting pitch and octave.

The patch I have created asks the user to determine the note and the octave, then it prints the frequencies out..

As an example, I give the patch the input of C# and the 2nd octave; the results would be: (check the MAX window in the top right-hand side of the picture)
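For anyone curious about the maths, the Pythagorean chromatic scale is built from stacked 3:2 fifths folded back into one octave; the descending version, starting from a given frequency, can be sketched like this (the note-and-octave-to-frequency lookup of my patch is left out):

```python
from fractions import Fraction

# Pythagorean chromatic ratios (stacked 3:2 fifths reduced into the
# octave), ascending from the tonic.
PYTHAGOREAN_RATIOS = [
    Fraction(1), Fraction(256, 243), Fraction(9, 8), Fraction(32, 27),
    Fraction(81, 64), Fraction(4, 3), Fraction(729, 512), Fraction(3, 2),
    Fraction(128, 81), Fraction(27, 16), Fraction(16, 9), Fraction(243, 128),
]

def descending_scale(start_freq):
    """12 frequencies going down from start_freq, one per chromatic step."""
    return [float(start_freq / r) for r in PYTHAGOREAN_RATIOS]

scale = descending_scale(440.0)
```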
The patch is here to download:




References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 13/3/2008
- Max MSP, Wikipedia (http://en.wikipedia.org/wiki/Max/MSP) [Accessed 16/3/2008]

AA2 - Semester 3 - Week 2

This session was about recording voice/vocals. I have personally had some experience of recording vocals before; however, this time we were told some tricks, and David raised some points which were new to me.
For the exercise, Freddie, Edward and I recorded a few takes of different sentences in various intonations:
1- Ad: Edward read this sentence as if he were promoting some product; however, the dynamics were not an issue, since he was reading the sentence relatively like a narration in a film:



2- Speech: Freddie read the procedure of starting up/shutting down the studio in a very calm manner; more or less like the normal way he speaks.



3- Dynamics: The same set of phrases was read by Freddie, but this time the dynamics really were a matter of attention. I used a compressor and basically did some mixing work after the recording:



Here is a good PDF about vocal miking techniques.

References:
- David Grice, 'Audio Arts 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 11/3/2008
- Vocal Miking Techniques (PDF Format), California State University - Chico (www.csuchico.edu/~ja77/RecArts/Downloads/VocalMikingTechniques.PDF) [Accessed 14/3/8]

Forum - Semester 3 - Week 1

This session was held neither in the place nor at the time it was supposed to be held. Besides, I was attending WOMADelaide and was not present.
I apologise for any inconvenience caused.
Sanad

CC2 - Semester 3 - Week 1

We are getting into programming; as far as I know, the main software we will be dealing with a lot is Max MSP.
As an introduction to the whole programming environment, we were told about pseudocode. The University of North Florida describes it as "an artificial and informal language that helps programmers develop algorithms... a "text-based" detail (algorithmic) design tool" (I tried my best not to use Wikipedia as a source).
Our task for this week is pretty simple: we should write a pseudocode program that continuously maps and converts the different components of a drum kit into different respective notes of the D major scale. After the conversion, the program should then play the notes on a musical device until 20 notes are heard.
This is what I came up with:
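The same idea rendered as runnable Python (the drum-to-note mapping and the component names here are illustrative; the pseudocode itself is what was submitted):

```python
# Each drum-kit component maps to a note of the D major scale
# (an illustrative mapping, not the submitted one).
D_MAJOR_MAP = {"kick": "D", "snare": "E", "hihat": "F#", "tom1": "G",
               "tom2": "A", "floor_tom": "B", "crash": "C#", "ride": "D"}

def play_drum_stream(hits, limit=20):
    """Convert incoming drum hits to scale notes; stop after `limit` notes."""
    played = []
    for hit in hits:
        if len(played) >= limit:
            break
        if hit in D_MAJOR_MAP:  # unknown components are ignored
            played.append(D_MAJOR_MAP[hit])
    return played
```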
References:
- Christian Haines 'Creative Computing 2.1' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 6/3/2008
- Pseudocode Examples, The university of North Florida (http://www.unf.edu/~broggio/cop2221/2221pseu.htm) [Accessed 8/3/2008]

Monday, 10 March 2008

AA2 - Semester 3 - Week 1

As my main field of interest is ethnomusicology, for this week I thought of a recording session featuring a few non-Western musical instruments; say, 3!

The first one is a didgeridoo-like woodwind. In order to get a proper sound out of this, I would use the same kind of microphone designed to record a trumpet or saxophone. The important part of the procedure would be clamping the microphone to the instrument.


The second instrument would be a frame drum (something like the Daf, shown in the picture below). Recording this one is tricky: as a result of the shape of the instrument, players usually tend to move while playing the Daf. Therefore I would choose a microphone with a cardioid pickup pattern.

The third one would be a zither, and it would be easier to record. Since the instrument does not move during the recording, proper positioning makes a huge difference to the final result. I would take the instrument "Santour" as the example, and I would use 2 SM57s, one for each side of it.


References:
Grice, David. 2008. “AA2 – Session planning.” Seminar presented at the University of Adelaide.

Wednesday, 14 November 2007

AA1 - Semester 2 - Project

This project is a recreation of the sonic environment of a bus stop. There are various sounds: people walking, the pedestrian crossing signal, cars passing, mobile phones beeping for text messages and ringing, people speaking on the phone, etc. None of the sounds used (except for the person talking on the phone in Dutch) was recorded from its original source; i.e. the sounds of the cars are made from other sounds and noises. In some cases (such as the ring of the mobile phone) MIDI technology has been utilised. There are also other sounds (particularly the sounds of car engines) which were made using Plogue Bidule. In reality, however, there are many more sounds audible to an individual standing at a bus stop.
A number of the sounds I used in this project actually come out of my mouth! In these cases, I recorded the sounds and modified them using Pro Tools. Utilising various effects, time stretching and other processes in Pro Tools, in some cases I dramatically changed the original sound and simulated a real-world sound.
There are also a number of sounds for which I used Plogue; in particular, digital (or rather artificial) sounds such as the mobile beeps were made in Plogue.
My approach was to get as close as possible to the real situation of a bus stop. Although the result is not the best simulation of that particular situation, I have come very close to some of my aims.
Most of the issues I had to deal with were the ordinary errors and problems of using digital media such as computers.
The software I had to use in order to compile my project (Cubase) was/is not the most reliable, and it occasionally slowed me down badly.
In general, I experimented with new ways of manipulating various sounds and noises. The most important part was that each single element of this project should imitate some existing sound and should lead the listener's imagination! This needed a new approach to observing the sound scene.
The MP3 of the final result is here:



The documentation for this project can be downloaded too:

Monday, 12 November 2007

CC 1 - Semester 2 - Final Project

Project; Sounds Without Origins.

My project experiments with the effects of different devices on sound. The original sound is an electric guitar; utilising the software Plogue Bidule, various effect devices process the sound.
At first it is the sound of the guitar that fills the final mix, but after a while a rhythm enters, and the sound of the guitar gets delayed, reverbed, distorted, etc., until the original sound completely fades out of the final mix and there are just "aftermaths" of the sound-generating process..
It should be noted that the patch I provide contains two ReWire setups; it is already rewired with Reason and Live. In the (blogged) example I have used the rhythm of a Reason Dr. Rex device.
The idea behind this project of mine is to examine the existence of several electronic and digital effect processors without the presence of the original sound; hearing something and not knowing where it comes from or how it is generated.
In addition to this, the patch also adds harmonics to the original signal (a pitch-shifter is one of the elements used in this patch).
Most importantly, the effects and their amounts in the final mix can be controlled via an external controller (again, in the case of my example, I have used a Novation 61 SL); therefore, in the final mix there are different levels for different sounds throughout the duration of the piece.
The issues that could possibly occur mostly concern the interconnection of the different programs. It is essential to consider Plogue Bidule the master of the ReWire process. On the other hand, it is also essential NOT to start another program that could be rewired to Plogue while the tune is going on.
This technology, and the sound coming out of this patch, would in my opinion be reasonably useful for a soundtrack, or generally a "background theme". Like many other tunes in the ambient genre, the final mix is not made up of too many sounds, and the listener would not have much difficulty distinguishing them. Mostly for this reason, the patch suits a film score.
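The core trick here, an effected copy that keeps sounding while the dry source fades to nothing, can be sketched in a few lines; a single feedback delay stands in for the whole Bidule chain, and all the numbers are illustrative:

```python
def dry_fade_with_delay(dry, delay_samples, feedback, fade_len):
    """Mix a fading dry signal with a feedback delay of itself.

    Once the dry level has ramped to zero, only the delay's
    decaying "aftermaths" remain in the output."""
    out = []
    buf = [0.0] * delay_samples  # circular delay line
    for i, x in enumerate(dry):
        gain = max(0.0, 1.0 - i / fade_len)  # dry level ramps down
        delayed = buf[i % delay_samples]
        out.append(x * gain + delayed)
        buf[i % delay_samples] = x + delayed * feedback
    return out

# An impulse through the "patch": the echoes outlive the source.
tail = dry_fade_with_delay([1.0] + [0.0] * 9, delay_samples=2,
                           feedback=0.5, fade_len=100)
```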

Here is a piece that I did using this patch:



You can download the documentation needed for this project from here: (ZIP file.)

Friday, 12 October 2007

CC 1 - Semester 2 - Week 10

Integrated stuff

For my project, with regard to the new information on the "integrated setup" of the several devices I'm going to use, I will, as said before, utilise Plogue Bidule, Reason and Ableton Live.
There is one more device that I just realised would be great to use: a control surface.
What I will do follows a simple algorithm: the signal, most probably of a guitar, comes into Plogue while Ableton is rewired to it, and some additional effects act on the entire result via Reason. Ableton Live will most probably provide me with a rhythm, and I will control Reason's effects with a control surface!
The setup is not as sophisticated as I initially intended it to be. But after experimenting for a while, I got to the point where I realised not to overuse what I have.

Here is a test of this setup; the only difference is that I sequenced a riff and looped it, then started controlling the effects in Reason: (Unfortunately it's around 4 minutes!)
cc1sem2week10.mp3

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 11/10/2007
- Plogue Bidule. Official website. (www.plogue.com) [Accessed 12/10/2007]

AA1 - Semester 2 - Week 10

Additive Synthesis

For this week, I made a Plogue patch which has a very simple structure and function.
The whole story is to ADD two different waves (and of course, this is called "ADDitive" synthesis!) and hear the result.
I simulated a sound that I will actually need for my final project: the "beep" of the bus indicator when the bus gets close to a stop.
I grouped the different devices I used to construct the patch and provided a controller for it. In this controller, you can define the frequencies and the waveforms of the signals you intend to add together.
Everything I have explained is apparent in the picture I have of the patch:
The sonic result of this patch is here to listen to:
aa1sem2week10.mp3
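In code terms, the same ADDitive idea is literally a per-sample sum of two oscillators; a sketch (the two frequencies here are made up, not the ones from my beep):

```python
import math

def sine_wave(freq, n_samples, sample_rate=44100.0, amp=0.5):
    """One sine oscillator, rendered as a list of samples."""
    return [amp * math.sin(2 * math.pi * freq * n / sample_rate)
            for n in range(n_samples)]

def add_signals(a, b):
    """Additive synthesis at its simplest: sample-by-sample sum."""
    return [x + y for x, y in zip(a, b)]

beep = add_signals(sine_wave(880.0, 1000), sine_wave(1320.0, 1000))
```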

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 09/10/2007
- Additive (Fourier) Synthesis. Princeton University (http://soundlab.cs.princeton.edu/learning/tutorials/SoundVoice/add1.htm) [Accessed 12/10/07]
- Additive Synthesis. Wikipedia. (http://en.wikipedia.org/wiki/Additive_synthesis) [Accessed 12/10/07]

Monday, 8 October 2007

CC 1 - Semester 2 - Week 9

Integrated Setup

What I did for this week was not very complicated. In Plogue, I just assigned two sine waves to control two pan positions on the mixer. My Plogue patch contained a delay and a reverb (both being panned, but at different frequencies).
On the other side, I had Ableton Live adding more effects to the final sound, and Reason rewired into this whole setup.
I played the keyboard using a very typical retro 80s sound, and Erik played guitar.
The result is here; I think everyone knows how to use the DJ MP3 player below...
cc1sem2week9final....
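Assigning a sine wave to a pan control amounts to a slow LFO moving the signal between channels; here is a sketch of the idea (I've assumed an equal-power pan law, which may not be what the Plogue mixer actually uses):

```python
import math

def autopan(signal, lfo_freq, sample_rate=44100.0):
    """Pan a mono signal left-right with a sine LFO (equal-power law)."""
    left, right = [], []
    for n, x in enumerate(signal):
        # LFO swings pan between 0.0 (hard left) and 1.0 (hard right)
        pan = 0.5 + 0.5 * math.sin(2 * math.pi * lfo_freq * n / sample_rate)
        left.append(x * math.cos(pan * math.pi / 2))
        right.append(x * math.sin(pan * math.pi / 2))
    return left, right

left, right = autopan([1.0] * 64, lfo_freq=0.5)
```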

Setting up a bunch of integrated programs (in this case) gets interesting when "appropriate use of each device's capabilities" is taken into consideration. By that I mean NOT using "more" or "less" than needed. For example: both Ableton and Plogue have a reverb effect (and so does Reason), but I wanted the reverb to be part of what is being "maximised" within Ableton Live. Therefore I reverbed the incoming signal IN PLOGUE, and not in Live.
I plan to use more controllers in my final project and less "note playing". This time I was basically providing the session with some sort of solo; I'd rather change the characteristics of the add-on materials, particularly the effects, in REAL time in the project.
Having noted that, my setup will probably be: Guitar -> Plogue -> Reason -> Live -> speakers!
On the other hand, I will set up a surround (5.1) sound system for my final project and assign sine waves to it in a manner that makes the sound go ROUND the room! Nice, hey?

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 04/10/2007
- Performing Expressive Rhythms with Billaboop Voice-Driven Drum Generator. Institut Universitari de l'Audiovisual (IUA), Universitat Pompeu Fabra, Barcelona, Spain. (PDF format) (www.iua.upf.edu/mtg/publications/b5fb70-dafx05-ahazan.pdf)
- Ableton Live users. Wikipedia. (http://en.wikipedia.org/wiki/Category:Ableton_Live_users) [Accessed 08/10/07]
- Surround Sound. Wikipedia. (http://en.wikipedia.org/wiki/Surround_sound) [Accessed 08/10/2007]

Sunday, 7 October 2007

AA1 - Semester 2 - Week 9

FM

Apparently I had done more than enough last week; there was no obligation to provide an FM patch. Anyway, oops, I did it again! And came up with some new stuff…
As you can see in the picture, my 1st FM patch uses an oscillator as the carrier, which is modulated with respect to a constant value:
The modulation rate speeds up! and slows down with respect to the rate of the amplitude of the first oscillator (oscillator_3).
On the other hand, the other constant value in this patch ("Constant value (filter!!!)") is practically an LP filter!

There are two more final results, and I have put them here. The picture of my Pro Tools session is also below, to visualise what I was doing!
This one below is again some sort of car-engine sound! For some reason, I always come up with such things.
This one could probably be used for my final project: a person walking!
And here we go, my Pro Tools "regioned!" session:
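For the record, the carrier-modulated-by-an-oscillator idea can be sketched as a phase-accumulating FM oscillator (all parameter values here are illustrative, not the patch's):

```python
import math

def fm_synth(carrier_freq, mod_freq, mod_depth, n_samples, sr=44100.0):
    """Carrier whose instantaneous frequency is pushed around
    (by up to mod_depth Hz) by a sine modulator."""
    out = []
    phase = 0.0
    for n in range(n_samples):
        mod = mod_depth * math.sin(2 * math.pi * mod_freq * n / sr)
        phase += 2 * math.pi * (carrier_freq + mod) / sr
        out.append(math.sin(phase))
    return out

tone = fm_synth(220.0, 55.0, mod_depth=100.0, n_samples=2048)
```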
References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 02/10/2007
- Frequency Modulation. University of California, Berkeley, Robotics and Intelligent Machines Lab (http://robotics.eecs.berkeley.edu/~sastry/ee20/modulation/node4.html) [Accessed 07/10/07]
- Frequency Modulation. Federation of American Scientists. (http://www.fas.org/man/dod-101/navy/docs/es310/FM.htm) [Accessed 07/10/2007]