Friday 12 October 2007

CC 1 - Semester 2 - Week 10

Integrated stuff

For my project, and with regard to the new information on the "Integrated Setup" of the several devices I'm going to use, I will (as said before) utilise Plogue Bidule, Reason and Ableton Live.
There is one device that I just realised would be great to use: a control surface.
What I will do follows a simple algorithm: the signal (most probably from a guitar) comes into Plogue while Ableton is ReWired to it, and Reason applies some additional effects to the overall result. Ableton Live will most probably provide me with a rhythm, and I will control Reason's effects with the control surface!
The setup is not as sophisticated as I initially intended it to be, but after experimenting for a while I realised I shouldn't overuse what I have.

Here is a test of this setup; the only difference is that I sequenced a riff and looped it, then started controlling the effects in Reason. (Unfortunately it's around 4 minutes!)
cc1sem2week10.mp3

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 11/10/2007
- Plogue Bidule. Official website. (www.plogue.com) [Accessed 12/10/2007]

AA1 - Semester 2 - Week 10

Additive Synthesis

For this week, I provided a Plogue patch which has a very simple structure and function.
The whole story is to ADD two different waveforms (and of course that's why it's called "ADDitive" synthesis!) and hear the result.
I simulated a sound that I would actually need for my final project: the "beep" of the bus indicator light when the bus gets close to a stop.
I grouped the different devices I used to construct the patch and provided a controller for it. In this controller, you can define the frequencies and the waveforms of the signals you intend to add together.
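For anyone curious, the core idea of the patch can be sketched in a few lines of Python. This is just a minimal illustration of "adding two signals"; the frequencies are my own guesses for a beep-like tone, not the actual values from my Bidule patch:

```python
import numpy as np

SR = 44100           # sample rate
DUR = 0.5            # half a second of "beep"
t = np.arange(int(SR * DUR)) / SR

# Two partials; 880 Hz and 1320 Hz are illustrative guesses,
# not the exact frequencies used in the patch.
partial_a = np.sin(2 * np.pi * 880.0 * t)
partial_b = np.sin(2 * np.pi * 1320.0 * t)

# "ADDitive": literally sum the signals, then normalise so the
# mix doesn't clip.
beep = partial_a + partial_b
beep /= np.max(np.abs(beep))
```

Swapping `np.sin` for another waveform (or adding more partials) is exactly what the controller in my patch lets you do.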
Everything I have explained is apparent in the picture I have of the patch:
The sonic result of this patch is here to listen to:
aa1sem2week10.mp3

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 09/10/2007
- Additive (Fourier) Synthesis. Princeton University. (http://soundlab.cs.princeton.edu/learning/tutorials/SoundVoice/add1.htm) [Accessed 12/10/07]
- Additive Synthesis. Wikipedia. (http://en.wikipedia.org/wiki/Additive_synthesis) [Accessed 12/10/07]

Monday 8 October 2007

CC 1 - Semester 2 - Week 9

Integrated Setup

What I did for this week was not very complicated. In Plogue, I just assigned two sine waves (as LFOs) to control two pan positions on the mixer. My Plogue patch contained a delay and a reverb, both being panned, but at different LFO frequencies.
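The sine-wave panning idea can be sketched like this in Python; the LFO rate (0.25 Hz) and the noise stand-in for the delay/reverb output are assumptions of mine, not values from the patch:

```python
import numpy as np

SR = 44100
t = np.arange(SR * 2) / SR                 # two seconds

signal = np.random.randn(t.size) * 0.1     # stand-in for the delay/reverb output

# One sine LFO driving the pan position; the patch uses two of these
# at different frequencies (0.25 Hz here is illustrative).
pan = 0.5 * (1 + np.sin(2 * np.pi * 0.25 * t))   # 0 = hard left, 1 = hard right

# Equal-power panning: total energy stays constant as the sound moves.
left = signal * np.cos(pan * np.pi / 2)
right = signal * np.sin(pan * np.pi / 2)
```

The equal-power curves mean the sound doesn't get quieter in the middle of the stereo field as the LFO sweeps it back and forth.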
On the other side, I had Ableton Live adding more effects to the final sound, and Reason ReWired to this whole setup.
I played the keyboard using a very typical retro 80s sound, and Erik played guitar.
The result is here; I think everyone knows how to use the DJ MP3 player below...
cc1sem2week9final....

Setting up a bunch of integrated (in this case) software packages becomes interesting when "appropriate use of each device's capabilities" is taken into consideration. By that I mean NOT using "more" or "less" than needed; for example:
Both Ableton and Plogue have a reverb effect (and so does Reason), but I wanted the reverb to be part of what is being "maximised" within Ableton Live. Therefore I applied the reverb to the incoming signal IN PLOGUE, and not in Live.
I plan to use more controllers in my final project and less "note playing". This time I was basically providing the session with some sort of solo; in the project I'd rather change the characteristics of the add-on material, particularly the effects, in REAL time.
Having noted that, my setup would probably be: Guitar -> Plogue -> Reason -> Live -> speakers!
On the other hand, I will set up a surround (5.1) system for my final project and assign sine waves to it in such a manner that the sound travels AROUND the room! Nice, hey?
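The "sound going around the room" trick amounts to giving each speaker its own sine LFO, phase-shifted so the loudest speaker sweeps around the listener. A minimal sketch, where the 220 Hz source, the 0.5 Hz rotation rate, and the speaker ordering are all my own assumptions:

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR
mono = np.sin(2 * np.pi * 220.0 * t)     # stand-in source signal

# The five main channels of a 5.1 rig, ordered around the room.
# Each speaker gets the same slow sine as its gain, phase-shifted
# by 1/5 of a cycle, so the peak sweeps speaker to speaker.
n_speakers = 5
rate = 0.5                               # rotations per second (a guess)
channels = []
for k in range(n_speakers):
    phase = 2 * np.pi * k / n_speakers
    gain = 0.5 * (1 + np.sin(2 * np.pi * rate * t + phase))
    channels.append(mono * gain)
surround = np.stack(channels)            # shape: (5, n_samples)
```

In the actual setup the gains would be Bidule LFOs feeding the mixer's channel levels rather than NumPy arrays, but the geometry is the same.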

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 04/10/2007
- Performing Expressive Rhythms with Billaboop Voice-Driven Drum Generator. Institut Universitari de l'Audiovisual (IUA), Universitat Pompeu Fabra, Barcelona, Spain. (PDF format) (www.iua.upf.edu/mtg/publications/b5fb70-dafx05-ahazan.pdf)
- Ableton Live users (category). Wikipedia. (http://en.wikipedia.org/wiki/Category:Ableton_Live_users) [Accessed 08/10/07]
- Surround Sound. Wikipedia. (http://en.wikipedia.org/wiki/Surround_sound) [Accessed 08/10/2007]

Sunday 7 October 2007

AA1 - Semester 2 - Week 9

FM

Apparently I had done more than enough last week; there was no obligation to provide an FM patch. Anyway (oops!) I did it again, and came up with some new stuff…
As you can see in the picture, this first FM patch of mine uses an oscillator as the carrier, which is modulated according to a constant value:
The modulation rate speeds up and slows down with the amplitude of the first oscillator (oscillator_3).
On the other hand, the other constant value in this patch ("Constant value (filter!!!)") is effectively a low-pass (LP) filter!
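The carrier-plus-modulator idea behind the patch can be sketched in a few lines of Python. All three numbers here (carrier frequency, modulator rate, modulation index) are illustrative, not the values from my Bidule patch:

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR

carrier_freq = 220.0     # the carrier oscillator (an illustrative value)
mod_freq = 110.0         # the modulator's rate (also a guess)
mod_index = 50.0         # the "constant value" scaling modulation depth

# Classic FM: the modulator wobbles the carrier's phase, which the ear
# hears as the carrier's frequency speeding up and slowing down.
modulator = np.sin(2 * np.pi * mod_freq * t)
fm = np.sin(2 * np.pi * carrier_freq * t + mod_index * modulator)
```

Raising `mod_index` is what pushes the sound from a gentle vibrato into the rough, engine-like territory I keep ending up in.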

There are two more final results, and I have put them here. The picture of my Pro Tools session is also below, to visualise what I was doing!

This one below is again some sort of car engine sound! For some reason, I always come up with such things.

This one could probably be used for my final project; a person walking!

And here we go; my Pro Tools session, with regions!

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 02/10/2007
- Frequency Modulation. University of California, Berkeley, Robotics and Intelligent Machines Lab. (http://robotics.eecs.berkeley.edu/~sastry/ee20/modulation/node4.html) [Accessed 07/10/07]
- Frequency Modulation. Federation of American Scientists. (http://www.fas.org/man/dod-101/navy/docs/es310/FM.htm) [Accessed 07/10/2007]

Tuesday 2 October 2007

Forum - Semester 2 - Instrument

Electronic musical instrument of mine:

In building this instrument, I had 3 initial principles:

1- Physical computing (Arduino)
2- Tone generating (Square wave generating)
3- Victorian Synth.

What my instrument does (as is apparent in the video below) is take a signal from the computer (using Arduino) to control a 4093 chip, which generates a square wave and sends the signal to a speaker (through an amplifier); and then there is the Victorian synth.
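The square wave the 4093 produces is just a signal flipping between high and low. As a rough software picture of that (the 440 Hz frequency is my own stand-in, not what the chip actually oscillates at):

```python
import numpy as np

SR = 44100
t = np.arange(SR) / SR

# A 4093 NAND gate wired as a relaxation oscillator simply snaps
# between its high and low output levels; taking the sign of a sine
# gives the same +/-1 square wave (440 Hz is an arbitrary choice).
freq = 440.0
square = np.sign(np.sin(2 * np.pi * freq * t))
```

The harsh, buzzy character of the instrument comes from exactly this: a square wave is all odd harmonics, with nothing smoothing the edges until the amplifier and speaker get involved.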


This picture kind of explains the principles by which the instrument works. (Frankly, I have forgotten what I wanted to call this invention of mine, if it is an invention at all.)
It should be mentioned that I had to add one extra component, the amplifier, in order to actually get a reasonably loud sound out of this dude; nevertheless, I don't think that part of the job counts by any means.

It was not easy getting this instrument working. A s***load of soldering and such was needed...