Wednesday 19 September 2007

CC1 - Semester 2 - Week 8

Performance Sequencing

What I did was to create three different loops using Propellerheads' Reason: a drum loop, a bass line loop and a guitar loop. I pitch-shifted the guitar loop in real time.
After inserting the drum loop into Ableton Live, I warped it and introduced a different loop.
I also added a delay effect to the guitar line and controlled it in real time as well.
Overall, my sequence follows this order:
Original Drum Loop -> Modified Drum Loop -> Bass Loop -> Guitar Loop (with the delay amount controlled and varied throughout the tune) -> Bass Loop -> Modified Drum -> Original Drum.
The result can be downloaded here.
The interesting part of this week's exercise for me was that Ableton Live actually makes it much easier to fuse different genres of music. Controlling rhythm, tempo and groove gives the user plenty of options to play around with different riffs and themes, and at the same time makes it possible to mix this diverse material together precisely and accurately. The reason I didn't use my recordings from last semester was that my project didn't involve drums, and I needed a rhythm track for this exercise.
I think parts of this exercise sounded like early experimental drum 'n' bass tunes (e.g. early Aphex Twin).
Here is a very useful video on warping in Live. I know no one watches these, but please have a look.

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 12/09/2007
- Ableton Live Users. Wikipedia. (http://en.wikipedia.org/wiki/Category:Ableton_Live_users) [Accessed 19/09/07]
- Ableton Live. Johns Hopkins University Digital Media Centre. (http://digitalmedia.jhu.edu/learning/documentation/live) [Accessed 19/09/2007]

Monday 17 September 2007

AA1 - Semester 2 - Week 8

Amplitude and Frequency Modulation (AM and FM)

Plogue Bidule is a total disaster for me, because I still have not mastered it.
However, after ages of trying to build the two patches, AM and FM, I finally figured out the very simple and logical procedure behind them. In amplitude modulation, the amplitude input of the oscillator receives a signal from another oscillator; in frequency modulation, it is the frequency input that receives it!
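To convince myself the routing really is that simple, I tried reproducing both ideas in a few lines of Python. This is only my own rough sketch with an assumed 44.1 kHz sample rate, not how Bidule does it internally:

```python
import math

SAMPLE_RATE = 44100  # an assumed, typical audio sample rate

def am_sample(t, carrier_hz, mod_hz, depth=0.5):
    """Amplitude modulation: a modulator oscillator scales the carrier's amplitude."""
    modulator = 1.0 + depth * math.sin(2 * math.pi * mod_hz * t)
    return modulator * math.sin(2 * math.pi * carrier_hz * t)

def fm_sample(t, carrier_hz, mod_hz, deviation_hz=100.0):
    """Frequency modulation: the modulator drives the carrier's frequency
    (written here in the equivalent phase-modulation form)."""
    phase = 2 * math.pi * carrier_hz * t
    phase += (deviation_hz / mod_hz) * math.sin(2 * math.pi * mod_hz * t)
    return math.sin(phase)

# One second of each: 440 Hz carrier, 5 Hz modulator
am = [am_sample(n / SAMPLE_RATE, 440.0, 5.0) for n in range(SAMPLE_RATE)]
fm = [fm_sample(n / SAMPLE_RATE, 440.0, 5.0) for n in range(SAMPLE_RATE)]
```

The only difference between the two functions is which input the modulator feeds, which matches what I found in Bidule.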
Despite the simple algorithm of FM and AM, simulating a natural or man-made sound was pretty hard.
For the AM, I simulated the time period in which an 8-cylinder car (a Chevrolet came to mind) starts its engine, accelerates and stops. I like what I came up with!
The MP3 can be downloaded from here.
For FM, I simulated the sound of an ambulance siren (or whatever it is called in English).
The MP3 can be downloaded from here.
The interesting occurrence in FM was the slight change of timbre when the carrier and the modulator have frequencies very close to each other. To hear the effect, get the Plogue file from here and have a listen.
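A possible explanation for that timbre change: FM produces sidebands at the carrier frequency plus and minus whole multiples of the modulator frequency. Here is a tiny sketch of that reasoning (my own back-of-the-envelope maths, not something from the course notes):

```python
def fm_sidebands(carrier_hz, mod_hz, order=3):
    """Frequencies of the partials produced by FM: carrier +/- n * modulator.
    Negative frequencies fold back to their positive mirror image."""
    freqs = set()
    for n in range(order + 1):
        freqs.add(abs(carrier_hz + n * mod_hz))
        freqs.add(abs(carrier_hz - n * mod_hz))
    return sorted(freqs)

# When carrier and modulator are equal, the partials line up as an
# almost perfect harmonic series:
equal = fm_sidebands(440, 440)   # [0, 440, 880, 1320, 1760]

# Detune the modulator slightly and every partial shifts a little,
# which I suspect is the subtle change of timbre I heard:
close = fm_sidebands(440, 435)
```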
Here is the list of the AM and FM Plogue files I have, plus some more material.
AM for the above MP3 (Chevrolet): (click the Download button)
FM for the above MP3 (ambulance): (click the Download button)
As is apparent in the picture, I have modulated the signal with ITSELF. Nice sound..
Here is a picture of my Pro Tools session for the whole thing:
Amplitude modulation on YouTube..

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 10/09/2007
- Amplitude Modulation. WhatIs.com . (http://searchsmb.techtarget.com/sDefinition/0,,sid44_gci214073,00.html) [Accessed 15/09/07]
- Amplitude Modulation. University of California, Berkeley, USA. (http://robotics.eecs.berkeley.edu/~sastry/ee20/modulation/node3.html) [Accessed 15/09/2007]

Thursday 6 September 2007

Forum - Semester 2 - Week 7

CC1 - Semester 2 - Week 7

Ableton Live

What I did was to sequence a track using the NIN samples provided with Ableton Live. I did the sequencing (or rather mixing) live, in real time.
Arguably, Live has a good interface that is more user-friendly than many others. Most of the features a typical user needs at any one time are on the screen, and there is not much need to browse the menus. Like many other programs, however, there is a need to "flip" sides of the interface to go from the editing section to sequencing; this could be problematic. The other good feature of Live is that it recognises the beginning of a sample (or rather the "beat") quite precisely and does not drift off-time (though the whole synchronisation process depends heavily on the initial loop of the sample).
The restrictive practices one experiences while using Live can actually push the user to be more creative. For example, since the demo version of Live did not allow me to save the file, I had to come up with a workaround: playing and recording in real time across two different programs (in this case, Live and Pro Tools).
Here is my final result in MP3 format.
..and here is a video of a duo called "Telefon Tel Aviv". It demonstrates how these guys use Live to get the job done:


References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 06/09/2007
- Ableton (www.ableton.com) [Accessed: 06/09/2007]
- Ableton Live, Wikipedia (http://en.wikipedia.org/wiki/Ableton_Live) [Accessed: 06/09/07]

AA1 - Semester 2 - Week 7

Analogue / Digital synthesis; Juno 6

I experimented with a few synthesisers, including the Roland Juno 6 and Jupiter. For this week, I played with the Juno 6 and recorded the result.
Obviously, around one hour of mucking around with the synthesiser gave me a whole lot of different sound patterns. Not many of them, however, could be considered simulations of sounds in the "real" world.
In the end, I merged two parts of my final result; the first sounds like volcanic activity (bubbling?) and the second sounds to me like wind blowing. Here is the MP3 containing these two parts.
After all, additive synthesis (the way many electronic synthesisers and most software work) provides many options for creating various sounds. On top of that, the most enjoyable part of the additive story for me is that since the processes are done step by step, it is easier to keep track of what is being done along the way.
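Since additive synthesis is literally just summing sine partials, it is easy to sketch in Python. This is my own illustration of the principle, nothing to do with how the Juno works internally:

```python
import math

SAMPLE_RATE = 44100  # assumed sample rate

def additive(partials, seconds=0.1):
    """Additive synthesis: sum weighted sine partials, sample by sample.
    `partials` is a list of (frequency_hz, amplitude) pairs."""
    n_samples = int(SAMPLE_RATE * seconds)
    out = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE
        out.append(sum(a * math.sin(2 * math.pi * f * t) for f, a in partials))
    return out

# The classic recipe: odd harmonics at 1/k amplitude approximate a square wave
square_ish = additive([(220.0 * k, 1.0 / k) for k in (1, 3, 5, 7)])
```

Each (frequency, amplitude) pair is one explicit step, which is exactly the keep-track-of-everything quality I like about the additive approach.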
Below is a video of the Roland SH-3A synthesiser; it is a short demo and a good example of how additive (and, to me, also addictive!) synthesis works:

There is a good lecture on additive synthesis from Princeton University, New Jersey, US. I've put the link in the references.

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 03/09/2007
- Roland Juno6. Vintage Synth Explorer. (http://www.vintagesynth.com/index2.html) [Accessed 07/09/07]
- Additive Synthesis. Princeton University (http://soundlab.cs.princeton.edu/learning/tutorials/SoundVoice/add1.htm) [Accessed 07/09/2007]

Forum - Semester 2 - Week 6

CC1 - Semester 2 - Week 6

AA1 - Semester 2 - Week 6

Interactive sound design

I chose my external hard drive, a Maxtor OneTouch III, to analyse its sound design.
There are two different sounds (beeps) that come out of it. When it is connected to a computer via a USB cable, three things can happen.
1- Everything is fine, and there is no sound.
2- The drive is connected but there is not enough power; this could be caused by the cable or by the computer's USB port. In this case, there is a constant tone, which won't stop until the device is disconnected.
3- The drive is connected but is busy being recognised (or rather configured) by the computer; in this case there is an on-off beep indicating that nothing should be touched.
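The three states boil down to a very small mapping from drive status to beep pattern. Here is how I would sketch it; the state names and patterns are my own invention for illustration, not Maxtor's actual firmware:

```python
def status_beep(state):
    """Map a (hypothetical) drive state to a beep pattern, written as a
    string of half-second slots: '1' = tone on, '0' = silence."""
    patterns = {
        "ok": "0000",           # everything is fine: no sound at all
        "low_power": "1111",    # constant tone until disconnected
        "configuring": "1010",  # on-off beep: wait, don't touch anything
    }
    return patterns[state]
```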
In my opinion, the sound design for this particular product is not bad.
a) It has a more or less simple structure, and it is not hard for the user to work out what is going on; it uses just one tone, which is easy to notice and recognise.
b) The constant sound the device makes when there is a problem is very annoying; the first thing that comes to the user's mind is to disconnect the drive, and that is exactly what is expected to be done.
c) The on-off beep gives a sense of not being sure, and typically the user waits to see what happens next, which is also exactly what is needed; while the user is confused, the device takes its time and sorts everything out itself!
Here is a link to an interactive sound design project by students at the University of Melbourne: http://www.sounddesign.unimelb.edu.au/web/biogs/P000565b.htm
..and some good material (obviously related to interactive sound design) from Kristianstad University, Sweden: http://www.fb1.uni-lueneburg.de/fb1/austausch/kristian.htm

References:
- Christian Haines 'Audio Arts 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 03/09/2007
-Interactive Sound Environment - Australian Sound Design Project Work, The University of Melbourne (http://www.sounddesign.unimelb.edu.au/web/biogs/P000565b.htm)[Accessed 07/09/2007]
- Interactive Sound Design, Kristianstad University (http://lab.tec.hkr.se/moodle/) [Accessed 07/09/2007]

CC1 - Semester 2 - Week 5

Naturalisation

For this week, I took a MIDI file of the song "Sultans of Swing" by Dire Straits and naturalised it. Most of my work went into the guitar part (particularly the last solo of the tune): I mostly modified the velocities, durations and starting points (groove) of the notes.
Here is the ORIGINAL MIDI part (also available in .cpr format for Cubase)...
..and HERE is my modified version of the tune (and its .cpr file).

Naturalisation might come in handy in many cases, firstly to add the spice of "human mistakes" to the artificial output of machines simulating musical performances. That said, many MIDI files these days were originally played in by a person, so they already have human error in their structure. Still, many corrections are needed to make a MIDI file sound as close as possible to the original song.
Besides, many electronic compositions need a great deal of "groove", which can only come from skillfully applying certain techniques during production so that the final result sounds natural to an ordinary audience.
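The whole naturalisation idea can be boiled down to a few lines of code: nudge every note's velocity, start time and duration by a small random amount. This is only a sketch of the principle; the note format and jitter amounts are my own guesses, not what Cubase does internally:

```python
import random

def humanise(notes, vel_jitter=8, time_jitter=0.01, dur_jitter=0.02, seed=1):
    """Naturalise a quantised part. Each note is a dict with 'start' and
    'dur' in seconds and 'vel' from 1 to 127. All jitter amounts here
    are illustrative guesses."""
    rng = random.Random(seed)  # fixed seed so the result is repeatable
    out = []
    for n in notes:
        out.append({
            "start": max(0.0, n["start"] + rng.uniform(-time_jitter, time_jitter)),
            "dur": max(0.01, n["dur"] * (1 + rng.uniform(-dur_jitter, dur_jitter))),
            "vel": min(127, max(1, n["vel"] + rng.randint(-vel_jitter, vel_jitter))),
        })
    return out

# A perfectly quantised bar of eighth notes, then its humanised version
robotic = [{"start": i * 0.25, "dur": 0.2, "vel": 100} for i in range(8)]
natural = humanise(robotic)
```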

References:
- Christian Haines 'Creative Computing 1.2' Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 21/08/2007
- Dire Straits, Wikipedia (http://en.wikipedia.org/wiki/Dire_Straits) [Accessed: 06/09/2007]
- MIDI search engine, Music Robot (http://www.musicrobot.com/) [Accessed: 06/09/07]

Forum - Semester 2 - Week 4

Forum - Semester 2 - Week 3

This week's exercises were to generate square waves and manipulate the sound using various components (potentiometers, resistors, etc.).
The first exercise is pretty simple: the square wave is generated using a 4093 chip, a capacitor and a resistor, as you can see in the video below:


I changed the resistor and observed the effect:



In the next one, I used an LDR (Light Dependent Resistor) and experimented with it:



In this one, I varied the resistance using a potentiometer (also called a pot) and checked the effect:



This time, I used a potentiometer AND an LDR. Nice stuff:



At last, I used TWO potentiometers, one serving as the generator and the other as the modulator. I think I needed pots of different values to get a more apparent effect:
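The pitch changes in all of these experiments follow from one relationship: a Schmitt-trigger oscillator like the 4093 runs at roughly f = 1 / (k * R * C). Here is a quick sanity check in Python; note the constant k depends on the chip's hysteresis thresholds and supply voltage, and 1.2 is only a commonly quoted ballpark, not a datasheet figure:

```python
def schmitt_osc_freq(r_ohms, c_farads, k=1.2):
    """Rough frequency of a Schmitt-trigger RC oscillator (e.g. a 4093 gate):
    f ~ 1 / (k * R * C). k = 1.2 is a ballpark guess, not a datasheet value."""
    return 1.0 / (k * r_ohms * c_farads)

# 100 kOhm and 10 nF land comfortably in the audible range:
f_base = schmitt_osc_freq(100e3, 10e-9)

# Halving the resistance (which is effectively what turning the pot
# or lighting the LDR does) doubles the pitch:
f_half = schmitt_osc_freq(50e3, 10e-9)
```

This is why turning the pot or shading the LDR sweeps the pitch smoothly: both just change R.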