The DroneSynth is a digital synthesizer and MIDI control interface. The project began as coursework for the Digital Electronics Lab class and later developed into a second iteration. The second prototype was similar to the first in terms of software but featured a more durable control interface. I used this prototype during my participation in the Bang on a Can Musicians' Intensive at the NYU Abu Dhabi Arts Center in February 2017.
This video shows the second prototype being used in an improvised performance/composition.
The project has two main parts: the software synthesizer in Max, and the controller, which consists of the sensors and the Arduino sketch. Starting with the synthesizer: the patch produces three sound waves. At the foundation is a sine wave whose pitch is set by the main potentiometer. This base frequency is summed with any harmonics added in by turning the corresponding pots (A1, A2, B1, B2, C1, C2). The six pots let you add up to six harmonics to the base frequency, drawn from integer multiples of 2, 3, 4, 6, 8, 9, and 14 times the base frequency.
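The summing stage can be sketched as follows. This is a minimal illustration, not the Max patch itself: the pot levels are assumed to be normalized to 0.0–1.0, and since the text names seven multiples for six pots, the sketch simply pairs each pot with one multiple in order.

```python
import math

# Harmonic multiples named in the text. With six pot levels, zip() below
# pairs each pot with one multiple in order and ignores the extra entry.
MULTIPLES = [2, 3, 4, 6, 8, 9, 14]

def summed_signal(t, base_freq, pot_levels):
    """Sample the base sine plus pot-weighted harmonics at time t (seconds).

    pot_levels: six values in 0.0-1.0, one per harmonic pot (assumed scaling).
    """
    out = math.sin(2 * math.pi * base_freq * t)
    for level, mult in zip(pot_levels, MULTIPLES):
        out += level * math.sin(2 * math.pi * base_freq * mult * t)
    return out
```

With all pot levels at zero, only the base sine wave remains, matching the all-pots-left behavior described for the controller.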
This summed signal goes through two separate processing chains: the p littleProcessor and p angelHaze subpatchers.
In p littleProcessor, the signal is processed to give the following outputs:
1. Summed signal multiplied by its square root.
2. and 3. The square root of the signal multiplied by an LFO.
The LFO rate is determined by the value output from the IR sensor: the closer your hand is to the sensor, the slower the LFO rate.
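The IR-to-rate mapping can be sketched as a simple inverse scaling. The reading range (a 10-bit 0–1023 value) and the rate bounds here are assumptions for illustration, not values taken from the patch:

```python
def lfo_rate_from_ir(ir_value, ir_max=1023, min_rate_hz=0.1, max_rate_hz=10.0):
    """Map an IR proximity reading to an LFO rate in Hz.

    Assumes a larger reading means the hand is closer; closer hand
    gives a slower LFO, as described in the text.
    """
    closeness = ir_value / ir_max          # 0.0 = far, 1.0 = closest
    return max_rate_hz - closeness * (max_rate_hz - min_rate_hz)
```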
Output 2 is fed into Delays 1 and 2, summed, and sent to the Degrader module. Here the signal goes through sample-rate and bit-depth reduction to produce a distortion of the signal. The amount of distortion is determined by how far the flex sensor is bent: more bend, more distortion.
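A sample-and-bit degrade of this kind can be sketched as below. The bend-to-parameter mapping (16 down to 4 bits, hold factor up to 8) is a hypothetical choice for illustration; the real Degrader module's parameters are not documented here.

```python
def degrade(samples, bend):
    """Bit-depth reduction plus sample-hold decimation.

    bend: flex-sensor amount, 0.0 (flat) to 1.0 (fully bent). More bend
    means fewer bits and coarser sample holding, i.e. more distortion.
    """
    bits = round(16 - bend * 12)        # 16 bits down to 4 at full bend (assumed)
    hold = 1 + int(bend * 7)            # repeat each sample up to 8x (assumed)
    levels = 2 ** (bits - 1)
    out = []
    for i in range(0, len(samples), hold):
        q = round(samples[i] * levels) / levels   # quantize to reduced bit depth
        out.extend([q] * min(hold, len(samples) - i))
    return out
```

With no bend the signal passes through essentially unchanged; at full bend it is quantized to a few coarse levels and held across groups of samples.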
The clean summed signal is output to channel 1; the degraded signal is output to channel 2.
The third channel takes the summed harmonics out of p harmonix and sends them through p angelHaze. There the signal undergoes low-frequency oscillation at a different set of values than the clean signal, again based on your hand's distance from the IR sensor. It is also distorted in the Degrader, then routed to Delays 3 and 4, summed, and output to the third channel.
That sums up how the patch produces the audio.
The controller works as follows. The six pot knobs determine how much of each of the six harmonics is present.
With all the pots turned fully left, you get just the base frequency. Bending the flex sensor produces distortion. The first press of the first button initialises the patch; every subsequent press performs a delay shift in all four delays.
The delay shift works like this: when the button is pressed, each delay's current value (0–127 on the slider) is examined. If the delay is below 50% of maximum (<63), a random delay above 63 is generated as the new value; if the delay is above 63, a random delay below 63 is generated. This allows for some creative interplay between how many harmonics the performer brings in and the speed of the delay ramp. One very high input harmonic will give a long delay, but adding more lower input harmonics will reduce the average. So there is some space to play with this in a performance setting.
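The per-delay rule above can be sketched directly. Exact boundary handling at 63 is not stated in the text, so the sketch assumes values at or above 63 jump low:

```python
import random

def shift_delay(current, rng=random):
    """On a button press, jump a delay value to the opposite half of 0-127.

    Below the midpoint (63): pick a random high value; otherwise pick a
    random low value. Boundary handling at exactly 63 is an assumption.
    """
    if current < 63:
        return rng.randint(64, 127)   # low delay -> random high delay
    return rng.randint(0, 62)         # high delay -> random low delay
```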
The rest of the buttons engage different play modes. Mode 1 simply adds the set amount of each harmonic to the wave and outputs it constantly. Mode 2 is a two-step sequence, where the harmonics are automated to cycle between their set level and 0 in an ON/OFF pattern, at a rate taken from the value of Delay 2. Mode 3 is a ramp mode: each harmonic fades in and out between 0 and its selected level, with the ramp time taken from Delay 1. That time is multiplied by six randomly generated coefficients, so each harmonic ramps over a different duration while the durations remain interrelated.
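Mode 3's timing scheme can be sketched as below. The coefficient range (0.5–2.0) is a hypothetical choice; the text only says the six coefficients are random and keep the ramp times related to Delay 1's value.

```python
import random

def ramp_times(delay1_ms, rng=random):
    """Generate one ramp duration per harmonic pot for Mode 3.

    Delay 1's value sets the base time; six random coefficients
    (range assumed here) spread the harmonics over distinct but
    related durations.
    """
    coeffs = [rng.uniform(0.5, 2.0) for _ in range(6)]  # one per harmonic pot
    return [delay1_ms * c for c in coeffs]
```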