Inspiration & Motivation

A MIDI controller is an electronic interface, most often resembling a keyboard, that generates commands using the well-known MIDI protocol. The resulting tones can be heavily customized to achieve a particular ‘type’ of sound: cutting or boosting frequencies, reverb, sustain, and other effects can all be applied. However, keyboard synthesizers are limited by the fact that they can only generate discrete notes - this significantly restricts the kinds of music they can play (for example, orchestral pieces). To resolve this issue, we are proposing to build a new form of controller - the Cybersynth. Instead of using discrete keys to generate tones, the Cybersynth uses a long membrane strip as its input. Like the strings of a violin, this allows the synth to produce a continuous range of pitch, unlocking an additional dimension on top of its pre-existing array of tone-modulation options.

What it does

The Cybersynth is a modular MIDI controller that is played like a violin, inspired by Wintergatan's Modulin. As a MIDI controller, the instrument can generate a wide range of output signals: various knobs, sensors, and a joystick control each note’s pitch, volume, and articulation. The Cybersynth supports a wide range of MIDI commands, allowing its output to be played through an external speaker or computer via a digital audio workstation (DAW).
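At the wire level, MIDI commands are short byte messages. As a minimal sketch of the framing involved (the helper names and channel/velocity choices are illustrative, not taken from the project's code), a Note On and a Pitch Bend message can be packed like this:

```c
#include <stdint.h>

/* MIDI status bytes: upper nibble is the command, lower nibble the channel. */
#define MIDI_NOTE_ON    0x90
#define MIDI_PITCH_BEND 0xE0

/* Build a 3-byte Note On message: status, key number (0-127), velocity (0-127). */
static void midi_note_on(uint8_t channel, uint8_t note, uint8_t velocity,
                         uint8_t out[3])
{
    out[0] = MIDI_NOTE_ON | (channel & 0x0F);
    out[1] = note & 0x7F;
    out[2] = velocity & 0x7F;
}

/* Build a 3-byte Pitch Bend message from a 14-bit value (0-16383, where
   8192 means "no bend"), split into 7-bit LSB and MSB data bytes. */
static void midi_pitch_bend(uint8_t channel, uint16_t bend, uint8_t out[3])
{
    out[0] = MIDI_PITCH_BEND | (channel & 0x0F);
    out[1] = bend & 0x7F;         /* LSB: low 7 bits  */
    out[2] = (bend >> 7) & 0x7F;  /* MSB: high 7 bits */
}
```

Streaming Pitch Bend messages after a Note On is the standard MIDI mechanism for sweeping a sounding note between the discrete key numbers.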

How we built it

The Cybersynth is built around two Arduino UNOs. We programmed the ATmega328P on each board in bare metal C to take in, process, and transmit signals from our sensors (SoftPot, FSR, linear potentiometers) out to our MIDI jack, drawing on various concepts learned throughout the course (UART, ADC, timers, interrupts). The body of the Cybersynth is constructed from laser-cut MDF and acrylic.
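One way to turn a SoftPot's ADC reading into a note is to quantize the 10-bit value to a semitone and express the remainder as a pitch-bend offset, so a sliding finger yields a continuous glide. A minimal sketch of such a mapping (the range, base note, and ±2-semitone bend scaling are our own illustrative assumptions, not the project's actual calibration):

```c
#include <stdint.h>

#define ADC_MAX     1023u  /* 10-bit ADC full scale */
#define RANGE_NOTES 24u    /* assume the strip spans two octaves */
#define BASE_NOTE   48u    /* assume C3 at the bottom of the strip */
#define BEND_CENTER 8192u  /* 14-bit pitch-bend value for "no bend" */

typedef struct {
    uint8_t  note;  /* MIDI key number */
    uint16_t bend;  /* 14-bit pitch-bend value */
} pitch_t;

/* Map a raw ADC reading (0-1023) to a semitone plus a fine pitch-bend
   offset, assuming a +/-2 semitone bend range (1 semitone = 4096 units). */
static pitch_t adc_to_pitch(uint16_t adc)
{
    /* Position along the strip in 1/256ths of a semitone. */
    uint32_t pos = (uint32_t)adc * RANGE_NOTES * 256u / (ADC_MAX + 1u);
    pitch_t p;
    p.note = (uint8_t)(BASE_NOTE + pos / 256u);
    p.bend = (uint16_t)(BEND_CENTER + (pos % 256u) * 4096u / 256u);
    return p;
}
```

The integer-only arithmetic matters on the ATmega328P, which has no floating-point hardware; keeping the hot path to multiplies, shifts, and divides by powers of two keeps the per-sample cost low.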

What we learned

From this project, we gained a deeper understanding of the MIDI audio protocol. We also came to better understand ADC, particularly when dealing with multiple analog input sources in a constrained environment for sampling them. We are especially proud of how we send MIDI using our custom library: although the protocol is intended for discrete note output, our design decisions and implementation let us use it to create the continuous effect we were looking for. Around halfway through the project, we realized we needed to change our approach and overall goals. We had initially sought to do the sound processing and generation onboard, but quickly realized this would be too computationally expensive and complicated. We instead shifted to building a MIDI controller that could be used for a similar purpose, offloading most of the difficult computation onto pre-existing systems.

Doing this again, we would research the sound generation and processing component of our project more thoroughly, in order to either reach a better solution for that aspect or decide earlier in the design process to leave it out, as we ended up doing.

A next step for this project could take two paths. One way forward would be to expand the controller's features and functionality; some potential ideas we had were a sustain feature, key signatures, and automatic arpeggios. The other would be to return to the idea of onboard sound generation, potentially with a second dedicated microcontroller whose sole purpose is processing the MIDI commands and generating audio output. With enough time, both of these paths could be combined.

Built With

  • baremetal
  • c