Record of MFA DT MAJOR STUDIO 1
Broader question: How can I explore the relationship between music and color, or other visual patterns?
Specific question: How can I make music more accessible and democratize musical expression through visual work?
For the first prototype, I continued the process I’d used before: drawing a pattern for my friend’s composition. But this time I tried to establish a rule for the relationship between the visual work and the music. After I finished drawing, I asked myself why I drew it that way, and I found that sometimes I drew without any reason, so the result is very subjective. It is also hard to establish a rule because there are many parameters in both music and visual patterns. What I wanted to find before, the emotion of certain chords or of a certain progression, does not really exist either, because many other parameters in the music affect emotion. On top of that, the same chord progression might evoke different emotions in different contexts. So I decided to take this in a different direction.
Whether visualizing music or translating a visual pattern into music, both need to account for the timeline of the music. So for prototype 2, I thought about indicating the timeline in different ways. I didn’t want to be limited to a 2D medium, so I tried to indicate the timeline in 3D space.


I found it pretty interesting to make a rule for translating what people draw into music. This time I only considered pitch and the bass line. I really want to explore this more the next day; I’d like to consider how to create melody and chord progressions from visual patterns.
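As an illustration, here is a minimal sketch of how such a drawing-to-music rule could work, assuming the drawing is sampled as (x, y) points. The scale, the point format, and the one-note-per-point bass are all my assumptions for this sketch, not the prototype’s actual rule:

```python
# Hypothetical sketch: map a drawn stroke to a melody plus a simple bass line.
# Assumption (not from the original prototype): the stroke is a list of (x, y)
# points; x becomes the timeline and y picks a pitch from a scale.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes C4..C5

def stroke_to_notes(stroke, scale=C_MAJOR):
    """Quantize each point's height into a scale degree."""
    ys = [y for _, y in stroke]
    lo, hi = min(ys), max(ys)
    span = (hi - lo) or 1                 # avoid dividing by zero on flat strokes
    melody = []
    for _, y in sorted(stroke):           # sort by x: left-to-right is the timeline
        degree = round((y - lo) / span * (len(scale) - 1))
        melody.append(scale[degree])
    # Bass line: the scale's root an octave below, one note per melody note
    bass = [scale[0] - 12] * len(melody)
    return melody, bass

melody, bass = stroke_to_notes([(0, 10), (1, 40), (2, 70), (3, 40)])
```

A rising stroke produces a rising melody, and the bass stays on the root, which is roughly the "pitch plus bass line" scope described above.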
I did some small research before. There are two websites that use the computer keyboard to make music: Patatap and Mikutap.
I asked people which one they preferred and why, and I found that they all preferred Mikutap because they could play music randomly. The result was as I expected. The reason is that Mikutap provides an accompaniment, a certain chord progression, while people press keys to create a melody, which makes the experience more “musical”. Also, Mikutap only uses a pentatonic scale, in which every note is harmonious with the accompaniment, so it sounds good no matter which keys people press. Patatap, however, is more like “sound effects” than “music”. Since I aim to make things more musical, I came up with today’s guiding question.
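The pentatonic idea can be sketched in a few lines. This is my own illustration of the principle, not Mikutap’s actual implementation; the key layout and root note are assumptions:

```python
# Hypothetical sketch of a mikutap-style key mapping: every computer key maps
# to a note of one pentatonic scale, so any combination of key presses stays
# consonant with a backing chord progression in the same key.

PENTATONIC = [0, 2, 4, 7, 9]  # C major pentatonic, in semitones from the root

def key_to_midi(key, root=60):
    """Map a letter key ('a'..'z') to a pentatonic MIDI note, cycling octaves."""
    index = ord(key.lower()) - ord('a')           # 'a'..'z' -> 0..25
    octave, degree = divmod(index, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]

# 'a' -> 60 (C4), 'b' -> 62 (D4), 'f' -> 72 (C5): all consonant over a C chord
```

Because every reachable note is in the scale, random mashing still sounds harmonious, which is exactly why the experience feels “musical” rather than like raw sound effects.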
This prototype is inspired by the color version of the Tonnetz, or “tone network”, found in Riemannian theory: an arrangement of notes that illustrates harmonies and their relationships.
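The structure behind the Tonnetz can be sketched as a small graph of pitch classes. This is a simplified illustration of the arrangement, not the colored version referenced above:

```python
# Sketch of the Tonnetz as a graph of pitch classes (0-11). In the classic
# arrangement each note's neighbors lie a perfect fifth (7 semitones),
# major third (4), or minor third (3) away, which is why nearby nodes
# group into consonant triads.

INTERVALS = [7, 4, 3]  # perfect fifth, major third, minor third

def tonnetz_neighbors(pitch_class):
    """Pitch classes adjacent to `pitch_class` on the Tonnetz."""
    up = {(pitch_class + i) % 12 for i in INTERVALS}
    down = {(pitch_class - i) % 12 for i in INTERVALS}
    return sorted(up | down)

# Neighbors of C (0) include E (4) and G (7): together they form a C major triad.
```

This adjacency is what makes the network useful for showing harmonic relationships: moving to a neighboring node always lands on a consonant interval.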

The interaction could happen either in the physical world or in a digital one, such as a web-based experience.
I found the process so interesting and I think there’s a lot I can do to make it better.
Based on the idea of prototype 4, I made some changes and improvements and turned it into a web-based experience. At first I wanted to make it a game whose goal is to sort the chords into the correct order, but I realized there shouldn’t be any right or wrong answers in music. So I just want people to sort the chords in a way that sounds good to them and to feel it.
This prototype is inspired by Wassily Kandinsky, an artist who compared painting to making music: “The sound of colors is so definite that it would be hard to find anyone who would express bright yellow with bass notes, or dark lake with the treble.”

What if we could search for music by color? What if we could use color to represent our emotions and then get the music we want? Today we search for music with words, but I think color has its own expressive power, which is different from the expression of words.
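As a highly speculative sketch, a color could be reduced to a crude mood query. Every mapping here is my assumption for illustration, not a researched model of color and emotion:

```python
# Speculative sketch: reduce an RGB color to a (valence, energy) pair that
# could seed a music search. The mappings below are assumptions, loosely
# echoing Kandinsky's bright-vs-dark intuition quoted earlier.
import colorsys

def color_to_mood(r, g, b):
    """Map RGB floats in [0, 1] to a (valence, energy) pair in [0, 1]."""
    _, lightness, saturation = colorsys.rgb_to_hls(r, g, b)
    valence = lightness    # brighter colors read as more positive
    energy = saturation    # more saturated colors read as more intense
    return valence, energy

# A bright yellow should score higher valence than a dark blue.
```

Even this toy version shows the appeal of the idea: a single color pick yields a two-dimensional query that words like “happy” or “calm” only approximate.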


Translating generated visual patterns into music might borrow from the technology behind machine-learning-generated album covers. Example:
The interaction could happen either in the physical world or in a digital one, such as VR, depending on the technology.