Saturday, December 26, 2009

How high do your singers need to sing to do this piece?

Here's one way of figuring it out:

(see the original blog post if your reposting turns this code into gobbledygook...)

import copy
import music21
from music21 import corpus, meter, stream

score = corpus.parseWork('bach/bwv366')
ts = score.flat.getElementsByClass(meter.TimeSignature)[0]
ts.beat.partition(3)  # divide the 3/4 bar into three beats (1+1+1)

found = stream.Stream()
for part in score:
    # keep each part's clef so the snippet displays correctly
    found.append(part.flat.getElementsByClass(music21.clef.Clef)[0])
    highestNoteNum = 0
    for m in part.measures:
        for n in m.notes:
            if n.midi > highestNoteNum:
                highestNoteNum = n.midi
                highestNote = copy.deepcopy(n)  # optional

                # These two lines will keep the look of the original
                # note values but make each note one 4/4 measure long:
                highestNote.duration.components[0].unlink()
                highestNote.quarterLength = 4

                highestNote.lyric = '%s: M. %s: beat %s' % (
                    part.getInstrument().partName[0],  # first letter of the part name
                    m.measureNumber, ts.getBeat(n.offset))

    found.append(highestNote)

found.show()


... which generates this snippet of notation...



...showing that for at least one piece, Bach was (probably accidentally) using the old medieval authentic, plagal, lower-octave-authentic, and lower-octave-plagal range designations!

This code is still needlessly complicated -- we're working on simplifying it (notes will know their own clefs and beats; '3/4' will know that it should be divided into 1+1+1) -- but it gives a little taste of what's possible.

Oh, and just for fun: with the code-snippet substitution below, you can find all the C#s in the piece and discover that all the raised leading tones in this D-minor composition happen on the two strongest beats of the measure:


if n.name == 'C#':
    ...
    found.append(n)
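
Spelled out, that substitution might drop into the earlier loop like this (a rough sketch -- the lyric labeling here is our own illustration, not from the original post):

for m in part.measures:
    for n in m.notes:
        if n.name == 'C#':
            # label each C# with its location, then collect it
            n.lyric = 'M. %s: beat %s' % (m.measureNumber, ts.getBeat(n.offset))
            found.append(n)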


Saturday, December 19, 2009

Mapping the Eroica

Alex Ross discusses on his blog a project by Eric Grunin to document changes in the performance of Beethoven's Eroica Symphony over the years. His visualization of changes in tempo in the first movement over time is particularly interesting:



(Red dots represent performances that take the repeat at the end of the exposition; blue dots do not).

Grunin finds all sorts of new and interesting data to analyze. I especially wanted to link to it to show that there are a lot of amazing projects in computer-aided musicology still to be done that do not require analyzing music as symbolic data (as music21 does).

Thursday, December 10, 2009

How do Mozart and Chopin use their notes?

Mozart's minuets don't sound much like Chopin's mazurkas. There are many reasons for this -- the rhythms, the tone of their pianos, changes in repetition, and so on. But one difference that music21 lets us discover is that they articulate their pitch space differently. For instance, here is the pitch distribution in the soprano register of one minuet by Mozart:



The x (left<->right) axis shows shorter and longer notes -- this excerpt [like the Chopin] uses almost entirely quarter notes (1.0) and eighth notes (0.5 quarterLength). The y (front<->back) axis shows lower to higher notes (middle C = 60). The z (top<->bottom) axis shows the number of times each pitch/rhythm combination appears. Looking along the y axis, we see something like a bell-shaped curve. Mozart uses notes in the middle of the register the most often, followed by a smooth trailing off towards higher and lower notes.

Contrast the Mozart graph with the pitch distribution of the right hand of a Chopin mazurka:



Chopin emphasizes certain notes quite a bit more than other notes. There is no smooth distribution; instead some pitch-classes (C# and G# especially) recur much more often in certain registers of the piano. We will be returning to these examples (probably refining the graphs slightly to deal better with grace-notes, etc.) as music21 moves toward its initial beta release.
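
If you would like to try something similar yourself before our graphing tools are released, here is a minimal sketch of the underlying tally -- counting how often each (duration, pitch) combination occurs -- using the Bach chorale from an earlier post as a stand-in score, since the exact Mozart and Chopin files may not be in every corpus download:

from music21 import corpus, note

score = corpus.parseWork('bach/bwv366')  # stand-in for the Mozart minuet
counts = {}
for n in score.flat.getElementsByClass(note.Note):
    key = (n.quarterLength, n.midi)  # (duration, pitch) combination
    counts[key] = counts.get(key, 0) + 1

# print each combination with its number of occurrences
for key in sorted(counts):
    print key, counts[key]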

Wednesday, November 18, 2009

Expressive Notation Package

A potentially interesting article on some of the challenges of music notation -- the sorts of things we're thinking about while creating a system that encodes much of the ambiguity and power of music notation.

Friday, October 30, 2009

Simple things simple

(1) We want to see the twelve-tone matrix for a given row:

from music21 import serial

p = [8,1,7,9,0,2,3,5,4,11,6,10]
print serial.rowToMatrix(p)

0 5 11 1 4 6 7 9 8 3 10 2
7 0 6 8 11 1 2 4 3 10 5 9
1 6 0 2 5 7 8 10 9 4 11 3
11 4 10 0 3 5 6 8 7 2 9 1
8 1 7 9 0 2 3 5 4 11 6 10
6 11 5 7 10 0 1 3 2 9 4 8
5 10 4 6 9 11 0 2 1 8 3 7
3 8 2 4 7 9 10 0 11 6 1 5
4 9 3 5 8 10 11 1 0 7 2 6
9 2 8 10 1 3 4 6 5 0 7 11
2 7 1 3 6 8 9 11 10 5 0 4
10 3 9 11 2 4 5 7 6 1 8 0


(2) We want to graph correlations between the lengths of notes and their heights, using pieces encoded in MusicXML or Humdrum format (these are from our downloaded corpora):

from music21 import corpus, correlate, instrument

for work in ['opus18no1', 'opus59no3']:
    movementNumber = 3
    score = corpus.parseWork(work, movementNumber)

    for part in score:
        instrumentName = part.getElementsByClass(instrument.Instrument)[0].findName()
        grapher = correlate.NoteAnalysis(part.flat.sorted)
        grapher.pitchToLengthScatter(title='%s, Movement %s, %s' % (
            work, movementNumber, instrumentName))

Displays 8 images, including:

Music21 Preview -- Welcome!

This is a preview of the music21 system for computer-aided musicology being developed at MIT (Michael Scott Cuthbert, Asst. Prof., Principal Investigator; Christopher Ariza, Visiting Asst. Prof., Development Lead). We'll be using this blog to showcase some of the features of the system, and to highlight other interesting things happening in computational, statistical, and other empirical methods in musicology.

Although computers have transformed how we listen to, obtain, compose, and notate music, they have not fundamentally changed how we research and analyze music. Though many computer databases have been created for musicology, they are not well adapted for sophisticated music queries. For instance, melodies can be found if exact matches exist. But melodic variations such as the repetition of a phrase or a change in embellishment are extremely common, yet cause searches to fail. More complex investigations, such as finding all melodies that imply a particular underlying harmony, can barely begin to be created with existing software packages. The lack of relevant software for analyzing music hampers scientific attempts to understand what we listen for and how we process what we hear; these activities are little understood despite music’s nearly universal presence in our daily lives.

The music21 project at M.I.T. will give the music community the set of tools it needs to conduct sophisticated musical and statistical analysis using modern programming techniques. The software framework, written in Python, manipulates music as a collection of symbolic data, such as pitch names and note durations, which can then be classified as higher-level musical structures according to the style, region, or period being studied.

Music21 focuses specifically on the manipulation of symbolic music data: it leaves to the many preexisting open-source and proprietary software packages the notation and audio playback of scores (the two areas where computer-aided music research is most developed). By focusing on the points of greatest need in musicology, the framework can deliver useful results within a short timeframe.

The music21 framework will be freely available in early 2010 under the LGPL open source license.

Some demos:

from music21 import *

n = note.Note("F#4")
n.quarterLength = 3

a = stream.Stream()
a.repeatAdd(n, 20) # add 20 copies of n

a.insertAtOffset( 0, meter.TimeSignature("5/4"))
a.insertAtOffset(10, meter.TimeSignature("2/4"))
a.insertAtOffset( 3, meter.TimeSignature("3/16"))
# N.B. they don't have to be inserted in order

a.insertAtOffset(20, meter.TimeSignature("9/8"))
a.insertAtOffset(40, meter.TimeSignature("10/4"))
a.insertAtOffset(50, meter.TimeSignature("29/32"))
a.show()
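
And as a quick sanity check (our addition, not part of the demo above), listing the time signatures from the sorted stream confirms that out-of-order insertion is handled:

# iterate over the stream in offset order and print each signature's position
for sig in a.sorted.getElementsByClass(meter.TimeSignature):
    print sig.offset, sig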