Saturday, December 26, 2009

How high do your singers need to sing to do this piece?

Here's one way of figuring it out:

(see the original blog post if your reposting turns this code into gobbledygook...)

import copy
import music21
from music21 import corpus, meter, stream

score = corpus.parseWork('bach/bwv366')
ts = score.flat.getElementsByClass(meter.TimeSignature)[0]
ts.beat.partition(3)  # divide the 3/4 bar into three beats so getBeat() reports 1, 2, or 3

found = stream.Stream()  # will collect one clef plus one highest note per part
for part in score:
    found.append(part.flat.getElementsByClass(music21.clef.Clef)[0])
    highestNoteNum = 0
    for m in part.measures:
        for n in m.notes:
            if n.midi > highestNoteNum:
                highestNoteNum = n.midi
                highestNote = copy.deepcopy(n)  # optional

                # These two lines will keep the look of the original
                # note values but make each note 1 4/4 measure long:
                highestNote.duration.components[0].unlink()
                highestNote.quarterLength = 4

                highestNote.lyric = '%s: M. %s: beat %s' % (
                    part.getInstrument().partName[0],
                    m.measureNumber, ts.getBeat(n.offset))
    found.append(highestNote)

found.show()


... which generates this snippet of notation...



...showing that, for at least one piece, Bach was (probably accidentally) using the old medieval authentic, plagal, lower-octave-authentic, and lower-octave-plagal range designations!

This code is still needlessly complicated -- we're still working on simplifying it (notes will know their own clefs and beats; '3/4' will know that it should be divided into 1+1+1) -- but it's a little taste of what's possible.
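
For a sense of where we'd like to end up, here is a hypothetical sketch of the simplified version -- none of it works today, and it assumes notes will eventually carry their own .measureNumber and .beat attributes:

# Hypothetical sketch only: .measureNumber and .beat are assumed
# future attributes, not something music21 provides as it stands.
found = stream.Stream()
for part in score:
    found.append(part.flat.getElementsByClass(music21.clef.Clef)[0])
    partNotes = part.flat.getElementsByClass(music21.note.Note)
    highest = max(partNotes, key=lambda n: n.midi)  # highest-sounding note in the part
    highest.lyric = '%s: M. %s: beat %s' % (
        part.getInstrument().partName[0], highest.measureNumber, highest.beat)
    found.append(highest)
found.show()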

Oh, and just for fun: you can find all the C#s in the piece by substituting this snippet for the highest-note test, or discover that all the raised leading tones in this D-minor composition happen on the two strongest beats of a measure:


if n.name == 'C#':
    ...
    found.append(n)
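
To verify the beat claim directly, a rough variant (reusing the ts.getBeat call from the full example above) would print where each C# falls rather than collecting the notes:

if n.name == 'C#':
    # m and ts come from the surrounding loops in the full example
    print(m.measureNumber, ts.getBeat(n.offset))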


Saturday, December 19, 2009

Mapping the Eroica

Alex Ross discusses on his blog a project by Eric Grunin to document changes in the performance of Beethoven's Eroica Symphony over the years. His visualization of changes in tempo in the first movement over time is particularly interesting:



(Red dots represent performances that take the repeat at the end of the exposition; blue dots do not).

Grunin finds all sorts of new and interesting data to analyze. I especially wanted to link to it to show that there are a lot of amazing projects in computer-aided musicology still to be done that do not require analyzing music as symbolic data (as music21 does).

Thursday, December 10, 2009

How do Mozart and Chopin use their notes?

Mozart's minuets don't sound much like Chopin's mazurkas. There are many reasons for this -- the rhythms, the tone of their pianos, changes in repetition, and so on. But one difference that music21 lets us discover is that they articulate their pitch space differently. For instance, here is the pitch distribution in the soprano register of one minuet by Mozart:



The x (left<->right) axis shows shorter and longer notes -- this excerpt [like the Chopin] uses almost entirely quarter notes (1.0) and eighth notes (0.5 quarterLength). The y (front<->back) axis shows lower to higher notes (middle C = 60). The z (top<->bottom) axis shows the number of times each pitch/rhythm combination appears. Looking along the y axis, we see something like a bell-shaped curve. Mozart uses notes in the middle of the register the most often, followed by a smooth trailing off towards higher and lower notes.
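
As a rough do-it-yourself sketch of the kind of tally behind these graphs -- not the exact code we used, and the corpus path below is only a placeholder for whichever piece you want to examine -- here is one way to count the (duration, pitch) pairs:

from music21 import corpus, note

# Placeholder piece -- substitute the minuet or mazurka you want to examine.
score = corpus.parseWork('bach/bwv366')

counts = {}  # maps (quarterLength, MIDI pitch) to number of occurrences
for n in score.flat.getElementsByClass(note.Note):
    key = (n.quarterLength, n.midi)
    counts[key] = counts.get(key, 0) + 1

# show the ten most frequent duration/pitch combinations
byFrequency = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
for (ql, midi), howMany in byFrequency[:10]:
    print(ql, midi, howMany)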

Contrast the Mozart graph with the pitch distribution of the right hand of a Chopin mazurka:



Chopin emphasizes certain notes quite a bit more than other notes. There is no smooth distribution; instead some pitch-classes (C# and G# especially) recur much more often in certain registers of the piano. We will be returning to these examples (probably refining the graphs slightly to deal better with grace-notes, etc.) as music21 moves toward its initial beta release.