The Catholic Church has done many interesting things over the centuries, some bizarre, some just plain stupid, and some that are both. One of those is the excommunication of the tritone – the topic of this article.
The tritone is not Triton, who is the son of Poseidon and also the largest moon of Neptune, although the church probably excommunicated those too at some point. The tritone is simply an interval of six semitones. Six semitones equal three whole tones, and hence in classical music this interval is called a tritone. The interval between A and D#, for example, is a tritone, as it spans six semitones.
We can also call the tritone an augmented fourth if it sits between a note of a scale and the fourth note above it, such as between the first and the fourth note of the Lydian mode (e.g., between F and B in F Lydian: F, G, A, B, C, D, E). We can call it a diminished fifth if it sits between a note of a scale and the fifth note above it, such as between the first and the fifth note of the Locrian mode (e.g., between B and F in B Locrian: B, C, D, E, F, G, A). It is the interval between the root and the "blue" note in a common blues scale. The tritone is also part of the diminished chord (say Adim = A, C, D#, the tritone being the interval between A and D#). All in all, the tritone is used quite frequently nowadays.
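The semitone counting above can be sketched in a few lines of Python. The note names and interval examples are the ones used in this article; the function itself is just an illustration, not any standard library API:

```python
# Pitch classes in semitone order starting from A.
NOTES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def semitones(low, high):
    """Number of semitones going up from `low` to `high` (0-11)."""
    return (NOTES.index(high) - NOTES.index(low)) % 12

print(semitones("A", "D#"))  # 6 - a tritone
print(semitones("F", "B"))   # 6 - the augmented fourth in F Lydian
print(semitones("B", "F"))   # 6 - the diminished fifth in B Locrian
```

Any interval that comes out as 6 is a tritone, whatever its spelling on the staff.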
The church must have foreseen the 20th century and must have hated it – with all the blues in the early part of the century (a lot of diminished chords in Robert Johnson's "Me and the Devil"), with all the blues in the middle (say, the beginning of Jimi Hendrix's "Purple Haze" or "Red House"), and the end, well, apparently, the tritone is heavily used in heavy metal (go figure).
People have offered all kinds of explanations. Here are some.
The tritone is too pagan: It could be, although I am not an expert. It does exist in some folk music. The Lydian mode, for example, has been around for a while, although it is said that in the Middle Ages the B in the Lydian example above would often be dropped to Bb, producing the F major scale. I have heard the same "pagan" argument about the seventh of the scale (i.e., the major or minor seventh of heptatonic scales, eleven or ten semitones up from the root).
Too eerie: The interval was apparently called the "devil's interval", or diabolus in musica – I cannot be sure. It sounds alright to me.
Sounds crappy: The argument is that instrument tuning was different and this interval just did not work. If you are building the chromatic scale with Pythagorean tuning, then yes. Starting from A, for example, going up by fifths you reach D#, which is off from contemporary equal tempered tuning by about an eighth of a semitone up. Going down by fifths from A you reach Eb, off from the contemporary pitch by another eighth of a semitone, this time down. More importantly, you get a D# that is about a quarter of a semitone away from Eb. Which brings us to:
Difficult to tune: Alright. If you use Pythagorean tuning, you get a problem around the tritone. In the example above, D# will be off from Eb by about a quarter of a semitone, which is very noticeable. Thus, you can tune up from A to D#, or down from A to Eb, but crossing the D# / Eb divide is a problem. An off-tune interval resulting from such tunings is called a "wolf interval".
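The gap between the two Pythagorean tritones can be checked with a short Python sketch, assuming A = 440 Hz and folding each chain of pure fifths back into the octave above A:

```python
import math

def cents(f, ref):
    """Distance of f from ref in cents (100 cents = one semitone)."""
    return 1200 * math.log2(f / ref)

A = 440.0
et = A * 2 ** (6 / 12)               # equal tempered tritone above A, ~622.25 Hz

d_sharp = A * (3 / 2) ** 6 / 2 ** 3  # six pure fifths up from A -> D#, ~626.48 Hz
e_flat = A * (2 / 3) ** 6 * 2 ** 4   # six pure fifths down from A -> Eb, ~618.05 Hz

print(round(cents(d_sharp, et), 2))    # +11.73 cents sharp of equal temperament
print(round(cents(e_flat, et), 2))     # -11.73 cents flat of equal temperament
print(round(cents(d_sharp, e_flat), 2))  # 23.46 cents between the two spellings
```

Each side misses the equal tempered tritone by roughly an eighth of a semitone, and the two spellings sit roughly a quarter of a semitone apart, as the article says.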
Natural order: The fact that there are exactly seven notes shows that there is natural order and god exists. Just kidding. This has nothing to do with this article, but I did see it put forward in a forum on the question of what is wrong with the tritone. There are so many problems with this argument that I am not sure where to begin. For starters, Western music uses 12 notes, although it typically relies on seven-note scales. That is, of course, if you forget the pentatonic scales in blues, passing tones in bebop, six-note scales in some folk music, quarter-tone scales in Arabic music, nineteen notes for some Renaissance theorists, and so on.
Testing for the old tritone
Did the tritone sound bad in older times? Older times used different tuning, and getting to that tuning is not easy. Any good contemporary software can pitch shift, though. Even if your instrument cannot produce some pitch, you can perhaps get to it through pitch shifting, if you know by how much to shift. I tried. And I failed.
I decided on MIDI, just to avoid issues with, say, a wrongly tuned instrument to begin with. I pulled up the Orinj MIDI editor and put some notes in sequence: first A and D# sequentially, then together, then the arpeggiated Adim = A, C, D#. I repeated the same sequence, but pitch shifted C and D# to reflect the Pythagorean tuning starting from A. Say that we start with A = 440 Hz. Then, in contemporary equal tempered tuning, C = 440 * 2^(3/12) = 523.25 Hz and D# = 440 * 2^(6/12) = 622.25 Hz. In Pythagorean tuning, taking the flat variants reached by descending fifths, C = 440 * ((2/3)^3) * 4 = 521.48 Hz and D# = 440 * ((2/3)^6) * 16 = 618.05 Hz. The difference is: C Pythagorean – C modern = 1200 * log2(521.48 / 523.25) = -5.86 cents and D# Pythagorean – D# modern = 1200 * log2(618.05 / 622.25) = -11.73 cents (100 cents is a semitone).
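This arithmetic can be verified with a small Python sketch, assuming, as above, A = 440 Hz and the flat, descending-fifths variants of the Pythagorean C and D#:

```python
import math

A = 440.0
c_modern = A * 2 ** (3 / 12)         # equal tempered C above A, ~523.25 Hz
ds_modern = A * 2 ** (6 / 12)        # equal tempered D# above A, ~622.25 Hz
c_pyth = A * (2 / 3) ** 3 * 2 ** 2   # three fifths down from A, ~521.48 Hz
ds_pyth = A * (2 / 3) ** 6 * 2 ** 4  # six fifths down from A, ~618.05 Hz

def cents(f, ref):
    """Distance of f from ref in cents (100 cents = one semitone)."""
    return 1200 * math.log2(f / ref)

print(round(cents(c_pyth, c_modern), 2))    # -5.86
print(round(cents(ds_pyth, ds_modern), 2))  # -11.73
```

The powers of two in the Pythagorean formulas just fold the stacked fifths back into the octave above A.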
If you want to pitch shift in MIDI, the pitch bend range usually goes from 0x0000 (two semitones down) to 0x3FFF (two semitones up). 0x2000 is the middle (no bend). Pitch shifting D# from 622.25 Hz down to 618.05 Hz means using 0x1E20. 0x1E20 / 0x2000 = 0.941 in decimal, which is a drop of about 0.059 of two semitones, or 11.72 cents.
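A minimal sketch of this conversion, assuming the usual default bend range of two semitones (200 cents) in each direction; the function name is my own, not part of any MIDI library:

```python
def bend_value(cents_offset, bend_range_cents=200.0):
    """Map a cent offset onto the 14-bit range 0x0000-0x3FFF (0x2000 = no bend)."""
    value = round(0x2000 + cents_offset / bend_range_cents * 0x2000)
    return max(0, min(0x3FFF, value))  # clamp to the 14-bit range

# -11.73 cents takes the equal tempered D# (622.25 Hz) down to the
# Pythagorean 618.05 Hz.
print(hex(bend_value(-11.73)))  # 0x1e20
print(hex(bend_value(0)))       # 0x2000
```

Note that a synthesizer may let you change the bend range via pitch bend sensitivity, in which case the 200-cent default above no longer applies.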
This said, I heard the exact same sequence of notes with pitch shifting as without it. This means that: 1) I did not do it correctly; 2) my ear does not recognize such a small pitch shift; or 3) the Microsoft MIDI mapper simply ignores such small pitch bends. All of the above are likely. I would bet on (3), as a large enough pitch bend (e.g., a whole semitone) is very audible, so bends are being applied in general. The other way to test this is to record the MIDI file to a wave file, which will ensure that the music starts at the right pitch, and then to shift the wave data later with a separate pitch shifting effect. That is just too much effort.