CS 3604 Writing Assignment #3: Final Copy
S. Keith Ratcliffe
Throughout the history of human culture, the desire for musical expression has compelled us to invent countless devices for creating and controlling musical sounds. Many of these devices, such as the percussion, string, brass, and woodwind instruments of the modern orchestra and others of the non-Western world, have been with us for hundreds of years and have reached what many consider to be near-optimal design. Thus, the majority of these instruments have changed very little during the last few centuries, with only minor improvements, if any, to their original designs. Until recently, the practices governing how music for these instruments is composed, performed, and archived had also seen only minor changes (Paradiso, 1). Only within the last four or five decades, with great technological advances in electronics and computers, has significant change in these designs and practices been possible.
The evolution of computers and computer-related technology has had a dramatic impact on nearly every facet of human culture. Its impact on our music and on our musicians is certainly no exception. The purpose of this paper is to discuss, in chronological order, the "three stages of insertion" as they relate to the insertion of computers into the fields of music creation and music performance, beginning in the 1950s. By definition, the first stage, Direct Replacement of Activity, deals with the initial replacement of some current set of tasks by an entirely new set of tasks afforded by the insertion of computer automation. Here it is shown that no real gains in productivity were realized and that little or no replacement of people came about as a result of the insertion. In the second stage, Enhancement of Activities, the effects of computer insertion are shown to have progressed such that significant strides were made in productivity, resulting in the replacement of people no longer necessary for the completion of tasks. Lastly, in the Ability to Perform New Functionality stage, new products and capabilities are identified which came about as a direct result of computer insertion.
In 1957, Max Mathews, now recognized as the "father of computer music," was working in the Acoustic and Behavioral Research Department at Bell Laboratories. There, in an attempt to simulate telephone communications, Mathews' colleagues figured out how to convert human speech into binary data, input that data into a computer, and then convert the digital information back into analog sound waves. An accomplished violinist, Mathews saw the potential for this technology to be applied in the field of music and immediately began work on MUSIC I, a software program written in assembly code for the IBM 704 computer. This revolutionary software allowed musicians to input information representing their musical scores onto computer punch cards. When compiled and run, the resulting music would be output in digital form onto magnetic tape and could then be played back through a digital-to-analog converter. The music achieved through this digital synthesis is reported to have been crude at best. The notes produced were pure tones (sine waves), which lack most of the characteristics that we find pleasing in tones produced by acoustic instruments. However, for the first time in history, a computer was being used to automate the process of creating and performing music. In 1968, Mathews ended work on his groundbreaking software upon completing MUSIC V, written in FORTRAN for the IBM 360. Developers from MIT and Stanford University picked up where he left off, creating two more versions of the software, MUSIC 10 and MUSIC 360 (Johnstone). Although this early work in the digital synthesis of music went largely unnoticed by the outside world, it clearly provided the impetus for the digital revolution that would engulf the music industry in coming decades.
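To illustrate the kind of "pure tone" synthesis described above, here is a minimal modern sketch; it is not Mathews' actual MUSIC I code (which was IBM 704 assembly), and the sample rate and amplitude chosen are illustrative assumptions. It simply computes a sine wave as a sequence of digital samples, the form in which such music would be handed to a digital-to-analog converter.

```python
import math

SAMPLE_RATE = 8000  # samples per second; an assumed, deliberately low rate


def sine_tone(freq_hz, duration_s, amplitude=0.8):
    """Generate a 'pure tone' (sine wave) as a list of float samples."""
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n)]


# A 440 Hz tone (concert A) lasting a quarter of a second:
tone = sine_tone(440.0, 0.25)
```

A tone built this way has only one frequency component, which is why such notes sound sterile next to an acoustic instrument's harmonically rich timbre.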
This first stage of insertion continued throughout the 1970s, with further advancements limited to a few significant breakthroughs in the design of computerized musical instruments. Owing to the earlier work done by Mathews at Bell Labs, and to advancements in the construction of electronic (piano) keyboards in the 1960s, the concept of a "hybrid" musical instrument began to take shape. The idea behind the hybrid instrument was to combine the functionality and power of new, smaller computers with the more intuitive interface provided by the standard piano keyboard. Some of these early instruments included the GROOVE (Generated Real-time Output Operations on Voltage-controlled Equipment), developed by Max Mathews himself in 1970; the Synclavier, developed at Dartmouth College in New Hampshire in 1975; and the Fairlight CMI (Computer Music Instrument), designed by two Australian engineers in 1979. These computerized instruments introduced the world to the concept of digital sampling, a process by which real-life sounds are recorded, digitized, and modified for playback through an electronic device.  For the first time, musicians could emulate the sounds of traditional acoustic instruments and produce new sounds never before heard by the human ear.
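The core trick of digital sampling, playing a recorded table of numbers back in a modified way, can be sketched in a few lines. The function below is a deliberately naive illustration, not the Fairlight's or Synclavier's actual algorithm: reading a recorded sample table faster or slower than it was recorded shifts the pitch up or down.

```python
def resample(samples, ratio):
    """Naive playback-rate change for a recorded sound.

    ratio > 1 reads through the table faster (higher pitch, shorter sound);
    ratio < 1 reads slower (lower pitch, longer sound). Real samplers
    interpolate between samples; this sketch picks the nearest earlier one.
    """
    n = int(len(samples) / ratio)
    return [samples[min(int(i * ratio), len(samples) - 1)] for i in range(n)]


# Playing an 8-sample "recording" back at twice the rate keeps every
# other sample, raising the pitch roughly one octave:
octave_up = resample([0, 1, 2, 3, 4, 5, 6, 7], 2.0)  # [0, 2, 4, 6]
```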
(Figure 2: The Synclavier)
Although the GROOVE system was developed mainly for experimental purposes, the Fairlight CMI and the Synclavier (shown above) were among the first to be developed for commercial applications and were used a great deal during the late seventies in movies and in high-budget pop music recordings. However, due to the enormous cost of these instruments, they were extremely rare; only the wealthiest recording studios and movie studios could afford them. Therefore, despite the considerable impact these machines had on people's perception of music creation and music performance in the latter part of the decade, their overall impact on the employment of musicians was negligible (Anonymous).
The music industry began to enter the second stage of insertion in the early 1980s due to two main factors: the development of MIDI (Musical Instrument Digital Interface), a powerful digital protocol that facilitates communication between electronic musical instruments, and the decreasing cost of consumer electronics, including digital synthesizers and personal computers.
At the start of the 1980s, the top synthesizer manufacturers, such as Korg, Roland, and Yamaha, began to face a serious dilemma. Their customers, still composed mostly of professional musicians and music studio executives, were beginning to demand a higher degree of compatibility among competing synthesizer brands. Most manufacturers were already providing for some degree of scalability within their own product lines, often allowing their instruments to be connected in order to facilitate the sharing of data in various ways. However, only limited compatibility existed among a few competing brands. Their solution was to cooperate in devising the MIDI protocol in 1983. MIDI is a digital communications protocol that includes a standard hardware interface, which can be used to connect multiple electronic musical devices that are similarly equipped. With multiple MIDI devices linked together, a single musician can use one device in the chain to control the performance of all the others. By striking a key on the controller device, the musician may send a message to one or all of the other devices, specifying which note or notes he or she wishes to invoke from them. These messages may also include instructions for the desired loudness, timbre, and other qualities of the note. It is also possible for the musician to create elaborate MIDI files that can be input into the controller device whenever he or she wishes to have that musical script acted out (Hubert and Runstein 154-157).
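The messages described above are small byte sequences. As a concrete illustration of the MIDI 1.0 wire format, the sketch below builds a three-byte Note On message: a status byte (0x90 plus the channel number) followed by a note number and a velocity (loudness), each in the range 0-127.

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI 1.0 Note On message.

    channel:  0-15  (encoded in the low nibble of the 0x90 status byte)
    note:     0-127 (middle C is note number 60)
    velocity: 0-127 (how hard the key was struck; 0 doubles as Note Off)
    """
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("MIDI field out of range")
    return bytes([0x90 | channel, note, velocity])


# Strike middle C on channel 0 at moderate velocity:
msg = note_on(0, 60, 64)
```

Because every conforming instrument interprets these same three bytes identically, one controller keyboard can drive an entire chain of devices from different manufacturers.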
By the mid-eighties, affordable personal computers and MIDI-equipped musical instruments began to flood the marketplace, setting the stage for a musical revolution. By this time, music technology had advanced to the point that it was becoming increasingly difficult to distinguish "real" music from synthesized music. Professional and amateur musicians now had tools at their disposal that allowed them to single-handedly emulate any assortment of instruments in any musical setting: the string quartet, the jazz ensemble, the symphonic orchestra, the marching band, and so on. Moreover, the desire to create a completely new style of music had already been born. Electronic music had laid its roots firmly in pop culture back in the 1960s through the experimentation done by bands like The Beatles, The Who, The Rolling Stones, and others. By the mid- to late eighties, this experimentation had matured into its own unique art form, and pop music became crowded with musical groups that used synthesizers and other computerized instruments not to enhance their music but to define it (Brabec and Brabec 210).
At the same time, the personal computer began to play a new and vital role in the process of creating music. Since the MIDI protocol uses binary digits to communicate information between musical instruments, it naturally lent itself to computer applications. It didn't take long for recording industry professionals to discover the virtues of computers in the recording studio, as they proved to be the ultimate controllers for devices within a MIDI chain. For the first time, professional-quality musical works could be composed and performed without the musician/programmer having to leave his or her computer workstation. One famous example of this technology's application dates to 1984 at Virgin Records, where the music for Tina Turner's "Private Dancer," one of the top-selling albums of the 80s, was created almost entirely on a computer-controlled MIDI workstation by producer Terry Britten (Brabec and Brabec 225).
Despite the many virtues of this powerful technology, its widespread use has had unfortunate and inevitable consequences for the more traditional musician. Whenever a computerized device is used in a professional capacity to emulate the performance of other instruments, employment opportunities for human performers are lost. Considering that the vast majority of music recordings since the early 1980s have contained at least some degree of digital synthesis, particularly in pop music, music for television, and music for movies, there can be no doubt that computer technology has negatively impacted the employment of musicians. It is simply more economical for employers of musicians to hire a single musician/programmer who is capable of performing the work of many. This has proven especially true in the hiring of musicians for work in television and movies, where the extremely high salaries paid to actors overshadow all other aspects of budgeting for the production. With computers and electronic instruments becoming increasingly powerful and increasingly inexpensive, it is unlikely that this trend will reverse, especially as it becomes harder and harder to distinguish between the human performer and the virtual one (Wacholtz 91).
Stage three of insertion began in the mid- to late 1990s, when affordable, professional-quality digital-audio recording software became widely available for the average desktop PC. These tools can turn any computer with a sound card into a professional recording studio, allowing the user to record, mix, and edit musical tracks input through the sound card. The most popular examples are Cakewalk's Sonar and Home Studio software lines and Syntrillium Software Corporation's Cool Edit Pro.
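The multitrack mixing such programs perform reduces, at bottom, to arithmetic on sample values. The following is a minimal sketch, not any particular product's implementation, assuming samples normalized to the range -1.0 to 1.0 (a common but not universal convention):

```python
def mix(track_a, track_b, gain_a=0.5, gain_b=0.5):
    """Mix two equal-length tracks of float samples (assumed in -1.0..1.0).

    Each output sample is a weighted sum of the two input samples,
    clamped so the mix cannot exceed the representable range
    (i.e., hard clipping instead of digital wrap-around).
    """
    return [max(-1.0, min(1.0, gain_a * a + gain_b * b))
            for a, b in zip(track_a, track_b)]


# Two short tracks mixed at equal gain:
mixed = mix([0.25, 1.0, -0.5], [0.75, 1.0, -1.0])  # [0.5, 1.0, -0.75]
```

Commercial software layers editing, effects, and dozens of tracks on top of this, but the underlying operation remains simple arithmetic that a 1990s desktop PC could perform in real time.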
Musicians can now produce high-quality digital recordings in the privacy of their own homes. No longer is it necessary to pay exorbitant fees for recording time in commercial recording studios (Hubert and Runstein 190). While it's too early to judge the overall impact of this technology on the music industry, it is certainly reasonable to predict that the effects will be profound.
The first stage of computer insertion into the fields of music creation and music performance took place in 1957 at Bell Labs, when Max Mathews created MUSIC I. MUSIC I was the first software application ever to facilitate the digital synthesis of music by a computer. Mathews' work at Bell Labs formed the basis for later work that resulted in the creation of computerized "hybrid" keyboard instruments in the 1970s.
The second stage of insertion occurred in the 1980s due to the development of MIDI (Musical Instrument Digital Interface) and the affordability of electronic devices such as synthesizers and personal computers. MIDI is a digital protocol that allows electronic musical instruments and computers to be linked together and to communicate with each other in various ways. One device in a MIDI chain, often a computer, is used to control the performance of the other instruments in the chain. This technology gave musicians unprecedented power in automating the process of creating and performing music. Although this automation expanded the creative powers of many musicians, it also had a negative impact on the employment of traditional musicians, who could now be easily replaced.
The third stage of insertion began in the mid- to late 1990s as digital-audio recording software became widely available for the personal computer. These applications allow computers to effectively replace the modern studio in the creation of music recordings and further enhance the ability of musicians to achieve autonomy in their musical endeavors.
The impact of computer technology on music and on society has been profound. In the 1950s, the use of computers in music was purely experimental and had virtually no impact on anyone except those directly involved in the experimentation. Within the span of only a few decades, however, computers have come to play a vital role in nearly every musical activity, impacting not only musicians but society as a whole. As consumers of music, we have all benefited from computer technology. Modern digital recording techniques allow a piece of music to be captured and reproduced in its original form, exactly the way the artist performed it. Digital copies of musical works will never suffer the degradation in quality that plagues analog recordings over time. For musicians, however, the evolution of music technology has been more of a double-edged sword. On the positive side, computer technology has opened up a whole new world to musicians. In composition and performance, they are no longer limited by the physical constraints of the past; they are limited now only by their imaginations. However, as with other fields in which computers now play a part, the application of computer technology in the field of music has, in many instances, led to the replacement of people by machines.
Anonymous. "Computer Music: Music I-V & GROOVE," "The Fairlight CMI," and "The Synclavier." 120 Years of Electronic Music: Electronic Musical Instruments 1870-1990. (11 Oct. 2001).
Brabec, Jeffrey, and Todd Brabec. Music, Money, and Success: The Insider's Guide to the Music Industry. New York: Schirmer Books, 1998.
Hubert, David, and Robert Runstein. Modern Recording Techniques. 4th ed. Boston: Focal Press, 2000.
Johnstone, Bob. "Wave of the Future." Wired: Archive. 1994. http://www.wired.com/wired/archive/2.03/waveguides.html (10 Oct. 2001).
Paradiso, Joseph A. "American Innovations In Electronic Musical Instruments." New Music Box: In The Third Person. 1999. http://www.newmusicbox.org/third-person/index_oct99.html (11 Oct. 2001).
Wacholtz, Larry E. Star Tracks: Principles for Success in the Music and Entertainment Industry. Nashville: Thumbs Up Publishing, 1997.
Figure 1: http://www.obsolete.com/120_years/machines/fairlight/index.html
Figure 2: http://www.obsolete.com/120_years/machines/synclavier/index.html
Figure 3: http://proaudiomusic.com/software/cakewalk/home_studio_big_shot.htm