National Chi Nan University
Department of Computer Science and Information Engineering

Master's Thesis

爵士風格之互動式音樂系統
Interactive Jazz Style System

Advisors: Dr. Herng-Yow Chen
          Dr. Keh-Ning Chang

Student: Zheng-Ying Tsai

December 2017
Acknowledgements
Looking back, my time in graduate school was truly fulfilling. I took on many difficult challenges, whether presenting in front of hundreds of people or serving as president of the pop music club and helping to organize the Inland Rock festival, things I had never imagined I would do. I learned a great deal about technology and music and about how to work with people, and I met many wonderful people and had many wonderful experiences. What I gained during this period far exceeded my expectations; I am full of nothing but gratitude, and I thank everyone I met along the way.

First, I thank my family. My mother and father raised me, always supported my decisions, and funded me throughout my studies so that I never lacked anything and could concentrate on my schoolwork. My sister constantly encouraged me, gave me suggestions on my thesis, and kept me supplied with tips on good food. My family's thorough care let me feel their warmth at every moment, even while living away from home.

I thank Professor Herng-Yow Chen and Professor Keh-Ning Chang for guiding my research so carefully and offering me many suggestions, from which I learned a great deal of professional knowledge in both music and technology.

I thank all my companions in MT LAB: 富能, the senior who always quietly looked after everyone in the lab, and the seniors 阿蔡, 煒志, 喇叭, Rain, and 小冠, who not only helped us get familiar with the lab but also took us out to eat and have fun; we were lucky to have seniors who took such good care of us. To 老頭, 小王, and senior 阿名, who struggled and did silly things together with me: your company kept my graduate life from being lonely. To the juniors 粘粘, 怡安, 彥成, 凱鈞, and 昱靜: although our time together was short, I am glad you joined the lab, and I am sure its future will be even more colorful.

I thank my partners in the pop music club, 橘子, 婉軒, 阿彬, and 廷瑋, for all kinds of support during my time as club president. Together we traveled around Taiwan to see many performances and completed three editions of Inland Rock; these are the most irreplaceable memories of my life.

I thank the teachers and friends at the studio, 桑桑, 軍政, 裕翔, and 童童, from whom I learned so much about instruments and music; we also did plenty of ridiculous things together and left many great memories. I am glad to have met such wonderful people.

Finally, I thank my high school classmates 庭璿, 佩儀, and 維恒, who encouraged me, gave me many suggestions, and asked me every time we met when my thesis would be finished. Look, here it is.
Title of Thesis: 爵士風格之互動式音樂系統 (Interactive Jazz Style System)

Department: Department of Computer Science and Information Engineering, College of Science and Technology, National Chi Nan University    Pages: 32

Graduation Time: December 2017    Degree: Master

Student: Zheng-Ying Tsai    Advisors: Dr. Herng-Yow Chen, Dr. Keh-Ning Chang

Chinese Abstract

Live jazz performance is fascinating. Compared with simply having a computer play back the music, a live performance of the same song can feel different every time. A live jazz performance is not merely playing the music as written on the score; what matters more is the interaction between the musicians and the improvisation of the song. This requires rich knowledge of music theory, abstract emotional expression, and rapport between the performers, abilities that all take a long time to cultivate. In an age of such advanced technology, we cannot help but wonder: can technology simulate the complexity of a live jazz performance?

This thesis aims to design a system that can perform songs in a jazz style. The system divides a song into two parts, the main melody and the accompaniment, and the complex music theory involved in performance is written into the system through algorithms. Users do not need to press the keys corresponding to the notes on the score, nor do they need to know music theory; by pressing any key on a MIDI keyboard, they can play the main melody or improvise variations on it. For the accompaniment, we design two functions, tempo change and pause, to simulate the interaction between musicians in a live performance. During a performance, the accompaniment played by the system reacts in different ways to the main melody, and through this interaction the user and the system simulate a live jazz performance.

Keywords: human-computer interaction, auto-accompaniment, jazz
Title of Thesis : Interactive Jazz Style System

Name of Institute : Department of Computer Science and Information Engineering,

College of Science and Technology, National Chi Nan University Pages: 32

Graduation Time: 12/2017 Degree: Master

Student Name: Zheng-Ying Tsai Advisor Name: Dr. Herng-Yow Chen

Dr. Keh-Ning Chang

Abstract
Live performance of jazz music is fascinating. It differs greatly from a CD recording: the same song can give many different feelings in live performance. A live jazz performance is not just playing the notes on the sheet music; it is the interaction between musicians and the improvisation of the song. It also involves a great deal of music theory, abstract expression of emotion, and synchronization between the musicians' minds, abilities that take a long time to cultivate. With technology as developed as it is today, we cannot help but wonder: can technology simulate something as complicated as jazz?

This thesis aims to design a system that can simulate the performance of jazz music. The system divides a song into two parts, the main melody and the accompaniment, and we encode the relevant music theory in the system through algorithms. Users neither need to follow the notes on the lead sheet nor learn any music theory. All they need to do is press any key on the MIDI keyboard, and they can play the main melody or improvise variations on it as they wish. For the accompaniment, we design two functions, "BPM Detection" and "Accompaniment Pause", to simulate the interaction between musicians. The accompaniment played by the system responds in different ways to the user's playing, and through this interaction the user and the system simulate the live performance of jazz music.

Keywords: Human-computer interaction, Auto-accompaniment, Jazz
Table of Contents

Acknowledgements……………………………………………………………………i

Chinese Abstract……………………………………………………………………ii

Abstract………………………………………………………………………………iii

Table of Contents……………………………………………………………………iv

List of Figures………………………………………………………………………vi

Chapter 1 Introduction……………………………………………………………… 1

1.1 Motivation…………………………………………………………………… 1

1.2 Background…………………………………………………………………… 2

1.3 Thesis Organization…………………………………………………………… 3

Chapter 2 Background Knowledge and Related Works……………………………… 4

2.1 MIDI…………………………………………………………………………… 4

2.1.1 MIDI Keyboard……………………………………………………… 5

2.2 Jazz Genre…………………………………………………………………… 6

2.2.1 Characteristics and Establishment……………………………………… 6

2.2.2 Swing…………………………………………………………………… 6

2.2.3 Improvisation………………………………………………………… 8

2.2.4 Structure……………………………………………………………… 9

2.2.5 Lead Sheet…………………………………………………………… 10

2.3 Related Works………………………………………………………………… 11

2.3.1 A.I. Duet……………………………………………………………… 11

2.3.2 Arranger keyboard…………………………………………………… 12

Chapter 3 Implementation……………………………………………………………13

3.1 System Overview……………………………………………………………… 13

3.2 Song Selection………………………………………………………………… 14

3.3 User Interface………………………………………………………………… 15

3.4 Melody and Accompaniment………………………………………………… 16

3.5 Start Performance………………………………………………………………16

3.5.1 BPM Detection……………………………………………………………17

3.5.2 Accompaniment Pause……………………………………………………20

3.5.3 Improvisation……………………………………………………………22

3.5.4 Performance End…………………………………………………………24

Chapter 4 Evaluation…………………………………………………………………25

Chapter 5 Conclusion and Future Work…………………………………………… 29

5.1 Conclusion…………………………………………………………………… 29

5.2 Future Work………………………………………………………………… 29

References……………………………………………………………………………30

1. English References……………………………………………………………30

2. Chinese References……………………………………………………………30

3. Web Resources…………………………………………………………………31

Appendix……………………………………………………………………………32

Appendix A Questionnaire about our system……………………………………32

List of Figures

Figure 1. Live performance with electronic musical instruments ……………………2

Figure 2. The information of MIDI events……………………………………………4

Figure 3. MIDI KeyBoard Controller ……………………………………………… 5

Figure 4. Swing 8th notes……………………………………………………………7

Figure 5. A straight C major scale in 8th notes………………………………………7

Figure 6. A swung 8th note feel C major scale………………………………………7

Figure 7. Chromatic approach…………………………………………………………8

Figure 8. The structure of Jazz music…………………………………………………9

Figure 9. Sheet music of Autumn Leaves……………………………………………10

Figure 10. Lead sheet of Autumn Leaves……………………………………………10

Figure 11. A.I. Duet…………………………………………………………………11

Figure 12. Arranger keyboard - KORG Pa900……………………………………12

Figure 13. The flow chart of our system…………………………………………14

Figure 14. The icon of three songs…………………………………………………14

Figure 15. User interface and Option bar……………………………………………15

Figure 16. The lineup of the song……………………………………………………16

Figure 17. The process of playing notes……………………………………………17

Figure 18. Users can turn BPM detection on or off with the checkbox……………18

Figure 19. The process of BPM Detection I…………………………………………18

Figure 20. The process of BPM Detection II………………………………………19

Figure 21. The process of BPM Detection III………………………………………19

Figure 22. The illustration when users play too fast…………………………………20

Figure 23. The process of Accompaniment Pause I…………………………………20


Figure 24. The process of Accompaniment Pause II…………………………………21

Figure 25. The process of Accompaniment Pause III………………………………21

Figure 26. The illustration when the accompaniment pauses…………………………22

Figure 27. Note repeat technique……………………………………………………23

Figure 28. Chromatic approach technique…………………………………………23

Figure 29. The diagram of Improvisation……………………………………………24

Figure 30. Basic information of subjects……………………………………………25

Figure 31. Result regarding friendliness of user interface…………………………26

Figure 32. Result regarding accompaniment………………………………………27

Figure 33. Result regarding improvisation…………………………………………27

Figure 34. Result regarding evaluation of system…………………………………28

Chapter 1 Introduction

1.1 Motivation

Jazz, a popular music style, creates a relaxed atmosphere with its unique swing rhythm. Improvisation gives the same song a different feeling every time, surprising the audience and helping them relieve the stress of daily life. Although jazz has a hundred years of history, it is still loved by people today. The most important part of jazz is improvisation. To improvise, however, one needs to understand chord structure and the use of rhythm, and to synchronize with the other musicians' minds. All these abilities take a long time to cultivate. The combination of so many complex elements is what makes jazz wonderful.

Technology is developing at tremendous speed. Computers can now do many things that only people could do before, such as holding a conversation, playing chess, and even driving a car or flying a plane. With computers becoming so capable, we cannot help but wonder: can a computer play something as complicated as jazz?

To explore this question, we designed a system that simulates the performance of jazz music. With a MIDI keyboard, users can easily play in a jazz ensemble or improvise. When users play along with our system, the accompaniment varies with their playing in real time, so users can perform enjoyable jazz music with our system.
1.2 Background
As technology has become pervasive, music and technology have gradually merged. With the support of technology, music has developed many different genres. Technological advances have also helped music in many ways, such as composition, recording, and even the coordination of live performances [J.-H Su 2011]. Music shows more possibilities with the support of technology.

Accompaniment, by definition, is playing in support of a soloist or an ensemble. Most accompaniment today is still performed by human beings, but with the advance of computer technology, automatic accompaniment has appeared and makes it more convenient to create and perform music.

Figure 1. Live performance with electronic musical instruments

1.3 Thesis Organization

The next chapter gives a brief introduction to jazz music and the technology we use. Chapter 3 provides an overview of our system and introduces the methods and user interface we implemented. To test the effectiveness of the system, Chapter 4 presents an evaluation. Finally, we give the conclusion and future work in Chapter 5.
Chapter 2 Background Knowledge and Related Works

2.1 MIDI

MIDI (Musical Instrument Digital Interface), a technical standard proposed by Dave Smith in 1981, solves the communication problem between different electronic musical devices [C.-H Wang 2012]. It represents music in a unified format of events. A note produces two MIDI events: an onset (note-on) event and an offset (note-off) event. The onset time is when a note starts to sound; the offset time is when it stops. These events carry the following information: pitch, velocity, duration, channel, and program (timbre) (Figure 2). Advantages of MIDI include compactness, ease of modification and manipulation, and a wide choice of instruments [H.-W Chang 2014].

Figure 2. The information of MIDI events
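As an illustration of such events (a minimal sketch, not code from this thesis, using the Python library mido), the note-on and note-off pair that make up a single note can be written as:

from mido import Message

# Hypothetical example: middle C (pitch 60) played on channel 0.
note_on = Message("note_on", note=60, velocity=90, channel=0, time=0)      # onset event
note_off = Message("note_off", note=60, velocity=0, channel=0, time=480)   # offset event, 480 ticks later

print(note_on)
print(note_off)

The duration of the note is the gap between the two events, and the program (timbre) is set by a separate program-change message.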

2.1.1 MIDI KeyBoard

A MIDI keyboard is a piano-style keyboard device used as a user interface. It sends MIDI signals or commands over a USB or MIDI cable to other devices connected to the same MIDI interface. A basic MIDI keyboard does not produce sound itself; instead, the MIDI information is sent to an electronic module capable of reproducing an array of digital sounds or samples that resemble traditional musical instruments. These samples or waveforms are also referred to as voices or timbres.

Figure 3. MIDI KeyBoard Controller

2.2 Jazz Genre

Jazz originated around the late 19th century, growing out of the music brought to America by African people, and it broke the idea that music is owned and enjoyed only by the upper class. Its development is closely intertwined with many other genres, such as gospel, church hymns, the blues, reggae, rhythm and blues (R&B), rock and roll, and, much later, rap. Jazz is one of the most important contributions to the culture and history of Western music [Marc. S 2003].

2.2.1 Characteristics and Establishment

Two characteristics of jazz are improvisation and swing. The creativity of improvisation and the agreeable swing feel make jazz both lively and cozy. The lineup of a jazz band is flexible: a pianist can play solo, duet with a trumpeter, or form a trio with a bassist and a drummer, and the band can grow into a quartet, quintet, sextet, or more. Depending on the lineup, the melody is often carried by reed, brass, or stringed instruments, while the rhythm is mainly provided by bass and drums. Such a lineup can be expanded into the so-called "Big Band" [Kai 2010].

2.2.2 Swing

Swing is a quality attributed to jazz performance. Though basic to the perception and performance of jazz, swing does not have a precise definition or description; most attempts describe it as a rhythmic phenomenon [Kuan 2016].

Although swing has no precise definition, we can describe it in a simple way. As shown in Figure 4, instead of two 8th notes dividing the beat exactly in half, the second 8th note in each pair occurs two-thirds of the way through the beat (equivalent to playing on the first and third notes of an 8th-note triplet). Figure 5 shows a straight C major scale in 8th notes; applying the method above yields a C major scale with a swing feel (Figure 6).

Figure 4. Swing 8th notes

Figure 5. A straight C major scale in 8th notes

Figure 6. A swung 8th note feel C major scale
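As a minimal sketch of the timing rule above (not code from this thesis), the swing feel can be produced by delaying every off-beat 8th note to the two-thirds point of its beat:

def swing_eighths(onsets_in_beats, swing_point=2/3):
    """Delay each off-beat 8th note so it lands `swing_point` of the way through its beat."""
    swung = []
    for t in onsets_in_beats:
        beat = int(t)
        frac = t - beat
        if abs(frac - 0.5) < 1e-9:          # a straight off-beat 8th note
            swung.append(beat + swing_point)
        else:                               # on-beat notes stay where they are
            swung.append(t)
    return swung

# One bar of straight 8th notes (Figure 5) becomes a swung bar (Figure 6):
print(swing_eighths([0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5]))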

2.2.3 Improvisation

Jazz improvisation is the process of spontaneously creating fresh melodies over a continuously repeating chord cycle. The improviser may rely on the structure of the original tune, or solely on the possibilities offered by the chords' harmonies [C.-L Lu 2009].

There are many improvisation techniques. One technique we use in our system is the chromatic approach [Chipin]. When the interval between two notes is a whole step or a half step, the performer inserts a chromatic passing tone between them. In Figure 7, we insert a semitone between the F and the E. The melody becomes a little unexpected and unrestrained, yet it keeps its original shape and does not sound rigid. This kind of phrase is commonly called an embellishment; the Berklee College of Music teaching system calls it indirect resolution. It is an improvisation device used by many jazz musicians.

Figure 7. Chromatic approach

In addition to improvising on the melody, we also improvise on the rhythm. A jazz performance does not always follow the sheet music strictly. As with the other kinds of improvisation, although the rhythm changes, the melody retains its original shape. This makes the music more lively.
2.2.4 Structure

Most jazz is based on a form that is actually quite similar to the sonata form from

classical theory: an optional introduction, the exposition or theme (possibly repeated),

the development section (improvisation), and the recapitulation, possibly followed by

a coda [Marc. S 2003].

Figure 8 shows the structure of jazz music. The introduction, if present, sets the tone

for the song; the exposition is the main melody; the development section is where the

composer extends the ideas of the exposition; the recapitulation is a restatement of the

theme; and the coda is an ending.

Figure 8. The structure of Jazz music

2.2.5 Lead Sheet

Unlike full sheet music (Figure 9), a lead sheet is a form of musical notation that specifies only the essential elements of a song: the melody, the lyrics, and the harmony (Figure 10). The melody is written in modern Western music notation, the lyrics are written as text below the staff, and the harmony is specified with chord symbols above the staff. The lead sheet does not describe the chord voicings, voice leading, bass line, or other aspects of the accompaniment [Benward. B 2003]. These are specified later by the composer or arranger, or improvised by the performers.

Figure 9. Sheet music of Autumn Leaves

Figure 10. Lead sheet of Autumn Leaves


2.3 Related Works

There are some implementations of algorithmic composition that are similar to our system and inspired us in some respects. We give a brief introduction to them below.

2.3.1 A.I. Duet

A.I. Duet is a machine learning application developed by Yotam Mann of Google Creative Lab [Yotam 2016]. Users can play a piano duet with the computer. Figure 11 shows the GUI of A.I. Duet: users play music by clicking the keys of the on-screen piano. The idea of this software and the way the computer interacts with the user's playing are similar to our system. Users just need to play some notes, and the computer responds with a melody that acts like an accompaniment. Users do not even have to know how to play the piano; they just press some keys and listen to what comes back. It is an experiment in how machine learning can inspire people to be creative in new ways.

Figure 11. A.I. Duet


2.3.2 Arranger Keyboard

The arranger keyboard is a common instrument in live performance. The user plays chords, and the arranger keyboard generates the accompaniment automatically, making the music fuller and more distinctive. The accompaniment instruments include drums, bass, and orchestral sounds, and the accompaniment styles are diverse, including jazz, rock, bossa nova, hip hop, house, and so on. The way an arranger keyboard plays accompaniment is similar to our accompaniment system, but there is a difference: the accompaniment of an arranger keyboard requires the user's operation, whereas the accompaniment functions we provide, "BPM Detection" and "Accompaniment Pause", operate automatically according to the user's playing.

Figure 12. Arranger keyboard – KORG Pa900

Chapter 3 Implementation

This chapter introduces the user interface, explains how our system operates, shows what functions the system has, and explains how these functions are designed.

3.1 System Overview

Figure 13 shows the flow chart of our system.

Step 1: Select a song and set the BPM (beats per minute).

Step 2: After a song is selected, the system shows its lead sheet on the interface. Users can start the performance at any time, once they know the lead sheet well.

Step 3: When the performance begins, users play the main melody. In the course of the performance, they can freely improvise on the melody using the MIDI keyboard.

Step 4: The performance is complete when the song finishes. Users can then select another song to perform.

Figure 13. The flow chart of our system

3.2 Song Selection

Our system provides three songs for users to play: "Autumn Leaves", "Fly Me to the Moon", and "Jingle Bell Rock". The first two are selected because they are very popular jazz standards. We also hope our system can give a jazz feel to any song, so we chose "Jingle Bell Rock" as the third.

Figure 14. The icon of three songs

3.3 User Interface

An optimal GUI should be intuitive and simple to operate [W.-C Tai 2000]. We

design a friendly GUI which allows users to select songs and play music by clicking a

few buttons.

Figure 15. User interface and Option bar

Before playing, users have to make sure the MIDI keyboard is connected. All the instruments are prepared when the "READY" button is pressed. Then users can start the performance with the MIDI keyboard.

3.4 Melody and Accompaniment

The establishment of our songs is jazz trio. Three instruments are involved.

Piano plays the melody. Double bass and drum play the accompaniment. Users play

the piano as the main melody in the course of song. The double bass and the drum

will be played by system as accompaniment.

Figure 16. The lineup of the song

The songs in our system have been converted to MIDI files with DAW (digital audio workstation) software. We can therefore capture the MIDI events of a song, namely pitch, velocity, duration, channel, and timbre. With this information, we can control how the instruments are played so that the music has plenty of expression.
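The thesis does not specify which MIDI library is used; as a minimal sketch under the assumption that the exported file has a track named "Piano" for the melody (both the file name and the track name here are hypothetical), the events could be captured with the Python library mido as follows:

import mido

mid = mido.MidiFile("autumn_leaves.mid")   # hypothetical DAW export

melody, accompaniment = [], []
for track in mid.tracks:
    target = melody if track.name == "Piano" else accompaniment
    abs_ticks = 0
    for msg in track:
        abs_ticks += msg.time              # delta times -> absolute time in ticks
        if msg.type == "note_on" and msg.velocity > 0:
            target.append((abs_ticks, msg.note, msg.velocity, msg.channel))

print(len(melody), "melody notes,", len(accompaniment), "accompaniment notes")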

3.5 Start Performance

Before users begin to play, they have to press the "READY" button. This button initializes all the instruments and ensures they are ready before the user starts playing. Users can also restart the performance with this button.

While playing with the MIDI keyboard, users do not need to follow the notes on the lead sheet or press the corresponding keys. They only have to press any key on the MIDI keyboard, and the melody notes are played one after another according to the lead sheet. As a result, users only need to pay attention to the timing of the notes; no knowledge of music theory is necessary.

Figure 17. The process of playing notes
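A minimal sketch of this behaviour (hypothetical names; the thesis does not publish its code): the system keeps a cursor into the ordered list of melody notes and advances it on every key press, so the pressed key only decides when and how loudly the next lead-sheet note sounds.

class MelodyCursor:
    """Plays the lead-sheet melody in order, no matter which key was pressed."""

    def __init__(self, melody_notes, synth):
        self.notes = melody_notes     # [(abs_ticks, pitch, velocity, channel), ...]
        self.index = 0
        self.synth = synth            # assumed: any object with a play(pitch, velocity) method

    def on_key_press(self, midi_key, key_velocity):
        if self.index >= len(self.notes):
            return                                # the song is finished
        _, pitch, _, _ = self.notes[self.index]
        self.synth.play(pitch, key_velocity)      # pitch from the lead sheet, dynamics from the user
        self.index += 1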

The purpose of our system is to investigate whether computers can simulate jazz performance. Therefore, we minimize the manual controls so that users can play music with only a few buttons.

3.5.1 BPM Detection

In the course of a performance, if one instrument starts to change speed, the other instruments follow it to the same speed; playing faster or slower gives the music a different atmosphere. We hope our system can accompany like a real musician and follow the user's BPM. Therefore, we design a BPM detection function with which the accompaniment changes its BPM in real time along with the user's playing.
Figure 18. Users can turn BPM detection on or off with the checkbox

To obtain the BPM of the user's performance immediately, we calculate the BPM of the melody as follows. We define a variable, "ClickedTimeDiff", as the time difference between successive key presses. Since BPM means beats per minute, dividing 60 seconds by "ClickedTimeDiff" gives the BPM.

Figure 19. The process of BPM Detection I

However, because notes have different durations, the method above only works for continuous quarter notes. If there is a half note or a whole note, the system's calculation will be incorrect.
Figure 20. The process of BPM Detection II

To fix the problem of note duration, we define another variable, "TimeStampDiff", and use the timestamps of the MIDI events to calculate the time difference between successive notes. The timestamps are captured from the MIDI file of the music and record the point in time of every note. "ClickedTimeDiff" is the time difference between the two notes the user actually played, and "TimeStampDiff" is the time difference between the same two notes in the MIDI events [C.-Y Yu 2002]. As shown in Figure 21, dividing 60 seconds by "ClickedTimeDiff" and multiplying by "TimeStampDiff" (the corresponding interval expressed in beats) gives the current BPM of the melody.

Figure 21. The process of BPM Detection III
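A minimal sketch of this calculation (hypothetical names, not the thesis's code; it assumes "TimeStampDiff" is expressed in beats on the lead sheet):

import time

class BpmDetector:
    """Estimates the user's tempo from the spacing of their key presses."""

    def __init__(self, default_bpm):
        self.bpm = default_bpm
        self.last_press = None

    def on_key_press(self, time_stamp_diff):
        # time_stamp_diff: distance in beats between the previous melody note and this one
        now = time.monotonic()
        if self.last_press is not None and time_stamp_diff > 0:
            clicked_time_diff = now - self.last_press        # seconds between the two key presses
            self.bpm = 60.0 / clicked_time_diff * time_stamp_diff
        self.last_press = now
        return self.bpm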

When users play too fast, the system shows an illustration warning them that they are playing too fast (Figure 22).
Figure 22. The illustration when users play too fast

3.5.2 Accompaniment Pause

In the course of a performance, one of the instruments may pause without warning, and the other instruments then pause as well. To simulate this situation, we design an "Accompaniment Pause" function: when the user playing the melody stops for a while, the accompaniment pauses too. Users can then choose to restart or to continue; to continue, they simply resume playing the MIDI keyboard, and the accompaniment picks up from the paused measure.

The accompaniment pauses after the user has not played for one measure. To know when the accompaniment should pause, we define a variable, "timeIntervalToPauseAcc", that gives the number of seconds of silence after which the accompaniment pauses. Its value roughly equals one measure.

Figure 23. The process of Accompaniment Pause I

First, we divide 60 seconds by the default BPM to get the seconds per beat. "playRateScalar" represents the current BPM magnification; dividing by "playRateScalar" gives the seconds per beat at the current BPM.

Figure 24. The process of Accompaniment Pause II

Finally, this value is multiplied by "accPauseDelay", a variable that determines how long the accompaniment waits before pausing. With the calculation above, the system knows when the accompaniment should pause.

Figure 25. The process of Accompaniment Pause III
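A minimal sketch of this timing logic (hypothetical names, not the thesis's code; "accPauseDelay" is assumed here to be a count of beats roughly equal to one measure):

import threading

class AccompanimentPauser:
    """Pauses the accompaniment after the user is silent for about one measure."""

    def __init__(self, default_bpm, accompaniment, acc_pause_delay=4.0):
        self.default_bpm = default_bpm
        self.acc_pause_delay = acc_pause_delay   # assumed: beats of silence allowed (~one measure)
        self.accompaniment = accompaniment       # assumed: any object with a pause() method
        self._timer = None

    def on_key_press(self, play_rate_scalar):
        seconds_per_beat = (60.0 / self.default_bpm) / play_rate_scalar
        time_interval_to_pause_acc = seconds_per_beat * self.acc_pause_delay

        if self._timer is not None:              # every key press restarts the countdown
            self._timer.cancel()
        self._timer = threading.Timer(time_interval_to_pause_acc, self.accompaniment.pause)
        self._timer.start()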

We merge this function with the play button: every time the user presses the play button, the timer inside it resets and starts. As long as the user continues playing, the accompaniment continues playing; if the user does not play for "timeIntervalToPauseAcc" seconds, the accompaniment pauses.

Figure 26. The illustration when the accompaniment pauses

3.5.3 Improvisation

For the main melody, we design some simple improvisation techniques. Users can perform with nothing but intuition, without any music theory.

The first technique is improvisation on rhythm. Once the music starts, users can play the notes on the lead sheet with any keys of the MIDI keyboard; they do not need to press the corresponding keys or follow the rhythm written on the lead sheet. They can change the rhythm of the melody as they prefer, which does not affect BPM detection but gives the melody an unrestrained feeling.

The second technique is note repeat. When the user presses the same MIDI key twice in quick succession, the note that was just played is played again.
Figure 27. Note repeat technique

The last technique is the chromatic approach. As shown in Figure 28, when the user plays two different keys in quick succession, the system inserts a semitone before the F and the B: the inserted note is a half step away from the next note the user is about to play.

Figure 28. Chromatic approach technique
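A minimal sketch of how these two key-press patterns could be detected (hypothetical names and threshold, not the thesis's code; it assumes a melody cursor like the earlier sketch, extended with previous_pitch() and next_pitch() helpers):

import time

QUICK_SUCCESSION = 0.15   # assumed threshold, in seconds, for "quick succession"

class Improviser:
    """Adds note-repeat and chromatic-approach notes on top of the melody cursor."""

    def __init__(self, cursor, synth):
        self.cursor = cursor
        self.synth = synth
        self.last_key = None
        self.last_time = None

    def on_key_press(self, midi_key, velocity):
        now = time.monotonic()
        quick = self.last_time is not None and (now - self.last_time) < QUICK_SUCCESSION

        if quick and midi_key == self.last_key:
            # note repeat: play the note the user just played once more
            self.synth.play(self.cursor.previous_pitch(), velocity)
        elif quick and midi_key != self.last_key:
            # chromatic approach: sound a note a half step away from the upcoming
            # lead-sheet note (a half step below is used here as an example)
            self.synth.play(self.cursor.next_pitch() - 1, velocity)
            self.cursor.on_key_press(midi_key, velocity)
        else:
            self.cursor.on_key_press(midi_key, velocity)

        self.last_key, self.last_time = midi_key, now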

All of these improvisation techniques are simple and intuitive, and they require no knowledge of music theory. Users can therefore play jazz-feeling music if they know the melody well and apply the three techniques appropriately during the performance.
Figure 29. The diagram of Improvisation

3.5.4 Performance End

When the music is over, users can decide to play another song or play the same one again. After selecting a song, they can restart the performance with the "READY" and "PLAY" buttons.
Chapter 4 Evaluation

To test whether our system can simulate jazz music, we invited ten musicians to use it and give us feedback. All of them have some experience playing an instrument and playing in an ensemble. Figure 30 shows the basic information of the subjects.

Figure 30. Basic information of subjects

According to Figure 31, most users agree that the illustrations in our GUI are clear. They also agree that the system is smooth to use; only one user neither agreed nor disagreed with this statement.
Figure 31. Result regarding friendliness of user interface

In Figure 32, most users agree that our system can accompany like a real musician, but four users neither agreed nor disagreed. We suspect this is because the MIDI timbre does not sound like a real instrument, or because the interaction between the user and the system cannot fully simulate the synchronization between musicians' minds. The BPM detection function did not receive a good rating in Figure 32; we conjecture that the way the BPM changes is too sensitive to meet users' expectations.
Figure 32. Result regarding accompaniment

Regarding improvisation, most users agree that the improvisation is pleasing. This suggests that the improvisation we designed is reasonable and follows music theory, and that users can play jazz-feeling music through our system (Figure 33). Our system is easy to play and does not require knowledge of music theory, so most users agree that it is simple to improvise with.

Figure 33. Result regarding improvisation


According to Figure 34, users can become acquainted with the music through our system. Most users agree that our system can play as an ensemble, but three users neither agreed nor disagreed. We think this may be a synchronization problem: if the melody runs too far ahead of the accompaniment, the accompaniment cannot catch up with it. Only one user disagreed that our system can simulate the performance of jazz music; we suspect this is because our system does not support full improvisation, only melodic variations.

Figure 34. Result regarding evaluation of system

Chapter 5 Conclusion and Future Work

5.1 Conclusion

We have developed a system that simulates jazz performance. Users can play enjoyable jazz music with it, improvising through simple operations in the course of a performance, while the system's accompaniment makes a variety of changes in response to the user's playing.

Although many parts of our system still need to be improved, we believe it can serve as a prototype of a music performance simulation system. We look forward to technology that can help take musical development to a higher level in the future.

5.2 Future Work

Our system allows users to improvise on the main melody, but jazz improvisation is more than melodic variations. We hope to add a dedicated solo section to the music: during the solo section, users could improvise freely, away from the main melody, by pressing any key on the MIDI keyboard, and when the section is finished, the main melody and accompaniment would return to the original music. We also hope to include more kinds of instruments, such as trumpet, saxophone, or oboe, so the music can be fuller and more distinctive.
References

1. English References

[Masataka. G 1998] Masataka. G and Yoichi. M, "An Audio-based Real-time Beat Tracking System and Its Applications," Proceedings of the ICMC, pp. 17-20, 1998.

[Jonathan. F 2001] Jonathan. F and Shingo. U, "The beat spectrum: a new approach to rhythm analysis," Proceedings of the 2001 IEEE International Conference on Multimedia and Expo, pp. 22-25, 2001.

[Benward. B 2003] Benward. B and Saker. M, Music: In Theory and Practice, Vol. I, p. 76, 7th Edition, ISBN 978-0-07-294262-0, 2003.

[C.-L Lu 2009] Chung-Li Lu, "ICYS: An Interactive Jazz Piano Accompaniment System," master's thesis, College of Electrical Engineering and Computer Science, National Taiwan University, pp. 1-71, 2009.

2. Chinese References

[W.-C Tai 2000] Wen-Chin Tai, "視覺化音樂編輯系統的設計與實作," master's thesis, Department of Computer Science and Information Engineering, National Chiao Tung University, pp. 1-44, 2000.

[C.-Y Yu 2002] Ching-Yeh Yu, "音樂節拍辨識," master's thesis, Department of Electrical Engineering, National Taiwan University, pp. 1-44, 2002.

[H.-R Yang 2004] Hen-Ru Yang, "即時抓取節拍之研究," master's thesis, Department of Computer Science and Information Engineering, Tatung University, pp. 1-30, 2004.

[J.-H Su 2011] Jin-Hwei Su, "電腦科技在音樂創作教學上的發展與應用," 人文社會學報, vol. 7, no. 2, pp. 21-35, 2011.

[C.-H Wang 2012] Chia-Hua Wang, "互動型彈琴機器手," master's thesis, Department of Electrical Engineering, National Central University, pp. 1-56, 2012.

[T.-T Wu 2012] Ting-Tsao Wu, "音樂節拍與節奏的即時分析," master's thesis, Department of Automatic Control Engineering, Feng Chia University, pp. 1-68, 2012.

[H.-W Chang 2014] Hui-Wen Chang, "Midi 音源之節拍判定," master's thesis, Department of Applied Mathematics, National Dong Hwa University, pp. 1-38, 2014.

3. Web Resources

[Chipin] http://www.chipinkaiyajazz.com/p/archives.html

[Jazz] http://www.drumnet.tw/jaz1.htm

[MIDI] http://www.recordingblogs.com/sa/Wiki?topic=MIDI+event

[V.Dominique] http://www.music-software-development.com/index.html

[Marc. S 2003] "A Jazz Improvisation Primer," http://outsideshore.com/, 2003.

[Kai 2010] http://freejazz.pixnet.net/blog, 2010.

[Kuan 2016] http://nicechord.com, 2016.

[Yotam 2016] https://experiments.withgoogle.com/ai/ai-duet/view/, 2016.
Appendix A Questionnaire about our system
