<documents>
<document>
<name>nime2001_003.pdf</name>
<abstract>This paper will present observations on the design, artistic, and human factors of creating digital music controllers. Specific projects will be presented, and a set of design principles will be supported from those examples. </abstract>
<keywords>Musical control, artistic interfaces. </keywords>
</document>
<document>
<name>nime2001_007.pdf</name>
<abstract>Over the last four years, we have developed a series of lectures, labs and project assignments aimed at introducing enough technology so that students from a mix of disciplines can design and build innovative interface devices.</abstract>
<keywords>Input devices, music controllers, CHI technology, courses. </keywords>
</document>
<document>
<name>nime2001_011.pdf</name>
<abstract>In this paper we describe our efforts towards the development of live performance computer-based musical instrumentation. Our design criteria include initial ease of use coupled with a long term potential for virtuosity, minimal and low variance latency, and clear and simple strategies for programming the relationship between gesture and musical result. We present custom controllers and unique adaptations of standard gestural interfaces, a programmable connectivity processor, a communications protocol called Open Sound Control (OSC), and a variety of metaphors for musical control. We further describe applications of our technology to a variety of real musical performances and directions for future research.</abstract>
</document>
<document>
<name>nime2001_015.pdf</name>
<abstract>This paper reviews the existing literature on input device evaluation and design in human-computer interaction (HCI) and discusses possible applications of this knowledge to the design and evaluation of new interfaces for musical expression. Specifically, a set of musical tasks is suggested to allow the evaluation of different existing controllers.</abstract>
<keywords>Input device design, gestural control, interactive systems </keywords>
</document>
<document>
<name>nime2001_019.pdf</name>
<abstract>This paper presents the interface developments and music of the duo "interface" formed by Curtis Bahn and Dan Trueman. We describe gestural instrument design, interactive performance interfaces for improvisational music, spherical speakers (multi-channel, outward-radiating geodesic speaker arrays) and Sensor-Speaker-Arrays (SenSAs: combinations of various sensor devices with spherical speaker arrays). We discuss the concept, design and construction of these systems, and give examples from several new published CDs of work by Bahn and Trueman.</abstract>
<keywords>Interactive Music Performance, Gestural Interface, Sonic Display, Sensor/Speaker Array, SenSA.</keywords>
</document>
<document>
<name>nime2001_024.pdf</name>
<keywords>multidimensionality, control, resonance, pitch tracking </keywords>
</document>
<document>
<name>nime2001_027.pdf</name>
<abstract>The Accordiatron is a new MIDI controller for real-time performance based on the paradigm of a conventional squeeze box or concertina. It translates the gestures of a performer to the standard communication protocol of MIDI, allowing for flexible mappings of performance data to sonic parameters. When used in conjunction with a real time signal processing environment, the Accordiatron becomes an expressive, versatile musical instrument. A combination of sensory outputs providing both discrete and continuous data gives the subtle expressiveness and control necessary for interactive music.</abstract>
<keywords>MIDI controllers, computer music, interactive music, electronic musical instruments, musical instrument design, human computer interface </keywords>
</document>
<document>
<name>nime2001_030.pdf</name>
<abstract>The technologies behind passive resonant magnetically coupled tags are introduced and their application as a musical controller is illustrated for solo or group performances, interactive installations, and music toys.</abstract>
<keywords>RFID, resonant tags, EAS tags, musical controller, tangible interface </keywords>
</document>
<document>
<name>nime2001_034.pdf</name>
<abstract>In this paper, we introduce our research challenges for creating new musical instruments using everyday-life media with intimate interfaces, such as the self-body, clothes, water and stuffed toys. Various sensor technologies including image processing and general touch sensitive devices are employed to exploit these interaction media. The focus of our effort is to provide user-friendly and enjoyable experiences for new music and sound performances. Multimodality of musical instruments is explored in each attempt. The degree of controllability in the performance and the richness of expressions are also discussed for each installation. </abstract>
<keywords>New interface, music controller, dance, image processing, water interface, stuffed toy </keywords>
</document>
<document>
<name>nime2001_038.pdf</name>
<abstract>The MATRIX (Multipurpose Array of Tactile Rods for Interactive eXpression) is a new musical interface for amateurs and professionals alike. It gives users a 3-dimensional tangible interface to control music using their hands, and can be used in conjunction with a traditional musical instrument and a microphone, or as a stand-alone gestural input device. The surface of the MATRIX acts as a real-time interface that can manipulate the parameters of a synthesis engine or effect algorithm in response to a performer's expressive gestures. One example is to have the rods of the MATRIX control the individual grains of a granular synthesizer, thereby "sonically sculpting" the microstructure of a sound. In this way, the MATRIX provides an intuitive method of manipulating sound with a very high level of real-time control.</abstract>
<keywords>Musical controller, tangible interface, real-time expression, audio synthesis, effects algorithms, signal processing, 3-D interface, sculptable surface </keywords>
</document>
<document>
<name>nime2001_051.pdf</name>
<abstract>This paper reviews a number of projects that explore building electronic musical [...], interfaces and objects designed to be used and enjoyed by anybody but in particular those who do not see themselves as naturally musical. In reflecting on the strengths of these projects, interesting directions for similar work in the future are considered.</abstract>
<keywords>Play, exploration, sound mapping, engaging content, sound design.</keywords>
</document>
<document>
<name>nime2002_001.pdf</name>
<abstract>In this paper we describe the digital emulation of an optical photosonic instrument. First we briefly describe the optical instrument which is the basis of this emulation. Then we give a musical description of the instrument implementation and its musical use and we conclude with the "duo" possibility of such an emulation.</abstract>
<keywords>Photosonic synthesis, digital emulation, Max-Msp, gestural devices. </keywords>
</document>
<document>
<name>nime2002_005.pdf</name>
<abstract>In this paper we will have a short overview of some of the systems we have been developing as an independent company over the last years. We will focus especially on our latest experiments in developing wireless gestural systems using the camera as an interactive tool to generate 2D and 3D visuals and music. </abstract>
</document>
<document>
<name>nime2002_010.pdf</name>
<abstract>This paper describes the design and development of several musical instruments and MIDI controllers built by David Bernard (as part of The Sound Surgery project: www.thesoundsurgery.co.uk) and used in club performances around Glasgow during 1995-2002. It argues that changing technologies and copyright are shifting our understanding of music from "live art" to "recorded medium" whilst blurring the boundaries between sound and visual production.</abstract>
<keywords>Live electronic music, experimental instruments, MIDI controllers, audio-visual synchronisation, copyright, SKINS digital hand drum. </keywords>
</document>
<document>
<name>nime2002_012.pdf</name>
<abstract>This paper discusses the Jam-O-Drum multi-player musical controller and its adaptation into a gaming controller interface known as the Jam-O-Whirl. The Jam-O-World project positioned these two controller devices in a dedicated projection environment that enabled novice players to participate in immersive musical gaming experiences. Players' actions, detected via embedded sensors in an integrated tabletop surface, control game play, real-time computer graphics and musical interaction. Jam-O-World requires physical and social interaction as well as collaboration among players. </abstract>
<keywords>Collaboration, computer graphics, embedded sensors, gaming controller, immersive musical gaming experiences, musical controller, multi-player, novice, social interaction.</keywords>
</document>
<document>
<name>nime2002_038.pdf</name>
<abstract>Mapping, which describes the way a performer's controls are connected to sound variables, is a useful concept when applied to the structure of electronic instruments modelled after traditional acoustic instruments. But mapping is a less useful concept when applied to the structure of complex and interactive instruments in which algorithms generate control information. This paper relates the functioning and benefits of different types of electronic instruments to the structural principles on which they are based. Structural models of various instruments will be discussed and musical examples played. </abstract>
<keywords>mapping fly-by-wire algorithmic network interactivity instrument deterministic indeterministic</keywords>
</document>
<document>
<name>nime2002_043.pdf</name>
<abstract>This paper describes a virtual musical instrument based on the scanned synthesis technique and implemented in Max-Msp. The device is composed of a computer and three gesture sensors. The timbre of the produced sound is rich and changing. The instrument proposes an intuitive and expressive control of the sound thanks to a complex mapping between gesture and sound. </abstract>
</document>
<document>
<name>nime2002_050.pdf</name>
<abstract>In this paper we describe three new music controllers, each designed to be played by two players. As the intimacy between two people increases so does their ability to anticipate and predict the other's actions. We hypothesize that this intimacy between two people can be used as a basis for new controllers for musical expression. Looking at ways people communicate non-verbally, we are developing three new instruments based on different communication channels. The Tooka is a hollow tube with a pressure sensor and buttons for each player. Players place opposite ends in their mouths and modulate the pressure in the tube with their tongues and lungs, controlling sound. Coordinated button presses control the music as well. The Pushka, yet to be built, is a semi-rigid rod with strain gauges and position sensors to track the rod's position. Each player holds opposite ends of the rod and manipulates it together. Bend, end point position, velocity and acceleration and torque are mapped to musical parameters. The Pullka, yet to be built, is simply a string attached at both ends with two bridges. Tension is measured with strain gauges. Players manipulate the string tension at each end together to modulate sound. We are looking at different musical mappings appropriate for two players.</abstract>
<keywords>Two person musical instruments, intimacy, human-human communication, cooperative music, passive haptic interface </keywords>
</document>
<document>
<name>nime2002_056.pdf</name>
<abstract>The Cardboard Box Garden (CBG) originated from a dissatisfaction with current computer technology as it is presented to children. This paper shall briefly review the process involved in the creation of this installation, from motivation through to design and subsequent implementation and user experience with the CBG. Through the augmentation of an everyday artefact, namely the standard cardboard box, a simple yet powerful interactive environment was created that has achieved its goal of stirring children's imagination - judging from the experience of our users.</abstract>
<keywords>Education, play, augmented reality, pervasive computing, disappearing computer, assembly, cardboard box</keywords>
</document>
<document>
<name>nime2002_059.pdf</name>
<abstract>Research and musical creation with gestural-oriented interfaces have recently seen a renewal of interest and activity at Ircam [1][2]. In the course of several musical projects, undertaken by young composers attending the one-year Course in Composition and Computer Music or by guest artists, Ircam Education and Creation departments have proposed various solutions for gesture-controlled sound synthesis and processing. In this article, we describe the technical aspects of AtoMIC Pro, an Analog to MIDI converter proposed as a re-usable solution for digitizing several sensors in different contexts such as interactive sound installation or virtual instruments. The main direction of our research, and of this one in particular, is to create tools that can be fully integrated into an artistic project as a real part of the composition and performance processes.</abstract>
<keywords>Gestural controller, Sensor, MIDI, Music.</keywords>
</document>
<document>
<name>nime2002_065.pdf</name>
<abstract>We explore the role that metaphor plays in developing expressive devices by examining the MetaMuse system. MetaMuse is a prop-based system that uses the metaphor of rainfall to make the process of granular synthesis understandable. We discuss MetaMuse within a framework we call "transparency" that can be used as a predictor of the expressivity of musical devices. Metaphor depends on a literature, or cultural basis, which forms the basis for making transparent device mappings. In this context we evaluate the effect of metaphor in the MetaMuse system.</abstract>
<keywords>Expressive interface, transparency, metaphor, prop-based controller, granular synthesis.</keywords>
</document>
<document>
<name>nime2002_071.pdf</name>
<abstract>The use of free gesture in making music has usually been confined to instruments that use direct mappings between movement and sound space. Here we demonstrate the use of categories of gesture as the basis of musical learning and performance collaboration. These are used in a system that reinterprets the approach to learning through performance that is found in many musical cultures and discussed here through the example of Kpelle music. </abstract>
<keywords>Collaboration, Performance, Metaphor, Gesture </keywords>
</document>
<document>
<name>nime2002_073.pdf</name>
<abstract>This paper presents a novel coupling of haptics technology and music, introducing the notion of tactile composition or aesthetic composition for the sense of touch. A system that facilitates the composition and perception of intricate, musically structured spatio-temporal patterns of vibration on the surface of the body is described. An initial test of the system in a performance context is discussed. The fundamental building blocks of a compositional language for touch are considered. </abstract>
</document>
<document>
<name>nime2002_080.pdf</name>
<abstract>The Circular Optical Object Locator is a collaborative and cooperative music-making device. It uses an inexpensive digital video camera to observe a rotating platter. Opaque objects placed on the platter are detected by the camera during rotation. The locations of the objects passing under the camera are used to generate music. </abstract>
<keywords>Input devices, music controllers, collaborative, real-time score manipulation. </keywords>
</document>
<document>
<name>nime2002_082.pdf</name>
<abstract>We have created a new electronic musical instrument, referred to as the Termenova (Russian for "daughter of Theremin") that combines a free-gesture capacitive-sensing device with an optical sensing system that detects the reflection of a hand when it intersects a beam of an array of red lasers. The laser beams, which are made visible by a thin layer of theatrical mist, provide visual feedback and guidance to the performer to alleviate the difficulties of using a non-contact interface as well as adding an interesting component for the audience to observe. The system uses capacitive sensing to detect the proximity of the player's hands; this distance is mapped to pitch, volume, or other continuous effect. The laser guide positions are calibrated before play with position-controlled servo motors interfaced to a main controller board; the location of each beam corresponds to the position where the performer should move his or her hand to achieve a pre-specified pitch and/or effect. The optical system senses the distance of the player's hands from the source of each laser beam, providing an additional dimension of musical control.</abstract>
<keywords>Theremin, gesture interface, capacitive sensing, laser harp, optical proximity sensing, servo control, musical controller </keywords>
</document>
<document>
<name>nime2002_094.pdf</name>
<keywords>musical controller, Tactex, tactile interface, tuning systems</keywords>
</document>
<document>
<name>nime2002_101.pdf</name>
<abstract>We are interested in exhibiting our programs at your demo section at the conference. We believe that the subject of your conference is precisely what we are experimenting with in our musical software. </abstract>
<keywords>Further info on our website: http://www.ixi-software.net.</keywords>
</document>
<document>
<name>nime2002_102.pdf</name>
<abstract>In this paper we present Afasia, an interactive multimedia performance based in Homer's Odyssey [2]. Afasia is a one-man digital theater play in which a lone performer fitted with a sensor-suit conducts, like Homer, the whole show by himself, controlling 2D animations, DVD video and conducting the music mechanically performed by a robot quartet. After contextualizing the piece, all of its technical elements, starting with the hardware input and output components, are described. A special emphasis is given to the interactivity strategies and the subsequent software design. Since its first version premiered in Barcelona in 1998, Afasia has been performed in many European and American countries and has received several international awards. </abstract>
<keywords>Multimedia interaction, musical robots, real-time musical systems.</keywords>
</document>
<document>
<name>nime2002_108.pdf</name>
<abstract>This paper describes the design of an electronic Tabla controller. The E-Tabla controls both sound and graphics simultaneously. It allows for a variety of traditional Tabla strokes and new performance techniques. Graphical feedback allows for artistic display and pedagogical feedback.</abstract>
<keywords>Electronic Tabla, Indian Drum Controller, Physical Models, Graphical Feedback </keywords>
</document>
<document>
<name>nime2002_113.pdf</name>
<abstract>In this paper, we describe a computer-based solo musical instrument for live performance. We have adapted a Wacom graphic tablet equipped with a stylus transducer and a game joystick to use them as a solo expressive instrument. We have used a formant-synthesis model that can produce a vowel-like singing voice. This instrument allows multidimensional expressive fundamental frequency control and vowel articulation. The fundamental frequency angular control used here allows different mapping adjustments that correspond to different melodic styles. </abstract>
<keywords>Bi-manual, off-the-shelf input devices, fundamental frequency control, sound color navigation, mapping.</keywords>
</document>
<document>
<name>nime2002_118.pdf</name>
<abstract>This paper introduces a subtle interface, which evolved from the design of an alternative gestural controller in the development of a performance interface. The conceptual idea used is based on that of the traditional Bodhran instrument, an Irish frame drum. The design process was user-centered and involved professional Bodhran players, and through prototyping and user testing the resulting Vodhran emerged.</abstract>
<keywords>Virtual instrument, sound modeling, gesture, user-centered design</keywords>
</document>
<document>
<name>nime2002_120.pdf</name>
<abstract>Here we present 2Hearts, a music system controlled by the heartbeats of two people. As the players speak and touch, 2Hearts extracts meaningful variables from their heartbeat signals. These variables are mapped to musical parameters, conveying the changing patterns of tension and relaxation in the players' relationship. We describe the motivation for creating 2Hearts, observations from the prototypes that have been built, and principles learnt in the ongoing development process.</abstract>
<keywords>Heart Rate, Biosensor, Interactive Music, Non-Verbal Communication, Affective Computing, Ambient Display </keywords>
</document>
<document>
<name>nime2002_126.pdf</name>
<keywords>Gesture, weight distribution, effort, expression, intent, movement, 3D sensing pressure, force, sensor, resolution, control device, sound, music, input.</keywords>
</document>
<document>
<name>nime2002_131.pdf</name>
<abstract>This paper briefly describes a number of performance interfaces under the broad theme of Interactive Gesture Music (IGM). With a short introduction, this paper discusses the main components of a Trans-Domain Mapping (TDM) framework, and presents various prototypes developed under this framework, to translate meaningful activities from one creative domain onto another, to provide real-time control of musical events with physical movements. </abstract>
<keywords>Gesture, Motion, Interactive, Performance, Music. </keywords>
</document>
<document>
<name>nime2002_137.pdf</name>
<abstract>The design of a virtual keyboard, capable of reproducing the tactile feedback of several musical instruments is reported. The key is driven by a direct drive motor, which allows friction free operations. The force to be generated by the motor is calculated in real time by a dynamic simulator, which contains the model of mechanisms' components and constraints. Each model is tuned on the basis of measurements performed on the real system. So far, grand piano action, harpsichord and Hammond organ have been implemented successfully on the system presented here. </abstract>
<keywords>Virtual mechanisms, dynamic simulation </keywords>
</document>
<document>
<name>nime2002_143.pdf</name>
<abstract>Interactivity has become a major consideration in the development of a contemporary art practice that engages with the proliferation of computer based technologies.</abstract>
</document>
<document>
<name>nime2002_145.pdf</name>
<abstract>Passive RF Tagging can provide an attractive medium for development of free-gesture musical interfaces. This was initially explored in our Musical Trinkets installation, which used magnetically-coupled resonant LC circuits to identify and track the position of multiple objects in real-time. Manipulation of these objects in free space over a read coil triggered simple musical interactions. Musical Navigatrics builds upon this success with new, more sensitive and stable sensing, multi-dimensional response, and vastly more intricate musical mappings that enable full musical exploration of free space through the dynamic use and control of arpeggiation and effects. The addition of basic sequencing abilities also allows for the building of complex, layered musical interactions in a uniquely easy and intuitive manner.</abstract>
<keywords>passive tag, position tracking, music sequencer interface </keywords>
</document>
<document>
<name>nime2002_148.pdf</name>
<abstract>We present Audiopad, an interface for musical performance that aims to combine the modularity of knob based controllers with the expressive character of multidimensional tracking interfaces. The performer's manipulations of physical pucks on a tabletop control a real-time synthesis process. The pucks are embedded with LC tags that the system tracks in two dimensions with a series of specially shaped antennae. The system projects graphical information on and around the pucks to give the performer sophisticated control over the synthesis process.</abstract>
<keywords>RF tagging, MIDI, tangible interfaces, musical controllers, object tracking </keywords>
</document>
<document>
<name>nime2002_156.pdf</name>
<abstract>In this paper, we develop the concept of "composed instruments". We will look at this idea from two perspectives: the design of computer systems in the context of live performed music and musicological considerations. A historical context is developed. Examples will be drawn from recent compositions. Finally, basic concepts from computer science will be examined for their relationship to this concept.</abstract>
<keywords>Instruments, musicology, composed instrument, Theremin, Martenot, interaction, streams, MAX. </keywords>
</document>
<document>
<name>nime2002_161.pdf</name>
<abstract>The cicada uses a rapid sequence of buckling ribs to initiate and sustain vibrations in its tymbal plate (the primary mechanical resonator in the cicada's sound production system). The tymbalimba, a music controller based on this same mechanism, has a row of 4 convex aluminum ribs (as on the cicada's tymbal) arranged much like the keys on a calimba. Each rib is spring loaded and capable of snapping down into a V-shape (a motion referred to as buckling), under the downward force of the user's finger. This energy generated by the buckling motion is measured by an accelerometer located under each rib and used as the input to a physical model.</abstract>
<keywords>Bioacoustics, Physical Modeling, Controllers, Cicada, Buckling mechanism.</keywords>
</document>
<document>
<name>nime2002_167.pdf</name>
<abstract>This paper describes the hardware and the software of a computer-based doppler-sonar system for movement detection. The design is focused on simplicity and low-cost do-it-yourself construction.</abstract>
<keywords>sonar </keywords>
</document>
<document>
<name>nime2002_171.pdf</name>
<abstract>This paper describes a technique of multimodal, multichannel control of electronic musical devices using two control methodologies, the Electromyogram (EMG) and relative position sensing. Requirements for the application of multimodal interaction theory in the musical domain are discussed. We introduce the concept of bidirectional complementarity to characterize the relationship between the component sensing technologies. Each control can be used independently, but together they are mutually complementary. This reveals a fundamental difference from orthogonal systems. The creation of a concert piece based on this system is given as an example.</abstract>
<keywords>Human Computer Interaction, Musical Controllers, Electromyogram, Position Sensing, Sensor Instruments </keywords>
</document>
<document>
<name>nime2002_177.pdf</name>
<abstract>Active force-feedback holds the potential for precise and rapid controls. A high performance device can be built from a surplus disk drive and controlled from an inexpensive microcontroller. Our new design, The Plank, has only one axis of force-feedback with limited range of motion. It is being used to explore methods of feeling and directly manipulating sound waves and spectra suitable for live performance of computer music.</abstract>
<keywords>Haptics, music controllers, scanned synthesis. </keywords>
</document>
<document>
<name>nime2002_181.pdf</name>
<abstract>Here we propose a novel musical controller which acquires imaging data of the tongue with a two-dimensional medical ultrasound scanner. A computer vision algorithm extracts from the image a discrete tongue shape to control, in real time, a musical synthesizer and musical effects. We evaluate the mapping space between tongue shape and controller parameters and its expressive characteristics.</abstract>
<keywords>Tongue model, ultrasound, real-time, music synthesis, speech interface </keywords>
</document>
<document>
<name>nime2002_186.pdf</name>
<abstract>The Beatbugs are hand-held percussive instruments that allow the creation, manipulation, and sharing of rhythmic motifs through a simple interface. When multiple Beatbugs are connected in a network, players can form large-scale collaborative compositions by interdependently sharing and developing each other's motifs. Each Beatbug player can enter a motif that is then sent through a stochastic computerized "Nerve Center" to other players in the network. Receiving players can decide whether to develop the motif further (by continuously manipulating pitch, timbre, and rhythmic elements using two bend sensor antennae) or to keep it in their personal instrument (by entering and sending their own new motifs to the group.) The tension between the system's stochastic routing scheme and the players' improvised real-time decisions leads to an interdependent, dynamic, and constantly evolving musical experience. A musical composition entitled "Nerve" was written for the system by author Gil Weinberg. It was premiered on February 2002 as part of Tod Machover's Toy Symphony [1] in a concert with the Deutsches Symphonie Orchester Berlin, conducted by Kent Nagano. The paper concludes with a short evaluative discussion of the concert and the weeklong workshops that led to it. </abstract>
<keywords>Interdependent Musical Networks, group playing, percussive controllers.</keywords>
</document>
<document>
<name>nime2002_192.pdf</name>
<abstract>In this demonstration we will show a variety of computer-based musical instruments designed for live performance. Our design criteria include initial ease of use coupled with a long term potential for virtuosity, minimal and low variance latency, and clear and simple strategies for programming the relationship between gesture and musical result. We present custom controllers and unique adaptations of standard gestural interfaces, a programmable connectivity processor, a communications protocol called Open Sound Control (OSC), and a variety of metaphors for musical control. </abstract>
<keywords>Expressive control, mapping gestures to acoustic results, metaphors for musical control, Tactex, Buchla Thunder, digitizing tablets. </keywords>
</document>
<document>
<name>nime2002_195.pdf</name>
<abstract>The Mutha Rubboard is a musical controller based on the rubboard, washboard or frottoir metaphor commonly used in the Zydeco music genre of South Louisiana. It is not only a metamorphosis of a traditional instrument, but a modern bridge of exploration into a rich musical heritage. It uses capacitive and piezo sensing technology to output MIDI and raw audio data. This new controller reads the key placement in two parallel planes by using radio capacitive sensing circuitry, expanding greatly on the standard corrugated metal playing surface. The percussive output normally associated with the rubboard is captured through piezo contact sensors mounted directly on the keys (the playing implements). Additionally, mode functionality is controlled by discrete switching on the keys. This new instrument is meant to be easily played by both experienced players and those new to the rubboard. It lends itself to an expressive freedom by placing the control surface on the chest and allowing the hands to move uninhibited about it or by playing it in the usual way, preserving its musical heritage.</abstract>
<keywords>MIDI controllers, computer music, Zydeco music, interactive music, electronic musical instrument, human computer interface, Louisiana heritage, physical modeling, bowl resonators.</keywords>
</document>
<document>
<name>nime2002_199.pdf</name>
<abstract>Falling Up is an evening-length performance incorporating dance and theatre with movement-controlled audio/video playback and processing. The solo show is a collaboration between Cindy Cummings (performance) and Todd Winkler (sound, video), first performed at the Dublin Fringe Festival, 2001. Each thematic section of the work shows a different type of interactive relationship between movement, video and sound. This demonstration explains the various technical configurations and aesthetic thinking behind aspects of the work.</abstract>
<keywords>Dance, Video processing, Movement sensor, VNS, Very Nervous System </keywords>
</document>
<document>
<name>nime2002_201.pdf</name>
<keywords>Hyperbow, Hyperviolin, Hyperinstrument, violin, bow, position sensor, accelerometer, strain sensor</keywords>
</document>
<document>
<name>nime2003_003.pdf</name>
<abstract>In this paper we present a design for the EpipE, a new expressive electronic music controller based on the Irish Uilleann Pipes, a 7-note polyphonic reeded woodwind. The core of this proposed controller design is a continuous electronic tonehole-sensing arrangement, equally applicable to other woodwind interfaces like those of the flute, recorder or Japanese shakuhachi. The controller will initially be used to drive a physically-based synthesis model, with the eventual goal being the development of a mapping layer allowing the EpipE interface to operate as a MIDI-like controller of arbitrary synthesis models.</abstract>
<keywords>Controllers, continuous woodwind tonehole sensor, uilleann pipes, Irish bagpipe, physical modelling, double reed, conical bore, tonehole. </keywords>
</document>
<document>
<name>nime2003_015.pdf</name>
<keywords>MIDI Controller, Wind Controller, Breath Control, Human Computer Interaction. </keywords>
</document>
<document>
<name>nime2003_019.pdf</name>
<abstract>The STRIMIDILATOR is an instrument that uses the deviation and the vibration of strings as MIDI-controllers. This method of control gives the user direct tactile force feedback and allows for subtle control. The development of the instrument and its different functions are described.</abstract>
<keywords>MIDI controllers, tactile force feedback, strings.</keywords>
</document>
<document>
<name>nime2003_024.pdf</name>
<abstract>Over the past year the instructors of the Human Computer Interaction courses at CCRMA have undertaken a technology shift to a much more powerful teaching platform. We describe the technical features of the new Atmel AVR based platform, contrasting it with the Parallax BASIC Stamp platform used in the past. The successes and failures of the new platform are considered, and some student project success stories described.</abstract>
</document>
<document>
<name>nime2003_030.pdf</name>
<abstract>The Disc Jockey (DJ) software system Mixxx is presented. Mixxx makes it possible to conduct studies of new interaction techniques in connection with the DJ situation, by its open design and easy integration of new software modules and MIDI connection to external controllers. To gain a better understanding of working practices, and to aid the design process of new interfaces, interviews with two contemporary musicians and DJs are presented. In contact with these musicians, several novel prototypes for DJ interaction have been developed. Finally, implementation details of Mixxx are described.</abstract>
</document>
<document>
<name>nime2003_036.pdf</name>
</document>
<document>
<name>nime2003_054.pdf</name>
<abstract>In this paper, we examine the use of spatial layouts of musical material for live performance control. Emphasis is given to software tools that provide for the simple and intuitive geometric organization of sound material, sound processing parameters, and higher-level musical structures.</abstract>
</document>
<document>
<name>nime2003_070.pdf</name>
<abstract>This paper first introduces two previous software-based music instruments designed by the author, and analyses the crucial importance of the visual feedback introduced by their interfaces. A quick taxonomy and analysis of the visual components in current trends of interactive music software is then proposed, before introducing the reacTable*, a new project that is currently under development. The reacTable* is a collaborative music instrument, aimed both at novices and advanced musicians, which employs computer vision and tangible interfaces technologies, and pushes further the visual feedback interface ideas and techniques aforementioned.</abstract>
</document>
<document>
<name>nime2003_077.pdf</name>
<abstract>A handheld electronic musical instrument, named the BentoBox, was developed. The motivation was to develop an instrument which one can easily carry around and play in moments of free time, for example when riding public transportation or during short breaks at work. The device was designed to enable quick learning by having various scales programmed for different styles of music, and also be expressive by having hand controlled timbral effects which can be manipulated while playing. Design analysis and iteration led to a compact and ergonomic device. This paper focuses on the ergonomic design process of the hardware.</abstract>
<keywords>MIDI controller, electronic musical instrument, musical instrument design, ergonomics, playability, human computer interface. </keywords>
</document>
<document>
<name>nime2003_083.pdf</name>
<keywords>Chemical music, Applied chemistry, Battery Controller. </keywords>
</document>
<document>
<name>nime2003_087.pdf</name>
<abstract>This report details work on the interdisciplinary media project TGarden. The authors discuss the challenges encountered while developing a responsive musical environment for the general public involving wearable, sensor-integrated clothing as the central interface and input device. The project's dramaturgical and technical/implementation background are detailed to provide a framework for the creation of a responsive hardware and software system that reinforces a tangible relationship between the participant's improvised movement and musical response. Finally, the authors take into consideration testing scenarios gathered from public prototypes in two European locales in 2001 to evaluate user experience of the system.</abstract>
<keywords>Gesture, interaction, embodied action, enaction, physical model, responsive environment, interactive musical systems, affordance, interface, phenomenology, energy, kinetics, time constant, induced ballistics, wearable computing, accelerometer, audience participation, dynamical system, dynamic compliance, effort, wearable instrument, augmented physicality. </keywords>
</document>
<document>
<name>nime2003_091.pdf</name>
<abstract>We present a sensor-doll interface as a musical outlet for personal expression. A doll serves the dual role of being both an expressive agent and a playmate by allowing solo and accompanied performance. An internal computer and sensor system allow the doll to receive input from the user and its surroundings, and then respond accordingly with musical feedback. Sets of musical timbres and melodies may be changed by presenting the doll with a series of themed cloth hats, each suggesting a different style of play. The doll may perform by itself and play a number of melodies, or it may collaborate with the user when its limbs are squeezed or bent. Shared play is further encouraged by a basic set of aural tones mimicking conversation.</abstract>
<keywords>Musical improvisation, toy interface agent, sensor doll, context awareness. </keywords>
</document>
<document>
<name>nime2003_095.pdf</name>
</document>
<document>
<name>nime2003_109.pdf</name>
<abstract>In the project Sonic City, we have developed a system that enables users to create electronic music in real time by walking through and interacting with the urban environment. We explore the use of public space and everyday behaviours for creative purposes, in particular the city as an interface and mobility as an interaction model for electronic music making. A multi-disciplinary design process resulted in the implementation of a wearable, context-aware prototype. The system produces music by retrieving information about context and user action and mapping it to real-time processing of urban sounds. Potentials, constraints, and implications of this type of music creation are discussed.</abstract>
</document>
<document>
<name>nime2003_116.pdf</name>
<abstract>The role of the face and mouth in speech production as well as non-verbal communication suggests the use of facial action to control musical sound. Here we document work on the Mouthesizer, a system which uses a headworn miniature camera and computer vision algorithm to extract shape parameters from the mouth opening and output these as MIDI control changes. We report our experience with various gesture-to-sound mappings and musical applications, and describe a live performance which used the Mouthesizer interface.</abstract>
<keywords>Video-based interface; mouth controller; alternative input devices. </keywords>
</document>
<document>
<name>nime2003_122.pdf</name>
<keywords>Alternate controller, gesture, microphone technique, vocal performance, performance interface, electronic music. </keywords>
</document>
<document>
<name>nime2003_129.pdf</name>
<abstract>We explore a variety of design criteria applicable to the creation of collaborative interfaces for musical experience. The main factor common to the design of most collaborative interfaces for novices is that musical control is highly restricted, which makes it possible to easily learn and participate in the collective experience. Balancing this tradeoff is a key concern for designers, as this happens at the expense of providing an upward path to virtuosity with the interface. We attempt to identify design considerations exemplified by a sampling of recent collaborative devices primarily oriented toward novice interplay. It is our intention to provide a non-technical overview of design issues inherent in configuring multiplayer experiences, particularly for entry-level players.</abstract>
<keywords>Design, collaborative interface, musical experience, multiplayer, novice, musical control. </keywords>
</document>
<document>
<name>nime2003_135.pdf</name>
<abstract>MidiGrid is a computer-based musical instrument, primarily controlled with the computer mouse, which allows live performance of MIDI-based musical material by mapping 2-dimensional position onto musical events. Since its invention in 1987, it has gained a small, but enthusiastic, band of users, and has become the primary instrument for several people with physical disabilities. This paper reviews its development, uses and user interface issues, and highlights the work currently in progress for its transformation into MediaGrid.</abstract>
</document>
<document>
<name>nime2003_140.pdf</name>
<abstract>This paper presents a study of bimanual control applied to sound synthesis. This study deals with coordination, cooperation, and abilities of our hands in musical context. We describe examples of instruments made using subtractive synthesis, scanned synthesis in Max/MSP and commercial stand-alone software synthesizers via MIDI communication protocol. These instruments have been designed according to a multi-layer-mapping model, which provides modular design. They have been used in concerts and performance considerations are discussed too.</abstract>
<keywords>Gesture control, mapping, alternate controllers, musical instruments. </keywords>
</document>
<document>
<name>nime2003_146.pdf</name>
<abstract>This paper describes the implementation of Time Delay Neural Networks (TDNN) to recognize gestures from video images. Video sources are used because they are non-invasive and do not inhibit the performer's physical movement or require specialist devices to be attached to the performer, which experience has shown to be a significant problem that impacts musicians' performance and can focus musical rehearsals and performances upon technical rather than musical concerns (Myatt 2003). We describe a set of hand gestures learned by an artificial neural network to control musical parameters expressively in real time. The set is made up of different types of gestures in order to investigate aspects of the recognition process, expressive musical control, schemes of parameter mapping, and generalization issues for an extended set for musical control. The learning procedure of the Neural Network is described, which is based on variations by affine transformations of image sequences of the hand gestures. The whole application including the gesture capturing is implemented in jMax to achieve real time conditions and easy integration into a musical environment to realize different mappings and routings of the control stream. The system represents practice-based research using actual music models like compositions and processes of composition which will follow the work described in the paper.</abstract>
<keywords>Gesture Recognition, Artificial Neural Network, Expressive Control, Real-time Interaction </keywords>
</document>
<document>
<name>nime2003_151.pdf</name>
<abstract>This paper describes the artistic projects undertaken at Immersion Music, Inc. (www.immersionmusic.org) during its three-year existence. We detail work in interactive performance systems, computer-based training systems, and concert production.</abstract>
<keywords>Interactive computer music systems, gestural interaction, Conductor's Jacket, Digital Baton </keywords>
</document>
<document>
<name>nime2003_161.pdf</name>
<keywords>Motion capture, gestural control, mapping. </keywords>
</document>
<document>
<name>nime2003_164.pdf</name>
<abstract>In this paper, we discuss a design principle for the musical instruments that are useful for both novices and professional musicians and that facilitate musically rich expression. We believe that the versatility of conventional musical instruments causes difficulty in performance. By dynamically specializing a musical instrument for performing a specific (genre of) piece, the musical instrument could become more useful for performing the piece and facilitates expressive performance. Based on this idea, we developed two new types of musical instruments, i.e., a "given-melody-based musical instrument" and a "harmonic-function-based musical instrument." From the experimental results using two prototypes, we demonstrate the efficiency of the design principle.</abstract>
</document>
<document>
<name>nime2003_170.pdf</name>
<abstract>In this paper, we introduce Block Jam, a Tangible User Interface that controls a dynamic polyrhythmic sequencer using 26 physical artifacts. These physical artifacts, that we call blocks, are a new type of input device for manipulating an interactive music system. The blocks' functional and topological statuses are tightly coupled to an ad hoc sequencer, interpreting the user's arrangement of the blocks as meaningful musical phrases and structures. We demonstrate that we have created both a tangible and visual language that enables both the novice and musically trained users by taking advantage of both their explorative and intuitive abilities. The tangible nature of the blocks and the intuitive interface promotes face-to-face collaboration and social interaction within a single system. The principle of collaboration is further extended by linking two Block Jam systems together to create a network. We discuss our project vision, design rationale, related works, and the implementation of Block Jam prototypes.</abstract>
<keywords>Tangible interface, modular system, polyrhythmic sequencer.</keywords>
</document>
<document>
<name>nime2003_180.pdf</name>
<abstract>In this paper I present the gluiph, a single-board computer that was conceived as a platform for integrated electronic musical instruments. It aims to provide new instruments as well as existing ones with a stronger identity by untethering them from the often lab-like stage setups built around general purpose computers. The key additions to its core are a flexible sensor subsystem and multi-channel audio I/O. In contrast to other stand-alone approaches it retains a higher degree of flexibility by supporting popular music programming languages, with Miller Puckette's pd [1] being the current focus.</abstract>
</document>
<document>
<name>nime2003_184.pdf</name>
<abstract>In this paper, we describe a new interface for musical performance, using the interaction with a graphical user interface in a powerful manner: the user directly touches a screen where graphical objects are displayed and can use several fingers simultaneously to interact with the objects. The concept of this interface is based on the superposition of the gesture spatial place and the visual feedback spatial place; it gives the impression that the graphical objects are real. This concept enables a huge freedom in designing interfaces. The gesture device we have created gives the position of four fingertips using 3D sensors and the data is performed in the Max/MSP environment. We have realized two practical examples of musical use of such a device, using Photosonic Synthesis and Scanned Synthesis.</abstract>
<keywords>HCI, touch screen, multimodality, mapping, direct interaction, gesture devices, bimanual interaction, two-handed, Max/MSP. </keywords>
</document>
<document>
<name>nime2003_201.pdf</name>
<abstract>This paper suggests that there is a need for formalizing a component model of gestural primitive throughput in music instrument design. The purpose of this model is to construct a coherent and meaningful interaction between performer and instrument. Such a model has been implicit in previous research for interactive performance systems. The model presented here distinguishes gestural primitives from units of measure of gestures. The throughput model identifies symmetry between performance gestures and musical gestures, and indicates a role for gestural primitives when a performer navigates regions of stable oscillations in a musical instrument. The use of a high-dimensional interface tool is proposed for instrument design, for fine-tuning the mapping between movement sensor data and sound synthesis control data.</abstract>
<keywords>Performance gestures, musical gestures, instrument design, mapping, tuning, affordances, stability. </keywords>
</document>
<document>
<name>nime2003_208.pdf</name>
<abstract>SensorBox is a low cost, low latency, high-resolution interface for obtaining gestural data from sensors for use in real time with a computer-based interactive system. We discuss its implementation, benefits, current limitations, and compare it with several popular interfaces for gestural data acquisition.</abstract>
<keywords>Sensors, gestural acquisition, audio interface, interactive music, SensorBox. </keywords>
</document>
<document>
<name>nime2003_211.pdf</name>
<abstract>This software tool, developed in Max/MSP, presents performers with image files consisting of traditional notation as well as conducting in the form of video playback. The impetus for this work was the desire to allow the musical material for each performer of a given piece to differ with regard to content and tempo.</abstract>
<keywords>Open form, notation, polymeter, polytempi, Max/MSP. </keywords>
</document>
<document>
<name>nime2003_213.pdf</name>
<abstract>This document describes modular software supporting live signal processing and sound file playback within the Max/MSP environment. Dsp.rack integrates signal processing, memory buffer recording, and pre-recorded multi-channel file playback using an interconnected, programmable signal flow matrix, and an eight-channel i/o format.</abstract>
<keywords>Digital signal processing, Max/MSP, computer music performance, matrix routing, live performance processing. </keywords>
</document>
<document>
<name>nime2003_216.pdf</name>
<abstract>This paper is a demo proposal for a new musical interface based on a DNA-like double-helix and concepts in character generation. It contains a description of the interface, motivations behind developing such an interface, various mappings of the interface to musical applications, and the requirements to demo the interface.</abstract>
<keywords>Performance, Design, Experimentation, DNA, Big Five. </keywords>
</document>
<document>
<name>nime2003_218.pdf</name>
<abstract>This paper describes a system which uses the output from head-tracking and gesture recognition software to drive a parameterized guitar effects synthesizer in real-time.</abstract>
<keywords>Head-tracking, gestural control, continuous control, parameterized effects processor. </keywords>
</document>
<document>
<name>nime2003_222.pdf</name>
<abstract>Sodaconductor is a musical interface for generating OSC control data based on the dynamic physical simulation tool Sodaconstructor, as it can be seen and heard on http://www.sodaplay.com.</abstract>
</document>
<document>
<name>nime2003_225.pdf</name>
<abstract>Ircam has been deeply involved in gesture analysis and sensing for about four years now, as several artistic projects demonstrate. Ircam has often been solicited for sharing software and hardware tools for gesture sensing, especially devices for the acquisition and conversion of sensor data, such as the AtoMIC Pro [1][2]. This demo-paper describes the recent design of a new sensor-to-MIDI interface called EoBody.</abstract>
<keywords>Gestural controller, Sensor, MIDI, Computer Music. </keywords>
</document>
<document>
<name>nime2004_001.pdf</name>
<abstract>The Tooka was created as an exploration of two-person instruments. We have worked with two Tooka performers to enhance the original experimental device to make a musical instrument played and enjoyed by them. The main additions to the device include: an additional button that behaves as a music capture button, a bend sensor, an additional thumb-actuated pressure sensor for vibrato, additional musical mapping strategies, and new interfacing hardware. These developments arose through experiences and recommendations from the musicians playing it. In addition to the changes to the Tooka, this paper describes the learning process and experiences of the musicians performing with the Tooka.</abstract>
<keywords>Musician-centred design, two-person musical instrument. </keywords>
</document>
<document>
<name>nime2004_007.pdf</name>
<abstract>This paper describes the design of an Electronic Sitar controller, a digitally modified version of Saraswati's (the Hindu Goddess of Music) 19-stringed, pumpkin shelled, traditional North Indian instrument. The ESitar uses sensor technology to extract gestural information from a performer, deducing music information such as pitch, pluck timing, thumb pressure, and 3-axes of head tilt to trigger real-time sounds and graphics. It allows for a variety of traditional sitar technique as well as new performance methods. Graphical feedback allows for artistic display and pedagogical feedback. The ESitar uses a programmable Atmel microprocessor which outputs control messages via a standard MIDI jack.</abstract>
</document>
<document>
<name>nime2004_013.pdf</name>
<keywords>Sound feedback, Karate, Learning environment, Wearable device </keywords>
</document>
<document>
<name>nime2004_019.pdf</name>
<abstract>This article reflects the current state of the reacTable* project, an electronic music instrument with a tangible table-based interface, which is currently under development at the Audiovisual Institute at the Universitat Pompeu Fabra. In this paper we are focussing on the issue of Dynamic Patching, which is a particular and unique aspect of the sound synthesis and control paradigms of the reacTable*. Unlike common visual programming languages for sound synthesis, which conceptually separate the patch building process from the actual musical performance, the reacTable* combines the construction and playing of the instrument in a unique way. The tangible interface allows direct manipulation control over any of the used building blocks, which physically represent the whole synthesizer function.</abstract>
</document>
<document>
<name>nime2004_023.pdf</name>
</document>
<document>
<name>nime2004_027.pdf</name>
</document>
<document>
<name>nime2004_031.pdf</name>
<abstract>This paper presents a project involving a percussionist playing on a virtual percussion. Both artistic and technical aspects of the project are developed. In particular, a method for strike recognition using the Flock of Birds is presented, as well as its use for artistic purposes.</abstract>
<keywords>Gesture analysis, virtual percussion, strike recognition. </keywords>
</document>
<document>
<name>nime2004_039.pdf</name>
<abstract>In this paper, we describe an adaptive approach to gesture mapping for musical applications which serves as a mapping system for music instrument design. A neural network approach is chosen for this goal and all the required interfaces and abstractions are developed and demonstrated in the Pure Data environment. In this paper, we will focus on neural network representation and implementation in a real-time musical environment. This adaptive mapping is evaluated in different static and dynamic situations by a network of sensors sampled at a rate of 200Hz in real-time. Finally, some remarks are given on the network design and future works. </abstract>
<keywords>Real-time gesture control, adaptive interfaces, Sensor and actuator technologies for musical applications, Musical mapping algorithms and intelligent controllers, Pure Data. </keywords>
</document>
<document>
<name>nime2004_047.pdf</name>
<abstract>This paper describes the use of evolutionary and artificial life techniques in sound design and the development of performance mapping to facilitate the real-time manipulation of such sounds through some input device controlled by the performer. A concrete example of such a system is described which allows musicians without detailed knowledge and experience of sound synthesis techniques to interactively develop new sounds and performance manipulation mappings according to their own aesthetic judgements. Experiences with the system are discussed. </abstract>
</document>
<document>
<name>nime2004_051.pdf</name>
<abstract>In this report, we discuss Tree Music, an interactive computer music installation created using GAIA (Graphical Audio Interface Application), a new open-source interface for controlling the RTcmix synthesis and effects processing engine. Tree Music, commissioned by the University of Virginia Art Museum, used a wireless camera with a wide-angle lens to capture motion and occlusion data from exhibit visitors. We show how GAIA was used to structure and navigate the compositional space, and how this program supports both graphical and text-based programming in the same application. GAIA provides a GUI which combines two open-source applications: RTcmix and Perl.</abstract>
<keywords>Composition, new interfaces, interactive systems, open source, Real time audio, GUI controllers, video tracking </keywords>
</document>
<document>
<name>nime2004_055.pdf</name>
<abstract>This essay outlines a framework for understanding new musical compositions and performances that utilize pre-existing sound recordings. In attempting to articulate why musicians are increasingly using sound recordings in their creative work, the author calls for new performance tools that enable the dynamic use of pre-recorded music. </abstract>
<keywords>Call and response, turntablism, DJ tools, oral culture </keywords>
</document>
<document>
<name>nime2004_059.pdf</name>
<abstract>When envisaging new digital instruments, designers do not have to limit themselves to their sonic capabilities (which can be absolutely any), not even to their algorithmic power; they must be also especially careful about the instruments' conceptual capabilities, to the ways instruments impose or suggest to their players new ways of thinking, new ways of establishing relations, new ways of interacting, new ways of organizing time and textures; new ways, in short, of playing new musics. This article explores the dynamic relation that builds between the player and the instrument, introducing concepts such as efficiency, apprenticeship and learning curve. It aims at constructing a framework in which the possibilities and the diversity of music instruments as well as the possibilities and the expressive freedom of human music performers could start being evaluated. </abstract>
<keywords>Musical instruments design, learning curve, apprenticeship, musical efficiency. </keywords>
</document>
<document>
<name>nime2004_064.pdf</name>
<abstract>This report presents a novel interface for musical performance which utilizes a record-player turntable augmented with a computation engine and a high-density optical sensing array. The turntable functions as a standalone step sequencer for MIDI events transmitted to a computer or another device, and it is programmed in real-time using visual disks. The program instructions are represented on printed paper disks directly as characters of the English alphabet that can be read by a human as effectively as they are picked up by the machine's optical cartridge. The result is a tangible interface that allows the user to manipulate pre-arranged musical material by hand, by adding together instrumental tracks to form a dynamic mix. A functional implementation of this interface is discussed in view of historical background and other examples of electronic instruments for music creation and performance incorporating an optical turntable as a central element.</abstract>
<keywords>Interaction, visualization, tangible interface, controllers, optical turntable, performance. </keywords>
</document>
<document>
<name>nime2004_068.pdf</name>
<abstract>This paper describes the first system designed to allow children to conduct an audio and video recording of an orchestra. No prior music experience is required to control the orchestra, and the system uses an advanced algorithm to time stretch the audio in real-time at high quality and without altering the pitch. We will discuss the requirements and challenges of designing an interface to target our particular user group (children), followed by some system implementation details. An overview of the algorithm used for audio time stretching will also be presented. We are currently using this technology to study and compare professional and non-professional conducting behavior, and its implications when designing new interfaces for multimedia. You're the Conductor is currently a successful exhibit at the Children's Museum in Boston, USA.</abstract>
</document>
<document>
<name>nime2004_074.pdf</name>
<abstract>The PebbleBox and the CrumbleBag are examples of a granular interaction paradigm, in which the manipulation of physical grains of arbitrary material becomes the basis for interacting with granular sound synthesis models. The sounds made by the grains as they are manipulated are analysed, and parameters such as grain rate, grain amplitude and grain density are extracted. These parameters are then used to control the granulation of arbitrary sound samples in real time. In this way, a direct link is made between the haptic sensation of interacting with grains and the control of granular sounds.</abstract>
<keywords>Musical instrument, granular synthesis, haptic </keywords>
</document>
<document>
<name>nime2004_080.pdf</name>
</document>
<document>
<name>nime2004_087.pdf</name>
<abstract>The choice of mapping strategies to effectively map controller variables to sound synthesis algorithms is examined. Specifically, we look at continuous mappings that have a geometric representation. Drawing from underlying mathematical theory, this paper presents a way to compare mapping strategies, with the goal of achieving an appropriate match between mapping and musical performance context. This method of comparison is applied to existing techniques, while a suggestion is offered on how to integrate and extend this work through a new implementation.</abstract>
<keywords>Mapping, Interface Design, Interpolation, Computational Geometry </keywords>
</document>
<document>
<name>nime2004_092.pdf</name>
<abstract>This paper discusses some of the issues pertaining to the design of digital musical instruments that are to effectively fill the role of traditional instruments (i.e. those based on physical sound production mechanisms). The design and implementation of a musical instrument that addresses some of these issues, using scanned synthesis coupled to a "smart" physical system, is described.</abstract>
<keywords>Digital musical instruments, real-time performance, scanned synthesis, pd, tactile interfaces, sensors, Shapetape, mapping. </keywords>
</document>
<document>
<name>nime2004_096.pdf</name>
<abstract>This paper describes an approach to match visual and acoustic parameters to produce an animated musical expression. Music may be generated to correspond to animation, as described here; imagery may be created to correspond to music; or both may be developed simultaneously. This approach is intended to provide new tools to facilitate both collaboration between visual artists and musicians and examination of perceptual issues between visual and acoustic media. As a proof-of-concept, a complete example is developed with linear fractals as a basis for the animation, and arranged rhythmic loops for the music. Since both visual and acoustic elements in the example are generated from concise specifications, the potential of this approach to create new works through parameter space exploration is accentuated; however, there are opportunities for application to a wide variety of source material. These additional applications are also discussed, along with issues encountered in development of the example.</abstract>
<keywords>Multimedia creation and interaction, parameter space, visualization, sonification. </keywords>
</document>
<document>
<name>nime2004_100.pdf</name>
<keywords>Interactive Music Systems, Networking and Control, Voice and Speech Analysis, Auracle, JSyn, TransJam, Linear Prediction, Neural Networks, Voice Interface, Open Sound Control </keywords>
</document>
<document>
<name>nime2004_104.pdf</name>
<abstract>In this paper, we propose Thermoscore, a musical score format that dynamically alters the temperature of the instrument/player interface. We developed the first version of the Thermoscore display by lining Peltier devices on piano keys. The system is controlled by MIDI note-on messages from a MIDI sequencer, so that a composer can design songs that are sequences of temperature for each piano key. We also discuss methodologies for composing with this system, and suggest two approaches. The first is to make desirable keys (or other keys) hot. The second one uses the chroma-profile, that is, a radar chart representation of the frequency of pitch notations in the piece. By making keys of the same chroma hot in reverse proportion to the value of the chroma-profile, it is possible to constrain the performer's improvisation and to bring the tonality space close to a certain piece.</abstract>
<keywords>musical score, improvisation, peltier device, chroma profile </keywords>
</document>
<document>
<name>nime2004_112.pdf</name>
<abstract>This paper describes ThumbTEC, a novel general purpose input device for the thumb or finger that is useful in a wide variety of applications from music to text entry. The device is made up of three switches in a row and one miniature joystick on top of the middle switch. The combination of joystick direction and switch(es) controls what note or alphanumeric character is selected by the finger. Several applications are detailed.</abstract>
<keywords>One-Thumb Input Device, HCI, Isometric Joystick, Mobile Computing, Handheld Devices, Musical Instrument. </keywords>
</document>
<document>
<name>nime2004_120.pdf</name>
<keywords>Rencon, Turing Test, Musical Expression, Performance Rendering </keywords>
</document>
<document>
<name>nime2004_124.pdf</name>
<abstract>This paper describes an approach for playing expressive music, as it refers to a pianist's expressiveness, with a tapping-style interface. MIDI-formatted expressive performances played by pianists were first analyzed and transformed into performance templates, in which the deviations from a canonical description were separately described for each event. Using one of the templates as a skill complement, a player can play music expressively over and under the beat level. This paper presents a scheduler that allows a player to mix her/his own intention and the expressiveness in the performance template. The results of a forty-subject user study suggest that using the expression template contributes to the subject's joy of playing music with the tapping-style performance interface. This result is also supported by a brain activation study that was done using near-infrared spectroscopy (NIRS).</abstract>
<keywords>Rencon, interfaces for musical expression, visualization </keywords>
</document>
<document>
<name>nime2004_130.pdf</name>
<abstract>A series of demonstrations of synthesized acappella songs based on an auditory morphing using STRAIGHT [5] will be presented. Singing voice data for morphing were extracted from the RWC music database of musical instrument sound. Discussions on a new extension of the morphing procedure to deal with vibrato will be introduced based on the statistical analysis of the database, and its effect on synthesized acappella will also be demonstrated.</abstract>
<keywords>Rencon, Acappella, RWC database, STRAIGHT, morphing </keywords>
</document>
<document>
<name>nime2004_138.pdf</name>
<abstract>On-the-fly programming is a style of programming in which the programmer/performer/composer augments and modifies the program while it is running, without stopping or restarting, in order to assert expressive, programmable control at runtime. Because of the fundamental powers of programming languages, we believe the technical and aesthetic aspects of on-the-fly programming are worth exploring. In this paper, we present a formalized framework for on-the-fly programming, based on the ChucK synthesis language, which supports a truly concurrent audio programming model with sample-synchronous timing, and a highly on-the-fly style of programming. We first provide a well-defined notion of on-the-fly programming. We then address four fundamental issues that confront the on-the-fly programmer: timing, modularity, conciseness, and flexibility. Using the features and properties of ChucK, we show how it solves many of these issues. In this new model, we show that (1) concurrency provides natural modularity for on-the-fly programming, (2) the timing mechanism in ChucK guarantees on-the-fly precision and consistency, (3) the ChucK syntax improves conciseness, and (4) the overall system is a useful framework for exploring on-the-fly programming. Finally, we discuss the aesthetics of on-the-fly performance.</abstract>
</document>
<document>
<name>nime2004_144.pdf</name>
<abstract>This paper describes the design of an expressive tangible interface for cinema editing as a live performance. A short survey of live video practices is provided. The Live Cinema instrument is a cross between a musical instrument and a film editing tool, tailored for improvisational control as well as performance presence. Design specifications for the instrument evolved based on several types of observations including: our own performances in which we used a prototype based on available tools; an analysis of performative aspects of contemporary DJ equipment; and an evaluation of organizational aspects of several generations of film editing tools. Our instrument presents the performer with a large canvas where projected images can be grabbed and moved around with both hands simultaneously; the performer also has access to two video drums featuring haptic display to manipulate the shots and cut between streams. The paper ends with a discussion of issues related to the tensions between narrative structure and hands-on control, live and recorded arts and the scoring of improvised films. </abstract>
<keywords>live cinema, video controller, visual music, DJ, VJ, film editing, tactile interface, two-hand interaction, improvisation, performance, narrative structure. </keywords>
</document>
<document>
<name>nime2004_150.pdf</name>
<abstract>A system is introduced that allows a string player to control a synthesis engine with the gestural skills he is used to. The implemented system is based on an electric viola and a synthesis engine that is directly controlled by the unanalysed audio signal of the instrument and indirectly by control parameters mapped to the synthesis engine. This method offers a highly string-specific playability, as it is sensitive to the kinds of musical articulation produced by traditional playing techniques. Nuances of sound variation applied by the player will be present in the output signal even if those nuances are beyond traditionally measurable parameters like pitch, amplitude or brightness. The relatively minimal hardware requirements make the instrument accessible with little expenditure.</abstract>
<keywords>Electronic bowed string instrument, playability, musical instrument design, human computer interface, oscillation controlled sound synthesis </keywords>
</document>
<document>
<name>nime2004_154.pdf</name>
<abstract>We present a system for collaborative musical creation on mobile wireless networks. The work extends on simple peer-to-peer file sharing systems towards ad-hoc mobility and streaming. It extends upon music listening from a passive act to a proactive, participative activity. The system consists of a network-based interactive music engine and a portable rendering player. It serves as a platform for experiments on studying the sense of agency in collaborative creative process, and requirements for fostering musical satisfaction in remote collaboration.</abstract>
</document>
<document>
<name>nime2004_157.pdf</name>
<abstract>This paper reports our recent developments on sensor acquisition systems, taking advantage of computer network technology. We present a versatile hardware system which can be connected to wireless modules and Analog to Digital Converters, and enables Ethernet communication. We are planning to make freely available the design of this architecture. We also describe several approaches we tested for wireless communication. Such technology developments are currently used in our newly formed Performance Arts Technology Group. </abstract>
<keywords>Gesture, Sensors, Ethernet, 802.11, Computer Music. </keywords>
</document>
<document>
<name>nime2004_161.pdf</name>
<abstract>Sonic City is a wearable system enabling the use of the urban environment as an interface for real-time electronic music making, when walking through and interacting with a city. The device senses everyday interactions and surrounding contexts, and maps this information in real time to the sound processing of urban sounds. We conducted a short-term study with various participants using our prototype in everyday settings. This paper describes the course of the study and preliminary results in terms of how the participants used and experienced the system. These results showed that the city was perceived as the main performer but that the user improvised different tactics and ad hoc interventions to actively influence and participate in how the music was created. </abstract>
<keywords>User study, new interface for musical expression, interactive music, wearable computing, mobility, context-awareness. </keywords>
</document>
<document>
<name>nime2004_165.pdf</name>
<abstract> This paper begins by evaluating various systems in terms of factors for building interactive audiovisual environments. The main issues for flexibility and expressiveness in the generation of dynamic sounds and images are then isolated. The design and development of an audiovisual system prototype is described at the end. </abstract>
<keywords>Audiovisual, composition, performance, gesture, image, representation, mapping, expressiveness. </keywords>
</document>
<document>
<name>nime2004_169.pdf</name>
<abstract>We describe a simple, computationally light, real-time system for tracking the lower face and extracting information about the shape of the open mouth from a video sequence. The system allows unencumbered control of audio synthesis modules by action of the mouth. We report work in progress to use the mouth controller to interact with a physical model of sound production by the avian syrinx.</abstract>
<keywords>Mouth Controller, Face Tracking, Bioacoustics </keywords>
</document>
<document>
<name>nime2004_177.pdf</name>
<keywords>Improvisation support, jam session, melody correction, N-gram model, melody modeling, musical instrument </keywords>
</document>
<document>
<name>nime2004_181.pdf</name>
<abstract>This paper describes new work and creations of LEMUR, a group of artists and technologists creating robotic musical instruments.</abstract>
</document>
<document>
<name>nime2004_185.pdf</name>
<abstract>Though musical performers routinely use eye movements to communicate with each other during musical performances, very few performers or composers have used eye tracking devices to direct musical compositions and performances. EyeMusic is a system that uses eye movements as an input to electronic music compositions. The eye movements can directly control the music, or the music can respond to the eyes moving around a visual scene. EyeMusic is implemented so that any composer using established composition software can incorporate prerecorded eye movement data into their musical compositions.</abstract>
<keywords>Electronic music composition, eye movements, eye tracking, human-computer interaction, Max/MSP. </keywords>
</document>
<document>
<name>nime2004_189.pdf</name>
<abstract>When working with sample-based media, a performer is managing timelines, loop points, sample parameters and effects parameters. The Slidepipe is a performance controller that gives the artist a visually simple way to work with their material. Its design is modular and lightweight, so it can be easily transported and quickly assembled. Also, its large stature magnifies the gestures associated with its play, providing a more convincing performance. In this paper, I will describe what the controller is, how this new controller interface has affected my live performance, and how it can be used in different performance scenarios. </abstract>
<keywords>Controller, Sample Manipulation, Live Performance, Open Sound Control, Human Computer Interaction </keywords>
</document>
<document>
<name>nime2004_193.pdf</name>
<abstract>This paper describes a theory for modulated objects based on observations of recent musical interface design trends. The theory implies extensions to an object-based approach to controller design. Combining NIME research with an ethnographic study of shamanic traditions, the author discusses the creation of new controllers based on the shamanic use of ritual objects.</abstract>
<keywords>Music and Video Controllers, New Interface Design, Music Composition, Multimedia, Mythology, Shamanism, Ecoacoustics </keywords>
</document>
<document>
<name>nime2004_199.pdf</name>
<abstract>The Epipe is a novel electronic woodwind controller with continuous tonehole coverage sensing, an initial design for which was introduced at NIME '03. Since then, we have successfully completed two fully operational prototypes. This short paper describes some of the issues encountered during the design and construction of this controller. It also details our own early experiences and impressions of the interface as well as its technical specifications. </abstract>
<keywords>woodwind controller, variable tonehole control, MIDI, capacitive sensing </keywords>
</document>
<document>
<name>nime2004_201.pdf</name>
<abstract>This paper describes the SillyTone Squish Factory, a haptically engaging musical interface. It contains the motivation behind the device's development, a description of the interface, various mappings of the interface to musical applications, details of its construction, and the requirements to demo the interface. </abstract>
</document>
<document>
<name>nime2004_203.pdf</name>
<abstract>StickMusic is an instrument comprised of two haptic devices, a joystick and a mouse, which control a phase vocoder in real time. The purpose is to experiment with ideas of how to apply haptic feedback when controlling synthesis algorithms that have no direct analogy to methods of generating sound in the physical world. </abstract>
<keywords>haptic feedback, gestural control, performance, joystick, mouse </keywords>
</document>
<document>
<name>nime2004_205.pdf</name>
<abstract>The capacity of transmission lines (Ethernet in particular) is much higher than what is imposed by MIDI today. So it is possible to use high-speed, high-resolution capturing interfaces, thanks to the OSC protocol, for musical synthesis (either in real-time or non real-time). These new interfaces offer many advantages, not only in the area of musical composition with the use of sensors but also in live and interactive performances. In this manner, the processes of calibration and signal processing are delocalized to a personal computer, which augments the possibilities of processing. In this demo, we present two hardware interfaces developed at La kitchen, with corresponding processing, to achieve high-resolution, high-speed sensor processing for musical applications. </abstract>
<keywords>Interface, Sensors, Calibration, Precision, OSC, Pure Data, Max/MSP. </keywords>
</document>
<document>
<name>nime2004_207.pdf</name>
<abstract>We will discuss the case study of the application of the Virtual Musical Instrument and Sound Synthesis. In this application, the main subject is an advanced Mapping Interface used to connect these. For this experiment, our discussion also refers to Neural Networks, as well as giving a brief introduction of the Virtual Musical Instrument "Le SuperPolm" and the Gesture Controller "BodySuit".</abstract>
<keywords>Virtual Musical Instrument, Gesture Controller, Mapping Interface </keywords>
</document>
<document>
<name>nime2004_209.pdf</name>
<abstract>In this paper, we describe a new MIDI controller, the Light Pipes. The Light Pipes are a series of pipes that respond to incident light. The paper will discuss the design of the instrument, and the prototype we built. A piece was composed for the instrument using algorithms designed in Pure Data.</abstract>
<keywords>Controllers, MIDI, light sensors, Pure Data. </keywords>
</document>
<document>
<name>nime2004_211.pdf</name>
<abstract>In this paper, I describe a realtime sampling system for the turntablist, and the hardware and software design of the second prototype, 16padjoystickcontroller.</abstract>
<keywords>DJ, Turntablism, Realtime Sampling, MAX/MSP, Microchip PIC microcontroller, MIDI </keywords>
</document>
<document>
<name>nime2004_213.pdf</name>
<abstract>This paper describes the design and on-going development of an expressive gestural MIDI interface and how this could enhance live performance of electronic music.</abstract>
<keywords>gestural control, mapping, Pure Data (pd), accelerometers, MIDI, microcontrollers, synthesis, musical instruments </keywords>
</document>
<document>
<name>nime2004_215.pdf</name>
<abstract>This paper proposes an interface for improvisational ensemble plays which synthesizes musical sounds and graphical images on the floor from people's act of "walking." The aim of this paper is to develop such a system that enables nonprofessional people in our public spaces to play good contrapuntal music without any knowledge of music theory. The people are just walking. This system is based on the i-trace system [1] which can capture the people's behavior and give some visual feedback. </abstract>
<keywords>Improvisational Ensemble Play, Contrapuntal Music, Human Tracking, Traces, Spatially Augmented Reality </keywords>
</document>
<document>
<name>nime2005_002.pdf</name>
</document>
<document>
<name>nime2005_005.pdf</name>
<keywords>Infra-instruments, hyperinstruments, meta-instruments, virtual instruments, design concepts and principles. </keywords>
</document>
<document>
<name>nime2005_011.pdf</name>
<abstract>In this paper, we introduce and analyze four gesture-controlled musical instruments. We briefly discuss the test platform designed to allow for rapid experimentation of new interfaces and control mappings. We describe our design experiences and discuss the effects of system features such as latency, resolution and lack of tactile feedback. The instruments use virtual reality hardware and computer vision for user input, and three-dimensional stereo vision as well as simple desktop displays for providing visual feedback. The instrument sounds are synthesized in real-time using physical sound modeling. </abstract>
</document>
<document>
<name>nime2005_023.pdf</name>
<abstract>In this paper we study the potential and the challenges posed by multi-user instruments, as tools that can facilitate interaction and responsiveness not only between performers and their instrument but also between the performers themselves. Several previous studies and taxonomies are mentioned, after which different paradigms are exposed with examples based on traditional mechanical acoustic instruments. In the final part, several existing systems and implementations, now in the digital domain, are described and identified according to the models and paradigms previously introduced. </abstract>
<keywords>Multi-user instruments, collaborative music, new instruments design guidelines. </keywords>
</document>
<document>
<name>nime2005_027.pdf</name>
<abstract>This paper will investigate a variety of alternate controllers that are making an impact in interactive entertainment, particularly in the video game industry. Since the late 1990's, the surging popularity of rhythmic and musical performance games in Japanese arcades has led to the development of new interfaces and alternate controllers for the consumer market worldwide. Rhythm action games such as Dance Dance Revolution, Taiko No Tatsujin (Taiko: Drum Master), and Donkey Konga are stimulating collaborative gameplay and exposing consumers to custom controllers designed specifically for musical and physical interaction. We are witnessing the emergence and acceptance of these breakthrough controllers and models for gameplay as an international cultural phenomenon penetrating the video game and toy markets in record numbers. Therefore, it is worth considering the potential benefits to developers of musical interfaces, electronic devices and alternate controllers in light of these new and emerging opportunities, particularly in the realm of video gaming, toy development, arcades, and other interactive entertainment experiences. </abstract>
</document>
<document>
<name>nime2005_038.pdf</name>
<abstract>The Self-Contained Unified Bass Augmenter (SCUBA) is a new augmentative OSC (Open Sound Control) [5] controller for the tuba. SCUBA adds new expressive possibilities to the existing tuba interface through onboard sensors. These sensors provide continuous and discrete user-controlled parametric data to be mapped at will to signal processing parameters, virtual instrument control parameters, sound playback, and various other functions. In its current manifestation, control data is mapped to change the processing of the instrument's natural sound in Pd (Pure Data) [3]. SCUBA preserves the unity of the solo instrument interface by acoustically mixing direct and processed sound in the instrument's bell via mounted satellite speakers, which are driven by a subwoofer below the performer's chair. The end result augments the existing interface while preserving its original unity and functionality. </abstract>
<keywords>Interactive music, electro-acoustic musical instruments, musical instrument design, human computer interface, signal processing, Open Sound Control (OSC) </keywords>
</document>
<document>
<name>nime2005_042.pdf</name>
<abstract>This paper presents a novel controller built to exploit the physical behaviour of a simple dynamical system, namely a spinning wheel. The phenomenon of gyroscopic precession causes the instrument to slowly oscillate when it is spun quickly, providing the performer with proprioceptive feedback. Also, due to the mass of the wheel and tire and the resulting rotational inertia, it maintains a relatively constant angular velocity once it is set in motion. Various sensors were used to measure continuous and discrete quantities such as the angular frequency of the wheel, its spatial orientation, and the performer's finger pressure. In addition, optical and hall-effect sensors detect the passing of a spoke-mounted photodiode and two magnets. A base software layer was developed in Max/MSP and various patches were written with the goal of mapping the dynamic behavior of the wheel to varied musical processes.</abstract>
<keywords>HCI, Digital Musical Instruments, Gyroscopic Precession, Rotational Inertia, Open Sound Control </keywords>
</document>
<document>
<name>nime2005_046.pdf</name>
<abstract>The Smart Controller is a portable hardware device that responds to input control voltage, OSC, and MIDI messages; producing output control voltage, OSC, and MIDI messages (depending upon the loaded custom patch). The Smart Controller is a stand alone device; a powerful, reliable, and compact instrument capable of reducing the number of electronic modules required in a live performance or installation, particularly the requirement of a laptop computer. More powerful, however, is the Smart Controller Workbench, a complete interactive development environment. In addition to enabling the composer to create and debug their patches, the Smart Controller Workbench accurately simulates the behaviour of the hardware, and functions as an in-circuit debugger that enables the performer to remotely monitor, modify, and tune patches running in an installation without the requirement of stopping or interrupting the live performance. </abstract>
<keywords>Control Voltage, Open Sound Control, Algorithmic Composition, MIDI, Sound Installations, programmable logic control, synthesizers, electronic music, Sensors, Actuators, Interaction. </keywords>
</document>
<document>
<name>nime2005_050.pdf</name>
<abstract>This paper describes an installation created by LEMUR (League of Electronic Musical Urban Robots) in January, 2005. The installation included over 30 robotic musical instruments and a multi-projector real-time video projection, and was controllable and programmable over a MIDI network. The installation was also controllable remotely via the Internet and could be heard and viewed via room mics and a robotic webcam connected to a streaming server.</abstract>
</document>
<document>
<name>nime2005_060.pdf</name>
<abstract>When learning a classical instrument, people often either take lessons in which an existing body of "technique" is delivered, evolved over generations of performers, or in some cases people will "teach themselves" by watching people play and listening to existing recordings. What does one do with a complex new digital instrument? In this paper I address this question drawing on my experience in learning several very different types of sophisticated instruments: the Glove Talk II real-time gesture-to-speech interface, the Digital Marionette controller for virtual 3D puppets, and pianos and keyboards. As the primary user of the first two systems, I have spent hundreds of hours with Digital Marionette and Glove-Talk II, and thousands of hours with pianos and keyboards (I continue to work as a professional musician). I will identify some of the underlying principles and approaches that I have observed during my learning and playing experience common to these instruments. While typical accounts of users learning new interfaces generally focus, for various practical reasons, on reporting beginners' experiences, this account is fundamentally different in focusing on the expert's learning experience.</abstract>
<keywords>performance, learning new instruments </keywords>
</document>
<document>
<name>nime2005_065.pdf</name>
<keywords>Adaptive System, Sound Installation, Smart Interfaces, Music Robots, Spatial Music, Conscious Subconscious Interaction. </keywords>
</document>
<document>
<name>nime2005_080.pdf</name>
<abstract>McBlare is a robotic bagpipe player developed by the Robotics Institute at Carnegie Mellon University. McBlare plays a standard set of bagpipes, using a custom air compressor to supply air and electromechanical "fingers" to control the chanter. McBlare is MIDI controlled, allowing for simple interfacing to a keyboard, computer, or hardware sequencer. The control mechanism exceeds the measured speed of expert human performers. On the other hand, human performers surpass McBlare in their ability to compensate for limitations and imperfections in reeds, and we discuss future enhancements to address these problems. McBlare has been used to perform traditional bagpipe music as well as experimental computer generated music. </abstract>
<keywords>bagpipes, robot, music, instrument, MIDI </keywords>
</document>
<document>
<name>nime2005_085.pdf</name>
<abstract>In this report, we describe our development of the Max/MSP toolbox MnM, dedicated to mapping between gesture and sound, and more generally to statistical and machine learning methods. This library is built on top of the FTM library, which enables the efficient use of matrices and other data structures in Max/MSP. Mapping examples are described based on various matrix manipulations such as Singular Value Decomposition. The FTM and MnM libraries are freely available.</abstract>
<keywords>Mapping, interface design, matrix, Max/MSP. </keywords>
</document>
<document>
<name>nime2005_089.pdf</name>
<abstract>This paper describes DspMap, a graphical user interface (GUI) designed to assist the dynamic routing of signal generators and modifiers, currently being developed at the International Academy of Media Arts & Sciences. Instead of relying on traditional box-and-line approaches, DspMap proposes a design paradigm where connections are determined by the relative positions of the various elements in a single virtual space.</abstract>
<keywords>Graphical user interface, real-time performance, map, dynamic routing </keywords>
</document>
<document>
<name>nime2005_093.pdf</name>
<abstract>The breath pressure signal applied to wind music instruments is generally considered to be a slowly varying function of time. In a context of music control, this assumption implies that a relatively low digital sample rate (100-200 Hz) is sufficient to capture and/or reproduce this signal. We tested this assumption by evaluating the frequency content in breath pressure, particularly during the use of extended performance techniques such as growling, humming, and flutter tonguing. Our results indicate frequency content in a breath pressure signal up to about 10 kHz, with especially significant energy within the first 1000 Hz. We further investigated the frequency response of several commercially available pressure sensors to assess their responsiveness to higher frequency breath signals. Though results were mixed, some devices were found capable of sensing frequencies up to at least 1.5 kHz. Finally, similar measurements were conducted with Yamaha WX11 and WX5 wind controllers and results suggest that their breath pressure outputs are sampled at about 320 Hz and 280 Hz, respectively.</abstract>
<keywords>Breath Control, Wind Controller, Breath Sensors </keywords>
</document>
<document>
<name>nime2005_097.pdf</name>
<abstract>Tangible Acoustic Interfaces (TAIs) rely on various acoustic sensing technologies, such as sound source location and acoustic imaging, to detect the position of contact of users interacting with the surface of solid materials. With their ability to transform almost any physical object, flat or curved surface, or wall into an interactive interface, acoustic sensing technologies show a promising way to bring the sense of touch into the realm of computer interaction. Because music making has been closely related to this sense for centuries, an application of particular interest is the use of TAIs for the design of new musical instruments that match the physicality and expressiveness of classical instruments. This paper gives an overview of the various acoustic-sensing technologies involved in the realisation of TAIs and develops on the motivation underlying their use for the design of new musical instruments. </abstract>
<keywords>Tangible interfaces, new musical instruments design. </keywords>
</document>
<document>
<name>nime2005_101.pdf</name>
</document>
<document>
<name>nime2005_105.pdf</name>
<abstract>This paper aims to present some perspectives on mapping embouchure gestures of flute players and their use as control variables. For this purpose, we have analyzed several types of sensors, in terms of sensitivity, dimension, accuracy and price, which can be used to implement a system capable of mapping embouchure parameters such as air jet velocity and air jet direction. Finally, we describe the implementation of a sensor system used to map embouchure gestures of a classical Boehm flute.</abstract>
<keywords>Embouchure, air pressure sensors, hot wires, mapping, augmented flute. </keywords>
</document>
<document>
<name>nime2005_109.pdf</name>
<keywords>Haptic, interaction, sound, music, control, installation. </keywords>
</document>
<document>
<name>nime2005_115.pdf</name>
<abstract>We report on The Manual Input Sessions, a series of audiovisual vignettes which probe the expressive possibilities of free-form hand gestures. Performed on a hybrid projection system which combines a traditional analog overhead projector and a digital PC video projector, our vision-based software instruments generate dynamic sounds and graphics solely in response to the forms and movements of the silhouette contours of the user's hands. Interactions and audiovisual mappings which make use of both positive (exterior) and negative (interior) contours are discussed. </abstract>
<keywords>Audiovisual performance, hand silhouettes, computer vision, contour analysis, sound-image relationships, augmented reality. </keywords>
</document>
<document>
<name>nime2005_121.pdf</name>
<abstract>The HandySinger system is a personified tool developed to naturally express a singing voice controlled by the gestures of a hand puppet. Assuming that a singing voice is a kind of musical expression, natural expressions of the singing voice are important for personification. We adopt a singing voice morphing algorithm that effectively smoothes out the strength of expressions delivered with a singing voice. The system's hand puppet consists of a glove with seven bend sensors and two pressure sensors. It sensitively captures the user's motion as a personified puppet's gesture. To synthesize the different expressional strengths of a singing voice, the "normal" (without expression) voice of a particular singer is used as the base of morphing, and three different expressions, "dark," "whisper" and "wet," are used as the target. This configuration provides musically expressed controls that are intuitive to users. In the experiment, we evaluate whether 1) the morphing algorithm interpolates expressional strength in a perceptual sense, 2) the hand puppet interface provides gesture data at sufficient resolution, and 3) the gestural mapping of the current system works as planned.</abstract>
<keywords>Personified Expression, Singing Voice Morphing, Voice Expressivity, Hand-puppet Interface </keywords>
</document>
<document>
<name>nime2005_127.pdf</name>
<abstract>The central role of the face in social interaction and non-verbal communication suggests we explore facial action as a means of musical expression. This paper presents the design, implementation, and preliminary studies of a novel system utilizing face detection and optic flow algorithms to associate facial movements with sound synthesis in a topographically specific fashion. We report on our experience with various gesture-to-sound mappings and applications, and describe our preliminary experiments at musical performance using the system. </abstract>
<keywords>Video-based musical interface; gesture-based interaction; facial expression; facial therapy interface. </keywords>
</document>
<document>
<name>nime2005_132.pdf</name>
<abstract>In this paper we present an example of the use of the singing voice as a controller for digital music synthesis. The analysis of the voice with spectral processing techniques, derived from the Short-Time Fourier Transform, provides ways of determining a performer's vocal intentions. We demonstrate a prototype, in which the extracted vocal features drive the synthesis of a plucked bass guitar. The sound synthesis stage includes two different synthesis techniques, Physical Models and Spectral Morph.</abstract>
<keywords>Singing voice, musical controller, sound synthesis, spectral processing. </keywords>
</document>
<document>
<name>nime2005_136.pdf</name>
<abstract>Electronic Musical Instrument Design is an excellent vehicle for bringing students from multiple disciplines together to work on projects, and help bridge the perennial gap between the arts and the sciences. This paper describes how at Tufts University, a school with no music technology program, students from the engineering (electrical, mechanical, and computer), music, performing arts, and visual arts areas use their complementary skills, and teach each other, to develop new devices and systems for music performance and control.</abstract>
<keywords>Science education, music education, engineering, electronic music, gesture controllers, MIDI. </keywords>
</document>
<document>
<name>nime2005_140.pdf</name>
<abstract>The [hid] toolkit is a set of software objects for designing computer-based gestural instruments. All too frequently, computer-based performers are tied to the keyboard-mouse-monitor model, narrowly constraining the range of possible gestures. A multitude of gestural input devices are readily available, making it easy to utilize a broader range of gestures. Human Interface Devices (HIDs) such as joysticks, tablets, and gamepads are cheap and can be good musical controllers. Some even provide haptic feedback. The [hid] toolkit provides a unified, consistent framework for getting gestural data from these devices, controlling the feedback, and mapping this data to the desired output. The [hid] toolkit is built in Pd, which provides an ideal platform for this work, combining the ability to synthesize and control audio and video. The addition of easy access to gestural data allows for rapid prototypes. A usable environment also makes computer music instrument design accessible to novices.</abstract>
<keywords>Instrument design, haptic feedback, gestural control, HID </keywords>
</document>
<document>
<name>nime2005_144.pdf</name>
<abstract>An experimental study comparing different user interfaces for a virtual drum is reported. Virtual here means that the drum is not a physical object. 16 subjects played the drum on five different interfaces and two metronome patterns trying to match their hits to the metronome clicks. Temporal accuracy of the playing was evaluated. The subjects also rated the interfaces subjectively. The results show that hitting the drum alternately from both sides with motion going through the drum plate was less accurate than the traditional one sided hitting. A physical stick was more accurate than a virtual computer graphic stick. Visual feedback of the drum slightly increased accuracy compared to receiving only auditory feedback. Most subjects evaluated the physical stick to offer a better feeling and to be more pleasant than the virtual stick. </abstract>
<keywords>Virtual drum, user interface, feedback, musical instrument design, virtual reality, sound control, percussion instrument. </keywords>
</document>
<document>
<name>nime2005_148.pdf</name>
<abstract>This paper takes the reader through various elements of the GoingPublik sound artwork for distributive ensemble and introduces the Realtime Score Synthesis tool (RSS) used as a controller in the work. The collaboration between artists and scientists, details concerning the experimental hardware and software, and new theories of sound art are briefly explained and illustrated. The scope of this project is too broad to be fully covered in this paper, therefore the selection of topics made attempts to draw attention to the work itself and balance theory with practice. </abstract>
<keywords>Mobile Multimedia, Wearable Computers, Score Synthesis, Sound Art, System Research, HCIs </keywords>
</document>
<document>
<name>nime2005_152.pdf</name>
<keywords>Motion tracking, mapping strategies, public installation, multiple participants music interfaces. </keywords>
</document>
<document>
<name>nime2005_156.pdf</name>
<abstract>This paper describes software tools used to create java applications for performing music using mobile phones. The tools provide a means for composers working in the Pure Data composition environment to design and audition performances using ensembles of mobile phones. These tools were developed as part of a larger project motivated by the desire to allow large groups of non-expert players to perform music based on just intonation using ubiquitous technology. The paper discusses the process that replicates a Pure Data patch so that it will operate within the hardware and software constraints of the Java 2 Micro Edition. It also describes development of objects that will enable mobile phone performances to be simulated accurately in PD and to audition microtonal tuning implemented using MIDI in the j2me environment. These tools eliminate the need for composers to compose for mobile phones by writing java code. In a single desktop application, they offer the composer the flexibility to write music for multiple phones. </abstract>
<keywords>Java 2 Micro Edition; j2me; Pure Data; PD; Real-Time Media Performance; Just Intonation. </keywords>
</document>
<document>
<name>nime2005_160.pdf</name>
<abstract>This paper details the motivations, design, and realization of Sustainable, a dynamic, robotic sound installation that employs a generative algorithm for music and sound creation. The piece is comprised of seven autonomous water gong nodes that are networked together by water tubes to distribute water throughout the system. A water resource allocation algorithm guides this distribution process and produces an ever-evolving sonic and visual texture. A simple set of behaviors govern the individual gongs, and the system as a whole exhibits emergent properties that yield local and large scale forms in sound and light. </abstract>
</document>
<document>
<name>nime2005_164.pdf</name>
<abstract>We present our work in the development of an interface for an actor/singer and its use in performing. Our work combines aspects of theatrical music with technology. Our interface has allowed the development of a new vocabulary for musical and theatrical expression and the possibility for merging classical and experimental music. It gave rise to a strong, strange, unpredictable, yet coherent, "character" and opens up the possibility for a full performance that will explore aspects of voice, theatrical music and, in the future, image projection. </abstract>
<keywords>Theatrical music, computer interaction, voice, gestural control. </keywords>
</document>
<document>
<name>nime2005_168.pdf</name>
<abstract>This paper describes the development, function and performance contexts of a digital musical instrument called "boomBox." The instrument is a wireless, orientation-aware, low-frequency, high-amplitude human motion controller for live and sampled sound. The instrument has been used in performance and sound installation contexts. I describe some of what I have learned from the project herein.</abstract>
</document>
<document>
<name>nime2005_176.pdf</name>
<abstract>In this paper, we describe a course of research investigating the potential for new types of music made possible by location tracking and wireless technologies. Listeners walk around downtown Culver City, California and explore a new type of musical album by mixing together songs and stories based on their movement. By using mobile devices as an interface, we can create new types of musical experiences that allow listeners to take a more interactive approach to an album.</abstract>
<keywords>Mobile Music, Digital Soundscape, Location-Based Entertainment, Mobility, Interactive Music, Augmented Reality </keywords>
</document>
<document>
<name>nime2005_180.pdf</name>
<abstract>Bangarama is a music controller using headbanging as the primary interaction metaphor. It consists of a head-mounted tilt sensor and a guitar-shaped controller that does not require complex finger positions. We discuss the specific challenges of designing and building this controller to create a simple, yet responsive and playable instrument, and show how ordinary materials such as plywood, tinfoil, and copper wire can be turned into a device that enables a fun, collaborative music-making experience.</abstract>
<keywords>head movements, music controllers, interface design, input devices </keywords>
</document>
<document>
<name>nime2005_184.pdf</name>
<abstract>In recent years Computer Network-Music has increasingly captured the attention of the Computer Music Community. With the advent of Internet communication, geographical displacement amongst the participants of a computer-mediated music performance achieved worldwide extension. However, when established over long-distance networks, this form of musical communication has a fundamental problem: network latency (or net-delay) is an impediment to real-time collaboration. From a recent study carried out by the authors, a relation between network latency tolerance and music tempo was established. This result emerged from an experiment in which simulated network latency conditions were applied to the performance of different musicians playing jazz standard tunes. The Public Sound Objects (PSOs) project is a web-based shared musical space, which has been an experimental framework to implement and test different approaches for on-line music communication. This paper describes features implemented in the latest version of the PSOs system, including the notion of a network-music instrument incorporating latency as a software function, by dynamically adapting its tempo to the communication delay measured in real-time. </abstract>
<keywords>Network Music Instruments; Latency in Real-Time Performance; Interface-Decoupled Electronic Musical Instruments; Behavioral Driven Interfaces; Collaborative Remote Music Performance; </keywords>
</document>
<document>
<name>nime2005_188.pdf</name>
<abstract>We present the Pin&Play&Perform system: an interface in the form of a tablet on which a number of physical controls can be added, removed and arranged on the fly. These controls can easily be mapped to existing music software using the MIDI protocol. The interface provides a mechanism for direct manipulation of application parameters and events through a set of familiar controls, while also encouraging a high degree of customisation through the ability to arrange, rearrange and annotate the spatial layout of the interface components on the surface of the tablet. The paper describes how we have realized this concept using the Pin&Play technology. As an application example, we describe our experiences in using our interface in conjunction with Propellerheads' Reason, a popular piece of music synthesis software.</abstract>
<keywords>tangible interface, rearrangeable interface, midi controllers </keywords>
</document>
<document>
<name>nime2005_196.pdf</name>
<abstract>ChucK is a programming language for real-time sound synthesis. It provides generalized audio abstractions and precise control over timing and concurrency - combining the rapid-prototyping advantages of high-level programming tools, such as Pure Data, with the flexibility and controllability of lower-level, text-based languages like C/C++. In this paper, we present a new time-based paradigm for programming controllers with ChucK. In addition to real-time control over sound synthesis, we show how features such as dynamic patching, on-the-fly controller mapping, multiple control rates, and precisely-timed recording and playback of sensors can be employed under the ChucK programming model. Using this framework, composers, programmers, and performers can quickly write (and read/debug) complex controller/synthesis programs, and experiment with controller mapping on-the-fly. </abstract>
<keywords>Controller mapping, programming language, on-the-fly programming, real-time interaction, concurrency. </keywords>
</document>
<document>
<name>nime2005_200.pdf</name>
<abstract>Drum controllers designed by researchers and commercial companies use a variety of techniques for capturing percussive gestures. It is challenging to obtain both quick response times and low-level data (such as position) that contain expressive information. This research is a comprehensive study of current methods to evaluate the available strategies and technologies. This study aims to demonstrate the benefits and detriments of the current state of percussion controllers as well as yield tools for those who would wish to conduct this type of study in the future.</abstract>
<keywords>Percussion Controllers, Timbre-recognition based instruments, Electronic Percussion, Sensors for Interface Design </keywords>
</document>
<document>
<name>nime2005_204.pdf</name>
<abstract>Discussion of time in interactive computer music systems engineering has been largely limited to data acquisition rates and latency. Since music is an inherently time-based medium, we believe that time plays a more important role in both the usability and implementation of these systems. In this paper, we present a time design space, which we use to expose some of the challenges of developing computer music systems with time-based interaction. We describe and analyze the time-related issues we encountered whilst designing and building a series of interactive music exhibits that fall into this design space. These issues often occur because of the varying and sometimes conflicting conceptual models of time in the three domains of user, application (music), and engineering. We present some of our latest work in conducting gesture interpretation and frameworks for digital audio, which attempt to analyze and address these conflicts in temporal conceptual models.</abstract>
</document>
<document>
<name>nime2005_208.pdf</name>
<keywords>Reconfigurable, Sensors, Computer Music </keywords>
</document>
<document>
<name>nime2005_212.pdf</name>
<abstract>This paper describes the audio human computer interface experiments of ixi in the past and outlines the current platform for future research. ixi software [5] was founded by Thor Magnusson and Enrike Hurtado Mendieta in 2000. Since then we have been building prototypes in the form of screen-based graphical user interfaces for musical performance, researching human computer interaction in the field of music, and creating environments that other people can use to do similar work and that we can use in our workshops. Our initial starting point was that computer music software, and the way its interfaces are built, need not be limited to copying the acoustic musical instruments and studio technology that we already have; additionally, we can create unique languages and work processes for the virtual world. The computer is a vast creative space with specific qualities that can and should be explored. </abstract>
<keywords>Graphical user interfaces, abstract graphical interfaces, hyper- control, intelligent instruments, live performance, machine learning, catalyst software, OSC, interfacing code, open source, Pure Data, SuperCollider. </keywords>
</document>
<document>
<name>nime2005_216.pdf</name>
<abstract> Musicians and composers have been using brainwaves as generative sources in music for at least 40 years and the possibility of a brain-computer interface for direct communication and control was first seriously investigated in the early 1970s. Work has been done by many artists and technologists in the intervening years to attempt to control music systems with brainwaves and - indeed - many other biological signals. Despite the richness of EEG, fMRI and other data which can be read from the human brain, there has up to now been only limited success in translating the complex encephalographic data into satisfactory musical results. We are currently pursuing research which we believe will lead to the possibility of direct brain-computer interfaces for rich and expressive musical control. This report will outline the directions of our current research and results. </abstract>
</document>
<document>
<name>nime2005_220.pdf</name>
<abstract>We present a real-time system which allows musicians to interact with synthetic virtual characters as they perform. Using Max/MSP to parameterize keyboard and vocal input, meaningful features (pitch, amplitude, chord information, and vocal timbre) are extracted from live performance in real-time. These extracted musical features are then mapped to character behaviour in such a way that the musician's performance elicits a response from the virtual character. The system uses the ANIMUS framework to generate believable character expressions. Experimental results are presented for simple characters.</abstract>
<keywords>Music, synthetic characters, advanced man-machine interfaces, virtual reality, behavioural systems, interaction techniques, visualization, immersive entertainment, artistic installations </keywords>
</document>
<document>
<name>nime2005_224.pdf</name>
<abstract>In the Expression Synthesis Project (ESP), we propose a driving interface for expression synthesis. ESP aims to provide a compelling metaphor for expressive performance so as to make high-level expressive decisions accessible to non-experts. In ESP, the user drives a car on a virtual road that represents the music with its twists and turns, and makes decisions on how to traverse each part of the road. The driver's decisions affect in real-time the rendering of the piece. The pedals and wheel provide a tactile interface for controlling the car dynamics and musical expression, while the display portrays a first-person view of the road and dashboard from the driver's seat. This game-like interface allows non-experts to create expressive renderings of existing music without having to master an instrument, and allows expert musicians to experiment with expressive choice without having to first master the notes of the piece. The prototype system has been tested and refined in numerous demonstrations. This paper presents the concepts underlying the ESP system and the architectural design and implementation of a prototype.</abstract>
<keywords>Music expression synthesis system, driving interface. </keywords>
</document>
<document>
<name>nime2005_228.pdf</name>
<abstract>While many new interfaces for musical expression have been presented in the past, methods to evaluate these interfaces are rare. This paper presents a method and a study comparing the potential for musical expression of different string-instrument based musical interfaces. Cues for musical expression are defined based on results of research in musical expression and on methods for musical education in instrumental pedagogy. Interfaces are evaluated according to how well they are estimated to allow players to make use of their existing technique for the creation of expressive music.</abstract>
<keywords>Musical Expression, electronic bowed string instrument, evaluation of musical input devices, audio signal driven sound synthesis </keywords>
</document>
<document>
<name>nime2005_232.pdf</name>
</document>
<document>
<name>nime2005_236.pdf</name>
<abstract>A wide variety of singing synthesis models and methods exist, but there are remarkably few real-time controllers for these models. This paper describes a variety of devices developed over the last few years for controlling singing synthesis models implemented in the Synthesis Toolkit in C++ (STK), Max/MSP, and ChucK. All of the controllers share some common features, such as air-pressure sensing for breathing and/or loudness control, means to control pitch, and methods for selecting and blending phonemes, diphones, and words. However, the form factors, sensors, mappings, and algorithms vary greatly between the different controllers.</abstract>
<keywords>Singing synthesis, real-time singing synthesis control. </keywords>
</document>
<document>
<name>nime2005_238.pdf</name>
<abstract>The author describes a recent composition for piano and computer in which the score performed by the pianist, read from a computer monitor, is generated in real-time from a vocabulary of predetermined scanned score excerpts. The author outlines the algorithm used to choose and display a particular excerpt and describes some of the musical difficulties faced by the pianist in a performance of the work. </abstract>
<keywords>Score generation, Jitter. </keywords>
</document>
<document>
<name>nime2005_240.pdf</name>
<abstract>No Clergy is an interactive music performance/installation in which the audience is able to shape the ongoing music. In it, members of a small acoustic ensemble read music notation from computer screens. As each page refreshes, the notation is altered and shaped both by stochastic transformations of earlier music within the same performance and by audience feedback, collected via standard CGI forms. </abstract>
<keywords>notation, stochastic, interactive, audience, Python, Lilypond </keywords>
</document>
<document>
<name>nime2005_242.pdf</name>
<abstract>This paper describes the design of SoniMime, a system for the sonification of hand movement for real-time timbre shaping. We explore the application of the tristimulus timbre model for the sonification of gestural data, working toward the goals of musical expressivity and physical responsiveness. SoniMime uses two 3-D accelerometers connected to an Atmel microprocessor which outputs OSC control messages. Data filtering, parameter mapping, and sound synthesis take place in Pd running on a Linux computer.</abstract>
<keywords>Sonification, Musical Controller, Human Computer Interaction </keywords>
</document>
<document>
<name>nime2005_244.pdf</name>
<keywords>Musical controller, sensate surface, mapping system </keywords>
</document>
<document>
<name>nime2005_246.pdf</name>
<abstract>This paper describes the design and implementation of BeatBoxing, a percussive gestural interface for the live performance of electronic music and control of computer-based games and musical activities.</abstract>
<keywords>Performance, Gestural Mapping, Music Controller, Human- Computer Interaction, PureData (Pd), OSC </keywords>
</document>
<document>
<name>nime2005_248.pdf</name>
<abstract>This paper describes the development of AirStick, an interface for musical expression. AirStick is played "in the air", in a Theremin style. It is composed of an array of infrared proximity sensors, which allow the mapping of the position of any interfering obstacle inside a bi-dimensional zone. This controller sends both x and y control data to various real-time synthesis algorithms. </abstract>
<keywords>Music Controller, Infrared Sensing, Computer Music. </keywords>
</document>
<document>
<name>nime2005_250.pdf</name>
<keywords>Musical Controller, Collaborative Control, Haptic Interfaces </keywords>
</document>
<document>
<name>nime2005_252.pdf</name>
<abstract>We present a Virtual Interface to Feel Emotions, called VIFE _alpha v.01. The work investigates the idea of synaesthesia and its enormous possibilities for creating new realities, sensations and zones where the user can find new points of interaction. This interface allows the user to create sonorous and visual compositions in real time. Six three-dimensional sonorous forms are modified according to the movements of the user. These forms represent sonorous objects that respond to the user by means of sensory stimuli. Multiple combinations of colors and sound effects are superimposed on one another to give rise to a unique experience. </abstract>
<keywords>Synaesthesia, 3D render, new reality, virtual interface, creative interaction, sensors. </keywords>
</document>
<document>
<name>nime2005_254.pdf</name>
<abstract>The Sonictroller was originally conceived as a means of introducing competition into an improvisatory musical performance. By reverse-engineering a popular video game console, we were able to map sound information (volume, pitch, and pitch sequences) to any continuous or momentary action of a video game sprite.</abstract>
<keywords>video game, Nintendo, music, sound, controller, Mortal Kombat, trumpet, guitar, voice </keywords>
</document>
<document>
<name>nime2005_258.pdf</name>
<abstract>In this presentation, we discuss and demonstrate a multiple touch sensitive (MTS) keyboard developed by Robert Moog for John Eaton. Each key of the keyboard is equipped with sensors that detect the three-dimensional position of the performer's finger. The presentation includes some of Eaton's performances for certain earlier prototypes as well as this keyboard. </abstract>
<keywords>Multiple touch sensitive, MTS, keyboard, key sensor design, upgrading to present-day computers </keywords>
</document>
<document>
<name>nime2005_260.pdf</name>
<abstract>This paper will demonstrate the use of the Smart Controller workbench in the Interactive Bell Garden. </abstract>
<keywords>Control Voltage, Open Sound Control, Algorithmic Composition, MIDI, Sound Installations, Programmable Logic Control, Synthesizers. </keywords>
</document>
<document>
<name>nime2005_262.pdf</name>
<abstract>The Swayway is an audio/MIDI device inspired by the simple concept of the wind chime. This interactive sculpture translates its swaying motion, triggered by the user, into sound and light. Additionally, the motion of the reeds contributes to the visual aspect of the piece, converting the whole into a sensory and engaging experience.</abstract>
<keywords>Interactive sound sculpture, flex sensors, midi chimes, LEDs, sound installation. </keywords>
</document>
<document>
<name>nime2005_264.pdf</name>
<abstract>This paper describes the transformation of an everyday object into a digital musical instrument. By tracking hand movements and tilt on one of two axes, the Bubbaboard, a transformed handheld washboard, allows a user to play scales at different octaves while simultaneously offering the ability to use its inherent acoustic percussive qualities. Processed sound is fed to the Mommaspeaker, which creates physically generated vibrato at a speed determined by tilting the Bubbaboard on its second axis. </abstract>
<keywords>Gesture based controllers, Musical Performance, MIDI, Accelerometer, Microcontroller, Contact Microphone </keywords>
</document>
<document>
<name>nime2005_266.pdf</name>
<abstract>The Wise Box is a new wireless digitizing interface for sensors and controllers. An increasing demand for this kind of hardware, especially in the field of dance and computer performance, led us to design a wireless digitizer that allows for multiple users, with high bandwidth and accuracy. The interface design was initiated in early 2004 and shortly described in reference [1]. Our recent effort was directed at making this device available to the community in the form of a manufactured product, similar to our previous interfaces such as AtoMIC Pro, Eobody or Ethersense [1][2][3]. We describe here the principles we used for the design of the device as well as its technical specifications. The demo will show several devices running at once and used in real-time with a variety of sensors. </abstract>
<keywords>Gesture, Sensors, WiFi, 802.11, OpenSoundControl. </keywords>
</document>
<document>
<name>nime2005_268.pdf</name>
<abstract>Soundstone is a small wireless music controller that tracks movement and gestures, and maps these signals to characteristics of various synthesized and sampled sounds. It is intended to become a general-purpose platform for exploring the sonification of movement, with an emphasis on tactile (haptic) feedback. </abstract>
<keywords>Gesture recognition, haptics, human factors, force, acceleration, tactile feedback, general purpose controller, wireless. </keywords>
</document>
<document>
<name>nime2005_271.pdf</name>
<abstract>Contemplace is a spatial personality that redesigns itself dynamically according to its conversations with its visitors. Sometimes welcoming, sometimes shy, and sometimes hostile, Contemplace's mood is apparent through a display of projected graphics, spatial sound, and physical motion. Contemplace is an environment in which inhabitation becomes a two-way dialogue.</abstract>
<keywords>Interactive space, spatial installation, graphic and aural display, motion tracking, Processing, Flosc </keywords>
</document>
<document>
<name>nime2005_272.pdf</name>
<abstract>Mocean is an immersive environment that creates sensory relationships between natural media, particularly exploring the potential of water as an emotive interface.</abstract>
<keywords>New interface, water, pipe organ, natural media, PIC microcontroller, wind instrument, human computer interface. </keywords>
</document>
<document>
<name>nime2006_026.pdf</name>
<abstract>This paper presents the concepts and techniques used in a family of location-based multimedia works. The paper has three main sections: 1) to describe the architecture of an audio-visual hardware/software framework we have developed for the realization of a series of locative media artworks, 2) to discuss the theoretical and conceptual underpinnings motivating the design of the technical framework, and 3) to elicit from this fundamental issues and questions that can be generalized and applied to the growing practice of locative media.</abstract>
<keywords>Mobile music, urban fiction, locative media. </keywords>
</document>
<document>
<name>nime2006_037.pdf</name>
<abstract>This paper describes two new live performance scenarios for performing music using Bluetooth-enabled mobile phones. Interaction between mobile phones via wireless link is a key feature of the performance interface for each scenario. Both scenarios are discussed in the context of two publicly performed works for an ensemble of players in which mobile phone handsets are used both as sound sources and as hand-held controllers. In both works mobile phones are mounted in a specially devised pouch attached to a cord and physically swung to produce audio chorusing. During performance some players swing phones while others operate phones as hand-held controllers. Wireless connectivity enables interaction between flying and hand-held phones. Each work features different Bluetooth implementations. In one, a dedicated mobile phone acts as a server that interconnects multiple clients, while in the other point-to-point communication takes place between clients on an ad hoc basis. The paper summarises Bluetooth tools designed for live performance realisation and concludes with a comparative evaluation of both scenarios for future implementation of performance by large ensembles of non-expert players performing microtonal music using ubiquitous technology. </abstract>
<keywords>Java 2 Micro Edition; j2me; Pure Data; PD; Real-Time Media Performance; Just Intonation. </keywords>
</document>
<document>
<name>nime2006_043.pdf</name>
<abstract>Physically situated public art poses significant challenges for the design and realization of interactive, electronic sound works. Consideration of diverse audiences, environmental sensitivity, exhibition conditions, and logistics must guide the artwork. We describe our work in this area, using a recently installed public piece, Transition Soundings, as a case study that reveals a specialized interface and open-ended approach to interactive music making. This case study serves as a vehicle for examination of the real world challenges posed by public art and its outcomes. </abstract>
<keywords>Music, Sound, Interactivity, Arts, Public Art, Network Systems, Sculpture, Installation Art, Embedded Electronics. </keywords>
</document>
<document>
<name>nime2006_049.pdf</name>
<keywords>Graphical interfaces, collaborative performance, networking, computer music ensemble, emergence, visualization, education. </keywords>
</document>
<document>
<name>nime2006_053.pdf</name>
<abstract>The culture of laptop improvisation has grown tremendously in recent years. The development of personalized software instruments presents interesting issues in the context of improvised group performances. This paper examines an approach that is aimed at increasing the modes of interactivity between laptop performers and at the same time suggests ways in which audiences can better discern and identify the sonic characteristics of each laptop performer. We refer to a software implementation that was developed for the BLISS networked laptop ensemble with a view to designing a shared format for the exchange of messages within local and internet-based networks. </abstract>
<keywords>Networked audio technologies, laptop ensemble, centralized audio server, improvisation </keywords>
</document>
<document>
<name>nime2006_061.pdf</name>
<keywords>touch screen, PDA, Pure Data, controller, mobile musical instrument, human computer interaction </keywords>
</document>
<document>
<name>nime2006_065.pdf</name>
<abstract>This paper discusses the concept of using background music to control video game parameters and thus actions on the screen. Each song selected by the player makes the game look different and behave differently. The concept is explored by modifying an existing video game and playtesting it with different kinds of MIDI music. Several examples of mapping MIDI parameters to game events are presented. As mobile phones' MIDI players do not usually have a dedicated callback API, real-time MIDI analysis software for Symbian OS was implemented. Future developments, including real-time group performance as a way to control game content, are also considered. </abstract>
<keywords>Games, MIDI, music, rhythm games, background music reactive games, musically controlled games, MIDI-controlled games, Virtual Sequencer. </keywords>
</document>
<document>
<name>nime2006_071.pdf</name>
<abstract>Turntable musicians have yet to explore new expressions with digital technology. New higher-level development tools open possibilities for these artists to build their own instruments that can achieve artistic goals commercial products cannot. This paper will present a rough overview on the practice and recent development of turntable music, followed by descriptions of two projects by the author. </abstract>
<keywords>Turntable music, DJ, turntablist, improvisation, Max/MSP, PIC Microcontroller, Physical Computing </keywords>
</document>
<document>
<name>nime2006_075.pdf</name>
<abstract>This report presents an interface for musical performance called the spinCycle. spinCycle enables performers to make visual patterns with brightly colored objects on a spinning turntable platter; these patterns are translated into musical arrangements in real time. I will briefly describe the hardware implementation and the sound generation logic used, as well as provide a historical background for the project.</abstract>
<keywords>Color-tracking, turntable, visualization, interactivity, synesthesia </keywords>
</document>
<document>
<name>nime2006_081.pdf</name>
<abstract>The PETECUBE project consists of a series of musical interfaces designed to explore multi-modal feedback. This paper will briefly describe the definition of multimodal feedback, the aim of the project, the development of the first PETECUBE and proposed further work. </abstract>
<keywords>Multi-modal Feedback. Haptics. Musical Instrument. </keywords>
</document>
<document>
<name>nime2006_085.pdf</name>
<keywords>Digital musical instrument, kinesthetic feedback </keywords>
</document>
<document>
<name>nime2006_089.pdf</name>
<abstract>The Orbophone is a new interface that radiates rather than projects sound and image. It provides a cohesive platform for audio and visual presentation in situations where both media are transmitted from the same location and localization in both media is perceptually correlated. This paper discusses the advantages of radiation over conventional sound and image projection for certain kinds of interactive public multimedia exhibits and describes the artistic motivation for its development against a historical backdrop of sound systems used in public spaces. One exhibit using the Orbophone is described in detail together with description and critique of the prototype, discussing aspects of its design and construction. The paper concludes with an outline of the Orbophone version 2.</abstract>
<keywords>Immersive Sound; Multi-channel Sound; Loud-speaker Array; Multimedia; Streaming Media; Real-Time Media Performance; Sound Installation. </keywords>
</document>
<document>
<name>nime2006_093.pdf</name>
<abstract>The gluion is a sensor interface that was designed to overcome some of the limitations of more traditional designs based on microcontrollers, which only provide a small, fixed number of digital modules such as counters and serial interfaces. These are often required to handle sensors where the physical parameter cannot easily be converted into a voltage. Other sensors are packed into modules that include converters and communicate via SPI or I2C. Finally, many designs require output capabilities beyond simple on/off. The gluion approaches these challenges through its FPGA-based design, which allows for a large number of digital I/O modules. It also provides superior flexibility regarding their configuration, resolution, and functionality. In addition, the FPGA enables a software implementation of the host link - in the case of the gluion, the OSC protocol as well as the underlying Ethernet layers.</abstract>
</document>
<document>
<name>nime2006_097.pdf</name>
<abstract>A new sensor integration system and its first incarnation are described. As well as supporting existing analog sensor arrays, a new architecture allows for easy integration of the new generation of low-cost digital sensors used in computer music performance instruments and installation art.</abstract>
<keywords>Gesture, sensor, MEMS, FPGA, network, OSC, configurability </keywords>
</document>
<document>
<name>nime2006_101.pdf</name>
<abstract>How can we provide interfaces to synthesis algorithms that will allow us to manipulate timbre directly, using the same timbre-words that are used by human musicians to communicate about timbre? This paper describes ongoing work that uses machine learning methods (principally genetic algorithms and neural networks) to learn (1) to recognise timbral characteristics of sound and (2) to adjust timbral characteristics of existing synthesized sounds.</abstract>
<keywords>timbre; natural language; neural networks </keywords>
</document>
<document>
<name>nime2006_103.pdf</name>
<keywords>composition, process, materials, gesture, controller, cross- modal interaction </keywords>
</document>
<document>
<name>nime2006_114.pdf</name>
<abstract>This paper reports on ongoing studies of the design and use of support for remote group music making. In this paper we outline the initial findings of a recent study focusing on the function of decay of contributions in collaborative music making. Findings indicate that persistent contributions lend themselves to individual musical composition and learning novel interfaces, whilst contributions that quickly decay engender a more focused musical interaction in experienced participants.</abstract>
</document>
<document>
<name>nime2006_118.pdf</name>
<keywords>Collaborative interface, remote jamming, network music, interaction design, novice, media space </keywords>
</document>
<document>
<name>nime2006_124.pdf</name>
<abstract>In this paper, we describe the networking of multiple Integral Music Controllers (IMCs) to enable an entirely new method for creating music by tapping into the composite gestures and emotions of not just one, but many performers. The concept and operation of an IMC is reviewed as well as its use in a network of IMC controllers. We then introduce a new technique of Integral Music Control by assessing the composite gesture(s) and emotion(s) of a group of performers through the use of a wireless mesh network. The Telemuse, an IMC designed precisely for this kind of performance, is described and its use in a new musical performance project under development by the authors is discussed. </abstract>
</document>
<document>
<name>nime2006_129.pdf</name>
<abstract>This paper explores the use of perturbation in designing multi-performer or multi-agent interactive musical interfaces. A problem with the multi-performer approach is how to cohesively organize the independent data inputs into usable control information for synthesis engines. Perturbation has proven useful for navigating multi-agent NIMEs. The author's Windtree is discussed as an example multi-performer instrument in which perturbation is used for multichannel ecological modeling. The Windtree uses a physical system turbulence model controlled in real time by four performers. </abstract>
</document>
<document>