In this paper we presented a study of the influence of the mapping layer as a determining factor in the expressive control possibilities of a synthesis system. We introduced a three-layer classification of mapping schemes that proved useful in determining mapping parameter relationships for different performance situations; these mappings were applied to the control of additive synthesis. From this experience, the authors feel that the mapping layer is a key element in attaining expressive control of signal-model synthesis.
Several mapping examples were presented and discussed. In an instrumental approach, the convergent mappings demonstrated in this paper have the potential to provide higher levels of expressivity to existing MIDI controllers. Without the need to develop new hardware, off-the-shelf controllers can be given new life via coupling schemes that attempt to simulate the behaviors of acoustic instruments.
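As a loose illustration of such a coupling scheme, the following sketch shows a convergent (many-to-one) mapping from several MIDI controls to a single abstract synthesis parameter. The parameter names, weights, and non-linearity are purely illustrative assumptions, not the mappings used in this paper:

\begin{verbatim}
# Hypothetical sketch of a convergent (many-to-one) mapping; the
# controls, weights, and non-linearity are illustrative only.

def convergent_brightness(velocity, aftertouch, mod_wheel):
    """Couple three MIDI controls (0-127) into one brightness value in [0, 1].

    No single control spans the full range: energetic use of several
    controls at once is needed to reach the upper values, loosely
    mimicking how blowing pressure and embouchure jointly shape the
    spectrum of a wind instrument.
    """
    v = velocity / 127.0
    a = aftertouch / 127.0
    m = mod_wheel / 127.0
    # Weighted, non-linear combination of the three normalized controls.
    return min(1.0, 0.5 * v + 0.3 * a ** 2 + 0.2 * v * m)


if __name__ == "__main__":
    print(convergent_brightness(100, 64, 30))    # moderate playing
    print(convergent_brightness(127, 127, 127))  # maximum effort -> 1.0
\end{verbatim}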
Finally, regarding the interpolation between additive models, we showed that in order to achieve a ``correct'' morphing between models, the non-linear coupling phenomena must be taken into account. Interpolation of partial frequencies must therefore be restricted to groups of partials sharing corresponding ``regimes'' of fluctuation, i.e., coupled partials, non-coupled partials, and ``noise''. To bypass this problem, we currently eliminate all inharmonicity from the models before performing the interpolations.
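A minimal sketch of this workaround is given below, assuming each additive model is represented by a fundamental frequency plus lists of partial frequencies and amplitudes; the data layout and function names are assumptions for illustration, not the actual implementation:

\begin{verbatim}
# Sketch of the workaround described above: each model's partials are
# forced onto an exact harmonic series before morphing, so partial k of
# one model always interpolates with partial k of the other.
import numpy as np

def harmonize(f0, partial_freqs, partial_amps):
    """Discard inharmonicity: keep amplitudes, snap frequencies to k * f0."""
    k = np.arange(1, len(partial_freqs) + 1)
    return k * f0, np.asarray(partial_amps, dtype=float)

def morph(model_a, model_b, alpha):
    """Interpolate two harmonized additive models; alpha in [0, 1]."""
    f0_a, freqs_a, amps_a = model_a
    f0_b, freqs_b, amps_b = model_b
    _, amps_a = harmonize(f0_a, freqs_a, amps_a)
    _, amps_b = harmonize(f0_b, freqs_b, amps_b)
    f0 = (1 - alpha) * f0_a + alpha * f0_b
    n = min(len(amps_a), len(amps_b))
    amps = (1 - alpha) * amps_a[:n] + alpha * amps_b[:n]
    freqs = np.arange(1, n + 1) * f0   # partials remain exactly harmonic
    return f0, freqs, amps
\end{verbatim}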