
Music Making in the Metaverse

Another opportunity the metaverse holds for creation is as a space for music production, composition, and performance. Metaverse features such as customization and world-building will enable musicians to perform, interact, and collaborate remotely and simultaneously with other musicians around the world through virtual instruments and virtual environments.





Collaborative music-making in the metaverse

The nature of collaborative music-making often requires coordination, which usually happens with the help of visual and auditory cues. At present, since the technology has not been fully developed, collaboratively composing or performing in the metaverse can be challenging and complex: speed limitations in current internet infrastructure lead to low bandwidth and latency issues. As the technology develops, these problems should be reduced and eventually eliminated, allowing the metaverse to give co-present musicians the ability to perform seamlessly, as if they were in the same physical space.
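To make the latency constraint concrete, the sketch below estimates one-way latency as fiber propagation time plus fixed buffering and processing costs, and compares the total against the roughly 25-30 ms one-way tolerance often cited in the networked-performance literature. The specific buffer and processing figures are illustrative assumptions, not measurements.

```python
# Rough one-way latency budget for a networked performance link.
# The ~25 ms threshold is a commonly cited figure in the NMP literature;
# the buffer/processing costs below are illustrative assumptions.

SPEED_IN_FIBER_KM_S = 200_000      # approx. speed of light in optical fiber
ENSEMBLE_THRESHOLD_MS = 25         # commonly cited one-way tolerance

def one_way_latency_ms(distance_km: float,
                       audio_buffer_ms: float = 5.0,
                       processing_ms: float = 5.0) -> float:
    """Estimate one-way latency: propagation + buffering + processing."""
    propagation_ms = distance_km / SPEED_IN_FIBER_KM_S * 1000.0
    return propagation_ms + audio_buffer_ms + processing_ms

for km in (50, 500, 5000):
    latency = one_way_latency_ms(km)
    verdict = "within" if latency <= ENSEMBLE_THRESHOLD_MS else "beyond"
    print(f"{km:>5} km -> {latency:5.1f} ms ({verdict} ~{ENSEMBLE_THRESHOLD_MS} ms)")
```

Even with these optimistic figures, intercontinental distances alone push the total past the threshold, which is why coordination over today's networks remains hard.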

Visual communication

Body movements, facial expressions, and gestures are part of the visual communication in collaborative music-making. These visual cues, along with auditory cues, help musicians stay coordinated and communicate with each other during the process. As metaverse technology evolves toward more natural and realistic visual representations of individuals, avatars will be able to convey more of this information and thereby improve access to these cues.

Sense of presence

Sense of presence, the feeling of being in the same place, is essential to the creative and collaborative music-making process, since it allows musicians to anticipate each other's moves. A diminished sense of co-presence can make creative coordination harder or simply lower satisfaction (Loveridge, 2020). According to recent research on this topic by Yakura and Goto (2020), presenting entire body movements in avatar form contributes to an improved sense of co-presence.


In that case, a metaverse with 3D-designed virtual environments can provide an immersive visual experience and a complete sense of presence when individuals' own movements are seamlessly reflected in their avatars.


 

Networked music performance

Over the past two decades, the tremendous growth of the internet has expanded the freedom for new forms of musical interaction such as Networked Music Performance (NMP). According to Rottondi et al. (2016), NMP refers to a group of musicians interacting and performing through a telecommunication network.


Computer systems for NMP have allowed musicians to interact with virtual and physical instruments and to create sonic environments. In these practices, musicians have most often presented themselves visually through remote video-conferencing technologies in order to detect visual cues from the other musicians.
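Many NMP systems exchange note and control events over a lightweight protocol such as Open Sound Control (OSC). The minimal sketch below uses the python-osc package to send a note event to a remote peer; the host, port, and the "/nmp/note" address pattern are arbitrary illustrative choices, not a standard.

```python
# Minimal sketch: sending a note event to a remote musician over OSC.
# Assumes the python-osc package (pip install python-osc); the address
# pattern and endpoint below are illustrative, not a standard.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.0.2.10", 9000)   # remote peer's host and port

def send_note(pitch: int, velocity: int) -> None:
    """Send a note event; the receiver maps it onto a virtual instrument."""
    client.send_message("/nmp/note", [pitch, velocity])

send_note(60, 100)   # middle C, played loudly
```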


Even though many recent developments have reduced the delay and synchronization problems of NMP, the lack of visualization of the shared space and of the musicians has limited visual communication, preventing performers from seeing the visual cues that are an important component of collaborative music performance.


 

Avatar Orchestra Metaverse

Founded in 2007, the Avatar Orchestra Metaverse (AOM) is a collaborative group of performers who create live audio-visual performances and rehearse within a metaverse using virtual instruments. AOM consists of artists specializing in various disciplines such as music, architecture, visual art, and sound art. AOM members have worked together to build specific landscapes, architecture, choreography, avatars, and instruments to perform with inside Second Life.


In some performances, the avatars of AOM members are themselves designed as musical instruments, meaning that each movement of an avatar triggers sound samples and manipulates them. To stay synchronized during performances, AOM members follow a score (in the form of text, pictures, or traditional notation) or a conductor avatar. They also use several HUDs (Heads-Up Displays), user interfaces that display and control data, to control sound, avatar movements, and visuals (Martin, 2018).



Aviaphone, an instrument created by AOM member Bingo Onomatopei that plays found sounds from Second Life.
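Conceptually, an avatar-instrument is a mapping from movement events to sample triggers and manipulations. In Second Life this logic is written in the platform's scripting language; the Python sketch below only illustrates the idea, and the gesture names, sample files, and Sampler class are hypothetical stand-ins.

```python
# Illustrative sketch of the avatar-as-instrument idea: movement events
# are mapped to sound samples and playback manipulations. Gesture names,
# sample files, and the Sampler class are hypothetical stand-ins.

class Sampler:
    """Stand-in for whatever engine actually plays and manipulates samples."""
    def play(self, sample: str, rate: float = 1.0) -> None:
        print(f"playing {sample!r} at rate {rate:.2f}")

# Each movement event is bound to a sample and a playback-rate manipulation.
GESTURE_MAP = {
    "raise_left_arm":  ("bell_loop.wav", 1.0),
    "raise_right_arm": ("bell_loop.wav", 1.5),   # same sample, pitched up
    "jump":            ("crash.wav", 1.0),
}

def on_avatar_move(event: str, sampler: Sampler) -> None:
    """Trigger the sample bound to a movement event, if any."""
    if event in GESTURE_MAP:
        sample, rate = GESTURE_MAP[event]
        sampler.play(sample, rate)

on_avatar_move("raise_left_arm", Sampler())
```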

For some performances, avatars and landscapes are designed to be triggered by the sounds made by individual players, so their shapes and colors change in time with the music. These changes also make it easier for the audience to tell which avatar/musician is playing at any given moment. Audience members experiencing these performances through their avatars perceive volume and sound differently depending on their position: moving or walking closer to a performer's avatar raises the level of the sound coming from that performer, while moving away lowers it.
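This position-dependent listening can be modeled as simple distance-based attenuation, as in the sketch below. The inverse-distance falloff is a common game-audio convention used here as an assumption, not Second Life's documented audio model.

```python
# Sketch of position-dependent listening: each performer's perceived level
# falls off with distance from the listener's avatar. The inverse-distance
# law and reference distance are game-audio conventions, assumed here.
import math

def gain_for_distance(distance_m: float, ref_distance_m: float = 1.0) -> float:
    """Full gain within the reference distance, then 1/d falloff."""
    if distance_m <= ref_distance_m:
        return 1.0
    return ref_distance_m / distance_m

def listener_mix(listener_pos, performers):
    """Per-performer gains for a listener avatar at listener_pos."""
    return {
        name: round(gain_for_distance(math.dist(listener_pos, pos)), 3)
        for name, pos in performers.items()
    }

# Walking toward the flute avatar raises its gain relative to the drums.
print(listener_mix((0, 0), {"flute_avatar": (2, 0), "drum_avatar": (20, 0)}))
```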



