Abstract:

      This research explores the use of Virtual Reality (VR) in electronic music production and performance. It first considers how this technology can be integrated with Digital Audio Workstations such as Ableton Live and visual workstations such as Resolume, and then considers how consumers may participate in live performance within a VR environment. The findings document the procedures undertaken to successfully integrate live VR music production with performance, and to develop means of live VR performance for remote audiences. These are presented through in-depth descriptions as well as accompanying videos of the applications and procedures implemented to achieve the final results. The research concludes with a six-minute portfolio performance which encapsulates all of these findings into a cohesive demonstration.


Introduction:

      Electronic music is one of the largest genres today, and the notion of electronic music has always been a synergy of technology and music. As technology continues to develop exponentially, so does the potential to expand the scope of electronic music. The following research explores the use of Virtual Reality (VR) and Mixed Reality (MR) (or Augmented Reality (AR)) and their possible involvement in electronic music in terms of both production and performance, including VR opportunities for the audience as well as the performer. In order to do so, the study is divided into two components, ‘Production’ and ‘Performance’.

      Firstly, production entails the methods employed to manipulate electronic music in a virtual environment, taking traditional analogue components such as faders, buttons, and knobs, and reinventing them in a virtual space. Benefits include a reduced volume of gear required to produce this music, personal customisation, and visual appeal for the audience. This technology can transform a performance into motion-based music production. This research aims to link this technology with existing Digital Audio Workstations (DAWs) such as Ableton Live for a simple and practical merger of these current and developing technologies.

      Secondly, performance involves inviting audience members into a VR world, where they can enjoy both the visual aesthetics of the performer up close and personal, as well as an artistic virtual environment which offers a visual representation of the music as an immersive reality. This will include the use of industry-standard visuals applications such as Resolume. The research also aims to be cost-efficient for consumers: rather than requiring an expensive full VR headset, one only requires an affordable ($15) plastic headset which holds their smartphone.

      The research will then explore the potential of blending these two components into a cohesive experience for both performer and audience, ultimately aiming towards a final product or ‘Portfolio’ which will represent the research and findings in a six-minute VR electronic music performance in which all are welcome to participate, given they have access to a headset; if not, they can still watch the performance in 2D from their computer or smartphone.

Literature review:

      The evolution of computing has proven to be an exponential process over the past decades, as each development in the field opens a larger scope for further development. Today we see screen time playing an ever larger role in the life of the average consumer, to the point that the argument is raised that technology is removing the consumer from the experience of life. This translates to the music industry as a potential distraction preventing audiences from being fully immersed and active in traditional events such as concerts, raves, and music festivals. Additionally, electronically driven music is often accused of leaving artists removed from their audience, as laptops and other technological devices divert the artist's attention away from the audience, compared to traditional bands which were very audience focused. However, VR may offer a middle ground: hyper-realistic variations of such events that consumers may partake in from their own home, thereby allowing the technology to act as a bridge between artist and audience, rather than a distraction.
VR for Production

      At 23 years old, Russian scientist Leon Theremin made musical history by developing an all-new instrument, the Theremin, in 1920. It was the first electronic, as well as motion-based, instrument: with two circuits, pitch and volume, the device holds an electromagnetic field which is interrupted by a human's magnetic field, thereby altering the circuit and the audio output. In-air hand movements can be used to skilfully control pitch and volume, as an “experienced Theremin master appears to be dancing with her hands, drawing a song from thin air, like magic” (Grimes, 2011, para. 17).
Video, double click for playback.
 
 (The Haunted Theremin of Carolina Eyck | Reverb, 2018)

      With the development of electronic music, home computers, and gaming systems, further exploration of dance-like movement as a trigger for electronic music began. An excellent example of this is Dieter Doepfer's modifications to the Nintendo Power Glove, a device released by Nintendo in 1989 intended to allow players to interact with games through hand gestures (Chandler, 2015). Doepfer, however, saw the opportunity for music production and developed the MOGLI, a MIDI music processor that connected to the glove in place of the intended Nintendo console. This processor allowed the Power Glove wrist controls to select MIDI functionality, and hand movements and gestures were sent to a receiver which interpreted the electronic signal and output MIDI accordingly (Williams, 2017). Although comparable to the Theremin in functionality, the electronic processes of this new motion-based musical instrument now operated via MIDI, and therefore binary data, meaning it could trigger any synthesised or pre-recorded sound, greatly surpassing the limited timbre variance offered by the Theremin.
Video, double click for playback.
 
(Doepfer MOGLI Midi Glove (Rare - Vintage), 2017)

      Over the following three decades many others explored the concept of movement-based music production, such as David Rokeby's Very Nervous System (VNS) in the late 1990s, which “offers a sophisticated level of computer control for detecting the accurate speed and location of dancers on stage”. The device used motion sensors connected to computer programming to turn dancers into musical performers (Winkler, 1998, p. 1).
By the 2010s a myriad of new MIDI instruments began to reach the market, driven by the now mainstream electronic music scene. Devices like the Crystal Ball by Naonext, released in 2014, play almost like a Theremin: it uses five infrared sensors to detect hand movements around the ball, with an additional 24 buttons to control the functionality of the sensors (Rovito, 2014).

      The Mi.Mu gloves by Imogen Heap, originally released in 2014, have now launched their 2nd generation. A quantum leap from the Nintendo Power Glove, these gloves detect all hand gestures and movements and connect to software to fully map every hand movement to any chosen parameter in the user's chosen DAW (Sawh, 2019). These gloves are among the leaders of today's motion-based music production; however, a pair costs just under $5,000, making them inaccessible to many music producers.
Video, double click for playback.
 
(MI.MU Gloves: Music Through Movement, 2019)

      The V Motion Project was a New Zealand-based project in which a team, including award-winning producer Joel Little, developed and conducted a performance that turned kinetic movement, in the form of choreographed dancing, into live music production. The team described it thus: “creating music through motion is at the heart of this creation and uses the power of the Kinect to capture movement and translate it into music which is performed live and projected on a huge wall” (Sterling, 2012). With just under a million views on YouTube, this video has helped to raise awareness of developing motion-based music technologies.
Video, double click for playback.
 
(The V Motion Project -- Can’t Help Myself [Official Music Video]‬, 2012)

      New consumer-focused apps are also reaching the market, such as Kogura Pro, which ran a Kickstarter in 2016. Using a web camera, it allows assignable MIDI triggers to be placed in the frame; by watching oneself on screen, the user can move their hands to hit these triggers and compose live music (Sethi, 2016). Interestingly, this application offers something that the others do not: a visual reference point, rather than depending on the performer to mentally calculate distances and memorise gestures in order to produce each musical input.

      Today, at the dawn of the 2020s, two new perspectives on reality are quickly becoming a part of mainstream culture. Firstly, Virtual Reality (VR): these headsets visually transport users into an alternative reality with which they can often interact. Today, VR is most common for gameplay and simulated training. Secondly, Augmented or Mixed Reality (MR), where the user wears glasses through which their current reality is still visible, yet it is augmented or mixed with additional, often interactable objects that are not actually in their environment (Augmented Reality vs. Virtual Reality: AR and VR Made Clear, 2018). Today MR is most common among industry professionals such as surgeons or engineers, to allow for enhanced abilities in their work. MR is also currently much more expensive than VR, with Facebook's new Oculus Quest 2 VR headset released at $569 NZD (Oculus, n.d.) versus the new Microsoft HoloLens 2 MR headset selling for $6,199 NZD (Microsoft HoloLens, n.d.). Therefore, for the scope of this project we will focus on VR due to its practical pricing. However, ultimately MR would be well suited to music production, as it allows the user to view the real world and interact with necessary tools such as microphones and instruments as normal, with the addition of augmented objects such as virtual instruments and potentiometers.
Video, double click for playback.
 
(Augmented Reality vs. Virtual Reality: AR and VR Made Clear, 2018)

      The most notable development of VR/MR over other motion-based music production is the ability for the user to constantly see the variables that they are manipulating, rather than programming in and then memorising gestures. There are a number of VR ‘games’ on the market that allow users to visually interact with music in a free-form way, such as Electronauts, giving consumers the opportunity to construct music, but within a given framework; thus “the game has been designed with non-musicians in mind” (Barker, 2018, para. 6). However, less development has been made in applications that allow producers to interact between VR and a DAW, which would allow the full functionality of professional music production software used in the industry.

      A few platforms that do interact with Ableton directly are already available. The first is Modulia, which is aesthetically appealing and appears to be quite user-friendly, and is designed so that for “those experienced music producers who have invested in software like Ableton Live, Modulia Studio then opens up to allow even greater control. Ableton Live features can now be used inside VR, offering creator a new way to interact with their music” (Graham, 2019). This app is available on the Oculus store; however, the business may be having some form of internal difficulty, as there have been no updates or posts on their Facebook page in over one year. If no technical support or updates are available, this app may become an unfortunate dead end in terms of future-proofing.
Video, double click for playback.
 
(Modulia Studio - Your Unlimited VR Music Studio | Now on Oculus Rift & Rift S, 2019)


      Another application available is AliveInVR, a similar app, although it appears more crude aesthetically and also looks more limited in terms of control methods. However, its basic layout may prove to be quite flexible and allow for greater customisation; additionally, this app offers the bonus of “Stream video of your performance to the desktop with in-game camera for screen recording and sharing online” (MusicTech.net, 2017, para. 2), and even the ability to host VR spectators as well as collaboration between remote artists. At this time the developer website appears to be current, yet the developer may have shifted focus slightly to an additional platform called Transient which is all-inclusive, rather than connecting to a DAW; this model appears powerful as it can incorporate VST3s and loops, but is ultimately limited compared to Ableton.
Video, double click for playback.
 
(Aliveinvr Trailer, 2018)

VR for Performance

      Virtual Reality and Mixed Reality are poised to become the new way of computing. Mark Zuckerberg, founder and CEO of Facebook, America's sixth largest company (Duffin, 2020), claims that VR and MR are the next evolutionary step in computing and communication, from text, to images, to videos, to VR, to AR [MR] (Breen, 2016). Zuckerberg is not alone in this forecast; with Google, Microsoft, and Apple all making moves into VR and AR, this future will soon become a mainstream reality.

      In addition to offering musicians/producers a new arena to compose and perform, VR can also offer an opportunity to unite music fans and artists in new and exciting ways, including remotely. Currently, VR is predominantly focused towards gameplay, and from this we can gain perspective as to its potential relevance in the music industry. For example, a study that compared VR gaming to traditional monitor-based gaming demonstrated that “players showed a more intense emotional response” and an enhanced “perceived sense of presence”, recorded through “questionnaire, … heart rate, and skin conductance” (Pallavicini et al., 2019, p. 140). These findings imply that audiences may receive an enhanced experience of music events via VR, rather than through a conventional screen.

      VR is becoming very affordable and more commonplace; however, in 2018 it was estimated that there were only 171 million VR users (Tankovska, 2020), whereas there were 2.9 billion smartphone users in the same year (Gaubys, 2020). These figures make it obvious why companies like Google are focusing on mobile VR, with the release of the Daydream system in 2016. “Daydream View gives you a simple way to experience 360 degree games, videos and panoramic photos if you fasten your phone into its all-fabric headset body” (Swider, 2017, para. 1). These 360° videos offer a comparable experience to VR, all from the user's smartphone. As Julius Solaris describes in his article How 360° Live Video is Paving the Way for Virtual Reality at Events, “360° live video is the perfect antipasto while we wait for the main entrée, full, accessible VR for everyone” (2017, para. 20).
Video, double click for playback.
 
(Meet Daydream, Google’s Vision for Virtual Reality, 2016)

      Today the concept of VR concerts is already being explored, and “virtual concerts are rapidly growing in popularity online” (Charron, 2017, p. 803). 360° footage of concerts is available on YouTube, as well as on websites such as MelodyVR, which, partnered with “Warner Music Group, … Microsoft, Roc Nation, Sony Music Entertainment, and Universal Music Group” (Merrill, 2020, para. 2), offers a 360° video experience of concerts from top artists. By offering these performances, originally focused toward a live audience, to online viewers, MelodyVR “expects to accelerate VR adoption by bringing the public unique digital music experiences thanks to this elite network of relationships” (Merrill, 2020, para. 1).
Video, double click for playback.
 
 
(MelodyVR 2019 Highlights, 2020)
      With the opportunity now available for both VR music production and 360° performance, this research will explore these two facets independently as well as how they can be merged together to create cohesive events that may offer an enhanced experience for both the performer and the audience. 

Method:

      This research has been undertaken autoethnographically, in that the study has been conducted by one researcher, Jesse Brand, who is a musician, electronic music producer, and performer, as well as an undergraduate of Audio Production at the Southern Institute of Technology, New Zealand. This research is considered his major project in his graduating year. As the sole author and conductor of this research, Jesse acknowledges his subjectivity and subsequent bias within the work, and therefore offers this research as one of many possible outcomes when researching the field of VR in electronic music production, in order to understand the current standing of this field, to explore its potential, and to inspire others to further this field of research.

      The following results were determined through trial and error: reading through forums, and trying many apps and configurations. These processes were scattered with dead ends and quick decisions; the sheer volume of trials and adjustments makes them difficult to document in their entirety, and doing so would be redundant, because ultimately we are looking for the functional results rather than the errors. Therefore, the majority of the following research will outline the successes only. Even in discussing the successes, the volume of information can still be overwhelming, further highlighting the importance of documenting only the highlights of the research.

Music Production in VR

      The first phase of research was to identify and experiment with any available applications that would allow for VR music production. Separating consumer and professional applications was the first step, as there are ‘games’ such as Electronauts which offer general consumers the opportunity to create their own electronic music in VR without any previous experience. These ‘game’-style applications are not the focus of this research and were dismissed.
Standout apps that focus on professional-level music production and Digital Audio Workstation (DAW) integration are Control Room, Alive in VR, Transient, and Modulia Studios. Each app is available for the VR platforms Oculus Quest and/or Rift. Rift is the PC-powered option from Oculus, whereas Quest is standalone (Rift games can now run wirelessly on a Quest via an app called Virtual Desktop). Each app was purchased and evaluated for its compatibility with live music performance in relation to the author's preferred workflow and methodology.

      Firstly, Control Room was explored, as it appears very tidy and well designed. Offering Ableton synchronisation, it did have playable elements such as drum pads; however, it quickly became apparent that, as the name suggests, this app is tailored to a home production environment rather than performance. The app replicates a professional recording studio and functions accordingly. For this reason no more than an hour was spent exploring the potential of this particular VR application.
VR application developers Alive in Tech offer two professional VR production applications, Transient and Alive in VR. Transient operates as a standalone platform with much of the functionality that most DAWs offer, including the ability to utilise Virtual Studio Technology 3 (VST3) plugins as well as audio samples. In addition, the platform offers a multitude of virtual worlds and avatars where an audience can attend and watch the performance, and allows artists to collaborate in real time. This attempt to produce a VR ‘DAW’ is ambitious and the social abilities of the platform are impressive; however, there are ultimately considerable gaps between this DAW and the components of established DAWs such as Ableton.

      Alive in VR was the predecessor to Transient and acted as a virtual controller for Ableton. This platform offers much of the functionality that Control Room lacked, such as the ability to launch Ableton scenes and more, yet there was a distinct lack of MIDI mapping for variable control parameters. It also appears that Alive in Tech are focusing on their newer, all-in-one application Transient, and Alive in VR has had fewer updates and less focus as a result. Therefore, neither of these two applications was able to offer a rounded VR production/performance experience in this researcher's opinion.

      Lastly, Modulia Studios was explored. Modulia is only available for Oculus Rift and not Quest, posing initial challenges. This means that the application would run from a PC and not from the headset unit itself; although the operating computer met the minimum requirements to run Rift games, the graphics processing unit (GPU) was reported as inadequate by the Rift gaming centre application, which therefore refused to launch the app. This forced two eventual workaround options. The first was to run SideQuest, an alternative gaming centre application that uploads unregistered apps directly onto the Quest. Initially this method was perfect and allowed for exploration of this application and Ableton MIDI linking via Wi-Fi. After linking to Ableton and experimenting with the devices available, such as MIDI-mappable drum pads and XYZ (3D) variable potentiometer units, it was clear this application offered most of the functionality the others had lacked. Experimentation with XYZ and drum pad mapping, and composing and performing music in a virtual environment, was the highlight of the research so far. The results were both exciting and entertaining, demonstrating that immersive live music production can indeed surpass traditional methods of using tactile knobs and buttons.
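The XYZ potentiometer mapping described above reduces to a simple normalisation step. The following sketch is a hypothetical illustration of that arithmetic, not Modulia's actual implementation: the hand position inside the virtual box is normalised per axis and scaled to the 0–127 range a MIDI Control Change message can carry.

```python
def xyz_to_cc(pos, box_min, box_max):
    """Map a hand position inside a virtual XYZ box to three MIDI CC
    values (0-127), one per axis. All arguments are (x, y, z) tuples
    in the same (arbitrary) world units."""
    ccs = []
    for p, lo, hi in zip(pos, box_min, box_max):
        # Normalise to 0.0-1.0, clamping so a hand that strays
        # outside the box cannot produce out-of-range CC values.
        norm = max(0.0, min(1.0, (p - lo) / (hi - lo)))
        ccs.append(round(norm * 127))
    return tuple(ccs)

# The centre of a unit box maps to the middle of the CC range.
print(xyz_to_cc((0.5, 0.5, 0.5), (0, 0, 0), (1, 1, 1)))  # (64, 64, 64)
```

One such mapping yields three continuous controllers from a single hand, which is precisely what makes the 3D potentiometer richer than a physical knob.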

Changes in the App:

      After identifying Modulia Studios as the preferred application and undertaking numerous hours of experimentation, a number of conclusions were reached regarding how to take the best available app to the point where a performance could be undertaken professionally. The following is an email that was written to the company, which turned out to be a solo operation run by William Dulot, outlining my appreciation for the app and possible improvements.

Hey Moduila Team,

Hope you are well and would like to say 'great work' you have literally made my dreams come true, because I first realised I might be able to control Ableton using VR in a dream.  I woke up and started researching and found your app as well as Alive in VR and Control Room, I bought a Quest right away and am now furthering my research.  In fact, I am in my final year of my Bachelor of Audio Production degree and have chosen to do my major research project on this topic.  "Is it practical to produce and perform electronic music using VR, in real life, on a streaming service such as twitch, and in a VR environment with a VR audience?" 

I have been performing live EDM with Ableton for several years, and been a musician most of my life. I have worked with the Push and countless other controllers as well as apps like Touchable and Lemur.  All of these have had pros and cons and I believe that VR, and eventually AR will be the ultimate solution. 

So far I have reviewed Alive in VR, Control Room, and Modulia Studios and believe Modulia shows the most potential (it's really great what you have done!!!).  Therefore, I am reaching out with a few suggestions and would love to know your thoughts on them.  I am very passionate and would love to help you and the team however I can, including ideas and feedback.  Please forgive me if any of my ideas are already obvious to you!
Here are my ideas:

1) 'Pass-through' as an optional environment - as cool as it is to be in space etc. as a performer I feel it is important to see the world around me, so I can sing into a microphone, pick up and instrument, and see the audience.  It is vital for VR production to incorporate AR to move from a 'game' to a legitimate tool in the industry, in my opinion. Oculus is already talking about upcoming full colour Pass-through and improved visuals, plus technology such as HoloLens is coming fast. 

2) The main usability issue I have is lack of 'record arm' as well as 'solo' and mute'. In addition, I noticed you currently cannot record into an empty clip slot even when you arm the track in the DAW. Lastly, it would be fantastic if the clips could record to a set length, just as the Push as TouchAble allow. I believe Ableton view needs to function like a push,  but even better! As everything in VR can be easily customisable!! 
3) a floating clip clock or step sequencer so one knows where there are in the loop cycle at all times.  For example, this was an issue I have had before VR, as the clock in Ableton is too small on stage, I resolved it with a Beatstep sequencer controller always playing the loop cycle acting as a large visual loop clock. 

4) As great as the App will become it will never replace all the functions of Ableton desktop, and it sucks to remove headset to look at the computer for tasks such as set up and for midi mapping etc, I feel it would be very helpful to be able to switch to a 'Virtual Desktop' style version of Ableton without having to fully exit Modulia. Or maybe have Ableton as a floating window?!! 

5) A custom 'Name Plate' on the bottom front of each object, so one could write 'Bass Filter' on a XYZ Box for example, so it is easy to know what one is which. 

Summary:
1) Pass-through
2) - Arm, Solo, Mute
    - Set Clip Record Length
3) Loop Clock
4) Ableton Desktop View
5) Name Plate
I will stop here for now!  I am sure these ideas are a lot harder to set up then it is for me to explain them!!! 
Please do keep in touch, and let me know which if any, of these ideas might be possible, , and if I can be of any help. I complete my audio degree in 4 months and I plan to part of the future of electronic music with VR in every way possible. 
Thank again for your time and all you have done so far to create this ground-breaking app!!

Jesse Brand 
www.jessedbrand.com
www.facebook.com/jessedbrand
https://www.youtube.com/user/jessebrandband

William replied promptly with this following:

Hi Jesse,

Wow!! Thanks a lot for your email and for your involvement in Modulia Studio! I'm so glad that you share my enthusiasm about music production in VR and that you have such a cool major research project for your Bachelor!
Your ideas are all very relevant. Here are my thoughts about them:

1) Pass-through: To be transparent with you, I've always wanted VR to be more...transparent...! In fact, AR has been my goal since the very beginning of the project. I had a chance to develop for the HoloLens for several months and unfortunately, interactions really lacked accuracy for music creation (and the price and accessibility of the device was a no-go too). So I can't wait for an AR device that would really be convenient and precise! About the Oculus Quest's pass-through: I find it really useful in the Oculus Home and I want to implement it in Modulia but Oculus has not made it possible to do that in apps (for now..). But be sure that I will implement that option if that becomes possible.

2) Arm, solo, mute: I agree that solo and mute are missing. I add that to the TODO-list. About arm: I've made a system that automatically arms and sets MIDI channel for an Ableton track when you create an instrument that plays that track in Modulia. Record buttons on each Modulia instrument lets you record clips, it automatically records it in an empty clip slot (for now you can't choose which clip slot it is). I understand that these behaviours can be a bit rigid when you are used to Ableton and the Push but the idea was to keep it simple and automatic, to let user focus on creation instead of configuration.

3) Great idea for the clock, that is one module I need to add! I've once made a VR performance (https://www.facebook.com/163099574292009/videos/1001006580287483) and I had made a clock module with the hour of the day and the duration of the performance. I should add a utility module with all these information.

4) Having a virtual desktop would be really useful indeed. Unfortunately, that would be a lot of development, so I can't promise it would be implemented. (AR will solve a lot of these problems..!)

5) Great idea for the name plate. Indeed, I think mixers particularly lack indications.

Just to be transparent, Modulia Studio has been a one-man project since the beginning. I had been working on it full-time for a couple of years but this is no longer my main activity (I've been hired by BLEASS, a small music company that makes audio plugins https://www.facebook.com/BLEASSapps). So I continue to develop and maintain Modulia Studio beside but at a slower rhythm.

Thank you again for all these great ideas!
Good luck with your Bachelor and do not hesitate if you have any other feedback!
William DULOT
william@modulia-studio.com
www.modulia-studio.com
  
      William's reply was timely, clear, thorough, and helpful; however, it did not indicate that any of these current issues would be resolved in the near future, at least from his end. In light of this, attempts were then made to solve these problems independently, as follows:

1)    Pass-through mode utilises Oculus's outward-facing cameras, which are intended for room orientation during gameplay, to display the room to the user via the internal screen. Due to the lack of pass-through integration in Modulia Studios, Jesse experimented with using Oculus's pass-through boundary safety control to his advantage. By placing necessary objects such as instruments and instrument microphones just outside the user-designated room boundary, the user can move through this boundary intentionally (which engages pass-through mode) and see the actual room while inside the headset through the inbuilt black-and-white cameras. Additionally, a beta development from Oculus now allows a double tap anywhere on the side of the headset to initiate pass-through mode. The disadvantage of this system is that the VR controls and parameters are no longer visible; rather than a true Mixed Reality of both worlds, these results act as a compromise at this time.
Video, double click for playback
 
(Original Research)

2)    There is no way to independently modify Modulia in order to add mute, solo, and arm buttons; however, this was accomplished as a result of point 4, which was possible to achieve independently. The fixed loop length also mentioned was achievable by running a program called TouchAble, an app that allows Ableton interaction via touch-screen devices such as a phone or iPad. One key feature of TouchAble is fixed-loop-length recording, meaning that after a designated number of bars, a loop that is being recorded will stop recording and begin to loop. This feature is not native to Ableton, nor available via plugins (including Max for Live), from the desktop and therefore in VR (Ableton Push has fixed loop length, but only on the MIDI controller, not the desktop). By connecting Ableton to TouchAble on a phone and engaging fixed loop length, this feature also became available from the desktop itself.
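The fixed-loop-length behaviour described above comes down to simple tempo arithmetic. This minimal sketch (an illustration of the calculation, not TouchAble's code) shows how a bar count, tempo, and time signature determine exactly when a recording clip should stop and begin looping.

```python
def loop_record_seconds(bars, bpm, beats_per_bar=4):
    """How long a fixed-length loop records before it stops and
    begins to loop: bars * beats-per-bar beats at the given tempo."""
    beats = bars * beats_per_bar
    return beats * 60.0 / bpm  # 60 seconds per minute, bpm beats per minute

# A 4-bar loop at 120 BPM in 4/4 records for exactly 8 seconds.
print(loop_record_seconds(4, 120))  # 8.0
```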
Video, double click for playback
 
(Original Research)

3)    A floating loop cycle clock would be a major asset to timing control while working in the VR environment; knowing where the music is in the loop cycle allows for controlled build-ups and drops. Despite this being an issue many Ableton users have mentioned in forums, no Max for Live or other plugin seemed to offer this functionality. Finally, Circles was discovered, a plugin designed for multi-layered looping, which operates a visual analogue clock displaying the loop cycle. Unfortunately, it refused to float on top of the Ableton screen permanently. However, a solution was found by running two sessions of Ableton simultaneously, one for the music and one as a ‘dummy session’ solely for the clock app, tempo-matched via Ableton Link, a feature that allows two sessions of Ableton on the same local network to link their tempos, for collaborations and the like. This allowed the floating clock to remain on top of the ‘dummy session’ and therefore be captured by OVR Toolkit (OVR is further discussed under number 4) and added as a second floating window in VR, displaying the loop cycle of the main music session at all times.
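The information such a loop clock needs to display reduces to a modulo calculation over the song position. The sketch below is a hypothetical illustration of that calculation, assuming the current song position in beats is available (as it is to Max for Live devices); it is not how Circles itself is implemented.

```python
def loop_position(elapsed_beats, loop_bars, beats_per_bar=4):
    """Locate the playhead within the loop cycle. Returns
    (bar, beat, fraction): bar and beat are 1-indexed as Ableton
    displays them, fraction is overall progress through the loop."""
    loop_beats = loop_bars * beats_per_bar
    pos = elapsed_beats % loop_beats  # wrap around each cycle
    bar = int(pos // beats_per_bar) + 1
    beat = int(pos % beats_per_bar) + 1
    return bar, beat, pos / loop_beats

# Nine beats into an 8-bar loop: bar 3, beat 2, just over a quarter through.
print(loop_position(9, 8))  # (3, 2, 0.28125)
```

Rendering this count as a large dial is exactly the role the ‘dummy session’ clock played; the underlying position is all that matters for timing build-ups and drops.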
Video, double click for playback
 
(Original Research)
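
Conceptually, a loop-cycle clock of this kind only needs the tempo, the loop length, and the elapsed time to show where the music sits in the cycle. A minimal sketch of that calculation (illustrative only; the actual solution used the Circles plugin):

```python
# Illustrative sketch of what a loop-cycle clock computes: the current beat
# within the loop, from which controlled build-ups and drops can be timed.

def loop_position(elapsed_seconds, bpm, beats_per_loop):
    """Current beat within the loop cycle (0 <= result < beats_per_loop)."""
    total_beats = elapsed_seconds * bpm / 60.0
    return total_beats % beats_per_loop

# 10 s into a 16-beat loop at 120 BPM = 20 beats played = beat 4 of the cycle.
print(loop_position(10, 120, 16))  # 4.0
```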

4)    Viewing the Ableton screen inside the VR application was a long and laborious task, yet successful and valuable. It was discovered that some gamers were already able to access their Twitch chat (Twitch being a popular game-streaming website) while in VR and streaming live, allowing them to interact with their online audience at the same time.  However, this feature was not available for Oculus, only for Steam VR, an alternative VR platform used mostly by Vive headset owners.  Firstly, accessing the Steam VR platform via Oculus was made possible by Virtual Desktop (an Oculus app that allows the Quest to wirelessly access a Windows device and play VR games running from that device); Virtual Desktop integrates with Steam VR when both are run in conjunction.  Yet Modulia Studio is not available on the Steam platform, only Oculus, and so research eventually led to a patch called ‘Revive’, which was “created by LibreVR to play Rift games exclusive to Oculus Home on the HTC Vive” (Steam Community, n.d.), provided both Rift and Steam are installed on the same device.  This allowed the previously unplayable yet purchased Rift version of Modulia Studio (as Rift itself would not allow the computer, due to its graphics card, to launch the application) to play through Steam VR with no complications.  

      In order to view Ableton, the intention was to use the Steam Desktop feature, found by pressing the left-hand settings button, to temporarily display the desktop.  This posed massive problems, as the feature refused to display anything but a black screen; yet it was clearly working, because when moving the VR cursor around this black space, the cursor was also moving on the computer.  This clue encouraged further research, which spotted several others in forums with the same problem but no solution. Eventually, an unrelated article indicated the cause could be that the computer had two graphics cards that were not communicating properly in this context.  After disabling one graphics card completely, the issue was solved. However, the end result of this method still did not function as hoped.  
      From there, further research revealed two ‘overlay’ apps that might display Ableton from the computer in the VR headset: apps that Steam allows to run simultaneously with any other app already running, such as Modulia, now running in Steam via the Revive link.  “OVR Toolkit is a utility application designed to make viewing the desktop in VR simple and fast” (Steam OVR Toolkit, n.d.), and it eventually allowed any running application on the computer screen to be permanently visible and floating within VR, allowing full interaction with the Ableton desktop screen inside the VR environment while still utilising all the features of Modulia simultaneously. 
Video, double click for playback
 
(Original Research)

5)    Adding name plates is not something that a third party can achieve within Modulia Studio. However, this request did receive the most positive response and will therefore hopefully be added in the near future.  Regardless, name plates would be helpful but are not essential to proceed.
Video, double click for playback
 
(Original Research)

      Once these objectives had been completed, the last objective was to record what was being seen in VR for others to witness. This was achieved via SideQuest’s live-streaming feature, which captures what is in VR and transfers it to the computer screen via Wi-Fi; Windows’ inbuilt game recorder then allows this screen to be recorded as a movie file. 
      This, subjectively, completed a VR music-production setup that would allow an artist to perform live electronic dance music.  However, the element of audience inclusion was yet to be addressed.  

      The ability to orchestrate electronic music from a VR environment is great for the performer, yet in today’s music industry it is more important than ever to consider what the added value for the consumer will be. This enhancement may be found in two direct forms: firstly, ease of use for the performer may enhance the music, though this will probably not be noticeable to the audience directly.  Secondly, the artist will be able to leave the booth and computers behind and take centre stage, using more entertaining, movement-based production techniques rather than traditional buttons and knobs to control the music, much like a conductor of an orchestra.  

      Beyond these two benefits, the audience themselves may enjoy partaking in this VR reality.  To achieve this, the next component of research involved developing a means to offer a 360° video environment for the viewer, one which included the performer in this space.  One option would be a simple 360° live video, similar to MelodyVR’s offered experience. However, this only offers a less-than-complete replica of an event, and therefore leaves the online audience coming up short in comparison.  Thus, an all-new approach was developed, containing seven main steps.

Note: Two computers were used, one operating music and VR (Ableton and Modulia), and one operating video (Resolume, OBS, YouTube); further detail follows:

1)    360° Background Videos: A source of 360° videos would be required.  YouTube currently has the largest variety of 360° videos; therefore, referenced use of these videos would be necessary, as sources for 360° videos apart from YouTube are very limited.  Websites such as en.savefrom.net allow users to download any YouTube video from a URL; however, 360° videos will not download in a usable format.  Considerable research was required to find a means to download these videos in their original, equirectangular format, which was achieved using a GitHub application and the command prompt.  
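
The text does not name the GitHub application used, so as a purely hypothetical illustration: yt-dlp is one widely used command-line downloader hosted on GitHub that can list a video's available formats, allowing an original equirectangular stream to be selected rather than a remapped one. A sketch of how such an invocation might be built:

```python
# Hypothetical illustration only: the research does not name its download
# tool. yt-dlp is one GitHub-hosted command-line downloader that can list
# formats ("-F") and fetch a specific format id so the equirectangular
# original can be chosen. The URL and format id below are placeholders.
import subprocess

url = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder URL

list_cmd = ["yt-dlp", "-F", url]                               # list formats
download_cmd = ["yt-dlp", "-f", "313", "-o", "background360.mp4", url]

# Shown for illustration; only run if the tool is installed:
# subprocess.run(list_cmd, check=True)
# subprocess.run(download_cmd, check=True)
```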

2)     Green Screen: In order to green-screen or chroma-key a performer into these videos, a conventional camera could be used; however, it had disadvantages: it did not look consistently realistic, it left the presenter confined to one small section of the 360° environment, it ran the risk of an arm or leg being cut off if the performer moved outside the frame, and it stopped the performer from being able to utilise and interact with the entirety of the 360° space.  Therefore, a 360° camera was purchased and positioned in the centre of a room which was green-screened in 360°; this idea was developed by Jesse Brand, without reference to an external source.  Lighting a 360° room proved difficult, with much trial and error before coming to a successful result.
Video, double click for playback
 
(Original Research)

3)    Controlling Video: In order to keep the events entertaining, it would be necessary to change the 360° scenes throughout the set. This was initially achieved by editing a 360° video that would change the scene every few minutes; however, this meant the performer had to adhere to a schedule if the musical and visual events were to align. Therefore, a new solution was developed.
This involved using Resolume, a popular video program designed for visuals at live events. A plugin was discovered that allowed ‘mapping’ Ableton parameters to Resolume parameters. This can take place on one computer or over the local network between computers; here, two computers were required to handle the processing demand. With this, it became possible not only to change the video scene in perfect sync with the audio scene, but also to map transition effects for the video that were triggered by transition or ‘build-up’ effects in the music. Remembering that these 360° videos are an entire environment, having powerful video transition effects synchronised with the music would completely alter the reality of the viewer in an exciting way.
Video, double click for playback
 
(Original Research)
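
The text does not specify the plugin's underlying protocol; OSC (Open Sound Control) over UDP is one common way such Ableton-to-Resolume mappings travel across a local network between computers. As a hedged illustration under that assumption, the sketch below encodes a minimal OSC message using only the Python standard library (the address and port are invented for the example, not taken from the research):

```python
# Minimal OSC message encoder (stdlib only). Resolume can listen for OSC on
# a configurable UDP port; the address below is illustrative.
import socket
import struct

def osc_message(address, value):
    """Encode an OSC message carrying one float argument."""
    def pad(data):
        # OSC pads every field with nulls to a 4-byte boundary.
        return data + b"\x00" * (4 - len(data) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

packet = osc_message("/composition/master", 0.75)

# Sending it would look like this (commented out: needs Resolume listening):
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("192.168.1.20", 7000))
```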

4)    Video, Audio, & Green Screen Merger: The video now needs to be sent out of Resolume and into a program with the ability to green-screen or ‘chroma key’ the performer into the virtual world, mix it with the audio output from Ableton, and record or broadcast the final product.  OBS is an open-source, free application capable of completing these tasks.  To get the video out of Resolume, a process called Spout was used: a method of sharing video between applications in real time with no perceivable latency.  The audio from Ableton is output to the audio interface from the ‘music’ computer and then input into the ‘video’ computer’s microphone input.  
Video, double click for playback
 
(Original Research)
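
Conceptually, what the chroma-key step does is a per-pixel decision: pixels close enough to the key colour are replaced by the 360° background, and everything else keeps the camera's foreground pixel. A toy sketch of that idea (this is not OBS code, and real keyers also soften edges and handle colour spill):

```python
# Conceptual per-pixel chroma key: replace near-green pixels with the
# 360-degree background, keep everything else from the camera feed.

def chroma_key_pixel(fg, bg, key=(0, 255, 0), threshold=120):
    """fg, bg, key are (r, g, b) tuples; returns the composited pixel."""
    distance = sum(abs(c - k) for c, k in zip(fg, key))
    return bg if distance < threshold else fg

background = (30, 60, 200)                          # a backdrop pixel
print(chroma_key_pixel((10, 240, 15), background))  # near-green: background wins
print(chroma_key_pixel((180, 60, 50), background))  # skin tone: performer kept
```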

5)    2D to 360°: In order to make this video 360°, two methods were discovered. 
a.    Record the video, open the recording in Adobe Premiere, convert it to 360° format, and export. This option is not live and involves an extra step; however, it might be helpful for certain goals, such as re-orientating the 360° field.
b.    Stream directly to YouTube and select ‘this video is 360°’; YouTube will take the equirectangular video and convert it into a 360° video live. 
Video, double click for playback
 
(Original Research)
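
The reason a flat equirectangular frame can become a navigable 360° video is a simple coordinate mapping: every viewing direction (yaw, pitch) corresponds to one pixel of the rectangle, so the player only has to sample the frame as the viewer turns. A sketch of that mapping (illustrative only; not code from Premiere or YouTube):

```python
# Equirectangular mapping: yaw spans the full frame width, pitch the height.

def equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a view direction to its pixel in an equirectangular frame.

    yaw: -180..180 degrees (0 = frame centre), pitch: -90..90 (0 = horizon).
    """
    x = (yaw_deg / 360.0 + 0.5) * width
    y = (0.5 - pitch_deg / 180.0) * height
    return (int(x) % width, min(int(y), height - 1))

# Looking straight ahead in a 4K frame lands at the centre of the rectangle.
print(equirect_pixel(0, 0, 3840, 1920))  # (1920, 960)
```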

6)    Viewing Video in VR: Although the audience will be immersed in this 360° world, the performer will be in Modulia Studio.  Yet the performer is controlling the videos and needs to know what the audience is seeing, both to manipulate effects in a stylistic manner and to be able to comment on the current world they are in.  To achieve this, a free application called NDI was found, which allows screen sharing between computers on a local network without latency. NDI can be added to OBS on the ‘audio/VR’ computer to receive the OBS video from the ‘video’ computer. With the final video now on the audio/VR computer, it can be added as a window in Modulia Studio via OVR Toolkit. 
Video, double click for playback
 
(Original Research)

7)    VR View into Background of 360° Video: Although viewers can see the performer and the 360° world, they currently do not know what is happening inside Modulia Studio.  To give viewers a glimpse into this world, a virtual projector screen can be offered within the 360° environment where they can see exactly what the performer is seeing.  For this, it was decided to use a program called SideQuest (previously mentioned above); with this installed on the ‘video’ computer, it is possible to stream Modulia Studio via Wi-Fi from the Oculus Quest to the ‘video’ computer and window-capture this content with OBS. Modulia Studio can then be placed as a small, floating projector screen in the background, behind the performer.  
Video, double click for playback
 
(Original Research)

      With all of these steps completed, we are able to enter the VR environment, load up OVR Toolkit with the Ableton and video screens, MIDI-map parameters to Ableton from the VR components, and test its overall usability. This final test was streamed to YouTube and is visible online in 360° by clicking on the image below. Please make sure the resolution is set to 4K (if your bandwidth will allow), make the video full screen, and use your cursor to navigate the 360° world.  You are also welcome to find this video on your phone and move the phone in 360 degrees for a closer example of an immersive environment.  If you happen to have a VR headset or phone VR headset, you can find this video on YouTube and get the full experience. *** Note: chroma-key settings were not perfect and the presenter appears at a lower resolution than normally possible. 
YouTube Link, double click for playback


Results:

Data analysis:
      Due to the nature of autoethnographic research, the data analysis is found within the methodology above. 

Limitations:
      This research was limited by finance, as mixed reality would be valuable to explore but is currently outside this research’s budget.  Another limitation is the ability for consumers to currently view this content; a test group would be the next step in this research, and 10 sets of goggles have been ordered from China, however they are currently delayed in the mail.  A further limitation was computing power.  Despite a new computer being purchased for this work, it still lacked the power required to handle all tasks independently (in truth, it was actually a cooling issue that prevented the new laptop from performing to its full potential; discussions are still in place with the distributor to resolve this), complicating the research by forcing it to work across two separate computers. Again, this issue could have been resolved with a larger budget. 

Further research: 
      The ultimate recommendation would be to either work as a partner with Modulia Studio or develop an independent application offering full control and customisation.  Additionally, integrating the ability to perform and add 360° videos as one homogenous unit, rather than depending on such a multitude of interwoven applications, would simplify the procedure for performers and develop opportunities for monetisation. Launching a website would allow linking artists and performers together with a plethora of tools such as tickets, calendar information, memberships, artist profiles, fan chats, reviews, and more. 
Furthermore, as more consumers adopt VR, the technology in current games that allows players to embody avatars and interact with one another could be adopted by the music industry. Then 360° video realities could evolve into fully interactive realities where audiences can see and dance with one another as well as view the performance.  Likewise, performers will be able to collaborate remotely in real time.  Additionally, performers will likely move to MR as it becomes more affordable, allowing greater connection to audiences, as well as the ability to play real instruments alongside virtual ones in a mixed reality. 

Conclusion:
      This research has demonstrated that VR technology is ready to play a part in the contemporary Electronic Music industry for those who are willing to be the innovators. It appears possible that VR may offer the tools necessary to finally take motion-based music production to centre stage.  This research demonstrates just one avenue an innovator may take to mould the currently available technology to suit an artist’s needs. Likewise, the 360° methodology implemented may offer a foundation for a variety of stylistic approaches toward remote performances, ranging from realistic, to hyperrealistic, through to fantastical realities.  This technology will take time before the early majority of both performers and audiences get involved. Yet, at this point, based on the actions of tech giants like Facebook, YouTube, Google, and Microsoft, we see that VR and MR are a safe bet and are likely to become an important part of our everyday lives.  It is the author’s belief that innovators who get involved in this brave new world now may be on top when the large majority of users embrace this technology and it really takes hold in the market. Therefore, this researcher will continue to explore, develop, and optimise his findings and be a part of tomorrow, today. 

Portfolio:
      To conclude this research, a portfolio has been developed that demonstrates the potential of the final outcomes.  Please watch the introduction below and then follow the link to the six-minute 360° performance; the intent is not to demonstrate the quality of the music, but rather the means of its production and its interrelation with the video content.  In order to fully appreciate this work, viewers are encouraged to watch the video via a phone in a VR headset, or a complete VR headset such as an Oculus.  
Thank You and Enjoy,
Jesse Brand (aka Jesse Firebrand)

Citations:

aliveinvr Trailer. (2018, May 14). [Video]. YouTube. https://www.youtube.com/watch?v=AsXJXGGQRuU

Augmented reality vs. virtual reality: AR and VR made clear. (2018, August 6). [Video]. YouTube. https://www.youtube.com/watch?v=NOKJDCqvvMk

Barker, S. (2018, August 11). Review: Electronauts (PS4). Retrieved July 08, 2020, from https://www.pushsquare.com/reviews/ps4/electronauts

Breen, M. (2016). Mark Zuckerberg: Facebook in Virtual Reality is the Future. Retrieved September 05, 2020, from https://www.nbcnews.com/tech/tech-news/mark-zuckerberg-facebook-virtual-reality-future-n523721

Chandler, N. (2015, March 25). How the Nintendo Power Glove Worked. Retrieved October 14, 2020, from https://electronics.howstuffworks.com/nintendo-power-glove.htm

Charron, J.-P. (2017). Music Audiences 3.0: Concert-Goers’ Psychological Motivations at the Dawn of Virtual Reality. Frontiers in Psychology, 8, 800–805. https://doi.org/10.3389/fpsyg.2017.00800

Doepfer MOGLI Midi Glove (Rare - Vintage). (2017, February 19). [Video]. YouTube. https://www.youtube.com/watch?v=xPemyhtWQc8&feature=emb_logo

Duffin, P. (2020, August 19). Biggest companies in the world 2019. Retrieved September 05, 2020, from https://www.statista.com/statistics/263264/top-companies-in-the-world-by-market-capitalization/

Gaubys, J. (2020, May 11). How Many People Have Smartphones in 2020. Retrieved October 16, 2020, from https://www.oberlo.com/statistics/how-many-people-have-smartphones

Graham, P. (2019). Modulia Studio Enables Music Creation in VR With 'No Limit' Says Thérémix Tech. Retrieved July 08, 2020, from https://www.vrfocus.com/2019/07/modulia-studio-enables-music-creation-in-vr-with-no-limit-says-theremix-tech/

Grimes, G. (2011, July 25). How a Theremin Works. Retrieved October 14, 2020, from https://electronics.howstuffworks.com/gadgets/audio-music/theremin.htm

Meet Daydream, Google’s vision for virtual reality. (2016, May 18). [Video]. YouTube. https://www.youtube.com/watch?v=lo3GTYSFhzw

MelodyVR 2019 Highlights. (2020, February 24). [Video]. YouTube. https://www.youtube.com/watch?v=KN01Dc0CYDo

Merrill, P. (2020, August 27). MelodyVR Raises $10 Million For Its Global VR Push. GRAMMY.Com. https://www.grammy.com/grammys/news/melodyvr-raises-10-million-its-global-vr-push

MI.MU Gloves: Music Through Movement. (2019, April 24). [Video]. YouTube. https://www.youtube.com/watch?v=CvyVQqCO8pY

Microsoft HoloLens. (n.d.). HoloLens 2: Find Specs and Features – Microsoft HoloLens 2. Retrieved October 15, 2020, from https://www.microsoft.com/en-nz/p/holoLens-2/91pnzzznzwcp/?activetab=pivot%3Aoverviewtab

Modulia Studio - Your unlimited VR music studio | Now on Oculus Rift & Rift S. (2019, April 25). [Video]. YouTube. https://www.youtube.com/watch?v=qGYzlRl78qA

MusicTech.net. (2017). Control Ableton Live In Virtual Reality With New App ALiveInVR. Retrieved July 08, 2020, from https://www.musictech.net/news/control-ableton-live-vr-aliveinvr/

Oculus. (n.d.). Retrieved October 14, 2020, from https://www.oculus.com/cart/

Pallavicini, F., Pepe, A., & Minissi, M. E. (2019). Gaming in Virtual Reality: What Changes in Terms of Usability, Emotional Response and Sense of Presence Compared to Non-Immersive Video Games? Simulation & Gaming, 50(2), 136–159. https://doi.org/10.1177/1046878119831420

Rovito, M. (2014, September 24). Review: Naonext Crystall Ball Infrared Sensor Controller. Retrieved October 14, 2020, from https://djtechtools.com/2014/09/24/review-naonext-crystall-ball-infrared-sensor-controller/

Sawh, M. (2019, April 25). Mi.Mu gloves want to make creating music less complicated. Retrieved October 14, 2020, from https://www.wareable.com/wearable-tech/mimu-gloves-2-music-wearable-tech-7194

Sethi, R. (2016, August 14). Wave Your Hands In The Air And Make Music With Kagura. Retrieved October 14, 2020, from https://ask.audio/articles/wave-your-hands-in-the-air-and-make-music-with-kagura

Steam Community. (n.d.). Guide :: How to set up Revive. Retrieved October 16, 2020, from https://steamcommunity.com/sharedfiles/filedetails/?id=1401048874

Sterling, B. (2012, July 17). Augmented Reality: The V Motion Project. Retrieved October 16, 2020, from https://www.wired.com/2012/07/augmented-reality-the-v-motion-project/

Solaris, J. (2017, March 06). How 360° Live Video is Paving the Way for Virtual Reality at Events. Retrieved October 16, 2020, from https://www.eventmanagerblog.com/360-live-video-events

Swider, M. (2017, October 05). Google Daydream View (2016) review. Retrieved October 16, 2020, from https://www.techradar.com/nz/reviews/google-daydream-view-review

Tankovska, H. (2020, September 22). Active virtual reality users forecast worldwide 2014-2018. Retrieved October 16, 2020, from https://www.statista.com/statistics/426469/active-virtual-reality-users-worldwide/

The Haunted Theremin of Carolina Eyck | Reverb. (2018, October 31). [Video]. YouTube. https://www.youtube.com/watch?v=nE_sAnSkW-Q

The V Motion Project -- Can’t Help Myself [Official Music Video]‬. (2012, July 6). [Video]. YouTube. https://www.youtube.com/watch?v=YERtJ-5wlhM&list=RDYERtJ-5wlhM&start_radio=1&t=101

Winkler, T. (1998). Motion-sensing Music: Artistic and Technical Challenges In Two Works For Dance. Retrieved October 15, 2020, from https://www.researchgate.net/publication/246873653_Motion-sensing_Music_Artistic_and_Technical_Challenges_In_Two_Works_For_Dance 

Williams, H. (2017). Watch a video on how Kraftwerk's rare MIDI controller glove works. Retrieved October 14, 2020, from https://mixmag.net/read/kraftwerks-rare-midi-controller-glove-explained-news