AIgo-Rhythms: The Future Of Album Collaboration

Video: Life Support | 360° VR Music Video Composed By AI – Taryn Southern
 
Taryn Southern, Contributor
Taryn Southern is a digital artist and filmmaker. She is currently co-directing a documentary about the brain, and her album I AM AI is set to release this September. As a recovering YouTuber, she’s produced more than 1000 videos that have garnered more than 500 million views online.
 
One year ago, I began working on an album. I write vocal melodies and lyrics, while my partner does the composition. We both work on instrumentation, and complement each other well. The only odd part of the relationship is…my partner isn’t human.
 
It’s AI.
 
The relationship was born out of curiosity. Fear-driven headlines had been dominating my news feed for some time: AI will take our jobs, our data, and eventually, our souls.
 
The arguments left me wondering: what's really happening with AI? I stumbled across an article chronicling how AI was now being used to compose music. After a quick Google search, I found that song creation was just the tip of the iceberg – AI was also writing poems, editing films, synthesizing art… and passing the Turing test.
 
Eager to learn more, I began to experiment with every AI music-making tool I could get my hands on: Amper and Aiva to start, then later IBM Watson and Google Magenta (there are countless others on the scene – AI Music, Jukedeck, and Landr, to name a few).
 
My side project quickly evolved into a full-fledged album (“I AM AI”) along with a series of virtual reality music videos exploring the tenuous relationship between humans and technology. Last September, I released the first full single I produced with Amper, Break Free, which grabbed the attention – and curiosity – of the larger creative community.
 
Many inquired: are you worried AI will be more creative than you? No. In many ways, AI helped me become more creative, evolving my role into something more closely resembling that of an editor or director. I give the AI direction (in the form of data to learn from or parameters for the output), and it sends back raw material, which I then edit and arrange into a cohesive song. This also frees me to spend more time on other aspects of the creation process, like the vocal melodies, lyrics, and music videos. It's still creative work, just a different kind. But technophobes, rejoice: AI isn't a perfect companion just yet.
 
What the future of our co-evolutionary world looks like with AI is anyone’s guess… but I’m optimistic.
Since there is still a lot of mystery surrounding the process of collaborating with AI, a breakdown is a helpful way to baseline the conversation. Here are the primary platforms I’ve used and my takeaways from collaborating with each one:
 

Amper: co-founded by several musicians, Amper launched as a platform to compose original scores for productions. Currently free to the public, Amper has a simple front-facing UI that lets you modify parameters like BPM, instrumentation, and mood. No need to know code here!

Takeaway: Prior to working with Amper, I couldn't recognize the sounds of different instruments, nor did I believe I had any particular musical preferences. Now, I recognize dozens of instruments and have honed a particular creative style. For instance, I've developed a strong taste for mixing electronic synthesizers with piano and deep bass, as you can hear in Life Support above, for which I produced a 360° VR music video.
 

AIVA: AIVA is an award-winning deep learning algorithm, and the first to be registered with an authors' rights society. I first met one of the founders, Pierre Barreau, in London, and we became really excited about the opportunity of combining classical learning styles with pop/synth instrumentation. AIVA uses deep learning and reinforcement learning to analyze thousands of pieces of classical music in specific styles and compose new scores.

 
Takeaway: My first track with AIVA, Lovesick, was created from the analysis of thousands of pieces from the late Romantic period (early to mid-1800s). The result is a Westworld-esque piano piece that I arranged into a pop-funk track with electronic synth elements. Collaborating with such unfamiliar source material was incredibly fun because it forced out-of-the-box thinking. When arranging the track, I really had to ignore a lot of my "pop style" instincts.


 

Source: TechCrunch
