Art meets technology in quartet's new take on early music

Craig Phillips, bass vocalist of the two-time Grammy-nominated quartet New York Polyphony and a University of Oregon professor of voice, describes the internationally acclaimed group’s brand of pre-Baroque music as “early music with modern sensibility.”

Early, because much of their repertoire comprises rare and rediscovered Renaissance and medieval works that have not seen the light of day in centuries. And modern, because the ensemble strives to present those works through a contemporary lens. One such effort, a multimedia collaboration with artificial intelligence experts from the UO, presents a set of ancient lamentations as a metaphor for climate change.

A recent finalist for the prestigious Gramophone Award in Early Music, the group’s latest CD, “Lamentationes,” centers on two never-before-recorded settings of the “Lamentationes Jeremiae prophetae” by Spanish Renaissance composer Francisco de Peñalosa.

“One of the things the quartet has tried to do is push the boundaries about what the format or delivery mechanism is for our music because the context of modern audiences has changed,” Phillips said.

In keeping with that mission to get their music in front of people in creative ways, they have hosted a remix competition inviting DJs around the world to rework their Gregorian chants, and they have commissioned a secular version of a sacred work, its text replaced with writings by Charles Darwin.

Yet, in an era of overly edited, autotuned and highly produced music, the quartet — four voices, each singing its own line a cappella, without amplification or studio modification — is refreshing.

“Our music is as intrinsically human and authentic to the human experience as possible,” Phillips said.

Much of the music the group seeks out exists in cathedral libraries awaiting discovery. Many old tomes have been converted to modern notation, but a lot of material has yet to resurface in contemporary editions.

“We discovered this music in a music library and it just leapt off the page,” Phillips said. “It looked like it was written for us. It was clearly written for four men. It matched our ideal ranges. It was perfect.”

Working with Australian musicologist Jane Hardie, a member of the Directorium of the International Musicological Society and an honorary research associate at the Medieval and Early Modern Centre of the University of Sydney, New York Polyphony is the first group to record these particular lamentations.

The next step came naturally to Phillips. He reached out to his UO colleagues and was impressed with the work being done in the College of Design’s art and technology program.

“I thought, here we are in the music building and some incredible stuff is happening over there,” he said. “These worlds need to collide.”

Old was about to meet new in what would become a yearlong collaboration with Colin Ives, a digital arts professor, former director of the art and technology program in the College of Design, and leader of the Artificial Intelligence Creative Practice Research Group.

“The timing was perfect,” Ives said. “I was looking for a project to do with my artificial intelligence creative practice research group. Craig came to me and asked if I’d be interested in doing a music video and I was like, ‘Actually, let’s do something more interesting.’ I showed him some of the stuff we’d been playing with in terms of artificial intelligence, how it can generate video based on training on existing videos.”

Inspired by the album’s track “Lamentationes Jeremiae Feria V,” which opens with Jeremiah weeping over a Jerusalem abandoned by God during its destruction in 586 B.C., Ives and Phillips arrived at the same conclusion: The loss, despair and hope in the lamentation’s lyrics could serve as a contemporary metaphor for climate change.

“There’s something about the text of the lamentations,” Phillips said. “Even if you don’t come to it with a religious background, the poetry is very affecting and it speaks to human suffering, things that are universal.”

The result is a visually stunning 12-minute audiovisual presentation, “Aleph Earth.” Aleph is the first letter of the Hebrew alphabet, so the title loosely translates to “Earth first.”

Ives and the artificial intelligence team, made up of graduate students Zachary Boyt and Thomas Newlands, developed a machine learning approach to generate video that matched the vocal composition. They trained their AI model on existing footage of Australian bushfires, icebergs in Greenland and Oregon’s Clear Lake, the last shot by Ives with an underwater drone, letting the model analyze and abstract the patterns it found within the footage. Once trained, the model generated new video based on those patterns.
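
The article doesn’t describe the team’s actual model or tooling, but one common way to “learn patterns from footage and generate new video” is to train a small generative network on individual frames and then wander through its learned representation to synthesize new imagery. The PyTorch sketch below is purely illustrative: the convolutional autoencoder architecture, frame size, and function names are assumptions for demonstration, not the team’s code.

```python
# Illustrative sketch only: a tiny convolutional autoencoder trained on
# video frames, then used to synthesize new frames by interpolating in
# latent space. Architecture and hyperparameters are assumptions.
import torch
import torch.nn as nn

class FrameAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        # Encoder: 3x64x64 RGB frame -> latent vector
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),    # -> 32x32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),   # -> 64x16x16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # -> 128x8x8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
        # Decoder: latent vector -> reconstructed 3x64x64 frame
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),
            nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def train(model, frames, epochs=20, lr=1e-3):
    """frames: tensor of shape (N, 3, 64, 64), scaled to [0, 1]."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        recon, _ = model(frames)
        loss = loss_fn(recon, frames)  # reconstruction objective
        opt.zero_grad()
        loss.backward()
        opt.step()

@torch.no_grad()
def generate_sequence(model, frame_a, frame_b, steps=30):
    """Interpolate between two source frames in latent space,
    yielding new frames that blend the learned patterns."""
    za = model.encoder(frame_a.unsqueeze(0))
    zb = model.encoder(frame_b.unsqueeze(0))
    for t in torch.linspace(0, 1, steps):
        yield model.decoder((1 - t) * za + t * zb).squeeze(0)
```

Under these assumptions, interpolating between, say, an encoded bushfire frame and an encoded iceberg frame would yield a sequence that abstractly blends textures learned from both sources, which is the general effect the paragraph above describes.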

“Aleph Earth” premiered at the 2020 Currents New Media Festival in August and September and is available for presenters to book as a virtual concert or educational event through Opus 3 Artists. Discussions are also underway to convert it into a touring virtual exhibit.

Ives and Phillips hope this project will lead to further opportunities to merge art and technology.

“It’s a down payment on future projects, given the enthusiasm and fun that we’ve invested,” Phillips said. “We hope we can use this as a jumping-off point for even more elaborate collaborations.”

—By Sharleen Nelson, University Communications