MIT News - Artificial Intelligence | July 30, 05:09
“FUTURE PHASES” showcases new frontiers in music technology and interactive performance

The “FUTURE PHASES” concert, hosted by MIT’s Music Technology and Computation Graduate Program in the new Linde Music Building, showcased innovative fusions of strings and electronics. Presented in collaboration with the MIT Media Lab’s Opera of the Future group and Boston chamber orchestra A Far Cry, the program featured new works by MIT composers Evan Ziporyn and Tod Machover as well as three pieces selected through an open call. A Far Cry performed the entire evening, and the hall’s surround sound system gave the audience a multidimensional listening experience. In “EV6,” by Ziporyn and Egozy, audience members even became part of the ensemble through their phones, an unprecedented interactive musical experience. A post-concert showcase of student research in music technology underscored MIT’s continuing exploration at the frontier of music and technology.

🌟 **MIT’s music technology showcase**: MIT’s recent “FUTURE PHASES” concert, presented by the MIT Music Technology and Computation Graduate Program, centered on innovative combinations of string orchestra and electronics, marking a significant milestone for the school’s work in music technology.

🎻 **A Far Cry and the program’s highlights**: Boston’s A Far Cry orchestra performed the program, pairing works by MIT composers Evan Ziporyn and Tod Machover, including “EV6” and “FLOW Symphony,” with three additional pieces chosen through an open call, a lineup that showcased the diversity of the format.

📱 **Immersive interaction in “EV6”**: Evan Ziporyn and Eran Egozy’s “EV6” used the Tutti system to turn audience smartphones into instruments, letting listeners play in real time with the live string orchestra and creating an unprecedented “playing together” experience that drew the audience directly into the music-making.

🔊 **Advanced concert-hall technology**: The concert made full use of the Thomas Tull Concert Hall’s 24 built-in surround sound speakers, giving each listener a distinct, multidimensional sonic experience as the music flowed and evolved through the space.

💡 **Music technology research on display**: After the concert, six music technology demonstrations presented the latest research by MIT undergraduate and graduate students, spanning musical interface design, human-AI collaborative creation, music data analysis, audio feature extraction, and creative instrument development.

Music technology took center stage at MIT during “FUTURE PHASES,” an evening of works for string orchestra and electronics, presented by the MIT Music Technology and Computation Graduate Program as part of the 2025 International Computer Music Conference (ICMC). 

The well-attended event was held last month in the Thomas Tull Concert Hall within the new Edward and Joyce Linde Music Building. Produced in collaboration with the MIT Media Lab’s Opera of the Future Group and Boston’s self-conducted chamber orchestra A Far Cry, “FUTURE PHASES” was the first event to be presented by the MIT Music Technology and Computation Graduate Program in MIT Music’s new space.

“FUTURE PHASES” offerings included two new works by MIT composers: the world premiere of “EV6,” by MIT Music’s Kenan Sahin Distinguished Professor Evan Ziporyn and professor of the practice Eran Egozy; and the U.S. premiere of “FLOW Symphony,” by the MIT Media Lab’s Muriel R. Cooper Professor of Music and Media Tod Machover. Three additional works were selected by a jury from an open call for works: “The Wind Will Carry Us Away,” by Ali Balighi; “A Blank Page,” by Celeste Betancur Gutiérrez and Luna Valentin; and “Coastal Portrait: Cycles and Thresholds,” by Peter Lane. Each work was performed by Boston’s own multi-Grammy-nominated string orchestra, A Far Cry.

“The ICMC is all about presenting the latest research, compositions, and performances in electronic music,” says Egozy, director of the new Music Technology and Computation Graduate Program at MIT. When approached to be a part of this year’s conference, “it seemed the perfect opportunity to showcase MIT’s commitment to music technology, and in particular the exciting new areas being developed right now: a new master’s program in music technology and computation, the new Edward and Joyce Linde Music Building with its enhanced music technology facilities, and new faculty arriving at MIT with joint appointments between MIT Music and Theater Arts (MTA) and the Department of Electrical Engineering and Computer Science (EECS).” These recently hired professors include Anna Huang, a keynote speaker for the conference and creator of the machine learning model Coconet that powered Google’s first AI Doodle, the Bach Doodle.

Egozy emphasizes the uniqueness of this occasion: “You have to understand that this is a very special situation. Having a full 18-member string orchestra [A Far Cry] perform new works that include electronics does not happen very often. In most cases, ICMC performances consist either entirely of electronics and computer-generated music, or perhaps a small ensemble of two-to-four musicians. So the opportunity we could present to the larger community of music technology was particularly exciting.”

To take advantage of this exciting opportunity, an international open call was put out to select the pieces that would accompany Ziporyn and Egozy’s “EV6” and Machover’s “FLOW Symphony.” From a total of 46 entries, a panel of judges that included Egozy, Machover, and other distinguished composers and technologists selected three pieces for the evening’s program.

“We received a huge variety of works from this call,” says Egozy. “We saw all kinds of musical styles and ways that electronics would be used. No two pieces were very similar to each other, and I think because of that, our audience got a sense of how varied and interesting a concert can be for this format. A Far Cry was really the unifying presence. They played all pieces with great passion and nuance. They have a way of really drawing audiences into the music. And, of course, with the Thomas Tull Concert Hall being in the round, the audience felt even more connected to the music.”

Egozy continues, “We took advantage of the technology built into the Thomas Tull Concert Hall, which has 24 built-in speakers for surround sound, allowing us to broadcast unique, amplified sound to every seat in the house. Chances are that every person might have experienced the sound slightly differently, but there was always some sense of a multidimensional evolution of sound and music as the pieces unfolded.”

The five works of the evening employed a range of technological components that included playing synthesized, prerecorded, or electronically manipulated sounds; attaching microphones to instruments for use in real-time signal processing algorithms; broadcasting custom-generated musical notation to the musicians; utilizing generative AI to process live sound and play it back in interesting and unpredictable ways; and audience participation, where spectators use their cellphones as musical instruments to become a part of the ensemble.
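
One of these components, routing a miked instrument through a real-time effect, is simple to illustrate in a browser. Below is a minimal Web Audio sketch in TypeScript; the feedback-delay effect chain is an assumption chosen purely for illustration, not the signal processing used in any of the five works.

```typescript
// Minimal sketch: route a live instrument microphone through a
// real-time feedback-delay effect (Web Audio API, browser).
// The effect chain is an illustrative assumption, not the actual
// processing used in any of the "FUTURE PHASES" works.
async function processLiveInstrument(): Promise<void> {
  const ctx = new AudioContext();

  // Capture the instrument's microphone feed.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const mic = ctx.createMediaStreamSource(stream);

  // Feedback delay: the dry signal plus decaying echoes.
  const delay = ctx.createDelay(2.0);
  delay.delayTime.value = 0.4;     // 400 ms between echoes
  const feedback = ctx.createGain();
  feedback.gain.value = 0.5;       // each repeat at half level
  const wet = ctx.createGain();
  wet.gain.value = 0.6;

  mic.connect(ctx.destination);    // dry path
  mic.connect(delay);
  delay.connect(feedback);
  feedback.connect(delay);         // feedback loop
  delay.connect(wet);
  wet.connect(ctx.destination);    // processed path
}
```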

Ziporyn and Egozy’s piece, “EV6,” took particular advantage of this last innovation: “Evan and I had previously collaborated on a system called Tutti, which means ‘together’ in Italian. Tutti gives an audience the ability to use their smartphones as musical instruments so that we can all play together.” Egozy developed the technology, which was first used in the MIT Campaign for a Better World in 2017. The original application involved a three-minute piece for cellphones only. “But for this concert,” Egozy explains, “Evan had the idea that we could use the same technology to write a new piece — this time, for audience phones and a live string orchestra as well.”
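
In that spirit, here is a minimal sketch of what a phone-as-instrument client might look like, assuming a browser running the Web Audio API. The pitch sets, the `section` parameter, and the envelope are hypothetical choices for illustration; this is not Tutti’s actual design.

```typescript
// Minimal sketch of a Tutti-style phone instrument: a tap plays a
// short note drawn from the listener's assigned orchestral section.
// Pitch sets and envelope are hypothetical; this is not Tutti's code.
const SECTION_PITCHES: Record<string, number[]> = {
  winds:   [72, 74, 76, 79],   // MIDI note numbers
  brass:   [60, 62, 64, 67],
  strings: [48, 52, 55, 59],
};

function midiToHz(note: number): number {
  return 440 * Math.pow(2, (note - 69) / 12);
}

function makePhoneInstrument(section: string): () => void {
  const ctx = new AudioContext();
  const pitches = SECTION_PITCHES[section] ?? SECTION_PITCHES.strings;
  return () => {
    void ctx.resume();   // browsers require a user gesture to start audio
    const note = pitches[Math.floor(Math.random() * pitches.length)];
    const osc = ctx.createOscillator();
    const env = ctx.createGain();
    osc.frequency.value = midiToHz(note);
    osc.connect(env).connect(ctx.destination);
    // Short percussive envelope so many phones blend into a texture.
    env.gain.setValueAtTime(0.4, ctx.currentTime);
    env.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.8);
    osc.start();
    osc.stop(ctx.currentTime + 0.8);
  };
}

// Usage: wire any tap on the page to the instrument.
const play = makePhoneInstrument("strings");
document.addEventListener("pointerdown", play);
```

With an envelope this short, hundreds of phones tapping at once blend into a shifting ensemble texture rather than a single melody.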

To explain the piece’s title, Ziporyn says, “I drive an EV6; it’s my first electric car, and when I first got it, it felt like I was driving an iPhone. But of course it’s still just a car: it’s got wheels and an engine, and it gets me from one place to another. It seemed like a good metaphor for this piece, in which a lot of the sound is literally played on cellphones, but still has to work like any other piece of music. It’s also a bit of an homage to David Bowie’s song ‘TVC 15,’ which is about falling in love with a robot.”

Egozy adds, “We wanted audience members to feel what it is like to play together in an orchestra. Through this technology, each audience member becomes a part of an orchestral section (winds, brass, strings, etc.). As they play together, they can hear their whole section playing similar music while also hearing other sections in different parts of the hall play different music. This allows an audience to feel a responsibility to their section, hear how music can move between different sections of an orchestra, and experience the thrill of live performance. In ‘EV6,’ this experience was even more electrifying because everyone in the audience got to play with a live string orchestra — perhaps for the first time in recorded history.”
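
How phones get grouped into sections is not described; one plausible, purely hypothetical scheme is to hash a stable per-device identifier into a section, so each audience member’s assignment stays consistent for the whole piece:

```typescript
// Hypothetical section assignment for audience phones: hash a stable
// per-device identifier into one of the orchestral sections. This is
// an assumed scheme, not how Tutti actually assigns sections.
const SECTIONS = ["winds", "brass", "strings", "percussion"] as const;

function assignSection(clientId: string): string {
  let hash = 0;
  for (const ch of clientId) {
    hash = (hash * 31 + ch.charCodeAt(0)) | 0;   // simple 32-bit hash
  }
  return SECTIONS[Math.abs(hash) % SECTIONS.length];
}

// Usage: a persistent per-device ID keeps the assignment stable.
const deviceId = crypto.randomUUID();
console.log(`You are in the ${assignSection(deviceId)} section`);
```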

After the concert, guests were treated to six music technology demonstrations that showcased the research of undergraduate and graduate students from both the MIT Music program and the MIT Media Lab. These included a gamified interface for harnessing just intonation systems (Antonis Christou); insights from a human-AI co-created concert (Lancelot Blanchard and Perry Naseck); a system for analyzing piano playing data across campus (Ayyub Abdulrezak ’24, MEng ’25); capturing music features from audio using latent frequency-masked autoencoders (Mason Wang); a device that turns any surface into a drum machine (Matthew Caren ’25); and a play-along interface for learning traditional Senegalese rhythms (Mariano Salcedo ’25). This last example led to the creation of Senegroove, a drumming-based application specifically designed for an upcoming edX online course taught by ethnomusicologist and MIT associate professor in music Patricia Tang, and world-renowned Senegalese drummer and MIT lecturer in music Lamine Touré, who provided performance videos of the foundational rhythms used in the system.

Ultimately, Egozy muses, “'FUTURE PHASES' showed how having the right space — in this case, the new Edward and Joyce Linde Music Building — really can be a driving force for new ways of thinking, new projects, and new ways of collaborating. My hope is that everyone in the MIT community, the Boston area, and beyond soon discovers what a truly amazing place and space we have built, and are still building here, for music and music technology at MIT.”
