An AI-based musical instrument that responds to, accompanies, and inspires you!
By Aven Le Zhou, David Santiano, Sam Jianghao Hu
Designed for kids and adults who feel lonely, Therem{Ai}n lets users perform music simply by moving their two hands in the air. Based on what the user plays, the instrument responds with new melodies generated by a neural network.
Core Tech Used: Magenta.js, TensorFlow, Leap Motion, Python, etc.
Concept
Our team focuses on how AI can work hand in hand with an activity that is inherently special to human beings: the creation of music. We go beyond the novelty of an AI-based performance and home in on how, in a world where AI is often seen as a replacement for jobs and production, AI can instead accompany, assist, and respond to our creative process.
Making Process
Step 1: Digitize a Theremin
- Analog instrument --> digital instrument
- Leap Motion data --> musical notes & amplitude
- Sound synthesis and live playback using Python libraries (see the sketch below)
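A minimal sketch of this mapping, assuming Leap Motion palm heights are already being read from the SDK and using NumPy with the sounddevice library for playback; the value ranges, function names, and libraries are illustrative assumptions, not the project's actual code:

```python
import numpy as np
import sounddevice as sd  # assumed audio playback library

SAMPLE_RATE = 44100

def hands_to_note(right_y, left_y):
    """Map vertical palm positions (mm above the sensor) to a MIDI pitch and an
    amplitude, echoing how a theremin's two antennas control frequency and volume."""
    midi_note = int(np.interp(right_y, [100, 500], [48, 84]))   # roughly C3..C6
    amplitude = float(np.interp(left_y, [100, 500], [0.0, 1.0]))
    return midi_note, amplitude

def play_tone(midi_note, amplitude, duration=0.25):
    freq = 440.0 * 2 ** ((midi_note - 69) / 12)                 # MIDI number -> Hz
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    sd.play(amplitude * np.sin(2 * np.pi * freq * t), SAMPLE_RATE)
    sd.wait()

# In the real loop, the y values would come from the Leap Motion SDK
# (e.g. hand.palm_position.y for each tracked hand on every frame).
play_tone(*hands_to_note(right_y=350, left_y=400))
```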
Step 2: Feed MIDI Data into the Neural Network
- Leap Motion data --> musical notes & amplitude --> MIDI data
- Feed the MIDI data into the neural network and retrieve the newly generated MIDI data (see the sketch below)
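A minimal sketch of the conversion step, assuming the captured notes are written to a MIDI file with the pretty_midi library before being handed to the network; the event format, filenames, and the `generate_continuation` placeholder are illustrative assumptions (the project itself uses Magenta's melody models):

```python
import pretty_midi

def events_to_midi(events, path="primer.mid", step=0.25):
    """events: list of (midi_note, amplitude 0..1) sampled at a fixed time step."""
    pm = pretty_midi.PrettyMIDI()
    inst = pretty_midi.Instrument(program=0)      # program 0: acoustic grand piano
    for i, (note, amp) in enumerate(events):
        inst.notes.append(pretty_midi.Note(
            velocity=int(amp * 127),              # amplitude -> MIDI velocity
            pitch=note,
            start=i * step,
            end=(i + 1) * step,
        ))
    pm.instruments.append(inst)
    pm.write(path)
    return path

primer_path = events_to_midi([(60, 0.8), (62, 0.7), (64, 0.9), (67, 0.6)])
# The primer MIDI is then fed to the neural network (Magenta melody models in
# this project). `generate_continuation` is a hypothetical placeholder, not a
# real Magenta function:
# generated_path = generate_continuation(primer_path)
```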
Step 3: Play Back the Generated MIDI
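A minimal playback sketch, assuming the network's output has been written to a MIDI file and rendering it with pretty_midi and sounddevice; the filename is an assumption:

```python
import pretty_midi
import sounddevice as sd

def play_midi(path, sample_rate=44100):
    pm = pretty_midi.PrettyMIDI(path)        # parse the generated MIDI file
    audio = pm.synthesize(fs=sample_rate)    # simple sine-wave rendering
    sd.play(audio, sample_rate)              # play it back to answer the performer
    sd.wait()

# play_midi("generated.mid")                 # assumed output filename
```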
Future Development
- Become an interactive installation
- Become an integrated product