Talk: Remixing of movie soundtracks into immersive 3D audio
http://imp.upf.edu/2009/11/26/talkremixing-of-movie-soundtracks-into-immersive-3d-audio/
Posted 26 Nov 2009

Reference: http://clam-project.org/

Conference: Blender Conference, Amsterdam, 23-25 October 2009

What happened: CLAM developers Pau Arumí and Natanael Olaiz recently presented new work at the fantastic Blender Conference in Amsterdam. The talk covered a technology developed at Barcelona Media within the iMP and 2020 3D Media European projects: an innovative use of Blender for 3D audio, with CLAM handling the audible-scene rendering and decoding and Ardour playing the result out to any loudspeaker layout. It was really nice to meet Blender developers and artists, and the overall conference was fun and a great experience. We hope to collaborate more with the Blender project in the future.
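
To give a concrete flavour of the loudspeaker-layout-independent processing mentioned above, here is a minimal sketch, not CLAM's actual plugin code, of first-order Ambisonics encoding (B-format, FuMa convention) of a mono source, followed by a naive projection decode to a horizontal loudspeaker layout. The function names and the quad layout in the example are illustrative assumptions only.

    # Minimal first-order Ambisonics sketch (illustrative, not CLAM's implementation).
    # Angles are in radians; the decode is a simple projection onto speaker directions.
    import numpy as np

    def encode_bformat(mono, azimuth, elevation=0.0):
        """Encode a mono signal into first-order B-format channels (W, X, Y, Z)."""
        w = mono * (1.0 / np.sqrt(2.0))                  # omnidirectional component
        x = mono * np.cos(azimuth) * np.cos(elevation)   # front-back
        y = mono * np.sin(azimuth) * np.cos(elevation)   # left-right
        z = mono * np.sin(elevation)                     # up-down
        return np.stack([w, x, y, z])

    def decode_projection(bformat, speaker_azimuths):
        """Naive projection decode of horizontal B-format to N loudspeaker feeds."""
        w, x, y, z = bformat
        feeds = []
        for az in speaker_azimuths:
            # Each loudspeaker samples the sound field in its own direction.
            feeds.append(0.5 * (np.sqrt(2.0) * w + np.cos(az) * x + np.sin(az) * y))
        return np.stack(feeds)

    # Example: a 1 kHz tone placed 45 degrees to the left, decoded to a quad layout.
    sr = 48000
    t = np.arange(sr) / sr
    source = 0.5 * np.sin(2 * np.pi * 1000 * t)
    b = encode_bformat(source, azimuth=np.radians(45))
    quad_feeds = decode_projection(b, speaker_azimuths=np.radians([45, 135, 225, 315]))

The point of the B-format intermediate is exactly what the talk describes: the encoded scene is independent of the playback setup, and only the final decode needs to know the loudspeaker layout.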

Abstract: We present a use of Blender for an innovative purpose: remastering traditional movie soundtracks into highly immersive 3D audio soundtracks. To that end we developed a complete workflow built on Blender with Python extensions, Ardour (the digital audio workstation), and audio plugins for 3D audio spatialization and room-acoustics simulation. The workflow consists of two main stages: authoring a simplified scene and rendering the audio. The first stage is done within Blender: taking advantage of the video sequence editor playing next to a 3D view, the operator recreates the animation of the sound sources by mimicking the original video. The operator then associates the objects in the scene with existing audio tracks of an Ardour session containing the soundtrack mix and, optionally, adds acoustic properties to the scene prop materials (e.g. defining how a wooden room will sound) so that the room acoustics can be simulated with ray-tracing algorithms. In the second stage, a specification of the loudspeaker positions used in the exhibition venue is given, and the Ardour session with the soundtrack is automatically modified to incorporate the sound scene edited in Blender, the necessary routing, and 3D audio decoding plugins (Ambisonics and other techniques) implemented with CLAM.
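
As an illustration of the authoring stage, the following is a hypothetical Blender/Python sketch of the kind of export step described above. It assumes each object acting as a sound source carries a custom property named "ardour_track" pointing at the Ardour track it drives, and it samples the animated world-space position of each tagged source per frame into a JSON file. The property name and output format are assumptions for illustration, not the project's actual convention.

    # Hypothetical authoring-stage export (run inside Blender as a script or from
    # its Python console). Objects tagged with an "ardour_track" custom property
    # are treated as sound sources; their animated positions are dumped to JSON.
    import json
    import bpy

    def export_sound_scene(filepath):
        scene = bpy.context.scene
        sources = [ob for ob in scene.objects if ob.get("ardour_track")]
        data = {"fps": scene.render.fps, "sources": {}}
        for ob in sources:
            data["sources"][ob.name] = {"track": ob["ardour_track"], "positions": []}

        # Sample the world-space position of every tagged source at each frame.
        for frame in range(scene.frame_start, scene.frame_end + 1):
            scene.frame_set(frame)
            for ob in sources:
                x, y, z = ob.matrix_world.translation
                data["sources"][ob.name]["positions"].append([frame, x, y, z])

        with open(filepath, "w") as f:
            json.dump(data, f, indent=2)

    export_sound_scene("/tmp/sound_scene.json")

A downstream tool can then read such a file to insert the routing and spatialization plugins into the Ardour session, which is the role the automatic session modification plays in the second stage.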
