New Paper on SpatDIF

The latest issue of the Computer Music Journal (MIT Press) includes an article on the Spatial Sound Description Interchange Format (SpatDIF) by Trond Lossius, Jan Schacher, and myself, entitled “The Spatial Sound Description Interchange Format: Principles, Specification, and Examples”.


SpatDIF, the Spatial Sound Description Interchange Format, is an ongoing collaborative effort offering a semantic and syntactic specification for storing and transmitting spatial audio scene descriptions. The SpatDIF core is a lightweight minimal solution providing the most essential set of descriptors for spatial sound scenes. Additional descriptors are introduced as extensions, expanding the namespace and scope with respect to authoring, scene description, rendering, and reproduction of spatial sound. A general overview presents the principles informing the specification, as well as the structure and the terminology of the SpatDIF syntax. Two use cases exemplify SpatDIF’s potential for pre-composed pieces as well as interactive installations, and several prototype implementations that have been developed show its real-life utility.
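To give a flavour of what such a scene description looks like in practice, here is a small Python sketch that formats time-stamped source position updates as SpatDIF-style OSC address/value pairs. The address layout shown is only an approximation of the published examples, not the normative specification:

```python
# Illustrative sketch only: formats position updates for named sources as
# SpatDIF-style OSC address/value pairs. The address syntax approximates
# published SpatDIF examples and is not the normative specification.

def position_message(source, xyz):
    """Return an OSC-style (address, values) pair for a source position."""
    x, y, z = xyz
    return (f"/spatdif/source/{source}/position", [x, y, z])

# A tiny "scene": one sound source moving along the x axis over three ticks.
scene = [(t, position_message("voice", (t * 0.5, 2.0, 0.0))) for t in range(3)]

for t, (addr, vals) in scene:
    print(t, addr, vals)
```

In an actual implementation these messages would be streamed over OSC or stored in a file; the point here is only the descriptor-per-address structure of the core namespace.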

Best paper award at SMC 2012

Today Trond Lossius, Jan C. Schacher, and I received the Best Paper Award for “SpatDIF: Principles, Specification, and Examples” at the 9th Sound and Music Computing Conference.


This paper presents the current state of our long-term effort in creating a community-driven interchange format for spatial audio scenes. More at www.SpatDIF.org.

Here is the reference:

@inproceedings{SpatDIF-SMC12,
 Address = {Copenhagen, DK},
 Author = {Nils Peters and Trond Lossius and Jan C. Schacher},
 Booktitle = {Proc. of the 9th Sound and Music Computing Conference},
 Title = {{SpatDIF}: Principles, Specification, and Examples},
 Year = {2012}}

We are now invited to submit a revised and expanded version for publication in the Computer Music Journal (MIT Press).


I am similarly excited that the paper “An Automated Testing Suite for Computer Music Environments” I wrote together with Trond Lossius and Timothy Place was also nominated for Best Paper.

New papers

I am looking forward to SMC2012 in Copenhagen, Denmark, the 133rd AES Convention in San Francisco, USA, and ACM Multimedia 2012 in Nara, Japan:

  • Peters N., Schacher J., Lossius T.: SpatDIF: Principles, Specification, and Examples, to appear in Proc. of the 9th Sound and Music Computing Conference (SMC), Copenhagen, Denmark, 2012.
  • Peters N., Lossius T., Place T.: An Automated Testing Suite for Computer Music Environments, to appear in Proc. of the 9th Sound and Music Computing Conference (SMC), Copenhagen, Denmark, 2012.
  • Peters N., Choi J., Lei H.: Matching artificial reverb settings to unknown room recordings: a recommendation system for reverb plugins, to appear at 133rd AES Convention, San Francisco, 2012.
  • Peters N., Lei H., Friedland G.: Name That Room: Room identification using acoustic features in a recording, to appear at ACM Multimedia 2012, Nara, Japan, 2012.

Coordinate systems and the brain

Last week I attended a lecture by neuroscientist Vittorio Gallese entitled “What is so special with embodied simulation”. Among other things, I was genuinely surprised to learn that the brain encodes the positions of objects in space using both egocentric and allocentric coordinate systems. Could that be a neurological argument for why SpatDIF supports more than one coordinate system?
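The parallel is loose but tempting: an egocentric description naturally maps to listener-centred spherical coordinates (azimuth, elevation, distance), while an allocentric one maps to room-fixed Cartesian coordinates. As a minimal sketch of how the two relate, here is the Cartesian-to-spherical conversion; note that the axis and angle conventions chosen here (x right, y front, z up, azimuth 0° straight ahead and positive clockwise) are an assumption for illustration, since spatial audio formats differ on this:

```python
import math

def cartesian_to_aed(x, y, z):
    """Convert room-fixed Cartesian coordinates (x right, y front, z up)
    to listener-centred azimuth/elevation/distance, in degrees and metres.

    Azimuth follows the navigational convention assumed for this sketch:
    0 degrees straight ahead (+y), increasing clockwise.
    """
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, y))
    elevation = math.degrees(math.asin(z / distance)) if distance else 0.0
    return azimuth, elevation, distance

# A source one metre to the right and one metre in front of the listener
# sits at 45 degrees azimuth, 0 degrees elevation, sqrt(2) metres away.
az, el, d = cartesian_to_aed(1.0, 1.0, 0.0)
```

Either representation carries the same information; which one is the more natural authoring interface depends on whether the scene is conceived from the listener's point of view or from the room's.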