“Delicacies” is my incoherent, irregular, unpredictable collection of interesting sparks I came across online. Handpicked by a human, no robots, no AI. A form of tripping, wandering, dérivé, with some loosely undefined theme holding them together. Delicacies have no fixed frequency: I hit the publish button when there is enough material. That can be after a week or after 3 months. No pressure, literally. Just click the image below. Enjoy!
Petervan’s Musical Ride May 2025 – 50 songs. Recent releases include Kae Tempest, Froid Dub, and Saint Etienne. This time, there are many oldies, such as Roy Orbison, Cher, The Who, and many others. Play in shuffle mode to increase the surprise factor. Enjoy!
Petervan’s Musical Ride April 2025 – 56 songs. Recent releases include Mauro Pawlowski, Pomrad, and Lorde > Oldies from Sly & Robbie, Alanis Morissette, and Netsky > Play in shuffle mode to increase the surprise factor. Enjoy!
As part of the research for my immersive projects and performances, I am trying to better understand the visual and audio aspects of XR experiences. In that context, I attended the Immersive Music Day at PXL in Hasselt, Belgium, organized by the PXL Music Research team. The full program, schedule, and lineup are here.
It is a relatively small-scale event (I guess about 100 PAX), which is great as it enables networking with the participants and the speakers. The event was held at a location with great immersive audio infrastructure (3 rooms with full 360 sound set-up). For the rest, it was a no-frills event with super-friendly staff and good food at breakfast and lunch.
Example of an immersive music room set-up
I was also pleasantly surprised by the mix of ages, ranging from fresh-faced high school students to seasoned audio veterans and legends, plus corporate fossils like myself. That kind of diversity usually signals that something truly interesting is about to unfold.
But the best part was the content and the speakers.
If there was an intended or unintended theme, it would be the subjective aspects of the immersive experience (how sound “feels”, or the experiential coherence of auditive, visual, and spatial input) vs. the technological aspects of immersive sound (like the precise localisation of sound in space). But I am sure that in some other sessions, the content was quite nerdy, down to the detailed coding and mathematical aspects of encoders/decoders.
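To give a flavour of that nerdier side: the core of an ambisonic encoder is surprisingly compact. The sketch below is a generic, textbook first-order ambisonics (B-format) panner using the classic W = S/√2 convention, my own illustration rather than anything shown at the event.

```python
import numpy as np

def encode_foa(signal, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonics B-format (W, X, Y, Z).

    W is the omnidirectional component; X, Y, Z carry the front-back,
    left-right, and up-down directional information for the source.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = signal / np.sqrt(2.0)             # omni component, -3 dB by convention
    x = signal * np.cos(az) * np.cos(el)  # front-back axis
    y = signal * np.sin(az) * np.cos(el)  # left-right axis
    z = signal * np.sin(el)               # up-down axis
    return np.stack([w, x, y, z])

# A source panned hard left (azimuth 90 deg) lands almost entirely in Y.
b = encode_foa(np.ones(8), azimuth_deg=90, elevation_deg=0)
```

A decoder then maps these four channels onto however many loudspeakers the room has, which is why the same recording can play back on a 32-speaker dome or a headphone binaural render.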
Here are a few notes and reflections from the sessions I attended.
Immersive Space – An Agent for Creating and Experiencing Music
Program synopsis: Humans have sensory capabilities for recognizing their presence and immersion in space. Music ideally matches these capabilities by presenting dynamic, tonal, harmonic, and rhythmic structures in sound. Musicians use space to generate and blend sounds of ensemble, to hide and reveal musical voices, to dramatize perspectives, and to harness emotion in music making and listening. The talk explores immersive space as a modern technological tool for augmenting people’s experience of music.
CIRMMT Dome with 32 speakers
Notes:
I had never considered immersive sound as a medium for live music performance—being physically present in one space while listening to live musicians through a 360° sound system that simulates the acoustics of an entirely different environment. Wieslaw talked about auditory “fingerprints” of spaces. This goes way beyond simple effects that merely simulate the reverb of a cathedral. No, this fingerprint captures the full acoustic character of a space—every corner, every height, every nuance. And there are plug-ins available that let you import this detailed acoustic profile directly into consumer-level digital audio workstations like Logic Pro and others.
This allows performing artists to shape and test their artistic expression for a specific space, like the San Francisco Cathedral, or lets the audience experience the music as if they were actually there, immersed in that very acoustic environment.
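In signal-processing terms, such an acoustic fingerprint is a room impulse response, and applying it to a recording is a convolution. The sketch below shows the generic technique; it is my own minimal illustration, not the specific plug-ins mentioned in the talk.

```python
import numpy as np

def apply_room(dry, impulse_response):
    """Apply a room's acoustic fingerprint (impulse response) to a dry signal.

    Convolving the dry recording with the measured impulse response makes it
    sound as if it had been performed in that space. The result is peak-
    normalized to avoid clipping.
    """
    wet = np.convolve(dry, impulse_response)
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet

# Toy example: a single click through a two-tap "room"
# (direct sound plus one early reflection at half amplitude).
dry = np.array([1.0, 0.0, 0.0])
ir = np.array([1.0, 0.5])
wet = apply_room(dry, ir)  # length: len(dry) + len(ir) - 1
```

Real impulse responses run to several seconds of audio, so production tools use FFT-based (partitioned) convolution, but the underlying operation is the same.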
Altering the Immersive Potential: The Case of the Heilung Concert at Roskilde Festival
Speakers: Birgitte Folmann, Head of Research, Sonic College, and Lars Tirsbæk, Consultant in Sound & Emerging Technologies, Educator 3D audio, Sonic College
Program synopsis: Immersive concert experiences are often described as specific, emotionally moving, dynamic, and complex – qualities that require experimental and interdisciplinary methods to be meaningfully understood. In this talk, we explore the immersive and engaging potential of live concerts through the lens of the Heilung performance at Roskilde Festival. Drawing on anthropological fieldwork and insights into the technical systems that supported the experience, we discuss how a deeper understanding of immersion can inform both artistic and technological development to enhance future audience experiences.
Notes:
The talk was about the Heilung Concert at Roskilde Festival in 2024, in a festival tent holding about 17,000 people. Details about the technical set-up by Meyer Sound here.
What struck me was that the concert wasn’t branded as an “immersive” experience—there was no expectation set in advance. Yet, the immersion began the moment people entered the tent: birdsong filled the air, subtly blurring the line between environment and performance. It reminded me of my Innotribe days, where we also paid close attention to how people entered a space. After all, arrival and departure are integral parts of both the performance and the scenography.
The first part of the talk, by Lars, was about the technical challenges of delivering a 360 immersive sound experience in such a huge space. The second part, by Birgitte, was about the anthropological and subjective aesthetic experience of immersive music by the audience. Her slogan, “Aesthetics is a Verb”, is great t-shirt material. They also talked about the “attunement” of the audience to the experience, and about how you can’t fight the visuals: for example, when the drums play on the front stage, having the 360 sound coming from behind you does not work for the human brain.
Their team is now starting to document the findings of their field research. More to come.
Designing the Live Immersive Music Experience
Speaker: Paul Geluso, Music Assistant Professor, Director of the Music Technology Program – NYU Steinhardt University
Program synopsis: Paul Geluso’s work simultaneously encompasses theoretical, practical, and artistic dimensions of immersive sound recording and reproduction. His first book, “Immersive Sound: The Art and Science of Binaural and Multi-channel Audio,” published by Focal Press-Routledge, has become a standard textbook in its field. Geluso will share his research experience while providing exclusive previews of interviews and insights with featured immersive audio masters from his forthcoming book, “Immersive Sound II: the Design and Practice of Binaural and Multi-Channel Experiences,” set to be published in fall of 2025. This presentation will also include discussions on his 3DCC microphone technique, a 3D Sound Object speaker design capable of holophonic sound playback, and his work on in-air sound synthesis and other site-specific immersive sound experience building techniques.
Notes:
Paul Geluso is God. Some years ago, he published “Immersive Sound: The Art and Science of Binaural and Multi-channel Audio,” considered by audiophiles as “The Bible”. He is also good friends with Flanders’ best artist, Piet Goddaer aka Ozark Henry, who specializes in immersive sound and music.
Ozark Henry in his studio
Paul took us on a journey through his research on immersive recording (building custom-made 3D microphones and codecs) and playback (building his own “Ambi-Speaker Objects”).
Paul Geluso’s immersive 3D Sound Object (Ambi-Speaker)
This was more of a backdrop for his upcoming book. While his first book was more about the how – the technology to record and play back immersive music – his new book will focus on the why – in essence, leading with the story and the artistic intent. He hopes the new book will be out in 2025.
I had the chance to have a short 1-1 conversation with Paul, who seemed interested in our immersive performance ideas, which was exciting to know.
Subjective Evaluation of Immersive Microphone Techniques for Drums
Speaker: Arthur Moelants, Researcher PXL-Music
Program synopsis: When presenting a group of listeners with four immersive microphone techniques in two songs, will they always choose the most objectively correct one? An experiment with drum recordings in different acoustics and musical contexts challenges the assumption that objective parameters like ICTD and ICLD should always determine the best choice. While non-coincident techniques often score better in these metrics, listener preferences can shift depending on the musical context, as other techniques offer different sonic and practical qualities that might benefit the production more.
A microphone set-up for drums
Notes:
Arthur is part of my team for our immersive performances, like The New New Babylon, where he acts as both a cinematographer and immersive music expert. He is a member of the PXL-Music Research team. I was curious to see how he’d handle public speaking and delivery, and he did not disappoint. I’m always impressed by how some young professionals manage to blend deep, almost nerd-level technical expertise with polished communication and presentation skills.
His talk was about his research on the subjective experience of drums, and how that experience differs depending on the recording technique and on the context of the drums as part of a song. I really liked the simple graphics of his slides, used to explain some quite technical aspects of immersive music. Not an easy talk to deliver, as he was also giving live demos on a 360 system to let us hear the subtle differences.
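For readers unfamiliar with the metrics in the synopsis: ICTD and ICLD are the interchannel time and level differences between a stereo (or microphone) pair. A rough sketch of how one might estimate them from two channels follows; this is my own illustration of the standard definitions, not Arthur's code.

```python
import numpy as np

def icld_db(left, right, eps=1e-12):
    """Interchannel level difference: RMS level ratio of the channels, in dB."""
    def rms(x):
        return np.sqrt(np.mean(np.square(x)))
    return 20.0 * np.log10((rms(left) + eps) / (rms(right) + eps))

def ictd_samples(left, right):
    """Interchannel time difference: lag of the cross-correlation peak, in samples.

    A positive value means the left channel arrives later than the right.
    """
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

# Toy example: the same impulse, but the left channel is
# delayed by 3 samples and 6 dB quieter than the right.
right = np.zeros(16); right[0] = 1.0
left = np.zeros(16);  left[3] = 0.5
```

Non-coincident microphone techniques encode direction mostly in ICTD, coincident ones in ICLD, which is why the "objectively best" technique by these numbers can still lose a listening test in a given musical context.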
Spring update on Petervan Studios. The previous update was one year ago! It is not that nothing happened. A lot has happened since then. Let’s have a look at what’s on/in my head.
Head measuring device – seen in GUM Science Museum – Wunderkammer of Truth
General Status
Since February 2024, I have disconnected from all social media, including FB, Twitter, and LinkedIn: you won’t find me there anymore.
There have been fewer conversations, but the remaining contacts have become true friends, project & sparring partners.
It is challenging to find budgets for anything that even smells artistic.
On the family front, there was both joy and grief. Joy: Astrid passed the entry exam and started her bachelor’s at the Faculty of Veterinary Medicine of the University of Ghent. Grief: My mother-in-law passed away on 1 June 2024. She was a saint. The mourning set some of the tone for the rest of the year. Join me in wishing my father-in-law, my wife Mieke, and my daughter Astrid strength in dealing with this loss.
Green Green Grass of Hope – Bicycle ride 26 Oct 2024
The Art Studio
The main focus of the art studio was digital. I did a deep dive into what I would call “immersive software”. A deep dive means spending a lot of time in the Unity Editor, following numerous online courses, and doing a lot of little experiments.
Example of Ableton Live with Envelop for Live 3D Source Panner
An example of a simple VCV Rack set-up
Static example from Wave Unstable rule in CAPOW software by Rudy Rucker
Although not intended this way, most of the knowledge and skills acquired culminated in the first and subsequent versions of the New New Babylon performance, which in turn opened the way for other projects. Some of these projects are detailed below.
I did make some analog work, mainly pencil and Chinese ink on paper, and very little with paint on canvas.
Together with some friends, we submitted a SoP24 Protocol Improvement Grant proposal for Conversation Protocols for Humans and Machines. Unfortunately, our team did not make it to the 2024 season of the Summer of Protocols. There were 130 candidates and only 5 residencies available. We learned a lot in preparing the submission material.
Toolmaking for Spatial Intelligence
I followed the DigitalFutures workshop “Toolmaking for Spatial Intelligence” and got my certificate.
Masterclass XR in Industry
I am following the Masterclass XR in the Industry (an online course with some on-site assignments) at the Howest Academy in Kortrijk (home of Digital Arts & Entertainment, one of the best game schools in the world) and HITLab (Human Interface Technology Lab) until June 2025.
5) Visualizing the unseen (IOT data visualization/digital twin)
6) Virtual control (interaction with machines/robots via XR)
Performances
Performance: Claim Your Cybernetic Word – 17 June 2024
I was invited to do an online performance for the 60th-anniversary conference of the American Society for Cybernetics.
Attendees were encouraged to participate actively by offering cybernetic terminology. We discussed, for example, the Paskian Knobs required to steer the randomness and the style of the outcome. The session resulted in more than 300 generated words, consolidated in an on-the-spot generated word cloud. We also created an AI-generated cybernetic song.
Resulting word cloud
Performance: What Makes Us Human? – 28 August 2024
The cybernetic performance uses a new format for delivering and creating content in a dream-state flow. I showed it to Josie Gibson from the Catalyst Network, who invited me to create a similar online workshop, “What Makes Us Human?”
This performance is an engaging, immersive, and poetic screen-foray crafted to elicit compelling language embodying The Catalyst Network’s human dimensions. Attendees are encouraged to offer catalyst and humanistic terminology, visually depicted in an immersive cloud-like interactive video installation and a bespoke soundscape. The session opens with an artistic cinematic dream sequence. Through facilitated brainstorming sessions with the audience, participants can fine-tune word generation. The session closes with a cinematic catalyst song outro “A Woven World of Humans”.
Trailer:
Performance: New New Babylon
This has been my main focus over the last months, and it paid off: I made good progress, and the performance is now available for virtual and physical on-stage delivery.
The New New Babylon performance is a 45-60 minute immersive experience, divided into six chapters—Awakening, Stepping, Flying, Gliding, Folding, and Vertigo. Each chapter explores different facets of the New New Babylon concept, blending art and interaction. Audience members are invited to actively participate, engaging with interactive elements that promote a sense of community and shared creativity. The performance integrates rich soundscapes, video projections, visual art, poetry, masks, VR, and stage props to create a multisensory journey.
To get here, I spent loads of time in the Unity Editor (a software tool to create 2D, 3D, and VR environments, well known in the game development industry), did some in-depth reading on modern urbanism, registered for the masterclass “XR in the Industry”, and partnered with NUMENA, a renowned interdisciplinary creative studio from Germany specializing in award-winning spatial design and programming.
For the next iteration, we plan an API-supported LLM infrastructure to enable live interactions with AI Agents. We aim to facilitate real-time exploration of historical New Babylon research resources during on-stage and online sessions. This infrastructure is currently being developed by Thomas McLeish, Adjunct Lecturer at the Berkeley Master of Design and master creator of the 2018 replica of the Colloquy of Mobiles. Arthur Moelants – a talented young cinematographer and immersive audio expert from Flanders – also joined our project.
We submitted the performance as a candidate for the Venice Biennale College 2025, but we did not make it.
Performance Dream My Dream
The team decided to re-work the New New Babylon performance into a live experience that does not require any hardware (headsets) for the audience. We have renamed the performance “Dream My Dream”.
Trailer
17 Minute Video simulation of the performance
Dream My Dream is an immersive performance experience in six dream states: Awakening, Stepping, Flying, Gliding, Folding, and Vertigo. A live VR performer embodies an architect-researcher and dreams about the New New Babylon, a speculative future society transformed and eroded by automation, artificial intelligence, and digital technology. The dream explores profound and universal questions about the essence of existence. The audience is invited to interpret the dream in their own unique way.
This artistic performance has a poetic, gentle, and profoundly human touch, evoking a dreamy, Magritte-like surrealism. The atmosphere is calm and harmonious, steering away from Sci-Fi and dystopian themes toward a non-aggressive, understated, and subtly utopian vision.
We submitted “Dream My Dream” to the Cannes Festival Immersive 2025 competition but did not make it to the final ten.
We are now revisiting the synopsis and tagline of this performance and making some adaptations to the treatment with the ultimate goal of premiering at one of the major film/immersive festivals.
Artistic Research Project: New New Babylon
The performance is one of the deliverables of the main artistic research project. Several parties have shown renewed interest in partnering on the main New New Babylon project.
We are looking for a team/consortium to overlay an existing city (district) with a VR environment for A/B Testing of the urbanistic, economic, and governance aspects.
The deliverables of Phase-1, the Vision-Phase, are:
A beta version of an Urbanistic Artistic Rendering VR Environment, inspired by an existing or planned City or Real Estate project
Artistic Performance (minimum Online, ideally IRL), see above
Art Expo (minimum Online, ideally IRL)
Art Book
Stealth
A new project, very embryonic, written and directed together with Andreea Ion Cojocura, complemented by my cousin (a 17th-century art expert) and a world-renowned artist as the MC.
The project is an experimental alternate reality experience about the nature of flesh, human suffering, and technological advances. It seeks to find an answer to the question: “Who are the new Gods that can deliver us from suffering?”
It is a surreal experience about seeing the truth, playing out over a three-year timeline. Unfolding in real time, the project involves participants in the preparation, execution, and aftermath of the fictitious latest ecumenical council.
The multi-year project entails a prelude phase as a mockumentary in 360 video, audio, and VR, followed by an in-person council, concluding in new canons and a summarising mockumentary.
At the moment of writing, we are finalising the pitch.
A theory of space/time dimensions
The analog gnarly curved art projects, the 3D software learnings, the few conversations, a couple of computation and math books about flat and 3D dimensions (see books section below in this blog), and especially the Stealth project led me to fantasize about a multi-dimensional (not multiverse) world that is suddenly revealed and that changes everything we know about science.
One of the fantasies envisions a world encircled by a Saturn-like ring of knowledge-infused water, unveiling stereoscopic windows into higher and lower dimensions of time and space—if such concepts even exist.
Don’t take anything here too seriously. To quote Rudy Rucker: “I am a science fiction writer and the secret of science fiction is pile on the bullshit and keep a straight face.”
Delicacies
Check out the May, June, September 2024, and February 2025 editions here.
Books
Highlights:
Geometry, Relativity and the 4th Dimension – by Rudy Rucker (1977)
Art, Technology, Consciousness: Mind@Large – by Roy Ascott (2000)
Love & Math – by Edward Frenkel (2013)
Behave – by Robert Sapolsky (2017)
Mind in Motion – by Barbara Tversky (2019)
Soft City – by David Sim (2019)
and re-reading The Lifebox, the Seashell, and the Soul – by Rudy Rucker (second edition 2016)
Petervan’s Musical Ride March 2025 – 61 songs. Recent releases include Panda Bear, Kae Tempest, and Ozark Henry > Oldies from Madonna, Roy Ayers, and DJ Funk > Play in shuffle mode to increase the surprise factor. Enjoy!
Petervan’s Musical Ride February 2025 – 41 songs. Recent releases include Horsegirl, The Murder Capital, and Porridge Radio > Oldies from Lenny Kravitz, The Pebbles, and Jonathan Richman > Play in shuffle mode to increase the surprise factor. Enjoy!
Some highlights from this edition:
A preview of the upcoming Antikythera Journal. I love the entry “What is Life?”
Venkat has an awesome essay, “We Are the Robots”, on how human self-conceptions evolve in relation to machines.
Petervan’s Musical Ride January 2025 – only 36 songs. Recent releases include Voice Actor, Matt Berry, and Roland > Oldies from David Lynch, Donna Summer, Netsky > Play in shuffle mode to increase the surprise factor. Enjoy!