

The instrument will be virtually reconstructed as a Digital Twin, with missing parts digitally rebuilt. Material testing of the surviving ancient fragments will guide the creation of physical prototypes. The acoustic profiles of the relevant architectural spaces will be digitised to preserve the instrument's performance environment and recreate authentic soundscapes. AI models will simulate long-term material aging and its impact on sound quality and tone, updating the instrument's acoustic behaviour in real time. The final model will be placed in immersive virtual environments, offering interactive, real-time sound rendering and dynamic optimisation of pitch and harmonics in response to environmental and material changes.
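As one illustration of how such an aging model might feed the real-time acoustic update, the minimal sketch below treats a single vibration mode of the instrument as a one-degree-of-freedom resonator whose stiffness decays exponentially with age. The exponential decay law and every parameter value here are hypothetical placeholders for illustration, not measured material data.

```python
import numpy as np

def aged_stiffness(k0, decay_rate, years):
    """Toy exponential stiffness-loss model (decay_rate is hypothetical)."""
    return k0 * np.exp(-decay_rate * years)

def natural_frequency(k, m):
    """Natural frequency (Hz) of a single-degree-of-freedom resonator."""
    return np.sqrt(k / m) / (2.0 * np.pi)

# Illustrative values only, not derived from any real instrument
k0 = 2.0e6      # initial modal stiffness, N/m (assumed)
m = 0.05        # modal mass, kg (assumed)
decay = 0.002   # fractional stiffness loss per year (assumed)

for years in (0, 50, 100, 200):
    f = natural_frequency(aged_stiffness(k0, decay, years), m)
    print(f"{years:>3} years -> {f:7.1f} Hz")
```

In a full pipeline, a calibrated aging model of this kind would re-parameterise the Digital Twin's acoustic simulation as the predicted material state changes.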


Modern reconstructions of historical organs aim to bridge gaps in understanding, using advanced materials and technologies to replicate the sound and structure of ancient instruments. However, challenges such as accurately scaling pipes and recreating the acoustics of historical spaces remain significant. The study and preservation of church organs underscore the importance of interdisciplinary collaboration among musicologists, conservationists, acoustics experts and engineers. These efforts not only enhance our understanding of medieval and Renaissance music but also ensure that the rich legacy of these magnificent instruments is preserved for future generations.
Digitising historic church organs will contribute to the preservation of their unique acoustic and structural identities. Using the Finite Element Method (FEM) and Computational Fluid Dynamics (CFD), MusicSphere will simulate airflow, turbulence, vibration and the effects of material aging within organ pipes. Parameterised models will allow adaptation to different organ designs, supporting predictive conservation and documented restoration. High-resolution 3D scanning, material analysis and CAD modelling will enable precise virtual reconstructions.

Acoustic profiles of the churches housing these organs will be captured using directional impulse responses and ambisonic recordings, simulating sound reflections and absorption patterns to recreate the interplay between instrument and space. These data will be integrated into interactive XR environments that combine accurate 3D visualisations with spatialised soundscapes, allowing users to explore virtual reconstructions and experience organ sounds in their historical contexts. Real-time polyphonic transcription will archive performances, while AI synthesis will recreate historic organ sounds for virtual and live settings. Together, these technologies will enhance preservation, research and public engagement, ensuring that the cultural and musical legacy of these instruments endures.
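Before running full FEM/CFD simulations, a parameterised pipe model can be sanity-checked against first-order acoustics. The sketch below estimates the fundamental of a cylindrical flue pipe from its length, radius and air temperature, using the standard open-end correction; it is a simplified analytical check under idealised assumptions, not the simulation pipeline itself, and the example dimensions are assumed.

```python
import numpy as np

def speed_of_sound(temp_c):
    """Approximate speed of sound in air (m/s) at temp_c degrees Celsius."""
    return 331.3 * np.sqrt(1.0 + temp_c / 273.15)

def pipe_fundamental(length_m, radius_m, temp_c=20.0, stopped=False):
    """First-order fundamental (Hz) of a cylindrical flue pipe.

    Open pipes resonate at a half wavelength with an end correction of
    about 0.6 * radius at each open end; stopped pipes resonate at a
    quarter wavelength with a single open end.
    """
    c = speed_of_sound(temp_c)
    if stopped:
        return c / (4.0 * (length_m + 0.6 * radius_m))
    return c / (2.0 * (length_m + 1.2 * radius_m))

# Example: an 8-foot (~2.44 m) open pipe with assumed 5 cm radius
print(f"{pipe_fundamental(2.44, 0.05):.1f} Hz")  # lands near C2 (~65 Hz)
```

Checks like this give a quick baseline against which the more expensive FEM/CFD results, and the effects of parameter changes such as temperature or bore geometry, can be compared.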
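Once a church's impulse response has been measured, the interplay between instrument and space can be auralised by convolving a dry (close-miked or synthesised) organ signal with that response. The snippet below shows this standard convolution step with scipy.signal.fftconvolve, using synthetic stand-ins for both the recording and the measured impulse response.

```python
import numpy as np
from scipy.signal import fftconvolve

def auralize(dry, impulse_response):
    """Convolve a dry recording with a room impulse response."""
    wet = fftconvolve(dry, impulse_response, mode="full")
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet  # normalise to avoid clipping

# Synthetic stand-ins for measured data
sr = 48_000
t = np.arange(sr) / sr
dry = np.sin(2 * np.pi * 220.0 * t)  # 1 s dry test tone
# Toy exponential reverb tail: amplitude falls ~60 dB over 1 s (RT60 ~= 1 s)
ir = np.exp(-6.9 * t) * np.random.default_rng(0).standard_normal(sr)

wet = auralize(dry, ir)
print(wet.shape)  # length is len(dry) + len(ir) - 1
```

With ambisonic impulse responses, the same convolution is applied per channel, preserving the directional reflection pattern for spatialised playback in the XR environment.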
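Production-grade polyphonic transcription would rely on a trained model, but the toy spectral peak picker below illustrates the basic archival data flow from an audio frame to a set of pitch estimates; the frame size, test chord and thresholds are all illustrative, and accuracy is limited by FFT bin resolution.

```python
import numpy as np

def dominant_pitches(frame, sr, top_n=3, fmin=50.0):
    """Toy peak picker: return the top_n strongest spectral bins (Hz).

    A crude stand-in for real polyphonic transcription, shown only to
    illustrate turning audio frames into archivable pitch data.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    spectrum[freqs < fmin] = 0.0           # ignore sub-audio rumble
    peaks = np.argsort(spectrum)[-top_n:]  # indices of strongest bins
    return sorted(freqs[peaks])

sr = 48_000
t = np.arange(4096) / sr
# Synthetic C major triad: C3, E3, G3
chord = sum(np.sin(2 * np.pi * f * t) for f in (130.8, 164.8, 196.0))
# Prints three frequencies near the triad, within one FFT bin (~11.7 Hz)
print([round(f, 1) for f in dominant_pitches(chord, sr)])
```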