With visual/rendering materials, a very popular system for this is physically based rendering (PBR). A PBR material may define properties like albedo color, normal maps, emission, roughness, metallic, specular, etc. This allows creating objects that interact with light in a realistic way, using general-purpose parameters.
With physics materials, it's common to define friction, restitution/bounciness, absorbency, etc. These general-purpose parameters define how objects react when sliding against or impacting each other in the physics engine.
With audio materials, such as those used for footstep sounds, impact sounds, bullet hole sounds, etc., as far as I know most games just use a hard-coded list of sounds. For example, a game may tag an object as sounding like "wood", so when you walk on that object it plays the wood footstep sound.
However, I am interested in building an engine-agnostic representation of audio materials, so a hard-coded list of sounds is not ideal. I would prefer a set of parameters, like PBR provides for visual materials. For example, suppose game engine A has "wood" and "birch wood", while game engine B has only "wood". I would like to convert "birch wood" into parameters so that engine B can play the closest equivalent sound to those parameters, interpolate between sounds, or even procedurally generate a sound.
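To make the idea concrete, here is a minimal sketch of the conversion I have in mind. All of the parameter names (`resonance`, `brittleness`, `hollowness`, `hardness`), their value ranges, and the example numbers are hypothetical — the point is only that matching happens in a shared parameter space rather than by string name:

```python
from dataclasses import dataclass, fields

@dataclass
class AudioMaterialParams:
    # Hypothetical general-purpose channels, by analogy with PBR's
    # albedo/roughness/metallic. All values are normalized to [0, 1].
    resonance: float = 0.0    # 0 = dead/damped, 1 = long ringing
    brittleness: float = 0.0  # 0 = dull thud, 1 = sharp crack
    hollowness: float = 0.0   # 0 = solid, 1 = hollow/boomy
    hardness: float = 0.0     # 0 = soft surface, 1 = hard surface

def distance(a: AudioMaterialParams, b: AudioMaterialParams) -> float:
    # Euclidean distance in parameter space.
    return sum((getattr(a, f.name) - getattr(b, f.name)) ** 2
               for f in fields(a)) ** 0.5

def closest_material(target: AudioMaterialParams,
                     library: dict[str, AudioMaterialParams]) -> str:
    # Pick the sound set whose parameters best match the target.
    return min(library, key=lambda name: distance(target, library[name]))

# Engine B only ships sounds for "wood" and "metal". Engine A's
# "birch wood" arrives as parameters and maps to the closest match.
library = {
    "wood":  AudioMaterialParams(0.30, 0.40, 0.50, 0.50),
    "metal": AudioMaterialParams(0.90, 0.20, 0.30, 0.90),
}
birch_wood = AudioMaterialParams(0.35, 0.45, 0.55, 0.55)
print(closest_material(birch_wood, library))  # -> wood
```

Instead of snapping to the nearest named material, an engine could also use the per-channel distances as blend weights to crossfade between its sound sets, which is the interpolation case mentioned above.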
Is there a PBR-like material system for audio? Is there a way to define how objects sound using a set of parameters instead of a hard-coded list of strings — some set of parameters covering resonance, brittleness, impact response, thermal response, moisture response, hollowness, reverberation, etc.? Do any existing game engines have this? What about other software, such as the tools used to make movies?