Re: Looking for the chart that converts XILICA or others to others
I wonder if some manufacturers would resist a standardized bandwidth definition because their filter definitions give a particular "sound" to their particular product, errors and all. In theory, if everyone's filter parameters matched, the filters should sound identical.
In an ideal world, there would be only one problem: how would all the published settings and/or existing presets out there map onto the standard Q/BW definition? Even with no real-world technical reasons for DSPs to sound different, and no business reasons to want incompatible definitions, the transition to a standardized Q/BW definition would not be easy. Replacing many thousands of presets/units so that the "new" settings sound right, while still being able to use them in a mixed setup with "old" products, is not going to happen. So manufacturers will stick to their existing definitions until a standard emerges that they are forced to adopt for some reason.
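To illustrate how the same knob value can mean different filters: here's a minimal Python sketch of the textbook conversion between bandwidth in octaves and Q (the symmetric, bandpass-referenced -3dB definition). The formulas are my assumption of the common "textbook" convention, not any particular manufacturer's; a product that defines bandwidth at the half-gain points, or that simply labels Q = 1/BW, lands on a different filter for the same displayed number.

```python
import math

def bw_octaves_to_q(n_octaves):
    """Bandwidth in octaves -> Q, textbook (bandpass-referenced) convention."""
    return 1.0 / (2.0 * math.sinh(0.5 * math.log(2.0) * n_octaves))

def q_to_bw_octaves(q):
    """Inverse: Q -> bandwidth in octaves."""
    return (2.0 / math.log(2.0)) * math.asinh(1.0 / (2.0 * q))

# Under this convention a "1 octave" bell is Q ~= 1.414; a unit that
# labels the same knob as Q = 1/BW would display Q = 1.0 instead.
print(round(bw_octaves_to_q(1.0), 3))   # 1.414
print(round(q_to_bw_octaves(1.414), 3)) # 1.0
```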
Plus: the world is not ideal. Even with matching Q/BW definitions for bell filters and shelving filters, DSP units will sound different because of real-world limitations, the most obvious being sample rate. On the same processor, with the same Q/BW definitions, a given filter will sound different when you switch from a 48kHz sample rate to a 96kHz sample rate. This is due to the "warping" of filter responses at frequencies near (or even not quite so near) half the sample rate, but also to secondary effects such as differences in the anti-aliasing filters used. Not to mention the perceived "sound" of different A/D and D/A converters, and analog signal paths that would cause an engineer to choose a different setting just because some opamps may sound different than others. Add processing accuracy and plain human mistakes on top of that.
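To put a rough number on the warping point, here's a quick sketch using the well-known RBJ "Audio EQ Cookbook" peaking biquad (my choice of a representative design, not any specific product's implementation). The same f0/Q/gain settings leave noticeably different gain near the top of the audio band depending on sample rate:

```python
import cmath
import math

def rbj_peaking(fs, f0, q, gain_db):
    """RBJ cookbook peaking EQ coefficients, normalized so a0 == 1."""
    a_gain = 10.0 ** (gain_db / 40.0)          # sqrt of linear gain
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha / a_gain
    b = [(1.0 + alpha * a_gain) / a0, -2.0 * math.cos(w0) / a0,
         (1.0 - alpha * a_gain) / a0]
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha / a_gain) / a0]
    return b, a

def mag_db(b, a, f, fs):
    """Magnitude response in dB at frequency f."""
    z1 = cmath.exp(-2j * math.pi * f / fs)      # z^-1 on the unit circle
    num = b[0] + b[1] * z1 + b[2] * z1 * z1
    den = a[0] + a[1] * z1 + a[2] * z1 * z1
    return 20.0 * math.log10(abs(num / den))

# Same "knob settings" at two rates: +6 dB bell at 16 kHz, Q = 2.
# Measured at 20 kHz: roughly +0.5 dB at fs=48k vs +2.6 dB at fs=96k,
# purely from bilinear warping toward Nyquist.
for fs in (48000, 96000):
    b, a = rbj_peaking(fs, 16000.0, 2.0, 6.0)
    print(fs, "Hz:", round(mag_db(b, a, 20000.0, fs), 2), "dB at 20 kHz")
```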
So, even with a fully standardized filter response, a 1:1 match would never be possible. And for reasons already stated in earlier posts, it may not even be desirable, from a manufacturer's point of view.
That being said, I am a strong supporter of standardizing Q/BW as a function of gain. I just don't see it happening.