Ogg quality levels

I have already read the UE4 documentation on sound compression, but I would still like to clarify something: what is the exact correlation between the scale used in UE4 (where you can set quality from 1 to 100, with 100 being the highest) and the OGG bitrate? I'm asking because I know that OGG can go up to 500 kbps, but I'm not sure whether 100 in Unreal Engine equals 500 kbps.

We actually reduce the quality slightly in an attempt to get sound quality that is consistent cross-platform with XMA. The quality setting in the editor is reduced by 15 and then divided by 100 to get an OGG quality setting clamped between -0.1 and 1. A quality of 100 therefore results in an OGG setting of 0.85, which would fall in the range of 256 - 320 kb/s, though as that wiki suggests, the quality setting is not intended to correlate to an exact bitrate.
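The mapping described above can be sketched as follows; this is an illustration of the stated formula, not actual UE4 source, and the function name is made up:

```python
def ue_quality_to_vorbis(editor_quality: int) -> float:
    """Map the editor's 1-100 quality slider to an OGG Vorbis base
    quality: subtract 15, divide by 100, clamp to [-0.1, 1.0]."""
    raw = (editor_quality - 15) / 100.0
    return max(-0.1, min(1.0, raw))

print(ue_quality_to_vorbis(100))  # 0.85
print(ue_quality_to_vorbis(40))   # 0.25
print(ue_quality_to_vorbis(1))    # -0.14 clamped up to -0.1
```

So the editor's maximum of 100 can never reach Vorbis quality 1.0 (which is where the ~500 kbps ceiling lives); 0.85 is the cap.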

Thank you for the answer, Matt. Most people on non-audiophile equipment can't tell the difference between MP3 at 320 kb/s and lossless FLAC in an ABX test, so this is more than satisfactory. Besides, the Vorbis codec is more efficient than MP3, so OGG at 320 kb/s sounds better than MP3 at 320 kb/s.

What would the default XMA quality of 40 compare to, then? Going by your example it should come out to an OGG quality setting of (40 − 15) / 100 = 0.25, roughly 125 kbps. Is that correct?
It would be nice to have more explanation in the documentation, so people could judge for themselves what XMA compression quality they need.

Can we bypass this behavior when targeting only certain platforms, please? Audiophile here. The target audience may also be audiophiles…

Perhaps this should only be a default behavior, with a configurable option to bypass it.