

A higher bit rate must mean DTS is superior sounding, right? In theory, the less compression used in the encoding process, the more realistic the sound will be, as it better represents the original source. DD compresses a 5.1-channel surround track to 384-448 kbps (a DVD-standard limit; DD itself is capable of up to 640 kbps), while DTS uses much higher bit rates: up to 1.4 Mbps for CDs/LDs and 1.5 Mbps for DVDs. To make the best use of the limited space allocated on a DVD for audio soundtracks, DD and DTS use lossy data-reduction algorithms, which reduce the number of bits needed to encode an audio signal. If anybody knows more about this, we'd like to hear it. (See the rough size arithmetic sketched below.)

Since DTS-HD does not use normalization, the processed signal is essentially bit-for-bit. Correct me if I'm wrong, but my understanding of Dolby TrueHD (which I read somewhere) is that when normalization is applied during encoding, the attenuation on the encoder is set to -2 dB (or is it -4 dB?), the dialog normalization default, resulting in a lower SPL relative to DTS at an equivalent gain setting.
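To make the space trade-off concrete, here's a minimal Python sketch of how those bit rates translate into track size on disc. The 2-hour runtime is a hypothetical example; the 448 kbps / 640 kbps / 1.5 Mbps figures are the ones quoted above.

RUNTIME_S = 2 * 60 * 60  # hypothetical 2-hour film = 7200 seconds

tracks = {
    "DD @ 448 kbps (DVD cap)": 448_000,
    "DD @ 640 kbps (format max)": 640_000,
    "DTS @ 1.5 Mbps (full rate)": 1_500_000,
}

for name, bits_per_second in tracks.items():
    # bit rate is bits per second, so: bytes = rate * seconds / 8
    size_mib = bits_per_second * RUNTIME_S / 8 / 1024 ** 2
    print(f"{name}: about {size_mib:,.0f} MiB")

That works out to roughly 385 MiB for DD at the DVD cap versus roughly 1.3 GiB for full-rate DTS over the same two hours, which is why bit rate had to be capped at all on a DVD shared with the video stream.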

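On the dialnorm point: a fixed dB attenuation is just a linear scale factor applied to the samples, which is why the Dolby track plays quieter than DTS at the same volume setting. A quick sketch of the conversion; both the -2 dB and -4 dB figures come from the question above, and I'm not asserting which one is the actual encoder default.

def db_to_gain(db: float) -> float:
    # A level change in dB maps to a linear amplitude factor of 10^(dB/20).
    return 10 ** (db / 20)

for attenuation_db in (-2.0, -4.0):
    gain = db_to_gain(attenuation_db)
    print(f"{attenuation_db:+.0f} dB -> amplitude scaled by {gain:.3f} "
          f"({(1 - gain) * 100:.0f}% lower amplitude)")

So -2 dB scales the samples by about 0.79 and -4 dB by about 0.63, enough of a drop that a level-matched A/B comparison against DTS is misleading unless you compensate for it.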

