FEC Settings

Forward Error Correction

FEC can be enabled on the sender and is recognized automatically by the receiver. All schemes can be used for video, while the first one (multiplied stream) can also be used for audio.

For high-bitrate video (uncompressed, JPEG, DXT) it is usually a good idea to use LDGM. For lower-bitrate video (H.264/HEVC) it is advisable to use Reed-Solomon.

Interleaved multiplied stream

Turned on by

uv ... -f mult:3

where 3 is the multiplication factor, so the video stream is sent three times.
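For instance, a complete sender command might look as follows (testcard is UltraGrid's built-in test source; the receiver address is a placeholder):

uv -t testcard -f mult:2 receiver.example.org

The stream is then sent twice, so the receiver can recover from isolated packet losses at the cost of doubled bandwidth.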

LDGM

There are several ways to control the properties of LDGM:

  • If you are familiar with the LDGM scheme, you can set its properties directly. The syntax is:

uv ... -f LDGM:<k>:<m>:<c>

where <k> is the matrix width, <m> the matrix height and <c> the number of ones per column.

Basically, k specifies the matrix width and m the number of redundant lines (the ratio m/k thus determines the redundancy). c should be a small value, usually around 5, while a good value for k is on the order of hundreds to a few thousand (say, up to 2000); see the worked example after this list.

  • You can also use the following syntax:

uv ... -f LDGM:<p>%

In that case, UltraGrid tries to cover losses of up to <p> percent. Please note that this does not guarantee that such a loss will actually be covered - the parameters are picked from a few predefined presets tuned for FullHD formats (uncompressed or JPEG). For other streams, e.g. H.264, it will not give good results (and for H.264/H.265 a Reed-Solomon scheme is the better choice anyway).

  • The last possibility is not to specify anything:

uv ... -f LDGM

In that case, static predefined values with 1/3 redundancy are used. Note that this is useful only in some cases.
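As a worked example of the first form (the concrete numbers are illustrative, not a tuned preset):

uv ... -f LDGM:2000:660:5

Here k = 2000 source symbols are protected by m = 660 redundant lines, i.e. 660/2000 = 33% redundancy, with c = 5 ones per column, matching the recommendations above.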

Selecting encoding device

By default, the CPU is used to compute LDGM parity. This is usually sufficient for lower bitrates (up to uncompressed HD). However, its performance falls behind at higher bitrates. In that case, the CUDA implementation of LDGM should be used:

uv -f LDGM --ldgm-device GPU -t <capture> <receiver> # sets encoding of LDGM on GPU

In a similar way, you can set LDGM decoding to run on the GPU:

uv -d gl --ldgm-device GPU <sender>

If you want to state explicitly that encoding/decoding should be performed on the CPU (the default), you can use the option:

uv --ldgm-device CPU
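The device selection can be combined with any of the LDGM syntaxes above. For example, the following sender command (a sketch, with capture and receiver as placeholders) requests coverage of up to 30% loss and computes the parity on the GPU:

uv -f LDGM:30% --ldgm-device GPU -t <capture> <receiver>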

Reed–Solomon

For streams with a lower bitrate (e.g. H.264), it is more useful to use Reed-Solomon error correction codes. Usage:

uv -f rs -t <capture> -c libavcodec:codec=H.264

This uses Reed-Solomon with default parameters. You can also specify the RS parameters directly with the following syntax:

uv -f rs:k:n

  • k is the count of source symbols
  • n is the count of generated symbols (source + parity)
  • k/n gives the code rate of the scheme; k around 200 is recommended because neither k nor n may exceed 255 (the code operates on byte-sized symbols, which limits the codeword length to 255)

Therefore, the following command results in 200 source symbols plus 50 redundant symbols per frame (25% redundancy):

uv -f rs:200:250 -t <capture> -c libavcodec:codec=H.264
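Since the FEC scheme is recognized automatically on the receiving side (as noted above), the receiver needs no extra option. A minimal sketch of a corresponding pair (testcard and the gl display stand in for a real capture and display; the receiver address is a placeholder):

uv -f rs:200:250 -t testcard -c libavcodec:codec=H.264 receiver.example.org # sender

uv -d gl # receiver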
