loopback: Increase the maximum allowed latency

Currently the biggest possible sink latency is 10 seconds. The total
latency of the loopback is divided evenly between the source, an
intermediate buffer and the sink, so if I want to test a 10 s sink
latency, the total needs to be three times that, i.e. 30 seconds.
Tanu Kaskinen 2013-07-16 14:12:42 +03:00
parent 908deb136c
commit 60f1e695fd

@@ -811,7 +811,7 @@ int pa__init(pa_module *m) {
     channels_set = true;
 
     latency_msec = DEFAULT_LATENCY_MSEC;
-    if (pa_modargs_get_value_u32(ma, "latency_msec", &latency_msec) < 0 || latency_msec < 1 || latency_msec > 2000) {
+    if (pa_modargs_get_value_u32(ma, "latency_msec", &latency_msec) < 0 || latency_msec < 1 || latency_msec > 30000) {
         pa_log("Invalid latency specification");
         goto fail;
     }