It is now possible to have a negative delay, which means that the
stream should be delayed to sync with other streams. The pulse-server
sets negative delays and the Latency message can hold those negative
values, so make sure we handle them in the helper functions as well.
Do the delay calculations in pw_stream and JACK with signed values to
correctly handle negative values. Clamp the JACK latency range to 0
because negative latency is not supported in JACK.
We should also probably make sure we never end up with negative
latency, mostly in ALSA when we set a Latency offset, but that is
another detail.
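As a rough illustration only (not the actual pw_stream or JACK code,
and the helper names are made up), the idea is to keep the delay signed
throughout and clamp only at the JACK boundary:

```c
#include <stdint.h>

/* Keep the delay signed so a negative value (stream needs to be
 * delayed to sync with others) survives the calculation. */
static inline int64_t total_delay(int64_t buffered, int64_t latency_offset)
{
	return buffered + latency_offset;	/* may be negative */
}

/* JACK cannot express negative latency, so clamp to 0 here. */
static inline uint32_t jack_latency(int64_t delay)
{
	return delay < 0 ? 0 : (uint32_t)delay;
}
```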
Add a new followerClock block to the profiler info. It is only set when
the follower could be a driver, and it contains the clock info used for
following the driver, mainly the rate difference and delay.
Dump this info in pw-profiler -J.
Make sure we always set the info in the clock, also when we are
following.
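For illustration only, the followerClock block can be pictured as
carrying per-follower fields along these lines (the names, types and
units are hypothetical, not the actual profiler/SPA fields):

```c
#include <stdint.h>

/* Hypothetical sketch of the clock info a follower keeps about the
 * driver it follows; the real layout may differ. */
struct follower_clock_info {
	uint32_t clock_id;	/* id of the clock being followed */
	double rate_diff;	/* measured rate difference vs. the driver */
	int64_t delay;		/* delay (in samples) used for following */
};
```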
Make a function that can initialize raw audio info from a dict and fill
in the defaults. We can use this in many of the modules when the audio
format is parsed.
Use the helper instead of duplicating the same code.
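A minimal sketch of such a helper, assuming spa_dict-style input; the
helper name, key names and defaults are illustrative rather than the
actual API:

```c
#include <stdlib.h>
#include <spa/utils/defs.h>
#include <spa/utils/dict.h>
#include <spa/param/audio/raw.h>

static void audio_info_from_dict(struct spa_audio_info_raw *info,
				 const struct spa_dict *dict)
{
	const char *str;

	spa_zero(*info);
	/* defaults, used when a key is missing from the dict */
	info->format = SPA_AUDIO_FORMAT_F32;
	info->rate = 48000;
	info->channels = 2;

	if ((str = spa_dict_lookup(dict, "audio.rate")) != NULL)
		info->rate = atoi(str);
	if ((str = spa_dict_lookup(dict, "audio.channels")) != NULL)
		info->channels = atoi(str);
	/* "audio.format" would be mapped from its name to an id here */
}
```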
Also add some helpers to parse a JSON array of uint32_t.
Move some functions that convert between type name and id.
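A hedged sketch of such an array helper using the spa_json parser (the
helper name and exact error handling are illustrative):

```c
#include <errno.h>
#include <string.h>
#include <stdint.h>
#include <spa/utils/json.h>

/* Parse a JSON array such as "[ 1, 2, 3 ]" into a uint32_t array,
 * returning the number of parsed values or a negative errno. */
static int parse_uint32_array(const char *str, uint32_t *vals, int max)
{
	struct spa_json it[2];
	int n = 0, v;

	spa_json_init(&it[0], str, strlen(str));
	if (spa_json_enter_array(&it[0], &it[1]) <= 0)
		return -EINVAL;
	while (n < max && spa_json_get_int(&it[1], &v) > 0)
		vals[n++] = (uint32_t)v;
	return n;
}
```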
UMP (Universal MIDI Packet) is an improved format for transmitting MIDI
messages. It also supports MIDI 2.0 and is backwards compatible with
MIDI 1.0.
The PipeWire libcamera spa plugin exposes multiple camera properties.
Unlike v4l2, libcamera usually exposes these as normalized floating
point values. But as the SPA_PROP types are based on v4l2, they are
currently set to integers.
This causes a problem when using pw-cli to change the properties,
as the spa_json_to_pod_part function casts the properties according
to their spa_type_info. Other software that doesn't depend on the
spa_type_info can correctly set the properties, as the values are
encoded in the spa_pod type and therefore also carry a type.
As the limited range that results from switching from integers to
floats is likely not a problem, the affected spa properties were
changed to the Float type.
This will cause pw-cli to also generate spa_pod values of type float
when setting v4l2 properties. Therefore the v4l2 spa plugin is also
adapted to allow floating point properties and cast these to integers.
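An illustrative sketch of how the v4l2 side can accept both Float and
Int pod values and cast to the integer control value (not the actual
plugin code):

```c
#include <errno.h>
#include <spa/pod/iter.h>

static int prop_value_to_int(const struct spa_pod *value, int32_t *out)
{
	float f;
	int32_t i;

	if (spa_pod_get_float(value, &f) == 0)
		*out = (int32_t)f;	/* Float property, cast for v4l2 */
	else if (spa_pod_get_int(value, &i) == 0)
		*out = i;		/* plain Int still accepted */
	else
		return -EINVAL;
	return 0;
}
```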
Signed-off-by: Sven Püschel <s.pueschel@pengutronix.de>
This provides access to GNU C library-style endian and byteswap functions.
Windows doesn't provide pre-processor defines for endianness, but
all current Windows architectures (X32, X64, ARM) are little-endian.
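The kind of fallback this adds, sketched with MSVC byteswap intrinsics
and a hard-coded little-endian byte order (macro names follow the GNU
convention; this is an approximation, not the exact header contents):

```c
#if defined(_WIN32)
#include <stdlib.h>

/* All supported Windows targets are little-endian, so the byte
 * order can simply be hard-coded. */
#define __LITTLE_ENDIAN	1234
#define __BIG_ENDIAN	4321
#define __BYTE_ORDER	__LITTLE_ENDIAN

/* byteswap.h-style helpers mapped to MSVC intrinsics */
#define bswap_16(x)	_byteswap_ushort(x)
#define bswap_32(x)	_byteswap_ulong(x)
#define bswap_64(x)	_byteswap_uint64(x)
#endif
```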
Change the GenericFd data type to SyncObj. It's probably better to
explicitly state the data type than to make something generic.
Otherwise we would need to transfer the specific fd type somewhere
else: there is no room for that in the buffer, and the metadata is not
a good idea either because it can be modified and corrupted at runtime.
Add the SyncTimeline metadata. This contains two points on two
timelines (SyncObj datas in the buffer). The buffer can be accessed
when the acquire_point is signaled on its timeline, and when the buffer
can be released, the release_point on its timeline should be signaled.
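As a rough sketch, the metadata carries the two timeline points roughly
like this (assuming a layout along these lines; the real struct may
differ):

```c
#include <stdint.h>

struct sync_timeline_meta {
	uint32_t flags;
	uint32_t padding;
	uint64_t acquire_point;	/* buffer may be accessed once this point is signaled */
	uint64_t release_point;	/* signal this point when the buffer can be released */
};
```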
Add a SPA_PARAM_BUFFERS_metaType in the Buffers object. This contains a
bitmask of the mandatory metadata items that should be included on a
buffer when using this Buffers param.
Make the buffer allocation logic skip over the Buffers params that
require unavailable metadata.
This can be used to, for example, enforce specific metadata to describe
extra buffer memory (such as the meaning of generic file descriptors).
One such use is explicit sync, where an extra buffer data is needed for
the sync fd, along with metadata that contains the sync_point.
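A simplified sketch of the skipping logic (names are illustrative): the
allocation code compares the required metadata bitmask against what is
actually supported and skips the Buffers param otherwise.

```c
#include <stdbool.h>
#include <stdint.h>

/* Return true when a Buffers param can be used: metaType is a bitmask
 * of mandatory metadata, so every required bit must also be supported. */
static bool buffers_param_usable(uint32_t required_metas, uint32_t supported_metas)
{
	return (required_metas & ~supported_metas) == 0;
}
```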
HFP 1.9 adds LC3 as a possible codec in addition to CVSD & mSBC.
For example, the latest Pixel Buds Pro firmware supports it.
Add the RFCOMM side and codec selection for it.
The Tag param has a list of arbitrary key/value pairs. Like the Latency
param, it travels upstream and downstream. Mixers will append the info
dictionaries or do some more fancy merging.
The purpose is to transport arbitrary metadata, out-of-band, through
the graph; it is used for stream metadata and other stream properties.
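Conceptually (this is only an illustration, not the real SPA_PARAM_Tag
layout), the param can be pictured as a number of dictionaries carried
through the graph:

```c
#include <stdint.h>
#include <spa/utils/dict.h>

/* Conceptual sketch only: the Tag param carries arbitrary key/value
 * dictionaries that mixers may append to or merge along the way. */
struct tag_param_sketch {
	uint32_t n_dicts;
	const struct spa_dict *dicts;	/* e.g. stream metadata, stream properties */
};
```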
WirePlumber checks for the ENCODED audio format to determine if the
format is compressed/encoded. Without this info, it is not able
to automatically link compressed audio nodes.