Currently the v4l2 and libcamera plugins map `SPA_PROP_exposure` in incompatible
ways. Change the v4l2 mapping to `V4L2_CID_EXPOSURE_ABSOLUTE`, because at least
that is in units of time (a step closer to addressing #4697) and because it
is more relevant for UVC cameras.
Also change the pipewire-v4l2 translation layer.
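Roughly, the change amounts to picking a different control id for the same SPA property; a sketch (the helper name and table are illustrative, not the actual v4l2-utils.c code):

```c
#include <stdint.h>
#include <linux/videodev2.h>
#include <spa/param/props.h>

static uint32_t prop_to_control_id(uint32_t prop)
{
	switch (prop) {
	case SPA_PROP_exposure:
		/* previously V4L2_CID_EXPOSURE; the absolute variant is
		 * expressed in units of 100 µs and is what UVC exposes */
		return V4L2_CID_EXPOSURE_ABSOLUTE;
	case SPA_PROP_brightness:
		return V4L2_CID_BRIGHTNESS;
	default:
		return 0;
	}
}
```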
Fixes warning:
[186/359] Compiling C object spa/plugins/v4l2/libspa-v4l2.so.p/v4l2-source.c.o
In file included from ../spa/plugins/v4l2/v4l2-source.c:164:
../spa/plugins/v4l2/v4l2-utils.c: In function ‘spa_v4l2_enum_format’:
../spa/plugins/v4l2/v4l2-utils.c:1103:22: warning: unused variable ‘drop_next’ [-Wunused-variable]
1103 | bool drop_next = false;
| ^~~~~~~~~
This assumes that the modifier is always 'linear'. That is not quite
correct for all V4L2 formats, but PipeWire only uses input devices and
other modifiers are very unlikely.
This makes it possible to use DMABUFs with the GStreamer pipewiresrc.
The workaround is typically needed with USB cameras using JPEG streams.
Re-enabling the skip shouldn't affect what the commit was trying to do,
i.e. fixing encoded streams like H.264.
Fixes: bcf0c0cf8 (v4l2: only skip buffer for raw formats)
Without this, for continuous frame intervals the default is set to 25
fps even if that is outside the min/max bounds (e.g. defined by the
peer).
Clamp the default to the min/max bounds to avoid this.
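The fix boils down to a fraction clamp; a minimal sketch with a hypothetical fraction type, not the plugin's actual code:

```c
#include <stdint.h>

/* hypothetical fraction type; the plugin has its own representation */
struct frac { uint32_t num, denom; };

/* a < b, compared without floating point */
static int frac_lt(struct frac a, struct frac b)
{
	return (uint64_t)a.num * b.denom < (uint64_t)b.num * a.denom;
}

/* clamp the 25/1 default frame rate into the advertised [min, max] range */
static struct frac clamp_framerate(struct frac def, struct frac min, struct frac max)
{
	if (frac_lt(def, min))
		return min;
	if (frac_lt(max, def))
		return max;
	return def;
}
```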
V4L2 uses frame intervals instead of frame rates, which is basically the
inverse. For the most part, this is already handled correctly and
numerator and denominator are swapped accordingly.
However, the minimum frame interval corresponds to the maximum frame
rate, so min and max need to be swapped as well.
Without this, the minimum frame rate is larger than the maximum frame
rate for v4l2 devices that define a continuous frame interval.
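In terms of the V4L2 stepwise structure, the conversion looks roughly like this (sketch only):

```c
#include <linux/videodev2.h>

/* Turn a continuous/stepwise frame-interval range into a frame-rate range:
 * each fraction is inverted (numerator/denominator swap), and because the
 * smallest interval is the highest rate, min and max swap places too. */
static void interval_range_to_rate_range(const struct v4l2_frmival_stepwise *ival,
		struct v4l2_fract *rate_min, struct v4l2_fract *rate_max)
{
	rate_min->numerator   = ival->max.denominator;	/* 1 / longest interval */
	rate_min->denominator = ival->max.numerator;
	rate_max->numerator   = ival->min.denominator;	/* 1 / shortest interval */
	rate_max->denominator = ival->min.numerator;
}
```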
When we don't support EXPBUF according to the probe, the host will
allocate memory for us. We will then try to use USERPTR to import the
memory into v4l2.
If this is not supported, fall back to MMAP: mmap the buffers and memcpy
the MMAP buffers into the host-allocated buffers.
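A reduced sketch of that fallback order, using plain V4L2 ioctls and minimal error handling (not the plugin's actual buffer code):

```c
#include <errno.h>
#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Try to import the host-allocated memory with USERPTR; if the driver
 * rejects that, fall back to driver-owned MMAP buffers (the capture loop
 * then memcpies each dequeued MMAP buffer into the host buffer). */
static int request_import_or_mmap(int fd, uint32_t count, enum v4l2_memory *memtype)
{
	struct v4l2_requestbuffers req;

	memset(&req, 0, sizeof(req));
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_USERPTR;
	req.count = count;
	if (ioctl(fd, VIDIOC_REQBUFS, &req) == 0) {
		*memtype = V4L2_MEMORY_USERPTR;		/* zero-copy import */
		return 0;
	}
	if (errno != EINVAL && errno != ENOTTY)
		return -errno;

	memset(&req, 0, sizeof(req));
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_MMAP;
	req.count = count;
	if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0)
		return -errno;
	*memtype = V4L2_MEMORY_MMAP;			/* copy path */
	return 0;
}
```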
Instead of just probing with 2 buffers, probe with MAX_BUFFERS and
remember how many buffers we received.
Some drivers (v4l2loopback) work with fewer than MAX_BUFFERS (8) and we
need to set the max number of buffers in the Buffers param.
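This works because VIDIOC_REQBUFS adjusts the requested count to what the driver can provide; roughly (a sketch, with MAX_BUFFERS assumed to be the plugin's limit of 8):

```c
#include <errno.h>
#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

#define MAX_BUFFERS 8	/* assumed plugin limit */

/* VIDIOC_REQBUFS adjusts .count to what the driver can actually provide,
 * so ask for MAX_BUFFERS and remember the answer for the Buffers param. */
static int probe_buffer_count(int fd, enum v4l2_memory memtype, uint32_t *n_buffers)
{
	struct v4l2_requestbuffers req;

	memset(&req, 0, sizeof(req));
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = memtype;
	req.count = MAX_BUFFERS;
	if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0)
		return -errno;

	*n_buffers = req.count;		/* v4l2loopback may return fewer than 8 */
	return 0;
}
```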
Some V4L2 controls like focus, pan or tilt can be relative (change
the lens / angle relative to the previous position); these return EACCES
when calling VIDIOC_G_CTRL on them.
Fix the v4l2 SPA not working on cameras with relative controls,
such as the Logitech QuickCam Orbit MP.
Signed-off-by: Hans de Goede <hdegoede@redhat.com>
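The shape of the fix, as a sketch rather than the plugin's actual helper:

```c
#include <errno.h>
#include <stdbool.h>
#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Relative controls (e.g. V4L2_CID_PAN_RELATIVE) have no readable current
 * value; VIDIOC_G_CTRL fails with EACCES. Treat that as "no current value"
 * instead of aborting the control enumeration. */
static int read_control_value(int fd, uint32_t cid, int32_t *value, bool *has_value)
{
	struct v4l2_control ctrl;

	memset(&ctrl, 0, sizeof(ctrl));
	ctrl.id = cid;
	if (ioctl(fd, VIDIOC_G_CTRL, &ctrl) < 0) {
		if (errno == EACCES) {
			*has_value = false;	/* write-only / relative control */
			return 0;
		}
		return -errno;
	}
	*has_value = true;
	*value = ctrl.value;
	return 0;
}
```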
When we drop the first buffer to avoid timestamp problems, queue it
again in the driver or else we will not be able to dequeue it again
later and we will be running with one buffer less.
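Sketch of the dequeue path (function name and the MMAP memory type are assumptions):

```c
#include <stdbool.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Dequeue one frame; if it is the very first one, drop it but immediately
 * requeue the buffer so the driver keeps its full set. Returns true when
 * the frame should be forwarded downstream. */
static bool dequeue_frame(int fd, struct v4l2_buffer *buf, bool *first_frame)
{
	memset(buf, 0, sizeof(*buf));
	buf->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	buf->memory = V4L2_MEMORY_MMAP;
	if (ioctl(fd, VIDIOC_DQBUF, buf) < 0)
		return false;

	if (*first_frame) {
		*first_frame = false;
		ioctl(fd, VIDIOC_QBUF, buf);	/* keep the buffer in rotation */
		return false;			/* drop the bad-timestamp frame */
	}
	return true;
}
```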
The PipeWire libcamera SPA plugin exposes multiple camera properties.
Unlike v4l2, libcamera usually exposes these as normalized floating-point
values. But as the SPA_PROP types are based on v4l2, they are
currently set to integers.
This causes a problem when using pw-cli to change the properties,
as the spa_json_to_pod_part function casts the properties according
to their spa_type_info. Other software that doesn't depend on the
spa_type_info can correctly set the properties, as the values are
encoded in the spa_pod type and therefore also carry a type.
As the limited range resulting from switching from integers to floats is
likely not a problem, the affected SPA properties were changed to the
Float type.
This will cause pw-cli to also generate spa_pod values of type float
when setting v4l2 properties. Therefore the v4l2 SPA plugin is also
adapted to accept floating-point properties and cast them to integers.
Signed-off-by: Sven Püschel <s.pueschel@pengutronix.de>
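A sketch of the conversion the v4l2 side now needs; the helper is illustrative, spa_pod_get_int()/spa_pod_get_float() are the real accessors:

```c
#include <errno.h>
#include <stdint.h>
#include <spa/pod/iter.h>

/* Accept either an Int or a Float pod for a control value and convert it
 * to the int32 that V4L2 controls expect. Illustrative helper only. */
static int prop_value_to_int(const struct spa_pod *value, int32_t *out)
{
	float f;

	if (spa_pod_get_int(value, out) == 0)
		return 0;
	if (spa_pod_get_float(value, &f) == 0) {
		*out = (int32_t)f;	/* truncate towards zero */
		return 0;
	}
	return -EINVAL;
}
```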
Use dynamic pod builder so that we can also build complex formats.
Make sure we zero the format before we parse it or else we end up with
potentially uninitialized values.
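For example (a sketch; the wrapper is illustrative, spa_format_video_raw_parse() is the real parser):

```c
#include <string.h>
#include <spa/param/video/format-utils.h>

/* The parser only fills the fields present in the pod, so clear the info
 * struct first (spa_zero() in SPA style) to avoid stale values. */
static int parse_raw_format(const struct spa_pod *format, struct spa_video_info_raw *info)
{
	memset(info, 0, sizeof(*info));
	return spa_format_video_raw_parse(format, info);
}
```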
When VIDIOC_ENUM_FRAMESIZES or VIDIOC_ENUM_FRAMEINTERVALS return EINVAL for the
first index, make a dummy result and continue with that. This will
trigger an intersection with the filter so that we end up with something valid
instead of nothing.
Handle 0 framerates without crashing.
See #4063
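Sketch of the dummy-result idea for frame sizes (the synthesized bounds are illustrative):

```c
#include <errno.h>
#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* If the driver answers EINVAL already at index 0, synthesize a permissive
 * stepwise range so the intersection with the filter can still yield a
 * usable size. The bounds below are illustrative. */
static int enum_first_framesize(int fd, uint32_t pixelformat, struct v4l2_frmsizeenum *fsize)
{
	memset(fsize, 0, sizeof(*fsize));
	fsize->index = 0;
	fsize->pixel_format = pixelformat;
	if (ioctl(fd, VIDIOC_ENUM_FRAMESIZES, fsize) == 0)
		return 0;
	if (errno != EINVAL)
		return -errno;

	fsize->type = V4L2_FRMSIZE_TYPE_STEPWISE;
	fsize->stepwise.min_width = fsize->stepwise.min_height = 1;
	fsize->stepwise.max_width = fsize->stepwise.max_height = 16384;
	fsize->stepwise.step_width = fsize->stepwise.step_height = 1;
	return 0;
}
```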
Advertise support for the videotransform metadata.
Make a new meta.videotransform.transform property to configure the
desired video transformation in the metadata.
This makes it possible for a session manager or other rules to set
a custom transformation on the source.
See #4034
Quite a large number of UVC cameras have, due to firmware or kernel driver
issues, bad timestamps on the first frame, confusing clients
like pipewiresrc.
Drop the first frame, as this seems to be the most reliable workaround
for the time being.
Closes https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/3910
Add a MAPPABLE data flag that hints that the fd in the data is mappable
with a simple mmap/munmap. Normally, DmaBuf is not mappable like that
unless explicitly indicated with this flag.
Set the MAPPABLE flag on the DmaBuf fds from v4l2 and libcamera.
When asked, mmap the buffer memory whenever the MAPPABLE flag is set.
This solves the case where v4l2 has exported a DmaBuf and is streaming to
node A, and then node B links but doesn't automatically get mmapped
memory.
Fixes #3840
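A consumer-side sketch of what the flag enables; the SPA_DATA_FLAG_MAPPABLE name follows this commit, check the SPA headers for the exact definition:

```c
#include <sys/mman.h>
#include <spa/buffer/buffer.h>

/* Consumer side: a DmaBuf fd is normally not mmap()ed directly, but when
 * the producer marked it MAPPABLE a plain mmap of the fd is fine. */
static void *map_data(struct spa_data *d)
{
	if (d->type == SPA_DATA_MemFd ||
	    (d->type == SPA_DATA_DmaBuf && (d->flags & SPA_DATA_FLAG_MAPPABLE)))
		return mmap(NULL, d->maxsize, PROT_READ, MAP_SHARED,
				(int)d->fd, d->mapoffset);
	return MAP_FAILED;
}
```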
After we set the format, probe if we can do EXPBUF and enable/disable
the ALLOC_BUFFERS flag on the port.
This should gracefully handle the case where EXPBUF is not available.
Fixes #3821
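One way such a probe can look, sketched with plain ioctls (not necessarily how the plugin implements it):

```c
#include <errno.h>
#include <stdbool.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* After S_FMT, request one MMAP buffer and try to export it as a DMABUF.
 * If VIDIOC_EXPBUF fails (typically ENOTTY when unsupported), the port
 * should not advertise buffer allocation/export. */
static bool can_expbuf(int fd)
{
	struct v4l2_requestbuffers req;
	struct v4l2_exportbuffer exp;
	bool supported = false;

	memset(&req, 0, sizeof(req));
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_MMAP;
	req.count = 1;
	if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0 || req.count < 1)
		return false;

	memset(&exp, 0, sizeof(exp));
	exp.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	exp.index = 0;
	if (ioctl(fd, VIDIOC_EXPBUF, &exp) == 0) {
		close(exp.fd);
		supported = true;
	}

	req.count = 0;				/* release the probe buffers */
	ioctl(fd, VIDIOC_REQBUFS, &req);
	return supported;
}
```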
When we try to alloc buffers but EXPBUF is not supported, make sure to
clear the alloc_buffers flag so that the caller can try again with
allocated buffers instead.
See #3821
When the filter has no format property, just enumerate all possible
framerates. Handle the error case where the filter has the wrong type.
Makes gst-launch gstpipewiresrc ! video/x-raw ! fakesink work.
See #1793
Actually count the number of frame fractions we add. If we added 0, we
don't have any supported framerate that intersects with the filter and we
try the next frame size.
See #1793
A driver node should use the target_duration and target_rate to adjust
the quantum and rate when the graph starts.
The camera nodes don't currently support any of this and simply enforce
a specific rate and duration for the graph clock. Mark this with a
FIXME. Otherwise, pipewire will complain that the node is ignoring the
configured graph rate.
We should really look at the graph target rate/quantum and only produce
a buffer when it is inside the current graph cycle. This would make it
possible to join audio and camera nodes and have them be in sync.
When we get buffers from the driver, check if we have at least as many
as we requested. If we have more, that's ok, we will simply not queue
them.
Previously we would ignore the input number of buffers and use the
allocated amount to fill up the buffer array, which might be too small,
causing a crash.
It is valid for V4L2 devices to not implement any controls; QUERYCTRL
returns ENOTTY in that case. Enumerating the controls must then return
no controls instead of failing.
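A sketch of tolerant enumeration with the NEXT_CTRL flag:

```c
#include <errno.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Walk the controls with the NEXT_CTRL flag; ENOTTY from a device without
 * control support means "zero controls", not an error. Reduced to counting
 * here; the plugin keeps per-control state. */
static int enum_controls(int fd)
{
	struct v4l2_queryctrl qc;
	int n_controls = 0;

	memset(&qc, 0, sizeof(qc));
	qc.id = V4L2_CTRL_FLAG_NEXT_CTRL;
	while (ioctl(fd, VIDIOC_QUERYCTRL, &qc) == 0) {
		n_controls++;
		qc.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
	}
	if (errno == EINVAL || errno == ENOTTY)
		return n_controls;	/* end of list, or no controls at all */
	return -errno;
}
```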
The PropInfo either has a registered id (and then also a name from the
type-info) or a custom name as a string.
In all cases, the description contains free-form text that clarifies
the property.
Use the description as the stream control name.
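For illustration, a PropInfo for a registered id with a description could be built like this (the brightness values are placeholders):

```c
#include <spa/utils/type.h>
#include <spa/param/param.h>
#include <spa/param/props.h>
#include <spa/pod/builder.h>

/* Illustrative PropInfo for a control with a registered id: the name comes
 * from the type info, the description is the human-readable label that the
 * stream control now uses. */
static struct spa_pod *build_prop_info(struct spa_pod_builder *b)
{
	return spa_pod_builder_add_object(b,
		SPA_TYPE_OBJECT_PropInfo, SPA_PARAM_PropInfo,
		SPA_PROP_INFO_id,          SPA_POD_Id(SPA_PROP_brightness),
		SPA_PROP_INFO_description, SPA_POD_String("Brightness"),
		SPA_PROP_INFO_type,        SPA_POD_CHOICE_RANGE_Int(0, -128, 127));
}
```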