In multi-ASE configurations there can be multiple transports per device,
each corresponding to different channels.
Emit sink/source nodes for each BAP transport present.
Combine them into a single sink/source in the same way as we do for
device sets.
For multi-ASE configurations, BlueZ does the channel allocation itself,
and passes us the result in the ChannelAllocation parameter.
If it is present, don't do the allocation ourselves but use that value
instead.
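For illustration, the selection could look like the following sketch; the struct, its fields, and the fallback helper are illustrative, not the actual bluez5 plugin internals:
```
#include <stdint.h>

/* Sketch only: use BlueZ's ChannelAllocation when present (non-zero here),
 * otherwise fall back to allocating the channels ourselves. */
struct bap_config {
	uint32_t channel_allocation;	/* from the ChannelAllocation LTV, 0 if absent */
	uint32_t supported_locations;
	uint32_t n_channels;
};

static uint32_t get_channel_allocation(const struct bap_config *conf,
		uint32_t (*allocate)(uint32_t locations, uint32_t n_channels))
{
	if (conf->channel_allocation != 0)
		return conf->channel_allocation;	/* BlueZ decided already */
	return allocate(conf->supported_locations, conf->n_channels);
}
```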
If Supported_Max_Codec_Frames_Per_SDU is less than what is required by
Supported_Audio_Channel_Counts, override its value assuming the device
actually supports at least that. Needed for Creative Zen Hybrid Pro.
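Roughly, the fixup could look like this sketch, assuming a device that carries N channels per stream must accept at least N codec frames per SDU (names are illustrative):
```
/* Sketch only: if the claimed Supported_Max_Codec_Frames_Per_SDU cannot hold
 * the highest channel count from Supported_Audio_Channel_Counts (bit N set
 * means N+1 channels), assume the device supports one frame per channel. */
static unsigned int fixup_max_frames_per_sdu(unsigned int max_frames_per_sdu,
                                             unsigned int channel_counts)
{
	unsigned int max_channels = 0;

	for (unsigned int bit = 0; bit < 8; bit++)
		if (channel_counts & (1u << bit))
			max_channels = bit + 1;

	if (max_frames_per_sdu < max_channels)
		max_frames_per_sdu = max_channels;

	return max_frames_per_sdu;
}
```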
Fix default value for channel count bitmask.
Do relaxed parsing of RFCOMM commands for AG & HF roles, allowing
multiple commands in the same buffer.
Use the same parser code for all HFP/HSP AG/HF roles. Parse input in a
relaxed way, as some devices emit spurious \n characters.
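As an illustration of what relaxed parsing means here, a self-contained sketch (not the actual backend-native parser) that tolerates multiple commands per buffer and stray CR/LF:
```
#include <string.h>

/* Sketch only: split a received RFCOMM buffer into individual commands,
 * tolerating several commands per read and stray '\r' / '\n' characters.
 * strtok_r() skips the empty fragments between consecutive separators. */
static void parse_rfcomm_buffer(char *buf, void (*handle)(const char *cmd))
{
	char *saveptr = NULL;

	for (char *tok = strtok_r(buf, "\r\n", &saveptr); tok != NULL;
	     tok = strtok_r(NULL, "\r\n", &saveptr))
		handle(tok);	/* one complete AT command / response */
}
```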
The Primark True Wireless earbuds don't support sbc-xq. Having it
enabled causes BlueZ to enter a loop, enabling/disabling the device
dozens of times per minute, making it unusable.
Don't have a separate input route for A2DP and HFP, as it is generally
not necessary.
When in A2DP mode while HFP is also possible, emit the input route in
SPA_PARAM_Route, even though there is no corresponding input node
emitted.
The host may then emit a loopback microphone node and switch profiles
according to its status. Having the input route available at all times
makes it possible to retain volume changes made while there is no real
input node.
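For illustration, emitting such an input route with the SPA pod builder might look like this sketch; the index, device, name, and availability values are assumptions, not the actual bluez5-device code:
```
#include <spa/pod/builder.h>
#include <spa/param/param.h>
#include <spa/utils/defs.h>
#include <spa/utils/type.h>

/* Sketch only: build an input route even when no input node exists yet,
 * so volume changes made in this state can be retained. */
static struct spa_pod *build_input_route(struct spa_pod_builder *b,
                                         uint32_t index, uint32_t device_id)
{
	return spa_pod_builder_add_object(b,
		SPA_TYPE_OBJECT_ParamRoute, SPA_PARAM_Route,
		SPA_PARAM_ROUTE_index,     SPA_POD_Int(index),
		SPA_PARAM_ROUTE_direction, SPA_POD_Id(SPA_DIRECTION_INPUT),
		SPA_PARAM_ROUTE_device,    SPA_POD_Int(device_id),
		SPA_PARAM_ROUTE_name,      SPA_POD_String("microphone"),
		SPA_PARAM_ROUTE_available, SPA_POD_Id(SPA_PARAM_AVAILABILITY_yes));
}
```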
We delay the audio a bit to keep packet intervals equal, which keeps
some data in buffers.
The calculation is meant to keep one buffer free, but it doesn't
explicitly reserve "extra" buffer space, so it might flush too late and
the next process() might not have a free buffer. In practice this
shouldn't really occur, as we encode the next packet right away.
Try to keep one extra spare buffer free so that the flush time is
certainly early enough.
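As a rough sketch of the tightened bound (the accounting and names are illustrative, not the media-sink's real bookkeeping):
```
#include <stdint.h>
#include <stdbool.h>

/* Sketch only: decide whether to flush now. Reserve the packet currently
 * being assembled plus one spare buffer, so the flush happens early enough
 * that the next process() always finds a free buffer. */
static bool should_flush(uint64_t queued_bytes, uint64_t total_buffer_space,
                         uint64_t packet_size /* ~ MTU */)
{
	uint64_t reserve = 2 * packet_size;	/* current packet + one spare */
	uint64_t max_queued = total_buffer_space > reserve ?
	                      total_buffer_space - reserve : 0;

	return queued_bytes >= max_queued;
}
```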
Some devices appear to set Supported_Max_Codec_Frames_Per_SDU == 1 while
claiming they support two channels per stream, which is then not
possible.
In this case, limit the number of channels by the number of frames per
SDU when selecting.
Also adjust PAC sorting.
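In selection terms the clamp is simple; a sketch, assuming one codec frame per channel per SDU (names are illustrative):
```
/* Sketch only: a stream can't carry more channels than codec frames per SDU
 * (assuming one frame per channel), so clamp before selection. */
static unsigned int clamp_channels(unsigned int preferred_channels,
                                   unsigned int max_codec_frames_per_sdu)
{
	if (preferred_channels > max_codec_frames_per_sdu)
		return max_codec_frames_per_sdu;
	return preferred_channels;
}
```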
GCC 14 introduces a new -Walloc-size warning, included in -Wextra, which gives:
```
./pipewire-0.3.84/spa/plugins/bluez5/codec-loader.c:176:14: warning: allocation of insufficient size ‘1’ for type ‘struct impl’ with size ‘1032’ [-Walloc-size]
```
The calloc prototype is:
```
void *calloc(size_t nmemb, size_t size);
```
So, just swap the number of members and size arguments to match the prototype, as
we're initialising 1 struct of size `sizeof(struct impl)`. GCC then sees we're not
doing anything wrong.
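The change itself is just the argument order (variable name illustrative):
```
/* Before: the 'size' argument is 1, which -Walloc-size flags as insufficient */
impl = calloc(sizeof(struct impl), 1);

/* After: one member of size sizeof(struct impl), matching calloc(nmemb, size) */
impl = calloc(1, sizeof(struct impl));
```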
Signed-off-by: Sam James <sam@gentoo.org>
BlueZ now requires endpoints to set Locations/Context, so set them to
some sensible default values. These could in principle be made
configurable later.
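For illustration, plausible defaults built from the Bluetooth assigned-numbers bitmasks; the exact values and how they are passed to the endpoint properties are not shown here:
```
/* Audio Location bitmask: Front Left | Front Right */
#define BAP_DEFAULT_LOCATIONS	(0x00000001u | 0x00000002u)
/* Context Type bitmask: Unspecified | Media */
#define BAP_DEFAULT_CONTEXT	(0x0001u | 0x0004u)
```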
Use a bigger fallback maximum MTU for when the kernel fails to tell us
one, which shouldn't happen but apparently can. We choose the packet
size based on incoming data, so these values aren't usually needed and
we can just bump them.
Also report errors as necessary.
If the remote supports both HFP HF and AG, both may get connected, which
occurs with a PipeWire<->PipeWire connection. In this case, PipeWire on
both sides may pick the audio-gateway profile.
To avoid both sides becoming the audio gateway, if the remote is both an
A2DP sink and an HF, use a lower priority for the audio-gateway profile.
Generally, BlueZ won't connect both A2DP Source and Sink between the
same devices at the same time, so we use that to determine which side
should be the receiver.
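Schematically, the priority decision could look like this sketch, assuming the plugin's defs.h profile bitmask is available; the priority constants are illustrative:
```
#include <stdint.h>

/* Sketch only: if the remote advertises both A2DP Sink and HFP HF, it is
 * likely another PipeWire instance acting as a receiver, so avoid competing
 * for the audio-gateway role. */
static uint32_t ag_profile_priority(uint32_t remote_profiles)
{
	const uint32_t AG_PRIORITY_DEFAULT = 256;	/* illustrative */
	const uint32_t AG_PRIORITY_LOW     = 1;		/* illustrative */

	if ((remote_profiles & SPA_BT_PROFILE_A2DP_SINK) &&
	    (remote_profiles & SPA_BT_PROFILE_HFP_HF))
		return AG_PRIORITY_LOW;			/* let the peer be the AG */

	return AG_PRIORITY_DEFAULT;
}
```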
It may occur that we have RFCOMM connected as both HF and AG. In this
case, the codec switching and support checks should always use the
RFCOMM connection on which the remote is the HF.
Fix this by finding the RFCOMM with the correct profile, i.e. the one
where the remote acts as HF.
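A hypothetical lookup; the struct, its fields, and the remote-role flag are illustrative assumptions:
```
#include <stdbool.h>
#include <spa/utils/list.h>

/* Sketch only: stand-in for the backend's RFCOMM connection record. */
struct rfcomm_conn {
	struct spa_list link;
	bool remote_is_hf;	/* true when the peer acts as HF */
};

static struct rfcomm_conn *find_remote_hf_rfcomm(struct spa_list *rfcomm_list)
{
	struct rfcomm_conn *r;

	/* Codec switching and support checks must use the connection on
	 * which the remote is the HF, so pick that one. */
	spa_list_for_each(r, rfcomm_list, link)
		if (r->remote_is_hf)
			return r;
	return NULL;
}
```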
Make the supported-codec checks use profiles, not the "is-a-sink" flag,
to determine which codecs can be used.
Fixes bluez5-device checking only source profiles, even when the local
device is only a sink.
Once PipeWire is started, it will try to register a BAP broadcast source media endpoint on UUID 00001852-0000-1000-8000-00805f9b34fb if a media codec supporting BAP is available and the adapter indicates that LE Audio is supported.
When the endpoint is detected (over DBus) by PipeWire and it has a broadcast sink UUID, a new device is created with the address 00:00:00:00:00:00. This device acts as our simulated remote device. This is done because a device emitting a broadcast source does not need any connection to start transmitting audio. This device is set as connected.
When the SetConfiguration DBus method is called and the spa_bt_transport structure with the BAP broadcast source profile is created, we switch the device from the one read from DBus to the one created by us. This is done because when BlueZ creates the transport, it sets the adapter as the transport's Device property, i.e. as the device the transport is connected to. Our created device will have the newly added SPA_BT_PROFILE_BAP_BROADCAST_SINK profile connected.
Add code that allows creating a node in the graph for a device connected with the SPA_BT_PROFILE_BAP_BROADCAST_SINK profile.
This avoids potential confusion when both codecless and codec profiles
are enumerated for A2DP.
Give the base name to the highest-priority profile, so that the best
codec can be selected on the command line without knowing which codecs
are actually supported.
Some A2DP devices don't like reusing the same transport for different
media-sink instances, possibly because the encoder is reset in between
and there can be a gap in the transmitted audio.
This doesn't matter for SCO/ISO.