Hmm, this might actually be an unavoidable behaviour of the GL driver on these devices (inherited from Android).
If the hardware graphics driver is happy to hand out 16-bit pixels then I'm not sure what we can do to stop it. We'd just have to avoid asking SDL for fewer than 8 bits per channel, or get around to implementing enhancement bug 1469673.
Mir tries to make it very clear that only 8 bits per channel is supported, and forces everyone (including SDL) to choose a 24- or 32-bit pixel format. So it sounds like even that doesn't guarantee the Android GL driver will actually honour the agreed pixel format.
On that note, I've suspected for a while that Mir forcing clients to choose a pixel format independently of choosing a GL config is redundant and can cause exactly this kind of confusion. Removing one of those two steps would help.