glx: Don't use sRGB configs on llvmpipe
Needs RevisionPublic

Authored by fredrik on Jul 2 2019, 12:19 AM.

Details

Reviewers
fvogt
Group Reviewers
KWin
Summary

This is necessary to keep openQA working. See bug #408594.

CCBUG: 408594

Diff Detail

Repository
R108 KWin
Lint
Lint Skipped
Unit
Unit Tests Skipped
fredrik created this revision. Jul 2 2019, 12:19 AM
Restricted Application added a project: KWin. Jul 2 2019, 12:19 AM
Restricted Application added a subscriber: kwin.
fredrik requested review of this revision. Jul 2 2019, 12:19 AM
fvogt added a subscriber: fvogt. (Edited) Jul 2 2019, 8:32 AM

llvmpipe generally works fine; it only breaks when used with cirrus and the kernel framebuffer is 16bpp.

romangg added a subscriber: romangg.Jul 9 2019, 3:11 PM

Please be more verbose in your summary.

fvogt requested changes to this revision. Tue, Aug 13, 7:22 AM
fvogt added a reviewer: KWin.

From the bug report:

(In reply to Fabian Vogt from comment #31)
> (In reply to Fredrik Höglund from comment #30)
> > (In reply to Fabian Vogt from comment #27)
> > > > So I'm going to solve this by blacklisting sRGB configs on LLVMPipe instead.
> > >
> > > That sounds like a bit too much, everything except cirrus with 16bpp seems to
> > > work.
> >
> > Unfortunately we can't easily detect that the video device is a Cirrus
> > device. The OpenGL driver can only tell us that it is llvmpipe; it doesn't
> > know where the results of the rendering are going to be presented.
>
> Luckily that shouldn't be necessary, as we only know that llvmpipe + 16bpp
> is broken. Is detecting a 16bpp default framebuffer possible?

That's something that we can do.
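A minimal sketch of the check discussed above: combine the llvmpipe renderer string with the framebuffer depth, and only skip sRGB-capable configs when both conditions hold. The function name `isSrgbConfigUsable` and its parameters are illustrative, not KWin's actual API; the renderer string would come from `glGetString(GL_RENDERER)` and the depth from the fbconfig's `GLX_BUFFER_SIZE` attribute.

```cpp
#include <string>

// Hypothetical helper: decide whether an sRGB-capable GLX fbconfig is safe
// to use. Per the discussion in bug #408594, only the combination of
// llvmpipe and a 16bpp framebuffer is known to be broken, so everything
// else is allowed through.
static bool isSrgbConfigUsable(const std::string &renderer, int bufferBits)
{
    // llvmpipe identifies itself in the GL_RENDERER string,
    // e.g. "llvmpipe (LLVM 8.0, 256 bits)".
    const bool isLlvmpipe = renderer.find("llvmpipe") != std::string::npos;
    return !(isLlvmpipe && bufferBits <= 16);
}
```

This avoids blacklisting sRGB on llvmpipe wholesale, which, as noted above, would be too broad since llvmpipe with a 24/32bpp framebuffer works fine.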

I'll update https://phabricator.kde.org/D22203

This revision now requires changes to proceed. Tue, Aug 13, 7:22 AM