Summary: Grow Home colors are wrong with HD6950
Product: Mesa
Reporter: serge <serge.yquel>
Component: Drivers/Gallium/r600
Assignee: Default DRI bug account <dri-devel>
Status: RESOLVED MOVED
QA Contact: Default DRI bug account <dri-devel>
Severity: normal
Priority: medium
CC: apinheiro
Version: git
Hardware: x86-64 (AMD64)
OS: Linux (All)
Description
serge
2016-02-05 13:23:21 UTC
I also made an apitrace which you can find at the following link, if someone is interested in looking at this issue:

https://drive.google.com/file/d/0B6bdV-rKtJnFWDlvWE1vZnFtaXc/view?usp=sharing

Comment 1
Alejandro Piñeiro (freenode IRC: apinheiro)

(In reply to serge from comment #0)
> I also made an apitrace which you can find at the following link, if someone
> is interested in looking at this issue:
>
> https://drive.google.com/file/d/0B6bdV-rKtJnFWDlvWE1vZnFtaXc/view?usp=sharing

At the risk of adding noise here, I tried this apitrace on an Intel HD 4600 (Haswell). I got several visual artifacts (in addition to the replay being really slow), along with several performance-related debug warnings. For example, during the initial phase, where it is mostly compiling shaders:

10202: message: shader compiler performance issue 478: SIMD16 shader failed to compile: FS compile failed: Failure to register allocate. Reduce number of live scalar values to avoid this

And then, when running the logo and the menu, messages like the following:

327430: message: api performance issue 1506: Scanning index buffer to compute index buffer bounds. Use glDrawRangeElements() to avoid this

3836043: message: api performance issue 1553: GTT mapping a busy miptree BO stalled and took 1516.024 ms

Anyway, this should probably be taken with a grain of salt, since games usually detect hardware capabilities in order to decide what to use. It would probably be worth trying the game directly on Intel hardware (unfortunately I don't have access to the game).

Comment 2
serge

(In reply to Alejandro Piñeiro (freenode IRC: apinheiro) from comment #1)
> At the risk of adding noise here, I tried this apitrace on an Intel HD 4600
> (Haswell). I got several visual artifacts (in addition to the replay being
> really slow), along with several performance-related debug warnings.

The replay does the same with my card, but those artifacts were not there at recording time. I think they are due to the fake calls that apitrace injects. There is an explanation for this in the tracing section of the following link:

https://github.com/apitrace/apitrace/blob/master/docs/BUGS.markdown

Comment 3
GitLab Migration Automatic Message

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug via this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/574.
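A note on the "Use glDrawRangeElements()" hint from comment 1: glDrawElements() gives the driver no bounds on the vertex indices used by a draw, so the driver scans the index buffer to compute them itself, which is exactly what the debug message reports. glDrawRangeElements() lets the application pass the known bounds directly. A minimal sketch of the difference, assuming a current OpenGL context with vertex state and an element array buffer already bound (the identifiers below are hypothetical, not taken from the game):

    /* Sketch: supplying index bounds so the driver does not have to
     * scan the index buffer itself. */
    #include <GL/gl.h>

    void draw_mesh(GLsizei index_count, GLuint min_index, GLuint max_index)
    {
        /* Without bounds the driver may scan all indices first:
         *   glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_SHORT, 0);
         * Passing the known [min, max] vertex range avoids that scan: */
        glDrawRangeElements(GL_TRIANGLES, min_index, max_index,
                            index_count, GL_UNSIGNED_SHORT, (const void *) 0);
    }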
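The "GTT mapping a busy miptree BO stalled" message means the driver had to wait about 1.5 seconds for the GPU to finish with a buffer object before it could map it. That particular stall is driver-internal, but the same class of stall happens when an application maps a buffer the GPU is still using; a common application-side mitigation, sketched below under that assumption (hypothetical names, not a fix for this bug), is to orphan the buffer storage before mapping:

    /* Sketch: streaming data into a buffer without stalling on
     * in-flight GPU work. Assumes a current OpenGL >= 3.0 context. */
    #define GL_GLEXT_PROTOTYPES 1
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <string.h>

    void upload_stream_data(GLuint vbo, const void *data, GLsizeiptr size)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);

        /* Orphan the old storage: the driver can keep the previous pages
         * alive for in-flight draws and hand back fresh memory... */
        glBufferData(GL_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);

        /* ...then map it; INVALIDATE_BUFFER declares the old contents
         * dead, so the map need not synchronize with the GPU. */
        void *ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size,
                                     GL_MAP_WRITE_BIT |
                                     GL_MAP_INVALIDATE_BUFFER_BIT);
        if (ptr != NULL) {
            memcpy(ptr, data, size);
            glUnmapBuffer(GL_ARRAY_BUFFER);
        }
    }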
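On the "games usually detect hardware capabilities" point from comment 1: applications typically probe the renderer, version, and implementation limits at startup and pick codepaths accordingly, whereas an apitrace replay reissues exactly the calls chosen for the recording GPU; this is one reason to take a replay on different hardware with a grain of salt, as Alejandro suggests. A small illustrative sketch of such startup probing (assumed and generic, not taken from the game):

    /* Sketch: the kind of capability probing a game might do at startup.
     * Assumes a current OpenGL context; purely illustrative. */
    #include <stdio.h>
    #include <GL/gl.h>

    void log_gl_capabilities(void)
    {
        const GLubyte *vendor   = glGetString(GL_VENDOR);
        const GLubyte *renderer = glGetString(GL_RENDERER);
        const GLubyte *version  = glGetString(GL_VERSION);
        GLint max_texture_size  = 0;

        glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_texture_size);

        /* A game might branch on these strings and limits, e.g. treating
         * an r600-class GPU differently from an Intel one. */
        printf("GL_VENDOR:   %s\n", (const char *) vendor);
        printf("GL_RENDERER: %s\n", (const char *) renderer);
        printf("GL_VERSION:  %s\n", (const char *) version);
        printf("GL_MAX_TEXTURE_SIZE: %d\n", max_texture_size);
    }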