drm/radeon: avoid UVD corruption on AGP cards using GPU gart
author		Alex Deucher <alexander.deucher@amd.com>
		Mon, 16 Sep 2013 03:23:07 +0000 (23:23 -0400)
committer	Alex Deucher <alexander.deucher@amd.com>
		Fri, 20 Sep 2013 18:28:14 +0000 (14:28 -0400)
If the user has forced the driver to use the internal GPU gart
rather than AGP on an AGP card, force the buffers into VRAM
as well.

Signed-off-by: Alex Deucher <alexander.deucher@amd.com>
Reviewed-by: Christian König <christian.koenig@amd.com>
Tested-by: Dieter Nützel <Dieter@nuetzel-hh.de>
Cc: stable@vger.kernel.org
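
For context, a hedged note on how this situation arises (assumption: the
standard radeon.agpmode module parameter semantics, where -1 disables AGP
and falls back to the internal GPU gart), e.g. on the kernel command line:

    radeon.agpmode=-1

Booting this way clears RADEON_IS_AGP in the driver flags, which is why the
flag-based test removed below stopped firing on cards that are still
physically AGP.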
drivers/gpu/drm/radeon/radeon_cs.c

index ac6ece61a47627931e9a3bcb68c3a34c636efe0a..80285e35bc6513fda3d5f5c975399a60feaab1e3 100644
@@ -85,7 +85,7 @@ static int radeon_cs_parser_relocs(struct radeon_cs_parser *p)
                   VRAM, also put everything into VRAM on AGP cards to avoid
                   image corruption */
                if (p->ring == R600_RING_TYPE_UVD_INDEX &&
-                   (i == 0 || p->rdev->flags & RADEON_IS_AGP)) {
+                   (i == 0 || drm_pci_device_is_agp(p->rdev->ddev))) {
                        /* TODO: is this still needed for NI+ ? */
                        p->relocs[i].lobj.domain =
                                RADEON_GEM_DOMAIN_VRAM;
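
To make the decision explicit, here is a minimal, self-contained C sketch,
not from the patch itself: uvd_reloc_domain, ring_is_uvd and
card_is_physically_agp are hypothetical stand-ins for the in-kernel checks
(p->ring == R600_RING_TYPE_UVD_INDEX and drm_pci_device_is_agp()), modeling
the relocation domain choice after this change:

    #include <stdbool.h>
    #include <stdio.h>

    enum domain { DOMAIN_GTT, DOMAIN_VRAM };

    /* Model of the UVD reloc placement rule: the first reloc of a UVD
     * job is the message buffer and must live in VRAM; on physically
     * AGP cards every buffer goes to VRAM to avoid image corruption,
     * even when the driver has fallen back to the internal GPU gart. */
    static enum domain uvd_reloc_domain(int i, bool ring_is_uvd,
                                        bool card_is_physically_agp)
    {
            if (ring_is_uvd && (i == 0 || card_is_physically_agp))
                    return DOMAIN_VRAM;
            return DOMAIN_GTT;
    }

    int main(void)
    {
            /* AGP card forced onto the internal gart: the old
             * RADEON_IS_AGP flag check would have left this in GTT. */
            printf("reloc 1, UVD ring, AGP card -> %s\n",
                   uvd_reloc_domain(1, true, true) == DOMAIN_VRAM ?
                   "VRAM" : "GTT");
            return 0;
    }

The design point is that RADEON_IS_AGP reflects what the driver is currently
using, while drm_pci_device_is_agp() reflects what the hardware is; for a
hardware corruption workaround, the latter is the right question to ask.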