Nightwulf|work: hi all
mikkoc: plenty of "[drm:radeon_cs_ioctl] *ERROR* Failed to parse relocation !" with vanilla 2.6.34-rc3
mikkoc: mesa and radeon from git
airlied: mikkoc: does moving the mouse make it happen more often?
mikkoc: airlied: yes i think so, and probably scrolling too
mikkoc: it was there with -rc2 too, but not 2.6.33
airlied: mikkoc: fix is in drm-linus, just removes the printf ;-)
airlied: or at least doesn't print it when nothing is wrong
mikkoc: ahah nice fix :P
mikkoc: will wait for rc4 then
Nightwulf|work: just tried out the master version of radeon with 220.127.116.11 kernel (KMS enabled), libdri 2.4.18 and mesa 7.7.1...what can i say...all works like a charm now...now redraw issues with kwin any longer...good job! thanks!
Nightwulf|work: s/now redraw/no redraw ;-)
adamk: Wow... r600 is painfully slow in heretec2 when there's fire in the scene.
adamk: heretic2, even.
MrCooper: could be a software fallback
lordheavy: happily, the exa slowdown now seems fixed for r600 :-)
adamk: Yeah, that's what I'm assuming.
adamk: Gonna try with KMS enabled.
twnqx: lordheavy: fixed where?
twnqx: as in, in which kernel?
adamk: Shoot... heretic2 won't even load games in linux.
lordheavy: twnqx: xf86-video-ati from git http://cgit.freedesktop.org/xorg/driver/xf86-video-ati/commit/?id=bc93395b3eb5e3511c1b62af90693269f4fa6e13
twnqx: oh, driver
twnqx: i thought the slowdown was caused by kernel module.
adamk: I can't reproduce the heretic2 slowdown in linux, with or without KMS.
adamk: But I can't load the saved game where I saw it, so I'm not sure if it's specific to that scene or not.
suokko: Is there any kernel parameter to disable loading of radeon when booting fc12 livecd?
NightNord: suokko: it's not a kernel parameter, but rc system or initramfs system
NightNord: But you may disable KMS by radeon.modeset=0
suokko: In fact I think I know the problem: the drm modules are detecting the card as unposted and are trying to post it. I haven't seen that happen with any other kernel that I have booted on that machine
suokko: But anyway. I got to root console with fc12 livecd and fixed the broken grub from there ;)
NightNord: Hm... Strange thing, I've compiled 32bit live mesa, and wine seems to be much happier with it, but now I have just black screen =)
NightNord: Seems to be low opengl support issue...
suokko: Black screen when?
NightNord: Everytime =)
suokko: Some wine application is just rendering black?
NightNord: In video/menu/everything
NightNord: Ah, no, in cnc4
NightNord: Will try something else now
suokko: NightNord: Try to check that something like 3DMark2001 renders something. It should use old enough api to be sure
Thunderbird: something like cnc4 is way too new for this driver, any real modern dx9c game is
Thunderbird: games take heavy advantage of shaders, float rendering, FBOs and so on
NightNord: In videos too? =)
NightNord: Maybe FBO is a problem, but shaders...
Thunderbird: we need really good shader support for shader model 3.0 games
Thunderbird: better than most drivers offer at this point
NightNord: I've been unable to get further than the menu with ati-drivers anyway
NightNord: Thunderbird: and what is current state? RadeonFeature wiki page lacks much detail =)
Thunderbird: glsl in fglrx is not that good but it isn't terrible
Thunderbird: I don't know much details about these drivers, I'm a wine developer
NightNord: Thunderbird: so, the truth is that no current driver supports all features that hardware may support? Even proprietary ones?
suokko: NightNord: yes. That is sad truth
Thunderbird: the only reliable driver for our purpose at this point is nvidia
suokko: And it requires quite a lot of lines of driver code to support all the features
Thunderbird: also we have to perform a lot of shader fixups
suokko: Thunderbird: Too bad the nvidia driver doesn't always do what the spec says so it might cause trouble for other drivers
Thunderbird: we have a test suite
Thunderbird: regarding the shader fixups this 'eats' vertex/fragment shader uniforms
Thunderbird: some games use the maximum number of those (well, the HLSL shader compiler apparently 'schedules' it that way)
Thunderbird: and this causes big issues and some shaders just fail
adamk_: Hmmm... Explosions in quake2forge have the same slowdown issues as in heretic2.
adamk_: I wonder if r300 does better.
Thunderbird: on nvidia geforce8 and up this doesn't occur because GLSL offers more than the d3d9 limits
Thunderbird: AMD hardware offers exactly the d3d9 limit and if you want to use more you have to use uniform buffers
Thunderbird: we don't use those at this point in wine and this is the cause of some shader fails on fglrx (and I guess on the open drivers as well if they are good enough to run these games)
suokko: I wonder if shaders could be recompiled to use less uniforms
Thunderbird: likely they can
suokko: (Inside mesa that is)
Thunderbird: we could do it in wine as well but we don't have our own shader compiler
Thunderbird: we are at the mercy of the HLSL compiler which is in d3dx9_*.dll which you have to install to run most modern games
suokko: I would think it would take less code to do that in actual drivers
Thunderbird: but we also reload uniforms frequently (I don't know this code well), so it might become a mess
NightNord: Linux has no own shader compiler?
Thunderbird: a shader compiler is in a driver
Thunderbird: in case of d3d there is just a common shader compiler
suokko: NightNord: Mesa has shader compiler
NightNord: In proprietary driver?
suokko: They have their own
Thunderbird: too bad there is no common opengl shader compiler ..
Thunderbird: it would have saved a lot of headaches for most developers (it would have to output a low level format which opengl drivers can compile to their own hardware)
NightNord: So, wine lacks good shader support, mesa lacks good shader support and, just to end this, the drivers themselves also lack good shader support =)
BioTube: well, Gallium's got a compiler
Thunderbird: d3dx9_* is external; it feeds us shader assembler which we convert to GLSL (or legacy shaders)
NightNord: I was just about to ask about gallium =)
Thunderbird: if someone would create a HLSL frontend to gallium that would help us a lot
Thunderbird: some people wrote those (according to mail archive searches) but didn't publish them
suokko: Thunderbird: IIRWIR ;)
suokko: It is ready when it is ready
NightNord: radeon support gallium?
NightNord: At full, I mean
NightNord: RadeonFeature page shows 'mostly' for r600/r700
adamk: Alright, so the r300 hits the same software fallbacks in quake2forge in FreeBSD
BioTube: r600g crashed my computer last I tried it
adamk: Hmm... I wonder if bugle can tell me where it's hitting fallbacks.
BioTube: and it hasn't been developed too much yet
Wizzup: NightNord: afaik it is not ready
NightNord: BioTube: I'm using mesa with gallium enabled now, but i'm not sure, if it's working
BioTube: NightNord: r600g isn't in the mesa tree yet
BioTube: it's in glisse's repo
suokko: adamk: Can you make quake2forge run in explosion loop and profile it with sysprof?
suokko: Or what ever is best system wide profiler in freebsd
NightNord: But... If gallium supports new features and radeon will (one day it will) support gallium - all this advanced features will be working well? Including Ogl 3.* and so on?
suokko: NightNord: None can see the future but that is the plan
Thunderbird: note gallium is mostly a way to save development time; it may not be the best abstraction for each hardware
NightNord: Nice. So, that's the plan - if someone implements HLSL for gallium and wine adds support for it, and radeon gets gallium support as well, we'll all get at least proprietary-level features in wine?
Thunderbird: if someone would write a HLSL llvm frontend we would be happy to write a backend and plug that into our d3dx9_* / d3dx10_*
Thunderbird: it would allow us to output nicer shaders to drivers and it avoids the use of native microsoft dlls
suokko: Thunderbird: Too bad llvm is still missing features required for shader compiler
Thunderbird: what functionality is missing?
Thunderbird: a few years ago it lacked functionality but I thought it might have caught up since it is being used for gallium
Thunderbird: and apple, nvidia and others are using it for their opencl compilers as well
Thunderbird: perhaps HLSL is something for zhasha ;)
glisse: llvm isn't suited for shader imho
glisse: open64 is maybe better choice
glisse: not for shader directly
glisse: but maybe somethings like nvidia is doing
glisse: shader -> open64 -> specific assembler -> hw
suokko: Thunderbird: It is close to be good enough to compile to some intermediate language but there was recent talk that conversion from LLVM IR to TGSI was hard to make lossless
Thunderbird: yeah Cg uses open64
Thunderbird: not sure how much open64 is used; most projects seem to swear by llvm these days
glisse: well i have been talking with compiler folks and some believe open64 is better than llvm
suokko: I guess it is more important which will get more development effort in the future
suokko: Current state is not so important if work required to plug them is close to similar
Thunderbird: for us the maintenance / future work part is important
NightNord: Too bad I have no knowledge of 3d-stuff and compilers =)
NightNord: I'm a kernel-style C programmer; it seems like a nice thing to work on, at least it will make my card more usable
NightNord: glisse: gallium already has some compiler?
glisse: NightNord: not sure we can call it a compiler
glisse: but there is frontend to translate from glsl or arb to tgsi
MrCooper: 'Open64 4.2.1 supports i386, x86_64 and IA-64' not very portable
NightNord: glisse: hlsl is much more complex than glsl, so it will need some external compiler?
glisse: no hlsl is as complex as glsl
glisse: MrCooper: i wasn't thinking of making a hw backend
Thunderbird: there are nvidia bits in open64 (and there is some open64 snapshot on their ftp) it mentions 'nvidia ISA' not sure what it is
glisse: but it seems to fit the need of nvidia
Thunderbird: err, open64 is used for cuda
Thunderbird: this is an older part of their Cg source http://developer.nvidia.com/attach/6565
Thunderbird: the glsl compiler also uses this framework I think but it isn't open
MrCooper: no surprise there if nVidia is involved :}
NightNord: So, what's so bad about glsl translation idea? Is it working too slow or whatever?
Thunderbird: what idea?
NightNord: < glisse> but there is frontend to translate from glsl or arb to tgsi
NightNord: Why couldn't the same be done with hlsl?
Thunderbird: it can be done
NightNord: Or it would be just easier to make front-end?
Thunderbird: I don't know all hlsl details but it has some things which are not in glsl which might require additional functionality
NightNord: glisse: you are proposing to make compiler to some object code and implement object_code -> hw translator into gallium?
edwin: I think there is a BSD licensed HLSL2GLSL compiler
Thunderbird: they support so called 'effect files' which bundle shaders and you need some 'readback' about what vertex/pixel shader numbers are used
glisse: NightNord: i am not proposing anything, i am just pointing out other solution
glisse: i am myself writing some compiler infrastructure
glisse: because i want to test a few compiler algos of my own
edwin: ^isn't that good for converting hlsl to glsl?
Thunderbird: direct hlsl to glsl, is not that useful for us because we have to perform a lot of fixups in our shaders
edwin: what is missing from LLVM wrt TGSI though?
Thunderbird: for a part simple fixups like coordinate system things (d3d and opengl use a different coordinate system) and more complex things
Thunderbird: if we would directly convert hlsl to glsl we need a lot of shader parsing voodoo
Thunderbird: which is much easier if you receive bytecode ASM
edwin: so you want HLSL -> llvm -> tgsi, and GLSL -> llvm -> tgsi?
NightNord: AnyOtherShaderLanguageFromFuture -> llvm -> tgsi
Thunderbird: in Wine we want HLSL -> llvm -> d3d_asm and then d3d_asm -> GLSL (->llvm -> tgsi)
Thunderbird: the first step is in d3dx9/d3dx10 and the other part is in d3d9/d3d10; the x-libraries are helper libraries
edwin: why the d3d_asm intermediary?
Thunderbird: d3d9 only supports d3d_asm
edwin: and can't it be d3d_asm -> llvm -> GLSL
Thunderbird: the supplier can be d3dx9 but the game might also ship with precompiled shaders
edwin: you'd share the same optimizers with HLSL then
NightNord: I got it
Thunderbird: no, we have to perform the d3d_asm -> glsl step in wine; there are a lot of fixups we have to apply which can for instance depend on the texture formats a game uses
NightNord: So we have 3 different formats, sources: GLSL, HLSL and binary d3d_asm
edwin: I am not sure about what kind of optmimizations we are talking about here, I'm not familiar with shaders
NightNord: And we need some way to translate them all to some uniform format, that will be supported by gallium?
edwin: are they arithmetic optimizations (CSE, operation simplifications, etc.)?
NightNord: Thunderbird: aren't 'fixups' just conversion procedures?
edwin: can't you apply the fixups on the intermediary bytecode (whether its LLVM or something else)
edwin: instead of the asm?
Thunderbird: some of the fixups are of that type but others depend on lets say the texture data being used
NightNord: You said that ogl and d3d use different coordinate systems, so there's some permanent difference between the formats
Thunderbird: or we need coordinate fixups depending on whether ARB_texture_rectangle is used or not
NightNord: Ok. So, you need something far from gallium and mesa infrastructure?
NightNord: A GLSL implementation seems to be on its way to gallium, so you are lacking just the hlsl -> ... -> d3d_asm converter
NightNord: *I.e. just a reimplementation of ms's compiler
Thunderbird: the only thing we need is d3d_asm and we then provide an opengl driver with GLSL (though arb_vertex_program/_fragment_program is easier though as it is easier to map)
edwin: glisse: how is your r600 gallium driver working? can it be tested already? :)
Thunderbird: yes, the hlsl -> (some compiler) -> d3d_asm is lacking
Thunderbird: if the glsl -> llvm -> tgsi path is good we might be able to modify the frontend
glisse: edwin: won't be usefull unless you want to hack on it
glisse: i should be able to work on it this weekend
glisse: likely adding glxgears + texture support
edwin: glisse: does it use LLVM?
glisse: maybe more
glisse: no, it uses my own compiler stuff
NightNord: Thunderbird: why you need glsl -> llvm -> tgsi, if there is such conversion into gallium?
Thunderbird: I mean we would modify glsl -> llvm to hlsl -> llvm
adamk: Alright, so there's another issue with the r600 driver and quake2forge that doesn't exist with the r300 driver. Seems unrelated to the software fallbacks I'm seeing only in FreeBSD, as this visual problem happens on both FreeBSD and Linux.
Thunderbird: we would then need to write our own llvm -> d3d_asm
adamk: http://18.104.22.168/q2f-2-r300.jpg http://22.214.171.124/q2f-2-r600.jpg
adamk: Any thoughts on this one?
NightNord: Thunderbird: that is, "if there were a glsl -> llvm frontend, it wouldn't be so hard to make an hlsl -> llvm frontend on its base, and then we would only need to write llvm -> d3d_asm, which is simpler"?
Thunderbird: I haven't touched llvm myself but some people said that writing a frontend is relatively hard and that a backend is easier
Thunderbird: initially performance isn't our main goal though (it would be good if it started out good)
Thunderbird: legal reasons are a higher priority
glisse: adamk: i guess it's an issue with point sprites or something like that
glisse: try to look if mesa has a demo
adamk: I'll give them all a shot.
glisse: adamk: there are a lot of demos in mesa :)
MrCooper: adamk: for fallbacks you can try RADEON_DEBUG=fall
glisse: MrCooper: sadly it seems i can't hack around stuff in ttm agp backend ...
glisse: would need unmap followed by map
glisse: thus leading to potential failure during the time lapse between unmap & map
adamk: MrCooper: What should that do? I'm not seeing anything extra on stdout.
MrCooper: glisse: oh well
glisse: adamk: i think it's a compile flag iirc
MrCooper: adamk: should print something on stderr for any software fallback
glisse: MrCooper: i will add a new function to agp
adamk: Oh :-)
adamk: Gotcha :-)
glisse: or maybe simply hack driver to point to a dummy page
adamk: So it's not a compile flag?
adamk: Damn lag.
taiu: afaik r600 doesn't fall back on anything - we drop or misrender or smth
MrCooper: runtime, but maybe it's not hooked up in r600
adamk_: Let's see how adamk_ is doing. Better.
adamk_: Hmmm... OK.
taiu: it seems like point size is misrendered, and again afaik we do most params there except attenuation or smth
MrCooper: taiu: grep seems to disagree
NightNord: Thunderbird: still, I'm not clear about it: you are applying some hacks to the binary format every time a shader is called, depending on the call environment?
Thunderbird: NightNord, yes :(
Thunderbird: mostly because of small GLSL / HLSL differences and different d3d behavior
taiu: MrCooper: havent seen it ever do sw ;)
Thunderbird: though some of the fixups can be fixed by some new GL extensions like ARB_texture_swizzle
NightNord: =( Makes sense - no wonder it's lagging so badly sometimes
NightNord: And there is absolutely no way to precompile/preconvert this? Or at least cache the results?
Thunderbird: in case of ARB_texture_swizzle (we aren't using it yet) the problem is that the value of an unused color component between d3d and opengl differs
Thunderbird: lets say one of them returns 1.0 and the other 0.0
Thunderbird: and some games relied on this :(
adamk_: glisse: pointblast :-)
adamk_: glisse: Clearly the same problem. Renders fine, albeit slowly, on i915.
adamk_: Guess I should open a bug report, huh?
glisse: adamk_: then open a bug, otherwise it will get lost in the irc blackhole
adamk_: Doing it now.
adamk_: I should probably just double check it on r300 first to be certain that it is the same issue that q2f is having.
glisse: MrCooper: hhhmmm so the agp scratch page thingy wasn't working for you ?
adamk: Oh wow...
adamk: Rebooting now and I see a lot of "wait idle failed status" messages on the console.
MrCooper: glisse: that's pretty much exactly the opposite of what I said :)
adamk: I wonder if that's related to the slowdown.
glisse: or you are not uninorth3
glisse: MrCooper: you just added .needs_scratch_page = true right ?
glisse: MrCooper: i was looking at u3_agp_driver not uninorth_agp_driver
glisse: and iirc you are on uninorth_agp_driver
MrCooper: and set the table entries to point to the scratch page instead of invalid, yes
MrCooper: U3 seems to support using a scratch page for disabled entries
NightNord: So, as far as I got it: 1) there is no common compiler for shaders, even for glsl, even in gallium 2) which compiler suite applies is unknown; open64 is used by nvidia, but llvm seems to be more alive 3) there is a need for at least two frontends (glsl, hlsl) and two backends (tgsi, d3d_asm) 4) even this will not solve wine lagging in some places, as it will still need to convert d3d_asm -> glsl and then compile the glsl again every time a shader is used
glisse: MrCooper: yeah i am looking at other driver and i think i will do a patch to drop need_scratch page and force scratch page for all driver
MrCooper: glisse: not sure that'll work without API or other driver changes... e.g. uninorth currently complains when trying to bind an already bound entry
glisse: MrCooper: yeah it will need a bunch of change :(
glisse: but i think it's safer
adamk: Alright, now I'm getting confused.
adamk: r300 is showing the same blockiness with pointblast.
adamk: Is there a way to force software rendering?
MrCooper: e.g. LIBGL_ALWAYS_SOFTWARE=1
adamk: Yeah, with that option, it shows pointblast with circles exploding from the center.
adamk: With the driver, they are blocks.
adamk: I'm guessing that software is correct in this case.
adamk: So something isn't implemented either way, whether it's related to q2f or not.
MrCooper: no, non-smooth points are squares
MrCooper: or at least can be
MrCooper: though I guess at least r300 ignores the smooth bit with low impact fallbacks disabled
adamk_: Yeah, if I change that option in driconf, it's slow with the driver but renders properly.
adamk_: Wow... If I set "disable low-impact fallbacks" to "No" quake2forge isn't even playable with r300.
adamk_: Alright, so what I've determined, then, is that the blockiness in quake2forge is *not* related to the blockiness in pointblast. Both r300 and r600 are blocky in pointblast, but only r600 is blocky in q2f.
adamk_: Time to keep trying demos then, I guess.
MrCooper: I think r300g could use the draw module for smooth points
hifi: oh, forge
hifi: why would anyone use so old quake 2 client anyway?
adamk_: Heh... Yeah, forge. Sorry.
adamk_: What's better?
hifi: I'd say even ioq2
hifi: http://icculus.org/quake2/ though, that too is old
hifi: I use a much more modern client from 2007 and it works just fine with R700
hifi: can you show me a screenshot of the blockiness you see?
adamk_: http://126.96.36.199/q2f-2-r300.jpg http://188.8.131.52/q2f-2-r600.jpg
adamk_: One is with r300 one is with r600.
hifi: I had a stable 200 fps with RV570 back when I had one
hifi: I think that might be an issue with the client
hifi: my grenades look normal :)
adamk_: Well http://icculus.org/projects/ioquake2/ is empty.
hifi: I'll take a shot with R700 KMS and a grenade with my client
adamk_: I see lots of download links for ioquake2 for handheld stuff.
adamk_: Either way, if r300 renders it fine and r600 doesn't, then r600 seems to be missing something.
lordheavy: got this when running ogre3d sample demos with r600:
lordheavy: Please report at bugzilla.freedesktop.org
lordheavy: Mesa 7.9-devel implementation error: bad format in do_row()
hifi: adamk_: those small particles look the same as your r300 shot
adamk_: hifi: OK. Still doesn't change the fact that r300 renders q2f properly but r600 doesn't :-)
Obscene_CNN: notes that optimizing the mesa stack is addictive.
edwin: Obscene_CNN: did you get your patches committed?
Obscene_CNN: not yet
Obscene_CNN: sometimes it's amazing: I can confirm that changes should make things faster by looking at the assembly code, but they benchmark worse. :/
edwin: well I can't complain about 2D performance
edwin: it's usually very good, and not noticeable, unless there is a bug and it falls back to software
edwin: with 3D it's another matter, but software fallbacks are the problem there too
edwin: Obscene_CNN: maybe you could make the software fallbacks work better?
edwin: Obscene_CNN: I think when it gets there, its just pushing a lot of data back and forth between the GPU and system
edwin: I don't think that CPUs are that bad at doing simple things like doing compositing operation with alpha mask
edwin: unless you rerender the entire image when only a part of it changes
MrCooper: usually it's better to avoid 3D software fallbacks in the first place than to optimize them
MrCooper: a Gallium driver won't need any software rasterization
edwin: right, people will at least notice something is wrong
edwin: but for 2D, come on, software fallbacks shouldn't be that bad
Obscene_CNN: wonders if the lack of performance improvements in Nexuiz vs Torcs is due to more software fall backs in Nexuiz ......
edwin: I can't set nexuiz to max settings
edwin: its way too slow
MrCooper: edwin: feel free to profile and optimize :)
Obscene_CNN: the hardware accel in 2D can be quite noticeable
edwin: MrCooper: I'd rather have bugs fixed first, then optimize later
Obscene_CNN: especially with CPU intensive programs
edwin: MrCooper: any idea how to track down why depth textures are broken (and the shadowtex demo in Mesa), it just shows random textures
MrCooper: not offhand, sorry
Obscene_CNN: step through the assembly, one instruction at a time. ;)
edwin: are there any debugging mechanisms in Mesa I could turn on to see where that texture memory is written/read?
edwin: I mean do you know of similar bugs in the past, and how they got tracked down and fixed?
agd5f: well, right now the r600 driver doesn't really handle fallbacks in most cases
TheBrayn: I need some advice on how to regulate my fan
laumonier: how can i reconfigure my video settings? which command please? my graphics are buggy
Wizzup: Do you mean set resolution?
fabiosl: anybody using ubuntu lucid + 3200?
Ingmar: What would cause "[ 29195.805] (II) RADEON(0): GPU accel disabled or not working, using shadowfb for KMS" in http://dev.exherbo.org/~ingmar/temp/Xorg.0.log line 462?
Ingmar: Using libdrm and ddx from git, mesa-7.8, xserver 1.8 RC2, and linux 184.108.40.206
NightNord: Ingmar: check dmesg
NightNord: You may see some ring_test failed
Ingmar: eh. I'm an idiot
Suprano: hmm 3d for hd5770 is not ready yet, right?
Ingmar: [ 63.916170] r600_cp: Failed to load firmware "radeon/R700_rlc.bin"
Ingmar: [ 63.916179] [drm:rv770_startup] *ERROR* Failed to load firmware!
Ingmar: that'd do it?
Suprano: and my second screen fails to display anything at all
Suprano: its just off, no signal
Suprano: i use the git version
NightNord: Ingmar: I was about other error, but this is fatal too =)
NightNord: You'll need radeon-ucode for R600/700 =)
Suprano: doing xrandr --output DVI-0 --auto also has no effect
Ingmar: NightNord: thanks
Suprano: so i guess I need to keep using catalyst for now.
chithead: rv770 is radeon 4730/4850/4870
Obscene_CNN: on r600/r700 I take it that the code in radeon_tile.c in the mesa dri isn't used much
agd5f: Obscene_CNN: not yet
agd5f: Obscene_CNN: the sw detiling stuff is implemented in radeon_span.c
Obscene_CNN: okay, there is a MIN2 macro that can be moved out of the inner for loop's test section
BRMatt: Has anyone got a radeon x800 card working with ubuntu 9.1 / x11 ?
BRMatt: I've tried different stuff but nothing seems to work
BRMatt: Also tried this on the off chance http://wiki.x.org/wiki/radeonBuildHowTo#xf86-video-ati.28ddx.29
adamk: BRMatt: It should 'just work' without any extra installation steps.
adamk: Exactly what does 'nothing seems to work' mean? Did X not even start?
DanaG: random thing: I keep getting gpu hangs (and resets, yay) at exactly the sort of situations where fglrx would lag...
BRMatt: Well graphics are working, but none of the extra effectsy stuff seems to work
DanaG: that is, the readback-from-system-RAM that Xorg rather pointlessly does upon window allocation.
BRMatt: and everything seems a little bit laggy
DanaG: er, readback from VRAM.
adamk: BRMatt: We'd need to see the /var/log/Xorg.0.log file that was generated to figure out what's going on.
BioTube: BRMatt: IIRC, you've only got OpenGL 1.4 support; also, the driver defaulted to slower XAA back then
adamk: The desktop effects should still work.
adamk: If they aren't, then there's something else going on.
BioTube: adamk: I know kwin wouldn't activate them for OpenGL until 2.0 support was in
BRMatt: It tries to look for drivers then all the windows disappear, reappear and it says desktop effects cannot be enabled
adamk: BioTube: That's not true at all.
BioTube: adamk: personal experience
adamk: BioTube: I've been using kwin desktop effects since KDE4 was first released, well before opengl 2.0 was available in any driver.
adamk: BRMatt: Again, we need to see the Xorg.0.log file.
BioTube: I wonder why I had to settle for XRender then..
BRMatt: adamk, I'm pastebining it
BRMatt: Oh and every time I logon it also "forgets" my settings for dual display resolutions and just mirrors the two screens
adamk: Well direct rendering is enabled, as is AIGLX.
adamk: compiz desktop effects should work just fine.
adamk: As for gnome (I'm presuming gnome?) forgetting your settings for dual displays, you may want to take that up with #gnome.
adamk: However, please pastebin the output of 'compiz &'
BRMatt: Not really much there so I'll paste it here
BRMatt: Oh wait
adamk: If it's more than 3 lines, please pastebin it.
adamk: Oh good :-)
adamk: Since when is radeon not a whitelisted driver?
BRMatt: Well I followed the instructions in this topic title
BRMatt: wrong link
adamk: Alright, let's check the output of "LIBGL_DEBUG=verbose glxinfo", too.
adamk: Right, but that has nothing to do with this error from compiz.
BRMatt: ok, good
adamk: And you shouldn't have needed to do that in the first place :-)
BRMatt: heh, well I thought better to try whatever's in the title before asking :P
adamk: radeon has been in the whitelist on Ubuntu for at least a few releases.
adamk: Was this a clean install of 9.10 or an upgrade?
adamk: There's something quite wrong here, then.
adamk: In any case, you have 3D acceleration and AIGLX working.
adamk: You can force compiz to try and start by running:
BRMatt: Being a total graphics noob I presume that's good?
adamk: SKIP_CHECKS=yes compiz &
adamk: (As your normal user).
adamk: See if that works.
BRMatt: well my gnome task bar is gone
BRMatt: but the windows have nice aeroy effects
adamk: Give it a minute.
BRMatt: by task bar I mean the thing at the top
DanaG: yeah, gnome-panel likes to disappear sometimes.
adamk: BRMatt: At the top of the window, or the top of the screen?
adamk: I've never seen that happen with compiz.
DanaG: I've seen it sometimes upon starting compiz.
DanaG: usually I kill gnome-panel, and it comes back.
spreeuw: I've never used compiz
adamk: compiz-manager was being stupid.
DanaG: Checking screen 1: Comparing resolution (2304x1024) to maximum 3D texture size (2048): Failed.
BRMatt: It works!
adamk: The real issue is that your screen resolution exceeds 2048, which is the maximum 3D texture resolution of your video card.
BRMatt: And now the panel at the top's decided to come back :D
BRMatt: My resolution is 1280*1024
BRMatt: At least, it is on my main monitor
BRMatt: The second one has a max of 1024
DanaG: hmm, you must have a Virtual size that's too big.
adamk: Which is not the end of the world, but will cause problems if you have a wallpaper drawn across both screens.
adamk: DanaG: Nah, he's using two monitors that exceed 2048.
BRMatt: Alright well thanks for your help guys
adamk: BRMatt: If you move a window over the right side of the right monitor, does it leave trails on the wallpaper?
adamk: Well I won't question it, then.
adamk: BTW, you can set that SKIP_CHECKS variable permanently so you don't have to start compiz like that each time.
adamk: mkdir -p ~/.config/compiz; echo SKIP_CHECKS=yes
adamk: Not that.
BRMatt: Is that that then or do I have to run that command all the time
adamk: mkdir -p ~/.config/compiz; echo SKIP_CHECKS=yes >> ~/.config/compiz/compiz-manager
adamk: Then you can enable desktop effects via system --> preferences --> appearances --> effects.
adamk: And, finally, you probably want to switch to EXA instead of XAA.
BRMatt: the top bar's gone again...
adamk: In /etc/X11/xorg.conf, in the Device section, add
adamk: Option "AccelMethod" "EXA"
TheBrayn: what do I have to do to slow down my fan when the card is idling?
adamk: BRMatt: I'm honestly not sure what's causing your gnome-panel to disappear.
BRMatt: Oh it's come back now
adamk: BRMatt: But it's not likely to be a driver issue, so maybe something to ask in #ubuntu or #gnome.
DanaG: hmm, is the panel across both screens?
adamk: Is it on the right screen?
BRMatt: No, left
BRMatt: left is the primary screen
DanaG: I use only one screen, and sometimes I get my panel disappearing, too.
DanaG: Even on my netbook, with Intel.
adamk: Then that is very odd.
BRMatt: But anyway, it looks like you've fixed my problem
adamk: Yeah, I don't use gnome much.
BRMatt: All the windows now bounce when I move them :D
BRMatt: Thanks again adamk and DanaG, you've been a great help
DanaG: Ugh, guawd, I hate that stupid window-allocation lag.
DanaG: It's not just fglrx that's slowed by it.
DanaG: Happens with radeon, too.
DanaG: And nvidia binary, as well, on some hardware.
soreau: adamk: FWIW, I see why it said no whitelisted driver found
adamk: soreau: ?
soreau: adamk: compiz-manager checks for the string "Loading /usr/lib/xorg/modules/drivers//radeon_drv.so" but his prefix is /opt/xorg
adamk: That could be smarter :-)
adamk: I didn't even check that myself.
soreau: adamk: Took me a minute to figure it out ;)
adamk: Hah... And I bet compiz didn't start out-of-the-box on his machine because he was using two displays.
adamk: So originally he received the maximum size error, and then the whitelist error after he built radeon himself.