Here's the situation:
I have a view which renders the scene depth into a bmap, which is then used in a deferred shader to reconstruct the world position and do shadowmapping. That view is attached to the camera via
camera.stage = depthView;
set(depthView,CHILD);

So far, this is working just fine.

Now when I switch on the NOPARTICLES flag for depthView, my reconstructed world position is wrong.

This is how it should look:

[screenshot: correct reconstruction]

And this is how it looks with NOPARTICLES set:

[screenshot: broken reconstruction]

I then figured out that when NOPARTICLES is set, my matrix is rolled by 90° and tilted by 90°. After applying a corrective rotation matrix in my shader, it worked again. However, this is just a workaround, as I cannot toggle NOPARTICLES without changing my shader again.

Here's my matrix (it doesn't matter whether I calculate it in a material event or in a function; it's always messed up once I set NOPARTICLES for depthView):
Code:
mat_set(mtl.matrix,matWorld);        // matrix = world
mat_multiply(mtl.matrix,matView);    // matrix = world * view
mat_multiply(mtl.matrix,matProj);    // matrix = world * view * proj
mat_inverse(mtl.matrix,mtl.matrix);  // matrix = inverse(world * view * proj)



And here's how I reconstruct the world position:
Code:
//compute world position:
//rebuild the clip-space position from screen UV and stored depth
float4 vp = float4(inTex.xy * float2(2, -2) - float2(1, -1), depth, 1);
//transform back to world space with the inverse world*view*proj matrix
float4 v = mul(vp, matMtl);
//perspective divide
float3 position = v.xyz / v.w;





But wait, there's more:

If I don't attach the depth view to the camera via view.stage and CHILD, but instead manually set its position, pan and arc to the camera's, it works fine, no matter whether NOPARTICLES is set or not. I would do it that way, but then my depth map lags slightly behind the current scene if I move the camera too fast, and the shadows get messed up.



So to sum it up:
- using stage, CHILD and NOPARTICLES: my matrix gets messed up, rolled and tilted by 90°
- manually attaching my depth view to the camera works just fine, but the depth map then lags behind


Any ideas why NOPARTICLES could mess up my matrix?


Shade-C EVO Lite-C Shader Framework