From my point of view, Unity isn't that different from Gamestudio, and it isn't as restrictive as ShiVa.
It has a consistent, well-working API, and the editor does a lot to simplify the user's life, like importing newly added files, compiling the code and integrating the code's interface into the editor.
This generally works very well, but especially on the shader side, where it even has to compile for the different supported platforms, it sometimes ends in a mess. For example, I haven't yet seen shaders originally written in Cg work correctly on the iPhone. At least this is easy to work around by providing the same effect written in GLSL.
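The usual way to do that is to put a GLSL SubShader below the Cg one, so Unity falls back to it on hardware where the Cg version fails. Roughly like the following sketch; the shader name, the properties and the tint effect itself are just placeholders, not a shader I actually shipped:

Shader "Custom/SimpleTint" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Tint ("Tint", Color) = (1,1,1,1)
    }

    // First SubShader: the Cg version, used wherever it compiles fine.
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            fixed4 _Tint;

            struct v2f {
                float4 pos : POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata_img v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = v.texcoord.xy;
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                return tex2D(_MainTex, i.uv) * _Tint;
            }
            ENDCG
        }
    }

    // Second SubShader: the same effect in GLSL, which Unity falls back to
    // when the Cg SubShader doesn't work on the device.
    SubShader {
        Pass {
            GLSLPROGRAM
            varying vec2 uv;

            #ifdef VERTEX
            void main () {
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                uv = gl_MultiTexCoord0.xy;
            }
            #endif

            #ifdef FRAGMENT
            uniform sampler2D _MainTex;
            uniform vec4 _Tint;
            void main () {
                gl_FragColor = texture2D(_MainTex, uv) * _Tint;
            }
            #endif
            ENDGLSL
        }
    }
}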
Another case where it doesn't work correctly are texture lookups; it seems as if they sometimes get optimized away. For example, I failed to render the r channel of one texture together with the g and b channels of another as a fullscreen effect. I tried many ways, and while each texture on its own displayed correctly, it refused to display both at once.
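For reference, the fragment part of what I was trying to do is no more than this (the property names besides _MainTex, which Graphics.Blit binds the source to, are my own):

Shader "Hidden/CombineChannels" {
    Properties {
        _MainTex ("Source (red channel)", 2D) = "white" {}
        _OtherTex ("Second texture (green/blue)", 2D) = "white" {}
    }
    SubShader {
        Pass {
            ZTest Always Cull Off ZWrite Off
            Fog { Mode Off }
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _OtherTex;

            fixed4 frag (v2f_img i) : COLOR {
                fixed4 a = tex2D(_MainTex, i.uv);  // red comes from the first texture
                fixed4 b = tex2D(_OtherTex, i.uv); // green and blue from the second
                return fixed4(a.r, b.g, b.b, 1.0);
            }
            ENDCG
        }
    }
}

Driven from a script with material.SetTexture("_OtherTex", other) and Graphics.Blit(source, destination, material) inside OnRenderImage, each lookup works on its own, but combined like this one of them seemed to get lost.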
Also the template shaders are kinda messy and inconsistent, especially the post-processing ones, just like in Gamestudio.
I am getting used to Unity lately, and the strong demand for custom shaders in the community shows that the provided ones aren't enough. That is actually always the case: all-round shaders tend to be slow, while a custom one that exactly fits the needs of a scene is often much faster and gives better results.
What I am now trying to figure out is how to manipulate the vertices and rendered pixels within the deferred pipeline while still allowing the standard "surface shader" to work. The only other shader that seems to do something like that doesn't look very nice...
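As far as I can tell, the documented hooks are a custom vertex function plus the surf function itself, so a sketch like the one below still goes through the deferred path; the names and the normal extrusion are just placeholders for the kind of manipulation I mean:

Shader "Custom/DeformedSurface" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Amount ("Extrusion Amount", Range(0, 0.5)) = 0.1
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // Surface shader with a custom vertex function; Unity generates the
        // passes needed for the deferred path from this automatically.
        #pragma surface surf Lambert vertex:vert

        sampler2D _MainTex;
        float _Amount;

        struct Input {
            float2 uv_MainTex;
        };

        // vertex manipulation: push every vertex out along its normal
        void vert (inout appdata_full v) {
            v.vertex.xyz += v.normal * _Amount;
        }

        // per-pixel manipulation of the material inputs that end up in the G-buffer
        void surf (Input IN, inout SurfaceOutput o) {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}

What I'm not sure about is whether this gets me at the actually rendered pixels after lighting, or only at the material inputs, which is exactly the part I still have to figure out.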