Well, we have WebGL but we don't have WebGLU.
It's shitty.
That's what I'm saying.
And the worst part is that the shaders have to be stored as strings. It's not as if you can write shaders in JavaScript (even though nothing technically prevents it), so from square one WebGL makes the outrageous demand that you write GLSL inline, in a second language, just to put a triangle on the screen. Even if you stream them in as text resources, there's no way to write or debug a fragment shader in the browser that isn't stroke-inducing. In fact, there doesn't seem to be a reasonable way to do that AT ALL.
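To make the point concrete, here's a minimal sketch of what "shaders as strings" looks like in practice. The GLSL lives in a JavaScript string literal, and the `compileShader` helper is a hypothetical example, not from any library, though every WebGL app ends up rewriting some version of it:

```javascript
// The fragment shader: a full second language, embedded as a string.
// JavaScript tooling sees none of it; errors only surface at runtime.
const fragmentSource = `
precision mediump float;
void main() {
  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // solid red
}
`;

// Hypothetical helper (assumed name) around the standard WebGL calls.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    // The only "debugging" you get: an info-log string after the fact.
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

Note that the compile error, if any, comes back as another string from `getShaderInfoLog`, which is about the extent of the in-browser debugging story.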
Your "very little use" case covers about 10,000 snippets that would be easy to share if they didn't end up looking like this, the first Google result for "pastebin WebGL": http://pastebin.com/uiHGium5
Compare a decently developed web standard like HTML5 video: http://pastebin.com/znbHgG3r
Wow, a 40-line demo that's actually easy to follow.
The whole field of 3D programming is so full of arrogant programmers that it's no wonder everything is still difficult 20 years on. "It takes expert knowledge." Why even bother with a graphics card or a shading language, when it's trivial for a programmer to write their own software rasterizer? That's obviously how someone should learn to program, if they're going to do it the RIGHT WAY.
Spare me.