I'm rendering the same scene using the exact same C++ code, once natively with OpenGL on Windows and once compiled with Emscripten to WebGL. Everything in the scene looks identical, except when I render something with alpha != 1.0. The difference looks like this:
The blue cube's color is (0.0, 0.0, 1.0, 0.5).
The shader used for rendering the cube does nothing except output that color.
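For reference, a minimal sketch of such a fragment shader (GLSL ES 1.00 embedded as a C++ string; the uniform name u_color is just illustrative, my actual shader is equivalent to this):

// Pass-through fragment shader: write a constant RGBA color and nothing else.
const char* kFragmentShaderSrc = R"(
    precision mediump float;
    uniform vec4 u_color;   // e.g. (0.0, 0.0, 1.0, 0.5) for the blue cube
    void main() {
        gl_FragColor = u_color;
    }
)";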
On the right is how it looks with native OpenGL, which is the expected result: just blue with half transparency. On the left is how it looks with Emscripten+WebGL. It looks as if the color actually being rendered is (0.5, 0.5, 1.0, 0.5).
The blend function I use is the standard one:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
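For context, the blending state is set up in the usual way, roughly like this (a sketch assuming straight, non-premultiplied alpha coming out of the fragment shader):

// Enable standard "over" compositing for straight alpha:
//   result.rgb = src.rgb * src.a + dst.rgb * (1 - src.a)
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);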
Is there some difference in how WebGL handles alpha? What could possibly cause this to happen?