Saturday, February 1, 2014

Render To Texture (RTT)

For my next Android application I need to apply a simple Gaussian blur effect. To achieve this, I need to render the current scene to a texture and then apply a post-processing filter, in my case a Gaussian blur.
In this post I’ll cover only the FBO part; Gaussian blur will be covered in the next post.
To render the scene to a texture, we use a Frame Buffer Object (FBO), an off-screen rendering technique. Once the scene is in a texture, we can apply any post-processing filter to it.
Here is a screenshot of what we are trying to achieve:
In this post I’ll try to cover
1) Load images from Assets folder (android specific)
2) Initialize FBO
3) Render to FBO
4) Use rendered FBO texture in other scene

As usual, the code is available on Google Code.

Load images from Assets folder

This is straightforward; not much code is involved. Below is the code:
public static Bitmap GetFromAssets(GLSurfaceView view, String name)
{
    Bitmap img = null;
    //get asset manager
    AssetManager assetManager = view.getContext().getAssets();
    InputStream istr = null;
    try {
        //open image as input stream
        istr = assetManager.open(name);
        //decode input stream to bitmap
        img = BitmapFactory.decodeStream(istr);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        //close the stream so we don't leak the asset file descriptor
        if (istr != null) {
            try { istr.close(); } catch (IOException ignored) {}
        }
    }
    return img;
}

 

Initialize FBO

This is the most important part of using an FBO; once initialization is done properly, the rest of the process is very simple.
Initialization breaks down into the steps below.
a) Generate Frame Buffer
b) Generate Texture, to use with frame buffer
c) Generate Render Buffer
d) Bind Frame buffer generated in first step
e) Bind texture
f) Define texture parameters like format, dimension and min/mag filters.
g) Bind render buffer and define buffer dimension
h) Attach the texture to the FBO color attachment
i) Attach the render buffer to the depth attachment
Make sure that the dimensions are POT (Power Of Two), because some devices may not support NPOT textures.
As a general rule, to be on the safe side, use images with POT dimensions.
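Since NPOT support varies across devices, a small helper can round a requested FBO size up to the next power of two. This is a minimal sketch; TexUtil and nextPowerOfTwo are hypothetical names, not from the project code:

```java
//Hypothetical helper, not part of the project code: rounds a texture
//dimension up to the next power of two, for devices without NPOT support.
public class TexUtil {
    public static int nextPowerOfTwo(int n) {
        if (n <= 1) return 1;
        //highestOneBit returns the largest power of two <= n
        int h = Integer.highestOneBit(n);
        return (h == n) ? n : h << 1;
    }
}
```

For example, a 480x800 screen would get a 512x1024 FBO.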
Below is the code for initializing the FBO:
int[] temp = new int[1];
//generate fbo id
GLES20.glGenFramebuffers(1, temp, 0);
fboId = temp[0];
//generate texture
GLES20.glGenTextures(1, temp, 0);
fboTex = temp[0];
//generate render buffer
GLES20.glGenRenderbuffers(1, temp, 0);
renderBufferId = temp[0];
//Bind Frame buffer
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
//Bind texture
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTex);
//Define texture parameters
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, fboWidth, fboHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
//Bind render buffer and define buffer dimension
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, fboWidth, fboHeight);
//Attach texture to FBO color attachment
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
//Attach render buffer to depth attachment
GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
//we are done, reset
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
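Before unbinding, it is worth verifying the setup with GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER), which returns GLES20.GL_FRAMEBUFFER_COMPLETE on success. The helper below is a hypothetical debugging aid (FboStatus is not part of the project code); the hex values mirror the GLES20 constants:

```java
//Hypothetical debugging helper: translates the status returned by
//GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) into a readable
//name. The hex values mirror the GLES20 constants.
public class FboStatus {
    public static String name(int status) {
        switch (status) {
            case 0x8CD5: return "GL_FRAMEBUFFER_COMPLETE";
            case 0x8CD6: return "GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT";
            case 0x8CD7: return "GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT";
            case 0x8CD9: return "GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS";
            case 0x8CDD: return "GL_FRAMEBUFFER_UNSUPPORTED";
            default:     return "UNKNOWN (0x" + Integer.toHexString(status) + ")";
        }
    }
}
```

If the status is anything other than complete, the attachments or dimensions defined above are the first thing to re-check.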

Render to FBO

Rendering to an FBO is the same as normal rendering, except for two initial steps:
1) Bind the frame buffer we are rendering to
2) Set the viewport size to the FBO width and height
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
GLES20.glViewport(0, 0, fboWidth, fboHeight);
        ******Rendering Code*******
       
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
 

Use rendered FBO texture in other scene

Now that we have rendered the scene to a texture, we can use that texture in other scenes or for post-processing.
public void onDrawFrame(GL10 arg0)
{       
    //call FBORenderer to render to texture
    fbor.RenderToTexture();
    //reset the viewport and projection, because the viewport set by the FBO renderer is different
    GLES20.glViewport(0, 0, vwidth, vheight);
    float ratio = (float)vwidth/(float)vheight;
    float a = 5f;
    Matrix.orthoM(m_fProj, 0, -a*ratio, a*ratio, -a*ratio, a*ratio, 1, 10);   
    //multiply view matrix with projection matrix
    Matrix.multiplyMM(m_fVPMat, 0, m_fProj, 0, m_fView, 0);
    //below procedure is same as any other rendering
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT|GLES20.GL_DEPTH_BUFFER_BIT);
   
    GLES20.glUseProgram(iProgId);
   
    vertexBuffer.position(0);
    GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 0, vertexBuffer);
    GLES20.glEnableVertexAttribArray(iPosition);
   
    texBuffer.position(0);
    GLES20.glVertexAttribPointer(iTexCoords, 2, GLES20.GL_FLOAT, false, 0, texBuffer);
    GLES20.glEnableVertexAttribArray(iTexCoords);
   
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, iTexId);
    GLES20.glUniform1i(iTexLoc, 0);
    //since I'm multi-texturing, bind the FBO texture (fboTex, not the framebuffer id fboId) to texture unit 1
    GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTex);
    GLES20.glUniform1i(iTexLoc1, 1);
    //for rotating cube
    Matrix.setIdentityM(m_fModel, 0);
    Matrix.rotateM(m_fModel, 0, -xAngle, 0, 1, 0);
    Matrix.rotateM(m_fModel, 0, -yAngle, 1, 0, 0);
    //multiply model matrix with view-projection matrix
    Matrix.multiplyMM(m_fMVPMat, 0, m_fVPMat, 0, m_fModel, 0);   
    //pass model-view-projection matrix to shader
    GLES20.glUniformMatrix4fv(iMVPMat, 1, false, m_fMVPMat, 0);   
    //draw cube
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, cube.m_nIndeces, GLES20.GL_UNSIGNED_SHORT, indexBuffer);
}
If you look at the code, the FBO-related initialization and rendering are separated into an FBORenderer class, to avoid confusion later when we implement Gaussian blur using two FBOs.
That’s it for this post; the next one will cover Gaussian blur.
If you have any related questions, or if you find any mistakes in this post, please do leave a comment.

Simple Directional Lighting

I have combined the per-vertex and per-pixel lighting effects in one program, so that we can have a good look at the difference in their output. The lighting is very basic and does not involve shadows.
The output looks like this (screenshots: the options menu, per-vertex shading, and per-pixel shading).
If you observe the screens above, per-pixel shading is much smoother than per-vertex.
Directional lighting illuminates all objects equally from a given direction, like an area light of infinite size at an infinite distance from the scene; there is shading, but there cannot be any distance falloff.
For detailed theory, you can refer here and here, which offer a great theoretical treatment of lighting; I have nothing new to explain.
In this post I’ll try to show the coding difference between per-vertex and per-pixel shading.
First we’ll look at the shader code, and then the remaining rendering part.

Vertex Shader


Per-Vertex:
"attribute vec4 a_position;" +
"attribute vec3 a_normals;" +
"attribute vec2 a_texCoords;" +
"uniform mat4 u_ModelViewMatrix;" +
"uniform vec3 u_LightDir;" +
"uniform vec3 u_LightColor;" +
"uniform vec3 u_SpecLightColor;" +
"uniform float u_Shine;" +
"varying vec3 v_colorWeight;" +
"varying vec2 v_texCoords;" +
"void main()" +
"{" +
    "gl_Position = u_ModelViewMatrix * a_position;" +
    "v_texCoords = a_texCoords;" +
    "vec3 normal = normalize(vec3(u_ModelViewMatrix * vec4(a_normals,0.0)));" +
    "vec3 lightNorm = normalize(u_LightDir);" +
    "float lightWeight = max(dot(normal,lightNorm),0.0);" +
    "vec3 halfVec = normalize(u_LightDir - gl_Position.xyz);" +
    "float specWeight = pow(max(dot(normal,halfVec),0.0),u_Shine);" +
    "v_colorWeight = vec3(0.0,0.0,0.0) + (lightWeight * u_LightColor) + (u_SpecLightColor*specWeight);" +
"}";

Per-Pixel:

"attribute vec4 a_position;" +
"attribute vec3 a_normals;" +
"attribute vec2 a_texCoords;" +
"uniform mat4 u_ModelViewMatrix;" +
"varying vec3 u_Normals;" +
"varying vec2 v_texCoords;" +
"varying vec4 v_position;" +
"void main()" +
"{" +
    "v_texCoords = a_texCoords;" +
     "u_Normals =  normalize(vec3(u_ModelViewMatrix * vec4(a_normals,0.0)));" +
    "gl_Position = u_ModelViewMatrix * a_position;" +
     "v_position = gl_Position;" +
"}";



In the per-vertex approach we calculate the color weight from the object normals and the light direction in the vertex shader, but in the per-pixel approach we only transform the normals and pass them to the fragment shader. We need to transform the normals just as we transform the vertex position coordinates.

For the vertex shader, we take the sphere’s vertex normals along with the vertices as input. As usual, we pass the model-view matrix for calculating the vertex position with respect to the current projection.
We also need a normal matrix, derived from the model-view matrix, for transforming the vertex normals, which we use for calculating the lighting brightness. I’ll explain how to calculate this matrix in the sections below.
We need two more variables for the lighting parameters: light direction and light color. I assumed an ambient color of (r:0.2, g:0.2, b:0.2) for the calculations.
A varying variable, v_colorWeight, passes the calculated result to the fragment shader.
The calculations are very simple: we transform the normals and compute the light color weight used to modulate the object’s color.

Fragment Shader

Per-Vertex:
String strFShaderPV =
   "precision mediump float;" +
   "varying vec3 v_colorWeight;" +
   "varying vec2 v_texCoords;" +
   "uniform sampler2D u_texId;" +
   "void main()" +
   "{" +
    "vec4 texColor = texture2D(u_texId, v_texCoords);" +
    "gl_FragColor = vec4(texColor.rgb * v_colorWeight, texColor.a);" +
   "}";
Per-Pixel:
String strFShaderPP = "precision mediump float;" +
    "uniform vec3 u_LightDir;" +
    "uniform vec3 u_LightColor;" +   
    "uniform sampler2D u_texId;" +
    "varying vec2 v_texCoords;" +
    "varying vec3 u_Normals;" +
    "void main()" +
    "{" +
     "vec3 LNorm = normalize(u_LightDir);" +
     "vec3 normal = normalize(u_Normals);" +
     "float intensity = max(dot(LNorm, normal),0.0);" +
     "vec4 texColor = texture2D(u_texId, v_texCoords);" +
     "vec3 calcColor = vec3(0.2,0.2,0.2) + u_LightColor * intensity;" +
     "gl_FragColor = vec4(texColor.rgb * calcColor, texColor.a);" +
    "}";

In the per-vertex approach we apply the color weight already calculated in the vertex shader, but in the per-pixel approach we take the normals and perform all the calculations in the fragment shader.
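Both approaches compute the same diffuse term, max(dot(N, L), 0.0), with normalized vectors; the only difference is whether it runs per vertex or per fragment. Here is a plain-Java sketch of that calculation (the Lambert class is hypothetical, mirroring the GLSL above):

```java
//Hypothetical plain-Java illustration of the diffuse (Lambert) term the
//shaders compute: intensity = max(dot(N, L), 0) with normalized vectors.
public class Lambert {
    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }
    public static float intensity(float[] normal, float[] lightDir) {
        float[] n = normalize(normal);
        float[] l = normalize(lightDir);
        float d = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
        //surfaces facing away from the light get no diffuse contribution
        return Math.max(d, 0.0f);
    }
}
```

A surface facing the light directly gets intensity 1.0, while one facing away gets 0.0.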
Let’s look at remaining code part.
I have created two new classes in the common package: Mat3, for 3x3 matrix calculations, and Mesh, for creating a sphere and a cube (for now; I will add support for loading 3DS, OBJ and other formats as I go).

Renderer

public LightRenderer(ES2SurfaceView view)
{
    sphere = new Mesh();
    sphere.Sphere(4, 10);
    curView = view;
    normalMat = new Mat3();
    cubeBuffer = sphere.getVertexBuffer();
    normalsBuffer = sphere.getNormalsBuffer();
    indexBuffer = sphere.getIndecesBuffer();
    texBuffer = sphere.getTextureBuffer();
}
In the constructor of the renderer we create a sphere and get its vertex, normal, texture-coordinate and index buffers for local reference.
public void onDrawFrame(GL10 gl) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        
    GLES20.glUseProgram(iProgId);
    
    cubeBuffer.position(0);
    GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 0, cubeBuffer);
    GLES20.glEnableVertexAttribArray(iPosition);
        
    texBuffer.position(0);
    GLES20.glVertexAttribPointer(iTexCoords, 2, GLES20.GL_FLOAT, false, 0,texBuffer);
    GLES20.glEnableVertexAttribArray(iTexCoords);
        
    normalsBuffer.position(0);
    GLES20.glVertexAttribPointer(iNormals, 3, GLES20.GL_FLOAT, false, 0, normalsBuffer);
    GLES20.glEnableVertexAttribArray(iNormals);
        
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, iTexId);
    GLES20.glUniform1i(iTexLoc, 0);
        
    GLES20.glUniform3fv(iLightColor, 1, m_fLightColor, 0);
    GLES20.glUniform3fv(iLightDirection, 1, m_fLightDir, 0);
        
    GLES20.glUniform3fv(iSpecColor, 1, m_fSpecColor, 0);
    GLES20.glUniform1f(iShine, fShine);
        
    Matrix.setIdentityM(m_fModel, 0);
    Matrix.rotateM(m_fModel, 0, -xAngle, 0, 1, 0);
    Matrix.rotateM(m_fModel, 0, -yAngle, 1, 0, 0);
        
    Matrix.multiplyMM(m_fMVPMatrix, 0, m_fVPMatrix, 0, m_fModel, 0);
            
    GLES20.glUniformMatrix4fv(iVPMatrix, 1, false, m_fMVPMatrix, 0);
        
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, sphere.m_nIndeces, GLES20.GL_UNSIGNED_SHORT, indexBuffer);

}

The normal-matrix code is the only part that differs from previous posts: it creates a 3x3 normal matrix from the model-view matrix, which the shaders use for transforming the sphere's normals.

The normal matrix is calculated from the top-left 3x3 of the model-view matrix.

The SetFrom4X4 function copies that 3x3 sub-matrix out of the passed 4x4 matrix.

Inverting this 3x3 matrix and then transposing it gives us the normal matrix.

If you would like to know how to calculate this mathematically, refer here.
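The steps above can be sketched as self-contained Java (a hypothetical stand-in for the project's Mat3 class), using the column-major layout of android.opengl.Matrix. Computing the inverse via the adjugate lets us fold the inverse and the transpose into a single step:

```java
//Hypothetical sketch of the normal-matrix steps: copy the top-left 3x3 of
//the model-view matrix, then take its inverse-transpose. Matrices are
//column-major float[16]/float[9], matching android.opengl.Matrix.
public class NormalMatrix {
    //step 1: copy the top-left 3x3 out of a column-major 4x4
    public static float[] fromM4(float[] m4) {
        return new float[] { m4[0], m4[1], m4[2],
                             m4[4], m4[5], m4[6],
                             m4[8], m4[9], m4[10] };
    }
    //steps 2+3: inverse then transpose, done in one pass via the adjugate;
    //transpose(inverse(M)) is the cofactor matrix of M divided by det(M)
    public static float[] inverseTranspose(float[] m) {
        float det = m[0]*(m[4]*m[8] - m[5]*m[7])
                  - m[3]*(m[1]*m[8] - m[2]*m[7])
                  + m[6]*(m[1]*m[5] - m[2]*m[4]);
        float inv = 1.0f / det;
        return new float[] {
            (m[4]*m[8] - m[5]*m[7]) * inv,
            (m[5]*m[6] - m[3]*m[8]) * inv,
            (m[3]*m[7] - m[4]*m[6]) * inv,
            (m[2]*m[7] - m[1]*m[8]) * inv,
            (m[0]*m[8] - m[2]*m[6]) * inv,
            (m[1]*m[6] - m[0]*m[7]) * inv,
            (m[1]*m[5] - m[2]*m[4]) * inv,
            (m[2]*m[3] - m[0]*m[5]) * inv,
            (m[0]*m[4] - m[1]*m[3]) * inv };
    }
}
```

For a pure rotation the result equals the input, while for a uniform scale of 2 the normals get scaled by 0.5, which is why the shader still re-normalizes them.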


A Little bit of Android, Creating Options Menu


There are two ways of creating menus: in code, or from a resource.
Here I have done it in code, as we have only two menu items.
We have to override two methods in the activity.


  • public boolean onCreateOptionsMenu(Menu menu)
  • public boolean onOptionsItemSelected(MenuItem item)
The onCreateOptionsMenu method is called when the menu button is pressed on the phone; we create the menu in this method.
public boolean onCreateOptionsMenu(Menu menu) {
    menu.add(Menu.NONE, 0, Menu.NONE, "Per Vertex");
    menu.add(Menu.NONE, 1, Menu.NONE, "Per Pixel");
    return super.onCreateOptionsMenu(menu);
}

With this we create a menu with two items, “Per Vertex” and “Per Pixel”, with ids 0 and 1 respectively.

To handle the menu events we have to implement the onOptionsItemSelected method.
public boolean onOptionsItemSelected(MenuItem item) {
    int id = item.getItemId();
    if (id == 0 || id == 1) {
        //switch between the per-vertex and per-pixel programs
        view.LoadProgram(id);
        return true;
    }
    return super.onOptionsItemSelected(item);
}

In this method I call the view’s LoadProgram function to switch between the per-vertex and per-pixel shader programs.

That’s it in this post.