Tips/tricks related to Computer Graphics, GPUs and other programming

Tutorials on how to write shaders for the Android platform are a little difficult to come by on the Internet, so I decided to work on a small but simple program which hopefully provides a simple framework to get started writing programs for OpenGL ES 2.0.

I’ll first link to the apk, source code and the repository:
apk here.
Source code here.
Google Code repository here. NOTE: Branch is “default”

Here’s what my program does:

  • Mesh Loading: The program loads triangular meshes in these formats (as .txt files):
    • (.OFF): These consist of only vertex positions. My program will calculate per-vertex normals.
    • (.OBJ): Vertex positions, normals and texture coordinates are provided. Sample textured cube mesh was exported from Blender.
  • Shaders:
    • Gouraud Shading: Per-vertex lighting [with texture mapping if enabled].
    • Phong Shading: Per-pixel lighting [with texture mapping if enabled].
    • Normal Mapping: Gives a fake appearance of depth to a mesh. Sample normal map included.

NOTE: If your program is crashing with glError 1281 and you have a relatively new phone (a Samsung Galaxy S2 for example), change all dot() method calls in the shaders to dot2().

How it works:

The program will only work if your phone supports OpenGL ES 2.0. If it doesn’t, the program quits without warning.

When the program starts you will see an octahedron mesh with Gouraud shading enabled. There is a light rotating around the mesh. You can use one finger
to rotate the mesh, and two fingers (pinch-to-zoom) to scale it (though it’s not *perfect*). Clicking the menu button on your phone allows you to:

  • Toggle Light Rotation: Turns the rotating light on or off. A short ‘Toast’ notification pops up.
  • Shader Switching: Switches between Gouraud shading, Phong shading and normal mapping.
  • Mesh Switching: Switches between an octahedron, a tetrahedron (both .off files) and the textured cube mesh (.obj file).
  • Toggle Texturing: Turns texture mapping off if the object has an associated texture (only the textured cube mesh does).

Background needed:

Before you start reading this tutorial it would be useful if you have experience with:

  • Java Programming. If you have experience with C/C++/Python/etc. programming it shouldn’t be that hard to understand the code here. But I’m guessing you are already familiar with Java if you are here.
  • OpenGL. Previous experience working with OpenGL (ES or otherwise) would be nice. As I go along I will try to explain what I’m doing in my code, but I won’t go through my shader code line-by-line. You can look up what Gouraud shading, Phong shading and normal mapping are in other places.
  • Android Development. I won’t go over in detail about how to setup and write Android applications, but I will mention how to create an Activity, an OpenGL view, xml files, menus, etc. as used in my program. Once again, there are very good tutorials out there on how to write an Android program.

Let’s Begin:

I decided to start off by editing one of the samples included in the Android SDK. There are five .java files in my program:

  1. This is the ‘Activity’ started up by the phone; it sets up the GLSurfaceView and its Renderer, handles touch events, creates and handles the menu, etc.
  2. This is the OpenGL renderer itself. It creates the shaders, the camera views, the objects and then handles the rendering for each frame. This is the file I’ll spend most time on.
  3. A class to represent a shader object (along with its vertex and pixel shaders). It just reads the corresponding vertex and pixel shader files and generates the programs. The class is rather straightforward so I won’t spend much time explaining it.
  4. Represents one object. It stores a triangular mesh and associated texture ids and files for an object.
  5. This is the class which represents a triangular mesh. It loads a .OFF or .OBJ file and stores all the data in a vertex buffer and also generates an index buffer for rendering (therefore we will be using glDrawElements rather than glDrawArrays). I will go over this in a little detail.

Apart from these java files, my other files – images (.png), shaders (.txt) and meshes (.txt) – are stored in the res/raw/ directory. Storing them here allows easy access in my code.


i. We are going to start off with the onCreate method, which is called when you start the program:

protected void onCreate(Bundle savedInstanceState) {
	super.onCreate(savedInstanceState);

	// Create a new GLSurfaceView - this holds the GL Renderer
	mGLSurfaceView = new GLSurfaceView(this);

	// detect if OpenGL ES 2.0 support exists - if it doesn't, exit.
	if (detectOpenGLES20()) {
		// Tell the surface view we want to create an OpenGL ES 2.0-compatible
		// context, and set an OpenGL ES 2.0-compatible renderer.
		mGLSurfaceView.setEGLContextClientVersion(2);
		renderer = new Renderer(this);
		mGLSurfaceView.setRenderer(renderer);
	} else { // quit if no support - get a better phone!
		this.finish();
	}

	// set the content view
	setContentView(mGLSurfaceView);
}
It creates a GLSurfaceView and detects whether the phone supports OpenGL ES 2.0; if it does, it sets the renderer and makes the GLSurfaceView the content view, otherwise the program quits.

ii. Creating the menu:

The menu is created via the onCreateOptionsMenu method:

public boolean onCreateOptionsMenu(Menu menu) {
	MenuInflater inflater = getMenuInflater();
	inflater.inflate(R.menu.game_menu, menu);
	return true;
}
It ‘inflates’ the menu from the xml file ‘game_menu.xml’ included in the res/menu directory. It’s a simple file with a list of items.

iii. Handling Menu Events:

The following method is responsible for handling what the user pressed in the menu. Hopefully it’s self-explanatory:

public boolean onOptionsItemSelected(MenuItem item) {
	// Handle item selection
	switch (item.getItemId()) {
	case          // Toggle lighting
		return true;
	case             // Gouraud Shading
		return true;
	case                  // Quit the program
		return true;
	case                // Cube
		return true;
	case	            // Enable/disable texturing
		return true;
	}
	return super.onOptionsItemSelected(item);
}

iv. Handling Touch Events:

You can rotate and scale the meshes by handling touch events in the onTouchEvent method. I won’t list the code here but you can get more
info at this link.
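The core of that math can be sketched separately from the Android MotionEvent plumbing (the class and method names below are illustrative, not from the actual project): a one-finger drag delta maps to a rotation angle, and the ratio of pinch spacings between two pointers maps to a scale factor.

```java
// Hypothetical helpers: the math behind one-finger rotate and pinch-to-zoom.
// Names are mine; the real project does this inside onTouchEvent.
class TouchMath {
    // distance between two pointers, used to track pinch-to-zoom
    static float spacing(float x0, float y0, float x1, float y1) {
        float dx = x0 - x1, dy = y0 - y1;
        return (float) Math.sqrt(dx * dx + dy * dy);
    }

    // a one-finger drag delta (pixels) maps to a rotation angle (degrees)
    static float dragToAngle(float deltaPixels, float degreesPerPixel) {
        return deltaPixels * degreesPerPixel;
    }

    // the ratio of new to old pinch spacing gives a scale multiplier
    static float pinchToScale(float oldSpacing, float newSpacing) {
        return newSpacing / oldSpacing;
    }
}
```

On each ACTION_MOVE you would feed the deltas since the last event into these and accumulate the resulting angles/scale on the renderer.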


The Mesh class is responsible for reading in the mesh files and constructing the vertex and index buffers. The vertex buffer (a 1-D array) layout is as follows:

vp = vertex position (x, y, z);
vn = vertex normal (x, y, z);
tc = Texture Coordinate (u, v);

[vp1, vn1, tc1, vp2, vn2, tc2, …]

So one vertex occupies 8 (3 + 3 + 2) positions in the 1D array. The constructor calls the loadFile() method which reads in the mesh file, distinguishes whether it is a .off or .obj file by reading the first line (OFF or OBJ), and then proceeds to read in the file.
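As a sketch of that layout (class and constant names are mine; the position/normal offsets match the ones used later, and the texture-coordinate offset of 6 follows from the 3 + 3 + 2 layout), packing separate attribute arrays into one interleaved vertex array looks like:

```java
// Sketch: interleave positions (3 floats), normals (3 floats) and tex
// coords (2 floats) into one vertex array with a stride of 8 floats.
class VertexPacker {
    static final int POS_OFFSET = 0, NOR_OFFSET = 3, TC_OFFSET = 6, STRIDE = 8;

    static float[] interleave(float[] pos, float[] nor, float[] tc) {
        int nVerts = pos.length / 3;
        float[] out = new float[nVerts * STRIDE];
        for (int v = 0; v < nVerts; v++) {
            int base = v * STRIDE;
            // vertex position (x, y, z)
            out[base + POS_OFFSET]     = pos[v * 3];
            out[base + POS_OFFSET + 1] = pos[v * 3 + 1];
            out[base + POS_OFFSET + 2] = pos[v * 3 + 2];
            // vertex normal (x, y, z)
            out[base + NOR_OFFSET]     = nor[v * 3];
            out[base + NOR_OFFSET + 1] = nor[v * 3 + 1];
            out[base + NOR_OFFSET + 2] = nor[v * 3 + 2];
            // texture coordinate (u, v)
            out[base + TC_OFFSET]     = tc[v * 2];
            out[base + TC_OFFSET + 1] = tc[v * 2 + 1];
        }
        return out;
    }
}
```

The resulting array is what gets wrapped in a FloatBuffer and handed to glVertexAttribPointer with the matching offsets.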

The methods are too long to post here, but I’ll just mention a few words about reading in the files:

  • .OFF format: This format has a list of all the vertices first followed by a list of all the faces (with indices of which vertex is included in the face). This makes it simple to fill up the vertex buffer with position info and the index buffer, but we still need to calculate the vertex normals. I first calculate the face normal of each face (cross product of two vectors on the face), and then average the normals of all the faces around each vertex to get the vertex normal.
    Unfortunately, since texture coordinates are not provided in this format, there is unused space in the vertex buffer when working with .OFF meshes. You could do some extra work to overcome this but I was a little lazy :P
  • .OBJ format: The OBJ format is more common (and Blender can export it, which is very useful). Reading this format is a little more tedious since it doesn’t provide the number of vertices/faces/etc. at the beginning of the file, so you can’t size your arrays up front. You have to read each line and figure out whether it’s a vertex position, a vertex normal, a face or a texture coordinate.
    NOTE: My reader will not read every obj file out there. You may have to tinker with your files a bit to make them compatible with mine. Here’s the format I can read in:
    list of vertices:
    v x y z
    list of tex coords:
    vt u v
    list of normals:
    vn x y z
    list of faces
    f pos1/tc1/n1 pos2/tc2/n2 pos3/tc3/n3

    The file has to be in the order given above, though in an actual obj file the lines can be arbitrary. I also don’t read in any material information, so be cautioned.
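The normal-averaging step described above for .OFF meshes can be sketched as a small stand-alone routine (a simplified stand-in for the actual loader; accumulating the unnormalized cross products also weights bigger faces more heavily, which is a common variant):

```java
// Sketch: per-vertex normals for an indexed triangle mesh (OFF files
// provide no normals). Sum each face's cross product onto its three
// corner vertices, then normalize the sums.
class NormalCalc {
    static float[] vertexNormals(float[] pos, short[] idx) {
        float[] n = new float[pos.length];
        for (int t = 0; t < idx.length; t += 3) {
            int a = idx[t] * 3, b = idx[t + 1] * 3, c = idx[t + 2] * 3;
            // two edge vectors of the face
            float e1x = pos[b] - pos[a], e1y = pos[b + 1] - pos[a + 1], e1z = pos[b + 2] - pos[a + 2];
            float e2x = pos[c] - pos[a], e2y = pos[c + 1] - pos[a + 1], e2z = pos[c + 2] - pos[a + 2];
            // face normal = e1 x e2
            float nx = e1y * e2z - e1z * e2y;
            float ny = e1z * e2x - e1x * e2z;
            float nz = e1x * e2y - e1y * e2x;
            // accumulate on the three corner vertices
            for (int v : new int[] { a, b, c }) {
                n[v] += nx; n[v + 1] += ny; n[v + 2] += nz;
            }
        }
        // normalize each accumulated normal
        for (int v = 0; v < n.length; v += 3) {
            float len = (float) Math.sqrt(n[v] * n[v] + n[v + 1] * n[v + 1] + n[v + 2] * n[v + 2]);
            if (len > 0) { n[v] /= len; n[v + 1] /= len; n[v + 2] /= len; }
        }
        return n;
    }
}
```

The winding order of the indices determines which way the normals face, so a consistently wound mesh is assumed here.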


This is the main OpenGL renderer which does all the setup and per-frame rendering.

There are 4 main methods I am going to look at: the constructor, onSurfaceCreated (initialization), onSurfaceChanged (viewport changes) and the big one: onDrawFrame().

i. Constructor:

The code below just sets up the shader resource arrays. ‘R.raw.gouraud_vs’ refers to the gouraud_vs.txt file in the res/raw folder, which defines the Gouraud vertex shader:

// setup all the shaders
vShaders = new int[3];
fShaders = new int[3];

// basic - just gouraud shading
vShaders[GOURAUD_SHADER] = R.raw.gouraud_vs;
fShaders[GOURAUD_SHADER] = R.raw.gouraud_ps;

Now we create some objects. We pass in the textures associated with the object along with the text file representing the mesh (R.raw.octahedron for example).

// Create some objects - pass in the textures, the meshes
try {
	int[] normalMapTextures = {R.raw.diffuse_old, R.raw.diffusenormalmap_deepbig};
	_objects[0] = new Object3D(R.raw.octahedron, false, context);
	_objects[1] = new Object3D(R.raw.tetrahedron, false, context);
	_objects[2] = new Object3D(normalMapTextures, R.raw.texturedcube, true, context);
} catch (Exception e) {
	//showAlert("" + e.getMessage());
}

ii. OnSurfaceCreated:
This method is called when the surface is first created. It is used for setting up your initial variables – light properties, material properties, shaders, etc.

Code below creates the Shader objects:

// initialize shaders
try {
	_shaders[GOURAUD_SHADER] = new Shader(vShaders[GOURAUD_SHADER], fShaders[GOURAUD_SHADER], mContext, false, 0); // gouraud
	_shaders[PHONG_SHADER] = new Shader(vShaders[PHONG_SHADER], fShaders[PHONG_SHADER], mContext, false, 0); // phong
	_shaders[NORMALMAP_SHADER] = new Shader(vShaders[NORMALMAP_SHADER], fShaders[NORMALMAP_SHADER], mContext, false, 0); // normal map
} catch (Exception e) {
	Log.d("Shader Setup", e.getLocalizedMessage());
}

Then I enable depth testing, specify the depth comparison function, and enable backface culling:

GLES20.glEnable( GLES20.GL_DEPTH_TEST );
GLES20.glDepthFunc( GLES20.GL_LEQUAL );
GLES20.glDepthMask( true );

// cull backface
GLES20.glEnable( GLES20.GL_CULL_FACE );

Light properties and material properties (ambient, diffuse, specular) are setup next. Ideally you’d probably want to have a separate “Light” class and store material properties in the Object3D itself. But I wanted to keep it simple.

// light variables
// light position
float[] lightP = {30.0f, 0.0f, 10.0f, 1};
this.lightPos = lightP;


// material properties - ambient first
float[] mA = {1.0f, 0.5f, 0.5f, 1.0f};
matAmbient = mA;


Lastly I set up the textures for all the objects (via the setupTexture method) and set the view matrix via setLookAt:

// setup textures for all objects
for(int i = 0; i < _objects.length; i++)

// set the view matrix
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -5.0f, 0.0f, 0f, 0f, 0f, 1.0f, 0.0f);

The eye is placed at the location [0, 0, -5], looks at the origin [0, 0, 0] and has an up vector of [0, 1, 0]. The 4×4 view matrix generated is stored in the variable mVMatrix (which will later be multiplied with the projection matrix and sent to the vertex shader).
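Under the hood setLookAtM builds a standard look-at matrix. Here is a hedged, row-major sketch of the construction (not the actual Android implementation, which writes column-major output; the helper names are mine):

```java
// Sketch: a row-major look-at view matrix. Rows are the camera's right,
// up and negated forward axes, plus a translation moving the eye to the origin.
class LookAt {
    static float[] lookAt(float[] eye, float[] center, float[] up) {
        float[] f = norm(new float[] { center[0] - eye[0], center[1] - eye[1], center[2] - eye[2] });
        float[] s = norm(cross(f, up));   // side (right) axis
        float[] u = cross(s, f);          // recomputed orthogonal up
        return new float[] {
             s[0],  s[1],  s[2], -dot(s, eye),
             u[0],  u[1],  u[2], -dot(u, eye),
            -f[0], -f[1], -f[2],  dot(f, eye),
             0, 0, 0, 1 };
    }

    // transform point p (w = 1) by a row-major 4x4 matrix m
    static float[] xform(float[] m, float[] p) {
        return new float[] {
            m[0] * p[0] + m[1] * p[1] + m[2] * p[2] + m[3],
            m[4] * p[0] + m[5] * p[1] + m[6] * p[2] + m[7],
            m[8] * p[0] + m[9] * p[1] + m[10] * p[2] + m[11] };
    }

    static float dot(float[] a, float[] b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
    static float[] cross(float[] a, float[] b) {
        return new float[] { a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0] };
    }
    static float[] norm(float[] v) {
        float l = (float) Math.sqrt(dot(v, v));
        return new float[] { v[0] / l, v[1] / l, v[2] / l };
    }
}
```

With the eye at [0, 0, -5] looking at the origin, the origin lands at (0, 0, -5) in eye space: 5 units in front of the camera, since OpenGL cameras look down the negative z axis.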

iii. onSurfaceChanged:

This method is called whenever the GL Surface is changed. For example when you switch the phone’s orientation from Portrait to Landscape (or vice-versa).

GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 0.5f, 10);

The viewport is first set to completely cover the whole window of the phone (width and height are variables passed into the method).
The Projection Matrix is then set by calling the frustumM method. I am using Perspective projection instead of Orthographic. The view frustum is set using six clip planes (left, right, bottom, top, near and far). If you want to use Orthographic projection instead use the method “orthoM”. This is a good resource for understanding perspective/orthographic projections (if you want to get into the math-y stuff).
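For reference, frustumM fills in the standard OpenGL perspective frustum matrix from those six planes. A row-major sketch (the ndcZ helper is mine, added to show that eye-space depths on the near and far planes map to NDC z = -1 and +1):

```java
// Sketch: the OpenGL-style perspective frustum matrix, row-major.
class Frustum {
    static float[] frustum(float l, float r, float b, float t, float n, float f) {
        return new float[] {
            2 * n / (r - l), 0,               (r + l) / (r - l),  0,
            0,               2 * n / (t - b), (t + b) / (t - b),  0,
            0,               0,              -(f + n) / (f - n), -2 * f * n / (f - n),
            0,               0,              -1,                  0 };
    }

    // NDC depth of a point at eye-space depth zEye, after the perspective divide
    static float ndcZ(float[] m, float zEye) {
        float clipZ = m[10] * zEye + m[11];
        float clipW = -zEye; // from the (0, 0, -1, 0) bottom row
        return clipZ / clipW;
    }
}
```

Note the bottom row (0, 0, -1, 0): it copies -z into w, which is what produces the perspective foreshortening after the divide.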

Now we can finally move onto rendering everything.

iv. onDrawFrame:

This method is called repeatedly to generate each new frame. The first step is to set the background colour to black with glClearColor and clear the screen by clearing the GL_DEPTH_BUFFER_BIT and the GL_COLOR_BUFFER_BIT.

Next we lookup the current shader and start using it:

// the current shader
Shader shader = _shaders[this._currentShader];
int _program = shader.get_program();

// Start using the shader
GLES20.glUseProgram(_program);

Next we rotate the light if light rotation is enabled. Then we generate the ModelViewProjection matrix by multiplying the various matrices together (scaling, rotation, view and finally the projection matrix):

// scaling
Matrix.setIdentityM(mScaleMatrix, 0);
Matrix.scaleM(mScaleMatrix, 0, scaleX, scaleY, scaleZ);

// Rotation along x
Matrix.setRotateM(mRotXMatrix, 0, this.mAngleY, -1.0f, 0.0f, 0.0f);
// Set the ModelViewProjectionMatrix
float tempMatrix[] = new float[16];
Matrix.multiplyMM(tempMatrix, 0, mRotYMatrix, 0, mRotXMatrix, 0);
// ... (the scale and view-matrix multiplies are elided here)
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mMVPMatrix, 0);

To send the matrix to the vertex shader we call the glUniformMatrix4fv method:

GLES20.glUniformMatrix4fv(GLES20.glGetUniformLocation(_program, "uMVPMatrix"), 1, false, mMVPMatrix, 0);

This method looks up the handle of the uniform (“uMVPMatrix”) in the shader program and passes the matrix in.

Next we pass in the transformed normal matrix, which is the inverse of the ModelViewProjection matrix followed by its transpose. (Conventionally the normal matrix is the inverse transpose of the modelview matrix rather than the full MVP.)
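Why the inverse transpose? Under a non-uniform scale it keeps normals perpendicular to surfaces where the plain matrix would not. A sketch of that computation on the upper-left 3×3 via the cofactor (adjugate) route (names are mine, not from the project):

```java
// Sketch: inverse-transpose of a row-major 3x3 matrix via the adjugate.
// Since inverse = cofactor^T / det, the inverse-transpose is simply the
// cofactor matrix divided by the determinant.
class NormalMatrix {
    static float[] inverseTranspose3x3(float[] m) {
        // cofactors of m
        float c00 = m[4] * m[8] - m[5] * m[7];
        float c01 = m[5] * m[6] - m[3] * m[8];
        float c02 = m[3] * m[7] - m[4] * m[6];
        float c10 = m[2] * m[7] - m[1] * m[8];
        float c11 = m[0] * m[8] - m[2] * m[6];
        float c12 = m[1] * m[6] - m[0] * m[7];
        float c20 = m[1] * m[5] - m[2] * m[4];
        float c21 = m[2] * m[3] - m[0] * m[5];
        float c22 = m[0] * m[4] - m[1] * m[3];
        // determinant by expansion along the first row
        float det = m[0] * c00 + m[1] * c01 + m[2] * c02;
        return new float[] {
            c00 / det, c01 / det, c02 / det,
            c10 / det, c11 / det, c12 / det,
            c20 / det, c21 / det, c22 / det };
    }
}
```

For a pure scale of (2, 4, 8) this yields (0.5, 0.25, 0.125) on the diagonal, i.e. normals get scaled by the reciprocal factors and then renormalized in the shader.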

To pass in the lighting/material/eye position vectors we just call “glUniformxfv” (where x is the length of the vector):

// lighting variables
GLES20.glUniform4fv(GLES20.glGetUniformLocation(_program, "lightPos"), 1, lightPos, 0);
// material
GLES20.glUniform4fv(GLES20.glGetUniformLocation(_program, "matAmbient"), 1, matAmbient, 0);
// eye position
GLES20.glUniform3fv(GLES20.glGetUniformLocation(_program, "eyePos"), 1, eyePos, 0);

Drawing using the Vertex and Index Buffers:

Finally we come to the main part (And I appreciate you reading all the way down here!). We are going to send the vertex positions, normals and texture coordinates now. I will just demonstrate how vertex positions are sent, and you can figure out the rest.

// Get buffers from mesh
Object3D ob = this._objects[this._currentObject];
Mesh mesh = ob.getMesh();
FloatBuffer _vb = mesh.get_vb();
ShortBuffer _ib = mesh.get_ib();

short[] _indices = mesh.get_indices();

// position the buffer at the start of the vertex positions
_vb.position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
// the vertex coordinates (stride is in bytes: 8 floats * 4 bytes)
GLES20.glVertexAttribPointer(GLES20.glGetAttribLocation(_program, "aPosition"), 3, GLES20.GL_FLOAT, false,
		8 * 4, _vb);
GLES20.glEnableVertexAttribArray(GLES20.glGetAttribLocation(_program, "aPosition"));

The first step is to get the vertex buffer from the mesh of the object we are viewing. Once that’s done we position the buffer to where the vertex positions start. If we remember, our vertex buffer starts with vertex positions, followed by normals and texture coordinates, so TRIANGLE_VERTICES_DATA_POS_OFFSET = 0, TRIANGLE_VERTICES_DATA_NOR_OFFSET = 3 and so on. To pass in the vertices we use glVertexAttribPointer – we pass in the shader variable, the size of each vector, the type, the stride (8 floats in our case, i.e. 32 bytes, since the stride parameter is in bytes) and finally the buffer.

To make sure it works we have to enable the shader variable at the end. This procedure is repeated for normals and texture coordinates. Finally we draw with the index buffer:

// Draw with indices
GLES20.glDrawElements(GLES20.GL_TRIANGLES, _indices.length, GLES20.GL_UNSIGNED_SHORT, _ib);

We use glDrawElements to render using the index buffer. 3 consecutive indices in the index buffer define one triangle, so we use GL_TRIANGLES as the primitive. Other options are GL_TRIANGLE_STRIP, etc. (use GL_LINES if you want to draw wireframe). From what I’ve read it’s more efficient to use glDrawElements over glDrawArrays, which just takes in a flat list of vertex data. With glDrawArrays there is going to be repetition of data, which is something we want to avoid on a mobile platform with limited memory.
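To put rough numbers on that, here is a hedged back-of-the-envelope sketch (helper names are mine) using our 8-float vertex layout:

```java
// Sketch: byte cost of non-indexed vs indexed drawing.
class DrawCost {
    // glDrawArrays: every triangle corner carries a full vertex
    // (floatsPerVertex floats at 4 bytes each)
    static int drawArraysBytes(int triangles, int floatsPerVertex) {
        return triangles * 3 * floatsPerVertex * 4;
    }

    // glDrawElements: each unique vertex stored once, plus one
    // GL_UNSIGNED_SHORT (2 bytes) per index
    static int drawElementsBytes(int uniqueVertices, int indices, int floatsPerVertex) {
        return uniqueVertices * floatsPerVertex * 4 + indices * 2;
    }
}
```

For the textured cube (24 unique vertices because each face needs its own normals/UVs, 36 indices for 12 triangles): glDrawArrays would need 12 · 3 · 32 = 1152 bytes of vertex data, while glDrawElements needs 24 · 32 + 36 · 2 = 840 bytes, and the savings grow quickly for smooth, dense meshes where most vertices are shared.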


So I went over how to: interact with the renderer, load meshes, generate shader programs, set up the various variables and finally render everything. I might later write up how I implemented the shaders, but you can easily find other sources which describe how Phong shading and bump mapping are done in GLSL.

If you have any feedback and/or questions please leave a comment, and thanks for reading.

Links to the apk, source code and the repository:
apk here.
Source code here.
Google Code repository here. NOTE: Branch is “default”


  1. Bump Mapping
  2. How to use Multi-touch in Android

Comments on: "Getting started with OpenGL ES 2.0 shaders on Android" (54)

  1. Great Tutotial!! In your code project, the new class has quads variables…

    why this variables is added and why drawElements function is called two times..?

    • I’m guessing you are referring to the Google Code link? That is actually not directly related to this tutorial. I built upon this project to learn how to do “Render To Texture”:

      Basically when you are doing render to texture you are going to render your scene into a texture (hence the “quad” variables). This is the first “glDrawElements” call. The second “glDrawElements” call renders the texture itself by rendering a quad with the texture attached to it.

      I hope that makes sense! I would suggest downloading the .zip file provided rather than looking at the Google Code link since that is different.

  2. dalasjoe said:

    May be your post and source is the best android shader tutorial!!

    looking forward to your next lecure.

    Thanks again sjaay

  3. dalasjoe said:

    Hi again sjaay..

    To translate a mesh object, where the code Matrix.translateM placed …..?

    i try to below, but in normal mappping mode texture cube is crashed..

    private boolean renderToTexture() {
    float tempRotateMatrix[] = new float[16];
    Matrix.setIdentityM(mTranslateMatrix, 0);
    Matrix.translateM(mTranslateMatrix, 0, 4.0f, 4.0f , 4f);
    Matrix.setIdentityM(tempRotateMatrix, 0);
    Matrix.multiplyMM(tempRotateMatrix, 0, mTranslateMatrix, 0, tempRotateMatrix, 0);
    Matrix.multiplyMM(tempRotateMatrix, 0, mRotYMatrix, 0, mRotXMatrix, 0);
    Matrix.multiplyMM(tempRotateMatrix, 0, mTranslateMatrix, 0, tempRotateMatrix, 0); Matrix.multiplyMM(mMMatrix, 0, mScaleMatrix, 0, tempRotateMatrix, 0);
    Matrix.multiplyMM(mMVPMatrix, 0, mVMatrix, 0, mMMatrix, 0);
    Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mMVPMatrix, 0);

  4. DrakeDex said:

    This tutorial is great :X Man :D THANK YOU very much. I was killing my brains out trying to do this in native C. You saved me :D

  5. Well… i’m no expert but your shader looks incorrect to me.

    You seem to be completely ignoring your material and light values for the ambient component when a texture is loaded, and your diffuse and specular components don’t incorporate the texture at all.

    • When texture is on I consider the texture colour to be the “ambient” value of the material. I could multiply it with the ambient material but it would have produced a pinkish colour which I wanted to avoid.

      The diffuse and specular values are still “white” (1.0, 1.0, 1.0) when the texture is loaded. That’s why you see a white specular highlight on the textured surface (ideally you’d have a specular map)

      • Well, the problem with this is if you only have an ambient light in your scene, your textured model is going to appear “full bright” irregardless of any lights you have in the scene.

        Furthermore your model isn’t going to change no matter what colour the lighting is.

        I think you may have confused the meaning of material light values. They do *not* define the light that should illuminate the model, but rather how the model should *react* to the light in the scene. If your particular model was turning pink, this is an issue with your model’s material and not with the code.

      • @Deus X

        You are right. Thing is I ignored a lot of things in the shader – I just use “light color” rather than having ambient/diffuse/specular values for light.

        And I know that material values are supposed to interact with the light color – for now I only have it interacting for the ambient value (in case textures are off).

  6. Hello,
    Your tutorial is very good ! Thank you.
    But I can load an OBJ file without texture.
    To try with an more simple mesh, I juste create a new OBJ file with the sames lines that your texturecube without lines “vt”.
    In the Mesh class, when parsing file, I removed all parts which deals with texture.
    In the Renderer, in define my objet like following :
    _objects[2] = new Object3D(, false, context);
    When apk is running on my device, shape is not a cube but severals triangles … Do you know what could be my problem?
    Best Regards and thank you by advance,

    • did you also change the “face” information in the OBJ file? Here is the OBJ format for faces:

      * Loads an OBJ file
      * OBJ FORMAT:
      * ———-
      list of vertices:
      v x y z
      list of tex coords:
      vt u v
      list of normals:
      vn x y z
      list of faces
      f pos1/tc1/n1 pos2/tc2/n2 pos3/tc3/n3 **/

      As you can see after f you have the position, then the texture coordinate and then the normal. You would have to remove the texture coordinate and then maybe also modify the loadOBJ() method

      • I changed face information ==> f pos1/n1 pos2/n2 pos3/n3
        I modified the loadOBJ method by removing all texture lines.
        But I still get many triangles…
        That you do (tell me if i’m wrong)
        >read vertices in obj file
        >//read textures
        >read normals
        _v is float[] of vertice ; vs is buffer of vertice
        _n id float of normals ; vn is buffer of normals
        >read faces
        mainBuffer is a buffer with vertices,normals and textures according to the numbers vert texc vertN
        >put this buffer in _vertices[]
        >put vertices in buffer _vb
        _vb is the buffer used to draw.
        that i don’t understand it the fact tat in this buffer there are vertices normals and texture. How the draw method recognize that a float in this buffer is a normal or a vertex or a texture?
        And if I remove texture, the draw method will not know that and think the next float is a texture one but it is not since I removed it. I hope you understand me ^^.

        Moreover, the display does not recover the hole windows of the phone, there is a black band at the bottom. (I changed the background colour to white). You did not use any configChooser, and EGLsurfaceFactory as in the google sample, can it create this bad rendering?

        I really enjoy your help, many thanks!! And I don’t know if I can ask you that but may you send me an loadOBJ which work without texture? I think It is the best way for me to understand.
        Best Regards,

    • Since you changed the face information in the file you also have to change the read part of it (starting from line 385 in Make sure you handle that first

      The way the buffer is laid out is how the draw method knows. As I mentioned above the layout of the vertex buffer is as follows:

      [v1Pos.x, v1Pos.y, v1Pos.z, v1Nor.x, v1Nor.y, v1Nor.z, v1Tex.u, v1Tex.v, v2Pos.x….]

      So whenever we are passing in position, normal or tex coordinates we have to position the buffer at the exact value in the buffer (TRIANGLE_VERTICES_DATA_POS_OFFSET = 0 for position, the offset for normals is 3, for tex coords it is 6).

      When you do loadOFF() the Tex coords values in the buffer are empty, so that’s the way your buffer should look. Try debugging and looking at the values in your buffer when doing the reading (add breakpoints in Eclipse)

  7. Hi,

    I’m trying to load the off files in this site

    But, I’m unable to load any of them. Ex:- Teapot, Geometry, Numbers….

    What am I’m doing wrong.

    • I forgot to point this out in the post I think:

      My program only handles loading meshes with triangular faces. I downloaded models from that site too and a lot of them have faces with more than 3 vertices. I think that will cause a problem – see if you can find some with only triangular faces.
      I only use triangular ones because GPUs take in face data as triangles, so that makes it easier to handle. You could try converting all the faces to triangular using something like Blender.

  8. Thanks. I converted few using Blender.
    One quick question is about the application running on Android 3.0

    How to make it full screen? It seems to use only half of the screen. Sorry, I’m still a noob in both Android and GLES.


    • Since the application is designed for Android 2.2 the maximum resolution supported is 854 X 480 or something. (I tested it on my Galaxy S which has a res of 800 X 480). So when you run it on Android 3.0 it might just be doing a backwards compatibility thing and running it at half the res. Not sure how you could get it to run at full-screen (I’m guessing your res. is 1280 X 800). look at setting the target in the manifest file

    • This might help:

      (One of the headaches of Android development)

      • Yeah. Thanks. I looked at it. It suggests to add support multi screens to manifest. It didn’t work. Anyway, thanks for your help.

  9. Hi Shayan,

    Do you have any more GLSL ES shaders for Android.??

  10. I took a brick shader and edited according to your program

    When I run the application, its showing this error

    Renderer(1450): glDrawElements: glError 1281

    Couldn’t figure out what is it about. Please let me know if you know.

    • It’s hard to debug shaders with just that glError – you will have to make sure your textures, coordinates, normals, etc. are all proper

  11. Hi! Great tutorial, unfortunately on my phone (Galaxy S2 with Android 2.3.6) it crashes when I try the normal mapping. GLError 1281. Any idea why?


    • Hard to tell without having access to the hardware directly, but GLError 1281 usually refers to a problem with texture mapping (since normal mapping uses a bunch of textures). Try doing this:

      -Change the dimensions of the textures so both width and height are powers of 2 (for example 512 X 512, 256 X 256, etc.). I think my textures are all powers of 2, but I don’t remember exactly.

      Here is some more info:

      • Dirk Coetsee said:


        Not related to the texture size, literally the same apk works on the Galaxy S, crashes on the S2. The former uses PowerVR and the latter uses Mali. It’s odd, I need to download the android source to debug it.

      • Yes I use a Galaxy S myself and it works there.

        The GPU does have an effect on texture size. For example on my Galaxy S I cannot have textures where the width/height are not powers of 2 (but it is allowable in phones like the HTC Evo 4G). GLError 1281 usually refers to texture issues – perhaps the size or the way the texture is created. (maybe something to do with bit depth?)

  12. Pookmaster said:

    Okay, I have a samsung galaxy s2 and i also got that error 1281, after debugging it I found that the normal vertex and pixel shader files were using the dot() function which i believe is now a reserved command so I changed all occurances of it in the vs and ps file to dot2() and it ran perfectly :)

  13. I like this tutorial, but I’m having trouble adding objects to the screen… What I’m trying to do is add another cube, but smaller and at a different XY position…

    • The way the program is currently setup is that it renders one object at a time. If you want to render another cube you will have to read another .obj/.off file into a Mesh object, and then make another call similar to line 229-279 on

  14. The 1281 error was for me to do with two things:
    1. TRIANGLE_VERTICES_DATA_POS_OFFSET (and such) should be in Bytes not in components
    2. The element / face count being passed to DrawElements was indices count rather than face count.
    I can’t honestly see how this ever worked in the first place :S

  15. Actually I’m not entirely sure about point 1. I am getting conflicting information from the usage of ByteBuffer and using glBindBuffer. However I believe I am correct with regards to point 2. It should not be indices array length, but indices length / 3.

    • Hmm I am not sure. Did you try doing indices.length/3? Let me know what happens.

      This person was having trouble with passing in numFaces:

      • The original 1281 error I got was from the glVertexAttribPointer, where glGetAttribLocation call to aPosition, was returning -1. This is an invalid argument for glVertexAttribPointer. You should attempt to catch this or at least put a call to checkGlError after each call rather than just after glDrawElements. I think that’s what’s misleading people :)

  16. Oh god you’re right, damn that’s misleading :P if anyones interested :) I’m so used to Direct3D

  17. Hi, nice tutorial and thank you very much.

    There is something i cant understand here;

    // Create the normal modelview matrix
    // Invert + transpose of mvpmatrix
    Matrix.invertM(normalMatrix, 0, mMVPMatrix, 0);
    Matrix.transposeM(normalMatrix, 0, normalMatrix, 0);

    // send to the shader
    GLES20.glUniformMatrix4fv(GLES20.glGetUniformLocation(_program, "normalMatrix"), 1, false, mMVPMatrix, 0);

    Storing inverted mvpmatrix into normalMatrix then storing transpose of normalmatrix into normalmatrix then sending mvpmatrix to shader, why? its like;

    return b;

    • Thanks for pointing that out and reminding me about it. I was trying to figure out the normal matrix to send in, and did the Invert + Transpose first. I saw that it wasn’t giving me the proper image, so I sent in the modelviewprojection matrix instead which seems to give a better result. I forgot to comment out the normal modelview matrix code. Hope that clears it up

  18. Hey, so I’ve tried running code and the apk on my HTC desire s and everything seems to be working fine except i don’t see texture :( its just pink color

  19. Awesome tutorial!!! I was looking everywhere for something that clearly showed how to use shaders and load meshes from files. I just finished writting an exporter script for maya that creates OFF files, and I have succesfully imported meshes from Maya 2012 into my phone! Can’t wait for more tutorials! Great Job!

  20. The source code doesn’t seem to work with the latest version of android

    • Thanks for the comment. What is the error exactly? As you can tell, this was developed a long time ago (for Android 2.2/2.3?). Might need some updates.

  21. First thanks for the tutorial.
    Second if anybody gets the following error, here is the solution:

    Process: graphics.shaders, PID: 18524
    java.lang.IllegalArgumentException: length – offset < count*4 < needed
    at android.opengl.GLES20.glUniform4fv (Native Method)
    at graphics.shaders.Renderer.onDrawFrame (
    at android.opengl.GLSurfaceView$GLThread.guardedRun (
    at android.opengl.GLSurfaceView$ (

    The source of problem is that you have to pass a 4 element vector to each of GLES20.glUniform4fv() functions in onDrawFrame method. Only one of the vectors, float[] lightC = {0.5f, 0.5f, 0.5f} initialized with 3 element. Change it to float[] lightC = {0.5f, 0.5f, 0.5f, 1.0f}; and it will work.

  22. Hi, Thanks you very much. It’s nice tutorial. I want to change the 3D model, when I exported and loaded OBJ it’s not working. It’s getting crash! I think i have a problem of exporting OBJ. Can you please let me know how we can export OBJ and do you have custom OBJ exporter from Blender!

  23. Can anybody post me blender .OBJ exporter for this??
