To do this I'm going to need some image data. Well, since I'll be sticking to simple shapes, how about a cube and a die texture? I threw together this image to use as the texture:
I'll also need an OBJ file for the cube. This should do:
1: # Blender v2.69 (sub 0) OBJ File: ''
2: # www.blender.org
3: mtllib Cube.mtl
4: o Cube_Cube.001
5: v 1.000000 -1.000000 -9.000000
6: v 1.000000 -1.000000 -7.000000
7: v -1.000000 -1.000000 -7.000000
8: v -1.000000 -1.000000 -9.000000
9: v 1.000000 1.000000 -9.000000
10: v 1.000000 1.000000 -7.000000
11: v -1.000000 1.000000 -7.000000
12: v -1.000000 1.000000 -9.000000
13: vt 0.000000 0.000000
14: vt 0.000000 1.000000
15: vt 1.000000 0.000000
16: vt 1.000000 1.000000
17: vn 0.000000 0.000000 -1.000000
18: vn -1.000000 -0.000000 -0.000000
19: vn -0.000000 -0.000000 1.000000
20: vn -0.000001 0.000000 1.000000
21: vn 1.000000 -0.000000 0.000000
22: vn 1.000000 0.000000 0.000001
23: vn 0.000000 1.000000 -0.000000
24: vn -0.000000 -1.000000 0.000000
25: usemtl None
26: s off
27: f 5/1/1 1/2/1 4/4/1
28: f 5/1/1 4/2/1 8/1/1
29: f 3/4/2 7/3/2 8/1/2
30: f 3/4/2 8/1/2 4/2/2
31: f 2/4/3 6/3/3 3/2/3
32: f 6/3/4 7/1/4 3/2/4
33: f 1/4/5 5/3/5 2/2/5
34: f 5/3/6 6/1/6 2/2/6
35: f 5/3/7 8/1/7 6/4/7
36: f 8/1/7 7/2/7 6/4/7
37: f 1/4/8 2/3/8 3/1/8
38: f 1/4/8 3/1/8 4/2/8
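A quick note on the `f` lines above: each corner of a face is a vertex-index/UV-index/normal-index triplet, and all three indexes are 1-based references into the `v`, `vt`, and `vn` lists. A minimal sketch of picking one face line apart might look like this (a hypothetical snippet for illustration, not the loader used in this series):

```csharp
// Parse one OBJ face line like "f 5/1/1 1/2/1 4/4/1".
// Each corner is vertexIndex/uvIndex/normalIndex, all 1-based.
string strLine = "f 5/1/1 1/2/1 4/4/1";
string[] astrParts = strLine.Split(' ');
for (int iCtr = 1; iCtr < astrParts.Length; iCtr++) {
    string[] astrIdx = astrParts[iCtr].Split('/');
    int iVertIdx = int.Parse(astrIdx[0]) - 1; // convert to 0-based
    int iUVIdx   = int.Parse(astrIdx[1]) - 1;
    int iNormIdx = int.Parse(astrIdx[2]) - 1;
    // Look up the matching v / vt / vn entries with these indexes...
}
```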
Graphics resources are ready. Now let's build the code to make this show up on screen. Since this texture is being applied to my model, it makes sense to put all the texture handling code in the GLModel class. First we need to add a few more member variables to the class: cbUVCoordsChange, a flag to mark when the UV coordinates have been modified; ciTextureID, to hold the ID of the texture object; ciUVCoordsBuffer, to hold the buffer object that stores the UV coordinates; and ciUVCoordsID, to hold the shader variable ID for the UV coordinates.
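The declarations themselves aren't shown in the class listing, but based on how they're used they would look something like this (following the class's existing naming conventions; the Vector2 array for the UV data is my assumption about how the coordinates are stored):

```csharp
// Texture-related members added to GLModel. cbUVCoordsChange is a flag;
// the rest are OpenGL object/attribute IDs, with -1 meaning "not set".
private bool cbUVCoordsChange;   // true when the UV array needs re-uploading
private int ciTextureID;         // OpenGL texture object ID
private int ciUVCoordsBuffer;    // buffer object holding the UV coordinates
private int ciUVCoordsID;        // shader attribute location for the UVs
private Vector2[] cav2UVCoords;  // the UV coordinate data itself
```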
The constructor will need a slight change to initialize those new variables:
1: public GLModel() {
2: int[] aiBuffList;
3: cbVertexChange = true;
4: cbNormalChange = true;
5: cbUVCoordsChange = true;
6: cbUniformChange = true;
7: ciUniformBuffer = -1;
8: ciUniformID = -1;
9: ciVertexBuffer = -1;
10: ciVertexID = -1;
11: ciTextureID = -1;
12: ciUVCoordsID = -1;
13: aiBuffList = new int[4];
14: GL.GenBuffers(aiBuffList.Length, aiBuffList);
15: ciVertexBuffer = aiBuffList[0];
16: ciUniformBuffer = aiBuffList[1];
17: ciUVCoordsBuffer = aiBuffList[2];
18: }
Once that's done everything should be ready to load a texture. I'll add a new method LoadTexture() to do just that:
1: public bool LoadTexture(string strFile) {
2: Bitmap bmpTexture;
3: System.Drawing.Imaging.BitmapData oTextureData;
4: Rectangle rctImageBounds;
5:
6: //Load the image from file
7: bmpTexture = new Bitmap(strFile);
8:
9: //Convert the image to a form compatible with OpenGL
10: rctImageBounds = new Rectangle(0, 0, bmpTexture.Width, bmpTexture.Height);
11:
12: oTextureData = bmpTexture.LockBits(rctImageBounds, System.Drawing.Imaging.ImageLockMode.ReadOnly, System.Drawing.Imaging.PixelFormat.Format24bppRgb);
13:
14: //Generate a new texture
15: ciTextureID = GL.GenTexture();
16:
17: //Copy the image data into the texture
18: GL.BindTexture(TextureTarget.Texture2D, ciTextureID);
19:
20: GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, bmpTexture.Width, bmpTexture.Height, 0, PixelFormat.Rgb, PixelType.UnsignedByte, oTextureData.Scan0);
21:
22: GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (float)TextureMinFilter.Linear);
23: GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (float)TextureMagFilter.Linear);
24:
25: bmpTexture.UnlockBits(oTextureData);
26:
27: return true;
28: }
Let's take a closer look at what's going on in this function. You might notice that I use items from the System.Drawing.Imaging namespace by their full path. I don't add a using directive for this namespace since some of its contents share names with types in the OpenTK.Graphics namespace. The compiler will balk at this ambiguity and insist that you use a fully qualified name to clarify.
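If the long names get tedious, a using alias sidesteps the ambiguity without pulling in the whole namespace. The alias names here are my own choice, and I'm assuming the GL enums live under OpenTK.Graphics.OpenGL as in the OpenTK builds I've seen:

```csharp
// Aliases disambiguate types that exist in both namespaces (PixelFormat
// being the usual collision) without a blanket using directive.
using GDIPixelFormat = System.Drawing.Imaging.PixelFormat;
using GLPixelFormat = OpenTK.Graphics.OpenGL.PixelFormat;
```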
On line 7 the image data is loaded from a file. The C# Bitmap class is pretty nice and allows access to BMP, GIF, EXIF, JPG, PNG and TIFF image formats.
Line 10 creates a rectangle which will be used to define the portion of the image to use in the texture. You can use this to put portions of the image into different texture objects if necessary. In this case I don't see much reason for that and will use the entire image.
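For example, if you wanted just the top-left quarter of the image as its own texture, you could lock a sub-rectangle instead. A hypothetical variation, not part of LoadTexture() as written:

```csharp
// Lock only the top-left quarter of the bitmap; LockBits accepts any
// rectangle that fits within the image bounds.
Rectangle rctQuarter = new Rectangle(0, 0, bmpTexture.Width / 2, bmpTexture.Height / 2);
oTextureData = bmpTexture.LockBits(rctQuarter,
    System.Drawing.Imaging.ImageLockMode.ReadOnly,
    System.Drawing.Imaging.PixelFormat.Format24bppRgb);
```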
The Bitmap class provides a means to fix the pixel data into a particular format, which is great since OpenGL does have some expectations for this data when creating a texture. On line 12 I'm setting the format I want, which is 24 bits per pixel with red, green, and blue color channels.
The texture object must be created and bound before we can work on it, so I do this on lines 15 and 18.
The GL.TexImage2D function takes our image data and puts it into the OpenGL texture object. We need to tell it, in order: what sort of texture we want (a 2D texture); the level of detail if we're using mipmaps (we aren't, so 0); what format we want the pixel data kept in (red, green, blue, and alpha); the width of the texture; the height of the texture; the size of the border (my image doesn't have one, so 0); the format our loaded data is in (red, green, blue with no alpha — one caveat here: GDI+ actually stores 24-bit pixels in blue-green-red order in memory, so if your reds and blues come out swapped, pass PixelFormat.Bgr instead); the data type holding our loaded data (unsigned bytes); and finally the image data itself.
The function GL.TexParameter() is used to set various parameters on the texture that adjust how OpenGL draws it. On line 22 I'm setting the minifying method to Linear, which instructs GL to look at the nearby pixels and generate an average color from them. This smooths the image a bit when it's shrunk down on screen.
Line 23 sets the magnifying method to Linear as well. This will smooth the texture a bit when it's stretched on screen.
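If you'd rather have crisp, blocky texels (for pixel art, say), Nearest is the alternative; and if you generate mipmaps, the minifying filter can blend between mip levels too. A sketch of those variations — the same TexParameter call, just different enum values:

```csharp
// Blocky sampling: take the single closest texel, no averaging.
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (float)TextureMinFilter.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (float)TextureMagFilter.Nearest);

// Or, after generating mipmaps, blend between mip levels when minifying:
GL.GenerateMipmap(GenerateMipmapTarget.Texture2D);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (float)TextureMinFilter.LinearMipmapLinear);
```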
Line 25 releases the lock on the loaded image data, and we're all done.
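One thing LoadTexture() glosses over is cleanup and failure: the Bitmap should be disposed once the pixels are uploaded, and the Bitmap constructor throws if the file is missing or unreadable. A sketch of a more defensive version — same calls as above, just wrapped (the error message string is my own):

```csharp
public bool LoadTexture(string strFile) {
    try {
        // using ensures the Bitmap is disposed even if an upload step throws
        using (Bitmap bmpTexture = new Bitmap(strFile)) {
            Rectangle rctImageBounds = new Rectangle(0, 0, bmpTexture.Width, bmpTexture.Height);
            System.Drawing.Imaging.BitmapData oTextureData = bmpTexture.LockBits(rctImageBounds,
                System.Drawing.Imaging.ImageLockMode.ReadOnly,
                System.Drawing.Imaging.PixelFormat.Format24bppRgb);
            try {
                ciTextureID = GL.GenTexture();
                GL.BindTexture(TextureTarget.Texture2D, ciTextureID);
                GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba,
                    bmpTexture.Width, bmpTexture.Height, 0,
                    PixelFormat.Rgb, PixelType.UnsignedByte, oTextureData.Scan0);
                GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (float)TextureMinFilter.Linear);
                GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (float)TextureMagFilter.Linear);
            } finally {
                bmpTexture.UnlockBits(oTextureData); // always release the lock
            }
        }
        return true;
    } catch (ArgumentException) { // Bitmap throws this for missing/invalid files
        cstrErrDesc = "Unable to load texture from " + strFile;
        return false;
    }
}
```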
Our texture is all ready to be used; all we need to do is bind it when we render the model so that the shaders have access to it. A few small changes to the RenderObject() function are needed for this:
1: public bool RenderObject() {
2: cstrErrDesc = "";
3: float[] afUniformData;
4:
5: //Be sure the buffer objects have been specified
6: if ((ciUniformID == -1)||(ciUniformBuffer == -1)) {
7: cstrErrDesc = "Unable to render with no Uniform Buffer or ID specified.";
8: return false;
9: }
10:
11: if ((ciVertexBuffer == -1)||(ciVertexID == -1)) {
12: cstrErrDesc = "Unable to render with no Vertex Buffer or ID specified.";
13: return false;
14: }
15:
16: //Check if buffer data needs refreshed
17: if (cbVertexChange == true) {
18: GL.BindBuffer(BufferTarget.ArrayBuffer, ciVertexBuffer);
19: GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(cav3Vertexes.Length * Vector3.SizeInBytes), cav3Vertexes, BufferUsageHint.DynamicDraw);
20: cbVertexChange = false;
21: }
22:
23: if (cbUVCoordsChange == true) {
24: GL.BindBuffer(BufferTarget.ArrayBuffer, ciUVCoordsBuffer);
25: GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(cav2UVCoords.Length * Vector2.SizeInBytes), cav2UVCoords, BufferUsageHint.DynamicDraw);
26: cbUVCoordsChange = false;
27: }
28:
29: if (cbNormalChange == true) {
30:
31: cbNormalChange = false;
32: }
33:
34: if (cbUniformChange == true) {
35: //Set Uniform buffer data
36: afUniformData = new float[4];
37:
38: afUniformData[0] = cv4BaseColor.X;
39: afUniformData[1] = cv4BaseColor.Y;
40: afUniformData[2] = cv4BaseColor.Z;
41: afUniformData[3] = cv4BaseColor.W;
42:
43: GL.BindBuffer(BufferTarget.UniformBuffer, ciUniformBuffer);
44: GL.BufferData(BufferTarget.UniformBuffer, (IntPtr)(afUniformData.Length * sizeof(float)), afUniformData, BufferUsageHint.DynamicDraw);
45: }
46:
47: //Draw the model
48: if (ciTextureID != -1) {//A texture is loaded, use it
49: GL.BindTexture(TextureTarget.Texture2D, ciTextureID);
50: }
51:
52: GL.BindBuffer(BufferTarget.UniformBuffer, ciUniformBuffer);
53: GL.BindBufferBase(BufferTarget.UniformBuffer, ciUniformID, ciUniformBuffer);
54:
55: GL.BindBuffer(BufferTarget.ArrayBuffer, ciVertexBuffer);
56: GL.EnableVertexAttribArray(ciVertexID);
57: GL.VertexAttribPointer(ciVertexID, 3, VertexAttribPointerType.Float, false, Vector3.SizeInBytes, 0);
58:
59: GL.BindBuffer(BufferTarget.ArrayBuffer, ciUVCoordsBuffer);
60: GL.EnableVertexAttribArray(ciUVCoordsID);
61: GL.VertexAttribPointer(ciUVCoordsID, 2, VertexAttribPointerType.Float, false, Vector2.SizeInBytes, 0);
62:
63: GL.DrawArrays(BeginMode.Triangles, 0, cav3Vertexes.Length);
64: GL.DisableVertexAttribArray(ciVertexID);
65: GL.DisableVertexAttribArray(ciUVCoordsID);
66:
67: return true;
68: }
On lines 48 through 50 I check to see whether a texture has been set; if so, I bind it for use. On lines 59 through 61 I set the UV coordinate array up so the shaders can get to it. Nothing really new, just new uses for old functions.
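One detail worth knowing: GL.BindTexture binds to whichever texture unit is currently active, which is unit 0 by default, and that's all a single texture needs. If you ever use several textures at once, you select units explicitly. A sketch of the general pattern, not something this post needs yet (ciSecondTextureID is a hypothetical second texture):

```csharp
// Bind two textures to two different texture units.
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, ciTextureID);
GL.ActiveTexture(TextureUnit.Texture1);
GL.BindTexture(TextureTarget.Texture2D, ciSecondTextureID);
```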
Still not done though. All the data is being passed to the shaders, but they aren't ready to use it yet. Time to get them into shape. The vertex shader needs less work, so I'll start there:
1: #version 130
2: #extension GL_ARB_uniform_buffer_object : enable
3:
4: uniform Data {
5: mat4 m4Projection;
6: vec4 v4BaseColor;
7: };
8:
9: uniform ModelData {
10: vec4 v4ModelColor;
11: };
12:
13: attribute vec3 v3Location;
14: attribute vec2 v2UVCoord;
15:
16: varying vec4 v4FragColor;
17: varying vec2 v2UV;
18:
19: void main() {
20: gl_Position = vec4(v3Location, 1.0) * m4Projection;
21:
22: v4FragColor = v4ModelColor;
23:
24: v2UV = v2UVCoord;
25: }
Line 14 declares the attribute that receives the UV coordinates from our new buffer, and line 17 creates the varying that carries them along. Line 24 is where that value gets passed over to the fragment shader. I haven't really explained what those coordinates are for yet, but they are pretty important. I was hoping to get something on the screen to use as a reference; it's just taking a lot longer than I had expected.
That's all for the vertex shader, so let's have a look at the fragment shader.
1: #version 130
2: #extension GL_ARB_uniform_buffer_object : enable
3:
4: uniform Data {
5: mat4 m4Projection;
6: vec4 v4BaseColor;
7: };
8:
9: uniform sampler2D sSampler;
10:
11: uniform ModelData {
12: vec4 v4ModelColor;
13: };
14:
15: varying vec4 v4FragColor;
16: varying vec2 v2UV;
17:
18: void main() {
19: gl_FragColor = texture(sSampler, v2UV).rgba;
20: }
Line 9 establishes a new uniform variable, one that we didn't touch at all in the program. Sampler uniforms default to 0, which refers to texture unit zero — and since that's exactly the unit our texture gets bound to, everything works without us setting anything.
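If you ever bind the texture to a different unit, you would need to set the sampler uniform yourself. A hedged sketch, assuming ciShaderProgram holds the linked shader program's ID (that name is my own; this class hasn't shown one):

```csharp
// Point sSampler at texture unit 1 instead of the default 0.
int iSamplerLoc = GL.GetUniformLocation(ciShaderProgram, "sSampler");
GL.Uniform1(iSamplerLoc, 1); // the value is a texture unit index, not a texture ID
```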
The way the color gets set for the fragment, on line 19, has also changed. The texture() function lets you retrieve a color from the texture image, and the UV coordinates specify where in the image to sample that color from. The UV coordinates we see here are not exactly the ones we put into the buffer object; they are derived from them. We specified a UV pair for each vertex, but a polygon covers a different number of fragments than it has vertexes, so OpenGL interpolates between the vertex UVs to calculate a UV coordinate for each fragment.
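To make that concrete, here's a simplified sketch of the interpolation OpenGL performs, reduced to a single edge; real rasterization blends across the whole triangle using barycentric weights, with perspective correction:

```csharp
// A fragment halfway along an edge between two vertexes gets the
// average of their UV coordinates.
Vector2 v2UVStart = new Vector2(0.0f, 0.0f); // UV at the first vertex
Vector2 v2UVEnd = new Vector2(1.0f, 0.0f);   // UV at the second vertex
float fBlend = 0.5f;                         // how far along the edge the fragment sits
Vector2 v2FragUV = Vector2.Lerp(v2UVStart, v2UVEnd, fBlend); // → (0.5, 0.0)
```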
Put all this together, and we should end up with something like this on the screen:
That looks terrible. Well, it proves that our texture was loaded, since it's right there on the screen, but beyond that this isn't at all what I wanted. We can't tell that we're looking at a cube since we only see the one face, and the whole texture is showing on that side instead of just the one die face.
I really wanted to get all this done in one post, but there's just too much that needs to happen. Still to come: correcting the texture placement so the die faces match the cube sides, and moving this cube around so we can tell it's really a cube.
For the complete source code for this post go here.