
Textures and Shaders in Metal on iOS

19 Nov 2014 · CPOL · 4 min read
How to start in Metal on iOS.

Introduction

This article is a follow-up to my OpenGL article and shows how to work with the new Metal API from Apple, which was developed from scratch to succeed OpenGL with better efficiency, performance and maintainability. Important to know is that Metal is only supported on newer hardware, precisely: A7 chips and higher from Apple, running iOS 8. So some will need an update, maybe also on the hardware side. The different buttons give access to routines in which different shader techniques are implemented.

Image 1

Download demo

Background

While I was busy with some OpenGL tasks, I wanted to learn about Metal. So this is a follow-up to my identically structured article about OpenGL, which makes a comparison possible, because I use the same task levels and designs.

Using the code

The use of the Metal API is straightforward, but it separates into three parts: you initialize the Metal device, you set up the shader code, and you run your drawing code. My mental picture is that "the device" is the GPU and the display, both hardware, while the shaders are the software that runs on the GPU.

The hardware

The hardware (A7 or higher) and software (iOS 8 or higher) check is in the model:

Objective-C
_device = MTLCreateSystemDefaultDevice();
		
if( _device == nil )
{
	AppLog(@"ERROR: No Metal device");
	return false;
}

and in the view. Remember that this only works if the layer is Metal-capable; otherwise it is nil.

_metalLayer = (CAMetalLayer*) self.layer;
	
AppLog(@"Layer pixel format: %d",(int)_metalLayer.pixelFormat);

if( _metalLayer == nil )
{
	AppLog(@"ERROR: No Metal layer. iOS 8 and A7 or higher needed");
	return false;
}
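
For the cast to deliver a CAMetalLayer at all, the view has to declare CAMetalLayer as its backing layer class. A minimal sketch of that override (assuming a plain UIView subclass) looks like this:

+ (Class) layerClass
{
	//back this view with a CAMetalLayer instead of a plain CALayer
	return [CAMetalLayer class];
}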

There is also some code earlier in the project to check for and diagnose this case.

The software

For the Metal shading language I will only point to this link to the Apple documentation, but I want to stress the similarity to OpenGL: you need a vertex function and a fragment function. You also have special keywords, which are marked with a double pair of square brackets. These keywords are the interface to the Objective-C side. You can transfer data in buffers (like structures) and in textures. Most interesting is that this memory is allocated by the Metal API, so the Metal API can access it directly.

The vertex function computes the coordinates of the individual vertices (4D points); the result then goes into the fragment function, which is called for every pixel and interpolates the color. So it is useful to return a structure from the vertex function to enrich the input of the fragment function.

struct TheVertex {
	float4 position [[ position ]];  //clip-space position
	half4  color;
};

vertex TheVertex vertexTexture( constant float4 *position  [[ buffer(0) ]],
                                uint             vid       [[ vertex_id ]])
{
	TheVertex data;
	data.position = position[vid];   //pass the vertex position through
	data.color    = half4(0,1,0,1);  //possibly set color
	return data;
}

fragment half4 fragmentBuffer( TheVertex input [[ stage_in ]] )
{
	return input.color;              //interpolated color for this pixel
}

Show me the pixels

To see some pixels, we need to call the software with our data. For that we need to set up some buffers or textures and finally let Metal do the work. These are quite a few steps. It is interesting to know that you prepare all fixed data once and rebuild only what is needed.

Create a texture and fill it with nice image bits:

//create texture descriptor (blueprint for the texture)
MTLTextureDescriptor *mtlTextDesc =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatRGBA8Unorm
                                                       width:_width
                                                      height:_height
                                                   mipmapped:NO];
//create texture with "blueprint"
_texture = [device newTextureWithDescriptor:mtlTextDesc];

MTLRegion region = MTLRegionMake2D(0, 0, _width, _height);

[_texture replaceRegion:region
            mipmapLevel:0
              withBytes:imageBits
            bytesPerRow:4*_width];
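
The snippet above assumes that imageBits already holds the raw RGBA bytes. A minimal sketch (not the article's code; the image name and variable names are placeholders) of how such bits could be pulled out of a UIImage via a bitmap context:

UIImage *image = [UIImage imageNamed:@"nice.png"];   //placeholder asset name
NSUInteger _width  = CGImageGetWidth(image.CGImage);
NSUInteger _height = CGImageGetHeight(image.CGImage);

void *imageBits = calloc(_width * _height, 4);       //4 bytes per RGBA8 pixel
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(imageBits, _width, _height, 8, 4 * _width,
                                         colorSpace, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, _width, _height), image.CGImage);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);
//imageBits can now be copied into the texture with replaceRegion: as shown above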

Create the render pipeline by pointing to the shader programs. An advantage of Metal is that the programs are already compiled at build time, so they only need to be loaded:
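
The code below assumes that the shader library and the fragment function already exist. A minimal sketch of obtaining them (using the default library that Xcode compiles from the project's .metal files; progFragment is a placeholder analogous to progVertex):

// load the library compiled from the .metal files at build time
id <MTLLibrary> library = [device newDefaultLibrary];

// get the fragment function, analogous to the vertex function below
id <MTLFunction> fragmentProgram = [library newFunctionWithName:progFragment];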

// get the vertex function from the library
vertexProgram = [library newFunctionWithName:progVertex];

// create a pipeline description
MTLRenderPipelineDescriptor *desc = [MTLRenderPipelineDescriptor new];

desc.vertexFunction                     = vertexProgram;
desc.fragmentFunction                   = fragmentProgram;
desc.depthAttachmentPixelFormat         = MTLPixelFormatInvalid; //no depth texture set
desc.stencilAttachmentPixelFormat       = MTLPixelFormatInvalid;
desc.colorAttachments[0].pixelFormat    = metalLayer.pixelFormat;// framebuffer pixel format must match the Metal layer
desc.sampleCount                        = 1;

NSError *error = nil;

id <MTLRenderPipelineState> pipelineState = [device newRenderPipelineStateWithDescriptor:desc error:&error];

if (error != nil) {
    NSLog(@"Error creating RenderPipelineState: %@. Vertex or fragment bogus?", error);
}

NSAssert(pipelineState != nil, @"ERROR: We need a renderpipelinestate");

return pipelineState;

The final stage is to create the render pass and put it into a command buffer. It is the last preparation stage and quite complex. It is interesting to know that one or more render passes can be processed at a time. The command buffer gets the final "go" with this line of code:

// Put command buffer now into the queue
[commandBuffer commit];
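
Since only the final commit is shown above, here is a minimal sketch of one complete render pass, from the drawable to the submit; names such as commandQueue, vertexBuffer, texture and vertexCount are assumptions, not the article's exact code:

// fetch the next drawable texture from the Metal layer
id <CAMetalDrawable> drawable = [metalLayer nextDrawable];

// describe the render pass: clear to black, keep the result
MTLRenderPassDescriptor *passDesc = [MTLRenderPassDescriptor renderPassDescriptor];
passDesc.colorAttachments[0].texture     = drawable.texture;
passDesc.colorAttachments[0].loadAction  = MTLLoadActionClear;
passDesc.colorAttachments[0].clearColor  = MTLClearColorMake(0, 0, 0, 1);
passDesc.colorAttachments[0].storeAction = MTLStoreActionStore;

// encode the drawing commands
id <MTLCommandBuffer> commandBuffer = [commandQueue commandBuffer];
id <MTLRenderCommandEncoder> encoder = [commandBuffer renderCommandEncoderWithDescriptor:passDesc];
[encoder setRenderPipelineState:pipelineState];
[encoder setVertexBuffer:vertexBuffer offset:0 atIndex:0];   //matches [[ buffer(0) ]]
[encoder setFragmentTexture:texture atIndex:0];
[encoder drawPrimitives:MTLPrimitiveTypeTriangleStrip vertexStart:0 vertexCount:vertexCount];
[encoder endEncoding];

// show the result and put the command buffer into the queue
[commandBuffer presentDrawable:drawable];
[commandBuffer commit];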

Points of Interest

Metal is a powerful API, but after testing it I must confess that my needs are already fulfilled by OpenGL. Still, I really admire the great work of the people at Apple: the compiler yells about most of the bugs, and debugging the code is fun because exceptions tell you what is missing.

I made other tests in which Metal was faster and some 5% more energy efficient. Maybe drawing some pixels isn't screaming for optimization.

Another point of interest is that Metal addresses the pixels directly, so Retina resolution means more pixel processing, and the higher resolution needs more power and time. For that reason I have spent some time on a useful log. The Retina routines need 4x as long as the standard resolution. No wonder, because there are 4 times the pixels.

Image 2
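
For reference, the pixel count the GPU has to process is set by the drawable size of the Metal layer; a minimal sketch (assuming it is set in the view's layoutSubviews) of how the Retina scale factor enters the picture:

// on a Retina device contentScaleFactor is 2.0, i.e. 4x the pixels of the point size
CGFloat scale = self.contentScaleFactor;
_metalLayer.drawableSize = CGSizeMake(self.bounds.size.width  * scale,
                                      self.bounds.size.height * scale);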

 

Whoever now has an appetite for this kind of Metal should dig into the Apple documentation. The videos provide a good overview, and the sample code is quite interesting.

Also, the projects MetalCamera and Metal by Example are really great pieces of software.

History

- Initial Version 

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


