SCNTechnique in SceneKit

New in iOS 8 is SCNTechnique. It's a way to pass your models through a custom shader program, as an added layer on top of an object's filters. For instance, you can apply a Gaussian blur to a single node in a SceneKit scene, then pass the scene to a fragment shader to change the color of the scene.

Getting started, I'm setting up a new iOS game project.

This will create the scene view that SceneKit uses to render its 3D objects. I've named the project SCNTechnique, which makes it obvious what this project is being used for.


I'm also enabling local source control; I think git is required for this to work. If you don't already use source control, I suggest you start: if you break something, it's useful to know what you changed.

In the project we'll want to set up a new dictionary that's used to reference and define the data going to and from the SCNTechnique rendering pass. I'm adding an iOS property list. Select "add new file" in the project to open the "choose a template for your new file" dialog:


Then I’m naming it firstPass, it’ll add the plist file extension for me. A property list is an XML file which Xcode allows us to edit with a more friendly interface.


The file appears with Root as the only section of data. We need to populate this with a few things to turn it into something that SCNTechnique can use. There are three entries, sequence, passes, and symbols, where sequence is an Array and passes and symbols are both Dictionaries. These are added with the plus icon, and the type is changed with a popup menu in the Type column.


sequence defines which pass is rendered when. To be honest, I think this means you can define multiple passes in a single SCNTechnique, but so far this seems quite buggy and I can't get more than one pass working. passes defines which parameters are passed to and from the shader programs; we'll get to shader programs in a moment. symbols defines the name and type of the data passed from your code into the shaders.

So, shader programs: I couldn't find any appropriate template to start from, so I ended up picking the iOS/Other/Empty template. Which isn't much of a template, but it is necessary to ensure the project is aware of your new files. I added two files, firstProgram.vsh and firstProgram.fsh.


These live in the project alongside the other source files.


What are the .fsh and .vsh for? The .fsh is a fragment shader and the .vsh is a vertex shader; basically, these are common OpenGL-style shader programs. To make the connection between these shader programs and the SCNTechnique in your SceneKit scene, you'll need to add them to firstPass.plist, so back to defining the technique's property list.

In the sequence array we need to name a rendering pass that we'll define in the passes dictionary. This seems to be the entry point for the technique. I'm naming it renderingPass.


This tells the technique where to start. Item 0 of the sequence needs to point to an entry in passes. Then in passes we are required to have four entries.


draw, program, inputs, and outputs: draw and program are strings, and inputs and outputs are both dictionaries.


 

draw needs to be set to DRAW_QUAD, which draws the data piped into the shader program; DRAW_NODE and DRAW_SCENE are also available. Each is used for specific reasons, but for now we'll use DRAW_QUAD. program refers to the .vsh and .fsh files we created and added to the project a moment ago (note: the case and underscores are important). Don't add the .vsh or .fsh extension to program; the technique knows what to do with the name. It'll always look for both files, and if either is missing the technique will fail and your app will stop when looking for the program.

It seems rather necessary that one of the inputs is COLOR, which goes into colorSampler. COLOR is an OpenGL data source, and colorSampler will be available in the shader program with data coming from COLOR. a_position is a variable which points to a_position-symbol. We'll need to add this to the symbols section of the technique's plist, so we'll add that next.

In a_position we add a_position-symbol, and in symbols we create a_position-symbol as a dictionary and add a semantic vertex string to create the link between your SCNTechnique and the program. I'm not exactly sure why we don't need to do this for colorSampler, but that's how it works.
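Viewed as raw XML (the way the plist is stored on disk), the technique definition we've been building ends up with roughly this shape. This is a sketch of my file, not a canonical template:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>sequence</key>
    <array>
        <string>renderingPass</string>
    </array>
    <key>passes</key>
    <dict>
        <key>renderingPass</key>
        <dict>
            <key>draw</key>
            <string>DRAW_QUAD</string>
            <key>program</key>
            <string>firstProgram</string>
            <key>inputs</key>
            <dict>
                <key>colorSampler</key>
                <string>COLOR</string>
                <key>a_position</key>
                <string>a_position-symbol</string>
            </dict>
            <key>outputs</key>
            <dict>
                <key>color</key>
                <string>COLOR</string>
            </dict>
        </dict>
    </dict>
    <key>symbols</key>
    <dict>
        <key>a_position-symbol</key>
        <dict>
            <key>semantic</key>
            <string>vertex</string>
        </dict>
    </dict>
</dict>
</plist>
```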

Now we should add code to both the .vsh and .fsh to make this work. In firstProgram.fsh I've added the following code:
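A minimal pass-through fragment shader for this setup looks something like this (the uv varying name is my choice; it just has to match the vertex shader):

```glsl
uniform sampler2D colorSampler; // piped in from the plist's inputs

varying vec2 uv; // texture coordinate handed over from the vertex shader

void main() {
    // Sample the scene's color at this fragment and pass it through unchanged.
    gl_FragColor = texture2D(colorSampler, uv);
}
```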

This isn't Objective-C, it's GLSL (which looks a lot like C), and the colorSampler here is being piped in from the plist, which SCNTechnique is using to tunnel data through. gl_FragColor is the final output of the fragment shader.

In the .vsh we need to add the following code:
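A matching minimal vertex shader, again a sketch using my own uv varying, passes the quad's vertices straight through and derives a texture coordinate from them:

```glsl
attribute vec4 a_position; // bound via a_position-symbol in the plist

varying vec2 uv;

void main() {
    // DRAW_QUAD hands us a full-screen quad already in clip space.
    gl_Position = a_position;
    // Map clip-space [-1, 1] to texture-space [0, 1] for the fragment shader.
    uv = (a_position.xy + 1.0) * 0.5;
}
```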

gl_Position is the final output for the vertices in the vertex shader. With these two parts of firstProgram in place and firstPass.plist set up, we can now use it in GameViewController.m.

In the template code we follow on after the scnView is set up with the scene. Then we create an NSURL *url and fill it with url = [[NSBundle mainBundle] URLForResource:@"firstPass" withExtension:@"plist"]; this makes a URL that points to the firstPass.plist we created, with all of the data used to connect our shader program to the SCNTechnique. Then we use an SCNTechnique *firstTechnique to load in that dictionary. Finally, we assign scnView.technique to the SCNTechnique we created from the plist.
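Put together, the hookup in GameViewController.m looks roughly like this (variable names are mine):

```objc
// After the SCNView (scnView) has been set up with the scene:
NSURL *url = [[NSBundle mainBundle] URLForResource:@"firstPass"
                                     withExtension:@"plist"];
NSDictionary *techniqueDictionary = [NSDictionary dictionaryWithContentsOfURL:url];
SCNTechnique *firstTechnique = [SCNTechnique techniqueWithDictionary:techniqueDictionary];
scnView.technique = firstTechnique;
```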


Hooray, we're drawing using an SCNTechnique. It doesn't look like anything interesting is happening, but the scene really is going through our shader program and not the usual render path. We can prove this by editing the shader program: we can draw the technique as red by changing the .fsh to the following code.
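Only the output line needs to change, something like:

```glsl
void main() {
    // Ignore the sampled scene entirely and output solid red.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```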

This sets gl_FragColor to red, and now the iPad draws the image as all red. I'm still working on more interesting things to do with all of this, but if you're familiar with OpenGL you'll be able to do quite a lot. I'm not that familiar with OpenGL myself, so I'm learning what I can while I can. Hope this tutorial helps.

 


SceneKit shaderModifiers

So I started with the basic SceneKit game template, which gives you a colorful jet object in the scene.

By default the code in the GameViewController.m looks like the following:
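From memory, the relevant part of the stock template is along these lines (abbreviated; the real file also spins the ship and configures the view):

```objc
// GameViewController.m (abbreviated SceneKit game template)
SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.dae"];

// camera
SCNNode *cameraNode = [SCNNode node];
cameraNode.camera = [SCNCamera camera];
cameraNode.position = SCNVector3Make(0, 0, 15);
[scene.rootNode addChildNode:cameraNode];

// omnidirectional light
SCNNode *lightNode = [SCNNode node];
lightNode.light = [SCNLight light];
lightNode.light.type = SCNLightTypeOmni;
lightNode.position = SCNVector3Make(0, 10, 10);
[scene.rootNode addChildNode:lightNode];

// ambient light
SCNNode *ambientLightNode = [SCNNode node];
ambientLightNode.light = [SCNLight light];
ambientLightNode.light.type = SCNLightTypeAmbient;
ambientLightNode.light.color = [UIColor darkGrayColor];
[scene.rootNode addChildNode:ambientLightNode];

// the jet itself
SCNNode *ship = [scene.rootNode childNodeWithName:@"ship" recursively:YES];

SCNView *scnView = (SCNView *)self.view;
scnView.scene = scene;
```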

There's not much going on other than adding models, lights, and cameras to the scene. I wanted to do some reading up on shaderModifiers, so I started by getting at the mesh in ship.dae.

To do that I need to get to the material on the ship.
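My first attempt went straight at the ship node's geometry, something like:

```objc
SCNMaterial *mat = ship.geometry.firstMaterial; // turns out to be nil
```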

The above shows up with nothing: the ship included in the scene has no geometry on the ship node itself. It took a few minutes of fiddling before I found that the ship has a shipMesh child object.

The child is called shipMesh, and that's the node that has a firstMaterial. To get to that I used:
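Something along these lines:

```objc
SCNNode *shipMesh = [ship childNodeWithName:@"shipMesh" recursively:YES];
```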

That line assigns the actual mesh I need to the *shipMesh SCNNode object. Now I can use the following:
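That is:

```objc
SCNMaterial *mat = shipMesh.geometry.firstMaterial;
```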

This assigns the shipMesh's material to the SCNMaterial *mat, so I can start adding shaderModifiers to the object in the scene.

Starting with the basics I tried out:
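Reconstructing it, the first test was a one-entry fragment modifier along these lines:

```objc
mat.shaderModifiers = @{ SCNShaderModifierEntryPointFragment :
                             @"_output.color.rgb = vec3(0.5);" };
```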

This turned the ship a grey color on the iPad.

So this was expected: vec3(0.5) tells GLSL to make an RGB color of r 0.5, g 0.5, b 0.5. It's a handy shorthand for making colors. Another thing I tried was the following.
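This variant assigned the whole color rather than just the .rgb part, roughly:

```objc
mat.shaderModifiers = @{ SCNShaderModifierEntryPointFragment :
                             @"_output.color = vec4(0.5, 0.5, 0.5, 1.0);" };
```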

This wasn't assigning color.rgb but color, which expects a vec4, and I got the same result: a grey jet on the iPad. Just to be thorough, I tried assigning an interesting value to the color.
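For instance, something like (the exact values are my guess):

```objc
mat.shaderModifiers = @{ SCNShaderModifierEntryPointFragment :
                             @"_output.color = vec4(1.0, 0.3, 0.3, 1.0);" };
```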

The above turned into a reddish jet.


So this again was expected. I took some code from the WWDC presentation to try out on the jet.
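The wobble came from a geometry modifier; a simplified version of the WWDC-style snippet looks like this (the amplitude is my own pick):

```objc
mat.shaderModifiers = @{ SCNShaderModifierEntryPointGeometry :
    @"_geometry.position.xyz += _geometry.normal * sin(u_time) * 0.3;" };
```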

The above made the jet all wobbly.

All of the verts were moving off of sin(u_time); u_time is a value in the GLSL world that all of SceneKit's shader modifiers have access to. Cool stuff so far. After this I found an interesting car paint shader in the WWDC presentation and tweaked it to get the following result.

 

Strangely enough I didn't delete the code that made the jet wobbly, so I'm guessing something was overwritten. shaderModifiers gets a single value when we use = to assign an @{…} literal, replacing whatever was there before. A bit of looking at the shaderModifiers object on the material shows that it's an NSDictionary. NSMutableDictionary and NSDictionary are compatible here, so I make a new NSMutableDictionary with the following:
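That is:

```objc
NSMutableDictionary *shaders = [[NSMutableDictionary alloc] init];
```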

I’ll use that and assign multiple keys with code to the *shaders dictionary.

This looks like the following:
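Reconstructed, with the same wobble plus a surface color (the pink values are my guess at the tweak):

```objc
shaders[SCNShaderModifierEntryPointGeometry] =
    @"_geometry.position.xyz += _geometry.normal * sin(u_time) * 0.3;";
shaders[SCNShaderModifierEntryPointSurface] =
    @"_surface.diffuse.rgb = vec3(1.0, 0.5, 0.8);";
mat.shaderModifiers = shaders;
```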

SCNShaderModifierEntryPointGeometry and SCNShaderModifierEntryPointSurface are key values with shader snippets assigned to them. At the end we assign mat.shaderModifiers = shaders to attach both entries in the NSMutableDictionary to the material. And now for the final result.


 

A wobbly pink jet. The significance of all of this is that to apply multiple shaders to the object you need to create a dictionary first and use the different entry-point constant values as keys. Hope this all made sense.

 


Looking for some Unreal Stuff?

http://okita.com/alex/unreal-game-development-archive/

The archives and what’s been going on can be found on the link above. Updates and adventures will be forthcoming as well as a few project announcements.

Making the most of a small workshop means approaching a critical density of stuff. I need to organize this into something more sensible, and some cool gizmos will be coming to this site soon. Check back often!

 


notes on Objective-C

Getting and setting variables in Objective-C

The @property declarations in the @interface allow the class to define instance or class variables. Setting static values is done in the @implementation section.

Using them is done either with the dot operator or through the [class object] interface.
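As a sketch of the kind of class this refers to (the names are illustrative, chosen to match the classVars and staticFloat mentioned below):

```objc
// ClassVars.h
@interface ClassVars : NSObject
@property (nonatomic) float memberFloat; // instance variable via @property
+ (float)staticFloat;                    // class-level getter, no @property
@end

// ClassVars.m
@implementation ClassVars
static float staticFloat = 3.14f;        // static value set in the @implementation
+ (float)staticFloat { return staticFloat; }
@end

// Usage:
ClassVars *classVars = [[ClassVars alloc] init];
classVars.memberFloat = 1.0f;            // .dot operator
[classVars setMemberFloat:2.0f];         // [class object] style
NSLog(@"%f", classVars.memberFloat);
NSLog(@"%f", [ClassVars staticFloat]);
```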

The output for the above is:

The last line, where we use the classVars staticFloat, had no @synthesize or @property. Anyone know why those weren't necessary?


Finished with edits, next steps.

So, the first and second rounds of edits on my book have been submitted to my editor.

http://okita.com/Unity/2014/04/almost-done/

I've been working non-stop, procrastinating, thinking, living, and dwelling on my Unity C# book for over a year. Last night at around 1am I wrote an email and attached a bunch of Word files and PDFs containing the last round of comments and answers to questions from my editor. I've put off going to the gym, running, and having a life on the weekends, having to re-read and re-work much of my book, and now I'm done.

I have to admit, I'm already considering my next book; I think I might approach something more fun this time. To be honest, I've started reading more fiction and it's inspired me. I find myself curious whether I could handle making a living from writing. At this point I'll leave that subject to my daydreaming; I've got a lot of game projects to tackle.

I do have a puzzle game I want to finish up. Once I get the mechanic working I'm planning on releasing the code on GitHub to allow as many people as possible to clone it. I'm not planning on making money from the game or its mechanics. It's a cool idea, and I have plans for using the puzzle as a mechanic in a different game, to which I do want to add some fun (and profitable) collection gameplay aspects.

There’s also the rest of life to get back to.


Gravity waves, and expansion.

http://www.space.com/25066-major-astrophysics-discovery-announcement-monday.html?cmpid=514648_20140317_20190254

Today there's a new announcement about data recorded at a telescope at the South Pole. A study of the radiation over the course of 9 years, 9 freakin' years, has revealed a pattern in the background radiation of our universe. This pattern implies gravity waves, a direct result of the math Einstein used to explain gravity.

Separately from E=mc^2, relativity is used every day to figure out where your cell phone is. GPS requires a very precise clock to compare signals traveling at the speed of light from a satellite to a matching pattern in your phone. Where the patterns match up determines where you are on the planet. As civilians we're not allowed to use devices as accurate as those used by the military; otherwise, you'd be able to locate your cellphone hidden under a ball cap in the corner of your living room, next to a marble.

According to the wiki page, the satellites need to transmit at an ever-so-slightly slower frequency, 10.22999999543 MHz, compared to the receiver on land at 10.23 MHz. That small difference in frequency compensates for relativity; if that adjustment weren't made, GPS simply wouldn't work. That same kind of calculation can be applied to many things other than GPS, like gravity waves propagated by the big bang. The number didn't come about by guessing; it was determined by physicists before the satellite went up.
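As a back-of-the-envelope check (mine, not from the original announcement), the fractional offset between those two frequencies matches the often-quoted relativistic clock drift of GPS satellites, roughly 38 microseconds per day:

```latex
\frac{\Delta f}{f}
  = \frac{10.23\ \mathrm{MHz} - 10.22999999543\ \mathrm{MHz}}{10.23\ \mathrm{MHz}}
  \approx 4.47 \times 10^{-10}
```

```latex
4.47 \times 10^{-10} \times 86400\ \mathrm{s/day} \approx 38.6\ \mu\mathrm{s/day}
```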

The significance of the telescope's evidence is huge. The math predicted that there should be something observable in the form of gravity waves in the microwave background noise of space. If this background noise didn't exhibit this property, the big bang theory would need to be tossed out and replaced by a new theory. However, the prediction found supporting evidence.

This is cool, and the math behind it is cool. The prediction is amazing, and the evidence found to support it took a long time to record. I almost feel bad that the evidence matched so well; it means the other theories are losing to the big bang, leaving fewer options with supporting evidence.
