I posted several fragment-only shaders in the Shader Playground announce topic, but I’ve also got quite a few more (mostly ones that also have vertex kernels) in this project (a fairly up-to-date snapshot of a now-public repo, originally posted in the “Samples” topic). They’re sort of a mixed bag: some aim to look good, others are mostly experimental.
Some original work I made for that (during the beta) can be found here and here. The latter is an attempt to build up a suite of reusable code fragments that can then be automatically* stitched together when dependencies on them are detected in your code (which is why several of the shaders in that project are so mysteriously short), whereas the other one has the loader code, plus other utilities.
* - “Automatic” involves a GLSL parser that probably isn’t too hard to break, if one tried. Any improvements are most welcome!
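For anyone curious what that stitching step might look like, here’s a toy sketch in Python. Everything here (the `SNIPPETS` table, the helper names) is my own invention for illustration, not the repo’s actual code or API: scan the user’s shader source for known helper names and prepend their definitions, dependencies first.

```python
import re

# Hypothetical snippet library: name -> (GLSL source, names it depends on)
SNIPPETS = {
    "PI": ("#define PI 3.14159265359", []),
    "rotate2D": (
        "vec2 rotate2D (vec2 v, float a) { float c = cos(a), s = sin(a);"
        " return vec2(c * v.x - s * v.y, s * v.x + c * v.y); }",
        [],
    ),
    "spin": (
        "vec2 spin (vec2 uv, float t) { return rotate2D(uv, t * PI); }",
        ["rotate2D", "PI"],
    ),
}

def stitch (user_code):
    """Prepend any snippet whose name appears in user_code, dependencies first."""
    emitted, out = set(), []

    def emit (name):
        if name not in emitted:
            emitted.add(name)
            source, deps = SNIPPETS[name]
            for dep in deps:  # a snippet's dependencies must precede it
                emit(dep)
            out.append(source)

    for name in SNIPPETS:
        if re.search(r"\b" + name + r"\b", user_code):
            emit(name)

    return "\n".join(out + [user_code])
```

A real version needs an actual GLSL parser (hence the caveat above), since a bare word-boundary search will happily match names inside comments and strings.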
Hey! That is really impressive. Also, I noticed that you can use any image on imgur.com as a texture. It is better if it is power-of-two and square, but all you have to do is type the image’s URL after # in the address bar and reload the page.
Here’s a water one I’m playing with at the moment. For some reason the image is displayed upside down, so really what you are seeing is the view when underwater.
I wrote how to use any image in the playground earlier in this thread. Here’s an example: http://goo.gl/aT3PqP, or an example of your shader with a custom texture: http://goo.gl/B5x2DN
I know you did, but that doesn’t mean the playground shouldn’t have these types of images available automatically. I only posted my shader to see if it worked, really; it isn’t something I’ll spend more than a second or two on. I’d rather be actually playing with shaders in-game.
This was also not what I was going for, but I realized the positions stuff I did for the previous one made it feasible (this is one that wasn’t fragment-only, in my original batch):
Ice (put a normal map in sampler 0, and an “environment” in sampler 1)
EDIT: Link botched.
EDIT 2: New version, with some comments and Catmull-Rom interpolation.
With angle set to 0 you can get something that’s just sort of “hopping around” on the “map”, but the “t * t” sharpens the curve, so it lags a bit at the peak. Even with just “t” it still spends too much time there, but I didn’t invest much into finding a more middle-heavy curve. (Maybe one of the Corona transitions would adapt well.)
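For reference, a uniform Catmull-Rom segment (the interpolation mentioned in the edit above) boils down to a few lines. This is just the standard textbook formula, not code lifted from the repo:

```python
def catmull_rom (p0, p1, p2, p3, t):
    """Uniform Catmull-Rom spline segment: interpolates from p1 (t = 0)
    to p2 (t = 1), using p0 and p3 to shape the tangents at the ends."""
    t2, t3 = t * t, t * t * t
    return 0.5 * (
        2 * p1
        + (p2 - p0) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
        + (3 * p1 - 3 * p2 + p3 - p0) * t3
    )
```

Applied per-component to 2D points, it gives the smooth path through the waypoints; the “t * t” business above is a separate easing applied to the parameter before it ever reaches the spline.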
I’d been thinking about how to do some of the old classic fire demos, but it looks like these match those and more. I did get them sort of running in stock Corona a couple years back (with *cough* one rect per pixel), but they depended on previous frames and on neighboring pixels, so it’s not terribly obvious how to transfer them to stateless shader-land.
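The classic demo-style fire I mean works roughly like this (a minimal Python sketch of the general technique, with my own buffer sizes and cooling constant, not the original code): keep a heat buffer across frames, reseed the bottom row with random “embers”, and replace every cell with a cooled average of itself and the cells below — which is exactly the frame-to-frame, neighbor-reading state that a stateless fragment shader lacks.

```python
import random

WIDTH, HEIGHT = 16, 16

def step (heat):
    """One frame of classic fire: average each cell with the three cells
    below it, subtract a small cooling term, and clamp at zero."""
    for y in range(HEIGHT - 1):
        below = heat[y + 1]
        for x in range(WIDTH):
            total = heat[y][x] + below[x] + below[(x - 1) % WIDTH] + below[(x + 1) % WIDTH]
            heat[y][x] = max(0, total // 4 - 1)  # cooling term
    # reseed the bottom row with random "embers"
    heat[HEIGHT - 1] = [random.randint(0, 255) for _ in range(WIDTH)]
    return heat
```

In shader terms, the buffer would have to live in a texture (e.g. ping-ponged snapshots), since each frame reads the previous one.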
fBm seems to work well for fire in general. It also underlies the sun-ish effect I posted (in the playground announce thread).
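In case fBm (fractional Brownian motion) is unfamiliar: it just sums several octaves of a noise function at increasing frequency and decreasing amplitude. A minimal 1D value-noise version in Python — the sine-based hash is a common stand-in of my own choosing, not the one from any of these shaders:

```python
import math

def hash1 (n):
    """Cheap deterministic pseudo-random value in [0, 1) from a number."""
    x = math.sin(n * 127.1) * 43758.5453
    return x - math.floor(x)

def value_noise (x):
    """1D value noise: smoothly interpolate random values at integer points."""
    i, f = math.floor(x), x - math.floor(x)
    u = f * f * (3 - 2 * f)  # smoothstep fade
    return hash1(i) * (1 - u) + hash1(i + 1) * u

def fbm (x, octaves = 5):
    """Sum octaves: each one doubles the frequency and halves the amplitude."""
    total, amplitude, norm = 0.0, 0.5, 0.0
    for _ in range(octaves):
        total += amplitude * value_noise(x)
        norm += amplitude
        x *= 2.0
        amplitude *= 0.5
    return total / norm  # normalized back into [0, 1)
```

For fire you’d typically feed in a coordinate that scrolls upward over time, then use the result to distort or attenuate a color ramp.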
@Antheor
I’d be rather surprised if Corona built support into the tool. They probably just wanted to get it out the door. :) In theory, if you had audio as raw samples (say from a WAV, or decompressed from another source and captured) you could encode those to pixels and then read them out of a texture (perhaps “streaming” them through a snapshot), with time-based “texture” coordinates. I don’t know how well this would work in the presence of filtering, though, unless one could specify “nearest” mode.
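The encoding step I have in mind would look something like this (a hypothetical sketch, not anything Corona provides: pack each signed 16-bit sample into two bytes, which could then fill, say, the R and G channels of successive texels):

```python
def samples_to_bytes (samples):
    """Pack signed 16-bit samples into bytes, two per sample, as one might
    feed into a texture's color channels (e.g. R and G of each texel)."""
    out = bytearray()
    for s in samples:
        u = (s + 32768) & 0xFFFF  # shift into unsigned range
        out.append(u >> 8)        # high byte
        out.append(u & 0xFF)      # low byte
    return bytes(out)

def bytes_to_samples (data):
    """Invert the packing, as a shader would: read the two channels,
    recombine them (hi * 256 + lo), then recenter."""
    return [((data[i] << 8) | data[i + 1]) - 32768 for i in range(0, len(data), 2)]
```

This is also why filtering matters: bilinear sampling would blend the high byte of one sample with its neighbor’s, corrupting the decoded value, whereas “nearest” mode returns the texel bytes untouched.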
At some point, I want to make some shaders for 1D / 2D FFT and convolution, if I don’t stumble on working versions in the meantime. I imagine the implementation will be a beast. :D (I’ve written them in Lua…) I have a very specific computer vision-style problem in mind, but really any application of DSP would be game. (Doing something, for instance, along the lines of what I mention here.)
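As a plain-CPU baseline before attempting any shader version, the textbook convolution-theorem route looks like this (a recursive radix-2 FFT and circular convolution in Python; none of this is from the project):

```python
import cmath

def fft (a, invert = False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    invert=True flips the twiddle sign (scaling is left to the caller)."""
    n = len(a)
    if n == 1:
        return list(a)
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def circular_convolve (x, y):
    """Convolution theorem: transform both inputs, multiply pointwise,
    inverse-transform, and divide by n to undo the unscaled inverse."""
    n = len(x)
    product = [a * b for a, b in zip(fft(x), fft(y))]
    return [v.real / n for v in fft(product, invert = True)]
```

A 2D FFT is then just this applied along rows and then columns — which maps naturally onto multi-pass shader rendering, one butterfly stage per pass.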
Seems to be relatively easy, although I have no idea which precision qualifier to assign to each variable, so I’m essentially just chucking P_COLOR and P_UV randomly at stuff, and it seems to work.