diff --git a/README.md b/README.md index ba431ca..448e394 100644 --- a/README.md +++ b/README.md @@ -1,95 +1,135 @@ -# [CIS565 2015F] YOUR TITLE HERE +# [CIS565 2015F] Ray Marcher **GLSL Ray Marching** **University of Pennsylvania, CIS 565: GPU Programming and Architecture, Project 5** -* (TODO) YOUR NAME HERE -* Tested on: (TODO) **Google Chrome 222.2** on - Windows 22, i7-2222 @ 2.22GHz 22GB, GTX 222 222MB (Moore 2222 Lab) +* Megan Moore +* Tested on: **Google Chrome** on + MacBook Pro 2.6 GHz Intel Core i5 @ 8 GB 1600 MHz DDR3, Intel Iris 1536 MB -### Live on Shadertoy (TODO) +### Live on Shadertoy -[![](img/thumb.png)](https://www.shadertoy.com/view/TODO) +[![](img/bridge2.png)](https://www.shadertoy.com/view/Mt2XRV#) + +[![](img/debug_image_orig.png)](https://www.shadertoy.com/view/MtSXR3) ### Acknowledgements This Shadertoy uses material from the following resources: +Morgan McGuire, Williams College. +*Numerical Methods for Ray Tracing Implicitly Defined Surfaces* (2014). +[PDF](http://graphics.cs.williams.edu/courses/cs371/f14/reading/implicit.pdf) + +Iñigo Quílez. +*Terrain Raymarching* (2007). +[Article](http://www.iquilezles.org/www/articles/terrainmarching/terrainmarching.htm) + +http://www.iquilezles.org/www/articles/menger/menger.htm - Menger sponge +https://en.wikipedia.org/wiki/Blinn%E2%80%93Phong_shading_model - Blinn-Phong lighting +http://graphics.cs.williams.edu/courses/cs371/f14/reading/implicit.pdf - ray marching/sphere tracing + +http://www2.compute.dtu.dk/pubdb/views/edoc_download.php/6392/pdf/imm6392.pdf - ambient occlusion/soft shadows + +https://www.shadertoy.com/view/4t2SRz - smoke color +https://www.shadertoy.com/view/MdXGW2 - water +https://www.shadertoy.com/view/MdX3zr - smoke movement -* TODO -### (TODO: Your README) + -Instructions (delete me) -======================== +### Final Image with debug views + + -This is due at midnight on the evening of Monday, October 19. **Summary:** In this project, you'll see yet another way in which GPU parallelism and compute-efficiency can be used to render scenes. -You'll write a program in the popular online shader editor +I wrote a program in the popular online shader editor [Shadertoy](http://www.shadertoy.com/). -Your goal will be to implement and show off different features in a cool and -interesting demo. See Shadertoy for inspiration - and get creative! Ray marching is an iterative ray casting method in which objects are represented as implicit surfaces defined by signed distance functions (SDFs). This method is widely used in the Shadertoy community to render complex scenes which are defined in the fragment shader code executed for each pixel. -**Important Notes:** -* Even though you will be coding in Shadertoy, it is important as always to - save versions of your code so that you do not lose progress! Commit often! -* A significant portion of this project will be in write-up and performance - analysis - don't save it for later. - -**Provided Code:** -The provided code in `raymarch.glsl` is straight from iq's Raymarching -Primitives; see {iq-prim}. It just sets up a simple starter camera. - ### Features **Required Features:** * Two ray marching methods (comparative analysis required) * Naive ray marching (fixed step size) {McGuire 4} + * The naive ray marching method shoots a ray from the camera through each pixel in the image. Then, given a step size, we march down the ray step by step and determine the minimum distance from the current point to all of the "objects" in the scene. When this distance drops below an epsilon, the march stops and the pixel is shaded with the color of the object the ray has hit. If the total distance stepped along the ray exceeds the maximum distance, the march stops and the ray is assumed to hit nothing. As you can imagine, this technique can be very slow, since there is no optimization and the step size is very small; this is why sphere tracing was implemented to speed up the frame rate. A minimal sketch of such a fixed-step loop is shown just below this item.
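    The following is only an illustrative fixed-step march, not the exact code in this repository; `sceneSDF`, the primitive set, and the constants are placeholders:

    ```glsl
    // Signed distance functions for two simple primitives.
    float sphereDist(vec3 p, float r) { return length(p) - r; }
    float planeDist(vec3 p) { return p.y; }

    // Scene SDF: distance from p to the nearest object (a union of SDFs is just a min).
    float sceneSDF(vec3 p) {
        float d = sphereDist(p - vec3(0.0, 0.5, 0.0), 0.5);
        d = min(d, planeDist(p));
        return d;
    }

    // Fixed-step march: returns the ray parameter t of the first hit, or -1.0 on a miss.
    float naiveMarch(vec3 ro, vec3 rd) {
        const float EPS = 0.001;
        for (float t = 0.0; t < 5.0; t += 0.01) {   // constant step size, constant max distance
            if (sceneSDF(ro + rd * t) < EPS) return t;
        }
        return -1.0;
    }
    ```

    Sphere tracing (the next item) keeps the same structure but replaces the constant `0.01` step with the value returned by the scene SDF at the current point.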
* Sphere tracing (step size varies based on signed distance field) {McGuire 6} + * The difference between sphere tracing and naive ray marching is the step size. Instead of marching down the ray with a constant step size, we take the largest step that is guaranteed not to pass through any object. The first part is the same as in the naive ray march: we find the distance from the current point to each object in the scene. But then, even if that distance is not less than our epsilon, we step along the ray by a length equal to the minimum distance from the point to the objects. This is the largest possible next step that cannot step over any object, and it makes the run time much faster because far fewer steps are taken along each ray. * 3 different distance estimators {McGuire 7} {iq-prim} * With normal computation {McGuire 8} + * I ended up using many different distance estimators throughout my scenes. I implemented the sphere, plane, cube, rounded cube, capsule, torus, cross, and cylinder. Distance estimators are the reason that the scene can be rendered with just two triangles instead of multiple objects: the rays never intersect actual geometry in the scene. When the distance is calculated, a distance estimator reports how far the point would be from the object if it were there. This allows complicated scenes to render very quickly, because no actual objects are rendered, yet the correct color is still computed. * One simple lighting computation (e.g. Lambert or Blinn-Phong). + * I implemented a Blinn-Phong lighting scheme. Each object has its own diffuse color. Then, after calculating its normal, the dot product of the normal and the light vector is used to determine how much diffuse and specular light will be seen. This image shows three white objects that have Blinn-Phong lighting applied to them + + + * Union operator {McGuire 11.1} - * Necessary for rendering multiple objects + * The union operator allows multiple objects to be rendered in the same scene. Its implementation compares the distance from the point to each object; the object with the minimum distance from the point on the ray is the one that should be drawn, so the union operator simply takes the minimum of these distances. I do this in my scene function, comparing the t value of each object in the scene and choosing the minimum value to be drawn. * Transformation operator {McGuire 11.5} + * The transformation operator was implemented by creating a transformation matrix. The inputs to the function are the translation, rotation, and scale vectors, where each component of the rotation vector gives the rotation about the x, y, or z axis. The inverse of this matrix is applied to the point being tested, and the untransformed distance estimator is evaluated at the result. The image below shows a cube being rotated about the z axis around the origin; a reduced sketch of this inverse-transform idea follows the gif. + + ![](img/rotating.gif)
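    As a rough illustration of the inverse-transform idea, simplified to a translation plus a single rotation about the z axis (the `transform` function in this repo also handles x/y rotation and non-uniform scale):

    ```glsl
    // Axis-aligned box SDF, as used elsewhere in this project.
    float boxDist(vec3 p, vec3 b) {
        vec3 d = abs(p) - b;
        return min(max(d.x, max(d.y, d.z)), 0.0) + length(max(d, 0.0));
    }

    // Distance to a box translated by t and rotated by `angle` about the z axis.
    // Rather than moving the box, the sample point is pushed through the inverse
    // transform and the untransformed SDF is evaluated there.
    float transformedBox(vec3 p, vec3 t, float angle, vec3 b) {
        p -= t;                               // undo the translation
        float c = cos(angle), s = sin(angle);
        p.xy = mat2(c, -s, s, c) * p.xy;      // rotate the point by -angle (inverse rotation)
        return boxDist(p, b);
    }
    ```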
+ * Debug views (preferably easily toggleable, e.g. with `#define`/`#if`) - * Distance to surface for each pixel - * Number of ray march iterations used for each pixel + * There are three different debug views that can be used in my scene. The first shows the normals: each point on an object is colored with the normal calculated at that point on the surface. The second shows the number of steps taken along each ray before it hits an object; rays that take few steps print black, while rays that take many steps print a brighter red. The last debug view is the distance to the surface for each pixel, which prints black for objects that are close and white for objects that are further away. These views can easily be toggled at the top of the script. + **Extra Features:** -You must do at least 10 points worth of extra features. -**INSTRUCTOR TODO: review point values** +* Other basic distance estimators/operations {McGuire 7/11} + * Some other operators that I implemented were the difference, displacement, and repeat operators. Below is an image of the repeat operator (the others have been shown in previous images). These operations take the value given by the distance estimators and tweak it in some way to change the shape that is being generated. + +![](img/wood_material.png) -* (0.25pt each, up to 1pt) Other basic distance estimators/operations {McGuire 7/11} * Advanced distance estimators - * (3pts) Height-mapped terrain rendering {iq-terr} - * (3pts) Fractal rendering (e.g. Menger sponge or Mandelbulb {McGuire 13.1}) - * **Note** that these require naive ray marching, if there is no definable - SDF. They may be optimized using bounding spheres (see below). + * Height-mapped terrain rendering {iq-terr} + * Height-mapped terrain can be used to render complicated, never-ending terrain very quickly. This is done by using a texture to determine the height of the terrain at each pixel: the height of the point along the ray is compared to a value read from the texture at specific coordinates. This value can be manipulated in many different ways to get the desired results; in this case, I used the average of the R, G, and B values of the color at those texture coordinates. The image below on the left shows a height map using a checkered texture. The image on the right uses two different textures, one to obtain the height at each pixel and another to texture the terrain itself. While this method is very useful for rendering complicated scenes, it does require the naive method. This is because there are no specific objects with signed distance functions in the scene, only a height for each pixel given by the texture, so there is no cheap minimum-distance estimate to drive sphere tracing, and stepping through the ray by a constant value ends up being faster. + + + + * Fractal rendering (e.g. Menger sponge or Mandelbulb {McGuire 13.1}) + * The fractal rendering shows a repeating design on an object. The image below shows the difference of a box and a cross repeated on the cube. This is done by using the distance estimators for the box and the cross, along with the difference operator. Then, you iterate through and apply the pattern to each smaller section of the cube; the more iterations you step through, the more repetitions of the pattern appear on the object. Thread divergence is not an issue in this implementation, and it can be used with both the naive ray march and sphere tracing. A compact sketch of this folding loop follows the image. + + ![](img/menger_sponge.png)
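    The sketch below follows the structure of the `sceneFractal` function that appears later in this diff (`firstScene.glsl`); the starting box and the iteration count are simplified:

    ```glsl
    // Axis-aligned box SDF (same helper as above).
    float boxDist(vec3 p, vec3 b) {
        vec3 d = abs(p) - b;
        return min(max(d.x, max(d.y, d.z)), 0.0) + length(max(d, 0.0));
    }

    // Three levels of the Menger-sponge pattern carved out of a box.
    float menger(vec3 p) {
        float d = boxDist(p, vec3(0.5));          // start from a solid box
        float s = 1.0;
        for (int m = 0; m < 3; m++) {
            vec3 a = mod(p * s, 2.0) - 1.0;       // fold space into repeating cells
            s *= 3.0;
            vec3 r = abs(1.0 - 3.0 * abs(a));     // per-cell cross pattern
            float da = max(r.x, r.y);
            float db = max(r.y, r.z);
            float dc = max(r.z, r.x);
            float c = (min(da, min(db, dc)) - 1.0) / s;
            d = max(d, c);                        // carve this level's holes out of the running shape
        }
        return d;
    }
    ```

    Each extra loop iteration adds one more level of detail to the pattern.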
+ * Lighting effects - * (3pts) Soft shadowing using secondary rays {iq-prim} {iq-rwwtt p55} - * (3pts) Ambient occlusion (see 565 slides for another reference) {iq-prim} -* Optimizations (comparative analysis required!) - * (3pts) Over-relaxation method of sphere tracing {McGuire 12.1} - * (2pts) Analytical bounding spheres on objects in the scene {McGuire 12.2/12.3} - * (1pts) Analytical infinite planes {McGuire 12.3} + * Soft shadowing using secondary rays {iq-prim} {iq-rwwtt p55} + * Soft shadowing requires a secondary ray cast. This secondary ray is cast from the intersected surface point and directed towards the light. If the ray reaches the light, no shadow is applied; if it hits another object first, a shadow is drawn. Soft shadowing can add a lot of time to the render: not only does it require more ray marching (using sphere tracing), it can also cause thread divergence. + +![](img/soft_shadow.png) + + * Ambient occlusion {iq-prim} + * Ambient occlusion makes the lighting in a scene more realistic by measuring how exposed each point on a surface is to ambient lighting. Surfaces that are more occluded receive less ambient light and end up darker than other surfaces. In the images below (a very simple scene, so not much ambient light is seen) you can see where the ambient light comes into play and how it changes the final image. Implementing ambient occlusion requires another ray march. It is a shorter march than the one used for soft shadows, but it can still add time to the render; however, it introduces no thread divergence. A compact sketch of both secondary marches is shown below.
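Both effects boil down to a short secondary march along the signed distance field. The sketch below stays close to the `softshadow` and `ambientOcc` functions that appear in the `.glsl` files later in this diff, but is rewritten against the point-based `sceneSDF` placeholder from the earlier sketch, with simplified constants:

```glsl
// Soft shadow: march from the shaded point toward the light; near misses
// darken the pixel in proportion to how closely the ray grazes other geometry.
float softShadow(vec3 ro, vec3 rd, float tmin, float tmax) {
    float shadow = 1.0;
    float t = tmin;
    for (int i = 0; i < 16; i++) {
        float d = sceneSDF(ro + rd * t);
        if (d < 0.0001) return 0.0;          // fully blocked by another object
        shadow = min(shadow, 8.0 * d / t);   // penumbra factor from near misses
        t += d;
        if (t > tmax) break;                 // reached the light without blocking
    }
    return clamp(shadow, 0.0, 1.0);
}

// Ambient occlusion: sample the SDF a few short steps along the normal;
// if nearby geometry keeps those distances small, the point is occluded.
float ambientOcclusion(vec3 p, vec3 n) {
    float occ = 0.0;
    for (float k = 1.0; k < 6.0; k += 1.0) {
        float h = 0.01 * k;                          // step distance along the normal
        occ += (h - sceneSDF(p + n * h)) / pow(2.0, k);
    }
    return clamp(1.0 - 30.0 * occ, 0.0, 1.0);
}
```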
+ + + + +### Bridge and Train - Step by step break down +![](img/bridge_tracks.png) + +Here the box and capsule distance estimators were used with the difference and repeat operators. The capsules are repeated along the x axis and subtracted from the box that makes up the bridge. + +![](img/smoke.gif) + +The smoke uses a sphere and a noise function as its distance estimator. The noise changes over time because the variable iGlobalTime is used to compute how much noise is applied to the sphere. This image shows the smoke's movement much better than the final picture; because the smoke had to be small, the noise was cut back. + +![](img/bridge_under_water.png) + +Here the bridge and train appear to be under the water. This happened because the reflection distortion was added to a pixel whenever the water height was less than 200, the horizon line. The check needed to be changed so that, instead of comparing against the horizon line, the water distance is compared against the distance to all other objects in the scene. This way, if an object sits in front of the water at that pixel, no reflection distortion is added. + +![](img/bridge2.png) + -This extra feature list is not comprehensive. If you have a particular idea -that you would like to implement, please **contact us first** (preferably on -the mailing list). -## Write-up For each feature (required or extra), include a screenshot which clearly shows that feature in action. Briefly describe the feature and mention which @@ -98,35 +138,19 @@ reference(s) you used. ### Analysis * Provide an analysis comparing naive ray marching with sphere tracing - * In addition to FPS, implement a debug view which shows the "most expensive" - fragments by number of iterations required for each pixel. Compare these. -* Compare time spent ray marching vs. time spent shading/lighting - * This can be done by taking measurements with different parts of your code - enabled (e.g. raymarching, raymarching+shadow, raymarching+shadow+AO). - * Plot this analysis using pie charts or a 100% stacked bar chart. -* For each feature (required or extra), estimate whether branch divergence - plays a role in its performance characteristics, and, if so, point out the - branch in question. - (Like in CUDA, if threads diverge within a warp, performance takes a hit.) -* For each optimization feature, compare performance with and without the - optimization. Describe and demo the types of scenes which benefit from the - optimization. - -**Tips:** - -* To avoid computing frame times given FPS, you can use the - [stats.js bookmarklet](https://github.com/mrdoob/stats.js/#bookmarklet) - to measure frame times in ms. + * In the images below, you can see the difference between a scene generated using the naive ray marcher (left) and the sphere tracer (right). The naive ray marcher requires many more steps per ray cast (the more red the pixel, the more steps were required). This shows how much time the sphere tracer saves, since it does the same work in far fewer steps. In this particular scene, the frame rate with the sphere tracer was almost always 60.0 FPS; switching to the naive ray marcher dropped it to a mere 5.5 FPS. -### Resources -You **must** acknowledge any resources you use, including, but not limited to, -the links below. **Do not copy non-trivial code verbatim.** Instead, use the -references to understand the methods. + + -For any code/material in the 565 -[slides](http://cis565-fall-2015.github.io/lectures/12-Ray-Marching.pptx), -please reference the source found at the bottom of the slide. +* Compare time spent ray marching vs. time spent shading/lighting + * The following analysis was done on the bridge scene; the final scene was already running at 60.0 FPS, so turning soft shadows or ambient occlusion off there did not produce useful data. Below is a pie chart that shows how much time was spent on each of the different ray marches in the scene. Clearly, the original ray march took up the most time, as it is done for every pixel; on average, with both soft shadows and ambient occlusion off, the frame rate was about 48.6 FPS. The soft shadows took the next largest amount of time, which makes sense, as they use more iterations than the ambient occlusion. The soft shadow and ambient occlusion functions are only called when the original ray hits an object, so they occur less often. On average, with the soft shadow function enabled, the frame rate was 40.0 FPS. The ambient occlusion added the least time; with it enabled as well, the scene rendered at 36.4 FPS on average. + +![](img/pie.png) + + +### Resources * {McGuire} Morgan McGuire, Williams College. @@ -152,38 +176,3 @@ please reference the source found at the bottom of the slide. [GitHub](https://github.com/ashima/webgl-noise) * You may use this code under the MIT-expat license. - -## Submit - -### Post on Shadertoy - -Post your shader on Shadertoy (preferably *public*; *draft* will not work). 
-For your title, come up with your own demo title and use the format -`[CIS565 2015F] YOUR TITLE HERE` (also add this to the top of your README). - -In the Shadertoy description, include the following: - -* A link to your GitHub repository with the Shadertoy code. -* **IMPORTANT:** A copy of the *Acknowledgements* section from above. - * Remember, this is public - strangers will want to know where you got your - material. - -Add a screenshot of your result to `img/thumb.png` -(right click rendering -> Save Image As), and put the link to your -Shadertoy at the top of your README. - -### Pull Request - -**Even though your code is on Shadertoy, make sure it is also on GitHub!** - -1. Open a GitHub pull request so that we can see that you have finished. - The title should be "Submission: YOUR NAME". - * **ADDITIONALLY:** - In the body of the pull request, include a link to your repository. -2. Send an email to the TA (gmail: kainino1+cis565@) with: - * **Subject**: in the form of `[CIS565] Project N: PENNKEY`. - * Direct link to your pull request on GitHub. - * Estimate the amount of time you spent on the project. - * If there were any outstanding problems, or if you did any extra - work, *briefly* explain. - * Feedback on the project itself, if any. diff --git a/bridge.glsl b/bridge.glsl new file mode 100644 index 0000000..53698d0 --- /dev/null +++ b/bridge.glsl @@ -0,0 +1,547 @@ +//http://www.iquilezles.org/www/articles/menger/menger.htm - mendel sponge +//https://en.wikipedia.org/wiki/Blinn%E2%80%93Phong_shading_model - blinn-phong lighting +//http://graphics.cs.williams.edu/courses/cs371/f14/reading/implicit.pdf - ray marching/sphere tracing +//http://www2.compute.dtu.dk/pubdb/views/edoc_download.php/6392/pdf/imm6392.pdf - ambient occlusion/soft shadows +//https://www.shadertoy.com/view/4t2SRz - smoke color +//https://www.shadertoy.com/view/MdXGW2 - water +//https://www.shadertoy.com/view/MdX3zr - smoke movement +//--Distance Functions------------------------------------------------------------------- + +float time; +float newPos = 0.0; +const mat2 m2 = mat2(0.8,-0.6,0.6,0.8); + +#define BUMPFACTOR 0.1 +#define EPSILON 0.1 +#define BUMPDISTANCE 60. 
+ +float noise(vec3 p) //Thx to Las^Mercury +{ + vec3 i = floor(p); + vec4 a = dot(i, vec3(1., 57., 21.)) + vec4(0., 57., 21., 78.); + vec3 f = cos((p-i)*acos(-1.))*(-.5)+.5; + a = mix(sin(cos(a)*a),sin(cos(1.+a)*(1.+a)), f.x); + a.xy = mix(a.xz, a.yw, f.y); + return mix(a.x, a.y, f.z); +} + +float sphere(vec3 p, vec4 spr) +{ + return (length(spr.xyz-p) - spr.w); +} + +float planeDist( vec3 p ) +{ + /*if (p.y < ((sin(p.x) - sin(p.z)) / 4.0)) return (sin(p.x) - sin(p.z)) / 4.0; + else return 100.0; + return 100.0;*/ + return p.y; + +} + +float ellipsoidDist( in vec3 p, in vec3 r ) +{ + return (length( p/r ) - 1.0) * min(min(r.x,r.y),r.z); +} + +float cylinderDist( vec3 p, vec2 h ) +{ + vec2 d = abs(vec2(length(p.xz),p.y)) - h; + return min(max(d.x,d.y),0.0) + length(max(d,0.0)); +} + +float sphereDist(vec3 p, float r) { + return length(p) - r; +} + +float flame(vec3 p) +{ + float d = sphere(p*vec3(6.,-1.0,5.), vec4(.0,-.9,.0,1.0)); + return d + (noise(p+vec3(.0,iGlobalTime*6.0,.0)) + noise(p*6.)*.5)*.25*(p.y) ; +} + +float boxDist( vec3 p, vec3 b ) +{ + vec3 d = abs(p) - b; + return min(max(d.x,max(d.y,d.z)),0.0) + length(max(d,0.0)); +} + +float torusDist( vec3 p, vec2 t ) +{ + return length( vec2(length(p.xz)-t.x,p.y) )-t.y; +} + +float roundBoxDist( vec3 p, vec3 b, float r ) +{ + return length(max(abs(p)-b,0.0))-r; +} + +float capsoleDist( vec3 p, vec3 a, vec3 b, float r ) +{ + vec3 pa = p-a, ba = b-a; + float h = clamp( dot(pa,ba)/dot(ba,ba), 0.0, 1.0 ); + return length( pa - ba*h ) - r; +} + +float crossDist( in vec3 p ) +{ + float da = boxDist(p.xyz,vec3(100000,1.0,1.0)); + float db = boxDist(p.yzx,vec3(1.0,100000,1.0)); + float dc = boxDist(p.zxy,vec3(1.0,1.0,100000)); + return min(da,min(db,dc)); +} + +//--CSG Functions---------------------------------------------------------------------- + +float diffFunc(float d1, float d2) { + return max(d1, -d2); +} + +float intersectionFunc(float d1, float d2) { + return max(d1, d2); +} + +float repeat( vec3 p, vec3 c ) +{ + vec3 q = mod(p,c)-0.5*c; + vec4 height = texture2D(iChannel0, p.xz); + //float avg = clamp((height.x + height.y + height.z + height.w) / 4.0, 0.0, 2.0); + return capsoleDist(q - vec3(-1.0, -1.0, -2.0), vec3(1.0,-4.0,1.0), vec3(1.0,1.0,1.0), 0.75); +} + +/*vec3 transform( vec3 p, mat4 m ) + { + //vec3 q = invert(m)*p; + return box(p); + }*/ + +float displace( vec3 p ) +{ + float d1 = capsoleDist(p, vec3(-6.3,-4.0,-8.1), vec3(2.8,-0.50,-3.), 5.0); + float d2 = (sin(20.0*p.z)*sin(20.0*p.y)*sin(20.0*p.x)*sin(iGlobalTime)) / (16.0*p.y); + return d1+d2; +} + +float bend( vec3 p ) +{ + float c = cos(20.0*p.y); + float s = sin(20.0*p.y); + mat2 m = mat2(c,-s,s,c); + vec2 newp = m*p.xz; + vec3 q = vec3(newp.x, p.y, newp.y); + return boxDist(q, vec3(1.0)); +} + +vec2 myMin(vec2 d1, vec2 d2) { + + return (d1.x 10.0) { + color *= pt.y; + } + + // snow + /*float h = smoothstep(55.0,80.0,pt.y/SC + 25.0*fbm(0.01*pt.xz/SC) ); + float e = smoothstep(1.0-0.5*h,1.0-0.1*h,norm.y); + float o = 0.3 + 0.7*smoothstep(0.0,0.1,norm.x+h*h); + float s = h*e*o; + col = mix( col, 0.29*vec3(0.62,0.65,0.7), smoothstep( 0.1, 0.9, s ) );*/ + return vec3(color); + +} + +float softshadow( in vec3 ro, in vec3 rd, in float tmin, in float tmax ) +{ + float shadow = 1.0; + float t = tmin; + float d = 0.0; + + for( int i=0; i<16; i++ ) + { + d = scene(ro, rd, t).x; + if (d < 0.0001) return 0.0; + shadow = min( shadow, 8.0*d/t ); + t += d; + if( d<0.0001 || t > tmax) break; + } + return clamp(shadow, 0.0, 1.0); + +} + +float ambientOcc( in vec3 pt, in vec3 norm ) 
+{ + float occ = 0.0; + float d = 0.0; + for(float k=1.0; k<10.0; k++ ) + { + d = scene(pt, norm, .01*k).x; + occ = (1.0 / pow(2.0, k)) * (k*.01 - d); + } + return clamp(1.0 - 3000.0*occ, 0.0, 1.0); +} + +float random(float x) { + + return fract(sin(x) * 10000.); + +} + +float noise2(vec2 p) { + + return random(p.x + p.y * 10000.); + +} + +vec2 sw(vec2 p) { return vec2(floor(p.x), floor(p.y)); } +vec2 se(vec2 p) { return vec2(ceil(p.x), floor(p.y)); } +vec2 nw(vec2 p) { return vec2(floor(p.x), ceil(p.y)); } +vec2 ne(vec2 p) { return vec2(ceil(p.x), ceil(p.y)); } +float smoothNoise(vec2 p) { + + vec2 interp = smoothstep(0., 1., fract(p)); + float s = mix(noise2(sw(p)), noise2(se(p)), interp.x); + float n = mix(noise2(nw(p)), noise2(ne(p)), interp.x); + return mix(s, n, interp.y); + +} + + +float noise3( const in vec3 x ) { + vec3 p = floor(x); + vec3 f = fract(x); + f = f*f*(3.0-2.0*f); + + vec2 uv = (p.xy+vec2(37.0,17.0)*p.z) + f.xy; + vec2 rg = texture2D( iChannel0, (uv+ 0.5)/256.0, -100.0 ).yx; + return mix( rg.x, rg.y, f.z ); +} + + +const mat2 m4 = mat2( 0.60, -0.80, 0.80, 0.60 ); + +const mat3 m3 = mat3( 0.00, 0.80, 0.60, + -0.80, 0.36, -0.48, + -0.60, -0.48, 0.64 ); + +float fbm( in vec3 p ) { + float f = 0.0; + f += 0.5000*noise3( p ); p = m3*p*2.02; + f += 0.2500*noise3( p ); p = m3*p*2.03; + f += 0.1250*noise3( p ); p = m3*p*2.01; + f += 0.0625*noise3( p ); + return f/0.9375; +} + +float fractalNoise(vec2 p) { + + float n = 0.; + n += smoothNoise(p); + n += smoothNoise(p * 2.) / 2.; + n += smoothNoise(p * 4.) / 4.; + n += smoothNoise(p * 8.) / 8.; + n += smoothNoise(p * 16.) / 16.; + n /= 1. + 1./2. + 1./4. + 1./8. + 1./16.; + return n; + +} + +float waterMap( vec2 pos ) { + vec2 posm = pos * m4; + + return abs( fbm( vec3( 8.*posm, time ))-0.5 )* 0.1; +} + +bool intersectPlane(const in vec3 ro, const in vec3 rd, const in float height, inout float dist) { + if (rd.y==0.0) { + return false; + } + + float d = -(ro.y - height)/rd.y; + d = min(100000.0, d); + if( d > 0. 
&& d < dist ) { + dist = d; + return true; + } + return false; +} + +vec3 lig = normalize(vec3( 0.3,0.25, -0.6)); + +vec3 bgColor( const in vec3 rd ) { + float sun = clamp( dot(lig,rd), 0.0, 1.0 ); + vec3 col = vec3(0.5, 0.52, 0.55) - rd.y*0.2*vec3(1.0,0.8,1.0) + 0.15*0.75; + col += vec3(1.0,.6,0.1)*pow( sun, 8.0 ); + col *= 0.95; + return col; +} + + +vec4 render(in vec3 ro, in vec3 rd, float t) { + // TODO + int debug = 0; + bool root = false; + vec3 col = vec3(.8, .9, 1.0); + //float t = -1.0; + vec3 dist; + if (root) { + dist = findRoot(ro, rd); + } + else { + dist = sphereTrace(ro, rd, t); + } + t = dist.x; + if (t < 10.0) { + vec3 pt = ro + rd*t; + vec3 norm = calcNorm(pt); + + //material + + vec4 x = texture2D( iChannel1, pt.yz ); + vec4 y = texture2D( iChannel1, pt.zx ); + vec4 z = texture2D( iChannel1, pt.yx ); + vec3 a = abs(norm); + vec4 diffuse = (x*a.x + y*a.y + z*a.z) / (a.x + a.y + a.z); + if (dist.z == 1.0 && norm.y >= 0.999) { + diffuse *= vec4(0.3, 0.0, 0.0, 1.0); + } + if (dist.z == 2.0) { + diffuse = vec4(.2118, 0.2706, 0.3098, 1.0); + } + + //end material + + vec3 ref = reflect(rd, norm); + vec3 light = normalize(vec3(0.0, 2.0, 2.0) - pt); + float lambert = clamp(dot(light, norm), 0.0, 1.0); + //float amb = ambientOcc(pt, norm); + //soft shadows + //lambert *= softshadow( pt, light, 0.02, 2.5 ); + float dom = smoothstep( -0.1, 0.1, ref.y ); + //dom *= softshadow( pt, ref, 0.02, 2.5 ); + + float specular = 0.0; + if (lambert > 0.0) { + vec3 viewDir = normalize(-pt); + vec3 halfDir = normalize(light + viewDir); + float specAngle = clamp(dot(halfDir, norm), 0.0, 1.0); + specular = pow(specAngle, 4.0); + + } + + col = vec3(.2) + lambert * vec3(diffuse) + specular * vec3(0.5); //amb* + + col = pow(col, vec3(1.0/2.2)); + col *= 1.0 - smoothstep( 20.0, 40.0, t ); + if (dist.z == 3.0) { + float x = fractalNoise(pt.xz * 6.); + col = mix(vec3(x), vec3(.75, .85, 1.0), pow(abs(pt.y), .6)); + //return vec4(rd, dist.y*3.0 / 50.0); + } + if (dist.z == 4.0) { + col = getColor(pt, norm, t); + } + + if (debug == 1) { + col = norm; + } + else if (debug == 2) { + if (root) { + col = vec3(1.0, 1.0, 1.0)*(dist.y / 1000.0); + } + else { + col = vec3(1.0, 0.0, 0.0)*(dist.y*3.0 / 50.0); + } + } + else if (debug == 3) { + col = vec3(1.0) * ((5.0 - t) / 5.0); + } + + + } + + + return vec4(col, 1.0); //rd; // camera ray direction debug view +} + +mat3 setCamera(in vec3 ro, in vec3 ta, float cr) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + + vec3 cw = normalize(ta - ro); + vec3 cp = vec3(sin(cr), cos(cr), 0.0); + vec3 cu = normalize(cross(cw, cp)); + vec3 cv = normalize(cross(cu, cw)); + return mat3(cu, cv, cw); +} + +float intersectSimple( const vec3 ro, const vec3 rd ) { + float maxd = 10000.0; + float precis = 0.001; + float h=precis*2.0; + float t = 0.0; + for( int i=0; i<50; i++ ) { + if( abs(h)maxd ) break; { + t += h; + vec2 newt = scene(ro, rd, t); + h = newt.x; + } + } + + return t; +} +void mainImage(out vec4 fragColor, in vec2 fragCoord) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + vec2 q = fragCoord.xy / iResolution.xy; + vec2 p = -1.0 + 2.0 * q; + p.x *= iResolution.x / iResolution.y; + vec2 mo = iMouse.xy / iResolution.xy; + + time = 15.0 + iGlobalTime; + + // camera + vec3 ro = vec3( + -0.5 + 3.5 * cos(0.1 + 6.0 * mo.x), + 0.0 + 2.0 * mo.y, + 0.5 + 3.5 * sin(0.1 + 6.0 * mo.x)); + vec3 ta = vec3(-0.5, -0.4, 0.5); + + // camera-to-world transformation + mat3 ca = setCamera(ro, ta, 
0.0); + + // ray direction + vec3 rd = ca * normalize(vec3(p.xy, 2.0)); + + + float fresnel, refldist = 5000., maxdist = 5000.; + bool reflected = false; + vec3 normal, col = bgColor( rd ); + vec3 roo = ro, rdo = rd, bgc = col; + float distSimple = intersectSimple(ro,rd); + if( intersectPlane( ro, rd, 0., refldist ) && refldist < distSimple ) { + ro += (refldist)*rd; + vec2 coord = ro.xz; + float bumpfactor = BUMPFACTOR * (1. - smoothstep( 0., BUMPDISTANCE, refldist) ); + + vec2 dx = vec2( EPSILON, 0. ); + vec2 dz = vec2( 0., EPSILON ); + + normal = vec3( 0., 1., 0. ); + normal.x = -bumpfactor * (waterMap(coord + dx) - waterMap(coord-dx) ) / (2. * EPSILON); + normal.z = -bumpfactor * (waterMap(coord + dz) - waterMap(coord-dz) ) / (2. * EPSILON); + normal = normalize( normal ); + + float ndotr = dot(normal,rd); + fresnel = pow(1.0-abs(ndotr),5.); + + rd = reflect( rd, normal); + + reflected = true; + bgc = col = bgColor( rd ); + } + + // render + vec4 color = render(roo, rdo, 0.); + col = vec3(color); + if(reflected) { + col = mix( col.xyz, bgc, 1.0-exp(-0.0000005*refldist*refldist) ); + col *= fresnel*0.9; + vec3 refr = refract( rdo, normal, 1./1.3330 ); + intersectPlane( ro, refr, -2., refldist ); + col += mix( texture2D( iChannel2, (roo+refldist*refr).xz*1.3 ).xyz * + vec3(1.,.9,0.6), vec3(1.,.9,0.8)*0.5, clamp( refldist / 3., 0., 1.) ) + * (1.-fresnel)*0.125; + + } + col = pow(col, vec3(0.7)); //4545)); + col = col*col*(3.0-2.0*col); + col = mix( col, vec3(dot(col,vec3(0.33))), -0.5 ); + col *= 0.25 + 0.75*pow( 16.0*q.x*q.y*(1.0-q.x)*(1.0-q.y), 0.1 ); + fragColor = vec4(col, 1.0); + + +} \ No newline at end of file diff --git a/finalScene.glsl b/finalScene.glsl new file mode 100644 index 0000000..b7ff9a7 --- /dev/null +++ b/finalScene.glsl @@ -0,0 +1,410 @@ +//http://www.iquilezles.org/www/articles/menger/menger.htm - mendel sponge +//https://en.wikipedia.org/wiki/Blinn%E2%80%93Phong_shading_model - blinn-phong lighting +//http://graphics.cs.williams.edu/courses/cs371/f14/reading/implicit.pdf - ray marching/sphere tracing +//http://www2.compute.dtu.dk/pubdb/views/edoc_download.php/6392/pdf/imm6392.pdf - ambient occlusion/soft shadows +//--Distance Functions------------------------------------------------------------------- + +#define NORMALS 0 +#define RAY_STEPS 0 +#define DISTANCE 0 +#define SPHERE_TRACE 1 + + +float planeDist( vec3 p ) +{ + /*if (p.y < ((sin(p.x) - sin(p.z)) / 4.0)) return (sin(p.x) - sin(p.z)) / 4.0; + else return 100.0; + return 100.0;*/ + return p.y; + +} + +float sphereDist(vec3 p, float r) { + return length(p) - r; +} + +float boxDist( vec3 p, vec3 b ) +{ + vec3 d = abs(p) - b; + return min(max(d.x,max(d.y,d.z)),0.0) + length(max(d,0.0)); +} + +float torusDist( vec3 p, vec2 t ) +{ + return length( vec2(length(p.xz)-t.x,p.y) )-t.y; +} + +float roundBoxDist( vec3 p, vec3 b, float r ) +{ + return length(max(abs(p)-b,0.0))-r; +} + +float crossDist( in vec3 p ) +{ + float da = boxDist(p.xyz,vec3(100000,1.0,1.0)); + float db = boxDist(p.yzx,vec3(1.0,100000,1.0)); + float dc = boxDist(p.zxy,vec3(1.0,1.0,100000)); + return min(da,min(db,dc)); +} + +float crossDist2( in vec3 p ) +{ + float da = boxDist(p.xyz,vec3(.7,.3,.3)); + float db = boxDist(p.yzx,vec3(.3,.7,.3)); + float dc = boxDist(p.zxy,vec3(.3,.3,.7)); + return min(da,min(db,dc)); +} + +//--CSG Functions---------------------------------------------------------------------- + +float diffFunc(float d1, float d2) { + return max(d1, -d2); +} + +float intersectionFunc(float d1, float d2) { + return max(d1, d2); +} + 
+float repeat( vec3 p, vec3 c ) +{ + vec3 q = mod(p,c)-0.5*c; + vec4 height = texture2D(iChannel0, p.xz); + //float avg = clamp((height.x + height.y + height.z + height.w) / 4.0, 0.0, 2.0); + return roundBoxDist(q - vec3(0.0, 0.0, 0.0), vec3(.35, 0.1, .35), 0.1); +} + +float displace( vec3 p ) +{ + float d1 = torusDist(p, vec2(.2, .2)); + float d2 = (sin(10.0*p.x)*sin(10.0*p.y)*sin(10.0*p.z)) / (16.0*p.y); + return d1+d2; +} + +vec3 transform(vec3 pt, vec3 translate, vec3 rot, vec3 scale) { + scale.x = 1.0/scale.x; + scale.y = 1.0/scale.y; + scale.z = 1.0/scale.z; + mat3 invRot = mat3(scale.x*cos(rot.y)*cos(rot.x), sin(rot.y)*sin(rot.z)*cos(rot.x) - cos(rot.z)*sin(rot.x) , sin(rot.y)*sin(rot.x) + cos(rot.z)*sin(rot.y)*cos(rot.x) , + cos(rot.y)*sin(rot.x), (sin(rot.z)*sin(rot.y)*sin(rot.x) + cos(rot.z)*cos(rot.x))*scale.y, sin(rot.x)*sin(rot.y)*cos(rot.z) - cos(rot.x)*sin(rot.z), + -sin(rot.y), cos(rot.y)*sin(rot.z), cos(rot.y)*cos(rot.z)*scale.z); + mat4 trans = mat4(scale.x*cos(rot.y)*cos(rot.x), sin(rot.y)*sin(rot.z)*cos(rot.x) - cos(rot.z)*sin(rot.x) , sin(rot.y)*sin(rot.x) + cos(rot.z)*sin(rot.y)*cos(rot.x) , 0.0, + cos(rot.y)*sin(rot.x), (sin(rot.z)*sin(rot.y)*sin(rot.x) + cos(rot.z)*cos(rot.x))*scale.y, sin(rot.x)*sin(rot.y)*cos(rot.z) - cos(rot.x)*sin(rot.z), 0.0, + -sin(rot.y), cos(rot.y)*sin(rot.z), cos(rot.y)*cos(rot.z)*scale.z, 0.0, + (-invRot*translate).x, (-invRot*translate).y, (-invRot*translate).z, 1.0); + + vec4 newPt = vec4(pt, 1.0); + newPt = trans*newPt; + return vec3(newPt); + +} + +vec2 myMin(vec2 d1, vec2 d2) { + + return (d1.x tmax) break; + } + return clamp(shadow, 0.0, 1.0); + +} + +float ambientOcc( in vec3 pt, in vec3 norm ) +{ + float occ = 0.0; + float d = 0.0; + for(float k=1.0; k<10.0; k++ ) + { + d = scene(pt, norm, .01*k).x; + occ = (1.0 / pow(2.0, k)) * (k*.01 - d); + } + return clamp(1.0 - 3000.0*occ, 0.0, 1.0); +} + +vec3 render(in vec3 ro, in vec3 rd) { + // TODO + int debug = 1; + bool root; + if (SPHERE_TRACE == 1) root = false; + else root = true; + vec3 col = vec3(.8, .9, 1.0); + float t = -1.0; + vec3 dist; + if (root) { + dist = findRoot(ro, rd); + } + else { + dist = sphereTrace(ro, rd); + } + t = dist.x; + if (t < 10.0) { + vec3 pt = ro + rd*t; + vec3 norm = calcNorm(pt); + vec4 diffuse = vec4(1.0); + //material + if (dist.z == 0.0) { + diffuse = vec4(0.0, 0.0, 0.0, 1.0); + } + if (dist.z == 1.0) { + vec4 x = texture2D( iChannel1, pt.yz ); + vec4 y = texture2D( iChannel1, pt.zx ); + vec4 z = texture2D( iChannel1, pt.yx ); + vec3 a = abs(norm); + diffuse = (x*a.x + y*a.y + z*a.z) / (a.x + a.y + a.z); + + } + if (dist.z == 6.0) { + vec4 x = texture2D( iChannel2, pt.yz ); + vec4 y = texture2D( iChannel2, pt.zx ); + vec4 z = texture2D( iChannel2, pt.yx ); + vec3 a = abs(norm); + diffuse = (x*a.x + y*a.y + z*a.z) / (a.x + a.y + a.z); + } + //end material + if (dist.z == 2.0) { + diffuse = vec4(0.0, 1.0, 0.0, 1.0); + } + if (dist.z == 3.0) { + diffuse = vec4(0.0, 1.0, 1.0, 1.0); + } + if (dist.z == 5.0) { + diffuse = vec4(1.0, 0.0, 0.0, 1.0); + } + vec3 ref = reflect(rd, norm); + vec3 light = normalize(vec3(0.0, 2.0, 2.0) - pt); + float lambert = clamp(dot(light, norm), 0.0, 1.0); + float amb = ambientOcc(pt, norm); + //soft shadows + lambert *= softshadow( pt, light, 0.02, 2.5 ); + float dom = smoothstep( -0.1, 0.1, ref.y ); + dom *= softshadow( pt, ref, 0.02, 2.5 ); + + float specular = 0.0; + if (lambert > 0.0) { + vec3 viewDir = normalize(-pt); + vec3 halfDir = normalize(light + viewDir); + float specAngle = clamp(dot(halfDir, norm), 0.0, 1.0); + 
specular = pow(specAngle, 4.0); + + } + + col = vec3(amb*.2) + lambert * vec3(diffuse) + specular * vec3(0.5); + + col = pow(col, vec3(1.0/2.2)); + col *= 1.0 - smoothstep( 20.0, 40.0, t ); + + + + if (NORMALS == 1) { + col = norm; + } + else if (RAY_STEPS == 1) { + if (root) { + col = vec3(1.0, 0.0, 0.0)*(dist.y / 1000.0); + } + else { + col = vec3(1.0, 0.0, 0.0)*(dist.y / 50.0); + } + } + else if (DISTANCE == 1) { + col = vec3(1.0) * ((5.0 - t) / 5.0); + } + } + return col; //rd; // camera ray direction debug view +} + +mat3 setCamera(in vec3 ro, in vec3 ta, float cr) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + + vec3 cw = normalize(ta - ro); + vec3 cp = vec3(sin(cr), cos(cr), 0.0); + vec3 cu = normalize(cross(cw, cp)); + vec3 cv = normalize(cross(cu, cw)); + return mat3(cu, cv, cw); +} + +void mainImage(out vec4 fragColor, in vec2 fragCoord) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + + vec2 q = fragCoord.xy / iResolution.xy; + vec2 p = -1.0 + 2.0 * q; + p.x *= iResolution.x / iResolution.y; + vec2 mo = iMouse.xy / iResolution.xy; + + float time = 15.0 + iGlobalTime; + + // camera + vec3 ro = vec3( + -0.5 + 3.5 * cos(0.1 * time + 6.0 * mo.x), + 1.0 + 2.0 * mo.y, + 0.5 + 3.5 * sin(0.1 * time + 6.0 * mo.x)); + vec3 ta = vec3(-0.5, -0.4, 0.5); + + // camera-to-world transformation + mat3 ca = setCamera(ro, ta, 0.0); + + // ray direction + vec3 rd = ca * normalize(vec3(p.xy, 2.0)); + + // render + vec3 col = render(ro, rd); + + col = pow(col, vec3(0.4545)); + + fragColor = vec4(col, 1.0); +} \ No newline at end of file diff --git a/firstScene.glsl b/firstScene.glsl new file mode 100644 index 0000000..2e6252f --- /dev/null +++ b/firstScene.glsl @@ -0,0 +1,347 @@ +//http://www.iquilezles.org/www/articles/menger/menger.htm - mendel sponge +//https://en.wikipedia.org/wiki/Blinn%E2%80%93Phong_shading_model - blinn-phong lighting +//http://graphics.cs.williams.edu/courses/cs371/f14/reading/implicit.pdf - ray marching/sphere tracing +//http://www2.compute.dtu.dk/pubdb/views/edoc_download.php/6392/pdf/imm6392.pdf - ambient occlusion/soft shadows +//--Distance Functions------------------------------------------------------------------- +#define NO_DEBUG 1 +#define NORMALS 0 +#define RAY_STEPS 0 +#define DISTANCE 0 +#define SPHERE_TRACE 1 + + +float planeDist( vec3 p ) +{ + /*if (p.y < ((sin(p.x) - sin(p.z)) / 4.0)) return (sin(p.x) - sin(p.z)) / 4.0; + else return 100.0; + return 100.0;*/ + return p.y; + +} + +float sphereDist(vec3 p, float r) { + return length(p) - r; +} + +float boxDist( vec3 p, vec3 b ) +{ + vec3 d = abs(p) - b; + return min(max(d.x,max(d.y,d.z)),0.0) + length(max(d,0.0)); +} + +float torusDist( vec3 p, vec2 t ) +{ + return length( vec2(length(p.xz)-t.x,p.y) )-t.y; +} + +float roundBoxDist( vec3 p, vec3 b, float r ) +{ + return length(max(abs(p)-b,0.0))-r; +} + +float crossDist( in vec3 p ) +{ + float da = boxDist(p.xyz,vec3(100000,1.0,1.0)); + float db = boxDist(p.yzx,vec3(1.0,100000,1.0)); + float dc = boxDist(p.zxy,vec3(1.0,1.0,100000)); + return min(da,min(db,dc)); +} + +//--CSG Functions---------------------------------------------------------------------- + +float diffFunc(float d1, float d2) { + return max(d1, -d2); +} + +float intersectionFunc(float d1, float d2) { + return max(d1, d2); +} + +float repeat( vec3 p, vec3 c ) +{ + vec3 q = mod(p,c)-0.5*c; + vec4 height = texture2D(iChannel0, p.xz); + //float avg = clamp((height.x + height.y + height.z + height.w) / 
4.0, 0.0, 2.0); + return roundBoxDist(q - vec3(0.0, 0.0, 0.0), vec3(.35, 0.1, .35), 0.1); +} + +float displace( vec3 p ) +{ + float d1 = torusDist(p, vec2(1.0, .2)); + float d2 = (sin(20.0*p.x)*sin(20.0*p.y)*sin(20.0*p.z)) / (16.0*p.y); + return d1+d2; +} + +vec3 transform(vec3 pt, vec3 translate, vec3 rot, vec3 scale) { + scale.x = 1.0/scale.x; + scale.y = 1.0/scale.y; + scale.z = 1.0/scale.z; + mat3 invRot = mat3(scale.x*cos(rot.y)*cos(rot.x), sin(rot.y)*sin(rot.z)*cos(rot.x) - cos(rot.z)*sin(rot.x) , sin(rot.y)*sin(rot.x) + cos(rot.z)*sin(rot.y)*cos(rot.x) , + cos(rot.y)*sin(rot.x), (sin(rot.z)*sin(rot.y)*sin(rot.x) + cos(rot.z)*cos(rot.x))*scale.y, sin(rot.x)*sin(rot.y)*cos(rot.z) - cos(rot.x)*sin(rot.z), + -sin(rot.y), cos(rot.y)*sin(rot.z), cos(rot.y)*cos(rot.z)*scale.z); + mat4 trans = mat4(scale.x*cos(rot.y)*cos(rot.x), sin(rot.y)*sin(rot.z)*cos(rot.x) - cos(rot.z)*sin(rot.x) , sin(rot.y)*sin(rot.x) + cos(rot.z)*sin(rot.y)*cos(rot.x) , 0.0, + cos(rot.y)*sin(rot.x), (sin(rot.z)*sin(rot.y)*sin(rot.x) + cos(rot.z)*cos(rot.x))*scale.y, sin(rot.x)*sin(rot.y)*cos(rot.z) - cos(rot.x)*sin(rot.z), 0.0, + -sin(rot.y), cos(rot.y)*sin(rot.z), cos(rot.y)*cos(rot.z)*scale.z, 0.0, + (-invRot*translate).x, (-invRot*translate).y, (-invRot*translate).z, 1.0); + + vec4 newPt = vec4(pt, 1.0); + newPt = trans*newPt; + return vec3(newPt); + +} +//--Different Scenes------------------------------------------------------------------- + +float sceneFractal(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + + float tmin = boxDist(pt - vec3(1.0, 0.0, 0.0), vec3(.5)); + + float s = 1.0; + for( int m=0; m<3; m++ ) + { + vec3 a = mod( pt*s, 2.0 )-1.0; + s *= 3.0; + vec3 r = abs(1.0 - 3.0*abs(a)); + + float da = max(r.x,r.y); + float db = max(r.y,r.z); + float dc = max(r.z,r.x); + float c = (min(da,min(db,dc))-1.0)/s; + + tmin = max(tmin,c); + } + tmin = min(tmin, planeDist(pt - vec3(0.0, -.5, 0.0))); + return tmin; + +} + +float sceneDisplacement(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + float tmin = displace(pt); + + return tmin; +} + +float sceneNothin(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + float tmin = 0.0; + + return tmin; +} +float sceneRepeat(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + float tmin = repeat(pt, vec3(1.0, 0.0, 1.0)); + + return tmin; + +} + +float sceneTransform(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + t*rd; + + vec3 pos = transform(vec3(pt), vec3(1.0, 0.0, 0.0), vec3(radians(iGlobalTime), radians(0.), radians(45.)), vec3(.5, 1.0, 1.0)); + float tmin = boxDist(pos, vec3(0.5)); + + return tmin; +} + +float sceneHeight(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + vec4 color1 = texture2D (iChannel0, pt.xz); + + float tmin = pt.y - (color1.y); + return tmin; +} + +float sceneLighting(vec3 ro, vec3 rd, float t) { + vec3 pt = ro + rd*t; + float tmin = sphereDist(pt - vec3(0.0, -1.0, 0.0), .50); + tmin = min(tmin, planeDist(pt - vec3(0.0, -.5, 0.0))); + tmin = min(tmin, boxDist(pt - vec3( 1.0,-.25, 0.0), vec3(0.25))); + tmin = min(tmin, torusDist(pt - vec3(-1.0, 0.25, 0.0), vec2(0.20,0.05))); + tmin = min(tmin, diffFunc(boxDist(pt - vec3(0.0), vec3(0.50, 0.30, 0.30)), sphereDist(pt - vec3(0.0), 0.40))); + return tmin; +} + +//--Ray Marching------------------------------------------------------------------ + +vec3 calcNorm( in vec3 pos ) +{ + vec3 eps = vec3( 0.001, 0.0, 0.0 ); + vec3 nor = vec3( + scene(pos+eps.xyy, vec3(0.0), 0.0) - scene(pos-eps.xyy, vec3(0.0), 0.0), + scene(pos+eps.yxy, vec3(0.0), 0.0) - scene(pos-eps.yxy, vec3(0.0), 0.0), + 
scene(pos+eps.yyx, vec3(0.0), 0.0) - scene(pos-eps.yyx, vec3(0.0), 0.0) ); + return normalize(nor); +} + +vec2 findRoot(vec3 ro, vec3 rd) { + float dist = 100.0; + float i = 0.0; + for (float t = 0.0; t < 5.0; t += .01) { + i++; + if (scene(ro, rd, t) < 0.0) { + dist = t; + break; + } + } + + return vec2(dist, i); + +} + +vec2 sphereTrace(vec3 ro, vec3 rd) { + float t = 0.0; + float dt; + float numTraces = 0.0; + for (int i = 0; i < 50; i++) { + numTraces++; + dt = scene(ro, rd, t); + t = t + dt; + if (dt < 0.0001) { + break; + } + } + return vec2(t, numTraces); +} + +float softshadow( in vec3 ro, in vec3 rd, in float tmin, in float tmax ) +{ + float shadow = 1.0; + float t = tmin; + float d = 0.0; + for( int i=0; i<16; i++ ) + { + d = scene(ro, rd, t); + if (d < 0.0001) return 0.0; + shadow = min( shadow, 8.0*d/t ); + t += d; + if( d<0.0001 || t > tmax) break; + } + return clamp(shadow, 0.0, 1.0); + +} + +float ambientOcc( in vec3 pt, in vec3 norm ) +{ + float occ = 0.0; + float d = 0.0; + for(float k=1.0; k<10.0; k++ ) + { + d = scene(pt, norm, .01*k); + occ = (1.0 / pow(2.0, k)) * (k*.01 - d); + } + return clamp(1.0 - 3000.0*occ, 0.0, 1.0); +} + +vec3 render(in vec3 ro, in vec3 rd) { + // TODO + int debug = 1; + bool root; + if (SPHERE_TRACE == 1) root = false; + else root = true; + vec3 col = vec3(.8, .9, 1.0); + float t = -1.0; + vec2 dist; + if (root) { + dist = findRoot(ro, rd); + } + else { + dist = sphereTrace(ro, rd); + } + t = dist.x; + if (t < 10.0) { + vec3 pt = ro + rd*t; + vec3 norm = calcNorm(pt); + + //material + vec4 x = texture2D( iChannel1, pt.yz ); + vec4 y = texture2D( iChannel1, pt.zx ); + vec4 z = texture2D( iChannel1, pt.yx ); + vec3 a = abs(norm); + vec4 diffuse = (x*a.x + y*a.y + z*a.z) / (a.x + a.y + a.z); + + //end material + + vec3 ref = reflect(rd, norm); + vec3 light = normalize(vec3(0.0, 2.0, 2.0) - pt); + float lambert = clamp(dot(light, norm), 0.0, 1.0); + float amb = ambientOcc(pt, norm); + //soft shadows + lambert *= softshadow( pt, light, 0.02, 2.5 ); + float dom = smoothstep( -0.1, 0.1, ref.y ); + dom *= softshadow( pt, ref, 0.02, 2.5 ); + + float specular = 0.0; + if (lambert > 0.0) { + vec3 viewDir = normalize(-pt); + vec3 halfDir = normalize(light + viewDir); + float specAngle = clamp(dot(halfDir, norm), 0.0, 1.0); + specular = pow(specAngle, 4.0); + + } + + col = vec3(.2) + lambert * vec3(diffuse) + specular * vec3(0.5); //amb* + + col = pow(col, vec3(1.0/2.2)); + col *= 1.0 - smoothstep( 20.0, 40.0, t ); + if (NORMALS == 1) { + col = norm; + } + else if (RAY_STEPS == 1) { + if (root) { + col = vec3(1.0, 0.0, 0.0)*(dist.y / 500.0); + } + else { + col = vec3(1.0, 0.0, 0.0)*(dist.y / 50.0); + } + } + else if (DISTANCE == 1) { + col = vec3(1.0) * ((5.0 - t) / 5.0); + } + } + return col; //rd; // camera ray direction debug view +} + +mat3 setCamera(in vec3 ro, in vec3 ta, float cr) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + + vec3 cw = normalize(ta - ro); + vec3 cp = vec3(sin(cr), cos(cr), 0.0); + vec3 cu = normalize(cross(cw, cp)); + vec3 cv = normalize(cross(cu, cw)); + return mat3(cu, cv, cw); +} + +void mainImage(out vec4 fragColor, in vec2 fragCoord) { + // Starter code from iq's Raymarching Primitives + // https://www.shadertoy.com/view/Xds3zN + + vec2 q = fragCoord.xy / iResolution.xy; + vec2 p = -1.0 + 2.0 * q; + p.x *= iResolution.x / iResolution.y; + vec2 mo = iMouse.xy / iResolution.xy; + + float time = 15.0 + iGlobalTime; + + // camera + vec3 ro = vec3( + -0.5 + 3.5 * cos(0.1 * time + 
6.0 * mo.x), + 1.0 + 2.0 * mo.y, + 0.5 + 3.5 * sin(0.1 * time + 6.0 * mo.x)); + vec3 ta = vec3(-0.5, -0.4, 0.5); + + // camera-to-world transformation + mat3 ca = setCamera(ro, ta, 0.0); + + // ray direction + vec3 rd = ca * normalize(vec3(p.xy, 2.0)); + + // render + vec3 col = render(ro, rd); + + col = pow(col, vec3(0.4545)); + + fragColor = vec4(col, 1.0); +} \ No newline at end of file diff --git a/img/amb_occ_debug.png b/img/amb_occ_debug.png new file mode 100644 index 0000000..8593488 Binary files /dev/null and b/img/amb_occ_debug.png differ diff --git a/img/ambient_occlusion.png b/img/ambient_occlusion.png new file mode 100644 index 0000000..00ccf90 Binary files /dev/null and b/img/ambient_occlusion.png differ diff --git a/img/before_amb_occ.png b/img/before_amb_occ.png new file mode 100644 index 0000000..225ffcc Binary files /dev/null and b/img/before_amb_occ.png differ diff --git a/img/blinn_phong_lighting.png b/img/blinn_phong_lighting.png new file mode 100644 index 0000000..16c4352 Binary files /dev/null and b/img/blinn_phong_lighting.png differ diff --git a/img/bridge2.png b/img/bridge2.png new file mode 100644 index 0000000..0ceb291 Binary files /dev/null and b/img/bridge2.png differ diff --git a/img/bridge_over_water.png b/img/bridge_over_water.png new file mode 100644 index 0000000..893cd97 Binary files /dev/null and b/img/bridge_over_water.png differ diff --git a/img/bridge_tracks.png b/img/bridge_tracks.png new file mode 100644 index 0000000..25f6a45 Binary files /dev/null and b/img/bridge_tracks.png differ diff --git a/img/bridge_under_water.png b/img/bridge_under_water.png new file mode 100644 index 0000000..b4bb36c Binary files /dev/null and b/img/bridge_under_water.png differ diff --git a/img/debug_image.png b/img/debug_image.png new file mode 100644 index 0000000..26a1458 Binary files /dev/null and b/img/debug_image.png differ diff --git a/img/debug_image_norm.png b/img/debug_image_norm.png new file mode 100644 index 0000000..035a695 Binary files /dev/null and b/img/debug_image_norm.png differ diff --git a/img/debug_image_orig.png b/img/debug_image_orig.png new file mode 100644 index 0000000..45d12ec Binary files /dev/null and b/img/debug_image_orig.png differ diff --git a/img/debug_image_steps.png b/img/debug_image_steps.png new file mode 100644 index 0000000..24d748a Binary files /dev/null and b/img/debug_image_steps.png differ diff --git a/img/debug_naive.png b/img/debug_naive.png new file mode 100644 index 0000000..907f5ed Binary files /dev/null and b/img/debug_naive.png differ diff --git a/img/debug_sphere.png b/img/debug_sphere.png new file mode 100644 index 0000000..1e6b38e Binary files /dev/null and b/img/debug_sphere.png differ diff --git a/img/final_scene_gif.gif b/img/final_scene_gif.gif new file mode 100644 index 0000000..3d1f934 Binary files /dev/null and b/img/final_scene_gif.gif differ diff --git a/img/height_map1.png b/img/height_map1.png new file mode 100644 index 0000000..b30c966 Binary files /dev/null and b/img/height_map1.png differ diff --git a/img/height_map2.png b/img/height_map2.png new file mode 100644 index 0000000..5ed4e71 Binary files /dev/null and b/img/height_map2.png differ diff --git a/img/menger_sponge.png b/img/menger_sponge.png new file mode 100644 index 0000000..928b758 Binary files /dev/null and b/img/menger_sponge.png differ diff --git a/img/pie.png b/img/pie.png new file mode 100644 index 0000000..da6ace8 Binary files /dev/null and b/img/pie.png differ diff --git a/img/rotating.gif b/img/rotating.gif new file mode 100644 index 
0000000..9acef32 Binary files /dev/null and b/img/rotating.gif differ diff --git a/img/rotating_box_gif.gif b/img/rotating_box_gif.gif new file mode 100644 index 0000000..01ac8c5 Binary files /dev/null and b/img/rotating_box_gif.gif differ diff --git a/img/smoke.gif b/img/smoke.gif new file mode 100644 index 0000000..68f7869 Binary files /dev/null and b/img/smoke.gif differ diff --git a/img/soft_shadow.png b/img/soft_shadow.png new file mode 100644 index 0000000..bdffe10 Binary files /dev/null and b/img/soft_shadow.png differ diff --git a/img/sun_reflecting.png b/img/sun_reflecting.png new file mode 100644 index 0000000..cd2382c Binary files /dev/null and b/img/sun_reflecting.png differ diff --git a/img/train.png b/img/train.png new file mode 100644 index 0000000..0d231fd Binary files /dev/null and b/img/train.png differ diff --git a/img/train_gif.gif b/img/train_gif.gif new file mode 100644 index 0000000..c96d63a Binary files /dev/null and b/img/train_gif.gif differ diff --git a/img/train_smoke.png b/img/train_smoke.png new file mode 100644 index 0000000..39c0d07 Binary files /dev/null and b/img/train_smoke.png differ diff --git a/img/wood_material.png b/img/wood_material.png new file mode 100644 index 0000000..cc1fe67 Binary files /dev/null and b/img/wood_material.png differ diff --git a/raymarch.glsl b/raymarch.glsl deleted file mode 100644 index aac90c8..0000000 --- a/raymarch.glsl +++ /dev/null @@ -1,47 +0,0 @@ -vec3 render(in vec3 ro, in vec3 rd) { - // TODO - return rd; // camera ray direction debug view -} - -mat3 setCamera(in vec3 ro, in vec3 ta, float cr) { - // Starter code from iq's Raymarching Primitives - // https://www.shadertoy.com/view/Xds3zN - - vec3 cw = normalize(ta - ro); - vec3 cp = vec3(sin(cr), cos(cr), 0.0); - vec3 cu = normalize(cross(cw, cp)); - vec3 cv = normalize(cross(cu, cw)); - return mat3(cu, cv, cw); -} - -void mainImage(out vec4 fragColor, in vec2 fragCoord) { - // Starter code from iq's Raymarching Primitives - // https://www.shadertoy.com/view/Xds3zN - - vec2 q = fragCoord.xy / iResolution.xy; - vec2 p = -1.0 + 2.0 * q; - p.x *= iResolution.x / iResolution.y; - vec2 mo = iMouse.xy / iResolution.xy; - - float time = 15.0 + iGlobalTime; - - // camera - vec3 ro = vec3( - -0.5 + 3.5 * cos(0.1 * time + 6.0 * mo.x), - 1.0 + 2.0 * mo.y, - 0.5 + 3.5 * sin(0.1 * time + 6.0 * mo.x)); - vec3 ta = vec3(-0.5, -0.4, 0.5); - - // camera-to-world transformation - mat3 ca = setCamera(ro, ta, 0.0); - - // ray direction - vec3 rd = ca * normalize(vec3(p.xy, 2.0)); - - // render - vec3 col = render(ro, rd); - - col = pow(col, vec3(0.4545)); - - fragColor = vec4(col, 1.0); -}