GLSL-generated sphere, with some audio data used to disturb the form. All computation of the form's normals and lighting is done on the GPU; Processing is used only for the audio analysis.
they want their colour cycling back - funny that in 2008 you need to make a hardware shader and a load of code to recreate this. time for some rave videos!
Quick realtime depth of field test. Each sphere is a weak deferred point light source with a random colour (high attenuation) - this causes all the different colours. A simple HDR tone mapping equation is used to adjust the exposure - without it, all the lights bunched in the center tend to turn a bright white.
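The exact tone mapping equation isn't given, but the idea can be sketched with a Reinhard-style operator (a minimal Python stand-in, with a hypothetical `exposure` parameter; not the author's actual shader code):

```python
def tone_map(hdr_rgb, exposure=1.0):
    """Reinhard-style tone mapping: compresses unbounded HDR channel
    values into [0, 1) so clustered bright lights no longer clip to
    pure white in the center."""
    return [c * exposure / (1.0 + c * exposure) for c in hdr_rgb]

# A very bright HDR value (50.0) still maps below 1.0 instead of clipping:
print(tone_map([50.0, 10.0, 2.0]))
```

Because the curve is strictly increasing but asymptotic to 1.0, relative brightness between the overlapping lights is preserved instead of being flattened to white.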
Unfortunately there is no documentation on this project. So in a few words ...
3D Scenes can be loaded in the application and animated in realtime. The content was projected on long layers of transparent material in an old beer brewery in Vienna.
This is my first collaboration with Yannick Jacquet (Antivj / Legoman).
Openframeworks + GLSL Geometry Shaders.
Pretty quiet in Europe.
Made with a C++ framework being developed by a team at Barbarian Group, headed up by Andrew Bell ( www.drawnline.net ).
In working on the Java landscape engine, I started to finally realize and accept that I should be doing it all in C++, the demon bastard language that has claimed many an art-school graduate. Luckily for me, Andrew and his gang of geniuses have been cranking away on a fantastic C++ cozy that has made the transition from Java 95% painless.
This particular project, which is being made as a way for me to take baby steps instead of diving right into the deep end (mixed metaphor!), is a visualization of the last 7 days of earthquakes with a magnitude of 2.5 and higher. Incidentally, I accidentally rediscovered bumpmapping when trying to wrap my head around normals and normal maps. I love it when that happens! Instead of actually reading about bumpmaps and learning it the proper way, I did it by trying every arithmetic operator one by one until I got something interesting. Three cheers for oblivious discovery! Now if only I could accidentally understand how to use quaternions.
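The "obliviously discovered" bump mapping above boils down to one standard trick: perturb the surface normal by the gradient of a height map. A minimal sketch of that idea (Python, with a toy 2D height grid; this is the general technique, not the author's shader):

```python
import math

def bump_normal(height, x, y, strength=1.0):
    """Perturb an up-facing normal (0, 0, 1) by the height-map gradient
    at grid cell (x, y), using central finite differences."""
    dhdx = (height[y][x + 1] - height[y][x - 1]) * 0.5
    dhdy = (height[y + 1][x] - height[y - 1][x]) * 0.5
    nx, ny, nz = -dhdx * strength, -dhdy * strength, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# A flat height map leaves the normal unperturbed:
print(bump_normal([[0.0] * 3 for _ in range(3)], 1, 1))
```

Lighting computed against the perturbed normal then makes a flat surface shade as if it were dented, which is exactly the effect that shows up when you start multiplying normals by texture values "one operator at a time."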
Oh, and the awesome high-res texture maps are from the extremely wonderful oera.net. www.oera.net/How2/TextureMaps2.htm
Should be viewed large (or on black) to appreciate lighting.
I love the way this blends organic and crystal structure and coloring. Like some malachite deposit gone wild.
Beautiful Mandelbrot spiral with "Orbit Traps" (the glowing white lines).
Created using Fragmentarium, an IDE that makes it easy (well, easier) to explore fractal and generative systems written in GLSL shader language.
Quite amazing.
Some interesting reading that I stumbled upon about black bears in Ontario (from the Evaluation of Ontario's Provincial Bear Wise Program, submitted by Dave Payne and Dan Duggan, January 2009):
Two areas of the province, Sudbury and Sault Ste Marie have been noted to have particularly high numbers of human-bear conflicts. Sudbury’s situation with respect to problem bears is a direct result of human development on the local environment.
One hundred and twenty years ago the Sudbury area was dominated by a mature red/white pine forest. In the late 1800s, mineral exploration and mining resulted in much of this forest being lost, and the fumes from the open-pit roasting of sulphur-bearing ore (with wood as fuel) acidified the soils to the extent that very little could grow within an area of about 140 square miles (approx. 362 km²).
The blueberry shrub, because it thrives in acidic soils, was one of the few species able to flourish in what was otherwise depicted as a “moonscape.” In the early 1970s, NASA even sent its astronauts to Sudbury in order to prepare for a moon landing. Over time, the increase in available food supply meant that the Sudbury area was able to support a higher density of bears than elsewhere in Ontario. During parts of the growing season when blueberries are unavailable, some of these animals invariably come into conflict with people as they attempt to take advantage of alternative food sources (i.e. urban garbage, bird seed, etc.).
Since the late 1970s, there has been a concerted effort to "re-green" the Sudbury landscape. This has resulted in a slow decline in the availability of blueberries as other plant species re-establish themselves, and has caused bears to forage farther into urban areas for food.
The area’s high rate of human-bear conflicts is further exacerbated by the fact that, unlike communities that have a readily definable core area and perimeter, Sudbury is made up of more than a dozen widely-spaced communities. Undeveloped rock ridges are common within each of these small towns and provide foraging bears with travel corridors to almost all urban areas.
Sault Ste. Marie (SSM) District lies predominantly in the Great Lakes St. Lawrence (GLSL) and GLSL/Boreal transition forest and, like Sudbury, supports a relatively high density of black bears. Blueberries are not abundant in the Sault Ste. Marie area, and black bears are not as dependent on them as they would be in other areas of the province. There is, however, a good diversity of other fruit- and nut-bearing plant species utilized by black bears as forage, especially in the southern portion of the district. Apple and other fruit trees provide a significant food source. Black bears in the Sault Ste. Marie area are therefore not as affected by blueberry crop failures due to late frosts and/or summer drought.
GLSL-generated torus, with some audio data used to disturb the form. All computation of the form's normals and lighting is done on the GPU; Processing is used only for the audio analysis.
Generated using S.A.R.A (Synchronous Audio Reactive Algorithms) - custom software built in C++/OpenGL for real-time procedural, audio-reactive visual generation.
Demo here: vimeo.com/accidentally/sara
Installation by 1024 architecture
The work takes the form of a 3-meter cube consisting of a matrix of 81 LED bars inclined 24° forward, driven by new GLSL software created by 1024 together with partner Garage Cube, which CORE uses to generatively produce spatial visualizations of light.
Electronic: From Kraftwerk to The Chemical Brothers
(July – February 2021)
Evoking the experience of being in a club, the exhibition will transport you through the people, art, design, technology and photography that have been shaping the electronic music landscape.
Celebrate 50 years of legendary group Kraftwerk with their 3D show. Step into the visual world of The Chemical Brothers for one of their legendary live shows, as visuals and lights interact to create a new three-dimensional experience by Smith & Lyall.
Travel to dance floors from Detroit to Chicago, Paris, Berlin and the UK’s thriving scene; featuring over 400 objects and the likes of Detroit techno legends Kevin Saunderson, Juan Atkins, Jeff Mills and Richie Hawtin, "Godfather of House Music" Frankie Knuckles, Haçienda designer Ben Kelly and the extreme visual world created by Weirdcore for Aphex Twin’s ‘Collapse’.
Discover early pioneers Daphne Oram and the seminal BBC Radiophonic Workshop. Indulge your senses with large scale images of rave culture by Andreas Gursky, iconic DJ masks and fashion, a genre-spanning soundtrack by French DJ and producer Laurent Garnier, a sound reactive visual installation created specifically for the exhibition by 1024 architecture, graphics from Peter Saville CBE, history-making labels and club nights.
[Design Museum]
Started playing with a crosshatch shader. The starting point was here: learningwebgl.com/blog/?p=2858
Video: vimeo.com/18280138
Had a bunch of rendering experiments and snippets lying around and figured it was about time to assemble them into something a bit more reusable.
I've been wanting to create a flexible little OpenGL rendering toolset that can be used in creative frameworks such as OF and Cinder, or with GLFW, but is decoupled and does not have any dependency on them.
Here's where I'm at after a few days of mashing the keys:
- Deferred rendering (pointlights diffuse + specular)
- Normal mapping + specular maps
- Basic mesh + submeshes
- Materials
- Model loading through Assimp
- Image loading (stb_image.c)
- SSAO
Interactive: glslsandbox.com/e#14115.2
#ifdef GL_ES
precision mediump float;
#endif

uniform float time;
uniform vec2 mouse;
uniform vec2 resolution;

// this version keeps the circle inversion at r=1,
// but the affine transform is variable:
// user controls the rotate and zoom,
// translate automatically cycles through a range of values.

#define N 100
#define PI2 6.2831853070

void main( void ) {
	// map frag coord and mouse to model coord
	vec2 v = (gl_FragCoord.xy - resolution / 2.) * 20.0 / min(resolution.y, resolution.x);

	// transform parameters
	float angle = PI2 * mouse.x;
	float C = cos(angle);
	float S = sin(angle);
	vec2 shift = vec2( 2.*mouse.x - 1., 4.*mouse.y - 2. );
	float scale = 1.75;
	float rad2 = 1.;

	float rsum = 0.0;
	for ( int i = 0; i < N; i++ ) {
		// circle inversion at r = 1
		float rr = v.x*v.x + v.y*v.y;
		if ( rr > rad2 ) {
			rr = rad2 / rr;
			v.x = v.x * rr;
			v.y = -v.y * rr;
		}
		rsum = max(rr, rsum);

		// affine transform: rotate, scale, and translate
		v = vec2( C*v.x - S*v.y, S*v.x + C*v.y ) * scale + shift;
	}

	float col = rsum*rsum * (1000.0 / float(N) / rad2);

	// color basis vectors
	vec3 cb1 = vec3(0.8, 0.5, 0.0);
	vec3 cb2 = vec3(0.0, 0.5, 0.8);
	vec3 cb3 = vec3(1.) - cb1 - cb2;

	gl_FragColor = vec4( vec3(
		max(0., cos(col*1.8))*cb1 + max(0., cos(col*2.2))*cb2 + max(0., cos(col*2.2))*cb3 ), 1.0 );
}
This is turning out to be an interesting exercise in data organization. As some early images indicated, there is a ton of overlap with earthquakes. They tend to happen in the same general areas over and over so you end up with clusters of activity. For these two images, I decided to only showcase quakes larger than 6.0M. This means they get colored red and their location is written out.
To spread out the other quakes to avoid overlap, I used the quake's magnitude as the charge variable and forced the quakes to spread out and push away from each other using magnetic repulsion but still remain anchored to the original location. That way, the numbers will push into empty spaces so they can be read, but the side effect is that the magnitude text is no longer right above the actual quake epicenter.
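The spreading scheme described above is essentially charged-particle repulsion with a weak spring back to each anchor. A toy sketch of that layout loop (Python; the parameter names and constants are my own illustration, not the project's code):

```python
def spread_labels(points, charges, steps=100, repel=0.001, anchor_pull=0.01):
    """Push overlapping labels apart with inverse-square repulsion
    (scaled by 'charge' = quake magnitude) while a weak spring keeps
    each label anchored near its original epicenter."""
    anchors = [p[:] for p in points]
    pos = [p[:] for p in points]
    for _ in range(steps):
        for i in range(len(pos)):
            fx = fy = 0.0
            for j in range(len(pos)):
                if i == j:
                    continue
                dx = pos[i][0] - pos[j][0]
                dy = pos[i][1] - pos[j][1]
                d2 = dx * dx + dy * dy + 1e-6  # avoid division by zero
                f = repel * charges[i] * charges[j] / d2
                fx += f * dx
                fy += f * dy
            # spring back toward the anchor (the true epicenter)
            fx += anchor_pull * (anchors[i][0] - pos[i][0])
            fy += anchor_pull * (anchors[i][1] - pos[i][1])
            pos[i][0] += fx
            pos[i][1] += fy
    return pos

# Two nearly coincident 6.0M labels get pushed apart:
pos = spread_labels([[0.0, 0.0], [0.1, 0.0]], [6.0, 6.0])
print(pos)
```

The side effect mentioned above falls directly out of this: the labels settle where repulsion balances the anchor spring, which is readable but no longer exactly over the epicenter.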
The textures are from NASA's Blue Marble collection. visibleearth.nasa.gov/view_set.php?categoryID=2363
Fragmentarium (my GLSL-based Pixel Bender clone) is progressing nicely.
Now with user sliders for vectors and color choosers. I have also implemented a very simple GLSL raytracer for Distance Estimator based fractals: only ambient occlusion (based on the number of ray steps), no Phong lighting, no anti-aliasing - but it is very fast.
The above is a Menger DE.
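The step-count AO trick works because rays that graze a lot of nearby geometry need many distance-estimator steps before hitting, so the step count itself can darken the pixel. A minimal CPU sketch of sphere tracing with that approximation (Python, using a plain sphere distance estimator rather than a Menger one):

```python
import math

def sphere_de(p, radius=1.0):
    """Distance estimator: exact distance from point p to a sphere at the origin."""
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) - radius

def trace(origin, direction, max_steps=64, eps=1e-4, max_dist=20.0):
    """Sphere-trace along the ray: step forward by the estimated distance.
    Returns (hit, ao), where ao is derived purely from the step count -
    more steps means tighter geometry, hence a darker pixel."""
    t = 0.0
    for step in range(max_steps):
        p = [origin[i] + direction[i] * t for i in range(3)]
        d = sphere_de(p)
        if d < eps:
            return True, 1.0 - step / max_steps
        t += d
        if t > max_dist:
            break
    return False, 1.0

hit, ao = trace([0.0, 0.0, -3.0], [0.0, 0.0, 1.0])
print(hit, ao)
```

This is why the method is so fast: there is no shadow-ray or hemisphere sampling, just a byproduct of the marching loop that was going to run anyway.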
Using diffuse, bump, and normal maps from FilterForge is a great way to learn more about the nuances of GLSL shaders. In this example, which I hope to post a video of shortly, I am using the live webcam feed as a reflectivity map on this mass of oil and foam. The geometry of the project consists of a single quad; all of the visuals are done in a single frag shader, which runs fullscreen (with audio reactivity as well) in realtime.
Video on Vimeo: www.vimeo.com/7874474
Added rivers to the normal and elevation maps (from naturalearth.springercarto.com/ne3_data/8192/textures/2_n... ) because they were missing as details in the NASA Blue Marble textures (understandably...tree canopies would hide most of the Amazon river from the passing satellites).
Realtime shader test in the studio, projected onto one of Vishal's sculptures; experimenting with mapping to organic form.
This GLSL uses video buffer feedback to approximate an iterated function system.
Interactive: glslsandbox.com/e#13785.0
This one's finicky: not all systems support video buffer feedback, so it may not work in your browser, and it definitely won't without a GPU.
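The feedback-buffer trick approximates an IFS because each frame redraws the previous frame through a set of contractive maps, so the buffer converges to the system's attractor. The same convergence can be shown on the CPU with the classic chaos-game iteration (Python; the two affine maps below are arbitrary stand-ins, not the ones in the shader):

```python
import random

def chaos_game(maps, steps=10000, seed=1):
    """Iterate a randomly chosen affine map (a, b, c, d, e, f) each step:
    (x, y) -> (a*x + b*y + e, c*x + d*y + f). After a short burn-in the
    points lie on the IFS attractor - what the feedback buffer converges to."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for i in range(steps):
        a, b, c, d, e, f = rng.choice(maps)
        x, y = a * x + b * y + e, c * x + d * y + f
        if i > 100:  # skip burn-in before the orbit reaches the attractor
            points.append((x, y))
    return points

# Two contractive maps whose attractor stays inside the unit square:
maps = [(0.5, 0, 0, 0.5, 0.0, 0.0), (0.5, 0, 0, 0.5, 0.5, 0.5)]
pts = chaos_game(maps)
```

In the shader version, drawing the whole previous frame through every map at once replaces the random choice: the buffer applies all maps in parallel each frame.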
Testing performance after some updates to my deferred renderer. Ugly as sin and no way practical to have that many lights :) but hits a stable 60fps with 12,000 dynamic moving lights at 1920x1280 resolution. Big performance gains were from minimizing OpenGL draw calls, shader switching, and binding/unbinding of VBOs + textures - by keeping these bound when possible and making sure that all objects sharing the same VBO, texture, and shader are rendered together. Pretty obvious stuff now, but it's easy to be careless ;)
Definitely becoming a profiler junkie - CodeXL is great on Linux (and Windows) and OpenGL Profiler from Apple does the trick on OS X.
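The batching described above amounts to sorting the render queue by GPU state, so each shader/texture/VBO combination is bound once per group instead of once per object. A schematic sketch of that idea (Python stand-in for the C++ renderer; the dictionary fields are illustrative):

```python
def count_state_changes(draws):
    """Count how many times the GPU state (shader, texture, VBO) must
    change when the draw list is submitted in order."""
    changes, prev = 0, None
    for d in draws:
        key = (d["shader"], d["texture"], d["vbo"])
        if key != prev:
            changes += 1
            prev = key
    return changes

def sort_by_state(draws):
    """Group draws sharing the same shader/texture/VBO so they render
    back-to-back, minimizing binds and shader switches."""
    return sorted(draws, key=lambda d: (d["shader"], d["texture"], d["vbo"]))

# Interleaved draws force a state change on nearly every object:
draws = [{"shader": s, "texture": t, "vbo": 0}
         for s in ("phong", "flat") for t in ("rock", "grass")] * 3
print(count_state_changes(draws), "->", count_state_changes(sort_by_state(draws)))
```

A real renderer would also weigh this against depth sorting and transparency order, but the win reported above (stable 60fps with thousands of lights) comes largely from this kind of grouping.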