This project was inspired by the work of an incredibly talented community of artists and designers who are using video mapping as a medium to reinterpret and transform banal, expected environments. The work of Pablo Valbuena was a strong influence on our explorations, and we sought to introduce dynamic interactivity to augmented sculpture as our novel addition to this community.
Developed in C++ with openFrameworks and OpenNI, the piece uses the depth-mapping capabilities of the Kinect to track the viewer's (participant's) hand and position it as the light source of the physical model. In effect, their hand becomes the sun, lighting or dimming our abstracted cityscape.
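As a rough illustration of the depth-based tracking (not the project's actual OpenNI code), here is a minimal sketch that treats the nearest valid pixel in a depth map as the "hand" and returns it for use as a light position:

```cpp
#include <cassert>
#include <limits>
#include <vector>

// Hypothetical sketch: given a depth map (millimetres per pixel, 0 = no
// reading), find the nearest valid pixel -- a crude stand-in for the
// tracked hand -- and return its (x, y, depth) as the light position.
struct LightPos { int x; int y; int depth; };

LightPos nearestPoint(const std::vector<int>& depth, int w, int h) {
    LightPos best{-1, -1, std::numeric_limits<int>::max()};
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int d = depth[y * w + x];
            if (d > 0 && d < best.depth) best = {x, y, d};
        }
    return best;
}
```

A real hand tracker would of course use OpenNI's skeleton/hand events rather than a raw nearest-pixel scan, but the mapping from "closest thing to the sensor" to "light position" is the same idea.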
Some screen grabs from my latest interactive installation.
Made for "The New Sublime" exhibition at Clearleft during the Brighton Digital Festival.
More info here: www.clearleft.com/does/art
Tile frustum light culling using an OpenCL kernel that divides the screen up into 32x32 pixel tiles. For each tile, the kernel then finds the minimum and maximum Z depth of the pixels in that tile through a reduction.
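A CPU sketch of what that per-tile reduction computes (the real version is an OpenCL work-group reduction; the 32-pixel tile size matches the post):

```cpp
#include <algorithm>
#include <cassert>
#include <cfloat>
#include <vector>

const int TILE = 32;

struct MinMax { float zmin; float zmax; };

// Per-tile min/max depth, computed serially for clarity. The OpenCL
// kernel produces the same result via a parallel reduction per tile.
std::vector<MinMax> tileMinMax(const std::vector<float>& depth, int w, int h) {
    int tx = (w + TILE - 1) / TILE;
    int ty = (h + TILE - 1) / TILE;
    std::vector<MinMax> out(tx * ty, MinMax{FLT_MAX, -FLT_MAX});
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            MinMax& m = out[(y / TILE) * tx + (x / TILE)];
            float z = depth[y * w + x];
            m.zmin = std::min(m.zmin, z);
            m.zmax = std::max(m.zmax, z);
        }
    return out;
}
```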
With this information, a bounding frustum is created for each tile and each point light's position + attenuation (affected area) in view space is then culled against this frustum.
The idea is that you can then get a list of all the lights that are affecting each tile, cutting down on the number of fragments you need to process for shading. In the lighting shader, you figure out which tile the frag coords fall under and then shade using the lights listed in that tile's light index buffer.
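A simplified sketch of building one tile's light index list. For brevity it shows only the depth-range rejection against the tile's [zmin, zmax]; the real kernel also culls against the four side planes of the tile frustum:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// View-space point light: depth along the view axis plus its
// attenuation radius (the affected area).
struct Light { float z; float radius; };

// Keep a light if its sphere overlaps the tile's [zmin, zmax] depth
// range, and record its index -- this is the per-tile light index
// buffer the lighting shader later reads.
std::vector<int> cullLightsForTile(const std::vector<Light>& lights,
                                   float zmin, float zmax) {
    std::vector<int> indices;
    for (std::size_t i = 0; i < lights.size(); ++i) {
        const Light& L = lights[i];
        if (L.z + L.radius >= zmin && L.z - L.radius <= zmax)
            indices.push_back((int)i);
    }
    return indices;
}
```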
This can be used in deferred as well as forward rendering (and is extended in Forward+ rendering). Looking forward to trying out light-indexed forward rendering... I'm a little sick of the lack of transparency and the material challenges of fully deferred ;)
I'm currently using two separate kernels for the min/max reduction and the light culling - combining them is the next step for speed.
Debug view of the buffers used for light culling: the final render, the linear depth buffer, the min Z for each tile (max Z is also calculated but not displayed in this debug view), and a light "heat map" showing the number of lights affecting each tile (a visual idea of how much processing each tile will need; not actually used for rendering).
Swipin' Safari begins with the swipe, the ubiquitous one-finger gesture of the mobile device. For this iPhone app, this swipe is the vehicle for taking a virtual safari across an infinite painting. With each gesture, we encounter new species of brush strokes and colorful patterns from the digital world.
Swipin' Safari was created for Safari, an online exhibition for SPAMM (Super Art Moderne Museum): safari.spamm.fr
The app is available for free in the iTunes Store: itunes.apple.com/us/app/swipin-safari/id635434195?ls=1&am...
Final tweaks on software piece for CLOUDS documentary by James George and Jonathan Minard. Random output, not post-processed.
Realtime video-distortion openFrameworks app for Champagne Valentine. It was projected on the wall of an old sauna for a fashion event.
Installing "Light Leaks" at La Gaîté Lyrique for the Capitaine Futur show. gaite-lyrique.net/en/exposition/capitaine-futur-and-the-e...
Captures of the software I'm using live with Murcof. Most of the content is procedural and/or audio-reactive.
Still needs some work, but it could end up being fun. It looks quite different running at a low FPS due to the screen-recording overhead; I'll try it on my Mac Pro again at some point.
Physics are done using Bullet3D.
This body of work centers on bombshells from the '50s, '60s, and '70s and is heavily influenced by modern graffiti, marbleized paper patterns, and Chicago artist Ed Paschke. The work is created with a software particle-painting engine in C++ and OpenGL that works much like spray-painting graffiti, except that I use my fingers on a tablet instead.
Limited prints are available, contact me if interested.
More at www.donrelyea.com
Prefalll 135 is an interactive audio-visual installation.
It uses the energy of falling water to make watermills rotate and produce sound and graphics.
By opening and closing the taps, the user controls the water circuit and defines the parameters of the audiovisual system.
Visuals :: openFrameworks + MSAFluids
Sound :: Pure Data
Physical interaction :: photoreflector IR encoder + Arduino
By: Rodrigo Carvalho, Katerina Antonoupoulou, Javier Chavarri
video :: vimeo.com/40450746
photos by Paulo Pinto
Fabra i Coats, Barcelona, April 13th 2012
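A hedged sketch of the "photoreflector IR encoder + Arduino" part: converting the interval between encoder pulses into a rotation speed, assuming a hypothetical encoder disc with a given number of slots per revolution (the actual installation's mapping may differ):

```cpp
#include <cassert>

// Turn the measured interval between IR photoreflector pulses into a
// rotation speed in revolutions per second -- the kind of value the
// Arduino would stream to the sound and visual patches.
// `slots` = marks on the (hypothetical) encoder disc per revolution.
float revsPerSecond(float pulseIntervalMs, int slots) {
    if (pulseIntervalMs <= 0.0f || slots <= 0) return 0.0f;  // no pulses yet
    float pulsesPerSecond = 1000.0f / pulseIntervalMs;
    return pulsesPerSecond / slots;
}
```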
Playing with contour maps: using multiple thresholds over a Perlin noise image, à la julapy.
www.julapy.com/blog/2011/02/24/powerhouse-ecologic-exhibi...
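The multiple-thresholds idea can be sketched as banding a scalar field into discrete levels, where each band boundary reads as a contour line (in openFrameworks the field values would typically come from ofNoise()):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Quantize a value in [0, 1] into one of `levels` bands -- each band
// edge is one of the "multiple thresholds" and shows up as a contour.
int band(float v, int levels) {
    int b = (int)(v * levels);
    return b >= levels ? levels - 1 : b;  // clamp v == 1.0 into the top band
}

// Apply the banding to a whole field of noise values.
std::vector<int> contourBands(const std::vector<float>& field, int levels) {
    std::vector<int> out(field.size());
    for (std::size_t i = 0; i < field.size(); ++i)
        out[i] = band(field[i], levels);
    return out;
}
```

Drawing only the pixels where the band index changes between neighbours gives the contour lines themselves.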
Project 3D points to screen space, use each point's index divided by the number of points as its HSB hue, and build a Voronoi diagram.
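A minimal sketch of that pipeline, assuming a simple pinhole projection with focal length f and points in front of the camera; the Voronoi construction itself (usually handed to a library) is omitted:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Pt2  { float x, y, hue; };  // projected position + HSB hue in [0, 1)

// Project each 3D point with a pinhole model (camera at the origin,
// looking down +z) and assign hue = index / count, so the Voronoi cells
// later sweep through the hue circle in point order.
std::vector<Pt2> projectWithHue(const std::vector<Vec3>& pts, float f) {
    std::vector<Pt2> out;
    out.reserve(pts.size());
    for (std::size_t i = 0; i < pts.size(); ++i) {
        const Vec3& p = pts[i];
        float s = f / p.z;  // assumes p.z > 0 (in front of the camera)
        out.push_back({p.x * s, p.y * s, (float)i / pts.size()});
    }
    return out;
}
```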
Basically I want to use shaders with openFrameworks on iOS, which you can't do when rendering with OpenGL ES 1.1, as we do now.
So far this has been relatively low-hanging fruit thanks to code.google.com/p/gles2-bc/ and the awesome swappable-renderer system in OF 0.07.
For now I'm just focusing on getting the standard OF drawing stuff working.
Lives here for now: github.com/andreasmuller/openFrameworks/tree/develop-open...
Tests controlling a BlackBody iRain luster of 504 OLED panels with a Kinect through 126 DMX controllers.
Developed with openFrameworks (C++).
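One plausible reading of those numbers - 504 panels over 126 controllers - is 4 panels per controller. A hypothetical addressing sketch (the installation's actual wiring and channel layout may well differ):

```cpp
#include <cassert>

// Map a panel index (0..503) to a controller and a 1-based channel,
// assuming the 504 panels are spread evenly over the 126 DMX
// controllers (504 / 126 = 4 panels each). Purely illustrative.
struct DmxAddr { int controller; int channel; };

DmxAddr panelToDmx(int panel) {
    const int perController = 504 / 126;  // = 4
    return {panel / perController, panel % perController + 1};
}
```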