All Photos Tagged openframeworks

Another fun experiment made by Oriol.

uri.cat

 

Uploaded With FlickrDrop

formfallstudien_teil1


upcoming examples from openFrameworks 0071


Final tweaks on software piece for CLOUDS documentary by James George and Jonathan Minard. Random output, not post-processed.

programmed in openFrameworks, 2013


Some screen grabs from my latest interactive installation.

 

Made for "The New Sublime" exhibition at Clearleft during the Brighton Digital Festival.

 

More info here: www.clearleft.com/does/art

Absolut Inn — Absolut Transformer


Installing "Light Leaks" at La Gaîté Lyrique for the Capitaine Futur show. gaite-lyrique.net/en/exposition/capitaine-futur-and-the-e...

Workshop with Daito Manabe at FITC 2013, tracking hand movements with basic computer vision in openFrameworks and using them to actuate hand gestures with a custom electrostim box.

Selected frames of a new video-art piece submitted to Digital Graffiti 2015, using an updated particle-painting engine created in openFrameworks.


Inspired by photos like this one from a lot of sports publications, and by old-school Edgerton multiflash photography.

www.flickr.com/photos/joseph_gurney/3504398524/in/pool-se...

Is there some sort of video effect that does this in real time, or after the fact on a shot? Is this available already, and am I wasting my time?

 

Trying out a couple of simple filters with the openFrameworks video grabber. Hoping to transfer them to movies, to avoid the problems with the iSight camera auto-balancing itself quite frequently.
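A filter of the kind mentioned can be sketched in plain C++ on a raw RGB buffer, independent of the video grabber itself. The function names are illustrative, and the grayscale/threshold pair is just one assumed choice of "simple filters":

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Convert an interleaved 8-bit RGB frame to grayscale (simple average).
std::vector<std::uint8_t> toGrayscale(const std::vector<std::uint8_t>& rgb) {
    std::vector<std::uint8_t> gray(rgb.size() / 3);
    for (std::size_t i = 0; i < gray.size(); ++i) {
        int sum = rgb[3 * i] + rgb[3 * i + 1] + rgb[3 * i + 2];
        gray[i] = static_cast<std::uint8_t>(sum / 3);
    }
    return gray;
}

// Hard threshold: pixels at or above `cutoff` become white, the rest black.
std::vector<std::uint8_t> threshold(const std::vector<std::uint8_t>& gray,
                                    std::uint8_t cutoff) {
    std::vector<std::uint8_t> out(gray.size());
    std::transform(gray.begin(), gray.end(), out.begin(),
                   [cutoff](std::uint8_t v) {
                       return std::uint8_t(v >= cutoff ? 255 : 0);
                   });
    return out;
}
```

In openFrameworks the same loops would run once per frame over the pixel buffer returned by `ofVideoGrabber`.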

Unfortunately there is no documentation for this project. So, in a few words ...

 

3D Scenes can be loaded in the application and animated in realtime. The content was projected on long layers of transparent material in an old beer brewery in Vienna.

 

This is my first collaboration with Yannick Jacquet (Antivj / Legoman).

 

openFrameworks + GLSL geometry shaders.

 

www.antivj.com

www.kinesis.be

legoman.crea-composite.net/

 

Well, not quite: this is one of the things I was trying out in isolation, part of the control software we used for www.youtube.com/watch?v=zwqKy07OhXI. (More documentation coming soon.)

 

Basically it's a way to project a piece of video onto an arbitrary shape of lights.

 

For this vimeo.com/16762766 I hardcoded four rectangles around the model of the house, but in this new system I represent the surface in 3D space, use that as a view, and then get the lights' positions within the 2D surface by looking them up in an orthographic projection.

 

Then, once you have the 2D positions of the lights within the surface, it's easy to look up what brightness they should be, depending on the video playing or whatever it is you are drawing into that surface.

 

The view in the screenshot is looking straight down onto an almost flat set of lights but of course the real benefit comes when the shape of the lights is a bit more irregular.
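The projection-then-lookup idea above can be sketched in a few lines of standalone C++. This is a minimal sketch assuming an axis-aligned rectangular surface and nearest-neighbour sampling; all names are illustrative, not the actual control software:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Orthographically project a light's 3D position into the 2D coordinate
// system of a rectangular surface spanning [minX,maxX] x [minY,maxY],
// returning normalized (u, v) in [0,1]. Depth (z) is simply discarded,
// which is exactly what an orthographic view does.
Vec2 projectToSurface(const Vec3& p, float minX, float maxX,
                      float minY, float maxY) {
    return { (p.x - minX) / (maxX - minX),
             (p.y - minY) / (maxY - minY) };
}

// Once a light has a (u, v) inside the surface, its brightness is a plain
// lookup into whatever is drawn there; a small float grid stands in for a
// video frame here (nearest-neighbour sampling).
float lookupBrightness(const std::vector<std::vector<float>>& frame, Vec2 uv) {
    int row = static_cast<int>(uv.v * (frame.size() - 1) + 0.5f);
    int col = static_cast<int>(uv.u * (frame[0].size() - 1) + 0.5f);
    return frame[row][col];
}
```

With an irregular 3D arrangement of lights, the same lookup still works: the orthographic projection flattens the fixtures onto the surface regardless of their depth.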


This project was inspired by the work of an incredibly talented community of artists and designers that are using video mapping as a medium to reinterpret and transform banal, expected environments. The work of Pablo Valbuena was a strong influence over our explorations, and we sought to introduce dynamic interactivity to augmented sculpture as our novel addition to this community.

 

The piece is developed in C++ with openFrameworks and OpenNI; we use the depth-mapping capabilities of the Kinect to track the viewer's (participant's) hand and position it as the light source of the physical model. In effect, their hand becomes the sun, lighting or dimming our abstracted cityscape.
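At its core, "hand as light source" is standard Lambertian shading with a moving light position. A minimal sketch under that assumption (all names illustrative; this is not the installation's actual code):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Lambertian ("N dot L") intensity for a surface point lit from the hand
// position reported by the depth camera; moving the hand moves the light.
float lambert(Vec3 surfacePoint, Vec3 surfaceNormal, Vec3 handPos) {
    Vec3 toLight = normalize(sub(handPos, surfacePoint));
    float ndotl = dot(normalize(surfaceNormal), toLight);
    return ndotl > 0.0f ? ndotl : 0.0f;  // back-facing points stay dark
}
```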

gaudy lighting has really been helping me recently


Tile frustum light culling using an OpenCL kernel that divides the screen up into 32x32 pixel tiles. For each tile, the kernel then finds the minimum and maximum Z depth of the pixels in that tile through a reduction.

 

With this information, a bounding frustum is created for each tile and each point light's position + attenuation (affected area) in view space is then culled against this frustum.

 

The idea is that you can then get a list of all the lights that are affecting each tile, cutting down on the number of fragments you need to process for shading. In the lighting shader, you figure out which tile the frag coords fall under and then shade using the lights listed in that tile's light index buffer.

 

This can be used in both deferred rendering as well as forward rendering (and extended in Forward+ rendering). Looking forward to trying out light indexed forward rendering... I'm a little sick of no transparency and material challenges with full deferred ;)

 

I'm currently using 2 separate kernels for min/max reduction and light culling - need to combine these as a next step for speed.

 

Debug view of the buffers used for light culling: final render; linear depth buffer; min Z for each tile (max Z is also calculated, but not displayed in this debug view); and a light "heat map" showing the number of lights affecting each tile (a visual indication of how much processing each tile will need to perform; not actually used for rendering).
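The per-tile min/max reduction and the depth part of the culling test can be sketched serially in plain C++. This is a simplified stand-in for the OpenCL kernels described above: it omits the tile frustum's four side planes, and all names are illustrative:

```cpp
#include <algorithm>
#include <limits>
#include <vector>

struct TileDepth { float minZ, maxZ; };

// Min/max Z reduction over one tile of a linear depth buffer (row-major,
// `width` pixels per row). The OpenCL kernel does this as a parallel
// reduction in local memory; this serial loop produces the same result.
TileDepth tileMinMax(const std::vector<float>& depth, int width,
                     int tileX, int tileY, int tileSize = 32) {
    TileDepth t { std::numeric_limits<float>::max(),
                  std::numeric_limits<float>::lowest() };
    for (int y = tileY * tileSize; y < (tileY + 1) * tileSize; ++y)
        for (int x = tileX * tileSize; x < (tileX + 1) * tileSize; ++x) {
            float z = depth[y * width + x];
            t.minZ = std::min(t.minZ, z);
            t.maxZ = std::max(t.maxZ, z);
        }
    return t;
}

// Depth-range half of the per-tile culling test: a point light at
// view-space depth `lightZ` with attenuation radius `radius` can only
// touch the tile if its Z extent overlaps [minZ, maxZ]. A full version
// would also cull the light against the tile frustum's side planes.
bool lightOverlapsTile(const TileDepth& t, float lightZ, float radius) {
    return lightZ + radius >= t.minZ && lightZ - radius <= t.maxZ;
}
```

Lights that pass the test for a tile go into that tile's light index buffer, which the lighting shader then reads for the fragments falling in the tile.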


Swipin' Safari begins with the swipe, the ubiquitous one-finger gesture of the mobile device. For this iPhone app, the swipe is the vehicle for taking a virtual safari across an infinite painting. With each gesture, we encounter new species of brush strokes and colorful patterns from the digital world.

 

Swipin' Safari was created for Safari, an online exhibition for SPAMM (Super Art Moderne Museum): safari.spamm.fr

 

The app is available for free in the iTunes Store: itunes.apple.com/us/app/swipin-safari/id635434195?ls=1&am...


Realtime video-distortion openFrameworks app for Champagne Valentine. It was projected on the wall of an old sauna for a fashion event.

#WIP #Generative #RealTime #Graphics #Openframeworks #Interactive
