I'm working on a light installation with a friend of mine, so I'm evaluating the Pixelline RGB LED tubes.
It's a perfect opportunity to add colour support and do some refactoring of the lighting control software (it still needs a name) that I've been working on; previously it only dealt with dimmers.
On the to-do list is Art-Net support, as I can see that working with RGB LEDs will gobble up DMX channels pretty quickly, but that will have to wait until a project requires it, since I'll need to buy some gear. So far 512 channels is enough, and I can always run a second Enttec USB DMX adapter if I need to get something running quickly.
Another thing to check out is whether I can use PBOs to speed up the glReadPixels calls on the FBOs that make up the small canvases used to map pixels to lights.
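For reference, the usual way PBOs speed this up is the double-buffered readback pattern: start an asynchronous glReadPixels into one PBO while mapping the other, which holds the previous frame's pixels. A minimal sketch of the ping-pong bookkeeping, with the GL calls shown as comments (names here are illustrative, not from the actual app):

```cpp
#include <cassert>

// Double-buffered PBO readback: frame N's glReadPixels streams into
// pbo[write] asynchronously, while the CPU maps pbo[read], which already
// contains frame N-1's data, so neither side stalls on the transfer.
struct PboPingPong {
    int write = 0; // PBO receiving this frame's glReadPixels
    int read  = 1; // PBO being mapped on the CPU (last frame's data)
    void swap() { int t = write; write = read; read = t; }
};

// Per frame, roughly:
//   glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[pp.write]);
//   glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, 0); // async DMA
//   glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[pp.read]);
//   void* px = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY); // prev frame
//   ... copy pixel colours out to the DMX universe ...
//   glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
//   pp.swap();
```

The trade-off is one frame of latency on the pixel data, which is usually invisible on slow-reacting fixtures like LED tubes.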
Strangely enough, when I tried to record this on Windows, Fraps would intercept the glReadPixels call and I would end up with no colour data to output to the lights; an odd gotcha.
I also really need to sit down and work on the layout of the on-screen information at some point; right now it's all just thrown wherever it would fit.
Unfortunately, QuickTime screen capture did not grab the sound.
A buffer overflow ran into a text string being printed to screen and changed the length of the text. Whoops.
dali en.wikipedia.org/wiki/Crucifixion_(Corpus_Hypercubus)
françoise gamma videogramo.8bitpeoples.com/
marius watz mariuswatz.com/2010/03/01/kbg/
quayola vimeo.com/11777813
This one is available in a larger size: secure.flickr.com/photos/kylemcdonald/6676160665/sizes/o/...
Swiping is an audio/visual animation composed using swipe gestures made on an iPad: a contemporary Abstract Expressionist painting that reflects on the gesture in the digital age. Each new swipe generates a colorful brush-like form that dynamically expands in three dimensions, accompanied by synthetic sound. Here, the gesture is not the expressive act that Abstract Expressionist painters such as Jackson Pollock were known for; instead it is a physical command to explore an infinite flow of digital information.
To create Swiping, thousands of gestures were recorded on an iPad and then animated using custom software. Sound by Chris Carlson (@modulationindex). Made with OpenFrameworks.
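As a rough illustration of turning a recorded swipe into a brush-like form (this is a hypothetical sketch, not the artist's actual code), each path point can be offset along the path's normal by a width that grows over the gesture, producing triangle-strip vertices for a stroke that expands as it goes:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Pt { float x, y; };

// Hypothetical sketch: convert a recorded swipe path into triangle-strip
// vertices for a "brush stroke" whose width ramps up along the gesture.
std::vector<Pt> strokeFromSwipe(const std::vector<Pt>& path, float maxWidth) {
    std::vector<Pt> strip;
    for (size_t i = 0; i + 1 < path.size(); ++i) {
        float dx = path[i + 1].x - path[i].x;
        float dy = path[i + 1].y - path[i].y;
        float len = std::sqrt(dx * dx + dy * dy);
        if (len == 0) continue;
        // Unit normal to the direction of travel.
        float nx = -dy / len, ny = dx / len;
        // Width grows from 0 at the start of the swipe.
        float w = maxWidth * float(i) / float(path.size() - 1);
        // One vertex either side of the path point.
        strip.push_back({path[i].x + nx * w, path[i].y + ny * w});
        strip.push_back({path[i].x - nx * w, path[i].y - ny * w});
    }
    return strip; // two vertices per segment, drawable as a triangle strip
}
```

The real piece animates these forms in 3D with synthesized sound, but the core idea of deriving geometry from the gesture trace is the same.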
Screenshot of the OpenFrameworks program which runs the "Double-Taker (Snout)" robot. Subscreens are (from top left): (1) masked input; (2) background (via running median); (3) depthmap, derived from binocular disparity (via PointGrey Triclops SDK); (4) presence (via background subtraction); (5) motion (via frame-differencing); (6) "activity" (via blurred feedback buffer of the weighted sum of presence and motion). Below left: larger image of live camera input. Below right: "stick figure" animation-skeleton of the simulated robot.
The system tracks the two most significant moving things in the camera's view. (Two seemed like enough for such a slow robot.) When a new person appears, they are given priority, so the robot will generally turn abruptly-ish to look at people who enter the scene.
The weird mask in the image processing reduces distractions from non-critical areas of the camera's view. For example, it's important that the robot does not see itself; otherwise it would always be the most significant moving thing, causing a feedback loop.
A screenshot from a wacky little app I made in 30 hours at a Kinect Hackathon hosted by Microsoft. More info and video here - jamesalliban.wordpress.com/2016/01/06/totem
Visualizing all the issues on openFrameworks. Time moves from left to right, each line is an issue, and each added comment nudges the line downward a bit. The y axis is the issue number, so if the overall trend is steeply downward, lots of issues are being created; if the trend is mostly rightward, issues are being created slowly. Hue corresponds to how long the issue took to close.
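The mapping just described boils down to two small functions; this is a sketch of that layout math with assumed constants (the nudge size and hue scale are guesses, not the values used in the image):

```cpp
#include <cassert>

// y position of an issue's line: starts at the issue number, and each
// comment nudges it down a bit (screen y grows downward). The nudge
// amount is an assumed constant.
float issueY(int issueNumber, int commentsSoFar, float nudge = 0.25f) {
    return float(issueNumber) + float(commentsSoFar) * nudge;
}

// Hue from time-to-close: normalize days open into [0, 1], saturating
// at an assumed maximum of one year.
float closeHue(float daysOpen, float maxDays = 365.0f) {
    float h = daysOpen / maxDays;
    return h > 1.0f ? 1.0f : h;
}
```

With this mapping, a burst of new issues stacks fresh lines ever lower on the chart, while a heavily discussed issue drifts downward from its starting row as comments accumulate.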
Experimenting with visualising streamlines inside a vector field.
Rendered as a 1 bpp (black and white only) GIF animation below; give it a tick to load.
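The basic technique for tracing a streamline is simple: repeatedly sample the field at the current point and take a small step along the sampled vector (forward Euler integration). A minimal sketch, using an illustrative rotational field rather than the one in the image:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Example field: circular flow around the origin. Any function from
// position to direction works here.
Vec2 field(Vec2 p) { return {-p.y, p.x}; }

// Trace one streamline with forward Euler steps of size h: at each step,
// sample the field at the current point and move h along it.
std::vector<Vec2> streamline(Vec2 start, int steps, float h) {
    std::vector<Vec2> pts{start};
    Vec2 p = start;
    for (int i = 0; i < steps; ++i) {
        Vec2 v = field(p);
        p = {p.x + h * v.x, p.y + h * v.y};
        pts.push_back(p);
    }
    return pts;
}
```

Drawing each returned polyline (e.g. with an ofPolyline) from many seed points gives the streamline picture; smaller step sizes, or a higher-order integrator like RK4, keep long lines from drifting off the true flow.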
Low-poly tape art with a layer of audio-reactive projection mapping over the top. Made with openFrameworks.
The idea is to design and 3D-print objects, then use projection mapping to enhance the printed object by adding ‘fake’ textures and materials, simulated lighting, animations, etc.
For now it’s very basic: I’ve just designed a low-poly vase in Blender and printed it in white ABS:
www.flickr.com/photos/kikko_fr/8192035716/in/photostream
Then I used a customized version of mapamok (github.com/YCAMInterlab/ProCamToolkit/wiki/mapamok-(English)), where I can load my 3D model directly, and an 80-lumen pico projector to make this quick setup.