All posts in Interaction
We strive to be a platform-agnostic firm, taking advantage of whatever tool best fits the job. Even so, our two most common tools are […]
We are not only designers but also stewards of our office, so we want to monitor the environmental variables of our workspace and log them to a database to track changes over time. To do this, we're using an Arduino and a host of sensors, mostly from SparkFun, to build office occupancy sensor pods.
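The database side of those pods can be quite light. Here's a minimal sketch using SQLite, assuming the Arduino streams comma-separated `name:value` pairs over serial; the field names and schema are placeholders for illustration, not what the pods actually send:

```python
import sqlite3
import time

def parse_reading(line):
    # Parse a hypothetical serial line like "temp:22.5,humidity:41.0,light:512"
    # into a dict of sensor name -> float value.
    return {k: float(v) for k, v in (pair.split(":") for pair in line.split(","))}

def log_reading(conn, reading, timestamp=None):
    # Store each sensor value as its own row (long format), so adding a new
    # sensor to a pod never requires a schema change.
    ts = timestamp if timestamp is not None else time.time()
    conn.executemany(
        "INSERT INTO readings (logged_at, sensor, value) VALUES (?, ?, ?)",
        [(ts, name, value) for name, value in reading.items()],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")  # a file path in real use
conn.execute("CREATE TABLE readings (logged_at REAL, sensor TEXT, value REAL)")
log_reading(conn, parse_reading("temp:22.5,humidity:41.0,light:512"))
```

In a real pod, the line would come from a serial read loop (e.g. via pySerial) instead of a literal string.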
This post serves as a critique of existing modeling methods. We're interested in a method that combines the Gumball with Osnap for early massing designs. The Box Snap lets the user create models quickly and run simulations from within the Grasshopper canvas.
Lately we’ve been looking at the Arduino, an open-source microcontroller platform that enables interactive designs. As we delve further into open-source electronics, we’ll get the chance to create custom circuit boards. And while Fritzing is a great tool for this, laser cutting a circuit board in house would not only be awesome, it would also save time and money.
Here’s an early look at the tangible user interface with Grasshopper. In this video, we’re using the reacTIVision listener for Firefly (a plugin for Grasshopper). While the video shows one basic model, we can use hundreds of these tags (called fiducial markers) to represent building masses or program blocks. By plugging a physical model into Grasshopper, we gain access to a wide range of tools for simulation, visualization, and geometric generation. Each scheme can be saved with a simple toggle and studied further.
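Conceptually, each fiducial ID maps to a program block, and its normalized position on the table scales to site coordinates. A toy sketch of that mapping — the marker IDs, block names, and dimensions below are made up for illustration, not taken from the definition in the video:

```python
# Hypothetical assignment of fiducial marker IDs to program blocks.
MARKER_PROGRAM = {0: "lobby", 1: "gallery", 2: "office", 3: "theater"}

def place_blocks(marker_events, site_w=60.0, site_h=40.0):
    """Scale normalized (0-1) marker positions on the table to site coordinates.

    marker_events: iterable of (marker_id, x, y, angle) tuples, the kind of
    data a reacTIVision/TUIO listener reports for each tracked fiducial.
    """
    placements = []
    for marker_id, x, y, angle in marker_events:
        placements.append({
            "program": MARKER_PROGRAM.get(marker_id, "unassigned"),
            "x": x * site_w,
            "y": y * site_h,
            "rotation": angle,
        })
    return placements

blocks = place_blocks([(0, 0.5, 0.5, 0.0), (3, 0.25, 0.75, 1.57)])
```

From here, each placement could drive a box or extrusion in the Grasshopper definition, which is what makes the downstream simulation tools available.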
This post – part of a three-part series on the Kinect Multitouch Interaction – is a detailed technical walkthrough of the code we implemented to support touch from depth. The downloadable code this post refers to is in Part 2 of the series. This post serves as a foundational primer for the major code modules that build up the touch-from-depth interaction.
Over the past year, we’ve made the Microsoft Kinect sense touch – and, in turn, gestures – which we then used to control the Grasshopper canvas via keyboard and mouse events. We’ve had a lot of fun building Kinect Multitouch Interactions but – being an architecture firm – we can only spend so much time developing the code. We think we’ve built a solid foundation and would like to share it with the broader community to use, modify, and extend. Obviously, Grasshopper is only one possible application, and we’d love to see what others could do. In the spirit of openness, we’re providing the complete source.
In a previous post, we elaborated on how Wiimote interaction can make more screen real estate for the Grasshopper canvas both beneficial and usable. Since then, we have been toying around […]
Grasshopper needs more real estate. We’ve been addressing this problem by moving our Grasshopper definitions to a table-top display, tracking an IR LED light pen for interaction on the canvas. The canvas can take up the entire table-top while the linked geometry can be projected on a nearby wall. We’ve only begun setting up the equipment, but our early tests are promising.
We used 10 high-intensity LEDs to light a 1:24 scale model of a studio theater. The LEDs were wired to an Arduino microcontroller and programmed using the Firefly components for Grasshopper. All 10 LEDs can be individually controlled from a Grasshopper definition or wired together (in GH) to form banks of lights. The result is a computer-controlled miniature theatrical lighting system on the cheap.
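The "banks of lights" idea can be sketched as a mapping from per-bank brightness to per-LED PWM values. The bank names and channel assignments below are illustrative only; in practice, Firefly pushes values like these from the Grasshopper definition to the Arduino's PWM pins:

```python
# Hypothetical grouping of the 10 LED channels into named banks.
BANKS = {
    "house": [0, 1, 2, 3],
    "stage_warm": [4, 5, 6],
    "stage_cool": [7, 8, 9],
}

def bank_levels(bank_brightness, n_leds=10):
    """Expand per-bank brightness (0.0-1.0) into a per-LED PWM list (0-255).

    Channels not covered by any requested bank stay dark (0).
    """
    levels = [0] * n_leds
    for bank, brightness in bank_brightness.items():
        pwm = round(max(0.0, min(1.0, brightness)) * 255)
        for led in BANKS[bank]:
            levels[led] = pwm
    return levels

# A simple "house at quarter, warm stage wash at full" cue:
levels = bank_levels({"house": 0.25, "stage_warm": 1.0})
```

Keeping the cue as data like this makes it easy to fade between lighting states by interpolating the brightness values in the Grasshopper definition.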