I was playing around with the Vive lighthouse tracking system for VR and started thinking about hardware projects to use it for. One idea was to take an image and print it onto a wall: use the tracking system to sense where a spray paint can was, and trigger the can to spray based on the color of the virtual image at that location.
I built a simple rig with a solenoid and some 3D-printed and laser-cut parts to create a mechanism that could trigger the spray paint can. A BeagleBone single-board computer communicated over WiFi with the computer that was actually connected to the lighthouse tracking system and running the code that decided when to trigger the can.
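The BeagleBone side of this is only a few lines. Here's a minimal sketch of that kind of trigger listener in Python, assuming a UDP socket and a sysfs GPIO driving the solenoid; the port number, GPIO number, and one-byte wire format are all illustrative, not what my rig actually used:

```python
import socket

# Hypothetical sketch of the BeagleBone trigger listener: receive
# one-byte commands over WiFi and drive the solenoid through a sysfs
# GPIO. The GPIO number, port, and wire format are all assumptions.
SOLENOID_GPIO = "/sys/class/gpio/gpio60/value"  # assumed, pre-exported pin

def set_solenoid(on):
    with open(SOLENOID_GPIO, "w") as f:
        f.write("1" if on else "0")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))        # assumed port
while True:
    data, _ = sock.recvfrom(1)
    set_solenoid(data == b"1")      # b"1" = spray, anything else = stop
```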
It didn't work particularly well, for two reasons: the code didn't take into account the build-up of paint as it sprayed, and it didn't use the orientation of the can to decide which pixel in the image it was in front of. If I were to go back and work on this more, I would cast a ray from the nozzle along the can's forward axis to the virtual wall and sample the pixel at the hit point, instead of just projecting the can's position onto the wall along the wall's normal. I would also accumulate "paint" in a texture and fire the can based on the difference between that texture and the desired image, so that it would shut off after accumulating paint for a short time.
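Sketched in Python/NumPy, the ray cast and the accumulation check might look something like this; all the names (wall_origin, wall_u, rate, and so on) are illustrative rather than taken from the original code:

```python
import numpy as np

# Cast a ray from the nozzle along the can's forward axis, intersect
# it with the wall plane, and convert the hit point to pixel
# coordinates. wall_u and wall_v are unit vectors spanning the wall.
def ray_wall_pixel(nozzle_pos, can_forward, wall_origin, wall_normal,
                   wall_u, wall_v, pixels_per_meter):
    """Return the (x, y) pixel hit by the nozzle ray, or None on a miss."""
    denom = np.dot(can_forward, wall_normal)
    if abs(denom) < 1e-6:              # ray parallel to the wall
        return None
    t = np.dot(wall_origin - nozzle_pos, wall_normal) / denom
    if t < 0:                          # wall is behind the can
        return None
    hit = nozzle_pos + t * can_forward
    # Project the hit point onto the wall's in-plane axes.
    x = np.dot(hit - wall_origin, wall_u) * pixels_per_meter
    y = np.dot(hit - wall_origin, wall_v) * pixels_per_meter
    return int(x), int(y)

def should_fire(accumulated, target, x, y, rate, dt):
    """Accumulate paint and fire only while the pixel is underpainted."""
    h, w = accumulated.shape
    if not (0 <= x < w and 0 <= y < h):
        return False                   # ray hit outside the image
    if accumulated[y, x] >= target[y, x]:
        return False                   # enough paint here already
    accumulated[y, x] += rate * dt     # rough model of paint deposited
    return True
```

Firing on the difference means sweeping back over an already-painted area would be harmless: the can would simply stay off.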
There are also many optimizations one could make by taking into account the distance of the can from the wall and the resulting size of the spray spot. Ideally the spray would be modeled as a cone rather than a simple ray, and the can would not be allowed to fire when it's too far from the wall.
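As a rough sketch of that cone model, the spot radius grows linearly with distance and firing is gated on a maximum range; the half-angle, range, and deposition model below are guesses, not measured values:

```python
import numpy as np

# Treat the spray as a cone with a fixed half-angle, so the spot
# radius grows with distance, and refuse to fire beyond a maximum
# range. Both constants are assumed, not measured.
HALF_ANGLE = np.radians(10.0)   # assumed spray cone half-angle
MAX_RANGE = 0.5                 # assumed max can-to-wall distance, meters

def spray_footprint(distance):
    """Spot radius on the wall for a given nozzle-to-wall distance."""
    return distance * np.tan(HALF_ANGLE)

def stamp_cone(accumulated, x, y, distance, pixels_per_meter, rate, dt):
    """Deposit paint over the whole spot instead of a single pixel."""
    if distance > MAX_RANGE:
        return False                     # too far from the wall: don't fire
    r = int(spray_footprint(distance) * pixels_per_meter)
    h, w = accumulated.shape
    for py in range(max(0, y - r), min(h, y + r + 1)):
        for px in range(max(0, x - r), min(w, x + r + 1)):
            if (px - x) ** 2 + (py - y) ** 2 <= r * r:
                accumulated[py, px] += rate * dt
    return True
```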
The tool would also benefit from a UI showing the 3D scene from the perspective of the spray paint can. Without that kind of feedback it was hard to tell whether the can wasn't firing because the hardware wasn't working or just because it was over a blank portion of the image.
Done in 2017