I’m inspired by how quickly people can learn to meet their own needs. For a long time, via my team at AnyShare, I’ve been exploring how to make matching needs and resources as intuitive and efficient as possible. In this project, I explore how to do that in physical spaces.

I soldered some LED matrices (two different sizes, to test ergonomics), 3D-printed enclosures for them, connected them to a Raspberry Pi, and programmed the Pi as a controller for interacting with a flat-screen TV showing data from a Sharing Network on AnyShare.
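To give a sense of the controller side, here is a minimal sketch of driving one of the matrices from the Pi. It assumes a MAX7219-style 8x8 matrix wired over SPI and uses the luma.led_matrix library; the actual panels, wiring, and pattern logic in my build differ, so treat it as illustrative only.

```python
# Minimal sketch: light a simple pattern on a MAX7219 8x8 LED matrix from a Raspberry Pi.
# Assumes the matrix is wired to the Pi's SPI pins and that the
# luma.led_matrix library is installed (pip install luma.led_matrix).
from luma.core.interface.serial import spi, noop
from luma.core.render import canvas
from luma.led_matrix.device import max7219

serial = spi(port=0, device=0, gpio=noop())   # hardware SPI, no extra GPIO line needed
device = max7219(serial, cascaded=1)          # one 8x8 block; more can be daisy-chained

# Draw a simple "ready" pattern: a border around the matrix.
with canvas(device) as draw:
    draw.rectangle(device.bounding_box, outline="white")
```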

This bridges the gap between the physical and virtual network via a game-ish interface. The short-term vision is to place these in public and private spaces and learn whether participants can intuitively pick up certain color patterns and triggers to navigate the vast set of needs and resources they are looking to match.
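As a concrete, purely hypothetical illustration of what "color patterns as triggers" could look like in code, the controller can keep a small table mapping a lit pattern to a category of needs or resources that the TV display filters to. The pattern names and categories below are placeholders, not AnyShare's actual data model.

```python
# Hypothetical mapping from a color pattern shown on the matrix to a
# category of needs/resources; names here are placeholders, not AnyShare's API.
PATTERN_TO_CATEGORY = {
    "green_pulse": "offers:food",
    "blue_sweep": "requests:tools",
    "amber_blink": "offers:skills",
}

def handle_trigger(pattern: str) -> str:
    """Return the category the TV display should filter to for a given pattern."""
    return PATTERN_TO_CATEGORY.get(pattern, "all")

print(handle_trigger("blue_sweep"))  # -> "requests:tools"
```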

I’m guided by something I learned from genius inventor Buckminster Fuller. In his book “Critical Path,” he states that the highest goal in designing calibrated solutions is to introduce them to the environment invisibly and have them be immediately absorbed… no persuasive sales, no tutorials, and no delay.

This is an attempt to follow this high standard.


Here’s a quick video showing aspects of the interface being controlled via movement gestures.
