Project Journal

 

4/4/12

 

The Interactive Plant

 

I plan to use the Kinect to detect my voice and gestures in order to help sustain the life of a plant. Moisture sensors in the plant's soil will send a signal to my computer, and based on this signal, the computer will speak from the perspective of the plant when prompted, stating the plant's current condition. I will then respond with a gesture or a voice command, such as "water it," which the Kinect detects; the computer relays the command to the Arduino, which pours water into the plant's soil.
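
To make this signal path concrete, here is a rough Processing sketch of the sensor-to-computer link. It assumes the Arduino prints one soil-moisture reading (0-1023) per line over USB serial; the port index, baud rate, and thirst threshold are placeholder guesses to be tuned against the real sensor.

import processing.serial.*;

Serial arduino;
int moisture = 512;       // latest reading from the soil sensor
final int THIRSTY = 300;  // assumed threshold; tune for the actual sensor

void setup() {
  size(200, 200);
  // Assumes the Arduino shows up as the first serial port; adjust if not
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');  // call serialEvent() once per full line
}

void draw() {
  // Green while the soil is moist enough, brown once the plant is thirsty
  background(moisture > THIRSTY ? color(40, 160, 60) : color(140, 90, 40));
}

void serialEvent(Serial s) {
  String line = s.readStringUntil('\n');
  if (line != null) {
    moisture = int(trim(line));
  }
}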

 

Highest Risk:

I think there are many high risks, and given my current knowledge, I can't accurately rank them against each other. That said, I think the most difficult challenge will be implementing voice and gesture recognition with the Kinect in Processing. It may also be difficult to get the Arduino-driven motors to water the plant properly.

 

Plan of Action: 

I will overcome these challenges by finding help whenever I need it, using the internet and the people around me to learn these systems. I will start with a tutorial on Kinect hacking in Processing. As for the motors, I will learn by seeing and doing, and through trial and error.

 

For Next Class:

For the upcoming class, I will begin to hack the Kinect, using tutorials and online resources to learn how to capture inputs and create responses. I will also order the moisture sensors and motors if necessary.

 

What Actually Happened:

I spent the entire class in anguish because I couldn't figure out how to use the Terminal to install contributed libraries for Processing. I got one library working, but it had very few available examples. After class, and over the ensuing week, I figured out how to install the libraries. Then I used a hand-recognition example to determine the position of a single point on the screen that represents the user's hand. With this data, I was able to map gestures using easing and a variety of conditionals, counters, and booleans. Once the gestures were mapped, I took female vocal samples that I found on the internet and connected them to specific gestures. After this was in place, I adapted an example of bouncy, glowing metaballs into a single orb that eases toward the user's hand. I deconstructed the example and created alternate color displays that will represent the plant's thirst for water.
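
The core of the easing and gesture logic reduces to something like the sketch below, with the mouse standing in for the hand point that the Kinect library reports. The easing factor, frame counts, and colors here are placeholder values, not the exact ones from my sketch.

float orbX, orbY;      // the glowing orb's current position
float easing = 0.08;   // fraction of the remaining distance covered per frame
int rightFrames = 0;   // counter for a simple "swipe right" gesture
float prevHandX;
boolean swiped = false;

void setup() {
  size(640, 480);
  noStroke();
  orbX = width / 2;
  orbY = height / 2;
  prevHandX = mouseX;
}

void draw() {
  background(0);
  float handX = mouseX, handY = mouseY;  // stand-in for the Kinect hand point

  // Ease the orb toward the hand instead of snapping to it
  orbX += (handX - orbX) * easing;
  orbY += (handY - orbY) * easing;

  // Count consecutive frames of rightward motion; enough of them = a swipe
  if (handX - prevHandX > 3) rightFrames++;
  else rightFrames = 0;
  if (rightFrames > 15 && !swiped) {
    swiped = true;  // here the real sketch would trigger a vocal sample
    println("swipe right gesture");
  }
  if (rightFrames == 0) swiped = false;
  prevHandX = handX;

  // The orb's color stands in for the plant's thirst display
  fill(swiped ? color(80, 140, 255) : color(120, 220, 120));
  ellipse(orbX, orbY, 60, 60);
}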

 

Assistance that I need:

I need assistance determining the mechanics behind the plant's water source.

 


4/11/12

 

Highest Risk:

Currently, the highest risk is implementing speech recognition with the Kinect.

 

Plan of Action:

My plan for approaching this is to install Windows 7 on my MacBook Pro using either Boot Camp or Parallels. On Windows, I will install the necessary Kinect libraries and the Microsoft Kinect SDK, and I will bring over my current gestural interface and visualization in Processing. With the SDK, I will use its speech recognition libraries in C++ to define my own commands. I will then connect specific commands to responses, which are also tied to gestures.
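
Since the speech recognition will live in a separate C++ program, one possible bridge (an assumption, not something I have working) is to have that program write each recognized phrase as a line to a local TCP socket that the Processing sketch reads. The host, port, and phrases below are placeholders.

import processing.net.*;

Client speech;

void setup() {
  size(200, 200);
  // Assumed: the C++ speech program serves recognized phrases on this port
  speech = new Client(this, "127.0.0.1", 5204);
}

void draw() {
  while (speech.available() > 0) {
    String phrase = speech.readStringUntil('\n');
    if (phrase == null) break;  // no complete line buffered yet
    handleCommand(trim(phrase).toLowerCase());
  }
}

// Tie each spoken command to a response; "water it" would also
// send the pump command on to the Arduino over serial.
void handleCommand(String cmd) {
  if (cmd.equals("water it")) {
    println("watering: send pump command to Arduino");
  } else if (cmd.equals("how are you")) {
    println("play the plant's status sample");
  }
}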

 

For Next Class:

I will install Windows, the libraries, and the Kinect SDK, and get started on the speech recognition development. I will also record Elizabeth's voice for the plant's characterization. Aside from that, I will research and purchase the sensor required for moisture detection. Lastly, I plan to begin preliminary development of the plant's water source by doing research, asking questions, and sketching prototypes.