H(n)machine
Last updated
Here is our repository on GitHub: https://github.com/javierserraa/HnMI-t03-.git
It was a live coding course focused on interactions between humans and machines, and on the new relationships that live coding makes possible. I will probably try to do a project in this field; I really enjoyed it and had fun. Our group, called the Conductive Nose team, definitely created an amazing vibe.
In this series of lessons we learned that our body is a tool for collecting data, and at the same time, because body and mind are connected, the mind is a dataset of information. As designers we can work in this field to create and shape our identity. Why? Because with human-machine interaction we will have the power to change phenomena. A phenomenon is a situation or episode collected by us (body and mind) that triggers emotions in us.
We built soft pressure sensors from velostat and conductive fabric, connected them to an Arduino, and visualized the data in real time in Processing. With Arduino we learned how to collect data from the sensor, and with Processing we used the collected data to create live visuals. This learning-by-doing approach was amazing and highlighted how bodily actions can be translated into digital feedback.
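Raw readings from a velostat pressure sensor tend to be noisy. One common trick (a sketch of the general idea, not the exact code we used in class) is to run a small moving average over the readings before visualizing them. This assumes 10-bit values in the 0 to 1023 range, like Arduino's `analogRead`; the window size is an arbitrary choice for illustration:

```javascript
// Simple moving-average smoother for noisy sensor readings.
// The window size (and the 0..1023 input range it assumes) are
// illustrative choices, not taken from the course code.
function makeSmoother(windowSize = 5) {
  const buffer = [];
  return function smooth(reading) {
    buffer.push(reading);
    if (buffer.length > windowSize) buffer.shift(); // drop oldest reading
    return buffer.reduce((a, b) => a + b, 0) / buffer.length;
  };
}

const smooth = makeSmoother(3);
console.log(smooth(1000)); // 1000
console.log(smooth(0));    // 500
console.log(smooth(500));  // 500
```

Feeding the smoothed value (instead of the raw one) into the visual keeps the on-screen shapes from jittering with every spike in the sensor signal.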
On the second day, we focused on p5.js, an open-source platform for learning live coding and creating live visuals. I really enjoyed this website because it offers a vast library of examples and references. In class, we experimented with visual sketches that responded dynamically to sound input from the microphone.
In the third session, we explored serial communication between Arduino and p5.js using the Web Serial library. By mapping pressure-sensor data to the size and color of a circle, we created a dynamic visual response. We also implemented time-based animations, showcasing how real-time data and user interaction can visually represent bodily states.
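The core of that mapping can be sketched as a pure function, separate from the serial-port plumbing. This is an illustration under my own assumptions (a 10-bit 0 to 1023 reading, an HSB hue sweep from blue to red, and made-up diameter bounds), not the code from the session:

```javascript
// Map a 10-bit pressure reading (0..1023) to circle parameters.
// The input range, diameter bounds, and hue sweep are all
// illustrative assumptions, not the course's actual values.
function sensorToCircle(reading) {
  const t = Math.min(Math.max(reading / 1023, 0), 1); // normalize and clamp
  return {
    diameter: 20 + t * 280,         // harder press -> bigger circle
    hue: Math.round(240 - t * 240), // blue (240) at rest -> red (0) when pressed
  };
}

// Inside a p5.js draw() loop, with `latestReading` updated from the
// serial port, this would be used roughly like:
//   const { diameter, hue } = sensorToCircle(latestReading);
//   colorMode(HSB);
//   fill(hue, 100, 100);
//   circle(width / 2, height / 2, diameter);

console.log(sensorToCircle(0));    // { diameter: 20, hue: 240 }
console.log(sensorToCircle(1023)); // { diameter: 300, hue: 0 }
```

Keeping the mapping pure like this makes it easy to tweak the visual response without touching the serial-communication code.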