*-2\\H(n)machine

Here is our repository on GitHub: https://github.com/javierserraa/HnMI-t03-.git

This was a live coding course focused on human-machine interaction, and on the new relations that live coding makes possible. I will probably try to do some projects in this field; I really enjoyed it and had fun. For sure our group, the Conductive Noise team, created an amazing vibe.

In this series of lessons we learned that our body is a tool to collect data and, since body and mind are connected, the mind is a dataset of information; as designers we can work in this field to create and shape our identity. Why? Because with human-machine interaction we have the power to change phenomena. A phenomenon is a situation or episode collected by us (body-mind) that triggers emotions in us.

\\ SOFT SENSORS

We built soft pressure sensors using velostat and conductive fabric, connected them to an Arduino, and visualized real-time data in Processing. With Arduino we learned how to collect data from the sensor, and with Processing we used the collected data to create a live visual. This learning-by-doing approach was amazing and highlighted how bodily actions can be translated into digital feedback.

// ARDUINO CODE - collect data (velostat sensor typically wired as a voltage divider into A0)
int sensorPin = A0;
int sensorValue;

void setup() {
  Serial.begin(9600);        // open the serial port for Processing
  pinMode(sensorPin, INPUT);
}

void loop() {
  sensorValue = analogRead(sensorPin); // read the pressure level (0-1023)
  Serial.println(sensorValue);         // send one reading per line
  delay(1000);
}
// PROCESSING CODE visualize data
import processing.serial.*;

Serial mySerial;
String myString;
int nl = 10;
float myVal;

void setup() {
  size(800, 600); // canvas size
  smooth();
  printArray(Serial.list()); // list available ports
  delay(1000);
  String myPort = Serial.list()[0]; // select the first available port
  mySerial = new Serial(this, myPort, 9600);
}

void draw() {
  while (mySerial.available() > 0) {
    myString = mySerial.readStringUntil(nl);
    background(125, 0, 125);
    if (myString != null) {
      myVal = float(trim(myString)); // strip the newline before parsing
      println(myVal);
      circle(width / 2, height / 2, myVal); // pressure sets the diameter
    }
  }
}

\\p5.js - SOUND INTERACTION

On the second day, we focused on p5.js, an open-source platform for learning and creating live visuals and live coding. I really enjoyed this website because it offers a vast library of examples and references. In class, we experimented with visual sketches that dynamically responded to sound input from the microphone.
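Below is a minimal sketch in that spirit, assuming the p5.sound library is loaded: the microphone level drives the size of a circle. The mapping values are illustrative, not our exact class exercise.

// p5.js - sound-reactive circle (minimal sketch, p5.sound assumed)
let mic;

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn(); // microphone input
  mic.start();
}

function draw() {
  background(0);
  let vol = mic.getLevel();        // amplitude between 0 and 1
  let d = map(vol, 0, 1, 20, 800); // louder sound -> bigger circle
  fill('yellow');
  noStroke();
  ellipse(width / 2, height / 2, d, d);
}

// Browsers block audio until a user gesture, so unlock it on click
function mousePressed() {
  userStartAudio();
}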

\\p5.js - TIME based INTERACTION

In the third session, we explored serial communication between Arduino and p5.js using the Web Serial library. By mapping pressure sensor data to the size and color of a circle, we created a dynamic visual response. Additionally, we implemented time-based animations, showcasing how real-time data and user interaction can visually represent bodily states.
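As a sketch of the serial part, the example below assumes the Arduino code above is printing one value per line at 9600 baud. In class we used a Web Serial library for p5.js; this version calls the browser's native Web Serial API directly (Chromium-based browsers only), so take it as one possible wiring rather than our exact class code.

// p5.js - pressure sensor drives circle size and color (sketch, native Web Serial)
let port, reader;
let latest = 0; // last pressure value received (0-1023)
let buffer = '';

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(0);
  // Map the sensor reading to circle size and to the red channel
  let d = map(latest, 0, 1023, 20, 300);
  fill(map(latest, 0, 1023, 0, 255), 0, 125);
  noStroke();
  ellipse(width / 2, height / 2, d, d);
}

// Web Serial needs a user gesture before it can request a port
async function mousePressed() {
  port = await navigator.serial.requestPort();
  await port.open({ baudRate: 9600 });
  reader = port.readable.pipeThrough(new TextDecoderStream()).getReader();
  readLoop();
}

async function readLoop() {
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep any partial line for the next chunk
    const last = lines[lines.length - 1];
    if (last !== undefined && last.trim() !== '') {
      latest = float(last); // p5's float() parses the string
    }
  }
}

Clicking the canvas opens the browser's port picker; after that, squeezing the sensor should change the circle live.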

// EXAMPLE that combines these two methodologies (sound + time)
let mic;
let time = 0;
let lin = 10;

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn(); // microphone input
  mic.start();
}

function draw() {
  background(0);

  let vol = mic.getLevel();                  // amplitude between 0 and 1
  let circleSize = map(vol, 0, 1, 50, 2000); // louder -> bigger

  fill('yellow');
  square(30, circleSize, 50); // the volume also moves the square vertically
  ellipse(time, lin, circleSize, circleSize);

  time = time + 5; // the circle sweeps across the canvas over time

  if (time >= 400) { // at the right edge, drop one row and start again
    lin = lin + 20;
    time = 0;
  }
}

PLAY//FUN//LEARNING

\\ SENSING and SENSATING symposium

We had an immersive and amazing workshop with Umanesimo Artificiale, a non-profit cultural organization and collective of international creative coders based in Fano, Italy. Founded by Filippo Rosati, it explores the question: "What does it mean to be human in the era of artificial intelligence?"

The collective promotes coding and computational creative thinking through digital and performing arts, aiming to foster a fruitful interaction between humans and machines. In this workshop we developed an exhibition exploring that question and its subquestions about human-machine interaction.

For our final project in the H(n)MI workshop, our group chose to explore music interaction through p5.js. We began by creating a simple theremin-inspired digital instrument, where mouse movements controlled the oscillator's frequency and volume (a sketch of this first version follows below). Our initial objective was to replace the mouse input with hand tracking, aiming to create a more intuitive and performative experience.
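A minimal sketch of that first mouse-driven version, assuming the p5.sound library is loaded; the frequency range and ramp time here are illustrative choices, not our exact values.

// p5.js - theremin-inspired instrument (sketch, p5.sound assumed)
let osc;
let playing = false;

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator(440, 'sine'); // sine-wave oscillator
}

function draw() {
  background(0);
  if (playing) {
    // X axis -> pitch, Y axis -> volume, like moving hands around a theremin
    let freq = map(mouseX, 0, width, 100, 1000);
    let amp = map(mouseY, height, 0, 0, 1); // higher on the canvas = louder
    osc.freq(freq);
    osc.amp(amp, 0.1); // short ramp to avoid clicks
  }
}

// Start the oscillator on click (browsers require a user gesture for audio)
function mousePressed() {
  userStartAudio();
  osc.start();
  playing = true;
}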

As our project evolved, we recognized the potential of not only rethinking the interaction method but also diversifying the sound output. To further expand our exploration, each team member developed a distinct hand/(non)hand-controlled instrument, culminating in a collaborative orchestra performance we named "Conductive Noise Orchestra."

  • Ziming designed a system where finger movements controlled multiple sound layers, generating complex textures.

  • Javi created a mixing table where two index fingers manipulated track transitions, with the Y-axis adjusting volume and the X-axis controlling playback speed.

  • Andrea focused on percussive sounds and effects, using mouth opening and closing to trigger drum beats.

In our GitHub repository you can find all the code, and if you want to have some fun you can play it on the p5.js website; we're glad to see people enjoy our project. By the way, the GitHub link is at the top of this page.

// My code on p5.js
let drumSound;
let faceMesh;
let video;
let faces = [];
let options = { maxFaces: 1, refineLandmarks: false, flipHorizontal: false };
let mouthOpen = false;
let mouthPreviouslyOpen = false;

function preload() {
  faceMesh = ml5.faceMesh(options);
  drumSound = loadSound('joke-drums-242242.mp3');
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  faceMesh.detectStart(video, gotFaces);
}

function draw() {
  image(video, 0, 0, width, height);

  if (faces.length > 0) {
    let face = faces[0];
    let upperLip = face.keypoints[13]; // upper lip keypoint
    let lowerLip = face.keypoints[14]; // lower lip keypoint
    let mouthDist = dist(upperLip.x, upperLip.y, lowerLip.x, lowerLip.y);
    
    // Mouth counts as open if the lip distance exceeds 5 pixels
    mouthOpen = mouthDist > 5;

    // Play sound only if mouth opens and was previously closed
    if (mouthOpen && !mouthPreviouslyOpen) {
      drumSound.play();
    }

    // Update previous state
    mouthPreviouslyOpen = mouthOpen;

    // Draw keypoints
    for (let j = 0; j < face.keypoints.length; j++) {
      let keypoint = face.keypoints[j];
      fill(j == 13 || j == 14 ? 'red' : 'green');
      noStroke();
      circle(keypoint.x, keypoint.y, 5);
    }
  }
}

function gotFaces(results) {
  faces = results;
}

\\ REFLECTION

I think this symposium was just amazing; I reached goals I would never have believed I could. It is always a pleasure to work in a team and help each other: in this workshop we were split into different groups, but everybody helped each other like a big family.

I don't know if I will ever work in this field, but it is certainly a very interesting approach to understanding the interaction between humans and machines; I believe this kind of workshop is perfect for speculative design methodologies.

I certainly learned a lot in just two days, so I'm very happy and feel more comfortable in these fields now. At the beginning of the MDEF journey I was really scared of technology, and of coding in particular. Now I feel freer because, step by step, I'm understanding this digital world.

EMOTION's ETYMOLOGY: 1570s, "a (social) moving, stirring, agitation," from French émotion (16c.), from Old French emouvoir "stir up" (12c.), from Latin emovere "move out, remove, agitate," from assimilated form of ex- "out" + movere "to move" (from PIE root *meue- "to push away").