Social Things_11 ↓ Ars Electronica

Here is my project at Ars Electronica ↓

At the beginning, a bad habit resurfaced – my low-confidence self took over and I was really sick of my project. I guess it's because I hadn't had any feedback on the updated version of the project, since the version I submitted was a prototype finalized in its concept but not in its form. Those last steps were made during the summer, and I didn't get to meet or talk to many people during that span of time.

Seeing everyone pretty confident about their own projects and happily talking about them pushed me to reach out to the audience – despite my hesitations. Also, my cohort was a good support in that – it's always scary putting yourself out there, at least for me.

Turns out my project was more easily understood than expected, and I got some good feedback. It seems it's definitely relaxing, and most of the audience got the concept of the scroll gesture right away, or at least why I did it. I guess the title "DO IT RIGHT, DO IT SLOW" helped with that, as well as the actual shape of the object…? At least, that's what I gathered.

I hope there will be opportunities to continue exhibiting my object – and definitely take it further. I think I already said this in a previous post, but I want to turn it into a modular synth instead of simply playing samples. Also, get rid of the Mac Mini part to make it self-sufficient. It's too bad MAX/MSP isn't supported on the Raspberry Pi, but I guess Pure Data would take over then.

It's my last blog entry as a student for this project, but it is to be continued 🙂

Social Things_10 ↓ Making

I had a great time this summer crafting my object! I'm posting it in one go, but I recall I may have spent 1-2 weeks in total on the final making.

My main obstacle was figuring out how to make the wavy shape. Thankfully, Nathalie from the 3D workshop is full of tricks. She told me that bending wood is a whole other level of difficulty, but that I could take a different approach to it. We started by laser-cutting the sides in their wavy shapes, then forced a very thin layer of wood on top to bend over the cut shapes.

I don’t know if it makes sense, so here are some explanatory pictures ↓

My first plan, vastly corrected by Nathalie. I’m definitely not a 3D person, but I try.


Nathalie's sketches. It's basically a box in two parts. She recommended MakerCase to generate my plans and then adjust them in Illustrator for the wavy part – super useful, I didn't have to think about the joints.


All the parts are cut and glued together. I unfortunately didn't take any in-between pictures, but you can see the thin layer of wood glued to the thicker sides – hence the wavy shape.
Pipe helping me with the taping so that the glue stays put.


After deliberation (Nicolas, Betty and Pipe being the jury), I decided to get rid of the bottom half. It was way too big compared to what I expected.


Thus, I made another bottom part to be able to close the box.


I re-tested the electronic part, since I changed the aluminium foil to copper tape – much more stable.


Time to sand, sand, sand.


I applied a black satin paint in 2-3 coats, and another satin finish coat on the top.

I also changed the Arduino and the MAX/MSP part. Gareth helped me out with the Arduino part – instead of just reading the pin number, we adjusted it to read the speed between two pins. Thus, I can use that number within MAX/MSP to play different samples according to the speed.
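To give an idea of that logic – with hypothetical names and thresholds, not the actual sketch we wrote – it boils down to timing the gap between the two strips' touch events and bucketing it into a sample number:

```cpp
// Hypothetical sketch of the speed reading: the elapsed time between
// touching the second strip (pin 6) and the third strip (pin 11)
// becomes the number sent to MAX/MSP.

// Returns the elapsed milliseconds, or -1 if the strips were hit
// out of order (no valid scroll gesture).
long scrollSpeedMs(unsigned long touchAtPin6, unsigned long touchAtPin11) {
    if (touchAtPin11 < touchAtPin6) return -1;  // reversed or cheated
    return (long)(touchAtPin11 - touchAtPin6);
}

// Bucket the speed into a sample index for MAX/MSP: the slower the
// gesture, the higher the index (thresholds are invented placeholders).
int sampleForSpeed(long elapsedMs) {
    if (elapsedMs < 0) return 0;     // invalid gesture: silence
    if (elapsedMs < 500) return 1;   // fast scroll
    if (elapsedMs < 1500) return 2;  // medium
    return 3;                        // slow, mindful scroll
}
```

On the Arduino, the two timestamps would just be `millis()` captured in the MPR121 touch callbacks, with the resulting number printed over serial for MAX/MSP to pick up.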

It’s all ready and set for Ars Electronica 😎


Social Things_09 ↓ PE

I just handed in my Portfolio of Evidence for this Physical Computing unit. Here is the demonstration video I made, with Betty's hands scrolling down through my prototype ↓↓↓

I used this song made by my musician friend Sima Kim, and tweaked it a little bit to demonstrate the type of effects I want to produce. Indeed, I'm working towards an evolution of the sound through the speed of the gesture. Thus, I have some work to do on the MAX/MSP patch, but it might be a lot of fun. Ever since I discovered MAX/MSP, I've wanted to find the time to actually compose music with it. Should be a cool summer homework!

First though, I should take the next few weeks to finish the object's design as a priority.

Social Things_08 ↓ Final Crit

We had the Project Final Crit this morning, in the presence of Rania Svaronou and Riccie Janus from IBM again. We organized it as P2P feedback, as you can see below. Pretty cool to see everyone's projects coming through their last iterations!

Here is my (5th) prototype ↓↓↓

(I wish I took a self-explanatory picture before I glued everything, instead of that long paragraph coming 😅)

I made up a very DIY case to ensure the foil was secured: a plastic sheet for the touch and a colored paper sheet to hide it. I'm considering simply using a colored plastic sheet for the last version, as I don't need to see the BTS that much anymore.

Compared to the 4th prototype, I didn't use copper tapes but simply switched back to foil to have bigger strips. I cut them around 3 cm wide compared to 5 mm for the tapes. I also left around 3 cm of space between each strip, while the tapes were placed too closely and created confusion for the MPR121. I also only used 3 strips compared to the 6 I previously had. I think that's plenty, considering the interactions I actually need from them – not that many.

Pretty simple, as instructed: the person has to hold on to the first strip, then slide through the two other strips. I noticed the foil strips sometimes went "off" or got confused with one another despite the space between them – forcing me to restart the circuit. It didn't happen before; I'm not sure if it's because foil is less stable than copper, or simply down to the tape format? Well, what I need: bigger tapes!

The technical part didn't change much from the 4th prototype: I used the same wiring + code for the Arduino part, and I simplified the MAX/MSP patch. Note: the first strip is wired to pin 0, the second strip to pin 6, and the third strip to pin 11.


While the 0 pin didn't change, I used select to bang each time it detects pin 6, plus counter to bang each time it effectively counts from 6 to 11. Both select and counter are linked to timer to know how many milliseconds have passed between the finger passing the second strip (aka first bang) and the third strip (aka second bang). Then I linked it to a gain function: the faster the gesture, the lower the volume.
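As a rough sketch of that mapping – the 0-157 range mimics MAX/MSP's gain~ slider, and the 2-second cap is a made-up placeholder, not my actual patch values:

```cpp
// Hypothetical gain curve matching the patch's intent: a fast gesture
// (small elapsed time between the two bangs) gets a low volume, a slow
// gesture gets full volume.

// Map the timer value (ms between the two bangs) to a 0-157 gain,
// the step range a MAX/MSP gain~ slider uses by default.
// 2000 ms or more counts as "slow enough" and gets full gain.
int gainForElapsed(long elapsedMs) {
    if (elapsedMs <= 0) return 0;       // no gesture, no sound
    if (elapsedMs >= 2000) return 157;  // slow gesture: full volume
    return (int)((elapsedMs * 157L) / 2000L);  // linear ramp in between
}
```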


(Here is Pipe interacting with my prototype, you can also see the title I’m settling on: LET’S DO IT RIGHT, LET’S DO IT SLOW.)

I wrote down the main feedback I got + my thoughts on it:

  • Audrey: “When moving fast, not aware of the reaction or the idea ‘slow down’.”

Agreed, the sound effects definitely need to be more obvious than gain, or else it looks like it's broken. I re-linked it to a feedback function right away, so it distorted the sound instead.

  • Rania: "Loves the idea. Thinking from a UX perspective, better to use a vertical scroll instead. Match the speed of the gesture to the content and that's all it needs."

It was great to see the idea understood rapidly, with straightforward advice. Plus, it seems the vertical scroll definitely comes off as more familiar and matches the infinite scrolling we do on our social apps.

  • Gareth: "Loves the concept, definitely getting through: that's the most important part, the technical part comes later. Mentioned psychological studies on the scroll gesture, and the dissatisfaction we get from it through our never-ending feeds. Doesn't think the scroll needs to be vertical."

Interesting thoughts – and also related to what I'm looking for in my FMP. Maybe the gesture could work in both cases, depending on how people want to handle the object according to their own preferences – both horizontal and vertical?

  • Stephanie: "Advised a strong reminder of the context of the Slow Movement – a more high-tech approach with the phone, and the use of fabrics to tone down that approach."

Not into the phone direction, but I got where she came from, and it actually gave me an idea: maybe I can ask people to put their smartphones beside my touch pad, so that the action suggests they are substituting my device for their smartphones?

  • Nicolas: "Something is happening: a trust relationship with the object. Need an evolution of the content now: for example, if you scroll right enough to reach a good volume, the next step would be to maintain a good sound effect? The gesture is good as it is now: one hand rests while holding, while the other hand scrolls? The last step is the object design – also think about where I want this object to be used. About the question of fabrics, it could be filled up with cotton and such: take inspiration from toy stores, and look up kinetic sound."

Digging that "evolution" idea. Definitely a home object, acting as a substitute for the smartphone as I just ideated. To be honest, I don't think I will use any fabrics except silky ones: 1/ I want a slick touch reminiscent of the screen. 2/ I don't want my object's design to be playful. Since I view it as therapy for the infinite scrolling gesture – aka it won't be a toy – my aim is definitely an adult (teenagers included) audience.

The object's design will also definitely shape the gesture – I mentioned the wave idea to Nicolas. In my previous blog post, I mentioned that I ordered a plastic ball in order to prototype with its wavy shape; well, I don't know where my package is – hence the flat prototype…

Now looking into kinetic sound, my prototyping process is taking me deeper into the sound part – which is why I think I might drop the light part; I don't think it'll add much to the interaction. I will still consider it for my final sketches, more as a bonus aesthetic. I'm still thinking about those flashes you get when you close your eyes after seeing lights. Well, it'll depend on the shape, but it would need to be transparent at least on that part for the light to come through, and hiding the strips would be extra work – plus I'd have to make sure the MPR121 would still be reliable at the distance I'd need.

Though I got my concept across – which I'm feeling pretty relieved about – I still have a few mostly technical steps left: the object design, and the sound part of the MAX/MSP patch.

It might be better to hand in the sketches for the PE, and aim for an actual delivery with the objective of Ars Electronica (I didn't mention it before, but the class is going to Ars this September, and I'm bringing this Social Things project in my suitcase).

Social Things_07 ↓ Fourth Prototype

Here is the 4th prototype, where things are finally starting to come together ↓↓↓

Wiring: I wired up both the MPR121 and the RGB LED to a prototype shield + a small breadboard to minimize the size of the circuit. I wired the RGB LED as usual – look up my previous post and/or directly this tutorial. I wired the MPR121 as instructed on Sparkfun, to 6 copper tapes.

(Note: the MPR121 from Sparkfun has been discontinued, but you can find the same on Adafruit)

Code: I first used the original library, but I wasn't able to change the threshold so that the tapes would still work through a plastic sheet, as you can see on top. I asked Gareth and he advised me to use Bare Conductive's library – indeed, it was pretty easy to change the values there. Here is the code, with the RGB LED part implemented ↓↓↓

#include <MPR121.h>
#include <Wire.h>

#define numElectrodes 12

const int redPin = 5;
const int greenPin = 6;
const int bluePin = 7;
boolean ledState = LOW;

void setup()
{
 Serial.begin(9600);
 while (!Serial); // only needed if you want serial feedback with the
 // Arduino Leonardo or Bare Touch Board

 Wire.begin();

 // 0x5C is the MPR121 I2C address on the Bare Touch Board,
 // 0x5A on my board
 if (!MPR121.begin(0x5A)) {
 Serial.println("error setting up MPR121");
 switch (MPR121.getError()) {
 case NO_ERROR:
 Serial.println("no error");
 break;
 case ADDRESS_UNKNOWN:
 Serial.println("incorrect address");
 break;
 case READBACK_FAIL:
 Serial.println("readback failure");
 break;
 case OVERCURRENT_FLAG:
 Serial.println("overcurrent on REXT pin");
 break;
 case OUT_OF_RANGE:
 Serial.println("electrode out of range");
 break;
 case NOT_INITED:
 Serial.println("not initialised");
 break;
 default:
 Serial.println("unknown error");
 break;
 }
 while (1);
 }

 // pin 4 is the MPR121 interrupt on the Bare Touch Board
 MPR121.setInterruptPin(4);

 // this is the touch threshold - setting it low makes it more like a proximity trigger
 // default value is 40 for touch
 MPR121.setTouchThreshold(40);

 // this is the release threshold - must ALWAYS be smaller than the touch threshold
 // default value is 20 for touch
 MPR121.setReleaseThreshold(20);

 // initial data update
 MPR121.updateTouchData();

 pinMode(redPin, OUTPUT);
 pinMode(greenPin, OUTPUT);
 pinMode(bluePin, OUTPUT);
}

void setColor(int red, int green, int blue) {
 analogWrite(redPin, 255 - red);
 analogWrite(greenPin, 255 - green);
 analogWrite(bluePin, 255 - blue);
}

void loop()
{
 if (MPR121.touchStatusChanged()) {
 MPR121.updateTouchData();
 for (int i = 0; i < numElectrodes; i++) {
 if (MPR121.isNewTouch(i)) {
 // Serial.print("electrode ");
 Serial.println(i, DEC);
 setColor(255, 0, 0);
 // Serial.println(" was just touched");
 } else if (MPR121.isNewRelease(i)) {
 // Serial.print("electrode ");
 Serial.println(i, DEC);
 // Serial.println(" was just released");
 digitalWrite(redPin, ledState);
 digitalWrite(greenPin, ledState);
 digitalWrite(bluePin, ledState);
 }
 }
 }
}

Time to re-create the scrolling gesture: the thumb has to hold on the first tape while the other fingers scroll down through the rest of the tapes. This one-hand gesture is pretty similar to what you do on your trackpad or your smartphone.

I spent quite some time in MAX/MSP figuring out how to make sure the fingers have to pass through all the tapes – you could cheat by holding on to one tape only and it would still work. After trying out select, clocker and such, I used counter, and it does count only after the full action, from the moment I hit the first tape to the last one! Still, I need to figure out how to measure the speed of that action.
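The counter idea boils down to checking that the electrodes fire in order, which could be sketched like this – a hypothetical helper, assuming the 6 tapes are electrodes 0 to 5:

```cpp
#include <vector>

// Anti-cheat check mirroring what counter does in the patch: the
// gesture only counts if every electrode fires in order, from the
// first tape down to the last (electrodes 0 to 5 here).
bool isFullScroll(const std::vector<int>& firedElectrodes) {
    int next = 0;  // the next electrode we expect to see
    for (int e : firedElectrodes) {
        if (e == next) ++next;  // advance only on the expected tape
    }
    return next == 6;  // all six tapes hit, in order
}
```

Holding on to a single tape just repeats the same electrode number, so the sequence never completes and nothing is triggered.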

I asked Nicolas for advice and we tried some things such as thresh or select combined with timer. It didn't quite work the way I wanted – aka no cheating allowed – but it gave me insight into how I can make it work for the next prototype. Hint: I'm thinking of using counter with timer.

For now, the only thing working is using select on the first tape to activate the sound (I chose an ambient piece of music made by my friend Sima Kim in his debut days) through a comb function whose effects I intend to make full use of.

It's a bit messy, but here is the actual MAX/MSP patch ↓↓↓

Also, this is the MAX30100, the heart rate sensor I intended to use for the other hand to rest on ↓↓↓

I decided not to use it anymore – not that it didn't work for some reason… After discussing it with Nicolas and explaining that my aim was to parody tracking technology, he said that my point wouldn't come across, as it would only be perceived as technologically intrusive – in fact, exactly why I wanted to get away from tracking data in the first place. Well, I tried; it's out for good now!

The light also doesn't have any real use for now. I'm still struggling to sketch an object design that makes the most of it while hiding all the wires. I'm thinking of a wavy kind of shape, though. I ordered this plastic ball to use one half of it to cover up the LED, in order to envision the wavy part since I can make the flat part myself – next prototype, if it gets delivered safely.

Social Things_06 ↓ Third Prototype

Reminder: my prototyping process has led me down an unexpected path; the path of re-creating a touch pad. I did think of directly buying one, but I wanted to get away from both the aesthetics and the shape, which is pretty much determined by manufacturing and standardization – starting with the Apollo Computer in 1982.

Be it touch pads or touch screens – they both work with capacitive sensing.

It is very easy to get started with this kind of prototype with Arduino. Below, I used foil, but you can pretty much use any conductive material.

Wiring: I connected one wire and one 1MΩ resistor respectively to digital pins 3 and 4, directly on the Arduino. Both of them were attached to crocodile clips holding on to a sheet of foil. I used a 1MΩ resistor so it only responds to direct touch, but you can use a higher-value resistor and it will respond from a few inches away.


Code: With it, I used the Capacitive Sensing library, which is great to quickly get it working ↓↓↓

#include <CapacitiveSensor.h>

// send pin 4, receive (sense) pin 2
CapacitiveSensor cs_4_2 = CapacitiveSensor(4, 2);

void setup()
{
 Serial.begin(9600);
}

void loop()
{
 long start = millis();
 long total1 = cs_4_2.capacitiveSensor(30);

 Serial.print(millis() - start); // check on performance in milliseconds
 Serial.print("\t");
 Serial.println(total1); // print the sensor reading

 delay(10); // limit the serial output rate
}



By using the Serial Monitor, I could see the numbers going up when I rested my finger on it, when I held it long enough, and when I pressed with more than one finger. It is very straightforward, but it seems it needs stable conditions to work. There are copper tapes in the studio I can use instead of foil – they seem to be better in terms of stability.

It also might be better to switch to the MPR121, as I would be able to use different strips of foil in an easy and stable way. I didn't use it previously because I thought the MPR121 only had an on/off state, but according to Nicolas I just need to time those states.

I also started to use MAX/MSP with Arduino; here is a simple patch visualizing the data ↓↓↓


I'm actually thinking of using MAX/MSP to trigger sounds. More on that in my next prototype!

Social Things_05 ↓ Scroll Gesture

OK – I have to admit I spent the last three weeks a bit lost. Despite the fact that I tried different kinds of sensors, none of them did the job for me: it was about that precise interaction I wanted to pin down. After getting through a slump I named inert-eraction, I had this "Eureka!" moment I shall still doubt every 15 minutes during any future brainstorming. Nevertheless, I got the missing element I was looking for, for my meditation device ↓↓↓

Yes, the scroll gesture – be it with the mouse, the touch pad or the smartphone. I'm not sure exactly how I ideated it – maybe when I was scrolling down myself and thought "I'm not moving much, am I?" – but I already had precise thoughts: I wanted an actual pause of movement without any physical sensors. The digital gestures match that! Why not use this gesture as the main interaction of my object?

Plus, the way we relentlessly stare at screens almost makes me think of a trance. For example, how many of us have binge-watched series without noticing the hours passing by? Surfing the Internet might then be a kind of non-conscious meditative state. The relationship with the perception of time is very interesting here, and the Slow Movement indeed encourages time mindfulness – I'm taking back my time lost in scroll-trance by scroll-meditating. Here, it's about the means!

Thus, I need a touch pad to scroll on. When I explained my concept to Nicolas at this morning's tutorial, he encouraged me to create a low-tech touch pad. He mentioned that I could use conductive fabric, but advised me to first try out a DIY version using foil.

We talked about my thoughts on how these gestures relate to types of cognitive and psychological responses in their interactions, and how my device could end up creating another type of gesture. So far, I'm only aware of the research project and book Curious Rituals by Nicolas Nova, Katherine Miyake, Nancy Kwon and Walton Chiu.

I found books that are more or less related, though: The Best Interface is No Interface by Golden Krishna, and Irresistible by Adam Alter (this article found in the Guardian is a good review of it). Both give a different insight into the gestures we use with our digital devices; the first about how we get through the interface by designing better interactions, and the latter about how this interface gets us addicted. Exactly the contrary of what I want to accomplish here, by taking the gesture down to another type of interaction.

Although I now know what I want my device to be, I still don't know its output: sound, light, both? I've mentioned the fact that I'm pretty influenced by James Turrell in my work, and my wish to create an immersive experience using light.

Social Things_04 ↓ Second Prototype

So here are the sensors I borrowed.


I had to connect them using Bluetooth with OSC, added to MAX/MSP. Thanks to Nicolas, who made these super useful patches ↓↓↓

Accelerometer – YES/NO

I wired it to the Arduino and just read its values with the Serial Monitor – nothing special, but here is the code in case someone wants it (I hope the picture is clear enough for the wiring part; I don't have any plans to share, sorry!) ↓↓↓

int x;
int y;
int z;

void setup() {
 Serial.begin(9600);
}

void loop() {
 x = analogRead(A0);
 Serial.print("value x ");
 Serial.println(x);
 y = analogRead(A1);
 Serial.print("value y ");
 Serial.println(y);
 z = analogRead(A2);
 Serial.print("value z ");
 Serial.println(z);
 delay(100); // slow down the serial output
}

While I admittedly didn't make full use of them, the results I got weren't satisfying. In fact, I don't enjoy handling them – even though I did say I wanted the user to fully (inter)act with the sensors, aka with gestures. Both can actually be manipulated with small moves, but in-air gestures aren't something I envision as specifically meditative. It would have been great if my intention was to make a wearable device, but my idea of meditation actually equals a pause of movement – when you immerse in and face your mind.

I'm still lacking that specific interaction to go along with my meditation concept, and I have difficulty seeing its output: I'm hesitating between a screen with generated visuals, or lights?

Social Things_03 ↓ First Prototype

The advantage of this class's fast-paced schedule – despite the irony of my Slow theme – is that even though I feel (and might certainly be right) that I'm ideating without any clear plan, I still have to get something out there.

If I don't want to be literal – aka not doing anything with the phone or the notion of time – how could my object possibly refer to the Slow Movement? Isn't that why the examples I found are so literal?

No more doubts, here goes my first prototype! Led by my contextual and knowledge research, I finally opted for a meditation device. The value of mindfulness advocated by the Slow Movement isn't far removed from spirituality after all; I did find in my ethnographic research that some of my interviewees had activities such as meditation, while others were into walking or biking. It depends, but you get the idea: any mind-freeing activity.

For that, I assembled a DIY GSR sensor – aka taping down two wires to foil using Velcro – reacting through an LED with three states: NO light when it isn't used, GREEN when the person's stress level is detected as normal, and RED when the person is presumably stressed. It certainly won't work as such, but I tried to convey my main idea at this point: an object gaining its meaningfulness solely from the input of its user, else utterly useless.

Wiring: First, I followed Adafruit's RGB LED tutorial. Then for the GSR part, I connected one of its two wires to ground through the breadboard and the other one to an analog pin through a 330Ω resistor on the breadboard.

Code: It's super simple – read the pin's output and set the LED colors with if statements ↓↓↓

const int redPin = 8;
const int greenPin = 9;
const int bluePin = 10;
const int GSR = A0;
int sensorValue;
boolean ledState = LOW;

void setup() {
 Serial.begin(9600);
 pinMode(redPin, OUTPUT);
 pinMode(greenPin, OUTPUT);
 pinMode(bluePin, OUTPUT);
}

void setColor(int red, int green, int blue) {
 analogWrite(redPin, 255 - red);
 analogWrite(greenPin, 255 - green);
 analogWrite(bluePin, 255 - blue);
}

void loop() {
 sensorValue = analogRead(GSR);
 Serial.print("sensorValue = ");
 Serial.println(sensorValue);
 // check the highest threshold first, so each reading
 // lights only one state
 if (sensorValue > 10) {
 Serial.println("Too High!");
 setColor(0, 0, 255);
 } else if (sensorValue > 5) {
 Serial.println("Slow Down!");
 setColor(0, 255, 0);
 } else if (sensorValue > 0) {
 setColor(255, 0, 0);
 } else {
 digitalWrite(redPin, ledState);
 digitalWrite(greenPin, ledState);
 digitalWrite(bluePin, ledState);
 }
 delay(100);
}

I don't think my object should be thought of in terms of utility, but I feel it lacks both character and content in its meditative aspect. You don't need such a device to meditate, after all.

Well, I did find a project by KP Kaiser that makes it somewhat useful; read his blog posts here and there. He used other sensors besides GSR, such as heart rate and skin temperature, making it actually track your meditation level, and linked it to an app. It's pretty high-tech; and for these sensors to fully work, it's better for the user to actually wear them.

From now on, I'm thinking I might take a low-tech approach in order to focus solely on the (inter)action. I don't want the user to wear any sensors but to actually use them. I might get an idea while iterating on the prototype, hence I just borrowed other sensors to try them out – let's see how it goes.

Social Things_02 ↓ First Presentation

We had to give a first presentation in the presence of the alumna Rania Svaronou and her colleague Riccie Janus, who work at IBM. In this 5 min presentation, I presented the Slow Movement and the main sub-movement I'm interested in, which is Slow Design, with its main principles: crafting meaningful engagement to bring sustainability.

You can read more on that subject in the paper The Slow Design Principles (2008) by Carolyn Strauss and Alastair Fuad-Luke. Speaking of Alastair Fuad-Luke, I think his paper 'Slow Design?' – A Paradigm Shift in Design Philosophy? (2002) started it all by coining the term. I also found a pretty interesting paper about Slow Technology: Slow Technology: Designing for Reflection by Lars Hallnäs and Johan Redström (2000).

Plus, the video made for the paper Slow Design for Meaningful Interactions (2013) by Barbara Grosse-Hering, Jon Mason, Dzmitry Aliakseyeu and Conny Bakker, is pretty good to rapidly understand what lies behind Slow Design.


Particularly the last part: "It's about slowing interaction down at the right moment!", which reminds me of what Carl Honoré wrote – who, I remind you, is the one who popularized the Slow Movement: "The Slow Movement is not about doing everything at a snail's pace. […] On the contrary, the movement is made up of people like you and me, people who want to live better in a fast-paced, modern world. That is why the Slow philosophy can be summed up in a single word: balance. Be fast when it makes sense to be fast, and be slow when slowness is called for. Seek to live at what musicians call the tempo giusto – the right speed."

One last reference before I go into my presentation. This talk given by William Odom shows various examples of Slow Interaction Design:

I also presented two decisions, despite having a prototyping idea: I didn't want to do anything with an app nor a wearable device, as I mentioned there are already good options out there, plus the fact that I actually want to get off the screen and craft a tangible object. For example, I'm pretty fond of these vintage calendars – I own several of them back at my parents' place. They act like they should – that is, telling you what day it is – and I also bizarrely enjoy the fact that you have to turn the handles to literally switch to another day.

Hence, there are interactions and gestures I miss in the digitalized world, and I'm trying to ideate to fill these gaps. Still, I'm thinking it might be too literal to even refer to the notion of time in my object. As I carry on my contextual research, I actually mostly find projects focused on the notion of time. Here are examples from the Slow Tech exhibit curated by Wallpaper and Protein at the London Design Week. Nicolas also referred me to the Slow Watch project, and Betty linked me to the pretty similar Hidden Time Watch project. Pure counter-reaction: I want time out of my object.

One piece of feedback from the presentation that I particularly retain is: "It's not about the technology coming to us but us going to the technology", acting as a pretty good reminder of my previous references. Quoting Nina Simon in The Participatory Museum – "Imagine looking at an object not for its artistic or historical significance but for its ability to spark conversation" – I'm thinking my object might actually go down that path.