I met with my mentor Pete Wallace; we had loads of catching up to do about our respective projects.
Prior to that meeting, I hadn’t voiced my concept to anyone, so I was pretty unsure about its clarity. It turns out Pete got exactly what I was trying to explain (aka what I previously wrote → “Bring corporality to the experience of web surfing“), and talking it through definitely helped clear up my thoughts. I got so many ideas after being stuck by myself. Because of his background in projection and moving image, he also advised me a lot on the technical side. After that, I sketched out a few ideas for the execution – as you can see below.
We also talked about my Ars project, and my interest in the relationship between sound and physical gestures. Pete e-mailed some of his colleagues whose lines of work could intertwine with mine. I exchanged a bit with Benji Fox – he advised me on binaural sound, and sent me a few references of contemporary modular synths with interesting control gestures. I particularly liked Landscape ↓
In the same vein, I also went to the Music Hackspace Artist Talk at Somerset House this Monday, with Kacper Ziemianin presenting his LightSeq ↓
In an age where pretty much everything can be done on a single laptop, the experience of playing // watching a performance still matters.
Although my FMP might not follow that musical path – I’m better off concentrating on the visual part – I can still definitely link my interest in gestural experience to web surfing.
Anyway – this is all useful, particularly since I’m planning to improve my Ars project. I’m trying to build a cohesive body of work, and I can’t really disregard my own interest in sound. I know it shouldn’t be forced though, so I’m stepping back a little to reflect and solve the visual-content part first.
I have to admit I’m stuck on that part. 🙃 The only thing I know is that I want to re-create a web universe – which is still plenty vague. I didn’t talk about it much with Pete, though he gave me a few ideas for variables I could use, such as RGB detection.
As usual, I mind-mapped – and simplified everything ↓
Here are the sketches ↓