Senior Lecturer Ilona Sagar: Reconfigured Vision – SPACE, The White Building



Participants joined artist in residence Ilona Sagar for a co-inquiry that took the question ‘what do objects sound like?’ as its starting point.

Working in collaboration with OxSight and Torr Vision Lab, organisations engineering devices to assist users who are registered blind, and with Alex Taylor from the Human Experience & Design Group at Microsoft Research Cambridge, Ilona Sagar sought to investigate our relationship to assistive technologies, asking: how can we safeguard agency and subjective experience? How do the devices we use dictate our shared environment?


Participants were invited to imagine new ways of rendering our physical environment and to translate the objects it contains into sonic semantics. Testing wearable vision-enhancement tools acted as a catalyst for a much wider exploration into the politics and language of assistive technologies.

The idea that bodies are either enhanced or normalised is an uncomfortable aspect of wearable technologies, and it raises the questions: what is a ‘good’ body? To what extent are we a product of our ‘body capital’, labour and efficiency?


