Synch.Live was created and directed by Hillary Leone and developed in collaboration with Daniel Bor, PhD, Pedro Mediano, PhD, Fernando Rosas, PhD, Madalina Sas, MEng, and Andrei Pacuraru.
Hillary explains that emergence can mean many things: “capturing the creation of galaxies, intelligent behaviour in ant colonies, economics, ecosystems, and conscious brains.” Using it in this context, however, is brand new.
Idea emerging
How does one go about creating such a system? “Functionally, the system needed to extract position and movement information, feed it into the emergence algorithm, and then create a visual feedback loop to signal how close the participants were to the goal of the game,” Hillary explains. “The system needed to run the emergence criterion in real-time and provide feedback to the player hats. The hats needed to be able to ‘blink’ in perfect sync, like fireflies do in nature. They needed to be safe and comfortable; flexible to allow for complex patterns and artistic expression; and precise and reliable in order for the results of the scientific part of Synch.Live to be valid.”
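That loop can be sketched in a few lines of Python. Everything here is illustrative: the function names are hypothetical, and a dummy score stands in for the real emergence criterion, which Synch.Live computes from information-theoretic measures of the players’ movements.

```python
import random


def read_positions(n_players=10):
    # Hypothetical stand-in for the tracking stage, which in the
    # real system extracts player positions from camera footage.
    return [(random.random(), random.random()) for _ in range(n_players)]


def emergence_score(trajectory):
    # Placeholder for the real-time emergence criterion; the real
    # system evaluates an information-theoretic measure over the
    # players' movement history. Returns a value in [0, 1].
    return min(1.0, len(trajectory) / 100.0)


def blink_period(score, slowest=2.0, fastest=0.5):
    # Close the feedback loop: map the score to the hats' blink
    # period, so the lights respond as the group nears the goal.
    return slowest - score * (slowest - fastest)


# One pass of the loop: track, evaluate, signal.
trajectory = [read_positions() for _ in range(50)]
period = blink_period(emergence_score(trajectory))
```

The key design point is the one Hillary describes: position data flows one way into the criterion, and the only signal back to the players is the behaviour of the lights.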
Building the system wasn’t easy, and the team wanted the code to be easy to use as well. “We wanted to build a system that was easy for other artists, scientists, and enthusiasts to replicate and play with,” Hillary says. “Scalability was also an important requirement: while we are currently running small pilots with only a few players (ten), we envision a future for Synch.Live where dozens of players meet up to create large mesmerising displays of emergent co-ordination.
“With those requirements in mind, we designed the software to be highly modular and extensible. Precise clock synchronisation and scalable deployment were tough nuts to crack; we solved them with RTC modules and Ansible. We have been lucky to have a network of very supportive friends and collaborators, including professional software engineers and computer vision researchers, who have helped us make this project a reality. The project has been developed fully open-source, and our in-house software engineer Madalina has thoroughly documented the build process in her blog.”
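The RTC approach hints at why precise clocks matter: once every hat’s clock is disciplined, the hats need no radio handshake at flash time, because each one can independently schedule flashes on the same boundaries of absolute time. A minimal sketch of that idea (the period value is illustrative, not taken from the project):

```python
BLINK_PERIOD = 2.0  # seconds between flashes (illustrative value)


def seconds_until_next_blink(now, period=BLINK_PERIOD):
    # With all clocks agreeing on 'now', every hat computes the
    # same next flash boundary and so blinks in unison, like
    # fireflies, without any communication at blink time.
    return period - (now % period)
```

The same scheme scales from ten hats to dozens, since adding players adds no coordination traffic.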
Scientific fun
If you’re like us, this all sounds fascinating, and like a fun experience to take part in.
“People are incredibly excited about the work and its potential applications,” Hillary mentions. “As one attendee wrote after one of our presentations: ‘The vision is powerful, the implications and potential are clearly articulated and beautifully expressed, and the need for this work is deep and significant.’”
The project is currently being refined as pilots are run to test the system, and Synch.Live will appear at Imperial College London on 18–19 June.
“Once we have the data to demonstrate collective emergence, we want to scale up the group size, design the wearable, develop new scenarios and rule sets, and explore applications such as conflict resolution and team building,” Hillary finishes. “Ultimately, we want to make the experience of Synch.Live available to communities around the world, and make a film that celebrates our collective humanity.”
Keep an eye on upcoming events on Eventbrite.