Fireflies, pendulum clocks, our heart muscles and the neurons regulating our daily internal processes: these things all have a tendency toward synchrony, a phenomenon that emerges when individual oscillators overcome their isolation and couple to the others around them, influencing their dynamics and together creating patterns out of their fluctuations. Synchrony is all around us, from the electrical pulses in our brains to planetary orbits locked together by gravitational forces.
Synchrony is but one of several collective behaviors of coupled oscillators that have fascinated researchers for centuries. Scholarly forays into the synchrony of coupled oscillators can be found in diverse fields, from physics to neuroscience to computing. But for all the ubiquity of this phenomenon, a complete understanding of the collective behaviors of these oscillators remains elusive.
“There certainly is a lack of quantitative modeling that could explain what these oscillators are doing,” said UC Santa Barbara mechanical engineering professor Francesco Bullo, who specializes in the modeling, dynamics and control of multi-agent network systems. “And there are many fascinating mathematical questions about the behavior of these networks,” he added. “When do they synchronize? If they don’t synchronize, what do they do?”
To help close this gap, Bullo and colleagues from UCSB, UC San Diego, UC Riverside, University of Pennsylvania, MIT, Northwestern University and University of Virginia have teamed up to establish a Multidisciplinary University Research Initiative (MURI). With their project,
“NEURAL-SYNC: From Synchronized Oscillations to Neural Computing, Communication and Adaptation,” supported by $9 million in federal funding over five years, the diverse group of researchers will dive into the fundamentals of oscillatory systems, apply their findings and expertise to cognitive neuroscience, and build optimized platforms that can take advantage of all sorts of collective oscillatory behaviors, including rhythmic patterns, synchronous evolution, resonances and traveling waves.
“The research will be conducted along three parallel thrusts, moving from relatively simple dynamics and static optimization problems, to more complex theory and problems. In each thrust, the researchers will pursue hardware demonstrations on analog and digital circuits,” said Bullo, adding that the group is “honored to be selected for this highly competitive award.”
At the core of this project is the oscillator: any system, natural like a neuron or pacemaker cell, or manmade like a pendulum or a circuit, that generates a cyclic, fluctuating signal. Connect one to another in a way that allows energy to flow between them, and together the two oscillators generate new patterns. The complexity of the system – and its capacity to model and solve computational problems – increases with each additional coupled oscillator.
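For a concrete, if simplified, picture of that coupling, the classic Kuramoto model is a standard textbook description of interacting oscillators, offered here as general background rather than as the team’s own model. Each oscillator is reduced to a phase that advances at its own natural frequency while being nudged toward the phases of the others; as the coupling strength grows, the phases lock together. The sketch below, with illustrative parameters, shows the idea:

```python
import numpy as np

# Minimal Kuramoto sketch: N oscillator phases, each with its own natural
# frequency, coupled all to all. Weak coupling leaves the phases incoherent;
# strong coupling locks them together. Parameters are illustrative.
rng = np.random.default_rng(0)
N, K, dt, steps = 50, 2.0, 0.01, 5000        # K is the coupling strength
theta = rng.uniform(0, 2 * np.pi, N)         # initial phases
omega = rng.normal(0.0, 1.0, N)              # natural frequencies

for _ in range(steps):
    # Each phase is nudged toward the others via the average of sin(theta_j - theta_i)
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta += dt * (omega + K * coupling)

# Order parameter r: near 0 for incoherent phases, near 1 for full synchrony
r = abs(np.exp(1j * theta).mean())
print(f"order parameter r = {r:.2f}")
```

With the coupling K set near zero, the order parameter stays close to zero and the phases drift independently; larger K drives it toward one, the simplest quantitative signature of the synchrony described above.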
“The interaction is where all the magic is,” Bullo said. Depending on the context, the patterns generated by these interactions could result in outcomes such as complex behaviors or dense computations; a quantitative step-by-step model is required to understand the dynamics of these patterns and their significance.
“The beauty of mathematics is that it is an abstract language that allows you to talk about systems in many walks of life,” Bullo said.
The team’s combined expertise in dynamical systems and optimization, computational neuroscience, physics and the engineering of computer chips will allow them to uncover the mathematics that underlies networked oscillators and to learn how such networks might solve increasingly complex optimization problems, in which many candidate solutions must be compared and ranked. The insights can then inform an understanding and a spatial theory of cognitive function in the brain, and be designed into analog and digital integrated circuits.
“This MURI is about a dedicated effort to understanding and building ‘natural’ computers whose dynamics solve problems humanity cares about,” said co-PI Kerem Çamsari, a professor of electrical and computer engineering at UCSB.
In this era of Moore’s Law, the prediction made in the 1970s that the number of transistors on a microchip would double roughly every two years, steady increases in transistor density have led computer scientists and engineers to focus on improving digital computers, he said.
However, transistors can get only so small and densely packed before they hit hard physical limits to their performance, signaling that this era is nearing its end. As a result, the field of computing has started to consider many new types of computers built from different building blocks.
“One such approach is the nature-inspired computer with interacting building blocks, whose natural evolution — synchronized or not synchronized — can be guided to solve hard computational problems,” Çamsari said. “This is very different from building digital computers with deterministic and precise algorithms.” There are many parallels to quantum computing, he added, which also lets nature do the problem solving.
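To make that idea concrete, here is one minimal, illustrative sketch, not a description of the team’s hardware or algorithms, of how coupled-oscillator dynamics can be steered toward a hard combinatorial problem. The toy graph, couplings and parameters below are assumptions chosen for illustration: edges of a small graph become repulsive couplings between oscillator phases, a second-harmonic term pushes each phase toward one of two values, and reading out the settled phases yields a max-cut partition of the graph.

```python
import numpy as np

# Toy "oscillator-based" max-cut sketch: nodes are phases, connected nodes
# repel (prefer a pi phase difference), and a sin(2*theta) term pushes each
# phase toward 0 or pi so it can be read out as a binary spin.
# Graph, gains and step counts are illustrative assumptions.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
N = 4
J = np.zeros((N, N))
for i, j in edges:
    J[i, j] = J[j, i] = 1.0

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, N)
K, Ks, dt = 1.0, 0.5, 0.05

for _ in range(4000):
    # Repulsive coupling along edges plus the binarizing second-harmonic term
    drift = K * (J * np.sin(theta[:, None] - theta[None, :])).sum(axis=1)
    drift -= Ks * np.sin(2 * theta)
    theta = (theta + dt * drift) % (2 * np.pi)

spins = np.where(np.cos(theta) > 0, 1, -1)   # phase near 0 -> +1, near pi -> -1
cut = sum(spins[i] != spins[j] for i, j in edges)
print("partition:", spins, "edges cut:", cut)
```

For this tiny instance the dynamics settle on the best four-edge cut; scaling this style of computing to larger problems, and doing it efficiently in analog and digital circuits, is closer to the territory the project’s hardware demonstrations aim to explore.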
In addition to tapping into the new and exciting field of natural computing, the researchers also expect to gain a deeper understanding of brain-inspired neural synchrony for the purposes of energy efficiency and robust fault tolerance. In contrast to the continuous power draw of today’s conventional computers, neuromorphic computing architectures aim to mimic the brain’s parallel processing and sparse firing of neurons, which consume energy only when there is input to process. The results could revolutionize a variety of real-world applications where massive amounts of data are constantly being analyzed, such as supply-chain logistics, vehicle routing, healthcare and e-commerce.