Harvard and Google Collaborate to Create Artificial Brain for Virtual Rat Control
ICARO Media Group
Harvard University and Google's DeepMind AI lab have joined forces to develop an artificial brain that can control a virtual rat model with remarkable accuracy. The researchers' aim was to better understand how natural brains control movement, and the work could pave the way for advances in robotics and the emulation of animal and human motion.
Diego Aldarondo, a graduate student at Harvard involved in the study, highlighted the challenges faced in both hardware and software aspects. Building robots that possess the flexibility, robustness, and energy efficiency of animal bodies has proven difficult for researchers. On the software side, the team encountered hurdles in developing efficient physics simulations and machine learning pipelines to train controllers that could mimic human movement. Additionally, they struggled with the "sim-to-real gap," which refers to the difficulty of transferring control skills learned in simulation to real-life robots.
Under the guidance of Professor Bence Ölveczky from Harvard's Department of Organismic and Evolutionary Biology, Aldarondo and his colleagues created a biologically realistic digital model of a rat. To accomplish this, they collaborated with Google DeepMind, leveraging their tools to train artificial neural networks (ANNs) capable of controlling biomechanical models within physics simulators. Using MuJoCo, a physics simulator that incorporates gravity and other physical forces, along with a training pipeline called Motor IMItation and Control (MIMIC), the researchers trained the ANN on rat behavior using high-resolution data recorded from real rats.
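At its core, a physics simulator like MuJoCo repeatedly integrates the forces acting on a body — gravity included — into new velocities and positions. The sketch below is not DeepMind's code and uses no MuJoCo API; it is a minimal pure-Python illustration of one such simulation loop, for a single point mass dropped onto a ground plane, using the semi-implicit Euler scheme that is also MuJoCo's default integrator.

```python
GRAVITY = -9.81  # m/s^2, acting along the vertical axis

def simulate_fall(height, steps, dt=0.01):
    """Integrate a point mass dropped from `height` under gravity.

    Each iteration mirrors one simulator step: apply forces to update
    velocity, then advance position (semi-implicit Euler).
    """
    z, vz = height, 0.0
    for _ in range(steps):
        vz += GRAVITY * dt   # gravitational acceleration -> velocity
        z += vz * dt         # velocity -> position
        if z <= 0.0:         # clamp at the ground plane
            z = 0.0
            break
    return z, vz

# Dropped from 1 m, the mass reaches the ground in well under 1 s.
z_final, _ = simulate_fall(height=1.0, steps=100)
```

A full engine adds joints, contacts, and muscle actuators on top of this basic integrate-and-step loop; training the controller then amounts to searching for the actuator commands that make the simulated body track the recorded rat data.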
By employing ANNs, the team developed inverse dynamics models that mimic how the brain guides bodily movement from a current physical state to a desired one. Given a target posture, these inverse models produce the muscle activations required to reach it, taking the body's physical attributes into account. The framework lets motor neuroscience explore how movement coordination is learned through experience interacting with the world, while accounting for the body's physical properties.
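The inverse-dynamics idea can be illustrated with a toy example far simpler than a rat body — a 1-D point mass, with all names here illustrative: given the current state and a desired state one step later, the inverse model outputs the force that bridges them, and feeding that force back through the forward dynamics recovers the desired state.

```python
def inverse_dynamics_force(v, v_target, mass, dt):
    """Toy inverse model for a 1-D point mass.

    Returns the constant force that takes the mass from velocity `v`
    to `v_target` in one Euler step. (The study's learned ANN plays
    this role, but maps whole-body states to muscle activations.)
    """
    accel = (v_target - v) / dt   # acceleration required in one step
    return mass * accel

def forward_step(x, v, force, mass, dt):
    """Forward dynamics: advance the point mass by one Euler step."""
    accel = force / mass
    v_new = v + accel * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Inverse model output, pushed through the forward dynamics,
# reproduces the desired velocity.
f = inverse_dynamics_force(v=0.0, v_target=1.0, mass=2.0, dt=0.1)
_, v_after = forward_step(x=0.0, v=0.0, force=f, mass=2.0, dt=0.1)
```

The learned version does the same thing at scale: instead of one analytic formula, a neural network absorbs the body's dynamics from data and maps (current state, desired state) to the activations that realize the transition.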
The virtual rat model, trained on real rat data, learned the forces needed to produce desired movements even for behaviors it had not been explicitly trained on. In an exciting discovery, when the researchers measured neural activity in real rats, they found that activity in the virtual rat's network accurately predicted it. This breakthrough opens up a new frontier of virtual neuroscience, in which AI-simulated animals can be used to study neural circuits and potentially examine how those circuits are compromised in disease.
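One common way to quantify how well a model's activity predicts recorded neural activity is simple correlation between the two traces. The sketch below uses hypothetical firing-rate data and a plain Pearson correlation; the paper's actual analysis is more involved, so treat this only as a minimal illustration of the comparison.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical firing-rate traces: one recorded from a real rat,
# one predicted by the virtual rat's network (made-up numbers).
recorded  = [1.0, 2.0, 3.5, 2.5, 4.0]
predicted = [1.1, 2.2, 3.3, 2.4, 4.2]
r = pearson_r(recorded, predicted)  # close to 1.0 for a good match
```

A correlation near 1.0 indicates the model's internal activity tracks the recorded signal; in practice such comparisons are run across many neurons and trials rather than a single trace.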
Ölveczky, renowned for teaching rats complex behaviors, expressed a keen interest in using these virtual models to help solve challenges faced by real rats. The team plans to employ the virtual rats to test ideas and further our understanding of how real brains generate complex behaviors.
The groundbreaking research, published today in the journal Nature, marks a significant milestone in the field of neuroscience and robotics. The collaboration between Harvard and Google's DeepMind demonstrates the potential for artificial brains to control virtual models that closely resemble their real-life counterparts. This breakthrough not only enhances our understanding of brain function but also opens up new possibilities for studying neural circuits and discovering solutions for real-world challenges.