A new technique that combines gene therapy and nanotechnology enables mouse and human retinas to detect infrared radiation. Such a method could someday supplement lost vision in patients with macular degeneration or other progressive forms of blindness (Science 2020, DOI: 10.1126/science.aaz5887).
In degenerative eye diseases, the retina’s photoreceptor cells lose their light sensitivity. As the disease progresses, healthy and ailing photoreceptors coexist, causing blurry vision and eventually severe vision loss. Existing therapies can inadvertently damage healthy photoreceptors while trying to restore light sensitivity to impaired ones. For instance, in a technique called optogenetics, now in clinical trials, cells in the retina are made to express a light-sensitive protein that has to be excited by light so bright that it can damage normal photoreceptors.
“So our idea was to shift the spectrum of excitation to the infrared,” says Daniel Hillier of the German Primate Center. By coaxing photoreceptor cells to sense lower-energy infrared light instead of visible light, Hillier, Botond Roska of the Institute of Molecular and Clinical Ophthalmology, and their colleagues avoid damaging functional photoreceptors.
The researchers were inspired by snake species that supplement normal vision with infrared heat sensing to locate prey more precisely. These animals have special heat-activated proteins in the cell membranes of their infrared-detecting organs; the proteins translate heat generated by infrared wavelengths into electrical signals sent to the brain.
To give mammalian retinas similar infrared sensitivity, the team devised a three-part sensor system that is injected into the eye. The first component is engineered DNA that makes photoreceptor cells express a heat-activated protein called a transient receptor potential (TRP) channel. When activated by heat, these proteins produce an electrical signal, similar to the one created when light hits photoreceptors. The other two components are gold nanorods that convert infrared light into heat and antibodies that bind the nanorods to the TRP channels.
The researchers injected these components into the retinas of live mice with a form of inherited degenerative blindness and into cultured human retinas that had lost their light response. Shining infrared light on the retinas triggered an electrical signal in the retinal neurons. In the mice, the researchers also picked up electrical pulses in the visual cortices of the animals’ brains, suggesting that the mice could “see” the infrared light.
Of course, not all objects emit detectable infrared radiation, so Hillier says that a practical system would require special goggles that convert visible light into infrared. “They would capture the environment and project an infrared image onto the retina system,” he says.
“This provides an elegant way of overcoming visual impairment, albeit with a very different visual modality,” says Vincent Rotello, a chemist at the University of Massachusetts Amherst.
Katrin Franke of the University of Tübingen says that “in principle, this approach is similar to conventional optogenetics that uses light-sensitive [proteins] and gene therapy to restore light sensitivity in degenerated retina.” But the new method will require much more careful testing before it can reach patients. Specifically, scientists need to assess the long-term stability and safety of introducing nanoparticles into the eye in a variety of model species, including primates.
Hillier is hopeful that the path to human trials will be smooth, given the new method’s similarity to optogenetics.