A team at the University of Florida has developed a new kind of computer chip that uses light alongside electricity to perform one of the most power-intensive tasks in artificial intelligence: image recognition and similar pattern-finding operations. Using light dramatically cuts the power these tasks require, making the chip 10 to 100 times more energy-efficient than conventional chips performing the same calculations. This approach could help rein in the enormous electricity demand that is straining power grids, while also enabling higher-performance AI models and systems.