Research team develops reconfigurable photonic computing architecture for lifelong learning

Artificial intelligence (AI) tasks have become increasingly abundant and complex, fueled by large-scale datasets. With the plateau of Moore's law and the end of Dennard scaling, energy consumption has become a major barrier to the wider deployment of today's heavyweight electronic deep neural network models, especially in terminal/edge systems.
