We will be welcoming Chris Kello (University of California-Merced) to give this year’s lecture.
Deep learning has its roots in the "connectionist" neural network modeling that is foundational to the field of cognitive science. The term "deep learning" was coined when connectionist models became powerful enough for industrial-strength applications, most notably in computer vision and speech recognition. Natural language processing was slower to benefit from deep learning, but the transformer neural network, introduced in 2017, advanced sequence processing to a level that few if any researchers could have imagined just fifteen years ago. This talk will briefly review the history of deep learning leading up to the large language models that underlie ChatGPT, including some basics of how they work and how they are changing cognitive science and the world.
Contact: Dr. Damian Kelty-Stephen (keltystd@newpaltz.edu)
(Photo credit: MIT News)