
Profile

Jurgen Schmidhuber

Director, AI Initiative

KAUST (King Abdullah University of Science and Technology)

Country or State

Switzerland

Bio

Hiring at KAUST, the university with the highest impact per faculty: https://cemse.kaust.edu.sa/ai/hiring-faculty-postdocs-phd-students-ai.

Since age 15 or so, the main goal of Professor Jürgen Schmidhuber has been to build a self-improving Artificial Intelligence (AI) smarter than himself, then retire. His lab's Deep Learning Neural Networks (NNs) have revolutionised machine learning and AI. In 2009, his team's CTC-trained Long Short-Term Memory (LSTM) became the first recurrent NN to win international pattern recognition competitions. In 2010, his lab's fast and deep feedforward NNs on GPUs greatly outperformed previous methods, without using any unsupervised pre-training, a popular deep learning strategy that he pioneered in 1991. In 2011, his team's DanNet became the first feedforward NN to win computer vision contests, achieving superhuman performance. By the mid-2010s, his lab's NNs were on 3 billion devices and used billions of times per day by users of the world's most valuable public companies, e.g., for greatly improved speech recognition on most smartphones, greatly improved machine translation through Google Translate and Facebook (over 4 billion LSTM-based translations per day), Apple's Siri and QuickType on iPhones, the answers of Amazon's Alexa, and numerous other applications. In May 2015, his team published the Highway Net, the first working really deep feedforward NN with hundreds of layers; its open-gated version, called ResNet (Dec 2015), has become the most cited NN of the 21st century, and LSTM the most cited NN of the 20th (Bloomberg called LSTM arguably the most commercial AI achievement). His lab's NNs are heavily used in healthcare and medicine, helping to make human lives longer and healthier.
He introduced metalearning machines that learn to learn (since 1987), unsupervised generative adversarial neural networks that fight each other in a minimax game to implement artificial curiosity (1990), and neural fast weight programmers (1991), formally equivalent to what are now called Transformers with linearized self-attention. His formal theory of creativity, curiosity, and fun explains art, science, music, and humor. He also generalized algorithmic information theory and the many-worlds theory of physics, and introduced the concept of Low-Complexity Art, the information age's extreme form of minimal art. He is the recipient of numerous awards and Chief Scientist of the company NNAISENSE, which aims to build the first practical general-purpose AI. He is a frequent keynote speaker and advises various governments on AI strategy.

Current Position

Director, AI Initiative at KAUST (King Abdullah University of Science and Technology)

Past Talks

Automate 2022