
Welcome Back to SIGAI, Featuring Gradient Descent

Welcome back to SIGAI! We'll start with some administrative items – like how we're running lectures/workshops and what we expect of coordinators, since we'll be holding elections in March. Then we'll go over some math to check everyone's background so you're uber prepared for next week! Once we've covered that, we'll go over Gradient Descent and get a rough idea of how it works – it's integral to almost all of our content this semester.
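
If you'd like a preview before the meeting, here's a minimal sketch of gradient descent in Python. The function, learning rate, and step count below are just illustrative choices, not necessarily what we'll use in the workshop:

```python
# Minimal gradient descent sketch: repeatedly step opposite the gradient.
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # move a little downhill each step
    return x

# Example: minimize f(x) = (x - 3)^2, whose derivative is 2 * (x - 3).
print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))  # prints a value close to 3.0
```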

Introductions & an Intro to Neural Networks

Welcome back to SIGAI! 😃 Tonight we'll go over some changes that happened over the summer and how we'll handle things moving forward, then dive into our classic first lecture/workshop series, An Intro to Neural Nets. This time, though, we'll go into significantly more depth, both historically and mathematically, than we have in the past. See you there!

Getting Started With Neural Networks

You've heard about them: beating humans at all kinds of games, driving cars, and recommending your next Netflix series. But what ARE neural networks? In this lecture, you'll learn, step by step, how neural networks work and how they learn. Then you'll deploy one yourself!
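
As a teaser, here's a tiny neural network written from scratch with NumPy and trained on XOR with plain gradient descent. This is only an illustrative sketch – the architecture, learning rate, and iteration count are arbitrary choices, not the exact network we'll build in the workshop:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units, one sigmoid output.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (gradients of the squared error).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # should end up close to [[0], [1], [1], [0]]
```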