CS 182: Deep Learning (CS W182/282A) at UC Berkeley. Designing, Visualizing and Understanding Deep Neural Networks. Lectures: M/W 5:30-7 p.m., via Zoom.
CS W182/282A Resources. The primary resources for this course are the lecture slides, discussion worksheets, and homework assignments on the front page.
CS W182/282A Syllabus. Technology: Piazza. We will use Piazza as the 'one-stop shop' throughout the semester: for a Q&A forum and for official announcements. Enrollment in Piazza is mandatory. If you have questions about anything related to the course, please post them on Piazza rather than emailing the instructor or TAs.
CS 182/282A Designing, Visualizing and Understanding Deep Neural . . . 2 Meta-Learning for Supervised Learning. For supervised meta-learning, models are typically trained over a variety of learning tasks. Each task is associated with a labeled dataset D that contains both feature vectors and true labels. We split each dataset into two parts: Dtr for adapting and a prediction set Dts for evaluating. For example, in the few-shot classification framework, we consider . . .
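The per-task split described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the course materials; the function and variable names (`split_task`, `k_shot`, `d_tr`, `d_ts`) are assumptions for the example.

```python
import random

def split_task(dataset, k_shot):
    """Split one task's labeled dataset into an adaptation set Dtr and a
    prediction set Dts, as in few-shot supervised meta-learning.

    `dataset` is a list of (feature_vector, label) pairs; `k_shot` is the
    number of examples reserved for adapting the model to this task.
    """
    shuffled = dataset[:]          # copy so the caller's list is untouched
    random.shuffle(shuffled)
    d_tr = shuffled[:k_shot]       # used to adapt the model to this task
    d_ts = shuffled[k_shot:]       # held out to evaluate the adapted model
    return d_tr, d_ts

# Toy example: a 5-example task split into a 2-shot adaptation set
# and a 3-example prediction set.
task = [([0.1, 0.2], 0), ([0.3, 0.1], 1), ([0.5, 0.4], 0),
        ([0.9, 0.8], 1), ([0.2, 0.7], 0)]
d_tr, d_ts = split_task(task, k_shot=2)
```

In a full meta-learning loop this split is repeated for every sampled task, with the meta-objective measured on each task's Dts after adapting on its Dtr.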
CS W182/282A Midterm 1, Spring 2021. Monday, March 8; format TBA.
CS 182/282A Designing, Visualizing and Understanding Deep Neural . . . Other tricks to avoid vanishing gradients include using real-valued discriminator outputs as in the Least-Squares GAN (which turns out to be equivalent to minimizing the Pearson χ² divergence under ideal settings), and instance noise, which adds noise to the discriminator's inputs to smooth out the densities of both the real and generated data distributions, improving the learning signal.
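Both tricks above are simple to state in code. The sketch below shows the least-squares discriminator loss and an instance-noise helper; it is an illustrative NumPy sketch with stand-in data, not a full GAN training loop, and the names (`lsgan_d_loss`, `add_instance_noise`, `sigma`) are assumptions for the example.

```python
import numpy as np

def lsgan_d_loss(d_real, d_fake):
    """Least-squares GAN discriminator loss: real-valued scores are pushed
    toward 1 for real inputs and 0 for generated ones. With no sigmoid/log,
    the loss keeps giving gradient even for confidently classified samples."""
    return 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)

def add_instance_noise(batch, sigma, rng):
    """Instance noise: perturb discriminator inputs with Gaussian noise so
    the real and generated densities overlap. `sigma` is typically annealed
    toward 0 over training (schedule omitted here)."""
    return batch + rng.normal(0.0, sigma, size=batch.shape)

# Toy usage with stand-in batches (not real GAN outputs):
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(4, 2))   # pretend batch of real data
fake = rng.normal(3.0, 1.0, size=(4, 2))   # pretend generator samples
real_noisy = add_instance_noise(real, sigma=0.5, rng=rng)
fake_noisy = add_instance_noise(fake, sigma=0.5, rng=rng)
```

A perfect discriminator (scores of 1 on real, 0 on fake) drives `lsgan_d_loss` to zero; in training, both batches would first be passed through `add_instance_noise` before scoring.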