-
# [CS231n] Lecture 11 Attention and Transformers (1) Review | AI & PSYC
This post is a writeup of Stanford's Attention and Transformers lecture given in May 2022. seq2seq, attention, image captioning, and transformers are among the examples…
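As a quick anchor for the attention part of that list, here is a minimal NumPy sketch of scaled dot-product attention, i.e. softmax(QKᵀ/√d)V. This is the standard formula, not code from the lecture itself:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (n, d) queries/keys; V: (n, d_v) values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numeric stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values
```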
-
Hi all, I've recently been working through some of the CS231n assignments and have a few questions about assignment 1.
[Assignment 1](https://github.com/ccmoony/Stanford-CS231n/blob/main/assignment1/cs231n/classifiers/linear_svm.py) first asks you to compute the `dW` matrix, i.e., the gradient of the weights using the `svm` loss function. (This corresponds to `cs231n\clas…
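For reference, the naive (double-loop) version of that gradient conventionally looks like the sketch below. This is the standard CS231n pattern, not the code from the linked repo:

```python
import numpy as np

def svm_loss_naive(W, X, y, reg):
    """W: (D, C) weights, X: (N, D) data, y: (N,) labels, reg: L2 strength."""
    num_train, num_classes = X.shape[0], W.shape[1]
    loss, dW = 0.0, np.zeros_like(W)
    for i in range(num_train):
        scores = X[i] @ W
        for j in range(num_classes):
            if j == y[i]:
                continue
            margin = scores[j] - scores[y[i]] + 1  # hinge with delta = 1
            if margin > 0:
                loss += margin
                dW[:, j] += X[i]      # violating class column gets +x_i
                dW[:, y[i]] -= X[i]   # correct class column gets -x_i
    loss = loss / num_train + reg * np.sum(W * W)
    dW = dW / num_train + 2 * reg * W
    return loss, dW
```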
-
### What?
- [CS231n: Deep Learning for Computer Vision](http://cs231n.stanford.edu/) assignments self-study
-
When backpropagating with the `rnn_step_backward` written above, why is the `dnext_h` argument `dh[:, t, :] + stepdprev_h`? As in:
stepdx, stepdprev_h, stepdWx, stepdWh, stepdb = rnn_step_backward(dh[:, t, :] + stepdprev_h, cache[t])
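The sum is there because h_t affects the loss along two paths: directly through the output at step t (`dh[:, t, :]`) and through the recurrence into h_{t+1} (the `stepdprev_h` returned by the previous loop iteration); by the total-derivative rule the two gradients add. A minimal sketch of the whole backward loop, assuming a CS231n-style step cache of `(x, prev_h, Wx, Wh, next_h)` with `next_h = tanh(x @ Wx + prev_h @ Wh + b)`:

```python
import numpy as np

def rnn_step_backward(dnext_h, cache):
    # Assumes cache = (x, prev_h, Wx, Wh, next_h) from the forward step.
    x, prev_h, Wx, Wh, next_h = cache
    da = dnext_h * (1 - next_h ** 2)  # backprop through tanh
    return da @ Wx.T, da @ Wh.T, x.T @ da, prev_h.T @ da, da.sum(axis=0)

def rnn_backward(dh, cache):
    N, T, H = dh.shape
    D = cache[0][0].shape[1]
    dx = np.zeros((N, T, D))
    dWx, dWh, db = np.zeros((D, H)), np.zeros((H, H)), np.zeros(H)
    dprev_h = np.zeros((N, H))
    for t in reversed(range(T)):
        # h_t receives gradient from the loss at step t (dh[:, t, :]) AND
        # from the recurrence into step t+1 (dprev_h), so they are summed.
        dx[:, t], dprev_h, dWx_t, dWh_t, db_t = rnn_step_backward(
            dh[:, t, :] + dprev_h, cache[t])
        dWx += dWx_t; dWh += dWh_t; db += db_t
    return dx, dprev_h, dWx, dWh, db
```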
-
Task Q3 from cs231n course (https://cs231n.github.io/assignments2024/assignment1/)
The task formulation and the solution "skeleton" are located here:
https://github.com/bulygin1985/ML_CV_study/blob/m…
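In recent offerings, Q3 of assignment 1 is the Softmax classifier; if that is the task here, a naive loss/gradient implementation conventionally looks like the sketch below (a generic sketch, not the skeleton from the linked repo):

```python
import numpy as np

def softmax_loss_naive(W, X, y, reg):
    """W: (D, C), X: (N, D), y: (N,) integer labels, reg: L2 strength."""
    N = X.shape[0]
    loss, dW = 0.0, np.zeros_like(W)
    for i in range(N):
        scores = X[i] @ W
        scores -= scores.max()                       # numeric stability
        probs = np.exp(scores) / np.exp(scores).sum()
        loss += -np.log(probs[y[i]])                 # cross-entropy term
        # d(loss_i)/d(scores) = probs - one_hot(y_i); chain through X[i].
        dscores = probs.copy()
        dscores[y[i]] -= 1
        dW += np.outer(X[i], dscores)
    loss = loss / N + reg * np.sum(W * W)
    dW = dW / N + 2 * reg * W
    return loss, dW
```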
-
Just so that we don't lose track of it: it would be cool to have a way to try out Oscar without installing Oscar. The canonical candidate would be to have a way to spin up an instance of a jupyter not…
-
# CS231n Deep Learning and Computer Vision
* Lesson 1: History and Introduction to Computer Vision. Date: 18th July
* Lesson 2: KNN and Linear Classifiers. Date: 21st July
* Lesson 3: Linear Classifier Loss Functions and Optimization. Date: 21st July
* Lesson 4: Backpropagation and Introduction to Neural Networks. Date: 25th July
* Lesson 5: Neural Network Train…
-
https://stackoverflow.com/questions/16798888/2-d-convolution-as-a-matrix-matrix-multiplication
https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html
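The first link covers the im2col trick: flatten every sliding window of the input into a column, and the convolution becomes a single matrix product with the flattened kernels. A small sketch checking this against `torch.nn.functional.conv2d` (stride 1, no padding assumed):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 8, 8)   # (N, C_in, H, W)
w = torch.randn(4, 3, 3, 3)   # (C_out, C_in, kH, kW)

# im2col: every 3x3 patch becomes a column -> shape (1, 3*3*3, 36)
cols = F.unfold(x, kernel_size=3)
# Flatten kernels to (4, 27), multiply, reshape to the 6x6 output map.
out = (w.view(4, -1) @ cols).view(1, 4, 6, 6)

print(torch.allclose(out, F.conv2d(x, w), atol=1e-5))  # True
```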
-
https://github.com/naya0000/cs231n/blob/e1192dc8cbaf078c3cfb691e12b8d6d2ec40c8fa/assignment1/cs231n/classifiers/linear_svm.py#L110
Can someone explain why this subtraction is done? An explanation for…
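The linked line isn't quoted above, but in the usual CS231n SVM gradient the subtraction is the correct-class update `dW[:, y[i]] -= X[i]`, and it falls directly out of differentiating the hinge loss. Assuming the standard multiclass SVM loss:

```latex
L_i = \sum_{j \neq y_i} \max(0,\; s_j - s_{y_i} + 1), \quad s = W^{\top} x_i
\qquad\Longrightarrow\qquad
\nabla_{w_{y_i}} L_i = -\Big(\sum_{j \neq y_i} \mathbf{1}[\, s_j - s_{y_i} + 1 > 0 \,]\Big)\, x_i
```

Every violated margin contains $-s_{y_i}$, so each one contributes $-x_i$ to the correct class's weight column: that per-violation subtraction is what the code performs.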