delip / PyTorchNLPBook

Code and data accompanying Natural Language Processing with PyTorch published by O'Reilly Media https://amzn.to/3JUgR2L
Apache License 2.0

Jupyter Notebook kernel keeps dying in the toy dataset problem on a MacBook Pro #29

Open harshithbelagur opened 4 years ago

harshithbelagur commented 4 years ago

All the previous code from Chapter 1 runs fine, but this particular block seems to be causing the issue:

```python
# Imports from earlier cells in the notebook
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim
import matplotlib.pyplot as plt

lr = 0.01
input_dim = 2

batch_size = 1000
n_epochs = 12
n_batches = 5

seed = 1337

torch.manual_seed(seed)
np.random.seed(seed)

perceptron = Perceptron(input_dim=input_dim)
optimizer = optim.Adam(params=perceptron.parameters(), lr=lr)
bce_loss = nn.BCELoss()

losses = []

x_data_static, y_truth_static = get_toy_data(batch_size)
fig, ax = plt.subplots(1, 1, figsize=(10, 5))
visualize_results(perceptron, x_data_static, y_truth_static, ax=ax,
                  title='Initial Model State')
plt.axis('off')
plt.savefig('initial.png')

change = 1.0
last = 10.0
epsilon = 1e-3
epoch = 0
while change > epsilon or epoch < n_epochs or last > 0.3:
# for epoch in range(n_epochs):
    for _ in range(n_batches):
        optimizer.zero_grad()
        x_data, y_target = get_toy_data(batch_size)
        y_pred = perceptron(x_data).squeeze()
        loss = bce_loss(y_pred, y_target)
        loss.backward()
        optimizer.step()

        loss_value = loss.item()
        losses.append(loss_value)

        change = abs(last - loss_value)
        last = loss_value

    fig, ax = plt.subplots(1, 1, figsize=(10, 5))
    visualize_results(perceptron, x_data_static, y_truth_static, ax=ax,
                      epoch=epoch, title=f"{loss_value}; {change}")
    plt.axis('off')
    epoch += 1
    # plt.savefig('epoch{}_toylearning.png'.format(epoch))
```
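For anyone trying to reproduce this outside the notebook: the snippet depends on the `Perceptron` and `get_toy_data` definitions from earlier cells in Chapter 1. A minimal self-contained sketch of those helpers follows; the exact defaults (e.g. the blob centers) are my reconstruction of the chapter's code, not necessarily verbatim:

```python
import numpy as np
import torch
import torch.nn as nn


class Perceptron(nn.Module):
    """A single linear layer followed by a sigmoid, as in Chapter 1."""

    def __init__(self, input_dim):
        super().__init__()
        self.fc1 = nn.Linear(input_dim, 1)

    def forward(self, x_in):
        # Output shape: (batch_size, 1); the training loop squeezes it.
        return torch.sigmoid(self.fc1(x_in))


def get_toy_data(batch_size, left_center=(3, 3), right_center=(3, -2)):
    """Sample 2-D points from two Gaussian blobs; label 1 for the right blob."""
    x_data = []
    y_targets = np.zeros(batch_size, dtype=np.float32)
    for i in range(batch_size):
        if np.random.random() > 0.5:
            x_data.append(np.random.normal(loc=left_center))
        else:
            x_data.append(np.random.normal(loc=right_center))
            y_targets[i] = 1
    return (torch.tensor(np.stack(x_data), dtype=torch.float32),
            torch.tensor(y_targets))
```

With these two definitions in place (plus `visualize_results`), the training block above should run in a plain Python script, which can help rule out the notebook environment as the cause of the kernel crash.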

Would you happen to know what the cause could be?