vlab-kaist / NN101_23S

MIT License

[LAB] Week 1_Problem 2_Yungeun Song #35

Closed Diiligent closed 1 year ago

Diiligent commented 1 year ago

Problem

Week 1_Problem 2

Source Code

import torch
from random import random 
from typing import Callable

##                        Problem 2                           ##
##                                                            ##
##       An arbitrary quadratic function will be given.       ##
##  Return the optimal point (global minimum) of the function ##
##     Condition: the highest-order coefficient is positive   ##
##                  Made by @jangyoujin0917                   ##
##                                                            ##

def solution(func: Callable, start_point: float) -> float: # DO NOT MODIFY FUNCTION NAME    
    ### IMPLEMENT FROM HERE

    x = torch.tensor([start_point], dtype=torch.float, requires_grad=True)

    step_num = 20000
    lr = 1e-4

    # Momentum (heavy-ball) state. A nonzero initial velocity lets the
    # iterate keep moving even if the start point has zero gradient.
    velocity = torch.tensor([0.5])
    beta = 0.9

    for _ in range(step_num):
        z = func(x)                      # recompute the forward pass every step
        z.backward(torch.ones_like(z))   # z has shape [1]; pass an explicit gradient
        with torch.no_grad():            # update the parameter without autograd tracking
            velocity = beta * velocity - lr * x.grad
            x += velocity
        x.grad.zero_()

    return float(x)

if __name__ == "__main__":
    def test_func(x): # function for testing; the evaluation function will be different.
        return x ** 2
    t = 10*random()
    print(solution(test_func, t))

Description

Use momentum to avoid getting stuck at a local minimum.

Set the initial velocity to 0.5 for the case where the start point is a local minimum with zero gradient (see the sketch below).
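
For reference, a minimal standalone sketch of the same heavy-ball update, checked against a shifted quadratic; the helper name momentum_descent, its defaults, and the test function (x - 3) ** 2 are illustrative choices, not part of the assignment:

import torch

def momentum_descent(f, x0, lr=1e-4, beta=0.9, v0=0.5, steps=20000):
    # Heavy-ball update: v <- beta * v - lr * f'(x);  x <- x + v
    x = torch.tensor([x0], dtype=torch.float, requires_grad=True)
    v = torch.tensor([v0])
    for _ in range(steps):
        y = f(x)                        # fresh forward pass each iteration
        y.backward(torch.ones_like(y))  # y has shape [1]; pass an explicit gradient
        with torch.no_grad():
            v = beta * v - lr * x.grad
            x += v
        x.grad.zero_()
    return float(x)

print(momentum_descent(lambda x: (x - 3) ** 2, 0.0))  # converges near 3.0

Note that the forward pass runs inside the loop: the gradient is taken at the current iterate each step, rather than backpropagating repeatedly through one stale graph with retain_graph=True.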

Output (Optional)

No response

github-actions[bot] commented 1 year ago

This is an auto-generated grading output. Checking code of youngandtherich {'youngandtherich': 0.0}


Dongyeongkim commented 1 year ago

This issue is now closed due to lack of progress.