njs03332 / ml_study


2023/11/02 ~ 2023/11/09 #75

Open njs03332 opened 1 year ago

njs03332 commented 1 year ago

assign roles -s 1102 -c 1 2 3

njs03332 commented 1 year ago
|         | 0      | 1      | 2      |
|---------|--------|--------|--------|
| member  | 김유리 | 주선미 | 한단비 |
| chapter | 1      | 2      | 3      |
givitallugot commented 1 year ago

15.4.2 Tackling the Short-Term Memory Problem

LSTM cells
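The LSTM cell keeps a long-term state c alongside the short-term state h, with a forget gate deciding what to erase from c, an input gate deciding what to add, and an output gate deciding what to expose. A minimal NumPy sketch of a single step (the merged weight matrix `W` and the gate ordering are illustrative assumptions, not Keras's exact parameter layout):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(x, h, c, W, b, units):
    """One LSTM step.
    W: [input_dim + units, 4 * units], b: [4 * units].
    Assumed gate order: input (i), forget (f), candidate (g), output (o)."""
    z = np.concatenate([x, h]) @ W + b
    i = sigmoid(z[:units])            # input gate: what to add to c
    f = sigmoid(z[units:2 * units])   # forget gate: what to keep in c
    g = np.tanh(z[2 * units:3 * units])  # candidate values
    o = sigmoid(z[3 * units:])        # output gate: what to expose
    c_new = f * c + i * g             # long-term state update
    h_new = o * np.tanh(c_new)        # short-term state = cell output
    return h_new, c_new
```

Because h is the product of a sigmoid gate and a tanh, the output always stays in (-1, 1), while c can grow without a squashing nonlinearity applied directly to it.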

Peephole connections

danbi5228 commented 1 year ago

GRU cells
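The GRU cell is a simplified LSTM: it merges the two state vectors into one and uses an update gate z and a reset gate r. A minimal NumPy sketch of one step, following the book's convention where z controls how much of the previous state is kept (the weight names `Wz`, `Wr`, `Wg` are hypothetical):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_cell_step(x, h, Wz, Wr, Wg, bz, br, bg):
    """One GRU step on a single merged state vector h."""
    z = sigmoid(np.concatenate([x, h]) @ Wz + bz)      # update gate
    r = sigmoid(np.concatenate([x, h]) @ Wr + br)      # reset gate
    g = np.tanh(np.concatenate([x, r * h]) @ Wg + bg)  # candidate state
    # z close to 1 keeps the old state; z close to 0 takes the candidate
    return z * h + (1.0 - z) * g
```

A single gate z thus plays the role of both the LSTM's forget and input gates, which is why the GRU has fewer parameters per unit.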

Using 1D convolutional layers to process sequences

```python
model = keras.models.Sequential([
    # stride 2 downsamples the input sequence by a factor of 2
    keras.layers.Conv1D(filters=20, kernel_size=4, strides=2, padding="valid",
                        input_shape=[None, 1]),
    keras.layers.GRU(20, return_sequences=True),
    keras.layers.GRU(20, return_sequences=True),
    keras.layers.TimeDistributed(keras.layers.Dense(10))
])

# last_time_step_mse evaluates the MSE on the last time step only
model.compile(loss="mse", optimizer="adam", metrics=[last_time_step_mse])
# crop and downsample the targets the same way the Conv1D layer
# transforms the inputs (drop the first 3 steps, keep every other one)
history = model.fit(X_train, Y_train[:, 3::2], epochs=20,
                    validation_data=(X_valid, Y_valid[:, 3::2]))
```

WaveNet
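WaveNet stacks 1D causal convolutional layers whose dilation rate doubles at each layer (1, 2, 4, 8, ...), so the receptive field grows exponentially with depth. A small NumPy sketch of a causal dilated convolution and the resulting receptive field (a deliberate simplification: the real architecture also uses gated activations and skip connections):

```python
import numpy as np

def causal_dilated_conv1d(x, kernel, dilation):
    """Causal 1D convolution: output[t] depends only on
    x[t], x[t - d], ..., x[t - (k-1)*d], via left zero-padding."""
    k = len(kernel)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(kernel[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def receptive_field(kernel_size, dilations):
    """How many past time steps one output can see through the stack."""
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# WaveNet-style stack: kernel size 2, dilations 1, 2, 4, 8 repeated twice
print(receptive_field(2, [1, 2, 4, 8] * 2))  # → 31
```

With kernel size 2 and dilations 1 through 8 repeated twice, each output sees 31 input steps, whereas 8 ordinary (dilation 1) layers would see only 9.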

njs03332 commented 1 year ago

15.4 Handling Long Sequences

15.4.1 Fighting the Unstable Gradients Problem

```python
class LNSimpleRNNCell(keras.layers.Layer):
    def __init__(self, units, activation="tanh", **kwargs):
        super().__init__(**kwargs)
        self.state_size = units
        self.output_size = units
        # disable the inner cell's activation so layer normalization
        # can be applied before the activation function
        self.simple_rnn_cell = keras.layers.SimpleRNNCell(units, activation=None)
        self.layer_norm = keras.layers.LayerNormalization()
        self.activation = keras.activations.get(activation)

    def call(self, inputs, states):
        outputs, new_states = self.simple_rnn_cell(inputs, states)
        norm_outputs = self.activation(self.layer_norm(outputs))
        # for a simple RNN cell, the output and the hidden state are the same
        return norm_outputs, [norm_outputs]

model = keras.models.Sequential([
    keras.layers.RNN(LNSimpleRNNCell(20), return_sequences=True,
                     input_shape=[None, 1]),
    keras.layers.RNN(LNSimpleRNNCell(20), return_sequences=True),
    keras.layers.TimeDistributed(keras.layers.Dense(10))
])
```