Open soumyac1999 opened 1 year ago
A simpler example:
```python
from flexflow.keras.layers import Dense, Concatenate, Input, Add
import flexflow.keras.models
import flexflow.keras.optimizers
import numpy as np

def top_level_task():
    input0 = Input(shape=(16,), dtype="float32")
    input1 = Input(shape=(10,), dtype="float32")
    input2 = Input(shape=(10,), dtype="float32")

    x0 = Dense(10, activation='relu')(input0)
    c0 = Add()([x0, input1])
    c1 = Add()([x0, input2])
    out = Concatenate(1)([c0, c1])

    model = flexflow.keras.models.Model([input0, input1, input2], out)
    opt = flexflow.keras.optimizers.SGD(learning_rate=0.01)
    model.compile(optimizer=opt, loss='mean_squared_error', metrics=['mean_squared_error'])
    print(model.summary())
    model.fit(
        x=[
            np.random.randn(300, 16).astype(np.float32),
            np.random.randn(300, 10).astype(np.float32),
            np.random.randn(300, 10).astype(np.float32),
        ],
        y=np.random.randn(300, 20).astype(np.float32),
    )

if __name__ == '__main__':
    top_level_task()
```
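For reference, the tensor shapes in this graph can be checked with a plain NumPy version of the same forward pass (the weights here are random placeholders, not anything FlexFlow would compute):

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights for the Dense(10) layer
W = rng.standard_normal((16, 10)).astype(np.float32)
b = np.zeros(10, dtype=np.float32)

input0 = rng.standard_normal((300, 16)).astype(np.float32)
input1 = rng.standard_normal((300, 10)).astype(np.float32)
input2 = rng.standard_normal((300, 10)).astype(np.float32)

x0 = np.maximum(input0 @ W + b, 0.0)    # Dense(10, activation='relu')
c0 = x0 + input1                        # Add()([x0, input1])
c1 = x0 + input2                        # Add()([x0, input2])
out = np.concatenate([c0, c1], axis=1)  # Concatenate(1)([c0, c1])

print(out.shape)  # (300, 20), matching the y target shape
```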
And the operator graph: (the dot file is generated in `model.compile`, which fails, so I had to add the Dense node by hand.)
@lockshaw, any updates on this?
Will get to it soon; I'm hoping to have the repository restructuring done first (#562), as it will make modifying and testing search much easier. In the meantime, you should actually be able to work around this by constructing only one Input node. Would that at least unblock you until I can get #562 wrapped up?
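One way to read that workaround: pack the three input arrays into a single tensor and split it back apart inside the model at fixed offsets along the feature axis. The splitting logic is sketched below in plain NumPy; the corresponding in-graph slicing op in FlexFlow's Keras frontend is assumed here, not confirmed:

```python
import numpy as np

# Pack the three inputs (feature widths 16, 10, 10) into one (300, 36) array
a = np.random.randn(300, 16).astype(np.float32)
b = np.random.randn(300, 10).astype(np.float32)
c = np.random.randn(300, 10).astype(np.float32)
packed = np.concatenate([a, b, c], axis=1)  # shape (300, 36)

# Inside the model, recover the pieces by slicing at fixed offsets
a2, b2, c2 = packed[:, :16], packed[:, 16:26], packed[:, 26:]

assert np.array_equal(a, a2)
assert np.array_equal(b, b2)
assert np.array_equal(c, c2)
```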
Running the following code (a hacky implementation of repeat) with the command:

```shell
flexflow_python file.py -ll:py 1 -ll:gpu 1 -ll:fsize 8192 -ll:zsize 12192
```

gives the following error:
Do you know what could be going wrong?
Priority: I'll need either this or elementwise multiplication with broadcasting.
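For the simple tiling case, the broadcasting fallback mentioned here is equivalent to repeat; in NumPy terms (assuming FlexFlow's broadcast semantics match NumPy's):

```python
import numpy as np

x = np.arange(3, dtype=np.float32)  # shape (3,)

# repeat x along a new leading axis of size 4
tiled = np.repeat(x[None, :], 4, axis=0)  # shape (4, 3)

# the same result via elementwise multiply with broadcasting
via_broadcast = np.ones((4, 1), dtype=np.float32) * x

assert np.array_equal(tiled, via_broadcast)
```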