compare_initializers.py
source link: https://gist.github.com/zjwarnes/341e7c60d64c9a5bdc5fe8aba3bcac02
import tensorflow as tf
from tensorflow.keras.optimizers import Adam

# Weight-initialization schemes to compare.
initializers = {
    'Glorot Uniform': tf.keras.initializers.GlorotUniform(),
    'He Uniform': tf.keras.initializers.HeUniform(),
    'Lecun Uniform': tf.keras.initializers.LecunUniform(),
    'Random Uniform': tf.keras.initializers.RandomUniform(),
}

activation_histories = {}
activation_functions = ['relu', 'tanh', 'sigmoid']

# Train one model per (activation, initializer) pair and record its
# training history, then plot all initializers for that activation.
for a_func in activation_functions:
    histories = []
    for name, initializer in initializers.items():
        model = create_model(dense_units, a_func, initializer)
        optimizer = Adam(learning_rate=INIT_LEARNING_RATE)
        model.compile(optimizer=optimizer, loss=loss)
        histories.append(model.fit(
            x=X_train, y=y_train,
            epochs=EPOCHS,
            batch_size=BATCH_SIZE,
            validation_data=(X_test, y_test),
            verbose=0,
        ))
    activation_histories[a_func] = histories
    fig = plot_histories(histories, EPOCHS, list(initializers.keys()), a_func)
    fig.show()
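The four schemes above differ only in the half-width of the uniform interval [-limit, limit] they sample weights from. A minimal pure-Python sketch of those limits, based on the documented Keras formulas (Glorot: sqrt(6 / (fan_in + fan_out)), He: sqrt(6 / fan_in), LeCun: sqrt(3 / fan_in), RandomUniform default: 0.05); the `uniform_limit` and `sample_weights` helpers are illustrative names, not part of the gist:

```python
import math
import random

def uniform_limit(scheme, fan_in, fan_out):
    """Half-width of the uniform interval [-limit, limit] used by the
    corresponding Keras uniform initializer (documented defaults)."""
    if scheme == 'Glorot Uniform':   # tf.keras.initializers.GlorotUniform
        return math.sqrt(6.0 / (fan_in + fan_out))
    if scheme == 'He Uniform':       # tf.keras.initializers.HeUniform
        return math.sqrt(6.0 / fan_in)
    if scheme == 'Lecun Uniform':    # tf.keras.initializers.LecunUniform
        return math.sqrt(3.0 / fan_in)
    if scheme == 'Random Uniform':   # RandomUniform default minval/maxval
        return 0.05
    raise ValueError(f'unknown scheme: {scheme}')

def sample_weights(scheme, fan_in, fan_out, rng=random):
    """Draw a fan_in x fan_out weight matrix for the given scheme."""
    limit = uniform_limit(scheme, fan_in, fan_out)
    return [[rng.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]
```

For example, a 3-in/3-out Glorot layer samples from [-1, 1], while He scales only with fan_in, which is why it pairs well with ReLU activations.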