Hi Awni Y. Hannun, I have been reproducing your paper published in Nature Medicine, and I am confused about your settings for MaxPooling1D.
Here is the relevant function from your network-building code:
Lines 42 to 81 in c97bb96
```python
def resnet_block(
        layer,
        num_filters,
        subsample_length,
        block_index,
        **params):
    from keras.layers import Add
    from keras.layers import MaxPooling1D
    from keras.layers.core import Lambda

    def zeropad(x):
        y = K.zeros_like(x)
        return K.concatenate([x, y], axis=2)

    def zeropad_output_shape(input_shape):
        shape = list(input_shape)
        assert len(shape) == 3
        shape[2] *= 2
        return tuple(shape)

    shortcut = MaxPooling1D(pool_size=subsample_length)(layer)
    zero_pad = (block_index % params["conv_increase_channels_at"]) == 0 \
        and block_index > 0
    if zero_pad is True:
        shortcut = Lambda(zeropad, output_shape=zeropad_output_shape)(shortcut)
    for i in range(params["conv_num_skip"]):
        if not (block_index == 0 and i == 0):
            layer = _bn_relu(
                layer,
                dropout=params["conv_dropout"] if i > 0 else 0,
                **params)
        layer = add_conv_weight(
            layer,
            params["conv_filter_length"],
            num_filters,
            subsample_length if i == 0 else 1,
            **params)
    layer = Add()([shortcut, layer])
    return layer
```
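For context, the `zeropad` helper above doubles the channel dimension of the shortcut by concatenating zeros along axis 2. A minimal NumPy sketch of the same operation (`zeropad_np` and the shapes are illustrative, not from the repo):

```python
import numpy as np

def zeropad_np(x):
    # Mimic the Lambda zeropad: append zeros along the channel axis,
    # doubling the number of channels of a (batch, time, channels) array.
    return np.concatenate([x, np.zeros_like(x)], axis=2)

x = np.ones((4, 16, 32))   # (batch, time, channels)
y = zeropad_np(x)
print(y.shape)             # (4, 16, 64): channel count doubled
print(y[..., 32:].max())   # 0.0: the appended channels are all zeros
```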
You create the shortcut on line 62, where subsample_length can only be 1 or 2 in your settings:
shortcut = MaxPooling1D(pool_size=subsample_length)(layer)
When subsample_length = 1, MaxPooling1D applies a window of size 1 (with stride 1, since the stride defaults to the pool size), so the shortcut is identical to its input.
Is this intended?
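To check the claim that pool_size=1 leaves the input unchanged, here is a small NumPy sketch of 1-D max pooling with stride equal to the pool size (the Keras default); `max_pool_1d` is an illustrative reimplementation for this issue, not code from the repo:

```python
import numpy as np

def max_pool_1d(x, pool_size):
    # Naive 1-D max pooling over the time axis of a (time, channels) array,
    # with stride equal to pool_size (the Keras MaxPooling1D default).
    n = x.shape[0] // pool_size
    return x[:n * pool_size].reshape(n, pool_size, -1).max(axis=1)

x = np.arange(12, dtype=float).reshape(6, 2)   # (time=6, channels=2)
print(np.array_equal(max_pool_1d(x, 1), x))    # True: pool_size=1 is the identity
print(max_pool_1d(x, 2).shape)                 # (3, 2): pool_size=2 halves the time axis
```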