2020-04-29 08:13:05.454162: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2020-04-29 08:13:06.223964: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties: name: Quadro P6000 major: 6 minor: 1 memoryClockRate(GHz): 1.645 pciBusID: 0000:3b:00.0
2020-04-29 08:13:06.224334: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-04-29 08:13:06.226473: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2020-04-29 08:13:06.228330: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2020-04-29 08:13:06.228770: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2020-04-29 08:13:06.231167: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2020-04-29 08:13:06.233015: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2020-04-29 08:13:06.238739: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-04-29 08:13:06.241962: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0
2020-04-29 08:13:15.254047: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
2020-04-29 08:13:15.429416: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x52a0630 executing computations on platform CUDA. Devices:
2020-04-29 08:13:15.429531: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Quadro P6000, Compute Capability 6.1
2020-04-29 08:13:15.471651: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 1700000000 Hz
2020-04-29 08:13:15.472741: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x526fa60 executing computations on platform Host. Devices:
2020-04-29 08:13:15.472783: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): <undefined>, <undefined>
2020-04-29 08:13:15.474633: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties: name: Quadro P6000 major: 6 minor: 1 memoryClockRate(GHz): 1.645 pciBusID: 0000:3b:00.0
2020-04-29 08:13:15.474837: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-04-29 08:13:15.474863: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2020-04-29 08:13:15.474884: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2020-04-29 08:13:15.474908: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2020-04-29 08:13:15.474927: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2020-04-29 08:13:15.474950: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2020-04-29 08:13:15.474972: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-04-29 08:13:15.477989: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0
2020-04-29 08:13:15.478712: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-04-29 08:13:15.481884: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1159] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-04-29 08:13:15.481905: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1165] 0
2020-04-29 08:13:15.481915: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1178] 0: N
2020-04-29 08:13:15.486846: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1304] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 23068 MB memory) -> physical GPU (device: 0, name: Quadro P6000, pci bus id: 0000:3b:00.0, compute capability: 6.1)

Start 2020-04-29 08:13:06.243483
Parameters:
contact_or_dist_or_binned binned
num_chains_for_training -1
file_weights binned256.hdf5
training_window 128
training_epochs 8
arch_depth 128
filters_per_layer 64
pad_size 10
batch_size 2
dir_dataset ../../data/
Number of bins 34
Actual bins: {0: '0.0 4.0', 1: '4.0 4.2', 2: '4.2 4.4', 3: '4.4 4.6', 4: '4.6 4.8', 5: '4.8 5.0', 6: '5.0 5.2', 7: '5.2 5.4', 8: '5.4 5.6', 9: '5.6 5.8', 10: '5.8 6.0', 11: '6.0 6.2', 12: '6.2 6.4', 13: '6.4 6.6', 14: '6.6 6.8', 15: '6.8 7.0', 16: '7.0 7.2', 17: '7.2 7.4', 18: '7.4 7.6', 19: '7.6 7.8', 20: '7.8 8.0', 21: '8.0 8.4', 22: '8.4 9.0', 23: '9.0 9.8', 24: '9.8 10.8', 25: '10.8 12.0', 26: '12.0 13.4', 27: '13.4 15.0', 28: '15.0 16.8', 29: '16.8 18.8', 30: '18.8 21.0', 31: '21.0 23.4', 32: '23.4 26.0', 33: '26.0 1000.0'}
Split into training and validation set..
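As a point of reference only, the 34 bins listed above can be reproduced with a short NumPy sketch. The helper below is hypothetical (the training script's own binning routine is not part of this log) and assumes each bin k covers the half-open interval [lower, upper) printed in the dictionary.

import numpy as np

# Interior bin boundaries implied by the "Actual bins" dictionary above:
# bin 0 is [0.0, 4.0), bins 1-20 are 0.2 wide up to 8.0, the bins then widen,
# and bin 33 is the catch-all [26.0, 1000.0).
BIN_EDGES = np.array(
    [4.0, 4.2, 4.4, 4.6, 4.8, 5.0, 5.2, 5.4, 5.6, 5.8,
     6.0, 6.2, 6.4, 6.6, 6.8, 7.0, 7.2, 7.4, 7.6, 7.8,
     8.0, 8.4, 9.0, 9.8, 10.8, 12.0, 13.4, 15.0, 16.8,
     18.8, 21.0, 23.4, 26.0])

def distance_map_to_bins(dist_map):
    """Hypothetical helper: map an L x L matrix of inter-residue distances
    to one-hot labels over the 34 bins listed in the log."""
    idx = np.digitize(dist_map, BIN_EDGES)                     # (L, L) ints in [0, 33]
    return np.eye(len(BIN_EDGES) + 1, dtype=np.float32)[idx]   # (L, L, 34) one-hot

# Example: a 128 x 128 crop of distances becomes a (128, 128, 34) target.
y = distance_map_to_bins(np.random.uniform(2.0, 30.0, size=(128, 128)))
assert y.shape == (128, 128, 34)

Applied per example, this yields a (128, 128, 34) one-hot target, consistent with the Y shape (2, 128, 128, 34) reported below for a batch of 2.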
Total validation proteins : 100
Total training proteins : 3356
Validation proteins: ['3aa0A', '5cs0A', '2z51A', '3q64A', '4p3aA', '1nkzA', '4d5tA', '1jogA', '4u89A', '3wisA', '1mkfA', '1kaeA', '4o8bA', '2gfqA', '4cmlA', '1vrmA', '4kwyA', '2yfaA', '1tnfA', '3pivA', '1nw1A', '4ng0A', '4qt9A', '3no4A', '3qd7X', '3o4pA', '3agnA', '2yilA', '4jivD', '4fz4A', '2g3vA', '2o0qA', '4levA', '1t6t1', '2fzpA', '1rj8A', '2z7fI', '1dx5I', '2gsoA', '4tshB', '3vtoA', '2p3yA', '3pcvA', '3kfoA', '3v6iB', '4htiA', '2q0tA', '4l3uA', '4pt1A', '3c1qA', '1ux5A', '1h9mA', '3oufA', '4rt5A', '3njcA', '2q73A', '1yz1A', '5c50B', '3hnxA', '1knyA', '1tr8A', '4qicB', '3f1iS', '2fyuI', '4ic9A', '3iruA', '2xu8A', '3g7lA', '3hshA', '1vq0A', '4z04A', '2huhA', '4dh2B', '2p9xA', '1m9zA', '2czrA', '2bh1X', '3ghfA', '4ui1C', '2otaA', '1vk1A', '1su1A', '3sfvB', '2f5tX', '1xyiA', '2okuA', '2h5nA', '1kptA', '2qc5A', '2hueC', '2i5vO', '2gs5A', '4lmoA', '4njcA', '3ronA', '3g1pA', '4m8aA', '3ajfA', '1j8uA', '4u65A']
len(train_generator) : 1678
len(valid_generator) : 50
Actual shape of X : (2, 128, 128, 57)
Actual shape of Y : (2, 128, 128, 34)
Channel summaries:
Channel     Avg     Max      Sum
      1  0.3450  1.0000   5652.9
      2  0.3497  1.0000   5730.2
      3  0.5649  0.9971   9255.5
      4  0.5604  0.9971   9181.0
      5  0.0213  0.1790    349.7
      6  0.0212  0.1790    346.9
      7  0.0591  0.8262    968.6
      8  0.0589  0.8262    964.8
      9  0.0292  0.8042    477.9
     10  0.0289  0.8042    474.0
     11  0.0519  0.4890    849.9
     12  0.0515  0.4890    843.1
     13  0.0764  0.9673   1251.6
     14  0.0758  0.9673   1241.5
     15  0.0077  0.1740    126.7
     16  0.0077  0.1740    125.7
     17  0.0235  0.4348    384.2
     18  0.0233  0.4348    381.1
     19  0.0610  0.5542    999.6
     20  0.0605  0.5542    991.5
     21  0.0232  0.2771    380.2
     22  0.0230  0.2771    377.1
     23  0.0101  0.1631    165.8
     24  0.0100  0.1631    164.5
     25  0.0580  0.6577    949.7
     26  0.0575  0.6577    942.0
     27  0.1035  0.9673   1696.5
     28  0.1027  0.9673   1682.8
     29  0.0518  0.8911    849.2
     30  0.0514  0.8911    842.4
     31  0.0216  0.4021    354.5
     32  0.0215  0.4021    351.7
     33  0.0400  0.9238    655.8
     34  0.0397  0.9238    650.5
     35  0.0111  0.7393    182.0
     36  0.0110  0.7393    180.5
     37  0.0588  0.5815    963.9
     38  0.0584  0.5815    956.2
     39  0.0192  0.2119    314.1
     40  0.0190  0.2119    311.6
     41  0.0133  0.7393    217.7
     42  0.0132  0.7393    216.0
     43  0.0232  0.7280    380.8
     44  0.0231  0.7280    377.7
     45  0.0355  0.3750    582.3
     46  0.0353  0.3750    577.7
     47  0.1528  0.9673   2502.9
     48  0.1588  0.9673   2601.7
     49  1.4347  2.5293  23506.7
     50  1.4242  2.5293  23334.8
     51  0.2559  0.7490   4192.3
     52  0.2605  0.8931   4268.3
     53  0.7907  0.9546  12955.0
     54  0.7894  0.9546  12933.4
     55  0.0333  0.0927    545.7
     56  0.0013  1.9990     21.3
     57 -0.0044  0.6069    -72.1
Ymin = 0.00 Ymean = 0.03 Ymax = 1.00
Build a model..
Model params: L 128 num_blocks 128 width 64 expected_n_channels 57
Compile model..
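The Keras summary that follows shows the resulting network: a BatchNormalization -> Activation -> 1x1 Conv2D stem (57 -> 64 channels, 3,712 params) followed by a long chain of identical residual blocks. As a rough, non-authoritative sketch of a builder that would reproduce this layer pattern (the ReLU activation, the dropout rate, the output head, and the exact meaning of num_blocks are assumptions; none of them are visible in this excerpt):

import tensorflow as tf
from tensorflow.keras import layers

def build_binned_model(L=128, num_blocks=128, width=64,
                       expected_n_channels=57, n_bins=34, dropout_rate=0.1):
    """Sketch only: residual tower matching the layer pattern in the summary below.

    Stem: BN -> ReLU -> 1x1 Conv (57 -> 64 channels, 3,712 params).
    Block: BN -> ReLU -> 3x3 Conv -> Dropout -> ReLU -> 3x3 Conv -> Add.
    Whether num_blocks counts residual blocks or individual conv layers cannot
    be confirmed from this log excerpt; the softmax head is likewise assumed.
    """
    inputs = layers.Input(shape=(L, L, expected_n_channels))    # input_1
    x = layers.BatchNormalization()(inputs)                     # 4 * 57 = 228 params
    x = layers.Activation('relu')(x)
    x = layers.Conv2D(width, 1, padding='same')(x)              # 1*1*57*64 + 64 = 3,712 params
    for _ in range(num_blocks):
        y = layers.BatchNormalization()(x)                      # 4 * 64 = 256 params
        y = layers.Activation('relu')(y)
        y = layers.Conv2D(width, 3, padding='same')(y)          # 3*3*64*64 + 64 = 36,928 params
        y = layers.Dropout(dropout_rate)(y)
        y = layers.Activation('relu')(y)
        y = layers.Conv2D(width, 3, padding='same')(y)          # another 36,928 params
        x = layers.Add()([y, x])                                # residual connection
    out = layers.Conv2D(n_bins, 3, padding='same', activation='softmax')(x)  # assumed head
    return tf.keras.Model(inputs, out)

model = build_binned_model()
model.compile(optimizer='adam', loss='categorical_crossentropy')

Because the residual Add keeps the width fixed at 64 channels, every block contributes the same 256 + 36,928 + 36,928 trainable parameters, which is exactly the repetition visible in the summary.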
Model: "model" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 128, 128, 57 0 __________________________________________________________________________________________________ batch_normalization (BatchNorma (None, 128, 128, 57) 228 input_1[0][0] __________________________________________________________________________________________________ activation (Activation) (None, 128, 128, 57) 0 batch_normalization[0][0] __________________________________________________________________________________________________ conv2d (Conv2D) (None, 128, 128, 64) 3712 activation[0][0] __________________________________________________________________________________________________ batch_normalization_1 (BatchNor (None, 128, 128, 64) 256 conv2d[0][0] __________________________________________________________________________________________________ activation_1 (Activation) (None, 128, 128, 64) 0 batch_normalization_1[0][0] __________________________________________________________________________________________________ conv2d_1 (Conv2D) (None, 128, 128, 64) 36928 activation_1[0][0] __________________________________________________________________________________________________ dropout (Dropout) (None, 128, 128, 64) 0 conv2d_1[0][0] __________________________________________________________________________________________________ activation_2 (Activation) (None, 128, 128, 64) 0 dropout[0][0] __________________________________________________________________________________________________ conv2d_2 (Conv2D) (None, 128, 128, 64) 36928 activation_2[0][0] __________________________________________________________________________________________________ add (Add) (None, 128, 128, 64) 0 conv2d_2[0][0] conv2d[0][0] __________________________________________________________________________________________________ batch_normalization_2 (BatchNor (None, 128, 128, 64) 256 add[0][0] __________________________________________________________________________________________________ activation_3 (Activation) (None, 128, 128, 64) 0 batch_normalization_2[0][0] __________________________________________________________________________________________________ conv2d_3 (Conv2D) (None, 128, 128, 64) 36928 activation_3[0][0] __________________________________________________________________________________________________ dropout_1 (Dropout) (None, 128, 128, 64) 0 conv2d_3[0][0] __________________________________________________________________________________________________ activation_4 (Activation) (None, 128, 128, 64) 0 dropout_1[0][0] __________________________________________________________________________________________________ conv2d_4 (Conv2D) (None, 128, 128, 64) 36928 activation_4[0][0] __________________________________________________________________________________________________ add_1 (Add) (None, 128, 128, 64) 0 conv2d_4[0][0] add[0][0] __________________________________________________________________________________________________ batch_normalization_3 (BatchNor (None, 128, 128, 64) 256 add_1[0][0] __________________________________________________________________________________________________ activation_5 (Activation) (None, 128, 128, 64) 0 batch_normalization_3[0][0] __________________________________________________________________________________________________ conv2d_5 
(Conv2D) (None, 128, 128, 64) 36928 activation_5[0][0] __________________________________________________________________________________________________ dropout_2 (Dropout) (None, 128, 128, 64) 0 conv2d_5[0][0] __________________________________________________________________________________________________ activation_6 (Activation) (None, 128, 128, 64) 0 dropout_2[0][0] __________________________________________________________________________________________________ conv2d_6 (Conv2D) (None, 128, 128, 64) 36928 activation_6[0][0] __________________________________________________________________________________________________ add_2 (Add) (None, 128, 128, 64) 0 conv2d_6[0][0] add_1[0][0] __________________________________________________________________________________________________ batch_normalization_4 (BatchNor (None, 128, 128, 64) 256 add_2[0][0] __________________________________________________________________________________________________ activation_7 (Activation) (None, 128, 128, 64) 0 batch_normalization_4[0][0] __________________________________________________________________________________________________ conv2d_7 (Conv2D) (None, 128, 128, 64) 36928 activation_7[0][0] __________________________________________________________________________________________________ dropout_3 (Dropout) (None, 128, 128, 64) 0 conv2d_7[0][0] __________________________________________________________________________________________________ activation_8 (Activation) (None, 128, 128, 64) 0 dropout_3[0][0] __________________________________________________________________________________________________ conv2d_8 (Conv2D) (None, 128, 128, 64) 36928 activation_8[0][0] __________________________________________________________________________________________________ add_3 (Add) (None, 128, 128, 64) 0 conv2d_8[0][0] add_2[0][0] __________________________________________________________________________________________________ batch_normalization_5 (BatchNor (None, 128, 128, 64) 256 add_3[0][0] __________________________________________________________________________________________________ activation_9 (Activation) (None, 128, 128, 64) 0 batch_normalization_5[0][0] __________________________________________________________________________________________________ conv2d_9 (Conv2D) (None, 128, 128, 64) 36928 activation_9[0][0] __________________________________________________________________________________________________ dropout_4 (Dropout) (None, 128, 128, 64) 0 conv2d_9[0][0] __________________________________________________________________________________________________ activation_10 (Activation) (None, 128, 128, 64) 0 dropout_4[0][0] __________________________________________________________________________________________________ conv2d_10 (Conv2D) (None, 128, 128, 64) 36928 activation_10[0][0] __________________________________________________________________________________________________ add_4 (Add) (None, 128, 128, 64) 0 conv2d_10[0][0] add_3[0][0] __________________________________________________________________________________________________ batch_normalization_6 (BatchNor (None, 128, 128, 64) 256 add_4[0][0] __________________________________________________________________________________________________ activation_11 (Activation) (None, 128, 128, 64) 0 batch_normalization_6[0][0] __________________________________________________________________________________________________ conv2d_11 (Conv2D) (None, 128, 128, 64) 36928 activation_11[0][0] 
__________________________________________________________________________________________________ dropout_5 (Dropout) (None, 128, 128, 64) 0 conv2d_11[0][0] __________________________________________________________________________________________________ activation_12 (Activation) (None, 128, 128, 64) 0 dropout_5[0][0] __________________________________________________________________________________________________ conv2d_12 (Conv2D) (None, 128, 128, 64) 36928 activation_12[0][0] __________________________________________________________________________________________________ add_5 (Add) (None, 128, 128, 64) 0 conv2d_12[0][0] add_4[0][0] __________________________________________________________________________________________________ batch_normalization_7 (BatchNor (None, 128, 128, 64) 256 add_5[0][0] __________________________________________________________________________________________________ activation_13 (Activation) (None, 128, 128, 64) 0 batch_normalization_7[0][0] __________________________________________________________________________________________________ conv2d_13 (Conv2D) (None, 128, 128, 64) 36928 activation_13[0][0] __________________________________________________________________________________________________ dropout_6 (Dropout) (None, 128, 128, 64) 0 conv2d_13[0][0] __________________________________________________________________________________________________ activation_14 (Activation) (None, 128, 128, 64) 0 dropout_6[0][0] __________________________________________________________________________________________________ conv2d_14 (Conv2D) (None, 128, 128, 64) 36928 activation_14[0][0] __________________________________________________________________________________________________ add_6 (Add) (None, 128, 128, 64) 0 conv2d_14[0][0] add_5[0][0] __________________________________________________________________________________________________ batch_normalization_8 (BatchNor (None, 128, 128, 64) 256 add_6[0][0] __________________________________________________________________________________________________ activation_15 (Activation) (None, 128, 128, 64) 0 batch_normalization_8[0][0] __________________________________________________________________________________________________ conv2d_15 (Conv2D) (None, 128, 128, 64) 36928 activation_15[0][0] __________________________________________________________________________________________________ dropout_7 (Dropout) (None, 128, 128, 64) 0 conv2d_15[0][0] __________________________________________________________________________________________________ activation_16 (Activation) (None, 128, 128, 64) 0 dropout_7[0][0] __________________________________________________________________________________________________ conv2d_16 (Conv2D) (None, 128, 128, 64) 36928 activation_16[0][0] __________________________________________________________________________________________________ add_7 (Add) (None, 128, 128, 64) 0 conv2d_16[0][0] add_6[0][0] __________________________________________________________________________________________________ batch_normalization_9 (BatchNor (None, 128, 128, 64) 256 add_7[0][0] __________________________________________________________________________________________________ activation_17 (Activation) (None, 128, 128, 64) 0 batch_normalization_9[0][0] __________________________________________________________________________________________________ conv2d_17 (Conv2D) (None, 128, 128, 64) 36928 activation_17[0][0] 
__________________________________________________________________________________________________ dropout_8 (Dropout) (None, 128, 128, 64) 0 conv2d_17[0][0] __________________________________________________________________________________________________ activation_18 (Activation) (None, 128, 128, 64) 0 dropout_8[0][0] __________________________________________________________________________________________________ conv2d_18 (Conv2D) (None, 128, 128, 64) 36928 activation_18[0][0] __________________________________________________________________________________________________ add_8 (Add) (None, 128, 128, 64) 0 conv2d_18[0][0] add_7[0][0] __________________________________________________________________________________________________ batch_normalization_10 (BatchNo (None, 128, 128, 64) 256 add_8[0][0] __________________________________________________________________________________________________ activation_19 (Activation) (None, 128, 128, 64) 0 batch_normalization_10[0][0] __________________________________________________________________________________________________ conv2d_19 (Conv2D) (None, 128, 128, 64) 36928 activation_19[0][0] __________________________________________________________________________________________________ dropout_9 (Dropout) (None, 128, 128, 64) 0 conv2d_19[0][0] __________________________________________________________________________________________________ activation_20 (Activation) (None, 128, 128, 64) 0 dropout_9[0][0] __________________________________________________________________________________________________ conv2d_20 (Conv2D) (None, 128, 128, 64) 36928 activation_20[0][0] __________________________________________________________________________________________________ add_9 (Add) (None, 128, 128, 64) 0 conv2d_20[0][0] add_8[0][0] __________________________________________________________________________________________________ batch_normalization_11 (BatchNo (None, 128, 128, 64) 256 add_9[0][0] __________________________________________________________________________________________________ activation_21 (Activation) (None, 128, 128, 64) 0 batch_normalization_11[0][0] __________________________________________________________________________________________________ conv2d_21 (Conv2D) (None, 128, 128, 64) 36928 activation_21[0][0] __________________________________________________________________________________________________ dropout_10 (Dropout) (None, 128, 128, 64) 0 conv2d_21[0][0] __________________________________________________________________________________________________ activation_22 (Activation) (None, 128, 128, 64) 0 dropout_10[0][0] __________________________________________________________________________________________________ conv2d_22 (Conv2D) (None, 128, 128, 64) 36928 activation_22[0][0] __________________________________________________________________________________________________ add_10 (Add) (None, 128, 128, 64) 0 conv2d_22[0][0] add_9[0][0] __________________________________________________________________________________________________ batch_normalization_12 (BatchNo (None, 128, 128, 64) 256 add_10[0][0] __________________________________________________________________________________________________ activation_23 (Activation) (None, 128, 128, 64) 0 batch_normalization_12[0][0] __________________________________________________________________________________________________ conv2d_23 (Conv2D) (None, 128, 128, 64) 36928 activation_23[0][0] 
__________________________________________________________________________________________________ dropout_11 (Dropout) (None, 128, 128, 64) 0 conv2d_23[0][0] __________________________________________________________________________________________________ activation_24 (Activation) (None, 128, 128, 64) 0 dropout_11[0][0] __________________________________________________________________________________________________ conv2d_24 (Conv2D) (None, 128, 128, 64) 36928 activation_24[0][0] __________________________________________________________________________________________________ add_11 (Add) (None, 128, 128, 64) 0 conv2d_24[0][0] add_10[0][0] __________________________________________________________________________________________________ batch_normalization_13 (BatchNo (None, 128, 128, 64) 256 add_11[0][0] __________________________________________________________________________________________________ activation_25 (Activation) (None, 128, 128, 64) 0 batch_normalization_13[0][0] __________________________________________________________________________________________________ conv2d_25 (Conv2D) (None, 128, 128, 64) 36928 activation_25[0][0] __________________________________________________________________________________________________ dropout_12 (Dropout) (None, 128, 128, 64) 0 conv2d_25[0][0] __________________________________________________________________________________________________ activation_26 (Activation) (None, 128, 128, 64) 0 dropout_12[0][0] __________________________________________________________________________________________________ conv2d_26 (Conv2D) (None, 128, 128, 64) 36928 activation_26[0][0] __________________________________________________________________________________________________ add_12 (Add) (None, 128, 128, 64) 0 conv2d_26[0][0] add_11[0][0] __________________________________________________________________________________________________ batch_normalization_14 (BatchNo (None, 128, 128, 64) 256 add_12[0][0] __________________________________________________________________________________________________ activation_27 (Activation) (None, 128, 128, 64) 0 batch_normalization_14[0][0] __________________________________________________________________________________________________ conv2d_27 (Conv2D) (None, 128, 128, 64) 36928 activation_27[0][0] __________________________________________________________________________________________________ dropout_13 (Dropout) (None, 128, 128, 64) 0 conv2d_27[0][0] __________________________________________________________________________________________________ activation_28 (Activation) (None, 128, 128, 64) 0 dropout_13[0][0] __________________________________________________________________________________________________ conv2d_28 (Conv2D) (None, 128, 128, 64) 36928 activation_28[0][0] __________________________________________________________________________________________________ add_13 (Add) (None, 128, 128, 64) 0 conv2d_28[0][0] add_12[0][0] __________________________________________________________________________________________________ batch_normalization_15 (BatchNo (None, 128, 128, 64) 256 add_13[0][0] __________________________________________________________________________________________________ activation_29 (Activation) (None, 128, 128, 64) 0 batch_normalization_15[0][0] __________________________________________________________________________________________________ conv2d_29 (Conv2D) (None, 128, 128, 64) 36928 activation_29[0][0] 
__________________________________________________________________________________________________ dropout_14 (Dropout) (None, 128, 128, 64) 0 conv2d_29[0][0] __________________________________________________________________________________________________ activation_30 (Activation) (None, 128, 128, 64) 0 dropout_14[0][0] __________________________________________________________________________________________________ conv2d_30 (Conv2D) (None, 128, 128, 64) 36928 activation_30[0][0] __________________________________________________________________________________________________ add_14 (Add) (None, 128, 128, 64) 0 conv2d_30[0][0] add_13[0][0] __________________________________________________________________________________________________ batch_normalization_16 (BatchNo (None, 128, 128, 64) 256 add_14[0][0] __________________________________________________________________________________________________ activation_31 (Activation) (None, 128, 128, 64) 0 batch_normalization_16[0][0] __________________________________________________________________________________________________ conv2d_31 (Conv2D) (None, 128, 128, 64) 36928 activation_31[0][0] __________________________________________________________________________________________________ dropout_15 (Dropout) (None, 128, 128, 64) 0 conv2d_31[0][0] __________________________________________________________________________________________________ activation_32 (Activation) (None, 128, 128, 64) 0 dropout_15[0][0] __________________________________________________________________________________________________ conv2d_32 (Conv2D) (None, 128, 128, 64) 36928 activation_32[0][0] __________________________________________________________________________________________________ add_15 (Add) (None, 128, 128, 64) 0 conv2d_32[0][0] add_14[0][0] __________________________________________________________________________________________________ batch_normalization_17 (BatchNo (None, 128, 128, 64) 256 add_15[0][0] __________________________________________________________________________________________________ activation_33 (Activation) (None, 128, 128, 64) 0 batch_normalization_17[0][0] __________________________________________________________________________________________________ conv2d_33 (Conv2D) (None, 128, 128, 64) 36928 activation_33[0][0] __________________________________________________________________________________________________ dropout_16 (Dropout) (None, 128, 128, 64) 0 conv2d_33[0][0] __________________________________________________________________________________________________ activation_34 (Activation) (None, 128, 128, 64) 0 dropout_16[0][0] __________________________________________________________________________________________________ conv2d_34 (Conv2D) (None, 128, 128, 64) 36928 activation_34[0][0] __________________________________________________________________________________________________ add_16 (Add) (None, 128, 128, 64) 0 conv2d_34[0][0] add_15[0][0] __________________________________________________________________________________________________ batch_normalization_18 (BatchNo (None, 128, 128, 64) 256 add_16[0][0] __________________________________________________________________________________________________ activation_35 (Activation) (None, 128, 128, 64) 0 batch_normalization_18[0][0] __________________________________________________________________________________________________ conv2d_35 (Conv2D) (None, 128, 128, 64) 36928 activation_35[0][0] 
__________________________________________________________________________________________________ dropout_17 (Dropout) (None, 128, 128, 64) 0 conv2d_35[0][0] __________________________________________________________________________________________________ activation_36 (Activation) (None, 128, 128, 64) 0 dropout_17[0][0] __________________________________________________________________________________________________ conv2d_36 (Conv2D) (None, 128, 128, 64) 36928 activation_36[0][0] __________________________________________________________________________________________________ add_17 (Add) (None, 128, 128, 64) 0 conv2d_36[0][0] add_16[0][0] __________________________________________________________________________________________________ batch_normalization_19 (BatchNo (None, 128, 128, 64) 256 add_17[0][0] __________________________________________________________________________________________________ activation_37 (Activation) (None, 128, 128, 64) 0 batch_normalization_19[0][0] __________________________________________________________________________________________________ conv2d_37 (Conv2D) (None, 128, 128, 64) 36928 activation_37[0][0] __________________________________________________________________________________________________ dropout_18 (Dropout) (None, 128, 128, 64) 0 conv2d_37[0][0] __________________________________________________________________________________________________ activation_38 (Activation) (None, 128, 128, 64) 0 dropout_18[0][0] __________________________________________________________________________________________________ conv2d_38 (Conv2D) (None, 128, 128, 64) 36928 activation_38[0][0] __________________________________________________________________________________________________ add_18 (Add) (None, 128, 128, 64) 0 conv2d_38[0][0] add_17[0][0] __________________________________________________________________________________________________ batch_normalization_20 (BatchNo (None, 128, 128, 64) 256 add_18[0][0] __________________________________________________________________________________________________ activation_39 (Activation) (None, 128, 128, 64) 0 batch_normalization_20[0][0] __________________________________________________________________________________________________ conv2d_39 (Conv2D) (None, 128, 128, 64) 36928 activation_39[0][0] __________________________________________________________________________________________________ dropout_19 (Dropout) (None, 128, 128, 64) 0 conv2d_39[0][0] __________________________________________________________________________________________________ activation_40 (Activation) (None, 128, 128, 64) 0 dropout_19[0][0] __________________________________________________________________________________________________ conv2d_40 (Conv2D) (None, 128, 128, 64) 36928 activation_40[0][0] __________________________________________________________________________________________________ add_19 (Add) (None, 128, 128, 64) 0 conv2d_40[0][0] add_18[0][0] __________________________________________________________________________________________________ batch_normalization_21 (BatchNo (None, 128, 128, 64) 256 add_19[0][0] __________________________________________________________________________________________________ activation_41 (Activation) (None, 128, 128, 64) 0 batch_normalization_21[0][0] __________________________________________________________________________________________________ conv2d_41 (Conv2D) (None, 128, 128, 64) 36928 activation_41[0][0] 
__________________________________________________________________________________________________ dropout_20 (Dropout) (None, 128, 128, 64) 0 conv2d_41[0][0] __________________________________________________________________________________________________ activation_42 (Activation) (None, 128, 128, 64) 0 dropout_20[0][0] __________________________________________________________________________________________________ conv2d_42 (Conv2D) (None, 128, 128, 64) 36928 activation_42[0][0] __________________________________________________________________________________________________ add_20 (Add) (None, 128, 128, 64) 0 conv2d_42[0][0] add_19[0][0] __________________________________________________________________________________________________ batch_normalization_22 (BatchNo (None, 128, 128, 64) 256 add_20[0][0] __________________________________________________________________________________________________ activation_43 (Activation) (None, 128, 128, 64) 0 batch_normalization_22[0][0] __________________________________________________________________________________________________ conv2d_43 (Conv2D) (None, 128, 128, 64) 36928 activation_43[0][0] __________________________________________________________________________________________________ dropout_21 (Dropout) (None, 128, 128, 64) 0 conv2d_43[0][0] __________________________________________________________________________________________________ activation_44 (Activation) (None, 128, 128, 64) 0 dropout_21[0][0] __________________________________________________________________________________________________ conv2d_44 (Conv2D) (None, 128, 128, 64) 36928 activation_44[0][0] __________________________________________________________________________________________________ add_21 (Add) (None, 128, 128, 64) 0 conv2d_44[0][0] add_20[0][0] __________________________________________________________________________________________________ batch_normalization_23 (BatchNo (None, 128, 128, 64) 256 add_21[0][0] __________________________________________________________________________________________________ activation_45 (Activation) (None, 128, 128, 64) 0 batch_normalization_23[0][0] __________________________________________________________________________________________________ conv2d_45 (Conv2D) (None, 128, 128, 64) 36928 activation_45[0][0] __________________________________________________________________________________________________ dropout_22 (Dropout) (None, 128, 128, 64) 0 conv2d_45[0][0] __________________________________________________________________________________________________ activation_46 (Activation) (None, 128, 128, 64) 0 dropout_22[0][0] __________________________________________________________________________________________________ conv2d_46 (Conv2D) (None, 128, 128, 64) 36928 activation_46[0][0] __________________________________________________________________________________________________ add_22 (Add) (None, 128, 128, 64) 0 conv2d_46[0][0] add_21[0][0] __________________________________________________________________________________________________ batch_normalization_24 (BatchNo (None, 128, 128, 64) 256 add_22[0][0] __________________________________________________________________________________________________ activation_47 (Activation) (None, 128, 128, 64) 0 batch_normalization_24[0][0] __________________________________________________________________________________________________ conv2d_47 (Conv2D) (None, 128, 128, 64) 36928 activation_47[0][0] 
__________________________________________________________________________________________________ dropout_23 (Dropout) (None, 128, 128, 64) 0 conv2d_47[0][0] __________________________________________________________________________________________________ activation_48 (Activation) (None, 128, 128, 64) 0 dropout_23[0][0] __________________________________________________________________________________________________ conv2d_48 (Conv2D) (None, 128, 128, 64) 36928 activation_48[0][0] __________________________________________________________________________________________________ add_23 (Add) (None, 128, 128, 64) 0 conv2d_48[0][0] add_22[0][0] __________________________________________________________________________________________________ batch_normalization_25 (BatchNo (None, 128, 128, 64) 256 add_23[0][0] __________________________________________________________________________________________________ activation_49 (Activation) (None, 128, 128, 64) 0 batch_normalization_25[0][0] __________________________________________________________________________________________________ conv2d_49 (Conv2D) (None, 128, 128, 64) 36928 activation_49[0][0] __________________________________________________________________________________________________ dropout_24 (Dropout) (None, 128, 128, 64) 0 conv2d_49[0][0] __________________________________________________________________________________________________ activation_50 (Activation) (None, 128, 128, 64) 0 dropout_24[0][0] __________________________________________________________________________________________________ conv2d_50 (Conv2D) (None, 128, 128, 64) 36928 activation_50[0][0] __________________________________________________________________________________________________ add_24 (Add) (None, 128, 128, 64) 0 conv2d_50[0][0] add_23[0][0] __________________________________________________________________________________________________ batch_normalization_26 (BatchNo (None, 128, 128, 64) 256 add_24[0][0] __________________________________________________________________________________________________ activation_51 (Activation) (None, 128, 128, 64) 0 batch_normalization_26[0][0] __________________________________________________________________________________________________ conv2d_51 (Conv2D) (None, 128, 128, 64) 36928 activation_51[0][0] __________________________________________________________________________________________________ dropout_25 (Dropout) (None, 128, 128, 64) 0 conv2d_51[0][0] __________________________________________________________________________________________________ activation_52 (Activation) (None, 128, 128, 64) 0 dropout_25[0][0] __________________________________________________________________________________________________ conv2d_52 (Conv2D) (None, 128, 128, 64) 36928 activation_52[0][0] __________________________________________________________________________________________________ add_25 (Add) (None, 128, 128, 64) 0 conv2d_52[0][0] add_24[0][0] __________________________________________________________________________________________________ batch_normalization_27 (BatchNo (None, 128, 128, 64) 256 add_25[0][0] __________________________________________________________________________________________________ activation_53 (Activation) (None, 128, 128, 64) 0 batch_normalization_27[0][0] __________________________________________________________________________________________________ conv2d_53 (Conv2D) (None, 128, 128, 64) 36928 activation_53[0][0] 
__________________________________________________________________________________________________ dropout_26 (Dropout) (None, 128, 128, 64) 0 conv2d_53[0][0] __________________________________________________________________________________________________ activation_54 (Activation) (None, 128, 128, 64) 0 dropout_26[0][0] __________________________________________________________________________________________________ conv2d_54 (Conv2D) (None, 128, 128, 64) 36928 activation_54[0][0] __________________________________________________________________________________________________ add_26 (Add) (None, 128, 128, 64) 0 conv2d_54[0][0] add_25[0][0] __________________________________________________________________________________________________ batch_normalization_28 (BatchNo (None, 128, 128, 64) 256 add_26[0][0] __________________________________________________________________________________________________ activation_55 (Activation) (None, 128, 128, 64) 0 batch_normalization_28[0][0] __________________________________________________________________________________________________ conv2d_55 (Conv2D) (None, 128, 128, 64) 36928 activation_55[0][0] __________________________________________________________________________________________________ dropout_27 (Dropout) (None, 128, 128, 64) 0 conv2d_55[0][0] __________________________________________________________________________________________________ activation_56 (Activation) (None, 128, 128, 64) 0 dropout_27[0][0] __________________________________________________________________________________________________ conv2d_56 (Conv2D) (None, 128, 128, 64) 36928 activation_56[0][0] __________________________________________________________________________________________________ add_27 (Add) (None, 128, 128, 64) 0 conv2d_56[0][0] add_26[0][0] __________________________________________________________________________________________________ batch_normalization_29 (BatchNo (None, 128, 128, 64) 256 add_27[0][0] __________________________________________________________________________________________________ activation_57 (Activation) (None, 128, 128, 64) 0 batch_normalization_29[0][0] __________________________________________________________________________________________________ conv2d_57 (Conv2D) (None, 128, 128, 64) 36928 activation_57[0][0] __________________________________________________________________________________________________ dropout_28 (Dropout) (None, 128, 128, 64) 0 conv2d_57[0][0] __________________________________________________________________________________________________ activation_58 (Activation) (None, 128, 128, 64) 0 dropout_28[0][0] __________________________________________________________________________________________________ conv2d_58 (Conv2D) (None, 128, 128, 64) 36928 activation_58[0][0] __________________________________________________________________________________________________ add_28 (Add) (None, 128, 128, 64) 0 conv2d_58[0][0] add_27[0][0] __________________________________________________________________________________________________ batch_normalization_30 (BatchNo (None, 128, 128, 64) 256 add_28[0][0] __________________________________________________________________________________________________ activation_59 (Activation) (None, 128, 128, 64) 0 batch_normalization_30[0][0] __________________________________________________________________________________________________ conv2d_59 (Conv2D) (None, 128, 128, 64) 36928 activation_59[0][0] 
__________________________________________________________________________________________________ dropout_29 (Dropout) (None, 128, 128, 64) 0 conv2d_59[0][0] __________________________________________________________________________________________________ activation_60 (Activation) (None, 128, 128, 64) 0 dropout_29[0][0] __________________________________________________________________________________________________ conv2d_60 (Conv2D) (None, 128, 128, 64) 36928 activation_60[0][0] __________________________________________________________________________________________________ add_29 (Add) (None, 128, 128, 64) 0 conv2d_60[0][0] add_28[0][0] __________________________________________________________________________________________________ batch_normalization_31 (BatchNo (None, 128, 128, 64) 256 add_29[0][0] __________________________________________________________________________________________________ activation_61 (Activation) (None, 128, 128, 64) 0 batch_normalization_31[0][0] __________________________________________________________________________________________________ conv2d_61 (Conv2D) (None, 128, 128, 64) 36928 activation_61[0][0] __________________________________________________________________________________________________ dropout_30 (Dropout) (None, 128, 128, 64) 0 conv2d_61[0][0] __________________________________________________________________________________________________ activation_62 (Activation) (None, 128, 128, 64) 0 dropout_30[0][0] __________________________________________________________________________________________________ conv2d_62 (Conv2D) (None, 128, 128, 64) 36928 activation_62[0][0] __________________________________________________________________________________________________ add_30 (Add) (None, 128, 128, 64) 0 conv2d_62[0][0] add_29[0][0] __________________________________________________________________________________________________ batch_normalization_32 (BatchNo (None, 128, 128, 64) 256 add_30[0][0] __________________________________________________________________________________________________ activation_63 (Activation) (None, 128, 128, 64) 0 batch_normalization_32[0][0] __________________________________________________________________________________________________ conv2d_63 (Conv2D) (None, 128, 128, 64) 36928 activation_63[0][0] __________________________________________________________________________________________________ dropout_31 (Dropout) (None, 128, 128, 64) 0 conv2d_63[0][0] __________________________________________________________________________________________________ activation_64 (Activation) (None, 128, 128, 64) 0 dropout_31[0][0] __________________________________________________________________________________________________ conv2d_64 (Conv2D) (None, 128, 128, 64) 36928 activation_64[0][0] __________________________________________________________________________________________________ add_31 (Add) (None, 128, 128, 64) 0 conv2d_64[0][0] add_30[0][0] __________________________________________________________________________________________________ batch_normalization_33 (BatchNo (None, 128, 128, 64) 256 add_31[0][0] __________________________________________________________________________________________________ activation_65 (Activation) (None, 128, 128, 64) 0 batch_normalization_33[0][0] __________________________________________________________________________________________________ conv2d_65 (Conv2D) (None, 128, 128, 64) 36928 activation_65[0][0] 
__________________________________________________________________________________________________ dropout_32 (Dropout) (None, 128, 128, 64) 0 conv2d_65[0][0] __________________________________________________________________________________________________ activation_66 (Activation) (None, 128, 128, 64) 0 dropout_32[0][0] __________________________________________________________________________________________________ conv2d_66 (Conv2D) (None, 128, 128, 64) 36928 activation_66[0][0] __________________________________________________________________________________________________ add_32 (Add) (None, 128, 128, 64) 0 conv2d_66[0][0] add_31[0][0] __________________________________________________________________________________________________ batch_normalization_34 (BatchNo (None, 128, 128, 64) 256 add_32[0][0] __________________________________________________________________________________________________ activation_67 (Activation) (None, 128, 128, 64) 0 batch_normalization_34[0][0] __________________________________________________________________________________________________ conv2d_67 (Conv2D) (None, 128, 128, 64) 36928 activation_67[0][0] __________________________________________________________________________________________________ dropout_33 (Dropout) (None, 128, 128, 64) 0 conv2d_67[0][0] __________________________________________________________________________________________________ activation_68 (Activation) (None, 128, 128, 64) 0 dropout_33[0][0] __________________________________________________________________________________________________ conv2d_68 (Conv2D) (None, 128, 128, 64) 36928 activation_68[0][0] __________________________________________________________________________________________________ add_33 (Add) (None, 128, 128, 64) 0 conv2d_68[0][0] add_32[0][0] __________________________________________________________________________________________________ batch_normalization_35 (BatchNo (None, 128, 128, 64) 256 add_33[0][0] __________________________________________________________________________________________________ activation_69 (Activation) (None, 128, 128, 64) 0 batch_normalization_35[0][0] __________________________________________________________________________________________________ conv2d_69 (Conv2D) (None, 128, 128, 64) 36928 activation_69[0][0] __________________________________________________________________________________________________ dropout_34 (Dropout) (None, 128, 128, 64) 0 conv2d_69[0][0] __________________________________________________________________________________________________ activation_70 (Activation) (None, 128, 128, 64) 0 dropout_34[0][0] __________________________________________________________________________________________________ conv2d_70 (Conv2D) (None, 128, 128, 64) 36928 activation_70[0][0] __________________________________________________________________________________________________ add_34 (Add) (None, 128, 128, 64) 0 conv2d_70[0][0] add_33[0][0] __________________________________________________________________________________________________ batch_normalization_36 (BatchNo (None, 128, 128, 64) 256 add_34[0][0] __________________________________________________________________________________________________ activation_71 (Activation) (None, 128, 128, 64) 0 batch_normalization_36[0][0] __________________________________________________________________________________________________ conv2d_71 (Conv2D) (None, 128, 128, 64) 36928 activation_71[0][0] 
__________________________________________________________________________________________________ dropout_35 (Dropout) (None, 128, 128, 64) 0 conv2d_71[0][0] __________________________________________________________________________________________________ activation_72 (Activation) (None, 128, 128, 64) 0 dropout_35[0][0] __________________________________________________________________________________________________ conv2d_72 (Conv2D) (None, 128, 128, 64) 36928 activation_72[0][0] __________________________________________________________________________________________________ add_35 (Add) (None, 128, 128, 64) 0 conv2d_72[0][0] add_34[0][0] __________________________________________________________________________________________________ batch_normalization_37 (BatchNo (None, 128, 128, 64) 256 add_35[0][0] __________________________________________________________________________________________________ activation_73 (Activation) (None, 128, 128, 64) 0 batch_normalization_37[0][0] __________________________________________________________________________________________________ conv2d_73 (Conv2D) (None, 128, 128, 64) 36928 activation_73[0][0] __________________________________________________________________________________________________ dropout_36 (Dropout) (None, 128, 128, 64) 0 conv2d_73[0][0] __________________________________________________________________________________________________ activation_74 (Activation) (None, 128, 128, 64) 0 dropout_36[0][0] __________________________________________________________________________________________________ conv2d_74 (Conv2D) (None, 128, 128, 64) 36928 activation_74[0][0] __________________________________________________________________________________________________ add_36 (Add) (None, 128, 128, 64) 0 conv2d_74[0][0] add_35[0][0] __________________________________________________________________________________________________ batch_normalization_38 (BatchNo (None, 128, 128, 64) 256 add_36[0][0] __________________________________________________________________________________________________ activation_75 (Activation) (None, 128, 128, 64) 0 batch_normalization_38[0][0] __________________________________________________________________________________________________ conv2d_75 (Conv2D) (None, 128, 128, 64) 36928 activation_75[0][0] __________________________________________________________________________________________________ dropout_37 (Dropout) (None, 128, 128, 64) 0 conv2d_75[0][0] __________________________________________________________________________________________________ activation_76 (Activation) (None, 128, 128, 64) 0 dropout_37[0][0] __________________________________________________________________________________________________ conv2d_76 (Conv2D) (None, 128, 128, 64) 36928 activation_76[0][0] __________________________________________________________________________________________________ add_37 (Add) (None, 128, 128, 64) 0 conv2d_76[0][0] add_36[0][0] __________________________________________________________________________________________________ batch_normalization_39 (BatchNo (None, 128, 128, 64) 256 add_37[0][0] __________________________________________________________________________________________________ activation_77 (Activation) (None, 128, 128, 64) 0 batch_normalization_39[0][0] __________________________________________________________________________________________________ conv2d_77 (Conv2D) (None, 128, 128, 64) 36928 activation_77[0][0] 
__________________________________________________________________________________________________ dropout_38 (Dropout) (None, 128, 128, 64) 0 conv2d_77[0][0] __________________________________________________________________________________________________ activation_78 (Activation) (None, 128, 128, 64) 0 dropout_38[0][0] __________________________________________________________________________________________________ conv2d_78 (Conv2D) (None, 128, 128, 64) 36928 activation_78[0][0] __________________________________________________________________________________________________ add_38 (Add) (None, 128, 128, 64) 0 conv2d_78[0][0] add_37[0][0] __________________________________________________________________________________________________ batch_normalization_40 (BatchNo (None, 128, 128, 64) 256 add_38[0][0] __________________________________________________________________________________________________ activation_79 (Activation) (None, 128, 128, 64) 0 batch_normalization_40[0][0] __________________________________________________________________________________________________ conv2d_79 (Conv2D) (None, 128, 128, 64) 36928 activation_79[0][0] __________________________________________________________________________________________________ dropout_39 (Dropout) (None, 128, 128, 64) 0 conv2d_79[0][0] __________________________________________________________________________________________________ activation_80 (Activation) (None, 128, 128, 64) 0 dropout_39[0][0] __________________________________________________________________________________________________ conv2d_80 (Conv2D) (None, 128, 128, 64) 36928 activation_80[0][0] __________________________________________________________________________________________________ add_39 (Add) (None, 128, 128, 64) 0 conv2d_80[0][0] add_38[0][0] __________________________________________________________________________________________________ batch_normalization_41 (BatchNo (None, 128, 128, 64) 256 add_39[0][0] __________________________________________________________________________________________________ activation_81 (Activation) (None, 128, 128, 64) 0 batch_normalization_41[0][0] __________________________________________________________________________________________________ conv2d_81 (Conv2D) (None, 128, 128, 64) 36928 activation_81[0][0] __________________________________________________________________________________________________ dropout_40 (Dropout) (None, 128, 128, 64) 0 conv2d_81[0][0] __________________________________________________________________________________________________ activation_82 (Activation) (None, 128, 128, 64) 0 dropout_40[0][0] __________________________________________________________________________________________________ conv2d_82 (Conv2D) (None, 128, 128, 64) 36928 activation_82[0][0] __________________________________________________________________________________________________ add_40 (Add) (None, 128, 128, 64) 0 conv2d_82[0][0] add_39[0][0] __________________________________________________________________________________________________ batch_normalization_42 (BatchNo (None, 128, 128, 64) 256 add_40[0][0] __________________________________________________________________________________________________ activation_83 (Activation) (None, 128, 128, 64) 0 batch_normalization_42[0][0] __________________________________________________________________________________________________ conv2d_83 (Conv2D) (None, 128, 128, 64) 36928 activation_83[0][0] 
__________________________________________________________________________________________________ dropout_41 (Dropout) (None, 128, 128, 64) 0 conv2d_83[0][0] __________________________________________________________________________________________________ activation_84 (Activation) (None, 128, 128, 64) 0 dropout_41[0][0] __________________________________________________________________________________________________ conv2d_84 (Conv2D) (None, 128, 128, 64) 36928 activation_84[0][0] __________________________________________________________________________________________________ add_41 (Add) (None, 128, 128, 64) 0 conv2d_84[0][0] add_40[0][0] __________________________________________________________________________________________________ batch_normalization_43 (BatchNo (None, 128, 128, 64) 256 add_41[0][0] __________________________________________________________________________________________________ activation_85 (Activation) (None, 128, 128, 64) 0 batch_normalization_43[0][0] __________________________________________________________________________________________________ conv2d_85 (Conv2D) (None, 128, 128, 64) 36928 activation_85[0][0] __________________________________________________________________________________________________ dropout_42 (Dropout) (None, 128, 128, 64) 0 conv2d_85[0][0] __________________________________________________________________________________________________ activation_86 (Activation) (None, 128, 128, 64) 0 dropout_42[0][0] __________________________________________________________________________________________________ conv2d_86 (Conv2D) (None, 128, 128, 64) 36928 activation_86[0][0] __________________________________________________________________________________________________ add_42 (Add) (None, 128, 128, 64) 0 conv2d_86[0][0] add_41[0][0] __________________________________________________________________________________________________ batch_normalization_44 (BatchNo (None, 128, 128, 64) 256 add_42[0][0] __________________________________________________________________________________________________ activation_87 (Activation) (None, 128, 128, 64) 0 batch_normalization_44[0][0] __________________________________________________________________________________________________ conv2d_87 (Conv2D) (None, 128, 128, 64) 36928 activation_87[0][0] __________________________________________________________________________________________________ dropout_43 (Dropout) (None, 128, 128, 64) 0 conv2d_87[0][0] __________________________________________________________________________________________________ activation_88 (Activation) (None, 128, 128, 64) 0 dropout_43[0][0] __________________________________________________________________________________________________ conv2d_88 (Conv2D) (None, 128, 128, 64) 36928 activation_88[0][0] __________________________________________________________________________________________________ add_43 (Add) (None, 128, 128, 64) 0 conv2d_88[0][0] add_42[0][0] __________________________________________________________________________________________________ batch_normalization_45 (BatchNo (None, 128, 128, 64) 256 add_43[0][0] __________________________________________________________________________________________________ activation_89 (Activation) (None, 128, 128, 64) 0 batch_normalization_45[0][0] __________________________________________________________________________________________________ conv2d_89 (Conv2D) (None, 128, 128, 64) 36928 activation_89[0][0] 
__________________________________________________________________________________________________ dropout_44 (Dropout) (None, 128, 128, 64) 0 conv2d_89[0][0] __________________________________________________________________________________________________ activation_90 (Activation) (None, 128, 128, 64) 0 dropout_44[0][0] __________________________________________________________________________________________________ conv2d_90 (Conv2D) (None, 128, 128, 64) 36928 activation_90[0][0] __________________________________________________________________________________________________ add_44 (Add) (None, 128, 128, 64) 0 conv2d_90[0][0] add_43[0][0] __________________________________________________________________________________________________ batch_normalization_46 (BatchNo (None, 128, 128, 64) 256 add_44[0][0] __________________________________________________________________________________________________ activation_91 (Activation) (None, 128, 128, 64) 0 batch_normalization_46[0][0] __________________________________________________________________________________________________ conv2d_91 (Conv2D) (None, 128, 128, 64) 36928 activation_91[0][0] __________________________________________________________________________________________________ dropout_45 (Dropout) (None, 128, 128, 64) 0 conv2d_91[0][0] __________________________________________________________________________________________________ activation_92 (Activation) (None, 128, 128, 64) 0 dropout_45[0][0] __________________________________________________________________________________________________ conv2d_92 (Conv2D) (None, 128, 128, 64) 36928 activation_92[0][0] __________________________________________________________________________________________________ add_45 (Add) (None, 128, 128, 64) 0 conv2d_92[0][0] add_44[0][0] __________________________________________________________________________________________________ batch_normalization_47 (BatchNo (None, 128, 128, 64) 256 add_45[0][0] __________________________________________________________________________________________________ activation_93 (Activation) (None, 128, 128, 64) 0 batch_normalization_47[0][0] __________________________________________________________________________________________________ conv2d_93 (Conv2D) (None, 128, 128, 64) 36928 activation_93[0][0] __________________________________________________________________________________________________ dropout_46 (Dropout) (None, 128, 128, 64) 0 conv2d_93[0][0] __________________________________________________________________________________________________ activation_94 (Activation) (None, 128, 128, 64) 0 dropout_46[0][0] __________________________________________________________________________________________________ conv2d_94 (Conv2D) (None, 128, 128, 64) 36928 activation_94[0][0] __________________________________________________________________________________________________ add_46 (Add) (None, 128, 128, 64) 0 conv2d_94[0][0] add_45[0][0] __________________________________________________________________________________________________ batch_normalization_48 (BatchNo (None, 128, 128, 64) 256 add_46[0][0] __________________________________________________________________________________________________ activation_95 (Activation) (None, 128, 128, 64) 0 batch_normalization_48[0][0] __________________________________________________________________________________________________ conv2d_95 (Conv2D) (None, 128, 128, 64) 36928 activation_95[0][0] 
__________________________________________________________________________________________________ dropout_47 (Dropout) (None, 128, 128, 64) 0 conv2d_95[0][0] __________________________________________________________________________________________________ activation_96 (Activation) (None, 128, 128, 64) 0 dropout_47[0][0] __________________________________________________________________________________________________ conv2d_96 (Conv2D) (None, 128, 128, 64) 36928 activation_96[0][0] __________________________________________________________________________________________________ add_47 (Add) (None, 128, 128, 64) 0 conv2d_96[0][0] add_46[0][0] __________________________________________________________________________________________________ batch_normalization_49 (BatchNo (None, 128, 128, 64) 256 add_47[0][0] __________________________________________________________________________________________________ activation_97 (Activation) (None, 128, 128, 64) 0 batch_normalization_49[0][0] __________________________________________________________________________________________________ conv2d_97 (Conv2D) (None, 128, 128, 64) 36928 activation_97[0][0] __________________________________________________________________________________________________ dropout_48 (Dropout) (None, 128, 128, 64) 0 conv2d_97[0][0] __________________________________________________________________________________________________ activation_98 (Activation) (None, 128, 128, 64) 0 dropout_48[0][0] __________________________________________________________________________________________________ conv2d_98 (Conv2D) (None, 128, 128, 64) 36928 activation_98[0][0] __________________________________________________________________________________________________ add_48 (Add) (None, 128, 128, 64) 0 conv2d_98[0][0] add_47[0][0] __________________________________________________________________________________________________ batch_normalization_50 (BatchNo (None, 128, 128, 64) 256 add_48[0][0] __________________________________________________________________________________________________ activation_99 (Activation) (None, 128, 128, 64) 0 batch_normalization_50[0][0] __________________________________________________________________________________________________ conv2d_99 (Conv2D) (None, 128, 128, 64) 36928 activation_99[0][0] __________________________________________________________________________________________________ dropout_49 (Dropout) (None, 128, 128, 64) 0 conv2d_99[0][0] __________________________________________________________________________________________________ activation_100 (Activation) (None, 128, 128, 64) 0 dropout_49[0][0] __________________________________________________________________________________________________ conv2d_100 (Conv2D) (None, 128, 128, 64) 36928 activation_100[0][0] __________________________________________________________________________________________________ add_49 (Add) (None, 128, 128, 64) 0 conv2d_100[0][0] add_48[0][0] __________________________________________________________________________________________________ batch_normalization_51 (BatchNo (None, 128, 128, 64) 256 add_49[0][0] __________________________________________________________________________________________________ activation_101 (Activation) (None, 128, 128, 64) 0 batch_normalization_51[0][0] __________________________________________________________________________________________________ conv2d_101 (Conv2D) (None, 128, 128, 64) 36928 activation_101[0][0] 
__________________________________________________________________________________________________ dropout_50 (Dropout) (None, 128, 128, 64) 0 conv2d_101[0][0] __________________________________________________________________________________________________ activation_102 (Activation) (None, 128, 128, 64) 0 dropout_50[0][0] __________________________________________________________________________________________________ conv2d_102 (Conv2D) (None, 128, 128, 64) 36928 activation_102[0][0] __________________________________________________________________________________________________ add_50 (Add) (None, 128, 128, 64) 0 conv2d_102[0][0] add_49[0][0] __________________________________________________________________________________________________ batch_normalization_52 (BatchNo (None, 128, 128, 64) 256 add_50[0][0] __________________________________________________________________________________________________ activation_103 (Activation) (None, 128, 128, 64) 0 batch_normalization_52[0][0] __________________________________________________________________________________________________ conv2d_103 (Conv2D) (None, 128, 128, 64) 36928 activation_103[0][0] __________________________________________________________________________________________________ dropout_51 (Dropout) (None, 128, 128, 64) 0 conv2d_103[0][0] __________________________________________________________________________________________________ activation_104 (Activation) (None, 128, 128, 64) 0 dropout_51[0][0] __________________________________________________________________________________________________ conv2d_104 (Conv2D) (None, 128, 128, 64) 36928 activation_104[0][0] __________________________________________________________________________________________________ add_51 (Add) (None, 128, 128, 64) 0 conv2d_104[0][0] add_50[0][0] __________________________________________________________________________________________________ batch_normalization_53 (BatchNo (None, 128, 128, 64) 256 add_51[0][0] __________________________________________________________________________________________________ activation_105 (Activation) (None, 128, 128, 64) 0 batch_normalization_53[0][0] __________________________________________________________________________________________________ conv2d_105 (Conv2D) (None, 128, 128, 64) 36928 activation_105[0][0] __________________________________________________________________________________________________ dropout_52 (Dropout) (None, 128, 128, 64) 0 conv2d_105[0][0] __________________________________________________________________________________________________ activation_106 (Activation) (None, 128, 128, 64) 0 dropout_52[0][0] __________________________________________________________________________________________________ conv2d_106 (Conv2D) (None, 128, 128, 64) 36928 activation_106[0][0] __________________________________________________________________________________________________ add_52 (Add) (None, 128, 128, 64) 0 conv2d_106[0][0] add_51[0][0] __________________________________________________________________________________________________ batch_normalization_54 (BatchNo (None, 128, 128, 64) 256 add_52[0][0] __________________________________________________________________________________________________ activation_107 (Activation) (None, 128, 128, 64) 0 batch_normalization_54[0][0] __________________________________________________________________________________________________ conv2d_107 (Conv2D) (None, 128, 128, 64) 36928 activation_107[0][0] 
__________________________________________________________________________________________________ dropout_53 (Dropout) (None, 128, 128, 64) 0 conv2d_107[0][0] __________________________________________________________________________________________________ activation_108 (Activation) (None, 128, 128, 64) 0 dropout_53[0][0] __________________________________________________________________________________________________ conv2d_108 (Conv2D) (None, 128, 128, 64) 36928 activation_108[0][0] __________________________________________________________________________________________________ add_53 (Add) (None, 128, 128, 64) 0 conv2d_108[0][0] add_52[0][0] __________________________________________________________________________________________________ batch_normalization_55 (BatchNo (None, 128, 128, 64) 256 add_53[0][0] __________________________________________________________________________________________________ activation_109 (Activation) (None, 128, 128, 64) 0 batch_normalization_55[0][0] __________________________________________________________________________________________________ conv2d_109 (Conv2D) (None, 128, 128, 64) 36928 activation_109[0][0] __________________________________________________________________________________________________ dropout_54 (Dropout) (None, 128, 128, 64) 0 conv2d_109[0][0] __________________________________________________________________________________________________ activation_110 (Activation) (None, 128, 128, 64) 0 dropout_54[0][0] __________________________________________________________________________________________________ conv2d_110 (Conv2D) (None, 128, 128, 64) 36928 activation_110[0][0] __________________________________________________________________________________________________ add_54 (Add) (None, 128, 128, 64) 0 conv2d_110[0][0] add_53[0][0] __________________________________________________________________________________________________ batch_normalization_56 (BatchNo (None, 128, 128, 64) 256 add_54[0][0] __________________________________________________________________________________________________ activation_111 (Activation) (None, 128, 128, 64) 0 batch_normalization_56[0][0] __________________________________________________________________________________________________ conv2d_111 (Conv2D) (None, 128, 128, 64) 36928 activation_111[0][0] __________________________________________________________________________________________________ dropout_55 (Dropout) (None, 128, 128, 64) 0 conv2d_111[0][0] __________________________________________________________________________________________________ activation_112 (Activation) (None, 128, 128, 64) 0 dropout_55[0][0] __________________________________________________________________________________________________ conv2d_112 (Conv2D) (None, 128, 128, 64) 36928 activation_112[0][0] __________________________________________________________________________________________________ add_55 (Add) (None, 128, 128, 64) 0 conv2d_112[0][0] add_54[0][0] __________________________________________________________________________________________________ batch_normalization_57 (BatchNo (None, 128, 128, 64) 256 add_55[0][0] __________________________________________________________________________________________________ activation_113 (Activation) (None, 128, 128, 64) 0 batch_normalization_57[0][0] __________________________________________________________________________________________________ conv2d_113 (Conv2D) (None, 128, 128, 64) 36928 activation_113[0][0] 
__________________________________________________________________________________________________ dropout_56 (Dropout) (None, 128, 128, 64) 0 conv2d_113[0][0] __________________________________________________________________________________________________ activation_114 (Activation) (None, 128, 128, 64) 0 dropout_56[0][0] __________________________________________________________________________________________________ conv2d_114 (Conv2D) (None, 128, 128, 64) 36928 activation_114[0][0] __________________________________________________________________________________________________ add_56 (Add) (None, 128, 128, 64) 0 conv2d_114[0][0] add_55[0][0] __________________________________________________________________________________________________ batch_normalization_58 (BatchNo (None, 128, 128, 64) 256 add_56[0][0] __________________________________________________________________________________________________ activation_115 (Activation) (None, 128, 128, 64) 0 batch_normalization_58[0][0] __________________________________________________________________________________________________ conv2d_115 (Conv2D) (None, 128, 128, 64) 36928 activation_115[0][0] __________________________________________________________________________________________________ dropout_57 (Dropout) (None, 128, 128, 64) 0 conv2d_115[0][0] __________________________________________________________________________________________________ activation_116 (Activation) (None, 128, 128, 64) 0 dropout_57[0][0] __________________________________________________________________________________________________ conv2d_116 (Conv2D) (None, 128, 128, 64) 36928 activation_116[0][0] __________________________________________________________________________________________________ add_57 (Add) (None, 128, 128, 64) 0 conv2d_116[0][0] add_56[0][0] __________________________________________________________________________________________________ batch_normalization_59 (BatchNo (None, 128, 128, 64) 256 add_57[0][0] __________________________________________________________________________________________________ activation_117 (Activation) (None, 128, 128, 64) 0 batch_normalization_59[0][0] __________________________________________________________________________________________________ conv2d_117 (Conv2D) (None, 128, 128, 64) 36928 activation_117[0][0] __________________________________________________________________________________________________ dropout_58 (Dropout) (None, 128, 128, 64) 0 conv2d_117[0][0] __________________________________________________________________________________________________ activation_118 (Activation) (None, 128, 128, 64) 0 dropout_58[0][0] __________________________________________________________________________________________________ conv2d_118 (Conv2D) (None, 128, 128, 64) 36928 activation_118[0][0] __________________________________________________________________________________________________ add_58 (Add) (None, 128, 128, 64) 0 conv2d_118[0][0] add_57[0][0] __________________________________________________________________________________________________ batch_normalization_60 (BatchNo (None, 128, 128, 64) 256 add_58[0][0] __________________________________________________________________________________________________ activation_119 (Activation) (None, 128, 128, 64) 0 batch_normalization_60[0][0] __________________________________________________________________________________________________ conv2d_119 (Conv2D) (None, 128, 128, 64) 36928 activation_119[0][0] 
__________________________________________________________________________________________________ dropout_59 (Dropout) (None, 128, 128, 64) 0 conv2d_119[0][0] __________________________________________________________________________________________________ activation_120 (Activation) (None, 128, 128, 64) 0 dropout_59[0][0] __________________________________________________________________________________________________ conv2d_120 (Conv2D) (None, 128, 128, 64) 36928 activation_120[0][0] __________________________________________________________________________________________________ add_59 (Add) (None, 128, 128, 64) 0 conv2d_120[0][0] add_58[0][0] __________________________________________________________________________________________________ batch_normalization_61 (BatchNo (None, 128, 128, 64) 256 add_59[0][0] __________________________________________________________________________________________________ activation_121 (Activation) (None, 128, 128, 64) 0 batch_normalization_61[0][0] __________________________________________________________________________________________________ conv2d_121 (Conv2D) (None, 128, 128, 64) 36928 activation_121[0][0] __________________________________________________________________________________________________ dropout_60 (Dropout) (None, 128, 128, 64) 0 conv2d_121[0][0] __________________________________________________________________________________________________ activation_122 (Activation) (None, 128, 128, 64) 0 dropout_60[0][0] __________________________________________________________________________________________________ conv2d_122 (Conv2D) (None, 128, 128, 64) 36928 activation_122[0][0] __________________________________________________________________________________________________ add_60 (Add) (None, 128, 128, 64) 0 conv2d_122[0][0] add_59[0][0] __________________________________________________________________________________________________ batch_normalization_62 (BatchNo (None, 128, 128, 64) 256 add_60[0][0] __________________________________________________________________________________________________ activation_123 (Activation) (None, 128, 128, 64) 0 batch_normalization_62[0][0] __________________________________________________________________________________________________ conv2d_123 (Conv2D) (None, 128, 128, 64) 36928 activation_123[0][0] __________________________________________________________________________________________________ dropout_61 (Dropout) (None, 128, 128, 64) 0 conv2d_123[0][0] __________________________________________________________________________________________________ activation_124 (Activation) (None, 128, 128, 64) 0 dropout_61[0][0] __________________________________________________________________________________________________ conv2d_124 (Conv2D) (None, 128, 128, 64) 36928 activation_124[0][0] __________________________________________________________________________________________________ add_61 (Add) (None, 128, 128, 64) 0 conv2d_124[0][0] add_60[0][0] __________________________________________________________________________________________________ batch_normalization_63 (BatchNo (None, 128, 128, 64) 256 add_61[0][0] __________________________________________________________________________________________________ activation_125 (Activation) (None, 128, 128, 64) 0 batch_normalization_63[0][0] __________________________________________________________________________________________________ conv2d_125 (Conv2D) (None, 128, 128, 64) 36928 activation_125[0][0] 
__________________________________________________________________________________________________ dropout_62 (Dropout) (None, 128, 128, 64) 0 conv2d_125[0][0] __________________________________________________________________________________________________ activation_126 (Activation) (None, 128, 128, 64) 0 dropout_62[0][0] __________________________________________________________________________________________________ conv2d_126 (Conv2D) (None, 128, 128, 64) 36928 activation_126[0][0] __________________________________________________________________________________________________ add_62 (Add) (None, 128, 128, 64) 0 conv2d_126[0][0] add_61[0][0] __________________________________________________________________________________________________ batch_normalization_64 (BatchNo (None, 128, 128, 64) 256 add_62[0][0] __________________________________________________________________________________________________ activation_127 (Activation) (None, 128, 128, 64) 0 batch_normalization_64[0][0] __________________________________________________________________________________________________ conv2d_127 (Conv2D) (None, 128, 128, 64) 36928 activation_127[0][0] __________________________________________________________________________________________________ dropout_63 (Dropout) (None, 128, 128, 64) 0 conv2d_127[0][0] __________________________________________________________________________________________________ activation_128 (Activation) (None, 128, 128, 64) 0 dropout_63[0][0] __________________________________________________________________________________________________ conv2d_128 (Conv2D) (None, 128, 128, 64) 36928 activation_128[0][0] __________________________________________________________________________________________________ add_63 (Add) (None, 128, 128, 64) 0 conv2d_128[0][0] add_62[0][0] __________________________________________________________________________________________________ batch_normalization_65 (BatchNo (None, 128, 128, 64) 256 add_63[0][0] __________________________________________________________________________________________________ activation_129 (Activation) (None, 128, 128, 64) 0 batch_normalization_65[0][0] __________________________________________________________________________________________________ conv2d_129 (Conv2D) (None, 128, 128, 64) 36928 activation_129[0][0] __________________________________________________________________________________________________ dropout_64 (Dropout) (None, 128, 128, 64) 0 conv2d_129[0][0] __________________________________________________________________________________________________ activation_130 (Activation) (None, 128, 128, 64) 0 dropout_64[0][0] __________________________________________________________________________________________________ conv2d_130 (Conv2D) (None, 128, 128, 64) 36928 activation_130[0][0] __________________________________________________________________________________________________ add_64 (Add) (None, 128, 128, 64) 0 conv2d_130[0][0] add_63[0][0] __________________________________________________________________________________________________ batch_normalization_66 (BatchNo (None, 128, 128, 64) 256 add_64[0][0] __________________________________________________________________________________________________ activation_131 (Activation) (None, 128, 128, 64) 0 batch_normalization_66[0][0] __________________________________________________________________________________________________ conv2d_131 (Conv2D) (None, 128, 128, 64) 36928 activation_131[0][0] 
__________________________________________________________________________________________________ dropout_65 (Dropout) (None, 128, 128, 64) 0 conv2d_131[0][0] __________________________________________________________________________________________________ activation_132 (Activation) (None, 128, 128, 64) 0 dropout_65[0][0] __________________________________________________________________________________________________ conv2d_132 (Conv2D) (None, 128, 128, 64) 36928 activation_132[0][0] __________________________________________________________________________________________________ add_65 (Add) (None, 128, 128, 64) 0 conv2d_132[0][0] add_64[0][0] __________________________________________________________________________________________________ batch_normalization_67 (BatchNo (None, 128, 128, 64) 256 add_65[0][0] __________________________________________________________________________________________________ activation_133 (Activation) (None, 128, 128, 64) 0 batch_normalization_67[0][0] __________________________________________________________________________________________________ conv2d_133 (Conv2D) (None, 128, 128, 64) 36928 activation_133[0][0] __________________________________________________________________________________________________ dropout_66 (Dropout) (None, 128, 128, 64) 0 conv2d_133[0][0] __________________________________________________________________________________________________ activation_134 (Activation) (None, 128, 128, 64) 0 dropout_66[0][0] __________________________________________________________________________________________________ conv2d_134 (Conv2D) (None, 128, 128, 64) 36928 activation_134[0][0] __________________________________________________________________________________________________ add_66 (Add) (None, 128, 128, 64) 0 conv2d_134[0][0] add_65[0][0] __________________________________________________________________________________________________ batch_normalization_68 (BatchNo (None, 128, 128, 64) 256 add_66[0][0] __________________________________________________________________________________________________ activation_135 (Activation) (None, 128, 128, 64) 0 batch_normalization_68[0][0] __________________________________________________________________________________________________ conv2d_135 (Conv2D) (None, 128, 128, 64) 36928 activation_135[0][0] __________________________________________________________________________________________________ dropout_67 (Dropout) (None, 128, 128, 64) 0 conv2d_135[0][0] __________________________________________________________________________________________________ activation_136 (Activation) (None, 128, 128, 64) 0 dropout_67[0][0] __________________________________________________________________________________________________ conv2d_136 (Conv2D) (None, 128, 128, 64) 36928 activation_136[0][0] __________________________________________________________________________________________________ add_67 (Add) (None, 128, 128, 64) 0 conv2d_136[0][0] add_66[0][0] __________________________________________________________________________________________________ batch_normalization_69 (BatchNo (None, 128, 128, 64) 256 add_67[0][0] __________________________________________________________________________________________________ activation_137 (Activation) (None, 128, 128, 64) 0 batch_normalization_69[0][0] __________________________________________________________________________________________________ conv2d_137 (Conv2D) (None, 128, 128, 64) 36928 activation_137[0][0] 
__________________________________________________________________________________________________ dropout_68 (Dropout) (None, 128, 128, 64) 0 conv2d_137[0][0] __________________________________________________________________________________________________ activation_138 (Activation) (None, 128, 128, 64) 0 dropout_68[0][0] __________________________________________________________________________________________________ conv2d_138 (Conv2D) (None, 128, 128, 64) 36928 activation_138[0][0] __________________________________________________________________________________________________ add_68 (Add) (None, 128, 128, 64) 0 conv2d_138[0][0] add_67[0][0] __________________________________________________________________________________________________ batch_normalization_70 (BatchNo (None, 128, 128, 64) 256 add_68[0][0] __________________________________________________________________________________________________ activation_139 (Activation) (None, 128, 128, 64) 0 batch_normalization_70[0][0] __________________________________________________________________________________________________ conv2d_139 (Conv2D) (None, 128, 128, 64) 36928 activation_139[0][0] __________________________________________________________________________________________________ dropout_69 (Dropout) (None, 128, 128, 64) 0 conv2d_139[0][0] __________________________________________________________________________________________________ activation_140 (Activation) (None, 128, 128, 64) 0 dropout_69[0][0] __________________________________________________________________________________________________ conv2d_140 (Conv2D) (None, 128, 128, 64) 36928 activation_140[0][0] __________________________________________________________________________________________________ add_69 (Add) (None, 128, 128, 64) 0 conv2d_140[0][0] add_68[0][0] __________________________________________________________________________________________________ batch_normalization_71 (BatchNo (None, 128, 128, 64) 256 add_69[0][0] __________________________________________________________________________________________________ activation_141 (Activation) (None, 128, 128, 64) 0 batch_normalization_71[0][0] __________________________________________________________________________________________________ conv2d_141 (Conv2D) (None, 128, 128, 64) 36928 activation_141[0][0] __________________________________________________________________________________________________ dropout_70 (Dropout) (None, 128, 128, 64) 0 conv2d_141[0][0] __________________________________________________________________________________________________ activation_142 (Activation) (None, 128, 128, 64) 0 dropout_70[0][0] __________________________________________________________________________________________________ conv2d_142 (Conv2D) (None, 128, 128, 64) 36928 activation_142[0][0] __________________________________________________________________________________________________ add_70 (Add) (None, 128, 128, 64) 0 conv2d_142[0][0] add_69[0][0] __________________________________________________________________________________________________ batch_normalization_72 (BatchNo (None, 128, 128, 64) 256 add_70[0][0] __________________________________________________________________________________________________ activation_143 (Activation) (None, 128, 128, 64) 0 batch_normalization_72[0][0] __________________________________________________________________________________________________ conv2d_143 (Conv2D) (None, 128, 128, 64) 36928 activation_143[0][0] 
__________________________________________________________________________________________________ dropout_71 (Dropout) (None, 128, 128, 64) 0 conv2d_143[0][0] __________________________________________________________________________________________________ activation_144 (Activation) (None, 128, 128, 64) 0 dropout_71[0][0] __________________________________________________________________________________________________ conv2d_144 (Conv2D) (None, 128, 128, 64) 36928 activation_144[0][0] __________________________________________________________________________________________________ add_71 (Add) (None, 128, 128, 64) 0 conv2d_144[0][0] add_70[0][0] __________________________________________________________________________________________________ batch_normalization_73 (BatchNo (None, 128, 128, 64) 256 add_71[0][0] __________________________________________________________________________________________________ activation_145 (Activation) (None, 128, 128, 64) 0 batch_normalization_73[0][0] __________________________________________________________________________________________________ conv2d_145 (Conv2D) (None, 128, 128, 64) 36928 activation_145[0][0] __________________________________________________________________________________________________ dropout_72 (Dropout) (None, 128, 128, 64) 0 conv2d_145[0][0] __________________________________________________________________________________________________ activation_146 (Activation) (None, 128, 128, 64) 0 dropout_72[0][0] __________________________________________________________________________________________________ conv2d_146 (Conv2D) (None, 128, 128, 64) 36928 activation_146[0][0] __________________________________________________________________________________________________ add_72 (Add) (None, 128, 128, 64) 0 conv2d_146[0][0] add_71[0][0] __________________________________________________________________________________________________ batch_normalization_74 (BatchNo (None, 128, 128, 64) 256 add_72[0][0] __________________________________________________________________________________________________ activation_147 (Activation) (None, 128, 128, 64) 0 batch_normalization_74[0][0] __________________________________________________________________________________________________ conv2d_147 (Conv2D) (None, 128, 128, 64) 36928 activation_147[0][0] __________________________________________________________________________________________________ dropout_73 (Dropout) (None, 128, 128, 64) 0 conv2d_147[0][0] __________________________________________________________________________________________________ activation_148 (Activation) (None, 128, 128, 64) 0 dropout_73[0][0] __________________________________________________________________________________________________ conv2d_148 (Conv2D) (None, 128, 128, 64) 36928 activation_148[0][0] __________________________________________________________________________________________________ add_73 (Add) (None, 128, 128, 64) 0 conv2d_148[0][0] add_72[0][0] __________________________________________________________________________________________________ batch_normalization_75 (BatchNo (None, 128, 128, 64) 256 add_73[0][0] __________________________________________________________________________________________________ activation_149 (Activation) (None, 128, 128, 64) 0 batch_normalization_75[0][0] __________________________________________________________________________________________________ conv2d_149 (Conv2D) (None, 128, 128, 64) 36928 activation_149[0][0] 
__________________________________________________________________________________________________ dropout_74 (Dropout) (None, 128, 128, 64) 0 conv2d_149[0][0] __________________________________________________________________________________________________ activation_150 (Activation) (None, 128, 128, 64) 0 dropout_74[0][0] __________________________________________________________________________________________________ conv2d_150 (Conv2D) (None, 128, 128, 64) 36928 activation_150[0][0] __________________________________________________________________________________________________ add_74 (Add) (None, 128, 128, 64) 0 conv2d_150[0][0] add_73[0][0] __________________________________________________________________________________________________ batch_normalization_76 (BatchNo (None, 128, 128, 64) 256 add_74[0][0] __________________________________________________________________________________________________ activation_151 (Activation) (None, 128, 128, 64) 0 batch_normalization_76[0][0] __________________________________________________________________________________________________ conv2d_151 (Conv2D) (None, 128, 128, 64) 36928 activation_151[0][0] __________________________________________________________________________________________________ dropout_75 (Dropout) (None, 128, 128, 64) 0 conv2d_151[0][0] __________________________________________________________________________________________________ activation_152 (Activation) (None, 128, 128, 64) 0 dropout_75[0][0] __________________________________________________________________________________________________ conv2d_152 (Conv2D) (None, 128, 128, 64) 36928 activation_152[0][0] __________________________________________________________________________________________________ add_75 (Add) (None, 128, 128, 64) 0 conv2d_152[0][0] add_74[0][0] __________________________________________________________________________________________________ batch_normalization_77 (BatchNo (None, 128, 128, 64) 256 add_75[0][0] __________________________________________________________________________________________________ activation_153 (Activation) (None, 128, 128, 64) 0 batch_normalization_77[0][0] __________________________________________________________________________________________________ conv2d_153 (Conv2D) (None, 128, 128, 64) 36928 activation_153[0][0] __________________________________________________________________________________________________ dropout_76 (Dropout) (None, 128, 128, 64) 0 conv2d_153[0][0] __________________________________________________________________________________________________ activation_154 (Activation) (None, 128, 128, 64) 0 dropout_76[0][0] __________________________________________________________________________________________________ conv2d_154 (Conv2D) (None, 128, 128, 64) 36928 activation_154[0][0] __________________________________________________________________________________________________ add_76 (Add) (None, 128, 128, 64) 0 conv2d_154[0][0] add_75[0][0] __________________________________________________________________________________________________ batch_normalization_78 (BatchNo (None, 128, 128, 64) 256 add_76[0][0] __________________________________________________________________________________________________ activation_155 (Activation) (None, 128, 128, 64) 0 batch_normalization_78[0][0] __________________________________________________________________________________________________ conv2d_155 (Conv2D) (None, 128, 128, 64) 36928 activation_155[0][0] 
__________________________________________________________________________________________________ dropout_77 (Dropout) (None, 128, 128, 64) 0 conv2d_155[0][0] __________________________________________________________________________________________________ activation_156 (Activation) (None, 128, 128, 64) 0 dropout_77[0][0] __________________________________________________________________________________________________ conv2d_156 (Conv2D) (None, 128, 128, 64) 36928 activation_156[0][0] __________________________________________________________________________________________________ add_77 (Add) (None, 128, 128, 64) 0 conv2d_156[0][0] add_76[0][0] __________________________________________________________________________________________________ batch_normalization_79 (BatchNo (None, 128, 128, 64) 256 add_77[0][0] __________________________________________________________________________________________________ activation_157 (Activation) (None, 128, 128, 64) 0 batch_normalization_79[0][0] __________________________________________________________________________________________________ conv2d_157 (Conv2D) (None, 128, 128, 64) 36928 activation_157[0][0] __________________________________________________________________________________________________ dropout_78 (Dropout) (None, 128, 128, 64) 0 conv2d_157[0][0] __________________________________________________________________________________________________ activation_158 (Activation) (None, 128, 128, 64) 0 dropout_78[0][0] __________________________________________________________________________________________________ conv2d_158 (Conv2D) (None, 128, 128, 64) 36928 activation_158[0][0] __________________________________________________________________________________________________ add_78 (Add) (None, 128, 128, 64) 0 conv2d_158[0][0] add_77[0][0] __________________________________________________________________________________________________ batch_normalization_80 (BatchNo (None, 128, 128, 64) 256 add_78[0][0] __________________________________________________________________________________________________ activation_159 (Activation) (None, 128, 128, 64) 0 batch_normalization_80[0][0] __________________________________________________________________________________________________ conv2d_159 (Conv2D) (None, 128, 128, 64) 36928 activation_159[0][0] __________________________________________________________________________________________________ dropout_79 (Dropout) (None, 128, 128, 64) 0 conv2d_159[0][0] __________________________________________________________________________________________________ activation_160 (Activation) (None, 128, 128, 64) 0 dropout_79[0][0] __________________________________________________________________________________________________ conv2d_160 (Conv2D) (None, 128, 128, 64) 36928 activation_160[0][0] __________________________________________________________________________________________________ add_79 (Add) (None, 128, 128, 64) 0 conv2d_160[0][0] add_78[0][0] __________________________________________________________________________________________________ batch_normalization_81 (BatchNo (None, 128, 128, 64) 256 add_79[0][0] __________________________________________________________________________________________________ activation_161 (Activation) (None, 128, 128, 64) 0 batch_normalization_81[0][0] __________________________________________________________________________________________________ conv2d_161 (Conv2D) (None, 128, 128, 64) 36928 activation_161[0][0] 
__________________________________________________________________________________________________ dropout_80 (Dropout) (None, 128, 128, 64) 0 conv2d_161[0][0] __________________________________________________________________________________________________ activation_162 (Activation) (None, 128, 128, 64) 0 dropout_80[0][0] __________________________________________________________________________________________________ conv2d_162 (Conv2D) (None, 128, 128, 64) 36928 activation_162[0][0] __________________________________________________________________________________________________ add_80 (Add) (None, 128, 128, 64) 0 conv2d_162[0][0] add_79[0][0] __________________________________________________________________________________________________ batch_normalization_82 (BatchNo (None, 128, 128, 64) 256 add_80[0][0] __________________________________________________________________________________________________ activation_163 (Activation) (None, 128, 128, 64) 0 batch_normalization_82[0][0] __________________________________________________________________________________________________ conv2d_163 (Conv2D) (None, 128, 128, 64) 36928 activation_163[0][0] __________________________________________________________________________________________________ dropout_81 (Dropout) (None, 128, 128, 64) 0 conv2d_163[0][0] __________________________________________________________________________________________________ activation_164 (Activation) (None, 128, 128, 64) 0 dropout_81[0][0] __________________________________________________________________________________________________ conv2d_164 (Conv2D) (None, 128, 128, 64) 36928 activation_164[0][0] __________________________________________________________________________________________________ add_81 (Add) (None, 128, 128, 64) 0 conv2d_164[0][0] add_80[0][0] __________________________________________________________________________________________________ batch_normalization_83 (BatchNo (None, 128, 128, 64) 256 add_81[0][0] __________________________________________________________________________________________________ activation_165 (Activation) (None, 128, 128, 64) 0 batch_normalization_83[0][0] __________________________________________________________________________________________________ conv2d_165 (Conv2D) (None, 128, 128, 64) 36928 activation_165[0][0] __________________________________________________________________________________________________ dropout_82 (Dropout) (None, 128, 128, 64) 0 conv2d_165[0][0] __________________________________________________________________________________________________ activation_166 (Activation) (None, 128, 128, 64) 0 dropout_82[0][0] __________________________________________________________________________________________________ conv2d_166 (Conv2D) (None, 128, 128, 64) 36928 activation_166[0][0] __________________________________________________________________________________________________ add_82 (Add) (None, 128, 128, 64) 0 conv2d_166[0][0] add_81[0][0] __________________________________________________________________________________________________ batch_normalization_84 (BatchNo (None, 128, 128, 64) 256 add_82[0][0] __________________________________________________________________________________________________ activation_167 (Activation) (None, 128, 128, 64) 0 batch_normalization_84[0][0] __________________________________________________________________________________________________ conv2d_167 (Conv2D) (None, 128, 128, 64) 36928 activation_167[0][0] 
__________________________________________________________________________________________________ dropout_83 (Dropout) (None, 128, 128, 64) 0 conv2d_167[0][0] __________________________________________________________________________________________________ activation_168 (Activation) (None, 128, 128, 64) 0 dropout_83[0][0] __________________________________________________________________________________________________ conv2d_168 (Conv2D) (None, 128, 128, 64) 36928 activation_168[0][0] __________________________________________________________________________________________________ add_83 (Add) (None, 128, 128, 64) 0 conv2d_168[0][0] add_82[0][0] __________________________________________________________________________________________________ batch_normalization_85 (BatchNo (None, 128, 128, 64) 256 add_83[0][0] __________________________________________________________________________________________________ activation_169 (Activation) (None, 128, 128, 64) 0 batch_normalization_85[0][0] __________________________________________________________________________________________________ conv2d_169 (Conv2D) (None, 128, 128, 64) 36928 activation_169[0][0] __________________________________________________________________________________________________ dropout_84 (Dropout) (None, 128, 128, 64) 0 conv2d_169[0][0] __________________________________________________________________________________________________ activation_170 (Activation) (None, 128, 128, 64) 0 dropout_84[0][0] __________________________________________________________________________________________________ conv2d_170 (Conv2D) (None, 128, 128, 64) 36928 activation_170[0][0] __________________________________________________________________________________________________ add_84 (Add) (None, 128, 128, 64) 0 conv2d_170[0][0] add_83[0][0] __________________________________________________________________________________________________ batch_normalization_86 (BatchNo (None, 128, 128, 64) 256 add_84[0][0] __________________________________________________________________________________________________ activation_171 (Activation) (None, 128, 128, 64) 0 batch_normalization_86[0][0] __________________________________________________________________________________________________ conv2d_171 (Conv2D) (None, 128, 128, 64) 36928 activation_171[0][0] __________________________________________________________________________________________________ dropout_85 (Dropout) (None, 128, 128, 64) 0 conv2d_171[0][0] __________________________________________________________________________________________________ activation_172 (Activation) (None, 128, 128, 64) 0 dropout_85[0][0] __________________________________________________________________________________________________ conv2d_172 (Conv2D) (None, 128, 128, 64) 36928 activation_172[0][0] __________________________________________________________________________________________________ add_85 (Add) (None, 128, 128, 64) 0 conv2d_172[0][0] add_84[0][0] __________________________________________________________________________________________________ batch_normalization_87 (BatchNo (None, 128, 128, 64) 256 add_85[0][0] __________________________________________________________________________________________________ activation_173 (Activation) (None, 128, 128, 64) 0 batch_normalization_87[0][0] __________________________________________________________________________________________________ conv2d_173 (Conv2D) (None, 128, 128, 64) 36928 activation_173[0][0] 
__________________________________________________________________________________________________ dropout_86 (Dropout) (None, 128, 128, 64) 0 conv2d_173[0][0] __________________________________________________________________________________________________ activation_174 (Activation) (None, 128, 128, 64) 0 dropout_86[0][0] __________________________________________________________________________________________________ conv2d_174 (Conv2D) (None, 128, 128, 64) 36928 activation_174[0][0] __________________________________________________________________________________________________ add_86 (Add) (None, 128, 128, 64) 0 conv2d_174[0][0] add_85[0][0] __________________________________________________________________________________________________ batch_normalization_88 (BatchNo (None, 128, 128, 64) 256 add_86[0][0] __________________________________________________________________________________________________ activation_175 (Activation) (None, 128, 128, 64) 0 batch_normalization_88[0][0] __________________________________________________________________________________________________ conv2d_175 (Conv2D) (None, 128, 128, 64) 36928 activation_175[0][0] __________________________________________________________________________________________________ dropout_87 (Dropout) (None, 128, 128, 64) 0 conv2d_175[0][0] __________________________________________________________________________________________________ activation_176 (Activation) (None, 128, 128, 64) 0 dropout_87[0][0] __________________________________________________________________________________________________ conv2d_176 (Conv2D) (None, 128, 128, 64) 36928 activation_176[0][0] __________________________________________________________________________________________________ add_87 (Add) (None, 128, 128, 64) 0 conv2d_176[0][0] add_86[0][0] __________________________________________________________________________________________________ batch_normalization_89 (BatchNo (None, 128, 128, 64) 256 add_87[0][0] __________________________________________________________________________________________________ activation_177 (Activation) (None, 128, 128, 64) 0 batch_normalization_89[0][0] __________________________________________________________________________________________________ conv2d_177 (Conv2D) (None, 128, 128, 64) 36928 activation_177[0][0] __________________________________________________________________________________________________ dropout_88 (Dropout) (None, 128, 128, 64) 0 conv2d_177[0][0] __________________________________________________________________________________________________ activation_178 (Activation) (None, 128, 128, 64) 0 dropout_88[0][0] __________________________________________________________________________________________________ conv2d_178 (Conv2D) (None, 128, 128, 64) 36928 activation_178[0][0] __________________________________________________________________________________________________ add_88 (Add) (None, 128, 128, 64) 0 conv2d_178[0][0] add_87[0][0] __________________________________________________________________________________________________ batch_normalization_90 (BatchNo (None, 128, 128, 64) 256 add_88[0][0] __________________________________________________________________________________________________ activation_179 (Activation) (None, 128, 128, 64) 0 batch_normalization_90[0][0] __________________________________________________________________________________________________ conv2d_179 (Conv2D) (None, 128, 128, 64) 36928 activation_179[0][0] 
__________________________________________________________________________________________________ dropout_89 (Dropout) (None, 128, 128, 64) 0 conv2d_179[0][0] __________________________________________________________________________________________________ activation_180 (Activation) (None, 128, 128, 64) 0 dropout_89[0][0] __________________________________________________________________________________________________ conv2d_180 (Conv2D) (None, 128, 128, 64) 36928 activation_180[0][0] __________________________________________________________________________________________________ add_89 (Add) (None, 128, 128, 64) 0 conv2d_180[0][0] add_88[0][0] __________________________________________________________________________________________________ batch_normalization_91 (BatchNo (None, 128, 128, 64) 256 add_89[0][0] __________________________________________________________________________________________________ activation_181 (Activation) (None, 128, 128, 64) 0 batch_normalization_91[0][0] __________________________________________________________________________________________________ conv2d_181 (Conv2D) (None, 128, 128, 64) 36928 activation_181[0][0] __________________________________________________________________________________________________ dropout_90 (Dropout) (None, 128, 128, 64) 0 conv2d_181[0][0] __________________________________________________________________________________________________ activation_182 (Activation) (None, 128, 128, 64) 0 dropout_90[0][0] __________________________________________________________________________________________________ conv2d_182 (Conv2D) (None, 128, 128, 64) 36928 activation_182[0][0] __________________________________________________________________________________________________ add_90 (Add) (None, 128, 128, 64) 0 conv2d_182[0][0] add_89[0][0] __________________________________________________________________________________________________ batch_normalization_92 (BatchNo (None, 128, 128, 64) 256 add_90[0][0] __________________________________________________________________________________________________ activation_183 (Activation) (None, 128, 128, 64) 0 batch_normalization_92[0][0] __________________________________________________________________________________________________ conv2d_183 (Conv2D) (None, 128, 128, 64) 36928 activation_183[0][0] __________________________________________________________________________________________________ dropout_91 (Dropout) (None, 128, 128, 64) 0 conv2d_183[0][0] __________________________________________________________________________________________________ activation_184 (Activation) (None, 128, 128, 64) 0 dropout_91[0][0] __________________________________________________________________________________________________ conv2d_184 (Conv2D) (None, 128, 128, 64) 36928 activation_184[0][0] __________________________________________________________________________________________________ add_91 (Add) (None, 128, 128, 64) 0 conv2d_184[0][0] add_90[0][0] __________________________________________________________________________________________________ batch_normalization_93 (BatchNo (None, 128, 128, 64) 256 add_91[0][0] __________________________________________________________________________________________________ activation_185 (Activation) (None, 128, 128, 64) 0 batch_normalization_93[0][0] __________________________________________________________________________________________________ conv2d_185 (Conv2D) (None, 128, 128, 64) 36928 activation_185[0][0] 
__________________________________________________________________________________________________ dropout_92 (Dropout) (None, 128, 128, 64) 0 conv2d_185[0][0] __________________________________________________________________________________________________ activation_186 (Activation) (None, 128, 128, 64) 0 dropout_92[0][0] __________________________________________________________________________________________________ conv2d_186 (Conv2D) (None, 128, 128, 64) 36928 activation_186[0][0] __________________________________________________________________________________________________ add_92 (Add) (None, 128, 128, 64) 0 conv2d_186[0][0] add_91[0][0] __________________________________________________________________________________________________ batch_normalization_94 (BatchNo (None, 128, 128, 64) 256 add_92[0][0] __________________________________________________________________________________________________ activation_187 (Activation) (None, 128, 128, 64) 0 batch_normalization_94[0][0] __________________________________________________________________________________________________ conv2d_187 (Conv2D) (None, 128, 128, 64) 36928 activation_187[0][0] __________________________________________________________________________________________________ dropout_93 (Dropout) (None, 128, 128, 64) 0 conv2d_187[0][0] __________________________________________________________________________________________________ activation_188 (Activation) (None, 128, 128, 64) 0 dropout_93[0][0] __________________________________________________________________________________________________ conv2d_188 (Conv2D) (None, 128, 128, 64) 36928 activation_188[0][0] __________________________________________________________________________________________________ add_93 (Add) (None, 128, 128, 64) 0 conv2d_188[0][0] add_92[0][0] __________________________________________________________________________________________________ batch_normalization_95 (BatchNo (None, 128, 128, 64) 256 add_93[0][0] __________________________________________________________________________________________________ activation_189 (Activation) (None, 128, 128, 64) 0 batch_normalization_95[0][0] __________________________________________________________________________________________________ conv2d_189 (Conv2D) (None, 128, 128, 64) 36928 activation_189[0][0] __________________________________________________________________________________________________ dropout_94 (Dropout) (None, 128, 128, 64) 0 conv2d_189[0][0] __________________________________________________________________________________________________ activation_190 (Activation) (None, 128, 128, 64) 0 dropout_94[0][0] __________________________________________________________________________________________________ conv2d_190 (Conv2D) (None, 128, 128, 64) 36928 activation_190[0][0] __________________________________________________________________________________________________ add_94 (Add) (None, 128, 128, 64) 0 conv2d_190[0][0] add_93[0][0] __________________________________________________________________________________________________ batch_normalization_96 (BatchNo (None, 128, 128, 64) 256 add_94[0][0] __________________________________________________________________________________________________ activation_191 (Activation) (None, 128, 128, 64) 0 batch_normalization_96[0][0] __________________________________________________________________________________________________ conv2d_191 (Conv2D) (None, 128, 128, 64) 36928 activation_191[0][0] 
__________________________________________________________________________________________________ dropout_95 (Dropout) (None, 128, 128, 64) 0 conv2d_191[0][0] __________________________________________________________________________________________________ activation_192 (Activation) (None, 128, 128, 64) 0 dropout_95[0][0] __________________________________________________________________________________________________ conv2d_192 (Conv2D) (None, 128, 128, 64) 36928 activation_192[0][0] __________________________________________________________________________________________________ add_95 (Add) (None, 128, 128, 64) 0 conv2d_192[0][0] add_94[0][0] __________________________________________________________________________________________________ batch_normalization_97 (BatchNo (None, 128, 128, 64) 256 add_95[0][0] __________________________________________________________________________________________________ activation_193 (Activation) (None, 128, 128, 64) 0 batch_normalization_97[0][0] __________________________________________________________________________________________________ conv2d_193 (Conv2D) (None, 128, 128, 64) 36928 activation_193[0][0] __________________________________________________________________________________________________ dropout_96 (Dropout) (None, 128, 128, 64) 0 conv2d_193[0][0] __________________________________________________________________________________________________ activation_194 (Activation) (None, 128, 128, 64) 0 dropout_96[0][0] __________________________________________________________________________________________________ conv2d_194 (Conv2D) (None, 128, 128, 64) 36928 activation_194[0][0] __________________________________________________________________________________________________ add_96 (Add) (None, 128, 128, 64) 0 conv2d_194[0][0] add_95[0][0] __________________________________________________________________________________________________ batch_normalization_98 (BatchNo (None, 128, 128, 64) 256 add_96[0][0] __________________________________________________________________________________________________ activation_195 (Activation) (None, 128, 128, 64) 0 batch_normalization_98[0][0] __________________________________________________________________________________________________ conv2d_195 (Conv2D) (None, 128, 128, 64) 36928 activation_195[0][0] __________________________________________________________________________________________________ dropout_97 (Dropout) (None, 128, 128, 64) 0 conv2d_195[0][0] __________________________________________________________________________________________________ activation_196 (Activation) (None, 128, 128, 64) 0 dropout_97[0][0] __________________________________________________________________________________________________ conv2d_196 (Conv2D) (None, 128, 128, 64) 36928 activation_196[0][0] __________________________________________________________________________________________________ add_97 (Add) (None, 128, 128, 64) 0 conv2d_196[0][0] add_96[0][0] __________________________________________________________________________________________________ batch_normalization_99 (BatchNo (None, 128, 128, 64) 256 add_97[0][0] __________________________________________________________________________________________________ activation_197 (Activation) (None, 128, 128, 64) 0 batch_normalization_99[0][0] __________________________________________________________________________________________________ conv2d_197 (Conv2D) (None, 128, 128, 64) 36928 activation_197[0][0] 
__________________________________________________________________________________________________ dropout_98 (Dropout) (None, 128, 128, 64) 0 conv2d_197[0][0] __________________________________________________________________________________________________ activation_198 (Activation) (None, 128, 128, 64) 0 dropout_98[0][0] __________________________________________________________________________________________________ conv2d_198 (Conv2D) (None, 128, 128, 64) 36928 activation_198[0][0] __________________________________________________________________________________________________ add_98 (Add) (None, 128, 128, 64) 0 conv2d_198[0][0] add_97[0][0] __________________________________________________________________________________________________ batch_normalization_100 (BatchN (None, 128, 128, 64) 256 add_98[0][0] __________________________________________________________________________________________________ activation_199 (Activation) (None, 128, 128, 64) 0 batch_normalization_100[0][0] __________________________________________________________________________________________________ conv2d_199 (Conv2D) (None, 128, 128, 64) 36928 activation_199[0][0] __________________________________________________________________________________________________ dropout_99 (Dropout) (None, 128, 128, 64) 0 conv2d_199[0][0] __________________________________________________________________________________________________ activation_200 (Activation) (None, 128, 128, 64) 0 dropout_99[0][0] __________________________________________________________________________________________________ conv2d_200 (Conv2D) (None, 128, 128, 64) 36928 activation_200[0][0] __________________________________________________________________________________________________ add_99 (Add) (None, 128, 128, 64) 0 conv2d_200[0][0] add_98[0][0] __________________________________________________________________________________________________ batch_normalization_101 (BatchN (None, 128, 128, 64) 256 add_99[0][0] __________________________________________________________________________________________________ activation_201 (Activation) (None, 128, 128, 64) 0 batch_normalization_101[0][0] __________________________________________________________________________________________________ conv2d_201 (Conv2D) (None, 128, 128, 64) 36928 activation_201[0][0] __________________________________________________________________________________________________ dropout_100 (Dropout) (None, 128, 128, 64) 0 conv2d_201[0][0] __________________________________________________________________________________________________ activation_202 (Activation) (None, 128, 128, 64) 0 dropout_100[0][0] __________________________________________________________________________________________________ conv2d_202 (Conv2D) (None, 128, 128, 64) 36928 activation_202[0][0] __________________________________________________________________________________________________ add_100 (Add) (None, 128, 128, 64) 0 conv2d_202[0][0] add_99[0][0] __________________________________________________________________________________________________ batch_normalization_102 (BatchN (None, 128, 128, 64) 256 add_100[0][0] __________________________________________________________________________________________________ activation_203 (Activation) (None, 128, 128, 64) 0 batch_normalization_102[0][0] __________________________________________________________________________________________________ conv2d_203 (Conv2D) (None, 128, 128, 64) 36928 activation_203[0][0] 
__________________________________________________________________________________________________ dropout_101 (Dropout) (None, 128, 128, 64) 0 conv2d_203[0][0] __________________________________________________________________________________________________ activation_204 (Activation) (None, 128, 128, 64) 0 dropout_101[0][0] __________________________________________________________________________________________________ conv2d_204 (Conv2D) (None, 128, 128, 64) 36928 activation_204[0][0] __________________________________________________________________________________________________ add_101 (Add) (None, 128, 128, 64) 0 conv2d_204[0][0] add_100[0][0] __________________________________________________________________________________________________ batch_normalization_103 (BatchN (None, 128, 128, 64) 256 add_101[0][0] __________________________________________________________________________________________________ activation_205 (Activation) (None, 128, 128, 64) 0 batch_normalization_103[0][0] __________________________________________________________________________________________________ conv2d_205 (Conv2D) (None, 128, 128, 64) 36928 activation_205[0][0] __________________________________________________________________________________________________ dropout_102 (Dropout) (None, 128, 128, 64) 0 conv2d_205[0][0] __________________________________________________________________________________________________ activation_206 (Activation) (None, 128, 128, 64) 0 dropout_102[0][0] __________________________________________________________________________________________________ conv2d_206 (Conv2D) (None, 128, 128, 64) 36928 activation_206[0][0] __________________________________________________________________________________________________ add_102 (Add) (None, 128, 128, 64) 0 conv2d_206[0][0] add_101[0][0] __________________________________________________________________________________________________ batch_normalization_104 (BatchN (None, 128, 128, 64) 256 add_102[0][0] __________________________________________________________________________________________________ activation_207 (Activation) (None, 128, 128, 64) 0 batch_normalization_104[0][0] __________________________________________________________________________________________________ conv2d_207 (Conv2D) (None, 128, 128, 64) 36928 activation_207[0][0] __________________________________________________________________________________________________ dropout_103 (Dropout) (None, 128, 128, 64) 0 conv2d_207[0][0] __________________________________________________________________________________________________ activation_208 (Activation) (None, 128, 128, 64) 0 dropout_103[0][0] __________________________________________________________________________________________________ conv2d_208 (Conv2D) (None, 128, 128, 64) 36928 activation_208[0][0] __________________________________________________________________________________________________ add_103 (Add) (None, 128, 128, 64) 0 conv2d_208[0][0] add_102[0][0] __________________________________________________________________________________________________ batch_normalization_105 (BatchN (None, 128, 128, 64) 256 add_103[0][0] __________________________________________________________________________________________________ activation_209 (Activation) (None, 128, 128, 64) 0 batch_normalization_105[0][0] __________________________________________________________________________________________________ conv2d_209 (Conv2D) (None, 128, 128, 64) 36928 activation_209[0][0] 
__________________________________________________________________________________________________ dropout_104 (Dropout) (None, 128, 128, 64) 0 conv2d_209[0][0] __________________________________________________________________________________________________ activation_210 (Activation) (None, 128, 128, 64) 0 dropout_104[0][0] __________________________________________________________________________________________________ conv2d_210 (Conv2D) (None, 128, 128, 64) 36928 activation_210[0][0] __________________________________________________________________________________________________ add_104 (Add) (None, 128, 128, 64) 0 conv2d_210[0][0] add_103[0][0] __________________________________________________________________________________________________ batch_normalization_106 (BatchN (None, 128, 128, 64) 256 add_104[0][0] __________________________________________________________________________________________________ activation_211 (Activation) (None, 128, 128, 64) 0 batch_normalization_106[0][0] __________________________________________________________________________________________________ conv2d_211 (Conv2D) (None, 128, 128, 64) 36928 activation_211[0][0] __________________________________________________________________________________________________ dropout_105 (Dropout) (None, 128, 128, 64) 0 conv2d_211[0][0] __________________________________________________________________________________________________ activation_212 (Activation) (None, 128, 128, 64) 0 dropout_105[0][0] __________________________________________________________________________________________________ conv2d_212 (Conv2D) (None, 128, 128, 64) 36928 activation_212[0][0] __________________________________________________________________________________________________ add_105 (Add) (None, 128, 128, 64) 0 conv2d_212[0][0] add_104[0][0] __________________________________________________________________________________________________ batch_normalization_107 (BatchN (None, 128, 128, 64) 256 add_105[0][0] __________________________________________________________________________________________________ activation_213 (Activation) (None, 128, 128, 64) 0 batch_normalization_107[0][0] __________________________________________________________________________________________________ conv2d_213 (Conv2D) (None, 128, 128, 64) 36928 activation_213[0][0] __________________________________________________________________________________________________ dropout_106 (Dropout) (None, 128, 128, 64) 0 conv2d_213[0][0] __________________________________________________________________________________________________ activation_214 (Activation) (None, 128, 128, 64) 0 dropout_106[0][0] __________________________________________________________________________________________________ conv2d_214 (Conv2D) (None, 128, 128, 64) 36928 activation_214[0][0] __________________________________________________________________________________________________ add_106 (Add) (None, 128, 128, 64) 0 conv2d_214[0][0] add_105[0][0] __________________________________________________________________________________________________ batch_normalization_108 (BatchN (None, 128, 128, 64) 256 add_106[0][0] __________________________________________________________________________________________________ activation_215 (Activation) (None, 128, 128, 64) 0 batch_normalization_108[0][0] __________________________________________________________________________________________________ conv2d_215 (Conv2D) (None, 128, 128, 64) 36928 activation_215[0][0] 
__________________________________________________________________________________________________ dropout_107 (Dropout) (None, 128, 128, 64) 0 conv2d_215[0][0] __________________________________________________________________________________________________ activation_216 (Activation) (None, 128, 128, 64) 0 dropout_107[0][0] __________________________________________________________________________________________________ conv2d_216 (Conv2D) (None, 128, 128, 64) 36928 activation_216[0][0] __________________________________________________________________________________________________ add_107 (Add) (None, 128, 128, 64) 0 conv2d_216[0][0] add_106[0][0] __________________________________________________________________________________________________ batch_normalization_109 (BatchN (None, 128, 128, 64) 256 add_107[0][0] __________________________________________________________________________________________________ activation_217 (Activation) (None, 128, 128, 64) 0 batch_normalization_109[0][0] __________________________________________________________________________________________________ conv2d_217 (Conv2D) (None, 128, 128, 64) 36928 activation_217[0][0] __________________________________________________________________________________________________ dropout_108 (Dropout) (None, 128, 128, 64) 0 conv2d_217[0][0] __________________________________________________________________________________________________ activation_218 (Activation) (None, 128, 128, 64) 0 dropout_108[0][0] __________________________________________________________________________________________________ conv2d_218 (Conv2D) (None, 128, 128, 64) 36928 activation_218[0][0] __________________________________________________________________________________________________ add_108 (Add) (None, 128, 128, 64) 0 conv2d_218[0][0] add_107[0][0] __________________________________________________________________________________________________ batch_normalization_110 (BatchN (None, 128, 128, 64) 256 add_108[0][0] __________________________________________________________________________________________________ activation_219 (Activation) (None, 128, 128, 64) 0 batch_normalization_110[0][0] __________________________________________________________________________________________________ conv2d_219 (Conv2D) (None, 128, 128, 64) 36928 activation_219[0][0] __________________________________________________________________________________________________ dropout_109 (Dropout) (None, 128, 128, 64) 0 conv2d_219[0][0] __________________________________________________________________________________________________ activation_220 (Activation) (None, 128, 128, 64) 0 dropout_109[0][0] __________________________________________________________________________________________________ conv2d_220 (Conv2D) (None, 128, 128, 64) 36928 activation_220[0][0] __________________________________________________________________________________________________ add_109 (Add) (None, 128, 128, 64) 0 conv2d_220[0][0] add_108[0][0] __________________________________________________________________________________________________ batch_normalization_111 (BatchN (None, 128, 128, 64) 256 add_109[0][0] __________________________________________________________________________________________________ activation_221 (Activation) (None, 128, 128, 64) 0 batch_normalization_111[0][0] __________________________________________________________________________________________________ conv2d_221 (Conv2D) (None, 128, 128, 64) 36928 activation_221[0][0] 
__________________________________________________________________________________________________ dropout_110 (Dropout) (None, 128, 128, 64) 0 conv2d_221[0][0] __________________________________________________________________________________________________ activation_222 (Activation) (None, 128, 128, 64) 0 dropout_110[0][0] __________________________________________________________________________________________________ conv2d_222 (Conv2D) (None, 128, 128, 64) 36928 activation_222[0][0] __________________________________________________________________________________________________ add_110 (Add) (None, 128, 128, 64) 0 conv2d_222[0][0] add_109[0][0] __________________________________________________________________________________________________ batch_normalization_112 (BatchN (None, 128, 128, 64) 256 add_110[0][0] __________________________________________________________________________________________________ activation_223 (Activation) (None, 128, 128, 64) 0 batch_normalization_112[0][0] __________________________________________________________________________________________________ conv2d_223 (Conv2D) (None, 128, 128, 64) 36928 activation_223[0][0] __________________________________________________________________________________________________ dropout_111 (Dropout) (None, 128, 128, 64) 0 conv2d_223[0][0] __________________________________________________________________________________________________ activation_224 (Activation) (None, 128, 128, 64) 0 dropout_111[0][0] __________________________________________________________________________________________________ conv2d_224 (Conv2D) (None, 128, 128, 64) 36928 activation_224[0][0] __________________________________________________________________________________________________ add_111 (Add) (None, 128, 128, 64) 0 conv2d_224[0][0] add_110[0][0] __________________________________________________________________________________________________ batch_normalization_113 (BatchN (None, 128, 128, 64) 256 add_111[0][0] __________________________________________________________________________________________________ activation_225 (Activation) (None, 128, 128, 64) 0 batch_normalization_113[0][0] __________________________________________________________________________________________________ conv2d_225 (Conv2D) (None, 128, 128, 64) 36928 activation_225[0][0] __________________________________________________________________________________________________ dropout_112 (Dropout) (None, 128, 128, 64) 0 conv2d_225[0][0] __________________________________________________________________________________________________ activation_226 (Activation) (None, 128, 128, 64) 0 dropout_112[0][0] __________________________________________________________________________________________________ conv2d_226 (Conv2D) (None, 128, 128, 64) 36928 activation_226[0][0] __________________________________________________________________________________________________ add_112 (Add) (None, 128, 128, 64) 0 conv2d_226[0][0] add_111[0][0] __________________________________________________________________________________________________ batch_normalization_114 (BatchN (None, 128, 128, 64) 256 add_112[0][0] __________________________________________________________________________________________________ activation_227 (Activation) (None, 128, 128, 64) 0 batch_normalization_114[0][0] __________________________________________________________________________________________________ conv2d_227 (Conv2D) (None, 128, 128, 64) 36928 activation_227[0][0] 
__________________________________________________________________________________________________ dropout_113 (Dropout) (None, 128, 128, 64) 0 conv2d_227[0][0] __________________________________________________________________________________________________ activation_228 (Activation) (None, 128, 128, 64) 0 dropout_113[0][0] __________________________________________________________________________________________________ conv2d_228 (Conv2D) (None, 128, 128, 64) 36928 activation_228[0][0] __________________________________________________________________________________________________ add_113 (Add) (None, 128, 128, 64) 0 conv2d_228[0][0] add_112[0][0] __________________________________________________________________________________________________ batch_normalization_115 (BatchN (None, 128, 128, 64) 256 add_113[0][0] __________________________________________________________________________________________________ activation_229 (Activation) (None, 128, 128, 64) 0 batch_normalization_115[0][0] __________________________________________________________________________________________________ conv2d_229 (Conv2D) (None, 128, 128, 64) 36928 activation_229[0][0] __________________________________________________________________________________________________ dropout_114 (Dropout) (None, 128, 128, 64) 0 conv2d_229[0][0] __________________________________________________________________________________________________ activation_230 (Activation) (None, 128, 128, 64) 0 dropout_114[0][0] __________________________________________________________________________________________________ conv2d_230 (Conv2D) (None, 128, 128, 64) 36928 activation_230[0][0] __________________________________________________________________________________________________ add_114 (Add) (None, 128, 128, 64) 0 conv2d_230[0][0] add_113[0][0] __________________________________________________________________________________________________ batch_normalization_116 (BatchN (None, 128, 128, 64) 256 add_114[0][0] __________________________________________________________________________________________________ activation_231 (Activation) (None, 128, 128, 64) 0 batch_normalization_116[0][0] __________________________________________________________________________________________________ conv2d_231 (Conv2D) (None, 128, 128, 64) 36928 activation_231[0][0] __________________________________________________________________________________________________ dropout_115 (Dropout) (None, 128, 128, 64) 0 conv2d_231[0][0] __________________________________________________________________________________________________ activation_232 (Activation) (None, 128, 128, 64) 0 dropout_115[0][0] __________________________________________________________________________________________________ conv2d_232 (Conv2D) (None, 128, 128, 64) 36928 activation_232[0][0] __________________________________________________________________________________________________ add_115 (Add) (None, 128, 128, 64) 0 conv2d_232[0][0] add_114[0][0] __________________________________________________________________________________________________ batch_normalization_117 (BatchN (None, 128, 128, 64) 256 add_115[0][0] __________________________________________________________________________________________________ activation_233 (Activation) (None, 128, 128, 64) 0 batch_normalization_117[0][0] __________________________________________________________________________________________________ conv2d_233 (Conv2D) (None, 128, 128, 64) 36928 activation_233[0][0] 
__________________________________________________________________________________________________ dropout_116 (Dropout) (None, 128, 128, 64) 0 conv2d_233[0][0] __________________________________________________________________________________________________ activation_234 (Activation) (None, 128, 128, 64) 0 dropout_116[0][0] __________________________________________________________________________________________________ conv2d_234 (Conv2D) (None, 128, 128, 64) 36928 activation_234[0][0] __________________________________________________________________________________________________ add_116 (Add) (None, 128, 128, 64) 0 conv2d_234[0][0] add_115[0][0] __________________________________________________________________________________________________ batch_normalization_118 (BatchN (None, 128, 128, 64) 256 add_116[0][0] __________________________________________________________________________________________________ activation_235 (Activation) (None, 128, 128, 64) 0 batch_normalization_118[0][0] __________________________________________________________________________________________________ conv2d_235 (Conv2D) (None, 128, 128, 64) 36928 activation_235[0][0] __________________________________________________________________________________________________ dropout_117 (Dropout) (None, 128, 128, 64) 0 conv2d_235[0][0] __________________________________________________________________________________________________ activation_236 (Activation) (None, 128, 128, 64) 0 dropout_117[0][0] __________________________________________________________________________________________________ conv2d_236 (Conv2D) (None, 128, 128, 64) 36928 activation_236[0][0] __________________________________________________________________________________________________ add_117 (Add) (None, 128, 128, 64) 0 conv2d_236[0][0] add_116[0][0] __________________________________________________________________________________________________ batch_normalization_119 (BatchN (None, 128, 128, 64) 256 add_117[0][0] __________________________________________________________________________________________________ activation_237 (Activation) (None, 128, 128, 64) 0 batch_normalization_119[0][0] __________________________________________________________________________________________________ conv2d_237 (Conv2D) (None, 128, 128, 64) 36928 activation_237[0][0] __________________________________________________________________________________________________ dropout_118 (Dropout) (None, 128, 128, 64) 0 conv2d_237[0][0] __________________________________________________________________________________________________ activation_238 (Activation) (None, 128, 128, 64) 0 dropout_118[0][0] __________________________________________________________________________________________________ conv2d_238 (Conv2D) (None, 128, 128, 64) 36928 activation_238[0][0] __________________________________________________________________________________________________ add_118 (Add) (None, 128, 128, 64) 0 conv2d_238[0][0] add_117[0][0] __________________________________________________________________________________________________ batch_normalization_120 (BatchN (None, 128, 128, 64) 256 add_118[0][0] __________________________________________________________________________________________________ activation_239 (Activation) (None, 128, 128, 64) 0 batch_normalization_120[0][0] __________________________________________________________________________________________________ conv2d_239 (Conv2D) (None, 128, 128, 64) 36928 activation_239[0][0] 
__________________________________________________________________________________________________ dropout_119 (Dropout) (None, 128, 128, 64) 0 conv2d_239[0][0] __________________________________________________________________________________________________ activation_240 (Activation) (None, 128, 128, 64) 0 dropout_119[0][0] __________________________________________________________________________________________________ conv2d_240 (Conv2D) (None, 128, 128, 64) 36928 activation_240[0][0] __________________________________________________________________________________________________ add_119 (Add) (None, 128, 128, 64) 0 conv2d_240[0][0] add_118[0][0] __________________________________________________________________________________________________ batch_normalization_121 (BatchN (None, 128, 128, 64) 256 add_119[0][0] __________________________________________________________________________________________________ activation_241 (Activation) (None, 128, 128, 64) 0 batch_normalization_121[0][0] __________________________________________________________________________________________________ conv2d_241 (Conv2D) (None, 128, 128, 64) 36928 activation_241[0][0] __________________________________________________________________________________________________ dropout_120 (Dropout) (None, 128, 128, 64) 0 conv2d_241[0][0] __________________________________________________________________________________________________ activation_242 (Activation) (None, 128, 128, 64) 0 dropout_120[0][0] __________________________________________________________________________________________________ conv2d_242 (Conv2D) (None, 128, 128, 64) 36928 activation_242[0][0] __________________________________________________________________________________________________ add_120 (Add) (None, 128, 128, 64) 0 conv2d_242[0][0] add_119[0][0] __________________________________________________________________________________________________ batch_normalization_122 (BatchN (None, 128, 128, 64) 256 add_120[0][0] __________________________________________________________________________________________________ activation_243 (Activation) (None, 128, 128, 64) 0 batch_normalization_122[0][0] __________________________________________________________________________________________________ conv2d_243 (Conv2D) (None, 128, 128, 64) 36928 activation_243[0][0] __________________________________________________________________________________________________ dropout_121 (Dropout) (None, 128, 128, 64) 0 conv2d_243[0][0] __________________________________________________________________________________________________ activation_244 (Activation) (None, 128, 128, 64) 0 dropout_121[0][0] __________________________________________________________________________________________________ conv2d_244 (Conv2D) (None, 128, 128, 64) 36928 activation_244[0][0] __________________________________________________________________________________________________ add_121 (Add) (None, 128, 128, 64) 0 conv2d_244[0][0] add_120[0][0] __________________________________________________________________________________________________ batch_normalization_123 (BatchN (None, 128, 128, 64) 256 add_121[0][0] __________________________________________________________________________________________________ activation_245 (Activation) (None, 128, 128, 64) 0 batch_normalization_123[0][0] __________________________________________________________________________________________________ conv2d_245 (Conv2D) (None, 128, 128, 64) 36928 activation_245[0][0] 
__________________________________________________________________________________________________ dropout_122 (Dropout) (None, 128, 128, 64) 0 conv2d_245[0][0] __________________________________________________________________________________________________ activation_246 (Activation) (None, 128, 128, 64) 0 dropout_122[0][0] __________________________________________________________________________________________________ conv2d_246 (Conv2D) (None, 128, 128, 64) 36928 activation_246[0][0] __________________________________________________________________________________________________ add_122 (Add) (None, 128, 128, 64) 0 conv2d_246[0][0] add_121[0][0] __________________________________________________________________________________________________ batch_normalization_124 (BatchN (None, 128, 128, 64) 256 add_122[0][0] __________________________________________________________________________________________________ activation_247 (Activation) (None, 128, 128, 64) 0 batch_normalization_124[0][0] __________________________________________________________________________________________________ conv2d_247 (Conv2D) (None, 128, 128, 64) 36928 activation_247[0][0] __________________________________________________________________________________________________ dropout_123 (Dropout) (None, 128, 128, 64) 0 conv2d_247[0][0] 2020-04-29 08:15:46.332684: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 __________________________________________________________________________________________________ activation_248 (Activation) (None, 128, 128, 64) 0 dropout_123[0][0] __________________________________________________________________________________________________ conv2d_248 (Conv2D) (None, 128, 128, 64) 36928 activation_248[0][0] __________________________________________________________________________________________________ add_123 (Add) (None, 128, 128, 64) 0 conv2d_248[0][0] add_122[0][0] __________________________________________________________________________________________________ batch_normalization_125 (BatchN (None, 128, 128, 64) 256 add_123[0][0] __________________________________________________________________________________________________ activation_249 (Activation) (None, 128, 128, 64) 0 batch_normalization_125[0][0] __________________________________________________________________________________________________ conv2d_249 (Conv2D) (None, 128, 128, 64) 36928 activation_249[0][0] __________________________________________________________________________________________________ dropout_124 (Dropout) (None, 128, 128, 64) 0 conv2d_249[0][0] __________________________________________________________________________________________________ activation_250 (Activation) (None, 128, 128, 64) 0 dropout_124[0][0] __________________________________________________________________________________________________ conv2d_250 (Conv2D) (None, 128, 128, 64) 36928 activation_250[0][0] __________________________________________________________________________________________________ add_124 (Add) (None, 128, 128, 64) 0 conv2d_250[0][0] add_123[0][0] __________________________________________________________________________________________________ batch_normalization_126 (BatchN (None, 128, 128, 64) 256 add_124[0][0] __________________________________________________________________________________________________ activation_251 (Activation) (None, 128, 128, 64) 0 batch_normalization_126[0][0] 
__________________________________________________________________________________________________ conv2d_251 (Conv2D) (None, 128, 128, 64) 36928 activation_251[0][0] __________________________________________________________________________________________________ dropout_125 (Dropout) (None, 128, 128, 64) 0 conv2d_251[0][0] __________________________________________________________________________________________________ activation_252 (Activation) (None, 128, 128, 64) 0 dropout_125[0][0] __________________________________________________________________________________________________ conv2d_252 (Conv2D) (None, 128, 128, 64) 36928 activation_252[0][0] __________________________________________________________________________________________________ add_125 (Add) (None, 128, 128, 64) 0 conv2d_252[0][0] add_124[0][0] __________________________________________________________________________________________________ batch_normalization_127 (BatchN (None, 128, 128, 64) 256 add_125[0][0] __________________________________________________________________________________________________ activation_253 (Activation) (None, 128, 128, 64) 0 batch_normalization_127[0][0] __________________________________________________________________________________________________ conv2d_253 (Conv2D) (None, 128, 128, 64) 36928 activation_253[0][0] __________________________________________________________________________________________________ dropout_126 (Dropout) (None, 128, 128, 64) 0 conv2d_253[0][0] __________________________________________________________________________________________________ activation_254 (Activation) (None, 128, 128, 64) 0 dropout_126[0][0] __________________________________________________________________________________________________ conv2d_254 (Conv2D) (None, 128, 128, 64) 36928 activation_254[0][0] __________________________________________________________________________________________________ add_126 (Add) (None, 128, 128, 64) 0 conv2d_254[0][0] add_125[0][0] __________________________________________________________________________________________________ batch_normalization_128 (BatchN (None, 128, 128, 64) 256 add_126[0][0] __________________________________________________________________________________________________ activation_255 (Activation) (None, 128, 128, 64) 0 batch_normalization_128[0][0] __________________________________________________________________________________________________ conv2d_255 (Conv2D) (None, 128, 128, 64) 36928 activation_255[0][0] __________________________________________________________________________________________________ dropout_127 (Dropout) (None, 128, 128, 64) 0 conv2d_255[0][0] __________________________________________________________________________________________________ activation_256 (Activation) (None, 128, 128, 64) 0 dropout_127[0][0] __________________________________________________________________________________________________ conv2d_256 (Conv2D) (None, 128, 128, 64) 36928 activation_256[0][0] __________________________________________________________________________________________________ add_127 (Add) (None, 128, 128, 64) 0 conv2d_256[0][0] add_126[0][0] __________________________________________________________________________________________________ batch_normalization_129 (BatchN (None, 128, 128, 64) 256 add_127[0][0] __________________________________________________________________________________________________ activation_257 (Activation) (None, 128, 128, 64) 0 batch_normalization_129[0][0] 
__________________________________________________________________________________________________ conv2d_257 (Conv2D) (None, 128, 128, 34) 19618 activation_257[0][0] __________________________________________________________________________________________________ activation_258 (Activation) (None, 128, 128, 34) 0 conv2d_257[0][0] ================================================================================================== Total params: 9,510,150 Trainable params: 9,493,524 Non-trainable params: 16,626 __________________________________________________________________________________________________ None Evaluate validation set.. Model params: L 512 num_blocks 128 width 64 expected_n_channels 57 1/100 [..............................] - ETA: 3:23:33 2/100 [..............................] - ETA: 1:41:38 3/100 [..............................] - ETA: 1:07:38 4/100 [>.............................] - ETA: 50:37  5/100 [>.............................] - ETA: 40:25 6/100 [>.............................] - ETA: 33:36 7/100 [=>............................] - ETA: 28:44 8/100 [=>............................] - ETA: 25:04 9/100 [=>............................] - ETA: 22:13 10/100 [==>...........................] - ETA: 19:56 11/100 [==>...........................] - ETA: 18:04 12/100 [==>...........................] - ETA: 18:48 13/100 [==>...........................] - ETA: 17:26 14/100 [===>..........................] - ETA: 18:03 15/100 [===>..........................] - ETA: 18:50 16/100 [===>..........................] - ETA: 19:29 17/100 [====>.........................] - ETA: 18:30 18/100 [====>.........................] - ETA: 18:15 19/100 [====>.........................] - ETA: 17:29 20/100 [=====>........................] - ETA: 16:47 21/100 [=====>........................] - ETA: 17:48 22/100 [=====>........................] - ETA: 16:55 23/100 [=====>........................] - ETA: 18:18 24/100 [======>.......................] - ETA: 18:12 25/100 [======>.......................] - ETA: 17:27 26/100 [======>.......................] - ETA: 17:41 27/100 [=======>......................] - ETA: 16:56 28/100 [=======>......................] - ETA: 16:17 29/100 [=======>......................] - ETA: 15:36 30/100 [========>.....................] - ETA: 15:05 31/100 [========>.....................] - ETA: 14:38 32/100 [========>.....................] - ETA: 14:05 33/100 [========>.....................] - ETA: 14:34 34/100 [=========>....................] - ETA: 14:01 35/100 [=========>....................] - ETA: 13:33 36/100 [=========>....................] - ETA: 13:07 37/100 [==========>...................] - ETA: 12:35 38/100 [==========>...................] - ETA: 12:09 39/100 [==========>...................] - ETA: 12:34 40/100 [===========>..................] - ETA: 13:30 41/100 [===========>..................] - ETA: 13:00 42/100 [===========>..................] - ETA: 13:40 43/100 [===========>..................] - ETA: 13:14 44/100 [============>.................] - ETA: 12:57 45/100 [============>.................] - ETA: 12:29 46/100 [============>.................] - ETA: 12:02 47/100 [=============>................] - ETA: 11:51 48/100 [=============>................] - ETA: 11:27 49/100 [=============>................] - ETA: 11:05 50/100 [==============>...............] - ETA: 10:42 51/100 [==============>...............] - ETA: 10:56 52/100 [==============>...............] - ETA: 10:35 53/100 [==============>...............] 
- ETA: 10:12 54/100 [===============>..............] - ETA: 9:50  55/100 [===============>..............] - ETA: 9:31 56/100 [===============>..............] - ETA: 9:10 57/100 [================>.............] - ETA: 8:52 58/100 [================>.............] - ETA: 8:36 59/100 [================>.............] - ETA: 8:18 60/100 [=================>............] - ETA: 8:08 61/100 [=================>............] - ETA: 7:49 62/100 [=================>............] - ETA: 7:30 63/100 [=================>............] - ETA: 7:12 64/100 [==================>...........] - ETA: 6:54 65/100 [==================>...........] - ETA: 6:39 66/100 [==================>...........] - ETA: 6:31 67/100 [===================>..........] - ETA: 6:15 68/100 [===================>..........] - ETA: 5:59 69/100 [===================>..........] - ETA: 5:43 70/100 [====================>.........] - ETA: 5:36 71/100 [====================>.........] - ETA: 5:22 72/100 [====================>.........] - ETA: 5:08 73/100 [====================>.........] - ETA: 4:54 74/100 [=====================>........] - ETA: 4:40 75/100 [=====================>........] - ETA: 4:26 76/100 [=====================>........] - ETA: 4:16 77/100 [======================>.......] - ETA: 4:03 78/100 [======================>.......] - ETA: 3:50 79/100 [======================>.......] - ETA: 3:37 80/100 [=======================>......] - ETA: 3:24 81/100 [=======================>......] - ETA: 3:15 82/100 [=======================>......] - ETA: 3:04 83/100 [=======================>......] - ETA: 2:57 84/100 [========================>.....] - ETA: 2:47 85/100 [========================>.....] - ETA: 2:35 86/100 [========================>.....] - ETA: 2:24 87/100 [=========================>....] - ETA: 2:13 88/100 [=========================>....] - ETA: 2:01 89/100 [=========================>....] - ETA: 1:53 90/100 [==========================>...] - ETA: 1:41 91/100 [==========================>...] - ETA: 1:32 92/100 [==========================>...] - ETA: 1:21 93/100 [==========================>...] - ETA: 1:11 94/100 [===========================>..] - ETA: 1:01 95/100 [===========================>..] - ETA: 50s  96/100 [===========================>..] - ETA: 40s 97/100 [============================>.] - ETA: 30s 98/100 [============================>.] - ETA: 20s 99/100 [============================>.] 
- ETA: 10s 100/100 [==============================] - 1016s 10s/step MAE for 0 3aa0A lr_d8 = 9.47 mlr_d8 = 7.77 lr_d12 = 7.12 mlr_d12 = 6.43 MAE for 1 5cs0A lr_d8 = 7.36 mlr_d8 = 6.67 lr_d12 = 5.64 mlr_d12 = 5.16 MAE for 2 2z51A lr_d8 = 5.10 mlr_d8 = 4.40 lr_d12 = 3.75 mlr_d12 = 3.47 MAE for 3 3q64A lr_d8 = 2.99 mlr_d8 = 2.34 lr_d12 = 2.11 mlr_d12 = 1.91 MAE for 4 4p3aA lr_d8 = 4.01 mlr_d8 = 4.00 lr_d12 = 3.34 mlr_d12 = 3.32 MAE for 5 1nkzA lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = 6.03 MAE for 6 4d5tA lr_d8 = 11.48 mlr_d8 = 11.24 lr_d12 = 8.62 mlr_d12 = 8.31 MAE for 7 1jogA lr_d8 = 3.23 mlr_d8 = 2.99 lr_d12 = 2.49 mlr_d12 = 2.33 MAE for 8 4u89A lr_d8 = 10.27 mlr_d8 = 8.21 lr_d12 = 8.44 mlr_d12 = 7.22 MAE for 9 3wisA lr_d8 = 3.54 mlr_d8 = 3.51 lr_d12 = 3.71 mlr_d12 = 3.53 MAE for 10 1mkfA lr_d8 = 12.29 mlr_d8 = 11.81 lr_d12 = 9.49 mlr_d12 = 9.10 MAE for 11 1kaeA lr_d8 = 9.20 mlr_d8 = 8.40 lr_d12 = 8.22 mlr_d12 = 7.42 MAE for 12 4o8bA lr_d8 = 2.41 mlr_d8 = 3.34 lr_d12 = 2.51 mlr_d12 = 2.82 MAE for 13 2gfqA lr_d8 = 7.60 mlr_d8 = 7.18 lr_d12 = 6.95 mlr_d12 = 6.41 MAE for 14 4cmlA lr_d8 = 6.01 mlr_d8 = 5.38 lr_d12 = 6.02 mlr_d12 = 5.52 MAE for 15 1vrmA lr_d8 = 3.92 mlr_d8 = 3.74 lr_d12 = 3.78 mlr_d12 = 3.50 MAE for 16 4kwyA lr_d8 = 3.49 mlr_d8 = 3.23 lr_d12 = 3.62 mlr_d12 = 3.40 MAE for 17 2yfaA lr_d8 = 3.49 mlr_d8 = 3.11 lr_d12 = 3.09 mlr_d12 = 2.77 MAE for 18 1tnfA lr_d8 = 4.00 mlr_d8 = 3.99 lr_d12 = 3.72 mlr_d12 = 3.60 MAE for 19 3pivA lr_d8 = 4.66 mlr_d8 = 4.52 lr_d12 = 3.26 mlr_d12 = 3.14 MAE for 20 1nw1A lr_d8 = 6.63 mlr_d8 = 5.84 lr_d12 = 5.55 mlr_d12 = 5.10 MAE for 21 4ng0A lr_d8 = 8.38 mlr_d8 = 8.54 lr_d12 = 6.85 mlr_d12 = 7.06 MAE for 22 4qt9A lr_d8 = 8.32 mlr_d8 = 8.07 lr_d12 = 6.89 mlr_d12 = 6.63 MAE for 23 3no4A lr_d8 = 6.62 mlr_d8 = 6.69 lr_d12 = 5.88 mlr_d12 = 5.83 MAE for 24 3qd7X lr_d8 = 9.64 mlr_d8 = 8.24 lr_d12 = 8.19 mlr_d12 = 7.23 MAE for 25 3o4pA lr_d8 = 7.35 mlr_d8 = 5.33 lr_d12 = 5.41 mlr_d12 = 4.44 MAE for 26 3agnA lr_d8 = 6.93 mlr_d8 = 5.73 lr_d12 = 5.07 mlr_d12 = 4.53 MAE for 27 2yilA lr_d8 = 6.13 mlr_d8 = 5.21 lr_d12 = 5.21 mlr_d12 = 4.35 MAE for 28 4jivD lr_d8 = 9.15 mlr_d8 = 6.72 lr_d12 = 6.16 mlr_d12 = 5.04 MAE for 29 4fz4A lr_d8 = 12.39 mlr_d8 = 11.41 lr_d12 = 10.27 mlr_d12 = 9.25 MAE for 30 2g3vA lr_d8 = 16.93 mlr_d8 = 16.79 lr_d12 = 13.83 mlr_d12 = 13.37 MAE for 31 2o0qA lr_d8 = 3.50 mlr_d8 = 3.59 lr_d12 = 2.91 mlr_d12 = 3.02 MAE for 32 4levA lr_d8 = 13.04 mlr_d8 = 11.04 lr_d12 = 11.23 mlr_d12 = 9.98 MAE for 33 1t6t1 lr_d8 = 6.40 mlr_d8 = 5.26 lr_d12 = 5.83 mlr_d12 = 4.90 MAE for 34 2fzpA lr_d8 = 16.37 mlr_d8 = 14.72 lr_d12 = 13.34 mlr_d12 = 12.24 MAE for 35 1rj8A lr_d8 = 7.48 mlr_d8 = 6.98 lr_d12 = 6.72 mlr_d12 = 6.26 MAE for 36 2z7fI lr_d8 = 6.26 mlr_d8 = 5.82 lr_d12 = 4.55 mlr_d12 = 3.95 MAE for 37 1dx5I lr_d8 = 10.72 mlr_d8 = 6.80 lr_d12 = 8.65 mlr_d12 = 5.27 MAE for 38 2gsoA lr_d8 = 7.97 mlr_d8 = 8.07 lr_d12 = 7.26 mlr_d12 = 7.25 MAE for 39 4tshB lr_d8 = 11.20 mlr_d8 = 10.36 lr_d12 = 9.84 mlr_d12 = 9.13 MAE for 40 3vtoA lr_d8 = nan mlr_d8 = 2.14 lr_d12 = 3.74 mlr_d12 = 1.72 MAE for 41 2p3yA lr_d8 = 10.52 mlr_d8 = 10.27 lr_d12 = 10.04 mlr_d12 = 9.73 MAE for 42 3pcvA lr_d8 = 2.37 mlr_d8 = 2.20 lr_d12 = 1.78 mlr_d12 = 1.67 MAE for 43 3kfoA lr_d8 = 13.43 mlr_d8 = 12.21 lr_d12 = 11.06 mlr_d12 = 9.92 MAE for 44 3v6iB lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = nan MAE for 45 4htiA lr_d8 = 4.06 mlr_d8 = 3.82 lr_d12 = 3.09 mlr_d12 = 2.90 MAE for 46 2q0tA lr_d8 = 6.24 mlr_d8 = 5.67 lr_d12 = 5.05 mlr_d12 = 4.50 MAE for 47 4l3uA lr_d8 = 4.42 mlr_d8 = 4.23 
lr_d12 = 3.50 mlr_d12 = 3.29 MAE for 48 4pt1A lr_d8 = 7.36 mlr_d8 = 6.02 lr_d12 = 5.87 mlr_d12 = 4.94 MAE for 49 3c1qA lr_d8 = 2.81 mlr_d8 = 2.25 lr_d12 = 2.15 mlr_d12 = 1.83 MAE for 50 1ux5A lr_d8 = 6.29 mlr_d8 = 6.45 lr_d12 = 4.89 mlr_d12 = 4.90 MAE for 51 1h9mA lr_d8 = 4.20 mlr_d8 = 3.39 lr_d12 = 4.96 mlr_d12 = 3.98 MAE for 52 3oufA lr_d8 = 10.34 mlr_d8 = 8.18 lr_d12 = 8.21 mlr_d12 = 6.45 MAE for 53 4rt5A lr_d8 = 2.43 mlr_d8 = 2.58 lr_d12 = 2.51 mlr_d12 = 2.35 MAE for 54 3njcA lr_d8 = 6.44 mlr_d8 = 5.53 lr_d12 = 5.67 mlr_d12 = 5.04 MAE for 55 2q73A lr_d8 = 2.32 mlr_d8 = 2.81 lr_d12 = 1.65 mlr_d12 = 2.18 MAE for 56 1yz1A lr_d8 = 4.80 mlr_d8 = 4.62 lr_d12 = 4.10 mlr_d12 = 3.99 MAE for 57 5c50B lr_d8 = 6.72 mlr_d8 = 6.24 lr_d12 = 6.42 mlr_d12 = 6.29 MAE for 58 3hnxA lr_d8 = 9.78 mlr_d8 = 5.99 lr_d12 = 7.12 mlr_d12 = 5.27 MAE for 59 1knyA lr_d8 = 6.28 mlr_d8 = 5.35 lr_d12 = 5.18 mlr_d12 = 4.53 MAE for 60 1tr8A lr_d8 = 4.93 mlr_d8 = 3.42 lr_d12 = 3.75 mlr_d12 = 2.65 MAE for 61 4qicB lr_d8 = nan mlr_d8 = 15.30 lr_d12 = nan mlr_d12 = 12.15 MAE for 62 3f1iS lr_d8 = 2.58 mlr_d8 = 1.97 lr_d12 = 1.80 mlr_d12 = 1.41 MAE for 63 2fyuI lr_d8 = 19.37 mlr_d8 = 17.11 lr_d12 = 16.44 mlr_d12 = 14.93 MAE for 64 4ic9A lr_d8 = 10.09 mlr_d8 = 9.31 lr_d12 = 8.38 mlr_d12 = 7.78 MAE for 65 3iruA lr_d8 = 5.73 mlr_d8 = 5.29 lr_d12 = 4.96 mlr_d12 = 4.54 MAE for 66 2xu8A lr_d8 = 14.24 mlr_d8 = 10.88 lr_d12 = 11.01 mlr_d12 = 9.73 MAE for 67 3g7lA lr_d8 = 2.94 mlr_d8 = 1.40 lr_d12 = 2.09 mlr_d12 = 1.70 MAE for 68 3hshA lr_d8 = 6.93 mlr_d8 = 3.33 lr_d12 = 4.54 mlr_d12 = 2.89 MAE for 69 1vq0A lr_d8 = 8.05 mlr_d8 = 6.67 lr_d12 = 6.76 mlr_d12 = 5.89 MAE for 70 4z04A lr_d8 = 2.63 mlr_d8 = 2.53 lr_d12 = 2.23 mlr_d12 = 2.14 MAE for 71 2huhA lr_d8 = 5.84 mlr_d8 = 5.02 lr_d12 = 5.23 mlr_d12 = 4.79 MAE for 72 4dh2B lr_d8 = 6.14 mlr_d8 = 5.34 lr_d12 = 3.96 mlr_d12 = 3.53 MAE for 73 2p9xA lr_d8 = 14.58 mlr_d8 = 11.96 lr_d12 = 11.45 mlr_d12 = 9.18 MAE for 74 1m9zA lr_d8 = 8.92 mlr_d8 = 7.81 lr_d12 = 6.98 mlr_d12 = 6.38 MAE for 75 2czrA lr_d8 = 11.98 mlr_d8 = 9.48 lr_d12 = 9.09 mlr_d12 = 7.68 MAE for 76 2bh1X lr_d8 = 1.88 mlr_d8 = 1.98 lr_d12 = 1.57 mlr_d12 = 1.58 MAE for 77 3ghfA lr_d8 = 5.62 mlr_d8 = 4.97 lr_d12 = 5.28 mlr_d12 = 4.59 MAE for 78 4ui1C lr_d8 = 4.82 mlr_d8 = 4.34 lr_d12 = 3.41 mlr_d12 = 3.06 MAE for 79 2otaA lr_d8 = nan mlr_d8 = 2.09 lr_d12 = 1.26 mlr_d12 = 1.84 MAE for 80 1vk1A lr_d8 = 12.69 mlr_d8 = 11.96 lr_d12 = 10.76 mlr_d12 = 9.80 MAE for 81 1su1A lr_d8 = 4.65 mlr_d8 = 5.29 lr_d12 = 4.45 mlr_d12 = 4.69 MAE for 82 3sfvB lr_d8 = 14.81 mlr_d8 = 12.44 lr_d12 = 12.50 mlr_d12 = 10.75 MAE for 83 2f5tX lr_d8 = 12.54 mlr_d8 = 10.64 lr_d12 = 10.59 mlr_d12 = 9.32 MAE for 84 1xyiA lr_d8 = 8.36 mlr_d8 = 6.98 lr_d12 = 5.77 mlr_d12 = 4.90 MAE for 85 2okuA lr_d8 = 5.55 mlr_d8 = 5.02 lr_d12 = 4.33 mlr_d12 = 4.00 MAE for 86 2h5nA lr_d8 = 3.40 mlr_d8 = 3.23 lr_d12 = 2.66 mlr_d12 = 2.50 MAE for 87 1kptA lr_d8 = 4.22 mlr_d8 = 3.95 lr_d12 = 3.49 mlr_d12 = 3.29 MAE for 88 2qc5A lr_d8 = 4.59 mlr_d8 = 3.43 lr_d12 = 3.46 mlr_d12 = 2.69 MAE for 89 2hueC lr_d8 = 2.44 mlr_d8 = 1.70 lr_d12 = 2.28 mlr_d12 = 1.54 MAE for 90 2i5vO lr_d8 = 12.98 mlr_d8 = 7.58 lr_d12 = 8.46 mlr_d12 = 6.72 MAE for 91 2gs5A lr_d8 = 9.03 mlr_d8 = 8.22 lr_d12 = 8.45 mlr_d12 = 7.95 MAE for 92 4lmoA lr_d8 = 9.05 mlr_d8 = 8.47 lr_d12 = 7.82 mlr_d12 = 7.37 MAE for 93 4njcA lr_d8 = 5.16 mlr_d8 = 4.94 lr_d12 = 5.06 mlr_d12 = 4.33 MAE for 94 3ronA lr_d8 = 12.05 mlr_d8 = 10.35 lr_d12 = 10.27 mlr_d12 = 9.18 MAE for 95 3g1pA lr_d8 = 4.41 mlr_d8 = 3.83 lr_d12 = 3.50 mlr_d12 = 
3.13 MAE for 96 4m8aA lr_d8 = 9.70 mlr_d8 = 7.78 lr_d12 = 6.86 mlr_d12 = 5.39 MAE for 97 3ajfA lr_d8 = 10.94 mlr_d8 = 10.15 lr_d12 = 8.14 mlr_d12 = 7.44 MAE for 98 1j8uA lr_d8 = 9.36 mlr_d8 = 8.17 lr_d12 = 7.42 mlr_d12 = 6.71 MAE for 99 4u65A lr_d8 = 5.73 mlr_d8 = 4.78 lr_d12 = 4.99 mlr_d12 = 4.40 Average MAE : lr<8A = 7.4282 mlr<8A = 6.4906 lr<12A = 6.0200 mlr<12A = 5.4572 Precision for 0 - 3aa0A top_L5_lr = 0.6852 top_L_lr = 0.3667 top_Nc_lr = 0.3849 top_L5_mlr = 0.7778 top_L_mlr = 0.5667 top_Nc_mlr = 0.4787 (total_true_lr = 252 total_true_mlr = 399) Precision for 1 - 5cs0A top_L5_lr = 0.4884 top_L_lr = 0.2140 top_Nc_lr = 0.2617 top_L5_mlr = 0.5116 top_L_mlr = 0.2651 top_Nc_mlr = 0.2632 (total_true_lr = 149 total_true_mlr = 190) Precision for 2 - 2z51A top_L5_lr = 0.6774 top_L_lr = 0.4156 top_Nc_lr = 0.4236 top_L5_mlr = 0.8387 top_L_mlr = 0.6494 top_Nc_mlr = 0.5090 (total_true_lr = 144 total_true_mlr = 222) Precision for 3 - 3q64A top_L5_lr = 0.9677 top_L_lr = 0.6968 top_Nc_lr = 0.6855 top_L5_mlr = 1.0000 top_L_mlr = 0.9226 top_Nc_mlr = 0.7319 (total_true_lr = 159 total_true_mlr = 276) Precision for 4 - 4p3aA top_L5_lr = 1.0000 top_L_lr = 0.6000 top_Nc_lr = 0.6230 top_L5_mlr = 0.8571 top_L_mlr = 0.6429 top_Nc_mlr = 0.5750 (total_true_lr = 61 total_true_mlr = 80) Precision for 5 - 1nkzA top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0) Precision for 6 - 4d5tA top_L5_lr = 0.2667 top_L_lr = 0.1656 top_Nc_lr = 0.1556 top_L5_mlr = 0.3000 top_L_mlr = 0.1788 top_Nc_mlr = 0.1688 (total_true_lr = 135 total_true_mlr = 160) Precision for 7 - 1jogA top_L5_lr = 0.5556 top_L_lr = 0.3852 top_Nc_lr = 0.3981 top_L5_mlr = 0.6667 top_L_mlr = 0.4444 top_Nc_mlr = 0.4470 (total_true_lr = 103 total_true_mlr = 132) Precision for 8 - 4u89A top_L5_lr = 0.7778 top_L_lr = 0.4044 top_Nc_lr = 0.3488 top_L5_mlr = 0.8000 top_L_mlr = 0.6444 top_Nc_mlr = 0.4711 (total_true_lr = 301 total_true_mlr = 433) Precision for 9 - 3wisA top_L5_lr = 1.0000 top_L_lr = 0.8056 top_Nc_lr = 0.6560 top_L5_mlr = 1.0000 top_L_mlr = 0.8500 top_Nc_mlr = 0.6458 (total_true_lr = 282 total_true_mlr = 319) Precision for 10 - 1mkfA top_L5_lr = 0.1081 top_L_lr = 0.1482 top_Nc_lr = 0.1162 top_L5_mlr = 0.2162 top_L_mlr = 0.1941 top_Nc_mlr = 0.1382 (total_true_lr = 714 total_true_mlr = 854) Precision for 11 - 1kaeA top_L5_lr = 0.8506 top_L_lr = 0.6037 top_Nc_lr = 0.4585 top_L5_mlr = 0.8736 top_L_mlr = 0.6820 top_Nc_mlr = 0.4749 (total_true_lr = 711 total_true_mlr = 857) Precision for 12 - 4o8bA top_L5_lr = 0.9000 top_L_lr = 0.4851 top_Nc_lr = 0.6981 top_L5_mlr = 0.9000 top_L_mlr = 0.5446 top_Nc_mlr = 0.6267 (total_true_lr = 53 total_true_mlr = 75) Precision for 13 - 2gfqA top_L5_lr = 0.9483 top_L_lr = 0.7292 top_Nc_lr = 0.5073 top_L5_mlr = 0.9655 top_L_mlr = 0.7118 top_Nc_mlr = 0.4915 (total_true_lr = 615 total_true_mlr = 708) Precision for 14 - 4cmlA top_L5_lr = 1.0000 top_L_lr = 0.7871 top_Nc_lr = 0.5900 top_L5_mlr = 1.0000 top_L_mlr = 0.8871 top_Nc_mlr = 0.6252 (total_true_lr = 578 total_true_mlr = 723) Precision for 15 - 1vrmA top_L5_lr = 0.9677 top_L_lr = 0.7605 top_Nc_lr = 0.6226 top_L5_mlr = 0.9677 top_L_mlr = 0.8414 top_Nc_mlr = 0.6276 (total_true_lr = 469 total_true_mlr = 623) Precision for 16 - 4kwyA top_L5_lr = 0.9630 top_L_lr = 0.7591 top_Nc_lr = 0.6798 top_L5_mlr = 0.9630 top_L_mlr = 0.8394 top_Nc_mlr = 0.6878 (total_true_lr = 178 total_true_mlr = 237) Precision for 17 - 2yfaA top_L5_lr = 0.9362 top_L_lr = 0.6441 top_Nc_lr = 0.6967 top_L5_mlr = 0.9574 
top_L_mlr = 0.7585 top_Nc_mlr = 0.7197 (total_true_lr = 211 total_true_mlr = 264) Precision for 18 - 1tnfA top_L5_lr = 1.0000 top_L_lr = 0.8487 top_Nc_lr = 0.6263 top_L5_mlr = 1.0000 top_L_mlr = 0.8750 top_Nc_mlr = 0.6200 (total_true_lr = 289 total_true_mlr = 350) Precision for 19 - 3pivA top_L5_lr = 0.7742 top_L_lr = 0.4744 top_Nc_lr = 0.4416 top_L5_mlr = 0.7097 top_L_mlr = 0.4808 top_Nc_mlr = 0.4231 (total_true_lr = 197 total_true_mlr = 208) Precision for 20 - 1nw1A top_L5_lr = 0.9315 top_L_lr = 0.6137 top_Nc_lr = 0.5088 top_L5_mlr = 0.9863 top_L_mlr = 0.7068 top_Nc_mlr = 0.5494 (total_true_lr = 509 total_true_mlr = 648) Precision for 21 - 4ng0A top_L5_lr = 0.7895 top_L_lr = 0.5876 top_Nc_lr = 0.5285 top_L5_mlr = 0.8947 top_L_mlr = 0.7010 top_Nc_mlr = 0.5088 (total_true_lr = 123 total_true_mlr = 171) Precision for 22 - 4qt9A top_L5_lr = 0.7711 top_L_lr = 0.5386 top_Nc_lr = 0.4181 top_L5_mlr = 0.8072 top_L_mlr = 0.5870 top_Nc_mlr = 0.4251 (total_true_lr = 708 total_true_mlr = 868) Precision for 23 - 3no4A top_L5_lr = 0.9434 top_L_lr = 0.7121 top_Nc_lr = 0.5203 top_L5_mlr = 0.9434 top_L_mlr = 0.7462 top_Nc_mlr = 0.5084 (total_true_lr = 467 total_true_mlr = 535) Precision for 24 - 3qd7X top_L5_lr = 1.0000 top_L_lr = 0.5725 top_Nc_lr = 0.4365 top_L5_mlr = 1.0000 top_L_mlr = 0.7786 top_Nc_mlr = 0.5115 (total_true_lr = 181 total_true_mlr = 217) Precision for 25 - 3o4pA top_L5_lr = 0.9524 top_L_lr = 0.6242 top_Nc_lr = 0.4718 top_L5_mlr = 1.0000 top_L_mlr = 0.9013 top_Nc_mlr = 0.5918 (total_true_lr = 496 total_true_mlr = 806) Precision for 26 - 3agnA top_L5_lr = 0.9130 top_L_lr = 0.5351 top_Nc_lr = 0.4545 top_L5_mlr = 1.0000 top_L_mlr = 0.7193 top_Nc_mlr = 0.5399 (total_true_lr = 154 total_true_mlr = 213) Precision for 27 - 2yilA top_L5_lr = 1.0000 top_L_lr = 0.5802 top_Nc_lr = 0.5157 top_L5_mlr = 1.0000 top_L_mlr = 0.6794 top_Nc_mlr = 0.5474 (total_true_lr = 159 total_true_mlr = 232) Precision for 28 - 4jivD top_L5_lr = 0.8947 top_L_lr = 0.5054 top_Nc_lr = 0.4135 top_L5_mlr = 0.8947 top_L_mlr = 0.8280 top_Nc_mlr = 0.5604 (total_true_lr = 133 total_true_mlr = 207) Precision for 29 - 4fz4A top_L5_lr = 0.3548 top_L_lr = 0.1558 top_Nc_lr = 0.1691 top_L5_mlr = 0.5484 top_L_mlr = 0.2143 top_Nc_mlr = 0.2185 (total_true_lr = 136 total_true_mlr = 151) Precision for 30 - 2g3vA top_L5_lr = 0.1562 top_L_lr = 0.0311 top_Nc_lr = 0.0342 top_L5_mlr = 0.1250 top_L_mlr = 0.0373 top_Nc_mlr = 0.0385 (total_true_lr = 146 total_true_mlr = 156) Precision for 31 - 2o0qA top_L5_lr = 1.0000 top_L_lr = 0.7544 top_Nc_lr = 0.6374 top_L5_mlr = 1.0000 top_L_mlr = 0.7544 top_Nc_mlr = 0.6250 (total_true_lr = 171 total_true_mlr = 192) Precision for 32 - 4levA top_L5_lr = 0.6351 top_L_lr = 0.3035 top_Nc_lr = 0.2702 top_L5_mlr = 0.8784 top_L_mlr = 0.4770 top_Nc_mlr = 0.3664 (total_true_lr = 459 total_true_mlr = 595) Precision for 33 - 1t6t1 top_L5_lr = 0.9545 top_L_lr = 0.4907 top_Nc_lr = 0.4690 top_L5_mlr = 1.0000 top_L_mlr = 0.7037 top_Nc_mlr = 0.5570 (total_true_lr = 113 total_true_mlr = 158) Precision for 34 - 2fzpA top_L5_lr = 0.2222 top_L_lr = 0.0746 top_Nc_lr = 0.0904 top_L5_mlr = 0.6296 top_L_mlr = 0.2388 top_Nc_mlr = 0.1674 (total_true_lr = 177 total_true_mlr = 215) Precision for 35 - 1rj8A top_L5_lr = 1.0000 top_L_lr = 0.7786 top_Nc_lr = 0.5315 top_L5_mlr = 1.0000 top_L_mlr = 0.8143 top_Nc_mlr = 0.5184 (total_true_lr = 286 total_true_mlr = 353) Precision for 36 - 2z7fI top_L5_lr = 0.4000 top_L_lr = 0.3800 top_Nc_lr = 0.3778 top_L5_mlr = 0.5000 top_L_mlr = 0.4800 top_Nc_mlr = 0.4028 (total_true_lr = 45 total_true_mlr = 
72) Precision for 37 - 1dx5I top_L5_lr = 0.3750 top_L_lr = 0.1441 top_Nc_lr = 0.2619 top_L5_mlr = 0.7500 top_L_mlr = 0.3814 top_Nc_mlr = 0.3636 (total_true_lr = 42 total_true_mlr = 143) Precision for 38 - 2gsoA top_L5_lr = 0.9605 top_L_lr = 0.7723 top_Nc_lr = 0.5241 top_L5_mlr = 0.9737 top_L_mlr = 0.7775 top_Nc_mlr = 0.5011 (total_true_lr = 725 total_true_mlr = 888) Precision for 39 - 4tshB top_L5_lr = 0.8454 top_L_lr = 0.5381 top_Nc_lr = 0.4023 top_L5_mlr = 0.7216 top_L_mlr = 0.5278 top_Nc_mlr = 0.3978 (total_true_lr = 942 total_true_mlr = 1116) Precision for 40 - 3vtoA top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.6500 top_L_mlr = 0.1818 top_Nc_mlr = 0.6667 (total_true_lr = 0 total_true_mlr = 18) Precision for 41 - 2p3yA top_L5_lr = 0.9348 top_L_lr = 0.6356 top_Nc_lr = 0.4530 top_L5_mlr = 0.9348 top_L_mlr = 0.6920 top_Nc_mlr = 0.4398 (total_true_lr = 777 total_true_mlr = 939) Precision for 42 - 3pcvA top_L5_lr = 1.0000 top_L_lr = 0.6027 top_Nc_lr = 0.7000 top_L5_mlr = 1.0000 top_L_mlr = 0.7260 top_Nc_mlr = 0.7181 (total_true_lr = 120 total_true_mlr = 149) Precision for 43 - 3kfoA top_L5_lr = 0.3721 top_L_lr = 0.2441 top_Nc_lr = 0.2325 top_L5_mlr = 0.6279 top_L_mlr = 0.3239 top_Nc_mlr = 0.2809 (total_true_lr = 228 total_true_mlr = 267) Precision for 44 - 3v6iB top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0) Precision for 45 - 4htiA top_L5_lr = 0.6667 top_L_lr = 0.5909 top_Nc_lr = 0.5794 top_L5_mlr = 0.7222 top_L_mlr = 0.6477 top_Nc_mlr = 0.5725 (total_true_lr = 107 total_true_mlr = 138) Precision for 46 - 2q0tA top_L5_lr = 0.7308 top_L_lr = 0.4264 top_Nc_lr = 0.4360 top_L5_mlr = 0.7308 top_L_mlr = 0.4806 top_Nc_mlr = 0.4621 (total_true_lr = 250 total_true_mlr = 290) Precision for 47 - 4l3uA top_L5_lr = 0.9200 top_L_lr = 0.4878 top_Nc_lr = 0.5354 top_L5_mlr = 0.9200 top_L_mlr = 0.5528 top_Nc_mlr = 0.5397 (total_true_lr = 99 total_true_mlr = 126) Precision for 48 - 4pt1A top_L5_lr = 0.8846 top_L_lr = 0.4574 top_Nc_lr = 0.4957 top_L5_mlr = 1.0000 top_L_mlr = 0.6279 top_Nc_mlr = 0.5506 (total_true_lr = 115 total_true_mlr = 158) Precision for 49 - 3c1qA top_L5_lr = 1.0000 top_L_lr = 0.4087 top_Nc_lr = 0.6452 top_L5_mlr = 1.0000 top_L_mlr = 0.6783 top_Nc_mlr = 0.7156 (total_true_lr = 62 total_true_mlr = 109) Precision for 50 - 1ux5A top_L5_lr = 0.8659 top_L_lr = 0.4623 top_Nc_lr = 0.4906 top_L5_mlr = 0.8659 top_L_mlr = 0.4720 top_Nc_mlr = 0.4697 (total_true_lr = 373 total_true_mlr = 413) Precision for 51 - 1h9mA top_L5_lr = 1.0000 top_L_lr = 0.9078 top_Nc_lr = 0.7080 top_L5_mlr = 1.0000 top_L_mlr = 0.9716 top_Nc_mlr = 0.7675 (total_true_lr = 226 total_true_mlr = 314) Precision for 52 - 3oufA top_L5_lr = 0.3684 top_L_lr = 0.2366 top_Nc_lr = 0.2800 top_L5_mlr = 0.4737 top_L_mlr = 0.3118 top_Nc_mlr = 0.3699 (total_true_lr = 50 total_true_mlr = 73) Precision for 53 - 4rt5A top_L5_lr = 1.0000 top_L_lr = 0.7723 top_Nc_lr = 0.7714 top_L5_mlr = 1.0000 top_L_mlr = 0.8416 top_Nc_mlr = 0.7438 (total_true_lr = 105 total_true_mlr = 160) Precision for 54 - 3njcA top_L5_lr = 0.8710 top_L_lr = 0.5033 top_Nc_lr = 0.5352 top_L5_mlr = 0.7419 top_L_mlr = 0.6209 top_Nc_mlr = 0.5521 (total_true_lr = 142 total_true_mlr = 192) Precision for 55 - 2q73A top_L5_lr = 1.0000 top_L_lr = 0.5111 top_Nc_lr = 0.7455 top_L5_mlr = 1.0000 top_L_mlr = 0.5667 top_Nc_mlr = 0.6714 (total_true_lr = 55 total_true_mlr = 70) Precision for 56 - 1yz1A top_L5_lr = 1.0000 top_L_lr = 0.6918 top_Nc_lr = 0.6138 top_L5_mlr = 0.9310 top_L_mlr = 
0.8151 top_Nc_mlr = 0.6076 (total_true_lr = 189 total_true_mlr = 237) Precision for 57 - 5c50B top_L5_lr = 1.0000 top_L_lr = 0.6649 top_Nc_lr = 0.5000 top_L5_mlr = 1.0000 top_L_mlr = 0.7730 top_Nc_mlr = 0.5401 (total_true_lr = 288 total_true_mlr = 337) Precision for 58 - 3hnxA top_L5_lr = 0.7727 top_L_lr = 0.4352 top_Nc_lr = 0.4536 top_L5_mlr = 1.0000 top_L_mlr = 0.7963 top_Nc_mlr = 0.6270 (total_true_lr = 97 total_true_mlr = 185) Precision for 59 - 1knyA top_L5_lr = 0.8039 top_L_lr = 0.5415 top_Nc_lr = 0.5125 top_L5_mlr = 0.9216 top_L_mlr = 0.6877 top_Nc_mlr = 0.5610 (total_true_lr = 279 total_true_mlr = 385) Precision for 60 - 1tr8A top_L5_lr = 0.9444 top_L_lr = 0.4565 top_Nc_lr = 0.6875 top_L5_mlr = 0.9444 top_L_mlr = 0.7609 top_Nc_mlr = 0.7300 (total_true_lr = 48 total_true_mlr = 100) Precision for 61 - 4qicB top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.0000 top_L_mlr = 0.0000 top_Nc_mlr = 0.0000 (total_true_lr = 0 total_true_mlr = 3) Precision for 62 - 3f1iS top_L5_lr = 0.6667 top_L_lr = 0.1818 top_Nc_lr = 0.7143 top_L5_mlr = 0.9333 top_L_mlr = 0.3117 top_Nc_mlr = 0.7500 (total_true_lr = 14 total_true_mlr = 24) Precision for 63 - 2fyuI top_L5_lr = 0.0000 top_L_lr = 0.0000 top_Nc_lr = 0.0000 top_L5_mlr = 0.0000 top_L_mlr = 0.0000 top_Nc_mlr = 0.0000 (total_true_lr = 10 total_true_mlr = 23) Precision for 64 - 4ic9A top_L5_lr = 0.4231 top_L_lr = 0.2538 top_Nc_lr = 0.2872 top_L5_mlr = 0.6154 top_L_mlr = 0.3385 top_Nc_mlr = 0.3147 (total_true_lr = 94 total_true_mlr = 143) Precision for 65 - 3iruA top_L5_lr = 0.9818 top_L_lr = 0.7428 top_Nc_lr = 0.5801 top_L5_mlr = 0.9818 top_L_mlr = 0.8188 top_Nc_mlr = 0.5959 (total_true_lr = 443 total_true_mlr = 532) Precision for 66 - 2xu8A top_L5_lr = 0.3043 top_L_lr = 0.0948 top_Nc_lr = 0.1410 top_L5_mlr = 0.9130 top_L_mlr = 0.3190 top_Nc_mlr = 0.2721 (total_true_lr = 78 total_true_mlr = 136) Precision for 67 - 3g7lA top_L5_lr = 0.5455 top_L_lr = 0.2182 top_Nc_lr = 0.5714 top_L5_mlr = 1.0000 top_L_mlr = 0.8182 top_Nc_mlr = 0.8214 (total_true_lr = 14 total_true_mlr = 56) Precision for 68 - 3hshA top_L5_lr = 0.0000 top_L_lr = 0.0179 top_Nc_lr = 0.0000 top_L5_mlr = 0.7273 top_L_mlr = 0.5893 top_Nc_mlr = 0.6000 (total_true_lr = 2 total_true_mlr = 60) Precision for 69 - 1vq0A top_L5_lr = 0.9655 top_L_lr = 0.6000 top_Nc_lr = 0.4903 top_L5_mlr = 0.9655 top_L_mlr = 0.7414 top_Nc_mlr = 0.5623 (total_true_lr = 414 total_true_mlr = 562) Precision for 70 - 4z04A top_L5_lr = 1.0000 top_L_lr = 0.7419 top_Nc_lr = 0.6516 top_L5_mlr = 1.0000 top_L_mlr = 0.8790 top_Nc_mlr = 0.6579 (total_true_lr = 155 total_true_mlr = 228) Precision for 71 - 2huhA top_L5_lr = 1.0000 top_L_lr = 0.7211 top_Nc_lr = 0.5605 top_L5_mlr = 1.0000 top_L_mlr = 0.7687 top_Nc_mlr = 0.6102 (total_true_lr = 223 total_true_mlr = 295) Precision for 72 - 4dh2B top_L5_lr = 1.0000 top_L_lr = 0.4722 top_Nc_lr = 0.5079 top_L5_mlr = 1.0000 top_L_mlr = 0.5694 top_Nc_mlr = 0.5625 (total_true_lr = 63 total_true_mlr = 80) Precision for 73 - 2p9xA top_L5_lr = 0.3000 top_L_lr = 0.1327 top_Nc_lr = 0.1905 top_L5_mlr = 0.5500 top_L_mlr = 0.2449 top_Nc_mlr = 0.2444 (total_true_lr = 63 total_true_mlr = 90) Precision for 74 - 1m9zA top_L5_lr = 0.9048 top_L_lr = 0.4381 top_Nc_lr = 0.3476 top_L5_mlr = 1.0000 top_L_mlr = 0.6000 top_Nc_mlr = 0.4310 (total_true_lr = 164 total_true_mlr = 239) Precision for 75 - 2czrA top_L5_lr = 0.5556 top_L_lr = 0.1947 top_Nc_lr = 0.2222 top_L5_mlr = 0.7333 top_L_mlr = 0.3894 top_Nc_mlr = 0.3012 (total_true_lr = 189 total_true_mlr = 342) Precision for 76 - 2bh1X top_L5_lr = 
0.9286 top_L_lr = 0.5735 top_Nc_lr = 0.7255 top_L5_mlr = 0.9286 top_L_mlr = 0.8235 top_Nc_mlr = 0.7093 (total_true_lr = 51 total_true_mlr = 86) Precision for 77 - 3ghfA top_L5_lr = 0.7500 top_L_lr = 0.4700 top_Nc_lr = 0.4476 top_L5_mlr = 0.9000 top_L_mlr = 0.5400 top_Nc_mlr = 0.4841 (total_true_lr = 105 total_true_mlr = 126) Precision for 78 - 4ui1C top_L5_lr = 0.7143 top_L_lr = 0.4583 top_Nc_lr = 0.5000 top_L5_mlr = 0.8571 top_L_mlr = 0.5139 top_Nc_mlr = 0.5195 (total_true_lr = 60 total_true_mlr = 77) Precision for 79 - 2otaA top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.2308 top_L_mlr = 0.0746 top_Nc_mlr = 0.2000 (total_true_lr = 0 total_true_mlr = 5) Precision for 80 - 1vk1A top_L5_lr = 0.5870 top_L_lr = 0.4052 top_Nc_lr = 0.3166 top_L5_mlr = 0.6087 top_L_mlr = 0.4440 top_Nc_mlr = 0.3247 (total_true_lr = 398 total_true_mlr = 462) Precision for 81 - 1su1A top_L5_lr = 1.0000 top_L_lr = 0.7065 top_Nc_lr = 0.6000 top_L5_mlr = 0.9730 top_L_mlr = 0.7935 top_Nc_mlr = 0.5697 (total_true_lr = 270 total_true_mlr = 423) Precision for 82 - 3sfvB top_L5_lr = 0.3971 top_L_lr = 0.1433 top_Nc_lr = 0.2000 top_L5_mlr = 0.4559 top_L_mlr = 0.2515 top_Nc_mlr = 0.2522 (total_true_lr = 205 total_true_mlr = 341) Precision for 83 - 2f5tX top_L5_lr = 0.8511 top_L_lr = 0.4249 top_Nc_lr = 0.2895 top_L5_mlr = 0.8511 top_L_mlr = 0.5794 top_Nc_mlr = 0.3826 (total_true_lr = 373 total_true_mlr = 507) Precision for 84 - 1xyiA top_L5_lr = 0.0769 top_L_lr = 0.0909 top_Nc_lr = 0.0811 top_L5_mlr = 0.6154 top_L_mlr = 0.3333 top_Nc_mlr = 0.3205 (total_true_lr = 37 total_true_mlr = 78) Precision for 85 - 2okuA top_L5_lr = 0.6250 top_L_lr = 0.4754 top_Nc_lr = 0.4786 top_L5_mlr = 0.7083 top_L_mlr = 0.5656 top_Nc_mlr = 0.5000 (total_true_lr = 117 total_true_mlr = 142) Precision for 86 - 2h5nA top_L5_lr = 0.8846 top_L_lr = 0.5000 top_Nc_lr = 0.5038 top_L5_mlr = 0.8462 top_L_mlr = 0.5152 top_Nc_mlr = 0.5000 (total_true_lr = 131 total_true_mlr = 148) Precision for 87 - 1kptA top_L5_lr = 0.4286 top_L_lr = 0.4952 top_Nc_lr = 0.4286 top_L5_mlr = 0.4286 top_L_mlr = 0.5524 top_Nc_mlr = 0.4541 (total_true_lr = 175 total_true_mlr = 229) Precision for 88 - 2qc5A top_L5_lr = 1.0000 top_L_lr = 0.7282 top_Nc_lr = 0.6364 top_L5_mlr = 1.0000 top_L_mlr = 0.9362 top_Nc_mlr = 0.7055 (total_true_lr = 374 total_true_mlr = 686) Precision for 89 - 2hueC top_L5_lr = 0.4375 top_L_lr = 0.1463 top_Nc_lr = 0.5000 top_L5_mlr = 0.8750 top_L_mlr = 0.3537 top_Nc_mlr = 0.5676 (total_true_lr = 14 total_true_mlr = 37) Precision for 90 - 2i5vO top_L5_lr = 0.0204 top_L_lr = 0.0691 top_Nc_lr = 0.0312 top_L5_mlr = 0.9388 top_L_mlr = 0.5813 top_Nc_mlr = 0.4430 (total_true_lr = 96 total_true_mlr = 386) Precision for 91 - 2gs5A top_L5_lr = 0.9487 top_L_lr = 0.6062 top_Nc_lr = 0.4634 top_L5_mlr = 1.0000 top_L_mlr = 0.7098 top_Nc_mlr = 0.5026 (total_true_lr = 287 total_true_mlr = 390) Precision for 92 - 4lmoA top_L5_lr = 0.7000 top_L_lr = 0.3785 top_Nc_lr = 0.4043 top_L5_mlr = 0.7800 top_L_mlr = 0.4622 top_Nc_mlr = 0.4175 (total_true_lr = 230 total_true_mlr = 285) Precision for 93 - 4njcA top_L5_lr = 1.0000 top_L_lr = 0.7000 top_Nc_lr = 0.6825 top_L5_mlr = 1.0000 top_L_mlr = 0.8333 top_Nc_mlr = 0.6778 (total_true_lr = 63 total_true_mlr = 90) Precision for 94 - 3ronA top_L5_lr = 0.8750 top_L_lr = 0.5721 top_Nc_lr = 0.3978 top_L5_mlr = 0.9750 top_L_mlr = 0.7512 top_Nc_mlr = 0.4574 (total_true_lr = 357 total_true_mlr = 446) Precision for 95 - 3g1pA top_L5_lr = 0.9800 top_L_lr = 0.7349 top_Nc_lr = 0.5721 top_L5_mlr = 1.0000 top_L_mlr = 0.8635 top_Nc_mlr = 
0.6175 (total_true_lr = 402 total_true_mlr = 570) Precision for 96 - 4m8aA top_L5_lr = 0.3077 top_L_lr = 0.2537 top_Nc_lr = 0.2778 top_L5_mlr = 0.7692 top_L_mlr = 0.3284 top_Nc_mlr = 0.2824 (total_true_lr = 54 total_true_mlr = 85) Precision for 97 - 3ajfA top_L5_lr = 0.1667 top_L_lr = 0.0870 top_Nc_lr = 0.0862 top_L5_mlr = 0.2778 top_L_mlr = 0.1196 top_Nc_mlr = 0.1250 (total_true_lr = 58 total_true_mlr = 80) Precision for 98 - 1j8uA top_L5_lr = 0.8033 top_L_lr = 0.4788 top_Nc_lr = 0.4113 top_L5_mlr = 0.8361 top_L_mlr = 0.5570 top_Nc_mlr = 0.4299 (total_true_lr = 423 total_true_mlr = 535) Precision for 99 - 4u65A top_L5_lr = 0.8462 top_L_lr = 0.5496 top_Nc_lr = 0.5338 top_L5_mlr = 1.0000 top_L_mlr = 0.6947 top_Nc_mlr = 0.5926 (total_true_lr = 148 total_true_mlr = 189)
Average Precision: top_L5_lr = 73.77 top_L_lr = 47.24 top_Nc_lr = 44.86 top_L5_mlr = 80.43 top_L_mlr = 58.03 top_Nc_mlr = 48.06
Evaluate test set..
Model params: L 512 num_blocks 128 width 64 expected_n_channels 57
150/150 [==============================] - 887s 6s/step
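For reference, the per-target lr/mlr MAE values reported below could be reproduced with a short routine along the following lines. This is a minimal sketch, not the evaluation code that produced this log; it assumes that lr denotes long-range residue pairs (sequence separation >= 24), mlr medium-plus-long-range pairs (separation >= 12), and that the d8/d12 suffixes restrict the average to pairs whose true distance is below 8 A / 12 A. Function and variable names are illustrative.

import numpy as np

def mae_report(pred, true, min_sep=24, dist_cap=8.0):
    """Mean absolute error between predicted and true distance maps (sketch).

    pred, true : (L, L) arrays of inter-residue distances in Angstroms;
                 entries may be NaN where the structure has missing residues.
    min_sep    : minimum sequence separation |i - j| (assumed 24 for 'lr', 12 for 'mlr').
    dist_cap   : score only pairs whose TRUE distance is below this cutoff
                 (assumed 8.0 for '_d8', 12.0 for '_d12').
    Returns NaN when no pair qualifies, matching the 'nan' entries in the log.
    """
    L = true.shape[0]
    i, j = np.triu_indices(L, k=min_sep)          # upper triangle, |i - j| >= min_sep
    mask = np.isfinite(true[i, j]) & np.isfinite(pred[i, j]) & (true[i, j] < dist_cap)
    if mask.sum() == 0:
        return float('nan')
    return float(np.mean(np.abs(pred[i, j][mask] - true[i, j][mask])))

# Example of the four numbers printed per target (P = predicted map, Y = true map):
#   lr_d8  = mae_report(P, Y, 24, 8.0)    mlr_d8  = mae_report(P, Y, 12, 8.0)
#   lr_d12 = mae_report(P, Y, 24, 12.0)   mlr_d12 = mae_report(P, Y, 12, 12.0)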
MAE for 0 1a3aA lr_d8 = 3.61 mlr_d8 = 3.26 lr_d12 = 3.01 mlr_d12 = 2.77 MAE for 1 1a6mA lr_d8 = 2.24 mlr_d8 = 2.31 lr_d12 = 2.05 mlr_d12 = 2.09 MAE for 2 1a70A lr_d8 = 3.44 mlr_d8 = 3.50 lr_d12 = 4.09 mlr_d12 = 4.04 MAE for 3 1aapA lr_d8 = 6.07 mlr_d8 = 4.99 lr_d12 = 5.04 mlr_d12 = 4.16 MAE for 4 1abaA lr_d8 = 2.10 mlr_d8 = 2.49 lr_d12 = 2.15 mlr_d12 = 2.35 MAE for 5 1ag6A lr_d8 = 4.63 mlr_d8 = 4.60 lr_d12 = 4.09 mlr_d12 = 4.23 MAE for 6 1aoeA lr_d8 = 4.15 mlr_d8 = 3.91 lr_d12 = 3.68 mlr_d12 = 3.55 MAE for 7 1atlA lr_d8 = 6.37 mlr_d8 = 6.09 lr_d12 = 5.76 mlr_d12 = 5.35 MAE for 8 1atzA lr_d8 = 1.87 mlr_d8 = 2.14 lr_d12 = 1.82 mlr_d12 = 1.94 MAE for 9 1avsA lr_d8 = 4.17 mlr_d8 = 3.45 lr_d12 = 2.53 mlr_d12 = 2.21 MAE for 10 1bdoA lr_d8 = 2.78 mlr_d8 = 2.27 lr_d12 = 2.74 mlr_d12 = 2.42 MAE for 11 1bebA lr_d8 = 7.19 mlr_d8 = 5.36 lr_d12 = 5.72 mlr_d12 = 4.79 MAE for 12 1behA lr_d8 = 6.70 mlr_d8 = 6.40 lr_d12 = 5.84 mlr_d12 = 5.65 MAE for 13 1bkrA lr_d8 = 4.28 mlr_d8 = 3.70 lr_d12 = 3.39 mlr_d12 = 2.89 MAE for 14 1brfA lr_d8 = 2.62 mlr_d8 = 3.00 lr_d12 = 2.10 mlr_d12 = 2.35 MAE for 15 1bsgA lr_d8 = 4.67 mlr_d8 = 4.26 lr_d12 = 3.92 mlr_d12 = 3.69 MAE for 16 1c44A lr_d8 = 6.92 mlr_d8 = 6.15 lr_d12 = 5.77 mlr_d12 = 5.15 MAE for 17 1c52A lr_d8 = 7.87 mlr_d8 = 7.00 lr_d12 = 6.17 mlr_d12 = 5.69 MAE for 18 1c9oA lr_d8 = 3.91 mlr_d8 = 3.23 lr_d12 = 3.52 mlr_d12 = 3.16 MAE for 19 1cc8A lr_d8 = 1.52 mlr_d8 = 1.85 lr_d12 = 1.38 mlr_d12 = 1.56 MAE for 20 1chdA lr_d8 = 3.28 mlr_d8 = 2.94 lr_d12 = 2.77 mlr_d12 = 2.48 MAE for 21 1cjwA lr_d8 = 5.57 mlr_d8 = 4.81 lr_d12 = 4.40 mlr_d12 = 3.96 MAE for 22 1ckeA lr_d8 = 3.83 mlr_d8 = 3.51 lr_d12 = 3.22 mlr_d12 = 2.84 MAE for 23 1ctfA lr_d8 = 1.47 mlr_d8 = 1.68 lr_d12 = 1.86 mlr_d12 = 2.01 MAE
for 24 1cxyA lr_d8 = 4.43 mlr_d8 = 4.16 lr_d12 = 3.94 mlr_d12 = 3.46 MAE for 25 1cznA lr_d8 = 3.56 mlr_d8 = 3.70 lr_d12 = 2.94 mlr_d12 = 2.97 MAE for 26 1d0qA lr_d8 = 3.34 mlr_d8 = 3.05 lr_d12 = 2.45 mlr_d12 = 2.41 MAE for 27 1d1qA lr_d8 = 2.94 mlr_d8 = 2.84 lr_d12 = 2.41 mlr_d12 = 2.33 MAE for 28 1d4oA lr_d8 = 5.24 mlr_d8 = 5.42 lr_d12 = 4.33 mlr_d12 = 4.39 MAE for 29 1dbxA lr_d8 = 3.95 mlr_d8 = 3.55 lr_d12 = 3.27 mlr_d12 = 3.03 MAE for 30 1dixA lr_d8 = 9.83 mlr_d8 = 8.76 lr_d12 = 8.32 mlr_d12 = 7.40 MAE for 31 1dlwA lr_d8 = 2.87 mlr_d8 = 2.74 lr_d12 = 2.10 mlr_d12 = 2.02 MAE for 32 1dmgA lr_d8 = 6.14 mlr_d8 = 5.71 lr_d12 = 6.05 mlr_d12 = 5.57 MAE for 33 1dqgA lr_d8 = 16.66 mlr_d8 = 12.98 lr_d12 = 13.79 mlr_d12 = 11.13 MAE for 34 1dsxA lr_d8 = 5.84 mlr_d8 = 5.37 lr_d12 = 4.85 mlr_d12 = 4.41 MAE for 35 1eazA lr_d8 = 3.35 mlr_d8 = 2.21 lr_d12 = 2.50 mlr_d12 = 2.07 MAE for 36 1ej0A lr_d8 = 2.93 mlr_d8 = 2.89 lr_d12 = 2.52 mlr_d12 = 2.46 MAE for 37 1ej8A lr_d8 = 7.74 mlr_d8 = 7.42 lr_d12 = 6.47 mlr_d12 = 6.44 MAE for 38 1ek0A lr_d8 = 2.50 mlr_d8 = 2.56 lr_d12 = 2.16 mlr_d12 = 2.16 MAE for 39 1f6bA lr_d8 = 4.24 mlr_d8 = 4.68 lr_d12 = 4.21 mlr_d12 = 4.31 MAE for 40 1fcyA lr_d8 = 11.70 mlr_d8 = 10.80 lr_d12 = 9.29 mlr_d12 = 8.49 MAE for 41 1fk5A lr_d8 = 8.12 mlr_d8 = 7.54 lr_d12 = 6.27 mlr_d12 = 5.67 MAE for 42 1fl0A lr_d8 = 7.18 mlr_d8 = 6.37 lr_d12 = 5.92 mlr_d12 = 5.30 MAE for 43 1fnaA lr_d8 = 3.34 mlr_d8 = 2.91 lr_d12 = 2.74 mlr_d12 = 2.52 MAE for 44 1fqtA lr_d8 = 3.28 mlr_d8 = 3.34 lr_d12 = 2.95 mlr_d12 = 2.63 MAE for 45 1fvgA lr_d8 = 4.45 mlr_d8 = 4.16 lr_d12 = 3.62 mlr_d12 = 3.41 MAE for 46 1fvkA lr_d8 = 5.58 mlr_d8 = 5.44 lr_d12 = 5.20 mlr_d12 = 4.97 MAE for 47 1fx2A lr_d8 = 4.06 mlr_d8 = 3.55 lr_d12 = 3.37 mlr_d12 = 2.95 MAE for 48 1g2rA lr_d8 = 4.22 mlr_d8 = 3.94 lr_d12 = 3.63 mlr_d12 = 3.30 MAE for 49 1g9oA lr_d8 = 2.31 mlr_d8 = 2.12 lr_d12 = 2.58 mlr_d12 = 2.35 MAE for 50 1gbsA lr_d8 = 11.37 mlr_d8 = 9.57 lr_d12 = 8.96 mlr_d12 = 7.66 MAE for 51 1gmiA lr_d8 = 3.35 mlr_d8 = 3.07 lr_d12 = 3.22 mlr_d12 = 2.97 MAE for 52 1gmxA lr_d8 = 4.06 mlr_d8 = 3.53 lr_d12 = 3.08 mlr_d12 = 2.78 MAE for 53 1guuA lr_d8 = 2.66 mlr_d8 = 2.35 lr_d12 = 1.87 mlr_d12 = 1.70 MAE for 54 1gz2A lr_d8 = 4.49 mlr_d8 = 4.32 lr_d12 = 3.97 mlr_d12 = 3.73 MAE for 55 1gzcA lr_d8 = 5.43 mlr_d8 = 5.52 lr_d12 = 5.31 mlr_d12 = 5.28 MAE for 56 1h0pA lr_d8 = 4.47 mlr_d8 = 4.09 lr_d12 = 4.18 mlr_d12 = 3.84 MAE for 57 1h2eA lr_d8 = 3.15 mlr_d8 = 2.91 lr_d12 = 3.00 mlr_d12 = 2.76 MAE for 58 1h4xA lr_d8 = 3.18 mlr_d8 = 3.14 lr_d12 = 3.57 mlr_d12 = 3.32 MAE for 59 1h98A lr_d8 = 9.26 mlr_d8 = 8.24 lr_d12 = 8.12 mlr_d12 = 6.66 MAE for 60 1hdoA lr_d8 = 2.94 mlr_d8 = 2.80 lr_d12 = 2.22 mlr_d12 = 2.19 MAE for 61 1hfcA lr_d8 = 7.51 mlr_d8 = 6.86 lr_d12 = 6.53 mlr_d12 = 5.84 MAE for 62 1hh8A lr_d8 = 9.30 mlr_d8 = 6.64 lr_d12 = 7.54 mlr_d12 = 5.77 MAE for 63 1htwA lr_d8 = 2.52 mlr_d8 = 2.84 lr_d12 = 2.16 mlr_d12 = 2.46 MAE for 64 1hxnA lr_d8 = 12.51 mlr_d8 = 9.52 lr_d12 = 10.31 mlr_d12 = 8.17 MAE for 65 1i1jA lr_d8 = 10.48 mlr_d8 = 7.90 lr_d12 = 8.82 mlr_d12 = 7.48 MAE for 66 1i1nA lr_d8 = 5.15 mlr_d8 = 4.35 lr_d12 = 4.14 mlr_d12 = 3.73 MAE for 67 1i4jA lr_d8 = 2.01 mlr_d8 = 2.00 lr_d12 = 2.07 mlr_d12 = 1.98 MAE for 68 1i58A lr_d8 = 3.41 mlr_d8 = 3.20 lr_d12 = 2.74 mlr_d12 = 2.60 MAE for 69 1i5gA lr_d8 = 5.45 mlr_d8 = 5.81 lr_d12 = 4.32 mlr_d12 = 4.50 MAE for 70 1i71A lr_d8 = 5.05 mlr_d8 = 5.10 lr_d12 = 3.78 mlr_d12 = 4.11 MAE for 71 1ihzA lr_d8 = 4.63 mlr_d8 = 3.84 lr_d12 = 3.99 mlr_d12 = 3.44 MAE for 72 1iibA lr_d8 = 1.81 mlr_d8 = 1.77 
lr_d12 = 1.57 mlr_d12 = 1.50 MAE for 73 1im5A lr_d8 = 4.36 mlr_d8 = 4.56 lr_d12 = 3.88 mlr_d12 = 3.95 MAE for 74 1iwdA lr_d8 = 4.67 mlr_d8 = 4.31 lr_d12 = 3.69 mlr_d12 = 3.58 MAE for 75 1j3aA lr_d8 = 6.80 mlr_d8 = 6.52 lr_d12 = 6.09 mlr_d12 = 5.62 MAE for 76 1jbeA lr_d8 = 1.55 mlr_d8 = 1.46 lr_d12 = 1.46 mlr_d12 = 1.36 MAE for 77 1jbkA lr_d8 = 4.46 mlr_d8 = 4.43 lr_d12 = 4.16 mlr_d12 = 4.10 MAE for 78 1jfuA lr_d8 = 6.01 mlr_d8 = 5.65 lr_d12 = 4.94 mlr_d12 = 4.56 MAE for 79 1jfxA lr_d8 = 4.09 mlr_d8 = 3.71 lr_d12 = 3.73 mlr_d12 = 3.45 MAE for 80 1jkxA lr_d8 = 3.43 mlr_d8 = 2.85 lr_d12 = 2.94 mlr_d12 = 2.56 MAE for 81 1jl1A lr_d8 = 3.58 mlr_d8 = 3.16 lr_d12 = 3.38 mlr_d12 = 3.08 MAE for 82 1jo0A lr_d8 = 4.45 mlr_d8 = 3.97 lr_d12 = 3.76 mlr_d12 = 3.31 MAE for 83 1jo8A lr_d8 = 3.65 mlr_d8 = 2.21 lr_d12 = 3.47 mlr_d12 = 2.47 MAE for 84 1josA lr_d8 = 2.40 mlr_d8 = 2.13 lr_d12 = 2.49 mlr_d12 = 2.21 MAE for 85 1jvwA lr_d8 = 4.39 mlr_d8 = 3.86 lr_d12 = 4.20 mlr_d12 = 3.83 MAE for 86 1jwqA lr_d8 = 2.16 mlr_d8 = 2.19 lr_d12 = 2.12 mlr_d12 = 2.10 MAE for 87 1jyhA lr_d8 = 2.18 mlr_d8 = 2.10 lr_d12 = 2.88 mlr_d12 = 2.69 MAE for 88 1k6kA lr_d8 = 2.85 mlr_d8 = 2.63 lr_d12 = 2.09 mlr_d12 = 1.94 MAE for 89 1k7cA lr_d8 = 5.63 mlr_d8 = 6.30 lr_d12 = 5.14 mlr_d12 = 5.39 MAE for 90 1k7jA lr_d8 = 3.19 mlr_d8 = 3.10 lr_d12 = 2.81 mlr_d12 = 2.72 MAE for 91 1kidA lr_d8 = 5.17 mlr_d8 = 4.87 lr_d12 = 4.57 mlr_d12 = 4.21 MAE for 92 1kq6A lr_d8 = 8.77 mlr_d8 = 6.64 lr_d12 = 6.73 mlr_d12 = 5.69 MAE for 93 1kqrA lr_d8 = 10.81 mlr_d8 = 9.15 lr_d12 = 8.37 mlr_d12 = 7.42 MAE for 94 1ktgA lr_d8 = 4.45 mlr_d8 = 4.23 lr_d12 = 3.86 mlr_d12 = 3.63 MAE for 95 1ku3A lr_d8 = 6.68 mlr_d8 = 5.41 lr_d12 = 5.54 mlr_d12 = 4.02 MAE for 96 1kw4A lr_d8 = 7.83 mlr_d8 = 6.16 lr_d12 = 6.98 mlr_d12 = 5.05 MAE for 97 1lm4A lr_d8 = 7.52 mlr_d8 = 6.67 lr_d12 = 6.01 mlr_d12 = 5.57 MAE for 98 1lo7A lr_d8 = 2.38 mlr_d8 = 2.04 lr_d12 = 2.20 mlr_d12 = 2.02 MAE for 99 1lpyA lr_d8 = 4.08 mlr_d8 = 4.43 lr_d12 = 3.33 mlr_d12 = 3.62 MAE for 100 1m4jA lr_d8 = 5.06 mlr_d8 = 4.54 lr_d12 = 4.24 mlr_d12 = 3.97 MAE for 101 1m8aA lr_d8 = 5.02 mlr_d8 = 3.49 lr_d12 = 3.36 mlr_d12 = 2.85 MAE for 102 1mk0A lr_d8 = 5.41 mlr_d8 = 5.51 lr_d12 = 4.55 mlr_d12 = 4.69 MAE for 103 1mugA lr_d8 = 4.14 mlr_d8 = 4.60 lr_d12 = 3.78 mlr_d12 = 4.10 MAE for 104 1nb9A lr_d8 = 4.99 mlr_d8 = 4.16 lr_d12 = 4.51 mlr_d12 = 4.04 MAE for 105 1ne2A lr_d8 = 9.28 mlr_d8 = 8.27 lr_d12 = 7.07 mlr_d12 = 6.52 MAE for 106 1npsA lr_d8 = 5.84 mlr_d8 = 5.22 lr_d12 = 5.23 mlr_d12 = 4.58 MAE for 107 1nrvA lr_d8 = 7.57 mlr_d8 = 5.84 lr_d12 = 5.76 mlr_d12 = 4.59 MAE for 108 1ny1A lr_d8 = 6.24 mlr_d8 = 5.88 lr_d12 = 5.52 mlr_d12 = 5.14 MAE for 109 1o1zA lr_d8 = 2.59 mlr_d8 = 2.43 lr_d12 = 2.50 mlr_d12 = 2.31 MAE for 110 1p90A lr_d8 = 4.65 mlr_d8 = 3.95 lr_d12 = 3.66 mlr_d12 = 3.29 MAE for 111 1pchA lr_d8 = 2.23 mlr_d8 = 2.30 lr_d12 = 1.92 mlr_d12 = 1.91 MAE for 112 1pkoA lr_d8 = 4.19 mlr_d8 = 3.90 lr_d12 = 3.73 mlr_d12 = 3.53 MAE for 113 1qf9A lr_d8 = 3.54 mlr_d8 = 3.38 lr_d12 = 3.03 mlr_d12 = 2.84 MAE for 114 1qjpA lr_d8 = 5.62 mlr_d8 = 3.43 lr_d12 = 4.76 mlr_d12 = 3.75 MAE for 115 1ql0A lr_d8 = 9.79 mlr_d8 = 8.61 lr_d12 = 7.98 mlr_d12 = 7.32 MAE for 116 1r26A lr_d8 = 2.46 mlr_d8 = 2.32 lr_d12 = 1.98 mlr_d12 = 1.93 MAE for 117 1roaA lr_d8 = 4.03 mlr_d8 = 3.22 lr_d12 = 3.84 mlr_d12 = 3.35 MAE for 118 1rw1A lr_d8 = 2.03 mlr_d8 = 1.99 lr_d12 = 1.86 mlr_d12 = 1.82 MAE for 119 1rw7A lr_d8 = 4.19 mlr_d8 = 4.13 lr_d12 = 3.85 mlr_d12 = 3.70 MAE for 120 1rybA lr_d8 = 2.32 mlr_d8 = 2.48 lr_d12 = 2.29 mlr_d12 = 2.37 
MAE for 121 1smxA lr_d8 = 4.31 mlr_d8 = 4.11 lr_d12 = 4.05 mlr_d12 = 3.64 MAE for 122 1svyA lr_d8 = 5.91 mlr_d8 = 4.93 lr_d12 = 5.08 mlr_d12 = 4.21 MAE for 123 1t8kA lr_d8 = 1.77 mlr_d8 = 2.28 lr_d12 = 1.53 mlr_d12 = 1.75 MAE for 124 1tifA lr_d8 = 4.81 mlr_d8 = 3.46 lr_d12 = 3.98 mlr_d12 = 3.09 MAE for 125 1tqgA lr_d8 = 1.58 mlr_d8 = 1.41 lr_d12 = 1.19 mlr_d12 = 1.08 MAE for 126 1tqhA lr_d8 = 4.40 mlr_d8 = 4.17 lr_d12 = 3.86 mlr_d12 = 3.69 MAE for 127 1tzvA lr_d8 = 2.63 mlr_d8 = 2.34 lr_d12 = 1.85 mlr_d12 = 1.66 MAE for 128 1vfyA lr_d8 = 3.56 mlr_d8 = 3.61 lr_d12 = 2.91 mlr_d12 = 2.82 MAE for 129 1vhuA lr_d8 = 2.84 mlr_d8 = 2.90 lr_d12 = 2.63 mlr_d12 = 2.55 MAE for 130 1vjkA lr_d8 = 2.28 mlr_d8 = 2.46 lr_d12 = 2.27 mlr_d12 = 2.68 MAE for 131 1vmbA lr_d8 = 1.87 mlr_d8 = 1.70 lr_d12 = 1.87 mlr_d12 = 1.76 MAE for 132 1vp6A lr_d8 = 2.81 mlr_d8 = 2.60 lr_d12 = 3.35 mlr_d12 = 2.94 MAE for 133 1w0hA lr_d8 = 4.66 mlr_d8 = 4.02 lr_d12 = 3.68 mlr_d12 = 3.29 MAE for 134 1whiA lr_d8 = 6.82 mlr_d8 = 5.24 lr_d12 = 5.80 mlr_d12 = 4.84 MAE for 135 1wjxA lr_d8 = 4.40 mlr_d8 = 3.60 lr_d12 = 3.88 mlr_d12 = 3.44 MAE for 136 1wkcA lr_d8 = 2.87 mlr_d8 = 2.80 lr_d12 = 2.35 mlr_d12 = 2.34 MAE for 137 1xdzA lr_d8 = 4.13 mlr_d8 = 3.66 lr_d12 = 3.63 mlr_d12 = 3.29 MAE for 138 1xffA lr_d8 = 3.11 mlr_d8 = 2.88 lr_d12 = 2.81 mlr_d12 = 2.55 MAE for 139 1xkrA lr_d8 = 7.04 mlr_d8 = 6.08 lr_d12 = 6.61 mlr_d12 = 5.97 MAE for 140 2arcA lr_d8 = 8.26 mlr_d8 = 7.69 lr_d12 = 8.26 mlr_d12 = 7.52 MAE for 141 2cuaA lr_d8 = 5.46 mlr_d8 = 4.49 lr_d12 = 5.25 mlr_d12 = 4.37 MAE for 142 2hs1A lr_d8 = 10.41 mlr_d8 = 9.44 lr_d12 = 9.99 mlr_d12 = 8.88 MAE for 143 2mhrA lr_d8 = 4.55 mlr_d8 = 4.10 lr_d12 = 3.08 mlr_d12 = 2.88 MAE for 144 2phyA lr_d8 = 11.60 mlr_d8 = 8.31 lr_d12 = 8.59 mlr_d12 = 7.05 MAE for 145 2tpsA lr_d8 = 4.06 mlr_d8 = 3.24 lr_d12 = 3.96 mlr_d12 = 3.33 MAE for 146 2vxnA lr_d8 = 2.95 mlr_d8 = 2.89 lr_d12 = 2.48 mlr_d12 = 2.40 MAE for 147 3borA lr_d8 = 2.64 mlr_d8 = 2.79 lr_d12 = 2.38 mlr_d12 = 2.43 MAE for 148 3dqgA lr_d8 = 4.92 mlr_d8 = 4.30 lr_d12 = 4.47 mlr_d12 = 4.05 MAE for 149 5ptpA lr_d8 = 3.87 mlr_d8 = 3.31 lr_d12 = 3.21 mlr_d12 = 2.90 Average MAE : lr<8A = 4.8302 mlr<8A = 4.3317 lr<12A = 4.1439 mlr<12A = 3.7738 Precision for 0 - 1a3aA top_L5_lr = 0.9655 top_L_lr = 0.7655 top_Nc_lr = 0.6201 top_L5_mlr = 0.9655 top_L_mlr = 0.8621 top_Nc_mlr = 0.6537 (total_true_lr = 229 total_true_mlr = 283) Precision for 1 - 1a6mA top_L5_lr = 0.9667 top_L_lr = 0.7483 top_Nc_lr = 0.7434 top_L5_mlr = 1.0000 top_L_mlr = 0.7881 top_Nc_mlr = 0.7486 (total_true_lr = 152 total_true_mlr = 175) Precision for 2 - 1a70A top_L5_lr = 1.0000 top_L_lr = 0.7526 top_Nc_lr = 0.6744 top_L5_mlr = 1.0000 top_L_mlr = 0.8454 top_Nc_mlr = 0.6684 (total_true_lr = 129 total_true_mlr = 190) Precision for 3 - 1aapA top_L5_lr = 0.8182 top_L_lr = 0.5536 top_Nc_lr = 0.5614 top_L5_mlr = 0.9091 top_L_mlr = 0.7857 top_Nc_mlr = 0.6238 (total_true_lr = 57 total_true_mlr = 101) Precision for 4 - 1abaA top_L5_lr = 1.0000 top_L_lr = 0.7586 top_Nc_lr = 0.7975 top_L5_mlr = 1.0000 top_L_mlr = 0.8046 top_Nc_mlr = 0.7379 (total_true_lr = 79 total_true_mlr = 103) Precision for 5 - 1ag6A top_L5_lr = 1.0000 top_L_lr = 0.7374 top_Nc_lr = 0.5988 top_L5_mlr = 1.0000 top_L_mlr = 0.7980 top_Nc_mlr = 0.6010 (total_true_lr = 162 total_true_mlr = 208) Precision for 6 - 1aoeA top_L5_lr = 0.8421 top_L_lr = 0.6406 top_Nc_lr = 0.5409 top_L5_mlr = 0.8684 top_L_mlr = 0.7135 top_Nc_mlr = 0.5480 (total_true_lr = 281 total_true_mlr = 354) Precision for 7 - 1atlA top_L5_lr = 0.9750 top_L_lr = 
0.6800 top_Nc_lr = 0.5625 top_L5_mlr = 1.0000 top_L_mlr = 0.7350 top_Nc_mlr = 0.5542 (total_true_lr = 320 total_true_mlr = 406) Precision for 8 - 1atzA top_L5_lr = 1.0000 top_L_lr = 0.7467 top_Nc_lr = 0.7971 top_L5_mlr = 1.0000 top_L_mlr = 0.8267 top_Nc_mlr = 0.7292 (total_true_lr = 69 total_true_mlr = 96) Precision for 9 - 1avsA top_L5_lr = 0.8750 top_L_lr = 0.3333 top_Nc_lr = 0.5526 top_L5_mlr = 0.9375 top_L_mlr = 0.5062 top_Nc_mlr = 0.6000 (total_true_lr = 38 total_true_mlr = 60) Precision for 10 - 1bdoA top_L5_lr = 1.0000 top_L_lr = 0.9000 top_Nc_lr = 0.7391 top_L5_mlr = 1.0000 top_L_mlr = 0.9500 top_Nc_mlr = 0.7545 (total_true_lr = 115 total_true_mlr = 167) Precision for 11 - 1bebA top_L5_lr = 0.9677 top_L_lr = 0.4936 top_Nc_lr = 0.4242 top_L5_mlr = 0.9677 top_L_mlr = 0.7756 top_Nc_mlr = 0.5482 (total_true_lr = 198 total_true_mlr = 301) Precision for 12 - 1behA top_L5_lr = 0.9730 top_L_lr = 0.7283 top_Nc_lr = 0.5192 top_L5_mlr = 1.0000 top_L_mlr = 0.8207 top_Nc_mlr = 0.5419 (total_true_lr = 364 total_true_mlr = 430) Precision for 13 - 1bkrA top_L5_lr = 0.8636 top_L_lr = 0.5000 top_Nc_lr = 0.5632 top_L5_mlr = 0.9545 top_L_mlr = 0.6759 top_Nc_mlr = 0.6124 (total_true_lr = 87 total_true_mlr = 129) Precision for 14 - 1brfA top_L5_lr = 1.0000 top_L_lr = 0.6226 top_Nc_lr = 0.7317 top_L5_mlr = 1.0000 top_L_mlr = 0.7358 top_Nc_mlr = 0.6833 (total_true_lr = 41 total_true_mlr = 60) Precision for 15 - 1bsgA top_L5_lr = 0.9623 top_L_lr = 0.7331 top_Nc_lr = 0.5936 top_L5_mlr = 0.9811 top_L_mlr = 0.8308 top_Nc_mlr = 0.6123 (total_true_lr = 438 total_true_mlr = 570) Precision for 16 - 1c44A top_L5_lr = 0.8400 top_L_lr = 0.6423 top_Nc_lr = 0.5060 top_L5_mlr = 1.0000 top_L_mlr = 0.7561 top_Nc_mlr = 0.5333 (total_true_lr = 166 total_true_mlr = 210) Precision for 17 - 1c52A top_L5_lr = 0.7692 top_L_lr = 0.4198 top_Nc_lr = 0.3875 top_L5_mlr = 0.8077 top_L_mlr = 0.5496 top_Nc_mlr = 0.4264 (total_true_lr = 160 total_true_mlr = 197) Precision for 18 - 1c9oA top_L5_lr = 1.0000 top_L_lr = 0.7576 top_Nc_lr = 0.7571 top_L5_mlr = 1.0000 top_L_mlr = 0.8485 top_Nc_mlr = 0.7281 (total_true_lr = 70 total_true_mlr = 114) Precision for 19 - 1cc8A top_L5_lr = 1.0000 top_L_lr = 0.9167 top_Nc_lr = 0.8409 top_L5_mlr = 1.0000 top_L_mlr = 0.8333 top_Nc_mlr = 0.7632 (total_true_lr = 88 total_true_mlr = 114) Precision for 20 - 1chdA top_L5_lr = 0.9750 top_L_lr = 0.8939 top_Nc_lr = 0.7019 top_L5_mlr = 0.9750 top_L_mlr = 0.9495 top_Nc_mlr = 0.7160 (total_true_lr = 416 total_true_mlr = 507) Precision for 21 - 1cjwA top_L5_lr = 0.9394 top_L_lr = 0.6386 top_Nc_lr = 0.5092 top_L5_mlr = 1.0000 top_L_mlr = 0.7711 top_Nc_mlr = 0.5686 (total_true_lr = 218 total_true_mlr = 306) Precision for 22 - 1ckeA top_L5_lr = 1.0000 top_L_lr = 0.7736 top_Nc_lr = 0.6498 top_L5_mlr = 1.0000 top_L_mlr = 0.8066 top_Nc_mlr = 0.6578 (total_true_lr = 297 total_true_mlr = 339) Precision for 23 - 1ctfA top_L5_lr = 1.0000 top_L_lr = 0.9118 top_Nc_lr = 0.8588 top_L5_mlr = 1.0000 top_L_mlr = 0.9559 top_Nc_mlr = 0.8381 (total_true_lr = 85 total_true_mlr = 105) Precision for 24 - 1cxyA top_L5_lr = 1.0000 top_L_lr = 0.7037 top_Nc_lr = 0.6905 top_L5_mlr = 0.9375 top_L_mlr = 0.7407 top_Nc_mlr = 0.6535 (total_true_lr = 84 total_true_mlr = 101) Precision for 25 - 1cznA top_L5_lr = 1.0000 top_L_lr = 0.8994 top_Nc_lr = 0.6487 top_L5_mlr = 1.0000 top_L_mlr = 0.8935 top_Nc_mlr = 0.6236 (total_true_lr = 316 total_true_mlr = 364) Precision for 26 - 1d0qA top_L5_lr = 0.9500 top_L_lr = 0.4608 top_Nc_lr = 0.6212 top_L5_mlr = 0.9500 top_L_mlr = 0.7157 top_Nc_mlr = 0.6638 
(total_true_lr = 66 total_true_mlr = 116) Precision for 27 - 1d1qA top_L5_lr = 1.0000 top_L_lr = 0.8491 top_Nc_lr = 0.6732 top_L5_mlr = 1.0000 top_L_mlr = 0.8679 top_Nc_mlr = 0.6959 (total_true_lr = 257 total_true_mlr = 319) Precision for 28 - 1d4oA top_L5_lr = 0.9714 top_L_lr = 0.7514 top_Nc_lr = 0.5491 top_L5_mlr = 0.9714 top_L_mlr = 0.7401 top_Nc_mlr = 0.5204 (total_true_lr = 326 total_true_mlr = 367) Precision for 29 - 1dbxA top_L5_lr = 1.0000 top_L_lr = 0.8421 top_Nc_lr = 0.6889 top_L5_mlr = 1.0000 top_L_mlr = 0.9079 top_Nc_mlr = 0.7143 (total_true_lr = 270 total_true_mlr = 336) Precision for 30 - 1dixA top_L5_lr = 0.8571 top_L_lr = 0.5673 top_Nc_lr = 0.4238 top_L5_mlr = 0.9286 top_L_mlr = 0.6538 top_Nc_mlr = 0.4684 (total_true_lr = 328 total_true_mlr = 380) Precision for 31 - 1dlwA top_L5_lr = 1.0000 top_L_lr = 0.6810 top_Nc_lr = 0.6981 top_L5_mlr = 1.0000 top_L_mlr = 0.7500 top_Nc_mlr = 0.7120 (total_true_lr = 106 total_true_mlr = 125) Precision for 32 - 1dmgA top_L5_lr = 1.0000 top_L_lr = 0.7151 top_Nc_lr = 0.6067 top_L5_mlr = 1.0000 top_L_mlr = 0.8023 top_Nc_mlr = 0.6081 (total_true_lr = 239 total_true_mlr = 296) Precision for 33 - 1dqgA top_L5_lr = 0.0741 top_L_lr = 0.0896 top_Nc_lr = 0.0915 top_L5_mlr = 0.8889 top_L_mlr = 0.3806 top_Nc_mlr = 0.2530 (total_true_lr = 164 total_true_mlr = 249) Precision for 34 - 1dsxA top_L5_lr = 0.7647 top_L_lr = 0.4713 top_Nc_lr = 0.5217 top_L5_mlr = 0.9412 top_L_mlr = 0.5172 top_Nc_mlr = 0.5000 (total_true_lr = 69 total_true_mlr = 96) Precision for 35 - 1eazA top_L5_lr = 0.8571 top_L_lr = 0.4951 top_Nc_lr = 0.5942 top_L5_mlr = 1.0000 top_L_mlr = 0.8738 top_Nc_mlr = 0.7342 (total_true_lr = 69 total_true_mlr = 158) Precision for 36 - 1ej0A top_L5_lr = 0.9722 top_L_lr = 0.7556 top_Nc_lr = 0.6525 top_L5_mlr = 1.0000 top_L_mlr = 0.8667 top_Nc_mlr = 0.6640 (total_true_lr = 259 total_true_mlr = 375) Precision for 37 - 1ej8A top_L5_lr = 0.6786 top_L_lr = 0.5929 top_Nc_lr = 0.4504 top_L5_mlr = 0.8214 top_L_mlr = 0.7643 top_Nc_mlr = 0.4848 (total_true_lr = 242 total_true_mlr = 328) Precision for 38 - 1ek0A top_L5_lr = 1.0000 top_L_lr = 0.8810 top_Nc_lr = 0.7270 top_L5_mlr = 1.0000 top_L_mlr = 0.8988 top_Nc_mlr = 0.7164 (total_true_lr = 282 total_true_mlr = 342) Precision for 39 - 1f6bA top_L5_lr = 0.8857 top_L_lr = 0.7102 top_Nc_lr = 0.6000 top_L5_mlr = 0.8857 top_L_mlr = 0.7159 top_Nc_mlr = 0.5581 (total_true_lr = 260 total_true_mlr = 310) Precision for 40 - 1fcyA top_L5_lr = 0.8723 top_L_lr = 0.4068 top_Nc_lr = 0.3529 top_L5_mlr = 0.8298 top_L_mlr = 0.4831 top_Nc_mlr = 0.3834 (total_true_lr = 289 total_true_mlr = 326) Precision for 41 - 1fk5A top_L5_lr = 0.7368 top_L_lr = 0.4301 top_Nc_lr = 0.4000 top_L5_mlr = 0.7368 top_L_mlr = 0.4409 top_Nc_mlr = 0.4050 (total_true_lr = 100 total_true_mlr = 121) Precision for 42 - 1fl0A top_L5_lr = 0.9697 top_L_lr = 0.6402 top_Nc_lr = 0.4836 top_L5_mlr = 1.0000 top_L_mlr = 0.7683 top_Nc_mlr = 0.4986 (total_true_lr = 275 total_true_mlr = 351) Precision for 43 - 1fnaA top_L5_lr = 1.0000 top_L_lr = 0.7692 top_Nc_lr = 0.6480 top_L5_mlr = 1.0000 top_L_mlr = 0.8022 top_Nc_mlr = 0.6590 (total_true_lr = 125 total_true_mlr = 173) Precision for 44 - 1fqtA top_L5_lr = 0.9091 top_L_lr = 0.7523 top_Nc_lr = 0.7387 top_L5_mlr = 0.9545 top_L_mlr = 0.8807 top_Nc_mlr = 0.6816 (total_true_lr = 111 total_true_mlr = 201) Precision for 45 - 1fvgA top_L5_lr = 0.9474 top_L_lr = 0.8177 top_Nc_lr = 0.6218 top_L5_mlr = 0.9737 top_L_mlr = 0.8594 top_Nc_mlr = 0.6441 (total_true_lr = 349 total_true_mlr = 399) Precision for 46 - 1fvkA top_L5_lr = 
0.8947 top_L_lr = 0.6755 top_Nc_lr = 0.5657 top_L5_mlr = 0.8947 top_L_mlr = 0.7074 top_Nc_mlr = 0.5607 (total_true_lr = 251 total_true_mlr = 280) Precision for 47 - 1fx2A top_L5_lr = 1.0000 top_L_lr = 0.5357 top_Nc_lr = 0.6129 top_L5_mlr = 1.0000 top_L_mlr = 0.6607 top_Nc_mlr = 0.6356 (total_true_lr = 93 total_true_mlr = 118) Precision for 48 - 1g2rA top_L5_lr = 0.9474 top_L_lr = 0.5638 top_Nc_lr = 0.6104 top_L5_mlr = 0.8947 top_L_mlr = 0.6596 top_Nc_mlr = 0.6240 (total_true_lr = 77 total_true_mlr = 125) Precision for 49 - 1g9oA top_L5_lr = 1.0000 top_L_lr = 0.8242 top_Nc_lr = 0.7593 top_L5_mlr = 1.0000 top_L_mlr = 0.9670 top_Nc_mlr = 0.7848 (total_true_lr = 108 total_true_mlr = 158) Precision for 50 - 1gbsA top_L5_lr = 0.9459 top_L_lr = 0.4162 top_Nc_lr = 0.3720 top_L5_mlr = 0.9459 top_L_mlr = 0.5892 top_Nc_mlr = 0.4244 (total_true_lr = 207 total_true_mlr = 311) Precision for 51 - 1gmiA top_L5_lr = 1.0000 top_L_lr = 0.9037 top_Nc_lr = 0.7182 top_L5_mlr = 1.0000 top_L_mlr = 0.9630 top_Nc_mlr = 0.7181 (total_true_lr = 220 total_true_mlr = 298) Precision for 52 - 1gmxA top_L5_lr = 0.9048 top_L_lr = 0.7290 top_Nc_lr = 0.6111 top_L5_mlr = 0.9048 top_L_mlr = 0.8131 top_Nc_mlr = 0.6330 (total_true_lr = 144 total_true_mlr = 188) Precision for 53 - 1guuA top_L5_lr = 0.5000 top_L_lr = 0.2400 top_Nc_lr = 0.5385 top_L5_mlr = 1.0000 top_L_mlr = 0.4400 top_Nc_mlr = 0.6333 (total_true_lr = 13 total_true_mlr = 30) Precision for 54 - 1gz2A top_L5_lr = 0.9286 top_L_lr = 0.7536 top_Nc_lr = 0.6171 top_L5_mlr = 0.9643 top_L_mlr = 0.7681 top_Nc_mlr = 0.5993 (total_true_lr = 222 total_true_mlr = 272) Precision for 55 - 1gzcA top_L5_lr = 0.9792 top_L_lr = 0.8201 top_Nc_lr = 0.5761 top_L5_mlr = 0.9792 top_L_mlr = 0.8536 top_Nc_mlr = 0.5651 (total_true_lr = 493 total_true_mlr = 630) Precision for 56 - 1h0pA top_L5_lr = 1.0000 top_L_lr = 0.8626 top_Nc_lr = 0.6476 top_L5_mlr = 1.0000 top_L_mlr = 0.9121 top_Nc_mlr = 0.6651 (total_true_lr = 349 total_true_mlr = 433) Precision for 57 - 1h2eA top_L5_lr = 1.0000 top_L_lr = 0.8551 top_Nc_lr = 0.7287 top_L5_mlr = 1.0000 top_L_mlr = 0.9130 top_Nc_mlr = 0.7270 (total_true_lr = 317 total_true_mlr = 403) Precision for 58 - 1h4xA top_L5_lr = 0.9545 top_L_lr = 0.7182 top_Nc_lr = 0.6693 top_L5_mlr = 0.8636 top_L_mlr = 0.7364 top_Nc_mlr = 0.6467 (total_true_lr = 127 total_true_mlr = 167) Precision for 59 - 1h98A top_L5_lr = 0.6667 top_L_lr = 0.4286 top_Nc_lr = 0.4091 top_L5_mlr = 0.7333 top_L_mlr = 0.4026 top_Nc_mlr = 0.3596 (total_true_lr = 88 total_true_mlr = 114) Precision for 60 - 1hdoA top_L5_lr = 1.0000 top_L_lr = 0.8683 top_Nc_lr = 0.6676 top_L5_mlr = 1.0000 top_L_mlr = 0.9366 top_Nc_mlr = 0.6821 (total_true_lr = 352 total_true_mlr = 475) Precision for 61 - 1hfcA top_L5_lr = 0.8710 top_L_lr = 0.5541 top_Nc_lr = 0.4643 top_L5_mlr = 0.9032 top_L_mlr = 0.6561 top_Nc_mlr = 0.4712 (total_true_lr = 224 total_true_mlr = 278) Precision for 62 - 1hh8A top_L5_lr = 0.8158 top_L_lr = 0.3333 top_Nc_lr = 0.3721 top_L5_mlr = 0.9737 top_L_mlr = 0.6719 top_Nc_mlr = 0.5129 (total_true_lr = 172 total_true_mlr = 271) Precision for 63 - 1htwA top_L5_lr = 1.0000 top_L_lr = 0.7468 top_Nc_lr = 0.6396 top_L5_mlr = 1.0000 top_L_mlr = 0.8354 top_Nc_mlr = 0.6140 (total_true_lr = 222 total_true_mlr = 285) Precision for 64 - 1hxnA top_L5_lr = 0.7381 top_L_lr = 0.2333 top_Nc_lr = 0.2242 top_L5_mlr = 0.9762 top_L_mlr = 0.6333 top_Nc_mlr = 0.4040 (total_true_lr = 223 total_true_mlr = 401) Precision for 65 - 1i1jA top_L5_lr = 0.9524 top_L_lr = 0.5472 top_Nc_lr = 0.4359 top_L5_mlr = 1.0000 top_L_mlr = 0.8585 
top_Nc_mlr = 0.5708 (total_true_lr = 156 total_true_mlr = 219) Precision for 66 - 1i1nA top_L5_lr = 1.0000 top_L_lr = 0.7054 top_Nc_lr = 0.5823 top_L5_mlr = 1.0000 top_L_mlr = 0.8571 top_Nc_mlr = 0.6379 (total_true_lr = 328 total_true_mlr = 486) Precision for 67 - 1i4jA top_L5_lr = 1.0000 top_L_lr = 0.9091 top_Nc_lr = 0.7622 top_L5_mlr = 1.0000 top_L_mlr = 0.9545 top_Nc_mlr = 0.7756 (total_true_lr = 164 total_true_mlr = 205) Precision for 68 - 1i58A top_L5_lr = 0.8947 top_L_lr = 0.6296 top_Nc_lr = 0.5462 top_L5_mlr = 0.9474 top_L_mlr = 0.7249 top_Nc_mlr = 0.5867 (total_true_lr = 238 total_true_mlr = 300) Precision for 69 - 1i5gA top_L5_lr = 0.9655 top_L_lr = 0.6181 top_Nc_lr = 0.5397 top_L5_mlr = 0.9655 top_L_mlr = 0.5903 top_Nc_mlr = 0.4800 (total_true_lr = 189 total_true_mlr = 250) Precision for 70 - 1i71A top_L5_lr = 0.9412 top_L_lr = 0.6747 top_Nc_lr = 0.5755 top_L5_mlr = 0.9412 top_L_mlr = 0.6988 top_Nc_mlr = 0.5714 (total_true_lr = 106 total_true_mlr = 133) Precision for 71 - 1ihzA top_L5_lr = 0.8889 top_L_lr = 0.7132 top_Nc_lr = 0.6133 top_L5_mlr = 0.9630 top_L_mlr = 0.8676 top_Nc_mlr = 0.6667 (total_true_lr = 181 total_true_mlr = 249) Precision for 72 - 1iibA top_L5_lr = 1.0000 top_L_lr = 0.8350 top_Nc_lr = 0.7778 top_L5_mlr = 1.0000 top_L_mlr = 0.8932 top_Nc_mlr = 0.7790 (total_true_lr = 126 total_true_mlr = 181) Precision for 73 - 1im5A top_L5_lr = 0.9444 top_L_lr = 0.7765 top_Nc_lr = 0.5957 top_L5_mlr = 0.9722 top_L_mlr = 0.8045 top_Nc_mlr = 0.5849 (total_true_lr = 329 total_true_mlr = 371) Precision for 74 - 1iwdA top_L5_lr = 0.9302 top_L_lr = 0.7302 top_Nc_lr = 0.5853 top_L5_mlr = 0.9767 top_L_mlr = 0.8326 top_Nc_mlr = 0.6000 (total_true_lr = 381 total_true_mlr = 480) Precision for 75 - 1j3aA top_L5_lr = 1.0000 top_L_lr = 0.7597 top_Nc_lr = 0.5634 top_L5_mlr = 0.9615 top_L_mlr = 0.7829 top_Nc_mlr = 0.5619 (total_true_lr = 213 total_true_mlr = 226) Precision for 76 - 1jbeA top_L5_lr = 0.9600 top_L_lr = 0.7937 top_Nc_lr = 0.7347 top_L5_mlr = 0.9600 top_L_mlr = 0.8889 top_Nc_mlr = 0.7594 (total_true_lr = 147 total_true_mlr = 212) Precision for 77 - 1jbkA top_L5_lr = 0.9737 top_L_lr = 0.6984 top_Nc_lr = 0.5563 top_L5_mlr = 1.0000 top_L_mlr = 0.7037 top_Nc_mlr = 0.5498 (total_true_lr = 293 total_true_mlr = 311) Precision for 78 - 1jfuA top_L5_lr = 0.9429 top_L_lr = 0.7386 top_Nc_lr = 0.5922 top_L5_mlr = 0.9429 top_L_mlr = 0.7614 top_Nc_mlr = 0.5824 (total_true_lr = 309 total_true_mlr = 364) Precision for 79 - 1jfxA top_L5_lr = 0.9767 top_L_lr = 0.7604 top_Nc_lr = 0.6310 top_L5_mlr = 1.0000 top_L_mlr = 0.8894 top_Nc_mlr = 0.6674 (total_true_lr = 355 total_true_mlr = 478) Precision for 80 - 1jkxA top_L5_lr = 1.0000 top_L_lr = 0.8804 top_Nc_lr = 0.7177 top_L5_mlr = 1.0000 top_L_mlr = 0.9187 top_Nc_mlr = 0.7406 (total_true_lr = 294 total_true_mlr = 397) Precision for 81 - 1jl1A top_L5_lr = 1.0000 top_L_lr = 0.8684 top_Nc_lr = 0.7284 top_L5_mlr = 1.0000 top_L_mlr = 0.9145 top_Nc_mlr = 0.7541 (total_true_lr = 243 total_true_mlr = 305) Precision for 82 - 1jo0A top_L5_lr = 1.0000 top_L_lr = 0.7010 top_Nc_lr = 0.6574 top_L5_mlr = 1.0000 top_L_mlr = 0.7938 top_Nc_mlr = 0.6870 (total_true_lr = 108 total_true_mlr = 131) Precision for 83 - 1jo8A top_L5_lr = 0.9167 top_L_lr = 0.5690 top_Nc_lr = 0.6304 top_L5_mlr = 1.0000 top_L_mlr = 0.9655 top_Nc_mlr = 0.7629 (total_true_lr = 46 total_true_mlr = 97) Precision for 84 - 1josA top_L5_lr = 0.9500 top_L_lr = 0.7400 top_Nc_lr = 0.7238 top_L5_mlr = 0.9500 top_L_mlr = 0.9100 top_Nc_mlr = 0.7603 (total_true_lr = 105 total_true_mlr = 146) Precision for 85 
- 1jvwA top_L5_lr = 1.0000 top_L_lr = 0.8625 top_Nc_lr = 0.7243 top_L5_mlr = 1.0000 top_L_mlr = 0.9187 top_Nc_mlr = 0.7383 (total_true_lr = 243 total_true_mlr = 298) Precision for 86 - 1jwqA top_L5_lr = 1.0000 top_L_lr = 0.9665 top_Nc_lr = 0.7901 top_L5_mlr = 1.0000 top_L_mlr = 0.9721 top_Nc_mlr = 0.7780 (total_true_lr = 362 total_true_mlr = 410) Precision for 87 - 1jyhA top_L5_lr = 1.0000 top_L_lr = 0.8968 top_Nc_lr = 0.7478 top_L5_mlr = 1.0000 top_L_mlr = 0.9226 top_Nc_mlr = 0.7552 (total_true_lr = 230 total_true_mlr = 286) Precision for 88 - 1k6kA top_L5_lr = 1.0000 top_L_lr = 0.7254 top_Nc_lr = 0.6705 top_L5_mlr = 1.0000 top_L_mlr = 0.8310 top_Nc_mlr = 0.6942 (total_true_lr = 173 total_true_mlr = 206) Precision for 89 - 1k7cA top_L5_lr = 0.9787 top_L_lr = 0.7983 top_Nc_lr = 0.6043 top_L5_mlr = 0.9787 top_L_mlr = 0.8326 top_Nc_mlr = 0.5661 (total_true_lr = 417 total_true_mlr = 507) Precision for 90 - 1k7jA top_L5_lr = 1.0000 top_L_lr = 0.8592 top_Nc_lr = 0.6742 top_L5_mlr = 1.0000 top_L_mlr = 0.9078 top_Nc_mlr = 0.6835 (total_true_lr = 353 total_true_mlr = 455) Precision for 91 - 1kidA top_L5_lr = 1.0000 top_L_lr = 0.7772 top_Nc_lr = 0.5816 top_L5_mlr = 1.0000 top_L_mlr = 0.8135 top_Nc_mlr = 0.6057 (total_true_lr = 337 total_true_mlr = 383) Precision for 92 - 1kq6A top_L5_lr = 0.6429 top_L_lr = 0.4143 top_Nc_lr = 0.4298 top_L5_mlr = 1.0000 top_L_mlr = 0.6286 top_Nc_mlr = 0.5523 (total_true_lr = 121 total_true_mlr = 172) Precision for 93 - 1kqrA top_L5_lr = 0.8750 top_L_lr = 0.4813 top_Nc_lr = 0.3550 top_L5_mlr = 0.8438 top_L_mlr = 0.5563 top_Nc_mlr = 0.3963 (total_true_lr = 262 total_true_mlr = 376) Precision for 94 - 1ktgA top_L5_lr = 0.6667 top_L_lr = 0.6496 top_Nc_lr = 0.5728 top_L5_mlr = 0.7407 top_L_mlr = 0.5401 top_Nc_mlr = 0.5118 (total_true_lr = 213 total_true_mlr = 254) Precision for 95 - 1ku3A top_L5_lr = 0.9167 top_L_lr = 0.4262 top_Nc_lr = 0.6552 top_L5_mlr = 0.8333 top_L_mlr = 0.4590 top_Nc_mlr = 0.6216 (total_true_lr = 29 total_true_mlr = 37) Precision for 96 - 1kw4A top_L5_lr = 0.7143 top_L_lr = 0.4000 top_Nc_lr = 0.5000 top_L5_mlr = 1.0000 top_L_mlr = 0.5429 top_Nc_mlr = 0.5692 (total_true_lr = 44 total_true_mlr = 65) Precision for 97 - 1lm4A top_L5_lr = 0.8947 top_L_lr = 0.6085 top_Nc_lr = 0.4738 top_L5_mlr = 0.9737 top_L_mlr = 0.7143 top_Nc_mlr = 0.5037 (total_true_lr = 325 total_true_mlr = 407) Precision for 98 - 1lo7A top_L5_lr = 1.0000 top_L_lr = 0.8786 top_Nc_lr = 0.7754 top_L5_mlr = 1.0000 top_L_mlr = 0.9500 top_Nc_mlr = 0.8085 (total_true_lr = 187 total_true_mlr = 235) Precision for 99 - 1lpyA top_L5_lr = 0.7500 top_L_lr = 0.3889 top_Nc_lr = 0.4397 top_L5_mlr = 0.7500 top_L_mlr = 0.4753 top_Nc_mlr = 0.4659 (total_true_lr = 116 total_true_mlr = 176) Precision for 100 - 1m4jA top_L5_lr = 0.8148 top_L_lr = 0.5489 top_Nc_lr = 0.5390 top_L5_mlr = 0.7407 top_L_mlr = 0.5865 top_Nc_mlr = 0.5349 (total_true_lr = 154 total_true_mlr = 215) Precision for 101 - 1m8aA top_L5_lr = 0.7500 top_L_lr = 0.4262 top_Nc_lr = 0.5263 top_L5_mlr = 1.0000 top_L_mlr = 0.7213 top_Nc_mlr = 0.6250 (total_true_lr = 38 total_true_mlr = 72) Precision for 102 - 1mk0A top_L5_lr = 1.0000 top_L_lr = 0.6289 top_Nc_lr = 0.5943 top_L5_mlr = 1.0000 top_L_mlr = 0.7216 top_Nc_mlr = 0.5906 (total_true_lr = 106 total_true_mlr = 149) Precision for 103 - 1mugA top_L5_lr = 0.9697 top_L_lr = 0.7879 top_Nc_lr = 0.6335 top_L5_mlr = 0.9394 top_L_mlr = 0.7697 top_Nc_mlr = 0.5748 (total_true_lr = 251 total_true_mlr = 301) Precision for 104 - 1nb9A top_L5_lr = 0.9655 top_L_lr = 0.7211 top_Nc_lr = 0.6029 top_L5_mlr = 
0.9655 top_L_mlr = 0.8571 top_Nc_mlr = 0.6667 (total_true_lr = 204 total_true_mlr = 267) Precision for 105 - 1ne2A top_L5_lr = 0.9429 top_L_lr = 0.4943 top_Nc_lr = 0.3849 top_L5_mlr = 1.0000 top_L_mlr = 0.7330 top_Nc_mlr = 0.4211 (total_true_lr = 239 total_true_mlr = 399) Precision for 106 - 1npsA top_L5_lr = 0.9444 top_L_lr = 0.6818 top_Nc_lr = 0.5956 top_L5_mlr = 1.0000 top_L_mlr = 0.8182 top_Nc_mlr = 0.5618 (total_true_lr = 136 total_true_mlr = 178) Precision for 107 - 1nrvA top_L5_lr = 0.6000 top_L_lr = 0.5000 top_Nc_lr = 0.5055 top_L5_mlr = 1.0000 top_L_mlr = 0.7200 top_Nc_mlr = 0.5959 (total_true_lr = 91 total_true_mlr = 146) Precision for 108 - 1ny1A top_L5_lr = 0.9574 top_L_lr = 0.8000 top_Nc_lr = 0.5962 top_L5_mlr = 0.9362 top_L_mlr = 0.8255 top_Nc_mlr = 0.6091 (total_true_lr = 416 total_true_mlr = 486) Precision for 109 - 1o1zA top_L5_lr = 0.9778 top_L_lr = 0.8186 top_Nc_lr = 0.7158 top_L5_mlr = 1.0000 top_L_mlr = 0.9204 top_Nc_mlr = 0.7197 (total_true_lr = 292 total_true_mlr = 421) Precision for 110 - 1p90A top_L5_lr = 1.0000 top_L_lr = 0.6341 top_Nc_lr = 0.5725 top_L5_mlr = 1.0000 top_L_mlr = 0.7967 top_Nc_mlr = 0.6073 (total_true_lr = 138 total_true_mlr = 219) Precision for 111 - 1pchA top_L5_lr = 1.0000 top_L_lr = 0.9318 top_Nc_lr = 0.7313 top_L5_mlr = 1.0000 top_L_mlr = 0.9091 top_Nc_mlr = 0.7032 (total_true_lr = 134 total_true_mlr = 155) Precision for 112 - 1pkoA top_L5_lr = 1.0000 top_L_lr = 0.8145 top_Nc_lr = 0.6763 top_L5_mlr = 1.0000 top_L_mlr = 0.9113 top_Nc_mlr = 0.6923 (total_true_lr = 173 total_true_mlr = 260) Precision for 113 - 1qf9A top_L5_lr = 1.0000 top_L_lr = 0.7320 top_Nc_lr = 0.6324 top_L5_mlr = 1.0000 top_L_mlr = 0.7577 top_Nc_mlr = 0.6361 (total_true_lr = 272 total_true_mlr = 305) Precision for 114 - 1qjpA top_L5_lr = 1.0000 top_L_lr = 0.7737 top_Nc_lr = 0.6784 top_L5_mlr = 1.0000 top_L_mlr = 1.0000 top_Nc_mlr = 0.7900 (total_true_lr = 171 total_true_mlr = 319) Precision for 115 - 1ql0A top_L5_lr = 1.0000 top_L_lr = 0.5602 top_Nc_lr = 0.3828 top_L5_mlr = 1.0000 top_L_mlr = 0.7054 top_Nc_mlr = 0.4418 (total_true_lr = 431 total_true_mlr = 550) Precision for 116 - 1r26A top_L5_lr = 1.0000 top_L_lr = 0.7788 top_Nc_lr = 0.6875 top_L5_mlr = 1.0000 top_L_mlr = 0.8142 top_Nc_mlr = 0.6964 (total_true_lr = 144 total_true_mlr = 168) Precision for 117 - 1roaA top_L5_lr = 0.9545 top_L_lr = 0.6937 top_Nc_lr = 0.6327 top_L5_mlr = 1.0000 top_L_mlr = 0.9009 top_Nc_mlr = 0.7033 (total_true_lr = 147 total_true_mlr = 209) Precision for 118 - 1rw1A top_L5_lr = 0.9565 top_L_lr = 0.7632 top_Nc_lr = 0.7280 top_L5_mlr = 1.0000 top_L_mlr = 0.8070 top_Nc_mlr = 0.7133 (total_true_lr = 125 total_true_mlr = 150) Precision for 119 - 1rw7A top_L5_lr = 1.0000 top_L_lr = 0.8596 top_Nc_lr = 0.6469 top_L5_mlr = 1.0000 top_L_mlr = 0.8851 top_Nc_mlr = 0.6319 (total_true_lr = 456 total_true_mlr = 527) Precision for 120 - 1rybA top_L5_lr = 1.0000 top_L_lr = 0.8871 top_Nc_lr = 0.7476 top_L5_mlr = 1.0000 top_L_mlr = 0.9086 top_Nc_mlr = 0.7261 (total_true_lr = 313 total_true_mlr = 387) Precision for 121 - 1smxA top_L5_lr = 1.0000 top_L_lr = 0.6897 top_Nc_lr = 0.6941 top_L5_mlr = 1.0000 top_L_mlr = 0.7816 top_Nc_mlr = 0.6912 (total_true_lr = 85 total_true_mlr = 136) Precision for 122 - 1svyA top_L5_lr = 1.0000 top_L_lr = 0.6931 top_Nc_lr = 0.6015 top_L5_mlr = 1.0000 top_L_mlr = 0.8119 top_Nc_mlr = 0.6524 (total_true_lr = 133 total_true_mlr = 187) Precision for 123 - 1t8kA top_L5_lr = 1.0000 top_L_lr = 0.6364 top_Nc_lr = 0.7679 top_L5_mlr = 1.0000 top_L_mlr = 0.7143 top_Nc_mlr = 0.7179 (total_true_lr 
= 56 total_true_mlr = 78)
Precision for 124 - 1tifA top_L5_lr = 0.8000 top_L_lr = 0.4211 top_Nc_lr = 0.5192 top_L5_mlr = 0.9333 top_L_mlr = 0.7368 top_Nc_mlr = 0.6667 (total_true_lr = 52 total_true_mlr = 90)
Precision for 125 - 1tqgA top_L5_lr = 1.0000 top_L_lr = 0.6000 top_Nc_lr = 0.7143 top_L5_mlr = 1.0000 top_L_mlr = 0.7619 top_Nc_mlr = 0.7692 (total_true_lr = 77 total_true_mlr = 104)
Precision for 126 - 1tqhA top_L5_lr = 0.9792 top_L_lr = 0.8182 top_Nc_lr = 0.6546 top_L5_mlr = 0.9792 top_L_mlr = 0.8802 top_Nc_mlr = 0.6708 (total_true_lr = 388 total_true_mlr = 489)
Precision for 127 - 1tzvA top_L5_lr = 1.0000 top_L_lr = 0.6667 top_Nc_lr = 0.7040 top_L5_mlr = 1.0000 top_L_mlr = 0.7872 top_Nc_mlr = 0.7439 (total_true_lr = 125 total_true_mlr = 164)
Precision for 128 - 1vfyA top_L5_lr = 1.0000 top_L_lr = 0.4328 top_Nc_lr = 0.6216 top_L5_mlr = 1.0000 top_L_mlr = 0.6418 top_Nc_mlr = 0.5750 (total_true_lr = 37 total_true_mlr = 80)
Precision for 129 - 1vhuA top_L5_lr = 1.0000 top_L_lr = 0.8698 top_Nc_lr = 0.6769 top_L5_mlr = 1.0000 top_L_mlr = 0.8906 top_Nc_mlr = 0.6699 (total_true_lr = 325 total_true_mlr = 412)
Precision for 130 - 1vjkA top_L5_lr = 1.0000 top_L_lr = 0.8161 top_Nc_lr = 0.7426 top_L5_mlr = 1.0000 top_L_mlr = 0.9540 top_Nc_mlr = 0.7742 (total_true_lr = 101 total_true_mlr = 155)
Precision for 131 - 1vmbA top_L5_lr = 1.0000 top_L_lr = 0.8785 top_Nc_lr = 0.7823 top_L5_mlr = 1.0000 top_L_mlr = 0.9439 top_Nc_mlr = 0.7857 (total_true_lr = 147 total_true_mlr = 182)
Precision for 132 - 1vp6A top_L5_lr = 1.0000 top_L_lr = 0.8872 top_Nc_lr = 0.7178 top_L5_mlr = 1.0000 top_L_mlr = 0.8947 top_Nc_mlr = 0.7344 (total_true_lr = 202 total_true_mlr = 241)
Precision for 133 - 1w0hA top_L5_lr = 0.9500 top_L_lr = 0.6950 top_Nc_lr = 0.6060 top_L5_mlr = 0.9500 top_L_mlr = 0.8100 top_Nc_mlr = 0.6373 (total_true_lr = 302 total_true_mlr = 375)
Precision for 134 - 1whiA top_L5_lr = 0.8750 top_L_lr = 0.5410 top_Nc_lr = 0.4506 top_L5_mlr = 1.0000 top_L_mlr = 0.8361 top_Nc_mlr = 0.5540 (total_true_lr = 162 total_true_mlr = 278)
Precision for 135 - 1wjxA top_L5_lr = 1.0000 top_L_lr = 0.7321 top_Nc_lr = 0.6301 top_L5_mlr = 1.0000 top_L_mlr = 0.8750 top_Nc_mlr = 0.6985 (total_true_lr = 146 total_true_mlr = 199)
Precision for 136 - 1wkcA top_L5_lr = 0.9706 top_L_lr = 0.7560 top_Nc_lr = 0.6586 top_L5_mlr = 0.9706 top_L_mlr = 0.8155 top_Nc_mlr = 0.6489 (total_true_lr = 249 total_true_mlr = 319)
Precision for 137 - 1xdzA top_L5_lr = 0.9792 top_L_lr = 0.8361 top_Nc_lr = 0.6597 top_L5_mlr = 0.9792 top_L_mlr = 0.8950 top_Nc_mlr = 0.6777 (total_true_lr = 385 total_true_mlr = 484)
Precision for 138 - 1xffA top_L5_lr = 1.0000 top_L_lr = 0.9118 top_Nc_lr = 0.7034 top_L5_mlr = 1.0000 top_L_mlr = 0.9496 top_Nc_mlr = 0.7189 (total_true_lr = 445 total_true_mlr = 594)
Precision for 139 - 1xkrA top_L5_lr = 0.9268 top_L_lr = 0.5756 top_Nc_lr = 0.4563 top_L5_mlr = 0.9756 top_L_mlr = 0.7317 top_Nc_mlr = 0.5248 (total_true_lr = 309 total_true_mlr = 383)
Precision for 140 - 2arcA top_L5_lr = 1.0000 top_L_lr = 0.6398 top_Nc_lr = 0.4733 top_L5_mlr = 1.0000 top_L_mlr = 0.7702 top_Nc_mlr = 0.4818 (total_true_lr = 262 total_true_mlr = 303)
Precision for 141 - 2cuaA top_L5_lr = 1.0000 top_L_lr = 0.7623 top_Nc_lr = 0.6108 top_L5_mlr = 1.0000 top_L_mlr = 0.8934 top_Nc_mlr = 0.6591 (total_true_lr = 185 total_true_mlr = 264)
Precision for 142 - 2hs1A top_L5_lr = 0.7000 top_L_lr = 0.3131 top_Nc_lr = 0.2936 top_L5_mlr = 0.7000 top_L_mlr = 0.4949 top_Nc_mlr = 0.3663 (total_true_lr = 109 total_true_mlr = 172)
Precision for 143 - 2mhrA top_L5_lr = 0.8333 top_L_lr = 0.5085 top_Nc_lr = 0.5882 top_L5_mlr = 0.9583 top_L_mlr = 0.6186 top_Nc_mlr = 0.6000 (total_true_lr = 102 total_true_mlr = 130)
Precision for 144 - 2phyA top_L5_lr = 0.5600 top_L_lr = 0.2960 top_Nc_lr = 0.3333 top_L5_mlr = 0.9600 top_L_mlr = 0.5760 top_Nc_mlr = 0.4573 (total_true_lr = 102 total_true_mlr = 199)
Precision for 145 - 2tpsA top_L5_lr = 1.0000 top_L_lr = 0.8053 top_Nc_lr = 0.6551 top_L5_mlr = 1.0000 top_L_mlr = 0.9336 top_Nc_mlr = 0.7176 (total_true_lr = 316 total_true_mlr = 471)
Precision for 146 - 2vxnA top_L5_lr = 0.9800 top_L_lr = 0.7550 top_Nc_lr = 0.6278 top_L5_mlr = 0.9800 top_L_mlr = 0.8153 top_Nc_mlr = 0.6355 (total_true_lr = 395 total_true_mlr = 502)
Precision for 147 - 3borA top_L5_lr = 1.0000 top_L_lr = 0.8660 top_Nc_lr = 0.7044 top_L5_mlr = 1.0000 top_L_mlr = 0.8866 top_Nc_mlr = 0.6764 (total_true_lr = 318 total_true_mlr = 377)
Precision for 148 - 3dqgA top_L5_lr = 0.9333 top_L_lr = 0.6622 top_Nc_lr = 0.5648 top_L5_mlr = 1.0000 top_L_mlr = 0.8378 top_Nc_mlr = 0.6245 (total_true_lr = 193 total_true_mlr = 277)
Precision for 149 - 5ptpA top_L5_lr = 1.0000 top_L_lr = 0.8739 top_Nc_lr = 0.6764 top_L5_mlr = 1.0000 top_L_mlr = 0.9459 top_Nc_mlr = 0.6957 (total_true_lr = 445 total_true_mlr = 575)
Average Precision: top_L5_lr = 92.88 top_L_lr = 68.37 top_Nc_lr = 60.51 top_L5_mlr = 96.59 top_L_mlr = 78.40 top_Nc_mlr = 62.63
Save predictions..
Evaluate on CAMEO-HARD set..
Model params: L 1300 num_blocks 128 width 64 expected_n_channels 57
WARNING!! Some values in the pdb structure of 5OD1_A l = 96 are missing or nan! Indices are: (array([0, 1]),)
WARNING!! Different len(X) and len(Y) for 5OD1_A 97 96
WARNING!! Some values in the pdb structure of 5OD9_B l = 95 are missing or nan! Indices are: (array([ 0, 1, 2, 48, 49, 50, 51]),)
WARNING!! Different len(X) and len(Y) for 5OD9_B 97 95
WARNING!! Some values in the pdb structure of 5OVM_A l = 89 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),)
WARNING!! Some values in the pdb structure of 5W5P_A l = 623 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]),)
1/131 [..............................] - ETA: 1:21:12
2/131 [..............................] - ETA: 48:47
3/131 [..............................] - ETA: 37:54
4/131 [..............................] - ETA: 32:24
WARNING!! Some values in the pdb structure of 5WB4_H l = 194 are missing or nan! Indices are: (array([0]),)
5/131 [>.............................] - ETA: 49:58
WARNING!! Different len(X) and len(Y) for 5WB4_H 195 194
WARNING!! Some values in the pdb structure of 5XJO_E l = 52 are missing or nan! Indices are: (array([25, 26, 27, 28, 29, 30, 31, 32, 33, 34]),)
WARNING!! Some values in the pdb structure of 5XKJ_F l = 52 are missing or nan! Indices are: (array([25, 26, 27, 28, 29, 30, 31]),)
WARNING!! Some values in the pdb structure of 5XKN_F l = 50 are missing or nan! Indices are: (array([ 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 27, 28, 29, 30, 31, 32]),)
WARNING!! Different len(X) and len(Y) for 5XKN_F 51 50
6/131 [>.............................] - ETA: 44:43
WARNING!! Some values in the pdb structure of 5YA6_B l = 211 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45]),)
7/131 [>.............................] - ETA: 40:21
8/131 [>.............................] - ETA: 37:03 WARNING!!
Some values in the pdb structure of 5YRQ_E l = 148 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35]),) WARNING!! Some values in the pdb structure of 5YVQ_A l = 504 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 472, 473, 474, 475, 476, 477, 478, 479, 480]),)  9/131 [=>............................] - ETA: 34:27 10/131 [=>............................] - ETA: 32:21 11/131 [=>............................] - ETA: 30:36 12/131 [=>............................] - ETA: 29:08 WARNING!! Different len(X) and len(Y) for 5YVQ_B 175 103 WARNING!! Some values in the pdb structure of 5Z2H_B l = 99 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7]),) WARNING!! Different len(X) and len(Y) for 5Z2H_B 105 99 WARNING!! Some values in the pdb structure of 5Z2I_D l = 99 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),)  13/131 [=>............................] - ETA: 32:08 WARNING!! Different len(X) and len(Y) for 5Z2I_D 105 99 WARNING!! Some values in the pdb structure of 5Z34_A l = 365 are missing or nan! Indices are: (array([0, 1, 2, 3]),)  14/131 [==>...........................] - ETA: 30:41 15/131 [==>...........................] - ETA: 29:25 16/131 [==>...........................] - ETA: 28:18 WARNING!! Different len(X) and len(Y) for 5Z34_A 369 365 WARNING!! Some values in the pdb structure of 5Z3F_A l = 399 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]),)  17/131 [==>...........................] - ETA: 28:35 18/131 [===>..........................] - ETA: 27:35 WARNING!! Different len(X) and len(Y) for 5Z3F_A 405 399 WARNING!! Some values in the pdb structure of 5Z3K_B l = 484 are missing or nan! Indices are: (array([0]),)  19/131 [===>..........................] - ETA: 29:25 WARNING!! Different len(X) and len(Y) for 5Z3K_B 485 484 WARNING!! Some values in the pdb structure of 5Z6D_B l = 206 are missing or nan! Indices are: (array([0, 1]),)  20/131 [===>..........................] - ETA: 33:04 WARNING!! Different len(X) and len(Y) for 5Z6D_B 215 206 WARNING!! Some values in the pdb structure of 5Z7C_A l = 463 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394]),)  21/131 [===>..........................] - ETA: 32:10 WARNING!! Some values in the pdb structure of 5Z8B_B l = 206 are missing or nan! Indices are: (array([0, 1]),)  22/131 [====>.........................] - ETA: 34:33 WARNING!! Different len(X) and len(Y) for 5Z8B_B 242 206 WARNING!! Some values in the pdb structure of 5Z9T_B l = 536 are missing or nan! 
Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),)  23/131 [====>.........................] - ETA: 33:36 WARNING!! Some values in the pdb structure of 5ZB2_A l = 405 are missing or nan! Indices are: (array([0, 1, 2]),)  24/131 [====>.........................] - ETA: 36:54 WARNING!! Some values in the pdb structure of 5ZER_B l = 354 are missing or nan! Indices are: (array([230, 231, 232, 233, 234, 265, 266, 267]),)  25/131 [====>.........................] - ETA: 37:59 WARNING!! Some values in the pdb structure of 5ZKE_A l = 178 are missing or nan! Indices are: (array([0, 1, 2, 3]),)  26/131 [====>.........................] - ETA: 38:14 WARNING!! Different len(X) and len(Y) for 5ZKE_A 184 178 WARNING!! Some values in the pdb structure of 5ZKH_B l = 171 are missing or nan! Indices are: (array([ 0, 27, 28]),) WARNING!! Different len(X) and len(Y) for 5ZKH_B 180 171  27/131 [=====>........................] - ETA: 37:01 WARNING!! Some values in the pdb structure of 5ZME_A l = 686 are missing or nan! Indices are: (array([ 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 175, 176, 177, 178, 179, 180, 444, 445, 446, 447, 448, 449, 450, 451, 544, 545, 546, 547, 548, 549, 550, 551, 552, 553, 554, 555, 556, 557, 558, 559, 560]),)  28/131 [=====>........................] - ETA: 35:51 29/131 [=====>........................] - ETA: 34:44 WARNING!! Different len(X) and len(Y) for 5ZME_A 687 686 WARNING!! Some values in the pdb structure of 5ZNS_A l = 382 are missing or nan! Indices are: (array([0]),)  30/131 [=====>........................] - ETA: 39:08 WARNING!! Different len(X) and len(Y) for 5ZNS_A 385 382 WARNING!! Some values in the pdb structure of 5ZT0_J l = 47 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]),) WARNING!! Different len(X) and len(Y) for 5ZT0_J 75 47 WARNING!! Some values in the pdb structure of 5ZT7_B l = 263 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]),)  31/131 [======>.......................] - ETA: 39:27 32/131 [======>.......................] - ETA: 38:14 33/131 [======>.......................] - ETA: 37:06 WARNING!! Different len(X) and len(Y) for 5ZT7_B 273 263 WARNING!! Some values in the pdb structure of 5ZX9_A l = 287 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 175, 176, 177, 178, 246, 247, 248, 249, 250, 251, 252, 253, 254]),)  34/131 [======>.......................] - ETA: 36:04 WARNING!! Different len(X) and len(Y) for 5ZX9_A 325 287 WARNING!! Some values in the pdb structure of 5ZYO_D l = 158 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),)  35/131 [=======>......................] - ETA: 35:23 WARNING!! Different len(X) and len(Y) for 6A2W_A 167 166 WARNING!! Some values in the pdb structure of 6A5F_B l = 152 are missing or nan! Indices are: (array([0]),)  36/131 [=======>......................] - ETA: 34:24 WARNING!! 
Different len(X) and len(Y) for 6A5F_B 165 152 WARNING!! Some values in the pdb structure of 6A5G_B l = 159 are missing or nan! Indices are: (array([0, 1]),)  37/131 [=======>......................] - ETA: 33:27 WARNING!! Different len(X) and len(Y) for 6A5G_B 165 159 WARNING!! Some values in the pdb structure of 6A68_A l = 173 are missing or nan! Indices are: (array([136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146]),)  38/131 [=======>......................] - ETA: 32:33 WARNING!! Different len(X) and len(Y) for 6A68_A 184 173 WARNING!! Some values in the pdb structure of 6A83_A l = 394 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),)  39/131 [=======>......................] - ETA: 31:41 40/131 [========>.....................] - ETA: 30:52 WARNING!! Some values in the pdb structure of 6A9J_B l = 379 are missing or nan! Indices are: (array([219, 220, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 307, 308, 309, 310, 311, 312, 313, 314, 341, 342, 343, 344, 345, 360, 361, 362, 363, 364]),)  41/131 [========>.....................] - ETA: 30:58 WARNING!! Different len(X) and len(Y) for 6A9J_B 389 379 WARNING!! Some values in the pdb structure of 6A9W_A l = 314 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209]),)  42/131 [========>.....................] - ETA: 31:05 WARNING!! Different len(X) and len(Y) for 6A9W_A 320 314 WARNING!! Some values in the pdb structure of 6AAY_A l = 1225 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 335, 336, 337, 338, 339, 340, 341, 342, 343, 395, 396, 397, 398, 1065, 1066, 1067, 1068]),)  43/131 [========>.....................] - ETA: 30:48 WARNING!! Different len(X) and len(Y) for 6AAY_A 1232 1225 WARNING!! Some values in the pdb structure of 6AE1_B l = 145 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43]),) WARNING!! Different len(X) and len(Y) for 6AE1_B 148 145 WARNING!! Some values in the pdb structure of 6AE8_D l = 119 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106]),) WARNING!! Different len(X) and len(Y) for 6AE8_D 120 119  44/131 [=========>....................] - ETA: 41:46 45/131 [=========>....................] - ETA: 40:38 46/131 [=========>....................] - ETA: 39:32 WARNING!! Different len(X) and len(Y) for 6AE9_B 260 253 WARNING!! Some values in the pdb structure of 6AEF_A l = 459 are missing or nan! Indices are: (array([0]),)  47/131 [=========>....................] - ETA: 38:29 WARNING!! Different len(X) and len(Y) for 6AEF_A 468 459 WARNING!! Some values in the pdb structure of 6AGH_B l = 335 are missing or nan! 
Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 127, 128, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 232, 233, 234, 280, 281, 282, 283, 284, 317, 318, 319, 320, 321, 322]),)  48/131 [=========>....................] - ETA: 38:45 WARNING!! Different len(X) and len(Y) for 6AGH_B 339 335 WARNING!! Some values in the pdb structure of 6AGJ_B l = 374 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 317, 318, 319]),)  49/131 [==========>...................] - ETA: 38:11 WARNING!! Different len(X) and len(Y) for 6AGJ_B 382 374 WARNING!! Some values in the pdb structure of 6AHQ_T l = 129 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21]),) WARNING!! Some values in the pdb structure of 6AIT_F l = 438 are missing or nan! Indices are: (array([ 0, 110, 111, 112, 113, 114, 115, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 190, 191, 192, 193, 194, 195, 196, 203, 204, 205]),)  50/131 [==========>...................] - ETA: 37:49 51/131 [==========>...................] - ETA: 36:49 WARNING!! Different len(X) and len(Y) for 6AIT_F 439 438 WARNING!! Some values in the pdb structure of 6AJJ_A l = 941 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399]),)  52/131 [==========>...................] - ETA: 36:43 WARNING!! Different len(X) and len(Y) for 6AJJ_A 943 941 WARNING!! Some values in the pdb structure of 6AKJ_B l = 152 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 53, 54, 55, 56, 57, 58, 59, 60, 61, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127]),) WARNING!! Different len(X) and len(Y) for 6AKJ_B 155 152 WARNING!! Some values in the pdb structure of 6BEA_A l = 438 are missing or nan! Indices are: (array([0, 1, 2, 3, 4]),)  53/131 [===========>..................] - ETA: 40:50 54/131 [===========>..................] - ETA: 39:46 WARNING!! Different len(X) and len(Y) for 6BEA_A 466 438 WARNING!! Some values in the pdb structure of 6BS5_A l = 340 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]),)  55/131 [===========>..................] - ETA: 39:34 WARNING!! Some values in the pdb structure of 6BS5_B l = 375 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 185, 186, 187, 188, 189, 190, 191, 192, 226, 227, 307, 308, 309, 310, 311, 312]),)  56/131 [===========>..................] - ETA: 38:59 WARNING!! Different len(X) and len(Y) for 6BS5_B 394 375 WARNING!! Some values in the pdb structure of 6BWH_C l = 227 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 99, 100, 101, 156, 157, 186]),)  57/131 [============>.................] - ETA: 38:31 WARNING!! Different len(X) and len(Y) for 6BWH_C 228 227 WARNING!! Some values in the pdb structure of 6BXS_C l = 277 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),)  58/131 [============>.................] - ETA: 37:36 WARNING!! 
Different len(X) and len(Y) for 6BXS_C 281 277 WARNING!! Some values in the pdb structure of 6BXW_A l = 277 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 107, 108, 109, 110, 111, 112]),)  59/131 [============>.................] - ETA: 36:51 WARNING!! Different len(X) and len(Y) for 6BXW_A 278 277 WARNING!! Some values in the pdb structure of 6BZT_D l = 522 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]),)  60/131 [============>.................] - ETA: 36:06 61/131 [============>.................] - ETA: 35:10 62/131 [=============>................] - ETA: 34:15 WARNING!! Some values in the pdb structure of 6CB6_A l = 121 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Different len(X) and len(Y) for 6CB6_A 122 121 WARNING!! Some values in the pdb structure of 6CCI_A l = 486 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131]),)  63/131 [=============>................] - ETA: 34:13 64/131 [=============>................] - ETA: 33:19 WARNING!! Different len(X) and len(Y) for 6CCI_A 487 486 WARNING!! Some values in the pdb structure of 6CGO_B l = 545 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 455, 456, 457, 458, 459, 460, 533, 534, 535, 536, 537]),)  65/131 [=============>................] - ETA: 33:04 WARNING!! Some values in the pdb structure of 6CK1_D l = 408 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 186, 187, 188, 189, 190, 191]),)  66/131 [==============>...............] - ETA: 33:14 WARNING!! Some values in the pdb structure of 6CMK_A l = 405 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29]),)  67/131 [==============>...............] - ETA: 32:51 WARNING!! Different len(X) and len(Y) for 6CMK_A 421 405 WARNING!! Some values in the pdb structure of 6CP8_B l = 163 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6CP8_D l = 164 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),)  68/131 [==============>...............] - ETA: 32:27 WARNING!! Some values in the pdb structure of 6CP9_G l = 121 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]),)  69/131 [==============>...............] - ETA: 31:36 WARNING!! Different len(X) and len(Y) for 6CP9_G 126 121 WARNING!! Different len(X) and len(Y) for 6CP9_H 116 114 WARNING!! Some values in the pdb structure of 6CPU_A l = 570 are missing or nan! Indices are: (array([ 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 104, 105, 106, 107, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 324, 325, 326, 327, 328]),)  70/131 [===============>..............] 
- ETA: 30:45 71/131 [===============>..............] - ETA: 29:56 72/131 [===============>..............] - ETA: 29:08 WARNING!! Different len(X) and len(Y) for 6CPU_A 571 570 WARNING!! Some values in the pdb structure of 6CSV_D l = 93 are missing or nan! Indices are: (array([ 0, 40, 41, 42, 43, 44, 45]),) WARNING!! Different len(X) and len(Y) for 6CSV_D 94 93 WARNING!! Some values in the pdb structure of 6CUL_H l = 275 are missing or nan! Indices are: (array([ 0, 27, 28, 29, 30, 225, 226, 227]),)  73/131 [===============>..............] - ETA: 29:09 74/131 [===============>..............] - ETA: 28:22 WARNING!! Some values in the pdb structure of 6CZ6_D l = 423 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 408, 409]),)  75/131 [================>.............] - ETA: 27:40 WARNING!! Different len(X) and len(Y) for 6CZ6_D 429 423 WARNING!! Some values in the pdb structure of 6CZT_A l = 90 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7]),) WARNING!! Different len(X) and len(Y) for 6CZT_A 196 90  76/131 [================>.............] - ETA: 27:16 WARNING!! Different len(X) and len(Y) for 6D0I_C 160 159 WARNING!! Some values in the pdb structure of 6D0I_D l = 72 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6D2S_A l = 284 are missing or nan! Indices are: (array([ 0, 203, 204, 205, 206, 207, 208, 209, 210]),)  77/131 [================>.............] - ETA: 26:31 78/131 [================>.............] - ETA: 25:47 79/131 [=================>............] - ETA: 25:04 WARNING!! Different len(X) and len(Y) for 6D2S_A 289 284 WARNING!! Some values in the pdb structure of 6D7Y_A l = 96 are missing or nan! Indices are: (array([0, 1, 2, 3]),)  80/131 [=================>............] - ETA: 24:22 WARNING!! Some values in the pdb structure of 6D97_D l = 547 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]),)  81/131 [=================>............] - ETA: 23:41 82/131 [=================>............] - ETA: 23:00 WARNING!! Some values in the pdb structure of 6D9F_B l = 325 are missing or nan! Indices are: (array([0, 1, 2]),)  83/131 [==================>...........] - ETA: 22:54 WARNING!! Different len(X) and len(Y) for 6D9F_B 332 325  84/131 [==================>...........] - ETA: 22:24 WARNING!! Different len(X) and len(Y) for 6D9M_A 337 326 WARNING!! Some values in the pdb structure of 6DAN_D l = 329 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),)  85/131 [==================>...........] - ETA: 21:54 WARNING!! Different len(X) and len(Y) for 6DAN_D 334 329 WARNING!! Some values in the pdb structure of 6DFL_A l = 259 are missing or nan! Indices are: (array([54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71]),)  86/131 [==================>...........] - ETA: 21:25 WARNING!! Some values in the pdb structure of 6DGN_B l = 95 are missing or nan! Indices are: (array([0, 1, 2, 3]),)  87/131 [==================>...........] - ETA: 20:50 88/131 [===================>..........] - ETA: 20:11 WARNING!! Different len(X) and len(Y) for 6DII_L 636 616 WARNING!! 
Some values in the pdb structure of 6DKA_I l = 233 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),)  89/131 [===================>..........] - ETA: 20:12 WARNING!! Different len(X) and len(Y) for 6DKA_I 234 233 WARNING!! Some values in the pdb structure of 6DKM_G l = 78 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Different len(X) and len(Y) for 6DKM_G 79 78 WARNING!! Some values in the pdb structure of 6DLC_A l = 115 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 76, 77, 78, 79]),) WARNING!! Some values in the pdb structure of 6DLO_A l = 358 are missing or nan! Indices are: (array([ 0, 1, 20, 21, 22, 23, 24, 115, 116, 117, 118, 119, 167, 168, 169, 170, 171, 172, 173, 174, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 324, 325, 326, 327, 328, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348]),)  90/131 [===================>..........] - ETA: 19:36 91/131 [===================>..........] - ETA: 18:58 92/131 [====================>.........] - ETA: 18:21 WARNING!! Different len(X) and len(Y) for 6DLO_A 389 358  93/131 [====================>.........] - ETA: 17:49 WARNING!! Some values in the pdb structure of 6DTD_A l = 1125 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 118, 119, 120, 121, 215, 216, 217, 218, 219, 220, 221, 222, 338, 339, 340, 341, 342, 343, 344, 345, 431, 432, 433, 434, 435, 436, 897, 898, 899, 900, 901, 902, 903, 1094, 1095, 1096, 1097]),)  94/131 [====================>.........] - ETA: 17:13 WARNING!! Different len(X) and len(Y) for 6DTD_A 1127 1125 WARNING!! Some values in the pdb structure of 6E0K_A l = 296 are missing or nan! Indices are: (array([0]),)  95/131 [====================>.........] - ETA: 18:26 WARNING!! Some values in the pdb structure of 6E0M_A l = 291 are missing or nan! Indices are: (array([76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94]),)  96/131 [====================>.........] - ETA: 17:52 WARNING!! Different len(X) and len(Y) for 6E0M_A 292 291 WARNING!! Some values in the pdb structure of 6E3C_C l = 137 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Some values in the pdb structure of 6E9B_D l = 420 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 77, 78, 79, 108, 109, 110, 123, 124]),)  97/131 [=====================>........] - ETA: 17:18 98/131 [=====================>........] - ETA: 16:40 WARNING!! Some values in the pdb structure of 6E9O_A l = 443 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289]),)  99/131 [=====================>........] - ETA: 16:12 WARNING!! Different len(X) and len(Y) for 6E9O_A 460 443 WARNING!! Some values in the pdb structure of 6EAZ_B l = 374 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171]),)  100/131 [=====================>........] - ETA: 15:46 WARNING!! Different len(X) and len(Y) for 6EAZ_B 379 374 WARNING!! 
Some values in the pdb structure of 6EDB_B l = 73 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Different len(X) and len(Y) for 6EDB_B 455 73 WARNING!! Some values in the pdb structure of 6EGC_A l = 155 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 78, 79, 80, 81]),)  101/131 [======================>.......] - ETA: 15:16 WARNING!! Different len(X) and len(Y) for 6EGC_A 156 155 WARNING!! Some values in the pdb structure of 6FCG_F l = 430 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),)  102/131 [======================>.......] - ETA: 14:39 103/131 [======================>.......] - ETA: 14:03 WARNING!! Some values in the pdb structure of 6FTO_C l = 81 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Some values in the pdb structure of 6FXD_B l = 141 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19]),) WARNING!! Some values in the pdb structure of 6G1H_A l = 343 are missing or nan! Indices are: (array([298]),)  104/131 [======================>.......] - ETA: 13:34 105/131 [=======================>......] - ETA: 12:59 106/131 [=======================>......] - ETA: 12:23 WARNING!! Different len(X) and len(Y) for 6G1H_A 345 343 WARNING!! Some values in the pdb structure of 6G3B_B l = 231 are missing or nan! Indices are: (array([ 0, 1, 45, 46, 47, 48]),)  107/131 [=======================>......] - ETA: 11:52 WARNING!! Different len(X) and len(Y) for 6G3B_B 238 231 WARNING!! Some values in the pdb structure of 6G70_B l = 607 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298]),)  108/131 [=======================>......] - ETA: 11:19 WARNING!! Different len(X) and len(Y) for 6G70_B 670 607  109/131 [=======================>......] - ETA: 11:00 110/131 [========================>.....] - ETA: 10:25 WARNING!! Some values in the pdb structure of 6GDJ_B l = 71 are missing or nan! Indices are: (array([0, 1]),)  111/131 [========================>.....] - ETA: 9:55 WARNING!! Different len(X) and len(Y) for 6GDJ_B 88 71 WARNING!! Some values in the pdb structure of 6GHO_B l = 295 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]),)  112/131 [========================>.....] - ETA: 9:22 113/131 [========================>.....] - ETA: 8:49 WARNING!! Different len(X) and len(Y) for 6GHO_B 298 295 WARNING!! Some values in the pdb structure of 6GMA_F l = 139 are missing or nan! Indices are: (array([ 0, 31, 32, 33, 34, 35, 36, 37, 38, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96]),)  114/131 [=========================>....] - ETA: 8:16 WARNING!! Different len(X) and len(Y) for 6GMA_F 140 139  115/131 [=========================>....] - ETA: 7:44 WARNING!! Some values in the pdb structure of 6H2X_A l = 354 are missing or nan! Indices are: (array([0, 1]),)  116/131 [=========================>....] - ETA: 7:12 117/131 [=========================>....] - ETA: 6:41 118/131 [==========================>...] - ETA: 6:10 WARNING!! 
Different len(X) and len(Y) for 6H2X_A 367 354 WARNING!! Some values in the pdb structure of 6H6N_B l = 129 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]),) WARNING!! Different len(X) and len(Y) for 6H6N_B 131 129 WARNING!! Some values in the pdb structure of 6HC2_X l = 68 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43]),) WARNING!! Different len(X) and len(Y) for 6HC2_X 71 68 WARNING!! Some values in the pdb structure of 6HPV_A l = 370 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306]),)  119/131 [==========================>...] - ETA: 5:40 120/131 [==========================>...] - ETA: 5:10 121/131 [==========================>...] - ETA: 4:40 WARNING!! Different len(X) and len(Y) for 6HPV_A 378 370 WARNING!! Some values in the pdb structure of 6I1R_B l = 313 are missing or nan! Indices are: (array([27, 28, 29]),)  122/131 [==========================>...] - ETA: 4:12 WARNING!! Different len(X) and len(Y) for 6I1R_B 322 313 WARNING!! Some values in the pdb structure of 6I9H_A l = 94 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6IAI_D l = 114 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 95]),) WARNING!! Different len(X) and len(Y) for 6IAI_D 121 114  123/131 [===========================>..] - ETA: 3:43 WARNING!! Some values in the pdb structure of 6IEH_A l = 97 are missing or nan! Indices are: (array([46, 47, 48, 73, 74, 75, 76, 77, 78, 84, 85, 86]),) WARNING!! Different len(X) and len(Y) for 6IEH_A 105 97 WARNING!! Some values in the pdb structure of 6N0T_A l = 440 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 89, 90, 91, 92, 93, 208, 209, 210, 211, 212]),)  124/131 [===========================>..] - ETA: 3:14 125/131 [===========================>..] - ETA: 2:46 126/131 [===========================>..] - ETA: 2:17 WARNING!! Different len(X) and len(Y) for 6N0T_A 441 440 WARNING!! Some values in the pdb structure of 6N8P_A l = 926 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 623, 624, 625, 626, 627, 628, 629, 630, 631, 632, 633, 634, 635, 636, 637, 638, 639, 640, 641, 642, 643, 644, 645, 646, 647, 648, 649, 650, 651, 652, 653, 654, 655, 656, 657, 658, 659, 660, 661, 662, 663, 664, 665, 666, 667, 668, 669, 670, 671, 672, 673, 674, 675, 676, 677, 678, 679, 680, 681, 682, 683, 684, 685, 686, 687, 688, 689, 690, 691, 692, 693, 694, 695, 841, 842, 843, 844, 845, 846]),)  127/131 [============================>.] - ETA: 1:50 WARNING!! Different len(X) and len(Y) for 6N8P_A 979 926  128/131 [============================>.] - ETA: 1:26 WARNING!! Some values in the pdb structure of 6OHZ_A l = 230 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143]),)  129/131 [============================>.] - ETA: 57s  130/131 [============================>.] 
- ETA: 28s
131/131 [==============================] - 3861s 29s/step
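Note (editor): the per-target numbers reported above for the validation set (top_L5_lr, top_L_lr, top_Nc_lr and their _mlr counterparts) follow the usual contact-evaluation convention. The sketch below is a minimal, hypothetical implementation under standard assumptions — long-range pairs have sequence separation >= 24, medium-plus-long-range >= 12, a pair counts as a true contact when the native distance is below 8 Å, and Nc is the number of true contacts (total_true in the log). The function name and the exact cutoffs are illustrative and may differ from what the evaluation script actually uses.

```python
import numpy as np

def topk_precision(pred, dist_true, L, min_sep=24, k=None, thresh=8.0):
    """Precision of the k highest-scoring residue pairs with |i - j| >= min_sep.

    pred      : (L, L) predicted contact likelihood (higher = more likely)
    dist_true : (L, L) native distance map; NaN where coordinates are missing
    k         : number of top pairs to score (e.g. L // 5, L, or Nc); default L // 5
    """
    iu = np.triu_indices(L, k=min_sep)        # upper triangle, sequence-separation filter
    scores, truth = pred[iu], dist_true[iu]
    valid = ~np.isnan(truth)                  # skip pairs whose native distance is unknown
    scores, truth = scores[valid], truth[valid]
    contacts = truth < thresh                 # true contacts at the assumed 8 A threshold
    if k is None:
        k = max(L // 5, 1)
    top = np.argsort(-scores)[:k]             # the k most confident predictions
    return contacts[top].mean()
```

Under these assumptions, top_L5_lr would correspond to topk_precision(P, D, L, min_sep=24, k=L // 5), top_L_mlr to min_sep=12 with k=L, and top_Nc_lr to min_sep=24 with k equal to total_true_lr for that target.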
WARNING: Some pdbs in the following list have NaNs in their distances: ['5OD1_A', '5OD9_B', '5OQK_A', '5OVM_A', '5W5P_A', '5WB4_H', '5XJO_E', '5XKJ_F', '5XKN_F', '5Y08_A', '5YA6_B', '5YRQ_E', '5YVQ_A', '5YVQ_B', '5Z2H_B', '5Z2I_D', '5Z34_A', '5Z36_A', '5Z3F_A', '5Z3K_B', '5Z6D_B', '5Z7C_A', '5Z8B_B', '5Z9T_B', '5ZB2_A', '5ZER_B', '5ZKE_A', '5ZKH_B', '5ZKT_B', '5ZME_A', '5ZNS_A', '5ZOR_A', '5ZT0_J', '5ZT7_B', '5ZX9_A', '5ZYO_D', '6A2W_A', '6A5F_B', '6A5G_B', '6A68_A', '6A83_A', '6A9J_B', '6A9W_A', '6AAY_A', '6AE1_B', '6AE8_D', '6AE9_B', '6AEF_A', '6AGH_B', '6AGJ_B', '6AHQ_T', '6AIT_F', '6AJJ_A', '6AKJ_B', '6BEA_A', '6BS5_A', '6BS5_B', '6BWH_C', '6BXS_C', '6BXW_A', '6BZJ_A', '6BZK_A', '6BZT_D', '6CB6_A', '6CCI_A', '6CGO_B', '6CK1_D', '6CMK_A', '6CP8_B', '6CP8_D', '6CP9_G', '6CP9_H', '6CPU_A', '6CSV_D', '6CUL_H', '6CZ6_D', '6CZT_A', '6D0I_C', '6D0I_D', '6D2S_A', '6D7Y_A', '6D7Y_B', '6D97_D', '6D9F_B', '6D9M_A', '6DAN_D', '6DFL_A', '6DGN_B', '6DII_L', '6DKA_I', '6DKM_G', '6DLC_A', '6DLO_A', '6DRF_A', '6DTD_A', '6E0K_A', '6E0M_A', '6E3C_C', '6E9B_D', '6E9O_A', '6EAZ_B', '6EDB_B', '6EGC_A', '6FCG_F', '6FTO_C', '6FXD_B', '6G1H_A', '6G3B_B', '6G70_B', '6G7G_A', '6G7O_A', '6G8Y_A', '6GCJ_A', '6GDJ_B', '6GHO_B', '6GMA_F', '6GMS_A', '6GW7_A', '6H2X_A', '6H6N_B', '6HC2_X', '6HPV_A', '6I1R_B', '6I9H_A', '6IAI_D', '6IEH_A', '6N0T_A', '6N8P_A', '6NU4_A', '6NX4_A', '6OHZ_A']
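Note (editor): the "missing or nan" warnings indicate CAMEO-HARD chains whose native distance maps contain NaN rows for unresolved residues, and the "Different len(X) and len(Y)" messages indicate a mismatch between the feature length and the number of resolved residues. The sketch below is a minimal, hypothetical way to handle such targets before scoring — flag the missing residues and exclude any pair that touches them, rather than discarding the whole chain. The function name and the length-cropping behaviour are illustrative assumptions, not taken from the actual evaluation script.

```python
import numpy as np

def mask_missing_pairs(dist_true):
    """Flag residues with no usable native distances and NaN-out their pairs.

    dist_true : (l, l) native distance map built from the PDB chain;
                unresolved residues appear as all-NaN rows/columns.
    """
    missing = np.where(np.isnan(dist_true).all(axis=1))[0]
    if missing.size:
        print('WARNING!! Some values in the distance map are missing or nan! '
              'Indices are:', missing)
    masked = dist_true.copy()
    masked[missing, :] = np.nan   # any pair involving a missing residue is excluded
    masked[:, missing] = np.nan   # downstream scoring should simply skip NaN pairs
    return masked, missing
```

Pairs left as NaN here are dropped by the `valid = ~np.isnan(truth)` filter in the precision helper above, so a handful of unresolved residues only reduces the number of scored pairs instead of corrupting the metric; a len(X)/len(Y) mismatch could likewise be handled by cropping both maps to the shorter length.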
Indices are: (array([ 0, 1, 2, 48, 49, 50, 51]),) WARNING!! Some values in the pdb structure of 5OVM_A l = 89 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),) WARNING!! Some values in the pdb structure of 5W5P_A l = 623 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]),) WARNING!! Some values in the pdb structure of 5WB4_H l = 194 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 5XJO_E l = 52 are missing or nan! Indices are: (array([25, 26, 27, 28, 29, 30, 31, 32, 33, 34]),) WARNING!! Some values in the pdb structure of 5XKJ_F l = 52 are missing or nan! Indices are: (array([25, 26, 27, 28, 29, 30, 31]),) WARNING!! Some values in the pdb structure of 5XKN_F l = 50 are missing or nan! Indices are: (array([ 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 27, 28, 29, 30, 31, 32]),) WARNING!! Some values in the pdb structure of 5YA6_B l = 211 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45]),) WARNING!! Some values in the pdb structure of 5YRQ_E l = 148 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35]),) WARNING!! Some values in the pdb structure of 5YVQ_A l = 504 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 472, 473, 474, 475, 476, 477, 478, 479, 480]),) WARNING!! Some values in the pdb structure of 5Z2H_B l = 99 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7]),) WARNING!! Some values in the pdb structure of 5Z2I_D l = 99 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),) WARNING!! Some values in the pdb structure of 5Z34_A l = 365 are missing or nan! Indices are: (array([0, 1, 2, 3]),) WARNING!! Some values in the pdb structure of 5Z3F_A l = 399 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]),) WARNING!! Some values in the pdb structure of 5Z3K_B l = 484 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 5Z6D_B l = 206 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 5Z7C_A l = 463 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394]),) WARNING!! Some values in the pdb structure of 5Z8B_B l = 206 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 5Z9T_B l = 536 are missing or nan! 
Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),) WARNING!! Some values in the pdb structure of 5ZB2_A l = 405 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Some values in the pdb structure of 5ZER_B l = 354 are missing or nan! Indices are: (array([230, 231, 232, 233, 234, 265, 266, 267]),) WARNING!! Some values in the pdb structure of 5ZKE_A l = 178 are missing or nan! Indices are: (array([0, 1, 2, 3]),) WARNING!! Some values in the pdb structure of 5ZKH_B l = 171 are missing or nan! Indices are: (array([ 0, 27, 28]),) WARNING!! Some values in the pdb structure of 5ZME_A l = 686 are missing or nan! Indices are: (array([ 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 175, 176, 177, 178, 179, 180, 444, 445, 446, 447, 448, 449, 450, 451, 544, 545, 546, 547, 548, 549, 550, 551, 552, 553, 554, 555, 556, 557, 558, 559, 560]),) WARNING!! Some values in the pdb structure of 5ZNS_A l = 382 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 5ZT0_J l = 47 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]),) WARNING!! Some values in the pdb structure of 5ZT7_B l = 263 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]),) WARNING!! Some values in the pdb structure of 5ZX9_A l = 287 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 175, 176, 177, 178, 246, 247, 248, 249, 250, 251, 252, 253, 254]),) WARNING!! Some values in the pdb structure of 5ZYO_D l = 158 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Some values in the pdb structure of 6A5F_B l = 152 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6A5G_B l = 159 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6A68_A l = 173 are missing or nan! Indices are: (array([136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146]),) WARNING!! Some values in the pdb structure of 6A83_A l = 394 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),) WARNING!! Some values in the pdb structure of 6A9J_B l = 379 are missing or nan! Indices are: (array([219, 220, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 307, 308, 309, 310, 311, 312, 313, 314, 341, 342, 343, 344, 345, 360, 361, 362, 363, 364]),) WARNING!! Some values in the pdb structure of 6A9W_A l = 314 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209]),) WARNING!! Some values in the pdb structure of 6AAY_A l = 1225 are missing or nan! 
Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 335, 336, 337, 338, 339, 340, 341, 342, 343, 395, 396, 397, 398, 1065, 1066, 1067, 1068]),) WARNING!! Some values in the pdb structure of 6AE1_B l = 145 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43]),) WARNING!! Some values in the pdb structure of 6AE8_D l = 119 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106]),) WARNING!! Some values in the pdb structure of 6AEF_A l = 459 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6AGH_B l = 335 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 127, 128, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 232, 233, 234, 280, 281, 282, 283, 284, 317, 318, 319, 320, 321, 322]),) WARNING!! Some values in the pdb structure of 6AGJ_B l = 374 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 317, 318, 319]),) WARNING!! Some values in the pdb structure of 6AHQ_T l = 129 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21]),) WARNING!! Some values in the pdb structure of 6AIT_F l = 438 are missing or nan! Indices are: (array([ 0, 110, 111, 112, 113, 114, 115, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 190, 191, 192, 193, 194, 195, 196, 203, 204, 205]),) WARNING!! Some values in the pdb structure of 6AJJ_A l = 941 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399]),) WARNING!! Some values in the pdb structure of 6AKJ_B l = 152 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 53, 54, 55, 56, 57, 58, 59, 60, 61, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127]),) WARNING!! Some values in the pdb structure of 6BEA_A l = 438 are missing or nan! Indices are: (array([0, 1, 2, 3, 4]),) WARNING!! Some values in the pdb structure of 6BS5_A l = 340 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]),) WARNING!! Some values in the pdb structure of 6BS5_B l = 375 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 185, 186, 187, 188, 189, 190, 191, 192, 226, 227, 307, 308, 309, 310, 311, 312]),) WARNING!! Some values in the pdb structure of 6BWH_C l = 227 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 99, 100, 101, 156, 157, 186]),) WARNING!! Some values in the pdb structure of 6BXS_C l = 277 are missing or nan! 
Indices are: (array([0, 1, 2, 3, 4, 5, 6]),) WARNING!! Some values in the pdb structure of 6BXW_A l = 277 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 107, 108, 109, 110, 111, 112]),) WARNING!! Some values in the pdb structure of 6BZT_D l = 522 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]),) WARNING!! Some values in the pdb structure of 6CB6_A l = 121 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Some values in the pdb structure of 6CCI_A l = 486 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131]),) WARNING!! Some values in the pdb structure of 6CGO_B l = 545 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 455, 456, 457, 458, 459, 460, 533, 534, 535, 536, 537]),) WARNING!! Some values in the pdb structure of 6CK1_D l = 408 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 186, 187, 188, 189, 190, 191]),) WARNING!! Some values in the pdb structure of 6CMK_A l = 405 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29]),) WARNING!! Some values in the pdb structure of 6CP8_B l = 163 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6CP8_D l = 164 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),) WARNING!! Some values in the pdb structure of 6CP9_G l = 121 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]),) WARNING!! Some values in the pdb structure of 6CPU_A l = 570 are missing or nan! Indices are: (array([ 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 104, 105, 106, 107, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 324, 325, 326, 327, 328]),) WARNING!! Some values in the pdb structure of 6CSV_D l = 93 are missing or nan! Indices are: (array([ 0, 40, 41, 42, 43, 44, 45]),) WARNING!! Some values in the pdb structure of 6CUL_H l = 275 are missing or nan! Indices are: (array([ 0, 27, 28, 29, 30, 225, 226, 227]),) WARNING!! Some values in the pdb structure of 6CZ6_D l = 423 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 408, 409]),) WARNING!! Some values in the pdb structure of 6CZT_A l = 90 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6, 7]),) WARNING!! Some values in the pdb structure of 6D0I_D l = 72 are missing or nan! Indices are: (array([0]),) WARNING!! 
Some values in the pdb structure of 6D2S_A l = 284 are missing or nan! Indices are: (array([ 0, 203, 204, 205, 206, 207, 208, 209, 210]),) WARNING!! Some values in the pdb structure of 6D7Y_A l = 96 are missing or nan! Indices are: (array([0, 1, 2, 3]),) WARNING!! Some values in the pdb structure of 6D97_D l = 547 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]),) WARNING!! Some values in the pdb structure of 6D9F_B l = 325 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Some values in the pdb structure of 6DAN_D l = 329 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Some values in the pdb structure of 6DFL_A l = 259 are missing or nan! Indices are: (array([54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71]),) WARNING!! Some values in the pdb structure of 6DGN_B l = 95 are missing or nan! Indices are: (array([0, 1, 2, 3]),) WARNING!! Some values in the pdb structure of 6DKA_I l = 233 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5, 6]),) WARNING!! Some values in the pdb structure of 6DKM_G l = 78 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6DLC_A l = 115 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 76, 77, 78, 79]),) WARNING!! Some values in the pdb structure of 6DLO_A l = 358 are missing or nan! Indices are: (array([ 0, 1, 20, 21, 22, 23, 24, 115, 116, 117, 118, 119, 167, 168, 169, 170, 171, 172, 173, 174, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 324, 325, 326, 327, 328, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348]),) WARNING!! Some values in the pdb structure of 6DTD_A l = 1125 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 118, 119, 120, 121, 215, 216, 217, 218, 219, 220, 221, 222, 338, 339, 340, 341, 342, 343, 344, 345, 431, 432, 433, 434, 435, 436, 897, 898, 899, 900, 901, 902, 903, 1094, 1095, 1096, 1097]),) WARNING!! Some values in the pdb structure of 6E0K_A l = 296 are missing or nan! Indices are: (array([0]),) WARNING!! Some values in the pdb structure of 6E0M_A l = 291 are missing or nan! Indices are: (array([76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94]),) WARNING!! Some values in the pdb structure of 6E3C_C l = 137 are missing or nan! Indices are: (array([0, 1, 2]),) WARNING!! Some values in the pdb structure of 6E9B_D l = 420 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 77, 78, 79, 108, 109, 110, 123, 124]),) WARNING!! Some values in the pdb structure of 6E9O_A l = 443 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289]),) WARNING!! Some values in the pdb structure of 6EAZ_B l = 374 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171]),) WARNING!! Some values in the pdb structure of 6EDB_B l = 73 are missing or nan! 
Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Some values in the pdb structure of 6EGC_A l = 155 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 78, 79, 80, 81]),) WARNING!! Some values in the pdb structure of 6FCG_F l = 430 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36]),) WARNING!! Some values in the pdb structure of 6FTO_C l = 81 are missing or nan! Indices are: (array([0, 1, 2, 3, 4, 5]),) WARNING!! Some values in the pdb structure of 6FXD_B l = 141 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19]),) WARNING!! Some values in the pdb structure of 6G1H_A l = 343 are missing or nan! Indices are: (array([298]),) WARNING!! Some values in the pdb structure of 6G3B_B l = 231 are missing or nan! Indices are: (array([ 0, 1, 45, 46, 47, 48]),) WARNING!! Some values in the pdb structure of 6G70_B l = 607 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298]),) WARNING!! Some values in the pdb structure of 6GDJ_B l = 71 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6GHO_B l = 295 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]),) WARNING!! Some values in the pdb structure of 6GMA_F l = 139 are missing or nan! Indices are: (array([ 0, 31, 32, 33, 34, 35, 36, 37, 38, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96]),) WARNING!! Some values in the pdb structure of 6H2X_A l = 354 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6H6N_B l = 129 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]),) WARNING!! Some values in the pdb structure of 6HC2_X l = 68 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43]),) WARNING!! Some values in the pdb structure of 6HPV_A l = 370 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306]),) WARNING!! Some values in the pdb structure of 6I1R_B l = 313 are missing or nan! Indices are: (array([27, 28, 29]),) WARNING!! Some values in the pdb structure of 6I9H_A l = 94 are missing or nan! Indices are: (array([0, 1]),) WARNING!! Some values in the pdb structure of 6IAI_D l = 114 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 95]),) WARNING!! Some values in the pdb structure of 6IEH_A l = 97 are missing or nan! Indices are: (array([46, 47, 48, 73, 74, 75, 76, 77, 78, 84, 85, 86]),) WARNING!! Some values in the pdb structure of 6N0T_A l = 440 are missing or nan! 
Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 89, 90, 91, 92, 93, 208, 209, 210, 211, 212]),) WARNING!! Some values in the pdb structure of 6N8P_A l = 926 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 623, 624, 625, 626, 627, 628, 629, 630, 631, 632, 633, 634, 635, 636, 637, 638, 639, 640, 641, 642, 643, 644, 645, 646, 647, 648, 649, 650, 651, 652, 653, 654, 655, 656, 657, 658, 659, 660, 661, 662, 663, 664, 665, 666, 667, 668, 669, 670, 671, 672, 673, 674, 675, 676, 677, 678, 679, 680, 681, 682, 683, 684, 685, 686, 687, 688, 689, 690, 691, 692, 693, 694, 695, 841, 842, 843, 844, 845, 846]),) WARNING!! Some values in the pdb structure of 6OHZ_A l = 230 are missing or nan! Indices are: (array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143]),) MAE for 0 5OD1_A lr_d8 = 4.68 mlr_d8 = 4.01 lr_d12 = 3.37 mlr_d12 = 3.00 MAE for 1 5OD9_B lr_d8 = 4.34 mlr_d8 = 3.62 lr_d12 = 3.24 mlr_d12 = 2.80 MAE for 2 5OQK_A lr_d8 = 8.69 mlr_d8 = 7.76 lr_d12 = 6.17 mlr_d12 = 5.59 MAE for 3 5OVM_A lr_d8 = 3.05 mlr_d8 = 2.61 lr_d12 = 2.17 mlr_d12 = 1.91 MAE for 4 5W5P_A lr_d8 = 10.41 mlr_d8 = 9.80 lr_d12 = 9.64 mlr_d12 = 9.02 MAE for 5 5WB4_H lr_d8 = 2.20 mlr_d8 = 2.26 lr_d12 = 1.92 mlr_d12 = 1.89 MAE for 6 5XJO_E lr_d8 = 6.12 mlr_d8 = 4.13 lr_d12 = 4.28 mlr_d12 = 3.33 MAE for 7 5XKJ_F lr_d8 = 8.14 mlr_d8 = 5.18 lr_d12 = 5.96 mlr_d12 = 4.60 MAE for 8 5XKN_F lr_d8 = 4.79 mlr_d8 = 3.89 lr_d12 = 3.29 mlr_d12 = 2.99 MAE for 9 5Y08_A lr_d8 = 5.27 mlr_d8 = 5.41 lr_d12 = 4.47 mlr_d12 = 4.35 MAE for 10 5YA6_B lr_d8 = 6.34 mlr_d8 = 5.72 lr_d12 = 5.11 mlr_d12 = 4.91 MAE for 11 5YRQ_E lr_d8 = 7.69 mlr_d8 = 7.16 lr_d12 = 5.73 mlr_d12 = 5.15 MAE for 12 5YVQ_A lr_d8 = 10.49 mlr_d8 = 8.03 lr_d12 = 6.77 mlr_d12 = 5.61 MAE for 13 5YVQ_B lr_d8 = 4.85 mlr_d8 = 4.45 lr_d12 = 4.20 mlr_d12 = 4.17 MAE for 14 5Z2H_B lr_d8 = 2.33 mlr_d8 = 2.38 lr_d12 = 1.90 mlr_d12 = 1.90 MAE for 15 5Z2I_D lr_d8 = 2.17 mlr_d8 = 2.25 lr_d12 = 1.92 mlr_d12 = 1.91 MAE for 16 5Z34_A lr_d8 = 8.68 mlr_d8 = 8.66 lr_d12 = 7.19 mlr_d12 = 7.08 MAE for 17 5Z36_A lr_d8 = 4.88 mlr_d8 = 3.72 lr_d12 = 3.69 mlr_d12 = 3.21 MAE for 18 5Z3F_A lr_d8 = 4.19 mlr_d8 = 4.14 lr_d12 = 3.58 mlr_d12 = 3.44 MAE for 19 5Z3K_B lr_d8 = 5.33 mlr_d8 = 5.32 lr_d12 = 5.07 mlr_d12 = 4.91 MAE for 20 5Z6D_B lr_d8 = 12.25 mlr_d8 = 9.97 lr_d12 = 10.25 mlr_d12 = 8.66 MAE for 21 5Z7C_A lr_d8 = 7.71 mlr_d8 = 6.93 lr_d12 = 6.02 mlr_d12 = 5.46 MAE for 22 5Z8B_B lr_d8 = 3.12 mlr_d8 = 3.21 lr_d12 = 3.12 mlr_d12 = 3.03 MAE for 23 5Z9T_B lr_d8 = 7.33 mlr_d8 = 6.85 lr_d12 = 8.19 mlr_d12 = 7.51 MAE for 24 5ZB2_A lr_d8 = 3.52 mlr_d8 = 3.59 lr_d12 = 3.70 mlr_d12 = 3.46 MAE for 25 5ZER_B lr_d8 = 3.82 mlr_d8 = 3.44 lr_d12 = 3.50 mlr_d12 = 3.15 MAE for 26 5ZKE_A lr_d8 = 5.43 mlr_d8 = 5.63 lr_d12 = 3.98 mlr_d12 = 4.12 MAE for 27 5ZKH_B lr_d8 = 6.05 mlr_d8 = 6.13 lr_d12 = 4.59 mlr_d12 = 4.57 MAE for 28 5ZKT_B lr_d8 = nan mlr_d8 = 3.82 lr_d12 = 7.72 mlr_d12 = 3.22 MAE for 29 5ZME_A lr_d8 = 12.19 mlr_d8 = 11.45 lr_d12 = 10.74 mlr_d12 = 9.97 MAE for 30 5ZNS_A lr_d8 = 8.99 mlr_d8 = 8.75 lr_d12 = 7.60 mlr_d12 = 7.35 MAE for 31 5ZOR_A lr_d8 = 7.01 mlr_d8 = 5.80 lr_d12 = 5.75 mlr_d12 = 4.60 MAE for 32 5ZT0_J lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = nan MAE for 33 5ZT7_B lr_d8 = 2.63 mlr_d8 = 2.52 lr_d12 = 2.11 mlr_d12 = 2.03 MAE for 34 5ZX9_A 
lr_d8 = 11.25 mlr_d8 = 9.28 lr_d12 = 9.20 mlr_d12 = 8.14 MAE for 35 5ZYO_D lr_d8 = 6.07 mlr_d8 = 5.84 lr_d12 = 5.36 mlr_d12 = 4.73 MAE for 36 6A2W_A lr_d8 = 4.10 mlr_d8 = 6.20 lr_d12 = 3.69 mlr_d12 = 4.74 MAE for 37 6A5F_B lr_d8 = 4.63 mlr_d8 = 3.43 lr_d12 = 3.54 mlr_d12 = 3.04 MAE for 38 6A5G_B lr_d8 = 5.56 mlr_d8 = 4.14 lr_d12 = 4.21 mlr_d12 = 3.62 MAE for 39 6A68_A lr_d8 = 8.29 mlr_d8 = 7.79 lr_d12 = 6.18 mlr_d12 = 5.71 MAE for 40 6A83_A lr_d8 = 6.16 mlr_d8 = 5.88 lr_d12 = 5.29 mlr_d12 = 5.04 MAE for 41 6A9J_B lr_d8 = 9.09 mlr_d8 = 7.78 lr_d12 = 8.90 mlr_d12 = 8.06 MAE for 42 6A9W_A lr_d8 = 3.48 mlr_d8 = 3.41 lr_d12 = 2.98 mlr_d12 = 2.89 MAE for 43 6AAY_A lr_d8 = 19.41 mlr_d8 = 19.07 lr_d12 = 16.28 mlr_d12 = 15.95 MAE for 44 6AE1_B lr_d8 = 3.30 mlr_d8 = 3.45 lr_d12 = 2.33 mlr_d12 = 2.36 MAE for 45 6AE8_D lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = nan MAE for 46 6AE9_B lr_d8 = 5.12 mlr_d8 = 4.65 lr_d12 = 4.95 mlr_d12 = 4.49 MAE for 47 6AEF_A lr_d8 = 6.45 mlr_d8 = 5.91 lr_d12 = 5.32 mlr_d12 = 5.01 MAE for 48 6AGH_B lr_d8 = 11.55 mlr_d8 = 10.63 lr_d12 = 9.41 mlr_d12 = 8.52 MAE for 49 6AGJ_B lr_d8 = 12.89 mlr_d8 = 11.49 lr_d12 = 10.75 mlr_d12 = 9.52 MAE for 50 6AHQ_T lr_d8 = 1.93 mlr_d8 = 1.77 lr_d12 = 1.70 mlr_d12 = 1.59 MAE for 51 6AIT_F lr_d8 = 5.41 mlr_d8 = 4.51 lr_d12 = 4.65 mlr_d12 = 4.01 MAE for 52 6AJJ_A lr_d8 = 6.13 mlr_d8 = 6.16 lr_d12 = 5.23 mlr_d12 = 5.19 MAE for 53 6AKJ_B lr_d8 = 9.41 mlr_d8 = 7.22 lr_d12 = 7.49 mlr_d12 = 6.18 MAE for 54 6BEA_A lr_d8 = 8.35 mlr_d8 = 7.35 lr_d12 = 8.13 mlr_d12 = 7.33 MAE for 55 6BS5_A lr_d8 = 4.71 mlr_d8 = 4.56 lr_d12 = 4.08 mlr_d12 = 3.88 MAE for 56 6BS5_B lr_d8 = 7.16 mlr_d8 = 6.84 lr_d12 = 5.76 mlr_d12 = 5.49 MAE for 57 6BWH_C lr_d8 = 3.84 mlr_d8 = 3.59 lr_d12 = 3.53 mlr_d12 = 3.28 MAE for 58 6BXS_C lr_d8 = 9.83 mlr_d8 = 9.15 lr_d12 = 8.52 mlr_d12 = 7.93 MAE for 59 6BXW_A lr_d8 = 10.23 mlr_d8 = 9.57 lr_d12 = 8.87 mlr_d12 = 8.28 MAE for 60 6BZJ_A lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = nan MAE for 61 6BZK_A lr_d8 = nan mlr_d8 = nan lr_d12 = 14.92 mlr_d12 = 13.47 MAE for 62 6BZT_D lr_d8 = 7.41 mlr_d8 = 7.03 lr_d12 = 6.91 mlr_d12 = 6.56 MAE for 63 6CB6_A lr_d8 = 10.90 mlr_d8 = 10.29 lr_d12 = 8.14 mlr_d12 = 7.56 MAE for 64 6CCI_A lr_d8 = 7.06 mlr_d8 = 7.04 lr_d12 = 6.37 mlr_d12 = 6.22 MAE for 65 6CGO_B lr_d8 = 5.85 mlr_d8 = 5.47 lr_d12 = 4.92 mlr_d12 = 4.68 MAE for 66 6CK1_D lr_d8 = 7.86 mlr_d8 = 5.50 lr_d12 = 5.62 mlr_d12 = 4.40 MAE for 67 6CMK_A lr_d8 = 7.71 mlr_d8 = 5.40 lr_d12 = 5.63 mlr_d12 = 4.42 MAE for 68 6CP8_B lr_d8 = 10.78 mlr_d8 = 8.15 lr_d12 = 8.65 mlr_d12 = 7.06 MAE for 69 6CP8_D lr_d8 = 10.17 mlr_d8 = 8.84 lr_d12 = 8.02 mlr_d12 = 6.85 MAE for 70 6CP9_G lr_d8 = 2.89 mlr_d8 = 2.14 lr_d12 = 2.24 mlr_d12 = 2.01 MAE for 71 6CP9_H lr_d8 = 6.96 mlr_d8 = 3.79 lr_d12 = 4.31 mlr_d12 = 2.97 MAE for 72 6CPU_A lr_d8 = 10.70 mlr_d8 = 10.25 lr_d12 = 8.64 mlr_d12 = 8.18 MAE for 73 6CSV_D lr_d8 = 5.94 mlr_d8 = 5.87 lr_d12 = 4.36 mlr_d12 = 4.30 MAE for 74 6CUL_H lr_d8 = 5.98 mlr_d8 = 6.02 lr_d12 = 5.24 mlr_d12 = 5.14 MAE for 75 6CZ6_D lr_d8 = 6.29 mlr_d8 = 5.69 lr_d12 = 5.34 mlr_d12 = 5.05 MAE for 76 6CZT_A lr_d8 = 2.09 mlr_d8 = 3.03 lr_d12 = 2.46 mlr_d12 = 2.75 MAE for 77 6D0I_C lr_d8 = 3.73 mlr_d8 = 3.55 lr_d12 = 3.80 mlr_d12 = 3.65 MAE for 78 6D0I_D lr_d8 = 5.91 mlr_d8 = 6.00 lr_d12 = 5.14 mlr_d12 = 4.84 MAE for 79 6D2S_A lr_d8 = 5.71 mlr_d8 = 4.88 lr_d12 = 4.72 mlr_d12 = 4.25 MAE for 80 6D7Y_A lr_d8 = 4.34 mlr_d8 = 3.57 lr_d12 = 3.79 mlr_d12 = 3.01 MAE for 81 6D7Y_B lr_d8 = 9.11 mlr_d8 = 8.60 lr_d12 = 7.62 mlr_d12 = 7.44 MAE for 82 
6D97_D lr_d8 = 4.26 mlr_d8 = 3.96 lr_d12 = 3.82 mlr_d12 = 3.60 MAE for 83 6D9F_B lr_d8 = 9.34 mlr_d8 = 8.83 lr_d12 = 7.29 mlr_d12 = 6.94 MAE for 84 6D9M_A lr_d8 = 7.30 mlr_d8 = 7.31 lr_d12 = 6.74 mlr_d12 = 6.61 MAE for 85 6DAN_D lr_d8 = 3.38 mlr_d8 = 3.17 lr_d12 = 2.93 mlr_d12 = 2.74 MAE for 86 6DFL_A lr_d8 = 3.05 mlr_d8 = 2.72 lr_d12 = 2.75 mlr_d12 = 2.59 MAE for 87 6DGN_B lr_d8 = 13.18 mlr_d8 = 8.50 lr_d12 = 9.60 mlr_d12 = 7.48 MAE for 88 6DII_L lr_d8 = 7.02 mlr_d8 = 6.70 lr_d12 = 6.06 mlr_d12 = 5.77 MAE for 89 6DKA_I lr_d8 = 8.66 mlr_d8 = 7.56 lr_d12 = 6.39 mlr_d12 = 5.64 MAE for 90 6DKM_G lr_d8 = 19.81 mlr_d8 = 19.36 lr_d12 = 16.69 mlr_d12 = 16.27 MAE for 91 6DLC_A lr_d8 = 16.76 mlr_d8 = 15.16 lr_d12 = 14.12 mlr_d12 = 12.93 MAE for 92 6DLO_A lr_d8 = 8.52 mlr_d8 = 5.58 lr_d12 = 6.42 mlr_d12 = 4.94 MAE for 93 6DRF_A lr_d8 = 7.17 mlr_d8 = 6.93 lr_d12 = 6.07 mlr_d12 = 5.85 MAE for 94 6DTD_A lr_d8 = 18.60 mlr_d8 = 17.59 lr_d12 = 15.56 mlr_d12 = 14.72 MAE for 95 6E0K_A lr_d8 = 6.77 mlr_d8 = 5.80 lr_d12 = 5.47 mlr_d12 = 4.83 MAE for 96 6E0M_A lr_d8 = 5.59 mlr_d8 = 4.73 lr_d12 = 4.44 mlr_d12 = 3.95 MAE for 97 6E3C_C lr_d8 = 2.63 mlr_d8 = 2.63 lr_d12 = 2.78 mlr_d12 = 2.84 MAE for 98 6E9B_D lr_d8 = 9.06 mlr_d8 = 8.28 lr_d12 = 8.03 mlr_d12 = 7.40 MAE for 99 6E9O_A lr_d8 = 2.64 mlr_d8 = 2.58 lr_d12 = 2.30 mlr_d12 = 2.23 MAE for 100 6EAZ_B lr_d8 = 11.93 mlr_d8 = 10.81 lr_d12 = 9.94 mlr_d12 = 8.91 MAE for 101 6EDB_B lr_d8 = 9.00 mlr_d8 = 7.25 lr_d12 = 6.82 mlr_d12 = 5.58 MAE for 102 6EGC_A lr_d8 = 7.22 mlr_d8 = 6.65 lr_d12 = 5.79 mlr_d12 = 5.36 MAE for 103 6FCG_F lr_d8 = 6.24 mlr_d8 = 6.12 lr_d12 = 5.52 mlr_d12 = 5.33 MAE for 104 6FTO_C lr_d8 = 12.19 mlr_d8 = 11.05 lr_d12 = 10.54 mlr_d12 = 9.29 MAE for 105 6FXD_B lr_d8 = 3.04 mlr_d8 = 3.20 lr_d12 = 3.13 mlr_d12 = 3.08 MAE for 106 6G1H_A lr_d8 = 3.59 mlr_d8 = 3.51 lr_d12 = 3.36 mlr_d12 = 3.29 MAE for 107 6G3B_B lr_d8 = 7.64 mlr_d8 = 6.38 lr_d12 = 5.96 mlr_d12 = 5.19 MAE for 108 6G70_B lr_d8 = 7.22 mlr_d8 = 5.20 lr_d12 = 5.92 mlr_d12 = 4.48 MAE for 109 6G7G_A lr_d8 = 3.30 mlr_d8 = 3.16 lr_d12 = 3.44 mlr_d12 = 3.18 MAE for 110 6G7O_A lr_d8 = 6.98 mlr_d8 = 6.96 lr_d12 = 5.80 mlr_d12 = 5.62 MAE for 111 6G8Y_A lr_d8 = 5.68 mlr_d8 = 4.57 lr_d12 = 4.95 mlr_d12 = 4.24 MAE for 112 6GCJ_A lr_d8 = 12.97 mlr_d8 = 12.25 lr_d12 = 12.00 mlr_d12 = 11.17 MAE for 113 6GDJ_B lr_d8 = 8.85 mlr_d8 = 8.01 lr_d12 = 7.34 mlr_d12 = 6.58 MAE for 114 6GHO_B lr_d8 = 5.45 mlr_d8 = 5.01 lr_d12 = 4.71 mlr_d12 = 4.28 MAE for 115 6GMA_F lr_d8 = 5.69 mlr_d8 = 4.69 lr_d12 = 5.29 mlr_d12 = 4.51 MAE for 116 6GMS_A lr_d8 = 7.19 mlr_d8 = 5.25 lr_d12 = 4.89 mlr_d12 = 4.14 MAE for 117 6GW7_A lr_d8 = 4.28 mlr_d8 = 3.60 lr_d12 = 3.09 mlr_d12 = 2.60 MAE for 118 6H2X_A lr_d8 = 3.88 mlr_d8 = 3.88 lr_d12 = 2.95 mlr_d12 = 2.98 MAE for 119 6H6N_B lr_d8 = 3.17 mlr_d8 = 2.80 lr_d12 = 2.99 mlr_d12 = 2.60 MAE for 120 6HC2_X lr_d8 = nan mlr_d8 = nan lr_d12 = nan mlr_d12 = nan MAE for 121 6HPV_A lr_d8 = 10.92 mlr_d8 = 9.66 lr_d12 = 9.87 mlr_d12 = 9.12 MAE for 122 6I1R_B lr_d8 = 3.23 mlr_d8 = 3.23 lr_d12 = 2.50 mlr_d12 = 2.47 MAE for 123 6I9H_A lr_d8 = 5.82 mlr_d8 = 5.55 lr_d12 = 4.21 mlr_d12 = 4.40 MAE for 124 6IAI_D lr_d8 = 10.01 mlr_d8 = 10.55 lr_d12 = 7.63 mlr_d12 = 7.84 MAE for 125 6IEH_A lr_d8 = nan mlr_d8 = 9.52 lr_d12 = nan mlr_d12 = 6.49 MAE for 126 6N0T_A lr_d8 = 8.83 mlr_d8 = 7.28 lr_d12 = 7.40 mlr_d12 = 6.57 MAE for 127 6N8P_A lr_d8 = 16.28 mlr_d8 = 12.94 lr_d12 = 13.44 mlr_d12 = 11.33 MAE for 128 6NU4_A lr_d8 = nan mlr_d8 = 8.15 lr_d12 = 15.35 mlr_d12 = 9.29 MAE for 129 6NX4_A lr_d8 = 4.47 
mlr_d8 = 4.11 lr_d12 = 3.82 mlr_d12 = 3.39 MAE for 130 6OHZ_A lr_d8 = 11.72 mlr_d8 = 10.93 lr_d12 = 9.23 mlr_d12 = 8.48 Average MAE : lr<8A = 7.1291 mlr<8A = 6.4211 lr<12A = 6.0746 mlr<12A = 5.4941
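The per-target and average MAE values above compare predicted and true inter-residue distances, with the true distance restricted to a cutoff (8 Å for lr<8A/mlr<8A, 12 Å for lr<12A/mlr<12A) and, by the usual convention, a sequence-separation filter: 'lr' would be long-range pairs (separation >= 24) and 'mlr' medium-plus-long-range pairs (separation >= 12). The sketch below shows one way such numbers can be computed with NumPy; the function name, the separation thresholds and the NaN masking are assumptions for illustration, not taken from the code that produced this log.

import numpy as np

def mae_subset(pred, true, min_sep, dist_cutoff):
    # MAE over residue pairs with sequence separation >= min_sep and true distance < dist_cutoff.
    # pred, true: (L, L) distance matrices; NaNs in `true` mark unresolved residues.
    L = true.shape[0]
    i, j = np.triu_indices(L, k=min_sep)          # pairs (i, j) with j - i >= min_sep
    p, t = pred[i, j], true[i, j]
    valid = ~np.isnan(t)                          # drop pairs involving unresolved residues
    p, t = p[valid], t[valid]
    keep = t < dist_cutoff                        # restrict to true distances under the cutoff
    if keep.sum() == 0:
        return float('nan')                       # no valid pairs -> reported as nan in the log
    return float(np.mean(np.abs(p[keep] - t[keep])))

# Illustrative use per target (variable names hypothetical):
# lr_d8   = mae_subset(pred, true, min_sep=24, dist_cutoff=8.0)
# mlr_d8  = mae_subset(pred, true, min_sep=12, dist_cutoff=8.0)
# lr_d12  = mae_subset(pred, true, min_sep=24, dist_cutoff=12.0)
# mlr_d12 = mae_subset(pred, true, min_sep=12, dist_cutoff=12.0)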
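The repeated "missing or nan" warnings earlier in the log flag residues that have no usable coordinates in the deposited PDB chain, so their rows and columns in the true distance matrix are NaN; the printed "(array([...]),)" is the index tuple of those positions. Such entries have to be masked out before any per-target metric is computed, and when nothing valid survives the filters the metric itself is reported as nan, as seen for several targets above. A minimal sketch of locating such positions, assuming a NumPy distance matrix with NaN for unresolved residues; the function name and the all-NaN-row criterion are illustrative guesses, not read out of the code that produced this log.

import numpy as np

def missing_residues(true_dist):
    # Indices of residues whose entire row in the true distance matrix is NaN
    # (e.g. residues unresolved in the experimental structure).
    return np.where(np.all(np.isnan(true_dist), axis=1))

# Illustrative use (variables hypothetical):
# idx = missing_residues(true_dist)
# if idx[0].size > 0:
#     print('Some values are missing or nan! Indices are:', idx)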
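The per-target precision values below follow the standard contact-prediction protocol: a residue pair counts as a true contact when its true distance is under a threshold (commonly 8 Å between C-beta atoms), 'lr' restricts to long-range pairs and 'mlr' to medium-plus-long-range pairs, and precision is evaluated on the top L/5, top L and top Nc ranked predictions, where L is the chain length and Nc the number of true contacts (printed as total_true_lr / total_true_mlr). The sketch below is written under those common assumptions; the 8 Å threshold, the separation cutoffs and all names are illustrative rather than taken from the code that produced this log.

import numpy as np

def topk_precision(scores, true_dist, min_sep, top_k, thresh=8.0):
    # Fraction of the top_k highest-scoring pairs (separation >= min_sep) that are true contacts.
    # scores:    (L, L) predicted contact scores, higher = more confident
    # true_dist: (L, L) true distances, NaN where residues are unresolved
    L = true_dist.shape[0]
    i, j = np.triu_indices(L, k=min_sep)
    valid = ~np.isnan(true_dist[i, j])            # ignore pairs involving missing residues
    i, j = i[valid], j[valid]
    is_contact = true_dist[i, j] < thresh
    if is_contact.sum() == 0:
        return float('nan')                       # no true contacts -> precision undefined (nan)
    order = np.argsort(-scores[i, j])             # rank valid pairs by predicted score
    return float(np.mean(is_contact[order[:top_k]]))

# Illustrative use per target (variables hypothetical):
# Nc_lr     = number of true long-range contacts (printed below as total_true_lr)
# top_L5_lr = topk_precision(scores, true_dist, min_sep=24, top_k=max(1, L // 5))
# top_L_lr  = topk_precision(scores, true_dist, min_sep=24, top_k=L)
# top_Nc_lr = topk_precision(scores, true_dist, min_sep=24, top_k=Nc_lr)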
Precision for 0 - 5OD1_A top_L5_lr = 0.4737 top_L_lr = 0.1809 top_Nc_lr = 0.2941 top_L5_mlr = 0.7368 top_L_mlr = 0.3191 top_Nc_mlr = 0.4375 (total_true_lr = 34 total_true_mlr = 64) Precision for 1 - 5OD9_B top_L5_lr = 0.6111 top_L_lr = 0.2045 top_Nc_lr = 0.3571 top_L5_mlr = 1.0000 top_L_mlr = 0.4205 top_Nc_mlr = 0.4722 (total_true_lr = 42 total_true_mlr = 72) Precision for 2 - 5OQK_A top_L5_lr = 0.0000 top_L_lr = 0.0748 top_Nc_lr = 0.1159 top_L5_mlr = 0.1034 top_L_mlr = 0.1156 top_Nc_mlr = 0.1585 (total_true_lr = 69 total_true_mlr = 82) Precision for 3 -
5OVM_A top_L5_lr = 0.6250 top_L_lr = 0.3293 top_Nc_lr = 0.5128 top_L5_mlr = 0.7500 top_L_mlr = 0.4024 top_Nc_mlr = 0.4902 (total_true_lr = 39 total_true_mlr = 51)
Precision for 4 - 5W5P_A top_L5_lr = 0.6885 top_L_lr = 0.5344 top_Nc_lr = 0.3526 top_L5_mlr = 0.7705 top_L_mlr = 0.5820 top_Nc_mlr = 0.3457 (total_true_lr = 1157 total_true_mlr = 1594)
Precision for 5 - 5WB4_H top_L5_lr = 1.0000 top_L_lr = 0.8549 top_Nc_lr = 0.7304 top_L5_mlr = 1.0000 top_L_mlr = 0.8446 top_Nc_mlr = 0.7162 (total_true_lr = 319 total_true_mlr = 377)
Precision for 6 - 5XJO_E top_L5_lr = 1.0000 top_L_lr = 0.5000 top_Nc_lr = 0.5278 top_L5_mlr = 1.0000 top_L_mlr = 0.7857 top_Nc_mlr = 0.6613 (total_true_lr = 36 total_true_mlr = 62)
Precision for 7 - 5XKJ_F top_L5_lr = 0.5556 top_L_lr = 0.2444 top_Nc_lr = 0.3793 top_L5_mlr = 1.0000 top_L_mlr = 0.6222 top_Nc_mlr = 0.5962 (total_true_lr = 29 total_true_mlr = 52)
Precision for 8 - 5XKN_F top_L5_lr = 0.5714 top_L_lr = 0.2059 top_Nc_lr = 0.4545 top_L5_mlr = 0.5714 top_L_mlr = 0.6471 top_Nc_mlr = 0.6216 (total_true_lr = 11 total_true_mlr = 37)
Precision for 9 - 5Y08_A top_L5_lr = 0.9512 top_L_lr = 0.4439 top_Nc_lr = 0.5430 top_L5_mlr = 0.9512 top_L_mlr = 0.5024 top_Nc_mlr = 0.4884 (total_true_lr = 151 total_true_mlr = 215)
Precision for 10 - 5YA6_B top_L5_lr = 1.0000 top_L_lr = 0.6667 top_Nc_lr = 0.4903 top_L5_mlr = 1.0000 top_L_mlr = 0.8242 top_Nc_mlr = 0.5276 (total_true_lr = 310 total_true_mlr = 398)
Precision for 11 - 5YRQ_E top_L5_lr = 0.7500 top_L_lr = 0.4628 top_Nc_lr = 0.4907 top_L5_mlr = 0.7083 top_L_mlr = 0.5124 top_Nc_mlr = 0.5124 (total_true_lr = 108 total_true_mlr = 121)
Precision for 12 - 5YVQ_A top_L5_lr = 0.0417 top_L_lr = 0.0279 top_Nc_lr = 0.0312 top_L5_mlr = 0.4861 top_L_mlr = 0.2291 top_Nc_mlr = 0.2484 (total_true_lr = 96 total_true_mlr = 306)
Precision for 13 - 5YVQ_B top_L5_lr = 0.9524 top_L_lr = 0.5534 top_Nc_lr = 0.5472 top_L5_mlr = 1.0000 top_L_mlr = 0.7379 top_Nc_mlr = 0.5732 (total_true_lr = 106 total_true_mlr = 157)
Precision for 14 - 5Z2H_B top_L5_lr = 1.0000 top_L_lr = 0.4396 top_Nc_lr = 0.6939 top_L5_mlr = 1.0000 top_L_mlr = 0.6154 top_Nc_mlr = 0.6933 (total_true_lr = 49 total_true_mlr = 75)
Precision for 15 - 5Z2I_D top_L5_lr = 1.0000 top_L_lr = 0.4348 top_Nc_lr = 0.7021 top_L5_mlr = 1.0000 top_L_mlr = 0.6196 top_Nc_mlr = 0.6986 (total_true_lr = 47 total_true_mlr = 73)
Precision for 16 - 5Z34_A top_L5_lr = 0.9306 top_L_lr = 0.5928 top_Nc_lr = 0.4492 top_L5_mlr = 0.9722 top_L_mlr = 0.6371 top_Nc_mlr = 0.4464 (total_true_lr = 630 total_true_mlr = 793)
Precision for 17 - 5Z36_A top_L5_lr = 0.9032 top_L_lr = 0.4805 top_Nc_lr = 0.4805 top_L5_mlr = 0.9677 top_L_mlr = 0.7143 top_Nc_mlr = 0.5944 (total_true_lr = 154 total_true_mlr = 249)
Precision for 18 - 5Z3F_A top_L5_lr = 0.9730 top_L_lr = 0.7143 top_Nc_lr = 0.5657 top_L5_mlr = 0.9730 top_L_mlr = 0.7709 top_Nc_mlr = 0.5760 (total_true_lr = 594 total_true_mlr = 717)
Precision for 19 - 5Z3K_B top_L5_lr = 0.9897 top_L_lr = 0.8219 top_Nc_lr = 0.5703 top_L5_mlr = 0.9691 top_L_mlr = 0.8261 top_Nc_mlr = 0.5713 (total_true_lr = 1031 total_true_mlr = 1192)
Precision for 20 - 5Z6D_B top_L5_lr = 0.7073 top_L_lr = 0.4951 top_Nc_lr = 0.4006 top_L5_mlr = 0.9512 top_L_mlr = 0.5980 top_Nc_mlr = 0.4559 (total_true_lr = 332 total_true_mlr = 476)
Precision for 21 - 5Z7C_A top_L5_lr = 0.6867 top_L_lr = 0.4372 top_Nc_lr = 0.4071 top_L5_mlr = 0.7229 top_L_mlr = 0.5024 top_Nc_mlr = 0.4347 (total_true_lr = 452 total_true_mlr = 559)
Precision for 22 - 5Z8B_B top_L5_lr = 0.7561 top_L_lr = 0.6373 top_Nc_lr = 0.5513 top_L5_mlr = 0.8049 top_L_mlr = 0.6814 top_Nc_mlr = 0.5424 (total_true_lr = 341 total_true_mlr = 413)
Precision for 23 - 5Z9T_B top_L5_lr = 1.0000 top_L_lr = 0.8377 top_Nc_lr = 0.5753 top_L5_mlr = 1.0000 top_L_mlr = 0.8477 top_Nc_mlr = 0.5665 (total_true_lr = 1208 total_true_mlr = 1518)
Precision for 24 - 5ZB2_A top_L5_lr = 0.9500 top_L_lr = 0.7761 top_Nc_lr = 0.6545 top_L5_mlr = 0.9500 top_L_mlr = 0.8010 top_Nc_mlr = 0.6420 (total_true_lr = 715 total_true_mlr = 866)
Precision for 25 - 5ZER_B top_L5_lr = 0.9565 top_L_lr = 0.7572 top_Nc_lr = 0.6117 top_L5_mlr = 0.9855 top_L_mlr = 0.8526 top_Nc_mlr = 0.6453 (total_true_lr = 546 total_true_mlr = 750)
Precision for 26 - 5ZKE_A top_L5_lr = 0.5143 top_L_lr = 0.4023 top_Nc_lr = 0.3889 top_L5_mlr = 0.5429 top_L_mlr = 0.4138 top_Nc_mlr = 0.3761 (total_true_lr = 198 total_true_mlr = 226)
Precision for 27 - 5ZKH_B top_L5_lr = 0.6765 top_L_lr = 0.3810 top_Nc_lr = 0.3641 top_L5_mlr = 0.6471 top_L_mlr = 0.4048 top_Nc_mlr = 0.3529 (total_true_lr = 195 total_true_mlr = 221)
Precision for 28 - 5ZKT_B top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.2727 top_L_mlr = 0.0909 top_Nc_mlr = 0.6000 (total_true_lr = 0 total_true_mlr = 5)
Precision for 29 - 5ZME_A top_L5_lr = 0.8047 top_L_lr = 0.4107 top_Nc_lr = 0.3120 top_L5_mlr = 0.8672 top_L_mlr = 0.4577 top_Nc_mlr = 0.3279 (total_true_lr = 984 total_true_mlr = 1098)
Precision for 30 - 5ZNS_A top_L5_lr = 0.8684 top_L_lr = 0.5984 top_Nc_lr = 0.4504 top_L5_mlr = 0.8947 top_L_mlr = 0.6562 top_Nc_mlr = 0.4502 (total_true_lr = 635 total_true_mlr = 804)
Precision for 31 - 5ZOR_A top_L5_lr = 0.6333 top_L_lr = 0.2752 top_Nc_lr = 0.4697 top_L5_mlr = 0.6000 top_L_mlr = 0.3490 top_Nc_mlr = 0.4719 (total_true_lr = 66 total_true_mlr = 89)
Precision for 32 - 5ZT0_J top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0)
Precision for 33 - 5ZT7_B top_L5_lr = 0.9600 top_L_lr = 0.8849 top_Nc_lr = 0.7080 top_L5_mlr = 0.9600 top_L_mlr = 0.9008 top_Nc_mlr = 0.7123 (total_true_lr = 452 total_true_mlr = 504)
Precision for 34 - 5ZX9_A top_L5_lr = 0.6970 top_L_lr = 0.3697 top_Nc_lr = 0.2863 top_L5_mlr = 0.9394 top_L_mlr = 0.6485 top_Nc_mlr = 0.4016 (total_true_lr = 241 total_true_mlr = 371)
Precision for 35 - 5ZYO_D top_L5_lr = 0.7333 top_L_lr = 0.2566 top_Nc_lr = 0.4375 top_L5_mlr = 0.7000 top_L_mlr = 0.2961 top_Nc_mlr = 0.4301 (total_true_lr = 80 total_true_mlr = 93)
Precision for 36 - 6A2W_A top_L5_lr = 0.6061 top_L_lr = 0.2831 top_Nc_lr = 0.5238 top_L5_mlr = 0.5758 top_L_mlr = 0.3072 top_Nc_mlr = 0.4409 (total_true_lr = 63 total_true_mlr = 93)
Precision for 37 - 6A5F_B top_L5_lr = 0.8667 top_L_lr = 0.5497 top_Nc_lr = 0.5030 top_L5_mlr = 1.0000 top_L_mlr = 0.7947 top_Nc_mlr = 0.6381 (total_true_lr = 169 total_true_mlr = 257)
Precision for 38 - 6A5G_B top_L5_lr = 0.8387 top_L_lr = 0.5096 top_Nc_lr = 0.4854 top_L5_mlr = 0.9677 top_L_mlr = 0.7516 top_Nc_mlr = 0.6038 (total_true_lr = 171 total_true_mlr = 260)
Precision for 39 - 6A68_A top_L5_lr = 0.1562 top_L_lr = 0.1481 top_Nc_lr = 0.2000 top_L5_mlr = 0.0938 top_L_mlr = 0.1790 top_Nc_mlr = 0.1481 (total_true_lr = 70 total_true_mlr = 81)
Precision for 40 - 6A83_A top_L5_lr = 1.0000 top_L_lr = 0.7563 top_Nc_lr = 0.5780 top_L5_mlr = 1.0000 top_L_mlr = 0.8039 top_Nc_mlr = 0.5831 (total_true_lr = 635 total_true_mlr = 746)
Precision for 41 - 6A9J_B top_L5_lr = 0.9853 top_L_lr = 0.5103 top_Nc_lr = 0.4611 top_L5_mlr = 0.9412 top_L_mlr = 0.6903 top_Nc_mlr = 0.4964 (total_true_lr = 386 total_true_mlr = 548)
Precision for 42 - 6A9W_A top_L5_lr = 0.9643 top_L_lr = 0.7616 top_Nc_lr = 0.6500 top_L5_mlr = 0.9643 top_L_mlr = 0.7972 top_Nc_mlr = 0.6467 (total_true_lr = 400 total_true_mlr = 467)
Precision for 43 - 6AAY_A top_L5_lr = 0.0583 top_L_lr = 0.0342 top_Nc_lr = 0.0329 top_L5_mlr = 0.1083 top_L_mlr = 0.0651 top_Nc_mlr = 0.0566 (total_true_lr = 1308 total_true_mlr = 1609)
Precision for 44 - 6AE1_B top_L5_lr = 0.6522 top_L_lr = 0.4825 top_Nc_lr = 0.5000 top_L5_mlr = 0.7391 top_L_mlr = 0.5702 top_Nc_mlr = 0.5349 (total_true_lr = 106 total_true_mlr = 129)
Precision for 45 - 6AE8_D top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0)
Precision for 46 - 6AE9_B top_L5_lr = 0.9804 top_L_lr = 0.8142 top_Nc_lr = 0.6208 top_L5_mlr = 0.9804 top_L_mlr = 0.8656 top_Nc_mlr = 0.6202 (total_true_lr = 472 total_true_mlr = 574)
Precision for 47 - 6AEF_A top_L5_lr = 0.8370 top_L_lr = 0.6179 top_Nc_lr = 0.4799 top_L5_mlr = 0.8913 top_L_mlr = 0.6834 top_Nc_mlr = 0.5087 (total_true_lr = 819 total_true_mlr = 922)
Precision for 48 - 6AGH_B top_L5_lr = 0.7455 top_L_lr = 0.2993 top_Nc_lr = 0.2872 top_L5_mlr = 0.7455 top_L_mlr = 0.3504 top_Nc_mlr = 0.3040 (total_true_lr = 289 total_true_mlr = 352)
Precision for 49 - 6AGJ_B top_L5_lr = 0.5873 top_L_lr = 0.2603 top_Nc_lr = 0.2610 top_L5_mlr = 0.6190 top_L_mlr = 0.3270 top_Nc_mlr = 0.2992 (total_true_lr = 318 total_true_mlr = 381)
Precision for 50 - 6AHQ_T top_L5_lr = 1.0000 top_L_lr = 0.8505 top_Nc_lr = 0.7664 top_L5_mlr = 1.0000 top_L_mlr = 0.9065 top_Nc_mlr = 0.7572 (total_true_lr = 137 total_true_mlr = 173)
Precision for 51 - 6AIT_F top_L5_lr = 0.9506 top_L_lr = 0.6379 top_Nc_lr = 0.5720 top_L5_mlr = 0.9877 top_L_mlr = 0.7833 top_Nc_mlr = 0.6312 (total_true_lr = 479 total_true_mlr = 648)
Precision for 52 - 6AJJ_A top_L5_lr = 0.8611 top_L_lr = 0.6593 top_Nc_lr = 0.5389 top_L5_mlr = 0.9000 top_L_mlr = 0.6859 top_Nc_mlr = 0.5305 (total_true_lr = 1414 total_true_mlr = 1608)
Precision for 53 - 6AKJ_B top_L5_lr = 1.0000 top_L_lr = 0.5000 top_Nc_lr = 0.5044 top_L5_mlr = 1.0000 top_L_mlr = 0.7636 top_Nc_mlr = 0.6169 (total_true_lr = 113 total_true_mlr = 154)
Precision for 54 - 6BEA_A top_L5_lr = 0.8046 top_L_lr = 0.6028 top_Nc_lr = 0.4578 top_L5_mlr = 0.8966 top_L_mlr = 0.7090 top_Nc_mlr = 0.5051 (total_true_lr = 1101 total_true_mlr = 1457)
Precision for 55 - 6BS5_A top_L5_lr = 1.0000 top_L_lr = 0.7242 top_Nc_lr = 0.6054 top_L5_mlr = 0.9848 top_L_mlr = 0.7697 top_Nc_mlr = 0.5953 (total_true_lr = 484 total_true_mlr = 556)
Precision for 56 - 6BS5_B top_L5_lr = 0.9296 top_L_lr = 0.6338 top_Nc_lr = 0.5009 top_L5_mlr = 0.9296 top_L_mlr = 0.6648 top_Nc_mlr = 0.4976 (total_true_lr = 543 total_true_mlr = 633)
Precision for 57 - 6BWH_C top_L5_lr = 1.0000 top_L_lr = 0.8100 top_Nc_lr = 0.6491 top_L5_mlr = 1.0000 top_L_mlr = 0.8800 top_Nc_mlr = 0.6579 (total_true_lr = 322 total_true_mlr = 418)
Precision for 58 - 6BXS_C top_L5_lr = 0.7593 top_L_lr = 0.4630 top_Nc_lr = 0.3476 top_L5_mlr = 0.8889 top_L_mlr = 0.5704 top_Nc_mlr = 0.3958 (total_true_lr = 466 total_true_mlr = 571)
Precision for 59 - 6BXW_A top_L5_lr = 0.7170 top_L_lr = 0.4457 top_Nc_lr = 0.3467 top_L5_mlr = 0.8679 top_L_mlr = 0.5618 top_Nc_mlr = 0.3855 (total_true_lr = 450 total_true_mlr = 550)
Precision for 60 - 6BZJ_A top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0)
Precision for 61 - 6BZK_A top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0)
Precision for 62 - 6BZT_D top_L5_lr = 0.9700 top_L_lr = 0.7405 top_Nc_lr = 0.5385 top_L5_mlr = 0.9900 top_L_mlr = 0.7984 top_Nc_mlr = 0.5470 (total_true_lr = 897 total_true_mlr = 1075)
Precision for 63 - 6CB6_A top_L5_lr = 0.0833 top_L_lr = 0.0847 top_Nc_lr = 0.0654 top_L5_mlr = 0.2083 top_L_mlr = 0.1102 top_Nc_mlr = 0.1087 (total_true_lr = 107 total_true_mlr = 138)
Precision for 64 - 6CCI_A top_L5_lr = 0.9718 top_L_lr = 0.7090 top_Nc_lr = 0.5201 top_L5_mlr = 0.9718 top_L_mlr = 0.7316 top_Nc_mlr = 0.5027 (total_true_lr = 623 total_true_mlr = 732)
Precision for 65 - 6CGO_B top_L5_lr = 0.8235 top_L_lr = 0.5492 top_Nc_lr = 0.4449 top_L5_mlr = 0.8627 top_L_mlr = 0.6299 top_Nc_mlr = 0.4838 (total_true_lr = 798 total_true_mlr = 926)
Precision for 66 - 6CK1_D top_L5_lr = 0.5200 top_L_lr = 0.4027 top_Nc_lr = 0.3352 top_L5_mlr = 0.8667 top_L_mlr = 0.7307 top_Nc_mlr = 0.5181 (total_true_lr = 540 total_true_mlr = 938)
Precision for 67 - 6CMK_A top_L5_lr = 0.5867 top_L_lr = 0.4213 top_Nc_lr = 0.3647 top_L5_mlr = 0.9067 top_L_mlr = 0.7307 top_Nc_mlr = 0.5258 (total_true_lr = 521 total_true_mlr = 911)
Precision for 68 - 6CP8_B top_L5_lr = 0.5312 top_L_lr = 0.3025 top_Nc_lr = 0.3261 top_L5_mlr = 0.9375 top_L_mlr = 0.4753 top_Nc_mlr = 0.4208 (total_true_lr = 138 total_true_mlr = 202)
Precision for 69 - 6CP8_D top_L5_lr = 0.6129 top_L_lr = 0.2675 top_Nc_lr = 0.2887 top_L5_mlr = 0.6452 top_L_mlr = 0.3822 top_Nc_mlr = 0.3516 (total_true_lr = 142 total_true_mlr = 182)
Precision for 70 - 6CP9_G top_L5_lr = 0.6818 top_L_lr = 0.4220 top_Nc_lr = 0.4658 top_L5_mlr = 1.0000 top_L_mlr = 0.7339 top_Nc_mlr = 0.6643 (total_true_lr = 73 total_true_mlr = 143)
Precision for 71 - 6CP9_H top_L5_lr = 0.4348 top_L_lr = 0.2544 top_Nc_lr = 0.2206 top_L5_mlr = 0.9565 top_L_mlr = 0.7281 top_Nc_mlr = 0.5906 (total_true_lr = 68 total_true_mlr = 171)
Precision for 72 - 6CPU_A top_L5_lr = 0.8857 top_L_lr = 0.5399 top_Nc_lr = 0.3920 top_L5_mlr = 0.8952 top_L_mlr = 0.5798 top_Nc_mlr = 0.3966 (total_true_lr = 829 total_true_mlr = 938)
Precision for 73 - 6CSV_D top_L5_lr = 0.1176 top_L_lr = 0.0814 top_Nc_lr = 0.1538 top_L5_mlr = 0.2353 top_L_mlr = 0.1163 top_Nc_mlr = 0.2000 (total_true_lr = 26 total_true_mlr = 35)
Precision for 74 - 6CUL_H top_L5_lr = 0.7736 top_L_lr = 0.5431 top_Nc_lr = 0.4348 top_L5_mlr = 0.7925 top_L_mlr = 0.6067 top_Nc_mlr = 0.4306 (total_true_lr = 391 total_true_mlr = 497)
Precision for 75 - 6CZ6_D top_L5_lr = 0.9385 top_L_lr = 0.5491 top_Nc_lr = 0.5214 top_L5_mlr = 0.9538 top_L_mlr = 0.7025 top_Nc_mlr = 0.5497 (total_true_lr = 374 total_true_mlr = 533)
Precision for 76 - 6CZT_A top_L5_lr = 0.8125 top_L_lr = 0.7073 top_Nc_lr = 0.7071 top_L5_mlr = 0.9375 top_L_mlr = 0.7927 top_Nc_mlr = 0.6374 (total_true_lr = 99 total_true_mlr = 171)
Precision for 77 - 6D0I_C top_L5_lr = 0.9375 top_L_lr = 0.7736 top_Nc_lr = 0.6285 top_L5_mlr = 0.9375 top_L_mlr = 0.8428 top_Nc_mlr = 0.6367 (total_true_lr = 253 total_true_mlr = 300)
Precision for 78 - 6D0I_D top_L5_lr = 0.0000 top_L_lr = 0.1268 top_Nc_lr = 0.1200 top_L5_mlr = 0.3571 top_L_mlr = 0.2254 top_Nc_mlr = 0.3111 (total_true_lr = 25 total_true_mlr = 45)
Precision for 79 - 6D2S_A top_L5_lr = 0.9091 top_L_lr = 0.6364 top_Nc_lr = 0.5292 top_L5_mlr = 0.9455 top_L_mlr = 0.7636 top_Nc_mlr = 0.5825 (total_true_lr = 359 total_true_mlr = 491)
Precision for 80 - 6D7Y_A top_L5_lr = 0.8889 top_L_lr = 0.4674 top_Nc_lr = 0.5846 top_L5_mlr = 0.8333 top_L_mlr = 0.6304 top_Nc_mlr = 0.5784 (total_true_lr = 65 total_true_mlr = 102)
Precision for 81 - 6D7Y_B top_L5_lr = 0.4516 top_L_lr = 0.4065 top_Nc_lr = 0.3555 top_L5_mlr = 0.7419 top_L_mlr = 0.4839 top_Nc_mlr = 0.3957 (total_true_lr = 211 total_true_mlr = 278)
Precision for 82 - 6D97_D top_L5_lr = 0.9038 top_L_lr = 0.6801 top_Nc_lr = 0.5366 top_L5_mlr = 0.9423 top_L_mlr = 0.7605 top_Nc_mlr = 0.5678 (total_true_lr = 984 total_true_mlr = 1180)
Precision for 83 - 6D9F_B top_L5_lr = 0.7344 top_L_lr = 0.3540 top_Nc_lr = 0.3134 top_L5_mlr = 0.7969 top_L_mlr = 0.3913 top_Nc_mlr = 0.3278 (total_true_lr = 434 total_true_mlr = 485)
Precision for 84 - 6D9M_A top_L5_lr = 0.8923 top_L_lr = 0.5583 top_Nc_lr = 0.5013 top_L5_mlr = 0.9538 top_L_mlr = 0.6411 top_Nc_mlr = 0.4772 (total_true_lr = 377 total_true_mlr = 505)
Precision for 85 - 6DAN_D top_L5_lr = 0.9231 top_L_lr = 0.7430 top_Nc_lr = 0.5921 top_L5_mlr = 0.9692 top_L_mlr = 0.8142 top_Nc_mlr = 0.6112 (total_true_lr = 581 total_true_mlr = 697)
Precision for 86 - 6DFL_A top_L5_lr = 0.9375 top_L_lr = 0.7178 top_Nc_lr = 0.6396 top_L5_mlr = 0.9583 top_L_mlr = 0.8133 top_Nc_mlr = 0.6675 (total_true_lr = 308 total_true_mlr = 391)
Precision for 87 - 6DGN_B top_L5_lr = 0.1667 top_L_lr = 0.2418 top_Nc_lr = 0.2429 top_L5_mlr = 0.9444 top_L_mlr = 0.5934 top_Nc_mlr = 0.4375 (total_true_lr = 70 total_true_mlr = 144)
Precision for 88 - 6DII_L top_L5_lr = 0.9919 top_L_lr = 0.7922 top_Nc_lr = 0.5681 top_L5_mlr = 0.9919 top_L_mlr = 0.8036 top_Nc_mlr = 0.5726 (total_true_lr = 1262 total_true_mlr = 1411)
Precision for 89 - 6DKA_I top_L5_lr = 0.9556 top_L_lr = 0.5885 top_Nc_lr = 0.4763 top_L5_mlr = 0.9778 top_L_mlr = 0.7212 top_Nc_mlr = 0.5168 (total_true_lr = 317 total_true_mlr = 387)
Precision for 90 - 6DKM_G top_L5_lr = 0.0000 top_L_lr = 0.0132 top_Nc_lr = 0.0333 top_L5_mlr = 0.0000 top_L_mlr = 0.0132 top_Nc_mlr = 0.0263 (total_true_lr = 30 total_true_mlr = 38)
Precision for 91 - 6DLC_A top_L5_lr = 0.0000 top_L_lr = 0.0381 top_Nc_lr = 0.0175 top_L5_mlr = 0.0476 top_L_mlr = 0.0571 top_Nc_mlr = 0.0303 (total_true_lr = 57 total_true_mlr = 66)
Precision for 92 - 6DLO_A top_L5_lr = 0.5968 top_L_lr = 0.3794 top_Nc_lr = 0.3413 top_L5_mlr = 0.8548 top_L_mlr = 0.8071 top_Nc_mlr = 0.5520 (total_true_lr = 378 total_true_mlr = 750)
Precision for 93 - 6DRF_A top_L5_lr = 0.6207 top_L_lr = 0.3217 top_Nc_lr = 0.4286 top_L5_mlr = 0.7241 top_L_mlr = 0.3497 top_Nc_mlr = 0.4554 (total_true_lr = 91 total_true_mlr = 101)
Precision for 94 - 6DTD_A top_L5_lr = 0.1659 top_L_lr = 0.0760 top_Nc_lr = 0.0668 top_L5_mlr = 0.2701 top_L_mlr = 0.1349 top_Nc_mlr = 0.1085 (total_true_lr = 1332 total_true_mlr = 1603)
Precision for 95 - 6E0K_A top_L5_lr = 0.8305 top_L_lr = 0.5051 top_Nc_lr = 0.4479 top_L5_mlr = 0.8983 top_L_mlr = 0.6305 top_Nc_mlr = 0.5065 (total_true_lr = 355 total_true_mlr = 462)
Precision for 96 - 6E0M_A top_L5_lr = 0.9630 top_L_lr = 0.6029 top_Nc_lr = 0.5394 top_L5_mlr = 0.9630 top_L_mlr = 0.7316 top_Nc_mlr = 0.5789 (total_true_lr = 343 total_true_mlr = 437)
Precision for 97 - 6E3C_C top_L5_lr = 0.9259 top_L_lr = 0.4030 top_Nc_lr = 0.7015 top_L5_mlr = 0.8889 top_L_mlr = 0.5746 top_Nc_mlr = 0.6000 (total_true_lr = 67 total_true_mlr = 120)
Precision for 98 - 6E9B_D top_L5_lr = 0.6216 top_L_lr = 0.4620 top_Nc_lr = 0.3773 top_L5_mlr = 0.6081 top_L_mlr = 0.4946 top_Nc_mlr = 0.3792 (total_true_lr = 554 total_true_mlr = 712)
Precision for 99 - 6E9O_A top_L5_lr = 0.8987 top_L_lr = 0.7608 top_Nc_lr = 0.6440 top_L5_mlr = 0.8861 top_L_mlr = 0.7913 top_Nc_mlr = 0.6480 (total_true_lr = 618 total_true_mlr = 679)
Precision for 100 - 6EAZ_B top_L5_lr = 0.6667 top_L_lr = 0.2835 top_Nc_lr = 0.2773 top_L5_mlr = 0.6667 top_L_mlr = 0.3415 top_Nc_mlr = 0.3094 (total_true_lr = 339 total_true_mlr = 404)
Precision for 101 - 6EDB_B top_L5_lr = 0.1538 top_L_lr = 0.1194 top_Nc_lr = 0.1538 top_L5_mlr = 0.6154 top_L_mlr = 0.2239 top_Nc_mlr = 0.3243 (total_true_lr = 26 total_true_mlr = 37)
Precision for 102 - 6EGC_A top_L5_lr = 0.6207 top_L_lr = 0.3741 top_Nc_lr = 0.3897 top_L5_mlr = 0.6207 top_L_mlr = 0.4354 top_Nc_mlr = 0.4099 (total_true_lr = 136 total_true_mlr = 161)
Precision for 103 - 6FCG_F top_L5_lr = 0.9620 top_L_lr = 0.7277 top_Nc_lr = 0.5740 top_L5_mlr = 0.9620 top_L_mlr = 0.7684 top_Nc_mlr = 0.5797 (total_true_lr = 655 total_true_mlr = 778)
Precision for 104 - 6FTO_C top_L5_lr = 0.0000 top_L_lr = 0.0267 top_Nc_lr = 0.0476 top_L5_mlr = 0.0000 top_L_mlr = 0.0533 top_Nc_mlr = 0.0678 (total_true_lr = 21 total_true_mlr = 59)
Precision for 105 - 6FXD_B top_L5_lr = 0.9583 top_L_lr = 0.8430 top_Nc_lr = 0.7746 top_L5_mlr = 1.0000 top_L_mlr = 0.9256 top_Nc_mlr = 0.7179 (total_true_lr = 142 total_true_mlr = 195)
Precision for 106 - 6G1H_A top_L5_lr = 0.9853 top_L_lr = 0.8655 top_Nc_lr = 0.6727 top_L5_mlr = 1.0000 top_L_mlr = 0.9006 top_Nc_mlr = 0.6751 (total_true_lr = 605 total_true_mlr = 748)
Precision for 107 - 6G3B_B top_L5_lr = 0.6667 top_L_lr = 0.3956 top_Nc_lr = 0.4195 top_L5_mlr = 0.8444 top_L_mlr = 0.5644 top_Nc_mlr = 0.4867 (total_true_lr = 205 total_true_mlr = 300)
Precision for 108 - 6G70_B top_L5_lr = 0.5490 top_L_lr = 0.3176 top_Nc_lr = 0.3964 top_L5_mlr = 0.8039 top_L_mlr = 0.5353 top_Nc_mlr = 0.5342 (total_true_lr = 333 total_true_mlr = 511)
Precision for 109 - 6G7G_A top_L5_lr = 1.0000 top_L_lr = 0.8522 top_Nc_lr = 0.7200 top_L5_mlr = 1.0000 top_L_mlr = 0.9130 top_Nc_mlr = 0.7205 (total_true_lr = 175 total_true_mlr = 254)
Precision for 110 - 6G7O_A top_L5_lr = 0.7429 top_L_lr = 0.4486 top_Nc_lr = 0.4564 top_L5_mlr = 0.8286 top_L_mlr = 0.4886 top_Nc_mlr = 0.4432 (total_true_lr = 344 total_true_mlr = 449)
Precision for 111 - 6G8Y_A top_L5_lr = 0.9444 top_L_lr = 0.4348 top_Nc_lr = 0.4810 top_L5_mlr = 1.0000 top_L_mlr = 0.6304 top_Nc_mlr = 0.5424 (total_true_lr = 79 total_true_mlr = 118)
Precision for 112 - 6GCJ_A top_L5_lr = 0.5833 top_L_lr = 0.3802 top_Nc_lr = 0.2958 top_L5_mlr = 0.5000 top_L_mlr = 0.3554 top_Nc_mlr = 0.2602 (total_true_lr = 213 total_true_mlr = 246)
Precision for 113 - 6GDJ_B top_L5_lr = 0.2857 top_L_lr = 0.2609 top_Nc_lr = 0.2830 top_L5_mlr = 0.3571 top_L_mlr = 0.3478 top_Nc_mlr = 0.3636 (total_true_lr = 53 total_true_mlr = 66)
Precision for 114 - 6GHO_B top_L5_lr = 0.9636 top_L_lr = 0.6642 top_Nc_lr = 0.5667 top_L5_mlr = 0.9818 top_L_mlr = 0.7372 top_Nc_mlr = 0.5852 (total_true_lr = 360 total_true_mlr = 446)
Precision for 115 - 6GMA_F top_L5_lr = 1.0000 top_L_lr = 0.6667 top_Nc_lr = 0.6842 top_L5_mlr = 1.0000 top_L_mlr = 0.8250 top_Nc_mlr = 0.6962 (total_true_lr = 114 total_true_mlr = 158)
Precision for 116 - 6GMS_A top_L5_lr = 0.4231 top_L_lr = 0.2500 top_Nc_lr = 0.2697 top_L5_mlr = 0.7692 top_L_mlr = 0.4848 top_Nc_mlr = 0.4506 (total_true_lr = 89 total_true_mlr = 162)
Precision for 117 - 6GW7_A top_L5_lr = 0.6667 top_L_lr = 0.2034 top_Nc_lr = 0.6000 top_L5_mlr = 0.5833 top_L_mlr = 0.3559 top_Nc_mlr = 0.4828 (total_true_lr = 15 total_true_mlr = 29)
Precision for 118 - 6H2X_A top_L5_lr = 0.8857 top_L_lr = 0.4574 top_Nc_lr = 0.5302 top_L5_mlr = 0.8286 top_L_mlr = 0.4830 top_Nc_mlr = 0.5277 (total_true_lr = 281 total_true_mlr = 307)
Precision for 119 - 6H6N_B top_L5_lr = 1.0000 top_L_lr = 0.6525 top_Nc_lr = 0.6183 top_L5_mlr = 1.0000 top_L_mlr = 0.8051 top_Nc_mlr = 0.6628 (total_true_lr = 131 total_true_mlr = 172)
Precision for 120 - 6HC2_X top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = nan top_L_mlr = nan top_Nc_mlr = nan (total_true_lr = 0 total_true_mlr = 0)
Precision for 121 - 6HPV_A top_L5_lr = 0.9833 top_L_lr = 0.6421 top_Nc_lr = 0.4247 top_L5_mlr = 1.0000 top_L_mlr = 0.8428 top_Nc_mlr = 0.4936 (total_true_lr = 558 total_true_mlr = 701)
Precision for 122 - 6I1R_B top_L5_lr = 0.8065 top_L_lr = 0.5645 top_Nc_lr = 0.5172 top_L5_mlr = 0.8065 top_L_mlr = 0.5806 top_Nc_mlr = 0.5176 (total_true_lr = 379 total_true_mlr = 398)
Precision for 123 - 6I9H_A top_L5_lr = 0.5556 top_L_lr = 0.4130 top_Nc_lr = 0.4000 top_L5_mlr = 0.7222 top_L_mlr = 0.5435 top_Nc_mlr = 0.4295 (total_true_lr = 95 total_true_mlr = 149)
Precision for 124 - 6IAI_D top_L5_lr = 0.2381 top_L_lr = 0.1415 top_Nc_lr = 0.1884 top_L5_mlr = 0.2857 top_L_mlr = 0.1698 top_Nc_mlr = 0.1696 (total_true_lr = 69 total_true_mlr = 112)
Precision for 125 - 6IEH_A top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.0000 top_L_mlr = 0.0000 top_Nc_mlr = 0.0000 (total_true_lr = 0 total_true_mlr = 2)
Precision for 126 - 6N0T_A top_L5_lr = 0.9481 top_L_lr = 0.5300 top_Nc_lr = 0.4325 top_L5_mlr = 0.9870 top_L_mlr = 0.7389 top_Nc_mlr = 0.5085 (total_true_lr = 541 total_true_mlr = 708)
Precision for 127 - 6N8P_A top_L5_lr = 0.4727 top_L_lr = 0.1935 top_Nc_lr = 0.1506 top_L5_mlr = 0.9758 top_L_mlr = 0.5695 top_Nc_mlr = 0.3190 (total_true_lr = 1348 total_true_mlr = 1978)
Precision for 128 - 6NU4_A top_L5_lr = nan top_L_lr = nan top_Nc_lr = nan top_L5_mlr = 0.0000 top_L_mlr = 0.0357 top_Nc_mlr = 0.0000 (total_true_lr = 0 total_true_mlr = 2)
Precision for 129 - 6NX4_A top_L5_lr = 0.5758 top_L_lr = 0.2575 top_Nc_lr = 0.4000 top_L5_mlr = 0.6667 top_L_mlr = 0.4132 top_Nc_mlr = 0.4478 (total_true_lr = 80 total_true_mlr = 134)
Precision for 130 - 6OHZ_A top_L5_lr = 0.1190 top_L_lr = 0.0952 top_Nc_lr = 0.0889 top_L5_mlr = 0.1905 top_L_mlr = 0.1286 top_Nc_mlr = 0.1148 (total_true_lr = 225 total_true_mlr = 270)
Average Precision: top_L5_lr = 71.61 top_L_lr = 47.09 top_Nc_lr = 43.95 top_L5_mlr = 77.98 top_L_mlr = 56.90 top_Nc_mlr = 46.85
Save predictions..
Everything done! 2020-04-29 11:02:49.740899
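For reference, the per-target numbers above follow the usual contact-prediction conventions: top_L5, top_L and top_Nc are the precision of the L/5, L and Nc highest-scoring predicted residue pairs (L = sequence length, Nc = number of true contacts), the _lr values restrict to long-range pairs and the _mlr values to medium-plus-long-range pairs, total_true_* counts the true contacts in each range, targets with no true contacts yield nan, and the final "Average Precision" line is the mean over targets printed as a percentage. The sketch below is NOT the evaluation script that produced this log; it is a minimal illustration of how such numbers are typically computed, assuming an 8 Angstrom contact threshold, sequence-separation cutoffs of 24 (lr) and 12 (mlr), and a nan-ignoring average. The function names (topk_precision, evaluate_target) and these thresholds are illustrative assumptions, not taken from the source.

```python
# Hedged sketch of per-target top-L/5 / top-L / top-Nc precision, as logged above.
# Assumptions (not confirmed by the log): contact = true distance < 8.0 A,
# long-range (lr) = |i - j| >= 24, medium+long-range (mlr) = |i - j| >= 12,
# nan when a target has no true contacts in the chosen range.
import numpy as np

def topk_precision(pred, dist, k, min_sep, contact_thres=8.0):
    """Precision of the k highest-scoring pairs (i < j, j - i >= min_sep)."""
    L = pred.shape[0]
    pairs = [(pred[i, j], dist[i, j])
             for i in range(L) for j in range(i + min_sep, L)]
    if not pairs or k < 1:
        return float('nan')
    pairs.sort(key=lambda p: p[0], reverse=True)   # rank by predicted score
    top = pairs[:k]
    return sum(d < contact_thres for _, d in top) / len(top)

def evaluate_target(pred, dist, contact_thres=8.0):
    """Return the six per-target values printed in the log, plus the total_true counts."""
    L = pred.shape[0]
    sep = np.abs(np.arange(L)[None, :] - np.arange(L)[:, None])
    out = {}
    for tag, min_sep in (('lr', 24), ('mlr', 12)):
        mask = np.triu(sep >= min_sep)                       # upper triangle, |i-j| >= min_sep
        n_true = int(((dist < contact_thres) & mask).sum())  # total_true_{tag}
        out['total_true_' + tag] = n_true
        if n_true == 0:                                      # matches the nan rows in the log
            for m in ('top_L5_', 'top_L_', 'top_Nc_'):
                out[m + tag] = float('nan')
        else:
            out['top_L5_' + tag] = topk_precision(pred, dist, max(L // 5, 1), min_sep, contact_thres)
            out['top_L_' + tag] = topk_precision(pred, dist, L, min_sep, contact_thres)
            out['top_Nc_' + tag] = topk_precision(pred, dist, n_true, min_sep, contact_thres)
    return out

# The "Average Precision" line is then, presumably, a nan-ignoring mean over all
# validation targets, reported as a percentage, e.g.:
#   avg_top_L5_lr = 100.0 * np.nanmean([t['top_L5_lr'] for t in per_target_results])
```

Under these assumptions, a target such as 6HC2_X (total_true_lr = 0, total_true_mlr = 0) contributes nothing to any average, while 6IEH_A (total_true_lr = 0, total_true_mlr = 2) contributes only to the mlr averages; the per-target values are fractions, whereas the averaged summary is printed as percentages.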