All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible when the height and width of the feature maps remain unchanged, so every convolution inside a dense block uses stride 1. Pooling layers are inserted between dense blocks to downsample the feature maps.
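A minimal sketch of this pattern in PyTorch, assuming a BN → ReLU → 3×3 stride-1 convolution per layer; the names `DenseLayer`, `DenseBlock`, and `growth_rate` are illustrative, not from the text:

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 conv with stride 1 and padding 1, so H and W are preserved."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate, kernel_size=3,
                              stride=1, padding=1, bias=False)

    def forward(self, x):
        return self.conv(self.relu(self.bn(x)))

class DenseBlock(nn.Module):
    """Each layer receives the channel-wise concatenation of all earlier outputs."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(DenseLayer(channels, growth_rate))
            channels += growth_rate

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Concatenation along the channel dimension works only because
            # every conv keeps H and W unchanged (stride 1, padding 1).
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)

block = DenseBlock(in_channels=16, growth_rate=12, num_layers=4)
y = block(torch.randn(1, 16, 32, 32))
print(tuple(y.shape))  # (1, 64, 32, 32): 16 + 4*12 channels, spatial size unchanged
```

Downsampling is deliberately left to the pooling layers between blocks, so within a block only the channel count grows.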