Thanks for your work.
Could you explain why the residual connections were removed in `BasicTransformerBlock` (https://github.com/lbc12345/SeD/blob/311195f371224988bb85d773f4bab8b5acc847a1/models/module_attention.py#L253)? Was this choice made because it produces higher-quality images?
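To make sure I'm understood, here is a minimal sketch of the two patterns I'm comparing. The class and layer names are purely illustrative, not the actual SeD code: the standard transformer block keeps the `x + sublayer(norm(x))` skip connections, while the variant I'm asking about drops them.

```python
import torch
import torch.nn as nn

class BlockWithResidual(nn.Module):
    """Standard transformer block pattern: output = x + sublayer(norm(x))."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x):
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h)
        x = x + attn_out                 # residual connection around attention
        x = x + self.ff(self.norm2(x))   # residual connection around the MLP
        return x

class BlockWithoutResidual(BlockWithResidual):
    """Same sublayers, but the '+ x' skips are dropped."""
    def forward(self, x):
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h)
        return self.ff(self.norm2(attn_out))  # no residual additions
```

Both variants preserve the input shape, so they are drop-in interchangeable; my question is about the motivation for preferring the second form.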