
Optionally exclude the MLP from the Adapter #4

@nasheedyasin

Description


Is your feature request related to a problem? Please describe.
We need to align the Adapter layer architecture with the description in Liu 2024.

Describe the solution you'd like
Right now the Adapter block is (norm -> self_attn -> norm -> MLP). Make the MLP optional so the block can be reduced to (norm -> self_attn -> norm), as described in Liu 2024. A sketch of the proposed change follows.
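
A minimal PyTorch sketch of what this could look like, assuming a `use_mlp` flag on the block. All names here (`AdapterBlock`, `use_mlp`, `hidden_dim`, `num_heads`) are illustrative, not identifiers from the actual codebase:

```python
import torch
import torch.nn as nn


class AdapterBlock(nn.Module):
    """Illustrative Adapter block with an optional MLP sub-layer."""

    def __init__(self, hidden_dim: int, num_heads: int, use_mlp: bool = False):
        super().__init__()
        self.norm1 = nn.LayerNorm(hidden_dim)
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(hidden_dim)
        # With use_mlp=False the block reduces to norm -> self_attn -> norm,
        # matching the Liu 2024 description.
        self.mlp = (
            nn.Sequential(
                nn.Linear(hidden_dim, 4 * hidden_dim),
                nn.GELU(),
                nn.Linear(4 * hidden_dim, hidden_dim),
            )
            if use_mlp
            else None
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm1(x)
        attn_out, _ = self.self_attn(h, h, h)
        x = x + attn_out                # residual around attention
        x = self.norm2(x)
        if self.mlp is not None:        # MLP only when explicitly requested
            x = x + self.mlp(x)         # residual around the optional MLP
        return x
```

Keeping the MLP behind a constructor flag (defaulting to excluded, or to the current behavior, whichever is preferred) would preserve backward compatibility while allowing the Liu 2024 variant.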

Describe alternatives you've considered

Additional context
