
RoPE MHA applies dropout at test time #2888

@pplantinga

Description


Describe the bug

Dropout is always applied at a fixed rate, regardless of whether the module is in training or evaluation mode.

https://github.com/speechbrain/speechbrain/blob/develop/speechbrain/nnet/attention.py#L1367-L1374

Expected behaviour

Dropout should be 0.0 when not in training mode.
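A minimal sketch of the expected gating, assuming the attention module stores its dropout rate and forwards it to `F.scaled_dot_product_attention` (the class and attribute names below are illustrative, not SpeechBrain's actual code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RoPEMHASketch(nn.Module):
    """Illustrative module: dropout must be gated on self.training."""

    def __init__(self, dropout: float = 0.1):
        super().__init__()
        self.dropout_rate = dropout  # hypothetical attribute name

    def forward(self, q, k, v):
        # The reported bug: passing self.dropout_rate unconditionally
        # applies dropout even in eval mode. The expected behaviour is
        # to zero the rate when the module is not training:
        dropout_p = self.dropout_rate if self.training else 0.0
        return F.scaled_dot_product_attention(q, k, v, dropout_p=dropout_p)
```

With this gating, `module.eval()` makes the forward pass deterministic, while `module.train()` keeps dropout active.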

To Reproduce

No response

Environment Details

No response

Relevant Log Output

Additional Context

No response


Labels

    bug: Something isn't working
