Implement RMT-R (New Paper feature to RMTs) #23

@anoojpatel

Description

RMT-R (Recurrent Memory Transformer with Retrieval) is a new paper from the lab that describes a method for injecting past memory states (M_{t-1}) into a retrieval cross-attention head, improving performance at context lengths of up to 10M tokens. It would be cool to add this feature!

Paper reference: https://arxiv.org/pdf/2402.10790.pdf
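To make the request concrete, here is a minimal sketch of the core idea as I understand it: the current segment's hidden states act as queries in a cross-attention head whose keys and values come from retrieved past memory states (M_{t-1}). All names, shapes, and the single-head NumPy formulation below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def retrieval_cross_attention(hidden, past_memory, wq, wk, wv):
    """Hypothetical sketch: current hidden states attend over past
    memory states (M_{t-1}) via a single cross-attention head."""
    q = hidden @ wq          # queries from the current segment
    k = past_memory @ wk     # keys from retrieved past memories
    v = past_memory @ wv     # values from retrieved past memories
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # numerically stable softmax over the memory slots
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v       # memory-conditioned output

rng = np.random.default_rng(0)
d = 8                                   # toy model dimension
hidden = rng.standard_normal((4, d))    # 4 tokens in the current segment
memory = rng.standard_normal((2, d))    # 2 past memory slots (M_{t-1})
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = retrieval_cross_attention(hidden, memory, wq, wk, wv)
print(out.shape)
```

In the full model this head would presumably sit alongside the usual self-attention inside each RMT block, so the feature likely touches both the block definition and the memory-passing loop between segments.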
