
Feature Request: Support for C4AI Command R7B / Cohere2ForCausalLM #10816

@arch-btw

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

I would like to request support for C4AI Command R7B by Cohere.

Here is some relevant information:

Download link: https://huggingface.co/CohereForAI/c4ai-command-r7b-12-2024

Some specifications:

  • A well-rounded model
  • Model size: 7 billion parameters
  • Context length: 128K
  • Enhanced efficiency in math, code, and reasoning tasks
  • Multilingual, with reasoning and tool-use capabilities
  • RAG capability

Blog post: https://cohere.com/blog/command-r7b

Motivation

I believe it would be a great addition to llama.cpp.

Possible Implementation

Model Architecture: This is an auto-regressive language model that uses an optimized transformer architecture (Cohere2ForCausalLM). After pretraining, the model goes through supervised fine-tuning (SFT) and preference training to align its behavior with human preferences for helpfulness and safety. The model interleaves three layers of sliding window attention (window size 4096) with RoPE for efficient local context modeling and relative positional encoding, followed by a fourth layer that uses global attention without positional embeddings, enabling unrestricted token interactions across the entire sequence.
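
For reference, here is a minimal sketch (not llama.cpp code) of how the per-layer attention type could be derived from that interleaved pattern. The layer count of 32 and the "every fourth layer is global" rule are assumptions inferred from the description above; the real values would have to come from the converted model's metadata.

```cpp
#include <cstdio>

enum class attn_type { sliding_window, global_attn };

// Hypothetical hyperparameters, for illustration only.
constexpr unsigned n_layer         = 32;   // assumed layer count
constexpr unsigned window_size     = 4096; // sliding window size from the model card
constexpr unsigned sliding_pattern = 4;    // assumed: every 4th layer uses global attention

// Layers 0..2 use sliding-window attention with RoPE; layer 3 uses global
// attention with no positional embeddings; the pattern then repeats.
static attn_type layer_attn_type(unsigned il) {
    return ((il + 1) % sliding_pattern == 0) ? attn_type::global_attn
                                             : attn_type::sliding_window;
}

int main() {
    for (unsigned il = 0; il < n_layer; ++il) {
        if (layer_attn_type(il) == attn_type::global_attn) {
            printf("layer %2u: global attention, no positional embeddings\n", il);
        } else {
            printf("layer %2u: sliding-window attention (window %u), RoPE\n", il, window_size);
        }
    }
    return 0;
}
```

An actual implementation would presumably follow the approach used for other sliding-window models in llama.cpp (masking the KV cache beyond the window on the local layers and skipping RoPE on the global layers), but the details depend on how the Cohere2 architecture is mapped into GGUF.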
