Description
Model Summary
The language model phi-1.5 is a Transformer with 1.3 billion parameters. It was trained on the same data sources as phi-1, augmented with a new data source consisting of various synthetic NLP texts. On benchmarks testing common sense, language understanding, and logical reasoning, phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.
See https://huggingface.co/microsoft/phi-1_5 for more details.
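
As a minimal sketch (not taken verbatim from the model card), phi-1.5 can be loaded from the Hugging Face Hub with the transformers library. The model ID comes from the URL above; the generation arguments and dtype choice here are assumptions, and depending on the installed transformers version, trust_remote_code=True may be required.

```python
# Minimal sketch: load phi-1.5 and run a short generation.
# Arguments such as torch_dtype and max_new_tokens are illustrative assumptions;
# see the model card linked above for the canonical usage example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

prompt = "Write a short story about a robot learning to paint."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```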