We are excited to release our latest Vicuna v1.5 series, featuring 4K and 16K context lengths with improved performance on almost all benchmarks!
Vicuna v1.5 is based on the commercial-friendly Llama 2 and extends the context length via positional interpolation.
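The core idea of positional interpolation is simple: instead of extrapolating rotary position embeddings to positions the model never saw in training, the position indices are rescaled so that the extended window maps back into the original trained range. Below is a minimal sketch of that idea (not Vicuna's actual implementation); the function names and the example scale factor of 0.25 (i.e. 4K trained context extended to 16K) are illustrative assumptions.

```python
def rope_inverse_frequencies(dim, base=10000.0):
    # Standard RoPE inverse frequencies, one per pair of dimensions.
    return [base ** (-2.0 * i / dim) for i in range(dim // 2)]

def rope_angles(position, dim, scale=1.0, base=10000.0):
    # Positional interpolation: compress the position index by `scale`
    # (scale = trained_context / extended_context, e.g. 4096 / 16384 = 0.25)
    # before computing the rotation angles.
    pos = position * scale
    return [pos * f for f in rope_inverse_frequencies(dim, base)]

# With scale = 0.25, position 8192 in a 16K window gets the same
# rotation angles as position 2048 did in the original 4K window,
# so the model never sees out-of-range positions.
angles_interp = rope_angles(8192, dim=8, scale=0.25)
angles_orig = rope_angles(2048, dim=8, scale=1.0)
assert angles_interp == angles_orig
```

In practice this rescaling is applied inside the attention layers' rotary embedding, typically followed by a brief fine-tune on longer sequences.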
Since its release, Vicuna has become one of the most popular chat LLMs, enabling pioneering research on multi-modality, AI safety, and evaluation. Vicuna models received over 2 million downloads on Hugging Face last month. The latest version follows the same proven recipe and brings fresh enhancements. Let's keep pushing the boundaries of open LLMs!
Weights: https://github.com/lm-sys/FastChat/tree/main#model-weights
Demo: https://chat.lmsys.org/