Command A

Command A is Cohere’s cutting-edge generative AI model, engineered for enterprise-grade performance where speed, security, and output quality are critical. Designed to run efficiently with minimal infrastructure, it rivals top-tier models like GPT-4o and DeepSeek-V3 in capability while undercutting them on cost. Featuring an extended 256K-token context window, twice as large as most leading models offer, it excels at the complex multilingual and agentic tasks central to modern business operations. Despite its power, it can be deployed on just two GPUs, making it highly accessible. With throughput of up to 156 tokens per second, about 1.75x faster than GPT-4o, Command A delivers exceptional efficiency without compromising accuracy or depth.
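The throughput claim is easy to sanity-check with a little arithmetic. A minimal sketch, assuming the stated 156 tokens/second peak and deriving the GPT-4o figure from the "about 1.75x faster" ratio (it is implied, not independently measured here):

```python
# Estimate wall-clock time to stream a response at a given decode throughput.
def generation_time_seconds(num_tokens: int, tokens_per_second: float) -> float:
    return num_tokens / tokens_per_second

COMMAND_A_TPS = 156.0                # stated peak throughput for Command A
GPT_4O_TPS = COMMAND_A_TPS / 1.75    # implied by the "about 1.75x faster" claim

# Time to stream a 1,000-token answer on each model.
t_command_a = generation_time_seconds(1_000, COMMAND_A_TPS)  # ~6.4 s
t_gpt_4o = generation_time_seconds(1_000, GPT_4O_TPS)        # ~11.2 s
```

At these rates a 1,000-token answer streams in roughly 6.4 seconds on Command A versus about 11.2 seconds at the implied GPT-4o rate.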

Nova Lite

Amazon Nova Lite is a versatile multimodal model designed to process text, image, and video inputs, producing text-based outputs. Featuring a 300K-token context window, it is well-suited for real-time interactions, document analysis, and visual question answering. As part of the Amazon Nova foundation models, it supports fine-tuning and distillation, enabling advanced customization.
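Since Nova Lite accepts mixed text and image inputs through Amazon Bedrock, a request body can carry both in one user message. The sketch below builds the payload as a plain dict so it can be inspected without a network call; the model ID (`amazon.nova-lite-v1:0`) and the exact field names follow Bedrock's Converse API message shape, but treat both as assumptions and verify against the current Bedrock documentation:

```python
# Sketch of a multimodal request body for Amazon Bedrock's Converse API.
# Built as a plain dict (no SDK call) so the structure can be inspected locally.
def build_nova_lite_request(prompt: str, image_bytes: bytes) -> dict:
    return {
        "modelId": "amazon.nova-lite-v1:0",  # assumed Nova Lite model ID
        "messages": [
            {
                "role": "user",
                "content": [
                    {"text": prompt},
                    # Image content block: raw bytes plus the image format.
                    {"image": {"format": "png", "source": {"bytes": image_bytes}}},
                ],
            }
        ],
    }

request = build_nova_lite_request("Describe this chart.", b"\x89PNG...")
```

In practice this dict would be passed to a Bedrock runtime client's `converse` call; keeping payload construction separate from the network call also makes it straightforward to unit-test.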

|  | Command A | Nova Lite |
| --- | --- | --- |
| Provider | Cohere | Amazon |
| Release Date | Mar 14, 2025 | Dec 02, 2024 |
| Modalities | text | text, images, video |
| API Providers | Cohere, Hugging Face, major cloud providers | Amazon Bedrock |
| Knowledge Cut-off Date | - | Purposefully not disclosed |
| Open Source | Yes | No |
| Pricing (Input) | $2.50 per million tokens | $0.06 per million tokens |
| Pricing (Output) | $10.00 per million tokens | $0.24 per million tokens |
| MMLU | 85.5% | 80.5% (CoT) |
| MMLU Pro | Not available | Not available |
| MMMU | Not available | Not available |
| HellaSwag | Not available | Not available |
| HumanEval | Not available | 85.4% (pass@1) |
| MATH | 80% | 73.3% (CoT) |
| GPQA | 50.8% | 42% (main) |
| IFEval | 90.9% | 89.7% |
| Mobile Application | - | - |
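The pricing gap is easiest to see with a worked example. A minimal sketch using the per-million-token prices from the table, for a hypothetical workload of a 200K-token input document plus a 50K-token output (both within each model's context window):

```python
# Per-model prices from the comparison table, in USD per million tokens.
PRICES = {
    "Command A": {"input": 2.50, "output": 10.00},
    "Nova Lite": {"input": 0.06, "output": 0.24},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of a single request at the table's listed prices."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example workload: 200K input tokens plus a 50K-token output.
cost_a = request_cost("Command A", 200_000, 50_000)   # $1.00
cost_n = request_cost("Nova Lite", 200_000, 50_000)   # $0.024
```

At list prices, the same request costs roughly 40x more on Command A than on Nova Lite, though the two models target very different capability and quality tiers.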
