Gemini 2.0 Flash

Gemini 2.0 Flash is Google's high-performance, low-latency model designed to drive advanced agentic experiences. Equipped with native tool integration, it supports multimodal inputs, including text, images, video, and audio. Offering substantial improvements over previous versions, the model balances efficiency, speed, and enhanced capabilities for seamless real-time interactions.
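The model is served through Google AI Studio and Vertex AI (see the table below). As a rough illustration only, here is a minimal sketch of a text-only call using the google-genai Python SDK; the model id "gemini-2.0-flash" and the GEMINI_API_KEY environment variable are assumptions, not details taken from this page.

```python
# Minimal sketch (assumed usage, not an official snippet): a simple text request
# to Gemini 2.0 Flash through the google-genai Python SDK.
import os
from google import genai

# Assumes an API key from Google AI Studio is stored in GEMINI_API_KEY.
client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

response = client.models.generate_content(
    model="gemini-2.0-flash",  # assumed model id
    contents="Summarize the trade-off between latency and output quality in LLMs.",
)
print(response.text)
```

The same model id can be used from Vertex AI with the corresponding project and location configuration instead of an API key.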

Command A

Command A is Cohere’s cutting-edge generative AI model, engineered for enterprise-grade performance where speed, security, and output quality are critical. Designed to run efficiently with minimal infrastructure, it outperforms top-tier models such as GPT-4o and DeepSeek-V3 in both capability and cost-effectiveness. Its extended 256K-token context window, twice as large as that of most leading models, helps it excel at the complex multilingual and agent-based tasks essential to modern business operations. Despite its power, it can be deployed on just two GPUs, making it highly accessible. With throughput of up to 156 tokens per second, roughly 1.75x faster than GPT-4o, Command A delivers exceptional efficiency without compromising accuracy or depth.
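As a rough illustration of how the model is typically accessed, here is a minimal sketch of a chat call through Cohere's Python SDK; the v2 client, the model id "command-a-03-2025", and the CO_API_KEY environment variable are assumptions, not details given on this page.

```python
# Minimal sketch (assumed usage, not an official snippet): a chat request to
# Command A through Cohere's Python SDK v2 client.
import os
import cohere

# Assumes a Cohere API key is stored in CO_API_KEY.
co = cohere.ClientV2(api_key=os.environ["CO_API_KEY"])

response = co.chat(
    model="command-a-03-2025",  # assumed model id
    messages=[{"role": "user", "content": "Draft a short multilingual product announcement."}],
)
print(response.message.content[0].text)
```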

|                        | Gemini 2.0 Flash            | Command A                                   |
|------------------------|-----------------------------|---------------------------------------------|
| Provider               | Google                      | Cohere                                      |
| Release Date           | Dec 11, 2024                | Mar 14, 2025                                |
| Modalities             | text, images, voice, video  | text                                        |
| API Providers          | Google AI Studio, Vertex AI | Cohere, Hugging Face, major cloud providers |
| Knowledge Cut-off Date | 08.2024                     | -                                           |
| Open Source            | No                          | Yes                                         |
| Pricing (Input)        | $0.10 per million tokens    | $2.50 per million tokens                    |
| Pricing (Output)       | $0.40 per million tokens    | $10.00 per million tokens                   |
| MMLU                   | Not available               | 85.5%                                       |
| MMLU Pro               | 77.6%                       | Not available                               |
| MMMU                   | 71.7%                       | Not available                               |
| HellaSwag              | Not available               | Not available                               |
| HumanEval              | Not available               | Not available                               |
| MATH                   | 90.9%                       | 80%                                         |
| GPQA                   | 60.1% (Diamond)             | 50.8%                                       |
| IFEval                 | Not available               | 90.9%                                       |
| Mobile Application     | -                           |                                             |
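To make the pricing rows above concrete, here is a small back-of-the-envelope sketch that converts the per-million-token prices into a monthly bill; the workload figures (2M input tokens and 0.5M output tokens per month) are purely hypothetical.

```python
# Back-of-the-envelope cost comparison using the per-million-token prices from
# the table above. The monthly workload below is a hypothetical example.
PRICES = {  # USD per million tokens: (input, output)
    "Gemini 2.0 Flash": (0.10, 0.40),
    "Command A": (2.50, 10.00),
}

input_mtok, output_mtok = 2.0, 0.5  # millions of tokens per month (assumed)

for model, (p_in, p_out) in PRICES.items():
    cost = input_mtok * p_in + output_mtok * p_out
    print(f"{model}: ${cost:.2f} per month")
# Gemini 2.0 Flash: $0.40 per month
# Command A: $10.00 per month
```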
