o4-mini

OpenAI o4-mini is the newest lightweight model in the o-series, engineered for efficient, capable reasoning across text and visual tasks. It excels at code generation and image-based understanding while balancing latency against reasoning depth. The model supports a 200,000-token context window with up to 100,000 output tokens, making it suitable for extended, high-volume interactions. It accepts text and image inputs and produces text outputs with advanced reasoning. With its compact architecture and versatile performance, o4-mini suits a wide range of real-world applications that demand fast, cost-effective intelligence.
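The context and output limits above imply a simple budgeting rule when building a request: the prompt and the output must together fit inside the 200,000-token window, and the output can never exceed 100,000 tokens. A minimal sketch of that rule follows; the parameter names (`model`, `max_output_tokens`) mirror common OpenAI SDK conventions but are illustrative assumptions here, not a verified API surface.

```python
# Documented limits for o4-mini, per the description above.
CONTEXT_WINDOW = 200_000     # total tokens the model can attend to
MAX_OUTPUT_TOKENS = 100_000  # cap on generated output tokens


def plan_request(prompt_tokens: int, requested_output: int) -> dict:
    """Clamp the output budget so prompt + output fits the context window."""
    if prompt_tokens >= CONTEXT_WINDOW:
        raise ValueError("prompt alone exceeds the context window")
    budget = min(requested_output,
                 MAX_OUTPUT_TOKENS,
                 CONTEXT_WINDOW - prompt_tokens)
    return {"model": "o4-mini", "max_output_tokens": budget}


# Example: a 150,000-token prompt leaves only 50,000 tokens for output.
print(plan_request(150_000, 100_000))
```

In practice, a 150,000-token prompt caps the reply at 50,000 tokens even though the model could otherwise emit 100,000.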

Qwen2.5-VL-32B

Over the past five months since the release of Qwen2-VL, developers have built new models based on it, contributing valuable feedback. Now, Qwen2.5-VL introduces enhanced capabilities, including precise analysis of images, text, and charts, as well as object localization with structured JSON outputs. It understands long videos, identifies key events, and functions as an agent, interacting with tools on computers and phones. The model's architecture features dynamic video processing and an optimized ViT encoder for improved speed and accuracy.
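The structured JSON output mentioned above is what makes the localization results machine-consumable. A minimal sketch of parsing such a response follows; the exact schema (field names `bbox_2d` and `label`) is an assumption for illustration, so check the model's actual response format before relying on it.

```python
import json

# Hypothetical localization output in the structured-JSON style
# described above: one object per detection with a label and a
# [x1, y1, x2, y2] pixel bounding box.
sample_response = '''
[
  {"bbox_2d": [48, 60, 200, 310], "label": "person"},
  {"bbox_2d": [220, 100, 380, 290], "label": "dog"}
]
'''


def parse_detections(raw: str) -> list:
    """Turn the model's JSON string into (label, (x1, y1, x2, y2)) pairs."""
    return [(d["label"], tuple(d["bbox_2d"])) for d in json.loads(raw)]


for label, (x1, y1, x2, y2) in parse_detections(sample_response):
    print(f"{label}: box {x1},{y1} -> {x2},{y2}, "
          f"area {(x2 - x1) * (y2 - y1)}")
```

Because the output is plain JSON, downstream code can filter, scale, or draw the boxes without any model-specific tooling.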

|                        | o4-mini                  | Qwen2.5-VL-32B       |
|------------------------|--------------------------|----------------------|
| Provider               | OpenAI                   | Alibaba Cloud        |
| Release Date           | Apr 16, 2025             | Mar 25, 2025         |
| Modalities             | text, images             | text, images, video  |
| API Providers          | OpenAI API               | —                    |
| Knowledge Cut-off Date | —                        | Unknown              |
| Open Source            | No                       | Yes                  |
| Pricing (Input)        | $1.10 per million tokens | $0                   |
| Pricing (Output)       | $4.40 per million tokens | $0                   |
| MMLU                   | —                        | 78.4%                |
| MMLU Pro               | —                        | 49.5%                |
| MMMU                   | 81.6%                    | 70%                  |
| HellaSwag              | —                        | Not available        |
| HumanEval              | 14.28%                   | Not available        |
| MATH                   | —                        | 82.2%                |
| GPQA Diamond           | 81.4%                    | 46.0%                |
| IFEval                 | —                        | Not available        |
| AIME 2024              | 93.4%                    | —                    |
| AIME 2025              | 92.7%                    | —                    |
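The per-token prices quoted for o4-mini ($1.10 per million input tokens, $4.40 per million output tokens) make cost estimation a one-line calculation. A minimal sketch, using only the figures stated on this page:

```python
# o4-mini pricing from the comparison above (USD per million tokens).
INPUT_PRICE_PER_M = 1.10
OUTPUT_PRICE_PER_M = 4.40


def o4_mini_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M


# Example: 200,000 input tokens and 50,000 output tokens.
print(round(o4_mini_cost(200_000, 50_000), 4))  # 0.44
```

Note that output tokens cost four times as much as input tokens, so long generations dominate the bill even when prompts are large.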
