# Whisper-Large-V3-Turbo-Darwin-NOESIS-FP16

Encoder-averaged Whisper: a stronger encoder (weighted average of v3-full and Turbo) paired with the v3-full decoder (32 layers) for maximum generation quality.
Released as part of the NOESIS Professional Multilingual Dubbing Automation Platform (framework: DHCF-FNO — Deterministic Hybrid Control Framework for Frozen Neural Operators).
- Founder: Ilia Bolotnikov
- Organization: AMAImedia.com
- X (Twitter): @AMAImediacom
- LinkedIn: Ilia Bolotnikov
- Telegram: @djbionicl
- NOESIS version: v14.7
- Release date: 2026-04
## ⚠️ License notice
This model is a derivative of OpenAI Whisper (openai/whisper-large-v3 and
openai/whisper-large-v3-turbo), which are released under the MIT License.
The encoder-averaged merge is distributed under the same MIT terms.
See the LICENSE file in this repository for the full text.
## Model summary
| Property | Value |
|---|---|
| Architecture | WhisperForConditionalGeneration |
| Parameters | ~1.5B |
| Encoder layers | 32 (averaged v3-full + Turbo) |
| Decoder layers | 32 (from v3-full, best quality) |
| d_model | 1280 |
| Vocab size | 51,866 |
| Format | safetensors |
| Languages | 99 (multilingual) |
## Merge strategy
| Component | Strategy | Source |
|---|---|---|
| Encoder (32 layers) | Weighted average (v3=0.55, turbo=0.45) | Both models |
| Decoder (32 layers) | Kept as-is | whisper-large-v3-full |
Because Turbo was distilled from v3-full, the two encoders share tensor shapes and their task vectors are compatible. Averaging the encoders combines the full model's multilingual robustness with Turbo's distilled representations, while keeping the v3-full decoder preserves maximum generation quality (32 layers vs. Turbo's 4).
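The encoder merge described above reduces to a per-tensor weighted average over two state dicts. A minimal sketch with dummy tensors (the 0.55/0.45 weights mirror the table; `merge_encoders` and the toy state dicts are illustrative, not the actual merge script — see `merge_provenance.json` for the real trace):

```python
import torch

def merge_encoders(sd_v3, sd_turbo, w_v3=0.55, w_turbo=0.45):
    """Per-tensor weighted average of two encoder state dicts.

    Assumes both dicts have identical keys and tensor shapes,
    which holds here because Turbo was distilled from v3-full.
    """
    merged = {}
    for name, t_v3 in sd_v3.items():
        t_turbo = sd_turbo[name]
        assert t_v3.shape == t_turbo.shape, f"shape mismatch at {name}"
        merged[name] = w_v3 * t_v3 + w_turbo * t_turbo
    return merged

# Toy demonstration: averaging a 2x2 "weight" of ones with one of zeros
sd_a = {"encoder.layer0.weight": torch.ones(2, 2)}
sd_b = {"encoder.layer0.weight": torch.zeros(2, 2)}
out = merge_encoders(sd_a, sd_b)
print(out["encoder.layer0.weight"][0, 0].item())  # 0.55
```

The same loop applied to the real checkpoints would run over every encoder tensor; decoder tensors are simply copied from v3-full.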
## How to use
```python
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor, pipeline
import torch

model_id = "AMAImedia/NOESIS-Whisper-Large-V3-Darwin"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForSpeechSeq2Seq.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Transcribe an audio file with the standard ASR pipeline
asr = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
)
print(asr("audio.wav")["text"])
```
## NOESIS context

Used as the multilingual ASR teacher in the NOESIS ASR-10B knowledge-distillation pipeline. The stronger encoder improves soft-label quality across 150+ dubbing languages.
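The teacher role above typically comes down to a KL-divergence term between temperature-softened teacher and student token distributions. A minimal sketch with random logits (the temperature, shapes, and `soft_label_kd_loss` helper are illustrative assumptions, not the actual NOESIS training code):

```python
import torch
import torch.nn.functional as F

def soft_label_kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Standard soft-label distillation loss: KL divergence between
    temperature-softened teacher and student distributions, scaled by T^2."""
    t = temperature
    student_logp = F.log_softmax(student_logits / t, dim=-1)
    teacher_p = F.softmax(teacher_logits / t, dim=-1)
    return F.kl_div(student_logp, teacher_p, reduction="batchmean") * (t * t)

# Toy example: 4 decoding positions over an 8-token vocabulary
torch.manual_seed(0)
student = torch.randn(4, 8)
teacher = torch.randn(4, 8)
loss = soft_label_kd_loss(student, teacher)
print(float(loss))  # non-negative scalar; 0 when the logits match
```

In practice the teacher logits would come from this model's decoder and the student would be the smaller ASR-10B model; the loss is usually mixed with a hard cross-entropy term on ground-truth transcripts.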
## Provenance

Full merge trace in `merge_provenance.json`.
## Citation

```bibtex
@misc{noesis_whisper_darwin,
  title     = {NOESIS-Whisper-Large-V3-Darwin},
  author    = {Bolotnikov, Ilia},
  year      = {2026},
  publisher = {AMAImedia},
  url       = {https://amaimedia.com}
}
```