Crash using WebGPU with translation pipeline #1380

@thesmartwon

System Info

transformers.js 3.7

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

I'm trying to run a translation pipeline with WebGPU, but it crashes. The console shows the following output before the crash:

transformers.web.js:7483 dtype not specified for "encoder_model". Using the default dtype (fp32) for this device (webgpu).
transformers.web.js:7483 dtype not specified for "decoder_model_merged". Using the default dtype (fp32) for this device (webgpu).
ort-wasm-simd-threaded.jsep.mjs:100 2025-07-25 16:42:10.752699 [W:onnxruntime:, session_state.cc:1280 VerifyEachNodeIsAssignedToAnEp] Some nodes were not assigned to the preferred execution providers which may or may not have an negative impact on performance. e.g. ORT explicitly assigns shape related ops to CPU to improve perf.
ort-wasm-simd-threaded.jsep.mjs:100 2025-07-25 16:42:10.755200 [W:onnxruntime:, session_state.cc:1282 VerifyEachNodeIsAssignedToAnEp] Rerunning with verbose output on a non-minimal build will show node assignments.

Reproduction

import { pipeline } from '@huggingface/transformers';

async function translate(text: string) {
	const translator = await pipeline(
		'translation',
		'Xenova/nllb-200-distilled-600M',
		{
			device: "webgpu",
		},
	);

	return await translator(text, {
		src_lang: 'heb_Hebr',
		tgt_lang: 'eng_Latn',
	});
}
console.log(await translate("בראשית ברא אלהים את השמים ואת הארץ׃"))
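The first two warnings suggest that no `dtype` was passed for the encoder and decoder modules. A possible workaround to try is setting `dtype` explicitly in the pipeline options. This is a minimal sketch, assuming transformers.js v3 accepts `dtype` either as a single string or as a per-module map keyed by module name; the specific quantization levels chosen here are illustrative, and whether this avoids the crash itself is untested:

```typescript
// Hypothetical pipeline options: setting `dtype` explicitly should at least
// silence the "dtype not specified" warnings on WebGPU.
const options = {
	device: 'webgpu' as const,
	// Either a single string (e.g. 'fp32', 'fp16', 'q8'), or a per-module map
	// matching the module names from the warnings:
	dtype: {
		encoder_model: 'fp32',
		decoder_model_merged: 'q8',
	},
};
```

These options would replace the `{ device: "webgpu" }` object in the repro above.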

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)
