arXiv cs.AI, by the Synapse Flow editorial team

Ortho-Hydra: Orthogonalized Experts for DiT LoRA

Abstract

arXiv:2605.03252v1 Announce Type: cross Abstract: LoRA fine-tuning of diffusion transformers (DiT) on multi-style data suffers from *style bleed*: a single low-rank residual cannot represent several distinct artist fingerprints, and the optimizer converges to their average. Mixture-of-experts …
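The failure mode the abstract describes can be made concrete: each style wants its own low-rank residual ΔW = BA, and forcing experts into mutually orthogonal subspaces keeps their updates from collapsing into an average. The sketch below is illustrative only; the paper's actual orthogonalization procedure is not given in this excerpt, so a simple Gram-Schmidt pass over hypothetical expert factors `B_i` stands in for it.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n_experts = 16, 2, 3  # model dim, LoRA rank, number of style experts

# One low-rank LoRA residual per expert: delta_W_i = B_i @ A_i
A = [rng.standard_normal((r, d)) for _ in range(n_experts)]
B = [rng.standard_normal((d, r)) for _ in range(n_experts)]

def orthogonalize(Bs):
    """Gram-Schmidt across experts: project each expert's columns out of
    the span of all previous experts, so the residual subspaces are
    mutually orthogonal. (Illustrative stand-in, not the paper's method.)"""
    basis = []   # accumulated orthonormal direction vectors
    out = []
    for Bi in Bs:
        Bi = Bi.copy()
        for q in basis:
            Bi -= np.outer(q, q @ Bi)       # remove overlap with earlier experts
        Q, _ = np.linalg.qr(Bi)             # orthonormal basis of this expert's span
        out.append(Bi)
        basis.extend(Q.T)                   # columns of Q as row vectors
    return out

B_orth = orthogonalize(B)

# Cross-expert overlap is now (numerically) zero:
print(np.abs(B_orth[0].T @ B_orth[1]).max())
```

Expert 0 is left untouched; each later expert keeps only the component of its update that lies outside the subspaces already claimed, which is one simple way to stop two style residuals from averaging into one.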

