Sorry, for some reason I missed this comment earlier. I think decomposing them could be quite expensive, but let's assume it works. The problem then might be that you lose too much information. In LoRA, all of the original weights are kept frozen and used as-is in the forward pass; only a small low-rank update is added on top. Your approach, which isn't a bad idea, would probably discard too much useful information when you reconstruct the original weight matrix from the decomposition during the forward pass.
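To make the contrast concrete, here is a minimal numpy sketch (the shapes and rank are made up for illustration): LoRA computes `W @ x + B @ (A @ x)` with the full-rank `W` untouched, whereas a decompose-then-reconstruct approach replaces `W` with a truncated low-rank approximation, which introduces an irreversible reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))  # stand-in for a pretrained weight matrix
x = rng.standard_normal(64)

# LoRA-style: W stays frozen; a trainable low-rank update B @ A is added.
r = 4
A = rng.standard_normal((r, 64)) * 0.01
B = np.zeros((64, r))  # B starts at zero, so the output is unchanged at init
y_lora = W @ x + B @ (A @ x)  # full-rank W still contributes exactly

# Decompose-then-reconstruct: keep only the top-r singular directions of W.
U, S, Vt = np.linalg.svd(W)
W_r = (U[:, :r] * S[:r]) @ Vt[:r, :]  # rank-r approximation of W
y_trunc = W_r @ x

print(np.linalg.norm(y_lora - W @ x))   # 0.0 at init: nothing is lost
print(np.linalg.norm(y_trunc - W @ x))  # nonzero: information discarded
```

The LoRA output matches the original exactly at initialization because the update is additive, while the truncated reconstruction carries an error that no amount of fine-tuning the retained factors can recover.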