by hedora 5 hours ago

I thought I had to split it in the BIOS, but it turned out I didn't (this is on a 2025 machine): llama ended up with the same available "GPU" RAM either way (confirmed by running inference on it).

mrbuttons454 5 hours ago

Oh, that's fantastic, I'll give it a try. Thank you!