

Ollama is a big topic. Do you want it to be fast? Then you'll need a GPU. How large is the model you'll be running? A 7/8B model on CPU isn't as fast, but it's no problem; 13B is slow on CPU but still possible.
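If it helps, here's a minimal sketch of talking to a locally running Ollama server over its HTTP API (assuming the default port 11434 and that you've already pulled a model, e.g. llama3:8b; swap in whatever model you actually use):

```python
import requests

# Ask a locally running Ollama server (default: http://localhost:11434)
# to generate a completion. Assumes you've already done `ollama pull llama3:8b`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3:8b",   # placeholder: use whichever model you pulled
        "prompt": "Explain what a homelab is in one sentence.",
        "stream": False,        # return a single JSON object instead of a stream
    },
    timeout=300,                # CPU-only inference can take a while
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text
```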
I would take that any day!
That is true. I mostly only use my homelab for a few game servers I'm running, and you're totally right. The only reason I want to run Proxmox, and really why I have a homelab at all, is to learn more about servers and self-hosting. I'm currently in the first year of my apprenticeship and I've learned so much since I got my server up and running 😄 and I think I can learn a lot more once I'm using Proxmox.
Please keep me up to date on what you try and how the migration goes! :D And obviously, good luck.
I am in the same boat and am currently thinking about how I can migrate my stuff over without a month of downtime. EDIT: After reading all the comments I'm still not sure whether I should do it or, like I said, even how. I love my Unraid and it fits me well, but I think I've also fallen in love with Proxmox.
I don't think it matters too much, but I'm not sure. I just stuck to using the dedicated extension and it's working well.
Meh, that sucks. I even have a perfectly working DDNS setup. I know I don't get something like a PTR record, but I wish mail hosters would allow for more self-hosting options.
At least here in Germany it's like that. If you get a new number, you can be 99.9% certain that number is on WhatsApp; it's inevitable, since it's the main chat platform for everyone. So if you wanted to switch platforms you'd have to convince a lot of people, and most wouldn't be ready to do that. Why bother when you can just use WhatsApp?
Oh yeah, I heard about this and saw that Mutahar (SomeOrdinaryGamers) did it once on Windows with a 4090. I would love to do that with my GPU and then split it between my host and my VM.
Wonderful thank you so much!
I need that wallpaper! Is there any way you could share it with me?
I used llama.cpp with OpenCL, but a couple of months ago they added ROCm support, which is even faster.
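As a rough illustration, here's a minimal sketch using the llama-cpp-python bindings with GPU offload (the model path is a placeholder, and a ROCm/OpenCL-enabled build of llama.cpp is assumed):

```python
from llama_cpp import Llama

# Load a local GGUF model and offload layers to the GPU.
# n_gpu_layers=-1 tries to push all layers onto the GPU (ROCm/OpenCL/CUDA,
# depending on how llama.cpp was built). The model path is just a placeholder.
llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",
    n_gpu_layers=-1,
    n_ctx=2048,  # context window size
)

out = llm("Q: What is ROCm? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"].strip())
```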
The language is German 😅 If you have any clue where and how I could get them without a private tracker, you would make my day.
I also mainly use public trackers. I'd love to get some good German films and so on, but they're all behind private trackers, which I sadly only learned after I set up my *arr stack.
Just want to piggyback on this. You will probably need more than 6 GB of VRAM to run good enough models at an acceptable speed with coherent output, but the more the better.
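For a rough sense of why 6 GB gets tight, here's a back-of-the-envelope sketch. The numbers are just the usual rule of thumb of bytes per parameter at a given quantization plus some overhead for the KV cache and runtime buffers, not exact figures:

```python
# Rough VRAM estimate: parameters * bytes-per-parameter + some overhead
# for the KV cache and runtime buffers. Ballpark numbers only.

def vram_gb(params_billion: float, bits_per_param: float, overhead_gb: float = 1.0) -> float:
    weight_gb = params_billion * 1e9 * (bits_per_param / 8) / 1024**3
    return weight_gb + overhead_gb

for params in (7, 13):
    for bits, label in ((4, "Q4"), (8, "Q8"), (16, "FP16")):
        print(f"{params}B @ {label}: ~{vram_gb(params, bits):.1f} GB")
```

Under these assumptions a 7B model at Q4 already lands around 4-5 GB, and a 13B model at Q4 is past 7 GB, which is why 6 GB cards feel cramped.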
Well, private trackers are hard to get into when you're just starting out, and as someone already said, I don't want pirating to be a second job.
Depends on your use case, I guess. I personally use Unraid for my server, which works fine, but if you don't want to pay anything and still want a good experience, choose Ubuntu Server or Debian (since they are the most supported and stable). I don't have much experience there either, though; theoretically you can do it on any distro.
Is Kagi a metasearch engine? Or does it have its own crawler and so on?
FYI, Searx is deprecated, or at least no longer maintained and discontinued. Switch to SearXNG, which is an active fork and really good :D
They also created Ghidra! Probably second best.