

Never heard the original definition lol, I wondered why the AI had such a weird name




My biggest problem took a bit to grok
Now that you’re on Linux, pop Docker on there and install Ollama/Open WebUI so you can run your own Grok at home and not have to support yet another horrible company


Jfc that Arduino terms update is disgusting


https://github.com/linux-surface/linux-surface?tab=readme-ov-file
There’s a dedicated community just for Linux on the Surface


Beware of BitLocker though - I had no idea it was enabled on my wife’s Windows tablet, and when I came back from a Linux live image her Windows drive was locked. The keys weren’t in her Microsoft account and we couldn’t find them anywhere else, so I ended up having to wipe the device. Luckily she didn’t use it for much, but find and back up your keys before attempting a live Linux boot.
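For anyone wanting to do that backup before touching a live image: Windows ships a built-in tool, manage-bde, that prints the 48-digit recovery password. A sketch, run from an elevated Command Prompt (the drive letter and output path here are just examples):

```shell
:: Show BitLocker status and protectors (including the 48-digit
:: Numerical Password recovery key) for the C: drive
manage-bde -protectors -get C:

:: Redirect the same output to a file you keep OFF the device
:: (E:\ is an example path, e.g. a USB stick)
manage-bde -protectors -get C: > E:\bitlocker-recovery.txt
```

Once you have that key saved somewhere safe, a locked drive after a live boot is an inconvenience instead of a data loss.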


What was the ASUS privacy update? In the router settings page? I haven’t seen it yet but I don’t use any of their extra services


Guess I’ll be getting around to starting my own pihole after all


Thanks for this - I accidentally locked my wife’s tablet when I was testing whether Linux would run on it from a USB drive. Came back to Win 11 and it was BitLockered, with no codes in her Microsoft account and no idea where else to find them. Hopefully I can study this and figure out a way to bypass it


The standalone Cosmos installation is now recommended over the Docker container. It’s technically still in beta, but I’ve been running it since February and am very happy with it. The dev has indicated updates will focus on the standalone installation going forward.
I’ve found it’s made everything vastly easier for me - the marketplace, integrated reverse proxy, and URL manager mean you can literally spin up many services with a single click. Yet it has sub-menus for Docker env and compose data that let you dig deeper into how the containers work if you’re interested.
https://cosmos-cloud.io/docs/index/ “Install Cosmos as a standalone service”


In terms of the TinyMiniMicros, I think an i5-6500T, 7500T, or 8500T (the T signifies a 35 W TDP) could all fit your price point depending on RAM/SSD specs. I haven’t done much research on the N100 processors, but I think they’re broadly comparable to the above i5s


While I get leaning towards AMD products (I’ve been doing so as well), when I built my first server with a Ryzen 5 2400GE I found there just aren’t as many resources or as much support for enabling transcoding with the Vega 11 in Jellyfin or Immich. Most Intel iGPUs have a hardware block specifically tuned for transcoding, called Quick Sync, that you should strongly consider.
Especially in the $100-200 price range, TinyMiniMicros from HP/Lenovo/Dell are widely available and offer lots of capability in a power-efficient (~10-15 W idle, 40-50 W full load) and easily maintainable form factor. The Lenovos in particular are interesting because a few models have full PCIe slots if you decide later you want a GPU.
Finally, for software I’d suggest looking into Cosmos Cloud; I use it and have found it makes it much easier to set up and manage all my Docker containers and domain name/reverse proxy settings.
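As a rough sanity check on those power figures, here’s some back-of-envelope math (the 10% load duty cycle is a made-up assumption for illustration):

```python
# Estimate annual energy use from the idle/load wattage figures above.
# The 10% load_fraction duty cycle is an assumption, not a measurement.
def kwh_per_year(idle_w: float, load_w: float, load_fraction: float = 0.1) -> float:
    # Time-weighted average power, converted to kWh over a year
    avg_w = idle_w * (1 - load_fraction) + load_w * load_fraction
    return avg_w * 24 * 365 / 1000

# ~12 W idle, ~45 W loaded 10% of the time -> about 134 kWh/year
print(round(kwh_per_year(12, 45)))  # 134
```

At typical residential electricity rates that works out to a few dollars a month, which is a big part of the appeal of these machines over repurposed desktops.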


Any Intel CPU with Quick Sync will likely provide plenty of transcoding capability for his use case with significantly lower power draw


If this was only 6 weeks ago then you can still most likely do a credit card chargeback if you paid that way
If this is your first time trying to self-host I highly recommend Cosmos Cloud; I’ve been using it for 6 months and it’s made every step of the way much easier for me. It manages Docker containers and includes reverse proxy and security features, with a paid option for a personal VPN similar to Tailscale.
Most services work perfectly from a catalog of pre-built Docker Compose files, but for Jellyfin I remember having to go to the internal Docker IP on the actual host machine to finish setting up the server before it was visible from other machines
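If you hit the same Jellyfin quirk, one way to find a container’s internal Docker IP is `docker inspect` with a format template. A sketch, run on the host (the container name `jellyfin` is an assumption; use whatever name Cosmos gave yours):

```shell
# Print the container's IP address on each Docker network it's attached to
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' jellyfin
```

Then browse to that IP on port 8096 from the host machine to finish the initial setup wizard.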


Bro 2.0 came so fast I didn’t even have time to do 144, like why did they even bother releasing that when 2.0 was coming the very next day lol


When on your wifi, try navigating in your browser to your Windows computer’s address with a colon and the port 11434 at the end. It would look something like this (substituting your machine’s local IP): http://192.168.1.50:11434
If it works, your browser will just load the text: Ollama is running
From there you just need to figure out how you want to interact with it. I personally pair it with Open WebUI for the web interface
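If you’d rather script that check than eyeball a browser tab, here’s a minimal Python sketch; 11434 is Ollama’s default port, and the host IP in the usage comment is a placeholder:

```python
# Minimal health check for a remote Ollama server.
from urllib.request import urlopen

def is_ollama_up(body: str) -> bool:
    # Ollama's root endpoint returns exactly this plain-text banner
    return body.strip() == "Ollama is running"

def check(host: str, port: int = 11434, timeout: float = 3.0) -> bool:
    # Returns True if the server answers with the expected banner
    try:
        with urlopen(f"http://{host}:{port}/", timeout=timeout) as resp:
            return is_ollama_up(resp.read().decode())
    except OSError:
        return False

# check("192.168.1.50")  # placeholder IP; True once Ollama is reachable
```

Handy to drop into a cron job or dashboard so you know the box is up before a client like Open WebUI complains.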


The problem is big businesses like Temu can bulk ship and still only pay a certain percentage.
But it will ruin small businesses that only do small shipments and will now face a flat fee that may be half or more of the value of the goods.
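To put hypothetical numbers on that (the fee and shipment values below are made up purely for illustration):

```python
# Illustrates why a flat per-shipment fee hits small senders far
# harder than a percentage tariff. All dollar figures are hypothetical.
def effective_rate_pct(goods_value: float, flat_fee: float) -> float:
    # Flat fee expressed as a percentage of the goods' value
    return flat_fee / goods_value * 100

bulk = effective_rate_pct(50_000, 100)   # one big pallet: fee is 0.2% of value
small = effective_rate_pct(40, 100)      # one small parcel: fee is 250% of value
print(bulk, small)  # 0.2 250.0
```

Same fee, wildly different effective rates - which is exactly why bulk shippers shrug it off while small sellers can’t.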
I’m actually right there with you; I have a 3060 12GB and tbh I think it’s the absolute most cost-effective GPU option for home use right now. You can run 14B models at a very reasonable pace.
Doubling or tripling the cost and power draw just to get 16-24GB doesn’t seem worth it to me. If you really want an AI-optimized box I think something with the new Ryzen AI Max chips would be the way to go - like the ASUS ROG Flow Z13, the Framework Desktop, or the GMKtec option, whatever it’s called. Apple’s new Mac Minis are also great options. Both Ryzen AI Max and Apple use shared CPU/GPU memory, so you can go up to 96GB+ at much lower power draw.
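A rough rule of thumb for why 12GB is a sweet spot (the ~0.5 bytes per parameter at 4-bit quantization and ~1.2x overhead factor below are approximations, not exact figures):

```python
# Back-of-envelope VRAM estimate for a quantized LLM.
# Assumptions: ~0.5 bytes/parameter at 4-bit quantization, plus a
# ~1.2x factor for KV cache and activations. Rough rule of thumb only.
def vram_gb(params_billion: float, bytes_per_param: float = 0.5,
            overhead: float = 1.2) -> float:
    return params_billion * bytes_per_param * overhead

# A 14B model at Q4 needs roughly 8.4 GB -> fits a 12 GB RTX 3060
print(round(vram_gb(14), 1))  # 8.4
```

By the same math a 24B model lands around 14-15 GB, which is exactly where you start paying double or triple for the next VRAM tier.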


As an Xperia 5 III user I’m feeling very left out
SelfHosted community of Lemmy right now