

Oh yeah? Name ONE ape that wrote Shakespeare. Go on I’ll wait
AI is cool. AI research is valuable. AI has the power to be a transformative technology.
Corporate AI hype for tech-fueled neofeudalism is not cool. Commodification of AI is not valuable. LLMs will never substantively change the world.
This whole message reads like “we don’t actually care but we have to say that we do 😉🙂↕️”
Honestly this is pretty much it. Sometimes you have to be pretty aggressive to get companies to do the thing you need; they exploit the social friction of pushing back to keep you in predatory arrangements. They literally design it to be frustrating so you’ll give up. Like you, I try to make it clear to the person I’m speaking with that I have no problem with them, just the business. But if the corporations require me to get mad to do the right thing, I will get mad.
Why not password-protect the keys (à la Linux ssh/gpg symmetric encryption for local storage of the PPK)?
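For instance, here’s a minimal sketch of the at-rest idea using `openssl` (the filenames and passphrase are hypothetical; `ssh-keygen -p` and `gpg --symmetric` do the equivalent job for SSH and GPG keys):

```shell
# Encrypt a private key at rest with a passphrase-derived symmetric key.
# Paths and passphrase below are hypothetical examples.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in id_ed25519 -out id_ed25519.enc -pass pass:correct-horse

# Decrypt it back when needed:
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in id_ed25519.enc -pass pass:correct-horse
```

`-pbkdf2` stretches the passphrase into a key so a stolen file still has to survive a brute-force attempt.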
Half the time I look on Stack Overflow it feels like the answer is irrelevant by today’s standards
TL;DR: New desktop environment designed for PopOS (but usable elsewhere)
An oversimplification, but imagine you have an algebraic function where every word in English is assigned a number.
ax + by + cz = n, where x, y, z are the three words in a sentence and n is the predicted next word, based on the learned coefficients a, b, c.
Now imagine you have 10 trillion coefficients instead of 3. That’s an LLM, more or less. Except it’s done procedurally, and there aren’t actually that many input variables (the context window), just a lot of coefficients per input.
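The analogy above can be sketched in a few lines. This is deliberately a toy (made-up vocabulary and coefficients, word IDs instead of learned embeddings), not how real LLMs work, but it shows the “weighted sum of the context predicts the next word” shape of the idea:

```python
# Toy "next-word predictor": a linear function over word IDs.
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
inv = {i: w for w, i in vocab.items()}

# Hypothetical learned coefficients for a 3-word context window.
coeffs = [0.5, 0.25, 1.0]

def predict_next(context):
    # Weighted sum of the context word IDs, snapped to the nearest vocab ID.
    score = sum(c * vocab[w] for c, w in zip(coeffs, context))
    return inv[round(score) % len(vocab)]
```

A real model replaces the word IDs with high-dimensional vectors and the single linear function with stacked nonlinear layers, but the “fixed-size context in, one predicted token out” framing carries over.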
Good point! Hadn’t thought of that
Absolutely boneheaded move by NVIDIA. Guess they just saw dollar signs and stopped thinking. What I don’t get is they’re already at like 300% capacity; I don’t think there would be any business shortfalls from selling only to US customers
Vector embeddings with ChromaDB. Basically you precompute the embeddings of every row / table / whatever granularity you want and stick that into a vector DB. Then you compute an embedding of your query and compare similarity. You can either return the table / row / whatever that’s most similar (“semantic search”) or use it as context for an LLM (“RAG”)
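The core of the similarity step is just cosine similarity over vectors. A minimal sketch, with made-up 3-dimensional vectors standing in for real model embeddings and a plain dict standing in for ChromaDB:

```python
import math

# Hypothetical precomputed embeddings, one per table.
# In practice these come from an embedding model and live in a vector DB.
doc_embeddings = {
    "orders table":   [0.9, 0.1, 0.0],
    "users table":    [0.1, 0.9, 0.1],
    "invoices table": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    # Cosine similarity: dot product normalized by vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, k=1):
    # Rank stored docs by similarity to the query embedding; return top k.
    ranked = sorted(doc_embeddings,
                    key=lambda d: cosine(query_vec, doc_embeddings[d]),
                    reverse=True)
    return ranked[:k]
```

For “semantic search” you return the top-k hits directly; for RAG you paste their contents into the LLM prompt as context.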
Hey Vsauce, Michael here…
vibraphone intensifies
This would become an antitrust suit, I would imagine.
MY NAME IS BARRY ALLEN AND IM THE FASTEST MAN ALIVE
gets outrun every episode
fuck yeah I live for this shit. Glad there’s smart people doing their thing. Wonder what they’ll call it. Muon force?
That’s awful. I don’t know why sign language isn’t made an official state language that everyone has to learn to some basic level of proficiency
A bunch of programs that form the core utilities required to make an OS useful.
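For example, on a typical Linux system a bunch of the commands you use every day come from GNU coreutils:

```shell
# A few everyday commands provided by coreutils:
printf 'banana\napple\n' | sort   # sort lines alphabetically
echo 'hello' | wc -l              # count lines of input
ls /                              # list directory contents
```

Without them, a kernel alone boots to a prompt where you can’t list, copy, or read anything.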
The Deck is for sure standing on the shoulders of giants.
Deleting 107k lines of code is so much more based than adding 107k lines