

Yeah pulling nearly 600w through a connector designed for 600w maximum just seems like a terrible idea. Where’s the margin for error?
Less weight. I’m still on a 13 mini. One of the things putting me off a larger phone is that they’re all so heavy. I don’t want a larger screen either, but I might end up compromising with this.
If you want a really simple way to run a variety of local models with a nice UI take a look at https://jan.ai/
Probably more likely to be related to the Sora release or any of the other stuff they’ve announced this week.
Yeah, which ports are folks actually missing here? Looking at the various ports:
MagSafe: this has returned on the new machines. I like it for the green/orange charge indicator.
RJ-45: OK, I kind of get it, but it's such a tall port. Personally I'd prefer a thinner laptop in this instance.
Mini DVI: long dead, replaced with HDMI. The MacBook Pros have HDMI.
FireWire: long dead.
USB-A: replaced with USB-C. OK, one A port here would still be useful.
Headphone/mic: still there, just as a combo port on the other side.
SD card reader: the MacBook Pros have this.
Mini DisplayPort: long dead, replaced with DisplayPort over USB-C.
In short, if you want HDMI and an SD card reader and are anti-dongle, you get an MBP, which has both.
Either way, we now have ports that can push insane bandwidth and route USB, PCIe, HDMI and DisplayPort over the same cable, which is incredibly versatile.
Yup. Investors have convinced themselves that this time AI development is going to grow exponentially. The breathless fantasies they’ve concocted for themselves require it. They’re going to be disappointed.
I doubt they’ll ever come to Europe. They don’t meet even the most basic crash safety standards. These things are designed to annihilate pedestrians, not to try to reduce harm.
It needs to be way way better than ‘better than average’ if it’s ever going to be accepted by regulators and the public. Without better sensors I don’t believe it will ever make it. Waymo had the right idea here if you ask me.
If anyone was somehow still thinking RoboTaxi is ever going to be a thing: no, it's not, because of reasons like this.
Brill! Glad it helped.
Sounds like an SPF / DKIM / DMARC issue. You can test how your domain is set up here: https://www.dmarctester.com/
Ideally you want all three set up. But you must have SPF and DKIM or you’ve no chance of staying out of spam folders.
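If you'd rather check the DNS records yourself from a script, here's a minimal sketch using dnspython. "yourdomain.example" and the DKIM selector "selector1" are placeholders; your mail provider tells you the real selector.

    # Quick SPF / DMARC / DKIM lookup (pip install dnspython).
    # "yourdomain.example" and the DKIM selector "selector1" are placeholders.
    import dns.resolver

    def txt_records(name):
        try:
            return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return []

    domain = "yourdomain.example"
    spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
    dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]
    # DKIM lives at <selector>._domainkey.<domain>
    dkim = txt_records(f"selector1._domainkey.{domain}")

    print("SPF:  ", spf or "missing")
    print("DMARC:", dmarc or "missing")
    print("DKIM: ", dkim or "missing (check your provider's selector)")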
Today's news:
https://www.turkiyetoday.com/turkiye/turkiye-bans-discord-amid-concerns-over-platform-safety-62896/
For now, Discord users in Türkiye face limited access to the platform, though it remains unclear whether a full ban will be implemented in the coming days.
Still working for me on hotel wifi.
Edit: it won’t launch on my laptop now. Stuck trying to update. Still works on my phone.
They’ve committed to supporting AM5 (the LGA socket launched in 2022) through at least 2027.
“We envision other types of more complex guardrails should exist in the future, especially for agentic use cases, e.g., the modern Internet is loaded with safeguards that range from web browsers that detect unsafe websites to ML-based spam classifiers for phishing attempts,” the research paper says.
The thing is, folks know how the safeguards for the ‘modern internet’ actually work, and they’re generally straightforward code. LLMs are kind of the opposite: a mathematical model that spews out answers. Product managers who think one can be corralled into behaving in a specific, incorruptible way will, I suspect, be disappointed.
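For contrast, this is roughly what a traditional, inspectable safeguard looks like. It's purely illustrative and the blocklist and patterns are made-up placeholders, but the point is you can read every rule and predict exactly what it will do:

    # Illustrative only: a conventional guardrail is explicit, testable code.
    # The blocklist and patterns below are made-up examples.
    import re

    BLOCKED_DOMAINS = {"evil.example", "phish.example"}
    SUSPICIOUS_PATTERNS = [
        re.compile(r"verify your account", re.I),
        re.compile(r"password.{0,20}expires", re.I),
    ]

    def looks_unsafe(url: str, body: str) -> bool:
        domain = url.split("/")[2] if "://" in url else url
        return domain in BLOCKED_DOMAINS or any(p.search(body) for p in SUSPICIOUS_PATTERNS)

    print(looks_unsafe("https://phish.example/login", "Please verify your account"))  # True

You can't point at anything like that inside an LLM, which is the whole problem.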
There are no M1 devices with less than 8GB of RAM.
The A16 Bionic has a Neural Engine capable of 17 TOPS but only 6GB of RAM.
The M1 had a Neural Engine capable of just 11 TOPS but all M1 chips have at least 8GB of RAM.
So the model could run on an A16 Bionic if it had 8GB of RAM, since it has 54% more TOPS than the M1 (rough arithmetic below), but it only has 6GB. Apple have clearly decided that a model small enough to fit just wouldn’t give good enough results.
Maybe as research progresses they’ll find a way to make it work with a model with fewer parameters but I’m not going to hold my breath.
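The rough arithmetic behind that comparison, using the figures quoted above (a quick Python sanity check, nothing more):

    # A16 Bionic vs base M1, figures as quoted in the comment above.
    a16_tops, m1_tops = 17, 11
    a16_ram, m1_min_ram = 6, 8
    print(f"A16 has {(a16_tops / m1_tops - 1) * 100:.1f}% more Neural Engine TOPS")  # ~54.5%
    print(f"but {m1_min_ram - a16_ram}GB less RAM than the smallest M1 config")      # 2GB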
Yeah, I thought it was an NPU TOPS issue that’s keeping it off the 17 non-Pro. However, since it runs on an M1, I think it’s more to do with needing 8GB of RAM to fit the model.
He called the software integration between the two companies “an unacceptable security violation,” and said Apple has “no clue what’s actually going on.”
I’d be very surprised if corporates weren’t able to just disable it via MDM on their workers’ phones. Not sure it’s Apple who has ‘no clue’ here.
If they keep burning $100k/week on their Vercel bill they might not be around that long anyway!
Yeah, this article is woefully uninformed. The author seems to be butthurt about GPU pricing rather than having any serious interest in how the protocol actually works.