• 0 Posts
  • 530 Comments
Joined 1 year ago
Cake day: March 8th, 2024






  • Trading processing power for size is a thing. I guess it depends on application and implementation. Well, and on the actual size of the models required.

    It’s one of those things that makes for a good headline, but then for usability it has to be part of a whole conversation about whether you want to spend the bandwidth, the processing power on compression, the processing power on real time upscaling, the processing power on different compression tools, something else or a mix of the above.

    I suppose at some point it’s all “benchmarks or it didn’t happen” for these things. And when it comes to ML, benchmarks are increasingly iffy anyway.
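    To make that “spend bandwidth vs. spend processing power” tradeoff concrete, here’s a toy back-of-envelope in Python. Every number here (file sizes, bandwidth, decode overhead) is a made-up assumption for illustration, not a benchmark:

    ```python
    # Toy model: does a heavier codec that ships a smaller file actually
    # save the user time? Depends on their bandwidth vs. decode cost.
    # All figures below are hypothetical.

    def transfer_seconds(size_gb: float, bandwidth_mbps: float) -> float:
        """Seconds to download size_gb gigabytes at bandwidth_mbps megabits/s."""
        return (size_gb * 8 * 1000) / bandwidth_mbps  # GB -> megabits

    def total_seconds(size_gb: float, bandwidth_mbps: float,
                      decode_overhead_s: float) -> float:
        """Download time plus extra decode/upscale compute on the client."""
        return transfer_seconds(size_gb, bandwidth_mbps) + decode_overhead_s

    # Hypothetical: a classic codec ships 4 GB with negligible decode cost;
    # a fancy ML codec ships 1 GB but adds 120 s of client-side compute.
    classic_slow = total_seconds(4.0, 100, 0)      # 100 Mbps link: 320 s
    fancy_slow = total_seconds(1.0, 100, 120)      # 100 Mbps link: 200 s
    classic_fast = total_seconds(4.0, 1000, 0)     # 1 Gbps link: 32 s
    fancy_fast = total_seconds(1.0, 1000, 120)     # 1 Gbps link: 128 s
    ```

    With these made-up numbers, the ML codec wins on the slow link and loses badly on the fast one, which is exactly why there is no headline answer without the whole conversation.
    
    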




  • This was already true of a number of Switch 1 games, where the partial data in the cart did not include access to the full game. Some gave you access to only a demo (along the lines of the “play before the download is finished” feature on home consoles), others not even that. And of course it was true of the “code-in-a-box” products they were selling at retail that you couldn’t even resell or return.

    The real issue isn’t how the key carts work, which is an improvement on those.

    The issue is that the cost of carts with actual storage has gone up. Nintendo’s change of memory spec means they’ve given up on the low-storage carts, which used to come in a bunch of sizes, some of which were relatively cheap. They’ve gone for a single 64GB SKU, which means the type of game that can afford the physical storage will be significantly restricted.

    This may well make technical sense (the new storage standard is based on an SD card update that may not even exist at lower sizes by default), but the practical effect may be that the cost of physical carts makes no sense to anybody but the largest games/publishers, which is a travesty. Nintendo should have found a way around this, even if it meant subsidizing the cart cost with their cut of the game’s price to some extent. I get why that’s not the case, since it’d effectively mean giving their cut of each game straight to Amazon and other retailers, but man, does it suck as it is.

    I think what we’ll end up seeing is a lot more Limited Run-style expensive collector’s editions being the only physical media releases of many games. And even that only if people do get used to paying extra to subsidize the card out of their own pocket. If I was Nintendo I would have considered making it a standard to have every physical game in both formats as a rule and have people pay an extra tenner for the full storage version. Instead, they chose to try to push the top end of the price range anyway with no guarantee that the cost of media is part of the increase. They’ve been indecisive and the outcome is going to suck.

    Of course people would be complaining just as hard if they had done that, which is one of the examples where gamers’ default position being antagonism can yield worse results.



  • I’m confused, if you agree the terms don’t show any evidence of spyware or excessive data collection why quote this article specifically in the first place?

    And if you agree that the listed information is standard, what is the “shit like in this article” we’re talking about?

    Not having split screen on PC is a bummer, though, and it likely revolves mostly around an outdated impression of how games are used on PCs. Unfortunately I believe BL4 is sticking to that policy.

    I’m not sure how that makes pirating the game more or less justifiable, though. If what you want is playing split screen, that will remain just as unavailable on a pirated copy as on a licensed one.

    I mean, do what you want, obviously, but while the “removing spyware” thing would have made sense, the other one doesn’t quite add up.


  • I don’t think these two academics are suffering from disinterest or a lack of subject expertise.

    I think they are in a space where they don’t think it applies to their output in this particular venue. Maybe in a space where they are subconsciously tied to a “here/now/default” take on the world that is just the US and everything else is this othered “elsewhere” that gets perceived as somehow smaller, less relevant or exceptional.

    Part of it is a cultural disconnect. They may think the implications of “capitalism” when used for an American audience are clear. My observation is that this is not just a cultural disconnect in the use of the word, but that the word, when used inside the US, is fluid, poorly defined, deliberately imprecise and more or less tautological.

    Capitalism is whatever the US does now, as perceived by whoever is using the word. I think that’s a very purposeful result of US politics and, had they gotten to it in time, Americans might have benefited from putting an end to it before the entire system lost all meaning.


  • People around me will definitely conceive of a noncapitalist alternative because a significant number of them have lived in one.

    That doesn’t mean they will approve of returning to the systems they experienced previously. In many cases those systems were demonstrably worse and less sustainable. Plus, “from a European perspective,” the current system most of them live in is heavily social democratic, so again how we define those terms will be relevant.

    If you want to argue that this is not the “default” human experience, then with all due respect that just sounds like ethnocentrism to me. On the authors’ (and Jameson’s) part, at least. Probably a bit of internalized cultural imperialism on our part. It’s not the first time I notice a lot of the European left is trying, and often failing, to import some US left concepts that don’t really apply.


  • This is telling me very little about the value of standards in a non-capitalist model, but man, is it telling me a lot about how pressure-washed the brains of US academics are. “It is easier to imagine an end to the world than an end to capitalism,” the saying goes? What the hell? Is that a “saying”?

    I mean, part of the problem is I have no idea what Americans are talking about when they say “capitalism”. Some mean everything up to and including outright fascist or communist centralized management as long as some form of private property exists. For others any glimpse of social democracy past radical anarchocapitalism is “not capitalism”.

    But even beyond that, how hard could it be to picture a non-capitalist form of trade or information sharing when it actively exists right now and always has? Capitalism has sometimes been the hegemonic form of structure for commerce or society, but it has never been the only one in place.

    Oh, and as a note, I do like that this example comes from what seems to be a clearly left-leaning source. I often struggle to explain to well-meaning progressive Americans that their systems of value and meaning are built from the exact same pieces as their conservatives’ and in many cases approximate those more than the systems of progressives in other parts of the world. Which is true both ways, not just of Americans, but often not highlighted.


  • For sure. Good UX is not “simple” UX. Professional software doesn’t need to be flashy and clean, but it does need to be efficient and usable.

    Bad UX is bad UX, though.

    I bring up Blender because Blender vs Gimp is my favorite example of how FOSS can find a very functional alternative AND compete with the paid side with no compromises… but also of why it often doesn’t.

    Blender is for power users, but it’s well designed enough you can dabble with it or follow a tutorial and have fun doing it. Gimp will make you hate the very act of opening a file and trying to make the most basic crop on it even if you’re a Photoshop master.


  • Sure, I can agree with that.

    The problem with OSS tends to be that engineers are more willing to work on it than UX designers, and it’s quite rare for the designers to have the lead in that area. Forget convention; just on quality. There are exceptions (hey Blender!), but not many.

    More often than not, some paid upstart hits on a big innovation, that propagates, and sometimes it reaches open source alternatives before it reaches fossilized, standardized professional software.

    I do think there’s some value in having UX that makes it easier to jump back and forth, though. Especially if your positioning is “I’m like this paid thing, but free”. The easier you make it for the pros to pick up and play, the easier it is to carve out some of the market, and the more opportunities you give to newcomers learning on the free tool to migrate to the paid tool if the market demands it.


  • I mean… cool, but by that logic you’d want all your graphic designers to go back to being painters and artists doing posters with brushes again.

    That’s just not practical, and “it’s not efficient, but” is a massive dealbreaker for a whole lot of applications. Artisanal work commands a premium and is very cool, and if you can get away with making a living out of it I find that amazing.

    But sometimes somebody just needs a poster made or a shop logo or a trash bag removed from their wedding picture background. Industrial work at pace is important, and it’s the baseline for the field.

    I’m also not sure when this time before the standardization and consolidation of software is supposed to have been. Word replaced WordPerfect. Photoshop replaced the Corel suite. Premiere replaced (or at least displaced) Avid. It’s not like there weren’t industry standards before.

    Some companies still use proprietary stuff and train people on their in-house software; it’s doable. It’s just easier for most of the pack, working with multiple clients and vendors, to be using the most popular thing at any given time.




  • Sure. And I love finding better solutions, particularly when they’re for a thing I do for my own sake.

    But if you’re a newspaper that is ingesting hundreds or thousands of pictures a day from dozens of photographers, with half a dozen people editing all that input into a database that a dozen composers and web editors are using at the same time, then sometimes janky but universally familiar is a lot more valuable than “better at this thing in interesting ways”.

    It doesn’t mean you can’t displace a clunky, comfortable king of the hill. Adobe itself used to be pretty good at doing just that. Premiere used to be the shitty alternative kids used because it was easy to pirate before it became THE editing software for online video. The new batch of kids are probably defaulting to Resolve these days, so that one feels wobbly. Other times you just create a new function that didn’t exist and grow into space previously occupied by adjacent software, Canva-style.

    But if you see a piece of industry-standard software with a list of twenty alternatives broken down by application, skill level or subsets of downsides, the industry standard is probably not about to lose its spot to any of them anytime soon.