• 0 Posts
  • 125 Comments
Joined 1 year ago
Cake day: March 8th, 2024

  • Sure, after the fact when they were trying to offload stock. It’s certainly been easier to find cheap physical copies after launch, since digital games go in and out of sale at the same time everywhere.

    We’ll see how that goes this time. I do think it’ll depend a lot on whether people keep buying carts and on how many carts they print on each run. The fact that there is a price difference makes me think they won’t be as physical-heavy as last time, but there’s no way to know yet.


  • There is zero chance that Nintendo is giving you a worse deal than third party sellers. That’s just not a thing. Not only does the third party retailer need to keep a cut, but there is an extra step of logistics and shipping involved. This is the MSRP. You won’t see lower unless it’s leftover stock on sale.

    Admittedly, you should see a lot more of that with physical games, but who knows how small physical runs become in this weather. Physical carts could become Limited Run-style collector’s items.


  • I get why, but I don’t have to like it.

    It’s not widely talked about, partially because Nintendo doesn’t like it being talked about, but the Switch cart format makes very little sense in a lot of the same ways some of the PSP and Vita storage formats didn’t make sense. They are very expensive for the specs and quite small. I can only imagine this version of the carts is even more expensive, since they openly say they’re targeting faster speeds and standard SD cards are out of spec.

    But let me be clear, that is nuts. They are effectively selling you an M.2 SSD with each physical game purchase. It’s terrible value compared with a Blu-ray, for everyone involved.

    It’s still a weird step that both breaks with tradition and forcibly moves things towards digital distribution, much in the way that shipping optical drives as an optional add-on did for the PS5. Having a massive collection of physical Switch games, I can’t be on board, and it’s not the only additional expense they’re adding (paid back compat upgrades, Nintendo? WTF?).


  • You didn’t, I did. The starting models cap at 24 GB, but you can spec the biggest one up to 64 GB. I should have clicked through to the customization page before reporting what was available.

    That is still cheaper than a 5090, so it’s not that clear cut. I think it depends on what you’re trying to set up and how much money you’re willing to burn. Sometimes literally burn: the Mac will also be more power efficient than a honker of an Nvidia 90-class card.

    Honestly, all I have for recommendations is that I’d rather scale up than down. I mean, unless you also want to play kickass games at insane framerates with path tracing or something. Then go nuts with your big boy GPUs, who cares.

    But for LLM stuff strictly I’d start by repurposing what I have around, hitting a speed limit and then scaling up to maybe something with a lot of shared RAM (including a Mac Mini if you’re into those), and keep rinsing and repeating. I don’t know that I personally am in the market for AI-specific multi-thousand-dollar APUs with a hundred-plus gigs of RAM yet.


  • Thing is, you can trade off speed for quality. For coding support you can settle for Llama 3.2 or a smaller deepseek-r1 and still get most of what you need on a smaller GPU, then scale up to a bigger model that will run slower if you need something cleaner. I’ve had a small laptop with 16 GB of total memory and a 4060 mobile serving as a makeshift home server with an LLM and a few other things and… well, it’s not instant, but I can get the sort of thing you need out of it.

    Sure, if I’m digging in and want something faster I can run something else on my bigger PC’s GPU, but a lot of the time I don’t have to.

    Like I said below, though, I’m in the process of trying to move that to an Arc A770 with 16 GB of VRAM that I had just lying around because I saw it on sale for a couple hundred bucks and I needed a temporary GPU replacement for a smaller PC. I’ve tried running LLMs on it before and it’s not… super fast, but it’ll do what you want for 14B models just fine. That’s going to be your sweet spot on home GPUs anyway, anything larger than 16GB and you’re talking 3090, 4090 or 5090, pretty much exclusively.
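
    The 14B/16 GB “sweet spot” above is mostly just arithmetic on weight sizes. A minimal sketch, assuming a rough rule of thumb (the `estimate_vram_gb` helper and its 1.2× overhead factor for KV cache and runtime buffers are my own assumptions, not from any benchmark or library):

```python
# Rough rule-of-thumb estimate of the memory an LLM's weights need.
# estimate_vram_gb is a hypothetical helper; the 1.2x overhead factor
# (KV cache, runtime buffers) is an assumption, not a measured number.

def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead_factor: float = 1.2) -> float:
    """Approximate memory footprint of the weights, in GB."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9

# A 14B model at 4-bit quantization fits in a 16 GB card with room to spare...
print(round(estimate_vram_gb(14, 4), 1))   # 8.4
# ...while the same model at full 16-bit precision does not.
print(round(estimate_vram_gb(14, 16), 1))  # 33.6
```

    Which is roughly why quantized 14B-class models sit comfortably on a 16 GB card like the A770, while anything much bigger pushes you toward 24 GB territory.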


  • This is… mostly right, but I have to say, Macs with 16 gigs of shared memory aren’t all that; you can get plenty of other alternatives with similar memory configurations, although not as fast.

    A bunch of vendors are starting to lean into this by providing small, weaker PCs with a BIG pool of shared RAM. That new Framework desktop with an AMD APU specs up to 128 GB of shared memory, while the Mac Minis everybody is hyping up for this cap at 24 GB instead.

    I’d strongly recommend starting with a mid-sized GPU on a desktop PC. Intel ships the A770 with 16 GB of VRAM and the B580 with 12, and they’re both dirt cheap. You can still get a 3060 with 12 GB for similar prices, too. I’m not sure how they benchmark relative to each other on LLM tasks, but I’m sure one can look it up. Cheap as the entry-level Mac Mini is, all of those are cheaper if you already have a PC up and running, and the total amount of dedicated RAM you get is very comparable.




  • The flowchart is more that you may get lucky once or twice in your life, and work and preparation are meant to take advantage of those instances instead of screwing them up.

    Some people have their parents buy them hundreds of lucky chances, other people never get a shot.

    And then there’s a hell of a lot of Dunning–Kruger being interpreted as not having a chance. Meritocracy is a lie, but idiots thinking they have cosmic bad luck and society is against them when they’re actually idiots, assholes or both is also definitely a thing. The problem is it’s hard to separate the two, particularly if you’re the idiot/asshole.



  • I don’t even know that non-British places have such a brazen pledge of loyalty to the monarch in the first place.

    The one place where I’ve lived that was a constitutional monarchy didn’t have public figures swear an oath to the monarch; they just pledged to follow the Constitution (just looked it up: members of the government do mention the monarch in passing, members of parliament do not).

    The monarch does pledge to follow the Constitution when they become the monarch, though, so it’s mostly the other way around. At a glance, this seems to be a pretty standard formula.

    Brits and the people they’ve permanently damaged just seem particularly into the whole tradition of monarchy and haven’t really toned it down as much as other places. Not that other monarchies don’t have their zealots, but it’s a bit of a different role.



  • Man, I used broccoli all the time. Just chuck it in the oven for a bit, or stir fry it. No stink, delicious, crisp, bright, crunchy.

    These days I don’t cook as often and people who do just insist on boiling it or steaming it into mush, which is like dropping a stink bomb in the kitchen and turns it into puree. Broccoli is meant to be green, not brown, you guys.

    On the plus side, you can recycle that absolutely gross overdone broccoli into vegan burger patties and it’s actually good like that. Still, you have to get through the stink bomb part first.



  • I’m very torn, because people here are correctly going “nothing”, but then launching on long descriptions of what’s convenient for a server to have. I can’t tell if that is answering your question or if the correct answer is just “a server is any computer that hosts a service for some other computer to access over a network”. Both?

    Hell, technically a server doesn’t even need to be a PC at all. You can absolutely have a server and a client be just pieces of software hosted in the same physical machine. “Server” and “client” are just words for what thing is asking for the data and what thing is sending the data over.
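
    To illustrate the same-machine point above, here’s a minimal sketch (a hypothetical echo service, Python standard library only) where the “server” and the “client” are just two sockets living in one process:

```python
import socket
import threading

def run_server(sock: socket.socket) -> None:
    # The "server" is just the thing that answers requests.
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"pong:" + data)

# Bind to an ephemeral localhost port; no second computer involved.
server_sock = socket.socket()
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]
threading.Thread(target=run_server, args=(server_sock,), daemon=True).start()

# The "client" is just the thing that asks, in the same process.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)

print(reply.decode())  # pong:ping
```

    Swap the loopback address for a LAN or public IP and nothing conceptual changes; that’s the entire distinction.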


  • What in the absolute hell does that even mean? How does that work? There was an election, there were two candidates, one was Trump. There WAS a chance. An obvious choice, at that.

    Should Biden not have run in the first place? Obviously. Whether his approach to Gaza would have changed anything is debatable (he’d have been crucified regardless), but let’s say it would.

    It wasn’t Biden who had to “stand up against fascism”. It was the electorate. And they didn’t. You talk about it like it was not their choice, like it was an oopsie inflicted upon them by evil Dems and not a choice they made. A choice they continue to make, in fact. That is insane.

    As democracy gets eroded I will be more likely to cut them some slack, but as of now? Anybody that didn’t vote for Harris is a collaborationist.



  • See, that’s why this stuff is stumping people on all sides of the conversation.

    AI won’t do your job for you in three years, or probably ever.

    AI is useful now, though.

    It’s useful for specific stuff, when properly built into a process and mostly through more direct applications of machine learning than the firehose of generative AI corpos seem to think is a golden goose, but it’s useful.

    There’s a lot of chaff with the wheat here, a lot of people can’t tell the difference and everybody wants to have an opinion more than they want to spend time understanding what’s going on.

    So as always, a lot of people are going to lose a lot of money, a few people will make a ton of money and a bunch of narcissistic assholes will think that being the ones who won in that casino means they’re infallible and should run the world.


  • I feel like this conversation does a very good job of explaining why FOSS alternatives so often have terrible usability. “Not how most people would do it in a selfhost environment” is effectively “not how a tiny, teensy, borderline irrelevant proportion of users would do it”.

    Selfhosting is moving towards being accessible to the average user in some areas. Not coincidentally, I suspect, mostly in areas where someone is trying to make money on the side (see Home Assistant increasingly trying to upsell you into their cloud subscription and branded hardware, for instance). This idea that structuring the software for the average phone user as opposed to the average home server admin is “bad” or “complicated” is baffling to me.

    Oh, and for the record, no, that’s not the line for legality when it comes to watching the media I own. I am perfectly within my rights to access the files on my hard drive in any way I want. At least where I live. I make no promises for whatever dystopian crap is legal in the US. If anything there is a gray area in my using a specific type of drive to be able to rip commercial optical media that is theoretically DRM’d in ways that my drive just happens to ignore. But remotely accessing my legal backups in my local storage? Nah, even if I was more worried about piracy than I am I’d feel fine on those grounds.

    But also, copyright as currently designed is broken and not fit for purpose, and I suspect you don’t disagree and your pearl clutching here may have more to do with disliking Plex and not wanting to acknowledge an actually useful feature they provide than anything else. Maybe I’m reading too much into that.


  • I am very confused here. You seem to have slipped from arguing that it was difficult and complicated to arguing that it’s bad to be able to share content remotely because it’s a felony, which seems like a pretty big leap.

    For one thing, it’s not illegal and I do rip my own media. I will access it from my phone or my laptop remotely whenever I want, thank you very much.

    For another, and this has been my question all along, how is it possibly more difficult and complicated to have remote access ready to go than being “a DNS record away”? Most end users don’t even know what a DNS record is.

    And yes, not having (obvious) server configurations up front is transparent. That’s what I’m saying. It does mix at least two sources (their unavoidable, rather intrusive free streaming TV stuff and your library), but it doesn’t demand that you set it up. The entire idea is to not have to worry about whether it’s local content. Like I said, there are edge cases where that can lead to a subpar experience (mainly when it’s downsampling your stuff to route it the long way around without telling you), but from a UX perspective I do get prioritizing serving you the content over warning you of networking issues.

    I don’t know, man, I’m not saying you shouldn’t prefer Jellyfin. I wouldn’t know, I never used it long enough to have a particularly strong opinion. I just don’t get this approach where having the thing NOT surface a bunch of technical stuff up front reads as “complicated and difficult”. I just get hung up on that.