They bought P.A. Semi, but it was for their design capability; they never had fabs anyway, and Apple still depends on TSMC and others for manufacturing chips. Apple building fabs to ensure a guaranteed supply of memory (or logic) chips would be an unprecedented level of vertical integration, even for them.
Operating a fab requires employing PhDs who are willing to work 8-hour shifts with no breaks (each removal of a bunny suit is an expensive exercise), and there's no reason to believe SpaceX is capable of hiring such people.
I made something similar a long time ago, partly as a challenge to see what could be done with just 2 KB of RAM [0]. It was possible to implement some very basic context switching between two "processes", pipes (okay, I only had a single pipe, and it only worked between certain commands), and a few other things, like some built-in games (pong, snake, and a breakout-style game, naturally). I didn't go as far as adding any filesystem functionality, though, and ultimately yours does feel more Unix-like overall, but it was a fun little project that taught me to treat every single byte as precious.
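Roughly the idea of that kind of cooperative "context switching", sketched in Python with generators standing in for processes (purely illustrative; the original ran in 2 KB of RAM on very different primitives, and all names here are made up):

```python
def proc(name, steps):
    """A toy 'process' that does a bit of work, then yields the CPU."""
    for i in range(steps):
        yield f"{name}:{i}"  # yield = voluntarily give up the CPU


def scheduler(procs):
    """Round-robin over runnable processes until all have exited."""
    trace = []
    while procs:
        p = procs.pop(0)
        try:
            trace.append(next(p))  # run p until its next yield
            procs.append(p)        # still runnable: back of the queue
        except StopIteration:
            pass                   # process exited; drop it
    return trace


print(scheduler([proc("A", 2), proc("B", 2)]))
# A and B interleave: ['A:0', 'B:0', 'A:1', 'B:1']
```

On a microcontroller you'd do the same thing by saving and restoring a couple of registers and a stack pointer per process, but the scheduling loop is the same shape.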
You don't get it. Your 400 sq ft apartment needs to be shrunk by a factor of 6 to have the same area as the Orion. Try living in an 8x8 foot square for a couple of weeks.
Not in a storm you can't! Granted, I didn't do ten days. But I was with two other people for close to a week and it was...fine. We're old friends. There were moments it got annoying. But it was never boring or restrictive. We just played games, drank, looked out of the portholes, cursed hangovers, and talked down the one person who occasionally wanted to call it quits.
My intuition would be that constant usage (so long as it stays within rated capacity, thermals, etc.) should generally cause less wear than the more frequent thermal cycling you'd expect from intermittent use, but maybe there's something else going on here too. I suppose it depends on what exactly caused the failures.
Either way, these are obviously being sold intentionally for non-gaming workloads, so it isn't a good argument to say they're just being (ab)used beyond what they were intended for...unless they really are being pushed past design limits, but given what these things cost I can't imagine anyone doing that willingly to a whole fleet of them.
But if everyone follows this advice, then everything just gets overwhelmed by "hustlers" (and their "shameless spam"), and collectively we're now all worse off because of it. It just turns into yet another tragedy of the commons situation.
I say this as someone who received a lot of great feedback and had some interesting interactions after posting about a project of mine using "Show HN" a few years ago. I didn't need to spam anything to get the attention, but I admit maybe I just got very lucky, or maybe there were just fewer posts to "compete" with at the time (this was before the recent write-everything-with-AI-and-launch-it-out-there craze).
Finally, I'm not making any moral judgments here, and if someone feels they need to do this to get the attention they want, then who am I to tell them otherwise. But we should be aware of what we're collectively giving up when we all tend to behave this way, even if it's the inevitable outcome.
The total size isn't what matters in this case but rather the total number of files/directories that need to be traversed (and their file sizes summed).
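A minimal sketch of why, assuming a plain recursive walk like `du` does (hypothetical function names): the work is one `stat` per entry, so a million tiny files cost far more to traverse than one huge file of the same total size.

```python
import os


def dir_usage(root):
    """Sum file sizes under root. The cost scales with the number of
    entries visited (one lstat each), not with the bytes they hold."""
    total = entries = 0
    for dirpath, dirnames, filenames in os.walk(root):
        entries += len(dirnames) + len(filenames)
        for name in filenames:
            try:
                total += os.lstat(os.path.join(dirpath, name)).st_size
            except OSError:
                pass  # file vanished or is unreadable; skip it
    return total, entries
```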
> I've seen claims of providers putting IPv6 behind NAT, so don't think full IPv6 acceptance will solve this problem.
I get annoyed even when what's offered is a single /64 prefix (rather than something like a /56 or even /60), but putting IPv6 behind NAT is just ridiculous.
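The arithmetic behind that annoyance, as a quick sketch (hypothetical helper name): a delegated prefix of length n contains 2^(64 - n) distinct /64s, and since SLAAC assumes a /64 per network segment, a single /64 leaves no room to subnet.

```python
def subnets_64(prefix_len):
    """Number of /64 networks available within an IPv6 prefix of the
    given length (prefix lengths longer than 64 can't hold a /64)."""
    if prefix_len > 64:
        return 0
    return 2 ** (64 - prefix_len)


print(subnets_64(56))  # 256 -- plenty for a home network
print(subnets_64(60))  # 16
print(subnets_64(64))  # 1  -- a single segment, nothing to subnet
```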
If that's really the case, I wish they would just come out and say it and spare the rest of us the burden of trying to debate such a decision on its technical merits. (Of course, I am aware that they owe me nothing here.)
Assuming this theory is true, then, what other GPLv3-licensed "core" software in the distro could be next on their list?
Maybe the thought is that there will be more pressure now on getting all the tests to pass, given the larger install base? It isn't a great way to push out software, but it's certainly a way to provide motivation. I'm personally more interested in whether the ultimate decision will be to leave these as the default coreutils implementation in the next Ubuntu LTS release (26.04) or to switch back (and for what reason).