Honestly you're both wrong. RAM prices spiked speculatively, and they're going down for the same reason. Market people always want to argue in fundamentals, when in practice *ALL* the high frequency components of the signal are down to a bunch of traders trying to guess where it's going in the short term.
At best those guesses are informed by ground truth ("AI needs a lot of RAM!" "Sam cornered the market!" "TurboQuant needs less RAM!"), but they remain guesses, and even then you can't tell the difference between that and random motion.
Then note how wide the gray bands are. That makes it very easy to cherry-pick a few examples to present as "supporting evidence" that prices are doing whatever you want to believe they are doing.
It's showing $999 now, which seems about median for similarly-spec'd memory on Amazon. The cheapest slot-and-capacity-compatible equivalent I can find is around $570, even. So 3-5x increase, at minimum.
It's true that that's a high error bar. It's absolutely not true that the trend is ambiguous.
Can you cherry pick me a $141 kit, please? I mean, it's not an abstract question! I'd buy it from you right now if you had it or could get it, in whatever quantity you can source. No joke.
I’ll believe they’re going down when it doesn’t cost $550 for the $105 RAM I purchased 1 year ago. Yes, consumer prices lag commercial prices, yada yada; I think any hot takes are pointless until we see lower prices or far more convincing evidence that they're coming. When it costs basically a MacBook Neo for 32GB of DDR5 RAM, it’s hard to hear “RAM is coming down for sure”.
No, they signed a bunch of contracts for future deliveries. That's not a supply constraint. The factories making RAM continued operating and serving their existing deliveries, and in fact they still are.
Freshman economics would say that supply is fine and that prices shouldn't move. But they did anyway. And the reason is speculation.
I don't get it tbh. What market participants were speculating here? There aren't futures markets in RAM as far as I know, though I certainly don't know much. And the supply constraints appear to have been pretty real (though maybe not immediate) if eg. Valve was begging publicly for RAM consignments. Were there pure-play speculators filling warehouses with DDR5?
>There aren't futures markets in RAM as far I know
Sure there are. Not formally, but if you hold a contract for X units of future production, you can sell that contract to somebody else who wants those units more than you do.
It's still speculative that OpenAI won't go bankrupt and have to release it back to the market, but if it is holding them unfinished, that is a supply constraint on finished RAM chips even if not on wafer output.
Have we gotten any more word on the potential helium constraints that SK Hynix was making noise about after the strike on the helium plant in the Middle East that supplied 60% of S. Korea's helium? Because that could definitely put a kink in things, since SKH is one of the 3 remaining big DRAM producers.
It's a command line argument. The undeniably correct way to render it is with two minus signs[1] and absolutely not something non-ascii.
[1] Not strictly a hyphen, which has its own Unicode code point (0x2010) outside of ASCII. Unicode embraced the ambiguity by calling this code point (0x2d) "HYPHEN-MINUS" formally, but really its only unique typographic usage is to represent subtraction.
But... it's not more appropriate than an em dash for representing command line arguments? I don't see how either is any more incorrect than the other. There's a uniquely correct answer here and the em-dash is not it. Period.
It’s about the top-level comment’s horror that “--” was substituted with “an en dash, not even an em dash”. If you’re picking a substitution for “--”, en dash makes more sense. The comment you originally replied to had already agreed “that it should be left as a double hyphen”.
> If you’re picking a substitution for “--”, en dash makes more sense.
No, it doesn't? This seems like crazy talk to me, like "If you're picking a substitute for saffron, blood plasma makes more sense than monocrystalline silicon". Like, what?
It makes zero sense to substitute this at all. It's exactly what it says it is, the "--hard" command line option to "git reset", and you write it in exactly one way.
Nobody is confused or disagrees about the `--hard` part. It was a minor tangent about contexts where these ASCII substitutions are established, like LaTeX (`` -> “, '' -> ”, -- -> –, --- -> —, etc.)
> The undeniably correct way to render it is with two minus signs[1] and absolutely not something non-ascii.
> [1] Not strictly a hyphen, which has its own unicode point (0x2010) outside of ascii. Unicode embraced the ambiguity by calling this point (0x2d) "HYPHEN-MINUS" formally, but really its only unique typographic usage is to represent subtraction.
Strictly, it's, as you note, the hyphen-minus, and Unicode has separate, disambiguated code points for both hyphen (0x2010) and minus (0x2212); hyphen-minus has no "unique typographic usage".
I said that badly. What I meant was that ASCII 0x2d is, in fact, used as the only minus sign in basically all markup and presentation layers. (Mostly because math layout tends to go through its own interpreter -- what lives in "the unicode text" is always "markup" of some kind). The unicode value is ignored AFAIK, nothing emits it or interprets it specially. That is not true of the hyphen, which does get special treatment at the presentation layer in fonts and whatnot.
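For the record, the three code points under discussion are easy to check directly; a quick sketch using only the standard library:

```python
import unicodedata

# The three dash-like characters discussed above: ASCII hyphen-minus,
# and Unicode's disambiguated hyphen and minus sign.
for ch in ["\u002d", "\u2010", "\u2212"]:
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
# U+002D  HYPHEN-MINUS
# U+2010  HYPHEN
# U+2212  MINUS SIGN
```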
The "sed" expressions that power the title "cleanup" here do overshoot quite often. It ruins --long-command-arguments and it definitely also reuins cpp::namespaces. Quite curious why these obvious shortcomings are not being fixed.
High oil prices hurt the US economy much more than China anyway. We're vastly more dependent on shipping and transport and even more vastly less elastic with our fuel demand. The only US interests who would want this are domestic oil producers, who are a small fraction even of the Republican funding base.
> it's actually very hard to find a ship, even as large as an aircraft carrier, in the ocean
I just ran some googled numbers over my envelope, and I get that the Mediterranean sea (great-circle distance between Gibraltar and Beirut is 2300mi) is about 14,000x the bow-to-stern length (858') of the carrier.
That's... not that terribly difficult as an imaging problem. Just a very tractable number of well-resolved 12k phone camera images would be able to bullseye it.
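To make the envelope math concrete, here's the same arithmetic (a sketch using the distance and length figures quoted above):

```python
# Back-of-envelope: how many carrier-lengths span the Mediterranean?
MED_SPAN_FT = 2300 * 5280   # Gibraltar-to-Beirut great-circle distance, in feet
CARRIER_FT = 858            # bow-to-stern length of the carrier, in feet

ratio = MED_SPAN_FT / CARRIER_FT
print(round(ratio))  # ~14,154, i.e. roughly 14,000 carrier-lengths
```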
Obviously there are technical problems to be solved, like how to get the phones into the stratosphere on a regular basis for coverage, and the annoyance of "clouds" blocking the view. So it's not a DIY project.
But it seems eminently doable to me. The barriers in place are definitely not that the "empty space is just too big". The globe is kinda small these days.
And you've defined a harder problem! Once you've found it once it's much easier to find in the future: it can only go so fast, and it's constrained to stay in relatively deep water.
> now the city has a bunch of more soulless condos and is horribly congested
The first bit is a taste thing; obviously lots of people view modern sprawl as "soulless" too.
But the latter point is just plain wrong. Dense housing IMPROVES traffic congestion and shortens commutes, always, everywhere, markedly. And it's for a bleedingly obvious reason: pack people in closer together and they don't have to travel as far to get where they're going. QED.
What you're imagining is some kind of fantasy hometown, which never increased in population and whose economy never developed. I mean, it's true. Forgotten ghost towns have very little traffic and quirky soulful architecture, cf. Detroit. Everyone agrees that's a bad thing, though.
>But the latter point is just plain wrong. Dense housing IMPROVES traffic congestion and shortens commutes, always, everywhere, markedly. And it's for a bleedingly obvious reason: pack people in closer together and they don't have to travel as far to get where they're going. QED.
You are conflating things: adding more people to an area increases congestion, period. Having dense housing vs. not-dense housing is better for congestion IF the people are already there.
>What you're imagining is some kind of fantasy hometown, which never increased in population and whose economy never developed. I mean, it's true. Forgotten ghost towns have very little traffic and quirky soulful architecture,
It is a highly desirable area, there is no issue with the economy, and it will continue to be desirable if we don't destroy it. The "growth always good" crowd is pretty nuts in their views.
> adding more people to an area increases congestion, period
Yes, but so what? That's tautological. "Adding more people" isn't an independent variable, it's the economic ground truth over which we're trying to optimize.
The point is that if you need to build N units of housing to match your M added economic activity, building them denser leads to less congestion.
I mean, duh. This really isn't a complicated idea.
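A toy sketch of the scaling behind that claim (the function, the uniform-density assumption, and the numbers are all made up for illustration):

```python
import math

# Toy model: N residents spread at uniform density d (people per sq. mile).
# The developed footprint is N/d square miles, so typical trip lengths
# scale with its linear extent, sqrt(N/d): denser means shorter trips.
def typical_trip_miles(residents: int, density_per_sqmi: float) -> float:
    return math.sqrt(residents / density_per_sqmi)

sprawl = typical_trip_miles(100_000, 1_000)    # town ~10 miles across
dense = typical_trip_miles(100_000, 10_000)    # same people, ~3.2 miles across
```

Same population either way; the dense layout just puts destinations closer together.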
Again, you're imagining a single community divorced from inconvenient ideas like "population growth" or "economic development" (and even going so far as to conflate those with "destruction").
Well, sorry. It's desirable because it's developing. You don't get to change the minds of all the people that want to live there, all you can do is help them decide where to live.
>Well, sorry. It's desirable because it's developing
No... it's not... what an incredibly naive take. That logic just writes off every nice small town in a beautiful location. "Who cares about keeping things beautiful, amiright??"
So, just to point it out: people don't get violent and criminal magically because they made a bet. They get violent and criminal to backstop a bet they can't cover. The story here isn't that horrible criminals are using Polymarket. It's that Polymarket bettors are overleveraged, and at the margin some of them turn to crime to avoid losing their shirts.
We've all been looking around for the trigger for the market-crash-we-all-know-is-coming. Seems like "too much betting on a stupid war of choice" is just dumb enough to fit the timeline we've been trapped in. Very on-brand.
In other news: I'm almost entirely out of volatiles in my own portfolio right now. Cash and bonds until this pops. Frankly the chances that today will be the day[1] are about as high as they've ever been.
[1] Trump, sigh, basically went on camera and capitulated, telling the world that there is no plan, the US doesn't have the capability to ensure trade through Hormuz and that Iran will deny access until Iran decides otherwise. Markets don't like uncertainty, but they really, really hate losing wars.
So... no, it's not? You're saying everyone who makes a bet on anything is doing so compulsively? Literally everyone has bet on something. The absolutely overwhelming majority of "bets" placed (via whatever definition you want to give them) are basically benign and don't reflect mental illness.
But even so, you're missing my point: even compulsive gamblers don't, as a general rule, resort to criminal extortion to cover their losses. The interpretation here isn't about the psychology of the criminals; that part is sort of speciously true.
It's that the fact that "regular bettors" become "criminals", and are doing so at scale, is a proxy measurement for the amount of leverage in the system.
Gambling is bad anyway because it increases the wealth gap. And wealth is increasingly used to take away wealth from the less fortunate. (See e.g. housing market, where price pressure is caused by wealth).
This is said with very high authority, and nothing whatsoever to back it up. Sure, not all, nor even the majority, nor even the plurality or a large minority of gamblers resort to criminal behavior.
But what evidence do you have that only over-leveraged gamblers resort to criminal behavior? Why do you think that some rich person who bet, say, $1 million they can actually afford will not still seek to recoup their investment, especially if it only takes some bribes and threats?
> Why do you think that some rich person who bet, say, $1 million they can actually afford will not still seek to recoup their investment, especially if it only takes some bribes and threats?
Because "only bribes and threats" are crimes for which people go to jail, and most "rich people" in the west, even in our authoritarian corruption hellhole timeline, are unwilling to engage in that nonsense because the benefits don't outweigh the risks.
Do I get to demand you cite evidence here, too? Has a wealthy person ever been caught in criminal extortion trying to goose a losing position that they could cover? I don't think that's ever happened, honestly.
I mean, yeah, it's my opinion. My gut says that the "bro" markets are all overleveraged right now, there aren't any easy winning positions at the moment (even AI stock valuations seem to have topped), and now the loans are coming due. Something's going to pop, and we're all looking for proxy measurements. This is one.
Well, the Epstein files prove quite clearly that there exist rich people who perform blatantly illegal acts that can put them in jail for a looooong time, even when they don't stand to lose any money whatsoever by not committing said crimes. And they also show that said rich people generally don't face any legal consequences even when their crimes become public knowledge.
So any argument that starts from the assumption that rich people don't commit crimes for relatively low gains, and/or that they would be caught and put in jail if they did commit crimes, is obviously false.
I think the Epstein files even have specific examples of blackmail among said rich people (e.g. Epstein's letter draft to Bill Gates).
Sigh. I didn't say the wealthy don't commit crimes. I said the wealthy don't commit crimes to avoid paying routine investment losses.
Actually what I really said is that no one does this, because it's insane. So I therefore infer that the people doing this are looking at losses that are not routine, they're faced with bets they can't cover.
You're claiming that the wealthy don't value their money enough to commit crimes for them, while knowing that they value their sex drives enough to do so. I don't see how this is a tenable position.
People routinely commit crimes for money, rich and poor alike, often for relatively irrelevant sums - and very often for money they don't even have yet. The incentive to commit crimes to prevent losses is even higher, given the well established loss aversion bias in all people.
And we don't even have to discuss losses. Many people commit crimes to get money quickly, from murder to insider trading to insurance fraud. If you agree that many people would be willing to kill for a few thousand or million dollars, you have to admit they'd be willing to threaten and blackmail a newspaper editor or production crew to try to fix a bet - especially when the internet brings them anonymity, and even if they bet a small sum that they wouldn't even care to lose.
If you don't believe this, try to go to a betting place in a poorer area and offer 1000:1 odds that no one punches you in the face hard enough to break your nose (a crime which could easily land whoever does this in prison). According to you, as long as you don't allow anyone to bet more than, say, $1 on this, it should be a very safe bet for you, surely no one would be insane to risk prison time for losing just $1, right?
That's not correctly stated. "Private Credit" is defined as non-bank lending. Banks are doing "public" lending in the sense of being regulated. Private lending is any sort of financial instrument issued outside of those guard rails.
It's generally felt to be risky and volatile, but useful. Basically, it's never illegal just to hand your friend $20 even if the government isn't watching over the process to make sure you don't get scammed. This is the same thing at scale.
It is. (EDIT: It's a mixed bag. OP was correctly calling out a definitional error.)
Banks have loaned $300bn mostly to private-credit firms. Those firms then compete with the banks to do non-bank lending. It's a weird rabbit hole and I'm grumpy after a cancelled flight, but it feels like I'm in the middle of a Matt Levine writeup.
> I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine.
But... you can do the same exercise with a $350 windows thing. Everyone knows you can do "real dev work" on it, because "real dev work" isn't a performance case anymore, hasn't been for like a decade now, and anyone who says otherwise is just a snob wanting an excuse to expense a $4k designer fashion accessory.
IMHO the important questions to answer are business side: will this displace sales of $350 windows machines or not, and (critically) will it displace sales of $1.3k Airs?
HN always wants to talk about the technical stuff, but the technical stuff here isn't really interesting. The MacBook Neo is indeed the best laptop you can get for $6-700.
But that's a weird price point in the market right now, as it underperforms the $1k "business laptops" (to avoid cannibalizing Air sales) and sits well above the "value laptop" price range.
No, you can't do real work on a $350 windows machine. No way such a setup is suitable for anything beyond browsing a tab or two and connecting to servers using SSH.
And the whole shittiness of the experience will distract you from attempting real work: the horrible touchpad, the bad screen, the forced Windows updates when you're trying to start the machine to do something urgent, ads in Windows, the lack of proper programmability of Windows (unless you use WSL)... Add the fact that the toy is likely to break in a year or two. These issues exist on far more expensive Windows machines, let alone a $350 machine.
Leaving Windows machines and OS behind for more than a decade has been a continuing breath of fresh air. I have several issues with the Apple devices and macOS (as I have with Linux too), but on the whole they are far better than Windows. The only good thing about Windows that I miss on Macs is the file explorer and window management, not sure why Apple stubbornly refuses to copy those.
A lot of $350-ish Windows machines also don’t have SSDs but instead eMMC storage, which is dog slow and will make modern SSD-mandatory Windows feel even more awful to use.
If Windows/Linux/x86 is non-negotiable and that’s your budget, I would never in a million years recommend anything brand new. This is when you go pick up a $350 used midrange ThinkPad on eBay. It won’t outperform a Neo in terms of CPU and battery life but I guarantee it’ll be a better experience than the garbage routinely sold at this price point.
Of course you can. You can do real work on an $80 Amazon Fire. Yes, some things will be potentially impossible or frustrating but that's also true of the MacBook Neo, just a bit higher of a bar. A lot of this also depends on your definition of "real work".
$350 USD can get you a decent laptop with an SSD, 16GB RAM and something like an Intel N100 or N95. And they're pretty comparable to a decent Intel Skylake CPU, which is still pretty usable.
Yes, the Neo has a faster CPU but it also has less RAM and less storage and costs more and has less ports. Besides ray traced games what can the Neo do that the others can't? They'll take longer but they'll get there.
And if you're willing to go used? That $350 goes a lot further.
> Yes, the Neo has a faster CPU but it also has less RAM and less storage and costs more and has less ports.
8GB on Apple Silicon is far better than 16 GB on Wintel, and I don't even trust the quality of 16GB of RAM on a bottom-of-the-barrel Windows machine.
Would you prefer a machine that is still good 7 years from now with less ports, or one with more ports that you have to replace in 2 years? Yes it is more expensive now, but over 7 years it is an absolute bargain.
16 GB physical RAM is just better. Apple isn't magic. Gimme a break. Both devices have SSDs for fast swapping and have RAM compression. You can't spin up a VM that has 8GB RAM on the Neo, you can't load a large spreadsheet or do a decently sized digital painting. I could maybe buy a claim that 8GB is better on Mac than 8GB on Windows.
Why would you have to replace it in 2 years? How do we know Apple will even be offering updates to Neo in 7 years? Will 8GB still be usable in 7 years really? 8GB is barely on the fence already.
I wouldn't be surprised if Apple drops the Neo from software support in less than 7 years.
The ThinkBook 14 Gen 6 at Costco for $380 has a single thread passmark score of 2800. The laptop I use to develop most of my SaaS products, with IDEs and claude open etc, has a score of 2000. I run Linux, but win10 iot runs fine on it too.
> No, you can't do real work on a $350 windows machine.
Sigh. I mean, even absent the obvious answers[1], that's just wrong anyway. You're being a snob. Want to run WSL? Run WSL. Want to run vscode natively? Ditto. Put it on a cheap TV and run your graphical layout and 3D modelling work. I mean, obviously it does all that stuff. OBVIOUSLY, because that stuff is all cheap and easy.
All the complaining you're doing is about preference, not capability. You're being a snob. Which is hardly weird, we're all snobs about something.
But snobs aren't going to buy the Neo either. Again, the business question here is whether the $350 junk users can be convinced to be snobs for $600.
[1] "Put Linux on it", "All of your stuff is in the cloud anyway", "It's still a thousand times faster than the machine on which I did my best work", etc...
You mean that machine from 30 years ago that was running 30 year old software that has nothing in common with today’s development? And how well does Linux run on 4GB?
That's a 16G windows box which will happily run multiple VMs for whatever your deployment environment is, something the Neo is actually going to struggle with. The Jasper Lake CPU is indeed awfully slow, but again for routine "dev" tasks that's just not a limit.
You would obviously refuse out of taste, but if you were actually forced to use this machine to do your job... you absolutely could.