root_axis's comments

I basically always start with digital; if the book is good, I buy a physical copy for my shelf.

I do something similar - but I'm quite picky about which books I buy due to limited physical space.

There will be no prosecutions. Even if there's a situation where Dems regain power, they don't have the political capital or efficacy to prosecute.

Like how assiduously Obama went after the Bush Jr. administration.

...and how decisively Trump was prosecuted for the January 6th attempted ~coup~ tourism, and how thoroughly the Epstein child abuse ring was dismantled, and...

Yes, the only chance the US has going forward is to primary all current incumbents and hold both parties' leadership accountable for complicity in treason.


Even that won't matter. The problem isn't the elected officials, the problem is that most of the country doesn't care either way.

> I think he's finally gone too far.

He'll be fine. It should be pretty clear by now that 40% of the country prefers Trump regardless of policy.


It's fallen below 40% for the first time, per Nate Silver, and that poll was taken before this story came out.

If he follows through on his threat to destroy all of Iran's petroleum infrastructure, fuel prices will rise to unprecedented levels and remain there for a very long time. He will not be able to blame anyone else. People will plaster gas pumps with "I did that!" stickers, only with Trump's picture this time, rather than Biden's.

And it still won't force Iran to open the Strait. He has no good options there. Iran is second only to Russia when it comes to shrugging off staggering losses in wartime. Trump cannot force the Iranians to do much of anything without either invading them or nuking them. If he does the former, the resulting carnage will cost him his remaining support among Republicans at all levels including the MAGA faithful. If he does the latter, he's definitely finished.


Approval rating is different from preference, which is what the OP was talking about. He may have only a 40% approval rating, but if the approval rating of a putative Democratic opponent is lower, he'd still win an election.

Of course he (supposedly) can't be on the ballot any more, but a midterm election is always seen as a referendum on the President. If Democrats win in November, they gain some power to hamper his ability to do... whatever it is people voted for him to do.

A 40% approval rating for the President doesn't bode especially well for him in November. But "Democrats in Congress" have an 18% approval rating, so voters aren't thrilled with them, either.


Nuking won't accomplish anything anyway. There's no obvious target that would "defeat" Iran given the seemingly decentralized command structure, and using nuclear weapons anywhere near the Strait would render it unusable.

>the resulting carnage will cost him his remaining support among Republicans at all levels *including the MAGA faithful*

Doubt


It will cost him some of his remaining support among the MAGA faithful. Some of them are just in love with Trump, or at least with the image Trump presents. Some are in love with Trump's (stated) policies, like "America first" and "no new wars".

And even of those who are in love with Trump's image, this may tarnish the image enough for some of them to fall out of love with it.

It won't cost him all of MAGA. But it will cost him some.


I mean, even Ukraine - a white, Christian, European country - didn't garner much MAGA support, and Iran are "brown Muslims" and the "enemy". I feel it won't have much impact except on the truly ideologically antiwar; most MAGAs otherwise seem to flit between contradictory opinions without much difficulty, especially if Trump said it. It will cost him some of the support, yes, but I think that's because of people like Tucker Carlson.

To clarify my point above, I meant "the resulting carnage" among America's armed forces, not Iran's.

MAGA obviously doesn't care about carnage that we dole out to the Iranians, including Iranian children, but they presumably will object to a tide of body bags arising from the actions of a President who promised them "No more foreign wars."


Seems like a trivially simple verification to defeat.

You can, by definition, defeat any client-side verification if you know what verification is being run.

Why can't you allow typing and just consume the state of the text input as the initial state of the JS logic?

How you type is also part of the signal.

Then track that data and upload it when you can make the request.
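Something like this rough sketch would do it (hypothetical - the element id, endpoint, and names are invented, but nothing here is exotic):

    // Record per-key timing while the user types, then attach the
    // trace when the gated request is finally made.
    interface KeyTraceEvent {
      key: string;
      kind: "down" | "up";
      t: number; // ms since recording started
    }

    const trace: KeyTraceEvent[] = [];
    const t0 = performance.now();
    const input = document.querySelector<HTMLInputElement>("#query")!;

    input.addEventListener("keydown", (e) =>
      trace.push({ key: e.key, kind: "down", t: performance.now() - t0 })
    );
    input.addEventListener("keyup", (e) =>
      trace.push({ key: e.key, kind: "up", t: performance.now() - t0 })
    );

    // Later, once the client-side check allows the request:
    async function sendWithTrace() {
      await fetch("/api/search", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query: input.value, trace }),
      });
    }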

That's so far into the future that we can cross that bridge when we come to it.

You really need to look into what people are doing with prosthetic eyes.

Here's a dude from 3 years ago adding a flashlight: https://www.reddit.com/r/interestingasfuck/comments/yblzi4/g...

And I'm pretty sure I saw one who added a laser to theirs for raves, but can't find the link :)

You can buy very very tiny cameras today off the shelf, the main problem would be just packaging either a storage medium or wireless transfer capability + power inside the eye. With government-level budgets it's doable, possibly even by a skilled maker with resources.


AirPods knockoffs are mass-produced. They contain power and RF comms in a package that's significantly smaller than an eye. The only problem I, as a half-skilled home-lab owner, would have with a prosthetic eye camera is how to not ruin the source prosthetic. Which is trivial - just buy a dozen and practice.

It's so far into the future that it overflows the temporal coordinates and is actually a few years into the past now.

> so far into the future

Idk, I think this is like, maybe 5 years in the future


On the audio side, it's not a stretch to imagine cochlear implants (or hearing aids) having an undetectable recording ability.

AFAIK some wireless buds can work as better hearing aids; they just don't have the medical device label to officially perform that function.


The "it's just math" argument may not be technically rigorous, but it's directionally correct. The unstated reasoning invites us to consider why this particular math would be conscious, but not many other forms of math all around us.

First, it seems you've shifted from "intelligent" to "conscious". "These math operations produce consciousness" is different from "these operations produce intelligence".

Second, "it's just math" doesn't mean literally "it's a branch of algebra". It means "it's a computable function". So it can be relevant to the discussion only if you think that intelligence is somehow non-computable, and therefore that there are non-computable processes going on in our brain. Otherwise it's a perfectly pointless remark.


The person you replied to isn't the one who switched to "conscious".

>non-computable

Something like 70-80% of all humans believe in a soul or spirit, and of the remainder, many are unsure whether human-like intelligence can be produced by computable processes.

So it wouldn’t be surprising that the OP does think there are non-computable processes going on in the brain.


> The person you replied to isn’t the one that switched to conscious

Yes he is; the message he replied to is about intelligence:

>> Intelligence (and probably consciousness) is an emergent feature

> Something like 70-80% of all humans believe in a soul or spirit

Then they should say "I believe intelligence only comes from the soul" or "I believe intelligence is not computable", if that's what they mean. "It's just numbers" as an argument is either incomplete or not entirely honest.


> Yes he is, the message he replies to is about intelligence

No, I'm not. The comment I replied to is the one that mentioned consciousness.


> First, it seems you've shifted from "intelligent" to "conscious".

Not true. Take a second look.


> Computers have been running thousands of times slower than they should be for decades

I've been hearing this complaint for decades and I'll never understand it. The suggestion seems completely at odds with my own experience. Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

I remember a time when I could see the screen repaint after minimizing a window, or wait 3 minutes for the OS to boot, or wait 30 minutes to install a 600MB video game from local media. My M2 Air with 16GB of memory only has to reboot for updates; I haphazardly open 100 browser tabs, run Spotify, Slack, an IDE, build whatever project I'm working on, and the machine occasionally gets warm. Everything works fine, I never have performance issues. My Linux machines, gaming PC, and phone feel just as snappy. It feels to me that we are living in a golden age of computer performance.


I think the best example is in iOS. On old iOS versions, keyboard responsiveness took precedence over everything, no matter what. If you touched the keyboard, it would respond with an animation indicating what you were doing. The app itself might be frozen, but the self-contained keyboard process would carry on, letting you know the app you were using was a buggy mess.

Now in iOS 26, you can just be typing in Notes or the Safari address bar, for example, and the keyboard will randomly lag behind and freeze, likely because it is waiting on some autocomplete task running in the keyboard process itself. And this is on top-of-the-line, modern hardware.

A lot of the fundamentals that were focused on in the past - ensuring responsiveness to user input was never lost - have themselves become lost. And lost for no real good reason, other than lazy development practices, unnecessary abstraction layers, and other modern developer conveniences.


Yeah, long ago when I was doing some iOS development, I remember Apple UX responsiveness mantras like “don’t block the main thread”, as the main thread is the thing responsible for making app UIs snappy even when something is happening.

Nowadays it seems like half of Apple’s own software blocks on the main thread; like you said, things like the keyboard lock up for no reason. God forbid you try to paste too much text into a Note - the paste will crawl to a halt. Or, on my M4 Max MacBook (128GB RAM, 8TB SSD, Photos library with all originals saved locally), I press cmd-R to rotate an image - rotating a fully local image can sometimes take >10 seconds while showing a blocking “Rotating Image…” UI. It’s insane how low the bar has dropped for Apple software.
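The rule isn't Apple-specific, either. Here's a minimal browser-side sketch of the same principle (the heavy loop is a stand-in for real work; the element id is invented): hand the expensive part to a worker, and let the main thread receive only the finished result.

    // All heavy work happens in a worker thread.
    const workerSource = `
      onmessage = (e) => {
        let sum = 0;                          // stand-in for expensive work
        for (let i = 0; i < e.data; i++) sum += i;
        postMessage(sum);
      };
    `;
    const worker = new Worker(
      URL.createObjectURL(new Blob([workerSource], { type: "text/javascript" }))
    );

    worker.onmessage = (e) => {
      // Back on the main thread: a cheap UI update only.
      document.querySelector("#result")!.textContent = String(e.data);
    };

    // Kicking this off never blocks typing, scrolling, or animations.
    worker.postMessage(1e9);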


This trend was obvious when they started removing physical buttons. My thought was: man, these people put so much faith in software.

My M4 Max 128GB ... 90% of the time it's like you say.

10% of the time, WindowServer takes off and spends 150% CPU. Or I develop keystroke lag. Or I can't get a terminal open because Time Machine has the backup volume in a half-mounted state.

It's thousands of times faster than the Ultra 1 that was once on my desk, and I can certainly run workloads that fundamentally take thousands of times more cycles. But a greater proportion of this machine's speed is spent on the UI, and responsiveness doesn't always beat what I had 30 years ago.


Or contactsd lol

Spotlight doesn’t make sense either... caches get evicted, but there’s no logic that prevents it from building them back up immediately.

Log processes are fine, but they should never be able to use 100% / run at the same priority (CPU+IO).


Ok. Today we have multi-GHz processors, with multiple cores at that.

Photons travel about 1 foot per nanosecond ... so the CPU can execute MANY instructions between the time photons leave your screen and the time they reach your eyes.

Now, on Windows, start Word (on a Mac, start Writer) ... come on ... I'll wait.

Still with me? Don't blame the SSD - reload it again, from the cache this time.

Weep.


Not sure what you're getting at. MS Word, full load to ready state after a macOS reboot, takes ~2 seconds on my M1 Mac. If I close and re-open it (so it's in the fs cache) it takes ~1 second.

You and the sibling comment author just never experienced a truly responsive UI.

That is one where the reaction follows the action within a single frame. EDIT: and a frame is 1/60 s, that is 16.(6) ms. I feel bad that I have to mention this basic fact.

This was possible on 1980s hardware. I witnessed that, I used that. Why is it not possible now?


I've used 1980s hardware. In the 80s. And I've used UNIX and HP/Sun/SGI/etc hardware since the 90s. Not only was it not "truly responsive", nothing opened in a "single frame" (talking about X Windows). It took way longer than 1-2 seconds to open a browser on a blank page, for example, and you could watch many programs slowly drawing themselves.

And I did. And it did. Like the Amiga, even the 500 models.

I do not doubt X was horrible from that POV. I remember R5. That is not what I meant.

edit: there were no web browsers back then. the effin "folder browser" opens slower on my xfce4 than the same thing in an A1200 emulator in a window next to it. this is sad.


Probably rose-tinted memories. Here are actual Amiga 500 speeds:

https://www.youtube.com/watch?v=Tl6092fMkZY

Not only does it take a second just to redraw a moved window (with mid-way frames and flashing in between), but opening a tiny program is slow and shows the "zzz" busy indicator.


Base model M4 Mac Mini -- takes 2 seconds to load Word (ready to type) without it being cached. Less than 1 second if I quit it completely and launch again, which I assume is because it's cached in RAM.

> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

This very much depends on what hardware you have and what you're doing on it (how much spare capacity you have).

Back in university I had a Techbite Zin 2: a Celeron N3350 and 4 GB of LPDDR4. It was affordable for me as a student (while I also had a PC in the dorm), the keyboard was great, and it worked out nicely for note-taking and some web browsing when visiting my parents in the countryside.

At the same time, the OS made a world of difference, and the machine was anything but fast. Windows was pretty much unusable, and it was the kind of hardware where you started to wonder whether you really need XFCE or whether LXDE would be enough.

I think both of the statements can be true: Wirth's law holds and computers run way, way slower than they should due to bad software... and normally you don't really feel it, because we throw enough hardware at the problem to be able to ignore it.

It's largely the same as with modern video game graphics and engines like UE5, where we are now seeing horrible performance across the board that mainstream hardware often can't make up for, so devs reach for upscaling and framegen as something they demand you use (e.g. Borderlands 4), instead of just something for mobile gaming.

It's also like running ESLint and Prettier on your project and having a full build-and-format iteration take like 2 minutes without cache (though faster with cache) - and then you install Oxlint and Oxfmt and are surprised to find that the whole codebase takes SECONDS. Maybe the "rewrite it in Rust" folks had a point. Bad code in Rust and similar languages will still run badly, but a fast runtime will make good code fly.

I could also probably compare the old Skype against modern Teams, or probably any split between the pre-Electron and modern day world.

Note: runtime in the loose sense - compiled native executables, vs the kind that also have GC, vs something like the JVM and .NET, vs other interpreters like Python and Ruby, and so on. Idk what you'd call it more precisely; execution model?


> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

The modern throughput is far higher. However, what some people mean when they talk about "slower" is latency - the snappiness that characterized early microcomputer systems. That has definitely gotten worse, in an empirically measurable fashion.

Dan Luu's article explains this very well [1].

It is difficult to go through that lived experience of low latency today, because you don't appreciate it until you've lived with it for years. Few people any longer have access to an Apple ][ rig with a composite monitor for years on end. The hackers who experienced that low latency never forgot it, because the responsiveness feels like a fluid extension of your thoughts in a way higher-latency systems cannot match.

[1] https://danluu.com/input-lag/
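You can get a crude in-browser approximation of the same measurement (Dan Luu measured end to end with a high-speed camera; this sketch only sees what the browser itself reports, so treat it as a lower bound):

    // Approximate keypress-to-paint latency. event.timeStamp shares a
    // clock with performance.now(); a double requestAnimationFrame
    // roughly lands just after the paint triggered by the keystroke.
    const samples: number[] = [];

    window.addEventListener("keydown", (e) => {
      const start = e.timeStamp;
      requestAnimationFrame(() => {
        requestAnimationFrame(() => {
          const ms = performance.now() - start;
          samples.push(ms);
          console.log(`input-to-paint: ~${ms.toFixed(1)} ms`);
        });
      });
    });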


I wonder if this ties into why I'm baffled by the increasing trend of adding fake delays (e.g. "view transitions"). It's maddening to me. It's generally not a masking/performance delay, either; I've recompiled a number of Android apps, for example, to remove these sorts of things, and some actions that previously took an entire second complete instantly after modification.
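The pattern, when you find it, tends to look something like this hypothetical sketch - an artificial minimum duration wrapped around work that is already fast (the names here are invented):

    // The action may finish in ~10 ms, but the UI is held for minMs
    // so a transition animation can play out. Deleting the floor is
    // exactly the kind of edit that makes the action feel instant.
    async function withMinimumDuration<T>(work: Promise<T>, minMs: number): Promise<T> {
      const delay = new Promise<void>((resolve) => setTimeout(resolve, minMs));
      const [result] = await Promise.all([work, delay]);
      return result;
    }

    declare function fetchSettings(): Promise<unknown>; // invented stand-in

    const settings = await withMinimumDuration(fetchSettings(), 1000);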

Have you tried disabling animations in the System Settings? Some apps respond to that.

Ohhhh trust me, I have - assuming you mean "Disable animations". The three duration-scale developer settings too. Thank you for suggesting it, though, just in case.

Some apps do respect it, but sometimes it's hardcoded, and the OS settings don't seem to override it. Even the OS doesn't respect it in some cases, though I think it used to. Flutter apps? Forget about it.


A really annoying thing I've run into is that lots of libraries/frameworks/etc will have shortcuts to introduce this delay, to avoid "pop-in" of lazy-loaded stuff.

Like, yeah, pop-in looks a little weird, but suddenly APIs are making that one Mass Effect elevator into a first-class feature...


>Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

One analogy is that the distance between two places in the world hasn't changed, but we're not arriving significantly faster than we did once modern jetliners were introduced. There was a period of new technology followed by rapid incremental progress toward shorter travel times, until it leveled off.

However, the number of people able to consistently travel between more places in the world has continued to increase. New airports open regularly, and airliners have been optimized to fit more people, at the cost of passenger comfort.

Similarly, computers, operating systems, and their software aren't aligned in optimizing for user experience. Until a certain point, user interactions on macOS took highest priority, which is why a single- or dual-core Mac felt more responsive than today's machines do, despite the capabilities and total work capacity of new Macs being orders of magnitude higher.

So we're not really even asking for the equivalent of faster jet planes here, just wistfully remembering when we didn't need to arrive hours early to wait in lines and undress to get through security. Eventually all of us who remember the old era will be gone, and the next people will yearn for something that has changed from the experiences they shared.


Don't forget "the files are on my desk" and many other classics.

Doesn't Gmail opt people into 2FA automatically?
