Perhaps because the level of respect that Windows has for its users has dropped with each successive version?
Not to mention bloat: I have a keyboard with a dedicated calculator button. On a machine with a Core i5 something-or-other and an SSD, it takes about 2 seconds for the calculator to appear the first time I push that button. On the Core 2 Duo machine that preceded it, running XP from spinning rust, the calculator would appear instantly - certainly before I could release the button.
But also WinXP was the OS a lot of people used during their formative years - don't underestimate the power of nostalgia.
Also, for some people the very fact that Microsoft don't want you to would be reason enough!
Personally if I were into preserving old Windows versions I'd be putting my effort into Win2k SP4, since it's the last version that doesn't need activating.
(I did have to activate a Vista install recently - just a VM used to keep alive some legacy software whose own activation servers are but a distant memory. It's still possible, but you can't do it over the phone any more, and I couldn't find any way to do it without registering a Microsoft account.)
There are tools out there (like UMSKT) that can activate MS software from that era fully offline too. They cracked the cryptography used by the activation system and reimplemented the tool used for phone activation, so you can “activate by phone” using UMSKT instead of calling MS.
Your comment reminds me of that rule from baseball that says something about batters and hats, or maybe it was about helmets or something, it doesn't really matter though because the only point of this sports ball rambling is to distract you from noticing that my "nuh uh" has no substance. Did it work?
This is more than a bit out of place on HN in my experience. Please try to engage politely.
I’m not sure what I can say that will qualify as more than “nuh uh” to you, shy of getting a Core 2 Duo running with XP and the same keyboard as OP. That isn’t possible at the moment; is there anything else I could do?
I admit you got me mildly annoyed with the sports nonsense, sorry about that.
Anyway, you're talking about reaction time, which isn't actually relevant. The time between an action (pressing a button, or flipping a switch) and seeing the result happen isn't the same as the time it takes you to react to it. Flip a light switch: does the light turn off instantly, or does it take a full third of a second? I guarantee you can tell the difference. 300ms of latency is actually huge and easily perceptible, even if it's faster than you can react.
300 ms is a long time on a computer, definitely. It’s just that the autistic side of me has to speak up when it’s a wildly unrealistic glorification of the past.
Keypress duration is likely much less than 300 ms; the top Google result claims 77 ms on average, and that’s down and up.
I see it being in cache already as a sort of game playing, i.e. we can say anything is instant if we throw a cache in front of it. Am I missing something about caching that makes it reasonable? (I’m 37, so only 18 around that time, and wouldn’t have had the technical chops to understand whether it was normal for things to be in the disk cache after a cold boot)
Okay, let's say the cache is cold and you're on an old clunky spinning rust 5400 RPM hard drive. Do the math. How long will it take, worst case, for the platter to spin to where calc.exe is stored?
For a 5400 RPM drive, worst-case rotational latency is one full rotation: 5400/60 = 90 rev/sec, so ~11ms. Average is half that (~5.5ms). If you also need to seek (and in the worst case you certainly do - in practice nearly every access needs both a seek and a rotation), 2006-era datasheets show average seek around 11-12ms, with full-stroke seeks around 21-22ms. So worst case total access: ~33ms.
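Spelled out, using the figures above (the 22 ms full-stroke number is the datasheet figure, not something measured on this particular drive):

    # back-of-envelope worst-case access time for a 5400 RPM drive
    rpm = 5400
    rev_per_sec = rpm / 60                   # 90 revolutions per second
    full_rotation_ms = 1000 / rev_per_sec    # ~11.1 ms worst-case rotational latency
    avg_rotation_ms = full_rotation_ms / 2   # ~5.6 ms average
    full_stroke_seek_ms = 22                 # full-stroke seek, 2006-era datasheet figure
    worst_case_ms = full_rotation_ms + full_stroke_seek_ms
    print(round(worst_case_ms, 1))           # ~33.1 ms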
Tl;dr: reaction time. 300 ms is the golden rule for reaction speed, and apparently there was actually a sports medicine study that arrived at that number. I was surprised to see that: 300 ms comes up a lot in UX as the “threshold of perceptible delay”, but it was still surprising to see it turn up in sports medicine too.
I'm not sure why human reaction time is relevant here, since what I'm talking about isn't the time it takes me to respond to a stimulus but the time it takes the computer to respond to a stimulus.
I do still have both computers set up side-by-side (legacy data from an old business), and the keyboard in question was a Microsoft Comfort Curve 2000 (the calculator button wasn't a proper key; it was one of those squidgy extra keys so beloved of multimedia keyboards, so not as fast to operate as a proper key.)
Anyhow, the point (hyperbolic as it arguably was) wasn't about reaction time per se; it was about the older calculator app - and by extension much of the rest of the OS - being a much simpler and less bloated piece of software, and running it on faster-than-contemporaneous hardware makes for a sense of immediacy which is sorely lacking in today's world of web apps.
I'd be very interested to know to what that 300ms "threshold of perceptible delay" applies. You might not notice a window taking 300ms to open - but I'd be willing to bet that when you're highlighting text with the mouse or dragging a slider, you'd be very aware of the UI lagging by nearly 1/3 of a second.
This is a lot of words that say "yeah, I was hyperbolic, but it was directionally correct." I do appreciate the candor, but it's a bit late, as you can see by the text color of my comments. Many people do the same thing as you, no worries, I appreciate you validating my quixotic self-destructive work.
I'm sorry you're being downvoted - for the record I've upvoted since it's interesting, even if we disagree in some aspects.
Since I still have the machine in question here, and I'm now interested enough to try and get some rough measurements, I've just videoed it with my phone (30fps video) and done some frame counting, both from a cold boot with nothing cached, and also a repeated launch.
Firstly from a cold boot:
It's hard to tell exactly when the keypress registers, but I believe what I'm seeing is the key being pressed, two frames later the hourglass appears, two frames after that the calculator appears. (The TFT screen will likely be adding at least one frame lag, but let's ignore that for now.) So that's somewhere between 166 and 200ms for a cold launch.
If I close the app and repeat, there's now just one frame between keypress and hourglass, and just one more frame between hourglass and the app appearing, so now nearer 100ms.
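For anyone converting frame counts to time: at 30fps each captured frame is ~33ms, so the counts map roughly as follows (plus or minus a frame of uncertainty about exactly when the keypress landed within a frame interval):

    fps = 30
    frame_ms = 1000 / fps                    # ~33.3 ms per captured frame
    for frames in range(2, 7):
        print(frames, "frames ~=", round(frames * frame_ms), "ms")
    # 2 ~= 67 ms, 3 ~= 100 ms, 4 ~= 133 ms, 5 ~= 167 ms, 6 ~= 200 ms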
Looking at the videos my finger is off the key the first time the app appears, but not the second time - though if I made a special effort to release the key as quickly as possible I now think I could probably just about beat it.
300 ms is way longer than they budgeted; separately, I was alive then and it's a ridiculous claim - it takes the general bias we all have towards seeing the past through rose-colored glasses and pushes it farcically far.
I don't want to clutter too much - I'm already eating downvotes - so I'll just link:
On average consumer hardware at launch, 95 and XP were slow, memory-hungry bloat. In fact, everything that people say about Windows 11 now was even more true of Windows back then.
By the end of the life of Windows 95 and XP, hardware had overtaken them and Windows felt snappier.
There was a reason I stuck with Windows 2000 for years after the release of XP, and it wasn’t because I was too cheap to buy XP.
Yeah, no. Ask musicians using computers - 50 milliseconds of latency between sound and movement is generally considered unplayable, 20 milliseconds is tough, and below 10ms is usually where people stop being able to tell.
You’ve fallen into the common trap of conflating reaction time with observable alignment time.
Reactions are about responding to one-off events.
Whereas what you’re describing is about perception of events aligned to a regular interval.
For example, I wouldn’t react to a game of whack-a-mole at 50ms, nor that quickly to a hazard while driving either. But I absolutely can tell if a synth isn’t quantised correctly by as little as 50ms.
That’s because the latter isn’t a reaction. It’s a similar but different perception.
Pressing a key to trigger an action that you will then send additional input to is an entirely different sequence of events than whack-a-mole, where you are definitionally not triggering the events you need to respond to.
I'm not talking about latency (I don't fully agree with your statement there either, but I've covered that elsewhere). I'm talking about the GP's comparison of reactions vs musicians listening to unquantised pieces.
You simply cannot use musicians as proof that people have these superhuman reaction times.
But here we're talking about not being able to notice whether calc.exe opens in less than 300 milliseconds, not how fast we can react to it opening? It's the same thing with audio latency (and extremely infuriating when you're used to fast software where you can just start typing right after opening it, without having to insert a pause to cater to slowness).
No, it's not the same thing as music latency. For one thing, music is an audio event whereas UI is a visual event. We know that audio and visual stimuli operate differently.
And with music latency, you can hear where the latency happens in relation to the rest of the piece (be that rock, techno, or whatever style of music). You have a point of reference. This makes latency less of a reaction event and more of a placement event, i.e. you're not just reacting to the latency, you're noticing the offset relative to the rest of the music. And that adds significant context to perception.
This also ignores the point that musicians have to train themselves to hear this offset. It's like any advanced skill, from a golf swing to writing code: it takes practice to get good at it.
So it's not the same. I can understand why people think it might be. But when you actually investigate this properly, you can see why DJs and musicians appear to have supernatural senses vs regular reaction times. It's because they're not actually all that equivalent.
I've seen the same scenario - someone with limited vision, next to no feeling in his fingertips and an inability to build a mental model of the menu system on the TV (or actually the digi-box, since this was immediately after the digital TV switchover).
Losing the simplicity of channel-up / down buttons was quite simply the end of his unsupervised access to television.
Channel up/down doesn't scale to the amount of content available now. It was OK when there were maybe half a dozen broadcast stations you could choose from.
This is ahistorical. If you had cable, you had 100+ channels, and there was no difficulty in numbering them and navigating them through the channel up/down buttons. There weren't even only half a dozen broadcast stations in any city in the US at least since the 50s - you at least had ABC, NBC, CBS and PBS in VHF, and any number of local and small stations in UHF.
The thing that didn't scale was the new (weird, not sure why) latency in tuning in a channel after the DTV transition, and invasive OS smart features after that. Before these, you could check what was on 50 channels within 10 seconds; basically as fast as you could tap the + or - button and recognize whether something was worth watching; changing channels was mainly bound by the speed of human cognition. I think young people must be astounded when they watch movies or old TV shows where people flip through the channels at that speed habitually.
> new (weird, not sure why) latency in tuning in a channel after the DTV transition,
Because with analog signals the tuner just had to tune to the correct frequency and at the next vertical blank sync pulse on the video signal the display could begin drawing the picture.
With digital, the tuner has to tune to the correct frequency; then the digital decoder has to sync with the transport stream (fairly quick, as TS packets are small); then it has to start watching for a keyframe (because the intermediate frames only encode changes relative to a keyframe, so without one there's nothing complete to display). Depending upon the compression settings at the transmitter, keyframes might only be transmitted every few seconds, so there can be a multi-second wait for the next keyframe to arrive before the display can start drawing pictures.
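A rough latency budget shows why that adds up to seconds rather than milliseconds. The figures here are purely illustrative - the keyframe (GOP) interval in particular varies a lot between broadcasters:

    # illustrative digital channel-change latency budget (assumed figures)
    tune_rf_ms      = 100    # retune and lock the RF front end
    ts_sync_ms      = 50     # sync to the MPEG transport stream
    gop_interval_ms = 2000   # keyframe roughly every 2 s, broadcaster-dependent
    typical_ms = tune_rf_ms + ts_sync_ms + gop_interval_ms / 2   # land mid-GOP on average
    worst_ms   = tune_rf_ms + ts_sync_ms + gop_interval_ms       # just missed a keyframe
    print(typical_ms, worst_ms)    # ~1150 ms typical, ~2150 ms worst

Compare that with analogue, where the wait was essentially just the next vertical blank - a few tens of milliseconds at most.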
I still watch OTA DTV. Tuning is instant. Maybe it's slower if you are on cable and there's a few round-trip handshakes to authenticate your subscriber account.
I'm pretty sure there's a lot of round-tripping going on with the streaming services I use through my dongle. They're always slow to both start the app and to start any actual streaming.
That's only if you want to watch specific things; some people just turn it on for entertainment, and change channels to have a spin at the roulette wheel for something better.
The low latency is the reason why the PiStorm (Amiga CPU accelerator) project works so well on a Pi 2, 3 or 4. (Pi 5 is no longer suitable since the GPIO is now the other side of a PCI-E bus and thus suffers significantly higher latency than on previous models, despite being much faster in terms of throughput.)
Were you able to use the online activation system without a Microsoft account? I wasn't able to - though as you say, that account doesn't have to be tied to the license or an account on the machine being activated.
And of course that's exactly what they did with Coldfire - rounding off the inconvenient corners of the ISA to produce CPUs with lower power requirements that could run at higher clock speeds.
They did it with the 68030 before Coldfire. They discarded a number of things (e.g. addressing modes) that seemed like good ideas for the <=68020 but didn't end up being used in practice.
Unfortunately the choice isn't between sites with something like Anubis and sites with free and unencumbered access. The choice is between putting up with Anubis and the sites simply going away.
A web forum I read regularly has been playing whack-a-mole with LLM scrapers for much of this year, with multiple weeks-long periods where the swarm-of-locusts would make the site inaccessible to actual users.
The admins tried all manner of blocks, including ultimately banning entire countries' IP ranges, all to no avail.
The forum's continued existence depends on being able to hold off abusive crawlers. Having to see half-a-second of the Anubis splashscreen occasionally is a small price to pay for keeping it alive.
The scrapers will not attempt to discover and use an efficient representation. They will attempt to hit every URL they can discover on a site, and they'll do it at a rate of hundreds of hits per second, from enough IPs that each only requests at a rate of 1/minute. It's rude to talk down to people for not implementing a technique that you can't get scrapers to adopt, and for matching their investment in performance to their needs instead of accurately predicting years beforehand that traffic would dramatically change.
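To put rough numbers on that (hypothetical figures, just to illustrate the scale): if each IP keeps to a polite-looking one request per minute, the swarm still needs thousands of addresses to sustain hundreds of hits per second against the site.

    # hypothetical swarm sizing: aggregate rate vs per-IP rate
    target_hits_per_sec = 200        # what the site sees in total
    per_ip_hits_per_sec = 1 / 60     # each IP requests once per minute
    ips_needed = target_hits_per_sec / per_ip_hits_per_sec
    print(round(ips_needed))         # 12000 IPs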
I challenge you to take a critical look at the performance of things like PHPBB and see how even naive scraping brings commonly deployed server CPUs to their knees.
Web 2.0 was sites not having finished loading when you thought they had, buttons having a 1 in 20 chance of doing nothing when you click them, and the advent of "oops, something went wrong" being considered an acceptable error message.
It might be coincidence, but I've noticed the ads get slightly less obnoxious if I religiously abandon the video and close the browser tab any time the ad is more than 10 seconds long and unskippable. I'm sure they're monitoring closely to see what people will and won't tolerate.