man my laptop looks sadder and sadder by the moment at these prices. I remember when AMD was a “meh” chip but they really proved themselves over and over again since. I hope other chips fall in prices in response and more powerful systems get cheaper. The field’s been kinda stagnant more or less for a while.
Intel always sets the bar in quality (most of the time – they’ve made some mistakes) – but AMD has held its own on the consumer end of electronics. I used to be anti-AMD WAY back in the 386/486/586/Pentium/II days but I’ve softened my stance since then.
That said? I’ll still take a less powerful Intel over a higher-spec AMD.
I worked in a computer shop in the mid-1990s. AMDs were cheap (I’d install them in the budget builds) and they’d come back with more issues [overheating and the like].
But I stopped following the chip wars around the P4 days, when it seemed AMD was finally doing acceptable work.
So my knowledge is behind. Around the P4 days the focus had shifted to improving storage capabilities and memory capacity, and then to slapping on more and more cores. Speed was less of an issue at that point, so I kinda got bored.
There have definitely been changes over the past few years, but as people upgrade their systems less and less frequently, and play games longer and longer, there seems to be extended support for older systems.
For example, I got my nephew an ancient desktop when he was 6 years old. It was old then. He’s 11 going on 12 now. Now it’s a *really* old system.
I know it’s Intel: dual-core, 2.0GHz, only 2GB of memory, 32-bit Win7. An old system.
Yet all I had to do was give him a decent video card upgrade, and except for some of the latest games, he’s able to play anything he wants.
Now if this were the 90s? You’d get maybe two years out of your system before you pretty much had to throw it out if you wanted to play anything. It was a fast-moving time.
So, times are better in this way.
Yeah, the idea of using the same system for 9 years was once unthinkable. If I’d had the same system in 1999 that I had in 1990, nobody would have known how to use it.
I think Consoles being a ‘fixed set’ of hardware has REALLY helped stall progress too. If the latest XBox specs are equivalent to a computer made 5 years ago, then game makers will make PC games that work on computers from 5 years ago.
Before consoles became PC based, you didn’t see that. They were separate worlds.
Times are much better now for consumers.
and – I hate to admit it – I gotta give Gabe credit. Rip-off or not, Steam (http://steampowered.com) resurrecting ancient games and reselling them has been a wonderful thing in keeping the need to upgrade at bay.
At one time, I hated the business model of Steam. But through the years, I’ve grown to appreciate their role in “freezing” the need to upgrade systems. This allows more and more people to use computers longer and longer. A whole generation of kids whose parents wouldn’t be able to afford new systems can simply throw their old system at their kids, get them online, and they can play thousands of games AND be part of a community.
[of course that community is questionable at times, but that’s another tale]
We functionally hit the limit of Moore’s law around 2002.
I’ve seen smaller wire sizes since then, and yes, there’s been chip speed improvement, but it hasn’t been much at all.
That’s why they shifted focus to storage, memory capacity, and video: there was a lot of room to grow. Plus – networking! Amazing progress for a few years there.
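To put rough numbers on the clock-speed stall: a quick back-of-the-envelope sketch (the starting figures are approximate, from memory – a ~60 MHz Pentium in 1993, ~3 GHz P4s by 2003) of what doubling-every-two-years would have predicted, versus the plateau we actually got.

```python
# Rough illustration with approximate, from-memory numbers:
# clock speeds roughly doubled every ~2 years through the 1990s,
# then plateaued in the low GHz after the early 2000s.

def doubling_projection(start_mhz, years, period=2.0):
    """Project clock speed if doubling every `period` years continued."""
    return start_mhz * 2 ** (years / period)

# ~60 MHz Pentium in 1993 -> projected ~1920 MHz by 2003,
# which roughly matches the ~3 GHz P4s of that era.
print(doubling_projection(60, 10))   # 1920.0

# Had the pace continued another decade, 2013 chips would clock ~61 GHz;
# in reality they stayed around 3-4 GHz, with core counts growing instead.
print(doubling_projection(60, 20))   # 61440.0
```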
Example: In 2001/2, they FINALLY upgraded my system at work. What did I get?
I got this. 15 years ago. How different are those specs from today’s? Not all that much.
I <3 the progress that’s been made with GPUs. Now if only they can make them NOT melt the solder….
If I had a more modern system, I’d be messing with GPU stuff. I don’t keep a close watch anymore, but about once a year I check on the progress.
I remember when they were talking about using GPUs for doing an internet SETI search long ago. That was the first time I’d heard of the idea of GPUs being used for calculations.
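The appeal there was data parallelism: the same small operation applied independently to thousands of elements at once. A toy sketch in plain Python of that model (the `kernel` function is just an illustrative stand-in; a real GPU would run it over all elements simultaneously rather than mapping sequentially):

```python
# Toy sketch of the data-parallel model GPUs exploit: one small "kernel"
# applied independently to every element of a large array. On a GPU,
# thousands of these run at once; here we just map it in a loop.

def kernel(x):
    # Stand-in for the per-element work, e.g. one step of a filter.
    return x * x + 1

data = list(range(8))
result = [kernel(x) for x in data]   # a GPU would do these in parallel
print(result)  # [1, 2, 5, 10, 17, 26, 37, 50]
```

Because each element is independent, there’s nothing to synchronize – which is exactly why problems like SETI’s signal analysis mapped onto GPUs so well.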
You’re right about 2010. I placed it at 2002 because I was focusing on clock speed and neglected other factors.
Still, if you think about it: when was the last time you saw a single-core CPU system?
And the progress that’s made your system better in comparison? Improvements in storage, memory read/write speeds, more cores… improved caching…
I’m glad we’re moving towards more parallel processing. The promise of https://en.wikipedia.org/wiki/Fifth_generation_computer was based on parallel processing – but we were nowhere near ready technologically for it.
Now we’re here – or close to it.
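The everyday version of that parallelism is now a few lines of stock Python: farm independent work units out to every core. A minimal sketch using the standard library (the `work` function is just a hypothetical CPU-bound task):

```python
# Minimal sketch: spreading independent, CPU-bound tasks across all cores,
# the kind of routine parallelism the fifth-generation projects dreamed of.
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Stand-in for an independent, CPU-bound computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000, 20_000, 30_000, 40_000]
    with ProcessPoolExecutor() as pool:   # one worker per core by default
        results = list(pool.map(work, inputs))
    print(results)
```

The `__main__` guard matters on platforms that spawn worker processes by re-importing the script; without it the pool would recurse.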
The Intel single-core on Newegg was first available on Amazon in 2004. I’ll look at the other one.
The higher-end G440 was discontinued in 2012. I can’t find its original manufacture date.
Hm – good point. I’m not lamenting the loss of single-core. But for me, it was a measuring stick of Moore’s Law – or at least of our manufacturing capabilities.
I believe they should’ve “gone vertical” long ago myself.
[The overall two-dimensionality of electronics has been irritating for a long time. “Oh, it’s thinner!” Marvelous. How about stacking more power? I’d rather have a processor/memory/graphics/storage system that’s an integrated 3D matrix, with sparse connections and lots of room for internal expansion.]
But describing in words what I see for future systems is going to sound weird.
I also want to see more work done in fuzzy computing. Whether analog, ternary, quantum, whatever – some kind of expansion beyond FLIP/FLOP. There’s some work being done in that area but nothing consumer-grade that I’ve seen.
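For a taste of what “beyond flip-flop” can mean, here’s a toy sketch of balanced ternary – digits −1, 0, +1 instead of bits – famously used in the Soviet Setun machine. Every integer gets a unique representation, and negation is just flipping each digit’s sign:

```python
# Toy sketch of balanced ternary (digits -1, 0, +1), one alternative
# to binary FLIP/FLOP encodings. Digits are least-significant first.

def to_balanced_ternary(n):
    """Return the balanced-ternary digits of integer n, each in {-1, 0, 1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:            # write 2 as (3 - 1): emit -1, carry 1 upward
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits

def from_balanced_ternary(digits):
    """Reassemble the integer from least-significant-first digits."""
    return sum(d * 3 ** i for i, d in enumerate(digits))

print(to_balanced_ternary(5))   # [-1, -1, 1]  i.e. -1 - 3 + 9 = 5
```

Note there’s no sign bit: negative numbers fall out of the same digit set, which is part of the encoding’s elegance.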
The model we’re shrinking has been this. Breadboard.
This is why GPU progress has been exciting to me because it’s at least SOME progress in native 3D matrix manipulation.