PC sales doldrums

I also agree with that. We've reached a maturation level in the personal computer world that doesn't require an upgrade every 18 months just to keep functioning, the way it did back in the late '80s through the mid-'90s when OSes and application software were changing at a rapid pace. The fact that a huge number of Win XP systems are still in service is a testament to that.

This is most of the answer, IMO. I remember when the 286 replaced the 8086, when the 386, 486, and Pentium came out. Each of those jumps was significant in terms of processor speed and capability. Also, storage devices were improving in leaps and bounds. And software developers took advantage of every hardware improvement to write more capable code. So it made sense to upgrade every few years.

That rush has petered out. Today, the only people who need "latest and greatest" hardware are gamers or others in graphics-intensive settings. Otherwise, e-mail, Excel, Internet Explorer, etc. are perfectly happy on yesterday's machine...
 
PC technology evolution has slowed to a crawl; it’s like 1990 all over again.

I definitely agree with that. I have a 10-year-old computer that I gave to my grandparents that is still usable. But 10 years ago, a 10-year-old computer was utterly useless.

However I think all these problems that have been discussed are interrelated. The R&D money is going into mobile/tablet devices rather than faster PCs because that's what people want. And as R&D money shifts away from PCs it makes them that much less desirable. A self-perpetuating cycle.
 
I'm still running applications that take 1-2 hours to process on a dual 3.5 GHz quad-core desktop, and I expect I'll be long retired before any tablet is powerful enough for that.

The Surface Pro runs an i5 CPU; the big thing these days hasn't been raw MHz but TDP.

With that said, rumor is AMD has a 5 GHz chip in fab. Which is odd, because even for the next Xbox and the PS4, AMD went with a unified CPU/GPU at lower clock rates and a massive core count. Interesting times!
 
I definitely agree with that. I have a 10-year-old computer that I gave to my grandparents that is still usable. But 10 years ago, a 10-year-old computer was utterly useless.

However I think all these problems that have been discussed are interrelated. The R&D money is going into mobile/tablet devices rather than faster PCs because that's what people want. And as R&D money shifts away from PCs it makes them that much less desirable. A self-perpetuating cycle.

Not so much, in my opinion, although I agree that it's demand-dependent.

If I were to leave now and drive downstate to my nearest Micro Center, I would still have time to purchase the components to build a supercomputer. I fully expect that they would have the parts in stock for me to build at least a 4-CPU, 16-core machine with 32 GB of RAM, enough video processing power to make Spielberg drool, and a sound system to rival Carnegie Hall. That's minimum. I could probably do better, just with what they have in stock on the shelves.

But what would I use it for?

That's the real reason why hardware seems to be lagging right now. It's not because we've hit some technical plateau on the hardware side. It's because the practical performance limits using today's hardware far exceed the practical needs of the vast majority of users.

I'm not talking about theoretical limits. I'm talking about a computer that I could build this evening from parts that are in stock, right now, at Micro Center or Fry's, that would be so powerful it would exceed the needs of 99 percent of users for at least five years -- maybe more.

-Rich
 
My iPad has replaced nearly all of my recreational computer use. Hell, I even did my taxes on it.

However, I suspect I will be purchasing a new PC in the next few days to replace my TV DVR.
 
I fully expect that they would have the parts in stock for me to build at least a 4-CPU, 16-core...

But what would I use it for?

OK, fair enough -- so I revise my point: it's software R&D that has shifted, which is probably a better explanation of why the upgrade cycle is less frequent. In either case, there is just no motivation to upgrade.
 
Most folks use a PC as a communications appliance, so there is a lot of surplus utility being wasted, and the public is coming to understand that. Even most business users rarely do anything outside of Excel, PowerPoint, Word, and Outlook. Smartphones and tablets are far more convenient as communications appliances, so it is no surprise that they are supplanting the PC in that regard.

The PC is still a powerful productivity platform for information-oriented business, but even there, the functionality we rely on for our livelihood in the information age is migrating to virtual implementations such as VMware and XenApp. So even in the VM world the PC is far from dead or obsolete; it's just not packaged in the traditional manner.

As for the Mac/Windows debate, I have found the Mac to be better for some applications and the PC better for others. Each world has applications that are not available in the other.

I now access a virtual PC from whatever device I happen to have in my hands at the time: phone, tablet, or laptop. I think I am now using my last laptop, just as several years ago I used my last desktop, and before that my last Unix workstation, and before that my last mainframe dumb terminal. It's simply time for the hardware implementation of the PC as we know it today to fade away.
 