View Full Version : Computer years = how many human years
It is said that a dog year is 7 human years. How many human years is a computer year?
Wife and I went to San Antonio and stayed at a very nice B&B. They had a computer for guests to use. It was a Pentium III with 512MB memory. I asked the innkeeper if it came with the house when it was built (1852), but the sarcasm went totally over her head.
Anyway, how many human years is a computer year? My guess is about 12.
Datawiz
07-28-2011, 7:16pm
People live on average about 80 years, give or take, computers start fading after 3 years and are toast after 5 years, so 5 into 80 = 16. :cheers:
mrvette
07-28-2011, 7:25pm
MORE depends on how many lightning storms and power surges you get...
AND what gear you got to protect the computers....
all else is BS, UPdate what??
office 2003 is as good as office 2010.....
all the rest is bullshit....
seems like every time some 'proGRAMMMMME' demands an update, the machine is screwed up forever.....
EFF that...it werks, leave it the EFF alone....damnit....
:cheers::slap::kick:
I'd say about 15. A 1 year old computer is out of date like a 35 yo is out of date. :leaving:
Rotorhead
07-28-2011, 7:42pm
You've got it ass backwards. One human year = 7 dog years.
My calculation is one human year is about 20 computer years.
I'd say about 15. A 1 year old computer is out of date like a 35 yo is out of date. :leaving:
HEY!!! I resemble that remark :leaving:
1 = 15... My laptop is 90 years old. It must be the healthiest 90-year-old I've ever seen.
Uncle Pervey
07-28-2011, 9:45pm
Back in 1965 a dude named Moore wrote a paper on the newly invented integrated circuit. He stated that the number of transistors in an integrated circuit would double every 2 years, and he expected this trend to continue for the next 10 years or so. More than 45 years later, the number of transistors in an IC has continued to double at that exponential pace. This trend is supposed to continue until around 2020.
Not only has the transistor count followed this exponential growth, but most other aspects of computer technology have followed the same trend.
This doubling of capability is referred to as "Moore's Law."
Moore wasn't the first to make this sort of prediction. Alan Turing, back in 1950, stated that in about fifty years computers would have a storage capacity of about a billion. That's roughly a gigabyte. In 2001 I had a Windows 2000 box that had 2 gigabytes of memory. Looks like Turing undershot with his guess.
I'd guess the age goes up exponentially. From 20 years old at purchase to 120 years old by year 8. After that time they are on life support and unable to run any current software well. :yesnod:
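Just for fun, the two growth rules in this post (double-every-2-years, and "20 at purchase, 120 by year 8") can be sketched in a few lines of Python. The Intel 4004 baseline figures are my own assumption, not anything from the thread:

```python
# Toy sketch of the doubling math discussed above.
# Baseline assumption: Intel 4004 (1971), ~2,300 transistors.

def transistors(year, base_year=1971, base_count=2300):
    """Transistor count under a strict double-every-2-years rule."""
    return base_count * 2 ** ((year - base_year) // 2)

def computer_age(years_owned):
    """The '20 at purchase, 120 by year 8' guess as an exponential curve."""
    return 20 * 6 ** (years_owned / 8)

print(f"{transistors(2011):,}")   # ~2.4 billion, the right ballpark for 2011 chips
print(round(computer_age(0)))     # 20
print(round(computer_age(8)))     # 120
```

Twenty doublings between 1971 and 2011 gives about a factor of a million, which is roughly what actually happened, so the rule holds up surprisingly well.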
Datawiz
07-29-2011, 8:22am
You've got it ass backwards. One human year = 7 dog years.
My calculation is one human year is about 20 computer years.
:slap:
Yerf Dog
07-29-2011, 10:37am
15? So my tower PC I use at home is 120. :willy:
My laptop is only 45 though. :D
Stangkiller
07-29-2011, 12:05pm
People live on average about 80 years, give or take, computers start fading after 3 years and are toast after 5 years, so 5 into 80 = 16. :cheers:
That looks about accurate.
wowbrandonwow
07-26-2013, 3:16am
15? So my tower PC I use at home is 120. :willy:
My laptop is only 45 though. :D
I guess if what I had waiting for me back home was 120 I would be tickled pink to have something only 45 in my lap. :rofl:
DukeAllen
07-26-2013, 3:25am
And this thread you opened is about 30 :rofl:
Stangkiller
07-26-2013, 7:34am
That computer is still faster than Red's
Truck Guy
07-26-2013, 11:34am
Take new computer out of the box...turn it on...set it up...
<1 hour later> out of date! :lol:
<<<2 year old thread>>>
vBulletin® v3.8.4, Copyright ©2000-2025, Jelsoft Enterprises Ltd.