Fifty years of seconds is around 1,577,880,000 seconds.
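A quick back-of-the-envelope check of that figure, assuming the Julian year of 365.25 days:

```python
# How many seconds in fifty years? (Julian year approximation)
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400
DAYS_PER_YEAR = 365.25           # Julian year

seconds = int(50 * DAYS_PER_YEAR * SECONDS_PER_DAY)
print(f"{seconds:,}")  # 1,577,880,000
```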
That many seconds ago, a series of lectures was held at M.I.T. in honor of its 100th anniversary (that is, in the spring of 1961).
The lectures were recorded, including questions from the audience, and transcribed. The book was published in 1962: "Management and the Computer of the Future", Martin Greenberger (ed.).
I am reading that book now, and enjoying it greatly.
Especially enjoyable is the lecture on time-sharing, given by John McCarthy. This was the first public mention of the concept, and McCarthy is widely accepted as the "father" of time-sharing.
The question at hand, then, was whether people would want individual access to a "large" computer, and if so, how this desire could be accommodated.
Now we know that, yes, they do, and that the desire is accommodated by the provision of "small" personal computers and access to the World Wide Web.
How big is "large" and how small is "small"?
In the 1960s, an expenditure of about six million dollars would have provided for a hundred or so terminals attached to a computer with about one million words of memory and perhaps as much as 50 million words of secondary (disk) storage. That was considered "large" back then.
Assuming a "word" to be 32 bits (4 bytes), this means that a computer that would satisfy the personal-computing needs of the entire M.I.T. would have 4 MB of memory and a 200 MB hard drive.
Physically, it would fill a large room, and require air conditioning and a team of people to operate.
A personal computer today might easily have (as mine does) 8 GB of memory and a 500 GB hard disk. That is more than 2,000 times the memory and storage that served all of M.I.T. a mere billion and a half seconds ago. It is also connected to the World Wide Web and its trillion documents.
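The "more than 2,000 times" claim can be checked with a few lines of arithmetic, using the machine sizes quoted above (the 32-bit word is the assumption stated earlier; the personal-computer figures are the ones from the text):

```python
WORD_BYTES = 4           # assuming a 32-bit word
MB = 1024 * 1024
GB = 1024 * MB

# The "large" 1961 machine that served all of M.I.T.
mit_memory = 1_000_000 * WORD_BYTES    # ~4 MB
mit_disk = 50_000_000 * WORD_BYTES     # ~200 MB

# A personal computer of today (figures from the text)
pc_memory = 8 * GB
pc_disk = 500 * GB

print(pc_memory // mit_memory)  # 2147 -- over 2,000x the memory
print(pc_disk // mit_disk)      # 2684 -- over 2,000x the storage
```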
Physically, it occupies less than a tenth of a cubic foot, and doesn't require air-conditioning.
The very last quote of the chapter is about the value of specialized hardware to make the machine more usable. McCarthy warns "...we must be very careful that the money saved in hardware is not spent in programming."
Today, a web site can be put together with no hardware purchase at all. By hosting in the cloud, it would cost pennies an hour to operate. The programming, however, will cost tens of dollars an hour. The money saved in hardware is indeed spent on programming, validating McCarthy's fear.
The world has come a long way at the usual rate of one second per second.