Wall St. Journal: Why Analog is Often Better Than Digital

you can't go home again

decades ago (i was there) we took a fork in the road, and there is no going back at this point.

i lamented the loss of analog computers (look it up on google or whatever) and the switch to the convenience of digital (not to be denied - extremely convenient and productive) . . . mostly because analog keeps you close to the physics of stuff, while digital reinterprets the physical world for you and then decodes it into a simulation of what the real (analog) world was in the first place.
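for the folks who never touched one: here is roughly what "reinterprets and then decodes" means in practice - sample the continuous signal, quantize each sample to a number, then reconstruct an approximation from the numbers. a throwaway sketch, with an arbitrary sample rate and bit depth that i just made up:

#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* illustrative only: sample a "physical" 1 kHz sine wave, quantize each
   sample to 8 bits, then reconstruct it - the round trip digital makes.
   the sample rate and bit depth are arbitrary choices. */
int main(void)
{
    const double fs = 8000.0;      /* samples per second */
    const int levels = 256;        /* 8-bit quantizer */

    for (int n = 0; n < 16; n++) {
        double t = n / fs;                          /* when we looked */
        double x = sin(2.0 * PI * 1000.0 * t);      /* the analog world: continuous */
        int code = (int)lround((x + 1.0) / 2.0 * (levels - 1));   /* what gets stored */
        double back = 2.0 * code / (levels - 1) - 1.0;            /* the "simulation" played back */
        printf("t=%.6f  analog=%+.5f  code=%3d  reconstructed=%+.5f\n",
               t, x, code, back);
    }
    return 0;
}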

i guess that you had to be there to really understand what happened.

doesn't matter . . . there is no going back.
 
Have you ever looked at a B&W negative and asked yourself where all the grayscales are? In the sense that you mean it, analog photography hasn't really existed since the 19th century. A negative is the equivalent of a CD recording.

And as far as physics goes, at some point being analog actually takes you away from reality.
 
I prefer Digital Computers over Analog Computers.

[images]


As you can see, the number of grey levels on an analog computer is quantized.

Digital Computers are easier to Program.

[images]
 
We are probably approaching the point now where younger directors, who started off on low-budget digital projects, don't actually have any experience with film. If Hollywood abandons film completely, that could have some real consequences for what remains of the film industry.

That is exactly what's happening. Lower expectations = lower skills. As one commenter on the Fukuyama article in the WSJ, Scott Smith, put it:

I witness the same lack of care and attention on film sets, where the attitude is "just keep rolling, we can always do it over again". While this may provide a certain amount of freedom for the actors, it also engenders the attitude of "why try my hardest, we can always do it over", instead of "I better do my best, because we may not have another chance".

The old digital machine gun.
 
Hello Brian

01001000 01100101 01101100 01101100 01101111 00100000 01000010 01110010 01101001 01100001 01101110 00001101 00001010

yeh . . . you're right, digital is better.

........................................................................................
now let me add my non-sarcastic commentary . . .

Digital computers are easier to program because that's what we have been evolving toward for the last 40 years! We abandoned analog technology (in computers) a lifetime ago because digital was more generic. A person doesn't need to understand the physics of a problem in order to run a canned, generic digital program. But she/he has to understand the physics to assemble an analog model. Digital made engineering monkey-proof!
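To make that concrete: the "canned, generic digital program" for something like a mass on a spring is just a few lines of numerical integration - type in the equation and step it. On an analog machine you would have to wire up integrators to model the same physics. A hypothetical sketch (all the numbers are arbitrary):

#include <stdio.h>

/* hypothetical sketch: the "canned digital" way to solve a mass on a
   spring, x'' = -(k/m) x.  an analog computer would model the same
   equation with two integrators and an inverter; here we just step it. */
int main(void)
{
    double k = 1.0, m = 1.0;   /* spring constant and mass (arbitrary) */
    double x = 1.0, v = 0.0;   /* initial position and velocity */
    double dt = 0.01;          /* time step */

    for (int i = 0; i <= 1000; i++) {
        if (i % 100 == 0)
            printf("t=%5.2f  x=%+.4f  v=%+.4f\n", i * dt, x, v);
        double a = -(k / m) * x;   /* acceleration from Hooke's law */
        v += a * dt;               /* semi-implicit Euler step */
        x += v * dt;
    }
    return 0;
}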

Similar to comparing manual film cameras with "select-a-dial" digital cameras. Pretty much anybody can now record a nice picture without having the slightest idea of what they did photographically.

Like I said . . . you can't go home again . . . the fork in the road is way, way back there.
 
48 65 6C 6C 6F 20 42 72 69 61 6E 0D 0A
Hello Brian <cr><lf>
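For anyone decoding along at home, a quick throwaway program that prints both dumps above (plain ASCII, nothing else assumed):

#include <stdio.h>

/* prints "Hello Brian" followed by CR LF, first as binary bytes, then as hex */
int main(void)
{
    const unsigned char msg[] = "Hello Brian\r\n";

    for (int i = 0; msg[i] != '\0'; i++) {       /* one 8-bit word at a time */
        for (int b = 7; b >= 0; b--)
            putchar(((msg[i] >> b) & 1) ? '1' : '0');
        putchar(' ');
    }
    putchar('\n');

    for (int i = 0; msg[i] != '\0'; i++)         /* the same bytes in hex */
        printf("%02X ", msg[i]);
    putchar('\n');
    return 0;
}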

> Digital computers are easier to program because that's what we have been evolving toward for the last 40 years!

I prefer coding in RISC assembly language. I forgot to evolve.

I miss the FORK instruction on the old Supercomputers, where you had to manually schedule processing through the parallel pipelines. Spin off two FORKs in the assembly language, and then use a JOIN instruction to sync up the results.
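The same fork/join idea lives on in ordinary threading libraries. A rough modern stand-in for "spin off two FORKs, then JOIN" using POSIX threads - just an illustration, not the old supercomputer assembly:

#include <pthread.h>
#include <stdio.h>

/* each "FORK" sums half of the range in its own thread;
   the JOINs wait for both before the partial results are combined */
static void *sum_range(void *arg)
{
    long long *s = arg;            /* s[0]=start, s[1]=end, s[2]=result */
    long long total = 0;
    for (long long i = s[0]; i < s[1]; i++)
        total += i;
    s[2] = total;
    return NULL;
}

int main(void)
{
    long long a[3] = { 0, 500000, 0 };         /* first pipeline's share  */
    long long b[3] = { 500000, 1000000, 0 };   /* second pipeline's share */
    pthread_t t1, t2;

    pthread_create(&t1, NULL, sum_range, a);   /* FORK #1 */
    pthread_create(&t2, NULL, sum_range, b);   /* FORK #2 */
    pthread_join(t1, NULL);                    /* JOIN: wait for both ... */
    pthread_join(t2, NULL);
    printf("total = %lld\n", a[2] + b[2]);     /* ... then combine */
    return 0;
}

(Compile with cc -pthread. The scheduling those old machines made you do by hand is exactly what pthread_create/pthread_join hide from you now.)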
 
correct my words . . .

Yes, after reading your last reply, I realized that I should have said "Digital computers are easier to use." (Like modern digital cameras with "scene modes".) Writing the computer code is, and always has been, not easy work. The same must be true for the folks who code the chips in our cameras . . . they are the guys doing all the thinking for us!
 
That is Grace Hopper's AUTOGRAPHED Manual of Operations for the IBM Mark I.

[photos of the autographed manual]


Looking at the serial number, it looks like the 16th copy of an edition of 258. When I presented the book to then-Captain Hopper for her autograph, she asked, "Where the hell did you get this?"

I need to get this book appraised.
 
As far as the analog computer discussion goes: one of the optical engineers who worked for me developed the optics for the "Optical Broadband Correlator", designed and built at NRL. Essentially it was an optical computer that attached to a VAX and was used for performing Fourier transforms optically. This was over 20 years ago. The system used fast D/A's to get data in, and A/D's to get it back out. Probably one of the more recent developments of an analog computer tied to a digital computer. In the end, parallel digital computers were faster, the bottleneck for the optical computer being the conversions.
 
Many things are easier to program nowadays, too. Look at the programming of apps for the iPhone, etc. Sometimes it's really good to abstract all the technical stuff away from the programmers.
 
The flip side of that coin is that the processing power of a modern Mac or PC is orders of magnitude higher than that of two decades ago, but it takes just as long to display a print preview. And yet, there isn't really much more that needs to take place. It's just done sloppier and with more needless overhead.

You want a faster computer, a bigger hard drive. But at the end of the day, tasks take just as long and drives are just as full; everything is just less efficient.

An iPhone is a glorified Palm Pilot 1000. Neither can really multi-task. Not because of hardware but because of the software.
 
I don't know about you, but MY Palm Pilot couldn't play music, view video, run 3D games, or make phone calls :)
 
You forget that most people want more and more features in software. So the software grows in its complexity, and it's simply not economically feasible to optimize everything for speed. And why should it be faster? As a software producer you know very well that most people buy a new PC every 2-4 years, so users solve the software's problems with hardware. When I have a fixed timeframe and can choose between offering new features for my software or optimizing it for users with old equipment, I know in which direction to go.

>>And yet, there isn't really much more that needs to take place.
>>It's just done sloppier and with more needless overhead.

Think about this again seriously.
 