Ruminating on Replaceable Sensors

wgerrard

Thinking about film and digital and all that... one of the usual arguments against digital is the anticipated inability of digital hardware to hold its value as well as film hardware does.

If we assume, correctly or incorrectly, that improvement in sensor technology is the one area most likely to drive down the price of the previous generation of digital hardware, could a manufacturer design a camera with replaceable sensors?

That is, when new sensors are available, you just swap out the old and insert the new.

Feasible? Or, are the other electronics of a camera intimately tied to a particular sensor?
 
I think this has to come eventually, except that at the moment the commercial incentives are against it. It is much easier for a manufacturer to convince buyers to "invest" in new cameras if the new camera has a new and upgraded sensor of higher resolving power. Remember that a camera manufacturer has to sell new cameras constantly and at high volumes to recoup their development and manufacturing costs and make a profit.

My feeling, however, is that as the technology matures, and especially once sensors stop "growing" so rapidly in resolution, this could possibly become a feature (unless there are technical impediments I do not know about). Ironic, I know: if sensors continue to grow rapidly in capacity then there is a need for upgradable sensor technology, but no commercial incentive to offer it. The feature actually becomes more feasible commercially once sensors stop growing, as resolution becomes less of a selling point in the overall camera "package."

Like most things, I would imagine that if and when it happens it would, as usual, start with upper-end pro-quality gear and then move to the consumer end of the market. Once one credible pro-market supplier does it, the others will be forced by competitive pressures to follow suit. (Just look at the Canon / Nikon battle over full-sized sensor chips. No camera has a full-sized sensor, then suddenly it becomes the next big thing - excuse the pun.) Eventually, though, the need for constant chip upgrades will diminish. Except in niche markets, all SLR sensors will probably eventually be full-sized and maybe 20-30 megapixels in resolution, so the battle will then be over other forms of sensor upgrades (e.g. HDR capacity or lower noise at higher ISO settings).

I find it hard to see why, other than for commercial reasons, this could not be done, although the initial cost of developing it could be high: it has to be plug and play, which means the other supporting technology has to be redesigned around a more open architecture. And of course there would probably, as usual, be a battle over whose upgradable chip technology becomes the industry standard, so some of us may be stuck with outdated "dead end" technology.
 
Many of the same dynamics have driven the consumer PC market. Bigger, faster CPU's prompt a round of new PC models. Usually, those models roll in improvements to other hardware that have also become available.

Now, the PC architecture is open just enough that savvy users can swap out CPU's, motherboards, drives, video cards, etc., and retain the same basic box. This open architecture does not exist due to the charity of PC makers. It exists because Compaq and others won a commercial battle against IBM. (Apple is an example of a closed architecture. Great hardware, but it's almost impossible to upgrade your own machine.)

Before we see sensors replaceable across brand lines, we will need enough common, non-proprietary camera design to make it possible. Even so, it would still be a step ahead if manufacturers offered replaceable sensors that worked only with their own cameras.
 
First, interchangeable sensors -- indeed, single use sensors -- already exist, with continual megapixel upgrades. We call them film.

Second, the idea of any other kind of higher-megapixel interchangeable sensor is a non-starter.

You need a LOT more processing power, so you must either build the 'Mk. I' version with a wild excess of processing power (and charge a fortune for doing so, and lose all your customers) or accept that the 'Mk. II' will run deadly slowly (and lose all your customers).

Of course you can replace the sensor AND the CPU, but by then, you might as well buy a new camera.

I've discussed this one at Solms.

Cheers,

R.
 
Roger, while I see little incentive for manufacturers to build replaceable sensors, I'm not sure I understand why you'd need "a LOT more processing power."

If you had an open architecture that paralleled the PC world, I'd think a sensor might be just as replaceable as anything in the homebrew PC that's currently residing in one of my closets.

I don't expect to see camera makers accept that kind of openness. (Especially since in the PC world it encompasses software, specifically, the BIOS.)

Then there is also the fact that commercial pressure and institutional efforts establish tech standards that promote interchangeability in the PC world.

EDIT: I'd think replaceable sensors would be to the advantage of manufacturers, even if they did not allow consumers to take advantage of them. Specifically, they could market cameras with the newest sensors without the expense of a redesign or of rejigging the assembly process.
 
wgerrard said:
Roger, while I see little incentive for manufacturers to build replaceable sensors, I'm not sure I understand why you'd need "a LOT more processing power."
A computer is built to handle anything that may reasonably be thrown at it, and after 3-5 years is slow and short on memory for many applications.

A camera CPU is built to handle the files that the camera produces.

Build in 4x the processing power (for 4x as many pixels) at 2007 prices.

Consider a new camera with 4x the processing power at 2010 prices...
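A rough back-of-envelope sketch (hypothetical sensor sizes, burst rate and bytes-per-pixel, not figures from any real camera) of why the image pipeline has to scale with the pixel count:

/* Back-of-envelope only: hypothetical figures, chosen to show the scaling,
 * not taken from any real camera. 4x the pixels means roughly 4x the data
 * the camera's processor must move and demosaic in the same burst time. */
#include <stdio.h>

int main(void)
{
    const double fps          = 3.0;   /* assumed burst rate              */
    const double bytes_per_px = 2.0;   /* assumed ~14-bit raw, padded     */
    const double mp_old       = 10.0;  /* hypothetical "Mk. I" sensor, MP */
    const double mp_new       = 40.0;  /* hypothetical "Mk. II" sensor, MP*/

    double mb_per_s_old = mp_old * bytes_per_px * fps;  /* megabytes/sec  */
    double mb_per_s_new = mp_new * bytes_per_px * fps;

    printf("Mk. I : %.0f MB/s\n", mb_per_s_old);         /* 60 MB/s       */
    printf("Mk. II: %.0f MB/s (%.0fx the load)\n",
           mb_per_s_new, mb_per_s_new / mb_per_s_old);   /* 240 MB/s, 4x  */
    return 0;
}

So a body whose processor was sized (and priced) for the Mk. I load either chokes on a Mk. II sensor or was hugely over-specified from day one -- which is exactly the choice above.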

Cheers,

R.
 
Roger, that homebrew PC sitting in my closet is about three years old. If I felt like it, I could replace the motherboard, the CPU, the video card, the sound card, and the hard drives. I'd get a PC with entirely new capabilities.

I could do that because of standards that all PC manufacturers adhere to because it is to their benefit. That wasn't inevitable. After its initial success with the original IBM PC and the follow-on IBM PC-AT platform, IBM brought out a line of PC's that used a different architecture, and attempted to tie software to that platform. At more or less the same time, the PC BIOS was reverse-engineered, and Compaq launched a very successful line of PC's using that BIOS based on the AT platform. IBM's proprietary hardware disappeared and, today, every PC sold traces its lineage, and its openness, to those Compaq machines.

While I see little, if any, commercial pressure on camera makers to follow that path, I see nothing technological preventing it.
 
Why would they sell just a sensor when they can sell you a whole new camera instead...???
Products today are designed to become obsolete within a few years...
 
sitemistic said:
But when you can buy a Quad Core computer at Wal-mart for $500, what's the point in upgrading an old computer? Building computers yourself, unless you are a hard-core gamer, stopped making sense 15 years ago.
Sorry to bicker on this, but it still makes sense. And I'm not a hardcore gamer, either.
The computer I built myself 3 years ago was a much better deal than anything I could have bought pre-made. It was a better deal in terms of processing power for the money, and because I configured it exactly for my needs.
 
Roger Hicks said:
First, interchangeable sensors -- indeed, single use sensors -- already exist, with continual megapixel upgrades. We call them film.

YES!

The thing that bothers me the most about buying a digital camera is that I'm stuck with the same "film" for the life of it!

Of course you can replace the sensor AND the CPU, but by then, you might as well buy a new camera.

From the point of view of a camera manufacturer, why would they want to sell you a new sensor when they can sell you the whole camera?
 
Rather than upgradable sensors that don't make economic sense, I'd prefer that camera makers stick to the Volkswagen Beetle / Leica M concept.

That is, design a "timeless", practical, basic and simple "standard" body, and constantly refine and perfect it.

Ideally, I'd like to see something like a Nikon FE2 or a smaller, lighter Leica M with a full frame sensor, and the basic body, controls and concept would be left alone.

Every couple of years a new model would come out, changed only for reliability, image quality, or other practical improvements that become possible when better imaging technology comes along.

The basic style, form factor and functionality of the camera series would remain unchanged. Everything possible that was standard and interchangeable would remain that way.

These are the types of devices that get a huge cult following and a long life span.
 
wgerrard said:
Roger, that homebrew PC sitting in my closet is about three years old. If I felt like it, I could replace the motherboard, the CPU, the video card, the sound card, and the hard drives. I'd get a PC with entirely new capabilities.

I could do that because of standards that all PC manufacturers adhere to because it is to their benefit. That wasn't inevitable. After its initial success with the original IBM PC and the follow-on IBM PC-AT platform, IBM brought out a line of PC's that used a different architecture, and attempted to tie software to that platform. At more or less the same time, the PC BIOS was reverse-engineered, and Compaq launched a very successful line of PC's using that BIOS based on the AT platform. IBM's proprietary hardware disappeared and, today, every PC sold traces its lineage, and its openness, to those Compaq machines.

While I see little, if any, commercial pressure on camera makers to follow that path, I see nothing technological preventing it.

The whole point is that the majority of the digicam and DSLR market consists of people who wouldn't have a clue, nor the inclination, to accomplish that upgrade. And as you and others have stated, there is absolutely no incentive for a camera manufacturer to offer upgrade components. Just think of the tech support nightmare.

The beauty of film photography is that the sensitive material can be upgraded in a system where the optics have often been better than the previous sensitive material. Your common Soligor lens wasn't better than the film, but for those of us who have Leica, Zeiss and other high-quality optics, new film simply increases our return on investment.
 
All this presupposes that digital cameras will continue to base their form on film cameras. I suspect that eventually (and, frankly, I hope sooner rather than later) cameras will bear very little resemblance to what we think of when "camera" is mentioned. That's when I'll be interested in what's being made.
Rob
 
dmr said:
YES!
From the point of view of a camera manufacturer, why would they want to sell you a new sensor when they can sell you the whole camera?

They wouldn't.

I'm not suggesting that we will start building cameras from a box of components. But I am suggesting that it would make more financial sense to replace a sensor for $100 than to spend several hundred dollars buying a new model that's identical to your current camera except for that new sensor.

PC makers usually won't sell you a new CPU, either, or a new drive, video card, etc. Those are all made and sold by other vendors. Most of them sell to consumers and to the PC makers as well.

All that is possible because PC's are built on an open architecture and according to a set of standards that define how all the parts will work together. While the internals of each component may be proprietary, if a vendor wants to succeed, he needs to ensure his product will work in any PC built on that architecture and heeding those standards.

This state of affairs exists because, twenty or more years ago, IBM failed in its attempt to replace its AT architecture with a closed proprietary system. They failed, primarily because Compaq, and then others, built better and cheaper AT clones using an open architecture and a BIOS with source code that was available for purchase. Vendors began selling the components such machines could use, and an industry was born. No one would dream of selling PC's that did not conform to the architecture. Even Apple, while retaining much of its proprietary nature, abandoned some of its own internal standards (IO and video in particular) and adopted standards developed on the PC side.

The equivalent in digital cameras would be, more or less, lenses that could be used on bodies across brands, interchangeable and replaceable sensors across brands, commercial availability of the source code for the software that drives a digital camera, and componentizing of cameras' internal circuitry so owners could upgrade it in a way that's analogous to swapping motherboards in a PC.
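Purely as a thought experiment -- no such standard exists, and all the names below are invented -- the camera-side equivalent of a PC motherboard standard might be an agreed sensor-module interface that any maker's body firmware could program against:

/* Hypothetical "open" sensor-module interface. Nothing like this is an
 * industry standard today; it only illustrates what cross-brand
 * replaceability would require camera makers to agree on.                */
#include <stdint.h>
#include <stddef.h>

struct sensor_info {
    uint32_t width_px;          /* active width in pixels                  */
    uint32_t height_px;         /* active height in pixels                 */
    uint32_t bits_per_sample;   /* raw sample depth                        */
    uint32_t max_fps;           /* maximum full-resolution frame rate      */
};

struct sensor_module {
    int   (*init)(void *ctx);
    int   (*get_info)(void *ctx, struct sensor_info *out);
    int   (*start_exposure)(void *ctx, uint32_t iso, uint32_t shutter_us);
    int   (*read_frame)(void *ctx, uint8_t *buf, size_t buf_len);
    void  *ctx;                 /* vendor-private state, opaque to the body */
};

/* Body firmware written against this one interface could, in principle,
 * drive any vendor's module that implements it -- the camera analogue of
 * a motherboard accepting any standard video card.                       */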

Camera makers do not now have an interest in seeing that happen because they profit by locking each customer into their proprietary system as much as possible.

BTW, the last PC I built cost me $900. Buying equivalent hardware ready-to-go would have cost me almost three times as much.
 
Replacing a sensor is not analogous to swapping a processor in a PC.

And what if you could stick a Pentium chip in a 1987 IBM AT?

You would still have an ancient PC that worked really fast.
 
I've seen inside computers -- indeed, I've done the occasional 'upgrade' or component swap myself -- and I've watched M8s being built.

There's a lot more room in a computer and the components don't have to be positioned to fractions of a thousandth of an inch.

'Open architecture' has nothing to do with it.

Cheers,

R.
 
It is not just size tolerances that make upgradeable cameras uneconomic; it's that an electronic system is not a LEGO set.

To borrow the example of another poster, a Pentium in a 1987 AT wouldn't work at all. You have a different memory and I/O bus width (16-bit vs. 32), communication protocol, bus speed (a Pentium wouldn't even clock that slow), and voltages (5V vs. 3.3V). Not to mention the totally different CPU socket, lacking a hundred or two pins.

Mind you, this is for open-architecture, size- and power-unconstrained, general-purpose commodity hardware. When you step into embedded computing land, there is little to no standardization at all. Most such designs are done on a case-by-case basis; you take the core components that have to be there (such as the CCD/CMOS sensor, motor actuators, memory controllers, and AF phase sensors in the case of cameras) and wire them together with the required support and integration logic. Controller logic for a CCD would be totally different from that for CMOS, and two CMOS chips from different generations or different manufacturers would interface quite differently too.

The bottom line: if you want an upgradeable digital camera, you'd need to replace the complete electronics package.
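To make that concrete, here is a made-up illustration (invented register maps and bus details, not any real parts) of how two sensor generations can differ at the level the surrounding electronics actually have to talk to:

/* Two imaginary sensor generations -- invented for illustration only --
 * showing why the body's support logic is tied to one particular chip.   */
#include <stdint.h>

/* "Gen 1": parallel readout, 12-bit samples, exposure in one register    */
#define GEN1_REG_EXPOSURE   0x10     /* exposure time, in line periods    */
#define GEN1_REG_GAIN       0x11     /* analogue gain, 4-bit code         */
#define GEN1_SAMPLE_BITS    12

/* "Gen 2": serial LVDS readout, 14-bit samples, exposure split across
 * coarse/fine registers at different addresses and in different units    */
#define GEN2_REG_EXP_COARSE 0x3500
#define GEN2_REG_EXP_FINE   0x3502
#define GEN2_REG_GAIN       0x3508   /* 8-bit code, different scale       */
#define GEN2_SAMPLE_BITS    14

/* Firmware and readout hardware built around the Gen 1 addresses, bit
 * depth, bus and timing have nothing to say to a Gen 2 chip: the register
 * layout, the data format and the physical interface all change, so the
 * "plug" half of plug-and-play is missing unless the electronics around
 * the sensor are replaced along with it.                                  */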
 

Didn't Leica do it with the DMR? Who's to say there won't be a DMR-II?

The only thing is the price -- would you pay another $4,500 to swap your current back for the new back?

wgerrard said:
Thinking about film and digital and all that... one of the usual arguments against digital is the anticipated inability of digital hardware to hold its value as well as film hardware does.

If we assume, correctly or incorrectly, that improvement in sensor technology is the one area most likely to drive down the price of the previous generation of digital hardware, could a manufacturer design a camera with replaceable sensors?

That is, when new sensors are available, you just swap out the old and insert the new.

Feasible? Or, are the other electronics of a camera intimately tied to a particular sensor?
 
Processing power isn't necessarily a bottleneck. New algorithms are invented all the time, decreasing processing time with equal processor power.

What's more important is that the physical hardware remains compatible, and that might become a problem. What seems an efficient way of hooking the sensor up to the rest of the device today might not turn out to be so efficient in the future. Extra capabilities might also demand a redesign of the wiring and connections.
 
What varjag said - exactly my thoughts.
In the end, many fixed-lens cameras have the lens assembly molded together with the CCD, which is put in the case and connected to the LCD and processing unit. Add the dimensions, the layout of components, and the tight packaging, and there we go - this is a single-use product. Designing a camera to have an interchangeable CCD would make it too pricey, I think. And in the end a spare CCD would cost enough to cover those expenses anyway.

As long as wide, I'd say.
 