8-bit/pixel Raw File for M8

Sonnar Brian

I was trying to figure out why the DNG files were only 10 MBytes and not 20 MBytes. The M8, M8.2, and M9 use 16-bit sampling to capture the image but only 8 bits to store it: the square root of the captured value is stored, not the value itself. It's a neat trick, but it means giving up granularity at the high end of the A/D output. With storage so cheap, I'd prefer the 20 MByte files. I wonder what Pentax does with their 22-bit A/Ds?

http://nemeng.com/leica/004f.shtml

If they give me the source code to the camera, I could change it back for them.
 
now there's a thought ... open architecture for digital cameras. it could create a cottage industry of programmers featuring processor customization. woo-hoo!
 
Some of this has been done for other "consumer grade" digital cameras. Apparently there is open-source firmware for the Canon PowerShots and such.

It's just SO much easier when you get to have the camera built to your specifications and write all of the firmware yourself. Sometimes I miss those days. But that camera was very, very big.
 

Sorry, Brian, the M9 is out of that list. Uncompressed 16-bit DNG files are 37 MB on that camera. Of course you can set it to M8-type compression, if you so desire; then the file is 18 MB. The availability of uncompressed RAW was one of the deciding factors in going to the M9 for me.
 
This is a feature a lot of people are asking for over at luf, hopefully it'll be implemented in the upcoming M8 firmware upgrade.
 
A lot of people?? I have only seen that sporadically, if at all, over the last two years. Anyway, it is rather unlikely, as the M8 probably lacks the processing power to do this.
 
That is great news that the M9 has 16-bit raw files. I am considering one for work, if not for home "yet".

I suspect most people do not know about the compression function, or do not understand the impact. It's nothing new, Kodak did the same type of trick with the DCS420. We asked Kodak the same question about why they just did not store difference frames.

The M8 could probably perform the 16-bit operation, but it would slow down store to memory operations.

Just reminiscing: in 1981 I worked with a group that developed the first digital infrared sensors. My boss came up with compression algorithms, and I implemented them in FORTRAN on a parallel-vector supercomputer. I could make my code run 400x faster than what he could do. The savings in CPU-time cost were more than my salary.
 
Leica did originally test some M8 cameras for 16-bit output, but reported they were too slow to sell.
 

Does nobody remember or understand those days? Of course, with desktops that can match and exceed the mainframes of not so long ago, I guess it's no wonder.

BTW Brian, is FORTRAN still a viable language? We still have some legacy COBOL where I work, and programmers to maintain it, but nobody I know of is writing new programs in it.

I remember when, if you were serious, you progressed from BASIC to Pascal, and then went on to the big-boy toys of COBOL and FORTRAN. You were taught Pascal so you could learn structured programming. Of course you should have been taught that with BASIC, but it just didn't seem that you ever were.

Before I retired from the US Army, I found we were implementing a major database program in COBOL. I was shocked that it had been picked over C or the "new" C++ language. I guess I shouldn't have been.

Later, the CIS certificate course I was in taught BASIC, COBOL, and then C. But in 1991, there was still some COBOL being implemented, and certainly a lot being maintained. I have used some of my IT knowledge as I have moved jobs, but never became a programmer. The way IT has progressed, that is probably a good thing.
 
There are a few of us FORTRAN programmers left. I use it for real-time embedded applications where exact timing is required. FORTRAN produces completely synchronous code. Coupled with the low-level operations done in assembly, it does what I want. C++ and even C implement some operations with library code that can produce "hiccups" in the timing, where the generated code has to take time out for book-keeping such as heap management. I get to have custom boards made for my projects.

On the speed of new machines: I've run some benchmark code from the CRAY and TI-ASC on them. Nikki's Celeron is much faster than a Cray X-MP. But on the TI-ASC, I could get 400 million operations per second out of a 12.5 MHz machine: I grouped many pixels per word, then did bit-wise operations.

The 16-bit save function would not be too slow for my taste. It is faster than the spin-up time for the disk in my DCS200.

Plus- all of the M8's that are going to be made have sold already. So the change in firmware will not impede any sales...
 
@oftheherd: my wife still programs in COBOL. And these are new programs, designed to run on IBM mainframes. COBOL is not exactly alive, but it's not dead either.

@Brian: People are still trying to solve concurrency issues for parallel programming on multi-core systems. C and C++ are continuously evolving, introducing new constructs, while others are trying to write new programming languages; Go, from Google, is the new kid on the block.

I still use good old "C" to do parallel programming.
 
I'll have to put up some example code from the ASC.

It could collapse a 3-deep DO loop, with stride-counts on two of them, into a single assembly language instruction. It also split the outer loop into four equal loops and scheduled them on four parallel execution pipes. "FORK and JOIN" instructions were generated from the FORTRAN to sync it all. All this in FORTRAN-66, circa 1975.

In the 21st century, I find RISC assembly language is a lot of fun. Keeps me off the streets and pays the bills.
 

Idk, I remember seeing quite a few asking for it... not as much as for the lens selection menu though :rolleyes:

I doubt the M8 lacks the processing power to do this; worst case is that it'll slow down the saving stage, which really isn't much of a problem considering how one shoots with a rangefinder. For those who do want speed, all they have to do is stick to the 8-bit option :)
 