1. In my previous post on the subject of the color responses of Aperture, Lightroom and Capture One, I mentioned that once I adjusted for exposure and contrast, the rendition that Capture One gave became very bright. The image that I was using was taken with a Nikon D80, and is not particularly badly exposed. Below is the rendering that Aperture gives completely unadjusted:

    Aperture 2 test chart, unadjusted


    And now the image that Aperture 2 gives when adjusted. The process that I followed was to adjust the exposure and contrast setting of each program to get the “White” and “Black” patches to exactly the correct values. In both Lightroom and Aperture, this gives a normal looking image - certainly more exposed than the original, to be expected given the underexposed "grey" look to the white patch, but generally correct.


    Aperture 2 test chart, adjusted for contrast and exposure


    The Capture One test chart however is very different; as we might have expected from the numeric results in my previous post, it’s very bright, to the point that the image is becoming washed out. For example, the “Light Skin” patch is effectively no longer skin colored – it’s closer to white.


    Capture One V4 test chart, adjusted for contrast and exposure


    A numeric view confirms the different brightness behavior of the three programs. The three charts below compare the expected versus actual values for the five neutral toned patches on the D80 test image above. Both Aperture and Lightroom hold the deviation from expected to within 6 units. However, Capture One deviates from its unadjusted tone curve by 21 units, a very substantial difference. Some difference in tone curve is probably unavoidable in practice – at the end of the day, any adjustment made to exposure effectively impacts on the tone curve of the raw developer in question. But it’s clear that Capture One’s exposure and contrast controls interact with the tone curve to an extent not seen in the other software.





    A few questions come up here: firstly, is there a better way to make exposure adjustments, and secondly, is this a problem, or just a quirk? On the first question, Capture One experts may be able to point out a way to make basic exposure adjustments that doesn’t interact with the Capture One tone curve, but when I tried the only obvious alternative, the shadow and highlight controls, I found two problems:

    1. I was unable to get both the white and black patch to simultaneously be at the correct value, and
    2. Using those controls to adjust basic exposure would rather beg the question of what the role of the exposure and contrast controls should be.

    To answer the second question, I attempted to use each program’s brightness control to get the “standard” tone curve for each program after making the adjustments to exposure and contrast. In the case of Aperture and Lightroom, this was quite easy, and I could “dial in” the value of each of the neutral patches to within a unit or two, using a combination of exposure, contrast and brightness adjustments. For Capture One however, it proved impossible to get all the neutral patches near to their correct values. The settings required were such that the exposure, contrast and brightness controls ended up at the extremes of their ranges. Given that what I was doing was making fairly minor (less than 1 stop) exposure adjustments, this seems to me to be a fundamental flaw in the way that Phase One have implemented their exposure controls. Now it may be that the forthcoming pro version of Capture One will function differently, but it seems unlikely that so basic a part of a raw developer would change between the standard and pro versions.


  2. In this post, I’ll look at the color response of Capture One, Lightroom and Aperture against an image of an actual GretagMacbeth test chart, as captured on a Nikon D80, rather than the Leica M8 I used in the previous post. As was the case last time, I first adjust the contrast and exposure setting on each program to exactly match the expected values of the lightest and darkest monochrome patches on the GretagMacbeth chart. This exactly matches the exposure of the real images to the effective exposure of the synthetic image. As was the case for the synthetic images, all the test results are on a 0 to 100 scale, and represent the difference between the expected value as derived from the color values of the GretagMacbeth chart and the actual values measured. So, for example, if the red bar of the “Cyan patch” shows a value of -5, that means that the actual measured value of the R component of the RGB values as read out by the software in question was 5 units less than the theoretical value.

    In the case of Lightroom, relative to the deviations in the M8 actual image from the last post, we see smaller negative deviations overall, indicating more saturated colors, and a significantly more saturated red patch. However, overall the picture is relatively similar to that of the M8. This starts to imply that differences in color rendering really are more to do with the raw conversion software, and less to do with different cameras.



    Aperture shows a somewhat different picture. The 1.1 rendering of the M8 showed some large positive spikes in the blue component of several patches, especially the yellow patch. This doesn’t appear on the D80 rendering. However, the red component in the cyan patch is quite negative. The 2.0 rendering shows considerable change relative to the previous V1.1. Firstly, most of the negative deviations have gone – the largest negative deviation anywhere is in the red component of the cyan patch, but even this is well down from the previous value. Overall, the 2.0 D80 rendering appears significantly better controlled than the previous version.



    Capture One is an interesting case. At a first glance, it appears that the rendering is simply a considerable distance away from the theoretical values: almost all color components appear to be greater than the theoretical values would indicate. The peak deviations are over 20 units, in sharp contrast to Capture One’s rendering on the M8 image, in which the deviations are of the order of 10 units. However, a closer look shows that what has actually occurred is that the pattern of deviations has remained very much the same, but that their magnitude has grown, and been offset in a positive direction. What this amounts to is that the image is considerably brighter overall. This is a strange result, and one that I’ll come back to in my next post.
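    One way to check this "same pattern, offset brighter" reading numerically is to compare the two deviation vectors directly: the difference in means isolates the brightness shift, while the correlation shows whether the pattern is preserved. The numbers below are invented for illustration, not the measured values:

```python
import numpy as np

# Hypothetical per-channel deviations for the same set of patches on
# the two cameras (0-100 scale); these are made-up illustrative values.
m8  = np.array([ 4.0, -3.0,  8.0, -1.0])
d80 = np.array([12.0,  2.0, 19.0,  6.0])

offset = d80.mean() - m8.mean()            # overall brightness shift
pattern_match = np.corrcoef(m8, d80)[0, 1] # near 1.0 if the pattern holds
```

    A high correlation together with a large positive offset is exactly the "pattern preserved, image brighter" behavior described above.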

    For the moment ignoring the issue of the Capture One brightness, we can draw two conclusions at this point:

    1. Most of the color variation that we see appears to be due to variations in the calibration of the raw converters, rather than variations between camera brands. The Lightroom M8 and D80 color renderings look more alike than, for example, the color renderings of the M8 using Lightroom and Capture One.
    2. Aperture 2’s color rendering appears to have been significantly improved, at least in a technical sense, relative to the previous versions of Aperture.

  3. In the previous part of this mini-review, I looked at the color response of Capture One, Lightroom and Aperture against a synthetically generated GretagMacbeth test chart. In this post, I’ll look at the response of the same programs against an image of an actual GretagMacbeth test chart. The process that I will follow for the actual image is a little different to that for the synthetic image. In the case of the synthetic image, I made no adjustment whatsoever to the image – the readings are exactly as they appear when the image is imported into each program. For the real image however, I first adjust the contrast and exposure setting on each program to exactly match the expected values of the lightest and darkest monochrome patches on the GretagMacbeth chart. This exactly matches the exposure of the real images to the effective exposure of the synthetic image. As was the case for the synthetic images, all the test results are on a 0 to 100 scale, and represent the difference between the expected value as derived from the color values of the GretagMacbeth chart and the actual values measured. So, for example, if the red bar of the “Cyan patch” shows a value of -5, that means that the actual measured value of the R component of the RGB values as read out by the software in question was 5 units less than the theoretical value.

    Relative to the synthetic image, Lightroom was, as expected, very close to the theoretical values for the GretagMacbeth chart. Versus a real image however, it shows significant differences, most noticeably in the red patch, where it has significantly more blue and green than might be expected. At first sight, this is a somewhat counter-intuitive result: while the greater levels of green and blue indicate a more saturated color than the theoretical representation, Lightroom in general has a reputation for excessive red. It’s only in the green patch that there is significant excess red. This would imply that when the complaint of Lightroom’s “excessive red” is made, it is probably more a complaint about the saturation of reds in the image, rather than an excess of the red color component.




    Aperture shows no clear pattern of greater or lesser overall saturation, but does show two interesting characteristics. Firstly, the green components are still very much less than is the case for Lightroom, but at the same time the absolute variation from the theoretical value is far less – versus the synthetic image, the variation was -15.3, but against the actual it is only -6.3. This suggests that the Aperture calibration for the green components in a real image is probably better than Lightroom’s, even though Lightroom’s better matches the synthetic image. Secondly however, there are significant variations in the blue component, especially in the 1.1 profile. The newer 2.0 and DNG profiles show color rendition that is a lot closer to expected values than the previous version. This is consistent with Apple’s statements that the raw conversion subsystem has been substantially revised and improved in the new version. Overall, the actual M8 images converted with the Aperture 2.0 profile are a better match to theoretical values than either the previous version of Aperture, or Lightroom.





    Turning to Capture One, the most significant feature of the charts is the absence of “negative spikes” – while both Lightroom and Aperture have at least some color patches where at least one color is significantly less than the theoretical value, Capture One is relatively better controlled in this respect – only in the yellow patch is there a significant negative deviation. In addition, this control of negative peaks isn’t at the expense of spikes in the positive direction; no spike exceeds 12 units. It’s also interesting to note that in the three primary patches, the red component is within three units of the theoretical value in the red patch, and the green in the green patch and the blue in the blue patch are similarly well controlled. Thus, while Aperture is overall closer to the theoretical values, Capture One is perhaps “closer where it counts”.



    In my next post, I’ll take a brief look at color rendering for the same three programs against an actual image from a Nikon D80, so as to get a feeling for whether the patterns here are M8 specific, or relate more to the programs in question.

  4. At long last, here’s the comparison of color rendering promised several weeks ago – between work and the display board in my main PC failing, this has taken longer than I’d expected. This post compares the color rendering of Lightroom, Aperture and Capture One versus a synthetic test image. That image was created by taking the raw image from a Leica M8, which is in DNG format, and then replacing the contents of the image with a synthetic version of a GretagMacbeth 24-patch color chart. This can be done because DNG format files contain all the color calibration information that’s required to go from the camera’s raw image space to a real image in the two ColorMatrix matrices. So the synthetic image is built by taking the l*a*b* color values for the GretagMacbeth test chart, and reversing the calibration matrices in the Leica DNG file. This, btw, is being done using a modified version of CornerFix – I’m currently debating whether to include the synthetic image creation functionality in the next official CornerFix release.
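    For anyone wanting to experiment with this technique, a minimal Python sketch of the core of it: convert the chart’s l*a*b* values to XYZ, then map XYZ into the camera’s raw space using the DNG ColorMatrix. The matrix values here are invented placeholders – the real ones come from the ColorMatrix tags in the Leica DNG:

```python
import numpy as np

def lab_to_xyz(L, a, b, white=(0.9642, 1.0, 0.8249)):
    """Convert CIE L*a*b* to XYZ, relative to a D50 reference white."""
    fy = (L + 16.0) / 116.0
    fx = fy + a / 500.0
    fz = fy - b / 200.0
    def f_inv(t):
        d = 6.0 / 29.0
        return t ** 3 if t > d else 3.0 * d * d * (t - 4.0 / 29.0)
    return np.array([f_inv(fx), f_inv(fy), f_inv(fz)]) * np.array(white)

# Illustrative XYZ -> camera-native matrix; real values come from the
# ColorMatrix1/ColorMatrix2 tags embedded in the DNG file.
color_matrix = np.array([[ 1.0, -0.2, -0.1],
                         [-0.3,  1.2,  0.1],
                         [ 0.0, -0.2,  1.1]])

# "Light Skin" patch of the GretagMacbeth chart
xyz = lab_to_xyz(65.711, 18.13, 17.81)
raw = color_matrix @ xyz   # synthetic camera-native pixel values
```

    Doing this for all 24 patches, and writing the results back into the DNG’s image data, gives the synthetic chart.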

    The synthetic test image is what a “perfect” M8 would show. But “perfect” here means an M8 that matches Leica’s calibration matrices. However, there is no one single best calibration for a real camera. Pretty much all camera calibration is done via a three by three matrix. Using that, you can dial in any three particular colors exactly. So, for example, you can get the red, blue and green patches on the GretagMacbeth chart down to the last decimal point. If sensors were perfect, that calibration would also mean that every other patch would also be calibrated. However, in a real sensor, there are a whole lot of imperfections – among other things, the filters in the Bayer matrix aren’t ideal, so colors bleed between each other, and the sensitivity of the sensor itself varies with the frequency of the light striking it. So, even if you dial three patches in perfectly, the others will be out. So practically, what camera manufacturers and raw developer software writers have to do is to find a calibration that is a compromise across a whole range of colors. However, because people are more sensitive to certain colors being out (e.g., skin tone, foliage, etc.) that compromise is often weighted in favor of the sensitive colors.
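    The "dial in any three colors exactly" point is easy to demonstrate: with three measured patch colors and three targets, the 3x3 calibration matrix is just the solution of a linear system. The patch values below are hypothetical:

```python
import numpy as np

# Hypothetical camera-native readings for the red, green and blue
# patches (one patch per column), and the targets we want after
# calibration.
measured = np.array([[0.60, 0.15, 0.10],
                     [0.20, 0.55, 0.15],
                     [0.10, 0.20, 0.50]])
target   = np.array([[0.70, 0.10, 0.05],
                     [0.10, 0.65, 0.10],
                     [0.05, 0.10, 0.60]])

# Solve M @ measured = target for the 3x3 calibration matrix M.
M = target @ np.linalg.inv(measured)

# The three chosen patches now land exactly on their targets; a
# fourth color, not used in the fit, generally will not.
```

    With only nine free parameters, any fourth patch is at the mercy of how well the sensor approximates a linear combination of the first three – which is exactly why real calibrations are compromises.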

    The M8 test images can be found here: http://chromasoft.googlepages.com/referenceimages

    The charts below show the difference between the theoretical color values that we should see for a selection of six of the more important color patches, and what we actually get. So, for example, if the red bar of the “Cyan patch” shows a value of -5, that means that the actual measured value of the R component of the RGB values as read out by the software in question was 5 units less than the theoretical value as shown in the spreadsheet I discussed in the last post. In all cases, the scale is 0 to 100.
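    For reference, the chart values are computed exactly as described – a simple per-channel difference between measured and expected values, both on a 0 to 100 scale. The patch readings below are hypothetical:

```python
import numpy as np

# Expected values derived from the chart's L*a*b* data (0-100 scale),
# and hypothetical cursor readouts from one of the raw developers.
expected = {"Cyan": np.array([25.0, 58.0, 66.0]),
            "Red":  np.array([69.0, 22.0, 24.0])}
measured = {"Cyan": np.array([20.0, 60.0, 64.0]),
            "Red":  np.array([71.0, 20.0, 23.0])}

# Per-channel deviation: negative means the readout is below theory.
deviation = {patch: measured[patch] - expected[patch] for patch in expected}
```

    With these numbers, the R bar of the cyan patch would plot at -5, as in the example above.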

    First up is Lightroom. It shows minimal deviations from the theoretical values – all the values are within 3 units. But this shouldn’t come as a surprise – Lightroom internally uses the exact same color model as the DNG file, and we know that Lightroom uses exactly the same color calibration as the Leica DNGs have embedded into them. The minor deviations that we're seeing are really just slight imperfections in the tone curve and in the color temperature interpolation process that Lightroom uses.



    Next up is Aperture. There are three Aperture graphs, the first for Aperture V1.5.4. In addition, I also have graphs for Aperture 2.0, which came out a few days ago. Aperture 2.0 provides four “Raw Fine Tuning” settings, “1.0”, “1.1”, “2.0” and “2.0 DNG”. I checked, and the color rendering from the old 1.5.4 and what you get by setting “1.1” in 2.0 are indeed identical. Firstly, all of the Aperture settings have a lot less green in the red patch than Lightroom, and less red in the blue and cyan patches. The 2.0 results are not much different to the 1.5 results; a little bit less red in the blue patch, a bit less green in the red patch, but far less blue in the yellow patch.


    The “2.0 DNG” setting is more interesting. There doesn’t seem to be much documentation on what it does – the Apple Aperture site itself is silent on the subject, and various third party sites have words to the effect of “changes to the image using the 2.0 DNG converter are made based on the DNG specification of the file”. This implies that rather than using the Aperture color conversion parameters, setting the DNG mode will give you the colors as set by the ColorMatrix values embedded in the DNG. As it turns out however, that’s just not the case – if it were, we’d see values that looked like Lightroom, but what we see are just some subtle changes to the “2.0” profile. Although visible if you change the setting on the fly, the change is actually more subtle than the change between 1.5 and 2.0.




    Finally, there is Capture One. During the course of this process, Capture One 4.0.1 came out; the results shown here are for 4.0.1, but they are identical to those for 4.0; as far as I can tell, no changes have been made to color rendering between versions. Capture One provides two profiles, one Generic, and one UVIR, designed to match the M8’s color rendering when mounted with a UVIR filter. The differences between these two are there, but they are quite subtle. Overall however, there are significant differences from the rendering of either Lightroom or Aperture. Capture One shows less red for most patches, especially the red patch, but more red in the cyan patch. Finally, there is generally somewhat less saturation for most colors. This is broadly consistent with most people’s views of Capture One’s rendering as being “less red” than Lightroom’s.

    In the next post I'll show the same charts for actual rather than synthetic images.

  5. In part 1 of this series, I promised to show the tone curves for the various raw developers that I'm looking at. Here they are:


    The Lightroom curves, for various settings of brightness and contrast - brightness has by far the most pronounced impact on the image. In Lightroom, to get to a linear curve, you need to do three things - set brightness to zero, set contrast to zero, and select "Tone Curve - Flat" from the presets.


    Then the Aperture curve showing the effect of the Aperture boost setting; the effect is essentially the same as the Lightroom brightness setting. To get a linear curve from Aperture, all you have to do is to set boost to zero.


    Finally, the Capture One V4 setting; Capture One is a bit different to Lightroom and Aperture. Where Lightroom and Aperture have slider settings that are non-zero by default (brightness and boost respectively), on Capture One all settings default to zero. However, also by default, Capture One loads a "Film Standard" tone curve, which has a very similar effect to the other two programs' non-zero settings. To get rid of the curve, all you need to do is to select the "Linear Response" Curve setting.


    The last set of curves shows a comparison of the default curve for each program, all referred back to an sRGB/2.2 gamma curve to make them comparable. While all the curves are about the same shape, there's a distinct difference in the "aggressiveness" of each curve. Lightroom/ACR adds the most brightness in the mid tones, Capture One the least, and Aperture's about in the middle. We'll come back to this issue later in this series, but the next step is to use these tone curves to allow us to calibrate colors from the GretagMacbeth test chart.

    And actually using the tone curves to calibrate colors is quite easy. All that's involved is the following two steps:
    1. Convert the l*a*b* color values for the GretagMacBeth patches to RGB in the color space of the program in question
    2. Use the tone curve to adjust the RGB values in accordance with the curve
    Once we've done this, we have RGB values that are what should be displayed for each program, if the color calibration is correct. I've done this for the three programs I'm testing - a spreadsheet with the values is posted here: http://chromasoft.googlepages.com/calibrationspreadsheets.
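    As a sketch of step 2, the tone curve adjustment is just a lookup against the measured curve. The curve sample points and the patch values below are invented for illustration:

```python
import numpy as np

# Step 1 (assumed done): linear RGB for a patch in the program's
# working space, on a 0-100 scale.
linear_rgb = np.array([42.0, 30.0, 25.0])

# Step 2: push the values through the measured tone curve. The
# (input, output) pairs here are a made-up sample of the kind of
# curve read off the stepwedge measurements; np.interp does the
# piecewise-linear lookup.
curve_in  = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
curve_out = np.array([0.0, 32.0, 60.0, 82.0, 100.0])

expected_readout = np.interp(linear_rgb, curve_in, curve_out)
```

    The resulting values are what the cursor readout should show for that patch if the program's color calibration is correct.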

    In the next post in this series, I'll take a look at how each program compares.

  6. Just before the Christmas break, I decided to spend some time over the holidays comparing Adobe’s Photoshop Lightroom, Apple’s Aperture and Phase One’s Capture One raw developer/digital asset management products. By way of background, I’ve been a Lightroom user since the early betas, and really like Lightroom’s workflow management. But I’ve never been happy with Lightroom’s color rendition, and have not had the time to really dig into why I wasn’t getting what I wanted. By just before Christmas, I was sufficiently frustrated that, despite what I like about Lightroom, I was in “there’s got to be a better way than this” mode – ready for change.

    Now one person’s great color rendition is another person’s nightmare, so there isn’t any point in trying to just play with the sliders till something nice comes out – at least not for me. Also, on some products there are eighteen sliders, all of which interact. I accept that there may be people out there who can look at an image, move a few sliders, and get what they want, but that isn’t me. So what I’ve chosen to do is to look at color rendition in a more “scientific” way, and ask three questions:

    1. How close is the default rendition of each product to a 24-patch GretagMacBeth color checker?
    2. How easy is it, using each product, to calibrate the rendition to match as exactly as possible the theoretical values of the GretagMacBeth test chart, as printed on the instruction sheet that comes with the chart?
    3. How usable is the calibration that I’ve created – is it easy to transfer to other images, how sensitive is it to changes in exposure settings, etc.

    While calibrating to a GretagMacBeth chart doesn't mean I'll have "good" color, at least it's a consistent starting point. And yes, I accept that questions 1 and 2 are fairly simple and objective questions, but that question 3 starts to get more subjective.

    As perhaps I might have expected this turned out to be a far more complex process than I’d thought, and eventually pulled me into comparing the three products quite a bit more broadly, for example, as regards the performance of their Bayer interpolation engines.

    For the record, the software versions used for this comparison were:

    • Lightroom 1.3.1 Camera Raw 4.3.1
    • Aperture 1.5.4 and 2.0
    • Capture One 4.0.14154.14152 and 4.0.1.14900.14887

    Now the first issue that comes up when trying to do this is a simple one - given that, for example, the skin patch on a GretagMacBeth chart has the l*a*b* color values of (65.711, 18.13, 17.81), what does that actually mean in terms of the RGB values that we should expect to get from the cursor readout in each program? A little experimentation will show that this isn't a simple question.

    Actually there are two issues with trying to calibrate imaging software - in what units is the readout, and secondly what adjustments are being applied to the image. Typically, when a raw developer loads a TIFF file, it does so without adjustment, but usually applies some kind of tone curve when loading a raw file.

    The readout units for the programs that I've calibrated are:

    1. Lightroom: Melissa RGB - Melissa RGB is the combination of the ProPhoto primaries and the sRGB gamma curve. It's also known as "bastard RGB", as it's the bastard child of ProPhoto and sRGB.
    2. Aperture: Wide Gamut. (Note: this was correct at the time this post was originally written, but see the comments below - it's now Adobe RGB. However, the color rendition information is still correct.)
    3. Capture One: Capture One uses whatever color space is set as its output space, so you can set it to any ICM profile you have. For a bunch of ICM profile you can use, see my ICM Profiles page.
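    For those wanting to reproduce readouts like Melissa RGB numerically, the gamma part is the standard sRGB transfer curve. A minimal sketch of the encode and decode functions:

```python
def srgb_encode(v):
    """sRGB transfer curve: linear [0, 1] -> gamma encoded [0, 1]."""
    if v <= 0.0031308:
        return 12.92 * v
    return 1.055 * v ** (1.0 / 2.4) - 0.055

def srgb_decode(v):
    """Inverse: gamma encoded [0, 1] -> linear [0, 1]."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# A mid-grey linear value and what the encoded readout would be
encoded = srgb_encode(0.18)
```

    Melissa RGB applies exactly this curve, but to values expressed against the ProPhoto primaries rather than the sRGB ones.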

    The adjustments made by default are more complex - generally, each raw developer has its own tone curve, and also its own default brightness settings.



    This is the graph of the ACR 3 default tone curve, as extracted from the Adobe DNG toolkit - it shows the flattening at the top and bottom of the curve, and also the default brightness setting.



    What I've done to get real tone curves from the packages I'm looking at is to use the monochrome stepwedge reference image (shown above, and available in DNG format on my Reference images page) to work out what the tone curve is.
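    The stepwedge method boils down to pairing the known linear patch values with each program's readouts; those pairs are the sampled tone curve, and swapping the axes inverts it. All numbers below are hypothetical:

```python
import numpy as np

# Known linear values of the stepwedge patches, and what the raw
# developer reports for each after its default tone curve is applied
# (both on a 0-100 scale; these readouts are made up for illustration).
wedge_linear = np.array([0.0, 12.5, 25.0, 50.0, 75.0, 100.0])
readout      = np.array([0.0, 21.0, 36.0, 61.0, 82.0, 100.0])

# Evaluate the sampled curve at an arbitrary input...
curved = np.interp(40.0, wedge_linear, readout)
# ...or invert it (swap the axes) to map readouts back to linear.
linearized = np.interp(61.0, readout, wedge_linear)
```

    The inverted form is the useful one for calibration, since it lets you strip the program's tone curve back out of its cursor readouts.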

    In part 2, I'll show those curves.


  7. At long last, I've done something I've had on my to-do list for a long time now, which is to create a web space where I can put various files that might be useful to other people. It's at http://chromasoft.googlepages.com/. The first thing that I've posted there are two papers I wrote several years ago. The first goes into the mathematics of color spaces as used by ICC profiles, and color conversion between color spaces. Pretty much any serious imaging software package will use color space descriptions like this, either implicitly, or explicitly as is the case for Phase One's Capture One product.

    Although the papers are largely written from the point of view of color conversion on monitors, for display purposes, all the maths is exactly the same as for cameras. The second paper shows how the shape of the CIE color space can be modeled in three dimensions. These two documents were originally created in MathCad, which is a mathematical modeling package from PTC. It allows the document to contain live mathematical equations, so that you can check that your maths actually works, and foots to real answers.

    Both of these papers are available either in Adobe PDF form or as MathCad documents. The MathCad documents are live, so allow real calculations to be made. However, they require that you have MathCad.

  8. Well, I learned something new this morning, thanks to a question that Baxter Bradford asked over on the LUF. What he asked was (in effect) whether the various Adobe raw products (Adobe Camera Raw, Lightroom) would pick up on a changed camera profile in a DNG file. This was in regard to the Leica M8, which changed its camera calibration data after its IR sensitivity problems were discovered.

    The way color works in a DNG file is that there are two pairs of color matrices:
    1. ColorMatrix1 and ColorMatrix2. These two provide color calibration at two different color temperatures; in order to set an intermediate temperature, a linear interpolation is used.
    2. CameraCalibration1 and CameraCalibration2. These are used to provide color calibration that is specific to the individual camera, rather than to the camera model.
    The color temperature adjusted ColorMatrix and CameraCalibration matrices are multiplied together to get an overall color conversion matrix. In most DNGs the CameraCalibration matrices are not used (set to an identity matrix, technically) - the only DNGs that I've seen using these are for an Olympus E-3.
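    A sketch of how a DNG reader combines these matrices, under the usual simplification of ignoring tags such as AnalogBalance: interpolate the two ColorMatrix tags on inverse color temperature (mired), then multiply by the CameraCalibration matrix. The matrix values and calibration temperatures below are placeholders, not real tag contents:

```python
import numpy as np

def interpolated_color_matrix(cm1, cm2, t1, t2, t):
    """Linearly interpolate between the two ColorMatrix tags on
    inverse color temperature (mired); cm1/cm2 are calibrated at
    temperatures t1/t2 (Kelvin), t is the target temperature."""
    m1, m2, m = 1e6 / t1, 1e6 / t2, 1e6 / t
    w = np.clip((m - m2) / (m1 - m2), 0.0, 1.0)
    return w * cm1 + (1.0 - w) * cm2

# Placeholder matrices; real ones come from the DNG tags.
cm1 = np.eye(3) * 1.1   # e.g. calibrated at 2850K (Standard A)
cm2 = np.eye(3) * 0.9   # e.g. calibrated at 6500K (D65)
cc  = np.eye(3)         # CameraCalibration, identity in most DNGs

cm = interpolated_color_matrix(cm1, cm2, 2850.0, 6500.0, 5000.0)
xyz_to_camera = cc @ cm  # overall XYZ-to-camera conversion matrix
```

    At either calibration temperature the interpolation returns the corresponding ColorMatrix exactly, which is the behavior the spec intends.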

    Up till this morning, I would have automatically responded to Baxter's question to the effect that ACR and LR read the color matrices in the DNG, and since Leica modified those after the IR problems were found, ACR/LR's color calibration will in effect have changed to match the IR filter adjusted sensitivity. However, I'd never actually checked that. So today I did, by overriding the ColorMatrix values in a test DNG, which should give weird color. And it made no difference, which was quite unexpected. Then I also overrode the EXIF camera name to "unknown", and then, guess what, weird color, as expected. After a bit more digging, what I found is:
    1. If ACR/Lightroom recognizes the camera name in the DNG, it ignores the ColorMatrix matrices, but still uses the CameraCalibration matrices.
    2. If ACR/Lightroom does not recognize the camera name, it uses both the ColorMatrix matrices and the CameraCalibration matrices as contained in the DNG.
    So the bottom line is, even if ACR/LR are reading a DNG, if either program sees a camera name they recognize, you will get an Adobe Camera Raw color calibration, not what's in the DNG. Only if they don't recognize the camera name will they use the DNG values. However, ACR/Lightroom always honor the CameraCalibration matrices.

    To confirm this, I took a look inside the LR/ACR code, and "Leica M8 digital Camera" is indeed listed in there.


  9. As part of my journey into digital imaging, I found myself writing CornerFix, which can be found on http://sourceforge.net/projects/cornerfix/. The image is a screen shot of the Mac version.

    CornerFix corrects for color dependent vignetting in digital images, which shows as cyan colored corners, as in the image on the left hand side of CornerFix's main window in the screen shot. The image in the screen shot comes from an M8 with a CV12 lens and IR filter on it. All digital cameras are to some extent subject to this; current generation sensors are highly IR sensitive, so there needs to be an IR filter somewhere. But the combination of sensors and IR filters also cuts into the red part of the spectrum, and does so in a way that depends on the angle through which the light bends as it travels to the sensor. So red gets cut more in the corners. Most DSLRs do this a bit - take a picture of a white wall and you will probably see it, although many cameras correct internally to a greater or lesser extent.
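    To make the correction concrete, here's a toy single-channel sketch of the kind of thing involved: boost the red plane by a gain that grows with distance from the image center. The quadratic falloff model and its strength are assumptions for illustration, not CornerFix's actual model:

```python
import numpy as np

# Toy red plane of an image whose corners have lost red (cyan cast).
h, w = 100, 150
y, x = np.mgrid[0:h, 0:w]
# Normalized radius: 0.0 at the image center, 1.0 at the corners.
r = np.hypot(x - w / 2, y - h / 2) / np.hypot(w / 2, h / 2)

k = 0.35                  # hypothetical falloff strength
gain = 1.0 + k * r ** 2   # boost red more toward the corners

red_plane = np.full((h, w), 0.5)
corrected = np.clip(red_plane * gain, 0.0, 1.0)
```

    A real correction is derived per lens from a reference shot of an evenly lit surface, rather than from an assumed analytic falloff.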

    Leica's M8 has a particular problem in this regard. Historically, one of the advantages that rangefinder cameras had was that the back of a rangefinder lens can be a lot closer to the film surface than is the case for a film SLR - the SLR needs space for the mirror, which a rangefinder doesn't have. So in the film world, rangefinder lenses had a lot more design freedom than SLR lenses - wide-angle lenses that on an SLR had to be reverse telephoto designs could be normal designs on a rangefinder. Ironically, in the digital world, this turns into a disadvantage. Because the back of the lens can be a lot closer to the film, the angle is more acute, and you get a worse cyan corner problem than a DSLR does.

    M8s can correct for this problem themselves, but there are two issues for users - firstly, your lenses must be coded, which means each must be a Leica lens, either new enough to have been coded when it was manufactured, or one that has been sent to the factory to be upgraded. So anyone with a non-Leica lens or one that can't be upgraded is out of luck, unless they somehow code the lens themselves. The second problem is that the M8's correction is "one size fits all"; it's designed for average situations, and sometimes doesn't do well in unusual lighting.

    CornerFix allows the cyan corners problem to be corrected for any lens, in more or less any lighting situation, by post processing the camera's image file. The right hand side of the main window in the screen shot shows the corrected image. CornerFix is available for the Mac and for Windows. It's free and open source, released under the GPL. Note however that the image must be a DNG formatted file - CornerFix doesn't work with TIFF or JPEG files.

    For those interested in the technical details, the core of CornerFix, which is common to both the Mac and PC versions, is written in "pure" C++, and uses the Adobe DNG toolkit to decode files. The GUI of the Windows version is written in C++/.Net, and the GUI of the Mac version in Cocoa. For those REALLY interested in the technical details, as CornerFix is GPL, you can download all the source code from the site above.

  10. Over the past year or so, I've been involving myself more and more in the world of digital imaging. Photography isn't new to me - I was using a rangefinder while in school, developing and printing my own work. Neither is technology new to me - once upon a time, I co-founded a start-up that built data acquisition systems, and ran all its R&D and engineering functions. A data acquisition system is essentially what a digital camera is - a sensor, analog to digital converters, and a way of storing and displaying what you get. And I've been programming on and off since then - not as a core part of what I do, but to support what I do. Things like simulations of various parts of the global financial system.

    What I've found is that digital imaging is in a fascinating space; it's only just - say over the past 2-3 years - got out of what I call the "Bear" phase. Not bear in the financial markets sense, but in the sense of "people look at a dancing bear not because of how well it dances, but because it dances at all". The technology has now reached the point that it's mostly better than film. But there's still lots of innovation coming down the track. Up until recently, digital cameras concentrated on just being more convenient than, and as good as, film. So digital imaging was focussed on being "just like film" only better. That's what's changing now. Things like sensor resolution - the "more megapixels race" - are close to done; in the high end DSLR format, we're up against the limits of lenses and basic physics. So now what's starting to happen is things like the D-Lighting on Nikon's new generation DSLRs. Something that has no analog in the film world. Similarly, some of the capabilities being built into raw developers have no analog in the darkroom - e.g., Lightroom's "smart vibrance", a vibrance control that can recognize skin tones, and leave them untouched while colors around the skin are made brighter.

    Along the way, I also wrote CornerFix, which I talk about more in another post. That re-involved me in Windows C++ programming, and involved me for the first time in programming for the Mac, which was a journey in and of itself. So, while I'm active on some of the photography forums - e.g., the LUF (www.l-camera-forum.com) - this blog is where I talk about some of the deep technology issues, some of the really broad "where are we going" stuff, or about some of the programming that I do. So that's what this is about - a mix of deep imaging technology, what's happening to the photography market, and stuff about Windows and Mac software development.

About Me
Author of AccuRaw, PhotoRaw, CornerFix, pcdMagic, pcdtojpeg, dcpTool, WinDat Opener and occasional photographer....