Summary

Quantitative measurements of colour, pattern and morphology are vital to a growing range of disciplines. Digital cameras are readily available and already widely used for making these measurements, having numerous advantages over other techniques, such as spectrometry. However, off-the-shelf consumer cameras are designed to produce images for human viewing, meaning that their uncalibrated photographs cannot be used for making reliable, quantitative measurements. Many studies still fail to appreciate this, and of those scientists who are aware of such issues, many are hindered by a lack of usable tools for making objective measurements from photographs.

We have developed an image processing toolbox that generates images that are linear with respect to radiance from the RAW files of numerous camera brands and can combine image channels from multispectral cameras, including additional ultraviolet photographs. Images are then normalised using one or more grey standards to control for lighting conditions. This enables objective measures of reflectance and colour using a wide range of consumer cameras. Furthermore, if the camera's spectral sensitivities are known, the software can convert images to correspond to the visual system (cone-catch values) of a wide range of animals, enabling human and non-human visual systems to be modelled.

The toolbox also provides image analysis tools that can extract luminance (lightness), colour and pattern information. All processing is performed on 32-bit floating point images rather than the commonly used 8-bit images. This increases precision and reduces the likelihood of data loss through rounding error or saturation of pixels, while also facilitating the measurement of objects with shiny or fluorescent properties.

All cameras tested with this software showed a linear response within each image and across a range of exposure times. Cone-catch mapping functions were highly robust, converting images to several animal visual systems and yielding data that agreed closely with spectrometer-based estimates.

Our imaging toolbox is freely available as an addition to the open source ImageJ software. We believe that it will considerably enhance the appropriate use of digital cameras across multiple areas of biology, in particular among researchers aiming to quantify animal and plant visual signals.
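The toolbox itself is distributed as an ImageJ plugin, so the sketch below is only a minimal Python/NumPy illustration of the two steps the summary describes: scaling a linearised image against a grey standard of known reflectance, and applying a camera-to-receptor mapping to obtain cone-catch values. The function names, array shapes, the 5% standard reflectance and the purely linear form of the mapping are assumptions made for this example, not the published implementation.

```python
# Illustrative sketch only; the actual toolbox is an ImageJ plugin.
import numpy as np

def normalise_to_reflectance(linear_img, standard_mask, standard_reflectance=0.05):
    """Scale a linear (radiance-proportional) image so that the mean pixel
    value inside the grey-standard region equals its known reflectance.

    linear_img           : float32 array, shape (H, W, C), linear with radiance
    standard_mask        : bool array, shape (H, W), True over the grey standard
    standard_reflectance : known reflectance of the standard (e.g. 0.05 for 5%)
    """
    img = linear_img.astype(np.float32)
    # Mean value of each channel over the grey-standard region
    standard_means = img[standard_mask].mean(axis=0)          # shape (C,)
    # Per-channel scaling so the standard maps to its known reflectance;
    # 32-bit floats allow values above 1.0 (e.g. specular highlights or
    # fluorescent patches) without clipping.
    return img * (standard_reflectance / standard_means)

def to_cone_catch(reflectance_img, mapping_matrix):
    """Apply a linear camera-to-receptor mapping (a common simplification;
    the published method may include additional fitted terms).

    reflectance_img : float32 array, shape (H, W, C_camera)
    mapping_matrix  : array, shape (C_camera, N_receptors), fitted from the
                      camera's and the animal's spectral sensitivities
    """
    h, w, c = reflectance_img.shape
    flat = reflectance_img.reshape(-1, c)
    cones = flat @ mapping_matrix
    return cones.reshape(h, w, -1).astype(np.float32)
```

A typical workflow would normalise the linearised multispectral image first and then apply a mapping matrix fitted for the target visual system; the toolbox's own fitting procedure is described in the paper itself.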
The acuity of compound eyes is determined by interommatidial angles, optical quality, and rhabdom dimensions. It is also affected by light levels and speed of movement. In insects, interommatidial angles vary from tens of degrees in Apterygota to as little as 0.24 degrees in dragonflies. Resolution better than this is not attainable in compound eyes of realistic size. The smaller the interommatidial angle, the greater the distance at which objects (prey, predators, or foliage) can be resolved. Insects with different lifestyles have contrasting patterns of interommatidial angle distribution, related to forward flight, capture on the wing, and predation on horizontal surfaces.
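As a rough illustration of that angular relationship, the sketch below estimates the distance at which a target of a given size subtends one interommatidial angle. The target size and the subtense criterion are assumptions for the example, not values from the text; real detection thresholds also depend on contrast, light level and motion.

```python
import math

def resolvable_distance(target_size_m, interommatidial_deg):
    """Distance at which a target of the given size subtends exactly one
    interommatidial angle (a crude proxy for the limit of resolution)."""
    delta_phi = math.radians(interommatidial_deg)
    return target_size_m / (2.0 * math.tan(delta_phi / 2.0))

# Example: a 10 mm target seen with a dragonfly-like 0.24 degree spacing
# versus a coarse 10 degree spacing (values illustrative only).
print(round(resolvable_distance(0.01, 0.24), 2))   # ~2.39 m
print(round(resolvable_distance(0.01, 10.0), 3))   # ~0.057 m
```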