
      Biologically inspired microlens array camera for high-speed and high-sensitivity imaging

          Abstract

Nocturnal and crepuscular fast-eyed insects often exploit multiple optical channels and temporal summation for fast, low-light imaging. Here, we report a high-speed and high-sensitivity microlens array camera (HS-MAC) inspired by the multiple optical channels and temporal summation of insect vision. HS-MAC features cross-talk-free offset microlens arrays on a single rolling-shutter CMOS image sensor and performs high-speed, high-sensitivity imaging through channel fragmentation, temporal summation, and compressive frame reconstruction. Experimental results demonstrate that HS-MAC accurately measures the speed of a color disk rotating at 1950 rpm, recording fast sequences at 9120 fps with a low noise-equivalent irradiance (0.43 μW/cm²). In addition, HS-MAC visualizes the necking pinch-off of a pool fire flame in dim-light conditions below one thousandth of a lux. This compact high-speed low-light camera offers a distinct route to high-speed and low-light imaging in mobile, surveillance, and biomedical applications.


A 9120-fps biologically inspired low-light camera has been developed.
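As a rough illustration of the channel-fragmentation and temporal-summation ideas described in the abstract, the minimal NumPy sketch below interleaves sub-frames from several time-offset microlens channels into one fast sequence and then sums consecutive frames to trade temporal resolution for sensitivity. The channel count, time offsets, and frame sizes are made-up toy values, and the paper's compressive frame reconstruction step is not reproduced here.

```python
import numpy as np

def temporal_summation(subframes, window=4):
    """Sum `window` consecutive sub-frames to trade temporal
    resolution for sensitivity (higher SNR in dim light)."""
    subframes = np.asarray(subframes, dtype=np.float32)
    n = (len(subframes) // window) * window
    return subframes[:n].reshape(-1, window, *subframes.shape[1:]).sum(axis=1)

def interleave_channels(channel_stacks, offsets):
    """Order sub-frames from several time-offset channels into one
    high-speed sequence (frame rate scales with channel count)."""
    tagged = []
    for stack, offset in zip(channel_stacks, offsets):
        for k, frame in enumerate(stack):
            tagged.append((offset + k, frame))  # hypothetical per-channel timing
    tagged.sort(key=lambda t: t[0])
    return np.stack([frame for _, frame in tagged])

# Toy usage: 4 channels, 8 sub-frames each, staggered by 1/4 of a frame period.
rng = np.random.default_rng(0)
channels = [rng.random((8, 32, 32)) for _ in range(4)]
fast_sequence = interleave_channels(channels, offsets=[0.0, 0.25, 0.5, 0.75])
bright_frames = temporal_summation(fast_sequence, window=4)
print(fast_sequence.shape, bright_frames.shape)  # (32, 32, 32) (8, 32, 32)
```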


Most cited references (46)


          Fast gradient-based algorithms for constrained total variation image denoising and deblurring problems.

This paper studies gradient-based schemes for image denoising and deblurring problems based on the discretized total variation (TV) minimization model with constraints. We derive a fast algorithm for the constrained TV-based image deblurring problem. To achieve this task, we combine an acceleration of the well-known dual approach to the denoising problem with a novel monotone version of a fast iterative shrinkage/thresholding algorithm (FISTA) we have recently introduced. The resulting gradient-based algorithm shares a remarkable simplicity with a proven global rate of convergence that is significantly better than currently known gradient-projection-based methods. Our results are applicable to both the anisotropic and isotropic discretized TV functionals. Initial numerical results demonstrate the viability and efficiency of the proposed algorithms on image deblurring problems with box constraints.
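For readers unfamiliar with FISTA, the sketch below shows the generic accelerated proximal-gradient iteration on a toy ℓ1-regularized least-squares problem. The soft-thresholding prox merely stands in for the dual-based TV prox used in the cited paper; the monotone (MFISTA) modification and box constraints it describes are not implemented here.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1 (stand-in for the TV prox)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista(A, b, lam, L, n_iter=200):
    """FISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1.
    L is a Lipschitz constant of the smooth gradient, e.g. ||A||_2^2."""
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                         # gradient of the smooth term at y
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov momentum
        x, t = x_new, t_new
    return x

# Toy usage: recover a sparse signal from a random underdetermined system.
rng = np.random.default_rng(1)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = 1.0
b = A @ x_true + 0.01 * rng.normal(size=60)
L = np.linalg.norm(A, 2) ** 2
x_hat = fista(A, b, lam=0.1, L=L)
print(np.round(x_hat[np.abs(x_hat) > 0.5], 2))
```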

            Digital cameras with designs inspired by the arthropod eye.

            In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

              Optical imaging techniques for point-of-care diagnostics.

              Improving access to effective and affordable healthcare has long been a global endeavor. In this quest, the development of cost-effective and easy-to-use medical testing equipment that enables rapid and accurate diagnosis is essential to reduce the time and costs associated with healthcare services. To this end, point-of-care (POC) diagnostics plays a crucial role in healthcare delivery in both developed and developing countries by bringing medical testing to patients, or to sites near patients. As the diagnosis of a wide range of diseases, including various types of cancers and many endemics, relies on optical techniques, numerous compact and cost-effective optical imaging platforms have been developed in recent years for use at the POC. Here, we review the state-of-the-art optical imaging techniques that can have a significant impact on global health by facilitating effective and affordable POC diagnostics.

                Author and article information

                Contributors
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Software, Supervision, Validation, Visualization, Writing - original draft, Writing - review & editing
Roles: Formal analysis, Investigation, Validation
Roles: Investigation, Validation
Roles: Methodology, Validation
Roles: Formal analysis, Investigation, Methodology, Validation, Visualization, Writing - review & editing
Roles: Methodology
Roles: Conceptualization, Funding acquisition, Methodology, Project administration, Validation, Writing - review & editing
Roles: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing - original draft, Writing - review & editing
Roles: Conceptualization, Formal analysis, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing - original draft, Writing - review & editing
Journal
Science Advances (Sci Adv)
American Association for the Advancement of Science
ISSN: 2375-2548
Published online: 03 January 2025; issue date: 01 January 2025
Volume 11, Issue 1: eads3389
                Affiliations
[1] Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea.
[2] KAIST Institute for Health Science and Technology (KIHST), Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea.
[3] Unmanned Ground Systems Team, LIGNex1, 333 Pangyo-ro, Bundang-gu, Seongnam-si, Gyeonggi-do 13488, Republic of Korea.
[4] School of Computing, Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea.
                Author notes
[*] Corresponding author. Email: minhkim@kaist.ac.kr (M.H.K.); kjeong@kaist.ac.kr (K.-H.J.)
                Author information
                https://orcid.org/0000-0002-3020-7040
                https://orcid.org/0009-0008-8758-9781
                https://orcid.org/0009-0002-1854-157X
                https://orcid.org/0000-0002-6259-070X
                https://orcid.org/0000-0002-6970-0967
                https://orcid.org/0009-0008-7075-5456
                https://orcid.org/0000-0002-3749-7570
                https://orcid.org/0000-0002-5078-4005
                https://orcid.org/0000-0003-4799-7816
Article
Article ID: ads3389
DOI: 10.1126/sciadv.ads3389
PMCID: PMC11691699
PMID: 39742496
3b38df86-d846-48c3-a099-47be0f235927
                Copyright © 2025 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works. Distributed under a Creative Commons Attribution NonCommercial License 4.0 (CC BY-NC).

                This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial license, which permits use, distribution, and reproduction in any medium, so long as the resultant use is not for commercial advantage and provided the original work is properly cited.

History
Received: 16 August 2024
Accepted: 19 November 2024
                Funding
                Funded by: FundRef http://dx.doi.org/10.13039/501100003052, Ministry of Trade, Industry and Energy;
                Award ID: RS-2024-00432381
                Funded by: FundRef http://dx.doi.org/10.13039/501100014188, Ministry of Science and ICT, South Korea;
                Award ID: 2022M3H4A4085645
                Funded by: FundRef http://dx.doi.org/10.13039/501100014188, Ministry of Science and ICT, South Korea;
                Award ID: RS-2024-00438316
                Funded by: FundRef http://dx.doi.org/10.13039/501100014188, Ministry of Science and ICT, South Korea;
                Award ID: 2021R1A2B5B03002428
Funded by: Korea Research Institute for defense Technology planning and advancement;
                Award ID: 11-102-103-021
                Categories
                Research Article
                Physical and Materials Sciences
Engineering
Applied Sciences and Engineering
