Methods and apparatus for detecting a presence and severity of a cataract in ambient lighting

  • Publication Date:
    April 30, 2024
  • Additional Information
    • Patent Number:
      11,969,212
    • Appl. No:
      18/175,925
    • Application Filed:
      February 28, 2023
    • Abstract:
      Disclosed herein are methods and apparatus for making a determination about a cataract in an eye in ambient lighting conditions.
    • Inventors:
      Ohio State Innovation Foundation (Columbus, OH, US)
    • Assignees:
      Ohio State Innovation Foundation (Columbus, OH, US)
    • Claim:
      1. A method of detecting cataracts in an eye of a subject, the method comprising: capturing, by an image capture device, an image of the eye of the subject, wherein the image is captured using only ambient lighting conditions; determining, by a computing device in communication with the image capture device, ocular opacification and brunescence in the eye of the subject based on the image; and determining, by the computing device, a presence and severity of a cataract in the eye of the subject based on the determined ocular opacification and brunescence.
    • Claim:
      2. The method of claim 1, wherein ocular opacification and brunescence are determined based on reflected ambient light, and wherein determining ocular opacification and brunescence in the eye of the subject comprises: determining, by the computing device, an overall intensity of light from a plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; determining, by the computing device, a first intensity of a first color from the plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; determining, by the computing device, a second intensity of a second color from the plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; and comparing, by the computing device, a relative intensity of the first color and a relative intensity of the second color, wherein ocular opacification and brunescence are determined based on the comparison and the overall intensity.
    • Claim:
      3. The method of claim 2, further comprising: determining, by the computing device, a color temperature of the ambient lighting, wherein at least one of the overall intensity of light, the first intensity of the first color, or the second intensity of the second color is adjusted by the computing device based on the determined color temperature of the ambient lighting.
    • Claim:
      4. The method of claim 1, wherein the determination of ocular opacification and brunescence in the eye of the subject is based on a brightness and one or more colors of reflected ambient light.
    • Claim:
      5. The method of claim 4, wherein the brightness is adjusted by the computing device based on a determined color temperature of the ambient lighting.
    • Claim:
      6. The method of claim 1, wherein the ocular opacification and the brunescence are graded according to clinical severity and effect of the cataract on visual acuity.
    • Claim:
      7. The method of claim 1, further comprising: managing non-relevant reflections in the image of the eye of the subject by at least one of applying a polarizing filter over a lens of the image capture device, blocking external light sources, or providing a surface that absorbs light to prevent the non-relevant reflections.
    • Claim:
      8. The method of claim 1, wherein ocular opacification and brunescence are determined based on a relative intensity and noise of RGB pixel values in the image after correcting for a color temperature of the ambient lighting.
    • Claim:
      9. The method of claim 1, wherein the determination of ocular opacification and brunescence is made based on an unmodified version of the image of the eye of the subject.
    • Claim:
      10. The method of claim 1, wherein the cataract is a nuclear cataract.
    • Claim:
      11. The method of claim 1, wherein the image capture device and the computing device are part of a smartphone.
    • Claim:
      12. A device for detecting cataracts in an eye of a subject, the device comprising: an image capture unit; a processor; and memory having instructions stored thereon that, when executed by the processor, cause the device to: capture, by the image capture unit, an image of the eye of the subject, wherein the image is captured using only ambient lighting conditions; determine ocular opacification and brunescence in the eye of the subject based on the image; and determine a presence and severity of a cataract in the eye of the subject based on the determined ocular opacification and brunescence.
    • Claim:
      13. The device of claim 12, wherein ocular opacification and brunescence are determined based on reflected ambient light, and wherein to determine ocular opacification and brunescence in the eye of the subject, the instructions further cause the device to: determine an overall intensity of light from a plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; determine a first intensity of a first color from the plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; determine a second intensity of a second color from the plurality of pixels located within the portion of the pupil of the eye of the subject captured in the image; and compare a relative intensity of the first color and a relative intensity of the second color, wherein ocular opacification and brunescence are determined based on the comparison and the overall intensity.
    • Claim:
      14. The device of claim 13, wherein the determination of ocular opacification and brunescence in the eye of the subject is based on a brightness and one or more colors of reflected ambient light, and wherein the brightness is adjusted based on a determined color temperature of the ambient lighting.
    • Claim:
      15. The device of claim 12, wherein the ocular opacification and the brunescence are graded according to clinical severity and effect of the cataract on visual acuity.
    • Claim:
      16. The device of claim 12, wherein, in conjunction with capturing the image of the eye of the subject, non-relevant reflections are managed by at least one of applying a polarizing filter over a lens of the image capture device, blocking external light sources, or providing a surface that absorbs light to prevent the non-relevant reflections.
    • Claim:
      17. The device of claim 12, wherein ocular opacification and the brunescence are determined based on a relative intensity and noise of RGB pixel values in the image after correcting for a color temperature of the ambient lighting.
    • Claim:
      18. The device of claim 12, wherein the determination of ocular opacification and the brunescence is made based on an unmodified version of the image of the eye of the subject.
    • Claim:
      19. The device of claim 12, wherein the cataract is a nuclear cataract.
    • Claim:
      20. The device of claim 12, wherein the device is a smartphone.
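    • Illustrative Sketch:
      For orientation only, the minimal Python sketch below illustrates the kind of computation recited in claims 2, 3, and 8: an overall intensity, the relative intensities of two colors, and the pixel-to-pixel noise of RGB values taken from pupil pixels after correcting for the color temperature of the ambient light. The function name, the per-channel gain model of the color-temperature correction, the choice of red and blue as the two compared colors, and the synthetic test data are assumptions made for illustration; the patent record does not publish an implementation.

      # Hedged sketch of the analysis recited in claims 2-3 and 8; all names,
      # gains, and the red/blue pairing are illustrative assumptions.
      import numpy as np

      def analyze_pupil_pixels(pupil_rgb, color_temp_gains=(1.0, 1.0, 1.0)):
          """pupil_rgb: (N, 3) array of RGB values in [0, 1] sampled from inside
          the pupil. color_temp_gains: assumed per-channel white-balance gains
          standing in for the color-temperature correction of claim 3."""
          corrected = np.asarray(pupil_rgb, dtype=float) * np.asarray(color_temp_gains)

          # Overall intensity of light across the pupil pixels (claim 2).
          overall = corrected.mean()

          # Relative intensities of a first and second color (claim 2); red and
          # blue are used here because brunescence shifts the lens toward
          # yellow-brown, which is an interpretive assumption.
          red_rel = corrected[:, 0].mean() / (overall + 1e-9)
          blue_rel = corrected[:, 2].mean() / (overall + 1e-9)

          # Pixel-to-pixel spread of the corrected RGB values, a stand-in for
          # the "noise of RGB pixel values" of claim 8.
          rgb_noise = corrected.std(axis=0).mean()

          return {"overall_intensity": float(overall),
                  "red_to_blue_ratio": float(red_rel / (blue_rel + 1e-9)),
                  "rgb_noise": float(rgb_noise)}

      if __name__ == "__main__":
          # Toy usage: 500 synthetic pupil pixels with a slight red bias.
          rng = np.random.default_rng(0)
          pixels = np.clip(rng.normal([0.25, 0.18, 0.15], 0.03, (500, 3)), 0.0, 1.0)
          print(analyze_pupil_pixels(pixels, color_temp_gains=(0.95, 1.0, 1.10)))

      Working with relative rather than absolute color intensities is one plausible way to reduce dependence on how bright the ambient lighting happens to be, consistent with the claims' emphasis on comparisons and on correcting for color temperature.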
    • Patent References Cited:
      4293198 October 1981 Kohayakawa
      5180907 January 1993 Udden et al.
      5329322 July 1994 Yancey
      5632282 May 1997 Hay et al.
      6095989 August 2000 Hay et al.
      6409342 June 2002 Ohnuma et al.
      6419638 July 2002 Hay et al.
      7641342 January 2010 Eberl et al.
      8585687 November 2013 Campbell
      8591027 November 2013 Su et al.
      8619405 December 2013 Van Heugten
      8630828 January 2014 Parker
      8632184 January 2014 Lai
      8851677 October 2014 Liebich
      10219687 March 2019 Wilkes
      10986991 April 2021 Bailey
      20020036750 March 2002 Eberl et al.
      20030048929 March 2003 Golden et al.
      20030058405 March 2003 Cornsweet et al.
      20040156554 August 2004 McIntyre
      20050057723 March 2005 Wakil et al.
      20060077581 April 2006 Schwiegerlin et al.
      20070076294 April 2007 Kitajima
      20100026957 February 2010 Tanguay, Jr. et al.
      20110091084 April 2011 Li et al.
      20110279679 November 2011 Samuel et al.
      20130135181 May 2013 Eberl et al.
      20140111630 April 2014 Pires et al.
      20140160433 June 2014 Brown et al.
      20160019420 January 2016 Feng
      20160128559 May 2016 Bailey
      20170118403 April 2017 Chu et al.
      20190033140 January 2019 Hu et al.
      20210196118 July 2021 Bailey
      104068827 October 2014
      110448267 November 2019
      201931016829 October 2020
      H0310903 February 1991
      H08-266471 October 1996
      H11299734 November 1999
      2004-261212 September 2004
      2004358111 December 2004
      2011-110253 June 2011
      2012-016606 January 2012
      2013-48376 March 2013
      2014-151024 August 2014
      WO2006/010611 February 2006
      WO2007/069294 June 2007
      WO2008/145786 December 2008
      2009142601 November 2009
      WO2011/100544 August 2011
      WO2013/059663 April 2013
      WO2013/096473 June 2013
      WO2014/175154 October 2014
      2016073887 May 2016
    • Other References:
      Harris, M. L., et al. “Analysis of retro-illumination photographs for use in longitudinal studies of cataract.” Eye 7.4 (1993): 572-577. cited by examiner
      M. Kaur, J. Kaur and R. Kaur, “Low cost cataract detection system using smart phone,” 2015 International Conference on Green Computing and Internet of Things (ICGCIoT), Greater Noida, India, 2015, pp. 1607-1609, doi: 10.1109/ICGCIoT.2015.7380724. cited by examiner
      J. Rana and S. M. Galib, “Cataract detection using smartphone,” 2017 3rd International Conference on Electrical Information and Communication Technology (EICT), Khulna, Bangladesh, 2017, pp. 1-4, doi: 10.1109/EICT.2017.8275136. (Year: 2017). cited by examiner
      Agarwal, Vaibhav, et al. “Mobile application based cataract detection system.” 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI). IEEE, 2019. (Year: 2019). cited by examiner
      Fuadah, Yunendah Nur, Agung W. Setiawan, and Tati LR Mengko. “Mobile cataract detection using optimal combination of statistical texture analysis.” 2015 4th international conference on instrumentation, communications, information technology, and biomedical engineering (ICICI-BME). IEEE, 2015. cited by examiner
      Blauensteiner, P., Wildenauer, H., Hanbury, A., & Kampel, M. (2006). On colour spaces for change detection and shadow suppression. Computer Vision Winter Workshop, Czech Pattern Recognition Society, Telc, Czech Republic, Feb. 6-8, 2006, 6 pages. cited by applicant
      Chen, Ying-Ling, Bo Tan, and J. Lewis. “Simulation of eccentric photorefraction images.” Optics express 11.14 (2003): 1628-1642. cited by applicant
      Cibis, Gerhard W. “Video vision development assessment (VVDA): combining the Brückner test with eccentric photorefraction for dynamic identification of amblyogenic factors in infants and children.” Transactions of the American Ophthalmological Society 92 (1994): 643. cited by applicant
      De la Cruz Cardona, Juan, José Ramón Jiménez, and Ma del Mar Pérez. “Colorimetric Analysis of Eccentric Photorefraction Techniques.” Jul. 28, 2011. 4 pages. cited by applicant
      R. Bruckner: “Exakte Strabismus Diagnostik bei 1/2-3jährigen Kindern mit einem einfachen Verfahren, dem …”, Ophthalmologica, vol. 144, No. 3, Jan. 1, 1962, pp. 184-198. (English translation not provided since listed as an “A” reference in the EESR. Applicant will obtain an English translation upon request.). cited by applicant
      Hanbury, Allan. “A 3D-polar coordinate colour representation well adapted to image analysis.” Image Analysis. Springer Berlin Heidelberg, 2003. 804-811. cited by applicant
      Office Action and its translation issued for Japanese Application No. 2017-524377, dated Aug. 5, 2019. cited by applicant
      Extended European Search Report issued for European Application No. 15856408, dated Jul. 26, 2018. cited by applicant
      International Search Report and Written Opinion of the International Searching Authority, Application No. PCT/US2015/059529, dated Mar. 11, 2016, 13 pages. cited by applicant
      International Search Report and Written Opinion issued for Application No. PCT/US2019/068644, dated Feb. 27, 2020. cited by applicant
      International Search Report and Written Opinion issued for Application No. PCT/US2019/068646, dated Mar. 3, 2020. cited by applicant
      Notice of Allowance issued for U.S. Appl. No. 16/250,592, dated Feb. 4, 2021. cited by applicant
      Office Action issued for Japanese Patent Application No. 2020-187167, dated Mar. 22, 2022. cited by applicant
      Office Action issued for Canadian Application No. 3,004,408, dated Nov. 5, 2021. cited by applicant
      Office Action issued for Japanese Application No. 2020-187167, dated Aug. 10, 2021. cited by applicant
      Office Action issued for Japanese Application No. 2017-524377, dated Apr. 20, 2020. cited by applicant
      Corrected Notice of Allowance issued for U.S. Appl. No. 17/240,212, dated Jan. 12, 2023. cited by applicant
      Corrected Notice of Allowance issued for U.S. Appl. No. 17/240,212, dated Jan. 9, 2023. cited by applicant
      Corrected Notice of Allowance issued for U.S. Appl. No. 17/240,212, dated Jan. 5, 2023. cited by applicant
      Notice of Allowance issued for U.S. Appl. No. 17/240,212, dated Dec. 29, 2022. cited by applicant
      Office Action issued for U.S. Appl. No. 17/240,212, dated Aug. 17, 2022. cited by applicant
      Notice of Allowance issued for U.S. Appl. No. 16/728,217, dated Dec. 19, 2022. cited by applicant
      Notice of Allowance issued for U.S. Appl. No. 16/728,217, dated Nov. 30, 2022. cited by applicant
      Office Action issued for U.S. Appl. No. 16/728,217, dated Aug. 15, 2022. cited by applicant
      Notice of Allowance issued for Japanese Application No. 2020-187167, dated Oct. 17, 2022. cited by applicant
      Extended European Search Report issued for Application No. 19957857.6, dated Aug. 3, 2023. cited by applicant
      Extended European Search Report issued for Application No. 19957419.5, dated Sep. 5, 2023. cited by applicant
      Sigit, Riyanto, Elvi Triyana, and Mochammad Rochmad. “Cataract detection using single layer perceptron based on smartphone.” 2019 3rd International Conference on Informatics and Computational Sciences (ICICoS). IEEE, 2019. cited by applicant
      Office Action for Japanese Application No. 2022-182734, dated Feb. 5, 2024. cited by applicant
    • Primary Examiner:
      Liu, Li
    • Attorney, Agent or Firm:
      Meunier Carlin & Curfman LLC
    • Identifier:
      edspgr.11969212