Petteri Teikari, PhD
Singapore Eye Research Institute (SERI)
Visual Neurosciences group
http://petteri-teikari.com/
Version "Wed 10 October 2018"
Optical designs
for fundus imaging
From traditional desktop
to novel optical design in small
form factors
“Traditional”
Fundus
Imaging
Optical
Designs
Intro to
Fundus Optics
Design
2009
Fundus camera systems: a comparative analysis
https://doi.org/10.1364/AO.48.000221
Edward DeHoog and James Schwiegerling
Applied Optics Vol. 48, Issue 2, pp. 221-228 (2009)
Retinal photography requires the use of a
complex optical system, called a fundus
camera, capable of illuminating and imaging
the retina simultaneously. The patent literature
shows two design forms but does not
provide the specifics necessary for a thorough
analysis of the designs to be performed.
We have constructed our own designs based
on the patent literature in optical design
software and compared them for illumination
efficiency, image quality, ability to
accommodate for patient refractive error, and
manufacturing tolerances, a comparison
lacking in the existing literature.
external
illumination
design
internal
illumination
design
Tolerance analysis must always be considered when
determining which system is able to perform a specific task
better. Systems with high performance metrics but extremely
tight or impossible tolerances are likely to be passed over for
production or redesigned to make manufacturing easier.
Kidger, Intermediate Optical Design (SPIE, 2004)
Shannon, The Art and Science of Optical Design (1997)
CVI Melles Griot, "Optical fabrication tolerances"
Rochester Precision Optics, "Traditional optics capability"
Results of 100 Monte Carlo Trials of Fundus Camera Systems
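To make the tolerancing argument concrete, a toy Monte Carlo sketch is shown below (illustrative only; DeHoog and Schwiegerling ran their 100 trials in optical design software on the actual camera prescriptions, and the tolerance values and merit function here are assumptions):

# Toy Monte Carlo tolerancing sketch (illustrative; the paper's trials were run in
# optical design software on the real fundus camera prescriptions).
import numpy as np

rng = np.random.default_rng(0)

def merit(radius_err_mm, decenter_mm, tilt_deg):
    # Placeholder merit: treats spot growth as a quadratic penalty on the perturbations.
    return np.sqrt((radius_err_mm / 0.02)**2 + (decenter_mm / 0.05)**2 + (tilt_deg / 0.1)**2)

n_trials = 100
samples = merit(
    rng.uniform(-0.02, 0.02, n_trials),   # radius error tolerance, mm (assumed)
    rng.uniform(-0.05, 0.05, n_trials),   # element decenter tolerance, mm (assumed)
    rng.uniform(-0.1, 0.1, n_trials),     # element tilt tolerance, degrees (assumed)
)
# A design whose 90th-percentile merit stays acceptable is manufacturable;
# impossibly tight tolerances show up as a wide, heavy-tailed distribution here.
print("median:", np.median(samples), "90th percentile:", np.percentile(samples, 90))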
"Retinal imaging presents a unique difficulty considering that the retina must be illuminated and imaged simultaneously, a process which forces illumination and imaging systems to share a common optical path. Because the retina is a minimally reflective surface, the power of the back reflections from the shared optics of the illumination and imaging paths is greater than the power reflected by the retina."
Traditional
fundus camera
design
Optical diagram of a traditional digital fundus camera with its three basic modules: objective, illumination and camera.
De Matos et al. (2015), http://doi.org/10.1117/12.2190515
De Oliveira et al. (2016), http://doi.org/10.1117/12.2236973
Fundus
Cameras
Commercial
Landscape #1
Vishwanath Manik Rathod
http://raiith.iith.ac.in/4141/1/Thesis_Mtech_EE_4141.pdf
Fundus
Cameras
Commercial
Landscape #2
Vishwanath Manik Rathod
http://raiith.iith.ac.in/4141/1/Thesis_Mtech_EE_4141.pdf
NextSight Nexy Tutorial - fundus camera, SIR Oftalmica
https://youtu.be/JxmFyhFRN3g
Nexy Robotic Retinal Imaging System Receives FDA ... - Eyewire News
Global Fundus Cameras Market to be worth USD 620 Million By 2024 - Zion Market Research
https://globenewswire.com/news-release/2018/08/19/1553691/0/en/Global-Fundus-Cameras-Market-to-be-worth-USD-620-Million-By-2024-Zion-Market-Research.html
Fundus Cameras Market: by Product Type (Mydriatic Fundus Cameras
[Tabletop and Handheld], Non-mydriatic Fundus Cameras [Tabletop and
Handheld], Hybrid Fundus Cameras, and ROP Fundus Cameras) and by
End User (Hospitals, Ophthalmology Clinics, and Others): Global Industry
Perspective, Comprehensive Analysis and Forecast, 2018-2024
Pen-like fundus
camera design
December 2009
US8836778B2 Portable fundus camera
https://patents.google.com/patent/US8836778B2/en
Filipp V. Ignatovich, David M. Kleinman, Christopher T. Cotton, Todd Blalock; Lumetrics Inc.
Legalese description: “Camera for imaging the fundus of an eye, the camera comprising optics aligned along an
imaging axis intersecting a point on the fundus and configured to focus light reflected back from the fundus onto an image
receptor, wherein the optics are capable of varying a field of view of the camera along a path circumferentially around the
point on the fundus, whereby the image receptor acquires images of portions of the fundus located at different peripheral
locations around the point of the fundus"
Spectral
Characterization
fundus camera
September 2010
Spectral characterization of an ophthalmic fundus camera
https://doi.org/10.1117/12.844855
Clayton T. Miller; Carl J. Bassi; Dale Brodsky; Timothy Holmes
This work describes the characterization of
one system, the Topcon TRC-50F, necessary
for converting this camera from film
photography to spectral imaging with a
CCD. This conversion consists of replacing the
camera's original xenon flash tube with a
monochromatic light source and the film back
with a CCD. A critical preliminary step of this
modification is determining the spectral
throughput of the system, from source to
sensor, and ensuring there are sufficient
photons at the sensor for imaging.
Dynamic
Artifacts
Cardiac gating for fundus imaging
2003
Time Course of Fundus Reflection Changes According to the Cardiac Cycle
https://iovs.arvojournals.org/article.aspx?articleid=2413124
R. P. Tornow; O. Kopp; B. Schultheiss
To compare the time course of fundus reflection from
video sequence (25 frames/sec) at different retinal
locations with cardiac parameters.
The pulsatile reflection component ΔR(t)/R(t) changes corresponding to the cardiac cycle. ΔR(t)/R(t) rises suddenly during systole, reaches its maximum after about 32% of the pulse duration time (RR-interval) and decreases towards the end of the diastole. The pulse shape of ΔR(t)/R(t) shows a high correspondence to the cardiac impedance signal while it is different from the pulse shapes of the peripheral impedance signals.
The reflection of the ocular fundus depends on the cardiac cycle. The simultaneous assessment of ΔR(t)/R(t) and the impedance signals allows correlation of parameters of ocular microcirculation with cardiac parameters and distinction of physiologically induced reflection changes from artifacts. More than this, the pulsatile reflection amplitude has to be taken into consideration for quantitative imaging like retinal densitometry.
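A minimal sketch of how such a pulsatile component can be extracted from a registered fundus video sequence (a generic illustration with synthetic data, not the authors' analysis pipeline):

# Generic sketch: pulsatile fundus reflection dR(t)/R(t) from a registered video.
import numpy as np

def pulsatile_component(frames, roi):
    """frames: (T, H, W) registered grayscale video; roi: boolean mask of the region."""
    r_t = frames[:, roi].mean(axis=1)        # mean reflection R(t) in the ROI
    r_mean = r_t.mean()
    return (r_t - r_mean) / r_mean           # dR(t)/R(t), the cardiac-cycle component

# Example with synthetic data: 250 frames at 25 fps, ~1 Hz pulse on a constant level.
fps = 25
t = np.arange(250) / fps
frames = 100.0 + 2.0 * np.sin(2 * np.pi * 1.0 * t)[:, None, None] * np.ones((1, 32, 32))
roi = np.ones((32, 32), dtype=bool)
dr = pulsatile_component(frames, roi)
print("pulsatile amplitude (peak-to-peak):", dr.max() - dr.min())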
November 2016
Retinal venous pulsation: Expanding our understanding and use of this enigmatic phenomenon
https://doi.org/10.1016/j.preteyeres.2016.06.003
William H. Morgan, Martin L. Hazelton, Dao-Yi Yu
Recently, improved ophthalmodynamometry and video
recording techniques have allowed us to explore the
fundamentals of retinal vein pulsation. This demonstrates that
retinal venous collapse is in phase with both IOP and CSFP
diastole, indicating the dependence upon CSFP pulse. We
describe in some detail the mathematical and physical models of
Starling resistors and how their results can be applied to
understand the physiology of retinal vein pulsation.
October 2017
Automatic Detection of Spontaneous Venous Pulsations Using Retinal Image Sequences
https://doi.org/10.1007/978-3-319-68195-5_90
Michal Hracho, Radim Kolar, Jan Odstrcilik, Ivana Liberdova, Ralf P. Tornow
The magnitude of spontaneous venous pulsation has been shown to correlate with the occurrence of glaucoma. Based on this relation, a method is proposed that might help to detect glaucoma via detection of spontaneous venous pulsation.
Modify
Existing fundus
camera for custom
applications
2010
High-resolution hyperspectral imaging of the retina with a modified fundus camera
Nourrit V, Denniss J, Muqit MM, Schiessl I, Fenerty C, Stanga PE, Henson DB.
http://doi.org/10.1016/j.jfo.2010.10.010
This paper gives information on how to convert a standard fundus camera into a hyperspectral camera with off-the-shelf elements (CCD camera, liquid crystal filter, optical fibre and slide lamp projector).
Technically, its main limitation is the low transmission of the filter (20% max for unpolarized light below 650 nm), which limits imaging below 460 nm.
Remidio
Smartphone-based
commercialized
fundus camera
https://doi.org/10.1038/s41433-018-0064-9
http://remidio.com/
June 2011
US9398851B2 Retinal imaging device
https://patents.google.com/patent/US9398851B2/en
https://patents.google.com/patent/WO2018047198A3/en
Sivaraman Anand, Kummaya Pramod, Nagarajan Shanmuganathan
Remidio Innovative Solutions Pvt Ltd
Mobile Vision
A Portable, Scalable Retinal Imaging System
TI Engibous Competition Report
Rice University
2012
mobileVision system
http://www.ti.com/corp/docs/university/docs/Rice_University_mobileVision%20Final%20Report.pdf
Fundus Self-Imaging "Eye Selfie" from MIT
February 2012
US9295388B2 Methods and apparatus for retinal imaging
https://patents.google.com/patent/US9295388B2/en
Matthew Everett Lawson, Ramesh Raskar
Massachusetts Institute of Technology
This invention comprises apparatus for retinal self-
imaging. Visual stimuli help the user self-align his eye with a
camera. Bi-ocular coupling induces the test eye to rotate into
different positions. As the test eye rotates, a video is captured
of different areas of the retina. Computational
photography methods process this video into a mosaiced
image of a large area of the retina. An LED is pressed
against the skin near the eye, to provide indirect,
diffuse illumination of the retina. The camera has a wide
field of view, and can image part of the retina even when the
eye is off-axis (when the eye's pupillary axis and camera's
optical axis are not aligned). Alternately, the retina is
illuminated directly through the pupil, and different parts of
a large lens are used to image different parts of the retina.
Alternately, a plenoptic camera is used for retinal imaging.
Computational photography techniques are
used to process the multiple images and to
produce a mosaiced image. These techniques
include (i) “Lucky” imaging, in which high-
pass filtering is used to identify images that
have the highest quality, and to discard poorer
quality images.
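A minimal sketch of the "lucky" frame-selection idea, scoring frames by high-pass energy (an illustration of the concept, not the MIT implementation):

# "Lucky" imaging sketch: rank frames by high-pass energy and keep the sharpest ones.
import numpy as np
from scipy import ndimage

def sharpness(frame, sigma=2.0):
    """High-pass energy: variance of (frame - Gaussian-blurred frame)."""
    highpass = frame - ndimage.gaussian_filter(frame, sigma)
    return highpass.var()

def select_lucky_frames(frames, keep_fraction=0.2):
    """frames: (T, H, W) video array; returns the sharpest keep_fraction of frames."""
    scores = np.array([sharpness(f) for f in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[::-1][:n_keep]
    return frames[np.sort(best)]              # keep temporal order for mosaicking

# The selected frames would then be registered and mosaicked into a wide-field image.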
Fundus
Eye-Selfie
http://web.media.mit.edu/~tswedish/projects/eyeSelfie.html
T. Swedish, K. Roesch, I. K. Lee, K. Rastogi, S. Bernstein, R. Raskar. eyeSelfie: Self Directed Eye Alignment using Reciprocal Eye Box Imaging. Proc. of SIGGRAPH 2015 (ACM Transactions on Graphics 34, 4), 2015.
Self-aligned, mobile, non-mydriatic fundus photography. The user is presented with an alignment-dependent fixation cue on a ray-based display. Once correctly aligned, a self-acquired retinal image is captured. This retinal image can be used for health, security or HMD calibration. Illustration: Laura Piraino
https://youtu.be/HuXgrbwOjvM
https://www.economist.com/science-and-technology/2015/06/13/retina-selfie
Expert-free eye alignment and machine learning for predictive health. Tristan Breaden Swedish
https://dspace.mit.edu/handle/1721.1/112543
I will present a system that includes a novel method for eye self-alignment and automatic image analysis and evaluate its effectiveness when applied to a case study of a diabetic retinopathy screening program. This work is inspired by advances in machine learning that make accessible interactions previously confined to specialized environments and trained users. I will also suggest some new directions for future work based on this expert-free paradigm.
Fundus Self-Alignment
http://web.media.mit.edu/~tswedish/projects/eyeSelfie.html
"The subject aligns his/her own eye for optimal image"
Check the
patent trail
Cited By (14)
US20160302665A1 * 2015-04-17 2016-10-20 Massachusetts Institute Of Technology: Methods and Apparatus for Visual Cues for Eye Alignment
US20170000454A1 * 2015-03-16 2017-01-05 Magic Leap, Inc.: Methods and systems for diagnosing eyes using ultrasound
WO2009081498A1 * 2007-12-26 2009-07-02 Shimadzu Corporation: Organism image capturing device
EP2583619A1 * 2011-10-22 2013-04-24 SensoMotoric Instruments GmbH: Apparatus for monitoring one or more surgical parameters of the eye
US20150021228A1 2012-02-02 2015-01-22 Visunex Medical Systems Co., Ltd.: Eye imaging apparatus and systems
US9655517B2 2012-02-02 2017-05-23 Visunex Medical Systems Co. Ltd.: Portable eye imaging apparatus
US9351639B2 2012-03-17 2016-05-31 Visunex Medical Systems Co. Ltd.: Eye imaging apparatus with a wide field of view and related methods
US9237847B2 2014-02-11 2016-01-19 Welch Allyn, Inc.: Ophthalmoscope device
US9211064B2 2014-02-11 2015-12-15 Welch Allyn, Inc.: Fundus imaging system
US9986908B2 2014-06-23 2018-06-05 Visunex Medical Systems Co. Ltd.: Mechanical features of an eye imaging apparatus
US9675246B2 2014-09-15 2017-06-13 Welch Allyn, Inc.: Borescopic optical system for medical diagnostic instruments and medical diagnostic instruments having interlocking assembly features
EP3026884A1 * 2014-11-27 2016-06-01 Thomson Licensing: Plenoptic camera comprising a light emitting device
US9848773B2 2015-01-26 2017-12-26 Visunex Medical Systems Co. Ltd.: Disposable cap for an eye imaging apparatus and related methods
US20160320837A1 * 2015-05-01 2016-11-03 Massachusetts Institute Of Technology: Methods and Apparatus for Retinal Retroreflection Imaging
Design for
minimizing
stray light
e.g. use a polarized light source
November 2014
Design, simulation and experimental analysis of an anti-stray-light illumination system of a fundus camera
https://doi.org/10.1117/12.2073619
Chen Ma; Dewen Cheng; Chen Xu; Yongtian Wang
A fiber-coupled, ring-shaped light source that forms an
annular beam is used to make full use of light energy. The
parameters of the light source, namely its divergence angle
and size of the exit surface of the fiber rod coupler, are
determined via simulation in LightTools. Simulation results
show that the illumination uniformity of the fundus can reach 90% when a 3~6 mm annular spot on the cornea is
illuminated. It is also shown that a smaller divergence
angle (i.e., 30°) benefits both the uniformity irradiance of the
fundus image on focus plane (i.e., CCD) and the sharpness
of the image profile. To weaken the stray light, a
polarized light source is used, and an analyzer plate
whose light vector is perpendicular to the source is placed
after the beam splitter in the imaging system. Simulation
shows the average relative irradiance of stray light after stray
light elimination drops to 1%.
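The cross-polarization trick works because specular back reflections largely preserve the source polarization while fundus-scattered light is substantially depolarized; a back-of-the-envelope sketch with assumed numbers (the paper's 1% figure comes from its LightTools simulation, not from this calculation):

# Back-of-the-envelope: why a crossed analyzer suppresses ghosts more than signal.
# All numbers are illustrative assumptions, not values from the paper.
extinction_ratio = 1e-3      # leakage of a polarization-preserving beam through a crossed analyzer
depolarized_fraction = 0.7   # assumed fraction of retina-scattered light that is depolarized

ghost_through_analyzer = 1.0 * extinction_ratio        # specular back reflections (polarized)
signal_through_analyzer = depolarized_fraction * 0.5   # unpolarized light: about half passes
print("ghost transmission:", ghost_through_analyzer)
print("signal transmission:", signal_through_analyzer)
print("ghost suppression relative to signal: %.0fx" % (signal_through_analyzer / ghost_through_analyzer))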
PEEK Retina
Smartphone-based
fundus imaging
Panretinal 2.2
3D printed optics holders
2014
3D Printed Smartphone Indirect Lens Adapter for Rapid, High Quality Retinal Imaging
http://dx.doi.org/10.7309/jmtm.3.1.3
David Myung, Alexandre Jais, Lingmin He, Mark S. Blumenkranz, Robert T. Chang
Byers Eye Institute at Stanford, Stanford University School of Medicine
3D Ophthalmoscope
Design
2015
Study of optical design of three-dimensional digital ophthalmoscopes
https://doi.org/10.1364/AO.54.00E224
Yi-Chin Fang, Chih-Ta Yen, and Chin-Hsien Chu
LightTools diagram
A 3D optical-zoom sensory
system of the human eye with infrared
and visible light is proposed (Code V,
LightTools) to help the doctor diagnose
human eye diseases. The proposed
lens design for 3D digital
ophthalmoscopes provides a good
means of simultaneously
accessing infrared and visible
light band information to help doctors
performdiagnostics.
The diffraction limit lines in MTF plots
are almost >0.7 at spatial frequencies
up to 40 cycles/mm under all zoom
conditions in the IR region. According to
the experiment results, the proposed
3D digital ophthalmoscope is suitable
forfutureophthalmoscopedesign.
D-Eye
3D printed optics holders
2015
A Novel Device to Exploit the Smartphone Camera for Fundus Photography
http://dx.doi.org/10.1155/2015/823139
Andrea Russo, Francesco Morescalchi, Ciro Costagliola, Luisa Delcassi, and Francesco Semeraro
Exploded view of the D-Eye module (angles and
distances between components are approximated).
Retinal images are acquired using coaxial illumination
and imaging paths thanks to a beam splitter (C). The
blue arrow depicts the path of the light; red arrow
depicts the path of fundus imaging. Device
components are glass platelet (A) with imprinted
negative lens (A′), photo-absorbing wall (B), beam
splitter (C), mirror (D), plastic case (E), diaphragm (F),
polarized filters (G, H), flash and camera glass (J, I),
and magnetic external ring (K).
Tunable Liquid
Lens
with transpupillary illumination to simplify the optical design
2015
Accessible Digital Ophthalmoscopy Based on Liquid-Lens Technology
https://doi.org/10.1007/978-3-319-24571-3_68
Christos Bergeles, Pierre Berthet-Rayne, Philip McCormac, Luis C. Garcia-Peraza-Herrera, Kosy Onyenso, Fan Cao, Khushi Vyas, Melissa Berthelot, Guang-Zhong Yang
This paper demonstrates a new design integrating modern components for ophthalmoscopy. Simulations show that the optical elements can be reduced to just two lenses: an aspheric ophthalmoscopic lens and a commodity liquid lens, leading to a compact prototype.
Circularly polarised transpupillary illumination, with limited use so far for ophthalmoscopy, suppresses reflections, while autofocusing preserves image sharpness. Experiments with a human-eye model and cadaver porcine eyes demonstrate our prototype's clinical value and its potential for accessible imaging when cost is a limiting factor.
Simplifying
Optical
Design
"substitute the complex illumination system by a ring of LEDs mounted coaxially to the imaging optical system, positioning it in the place of the holed mirror of the traditional optical design."
September 2016
Evaluation of retinal illumination in coaxial fundus camera
https://doi.org/10.1117/12.2236973
André O. de Oliveira; Luciana de Matos; Jarbas C. Castro Neto
We evaluated the impact of this substitution on image quality (measured through the modulation transfer function) and on the illumination uniformity produced by this system on the retina.
The results showed there is no change in image quality and no problem was detected concerning uniformity compared to the traditional equipment. Consequently, we avoided off-axis components, easing the alignment of the equipment without reducing either image quality or illumination uniformity.
Photograph (left) and optical drawing (center) of the OEMI-7 Ocular Imaging Eye Model (Ocular Instruments Inc.). Picture of the OEMI-7 Ocular Imaging Eye Model (right) obtained using the innovative equipment. No obscuration is observed and the image is free of reflex.
Optimizing
Zemax tools for
efficient
modelling of
fundus cameras
November 2016
Minimising back reflections from the common path objective in a fundus camera
https://doi.org/10.1117/12.2256633
A. Swat, Solaris Optics S.A.
Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As there is very little light reflected from the retina, even excellent antireflective coatings do not provide sufficient suppression of ghost reflections; therefore the number of surfaces in the common optics of the illuminating and imaging paths shall be minimised.
Typically a single aspheric objective is used. In the paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back-reflection analysis comprises treating subsequent objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path.
There are also standard ghost-control merit function operands available in the sequential ray-trace, for example in the Zemax system, but these don't allow a back ray-trace in an alternative optical path (illumination vs. imaging). What is proposed in the paper is a complete method to incorporate ghost reflected energy into the ray tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.
Zemax software, commonly used among optical specialists, allows a macro script to be called from the system merit function via the ZPLM operand. The optimisation system therefore comprises the following (a conceptual sketch of the flux bookkeeping follows the list):
● The imaging system built in Zemax
● A typical merit function constructed to allow for the imaging system optimisation
● An additional line in the merit function to call a macro, operand ZPLM
● The macro called by the ZPLM operand opens a new file, where a second system is defined, including as many configurations as the number of surfaces in the common path suspected to generate parasitic back reflections. The macro evaluates the detector flux for each configuration and sums it up; the macro is closed and the total flux value is returned to the imaging system merit function. The returned flux becomes one of the merit function components being minimised among other imaging system properties; the weight is individually adjusted by the designer to balance system properties.
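Conceptually, the bookkeeping the macro performs looks like this (a plain-Python sketch of the logic only; the real implementation is a Zemax ZPL macro driving a multi-configuration ghost system, and all numbers below are dummies):

# Conceptual sketch of the ghost-flux merit-function component described above.
def total_ghost_flux(ghost_configurations):
    """ghost_configurations: callables, one per suspect surface, returning detector flux."""
    return sum(config() for config in ghost_configurations)

def system_merit(imaging_terms, ghost_configurations, ghost_weight=1.0):
    """imaging_terms: list of (weight, value) pairs from the ordinary merit function."""
    imaging_part = sum(w * v**2 for w, v in imaging_terms)
    return imaging_part + ghost_weight * total_ghost_flux(ghost_configurations)

# Example with dummy numbers: two imaging operands and three ghost-generating surfaces.
merit = system_merit(
    imaging_terms=[(1.0, 0.02), (0.5, 0.01)],
    ghost_configurations=[lambda: 1e-4, lambda: 3e-5, lambda: 8e-6],
    ghost_weight=50.0,                      # balanced by the designer, as in the paper
)
print("combined merit value:", merit)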
Startups
focusing on the
software stack
Nov. 2016
Phelcom [University of São Paulo (USP)], Smart Retinal Camera (SRC), a retinal scanner controlled by an attached smartphone
http://revistapesquisa.fapesp.br/en/2017/05/17/eye-on-the-smartphone/
The SRC is designed to perform three kinds of fundus exams: color, red-free, and fluorescein angiography (FA). According to Paulo Schor, professor in the Department of Ophthalmology of the Federal University of São Paulo (Unifesp), devices that rely on smartphones to perform eye exams do not belong to the future but to the present. "They're accessible – that is, easy to operate and cheap."
Miniaturized
nonmydriatic
fundus camera
design
March 2017
Optical design of portable nonmydriatic fundus camera
https://doi.org/10.1117/12.2268699
Weilin Chen; Jun Chang; Fengxian Lv; Yifan He; Xin Liu; Dajiang Wang
The ocular fundus is not self-luminous, and the reflectivity of the retina to visible light is about 0.1% to 10%. If the light-blocking effect of the pupil is considered, the reflectivity of the fundus is about 0.1% to 1%. Active illumination is therefore needed because of this low overall reflectivity.
The fundus camera uses two kinds of LED as light sources: one is a 590 nm LED and the other is an 808 nm LED. The pulsed 590 nm LED is used to illuminate the capillary vessels in the ocular fundus and take pictures, for the high contrast of the fundus images.
Schematic of annular illumination
To evaluate the performance of the lighting system, the
optimization results from Zemax were imported into
Lighttools, and the human eye model was also added
to perform a non-sequential ray tracing. The illumination
distribution in the fundus is shown.
Sony ICX282AQ CCD
Smartphone
Fundus imaging
with design choices laid out
2017
Optical Design of a Retinal Image Acquisition Device for Mobile Diabetic Retinopathy Assessment
https://doi.org/10.1007/978-3-319-24571-3_68
David Simões de Melo, Bachelor Degree in Engineering Sciences, NOVA University of Lisbon
Smartphone
Fundus imaging
with design choices laid out
2017
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer
https://doi.org/10.1155/2017/4526243
Bailey Y. Shen and Shizuo Mukai
Department of Ophthalmology, Illinois Eye and Ear Infirmary, University of Illinois at Chicago; Retina Service, Massachusetts Eye and Ear Infirmary, Harvard Medical School
We built a point-and-shoot prototype camera using a Raspberry Pi
computer, an infrared-sensitive camera board, a dual infrared and
white light light-emitting diode, a battery, a 5-inch touchscreen liquid
crystal display, and a disposable 20-diopter condensing lens. Our
prototype camera was based on indirect ophthalmoscopy with both
infrared and white lights. Results. The prototype camera weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20.
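A minimal capture sketch in the spirit of such a Raspberry Pi prototype (assuming the picamera library and GPIO-driven LEDs; the pin numbers, resolution, timing and file name are hypothetical, and the paper's actual control code may differ):

# Minimal Raspberry Pi sketch: align/focus under IR illumination, then fire the
# white LED for the color capture. Pin assignments and wiring are assumptions.
import time
import picamera
import RPi.GPIO as GPIO

IR_LED_PIN, WHITE_LED_PIN = 17, 27          # hypothetical BCM pin numbers
GPIO.setmode(GPIO.BCM)
GPIO.setup([IR_LED_PIN, WHITE_LED_PIN], GPIO.OUT, initial=GPIO.LOW)

with picamera.PiCamera(resolution=(1640, 1232)) as camera:
    GPIO.output(IR_LED_PIN, GPIO.HIGH)      # preview under IR so the pupil stays dilated
    camera.start_preview()
    time.sleep(5)                           # operator aligns the 20-diopter lens and eye
    GPIO.output(WHITE_LED_PIN, GPIO.HIGH)   # brief white-light exposure for the photo
    camera.capture("fundus.jpg")
    GPIO.output(WHITE_LED_PIN, GPIO.LOW)
    GPIO.output(IR_LED_PIN, GPIO.LOW)
GPIO.cleanup()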
Our prototype, or a camera based on our
prototype, may have myriad uses for health
professionals. For example, it may be useful
for ophthalmologists seeing inpatient consults,
as moving inpatients to a stationary fundus
camera can be impractical, and many
neurosurgery inpatients in the intensive care
unit are not allowed to be pharmacologically
dilated.
The comfort of nonmydriatic imaging may
make the camera useful for pediatric
ophthalmologists, although alignment might be
difficult. Finally, the low cost and small size of our
camera may make the camera a valuable tool
for ophthalmologists practicing global medicine.
With added features such as a large memory
card and a strong wireless card or cell phone
antenna, the device could help providers
practice telemedicine.
Open-source
optics design
blocks
accelerating basic design
2018
μCube: A Framework for 3D Printable Optomechanics
http://doi.org/10.5334/joh.8 | https://mdelmans.github.io/uCube
Mihails Delmans, Jim Haseloff (2018), University of Cambridge
Journal of Open Hardware, 2(1), p. 2
For attaching a commercial photo camera lens, a µTMountFace is used, which features a T-Mount adapter ring, obtained from a commercial T-Mount adapter. In the M12 CCTV camera lens version, both the lens and the Raspberry Pi Camera are held together by a single part. CAD design in OpenSCAD.
Modeling the
pupil/iris as the
imaging
aperture
May 18, 2018
The entrance pupil of the human eye
https://doi.org/10.1101/325548
Geoffrey Karl Aguirre
The precise appearance of the entrance pupil
is the consequence of the anatomical and
optical properties of the eye, and the relative
positions of the eye and the observer. This paper
presents a ray-traced (Matlab), exact model eye that provides the parameters of the entrance pupil ellipse for an observer at an arbitrary location and for an eye that has undergone biologically accurate rotation.
Calculation of the virtual image location of a pupil boundary point. This 2D schematic shows the cornea and a 2 mm radius pupil aperture of the model eye. A camera is positioned at a 45° viewing angle relative to the optical axis of the eye. The optical system is composed of the aqueous humor, the back and front surfaces of the cornea, and the air. We consider the set of rays that might originate from the edge of the pupil. Each of these rays departs from the pupil aperture at some angle with respect to the optical axis of the eye. We can trace these rays through the optical system.
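For intuition, here is a minimal 2D ray-refraction sketch of the step described in the caption (tracing a ray from the pupil edge through the two corneal surfaces into air). It is not Aguirre's Matlab model; the surface radii, thicknesses, indices and launch angle below are approximate illustrative values.

# Minimal 2D sketch: refract a ray from the pupil edge through the cornea (Snell's law).
import numpy as np

def refract(d, n, n1, n2):
    """Vector Snell refraction. d: unit incident direction, n: unit surface normal."""
    n = n if np.dot(d, n) < 0 else -n          # orient normal against the incident ray
    cos_i = -np.dot(d, n)
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0:
        return None                            # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

def hit_surface(p, d, c, R):
    """Forward intersection of ray p + t*d with a circular surface (center c, radius R)."""
    oc = p - c
    b = np.dot(d, oc)
    t = -b + np.sqrt(b**2 - (np.dot(oc, oc) - R**2))   # ray starts inside the circle
    return p + t * d

# Approximate geometry (mm), coordinates (y, z), optical axis along +z, pupil plane at z = 0
n_aqueous, n_cornea, n_air = 1.336, 1.376, 1.0
back_c,  back_R  = np.array([0.0, -3.50]), 6.5   # posterior corneal surface (vertex ~ z = 3.0)
front_c, front_R = np.array([0.0, -4.25]), 7.8   # anterior corneal surface (vertex ~ z = 3.55)

p = np.array([2.0, 0.0])                         # edge of a 2 mm radius pupil
d = np.array([np.sin(np.radians(10)), np.cos(np.radians(10))])   # 10 deg off-axis ray

for c, R, n1, n2 in [(back_c, back_R, n_aqueous, n_cornea),
                     (front_c, front_R, n_cornea, n_air)]:
    p = hit_surface(p, d, c, R)
    d = refract(d, (p - c) / R, n1, n2)
print("exit point (y, z):", p, "exit direction:", d)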
“Wide-Field”
Fundus
Imaging
Optical
Designs
Smartphone-based wide-field fundus imaging
July 2018
A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging
https://doi.org/10.1101/364265
Tyson Kim, Frank Myers, Clay Reber, PJ Loury, Panagiota Loumou, Doug Webster, Chris Echanique, Patrick Li, Jose Davila, Robi Maamari, Neil Switz, Jeremy Keenan, Maria Woodward, Yannis Paulus, Todd Margolis, Daniel Fletcher
Department of Ophthalmology and Visual Sciences, University of Michigan School of Medicine; Department of
Bioengineering and Biophysics Program, University of California, Berkeley; Department of Ophthalmology, University of
California, San Francisco; Department of Ophthalmology and Visual Sciences, Washington University School of
Medicine in St. Louis; Department of Physics and Astronomy, San José State University; Chan-Zuckerberg Biohub, San
Francisco, CA
High-quality, wide-field retinal imaging is a valuable method to screen preventable, vision-threatening diseases of the retina.
Smartphone-based retinal cameras hold promise for increasing access to retinal imaging, but variable image quality and restricted field of view can limit their utility. We developed and clinically tested a smartphone-based system that addresses these challenges with automation-assisted imaging.
The CellScope Retina system was designed to improve smartphone retinal imaging by combining automated fixation guidance, photomontage, and multi-colored illumination with optimized optics, user-tested ergonomics, and a touch-screen interface. System performance was evaluated from images of ophthalmic patients taken by non-ophthalmic personnel.
The fixation target is translated through a series of positions, re-orienting the patient's eyes and retina in a rapid and controllable fashion.
CellScope Retina was
capable of capturing and
stitching montage wide-
field, 100-degree images
of a broad variety of retinal
pathologies in the
nonoptimal imaging
conditions of an ophthalmic
consultation service and
emergency department
setting.
Phantom
development for
retinal imaging
January 2018
Quantifying Retinal Area in Ultra-Widefield Imaging Using a 3-Dimensional Printed Eye Model
https://doi.org/10.1016/j.oret.2017.03.011
Design of the model eye with an axial length of 24 mm, with section A-A representing the coronal plane and section B-B the sagittal plane (top left). The radius of the model is 13 mm (top right). The walls of the model eye have a thickness of 2 mm. Each model is made up of multiple rings centered at the posterior pole, with each ring separated by 9° as in the image. The top right image represents the sagittal plane and the bottom left image represents the coronal plane. The bottom right image represents the model eye viewed externally.
The grids in the original image (left) are traced using Photoshop CS2 (Adobe, San Jose, CA; middle). In this example, the line thickness is set at 5 pixels for ease of the reader; however, in determining the area, this was set at 1 pixel for increased accuracy. The traced image was used to determine the area of each ring in pixels using ImageJ (bottom).
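A minimal sketch of the pixel-counting step (an illustration only, not the authors' ImageJ workflow; the file name and mm-per-pixel calibration are assumptions):

# Count the pixels enclosed by each traced ring and convert to physical area.
import numpy as np
from PIL import Image
from scipy import ndimage

img = np.array(Image.open("traced_rings.png").convert("L"))   # hypothetical traced image
is_region = img > 128          # traced 1-px lines are dark, enclosed bands are bright
labels, n = ndimage.label(is_region)                          # one label per ring band
areas_px = ndimage.sum(is_region, labels, index=np.arange(1, n + 1))

mm_per_px = 0.05               # assumed calibration from the model-eye geometry
areas_mm2 = areas_px * mm_per_px ** 2
for i, a in enumerate(areas_mm2, start=1):
    print(f"ring band {i}: {a:.1f} mm^2")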
Wide-field
fundus image
quality in
clinical practice
2016
Posterior Segment Distortion in Ultra-Widefield Imaging Compared to Conventional Modalities
https://doi.org/10.3928/23258160-20160707-06
National Institute for Health Research Moorfields Biomedical Research Centre, Moorfields Eye Hospital and University College London Institute of Ophthalmology, London
2017
Can ultra-widefield retinal imaging replace colour digital stereoscopy for glaucoma detection?
https://doi.org/10.1080/09286586.2017.1351998
In conclusion, this study demonstrated almost
perfect agreement between colour digital
stereoscopy and the Optomap, an ultra-wide
field imaging technique when assessed by a
glaucoma specialist. It also showed the UWF
technique was reproducible in VCDR estimates.
Our data suggest that UWF imaging may be
suitable for diagnosing glaucoma in situations in
which slit-lamp biomicroscopy or digital colour
stereoscopy are not available and further research
about the comparative diagnostic performance of
UWF and other imaging technologies may be
warranted.
2018
Peripheral Retinal Imaging Biomarkers for
Alzheimer’s Disease: A Pilot Study?
https://doi.org/10.1159/000487053
Whether ultra-widefield (UWF, Optos P200C AF)
retinal imaging can identify biomarkers for
Alzheimer’s disease (AD) and its progression. …
after clinical progression over 2 years,
suggesting that monitoring pathological changes in
the peripheral retina might become a valuable tool
in AD monitoring.
The proposed averaging of images taken 90° apart can
improve the quality of the images obtained using the Optos
system. An acknowledgment and correction of this posterior
segment distortion will increase the accuracy that the
revolutionary Optos system has to offer.
Clinical Reviews, April 2016
ULTRA-WIDEFIELD FUNDUS IMAGING: A Review of Clinical Applications and Future Trends
http://doi.org/10.1097/IAE.0000000000000937
Over the last 40 years, several innovative
approaches to wide-angle fundus imaging have
been introduced. These include the Pomerantzeff
camera, the Panoret-1000, the RetCam, and various
contact and noncontact lens-based systems.
These instruments can provide 100° to 160°
panoramic photographs using either traditional
fundus photography or confocal SLO (cSLO).
A major disadvantage of several of these
approaches, including the Pomerantzeff camera,
the Panoret-1000, the RetCam, and the Staurenghi
lens, is the utilization of a contact lens which
requires a skilled photographer to hold the
camera and lens in place during image acquisition.
A major source of frustration for retinal
physicians has been the difficulty associated with
creating fundus drawings in these electronic
systems (EHR). A potential solution would be the
seamless integration of an UWF color or
angiographic image into the examination note that
could be supplemented with annotations by
the physician to note the important findings.
Schematic illustration of ultra-widefield imaging (Optos) of the retina using an ellipsoidal mirror.
A laser light source is reflected off the galvanometer mirrors onto an ellipsoidal mirror. The second focal point of the mirror resides within the eye, which facilitates image acquisition anterior to the equator.
Optos ultra-widefield fluorescein angiography of proliferative diabetic retinopathy. Right (A) and left (B) eyes of a patient with scattered microaneurysms, peripheral capillary nonperfusion, and focal leakage consistent with neovascularization elsewhere. The peripheral neovascularization and nonperfusion are not detectable using traditional seven-field fundus imaging (C and D).
Deep learning
with wide-field
retinal imaging
2017
Accuracy of deep learning, a machine-learning technology, using ultra-wide-field fundus ophthalmoscopy for detecting rhegmatogenous retinal detachment
https://doi.org/10.1038/s41598-017-09891-x
This study had several limitations. When clarity of the eye is reduced because of severe cataract or dense vitreous haemorrhage, capturing images with Optos becomes challenging; thus, such cases were not included in this study.
July 2018
Deep-learning Classifier With an Ultrawide-field Scanning Laser Ophthalmoscope Detects Glaucoma Visual Field Severity
https://doi.org/10.1097/IJG.0000000000000988
To evaluate the accuracy of detecting glaucoma visual field defect severity using a deep-learning (DL) classifier with an ultrawide-field scanning laser ophthalmoscope.
May 2018
Accuracy of ultra-wide-field fundus ophthalmoscopy-assisted deep learning, a machine-learning technology, for detecting age-related macular degeneration
https://doi.org/10.1007/s10792-018-0940-0
A combination of DCNN with Optos images is not better than a medical examination; however, it can identify exudative AMD with a high level of accuracy. Our system is considered useful for screening and telemedicine.
“Animal”
Fundus
Imaging
Optical
Designs
Modeling the
optics of the rat
eye with ZEMAX
April 2011
Novel non-contact retina camera for the rat and its application to dynamic retinal vessel analysis
https://doi.org/10.1364/BOE.2.003094
A novel optical model of the rat eye was
developed for use with standard ZEMAX optical
design software, facilitating both sequential and
non-sequential modes. A retinal camera for the
rat was constructed using standard optical and
mechanical components. The addition of a
customized illumination unit with Xenon fiber-
coupled light source and existing standard software
enabled dynamic vessel analysis.
Optimizing
fundus image
quality for a rat
model
λpeak = 580 nm, hbw = 19 nm
October 2015
Investigating the influence of chromatic aberration and optical illumination bandwidth on fundus imaging in rats
https://doi.org/10.1117/1.JBO.20.10.106010
Noninvasive, high-resolution retinal imaging of
rodent models is highly desired for longitudinally
investigating the pathogenesis and
therapeutic strategies. However, due to severe
aberrations, the retinal image quality in
rodents can be much worse than that in
humans.
We numerically and experimentally investigated the
influence of chromatic aberration and optical
illumination bandwidth on retinal imaging. We
confirmed that the rat retinal image quality
decreased with increasing illumination bandwidth.
We achieved a retinal image resolution of 10 μm using a 19 nm illumination bandwidth centered at 580 nm in a home-built fundus camera.
Furthermore, we observed higher chromatic
aberration in albino rat eyes than in pigmented
rat eyes. This study provides a design guide for
high-resolution fundus camera for rodents. Our
method is also beneficial to dispersion
compensation in multiwavelength retinal imaging
applications.
Contact lenses
for water-
immersion
imaging
July 2018
Effect of a contact lens on mouse retinal in vivo imaging: Effective focal length changes and monochromatic aberrations
https://doi.org/10.1016/j.exer.2018.03.027
For in vivo mouse retinal imaging, especially with
Adaptive Optics instruments, application of a
contact lens (with GelTeal) is desirable, as it allows
maintenance of cornea hydration and helps to
prevent cataract formation during lengthy
imaging sessions.
However, since the refractive elements of the eye
(cornea and lens) serve as the objective for most in
vivo retinal imaging systems, the use of a contact
lens, even with 0 Dpt. refractive power, can alter
the system's optical properties.
In this investigation we examined the effective
focal length change and the aberrations that
arise from use of a contact lens.
Based on the ocular wavefront data we evaluated
the effect of the contact lens on the imaging system
performance as a function of the pupil size. These
results provide information for determining
optimum pupil size for retinal imaging without
adaptive optics, and raise critical issues for
design of mouse optical imaging systems
that incorporate contact lenses.
The effect of a contact lens and
gel on ocular aberration is
complex. In our system, the use
of a contact lens introduced
vertical coma and spherical
aberrations above those of the
native eye.
Tunable goggle
lens for rodent
models
July 2017
Optical modelling of a supplementary tunable air-spaced goggle lens for rodent eye imaging
https://doi.org/10.1371/journal.pone.0181111
In this study, we present the concept of a tunable
goggle lens, designed to compensate individual
ocular aberration for different rodent eye powers.
Ray tracing shows that lens-fitted goggles permit not only adjustment of individual eye powers, but also surpass conventional adaptive correction techniques over a large viewing angle, provided a minimum of two spaced liquids is used. We
believe that the overlooked advantage of the
3D lens function is a seminal finding for further
technological advancements in widefield retinal
imaging.
Example of a multi-element lens-fitted goggle rigidly fastened
to an optical system: The goggle lens having a corneal matching
index (see Jiang et al. 2016 for details) is made of a plurality of liquid
filled cavities, having a distinct refractive index, separated by elastic
surface membranes that enable a static correction of the eye by
restructuration of the rodent cornea.
Improving
contact lens
modelling itself
for all imaging
studies
2018
Nonpupil adaptive optics for visual simulation of a customized contact lens
https://doi.org/10.1364/AO.57.000E57
We present a method for determining the
deformable mirror profile to simulate the optical
effect of a customized contact lens in the central
visual field. Using nonpupil-conjugated adaptive
optics allows a wider field simulation compared to
traditional pupil-conjugated adaptive optics. For a
given contact lens, the mirror shape can be derived
analytically using Fermat’s principle of the
stationary optical path or numerically using
optimization in ray-tracing programs such as
Zemax.
Diagram of the aspheric contact lens and schematic eye model.
“Video-based”
Fundus
Imaging
Optical
Designs
Can’t really use
flashes of visible
light
The pupil constricts from the flash, and dynamic pupillometry can be done with the flash of commercial fundus cameras (see right →).
Not a problem unless you always image through the central 2 mm pupil, for example (Maxwellian optics).
2018
Pupillary Abnormalities with Varying Severity of Diabetic Retinopathy
https://doi.org/10.1117/12.2036970
Mukesh Jain, Sandeep Devan, Durgasri Jaisankar, Gayathri Swaminathan, Shahina Pardhan & Rajiv Raman
Pupil measurements were performed with a 1/3 inch infrared camera and flash light (10 Ws xenon flash lamp, Orion Fundus Camera, Nidek Technologies, Italy).
Spectral Power Distribution of Canon Speedlite 540EZ consumer xenon photographic flash
SPD at full intensity (1/1)
Xenon flashes are typically powered by capacitor banks. The more capacitors involved, the higher the time constant, thus the longer the flash duration.
http://www.fredmiranda.com/forum/topic/1485638
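A back-of-the-envelope sketch of that time-constant argument, assuming a simple RC discharge model with purely illustrative component values:

# Why more capacitance lengthens the flash: a xenon tube discharge is roughly an
# RC decay with time constant tau = R * C. Values are illustrative, not measured.
R_tube = 0.5                    # effective arc resistance of the tube, ohms (assumed)
C_single = 1000e-6              # one electrolytic capacitor, farads (assumed)

for n_caps in (1, 2, 4):        # capacitors wired in parallel add up
    C_total = n_caps * C_single
    tau_ms = R_tube * C_total * 1e3
    print(f"{n_caps} caps: C = {C_total*1e6:.0f} uF, tau ~ {tau_ms:.2f} ms")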
13 – Xenon flash tube on fundus camera design
http://doi.org/10.1167/iovs.12-10449
Kenneth Tran; Thomas A. Mendel; Kristina L. Holbrook; Paul A. Yates
Either
continuous IR
lighting or
optical design
to cope with the
small light-
adapted pupil
2004
Observation of the ocular fundus by an infrared-sensitive video camera after vitreoretinal surgery assisted by indocyanine green
https://www.ncbi.nlm.nih.gov/pubmed/12707597
F-10 confocal digital ophthalmoscope from NIDEK
http://usa.nidek.com/products/scanning-laser-ophthalmoscope/
Downside: Near-infrared video does not necessarily capture all the features of the fundus.
2008
US20100245765A1 Video infrared ophthalmoscope
https://patents.google.com/patent/US20100245765/en
David S. Dyer, James Higgins
Dyer Holdings LLC
March 2016
Fundus Photography in the 21st Century: A Review of Recent Technological Advances and Their Implications for Worldwide Healthcare
https://doi.org/10.1089/tmj.2015.0068
Nishtha Panwar, Philemon Huang, Jiaying Lee, Pearse A. Keane, Tjin Swee Chuan, Ashutosh Richhariya, Stephen Teoh, Tock Han Lim, and Rupesh Agrawal
Oculus ImageCam 2 Digital Slit Lamp Camera: Different segments of the eye, such as anterior segment, fundus, sclera, etc., can be conveniently imaged by setting suitable exposure time, light magnification, and white balance. Additional video sequences can be recorded by the high-resolution digital video camera in the beam path of the slit lamp.
Volk Pictor enables nonmydriatic fundus examination with an improved 40°
FoV. It has modifications allowing still images and videos of the optic disc,
macula, and retinal vasculature.
The Horus Scope from JedMed (St. Louis, MO) is a hand-held
ophthalmoscopic adaptor for viewing the retina and capturing video and still
images that can be easily transferred to a personal computer.
Riester Ri-Screen Multifunctional Digital Camera System This slit
lamp-based system, along with attachable ophthalmoscopic lens, enables
retinal ophthalmic imaging and nonmydriatic eye fundus examination. The
Riester (Jungingen, Germany) Ri-Screen provides digital images and video to
support screening and documentation of ocular lesions and anomalies.
Smartphone-based approach from Harvard Medical School, Boston (Haddock et al. 2013). They used the iPhone camera's built-in flash for acquiring images and an external 20D lens for focusing. They used the Filmic Pro app (£5.99) for control of light intensity, exposure, and focus. A Koeppe lens was used for imaging patients under anesthesia. Still images were then retrieved from the recorded video (like with D-Eye). When imaging the fundus of rabbits, 28D or 30D lenses have been shown to give better results.
Stripe-field
method
SPIE BiOS, 2014
Non-mydriatic, widefield, fundus video camera
https://doi.org/10.1117/12.2036970
Bernhard Hoeher; Peter Voigtmann; Georg Michelson; Bernhard Schmauss
We describe a method we call "stripe field
imaging" that is capable of capturing wide
field color fundus videos and images of the
human eye at pupil sizes of 2 mm.
We designed the demonstrator as a low-cost
device consisting of mass market
components to show that there is no major
additional technical outlay to realize the
improvements we propose. The technical
core idea of our method is breaking the
rotational symmetry in the optical design that is
given in many conventional fundus cameras.
By this measure we could extend the possible
field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a field of 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes at vessels in the region of the papilla and a change of the paleness of the papilla.
Stripe-field method: 1st and 2nd Purkinje reflections focused to the unused lower black stripe; 4th Purkinje reflection focused to the upper unused stripe; gaining unlimited width of the field of view in the center.
Binocular
Ophthalmoscope
2017 SPIE
Binocular video ophthalmoscope for simultaneous recording of sequences of the human retina to compare dynamic parameters
https://doi.org/10.1117/12.2282898
Ralf P. Tornow, Aleksandra Milczarek, Jan Odstrcilik, and Radim Kolar
A parallel video ophthalmoscope was
developed to acquire short video
sequences (25 fps, 250 frames) of both
eyes simultaneously with exact
synchronization.
Video sequences were registered off-line
to compensate for eye movements. From
registered video sequences dynamic
parameters like cardiac cycle
induced reflection changes and eye
movements can be calculated and
compared between eyes.
Concept design of
what portable
binocular fundus
camera could
look like
It does not hurt at all to think about the UX for the end-user (clinician, or a non-trained operator who could be the patient or an optician, for example).
Naturally this does not exclude the need for good optical design and good computational image enhancement. Combine these all into one solution, and you will have a successful business that brings actual value to the patients instead of the often over-hyped "startup value".
2018
Korean startup ROOTEE HEALTH
“ELI (Eye-Linked-Information), a wearable fundus camera
possesses an auto-shot feature which removes the need for
manually adjusting the camera to focus on the retina. This
removes the need for patients to undergo taking several photos
with flashes. With the use of ELI, patients previously undiagnosed or lost in between their 'first' diagnosis of diabetes and later arising diabetic complications related to the eye will be possible to prevent. Providing the Internal Medicine department the tool to diagnose diabetic retinopathy is crucial as timing for treatment must be in early stages.
There's a gap between first prototype and ELI. We want to improve cost-effectiveness and accuracy by using adaptive optics & deep learning technology"
“Custom”
Fundus
Imaging
Optical
Designs
Fundus imaging
for Stiles-
Crawford effect
2017 SPIE
Development of a fundus camera for analysis of photoreceptor directionality in the healthy retina
http://hdl.handle.net/10362/15618
Author: Anjos, Pedro Filipe dos Santos
Advisors: Vohnsen, Brian; Vieira, Pedro
The Stiles-Crawford effect (SCE) is the well-known
phenomenon in which the brightness of light
perceived by the human eye depends upon its
entrance point in the pupil.
Retinal imaging, a widely spread clinical practice,
may be used to evaluate the SCE and thus serve as
diagnostic tool.
Nonetheless, its use for such a purpose is still
underdeveloped and far from the clinical reality.
In this project a fundus camera was built and used to
assess the cone photoreceptor directionality
by reflective imaging of the retina in healthy
individuals.
Diagram of the final system:
1 – Illuminator;
2 – Optical Fibre;
3 – Millimetric Stage;
4 – Red Filter;
5 – Iris Diaphragm;
6 – Maxwellian Lens;
7 – Beamsplitter;
8 – Imaging Lens;
9 – Zoom Lens;
10 – Sensor;
11 – Desktop Computer.
“Novel”
Fundus
Imaging
Optical
Designs
Inspiration for
compact
ophthalmic
imaging designs
Trans-epidermal
illumination
Quantitative phase imaging of retinal cells
https://arxiv.org/abs/1701.08854
By collecting the scattered light through the pupil, the partially coherent illumination produces dark field images, which are combined to reconstruct a quantitative phase image with twice the numerical aperture given by the eye's pupil. We then report, to our knowledge, the very first human in vivo phase images of inner retinal cells with high contrast.
a. Trans-epidermal illumination
by means of flexible PCB containing
LEDs placed in contact with the skin
of the eyelid. Light is then transmitted
inside the eyeball. After scattering off
the eye fundus, the light passing
through the retina's cell layers is
collected by the eye lens. b. Flexible
PCB holding 4 red LEDs. c.
Recording and reconstruction
procedure for in-vivo measurement.
d. Experimental setup. The light
scattered from the retina is collected
by lens L1. The 4f system composed
of the lenses L1 and L2 is adjusted for
defocus thanks to a badal system.
The lens L2 forms an image of the
pupil plane at its focal distance, while
the lens L3 forms an image of the
retina on the EMCCD camera. Dic:
dichroic mirror. Synchronization
between the LEDs and the camera is
performed thanks to a programmable
board. – Timothé Laforest et al. (2017)
Illumination of the retinal layers provided by transscleral illumination. The light is first transmitted through sclera, RPE and retina. After travelling through the aqueous humor it impinges on the RPE. Here backscattering off the RPE generates a new illumination beam. This secondary illumination provides a transmission light propagating through the translucent layers of the retina, which is then collected by the pupil. Azimuthal angle θ and polar angle α.
Trans-palpebral
illumination
Paper 1
Trans-palpebral illumination: an approach for wide-angle fundus photography without the need for pupil dilation
Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao
https://doi.org/10.1364/OL.41.002688
Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016)
"Retinal field-of-view (interior angle of 152°, and exterior angle 105°)"
Digital super-resolution algorithms are also being considered for further resolution improvements [Thapa et al. 2014]. In addition to the smartphone-based prototype imaging device, we are currently constructing a benchtop prototype for testing the feasibility of wide-angle fluorescein angiography employing the trans-palpebral illumination.
Trans-pupillary
illumination
Paper 2
Near-infrared light-guided miniaturized indirect ophthalmoscopy for nonmydriatic wide-field fundus photography
Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao
https://doi.org/10.1364/OL.41.002688
Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016)
Trans-pars-
planar
illumination
Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide-field photography
Benquan Wang, Devrim Toslak, Minhaj Nur Alam, R. V. Paul Chan & Xincheng Yao
https://doi.org/10.1038/s41598-018-27112-x
Scientific Reports volume 8, Article number: 8768 (2018)
Panoret-1000™ employed trans-scleral illumination to image the retina from the optic disc to the ora serrata in a single-shot image (Shields et al. 2003). However, clinical deployment of trans-scleral
illumination was not successful, and the product Panoret-1000™ is no longer commercially available. Clinical deployment of trans-scleral illumination failed due to several limiting factors. First,
the employed contact-mode imaging was not favorable for patients. Direct contact of the illuminating and imaging parts to the eyeball might produce potential inflammation, contamination,
and abrasion to the sclera and cornea. Second, it was difficult to operate the system to obtain good retinal images. In Panoret-1000™, the digital camera and light illuminator were apart from
each other. To capture a retinal image, one hand was used to operate the camera, and the other hand was used to adjust the light illuminator. The simultaneous need of both hands for imaging
operation made the device very difficult to use.
Instead of using a light illuminator contacting the eyelid (trans-palpebral illumination) [13] or sclera (trans-scleral illumination) [10,11], trans-pars-planar illumination is totally contact-free, projecting illuminating light through the pars plana.
Representative fundus images with illumination at different locations. (a) Illustration of different
illumination locations. (b) Fundus images acquired at different illumination locations. b1-b3 were acquired at
corresponding locations P1-P3 in panel a. (c) Average intensity of fundus images collected with constant power
illumination delivered through different locations. The curve is an average of 5 trials from one subject. Gray shadow
shows standard deviation. P1-P3 corresponds to images b1-b3. (d) Red, green and blue channels of the fundus
image b2. (e) Normalized fundus image b2, with digital compensation of red and green channel intensities. Macula,
optic disc, nerve fiber bundles and blood vessels could be clearly identified.
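One plausible reading of "digital compensation of red and green channel intensities" is a per-channel gain normalization; a minimal sketch under that assumption (file names hypothetical, not the authors' exact processing):

# Scale red and green channels so their means match the blue channel's mean,
# flattening the strong red dominance of trans-pars-planar illuminated images.
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("fundus_b2.png").convert("RGB"), dtype=np.float32)
target = rgb[..., 2].mean()                   # use the blue channel mean as reference
compensated = rgb.copy()
for ch in (0, 1):                             # red, green
    compensated[..., ch] *= target / (rgb[..., ch].mean() + 1e-6)
compensated = np.clip(compensated / compensated.max(), 0, 1)     # normalize for display
Image.fromarray((compensated * 255).astype(np.uint8)).save("fundus_b2_normalized.png")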
Retinal Imaging
Illumination
Summary
https://doi.org/10.1038/s41598-018-27112-x
Schematic illustration of different illumination schemes for retinal imaging:
trans-pupillary illumination
trans-scleral illumination
trans-palpebral illumination
trans-pars-planar illumination
A lens was used to image the aperture onto the sclera to form an arc-shaped illumination pattern. The illumination aperture was carefully designed to closely match the shape of the pars plana. The end of the illuminating arm that was close to the eye could be manually moved in a horizontal direction by a translation stage to precisely deliver illumination light to the pars plana. Light passing through the pars plana was diffused and illuminated the intraocular area homogeneously.
Transcranial
Illumination
Artifacts caused by retinal surface reflex are often encountered, which complicate quantitative interpretation of the reflection images. We present an alternative illumination method, which avoids these artifacts. The method uses deeply penetrating near-infrared (NIR) light delivered transcranially from the side of the head, and exploits multiple scattering to redirect a portion of the light towards the posterior eye.
August 2018
Non-mydriatic chorioretinal imaging in a transmission geometry and application to retinal oximetry
doi:10.1364/BOE.9.003867
Timothy D. Weber and Jerome Mertz
Boston University
Project: Transcranial retinal imaging
A: Schematic of fundus transillumination and imaging. LEDs at several central wavelengths (λN) are imaged via coupling optics (CO), comprised of lenses f1 and f2, onto the proximal end of a flexible fiber bundle (FB). A commercial fundus camera (FC) images the transilluminated fundus onto a camera (CCD). B: Example raw image recorded on the CCD. C: Normalized measured spectra of available high-power deep red and NIR LEDs.
This unique transmission geometry simplifies absorption measurements and enables flash-free, non-mydriatic
imaging as deep as the choroid. Images taken with this new transillumination approach are applied to retinal
oximetry.
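For context, the classic two-wavelength oximetry calculation that such absorption measurements feed into looks roughly like this (a generic optical-density-ratio sketch with illustrative calibration constants and intensities, not necessarily the calibration used in this transillumination paper):

# Generic two-wavelength retinal oximetry sketch (optical-density-ratio approach).
import numpy as np

def optical_density(i_vessel, i_background):
    """OD = log10(I_background / I_vessel) for a vessel segment."""
    return np.log10(i_background / i_vessel)

def so2_from_od_ratio(od_sensitive, od_isosbestic, a=1.28, b=-1.24):
    """Linear calibration SO2 = a + b * ODR; a and b are illustrative constants
    that must be determined empirically for a given instrument."""
    return a + b * (od_sensitive / od_isosbestic)

# Hypothetical mean intensities sampled inside and beside a vessel at two wavelengths
od_sens = optical_density(i_vessel=153.0, i_background=180.0)   # oxygen-sensitive band
od_iso = optical_density(i_vessel=95.0, i_background=175.0)     # isosbestic band
print(f"estimated SO2: {so2_from_od_ratio(od_sens, od_iso):.2f}")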
Aplanat
Fundus
Imaging
June 2018
Fundus Imaging using Aplanat
https://doi.org/10.1080/24699322.2017.1379143
Vishwanath Manik Rathod, M.Sc. Thesis
Indian Institute of Technology Hyderabad
In this thesis, we suggest alternative optics for fundus imaging. The design constraints of the aplanat help to remove all major aberrations observed with lenses without adding any extra corrective measures like those of lens-based optical systems. And since the proposed system does not have a complex set of lenses, the complexity of the system is reduced, which helps reduce cost significantly.
A major advantage of the system is that it offers a wide numerical aperture and large field of view while the system size remains that of a handheld device. Large NA and high radiation efficiency abolish the need for pupil dilation, making the process painless for the patient.
The process is as follows: coordinates computed in MATLAB are exported to Solid Edge, where the aplanat reflector is made as a CAD object and then imported into Zemax. Zemax supports four CAD formats: STL, IGES, STEP and SAT. Of these, only STL uses facets to represent the object; the other three model the object as a smooth, continuous surface shape.
Steps in Solid Edge
In order to image the retina completely, 3 phases of imaging need to be done. A narrow-field aplanat will be used to image the part near the optical axis of the eye. A wide-throat aplanat will be used to image the peripheral region. The hole at the center that remains undetected through the aplanat can be imaged using a normal lens system. So the final solution is to image with all three.
Exploiting the overlapping parts in the images from all the systems, a stitching algorithm can be used later to form a complete image (a minimal stitching sketch follows). This system leads to a total FOV of 200°.
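A minimal sketch of that stitching step (the thesis does not name a specific library; OpenCV's high-level stitcher is used here as an assumed stand-in, with hypothetical file names):

# Stitch the three overlapping partial views into one wide-field mosaic.
import cv2

# Hypothetical partial views: central (narrow-field aplanat), peripheral
# (wide-throat aplanat) and on-axis hole (normal lens system).
views = [cv2.imread(p) for p in ("central.png", "peripheral.png", "axial_hole.png")]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)   # overlapping fundus tiles
status, panorama = stitcher.stitch(views)
if status == cv2.Stitcher_OK:
    cv2.imwrite("fundus_mosaic.png", panorama)
else:
    print("stitching failed, status:", status)       # e.g. not enough overlap/features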
Freeform Optics
Fundus
Imaging?
May 2018
Starting geometry creation and design method for freeform optics
https://doi.org/10.1038/s41467-018-04186-9
Aaron Bauer, Eric M. Schiesser & Jannick P. Rolland
May 2018
Over-designed and under-performing: design and analysis of a freeform prism via careful use of orthogonal surface descriptions
https://doi.org/10.1117/12.2315641
Nicholas Takaki; Wanyue Song; Anthony J. Yee; Julie Bentley; Duncan Moore; Jannick P. Rolland
Institute of Optics, Univ. of Rochester
http://focal.nl/en/#technology
Medical Optical Systems
DEMCON Focal
Institutenweg 25A
7521 PH Enschede
May 2018
Design of Freeform Illumination Optics
https://doi.org/10.1002/lpor.201700310
Rengmao Wu, Zexin Feng, Zhenrong Zheng, Rongguang Liang, Pablo Benítez, Juan C. Miñano, Fabian Duerr
Review focuses on the design of freeform illumination optics, which is a key factor in advancing the development of illumination engineering.
May 2018
High-performance integral-imaging-based light field augmented reality display using freeform optics
https://doi.org/10.1364/OE.26.017578
Hekun Huang and Hong Hua
Freeform Optics
for Corneal Imaging
2017
Freeform optical design for a nonscanning corneal imaging system with a convexly curved image
https://doi.org/10.1364/AO.56.005630
Yunfeng Nie, Herbert Gross, Yi Zhong, and Fabian Duerr
"The lateral resolution on the cornea is about 10 μm with good modulation transfer function (MTF) and spot performance. To ease the assembly, a monolithic design is achieved with slightly lower resolution, leading to a potential mass production solution."
Additive
Manufacturing
Optics
Fundus
Imaging?
3D printing optics, in other words
June 2018
3D printed photonics and free-form optics
http://www.uef.fi/en/web/photonics/3d-printed-photonics-and-free-form-optics
http://optics.org/news/4/6/8
Jyrki Saarinen, Jari Turunen (design), Markku Kuittinen, Anni Eronen, Yu Jiang, Petri Karvinen, Ville Nissinen, Henri Partanen, Pertti Pääkkönen, Leila Ahmadi, Rizwan Ali, Bisrat Assefa, Olli Ovaskainen, Dipanjan Das, Markku Pekkarinen
Dutch start-up LUXeXceL  has invented the
Printoptical® technology for 3D printing optical
elements. Their technology is based on an
inkjet printing process. In collaboration with
Luxexcel, University of Eastern Finland will
further develop the Printoptical® technology.
June 2016
Additive manufacturing of optical components
https://doi.org/10.1515/aot-2016-0021
Andreas Heinrich / Manuel Rank / Philippe Maillard / Anne Suckow / Yannick Bauckhage / Patrick Rößler / Johannes Lang / Fatin Shariff / Sven Pekrul
The additive manufacturing technology offers a high potential in the
field of optics as well. Owing to new design possibilities, completely
new solutions are possible. This article briefly reviews and
compares the most important additive manufacturing methods for
polymer optics. Additionally, it points out the characteristics of
additive manufactured polymer optics. Thereby, surface quality
is of crucial importance. In order to improve it, appropriate post-
processing steps are necessary (e.g. robot polishing or coating),
which will be discussed. An essential part of this paper deals with
various additive manufactured optical components and their use,
especially in optical systems for shape metrology (e.g. borehole
sensor, tilt sensor, freeform surface sensor, fisheye lens). The
examples should demonstrate the potentials and limitations of
optical components produced by additive manufacturing.
Feb2018
Additive manufacturing of reflective and transmissive optics: potential and new solutions for optical systems
https://doi.org/10.1117/12.2293130
A.Heinrich; R.Börret; M.Merkel;H.Riegel
Additive manufacturing enables the realization of complex shaped parts. This also provides a high
potential for optical components. Thus elements with virtually any geometry can be
realized, which is often difficult with conventional fabrication methods. Depending on the
material and thus the manufacturing method used, either transparent optics or reflective optics
can be developed with the aid of additive manufacturing.
Our aim is to integrate the additive manufactured optics into optical systems.
Therefore we present different examples in order to point out new possibilities and new solutions
enabled by 3D printing of the parts. In this context, the development of 3D printed reflective and
transmissive adaptive optics will be discussed as well.
Depth-
resolving
Fundus
Imaging
Optical
Designs
Fundus
Stereo
Imaging
2008
Quantitative depth analysis of optic nerve head using stereo retinal fundus image pair
http://doi.org/10.1117/1.3041711
Toshiaki Nakagawa,Takayoshi Suzuki,Yoshinori Hayashi,Tetsuya
Yamamoto etal..
Convergent visual system for depth calculation of stereo image pair.
March2014
3D papillary image capturing by the stereo fundus camera system for clinical diagnosis on retina and optic nerve
https://doi.org/10.1117/12.2038435
Danilo A.Motta;AndréSerillo; Luciana deMatos; Fatima M.M.Yasuoka;
Vanderlei SalvadorBagnato; LuisA.V.Carvalho
2012
Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs
http://doi.org/10.1167/iovs.12-9803
Li Tang; Randy H.Kardon; Jui-Kai Wang; MonaK.Garvin; Kyungmoo Lee;
MichaelD.Abràmoff
Comparison of ONH shape estimates from stereo fundus
photographs and OCT scans using topographic maps. (A) Reference
(left) fundus image wrapping onto reconstructed topography as output
from stereo photographs. Small squares with different colors are marked
at four corners of the reference image, indicating the orientation of retinal
surface rendering. (B) Fundus image wrapping onto reconstructed
topography as output from the OCT scans from the same view angle.
Fundus
Stereo
Imaging #2
2015
All-in-focus imaging technique used to improve 3D retinal fundus image reconstruction
https://doi.org/10.1145/2695664.2695845 |https://doi.org/10.1117/12.2038435
DaniloMotta,LucianadeMatos,AmandaCaniattodeSouza,RafaelMarcato,AfonsoPaiva,LuisAlbertoVieirade
CarvalhoR&D– Wavetek; UniversidadedeSão Paulo,SãoCarlos, SP,Brazil
Snapshot
stereo fundus
systems
2017
Design of optical system for binocular fundus camera
https://doi.org/10.1080/24699322.2017.1379143
JunWu,Shiliang Lou,Zhitao Xiao,Lei Geng,Fang Zhang,Wen
Wang &Mengjia Liu
A non-mydriatic optical system for a binocular fundus camera has been designed in this paper. It can capture two images of the same fundus retinal region from different angles at the same time, and can be used to achieve three-dimensional reconstruction of the fundus.
According to the requirements for a medical light source, a sodium lamp with a wavelength of 589 nm is selected as the light source; its spectral range is 560–610 nm. The magnification of the imaging system is 1.07 and the cut-off frequency at the object is 96.3 lp/mm, i.e. the system can resolve structures of about 5.2 μm (a quick consistency check follows below the diagram). In order to make operation and adjustment more convenient, the size of this optical system is set to 480 mm x 100 mm x 200 mm.
Diagram of imaging system.
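A quick consistency check of the quoted numbers, assuming the resolvable structure size is taken as half the cut-off period:

```python
# Consistency check: a cut-off frequency of 96.3 lp/mm implies a resolvable
# structure of roughly half the cut-off period.
cutoff_lp_per_mm = 96.3
resolvable_um = 1.0 / (2.0 * cutoff_lp_per_mm) * 1000.0   # mm -> µm
print(f"Resolvable structure ≈ {resolvable_um:.1f} µm")    # ≈ 5.2 µm, as quoted
```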
3D fundus with
aplanats
2018
3D Image Reconstruction of Retina using Aplanats
http://raiith.iith.ac.in/id/eprint/4109
SankrandanLoke and SoumyaJana
Mastersthesis,IndianInstituteof TechnologyHyderabad
The ray tracing program is written in Python, with assistance from the MATLAB toolbox Optometrika.
3D eye model
3D plot of eye, aplanat and its sensor
A very high resolution 3D retina is constructed using the Meshlab software. This is considered as the digital version of the painted retina to be imaged. A hemi-spheroidal, high-density point cloud is created and normals are calculated at each point. Then the “screened Poisson Surface Reconstruction” filter is applied on it to create a mesh over the point cloud. The resultant mesh is cleaned and the face normals and vertex normals are normalized (a scripted version of this step is sketched below).
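A scriptable stand-in for the Meshlab filter chain, assuming Open3D (whose screened Poisson implementation substitutes here for the Meshlab filter):

```python
# Sketch: hemispherical "retina" point cloud -> normals -> screened Poisson mesh.
# Open3D is used as a scriptable stand-in for the interactive Meshlab steps.
import numpy as np
import open3d as o3d

# Hemi-spheroidal point cloud (z >= 0), a stand-in for the painted-retina model
n = 50_000
theta = np.arccos(np.random.uniform(0.0, 1.0, n))      # polar angle, upper hemisphere
phi = np.random.uniform(0.0, 2.0 * np.pi, n)
radius = 12.0                                           # mm, roughly eye-sized
pts = np.column_stack((radius * np.sin(theta) * np.cos(phi),
                       radius * np.sin(theta) * np.sin(phi),
                       radius * np.cos(theta)))

pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(pts))
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
pcd.orient_normals_consistent_tangent_plane(30)

mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
mesh = mesh.remove_degenerate_triangles()               # basic cleaning
mesh.compute_vertex_normals()                           # recompute/normalize normals
o3d.io.write_triangle_mesh("retina_mesh.ply", mesh)
```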
Plenoptic
Fundus
Imaging
Optical
Designs
Plenoptic Fundus
Imaging
The idea has been around for
some time now
2011
US20140347628A1
Multi-view funduscamera
Inventor: ASOCIACION INDUSTRIAL DE OPTICA, COLOR E IMAGEN - AIDO; Universitat de Valencia
Current Assignee: ASOCIACION INDUSTRIAL DE OPTICA COLOR E IMAGEN - AIDO; Universitat de Valencia
2011
US8814362B2
Method forcombininga plurality of
eyeimagesinto aplenoptic
multifocalimage
Inventor:Steven Roger Verdooner 
CurrentAssignee : Neurovision ImagingInc 
2011
US8998411B2
Light field camera for fundus
photography
Inventor:Steven Roger Verdooner Alexandre R.
Tumlinson, Matthew J. Everett
CurrentAssignee : Carl Zeiss Meditec Inc
2013
US9060710B2I
System and method for ocular
tomography using plenopticimaging
Inventor Richard J. Copland
CurrentAssigneeAMOWavefront Sciences LLC
2015
US9955861B2
Constructionof an individualeye
model using aplenopticcamera
Inventor Liang Gao, IvanaTosic
CurrentAssigneeRicoh Co Ltd
US8998411B2: “As described by Ren Ng (founder of Lytro), the ‘light field’ is a concept that includes both the position and direction of light propagating in space (see for example U.S. Pat. No. 7,936,392).”
DeHoog and Schwiegerling, “Fundus camera systems: a comparative analysis,” Appl. Opt. 48, 221-228 (2009)
https://doi.org/10.1364/AO.48.000221
Plenoptic Fundus
Imaging
Short intro
Plenoptic imaging of the retina: can it resolve depth in scattering tissues?
Richard Marshall, Iain Styles, ElaClaridge,and Kai
Bongs
https://doi.org/10.1364/BIOMED.2014.BM3A.60, PDF
Configurations of two different plenoptic cameras: (a) the traditional plenoptic camera; (b) the focused plenoptic camera.
“Plenoptic imaging has already proven its capabilities to determine depth and give 3D topographic information in free space models; however, no study has shown how it would perform through scattering media such as the retina. In order to study this, simulations were performed using MCML, a multi-layered Monte Carlo modeling software [Wang et al. 1995]. The parameters characterising the properties of retinal layers and used in the Monte Carlo (MC) simulation have been taken from Styles et al. (2006).”
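For intuition, a toy single-layer Monte Carlo photon walk with Henyey-Greenstein scattering in the spirit of MCML; the layer parameters below are placeholders, not the retinal values of Styles et al. (2006), and there is no photon-weight splitting as in the full MCML code:

```python
# Minimal Monte Carlo photon-transport sketch: anisotropic scattering
# (Henyey-Greenstein) in a single absorbing/scattering layer.
import numpy as np

rng = np.random.default_rng(0)
mu_s, mu_a, g = 20.0, 0.5, 0.9       # scattering/absorption coeff. [1/mm], anisotropy
thickness = 0.25                      # layer thickness [mm] (placeholder)
n_photons = 20_000

transmitted = reflected = absorbed = 0
for _ in range(n_photons):
    z, uz = 0.0, 1.0                               # depth and z-direction cosine
    while True:
        step = -np.log(rng.random()) / (mu_s + mu_a)   # free path length
        z += uz * step
        if z < 0.0:
            reflected += 1; break                  # escaped back out of the layer
        if z > thickness:
            transmitted += 1; break                # passed through the layer
        if rng.random() < mu_a / (mu_s + mu_a):    # absorption event (toy model)
            absorbed += 1; break
        # Henyey-Greenstein sampling of the deflection angle
        xi = rng.random()
        cos_t = (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * xi))**2) / (2 * g)
        sin_t = np.sqrt(max(0.0, 1 - cos_t**2))
        phi = 2 * np.pi * rng.random()
        uz = uz * cos_t + np.sqrt(max(0.0, 1 - uz**2)) * sin_t * np.cos(phi)

print(f"T = {transmitted/n_photons:.3f}, R = {reflected/n_photons:.3f}, "
      f"A = {absorbed/n_photons:.3f}")
```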
Simulation of Light Field Fundus Photography
Sha Tong and T. J. Melanson. Stanford course assignment
http://stanford.edu/class/ee367/Winter2017/Tong_Melanson_ee367_win17_report.pdf
Light Field Images from Different Viewing Points
Comparisons between normal camera (L), light field camera (M) and reference image (R)
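The core light-field operation behind such simulations is synthetic refocusing by shift-and-add of the sub-aperture views; a minimal sketch, assuming the light field is available as a 5D array [u, v, y, x, channel]:

```python
# Sketch: synthetic refocusing of a light field by shift-and-add of sub-aperture views.
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(lightfield: np.ndarray, slope: float) -> np.ndarray:
    """Shift each (u, v) view by slope*(u - u0, v - v0) and average."""
    U, V = lightfield.shape[:2]
    u0, v0 = (U - 1) / 2.0, (V - 1) / 2.0
    acc = np.zeros(lightfield.shape[2:], dtype=np.float64)
    for u in range(U):
        for v in range(V):
            dy, dx = slope * (u - u0), slope * (v - v0)
            acc += nd_shift(lightfield[u, v], (dy, dx, 0), order=1, mode="nearest")
    return acc / (U * V)

# Toy usage with random data standing in for a simulated retinal light field
lf = np.random.rand(5, 5, 64, 64, 3)
refocused_near = refocus(lf, slope=+1.0)   # focus closer than the capture plane
refocused_far  = refocus(lf, slope=-1.0)   # focus farther away
```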
Plenoptic Fundus
Imaging
Prototype System #1:
Moorfields Eye Hospital and
University College of London
Retinal fundus imaging with a plenoptic sensor
Brice Thurin; Edward Bloch; Sotiris Nousias; Sebastien Ourselin;
Pearse Keane; Christos Bergeles
https://doi.org/10.1117/12.2286448
Optical layout of the plenoptic fundus camera. A white LED illuminates the eye fundus through a polarizer and polarizing beamsplitter. The LED chip is conjugated with the eye pupil and an iris, while the condensing lens L1 is conjugated with the retinal plane. A primary image of the retina is formed by a Digital Wide-Field lens L4. This image is relayed to the plenoptic sensor (Raytrix R8) by L3 and L6 through the polarizing beamsplitter. The polarization helps reduce the corneal reflex.
Plenoptic ophthalmoscopy has been considered for eye examinations [Tumlinson and Everett 2011; Bedard et al. 2014; Lawson and Raskar 2014]. A crude implementation was proposed by Adam et al. (2016): the system built is used as a substitute for a human observer, only a small portion of the field is used for fundus imaging, and it does not exploit the full capabilities of light-field imaging. More recently, a plenoptic sensor has been used to successfully characterize the topography of the healthy and diseased human iris in vivo [Chen et al. 2017].
Plenoptic Fundus
Imaging
Prototype System #2a
:
Queensland University of Technology; Medical and
Healthcare Robotics, Australian Centre for
Robotic Vision, Brisbane; Institute of Health and
Biomedical Innovation, Brisbane
Glare-free retinal imaging using a portable light
field fundus camera
DouglasW.Palmer, ThomasCoppin,Krishan Rana, Donald G. Dansereau,MarwanSuheimat, Michelle
Maynard, David A. Atchison, JonathanRoberts,RossCrawford, and AnjaliJaiprakash
BiomedicalOpticsExpressVol. 9, Issue7,pp. 3178-3192(2018)
https://doi.org/10.1364/BOE.9.003178
Imaging path optical diagram of light field fundus camera.
Top row (A,B,C) represents a correctly designed system where
the entrance pupil diameter ØLF is smaller than the eye pupil ØP
and the region of the sensor under the microlens shows minimal
vignetting, where d is the number of pixels under a microlens
horizontally and vertically. Bottom row (D,E,F) represents an
incorrectly designed system where ØLF is larger than ØP. The
resultant micro image vignetting is shown in (F).(A) and (D) show
slices taken approximately through the iris of the eye. (B) and (D)
show the arrangements of components and paraxial
approximations of the ray paths for a point on (blue) and off-axis
(red).The design entrance and exit pupils are the images of the
design aperture stop as seen through the objective and relay lenses
respectively.
Plenoptoscope - General arrangement. Imaging path in gray, eye fixation path in
red, illumination path in yellow. The Lytro Illum light field camera has an internal fixed
f/2.0 aperture stop (not shown).
Plenoptic Fundus
Imaging
Prototype System #2b
:
Queensland University of Technology; Medical and
Healthcare Robotics, Australian Centre for
Robotic Vision, Brisbane; Institute of Health and
Biomedical Innovation, Brisbane
Glare-free retinal imaging using a portable light
field fundus camera
DouglasW.Palmer, ThomasCoppin,Krishan Rana, Donald G. Dansereau,MarwanSuheimat, Michelle
Maynard, David A. Atchison, JonathanRoberts,RossCrawford, and AnjaliJaiprakash
BiomedicalOpticsExpressVol. 9, Issue7,pp. 3178-3192(2018)
https://doi.org/10.1364/BOE.9.003178
Series of images captured using the Retinal
Plenoptoscope. Images are shown in sets of two with
the top image being a standard (not glare-free)
render of the light field, and the bottom image being
a gray-scale relative depth map. Each depth map has
an associated scale that relates gray shade to depth.
Note that the leftmost set is of a model eye, the second
leftmost set is of a myopic eye (-5.75D), and the two
rightmost sets are of emmetropic eyes.
An image of a human retina captured using the Retinal
Plenoptoscope with an associated epipolar image.
Annotations indicate areas of interest, where (A) and (C)
correspond to glare, and (B) corresponds to the Optic Disk.
Series of images created using various light field rendering techniques. Images
are shown in sets of three with the left image being the central view from the
light field, the middle image being a standard render with no glare masking, and the right image being a glare-free render.
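The intuition behind glare-free rendering is that the corneal reflex saturates only some sub-aperture views, so averaging the unsaturated views suppresses it. A simplified stand-in for that idea, not Palmer et al.'s actual rendering pipeline:

```python
# Illustration: average only the sub-aperture views that are not saturated
# by glare at each pixel. Placeholder data; registered views are assumed.
import numpy as np

def glare_free_render(views: np.ndarray, sat_level: float = 0.95) -> np.ndarray:
    """views: [N, H, W] stack of registered sub-aperture images in [0, 1]."""
    mask = views < sat_level                     # True where a view is usable
    weight = mask.sum(axis=0).clip(min=1)        # avoid division by zero
    return (views * mask).sum(axis=0) / weight

# Toy usage: 9 registered views with synthetic glare in three of them
rng = np.random.default_rng(1)
views = np.clip(rng.normal(0.4, 0.05, (9, 128, 128)), 0, 1)
views[:3, 40:60, 40:60] = 1.0                    # glare patch in three views
clean = glare_free_render(views)
print("value in glare region after rendering:", round(float(clean[50, 50]), 3))
```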
Plenoptic Iris
Imaging
Prototype System
Mechanical Engineering; Department of
Ophthalmology and Visual Sciences; Department of
Human Genetics | University of Michigan
Human iris three-dimensional imaging at
micron resolution by a micro-plenoptic camera
Hao Chen, MariaA.Woodward, David T.Burke,V.SwethaE. Jeganathan, Hakan Demirci,andVolker Sick
BiomedicalOpticsExpressVol. 8,Issue10,pp.4514-4522 (2017)
https://doi.org/10.1364/BOE.8.004514 | researchgate
A micro-plenoptic system (Raytrix R29) was designed to capture the
three-dimensional (3D) topography of the anterior iris surface by simple
single-shot imaging. Within a depth-of-field of 2.4 mm, depth resolution of
10 µm can be achieved with accuracy (systematic errors) and precision
(random errors) below 20%.
Multimodal
Imaging
Optical
Designs
Combined
Fundus and OCT
Imaging
2007
Simultaneous Fundus Imaging and Optical Coherence Tomography of the Mouse Retina
http://doi.org/10.1167/iovs.06-0732
OmerP.Kocaoglu;Stephen R.Uhlhorn; EleutHernandez;
RogerA. Juarez; Russell Will;Jean-MarieParel;Fabrice
Manns
To develop a retinal imaging system suitable for
routine examination or screening of mouse
models and to demonstrate the feasibility of
simultaneously acquiring fundus and optical
coherence tomography (OCT) images.
The mouse was held in a cylindrical holder made from 30-mL syringes. The position of the mouse was adjusted to align the optical axis of the mouse eye with the axis of the delivery system by using 6-μm screws.
Left: general optical design of the imaging system; right: the mouse fundus and OCT imaging system, including fundus imaging with a digital camera attached to the photographic port of the slit lamp, the OCT beam delivery system, the six-axis mouse positioner, and the interferometer.
Fundus Camera
Guided
Photoacoustic
Ophthalmoscopy
2013
Fundus Camera Guided Photoacoustic Ophthalmoscopy
https://doi.org/10.3109/02713683.2013.815219
A schematic of the imaging system designed and optimized for rat eyes is shown on the right →
A 532-nm pulsed laser (Nd:YAG laser, SPOT-10-100, Elforlight Ltd, UK; output wavelength 1064 nm; pulse duration: 2 ns; BBO crystal for second harmonic frequency generation; CasTech, San Jose, CA) was used as the illumination light source for PAOM. The PAOM laser (green path) was scanned by an x–y galvanometer (QS-7, Nutfield Technology) and delivered to the posterior segment of the eye after passing through a relay lens L5 (f = 150 mm) and an objective lens OBJ1 (f = 30 mm, VIS-NIR AR coated). The final laser pulse energy on the cornea is 60 nJ, which is considered eye-safe.
The induced PA waves were detected by a custom-built unfocused needle ultrasonic transducer (central frequency: 35 MHz; bandwidth: 50%; active element size: 0.5 x 0.5 mm²). The ultrasonic transducer was gently attached to the eyelid (close to the canthus), coupled by a thin layer of medical-grade ultrasound gel.
Infrared
Retinoscopy
2014
Infrared Retinoscopy
http://dx.doi.org/10.3390/photonics1040303
Retinoscopy could be a more effective and
versatile clinical tool in observing a wide range
of ocular conditions if modifications were made
to overcome the inherent difficulties. In this
paper, a laboratory infrared retinoscope
prototype was constructed to capture the
digital images of the pupil reflex of various
types of eye conditions.
The captured low-contrast reflex images due to
intraocular scattering were significantly
improved with a simple image processing
procedure for visualization. Detections of
ocular aberrations were demonstrated, and
computational models using patients’
wavefront data were built to simulate the
measurement for comparison.
The simulation results suggest that the retinal stray light that is strongly linked to intraocular scattering extends the detection range of illuminating eccentricity in retinoscopy and makes it more likely to observe ocular aberrations.
Light Levels
Maximum intensities limited by what is safe for the human eye
ISO 15004-2.2
Standard for safe
retinal irradiance with
ophthalmic instruments
Used e.g. by
Kölbl et al. (2015)
Sheng Chiong Hong (2015)
for discussion on limits, see
Sliney et al. (2005)
Wang et al. (2018)
https://doi.org/10.1038/s41598-018-27112-x
“According to the ISO 15007-2:2007 (Petteri: Incorrect standard reference) standard, a maximum of 10 J/cm² weighted irradiance is allowed on the retina without photochemical hazard concern.”
Kim, Delori, Mukai (2012):
Smartphone Photography Safety
https://www.aaojournal.org/article/S0161-6420(12)00410-1/pdf
The light safety limits for ophthalmic instruments set by the International Organization for Standardization (ISO 15004-2.2) recommend that spectral irradiance (W/cm²/nm) on the retina be weighted separately for thermal and photochemical hazard functions or action spectra. These safety limits are at least 1 order of magnitude below actual retinal threshold damage [Delori et al. 2007; Sliney et al. 2002].
The radiant power of the smartphone was 8 mW. For thermal hazard, the weighted retinal irradiance for the smartphone was 4.6 mW/cm², which is 150 times below the thermal limit (706 mW/cm²). For photochemical hazard, the weighted retinal radiant exposure was 41 mJ/cm² (exposure duration of 1 minute), which is 240 times below the photochemical limit (10 J/cm²). Since the light safety standards not only account for the total retinal irradiance but also for spectral distribution, we measured the latter with a spectroradiometer (USB4000, Ocean Optics, Dunedin, FL). The radiation was limited to the 400–700 nm wavelength interval, with about 70% of that light emitted in the blue and green part of the spectrum (wavelength < 600 nm).
We then compared the light levels produced during smartphone fundoscopy with those produced by standard indirect ophthalmoscopes. The retinal irradiance produced by a Keeler Vantage Plus LED (Keeler Instruments Inc., Broomall, PA), measured using identical procedures as earlier described, was 46 mW/cm², or about 10 times the levels observed with the smartphone. This finding corresponds well with retinal irradiances of 8 to 210 mW/cm² found in other studies for a wide selection of indirect ophthalmoscopes. The spectral distribution of the Keeler indirect ophthalmoscope was similar to that of the smartphone (both have LED sources). The weighted exposures for the Keeler indirect ophthalmoscope were thus 15 and 24 times less than the limits for thermal and photochemical hazards, respectively. The lower light level available for observation using the smartphone, as opposed to the indirect ophthalmoscope, is largely compensated for by the high electronic sensitivity of the camera.
In conclusion, retinal exposure from the smartphone was 1 order of magnitude less than that from the indirect ophthalmoscope and both are within safety limits of thermal and photochemical hazards as defined by the ISO when tested under conditions simulating routine fundoscopy.
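The quoted margins are easy to re-derive from the stated measurements; a short check, taking the ISO limits as quoted in the letter:

```python
# Re-deriving the quoted safety margins from the measured values.
thermal_limit   = 706.0      # mW/cm², thermal limit as quoted
photochem_limit = 10.0       # J/cm², photochemical limit as quoted

smartphone_irradiance = 4.6          # mW/cm² (thermally weighted)
smartphone_exposure   = 41e-3        # J/cm² (photochemically weighted, 1 min)

print("thermal margin:      ", round(thermal_limit / smartphone_irradiance))   # ~150x
print("photochemical margin:", round(photochem_limit / smartphone_exposure))   # ~240x

keeler_irradiance = 46.0             # mW/cm², Keeler Vantage Plus LED
print("Keeler vs smartphone:", round(keeler_irradiance / smartphone_irradiance))  # ~10x
```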
Hazard weighting functions according to the DIN EN ISO 15007-2:2014 standard, A(λ) and R(λ). A(λ) rates the photochemical and R(λ) the thermal hazard for all kinds of light sources. Kölbl et al. (2015)
Example of
Calculation
For calculating retinal irradiance from quasi-monochromatic green (565 nm) pars-planar illumination
2018
Contact-free trans-pars-planar illumination enables snapshot
fundus camera for nonmydriatic wide field photography
https://doi.org/10.1038/s41598-018-27112-x
The thickness of the sclera is ~0.5 mm (Olsen et al. 1998). The transmission of the sclera in the visible wavelengths is 10–30% (Vogel et al. 1991). To be conservative, 30% was used for calculation.
For the proof-of-concept experiment, the weighted illumination power on the sclera was calculated to be 0.5 mW, and the area of the arc-shaped light was 13 mm². For the worst-case estimation, we assumed all illumination light is directly exposed to the retinal area behind the illuminated sclera area. Therefore, the maximum allowed exposure time is the photochemical limit (10 J/cm²) divided by the weighted retinal irradiance (a recomputation under these assumptions is sketched below).
If the illumination light accidentally fell into the pupil, the illuminated area on the retina was estimated to be >9 mm². Thus the maximum allowed exposure time through the pupil is >30 minutes. For thermal hazard, the maximum weighted power intensity allowed on the sclera without thermal hazard concern is 700 mW/cm². The calculated weighted power intensity was 230 mW/cm², which is more than three times lower than the maximum limit.
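A recomputation of the photochemical exposure-time limits from the stated assumptions (0.5 mW weighted power, 13 mm² scleral patch, 30% transmission, 10 J/cm² limit); it reproduces the >30 minute through-pupil figure and gives roughly 2.4 hours for the trans-scleral worst case:

```python
# Recomputing the photochemical exposure-time limits from the stated numbers.
photochem_limit_J_per_cm2 = 10.0     # weighted retinal radiant exposure limit

# Trans-scleral worst case: all transmitted light lands behind the 13 mm² patch
power_on_sclera_mW = 0.5
scleral_transmission = 0.30          # conservative upper bound used in the paper
area_cm2 = 13e-2                     # 13 mm² -> cm²
irradiance = power_on_sclera_mW * 1e-3 * scleral_transmission / area_cm2   # W/cm²
t_max_s = photochem_limit_J_per_cm2 / irradiance
print(f"trans-scleral: {irradiance*1e3:.2f} mW/cm², t_max ≈ {t_max_s/60:.0f} min")

# Light accidentally entering the pupil: >9 mm² illuminated, no scleral attenuation
area_pupil_cm2 = 9e-2
irradiance_pupil = power_on_sclera_mW * 1e-3 / area_pupil_cm2
t_pupil_s = photochem_limit_J_per_cm2 / irradiance_pupil
print(f"through pupil: t_max ≈ {t_pupil_s/60:.0f} min")   # ≈ 30 min, as quoted
```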
Kölbl et al. (2015): as the light source, a round-shaped white LED is used. By integrating the light source into a speculum, the LED is pressed firmly against the sclera. Thus the ocular space is illuminated trans-sclerally. As a result, an indirect uniform illumination of the complete intraocular space is achieved.
Example of the
use of
Supercontinuum
light source
We assessed the spectral sensitivity of the pupillary light reflex in mice using a high power supercontinuum white light (SCWL) source in a dual wavelength configuration. This novel approach was compared to data collected from a more traditional setup using a Xenon arc lamp fitted with monochromatic interference filters.
2018
Use of a supercontinuum white light in evaluating the spectral sensitivity of the pupil light reflex
CatherineChin; Lasse Leick; Adrian Podoleanu;GurpritS. Lal
Univ. ofKent(United Kingdom);NKTPhotonics A/S (Denmark)
https://doi.org/10.1117/12.2286064
Light was generated by the NKT Photonics SuperK Extreme EXR and filtered through Extended UV (480 nm) and SuperK Select (560 nm) modules.
The use of a SCWL is a significant leap
forward from the Xenon arc light
traditionally used in recording pupillary light
responses. The SCWL gives the
experimenter much more control over the
light stimulus, through wavelength, intensity
and, most importantly, a dual light
configuration.
Together, this will allow more complex
lighting protocols to be developed that
can further assist in unraveling the complex
coding of light that gives rise to the pupil light
reflex and other photic driven physiological
responses
Image
Processing
Intro for compensating low image quality computationally
Fundus Video
Processing
2014
Multi-frame Super-resolution with Quality Self-assessment for Retinal Fundus Videos
https://doi.org/10.1117/12.2036970
Thomas Köhler, Alexander Brost, Katja Mogalle, QianyiZhang, Christiane
Köhler, Georg Michelson, JoachimHornegger, RalfP. Tornow
In order to compensate heterogeneous illumination on the
fundus, we integrate retrospective illumination correction
for photometric registration to the underlying imaging
model. Our method utilizes quality self-assessment to
provide objective quality scores for reconstructed images
as well as to select regularization parameters
automatically. In our evaluation on real data acquired from
six human subjects with a low-cost video camera, the
proposed method achieved considerable enhancements
of low-resolution frames and improved noise and
sharpness characteristics by 74%.
2014
Blood vessel segmentation in video-sequences from the human retina
https://doi.org/10.1109/IST.2014.6958459
J . Odstrcilik ; R. Kolar ; J. Jan ; R. P. Tornow ; A. Budai
This paper deals with the retinal blood vessel
segmentation in fundus video-sequences acquired by
experimental fundus video camera. Quality of acquired
video-sequences is relatively low and fluctuates across
particular frames. Especially, due to the low resolution,
poor signal-to-noise ratio, and varying illumination
conditions within the frames, application of standard
image processing methods might be difficult in such
experimental fundus images.
2014
Geometry-Based Optic Disk Tracking in Retinal Fundus Videos
https://doi.org/10.1007/978-3-642-54111-7_26
Anja Kürten, Thomas Köhler, Attila Budai, Ralf-Peter Tornow, Georg Michelson,
JoachimHornegger
Fundus video cameras enable the acquisition of image
sequences to analyze fast temporal changes on the
human retina in a non-invasive manner. In this work, we
propose a tracking-by-detection scheme for the optic disk
to capture the human eye motion on-line during
examination. Our approach exploits the elliptical shape of
the opticdisk.
2016
Registration of retinal sequences from new video-ophthalmoscopic camera
https://doi.org/10.1186/s12938-016-0191-0
RadimKolar, Ralf. P. Tornow, JanOdstrcilik and Ivana Liberdova
Analysis of fast temporal changes on retinas has become
an important part of diagnostic video-ophthalmology.
It enables investigation of the hemodynamic processes in
retinal tissue, e.g. blood-vessel diameter changes as a
result of blood-pressure variation, spontaneous venous
pulsation influenced by intracranial-intraocular pressure
difference, blood-volume changes as a result of changes in
light reflection from retinal tissue, and blood flow using
laser speckle contrast imaging. For such applications,
image registration of the recorded sequence must be
performed.
Multiframe
registration
July2018
Fundus photography with subpixel registration of correlated laser speckle images
https://doi.org/10.7567/JJAP.57.09SB01
Jie-En Li and Chung-Hao Tien
Schematic diagram of the
optics of fundus
photography. LS, light source;
CL, collector lens; AD,
aperture diaphragm; BS,
beam splitter; FD, field
diaphragm; Ln, lenses. In our
experiment, the focal lengths
of the lenses are 30 and 100
mm (f1 = 300 mm and f2 =
100 mm).
Images of rabbit retina with (a) incoherent illumination, (b) coherent illumination, and (c) LSCI image. Vessels enclosed by the red frame are barely distinguishable in images obtained using a conventional fundus system, but are enhanced when their images are obtained with the help of LSCI.
Images obtained by laser speckle contrast imaging (LSCI) (a) without image registration and (b) with image registration process. (c) Speckle contrast of (a) and (b) along the red lines.
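The two processing steps named above, sub-pixel registration and speckle contrast, can be sketched with standard tools; the snippet below uses scikit-image phase correlation and a sliding-window std/mean as stand-ins, with placeholder data rather than Li and Tien's own pipeline:

```python
# Sketch: sub-pixel frame registration via phase correlation, plus
# laser speckle contrast K = sigma/mean in a sliding window.
import numpy as np
from scipy.ndimage import shift as nd_shift, uniform_filter
from skimage.registration import phase_cross_correlation

def register_stack(frames: np.ndarray) -> np.ndarray:
    """Align every frame to frames[0] with sub-pixel phase correlation."""
    ref, aligned = frames[0], [frames[0]]
    for frame in frames[1:]:
        dy_dx, _, _ = phase_cross_correlation(ref, frame, upsample_factor=20)
        aligned.append(nd_shift(frame, dy_dx, order=1, mode="nearest"))
    return np.stack(aligned)

def speckle_contrast(frame: np.ndarray, win: int = 7) -> np.ndarray:
    """Local speckle contrast in a win x win window."""
    mean = uniform_filter(frame, win)
    mean_sq = uniform_filter(frame**2, win)
    std = np.sqrt(np.clip(mean_sq - mean**2, 0, None))
    return std / np.clip(mean, 1e-6, None)

# Toy usage with shifted copies of a random speckle-like frame
rng = np.random.default_rng(2)
base = rng.random((256, 256))
stack = np.stack([base] + [nd_shift(base, (0.6 * k, -0.4 * k), order=1)
                           for k in range(1, 5)])
aligned = register_stack(stack)
K = speckle_contrast(aligned.mean(axis=0))
```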
Future of
Fundus
Imaging
Hardware becoming a low-cost commodity, with the value of easy data acquisition increasing
Google
building a
2018
Optical Systems Engineer, Verily Life Sciences
https://www.linkedin.com/jobs/view/674965920
Responsibilities
● Design state-of-the-art optics-based devices.
● Work closely with an interdisciplinary team to integrate optical designs into prototypes and the product development path.
Minimum Qualifications
● PhD in Optical Engineering/Optics/Physics/EE, or related technical field, or equivalent practical experience.
● Knowledge and experience in optical design, optics and imaging systems design.
● Applied research experience in physics/optics/imaging systems.
Preferred Qualifications
● Experience in opto-mechanical design
● Experience in electronics (PCB schematic capture and layout, soldering, etc.)
● Programming experience in MATLAB/C/C++/Python
● Experience with microcontrollers
● Excellent communication and collaboration skills.
2018
Electrical Engineer, Ophthalmic Devices, Verily Life Sciences
https://www.linkedin.com/jobs/view/692175458
Responsibilities
● Working with cross-functional teams to define electronic systems based on system-level requirements and tradeoffs
● Design electronic systems for highly miniaturized electronic devices, especially at the PC board level
● Identification, selection and qualification of key electronic components for electronic systems in miniaturized medical devices
● Identification and qualification of key vendors and partners for electronics integration and manufacture
● Rapid prototyping of electronic systems with in-house teams and with support from vendors, including integration with a variety of electrical and mechanical components
Minimum Qualifications
● MS degree in Electrical Engineering or related major plus 4 years work experience
● Solid analog and digital circuit design skill
● Experience with complex PC board design
● Experience with firmware development
2018
Technical Lead, Ophthalmic Devices and Electroactive Polymers, Verily Life Sciences
https://www.linkedin.com/jobs/view/technical-lead-ophthalmic-devices-and-electroactive-polymers-at-verily-life-sciences-731643271/
Responsibilities
● Leading the development of new ophthalmological devices and diagnostic instrumentation.
● Early-stage hardware and systems development.
Minimum Qualifications
● Experience in electroactive polymer applications in optical systems.
● Experience with embedded systems hardware/software development.
● Experience with different stages of product development: proof-of-concept, prototyping, EV and DV builds.
Preferred Qualifications
● Experience with opto-mechanical systems and components. Experience with ophthalmic devices.
● Background in both hardware and software. Programming experience with C++ and Python.
● Experience in optics and electronics with a focus on optical/spectral imaging/sensing technologies.
● Product development track record in optical applications of electroactive polymers or similar.
● Experience with Camera, Image Signal Processor (ISP) or camera software stack. Knowledge of signal processing, digital imaging, computer vision, and image/video processing. Experience with SoC, camera modules, VR/AR display techniques.
● Experience with (real time) data processing
Lily Peng – Google Brain: talking about deep learning for fundus images (diabetic retinopathy), and their collaboration with the Aravind institute in India at APTOS 2018
Portable
adaptive optics
for improving
image quality
05 Sep 2018
Adaptive optics reveals fine details of retinal structure. Duke University handheld AOSLO platform assists diagnosis of eye disease and trauma
http://optics.org/news/9/9/5
"To overcome the weight and size restrictions in
integrating AOSLO into handheld form (weighing
less than 200g), we used recent advancements in
the miniaturization of deformable mirror
technology and 2D microelectromechanical
systems (MEMS) scanning, together with a novel
wavefront sensorless AO algorithm and a custom
optical and mechanical design," commented the
team in the Optica paper.
The newprobe andassociatednumericalmethods
could be useful for a variety of applications in
ophthalmic imaging, and the Duke team has made
its designs and computational algorithms available
as open source data for other researchers.
The system was able to image photoreceptors as
close as 1.4 degrees eccentric to the fovea area
of the retina, where photoreceptors have an
average spacing of 4.5 microns. Without AO, the
closest measurement had previously been 3.9
degrees. Further clinical trials with the instrument
will follow, and the researchers plan to incorporate
additional imaging modalities into the
platform that could prove useful for detecting
otherdiseases.
2018
Handheld adaptive optics scanning laser ophthalmoscope
https://doi.org/10.1364/OPTICA.5.001027
http://people.duke.edu/~sf59/HAOSLO.htm
Theodore DuBose, Derek Nankivil, FrancescoLaRocca, GarWaterman,
Kristen Hagan, James Polans, BrentonKeller, DuTran-Viet, Lejla Vajzovic,
Anthony N. Kuo, Cynthia A. Toth, Joseph A. Izatt, and SinaFarsiu
Novel
Components
For Imaging
The use of MEMS tech has already allowed miniaturization of many devices probing visual function.
What more?
Diffractive
Optics for
Fundus
Imaging?
February 2018
Broadband imaging with one planar diffractive lens
https://doi.org/10.1080/24699322.2017.1379143
NabilMohammad,MonjurulMeem,Bing Shen,Peng Wang &Rajesh
Menon |Department of Electrical and Computer Engineering, MACOM Technology
Solutions, Department of Medical Engineering, CaliforniaInstitute ofTechnology
Here, we design, fabricate and characterize
broadband diffractive optics as planar lenses
for imaging. Broadband operation is achieved by
optimizing the phase transmission function for each
wavelength carefully to achieve the desired intensity
distribution at that wavelength in the focal plane. Our
approach is able to maintain the quality of the
images comparable to that achievable with
more complex systems of lenses.
The achromatic lenses were patterned on a
photoresist film atop a glass wafer using grayscale
laser patterning using a Heidelberg Instruments
MicroPG101 tool. The exposure dose was varied as a
function of position in order to achieve the multiple
height levels dictated by the design.
We show here that planar diffractive lenses, when
designed properly and fabricated carefully are sufficient
for broadband imaging. By extending the fabrication
process to industry-relevant lithographic scales and
large area replication via nanoimprinting (Guo2004), it
is possible to envision planar lenses enabling imaging
with very thin form factors, low weights and low costs.
Therefore, we believe that our approach will lead to
considerably simpler, thinner and cheaper imaging
systems.
(a) Schematic of a flat-lens design. The structure is comprised of concentric rings of width Wmin and varying heights. (b) Photograph of one fabricated lens. Optical micrographs of (c) NA = 0.05 and (d) NA = 0.18 lenses. Focal length is 1 mm. Measured full-width at half-maximum (FWHM) of the focal spot as a function of wavelength for (e) NA = 0.05 and (f) NA = 0.18 lenses. Measured focal spots as a function of wavelength for (g) NA = 0.05 and (h) NA = 0.18 lenses.
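For context, the diffraction-limited focal-spot FWHM for the two quoted numerical apertures can be estimated as roughly 0.514·λ/NA; the measured values in the paper will also reflect fabrication and chromatic effects, so this is only a reference point:

```python
# Diffraction-limited FWHM estimate (Airy pattern, ≈ 0.514·λ/NA) for the two lenses.
for NA in (0.05, 0.18):
    for wavelength_nm in (450, 550, 650):
        fwhm_um = 0.514 * wavelength_nm * 1e-3 / NA     # nm -> µm via 1e-3
        print(f"NA={NA:4.2f}, λ={wavelength_nm} nm -> FWHM ≈ {fwhm_um:.1f} µm")
```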
Diffractive
Optics in intraocular
lenses (IOL) and satellite-
based remote sensing
July2018
Fractal-structured multifocal intraocular lens
https://doi.org/10.1371/journal.pone.0200197
LauraRemón, Salvador García-Delpech, Patricia Udaondo, VicenteFerrando, Juan
A. Monsoriu, WalterD. Furlan | Departamento de ÓpticayOptometríayCienciasdelaVisión,
UniversitatdeValència, Burjassot,Spain
In this work, we present a new concept of IOL design inspired by the
demonstrated properties of reduced chromatic aberration and
extended depth of focus of Fractal zone plates. A detailed
description of a proof of concept IOL is provided. The result was
numerically characterized, and fabricated by lathe turning. The
prototype was tested in vitro using dedicated optical system and
software. The theoretical Point Spread Function along the optical axis,
computed for several wavelengths, showed that for each wavelength,
the IOL produces two main foci surrounded by numerous secondary
foci that partially overlap each other for different wavelengths. The result
is that both, the near focus and the far focus, have an extended
depth of focus under polychromatic illumination.
May2018
Modification of Fresnel zone light field spectral imaging system for higher resolution
https://doi.org/10.1117/12.2303898
CarlosDiaz; Anthony L.Franz;Jack A.Shepherd
Air Force Institute ofTechnology (United States)
Recent interest in building an imaging system using
diffractive optics that can fit on a CubeSat (10 cm x 10
cm x 30 cm) and can correct severe chromatic
aberrations inherent to diffractive optics has led to the
development of the Fresnel zone light field
spectral imaging system (FZLFSI). The FZLFSI is a
system that integrates an axial dispersion binary
diffractive optic with a light field (plenoptic)
camera design that enables snapshot spectral
imaging capability.
Metamaterial
Optics for
Fundus
Imaging?
July2017
Metamaterials and imaging
https://dx.doi.org/10.1186/s40580-015-0053-7
Minkyung Kim and Junsuk Rho
Here, we review metamaterial-based lenses which
offer the new types of imaging components and
functions. Perfect lens, superlenses, hyperlenses,
metalenses, flat lenses based on metasurfaces, and
non-optical lenses including acoustic hyperlens are
described.
Not all of them offer sub-diffraction imaging, but they
provide new imaging mechanisms by controlling
and manipulating the path of light. The underlying
physics, design principles, recent advances, major
limitations and challenges for the practical
applicationsarediscussedinthisreview.
Diffraction-free acoustic imaging using metamaterials
allows more efficient underwater sonar sensing,
medical ultra-sound imaging, and non-
destructive materials testing.
Therefore, with the development of nanofabrication and nanomanufacturing methods, and the integration of new creative ideas, continued effort to overcome the limitations discussed in this review could make metamaterial-based imaging a next-generation imaging technology replacing current optical microscopy, which could then be called nanoscopy.
2018
Dynamically tunable and active hyperbolic metamaterials
https://doi.org/10.1364/AOP.10.000354
Joseph S.T.Smalley,FelipeVallini, XiangZhang, andYeshaiahu
Fainman
Here
Metasurface
Optics for
Fundus
Imaging?
July2017
Optics with Metasurfaces: Beyond Refractive and Diffractive Optics
https://doi.org/10.1364/OFT.2017.OW1B.5
MohammadrezaKhorasaninejad
HarvardUniversity
Flat optics based on metasurfaces has the potential to replace/complement conventional refractive/diffractive components. In this talk, we give an overview of our works on dielectric metasurfaces, which have led to high performance components in the visible spectrum.
https://doi.org/10.1364/OPTICA.4.000139
Nano-optic endoscope sees deep into tissue at high resolution. Now, experts in endoscopic imaging at Massachusetts General Hospital (MGH) and pioneers of flat metalens technology at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have teamed up to develop a new class of endoscopic imaging catheters, termed nano-optic endoscopes.
https://doi.org/10.1038/s41566-018-0224-2
Department of Electrical and Computer Engineering, National University of Singapore, Singapore, Singapore – Yao-Wei Huang & Cheng-Wei Qiu

Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterMydbops
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate Agents
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate AgentsRyan Mahoney - Will Artificial Intelligence Replace Real Estate Agents
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate AgentsRyan Mahoney
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity PlanDatabarracks
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024Stephanie Beckett
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
What is Artificial Intelligence?????????
What is Artificial Intelligence?????????What is Artificial Intelligence?????????
What is Artificial Intelligence?????????blackmambaettijean
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersRaghuram Pandurangan
 

Kürzlich hochgeladen (20)

TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL Router
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate Agents
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate AgentsRyan Mahoney - Will Artificial Intelligence Replace Real Estate Agents
Ryan Mahoney - Will Artificial Intelligence Replace Real Estate Agents
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
What is Artificial Intelligence?????????
What is Artificial Intelligence?????????What is Artificial Intelligence?????????
What is Artificial Intelligence?????????
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information Developers
 

Optical Designs for Fundus Cameras

  • 7. Pen-like fundus camera design December 2009 US8836778B2 Portable fundus camera https://patents.google.com/patent/US8836778B2/en Filipp V. Ignatovich, David M. Kleinman, Christopher T. Cotton, Todd Blalock, Lumetrics Inc. Legalese description: “Camera for imaging the fundus of an eye, the camera comprising optics aligned along an imaging axis intersecting a point on the fundus and configured to focus light reflected back from the fundus onto an image receptor, wherein the optics are capable of varying a field of view of the camera along a path circumferentially around the point on the fundus, whereby the image receptor acquires images of portions of the fundus located at different peripheral locations around the point of the fundus.”
  • 8. Spectral characterization of a typical fundus camera September 2010 Spectral characterization of an ophthalmic fundus camera https://doi.org/10.1117/12.844855 Clayton T. Miller; Carl J. Bassi; Dale Brodsky; Timothy Holmes This work describes the characterization of one system, the Topcon TRC-50F, necessary for converting this camera from film photography to spectral imaging with a CCD. The conversion consists of replacing the camera's original xenon flash tube with a monochromatic light source and the film back with a CCD. A critical preliminary step of this modification is determining the spectral throughput of the system, from source to sensor, and ensuring there are sufficient photons at the sensor for imaging (a rough photon-budget sketch follows below).
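To make the "sufficient photons at the sensor" check concrete, here is a minimal photon-budget sketch. Every numeric value (source power, fundus reflectance, optical throughput, exposure, pixel count) is an illustrative assumption, not a figure from the cited characterization.

```python
# Rough photon-budget estimate for a monochromatic fundus illumination path.
# All numeric values are illustrative assumptions, not measured data.
h = 6.626e-34   # Planck constant [J*s]
c = 3.0e8       # speed of light [m/s]

wavelength = 550e-9        # assumed illumination wavelength [m]
power_into_eye = 1e-3      # assumed power entering the eye [W]
fundus_reflectance = 0.01  # assumed diffuse fundus reflectance
optics_throughput = 0.05   # assumed collection + relay-optics throughput
exposure_time = 0.05       # assumed exposure [s]
n_pixels = 1024 * 1024     # assumed CCD pixels covering the field

photon_energy = h * c / wavelength   # energy per photon [J]
photons_at_sensor = (power_into_eye * fundus_reflectance * optics_throughput
                     * exposure_time / photon_energy)
photons_per_pixel = photons_at_sensor / n_pixels

print(f"~{photons_at_sensor:.2e} photons at sensor, ~{photons_per_pixel:.0f} per pixel")
```

Shot-noise-limited SNR then scales with the square root of photons per pixel, which is the quantity such a source-to-sensor characterization ultimately has to guarantee.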
  • 9. Dynamic artifacts: cardiac gating for fundus imaging 2003 Time Course of Fundus Reflection Changes According to the Cardiac Cycle https://iovs.arvojournals.org/article.aspx?articleid=2413124 R. P. Tornow; O. Kopp; B. Schultheiss The aim was to compare the time course of fundus reflection from video sequences (25 frames/sec) at different retinal locations with cardiac parameters. The pulsatile reflection component ΔR(t)/R(t) changes with the cardiac cycle: it rises suddenly during systole, reaches its maximum after about 32% of the pulse duration (RR interval) and decreases towards the end of diastole. The pulse shape of ΔR(t)/R(t) corresponds closely to the cardiac impedance signal, while it differs from the pulse shapes of the peripheral impedance signals. The reflection of the ocular fundus therefore depends on the cardiac cycle. Simultaneous assessment of ΔR(t)/R(t) and the impedance signals allows parameters of ocular microcirculation to be correlated with cardiac parameters and physiologically induced reflection changes to be distinguished from artifacts. Moreover, the pulsatile reflection amplitude has to be taken into account for quantitative imaging such as retinal densitometry (a minimal sketch of the ΔR(t)/R(t) computation follows below). November 2016 Retinal venous pulsation: Expanding our understanding and use of this enigmatic phenomenon https://doi.org/10.1016/j.preteyeres.2016.06.003 William H. Morgan, Martin L. Hazelton, Dao-Yi Yu Recently, improved ophthalmodynamometry and video recording techniques have allowed us to explore the fundamentals of retinal vein pulsation. This demonstrates that retinal venous collapse is in phase with both IOP and CSFP diastole, indicating the dependence upon the CSFP pulse. We describe in some detail the mathematical and physical models of Starling resistors and how their results can be applied to understand the physiology of retinal vein pulsation. October 2017 Automatic Detection of Spontaneous Venous Pulsations Using Retinal Image Sequences https://doi.org/10.1007/978-3-319-68195-5_90 Michal Hracho, Radim Kolar, Jan Odstrcilik, Ivana Liberdova, Ralf P. Tornow Evaluation of the magnitude of spontaneous venous pulsation has been shown to correlate with the occurrence of glaucoma. Based on this relation, a method is proposed that might help to detect glaucoma via detection of spontaneous venous pulsation.
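Once the video frames are registered, ΔR(t)/R(t) is straightforward to compute. The sketch below is a minimal illustration; the function name, ROI handling and the synthetic test signal are my own assumptions, and cardiac-cycle averaging against an impedance signal is left out.

```python
import numpy as np

def pulsatile_reflection(frames, roi):
    """Relative pulsatile reflection component dR(t)/R(t) in a retinal ROI.

    frames : array of shape (T, H, W), registered fundus video frames
    roi    : boolean mask of shape (H, W) selecting the retinal region
    """
    r = frames[:, roi].mean(axis=1)   # mean reflection R(t) over the ROI
    r_mean = r.mean()                 # time-averaged reflection level
    return (r - r_mean) / r_mean      # dR(t)/R(t), unitless

# Usage with a synthetic 25 fps, 10 s sequence (purely illustrative):
t = np.arange(0, 10, 1 / 25)
signal = 100 + 2 * np.sin(2 * np.pi * 1.2 * t)          # ~72 bpm pulsation
frames = signal[:, None, None] * np.ones((1, 64, 64))
print(pulsatile_reflection(frames, np.ones((64, 64), bool)).max())
```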
  • 10. Modify an existing fundus camera for custom applications 2010 High-resolution hyperspectral imaging of the retina with a modified fundus camera Nourrit V, Denniss J, Muqit MM, Schiessl I, Fenerty C, Stanga PE, Henson DB. http://doi.org/10.1016/j.jfo.2010.10.010 This paper gives information on how to convert a standard fundus camera into a hyperspectral camera with off-the-shelf elements (CCD camera, liquid crystal filter, optical fibre and slit lamp projector). Technically, its main limitation is the low transmission of the filter (20% max for unpolarized light below 650 nm), which limits imaging below 460 nm.
  • 14. Fundus self-imaging: “Eye Selfie” from MIT February 2012 US9295388B2 Methods and apparatus for retinal imaging https://patents.google.com/patent/US9295388B2/en Matthew Everett Lawson, Ramesh Raskar, Massachusetts Institute of Technology This invention comprises apparatus for retinal self-imaging. Visual stimuli help the user self-align his eye with a camera. Bi-ocular coupling induces the test eye to rotate into different positions. As the test eye rotates, a video is captured of different areas of the retina. Computational photography methods process this video into a mosaiced image of a large area of the retina. An LED is pressed against the skin near the eye to provide indirect, diffuse illumination of the retina. The camera has a wide field of view and can image part of the retina even when the eye is off-axis (when the eye's pupillary axis and the camera's optical axis are not aligned). Alternately, the retina is illuminated directly through the pupil, and different parts of a large lens are used to image different parts of the retina. Alternately, a plenoptic camera is used for retinal imaging. Computational photography techniques are used to process the multiple images and to produce a mosaiced image. These techniques include (i) “lucky” imaging, in which high-pass filtering is used to identify the images that have the highest quality and to discard poorer-quality images (a minimal frame-selection sketch follows below).
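A minimal sketch of this kind of "lucky" frame selection, assuming a simple Laplacian-based high-pass sharpness score; the function name, the score, and the fraction of frames kept are my own illustrative choices, not details from the patent.

```python
import cv2
import numpy as np

def lucky_select(frames, keep_fraction=0.2):
    """Rank video frames by a high-pass sharpness score and keep the best ones.

    frames : list of grayscale images (uint8 NumPy arrays)
    Returns the top `keep_fraction` of frames, sharpest first.
    """
    # Variance of the Laplacian is a common proxy for high-frequency content.
    scores = [cv2.Laplacian(f, cv2.CV_64F).var() for f in frames]
    order = np.argsort(scores)[::-1]                  # sharpest first
    n_keep = max(1, int(len(frames) * keep_fraction))
    return [frames[i] for i in order[:n_keep]]

# Usage: kept = lucky_select(list_of_frames); the kept frames then feed mosaicking.
```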
  • 15. Fundus “Eye Selfie” http://web.media.mit.edu/~tswedish/projects/eyeSelfie.html T. Swedish, K. Roesch, I. K. Lee, K. Rastogi, S. Bernstein, R. Raskar. eyeSelfie: Self Directed Eye Alignment using Reciprocal Eye Box Imaging. Proc. of SIGGRAPH 2015 (ACM Transactions on Graphics 34, 4), 2015. Self-aligned, mobile, non-mydriatic fundus photography: the user is presented with an alignment-dependent fixation cue on a ray-based display. Once correctly aligned, a self-acquired retinal image is captured. This retinal image can be used for health, security or HMD calibration. Illustration: Laura Piraino https://youtu.be/HuXgrbwOjvM https://www.economist.com/science-and-technology/2015/06/13/retina-selfie Expert-free eye alignment and machine learning for predictive health, Tristan Breaden Swedish https://dspace.mit.edu/handle/1721.1/112543 “I will present a system that includes a novel method for eye self-alignment and automatic image analysis, and evaluate its effectiveness when applied to a case study of a diabetic retinopathy screening program. This work is inspired by advances in machine learning that make accessible interactions previously confined to specialized environments and trained users. I will also suggest some new directions for future work based on this expert-free paradigm.”
  • 17. Check the patent trail. Cited by (14): US20160302665A1 (2015-04-17 / 2016-10-20, Massachusetts Institute of Technology) Methods and Apparatus for Visual Cues for Eye Alignment; US20170000454A1 (2015-03-16 / 2017-01-05, Magic Leap, Inc.) Methods and systems for diagnosing eyes using ultrasound; WO2009081498A1 (2007-12-26 / 2009-07-02, Shimadzu Corporation) Organism image capturing device; EP2583619A1 (2011-10-22 / 2013-04-24, SensoMotoric Instruments GmbH) Apparatus for monitoring one or more surgical parameters of the eye; US20150021228A1 (2012-02-02 / 2015-01-22, Visunex Medical Systems Co., Ltd.) Eye imaging apparatus and systems; US9655517B2 (2012-02-02 / 2017-05-23, Visunex Medical Systems Co. Ltd.) Portable eye imaging apparatus; US9351639B2 (2012-03-17 / 2016-05-31, Visunex Medical Systems Co. Ltd.) Eye imaging apparatus with a wide field of view and related methods; US9237847B2 (2014-02-11 / 2016-01-19, Welch Allyn, Inc.) Ophthalmoscope device; US9211064B2 (2014-02-11 / 2015-12-15, Welch Allyn, Inc.) Fundus imaging system; US9986908B2 (2014-06-23 / 2018-06-05, Visunex Medical Systems Co. Ltd.) Mechanical features of an eye imaging apparatus; US9675246B2 (2014-09-15 / 2017-06-13, Welch Allyn, Inc.) Borescopic optical system for medical diagnostic instruments and medical diagnostic instruments having interlocking assembly features; EP3026884A1 (2014-11-27 / 2016-06-01, Thomson Licensing) Plenoptic camera comprising a light emitting device; US9848773B2 (2015-01-26 / 2017-12-26, Visunex Medical Systems Co. Ltd.) Disposable cap for an eye imaging apparatus and related methods; US20160320837A1 (2015-05-01 / 2016-11-03, Massachusetts Institute of Technology) Methods and Apparatus for Retinal Retroreflection Imaging.
  • 18. Design for minimizing stray light, e.g. use a polarized light source November 2014 Design, simulation and experimental analysis of an anti-stray-light illumination system of fundus camera https://doi.org/10.1117/12.2073619 Chen Ma; Dewen Cheng; Chen Xu; Yongtian Wang A fiber-coupled, ring-shaped light source that forms an annular beam is used to make full use of the light energy. The parameters of the light source, namely its divergence angle and the size of the exit surface of the fiber rod coupler, are determined via simulation in LightTools. Simulation results show that the illumination uniformity of the fundus can reach 90% when a 3~6 mm annular spot on the cornea is illuminated. It is also shown that a smaller divergence angle (i.e., 30°) benefits both the irradiance uniformity of the fundus image on the focal plane (i.e., CCD) and the sharpness of the image profile. To weaken the stray light, a polarized light source is used, and an analyzer whose polarization axis is perpendicular to that of the source is placed after the beam splitter in the imaging system. Simulation shows the average relative irradiance of stray light after stray-light elimination drops to 1% (a crossed-polarizer estimate is sketched below).
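Specular back-reflections from the shared optics largely preserve the source polarization, so a crossed analyzer attenuates them according to Malus's law, while the depolarized light returning from the fundus still passes at roughly half intensity. A minimal sketch, assuming an ideal Malus-law response plus a finite polarizer extinction ratio; both parameters are my own illustrative values, not numbers from the paper.

```python
import numpy as np

def crossed_polarizer_leakage(misalignment_deg, extinction_ratio=1e-3):
    """Fraction of polarization-preserving back-reflection passing the analyzer.

    Malus's law: I/I0 = cos^2(theta), with theta the angle between the light's
    polarization and the analyzer transmission axis (ideally 90 deg for ghosts).
    The extinction ratio models a real polarizer's finite contrast (assumed).
    """
    theta = np.deg2rad(90.0 - misalignment_deg)   # angle to the transmission axis
    return np.cos(theta) ** 2 + extinction_ratio

for err in (0.0, 1.0, 5.0):
    print(f"{err:4.1f} deg misalignment -> {crossed_polarizer_leakage(err):.4f} leakage")
```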
  • 21. 3D Ophthalmoscope Design 2015 Study of optical design of three-dimensional digital ophthalmoscopes https://doi.org/10.1364/AO.54.00E224 Yi-Chin Fang, Chih-Ta Yen, and Chin-Hsien Chu LightTools diagram A 3D optical-zoom sensory system of the human eye with infrared and visible light is proposed (Code V, LightTools) to help the doctor diagnose human eye diseases. The proposed lens design for 3D digital ophthalmoscopes provides a good means of simultaneously accessing infrared and visible light band information to help doctors perform diagnostics. In the MTF plots, the curves stay close to the diffraction limit (almost >0.7) at spatial frequencies up to 40 cycles/mm under all zoom conditions in the IR region. According to the experimental results, the proposed 3D digital ophthalmoscope is suitable for future ophthalmoscope design.
  • 22. D-Eye: 3D-printed optics holders 2015 A Novel Device to Exploit the Smartphone Camera for Fundus Photography http://dx.doi.org/10.1155/2015/823139 Andrea Russo, Francesco Morescalchi, Ciro Costagliola, Luisa Delcassi, and Francesco Semeraro Exploded view of the D-Eye module (angles and distances between components are approximated). Retinal images are acquired using coaxial illumination and imaging paths thanks to a beam splitter (C). The blue arrow depicts the path of the light; the red arrow depicts the path of fundus imaging. Device components: glass platelet (A) with imprinted negative lens (A′), photo-absorbing wall (B), beam splitter (C), mirror (D), plastic case (E), diaphragm (F), polarized filters (G, H), flash and camera glass (J, I), and magnetic external ring (K).
  • 23. Tunable liquid lens with transpupillary illumination to simplify the optical design 2015 Accessible Digital Ophthalmoscopy Based on Liquid-Lens Technology https://doi.org/10.1007/978-3-319-24571-3_68 Christos Bergeles, Pierre Berthet-Rayne, Philip McCormac, Luis C. Garcia-Peraza-Herrera, Kosy Onyenso, Fan Cao, Khushi Vyas, Melissa Berthelot, Guang-Zhong Yang This paper demonstrates a new design integrating modern components for ophthalmoscopy. Simulations show that the optical elements can be reduced to just two lenses: an aspheric ophthalmoscopic lens and a commodity liquid lens, leading to a compact prototype. Circularly polarised transpupillary illumination, used only sparingly so far for ophthalmoscopy, suppresses reflections, while autofocusing preserves image sharpness. Experiments with a human-eye model and cadaver porcine eyes demonstrate the prototype's clinical value and its potential for accessible imaging when cost is a limiting factor.
  • 24. Simplifying the optical design: “substitute the complex illumination system by a ring of LEDs mounted coaxially to the imaging optical system, positioning it in the place of the holed mirror of the traditional optical design.” September 2016 Evaluation of retinal illumination in coaxial fundus camera https://doi.org/10.1117/12.2236973 André O. de Oliveira; Luciana de Matos; Jarbas C. Castro Neto We evaluated the impact of this substitution on image quality (measured through the modulation transfer function) and on the illumination uniformity produced by this system on the retina. The results showed no change in image quality, and no problem was detected concerning uniformity compared to the traditional equipment. Consequently, off-axis components are avoided, easing the alignment of the equipment without reducing either image quality or illumination uniformity. Photograph (left) and optical drawing (center) of the OEMI-7 Ocular Imaging Eye Model (Ocular Instruments Inc.). Picture of the OEMI-7 Ocular Imaging Eye Model (right) obtained using the new equipment: no obscuration is observed and the image is free of reflexes.
  • 25. Optimizing Zemax tools for efficient modelling of fundus cameras November 2016 Minimising back reflections from the common path objective in a fundus camera https://doi.org/10.1117/12.2256633 A. Swat, Solaris Optics S.A. Eliminating back reflections is critical in the design of a fundus camera with an internal illumination system. As there is very little light reflected from the retina, even excellent antireflective coatings do not sufficiently suppress ghost reflections, so the number of surfaces in the optics common to the illumination and imaging paths should be minimised. Typically a single aspheric objective is used. The paper presents an alternative approach, an objective with all spherical surfaces. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typical back-reflection analysis treats subsequent objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path. Standard ghost-control merit function operands are also available in the sequential ray trace, for example in Zemax, but these do not allow a back ray trace in an alternative optical path (illumination vs. imaging). What the paper proposes is a complete method to incorporate ghost-reflected energy into the ray-tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical. Zemax, commonly used among optical specialists, allows a macro script to be called from the system merit function via the ZPLM operand. The optimisation system therefore comprises: the imaging system built in Zemax; a typical merit function constructed for the imaging-system optimisation; an additional line in the merit function calling a macro (operand ZPLM); and the macro itself, which opens a new file where a second system is defined, with as many configurations as the number of surfaces in the common path suspected of generating parasitic back reflections. The macro evaluates the detector flux for each configuration and sums it up; the macro is then closed and the total flux value is returned to the imaging-system merit function. The returned flux becomes one of the merit function components minimised alongside the other imaging-system properties; its weight is individually adjusted by the designer to balance system properties (a conceptual sketch of such a combined merit function follows below).
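The weighting idea is easier to see stripped of the Zemax specifics. The sketch below is a conceptual illustration in Python, not ZPL macro code; `imaging_error` and `ghost_fluxes` stand in for quantities the lens-design program would actually compute, and the weight value is an arbitrary assumption.

```python
# Conceptual illustration (not Zemax ZPL) of folding ghost-reflection energy
# into a single scalar merit function, as the paper's workflow does via the
# ZPLM operand. Names and the default weight are assumptions.

def combined_merit(imaging_error, ghost_fluxes, ghost_weight=10.0):
    """Scalar figure of merit: imaging-quality term plus weighted ghost energy.

    imaging_error : float, e.g. an RMS spot-size / wavefront-error criterion
    ghost_fluxes  : iterable of detector fluxes, one per common-path surface
                    treated as a back-reflecting configuration
    ghost_weight  : designer-chosen balance between image quality and ghosts
    """
    total_ghost_flux = sum(ghost_fluxes)
    return imaging_error + ghost_weight * total_ghost_flux

# The optimiser then varies the lens parameters to minimise combined_merit(...),
# so a design change that sharpens the image but throws a bright ghost onto the
# detector is penalised rather than accepted.
```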
  • 26. Startups focusing on the software stack Nov. 2016 Phelcom [University of São Paulo (USP)], Smart Retinal Camera (SRC), a retinal scanner controlled by an attached smartphone http://revistapesquisa.fapesp.br/en/2017/05/17/eye-on-the-smartphone/ The SRC is designed to perform three kinds of fundus exams: color, red-free, and fluorescein angiography (FA). According to Paulo Schor, professor in the Department of Ophthalmology of the Federal University of São Paulo (Unifesp), devices that rely on smartphones to perform eye exams do not belong to the future but to the present: “They're accessible – that is, easy to operate and cheap.”
  • 27. Miniaturized nonmydriatic fundus camera design March 2017 Optical design of portable nonmydriatic fundus camera https://doi.org/10.1117/12.2268699 Weilin Chen; Jun Chang; Fengxian Lv; Yifan He; Xin Liu; Dajiang Wang The ocular fundus is not luminous itself, and the reflectivity of the retina to visible light is about 0.1% to 10%. If the light-blocking effect of the pupil is considered, the effective reflectivity of the fundus is about 0.1% to 1%. Active illumination is therefore needed to overcome this low reflectivity. The fundus camera uses two kinds of LED as light sources: a 590 nm LED and an 808 nm LED. The pulsed 590 nm LED is used to illuminate the capillary vessels in the ocular fundus and to take pictures, because of the high contrast it gives in the fundus images. Schematic of annular illumination. To evaluate the performance of the lighting system, the optimization results from Zemax were imported into LightTools, and a human eye model was added to perform a non-sequential ray trace; the resulting illumination distribution on the fundus is shown. Sensor: Sony ICX282AQ CCD.
  • 29. Smartphone-class fundus imaging with design choices laid out 2017 A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer https://doi.org/10.1155/2017/4526243 Bailey Y. Shen and Shizuo Mukai Department of Ophthalmology, Illinois Eye and Ear Infirmary, University of Illinois at Chicago; Retina Service, Massachusetts Eye and Ear Infirmary, Harvard Medical School We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. The prototype camera was based on indirect ophthalmoscopy with both infrared and white lights (a hedged capture-loop sketch follows below). Results: the prototype camera weighed 386 grams, and the total cost of the components, including the disposable lens, was $185.20. The prototype, or a camera based on it, may have myriad uses for health professionals. For example, it may be useful for ophthalmologists seeing inpatient consults, as moving inpatients to a stationary fundus camera can be impractical, and many neurosurgery inpatients in the intensive care unit are not allowed to be pharmacologically dilated. The comfort of nonmydriatic imaging may make the camera useful for pediatric ophthalmologists, although alignment might be difficult. Finally, the low cost and small size may make the camera a valuable tool for ophthalmologists practicing global medicine. With added features such as a large memory card and a strong wireless card or cell phone antenna, the device could help providers practice telemedicine.
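As an illustration of how such a Raspberry Pi based camera might drive its LEDs and sensor, here is a minimal sketch using the widely available picamera and RPi.GPIO libraries. The GPIO pin numbers, resolution and timing are assumptions for illustration, not the authors' actual control code.

```python
# Minimal sketch: align and focus under IR illumination, then capture a single
# white-light frame. Pin numbers, resolution and timing are assumed values.
import time
from picamera import PiCamera
import RPi.GPIO as GPIO

IR_LED_PIN = 17      # assumed GPIO pin driving the infrared LED
WHITE_LED_PIN = 27   # assumed GPIO pin driving the white LED

GPIO.setmode(GPIO.BCM)
GPIO.setup([IR_LED_PIN, WHITE_LED_PIN], GPIO.OUT, initial=GPIO.LOW)

camera = PiCamera(resolution=(1640, 1232), framerate=30)
try:
    GPIO.output(IR_LED_PIN, GPIO.HIGH)     # IR preview keeps the pupil dilated
    camera.start_preview()
    time.sleep(5)                          # operator aligns the 20 D lens and the eye
    camera.stop_preview()
    GPIO.output(IR_LED_PIN, GPIO.LOW)

    GPIO.output(WHITE_LED_PIN, GPIO.HIGH)  # brief white-light exposure for colour
    camera.capture('fundus.jpg')
    GPIO.output(WHITE_LED_PIN, GPIO.LOW)
finally:
    camera.close()
    GPIO.cleanup()
```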
  • 30. Open-source optics design blocks accelerating basic design 2018 μCube: A Framework for 3D Printable Optomechanics http://doi.org/10.5334/joh.8 | https://mdelmans.github.io/uCube Mihails Delmans, Jim Haseloff (2018), University of Cambridge, Journal of Open Hardware 2(1), p. 2 For attaching a commercial photo camera lens, a µTMountFace is used, which features a T-Mount adapter ring obtained from a commercial T-Mount adapter. In the M12 CCTV camera lens version, both the lens and the Raspberry Pi Camera are held together by a single part. CAD design in OpenSCAD.
  • 31. Modeling the pupil/iris as the imaging aperture May 18, 2018 The entrance pupil of the human eye https://doi.org/10.1101/325548 Geoffrey Karl Aguirre The precise appearance of the entrance pupil is the consequence of the anatomical and optical properties of the eye and the relative positions of the eye and the observer. This paper presents a ray-traced (MATLAB), exact model eye that provides the parameters of the entrance pupil ellipse for an observer at an arbitrary location and for an eye that has undergone biologically accurate rotation. Calculation of the virtual image location of a pupil boundary point: this 2D schematic shows the cornea and a 2 mm radius pupil aperture of the model eye. A camera is positioned at a 45° viewing angle relative to the optical axis of the eye. The optical system is composed of the aqueous humor, the back and front surfaces of the cornea, and the air. We consider the set of rays that might originate from the edge of the pupil; each of these rays departs from the pupil aperture at some angle with respect to the optical axis of the eye, and we can trace these rays through the optical system (a first-order foreshortening sketch follows below).
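For intuition only, a first-order foreshortening sketch of the entrance-pupil ellipse seen off-axis: a circular aperture viewed at an angle shrinks roughly as the cosine of that angle along its minor axis. This deliberately ignores corneal refraction, which is precisely what the paper's exact ray-traced model adds; the numbers below are illustrative.

```python
import numpy as np

def apparent_pupil_axes(pupil_diameter_mm, viewing_angle_deg):
    """Crude cosine-foreshortening approximation of the entrance-pupil ellipse.

    The major axis is taken as unchanged and the minor axis scales with
    cos(viewing angle); corneal refraction (which magnifies and further
    distorts the entrance pupil) is not modeled here.
    """
    theta = np.deg2rad(viewing_angle_deg)
    major = pupil_diameter_mm
    minor = pupil_diameter_mm * np.cos(theta)
    return major, minor

print(apparent_pupil_axes(4.0, 45.0))   # ~ (4.0, 2.83) mm
```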
  • 33. Smartphone-based wide-field fundus imaging July 2018 A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging https://doi.org/10.1101/364265 Tyson Kim, Frank Myers, Clay Reber, PJ Loury, Panagiota Loumou, Doug Webster, Chris Echanique, Patrick Li, Jose Davila, Robi Maamari, Neil Switz, Jeremy Keenan, Maria Woodward, Yannis Paulus, Todd Margolis, Daniel Fletcher Department of Ophthalmology and Visual Sciences, University of Michigan School of Medicine; Department of Bioengineering and Biophysics Program, University of California, Berkeley; Department of Ophthalmology, University of California, San Francisco; Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis; Department of Physics and Astronomy, San José State University; Chan Zuckerberg Biohub, San Francisco, CA High-quality, wide-field retinal imaging is a valuable method to screen preventable, vision-threatening diseases of the retina. Smartphone-based retinal cameras hold promise for increasing access to retinal imaging, but variable image quality and restricted field of view can limit their utility. We developed and clinically tested a smartphone-based system that addresses these challenges with automation-assisted imaging. The CellScope Retina system was designed to improve smartphone retinal imaging by combining automated fixation guidance, photomontage, and multi-colored illumination with optimized optics, user-tested ergonomics, and a touch-screen interface. System performance was evaluated on images of ophthalmic patients taken by non-ophthalmic personnel. The fixation target is translated through a series of positions, re-orienting the patient's eyes and retina in a rapid and controllable fashion. CellScope Retina was capable of capturing and stitching montage wide-field, 100-degree images of a broad variety of retinal pathologies in the non-optimal imaging conditions of an ophthalmic consultation service and an emergency department setting (a minimal pairwise stitching sketch follows below).
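To make the photomontage step concrete, here is a minimal pairwise stitching sketch with OpenCV (ORB features plus a RANSAC homography). The function name and the crude per-pixel maximum blend are my own simplifications, not the CellScope Retina pipeline; a real montage adds illumination normalisation, multi-band blending and registration quality checks.

```python
import cv2
import numpy as np

def stitch_pair(base, new):
    """Warp grayscale image `new` onto `base` using an ORB + RANSAC homography."""
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(base, None)
    k2, d2 = orb.detectAndCompute(new, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = base.shape
    warped = cv2.warpPerspective(new, H, (w, h))
    return np.maximum(base, warped)   # crude blend: per-pixel maximum

# Usage: montage = stitch_pair(montage, next_frame) applied over the sequence.
```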
  • 34. Phantom development for retinal imaging January 2018 Quantifying Retinal Area in Ultra-Widefield Imaging Using a 3-Dimensional Printed Eye Model https://doi.org/10.1016/j.oret.2017.03.011 Design of the model eye with an axial length of 24 mm, with section A-A representing the coronal plane and section B-B the sagittal plane (top left). The radius of the model is 13 mm (top right) and the walls of the model eye are 2 mm thick. Each model is made up of multiple rings centered at the posterior pole, with each ring separated by 9° as in the image. The top right image represents the sagittal plane, the bottom left image the coronal plane, and the bottom right image the model eye viewed externally. The grid in the original image (left) is traced using Photoshop CS2 (Adobe, San Jose, CA; middle). In this example, the line thickness is set at 5 pixels for ease of the reader; however, for determining the area it was set at 1 pixel for increased accuracy. The traced image was then used to determine the area of each ring in pixels using ImageJ (bottom).
  • 35. Wide-field fundus image quality in clinical practice 2016 Posterior Segment Distortion in Ultra-Widefield Imaging Compared to Conventional Modalities https://doi.org/10.3928/23258160-20160707-06 National Institute for Health Research Moorfields Biomedical Research Centre, Moorfields Eye Hospital and University College London Institute of Ophthalmology, London 2017 Can ultra-widefield retinal imaging replace colour digital stereoscopy for glaucoma detection? https://doi.org/10.1080/09286586.2017.1351998 In conclusion, this study demonstrated almost perfect agreement between colour digital stereoscopy and the Optomap, an ultra-widefield imaging technique, when assessed by a glaucoma specialist. It also showed the UWF technique was reproducible in VCDR estimates. Our data suggest that UWF imaging may be suitable for diagnosing glaucoma in situations in which slit-lamp biomicroscopy or digital colour stereoscopy is not available, and further research on the comparative diagnostic performance of UWF and other imaging technologies may be warranted. 2018 Peripheral Retinal Imaging Biomarkers for Alzheimer's Disease: A Pilot Study https://doi.org/10.1159/000487053 Whether ultra-widefield (UWF, Optos P200C AF) retinal imaging can identify biomarkers for Alzheimer's disease (AD) and its progression … after clinical progression over 2 years, suggesting that monitoring pathological changes in the peripheral retina might become a valuable tool in AD monitoring. The proposed averaging of images taken 90° apart can improve the quality of the images obtained using the Optos system. An acknowledgment and correction of this posterior segment distortion will increase the accuracy that the Optos system has to offer.
  • 36. Clinical Reviews April 2016 ULTRA-WIDEFIELD FUNDUS IMAGING: A Review of Clinical Applications and Future Trends http://doi.org/10.1097/IAE.0000000000000937 Over the last 40 years, several innovative approaches to wide-angle fundus imaging have been introduced. These include the Pomerantzeff camera, the Panoret-1000, the RetCam, and various contact and noncontact lens-based systems. These instruments can provide 100° to 160° panoramic photographs using either traditional fundus photography or confocal SLO (cSLO). A major disadvantage of several of these approaches, including the Pomerantzeff camera, the Panoret-1000, the RetCam, and the Staurenghi lens, is the utilization of a contact lens, which requires a skilled photographer to hold the camera and lens in place during image acquisition. A major source of frustration for retinal physicians has been the difficulty associated with creating fundus drawings in electronic health record (EHR) systems. A potential solution would be the seamless integration of a UWF color or angiographic image into the examination note, supplemented with annotations by the physician to mark the important findings. Schematic illustration of ultra-widefield imaging (Optos) of the retina using an ellipsoidal mirror: a laser light source is reflected off the galvanometer mirrors onto an ellipsoidal mirror whose second focal point resides within the eye, which facilitates image acquisition anterior to the equator. Optos ultra-widefield fluorescein angiography of proliferative diabetic retinopathy: right (A) and left (B) eyes of a patient with scattered microaneurysms, peripheral capillary nonperfusion, and focal leakage consistent with neovascularization elsewhere. The peripheral neovascularization and nonperfusion are not detectable using traditional seven-field fundus imaging (C and D).
  • 37. Deep learning with wide-field retinal imaging 2017 Accuracy of deep learning, a machine-learning technology, using ultra-wide-field fundus ophthalmoscopy for detecting rhegmatogenous retinal detachment https://doi.org/10.1038/s41598-017-09891-x This study had several limitations: when clarity of the eye is reduced because of severe cataract or dense vitreous haemorrhage, capturing images with Optos becomes challenging, so such cases were not included in the study. July 2018 Deep-learning Classifier With an Ultrawide-field Scanning Laser Ophthalmoscope Detects Glaucoma Visual Field Severity https://doi.org/10.1097/IJG.0000000000000988 To evaluate the accuracy of detecting glaucoma visual field defect severity using a deep-learning (DL) classifier with an ultrawide-field scanning laser ophthalmoscope. May 2018 Accuracy of ultra-wide-field fundus ophthalmoscopy-assisted deep learning, a machine-learning technology, for detecting age-related macular degeneration https://doi.org/10.1007/s10792-018-0940-0 A combination of a DCNN with Optos images is not better than a medical examination; however, it can identify exudative AMD with a high level of accuracy, so the system is considered useful for screening and telemedicine (a minimal transfer-learning sketch of such a classifier follows below).
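The cited studies do not publish their training code here, so the sketch below is a generic transfer-learning setup for a binary ultra-wide-field image classifier (e.g. detachment vs. no detachment) in PyTorch/torchvision. The dataset folder name, image size and hyperparameters are illustrative assumptions.

```python
# Minimal transfer-learning sketch for a two-class UWF-image classifier.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((512, 512)),
    transforms.ToTensor(),
])
# Assumed layout: uwf_train/<class_name>/<image files>
train_set = datasets.ImageFolder("uwf_train/", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=8, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)   # replace the head with 2 classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```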
  • 39. Modeling the optics of the rat eye with ZEMAX April 2011 Novel non-contact retina camera for the rat and its application to dynamic retinal vessel analysis https://doi.org/10.1364/BOE.2.003094 A novel optical model of the rat eye was developed for use with standard ZEMAX optical design software, facilitating both sequential and non-sequential modes. A retinal camera for the rat was constructed using standard optical and mechanical components. The addition of a customized illumination unit with a xenon fiber-coupled light source and existing standard software enabled dynamic vessel analysis.
  • 40. Optimizing fundus image quality for a rat model (λpeak = 580 nm, half-bandwidth = 19 nm) October 2015 Investigating the influence of chromatic aberration and optical illumination bandwidth on fundus imaging in rats https://doi.org/10.1117/1.JBO.20.10.106010 Noninvasive, high-resolution retinal imaging of rodent models is highly desired for longitudinally investigating pathogenesis and therapeutic strategies. However, due to severe aberrations, retinal image quality in rodents can be much worse than in humans. We numerically and experimentally investigated the influence of chromatic aberration and optical illumination bandwidth on retinal imaging. We confirmed that rat retinal image quality decreased with increasing illumination bandwidth, and achieved a retinal image resolution of 10 μm using a 19 nm illumination bandwidth centered at 580 nm in a home-built fundus camera. Furthermore, we observed higher chromatic aberration in albino rat eyes than in pigmented rat eyes. This study provides a design guide for high-resolution fundus cameras for rodents; the method is also beneficial to dispersion compensation in multiwavelength retinal imaging applications.
  • 41. Contact lenses for water-immersion imaging July 2018 Effect of a contact lens on mouse retinal in vivo imaging: Effective focal length changes and monochromatic aberrations https://doi.org/10.1016/j.exer.2018.03.027 For in vivo mouse retinal imaging, especially with adaptive optics instruments, application of a contact lens (with GelTeal) is desirable, as it allows maintenance of corneal hydration and helps prevent cataract formation during lengthy imaging sessions. However, since the refractive elements of the eye (cornea and lens) serve as the objective for most in vivo retinal imaging systems, the use of a contact lens, even with 0 Dpt refractive power, can alter the system's optical properties. In this investigation we examined the effective focal length change and the aberrations that arise from the use of a contact lens. Based on the ocular wavefront data we evaluated the effect of the contact lens on imaging-system performance as a function of pupil size. These results provide information for determining the optimum pupil size for retinal imaging without adaptive optics, and raise critical issues for the design of mouse optical imaging systems that incorporate contact lenses. The effect of a contact lens and gel on ocular aberration is complex; in our system, the use of a contact lens introduced vertical coma and spherical aberrations above those of the native eye.
  • 42. Tunable goggle lens for rodent models July 2017 Optical modelling of a supplementary tunable air-spaced goggle lens for rodent eye imaging https://doi.org/10.1371/journal.pone.0181111 In this study, we present the concept of a tunable goggle lens designed to compensate individual ocular aberration for different rodent eye powers. Ray tracing shows that lens-fitted goggles permit not only adjustment to individual eye power but also surpass the conventional adaptive correction technique over a large viewing angle, provided a minimum of two spaced liquids is used. We believe that the overlooked advantage of the 3D lens function is a seminal finding for further technological advancements in widefield retinal imaging. Example of a multi-element lens-fitted goggle rigidly fastened to an optical system: the goggle lens, having a corneal matching index (see Jiang et al. 2016 for details), is made of a plurality of liquid-filled cavities with distinct refractive indices, separated by elastic surface membranes that enable a static correction of the eye by restructuring the rodent cornea.
  • 43. Improving contact lens modelling itself for all imaging studies 2018 Nonpupil adaptive optics for visual simulation of a customized contact lens https://doi.org/10.1364/AO.57.000E57 We present a method for determining the deformable mirror profile to simulate the optical effect of a customized contact lens in the central visual field. Using nonpupil-conjugated adaptive optics allows a wider-field simulation compared to traditional pupil-conjugated adaptive optics. For a given contact lens, the mirror shape can be derived analytically using Fermat's principle of the stationary optical path, or numerically using optimization in ray-tracing programs such as Zemax. Diagram of the aspheric contact lens and schematic eye model.
  • 45. Visible-light flashes cannot really be used: the pupil constricts in response to the flash, and dynamic pupillometry can in fact be done with the flash of commercial fundus cameras (see right →). This is not a problem if you always image through the central 2 mm of the pupil, for example with Maxwellian optics. 2018 Pupillary Abnormalities with Varying Severity of Diabetic Retinopathy https://doi.org/10.1117/12.2036970 Mukesh Jain, Sandeep Devan, Durgasri Jaisankar, Gayathri Swaminathan, Shahina Pardhan & Rajiv Raman Pupil measurements were performed with a 1/3-inch infrared camera and flash light (10 Ws xenon flash lamp, Orion Fundus Camera, Nidek Technologies, Italy). Spectral power distribution of a Canon Speedlite 540EZ consumer xenon photographic flash; SPD at full intensity (1/1). Xenon flashes are typically powered by capacitor banks: the more capacitors involved, the higher the time constant and thus the longer the flash duration (a back-of-the-envelope sketch follows below). http://www.fredmiranda.com/forum/topic/1485638 Xenon flash tube in a fundus camera design http://doi.org/10.1167/iovs.12-10449 Kenneth Tran; Thomas A. Mendel; Kristina L. Holbrook; Paul A. Yates
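The capacitor-bank remark is simple RC arithmetic: the discharge time scales with the product of the effective circuit resistance and the bank capacitance, so paralleling more capacitors lengthens the pulse. The component values below are illustrative assumptions, not measurements of any particular flash unit.

```python
# Back-of-the-envelope xenon flash duration from the capacitor bank (t ~ n * R * C).
def flash_duration_s(capacitance_f, effective_resistance_ohm, n_time_constants=3.0):
    """Approximate pulse length as a few RC time constants of the discharge."""
    return n_time_constants * effective_resistance_ohm * capacitance_f

single_cap = 1000e-6     # one 1000 uF capacitor (assumed)
bank = 4 * single_cap    # four such capacitors in parallel
r_tube = 0.5             # assumed effective flashtube + circuit resistance [ohm]

print(flash_duration_s(single_cap, r_tube) * 1e3, "ms")   # ~1.5 ms
print(flash_duration_s(bank, r_tube) * 1e3, "ms")         # ~6 ms
```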
  • 46. Either continuous IR lighting, or an optical design that copes with the small light-adapted pupil 2004 Observation of the ocular fundus by an infrared-sensitive video camera after vitreoretinal surgery assisted by indocyanine green https://www.ncbi.nlm.nih.gov/pubmed/12707597 F-10 confocal digital ophthalmoscope from NIDEK http://usa.nidek.com/products/scanning-laser-ophthalmoscope/ Downside: near-infrared video does not necessarily capture all the features of the fundus. 2008 US20100245765A1 Video infrared ophthalmoscope https://patents.google.com/patent/US20100245765/en David S. Dyer, James Higgins, Dyer Holdings LLC March 2016 Fundus Photography in the 21st Century — A Review of Recent Technological Advances and Their Implications for Worldwide Healthcare https://doi.org/10.1089/tmj.2015.0068 Nishtha Panwar, Philemon Huang, Jiaying Lee, Pearse A. Keane, Tjin Swee Chuan, Ashutosh Richhariya, Stephen Teoh, Tock Han Lim, and Rupesh Agrawal Oculus ImageCam 2 Digital Slit Lamp Camera: different segments of the eye, such as the anterior segment, fundus, and sclera, can be conveniently imaged by setting a suitable exposure time, light magnification, and white balance; additional video sequences can be recorded by the high-resolution digital video camera in the beam path of the slit lamp. Volk Pictor enables nonmydriatic fundus examination with an improved 40° FoV, with modifications allowing still images and videos of the optic disc, macula, and retinal vasculature. The Horus Scope from JedMed (St. Louis, MO) is a hand-held ophthalmoscopic adaptor for viewing the retina and capturing video and still images that can be easily transferred to a personal computer. Riester Ri-Screen Multifunctional Digital Camera System: this slit lamp-based system, together with an attachable ophthalmoscopic lens, enables retinal ophthalmic imaging and nonmydriatic eye fundus examination; the Riester (Jungingen, Germany) Ri-Screen provides digital images and video to support screening and documentation of ocular lesions and anomalies. Smartphone-based approach from Harvard Medical School, Boston (Haddock et al. 2013): the iPhone camera's built-in flash was used for acquiring images and an external 20 D lens for focusing, with the Filmic Pro app (£5.99) controlling light intensity, exposure, and focus. A Koeppe lens was used for imaging patients under anesthesia, and still images were then retrieved from the recorded video (as with D-Eye). When imaging the fundus of rabbits, 28 D or 30 D lenses have been shown to give better results.
  • 47. Stripe-field method SPIE BiOS, 2014 Non-mydriatic, wide-field fundus video camera https://doi.org/10.1117/12.2036970 Bernhard Hoeher; Peter Voigtmann; Georg Michelson; Bernhard Schmauss We describe a method we call “stripe-field imaging” that is capable of capturing wide-field color fundus videos and images of the human eye at pupil sizes of 2 mm. We designed the demonstrator as a low-cost device consisting of mass-market components to show that there is no major additional technical outlay in realizing the improvements we propose. The technical core idea of the method is breaking the rotational symmetry of the optical design found in many conventional fundus cameras. By this measure the possible field of view (FOV) at a pupil size of 2 mm could be extended from a circular field 20° in diameter to a field 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid; the resulting video showed changes at vessels in the region of the papilla and a change in the paleness of the papilla. Stripe-field method: 1st and 2nd Purkinje reflections are focused onto the unused lower black stripe and the 4th Purkinje reflection onto the upper unused stripe, gaining unlimited width of the field of view in the center.
  • 48. Binocular Ophthalmoscope 2017 SPIE Binocular video ophthalmoscope for simultaneous recording of sequences of the human retina to compare dynamic parameters https://doi.org/10.1117/12.2282898 Ralf P. Tornow, Aleksandra Milczarek, Jan Odstrcilik, and Radim Kolar A parallel video ophthalmoscope was developed to acquire short video sequences (25 fps, 250 frames) of both eyes simultaneously with exact synchronization. Video sequences were registered off-line to compensate for eye movements (a minimal registration sketch follows below). From the registered video sequences, dynamic parameters like cardiac-cycle-induced reflection changes and eye movements can be calculated and compared between eyes.
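A minimal sketch of such off-line frame registration using phase correlation; the cited work's actual registration method may differ, and the function name, upsampling factor and pure-translation assumption are my own simplifications.

```python
# Off-line registration of a (T, H, W) grayscale sequence by phase correlation.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def register_sequence(frames):
    """Align every frame of a (T, H, W) sequence to the first frame (translation only)."""
    reference = frames[0]
    registered = [reference]
    for frame in frames[1:]:
        offset, _, _ = phase_cross_correlation(reference, frame, upsample_factor=10)
        registered.append(nd_shift(frame, offset))   # translate frame onto the reference
    return np.stack(registered)

# After registration, ROI time series such as dR(t)/R(t) can be computed and
# compared between the two synchronized eye videos.
```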
  • 49. Concept design of what a portable binocular fundus camera could look like. It does not hurt at all to think about the UX for the end-user (clinician, a non-trained operator who could be the patient, or an optician, for example). Naturally this does not exclude the need for good optical design and good computational image enhancement; combine these all into one solution and you will have a successful business that brings actual value to patients instead of the often over-hyped “startup value”. 2018 Korean startup ROOTEE HEALTH: “ELI (Eye-Linked-Information), a wearable fundus camera, possesses an auto-shot feature which removes the need for manually adjusting the camera to focus on the retina. This removes the need for patients to undergo taking several photos with flashes. With the use of ELI, patients previously undiagnosed or lost in between their 'first' diagnosis of diabetes and later-arising diabetic complications related to the eye will be possible to prevent. Providing the Internal Medicine department the tool to diagnose diabetic retinopathy is crucial as timing for treatment must be in early stages. There's a gap between the first prototype and ELI; we want to improve cost-effectiveness and accuracy by using adaptive optics & deep learning technology.”
  • 51. Fundus imaging for the Stiles-Crawford effect 2017 SPIE Development of a fundus camera for analysis of photoreceptor directionality in the healthy retina http://hdl.handle.net/10362/15618 Author: Anjos, Pedro Filipe dos Santos; Advisors: Vohnsen, Brian; Vieira, Pedro The Stiles-Crawford effect (SCE) is the well-known phenomenon in which the brightness of light perceived by the human eye depends upon its entrance point in the pupil. Retinal imaging, a widespread clinical practice, may be used to evaluate the SCE and thus serve as a diagnostic tool; nonetheless, its use for such a purpose is still underdeveloped and far from clinical reality. In this project a fundus camera was built and used to assess cone photoreceptor directionality by reflective imaging of the retina in healthy individuals (the standard directionality model is sketched below). Diagram of the final system: 1 – Illuminator; 2 – Optical Fibre; 3 – Millimetric Stage; 4 – Red Filter; 5 – Iris Diaphragm; 6 – Maxwellian Lens; 7 – Beamsplitter; 8 – Imaging Lens; 9 – Zoom Lens; 10 – Sensor; 11 – Desktop Computer.
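The customary way to quantify this directionality is a Gaussian-shaped efficiency function across the pupil, eta(r) = 10^(-rho * (r - r_peak)^2). The sketch below uses a typical foveal directionality parameter rho as an illustrative value, not a number measured in the cited thesis.

```python
import numpy as np

def stiles_crawford_efficiency(r_mm, rho=0.05, r_peak_mm=0.0):
    """Relative luminous efficiency vs. pupil entry position (SCE-I).

    Standard parameterisation: eta(r) = 10 ** (-rho * (r - r_peak)**2), with r
    the entry point [mm] from the pupil centre and rho the directionality
    parameter (~0.05 /mm^2 is a commonly quoted foveal value).
    """
    return 10.0 ** (-rho * (r_mm - r_peak_mm) ** 2)

for r in (0.0, 1.0, 2.0, 3.0):
    print(f"entry {r:.0f} mm -> relative efficiency {stiles_crawford_efficiency(r):.2f}")
```

Fitting this function to reflectance measured at different pupil entry points is what "assessing photoreceptor directionality" amounts to in practice.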
  • 53. Inspiration for compact ophthalmic imaging designs: trans-epidermal illumination Quantitative phase imaging of retinal cells https://arxiv.org/abs/1701.08854 By collecting the scattered light through the pupil, the partially coherent illumination produces dark-field images, which are combined to reconstruct a quantitative phase image with twice the numerical aperture given by the eye's pupil. We then report, to our knowledge, the very first human in vivo phase images of inner retinal cells with high contrast. (a) Trans-epidermal illumination by means of a flexible PCB containing LEDs placed in contact with the skin of the eyelid; light is then transmitted inside the eyeball, and after scattering off the eye fundus, the light passing through the retina's cell layers is collected by the eye lens. (b) Flexible PCB holding 4 red LEDs. (c) Recording and reconstruction procedure for in vivo measurement. (d) Experimental setup: the light scattered from the retina is collected by lens L1; the 4f system composed of lenses L1 and L2 is adjusted for defocus thanks to a Badal system; lens L2 forms an image of the pupil plane at its focal distance, while lens L3 forms an image of the retina on the EMCCD camera; Dic: dichroic mirror; synchronization between the LEDs and the camera is performed by a programmable board. – Timothé Laforest et al. (2017) Illumination of the retinal layers provided by transscleral illumination: the light is first transmitted through the sclera, RPE and retina; after travelling through the aqueous humor it impinges on the RPE, where backscattering generates a new illumination beam. This secondary illumination provides transmission light propagating through the translucent layers of the retina, which is then collected by the pupil. Azimuthal angle θ and polar angle α.
  • 54. Trans-palpebral illumination Paper 1: Trans-palpebral illumination: an approach for wide-angle fundus photography without the need for pupil dilation Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao https://doi.org/10.1364/OL.41.002688 Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016) “Retinal field of view: interior angle of 152°, exterior angle 105°.” Digital super-resolution algorithms are also being considered for further resolution improvements [Thapa et al. 2014]. In addition to the smartphone-based prototype imaging device, we are currently constructing a benchtop prototype for testing the feasibility of wide-angle fluorescein angiography employing the trans-palpebral illumination.
• 55. Trans-pupillary illumination — Paper 2
Near-infrared light-guided miniaturized indirect ophthalmoscopy for nonmydriatic wide-field fundus photography
Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao
https://doi.org/10.1364/OL.41.002688
Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016)
• 56. Trans-pars-planar illumination
Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide-field photography
Benquan Wang, Devrim Toslak, Minhaj Nur Alam, R. V. Paul Chan & Xincheng Yao
https://doi.org/10.1038/s41598-018-27112-x
Scientific Reports volume 8, Article number: 8768 (2018)
The Panoret-1000™ employed trans-scleral illumination to image the retina from the optic disc to the ora serrata in a single-shot image (Shields et al. 2003). However, clinical deployment of trans-scleral illumination was not successful, and the Panoret-1000™ is no longer commercially available. It failed for several limiting reasons. First, the contact-mode imaging it employed was not favorable for patients: direct contact of the illuminating and imaging parts with the eyeball might produce inflammation, contamination, and abrasion of the sclera and cornea. Second, it was difficult to operate the system to obtain good retinal images. In the Panoret-1000™, the digital camera and light illuminator were separate from each other; one hand operated the camera while the other adjusted the illuminator, and the simultaneous need for both hands made the device very difficult to use.
Instead of using a light illuminator contacting the eyelid (trans-palpebral illumination)13 or sclera (trans-scleral illumination)10,11, trans-pars-planar illumination is totally contact-free, projecting the illuminating light through the pars plana.
Representative fundus images with illumination at different locations. (a) Illustration of different illumination locations. (b) Fundus images acquired at different illumination locations; b1-b3 were acquired at the corresponding locations P1-P3 in panel a. (c) Average intensity of fundus images collected with constant-power illumination delivered through different locations. The curve is an average of 5 trials from one subject; the gray shadow shows standard deviation. P1-P3 corresponds to images b1-b3. (d) Red, green and blue channels of fundus image b2. (e) Normalized fundus image b2, with digital compensation of the red and green channel intensities. Macula, optic disc, nerve fiber bundles and blood vessels can be clearly identified.
• 58. Transcranial illumination
Artifacts caused by retinal surface reflex are often encountered, which complicate quantitative interpretation of the reflection images. We present an alternative illumination method which avoids these artifacts. The method uses deeply penetrating near-infrared (NIR) light delivered transcranially from the side of the head, and exploits multiple scattering to redirect a portion of the light towards the posterior eye.
August 2018: Non-mydriatic chorioretinal imaging in a transmission geometry and application to retinal oximetry
doi: 10.1364/BOE.9.003867
Timothy D. Weber and Jerome Mertz, Boston University. Project: Transcranial retinal imaging
A: Schematic of fundus transillumination and imaging. LEDs at several central wavelengths (λN) are imaged via coupling optics (CO), comprised of lenses f1 and f2, onto the proximal end of a flexible fiber bundle (FB). A commercial fundus camera (FC) images the transilluminated fundus onto a camera (CCD). B: Example raw image recorded on the CCD. C: Normalized measured spectra of available high-power deep-red and NIR LEDs.
This unique transmission geometry simplifies absorption measurements and enables flash-free, non-mydriatic imaging as deep as the choroid. Images taken with this new transillumination approach are applied to retinal oximetry.
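To make the oximetry step concrete, below is a minimal sketch of the generic two-wavelength ratio approach used in retinal oximetry (vessel optical density at an oxygen-sensitive and a near-isosbestic wavelength, mapped to saturation by a linear calibration). This is not the actual processing pipeline of Weber & Mertz; the wavelengths (660/810 nm), intensities and calibration constants are illustrative assumptions.

```python
# Minimal sketch of generic two-wavelength retinal oximetry (not the authors' pipeline).
import numpy as np

def optical_density(i_vessel: np.ndarray, i_background: np.ndarray) -> np.ndarray:
    """OD = log10(I_background / I_vessel) for samples inside vs. beside a vessel."""
    return np.log10(i_background / i_vessel)

def oxygen_saturation(od_sensitive, od_isosbestic, a=1.28, b=1.24):
    """Linear calibration SO2 ~ a - b * ODR; a and b are placeholder constants."""
    odr = od_sensitive / od_isosbestic
    return np.clip(a - b * odr, 0.0, 1.0)

# Toy example: mean intensities sampled inside and beside a vessel at two wavelengths.
od_660 = optical_density(np.array([170.0]), np.array([200.0]))   # oxygen-sensitive
od_810 = optical_density(np.array([140.0]), np.array([200.0]))   # near-isosbestic
print(f"estimated SO2: {oxygen_saturation(od_660, od_810)[0]:.2f}")
```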
• 59. Aplanat fundus imaging
June 2018: Fundus Imaging using Aplanat
https://doi.org/10.1080/24699322.2017.1379143
Vishwanath Manik Rathod, M.Sc. Thesis, Indian Institute of Technology Hyderabad
In this thesis, we suggest alternative optics for fundus imaging. The design constraints of an aplanat help to remove all major aberrations seen with lenses without adding any extra corrective measures, unlike lens-based optical systems. And since the proposed system does not have a complex set of lenses, the complexity of the system is reduced, which helps to cut cost significantly. A major advantage of the system is that it offers a wide numerical aperture and a large field of view while the system size remains that of a handheld device. The large NA and high radiation efficiency abolish the need for pupil dilation, making the process painless for the patient.
The workflow: coordinates generated in MATLAB are exported to Solid Edge, where the aplanat reflector is built as a CAD object and then imported into Zemax. Zemax supports four CAD formats: STL, IGES, STEP and SAT. Of these, only STL uses facets to represent the object; the other three model the object as a smooth, continuous surface shape. (Steps in Solid Edge.)
In order to image the retina completely, three imaging phases are needed: a narrow-field aplanat images the region near the optical axis of the eye, a wide-throat aplanat images the peripheral region, and the central hole that remains unseen through the aplanats can be imaged with a normal lens system. Exploiting the overlap between the images from all three sub-systems, a stitching algorithm can then be used to form a complete image (see the sketch below). This system leads to a total FOV of 200°.
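A minimal sketch of the final stitching step suggested above, combining the three overlapping partial retina images into one wide-field mosaic with OpenCV's high-level stitcher. The file names are hypothetical placeholders, and SCANS mode (affine model) is an assumption on my part; real retinal montaging typically relies on vessel-based feature matching instead.

```python
# Minimal sketch: mosaic three overlapping partial fundus views with OpenCV.
import cv2

images = [cv2.imread(p) for p in ("narrow_field.png",    # near-axis aplanat view
                                  "wide_throat.png",     # peripheral aplanat view
                                  "central_hole.png")]   # lens-system view of the gap

# SCANS mode assumes an affine relation between views, suitable for flat-ish mosaics.
stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(images)
if status == cv2.Stitcher_OK:
    cv2.imwrite("retina_mosaic.png", mosaic)
else:
    print(f"stitching failed with status {status}")
```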
• 60. Freeform optics fundus imaging?
May 2018: Starting geometry creation and design method for freeform optics
https://doi.org/10.1038/s41467-018-04186-9
Aaron Bauer, Eric M. Schiesser & Jannick P. Rolland
May 2018: Over-designed and under-performing: design and analysis of a freeform prism via careful use of orthogonal surface descriptions
https://doi.org/10.1117/12.2315641
Nicholas Takaki; Wanyue Song; Anthony J. Yee; Julie Bentley; Duncan Moore; Jannick P. Rolland – Institute of Optics, Univ. of Rochester
http://focal.nl/en/#technology
Medical Optical Systems, DEMCON Focal, Institutenweg 25A, 7521 PH Enschede
May 2018: Design of Freeform Illumination Optics
https://doi.org/10.1002/lpor.201700310
Rengmao Wu, Zexin Feng, Zhenrong Zheng, Rongguang Liang, Pablo Benítez, Juan C. Miñano, Fabian Duerr
The review focuses on the design of freeform illumination optics, which is a key factor in advancing the development of illumination engineering.
May 2018: High-performance integral-imaging-based light field augmented reality display using freeform optics
https://doi.org/10.1364/OE.26.017578
Hekun Huang and Hong Hua
• 61. Freeform optics for corneal imaging
2017: Freeform optical design for a nonscanning corneal imaging system with a convexly curved image
https://doi.org/10.1364/AO.56.005630
Yunfeng Nie, Herbert Gross, Yi Zhong, and Fabian Duerr
"The lateral resolution on the cornea is about 10 μm with good modulation transfer function (MTF) and spot performance. To ease the assembly, a monolithic design is achieved with slightly lower resolution, leading to a potential mass production solution."
• 62. Additive manufacturing optics for fundus imaging? (3D printing optics, in other words)
June 2018: 3D printed photonics and free-form optics
http://www.uef.fi/en/web/photonics/3d-printed-photonics-and-free-form-optics
http://optics.org/news/4/6/8
Jyrki Saarinen, Jari Turunen (design), Markku Kuittinen, Anni Eronen, Yu Jiang, Petri Karvinen, Ville Nissinen, Henri Partanen, Pertti Pääkkönen, Leila Ahmadi, Rizwan Ali, Bisrat Assefa, Olli Ovaskainen, Dipanjan Das, Markku Pekkarinen
Dutch start-up LUXeXceL has invented the Printoptical® technology for 3D printing optical elements. Their technology is based on an inkjet printing process. In collaboration with Luxexcel, the University of Eastern Finland will further develop the Printoptical® technology.
June 2016: Additive manufacturing of optical components
https://doi.org/10.1515/aot-2016-0021
Andreas Heinrich / Manuel Rank / Philippe Maillard / Anne Suckow / Yannick Bauckhage / Patrick Rößler / Johannes Lang / Fatin Shariff / Sven Pekrul
Additive manufacturing technology offers high potential in the field of optics as well. Owing to new design possibilities, completely new solutions are possible. This article briefly reviews and compares the most important additive manufacturing methods for polymer optics. Additionally, it points out the characteristics of additively manufactured polymer optics. Surface quality is of crucial importance; in order to improve it, appropriate post-processing steps are necessary (e.g. robot polishing or coating), which will be discussed. An essential part of this paper deals with various additively manufactured optical components and their use, especially in optical systems for shape metrology (e.g. borehole sensor, tilt sensor, freeform surface sensor, fisheye lens). The examples demonstrate the potentials and limitations of optical components produced by additive manufacturing.
Feb 2018: Additive manufacturing of reflective and transmissive optics: potential and new solutions for optical systems
https://doi.org/10.1117/12.2293130
A. Heinrich; R. Börret; M. Merkel; H. Riegel
Additive manufacturing enables the realization of complex shaped parts. This also provides a high potential for optical components. Thus elements with virtually any geometry can be realized, which is often difficult with conventional fabrication methods. Depending on the material and thus the manufacturing method used, either transparent optics or reflective optics can be developed with the aid of additive manufacturing. Our aim is to integrate additively manufactured optics into optical systems. Therefore we present different examples in order to point out new possibilities and new solutions enabled by 3D printing of the parts. In this context, the development of 3D printed reflective and transmissive adaptive optics will be discussed as well.
• 64. Fundus stereo imaging
2008: Quantitative depth analysis of optic nerve head using stereo retinal fundus image pair
http://doi.org/10.1117/1.3041711
Toshiaki Nakagawa, Takayoshi Suzuki, Yoshinori Hayashi, Tetsuya Yamamoto et al.
Convergent visual system for depth calculation of a stereo image pair.
March 2014: 3D papillary image capturing by the stereo fundus camera system for clinical diagnosis on retina and optic nerve
https://doi.org/10.1117/12.2038435
Danilo A. Motta; André Serillo; Luciana de Matos; Fatima M. M. Yasuoka; Vanderlei Salvador Bagnato; Luis A. V. Carvalho
2012: Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs
http://doi.org/10.1167/iovs.12-9803
Li Tang; Randy H. Kardon; Jui-Kai Wang; Mona K. Garvin; Kyungmoo Lee; Michael D. Abràmoff
Comparison of ONH shape estimates from stereo fundus photographs and OCT scans using topographic maps. (A) Reference (left) fundus image wrapped onto the reconstructed topography as output from the stereo photographs. Small squares with different colors mark the four corners of the reference image, indicating the orientation of the retinal surface rendering. (B) Fundus image wrapped onto the reconstructed topography as output from the OCT scans from the same view angle.
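As a minimal sketch of the depth-from-stereo idea behind these papers (not their actual algorithms), the snippet below computes a disparity map from a rectified fundus image pair with OpenCV's semi-global block matcher. The file names, block size and disparity range are illustrative assumptions; a metric depth map would additionally need the calibrated baseline and focal length.

```python
# Minimal sketch: disparity from a rectified stereo fundus pair (OpenCV SGBM).
import cv2
import numpy as np

left = cv2.imread("fundus_left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
right = cv2.imread("fundus_right.png", cv2.IMREAD_GRAYSCALE)

# numDisparities must be a multiple of 16; values here are placeholders.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Larger disparity = closer surface; depth ~ baseline * focal_length / disparity.
disp_vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("disparity.png", disp_vis)
```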
• 65. Fundus stereo imaging #2
2015: All-in-focus imaging technique used to improve 3D retinal fundus image reconstruction
https://doi.org/10.1145/2695664.2695845 | https://doi.org/10.1117/12.2038435
Danilo Motta, Luciana de Matos, Amanda Caniatto de Souza, Rafael Marcato, Afonso Paiva, Luis Alberto Vieira de Carvalho – R&D, Wavetek; Universidade de São Paulo, São Carlos, SP, Brazil
• 66. Snapshot stereo fundus systems
2017: Design of optical system for binocular fundus camera
https://doi.org/10.1080/24699322.2017.1379143
Jun Wu, Shiliang Lou, Zhitao Xiao, Lei Geng, Fang Zhang, Wen Wang & Mengjia Liu
A non-mydriatic optical system for a binocular fundus camera has been designed in this paper. It can capture two images of the same fundus region from different angles at the same time, and can be used to achieve three-dimensional reconstruction of the fundus. According to the requirements for a medical light source, a sodium lamp with a wavelength of 589 nm is selected as the light source, with a spectral range of 560-610 nm. The magnification of the imaging system is 1.07 and the cut-off frequency at the object is 96.3 lp/mm, i.e. the system can resolve structures of 5.2 μm (one half-period at the cut-off frequency: 1/(2 × 96.3 lp/mm) ≈ 5.2 μm). To make operation and adjustment more convenient, the size of the optical system is set to 480 mm x 100 mm x 200 mm. (Diagram of imaging system.)
• 67. 3D fundus with aplanats
2018: 3D Image Reconstruction of Retina using Aplanats
http://raiith.iith.ac.in/id/eprint/4109
Sankrandan Loke and Soumya Jana, Master's thesis, Indian Institute of Technology Hyderabad
The ray-tracing program is written in Python, with assistance from the MATLAB toolbox Optometrika (3D eye model; 3D plot of eye, aplanat and its sensor).
A very high resolution 3D retina is constructed using the MeshLab software. This is considered the digital version of a painted retina to be imaged. A hemi-spheroidal, high-density point cloud is created and normals are calculated at each point. Then the "screened Poisson surface reconstruction" filter is applied to create a mesh over the point cloud. The resultant mesh is cleaned and the face normals and vertex normals are normalized.
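A minimal sketch of the point-cloud-to-mesh step described above, here done with Open3D's Poisson reconstruction rather than MeshLab. The hemispherical point cloud, its radius and the reconstruction depth are synthetic stand-ins, not values from the thesis.

```python
# Minimal sketch: screened Poisson surface reconstruction of a hemi-spheroidal point cloud.
import numpy as np
import open3d as o3d

rng = np.random.default_rng(0)
n = 50_000
theta = np.arccos(rng.uniform(0.0, 1.0, n))          # polar angle, upper hemisphere only
phi = rng.uniform(0.0, 2.0 * np.pi, n)
r = 12.0                                             # rough retinal radius in mm (assumption)
pts = np.c_[r * np.sin(theta) * np.cos(phi),
            r * np.sin(theta) * np.sin(phi),
            r * np.cos(theta)]

pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(pts))
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
pcd.orient_normals_consistent_tangent_plane(30)      # analogous to normalizing the normals

mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("retina_mesh.ply", mesh)
```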
• 69. Plenoptic fundus imaging — the idea has been around for some time now
2011: US20140347628A1 Multi-view fundus camera
Inventor / current assignee: ASOCIACION INDUSTRIAL DE OPTICA, COLOR E IMAGEN (AIDO); Universitat de Valencia
2011: US8814362B2 Method for combining a plurality of eye images into a plenoptic multifocal image
Inventor: Steven Roger Verdooner; current assignee: Neurovision Imaging Inc
2011: US8998411B2 Light field camera for fundus photography
Inventors: Steven Roger Verdooner, Alexandre R. Tumlinson, Matthew J. Everett; current assignee: Carl Zeiss Meditec Inc
2013: US9060710B2 System and method for ocular tomography using plenoptic imaging
Inventor: Richard J. Copland; current assignee: AMO Wavefront Sciences LLC
2015: US9955861B2 Construction of an individual eye model using a plenoptic camera
Inventors: Liang Gao, Ivana Tosic; current assignee: Ricoh Co Ltd
US8998411B2: "As described by Ren Ng (founder of Lytro), the 'light field' is a concept that includes both the position and direction of light propagating in space (see for example U.S. Pat. No. 7,936,392)."
DeHoog and Schwiegerling, "Fundus camera systems: a comparative analysis", Appl. Opt. 48, 221-228 (2009), https://doi.org/10.1364/AO.48.000221
• 70. Plenoptic fundus imaging — short intro
Plenoptic imaging of the retina: can it resolve depth in scattering tissues?
Richard Marshall, Iain Styles, Ela Claridge, and Kai Bongs
https://doi.org/10.1364/BIOMED.2014.BM3A.60, PDF
Configurations of two different plenoptic cameras: (a) the traditional plenoptic camera, (b) the focused plenoptic camera.
"Plenoptic imaging has already proven its capabilities to determine depth and give 3D topographic information in free-space models; however, no study has shown how it would perform through scattering media such as the retina. In order to study this, simulations were performed using MCML, a multi-layered Monte Carlo modeling software [Wang et al. 1995]. The parameters characterising the properties of retinal layers and used in the Monte Carlo (MC) simulation have been taken from Styles et al. (2006)."
Simulation of Light Field Fundus Photography, Sha Tong and T. J. Melanson, Stanford course assignment: http://stanford.edu/class/ee367/Winter2017/Tong_Melanson_ee367_win17_report.pdf
Light field images from different viewing points; comparisons between a normal camera (L), light field camera (M) and reference image (R).
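For intuition on what the plenoptic sensor buys you, below is a minimal shift-and-add refocusing sketch: sub-aperture views of the light field are translated in proportion to their (u, v) offset and averaged to synthesize a new focal plane. The light-field array shape, view count and slope values are illustrative assumptions, not taken from any of the cited systems.

```python
# Minimal sketch: synthetic refocusing of a light field by shift-and-add.
import numpy as np

def refocus(light_field: np.ndarray, slope: float) -> np.ndarray:
    """light_field has shape (U, V, H, W): U x V sub-aperture views of H x W pixels.
    `slope` selects the refocus plane (0.0 keeps the original focus)."""
    U, V, H, W = light_field.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Integer-pixel shift proportional to the view's offset from the centre view.
            du = int(round(slope * (u - uc)))
            dv = int(round(slope * (v - vc)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Usage with a random stand-in light field (9 x 9 views, 256 x 256 pixels):
lf = np.random.rand(9, 9, 256, 256)
refocused_near = refocus(lf, slope=1.5)
refocused_far = refocus(lf, slope=-1.5)
```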
• 71. Plenoptic fundus imaging — prototype system #1: Moorfields Eye Hospital and University College London
Retinal fundus imaging with a plenoptic sensor
Brice Thurin; Edward Bloch; Sotiris Nousias; Sebastien Ourselin; Pearse Keane; Christos Bergeles
https://doi.org/10.1117/12.2286448
Optical layout of the plenoptic fundus camera. A white LED illuminates the eye fundus through a polarizer and polarizing beamsplitter. The LED chip is conjugated with the eye pupil and an iris, while the condensing lens L1 is conjugated with the retinal plane. A primary image of the retina is formed by a digital wide-field lens L4. This image is relayed to the plenoptic sensor (Raytrix R8) by L3 and L6 through the polarizing beamsplitter. The polarization helps reduce the corneal reflex.
Plenoptic ophthalmoscopy has been considered for eye examinations [Tumlinson and Everett 2011; Bedard et al. 2014; Lawson and Raskar 2014]. A crude implementation was proposed by Adam et al. (2016): the system built is used as a substitute for a human observer, only a small portion of the field is used for fundus imaging, and it does not exploit the full capabilities of light-field imaging. More recently, a plenoptic sensor has been used to successfully characterize the topography of the healthy and diseased human iris in vivo [Chen et al. 2017].
• 72. Plenoptic fundus imaging — prototype system #2a: Queensland University of Technology; Medical and Healthcare Robotics, Australian Centre for Robotic Vision, Brisbane; Institute of Health and Biomedical Innovation, Brisbane
Glare-free retinal imaging using a portable light field fundus camera
Douglas W. Palmer, Thomas Coppin, Krishan Rana, Donald G. Dansereau, Marwan Suheimat, Michelle Maynard, David A. Atchison, Jonathan Roberts, Ross Crawford, and Anjali Jaiprakash
Biomedical Optics Express Vol. 9, Issue 7, pp. 3178-3192 (2018)
https://doi.org/10.1364/BOE.9.003178
Imaging-path optical diagram of the light field fundus camera. The top row (A, B, C) represents a correctly designed system where the entrance pupil diameter ØLF is smaller than the eye pupil ØP and the region of the sensor under the microlens shows minimal vignetting, where d is the number of pixels under a microlens horizontally and vertically. The bottom row (D, E, F) represents an incorrectly designed system where ØLF is larger than ØP; the resultant micro-image vignetting is shown in (F). (A) and (D) show slices taken approximately through the iris of the eye. (B) and (E) show the arrangements of components and paraxial approximations of the ray paths for a point on-axis (blue) and off-axis (red). The design entrance and exit pupils are the images of the design aperture stop as seen through the objective and relay lenses respectively.
Plenoptoscope – general arrangement. Imaging path in gray, eye fixation path in red, illumination path in yellow. The Lytro Illum light field camera has an internal fixed f/2.0 aperture stop (not shown).
• 73. Plenoptic fundus imaging — prototype system #2b: Queensland University of Technology; Medical and Healthcare Robotics, Australian Centre for Robotic Vision, Brisbane; Institute of Health and Biomedical Innovation, Brisbane
Glare-free retinal imaging using a portable light field fundus camera
Douglas W. Palmer, Thomas Coppin, Krishan Rana, Donald G. Dansereau, Marwan Suheimat, Michelle Maynard, David A. Atchison, Jonathan Roberts, Ross Crawford, and Anjali Jaiprakash
Biomedical Optics Express Vol. 9, Issue 7, pp. 3178-3192 (2018)
https://doi.org/10.1364/BOE.9.003178
Series of images captured using the Retinal Plenoptoscope. Images are shown in sets of two, the top image being a standard (not glare-free) render of the light field and the bottom image being a gray-scale relative depth map. Each depth map has an associated scale that relates gray shade to depth. Note that the leftmost set is of a model eye, the second leftmost set is of a myopic eye (-5.75 D), and the two rightmost sets are of emmetropic eyes.
An image of a human retina captured using the Retinal Plenoptoscope with an associated epipolar image. Annotations indicate areas of interest, where (A) and (C) correspond to glare and (B) corresponds to the optic disk.
Series of images created using various light field rendering techniques. Images are shown in sets of three, the left image being the central view from the light field, the middle image being a standard render with no glare masking, and the right image being a glare-free render.
• 74. Plenoptic iris imaging — prototype system: Mechanical Engineering; Department of Ophthalmology and Visual Sciences; Department of Human Genetics | University of Michigan
Human iris three-dimensional imaging at micron resolution by a micro-plenoptic camera
Hao Chen, Maria A. Woodward, David T. Burke, V. Swetha E. Jeganathan, Hakan Demirci, and Volker Sick
Biomedical Optics Express Vol. 8, Issue 10, pp. 4514-4522 (2017)
https://doi.org/10.1364/BOE.8.004514 | ResearchGate
A micro-plenoptic system (Raytrix R29) was designed to capture the three-dimensional (3D) topography of the anterior iris surface by simple single-shot imaging. Within a depth of field of 2.4 mm, a depth resolution of 10 µm can be achieved with accuracy (systematic errors) and precision (random errors) below 20%.
• 76. Combined fundus and OCT imaging
2007: Simultaneous Fundus Imaging and Optical Coherence Tomography of the Mouse Retina
http://doi.org/10.1167/iovs.06-0732
Omer P. Kocaoglu; Stephen R. Uhlhorn; Eleut Hernandez; Roger A. Juarez; Russell Will; Jean-Marie Parel; Fabrice Manns
To develop a retinal imaging system suitable for routine examination or screening of mouse models and to demonstrate the feasibility of simultaneously acquiring fundus and optical coherence tomography (OCT) images. The mouse was held in a cylindrical holder made from 30-mL syringes. The position of the mouse was adjusted to align the optical axis of the mouse eye with the axis of the delivery system using 6-μm screws.
Left: general optical design of the imaging system; right: the mouse fundus and OCT imaging system, including fundus imaging with a digital camera attached to the photographic port of the slit lamp, the OCT beam delivery system, the six-axis mouse positioner, and the interferometer.
• 77. Fundus camera guided photoacoustic ophthalmoscopy
2013: Fundus Camera Guided Photoacoustic Ophthalmoscopy
https://doi.org/10.3109/02713683.2013.815219
A schematic of the imaging system, designed and optimized for rat eyes, is shown on the right.
A 532-nm pulsed laser (Nd:YAG laser, SPOT-10-100, Elforlight Ltd, UK; output wavelength 1064 nm; pulse duration: 2 ns; BBO crystal for second-harmonic frequency generation; CasTech, San Jose, CA) was used as the illumination light source for PAOM. The PAOM laser (green path) was scanned by an x-y galvanometer (QS-7, Nutfield Technology) and delivered to the posterior segment of the eye after passing through a relay lens L5 (f = 150 mm) and an objective lens OBJ1 (f = 30 mm, VIS-NIR AR coated). The final laser pulse energy on the cornea is 60 nJ, which is considered eye-safe.
The induced PA waves were detected by a custom-built unfocused needle ultrasonic transducer (central frequency 35 MHz; bandwidth: 50%; active element size: 0.5 x 0.5 mm2). The ultrasonic transducer was gently attached to the eyelid (close to the canthus), coupled by a thin layer of medical-grade ultrasound gel.
• 78. Infrared retinoscopy
2014: Infrared Retinoscopy
http://dx.doi.org/10.3390/photonics1040303
Retinoscopy could be a more effective and versatile clinical tool for observing a wide range of ocular conditions if modifications were made to overcome its inherent difficulties. In this paper, a laboratory infrared retinoscope prototype was constructed to capture digital images of the pupil reflex for various types of eye conditions. The captured low-contrast reflex images, degraded by intraocular scattering, were significantly improved for visualization with a simple image processing procedure. Detection of ocular aberrations was demonstrated, and computational models using patients' wavefront data were built to simulate the measurement for comparison. The simulation results suggest that retinal stray light, which is strongly linked to intraocular scattering, extends the detection range of illuminating eccentricity in retinoscopy and makes it more likely that ocular aberrations will be observed.
• 80. ISO 15004-2.2 — standard for safe retinal irradiance with ophthalmic instruments
Used e.g. by Kölbl et al. (2015) and Sheng Chiong Hong (2015); for a discussion of the limits, see Sliney et al. (2005).
Wang et al. (2018), https://doi.org/10.1038/s41598-018-27112-x: "According to the ISO 15007-2:2007 (Petteri: incorrect standard reference) standard, a maximum of 10 J/cm2 weighted irradiance is allowed on the retina without photochemical hazard concern."
Kim, Delori, Mukai (2012): Smartphone Photography Safety
https://www.aaojournal.org/article/S0161-6420(12)00410-1/pdf
The light safety limits for ophthalmic instruments set by the International Organization for Standardization (ISO 15004-2.2) recommend that spectral irradiance (W/cm2/nm) on the retina be weighted separately for the thermal and photochemical hazard functions, or action spectra. These safety limits are at least 1 order of magnitude below actual retinal damage thresholds [Delori et al. 2007; Sliney et al. 2002]. The radiant power of the smartphone was 8 mW. For thermal hazard, the weighted retinal irradiance for the smartphone was 4.6 mW/cm2, which is 150 times below the thermal limit (706 mW/cm2). For photochemical hazard, the weighted retinal radiant exposure was 41 mJ/cm2 (exposure duration of 1 minute), which is 240 times below the photochemical limit (10 J/cm2). Since the light safety standards account not only for the total retinal irradiance but also for the spectral distribution, we measured the latter with a spectroradiometer (USB4000, Ocean Optics, Dunedin, FL). The radiation was limited to the 400–700 nm wavelength interval, with about 70% of that light emitted in the blue and green parts of the spectrum (wavelengths below 600 nm).
We then compared the light levels produced during smartphone fundoscopy with those produced by standard indirect ophthalmoscopes. The retinal irradiance produced by a Keeler Vantage Plus LED (Keeler Instruments Inc., Broomall, PA), measured using identical procedures as described earlier, was 46 mW/cm2, or about 10 times the level observed with the smartphone. This finding corresponds well with retinal irradiances of 8 to 210 mW/cm2 found in other studies for a wide selection of indirect ophthalmoscopes. The spectral distribution of the Keeler indirect ophthalmoscope was similar to that of the smartphone (both have LED sources). The weighted exposures for the Keeler indirect ophthalmoscope were thus 15 and 24 times less than the limits for thermal and photochemical hazards, respectively. The lower light level available for observation using the smartphone, as opposed to the indirect ophthalmoscope, is largely compensated for by the high electronic sensitivity of the camera. In conclusion, retinal exposure from the smartphone was 1 order of magnitude less than that from the indirect ophthalmoscope, and both are within the safety limits for thermal and photochemical hazards as defined by the ISO when tested under conditions simulating routine fundoscopy.
Hazard weighting functions according to the DIN EN ISO 15007-2:2014 standard, A(λ) and R(λ): A(λ) rates the photochemical and R(λ) the thermal hazard for all kinds of light sources. Kölbl et al. (2015)
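A minimal sketch reproducing the safety-margin arithmetic quoted above from Kim, Delori and Mukai (2012): the weighted smartphone exposures are compared against the ISO 15004-2 limits. Only numbers already quoted on this slide are used; the rounding to "roughly 150x" and "roughly 240x" matches the figures in the text.

```python
# Minimal sketch: how far below the ISO 15004-2 limits the smartphone exposures sit.
THERMAL_LIMIT_mW_cm2 = 706.0        # weighted thermal limit on the retina
PHOTOCHEM_LIMIT_J_cm2 = 10.0        # weighted photochemical limit

smartphone_thermal_mW_cm2 = 4.6     # weighted retinal irradiance, smartphone fundoscopy
smartphone_photochem_J_cm2 = 41e-3  # weighted radiant exposure for a 1-minute exposure

print(f"thermal margin:       {THERMAL_LIMIT_mW_cm2 / smartphone_thermal_mW_cm2:.0f}x below limit")
print(f"photochemical margin: {PHOTOCHEM_LIMIT_J_cm2 / smartphone_photochem_J_cm2:.0f}x below limit")
# -> roughly 150x (thermal) and roughly 240x (photochemical), as quoted in the text.
```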
• 81. Example of calculation — retinal irradiance from quasi-monochromatic green (565 nm) pars-planar illumination
2018: Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide-field photography
https://doi.org/10.1038/s41598-018-27112-x
The thickness of the sclera is ~0.5 mm (Olsen et al. 1998). The transmission of the sclera at visible wavelengths is 10–30% (Vogel et al. 1991); to be conservative, 30% was used for the calculation. For the proof-of-concept experiment, the weighted power delivered to the sclera was calculated to be 0.5 mW and the area of the arc-shaped light was 13 mm2. For the worst-case estimation, we assumed all illumination light directly exposes the retinal area behind the illuminated sclera. The maximum allowed exposure time is therefore the 10 J/cm2 photochemical limit divided by the weighted retinal irradiance (worked through in the sketch below). If the illumination light accidentally fell into the pupil, the illuminated area on the retina was estimated to be >9 mm2; thus the maximum allowed exposure time through the pupil is >30 minutes. For thermal hazard, the maximum weighted power intensity allowed on the sclera without thermal hazard concern is 700 mW/cm2. The calculated weighted power intensity was 230 mW/cm2, which is more than three times lower than the maximum limit.
Kölbl et al. (2015): a round-shaped white LED is used as the light source. By integrating the light source into a speculum, the LED is pressed firmly against the sclera and the ocular space is illuminated trans-sclerally. As a result, an indirect, uniform illumination of the complete intraocular space is achieved.
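A minimal sketch of the maximum-exposure-time estimate, using only the values quoted above from Wang et al. (2018): t_max = (10 J/cm2) / (weighted irradiance). The helper function and the resulting numbers are my own arithmetic on those quoted values, not figures taken from the paper; note that the through-pupil case comes out at about 30 minutes, consistent with the ">30 minutes" stated in the text.

```python
# Minimal sketch: worst-case photochemical exposure-time estimate from quoted values.
PHOTOCHEM_LIMIT_J_cm2 = 10.0

def max_exposure_time_s(weighted_power_mW: float, area_mm2: float,
                        transmission: float = 1.0) -> float:
    """Exposure time (s) before reaching the photochemical limit, for the given
    weighted power spread over the given retinal area."""
    irradiance_W_cm2 = (weighted_power_mW * 1e-3 * transmission) / (area_mm2 * 1e-2)
    return PHOTOCHEM_LIMIT_J_cm2 / irradiance_W_cm2

# Trans-scleral path: 0.5 mW weighted power, 13 mm^2 illuminated area, ~30% scleral transmission.
t_sclera = max_exposure_time_s(0.5, 13.0, transmission=0.30)   # ~8,700 s, i.e. over 2 hours
# Light accidentally entering the pupil: >= 9 mm^2 retinal area, no scleral attenuation.
t_pupil = max_exposure_time_s(0.5, 9.0)                        # ~1,800 s, i.e. ~30 minutes
print(f"trans-scleral: {t_sclera/60:.0f} min, through pupil: {t_pupil/60:.0f} min")
```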
• 82. Example of the use of a supercontinuum light source
We assessed the spectral sensitivity of the pupillary light reflex in mice using a high-power supercontinuum white light (SCWL) source in a dual-wavelength configuration. This novel approach was compared to data collected from a more traditional setup using a Xenon arc lamp fitted with monochromatic interference filters.
2018: Use of a supercontinuum white light in evaluating the spectral sensitivity of the pupil light reflex
Catherine Chin; Lasse Leick; Adrian Podoleanu; Gurprit S. Lal
Univ. of Kent (United Kingdom); NKT Photonics A/S (Denmark)
https://doi.org/10.1117/12.2286064
Light was generated by the NKT Photonics SuperK Extreme EXR and filtered through the Extended UV (480 nm) and SuperK Select (560 nm) modules.
The use of a SCWL is a significant leap forward from the Xenon arc light traditionally used in recording pupillary light responses. The SCWL gives the experimenter much more control over the light stimulus, through wavelength, intensity and, most importantly, a dual-light configuration. Together, this will allow more complex lighting protocols to be developed that can further assist in unraveling the complex coding of light that gives rise to the pupil light reflex and other photically driven physiological responses.
• 84. Fundus video processing
2014: Multi-frame Super-resolution with Quality Self-assessment for Retinal Fundus Videos
https://doi.org/10.1117/12.2036970
Thomas Köhler, Alexander Brost, Katja Mogalle, Qianyi Zhang, Christiane Köhler, Georg Michelson, Joachim Hornegger, Ralf P. Tornow
In order to compensate for heterogeneous illumination on the fundus, we integrate retrospective illumination correction for photometric registration into the underlying imaging model. Our method utilizes quality self-assessment to provide objective quality scores for reconstructed images as well as to select regularization parameters automatically. In our evaluation on real data acquired from six human subjects with a low-cost video camera, the proposed method achieved considerable enhancement of low-resolution frames and improved noise and sharpness characteristics by 74%.
2014: Blood vessel segmentation in video sequences from the human retina
https://doi.org/10.1109/IST.2014.6958459
J. Odstrcilik; R. Kolar; J. Jan; R. P. Tornow; A. Budai
This paper deals with retinal blood vessel segmentation in fundus video sequences acquired with an experimental fundus video camera. The quality of the acquired video sequences is relatively low and fluctuates across particular frames. Especially due to the low resolution, poor signal-to-noise ratio, and varying illumination conditions within the frames, application of standard image processing methods can be difficult in such experimental fundus images.
2014: Geometry-Based Optic Disk Tracking in Retinal Fundus Videos
https://doi.org/10.1007/978-3-642-54111-7_26
Anja Kürten, Thomas Köhler, Attila Budai, Ralf-Peter Tornow, Georg Michelson, Joachim Hornegger
Fundus video cameras enable the acquisition of image sequences to analyze fast temporal changes on the human retina in a non-invasive manner. In this work, we propose a tracking-by-detection scheme for the optic disk to capture eye motion on-line during examination. Our approach exploits the elliptical shape of the optic disk.
2016: Registration of retinal sequences from new video-ophthalmoscopic camera
https://doi.org/10.1186/s12938-016-0191-0
Radim Kolar, Ralf P. Tornow, Jan Odstrcilik and Ivana Liberdova
Analysis of fast temporal changes on retinas has become an important part of diagnostic video-ophthalmology. It enables investigation of the hemodynamic processes in retinal tissue, e.g. blood-vessel diameter changes as a result of blood-pressure variation, spontaneous venous pulsation influenced by the intracranial-intraocular pressure difference, blood-volume changes as a result of changes in light reflection from retinal tissue, and blood flow using laser speckle contrast imaging. For such applications, image registration of the recorded sequence must be performed.
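As a minimal sketch of the registration step these video pipelines depend on (not the authors' actual methods), the snippet below aligns low-SNR video frames to the first frame by subpixel phase correlation with scikit-image, assuming a hypothetical `frames` array of shape (T, H, W).

```python
# Minimal sketch: translation-only registration of fundus video frames by phase correlation.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def register_to_reference(frames: np.ndarray, upsample: int = 10) -> np.ndarray:
    """Align every frame to the first one and return the registered stack."""
    reference = frames[0]
    registered = [reference]
    for frame in frames[1:]:
        # Subpixel shift estimate between the reference and the current frame.
        offset, _, _ = phase_cross_correlation(reference, frame, upsample_factor=upsample)
        registered.append(nd_shift(frame, offset, order=1, mode="nearest"))
    return np.stack(registered)

# Averaging the registered stack suppresses noise before e.g. vessel segmentation:
# mean_frame = register_to_reference(frames).mean(axis=0)
```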
• 85. Multiframe registration
July 2018: Fundus photography with subpixel registration of correlated laser speckle images
https://doi.org/10.7567/JJAP.57.09SB01
Jie-En Li and Chung-Hao Tien
Schematic diagram of the optics of fundus photography. LS, light source; CL, collector lens; AD, aperture diaphragm; BS, beam splitter; FD, field diaphragm; Ln, lenses. In our experiment, the focal lengths of the lenses are 30 and 100 mm (f1 = 300 mm and f2 = 100 mm).
Images of rabbit retina with (a) incoherent illumination, (b) coherent illumination, and (c) the LSCI image. Vessels enclosed by the red frame are barely distinguishable in images obtained using a conventional fundus system, but are enhanced when imaged with the help of LSCI.
Images obtained by laser speckle contrast imaging (LSCI) (a) without and (b) with the image registration process. (c) Speckle contrast of (a) and (b) along the red lines.
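For reference, a minimal sketch of the spatial speckle-contrast computation underlying LSCI: K = sigma/mean over a small sliding window of the raw speckle image. The 7x7 window size and the input image are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch: spatial speckle contrast K = sigma / mean in a sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw: np.ndarray, window: int = 7) -> np.ndarray:
    raw = raw.astype(np.float64)
    mean = uniform_filter(raw, size=window)
    mean_sq = uniform_filter(raw * raw, size=window)
    variance = np.clip(mean_sq - mean * mean, 0.0, None)
    return np.sqrt(variance) / (mean + 1e-12)

# Low contrast (K -> 0) marks fast-moving scatterers (blood flow in vessels);
# high contrast (K -> 1) marks static tissue.
```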
• 86. Future of fundus imaging: hardware is becoming a low-cost commodity, with the value of easy data acquisition increasing
• 87. Google building an ophthalmic imaging team? Verily Life Sciences job postings, 2018
2018: Optical Systems Engineer, Verily Life Sciences
https://www.linkedin.com/jobs/view/674965920
Responsibilities: ● Design state-of-the-art optics-based devices. ● Work closely with an interdisciplinary team to integrate optical designs into prototypes and the product development path.
Minimum qualifications: ● PhD in Optical Engineering/Optics/Physics/EE, or related technical field, or equivalent practical experience. ● Knowledge and experience in optical design, optics and imaging systems design. ● Applied research experience in physics/optics/imaging systems.
Preferred qualifications: ● Experience in opto-mechanical design. ● Experience in electronics (PCB schematic capture and layout, soldering, etc.). ● Programming experience in MATLAB/C/C++/Python. ● Experience with microcontrollers. ● Excellent communication and collaboration skills.
2018: Electrical Engineer, Ophthalmic Devices, Verily Life Sciences
https://www.linkedin.com/jobs/view/692175458
Responsibilities: ● Working with cross-functional teams to define electronic systems based on system-level requirements and tradeoffs. ● Designing electronic systems for highly miniaturized electronic devices, especially at the PC board level. ● Identification, selection and qualification of key electronic components for electronic systems in miniaturized medical devices. ● Identification and qualification of key vendors and partners for electronics integration and manufacture. ● Rapid prototyping of electronic systems with in-house teams and with support from vendors, including integration with a variety of electrical and mechanical components.
Minimum qualifications: ● MS degree in Electrical Engineering or related major plus 4 years work experience. ● Solid analog and digital circuit design skill. ● Experience with complex PC board design. ● Experience with firmware development.
2018: Technical Lead, Ophthalmic Devices and Electroactive Polymers, Verily Life Sciences
https://www.linkedin.com/jobs/view/technical-lead-ophthalmic-devices-and-electroactive-polymers-at-verily-life-sciences-731643271/
Responsibilities: ● Leading the development of new ophthalmological devices and diagnostic instrumentation. ● Early-stage hardware and systems development.
Minimum qualifications: ● Experience in electroactive polymer applications in optical systems. ● Experience with embedded systems hardware/software development. ● Experience with different stages of product development: proof-of-concept, prototyping, EV and DV builds.
Preferred qualifications: ● Experience with opto-mechanical systems and components; experience with ophthalmic devices. ● Background in both hardware and software; programming experience with C++ and Python. ● Experience in optics and electronics with a focus on optical/spectral imaging/sensing technologies. ● Product development track record in optical applications of electroactive polymers or similar. ● Experience with camera, Image Signal Processor (ISP) or camera software stack; knowledge of signal processing, digital imaging, computer vision, and image/video processing; experience with SoC, camera modules, VR/AR display techniques. ● Experience with (real-time) data processing.
Lily Peng (Google Brain) talking about deep learning for fundus images (diabetic retinopathy) and their collaboration with the Aravind institute in India at APTOS 2018.
• 88. Portable adaptive optics for improving image quality
05 Sep 2018: Adaptive optics reveals fine details of retinal structure. Duke University handheld AOSLO platform assists diagnosis of eye disease and trauma
http://optics.org/news/9/9/5
"To overcome the weight and size restrictions in integrating AOSLO into handheld form (weighing less than 200 g), we used recent advancements in the miniaturization of deformable mirror technology and 2D microelectromechanical systems (MEMS) scanning, together with a novel wavefront sensorless AO algorithm and a custom optical and mechanical design," commented the team in the Optica paper.
The new probe and associated numerical methods could be useful for a variety of applications in ophthalmic imaging, and the Duke team has made its designs and computational algorithms available as open source for other researchers. The system was able to image photoreceptors as close as 1.4 degrees eccentric to the fovea, where photoreceptors have an average spacing of 4.5 microns. Without AO, the closest measurement had previously been 3.9 degrees. Further clinical trials with the instrument will follow, and the researchers plan to incorporate additional imaging modalities into the platform that could prove useful for detecting other diseases.
2018: Handheld adaptive optics scanning laser ophthalmoscope
https://doi.org/10.1364/OPTICA.5.001027
http://people.duke.edu/~sf59/HAOSLO.htm
Theodore DuBose, Derek Nankivil, Francesco LaRocca, Gar Waterman, Kristen Hagan, James Polans, Brenton Keller, Du Tran-Viet, Lejla Vajzovic, Anthony N. Kuo, Cynthia A. Toth, Joseph A. Izatt, and Sina Farsiu
• 89. Novel components for imaging
The use of MEMS technology has already allowed miniaturization of many devices probing visual function. What more?
• 90. Diffractive optics for fundus imaging?
February 2018: Broadband imaging with one planar diffractive lens
https://doi.org/10.1080/24699322.2017.1379143
Nabil Mohammad, Monjurul Meem, Bing Shen, Peng Wang & Rajesh Menon | Department of Electrical and Computer Engineering, MACOM Technology Solutions, Department of Medical Engineering, California Institute of Technology
Here, we design, fabricate and characterize broadband diffractive optics as planar lenses for imaging. Broadband operation is achieved by carefully optimizing the phase transmission function for each wavelength to achieve the desired intensity distribution at that wavelength in the focal plane. Our approach is able to maintain image quality comparable to that achievable with more complex systems of lenses.
The achromatic lenses were patterned on a photoresist film atop a glass wafer using grayscale laser patterning with a Heidelberg Instruments MicroPG 101 tool. The exposure dose was varied as a function of position in order to achieve the multiple height levels dictated by the design.
We show here that planar diffractive lenses, when designed properly and fabricated carefully, are sufficient for broadband imaging. By extending the fabrication process to industry-relevant lithographic scales and large-area replication via nanoimprinting (Guo 2004), it is possible to envision planar lenses enabling imaging with very thin form factors, low weights and low costs. Therefore, we believe that our approach will lead to considerably simpler, thinner and cheaper imaging systems.
(a) Schematic of a flat-lens design. The structure is comprised of concentric rings of width Wmin and varying heights. (b) Photograph of one fabricated lens. Optical micrographs of (c) NA = 0.05 and (d) NA = 0.18 lenses. Focal length is 1 mm. Measured full-width at half-maximum (FWHM) of the focal spot as a function of wavelength for (e) NA = 0.05 and (f) NA = 0.18 lenses. Measured focal spots as a function of wavelength for (g) NA = 0.05 and (h) NA = 0.18 lenses.
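For a feel of the zone geometry behind planar diffractive lenses, below is a minimal sketch of the radii of a classic Fresnel zone plate for a chosen design wavelength and focal length. The multilevel, broadband-optimized lens in the paper is considerably more elaborate; the wavelength, focal length and zone count below are illustrative, not values taken from it.

```python
# Minimal sketch: Fresnel zone-plate zone radii and the resulting numerical aperture.
import numpy as np

def zone_radii(wavelength_m: float, focal_length_m: float, n_zones: int) -> np.ndarray:
    n = np.arange(1, n_zones + 1)
    # r_n = sqrt(n*lambda*f + (n*lambda/2)^2); the second term matters at high NA.
    return np.sqrt(n * wavelength_m * focal_length_m + (n * wavelength_m / 2.0) ** 2)

radii = zone_radii(wavelength_m=550e-9, focal_length_m=1e-3, n_zones=100)
numerical_aperture = radii[-1] / np.hypot(radii[-1], 1e-3)
print(f"outer radius {radii[-1]*1e6:.1f} um, NA ~ {numerical_aperture:.2f}")
```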
• 91. Diffractive optics in intraocular lenses (IOLs) and satellite-based remote sensing
July 2018: Fractal-structured multifocal intraocular lens
https://doi.org/10.1371/journal.pone.0200197
Laura Remón, Salvador García-Delpech, Patricia Udaondo, Vicente Ferrando, Juan A. Monsoriu, Walter D. Furlan | Departamento de Óptica y Optometría y Ciencias de la Visión, Universitat de València, Burjassot, Spain
In this work, we present a new concept of IOL design inspired by the demonstrated properties of reduced chromatic aberration and extended depth of focus of fractal zone plates. A detailed description of a proof-of-concept IOL is provided. The result was numerically characterized and fabricated by lathe turning. The prototype was tested in vitro using a dedicated optical system and software. The theoretical point spread function along the optical axis, computed for several wavelengths, showed that for each wavelength the IOL produces two main foci surrounded by numerous secondary foci that partially overlap each other for different wavelengths. The result is that both the near focus and the far focus have an extended depth of focus under polychromatic illumination.
May 2018: Modification of Fresnel zone light field spectral imaging system for higher resolution
https://doi.org/10.1117/12.2303898
Carlos Diaz; Anthony L. Franz; Jack A. Shepherd — Air Force Institute of Technology (United States)
Recent interest in building an imaging system using diffractive optics that can fit on a CubeSat (10 cm x 10 cm x 30 cm) and can correct the severe chromatic aberrations inherent to diffractive optics has led to the development of the Fresnel zone light field spectral imaging system (FZLFSI). The FZLFSI is a system that integrates an axial dispersion binary diffractive optic with a light field (plenoptic) camera design, enabling snapshot spectral imaging capability.
• 92. Metamaterial optics for fundus imaging?
July 2017: Metamaterials and imaging
https://dx.doi.org/10.1186/s40580-015-0053-7
Minkyung Kim and Junsuk Rho
Here, we review metamaterial-based lenses, which offer new types of imaging components and functions. Perfect lenses, superlenses, hyperlenses, metalenses, flat lenses based on metasurfaces, and non-optical lenses including acoustic hyperlenses are described. Not all of them offer sub-diffraction imaging, but they provide new imaging mechanisms by controlling and manipulating the path of light. The underlying physics, design principles, recent advances, major limitations and challenges for practical applications are discussed in this review. Diffraction-free acoustic imaging using metamaterials allows more efficient underwater sonar sensing, medical ultrasound imaging, and non-destructive materials testing. Therefore, with the development of nanofabrication and nanomanufacturing methods and the integration of new creative ideas, overcoming the limitations mentioned in this review will be a continuous effort towards making metamaterial-based imaging a next generation of imaging technology replacing current optical microscopy, which can thus be called nanoscopy.
2018: Dynamically tunable and active hyperbolic metamaterials
https://doi.org/10.1364/AOP.10.000354
Joseph S. T. Smalley, Felipe Vallini, Xiang Zhang, and Yeshaiahu Fainman
• 93. Metasurface optics for fundus imaging?
July 2017: Optics with Metasurfaces: Beyond Refractive and Diffractive Optics
https://doi.org/10.1364/OFT.2017.OW1B.5
Mohammadreza Khorasaninejad, Harvard University
Flat optics based on metasurfaces has the potential to replace or complement conventional refractive/diffractive components. In this talk, we give an overview of our work on dielectric metasurfaces, which has led to high-performance components in the visible spectrum.
https://doi.org/10.1364/OPTICA.4.000139
Nano-optic endoscope sees deep into tissue at high resolution. Now, experts in endoscopic imaging at Massachusetts General Hospital (MGH) and pioneers of flat metalens technology at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have teamed up to develop a new class of endoscopic imaging catheters, termed nano-optic endoscopes. https://doi.org/10.1038/s41566-018-0224-2
Department of Electrical and Computer Engineering, National University of Singapore, Singapore – Yao-Wei Huang & Cheng-Wei Qiu