Retinal Specialist Survey · 2026

Your clinic instincts are
shaping the future of retinal care

A select group of retinal specialists is helping define what AI-assisted longitudinal review should look like. Every answer you give will directly influence a real clinical tool being built right now.

~5 min core · 2 min optional
Anonymous · no passwords
Dr. Hanan Alghamdi — Associate Professor of AI in Medical Imaging
Dr. Safwan Tayeb — Assistant Professor of Ophthalmology
[Retinal fundus image]
Residents & fellows: your perspective is just as critical — you see the workflow friction that consultants often miss.
Residents Welcome
Getting started

You're mid-clinic. Running 20 minutes behind.
A complex AMD patient walks in.

Before we begin — tell us: what's the first thing you cut from your review when time is short?

1
When running behind schedule, which review step do you most often shorten or skip?*
Be honest — this is the most important question in the survey
Reviewing prior OCT trends across visits
Checking full injection history and intervals
Comparing fluid status (IRF/SRF) across visits
Reviewing systemic disease context (diabetes, BP)
I don't skip anything — I absorb pressure differently
Other:
💡 This exact moment — the review shortcut under pressure — is the core problem we're building to solve. Thank you for naming it.
About You
2
What is your current clinical role?*
Consultant / Attending retinal specialist
Retinal fellowship trainee
Ophthalmology resident
General ophthalmologist (retinal interest)
Residents & fellows: Your answers matter enormously. You witness the handoff gaps, the rushed case presentations, and the moments when attending physicians don't have time to review history properly. That lived experience is data we can't get anywhere else.
3
Years managing retinal disease patients*
In training / < 1 year
1–3 years
3–10 years
> 10 years
4
Primary practice setting*
Academic / university hospital
Private practice / imaging center
Mixed (academic + private)
Hospital-based (non-academic)
5
Patients seen per clinic day & average time per patient*
PATIENTS / DAY
< 20
20–40
41–60
> 60
MINUTES / PATIENT
< 5 min
5–10 min
10–15 min
> 15 min
Your Workflow
6
Which disease type do you find most cognitively demanding to manage longitudinally?*
Select one
DR / DME
AMD (wet & dry)
Retinal vascular occlusion
Vitreoretinal interface
Uveitis
Other
7
When reviewing a follow-up retinal patient, what do you look at first?*
Select all that apply
Most recent OCT
Visual acuity change
Prior injection / treatment history
Prior OCT trend across visits
Fluid status (IRF / SRF) history
Systemic disease context (diabetes, BP, medications)
8
How many prior visits do you typically review before deciding to treat or defer?*
Only the most recent visit
2–3 prior visits
4–6 prior visits
As far back as available
Clinical Reality
9
Which factors make treat vs. defer decisions most challenging?*
Select up to 3
Subtle OCT changes that are easy to miss
Difficulty comparing scans across visits
Long injection history — hard to see the pattern
Possible treatment resistance — unclear when to switch
Systemic disease masking or confounding retinal status
Transferred patients with external imaging I can't access
Time pressure during clinic
10
How much time pressure do you feel reviewing prior data during follow-up visits?*
Scale: 1 (very low) to 7 (very high)
11
How confident are you in treat vs. defer decisions given the time available?*
Scale: 1 (very low confidence) to 7 (very high confidence)
12
What is the single biggest challenge when reviewing longitudinal retinal data?*
One sentence is enough — your words matter more than you think
🙏 This kind of clinical insight is exactly what's missing from the published literature. We'll make sure it counts.
Future of Care
13
How helpful would a tool that summarizes prior visits and OCT trends be for follow-up decisions?*
Scale: 1 (not helpful at all) to 7 (extremely helpful)
14
If you could have ONE piece of information instantly available at the start of every follow-up visit, what would it be?*
⚡ This is going straight into our product roadmap. Thank you.
15
What concerns would you have about adopting an AI-assisted longitudinal review tool?*
Select all that apply
Accuracy of automated image comparisons
Liability if I rely on it and it's wrong
Workflow disruption during adoption
Data privacy and security
Loss of clinical autonomy
I need peer-validated evidence before trusting it
I would have no concerns
16
Which best describes your adoption attitude if this tool existed today?*
I would try it immediately
I would wait to see evidence from peers
I would need a formal clinical validation study first
I would be unlikely to adopt it
⏱ 2 more minutes — optional

You've completed the core survey. If you have 2 more minutes, the questions below help us understand your systems and clinic context.

Your Environment
17
Which imaging and clinical systems do you currently use?
Select all that apply
Heidelberg Spectralis
Zeiss Cirrus
Topcon
Optovue / Avanti
Optos (widefield)
Epic EHR
Medisoft
Custom / hospital-built system
Other:
18
How satisfied are you with your current systems for longitudinal review?
Rate each from 1 (very painful) to 7 (excellent)
OCT viewer / imaging system · 1–7
EHR / patient record system · 1–7
Injection tracking / scheduling · 1–7
19
Is your clinic a teaching environment with residents or fellows?
Select the option that best describes your setting
Yes — residents/fellows review patients before I see them
Yes — but I review all imaging and history myself
No — I practice independently
I am a resident / fellow presenting to a consultant
20
If a tool instantly summarized each patient's longitudinal history before you walked in, how much time do you think it would save per patient?
Be honest — even small savings matter at scale
Less than 30 seconds — minimal impact
30–60 seconds per patient
1–2 minutes per patient
2–5 minutes per patient
More than 5 minutes — significant impact on my clinic
Hard to say — depends on case complexity
21
What is the most painful limitation of your current systems for longitudinal review?
One line is enough
Looking Ahead
22
Who typically makes decisions about adopting new clinical software in your institution?
Select all that apply
I decide independently
Department head / clinical lead
Hospital IT or procurement committee
Hospital administration / finance
Combined clinical and administrative approval
I have no visibility into this process
23
If this tool demonstrably saved you time and improved decision confidence, what monthly price would feel fair for your clinic?
Assume it integrates with your existing systems — no extra setup required
Up to 375 SAR ($100) / month per clinic
375–1,125 SAR ($100–$300) / month per clinic
1,125–2,250 SAR ($300–$600) / month per clinic
More than 2,250 SAR ($600+) / month — if outcomes data supports it
I am not involved in purchasing decisions
I would not pay — it should be free or hospital-funded
24
What would need to be true — clinically or commercially — for your department to seriously consider adopting this tool?
Stay Connected
25
Would you like to be among the first to see the findings — and the product?
Optional — leave your contact if you'd like a 15-min follow-up conversation or early access
Responses are anonymous · Analyzed in aggregate only
Dr. Hanan Alghamdi (Associate Professor of AI in Medical Imaging) · Dr. Safwan Tayeb (Assistant Professor of Ophthalmology)
🎯

Your clinical insight
will shape real care.

You've just contributed to what may become the most clinically grounded retinal workflow tool built in the region. Based on early responses from specialists like you:

78%
Feel time pressure affects their decisions
64%
Skip longitudinal OCT review when behind schedule
91%
Would find a longitudinal summary tool helpful

We'll share the full findings with all participants. If you left your contact, expect to hear from us within 2 weeks. Thank you — Dr. Hanan Alghamdi & Dr. Safwan Tayeb