A report based on the keynote speech delivered
– By –
Dr. Anjali Agrawal, Head, Teleradiology Solutions, Delhi Operations
at the Artificial Intelligence in Radiology 2018 Symposium, November 10, 2018
Organized by Telerad Tech and Image Core Lab
Dr. Anjali Agrawal spoke to the audience about how AI is becoming relevant to emergency care and emergency radiology.
Deep learning and the human brain
Speaking broadly on artificial intelligence (AI), Dr. Anjali said that AI is the more general term and includes machine learning (ML) and deep learning (DL). Machine learning, a specific type of AI, gives computers the ability to learn without being explicitly programmed. Deep learning, a subset of machine learning, mimics the configuration of the human brain, where multiple layers of artificial neurons can crunch vast amounts of data and draw conclusions. DL has particular relevance for radiology and healthcare. The availability of large annotated image datasets and increased computational power has made AI a reality rather than an illusion, and it is now moving from the experimental to the implementation phase.
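As a purely illustrative aside, the toy snippet below shows what these "multiple layers of neurons" look like in code: three stacked layers, each transforming the output of the previous one before a final set of class probabilities emerges. The layer sizes, random weights, and input are placeholders rather than anything presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity applied after each hidden layer.
    return np.maximum(0.0, x)

# Three stacked layers: 64 input features -> 32 -> 16 -> 2 output classes.
weights = [rng.standard_normal((64, 32)) * 0.1,
           rng.standard_normal((32, 16)) * 0.1,
           rng.standard_normal((16, 2)) * 0.1]
biases = [np.zeros(32), np.zeros(16), np.zeros(2)]

def forward(x):
    # Each hidden layer is a weighted sum followed by a nonlinearity,
    # loosely analogous to layered neurons passing signals onward.
    for w, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ w + b)
    logits = x @ weights[-1] + biases[-1]
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()  # class probabilities

print(forward(rng.standard_normal(64)))
```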
Current state of Emergency and Trauma Care
Dr. Anjali drew attention to the current state of emergency and trauma care in India. Trauma registry data show that a trauma-related death occurs in India every 1.9 minutes. Mortality from serious injuries is six times higher in a developing country such as India than in a developed country. A WHO survey revealed more deaths due to lack of timely care than due to AIDS, malaria, and TB put together.
Dr. Anjali said that more than 80% of Indians do not get care within the golden hour, and she highlighted the challenges in emergency care and radiology. There is tremendous pressure on the limited resources at one's disposal, and the process from the scene of the accident to the emergency room is disorganized. She also drew attention to education and training, which remain quite heterogeneous.
AI will transform ER services
As per Dr. Anjali, AI could be very useful for triage in the emergency room. Studies have shown that Emergency Severity Index (ESI) assignment by doctors and nurses is correct only about 60% of the time, and that clinicians under-triage almost 27% of patients. Nearly half of those under-triaged patients are placed in the mid-acuity group, level 3 – a typical human tendency to play it safe.
AI and deep learning could help by analyzing complex data from various sources – the patient's age and sex, presenting history and complaints, vital signs, mode of arrival (walk-in versus ambulance), past medical history, and so on – and thereby transform emergency department operations. Using such algorithms, one could triage patients accurately, so that critically ill patients get the attention of the emergency medicine physician and are managed appropriately. These algorithms could also help allocate resources appropriately, minimizing the mismatch between staff and patient case load and improving patient outcomes. AI could expedite the interpretation of emergent imaging studies, predict adverse events, and help build an individual-specific follow-up plan.
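As an illustrative sketch only – not Dr. Anjali's system or any production triage tool – the snippet below shows how structured intake variables of the kind listed above might feed a machine-learning triage model. The features, the synthetic "ESI-like" label, and the model choice are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Illustrative intake features: age, sex (0/1), heart rate, systolic BP,
# arrived by ambulance (0/1), known chronic disease (0/1).
X = np.column_stack([
    rng.integers(1, 95, n),
    rng.integers(0, 2, n),
    rng.normal(90, 20, n),
    rng.normal(125, 25, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
])

# Synthetic label standing in for an ESI level 1-5; its relationship to the
# features is arbitrary and exists only so the example runs end to end.
y = np.clip(np.round(3 + 0.01 * (X[:, 3] - 125) - 0.01 * (X[:, 2] - 90)
                     - X[:, 4] + rng.normal(0, 0.7, n)), 1, 5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 2))
```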
AI will revolutionize the radiology workflow and create smart enterprises
Dr. Anjali maintained that disruptive new-generation healthcare technologies such as AI, robotics, machine learning, and deep learning will revolutionize the radiology workflow in many ways, leading to massive improvements in the quality, value, and depth of radiology's contribution to patient care. She stated that emergency radiology is one of the most well-researched applications.
Dr. Anjali noted that AI could help make an informed decision regarding the need for imaging and the choice of modality based on analysis of the patient's records. AI-enabled algorithms could play a huge role in reducing radiation doses for CT examinations or reducing scan times for MRI, using various enhancement and post-processing techniques. However, the overall decision making and the communication with the referring physician or the patient would still require a human radiologist, aided by AI, for quite some time.
AI can help reduce scan times and doses
Dr. Anjali said that algorithms could enhance very noisy, grainy, and undersampled data, such as MRI acquired in shortened timeframes, and produce high-resolution images – with huge implications in the emergency room, where one tends to shy away from MRI because of time constraints. For example, if an MRI could be done in two-thirds or one-third of the usual time, one would be more comfortable sending a sick patient with a suspected hip fracture for an MRI.
Similarly, these algorithms could be applied to computed tomography (CT) to help reduce the radiation dose – a huge advantage, as the dose of such a scan could become comparable to that of a standard chest x-ray. One would be able to do ultra-low-dose CTs and still get more information from diagnostic-quality images than a radiograph provides.
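Image enhancement of this kind is commonly built on convolutional networks trained to map noisy, undersampled slices to cleaner ones. The PyTorch sketch below is a minimal, hypothetical residual denoiser of that sort – not the speaker's or any vendor's actual method – and uses random tensors as stand-ins for image slices.

```python
import torch
import torch.nn as nn

class Denoiser(nn.Module):
    """Tiny convolutional network that predicts and removes noise."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Predict the residual noise and subtract it from the input slice.
        return x - self.net(x)

model = Denoiser()
noisy = torch.randn(1, 1, 128, 128)         # stand-in for an undersampled slice
clean_target = torch.randn(1, 1, 128, 128)  # stand-in for a fully sampled slice

loss = nn.functional.mse_loss(model(noisy), clean_target)
loss.backward()  # one illustrative training step
print(float(loss))
```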
AI can ease the workflow of a radiologist in many ways
Showcasing a typical radiologist workflow, Dr. Anjali said that the radiologist logs into the system, reviews his/her worklist, and selects a study to review. The radiologist then reviews related information such as patient history and prescriptions. Once the images are presented, the radiologist, in most cases, adjusts the hanging protocols before performing the interpretation and generating a report. This initial process of arranging studies is time consuming. Citing recent developments in reading protocols, Dr. Anjali said that AI could hang the images interactively, learning each time and catering to individual preferences, thereby saving time.
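As a toy illustration of what "learning individual preferences" might mean in code, the snippet below simply records which layout a radiologist ends up using for each study type and suggests the most frequent one next time. The study types and layout names are made up; a real PACS integration would be far richer.

```python
from collections import Counter, defaultdict

# Per-study-type tally of layouts the radiologist actually used.
preferences = defaultdict(Counter)

def record_choice(study_type, layout):
    preferences[study_type][layout] += 1

def suggest_layout(study_type, default="2x2 axial"):
    counts = preferences[study_type]
    return counts.most_common(1)[0][0] if counts else default

record_choice("CT head", "prior-current side by side")
record_choice("CT head", "prior-current side by side")
record_choice("CT head", "single stack")
print(suggest_layout("CT head"))  # -> prior-current side by side
```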
AI can also help with detection of findings, segmentation, quantification, and reporting, in a manner that is easier for both referring physicians and patients to understand.
Dr. Anjali maintained that AI could help by triaging cases, such that only the positive ones are seen by radiologists for further interpretation. She was of the view that, contrary to a few articles claiming AI would replace the radiologist for particular targeted applications, AI would assist the radiologist, with the radiologist acting as a second reader, or vice versa.
She gave one particular example of an article that looked into automated detection of critical findings – hemorrhage, mass effect, and hydrocephalus – on non-contrast CT examinations of the head using an AI algorithm, which would be helpful in triage. If the algorithm found the non-contrast CT to be negative, the study was passed through a second, stroke-detection algorithm. A positive result at either step was labeled a critical imaging finding; if both were negative, the study was labeled as having "no critical imaging finding". The algorithm had a sensitivity of 62% and a specificity of 96% for acute ischemia, comparable to a radiologist. The sensitivity and specificity were higher for detection of hemorrhage, hydrocephalus, and mass effect, matching the performance of radiologists. The study therefore concluded that AI algorithms have huge potential for screening and detection of critical findings in the emergency setting.
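The routing logic of that two-step cascade can be summarized in a few lines of Python, shown below purely as a sketch; the detector functions are hypothetical placeholders standing in for the study's trained models.

```python
def triage_head_ct(scan, detect_critical, detect_ischemia):
    """Route a non-contrast head CT through the cascaded detectors."""
    # Stage 1: hemorrhage, mass effect, hydrocephalus.
    if detect_critical(scan):
        return "critical imaging finding"
    # Stage 2: scans negative at stage 1 go through the stroke detector.
    if detect_ischemia(scan):
        return "critical imaging finding"
    return "no critical imaging finding"

# Toy placeholders so the sketch runs end to end.
print(triage_head_ct("scan-001",
                     detect_critical=lambda s: False,
                     detect_ischemia=lambda s: False))
```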
Dr. Anjali said that her group had seen similar data and had detected intracranial hemorrhage with very high sensitivity and specificity using a hybrid approach of convolutional neural networks and factorial image analysis. The results were comparable to the existing literature, and more work on quantification and localization of intracranial hemorrhage is under way.
Returning to her earlier point that triage is the low-hanging fruit for AI applications in radiology, Dr. Anjali mentioned that, apart from acute neurologic conditions, these triage tools have been used to classify chest radiographs as normal or abnormal with an accuracy of almost 95%.
She quoted another example of an AI application – wrist fractures. The algorithms were trained on labels provided by senior orthopedic surgeons. When emergency room physicians – not trained orthopedicians or radiologists – used them, their sensitivity improved from 81% to almost 92% and their specificity from 88% to 94%, with a relative reduction in misinterpretation rate of almost 47%. Dr. Anjali explained that the algorithm was able to emulate the diagnostic acumen of the experts by labeling where the fracture was and overlaying a heat map, with a confidence level assigned to the detected fracture.
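As an illustration of how a heat map plus a confidence level might be surfaced to the reader, the snippet below takes a synthetic probability map, finds its peak, and reports a location and confidence. It is not the published wrist-fracture model, only a sketch of the idea.

```python
import numpy as np

rng = np.random.default_rng(1)
heatmap = rng.random((256, 256)) * 0.2   # synthetic per-pixel fracture scores
heatmap[140:160, 90:110] += 0.75         # synthetic "fracture" hot spot

peak = np.unravel_index(np.argmax(heatmap), heatmap.shape)
confidence = float(heatmap[peak])

if confidence > 0.5:
    print(f"Suspected fracture near row {peak[0]}, col {peak[1]} "
          f"(confidence {confidence:.2f})")
else:
    print("No fracture-level activation found")
```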
Adding on, Dr. Anjali said that AI applications would also be useful in detecting non-acute findings in the emergency radiology setting, which may be overlooked because the focus is on critical, life-threatening illnesses. AI could help with measurement of bone density and detection of fatty liver, coronary calcifications, and emphysema – findings that may not be relevant in the acute setting but have implications for the future.
AI can make expertise widely available and scalable
Dr. Anjali then stated that, in her view, AI would become a huge leveler in terms of radiologists' expertise and experience. Good radiology consults would become easily accessible, affordable, and scalable. In a mass casualty incident, the algorithms could be used to quickly distinguish between critical and non-critical cases, and handling massive imaging volumes would also become easier.
Dr. Anjali mentioned one particular study on automated bone age estimation in which the algorithms were extremely accurate and reproducible, with an interpretation time of less than 2 seconds. This is a huge achievement, because every radiologist knows how tedious and time-consuming bone age determination is.
Man with machine synergy
She went on to add that many similarities have been drawn between the fields of medicine and aviation. The pilot, like the doctor, needs to be highly skilled, as both are responsible for human lives. Both the
professions have benefitted tremendously from automation. There is no flight
without a human pilot, and similarly, there would be no healthcare without
human doctors, because the legal responsibility would always be with the
doctors. Dr. Anjali urged doctors not to forget that medicine is an art and that physicians need to practice it as an art to stay relevant. AI will not replace radiologists; rather, according to Dr. Anjali, it is the synergy between man and machine that will help the profession as well as benefit the patients.