Sadeer Al-Kindi, MD, discusses machine learning/AI in cardiovascular risk assessment.
All right, thank you everyone for joining us today. It's a tough act to follow Dr. Gallery's lecture, but I was asked to talk about a simple topic, machine learning and artificial intelligence, and they only gave me 15 minutes to do it, so here we go. I have, unfortunately, no disclosures. It's no surprise that in the past decade there has been an explosive increase in publications and research on AI; it's the buzzword people use these days, machine learning, AI, one of those things that a lot of people talk about without knowing what it means. On the left side of the slide you can see the number of publications on cardiac AI and cardiac machine learning as of last year, and you can see the explosion in the number of publications: as of last year there are more than 3,000 publications on cardiac imaging and cardiac AI. On the right side of the screen you can see that the number of publications varies by specialty and by modality. One of the highest areas of publication was atherosclerosis imaging in CT, the red square there. Across the board, I think atherosclerosis has taken center stage for AI and machine learning, and it's a unique area for analysis using AI and machine learning, because we don't do a good job of predicting risk in individuals. So what is AI? There are three terms we often use: AI, machine learning, and deep learning. What is each? Artificial intelligence, in general, is just the computational toolbox we utilize for the analysis of data. Machine learning is a subset of artificial intelligence that learns automatically and can improve from experience; true to the term "learning," it can get better and better with time.
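To make "improving from experience" concrete, here is a minimal sketch with entirely simulated data (nothing here comes from any study mentioned in this talk): the same model, refit on progressively more training examples, generally scores better on held-out data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated data: one informative feature plus four noise features.
X = rng.normal(size=(2000, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# "Learning from experience": the same model, given more training
# examples, is evaluated on the same held-out test set each time.
scores = []
for n in (20, 200, 1000):
    clf = LogisticRegression().fit(X_train[:n], y_train[:n])
    scores.append(clf.score(X_test, y_test))
    print(f"n={n}: held-out accuracy {scores[-1]:.3f}")
```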
Deep learning is a subset of machine learning that utilizes a specific multi-layer network structure called an artificial neural network, which can handle very large amounts of data. So the applications of deep learning are mostly in imaging and the high-dimensional data we utilize on a daily basis, like CT, MRI, echo, and so forth. Machine learning has been utilized on a lot of clinical data, which may not be super high-dimensional but could include age, sex, race, other demographics, ICD codes, and whatnot. And artificial intelligence in general encompasses both of these terms. So what is the structure? In simple terms it's very difficult to tease out, because there are hundreds of methodologies within each of these subsets of AI, and it's very difficult to pinpoint one specific mechanism; it takes entire careers to learn all of this. But here is one of the basic workflows we utilize in cardiology and cardiac imaging. We have structured data and variables that start at the top: that could include data from proteins, cytokines, blood work, and whatnot; it could include tracings, like EKGs or CardioMEMS; it could include imaging, like echo, CT, MR, and so forth. We do data labeling, and we can feed the information to software that can handle a lot of these data variables. The usual problem is that there is a lot of collinearity in the relationships between the data sets we acquire. For example, the P wave on the EKG will correlate with the size of the left atrium on echo, so how do we incorporate both EKG variables and echo variables into one prediction? We have to do something called dimensionality reduction, which you can see on the left side, and which eliminates some of the collinear or correlated variables.
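As a sketch of that dimensionality-reduction step, here is principal component analysis on simulated data built around the P-wave/left-atrium example above. All numbers and variable names are illustrative assumptions, not clinical values: a latent "LA size" drives two correlated measurements, and PCA collapses most of their shared variance into fewer components.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 500

# Hypothetical collinear measurements: a latent left atrial size drives
# both the echo LA diameter and the EKG P-wave duration.
la_size = rng.normal(40, 5, n)                 # echo LA diameter (simulated, mm)
p_wave = 2.0 * la_size + rng.normal(0, 3, n)   # P-wave duration, correlated with LA
ldl = rng.normal(120, 30, n)                   # an independent variable

X = np.column_stack([la_size, p_wave, ldl])

# PCA finds uncorrelated components; the two collinear columns end up
# represented mostly by a single component.
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)  # (500, 2)
print(pca.explained_variance_ratio_)
```

Two components retain nearly all of the variance of the original three correlated variables, which is the point of the reduction.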
But there are also other methodologies, including clustering, where you can actually tease out specific subgroups or phenotypes of patients, shown on the right side of the screen. You can see that we feed this labeled data in to do something called supervised learning, where we give both the inputs and the outcome to the software, and the software will do either regression, on the left side, or classification, where there is a binary or multi-class category. For example, you may want to categorize patients as admitted versus not admitted with heart failure, MI versus no MI, or typical versus atypical chest pain, and so forth. This is a graph that Dr. Gallery just showed you: the spectrum of atherosclerosis. On the lower part of the pyramid are the healthy and low-risk individuals, and at the top of the pyramid are the high-risk individuals when it comes to atherosclerosis and the risk of myocardial infarction and major adverse cardiovascular events. On the bottom of the slide you can see that AI has multiple applications across all of these risk categories. For example, we can do automated quantification and derive prognostic data from images and biomarkers. We can do direct predictions: there is a type of deep learning where you feed the software an entire CT and tell it that this patient is going to have an MI and this one is not; you have thousands of these data sets, and the machine will learn over time that there are specific patterns on the CT that are predictive of MI risk. And then there is the integration of imaging and clinical data for individual prognostication. Obviously, AI has improved our lives in the radiology space; a lot of clinically available and approved software these days helps us reduce the work and the resources needed to annotate images, and so forth.
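The supervised classification setup described above (features in, binary label out, such as MI versus no MI) can be sketched with fully simulated data. The feature names, coefficients, and outcome here are hypothetical placeholders for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000

# Hypothetical tabular features (all simulated): age, LDL, calcium score.
age = rng.normal(60, 10, n)
ldl = rng.normal(120, 30, n)
cac = rng.exponential(100, n)

# Simulated binary label (e.g., MI vs. no MI) driven by the features.
logit = 0.08 * (age - 60) + 0.02 * (ldl - 120) + 0.008 * cac - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, ldl, cac])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Supervised learning: the model sees both inputs and outcomes,
# then is evaluated on patients it has never seen.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.2f}")
```

Swapping `LogisticRegression` for any other scikit-learn classifier keeps the same fit/score workflow; the regression case uses a continuous target instead of a binary one.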
So one of the envisioned applications could look like this: at the top of the screen you have multiple data sources, including imaging, for example CT-FFR, risk factors, imaging data sets, and plaque characterization. You integrate those into physician decision-making and incorporate clinical data from the patient, including what Dr. Gallery was talking about, for example the patient's expression of their symptoms. You integrate those into physician labels, and then machine learning can take all of this information, analyze and integrate it, give you unique exposure data and unique prognostic data, and tell you that this patient is going to benefit from X and Y and Z. That, I think, is what the future is going to hold. So I'm going to go through a couple of use-case scenarios for deep learning, drawing on some of our own work here. This is one scenario where we used deep learning-enabled automated segmentation for cardiovascular risk prediction. One of the areas that has really gained a lot of momentum over the past decade is the impact of epicardial fat, and of fat distribution throughout the body in general, on cardiovascular risk beyond atherosclerosis. I don't want to focus too much on the plaque, because Dr. Blankstein is going to talk to us about that; I wanted to focus outside the heart, looking outside the coronary arteries. In this example, we take coronary artery calcium score CTs and provide segmentation. We have a fellow who manually segments these hearts to identify where the epicardial fat is, and it takes about 2 to 3 hours to segment one heart. For all of us who read images, we know it takes quite an amount of time to identify where the epicardial fat is.
So we wanted to see if the software could do this in a few seconds. We take these CTs on the left side, identify where the epicardial fat is, feed them through a machine, and then test how well the machine does. On the upper part of the screen is the manual segmentation, meaning a human being, in blue, did the analysis and traced the epicardial fat; in red you can see how the machine did, and the intersection between them. On the right side of the screen you can see that the volumes are almost identical, suggesting to us that work done by a human over an hour can be done by the software in a couple of seconds. The same thing can also be applied to the liver: on the right side of the screen you can see that the liver and the spleen can be automatically segmented, and this is very accurate, almost identical between human and machine. The vision behind doing all these segmentations is the capability of looking outside the heart. In addition to the calcium score, you can look at liver fat and nonalcoholic steatosis; you can look at the distribution of epicardial, pericardial, and visceral fat; and you can also look at aortic valve calcification and pulmonary artery size, and integrate all of these for three potential applications. The first, obvious one is risk prediction: you can improve the prediction of risk for major adverse cardiovascular events. But potentially you can also envision heterogeneity in the effect of specific treatments.
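Coming back to the human-versus-machine segmentation comparison for a moment: agreement between two segmentations is typically quantified with an overlap metric such as the Dice coefficient, plus a straight volume comparison. A minimal sketch on toy 3D masks (the masks and 1 mm³ voxel size are made up for illustration; real masks would come from CT volumes):

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary segmentation masks (1.0 = identical)."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * intersection / (mask_a.sum() + mask_b.sum())

# Toy masks standing in for manual (human) and automated (model)
# epicardial fat segmentations.
manual = np.zeros((10, 10, 10), dtype=bool)
manual[2:8, 2:8, 2:8] = True

auto = np.zeros((10, 10, 10), dtype=bool)
auto[3:8, 2:8, 2:8] = True  # the model misses one slice

print(f"Dice: {dice_coefficient(manual, auto):.3f}")

# Volume comparison: voxel count times voxel volume (hypothetical 1 mm³ voxels).
print(f"manual volume: {manual.sum()} mm³, automated: {auto.sum()} mm³")
```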
So there may be treatments, like GLP-1 receptor agonists, that work particularly well in patients who have a lot of epicardial fat, or a lot of liver fat, or a specific type of liver attenuation, and those patients may benefit most from a GLP-1 receptor agonist. This is something that has been done over the past decade in the cancer space and can eventually be translated to cardiac imaging. You can also use some of these outputs as surrogate outcomes: we're seeing more and more short mechanistic trials looking at, for example, change in epicardial fat, change in liver fat, change in pericoronary fat, and so forth, to identify the impact of specific medications or lifestyle interventions. Another application could be integrating beyond imaging, bringing in other factors, including EKG features; EKGs are widely utilized, probably one of the highest-volume procedures in the United States, together with clinical variables. In this type of project, we took about 5,000 patients; we got the calcium scores for these patients, we had clinical data on their risk factors, we derived more than 200 EKG measurements using software, and then we predicted outcomes. Just for the sake of time: on the left side is a comparison against the calcium score alone, which is probably the best predictor of cardiovascular risk at this point in time; incorporating the EKG variables actually improves the risk prediction significantly on this AUC curve. Then on the right side you can see that EKG plus calcium score performs even better than calcium score plus the pooled cohort equation, which predicts cardiovascular risk events. And based on that, by incorporating all of these variables, you can actually build a nomogram: now you can plug in numbers using the EKG
risk score, CAC score, age, sex, and so forth, and come up with a precision approach to identifying risk for future events. The last topic here is pericoronary fat radiomics and cardiovascular risk. There is a lot of interest these days in pericoronary fat, because it turns out we can actually look at the pericoronary fat on coronary CTAs, and it carries a lot of information that has been overlooked by human readers, by us, that can improve our risk prediction. It can also identify things that can change our management of patients. For example, in this study they looked at a small cohort of patients who had either an MI, no coronary disease, or stable coronary disease. On the left side of the screen you can see the pericoronary fat; by extracting a large set of features, which we call radiomics, they were able to show that you can improve the discrimination between acute and stable coronary disease. In a subsequent study they were able to show that you can pick out the culprit lesion versus the non-culprit lesion, which is something we have been striving to do for a long time in cardiology. And it turns out that these pericoronary fat attenuation and radiomic features act beyond the plaque characteristics: beyond the atherosclerosis imaging we do routinely, we can look at the pericoronary fat and derive more information. On the right side of the screen you can see the addition of fat radiomic features to high-risk plaque features, and it really adds up significantly: you can identify patients who are at extremely high risk for events, including a 10% risk at five years. And a nice thing about AI is that there is now an increasing focus on explainable AI.
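The kind of head-to-head AUC comparison described a moment ago (calcium score alone versus calcium score plus EKG-derived variables) can be sketched with entirely simulated data. The variable names, effect sizes, and AUC values here are illustrative assumptions, not results from the studies discussed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 4000

# All simulated: a calcium score plus three EKG-derived measurements
# that carry additional, partly independent signal about the outcome.
cac = rng.exponential(100, n)
ekg = rng.normal(size=(n, 3))

risk = 0.004 * cac + 0.6 * ekg[:, 0] + 0.3 * ekg[:, 1] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

# Fit one model on CAC alone and one on CAC + EKG, compare held-out AUC.
aucs = {}
for name, X in [("CAC alone", cac.reshape(-1, 1)),
                ("CAC + EKG", np.column_stack([cac, ekg]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {aucs[name]:.3f}")
```

Because the simulated EKG features carry signal that the calcium score does not, the combined model's AUC comes out higher, mirroring the incremental-value comparison on the slide.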
So explainable AI can help you identify the specific risk factors that are contributing to someone's risk. For example, in this exercise you can see a personalized machine-learning explanation for a 62-year-old man: the calcium score percentile was the most important risk factor, then age, but you can also see some modifiable factors, including LDL, HDL, and epicardial adipose tissue, that can be changed with medications and so forth. On the right side it's a different story: you can see a completely different fingerprint for risk. So where do we go from here? I think the future is going to be incorporating imaging and procedural data; incorporating EHR data, like Dr. Tarek was mentioning; remote monitoring, CardioMEMS, and other devices; EKGs, including ambulatory EKGs and the Apple Watch. All of these go into the cloud, and machine learning models can both identify patients who can benefit from active management, including lifestyle modification, and open a window for intervention in acute settings, for example a pre-STEMI or pre-MI situation; then remote goal-directed medical therapy uptitration; and then real-time disease modeling for early referrals, for example to an advanced heart failure or prevention program, and so forth. As for the limitations of machine learning: obviously the data should be annotated by trained clinicians, and the concept of garbage in, garbage out holds true for a lot of these projects. Whatever you put into the software, the software is actually stupid; it's not intelligent, it just learns from whatever stupidity you feed it, and that's a problem. Deep learning requires a lot of data because of the variance; it has to see a lot of different examples to learn from them. And then algorithms are subject to bias.
There's also a problem with acceptance by physicians, clinicians, and patients, and with patient data privacy, given the vast amount of data that has to be shared. And obviously there is the black-box situation, which can give you a little bit of anxiety as a clinician about how to fix and troubleshoot these tools. Some future directions and gaps in knowledge: we obviously need RCTs in AI, and we're starting to see some of these; there was an EchoNet RCT presented just recently. We need testing of the impact of machine learning algorithms on clinical care and on automation of workflows: how do they affect the length of interpretation, for example, or the accuracy of interpretation? We also need more prediction of treatment response, as opposed to just prediction of events. And then: validation, validation, and validation. Thank you so much.