EX-99.1 2 radnet_8k-ex9901.htm TRANSCRIPT OF CONFERENCE CALL

Exhibit 99.1

 

 

C O R P O R A T E    P A R T I C I P A N T S

 

 

Dr. Howard G. Berger, President and Chief Executive Officer, RadNet Inc.

 

Dr. Gregory Sorensen, Chief Executive Officer and Co-Founder, DeepHealth Inc.

 

 

 

 

C O N F E R E N C E    C A L L    P A R T I C I P A N T S

 

 

Mitra Ramgopal, Sidoti & Company, LLC

 

 

 

P R E S E N T A T I O N

 

 

Operator

 

Good day everyone. Welcome to the RadNet Inc. DeepHealth Artificial Intelligence Call. Today’s conference is being recorded.

 

At this time, I would like to turn the conference over to Dr. Howard Berger, President and Chief Executive Officer of RadNet Inc. Please go ahead, sir.

 

Dr. Howard Berger

 

Thank you, Operator. Good morning everyone.

 

This is Dr. Howard Berger calling in. Two days ago RadNet announced that it had acquired DeepHealth and formed a division for artificial intelligence that will be led by the president of DeepHealth, Dr. Greg Sorensen. Today, we want to talk about RadNet’s decision to move more substantively into artificial intelligence as well as the strategy that hopefully RadNet and DeepHealth will be executing together to accomplish those goals.

 

Ten years ago RadNet made a critical decision to invest in and own its own information technology platform, which we call eRAD, to serve as the backbone for managing a widely distributed network of imaging centers. The premise of this decision was based on both economic and operating requirements, (inaudible) representing an existential necessity. Indeed, the strategy has proven successful, as operating RadNet without this capability seems improbable.

 

I believe RadNet, and in fact all of radiology and imaging, is at a similar crossroads. Artificial intelligence will undoubtedly impact every facet of how the practice of imaging is delivered.

 

Given the size to which RadNet has grown, and is likely to continue growing, artificial intelligence becomes another existential necessity to maximize operational and clinical opportunities.

 

 

 


 

 

RadNet, Inc. – Deep Health Artificial Intelligence Call, March 13, 2020

 

 

Over the past year RadNet and DeepHealth have been collaborating on the use of artificial intelligence for mammography. During this period we came to know the DeepHealth team and, in particular, Dr. Gregory Sorensen. When RadNet made the decision to make a substantial investment in artificial intelligence, it was obvious that Dr. Sorensen was uniquely qualified to lead this effort: a clinical and research radiologist with business and technology experience, having run the North American division of Siemens Healthcare and led the DeepHealth effort in mammography AI over the past four years.

 

The opportunities which lie ahead in artificial intelligence broadly fall into three categories: one, clinical accuracy and productivity enhancement; two, business and operating efficiencies; and three, revenue enhancement.

 

Before I turn the call over to Dr. Sorensen, I would like to give some perspective to the category of revenue enhancement.

 

Procedures performed in diagnostic imaging centers generally require a prescription for the particular exam from the referring physician. In addition, for advanced imaging procedures, authorization is also required before the exam can be performed. The only exception to this rule is screening mammography, for which the patient can self-refer for annual or bi-annual follow-up. As a result of this capability, RadNet has sought tools to improve compliance and increase volume by direct outreach to the patient population. Since RadNet implemented the White Rabbit AI application, mammography volume in most of our regions has increased substantially. The benefit of using artificial intelligence in mammography for more accurate and earlier diagnosis of breast cancer is rapidly being validated.

 

Through recent technology advances, the opportunity to use AI for prostate, lung and colon cancer screening is now within reach. Similar to mammography, these potentially new screening tools should eventually be adopted by patients and payors as both affordable and necessary, and as part of population health initiatives which improve outcomes and reduce costs. Thus, the use of AI for these new screening tools should create a significant benefit which will help direct patients to centers which utilize this capability. Both RadNet and DeepHealth share this vision, which made the combination and collaboration opportunities compelling.

 

I’d like to turn the call over now to Dr. Gregory Sorensen who will further elaborate on both the circumstances which led RadNet and DeepHealth to come together, as well as his vision of the opportunities for the future in artificial intelligence. Greg?

 

Dr. Gregory Sorensen

 

Thank you, Howard. Hello everyone. It’s a pleasure to be with you on the call today.

 

I’d like to begin by affirming what Dr. Berger has said about the potential impact for artificial intelligence, and in particular a form of machine learning known as deep learning, to improve the health of women and men.

 

Dr. Berger’s categorization of opportunities, that is, the three opportunities he mentioned (clinical enhancements, business operations and revenue enhancements), is also spot on. Before diving into the specifics of how DeepHealth’s team and technology will address each of these, I’d like to review a couple of commonly discussed concepts to help set the stage.

 

Let’s start with an analogy that’s gotten a lot of attention, namely the phrase “data is the new oil.” In some ways that’s a very apt analogy, but I’d argue a better one is that data is the new tight oil or tight natural gas. Getting the value out of the source requires a lot of work, perhaps not precisely analogous to fracking but still a very energy-intensive, sophisticated and time-consuming process. If we were to continue that analogy, DeepHealth would be like an oil extraction company and RadNet would be like the Permian Basin, a vast resource with tremendous potential.

 

 

 


 

 


 

 

Bringing these two together enables so much opportunity. This analogy may be even more apt because it’s clear that the two companies are such a good fit for each other. From DeepHealth’s perspective, it’s far more efficient for us to be inside the sources of the data we need for the AI we want to build, and to seamlessly collect that data. That, of course, is made much easier at RadNet since, as you’ve heard, RadNet owns its IT infrastructure itself.

 

We also both have instant access to friendly and cooperative medical experts to ensure that our AI isn’t developed in a vacuum. This is something that other imaging companies struggle with from time to time.

 

From RadNet’s perspective, having what you might think of as a fracking company show up and offer to help realize the value of all this data is also very exciting.

 

Bluntly put, I believe that the most valuable assets that RadNet has are not currently listed on its balance sheet. The data and clinical scale that RadNet has is so tremendous that the opportunity to unlock that value is just incredibly exciting. That’s one concept I wanted to discuss.

 

Another concept that is quite common in AI is to say that AI is going to replace radiologists. Well, I’m a radiologist myself and RadNet works with over 750 radiologists directly, and I can tell you that there’s never been a better time to be a radiologist. AI is not going to replace us. AI is empowering us.

 

We at DeepHealth and at RadNet expect that our machine learning technologies will enable RadNet physicians, meaning our affiliated radiologists, to practice medicine at the very top of their license. We at DeepHealth are building the AI to do the drudgery parts of the job, which will enable our physicians to provide better quality care while also being more productive.

 

How are we going to do that? What are some of the specifics? Well, let me open the hood a little bit on our technology and our team to help explore that.

 

As you know, at the core of DeepHealth, from the name you might guess, is deep learning. Deep learning is the name given to a relatively new way that lets computers learn to do certain tasks. In our case, the breakthrough is that we can develop deep learning technologies to actually help physicians do a better job of interpreting images.

 

Deep learning consists of taking what is called a model, which is really just a set of equations, linear algebra equations, and adjusting the coefficients and the variables in those equations until the model produces the answers that you were hoping to see. The great thing is that this adjustment can be done automatically by exposing the set of equations, the model, to training examples. If you want your model to distinguish pictures of dogs from pictures of cats, you could start with a computer model consisting of lots of linear algebra equations, show those equations thousands of pictures that you know are dogs and thousands more that you know are cats, and you will end up with a model, a software system, that does a pretty good job of distinguishing dogs from cats. In fact, this is no longer so hard to do. Many companies around the world are using systems like this today for everything from software for self-driving cars to facial recognition to designing drugs.
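The training loop Dr. Sorensen describes can be sketched in a few lines of Python. This is an illustrative toy, not DeepHealth’s technology: the “model” is a simple logistic regression whose coefficients are adjusted automatically against labeled examples, and the two features per “picture” are invented for the example.

```python
import math

# Toy sketch of the idea described above: the "model" is just a set of
# coefficients (w, b), and training repeatedly nudges those coefficients
# until the model reproduces the labels it is shown. The two features per
# "picture" are hypothetical stand-ins for what a real network would learn.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, labels, lr=0.5, epochs=200):
    """Fit weights by gradient descent; labels are 1 (dog) or 0 (cat)."""
    w, b = [0.0] * len(examples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # how far the model's answer is from the known label
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Hypothetical feature vectors: [ear pointiness, snout length]
dogs = [[0.9, 0.8], [0.8, 0.9], [0.7, 0.85]]
cats = [[0.2, 0.1], [0.3, 0.2], [0.1, 0.15]]
w, b = train(dogs + cats, [1, 1, 1, 0, 0, 0])
print(predict(w, b, [0.85, 0.9]), predict(w, b, [0.15, 0.1]))
```

The same loop, with vastly more equations and verified medical labels instead of dogs and cats, is the hard version of the problem the speakers go on to describe.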

 

But what if you wanted a model to distinguish dogs from wolves? Or, in our case, mammograms that show something that might be cancer versus mammograms that show something that actually is cancer. This is where the hard work comes in.

 

We at DeepHealth have built teams of people, very talented people, to address this question, and they in turn have built tools to start to address this type of question. As with wolves versus dogs, we have carefully identified examples of, on mammograms, breast cancer versus not breast cancer, not by using the opinions of humans but by tracking down pathologic evidence for each example, each one of those mammograms. Our team has specifically built software tools, methods and even patented approaches for adapting deep learning to the specific needs of our medical imaging problems.

 

In the case of breast cancer, because fortunately only one of 200 women who undergo screening mammography actually has breast cancer, the key to getting lots of pictures of wolves is to be sure that you can first collect lots and lots of pictures of dogs. That’s why we at DeepHealth are so excited to be part of RadNet. Because of the large volume of medical images that RadNet generates every year, we have the opportunity to build precise, powerful tools that really do answer important clinical questions.

 

 

 


 

 


 

 

With that backdrop, now let’s turn back to Dr. Berger’s categories, so where AI can benefit patients, and specifically help RadNet do its job better.

 

His first category, clinical accuracy and productivity, is where the DeepHealth team has been focusing its efforts over its first four years. To illustrate the power of AI and its opportunity for clinical benefit, I would like to briefly discuss screening mammography a little bit more.

 

Screening mammography clearly saves lives. It’s been studied in multiple countries multiple times over the past three decades, and since screening was introduced, breast cancer deaths have indeed declined. But that doesn’t mean that screening mammography is a perfect tool. Reading mammograms, I can tell you from personal experience, is a difficult task, and it’s difficult even for full-time experts. Breast cancer is fortunately still quite rare; as I said earlier, only one out of 200 women who get a screening mammogram actually has a breast cancer present on that mammogram, and the signs of cancer in those one in 200 can be quite subtle. It’s actually not a bad analogy to go back to the idea of trying to sort pictures of dogs from pictures of wolves; sometimes they can look very, very similar. Because it’s so rare, only one in 200, the other 199 women do not have breast cancer. All in all, this is very much like the proverbial task of finding a needle in a haystack, and for us physicians it can be simultaneously very tedious, because so many are normal, yet very stressful, because as an interpreter you know that there’s a cancer somewhere in those 200 women’s cases.

 

Because this task is so difficult, in turn that means not all doctors are equally good at it. The best radiologists might catch well over 90% of the cancers, but there are some who appear to catch more like 60%. Recent technology improvements in the way we take those pictures have made the task even more difficult. A new technology called digital breast tomosynthesis, which RadNet is a pioneer and a leader in rolling out (it’s sometimes called 3D mammography), does do a better job of showing the cancer on the mammogram, but because it’s 3D instead of 2D, it generates 50 to 100 times as many images for us experts to review, making our job even more difficult.
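To make those reader-sensitivity figures concrete, here is a back-of-the-envelope calculation using the 1-in-200 prevalence and the 90% versus 60% catch rates quoted above. The screening volume of 200,000 is an assumed round number for illustration, not a RadNet figure.

```python
# Illustrative arithmetic only: prevalence and sensitivities are the figures
# quoted on the call; the screening volume is an assumed round number.
screens = 200_000
prevalence = 1 / 200

cancers_present = round(screens * prevalence)   # cancers hiding in the haystack
best_reader = round(cancers_present * 0.90)     # ~90% sensitivity
weaker_reader = round(cancers_present * 0.60)   # ~60% sensitivity

print(cancers_present, best_reader, weaker_reader)           # 1000 900 600
print("extra cancers caught:", best_reader - weaker_reader)  # 300
```

At this assumed volume, the gap between the best and weakest readers is on the order of 300 cancers per 200,000 screens, which is the scale of opportunity a second AI reader is aimed at.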

 

This situation is exactly where AI should be able to help. Our team at DeepHealth has created a database of thousands and thousands of verified examples of breast cancer to build AI software to identify even subtle signs of cancer. We’ve been testing this software with RadNet for nearly a year now, and we’ve learned that our software can find cancers that human physicians sometimes overlook, sometimes as much as a year earlier.

 

We’re now working with RadNet’s expert physicians to determine not only how much better at interpreting mammograms the DeepHealth software could allow doctors to be, but also to measure how much more efficient they could be if the software could help them prioritize their work.

 

That’s the first example, clinical accuracy and productivity. It’s very clear that there are some exciting opportunities there, some really exciting low-hanging fruit.

 

Now, the second category that Dr. Berger mentioned, business and operating efficiencies, is where RadNet’s earlier team, Nulogix, has already been working. While the breakthroughs in image analysis capture a lot of attention in the press and media, many of the same deep learning tools can be applied to non-imaging data, where they can have substantial impact.

 

One example that RadNet has in active testing now is a tool that analyzes the dozens, if not hundreds, of variables at play in how best to submit a reimbursement request to an insurance company. This helps RadNet immediately, because having a claim rejected by an insurer not only means a delay in cash collection but also raises the risk of never being paid at all. Of course, another big bonus of these business applications of AI is that, since they’re not involved in direct medical care, they do not require FDA approval.

 

The third category, revenue enhancement, is something that Dr. Berger has already alluded to. If AI can help us lower our costs at RadNet to perform screening exams, be those examinations mammograms, lung CT scans, which have been proven to save lives in smokers and former smokers, or prostate MRIs, then we can offer these exams to large patient populations at an affordable cost and actually improve millions and millions of lives. It’s that opportunity, the RadNet opportunity, that makes us so excited to join together with them.

 

 

 


 

 


 

 

In closing, I would just like to return to the idea that RadNet’s best assets are not yet visible on its balance sheet. I’ve had a chance to work closely with the RadNet team over the past year, and my whole team and I are convinced not only that the raw material is present in the data but that the entire RadNet management and clinical teams also understand the possibilities here. They are eager to work with our highly talented scientists to unlock these assets. Doing this in a business setting is very exciting to us because it sharpens our scientific focus onto the things that are sustainable and valuable to all patients and to all stakeholders. We are very motivated to get going, and with that I will close my remarks.

 

Thanks very much for your attention and I think we’re going to turn it now to some questions. Is that right, Dr. Berger?

 

Dr. Howard Berger

 

Yes, Greg, thank you.

 

 

 

 

 

 


 

 

 

All right, we will take your questions. Please, Operator, give us the first questioner.

 

Operator

 

Thank you, sir. If you would like to ask a question at this time, please signal by pressing star, one on your telephone keypad. Please make sure that your mute function is turned off to allow your signal to reach our equipment.

 

Once again everyone, that is star, one at this time if you would like to ask a question. We’ll pause for just a few moments to allow everybody the opportunity to signal.

 

One more time, that is star, one if you would like to ask a question. We’ll pause again for another moment.

 

Dr. Greg Sorensen

 

While we’re waiting for people to come up with their questions, let me just jump in because I can repeat a few questions that I often get about deep learning.

 

One is, I’d just like to circle back to this question of whether AI will replace radiologists, and whether radiologists are doomed. There’s a famous meme going around comparing radiologists to Wile E. Coyote, already over the cliff.

 

That statement was made back in 2016 or so. Here we are, four years later, and the job market for radiologists is fantastic. There’s huge demand for our specialty, and that’s because so many people see the value that imaging provides.

 

RadNet, by being focused on imaging, really does bring one of the most valuable technologies to patients, and the opportunity for AI to speed that up, make it more efficient and improve its quality is what radiologists really are excited about. The human element, actually speaking with a woman to explain what’s going on in her mammogram, discussing the procedure of a biopsy you might do: these things are never going to go away. Computers are never going to actually consult with patients, at least I don’t see that happening in my lifetime.

 

It’s really a symbiotic relationship where the AI and the doctors work together to deliver things in a much smoother and more efficient way.

 

I’ll pause here and see if there’s any questions coming in.

 

Operator

 

It looks like we do have a question from Mitra Ramgopal with Sidoti.

 

Mitra Ramgopal

 

Yes, hi. Good morning. Thanks for taking the questions. First, I just wanted to get a sense of what DeepHealth is able to bring to the table that, for example, Nulogix might not have been able to offer. As you combine the two entities, would you say you have the full range of capabilities to drive the efficiencies and lower the costs that you intend to, or do you need to invest even more in bringing in some additional technologies?

 

Dr. Howard Berger

 

Good morning, Mitra. It’s a multifaceted question. Let me try to break it apart.

 

First off, the team that we had in Nulogix, while very competent, was limited to practically speaking four people, and their primary function, as Dr. Sorensen mentioned, was looking at business efficiencies such as improving our reimbursement operations by reducing denials, getting a better sense of collections at the front door for co-pays and self-pay, etc. While their capabilities might have been broad, their functions were rather limited.

 

With the acquisition of the DeepHealth team, we add a group whose focus has been solely on clinical opportunities and, as Dr. Sorensen was describing, primarily in the field of mammography.

 

For the past four years, they’ve been performing that task of bringing a product to market, for which we hope to apply for FDA approval later this year, and then be able to start using it clinically and perhaps even sell it commercially.

 

The basic algorithms used to develop these clinical tools, particularly, as Dr. Sorensen was saying, for identifying the cats versus the dogs, once set up, are more easily adaptable to other opportunities where you might want to use the same basic logic and strategy, which is what we hope the DeepHealth team will turn to either after, or simultaneous with, their work in mammography.

 

As I mentioned in my opening remarks, our particular focus will be prostate screening, and then lung screening and colon screening. Together with breast, the cancers identified in those four organs comprise about 80% of all the cancers diagnosed in the United States annually.

 

The market opportunity for us to develop these tools is not only very broad and extensive; more important is that these tools, much like mammography, become adopted by the medical communities so that patients can get these exams in an affordable manner, one where they can self-refer if they have either the clinical background and risk assessment or, as part of regular screening, for example for prostate cancer.

 

Attacking this kind of broad approach would require substantial resources and assets from RadNet or anybody else. Even the more strategic initiatives that we’d like RadNet to pursue, because they lead not only to the clinical accuracy of our radiologists but also to revenue opportunities, are still only a fraction of the opportunities inside artificial intelligence. It’s quite likely, because other teams working on artificial intelligence come to us for both our data extraction and our clinical expertise, that we will be partnering or collaborating with other teams so that we can develop in a more expeditious manner the benefits of artificial intelligence inside RadNet and for the community. I expect this to be a broad approach. I think it’s important that we have somebody of Dr. Sorensen’s capability, from a clinical standpoint, a research standpoint and a business standpoint, to lead what I see as a broad-based initiative on the part of the company.

 

It’s also quite possible that we’ll be working with some of our manufacturers, the OEMs whose equipment we buy, given the resources and tools that we have, both in terms of the amount of equipment that we might have out there and the ability to implement a wide range of delivery of these services, as well as having expertise in running the business side of (inaudible) that most of the equipment manufacturers lack.

 

I see the resources both coming from an expansion of the combination of the DeepHealth and Nulogix teams that Dr. Sorensen will oversee, as well as bringing in other relationships with business entities, whether they be manufacturers or people that are just focused on other parts of the AI opportunities as part of an overall strategy for the company.

 

Mitra Ramgopal

 

Great. Thanks for the color on that. I was also wondering if maybe you could give us a sense of how FDA approval works for AI, and your sense of the timeframe we should be looking at for getting your product to market.

 

Dr. Greg Sorensen

 

Sure, great question I’ll take that. This is Greg.

 

The FDA has asserted jurisdiction over medical imaging artificial intelligence products; they consider them medical devices. In past decades they considered them Class 3 devices, a higher-risk category, but in the past year or so they have reclassified most AI tools, like the ones we’re developing, as Class 2 devices that require 510(k) clearance before they can be used in interstate commerce.

 

We at DeepHealth have been spending a substantial amount of time in talking with folks at the FDA. We’ve met repeatedly with them. We’ve built a good relationship of trust and of scientific integrity, and we believe that will be an ongoing part of our strategy where we will continue to work closely with FDA about the products we want to bring to market.

 

The typical 510(k) process these days does continue to evolve because AI is such a new field, but in general the FDA requests that you have what they call a presubmission meeting with them. You go in and explain what you’re thinking about doing, and they give you some feedback. That takes a few months. Then you go off and do the work that you’ve essentially laid out. That can usually take a few months. Then you submit the material to the FDA, and they get a few months to review it.

 

All in all, once you’re pretty close to done with your technology and ready to start the regulatory process, that can take about a year, sometimes a little longer. It depends on how good your science is going into the FDA process.

 

To your specific question about RadNet tools, we will be getting FDA clearance for those AI technologies that the FDA has guided us that require FDA clearance and we expect that will be a pipeline of things over the years.

 

On your earlier question, just to echo what Dr. Berger said, there are so many opportunities in AI and radiology, and RadNet has such a broad portfolio; it’s not like we’re a single imaging modality business. Some things we’re going to build internally at DeepHealth; for some things it will just make more sense businesswise for us to buy, if you will, or to go outside and partner with external parties. We’ll be looking at that closely as we move ahead.

 

It’s definitely not a not-invented-here syndrome here. We’re very open to the best technologies, wherever we can find them, to help RadNet meet its mission and to help our patients.

 

Mitra Ramgopal

 

That’s great. Just to follow-up a little on that, obviously the focus has been on mammography. How easy or how quickly do you think you will be able to use these technologies on some of the other modalities?

 

Dr. Greg Sorensen

 

Some of the modalities have tools commercially available already today, and RadNet has been evaluating a number of them even as we speak. The real question, I think from an investor point of view and from an owner point of view, is how valuable those are. That’s really the process we’ll be going through over the next quarter or two: figuring out which of these technologies can really make an impact for our shareholders and for our operations, versus those that are more nice to have but not necessarily meaningful. I expect that over the course of 2020 we will identify some things that really will have a meaningful impact.

 

Now, in medicine things take a long time; practices and habits take a while to change because there are so many people involved, so I wouldn’t want to overpromise or say we’re going to see things turn on a dime. But RadNet is really well situated to identify those opportunities because we have such broad operations; we can distribute different opportunities to different teams and in parallel identify, I don’t know, 5 or 10 different things almost simultaneously.

 

I think we’re as well situated, perhaps better situated, than most entities out there to address the AI opportunities at scale.

 

Dr. Howard Berger

 

Let me interject something for a moment, Mitra.

 

Greg, can you give some perspective, at least as it relates to mammography AI, on where DeepHealth is in the current process of filing for FDA approval?

 

Dr. Greg Sorensen

 

Yes. Specifically with our product, we met with the FDA last November for what’s called this presubmission meeting. We had a very productive, positive experience there and came out with some very clear guidance from the FDA about how to proceed. We will be meeting with them again next month to get final approval for our execution plan. Basically, the way that works is you come up with essentially an agreement with the FDA, a handshake agreement equivalent, where they essentially say: if you execute the scientific process that you are proposing and it turns out with the scientifically proven answers you think it will, then that will support the kind of claims that you’re looking for. In our case, we’re looking at DeepHealth and RadNet not just to have a claim that says we can do a little bit better at interpreting mammograms; we’re very focused on being able to demonstrate, to the FDA’s satisfaction and our own, that the AI actually makes a meaningful difference in women’s lives.

 

We don’t yet have clearance, so I can’t give you a specific quote of what the FDA will agree to. Scientifically, I can tell you we’re interested in claims along the lines of: a physician who interprets a mammogram using DeepHealth software finds cancer earlier than if they interpreted the mammogram without DeepHealth software, that kind of thing. Our data suggest that we’ll be able to do that. We think that’s a substantially unique differentiator in the marketplace and, more importantly, that it will be something our referring physicians see as very valuable for their female patients.

 

That’s the kind of direction we want to go: not just dollar operational efficiencies, but meaningful improvements in women’s and men’s lives. We think that’s particularly valuable in the RadNet setting because of something we haven’t really talked about on this call but which I know came up in yesterday’s call, and that is that RadNet is relatively unique in that it has capitated lives. Not many radiology providers can take on capitation, and that opens a whole series of value propositions for artificial intelligence that are going to be possible and even very meaningful, but that become much harder to do or monetize when you’re outside the tent or not so closely linked. There’s a bunch of things we’re going to be able to do together, by working hand-in-glove and by DeepHealth being part of RadNet and RadNet having its own DeepHealth, if you will, that would otherwise just be much harder to do from a business perspective.

 

(Inaudible).

 

Dr. Howard Berger

 

(Inaudible) a couple of things before perhaps if you had another question.

 

First of all, to put it in better perspective, even though only one out of every 200 women we screen with mammography is ultimately diagnosed with breast cancer, in the general population one out of every eight women will get some form of breast cancer in her lifetime. You are talking about a very substantial problem, and as Dr. Sorensen suggested, even though there have been great strides in diagnosing breast cancer earlier, that hasn’t necessarily reduced the number of breast cancers; it has simply allowed us to diagnose them earlier. With the use of artificial intelligence and other technologies such as 3D tomosynthesis, we should be able to diagnose these cancers earlier and ultimately reduce morbidity and lower the cost to the system.

 

I think it’s important for all of us to understand that this is about scale, and that goes to the heart of RadNet’s decision to become fully invested in artificial intelligence. We are on a trajectory to do about 2 million mammograms a year, so on an annual basis that would represent close to 5% of all mammography performed in the United States, and that’s done in essentially the five major markets that we’re in.

 

If you wanted to take technology such as what DeepHealth is on the brink of getting FDA approval for, and then purchase that, even if it added $1 or $2 per read, per scan, it would cost the company $2 million to $4 million and be an ongoing cost that we would continue to bear regardless of how many mammograms are ultimately done inside the RadNet organization, and with that kind of expenditure we would really only have one product.

 

Mammography is unique in what we do in that every mammogram we perform will need to be subjected to artificial intelligence. This isn’t a matter of saying we only use artificial intelligence on the (inaudible) that you ultimately diagnose with breast cancer; it’s a tool that will help identify those cases earlier and more accurately. So every mammogram we do will need to be subjected to artificial intelligence, both to distinguish normal from not normal and then, essentially, to be another pair of eyes that helps the radiologist read these more accurately.

 

The same logic could and should apply to, for example, prostate cancer. Every man, should he live long enough, is likely to get prostate cancer. The question in some of the medical communities is often what you do for the testing and what you then do for treatment, given that many prostate cancers will not necessarily be something that ultimately leads to somebody’s demise, but become a very substantial cost to the system to treat. We firmly believe, and we’ve done some initial work with this, that we can develop a tool that will make prostate MRI screening as easy and as affordable as mammography screening. It will substantially reduce the cost of the diagnosis and will allow, I think, the payors to make this every bit as accessible to their membership as mammography is now to women.

 

I want to put that in perspective because that becomes another tool that we believe will be applied to every prostate MRI that we do. We just hope to do them earlier and diagnose them faster and more accurately.

 

The other comment I would want to make, and this is really looking down the road a piece, is that when we do a mammogram, and when we do an MRI, particularly of the prostate, there is a lot of information on those scans that is simply discarded, because it’s either not the reason the test was ordered or it’s just not practical to analyze. I recently saw an article about a team out of Stanford that won an award for taking knee scans, subjecting them to artificial intelligence, and then being able to determine cardiovascular disease risk for all of the patients who had knee MRIs, regardless of whether the knee itself was determined to be normal or abnormal.

 

Essentially, that kind of information is available on every scan or procedure that is done in imaging, and that data, which is another part of the assets, if you will, that any radiology company or provider owns, is something that I think can yield an enormous amount of valuable information. Particularly from mammography or prostate imaging, if we were to do that on a screening basis for the population, it would essentially be additional information that wouldn’t cost anything more to develop than giving artificial intelligence the opportunity (inaudible) particular perspective that we’re trying to achieve with that diagnostic information.

 

The perspective of what we’re talking about in artificial intelligence is really not just that one tool but potentially looking at the spectrum of the impact artificial intelligence can have overall in the delivery of healthcare, particularly as it’s representative of the radiology diagnostic imaging field.

 

I’m sorry, Mitra, if you had another comment or question you’d like to ask us.

 

Mitra Ramgopal

 

No, that is great. Actually, the final question—and again, this is more down the road—would the reimbursement be necessarily any different if AI is used versus non-AI?

 

Dr. Howard Berger

 

Well, that probably goes back again to my analogy of the $64,000 question, although that reference is from a TV show from a number of years ago.

 

My guess is the answer to that may be no. I don’t know if anybody is going to pay any more because you’re doing it. It’s more likely to be that you would do more and see more of those patients because you’re doing it and providing a value added to the system that will make you a better provider and a better partner in dealing with the overall health of that patient.

 

Now, I could be wrong, but if you look at what’s happened over the years in imaging, as new technology has been advanced, nobody is generally running out there and paying us more money. The one exception, perhaps, is at this time 3D mammography, which does have an additional code that gets reimbursed above and beyond 2D mammography, but even the dollars that we get paid for that are not necessarily reflective of the additional investment that has to be made.

 

However, 3D tomography for mammography procedures is now the state of the art, and most people will expect you to perform that exam; if you don’t, you’re at a competitive disadvantage. I expect payors and others to embrace artificial intelligence the same way, making it a requirement, so that if you don’t provide it you may actually get less money, and that indeed has some historical precedent. Thank you, Greg.

 

Dr. Greg Sorensen

 

It becomes the new normal.

 

Dr. Howard Berger

 

Yes, and I’ll give you an example of that: when the industry moved to digital radiology and it was required for reimbursement by CMS, they wouldn’t pay you more if you did it; they paid you less if you didn’t do it.

 

Again, perhaps I’m being a little overly dramatic when I use the term existential necessity, but I think we have to work from a historical standpoint and be rather sanguine about what’s going to happen on the reimbursement side.

 

We all tend to live in this world, by the way, of still everything is on a fee-for-service basis, but the industry is moving more to population health, alternative reimbursement models, risk taking, getting paid by better outcomes, if you will, and artificial intelligence will be a critical part of that opportunity moving forward.

 

Dr. Greg Sorensen

 

If I could just echo one thing that I want to bring out: the scale of RadNet enables us to have the potential to develop these AI tools. Dr. Berger mentioned that we’re going to do, we think, 2 million mammograms this year. That’s as many as the entire United Kingdom will do next year. RadNet can collect data, analyze data, get all those pictures of dogs or wolves or whatever analogy you want to use, to answer the most critical and valuable questions at a scale very few other organizations can even contemplate. We have our own internal AI system—I’m sorry, IT system, so all the data collection gets streamlined. We have a lot of alignment of interests. Just the opportunity to catapult ahead and really bring these technologies to patients and to make a business impact faster is a remarkable opportunity that we both saw and that we’re just super excited about.

 

Mitra Ramgopal

 

I really appreciate the color and thanks for taking the questions.

 

Dr. Howard Berger

 

Thank you, Mitra.

 

Operator

 

We have no further questions at this time.

 

Dr. Howard Berger

 

Great. All right, well, thank you all for joining us today and please stay tuned for exciting developments that RadNet hopes to be able to deliver this year and bring better medicine, which is good business, to the marketplace. Thank you all for your time this morning.

 

Dr. Greg Sorensen

 

Thank you. Bye.

 

Operator

 

That does conclude today’s conference. We thank everyone again for their participation.