Verily, which was spun off from Google under the Alphabet group's new corporate structure, reports its first success. According to a study published in Nature, an internally developed machine-learning system can diagnose heart and circulatory illnesses from images of patients' eyes. It can also recognize signs of chronic disease.
From images of the retina and iris, the software can deduce a patient's age, blood pressure, and whether the patient smokes. From these data, according to the study, it predicts the risk of myocardial infarction with precision comparable to the validated methods currently used in medical practice.
Verily's engineers are building on existing research suggesting that, thanks to the retina's dense network of fine blood vessels, a great deal of information about a patient can be derived from the state of the retinal vasculature.
To train the AI, Verily researchers used a vast database of health information on 280,000 patients from the United States and the United Kingdom. By comparing retinal images of patients who experienced cardiovascular problems within five years of imaging against images of those who did not, the artificial intelligence learned to predict the risk of cardiovascular complications with an accuracy of 70 percent. The current method, which relies on blood tests, is accurate in 72 percent of cases.
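The setup described above can be framed as a binary classification problem: each patient is labeled by whether a cardiovascular event occurred within five years, and a model learns to map patient features to a risk score. The sketch below is purely illustrative and is not Verily's actual model; it uses a tiny logistic-regression classifier on synthetic stand-in features (age, blood pressure, smoking status, the same signals the study says can be read from the eye) rather than retinal images.

```python
import math

# Illustrative sketch only -- NOT Verily's model. A minimal logistic
# regression trained by stochastic gradient descent on synthetic data.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by plain gradient descent."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict_risk(w, b, x):
    """Probability of a cardiovascular event within five years (toy)."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Synthetic patients: (age/100, systolic BP/200, smoker flag);
# label = 1 if an event occurred within five years of imaging.
X = [(0.45, 0.60, 0), (0.50, 0.65, 0), (0.70, 0.90, 1),
     (0.75, 0.85, 1), (0.40, 0.55, 0), (0.68, 0.88, 1)]
y = [0, 0, 1, 1, 0, 1]

w, b = train(X, y)
high = predict_risk(w, b, (0.72, 0.90, 1))  # older smoker, high BP
low = predict_risk(w, b, (0.42, 0.55, 0))   # younger non-smoker
```

In the real study the features are not hand-picked numbers but are extracted from retinal images by a deep network; the outcome labeling by a five-year follow-up window, however, is the same idea.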
An even greater success rate, and one closer to clinical practice, is reported by another project combining heart disease and artificial intelligence. Oxford researchers have developed an AI that analyzes cardiac activity from echocardiography data, using more than 80,000 data points to evaluate a single image from an ultrasound heart scan.
To train the system, its creator Paul Leeson drew on heart scans from thousands of patients.
According to preliminary results of clinical trials, the Ultromics system is "significantly more accurate" than the diagnoses of consulting physicians, who are mistaken in about twenty percent of cases.
According to the Ultromics website, the accuracy of its AI diagnoses exceeds 90 percent, which could save UK healthcare £300 million a year. Hospitals should be able to get the software free of charge next year.
Another British machine-learning health system, Optellum, is also in clinical trials. It evaluates lung images and detects early-stage cancer by analyzing data from computed tomography. Up to 30 percent of patients have lung findings that are mostly harmless, but because the results are uncertain, these patients must undergo two years of follow-up checks.
Some experts see an obstacle to AI's advance in medicine: unlike traditional diagnostic systems, machine learning provides no justification for the conclusions the artificial intelligence reaches. The authors of the DeepHeart system, which can detect diabetes from data gathered by smartwatches equipped with a heart-rate sensor, draw attention to this risk, for example.
Acceptance of AI in medical practice is helped mainly by tangible savings on tests and follow-up care; within Europe, the professional community, including insurance companies, is most receptive to artificial intelligence in the UK. There is also room for machine learning and software-driven diagnostics in developing countries, where healthcare is significantly worse than in the US or Western Europe. Stanford researchers, for example, have developed a system capable of recognizing symptoms of skin cancer in photographs. Tested against 21 practicing dermatologists, their artificial intelligence achieved 91 percent success. They plan to offer the technology in the future as a mobile application.