Smarter workflow prioritization
Pre-dictation for reporting
AI assistance for peace of mind.
Detection of seven common abnormalities seen in emergency departments: fracture, dislocation, elbow joint effusion, pleural effusion, pulmonary nodule, pulmonary opacity, and pneumothorax.
AI-Generated “STAT” Flags for Unbiased Prioritization and Rule Out Support.
The algorithm automatically scans plain films and, once an abnormality is detected, seamlessly pushes a STAT-like flag to RIS/PACS worklists.
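The triage step described above can be sketched roughly as follows. This is a minimal illustration, not the Arterys/Milvue API: the names `Detection` and `flag_for_worklist` are hypothetical, and a real integration would push the flag over DICOM/HL7 to the RIS/PACS rather than sort an in-memory list.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """Hypothetical per-study result from the abnormality detector."""
    study_id: str
    abnormality: Optional[str]  # e.g. "pneumothorax", or None if clear

def flag_for_worklist(detections):
    """Mark any study with a detected abnormality as STAT so it
    sorts to the top of the reading worklist."""
    worklist = []
    for d in detections:
        priority = "STAT" if d.abnormality else "ROUTINE"
        worklist.append({"study_id": d.study_id, "priority": priority})
    # STAT studies first; stable sort preserves arrival order otherwise
    worklist.sort(key=lambda entry: entry["priority"] != "STAT")
    return worklist

studies = [
    Detection("CR001", None),
    Detection("CR002", "pneumothorax"),
]
print(flag_for_worklist(studies)[0]["study_id"])  # prints CR002
```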
Save time using AI-powered report pre-dictation.
For radiologists, the algorithm pre-populates the study description and findings fields in dictation software. It will not submit a report without a radiologist's approval.
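The approval gate described above can be sketched as follows. The function names (`predraft_report`, `submit`) and the report structure are illustrative assumptions, not the actual dictation-software interface; the point is that the draft stays unapproved until a radiologist explicitly signs off.

```python
def predraft_report(findings):
    """Build a pre-dictated draft; 'approved' stays False so nothing
    is submitted automatically."""
    text = "; ".join(findings) if findings else "No abnormality detected."
    return {"findings": text, "approved": False}

def submit(report, radiologist_approved):
    """Refuse to submit unless a radiologist has approved the draft."""
    if not radiologist_approved:
        raise PermissionError("Report requires radiologist approval")
    report["approved"] = True
    return report

draft = predraft_report(["fracture of the distal radius"])
submit(draft, radiologist_approved=True)
print(draft["approved"])  # prints True
```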
The Arterys Chest | MSK AI application enables fast and thorough reading of emergency radiographs by providing AI-powered detection and rule-out support within a user-friendly interface. Beyond the seven pathologies listed above, diagnostic assistance for several additional conditions is available via the Arterys Marketplace. Mix and match algorithms to suit your X-ray workflow needs.
The Arterys software is a medical diagnostic application that displays, processes, stores, and transfers DICOM and non-DICOM medical data. It is intended to be used as a support tool by trained healthcare professionals to aid in diagnosis. It is intended to provide image and related information that is interpreted by a trained professional to render findings and/or diagnosis, but it does not directly generate any diagnosis or potential findings.
MILVUE is an Artificial Intelligence powered predictive software enhancing performance and safety through: 1. Prioritization of radiographs (abnormalities versus absence of abnormality) and 2. Emphasis of abnormalities by heatmap and/or boxes. MILVUE is intended to be used in radiology and emergency services to enhance patient management by reducing the processing time devoted to radiographs reading.
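The "emphasis by boxes" behavior described above can be sketched as a simple overlay on a grayscale image. This is a toy illustration under stated assumptions (the image is a list of pixel rows, the box coordinates come from the detector); it is not Milvue's implementation.

```python
def overlay_box(image, box):
    """Draw a 1-pixel white rectangle outline (value 255) on a grayscale
    image given as a list of row lists; box = (r0, c0, r1, c1), inclusive."""
    r0, c0, r1, c1 = box
    for c in range(c0, c1 + 1):
        image[r0][c] = 255  # top edge
        image[r1][c] = 255  # bottom edge
    for r in range(r0, r1 + 1):
        image[r][c0] = 255  # left edge
        image[r][c1] = 255  # right edge
    return image

radiograph = [[0] * 5 for _ in range(5)]
overlay_box(radiograph, (1, 1, 3, 3))  # box around a detected finding
```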
Milvue is not trained with intensive care unit or operating room X-rays. Utilizing the model in these settings is outside the scope of the intended use.
Information on training data
Trained on over 30,000 professionally annotated X-ray images acquired on eight OEM machines, including GE, Siemens, and Philips, at more than five institutions in France.
Model performance metrics
Area under the ROC curve (AUC): 0.87–0.98, depending on the pathology
Inference time: 1 second