
teamplay Usage KPIs and Calculations

This document explains which data is used for the calculation of key performance indicators in teamplay Usage. 

teamplay Usage KPIs and Calculations
siemens-healthineers.com/teamplay

[Cover image: the teamplay Usage dashboard with KPI tiles such as Dose (exams within target, dose events above internal / national reference), Patients & Exams, Exam Duration, Patient Change Time, Today's Open Orders, Incomplete Reports, Contrast usage and Avg. Report Turnaround Time]

Contents

Introduction
Which data is used to calculate the KPIs in teamplay Usage?
Data privacy level and data in teamplay Usage
  Data privacy in teamplay
  Effect of the chosen data privacy level on KPIs
What is an examination in teamplay Usage?
  Goal of examination based analytics in teamplay Usage
  How is the DICOM study content used?
teamplay Usage KPIs
  Number of patients
  Number of exams per patient
  Number of examinations
  Exam duration
  Patient change time
  Exams per hour
  Calculated working hours
  Table occupancy and utilization
  Contrast agent exams reflected in KPIs
  Supported KPIs
Benchmarking
  Calculation of benchmark value
  Benchmarking functionality
Recommendations
  Reliable and comparable key performance indicators
  Missing data in teamplay Usage
  Interpretation of benchmark values
Appendix
  Supported DICOM SOP classes
  What is the cause of data discrepancy or inconsistency between the RIS / PACS and teamplay?

Introduction

teamplay Usage¹ provides functionality and Key Performance Indicators (KPIs) to monitor and analyze the efficiency and performance of the diagnostic imaging processes in your institution. To get full value out of the KPIs and functionality, it is key to understand the underlying data and calculations. The following chapters explain the data used in teamplay Usage and describe the various KPIs.

Which data is used to calculate the KPIs in teamplay Usage?

teamplay Usage processes the DICOM image header content of the DICOM files analyzed by the teamplay Receiver. The available DICOM header information is checked for supported DICOM header formats by verifying the Service-Object Pair (SOP) Class Unique Identifiers (UIDs). For a detailed list of the SOP classes supported in teamplay Usage, please refer to the chapter "Supported DICOM SOP classes" in the Appendix.
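As a simple illustration of this SOP class check, the following sketch filters DICOM files by their SOP Class UID. It is not the actual teamplay implementation: it assumes the pydicom library is available, and SUPPORTED_SOP_CLASS_UIDS shows only a fragment of the full list from the Appendix.

```python
# Minimal sketch: filter DICOM files by SOP Class UID (assumes pydicom).
from pathlib import Path
import pydicom

SUPPORTED_SOP_CLASS_UIDS = {
    "1.2.840.10008.5.1.4.1.1.2",    # CT Image Storage
    "1.2.840.10008.5.1.4.1.1.4",    # MR Image Storage
    "1.2.840.10008.5.1.4.1.1.6.1",  # Ultrasound Image Storage
    # ... remaining UIDs from the chapter "Supported DICOM SOP classes"
}

def is_supported(path: Path) -> bool:
    """Return True if the file's SOP Class UID is on the supported list."""
    # Reading the header only (stop before pixel data) keeps this cheap.
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return str(ds.SOPClassUID) in SUPPORTED_SOP_CLASS_UIDS
```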
Data privacy level and data in teamplay Usage

Data privacy in teamplay

teamplay offers three different privacy profiles that allow the user to reduce the amount of identifying patient information extracted from DICOM files and uploaded via the teamplay Receiver. The three levels offer the following protection characteristics:

• Standard privacy – no direct patient identifiers
• High privacy – no re-identification based on exceptional patient characteristics
• Restrictive – anonymous

The table below shows the effect of the different privacy levels on the uploaded data for teamplay Usage.

| Data Cluster | Standard privacy | High privacy | Restrictive |
|---|---|---|---|
| Device information | Kept | Kept | Kept |
| Institution information | Kept | Removed | Removed |
| Procedure description | Kept | Kept | Removed |
| Technical data | Kept | Kept | Kept |
| UIDs | Keyed hash | Keyed hash | Keyed hash |
| Time | Kept | Kept | Kept |
| Date | Kept | Kept | Reduced accuracy – month |
| Patient age | Reduced accuracy – years only | Reduced accuracy – 8 age clusters | Reduced accuracy – 8 age clusters |
| Patient characteristics (e.g. weight, gender) | Kept | Reduced accuracy – clustered weight and size values | Removed |
| Patient identifiers | Keyed hash | Keyed hash | Dummy |
| Other data (content sequence, metadata) | Cleaned | Cleaned | Cleaned |
| Institution personnel information* | Kept | Kept | Keyed hash |

Removed: the DICOM tags are removed.
Kept: the values are taken 1:1 from the DICOM tags to teamplay.
Keyed hash: the values are hashed using a keyed hash to reduce the information further and keep only what is needed for consistency.
Cleaned: the DICOM tags are retained and, within the value, the contained attributes are processed as defined in detail in the teamplay Privacy Concept.

For more details on data privacy and security within teamplay, please contact your local sales representative.

* There is an option to retain institution employee data (e.g. operator) from uploaded data. This option is disabled by default.

Effect of the chosen data privacy level on KPIs

To prevent potential re-identification in the data privacy level "Restrictive", the examination dates are normalized to the beginning of the month. In this case data can only be analyzed on a monthly basis, not per day. This restriction affects the accuracy of the KPI calculation and leads to different aggregation possibilities. Furthermore, in the privacy level "Restrictive" it is not possible to attribute exams to the same patient, so the number of exams and the number of patients will be the same. Certain KPIs cannot be calculated at all. The following KPIs are affected:

• "Number of Patients": shows the same results as the KPI "Number of Examinations"
• "Exams per Patient": always shows "1", since no differentiation between examinations and patients is possible
• "Patient Change Time": cannot be calculated due to the normalization to the beginning of the month
• "Exams per hour": is based on all exams of the month aggregated on the first day of the month. The calculation therefore cannot consider the variation between the different days of the month, and the result differs from the value obtained under the other privacy levels.
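The following minimal sketch illustrates two of the transformations described above: a keyed hash of identifiers (so values stay consistent across uploads without revealing the original) and the normalization of dates to the beginning of the month in the "Restrictive" level. The key and function names are illustrative, not the actual teamplay implementation.

```python
import hmac
import hashlib
from datetime import date

# Hypothetical secret; in practice such a key would be managed securely.
SECRET_KEY = b"example-key-not-the-real-one"

def keyed_hash(value: str) -> str:
    """Keyed hash (here HMAC-SHA256): the same input always maps to the
    same output, so exams can still be grouped consistently, but the
    original identifier is not uploaded."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def normalize_to_month(d: date) -> date:
    """'Restrictive' level: reduce date accuracy to the start of the month."""
    return d.replace(day=1)

print(keyed_hash("PAT-12345"))                 # stable pseudonym
print(normalize_to_month(date(2018, 10, 15)))  # -> 2018-10-01
```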
What is an examination in teamplay Usage?

Goal of examination based analytics in teamplay Usage

teamplay Usage intends to identify comparable (cross-vendor, cross-fleet) and reliable (automatically created) examination values out of the available DICOM date and time information. Workflow and vendor specific deviations shall be removed or at least minimized. Since the basis for the identification of an examination is the technical content of the DICOM image creation, the examination reflects the acquisition of the images and ignores any manual activities of the technologist before and after the acquisition.

How is the DICOM study content used?

As mentioned above, teamplay Usage processes the DICOM image header information of dedicated SOP classes. The supported DICOM SOP classes can be found in the chapter "Supported DICOM SOP classes".

To identify the examination with the acquisition time, teamplay Usage differentiates between series containing acquired images (scanning) and series containing post-processed images (reconstruction). The business logic uses certain DICOM indicators, such as the DICOM image type, to identify the acquisition series.

[Figure 1: teamplay Usage examination. The real examination – framed by the user clicking "Start exam" and "Close exam" at the scanner – consists of preparation, acquisitions, post-processing and finalization steps with pauses in between; the examination in teamplay Usage covers only the span from the first to the last acquisition.]

In some cases, an examination of one patient consists of multiple exams spread throughout the day. An example is the so-called "Stress and Rest" examination in Molecular Imaging (MI). The whole diagnostic procedure is distributed over two exams within 3–5 hours, and the acquired data are stored in a single DICOM study. For device utilization purposes, the DICOM study has to be split into two examinations to understand when the device was actually occupied.

[Figure 2: teamplay Usage examination for MI. The single DICOM study of a stress & rest exam is split at the examination break into two teamplay Usage examinations, each covering only its own acquisitions.]

Note: The examination definition in teamplay Usage shown in Figure 1 and Figure 2 is valid starting from January 2019. Before this date, the preparation time between loading the case on the device and the first acquisition was also included in the definition of an examination. In the case of an Ultrasound examination, the preparation time is still included in this definition due to the specifics reflected in DICOM studies.

[Figure 3: teamplay Usage examination before 2019, which additionally included the preparation time before the first acquisition.]

teamplay Usage KPIs

The following KPIs are based on the identification of the examination as described in the chapter "What is an examination in teamplay Usage?".

Number of patients

teamplay Usage is capable of counting unique patients. The KPI "Number of patients" counts the unique patients examined in the institution or on a dedicated device in the selected date range. Unique patients cannot be identified by teamplay Usage if the DICOM data upload is performed in the data privacy level "Restrictive".

Number of exams per patient

The "Examinations per patient" value provides the average number of examinations performed per patient. Since unique patients cannot be identified in the data privacy level "Restrictive", the KPI will always show the value "1" in that level.

Number of examinations

The KPI "Number of Examinations" is counted based on the identified teamplay examinations in the selected date / time range. DICOM studies with unsupported SOP class images / series are not counted.
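Referring back to Figures 1–3, the sketch below shows the idea of keeping only acquisition series and taking the span from the first to the last acquisition as the examination. Treating an ImageType beginning with "ORIGINAL" as acquired (vs. "DERIVED" as post-processed) is an assumption; the document only says that indicators such as the DICOM image type are used.

```python
# Sketch of the examination span, not the actual teamplay business logic.
from datetime import datetime
import pydicom

def acquisition_times(paths):
    """Yield acquisition datetimes of images that look acquired, not derived."""
    for path in paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        image_type = list(getattr(ds, "ImageType", []))  # (0008,0008)
        if image_type[:1] != ["ORIGINAL"]:
            continue  # skip reconstructions / post-processed series
        da = getattr(ds, "AcquisitionDate", None)  # (0008,0022), YYYYMMDD
        tm = getattr(ds, "AcquisitionTime", None)  # (0008,0032), HHMMSS[.ffffff]
        if da and tm:
            yield datetime.strptime(da + tm.split(".")[0], "%Y%m%d%H%M%S")

def exam_span(paths):
    """Examination: earliest to latest acquisition time within the study."""
    times = sorted(acquisition_times(paths))
    return (times[0], times[-1]) if times else None
```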
Exam duration

The "Examination duration" starts with the earliest DICOM acquisition date and time of the whole DICOM study; in the case of ultrasound examinations, the start of the examination is determined by the DICOM study date / time instead. The end of the examination is the end of the last acquisition series. The time between these time stamps is the calculated examination duration. See also Figure 1 and Figure 2.

Patient change time

The "Patient change time" is calculated between exams of different patients. The time between consecutive exams of the same patient on the same device is ignored. In the data privacy level "Restrictive", DICOM dates are normalized to the beginning of the month; therefore, this KPI cannot be calculated in that level.

[Figure 4: Identification of patient change time. The patient change time is the gap between the calculated examination of one patient and the next calculated examination of a different patient on the same device.]

Exams per hour

The KPI "Exams per hour" calculates the average number of exams per hour. teamplay Usage considers only hours in which at least one exam has been performed. In the data privacy level "Restrictive", DICOM dates are normalized to the beginning of the month; the KPI is therefore calculated for the date range of the whole month.

Calculated working hours

To avoid the continuous administration of working hours per device and weekday, and to compensate for typical variations over time, teamplay detects the working hours dynamically based on the performed examinations. The system calculates the start and end of the working hours for each day and device based on a dynamic threshold. The algorithm also accepts breaks of at most one hour with lower throughput during the day, such as lunch hours or shift changeovers; such breaks are part of the normal working hours (see Figure 5, with a lunch break at hour "13").

[Figure 5: Calculation of normal working hours. The exams per hour over the hours of the day (01–24) are compared against a threshold of 20% of the maximum rate; isolated exceptional hours are excluded, and a lower-throughput break such as lunch at hour "13" remains part of the normal working hours.]

Table occupancy and utilization

The KPI "Table occupancy" represents the time the patient is physically on the table. It shows the ratio of how much time the device is used compared to how long the device is available. The KPI "Utilization" represents the time the technologist is interacting with the patient. It shows the ratio of how much time the device is used, plus the preparation and aftercare of the patient, compared to how long the device is available. The KPI "Utilization" simulates this value using the shortest time between two patients as the reference value.

[Figure 6: KPIs Table occupancy and Utilization. For a working day filtered to the selected hours (8:00 AM to 4:59 PM), table occupancy counts only the exam times, while utilization additionally counts the simulated preparation and aftercare time around each exam, derived from the shortest patient change time.]
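The following is a minimal sketch of one plausible reading of these two ratios, following Figure 6. The exact teamplay formula is not published here; in particular, using the shortest gap between exams as the simulated preparation and aftercare time per exam is an assumption based on the "shortest time between two patients" reference value mentioned above.

```python
from datetime import datetime

def occupancy_and_utilization(exams, day_start, day_end):
    """exams: (start, end) datetime pairs, sorted by start time,
    all within the selected hours [day_start, day_end]."""
    available = (day_end - day_start).total_seconds()
    exam_time = sum((e - s).total_seconds() for s, e in exams)

    # Shortest gap between consecutive exams, used as the simulated
    # preparation + aftercare time per exam (assumption, see lead-in).
    gaps = [(exams[i + 1][0] - exams[i][1]).total_seconds()
            for i in range(len(exams) - 1)]
    shortest_gap = min(gaps) if gaps else 0.0

    table_occupancy = exam_time / available
    utilization = min(1.0, (exam_time + shortest_gap * len(exams)) / available)
    return table_occupancy, utilization

# Example day with four exams between 8:00 AM and 5:00 PM
day = datetime(2018, 11, 19)
exams = [
    (day.replace(hour=8, minute=15), day.replace(hour=9, minute=0)),
    (day.replace(hour=9, minute=30), day.replace(hour=10, minute=15)),
    (day.replace(hour=11, minute=0), day.replace(hour=12, minute=0)),
    (day.replace(hour=13, minute=30), day.replace(hour=14, minute=30)),
]
occ, util = occupancy_and_utilization(
    exams, day.replace(hour=8), day.replace(hour=17))
print(f"Table occupancy: {occ:.0%}, Utilization: {util:.0%}")
```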
Contrast agent exams reflected in KPIs

The differentiation between exams applying contrast agent and native exams (exams which are not contrast enhanced) is an important aspect when comparing KPI values. Contrast enhanced exams typically require more patient preparation work and are performed differently. Therefore, a differentiation of the KPI values into exams with and without contrast agent is available for certain KPIs. The user can activate this view and analyze the respective examinations in the supported charts.

Supported KPIs:
• Patients page: Scanner distribution – Patient change time
• Examinations page: Number of exams over time; Scanner distribution – Number of exams; Scanner distribution – Examination duration

teamplay uses the DICOM tag Contrast / Bolus Agent (0018,0010) to identify contrast enhanced examinations.

[Figure 7: Example for the visualization of the number of exams over time, split into exams with and without contrast agent (CA), with hour / day / week / month aggregation.]

[Figure 8: Example for the visualization of the number of exams per scanner, split into exams with and without contrast agent (CA).]

Benchmarking

Benchmarking² compares the performance of KPI values between imaging devices with dedicated characteristics. Internal benchmarking compares KPI values of two or more devices within an institution; external benchmarking is typically done across institutions. In this document, benchmarking refers to external benchmarking.

Benchmarking is …
• … comparing key metrics with the "Best-in-Class"
• … used to identify areas of improvement (looking out of the box)
• … a point of reference
• … anonymous

Calculation of benchmark value

The benchmark values are intended to be strictly anonymous with respect to patients and institutions. In addition to anonymization, the benchmark values should remove seasonal and temporary effects. The following criteria are used to achieve compliance with these goals:

• Benchmarking KPI: Examination duration per scanner
• Considers the last 12 months
• Average of the top 10%
• Calculated per modality type (e.g. CT, MR) and per manufacturer model (e.g. Perspective)
• Filter by institution type
• Privacy control

Benchmarking functionality

Benchmarking is currently available for the KPI "Examination duration" and is calculated per modality type (e.g. CT, MR) and per manufacturer model. The modality type based benchmark value reflects the best 10% of the devices of a specific modality type, independent of the vendor or specific model. The manufacturer model based value represents the best 10% of a specific device model. In order to narrow down the devices of interest for the benchmark values, an institution type filter criterion can be applied; the filter options match the teamplay institution types Imaging Center, Hospital and University. If the number of devices is below a certain threshold and privacy rules would be compromised, no benchmark value is calculated or shown in the respective chart. This can lead to a scenario where the modality type benchmark is shown but the manufacturer model value is not available.
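A minimal sketch of the benchmark calculation as described above: the average of the best 10% of per-device examination durations over the last 12 months, suppressed when too few devices are available. The privacy threshold value and the function names are hypothetical; the document only says "below a certain threshold".

```python
from statistics import mean

# Hypothetical privacy threshold; the actual number is not published.
MIN_DEVICES_FOR_BENCHMARK = 10

def benchmark_value(avg_exam_durations_min):
    """Benchmark = average of the best (shortest) 10% of per-device average
    examination durations over the last 12 months. Returns None when too
    few devices are available and privacy rules would be compromised."""
    if len(avg_exam_durations_min) < MIN_DEVICES_FOR_BENCHMARK:
        return None  # benchmark suppressed, not shown in the chart
    ranked = sorted(avg_exam_durations_min)   # best = shortest duration
    top = ranked[:max(1, len(ranked) // 10)]  # top 10% of devices
    return mean(top)

# Example: per-device average exam durations (minutes) for one modality type
durations = [12.5, 14.0, 9.8, 11.2, 10.4, 13.1, 15.6, 10.9, 12.0, 11.7, 9.5]
print(benchmark_value(durations))  # average of the best ~10% of devices
```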
Recommendations

Reliable and comparable key performance indicators

In order to get comparable KPIs in teamplay Usage, the most important process step is the start of the examination. Since this is the only manually triggered event used to identify the examination duration, an alignment between the different devices is recommended. The start of the examination should be initiated as consistently as possible to ensure reliable and comparable KPIs.

Another important aspect is to consider the working hours and working days. Certain KPI values vary significantly depending on whether the KPI is calculated over the working hours or over 24 / 7 (see the sketch at the end of this chapter). If KPI values deviate from your experienced performance, please check the filter settings used to calculate the KPIs and charts.

Missing data in teamplay Usage

Missing numbers of examinations may occur due to incomplete uploading of data or the exclusion of uploaded data during processing. Incomplete or wrong DICOM content would potentially distort the calculated KPIs; in such cases it is important to ignore the affected examinations. Such sanity checks ensure that consistent values and results are provided. teamplay Usage provides information about ignored examinations: the data quality index calculated per device can be found in the data sheet of the device in the scanner detail view, accessible via the "Scanner List". In case data appears to be missing but the data quality index does not indicate that data was excluded during processing, the upload process and the DICOM content (e.g. correct / supported DICOM SOP classes) should be verified. See also the chapter "What is the cause of data discrepancy or inconsistency between the RIS / PACS and teamplay?" in the Appendix.

Interpretation of benchmark values

Always make sure you understand where the data comes from and what it represents before comparing it to your situation and / or interpreting it. Make sure you are comparing the right values, assets and outcomes in an objective way.
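To illustrate how the working-hours filter can change a KPI value, the sketch below computes "Exams per hour" as defined earlier (counting only hours in which at least one exam was performed), once over 24 / 7 and once restricted to working hours. The function and the sample data are illustrative only.

```python
from collections import Counter
from datetime import datetime

def exams_per_hour(exam_starts, hours=None):
    """Average number of exams per hour, counting only hours in which at
    least one exam was performed; 'hours' optionally restricts the
    calculation to working hours (e.g. range(8, 17))."""
    starts = [t for t in exam_starts if hours is None or t.hour in hours]
    per_hour = Counter((t.date(), t.hour) for t in starts)
    return sum(per_hour.values()) / len(per_hour) if per_hour else 0.0

# Hypothetical exam start times over one day
starts = [datetime(2018, 11, 19, h, m) for h, m in
          [(8, 10), (8, 50), (9, 30), (11, 5), (14, 20), (20, 45)]]
print(exams_per_hour(starts))                # 24/7: all hours with exams
print(exams_per_hour(starts, range(8, 17)))  # working hours 8:00-16:59 only
```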
Appendix

Supported DICOM SOP classes

1.2.840.10008.5.1.4.1.1.1 CR Image Storage
1.2.840.10008.5.1.4.1.1.1.1 Digital X-Ray Image Storage – for Presentation
1.2.840.10008.5.1.4.1.1.1.1.1 Digital X-Ray Image Storage – for Processing
1.2.840.10008.5.1.4.1.1.1.2 Digital Mammography X-Ray Image Storage – for Presentation
1.2.840.10008.5.1.4.1.1.1.2.1 Digital Mammography X-Ray Image Storage – for Processing
1.2.840.10008.5.1.4.1.1.1.3 Digital Intra-oral X-Ray Image Storage – for Presentation
1.2.840.10008.5.1.4.1.1.1.3.1 Digital Intra-oral X-Ray Image Storage – for Processing
1.2.840.10008.5.1.4.1.1.12.1 X-Ray Angiographic Image Storage
1.2.840.10008.5.1.4.1.1.12.2 X-Ray Radiofluoroscopic Image Storage
1.2.840.10008.5.1.4.1.1.128 Positron Emission Tomography Image Storage
1.2.840.10008.5.1.4.1.1.2 CT Image Storage
1.2.840.10008.5.1.4.1.1.20 NM Image Storage
1.2.840.10008.5.1.4.1.1.3.1 Ultrasound Multiframe Image Storage
1.2.840.10008.5.1.4.1.1.4 MR Image Storage
1.2.840.10008.5.1.4.1.1.2.1 Enhanced CT Image Storage
1.2.840.10008.5.1.4.1.1.4.1 Enhanced MR Image Storage
1.2.840.10008.5.1.4.1.1.4.2 MR Spectroscopy Storage
1.2.840.10008.5.1.4.1.1.4.3 Enhanced MR Color Image Storage
1.2.840.10008.5.1.4.1.1.6.1 Ultrasound Image Storage
1.2.840.10008.5.1.4.1.1.6.2 Enhanced US Volume Storage
1.2.840.10008.5.1.4.1.1.12.1.1 Enhanced XA Image Storage
1.2.840.10008.5.1.4.1.1.12.2.1 Enhanced XRF Image Storage
1.2.840.10008.5.1.4.1.1.13.1.3 Breast Tomosynthesis Image Storage
1.2.840.10008.5.1.4.1.1.130 Enhanced PET Image Storage

What is the cause of data discrepancy or inconsistency between the RIS / PACS and teamplay?

The following stages of data processing (originally shown as a data flow chart) describe how studies can be cleaned or dropped before they become visible in teamplay Usage.

[Figure: data flow from the studies in the PACS, via retrieval and processing by the teamplay Receiver and upload to the cloud, to the relevant studies visible in Usage and Dose]

• Missed studies: studies that come late into the PACS and are not "caught" by the nightly query, or studies that cannot be retrieved after 10 retries.
• Bad images (invalid DICOM files): studies that cannot be anonymized (DICOM tags that cannot be parsed or are malformed).
• Corrupted zip containers: zip containers that got corrupted either during the packaging process or during the upload to the cloud (about 1 part per million).
• Discarded usage studies: studies that contain only DICOM objects with unsupported SOP classes or have no image type attribute.
• Invalid usage studies: studies that do not contain information that can be used by Usage (no acquisition data), or whose information is not valid (overlapping series, exam time too long / short, etc.).
• Unknown black images: studies that cannot be processed by Dose (e.g. black images on which the OCR has not been trained).

The following circumstances may result in inconsistencies between your RIS / PACS and your teamplay account:

• The imaging device sends DICOM data to the PACS that is not supported by teamplay Usage (see the teamplay DICOM Conformance Statement for supported DICOM data). teamplay Usage does not support PDF, JPEG and other non-DICOM data.
• The imaging device is not sending acquisition data. teamplay Usage processes only data that correlates with an actual examination. For example, if the imaging data is post-processed and only VRT or MPR series are sent to the PACS, the study will not be processed by teamplay Usage.
• The teamplay sanity check verifies all incoming data and rejects exams that exceed the following durations as "invalid" (see the sketch after the lists below): 5 hours for X-Ray Angiography (XA) or Radio Fluoroscopy (RF); 3 hours for anything else (e.g. CT, MR, PT, US, MG, DX, CR, …).
• The hospital network is inaccessible at the time the teamplay Receiver performs a query-retrieve from the PACS system.

We recommend verifying the following hints to help increase data consistency:

• Make sure that DICOM data is sent from the modalities to the PACS.
• Use the query-retrieve functionality to get data from the PACS.
• If your PACS does not allow a query-retrieve, set up auto-routing from the PACS to the teamplay Receiver.
• Make sure the imaging device sends acquisition data to the PACS, for example a topogram or axial images from a CT.
• In the case of a network breakdown, the teamplay Receiver will try up to 10 times to retrieve the correct data from the PACS once the network is accessible again.
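The sanity check on exam duration translates directly into a small rule. The duration limits below are those stated in the list above; the function itself is an illustrative sketch, not the actual teamplay code.

```python
from datetime import timedelta

# Maximum plausible exam durations used by the teamplay sanity check.
MAX_DURATION = {
    "XA": timedelta(hours=5),  # X-Ray Angiography
    "RF": timedelta(hours=5),  # Radio Fluoroscopy
}
DEFAULT_MAX_DURATION = timedelta(hours=3)  # CT, MR, PT, US, MG, DX, CR, ...

def is_valid_exam(modality: str, duration: timedelta) -> bool:
    """Reject exams whose duration exceeds the plausible maximum."""
    return duration <= MAX_DURATION.get(modality, DEFAULT_MAX_DURATION)

print(is_valid_exam("XA", timedelta(hours=4)))  # True  (limit is 5 h)
print(is_valid_exam("CT", timedelta(hours=4)))  # False (limit is 3 h)
```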
¹ teamplay Usage is not commercially available in all countries. If the services are not marketed in countries due to regulatory or other reasons, the service offering cannot be guaranteed. Please contact your local Siemens organization for further details.

² Availability of the Benchmark option depends on a minimum number of considered subscribers to guarantee customer anonymity and protection. Please check if teamplay is available in your country.

Siemens Healthineers Headquarters: Siemens Healthcare GmbH, Henkestr. 127, 91052 Erlangen, Germany, Phone: +49 9131 84-0, siemens-healthineers.com
Legal Manufacturer: Siemens Healthcare GmbH, Henkestr. 127, 91052 Erlangen, Germany
Published by Siemens Healthcare GmbH · Order No. P08-007-DS.627.15.01.02 · © Siemens Healthcare GmbH, 2019
