Accepted for/Published in: JMIR Formative Research

Date Submitted: Dec 5, 2022
Open Peer Review Period: Nov 24, 2022 - Jan 19, 2023
Date Accepted: May 15, 2023

The final, peer-reviewed published version of this preprint can be found here:

Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study

Alpers R, Kühne L, Truong HP, Zeeb H, Westphal M, Jäckle S

JMIR Form Res 2023;7:e44549

DOI: 10.2196/44549

PMID: 37368487

PMCID: 10337391

Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study

  • Rieke Alpers; 
  • Lisa Kühne; 
  • Hong-Phuc Truong; 
  • Hajo Zeeb; 
  • Max Westphal; 
  • Sonja Jäckle

ABSTRACT

Background:

During the COVID-19 pandemic, local health authorities in Germany were responsible for managing and reporting current cases. Since March 2020, their employees have had to focus on the pandemic response. In the EsteR project, we implemented existing and newly developed statistical models as decision support tools to assist the work of local health authorities.

Objective:

The two main goals of this work were to investigate the stability of the answers provided by our statistical tools with respect to their model parameters and to evaluate the usability and applicability of our web application.

Methods:

For the model stability assessment, a sensitivity analysis was carried out for all five statistical models. The default parameters of our models, as well as the ranges within which the parameters were tested, were based on a prior literature review of COVID-19 properties. For the usability evaluation of the web application, cognitive walkthroughs and focus group interviews were conducted with employees of two different local health authorities.
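A sensitivity analysis of this kind can be sketched as a one-at-a-time parameter sweep: each parameter is varied across its literature-based range while the others are held at their defaults, and the spread of the model output is recorded. The model function, parameter names, defaults, and ranges below are hypothetical stand-ins for illustration only, not the actual EsteR models or the values from the literature review:

```python
# One-at-a-time sensitivity sweep (illustrative sketch, not the EsteR code).
# Each parameter is varied over its range while the others stay at default;
# the output spread per parameter indicates how sensitive the model is to it.

def model(incubation_mean, detection_prob):
    # Toy stand-in for one of the statistical models (hypothetical).
    return detection_prob / incubation_mean

defaults = {"incubation_mean": 5.5, "detection_prob": 0.8}
ranges = {"incubation_mean": (4.0, 7.0), "detection_prob": (0.6, 0.95)}

def sensitivity(n_steps=10):
    results = {}
    for name, (lo, hi) in ranges.items():
        outputs = []
        for i in range(n_steps):
            params = dict(defaults)
            params[name] = lo + (hi - lo) * i / (n_steps - 1)
            outputs.append(model(**params))
        # Spread of the output over the tested range: large spread means
        # the model is sensitive to this parameter.
        results[name] = max(outputs) - min(outputs)
    return results
```

A model would then be rated as stable in a parameter region where these spreads stay small relative to the decision thresholds of the use case.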

Results:

The simulation results showed that some statistical models are more sensitive to changes in their parameters than others. For each single-person use case, we identified a parameter region in which we rate the respective model as stable, whereas for the group use cases, stability depends strongly on the user inputs. The cognitive walkthroughs and focus group interviews revealed that the user interface had to be simplified and that more guiding information was needed. In general, the testers rated the web application as helpful, especially for new employees.

Conclusions:

This evaluation study allowed us to refine our toolkit. Through the simulations, we identified suitable model parameters and analyzed how stable the statistical models are with respect to changes in their parameters. Furthermore, the web application was improved based on the results of the cognitive walkthroughs and focus group interviews.




© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.