If You Can’t Get Them to the Lab
Evaluating a Virtual Study Environment with Security Information Workers

<To be included upon publication>


This is the companion website for our publication presenting OLab, a virtual environment for conducting online experiments and studies. OLab provides a remote desktop configurable via Dockerfiles, enabling UX testing in complex IT and IT-security experiments.

As part of this publication, we include a requirements analysis and a comparison with other remote study approaches, as well as an iterative evaluation using three distinct studies. To demonstrate OLab's distinctive capabilities, we present a unique study setup involving DevOps engineers administrating a server.


Usable security and privacy researchers use many study methodologies, including interviews, surveys, and laboratory studies. Of those, lab studies allow for particularly flexible setups, including programming experiments or usability evaluations of software. However, lab studies also come with challenges: it is often difficult to recruit enough skilled participants for in-person studies, and researchers studying security information workers in particular have reported such recruitment challenges in the past. Additionally, situations like the COVID-19 pandemic can make in-person lab studies even more challenging. Finally, institutions with limited resources may not be able to conduct lab studies at all.

Therefore, we present and evaluate a novel virtual study environment prototype, called OLab, that allows researchers to conduct lab-like studies remotely using a commodity browser. Our environment overcomes the challenges of lab-like studies and supports flexible setups and comprehensive data collection. In an iterative engineering process, we designed and implemented a prototype based on requirements we identified in previous work and conducted a comprehensive evaluation, including a cognitive walkthrough with usable security experts, a guided and supervised online study with DevOps engineers, and an unguided and unsupervised online study with computer science students. We confirm that our prototype supports a wide variety of lab-like study setups, and it received positive feedback from all study participants.

Developer Studies

To investigate the requirements and features such an online environment has to provide, we collected and categorized developer studies from the last six years of research, focusing on popular, top-tier publications and their results. The following shows an overview of our results, with more detail provided in the paper.

Related Work Overview Table

This table shows the results of our analysis. It demonstrates that many of the studies with larger numbers of participants were conducted using remote, browser-based setups, and it highlights the popularity of certain data collection approaches, of certain tools for development and security tasks, and of Python as a programming language. We provide a detailed overview in the paper.


To implement our prototype of OLab, we considered two main workflows in our study platform: the participant workflow and the researcher workflow, each consisting of six steps.

Participant Workflow

  1. Receive Invite: Invitees can participate in a study remotely by accessing a (unique) invite URL with an HTML5-capable commodity browser on a desktop or laptop computer, using a sufficiently stable internet connection (validated for 8.0 Mbit/s downlink and 0.8 Mbit/s uplink).
  2. Landing Page & Consent Form: After clicking the invite URL, OLab forwards invitees to a landing page showing study information and a consent form (cf. Screenshot 1).
  3. Briefing: After giving consent, OLab presents participants a full study description, including an introduction to the study environment (cf. Screenshot 2).
  4. Solving Tasks: Participants are encouraged to work on tasks in full-screen mode, look up the study and task descriptions with a mouse click, skip a current task, or finish the entire study. OLab aims to provide a working experience as close to a regular desktop environment as possible (cf. Screenshot 3).
  5. Survey Questionnaires: At any point in a study, OLab allows researchers to forward participants to external websites, including surveys (e. g., using Qualtrics).
  6. Debriefing & Exit: After solving all tasks, OLab allows researchers to forward participants to an exit survey and a debriefing website.

Researcher Workflow

  1. Setup Study Environment: During the study setup, researchers can freely choose operating systems, applications, tools, file access, and connection control.
  2. Setup Tasks & Conditions: OLab supports within-subjects, between-subjects, and mixed studies. Tasks and conditions can be randomized or arranged using the Latin squares method.
  3. Scaling: OLab is based on a highly scalable Kubernetes cluster and allows researchers to run studies in different geographical regions with many concurrent participants to optimize connection speeds and scale available environments.
  4. Generate Invites: OLab supports individual invite tokens for participants, forgoing the need to save participants’ personally identifiable information (PII). Invite tokens can be used to track participants across other services (e. g., Amazon’s MTurk).
  5. Study Progress: Researchers can track the study progress and modify and manage scaling options using a dashboard.
  6. Data Access: After study completion, researchers can gather the collected data (e. g., specific study metrics, metadata, and questionnaire answers) with a mouse click.
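The Latin square arrangement mentioned in step 2 of the researcher workflow can be illustrated with a small sketch. This is our own minimal example of cyclic Latin square ordering for counterbalancing task order, not OLab's actual implementation:

```python
def latin_square_order(tasks, participant_index):
    """Return a task order for one participant using a cyclic Latin square.

    Row i of an n x n cyclic Latin square is the task list rotated by i,
    so across n consecutive participants every task appears exactly once
    in every position, counterbalancing ordering effects.
    """
    n = len(tasks)
    shift = participant_index % n
    return tasks[shift:] + tasks[:shift]

# Illustrative task names (hypothetical, not from the study):
tasks = ["symmetric-encryption", "key-derivation", "signature"]
orders = [latin_square_order(tasks, i) for i in range(len(tasks))]
```

Across the three generated orders, each task occupies each position exactly once.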


To evaluate OLab, we conducted three studies. The first was an expert walkthrough with other usable security researchers, covering aspects like the UX of OLab. After completing this walkthrough and incorporating the experts' feedback into our prototype, we conducted two studies with end users.

In the first end-user study, we set up a virtual file share (Samba) and database (MySQL) server, which we then provided to a system administrator to debug in an interview-like setting. This setup is intended to demonstrate the unique capabilities of a technology like OLab, since this type of setup would usually be very difficult to conduct remotely. Using the feedback from this guided approach, we addressed a few misconceptions in our task display and the iconography of OLab. OLab was well received by participants and worked on almost all broadband connections (everything above 8.0 Mbit/s downlink and 0.8 Mbit/s uplink).
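A debugging task of this kind might plausibly begin with checking whether the broken services are reachable at all. The following is our own minimal sketch of such a connectivity check (the service names and default ports are our assumptions for illustration, not OLab tooling):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False

# Default ports for the services in a Samba/MySQL setup like the study's:
SERVICES = {"samba": 445, "mysql": 3306}

def check_services(host: str) -> dict:
    """Map each service name to whether its port is reachable on host."""
    return {name: port_open(host, port) for name, port in SERVICES.items()}
```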

For our second study, we provided a full development environment (PyCharm) with a Python installation and a Python programming task to a set of developers. In a setup similar to "Comparing the Usability of Cryptographic APIs", we tasked developers with implementing functionality using a cryptographic API. Since this was an evaluation of OLab, we evaluated only a single cryptographic API; in a within-subjects round-robin setup, we asked participants to solve the task once in our environment and once in a more classical remote study setup using a website and a manual download of the task. While some participants mentioned missing their "usual" programming setup in OLab, most were impressed by the OLab study flow, the ease of access with the environment and required libraries already set up, and a code environment ready to go. We further found that participants followed our instructions more closely in OLab due to the study-specific interface.
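To give a flavor of such a cryptographic programming task, here is a minimal, self-contained sketch using only Python's standard library. The study itself used a specific cryptographic API that we do not reproduce here; the function names and parameters below are our own illustrative choices:

```python
import hashlib
import hmac
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    """Derive a 32-byte key via PBKDF2-HMAC-SHA256 (iteration count illustrative)."""
    return hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

def authenticate(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over message with the derived key."""
    return hmac.new(key, message, hashlib.sha256).digest()

salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
tag = authenticate(key, b"study submission")
# Constant-time comparison to verify the tag:
assert hmac.compare_digest(tag, authenticate(key, b"study submission"))
```

A task in this style exercises both API discovery (choosing the right primitive) and secure parameter selection, which is what the cited comparison study measured.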


Replication Package

To make our study reproducible and allow for easy access for meta-research, we publish a replication package containing the following documents. It is hosted on the research data repository of the Leibniz University Hannover.

Table of Contents:

Guided DevOps Study & Comparison Experiment

  • Condition design
  • Consent form
  • Scenario description
  • Tasks & Task descriptions
  • Study procedure protocol
  • Between task surveys & exit survey

Comparison Experiment specific

  • Instructions for
    • Download Environment
    • OLab Environment

@inproceedings {conf/soups/huaman22,
  author = {Nicolas Huaman and
Alexander Krause and
Dominik Wermke and
Jan H. Klemmer and
Christian Stransky and
Yasemin Acar and
Sascha Fahl},
  title = {If You {Can{\textquoteright}t} Get Them to the Lab: Evaluating a Virtual Study Environment with Security Information Workers},
  booktitle = {Eighteenth Symposium on Usable Privacy and Security, SOUPS 2022, Boston MA, USA, August 8-9, 2022},
  year = {2022},
  address = {Boston, MA},
  url = {https://www.usenix.org/conference/soups2022/presentation/huaman},
  publisher = {USENIX Association},
  month = aug,
}