George’s PhD research investigates trust, and the calibration of trust, in Connected and Automated Vehicles (CAVs) by determining the factors that affect users’ trust and examining how the vehicle’s situational awareness can be used to calibrate human trust.
One of the most interesting outcomes needing further exploration concerns transparency and trust within the CAV environment.
Risks such as hacking, tracking, spoofing, and jamming raise important questions about the transparency of information. Should users be informed of hacking attacks in real time? Would that affect their trust and acceptance? Should they instead be informed via a “weekly digest”, or should the information simply be available through sub-menus?
Striking the right balance between what information is relayed to CAV users, and when, is of crucial importance.
Therefore, George will use this impact grant to: