JB: I think inevitably, when we talk about Alexa in the boardroom, there's going to be the question of, okay, this is a listening device. Where and how data is stored in that kind of scenario will be incredibly important. If Alexa is listening to board meetings, [which discuss] highly sensitive information, [it's] incredibly risky where that might end up and how it's encrypted, etc. I think [the board would] probably go towards on-site data storage for anything like that.
We first think of the challenges of where information might be stored, and by whom. Use of the cloud could pose real issues from a data security and wider risk management perspective. If and how anything is recorded for the purposes of providing those services, that's likely to cause concern: could that information be discoverable to regulators, for example? Or to opposing legal advisers? Could it, by [virtue of] its existence, supplant minutes as an official board record on a de facto basis? Of course not, you wouldn't want it to, but could it almost become, de facto, something which supports the minutes? Which wouldn't be in accordance with what we see as good governance practice.
I think there is potential for over-reliance on technology in decision-making as well in these kinds of scenarios. If we think of other potential technologies in the future, you could have crowdsourcing, or real-time analytics and feedback, maybe standing panels representing stakeholders whose feedback can be sought and played back through data analytics in real time in the boardroom. You could use artificial intelligence (AI) for management information analysis or minutes analysis, i.e., looking for tone, or the number of times a stakeholder is referenced, for example, or diving deeper into information, etc. In those kinds of scenarios, if you have those other potential future developments, then you could have an over-reliance on technology in decision-making. We can imagine board meetings becoming quite clunky at the very least: asking Alexa, speaking to a chatbot, asking a panel for feedback, stopping to understand some data analytics that have just come through. Discussion could become limited by deferring to technological solutions that are supposed to enhance debate, but instead end up consuming time and being taken at face value.
You could have a risk around shadow directors and outsourcing decisions. These kinds of technologies can also dilute accountability, in theory. The scenario of, 'we thought this as a board, or I thought this as a director, but then the technology said that, and so we felt compelled to go with it.' The more tools and information a board has at its disposal, the less agency some directors may feel they have, and in poor-performing boards, you can see these kinds of things being used as an excuse.
The last thing I'd probably say as a risk there is pressure on time and agendas; boards are under enough time pressure already. Going back to the clunkiness point, bringing in technological solutions may take up valuable time in the boardroom, or for directors overall outside of the boardroom. I would just say, in thinking about all of these things, what we shouldn't be doing, in my opinion, is looking for solutions for solutions' sake. We don't want bandwagons that end up [with] the implications not being understood. Instead, we need to be careful to identify opportunities for real value-add and think carefully about how and when those tools are used. And that's a subjective thing for each organisation. Just because the guys over there are doing something doesn't mean that necessarily works for you, given where you are, the risks you're facing, everything that's on your agenda. It might just be that you prioritise in a different way.