The ever-increasing processing power of electronic devices means they are changing from helpers into decision makers, and a wrong decision can have a major impact. The obvious example is a driverless car, where the ADAS (Advanced Driver Assistance System) makes all the decisions; nothing is ever perfect, so fatalities will occur.
Humans are also far from perfect and make fatal mistakes. So, if a million car journeys with human drivers result in an average number of accidents, and a similar number of journeys with driverless cars results in fewer, then logically the driverless cars are the safer option. Part of humans' imperfection, however, is that they are also illogical, so the tolerance for any driverless-car accidents is near zero.
Programming the ADAS to make the right decision is therefore a daunting task: swerve right and risk killing two children, or swerve left and risk killing two adults? It is like the Kobayashi Maru training exercise in the fictional Star Trek universe, designed to test the character of Starfleet Academy cadets in an unwinnable scenario. Spoiler alert: Kirk cracked it by reprogramming the simulator. There is no such cheat option for the software of a driverless car.
One thing that can be done is to ensure the system acts predictably in every situation and, if problems occur, that they are detected and appropriate action is reliably taken. Known as Functional Safety (FuSa), this is increasingly important for many application areas where AI is making critical decisions, and it is codified for automotive in ISO 26262.
One example of how a system's behaviour can become unpredictable is radiation. Even outside the radiation-intense environment of space, a stray radiation particle can cause a bit flip in a chip, changing a 0 to a 1, for example. One method to spot this is to have two sensors feeding into a dual lockstep processor: two cores execute the same instructions in parallel and a checker circuit verifies that the results are identical. If they are not, an error has occurred which, depending on the importance of the circuit, can either trigger a warning light or instigate a system reset.
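The comparison-and-react step can be sketched in a few lines of Python. This is an illustrative model only: the function name, the brake-command value, and the injected bit flip are assumptions for the example, and in real silicon the comparison is done by a hardware checker, not software.

```python
def lockstep_compare(result_a, result_b, on_mismatch):
    """Compare the outputs of two redundant lockstep channels.

    Two cores run the same instruction stream on the same inputs;
    if their outputs disagree, a fault has occurred and the safe
    action (warning light, system reset) is taken instead of
    trusting either result.
    """
    if result_a != result_b:
        on_mismatch()
        return None          # neither result can be trusted
    return result_a

faults = []
brake_a = 0b0011_0010                     # hypothetical brake command
brake_b = brake_a ^ (1 << 4)              # radiation-induced bit flip in channel B
safe = lockstep_compare(brake_a, brake_b, lambda: faults.append("reset"))
```

Here `safe` comes back as `None` and the reset handler has fired; with two agreeing channels the value would pass through unchanged.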
Another technique is ECC (Error Correction Code, or Error Correcting Code). When data is moved around a chip, additional redundant bits are added that allow the integrity of the data to be mathematically checked. A flipped bit, caused perhaps by an alpha particle, can then be identified and even safely flipped back, as it is highly unlikely that two bits have been simultaneously flipped in the same data word.
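A minimal sketch of the idea, using the classic Hamming(7,4) code: three parity bits protect four data bits, and the parity-check "syndrome" points directly at any single flipped bit so it can be flipped back. Real on-chip ECC uses wider SECDED codes over 32- or 64-bit words (and can detect, though not correct, double flips), but the principle is the same.

```python
def hamming74_encode(nibble):
    """Encode a 4-bit value into a 7-bit Hamming(7,4) codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]   # d[0] is the LSB
    code = [0] * 8                              # 1-indexed positions 1..7
    # Data bits go in the non-power-of-two positions.
    code[3], code[5], code[6], code[7] = d[0], d[1], d[2], d[3]
    # Each parity bit covers the positions whose index includes its bit.
    code[1] = code[3] ^ code[5] ^ code[7]
    code[2] = code[3] ^ code[6] ^ code[7]
    code[4] = code[5] ^ code[6] ^ code[7]
    return code[1:]                             # 7-bit codeword as a list

def hamming74_decode(code7):
    """Return (data, syndrome); syndrome is the position of a corrected
    single-bit error, or 0 if the codeword arrived intact."""
    code = [0] + list(code7)                    # restore 1-indexing
    s1 = code[1] ^ code[3] ^ code[5] ^ code[7]
    s2 = code[2] ^ code[3] ^ code[6] ^ code[7]
    s4 = code[4] ^ code[5] ^ code[6] ^ code[7]
    syndrome = s1 + 2 * s2 + 4 * s4             # = position of the flipped bit
    if syndrome:
        code[syndrome] ^= 1                     # safely flip it back
    data = code[3] | (code[5] << 1) | (code[6] << 2) | (code[7] << 3)
    return data, syndrome
```

Flipping any one of the seven bits of a codeword produces a non-zero syndrome equal to the flipped position, and the decoder recovers the original nibble.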
These techniques, along with many others, can also be used to make chips safer in other safety-critical application areas such as medical or aviation. That could even include smart homes, as no-one wants their AI to start behaving like HAL 9000 in 2001: A Space Odyssey with "I'm afraid I can't open the front door, Dave."