This is an excellent example (artificial intelligence is another) of hairless apes doing really dangerous and stupid stuff just because we can.
We will be lucky to survive it.
If there is any benefit to gain-of-function research, it should probably be carried out off-world: either in a high orbit where the station's airlocks and environmental systems can be purged to space, or in a low orbit where the failsafe is an accelerated, fiery re-entry.
High-level AI research would likewise be safer in space, and possibly in remote locations on Earth, but the failsafes there would have to revolve around data transmission rates, distance to receivers, and the available set of manipulatable tools. There must be zero probability of the sentient program being able to transmit itself out of the study area, or to fabricate a device on which it could smuggle copies out. You'd need a system whereby no device capable of holding more than, say, 1/1000th of the total size of the program could ever come within its potential grasp...
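That last capacity rule can be sketched as a simple admission check. The program size, the 1/1000th threshold, and the function name below are all illustrative assumptions for the sake of the example, not any real containment standard:

```python
# Sketch of the storage-capacity rule described above: a device is
# admitted to the study area only if it could hold less than 1/1000th
# of the program, so no single device carried out of the area could
# contain a complete copy.

PROGRAM_SIZE_BYTES = 500 * 10**12  # hypothetical 500 TB program
CAPACITY_FRACTION = 1 / 1000       # the "1/1000th" threshold from the text

def device_admissible(device_capacity_bytes: int) -> bool:
    """Return True if the device is small enough to enter the study area."""
    return device_capacity_bytes < PROGRAM_SIZE_BYTES * CAPACITY_FRACTION

print(device_admissible(256 * 10**9))  # 256 GB drive: under the 500 GB limit
print(device_admissible(1 * 10**12))   # 1 TB drive: over the limit, refused
```

Of course, the hard part isn't the arithmetic; it's guaranteeing that nothing inside the area can fabricate or repurpose storage the checkpoint never saw.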