Lab A (or multiple labs A*) creates a set of compounds: the basic nanomechanical building-block molecules.

Labs B* create a DNA-derived blueprint that can assemble the building blocks to form precursors of individual nanomachine parts.

Labs C* create a nutrient solution used as fuel for the machines.

Daniel, a lab worker at Lab D, is paid $1.5M to assemble everything shipped to him from labs A*, B*, and C*. He does it using his company's equipment, off-hours, without anyone's knowledge, because he's just an assistant and needs the money.

None of the labs has any knowledge of what is being built, but when Daniel follows the instructions, he creates a self-sustaining, self-replicating mass of goo that:

a) kills everyone on the planet, by generating viruses, or dissolving people directly, or whatever.
b) kills only some people on the planet, such as non-believers in Faith Q (an absolutely arbitrarily chosen letter, that), or particular ethnic groups, or folks with blue eyes, or whatever.
c) dissolves all of the <insert economic driver> in a given country, or hemisphere, or continent, or whatever.
d) accidentally or intentionally dissolves the planet itself, because the AI is just a probabilistic model that's following a prompt entered by some dipshit with an Internet connection who linked it up to Langchain and went out for pizza.
e) do we need any more scenarios?