@@ -3,3 +3,3 @@ As a corollary to [[quantum immortality]], events which would kill all humans, s
* Python has bad dependency management because all ML code is written in it. If it were good, we would have AGI.
-* RL doesn't work stably or reliably because it would be too powerful.
+* RL doesn't work stably or reliably because it would be too powerful; imitation learning is less likely to do "weird things".
* LLMs are what we got because they are slow to develop ([[scaling laws]]) and can do some useful tasks but are bad at [[agentic]] action.