The ND programming language studied in this thesis combines the While programming language, extended with random assignment, with Dijkstra's Guarded Command Language GC. A semantic theory for ND is developed following algebraic operational semantics, using semantic computation trees labeled with states in place of the computation sequences used in the deterministic case. The semantics of an ND procedure is then the set of states at all leaves of its tree, together with '↑' (the divergence symbol) if the tree has an infinite path.
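The leaf-collecting semantics can be sketched in code. This is an illustrative model only: the `Node` class, dictionary-valued states, and the depth bound used to approximate infinite paths are assumptions made here, not the thesis's construction (detecting an infinite path is in general undecidable, so a bound stands in for it).

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of a semantic computation tree, labeled with a state
    (modeled here as a variable-to-value dictionary)."""
    state: dict
    children: list = field(default_factory=list)

def semantics(tree, depth=100):
    """Collect the states at all leaves of the computation tree.
    In the thesis, '↑' is added when the tree has an infinite path;
    here any path longer than `depth` is conservatively marked '↑'."""
    results = set()
    def walk(node, d):
        if not node.children:
            # A leaf: its state is one possible output.
            results.add(frozenset(node.state.items()))
        elif d == 0:
            # Path exceeded the bound: treat as (possible) divergence.
            results.add('↑')
        else:
            for child in node.children:
                walk(child, d - 1)
    walk(tree, depth)
    return results

# A GC-style alternative `x := 1 [] x := 2` from state x = 0
# yields a tree with two leaves:
t = Node({'x': 0}, [Node({'x': 1}), Node({'x': 2})])
# semantics(t) contains the two leaf states {x: 1} and {x: 2}.
```

A looping statement would be modeled by a tree with an unbounded path, and `semantics` would then include `'↑'` in its result.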
Since GC has (i) finite non-determinism (i.e., the semantic computation tree for a GC statement is finitely branching) and (ii) localization of computation (i.e., the output is always in the subalgebra generated by the input), the whole computation procedure can be represented by a Gödel numbering. Hence (assuming a "term evaluation property" for the given algebra) we can prove a Universal Function Theorem for GC. This technique fails for the full ND language, which has infinite non-determinism and lacks localization of computation.
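The role of finite branching can be illustrated with a small sketch. In a finitely branching tree, a breadth-first enumeration reaches every node at some finite index, which is the combinatorial fact underlying the coding of a GC computation tree by Gödel numbers; with random assignment, a node may have infinitely many children, and such an enumeration never gets past them. The tree shape below is a made-up example, not taken from the thesis.

```python
from collections import deque

def bfs_enumerate(root, children):
    """Breadth-first enumeration of a tree given by a `children` function.
    If every node has finitely many children, every node of the tree
    appears at some finite position in this enumeration -- the property
    that lets each node (and hence the whole computation) be assigned a
    natural-number code. Infinite branching, as introduced by random
    assignment, breaks this: the first node's children alone would
    exhaust the enumeration."""
    queue = deque([root])
    while queue:
        node = queue.popleft()
        yield node
        queue.extend(children(node))

# A finitely branching example: binary strings up to length 2.
children = lambda s: [s + '0', s + '1'] if len(s) < 2 else []
nodes = list(bfs_enumerate('', children))
# Every node, e.g. '11', occurs at a finite index in `nodes`.
```

The index a node receives in this enumeration plays the role of its code; a Gödel numbering of GC computations is of course more refined, but rests on the same finiteness property.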