Decision theory solves the extremely hard problem of telling you what you should do, given a consistent, mathematically formalized model of the world and a utility function telling you exactly how good any particular world-state is. The two most common decision theories are evidential decision theory (EDT) and causal decision theory (CDT); various accursed decision theories exist but are not in wide use.
Evidential decision theory
EDT says, informally, that you should take whatever action has the highest expected utility: the sum, over outcomes, of the utility of each outcome weighted by its probability conditional on you taking the action.
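This calculation can be sketched in a few lines. Everything here is invented for illustration: the action names, outcomes, probabilities, and utilities are arbitrary placeholders, not part of any standard formalism.

```python
# Toy EDT calculation. p_given[a][o] stands in for P(outcome o | action a),
# and utility[o] for U(o); all numbers are made up.

def edt_value(outcome_dist, utility):
    """Expected utility of an action: sum over outcomes of P(o | action) * U(o)."""
    return sum(p * utility[o] for o, p in outcome_dist.items())

utility = {"good": 10.0, "bad": -5.0}
p_given = {
    "act_a": {"good": 0.8, "bad": 0.2},
    "act_b": {"good": 0.5, "bad": 0.5},
}

values = {a: edt_value(dist, utility) for a, dist in p_given.items()}
best_action = max(values, key=values.get)
# act_a scores 0.8*10 - 0.2*5 = 7.0; act_b scores 0.5*10 - 0.5*5 = 2.5,
# so EDT recommends act_a.
```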
Causal decision theory
Informally, CDT says that you should take whatever action causes the highest expected utility: the conditional probability in EDT's formula is replaced with a counterfactual probability. There is some debate about how to operationalize this, but roughly speaking, it is the probability of the outcome if the action you took were swapped out for another one.
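The difference between the two probabilities only matters when something influences both your action and the outcome. The following toy model (all numbers invented) has a hidden cause that does exactly that, so conditioning on the action and intervening on it give different answers:

```python
# Hidden cause C ~ Bernoulli(0.5). C raises both the chance the agent takes
# action A and the chance of a good outcome; A itself is causally inert here.
P_C = 0.5
P_A_GIVEN_C = {True: 0.9, False: 0.1}     # C makes the action likely
P_GOOD_GIVEN_C = {True: 0.8, False: 0.2}  # C alone drives the outcome

# Evidential: P(good | A) via Bayes -- observing A is evidence about C.
p_A = sum(P_A_GIVEN_C[c] * (P_C if c else 1 - P_C) for c in (True, False))
p_C_given_A = P_A_GIVEN_C[True] * P_C / p_A
p_good_given_A = (p_C_given_A * P_GOOD_GIVEN_C[True]
                  + (1 - p_C_given_A) * P_GOOD_GIVEN_C[False])

# Causal: P(good | do(A)) -- intervening on A severs its link to C,
# so C keeps its prior probability and A changes nothing downstream.
p_good_do_A = P_C * P_GOOD_GIVEN_C[True] + (1 - P_C) * P_GOOD_GIVEN_C[False]

# p_good_given_A works out to 0.74, while p_good_do_A is 0.5: taking A is
# strong *evidence* of a good outcome but does not *cause* one.
```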
Limitations
Neither theory provides a satisfying answer to all problems.
Toxoplasmosis Dilemma
Imagine that you have the opportunity to pet a cat. In this hypothetical world, cats in general may carry toxoplasmosis, a parasite that makes its hosts more likely to pet cats and has many other effects you disvalue greatly. The particular cat you may pet has been tested for toxoplasmosis, and you are very confident it is neither infected nor able to infect you. However, people with toxoplasmosis are twice as likely as people without it to pet a cat, given the opportunity.
An EDT agent reasons that petting the cat would be reasonably strong evidence that they have toxoplasmosis, which is undesirable, so they would not pet the cat. However, a CDT agent reasons that at the time of their decision their infection status is fixed, so counterfactually deciding to pet it or not pet it cannot affect whether they have toxoplasmosis. The latter is, in this instance, more intuitively satisfying.
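The divergence can be made concrete with a worked calculation. All of the numbers below are invented for illustration: petting is worth +1, toxoplasmosis is worth -100, the base rate is 10%, and infected people pet with probability 0.8 versus 0.4 for the uninfected (the "twice as likely" ratio from the setup).

```python
# Invented payoffs and rates for the toxoplasmosis dilemma.
U_PET, U_TOXO = 1.0, -100.0
p_toxo = 0.10
p_pet_if_toxo, p_pet_if_clean = 0.8, 0.4

# EDT conditions on the action: petting is evidence of infection.
p_pet = p_pet_if_toxo * p_toxo + p_pet_if_clean * (1 - p_toxo)
p_toxo_given_pet = p_pet_if_toxo * p_toxo / p_pet
p_toxo_given_no_pet = (1 - p_pet_if_toxo) * p_toxo / (1 - p_pet)

edt_pet = U_PET + U_TOXO * p_toxo_given_pet   # about -17.2
edt_no_pet = U_TOXO * p_toxo_given_no_pet     # about -3.6
# EDT refuses to pet: edt_no_pet > edt_pet.

# CDT intervenes: infection status keeps its prior either way,
# so petting just adds the +1.
cdt_pet = U_PET + U_TOXO * p_toxo             # -9.0
cdt_no_pet = U_TOXO * p_toxo                  # -10.0
# CDT pets the cat: cdt_pet > cdt_no_pet.
```

The agents disagree even though they share the same model and utilities; the only difference is conditioning versus intervening on the action.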
XOR Blackmail
The XOR blackmail problem shows how an EDT agent can be induced to pay a "blackmailer" who exerts no causal influence whatsoever over the harm in question. XOR blackmail is misnamed, as no actual blackmail takes place; however, the terminology remains persistent.
Say Omega, a perfectly honest and perfect predictor of your actions, sends you the following message: "I have sent you this message iff (your house has been infested with termites) xor (you will pay me $1000)." Suppose further that a termite infestation would cost you $1,000,000.
A CDT agent reasons that whether they pay Omega has no causal influence on whether their house is infested with termites, and so elects to not pay Omega. Omega predicts this, so the CDT agent will only ever receive this message in worlds where their house is infested with termites.
An EDT agent reasons that, conditional on receiving the message and paying, their house is almost certainly not infested (an infested, paying recipient would make the xor false), whereas conditional on receiving the message and not paying, it is infested. Since the $1000 payment is far smaller than the $1,000,000 cost of an infestation, the EDT agent pays. But paying does nothing about the termites: it only changes which worlds the message arrives in, so the EDT agent hands over $1000 for no causal benefit.
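Comparing the two policies ex ante (before any message arrives) makes the exploitation explicit. The 1% infestation prior below is invented; the payoff logic follows directly from Omega's xor condition.

```python
# Ex ante cost of each policy under XOR blackmail, with an invented
# 1% infestation prior. Omega sends the message iff (infested XOR you pay)
# and predicts your policy perfectly.
P_INFESTED = 0.01
COST_TERMITES, COST_PAY = 1_000_000, 1_000

# Policy "pay on receipt": the message only arrives when the house is NOT
# infested (infested-and-paying would make the xor false), so this agent
# pays $1000 in the clean worlds and still eats the termite cost in the
# infested ones.
cost_payer = P_INFESTED * COST_TERMITES + (1 - P_INFESTED) * COST_PAY

# Policy "never pay": the message arrives exactly when infested, and the
# agent pays nothing in any world.
cost_refuser = P_INFESTED * COST_TERMITES

# cost_refuser < cost_payer: paying changes which worlds get messages,
# not whether the termites are there.
```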