The Hybrid (or Hamiltonian) Monte Carlo (HMC) algorithm has been a mainstay of lattice field theory computations since its introduction in 1987, and it is becoming popular in other disciplines (such as Statistics and Machine Learning). HMC combines Markov Chain Monte Carlo (MCMC) with Hamiltonian dynamics to give a sampling algorithm with good mixing properties (small autocorrelations). I will introduce the basic properties of Hamiltonian dynamics, and explain how a solution of Hamilton's equations may be used to generate more-or-less correlated samples while satisfying detailed balance. I will then explain some of the remarkable properties of symplectic integrators such as the leapfrog (Verlet) scheme, and show how these may be combined with a Metropolis--Hastings acceptance step to give a practicable Markov update with exactly the desired fixed-point distribution. I will then discuss the relationship between some special cases of the HMC algorithm and stochastic differential equations; why symplectic integrators exactly conserve a Shadow Hamiltonian close to the desired Hamiltonian; and how such a Shadow may be used. In passing I will explain how higher-order integrator schemes may be constructed, and in particular how second-derivative "force gradient" integrator steps arise.
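As a concrete illustration of the scheme outlined above (Gaussian momentum refreshment, leapfrog integration of Hamilton's equations, and a Metropolis--Hastings accept/reject step), the following is a minimal sketch in Python/NumPy for a target density proportional to exp(-U(q)). The function names `leapfrog` and `hmc_step` and the parameters `eps` (step size) and `n_steps` (trajectory length) are illustrative choices, not drawn from the text.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    """Integrate Hamilton's equations for H(q, p) = U(q) + p.p/2
    using the leapfrog (Verlet) scheme: symplectic and reversible."""
    p = p - 0.5 * eps * grad_U(q)      # initial half step for the momenta
    for _ in range(n_steps - 1):
        q = q + eps * p                # full step for the positions
        p = p - eps * grad_U(q)        # full step for the momenta
    q = q + eps * p                    # final full step for the positions
    p = p - 0.5 * eps * grad_U(q)      # final half step for the momenta
    return q, p

def hmc_step(q, U, grad_U, eps, n_steps, rng):
    """One HMC update: refresh momenta, integrate a trajectory, then
    accept or reject via a Metropolis--Hastings test on the energy change."""
    p = rng.standard_normal(q.shape)   # heatbath refreshment of the momenta
    H_old = U(q) + 0.5 * (p @ p)
    q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
    H_new = U(q_new) + 0.5 * (p_new @ p_new)
    # Accept with probability min(1, exp(-dH)); this corrects the finite
    # step-size error of the integrator, so exp(-U) is the exact fixed point.
    if rng.random() < np.exp(min(0.0, H_old - H_new)):
        return q_new
    return q

if __name__ == "__main__":
    # Illustrative target: a three-dimensional standard Gaussian, U(q) = q.q/2.
    rng = np.random.default_rng(0)
    U = lambda q: 0.5 * (q @ q)
    grad_U = lambda q: q
    q = np.zeros(3)
    draws = []
    for _ in range(1000):
        q = hmc_step(q, U, grad_U, eps=0.2, n_steps=10, rng=rng)
        draws.append(q)
    print(np.mean(draws, axis=0), np.var(np.stack(draws), axis=0))
```

Because the momenta are refreshed from a Gaussian at every update and the kinetic energy is even in p, the explicit momentum flip required for reversibility may be omitted in this sketch; the accept/reject step then makes exp(-U) the exact fixed-point distribution despite the integrator's finite step-size errors.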