The ionosphere is a partly ionized, turbulent plasma layer in the upper atmosphere, with an electron density that is highly variable in space and time. An electromagnetic signal passing through this layer experiences a wavelength-dependent delay. This dispersive delay is a major issue for the calibration of low-frequency radio interferometric data: the time variability requires phase solutions at high time resolution, while the spatial variation forces direction-dependent calibration. In the presence of the Earth’s magnetic field, the delay differs slightly between right and left circular polarization, leading to a rotation of the linear polarization angle, an effect known as Faraday rotation. Faraday rotation must be addressed when studying polarization, but its differential effect can also be observed for unpolarized signals in the cross-correlation products at low frequencies or on long baselines. In this chapter we discuss these ionospheric effects on LOFAR data as well as methods to correct for them.
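As a rough illustration of the magnitudes involved, both effects can be estimated from the slant total electron content (TEC) along the line of sight, using the standard dispersive-phase and rotation-measure expressions. The sketch below is not taken from the text; the TEC value, line-of-sight magnetic field strength, and observing frequency are assumed, illustrative numbers.

```python
import math

# Physical constants (SI)
E = 1.602176634e-19      # elementary charge [C]
EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]
M_E = 9.1093837015e-31   # electron mass [kg]
C = 2.99792458e8         # speed of light [m/s]

def dispersive_phase(tec_tecu, freq_hz):
    """Ionospheric phase in radians for a slant TEC given in TEC units
    (1 TECU = 1e16 electrons/m^2); scales as 1/frequency, i.e. linearly
    with wavelength."""
    k = E**2 / (4 * math.pi * EPS0 * M_E * C)   # ~8.45e-7 in SI units
    return k * tec_tecu * 1e16 / freq_hz

def faraday_rotation(tec_tecu, b_parallel_tesla, freq_hz):
    """Rotation of the linear polarization angle in radians, assuming a
    constant magnetic field component along the ray path.  The rotation
    measure scales the squared wavelength."""
    rm = (E**3 / (8 * math.pi**2 * EPS0 * M_E**2 * C**3)) \
         * tec_tecu * 1e16 * b_parallel_tesla   # rotation measure [rad/m^2]
    lam = C / freq_hz                           # wavelength [m]
    return rm * lam**2

# Assumed illustrative values: 10 TECU slant TEC, 50 uT line-of-sight
# field, and a 150 MHz observing frequency.
phi = dispersive_phase(10.0, 150e6)          # hundreds of radians of phase
chi = faraday_rotation(10.0, 50e-6, 150e6)   # several radians of rotation
print(f"dispersive phase: {phi:.1f} rad, polarization rotation: {chi:.2f} rad")
```

Even these modest ionospheric conditions yield many phase turns and a polarization rotation of order a radian, which is why both effects dominate low-frequency calibration.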