Optical channels, such as fibres or free-space links, are ubiquitous in today's telecommunication networks. They rely on the electromagnetic field associated with photons to carry information from one point in space to another. A complete physical model of these channels must take quantum effects into account to determine their ultimate performance. Single-mode, phase-insensitive bosonic Gaussian channels have been studied extensively over the past decades, given their importance for practical applications. Despite this, a long-standing unsolved conjecture on the optimality of Gaussian encodings had prevented the determination of their classical communication capacity. Here, this conjecture is resolved by proving that the vacuum state achieves the minimum output entropy of these channels. This establishes the ultimate achievable bit rate under an energy constraint, as well as the long-awaited proof that the classical capacity of these channels is additive and hence given by a single-letter formula.
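As a concrete illustration of the capacity formula that the minimum-output-entropy result makes rigorous, the sketch below evaluates the best-known special case, the pure-loss channel: with transmissivity η and a mean-photon-number constraint N̄ per use, its classical capacity is C = g(ηN̄), where g(x) = (x+1)log₂(x+1) − x log₂x is the entropy of a thermal state with mean photon number x. The function names here are illustrative, not taken from the paper.

```python
import math


def g(x: float) -> float:
    """Entropy (in bits) of a thermal state with mean photon number x."""
    if x <= 0:
        return 0.0
    return (x + 1) * math.log2(x + 1) - x * math.log2(x)


def pure_loss_capacity(eta: float, n_bar: float) -> float:
    """Classical capacity (bits per channel use) of a pure-loss bosonic
    channel with transmissivity eta, under a mean-photon-number
    constraint n_bar at the input: C = g(eta * n_bar)."""
    return g(eta * n_bar)


# A lossless channel (eta = 1) carrying one photon per use on average
# achieves g(1) = 2*log2(2) - 1*log2(1) = 2 bits per use.
print(pure_loss_capacity(1.0, 1.0))  # -> 2.0
```

Note that capacity increases with both the transmissivity and the energy budget, but only logarithmically in N̄ at large photon numbers, which is why the energy constraint is essential for the question to be well posed.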