We study learning and influence in a setting where agents communicate according to an arbitrary social network and naively update their beliefs by repeatedly taking weighted averages of their neighbors' opinions. We focus on conditions under which the beliefs of all agents in large societies converge to the truth despite this naive updating. We show that this happens if and only if the influence of the most influential agent vanishes as the society grows. Using simple examples, we identify two main obstructions that can prevent such convergence. By ruling out these obstructions, we provide general structural conditions on the social network that are sufficient for convergence to the truth. In addition, we show how social influence changes when some agents redistribute their trust, and we give a complete characterization of the social networks for which beliefs converge. Finally, we survey recent structural results on the speed of convergence and relate them to issues of segregation, polarization, and propaganda.
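The updating rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes trust weights are collected in a row-stochastic matrix `T` (each row sums to one), so that one round of naive updating replaces each agent's belief with the weighted average of its neighbors' beliefs, i.e. the belief vector `b` maps to `T @ b`. The matrix and initial beliefs below are hypothetical examples.

```python
import numpy as np

def degroot_update(T: np.ndarray, beliefs: np.ndarray, steps: int) -> np.ndarray:
    """Naive (DeGroot-style) updating: repeatedly replace each agent's
    belief with the trust-weighted average of its neighbors' beliefs."""
    b = beliefs.astype(float)
    for _ in range(steps):
        b = T @ b  # one round of weighted averaging over the network
    return b

# Hypothetical three-agent society with row-stochastic trust matrix T.
T = np.array([[0.50, 0.25, 0.25],
              [1/3,  1/3,  1/3 ],
              [0.25, 0.25, 0.50]])
b0 = np.array([1.0, 0.0, 0.0])  # initial opinions

b = degroot_update(T, b0, 100)
print(b)  # all three beliefs are (numerically) at a common consensus value
```

In this example the beliefs converge to a consensus equal to a weighted average of the initial opinions, with weights given by the left eigenvector of `T` for eigenvalue 1; those weights are the agents' influences, and convergence to truth in a large society requires that no single such weight stays bounded away from zero.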