Information retrieval (IR) is the task of learning useful ways of representing text documents that allow us to efficiently retrieve the most relevant documents for a given query, e.g., by encoding document similarity in the distances between documents in representation space. We propose a simple and effective strategy for augmenting document representations using inbound citations and hyperlinks as pseudo-queries and alternate views of a document, leading to significant task- and domain-agnostic gains for zero-shot information retrieval and a new state of the art on multiple BEIR tasks. Notably, this technique works with arbitrary sparse or dense models and, unlike generative text expansion techniques, requires no additional training or auxiliary large decoder models. More broadly, we propose the lens of query and document expansion as finding shortcuts between points in representation space. We additionally find that the benefits of different kinds of shortcuts stack, explore shortcuts as an optimization-free way to adapt the effective representation space of a retriever to new information at inference time, and show that shortcuts provide strong signals for fine-tuning dense retrievers.
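
To make the core idea concrete, here is a minimal sketch (ours, not the paper's implementation) of augmenting a dense document embedding with inbound citation or hyperlink anchor texts treated as pseudo-queries. The encoder model, the `alpha` mixing weight, and the simple averaging scheme are illustrative assumptions; the approach is encoder-agnostic, as the abstract states.

```python
# Hypothetical sketch: augment a document's dense embedding with the
# embeddings of its inbound anchor texts (pseudo-queries). The model name,
# averaging scheme, and alpha weight are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any dense encoder works

def augmented_doc_embedding(doc_text, inbound_anchor_texts, alpha=0.5):
    """Mix the document's own embedding with the mean embedding of the
    anchor texts of its inbound links, then renormalize for cosine search."""
    doc_vec = encoder.encode(doc_text)          # shape (d,)
    if not inbound_anchor_texts:
        return doc_vec
    pseudo_vecs = encoder.encode(inbound_anchor_texts)  # shape (n, d)
    mixed = alpha * doc_vec + (1 - alpha) * pseudo_vecs.mean(axis=0)
    return mixed / np.linalg.norm(mixed)
```

Because the augmentation happens purely in embedding space at indexing time, it requires no retraining of the retriever and no generative expansion model; the same recipe applies to sparse representations by merging term-weight vectors instead of dense vectors.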