Neural systems supporting spatial navigation have been extensively investigated in animals and humans. Despite consistency in the brain regions implicated in navigation, the restrictions imposed by static, lab-based imaging techniques in humans place strict limits on our current understanding of real-world navigation. By definition, navigation in real-world environments is multifaceted: sensory perception of environmental features and self-motion information combine to establish orientation in space and guide navigation. From a theoretical perspective, how the brain integrates environmental sensory and self-motion information to support spatial navigation in naturalistic environments remains an open question. With recent developments in mobile technology, we are now in a position to assess real-world navigation. While it is difficult to predict at this early stage exactly how the adoption of this approach will advance knowledge of human navigation, it clearly has the potential to provide valuable insights into the relationship between the psychological, physical, and environmental aspects of navigation.