Lots of games have alternate vision modes (nightvision, heat-vision), or use rumble unrealistically, for the sake of gameplay and tactile feedback. Even though these sensory outputs from the game aren't physically accurate, they map the player's senses directly onto the fiction of the game world - nightvision is still vision, and rumble is still a tactile sensation.
If you can give information through a different sensory channel, it opens up a lot of possibilities. Far Cry Instincts showed smell as a particle trail, which you could use to track people. XIII rendered positional audio cues in 3D, with a "tap tap tap" comic-style cutout showing the location of footsteps (something I wish I had in Thief, where I constantly look left and right to hear enemy footsteps switch L/R channels on my headphones). In both of these games, it ends up more like a UI indicator than an actual sense - the way the information's mapped from one sense to another is shallow.
Ideally, a good mapping of one sense onto another would give the player a kind of synesthesia (wiki). Rez is the closest thing to synesthetic I can think of, and although the gameplay is terrible, it's a unique experience to play. It feels like Guitar Hero or DDR, where things on the screen (and your control inputs to the game) are synchronized to the music. Adding that sort of wackiness to a game should be pretty easy for us - grab a random stream of data from somewhere in your game (audio, player input, some important game state), and hook it up to an arbitrary different output instead of showing it as a bar on the HUD. Additionally, it's awesome if we can make sensory output from the game matter in ways it normally doesn't (eg. the "find the rumble direction" minigame, or the last Def Jam's crazy beat-matched fighting system). Instead of just being cosmetic output, make it part of the feedback loop the player needs to play the game.
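The "grab a stream, hook it to a different output" idea can be sketched in a few lines. Everything here is hypothetical (the names `remap`, the idea of driving rumble from player speed are mine, not from any particular engine); the point is just that any per-frame number can be normalized and rerouted to any output channel.

```python
# A minimal sketch of rerouting one game data stream into a different
# output channel. All names here are hypothetical, not from a real engine.

def remap(stream, out_min=0.0, out_max=1.0):
    """Normalize an arbitrary stream of per-frame game data into a
    generic out_min..out_max signal you can feed to any channel
    (rumble intensity, screen tint, audio pitch...)."""
    lo, hi = min(stream), max(stream)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat stream
    return [out_min + (x - lo) / span * (out_max - out_min) for x in stream]

# e.g. take per-frame player speed and drive controller rumble with it
player_speed = [0.0, 2.5, 5.0, 10.0, 5.0]
rumble = remap(player_speed)  # 0..1 rumble strength per frame
```

The interesting design work is picking a stream/output pair the player can actually learn to read, not the plumbing itself.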
Internalization of the laws of the world makes you understand the game at a deep level - the character physics feel kinesthetic and "right", the tactics are well-known to both you and the enemies so they turn into gambits and counter-moves, and you've learned to read the game's state from its output. Every little sensory channel we can feed the player helps.
What's even more interesting to me is giving the player senses he doesn't really have. New senses don't need to be difficult to implement or mentally alien (like trying to model smell diffusion, or vibration sensitivity depending on the ground's hardness, or echolocation bouncing off walls); some can be trivial for both us and the player.
It would be trivial to make the player into a mindreader - make a prettier version of your AI's debug displays. The player could then see what the NPCs are aware of, and what they're about to do. If it had a cost in the game (eg. it's only available when the enemy's not hostile, or when the moon's full and visible), there's automatic gameplay. Preempt their feeble plans and feel like a mastermind.
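A minimal sketch of that mindreading sense, assuming a hypothetical NPC structure: surface the same state the AI debug display already has (awareness, planned action), and gate it behind the fictional cost from the paragraph above (here, it only works while the NPC isn't hostile).

```python
# Hypothetical sketch: exposing the AI's internal state as a player-visible
# "mindreading" overlay. The NPC fields and the hostility gate are my
# assumptions, standing in for whatever your AI debug display already shows.

from dataclasses import dataclass

@dataclass
class NPC:
    name: str
    hostile: bool
    awareness: float      # 0 = oblivious, 1 = fully alerted
    next_action: str      # what the behavior tree plans to do next

def mindread(npc):
    """Return the overlay text the player sees, or None when the sense
    is unavailable (the in-fiction cost: only works on non-hostile NPCs)."""
    if npc.hostile:
        return None
    return f"{npc.name}: awareness {npc.awareness:.0%}, will {npc.next_action}"

guard = NPC("guard", hostile=False, awareness=0.3, next_action="patrol east")
```

The rendering is just a prettier debug HUD; the gameplay comes from the availability rule.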
Prescience is a little trickier than mindreading - you need fast prediction [oracle billiards] for objects (similar to what you already need for network play), and some rendering tricks to overlay that information on what's currently happening. (Does it make sense in turn-based games? I could see it working in a game like XCOM, but it starts to overlap with the mindreading above.)
A cheesy way to do it would be to render every moving object again, predicted a second or two into the future. In an FPS this isn't useful in multiplayer (because of how players move, and because skilled players already predict simple movement really well), but against AI opponents (since the game knows what they'll do) and for predicting physics objects it could get pretty psychedelic.
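The prediction half of this is just fast-forwarding the simulation. A toy sketch, assuming simple ballistic physics (a real game would fast-forward a copy of its actual physics/AI state instead):

```python
# Sketch of "cheesy" prescience: step a copy of an object's simple physics
# forward to find where it will be a second from now. Semi-implicit Euler,
# matching what a basic game physics tick might do (my assumption).

def predict(pos, vel, accel, dt, steps):
    """Fast-forward 1D motion; returns the predicted (pos, vel)."""
    x, v = pos, vel
    for _ in range(steps):
        v += accel * dt   # integrate velocity first (semi-implicit Euler)
        x += v * dt       # then position, using the updated velocity
    return x, v

# object thrown upward at 10 m/s: where is it 1 second out (60 ticks)?
future_x, future_v = predict(pos=0.0, vel=10.0, accel=-9.8, dt=1/60, steps=60)
```

For network play you'd already have something like this; prescience just renders the result instead of hiding it in the netcode.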
Here's the cheesy future effect - rendering the object a few times over, at the places we predict it's going. Planet orbits are easy to predict :) I rendered the 'futureness' in redder, alpha'd-out colors, so it's clear which objects are predictions and which one is real. The future ones are also bigger; I was hoping to make it look like a spreading probability function.
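The styling described above (redder, more transparent, bigger with futureness) is a handful of lerps. A sketch, with my own choice of endpoint values:

```python
# Sketch of styling the predicted "ghost" renders: lerp the color toward
# red, fade alpha out, and scale up with futureness t (t = 0 is the real
# object, t = 1 the furthest prediction). Endpoint values are my guesses.

def lerp(a, b, t):
    return a + (b - a) * t

def ghost_style(base_rgb, t):
    """Return (rgb, alpha, scale) for a ghost at futureness t in [0, 1]."""
    r, g, b = base_rgb
    rgb = (lerp(r, 1.0, t), lerp(g, 0.0, t), lerp(b, 0.0, t))  # toward red
    alpha = lerp(1.0, 0.2, t)   # fade out further into the future
    scale = lerp(1.0, 1.5, t)   # grow, like a spreading probability cloud
    return rgb, alpha, scale

# halfway-into-the-future ghost of a blue planet
rgb, alpha, scale = ghost_style((0.2, 0.6, 1.0), t=0.5)
```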
Here's a continuous version, where the object gets stretched out towards its future position - it's kind of like the Donnie Darko spear-object extruded through time. You can see where the Earth is going to be for the next 1/4 orbit.
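The continuous version is the same prediction sampled densely: take positions along the predicted path at many small time offsets and hand them to the renderer as one stretched strip instead of discrete ghosts. A toy sketch using a circular orbit as the stand-in trajectory:

```python
# Sketch of the "extruded through time" render: sample the predicted path
# at many offsets and return the points as one strip. The circular orbit
# is a toy stand-in for any trajectory your predictor can evaluate.

import math

def orbit_pos(t, radius=1.0, period=1.0):
    """Toy circular orbit, parameterized by time in orbits."""
    angle = 2 * math.pi * t / period
    return (radius * math.cos(angle), radius * math.sin(angle))

def future_strip(t_now, horizon, samples):
    """Predicted positions from now out to t_now + horizon, as strip points."""
    return [orbit_pos(t_now + horizon * i / samples)
            for i in range(samples + 1)]

# the Earth's next 1/4 orbit, as 17 points for a stretched ribbon render
strip = future_strip(t_now=0.0, horizon=0.25, samples=16)
```

Fading alpha and width along the strip (as in the ghost version above) would finish the Donnie Darko look.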