All we have is illusion
If you have an interest in UX on the web, you will know that consistency plays an important part in presenting the illusion that a webspace has legitimacy and trustworthiness. Consistency also helps to impart a sense of ownership over the virtual space the content occupies (in both virtual worlds: the web and our minds). I have always thought the main sticking point to creating consistency is the visual and subjective anarchy released every time a hyperlink is clicked and we fly off (virtually) into the ether, to god knows where. Unless we are in a self-contained app or SPA, we will always rely on the HTTP protocol (it's not called hypertext for nothing), but maybe we don't have to visually convey this anarchy to the viewer. As usual, my research into finding ways of 'covering up the gap' ended at the first signs of anything remotely industry standard and the use of any external libraries (if you are looking for that approach, see here); what follows is pure hacking away at the problem.
Setting out the goals
The above illustration outlines the basic process. The aim is to get both the present page and the loading page to look identical, hiding the point when the new page is loaded. The easiest way to get two pages with different content to look the same is to make the whole screen a single colour. However, if you have a fixed navigation bar and only a few elements on your pages, another option is to 'morph' or move elements into the positions used on the loading page. This requires a lot more planning but does look and feel more fluid. You can even mix the two techniques, using the fading method for individual complex areas of content.
Hijacking the links
So we need to allow some time to get the current page into a state where we can load the new page. Every internal link on your site will need to call a javascript function via the onclick event, passing it a reference to itself (this is just the href value) using the this keyword. The return keyword simply means we can return a false value to cancel the actual link process. Add the following couple of functions to your javascript.
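The original code block was lost here, so this is a minimal sketch of how the hijack could work. The endpagefake name, the leavingPage class and the testonload flag come from the article; the 500ms delay is an assumption and must match the duration of the CSS cover-up animation defined later.

```javascript
// Global flag used later to detect cached pages (see the caching section)
var testonload = false;

function endpagefake(link) {
  testonload = true;                          // we are leaving via an internal link
  document.body.classList.add('leavingPage'); // triggers the CSS cover-up animation
  // Only navigate once the animation has had time to cover the page
  setTimeout(function () { gotoPage(link.href); }, 500);
  return false;                               // cancel the default link behaviour
}

function gotoPage(href) {
  window.location.href = href;
}
```

Each internal link then returns the function's value from its onclick, something like `<a href="about.html" onclick="return endpagefake(this);">About</a>`.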
Using the Style Cascade
In the endpagefake() javascript function we added a class to the HTML BODY tag, not directly onto the DIV that we want to use to cover up the page. This is good practice, as you may also want to affect other elements when leaving the page. We only need to target the BODY, but by setting up styles in the CSS we can affect countless child elements. Here is the bare bones HTML:
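The markup block is missing from this copy of the article, so here is a minimal sketch. The cover_up class name comes from the article; the nav element and the onload wiring are assumptions.

```html
<body onload="initialise()">
  <nav><!-- fixed navigation bar, identical on every page --></nav>
  <div class="cover_up"></div>
  <!-- ...page content... -->
</body>
```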
The DIV with the cover_up class will be used to cover the browser window; here is its basic CSS:
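The stylesheet block was lost here; this sketch follows the behaviour described below (the div "lies in wait" at zero height and zero opacity). The background colour and z-index values are assumptions.

```css
.cover_up {
  position: fixed;   /* pinned over the viewport */
  top: 0;
  left: 0;
  width: 100%;
  height: 0;         /* lies in wait at zero height */
  opacity: 0;
  background: #fff;  /* match your page background colour */
  z-index: 9999;     /* sit above all page content */
}
```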
When we add the leavingPage class to the BODY tag, the cover_up element will then follow this style declaration:
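This declaration is missing from this copy; a sketch under the assumption that the animation is called fade-up and runs for 0.5s (it must match the delay used in the javascript):

```css
body.leavingPage .cover_up {
  animation: fade-up 0.5s ease forwards; /* forwards keeps the final covered state */
}
```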
We have to use an animation here because, if we just transitioned the opacity up, the cover_up div would still stop the user activating the links, even at zero opacity. (Alternatively, we could set the pointer-events style property to none.) Here is the animation:
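The keyframes block was stripped from this copy; this sketch matches the behaviour described below (full height from the first frame, opacity rising over the animation). The fade-up name is an assumption.

```css
@keyframes fade-up {
  0% {
    height: 100vh;  /* jump to full height on the first frame... */
    opacity: 0;     /* ...but still invisible */
  }
  100% {
    height: 100vh;
    opacity: 1;     /* fully covered */
  }
}
```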
The cover_up div lies in wait with a height of 0px at the top of the browser. When the animation runs, within the first frame it is set to a height which should cover the browser, but it is still at zero opacity, so we can't see it yet. The opacity rises over the course of the animation, thus completing the cover-up.
On Load
We now have to make sure the new page loading matches this cover-up state. Change the original .cover_up class to:
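The modified class is missing from this copy; a sketch assuming the same values as the earlier .cover_up rule, but starting fully visible and at full height so the page loads already covered:

```css
.cover_up {
  position: fixed;
  top: 0;
  left: 0;
  width: 100%;
  height: 100vh;  /* now full height from the start */
  opacity: 1;     /* and fully opaque */
  background: #fff;
  z-index: 9999;
}
```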
This will load up the page with all the content covered up. To uncover, add a class of unload to the BODY tag in the initialise function in the javascript:
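The function body is missing from this copy; a minimal sketch using the initialise name from the article, assuming it is wired to the BODY's onload:

```javascript
function initialise() {
  // Reveal the content: the unload class triggers the reverse animation via CSS
  document.body.classList.add('unload');
}
```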
The animation is just a reverse of the fade-up animation. no_scroll refers to removing any scroll bars on the cover-up screen; this is optional.
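A sketch of that reverse animation, under the assumption that it is named fade-down and that the height drops back to zero at the end so the invisible div no longer blocks the links:

```css
body.unload .cover_up {
  animation: fade-down 0.5s ease forwards;
}

@keyframes fade-down {
  0%   { height: 100vh; opacity: 1; }
  99%  { height: 100vh; opacity: 0; } /* fade out while still covering */
  100% { height: 0; opacity: 0; }     /* then drop out of the way so links work */
}

.no_scroll {
  overflow: hidden; /* optional: hide scroll bars while covered */
}
```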
Dashed by the Cache
This works well in all browsers, provided the user is using the actual page links to navigate. At some stage, though, users will start using the browser's own back and forward buttons to navigate through their current viewing session. Some browsers will just show cached versions of the pages already visited, rather than re-downloading them. As we left our pages in a 'covered-up' state, the browser will just show this to the user, with all the content hidden below the cover-up. Not very UX friendly. To combat this we use the onpageshow event to test if a page is cached (unload events won't be fired when cached pages are shown).
In the javascript file, create a global variable testonload, setting it to false. For the first page of the site the viewer sees, they have to use your internal page links to move away, so during the cover-up process we set testonload to true. When we come back to the page via the cache, onpageshow is fired and we test testonload; if true, we can undo the cover-up routine. When not cached, onpageshow is still fired after the unload event. So the final HTML BODY tag looks like:
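The tag itself is missing from this copy; a sketch in which pageShown is a hypothetical handler name (the article does not preserve it):

```html
<body onload="initialise()" onpageshow="pageShown(event)">
```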
Adding the following to the javascript:
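The final javascript block is also missing; a sketch of the cache check described above. The testonload flag comes from the article, while pageShown is a hypothetical handler name wired to the BODY's onpageshow attribute.

```javascript
// Global flag: set to true by the cover-up routine when leaving the page
var testonload = false;

function pageShown(event) {
  if (testonload) {
    // A cached, still covered-up copy of this page is being shown again:
    // re-run the reveal animation
    document.body.classList.add('unload');
  }
}
```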
VIEW DEMO