WebApp Loading Time: The Battle Continues...
In this refactor we merge our static and dynamic web pages to improve WebApp load speed
We've been talking to a potential client for a couple of months now. He had a concern about our loading speeds, even with our recent improvements in past sprints. To quote him: "Two seconds isn't going to be acceptable to our existing 1,500 clients, some are on really slow Internet connections, oh and BTW I hate the loading screen". We'd been reassuring him for a while that we had plenty of backlogged tasks to address the issue, but this was going to be a showstopper moving forward. So we simply said we would fix this problem once and for all in a single sprint of a fortnight. If we hit the mark you come with us; if we fail you can walk away. You'll get to see what it's like to work with us, and that when we promise something will be delivered, it's delivered. The scene was set.
The Recent Past
We've recently done a few sprints to enhance our loading time. We thought that we had done pretty well, but we knew we could do much, much better. In June 2018 we did a refactor that optimised our loading code; we wrote a blog about it called "Turbocharge WebApp Loading Time", which you can read at https://projectpeach.co.uk/ProjectPeach/Default/Static/def/blog_46.html. Following on in July 2018 we did another refactor, this time optimising our page loading time; that blog, "Supercharge Page Loading Time", is at https://projectpeach.co.uk/ProjectPeach/Default/Static/def/blog_52.html. Both are good background reading for this one. We made real improvements both to entry time (click URL to page display) and to inter-page movements. However, it seems to have given our clients a thirst for more speed. At the end of these enhancements we had a sub-two-second load time on a 20mbps Internet connection, but some of our clients have only 4mbps. Still too slow. And then there's the problem of mobile networks on 4G. Let's see what the make-up of the problem really is.
The Real Problem
Here's what Google PageSpeed Insights thinks of our existing Project Peach WebApp. Not pretty reading at 43 / 100 - Low (in red). Google-bot ranks websites, NOT cloud-hosted live WebApps, BUT page ranking is important to some of our clients. A good natural search ranking drives traffic, although NOTHING beats paying for search.
Before we continue, let's take a little detour back to what's taking the time. On the Friday at the end of the previous sprint we found ourselves a little time. We'd just had the client meeting and got the message loud and clear: prove you can speed it up! Let's investigate the stages of the load; we'll ignore the server-side translations since we've previously optimised the hell out of them, down to 40 milliseconds.
1. Load the HTML for all pages (some optimisation done already here with lazy loading of popups and the admin panel).
2. Load the entire CSS for ALL pages.
3. Swap out the target page HTML in the DOM using JS.
4. Poke in the contained images using JS.
5. Show the page.
This is a simplification, but these are the main loading steps. From step 1 to step 5 the loading animation is played with inline CSS animation. So the key question here is: why bother with the loading animation? We could show a page after step 2 and continue loading in the background. We wanted to prototype this quickly to prove it actually worked. So that's what we did that Friday afternoon. It wasn't pretty or elegant, but it proved a point. The prototype hit < 2s to display on a 4mbps connection in an incognito session. On typical faster connections it's instant, whether on a first or a subsequent visit. For the prototype we'd simply hacked the home page of Project Peach into a static page and showed it after step 2; nothing worked until step 4 completed, but the perception is wonderful.
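The staged load above can be sketched as a small driver that runs the steps in order but reveals the static page early. This is only an illustration of the idea; the step names, the function shape and the callback are hypothetical stand-ins, not the WebApp's actual loader code.

```javascript
// Run the load steps in order; after step `showAfter` completes,
// reveal the static page while the remaining steps continue.
async function stagedLoad(steps, showAfter, onShow) {
  for (let i = 0; i < steps.length; i++) {
    await steps[i]();                  // run each load step in order
    if (i + 1 === showAfter) onShow(); // reveal the static page early
  }
}

// Fake steps that just record their order so the flow is visible.
const order = [];
const step = (name) => async () => { order.push(name); };

stagedLoad(
  [step('html'), step('css'), step('swapDom'), step('images'), step('showLive')],
  2, // show the static page after step 2 (CSS loaded)
  () => order.push('show-static'),
).then(() => console.log(order.join(' -> ')));
// logs: html -> css -> show-static -> swapDom -> images -> showLive
```

The point of the sketch is the ordering: the user sees a rendered page after step 2, long before the dynamic machinery in steps 3-5 has finished.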
So that's the key to the solution. Refactor our current web indexing to create lookalike static pages. That's the story for our next sprint and the design work is done. The prototype is tested and it works. The Devil is in the detail as they say. The implementation starts next Monday.
Show the Client Early Progress
We work with our clients from day one. Even if all we have is a really rubbish, buggy prototype, we share it. Remember, they might not like it or may have input; better to get that early rather than at the end of the job. So we shared our prototype in the form of a beta Project Peach WebApp on the Monday. It's not wholly a prototype but the very start of our static page refactor to improve the perception of loading time.
It's Monday, the start of a two-week sprint. We've done the design, built a beta with the start of our ideas and published it to the client. We can now get on with the real coding, having proved that we have loading speed well and truly in the bag. Remember too, all this web indexing refactor is 'framework' code; all our client projects are built on our framework, therefore all our projects will contain this improved functionality at the next release/update. So where to start? The web indexing C++ core seems as good a place as any. Currently it's very simple: a set of static page type templates where we poke data and image links into placeholders/tags. So, in eCommerce, each product gets its own unique static HTML page based on the product template. It's all done the same way. The Cloud WebApp runs the web indexer thread each minute to check for 'dirty' items; if it finds one it rebuilds the page and re-queues the task for the next run. It's not quite that simple now, since we want a static HTML view of a dynamic page, and we want them to look as similar as possible. Our dynamic pages already use HTML templates, so why not source these as the template for the static pages too and point the web indexer at them? Perfect, but it's a little bit more complicated than that. Before we continue, have a read of our basic overview of something we call 'Custom Content': https://projectpeach.co.uk/ProjectPeach/Default/Static/def/blog_31.html.
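The core of the indexer, poking data into template placeholders, is simple to illustrate. The real indexer is the C++ core described above; this JavaScript sketch only shows the idea, and the `{{name}}` tag syntax and function names are assumptions for illustration.

```javascript
// Replace each {{key}} placeholder with its value from `data`,
// leaving unknown tags intact so a missing field is visible.
function renderStatic(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (tag, key) =>
    key in data ? String(data[key]) : tag);
}

// Event-driven rebuild: instead of a minute-timer thread scanning
// for 'dirty' items, rebuild the static page the moment an item
// changes. `writePage` stands in for persisting the HTML.
function makeIndexer(template, writePage) {
  return {
    onItemChanged(item) {
      writePage(item.id, renderStatic(template, item));
    },
  };
}
```

For example, `renderStatic('<h1>{{title}}</h1>', { title: 'Peach' })` yields `'<h1>Peach</h1>'`, and wiring `onItemChanged` to the save path gives the instant rebuild the blog describes.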
We need to teach our WebApp's web indexing code how to create initial-state custom content controls and how to default them, i.e. how they appear in the static page HTML. It also marks them up as custom content placeholders so the real JS custom content control can find and replace them when the WebApp connects and comes alive. We also made the web indexing update model event driven: since it's now part of our working WebApp, we need to update the instant a change is made rather than on a one-minute timer loop.
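The placeholder mark-up works something like the sketch below. The attribute name and the markup shape are assumptions for illustration; the real WebApp uses its own custom content markup.

```javascript
// The indexer writes the control's default appearance into the static
// page, wrapped in a marker attribute the live JS can later find.
function staticPlaceholder(id, defaultHtml) {
  return `<div data-custom-content="${id}">${defaultHtml}</div>`;
}

// Once the WebApp connects and comes alive, it scans the page for
// every marked control so the real JS control can replace it.
function findPlaceholderIds(html) {
  const ids = [];
  const re = /data-custom-content="([^"]+)"/g;
  let m;
  while ((m = re.exec(html)) !== null) ids.push(m[1]);
  return ids;
}
```

The static page therefore looks right immediately, and the live controls swap themselves in seamlessly once the app has loaded.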
This change is pretty big. It's been simplified here, but each page type needed to be addressed for initial static content, and each control type changed to respect the new static-page custom content markup. But basically we updated the web indexing process to create pages that look the same as the dynamic content, made them load early, and made them seamlessly transform into live pages. Worked a treat, BUT have we lost anything relating to bot/spider searching?
So what does Google PageSpeed Insights think of our faster loading technique? Blown away! Remember, we are NOT a website BUT we'd like to be ranked well for our clients' sake. Best of all, this is only the start of what we can do. We managed 97 / 100 for mobile and 87 / 100 for desktop. Not bad for something that's not a website and carries big, chunky initialisation baggage. We love the green!
Web Indexing. Anything lost?
The very last thing we wanted to do was break our web indexing for search. We'd like bots to index our live WebApps really well. We've not lost anything in terms of SEO; in fact we've made some important gains. We're faster now, which ranks us better. We have the same static site visuals as the dynamic site, which means our mobile ranking gets 97 / 100 for speed; it's even better for usable formatting, which also affects mobile ranking. We've added a Google-bot-friendly SiteMap to aid indexing of our pages, and we also build nice HTML nav tags in case the bot feels like recursively descending through our pages. We've also added product data descriptions for Google-bot consumption and better search data rendering. We've always had micro-data, but this area has evolved significantly in the past two years, so we added the new forms too. Remember, our clients don't maintain any of this bot-consumable data; our cloud-hosted WebApps build it automatically when pages, blogs, product data, etc. are added or modified. Work smarter NOT harder!
New to our web indexing, we've added Google sitemap support. We wanted to enhance our Google search, so in addition to our static page navigation links we added a sitemap for Google-bot consumption. You don't need to ask who maintains it: the cloud-hosted WebApp, of course. It's completely automatic and always bang up to date.
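Generating the sitemap automatically is straightforward once the indexer already knows every static page. A minimal sketch of the kind of builder the WebApp could run whenever a page changes, with assumed field names, following the sitemaps.org protocol:

```javascript
// Build a sitemap.xml string from the list of indexed static pages.
// `pages` entries carry a relative path and a last-modified date.
function buildSitemap(baseUrl, pages) {
  const urls = pages.map((p) =>
    `  <url>\n    <loc>${baseUrl}/${p.path}</loc>\n` +
    `    <lastmod>${p.lastmod}</lastmod>\n  </url>`).join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>`;
}
```

Because this runs inside the indexer's event-driven rebuild, the sitemap is regenerated the moment a page is added or modified, which is why nobody has to maintain it by hand.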
Google Structured Data
New to our web indexing, we've also added Google structured data support. Here's what we did for our eCommerce products: https://developers.google.com/search/docs/data-types/product. You can read all about how it supplements search information there. Best of all, it's FREE.
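Product structured data is emitted as a JSON-LD script tag using schema.org's Product type, which the Google documentation above describes. A sketch of how the indexer could build it; the product fields here are examples, and the real pages may carry more properties (images, SKU, ratings, and so on).

```javascript
// Build a JSON-LD <script> tag for one product, using the
// schema.org Product and Offer types that Google-bot consumes.
function productJsonLd(product) {
  const data = {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
    offers: {
      '@type': 'Offer',
      price: product.price,
      priceCurrency: product.currency,
      availability: 'https://schema.org/InStock',
    },
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Since the indexer already rebuilds a product's static page whenever its data changes, the structured data is regenerated at the same moment, so it can never drift out of date.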
When Can We See the Improvements?
These loading performance improvements are coded into our core functionality, so as we release new versions of our products they will simply appear. We will add some links to this blog and to our newsfeed when key products go live. Watch this space!
Doing it differently since 2012. We're simply 10 years ahead. Our software is designed, coded, maintained, supported & hosted in the United Kingdom.