At least the German audience of this blog will probably recognize the rather blunt punch line. Mediamarkt, for those who don’t know, is Europe’s biggest electronics retail chain with something like 650 stores. For various reasons, most of them not really relevant here, they did not have a webshop until last week. Can you believe that?
Well, now they do. Last week they launched the site and, as they often do, made a pretty bold statement: none other than Amazon is the opponent they want to take on. So I was curious how they perform from a web performance perspective… To make a long story short: pretty badly.
The dry numbers, pulled from WebPageTest:
- Amazon: Time to Render 0.7 seconds, Visually Complete 2.3 seconds
- Mediamarkt: Time to Render 3.4 seconds, Visually Complete 18.2 seconds
Yep, you are reading that right: Amazon is already visually complete before Mediamarkt has drawn a single pixel.
Both tests were done from WPT’s Paris node with 1.5 MBit/s bandwidth and IE8. The reason I picked Paris rather than Frankfurt is that Paris apparently has more CPU headroom; with Frankfurt I often get measurements that are clearly CPU-bound. Paris comes at a latency price of +10 ms, which normally doesn’t worsen the results significantly.
What is the reason, I hear you asking? Well, actually there are two things Mediamarkt is paying for. First and foremost, they are loading WAY too many CSS and JS files (green and orange bars) at the beginning of the page. There are ~15 CSS and ~15 JS files placed in the HEAD section of the page.
As you can see in the picture above, it takes almost 3.5 seconds before all necessary CSS and JS files are received. To make things worse, the browser is already pulling images before all of the CSS and JS has even been requested. The “final” necessary object is actually number 44 on the wire…
They could reduce the Time to Render by quite a lot if they carefully concatenated these files, or inlined parts of them.
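A concatenation step can be as simple as a few lines in the build process. Here is a minimal sketch in Python; the file names are made up for illustration and are of course not Mediamarkt’s actual assets:

```python
from pathlib import Path

def concat_assets(files, bundle_path):
    """Join several text assets (CSS or JS) into one file: one request instead of many."""
    parts = []
    for f in files:
        # A small marker comment makes it easy to trace a rule back to its source file.
        parts.append(f"/* --- {f} --- */\n" + Path(f).read_text())
    Path(bundle_path).write_text("\n".join(parts))

# Hypothetical usage:
# concat_assets(["reset.css", "layout.css", "theme.css"], "bundle.css")
```

Instead of ~15 stylesheet requests, the browser then fetches a single bundle, which as a bonus also compresses better as one gzip stream.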
But as you recall, it is not only the Time to Render that is pretty bad; Visually Complete doesn’t look much better, either. The simple root cause, and the reason for the headline, is weight. Or rather: overweight. The page clocks in at 2.5 MByte! Simple math tells you that at 1.5 MBit/s it won’t take much less than 14 seconds just to download the page.
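The back-of-the-envelope calculation, assuming the test profile’s 1.5 MBit/s are decimal megabits (as throttling profiles usually are):

```python
# Raw transfer time for the page at the test bandwidth. No protocol overhead,
# no latency, no connection setup, so the real page load can only be slower.
page_weight_bytes = 2.5 * 1024 * 1024   # 2.5 MByte
bandwidth_bps = 1.5 * 1_000_000         # 1.5 MBit/s
seconds = page_weight_bytes * 8 / bandwidth_bps
print(round(seconds, 1))  # prints 14.0
```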
And, as you can see, it is the images. All of them are PNGs, apparently at true-color depth. Four of them come in at roughly 360 KB each, so just these four images already make up half the weight of the page. I converted them to JPG at quality setting 100 and then used JPEGMini to crunch them. As a result, each of them came out at 60 KB instead of 360 KB, reduced to 1/6th of its former size. For the quality degradation, look for yourself:
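What that alone would mean for the total page weight, using the sizes measured above:

```python
# Savings from recompressing the four hero images (360 KB PNG -> 60 KB JPG each).
png_kb, jpg_kb, count = 360, 60, 4
saved_kb = (png_kb - jpg_kb) * count
print(saved_kb)  # prints 1200: roughly 1.2 MByte, i.e. about half the page, gone
```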
Additionally, another flaw: they implemented their rotating banner in a rather bad way. These four HUGE images are loaded IN PARALLEL. Which means that even though you only see ONE of them at the beginning, all of them have to be pushed down the pipe simultaneously. So instead of loading the visible one fast and immediately, and lazy-loading the other ones as soon as the page completes rendering, they load them all at once. Which means that the VISIBLE one loads roughly 4 times slower than it could. Look at the video below to see how slowly that main banner progresses:
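In an idealized model where the four equally sized images share the bandwidth evenly, the “roughly 4 times slower” falls straight out of the numbers:

```python
# Time until the FIRST (visible) banner image is fully downloaded, in an
# idealized model where parallel downloads split the bandwidth evenly.
def time_to_visible_image(n_images, image_kb, bandwidth_kbps, parallel):
    one_image = image_kb * 8 / bandwidth_kbps  # seconds for one image alone
    # In parallel, the visible image only finishes when ALL of them do.
    return one_image * n_images if parallel else one_image

# Four 360 KB banners on the 1500 kbit/s test line:
print(time_to_visible_image(4, 360, 1500, parallel=True))   # prints 7.68
print(time_to_visible_image(4, 360, 1500, parallel=False))  # prints 1.92
```

Lazy-loading the three hidden slides after onload would therefore get the visible banner on screen in roughly a quarter of the time, essentially for free.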
But it doesn’t end here… BELOW THE FOLD, they have another banner rotation with 5 images at 30 KB each, resulting in an additional 150 KB of page weight. Plus another 100 KB image which actually isn’t visible AT ALL.
So, to sum it up: I assume the page weight could be brought below 1 MByte, and with proper lazy loading, Visually Complete could be reached at around 800 KB.
Three other minor things stood out, but they aren’t as problematic as the issues mentioned above:
- Caching is rather on the short side. The good news is that almost every object DOES have a Cache-Control header. The bad news is that the time-to-live is rather short: 1 hour or 1 day, respectively.
- Cookies. Normally I would count this as a micro-optimization. Their cookies, though, are actually THAT big (they don’t fit in a single frame) and are sent even with static objects, so they really should have a look at that.
- They are serving 1 CSS file and 1 image from a different domain, owned by them. No cookies there and no extra functionality. IMHO, they could put these two on the main domain and save a DNS lookup as well as 2 TCP handshakes (this specific domain sends a Connection: Close header).
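As for the caching point: extending the TTL for static objects is usually a one-liner in the server config. A sketch for nginx, where the directives are standard nginx but the file extensions and the 30-day lifetime are my assumptions, not Mediamarkt’s actual setup:

```nginx
# Hypothetical rule: cache static assets for 30 days instead of 1 hour / 1 day.
location ~* \.(css|js|png|jpg|gif)$ {
    expires 30d;   # emits both Expires and Cache-Control: max-age=2592000
}
```

Long TTLs obviously require versioned asset URLs (e.g. a hash or revision number in the file name) so a deployment can bust the cache.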
To conclude: even though Mediamarkt achieves a decent PageSpeed score of 81, we shouldn’t jump to the conclusion that the user experience is mirrored by that score. That would be an invalid causation, as Catchpoint has excellently written about.
From a performance point of view (and therefore a user experience point of view), if they keep the site the way it currently is, it will be quite an uphill battle for Mediamarkt to fight.