Let's start with the BBC site. The front page now weighs 3.4 MB and takes 1.7 seconds to load on my modest but not underpowered machine.
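If you want to sanity-check numbers like these yourself, here is a minimal sketch in Python. Note the big caveat: this fetches only the initial HTML document, while most of a multi-megabyte front page is the images, scripts, and trackers loaded afterwards, so browser dev tools or Lighthouse are needed for the full picture. The function name is mine, not from any library.

```python
import time
import urllib.request

def page_weight(url: str) -> tuple[int, float]:
    """Fetch a URL and return (bytes transferred, seconds elapsed).

    Caveat: this measures only the initial HTML response, not the
    images, scripts and trackers that make up most of a modern
    front page's multi-megabyte total.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return len(body), time.perf_counter() - start
```

Point it at a front page, e.g. `page_weight("https://www.bbc.com")`, and compare the HTML-only figure with what your browser's network tab reports for the whole page.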
The BBC is meant to be a global information radiator, but clearly only if you have a decent phone. God knows how people in poorer countries are meant to get anything from the site on a feature phone. You don't need a 50-foot antenna to pick up the World Service.
Now to CNN: 7.3 MB and 4 seconds to load, on a machine with 20 GB of RAM.
Fox News goes even further into madness: 8.6 MB and a crazy 6.6-second load time. Honestly, who is advising these companies?
The Guardian is slightly better at 5.8 MB and 3.8 seconds. Special mention goes to the 23 trackers on the site. Honestly, is that really necessary? How many times do you need to track the user?
If you look at Google PageSpeed Insights, or the Lighthouse scores Google uses to test how a site performs when loading on the client machine, the BBC scores an abysmal 15 out of 100. Fox News gets 19, CNN 15, and the Guardian 22.
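Anyone can reproduce these scores from the command line with the open-source Lighthouse CLI (the same engine behind PageSpeed Insights); the exact scores will vary a little run to run with network and machine conditions. A sketch, assuming Node.js and a local Chrome install:

```shell
# Install the Lighthouse CLI once (requires Node.js):
npm install -g lighthouse

# Audit just the performance category in headless Chrome,
# writing the full report to a JSON file:
lighthouse https://www.bbc.com \
  --only-categories=performance \
  --output=json \
  --output-path=./bbc.json \
  --chrome-flags="--headless"
```

The 0-100 performance score is in the JSON under `categories.performance.score` (as a 0-1 fraction).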
As these tests feed into search rankings, any mortal who got scores like that would absolutely struggle to place in Google searches. Yet these sites are allowed to hoard search terms despite being woeful at building quality sites.
If Google are going to apply this scoring system, they should apply it across the board. It's only one part of search placement, but Google stresses over and over how important it is; yet apparently, once you reach a certain size, you just walk past it.
If I have to score 80-plus at a minimum, then they should have to as well. As it stands, these sites can publish anything and it will instantly be top-ranked for multiple search terms.
The thing is, these sites could be built better. Even with multiple images and rich visual elements, even the tracking tech, they can be built lean. I have no doubt that any one of these sites could be rebuilt to use only 20% of the data and render 2 seconds faster.