SEO for Web Developers
Most articles on SEO are geared towards marketers and are light on the technical details useful to web developers, even though the technical side of a website can have a huge impact on its search engine rankings. Typical SEO advice will mostly tell you to create good and useful content, set up social media pages, link those pages to each other, and choose the right keywords for your website. Those things are only half of SEO, and as a web developer you probably won’t have much control over them, but there are still many ways to write your code so that it improves your website’s search rankings.
Habits you should follow
- Keep SEO in mind from day one. You don’t have to spend a lot of time thinking about the best way to optimize your search engine rankings; just keep it in the back of your mind and try to write search-engine-friendly code.
- All your images should have an alt attribute.
- All your SVGs should have a title and a desc tag.
- Try to avoid having duplicate copies of the exact same image (multiple sizes are fine; the problem is having, say, three identical 300x300 px copies). If the same image is duplicated in different places, you’ll be splitting that image’s ranking score between the copies.
- Words in URLs should be separated by hyphens. Search engines read a hyphen as a space.
- The URL itself is very important for SEO. If you run a store at bears-beets-and-battlestar-galactica.com where you sell bears, beets and Battlestar Galactica, each on its own page, don’t use page names like bears-beets-and-battlestar-galactica.com/inventory-13254. Use page names like bears-beets-and-battlestar-galactica.com/bears, bears-beets-and-battlestar-galactica.com/beets and bears-beets-and-battlestar-galactica.com/battlestar-galactica.
- Shorter domain names are better in general; supposedly every letter beyond seven in a domain name correlates with a decrease in visitors. Don’t sacrifice a meaningful, easy-to-spell name for something shorter, though. The domain twitter.com is much better than twtr.com.
- Don’t spam keywords. Google can tell when you’re spamming keywords into your website. Use them where it makes sense.
- Your links should be on the text that describes the linked content. If you have text that says “I wrote an article about the nutritional benefits of beets” and you want to link the article from this text, attach the link to “the nutritional benefits of beets” rather than “an article”.
- Your image names should describe the image and the content on the page. If you have a page about fire safety with an image of Smokey the Bear, instead of naming the image bear.png, use smokey-the-bear.png, or even better smokey-the-fire-safety-bear.png.
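The last two habits (hyphenated URLs and descriptive file names) both come down to generating good slugs. Here is a minimal sketch of a slug helper; the `slugify` name and the exact regexes are my own illustration, not from any particular library:

```javascript
// Turn a human-readable name into a hyphen-separated slug.
// Assumes simple ASCII names; real sites may also need to transliterate accents.
function slugify(name) {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics into one hyphen
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

console.log(slugify("Battlestar Galactica"));        // battlestar-galactica
console.log(slugify("Smokey the Fire Safety Bear")); // smokey-the-fire-safety-bear
```

The same helper works for image file names: `slugify("Smokey the Bear") + ".png"` gives smokey-the-bear.png.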
Add these tags to the head of each page
(React Helmet makes this easy for React developers)
<link rel="canonical" href="your-website.com/the-current-page" />
<title>Your web page title</title>
<meta name="description" content="Description of this web page" >
(A lot of SEO experts will tell you the meta description doesn’t matter; Google disagrees.)
- Meta properties that cause a title, description and image to appear when a link to this page is shared on social media. Note that Open Graph tags use the property attribute rather than name (example for Facebook):
<meta property="og:url" content="your-website.com/the-current-page" >
<meta property="og:title" content="Your web page title">
<meta property="og:description" content="Web page description">
<meta property="og:type" content="website">
<meta property="og:image" content="your-website.com/your-image.png">
Single Page Applications, Server Side Rendering
When Google crawls the web, it reads a site’s HTML much more often than it runs its JavaScript. After crawling your website’s HTML, it will probably come back a few weeks later to run the JavaScript (and a delay of a few weeks could be too much for you). If your website is mostly rendered by JavaScript, which it will be if you’re using a JavaScript framework like React, Vue, or Angular, then you will want some kind of server-side rendering.
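To see why server-side rendering matters for crawlers, here is a minimal sketch of the idea: the server responds with finished HTML, so a crawler gets your content without ever running your JavaScript. The renderPage helper and all names in it are hypothetical; a real React app would build bodyHtml with something like ReactDOMServer’s renderToString.

```javascript
// Hypothetical sketch: assemble a complete HTML document on the server.
// A crawler that only reads HTML still sees the title, description and content.
function renderPage({ title, description, bodyHtml }) {
  return [
    "<!DOCTYPE html>",
    "<html>",
    `<head><title>${title}</title>`,
    `<meta name="description" content="${description}"></head>`,
    `<body><div id="root">${bodyHtml}</div></body>`,
    "</html>",
  ].join("\n");
}

const html = renderPage({
  title: "Beets",
  description: "Everything about beets",
  bodyHtml: "<h1>Beets</h1><p>Nutritional benefits of beets.</p>",
});
console.log(html.includes("<h1>Beets</h1>")); // true: the content is in the HTML itself
```

Contrast this with a client-rendered app, where the server would send an empty `<div id="root">` and a script tag, and the crawler would have to execute the bundle to see anything.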
Lighthouse
Lighthouse is a tool that measures your website’s load speed, accessibility, SEO and adherence to best practices. Google gives higher search rankings to websites that achieve a high Lighthouse score. Google also spends only a limited amount of time crawling your page; after that time limit it gives up and moves on, so your page needs fast load times for Google to get everything.
Things that help your lighthouse score
- Preact.js
- Adding an aria-label to all buttons and links that are not perfectly described by their visible text. An example would be an image used as a link: the text on the image can’t be read, so add an aria-label to tell Google and assistive software what the link/button does.
- Properly sizing your image files so the images are no larger than they appear on your website. Sharp is a useful tool for this.
- Use SVGs for icons. If an icon is under 10 KB as an SVG, use the SVG instead of a PNG or JPG.
- JPGs are generally smaller than PNGs for photographic images; prefer JPG for photos and PNG for graphics that need transparency.
- I’ve listed a number of plugins below that can also help your lighthouse score
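If you go the Preact route in a plain webpack setup, the usual approach is to alias React’s package names to preact/compat (Preact’s React compatibility layer) so existing React code keeps working. A sketch of the relevant webpack config fragment, with the rest of the config omitted:

```javascript
// webpack.config.js fragment: serve Preact wherever code imports React.
module.exports = {
  resolve: {
    alias: {
      react: "preact/compat",
      "react-dom": "preact/compat",
    },
  },
};
```

Preact’s runtime is a few kilobytes, so this can cut a meaningful chunk out of your bundle and help the Lighthouse performance score.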
Gatsby and Next.js
If you’re a React developer, there are tools out there like Next.js for dynamic websites and Gatsby for static websites, which can help you get perfect Lighthouse scores without much effort. If you’re using Gatsby and still struggling, here’s a Stack Overflow answer that might help you increase the Lighthouse score of your Gatsby website.
Webpack Plugins
Here are some useful plugins to include in your webpack configuration that will help boost your pagespeed and lighthouse score
- mini-css-extract-plugin
- webpack-pwa-manifest
- workbox-webpack-plugin
- copy-webpack-plugin (just to copy your robots.txt into the build output)
Here’s the code to include them in your webpack config. (Gatsby and Next.js do some of these things automatically, so you don’t have to set them up.)
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const WebpackPwaManifest = require('webpack-pwa-manifest');
const CopyPlugin = require('copy-webpack-plugin');
const { GenerateSW } = require('workbox-webpack-plugin');
...
plugins: [
  ...
  new MiniCssExtractPlugin({
    filename: "[name].css",
    chunkFilename: "[name].css"
  }),
  new WebpackPwaManifest({
    icons: [
      {
        src: path.resolve("./images/icon-192.png"),
        sizes: [192],
        type: "image/png",
        purpose: "any maskable"
      },
      {
        src: path.resolve("./images/icon-512.png"),
        sizes: [512],
        type: "image/png"
      }
    ],
    name: "Website Name",
    short_name: "Name",
    orientation: "portrait",
    start_url: "/",
    theme_color: "#049870", // Use your website’s color
    background_color: "#ffffff",
    description: "Description of your website",
    display: "standalone",
    prefer_related_applications: false
  }),
  new GenerateSW({
    // option: 'value'
  }),
  new CopyPlugin({
    patterns: [{ from: "robots.txt", to: "robots.txt" }]
  })
  ...
]
To use MiniCssExtractPlugin, make sure you include this rule in the rules section of your webpack config:
{
test: /\.(scss|css)$/,
use: [
MiniCssExtractPlugin.loader,
{
loader: "css-loader",
options: {
importLoaders: 1,
}
},
//"postcss-loader", //If you use postcss
//"sass-loader" //If you use sass
]
}
And you’ll of course need a robots.txt, such as this one, which blocks the Yandex and Baidu search engines:
User-agent: *
Allow: /

User-agent: Yandex
Disallow: /

User-agent: Baiduspider
Disallow: /
Other Useful Node Modules
- Purgecss — Removes unused CSS (which you will have a lot of if you use a CSS framework like Bootstrap, Bulma, etc.)
- Sharp — Creates web-optimized images that are sized correctly, and can create multiple images for different screen sizes. Runs on a back-end Node server.
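As an example of wiring PurgeCSS in, here is a sketch of a postcss.config.js using the @fullhuman/postcss-purgecss plugin; the content globs are assumptions and should point at your own templates and scripts:

```javascript
// postcss.config.js sketch: remove CSS rules whose selectors never appear
// in the scanned files, e.g. the bulk of an unused Bootstrap or Bulma build.
module.exports = {
  plugins: [
    require("@fullhuman/postcss-purgecss")({
      content: ["./src/**/*.html", "./src/**/*.js"],
    }),
  ],
};
```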
More SEO Resources
Good online resources on SEO can be few and far between, but these two are useful:
- Google SEO Starter Guide
- The Complete SEO Checklist For 2021 by Backlinko (More of the non-tech SEO but still a good resource)