
Starting technical SEO

How do you get started with technical SEO?

Technical SEO involves improving a website's non-content parts: the background elements that users never see directly but that still shape their experience. It is concerned with how the site works and with the protocols search engines use to crawl and index pages. Think of it as a website's foundation.

Technical SEO can get complicated, but anyone running a site should know its basic principles; a developer can then work on the different technical aspects of the website.

1. Do a monthly SEO health check:

Search Console allows a developer to perform basic SEO checks without third-party tools. You can spot major issues such as a sudden traffic drop, or see whether Google has penalized the site in some manner. Search Console emails you about major issues, but proactive checks are better. The information it provides can greatly improve a site's SEO: review the XML sitemap and robots.txt file, look for new broken links, and a five-minute monthly check will keep the site in shape.
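
The basics of that check can even be scripted. Here is a minimal Python sketch, assuming the requests library and using a placeholder domain, that simply confirms the robots.txt file and XML sitemap still respond:

    import requests

    SITE = "https://exampledomain.uk"  # placeholder domain

    def check(path):
        # Fetch one URL and report its HTTP status code.
        url = SITE + path
        try:
            print(url, "->", requests.get(url, timeout=10).status_code)
        except requests.RequestException as exc:
            print(url, "-> failed:", exc)

    # Two files worth confirming in every health check.
    check("/robots.txt")
    check("/sitemap.xml")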

2. Have a really fast site:

We all get frustrated when a website takes too long to load. Slow sites perform poorly and generally do not get traffic. Run a speed test on the site with Pingdom's speed test or Google's PageSpeed Insights to get recommendations for improving the site's performance. If the hosting service turns out to be the root of the problem, you may have to review it.
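
PageSpeed Insights also has a public API, so the speed test can be run from a script. A rough Python sketch, assuming you have an API key (the key and test URL below are placeholders):

    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {
        "url": "https://exampledomain.uk/",  # page to test (placeholder)
        "key": "YOUR_API_KEY",               # placeholder API key
    }

    data = requests.get(API, params=params, timeout=60).json()
    # Lighthouse reports the performance score on a 0-1 scale.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print("Performance score:", round(score * 100))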

3. Is your site mobile friendly?

Nowadays most internet searches are performed on mobile devices, so it is crucial to make a website mobile friendly. Google favors sites that are optimized for mobile, and phone users are shown separate search results.

How can we know whether a website is mobile friendly? Tools such as Google's Mobile-Friendly Test can tell us. Run a test, then make whatever changes are needed to get the site mobile friendly.
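
Google exposes the same mobile-friendly check through a Search Console API, so it can be automated. A minimal Python sketch, assuming an API key (the key and test URL are placeholders):

    import requests

    ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
                "urlTestingTools/mobileFriendlyTest:run")

    response = requests.post(
        ENDPOINT,
        params={"key": "YOUR_API_KEY"},             # placeholder API key
        json={"url": "https://exampledomain.uk/"},  # page to test (placeholder)
        timeout=60,
    )
    # Returns e.g. "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY".
    print(response.json().get("mobileFriendliness"))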

4. See how your site looks to Google:

We can check how Google crawls the content on our website in two ways.

First, we can put a page into BROWSEO. The tool works like a stripped-down browser, showing all of a page's content and tags without the design elements. If some content is missing there, chances are search engines cannot read it either.
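
The same kind of design-free view can be approximated with a short script. A Python sketch, assuming the requests and beautifulsoup4 packages and a placeholder URL, that prints roughly what a search engine has to work with:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://exampledomain.uk/", timeout=10).text  # placeholder URL
    soup = BeautifulSoup(html, "html.parser")

    # The title and meta description, as a crawler reads them.
    print("title:", soup.title.get_text(strip=True) if soup.title else "(none)")
    meta = soup.find("meta", attrs={"name": "description"})
    print("description:", meta.get("content") if meta else "(none)")

    # The heading structure, stripped of all design elements.
    for tag in soup.find_all(["h1", "h2", "h3"]):
        print(tag.name, "-", tag.get_text(strip=True))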

The second option is Search Console. By putting a URL into the Fetch and Render tool, we can find out what Google makes of the mobile or desktop version of a page: how Google sees it and how a browser displays it to users.

5. Fix broken links:

A website adds and removes pages over time. There is no harm in removing a page, but it is a problem if a visitor lands on a dead page while browsing. Worse, a removed page loses the authority a search engine had assigned to it.

A website must be checked regularly for broken links and pages. We can use a crawler or Search Console to identify the errors. Once they are identified, edit the internal links so they point to the correct pages, then deal with the old URL: if the page has a direct replacement or was important, redirect the old URL to its successor with a 301 redirect. That sounds complicated, but it is simple in many CMSs.
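
For a first pass, a very small crawler is enough. This Python sketch (placeholder start URL, assuming requests and beautifulsoup4) collects the links on one page and reports any that no longer resolve:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    PAGE = "https://exampledomain.uk/"  # placeholder start page

    soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
    links = {urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)}

    for link in sorted(links):
        try:
            # A HEAD request reads the status code without downloading the body.
            status = requests.head(link, timeout=10, allow_redirects=True).status_code
        except requests.RequestException:
            status = "error"
        if status != 200:
            print(status, link)

The redirect itself is usually a CMS setting; on an Apache server it can also be a single .htaccess line such as Redirect 301 /old-page/ https://exampledomain.uk/new-page/ (both paths hypothetical).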

6. Fix any duplicate content issues:

Duplicate content is another common issue in SEO: the same content can be reached on a site in multiple ways.

In some CMSs, the same content becomes accessible at several URLs if they are not set up properly. The problem is that search engines cannot choose the best version to rank. In most cases, though, duplicate content does not get a site penalized.

The real cost is diluted authority, because other sites may link to both versions of a page. Duplicate content has many causes, but it also has fixes. Find duplicate content by crawling the site and checking the indexed pages (using a site: search); canonical tags fix most cases.
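
For illustration, a canonical tag is a single line in the page's <head>; the href (a placeholder here) names the version search engines should treat as the primary one:

    <head>
      <!-- Every duplicate URL points search engines at this one address. -->
      <link rel="canonical" href="https://exampledomain.uk/page/" />
    </head>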

7. Upload an XML sitemap:

It makes things a whole lot easier if Bing and Google have a list of all the pages we want them to notice. An XML sitemap does exactly that: it is a file added to the website (usually at exampledomain.uk/sitemap.xml) that lists the site's URLs. Many CMSs create one automatically and give us the option to exclude certain pages.
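
For reference, the file itself is plain XML in the sitemaps.org format. A minimal example with two placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://exampledomain.uk/</loc>
      </url>
      <url>
        <loc>https://exampledomain.uk/about/</loc>
      </url>
    </urlset>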

Once the sitemap is live, we can submit it to Bing Webmaster Tools and Search Console. Regularly check how many of the submitted pages have been indexed, to spot problems with the pages early.

8. Check your robots.txt file:

Every website should have a robots.txt file. It lives in the root of the server (exampledomain.uk/robots.txt) and tells robots which parts of the site they are allowed to crawl. Take a look: is search engine access to certain pages accidentally blocked? Do certain sections of the site need to be kept off limits? Search Console has a tool for testing a site's robots.txt file to see whether it works as expected.
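
As a point of reference, robots.txt is a plain text format. A short hypothetical example that blocks one section for all crawlers and advertises the sitemap:

    # Rules for every crawler.
    User-agent: *
    # Keep the admin section out of the crawl.
    Disallow: /admin/

    # Point crawlers at the XML sitemap.
    Sitemap: https://exampledomain.uk/sitemap.xml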
