Moz to update Domain Authority metric

The popular SEO toolset Moz has announced that it will be updating and upgrading its Domain Authority metric.

Domain Authority is a score that Moz assigns to a domain, taking into account how valuable and ‘authoritative’ it is, and how well it might rank on search engine results pages.

The problem with this metric, and with those supplied by other tools such as Ahrefs, is that the score out of 100 can be influenced by the types of links Google deems spammy – particularly paid links.

The new update from Moz is therefore geared towards weeding out more of the untrustworthy signals so that their Domain Authority score is more in line with how Google – and other search engines – may view a domain.

In a blog post announcing the update, Moz said:

Domain Authority has become the industry standard for measuring the strength of a domain relative to ranking. We recognize that stability plays an important role in making Domain Authority valuable to our customers, so we wanted to make sure that the new Domain Authority brought meaningful changes to the table.

As search engines don’t use or publish domain scores such as Domain Authority, the metrics used by SEO tool providers can cause confusion for SEOs and online business owners.

This has become an even bigger problem since Google stopped showing its PageRank scores, which has led many SEOs to treat Domain Authority as the de facto metric by which to judge a website’s authority and ranking potential.

Making the score as accurate as possible is therefore a key requirement for tools such as Moz Pro.

Russ Jones, Moz’s Principal Search Scientist (great job title!) explained in his blog post how Moz went about making technical changes to come up with a more trustworthy Domain Authority score. He said: “We can remove spam, improve correlations, and, most importantly, update Domain Authority relative to all the changes that Google makes.”

Here’s a quick rundown of what has been changed:

  • Training set: Domain Authority is now better than it was in the past at understanding sites that don’t rank for any keywords at all (i.e. dodgy/spammy websites).
  • Training algorithm: Rather than relying on a complex linear model, Moz switched to a neural network. This offers several benefits, including a much more nuanced model that can detect link manipulation (see the sketch after this list).
  • Model factors: Domain Authority no longer looks at link counts alone; Moz has added Spam Score and complex distributions of links based on quality and traffic, along with a host of other factors.
  • Index: Moz now has an index of 35 trillion links.
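
Moz hasn’t published the model’s internals, so purely to illustrate the linear-model-to-neural-network shift described above, here’s a toy sketch in Python. The features, data and model shape are all invented for illustration – this is not Moz’s actual model or training data:

```python
# Toy illustration of the linear-model-to-neural-network shift described
# above. The features, data, and model shape are all invented for
# illustration; this is not Moz's actual model or training data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical per-domain features: linking root domains, total links,
# spam score, and share of links from high-quality/high-traffic pages
X = rng.random((500, 4))

# Hypothetical target: how well the domain actually ranks. Note the
# interaction: lots of links are worth little when the spam score is high.
y = (0.6 * X[:, 0] + 0.4 * X[:, 3]) * (1 - X[:, 2]) + rng.normal(0, 0.05, 500)

linear = LinearRegression().fit(X, y)
neural = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                      random_state=0).fit(X, y)

print("linear R^2:", round(linear.score(X, y), 3))
print("neural R^2:", round(neural.score(X, y), 3))
```

On data like this, the neural network should score noticeably higher, because the multiplicative spam-score interaction is exactly the kind of pattern a linear model cannot capture – which is the nuance (and link-manipulation detection) Moz is pointing to.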

Best URL shortener tools (2019 update)

You don’t want to share long, ugly links across social media – especially URLs that contain a lot of tracking parameters – so you need a good link shortening tool to help your shared URLs look clean and professional.

The best URL shorteners also come with a lot of useful added features – particularly extra reporting data, so you know when your shortened link has been clicked and on which platform. Some even allow you to create your own branded short URLs.
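
Most of the big shorteners also expose a simple HTTP API, so you can shorten links programmatically rather than through the web interface. As a rough sketch (not an endorsement of any one tool), here’s what a call to Bitly’s v4 shorten endpoint looks like in Python – the access token is a placeholder, and you should check Bitly’s current API docs for the exact request and response shape:

```python
# Minimal sketch of shortening a URL programmatically via Bitly's v4 API.
# The access token is a placeholder - check Bitly's current API docs for
# the exact endpoint, auth, and response fields before relying on this.
import requests

BITLY_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder, not a real token

def shorten(long_url: str) -> str:
    resp = requests.post(
        "https://api-ssl.bitly.com/v4/shorten",
        headers={"Authorization": f"Bearer {BITLY_TOKEN}"},
        json={"long_url": long_url},
    )
    resp.raise_for_status()
    return resp.json()["link"]

# Long tracked URL goes in, clean short link comes out
print(shorten("https://example.com/blog/post?utm_source=newsletter&utm_medium=email"))
```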

It’s worth noting that most URL shorteners offer a free service for simply shortening URLs, with their extra features requiring a paid subscription. It’s therefore important to check which of the tools below offers the best service for your needs before you decide to pay for any of them.

So if you are unhappy with your current tool of choice, or you’re looking to get started with a link shortening tool, check out our definitive list of the best URL shorteners below…

A complete round-up of the best BrightonSEO round-ups

Post updated: 3 October 2018

If you missed BrightonSEO last week (Friday 28 September), or if you were there and want a handy list of all the most useful presentations, then you’ve come to the right place.

Below we’ve put all the best BrightonSEO round-ups in one place, and we’ll be updating this list as more are published.

So let’s get stuck in…

Screaming Frog announces SEO Spider 10.0 update

The popular crawling tool has been updated, and now boasts a host of new features, including scheduled crawls, XML Sitemap crawl integration and an Internal Link Score metric.

The Screaming Frog SEO Spider – one of the most popular technical SEO tools around – has been updated to version 10.0, introducing a range of new features.

See a quick run-down of the new features below, and read our Screaming Frog SEO Spider review for more information.
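
Screaming Frog doesn’t publish the exact formula behind Internal Link Score, but it is broadly a relative measure of how well each URL is linked to internally, and a PageRank-style calculation over the internal link graph is one common way to compute such a score. Here’s a rough sketch of the idea in Python, using the networkx library and invented URLs:

```python
# Rough sketch of a PageRank-style internal link score - one common way
# to measure how well each page is linked to internally. Screaming Frog's
# exact formula isn't published; the URLs here are invented.
import networkx as nx

# Directed edges: (linking page, linked-to page)
internal_links = [
    ("/", "/blog"),
    ("/", "/products"),
    ("/blog", "/blog/post-1"),
    ("/blog", "/blog/post-2"),
    ("/blog/post-1", "/products"),
    ("/blog/post-2", "/products"),
]

graph = nx.DiGraph(internal_links)
scores = nx.pagerank(graph, alpha=0.85)

# Rescale so the best-linked page scores 100
top = max(scores.values())
for url, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{url}: {round(100 * score / top)}")
```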

Google could view charity ‘sponsor’ links as spam if you do it ‘systematically’

Google’s John Mueller responded to a question during a webmaster hangout and revealed that links from sponsoring charities could lead to penalisation.

Sponsoring charities in return for relevant and authoritative ‘follow’ links has long been a tactic employed by SEOs, but it’s a strategy that could lead to penalties from Google if the practice is abused.

Speaking in a recent Google Webmaster Hangout, John Mueller from Google suggested that – while some sponsor links from charities are fine – there is the risk of receiving a ranking penalty from the search engine if there is a pattern of systematically abusing charity links for SEO gain.

Google’s new Search Console is out of Beta

After a year of testing, the replacement for Webmaster Tools has graduated out of beta, and now features a new Manual Actions report and a ‘Live Test’ feature in the URL Inspection tool.

Google has announced that their Search Console tool is now out of beta – after more than a year of testing and gradual roll-outs to webmasters.

As well as moving the tool out of beta, Google has also added a new Manual Actions report and the ability to check the status of URLs in real-time with the URL Inspection feature.

The first time you log in to Search Console after this update, you will be greeted with the message below, which takes you through a quick tutorial of all the features in the tool:

[Screenshot: the new Search Console welcome message and feature tutorial]

Source: https://webmasters.googleblog.com/2018/09/the-new-search-console-is-graduating.html

How to recover from the Google Medic update – and how to prepare for any future E-A-T based algorithm updates

Google released a major update on 1 August 2018, which caused major fluctuations in search rankings.

Although Google called the update a ‘broad core algorithm update’ (which means it affects all search results), SEOs quickly noticed that sites in the health/medical/well-being sector were hit the hardest – which led to Barry Schwartz coining the term ‘Medic Update’.

What link data should you use when disavowing links?

With so many different sources of link data out there, how do we know what data to use to create the most comprehensive and effective disavow list?

Whether you have suffered a link-based penalty, or noticed a lot of ‘dodgy’ links in your link profile, submitting a link disavow file to Google is often the first step towards recovering or protecting your rankings.

However, with so many link building tools out there, all supplying slightly different link profile data – which one do you use to compile your disavow list?
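
Whichever sources you settle on, the output format is the same: Google’s disavow file is a plain text file with one URL or domain: rule per line, and # marking comment lines. Here’s a minimal sketch of merging link exports from several tools into one deduplicated, domain-level disavow file – the export file names are hypothetical, and disavowing whole domains rather than individual URLs is a judgment call:

```python
# Minimal sketch of merging link exports from several tools into one
# deduplicated, domain-level disavow file. The export file names are
# hypothetical, and each export is assumed to contain one full URL per
# line; disavowing whole domains (vs. single URLs) is a judgment call.
from urllib.parse import urlparse

exports = ["ahrefs_links.txt", "moz_links.txt", "search_console_links.txt"]

domains = set()
for path in exports:
    with open(path) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            host = urlparse(url).netloc.removeprefix("www.")
            if host:
                domains.add(host)

# Google's disavow format: one URL or "domain:" rule per line,
# with "#" marking comment lines
with open("disavow.txt", "w") as out:
    out.write("# Combined disavow list\n")
    for domain in sorted(domains):
        out.write(f"domain:{domain}\n")
```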

Google Search Console adds Links and Mobile Usability reports

The beta version of the new Search Console tool has had new features imported from the old Webmaster Tools.

The new features added to Search Console this week include the Links Report and Mobile Usability Report.

The Links Report is now a combined internal and external links report, rather than the separate reports formerly used in Webmaster Tools. Google says the new Links Report is more accurate, so you may see fewer links reported in the new version.