Here’s a bold statement: “SEO in the travel industry is immensely challenging.”
The sheer number of pages to manage, the complexities of properties, flights, accommodation, availability, occupancy and destinations, not to mention the crazy number of APIs and databases needed to make a travel site function, can all make life tricky for an SEO, particularly when it comes to the development queue…
Having said that, there are still common mistakes and missed opportunities out there that have the potential to be really impactful, and, believe it or not, they don't actually require a huge amount of resource to put right.
So, here’s a list of the six most common travel SEO mistakes to get right for 2019:
There are a LOT of facets and filters when it comes to commercial travel category pages, arguably the most of any industry.
Typically, with every facet or filter (availability, location, facilities, nearby amenities, occupancy and so on), a URL is created with the parameters selected by the user.
If not handled correctly, this can produce thousands of indexable pages that have no unique organic value to users.
This is a problem for a number of reasons: crawl budget gets wasted on near-duplicate pages, ranking signals are spread thinly across thousands of thin variants, and the category pages you actually want to rank can be diluted or outranked by their own parameterized copies.
Combined, this can cause big losses in rankings, traffic and subsequently conversion!
How to identify index bloat
Go to Search Console (formerly Google Webmaster Tools) and check your 'Index Coverage' report or, in the old version, check 'Index Status', and look for spikes or steady growth in 'Total Indexed' pages. If you notice something like the graph below and it's not expected, then there may be a problem:
If you find a big increase you can't explain, conduct some 'site:' operator searches and spot-check areas of your site where this may be commonplace to see what you can find.
Here’s an example of index bloat from the page speed tool ‘Pingdom’. It seems as though every input a user executes produces an indexable URL:
Once you’ve found a problem like this, review the extent of it with a Screaming Frog crawl. This way you can see how many URLs are affected and distinguish between whether they are actually indexable or not.
For example, there may be a few hundred pages that are indexable but have not yet been found and indexed by Google.
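If you export the crawl, a few lines of scripting can quantify the problem. Here's a minimal sketch assuming a Screaming Frog 'Internal: All' CSV export with 'Address' and 'Indexability' columns (column names vary by version, so adjust to your export):

```python
import csv
from urllib.parse import urlparse

def parameterized_indexable(crawl_csv):
    """Return indexable parameterized URLs from a crawl export.

    Assumes a CSV with 'Address' and 'Indexability' columns,
    as in Screaming Frog's 'Internal: All' export.
    """
    hits = []
    with open(crawl_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["Address"]
            # A non-empty query string means a faceted/parameterized URL
            if urlparse(url).query and row["Indexability"] == "Indexable":
                hits.append(url)
    return hits

# Demo with a tiny hypothetical export
with open("crawl.csv", "w", newline="", encoding="utf-8") as f:
    w = csv.writer(f)
    w.writerow(["Address", "Indexability"])
    w.writerow(["https://site.com/hotels?occupancy=2", "Indexable"])
    w.writerow(["https://site.com/hotels", "Indexable"])
    w.writerow(["https://site.com/hotels?sort=price", "Non-Indexable"])
print(parameterized_indexable("crawl.csv"))
```

Sorting the output by URL path quickly shows which facets generate the most bloat.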
How to fix index bloat:
- Add a meta robots 'noindex' tag to faceted URLs that have no unique organic value
- Canonicalize parameter variations to the core category page
- Use Search Console's URL parameter handling tool to tell Google how each parameter behaves
- Disallow crawling of low-value parameter combinations once they have dropped out of the index
If any of the above are difficult to get implemented in your dev queue and you don't trust yourself using the parameter handling tool, you can noindex pages and directories via your robots.txt file with the unofficial 'Noindex' directive.
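For illustration, the lines look like this (the paths are hypothetical; adapt them to the faceted areas of your own site):

```text
User-agent: Googlebot
Noindex: /search-results/
Noindex: /*?occupancy=
```

Because the directive is unofficial, check the affected URLs in Search Console after a few weeks to confirm they are actually dropping out of the index.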
This could save you a lot of time and is fully reversible, so it's less risky if you have control over your robots.txt file. If you've never heard of it, that's because it has never been officially documented; in practice it has been observed to work, but test it on your own site before relying on it.
It's pretty staggering, but in the UK there's a lot going on for travel in January: it is certainly the biggest spike of the year for many brands, followed by 'holiday blues' peaks after summer.
Here’s the trend of interest over time for the query ‘tenerife holidays’ (a destination famed for its good weather all year round) to show you what I mean:
January might be a bad time to experiment because of the higher interest, but the rest of the year presents a great opportunity to get creative with your titles.
Why would you?
Simply, keyword heavy titles don’t inspire high click-through rates.
Creative titles entice users into your landing pages, give your brand a personality and increase your click-through rate. This sends strong positive relevancy signals to Google, helping to demonstrate that your website is the best result for the user's query.
There are plenty of things you can try here with both supportive content and commercial landing pages, from question-led titles to seasonal hooks.
As previously mentioned, the travel industry experiences peaks and troughs in consumer behavior throughout the year, which causes the dominant intent to shift dramatically from month to month.
So, having a deep understanding of what users are actually looking for is really important when merchandising high traffic pages to get the best conversion out of your audience.
In short, gaining an understanding of what works when, is huge.
Here are some tips to help you make better merchandising decisions:
Often consumers are exposed to the same offers, destinations and visuals on key landing pages all year round which is such a missed opportunity.
We now live in a world of immediacy and those in the industry know the challenges of users cross-shopping between brands, even those who are brand loyal. This often means that if users can’t find what they are looking for quickly, they will bounce and find a site that serves them the content they are looking for.
For example, there’s an argument for promoting and focusing on media-based content, more so than product, later in the year, to cater to users that are in the ‘consideration’ part of the purchasing funnel.
Use number five on this list to pull even more clues to help inform merchandising.
I grant you, this is a tall order: travel advice, blogs and guides are a standalone business. But the opportunity for commercial travel sites to compete with the likes of TripAdvisor is massive.
Our recent Travel Sector Report estimates that opportunity at 232,057 monthly clicks across 22,040 keywords, with only Thomas Cook pushing into the top ten.
Commercial sites that don’t have a huge amount of authority might struggle to rank for informational queries because dedicated travel sites that aren’t directly commercial are usually deemed to provide better/unbiased content for users.
Having said that, you can see clearly from above that it IS possible!
So, here’s what you should do…
…focus on one thing and do it better than anyone else
Sounds pretty straightforward and you’re probably thinking ‘I’ve heard this before’ but, only a handful in the travel industry are actually doing this well.
Often you see the same information from one travel site to the next: average weather, flight times, the location of the country on a map, a little bit of fluff about the history of the destination, and then straight into accommodation.
This is fine, it’s useful, but it’s not outstanding.
Let’s take Thomas Cook as an example.
Thomas Cook has built a network of weather pages that provide live forecasts, annual overviews as well as unique insights into when is best to go to different destinations. It even has a tool to shop for holidays by the weather (something very important to Brits) called ‘Where’s Hot When?’
The content is relevant, useful, concise, complete, easy to use, contemporary in design and, most importantly, better than anyone else’s.
In short, Thomas Cook is nailing it.
They have focused on weather and haven't stopped until it's as good as it can be.
Why did they bother with weather? Well, weather accounts for approximately a third of all travel-related informational searches in the keyword set from our Travel Sector Report:
Apply Thomas Cook's methodology to something that is relevant to your audience: family attractions, adult-only tour guides, Michelin-starred eateries, international laws families should know about. The list is plentiful!
Find something, nail it.
There are some big travel sites out there that don't have an on-site search function, which is a huge missed opportunity. Travel sites are inherently difficult to navigate given the sheer volume of pages, and site search is quite often a great solution for users.
As well as this, it can give marketers some amazing insight into what users are looking for, not just generally in terms of the keywords users might be using but also the queries users are searching on a page by page level.
For example, you could drill down into the differences between queries searched on your homepage vs queries searched on specific landing pages to spot trends in behavior and fix the content gaps from these areas of the site.
You could also use the data to inform merchandising decisions to address number three on this list.
In doing this, users are actually telling you exactly what they are looking for, at what time, whether they are a repeat visitor or a new one and where they’ve come from to visit your site.
If you spend the time, this data is gold!
If you can’t get buy in for this, test the theory with an out of the box search function that plugs straight into your site like searchnode. Try it for six months, you might be surprised at how many users turn to it and you will get some really actionable data out of it.
It’s also super easy to track in Google Analytics and the reports are really straightforward:
1. Go to Admin
2. Click ‘View Settings’
3. Switch ‘Site search Tracking’ on
4. Enter the query parameter that appears in your site's search URLs before the search term, e.g. for WordPress this is usually the letter "s": www.travelsite.co.uk/?s=search-term
5. Click ‘save’, boom you’re done.
Let Google collect data, extract it monthly and dig, dig furiously!
Who doesn't love a witty 404 page? More and more often you'll find that when webmasters optimize a 404 error page, they make it lighthearted. Here's a great example from Broadway Travel:
There is a reason why webmasters aim for a giggle.
Think about it: when users hit a 404 error page, there is, by definition, a problem, which is a big inconvenience when you're minding your own business having a browse. Something that makes you laugh goes a long way towards keeping you unfrustrated.
Time to name names, and show you some 404 error pages that need some work…
TUI & Firstchoice
404 errors accumulate over time; it's totally normal.
It’s also normal to get traffic to your 404 error page. But it’s not just any old traffic, it’s traffic that you’ve worked hard to get hold of.
If, at this point, you're thinking, 'my site has recently been audited and internal links to 404 pages have been cleared up', think again. Users can misspell URLs, ancient external links can point to old pages, and the product team can make mistakes. As meticulous as you may be, please don't discount this one.
Losing quality users because of a bad 404 experience is an SEO’s idea of nails down a chalkboard.
Here are some tips to optimize your 404 pages:
Even if you think your 404 page is awesome, don't neglect the errors themselves when they pop up.
404’s are often the bane of an SEO’s life and you might think about ways to get out of keeping on top of them.
Sadly, there aren't any shortcuts…
…Bonus SEO mistake
Creating a global 301 redirect rule for every 404 page and directing them all to your homepage.
This is surprisingly common but is poor SEO practice for a number of reasons. Firstly, you won't be able to identify where users are having issues on your site when 404s pop up.
You may also be redirecting a page that originally had content totally irrelevant to your homepage. In that situation Google will likely override your redirect and classify it as a soft 404, not to mention discount the links that may have originally pointed to those URLs.
Save your users, build a 404 page!
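If your site runs on Apache, the difference comes down to one line of config. A hypothetical sketch (domain and filename are placeholders):

```text
# Anti-pattern: giving ErrorDocument a full URL makes Apache *redirect*
# missing pages to the homepage instead of returning a 404 status.
# ErrorDocument 404 https://www.travelsite.co.uk/

# Better: serve a helpful custom page with a genuine 404 status.
ErrorDocument 404 /custom-404.html
```

The second form keeps the 404 status visible in your logs and analytics, so you can still see where users are hitting problems.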
No site is perfect, and although it might appear as though we're pointing fingers, we want you to be able to overcome the challenges that come with SEO implementation. There's always a bigger priority, but keep your mind open and don't neglect the small stuff, and you'll stay ahead of the game.
The post Six most common travel SEO mistakes to get right in 2019 appeared first on Search Engine Watch.
The appearance of Google CEO Sundar Pichai in front of Congress yesterday has been eagerly anticipated by those of us following the company’s tumultuous year in the face of criticism from international press, human rights organizations, and its own staff.
The hearing – lasting more than three hours – was titled Transparency & Accountability: Examining Google and its Data Collection, Use and Filtering Practices and promised to give Pichai an opportunity to publicly clarify the search giant’s position on consumer rights in regards to privacy in an increasingly data-dependent world – as well as reflecting on its openness as a business in the political context at home and abroad.
Overall, the hearing was a bit of a mixed bag. This was less to do with the substance of Pichai’s answers and more to do with the flawed questioning from the assembled. Here are what I took away as good and bad responses from the Google chief, as well as a few of my frustrations.
Some of Pichai’s most substantial answers came when asked about issues of diversity, the wellbeing of ethnic minorities, and the rights of women.
He reiterated Google's commitment to diversity and noted that the business was the first to publish a transparency report on its diversity. He also pointed to efforts to combat the spread of white supremacist content on YouTube and made clear his, and the company's, zero-tolerance attitude to hate speech.
This was obviously comforting to the assembled congress men and women, particularly as the US saw hate crimes rise by 17% last year, and online media is argued to be contributing to the normalization of hate speech in the mainstream.
Pichai was also asked about forced arbitration within the business – a subject that came to the fore last month as staff in Google offices around the world staged a mass walkout ‘to protest sexual harassment, misconduct, lack of transparency, and a workplace that doesn’t work for everyone.’
His response was that the company has already enacted changes where forced arbitration for sexual harassment is concerned. This means that if employees want to bring sexual harassment charges against someone, they now have the right to do so outside of the internal arbitration structure of the business (via a class action lawsuit, for instance). Pichai also expressed a commitment to making changes beyond the realm of sexual harassment (ultimately removing forced arbitration altogether, I assume), giving more options and rights back to employees.
As we expected, a number of questions during the hearing were focused on the rumored development of a new search product for the Chinese market.
Pichai was initially quite firm that Google had no plans to launch a search product (currently referred to as Dragonfly) in China, but the more he was pressed on the subject, the more woolly that stance became. ‘We have undertaken internal effort,’ he said, adding later: ‘It’s our duty to explore possibilities to give users access to information.’
Answers here were less substantial than those given in a recent Q&A with Pichai at the Wired 25 Summit in October. But we can see that internal development on a tool for the Chinese market is ongoing – even in the face of calls from staff to shut it down. Pichai’s stance is that pursuing work to give access to information for consumers everywhere (including China) is the human right he, and Google, is focused on. He did commit to being transparent as this work continues, but I have my doubts.
Some of the other more difficult questions for Pichai concerned internal messages and discussions from Google staff regarding domestic politics.
He was pressed on potential bias when a staff member admitted in an email to the company helping to get the Latino vote out in key states and not others during the 2016 election. Pichai denied such activities happened – at least in terms of Google itself working to do this. Another congressman questioned whether it was right that there should be a forum for the Resist (anti Donald Trump) group on Google’s staff network. Pichai said he was not aware of the group.
As we’ve seen, some of the questions on bias at Google are justified – although they frequently simmer down to semantics of the language used by staff when discussing politics on company time and in staff forums. Who is ‘we’ in the case of getting the Latino vote out in key states? Is the content discussed in the Resist group too political?
There were also plenty of frustrating moments where congress members clearly lacked the capacity to understand how an algorithm that takes into account a vast number of signals (including freshness, how linked-to the content is, and previous individual search history) can sometimes deliver results that appear more or less conservative or liberal.
I felt sorry for Pichai as he spent several minutes assuring one congressman that while his search for Donald Trump gave mostly negative results, the algorithm itself is neutral and the best content for the search query (in terms of quality and relevance) just so happens to not present Trump in a positive light. A few moments later, a congressman (presumably on the other side of the fence) expressed his disgruntlement after a recent vanity search to find most top ranking sites running stories about him were from the right wing news press.
All too often congressmen were seen to bark at Pichai, “it’s a yes or no answer,” when it could never be. Some held their iPhones (not Android devices) aloft and expected Pichai to know whether any Google apps on it were saving location data. These instances managed to be both depressing and humorous, but they highlighted a number of dualities Google must contend with as it moves into 2019. As a business, it has to be at the forefront of technology dealing with the complex issue of the world’s information and data, while still making sure every day consumers can use it safely and successfully.
At the same time, Google must be neutral in what it delivers to consumers – while having a staff that is always likely to lean one way politically more than the other, and also striving to be progressive in how it operates.
And ultimately, it still has a mission to provide users, wherever they are, with the best search information. It is clear that in the case of China, some negotiation with a government known to operate surveillance has to happen. In the case of the US, the company has to answer to a political class whose binary thinking can, by comparison, seem very outdated.
The post Google’s Pichai answers to Congress: The good, the bad and the frustrating appeared first on Search Engine Watch.
Adhering to the whims and fancies of Google’s unpredictable nature can be a tasking ordeal for content creators around the world.
The bond between consumers and producers of content, once a seemingly harmonious relationship, has become somewhat volatile, requiring marketeers to be agile in their approach to content. As click-through rates and organic opportunity steadily drop, what are we required to do with our content to compete in the SERPs?
In this post, we'll reflect on how the recent 'zero results' update, the mobile web, featured snippets and Google's 'biases' can inform the next steps for 2019, ensuring a bright content-filled future for all.
Mobile web has been subject to immense change in recent times, and while the stats may prove daunting, the stakes have never been higher for producing valuable content on the platform.
In years gone by, Google has appeared to be giving preferential treatment to mobile over desktop. In 2015, the mobile-friendly update or ‘Mobilegeddon’ as it became informally known, paved the way for mobile-friendly content creators to rank higher. Since then, we’ve also seen the Mobile-first Index and a number of other activities aimed at bettering the consumer experience on mobile.
Fishkin makes this point not in an attempt to expose some form of conspiracy, but merely to draw attention to how the SERPs are changing and how those who produce content must do more to compete for clicks. In early 2016, almost 60% of searches produced clicks to an organic result on mobile, with an even higher percentage for desktop (65.56%). Since then, while desktop has remained much the same, the statistics for mobile paint a very different picture: organic clicks have dropped by just under 20%, and no-click searches are now in the majority.
This rise suggests that users are getting all they need from Google without having to leave the page they're on. The response, however, should not be to disregard the mobile web, but to prioritize it and find ways to optimize your work around the preferences Google shows for certain formats. This is less about seeing SEO as one big game and more about creating content for a certain type of searcher and making the most of the space available.
Snippet in the bud
Making data instantly accessible for users is part and parcel of Google’s service with regard to SERPs. Google’s remarkable comprehension of what its users want is symbolized by additional features such as the featured snippet that sits in ‘page rank zero’. Results from queries such as ‘Ryan Gosling height in feet’ give us little reason to leave page one on Google or even scroll down the page, as we can instantly find what we’re looking for.
Incredibly, Google extends this service beyond the height of random celebrities, giving immediate responses to searches about news, facts and even recipes. Optimizing for featured snippets allows your content to be scraped by Google and repurposed for the user, whether it’s a paragraph, list or table. Its presence in the SERPs appears to be more relevant than ever, with featured snippets and related questions appearing in 40% of queries according to HubSpot.
Producing content with this feature in mind forces you to treat Google as your customer: you give Google what it wants so that it can pass your content on to its users through its own services. This cycle becomes even more convoluted in the realm of link building, however. Content marketers such as us at Kaizen face the constant battle of relying on other sites to link to our content in a way that can be picked up by Google.
Fortunately, a critical feature of virtually all the content we produce is the integration of data into the work. Content that answers existing questions people are likely to ask is one of the few reliable ways to find yourself ranking in the featured snippets.
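As a sketch of what 'answering the question' can look like in markup (a hypothetical example, not a guaranteed formula; the airline names and figures are invented): a question as a heading, a concise answer, then a list or table that Google can lift wholesale.

```html
<h2>Which are the most reliable airlines?</h2>
<p>Based on punctuality data, these carriers cancelled or
   delayed the fewest flights:</p>
<ol>
  <li>Airline A: 1.2% of flights disrupted</li>
  <li>Airline B: 1.5% of flights disrupted</li>
  <li>Airline C: 2.1% of flights disrupted</li>
</ol>
```

Structuring the answer this cleanly makes it easy for Google to scrape and repurpose as a paragraph or list snippet.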
Two example campaigns for featured snippets
Below are two campaigns we launched last year which still currently appear in the featured snippets.
The first was a data study of seaside ‘Staycation’ towns in the UK for Credit Card checking service Marbles, whose coverage on House Beautiful appears in page rank zero.
The second was a study we carried out on the world's most and least reliable airlines for medical travel insurance provider Get Going. In addition to being covered by the likes of Yahoo and the Independent, it was the link on the Reader's Digest site that appeared in the featured snippets for the query 'most reliable airlines'.
Achieving visibility in the featured snippets through our content’s coverage is a handy by-product of content marketing, but serves to show the occupiable spaces in the SERPs for your brand or client, bolstering the credibility of their content through data.
In an Ahrefs study of the most frequently used terms in featured snippet queries, 'best' and 'vs' ranked second and third. While there may never be a set formula for appearing in the featured snippets, these findings imply Google leans towards data, or simply towards the content a searcher is looking for.
Answer the question
To conclude, the production of visual content that appeals to the nuances of Google’s scraping is not going to be possible for everyone given the varying nature of content, but it’s important to recognize where it’s possible.
Ultimately, your content needs to answer a question people are asking. What seems like such a rudimentary action for any marketeer is often lost in the wave of what seems like a ‘good idea’ at the time.
Content is made for many different reasons and for many different audiences, but Google's growing dominance demands that we consider it at every stage of production. Google will always endeavor to embody the user in everything it does, meaning it is our job to treat Google as one too. Then we'll all live happily ever after.
Nathan Abbott is Content Manager at Kaizen.
SEO moved beyond exact keyword matching long ago. These days, in order to rank, we need to create content that includes related concepts, satisfies intent and provides value.
With such an important and complicated task in front of us, there’s never such a thing as too many tools.
Every keyword tool below has something new to bring to the table when it comes to helping you understand the topic better, expand your keyword list and diversify your organic rankings:
1. TextOptimizer

TextOptimizer is probably the most interesting tool on the list. For any term you put in, it will look at the Google search results page, extract the search snippets and apply semantic analysis to generate a list of the related topics, terms and concepts that form your topic cluster.
For example, for [grow tomatoes] it will generate the list of the following terms:
If you already have a page that you want to rank for that query, the tool will compare your existing text to the snippets Google returns for that query. It will then score your text and recommend expanding your content to include some of those suggested terms:
The thing is, Google generates its search snippets based on which sentences from the ranked pages do the best job satisfying the query. This means that Google search snippets represent the best (in Google’s opinion) summary of the query topic.
By semantically analyzing those snippets and extracting related terms and topics from them, you will get a better understanding of what you need to include in your content.
It also shows subtopics and related questions (i.e. niche questions for each query you run) which helps you structure and format your content better.
Overall, I have found the tool extremely helpful for creating more in-depth content, as it does a good job of urging the writer to include a variety of related and neighboring terms (in order to increase your score).
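The core idea can be approximated in a few lines. This is a crude sketch, not TextOptimizer's actual algorithm: count the terms that recur across the snippets Google returns for a query, minus stopwords and the query itself (the snippets below are invented).

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "to", "in", "of", "and", "your", "for", "with"}

def related_terms(snippets, query, top_n=5):
    """Rank terms that co-occur with a query across search snippets.

    Terms recurring in many snippets for the same query are likely
    part of its topic cluster.
    """
    query_words = set(query.lower().split())
    counts = Counter()
    for snippet in snippets:
        words = re.findall(r"[a-z']+", snippet.lower())
        # Count each term once per snippet, skipping noise
        counts.update(w for w in set(words)
                      if w not in STOPWORDS and w not in query_words)
    return [term for term, _ in counts.most_common(top_n)]

# Hypothetical snippets for the query "grow tomatoes"
snippets = [
    "Plant tomato seedlings in full sun and water regularly.",
    "Water tomato plants deeply and add compost to the soil.",
    "Full sun, rich soil and regular water help tomatoes thrive.",
]
print(related_terms(snippets, "grow tomatoes"))
```

Real tools go much further (stemming, phrase extraction, semantic grouping), but even this crude frequency count surfaces the neighboring terms a comprehensive page should cover.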
2. Serpstat Clustering Tool
Serpstat Clustering Tool is another innovative tool that uses Google to better understand and analyze relevancy.
This tool should be used to make sense of your long keyword lists. Instead of simply word-matching, the tool analyzes Google SERPs for every single term in your list and groups them based on how many overlapping URLs each query triggers in Google.
The logic is simple: The more identical results two SERPs have, the more related the search queries are.
This way, instead of creating groups based on a common modifier, the tool forms groups based on each keyword's meaning, letting you discover keywords that have no words in common yet can (and should) be targeted with one piece of copy:
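The underlying logic is easy to prototype. A minimal sketch, assuming you have already scraped the top-ranking URLs for each keyword (the keywords, URLs and 0.3 threshold below are invented for illustration):

```python
def serp_overlap(urls_a, urls_b):
    """Jaccard similarity between two SERPs (sets of ranking URLs)."""
    a, b = set(urls_a), set(urls_b)
    return len(a & b) / len(a | b)

def cluster_keywords(serps, threshold=0.3):
    """Greedily group keywords whose SERPs share enough URLs.

    serps: dict mapping keyword -> list of top-ranking URLs.
    """
    clusters = []
    for kw, urls in serps.items():
        for cluster in clusters:
            # Compare against the cluster's first (seed) keyword
            if serp_overlap(urls, serps[cluster[0]]) >= threshold:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters

# Hypothetical SERP data: two queries with no words in common,
# yet near-identical results, plus one unrelated query
serps = {
    "cheap holidays": ["a.com", "b.com", "c.com"],
    "budget getaways": ["a.com", "b.com", "d.com"],
    "car hire": ["x.com", "y.com", "z.com"],
}
print(cluster_keywords(serps))
```

Here "cheap holidays" and "budget getaways" land in one cluster despite sharing no words, because their result sets overlap, which is exactly the relatedness signal the tool exploits.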
3. Spyfu Related Keywords
Spyfu has a separate tab listing keywords related to the one you put in. The nice thing about the tool is that it excludes phrases containing your core term.
You can play with helpful filters to see more popular or less competitive keywords.
Read more about Spyfu related keyword analysis here.
4. Google: Related Searches, Google Trends, Google Correlate
Google is kind enough to provide us with lots of useful data that can be used for content planning and optimization. Here are three Google tools that are useful for discovering related terms:
According to Google's documentation, Google Correlate finds search patterns that correspond with real-world trends: you upload a data series and it surfaces queries whose search activity follows a similar pattern. In our case, we don't have a data series, but the tool also works with keywords: simply put in your search term, and Google will calculate its trending pattern and show queries with matching patterns.
Mind that correlation does not necessarily equal causation, so you may come across some funny terms. Don’t be discouraged! Keep running the tool and put together a list of related terms that do match your topic.
My favorite thing about the tool (and why I use it) is that you can exclude your initial search term from the returned list. This prevents the tool from phrase-matching (which you already did during your traditional keyword research) and forces it to come up with genuinely related phrases instead:
Google Trends is a more straightforward tool: Simply type in your core term and scroll down to “Related queries”, i.e. “Users searching for your term also searched for these queries”.
The nicest thing about this tool is that it shows “Breakout” queries, i.e. queries that “had a tremendous increase, probably because these queries are new and had few (if any) prior searches.” These could be an opportunity for trending content!
Google’s “Searches related to”
Finally, Google's "Searches related to" section can give you some ideas on where to expand your core terms. Notice how Google helpfully shows the new terms it suggests in bold:
IMN Featured Snippet Tool collects those results and organizes them by (1) query they are triggered by and (2) popularity (i.e. based on how many queries trigger them):
Expand your keyword lists! This will help you create more in-depth content, diversify your rankings and generate exposure from other Google search result sections, like featured snippets and "People Also Ask."
The post Four tools to discover and optimize for related keywords appeared first on Search Engine Watch.
It’s a well-known fact that there are over 200 ranking signals used by Google. And every year it keeps on tweaking and refining its algorithm introducing new ranking signals and changing priorities.
I know that the idea of having to optimize for all of them will probably make you shiver with horror. The good news is that only a handful of ranking signals are genuinely must-optimize.
Please note: in light of mobile-first indexing, under which the mobile version of a site is indexed first, it's most important that mobile sites are optimized for the ranking signals listed below.
So, without further ado, here is the list of the most important ranking factors for you to dominate search in 2019.
I guess it’s more than obvious for any SEOs out there that Google is going nuts about getting into people’s heads and providing them with the most relevant search results. Now that we live in the age of semantic search, Google aims to figure out the meaning behind a certain search query to provide the most precise search results. Besides, Google also considers such factors as users’ search patterns, search history, location, and time.
Of course, when searching for something, users have certain intents in mind. And Google’s ultimate task is trying to figure them out in order to supply users with the most relevant search results on the top positions. Ranking-wise, the more relevant your page is to a certain query, the higher position it gets in the SERPs. What’s more, satisfying search intent almost always results in high CTR.
If you want to understand what search intents hide behind your keywords, experiment with various queries. After typing them into the search box, look at the first page of results and try to infer the search intent they satisfy. If some of your pages don't really match that intent, it may be a sign that they are not the right pages to optimize for those keywords. In that case, consider finding better-matching pages and adding more relevant content to them, or creating new pages that address the implied search intent.
CTR is often cited as one of the strongest relevance signals for Google. At the very least, CTR correlates with rankings: an increase in CTR frequently accompanies a ranking boost, though correlation is not proof of causation.
If you want to get an idea of what people click on in the SERPs to reach your site, use Google Search Console's Search Analytics report. Pay special attention to pages that rank high but have low CTR; it may be a red flag that your title tags or meta descriptions are not compelling enough and need work. To understand where you stand with your CTR, have a look at this summary of CTR data sorted by position in Google search.
If there’s anything I know for sure, rankings and content have always belonged together. Basically, your content is the very reason for people visiting your site. What’s more, Google has rolled out Panda and Fred updates aiming to make the web more helpful and beneficial content-wise. However, even well-written content pages are not always enough. With Google constantly raising its standards, your piece of content should also satisfy the below listed ranking factors.
In 2019, keywords in the title tag remain a powerful ranking signal, as the title is one of the ways Google decides whether your page is relevant to a given query. What's more, the closer your keywords are to the beginning of the title, the better. Your most important keywords should also appear in the page's body, alt texts, and H1 tag. Just make sure you're not overusing them; you don't want to be penalized for keyword stuffing.
Besides your main keywords, you should also optimize for related terms that accompany them. In case you haven't collected such keywords yet, here is some advice on how to nail keyword research these days.
As I've mentioned before, Google is intent on improving the quality of search. With Hummingbird, Google now prioritizes pages that match the meaning of the query rather than isolated keywords. That is why you should aim not just to fill your content with keywords but to make it as comprehensive as you can.
To optimize your content for comprehensiveness, consider TF-IDF analysis, which measures how frequently certain terms are used on your competitors' pages relative to how common they are across pages overall. This surfaces relevant terms and concepts used by your top-ranking competitors. Plenty of tools now include TF-IDF analysis; here is a nice guide on how to improve your content's comprehensiveness with the help of TF-IDF.
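As a rough illustration of what those tools compute, here is a minimal pure-Python TF-IDF sketch. The tokenization is naive whitespace splitting and the documents are toy competitor-page snippets, so treat it as a teaching aid rather than a production analyzer:

```python
import math
from collections import Counter

def tf_idf(documents):
    """Compute TF-IDF scores for each term in each document.

    documents: list of token lists (e.g. competitor page texts, pre-tokenized).
    Returns a list of {term: score} dicts, one per document.
    """
    n_docs = len(documents)
    # Document frequency: in how many documents each term appears
    df = Counter()
    for doc in documents:
        df.update(set(doc))

    scores = []
    for doc in documents:
        tf = Counter(doc)
        scores.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return scores

pages = [
    "cheap flights to rome book flights".split(),
    "rome hotels book rome city breaks".split(),
    "cheap city breaks and hotels".split(),
]
scores = tf_idf(pages)
# Terms concentrated in one page score highest for that page
top = max(scores[0], key=scores[0].get)
print(top)
```

Terms that every competitor uses score near zero, while terms distinctive to a page float to the top, which is exactly the signal a comprehensiveness audit looks for.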
Publishing mistake-free content is yet another signal to Google that content is of good quality. There’s not much to say there. Just make sure you proofread your piece of content before publishing it or use online grammar checkers like Grammarly.
By organizing your HTML markup in a clear way, you make it much easier for search engines to understand what your content is actually about. Yes, search engines still rely on HTML structure and its semantic markup. So, no matter how good your content is, if your page has messy HTML, picky search engine spiders may judge it to be of poor quality and down-rank it. Luckily, there is a whole variety of plugins (including WordPress ones) that can help with cleaning and optimizing your HTML.
To make your HTML even more structured, consider implementing schema markup; Google's Structured Data Markup Helper can lend a hand with that. Doing so helps search engines understand your content better and identify the most important information on your site, and it can make your snippets look more attractive. You can also preview your snippets with Google's Structured Data Testing Tool to make sure everything displays correctly.
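For instance, a basic schema.org Article snippet can be generated as JSON-LD and dropped into the page head. The headline, author, date, and URL below are placeholders, not values from any real page:

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Build a minimal schema.org Article snippet as JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }
    # Embed in a <script> tag ready to paste into the page <head>
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

snippet = article_jsonld(
    "14 ranking signals you need to optimize for",
    "Jane Doe", "2019-01-15", "https://example.com/ranking-signals",
)
print(snippet)
```

Whatever you generate, run it through the testing tool before shipping, since malformed structured data is simply ignored.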
Just as much as Google appreciates uniqueness it also penalizes sites with duplicate content. So, in order to improve your rankings and get Panda off your site, make sure it has no duplication issues. By the way, here’s a nice guide on how to spot and deal with various types of duplicate content. What’s more, you should also watch out for external duplication. So, if you suspect some pages on your site may have it, go ahead and check them with Copyscape.
If you work for one of those industries that simply cannot publish unique content every time (like online stores with many product pages), try to make your product descriptions as diverse as you can. Another good way to solve the problem is by utilizing user-generated content.
I guess it's no surprise that backlinks have ruled rankings for ages. They still remain the strongest indication of authority to Google, and it's safe to say that's hardly going to change in 2019. That is why quality link building should be a primary concern if you want to make it to the top. By the way, here are some powerful link building strategies to draw inspiration from.
Of course, one of the most effective tactics is to study your competitors' link profiles. One of my favorite tools for this is SEO SpyGlass. With its help, you can compare your link profile with your competitors' and see where your links intersect, gaining valuable insights into new link building strategies to arm yourself with.
Although Google values quality over quantity, the total number of backlinks is still a meaningful signal. Note that links coming from a single domain carry much less weight than links from many different domains. So, check the total backlinks and total linking domains figures in whatever SEO tool you use and see whether your link profile needs improvement quantity-wise.
No matter how many links you have, they need to be of good quality; otherwise, they'll most likely get you in trouble (Penguin is watching) rather than bring you good rankings. To maintain link quality, carry out regular backlink audits. Fortunately, plenty of tools help identify harmful links. If you spot spammy links, contact the owners of the linking sites and politely ask for removal; if that doesn't work, disavow these reputation damagers and move on. Also, if you notice sudden spikes in new links, investigate them: there is always a chance that competitors are pointing spammy links at you.
Although anchor text is now a bit less important than the two link parameters mentioned above, keyword-rich anchor text remains a meaningful relevance signal for Google.
To be on the safe side, your links' anchor texts should be semantically relevant to the topic of your content and reasonably diverse. On top of that, don't over-optimize your anchors with keywords, especially commercial ones, as this can trigger a Penguin penalty.
With Google more focused on user experience than ever, the pressure on website owners and SEOs is high. You are expected to have a fast, convenient website if you want visitors to stay and your pages to compete for high positions in the SERPs. So, here are three major user experience ranking signals I want to draw your attention to.
Of course, the very first thing that comes to your mind when you think of user experience is page speed. And I’m sure you’re aware of Google’s Speed Update that has officially made page speed a ranking factor for mobile.
Another recent speed-related change concerns the PageSpeed Insights tool, which now evaluates websites against two criteria: Speed and Optimization. The Speed score is calculated from real-user measurements, FCP (First Contentful Paint) and DCL (DOM Content Loaded), extracted from the CrUX (Chrome User Experience Report) database. The Optimization score covers technical factors like redirects, compression, minification, and so on.
In light of these changes, our team conducted research to figure out how page speed correlates with rankings. Surprisingly, it turned out that the Optimization score has a strong relationship with rankings these days.
So, to get an idea of how your website performs speed-wise, test it with PageSpeed Insights. Pay special attention to the Optimization score and fix any technical issues you find. If you're not sure how to do that, consult this guide on improving your Optimization score.
If your Optimization score is fine but the Speed score leaves much to be desired, the main thing you can do is make your pages less "heavy" by reducing the number of images and scripts. You can also consider implementing AMP (Accelerated Mobile Pages) for your mobile pages, as it makes them load almost instantly.
Another two ranking signals closely connected with user experience are dwell time and bounce rate. To be completely honest, both metrics depend heavily on the type of query. Take bounce rate: a user may get an immediate answer from a single page of your site. That visit still counts as a bounce, even though it doesn't mean your page isn't good enough. As a rule, though, researching a topic takes a user more than one page.
Speaking of dwell time, the longer a user stays on your page, the more relevant it appears to Google. Yet, just as with bounce rate, a user can spend only five seconds on your site and still be fully satisfied with the answer.
So, although both parameters depend on what exactly users type into the search box, the combination of the two lets Google evaluate a page's relevance fairly accurately.
So, to make your visitors stay longer, engage them as much as you can. Consider providing additional content links that send users to related posts on your site. Another good idea is to implement breadcrumbs: small text paths at the top of the page that improve navigation and help users understand where they are on your site. You can also add comment sections under your posts, which may win you another couple of minutes.
I guess it goes without saying that PageRank is one of the strongest authority signals for Google. The thing is, besides external PageRank, your page is also influenced by internal PageRank. So, if you want to improve the rankings of underperforming pages, don't hide them deep in your site structure. As a best practice, every page of your website should be no more than three clicks from your homepage.
However, if you need to boost rankings of a page that is buried in your site structure, the best thing you can do is to point some internal links to it. But just before doing that, look at your site structure with the help of WebSite Auditor’s Visualization feature to see how internal link juice is distributed within your site and what pages need to be worked on in the first place.
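One way to audit the three-click rule yourself is a breadth-first search over your internal links. The site map below is a made-up example; in practice you would build the mapping from a crawl export:

```python
from collections import deque

def click_depth(links, home="/"):
    """Breadth-first search over internal links: how many clicks
    each page is from the homepage. `links` maps a URL to the URLs it links to."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical internal-link graph from a crawl
site = {
    "/": ["/hotels", "/flights"],
    "/hotels": ["/hotels/rome"],
    "/hotels/rome": ["/hotels/rome/reviews"],
}
depths = click_depth(site)
# Pages deeper than 3 clicks are candidates for extra internal links
print({page: d for page, d in depths.items() if d > 3})
```

Any page the crawl never reaches won't appear in the result at all, which is itself a useful finding: orphan pages receive no internal PageRank.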
Caring about users' safety is yet another of Google's concerns these days. Back in 2014, Google made HTTPS a ranking signal. Since then, having an HTTPS site has become not a recommendation but a must, as the Chrome browser now marks non-HTTPS sites as "not secure". To keep your users' experience safe, learn how to migrate your site from HTTP to HTTPS.
As I mentioned at the beginning of the article, an enormous number of ranking factors directly or indirectly influence your position in the SERPs. But in 2019 I would definitely suggest setting a course for creating great content, quality link building, and improving user experience. Besides this, it's always worth carrying out competitive research to see how your top competitors optimize for these ranking signals, so you can borrow their tactics and reinforce your weak spots (if there are any).
The post 14 ranking signals you need to optimize for in 2019 appeared first on Search Engine Watch.
Page speed has been a part of Google’s search ranking algorithms for quite some time, but it’s been entirely focused on desktop searches until recently when Google began using page speed as a ranking factor for mobile searches as well.
Have you checked your page speed scores lately?
How do your speeds match up against your competition?
If your pages are loading slower than competitors, there’s a chance you’re taking a hit in the SERPs. While relevance of a page carries much more weight than page speed, it’s still important to ensure your pages are loading fast for users and search engines.
Here are 5 ways to increase page speed and improve SEO results.
Large image files can have a significant negative impact on page speed performance. Images often represent the largest portion of bytes when downloading a page. This is why optimizing images generally returns the biggest improvement in speed performance. Compressing your images using an image compression tool will reduce their file size leading to faster loading pages for both users and search engines, which in turn will have a positive impact on your organic search rankings.
Leverage browser caching
Google recommends setting a minimum cache time of one week (and preferably up to one year) for static assets, or assets that change infrequently. So, make sure you work with your web developer to ensure caching is setup for optimal page speed performance.
Decrease server response time
There are numerous potential factors that may slow down the response of your server: slow database queries, slow routing, frameworks, libraries, slow application logic, or insufficient memory. All these factors should be taken into consideration when trying to improve your server’s response time.
The most favorable server response time is under 200ms. SEO marketers should work with their website hosting provider to reduce server response time and increase page speed performance.
Enable Gzip compression
You will need to determine which type of server your site runs on before enabling Gzip compression, as each server (for example, Apache, Nginx, or IIS) requires its own configuration.
Again, your hosting provider can help you enable Gzip compression accordingly. You’d be surprised how much faster your pages load by having Gzip implemented.
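Gzip itself is enabled in the server configuration, but you can preview the payload savings with Python's standard library. The HTML here is a toy, deliberately repetitive page standing in for real markup:

```python
import gzip

# Repetitive markup compresses extremely well, much like real HTML boilerplate
html = b"<html><body>" + b"<p>Repeated page content</p>" * 200 + b"</body></html>"
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.0%})")
```

Text formats like HTML, CSS, and JavaScript routinely shrink by well over half, which is why enabling compression is such a cheap page-speed win.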
Avoid multiple landing page redirects
Having more than one redirect from a given URL to the final landing page can slow page load time. Redirects prompt an additional HTTP request-response which can delay page rendering. SEO Marketers should minimize the number of redirects to improve page speed. Check your redirects and make sure you don’t have redundant redirects that could be slowing load time.
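To see why chains hurt, you can model one. The mapping below stands in for live Location headers (the URLs are invented); in a real audit you would issue HEAD requests and read each response's Location header, with every hop costing an extra round trip:

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a chain of known redirects (url -> target) and return every hop.
    The dict stands in for live HTTP responses in this sketch."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain

# Hypothetical chain: http -> https -> www -> final landing page
redirects = {
    "http://example.com/": "https://example.com/",
    "https://example.com/": "https://www.example.com/",
    "https://www.example.com/": "https://www.example.com/home",
}
chain = redirect_chain(redirects, "http://example.com/")
print(len(chain) - 1, "redirects:", " -> ".join(chain))
```

Three hops where one would do means two wasted request-response cycles before the page even starts rendering; point every internal link and redirect straight at the final URL.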
SEO marketers should be analyzing and improving page speed continually. A great place to start is compressing images, utilizing caching, reducing server response time, enabling file compression, and removing multiple or redundant redirects.
I urge marketers to periodically use Google’s Page Speed Insights Tool to check your load time and compare your website to competitors’ sites. The tool also provides specific, recommended optimizations to increase your site’s page speed performance.
As Google continues to favor fast-loading websites, it's crucial that SEOs take the necessary steps to ensure their sites' pages are meeting (and beating) Google's expectations. Today, improving page speed is an essential aspect of any successful SEO program.
Keeping up to date with what people search for online can be invaluable to your business. Whether you’re looking to inform your latest paid search campaign or just need some fresh, trending content for your blog, these tools can help.
However, with over 3.5 billion searches each day worldwide, it’s hard to know how to narrow all that data down to help you improve your SEO.
Here are some of the many great tools available to help you discover what people search for: the most popular topics, keywords, and trending stories.
When asking what people search for, it makes sense to start with the largest, most commonly used search engine in the world – Google. Due to its sheer size, Google has some great stats, trends, and insights to dig your teeth into.
Let’s look at Google Trends, for example. This gives you a very quick overview of the searches with the most traffic overall, which is continually updated. You can enter a keyword and see how search volume has varied for that term over time and in different places.
Simply change the location, time frame, category and type of search to dig even deeper into the data.
Google Trends results for “SEO.”
If you’re looking to find variations in your keyword phrases, Google Autocomplete is a great free tool that you’re probably already using every day.
Type your keyword into the search box and related terms will display in a drop down list. This can be a good starting point for inspiration on long-tail keywords you might need.
If you’re looking for fresh content for a website or blog, Google News will deliver the very latest headlines from news sites across the globe (including local news), which are tailored to your personal interests/keywords.
You can use the search bar and the “top stories” section on the left side bar. Also, you can zero in on specific topics and locations to see what news stories people are reading and searching for now.
Google News results for “Brexit.”
Trending on social media
Away from Google-related tools, there is plenty that the big social media platforms can offer when it comes to the latest trends. Regular users of Twitter will know the “Trends for you” box, which uses an algorithm to display trends based on your location and who you follow.
This is similar to Instagram’s Explore function. Again, it’s based on your Instagram history and the type of content you follow and watch.
When it comes to broader discovery of what people search for, trending hashtags on both Twitter and Instagram are invaluable. Simply start researching the day's top-performing hashtags to see what's hot and then follow the conversation – perfect for blog post topics.
Of all the question and answer sites, Quora is one of the most useful tools for long-tail keyword research. Its “related questions” feature (which appears once you've typed in a question) is a handy way to generate long-tail keywords that might not immediately spring to mind from just looking at your search term. But more importantly, whatever topic you plan to cover, Quora has relevant questions and corresponding answers from thought leaders in that field.
Quora’s title page with related questions box.
One of our favorite but lesser-used tools is Answer the Public.
Answer the Public utilizes search data from Bing and Google and predicts what questions will be asked around each keyword. It also presents this data in a unique and visually stunning way – plus you’re able to download the information to an Excel or .CSV file.
Answer the public’s beautiful data presentation.
You can also explore what people search for using Bing's own data on its organic searches. The tool provides up to six months of search data (no averages) and generates keyword suggestions by language and country/region.
For more information on this tool, check out our previous guide, Bing Keyword Research Tool: Highlights & Limitations.
This tool allows you to discover what people search for by industry and sub-industry. You select your target audience, and it populates new trending queries (by volume) on the left bar.
Then, you can click any given query to see how it’s performed this year versus last year and how popularity changes week to week.
Most of us are already familiar with Google Ads (formerly AdWords) Keyword Planner. Search for keyword ideas, compare how keywords perform, measure the keyword competition and improve your next campaigns.
For more information on using Google Ads, check out these articles:
End of year summaries
If you want to explore the searches that have shaped the previous year, most of the major search engines will summarize this data for you. For example, Google’s Year in Search will give you a top 5 list in a variety of topics – from actors, car brands and consumer tech, to movies, recipes, and even selfies!
Click on any of the results to be taken to an Explore page to see more information, such as interest by region, related topics and related queries – giving you a heap of great insights.
Which of your favorite tools did we miss? Leave a comment below!
In the previous article on 7 Things That Hurt Your SEO Rankings and How to Fix Them, we found how a drop in SEO rankings can be one of the worst things that can happen to a website.
Bad SEO entails practices that fall outside the boundaries of Google's webmaster guidelines and undermine your website's optimization for search engines.
There are many practices that can hurt your SEO rankings if implemented. In 2017, Serpstat found about 300 million errors when it indexed 175 million pages with its SEO audit tool. These errors stemmed from not doing SEO the right way. We have already discussed seven of them; here are more things that could hurt your rankings, and how to fix them.
1. Accessibility and indexation
The accessibility and indexation of your site contribute greatly to how your website pages can be seen on search engines. Some of the categories to consider include:
Canonical tag: duplicate content makes it difficult for search engines to decide which page to show users, which can hurt the visibility of both pages. If you implement a rel=canonical tag, make sure it is done correctly; a canonical pointing to the wrong URL can cost you rankings.
Noindex tag: if you no longer need a noindex tag on a page, remove it as soon as possible. While the tag is in place, search engines will not index the page, which could leave you wondering why your SEO isn't improving. Keep track of your pages so you know when a tag is no longer relevant.
Robots.txt: regularly check which pages are blocked in robots.txt and unblock them when necessary to help improve your SEO rankings. Bear in mind that if a page blocked in robots.txt redirects elsewhere, the crawler will likely never see that redirect.
Nofollow links: a nofollow link passes no SEO value, but misusing the attribute can still get you penalized by search engines. Many websites fall victim to this by featuring links on their pages that are unrelated to the page's content, which ends up dragging down their rankings.
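The robots.txt check above can be automated with Python's standard library. This sketch parses a robots.txt body directly, so no network request is needed; the rules and URLs are made up for illustration. One caveat: Python's parser applies rules in file order, whereas Google uses the most specific matching rule, so keep narrow Allow lines above broader Disallow lines when testing this way:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the robots.txt lines directly, avoiding a live fetch
rp.parse("""
User-agent: *
Allow: /search/hotels
Disallow: /search/
""".splitlines())

print(rp.can_fetch("*", "https://example.com/search/hotels"))
print(rp.can_fetch("*", "https://example.com/search/flights"))
```

Running a list of your important URLs through `can_fetch` is a quick way to catch pages accidentally hidden from crawlers after a robots.txt change.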
2. Links and bad redirects
While links are great to help drive traffic and boost your SEO ranking, they could also ruin your SEO efforts if they aren’t managed well.
Broken links on your pages should be fixed or removed as soon as possible. Links break for several reasons: a mistyped URL, the linked page being removed or permanently moved by the destination website, or software on the user's end blocking access to the destination site. WordPress users can install the Broken Link Checker plugin to find and get rid of dead links; on other platforms, a crawler or link-checking tool can do the same job.
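If you'd rather script a check yourself, extracting the links to test is straightforward with the standard library. The HTML and URLs below are stand-ins for a fetched page:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect every href on a page, resolved against the page's own URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

html = '<p><a href="/deals">Deals</a> <a href="https://other.example/page">Partner</a></p>'
extractor = LinkExtractor("https://example.com/blog/post")
extractor.feed(html)
print(extractor.links)
```

Each collected URL can then be requested (a HEAD request is enough) and any 404 or 410 responses flagged as broken links.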
How to disavow negative backlinks
Google has a Disavow Tool that can help protect your site from penalties arising from bad linking and help neutralize bad links. The tool simply signals Google to ignore the listed backlinks. To disavow negative backlinks, identify the links you want to disavow, create a disavow file, and upload it to the Google Disavow Tool. Once this is done, the specified links will no longer be considered by Google.
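The disavow file itself is plain text: one URL per line, `domain:` prefixes to disavow whole domains, and `#` for comments. A small helper to assemble one (the spammy URLs and domains here are invented examples):

```python
def build_disavow_file(urls, domains):
    """Assemble a disavow file in the format Google's tool expects:
    one URL per line, plus 'domain:' entries for whole domains."""
    lines = ["# Spammy links identified in the backlink audit"]
    lines += urls
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    urls=["http://spam.example/page-linking-to-us.html"],
    domains=["link-farm.example"],
)
print(content)
```

Prefer `domain:` entries when a site links to you from many pages; disavowing individual URLs from a link farm is a losing game.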
Bad redirects and best redirects – 301 and 302
301 and 302 redirects might look the same to a user, but definitely not to search engines. A 301 signals a permanent move to a new URL, while a 302 is temporary; many people mix the two up and use either without thinking about the difference. If you use a 302 where a 301 belongs, search engines may treat the move as temporary and continue to index the old URL, which could hurt your SEO rankings.
3. Not maximizing Google Search Console
Google Search Console is packed with lots of benefits that should be maximized in order to have the best SEO experience. Some of the things to pay attention to in Google Search Console include search analytics, links to your site, mobile usability, robots.txt tester, sitemaps, index status, and security issues. Once an identified issue is fixed, your rankings will be improved and your website will gain more traction.
4. Meta tags
Meta tags are important for SEO and usually one of the first things to learn in SEO training. Your key meta tags, including keywords attribute, title tag, meta description attribute, and meta robots attribute should be taken seriously, as they help search engines understand what a page is about.
Don't use titles and descriptions that are too long or too short. The optimal title length for SEO is around 10-15 words, or about 78 characters, following Google's current meta title guidance.
Your description should be between 110 and 120 characters so it displays well on both mobile and desktop. While making sure your title and description aren't too long, be careful not to make them too short either: your meta tags should provide enough information about the page for search engines to understand its content.
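A simple length checker makes these limits easy to enforce across a site. The thresholds below are the ones this article suggests, not values from any Google API, and the sample title and description are invented:

```python
def check_meta(title, description, title_max=78, desc_range=(110, 120)):
    """Flag titles and descriptions outside the character limits
    suggested above (the article's guidance, not an official API)."""
    issues = []
    if len(title) > title_max:
        issues.append(f"title too long ({len(title)} chars)")
    if not desc_range[0] <= len(description) <= desc_range[1]:
        issues.append(f"description length {len(description)} outside "
                      f"{desc_range[0]}-{desc_range[1]} chars")
    return issues

print(check_meta(
    "Cheap flights to Rome | Example Travel",
    "Compare hundreds of airlines and book cheap flights to Rome. "
    "Free cancellation on selected fares and no hidden booking fees.",
))
```

Run it over a crawl export of titles and descriptions and you get an instant worklist of pages whose snippets need rewriting.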
Google encourages creating good meta descriptions: ensure there's a description for every page on your site, and make each one unique, since duplicate descriptions can hurt your rankings. Where possible, include clearly tagged facts in the description and keep the copy high quality.
Doing SEO wrong will hurt your rankings, while following sound SEO practices based on Google's standards will help your website succeed. The common errors above should be avoided at all costs; if you are caught flouting the rules, Google may penalize you, causing a big drop in your rankings.
The post Things that hurt your SEO rankings and how to fix them, part 2 appeared first on Search Engine Watch.
This year’s TechSEO Boost, an event dedicated to technical SEO and hosted by Catalyst, took place on November 29 in Boston.
Billed as the conference “for developers and advanced SEO specialists,” TechSEO Boost built on the success of the inaugural event in 2017 with a day of enlightening, challenging talks from the sharpest minds in the industry.
Some topics permeated the discourse throughout the day and in particular, machine learning was a recurring theme.
As is the nature of the TechSEO Boost conference, the sessions aimed to go beyond the hype to define what precisely machine learning means for SEO, both today and in the future.
The below is a recap of the excellent talk from Britney Muller, Senior SEO Scientist at Moz, entitled (fittingly enough) “Machine Learning for SEOs.”
What is machine learning? A quick recap.
The session opened with a brief primer on the key terms and concepts that fit under the umbrella of “machine learning.”
Muller used the definition in the image below to capture machine learning as “a subset of AI (Artificial Intelligence) that combines statistics and programming to give computers the ability to ‘learn’ without being explicitly programmed.”
That core idea of “learning” from new stimuli is an important one to grasp as we consider how machine learning can be applied to daily SEO tasks.
Machine learning excels at identifying patterns in huge quantities of data. As such, its applications have become part of the products we use every day.
This very ubiquity can make it a challenging concept to grasp, however. In fact, Eric Schmidt at Google has gone so far as to say, “The core thing Google is working on is basically machine learning.”
It is helpful to break this down into the steps that comprise a typical machine learning project, in order to see how we might apply this to everyday SEO tasks.
The machine learning process
The image below represents the machine learning process Muller shared at TechSEO Boost:
It is important to bear in mind that some of the training data should be reserved for testing at a later point in the process.
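Holding back test data can be as simple as a seeded shuffle. A minimal sketch of that step, using made-up labelled rows (page features paired with a label):

```python
import random

def train_test_split(rows, test_ratio=0.2, seed=42):
    """Hold back a share of labelled rows for testing, as the process above
    recommends, so the model is evaluated on data it never saw in training."""
    rng = random.Random(seed)   # fixed seed keeps the split reproducible
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

# Hypothetical labelled data: (page features, label)
rows = [({"word_count": 100 * i}, i % 2) for i in range(10)]
train, test = train_test_split(rows)
print(len(train), len(test))
```

The key property is that the two sets never overlap; any accuracy number computed on rows the model has already seen is meaningless.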
Where possible, this data should also be labelled clearly to help the machine learning algorithm identify classifications and categories within a noisy data set.
It is for precisely this reason that Google asks us to label images to verify our identity:
This demonstrates our human ability to pick out objects in cluttered contexts, but it has the added benefit of providing Google with higher quality image data.
The pitfalls of an unsupervised approach to machine learning, and a training data set that is open to interpretation, were laid bare just last week.
Google’s ‘Smart Compose’ feature within Gmail has demonstrated gender bias by preferring certain pronouns when predicting what a user might want to say.
As reported in Reuters, Gmail product manager Paul Lambert said a company research scientist discovered the problem in January when he typed “I am meeting an investor next week,” and Smart Compose suggested a possible follow-up question: “Do you want to meet him?” instead of “her.”
The challenge here is not restricted to projects on such a scale. Marketers who want to get their hands dirty must be aware of the limitations of machine learning, as well as its exciting possibilities.
Muller added that people tend to overfit their data, which reduces the accuracy and flexibility of the model they are using. This (very common) phenomenon occurs when a model corresponds very closely with one specific data set, reducing its applicability to new scenarios.
The ability to scale effectively is what gives machine learning its appeal, so overfitting is something to avoid with care. There is a good primer on this topic here, and it is also explained very well through this image:
So, how exactly can this subset of AI be used to improve SEO performance?
How you can use machine learning for SEO
As is the case with all hype-friendly technologies, businesses are keen to get involved with machine learning. However, the point is not to “use machine learning” through fear of being left behind, but rather to find the best uses of machine learning for each business.
Britney Muller shared some examples from her role at Moz during her session at TechSEO Boost.
The first was an approach to automated meta description generation using the Algorithmia Advanced Content Summarizer, which was then compared to Google’s approach to automated descriptions pulled directly from the landing page.
Meta descriptions remain an important asset when trying to encourage a positive click-through rate, but a lot of time is spent crafting these snippets. An automated alternative that can interpret the meaning of landing pages and create clickable summaries for display in the SERPs would be very useful.
Muller shared some examples, such as the image above, to demonstrate the comparison between the two approaches. The machine learning approach is not perfect and may require some tweaking, but it does an excellent job of conveying the page’s intent when compared to Google’s selection.
The team at Moz has since built this into Google Sheets:
Although this is not a product other businesses can access right now, an alternative way of achieving automated meta descriptions has been shared by Paul Shapiro (the TechSEO Boost host) via Github here.
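Moz used the Algorithmia summarizer, but the underlying idea of extractive summarization can be sketched with a naive frequency scorer. This is a toy stand-in for the products named above, not their actual algorithm, and the page text is invented:

```python
import re
from collections import Counter

def summarize(text, max_sentences=1):
    """Naive extractive summary: score each sentence by the frequency of its
    words across the whole text, keep the top scorers in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )
    chosen = scored[:max_sentences]
    return " ".join(s for s in sentences if s in chosen)

page_text = (
    "Rome city breaks are popular in spring. "
    "Our Rome city breaks include flights and hotels. "
    "The weather is mild."
)
print(summarize(page_text))
```

Real summarizers use far richer models, but even this crude scorer tends to surface the sentence most representative of the page, which is the raw material of a meta description.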
Automated image optimization
Another fascinating use of machine learning for SEO is the automation of image optimization. Britney Muller showed how, in under 20 minutes, it is possible to train an algorithm to distinguish between cats and ducks, then use this model on a new data set with a high level of accuracy.
For large retailers, the application of this method could be very beneficial. With so many new images added to the inventory every day, and with visual search on the rise, a scalable image labeling system would prove very profitable. As demonstrated at TechSEO Boost, this is now a very realistic possibility for businesses willing to build their own model.
A further use of machine learning described by Britney Muller was the transcription of podcasts. An automated approach to this task can turn audio files into something much more legible for a search engine, thereby helping with indexation and ranking for relevant topics.
Muller detailed an approach using the Amazon Transcribe product through Amazon Web Services to achieve this aim.
The audio is broken down and delivered as a detailed JSON file, with the different speakers on the podcast labelled separately.
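Once you have that file, turning it into indexable text is a small parsing job. The payload shape below is deliberately simplified (real Amazon Transcribe output nests its results more deeply), so treat the field names as assumptions:

```python
import json

# Assumed, simplified shape of a transcription payload
payload = json.loads("""
{
  "segments": [
    {"speaker": "spk_0", "text": "Welcome back to the podcast."},
    {"speaker": "spk_1", "text": "Thanks for having me."}
  ]
}
""")

# Flatten the segments into a speaker-labelled transcript a crawler can read
transcript = "\n".join(
    f'{seg["speaker"]}: {seg["text"]}' for seg in payload["segments"]
)
print(transcript)
```

Publishing text like this alongside the audio is what lets search engines index and rank the episode for the topics it covers.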
There was not enough time in the session to work through every potential use of machine learning for SEO, but Muller’s core message was that everyone in the industry should be working towards at least a working knowledge of these concepts.
Some further opportunities for experimentation were listed as follows:
As we can see, machine learning truly excels when working with large data sets to identify patterns.
Tools and resources
The best way to get engaged is to combine theory with practice. This is almost always the case, but it is a particularly valid piece of advice in relation to programming.
Muller’s talk was neither the first nor the last of the day to reference Google Codelabs.
There are more resources out there than ever before and the likes of Amazon and Google want machine learning to be approachable. Amazon has launched a machine learning course and Google’s crash course is a fantastic way to learn the components of a successful project.
The Google-owned Kaggle is always a great place to trial new data sets and review the innovative work performed by data scientists around the world, once a basic grasp has been attained.
Furthermore, Google’s Colaboratory makes it easy to get started on a project and work with a remote team.
Key takeaways: machine learning for SEOs
What became particularly clear through Muller’s talk is how approachable machine learning applications can be for SEOs. Moreover, the room for experimentation is unprecedented, for those willing to invest some time in the discipline.
Whether you’re running an organic search or PPC campaign, it all starts with keyword research. Keyword research is usually the first step you undertake when planning how to bring customers to your website – because the terms they’re searching for will determine the kind of content you will create and the way you will optimize it.
Behind every search, however, is an intent – a need or want – whether it be for a product, service, solution, or simply more information. Intent is one of the most significant variables in marketing. All customer journeys, brand engagement, and sales funnels begin with intent.
Brands that use data to uncover the context and motivation behind every search that leads users to their website will be able to deliver an experience that eventually translates to revenue. How, then, can you go about capturing customer intent, and influence the decisions that customers make and the actions that they take?
Match keywords with intent
All marketers know about the customer journey and the sales funnel. We know about the different touchpoints where customers interact with our brands and we spend hours planning how to interact with them at each of those touchpoints.
However, we frequently overlook the baggage (context) that customers carry when they type in those search queries. The three types of search queries – informational, navigational, and transactional – only clarify top level intent.
Under each of those umbrella categories, you can find a variety of subtypes of keywords that reveal a whole lot about the intent of the searcher.
Most businesses will find that they’d want organic visibility for almost all of these searcher intents. You can use a grouping of keywords from each of these categories as a seed list and expand upon them using your favorite keyword tool (because they haven’t started calling themselves “intent research tools” yet).
That will give you a better idea of the search volume, click volume, cost per click, difficulty, trends, SERP features, and other variables, which you can use to estimate how many opportunities you have to interact with the customer (and what that will cost you) as they move along their purchase journey.
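As a starting point before reaching for a paid tool, you can bucket a seed list into the three top-level intents with simple modifier matching. The modifier lists below are illustrative assumptions, not an exhaustive taxonomy, and real classifiers use far richer signals.

```python
# Minimal sketch: bucket keywords by top-level intent using modifier lists.
# The modifier sets are illustrative assumptions, not a complete taxonomy.
TRANSACTIONAL = {"buy", "shop", "deal", "price", "cheap", "discount"}
NAVIGATIONAL = {"login", "www", ".com", "official"}

def classify_intent(keyword):
    tokens = set(keyword.lower().split())
    if tokens & TRANSACTIONAL:
        return "transactional"
    if any(m in keyword.lower() for m in NAVIGATIONAL):
        return "navigational"
    return "informational"

seeds = [
    "buy tomato seeds",
    "why are my tomato plants turning yellow",
    "gardenersworld.com login",
]
for kw in seeds:
    print(kw, "->", classify_intent(kw))
```

Even this crude split is enough to send each bucket into a keyword tool separately and compare volumes and costs per intent stage.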
Determine how far your customers are from taking action
In the image above, do you see a scenario where a person searching for “tomato plant” might eventually get to “why are my tomato plants turning yellow” a few months down the line? Search is a marketing channel that can reveal the motivations and goals of a person from the queries they type or speak, as well as help you uncover insights into their behavior from what they eventually do on your site.
A study by Northwestern University examined the “psychological distance” between consumers’ current state and their intent to take action using their search queries. The hypothesis was that the farther away a person is from buying something, the more abstract their queries, and the more likely they are to use “why” questions. As they get closer to their goal, they use more concrete, contextual terms with verbs (action words) like “shop” or “buy.”
Data from nearly 25,000 queries revealed that searchers tend to click on results with words that mirrored the nature (abstract or concrete) of their search phrase. Users searching with a browsing intent are 20% more likely to click on a result that stresses abstract words like “best,” while those searching with a buying intent are 180% more likely to click on a result that emphasizes concrete words like “shop.”
What do these results tell you? If you ask me, they underscore the need for mapping content to the intent of the customer at every stage of their buyer’s journey. Aligning keywords to the mindset of your customers allows for more relevant brand messaging, targeted ad or landing page copy, and personalized user experiences that drive more conversions.
Align your SEO with customer intent
When people say keywords are dead, what they mean is you shouldn’t obsess over “optimizing” your blog post or landing page for a particular keyword or set of keywords, because no matter how “smartly” you insert keywords in your titles, subheadings, meta descriptions, or copy, Google is laser-focused on whether your content matches the intent of the searcher.
Google is moving from being a search engine to an answer engine. This is evident from how the SERPs show a single answer box (the click-less version of “I’m Feeling Lucky”) along with options such as “People also ask” and “Related searches” to further gauge – and meet – searcher intent.
In a nutshell, Google is looking to
This doesn’t mean that all SEO is useless. On the contrary, you can use insights from keyword research to structure your content to appear for evolving search features such as instant answers. You then need to make sure that you can truly provide the best answer. Ask yourself:
Working towards the outcome of these questions might get you in the answer box for your targeted search terms and intents, but if you want to stay there, you need to consistently get better at all of the above.
Analyzing Google SERPs for different sets of keywords can also help you make critical decisions on whether to use SEO or PPC to target the right intent. For example, if a search term returns a page filled with blog posts, Q&A sites or forums, videos, and the like, as opposed to sales pages for a product or service, it probably means there is no purchase intent there.
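One way to make that SEO-versus-PPC call repeatable is to tally the make-up of the results page. The sketch below uses illustrative labels you might assign when eyeballing or scraping a SERP; the threshold for "purchase intent" is an assumption you would tune.

```python
# Hedged sketch: infer purchase intent from the make-up of a SERP.
# Result-type labels and the commercial set are illustrative assumptions.
def purchase_intent_share(result_types):
    commercial = {"product_page", "category_page", "shopping_ad"}
    hits = sum(1 for t in result_types if t in commercial)
    return hits / len(result_types)

serp = ["blog_post", "forum", "video", "blog_post", "product_page",
        "blog_post", "qa_site", "blog_post", "video", "blog_post"]
share = purchase_intent_share(serp)
print(f"{share:.0%} commercial results")  # a low share suggests informational intent
```

A SERP like this one, dominated by blog posts and forums, argues for content and SEO; a page full of product listings and shopping ads argues for sales pages and PPC.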
Achieving high rankings through keyword targeting is a long, drawn-out, extremely competitive and complex process, prone to errors in judgement. You’re far better off trying to understand searcher intent and context, and providing best-match content that’s relevant to searchers in the moment.
The variables and factors that influence intent matching differ from those that you take into account while doing keyword research. A keyword doesn’t need to have high search volume in order to have purchase intent and be profitable. Intent matching takes the guessing game out of keyword targeting.
To really get better at search marketing, you need much more than technical SEO, high quality content, and links. You need to optimize your website for user experience at every point in the customer journey, and encourage them to take the right actions that eventually result in conversions.
Rohan Ayyar is the Regional Marketing Manager at SEMRush.
The post How to move from keyword research to intent research appeared first on Search Engine Watch.
A pleasure to introduce myself: I’m Sean Webb, 27 years old, from Manchester, UK. I do affiliate marketing and have spent a lot of time learning how to rank for easy-to-medium competition keywords. I have recently started PPL and video marketing and am learning more about both.