The days when ranking high in search results was the final goal have passed.
As search algorithms improve and keyword stuffing no longer has a place, SEO experts must adapt to trends that emerge and replace one another ever faster.
Today, SEO involves a host of practices, including link building, technical optimization, proper keyword research, and more. Although boosting a website’s rankings is challenging enough, it isn’t the only task SEO specialists must cope with. Helping businesses engage their target audience and convert visitors falls to SEO practitioners as well.
Here comes the need for UX optimization, which is impossible without knowing what matters most to the target audience coming to a specific website. In this article, I’ll tell you about the most important steps to focus on for your customers and how to improve the user experience.
The debate rages: How much does UX count?
UX overlaps with CTR in some ways. The debate about CTR among SEOs is evergreen; everyone tries to defend their point of view despite Google’s statements on the issue. And recently, a new wave of discussion sparked up after a tweet from Moz’s Britney Muller about a new Google document that implies CTR matters for ranking.
Different SEO specialists expressed their opinions and tried to confirm or deny this, as several weeks earlier Google had said that CTR as a ranking factor is made up.
For example, Barry Schwartz called the new doc confusing, noting that Google did write “when you click a link in Google Search, Google considers your click when ranking that search result in future queries,” and that Google should clarify that this is used for personalized search.
Regardless of these debates, optimizing for CTR and UX is a worthwhile practice in itself. Even if it doesn’t improve your rankings, it’ll make your site more understandable, comfortable, and informative for visitors.
Why should SEO pros bother with user experience?
It may seem that being number one in Google’s search results is the key to engaging your audience and driving conversions. It’s not quite so. Every improvement to its search algorithms that Google has brought in recent years is focused on providing user-friendly results. Of course, domain authority and quality links are still important, but if a top-ranked website has a poor user experience, it may soon lose its position.
How does it happen? Well, let’s assume your website is technically optimized, has a perfect content-to-keywords balance, and has links even from .edu domains. All these factors are most likely to result in high rankings.
But if the site doesn’t fulfill users’ expectations, it may take a dip as quickly as it rocketed. Once people who clicked through to your website aren’t satisfied with what they find, they’ll leave quickly. The next time Google updates the search results, it’ll see that your page’s bounce rate is too high.
The numbers will point to the fact your site isn’t relevant to the query, and it shouldn’t rank that high.
Bounce rate isn’t the only factor search engines consider when analyzing user experience. There are also signals such as pages per session, dwell time, and organic CTR. As you can see, the tasks of an experienced SEO specialist are much broader than they were several years ago. But the results are worth the effort.
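As a rough illustration of what these signals measure, here's how they can be computed from raw session data. The session log below is hypothetical, and real analytics platforms define these metrics with more nuance (e.g. how a "bounce" is counted):

```python
# Hypothetical session log: each entry is (pages_viewed, seconds_on_site).
sessions = [(1, 5), (4, 180), (1, 8), (3, 240), (6, 600)]

# Bounce rate: the share of single-page sessions.
bounces = sum(1 for pages, _ in sessions if pages == 1)
bounce_rate = bounces / len(sessions)

# Pages per session and average dwell time are simple means.
pages_per_session = sum(pages for pages, _ in sessions) / len(sessions)
avg_dwell_time = sum(secs for _, secs in sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")              # Bounce rate: 40%
print(f"Pages per session: {pages_per_session:.1f}")  # Pages per session: 3.0
print(f"Avg dwell time: {avg_dwell_time:.0f}s")       # Avg dwell time: 207s
```

Two of the five sessions viewed a single page, so the bounce rate is 40 percent: exactly the kind of number a search engine could read as a relevance problem.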
Best practices for UX improvement
The good news is that you already know these practices. The only difference is that using them now, you should concentrate not on the search engines’ requirements but on meeting the needs of your visitors.
Let me take you through the user’s journey point by point and emphasize the crucial aspects influencing their decisions. Here we go.
Where does the search journey start? Right, everything starts with the query. Once users have questions, they go to search engines and ask them. That’s why knowing what your potential customers are likely to look for is halfway to success.
The knowledge of queries your target audience conducts helps you come up not only with content ideas but also with key phrases your page should rank for. There are several questions you should answer to make your keyword research user-focused:
1. Do you consider user intent?
The thing is that every user conducting a search query has a certain intent. In other words, there’s always the reason why a person searches for something. If I search for “iPhone price”, there’s a 90 percent probability that I want to buy the smartphone. In this case, Google will provide me with various online stores. Searching for “Apple or Samsung”, it’s most likely I want to read the articles where authors compare devices produced by these two companies. And again, the search engine will get it.
Therefore, it’s essential to denote the intent of the content you provide. If you own an online shop, you should mark transactional intent on your page. Add “buy”, “price”, “purchase”, “on sale”, and other related keywords. If you run a blog, use the phrases with “how to”, “what is”, “best tips”, and more. Denoting intent not only helps search engines rank your website for the relevant queries but also provides users with a better understanding of what they’ll see on your page.
Moreover, users quite often don’t mark their intent in their search queries. One can type “women jeans”, and the machine won’t be 100 percent sure whether the person wants fashion tips or to purchase jeans. In this case, the search results will contain both informational and transactional websites. To help your prospects understand whether they should click through to your site, you’d better denote what type of service you provide.
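As a toy sketch of how intent markers work, here's a rule-based classifier built on the kinds of keywords mentioned above. The keyword lists are purely illustrative; real search engines use far more sophisticated models than word matching:

```python
# Illustrative keyword lists -- real engines use far richer signals.
TRANSACTIONAL = {"buy", "price", "purchase", "sale", "cheap", "order"}
INFORMATIONAL = {"how", "what", "why", "best", "tips", "vs", "or", "guide"}

def classify_intent(query: str) -> str:
    """Label a query transactional, informational, or ambiguous."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "ambiguous"  # no intent marker, e.g. "women jeans"

print(classify_intent("iphone price"))      # transactional
print(classify_intent("apple or samsung"))  # informational
print(classify_intent("women jeans"))       # ambiguous
```

The "women jeans" case falls through to ambiguous, which mirrors why mixed informational and transactional results appear for such queries.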
2. What queries do your prospects conduct?
Even if you think you understand what kind of information about your product your target audience may be interested in, don’t jump to conclusions. Working in a niche is quite different from being a consumer. The words you use to describe what you do may at first be completely unknown to people searching for your product.
As we’ve concluded, it’s important to know what kinds of search queries your potential customers conduct. I’ll show you the quickest ways to identify these queries.
The next step users undertake after entering the search query is choosing which website from all the search results provides the most comprehensive information. And how do they make a decision? Right, they judge by what they see in the snippet.
The website may contain high-quality, relevant content, but failing to convey that in the snippet dramatically decreases its chances of getting high traffic. In fact, there are two pieces of metadata influencing how your snippet may look.
1. Title tag & meta description
Of course, creating a catchy and intriguing title and description is essential if you want to attract the audience’s attention. But you should be careful: just as a boring title will bring you little profit, an overly promising one will also do you no good.
You may say: “There are loads of posts on the subject, I should make my page stand out.” I agree. Partly. You should stand out. But with unique and quality content, not with false promises in your snippets.
If you promise something you don’t deliver on your page, you’ll increase your bounce rate and hurt the user experience.
So, when creating a title tag and meta description, make sure you:
2. Page speed
After your audience clicks through to your website, there’s one more thing they’ll face before seeing the page itself. Page speed is a factor considered not only by search robots but also by users.
If a person (especially a mobile user) has to wait more than a few seconds for your server to respond, the chances he or she will return to the search results are quite significant. Here’s an infographic by HubSpot.
There are services, such as PageSpeed Insights, Serpstat, and Moz, which analyze web pages’ loading speed and generate suggestions to make them faster.
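If you'd rather check speed programmatically, Google's PageSpeed Insights v5 API returns the same Lighthouse data the web tool shows. Below is a minimal sketch; the `fetch_report` and `performance_score` helpers are my own naming, not part of any official client library:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url: str, strategy: str = "mobile") -> dict:
    """Fetch a PageSpeed Insights report for a URL ("mobile" or "desktop")."""
    query = urlencode({"url": url, "strategy": strategy})
    with urlopen(f"{API}?{query}") as resp:
        return json.load(resp)

def performance_score(report: dict) -> int:
    """Pull the 0-100 Lighthouse performance score out of a report."""
    raw = report["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)  # the API reports a 0-1 fraction

# Example (requires network access):
# report = fetch_report("https://www.example.com")
# print(performance_score(report))
```

Running the same check with `strategy="mobile"` and `strategy="desktop"` is a quick way to compare the two versions of a site.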
Never forget that mobile devices account for 48.2 percent of web page views worldwide. So, if you don’t want to damage the user experience, it’s worth checking whether your website is mobile-friendly. In the speed-testing tools above, click on “mobile” to see the analysis of your mobile version. You may be surprised to find that the two versions differ a lot.
Design & content structure
When people have already chosen your website, clicked through, and waited for it to load, they see your page. What’s the first thing that catches the visitor’s eye? The way it looks. Your content may be incredibly authoritative and trustworthy, but once users see it’s impossible to extract the essential information quickly, they might decide the content is too complicated.
I’ve gathered a few tips for you to follow when working out your content structure and design:
Mind that lots of visitors don’t arrive at your website via your home page. Make sure your website is easy to navigate so users can visit its different pages. Remember that the more time people spend on your site, the better the user experience signal it sends.
First of all, your menu button (or hamburger button) should always be handy. The most convenient option is to create a fixed header for your web pages. With such a header, visitors won’t have to scroll back up after they’ve read your content.
Moreover, when unrolled, the menu shouldn’t overlap the page content. Talking of overlapping, try to avoid pop-ups. Advertisements taking over the article every minute cause an extremely poor user experience.
Don’t forget to provide clearly labeled categories in the menu. Everything should be organized logically. To make your website even more convenient to use, add a search box. If visitors aren’t sure which category contains the information they need, this box will help a lot.
Never stop testing
These were the basic tips that can help you improve your website’s UX. Following them is the start, not the finish line. UX trends change almost as often as search algorithms do.
Always look for ways to improve your strategies. Track users’ reactions to your new posts. Ask yourself, “Did conversions improve when we applied the new design?” Analyze all the changes, whether they’re good or not, and discover how you can develop the user experience.
Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.
The post Why bother with user experience: UX tips for SEO experts and business owners appeared first on Search Engine Watch.
When it comes to link building using digital PR, it’s important that you start with the story.
A great story can be told in many different ways. However, you’ll struggle to gain traction when there’s either a lack of it or a poor story regardless of what format you choose.
That means almost ignoring the end format of your asset (infographic, interactive, or otherwise) until you’ve got a solid concept with strong headlines.
That said, there’s a lot to be said for spending the time to let yourself be inspired and to understand which formats are working for other brands.
Here, I’ll showcase five proven formats which help maximize link acquisition in a digital PR campaign, placing a focus upon specific campaign types as opposed to simply the visual delivery.
I’ll look at why each format is often successful at earning links and share campaign examples (to note: these aren’t all campaigns I’ve been involved in, but ones I love and have been inspired by).
(To note: All linking root domain statistics have been taken from Ahrefs using the “Historical” figure. Correct as of 13th February 2019.)
1. Maps

Maps are a great format to use when running a digital PR campaign for two main reasons:
When choosing a map as the most suitable format for your campaign, think simplicity, to avoid a design that looks too busy, and make sure you have a metric that can be compared across regions or countries.
Whether you opt for a map which showcases regional differences across a country (think city vs city or state vs state), within a continent (think European countries or cities compared) or global; you need to ensure that the data is available as the format can fall down with too many gaps.
Ready to be inspired? Here are three totally different, but equally great, map-based campaigns:
Veygo – The world’s most desired motors
Linking Root Domains: 114
Credit Card Compare – The literal translation of country names
Linking Root Domains: 277
Linking Root Domains: 90
2. Calculators & tools
Calculators and tools are a fantastic way to earn links. Why? For the simple reason that the link becomes a vital part of a journalist’s story. It’d make no sense for them to mention a new tool or calculator (which users can interact with) without linking out to it. It’s basic user experience: the link adds value to the article by helping the user navigate to an asset they’re being told to try out.
It’s only human nature that we want to find things out and working an element of personalization into a campaign is a great way to drive engagement and help to create a connection with a campaign.
Think about it this way: if you saw an article mentioning a calculator that could show how quickly Kim Kardashian would earn your annual salary (hint: she earns the average annual UK salary in six and a half hours), you’d want to try it out, wouldn’t you?
Similarly, what if you heard about a tool which allowed you to enter your Instagram handle and be told how much you could be earning from brand collaborations?
Take a look at three very different examples of calculator-led campaigns to feel inspired and start to think of what you could create:
Missy Empire – You vs The Kardashians
Linking Root Domains: 219
Linking Root Domains: 347
Totally Money – How much is your unpaid overtime worth?
Linking Root Domains: 41
3. Indexes

We love to compare both ourselves and our lifestyles to others, and that’s why an index format works so well for content campaigns.
Often used to compare cities or countries, but also seen comparing the popularity of brands, products or similar, indexes are a format which rarely struggles to earn links when backed by strong data.
Think about it this way; if you’re looking at running a campaign which looks at revealing “the best cities in the world for foodie tourists” you’ve got the opportunity to pitch this out to niche food and travel publications and blogs, regional publications (how the local city ranks on the list), national publications (where cities from the country rank), and global publications (to showcase the overall findings).
There are so many ways to hook into different sectors of the press with an index-led format, and the data behind it often reveals some interesting stories. Just be sure your sources are credible.
The challenge with this format is typically what to rank but the beauty is that they can be presented in so many different ways depending on the budget and resources.
Working to a low budget? Show the data and ranking as a table in a blog post. Have a few more resources? Why not design it as an infographic? Looking for something even more stunning? Develop an interactive asset where the index’s data columns are sortable.
Here are three great examples of campaigns which use this format.
Movehub – The hipster index
Linking Root Domains: 271
Nestpick – The 2018 millennial cities index
Linking Root Domains: 330
Zalando – The world’s most elegant cities
Linking Root Domains: 162
4. Social stat rankings
If you’re looking for a simple but effective campaign format; have you considered conducting a study into the social stats behind a concept?
From the most hashtagged sneakers to the most Instagrammed beaches, there are plenty of fantastic examples of campaigns of this nature that have earned significant volumes of links yet have been executed in a resource-friendly way.
At the most basic level, to launch a campaign utilizing social statistics, you simply need to find something comparable and collect the hashtag data. There’s also the potential to replicate a similar format utilizing follower-counts of celebrities in a sector or the like.
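Once the hashtag counts are collected, ranking them is trivial. Here's a minimal sketch; the sneaker hashtags and counts below are made up for illustration, standing in for data pulled from a social platform or manual research:

```python
from collections import Counter

# Made-up hashtag counts standing in for collected social data.
hashtag_counts = {
    "#airjordan": 12_400_000,
    "#yeezy": 9_100_000,
    "#airmax": 7_800_000,
    "#converse": 6_200_000,
}

# most_common() sorts descending by count -- the campaign's ranking.
ranking = Counter(hashtag_counts).most_common()
for rank, (tag, count) in enumerate(ranking, start=1):
    print(f"{rank}. {tag}: {count:,}")
```

The data collection is the real work here; the analysis itself is a few lines, which is exactly what makes this format so resource-friendly.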
Are you ready to feel inspired? Here are a few of my favorite social stat campaigns from the past 12 months.
Forward2me – The world’s most Instagrammed sneakers
Linking Root Domains: 120
MyVoucherCodes – Dogs of Instagram
Linking Root Domains: 42
TravelSupermarket – The best beaches in the world according to Instagram
Linking Root Domains: 41
Have you started to notice a trend here?
The campaign examples which I’ve shared aren’t all highly complex data studies or interactive assets and this may come as a surprise.
It’s important to understand, when launching a content marketing campaign, that simplicity is often the key to success, and this last format certainly falls into that category.
5. Brainteasers

Brainteasers have been used by marketers over the past couple of years, often to great success, due to their shareable nature and the challenge they present to users.
So long as you’ve got a great design resource and an imagination, the opportunities with this format are endless, and it’s always fun to work on as well.
Here are a few examples to get your creativity flowing.
Lenstore – Can you spot it?
Linking Root Domains: 126
Bloom & Wild – Can you spot the Christmas robin?
Linking Root Domains: 26
When it comes to launching a digital PR campaign where the focus is on earning links, the story always needs to be the priority. However, by taking the time to understand what formats are working for others, you can start to think about ideas in a different way.
If a format is working, it makes sense to learn from this and understand why the stories which go alongside such campaigns resonate so well with publishers.
It’s all about understanding why some formats perform better than others when it comes to earning links and what it is that makes them attractive to publishers. Hopefully the above has given some inspiration for your own campaigns and left you thinking up ideas for what you could launch yourself.
James Brockbank is the Managing Director of Digitaloft, a multi-award winning SEO, PPC & Content Marketing agency. He can be found on Twitter @BrockbankJames.
The post Five proven content formats to maximize link acquisition with digital PR appeared first on Search Engine Watch.
Campaigns are all about visuals in today’s digital world.
According to Deposit Photos on visual trends in 2019,
Social Media Examiner’s 2018 Industry Report shows that 80 percent of marketers use visual assets in their content marketing. And 11 percent more B2C marketers than B2B marketers attest that visual content is the more important type of content today.
Retail marketing without visual content can be boring and unattractive, and will yield a low ROI. Visual cues, however, help direct attention while conveying a message through visual methods of communication, including videos, photos, infographics, memes, and comics.
Using the appropriate visual cues on landing pages helps direct attention and engagement to the intended CTA and shows visitors whether they’ll get any value from it. You could use bright banners, exclamation marks, arrows, product images, and more.
Here are four visual design cues ecommerce marketers should focus on in 2019:
1. Arrows

Arrows are one of the most commonly used visual cues because they explicitly describe what you should do and are easily understood. They are often used to point to a CTA and can come in different forms. According to ConversionXL, when it comes to explicit visual cues, an arrow outperforms a human line of sight: people tend to spend twice the average time looking at forms with arrows.
The Gift Rocket design below is an example of a creative way to use arrows. They simply directed the top of the rocket toward what is important.
To get the best out of arrow cues, ensure the color of your arrows aligns with the rest of the design, and remember not to use more than one arrow unless necessary. Also, be creative with your arrows, and remember that they tend to increase traction and sales.
2. Color

Color is one of the most important aspects of design and is also a form of communication. The choice and usage of your brand color play a huge role in how you interact with and engage your audience.
Colors have a strong connection with the human mind: they can help set a mood, make a memory stick or evoke one, and affect decision-making. It is imperative, then, that marketers learn how to use various colors in a campaign to draw attention and help guide their customers’ decisions.
Your choice of color could be based on age, location, gender, or trends. Or you could simply use a color that depicts what your brand is about and represents the emotion you want your audience to associate with it. Know what your brand stands for and choose a color that accurately depicts it. For example, blue can be associated with trust, loyalty, confidence, wisdom, and faith. A popular company that uses this color is Facebook, whose core values include transparency and trust.
The Oxford Summer School also uses the same shade of blue, which stands for trust, integrity, and communication, across its website and social media platforms. This not only conveys excellence and a professional brand identity but also helps improve brand recognition by 80 percent.
3. Line of sight
A line of sight can also function as an explicit visual cue. Based on the cognitive bias of deictic (or “pointing”) gaze, eye directions on an image naturally direct viewers to look in the same direction as the line of sight. People often follow the line of sight of others, so if someone on a screen is looking at a quote, form, or testimonial, others will follow. This technique can be used to influence attention and connect emotions to your offer.
This technique was used by both presidential candidates (Trump and Clinton) in the 2016 US elections. Using the line of sight on landing pages as seen in the pictures below, Clinton and Trump’s marketing team guided visitors to the forms on their respective landing pages.
Like arrows, the line of sight in an image can be used to draw attention to a CTA button or something significant on the image. It could be a simple eye illustration, an animal picture, or a human photograph looking towards the action point as seen in the image below.
This technique is particularly effective for social media ads with pop-up forms, testimonials, and landing pages. Whatever you do, make sure you don’t use a human looking away from the intended target.
4. Product imagery
Consistent and high-quality imagery that perfectly describes your product or service is one of the best ways to engage your audience with your brand.
Humans have a short attention span, which leaves you with three seconds or less to capture your audience. Your social media images represent your brand and shape how customers view your products, determining whether or not they will purchase.
To get the perfect product imagery for your social media that will engage your audience, use high-quality images and high color accuracy. Also, take great close-up photos from different angles to help your customers easily analyze the product.
Visual design is not limited to using videos and GIFs on landing pages; adding cues is an effective way to convince visitors to act. Don’t be limited by your visitors’ attention span: grab the bull by the horns and guide visitors to a mutually desired outcome with the help of visual cues. Visual design cues, used properly, will help increase conversion rates and customer satisfaction.
Tell us how you have or plan to make your website stand out with interesting usage of visual design cues.
Pius Boachie is the founder of DigitiMatic, an inbound marketing agency.
The post Four visual design cues ecommerce marketers should use in 2019 appeared first on Search Engine Watch.
For some businesses, there is only a small team, or even one individual, in charge of all the pay-per-click, or PPC advertising.
And that one person, or team, may have other responsibilities that cut into that PPC management time. As the business grows, keeping up with all the work that goes into a well-managed PPC account (or multiple accounts) can be difficult. So it may be necessary to outsource some, or all, of this workload.
The challenge now is to find the right PPC software or management company. This article will help you identify the needs of your business and what to look for in the companies or software that best fit those needs.
Identify your business needs
There are many options available to small businesses for PPC advertising management including tools and specialized software, all the way to full-service PPC management agencies. The decision to go with one over the other depends on three things:
If you want to keep full control of your accounts, can devote some time to managing them, and are working with a conservative budget, then PPC management software may be perfect. However, if you don’t mind handing over the reins, or have little or no time to spare, and can afford the extra expense, then a full-service PPC management agency might be right for you.
Read also: 10 reasons to hire a PPC management expert.
Not all businesses are going to fall neatly into these options, and most will actually be somewhere in between. Other considerations to take into account when identifying your business’s needs include the number of PPC accounts you have, average monthly spend, ad type mix, and whether you currently manage PPC manually or with automation.
Types of PPC accounts and ads
PPC accounts could include one, all, or a mix of the following:
Research software and management companies
I was recently tasked by my company with researching and creating a top-three list of PPC management companies or software tools, in order to free up my time to work on our websites.
So I began my search for companies that provide these services and software. After compiling a list of the ones I wanted to investigate further, I signed up for software demos, free trials, free PPC audits, or free phone consultations.
Below are some details of my research into these companies. This is clearly not an exhaustive list of what’s available out there, but it’s a helpful guide if you’re unsure of where to start.
PPC management agencies
Features: Free PPC audit with follow up consultation and demo of software designed for Google Ads and Google Shopping feed management. Management services include PPC lead gen, paid search, Amazon, and Facebook.
Pricing: Not listed on their site, but during the consult they provided a price range based on our current accounts and recommended services. The estimate I received for the company I work for was a range of $2,500 to $3,500 for the one-time setup fee, plus 25 percent of our monthly ad spend.
My research experience: Signing up for the audit and demo was easy, and they contacted me pretty quickly about getting set up. The consult call lasted 45 minutes and was a detailed overview of our accounts, including areas that were and weren’t working in our favor, how they could be improved, and how they would help solve these issues as well as improve our overall account performance. After the representative went over the audit results, it was recommended that we go with PPC management services rather than the CAPx software alone.
Final impression: I was really impressed with the detailed results and feedback I received about our account. Using the software without managed services was definitely not something that would be beneficial for us, nor could we use it to manage Bing Ads. The pricing was also quite high, more than what we can fit into our budget. Their PPC management services did sound robust and valuable, just not the right fit for us.
Features: Full-service management with dedicated account management, research, analysis & strategy development, comprehensive optimization, and custom reporting are all included in their listed services.
Pricing: Not listed on their website, but included in the proposal. The fee schedule is based on monthly ad spend as follows: 13 percent for more than $5,000 in monthly spend, 15 percent for $3,000 to $5,000 in monthly spend, or a flat fee of $399 for less than $3,000 in monthly ad spend, per account.
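To see what that fee schedule means in practice, here's a small calculator sketch. How the boundary values (exactly $3,000 or $5,000) are treated is my assumption; the schedule as quoted doesn't say:

```python
def monthly_fee(ad_spend: float) -> float:
    """Monthly management fee per account under the quoted schedule.

    Boundary handling (exactly $3,000 or $5,000) is assumed, as the
    quoted schedule doesn't specify which tier those fall into.
    """
    if ad_spend > 5_000:
        return round(ad_spend * 0.13, 2)  # 13% tier
    if ad_spend >= 3_000:
        return round(ad_spend * 0.15, 2)  # 15% tier
    return 399.0                          # flat fee below $3,000

print(monthly_fee(8_000))  # 1040.0
print(monthly_fee(4_000))  # 600.0
print(monthly_fee(2_000))  # 399.0
```

Note how the flat fee makes small budgets relatively expensive: $399 on $2,000 of spend is an effective rate of nearly 20 percent.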
My research experience: Signing up for the free consultation was easy, but no audit was performed first. The consult call was about 15 minutes, and we reviewed what kinds of accounts we currently have, what we’re looking for, and more. I received a proposal a few days later that included a high-level overview of the services they provide along with their fee schedule.
Final impression: Although they were quick to set up a call after sign-up, the call was too short to fully understand our needs and accounts. It took a few days to get the proposal, and it was very generic, not designed around our business’s specific needs. Looking back at their website, there are no client testimonials, very basic high-level content, and very little company information. We did have a good call, and their representative was great to talk with, so it’s possible the customer service could be great. I would recommend further research.
Features: Integrated service offerings include digital strategy, ad copy, PPC analytics, display and paid search, a dedicated PPC account manager, and an account specialist, with weekly, monthly, and ad hoc client calls plus weekly and monthly reporting.
Pricing: A one-time new client fee of $1,495 includes utilization of Google Ads and Bing Ads, plus a monthly management fee based on monthly ad spend, broken into tiers starting at $600 (up to $3,000 in ad spend) and reaching 10 percent (at $30,000 to $150,000 in ad spend).
My research experience: Signed up for their free PPC audit, which they linked directly to our Google Ads account, and then sent a detailed report within a few days that included specifics of actionable items, and a description of the on-boarding process. The audit report features JumpFly agency intro, audit (initial thoughts), and specific areas they see that need work.
Final impression: It was a very simple process for the audit and the report had lots of details, specific to our account, of how to optimize, and grow, our account. The report was comprehensive, not simply focusing on their service. They even had simple, actionable items that I could immediately implement. The pricing seemed reasonable enough for what we’d get in return. The company also had a very thorough website and appears knowledgeable and trustworthy.
PPC management software tools (with and without additional managed services)
Pricing: A free 14-day trial offer plus a free live demo. Pricing is based on the number of AdWords accounts and Merchant Center accounts. Up to five accounts – $250 per month; up to 25 accounts – $459 per month. Discount available if paying annually.
My research experience: The trial was easy to start and set up, bringing in your accounts. The recommendations it gave were decent, but there were times when the ad platforms themselves offered suggestions that this software did not. It was very simple to use and straightforward about what it does (and doesn’t do).
Final impression: A big con was that it’s designed for Google AdWords only, with no Bing Ads management. I was also unable to find any training, though there’s a live chat box and an article finder for help. The software is not very comprehensive: the only thing I could see it does is offer recommendations to optimize your account; you can’t create campaigns or do any restructuring within the software. That said, if you want something that can offer suggestions on dozens of Google Ads accounts in one place, it may be helpful on a day-to-day basis for quick optimizations.
Features: The PPC management software suite includes account management for Google AdWords, Bing Ads, and Google Shopping feeds; one-click optimizations; data insight tools; a report designer; a quality score tracker; Bing tools; Google Ads scripts; Google Shopping campaign tools; advanced reporting features; advanced shopping features; a rule engine and custom optimizations; a custom domain for reports; training sessions (two personal sessions with Pro accounts); email support; unlimited accounts linked to the software; and up to $500k in monthly spend linked. Plus, automation credits (250 free credits per month with a Pro account).
Pricing: Regular $499 per month for a Pro account (discounts for six months or annual payments). Enterprise level available, have to contact for pricing.
My research experience: The free 14-day trial was easy to sign up for and I was able to quickly pull in all our AdWords accounts, analytics accounts, Bing Ads, and shopping feeds. It was nice to have everything in one place. The software gave decent recommendations. Training is offered with a free Udemy private training course.
Final impression: A very comprehensive and impressive piece of software. Plus, for us, we’d be able to manage all of our accounts in one place, with the ability to create ads, build structure, and set up custom rules. The price is really great for what this tool can do. I think this is the perfect software for someone who has a really good working knowledge of PPC accounts and knows what needs to be done (as far as structuring). After the initial work of getting everything set up and creating rules, it will definitely be a time-saver as well.
Pricing: On their website, they have a very cool sliding scale that includes different packages of software with or without mentoring, and with or without managed services, at different business levels. Based on my company’s monthly ad spend, the approximate cost for us is as follows: software alone $1,000, software with mentoring $1,500, or software with mentoring and managed services $2,250.
My research experience: Very easy to sign up for a free 30-day trial. There are tons of training videos, plus support via email, chat, and phone. Free academy training is also available. That being said, it was a bit overwhelming and I wasn’t sure how to use the software, so I requested a free demo. I had a consult call with them for about 30 minutes to explain how it all works, discuss our business needs, and receive recommendations, based on our business and customers, for which package would be right for us.
Final impression: The software is fantastic and pulls in all of your inventory in basically any format you want. Then all of your accounts and ads can be structured around that data, or you can continue with the structure you already have. Their managed services are also incredibly flexible; you can use them as little or as much as you want. The software was more advanced than what I would want if I were going for software alone. However, if you’re looking for a mix of software and managed services, want to save a ton of time, but still want to retain control and decision-making, then this might be the perfect solution.
With hundreds of PPC software tools and management agencies, how do you know which to choose? Knowing what your business needs, what you currently have in place, and what you hope to achieve with PPC services will help you navigate these waters.
Knowing what to look for in software capabilities and features, or with agencies and their services, will help you make an educated decision. Also, check into the company, look for reviews, see how long the company has existed, and who are their clients. Take your time, know your business, and do your research.
Got any first-hand experiences with PPC management solutions, tools, and services? Share them in the comments.
The post How to find a PPC management solution for your business appeared first on Search Engine Watch.
A new study by Kaizen has revealed that content that performs well for backlinks does not necessarily perform well for social shares and vice versa.
Analyzing over 2,300 pieces of finance content, Kaizen found the best-performing pieces by URL rating, number of referring domains, and number of social shares. Nine of the top 10 pieces of content with the highest URL ratings also featured in the top 10 for most referring domains.
This shows a clear correlation between the two: the higher the number of referring domains, the higher the URL rating.
The best-performing piece of content for both URL rating and number of referring domains was the Corruption Perceptions Index 2017, by Transparency International. The campaign highlighted the countries that are or are not making progress in ending corruption, finding that the majority of countries were making little or no progress.
But what made this campaign succeed so well in SEO terms?
1. It has global appeal
By placing emphasis on visual components of content, the campaign is easily understandable without language and is based on data from across the world, making it globally link-worthy.
2. It is emotional content
The piece evokes an emotional response from the element of corruption and the fact that the majority of countries in the world are making little or no progress in ending corruption.
3. It is evergreen content
“Evergreen content” is content that is not tied to a specific date or time of the year and can be outreached (and can gain links) at any time. In addition, Transparency International is able to update the data each year, creating a new story for outreach and increasing its chances of landing links.
By combining these typical elements of viral content, the Corruption Perceptions Index earned 6,372 referring domains and a URL rating of 84, making it the most successful piece of finance content in the study. Use these three aspects as a checklist for your own content, and it should achieve similarly strong results.
The Corruption Perceptions Index also ranked in the top 10 pieces of content for social shares, with a grand total of nearly 48,000. However, it is one of only two pieces to rank in the top 10 both for URL rating or referring domains and for social shares. There is much less correlation between social share success and backlink success, showing that the two are not directly or significantly linked.
The most successful piece of content for social shares was this car insurance calculator by Confused.com, with 91,000 total social shares. This piece of content, as well as the majority of the top 10, is B2C-focused. In comparison, the URL rating and referring domains lists are more technical and B2B-focused.
Therefore, B2B content performs better for SEO strategies focused on backlinks, whereas B2C tools and guides suitable for customers rather than businesses perform better for social shares.
The Corruption Perceptions Index is an exception, performing well for both backlinks and social shares. By focusing on analytical data from experts and business people, and by providing relevant data for both businesses and customers, it holds equal value for B2B and B2C audiences.
Don’t expect the same piece of content to perform well for both backlinks and social shares. But, if you are able to create content that provides equal value for both B2B and B2C communities, you will have the opportunity for multiple outreach strategies, with resounding value throughout the industry.
Nathan Abbott is Content Manager at Kaizen.
The post Backlinks vs social shares: How to make your content rank for different SEO metrics appeared first on Search Engine Watch.
It has been a while since Google has had a major algorithm update.
They recently announced one, which began on the 12th of March. What changed? It appears multiple things did.
When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.
And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.
If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.
Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.
In the most recent algorithm update some sites which were penalized in prior "quality" updates have recovered.
Though many of those recoveries are only partial.
Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.
The first penalty any website receives might be the first of a series of penalties.
If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.
Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse - a pile of algorithmic debt which must be dug out of before the bleeding stops.
Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.
The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?
That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.
A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.
If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.
The more something looks like eHow, the more fickle Google's algorithms will be in how they receive it.
Google does not like websites that sit at the end of the value chain & extract profits without having to bear the far greater risk & expense earlier in the cycle.
Thin rewrites, largely speaking, don't add value to the ecosystem. Doorway pages don't either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.
Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.
This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third-party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.
As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).
It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.
Some people do the opposite & make up for a revenue shortfall by publishing more low-end content at an ever-faster rate and/or increasing ad load. Either typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties that drive traffic even lower.
In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.
Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:
Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.
Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:
Each one of the above could take a double digit percent out of a site's revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.
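The compounding effect is easy to underestimate. As a rough illustration (the 20% figure per change is an assumed example, not a measured one), several independent double-digit hits multiply rather than add:

```python
# Rough illustration: several independent revenue hits compound multiplicatively.
# The 20% per-change figure is an assumed example, not measured data.
hits = [0.20, 0.20, 0.20, 0.20]  # four changes, each removing 20% of revenue

remaining = 1.0
for h in hits:
    remaining *= (1 - h)

decline = 1 - remaining
print(f"Remaining revenue: {remaining:.1%}, total decline: {decline:.1%}")
# Four 20% hits leave roughly 41% of revenue, i.e. a ~59% total decline,
# in line with the "60%+ decline" scenario described above.
```

Layer an algorithmic penalty on top of that multiplication and the "chop a zero or two off" outcome follows quickly.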
Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:
And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they've pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.
They've recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:
Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.
As the market caps of big tech companies climb, they need to be more predatory to grow into their valuations & retain employees with stock options at an ever-increasing price.
They've created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).
The above sort of dynamics have some claiming peak California:
If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can't really control the algorithms or the ecosystem.
All you can really control is your mindset & ensuring you have optionality baked into your business model.
As the update ensues, Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.
Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates.
That same process is ongoing with Google now & in the coming weeks there'll be the next phase of the current update.
So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there'll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.
In recent years, the nature of SEO has become more and more data-driven, paving the way for innovative trends such as AI or natural language processing.
This has also created opportunities for smart marketers, keen to use everyday tools such as Google Sheets or Excel to automate time-consuming tasks such as redirect mapping.
Thanks to the contribution of Liam White, an SEO colleague of mine always keen on improving efficiency through automation, I started testing and experimenting with the clever Fuzzy Lookup add-in for Excel.
The tool, which allows fuzzy matching of pretty much any set of data, represents a flexible solution for cutting down manual redirects for 404 not-found pages and website migrations.
In this post, we’ll go over the setup instructions and hands-on applications to make the most of the Excel Fuzzy Lookup for SEO.
1. Setting up Excel Fuzzy Lookup
Getting started with Fuzzy Lookup couldn’t be easier — just visit the Fuzzy Lookup download page and install the add-in onto your machine. System requirements are quite basic. However, the tool is specifically designed for Windows users — so no Mac support for the moment.
Unlike VLOOKUP’s approximate match (which settles for the first result), Fuzzy Lookup operates in a more comprehensive way, scanning all the data first and then providing fuzzy matches based on a similarity score.
The score itself is easy to grasp: a score of one is a perfect match, and the score decreases with matching accuracy down to zero, where there is no match at all. It’s advisable not to venture below the 0.5 to 0.6 similarity threshold in the settings, as below that limit the results are not consistent enough for site migration or 404 redirect purposes.
For greater accuracy, it’s also desirable to trim the domain (or staging site equivalent) from the URLs, making sure that the similarity score is not altered by too many commonalities. For more information about the setup, you can also refer to this Fuzzy Lookup guide.
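You can sanity-check both ideas (the 0-to-1 score and the effect of trimming the domain) in plain Python with difflib. This is only a stand-in for Fuzzy Lookup's actual matching algorithm, and the URLs are made up, but it shows why a shared domain inflates the score of even unrelated pages:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Approximate 0-to-1 similarity score; 1.0 is a perfect match."""
    return SequenceMatcher(None, a, b).ratio()

# Two unrelated pages on the same (made-up) domain
old_url = "https://example.com/sofas/corner-sofas-20600"
new_url = "https://example.com/contact-us"

# With full URLs, the shared domain inflates the score...
full_score = similarity(old_url, new_url)

# ...so trim the domain and compare paths only
prefix = "https://example.com"
trimmed_score = similarity(old_url[len(prefix):], new_url[len(prefix):])

print(f"full URLs: {full_score:.2f}, paths only: {trimmed_score:.2f}")
```

The path-only score comes out noticeably lower for this unrelated pair, which is exactly the discrimination you want before applying a 0.5 to 0.6 threshold.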
2. Redirect mapping automation and its benefits
Considering the time necessary to familiarize yourself with the site, categories, and products/services, it’s safe to assume that a person would manually match two URLs roughly every thirty seconds. If that doesn’t sound too bad, consider that at that pace a website of 1,000 URLs would take over eight hours (1,000 URLs × 30 seconds ≈ 8.3 hours), making it quite a tedious and time-consuming task.
Bearing in mind that Fuzzy Lookup can provide nearly immediate results with a reliable fuzzy match for at least 30 to 40 percent of the URLs, this approach starts to look interesting. In terms of time saved, that translates to about three hours for a small site or over ten hours for a large ecommerce site.
3. Dealing with site migration redirects
If you are changing the structure of a site, consolidating more domains into one, or simply switching to a new platform, then redirect mapping for a website migration is definitely a priority task on your list. Assuming that you already have a list of existing pages plus the new site URLs, then you are all set to go with Fuzzy Lookup for site migrations.
Once you have set up the two URL lists in two separate tables, you can fire up the Fuzzy Lookup and order the matched URLs by the similarity score. In my tests, this has proven to be an effective, time-saving solution, helping in cutting down the manual work by several hours.
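The same old-list/new-list workflow can be prototyped outside Excel. Below is a minimal sketch using difflib as a stand-in for Fuzzy Lookup's matcher (the URL paths are invented for illustration), keeping only matches at or above the ~0.6 threshold and sorting by score so the highest-confidence redirects come first:

```python
from difflib import SequenceMatcher

old_paths = [
    "/sofas/corner-sofas-20600",
    "/beds/king-size-bed-10432",
    "/outdoor/garden-chair-77001",
]
new_paths = [
    "/living-room/corner-sofas-20600",
    "/bedroom/king-size-beds-10432",
    "/contact-us",
]

def best_match(path, candidates):
    """Return the (candidate, score) pair with the highest similarity."""
    scored = [(c, SequenceMatcher(None, path, c).ratio()) for c in candidates]
    return max(scored, key=lambda pair: pair[1])

THRESHOLD = 0.6  # below this, results aren't reliable for redirect mapping

mapping = []
for old in old_paths:
    new, score = best_match(old, new_paths)
    if score >= THRESHOLD:
        mapping.append((old, new, round(score, 2)))

# Highest-confidence redirects first, for top-down manual review
mapping.sort(key=lambda row: row[2], reverse=True)
for old, new, score in mapping:
    print(f"{old} -> {new}  ({score})")
```

Note that the discontinued garden-chair page finds no match above the threshold, so it is left for manual handling rather than being redirected somewhere misleading.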
As displayed in the screenshot below, the fuzzy matching excelled with product codes and services/goods (such as 20600 and corner-sofas). This allows the matching of IDs with IDs, or of the URL with the parent category when an identical ID is not available.
4. 404 error redirects
Pages with a 404 status code are part of the web, and no website is immune: every site hosts at least a few of them. Having said that, 404 errors have the potential to create problems, hurting both user experience and SEO. Fuzzy Lookup can help with that, requiring just one simple addition: a recent crawl of your site to extract the list of live pages, like the example below:
The fuzzy matching works pretty well in this instance too, matching IDs with IDs and falling back to the most relevant category when a similar product/service is not live on the site. As with site migrations, the manual work is not completely wiped out, but it’s made a whole lot easier than before.
5. Bonus: Finding gap/similarities in the blog
Another interesting application of Excel Fuzzy Lookup is analyzing the blog section. Why? Simply because if you’re not in charge of the blog, you’re unlikely to be aware of what’s in it now and what has been written in the past.
This solution works in two ways: if a similarity is found, you have confirmation that the topic has already been covered. If not, there’s still room for creating relevant content that can be linked to the service/product category to improve organic reach.
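The same fuzzy idea extends to post titles. A small sketch (made-up titles, with difflib's `get_close_matches` standing in for Fuzzy Lookup) that flags a proposed topic as already covered when it closely matches an existing post:

```python
from difflib import get_close_matches

existing_posts = [
    "How to choose a corner sofa for a small living room",
    "King size bed buying guide",
    "Garden furniture care tips",
]

proposed = "Choosing a corner sofa for small living rooms"

# cutoff plays the role of the similarity threshold; 0.6 mirrors the
# lower bound recommended for Fuzzy Lookup earlier in this article
matches = get_close_matches(proposed, existing_posts, n=1, cutoff=0.6)

if matches:
    print(f"Already covered: {matches[0]}")
else:
    print("Gap found: topic not yet covered")
```

A match above the cutoff means the topic exists; an empty result is your content gap.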
Time is money, and when it comes to dealing with large numbers of URLs that need to be redirected, a solution like Fuzzy Lookup can help you in cutting down the tedious manual redirect mapping. Thus, why not embrace fuzzy automation and save time for more exciting SEO tasks?
Marco Bonomo is an SEO & CRO Expert at MediaCom London. He can be found on Twitter @MarcoBonomoSEO.
The post Excel Fuzzy Lookup for SEO: Effortless 404 and site migration redirects appeared first on Search Engine Watch.
There is certainly a big pool of choices for agencies to choose from when it comes to picking a website audit tool.
There are standalone tools and those that come as part of a package. Some audit tools go through all pages of a website while others just give an overview of a specific page. And there are some tools that claim to be developed specifically for agencies but in reality can’t cope with the requirements that are very important to digital agencies and digital commerce services.
In this article, we are going to dive deep into what exactly agencies need when it comes to website audits and what to look for when choosing the tool for this task.
The audit has to scan deeply and provide a detailed report
Let’s start by reviewing the capabilities of website audit tools in terms of how deeply and completely they audit websites. Google ranks pages, not entire sites. So, logically, a website audit has to analyze each page separately. But that’s only true at a glance, so to speak, because Google considers everything:
Advice #1: Choose a website audit with a customizable setting
Select a website audit tool that is powerful enough to scan as deeply as possible (scan all pages, subdomains, and even test pages), and with a high level of detail, so that you can see which pages require your immediate attention and which ones can wait a little.
My favorite website audit tool in terms of control and completeness is that of SE Ranking.
These guys really went the extra mile in creating a tool that lets users set the scanning depth and speed. You can decide which pages to audit (you can upload URLs in an Excel file, configure whether the scanning process should follow or ignore robots.txt rules, or set up your own rules). Plus, you can choose the maximum number of pages to audit and define what should be treated as an error for the key optimization points.
I also like the fact that all their reports are white-labeled and highly customizable. On top of that, I love that the functionality of their website audit includes the option to create an XML sitemap right there in the tool and in just a few clicks.
Lead generation and website audit
SEO, as part of marketing, is a way to drive and convert traffic. From that perspective, a website audit helps discover errors and fix problems that prevent website pages from ranking higher in SERPs. In other words, you are using a website audit to make your website earn more money. But for those who make a living out of digital marketing, a website audit is also a tool that generates income by itself, so to speak.
Advice #2: Choose an audit tool that comes with embed options (aka widgets) that will generate leads for you in exchange for a website or an on-page audit
For example, SE Ranking offers a tool called Lead Generator, which is a web form installed on your site that provides a free on-page audit to anyone who fills out the form (I also like the one from MySite Auditor). The audit report comes in a nice, easy-to-read format that anyone can understand.
Such widgets bring value to your visitors and build trust in your services. And for you, they bring qualified leads and a list of the problems their pages have, which is a great starting point for a nurturing conversation.
When choosing which widget to pick, check the following:
Ideally, the lead generation option should be easily customizable and able to integrate with your CRM; that way, you can incorporate it nicely into your site and your business operations.
SEO software that offers a lead generation form: WooRank, SE Ranking, MySite Auditor.
White-labeled SEO platform with a website audit
Presenting your SEO services based on data collected by your own technology is an incredibly powerful way to create brand ambassadors for your agency. You can use software that offers only a white-labeled website audit and customizable reporting, but I found those to come with a lot of limitations. For an agency, a full white-label option is the best choice, regardless of whether you want to run a website audit or present an array of your digital services in a way that reflects your identity in the most complete manner.
Advice #3: Make sure the software you pick for website audit offers a white label option
Building credibility and trust with your clients is the number one priority for agencies. As found in multiple sources, about 70% of the business that comes to agencies is generated through referrals, which are built on trust and loyalty.
A few suggestions on how to choose software with the white label option:
Another valuable tip: Use your own domain/subdomain for white label SEO. That way the services you are providing will look authentic and genuine with no traces back to the parent software.
SEO software where white label comes with the subscription: SE Ranking, WebCEO, BrightLocal, NinjaCat. More tools are listed here.
Comparison data in a website audit
I don’t know about you, but the majority of my clients expect results the day after signing the retainer for my services. Their web pages should be at the top of the SERPs right away, which, of course, is not possible. So what else can agencies do to justify their bread and butter for a good number of months before their efforts start yielding tangible results? I show dynamics.
Advice #4: Make sure that the website audit tool you pick comes with comparative data and dynamics analysis
A website audit is a great tool for showing progress, especially if the tool you choose provides comparison data and analytics over time. I like to show clients the initial report with all sorts of errors in it, and then provide the same cut one month from the first report, two months, and so forth. Look how many errors we fixed, I tell my clients, how many things are optimized, and how many improvements we’ve implemented.
SEO software that offers a comparison audit: MySiteAuditor, SEMrush, Screaming Frog, SEO Report Card.
Reporting module in a website audit
I can’t stress enough how important it is for agencies to be able to slice and dice data they are working with for a client in a visually appealing and informative format. I would say that reporting is a tool for keeping clients happy while having a close eye on the ROI of your SEO progress and investments.
Advice #5: The website audit tool of your choice has to come with a robust and highly customizable reporting module
Of course, any website audit comes with a report — how else would you see all the errors and data obtained as a result of the scanning? However, not all come with options that are absolutely necessary for agencies:
SEO software that offers robust reporting within the website audit: SE Ranking, MySiteAuditor, SEMrush, Ahrefs.
A few words in conclusion
I know some people say that agencies or marketing teams like to use website auditing software that does just that: run a website audit.
But in my years of experience, I’ve found that to be highly inefficient, since there are so many tasks to do for a client and so many projects to handle. Juggling between interfaces, reports, and functionality gets really annoying, if not completely chaotic.
What I do is use an SEO platform that comes with everything and has a very powerful website audit module that complies with all the requirements I outlined above. It saves me time and money. And I use one that comes with a white label, so everything I present to my clients is deeply branded and rooted in my business values.
The post How to pick the best website audit tool for your digital agency appeared first on Search Engine Watch.
Who would argue that building an SEO strategy isn’t a time-consuming thing? Keyword research, niche analysis, technical audits, link building — all these tasks are just a small part of an SEO’s daily routine.
Looking to automate search engine optimization processes, experts use specialized tools and software. But these aren’t always sufficient when it comes to analyzing the results.
Of course, solving basic issues for a small website isn’t that difficult with quality SEO tools. On the other hand, if you work with several sites and analyze lots of data, you’ll need to find ways of saving time. At this point, people often look into adding other methods to their workflow, usually various SEO extensions and plugins. These are very convenient, as you can activate them in one click right from the page you’re analyzing.
However, extensions often have even fewer features than the SEO tools themselves. Taking a closer look at the issue, there’s one more option worth considering: APIs, a method few people know how to use, which means they miss the opportunity to benefit from it. In this article, I’ll tell you what an API is, why you need it, and how to use it to fulfill SEO tasks.
What is an API?
API stands for application programming interface. It’s a set of functions that lets users access the data or components of a tool. In other words, an API is a set of methods of communication between applications.
APIs may serve various purposes. For instance, developers often use them to embed objects into websites. If you see a piece of Google Maps on a site, it means the Google Maps API is being used there. The same can be done with apps or tools.
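In practice, "a set of methods of communication" usually means building an HTTP request and parsing a structured (often JSON) response. Here is a minimal sketch; the endpoint, token, and response fields are all invented for illustration and do not belong to any real service:

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint and token, purely for illustration
BASE_URL = "https://api.example-seo-tool.com/v1/domain_info"
params = {"token": "YOUR_API_TOKEN", "domain": "example.com", "se": "g_us"}

request_url = f"{BASE_URL}?{urlencode(params)}"
print(request_url)

# In real use you would fetch request_url (e.g. with urllib.request.urlopen)
# and the service would answer with structured data, typically JSON:
raw_response = '{"domain": "example.com", "keywords": 1540, "traffic": 20300}'
data = json.loads(raw_response)
print(data["keywords"])
```

The whole exchange is just a URL with parameters going out and a parseable data structure coming back, which is what makes APIs so easy to wire into spreadsheets, dashboards, or scripts.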
Why does an SEO expert need this?
The right API helps experts simplify the whole process of data collection. Some SEO tools offer their customers the opportunity to use their APIs and drive better results, letting users integrate the analytics provided by the platform into their own custom tools. With an API, you can request data and receive it without ever touching the tool’s interface.
Advantages of an API:
Four tasks you can better solve with an API
As previously mentioned, APIs let you make your SEO research much more flexible than typical tools do. So, what tasks exactly do APIs help with, and how can you use them for maximum profit?
While SEOs have various issues to deal with, there are different platforms created to facilitate keyword research, niche analysis, content curation, and evaluation of the results. Some of these tools provide APIs to make the research even more effective. Below you’ll find the tasks an API may help you cope with and the tools providing such a method for their customers.
1. Keyword research and batch analysis of websites
A comprehensive niche analysis and proper keyword research are the first tasks appearing in an SEOs’ to-do list when they get to a new project. SEO tools meet these needs very well. Unless you don’t want to spend your time analyzing each competitor or keyword individually. For this purpose, quality tools provide their APIs.
With the Serpstat API, conducting complex research becomes easier than ever. Working with it, you don’t even have to know how an API actually works: Serpstat has created several documents with the scripts already implemented. All you need to do is enter your token and create your request. This document lets you take advantage of all the Serpstat API methods in one place.
This API includes domain analysis, URL analysis, and keyword research features. It provides 17 reports on competitors, domain history, top pages, related keywords, missing phrases, and more. For example, if you want to analyze your competitors’ domains, you can do it in a few clicks without spending limits on each website separately. Here are step-by-step instructions on how to do that.
Step 1: Generate your token in your Serpstat account. Starting from Plan B ($69 a month), every user has access to the API. If your plan doesn’t include API access, contact the support team via live chat to discuss options.
Step 2: Open the document and make a copy of it.
Step 3: Enter your token into the cell.
Step 4: Select a database from a dropdown list.
Step 5: Enter a list of your competitors’ domains.
Step 6: Choose Domains > Domain info report in the Serpstat tab.
Step 7: View the results.
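If you’d rather skip the spreadsheet and call the API directly, here’s a minimal Python sketch of a batch domain-info request. The endpoint path and parameter names below are assumptions for illustration only; check Serpstat’s API documentation for the exact method and field names.

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- verify the real path in Serpstat's API docs.
API_URL = "https://api.serpstat.com/v3/domain_info"

def build_domain_info_request(token, domains, search_engine="g_us"):
    """Build the request URL for analyzing several domains in one call."""
    params = {
        "token": token,                 # your personal API token
        "query": ",".join(domains),     # batch of competitor domains
        "se": search_engine,            # regional search database
    }
    return API_URL + "?" + urlencode(params)

url = build_domain_info_request("MY_TOKEN", ["example.com", "rival.com"])
```

From here, a single GET request (with `requests.get(url)`, for instance) returns the same report you would otherwise assemble domain by domain.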
2. Content curation
Knowing the most trending topics and articles is essential for anyone who wants to attract a target audience. Moreover, tracking your content performance helps publishers improve their strategies to drive higher traffic and engagement.
As blog owners usually maintain lots of documentation to report on their marketing results, integrating tracking tools into their own applications is extremely useful and saves a lot of time.
The Buzzsumo API provides a wide range of filters. By applying them, you’ll get highly specific reports that inform your content creation process. You can analyze not only your own pages but also your competitors’ top articles. Such reports will help you come up with the most engaging types of content.
Its standard API has five resources:
Links shared API request and response examples:
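As a rough illustration, a request for the most shared content might be built like the sketch below. The endpoint path and parameter names here are assumptions, not confirmed API details; the official request and response examples live in Buzzsumo’s documentation.

```python
from urllib.parse import urlencode

# Assumed endpoint shape -- confirm against Buzzsumo's API reference.
BASE = "https://api.buzzsumo.com/search/articles.json"

def build_top_content_request(api_key, query, num_results=20):
    """Query the most shared articles for a topic or a competitor domain."""
    params = {
        "api_key": api_key,
        "q": query,                 # a keyword or a domain like "example.com"
        "num_results": num_results, # how many articles to return
    }
    return BASE + "?" + urlencode(params)

url = build_top_content_request("MY_KEY", "content marketing")
```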
3. Monitor backlinks
Backlink analysis is another essential part of SEO. This process helps people see their link profiles’ weak points and discover new link building opportunities.
To integrate your applications with backlink analysis reports, you can use the Majestic API. It’s available on the Platinum and API plans. The full API lets you discover information such as referring domains, individual backlink details, anchor text, and Majestic’s Trust Flow and Citation Flow metrics.
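As a hedged sketch of how such an integration might start, here is Python code that builds a backlink-data request. The command and parameter names are assumptions, so verify them against Majestic’s developer documentation before relying on them.

```python
from urllib.parse import urlencode

# Assumed endpoint and command name -- check Majestic's developer docs.
BASE = "https://api.majestic.com/api/json"

def build_backlinks_request(api_key, target, count=100):
    """Request a sample of backlinks pointing at a target URL or domain."""
    params = {
        "app_api_key": api_key,
        "cmd": "GetBackLinkData",  # hypothetical command name
        "item": target,            # the domain or URL to analyze
        "Count": count,            # number of backlinks to return
    }
    return BASE + "?" + urlencode(params)

url = build_backlinks_request("MY_KEY", "example.com")
```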
4. Get performance metrics
Running a website without analyzing the results is a complete waste of time. So, it’s pretty difficult to find a person who owns a website and doesn’t have a Google Analytics account. The tool gives you a deeper understanding of your audience, evaluates your marketing performance, and helps you find out which tactics are working best. However, accessing this data via the tool itself isn’t always convenient.
Website owners often need to build custom dashboards and integrate their analytics reports with their business applications. For instance, if you want to create a KPI dashboard for your marketing team, integrating Google Sheets with Google Analytics will be the best decision.
The Google Analytics Reporting API lets users query report data programmatically, combine metrics and dimensions into custom reports, and feed the results into dashboards and other business applications.
To combine the power of the Google Analytics API with the power of data manipulation in Google Sheets, use the Google Analytics spreadsheet add-on. It lets you compute custom calculations, schedule report creation, share the data with your team, visualize your reports, and embed them on other websites. To install the add-on, read the step-by-step instructions by Google Developers.
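To give a concrete sense of what a Reporting API call involves, here is a minimal sketch of a v4 batchGet request body in Python. The view ID is a placeholder, and the metric and dimension names use the standard ga: namespace.

```python
import json

def build_report_body(view_id, start="7daysAgo", end="today"):
    """Build a minimal Reporting API v4 batchGet request body:
    sessions broken down by source/medium for the last week."""
    return {
        "reportRequests": [{
            "viewId": view_id,  # placeholder -- your GA view ID
            "dateRanges": [{"startDate": start, "endDate": end}],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:sourceMedium"}],
        }]
    }

body = build_report_body("123456")
print(json.dumps(body, indent=2))
```

This body would then be POSTed to the batchGet endpoint with an authorized client; the response comes back as JSON rows ready to load into a sheet or dashboard.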
Bonus: More ways to optimize SEO processes
APIs are extremely useful when you deal with vast data. And what if you need to get the results here and now? In these cases, browser extensions will be handy. They help quickly analyze your page, find technical issues, or research keywords without switching between the page and the tools. I’ll share four free SEO extensions for Google Chrome that are essential for marketers.
If you deal with content, you probably know this tool already. If not, it’s the right time to start using it. SEO TextOptimizer measures the quality of your content based on the topic and the words you use in the text.
All you need to do is enter your main keyword into its search field. The extension will show you the optimization score along with the words you should add to or remove from the article.
This SEO extension lets Serpstat users conduct SEO analysis in one click. With it, you can analyze your competitors, get your site’s top 10 keywords, get data on a domain’s traffic, see its visibility trend for a year, and more.
Serpstat SEO & Website Analysis Plugin has three tabs (Page Analysis, On-page SEO Parameters, and Domain Analysis) providing detailed information on each aspect.
This extension is an interactive SEO dashboard with an SEO overview, a backlink report, and other important metrics. However, its best feature is SERP analysis: when searching for a query, you’ll see a bar providing the most crucial domain data below each search result. With SEOquake, you get these metrics without even clicking through to the page.
The SEO and website analysis extension by Woorank is great for a quick analysis of the page’s SEO issues. It identifies crawl errors, usability, mobile friendliness, local directories, and more. The extension evaluates the total score of your marketing efforts and prioritizes all the issues for you to solve.
Optimize your working process for more effectiveness
The more tasks you have to solve, the more difficult it is to manage your time. Don’t limit yourself to SEO tools’ interfaces with the standard set of functions. Implement new methods into your SEO analysis processes to become more productive and save your time on manual work.
Tell us which of these tools have helped you save on precious productive time! Leave a comment below.
Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.
The post How to speed up SEO analysis: API advantages for SEO experts appeared first on Search Engine Watch.
As most of you know, the aggregator market is a competitive one, with the popularity of comparison sites rising.
Comparison companies are some of the most well-known and commonly used brands today. With external marketing and advertising efforts at an all-time high, people turn to the world wide web for these services. So, who is championing the online market?
In an investigation we carried out, we found that brands such as MoneySuperMarket and MoneySavingExpert are the kings of the organic market.
It’s hard to remember a world without comparison sites. It turns out that comparison websites have been around for quite some time. In fact, some of the most popular aggregator domain names have been available since 1999.
Yes 1999, two years after Google launched.
The lifespan and longevity of these sites mean that over time issues start to build up, especially in the technical SEO department. Many of us SEOs are aware of the benefits that come with spending time on technical SEO issues — not to mention the great return on investment.
As comparison sites are so popular and relied upon by users, simple technical issues can result in a poor user experience damaging customer relationships. Or worse, users seeking assistance elsewhere.
Running crawls across these comparison sites has identified the common technical SEO issues among the market leaders. Find out what these issues are and how they may be harming SEO, and see if they correlate with your own website.
1. Keyword cannibalization
When developing and creating new pages, it is easy to forget about keyword cannibalization. Duplicated templates can easily leave metadata and headings unchanged, confusing search engines about which page to rank for a given keyword.
Here is an example from GoCompare.
The page on the left has the cannibalizing first heading because its h1 sits in the top banner. The page should target the long-tail opportunity “how to make your own electricity at home,” but that phrase has been placed in an h2 tag directly under the banner.
The best course of action here would be to tweak the template, removing the banner and placing the call to action in the article body and placing the targeted keyword in a first heading tag.
Comparison sites are prime candidates for keyword cannibalization with the duplication of templates, services, and offers which results in cannibalization issues sitewide.
Run a crawl of your domain, gathering all the duplicated first-heading tags; you can use a tool such as Sitebulb for this. Work out which page is the original and which is the duplicate, then gather your keyword data to find a better keyword alternative for the duplicate page.
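The duplicate-detection step can be sketched in a few lines of Python. The crawl data here is a made-up example standing in for a Sitebulb or Screaming Frog export:

```python
from collections import defaultdict

def find_duplicate_h1s(pages):
    """Group URLs by their first-heading text and keep only the headings
    that appear on more than one page (candidate cannibalization)."""
    by_heading = defaultdict(list)
    for url, h1 in pages.items():
        by_heading[h1.strip().lower()].append(url)
    return {h1: urls for h1, urls in by_heading.items() if len(urls) > 1}

# Hypothetical crawl export: URL -> h1 text
crawl = {
    "/energy/": "Compare energy deals",
    "/energy/switch/": "Compare energy deals",  # cannibalizing duplicate
    "/mortgages/": "Compare mortgages",
}
print(find_duplicate_h1s(crawl))
```

Each group in the output is one set of pages competing for the same keyword, ready for the original-versus-duplicate triage described above.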
Talk to your SEO expert when creating new pages; they will be able to provide recommendations on URL structure, first headings, and titles. It is worth involving an SEO at the start of the planning process when rolling out new pages.
2. Internal redirects
Numerous changes can result in internal redirects; primary causes are redundant pages, upgrades to a site’s functionality, and, of course, the dreaded site migration.
When Google urged sites to move to HTTPS in January 2017, with the ideal methodology being to 301 redirect HTTP pages to HTTPS, it’s painful to think about the mass of internal redirects that created.
Here’s an example.
Comparison sites specifically need to be aware of this. Just like on ecommerce sites, products and services become unavailable. The normal behavior seems to be to then redirect that product either to an alternative page or, in most cases, back to the parent directory.
This can then cause internal redirects across the site that need immediate attention.
To tackle this issue, gather all the internal redirected URLs from your crawler.
Once you’ve done this, find the link on the parent page by inspecting the page in your browser’s developer tools.
Find where the link is and recommend to your development team that it changes the href attribute target within the link anchor to the final destination of the redirect.
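To speed up the clean-up, you can resolve each redirect chain to its final destination before handing the list to your developers. A small sketch, assuming you’ve already exported a source-to-target redirect map from your crawler:

```python
def resolve_final_targets(redirect_map):
    """Follow a crawled redirect map (source -> target) to each chain's
    final destination, so internal links can point straight there."""
    finals = {}
    for source in redirect_map:
        seen, current = set(), source
        # Walk the chain; the `seen` set guards against redirect loops.
        while current in redirect_map and current not in seen:
            seen.add(current)
            current = redirect_map[current]
        finals[source] = current
    return finals

# e.g. an HTTP page 301s to HTTPS, which 301s to a renamed product page
chains = {
    "http://site.com/loans": "https://site.com/loans",
    "https://site.com/loans": "https://site.com/personal-loans",
}
print(resolve_final_targets(chains))
```

The resulting map gives the exact href each internal link should use, skipping every intermediate hop.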
3. Cleaning up the sitemap
With loads of changes happening across aggregator sites all the time, it is likely that the sitemap gets neglected.
However, it’s imperative you don’t allow this to happen! Search engines such as Google might ignore sitemaps that return “invalid” URLs.
Here’s an example.
Usually, a site’s 400/500 status code pages are on the development team’s radar to fix. However, these pages often still sit in the sitemap. They might be set live, orphaned and noindexed, or redirected elsewhere, which leaves less severe but still real issues within the sitemap file.
Aggregators constantly have to deal with changing product ranges, new releases, and even discontinued services. New pages therefore have to be set up, redirects are then applied, and sometimes issues are missed.
First, you need to identify errors within the sitemap. Search Console is perfect for this. Go to the coverage section, and filter with the drop down. Select your sitemaps with “Filter to Sitemaps” to inspect the errors that are within these.
If your sitemap has 400 or 500 status code pages, treat these as the priority; the odd redirect or canonical issue is less severe and can be sorted out afterward.
Check your sitemap weekly or even more frequently. It is also a great way of checking your broken pages across the site.
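A quick way to automate that regular check is to pull every URL out of the sitemap and then test each one’s status code. Here is a minimal Python sketch of the extraction step, using a made-up sitemap for illustration:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> from a sitemap so each URL's status code can
    then be checked (e.g. with requests.head)."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://site.com/</loc></url>
  <url><loc>https://site.com/old-product</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```

Feeding this list into a status-code checker flags any 4xx/5xx or redirected entries that should be pulled from the file.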
4. Subdomains are causing index bloat
Behind any great comparison site is a quotation functionality. This lets users enter personal information to get a quote and revisit previously saved data, a bit like a shopping cart on most ecommerce websites.
However, these quote pages are usually hosted on subdomains and can get indexed, which you don’t want. They are mostly thin content pages, and useless pages in Google’s index equal index bloat.
Here’s an example.
The solution is to add the “noindex” meta tag to the quotation subdomains to stop them from being indexed. You can also include the subdomains in your robots.txt file to stop them from being crawled. Just make sure they have already dropped out of the search engines’ index before you add them to the file; pages blocked from crawling can’t be recrawled, so they won’t drop out of the SERPs.
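As a sketch, the two directives look like this; the subdomain name is a placeholder for your own quote subdomain.

```html
<!-- In the <head> of each page on the quote subdomain: stops indexing -->
<meta name="robots" content="noindex">
```

```
# robots.txt served at quote.example.com, added only once the pages
# have dropped out of the index: stops further crawling
User-agent: *
Disallow: /
```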
5. Spreading link equity to irrelevant pages
Internal linking is important. However, passing link equity thinly across pages can cause a loss in value. Think of a pyramid, and how the homepage spreads equity to the directory and then down to the subdirectories through keyword targeted anchor text.
These pages where equity is passed should hold that value and only link out to pages that are genuinely relevant.
As comparison sites target a range of products and opportunities it is important to include them within the site architecture, but not spread the equity thinly.
How do we do this?
1. Consider the architecture of your site. For example:
“Fixed rate mortgages” has different yearly offerings. Most sites place these under a mortgage subdirectory, but the topic could easily have its own directory. This would benefit the site architecture, as it lowers the click depth for those important pages and stops equity being spread thinly.
2. Only link to what is relevant.
Let’s take the example below. The targeted keyword here is “bad credit mortgages.” Money.co.uk then supplies a load of internal links at the bottom of the page that aren’t relevant to the keyword’s intent. The equity is therefore spread to these pages, and the page loses value.
Review the internal linking structure. You can do this by running pages through Screaming Frog, identifying pages with a click depth greater than two and evaluating their outgoing links. If there are a lot, this is a good indicator that pages might be spreading equity thinly. Manually evaluate the pages to find where the links are going, and remove any irrelevant ones that spread equity unnecessarily.
6. Orphaned pages
Following on from the above point, pages that are orphaned, or poorly linked to, will receive low equity. Comparison sites are prime candidates for this.
MoneySuperMarket has several orphaned pages, especially located in the blog section of the site.
Use Sitebulb to crawl the site and discover orphaned pages. Spend time evaluating these; it might be that some pages should be orphaned. However, if they are present in the sitemap, that indicates one of two problems: either the pages are redundant and shouldn’t be in the sitemap, or they should be linked to from the rest of the site.
If the pages are redundant, noindex them. However, if they should be linked to, evaluate your site’s internal architecture to work out a suitable linking strategy for these pages.
It is very easy for blog posts to get orphaned; using methods such as topic clustering can help your content marketing efforts while making sure your pages aren’t orphaned.
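The orphan check itself boils down to a set difference between the sitemap and the URLs your crawler actually reached through links. A minimal sketch, with made-up URLs standing in for your crawl export:

```python
def find_orphans(sitemap_urls, linked_urls):
    """Pages listed in the sitemap but never reached by internal links
    are orphans worth reviewing."""
    return sorted(set(sitemap_urls) - set(linked_urls))

# Hypothetical exports: sitemap entries vs. link-followed crawl results
in_sitemap = ["/blog/post-a", "/blog/post-b", "/mortgages/"]
found_by_crawl = ["/mortgages/", "/blog/post-a"]
print(find_orphans(in_sitemap, found_by_crawl))
```

Anything in the result either needs internal links pointing at it or should be dropped from the sitemap and noindexed.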
Last ditch tips
A lot of these issues occur across a range of sites and sectors. Because comparison sites undergo constant changes and development work, with a vast product range and loads to aggregate, it is very hard to keep up to date with technical SEO issues.
Be vigilant and delegate resources sensibly. Technical SEO issues shouldn’t be ignored: actively monitor, and run crawls and checks after any site development work is rolled out. This can save your organic performance and keep your technical SEO game strong.
Tom Wilkinson is Search & Data Lead at Zazzle Media.
The post Common technical SEO issues and fixes for aggregators and finance brands appeared first on Search Engine Watch.
It’s a pleasure to introduce myself. I’m Sean Webb, 27 years old, from Manchester, UK. I do affiliate marketing and have spent a lot of time learning how to rank for easy-to-medium competition keywords. I have recently started PPL and video marketing and am learning more about them.