Monday, January 27, 2020

The Dirty Little Featured Snippet Secret: Where Humans Rely on Algorithmic Intervention [Case Study]

Posted by brodieclarkconsulting

I recently finished a project where I was tasked with investigating why a site (that receives over one million organic visits per month) does not rank for any featured snippets.

This is obviously an alarming situation, since ~15% of all result pages, according to MozCast, have a featured snippet as a SERP feature. The project was passed on to me by an industry friend. I’ve done a lot of research on featured snippets in the past. I rarely take on one-off projects, but this one really caught my attention. I was determined to figure out what issue was impacting the site.

In this post, I detail my methodology for the project that I delivered, along with key takeaways for my client and others who might be faced with a similar situation. But before I dive deep into my analysis: this post does NOT have a fairy-tale ending. I wasn’t able to unclog a drain that resulted in thousands of new visitors.

I did, however, deliver massive amounts of closure for my client, allowing them to move on and invest resources into areas which will have a long-lasting impact.

Confirming suspicions with Big Data

Now, when my client first came to me, they had their own suspicions about what was happening. They had been advised by other consultants on what to do.

They had been told that the featured snippet issue was stemming from either:

1. An issue relating to conflicting structured data on the site

OR

2. An issue relating to messy HTML which was preventing the site from appearing within featured snippet results

I immediately shut down the first issue as a cause for featured snippets not appearing. I’ve written about this topic extensively in the past. Structured data (in the context of schema.org) does NOT influence featured snippets. You can read more about this in my post on Search Engine Land.

As for the second point, it’s closer to reality, yet still far from it. Yes, HTML structure helps considerably when trying to rank for featured snippets. But would it stop a site that ranks for almost a million keywords from ranking for a single featured snippet? Very unlikely. There’s more to this story, but let’s confirm our suspicions first.


Let’s start from the top. Here’s what the estimated organic traffic looks like:

Note: I’m unable to show the actual traffic for this site due to confidentiality. But the monthly estimation that Ahrefs gives of 1.6M isn’t far off.

Out of the 1.6M monthly organic visits, Ahrefs picks up on 873K organic keywords. When filtering these keywords by SERP features with a featured snippet and ordering by position, you get the following:

I then did similar research with both Moz Pro, using its featured snippet filtering capabilities, and SEMrush, which let me see historical rankings.

All three tools displayed the same result: the site did not rank for any featured snippets at all, despite ~20% of my client's organic keywords including a featured snippet as a SERP feature (higher than the MozCast average).

It was clear that the site did not rank for any featured snippets on Google. But who was taking this position away from my client?

The next step was to investigate whether other sites in the same niche were ranking in featured snippets. If they were, that would be a clear sign the problem was specific to my client's site.

An “us” vs “them” comparison

Again, we turn to our tools, this time to figure out the top sites based on keyword similarity. Here’s an example of this in action within Moz Pro:

Once we have our final list of similar sites, we need to run the same analysis from the previous section of this post to see if they rank for any featured snippets.

With this analysis, we can figure out whether they have featured snippets displaying or not, along with the % of their organic keywords with a featured snippet as a SERP feature.
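If you prefer working from raw keyword exports rather than inside each tool's UI, that percentage is straightforward to compute yourself. Here's a minimal TypeScript sketch, assuming a hypothetical export where each row lists a keyword, the SERP features present on its results page, and whether the site owns the snippet (the field names are illustrative, not taken from any specific tool):

```typescript
// Sketch: for one site's keyword export, compute what % of the SERPs it
// ranks on include a featured snippet, and how many of those it owns.
// The row shape below is hypothetical; adapt it to your tool's export.
interface KeywordRow {
  keyword: string;
  serpFeatures: string[];        // e.g. ["featured_snippet", "people_also_ask"]
  ownsFeaturedSnippet: boolean;  // true if this site is the snippet source
}

function snippetStats(rows: KeywordRow[]) {
  const withSnippet = rows.filter((r) =>
    r.serpFeatures.includes("featured_snippet")
  );
  const owned = withSnippet.filter((r) => r.ownsFeaturedSnippet);
  return {
    totalKeywords: rows.length,
    percentWithSnippet: (100 * withSnippet.length) / rows.length,
    snippetsOwned: owned.length, // for my client, this was stuck at zero
  };
}
```

Running that over each site's export gives you the two numbers the comparison needs.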

The next step is to add all of this data to a Google Sheet and see how everything matches up to my client's site. Here’s what this data looks like for my client:

I now need to dig deeper into the sites in my table. Are they really all that relevant, or are my tools just picking up on a subset of queries that are similar?

I found that from row 8 downwards in my table, those sites weren’t all that similar. I excluded them from my final dataset to keep things as relevant as possible.

Based on this data, I could see five other sites that were similar to my client's. Out of those five sites, only one ranked within a featured snippet.

In other words, 80% of the sites similar to my client's had the exact same issue. This is extremely important information to keep in mind going forward.

Although the sample size is considerably smaller, one of those sites has a featured snippet present on ~34% of the results it ranks for, yet it never appears in one. Comparatively, that's even more problematic than my client's situation (where the figure was ~20%).

This analysis has been useful in figuring out whether the issue was specific to my client or the entire niche. But do we have guidelines from Google to back this up?

Google featured snippet support documentation

Within Google’s Featured Snippet Documentation, they provide details on policies surrounding the SERP feature. This is public information. But I think a very high percentage of SEOs aren’t aware (based on multiple discussions I’ve had) of how impactful some of these details can be.

For instance, the guidelines state that: 

"Because of this prominent treatment, featured snippet text, images, and the pages they come from should not violate these policies." 

They then mention 5 categories:

  1. Sexually explicit
  2. Hateful
  3. Violent
  4. Dangerous and harmful
  5. Lack consensus on public interest topics

Number five in particular is an interesting one. This section is not as clear as the other four and requires some interpretation. Google explains this category in the following way:

"Featured snippets about public interest content — including civic, medical, scientific, and historical issues — should not lack well-established or expert consensus support."

And the even more interesting part in all of this: these policies do not apply to web search listings nor cause those to be removed.

It can be lights out for featured snippets if you fall into one of these categories, yet you can still rank highly within the 10-blue-link results. A bit of an odd situation.

Based on my knowledge of the client, I couldn’t say for sure whether any of the five categories were to blame for their problem. It was sure looking like it was algorithmic intervention (and I had my suspicions about which category was the potential cause).

But there was no way of confirming this. The site didn’t have a manual action within Google Search Console. That is literally the only way Google could communicate something like this to site owners.

I needed someone on the inside at Google to help.

The missing piece: Official site-specific feedback from Google

One of the most underused resources in an SEO's toolkit (in my opinion) is the Google Webmaster Hangouts held by John Mueller.

You can see the schedule for these Hangouts on YouTube here and join live, asking John a question in person if you want. You could always try John on Twitter too, but there’s nothing like video.

You’re given the opportunity to explain your question in detail. John can easily ask for clarification, and you can have a quick back-and-forth that gets to the bottom of your problem.

This is what I did in order to figure out this situation. I spoke with John live on the Hangout for ~5 minutes; you can watch my segment here if you’re interested. The result was that John gave me his email address and I was able to send through the site for him to check with the ranking team at Google.

I followed up with John on Twitter to see if he was able to get any information from the team on my client's situation. You can follow the link above to see the full piece of communication, but John’s feedback was that there wasn't a manual penalty in place for my client's site. He said that it was purely algorithmic. This meant that the algorithm was deciding that the site was not allowed to rank within featured snippets.

And an important component of John’s response:


If a site doesn’t rank for any featured snippets when they're already ranking highly within organic results on Google (say, within positions 1–5), there is no way to force it to rank.

For me, this is a dirty little secret in a way (hence the title of this article). Google’s algorithms may decide that a site can’t show in a featured snippet (but could rank #2 consistently), and there's nothing a site owner can do.

...and the end result?

The result of this, in the specific niche that my client is in, is that lots of smaller, seemingly less relevant sites (as a whole) are the ones that are ranking in featured snippets. Do these sites provide the best answer? Well, the organic 10-blue-links ranking algorithm doesn’t think so, but the featured snippet algorithm does.

This means that the site has a lot of queries which have a low CTR, resulting in considerably less traffic coming through to the site. Sure, featured snippets sometimes don’t drive much traffic. But they certainly get a lot more attention than the organic listings below:

Based on the Nielsen Norman Group study, when SERP features (like featured snippets) were present on a SERP, they received looks in 74% of cases (with a 95% confidence interval of 66–81%). This data clearly points to the fact that ranking within featured snippets, where possible, gives a site far greater visibility.

Because Google’s algorithm is making this decision, it's likely a liability thing; Google (the people involved with the search engine) don’t want to be the ones to have to make that call. It’s a tricky one. I understand why Google needs to put these systems in place for their search engine (scale is important), but communication could be drastically improved for these types of algorithmic interventions. Even if it isn’t a manual intervention, there ought to be some sort of notification within Google Search Console. Otherwise, site owners will just invest in R&D trying to get their site to rank within featured snippets (which is only natural).

And again, just because there are categories available in the featured snippet policy documentation, that doesn’t mean that the curiosity of site owners is always going to go away. There will always be the “what if?”

Deep down, I’m not so sure Google will ever make this addition to Google Search Console. It would mean too much communication on the matter, and could lead to unnecessary disputes with site owners who feel they’ve been wronged. Something needs to change, though. There needs to be less ambiguity for the average site owner who doesn’t know they can access awesome people from the Google Search team directly. But for the moment, it will remain Google’s dirty little featured snippet secret.



Friday, January 24, 2020

Measure Form Usage with Event Tracking - Whiteboard Friday

Posted by Matthew_Edgar

When it comes to the forms your site visitors are using, you need to go beyond completions — it's important to understand how people are interacting with them, where the strengths lie and what errors might be complicating the experience. In this edition of Whiteboard Friday, Matthew Edgar takes you through in-depth form tracking in Google Analytics. 


Video Transcription

Howdy, Moz fans. My name is Matthew Edgar. Welcome to another edition of Whiteboard Friday. I am an analytics consultant at Elementive, and in this Whiteboard Friday what I want to talk to you about are new ways that we can really start tracking how people are interacting with our forms.

I'm going to assume that all of you who have a form on your website are already tracking it in some way. You're looking at goal completions on the form, you're measuring how many people arrived on that page that includes the form, and what we want to do now is we want to take that to a deeper level so we can really understand how people are not just completing the form, but how they're really interacting with that form.

So what I want to cover are how people really interact with the form on your website, how people really interact with the fields when they submit the form, and then also what kind of errors are occurring on the form that are holding back conversions and hurting the experience on your site. 

1. What fields are used?

So let's begin by talking about what fields people are using and what fields they're really interacting with.

So in this video, I want to use just an example of a registration form. Pretty simple registration form. Fields for name, company name, email address, phone number, revenue, and sales per day, basic information. We've all seen forms like this on different websites. So what we want to know is not just how many people arrived on this page, looked at this form, how many people completed this form.

What we want to know is: Well, how many people clicked into any one of these fields? So for that, we can use event tracking in Google Analytics. If you don't have Google Analytics, that's okay. There are other ways to do this with other tools as well. So in Google Analytics, what we want to do is we want to send an event through every time somebody clicks or taps into any one of these fields.

On focus

So for that, we're going to send an on focus event. The category can be form. Action is interact. Then the label is just the name of the field, so email address or phone number or whatever field they were interacting with. Then in Google Analytics, what we'll be able to look at, once we drill into the label, is we'll be able to say, "Well, how many times in total did people interact with that particular field?"
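Purely as an illustration (this isn't part of the original whiteboard), here's roughly what that on-focus event could look like with a gtag.js setup; if your site uses analytics.js or Google Tag Manager instead, the call will look different, and the form selector below is hypothetical:

```typescript
// Sketch: fire a GA event whenever a visitor focuses a field in the form.
// Assumes gtag.js is already loaded on the page.
declare function gtag(...args: unknown[]): void;

document
  .querySelectorAll<HTMLInputElement>(
    "#registration-form input, #registration-form select"
  )
  .forEach((field) => {
    field.addEventListener("focus", () => {
      gtag("event", "interact", {
        event_category: "form",
        event_label: field.name || field.id, // e.g. "email_address"
      });
    });
  });
```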

GA report

So people interacted with the name field 104 times, the revenue field 89 times, sales per day 64 times, and phone number 59 times. Then we could go through all the other fields too to look at that. What this total information starts to give us is an idea of: Well, where are people struggling? Where are people having to really spend a lot of time? Then it also gives us an idea of the drop-off rate.

So we can see here that, well, 104 people interacted with the full name field, but only 89 made it down here to the revenue field. So we're losing people along the way. Is that a design issue? Is that something about the experience of interacting with this form? Maybe it's a device issue. We have a lot of people on mobile and maybe they can't see all of those fields. The next thing we can look at here is the unique events that are happening for each of those.

Unique events aren't an exact count of unique people, but they're close enough to give a general idea of how many people interacted with those fields. So in the case of the name field, 102 people interacted 104 times, roughly speaking, which makes sense. People don't need to go back to the name field and enter their name again. But in the case of the revenue field, there were 47 unique interactions and 89 total interactions.

People are having to go back to this field. They're having to reconsider what they want to put in there. So we can start to figure out, well, why is that? Is that because people aren't sure what kind of answer to give? Are they not comfortable giving up that answer? Are there some trust factors on our site that we need to improve? If we really start to dig into that and look at that information, we can start to figure out, well, what's it going to take to get more people interacting with this form, and what's it going to take to get more people clicking that Submit button?

2. What fields do people submit?

The next thing that we want to look at here is what fields do people submit. Not just what do they interact with, but when they click that Submit button, which fields have they actually put information into? 

On submit

So for this, when people click that Submit button, we can trigger another event to send along to Google Analytics. In this case, the category is form, the action is submit, and then for the label what we want to do is we want to send just a list of all the different fields that people had put some kind of information in.

So there's a lot of different ways to do this. It really just depends on what kind of form you have, how your form is controlled. One easy way is you have a JavaScript function that just loops through your entire form and says, "Well, which of these fields have a value, have something that's not the default entry, that people actually did give their information to?" One note here is that if you are going to loop through those fields on your form and figure out which ones people interacted with and put information into, you want to make sure that you're only getting the name of the field and not the value of the field.

We don't want to send along the person's email address or the person's phone number. We just want to know that they did put something in the email address field or in the phone number field. We don't want any of that personally identifiable information ending up in our reports. 
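A minimal sketch of that submit handler, again assuming gtag.js and a hypothetical form selector; note that it only ever collects field names, never values:

```typescript
// Sketch: on submit, send the names of the fields that were filled in
// (never their values, so no PII ends up in your reports).
declare function gtag(...args: unknown[]): void;

const form = document.querySelector<HTMLFormElement>("#registration-form");

if (form) {
  form.addEventListener("submit", () => {
    const filledFieldNames = Array.from(
      form.querySelectorAll<HTMLInputElement>("input, select, textarea")
    )
      .filter((field) => field.value.trim() !== "") // has something in it
      .map((field) => field.name || field.id);      // names only, no values

    gtag("event", "submit", {
      event_category: "form",
      event_label: filledFieldNames.join("|"), // e.g. "full_name|email|phone"
    });
  });
}
```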

Review frequency

So what we can do with this is we can look at: Well, how frequently did people submit any one of these fields?

So 53 submissions with the full name field, 46 with revenue, 42 with sales per day, etc. 

Compare by interact

The first thing we can do here is we can compare this to the interaction information, and we can say, "Well, there were 53 times that people submitted a field with the full name field filled out. But there were 102 people who interacted with that full name field."

That's quite the difference. So now we know, well, what kind of opportunity exists for us to clean this up. We had 102 people who hit this form, who started filling it out, but only 53 ended up putting in their full name when they clicked that Submit button. There's some opportunity there to get more people filling out this form and submitting.

Segment by source

The other thing we can do is we can segment this by source. The reason we would want to do that is we want to compare this to understand something about the quality of these submissions. So we might know that, well, people who give us their phone number, that tends to be a better quality submission on our form. Not necessarily. There are exceptions and edge cases to be sure.

But generally speaking, people who give us their phone number we know are better quality. So by segmenting by source, we can say, "Well, which people who come in from which source are more likely to give their phone number?" That gives us an idea of which source we might want to go after. Maybe that's a really good thing that your ad network is really driving people who fill out their phone number. Or maybe organic is doing a better job driving people to submit by giving you that information.

3. What fields cause problems?

The next thing we want to look at on our form is which errors are occurring. What problems are happening here? 

Errors, slips, mistakes

When we're talking about problems, when we're talking about errors, it's not just the technical errors that are occurring. It's also the user errors that are occurring, the slips, the mistakes that people are just naturally going to make as they work through your form.

Assign unique ID to each error

The easiest way to track this is every time an error is returned to the visitor, we want to pass an event along to Google Analytics. So for that, what we can do is we can assign a unique ID number to each error on our website, and that unique ID number can be for each specific error. So people who forgot a digit on a phone number, that's one ID number. People who forgot the phone number altogether, that's a different ID number. 

On return of error

When that error gets returned, we'll pass along an event where the category is form, the action is error, and the label is that unique ID number.
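Sketched out, that could look like the snippet below. The trackFormError helper and the ID numbering are hypothetical; you define the mapping of IDs to errors yourself:

```typescript
// Sketch: report a form error to GA by its unique ID. The ID scheme is
// your own, e.g. 1 = phone number missing a digit, 2 = phone number empty.
declare function gtag(...args: unknown[]): void;

function trackFormError(errorId: number): void {
  gtag("event", "error", {
    event_category: "form",
    event_label: String(errorId),
  });
}

// Hypothetical usage inside your validation logic:
// if (!isValidPhone(phoneField.value)) trackFormError(1);
```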

Frequency of errors

The first thing we can look at is how frequently each error occurs. So we can say, "Well, Error ID No. 1 occurred 37 times, and Error ID No. 2 occurred 26 times."

Segment by form completion

It starts to give us an idea of how to prioritize these errors. But the more interesting thing to look at is we want to segment by the form completion, and then we can compare these two. So we can say, "Okay, people who completed this form, how often did they get these errors?" So in this case, we can say, "Well, Error ID No. 1, 29 people got it, but 27 people who submitted this form got it."



That means pretty much everybody who got that error was able to move beyond the error and submit the form. It's not that big of a deal. It's not hurting the experience on our site all that much. It's not hurting conversions all that much. Error ID No. 4 though, 19 people got the error, but only 3 of the people who got that error were able to submit the form. Clearly whatever this ID is, whatever this error is, that's the one that's really hurting the experience on our site.

That's the one that's really going to hurt conversions. So by improving or figuring out why that error is occurring, then we can start to improve conversions on our site. I hope these ideas have given you some new ways to really track and understand how people are interacting with your forms at a deeper level.

I look forward to hearing your comments about different things you're doing on your forms, and certainly if you start using any of these ideas, what kind of insights you're gaining from them. Thank you.

Video transcription by Speechpad.com



Tuesday, January 21, 2020

How to Scale Your Content Marketing: Tips from Our Journey to 100,000 Words a Month

Posted by JotFormmarketing

In the fall of 2018 our CEO had a simple yet head-exploding request of the JotForm marketing and growth teams: Produce 100,000 words of high-quality written content in a single month.

All types of content would count toward the goal, including posts on our own blog, help guides, template descriptions, and guest posts and sponsored articles on other sites.

In case you don’t think that sounds like a lot, 100,000 words is the length of a 400-page book. Produced in a single month. By a group of JotFormers who then numbered fewer than eight.

Why on Earth would he want us to do all that?

My colleague and I trying to calculate how many blog posts it would take to reach 100,000 words.

It’s important to understand intent here. Our CEO, Aytekin, isn’t a crazy man. He didn’t send us on a mission just to keep us busy.

You see, for many months we’d dabbled with content, and it was working. Aytekin’s contributed posts in Entrepreneur magazine and on Medium were big hits. Our redesigned blog was picking up a lot of traction with the content we already had, and we were starting to understand SEO a lot better.

Still. Why would any software company need to produce that much content?

The answer is simple: infrastructure. If we could build a content engine that produces a high volume of quality content, then we could learn what works well and double down on creating great content. But in order to sustain success in content, we needed to have the pieces in place.

He allocated a sufficient budget and gave us the freedom to hire the staff we needed to make it happen. We were going to need it.

A full year later, I’m very proud to say we’ve officially crossed over the 100,000-word count in a single month [hold for applause].

However, it didn’t come without some painful learnings and mistakes.

Here’s what I figured out about scaling content through this process.

Develop a system early

Our old editorial calendar was a Google sheet. I started it back when JotForm was publishing one or two blogs per week and needed a way to keep it organized. It worked.

Back then, the only people who needed to view the editorial calendar were three people on the marketing staff and a couple of designers.

However, no spreadsheet on earth will be functional when you’re loading up 100,000 words. It’s too complicated. We discovered this right away.

After much discussion, we migrated our editorial workflow into Asana, which seemed like the closest thing to what we needed. It has a nice calendar view, the tagging functionality helped keep things orderly, and the board view gives a great overview of everyone’s projects.

This is where our marketing team lives.

Counterintuitively, we also use Trello, since it’s what our growth team had already been using to manage projects. Once the marketing team finishes writing a post, we send a request to our growth team designers to create banners for them using a form that integrates with their Trello board.

The system is intricate, but it works. We’d be lost if we hadn’t spent time creating it.

Style guides are your friends

Speaking of things to develop before you can really grow your content machine: style guides are paramount to maintaining consistency, which becomes trickier and trickier the more writers you enlist to help you reach your content goals.

We consider our style guide to be a sort of living, ever-changing document. We add to it all the time.

It’s also the first thing that any legitimate writer will want to see when they’re about to contribute something to your site, whether they’re submitting a guest post, doing paid freelance work, or they’re your own in-house content writer.

Things to include in a basic style guide: an overview of writing style and tone, grammar and mechanics, punctuation particulars, product wording clarifications, and formatting.

Cheap writing will cost you dearly

If you want cheap writing, you can find it. It’s everywhere — Upwork, Express Writers, WriterAccess. You name it, we tried it. And for less than $60 a blog post, what self-respecting marketing manager wouldn’t at least try it?

I’m here to tell you it’s a mistake.

I was thrilled when the drafts started rolling in. But our editor had other thoughts. It was taking too much time to make them good — nay, readable.

That was an oversight on my end, and it created a big bottleneck. We created such a backlog of cheap content (because it was cheap and I could purchase LOTS of it at a time) that it halted our progress on publishing content in a timely manner.

Instead, treat your freelance and content agencies as partners, and take the time to find good ones. Talk to them on the phone, exhaustively review their writing portfolio, and see if they really understand what you’re trying to accomplish. It’ll cost more money in the short term, but the returns are significant.

But good writing won’t mask subject ignorance

One thing to check with any content agency or freelancer you work with is their research process. The good ones will lean on subject matter experts (SMEs) to actually become authorities on the subjects they write about. It’s a tedious step, for both you and the writer, but it’s an important one.

The not-so-good ones? They’ll wing it and try to find what they can online. Sometimes they can get away with it, and sometimes someone will read your article and have this to say:

Screenshot of feedback for article saying it feels like it was written by a content creator, not a photographer.

That was harsh.

But they had a point. While the article in question was well-written, it wasn’t written by someone who knew much about the subject at hand, which in this case was photography. Lesson learned. Make sure whoever you hire to write will take the time to know what they’re talking about.

Build outreach into your process

Let’s be real here. For 99.9 percent of you, content marketing is SEO marketing. That’s mostly the case with us as well. We do publish thought leadership and product-education posts with little SEO value, but a lot of what we write is published with the hope that it pleases The Google. Praise be.

But just publishing your content is never enough. You need links, lots of them.

Before I go any further, understand that there’s a right and a wrong way to get links back to your content.

Three guidelines for getting links to your content:

1. Create good content.

2. Find a list of reputable, high-ranking sites that are authorities on the subject you wrote about.

3. Ask them about linking or guest posting on their site in a respectful way that also conveys value to their organization.

That’s it. Don’t waste your time on crappy sites or link scams. Don’t spam people’s inboxes with requests. Don’t be shady or deal with shady people.

Create good content, find high-quality sites to partner with, and offer them value.

Successful content is a numbers game

One benefit to creating as much content as we have is that we can really see what’s worked and what hasn’t. And it’s not as easy to predict as you might think.

One of our most successful posts, How to Start and Run a Summer Camp, wasn’t an especially popular one among JotFormers in the planning stage, primarily because the topic didn’t have a ton of monthly searches for the targeted keywords we were chasing. But just a few months after it went live, it became one of our top-performing posts in terms of monthly searches, and our best in terms of converting readers to JotForm users.

Point being, you don’t really know what will work for you until you try a bunch of options.

You’ll need to hire the right people in-house

In a perfect world JotForm employees would be able to produce every bit of content we need. But that’s not realistic for a company of our size. Still, there were some roles we absolutely needed to bring in-house to really kick our content into high gear.

A few of our content hires from the past 12 months.

Here are some hires we made to build our content infrastructure:

Content writer

This was the first dedicated content hire we ever made. It marked our first real plunge into the world of content marketing. Having someone in-house who can write means you can be flexible. When last-minute or deeply product-focused writing projects come up, you need someone in-house to deliver.

Editor

Our full-time editor created JotForm’s style guide from scratch, which she uses to edit every single piece of content that we produce. She’s equal parts editor and project manager, since she effectively owns the flow of the Asana board.

Copywriters (x2)

Our smaller writing projects didn’t disappear just because we wanted to load up on long-form blog posts. Quite the contrary. Our copywriters tackle template descriptions that help count toward our goal, while also writing landing page text, email marketing messages, video scripts, and social media posts.

Content strategist

One of the most difficult components of creating regular content is coming up with ideas. I made an early assumption that writers would come up with things to write; I was way off base. Writers have a very specialized skill that actually has little overlap with identifying and researching topics based on SEO value, relevance to our audience, and what will generate clicks from social media. So we have a strategist.

Content operations specialist

When you aim for tens of thousands of words of published content over the course of a month, the very act of coordinating the publishing of a post becomes a full-time job. At JotForm, most of our posts also need a custom graphic designed by our design team. Our content operations specialist coordinates design assets and makes sure everything looks good in WordPress before scheduling posts.

SEO manager

Our SEO manager had already been doing work on JotForm’s other pages, but he redirected much of his attention to our content goals once we began scaling. He works with our content strategist on the strategy and monitors and reports on the performance of the articles we publish.

The payoff

JotForm’s blog wasn’t starting from scratch when Aytekin posed the 100,000-word challenge. It was already receiving about 120,000 organic site visitors a month from the posts we’d steadily written over the years.

A year later, we receive about 230,000 organic site visitors a month, and that’s no accident.

The past year also marked our foray into the world of pillar pages.

For the uninitiated, pillar pages are (very) long-form, authoritative pieces that cover all aspects of a specific topic in the hopes that search engines will regard them as a resource.

These are incredibly time-consuming to write, but they drive buckets full of visitors to your page.

We’re getting more than 30,000 visitors a month — all from pillar pages we’ve published within the last year.

To date, our focus on content marketing has improved our organic search to the tune of about 150,000 additional site visitors per month, give or take.

Conclusion

Content isn’t easy. That was the biggest revelation for me, even though it shouldn’t have been. It takes a large team of people with very specialized skills to see measurable success. Doing it at large scale requires a prodigious commitment in both money and time, even if you aren’t tasked with writing 100,000 words a month.

But that doesn’t mean you can’t find a way to make it work for you, on whatever scale that makes the most sense.

There really aren’t any secrets to growing your content engine. No magic recipe. It’s just a matter of putting the resources you have into making it happen.

Best of all, this post just gave us about 2,000 words toward this month’s word count goal.



Monday, January 20, 2020

The True Value of Top Publisher Links

Posted by KristinTynski

I’m often asked about what results are earned through content marketing and digital PR.

So I decided to take a data-driven approach to quantifying the value of links from top-tier press mentions by looking at the aggregate improvements seen by a group of domains that have enjoyed substantial press attention in the last few months. Then I examined which publishers can have the biggest impact on rankings.

My goal was to answer this question: What sort of median bump can be expected when your brand secures media coverage? And how can you potentially get the biggest organic lift?

First off: Top-tier links matter a great deal

This chart represents the correlation between the number of times a site was linked to from within the article text of publishers and its rankings and traffic.

Considering the sheer number of possible variables that contribute to ranking changes (on-site factors, amount and quality of on-site content, penalties, etc.), seeing R-values (which measure the strength of the linear relationship) this high is a good result.

In general, the higher the R-score, the stronger the relationship between number of links from publishers and improvements in organic ranking.

We found significant relationships between the number of mentions on news sites ranked in the Top 500 and an even stronger relationship for those ranked within the Top 300.

The likely reason for this is twofold:

  1. Top 300 publishers confer more Domain Authority than less popular sites.
  2. Top 300 publishers often have larger syndication networks and broader visibility, leading to more links being built as the result of a press mention, leading to more Domain Authority accumulation overall.

Which publishers link out the most?

When pitching publishers, it can be extremely useful to understand who is most likely to actually provide a link.

Some publishers have policies against outbound links of any type or nofollow all outbound links.

Looking at the huge dataset, I got a better understanding of which publishers link out to other sites most frequently.

Notice the large number of local news sites with high numbers of outbound links. Local news is often keen to link out.

Unfortunately, most local news won’t have large scale syndication, so looking at top-tier publishers with large numbers of outbound links is likely a better strategy when developing a pitch list. So when you remove those from the list, here are the winners.

The top 15 national publishers that provide links

  1. Forbes
  2. The New York Times
  3. ZDnet
  4. NPR
  5. PR News Wire
  6. Seeking Alpha
  7. The Conversation
  8. USA Today
  9. CNN
  10. Benzinga
  11. Business Insider
  12. Quartz
  13. The Hill
  14. Heavy
  15. Vox

Sites like Forbes only dole out nofollow links, but many of these others provide dofollow links (in addition to just being great, high-authority coverage to achieve). Some industry specific options, like Seeking Alpha, Benzinga, and The Hill, can make for great vertical-specific dream publications to strive for coverage on.

Which publishers confer the most value in terms of organic search improvements?

Using this database, it’s possible to look at the median organic traffic gains, aggregated by the site that gave the link.

This view is filtered to only include publishers that had linked out 100+ times, in order to exclude outlier publishers with small volumes of outbound links pointing to only a handful of sites.

More popular sites are clustered near the top, further reiterating the fairly obvious point that the more popular a site, the more value a link from them will be in terms of improving organic ranking.

While most of the top-value links are from these sites, there are quite a few mid-tier sites that seem to grant disproportionate value, including several local news sites and niche authoritative publishers.

Methodology

I used The GDELT Project, a massive repository of news articles that are searchable using BigQuery, to extract the links from all news articles over the last year. Then I aggregated them by root domain.

For each domain in the GDELT dataset that was mentioned in a news article at least 30 times, I then pulled organic data from SEMrush’s API.

I combined the SERP change numbers with the cleaned GDELT data by matching on the URL of the linked-to site. This gave me organic changes (traffic volume, price, ranking keyword volume change) for each of the root URLs linked to more than 30 times from within the text of articles in the GDELT scrape.

From there, I ran a correlation analysis to see if we could find a statistically significant influence of news coverage on rankings.
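For readers who want to see what that last step looks like mechanically, here is a minimal TypeScript sketch of a Pearson correlation between press-link counts and organic traffic change. It's illustrative only, not the actual analysis code, and the row shape is an assumption:

```typescript
// Sketch: Pearson correlation (R) between the number of press links a
// domain earned and the change in its organic traffic.
interface DomainRow {
  domain: string;
  pressLinks: number;          // links from news articles (GDELT-derived)
  organicTrafficDelta: number; // change reported by an SEO tool's API
}

function pearsonR(xs: number[], ys: number[]): number {
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(xs);
  const my = mean(ys);
  let num = 0;
  let dx2 = 0;
  let dy2 = 0;
  for (let i = 0; i < xs.length; i++) {
    const dx = xs[i] - mx;
    const dy = ys[i] - my;
    num += dx * dy;
    dx2 += dx * dx;
    dy2 += dy * dy;
  }
  return num / Math.sqrt(dx2 * dy2);
}

function linkTrafficCorrelation(rows: DomainRow[]): number {
  return pearsonR(
    rows.map((r) => r.pressLinks),
    rows.map((r) => r.organicTrafficDelta)
  );
}
```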

Conclusion

Using insights like the ones above, you’ll be able to craft content better suited to those specific writers and audiences, increasing your chances of getting extremely impactful links via a digital PR strategy.

You can download the Tableau notebook and sort in the desktop version to explore the different sites relevant to your vertical. While not all of them may accept outside content, it’s a great start for building a “dream” pitch list. Study the type of content they typically publish, what their audience seems to enjoy most (based on shares and comments), and consider using these insights to hone your content strategy.



Friday, January 17, 2020

Mining Reddit for Content Ideas in 5 Steps - Whiteboard Friday

Posted by DanielRussell

For marketers, Reddit is more than a tool to while away your lunch break. It's a huge, thriving forum with subreddits devoted to almost any topic you can imagine — and exciting new content ideas lurk within threads, just waiting to be discovered. In this edition of Whiteboard Friday, Daniel Russell takes you through five simple steps to mine Reddit for content ideas bolstered by your target audience's interest.


Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. My name is Daniel Russell. I'm from an agency called Go Fish Digital. Today we're going to be talking about mining Reddit for content ideas.

Reddit, you've probably heard of it, but in case you haven't, it's one of the largest websites on the internet. It gets billions of views and clicks per year. People go there because it is a great source of content. It's really entertaining. But it also means that it's a great source of content for us as marketers. So today what we're going to be talking about is two main groups here.

We're going to first be talking about the features of Reddit, the different things that you can use on Reddit to find good content ideas. Then we're going to be talking about five steps that you can take and apply today to start finding ideas for your company, for your clients and start getting that successful content. 

Features of Reddit

So first, Reddit as a breakdown here.

Subreddits

First, a big feature of Reddit is called subreddits. They're essentially smaller forums within Reddit, a smaller forum within a forum dedicated to a particular topic. So there might be a forum dedicated to movies and discussing movies. There's a forum dedicated to food and talking about different types of food, posting pictures of food, posting recipes.

There is a forum for just about everything under the sun. If you can think of it, it's probably got a forum on Reddit. This is really valuable to us as marketers because it means that people are taking their interests and then putting it out there for us to see. So if we are trying to do work for a sports company or if we're trying to do work for our company that's dentistry or something like that, there is a subreddit dedicated to that topic, and we can go and find people that are interested in that, that are probably within our target markets.

Upvoting and downvoting

There's upvoting and downvoting. Essentially what this is, is people post a piece of content to Reddit, and then other users decide if they like it or not. They upvote it or they downvote it. The stuff that is upvoted is usually the good stuff. People that are paying really close attention to Reddit are always upvoting and downvoting things. Then the things that get the most upvotes start rising to the top so that other people can see it.

It's super valuable to us again because this helps verify ideas for us. This helps us see what's working and what's not. Before we even put pen to paper, before we even start designing everything, we can see what has been the most upvoted. The most upvoted stuff leads to the next big feature, which is rankings. The stuff that gets voted the most ends up ranking on the top of Reddit and becomes more visible.

It becomes easier for us to find as marketers, and luckily we can take a look at those rankings and see if any of that matches the content we're trying to create. 

Comments

There's the comments section. Essentially what this is, is for every post there's a section dedicated to that post for comments, where people can comment on the post. They can comment on comments. It's almost like a focus group.

It's like a focus group without actually being there in person. You can see what people like, what people don't like about the content, how they felt about it. Maybe they even have some content ideas of their own that they're sharing in there. It's an incredibly valuable place to be. We can take these different features and start digging in to find content ideas using these down here.

Reddit search & filters

Search bar

The search bar is a Reddit feature that works fairly well on its own, but it will probably yield mediocre results most of the time. You can drill down a little further with that search bar using search parameters. These parameters are things like searching by author or searching by website.

Search parameters

There are a lot of different searches that you can use. There's a full list of them on Reddit. But this essentially allows you to take that mediocre search bar and make it a little bit more powerful. If you want to look for sports content, you can look specifically at content posted from ESPN.com and see what has been the most upvoted there. 

Restrict results to subreddit

You can restrict your results to a particular subreddit. So say you're doing work for a restaurant and you're trying to find what's been the most upvoted content around chicken dishes; you don't want people calling each other chickens. What you can do is restrict your search to a subreddit so that you actually get chicken the food, rather than posts talking about how some guy is a chicken.

Filter results

You can filter results. This essentially means that you can take all the results that you get from your search and then you can recategorize it based off of how many upvotes it's gotten, how recently it was posted, how many comments it has. 

Filter subreddits

Then you can also filter subreddits themselves. So you can take subreddits, all the content that's been posted there, and you can look at what's been the most upvoted content for that subreddit.

What has been the most controversial content from that subreddit? What's been the most upvoted? What's been the most downvoted? These features make it a really user-friendly place in terms of finding really entertaining stuff. That's why Reddit is often like a black hole of productivity. You can get lost down it and stay there for hours.

That works in our benefit as marketers. That means that we can go through, take these different features, apply them to our own marketing needs, and find those really good content ideas. 
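Before we get to the steps, here's a rough sketch of how those same search filters can be queried programmatically, using Reddit's public JSON search endpoint. The q, restrict_sr, sort, and t parameters are Reddit's standard search options; the subreddit and query below are just placeholders:

```typescript
// Sketch: fetch the top posts about "chicken" in r/food from the past
// month, sorted by upvotes, via Reddit's JSON search endpoint.
interface RedditPost {
  title: string;
  score: number;        // net upvotes
  num_comments: number;
  permalink: string;
}

async function topPosts(subreddit: string, query: string): Promise<RedditPost[]> {
  const url =
    `https://www.reddit.com/r/${subreddit}/search.json` +
    `?q=${encodeURIComponent(query)}&restrict_sr=1&sort=top&t=month&limit=25`;
  const res = await fetch(url);
  const json = await res.json();
  return json.data.children.map((child: { data: RedditPost }) => child.data);
}

// Example: what chicken content has been most upvoted in r/food this month?
// topPosts("food", "chicken").then((posts) =>
//   posts.forEach((p) => console.log(p.score, p.title))
// );
```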

5 steps to finding content ideas on Reddit

So for some examples here. There's a set of key steps that you can use. I'm going to use some real-world examples, so some true-blue things that we've done for clients so that you can see how this actually works in real life.

1. Do a general search for your topic

The first step is to do a general search for your topic. So real-world example, we have a client that is in the transportation space. They work with shuttles, with limos, and with taxis. We wanted to create some content around limos. So the way we started in these key steps is we did a general search for limos.

Our search yielded some interesting things. We saw that a lot of people were posting pictures of stretch limos, of just wild limo interiors. But then we also saw a lot of people talking about presidential limos, the limos that the president rides in that have the bulletproof glass and everything. So we started noticing that, hey, there's some good content here about limos. It kind of helped frame our brainstorming and our content mining. 

2. Find a subreddit that fits

The next step is to find a subreddit that fits that particular topic. Now there is a subreddit dedicated to limos. It's not the most active. There wasn't a ton of content there. So what we ended up doing was looking at more broad subreddits. We looked at like the cars subreddit.

There was a subreddit dedicated to guides and to breakdowns of different machines. So there were a lot of breakdowns, like cutaways of the presidential limos. So again, that was coming up. What we saw in the general search was coming up in our subreddit specific search. We were seeing presidential limos again.

3. Look at subreddit content from the past month

Step 3, look at that sub's particular content from the past month. The subreddit, for example, that we were looking at was one dedicated to automobiles, as I had mentioned earlier. We looked at the top content from that past month, and we saw there was this really cool GIF that essentially took the Chevy logo from back in the '30s and slowly morphed it over the years into the Chevy logo that we see today.

We thought that was pretty cool. We started wondering if maybe we could apply that same kind of idea to our presidential limo finding that we were seeing earlier. 

4. Identify trends, patterns, and sticky ideas

Number 4 was to identify trends, patterns, and sticky ideas. Sticky ideas, it just means if you come across something and it just kind of sticks in your head, like it just kind of stays there, likely that will happen for your audience as well.

So if you come across anything that you find really interesting, that keeps sticking in your head or keeps popping up on Reddit, it keeps getting lots of upvotes, identify that idea because it's going to be valuable. So for us, we started identifying ideas like morphing GIFs, the Chevy logo morphing over time. We started identifying ideas like presidential limos. People really like talking about it.

5. Polish, improve, and up-level the ideas you've found

That led us to use Step Number 5, which is to take those ideas that we were finding, polish them, improve them, one up it, take it to the next level, and then create some content around that and promote it. So what we did was we took those two ideas, we took presidential limos and the whole morphing GIF idea over time, and we combined them.



We found images of all of the presidential limos since like the '50s. Then we took each of those presidential limos and we created a morphing GIF out of them, so that you started with the old presidential limos, which really weren't secure. They were convertibles. They were normal cars. Then that slowly morphed up to the massive tanks that we have today. It was a huge success.

It was just a GIF. But that idea had been validated because we were looking at what was the most upvoted, what was the most downvoted, what was ranked, what wasn't ranked, and we saw some ideas that we could take, one up, and polish. So we created this morphing presidential limo, and it did really well.

It got coverage in a lot of major news networks. ABC News picked it up. CBS talked about it. It even got posted to Reddit later and performed really well on Reddit. It was all because we were able to take these features, mine down, drill down, find those good content ideas, and then polish it and make it our own. 

I'm really interested to hear if you've tried this before. Maybe you've seen some really good ideas that you'd like to try out on Reddit.

Do you have like a favorite search function that you use on Reddit? Do you like to filter by the past year? Do you like a particular subreddit? Let me know down in the comments. Good luck mining ideas. I know it will work for you. Have a great day.

Video transcription by Speechpad.com



Tuesday, January 14, 2020

How to Use Tools to Determine Which Content to Re-Optimize: A Step-by-Step Guide

Posted by Jeff_Baker

Why is everyone and their grandparents writing about content re-optimization?

I can’t speak for the people writing endless streams of blogs on the subject, but in Brafton’s case, it’s been the fastest technique for improving rankings and driving more traffic.

As a matter of fact, in this previous Moz post, we showed that rankings can improve in a matter of minutes after re-indexing.

But why does it work?

It’s probably a combination of factors (our favorite SEO copout!), which may include:

  • Age value: In a previous study, we observed a clear relationship between time indexed and keyword/URL performance, absent of links.
  • More comprehensive content: Presumably, when re-optimizing content you are adding contextual depth to existing topics and breadth to related topics. It’s pretty clear at this point that Google understands when content has fully nailed a topic cluster.
  • It’s a known quantity: You’re only going to be re-optimizing content that has a high potential for return. In this blog post, I’ll explain how to identify content with a high potential for return.

How well does it work?

Brafton’s website is a bit of a playground for our marketing team to try new strategies. And that makes sense, because if something goes horribly wrong, the worst case scenario is that I look like an idiot for wasting resources, rather than losing a high-paying client on an experiment.

You can’t try untested procedures on patients. It’s just dangerous.

So we try new strategies and meticulously track the results on Brafton.com. And by far, re-optimizing content results in the most immediate gains. It’s exactly where I would start with a client who was looking for fast results.

Example: Top Company Newsletters

Example: Best Social Media Campaigns



In many cases, re-optimizing content is not a “set it and forget it” exercise, by any means. We frequently find that this game is an arms race: we will lose rankings on an optimized article and need to re-re-optimize our content to stay competitive.

(You can clearly see this happening in the second example!)

So how do you choose which content to re-optimize? Let’s dig in.

Step 1: Find your threshold keywords

If a piece of content isn’t ranking in the top five positions for its target keyword, or a high-value variant keyword, it’s not providing any value.

We want to see which keywords are just outside a position that could provide more impact if we were able to give them a boost. So we want to find keywords that rank worse than position 5. But we also want to set a limit on how poorly they rank.

Meaning, we don’t want to re-optimize for a keyword that ranks on page eleven. They need to be within reach (threshold).

We have found our threshold keywords to exist between positions 6–29.

Note: you can do this in any major SEO tool. Simply find the list of all keywords you rank for, and filter it to include only positions 6-29. I will jump around a few tools to show you what it looks like in each.

You have now filtered the list of keywords you rank for to include only threshold keywords. Good job!

Step 2: Filter for search volume

There’s no point in re-optimizing a piece of content for a keyword with little-to-no search volume. You will want to look at only keywords with search volumes that indicate a likelihood of success.

Advice: For me, I set that limit at 100 searches per month. I choose this number because I know, in the best case scenario (ranking in position 1), I will drive ~31 visitors per month via that keyword, assuming no featured snippet is present. It costs a lot of money to write blogs; I want to justify that investment.

You’ve now filtered your list to include only threshold keywords with sufficient search volume to justify re-optimizing.

Step 3: Filter for difficulty

Generally, I want to optimize the gravy train keywords — those with high search volume and low organic difficulty scores. I am looking for the easiest wins available.

You do not have to do this!

Note: If you want to target a highly competitive keyword in the previous list, you may be able to successfully do so by augmenting your re-optimization plan with some aggressive link building, and/or turning the content into a pillar page.

I don’t want to do this, so I will set up a difficulty filter to find easy wins.

But where do you set the limit?

This is a bit tricky, as each keyword difficulty tool is a bit different, and results may vary based on a whole host of factors related to your domain. But here are some fast-and-loose guidelines I provide to owners of mid-level domains (DA 30–55).

  • Ahrefs: KW difficulty <10
  • Moz: KW difficulty <30
  • SEMrush: KW difficulty <55
  • KW Finder: KW difficulty <30

Here’s how it will look in Moz. Note: Moz has predefined ranges, so we won’t be able to hit the exact thresholds outlined, but we will be close enough.

Now you are left with only threshold keywords with significant search volume and reasonable difficulty scores.

Step 4: Filter for blog posts (optional)

In our experience, blogs generally improve faster than landing pages. While this process can be done for either type of content, I'm going to focus on the content with the most immediate impact and filter for blogs.

If your site follows a clean URL hierarchy, all your blog posts should live under a ‘/blog’ subfolder. This will make it easy for you to filter and segment.

Each tool will allow you to segment keyword rankings by that section of the site.

The resulting list will leave you with threshold keywords with significant search volume and reasonable difficulty scores, from blog content only.
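And if your blog posts all live under a /blog subfolder, the optional blog-only filter is one more line in the same sketch (assuming the export includes the ranking URL):

    # Step 4 (optional): keep only keywords whose ranking URL is a blog post
    threshold = threshold[threshold["URL"].str.contains("/blog/", na=False)]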

Step 5: Select for relevance

You now have the confidence to know that the remaining keywords in your list all have high potential to drive more traffic with proper re-optimization.

What you don’t know yet, is whether or not these keywords are relevant to your business. In other words, do you want to rank for these keywords?

Your website is always going to accidentally rank for noise, and you don’t want to invest time optimizing content that won’t provide any commercial value. Here’s an example:



I recommend exporting your list into a spreadsheet for easy evaluation.

Go through the entire list and feel out what may be of value, and what is a waste of time.

Now that you have a list of only relevant keywords, you know the following: each threshold keyword has significant search volume, reasonable keyword difficulty, corresponds to a blog post (optional), and is commercially relevant.

Onto an extremely important step that most people forget.

Step 6: No cannibals here

What happens when you forget about your best friend and give all your attention to a new, but maybe not-so-awesome friend?

You lose your best friend.

As SEOs, we can forget that a URL generally ranks for multiple keywords. If you don't evaluate all the keywords a URL ranks for, you may “re-optimize” for a lower-potential keyword and lose your rankings for the high-value keyword you already hold!

Note: Beware, there are some content/SEO tools out there that will make recommendations on the pieces of content you should re-optimize. Take those with a grain of salt! Put in the work and make sure you won’t end up worse off than where you started.

Here’s an example:

This page shows up on our list as an opportunity to improve the keyword “internal newsletters”, with a search volume of 100 and a difficulty score of 6.

Great opportunity, right?

Maybe not. Now you need to plug the URL into one of your tools and determine whether or not you will cause damage by re-optimizing for this keyword.

Sure enough, we rank in position 1 for the keyword “company newsletter,” which has a search volume of 501-850 per month. I’m not messing with this page at all.

On the flip side, this list recommended that I re-optimize for “How long should a blog post be.” Plugging the URL into Moz shows me that this is indeed a great keyword to re-optimize the content for.

Now you have a list of all the blogs that should be re-optimized, and which keywords they should target.
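If you want to sanity-check this at scale, here's a minimal sketch of the same logic, assuming you've exported every keyword the candidate URL already ranks for (the file name and column headers are hypothetical):

    import pandas as pd

    # Hypothetical export of every keyword the candidate URL ranks for
    df = pd.read_csv("url-ranking-keywords.csv")   # columns: Keyword, Rank, Monthly Volume

    target_volume = 100   # volume of the keyword you're thinking about targeting

    # Any keyword the URL already holds in the top 5 with more volume than the
    # target keyword is a cannibalization risk: leave that page alone.
    risks = df[(df["Rank"] <= 5) & (df["Monthly Volume"] > target_volume)]

    if risks.empty:
        print("Safe to re-optimize this URL.")
    else:
        print("Careful! This URL already ranks well for higher-value keywords:")
        print(risks[["Keyword", "Rank", "Monthly Volume"]])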

Step 7: Rewrite and reindex

You stand a better chance of ranking for your target keyword if you increase the depth and breadth of the piece of content it ranks for. There are many tools that can help you with this, and some work better than others.

We have used MarketMuse at Brafton for years. I’ve also had some experience with Ryte’s content optimizer tool, and Clearscope, which has a very writer-friendly interface.

Substep 1: Update the old content in your CMS with the newly-written content.

Substep 2: Keep the URL. I can’t stress this enough. Do not change the URL, or all your work will be wasted.

Substep 3: Update the publish date. This is now new content, and you want Google to know that, as you may reap some of the benefits of QDF (query deserves freshness).

Substep 4: Request indexing. Jump into Search Console's URL Inspection tool (the successor to Fetch as Google) and request indexing for the page so that you don't have to wait for the next natural crawl.

Step 8: Track your results!

Be honest, it feels good to outrank your competitors, doesn’t it?

I usually track the performance of my re-optimizations a couple of ways:

  1. Page-level impressions in Search Console. This is the leading indicator of search presence (a minimal API sketch follows this list).
  2. A keyword tracking campaign in a tool. Plug in the keywords you re-optimized for and follow their ranking improvements (hopefully) over time.
  3. Variant keywords on the URL. There is a good chance, through adding depth to your content, that you will rank for more variant keywords, which will drive more traffic. Plug your URL into your tool of choice and track the number of ranking keywords.
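For the Search Console piece, here's a minimal sketch using the google-api-python-client library. The property URL, page URL, dates, and saved credentials file are all placeholders, and it assumes you've already completed an OAuth flow for the Search Console API.

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # Assumes you've already authorized and saved user credentials to this file
    creds = Credentials.from_authorized_user_file("authorized_user.json")
    service = build("webmasters", "v3", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",   # placeholder property
        body={
            "startDate": "2020-01-01",
            "endDate": "2020-01-31",
            "dimensions": ["page"],
            "dimensionFilterGroups": [{
                "filters": [{
                    "dimension": "page",
                    "operator": "equals",
                    "expression": "https://www.example.com/blog/company-newsletter/",  # the re-optimized URL
                }]
            }],
        },
    ).execute()

    for row in response.get("rows", []):
        print(row["keys"][0], "impressions:", row["impressions"], "clicks:", row["clicks"])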

Conclusion

Re-optimizing content can be an extremely powerful tool in your repertoire for increasing traffic, but it’s very easy to do wrong. The hardest part of rewriting content isn’t the actual content creation, but rather, the selection process.

Which keywords? Which pages?

Using the systematic approach above will give you confidence that you're choosing the right keywords and pages.

Happy re-optimizing!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

vendredi 10 janvier 2020

Intro to Python - Whiteboard Friday

Posted by BritneyMuller

Python is a programming language that can help you uncover incredible SEO insights and save you time by automating time-consuming tasks. But for those who haven't explored this side of search, it can be intimidating. In this episode of Whiteboard Friday, Britney Muller and a true python expert named Pumpkin offer an intro to a helpful tool that's worth your time to learn.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we're talking all about introduction to Python, which is why I have a special co-host here. She is a ball python herself, total expert. Her name is Pumpkin, and she's the best. 

What is Python?

So what is Python? It has been coming up in the industry a lot lately. There's a lot of commotion that you should know how to use it, or at least know how to talk about it. Python is an open-source, object-oriented programming language that was created in 1991.

Simpler to use than R

One fun fact about Python is that it's often compared to R, but it's arguably simpler to use. The syntax just often feels simpler and more common-sense, especially when you're new to programming.

Big companies use it

Huge companies use it. NASA, Google, and tons of other companies out there rely on it because it's widely supported.

It's open source

It is open source, which is pretty cool. While we're going through this Whiteboard Friday, I would love for us to do a little Python programming today. So I'm just going to ask that you also open python.org/downloads in another tab. Download the version for your computer and we'll get back to that.

Why does Python matter?

So why should you care? 

Automates time-consuming tasks

Python is incredibly powerful because it helps you automate time-consuming tasks. It can do these things at scale so that you can free up your time to work on higher-level thinking, to work on more strategy. It's really, really exciting where these things are going. 

Log file analysis

Some examples of that are things like log file analysis. Imagine if you could just set up an automated system with Python to alert you any time one of your primary pages wasn't being crawled as frequently as it typically is. You can do all sorts of things. Let's say Google requests your robots.txt and it throws a server error, which, as many of you know, causes huge problems. A script can alert you to that, too. You can set up scripts like that to handle really comprehensive tasks.
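As a rough illustration, here's a minimal sketch of that idea, assuming a standard combined-format access log and a couple of placeholder URLs; a real monitoring script would also need scheduling and alerting (email, Slack, and so on).

    from collections import Counter

    googlebot_hits = Counter()

    # Count Googlebot requests per URL path in an access log
    with open("access.log") as log:             # placeholder file name
        for line in log:
            parts = line.split('"')             # combined log format: request is field 1, user agent is field 5
            if len(parts) < 6 or "Googlebot" not in parts[5]:
                continue
            request = parts[1].split()          # e.g. 'GET /blog/some-post/ HTTP/1.1'
            if len(request) >= 2:
                googlebot_hits[request[1]] += 1

    # Flag key pages that Googlebot didn't touch in this log window
    key_pages = ["/", "/blog/company-newsletter/"]   # hypothetical pages to watch
    for page in key_pages:
        if googlebot_hits[page] == 0:
            print(f"ALERT: no Googlebot hits recorded for {page}")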

Internal link analysis

Some other examples, internal link analysis, it can do a really great job of that. 

Discover keyword opportunities

It can help you discover keyword opportunities by looking at bulk keyword data and identifying some really important indicators. 

Image optimization

It's really great for things like image optimization. It can auto-tag images and generate alt text. It can do really powerful things there.

Scrape websites

It can also scrape the websites that you're working with to handle really high-volume tasks.

Google Search Console data analysis

It can also pull Google Search Console data and do analysis on those types of things.

I do have a list of all of the individuals within SEO who are currently doing really, really powerful things with Python. I highly suggest you check out some of Hamlet Batista's recent scripts where he's using Python to do all sorts of really cool SEO tasks. 

How do you run Python?

What does this even look like? So you've hopefully downloaded Python as a programming language on your computer. But now you need to run it somewhere. Where does that live? 

Set up a virtual environment using Terminal

So first you should be setting up a virtual environment. But for the purpose of these examples, I'm just going to ask that you pull up your terminal application.

It looks like this. You could also be running Python within something like Jupyter Notebook or Google Colab. But just pull up your terminal and let's check and make sure that you've downloaded Python properly. 

Check to make sure you've downloaded Python properly

So the first thing that you do is you open up the terminal and just type in "python --version." You should see a readout of the version that you downloaded for your computer. That's awesome. 

Activate Python and perform basic tasks

So now we're just going to start the Python interpreter and do some really basic tasks. So just type in "python" and hit Enter. You should see the three arrow symbols (the >>> prompt) within your terminal. From here, you can do something like print("Hello, World!"). You enter it exactly like you see it here, hit Enter, and it will say "Hello, World!" which is pretty cool.



You can also do fun things like basic math. You can add two numbers together using something like this. These are individual lines; after you complete the print statement, you'll see the readout of the sum of those two numbers. You can randomly generate numbers. I realize these aren't direct SEO applications, but these are the silly things that give you confidence to run programs like what Hamlet talks about.
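For reference, a minimal version of the two-number example might look like this at the interactive prompt (the numbers and variable names are my own, not from the whiteboard):

    >>> a = 22
    >>> b = 9
    >>> total = a + b
    >>> print(total)
    31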

Have fun — try creating a random number generator

So I highly suggest you just have fun and create a little random number generator, which is really cool. Mine is pulling random numbers from 0 to 100. You can do 0 to 10 or whatever you'd like. A fun fact: after you hit Enter and see that random number, if you want to continue, pressing your up arrow will pull up the last command within your terminal.

It even goes back to these other ones. So that's a really quick way to rerun something like a random number generator. You can just crank out a bunch of them if you want for some reason. 
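If you want to follow along, a one-liner version of that generator (using the standard library's random module) looks like this; your output will differ each time:

    >>> import random
    >>> print(random.randint(0, 100))   # random integer from 0 to 100, inclusive
    42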

Automating different tasks

This is where you can start to get into really cool scripts as well, for fetching URLs using Requests-HTML. Then you can pull unique information from web pages.

You can pull tens of thousands of title tags in bulk from a URL list. You can pull things like H1s, canonicals, all sorts of things, and this makes it incredibly easy to do at scale. One of my favorite ways to pull things from URLs is using XPath within Python.

This is a lot easier than it looks. An XPath expression might look like this for some websites, but websites are marked up differently. So when you're trying to pull something from a particular site, you can right-click the element and inspect it in Chrome Developer Tools. Within Chrome Developer Tools, you can then right-click the highlighted node that you're trying to scrape with Python.

You just select "Copy XPath," and it will give you the exact XPath for that element, which is kind of a fun trick if you're getting into some of this stuff.
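To make that concrete, here's a minimal Requests-HTML sketch for pulling titles, H1s, and canonicals from a short URL list. The URLs are placeholders, and you'll need to install the library first (pip install requests-html).

    from requests_html import HTMLSession

    urls = [
        "https://example.com/",
        "https://example.com/blog/",
    ]  # hypothetical URL list; swap in your own

    session = HTMLSession()
    for url in urls:
        r = session.get(url)
        title = r.html.xpath("//title/text()", first=True)
        h1 = r.html.xpath("//h1//text()", first=True)
        canonical = r.html.xpath("//link[@rel='canonical']/@href", first=True)
        print(url, "|", title, "|", h1, "|", canonical)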

Libraries

What are libraries? How do we make this stuff more and more powerful? Python is really strong on its own, but what makes it even stronger are these libraries or packages which are add-ons that do incredible things.

This is just a small percentage of libraries that can do things like data collection, cleaning, visualization, processing, and deployment. One of my favorite ways to get some of the more popular packages is just to download Anaconda, because it comes with all of these commonly used, most popular packages.

So it's kind of a nice way to get all of it in one spot or at least most of them. 

Learn more

So you've dipped your toes in and you kind of understand what Python is and what people are using it for. Where can you learn more? How can you start? Well, Codecademy has a really great Python course, and Google, Kaggle, and even the Python.org website all have some really great resources that you can check out.

This is a list of individuals I really admire in the SEO space, who are doing incredible work with Python and have all inspired me in different ways. So definitely keep an eye on what they are up to:

But yeah, Pumpkin and I have really enjoyed this, and we hope you did too. So thank you so much for joining us for this special edition of Whiteboard Friday. We will see you soon. Bye, guys.

Video transcription by Speechpad.com



jeudi 9 janvier 2020

Find Competitive Keywords, Ranking Distributions, & Common Questions: 3 Workflows for Smarter Keyword Research

Posted by FeliciaCrawford

What keywords do your top competitors both rank for that you're missing out on? How do you know how much top real estate your URL or page owns in the SERPs? How can you discover answers to your searchers' most common questions and beef up that FAQ page?

We can answer all of those questions with some super-simple workflows using Keyword Explorer. In our last post in this series, we covered how to find ranking keywords, uncover new opportunities, check rankings, and more. This time around, we're diving into three more quick and easy workflows you can use to bolster your keyword research and work smarter, not harder.

Ready to get started? Follow along in the tool with Britney Muller as she shares her very favorite Keyword Explorer features:

Follow along in Keyword Explorer

And remember, if you have a Moz Community account that you use to thumbs-up and comment on Moz Blog posts, you already have free access to Keyword Explorer — let's show you how to use it!


1. How to discover competitive keyword opportunities

This is my favorite feature of all in Keyword Explorer and let me explain why. Let's say that you're this website, pimylifeup.com. They create projects and tutorials on Raspberry Pis. The two competing websites for Raspberry Pi, which is a mini computer, are raspberrypi.org and canakit.com.

If this is your site, we could paste that in here, select Root Domain, and do a search. Then we're going to grab these other two sites. We're going to copy their URLs and enter them in these additional site areas. 

This is essentially going to look at the ranking keywords for your competitive sites that your site doesn't rank for. So it's a really great, high-level overview of what those keywords are.

Pi My Life Up is pretty good. Then you can view the Domain Authority for the sites. Where it gets really exciting is over in Ranking Keywords. Here you can see this is raspberrypi.org, and this is the amount of keywords that they rank for. This blue circle is Pi My Life Up, and then the yellow is CanaKit.

What you want to look at are the keywords that both CanaKit and raspberrypi.org right here rank for that you don't. So you click on the competing overlap keywords, and they will populate here below. You can export all of them, which is great.

Or you could filter by various things, like search volume or difficulty in ranking. What I suggest doing is going through some of these by hand and selecting the keywords that you think might be opportunities for your site.

From here, what you can do is, after you select and click around to the ones that you want, you can add them to a keyword list. So you can keep track of all of these keywords. Let's do Pi Opportunities. I've already saved these in a list over here that's populated.

From a high-level overview, you can see what the popular SERP features are. There are lots of images for these competing keywords. If I want to be competitive in those keyword spaces, I know I need to create content that has images. There are also lots of related questions.

Then from here, I can filter by SERP features or organic click-through rate. Maybe most interestingly, I can add a URL. Let's say we enter Pi My Life Up, and again we're not seeing any rankings here, because this is the overlap of keywords that Pi My Life Up doesn't rank for but the two competitive sites do.

This is confirming that we don't currently rank for any of these keywords, but we can work on that. What's so great about these saved lists is that you can come back after a couple of weeks or a couple of months and you can select all of the keywords and refresh the data.

You might want to come back to this keyword list, refresh it, enter in your URL, and then filter by rank and see where you're starting to pop up for these keyword terms. It's a really exciting way to dig into the competitive keyword space. There's tons you can do with this, but this was the high-level overview of finding those keywords that your competitors currently rank for that you don't.
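If you ever want to replicate this overlap check outside the tool, a rough sketch with pandas and plain set math looks like this; the CSV file names and the "Keyword" column header are hypothetical stand-ins for whatever ranking-keyword exports you have.

    import pandas as pd

    # Hypothetical ranking-keyword exports, one per site
    ours = set(pd.read_csv("pimylifeup-keywords.csv")["Keyword"].dropna().str.lower())
    comp_a = set(pd.read_csv("raspberrypi-org-keywords.csv")["Keyword"].dropna().str.lower())
    comp_b = set(pd.read_csv("canakit-keywords.csv")["Keyword"].dropna().str.lower())

    # Keywords both competitors rank for that we don't
    gap = (comp_a & comp_b) - ours
    print(len(gap), "competitive keyword opportunities")
    for kw in sorted(gap)[:25]:
        print(kw)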

2. How to discover a URL or an exact page's ranking distribution of keywords

You can just paste in the URL or an exact page into Keyword Explorer. Let's just use webmd.com. From here, you get the Overview page. But if you scroll down to the very bottom, you see the ranking distribution.

You can see how many keywords are currently in positions 1 to 3 versus 4 to 10, all the way down to 41 to 50.
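The same distribution can be rebuilt from any ranking-keyword export with a quick pandas bucket, in case you want to track it over time; the file name and "Rank" column are hypothetical.

    import pandas as pd

    # Hypothetical export of every keyword a site or page ranks for
    df = pd.read_csv("ranking-keywords.csv")

    buckets = pd.cut(
        df["Rank"],
        bins=[0, 3, 10, 20, 30, 40, 50],
        labels=["1-3", "4-10", "11-20", "21-30", "31-40", "41-50"],
    )
    print(buckets.value_counts().sort_index())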

3. How to discover common keyword questions

This is one of my favorite features that we offer with Keyword Explorer. Just put in your keyword, click Search, and from here you can navigate over to Keyword Suggestions. In this view, you can filter to display only the keyword suggestions that are questions.



Here you'll see all of the results that are questions, and you can sort by various things. You can add all of these to a list, incorporate them into an FAQ page, whatever your end goal is.
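If you're working from an exported keyword list instead, a rough equivalent of the question filter is a simple pattern match on common question words (the file name and "Keyword" column are hypothetical).

    import pandas as pd

    # Hypothetical export of keyword suggestions
    df = pd.read_csv("keyword-suggestions.csv")

    pattern = r"(who|what|when|where|why|how|can|do|does|is|are|should)\b"
    questions = df[df["Keyword"].str.lower().str.match(pattern, na=False)]
    print(questions["Keyword"].tolist())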


Discover anything new or especially useful? Let us know on Twitter or here in the comments, and keep an eye out for more ways to use your everyday SEO tools to level up your workflows.

Try out some new tricks in Keyword Explorer

