SEO – Performancing http://performancing.com Tue, 13 Feb 2018 03:14:04 +0000 en-US hourly 1 https://wordpress.org/?v=4.9.5 How To Fix Inner Pages that Aren’t Being Indexed http://performancing.com/fix-inner-pages-arent-indexed/ http://performancing.com/fix-inner-pages-arent-indexed/#comments Tue, 13 Feb 2018 03:14:04 +0000 http://performancing.com/?p=14217


How to Make Google Crawl Inner Pages

When Google crawls the web, it discovers pages by following links and adds them to its index. During indexing, each page is judged on factors like its link profile, content, and value to searchers; this evaluation is one of the steps that determines the page’s ranking. But not every page on the web gets indexed, which can create a problem for your business.

How Does Google Index Pages?

Google uses a program called Googlebot to examine pages and determine what value they offer and where they should appear in each search. Googlebot moves from one page to the next by following links, and also uses sitemap data from each site. New links, dead links, and updated links are reflected in Google’s index each time Googlebot crawls a page.

Every result in a Google search is a page that has been indexed.

What Happens to Pages that Aren’t Indexed?

A page that isn’t indexed doesn’t show up in search results. It still exists on the web, but it loses one of the best ways to get organic visitors.

Why Are Some Pages Not Indexed?

There are a number of reasons a page won’t be indexed. It could be a technical issue, a quality issue, or a problem with the link structure of your site. The only way to find out is to check each issue off one by one, and fix the problems when they appear.

Which Pages Are Most Likely to be Indexed?

Pages that are near the top of your domain, like the homepage and pages linked from the homepage, are most likely to be indexed quickly. One reason is that Google’s crawlers use links as gateways to each page, and a crawler typically follows only a limited number of links from any page it visits, so pages buried deeper in your hierarchy take longer to reach.

How Can I Tell If My Pages Are Indexed?

Type a specific search query into the Google search bar to see whether your site is indexed. First, use “site:domain.com”. If you were checking for indexed pages on Facebook, for example, you’d type “site:facebook.com” and run the search. It will show some—but not necessarily all—pages indexed from your site. You can also use “site:domain inurl:<slug>” to show whether a specific page is indexed, or “site:domain filetype:<filetype>” to see whether a specific file type has been indexed.

If you use Google Search Console, check the “Index Status” report to see how many pages are indexed and which URLs are blocked or removed. Check the graph to see how your site has fared over time. Google Search Console also has a “Sitemaps” report that shows how many pages in your XML sitemap were submitted and indexed.

How to Make Google Index My Pages

There are a number of reasons why your pages may not be indexed. Check each possibility against your site. Some changes are quick and painless, while others may take a little more effort. Either way, it’s worth doing the work so you can start getting organic visits to those pages from Google searches.

Length of Time

Sometimes the issue is simply that your website hasn’t been up long enough. New pages aren’t indexed the second they appear on the web, so expect it to take a little while for Google to pick them up. The more link building you do, the faster this is likely to happen, which is one reason a good link profile is so important early on.

Check Your Response Codes

If your site is producing anything other than a 200 (OK) server response code, you need to make adjustments. Errors, redirections, and dead links won’t be indexed by Google. You can use an HTTP status checker to check your pages and see what codes they’re producing if you’re not sure. If there are problems, resolve them.

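If you’d rather script this check than use an online tool, a short sketch like the following works with only the Python standard library. The URLs and the `check_status` and `verdict` helpers are our own illustration, not any particular tool’s API:

```python
from urllib import request, error

def check_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL (redirects are followed)."""
    req = request.Request(url, method="HEAD")
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        # urllib raises on 4xx/5xx, but the exception still carries the code.
        return exc.code
    except OSError:
        return 0  # DNS or network failure; treat as unknown

def verdict(code: int) -> str:
    """Anything other than 200 (OK) deserves a closer look."""
    return "OK" if code == 200 else "NEEDS ATTENTION"

if __name__ == "__main__":
    # Placeholder URLs -- substitute the pages you want checked.
    for url in ["https://example.com/", "https://example.com/no-such-page"]:
        code = check_status(url)
        print(f"{code}  {verdict(code)}  {url}")
```

Run it against a list of your inner pages and investigate anything flagged, since errors and dead links won’t be indexed.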

Duplicate Content

Duplicate content is bad for your SEO. Google may choose not to index pages with duplicate content, and even pages that are indexed may see their rankings suffer because of it. There are tools you can use to see whether you have too much duplicate content. Adjust any content that’s too similar or repetitive, and you may see an increase in the number of pages that are indexed, along with a boost to your SEO overall.

Internal duplicate content isn’t the only issue, however. If you have content that’s too similar to that of other sites, your indexing may suffer. Check sections of your text to see how many other sites have the exact same phrasing. One issue may be that you’re pulling too many quotes from other sites. Another may be that your wording is just too similar. When these problems occur, work to make your content stand out and vary more from content on the same topics that other sites publish.

Page Quality

Sometimes a page is of such poor quality that Google simply won’t index it. Consider this: Google is building a list of search results that offer value to its users. If the algorithm it uses to judge page quality decides a page doesn’t offer enough, it may leave that page out of search results entirely. If you feel quality may be your issue, improve your content, link up with reputable sites, and consider getting expert advice on your SEO.

Another page quality issue to consider is how long your site takes to load. Sites that spend too much time loading cause visitors to navigate away, which can negatively affect your PageRank. Eventually, it may lead to your page being removed from Google completely. Google’s PageSpeed Insights tool can help you determine whether your site has loading issues and offers guidance on resolving them.

Internal Linking Issues

Sometimes you create pages on your site, but don’t link to them from other pages. These orphaned pages can negatively impact your rankings and may not be indexed. Crawlers follow links to find new pages. It’s impossible for them to find pages with no links pointing to them. If you aren’t sure whether your pages are linked correctly, use a site crawler to figure out whether your internal linking is done well.
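As a sketch of what a site crawler does here, the snippet below walks the internal link graph from the homepage and flags any sitemap URL it can never reach. The miniature site and the `/old-landing` orphan are made up for illustration:

```python
from collections import deque

def reachable_pages(links: dict[str, list[str]], start: str) -> set[str]:
    """Breadth-first walk of an internal link graph, returning every page a crawler can reach."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# Hypothetical site: /old-landing is in the sitemap, but no page links to it.
internal_links = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1", "/"],
    "/about": ["/"],
}
sitemap = {"/", "/blog", "/about", "/blog/post-1", "/old-landing"}

orphans = sitemap - reachable_pages(internal_links, "/")
print(orphans)  # pages with no internal links pointing to them
```

Any page that shows up as an orphan needs at least one internal link before crawlers can find it on their own.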

Updating your site so that Google indexes more pages will improve your representation on the search engine results page. As you produce new content, keep these indexing suggestions in mind. That way you can produce content that is SEO-friendly, useful for readers, and likely to be indexed as soon as possible, so you can start getting organic visits from search results.

The post How To Fix Inner Pages that Aren’t Being Indexed appeared first on Performancing.

How to Increase Your Website’s Domain Authority http://performancing.com/increase-websites-domain-authority/ http://performancing.com/increase-websites-domain-authority/#comments Fri, 10 Nov 2017 12:30:28 +0000 http://performancing.com/?p=14215


As search engines and their algorithms become more refined, more ranking signals have emerged to help rank sites and order them so searchers can find what they’re looking for. One of these is domain authority. You can use domain authority to see how your website is faring in comparison to comparable websites. Increasing your domain authority can positively impact your PageRank and how many visitors you get on your site.

 

What is Domain Authority?

In layman’s terms, domain authority is a website rating. The higher your domain authority, the better your website is expected to perform compared to other websites. A higher domain authority won’t directly raise your PageRank, but it is an indirect metric that helps you gauge how your website stacks up against others. A site with a higher domain authority can be expected to rank higher than comparable sites.

Facebook has a high domain authority.

Who Determines Domain Authority?

Domain authority is a metric developed by Moz. It was designed to be a prediction of a site’s rank on a search engine results page. Domain authority is logarithmic, so it’s harder to increase your score at higher levels than it is at lower levels. For example, moving your score from 15 to 25 is easier than moving it from 85 to 95. Your efforts go farther earlier on and it requires more work to get the same results later.
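To see why a logarithmic scale behaves this way, here is a toy model (not Moz’s actual formula) in which the score is 20 times the base-10 logarithm of the underlying signal, say the number of quality links:

```python
def links_needed(score: float) -> float:
    """Invert a toy logarithmic score: score = 20 * log10(links)."""
    return 10 ** (score / 20)

# Same 10-point jump, wildly different effort:
low_jump = links_needed(25) - links_needed(15)    # going from 15 to 25
high_jump = links_needed(95) - links_needed(85)   # going from 85 to 95

print(f"15 -> 25 needs roughly {low_jump:,.0f} more links")
print(f"85 -> 95 needs roughly {high_jump:,.0f} more links")
```

Under this toy formula the jump from 85 to 95 requires thousands of times more links than the jump from 15 to 25, which is the intuition behind "your efforts go farther earlier on."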

 

What Factors Affect Domain Authority?

Domain authority scores run from 1 to 100, and higher is better. A lower score means you’re less likely to rank favorably on a search engine, but it doesn’t necessarily mean your website is bad. Look at other sites in your niche and see how they score for domain authority. If similar sites have much higher scores than yours, that’s a sign you should work on improving your domain authority. If your site scores similarly to high-performing sites in your niche, you’re doing well. Remember that domain authority is best used as a comparative metric rather than an absolute one.

 

How Do I Find My Domain Authority?

Use an online service designed to show your domain authority. Moz, the company that determines domain authority, has one of their own. Type your URL into Open Site Explorer to see your domain and page authority. You can also see how many inbound links Moz has discovered in the last 60 days, which is important because new links help the site stay relevant over time.

 

Comparing Domain Authority

To get an idea of where your site ranks in comparison to others, type the URL of a similar site into Open Site Explorer. You can see the domain authority of that site. Make a list of the information provided, and then compare it to your own domain authority results. This gives you an idea of which site is more likely to rank higher in Google search results.

 

 

What is Page Authority?

Page authority measures the same signals for a particular page. Sometimes a domain is rated highly while an individual page on it ranks low. Blog platforms are a good example: the domain Blogger.com has a high domain authority, at 90/100, but an individual blog hosted there with few inbound links may have a page authority of only 1/100. That page isn’t likely to rank well despite the domain’s high authority.

 

Where Does Moz Get Information to Calculate Domain Authority?

The Mozscape Web Index is a map of websites and the links between them, in the simplest terms. Like Google’s crawlers, it crawls sites and takes a snapshot of them that provides information to Moz about website performance and utility. It’s updated once a month and was developed as a tool to judge SEO and determine how to improve site ranking.

When Mozscape updates, the number of links, types of sites, and other information changes, which means your domain authority and page authority can shift from update to update. The highest achievable domain authority shifts as well: because top sites are constantly gaining massive numbers of new inbound links, the practical ceiling keeps moving in different niches.

 

What If My Pages Aren’t Ranked by MozScape?

MozScape indexes pages from the top down, so it first indexes the homepage and the sites linked from the homepage. Sometimes pages deeper on a site aren’t indexed right away—but that doesn’t mean they won’t be included in future updates. If you’re waiting for a page to be included in the snapshot Moz takes of the web, consider moving it up in your site’s hierarchy or working on getting more inbound links to that page.

In general, the most useful thing you can do to get a page indexed is attempt to get links from sites with a high MozRank. As with most SEO, the best way to get these links is to publish regular, interesting, and relevant content, according to Moz.

Pages with a custom Top Level Domain aren’t indexed by MozScape, which is a result of the way their tool is built. They also don’t index sites that aren’t linked to their seed URLs.

 


What Affects Domain Authority?

One of the primary drivers of domain authority is your link profile. As with SEO, the type and number of sites with links to your domain matter a great deal. MozRank and MozTrust are other metrics that go a long way to determining how high your domain authority is. Moz explains that there are many different factors that influence domain authority because it is meant to measure how a site will rank in Google search results. Google uses many factors to rank and so, too, does Moz.

 

Link Profile

Your link profile is determined by the links pointing to and from your website. There are both follow and nofollow links, each of which has a different impact. A link profile can be judged by the number of links, the types of links, and the sites those links come from. Links from higher-quality, more authoritative sites are more valuable than links from spam or low-quality sites.

 

MozRank

MozRank is comparable to Google’s PageRank and scores pages on a scale from 0 to 10. It measures how popular a given URL is, determined by which pages link to that URL and how often. Like domain authority, MozRank is logarithmic, so it’s harder to go from 3 to 4 than from 1 to 2. Most URLs have a fractional MozRank, like 0.08; pages with scores of 1 or 2 are the exception rather than the rule, though most sites that rank well on Google score higher than 1. The reason so many pages score low is that billions of pages have few or no links pointing to them.

 

MozTrust

MozTrust is a measure of how trustworthy sites that link to a URL are. This metric is important because when a site with a high level of trust—like a government site or university—links to you, it is a positive endorsement. Ranking trusted sites higher than those with links from less trustworthy sites can help reduce the amount of webspam searchers are subject to.

 

Raising Your Domain Authority

Raising your domain authority can help you rank higher on a search engine results page, because the same things that raise your domain authority also improve your SEO. Improving your link profile goes a long way toward raising your domain authority. There are a few other steps you can take to improve your domain authority score as well.

 

Get More Links

This is the way to have the most influence on your domain authority. Increasing the number of backlinks from quality sites will raise your score more than other factors. Keep in mind, though, that you must keep your focus on quality sites. If you use low-quality sites, it can actually hurt your domain authority and SEO.

Google has algorithm updates, like Penguin, that target spam and black hat SEO techniques. Using these types of strategies to get links will come back to haunt you and tank your PageRank and domain authority, even if they offer some utility at first. Professional, white hat link building is the way to go when you want proven results that last over time.

 

Diversify Your Links

Another way to make your link profile work for you is to diversify it. If you’re getting most of your links from the same sites or types of sites, look elsewhere for new links. Consider ones with different domain extensions or locations around the world. It’s important that these sites are high-quality as well.

One way to diversify your links is to gain more authority in your niche. When people are interested in your content and company, you’re more likely to be approached for content swaps or guest spots. These can help create new inbound links to your website.

 

Internal Link Structure

Make sure you’re using strong internal links on your website. Blog posts that reference one another should be linked together. Topics that reference others you’ve written about should be linked so readers can easily get more information. A strong internal link structure can offer link juice while also raising your domain authority. If you’re not sure what to link together yet, start producing content that elaborates on or connects to other topics on your site.

It’s important that your internal links have strong associations. Don’t link an article about pasta sauce to one about importing cars.

 

Disavow Negative Links

Sometimes spam sites or low-quality sites link to your domain without your consent. Google offers a Disavow tool that allows you to remove the connection to your site. Removing negative links can help raise your domain authority because it removes any penalty for these low-quality associations. You can also contact the websites that link to your site and ask for those links to be removed.

 

Trusted Sites

Focus your link-building efforts on sites with a higher MozTrust rank. Even a few links from very high-quality sites can have a large impact on your domain authority. Since links from very reputable sites can be hard to get, don’t make this your entire effort—but do try to get at least a few so that your MozTrust rank improves while you’re also building new inbound links.

 

Optimize Your Pages

If you haven’t already optimized all your title tags, image tags, images, and content for search engines, do it now. Search engine optimization improves both your PageRank and your domain authority.

You should also update your site more often if updates are few and far between. Frequently updated sites are more likely to attract backlinks and usually rank higher than similar sites that update less often.

Since most people access the web on mobile devices, another thing you can do to improve your domain authority is to have a mobile-friendly site. You’re less likely to lose visitors on mobile—and therefore views that can lead to links—if they’re able to read and see your site in a favorable format.

 

Domain Information

One factor that can play into your MozTrust score—and therefore your domain authority—is which sites are tied to your domain registration. If you have multiple domains registered under the same or similar information and several of them are low quality, they can drag your high-quality domains down to a lower MozTrust score.

Another factor is the age of your domain. Older sites are more trusted than newer sites.

You can also improve your domain authority a bit by having a longer time before your domain name expires. Many sites expire in only a year. If you purchase your domain name for a greater length of time, it’s a positive signal for a higher domain authority.

Your domain authority is a measure of how well you’ll rank on a Google search engine results page in comparison to similar sites. If you’re ranking low in your niche, take action to improve your domain authority score. Since it’s based on things like your link profile and domain trustworthiness, an approach that favors search engine optimization should yield excellent results.

 


The post How to Increase Your Website’s Domain Authority appeared first on Performancing.

Tips for Getting in the Google Local 3-Pack http://performancing.com/tips-getting-google-local-3-pack/ http://performancing.com/tips-getting-google-local-3-pack/#comments Wed, 02 Aug 2017 06:56:15 +0000 http://performancing.com/?p=14228


Once upon a time, there were seven results in each Google search that returned location-based business listings. In 2015, however, Google changed how they display results and cut the listings down to three that appear at the top of the search results. Being one of the listings in the local ‘3-pack’ is invaluable for your business as you seek to reach more clients and make more money.

What is the Google Local 3-Pack?

The local pack is a list of three results shown for a location-based Google search. An example of a location-based search would be “donut shops Portland Maine”. The searcher is clearly looking for a business to visit, and the results reflect that.

 

Listings are displayed under a map that shows the physical location of each result. The name, address, and current hours are displayed for each listing. If there is a website, that link is offered. An option to get directions to each result also appears. There’s also an approximation of the price, a description of the business, and a star rating review, if applicable.

Viewers can click “More places” under the 3-pack to see a list of other nearby business listings. Doing so takes them to a maps page with many results plotted on a map.

 

Why Did Google Reduce the Number of Local Results?

Mobile takes an increasingly large share of Google searches every year. When seven results were displayed, they didn’t fit properly on most mobile devices and looked worse. Cutting the number of local results from seven to three made the map and results fit on a mobile screen and provided a better experience for users.

As of 2015, Google officially stated that more than 50 percent of searches were done on mobile devices. They didn’t give an exact number, but the data was still important. For the first time, optimizing for mobile meant optimizing for a larger number of customers. Hitwise said in 2016 that mobile search makes up 58 percent of search query volume in the United States. Of those queries, searches about food and beverage were the highest at 72 percent.

Hitwise also found that mobile queries run a bit longer than PC search queries; the food and beverage category saw average search string lengths of 13.8 on PC and 15.5 on mobile.

Google doesn’t stay static. Things change all the time because Google wants to provide the best user experience to the widest number of users possible. Chances are that in the future Google will adjust what appears in the local results again—and keeping your local SEO approach focused and clean will help you stay in the results, no matter how many there are.

 

Local SEO

 

 

Local search engine optimization is a bit different from traditional SEO. It focuses on making sure you’re one of the results that’s returned in local searches. When you’re looking to get foot traffic or make connections with people who are physically located near you, local SEO should be a priority. After all, reaching customers in Canada doesn’t do much good if your business focuses on physical sales in Florida.

One of the most important aspects of local SEO is the local pack. These businesses are the first listings that appear on the search engine results page. Having such a small list means that competition is fierce. Since most people click on the first result—and the majority of clicks beyond that go to the next few listings—you’re missing out on valuable customers if you aren’t in the local three pack.

 

Why is the Local 3-Pack Important?

Being in the local three pack means you get more exposure. Local searches are very important for a number of reasons. Consider these facts:

  • Local searches account for 46 percent of Google searches. That’s a large percentage of searches that you’ll miss out on if you’re not showing up in local results. Just having the location of your business on your site isn’t enough anymore, because competition for the top spots is fierce. You have to go farther if you want a spot in the local 3-pack.
  • Most people who do a location-based search end up contacting a business when they’re done. You want to be the business they contact, especially because 78 percent of location-based searches lead to a purchase down the road. So for every person who does a local search, there’s a strong chance they’ll end up contacting a business and then making a purchase based on that search.
  • You don’t have to wait long to get that extra money in some cases. Some customers act very quickly after they’re done doing a local search. 18 percent make a purchase on the same day.

 

How Do I Get Into the Local 3-Pack?

Improving your local SEO is the best thing you can do to make your business rank higher. The more Google trusts your business, the more likely you are to be listed in the local three pack.

 

Improve Your Overall SEO

Improving the SEO of your entire site will lift you in the Google search results and help you get into the local three pack. Your link profile, content, and outreach all need to be excellent so that your company is one of the top results for your keyword searches. Being optimized outside of local search puts you in a good position to rise quickly in local search.

 

Fill Out Your Google Business Listing

If you haven’t already done it, now’s the time to claim your free business listing on Google. You fill out your business information and keep it updated as things change. For example, you can adjust your hours when they change due to a holiday. Google’s business listings give you other ways to interact with your potential customers as well, including uploading photos and responding to customer comments.

 

Consistent Contact Information

One thing Google values is delivering consistency and value to its users. That’s one way it became the most used search engine in the world: people know they’ll find the information they want through a Google search. Because of this, NAP (name, address, and phone number) data is a strong ranking signal for local search. A “citation” is any online entry that includes the NAP of your business.

The NAP data should be exactly the same on any citation that appears online. This means that you should always use the same format, too. If your website refers to your location as Main Street, don’t shorten it to Main St. in another listing. Don’t list different phone numbers for the same place in different listings. Instead, keep everything in line with what’s on your main website.
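A simple script can surface formatting mismatches before Google does. The sketch below normalizes a name/address/phone triple so that cosmetic differences ("Street" vs. "St.", punctuation in phone numbers) don’t hide a real mismatch; the normalization rules and the sample business are illustrative only, and real citations need locale-aware handling:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple[str, str, str]:
    """Canonicalize a NAP triple so formatting noise doesn't mask mismatches."""
    addr = address.lower()
    for full, abbr in [("street", "st"), ("avenue", "ave"), ("road", "rd")]:
        addr = re.sub(rf"\b{full}\b", abbr, addr)   # collapse common abbreviations
    addr = re.sub(r"[.,]", "", addr).strip()        # drop punctuation
    digits = re.sub(r"\D", "", phone)               # keep phone digits only
    return (name.strip().lower(), addr, digits)

site = normalize_nap("Joe's Donuts", "12 Main Street", "(207) 555-0199")
listing = normalize_nap("Joe's Donuts", "12 Main St.", "207-555-0199")
print("consistent" if site == listing else "mismatch")
```

If two listings normalize to different triples, one of them needs to be corrected to match your main website.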

Use a local number in any listings with NAP data. You can leave a toll-free number too, but having a local number is essential for getting in the three pack.

If you’re listed in many different places with different contact information, it may be harder to get into the three pack. Go through all your online sites and presences to make sure they’re consistent. If you’re listed on sites that aggregate information and they’re showing outdated or incorrect information, contact the people that run the site and ask for your listing to be updated or removed. You can usually find contact information on the “About” page or in “Frequently Asked Questions”.

If you don’t have your information in niche-specific and high-quality databases, start adding it. You shouldn’t spam any site that will take your information. Instead, find places where listing your information adds value for site users and your customers. Review sites, industry directories, and other types of sites can help get your information out there. Yelp and TripAdvisor are two high-quality sites where you can check and correct your business information.

 

Provide Information for Visitors

Include information that can help someone visit your business on your site and in your Google business listing. For example, the hours you’re open are very important. Google will be less likely to include you in the local three pack if it doesn’t have all the relevant information to provide to a searcher.

 

On-Page Optimization

In addition to consistent contact information, you should have an optimized page for each business location. When people do a local search, they want information about somewhere they can go in person. If you have multiple locations, having that information helps keep the results relevant to the search.

When you make a specific page for each location, don’t just copy and paste your content. Instead, write something a little different for each one; that helps the SEO of each page. Also consider including other elements, like an on-page Google map showing your location. If you use schema markup, include the address and phone number of each location in it.
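As a sketch of what that schema markup can contain, the snippet below builds a minimal LocalBusiness JSON-LD block with the Python standard library. The business details are placeholders for a hypothetical branch:

```python
import json

# Hypothetical location data -- substitute your own, matching your NAP exactly.
location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Joe's Donuts - Portland",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "12 Main Street",
        "addressLocality": "Portland",
        "addressRegion": "ME",
        "postalCode": "04101",
    },
    "telephone": "+1-207-555-0199",
    "openingHours": "Mo-Sa 06:00-14:00",
}

# Paste the printed JSON into a <script type="application/ld+json"> tag
# on the matching location page.
print(json.dumps(location, indent=2))
```

Keeping the address and phone fields identical to the NAP on the page itself reinforces the consistency signal discussed above.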

Consider including images of the location, a description of the location, or a section with the history of the branch. That way, you’re giving content that’s just a bit different. It helps keep Google from lowering the PageRank of the pages based on duplicate content.

 

Reviews

Reviews are vital to your business’s presence in the local three pack. They’re one of the most important ranking signals for inclusion, and for good reason.

First of all, people trust reviews. They give searchers a way to know whether they should patronize one of the businesses displayed in the results, and they provide feedback that helps people decide where to go once they act on their search.

According to BrightLocal, 92 percent of customers read reviews of businesses online on a regular basis. So almost the entire potential customer base you can get from Google searches gets information from these reviews. The same survey showed that 68 percent of respondents said positive reviews engendered trust in the company being reviewed.

Since 80 percent of customers will leave a review when asked, start canvassing for reviews from satisfied customers. There are lots of ways to encourage people to have their say online—for example, you could include a note on the receipt asking for a review on Google. There are many sites that offer reviews, but if you’re trying to rank in the local three pack, try to get customers to leave Google reviews specifically.

 

Mobile-Friendly Pages

Since so many people use mobile devices, having a mobile-friendly site is strongly correlated with inclusion in the local three pack. Even if your site looks good to you, it may not display correctly on every type of device. Consider investing in responsive design so that your site adjusts to whatever device accesses it.

As with any changes to your site, check for errors and problematic formatting when you launch the adjustments. Having a site that doesn’t display correctly will hurt your attempts to rank high enough to appear in the local three pack. Google’s Search Console has a Mobile Usability option where you can see whether any issues exist.

Inclusion in Google’s local three pack will help you connect with customers in your area. Since so many searchers use Google to find information on local businesses, you’re leaving money on the table if you’re not optimized for local search. Since the strategies that get you into the local three pack are also good for your SEO, you’re doing your business a favor by working to get into one of the top three results, even if it takes some time to knock others out of the spaces.

The post Tips for Getting in the Google Local 3-Pack appeared first on Performancing.

]]>
http://performancing.com/tips-getting-google-local-3-pack/feed/ 6
Link Echoes and Link Ghosts: How Do They Affect Your SEO? http://performancing.com/link-echoes-and-link-ghosts-how-affect-seo/ http://performancing.com/link-echoes-and-link-ghosts-how-affect-seo/#comments Wed, 12 Jul 2017 09:06:31 +0000 http://performancing.com/?p=14152 So many factors contribute to your search engine page rank that sometimes it’s difficult to put your finger on just what is giving your site a boost. Google’s algorithms consider many different factors when ranking your site. One that is well known and understood are inbound links to your site from outside sources. Certain links […]

The post Link Echoes and Link Ghosts: How Do They Affect Your SEO? appeared first on Performancing.

]]>

So many factors contribute to your search engine page rank that sometimes it’s difficult to put your finger on just what is giving your site a boost. Google’s algorithms consider many different factors when ranking your site. One that is well known and understood is inbound links to your site from outside sources. Certain links are a positive ranking signal that helps boost your page’s rank—but sometimes links are deleted or moved. That doesn’t mean the power of the link to improve your search engine page rank evaporates.

What Are Link Ghosts?

Link ghosts are links to your site that have been removed or deleted. Sites remove or delete links for many reasons. Updating a webpage to include new information may delete the content that included your original link. An inbound link may be deleted after a period of time. For example, if a link was mentioned as part of a contest, the contest expiration may cause the page to be removed.

Some websites close down permanently or take themselves offline to update. That can also create a link ghost. Google knows there was a link pointing to your website; the link is dead now, but it still has an impact on your rankings. Since a good link profile is one of the major drivers of search engine ranking, this means temporary links may offer more benefit than is usually assumed.

What Are Link Echoes?

Link echoes are the same thing as link ghosts; the terms can be used interchangeably. The term link echoes caught on because the link’s ranking signal echoes, continuing to affect your site’s rank even after the original source of that signal is gone. Rand Fishkin says he finds link echoes a better name than link ghosts.

How Do We Know Link Echoes Affect Rankings?

Rand Fishkin’s team at Moz saw the effects of link echoes while working on a project to determine the effects of anchor text.

The links pointing to the pages they were testing bumped up the page rank of those pages. They moved to higher positions in the targeted searches. In the course of the test, Fishkin’s team removed those links and then, to their surprise, found that the pages either retained the same rank or just dropped a bit.

Once they removed the links and Google indexed the pages again, it would be normal to assume the boost from the links would disappear. However, that wasn’t the case.

The positive effects of the inbound links remained.

Almost five months after the initial test, the sites were still significantly higher in the rankings than they were before the test. The pages were indexed by Google multiple times and didn’t see new links that Fishkin was aware of, though he admits that it’s possible that links from outside sources had been created since the original test.

Across every one of the eight test pages, the results were the same: link echoes improved the position of the page in search results.

 

Why Does SERP Matter?

SERP matters because your rank determines how many visitors you get from the search engine. Studies have repeatedly shown that the first Google search result gets the most clicks. The second gets less, but more than the third. Eventually, the click rate drops off and the lower you’re ranked, the less likely you are to get traffic.

Results on the second page of Google are far less likely to get a click than results on the first page.

When you improve your SERP, you improve your chances of getting a new visitor. That person may become a follower of your site, recommend it to friends, or use products or services you provide. Since SERP is so essential for your business, it’s important to take the steps to improve your SEO.

Link Echoes and SEO

Link echoes are an exciting thing for people interested in improving their search engine optimization.

When you’re working to increase your SERP, your focus on SEO matters. The way your page is displayed and promoted has a major effect on where it falls in the rankings. Promoting your page includes getting links from other sites. These links show Google that your site is reputable and useful, among other things.

Improving your link profile is an essential part of SEO, and the persistence of link echoes can affect your strategy. Many SEO experts focus on retaining links over the long term, seeking out links that won’t be removed. But there are reasons to accept links that won’t last forever, too. Good search engine optimization aims to improve your SERP in a way that endures even as Google tweaks its algorithm to remove low-quality sites. Having some temporary links is natural and may benefit your site more than you think.

There are a lot of ways for you to get positive links that improve your SEO. Since link echoes retain value, you can include links that will only be available temporarily as part of your link building strategy.

Why Do Link Echoes Retain Value?

No one knows for sure why link echoes still have a positive effect on rankings. Fishkin has a few theories about why Google still gives value to links that are gone.

Website Performance

It’s possible that Google considers the website’s performance in search results. If it does well and doesn’t send searchers back to the results, Google might allow it to keep the rank. Search engine ranking positions are all about offering value to searchers. If your page offers that value, Google doesn’t have a lot of reason to bump it down in rankings. Good links from reputable sources are a positive signal to Google that your site deserves a higher SERP. So link echoes, positive ranking factors, and searcher reactions to your site may combine to keep your site in a good ranking position.

SERP Factors

Link profiles aren’t the only thing that contributes to a site’s SERP. Other factors play key roles as well, and since Google doesn’t publish its ranking algorithm, it’s difficult to know which are in effect. Having those links pointing at your site in the first place changes your SEO, and may cause other unseen changes that still resonate enough to maintain the rank. Fishkin doubts this theory because of the repeated results of his tests, but it’s still a possibility he put forth.

Link Echo Ranking Factor

Fishkin says it’s possible that Google actually does consider link echoes as their own signal. The positive attributes of inbound links, like the reputation of the linking site, don’t disappear just because a page updates and the link is removed. Google knows that certain types of links, in and of themselves, are a signal that your page has value. Those links being gone doesn’t mean the value was never there to begin with.

No definite answer exists to explain why link echoes have such a positive effect on your rank, but these theories might hold the answer.

What Do Link Echoes Mean for Me?

Link echoes retaining their ranking value means that even temporary links can help boost your SERP in the long term. While conventional wisdom suggests that focusing on permanent links is a better strategy, link echoes show that may not be true. Since you can get value from link echoes, temporary links are also worth seeking.

Keep in mind that as your site rises on Google, you’re more likely to get links from organic sources. A person who sees your site when its SERP improves might link to it from his blog, for example. So even if the value of a link declines until it doesn’t matter in a year or two, that temporary boost can pay dividends.

One thing to consider is that it’s still important to keep your SEO strategy as white hat as possible. Google eventually adapts to remove sites that use black hat tactics to improve their page rank. If you use clean strategies that improve the quality of your site and your reputation, you’ll come out of these algorithm changes without losing major rank overnight.

When you’re working to improve your search engine ranking position, link echoes are beneficial. They ensure that links you’re getting today continue to help improve your search rank tomorrow. Since you can’t guarantee that inbound links will last forever, it’s nice to know they can still provide value after they’re gone. As you work to improve your link profile, you don’t have to focus only on links that will remain on a webpage forever. Instead, place links where they fit best and enjoy the boost as it lifts your site to a higher rank on Google.

The post Link Echoes and Link Ghosts: How Do They Affect Your SEO? appeared first on Performancing.

]]>
http://performancing.com/link-echoes-and-link-ghosts-how-affect-seo/feed/ 8
Google Sandbox – What Is It And How Does It Affect Your Website? http://performancing.com/google-sandbox/ http://performancing.com/google-sandbox/#comments Thu, 15 Jun 2017 10:30:04 +0000 http://performancing.com/?p=14106 Sometimes you do everything right, but you don’t see the results of your efforts right away. It could be that your web design, approach, or marketing need some tweaking—but sometimes another factor is at work. In its quest to keep search results relevant and only lift the best to the top, Google sometimes holds back […]

The post Google Sandbox – What Is It And How Does It Affect Your Website? appeared first on Performancing.

]]>

Sometimes you do everything right, but you don’t see the results of your efforts right away. It could be that your web design, approach, or marketing need some tweaking—but sometimes another factor is at work. In its quest to keep search results relevant and only lift the best to the top, Google sometimes holds back sites that are attractive, functional, and deliver a lot of utility to their audience. But if you know how to deal with Google Sandbox, the impact will be much less than it will be if you panic and make the wrong moves.

Google Sandbox is where your site may go when it’s brand new.

 

What is Google Sandbox?

Google Sandbox is a filter imposed on new sites. It’s designed to stop them from ranking, especially for high-competition keywords, until they’re more established and Google can determine what the site is about.

So many new websites are launched every day; many of them employ techniques to manipulate the search engine and gain more clicks. Google doesn’t want to lift these less-relevant sites to the top of a search. Many new sites are filtered until their intent and techniques are determined to be positive ones.

Does Google Sandbox Exist?

The existence of Google Sandbox has not been confirmed by Google. However, there are many signs that it exists. Beyond the indications, though, it’s logical for it to exist. It serves a purpose for Google that helps them optimize their search engine.

Recently, the conversation about whether Google actually does sandbox sites came up again when Gary Illyes tweeted about it and, of course, denied it existed.

There is no need for Google to confirm its existence when many new sites have the same issues getting a good rank when they’re first launched. It’s clear that something is holding them back. That something is the filter Google puts up until they know the site is one they want to display.

How Does Google Sandbox Affect My Site?

Google Sandbox has an especially obvious effect on high-competition keywords. More people are trying to make their sites rank for those terms. Google uses the Sandbox filter to make sure that low-quality sites don’t take over results, but even good sites get caught in it.

The biggest impact is on your search clicks. When you’re in the Sandbox, you don’t appear as frequently in search results as you would outside it. That means fewer people are being directed to your website via the keywords you’re trying to rank for, which can make it look like your SEO is failing even when it isn’t.

If you’re following SEO best practices, you may just be stuck in the Sandbox, waiting to be released so that all the work you’ve done can bring you more traffic. Until then, unfortunately, you may see less traffic than you’d expect.

 

Google searches rank sites on many factors. Your site may not achieve its full potential until it’s out of the sandbox.

Why Does Google Have a Sandbox?

Google has a sandbox because first impressions matter. When your site debuts, it is making a first impression on Google—and they can’t be sure whether you’re dressed up in your Sunday best to impress them, or if you’re actually as great as you appear to be.

Many people use quick-fix SEO tactics to influence search engines and get more viewers. These degrade the quality of searches for everyone: sites using them offer less utility than properly optimized sites, yet they can still rise above good results in search listings.

Google is as aware of these tactics as anyone. Their Sandbox acts as a filter to keep these sites from overtaking search results. Someone well-versed in SEO tactics can easily put up many sites to grab quick rankings. Sandbox keeps them from being able to do this successfully.

Once Google is more familiar with your site, you get to venture out of the sandbox. Then all your efforts will start directing traffic to your site. It just takes time for the search engine to understand what your site is about.

How Long Does the Filter Last?

It appears that the Google Sandbox filter lasts about six months max.

The length that it affects your site depends on a few factors, like:

  • Content on your site
  • Keywords you’re targeting
  • Your niche

These factors determine how closely Google has to scrutinize your site. Different niches are targeted to different degrees and with different tactics. If yours is one rife with black hat SEO practices, you may be in the sandbox longer. If you’re going after low-competition keywords, you may be out of the sandbox sooner.

It’s better not to try to predict when you’ll be out of the sandbox. Instead, focus on making your site the best it can be. Make sure you’re doing everything you can to build great content. That way, when you’re out of the sandbox, you have an optimized site. It won’t last forever and you want to be ready to get the best possible traffic from Google once you’re out.

5 Ways to Avoid Google Sandbox

Though you can’t guarantee that you’ll avoid the Sandbox, there are things you can do to boost your chances of staying out of it. If you’re already in it, doing these things could even help you get properly classified sooner.

  • Reexamine everything on your site and make sure you’re following SEO best practices. If you’re using any black hat techniques, cut them out and start doing things the right way.
  • Reconsider the links to and from your site. It’s important to have high-quality links in the right proportions of follow and nofollow. If all your links are paid or sponsored, it may change how Google views your site for the worse.
  • Keep updating your content with new and relevant content. Manage your site in such a way that Google can see you have good intentions.
  • Use a sitemap.xml file and a robots.txt file to make sure Google’s crawlers can read and index your site properly. If your site is difficult for the crawlers to understand, it may not be updated and released from the Sandbox as quickly as it otherwise would be.
  • Update the <lastmod> tag on each webpage on your site. This shows Google that your site is being updated, which they favor.
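
As a sketch, a minimal sitemap.xml entry with a `<lastmod>` date looks like this (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blog/my-latest-post/</loc>
    <!-- Date of the page's last significant update, in W3C date format -->
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```

Point crawlers at it with a `Sitemap: http://www.example.com/sitemap.xml` line in robots.txt, and keep the `<lastmod>` values current whenever pages change.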

Doing these things can help you avoid the Sandbox or even help you get out easier. You can also launch your site before you need a lot of traffic and just wait it out. But remember there’s no guarantee no matter what path you take. If you’re going for very competitive keywords, you may end up there.

It’s not always a bad thing. Sometimes it’s an indication that you’re doing things right and Google is making sure your efforts to optimize your site are white hat, not black hat.

Black Hat SEO

Black hat SEO or black hat tactics are names given to a particular theory of search engine optimization. Unfortunately, it’s one that has negative effects even for websites that don’t use these tactics.

People who engage in this behavior build a site that gets to the top of a Google search for keywords at all costs. What they don’t do is create great content or deliver utility. If you’ve ever performed a search and only gotten low-quality results that looked like spam, you’ve been the victim of these tactics.

Black hat SEO can actually violate the terms of service of Google, which can have even worse consequences for your business.

Some examples of black hat SEO are:

  • Keyword stuffing, which refers to writing content that exists only to repeat keywords. This helps you rank for those keywords, but doesn’t make good, useful content.
  • Link manipulation. This can be anything from spamming links on different sites to buying links. You aren’t getting quality links because you’re a quality site. You’re building them too quickly from low quality sites or purchasing them to get a false advantage.
  • Sneaky redirects. These occur when a person visits one page and is automatically brought to another. For example, if you searched for lawnmowers and were redirected to a page about hedge trimmers because the owner could rank for lawnmowers, but not hedge trimmers.
  • Automated or ‘spun’ content. Some people use programs to create content, rather than having it written. This leads to low-quality content with nothing new to offer.

Since black hat SEO tactics can temporarily boost a page’s search ranking, Google has to guard against it. If they don’t, the quality of their searches goes down. Google Sandbox is one way of fighting back against black hat SEO tactics and other ways that people find to manipulate search results.

White Hat SEO

White hat SEO is the opposite of black hat SEO. It refers to SEO best practices and other ways people use to optimize their sites that are good for your business, your clients, and the search engines. It’s the kind of SEO work that Google wants people to do, because it helps bring some of the best content to its proper place in search results.

White hat SEO is more difficult and may take more time to show results than black hat SEO, but it will keep you in Google’s good graces and make sure you don’t get hit by changes to the algorithm like Panda. It’s always good to do things the right way so that you don’t have to redo them later.

Best Practices

SEO best practices are all about bringing the right content to the right people. It’s a way to make sure people are getting the results they need. Unlike black hat SEO, white hat SEO ensures that people who find the way to your site are the ones interested in what you have to offer. It also helps you build positive relationships in your niche when you interact with other, complementary sites.

One of the most important aspects of white hat SEO is writing good content. Google gives weight to strong content that’s updated regularly. While you may feel discouraged and not like creating new content when you’re in the Sandbox, put away those thoughts. Instead, realize that you’re creating a portfolio of work that can bring you benefits for years to come.

Once you have the good content, you have to find a way to get your name out there. Creating a strong, positive backlink profile is essential for your site’s success. Unlike black hat SEO tactics, white hat SEO is all about good content marketing and outreach. You find people in your niche who can feature your work because they appreciate it and find value in it.

So does this mean that you should wait to do the right things until your site is out of the sandbox? Absolutely not.

When planning marketing and promotion, always go with SEO best practices to avoid being hit by filters or algorithm changes.

3 Things to Do While You Wait

  • Keep writing content for your site. The more content you have, the more pages that can appear in search results. When you keep writing great content, you’re creating a future for your site and business.
  • Make relationships with others in your niche and get great links. Links have a filter of their own, so getting them up now may help them achieve their full potential sooner. Use your time in the sandbox to start the process of making and sustaining those connections that benefit you.
  • Join Google AdWords and find visitors in other ways. Since your traffic is slowed by being in the sandbox, things like AdWords, content marketing, or other advertisements can help you reach out and find people to use your site now rather than later. These may also benefit your SEO when you’re out of the Sandbox, allowing you a higher rank than you would otherwise have.

Whether you’re in the Sandbox or not, keep making your site the best it can be. If you’ve done everything you can, there’s nothing left to do but wait for Google to release you from the Sandbox. As soon as you’re out, you’ll see the benefit of your hard work in terms of traffic directed to your site from Google search results.

Ultimately, Google Sandbox is just a reminder to do the best you can to conform to SEO best practices. As long as you do—and keep things updated, relevant and useful—you’ll see your traffic from search engines increase. You’ll also stay ahead of SEO changes, filters and any other check Google comes up with.

The post Google Sandbox – What Is It And How Does It Affect Your Website? appeared first on Performancing.

]]>
http://performancing.com/google-sandbox/feed/ 7
The SEO Copywriting Guide For 2017 http://performancing.com/seo-copywriting-guide/ http://performancing.com/seo-copywriting-guide/#comments Mon, 12 Jun 2017 11:00:21 +0000 http://performancing.com/?p=14117 One of the main things that drives people to your website is content. It’s one of the primary things Google indexes to create results for searches that are applicable to your website. While great content and content that’s great for search results may not always look the same at first glance, you can work on […]

The post The SEO Copywriting Guide For 2017 appeared first on Performancing.

]]>

One of the main things that drives people to your website is content. It’s one of the primary things Google indexes to create results for searches that are applicable to your website. While great content and content that’s great for search results may not always look the same at first glance, you can work on making your content stand out from the crowd while keeping it optimized for search engines.

It’s an essential part of promoting your website, finding new visitors, and turning your brand into a powerhouse.


What is SEO Copywriting?

SEO copywriting is all about creating content that’s designed to help your search engine rank. Even a great page of content may be tweaked to provide better performance on search engines. When you’re writing a page of content for your website, whether it’s a blog post, a product advertisement, or a landing page, always keep search engine optimization in mind.

Every part of your site should be designed with SEO in mind, but content deserves special attention because it’s the part of your site that will be updated most often. Since your place in search results may be boosted when you update more often, SEO copywriting is a habit you’ll want to develop early on.

Of course, if you already have a lot of content that isn’t optimized for search engines, you can go back and make it stronger. Updating your old content may also give you a boost, and is worth the time it takes to do so.

 

What are the Goals of SEO Copywriting?

Copywriting for SEO done well has three parts. First, you must write content that delivers the message you want your site visitors to absorb. Second, you must write it in a compelling, clear way that encourages people to read it and makes them want to come back. Third, you must write and present the content in a way that makes it easier for search engines to crawl your site and present it when the right keyword search is performed.

Ultimately, these will contribute to the three primary goals of SEO copywriting.

  • Filling your site with great content that’s straightforward and written to your audience.
  • Structuring that content so that a search engine has an easy time indexing it.
  • Boosting your position in search results, leading to increased clicks and visitors.

It’s all about playing nice with what search engines want so that you can reap the benefits of getting to the top of the search results. Since the first result on Google gets 33 percent of the traffic, and each result below it gets progressively less, even one upward position can make a substantial difference. If you’re not on the first page of Google search results for your targeted keywords, don’t despair. That can be your first goal as you improve your SEO copywriting and update old content.


Cornerstones of SEO Copywriting

To become skilled at copywriting for SEO, keep in mind the best practices as you write. That way, you’re always working to make your content better and more likely to be shared.

Compelling Content

Writing compelling content is the most important aspect of SEO copywriting, and it cannot be stressed enough. Search engines put a lot of weight on how many quality links from trusted domains point back to your content. When readers like your content and share it in a positive way, you get two different benefits:

  • People who follow the site linking to you are more likely to click the link and discover your site organically. It may lead to other shares and links on other sites, too.
  • It increases the amount of link juice you have.

Keyword Research

When you’re writing content designed to get people to your website, do keyword research to make sure not all the keywords you’re targeting are highly competitive. You want a good mix of keyword types. A handful of competitive, high-value keywords that are hard to rank for may surprise you when you rank higher than expected, but lower-competition long-tail keywords will be easier to rank for when you’re first starting. So before you start writing a piece of content, consider the keywords you want it to target.

Content Research

Do your research on the topic you’re writing about before you start the writing process. Having a good grasp of the topic will make your content better and provide a good first impression to any new visitors. Putting out mediocre content quickly may seem appealing, but in the long run writing high-quality content with helpful information will lead to better performance for your site.

Proper Structure

Once you have a great content piece ready, you need to put it in a compelling frame: relevant, eye-catching images; related videos; and a catchy headline. Since the headline may be picked up as the title of your search result, make sure it explains the content and makes people want to look at it. Great copywriting is improved by having the right link types, keywords, images, and structure to present it properly.

 

Keywords

So how do you find the right keywords to build your content around? Use a keyword research tool. There are several available from Google, Moz, and Wordstream, for example. Type in a keyword you want to see results for, and then explore the available data. Look for keywords that have many searches, but lower competition. Ones with a lot of competition will be hard to rank for. Keywords with few searches won’t give you as much benefit as ones that get many searches.
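
To illustrate that volume-versus-competition tradeoff, here is a small Python sketch that ranks keywords by a simple opportunity score. The phrases, search volumes, competition scores, and the scoring formula itself are all hypothetical, not the output or methodology of any particular tool:

```python
# Hypothetical keyword-research export: these phrases, volumes, and
# competition scores (0.0–1.0) are made up for illustration only.
keywords = [
    {"phrase": "lawnmowers", "monthly_searches": 40000, "competition": 0.92},
    {"phrase": "best lawnmowers for small yards", "monthly_searches": 1900, "competition": 0.31},
    {"phrase": "electric lawnmower reviews", "monthly_searches": 880, "competition": 0.24},
]

def opportunity(kw):
    """Toy score: reward search volume, discount heavily contested terms."""
    return kw["monthly_searches"] * (1 - kw["competition"])

# Keywords with the best volume-to-competition tradeoff come first.
ranked = sorted(keywords, key=opportunity, reverse=True)
for kw in ranked:
    print(f"{kw['phrase']}: {opportunity(kw):.0f}")
```

Real tools weigh far more signals, but reducing each keyword to a single comparable score is a handy way to shortlist candidates before digging into the detailed data.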

Most tools will also give you data for long-tail keywords. These are longer, more specific phrases that people search for, and they often have lower competition than standard keyword phrases. If your website is relatively new and lacks the authority to take on the major industry competitors, it may be wise to target multiple long-tail keywords and build up traffic and authority for your site and brand that way.

 

Tips for Better Keyword Research

  • Think outside the box. If you’re not sure what keywords to use, put yourself in your customers’ shoes and try to figure out what they might look for. Think about what queries a searcher might type if they’re asking a question you have the answer to. Keep a list of keyword phrases you’ve considered.
  • Read through competitors’ websites to determine what keywords they’re targeting. It may help you come up with ideas to put into the tool.
  • Give yourself some room to experiment. Not every piece of content you publish has to be a home run, because it will provide other benefits for your site even if it doesn’t get a lot of views. It’s okay to try something new and see how it performs on your own site. You may discover a great keyword or phrase others in your niche haven’t yet.

 

Setting Up SEO-Friendly Documents

One important consideration when performing SEO copywriting is that you can’t just dump paragraphs on the page like an essay. Web content should be formatted differently than other types of content, simply because it will be read and examined differently.

Consider using headings to break up your document. Think of them as the bones of your content. They make it easier for a visitor to skim the page and find the content that’s relevant to them. Don’t make your paragraphs too long. Short paragraphs or a mix of lengths is more readable and attractive.
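
For example, the heading skeleton of a page might look like this in HTML, with one h1, h2 subtopics, and h3 details nested beneath them (the titles are placeholders):

```html
<h1>The SEO Copywriting Guide</h1>
<p>…intro paragraph…</p>

<h2>Keyword Research</h2>
<p>…</p>

<h3>Long-Tail Keywords</h3>
<p>…</p>
```

A consistent hierarchy like this helps skimming readers, and it gives search engine crawlers a clear outline of what the page covers.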

Choose a font that isn’t hard to read. You’re looking for something attractive that fits with your page, but nothing fancy with lots of flourishes. If it’s hard for a visitor to read your site, they may move on and not come back. Always think readability when you’re working on your copy cosmetics.

 

“On a typical blog, only about 2% will spend more than two minutes reading a post”

Neil Patel, Quicksprout.com

 

Writing SEO-Friendly Content

Think of your content as a conversation between you and the person visiting your site. Depending on your niche and the approach you want to take, you might keep things more casual or more formal. For example, a recipe site that uses real-life anecdotes might address people like old friends. A site trying to rent business conference spaces probably wants to keep things more professional. Choose a tone and stay on it throughout your site.

Use the active voice when possible. Don’t force it, but do try to write in such a way that most of your sentences are active rather than passive. It is a cleaner writing style that resonates with most readers. Also, use powerful words that inspire emotion in the person reading your content. When they connect with your words, they’re more likely to connect with your site.

 

“Using active voice helps people picture themselves taking action, so it’s no surprise that it’s considered so important in marketing copywriting. After all, when you write your page copy, you want it to convert – and that means convincing readers to take action on your offer”

Sharon Hurley Hall, Sharonhh.com

 

Determine what questions your content could answer, and then work that into the content. Many people search for questions when they don’t know the answer to something, and if you provide the answer, you might rise in search results.

Finally, don't skimp on the words. Writing more than 1,500 words not only offers more space to answer questions and explain concepts, but also provides more room for links and keywords. You don't need to go overboard; not every post is a novel. But don't cut yourself off, either. Write to the topic, and try to go long. Neil Patel of Quicksprout found that pages ranking higher on Google were likely to have more than 2,000 words, and that pages higher in the results tended to have more words than those lower down.


7 Ways to Improve Your SEO Copywriting

Don’t Keyword Stuff Your Content

Old-school SEO advice often suggested stuffing the keyword you're targeting into a post as many times as possible. But that creates an unreadable, unattractive piece of content. Instead, keep your tone natural and appealing, and use different variations of your keyword to reach other potential searchers.

Pay Attention to Your Metadata

Your metadata is information about your content that doesn’t appear on the page, but is visible to search engines. It may determine how your site’s search result appears on Google. Make sure to optimize your metadata before you publish your page. Even great SEO copywriting needs the right frame to get those initial views before your site develops a large following.
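As a rough illustration (not an official Google check), here is a minimal Python sketch that pulls the title and meta description out of a page and flags common problems. The ~60 and ~160 character limits are widely used rules of thumb for what fits in a search result snippet, not fixed values.

```python
from html.parser import HTMLParser

class MetadataChecker(HTMLParser):
    """Collects the <title> text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_metadata(html):
    """Return (title, description, list of issues) for an HTML page."""
    parser = MetadataChecker()
    parser.feed(html)
    issues = []
    # Rule-of-thumb snippet lengths, not official limits.
    if not (1 <= len(parser.title) <= 60):
        issues.append("title missing or longer than ~60 characters")
    if not (1 <= len(parser.description) <= 160):
        issues.append("meta description missing or longer than ~160 characters")
    return parser.title, parser.description, issues
```

Running `check_metadata` over your pages before publishing catches missing or oversized titles and descriptions early.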

Create a Blend of Link Types

Not every link on your page should go to a different site. If you have other content that offers more information about a topic you touch on in your content, link it in your piece. It keeps people on your site, shows the breadth of your knowledge, and improves the utility of what you’re writing by offering a portal to other information.

Always Check for Errors

Errors aren't only grammatical issues or misplaced commas. They can also be using the same word too often or failing to vary sentence lengths. You don't have to be a perfect, professional writer to create good content, and a great editor can do a lot to make your writing better and more attractive to visitors. Look for any issue that makes reading the piece less than smooth.

Look for Places to Quote Experts

Not only will quoting experts offer an SEO benefit, but it will also add authority to your piece. Read through the content that others in your niche are developing to see what they’re saying. Once you find something relevant, quote the person and link back to their site.

Keep a Steady Flow of Information

Repeating yourself in SEO copywriting isn’t a bad thing—but you should be introducing new ideas as your piece develops, too. Repetition helps drive home the most important points you need to make. New information keeps the content moving and interesting, as well. Try to only repeat the things that are vital for a reader to understand in a single piece of content.

Back Up Your Facts

If you cite a quote, paper, study, or fact, link to it in the body of your content. Make sure you find the best source possible for that link, too. For example, linking to a study itself offers more utility to your reader than linking to a fluff news piece about that study. Always err on the side of value when making decisions in your copywriting. Good content will lead to good PageRank, as long as you also focus on things like content outreach and metadata best practices.

 

Conclusion

SEO copywriting is an important skill to cultivate and master as you build your website and develop a following. It’s the base of your site and your communication with visitors. You can always hire a professional if you don’t have the time or inclination to learn, but feel confident that you can also write powerful content yourself if you’re so inclined.


The post The SEO Copywriting Guide For 2017 appeared first on Performancing.

]]>
http://performancing.com/seo-copywriting-guide/feed/ 4
11 Myths About Link Building In SEO http://performancing.com/myths-about-link-building-in-seo/ http://performancing.com/myths-about-link-building-in-seo/#comments Wed, 10 May 2017 08:12:12 +0000 http://performancing.com/?p=13830 Link building is one of the cornerstones of SEO and essential to rank for most competitive search terms. Doing link building the wrong way can quickly get your website into trouble and many websites and business have gone from the top of the SERPs to almost bust overnight due to the wrong link strategies so people […]

The post 11 Myths About Link Building In SEO appeared first on Performancing.

]]>
Link building is one of the cornerstones of SEO and essential for ranking for most competitive search terms. Doing link building the wrong way can quickly get your website into trouble: many websites and businesses have gone from the top of the SERPs to almost bust overnight because of the wrong link strategies, so people are naturally cautious. This has also led to many myths about link building spreading across SEO blogs, forums and social media over the years, and this article is going to put a few of them straight.

Links from directories do not have any value

Back in the old SEO days, when link building meant submitting your website to as many places as possible, directories were high on the list of places to get links from. When people spammed this tactic too heavily, many link directories lowered the standards of what they accepted, or even let any website add its link with no oversight, and things got out of hand. Many directories deemed low quality, or existing simply to host links, were heavily penalized by Google, and after the Penguin updates people quickly stopped using them.

However, this does not mean that all directories are bad. As with all links, it depends on the quality of the site and what it's doing. If a website run specifically for businesses in your local area has a directory, and it accepts and vets listings, then by all means look to get a link from it. As with all link building, use your judgement and analyse the quality of the site beforehand.

Getting links too fast will get you penalised

Another myth, and one that even seasoned SEOs believe! There is some truth to it, though: if you create a new website or publish a new page and then get a lot of links with the same anchor text, or text that is overly optimised for search terms, Google may not see those links as natural, because a link that is truly built naturally usually doesn't use your target search terms. However, if your business blows up in the news or a piece of content goes viral and you suddenly get lots of links, that is fine, and Google can tell they are natural. So yes, if you do forced, low-quality link building too quickly, expect the links either not to count at all or for Google to come after you. Otherwise, you're fine.

Don’t link out to too many sites

This is something else that had some truth to it in the past, but in recent years Google has stated it is no longer true. Many years ago the Google webmaster guidelines said, "Keep the links on a given page to a reasonable number (fewer than 100)." People took this as a web spam rule, and 100 soon became the benchmark figure a page should not exceed for outbound links. Matt Cutts helpfully cleared this up in 2009, explaining that it was more of a user experience guideline, which is why it was not listed in the web spam section.

“Does Google automatically consider a page spam if your page has over 100 links? No, not at all”

Matt Cutts, 2009

Even though the advice was dropped from the guidelines in 2008, Google publicly reinforced this in 2013, but added a note that they may take action against a page if there are signs of obvious manipulation and/or spam.

Do not ask for links at all as it will get you penalised

This is not true, but it does share similarities with situations where companies have offered discounts on their products and services, or even free samples, in return for links. If Google sees a website or company doing this on any kind of large scale, it will see it as manipulative and will most likely issue a penalty and a warning.

However doing general outreach to niche related sites and asking them to link back to you is just fine. If a website has a resources page and you have something that you feel would be a good fit for that page then by all means ask them to add it.

This will *not* happen if you ask for links

Don’t get links from sites with less authority than you

As long as a site has quality content and isn't spamming, a link is still valuable to you whether it comes from a DA 50 or a DA 20 site. Of course, links from higher-authority sites carry more weight, but lower-authority ones can still help, especially if the content is closely related to your site. Don't turn your nose up at them!

More than one link from the same domain has no value

Getting links from as many different domains as possible is of course desirable for any webmaster or link builder. A second or third link from a domain that has already linked to you may not help as much as the first link did, but it still passes value the same way other links do. If Forbes.com were linking to you multiple times from their site, would you be upset? Of course not.

Links from non-related sites will not help

Your priority target sites should be sites related to your industry but it doesn’t mean that links from other niches will not pass value and help your rankings.

Link building is an independent strategy

This is another hangover from how SEO was done before the rise of content marketing. Any modern-day SEO or content marketer knows that the two fields now go hand-in-hand. High-quality, useful content is integral to nearly all link building campaigns, especially if you want to attract links organically (without having to do manual outreach). If you are planning a piece of content that you hope will earn links naturally, or that you want to use for outreach, then making it stand out as much as possible should be your main priority. An attention-grabbing piece of content that is head and shoulders above your competition will do a lot of the work for you!

Reciprocal link building will always get you penalised

No it will not. If you link to a site that happens to link back to you by chance or even intentionally then Google will not have a problem with it. However, if they see you conducting what looks like a reciprocal link scheme where there are many sites and you are all linking to each other then they will take action.

I’m sure you are beginning to see a pattern here now, don’t spam, Google isn’t stupid!

Links are permanent

You may have heard people say things like "pay per click is temporary, SEO is forever", and there is some truth to this: although SEO takes much longer to achieve results, the effects can last a long time, and organic SERP results historically have a much better click-through rate than PPC search ads. However, this does not mean that if you build XX links for a website, those links will still be there a few months or a year later. Editors may remove links or entire pieces of content, websites are bought or shut down, businesses go bust, and for numerous other reasons any number of highly valuable links to your website can disappear overnight. Link building should never be a "set it and forget it" strategy; it should be ongoing, which adds further emphasis to why it needs to be baked into your overall content and digital marketing strategies.

Nofollow links have no value

As we covered in a previous post, nofollow links should actually be part of your linking strategy. They help your link profile look natural: Google does not like to see too many followed links, and you want a profile with a good mix of followed and nofollow links. Some SEOs and content marketers these days even say you shouldn't think about whether a link is followed or nofollow at all, and should instead focus on reaching out to the right websites and people.
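If you want to audit that mix on your own pages, a small sketch using Python's standard-library HTML parser can tally followed vs. nofollow anchors (the sample markup is purely illustrative):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Tallies followed vs. nofollow links on a page."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return  # named anchors aren't links
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollow += 1
        else:
            self.followed += 1

audit = LinkAudit()
audit.feed('<a href="/a">x</a> <a rel="nofollow" href="/b">y</a>')
print(audit.followed, audit.nofollow)  # 1 1
```

Run this over a rendered page to get a quick feel for whether the follow/nofollow balance looks natural.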

The post 11 Myths About Link Building In SEO appeared first on Performancing.

]]>
http://performancing.com/myths-about-link-building-in-seo/feed/ 12
How to Do Internal Linking the Right Way http://performancing.com/how-to-do-internal-linking-right-way/ http://performancing.com/how-to-do-internal-linking-right-way/#comments Wed, 10 May 2017 03:11:29 +0000 http://performancing.com/?p=14023 Internal Linking and SEO Best Practices One of the great things about raising your visibility through search engine optimization is that you can get a lot of link juice with a few simple changes to how your produce and present your content. One well-known way of improving your SEO is through external links from relevant, […]

The post How to Do Internal Linking the Right Way appeared first on Performancing.

]]>
Internal Linking and SEO Best Practices

One of the great things about raising your visibility through search engine optimization is that you can get a lot of link juice with a few simple changes to how you produce and present your content. One well-known way of improving your SEO is through external links from relevant, popular sites that point to your content. A less understood method is internal linking and its many benefits. If you aren't including internal links in your webpages, you're losing out on what could be a substantial boost in your Google rank.

 

What is Internal Linking?

An internal link is a link on your website that points to another page hosted on your website. For example, if a blog post on your page references a product you're selling, you could link to that product. You could also link to your homepage, frequently asked questions, or another post that sheds light on a topic mentioned in the one with the link. The important thing about internal linking is that the source and target of the link are on the same domain.

 

What’s the Point of Internal Linking?

Internal links serve a few different purposes on your website. For one, they create a roadmap of your site that helps people navigate. A clear webpage structure makes a better customer experience. Another thing they do is educate your visitors on what topic you’re writing about. Sometimes concepts are too complex to be explained in a single post, and linking to another can help your visitor find more information to clear things up.

These links also create more links on your site, which can benefit your Google rank and make your site more visible to interested searchers. One of the major benefits of internal linking is its ability to get more of your pages indexed, which is another way to increase your visibility.

 

Internal Links and Google Crawlers

Google indexes the web with robots called crawlers. They go to a website and index the content, then follow links to new pages to index those as well. When content is indexed, it is available to searchers on Google.

Internal linking gives the crawlers a new way to find pages on your website. It creates links for them to follow. It helps Google index your pages and display them in search results, which increases your potential visibility. Since Google only updates their indexed content so often, creating new links to your pages may increase the rate of refresh for your site.
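That discovery process can be sketched as a breadth-first walk over internal links. In this toy sketch, the `site` dict stands in for real fetching and HTML parsing (an assumption for illustration); notice how a page that no other page links to is never found:

```python
from collections import deque

def discover(site, start):
    """Breadth-first walk over internal links, the way a crawler
    discovers pages. `site` maps each URL to the URLs it links to."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

site = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1"],
    "/orphan": [],          # no page links here, so it is never found
}
print(sorted(discover(site, "/")))
# ['/', '/about', '/blog', '/blog/post-1'] -- '/orphan' is never discovered
```

Every internal link you add is another edge in this graph, and an "orphan" page with no inbound internal links depends entirely on your sitemap to be found.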

 

 

Internal Linking provides crawlers a way to find other pages on your website.

Internal Links and Crawl Limits

When designing an internal linking strategy, keep in mind that search engine crawlers have limits on the number of links they can crawl. Google recommends that you keep the number of links to a few thousand at most, as of 2017. In the past, Matt Cutts of Google recommended keeping the links on a page to fewer than 100. Google dropped this recommendation in 2008, but also reinforced that it would take action if a page was deemed to be linking out for spam purposes.

When linking internally it is also important to keep your crawl budget in mind. This is the number of pages on your site that Google bots are able to crawl.

 

What is Your Crawl Budget and Why You Need To Know This


 

Since you also want to keep the page looking natural and not like it was link-stuffed, go easy on the internal links. Link totals include every link on the page, including the header, footer and any sidebars. If you can work in two or three internal links in a few thousand words, you’re doing the right thing.

That’s not to say that there aren’t times when more internal links are useful. If you’re setting up an index page or doing an overview of a topic to help people find more in-depth articles, link away. But in general, go easy on the internal links. Just a few to help readers and crawlers find their way to new pages can do wonders for your website navigation and SEO.

 

Types of Internal Links to Use

Use links in places where there is a convincing connection between the two pages. Ask yourself: is this relevant enough that a reader would want to click the link to see more information about it?

Use links that reach deep within the content of your site, not only links to the surface. While a link to your homepage is technically an internal link, it's unlikely to deliver the same search engine optimization benefits as a link into deeper content, like blog posts or answers to readers' questions submitted to the site.

 

Types of Internal Links to Avoid

One type of link to avoid is a nofollow link. Since crawlers use links to get deeper into your site and index more pages, nofollow links counteract this: crawlers generally only continue to the next page when the link is a followed link. Google doesn't like to see webmasters doing this, as it looks like "PageRank sculpting", which is trying to force link equity to flow only to certain pages to boost their value.

“Nofollow is probably never the answer, especially on your own site. I can think of corner case scenarios where the target page would be robotted for whatever reason, and then if it is robotted and not indexed yet, if you don’t want to get that page indexed, then you probably don’t want to point to it with anchors”

Gary Illyes, Google

Matt Cutts did say back in 2013, though, that it's OK to nofollow a link to a page that contains something such as a login form.


Don't make your internal links the same on every page. Most websites link to major content, such as the contact page, the homepage, or business hours, from every page. Avoid the mistake of adding no variation beyond those.

Some examples of links that won’t be crawled are:

  • Links kept behind forms won’t be indexed. Crawlers won’t submit forms.
  • Some links are only accessible through an on-site search. These won’t be indexed. This is one of the most common causes of links not being crawled.
  • Flash, Java and similar plugins can prevent crawlers from accessing the links on them.
  • Links in certain types of Javascript won’t be crawled. It’s almost always better to use normal HTML links.
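The last two points are easy to demonstrate: a plain HTML `<a href>` is visible to any parser, while navigation wired up only in JavaScript is not. A minimal sketch (the markup is illustrative):

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collects href targets the way a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

html = """
<a href="/guide">Plain HTML link</a>
<span onclick="location.href='/hidden'">JS-only navigation</span>
"""
c = HrefCollector()
c.feed(html)
print(c.links)  # ['/guide'] -- the JS-only target is invisible to the parser
```

Googlebot does execute some JavaScript these days, but a standard `<a href>` link is the only form you can rely on being crawled.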

As you build internal links, decide what you want to focus on. You can focus on spreading many internal links through your site. Another strategy is promoting certain pieces of well-written content on many different pages. Either way, work to structure your links so they’ll be crawled. Your site will benefit from the new links and crawlers may find pages they haven’t before.

 

Internal Linking and SEO

Internal linking combines some of the most important components of search engine optimization into one task: linking, content, indexing, and refresh rates. Though you won’t get the same juice as you would from an external link from a high-authority site, you still get a boost when Google crawls and indexes your pages more often.

When your search engine optimization is increased, you’ll see your Google rank climb. Since pages at the top of Google’s search results for a term get significantly more traffic than those lower on the list, it’s worth the effort you put in to increase your rank.

Internal linking can also help increase your traffic and SEO by returning more of your pages in search results. More links mean more crawling, and more crawling means more pages indexed by Google. Those indexed pages can be returned in searches for the keywords you're targeting. In that way, your site may rise above one that doesn't use internal linking.

 

Internal Links and Reader Engagement

Another benefit of internal linking is that you can increase reader engagement. A good link structure will make your site and content easier to navigate for readers. One benefit is that a clear link to supplementary content will add value. Adding value will make your site more appealing to a visitor.

The easier and more helpful your site, the more interested people will be in returning to it later. Since returning visitors are more likely to make a purchase on your site, it pays to keep people coming back. This is another reason fresh content is essential and supports an internal linking strategy: it keeps people coming back and following links to your pages, establishing you as an authority and building the trust that inspires people to purchase products or services.

 

Internal Links and Niche Authority

Linking to your own material can also help increase your own authority in your niche. While linking to outside websites also has value, you show that you know your stuff when you use internal links properly. Since building your niche authority can help you build relationships with other businesses, attract customers, and offer external link trading capabilities, it’s good to position your brand as an expert.

Pages that define terms, elaborate on mentioned concepts, or highlight an area of expertise you excel in can position you as an authority. Linking to those exposes your brand knowledge and awareness to other people. If you have more relevant and substantive content on your site, visitors can see that you know as much as or more than your competition.

 

Creating Linkable Content

One trick to internal linking is to create lots of content. The more content, the more potential links and places to link on your site. Since updating your content regularly helps improve your Google rank, increasing the content production on your website is a good idea for more reasons than one anyway.

To create content that is easy to make relevant links to, take a look at your old posts and consider how they might connect to new content. Don’t force it, though. You don’t want to sacrifice the quality or readability of your content to link to other pages on your site.

If you already have a lot of content that isn’t linked, go through and update it to add links to relevant pages. To make it natural, consider adding a sentence or two to refresh the old content when you revisit it. Add the link in the new content. In this way, you’re making it fit naturally and it will be easier to read.

 

Formatting Internal Links

Always format internal links the right way. Not doing so may detract from their benefits. The link structure you're creating is designed to help Google find and index pages on your website.

A normal, follow link should work perfectly fine with internal linking. Also, if you’re trying to get very good results for a particular page, consider making the link more visible. Place it in the content with a good anchor text: keep it succinct, relevant to the link, and don’t make your anchor text too keyword heavy.

Many websites are designed in such a way that Google can’t easily index their content. Having the pages linked the wrong way can inhibit the crawling activity. This keeps your site from being indexed and returning as a search result. If your pages aren’t linked properly, Google may not even know they exist. Google can’t return what it isn’t aware of in search results.

 

Keep Your Internal Links Relevant

Adding internal links to your website isn't just about shoehorning them in where they fit. It's important that they're relevant to the content of the source page. Ask yourself whether the link adds value to the page. If it does, and it's formatted and selected properly, then you have a perfect internal link. If it doesn't, remove it and use a link that does give the reader more value.

Before placing a link, determine whether it’s relevant. Ask yourself whether you’d find it interesting and educational if you were reading the page. Picture yourself as a customer and ask whether that link would seem natural and useful to you. If you think it would be, then add it to your page.

If you aren't already using internal links in your content, start today. It's a simple and easy way to increase reader engagement and improve your search engine position. On top of that, it makes your site easier to navigate and can supplement your content.

As you work to increase your position as a brand authority, internal linking should become a commonplace feature of your site. The benefits over time will be substantial.

The post How to Do Internal Linking the Right Way appeared first on Performancing.

]]>
http://performancing.com/how-to-do-internal-linking-right-way/feed/ 1
Soft 404 Vs Hard 404 Errors: What’s The Difference? http://performancing.com/soft-404-vs-hard-404-errors-whats-difference/ http://performancing.com/soft-404-vs-hard-404-errors-whats-difference/#comments Mon, 01 May 2017 07:28:26 +0000 http://performancing.com/?p=13963 Even when you have an attractive website filled with relevant, timely content that links to other reputable sites, there are still a few places where it’s easy to stumble and end up with negative effects on your SEO. The technical design and function of your website is just as important as the content when it […]

The post Soft 404 Vs Hard 404 Errors: What’s The Difference? appeared first on Performancing.

]]>
Even when you have an attractive website filled with relevant, timely content that links to other reputable sites, there are still a few places where it’s easy to stumble and end up with negative effects on your SEO. The technical design and function of your website is just as important as the content when it comes to climbing the ranks to get a top spot on Google. One example of a technical error that can cause big problems for your search ranking is a soft 404 error.

Hard 404 Errors

A hard 404 error is something you’ve probably encountered more than once while browsing the web. Simply put, it’s a signal to a user that the page couldn’t be found or accessed. This could be because the page doesn’t exist. Hard 404 errors can be frustrating for users who can’t find a page, but aren’t likely to affect your SEO in a large way. 404 errors also aren’t always the fault of the site. If a person types in a web address wrong, for example, they may receive a hard 404 error.

 

Soft 404 Errors

Soft 404 errors, on the other hand, are negative signals for your website. A soft 404 error occurs when someone is trying to access a URL on your website and is getting a message that the page doesn’t exist. However, the site isn’t sending out a typical 404 error code. Instead, it’s responding with a 200 OK HTTP response code. This means that the site is saying the URL is fine and not broken or missing. In other words, the response indicates a successful HTTP request. When the request isn’t successful, the 200 OK code shouldn’t be sent by the server. When soft 404 errors appear on a large percentage of your pages, it becomes a serious customer experience and SEO issue.

 

Effects on Page Indexing

When Googlebot crawls the web to index pages, it has a limited amount of time it can spend on each domain before moving on to another one. When you have pages giving soft 404 errors, Googlebot interprets them as pages with unique content that you want indexed and displayed in search results. So it spends some of the time it has on your domain indexing pages that aren't delivering unique or useful content to your viewers. This limit is sometimes called a site's crawl budget.

This means that the pages you want indexed may take longer to update and rank on Google. You should run your website so that you have the best chance of ranking for the keywords you're targeting as quickly as you can, and soft 404 errors can prevent that from happening. Instead, you may be ranking for useless terms that aren't related to your niche, especially if a high proportion of your pages have soft 404 errors.

 

Effects on SEO

Because a soft 404 error can limit the number of good pages indexed and how frequently they’re indexed, it can have a negative effect on your SEO. Ideally, every time Googlebot crawled your page it would index the newest versions of all your pages, signaling to Google’s ranking algorithm that you’re a frequently-updated and relevant site. However, when your pages with soft 404 errors are indexed, you’re losing the positive benefits of all those well-crafted, on target pages. This is because the web crawlers may not prioritize the pages you’re trying to rank when they’re spending their time on pages with soft 404 errors.

Google on Soft 404 Errors

Google directly said on the Webmaster Central Blog that:

“We discourage the use of so-called “soft 404s” because they can be a confusing experience for users and search engines. [. . .] Search engines may spend much of their time crawling and indexing non-existent, often duplicative URLs on your site. This can negatively impact your site’s crawl coverage.”

There are some areas of SEO like subfolders vs subdomains where Google isn’t extremely clear on what you should do to get the best possible SEO results. However when Google is direct and clear about an issue, you should always do what they suggest. In the case of soft 404 errors, your best bet is to eliminate them completely so that you have the highest ranking you can achieve.

 


Example of a hard 404 error page.

User Experience

Another problem with soft 404 errors is that they create a negative and confusing user experience. A hard 404 error just refuses to load the content and explains that it couldn’t be found. However, a soft 404 will often redirect to the homepage or show a related page that wasn’t what the user was searching for. It can be a frustrating experience for a person looking for content on your site and may end up preventing a return visit from that person in the future.

You don’t have to use the standard 404 error page either. You can create a custom page that is triggered when someone requests a page that doesn’t exist on your server. It can redirect the user to another page while still sending the 404 error code out that indicates that the page isn’t found. This is one way to improve the experience for users of your website.

 

Configuring Your Pages

When you have a URL for a page that isn’t there anymore, it’s best to configure it to return a hard 404 error. This tells both users and search engines that the file couldn’t be found.

Even if your page is displaying a 404 error message, it may not be actually transmitting a 404 error code. The code and the content of the page aren’t necessarily the same. Your page must be transmitting that code so that Google and users know that it’s nonexistent or not reachable. The HTTP header response must be changed so that the server returns a proper 404 code instead of the 200 OK code.
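A minimal sketch of that behaviour, using Python's standard-library HTTP server (the page paths are hypothetical): missing paths get a friendly custom error page, but the status line still says 404, so users and crawlers get a consistent signal.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical site content; in practice this would be your CMS or file tree.
PAGES = {"/": "<h1>Home</h1>"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in PAGES:
            status, body = 200, PAGES[self.path]
        else:
            # A friendly custom error page for users -- but with a real
            # 404 status code, so crawlers know the page doesn't exist.
            status, body = 404, "<h1>Sorry, that page could not be found.</h1>"
        self.send_response(status)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

    def log_message(self, *args):  # keep the demo quiet
        pass

def run(port=8000):
    """Start the demo server (call explicitly; not run on import)."""
    HTTPServer(("127.0.0.1", port), Handler).serve_forever()
```

The key point is that the body (what users see) and the status code (what crawlers record) are set independently, and both must say "not found".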

 

301 Redirects

One way to deal with a page that is no longer available is to configure it to redirect to a different page. However, if you’re not planning to revamp the page in the future and if it doesn’t have value in terms of entry traffic, it’s better to completely delete it and move it off your sitemap for good. That way Google will stop indexing that URL and spend its time crawling pages that still offer good information.

Another reason to use a redirect is if you have a valuable link on another site that you don’t want to lose. The link will still be directed to its original page, and then the 301 code will bring it to the new page. It can preserve your very high-value SEO links.

Never configure your site to use only 301 redirects instead of 404s. Redirects are appropriate only when a direct replacement page is available. Some webmasters turn every 404 into a redirect, but that’s not the right way to set up a website. When no direct replacement product or page exists, a 404 is the appropriate response.
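In code form, that policy reduces to a simple rule: redirect only when a direct replacement exists, and send a hard 404 otherwise. A sketch, with a hypothetical redirect map (the paths are placeholders):

```python
# Hypothetical map of removed URLs that have a direct replacement page.
REDIRECT_MAP = {
    "/old-contest": "/new-contest",
    "/discontinued-widget": "/widgets/new-widget",
}

def response_for_removed_page(path):
    """Return (status_code, location) for a URL that no longer exists."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]  # permanent redirect to the replacement
    return 404, None                    # no replacement: a hard 404 is correct
```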

 

How to Check For Soft 404 Errors

The first thing you need to do is see how many of your pages are returning soft 404 error codes. Load your site into Google Webmaster Tools (or Search Console as it’s actually called now, but I still prefer the old name!) and navigate to the Diagnostics portion of the page. Once there, open Crawl Errors and look to see what pages, if any, are returning errors. Above the listed URLs, click Soft 404s to see which pages Google thinks are soft 404 errors.

Sometimes Googlebot believes a page is a soft 404, but it’s really a page with accurate content returning a 200 OK response. In that case, it’s good to have the page indexed and you don’t need to worry about the code. Other times, you need to configure the page so that it 301 redirects to the right page. (For example, if you were running a contest that ended but you’re still getting hits on that page and want to steer them to a new contest or information about your products.)

If the page shouldn’t exist or should be returning a 404 error code, then it’s time to fix the problem.

Fixing Pages with the Wrong HTTP Response Code

If your page is returning the wrong error code and you want to change it, talk to the person who handles your website. They’ll have to update the code in the content management system for each of the URLs in question. Depending on which content management system or website architecture you use, the fix will vary.

 

Useful Tools

Once you’ve fixed any errors in how Google and site visitors see your page, try using Fetch as Google. It’s a useful tool that can give you insight into how Google crawls your page and whether anything on it is blocked to the crawler. If you have anything you need to debug, Fetch as Google is a good place to start and will let you see what, if any, errors exist in the indexing process.

If you aren’t sure what HTTP status codes are being sent by a URL, use a tool to find out. One web-based tool that can help is the HTTP Status Code Checker. It will shed light on which codes a particular URL is giving. The more information you have about how Google reads, indexes and displays your site, the better you can optimize it for the best possible SEO.
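If you’d rather check from the command line, Python’s standard library can do it. The sketch below disables redirect-following so that a 301 or 404 surfaces as-is instead of being silently resolved; it assumes network access, and relies on urllib raising HTTPError for unhandled 3xx/4xx responses:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so 3xx codes are reported, not resolved."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def status_of(url):
    """Return the HTTP status code a URL actually transmits."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 3xx and 4xx responses arrive here
```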

 

Creating a Custom 404 Page

Once you’ve sorted out your soft 404 errors, you should create a custom 404 page to help site visitors navigate to the information they want or need. Ideally, the page would give them navigation options, an error message and any additional information you see fit, in an attractive format that matches the rest of your site. Having a custom page could help retain first-time visitors who navigate to your site and are met with an error. Google offers a 404 widget you can place on a custom 404 page. It not only helps viewers find more information, but also suggests other ways to reach the content they were originally looking for.
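How you wire up a custom 404 page depends on your server. On Apache, for example, a single directive in your configuration or .htaccess file serves a custom page while preserving the 404 status code (the file name here is just an example):

```apache
# Serve /custom-404.html for missing pages while still sending HTTP 404
ErrorDocument 404 /custom-404.html
```

Be careful to use a local path: if ErrorDocument points at a full external URL, Apache issues a redirect instead, which recreates the soft-404 problem.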

GitHub have combined custom visuals with humour for a great 404 page.

You can use your Google Webmaster account to check the XML sitemap of your page to make sure the widget will display and function correctly.

Making it easier for a viewer to find what they want may increase the likelihood that the person will stay on your site instead of switching to a competitor. A custom 404 page also offers a way back to your primary domain and gives you protection against undiscovered broken links.

Though soft 404 errors may not look like a major problem for your site, they can have a major impact on both the search engine ranking and the customer experience. Identifying, locating and adjusting your pages so that the right HTTP code is sent can fix these problems. Since your search ranking goes a long way in determining how much traffic is sent to your site by Google, it’s essential to optimize your pages and deliver the best possible product to both site visitors and web crawlers.

The post Soft 404 Vs Hard 404 Errors: What’s The Difference? appeared first on Performancing.

Subfolders vs Subdomains – Which Should You Use?
The organizational structure of your website can affect your search engine ranking in a significant way. Though John Mueller of Google says that subdomains and subdirectories don’t matter much when considering how Google indexes and ranks your website, there is evidence that one is superior to the other for SEO. Using the right architecture for your site can increase your ranking on Google, leading to increased traffic and visibility.

 

Subdomains Versus Subfolders

Subdomains and subfolders are two different types of website architecture. You can tell whether a page is on a subdomain or in a subfolder by looking at the URL. A subdomain adds a word before the main domain name: for example, https://images.google.com. A subfolder adds words after the domain: for example, https://moz.com/blog. A subdomain’s content is often hosted on a separate server and managed with a separate content management system, while subfolders are usually all hosted on the same server and CMS.
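The distinction is mechanical enough to check in code. A rough sketch, assuming example.com is the registrable domain (real-world logic would consult a public-suffix list):

```python
from urllib.parse import urlparse

def classify(url, root="example.com"):
    """Rough sketch: is this URL a subdomain or a subfolder of root?"""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host != root and host != "www." + root and host.endswith("." + root):
        return "subdomain"   # e.g. https://images.google.com
    if parts.path.strip("/"):
        return "subfolder"   # e.g. https://moz.com/blog
    return "root"
```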

 

Benefits of Subfolders

Google automatically recognizes a subfolder as a part of your main website. The benefits of links, SEO work or other efforts to raise your search rank will work no matter which page of your site it targets if you use subfolders. They’re a good way to keep things coherent and organized without having to do different targeted marketing and SEO for different domains.

 

Subfolders and SEO

  • Since all the content is definitely indexed under a single site, you get SEO benefits from each part of the website. There’s no risk that a subfolder will compete for rankings with your primary site. Every part of the site is boosted by each SEO effort.
  • All crawling activity will be directed to one site. Since Google crawls your site when you update content, updating on any part of the site will help freshen the content for all of the site.

Benefits of Subdomains

Subdomains are used to separate different parts of your website into their own discrete domains. Though John Mueller says Google’s crawlers try to determine which sites go together, in many cases they’re recognized as their own sites and don’t benefit from SEO efforts that target your main domain.

The main benefits of subdomains are organization and structure of your site for users. For example, a website that sells a product might have its landing pages, a store, and a blog. Subdomains let you use a different content management system for each type of page. So if you’re using a blogging site for your blog, a third-party sales management site for your seller page and your own CMS for your landing pages, you might need to use subdomains.

 

An example of two subdomains.

Subdomains and SEO

  • A subdomain can use a clear keyword or phrase to signal what the site is about very early in the URL. For example, http://contest.test.com is obviously a portal to a contest being run by the main domain. More clarity can help inspire clicks.
  • Subdomains can appear as their own results on Google. If you’re trying to move ahead of similar websites, having multiple Google results that link to your subdomains can increase the number of people who click on yours rather than a competitor’s. The amount of traffic each Google search result receives varies wildly depending on its position, so if you occupy positions one, two and three, you’ll be able to increase your traffic.
  • Subdomains are also commonly used to separate areas of a site for localization purposes. Versions of a site with a different language often appear on subdomains.

 

The Impact of Subfolders on SEO

Even though both Matt Cutts and John Mueller say that there’s no difference in how Google ranks sites organized in subfolders versus those organized on subdomains, SEO experts have found that there’s a significant difference.

When you switch to a subfolder setup, all the links that point to your main site may give your new page a boost too. A subdomain setup is like starting from scratch in terms of SEO, so many prominent SEO experts and blogs stick with subfolder setups.

 

Subdomains Can Cause Traffic to Drop

Rand Fishkin of Moz found that their rankings for different keywords were much higher when they switched a page on their site from a subdomain to a subdirectory. According to Rand, they tested the subdomain setup three times in two years, and it never performed well enough to make permanent. Ultimately, a subdomain setup almost always loses out to a subfolder setup in terms of traffic, which is why subdomains should only be used when necessary for business purposes.

Fishkin also writes about Timo Reitnauer’s experience with the iwantmyname blog. When Reitnauer moved from a subfolder setup to a subdomain, he experienced a significant drop in traffic. Even five or six months later, the traffic hadn’t recovered, so it’s not a matter of simply waiting for traffic to find the new page. The issue is that content on the subdomain simply isn’t getting the same benefits that the content in the subfolder did. So even though moving the blog to gain more options for maintenance and performance seemed like a good idea at the time, it hurt their ability to reach people.

Another example of a subfolder setup being better for rankings is Craig Emerson’s experience. He set up a blog on a subdomain of a site that had already been established for three years. Even though he set up the blog in such a way that he should have been able to rank for his targeted keywords, he couldn’t get into the top 100 in the Google Blog search tool. So Emerson switched to a subfolder setup. Two weeks later, he was ranking at number 57 for one of his targeted keywords.

 

Why Subfolders and Subdomains are Different

Even though Google’s crawlers are very intelligent, they aren’t perfect. Google may be able to recognize that a subdomain is part of the main domain. If so, they’ll categorize it as such. However, much of the time, the crawlers will index it as a different site and you’ll lose any ranking benefits picked up by your main site.

Ultimately, unless you have a specific reason to use subdomains, go with subdirectories. The main factors that affect your SEO, like quality content that’s fresh and well-targeted are the same no matter which architecture you use. But since using subdirectories can give you a boost, there’s no reason not to take advantage of it.

What Are The SEO Benefits of Nofollow Links?
When you’re setting up your marketing strategy, getting your brand out there is essential. Whether you’re climbing the ranks in a Google search or attracting new customers to your site, everything you do should push your brand to make it more popular and more visible. One way to encourage both new customers and better search rankings is getting links to your site from other sites. The type of link makes all the difference—but neither type is inherently bad. Though nofollow links have long been seen as less beneficial than dofollow links, that’s not a complete picture of the situation. Nofollow links come with their own set of benefits and offer real utility to anyone who wants to increase their SEO and promote their brand.

 

What Are Nofollow Links?

 

Nofollow links are links set up so that Google bots do not pass any PageRank or ‘link juice’ to the target site. According to Google, they don’t “transfer PageRank or anchor text across these links.” Nofollow links were long eschewed by SEO professionals because they were believed to have no direct effect on SEO, so the effort you spent getting a nofollow link was seen as wasted. That’s no longer the case. Nofollow links can be just as valuable as dofollow links for increasing your visibility.

While Google says “in general we don’t follow” nofollow links, that doesn’t mean they never follow the links. It just means that in general, you shouldn’t expect the same SEO benefit type from nofollow links as dofollow links. The type of benefit is different.

 

What are Dofollow Links?

 

Technically there is no such thing as a ‘dofollow’ link; there are only normal links without the ‘rel=nofollow’ attribute. Within the SEO industry, however, people like to refer to these as ‘dofollow’ or ‘followed’ links. Dofollow links are the opposite of nofollow links: they signal to a search engine crawler that the link should be followed, and that the link is trusted and organic. Dofollow is also the default, so if you put a link on your page, it’s already a dofollow link without any modification; a special attribute has to be added to a standard link to make it nofollow. The more dofollow links you have, the more Google sees your site as authoritative. They’re a positive signal for your website that can help increase your rankings even if they don’t increase engagement—though having all your dofollow links remain unclicked isn’t as good as having dofollow links that people engage with.
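In markup terms, the whole difference between the two comes down to one attribute on the anchor tag:

```html
<!-- a normal ("dofollow") link: no rel value needed, this is the default -->
<a href="https://example.com/">Example</a>

<!-- a nofollow link: the rel attribute asks crawlers not to pass PageRank -->
<a href="https://example.com/" rel="nofollow">Example</a>
```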

 

Why Are Nofollow Links Used?

 

There are a number of reasons a site would use a nofollow link. The first is that the site is concerned with protecting its own page quality and site rank. Since Google’s crawlers can’t go through site pages that are password protected, using nofollow tags on those links helps steer the crawlers toward the pages the site actually wants indexed in the search engine.

Another reason a site would use a nofollow link is if they don’t trust the site on the other end of the link. It doesn’t mean a site is bad—just that the referring site isn’t willing to personally vouch for it. One example is a site that allows user comments, which aren’t vetted beforehand and could contain spam or other inappropriate content.

Paid links are another type of link that is often converted to nofollow. When you advertise a product on another site, it’s important to Google that your marketing budget doesn’t directly affect your rank—otherwise people could pay thousands to get links on enough pages to rise above sites with more relevant and useful content. For that reason, Google prefers that any paid links are nofollow, which is one of the issues you can run into, depending on your marketing strategy.

 

 

SEO and Nofollow Links

 

So why use nofollow links if a Google crawler won’t use it to boost your page rank? The reason you should use nofollow links is because they can increase organic engagement—and lead to dofollow links from other sources. Each link is a chance to get noticed, which gives your link a chance to be shared organically, not as a paid link that you’ve built or one you’ve exchanged for visibility. If a person who’s interested in your product or service visits the link, finds it useful, and then links it on his own page, you’ve not only engaged an entirely new group of customers but also developed a second link on the back of the original nofollow. As these connections increase, your search results position should increase as well.

 

What Are the Benefits of Nofollow Links?

 

Nofollow links’ biggest benefit is that they act as an introduction to your site, brand, products or services on the page where they’re hosted. When you consider what you’re trying to achieve with your SEO and content marketing strategy, it’s clear that introduction and attention are the most important aspects of the game. Nofollow links from paid advertisements, or from sources that found your site helpful and linked back to it cautiously, still give you access to a new audience. Moreover, sometimes a nofollow link generates more traffic than a dofollow link.

Rob Toledo writes on Moz about his experience with a dofollow link and a nofollow link. The dofollow link generated no traffic for his site; the nofollow link generated hundreds of pageviews that engaged with his site. He concludes by saying, “These are not unique experiences. I have noticed an upward trend where nofollow links can often present the absolute best and immediate return when proper site metrics are measured.”

 

Check out Rob Toledo’s excellent nofollow examples on Moz.

 

His experience is an example of why your link acting as an external resource can sometimes be more important for your site rankings than a crawler detecting your link and following it to your site. There’s more than one way to conquer SEO and move your site up in search results for Google; you don’t have to rely on traditional marketing strategies like an uneven distribution of nofollow and dofollow links.

 

Passing Value Through Nofollow Links

 

A nofollow attribute isn’t a hard-and-fast rule. Though Google’s crawlers are told to not follow the link, certain links may be followed despite the nofollow attribute. Jason Lancaster, of Moz, argues that “Nofollow isn’t a rule – it’s just a guide” because it’s possible that search engines are giving some weight to nofollow links on certain pages.

“It’s beyond foolish to assume that Google doesn’t use nofollow links on Wikipedia (for example) to rank sites. Same goes for links from a popular Twitter profile, YouTube profile, and any other trusted/quality site with an automatic nofollow policy.”

Jason Lancaster

 

Also, Google’s own documentation on how they handle them is a little ambiguous!

 

Managing Your Follow/Nofollow Ratios

 

Your link profile should be made up of both follow links and nofollow links. Though Google’s specific algorithm for determining site ranking isn’t known, experts like Neil Patel believe that the makeup and size of your backlink profile has a major impact on your ranking. Create a link profile that includes both follow and nofollow links to have the best SEO success. The ratio doesn’t have to be exactly 50/50, but keeping it fairly even, say 40 nofollow links for every 60 dofollow links, helps you build a more natural-looking link profile.

While nofollow links also have the benefit of referring people to your site who may later add an additional link to your profile, they shouldn’t be your main priority. Since dofollow links are a stronger positive signal for your site ranking in general, aim to have more dofollow links. Just remember that nofollow links aren’t to be ignored either. In fact, they may demonstrate to Google that you’re building a solid, organic profile.

 

5 Ways to Get the Most from Your Nofollow Links

 

Not all nofollow links are created equal. The trick to getting the most from nofollow links is to engage your viewers and keep them coming back. Get the most from your nofollow links by incorporating these tips as you link build.

  1. Make your link interesting and relevant. If you’re the one developing the content, make sure the link is in a place that will encourage people to visit it. If someone else is developing the content on their own site, you may not have control over this—but you can always request that the link is higher on the page or in a certain topic area.
  2. Develop interesting content. This is essential for every area of SEO optimization and marketing, and the same is true for nofollow links. To get the most benefit from a nofollow link, you need excellent content: content that keeps people on the page, encourages them to return to your site and establishes you as an expert on the topic.
  3. Create a blend of link types. Make sure your strategy includes a blend of nofollow and dofollow links. Focusing solely on one will hamper your ability to improve your SEO. This is because they both work to reach people and boost your rank in different ways. Experiment with which links work on which sites. Then adjust your ratio accordingly as you see what’s working best for your brand.
  4. Use giveaways and other types of enticements to lead people to click on your nofollow link. One of the challenges of working with nofollow links is finding ways to lure people to click on the link. One way is using giveaways so that the person who wants to enter has to view your page. If you’re offering a product that interests your targeted user base, you’ll be able to pull more of them in when you offer material benefits. For example, if you have someone advertising your product with nofollow links, make those links lead to a contest. If someone is interested in the content, they’ll likely be interested in getting something relating to it for free.
  5. Encourage visitors to share your link on their own pages. Make sure the way to share your page with others is easy to see. This is one of the best ways to transform one nofollow link to several organic dofollow links. Create an even larger organic reach by asking viewers to share your page for an entry into a contest.

 

Here’s an infographic from Searchengineland.com on the nofollow tag and when & how to use it.

 

Search Engine Land writes about the basics of the nofollow tag on their site.

How to Tell Whether a Link is Nofollow or Dofollow?

 

To check whether a link is dofollow or nofollow, right-click on the page in question and select “View page source”. Search the page for the url of your website, and check whether the nofollow attribute is included after the link. If it is, the link is nofollow. When no extra attribute is present, it’s dofollow.

An example of each type of link.

For times when you’re checking a lot of links, a browser extension can automatically mark nofollow links for you. For Chrome, Safari, Opera and Firefox, NoFollow outlines links and detects nofollow and noindex meta tags.
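If you’d rather audit a page’s links in bulk yourself, Python’s standard-library HTML parser is enough for a rough pass. This sketch just collects each link’s href and whether it carries a nofollow rel value:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (href, is_nofollow) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            rel = (a.get("rel") or "").split()
            self.links.append((a.get("href"), "nofollow" in rel))

audit = LinkAudit()
audit.feed('<a href="https://a.test/">A</a>'
           '<a href="https://b.test/" rel="nofollow noopener">B</a>')
```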

 

Nofollow links have been around since 2005 and were a response to blog comment spam that artificially inflated SEO. Nofollow links are a valuable way to increase your organic reach and get your website seen by more people. While many have painted them as pointless, it’s an unfair portrait. They both help manage spam and offer a different way to boost your SEO.

 

If you haven’t been trying to use nofollow links, start now. The nofollow links you sprinkle across the Web may surprise you with the traffic they generate in the future.

How to Increase Your Organic Click-Through Rates
One of the most important aspects of marketing your online business is reaching out and grabbing the attention of new customers. While you can get website visitors from other sources, choosing to focus on your website ranking and search engine position may yield the best results in terms of boosting your visibility. One important metric to consider is your organic click through rate (CTR). Boosting your organic click through rate means getting more people on your website and positioned to buy your products or services.

 

What is Your Organic Click Through Rate?

 

A click through rate (CTR) is the rate at which people who see a link to your website click on it. There are a lot of different ways to get clicks, including advertising or social media posts. What makes a particular click organic is that it doesn’t come from an advertisement: it’s a direct click from a search where you appear because of your SEO, not your advertising budget. The rate at which searchers click on your website in search results is your organic click through rate.

 

The higher your rank, the more people will click on your listing. An increase in your organic click through rate will correspond with an increase in your traffic.

 

Items above the fold on Google–and items that rank higher in the search–see more clicks than others. One factor in your search result placement is how many people click on your search listing. In the screenshot above, there are many factors at play that determine which result comes first. One is the organic click through rate. Since the first Google search result gets 33 percent of the traffic, anything that improves your position can help increase your business.

 

Why is Your Organic Click Through Rate Important?

 

Your organic click through rate gives you a metric for judging how well your keywords, titles, and meta data are doing, and whether you need to adjust them to draw in more website visitors. The fewer clicks your search listing gets compared to other listings, the more you’ll fall in rank. The higher your rank, the more likely your link is to get clicked on by a person who has searched for a particular keyword or phrase.

You can increase your organic click through rate by making adjustments on various aspects of your Web content. Updating past content so that it matches best practices for better organic click through rates and tailoring future content with an eye toward getting more clicks can improve the amount of traffic your website gets. Another benefit to boosting your organic click through rate is that you may see a rise in your search engine positioning. The more people who click on your link, the better you do against competitors fighting to get clicks for the same terms you’re targeting.

There is also very strong evidence that click through data is taken as a ranking signal by Google.

 

Long Versus Short Tailed Search Terms

 

If most of your targeted keywords are short one- or two-word terms, think longer to see an increased click through rate.

A study by Search Engine Watch found that:

“In general, we see a steady and almost monotonic increase in CTR, conversion rate, and conversions per 1,000 impressions as keywords increase in length, with the most “efficient” keywords in the 31-35 character bucket. Note that keywords above 40 characters not only generated a tiny number of impressions – they also weren’t terribly efficient.”

So when you’re setting up your keywords and targeting search terms with an eye toward getting more traffic, try to think of longer phrases that you want to target. If you can hit that term length of 31-35 characters, even better.
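That finding is easy to operationalize when auditing a keyword list; the 31-35 character window below comes straight from the study quoted above, and the sample keywords are placeholders:

```python
def in_sweet_spot(keyword, lo=31, hi=35):
    """True if a keyword falls in the 31-35 character 'efficient' range."""
    return lo <= len(keyword) <= hi

keywords = ["pizza", "best deep dish pizza in chicago"]
efficient = [k for k in keywords if in_sweet_spot(k)]
```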

 

Adjusting Your Titles

 

One of the first things a searcher sees when performing a search is the title of the page. There are two things to focus on when choosing a title.

First, make sure it’s relevant to your content. If a title isn’t relevant and doesn’t properly represent the page, even a well-targeted link may not get a click when a searcher chooses to click another, more pertinent link. Catchy titles that don’t give information about the page or misrepresent it can lead to low clicks or low rates of return visitors.

The second thing to think about is whether the title is dry or appealing. People are more likely to click on titles that tug at their heartstrings, stimulate their interest or seem to offer more utility. Before you title a page, consider whether someone glancing at it will feel a strong desire to click based on the terms you’re targeting. If you think it’s bland or unappealing, try a different title.

When developing a title, try to work in the format of the piece, a power word, the type of content and your subject. All of these together will make an interesting title that draws in more clicks. For example, “Eight Shocking Questions to Ask at Job Interviews” is better than “Job Interview Questions to Ask”.

Search Engine Journal recommends writing titles from different perspectives, depending on which best matches your content. They suggest:

  • The Feel Good Friend
  • The Comedian
  • The Hero or Villain
  • The Bearer of Bad News

Keep in mind that a change in punctuation or capitalization doesn’t create a new title, just a version of the old one that looks different. Use new words and get an entirely new take on the title instead.

Go back to old content and review your titles. It’s a good idea to use targeted search terms in your titles, but that shouldn’t be the end of it. An appealing title that offers utility will draw in a searcher more quickly than just a keyword set. For example, “Children’s Books” is a possible search term that could function as a title. “Ten Children’s Books That Will Inspire You” is more emotional and likely to generate clicks.

 

These titles use strategies for increasing clicks, which have helped them climb to the top of the Google search results for “children’s books about diversity”.

13 Ways to Increase Your Organic Click Through Rate

  • Examine your current click through rate for each page and site you have. List the difference between the highest-performing pages and the lowest performing pages. Once you can clearly see the things that draw in visitors, you can adjust your content strategy to develop more content that encourages people to click on your site rather than a competitor site.
  • Adjust your content to include lists and mention the list in the title. Searchers tend to click lists when offered headlines in a set of links both with and without lists. Most topics can be adjusted to include some type of list that works with your content.
  • Use power words: words that evoke an emotional response. Think adjectives. “25 Delicious Pizzas to Try in Chicago” is more powerful than “Chicago Pizza”.
  • Make custom URLs. When you use descriptive words rather than random numbers and letters in your URL, people are more likely to click it.
  • Don’t worry about pushing your brand in the title of the page. It will be on your website, and may appear in the description of the site or the URL. Use the valuable character space you have to push the title that increases your organic click through rate. The only exception to this is if you’re a known industry leader already or are trying to rank for your brand name specifically.
  • Reference the volume of what you’re offering. Don’t just say that you can recommend some hotels in a specific city. Instead, say the exact number. People respond to titles and descriptions that include actual numbers.

  • Work in punchy facts, where appropriate in the title or description. A short fact worked into your search result may draw attention and clicks. For example: “22% of High School Seniors Don’t Write Passing Essays—We Can Help You Get Into Your Dream College.”
  • If you have a special offer, giveaway or some kind of promotion, mention it in the title. Many people will never read past a title, so leading with your best foot forward can help pull a skimmer in.
  • Focus in. If you have a site that’s designed to lead tourists to the best places in town, a page about Italian restaurants and museums isn’t as targeted and many people will skip it in favor of something more specific. You can always add additional pages to your site, and link from one article to another.
  • When writing your page description, try to answer a question or offer something to the reader. This gives a searcher an early idea of how your page is useful. The more useful your listing appears, the more likely someone is to click on it.
  • End all website meta descriptions with a call to action. When a reader finishes reading what you have to offer—whether it’s a product, a recommendation or information—use the call to action to encourage them to take the final step and click.
  • Create more content. The more pages you have, the more searches you’re able to appear in. There are only so many keywords or phrases you can target with any given page of content; by increasing your content, you’re increasing your reach.
  • Use rich snippets to draw a searcher’s attention to your listing. Rich snippets are a type of data markup that is included in your Google search listing. Once you’ve adjusted your listing to include them, it may show a star rating, a price or other information from your page, which can help distinguish your page in a list of results.
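As an illustration, rich snippet data is commonly added with schema.org structured data in a JSON-LD block placed in the page's HTML. The sketch below uses a hypothetical product, rating and price; the markup shape follows schema.org's Product and AggregateRating types, which Google may use to display stars and a price in your listing:

```html
<!-- A minimal schema.org Product snippet in JSON-LD. All values here
     (name, rating, review count, price) are invented placeholders --
     substitute real data from your own page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Deep-Dish Pizza Kit",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Showing rich results is always at the search engine's discretion, so treat markup like this as making your page eligible for an enhanced listing rather than guaranteeing one.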

 

Target uses detailed keywords in their URL, which can help increase the organic click through rate of that entry relative to other results in a Google search.

Testing Your CTR Strategy

Once you’ve developed a strategy to get higher click through rates, test it. Since there are many different approaches to combine, which ones work for you will depend on your brand, site and audience. Try a few different strategies with similar pieces of content at once to see which draws in more clicks and repeat viewers.

Once you’ve narrowed down the types of changes that work best for you, remember to go back and adjust your older content. Make sure every piece conforms to your new approach for a higher click through rate, which will increase overall traffic to your site, especially if you’re focused on evergreen content.

Don’t stop testing just because you find a strategy that works. Web marketing is a changeable field and what works today may not yield results tomorrow. More importantly, you may come across a new idea, test it, and find that it produces an even higher conversion rate. If you play around with your old content, it also has the benefit of updating the page—which is a positive signal to your Google ranking. So don’t be afraid to keep testing as you continue to grow your business.

Even when you’re trying to boost your organic click through rates, there’s no harm in testing with a paid ad, which may give you more data. Test the same ad with a few different headlines, determine which one performs the highest for conversions, and then use that headline as the title of the page you’re creating. You can use what you learn from that test to title future pages, too.

At the same time, Larry Kim, writing for Moz, says that you shouldn't change the title of a particular page every week, because Google will think it's being "dynamically populated." The same applies to every type of content. Try new things, but don't go overboard. Choose a few pages for testing, not your entire site.

 

Google has a lot of great suggestions for optimizing your titles, too!

Increasing your organic click through rates by improving your search result appearance, offering answers to questions and distinguishing your listing will expose your brand to more people. It’s also beneficial to optimize past content so that you have more listings appearing for more search terms. As you implement and test different ways to increase your organic click through rate, you’ll also be building your customer base and brand—so keep tweaking your site until you have the results you want. Your site will draw more viewers as you optimize your listings and increase the utility of your search results.

The post How to Increase Your Organic Click-Through Rates appeared first on Performancing.

]]>
http://performancing.com/increase-organic-click-rates/feed/ 8
Best SEO Practices for Website Security http://performancing.com/best-seo-practices-website-security/ http://performancing.com/best-seo-practices-website-security/#comments Tue, 28 Mar 2017 10:00:22 +0000 http://performancing.com/?p=13721 Google first indicated that they’d start prioritizing sites with valid security certificates years ago, and soon after rolled out with a significant ranking boost to sites that offered HTTPS certificates. Many sites are still using the older, less secure HTTP protocol and losing out on a valuable way to improve SEO and reach more visitors. […]

The post Best SEO Practices for Website Security appeared first on Performancing.

]]>
Google first indicated years ago that they'd start prioritizing sites with valid security certificates, and soon after rolled out a ranking boost for sites served over HTTPS. Many sites are still using the older, less secure HTTP protocol and losing out on a valuable way to improve SEO and reach more visitors. Just creating an HTTPS site isn't enough on its own, though. Use the best practices below to choose your security certificate, set up your site and roll out the change to get the largest possible boost.

HTTP Versus HTTPS

Hypertext Transfer Protocol (HTTP) is the way messages are sent and received on the Internet. Originally, the information transferred wasn't encrypted and was susceptible to being intercepted and used by a third party. Hypertext Transfer Protocol Secure (HTTPS) was introduced to fight this danger. When someone accesses your site, a Secure Sockets Layer (SSL) certificate is presented to the visitor's browser; the certificate contains the information the browser needs to verify that your website is the one they're trying to access and that any information they choose to input goes to you. The information is also encrypted, so only a party with the right keys is able to decrypt it.

Another benefit of offering HTTPS is that visitors who spend money or input information on your site can trust that the information is secure, that no one can intercept it and that they're viewing the site to which they meant to navigate.

Secure your site with HTTPS – Google 

Choosing Your Certificate

When choosing a security certificate provider, go with a Certificate Authority that has a good reputation and uses the latest technologies to protect your viewers and their information. For example, DigiCert, VeriSign and Comodo are providers with good reputations that offer up-to-date technology and customer support. There are many other Certificate Authorities to consider, however, and you can weigh factors like customer service, location or cost. More established companies may charge higher prices, but also offer more customer service hours or different certificate packages.

When it comes to key strength, Google recommends a 2048-bit key over the less secure 1024-bit key. A 4096-bit key offers stronger security in theory, but neither 2048-bit nor 4096-bit keys have ever been cracked, so both are considered equally secure for now. Choosing a 2048-bit key is as good as choosing a 4096-bit key until attacks catch up to 2048-bit keys and the industry has to migrate to the next, stronger encryption.
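If you generate your own key and certificate signing request (CSR) to send to a Certificate Authority, a 2048-bit key can be created with OpenSSL along these lines. This is a sketch assuming OpenSSL is installed; the domain and file names are placeholders:

```shell
# Generate a 2048-bit RSA private key and a CSR for it in one step.
# "example.com" and the output file names are placeholders.
openssl req -newkey rsa:2048 -nodes \
  -keyout example.key -out example.csr \
  -subj "/CN=example.com"

# Double-check the key size before submitting the CSR to your CA.
openssl rsa -in example.key -noout -text | head -n 1
```

The last command prints the key's parameters, so you can confirm the "2048 bit" size before the certificate is issued.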

There are different certificate setups to consider, too. For example, someone with a basic, one-domain site might only need a single certificate. If you have a multi-domain website (for example, one that has local domains for different countries), a multi-domain certificate might be the better fit. If your site has dynamic subdomains, a wildcard certificate is what you need. The cost will vary, but choosing the right one will protect your Webpage and make sure you get the best SEO boost for all your domains and pages.

Site moves with URL changes – Google

Setup

After you've purchased the certificate to convert your site to HTTPS from a trusted Certificate Authority, you need to set it up correctly using the best practices for SEO. First, instruct the person updating your site to use server-side 301 redirects, which tell a search engine that a page has moved permanently to a new Web address. It's the best way to maintain your ranking while you migrate to the new site.
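On an Apache server, for example, the HTTP-to-HTTPS redirect might be sketched in an .htaccess file or virtual host config like this (assuming mod_rewrite is enabled; other servers such as nginx have equivalent directives):

```apache
# Send every HTTP request to its HTTPS equivalent with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Because the redirect is permanent, search engines transfer the old URL's signals to the new HTTPS address rather than treating it as a separate page.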

Once you've updated the website to HTTPS, Google recommends making sure your site supports HSTS, which switches the visitor to the secure HTTPS page even if they specifically request the HTTP page. Start by serving HSTS headers with a low max-age, then increase the max-age slowly over time, making sure the change isn't negatively impacting your performance. Google also offers HSTS preloading for Chrome, which you can request once you know your page supports HSTS with no problems. You have to change the HSTS headers on your page to the ones that allow preloading before the change can take place, which prevents a third party from adding your site to the preloading request list.
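The progression might look like this. The starting max-age value is illustrative; the final form reflects the published requirements for Chrome's preload list, which asks for a max-age of at least one year plus the includeSubDomains and preload directives:

```http
# Cautious starting point: browsers remember the HTTPS-only rule for 5 minutes.
Strict-Transport-Security: max-age=300

# After verifying nothing breaks, ramp up. This form meets the requirements
# for requesting inclusion in Chrome's HSTS preload list.
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Ramping up gradually matters because a long max-age is effectively a promise: browsers that have seen the header will refuse plain HTTP for that entire period.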

Troubleshooting

When your site is set up with HTTPS, make sure you’re regularly performing a security and wellness check to make sure nothing is negatively affecting your SEO ranking. First, your certificate will expire at a certain point and won’t be valid anymore. Stay aware of your expiration date so that you can set up a new certificate or extend the one you already have. Next, make sure that every certificate you set up is registered to the proper domain name. A syntax or spelling error can prevent your site from using the certificate and keep search engines from recognizing that your security is up-to-date.

Your robots.txt file needs to be set to allow crawling. If it isn't, search engines can't crawl your page, compile information and return your site in search results. By the same token, your site needs to be open to indexing; using the noindex meta tag will keep your pages out of search results entirely.
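A permissive robots.txt, placed at the root of the domain, can be as short as this (an empty Disallow value means nothing is blocked):

```txt
# https://example.com/robots.txt -- allow every crawler to fetch every page.
User-agent: *
Disallow:
```

If a migration tool or staging setup left behind a `Disallow: /` line or a noindex meta tag, removing it is often the single fix that lets the new HTTPS pages back into search results.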

As you update your site, ensure that all the information is migrated to the HTTPS page. Having an HTTP page with more information than your new, secure page will have a negative SEO impact. All site elements should be HTTPS as well. For example, a payment module that isn’t HTTPS secure will lower the security ranking of your entire site.

Keep in mind that Google treats switching from HTTP to HTTPS as a site change, as if you’d switched your page over to a new URL. It can impact your SEO ranking temporarily. Make sure all your old links are updated to have the best possible results with your change.

As long as you stay on top of having the best security protocols, you’ll be sure to get any available SEO boost. Moving from an HTTP site to an HTTPS site that is set up correctly will be a positive signal to a search engine’s ranking algorithm, making it well worth the time and expense of acquiring a security certificate. Google even says that it may further increase the boost that HTTPS pages receive in the future.

 

What Should You Do With Your 404 Pages?

The post Best SEO Practices for Website Security appeared first on Performancing.

]]>
http://performancing.com/best-seo-practices-website-security/feed/ 2
Metadata SEO Best Practices http://performancing.com/metadata-seo-best-practices/ http://performancing.com/metadata-seo-best-practices/#comments Fri, 24 Mar 2017 03:35:53 +0000 http://performancing.com/?p=13705 Metadata is information about a webpage that doesn’t appear on the page itself, but instead is part of the code that makes the website. For example, the page description in the metadata may not appear on the page itself, but will appear on search engine previews and, depending on your site’s code, along with the […]

The post Metadata SEO Best Practices appeared first on Performancing.

]]>
Metadata is information about a webpage that doesn’t appear on the page itself, but instead is part of the code that makes the website. For example, the page description in the metadata may not appear on the page itself, but will appear on search engine previews and, depending on your site’s code, along with the post when it’s shared on social media or through email. The right metadata can influence your click-through rate and help improve your site’s visibility by improving your content and engagement.

What is the Meta Description?

The meta description is simply a description of the webpage you’ve created. The important thing to remember when thinking about meta descriptions is that each one describes a certain page, not your entire website. So while your landing page might describe the overall purpose of your website in the metadata, a page where you sell items would have a different description.

As you can see in the search results for the Starbucks website, each page is described differently so that a viewer knows exactly where they’ll be landing if they click the link. When updating your website to include new metadata or new pages, it’s very important to differentiate between a menu, a sales page, a directory or any other type of page.

In that way, you’re offering your potential viewers more utility before they even get to your page. In the examples above, a viewer interested in signing into their account and a viewer interested in seeing the available food and beverage choices would be able to click through to the information required without extra clicks on the website.

Google won't always use the meta description that you write—sometimes it generates its own from the page content. But it uses yours often enough that the meta description is one of the most important metadata adjustments you can make, and well worth the time it takes to perfect.

Writing a Compelling Description

Your meta description doesn't just inform a search engine seeking out information about the content of your site. It's also the first thing a viewer sees when your site appears in search results. Since your search ranking may be influenced by the click-through rate your site gets, it's important to craft a description that lures visitors in and holds their interest. Boring or irrelevant descriptions could cause a reader to look elsewhere for content.

To write a compelling description, start by stating the purpose of your page. Keep in mind, the description has to be short and punchy—something that grabs the reader's interest and keeps it. You can also use a call to action to convince a viewer to come to your page. Telling them to click the link can actually help convert a searcher from a viewer into a visitor.

You can test different meta descriptions to see which offers the most utility. Even a good meta description may be outperformed by an alternate one, so it's well worth the time it takes to adjust your meta description and then track your traffic to see which offers the most conversions for your search terms.

Meta Description Examples

As you write your meta descriptions, think about what a viewer looks for and what you can offer them in a quick, 155-character snippet to get them to click on your page. For example, a parent searching for a local zoo might prefer one of these descriptions to the other:

“Come see the lions, tigers and bears at the San Diego zoo! Children under 12 are admitted free Monday-Wednesday. Click now to receive money saving coupons.”

“San Diego zoo is open from 9-8 Monday through Sunday, except in cases of inclement weather. We have a wide variety of animals on show.” 

One of those descriptions is more compelling, offers something to a common viewing group (parents with young children), and has a call-to-action telling the viewer to click to receive even more utility. The second one is dry and informative, but offers no real utility or excitement. A person testing meta descriptions might find that one of those works better than the other for getting more clicks.

Since you need multiple meta descriptions for a multipage website, there are plenty of opportunities to try out different strategies.

 

Essential reading:

Meta Description Magic: Think Less about SEO & More about Click-Throughs – Kissmetrics

 

Meta Description Q & A

What is the purpose of a meta description?
The purpose of a meta description is to describe the content of your page in such a way that a person clicks on your link to view your site.

Where do I put my metadata?
Your metadata goes into the html of your webpage. For example, the meta description tag looks like this (for Google Chrome download page):

<meta name="description" content="A fast, secure, and free web browser built for the modern web. Chrome syncs bookmarks across all your devices, fills out forms automatically, and so much more.">

If you don’t design your own website, the person who does can insert the code for you or show you how to adjust it. There are also website and SEO plugins available that have places where you can update your meta description for each page without adjusting the actual html of your website manually.

How many meta descriptions do I need?
Each page of your site should have a different meta description, and each page can only have one. So you should have as many meta descriptions as you do pages on your site.

How do I make Google use my meta description?
You can't choose whether Google uses your description or creates one of its own. The best way to have your meta description used is to make sure it's relevant, offers utility, is well written and comes in at around 155 characters. If it's too long it will be cut off in the listing, while one that's too short may be replaced by Google's own.

 

Using Keywords in your Meta Description

When writing a description or any other piece of metadata, keywords are one of the most important factors. Take the three main keywords or phrases you want to rank for, and try to work them into your metadata in an organic way. In other words, the words or phrases should feel natural in the titles and descriptions, not awkward or forced.

Try writing out a few different test descriptions. You can test them by tracking how many people click through to your website. If one doesn't seem to be getting an acceptable number of clicks, try another description. With time and testing, you can narrow down which will serve you best in the long term. You can also test ranking for different keywords in the same way, seeing which set offers you the greatest visibility.

One way to write a compelling, standout description is to check what your competitors are doing right and wrong. Perform a search for the keywords or phrases you want to target and consider what makes one description better than another. If a description offers value, it’s better than one that doesn’t. Value could be in the form of a free trial, an answer to a question, or expert information. The description should also be welcoming and encourage the people you’re trying to target to click on it.

Since you’re competing for a top slot with paid advertisements, one strategy is to write a description that isn’t selling to the reader. It could make you stand out from advertisement listings that are set above yours.

 

Essential reading:

How to Write an Effective Meta Description Tag – Informatics

 

Titles, Tags and Image Metadata

Metadata also includes the titles of your pages, the tags for your page and the images on your page, all of which can influence your SEO by adding content that the search engines crawl and index. Image metadata influences where pictures on your site appear and helps you get more visibility in image searches, so make sure to include relevant keywords that will attract the right kind of viewers. Titles and tags should also use keywords, though tags are just a list and titles need to be crafted to be organic, just like descriptions.

Titles are one of the most important types of metadata because they give the search engine and the viewer an idea of what to expect on your page. A short, descriptive title that uses one of your targeted keywords is best. According to Moz, titles under 60 characters long will display correctly about 90 percent of the time. If you want a higher probability of your title showing up exactly as you wrote it, keep it at 55 characters. Don't use all-caps titles: capital letters are wider, so fewer of them fit before the title is cut off.

Title Examples

Reflections: Discounted mirrors cut and shipped quickly
This title is 55 characters, including spaces, and includes both the name of the site and what you can expect to find there.

Chai Tea: A list of the teas we offer and prices per ounce
This title is 58 characters, including spaces, and includes the name of the site and what you can expect to find on the landing page. It would be appropriate for a page that isn’t the main page, but rather one with a menu of what’s available and how much a person can expect to pay.

 

Index and NoFollow Tags

Another important consideration when crafting your metadata is the robots meta tag (not to be confused with the robots.txt file, which controls which URLs crawlers may fetch at all). This tag tells crawlers whether they should catalogue the information on your page, store it and share it with searchers, and whether to trust the links on it. For example, "follow" is a better option than "nofollow" for a site that wants a better ranking: "follow" means the search engine can treat the links on your page as trusted and continue on through to the linked webpages, while "nofollow" means the links shouldn't be treated that way. The "index" option is likewise better than "noindex". If you choose "noindex", the search engine that crawls your site won't catalogue your page or share it with searchers. Always use "index" on pages you want to appear in search results.
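In HTML, these directives go in a robots meta tag in the page's head. A brief sketch; since indexing and following links are the default crawler behavior, the first tag is optional but makes the intent explicit:

```html
<!-- Explicitly invite crawlers to index this page and follow its links.
     (This is the default behavior, so the tag is optional.) -->
<meta name="robots" content="index, follow">

<!-- The opposite directive keeps a page out of search results entirely;
     avoid it on any page you want ranked. -->
<!-- <meta name="robots" content="noindex, nofollow"> -->
```

Checking each template for a stray noindex tag is a quick audit that can rescue pages that have mysteriously vanished from search results.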

 

Essential reading:

Follow Links Vs. No Follow Links: Should You Care? – Wordstream

 

Influencing Factors & Final Thoughts

One important factor to consider when writing metadata that will potentially appear in a search engine listing is the length. For the best description, stick to 155 characters or less. Otherwise, the description might get cut off by the listing and the viewer won’t be able to see the entire thing.

Remember that the best metadata in the world won’t work for you unless you can deliver high-quality content, too. Let your descriptions and titles be accurate reflections of the content of your page, so that you don’t lose any SEO advantage you gain by optimizing the code of your Webpage. If you’re updating or improving the content of your site, consider optimizing the metadata for each page that you’ve improved. That way, the new metadata will lure people in to see the high-quality information available on your page.

Keep in mind that some search engines will bold the term that a user is searching for. If you use that term one time in your title and another time in your description, it may draw the viewer’s attention and help encourage conversions. When you write your metadata, try to link the titles and descriptions for each page so they’re working together to target a searcher.

Optimizing your metadata can help improve the visibility of your Webpage and boost your SEO. When your page has a higher rank and more people access it, the products and services you want to share are available to a larger audience and your engagement is increased. Since refining your metadata can help you meet that goal, testing new metadata and keywords is a valuable use of the time you spend developing your site.

 

The post Metadata SEO Best Practices appeared first on Performancing.

]]>
http://performancing.com/metadata-seo-best-practices/feed/ 3
Getting Ahead of SEO Algorithm Tweaks By Offering a Better User Experience http://performancing.com/getting-ahead-seo-algorithm-tweaks-offering-better-user-experience/ http://performancing.com/getting-ahead-seo-algorithm-tweaks-offering-better-user-experience/#comments Wed, 15 Mar 2017 05:47:53 +0000 http://performancing.com/?p=13654 Instead of waiting for a new SEO change to take effect, get ahead of the curve by offering a better user experience now. A focus on improving certain aspects of your site will ensure that you don’t drop down in the search results when new changes are rolled out, which means that you’ll still be […]

The post Getting Ahead of SEO Algorithm Tweaks By Offering a Better User Experience appeared first on Performancing.

]]>
Instead of waiting for a new SEO change to take effect, get ahead of the curve by offering a better user experience now. A focus on improving certain aspects of your site will ensure that you don’t drop down in the search results when new changes are rolled out, which means that you’ll still be visible to the people you want to reach. Waiting until changes are implemented will put you behind—and improving your Web presence now has the added benefit of creating a more relevant, attractive and secure website.

Changes to SEO Algorithms

Changes to SEO algorithms tend to target areas like content, user experience, and website security. For example, the Panda update from Google targeted low information sites that were set up to get views, but not to give viewers useful information. Other recent changes have favored sites with a stronger security setup. Changes in 2017 will favor sites that cater to mobile users instead of only desktop viewers. Each change in SEO algorithm determines who gets a higher rank in search engine results, meaning that your page will be more visible than those with equally relevant content that don’t comply with the recommendations and requirements of the search engine.

Essential reading:

Google Panda Update – SearchEngineLand

Content

The best and most important way to get ahead of SEO algorithm changes is to offer quality content that’s updated regularly and is relevant to the keywords you’re targeting. More than security or user experience, SEO algorithms are created to make sure users are able to find and access the content that offers them the most benefit. Sites that aren’t updated regularly with new information may drop a few spaces in a list of results.

If you're having trouble creating timely content, look no further than a company blog. It's easy to create a page with regularly updated information to help keep your website relevant. Even posting once a week can put you ahead of other site owners who have a static site that only offers basic contact information and a list of services or products. As a bonus, offering more information will help you get return visitors who want updates on the topics you're discussing.

Since the single most important factor in SEO is the information you’re providing, never skip the content when considering how to get more visibility online. Current, comprehensive content presented in a direct, easy-to-access way will do more to help you stay ahead of algorithm changes than anything else you can do. The trick is to make sure that the information you’re providing is relevant, well written and uses words specific to your industry, product or service to help people find your site after it has been indexed by search engines.

User Experience

As more people perform searches on mobile devices, having a website with a positive user experience for a person on a smartphone or tablet is essential. There are several types of mobile optimization you can use. If you only offer a desktop site, you’ll lose out when algorithms that benefit progressive mobile design continue to be rolled out.

Essential reading:

Mobile Fact Sheet – Pew Research Center

A mobile site is a website specifically designed to display when a mobile device is detected. This means that you have two versions of your website—and one doesn’t always include the same content and functionality as the other. If your mobile site doesn’t offer the same information and usefulness as your desktop site, your SEO ranking will drop.

The other type of mobile site you can use is a responsive site. This site offers all the same information and features as your desktop site, but appears in a mobile-friendly format. SEO algorithms that benefit mobile sites prefer responsive design to a specific mobile site.

Another popular choice is a web app, which is a site that looks and behaves like a mobile application. It isn't installed on the person's device—it just has the same appearance and functionality. Web apps also load quickly and put your information in front of the viewer in less time than either other type of mobile site. Choosing either a responsive design or a web app can help you increase your SEO.

Website Security

Another area that search engines target to weed out low quality submissions is website security. If your site doesn’t have HTTPS, consider upgrading. HTTPS is a protocol where your site gets a certificate that authenticates the site’s identity, and also encrypts the data and prevents it from being altered while it’s being transferred to and from the site. If you’re selling goods and services, this kind of protection is even more important because it allows consumers to spend money on your site without concern for the security of their financial information.

Essential reading:

Secure your site with HTTPS – Google Webmasters

Setting up your certificate and site the right way will offer you the most benefit. Google recommends selecting a 2048-bit key when setting up your HTTPS certificate, which offers more security than a 1024-bit key. Choose a certificate issuer who offers technical support if you've never worked with a security certificate before. Make sure that your site is set up so that HTTPS pages can be crawled by Google—if they can't, you might not be available on search results at all. To do this, check that HTTPS pages aren't blocked by robots.txt files and that the pages don't include meta noindex tags.

Essential reading:

Mobile Optimization – Moz

Not only will the information transmitted to and from your site be more secure, but you will also see a boost in your SEO rankings. Google, for example, prefers sites that use HTTPS over sites that use HTTP. It’s only a slight boost as of 2017, but the possibility always exists that SEO algorithms will be tweaked again to offer a more substantial boost. Getting HTTPS certified now will help you take advantage of any security-related changes in how rankings are determined.

Don’t Wait

Making small changes to your website now will help you avoid losing clicks when a new SEO algorithm change debuts. All changes are designed to better filter search results to help users find relevant pages, so making sure you're relevant and have good content will keep you ahead of the game. Since your site will be more visible after changes are processed, it's a good practice to stay up to date on each relevant area so that you don't miss out on potential clients or customers. Ultimately, making sure your user experience, content and website security are up to par won't only help you avoid being penalized by changes in SEO algorithms; it will also give you a site that people are more interested in returning to and using.

The post Getting Ahead of SEO Algorithm Tweaks By Offering a Better User Experience appeared first on Performancing.

]]>
http://performancing.com/getting-ahead-seo-algorithm-tweaks-offering-better-user-experience/feed/ 3
What Should You Do With Your 404 Pages? http://performancing.com/what-should-you-do-with-your-404-pages/ http://performancing.com/what-should-you-do-with-your-404-pages/#comments Fri, 11 Nov 2016 13:03:46 +0000 http://performancing.com/?p=13361 Having 404 pages is a perfectly normal part of a website and Google expects this. However, how you handle 404 pages and the visitors that land on them is important if you want to keep these people on your site. What Are All The Different HTTP Status Codes? Here we are mainly focusing on 404 […]

The post What Should You Do With Your 404 Pages? appeared first on Performancing.

]]>
Having 404 pages is a perfectly normal part of a website and Google expects this. However, how you handle 404 pages and the visitors that land on them is important if you want to keep these people on your site.

What Are All The Different HTTP Status Codes?

Here we are mainly focusing on 404 status codes, but it’s important to know what the other major codes mean when talking about 404s. You can see an exhaustive list of codes at W3.org, but there are really just a handful of main ones you need to know, and these are visualized in this infographic from the team at Moz:

[Infographic: guide to HTTP status codes, via Moz]

So I Should Just Redirect 404s Elsewhere Right?

Not really. It depends on various factors, such as: how much traffic that page gets, what the people who land on it are looking for, and whether there are backlinks and SEO value pointing to it.

If you have a 404 page that is getting a significant amount of traffic, look at how visitors are getting to this page and what the source is. If it’s from a source you can control, such as somewhere else on your site or an advertisement, then make sure these visitors end up in the right place on your site and not on a 404 page.

If they came from another site linking to an expired page, it is best to redirect them to either the new page or the one closest to what the old page was about. However, redirecting isn’t always the answer; simply sending a user to your homepage might not be the best action, because if they don’t see what they came looking for within a few seconds they will go back to the search results. Sometimes keeping them on the 404 page with a short note to say sorry, and then offering other options such as pages similar to what they were looking for, category pages and/or the homepage, lets the user know that even though they’ve landed in the wrong place they might be able to find what they need elsewhere on your site.

Finally, you need to look at any backlinks pointing to this 404 page. If there are significant or highly valuable links pointing to one of these pages (and assuming you can’t get the linking source to change the link) then you may want to setup a 301 redirect so that the valuable link juice flows through to either your homepage or other part of your site that you want to rank.
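On an Apache server, for instance, such a permanent redirect can be declared with a one-line mod_alias rule (a hypothetical sketch; the paths and domain are made up, and the syntax differs on nginx or other servers):

```apacheconf
# .htaccess — send the old, link-rich URL to its closest replacement
# with a 301 so the link equity follows the redirect.
Redirect 301 /old-product-page/ https://www.example.com/new-product-page/
```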

Going back to the point about 404 pages that are receiving traffic: if you do intend to keep the user engaged on the site so that they hopefully click through to another part of your website, then setting up a custom 404 page is a good way to go about it. Google advises doing the following for these custom 404 pages:

  • Inform the user that what they were looking for is not there and that they are on a 404
  • Keep the look and feel of the page the same as the rest of the site
  • Use links to the most popular parts of your site
  • Make sure the page is returning an actual 404 status code as you don’t want Google attempting to index it
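On Apache, for example, the last point can be covered with an ErrorDocument rule (a hypothetical sketch; the file path is made up). Note that a local path preserves the real 404 status, while a full external URL would turn the response into a redirect:

```apacheconf
# .htaccess
ErrorDocument 404 /custom-404.html            # local path: response keeps the 404 status
# ErrorDocument 404 http://example.com/404/   # avoid: this issues a redirect instead of a 404
```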

404notfound.fr has a huge collection of custom 404 pages where you can get plenty of inspiration for making your own.

What is Your Crawl Budget and Why You Need To Know This

 

The post What Should You Do With Your 404 Pages? appeared first on Performancing.

]]>
http://performancing.com/what-should-you-do-with-your-404-pages/feed/ 8
What is Your Crawl Budget and Why You Need To Know This http://performancing.com/crawl-budget-need-know/ http://performancing.com/crawl-budget-need-know/#comments Tue, 01 Nov 2016 07:48:55 +0000 http://performancing.com/?p=13348 If you are running a large website with many pages then it is essential that you know exactly what a crawl budget is, how it’s affecting your site, what your budget is and what to do if it’s not enough. What exactly is a crawl budget? The crawl budget is how many times the Google […]

The post What is Your Crawl Budget and Why You Need To Know This appeared first on Performancing.

]]>
If you are running a large website with many pages then it is essential that you know exactly what a crawl budget is, how it’s affecting your site, what your budget is and what to do if it’s not enough.

What exactly is a crawl budget?

The crawl budget is how many times Google’s bots or spiders crawl pages on your website within a given period of time. If you only have a small or medium-sized site, this is most likely not going to be a problem. If you have a large site with hundreds, thousands or even tens of thousands of pages (such as an e-commerce site or a news/media site), then you need to know that Google is crawling as many of these pages as possible, and that if changes are made to these pages they are re-crawled soon after.

The main factor that affects the crawl budget is PageRank, so for large and established sites it’s often not a problem. However, if your site is relatively new and you are adding many new pages to it, the lack of PageRank could be a problem.

Matt Cutts summarized crawl budget perfectly in this interview published at Stone Temple some years back:

“The first thing is that there isn’t really such thing as an indexation cap. A lot of people were thinking that a domain would only get a certain number of pages indexed, and that’s not really the way that it works. There is also not a hard limit on our crawl. The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”

That interview was way back in 2010 and there have been many changes to how Google crawls sites since then, such as the Caffeine update in June of that year; Google is able to crawl more pages, and a lot faster, now. But what Matt said back then about Google focusing on pages with more authority still remains true; those pages are just going to be crawled with greater frequency now.

How does Google crawl your pages?

First, the Google spider will look at your robots.txt file to see what it should and shouldn’t be crawling and indexing. The budget part is how many of these URLs Google decides to crawl per day, which is determined by the health of your site and the number of links pointing to it.

How to check the health of your crawl budget

First, check the total number of pages your site has in its XML sitemap; usually this will be at the root of your site, e.g. Yourdomain.com/sitemap.xml. Quick tip: if you don’t have a sitemap set up and are running a WordPress site, we strongly recommend the Yoast SEO plugin, which will do all of this for you with just a few clicks 🙂

[Screenshot: XML sitemap index file]

Within your sitemap XML file there will be other sitemaps for different parts of your site, e.g. a sitemap for blog posts, one for different authors or users, and so on. Go into each of these and get the total number of pages for each.

Once you have your total number of pages, go into your Google Webmaster Tools account and then to Crawl > Crawl Stats in the left-side menu, where you will see the pages crawled per day, as in the image below.

[Screenshot: Crawl Stats showing pages crawled per day]

Then to find out your crawl budget simply divide the total number of pages your site has by the average number of pages crawled per day.

If your final number is less than 10, you are fine. If it’s more, you have a problem: Google has not allocated you a large enough crawl budget, so not all of your pages are being crawled. This needs to be fixed.
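The calculation above is just a division; as a minimal sketch (the page counts are invented example numbers, not real site data), following the article’s rule of thumb:

```python
# Rough crawl-budget health check, per the article's rule of thumb:
# divide total sitemap pages by average pages crawled per day.
# If the ratio exceeds 10, pages are going uncrawled for too long.

def crawl_budget_ratio(total_pages, avg_pages_crawled_per_day):
    """Total indexable pages divided by pages Google crawls per day."""
    return total_pages / avg_pages_crawled_per_day

# Hypothetical site: 25,000 pages, Google crawls ~1,250/day.
ratio = crawl_budget_ratio(total_pages=25_000, avg_pages_crawled_per_day=1_250)
print(ratio)                                   # 20.0
print("problem" if ratio > 10 else "fine")     # problem
```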

My crawl budget is bad, now what?

First off, you need to find out whether Google is hitting any crawl errors on your site. Your server logs are a good place to start; look for any 404s and redirect them if possible, or fix the pages. 301s and 302s are OK as long as they redirect to the correct places.

Once you have cleaned up crawl errors your next step should be to look at how Google is crawling your site.

How to sculpt where Google bots go

Remember, there is a finite number of pages on your site that Google will crawl in a given period; however, Google’s bots will parse anything put in front of them, so we need to make sure they aren’t crawling pages that aren’t important to your site.

Robots.txt file – use this at the top level to disallow bots from crawling entire sections of your site

Noindex meta tag – this can be used at a finer level, on individual pages, so that they will not be indexed

Nofollow tags – these can be used at an even more granular level, on individual links to pages, but if you don’t add the tag to every link pointing to the page, Google will still be able to find it
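The three levels above might look like this in practice (all URLs and section names here are invented for illustration):

```text
# robots.txt — top level: keep bots out of a whole section
User-agent: *
Disallow: /internal-search/

<!-- Per page: allow crawling but keep this page out of the index -->
<meta name="robots" content="noindex">

<!-- Per link: don't pass authority through this one link -->
<a href="/printer-friendly/article-123" rel="nofollow">Print version</a>
```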

Knowing how and when Google’s bots are crawling your site is crucial for mid-range to large sites, especially ones that don’t have much authority and are competing against more established sites. Webmasters have to ensure that these bots are seeing and crawling the most important parts of their site.

For more in-depth and further information on how Google is crawling websites see this Google hangout with John Mueller and Andrey Lipattsev.

 

Stop Letting Visitors Slip Through Your Fingers By Implementing These Conversion Tips

 

The post What is Your Crawl Budget and Why You Need To Know This appeared first on Performancing.

]]>
http://performancing.com/crawl-budget-need-know/feed/ 2
Analyzing Organic Keywords With Google Webmaster Tools http://performancing.com/analyzing-organic-keywords-google-webmaster-tools/ http://performancing.com/analyzing-organic-keywords-google-webmaster-tools/#comments Thu, 20 Oct 2016 11:30:42 +0000 http://performancing.com/?p=13319 It has been some years now since Google took away a lot of the data in Google Analytics for keywords that organic traffic is coming to our sites by. What surprises us the most here at Performancing though is the number of business owners and webmasters that aren’t aware of how to get this data […]

The post Analyzing Organic Keywords With Google Webmaster Tools appeared first on Performancing.

]]>
It has been some years now since Google took away much of the Google Analytics data showing which keywords bring organic traffic to our sites. What surprises us most here at Performancing, though, is the number of business owners and webmasters who aren’t aware that they can get this data from their Google Webmaster Tools account. This is a crucial method for isolating individual pages and seeing which search queries are bringing visitors to a page from search engines. So we decided to put together a quick tutorial to demonstrate how it’s done.

After logging in to your Webmaster Tools account, go to Search Traffic > Search Analytics and make sure you have Queries selected; you should then see all your major search queries listed below.

 

[Screenshot: Search Analytics with Queries selected]

Next click Pages and you will see the most visited pages for your site listed below.

 

[Screenshot: Search Analytics with Pages selected]

Now look for the page you want to dig deeper into to find out which keywords bring people to it. Click the URL and it will then be shown under the Pages section. Next, click Queries again and below you will have a list of all the organic search terms people used to find your page on Google.

 

[Screenshot: organic queries for the selected page]

Further data can then be acquired on these search terms with the Clicks, Impressions, CTR and Position tabs selected.

 

[Screenshot: Clicks, Impressions, CTR and Position data]

Finally, you can isolate individual search terms under Queries and then select Pages to see which pages are getting search traffic for a specific term.

[Screenshot: pages receiving traffic for the selected query]

We hope this quick Google Webmaster Tools tip was useful. Are there any tips or tricks you use for analyzing your search data from Google?

The post Analyzing Organic Keywords With Google Webmaster Tools appeared first on Performancing.

]]>
http://performancing.com/analyzing-organic-keywords-google-webmaster-tools/feed/ 4
Identifying Traffic Drop Causes http://performancing.com/identifying-traffic-drop-causes/ http://performancing.com/identifying-traffic-drop-causes/#comments Fri, 09 Oct 2015 13:21:07 +0000 http://performancing.com/?p=13164 It’s the situation that every webmaster or SEO person dreads, they login to their Google Analytics account only to see that their traffic has nose dived into oblivion. Now is not the time for panicking though! Websites and how they acquire traffic are a lot more complex than they used to be so finding the […]

The post Identifying Traffic Drop Causes appeared first on Performancing.

]]>
It’s the situation that every webmaster or SEO dreads: they log in to their Google Analytics account only to see that their traffic has nosedived into oblivion. Now is not the time for panicking, though! Websites, and how they acquire traffic, are a lot more complex than they used to be, so finding the source of the problem needs to be tackled correctly and methodically. Here’s a short guide to some of the main areas you should investigate.


Look Closely At Traffic Source Changes

The first thing you want to do when you see the traffic dip is to find the exact source. Go to Acquisition > Overview and check all your traffic sources; if it’s an SEO issue, your organic traffic will be down. If you have a lot of referral traffic from other sites or social media platforms and this is down, make sure any ads or links you have there are still up and running.

Inspect On Page Elements

Is the title tag still the same, and is all the other meta information correct? For high-traffic, highly competitive terms, changing an optimized title tag can cause a page or site to plummet from page one of the rankings, and all your organic traffic will go with it. Other things to look out for are noindex tags and nofollow tags on your inner page links. If all the links to your inner pages are nofollowed, Google’s bots cannot follow these links and the pages will lose their rankings. H1 and H2 tags are also very important, and something as simple as removing a target keyword from these can cause a page to suffer.

Login to Google Webmaster Tools

Here you need to check your site messages; if you have received a manual Google penalty you should have a message here, though note that many Google penalties (e.g. Panda) do not come with a warning. Then check the following:

Index status – here you can see how many pages on your site Google has indexed, a sudden drop in these means there is a problem.

Crawl errors – if your site has experienced any downtime or has too many 404 pages, this will not only have a major effect on how Google sees your site but can also cause users to leave and not come back!

Robots.txt status – just like on-page elements, the robots.txt file can be very powerful and can cause a huge number of problems if not set up correctly. Make sure that Googlebot is allowed to crawl the correct pages and parts of your site using the tester tool. Here’s a fantastic guide from Yoast on how to correctly configure a robots.txt file.

Check Any Page Redirects

If you had any pages set up to redirect elsewhere, make sure all those 301s are still in place. If any of them have been removed or broken, all the traffic going to those pages is going to see a big, fat 404 page; too many of these and Google will severely penalize your site!

These are just some of the first things you should check when investigating a traffic drop, but many other issues, from canonical tags being removed to footer and navigation links breaking, can cause your site to crash. Most, if not all, of these problems are brought on by humans making changes to your site, which is why it’s crucial to keep a log of all changes and edits you or your team make to your website.

How To Filter Out C Language or ‘Bot’ Traffic From Google Analytics

The post Identifying Traffic Drop Causes appeared first on Performancing.

]]>
http://performancing.com/identifying-traffic-drop-causes/feed/ 17
5 Old (But Still Relevant) Content and Link Building Strategies….And 1 New One http://performancing.com/5-old-but-still-relevant-content-and-link-building-strategies/ http://performancing.com/5-old-but-still-relevant-content-and-link-building-strategies/#comments Tue, 26 Aug 2014 07:55:32 +0000 http://performancing.com/?p=12990 Infographics still work….sort of Infographics started to gain traction around 2010 and then saw an explosion across the web for the next 2 years as they grew in popularity. As with all new content and link building strategies SEO and marketing firms were all over them like a bad rash until Google stepped in (as […]

The post 5 Old (But Still Relevant) Content and Link Building Strategies….And 1 New One appeared first on Performancing.

]]>
Infographics still work….sort of


Infographics started to gain traction around 2010 and then exploded across the web over the next two years. As with all new content and link building strategies, SEO and marketing firms were all over them like a bad rash until Google stepped in (as they always do) and reduced the weight of signals from links within infographics. However, this doesn’t mean you can’t still use them. The field is pretty crowded these days, so you really need to dig deep to come up with an interesting topic and the data to go with it; after that you need a killer design that displays the information clearly to get your message across. This is the hardest part, but if you have a well-designed infographic on a unique and interesting topic, people will still accept it for publication on their sites, and they are more likely to share it and link back to it. On top of that, there are many infographic galleries you can seed your design on, and again, the better the design and the more unique the topic, the more likely gallery sites are to accept it. Links from infographics no longer carry the weight they used to, but they remain a solid link-building strategy.

Refreshing or updating old content


This is a great tactic as it gives you a legitimate reason to push some old content to the front of your website or blog again and share it across Facebook and Twitter. If you have any kind of list post or resource post you wrote in the past, go over it again: is there any new content or information you can add? Nothing ever stays the same, and there should always be something new to add; a resource page that is constantly evolving is more likely to be linked to.

Guest Blogging (yes this is still an option)


Guest blogging was the popular ‘go to’ link building option throughout most of 2012 and 2013. As with all the latest SEO tactics it was abused, and Google once again had to take action. Matt Cutts first made some warnings about the quality of the content and links that people allow from guest bloggers at the end of 2012, and then again several times in 2013 and early 2014. Finally, in spring of this year, My Blog Guest and many blogs and websites associated with it were heavily penalised. In the days and weeks after, the guest blogging ‘scene’ went very quiet, as no one knew what to do next. Slowly it has been coming back, though, and what we are seeing now is that many sites and publications have taken a much stricter stance on what they accept. This means guest blogging is moving back to what it originally was, reclaiming its position as a source of trusted content and links.

Sponsoring events


‘Sponsored’ is a dirty word in SEO and link building: if you pay for a link, it should not be a normal, followed link. But if you help out at an event or on a podcast, your name is going to be mentioned anyway, and this is more than likely to bring links too, links that have been earned naturally.

Interviews


An old strategy and still a good one. Many people make the mistake of trying to interview only the biggest names in their niche; if you can get these people then that’s great, but chances are they won’t have the time. Look for people who are new to the scene or have just started a new project, as they are more likely to be keen to talk or have something to say. A good example of this is over at our new crowd-funding site Stiqblox.com: we often reach out to creators with the offer of an interview when they launch their campaigns on Kickstarter or IndieGoGo, as this is when they are hungry to talk to the press. But you need to catch them before they hit their funding target, because after that they simply don’t have the time and/or don’t need the coverage anymore!

An ice bucket challenge?


I’m sure you have seen celebrities across the globe doing this for the past month or so for the ALS charity. It’s guerrilla and viral marketing at its best, and companies should get involved by taking the challenge while it’s still viral; this way you get to help out a good cause while getting a little exposure and coverage for your brand.

27 Tips for Building a Kick-Ass Blog

Taking Your SEO Back To Basics

Image sources

One, two, three, four

The post 5 Old (But Still Relevant) Content and Link Building Strategies….And 1 New One appeared first on Performancing.

]]>
http://performancing.com/5-old-but-still-relevant-content-and-link-building-strategies/feed/ 31
A/B Testing and SEO: What You Need to Know to Succeed http://performancing.com/ab-testing-seo-need-know-succeed/ http://performancing.com/ab-testing-seo-need-know-succeed/#comments Fri, 08 Aug 2014 12:00:14 +0000 http://performancing.com/?p=12981 Being indecisive is something that companies often struggle with when it comes to decisions about a website, but ironically enough a lot of this indecisiveness comes from the indecisiveness of consumers. You want one thing one minute, and then something turns you off the next. This is why A/B testing is so important for small […]

The post A/B Testing and SEO: What You Need to Know to Succeed appeared first on Performancing.

]]>
Being indecisive is something that companies often struggle with when it comes to decisions about a website, but ironically enough a lot of this indecisiveness stems from the indecisiveness of consumers. You want one thing one minute, and then something turns you off the next. This is why A/B testing is so important for small businesses. It ensures that your website satisfies the majority of consumers, and it allows you to adjust as consumers modify their own purchasing decisions.

This then brings up the question of SEO. If you’re constantly changing your website, how can you be sure that your SEO is going to remain intact? Furthermore, is it all worth it in the end?

The SEO Repercussions of A/B Testing

It’s first important to understand that A/B testing doesn’t necessarily mean you’re changing your site entirely; that would be a redesign. A/B testing is just creating different versions of (usually) one webpage and then testing them to see which gets the best results. Companies typically use A/B testing software to run a test. The company adogy.com, for example, got creative with their “work” page thanks to A/B testing software, which you can check out by visiting the link. Each version has its own URL, and when someone visits the page, the software redirects some visitors to one version and some to another.

Despite different URLs, however, there are a few ways you can make sure that you have even higher visibility because your SEO remains intact:

  • Use 302 redirects. These are only temporary redirects.

It’s important to remember that you don’t want to use a 301 redirect. 302 redirects are temporary, which gives the search engine bots a heads-up that the page they have found might not be there for long. This will then send them back to your usual URL and keep that URL indexed, not the temporary one.
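As a sketch in Apache syntax (the URLs are hypothetical, and other servers or A/B testing tools would express this differently), the variant redirect uses a 302 precisely so the move is marked as temporary:

```apacheconf
# .htaccess — temporary redirect for an A/B test variant
Redirect 302 /landing/ /landing-variant-b/
# Redirect 301 /landing/ /landing-variant-b/   # avoid: tells search engines the move is permanent
```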

According to Adam Heitzman, Managing Partner of SEO Company HigherVisibility, “this also helps if you are planning to re-use these temporary pages in the future for other tests or if you have any users who might have bookmarked one of your testing pages or linked to it from another website. This is a common occurrence, and a 302 redirect will make sure that the link juice isn’t lost when the page is gone.”

  • No cloaking. Do not show users and Google different webpages.

This is an SEO black hat tactic that is against the Google Webmaster Guidelines. You cannot trick search engine bots into following one webpage but show another to users. This could accidentally happen when performing an A/B test, so keep that in mind to avoid being penalized (using a 302 redirect will help make sure this doesn’t happen by accident).

  • Rel=”canonical” link attribute. This tag lets Google know a page is a test.

You want to use this tag on all of the webpages that you are testing so that Google knows they are only tests. Usually duplicate versions of each page will be created when testing, and although it’s very rare for Google to index one of your test pages, it can happen, especially if the test has been running for a while. The canonical tag on those duplicated URLs will make sure that they are not indexed.

However, you should not use this tag on your control page. You can watch this video to learn more specific information. Extra tip: you can also choose to block the bots from your test pages by using a robots.txt disallow rule or a robots meta tag with noindex, both of which you can learn more about here.
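A test variant’s head would then point back at the control page (the URLs here are hypothetical), so that only the original gets indexed:

```html
<!-- In the <head> of /landing-variant-b/ : canonicalize to the control page -->
<link rel="canonical" href="https://www.example.com/landing/">
```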

  • Timing. Make sure you aren’t running your tests too long.

You don’t want to run a test for more than a few (2-3) months. If you run the test too long, Google might notice and ask you to remove your other test pages. In other words, you want to make your decision in a reasonable amount of time so that you, not Google, control what stays.

The Takeaway

Google unfortunately hasn’t addressed the subject of A/B testing officially since 2009, when they announced that they endorse A/B testing and actually want marketers to engage in these tests. It’s a great way to offer readers the best possible information in the best way, so not only is it something that won’t hurt your SEO, it’s something that will hopefully help it.

What has been your biggest problem with A/B testing? Do you have a story about losing your SEO value in the process? Let us know your experience and tell us your thoughts in the comments below.

Photo Credit: customersure.com

The post A/B Testing and SEO: What You Need to Know to Succeed appeared first on Performancing.

]]>
http://performancing.com/ab-testing-seo-need-know-succeed/feed/ 1
Don’t Forget about Semantic SEO in 2014 http://performancing.com/dont-forget-about-semantic-seo-in-2014/ http://performancing.com/dont-forget-about-semantic-seo-in-2014/#comments Thu, 09 Jan 2014 13:00:34 +0000 http://performancing.com/?p=12640 The New Year is the perfect time to try new things and see what works for your company, but there are a few things that you really have to try in 2014 if you want to keep up and be successful. Semantic SEO, a new kind of SEO strategy, is something that’s going to be […]

The post Don’t Forget about Semantic SEO in 2014 appeared first on Performancing.

]]>
The New Year is the perfect time to try new things and see what works for your company, but there are a few things that you really have to try in 2014 if you want to keep up and be successful. Semantic SEO, a new kind of SEO strategy, is something that’s going to be crucial this year.

Based on the Google updates and changes made last year, it’s clear that Google is trying to move their algorithm toward analyzing sites based on other factors besides just keywords. As Google evolves, it’s important you understand how semantic SEO works so you don’t fall behind.

A Quick Recap: How Semantic SEO Works

For those who are unfamiliar, semantic SEO is about optimizing your content for related terms and connections that a searcher might make to that content. For example, using the word “car” in your content frequently or optimizing your page for that term isn’t going to be quite as important anymore. Now, terms like “automobile,” or even related topics such as “mechanics” or “car loans” will all be taken into account when it comes to rankings and should therefore be taken into account when writing and optimizing.

Of course, there is a little bit more to the idea of semantic SEO (along with what many call the “semantic web”), as well as tips to create a semantic strategy. Visit this article to learn more about how it all works, and this article to learn about some tools that can help.

Why Semantic SEO Will be More Important in 2014 Than Ever

The idea of semantic SEO has been around for quite some time, but there were two different signals and hints we got from Google this past year that can lead us to believe it’s only becoming more important to understand:

  • The Hummingbird Update: The Hummingbird update was all about revamping the algorithm to keep up with how people search today: more conversational and less keyword-based. It was the first update of its kind in over 10 years, so it made quite a stir in the SEO community. It shows that Google is ready to make serious changes in order to keep up with how search is changing.
  • Knowledge Graph: The Google Knowledge Graph shows up for certain subjects on the right-hand side of a Google SERP. It shows relationships between different things across the web and helps cut down on ambiguity. This past year, Google added more categories and filters to the Knowledge Graph. Focusing on semantics will help you target the concepts related to your content and articles, which will help you get more involved in this way of organizing the web.

Have you used semantic SEO in the past? Is there any other SEO method you think is going to be a necessity in 2014? Let us know your story and your thoughts in the comments below.

Photo Credit: copywritingservicespro.com

<a href="https://plus.google.com/u/0/113948183183915741351/posts?rel=author">Amanda DiSilvestro</a> gives small business and entrepreneurs SEO advice ranging from keyword density to recovering from Panda and Penguin updates. She writes for HigherVisibility.com, a nationally recognized SEO agency that offers online marketing services to a wide range of companies across the country.

The post Don’t Forget about Semantic SEO in 2014 appeared first on Performancing.

]]>
http://performancing.com/dont-forget-about-semantic-seo-in-2014/feed/ 1
Matt Cutts Details the Most Common SEO Mistakes: What This Means to You http://performancing.com/matt-cutts-details-the-most-common-seo-mistakes-what-this-means-to-you/ http://performancing.com/matt-cutts-details-the-most-common-seo-mistakes-what-this-means-to-you/#comments Wed, 20 Nov 2013 13:00:21 +0000 http://performancing.com/?p=12613 Whenever Head of Google Webspam Matt Cutts gives SEO advice, it’s worth dropping everything you’re doing. Although sometimes his advice is vague or answers an obvious question, you know that whatever he says is true. He also has insights into the search market that few others have, so when he shares opinions and answers using […]

The post Matt Cutts Details the Most Common SEO Mistakes: What This Means to You appeared first on Performancing.

]]>
Whenever Head of Google Webspam Matt Cutts gives SEO advice, it’s worth dropping everything you’re doing. Although sometimes his advice is vague or answers an obvious question, you know that whatever he says is true. He also has insights into the search market that few others have, so when he shares opinions and answers using those insights, it’s a great way to learn more about the industry. In his latest video, Cutts did just that when he laid out the most common SEO mistakes made by small businesses. It’s important to ask yourself: Am I making these common mistakes, and if so, why?

Top 5 SEO Mistakes According to Matt Cutts

The latest video gives you the feeling that you’re not alone when it comes to some SEO struggles. The exact question asked of Cutts in the video was this:

“What are the top 3-5 SEO areas where webmasters make the most mistakes? How can we do better on those?”

Cutts explained that the biggest mistakes webmasters make are not very advanced or detailed, but rather seem pretty simple. Nonetheless, these are the mistakes that are most common (not most devastating). A few of the most common mistakes that Cutts explained, in order, include:

  1. The website isn’t crawlable. If your website is locked up and visitors can’t click through your pages, or you’re somehow hiding your most important content, your website isn’t crawlable and Googlebot won’t be able to index and rank you properly (not to mention users won’t be able to navigate your website well).
  2. You don’t include the right words on the page. Think about what the user is going to type and then include those words. It’s also a good idea to put your business hours on the page, or if you’re a restaurant include the menu as plain text and not just a PDF.
  3. You think about link building and not compelling content and marketing. This is one piece of advice Cutts can’t seem to give enough, and he adds it here as well. Thinking about link building cuts off things like talking to newspapers and other avenues.
  4. You’re forgetting about the title and description of your really important pages. Really pay attention to your homepage. You want what is showing in your snippet to urge people to click, and you want those who want to revisit your site to have a homepage title they’ll remember.
  5. Not using webmaster resources. This includes Google Webmaster Tools, the official blog, Google’s videos, the Google Webmaster forum, going to conferences, etc.
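Point 1 above, crawlability, is easy to sanity-check programmatically. A minimal sketch using Python's standard-library robots.txt parser — the rules and URLs here are made-up examples (example.com is a placeholder), not any real site's configuration:

```python
from urllib import robotparser

# Hypothetical robots.txt rules: block /private/, allow everything else
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Pages under /private/ are blocked for all crawlers; the rest is open
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/private/draft"))  # False
```

If a page you care about comes back `False` against your live robots.txt, you have found one of the "locked up" situations Cutts is describing.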

What This Means to You

Understanding the most common SEO mistakes that people make can help you ensure that you’re not making the same ones. Go through your website, check these five points, and make changes if necessary. This will help you move forward so that simple problems aren’t holding you back the way they hold back so many other webmasters.

What are some of the most common SEO mistakes that you’ve experienced or seen webmasters make? Let us know your story and your thoughts in the comments below.

Photo Credit: brandignity.com

Amanda DiSilvestro gives small businesses and entrepreneurs SEO advice ranging from keyword density to recovering from Panda and Penguin updates. She writes for the nationally recognized SEO agency HigherVisibility.com, which offers online marketing services to a wide range of companies across the country.

The post Matt Cutts Details the Most Common SEO Mistakes: What This Means to You appeared first on Performancing.

]]>
http://performancing.com/matt-cutts-details-the-most-common-seo-mistakes-what-this-means-to-you/feed/ 1
The Benefits of Content Marketing http://performancing.com/benefits-content-marketing/ Thu, 04 Apr 2013 18:42:18 +0000 http://performancing.com/?p=12472 It has been said before that “content is king”, but when it comes to search engine optimization (SEO), this is really the case. Regular high-quality content can help websites and blogs get better rankings online and also drive more traffic to a website. In addition, it also helps companies be seen as a resource and […]

The post The Benefits of Content Marketing appeared first on Performancing.

]]>
It has been said before that “content is king”, but when it comes to search engine optimization (SEO), this is really the case. Regular high-quality content can help websites and blogs get better rankings online and also drive more traffic to a website. In addition, it helps companies be seen as an important resource in their field or industry.

Blog Posts

Blog posts are an easy way to continuously provide a website with fresh content. Blog posts should be published 2-3 times per week, on a variety of topics that are related to the industry or the company’s products and services. Topic ideas are easy to find if marketers read online content daily. Companies that create publishing calendars for their blog content may find that they are more productive and organized, leading to better overall content.

To get the SEO benefit from regular blog posts, the blog must be hosted on the same domain as the rest of the website. The blog can be on a subdomain, but most users and search engines are used to finding blogs at domain.com/blog.

External Content

Providing content to other sources is another great way to get the benefits of SEO, even though it’s not on the company’s website directly. Guest posts, press releases, testimonials, videos, and other types of content are all examples that can earn links back to the company’s website. Besides the links, users see companies that have a wide reach (meaning that they are published/seen in a variety of different platforms) as being more reputable than those that aren’t seen anywhere else online besides their own website.

To find places that will accept your content or mention your company, do Google searches for “guest blogging” + “industry keyword” or look at guest blogging websites like MyBlogGuest and Hullabaloo. Use these platforms to create connections with other companies and website owners to help get your content and website out there.
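Those footprint searches can also be generated in bulk. A small sketch — the `guest_post_queries` helper and the footprint phrases are illustrative choices, not a standard tool or an exhaustive list:

```python
from urllib.parse import quote_plus

def guest_post_queries(industry_keyword):
    """Build Google search URLs for common guest-posting footprints."""
    footprints = ['"guest post"', '"write for us"', '"guest blogging"']
    return ["https://www.google.com/search?q=" + quote_plus(f"{fp} {industry_keyword}")
            for fp in footprints]

for url in guest_post_queries("content marketing"):
    print(url)
```

Swap in your own industry keywords and paste the resulting URLs into a browser, or into a spreadsheet for outreach tracking.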

Social Media

Google and Bing now include personalized search results based on a user’s social network, so being active on social media – both for employees as individuals and for companies – is definitely key. Websites should also have social sharing buttons on all blog posts so that content can be shared online easily.
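Under the hood, most sharing buttons are just links to a network's share endpoint with the page URL and text as query parameters. A minimal sketch assuming Twitter's public web-intent endpoint (the example URL and text are placeholders):

```python
from urllib.parse import urlencode

def tweet_share_url(page_url, text):
    # Twitter's web intent endpoint; a share button is simply a link to this URL
    return "https://twitter.com/intent/tweet?" + urlencode({"url": page_url, "text": text})

print(tweet_share_url("http://performancing.com/benefits-content-marketing/",
                      "The Benefits of Content Marketing"))
```

Most share-button plugins do exactly this, so if a plugin slows your pages down, a plain link built this way is a lightweight alternative.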

Social media shouldn’t just be a place to share your own company’s information and content, but to also be a fun resource and industry news source. Occasionally post updates about current events, holidays, fun facts, and other information that your audience will find interesting.

photo credit: See-ming Lee 李思明 SML via photopin cc

The post The Benefits of Content Marketing appeared first on Performancing.

]]>
Key SEO Habits to Stay on Google’s Good Side This Year http://performancing.com/key-seo-habits-to-stay-on-googles-good-side-this-year/ Wed, 13 Mar 2013 18:17:24 +0000 http://performancing.com/?p=12417 The search engine optimization world has always been a dynamic one, requiring online professionals to adjust as quickly as they can. The goal is always the same – to rank as highly as possible in the search engine results – but techniques and strategies have to evaluated and changed if you want to reach your […]

The post Key SEO Habits to Stay on Google’s Good Side This Year appeared first on Performancing.

]]>
The search engine optimization world has always been a dynamic one, requiring online professionals to adjust as quickly as they can. The goal is always the same – to rank as highly as possible in the search engine results – but techniques and strategies have to be evaluated and changed if you want to reach your goal.

With the search engines – of which Google is the unquestioned leader – regularly changing their policies, it is up to you to make sure that you adapt. It is not always easy, of course, especially since the exact details are almost never divulged, but we have to admit that what they are doing is for the good of the user.

That being said, here is a timely reminder for you – some key SEO habits that will help you get on Google’s good side. Consider these tips a refresher, if you will.

Be original

Originality is such an important thing – online and offline – especially in a world where it is very easy to provide and acquire information. How can you make sure that you have something unique for your users? Something they will not find in a hundred other sites? While you may write about the same topic as others do – that is something that cannot be avoided – make sure that you provide something of additional value.

Be relevant

More than providing original content, you also have to ensure the relevance of your content. What good is original content when it does not mean anything to your target audience? That’s like giving a guy 10,000 spoons when all he needs is a knife to cut his steak. (That’s channeling Alanis Morissette right there.)

Clean up your web site

You don’t have to be an expert coder to work on this aspect, but basic coding knowledge is imperative for anyone who wants to achieve any measure of success in SEO. And when it comes down to it, there are many resources that can help you learn. It is important that you not only pay attention to the quality of your content, but also to how your site is structured. While we place more emphasis on your content being written for users (humans!), we cannot overlook the role that crawlers (bots!) play. They need to be able to access your pages efficiently. Hence, tried and tested practices such as internal linking and a clean internal URL structure should be second nature to you. Take the time to check your site for these things. It will be worth your while.
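A clean internal URL structure usually starts with readable slugs. A minimal sketch of a slug helper — this is one common convention (lowercase, hyphen-separated), not the only valid one:

```python
import re

def slugify(title):
    """Lowercase the title, collapse non-alphanumeric runs into hyphens, trim the edges."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

print(slugify("Key SEO Habits to Stay on Google's Good Side"))
# key-seo-habits-to-stay-on-google-s-good-side
```

Most CMSs (WordPress included) generate slugs like this automatically; the point is to check that your URLs actually look like this rather than like `?p=12417`.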

About the Author
Lance Stadler has been blogging for 10 years. He has gone through so many search engine updates, and he believes in staying on the right side of the law. He also makes sure his workflow continues to be smooth by choosing the right internet service providers.

Image via StayAlien

The post Key SEO Habits to Stay on Google’s Good Side This Year appeared first on Performancing.

]]>