With DMOZ, the Open Directory Project, closed as of March 17, 2017, Google no longer uses its listings as one of the sources for generating search snippets. With that in mind, a good meta description is even more important. So here are a few pointers on how to write one:
- Don’t omit it. Without a meta description, Google has no choice but to use portions of the page content for its search snippet.
- Keep it short. Although there is no hard limit on how long a meta description can be, descriptions longer than roughly 160 characters will be truncated.
- Make it unique. Do not use the same meta description across multiple pages. Each page on your website should be unique, and so should its meta description.
- Stay on topic. A good meta description can increase click-through rates, so make it informative and to the point.
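The checklist above lends itself to automation. Here is a minimal sketch using Python's standard library that flags a missing or over-long description; the 160-character threshold follows the guideline above, and the function names are illustrative, not part of any particular tool:

```python
from html.parser import HTMLParser

SNIPPET_LIMIT = 160  # approximate point at which Google truncates snippets


class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description"> tags."""

    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "description":
            self.descriptions.append(attrs.get("content", ""))


def check_meta_description(html):
    """Return a list of human-readable issues for a page's meta description."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    issues = []
    if not parser.descriptions:
        issues.append("missing meta description")
    for desc in parser.descriptions:
        if len(desc) > SNIPPET_LIMIT:
            issues.append("description longer than %d characters" % SNIPPET_LIMIT)
    return issues
```

Running `check_meta_description` over each page's HTML would also surface duplicates if you collect the descriptions per page and compare them.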
Note that with Google’s new mobile-first indexing the same principles apply to mobile pages. If you have a separate mobile site, make sure that the meta description is correctly included on all mobile pages.
Google still reserves the right to choose between page content and the meta description when generating a search snippet. The one thing you can do is specify the nosnippet robots directive to prevent snippet generation altogether. For more details on search snippets, see the latest post by Gary Illyes, titled “Better Snippets for your Users”.
Google is introducing a new mobile-first indexing. This change is supposed to address an issue where the desktop and mobile versions of a page vary in content and markup. Up until now, when users searched on Google from a phone or any other mobile device, the ranking system would most likely use the desktop version of a page’s content. With this new indexing, that should no longer be the case.
While this is so far a limited and experimental change in indexing, it is a big development, and Google urges owners of websites that are not responsive or dynamically serving, and whose content and markup vary between the mobile and desktop versions, to consider the following:
Make sure to serve structured markup for both the desktop and mobile versions. Sites can verify the equivalence of their structured markup across desktop and mobile by entering the URLs of both versions into the Structured Data Testing Tool and comparing the output. When adding structured data to a mobile site, avoid adding large amounts of markup that isn’t relevant to the specific information content of each document.
Use the robots.txt testing tool to verify that your mobile version is accessible to Googlebot.
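Python's standard library includes a robots.txt parser that can sketch the same kind of check locally; the robots.txt rules and the m.example.com URLs below are hypothetical placeholders, so the authoritative check remains the robots.txt testing tool itself:

```python
from urllib import robotparser

# Example robots.txt content; in practice, fetch it from your mobile site.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Verify that Googlebot can reach a representative mobile page.
print(parser.can_fetch("Googlebot", "https://m.example.com/products"))    # → True
print(parser.can_fetch("Googlebot", "https://m.example.com/private/x"))   # → False
```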
Sites do not have to make changes to their canonical links; we’ll continue to use these links as guides to serve the appropriate results to a user searching on desktop or mobile.
If you are a site owner who has verified only your desktop site in Search Console, please add and verify your mobile version as well.
Also, to confirm: if you only have a desktop version of your website, or you have a responsive site where content and markup are the same across devices, you should not need to take any action.
Email addresses that are publicly posted on webpages in plain text will very quickly be collected by spam bots and used to send unsolicited emails. To stop bulk emailers from collecting publicly accessible emails, or at least make it more difficult, we can utilize email obfuscation techniques. Protecting publicly displayed email addresses by obfuscating them not only cuts down on spam but is also a courteous gesture.
There are several techniques at your disposal to obfuscate, or in other words hide, email addresses posted on publicly accessible webpages from spam bots.
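One of the simplest such techniques is to encode every character of the address as a decimal HTML entity: browsers render the entities as a normal address, while naive bots scanning for plain-text emails will not match it. A minimal Python sketch, with illustrative function names:

```python
def obfuscate_email(address):
    """Encode every character of the address as a decimal HTML entity."""
    return "".join("&#%d;" % ord(ch) for ch in address)


def mailto_link(address, label=None):
    """Build a mailto: link with both the href and the visible text obfuscated."""
    return '<a href="mailto:%s">%s</a>' % (
        obfuscate_email(address),
        obfuscate_email(label or address),
    )
```

For example, `obfuscate_email("a@b.c")` produces `&#97;&#64;&#98;&#46;&#99;`, which a browser still displays as `a@b.c`. Note that more determined harvesters do decode entities, so this raises the bar rather than guaranteeing protection.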
It is important to know whether your website’s IP is blacklisted by a spammer directory. Spammer directories provide lists of IPs from which spam distribution has been reported. Emails sent from blacklisted IPs are subject to closer scrutiny and are much less likely to be delivered.
One important thing to understand is that your server does not have to send spam for your IP to get blacklisted. Many websites are hosted on a single IP through shared hosting accounts. If one domain on a shared IP is reported as distributing spam, all other websites on the same IP are affected.
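Such blacklists are typically queried over DNS: the IP’s octets are reversed, prepended to the blacklist zone, and looked up; a successful resolution means the IP is listed. A sketch in Python, assuming zen.spamhaus.org as one commonly used zone (substitute whichever directory you want to check):

```python
import socket


def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the DNSBL query hostname: reversed IP octets plus the blacklist zone."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    return "%s.%s" % (reversed_ip, zone)


def is_blacklisted(ip, zone="zen.spamhaus.org"):
    """Return True if the IP resolves in the given DNS blacklist zone."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True   # any DNS answer means the IP is listed
    except socket.gaierror:
        return False  # NXDOMAIN: the IP is not listed
```

So checking `1.2.3.4` against the zone queries the hostname `4.3.2.1.zen.spamhaus.org`.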
iwebchk, the website analysis & SEO tool, now provides a handy Chrome extension that lets users run iwebchk reports directly from the browser.
When installed, an iwebchk icon button is added to the Google Chrome toolbar.
When clicked, it initiates a comprehensive review of whichever website the user is currently viewing, covering SEO, performance, validity, security, accessibility, social media, backlinks, visitors, technologies, and usability.
We are happy to announce a new addition to our reports: a priority checklist. All new reports will now include a priority list at the top of the report, which will help prioritize the work needed for each analyzed website.
Public reports will include 3 important tasks that need to be fixed, while our subscribed users will be able to review the full list of tasks, ordered by their importance and impact on SEO. See the sample list below:
We hope this new functionality will be useful to all our users.
Users who would like to initiate a website SEO review directly from their own webpages can now do so by embedding our iwebchk bar. Just copy and paste the following code into your website or blog:
After the code has been properly embedded on a webpage, the iwebchk bar should look and work like this:
Website SEO Analysis by iwebchk
In order for the iwebchk bar to work, the browser must support and allow HTML iframe elements. All modern browsers support iframes, but in the rare case that there is no iframe support, the following message will be displayed instead:
“Inline frames (iframes) are not supported by your browser. Please visit iwebchk.com to review your website.”
We hope this new addition to our tools will be helpful to our users.
Proper utilization of hyperlinks is an integral part of a good website design and a major component of an effective search engine optimization plan. After all, the World Wide Web (WWW), in its most basic form, is simply a system of documents interconnected through links.
We are excited to announce the launch of a new “Links Analysis” module, the most recent addition to our website analysis tool. It is no secret that properly functioning, SEO-optimized links are an integral part of a good website. The links module provides users with a vast amount of valuable information, such as the number of links on a webpage and their ratios, types, URL structure, and much more.
Here is a quick overview of key data points:
- Internal Links: relative – number of internal links with a relative URL.
- Internal Links: absolute – number of internal links with an absolute URL.
- External Links: noFollow – number of external links that do not pass page rank.
- External Links: passing Juice – number of external links that pass page rank.
- Anchors – the text or image that serves as the anchor of a link.
- Count – number of links pointing to the same resource.
- Title – value of the TITLE attribute of a link.
- URL Type – “static” indicates that the URL is SEO- and user-friendly; “dynamic” indicates a complex URL with many parameters in the query string.
- Length – URL length in characters.
- Target – value of the TARGET attribute. If empty, it behaves the same as “_self”.
- Rel – value of the REL attribute. A “nofollow” value is usually placed here to limit the flow of page rank.
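As a rough illustration of how data points like these can be gathered, here is a sketch using Python's standard library. The classification rules mirror the definitions above; this is not our actual implementation, and the class and field names are illustrative:

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit


class LinkClassifier(HTMLParser):
    """Collect <a href> links and record the data points described above."""

    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        parts = urlsplit(href)
        self.links.append({
            "url": href,
            "length": len(href),
            # Relative URLs carry no host of their own, so they are internal.
            "internal": (not parts.netloc) or parts.netloc == self.site_host,
            "absolute": bool(parts.netloc),
            # Dynamic URLs carry parameters in the query string.
            "url_type": "dynamic" if parts.query else "static",
            "nofollow": "nofollow" in (attrs.get("rel") or "").lower(),
            "target": attrs.get("target") or "_self",
            "title": attrs.get("title", ""),
        })
```

Feeding a page's HTML to `LinkClassifier("example.com").feed(html)` yields one record per link, from which counts and ratios like those in the report can be aggregated.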
We hope you will find the link analysis addition helpful in your reviews. As always, we look forward to your comments and ideas for further improvements.