Email addresses that are publicly posted on webpages in plain text are very quickly collected by spam bots and used to send unsolicited email. To stop bulk emailers from harvesting publicly accessible addresses, or at least make it more difficult for them, we can use email obfuscation techniques. Protecting publicly displayed email addresses by obfuscating them not only cuts down on spam but is also considered a courteous gesture.
There are several techniques at your disposal to obfuscate (in other words, hide) email addresses posted on publicly accessible webpages from spam bots.
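One common technique is entity encoding: each character of the address is written as an HTML entity, so the page still renders a readable address while the raw HTML contains no plain-text string for simple bots to match. A minimal Python sketch (the function name and sample address are illustrative):

```python
def obfuscate_email(address):
    """Encode every character of an email address as a decimal HTML entity.

    Browsers render the entities back into readable text, but spam bots
    scanning raw HTML for "user@domain" patterns will not find a
    plain-text address.
    """
    return "".join("&#{};".format(ord(ch)) for ch in address)

# Build a mailto link whose source contains only entities.
html_snippet = '<a href="mailto:{0}">{0}</a>'.format(obfuscate_email("user@example.com"))
print(html_snippet)
```

Note that this only defeats naive pattern-matching bots; a crawler that decodes entities will still recover the address, so it is best combined with other techniques.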
It is important to know whether your website's IP is blacklisted in a spammer directory. Spammer directories maintain lists of IPs from which spam distribution has been reported. Emails sent from blacklisted IPs are subject to closer scrutiny and are much less likely to be delivered.
One important factor to understand is that your server does not have to send spam for your IP to get blacklisted. Many websites on shared hosting accounts are hosted on a single IP. If one domain on a shared IP is reported as distributing spam, all other websites on the same IP are affected.
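Blacklist checks are typically performed over DNS: the IP's octets are reversed and prepended to the blacklist's zone (zen.spamhaus.org is one widely used example), and if the resulting name resolves, the IP is listed. A small sketch of building such a query name, assuming an IPv4 address:

```python
def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the DNS name used to check an IPv4 address against a DNSBL.

    The address octets are reversed and prepended to the blacklist zone;
    resolving the resulting name (e.g. with socket.gethostbyname)
    succeeds only if the IP is listed.
    """
    reversed_octets = ".".join(reversed(ip.split(".")))
    return "{}.{}".format(reversed_octets, zone)

print(dnsbl_query_name("203.0.113.7"))  # 7.113.0.203.zen.spamhaus.org
```

The actual lookup is then an ordinary DNS query against that name; a "no such host" error means the IP is not on that particular list.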
The iwebchk website analysis & SEO tool now provides a handy Chrome browser extension for running iwebchk reports.
Once installed, the iwebchk icon button is added to the Google Chrome toolbar.
When clicked, it initiates a comprehensive review of the website currently being viewed, covering SEO, performance, validity, security, accessibility, social media, backlinks, visitors, technologies and usability.
We are happy to announce a new addition to our reports: a priority checklist. All new reports will now include a priority list at the top of the report, which will help in prioritizing the work that needs to be done for each analyzed website.
Public reports will include the three most important tasks that need to be fixed, while our subscribed users will be able to review a full list of tasks ordered by their importance and impact on SEO. See the sample list below:
We are hoping this new functionality will be useful to all our users.
Our users who would like to initiate a website SEO review directly from their webpages can now do so by embedding our iwebchk bar. Just copy and paste the following code into your website or blog:
After the code has been properly embedded on a webpage the iwebchk bar should look and work like this:
Website SEO Analysis by iwebchk
For the iwebchk bar to work, the browser must support and allow HTML iframe elements. All modern browsers support iframes, but in the rare case there is no iframe support, the following message will be displayed instead:
“Inline frames (iframes) are not supported by your browser. Please visit iwebchk.com to review your website.”
We are hoping that this new addition to our tools will be helpful to our users.
Proper utilization of hyperlinks is an integral part of a good website design and a major component of an effective search engine optimization plan. After all, the World Wide Web (WWW), in its most basic form, is simply a system of documents interconnected through links.
We are excited to announce the launch of a new “Links Analysis” module, the most recent addition to our website analysis tool. It is no secret that properly functioning and SEO-optimized links are an integral part of a good website. The links module provides users with a vast amount of valuable information, such as the number of links on a webpage and their ratios, types, URL structure and much more.
Here is a quick overview of key data points:
- Internal Links: relative – number of internal links with a relative URL.
- Internal Links: absolute – number of internal links with an absolute URL.
- External Links: noFollow – number of external links that do not pass page rank.
- External Links: passing Juice – number of external links that pass page rank.
- Anchors – the text or image that forms the anchor part of the link.
- Count – number of links pointing to the same resource.
- Title – value of the link’s “TITLE” attribute.
- URL Type – “static” indicates that the URL is SEO and user friendly; “dynamic” indicates a complex URL with many parameters in the query string.
- Length – URL length in characters.
- Target – value of the TARGET attribute. If there is no value, it is equivalent to “_self”.
- Rel – value of the REL attribute. Usually a “nofollow” value is placed here to limit the flow of page rank.
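Several of the data points above can be derived directly from a page's HTML. A rough sketch using Python's standard `html.parser` (the sample markup is made up, and classifying every absolute URL as external is a simplification — an absolute URL pointing at your own host would still be internal):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkStats(HTMLParser):
    """Collect basic link statistics similar to the data points above."""

    def __init__(self):
        super().__init__()
        self.internal = self.external = self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if urlparse(href).netloc:      # absolute URL with a host -> external
            self.external += 1
            if "nofollow" in (attrs.get("rel") or "").lower():
                self.nofollow += 1
        else:                          # relative URL -> internal
            self.internal += 1

parser = LinkStats()
parser.feed('<a href="/about">About</a>'
            '<a href="http://example.org" rel="nofollow">Ext</a>')
print(parser.internal, parser.external, parser.nofollow)  # 1 1 1
```

A fuller implementation would also record anchor text, the TITLE attribute and URL lengths, but the classification logic is the same.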
We are hoping you will find the link analysis addition helpful in your reviews. As always, we look forward to your comments and ideas for further improvements.
Character sets (charsets) are used by browsers to convert information from a stream of bytes into readable characters. Each character is represented by a value, and each value has a corresponding character assigned to it in a table. There are literally hundreds of character encodings in use. Here is a list of just a few common character encodings used on the web, ordered by popularity:
- UTF-8 (Unicode) Covers: Worldwide
- ISO-8859-1 (Latin alphabet part 1) Covers: North America, Western Europe, Latin America, the Caribbean, Canada, Africa
- WINDOWS-1252 (Latin I)
- ISO-8859-15 (Latin alphabet part 9) Covers: Similar to ISO 8859-1 but replaces some less common symbols with the euro sign and some other missing characters
- WINDOWS-1251 (Cyrillic)
- ISO-8859-2 (Latin alphabet part 2) Covers: Eastern Europe
- GB2312 (Chinese Simplified)
- WINDOWS-1253 (Greek)
- WINDOWS-1250 (Central Europe)
- US-ASCII (basic English)
Note that the popularity of particular charsets depends greatly on the geographical region. You can find all names for character encodings in the IANA registry.
As you can see, there are multiple possibilities to choose from; therefore, character encoding information should always be specified in the HTTP Content-Type response header sent together with the document. Without specifying a charset, you risk that the characters in your document will be incorrectly interpreted and displayed.
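The consequence of a mismatched charset is easy to demonstrate: the same bytes decode to different characters under different encodings. A quick Python illustration using the word "café":

```python
# "é" is encoded as the two bytes 0xC3 0xA9 in UTF-8.
data = "café".encode("utf-8")

print(data.decode("utf-8"))        # café   (correct charset)
print(data.decode("iso-8859-1"))   # cafÃ©  (wrong charset: each byte read as one character)
```

This garbling, often called mojibake, is exactly what visitors see when a page omits its charset and the browser guesses wrong.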
In the Hypertext Transfer Protocol (HTTP), a header is simply the part of the message containing additional text fields that are sent from or to the server. When a browser requests a webpage, in addition to the HTML source code of the webpage, the web server also sends fields containing various metadata describing the settings and operational parameters of the response. In other words, the HTTP header is a set of fields containing supplemental information about the user request or server response.
In the example above, the “Response Headers” contain several fields with information about the server, content and encoding, where the line
Content-Type: text/html; charset=utf-8
informs the browser that the characters in the document are encoded using the UTF-8 charset.
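Parsing that field programmatically is straightforward; Python's standard library handles Content-Type values, including the charset parameter, via the `email.message` machinery (here applied to the example header value above):

```python
from email.message import Message

# Parse a Content-Type header value the way a client would.
header = Message()
header["Content-Type"] = "text/html; charset=utf-8"

print(header.get_content_type())      # text/html
print(header.get_content_charset())   # utf-8
```

The same accessors are available on the response object returned by `urllib.request.urlopen`, so a live check against your own site works identically.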
What are “Bad Requests”?
Bad requests are requests for resources — images, stylesheets, scripts — that the server cannot find, typically answered with a 404 Not Found response.
How to check for missing resources on the webpage?
The easiest way to check for missing resources on a webpage is to use your browser’s developer tools. Most modern browsers come with tool sets that allow you to examine network traffic. The common way to access developer tools is to press the F12 key on your keyboard while browsing the webpage. My preferred way to analyze webpage resources is with Firebug, a developer plugin for the Firefox browser.
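The same check can be scripted. A minimal sketch with Python's standard library that reports the HTTP status code of a resource (so a missing file shows up as 404; the helper name is ours):

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def check_resource(url):
    """Return the HTTP status code for a resource, e.g. 404 for a bad request."""
    try:
        with urlopen(url) as response:
            return response.status
    except HTTPError as err:
        return err.code        # server answered, but with an error status
    except URLError:
        return None            # DNS failure, refused connection, etc.

# Example (requires network access):
# print(check_resource("http://example.com/missing-image.png"))
```

Run against every URL referenced by a page, this quickly surfaces broken references without opening developer tools.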
Why is it important to avoid bad requests?
The first noticeable item in the traffic analysis of the page is the size of the 404 Not Found responses, which are not small in comparison to our tiny test page of only 277 bytes. Depending on the server and website configuration the size of the error page will vary, but it will usually be at least several kilobytes, as the response typically consists of headers plus text or HTML code explaining the error. If you have a fancy custom 404 error page that is large in size, the difference would be even more dramatic. Removing references to missing resources will definitely decrease bandwidth usage.