- Find Out When Others Are Talking About You Online
Contrary to the physical world, where you can rarely know if people are talking about you and your business, in the online world there are many ways to find out if others are having conversations about your organization. Everything that is online can be searched, indexed, compiled and found. The trouble is finding out where the conversations are happening. Fortunately there are quite a few tools available to help keep you updated on what others are saying about you.

First and foremost there is Google Alerts. This very straightforward service lets you subscribe via email to any new piece of internet content that matches your alert query. It will send you an email with the latest articles, web pages, videos, and blog posts that match the keywords you submitted. This sounds like the perfect solution, but it has a catch: you need to define your query well to prevent getting swamped with irrelevant alerts. Make your alert query too generic and you'll get hundreds, or even thousands, of alerts in your inbox, most of them entirely irrelevant to your business. A good idea is to set your company name as a Google Alert. Unless your business has a very generic name, this ensures that you'll only receive alerts when your company is actually mentioned in an online publication. You could also set alerts for your product brand names, the names of high-profile employees, and of course your competitors. (A few example queries are shown at the end of this post.)

Another very useful tool is Google Blog Search, a specialised search engine that only searches blog posts. Blogs are becoming increasingly important and valuable as a form of citizen journalism, and some successful blogs can make or break the reputations of international corporations. It's important to be aware of when and how a blogger is talking about your business.

Knowing when people are talking about you online is just the first step. Next you'll want to join the conversation. But be careful when responding to news articles or blog posts: don't be aggressive. Join the conversation with a civil tone – this is a public conversation, after all, and you don't want to come across as a jerk. Don't be afraid to contact the blogger or journalist directly via email instead of openly discussing matters in the comments section. Just be sure to always stay friendly and positive, just as you would with a customer on the phone or in person. One unhappy customer in the real world may not harm your business much, but online everything is stored forever and publicly available to anyone – one negative online argument can come back to haunt you years later.
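As promised above, here are a few example alert queries for a hypothetical company called Acme Widgets (the company name, product name, and domain are made up for illustration). Quoted phrases match the exact name, and in my experience the minus operator works in Alerts queries just as it does in regular Google search – here used to exclude mentions on your own site:

```
"Acme Widgets"
"Acme Widgets" -site:acmewidgets.com
"Acme TurboVac 3000"
```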
- The Changing Face of SEO
(This article was originally published in the Belfast Telegraph on 3 Feb 2010. It's been modified slightly for this blog.)

Search engine optimisation, or SEO for short, is defined as 'the process of improving ranking in search engine results'. When search engines first appeared on the scene in the 1990s to help people make sense of the exponential growth of websites, it suddenly became important to show up first in these search engine results pages. Savvy entrepreneurs quickly figured out how search engines worked and what a website needed to rank first, and the dark art of SEO was born.

The first search engines were relatively simplistic pieces of software that crawled the world wide web and matched words found on websites to search queries entered by their users. All a search engine optimiser needed to do to get his site to the number one spot was stuff as many keywords on a website as possible. Whether or not that site was actually useful and relevant for the user's query didn't matter, at least not to the optimisers. It's at this stage that the SEO industry earned its dubious reputation, a blemish it has yet to shed. This of course led to abundant complaints from search engine users who were looking for one thing but ended up on websites that offered something entirely different. In response search engines got smarter, but search engine optimisers got smarter as well, and the arms race has been on ever since.

The big breakthrough came with Google, which in the late 1990s added a whole new approach to determining which websites were really relevant for a given search query. Keywords on a page were still important, but more important were the links from other websites pointing to that page. Google's idea was that every link to a website counts as a vote, a recommendation from one website owner to another. The more links point to a website, the more important that website is. That, in a nutshell, is Google's secret recipe, and while it's gone through many iterations over the years the core premise remains intact. (A toy illustration of this idea is included at the end of this post.)

Search engine optimisers were quick to catch on. The focus shifted from optimising sites for keywords to optimising them for links. The goal is to get as many other websites as possible to link to your website. Unscrupulous optimisers, the same types that didn't hesitate to stuff 'Britney Spears Nude' keywords on a website that sold vacuum cleaners just to get extra traffic, devised all kinds of schemes to quickly and cheaply generate as many links as possible. Search engines like Google kept updating their software to filter out these false links, trying to count only those links they considered to be real recommendations. But the web is so unimaginably vast that search engines have no choice but to rely on automated processes to filter these false links. Machines, no matter how clever we try to make them, are easily fooled, and the 'black-hat' search engine optimisers (as opposed to 'white-hat' optimisers, who use only legitimate methods) are smart and inventive.

But perhaps the era of unscrupulous optimisers is nearing its end. The past few years have been very exciting for website owners and search engine optimisers. Search engines have enhanced their results pages with all types of extra content, such as YouTube videos and local businesses. Recently, new tweets about the topic a user is searching for have started showing up in Google results as well.
The latest refinement Google is deploying, called Social Search, integrates content from the user's online social circle. If, for example, you are searching on Google for a holiday home in Portugal, and one of your Twitter friends has blogged about it, Google will show that blog post in your results.

This new level of personalisation of search engine results, combined with other changes Google has made and continues to refine, means that search engine optimisers are increasingly unable to rely on the basic optimisation factors of keywords and links. There are signs that internet users are being drawn more and more to online community websites such as Facebook and Twitter, and are beginning their searches for products and services there as well. Why trust an anonymous search engine result if you can get a recommendation from a real friend? Or at least a real friend of a real friend.

Black-hat optimisers will continue to try and outsmart search engines and force their websites to the top of the list. Setting up fake social media accounts is already a common practice, as any Twitter and Facebook user can attest, but generally these are easy to spot and filter. I wouldn't go as far as to proclaim the death of SEO – this has been done many times before and been proven wrong each time – but as web search moves towards social media, and social media becomes more about web search, it's definitely going to change the search engine optimisation landscape.
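As promised above, here is a toy illustration of the 'links as votes' idea for the technically inclined. This is a heavily simplified sketch of the PageRank formula from Google's original research paper, not Google's production algorithm; the link graph is invented for illustration:

```python
# A toy PageRank computation: links count as votes, and votes from
# important pages weigh more. A simplified sketch of the formula in
# Google's original paper, not their production algorithm.

# A made-up link graph: page -> pages it links to.
links = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
    "blog": ["products"],
}

damping = 0.85          # probability a 'random surfer' follows a link
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal ranks

for _ in range(50):     # iterate until the ranks settle
    new_rank = {}
    for page in pages:
        # Sum the votes: each page linking here passes on a share of its rank.
        votes = sum(rank[src] / len(out)
                    for src, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(pages) + damping * votes
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Run it and you'll see that 'home', which attracts the most votes, ends up with the highest rank, while 'blog', which nothing links to, ends up last – exactly the intuition described above.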
- The unlikely persistence of Email Marketing
(This article was originally published in the Belfast Telegraph on 17 Feb 2010.)

Fortunately the average user doesn't see the vast majority of spam messages. Spam filters are incredibly smart pieces of software, using advanced algorithms to filter out nearly all spam messages. The more you use a spam filter, the smarter it gets, as it learns what is spam and what is genuine email. (A toy sketch of how such a filter learns is included at the end of this post.) Yet sending millions of emails at once is so cheap and easy that a spammer only needs a few of his messages to get through the filters, and even fewer users to actually buy something from those messages, to turn a profit.

And that is what lies at the heart of both unsolicited spam and legitimate email marketing – it's cheap and it works. While a spammer is happy with a 0.01% response rate on his spam emails, a well-crafted opt-in email campaign can deliver much higher returns for an organisation. Smart email marketers have learned to go beyond just sending standard commercial messages. The key is what is called 'permission marketing' – getting the user's permission to send him emails, and delivering what the user expects. A key approach to permission-based email marketing is the newsletter. In a newsletter a company can package its commercial message around interesting and newsworthy content. By combining well-written, engaging content with a subtle commercial message, newsletters can form the solid backbone of a company's email marketing strategy. This approach has worked well for decades, and is one of the reasons why email marketing is still around after all this time.

Another reason is that email as a communication system has proven remarkably robust. Many new ways of communicating instantly and across vast distances have emerged – instant messaging, text messaging, Twitter – but the preferred method of communicating online is still email. For some reason email resonates with us. It manages to strike a nearly perfect balance between speed and length. Whether it's a one-word message or an email with several megabytes worth of attachments, emails travel around the world nearly instantly and arrive with almost flawless precision. Because of this our email inbox is the focal point of our online existence. Email and the World Wide Web are essentially different aspects of the Internet, but we've become so accustomed to email that often we fail to realise this. Instead we perceive email as an entirely separate thing.

As long as our daily online journeys start with firing up our email programmes, email marketing will continue to thrive. Unfortunately, so will spam.
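As promised above, here is a toy sketch of how a learning spam filter can work, using naive Bayes – the classic technique behind many early spam filters. The training messages are made up for illustration, and real filters use far more signals and vastly larger corpora:

```python
from collections import Counter

# A toy Bayesian spam filter: it learns word frequencies from messages
# marked as spam or genuine ('ham'), then scores new messages.
# Training data is made up for illustration.

spam_training = ["buy cheap pills now", "cheap offer buy now"]
ham_training = ["meeting notes attached", "see you at the meeting"]

spam_words = Counter(w for msg in spam_training for w in msg.split())
ham_words = Counter(w for msg in ham_training for w in msg.split())
vocabulary = set(spam_words) | set(ham_words)


def spam_score(message: str) -> float:
    """Naive Bayes score between 0 and 1: higher means more spam-like."""
    spam_total = sum(spam_words.values())
    ham_total = sum(ham_words.values())
    spam_prob = ham_prob = 1.0
    for word in message.split():
        # Laplace smoothing: unseen words don't zero out the probability.
        spam_prob *= (spam_words[word] + 1) / (spam_total + len(vocabulary))
        ham_prob *= (ham_words[word] + 1) / (ham_total + len(vocabulary))
    return spam_prob / (spam_prob + ham_prob)


print(spam_score("buy pills now"))    # close to 1.0 -> likely spam
print(spam_score("meeting at noon"))  # close to 0.0 -> likely genuine
```

Every message you mark as spam or not-spam updates those word counts, which is why the filter keeps getting smarter the more you use it.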
- SEO Factors for Geotargeting Your Website
There are all sorts of challenges when dealing with international SEO: what domain extension to use, how to structure multiple target countries on a single domain, dealing with country selection without losing link value, how to build incoming links across all countries, and so on. There have never been any hard rules on how search engines deal with international SEO, leaving optimisers with trial-and-error exercises to discover the right approach for their sites. Now Google is lending a helping hand with an extensive post on their Webmaster Central blog, which explains how they attempt to handle different geotargeting factors such as domains and subdomains, directory structure, and Webmaster Tools settings. For experienced SEOs there aren't many surprises: country-level domains are important and probably the best way to target different countries if you can spare the expense. Subdomains are a good alternative, as are country-specific directories on your website.

Duplicate content issues have plagued international SEOs since the dawn of the web. It's often extremely hard to avoid duplicate content when your business operates internationally in Europe. Many countries use the same language, so if you want to effectively target all your operating markets you will eventually end up duplicating content in the same language for different countries. Google doesn't recommend hiding this duplicate content with robots.txt or noindex tags – I've never been a fan of hiding content in this way either, as every page on your website may convey some ranking benefit, and I feel you should let search engines crawl the full length and breadth of your content. What you should do is pick one 'preferred' version (ideally the version with the largest target market) and ensure all duplicate versions of that content use the canonical tag to point to the preferred version. (A minimal example is included at the end of this post.)

Google does have one interesting revelation in their geotargeting recommendations: "Note that we do not use locational meta tags (like "geo.position" or "distribution") or HTML attributes for geotargeting. While these may be useful in other regards, we've found that they are generally not reliable enough to use for geotargeting." (Emphasis added) This might mean that the 'lang' HTML attribute conveys no geotargeting benefit, though it's undoubtedly a useful attribute for browsers and other user agents, and it helps with multilingual websites.

In upcoming blog posts Google will take a look at multilingual websites and special situations with global websites, so I recommend keeping a sharp eye on the Webmaster Central blog, and I'll be discussing things here as well.
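To make the canonical tag advice concrete, here is a minimal sketch. Assume a hypothetical site with the same English-language content duplicated for the UK and Ireland, with the UK version chosen as the preferred one (the domain and paths are made up for illustration). The duplicate page would include this in its head section:

```html
<!-- On the duplicate page, e.g. example.com/ie/products.html, -->
<!-- pointing search engines at the preferred UK version: -->
<link rel="canonical" href="http://example.com/uk/products.html" />
```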
- The Similarities Between SEO and Poker
It's been a slow week as far as writing blog posts is concerned, and if it weren't for a sudden burst of inspiration after reading a book on poker I wouldn't have written anything at all. The inspirational episode resulted in a post for State of Search where I draw parallels between SEO and poker.

State of Search: How SEO is a lot like Poker

Poker is a card game that involves certain skills, techniques, and a bit of luck, to win more money in a game than your competitors. Search engine optimisation is a profession that involves certain skills, techniques, and a bit of luck, to help a site rank higher than its competitors in search engine results. So both poker and SEO rely on a basic foundation of skills and techniques, complemented by a certain degree of luck. And that's just the superficial similarity. It doesn't end there.
- SEO for Google News – Ranking Factors and Recommendations
Many news organisations receive most of their website traffic from Google News – the dominant news destination for users online. Google News is a very different animal from Google Search, and SEO for Google News needs a very different approach. It's vital for news publishers to have a good understanding of how Google News operates and what can be done to optimise your presence there.

There's an extensive FAQ on Google about how to get your site included in Google News, so I won't discuss that topic here. Instead I'll focus on what you can do to maximise your exposure once you're included. The following ranking factors are distilled from my own experiences with a large regional news site, as well as various online sources including interviews with Google staff, Google's own FAQs and videos, research papers and patents published by Google, and analyses performed by other SEO professionals. Google guards its ranking algorithms fiercely. As a result we don't know how many other ranking factors come into play, nor what weight each factor has in the overall ranking algorithm.

Google News Ranking Factors

Original Content
An article that is unique to a publisher has a much higher chance of ranking than an article that is taken from a news syndication feed or republished from another source. Note that Google strives to show every article under the original publisher's banner, so content republished from other sources (such as AP and Reuters) is much less likely to show up in Google News as part of your site than your own original content. AP and other news agencies are also working hard to ensure they capture the web traffic for their own content. Additionally, if you have content that refers to an original source (e.g. "The New York Times reported that…") Google News can detect this and will rank the original NY Times article higher.

Timeliness
A bit of a no-brainer: news articles that are more recent and tie in with current events are preferred over older articles.

Coverage of recent developments
Nowadays Google News is able to detect updates to an already indexed article. News articles that are updated to reflect ongoing developments in the story are preferred over static stories.

Cluster Relevancy
Google divides news articles into clusters centred on a single topic (an algorithmic feature Google calls Aggregated Editorial Interest). The more relevant an article is to its cluster, the higher it is likely to rank.

Local source & content
If a story has a location element to it, Google News tends to prefer articles from publishers geographically close to the story's focus that create their own local content for the story. For example, for an event in Belfast that is covered by both the Belfast Telegraph and a national newspaper, the Belfast Telegraph's coverage is likely to rank higher in Google News.

Publisher Reputation
This is a complex ranking factor that depends on a number of factors in itself. One important factor for determining publisher reputation is the volume of original content per news edition that the publisher produces. A publisher that produces a lot of original content for different news editions is seen as more reputable than niche content producers and news aggregators. Google News defines 'editions' as separate categories of news, such as sports, politics, and entertainment, but also as its own country-specific versions (news.google.com, news.google.ca, news.google.co.uk, etc).
It's important to note that publisher reputation is mostly independent of a website's PageRank (PR is said to be applied 'delicately' to Google News), and that the reputation can be different for each edition. Thus it is possible for a news site to have a great publisher reputation for politics in news.google.ca, but a very poor publisher reputation for sports in news.google.com.

Clickthroughs
An article with a high CTR is seen as more relevant – with every click counting as a 'vote' for the article – and is thus more likely to rank higher.

On-page Optimisation
The concepts of general SEO apply to Google News as well. Factors such as search-engine-friendly URLs, good title tags, use of header tags, strong body content, and optimised code all factor into Google News rankings.

Images
The thumbnail images that accompany stories in Google News are usually JPEG images, have a relevant caption and alt text, and aren't clickable (so the image doesn't link to another page). The latter is because Google News wants the best image to be part of the article, so users don't have to perform an additional click to see the best possible image.

Personalisation
Recently Google has started to personalise Google News based on collaborative filtering, much like Amazon.com's recommendations system. An example: user A reads articles 1, 2, 3 and 4 on Google News; user B reads articles 1 and 3. Google News then personalises the News page for user B and shows articles 2 and 4 more prominently, as it suspects user B will want to read these as well. (A toy sketch of this approach is included at the end of this post.) A research paper on Google's implementation of collaborative filtering in Google News has been published and can be read here: http://www2007.org/papers/paper570.pdf

Google News Search Patent

In 2003 Google filed for a patent for "systems and methods for improving the ranking of news articles". The patent was granted in 2009. Much has changed since 2003, so it is very likely rankings in Google News work very differently nowadays from what this patent describes, but we can still learn a few things from the patent's ranking factors:

- Number of articles produced by the source
- Article length (longer = better)
- Breaking news score
- Clickthroughs
- Human opinion (awards won, survey results, etc)
- Newspaper circulation numbers
- Editorial staff size
- Number of associated news bureaus
- Inclusion of original named entities (people/places/organisations)
- Number of topics the source produces content for
- International diversity of audience
- Writing style (spelling, grammar, reading level)

Some of these factors are likely still part of the Google News ranking algorithm in some form or another, such as clickthroughs, number of topics, and breaking news score. Other factors are unlikely to be a part of the current workings of Google News (circulation, staff size). The full patent text is available here: http://www.faqs.org/patents/app/20090276429

Recommendations

From these ranking factors a number of recommendations follow that you should keep in mind when creating content for your news site, as well as when making technical changes to the site:

- Publish unique content: Strive to publish as much unique, original content as possible.
- Publish & update fast: By being early with breaking news, as well as keeping on top of new developments, you can increase your chances of ranking high in Google News. Minor article tweaks can be interpreted as a developing story update, and are thus encouraged if applied inconspicuously.
- Develop editorial specialities: You can increase your publisher reputation in specific news editions by developing a speciality for a certain type of news. For example, you could strive to cover your regional politics better than anyone else, and thus increase your chances of outranking big news publishers in Google News for local political news.
- Optimise your site for general SEO: As with any other site, it pays to optimise things like title tags, URLs, header tags, etc.
- Images should be JPEGs and non-clickable: By making sure all images used on your site are JPEGs, and that images included in an article are not linked, you can increase your visibility in Google News. Having a good caption for your images also helps.
- Google News Sitemap: While having a Google News sitemap doesn't help your rankings in Google News, I still consider it essential to have one, if only to ensure all your content is found and indexed by Google's news spiders.

Note that these recommendations come from my point of view as an SEO specialist, and I reckon they would benefit from a journalistic perspective. Also note that this document is a snapshot of the state of Google News as it exists now. Google rolls out updates and tweaks all the time, so these ranking factors are likely to change over time.

If you're a publisher in need of SEO and want to improve your visibility in Google News, Polemic Digital offers specialised SEO services for publishers.
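As promised in the Personalisation section above, here is a toy sketch of the 'users who read X also read Y' idea behind collaborative filtering, replaying the user A / user B example. The reading data is made up, and Google's actual implementation (described in the paper linked above) is far more sophisticated:

```python
# Toy collaborative filtering: recommend articles read by users whose
# reading history overlaps with yours. Illustrative only; not Google's
# actual implementation.

reading_history = {
    "user_a": {"article1", "article2", "article3", "article4"},
    "user_b": {"article1", "article3"},
}


def recommend(target: str, histories: dict) -> set:
    """Recommend articles read by users with overlapping histories."""
    target_articles = histories[target]
    recommendations = set()
    for user, articles in histories.items():
        if user == target:
            continue
        if target_articles & articles:
            # Articles the similar user read that the target hasn't seen.
            recommendations |= articles - target_articles
    return recommendations


print(recommend("user_b", reading_history))  # {'article2', 'article4'}
```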
- Battle of the World Cup SERPs
It's the World Cup and I needed ideas for a new blog post on State of Search, so I compared the SERPs for the 'world cup' keyword on five different search engines to see which one provided the best information.

State of Search: Battle of the World Cup SERPs

Yahoo pulls out all the stops – after two sponsored results there's a table of recently played games with links pointing to Yahoo's Eurosport pages. Yahoo also knows where I am and shows me results for England and for Group C, of which England is a part. Up next is a plug for that monstrosity known as the Yahoo Toolbar, followed by organic results for FIFA.com with a lot of useful sitelinks.
- Presentation: SEO for Web Developers
At the most recent edition of the Barcamp Belfast conference I gave a 20-minute talk about the mistakes some web developers make when it comes to building search-engine-friendly websites. Much of the content in that talk came from a blog post I wrote for State of Search on the same topic: SEO for Web Developers. Below are the slides:

SEO for Web Developers
- Learning SEO
Learning how to be a search engine optimiser is not a particularly straightforward process. There's no college education you can follow to become an SEO. There are precious few SEO training courses available, and most of them aren't worth the money. So what is the best way to learn SEO and become a specialist in the field?

Well, you could buy some books and dig into them. A recently published tome called The Art of SEO is pretty solid, as is the SEO: An Hour a Day book. But these books will only give you the basics. Since many SEO problems and challenges are specific to certain (types of) websites, these books will give you a starting point but no more than that.

Reading blogs, articles, and white papers about SEO is also encouraged, though with one caveat: don't get carried away by reports on the latest hypes and "must-do's" in SEO. These are nearly always blown out of proportion, and when chasing the latest hypes it's easy to lose track of the core essence of SEO.

SEO can of course be learned by simply doing it. Learn what works for other websites, then unleash it on your own sites. The problem is that this can sometimes backfire, and you end up harming your sites more than helping them. Still, learning through experience is absolutely essential.

So, what else is there to help you learn SEO and become a true specialist? Well, in my opinion there is no substitute for a good community. A community where you can ask questions about your specific SEO challenges and get helpful, straightforward answers. A community that shares its specialised knowledge freely and widely and isn't afraid to spill the occasional 'secret' that can really prove to be the difference between a good site and a great site. Such a community exists, and I happen to be a member of it: the SEO Training Dojo.

The Dojo boasts an impressive range of tools and resources that are useful for everyone, from newbie SEOs coming to grips with the complexities of the craft to experienced SEO gurus who have seen and done it all. Some examples of what the SEO Dojo contains:

– Knowledge Exchange forums
– Free SEO tools
– Discounts for paid SEO tools
– Weekly chat sessions
– SEO Beginners guide
– Ranking Factors guide
– Link Builders handbook
– and much, much more…

I'm very pleased to be able to offer a discount for my readers who want to join the SEO Training Dojo – use the "barryrocks" discount code when you sign up and you get 25% off any subscription plan! This discount code is only valid for the first 25 people who make use of it, so it pays to act quickly! I promise you it will be worth it – the very first chat session I attended earned back my whole annual subscription fee with the incredible amount of insight and tips that I gained from it. So if you're serious about becoming a true search engine optimisation specialist, the SEO Training Dojo is the place to hang out.
- The Importance Of Sitemaps
(This article was originally published on the Visual Script blog.)

Sitemaps are a crucial aspect of a successful website. First, let's make clear what we mean by a sitemap. There are two types of sitemaps: one meant for visitors of your website, and one for search engine spiders.

Sitemaps for Visitors

The first type of sitemap is probably very familiar to you. It's a webpage that shows an overview of all the content on a website. You can see an example here of our own Visual Script sitemap. This type of sitemap is very useful, as it allows your visitors to quickly find what they're looking for without having to go through your website's navigation. Especially for large websites, it's recommended to have a well-structured sitemap that is linked from every page on your site, for example in your website's footer.

Sitemaps for Search Engines

The second type of sitemap is a so-called XML sitemap. This type of sitemap is specifically intended for search engines, and it does roughly the same thing: it allows search engines to find all the content on your website quickly and easily. Why bother with an XML sitemap then, if it's the same as a normal sitemap? Because an XML sitemap allows you to include extra information about the content on your site, such as:

- when a webpage was last updated
- how often a webpage is usually updated
- what the priority of a webpage is relative to other pages on your site
- what type of content a webpage contains (text, video, etc)

An XML sitemap allows a search engine to quickly and efficiently index all the content on your website, making sure your site is fully spidered and all your content is part of a search engine's index. Google recommends that every site include an XML sitemap. (A minimal example is shown at the end of this post.)

You can create an XML sitemap manually, or you can have one generated automatically – ask your site's web developer about it, or look at Google's sitemap help pages here. For larger websites it's recommended to have the sitemap created automatically, so that whenever you create a new page or update an existing page your sitemap is automatically updated as well.

You can tell Google you have a sitemap by submitting it manually in Google's Webmaster Tools, or you can include a sitemap reference in your robots.txt file. The latter option is always recommended, as this way other search engines such as Bing can also find your sitemap.

Need help with sitemaps or other aspects of your website? Get in touch with us at Visual Script, an experienced Northern Ireland web design company that can help you with all aspects of your online adventure.
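As promised above, here is a minimal example of what an XML sitemap looks like, following the sitemaps.org protocol (the domain and values are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One url entry per page; loc is required, the rest is optional. -->
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2010-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

And the robots.txt reference mentioned above is a single line in your robots.txt file, pointing at wherever your sitemap lives:

```
Sitemap: http://www.example.com/sitemap.xml
```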
- SEO is not dead
We didn't have to wait long. Within minutes of Google's announcement of their latest feature, Google Instant, the blogosphere was abuzz with the news that this would really mean the death of SEO. Naturally this was a total fabrication. People who understand search engines, SEO, and user behaviour realised that this made SEO all the more important. At most SEO would have to shift its focus somewhat, but ranking high for popular search terms has only become more vital for any online business.

Since we SEO professionals have had to defend the existence of SEO for years, I decided to build a small site dedicated to ending the 'SEO is dead' argument once and for all. Inspired by Mark Brownlow's 'Email is not dead' site, I launched SEO is not dead. The site contains a number of statistics on internet and search engine use, links to and quotes from industry specialists talking about the life of SEO, and videos and quotes from senior people working for major search engines. So the next time you hear someone proclaim the death of SEO, send them here.
- Guest Appearances on State of Search Radio
I was invited to join presenters Bas van den Beld and Roy Huiskes on their regular State of Search radio show on WebmasterRadio.fm, and apparently they liked me, as I was invited straight back for their next episode!

In my first guest appearance we chatted about SEO in 2010 and 2011, the Goldman Sachs investment in Facebook, and the potential of a new dotcom bubble. In the second show we discussed Google and their stance towards affiliate marketing, as well as some odd google.com pages popping up in search results. You can listen to the shows and download them as mp3 files here:

State of Search radio – ep39: Bubble coming up, 2010 and the future
State of Search radio – ep40: Google testings, Google the affiliate and Twitter transparency