Net66 SEO: Google going after more link Networks.

What a week so far. Is it really only Wednesday? We’ve had confirmation from Matt Cutts himself that Penguin is only a few weeks away, prompting a raft of people to ask about the best way to vet their previous, maybe not-so-squeaky-clean, link building techniques. Naturally we have a lot of advice for people and are willing to help out, even offering a guide on how you can prepare for Penguin 2.0.

There was a brief respite when we all had a quick time out to play the Atari classic Breakout, but now we have even more news from Google and that now great source of information, Matt Cutts. Once again it seems like Google are tightening the noose on link networks. Their previous high-profile target was the SAPE network, which got penalised and vanished from the search engines. The knock-on effect would have been bad for a lot of sites, which would have lost the value of any links from that network. And from losing that many links at once, their rankings would have dropped drastically.

This news has come by way of Matt Cutts himself once again. He’s posted a few tweets recently outlining his and Google’s desire to really punish sites that are benefitting from spam, and they’re definitely cranking up the pressure on link networks. Look at the tweets below to see what has been said:

[Screenshot: Matt Cutts’ tweets about taking action on link spam]

So without specifying which link network has been hit, Matt Cutts has implied that Google have “took action on several thousand linksellers”, which has to amount to a link network. I’ve still not heard anything as to which network has been hit, but I’m sure over the coming days all will be revealed.

Which Link Network do you think has been hit?

Blog Post by: Greg McVey.

Net66 SEO : Penguin 4 (2.0) still “A few weeks away” – Matt Cutts

Recently it seems that all the talk on the web from Manchester SEO forums, blogs and other news stories has been about recent fluctuations in rankings and traffic. Rumours abound that this was the start of Penguin 4, but Matt Cutts himself has waded into the confusion to clear a few things up. First of all, he has categorically stated that there was no Penguin update last week, as you can see from the tweet below:

[Screenshot: Matt Cutts’ tweet confirming there was no Penguin update last week]

So what was it? More than likely it was Panda refreshing itself now that it is part of the algorithm. So great! There’s still time to prepare for Penguin. But our Matthew went further than that, definitively ruling out a Penguin update by announcing that Penguin 2.0 is still “a few weeks away”.

[Screenshot: Matt Cutts’ tweet that Penguin 2.0 is still a few weeks away]

Penguin 2.0? But we’re currently on Penguin 3. Surely the next update has to be Penguin 4, can Google not count? What Google have done is subtle: they’ve announced that this is going to be a big Penguin, PenguinZilla in fact. Maybe not that big, but Google have confirmed that this will be the biggest Penguin update yet, as internally it’s being referred to as Penguin 2.0. The previous Penguin updates have subsequently been renumbered to:

> Penguin 1 is now Penguin 1.0
> Penguin 2 is now Penguin 1.1
> Penguin 3 is now Penguin 1.2
> And Penguin 4 is now Penguin 2.0

A big step up indeed. However, there are still issues with this, especially when it’s left to Google to decide when an update is big enough to progress from 1.2 to 2.0. For example, look at the Panda updates released last year, which had a few occasional big refreshes but then a long period of small refreshes that eventually led to there being a Panda 3.92. This got too much for everyone, who then started re-numbering all updates chronologically regardless of size.

We’ll be calling the next update Penguin 2.0, but the crux of the matter here is: Penguin is on its way, and it is going to be big. So make sure you’re prepared and start counting your links.

Blog Post by: Greg McVey

Net66 SEO – Was that a Google Update?

I’ve heard and read a lot recently about fluctuations in a lot of different areas, particularly concerning rankings and traffic. Naturally everyone assumed that there was another Google update which had caused them. But after being questioned several times, Google responded with its default answer:

“We have nothing to announce at this time. We make over 500 changes to our algorithms a year, so there will always be fluctuations in our rankings in addition to normal crawling and indexing.”

Hardly helpful, but with the number of people reporting the fluctuations there has to be something going on, surely?! Well, there isn’t a definite yes or no. And I apologise for calling you Shirley.

In the past, Google has been very transparent when dealing with certain updates. Remember the EMD update? Exact Match Domains (EMDs) are used when people want to rank for a keyword or phrase and buy a domain name with those keywords in it. For example, if you wanted to rank for “Manchester SEO Company” you would want a domain like http://manchesterseocompany.com/. As the keywords are in the domain, Google would deem the domain relevant for the keywords searched and thus rank it higher in the SERPs (Search Engine Results Pages) than it would if it didn’t have an EMD. This is quite clearly unfair, especially considering that the people who bought the EMDs could have an inferior site offering an inferior service.

So, Google released an update that refused to count keywords in the domain towards rankings, in an effort to be fair. However, there are still arguments about how effective this really was, as people have claimed to see rises in rankings for EMDs afterwards.

Google has also been clear about Penguin and Panda updates in the past, which lends weight to the argument that Google haven’t released an update; otherwise they would have mentioned it.

You also need to take into consideration that last month Google announced that they would no longer be confirming Panda updates and were integrating Panda into the algorithm itself. So have they released a new algorithm with Panda built into it? Or have they run a quick Panda refresh in addition to the latest changes to the algorithm?

We can spend days and weeks debating it and never know for sure. The only facts we have at the moment are that a) people are reporting fluctuations in rankings and traffic b) Google aren’t confirming anything.

So what do you think?

Blog Post by: Greg McVey

How Do I Add Google Authorship to my Blog?

Yesterday I talked about “What is Google Authorship?”. At the moment the true power of Google Authorship has yet to be revealed, however there are quite a few rumours around the SEOsphere that this could be very beneficial to SEO. The reasons, as I explained yesterday, stem from the fact that there is now a boom of content on the web, driven by Google’s push to encourage natural links through great content.

As with all things SEO, there were the companies who did this properly, creating properly structured organic content marketing campaigns. And there were those who scraped together small paragraphs of keyword-rich text and flung it out onto the web, flooding the internet with chaff. Google Authorship is meant to help with this by tying verified Google+ accounts to the blogs they write for.

For example, if you search “What Happens if a Penalised site links to me?” You will see the following:

[Screenshot: search result for “What happens if a penalised site links to me?” showing the author’s profile photo via authorship markup]

So how do you get this showing up on your Blog? Pay attention and you’ll have no trouble doing this by the end of this post. Follow the steps below:

> First of all, you need to sign up to Google+ and get yourself a Google Plus Profile, like this one. Ensure you have a recognisable and clear profile picture and that your profile is fully filled out.

> Ensure you actually include a byline or a short author bio in the content you’re publishing, e.g. “Blog Post by Greg McVey”. The name you include on the content you’re publishing has to match the name on your Google+ profile.

> Finally, make sure you have an email address that ends with the domain you’re publishing the content on. So for me I would need greg@net66.co.uk, as I’m publishing to https://netsixtysix.co.uk/, which is what I have. Then you need to use this tool to verify that you own that address; it’ll send you a verification email and you’ll need to click a link within it. This then adds your @domain.co.uk address to your profile, as well as a link to the domain your email address is on.

Voila! You have now verified that you are a content publisher for the domain you’ve just registered and that you want to link your Google+ profile to it. Excellent!

So how do I pen individual pieces of content?

If it’s a one off piece, you can use the following:

Blog Post by <a href="https://plus.google.com/[enter your google+ account number]?rel=author">[Your Name Here]</a>

However, a lot of people are the sole contributors to their blog, and painstakingly going through each page individually is not productive. So how can you do this sitewide? Luckily, it is a simple matter of including one link in the header of your website. If you have a static HTML website then unfortunately, as with all sitewide changes, you will have to change each HTML file individually. However, most blogs are WordPress these days anyway, so for most it is a simple fix.

The code is entered after the first <head> tag and before the closing </head> tag. The code is as follows:

<link rel="author" href="https://plus.google.com/[enter your google+ account number]">

Entered into your header this will associate one author with every page on your website.
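To illustrate where that line sits, here’s a minimal sketch of a finished header. The profile number shown is a made-up placeholder; you would substitute your own Google+ account number:

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Blog</title>
  <!-- Authorship link: associates every page of the site with one
       Google+ profile. The long number below is a hypothetical
       placeholder - use your own account number. -->
  <link rel="author" href="https://plus.google.com/112233445566778899000">
</head>
<body>
  <!-- page content -->
</body>
</html>
```

On WordPress, this line would typically go in your theme’s header.php, inside the head section, so it appears on every page automatically.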

And there you go. A simple guide on how to enable Google Authorship on your Blog.

Blog Post by : Greg McVey

What is Google Authorship?

Google Authorship has been around for a while now and is rumoured to be good for SEO. Whilst not everyone is using it, it’s definitely gaining traction and becoming more and more popular. The industry buzz is that this is going to be the biggest thing since sliced PageRank, or indeed sliced bread.

So What is It?

Due to its still relatively low popularity, I do get quite a few people asking me about this. Authorship is adding an “Author” to any content published on your website or your blog. It just so happens that the Author Bio is a Google+ profile. You can see mine here.

The reason behind Google’s move towards authorship comes from their extra scrutiny on link building. Google’s recent shift in policy has led to them becoming much more tenacious and pedantic when inspecting links. If it gets even the merest whiff of suspicion that you may have purchased a high-PR link that is also rel="dofollow" in order to manipulate search engine listings, then it will devalue that link.

With this new tactic fully employed, one of the best ways to garner inbound links is to use link bait so people organically link to your website. With this approach the internet figuratively exploded with content. So how do you set your content apart from the rest? Google Authorship helps set you apart from the crowd.

I’ll be covering, in depth, all the different ways to add authorship to your blogs tomorrow but for now, here’s the code to quickly add this authorship to your Blog:

Blog Post by : <a href="https://plus.google.com/[enter your google+ account number]?rel=author">[Your Name Here]</a>

Simple as that, but of course you have to go through the Google Authorship verification process first, which will be covered tomorrow in depth.

I hope to see you tomorrow.

Greg McVey

 

Why does SEO Take So Long?

Frustration, I would say, is the word most associated with this question. Whilst it is a valid question, there are many answers to it that could lead to a whole series of blog posts on the matter of SEO. However, here are some answers to help you understand.

Analyzing Competition

First of all, if you’re just starting your SEO then you can bet the shirt on your back that there are already competitors ranking for the phrases that you also wish to rank for. This in itself is a disadvantage, as your competitors could have quite a head start in the race for #1 listings. And as you’re just setting off, far behind your competition, it can take some time to overtake them.

Link Building Authority

Search engines, and Google in particular, look at the quantity and quality of links to a website when determining where to rank it. Analysing your competitors’ backlinks can be an arduous task, as you have to find the truly quality links that are shared by all the top sites for the key phrases you wish to rank for. After this you then have to start the process of getting a link to your site from those same sources.

Overtaking Competitors

That’s just the start, as you then have to go out and find and build your own links to show Google why you deserve to be ranked above your competitors. There are quite a few ways to generate and build links to your site, including creating content, guest posts, blog posts and social sharing, to name but a few.
Then there’s the onsite structure of your website that needs to be streamlined for SEO. For example, you need to ensure each image file has a relevant title attribute and alt attribute, and each keyphrase has a page dedicated to it. Page titles, headers and more all have to be fine-tuned to ensure the best results.
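As a small illustration of those onsite points, here is how a descriptive page title and image markup might look. The filename and wording below are made-up examples, not taken from a real site:

```html
<head>
  <!-- A page title targeting one keyphrase -->
  <title>Manchester SEO Company | Net66</title>
</head>
<body>
  <!-- alt and title attributes describe the image to search engines,
       and the alt text is shown if the image cannot be displayed -->
  <img src="office-manchester.jpg"
       alt="The Net66 SEO team at their Manchester office"
       title="Net66 Manchester office">
</body>
```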

Too much too fast = bad

Back in the day it was literally a case of “whoever has the most keywords wins”. Then Google evolved to a philosophy of “whoever has the most links wins”. Naturally this left Google’s rankings wide open to manipulation, where a spammer could just outsource link building and generate 500 links overnight to rocket up the rankings. If someone overtook them, they could repeat the procedure. Because of this, and the advancement of Google’s algorithm, it can now detect a lot of links generated at once and apply punishments to sites it believes have violated its guidelines.

So all in all, there are quite a few reasons that SEO can take so long.

Blog Post by: Greg McVey

Is “not provided” affecting Google’s Search traffic to news sites?

Now, first of all, if you don’t know what the “not provided” result in Google Analytics means: it appears when people search while signed into their Google account. When you are signed in, your search runs over a secure connection encrypted with SSL; you will notice this in your address bar as it changes from http:// to https://. Because the session is encrypted, the search term is not passed on to the site you visit, so Analytics cannot report the keyword.

I saw a statistic on the internet showing that the news site Buzzfeed investigated the traffic to its own website and 200 websites within its network, and organic traffic had declined since September last year. Big websites including Fark and The Huffington Post are seeing decreases in traffic. So the people at Buzzfeed now have two theories: are users decreasing their visits to these websites, or is the data reported by Google not giving them accurate results because of “not provided”?

We are all still a bit clueless about the “not provided” result, which many people have branded a “dark” side of Google. Google hit back at these claims, saying they did it mainly for the privacy and security of your account. In my personal opinion, as someone who works in SEO in Manchester, I want the most accurate data possible on my websites, but I can also understand that Google has to keep its product and its clients’ data secure, so it is a bit frustrating both ways.

At the end of the day, Google is a business with a product (Gmail), and their job is to protect their clientele from spam and hacking, which is why we see the “not provided” result. Ask yourself this question: is it really worth Google risking all the private data of their client base just to satisfy a handful of them?

Net66 SEO : Google Vs Bing – Head to Head

Like Coca-Cola and Pepsi, there is always going to be a comparison between Google and Bing. But which one of them works better? The answer is that there is no answer: individual results are always going to be tailored by search history, location and whatever other secrets are included in the search engine algorithm that makes personalised web searches personal. So what can we do if everything’s subjective? We shall run a case study!

Recently CERN, who basically created the origins of what we now call the World Wide Web, decided to host a copy of the first ever website back on its original domain. It had been hosted elsewhere since then by the W3C. You can view the very first website ever at its original address here: http://info.cern.ch/hypertext/WWW/TheProject.html

But how do Google and Bing manage search requests for the first ever website? We’ll test that now by entering “First Website Ever” into Google and Bing respectively. We’ll add a control to this little experiment and start by searching for the URL first. See the results below:

Google Returns: The first website ever at its original URL. Followed by CERN’s modern website. Followed by two more results from CERN.

Bing Returns: The first website ever at its original URL. Followed by CERN’s modern website. Followed by a few news articles and then a wiki.

So the first two results are exactly the same. No real surprise there as we were using a very specific search term. But it’s interesting to see how Google continues to list CERN’s website and Bing chose to list news items related to the search term. Which do you prefer?

So now let’s really put it to the test and search for “First Website Ever”. See the results below:

Google Returns: The first website ever. Followed by CERN’s website. Followed by news relating to the first ever website.

Bing Returns: A copy of the first ever website hosted on W3.org. Then followed by CERN’s modern website.

A clear win for Google then. But like I’ve mentioned above, there are a lot of factors that affect rankings and localisation is definitely one of them. So give this little test a go and let me know how you get on.

Blog By: Greg McVey

Net66: SEO Vs Web Design

Facebook Vs Twitter. Google Vs Bing. Manchester United Vs Manchester City (hardly a fair fight, I know). And even the Star Wars prequels Vs the Star Wars sequels.

There’s conflict everywhere but one we see every single day is Search Engine Optimisation VS Web Design. I’ll first explain why there is such a divide between the two.

The Viewpoint of an SEO

Functionality > Presentation. In an ideal world, a site that’s ready to be optimised is content-rich throughout. There is no Flash, and there is at least one image per page. Everything can be controlled by one plugin where you manage all the meta data and rich snippets. Everything the website needs to rank for has its own section, and there’s a blog which is updated daily with quality content. The reality, however, varies wildly from this idealistic view. There are still sites written in static .html, which makes it hard to create anything dynamic. Coding can be erratic, and some sites are hacked together such that editing a single line of code can knock the whole website out of sync. You can still get websites written in .aspx, a Microsoft technology, and whilst some people are still advocates of it, they are finding their numbers dwindling.

The Viewpoint of a Web Designer

Presentation > Functionality.  To any web designer worth their salt, a website HAS to have that extra special something that makes it stand out in front of all the other competitors websites. In an ideal world for a Web Designer, websites in search engines would rank according to how good they look regardless of their Meta titles or how many links there are to them (although from an SEO point of view a beautiful site generates links organically with friends sharing sites saying “Check this beautiful site”). Content to a Web designer would be irrelevant as surely how a site looks is the only thing that matters.

So who’s right and who’s wrong? As I mentioned, the above is a snapshot of the viewpoints of a web designer and an SEO; it’s entirely subjective. There is no right or wrong. However, I will say this. In each industry, as with all industries, you get the good and the bad. There will be companies out there who are after a quick buck and throw together a pretty decent-looking website that is not easily optimised or edited, much to the irritation of the designated SEO technician, who will complain wholeheartedly about web designers.

On the other side of this coin however there will be a web designer who has had one of their lovingly created websites ruined by some amateur “SEO Technician” who keyword stuffs, buys hundreds of bad links and then disappears at the first sign of a Google Webmaster warning.

So in the end it doesn’t boil down to SEO Vs Web Design, it comes down to people complaining about people who haven’t done their job properly.

Blog Post by: Greg McVey

 

Net66 SEO: How to make UX easier.

There are a few things that can really make a difference to the SEO of your website. In search engine optimisation now, there are things you can do to promote social sharing on your website and make it easier to generate organic links.

Remove extra Steps

If you have a step in a process that you like but that isn’t necessary, REMOVE IT. One extra step can be the difference between earning and losing a potential client or link. You need to make sure that the user on your site is having a great time, and as user engagement is linked with SEO, this definitely needs evaluating. For example, if you’re setting up a sign-up form for your blog, you want to approach it as “How can I make the experience great for the user?” rather than “How much information can I get from this user?”. Some websites require your first and last name, your mobile number, your address, your email address and the creation of an account on their website. “I only want to sign up to new blog posts, why do they want so much information? I’ll just check back with their website from time to time, it’s easier.” And that’s how you lose a potential long-term user of your site.

Keep Users Logged in.

If you do have a user interface where people log in and become part of an online community, then a big part of keeping user engagement high is to keep the user logged in. With a raft of social media accounts, email accounts and more, I sometimes have a bit of trouble remembering them all. If I visit a casual community, find myself logged out and can’t remember my password, I’m more likely to leave for a community where I’m already signed in. I know this seems lazy in one of the highest degrees, but it is an example that rings true, especially on mobile devices. Think about it: if you’re on your smartphone and have to sign in to the Facebook app every time you want to check it, it’ll get to the point where you won’t want to check Facebook due to the hassle of signing in. This is the same as logging in to any website.

Finally, Make it easier to socially share.

You can see from the bottom of this post we have the options to share automatically to the most popular social networks. Simple, Easy, Clean.

Do you have any tips of your own for increasing User Engagement?

Blog Post by: Greg McVey.

Net66 Blog – Penguin 4 – What you can do to Prepare

A great many SEO companies in Manchester, the UK and the whole world have been affected in some way by 2012’s “Year of the Penguin”. And that was just the starter. Investigating quite a few companies who have had link building conducted by an outsourced “Guaranteed Link Building” company leads me to believe that there are highly likely to be many more this year.

So what can you do to prepare for this penguin-themed pretext for penalisation? The answer is: a lot.

I’ve reported on Interflora getting penalised and bouncing back quickly, so there is a way to be prepared for this. Here are some tips I believe will help:

> VET YOUR LINKS. Seriously. Right now you need to be very aware of what sites are linking to your website. Group them into low risk, medium risk and high risk. I’d even go so far as to use Google’s Disavow tool to make sure the high-risk sites linking to your site aren’t counted against you. This way you’d pre-empt any penalisation. Medium-risk links are also ones to segregate, in case you need to disavow those too.

> Re-evaluate your definition of a link: The phrase “a link is a link” has been bandied about for far too long. This is wildly inaccurate and has been for some time. It’s not just a race for the number of links with [Keyword] in the anchor text; that is now actually bad for your site. If you have a lot of low-quality links with keyword-stuffed text, that isn’t natural; Google will see this and apply a penalty. Create differing anchor text keywords and expand the sources from which you’re getting these links. Use long-tail keywords and keyword-rich anchor text sparingly whilst mixing in generic “Click Here” links, branded links and quite simply your URL.
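To give an idea of the disavow step above: the disavow file you upload through Google’s Disavow tool is a plain UTF-8 text file with one URL or domain per line, where a `domain:` prefix disavows a whole domain and lines starting with # are comments. The domains below are made-up examples, not real sites:

```text
# High-risk link sources identified in our link audit (example domains)

# Disavow every link from an entire domain:
domain:spammy-directory-example.com
domain:paid-links-example.net

# Disavow a single offending page only:
http://low-quality-blog-example.org/seo-links-page.html
```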

These tips should help you weather the Penguin Storm if it hits you.

Blog Post by: Greg McVey

Google is Dropping Instant Preview

Google have recently announced that they have dropped the Instant Previews feature from their search engine, with immediate effect.

If you are not familiar with Instant Previews: next to each listing you could hover over an arrow and it would display a “preview” of the website that the link points to.

It also gave information such as the date of the last cache. This is still available, but via a drop-down link at the end of the green URL text.

Why has the instant preview feature been abandoned by Google?

In short, Google have concluded that they will no longer include this feature on the search engine because of very low usage. The feature launched at the end of 2010, and since then they have monitored and tracked its usage, coming to the conclusion that it is no longer a benefit used by the majority.

They have also stated that their intention is to make the page more “streamlined”, and this will help with that.

Why Do Google Keep Making Changes?

It’s a good question really. Often Google makes changes to their search engine, and to their Analytics or AdWords accounts, much to the annoyance of the user. Most people get used to the way a system works and then Google changes it, so they have to almost re-learn it all over again.

From Google’s perspective, however, they are constantly evolving and changing what they do to improve it, enhance it and make it better.

So although it can be a pain having to readjust to the beat Google sets, in the long run it is in the best interest of the user (generally speaking), and they go off data which dictates certain decisions, just as they have with Instant Previews.

Google Slaps Mozilla with Manual Spam Penalty

Interflora, the BBC and now Mozilla? Google have been relentless in their pursuit of spam recently, a campaign leaving waves of devastation all over the SEO world. A lot of people think Google are being too picky, as a lot of sites claim to have been penalised but have no record of ever knowingly conducting any black hat SEO. Predominantly these have been algorithmic penalties dished out by Google’s web crawler, but every now and then you hear about a manual penalty too.

Mozilla is the latest high-profile victim in this crusade against webspam. Not through unethical backlinking, cloaking or any other black hat SEO, but through user-generated spam. Mozilla received this message:

Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles.

A bit harsh, then, that Google would punish a site that has been taken advantage of by its users. However, Google issued a granular manual penalty: not penalising the site overall, but singling out the page that was generating the spam and penalising that. Good news, right? Wrong, as it left Mozilla completely in the dark about which page was penalised.

It’s a theme we’ve seen a lot after the latest Penguin refresh, which generated thousands of “Unnatural Link Warnings”. You can see an example below:


Dear site owner or webmaster of ….
We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.

Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.

If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,

Google Search Quality Team

A fair warning, you would think, but in the world of SEO link building is a massive part of the optimisation of a website. So if you’ve built a website, had it optimised and had a lot of links built, and then you get the warning, it’s quite annoying that Google have told you your site has unnatural links and may be penalised, but NOT told you which links in particular are offending.

This has led to a lot of people getting out the scythe for their links when really all they needed was a pair of tweezers. In Mozilla’s case, though, Google took the preliminary step of removing the offending result from their index. This annoyed Mozilla, as their site is huge and the quickest way to find this spam would have been searching for it through Google. But as Google have removed the page, it can’t be found to be fixed.

What do you think of Google’s Webmaster Warnings?

Blog Post by: Greg McVey

Net66 Blog: Manage Time on SEO better with these Tips

We all want time to do everything in our day, including the full SEO circle of link building, content creation and social media. But you can spend hours and hours poring through hundreds and hundreds of potential link opportunities looking for that one needle in the haystack, leaving you little time to generate content and get it published across the multitude of social networks and bookmarking sites that can benefit it.

Here are some tips to help you better manage your time and get the most out of your day.

Guarantee One Link A Day – Within a Limit: I know this sounds oxymoronic, but if you set out with the intention to create one quality link every day within the first two hours of the day, it motivates you to go out and get that link; but at the same time, if you exhaust every avenue and can’t find anything beneficial to your site, you’ll reach the time limit and have to move on to something else. On the other side of this coin, if you manage to create a fantastic link in the first 20 minutes, then you’ve got a lot more time to chase other great links without the pressure on you, which, in some cases at least, can really help the link builder in question.

Set aside time for Content Creation: You don’t have time to write, you’re busy! I understand; I’ve had days where I’ve chased my tail so much I started to wonder why I had a tail. So set a period where you can settle in to writing. It doesn’t have to be long; 30-45 minutes is sufficient. Even if you’ve got writer’s block, this time can be spent generating ideas for further blog posts or blocks of content. You could spend 25 minutes turning your brain into a paperweight and then in the 26th minute have an epiphany about a week’s worth of content. Time spent away from the pressure cooker of day-to-day tasks can help free your mind and open it to content creation.

So give this a go. Make time for content. Set a link building limit. And make sure you work smarter, not harder.

Blog Post by: Greg McVey

Net66 SEO Blog: “Google Now” to Arrive on Google’s Home Page?

“Google What?” the cries ring out, but fear not, there’s an explanation. In layman’s terms, it’s Google’s rival to Siri: an application for Android smartphones that uses your location, local time settings and other signals to give you the information you need, when you need it.

As indicated by its title, Google Now is all about the present. It works by offering a series of “Cards” that you can customise around your life. For example:

The Traffic Card: As Google has an extensive Maps feature, it can monitor traffic to and from certain places. Say you log on to your home WiFi and your work’s WiFi every day: you log off your home WiFi at around 08:30 am and on to your work’s WiFi at 08:50 am. Google Now will recognise this as your commute to work and prompt you to save “Work” as a location. Then, every day at around 08:25 am, your Traffic Card will alert you to how bad traffic might be on the route between home and work. A nifty little tool.

You can also get cards for the Weather, Sports, Time Zones, Currency exchange rates and, one of my favourites, the Calendar. You can hook up the calendar on your mobile phone to Google Now so that you never forget an appointment: you get alerts reminding you of the day’s appointments, and you’re always a step ahead because you’re aware of upcoming events too.
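The commute detection described above boils down to spotting a recurring pattern in WiFi log-on and log-off times. Google’s actual implementation isn’t public, but a minimal sketch of that kind of inference might look like this (the `infer_commute` function and its inputs are hypothetical, purely for illustration):

```python
from datetime import datetime, timedelta

def infer_commute(sessions, tolerance=timedelta(minutes=20)):
    """Illustrative only: guess a recurring commute from daily WiFi logs.

    `sessions` is a list of (home_logoff, work_logon) datetime pairs,
    one per day. If the home log-off times cluster within `tolerance`
    of each other across days, treat the pattern as a commute and
    return the average departure time; otherwise return None.
    """
    departures = [home for home, _work in sessions]
    minutes = [d.hour * 60 + d.minute for d in departures]
    spread = max(minutes) - min(minutes)
    if spread > tolerance.total_seconds() / 60:
        return None  # too irregular to be a routine commute
    avg = sum(minutes) // len(minutes)
    return f"{avg // 60:02d}:{avg % 60:02d}"

# Three weekdays of roughly 08:30 departures: a stable commute.
logs = [
    (datetime(2013, 5, 6, 8, 28), datetime(2013, 5, 6, 8, 50)),
    (datetime(2013, 5, 7, 8, 33), datetime(2013, 5, 7, 8, 55)),
    (datetime(2013, 5, 8, 8, 30), datetime(2013, 5, 8, 8, 52)),
]
print(infer_commute(logs))  # "08:30"
```

The point of the sketch is the design idea: a card only fires once the pattern is stable, which is why Google Now prompts you to confirm “Work” rather than guessing from a single day.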

But how will this fit on to the home page of Google.co.uk? A desktop isn’t a mobile phone, so it can’t guess your commute between home and work. The answer will lie with a Gmail account. Not only that, but with the recent news that Google is discontinuing iGoogle (a previous home dashboard for Google), Google Now could already have been earmarked as its replacement.

Sign in to your Google account and, bingo, Google Now will display the Weather, Time Zones, Currency and other cards automatically based on the browser settings of the computer you’re using to access google.co.uk. The Traffic Card may need some manual configuration to assign your work and home locations, but after that, one quick log in at home before you set off to work and all your information is there. Happy days.

Blog Post by: Greg McVey writing for Net66

How to get your Website Flagged – Not in the Good Way

Getting your website flagged could make your business stand out a bit more and hopefully attract new visitors, new potential clients and new business. However, the one entity you don’t want your website flagged to is Google, mainly for SEO reasons. If your site does get flagged, it could put you in front of Google’s web spam team, who will go through your site with a fine-tooth comb and look to restrict your progress wherever they can. Not out of any malicious intent, but Google pride themselves on having results that are not only accurate but also provide a great user experience.

One of the things that could get your website flagged is adding a lot of content at once. To rank on the first page of Google’s organic listings, you have to have an organic site. Organic sites grow and develop gradually, and if one were to suddenly bloom from 50 pages to 9,000, alarm bells would ring out. Let me give you an example.

You’re a company that launched a website last year and you are now ranking quite well for your phrases. One of your diligent staff members finds an archive of content, nearly 2,000 pages’ worth. All the content is relevant, good quality and can only improve your users’ experience. Your instinct would be to get it all on your website as soon as possible to boost your rankings. But if you add a lot of content at once, Google could be prompted to investigate.

This isn’t necessarily a bad thing, especially if your site has only ever employed White Hat tactics. A member of Google’s webspam team will take a look at your site to make sure the content you’re adding is quality, relevant and not intended to manipulate the rankings.

In an ideal world, you would break the content into chunks and upload it gradually. I don’t expect everyone to do this, as it can be impractical and take up a fair amount of time, what with the planning, the breaking down, the categorising and so on. But if you want to do it properly, that is the way to do it.
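The planning part of that chunked approach is easy to automate. Here is a minimal sketch (the function name, batch size and cadence are my own assumptions, not anything prescribed by Google) that splits a large archive into batches and assigns each one a staggered publish date:

```python
from datetime import date, timedelta

def schedule_in_batches(pages, batch_size=50, days_between=7, start=None):
    """Split a large list of pages into batches and give each batch a
    staggered publish date, so the content goes live gradually rather
    than flooding the site in one go. Illustrative sketch only."""
    start = start or date.today()
    schedule = []
    for i in range(0, len(pages), batch_size):
        batch = pages[i:i + batch_size]
        publish_on = start + timedelta(days=(i // batch_size) * days_between)
        schedule.append((publish_on, batch))
    return schedule

# Example: a 2,000-page archive released 50 pages at a time, weekly.
pages = [f"archive-page-{n}.html" for n in range(1, 2001)]
plan = schedule_in_batches(pages, batch_size=50, days_between=7)
print(len(plan))  # 40 batches, i.e. roughly nine months of steady growth
```

Spreading 2,000 pages over 40 weekly batches is exactly the “organic growth” pattern the post describes; tune the batch size and cadence to whatever looks natural for your site.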

Blog Post by: Greg McVey

Is it Worth Taking Risks With SEO?

In my honest opinion, I would say yes. Risks, whilst they can go wrong, also have the potential to work out better than you could ever have hoped. But there’s a difference between taking risks and making informed decisions.

For example, an informed decision is going to Tesco and picking up a Value lasagne. You know there’s a chance of horse meat being in there, but because you’re aware of it, it’s an informed decision.

A risk would be going up to a burger van and ordering a burger. There’s a chance there is horse meat in there, but there is no way of knowing, and a high chance that your risk will pay off, i.e. the burger will be delicious.

Now, to make it more SEO Specific here are some risks that are worth taking:

> Hiring an inexperienced SEO writer: There’s a difference between ability and skill. You could spend money hiring a skilled SEO writer who can churn out all the posts you require, but it will cost you, as they have built up their skills and command a great CV. Hiring someone inexperienced in SEO writing, but with the natural ability to write great content, is well worth the risk. With support, guidance and training, that person can learn the skills that will turn their natural ability into the makings of a great SEO writer.

> Re-Designing your website: But the pages won’t match up, and it’ll confuse the search engine spiders, and everything will change, and it’s bad for SEO. On the plus side, user engagement could shoot through the roof. A brand new site that looks the bee’s knees, with a brilliant UI and UX (User Interface/Experience), can slash your bounce rate and improve your conversion rate. And surely that outweighs the possibility of a few weeks of poor rankings that are more than likely to return anyway.

Bear in mind this is my personal opinion: if you take these risks and they don’t pay off, you should be aware that they were a risk, not a bulletproof guarantee.

Blog Post by: Greg McVey writing for Net66

Why don’t People Like my Website?

There are many varied reasons why someone might dislike a website. It could stem from personal preference, as a lot of sites that appeal to many will offend a few.

For example, I quite like the colour blue. It’s calming, it’s professional, it’s the colour of the sky and the sea and everywhere in life. You can get it in all kinds of different shades, textures and more. But Jordan, who sits across the way from me, really likes the colour red. It’s passionate and driven, with a real go-getter attitude. So obviously Jordan will be more inclined to like red sites, where I would maybe prefer a blue site.

Because of this subjectivity, a recent study set out to produce data on users’ browsing habits and what does and doesn’t cause them to dislike a site. Here are the top three reasons why people would block a site from search results:

Too Many Advertisements: People don’t like being advertised to. Advert breaks between TV shows are often referred to as “tea breaks”, with even the presenters of television programmes advising you to pop the kettle on during them. So it follows that if you were searching for a cheap coffee table and found a website full not of coffee tables from that site, but of low-grade, high-density adverts for coffee tables from a variety of other sites, you wouldn’t be impressed.

Poor Quality Content: You’re looking for advice on building a shed. You run an internet search (or you Google it) and find not a website filled with the soundest advice on building a shed, but a loosely put together website that mentions “Garden”, “Shed” and “build” a lot. How annoyed would you be? Enough to block the site? Apparently so, with 60.6% of people in this study saying they would block a site with poor quality content.

Incorrect Result: “It’s good, but it’s not the one.” When we look for something online, search engines are constantly trying to second-guess us and produce the results we want faster than their rivals. But in some cases there’s an argument for “more haste, less speed”. Some queries bring up the wrong results, and even though a result may be similar, if it isn’t the right one it unfortunately earns a block from 47.2% of people.

So make sure you’re correcting these mistakes before they start to affect your website’s performance in the search engines.

Posted by: Greg McVey writing for Net66