Net66 SEO: Penguin 2.0 Rolled out and Confirmed by Google.

Recently I brought you news of Google saying that the new Penguin update was still a few weeks away. Less than two weeks later, however, they’ve rolled the new update out and confirmed it. PANIC! No, don’t. You should only be panicking if you or your SEO company have been engaging in link practices that don’t abide by Google’s Webmaster Guidelines.

So why Penguin 2.0 and not Penguin 4? It’s being referred to as Penguin 2.0 because changes have been made to the actual Penguin algorithm, rather than it being a regular data refresh. Such an update, in fact, that Google have said around 2.3% of English queries will be affected and that “regular users” will notice a difference. I know 2.3% seems a bit minimal, but with the number of websites online in England at the minute, that figure has to mean tens or hundreds of thousands of affected websites out there.

There’s still the question of the abnormal fluctuations in rankings and traffic a few weeks ago. So what was that? Google themselves say it wasn’t an update to the Penguin algorithm, so could it have been a bit of a data refresh before the new Penguin algorithm went live? Or was it just another Panda refresh that Google are no longer confirming? It’s unclear, as Google haven’t commented on this.

The good news about this update is that Penguin is directly targeting spam and black hat results in search engines. This should lead to more accurate results in search engines with people who have engaged in previous black hat link campaigns dropping from results. So if someone has always used black hat techniques to rank higher than you and you have thought this unfair, look who has the last laugh!

So, has your site been affected?

Blog Post by: Greg McVey

Net66 SEO: Basic Errors to Avoid with Search Engine Optimisation

We still see it on a day-to-day basis: companies and individuals out there making rookie mistakes when it comes to SEO. So here are some common themes to avoid:

> Slow Page Load: You may have 100Mbps fibre optic broadband, but other people out there may not be as lucky, or as into the internet, to get such fast speeds. You also have to think about what’s on your website. If you have audio and video that loads up on your home page and plays straight away, not only is that annoying (opinion), but it’s also a data hog. Hi-res images also fall into this category, especially now that mobiles are coming into play a lot more. So try to trim your website up a bit. If a website doesn’t load in a couple of seconds, users are likely to leave.

> No Social Media: Not everyone is on Facebook, Twitter, Google+, LinkedIn, Tumblr, Flickr, yada yada yada. But most people are on at least one of these, meaning that if a user has a positive experience with your site and wants to share it socially, you’re bang out of luck and you’ve just lost a potential client. It’s not only down to conversions; search engines also look at social indicators in their algorithms. If your website is fully geared towards the search engines, whilst it might rank okay, Google can’t see you being prominent socially, so it looks like users aren’t talking about your site and aren’t recommending you on social networks. And if real people are making no real mentions or suggestions, why should Google mention or suggest you on their search engine?

> Writing for Search Engines: Again and again and again we see this where keywords are thrown into the midst of prose haphazardly to improve “Keyword Density”. Keyword Density is dead. If you indicate what your content is about in your title, and then write about that content, job done. Google doesn’t care how many times you’ve got “Dog Groomer Wigan” in your page, it cares about how well structured that content is, how well it reads and how users respond to it.
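On the page-load point above, a couple of the usual culprits can be fixed directly in the markup. Here’s a minimal, hedged sketch; the filenames and business name are invented for illustration:

```html
<!-- Sketch of lightweight page habits; all asset names are hypothetical. -->
<html>
<head>
  <title>Dog Groomer Wigan | Example Groomers</title>
  <!-- defer non-critical JavaScript so it doesn't block page rendering -->
  <script src="analytics.js" defer></script>
</head>
<body>
  <!-- serve a compressed, appropriately sized image rather than a hi-res original;
       width/height let the browser lay out the page before the image arrives -->
  <img src="groomer-800w.jpg" alt="Dog groomer at work" width="800" height="533">
  <!-- don't autoplay media; preload nothing and let the user opt in -->
  <video src="intro.mp4" controls preload="none"></video>
</body>
</html>
```

None of this is prescribed by Google; it's simply one way to cut the data users have to download before the page becomes usable.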

A few tips there to help you out with your SEO. What common SEO Errors do you dislike the most?

Blog Post by: Greg McVey

 

Structure your Content With the Data Highlighter Tool

The term “content is king” is widely used by the webmasters of the SEO world; in all fairness it is becoming very generic, but it is still extremely important that it stays in your head. At the end of last year Google introduced a tool in Webmaster Tools which will help you structure your content more effectively, increasing its quality and therefore possibly increasing your website’s performance in the SERPs.

data highlighter tool

Not many people have really taken notice of this tool, so I thought it might be a good time to give them a little reminder, given the upcoming Penguin update. This little tool could be the difference in producing the kind of quality content Penguin will reward.

The Data Highlighter tool gives you the opportunity to structure the content within your website. Presenting your content in a way that is more understandable to the audience is something Google’s algorithm will take into consideration; the easier it is to read, the better it will score you. Simple?

For example, if you have a website which promotes events, you can use the Data Highlighter tool to tag important details for Google to present more attractively, such as the location, price and date: basically any crucial information which will bring you more traffic, potential business, etc. After Google has crawled your re-tagged page, it will then be available for rich snippets.

Please note that this tool does not work on pages which haven’t been cached and aren’t in Google’s index. To find the Data Highlighter tool, simply go to your Webmaster Tools, click on the “Optimisation” tab, and the “Data Highlighter” tool will be available from there.

A very clever thing included with this tool is that, after you’ve manually highlighted and tagged your content in a consistent format (mainly for events pages), the tool will adapt and start suggesting tags for you, helping to speed up your work. Clever Google, eh?

Before the introduction of the tool, webmasters had to learn the HTML code to add structured markup to their content, which was very time consuming. Not any more: this tool is simple and easy to use; all you need is your mouse to point and click on the data you wish to highlight.
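For comparison, the hand-written markup the tool saves you from looks something like this schema.org Event microdata. This is only a hedged sketch: the event name, date, venue and URL below are all invented for illustration:

```html
<!-- Sketch of schema.org Event microdata, the kind of structured markup the
     Data Highlighter approximates without you touching the HTML.
     All event details here are placeholders. -->
<div itemscope itemtype="http://schema.org/Event">
  <a itemprop="url" href="/events/summer-gig">
    <span itemprop="name">Summer Gig</span>
  </a>
  <meta itemprop="startDate" content="2013-07-20T19:30">20 July, 7:30pm
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Manchester Arena</span>
  </div>
</div>
```

The Data Highlighter produces the same effect for Google's crawler by letting you point and click instead of writing the itemprop attributes yourself.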

Blog Post by: Jordan Whitehead

Net66 SEO: Sprint Penalised by Google

Another one bites the dust. Google have once again penalised a “Big Name” in business for their website practices. Sprint, who are a global provider of data, voice and internet services, received news of their penalty via Webmaster Tools and instantly took to the Google Webmaster forums for help.

Before we all get carried away wondering what black hat methods a big company employed to strongarm their way to the top of the SERPs and ensure their dominance, the penalty Sprint received was one relating to user-generated spam, not to any practices Sprint themselves had put in place.

You could chastise Sprint, who are such a large company with quite a large amount of resources, for not preventing this from happening in the first place. But then you have to consider that Sprint are a large company with an incredibly large website. Could you manage nearly 5,820,000 pages without allowing a solitary one to escape a proper review? Quite a challenge then.

But Sprint are not alone in this; the BBC have had a run-in with Google and received the same penalty, and more recently Firefox fell foul of the same thing. The issue with user-generated spam is that Google will let you know there is spam coming from your website, but it won’t tell you where. With web pages in the millions for some of the penalised companies, you can imagine the enormity of the task at hand in finding the offending page.

Thankfully, though, it does seem that Google have found a way to penalise individual offending pages if the website itself is not spamming anywhere else. Whilst this is a positive, it can be quite difficult for smaller companies to find the time and resources to audit their whole site to fix what might be causing the penalty. They have to resort to doing what they can and hoping for the best.

Do you think Google should be more specific in the penalties it gives out?

Blog Post by: Greg McVey

Net66 SEO: OK Google

Google are once again making waves, and this time it’s with conversational search. This new technology is called “OK Google” and is set to revolutionise the way we search. It’s coming to mobiles, desktops and even Google Glass (although on the latter it’s called “OK Glass”). Google Glass constantly listens for the command “OK Glass” while you’re touching the device, and once it hears this phrase it knows to run a search for you. The idea is also about having a conversation with your search engine rather than running a search, running a new search, running a slight variation of your first search, etc. Here’s an example of how it’d go:

Me: Ok Google, who is #1 for SEO Manchester?

Google: Would return the search listings for the keyword “SEO Manchester”

Me: What about Web Design?

Google: Would return the search listings for “Web Design Manchester”

Me: What about London?

Google: Would return the search listings for “Web Design London”

You see the difference there? Rather than having to run three different searches for three different key phrases, OK Google remembers previous queries and alters the results according to the new questions asked. Personally I think that’s brilliant. Imagine checking the price of something at an online shop, then only having to mention the other shops to compare the price of whatever it is you’re getting. It’s going to save so much time for people.

There are still quite a few kinks to work out, though. For example, with the “OK Glass” command you have to be touching the Google Glass device you have on your head, so it’s not yet clear how that is going to pan out on desktops and mobile devices. But you can see that this is going to be big.

Will you be buying Google Glass?

Blog Post by: Greg McVey

Net66 SEO: Is it a bird? Is it a Plane? Is it Pinterest? No! It’s the new Google+

So this morning I arrived at the office nice and early. I had my porridge, I had my brew (coffee, white, one) and there was even a good song on the radio. I started cycling through the Social Media accounts checking for new follows, tweets, likes, posts, +1s etc. But when I got to Google+ I was hit by a wave of confusion and confronted with this:

google-plus-home

“But I’ve just checked Pinterest,” I thought. Clearly this coffee isn’t working. I loaded it up again and was greeted with the same layout. Then it hit me that this was actually Google Plus and not Pinterest. Incredible. Whilst not exactly a carbon copy of Pinterest, the design does seem to borrow heavily from the infinite-scroll, tiled panel design that was first introduced by the big P.

My first impression was that, if it’s a toss-up between this layout and the previous layout, they’re completely on par. I’m not championing the previous design, but neither am I shouting about this new design from the rooftops. However, after a couple of minutes of playing around with it, the news feed finally decided to respond to my browser and screen size and include a third column. I know it’s only a small thing, but it just seemed to make the whole thing sit together nicely.

Then there was the discovery of full width images spanning the width of the page and that really helped me warm towards the new news feed.

After my fill of the home page I went to my profile and was greeted by the biggest cover photo in the world. Check it out:

google-plus-profile

Pretty big. The rest of the profile panned out into all my latest updates spread over three columns. I’ve got to admit as well, the three-column layout is brilliant. You add your updates in the very top left box which, as humans read from top left to bottom right, is perfectly placed, as opposed to Facebook, where you’re made to add updates from the top of your news feed. A slight advantage to Google+, but an advantage nonetheless.

All in all I’d say Google has done a really great job with this. It looks like Google may have realised that it might never cater to everyone’s requirements for a Social Media platform, so it’s adapted to improve UX for the current clientele. After all Google+ is far more geared towards photo sharing than Facebook is.

Have you experienced the New Google+ yet? What do you think?

Blog Post by: Greg McVey

Net66 SEO: Google going after more link Networks.

What a week so far. Is it really only Wednesday? We’ve had confirmation from Matt Cutts himself that Penguin is only a few weeks away, prompting a raft of people to ask about the best way to vet their previous, maybe not so squeaky clean, link building techniques. Naturally we have a lot of advice for people and are willing to help out, even offering a Penguin guide on how you can prepare for Penguin 2.0.

There was a brief respite when we all had a quick time out to play the Atari classic Breakout, but now we have even more news from Google and that now great source of information, Matt Cutts. Once again it seems Google are tightening the noose on link networks. Their previous high-profile target was the SAPE network, which got penalised and vanished from the search engines. The knock-on effect would have been bad for a lot of sites, which would have lost the value of any links from that network. And obviously, from losing that many links at once, their rankings would have dropped drastically.

This news has come by way of Matt Cutts himself once again. He’s posted a few tweets recently outlining his and Google’s desire to really punish sites that are benefitting from spam, and they’re definitely cranking it up on the link networks. Look at the tweets below to see what has been said:

Matt-link-spam

So, without specifying which link network has been hit, Matt Cutts has implied that Google “took action on several thousand linksellers”, which has to amount to a link network. I’ve still not heard anything as to which network has been hit, but I’m sure over the coming days all will be revealed.

Which Link Network do you think has been hit?

Blog Post by: Greg McVey.

Net66 SEO : Penguin 4 (2.0) still “A few weeks away” – Matt Cutts

Recently it seems that all the talk on the web, from Manchester SEO forums, blogs and other news stories, has been about recent fluctuations in rankings and traffic. Rumours abound that this was the start of Penguin 4, but Matt Cutts himself has waded into the mess of confusion to clear a few things up. First of all, he has categorically stated that there was no Penguin update last week, as you can see from the Tweet below.

matt-cutts

So what was it? More than likely it was Panda refreshing itself now that it is part of the algorithm. So, GREAT! There’s still time to prepare for Penguin. But our Matthew went further than that, definitively ruling out a Penguin update by announcing that Penguin 2.0 is still “a few weeks away”.

matt2

Penguin 2.0? But we’re currently on Penguin 3. Surely the next update has to be Penguin 4; can Google not count? What Google have done is subtle: they’ve announced that this is going to be a big Penguin, PenguinZilla in fact. Maybe not that big, but Google have confirmed that this will be the biggest Penguin update yet, as internally it’s being referred to as Penguin 2.0. The rest of the Penguin updates have subsequently been renumbered to:

> Penguin 1 is now Penguin 1.0
> Penguin 2 is now Penguin 1.1
> Penguin 3 is now Penguin 1.2
> And Penguin 4 is now Penguin 2.0

A big step up indeed. However, there are still issues with this, especially when it’s left to Google to decide when an update is big enough to progress from 1.2 to 2.0. For example, look at the Panda updates released last year: there were a few occasional big refreshes, but then a long period of small refreshes that eventually led to there being a Panda 3.92. This got too much for everyone, who then started renumbering all updates chronologically regardless of size.

We’ll be calling the next update Penguin 2.0, but the crux of the matter here is: Penguin is on its way, and it is going to be big. So make sure you’re prepared and start counting your links.

Blog Post by: Greg McVey

Net66 SEO – Was that a Google Update?

I’ve heard and read a lot recently about fluctuations in a lot of different areas, particularly concerning rankings and traffic. Naturally, everyone assumed that there was another Google update which had caused them. But after being questioned several times, Google responded with its default answer:

“We have nothing to announce at this time. We make over 500 changes to our algorithms a year, so there will always be fluctuations in our rankings in addition to normal crawling and indexing.”

Hardly helpful but with the amount of people reporting the fluctuations there has to be something going on surely?! Well there isn’t a definite yes or no and I apologise for calling you Shirley.

Google has been very transparent when dealing with certain updates in the past. Remember the EMD update? Exact Match Domains (EMDs) are what people buy when they want to rank for a keyword or phrase: a domain name with the keywords in it. For example, if you wanted to rank for “Manchester SEO Company” you would want a domain like http://manchesterseocompany.com/. As the keywords are in the domain, Google would consider the domain relevant for the keywords searched and thus rank it higher in the SERPs (Search Engine Results Pages) than it would without an EMD. This is quite clearly unfair, especially considering that the people who bought the EMDs could have an inferior site offering an inferior service.

So, Google released an update that refused to count keywords in the domain towards rankings in an effort to be fair. However there are still arguments about how effective this really was as people have claimed to see rises in rankings for EMDs afterwards.

Google has also been clear about Penguin and Panda updates in the past. Which lends weight to the argument that Google haven’t released an update otherwise they would have mentioned it.

You also need to take into consideration that last month Google announced that they would no longer be confirming Panda updates and were integrating Panda into the algorithm itself. So have they released a new algorithm with Panda built into it? Or have they run a quick Panda refresh in addition to the latest changes to the algorithm?

We could spend days and weeks debating it and never know for sure. The only facts we have at the moment are that a) people are reporting fluctuations in rankings and traffic, and b) Google aren’t confirming anything.

So what do you think?

Blog Post by: Greg McVey

How Do I Add Google Authorship to my Blog?

Yesterday I talked about “What is Google Authorship”. At the moment the true power of Google Authorship has yet to be revealed, however there are quite a few rumours around the SEOsphere that it could be very beneficial to SEO. The reasons, as I explained yesterday, stem from the fact that there is now a boom of content on the web, driven by Google’s push to encourage natural links through great content.

As with all things SEO, there were the companies who did this properly, creating properly structured organic content marketing campaigns. And there were those who scraped together small paragraphs of keyword-rich text and flung them out onto the web, flooding the internet with chaff. Google Authorship is meant to help with this by tying verified Google+ accounts to the blogs they write for.

For example, if you search “What Happens if a Penalised site links to me?” You will see the following:

What happens if a penalised site links to me?

So how do you get this showing up on your Blog? Pay attention and you’ll have no trouble doing this by the end of this post. Follow the steps below:

> First of all, you need to sign up to Google+ and get yourself a Google Plus Profile, like this one. Ensure you have a recognisable and clear profile picture and that your profile is fully filled out.

> Ensure you actually include a line or a short author bio in the content you’re publishing, i.e. Blog Post by Greg McVey. The name you include on the content you’re publishing has to match the name on your Google+ profile.

> Finally, make sure you have an email address that ends with the domain you’re publishing the content on. So for me I would need greg@net66.co.uk, as I’m publishing to https://netsixtysix.co.uk/, which is what I have. Then you need to use This Tool to verify that you own that address; it’ll send you a verification email and you’ll need to click a link within it. This then adds your @domain.co.uk address to your profile, as well as a link to the domain your email address is on.

Voila! You have now verified that you are a content publisher for the domain you’ve just registered and that you want to link your Google+ profile to it. Excellent!

So how do I pen individual pieces of content?

If it’s a one off piece, you can use the following:

Blog Post by <a href="https://plus.google.com/[enter your google+ account number]?rel=author">[Your Name Here]</a>

However, a lot of people are the sole contributors to their blog, and painstakingly going through each page individually is not productive one bit. So how can you do this sitewide? Luckily, it is a simple matter of including one link in the header of your website. If you have a static HTML website then unfortunately, as with all sitewide changes, you will have to change each HTML file individually. However, most blogs are WordPress these days anyway, so for most it is a simple fix.

The code is entered after the opening <head> tag and before the closing </head> tag. The code is as follows:

<link rel="author" href="https://plus.google.com/[enter your google+ account number]">

Entered into your header this will associate one author with every page on your website.
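Putting the two snippets together, here is a minimal sketch of where each one lives in a page. The account number and name are placeholders, as above, and the page content is invented:

```html
<!-- Sketch: where the authorship markup sits in a page.
     The Google+ account number and author name are placeholders. -->
<html>
<head>
  <title>My Blog Post</title>
  <!-- sitewide author association: one line in the header -->
  <link rel="author" href="https://plus.google.com/[enter your google+ account number]">
</head>
<body>
  <p>Post content goes here...</p>
  <!-- per-post byline alternative, for tagging individual pieces -->
  <p>Blog Post by <a href="https://plus.google.com/[enter your google+ account number]?rel=author">[Your Name Here]</a></p>
</body>
</html>
```

In practice you would use one approach or the other: the header link for a single-author blog, the byline link for one-off guest pieces.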

And there you go. A simple guide on how to enable Google Authorship on your Blog.

Blog Post by : Greg McVey

What is Google Authorship?

Google Authorship has been around for a while now and is rumoured to be good for SEO. Whilst not everyone is using it, it’s definitely gaining traction, becoming more and more popular. The industry buzz is that this is going to be the biggest thing since sliced PageRank, or PageRank, or indeed sliced bread.

So What is It?

Due to its still relatively low popularity, I get quite a few people asking me about this. Authorship means adding an “Author” to any content published on your website or your blog. It just so happens that the author bio is a Google+ profile. You can see mine here.

The reason behind Google’s move towards authorship comes from their extra scrutiny of link building. Google’s recent shift in policy has led to them becoming much more tenacious and pedantic when inspecting links. If it gets even the merest whiff of suspicion that you may have purchased a high-PR, rel="dofollow" link in order to manipulate search engine listings, then it will devalue that link.

With this new tactic fully employed, one of the best ways to garner inbound links is to use link bait so people organically link to your website. With this approach the internet has figuratively exploded with content. So how do you set your content apart from the rest? Google Authorship helps set you apart from the crowd.

I’ll be covering, in depth, all the different ways to add authorship to your blogs tomorrow but for now, here’s the code to quickly add this authorship to your Blog:

Blog Post by : <a href="https://plus.google.com/[enter your google+ account number]?rel=author">[Your Name Here]</a>

Simple as that, but of course you have to go through the Google Authorship verification process first, which will be covered in depth tomorrow.

I hope to see you tomorrow.

Greg McVey

 

Why does SEO Take So Long?

Frustration, I would say, is the word most associated with this question. Whilst it is a valid question, there are many answers to it that could lead to a whole series of blog posts on the matter of SEO. However, here are some answers to help you understand.

Analyzing Competition

First of all, if you’re just starting your SEO then you can bet the shirt on your back that there are already competitors ranking for the phrases that you also wish to rank for. This in itself is a disadvantage, as your competitors could have quite a head start in the race for #1 listings. And as you’re just setting off, far behind your competition, it can take some time to overtake them.

Link Building Authority

Search engines, and Google in particular, look at the quantity and quality of links to a website when determining where to rank it. Analysing your competitors’ backlinks can be an arduous task, as you have to find the truly high-quality links that are shared by all the top sites for the key phrases you wish to rank for. After this you then have to start the process of getting a link to your site from those same sources.

Overtaking Competitors

That’s just the start, as you then have to go out and find and build your own links to show Google why you deserve to be ranked above your competitors. There are quite a few ways to generate and build links to your site, including creating content, guest posts, blog posts and social sharing, to name but a few.
Then there’s the onsite structure of your website, which needs to be streamlined for SEO. For example, you need to ensure each image has a relevant title and alt attribute, and each key phrase has a page dedicated to it. Page titles, headers and more all have to be fine-tuned to ensure the best results.
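Those onsite points can be sketched in markup. A minimal, hedged example follows; the business, key phrase and filename are invented for illustration:

```html
<!-- Sketch of the onsite basics mentioned above; all names are invented. -->
<html>
<head>
  <!-- one dedicated page per key phrase, with a fine-tuned page title -->
  <title>Dog Grooming in Wigan | Example Groomers</title>
</head>
<body>
  <!-- a header that matches the page's key phrase -->
  <h1>Dog Grooming in Wigan</h1>
  <!-- every image carries a relevant alt (and optionally title) attribute -->
  <img src="grooming-salon.jpg" alt="Dog grooming salon in Wigan"
       title="Our Wigan grooming salon">
  <p>Page copy written for users, not for keyword density...</p>
</body>
</html>
```

Each key phrase getting its own page like this is what lets the titles and headers stay specific rather than stuffed.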

Too much too fast = bad

Back in the day it was literally a case of “whoever has the most keywords wins”. Then Google evolved to a philosophy of “whoever has the most links wins”. Naturally this left Google’s rankings wide open to manipulation, where a spammer could just outsource link building, generate 500 links overnight and rocket up the rankings; if someone overtook them, they could simply repeat the procedure. Because of this, and the advancement of Google’s algorithm, Google can now detect a lot of links generated at once and apply punishments to sites it believes have violated its guidelines.

So all in all, there are quite a few reasons that SEO can take so long.

Blog Post by: Greg McVey

Is “not provided” affecting Google’s Search traffic to news sites?

First of all, if you don’t know what the “not provided” result in Google Analytics means: it basically covers people searching while signed in to their Google account. When you are signed in, your search takes place over a secure connection encrypted with SSL; you will notice this in your address bar as it changes from http:// to https://. Because the search is encrypted, the search term is not passed on, which is why the keyword shows as “not provided”.

I saw a statistic which showed that the news site Buzzfeed investigated the traffic to its own website and 200 websites within its network, and found organic traffic had declined since September last year. Big websites including Fark and The Huffington Post are seeing a decrease in traffic. So the people at Buzzfeed now have two theories: are users decreasing their visits to the websites, or is the data generated by Google not giving them the accurate results they need because of “not provided”?

We are all still a bit clueless about the “not provided” result, which many people branded Google’s “dark side”. Google hit back at these claims, saying that they were doing it mainly for the privacy and security of your account. In my personal opinion, as someone who works in SEO (Manchester based, just to be specific), I want the most accurate data possible on my websites, but I can also understand that Google has to keep its product and its clients’ data secure, so it is a bit frustrating both ways.

At the end of the day, Google is a business with a product (Gmail), and their job is to protect their clientele from spam and being hacked, which is why we see the “not provided” result. Ask yourself this question: is it really worth Google risking all the private data of their client base just to satisfy a handful of them?

Net66 SEO : Google Vs Bing – Head to Head

Like Coca-Cola and Pepsi, there is always going to be a comparison between Google and Bing. But which one works better? The answer is that there is no answer: individual results are always going to be tailored by search history, location and whatever other secrets are included in the search engine algorithms that make personalised web searches personal. So what can we do if everything’s subjective? We shall run a case study!

Recently CERN, who basically created the origins of what we now call the World Wide Web, decided to host a copy of the first ever website back on its original domain (it’s been hosted elsewhere by the W3C since then). You can view the very first website ever at its original domain address here: http://info.cern.ch/hypertext/WWW/TheProject.html

But how do Google and Bing manage search requests for the first ever website? We’ll test that now by entering “First Website Ever” into Google and Bing respectively. We’ll add a control to this little experiment and start by searching for the URL first. See the results below:

Google Returns: The first website ever, at its original URL. Followed by CERN’s modern website. Followed by two more results from CERN.

Bing Returns: The first website ever, at its original URL. Followed by CERN’s modern website. Followed by a few news articles and then a wiki.

So the first two results are exactly the same. No real surprise there, as we were using a very specific search term. But it’s interesting to see how Google continues to list CERN’s website whilst Bing chooses to list news items related to the search term. Which do you prefer?

So now let’s really put it to the test and search for “First Website Ever”. See the results below:

Google Returns: The first website Ever. Followed by Cern’s website. Followed by News relating to the first ever website.

Bing Returns: A copy of the first ever website, hosted on W3.org. Then followed by CERN’s modern website.

A clear win for Google then. But as I’ve mentioned above, there are a lot of factors that affect rankings, and localisation is definitely one of them. So give this little test a go and let me know how you get on.

Blog By: Greg McVey

Net66: SEO Vs Web Design

Facebook Vs Twitter. Google Vs Bing. Manchester United Vs Manchester City (hardly a fair fight I know, 20). And even the Star Wars prequels Vs the Star Wars sequels.

There’s conflict everywhere but one we see every single day is Search Engine Optimisation VS Web Design. I’ll first explain why there is such a divide between the two.

The Viewpoint of an SEO

Functionality > Presentation. In an ideal world, the whole of a site that’s ready to be optimised would be content rich. There would be no Flash, and at least one image per page. Everything would be controlled by one plugin where you can manage all the meta data and rich snippets. Everything the website needs to rank for would have its own section, and there would be a blog updated daily with quality content. The reality, however, varies wildly from this idealistic view. There are still sites written in static .html, which makes it hard to create anything dynamic. Coding can be erratic, and some sites can be hacked together such that editing a single line of code can knock the whole website out of sync. You can still get websites written in .aspx, a Microsoft technology, and whilst some people are still advocates of it, they are finding their numbers dwindling.

The Viewpoint of a Web Designer

Presentation > Functionality. To any web designer worth their salt, a website HAS to have that extra special something that makes it stand out in front of all the competitors’ websites. In an ideal world for a web designer, websites would rank in search engines according to how good they look, regardless of their meta titles or how many links point to them (although from an SEO point of view, a beautiful site generates links organically, with friends sharing it and saying “Check out this beautiful site”). Content, to a web designer, would be irrelevant, as surely how a site looks is the only thing that matters.

So who’s right and who’s wrong? As I mentioned, the above is a snapshot of the viewpoints of a web designer and an SEO; it’s entirely subjective. There is no right or wrong. However, I will say this. In each industry, as with all industries, you get the good and the bad. There will be companies out there who are after a quick buck, so they throw together a pretty decent looking website that is not easily optimised or edited, much to the annoyance of the designated SEO technician, who will then complain wholeheartedly about web designers.

On the other side of the coin, however, there will be a web designer who has had one of their lovingly created websites ruined by some amateur “SEO Technician” who keyword-stuffs, buys hundreds of bad links and then disappears at the first sign of a Google Webmaster warning.

So in the end it doesn’t boil down to SEO Vs Web Design, it comes down to people complaining about people who haven’t done their job properly.

Blog Post by: Greg McVey

 

Net66 SEO: How to make UX easier.

There are a few things you can do that really make a difference to your website’s SEO. In particular, there are ways to promote social sharing on your website that make it easier to generate organic links.

Remove extra Steps

If you have a step in a process that you like but that isn’t necessary, REMOVE IT. An unnecessary step can cost you a potential client or link. You need to make sure the user on your site is having a great time, and as user engagement is linked with SEO, this definitely needs evaluating. For example, if you’re setting up a sign-up form for your blog, you want to approach it as “How can I make the experience great for the user?” rather than “How much information can I get from this user?”. Some websites take the latter approach, requiring your first and last name, your mobile number, your address, your email address and the creation of an account on their site. “I only want to sign up to new blog posts, why do they want so much information? I’ll just check back with their website from time to time, it’s easier.” And that’s how you lose a potential long-term user of your site.
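To make the point concrete, a blog sign-up form can be as small as a single field. This is just a sketch: the form action is a placeholder, not a real endpoint.

```html
<!-- A minimal blog sign-up form: one field, one button.
     "/subscribe" is a hypothetical endpoint for illustration. -->
<form action="/subscribe" method="post">
  <label for="email">Email address</label>
  <input type="email" id="email" name="email" required>
  <button type="submit">Subscribe</button>
</form>
```

Everything else (name, phone number, account creation) can be asked for later, once the user has already decided to stick around.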

Keep Users Logged in.

If you do have a user interface where people log in and become part of an online community, then a big part of keeping user engagement high is keeping users logged in. With a raft of social media accounts, email accounts and more, I sometimes have trouble remembering all my passwords. If I sign in to a casual community and then forget my password, I’m more likely to leave for a community where I’m already signed in. I know this seems lazy in the highest degree, but it’s an example that rings true, especially on mobile devices. Think about it: if you had to sign in to the Facebook app on your smartphone every time you wanted to check it, it would get to the point where you wouldn’t want to check Facebook at all due to the hassle of signing in. The same applies to logging in to any website.

Finally, Make it easier to socially share.

You can see from the bottom of this post we have the options to share automatically to the most popular social networks. Simple, Easy, Clean.
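If you want to add share links without a plugin, the major networks expose simple share URLs you can link to directly; the page address below is a placeholder you would swap for your post’s URL (URL-encoded).

```html
<!-- Share links using the public Twitter and Facebook share endpoints.
     Replace the encoded example.com URL with your own page's URL. -->
<a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fexample.com%2Fpost">
  Share on Twitter
</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fexample.com%2Fpost">
  Share on Facebook
</a>
```

No sign-up, no JavaScript widget, and nothing slowing the page down.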

Do you have any tips of your own for increasing User Engagement?

Blog Post by: Greg McVey.

Net66 Blog – Penguin 4 – What you can do to Prepare

A great many SEO companies in Manchester, across the UK and around the world have been affected in some way by 2012’s “Year of the Penguin”. And that was just the starter. Having investigated quite a few companies who have had link building conducted by an outsourced “Guaranteed Link Building” company, I believe there are very likely going to be many more this year.

So what can you do to prepare for this penguin-themed round of penalisation? The answer is: a lot.

I’ve reported on Interflora getting penalised and bouncing back quickly, so there is a way to be prepared for this. Here are some tips I believe will help:

> VET YOUR LINKS. Seriously. Right now you need to be very aware of which sites are linking to your website. Group them into low risk, medium risk and high risk. I’d even go so far as to use Google’s Disavow tool to make sure the high-risk sites linking to your site aren’t counted against you. This way you’d pre-empt any penalisation. Medium-risk links are also ones to segregate, in case you need to disavow those too.
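For reference, the Disavow tool takes a plain text file with one entry per line: `domain:` lines disavow every link from a site, bare URLs disavow a single page, and lines starting with `#` are comments. The domains below are made-up examples.

```text
# High-risk domains identified in our link audit (examples only)
domain:spammy-directory.example
domain:paid-links.example

# A single high-risk page, rather than a whole domain
http://low-quality-blog.example/seo-links-page.html
```

Keep your risk groupings in this format from the start and you can upload the high-risk list (and later the medium-risk one, if needed) without any reworking.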

> Re-evaluate your definition of a link. The phrase “a link is a link” has been bandied about for far too long. It is wildly inaccurate and has been for some time. It’s no longer a race for the highest number of links with [Keyword] in the anchor text; that is now actually bad for your site. If you have a lot of low-quality links with keyword-stuffed anchor text, that isn’t natural, and Google will see this and apply a penalty. Vary your anchor text and expand the sources from which you’re getting links. Use long-tail keywords and keyword-rich anchor text sparingly, whilst mixing in generic “Click Here” links, branded links and quite simply your URL.
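One quick way to spot an unnatural profile is to check what share of your backlinks use each anchor text. A minimal sketch, assuming you’ve already exported a list of anchor texts from a backlink report (the list below is hypothetical):

```javascript
// Measure how concentrated an anchor text profile is.
// A single exact-match phrase dominating the list is a warning sign.
function anchorProfile(anchors) {
  const counts = {};
  for (const text of anchors) {
    counts[text] = (counts[text] || 0) + 1;
  }
  const profile = {};
  for (const [text, count] of Object.entries(counts)) {
    profile[text] = count / anchors.length;
  }
  return profile;
}

// Hypothetical anchor texts from a backlink report.
const anchors = [
  "cheap widgets", "cheap widgets", "cheap widgets",
  "click here", "Example Ltd", "www.example.com",
];

console.log(anchorProfile(anchors)["cheap widgets"]); // 0.5 — half the links use one exact-match phrase
```

There’s no magic threshold, but if one keyword-rich phrase accounts for half your links, as above, you look a lot more like a link campaign than a naturally linked site.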

These tips should help you weather the Penguin Storm if it hits you.

Blog Post by: Greg McVey

Google is Dropping Instant Preview

Google have recently announced that they have dropped the instant preview feature from their search engine, with immediate effect.

If you are not familiar with instant previews: next to each listing you could hover over an arrow, and it would display a “preview” of the website that the link pointed to.

It also gave information such as the date of the last cache. This is still available, but via a drop-down link at the end of the URL in green text.

Why has the instant preview feature been abandoned by Google?

In short, Google have concluded that the feature should no longer be included in the search engine because of its very low usage. It launched at the end of 2010, and since then Google have monitored and tracked usage of the feature, coming to the conclusion that it is not a benefit used by the majority.

They have also stated that their intention is to make the page more “streamlined”, and this will help with that.

Why Do Google Keep Making Changes?

It’s a good question, really. Google often make changes to their search engine, and to their Analytics or AdWords accounts, much to the annoyance of users. Most people get used to the way a system works, and then Google change it, so they have to almost re-learn it all over again.

From Google’s perspective, however, they are constantly evolving and changing what they do to improve it, enhance it and make it better.

So although it can be a pain having to readjust to the beat that Google sets, in the long run it is (generally speaking) in the best interest of the user, and Google act on data that dictates these decisions, just as they have with instant previews.