Google Slaps Mozilla with Manual Spam Penalty

Interflora, the BBC and now Mozilla? Google have been relentless in their pursuit of spam recently, a campaign leaving waves of devastation all over the SEO world. A lot of people think Google are being too picky, as plenty of sites claim to have been penalised despite having no record of ever knowingly conducting any black hat SEO. Predominantly these have been algorithmic penalties dished out automatically as Google crawls and re-ranks the web, but every now and then you hear about a manual penalty too.

Mozilla is the latest high-profile victim of this crusade against webspam. Not through unethical backlinking, cloaking or any other black hat SEO, but through user-generated spam. Mozilla received this message:

Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles.

A bit harsh, then, that Google would punish a site that has been taken advantage of by its users. However, Google have issued a granular manual penalty: not penalising the site overall, but singling out the page that was generating the spam and penalising that. Good news, right? Wrong, as it left Mozilla completely in the dark about which page was penalised.

It’s a theme we’ve seen a lot since the latest Penguin refresh, which generated thousands of “Unnatural Link Warnings”. You can see an example below:


Dear site owner or webmaster of ….
We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.

Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.

If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,

Google Search Quality Team

A fair warning, you would think, but in the world of SEO link building is a massive part of optimising a website. So if you’ve built a website, had it optimised and had a lot of links built, and then you get this warning, it’s quite annoying that Google have told you your site has unnatural links and may be penalised, but NOT told you which links in particular are offending.

This has led to a lot of people getting out the scythe for their links when really all they needed was a pair of tweezers. In Mozilla’s case, though, Google took the preliminary step of removing the offending result from their index. This annoyed Mozilla, as their site is huge and the quickest way to find the spam would have been to search for it through Google. But as Google have removed the page, it can’t be found to be fixed.

What do you think of Google’s webmaster warnings?

Blog Post by: Greg McVey

Net66 Blog: Manage Your Time on SEO Better with These Tips

We all want time in our day for the full SEO circle of link building, content creation and social media. But you can spend hours and hours poring through hundreds of potential link opportunities looking for that one needle in the bales of hay, leaving you little time to generate content and get it published across the multitude of social networks and bookmarking sites that can benefit that content.

Here are some tips to help you better manage your time and get the most out of your day.

Guarantee One Link a Day – Within a Limit: I know this sounds oxymoronic, but if you set out with the intention of creating one quality link every day within the first two hours of the day, it motivates you to go out and get that link; at the same time, if you exhaust every avenue and can’t find anything beneficial to your site, you’ll reach the time limit and have to move on to something else. On the other side of the coin, if you manage to create a fantastic link in the first 20 minutes, then you’ve got a lot more time to chase other great links without the pressure on you, which, in some cases at least, can really help the link builder in question.

Set Aside Time for Content Creation: You don’t have time to write, you’re busy! I understand; I’ve had days where I’ve chased my tail so much I started to wonder why I had a tail. So set a time period where you can settle in to writing. It doesn’t have to be long; 30-45 minutes is sufficient. Even if you’ve got writer’s block, this time can be spent generating ideas for further blog posts or blocks of content. You could spend 25 minutes turning your brain into a paperweight and then, in the 26th minute, have an epiphany about a week’s worth of content. Time spent away from the pressure cooker of day-to-day tasks can help you free your mind and open it to content creation.

So give this a go. Make time for content. Set a link building limit. And make sure you work smarter, not harder.

Blog Post by: Greg McVey

Net66 SEO: Google Taking Over the World, Not Literally

Google have widely been considered one of the largest companies on the planet for a number of years now. What started out as a humble search engine that counted the links to each site, measured their keywords and held up the one with the most of each is now a corporate behemoth, ranging from a maps service to a mobile operating system, from their own web browser to the proprietary Google Glass, a high-tech pair of glasses with a head-up display (HUD) that can show almost anything, from the weather in your local area to the website you’re browsing for that new tweed jacket.

Arguably the two most used services are Google Web Search, which in itself has helped create the whole industry of SEO, and the Maps service. With Maps you get directions and a handy navigation app too (if you’re on Google’s Android mobile OS).

Maps started out as a series of satellite images. This progressed into a new program for Google called Google Earth. Google realised the potential of this and developed Maps even further, rolling out the hugely successful “Street View” programme. I personally love this tool and use it a lot: it always helps when you’re navigating somewhere and aren’t sure which junction you need to turn at, because with Street View you can check out the junction first.

Plus you can get a taste of life all around the world. Ever wanted to walk down the streets of Cairo? Now you can! (Figuratively.) Although this pales in comparison to actually visiting Cairo, it surely has to be the next best thing. And there’s always the chance of a claim to fame: next time you see a Google Maps car with its mounted cameras, remember where you were and check the maps later. (I’ve seen a few cars but never made the final cut.)

It’s unsurprising, then, that Google have continued this service and expanded into a grand total of 50 countries. They’ve now added Street View in Hungary and Lesotho (I hadn’t heard of it either; it’s a small country entirely surrounded by South Africa).

Blog Post by Greg McVey

Net66 SEO Blog: “Google Now” to Arrive on Google’s Home Page?

“Google what?” the cries ring out, but fear not, there’s an explanation. In layman’s terms, it’s Google’s rival to Siri: an application for Android smartphones that uses your location, local time settings and other signals to give you information you need when you need it.

As indicated by its title, Google Now is all about the present. It works by offering a series of “Cards” that you can customise around your life. For example:

The Traffic Card: As Google has an extensive maps service, it can monitor traffic to and from certain places. Say you log on to your home WiFi and your work’s WiFi every day. You log off your home WiFi at around 08:30 am and log on to your work’s WiFi at 08:50 am. Google Now will recognise this as your commute and prompt you to save “Work” as a location. Then every day at around 08:25 am your Traffic Card will alert you to how bad traffic might be on the route between home and work. A nifty little tool.

You can also get cards for the weather, sports, time zones, currency exchange rates and one of my favourites, the Calendar. You can hook the calendar on your mobile phone up to Google Now so that you never forget an appointment. You get alerts reminding you of the day’s appointments, and you stay a step ahead because you’re aware of upcoming events too.

But how will this fit onto the home page of Google.co.uk? A desktop isn’t a mobile phone, so it can’t guess your commute between home and work. The answer will lie with your Google account. Not only that, but with the recent news that Google is discontinuing iGoogle (its previous home-page dashboard), Google Now could already have been earmarked as its replacement.

Sign in to your Google account and, bingo, Google Now will display the weather, time zones, currency and so on automatically, based on the settings of the browser you’re using to access google.co.uk. There may be some manual configuration of the Traffic Card, such as assigning your work and home locations, but after that one quick log-in at home before you set off to work and all your information is there. Happy days.

Blog Post by: Greg McVey writing for Net66

How to get your Website Flagged – Not in the Good Way

Getting your website flagged up could make your business stand out a bit more and hopefully attract new visitors, new potential clients and new business. However, the one entity you don’t want to flag your website to is Google, mainly for SEO reasons. If your site does get flagged it could put you in front of Google’s webspam team, who will go through your site with a fine-tooth comb and look to restrict your progress wherever they can. Not out of any malicious intent, but because Google pride themselves on having results that are not only accurate but also provide a great user experience.

One of the things that could get your website flagged is adding a lot of content at once. To rank on the first page of Google’s organic listings, you have to have an organic site. Organic sites grow and develop, and if one were to suddenly bloom from 50 pages to 9,000 then alarm bells will surely ring out. Let me give you an example.

You’re a company that launched a website last year and you are now ranking quite well for your phrases. One of your diligent staff members finds an archive of content, nearly 2,000 pages’ worth. All the content is relevant, good quality and can only improve your users’ experience. Your instinct would be to get it all on your website as soon as possible to boost your rankings. But if you add a lot of content at once, Google could be prompted to investigate.

This isn’t necessarily a bad thing, especially if your site has only ever employed white hat tactics. A member of Google’s webspam team will take a look at your site to make sure the content you’re adding is good quality, relevant and not intended to manipulate the rankings.

In an ideal world, you would break the content you’re adding into chunks and upload them gradually. Now, I don’t expect everyone to do this, as it can be impractical and take up a fair amount of time, what with the planning, the breaking down, the categorising and so on. But if you want to do it properly, that is the way to do it.

Blog Post by: Greg McVey

Is it Worth Taking Risks With SEO?

In my honest opinion I would say yes. Risks, whilst they can go wrong, also have the potential to work out better than you could ever have hoped. But there’s a difference between taking risks and making informed decisions.

For example, an informed decision is going to Tesco and picking up a Value lasagne. You know that there’s a chance of horse meat being in there, but because you’re aware of that risk, it’s an informed decision.

A risk would be to go up to a burger van and order a burger. There’s a chance that there is horse meat in there, but there is no way of knowing, and there’s a high chance that your risk will pay off, i.e. the burger will be delicious.

Now, to make it more SEO-specific, here are some risks that are worth taking:

> Hiring an inexperienced SEO writer: There’s a difference between ability and skill. You could spend money hiring a skilled SEO writer who can churn out all the posts you require, but it will cost, as they have built up their skills and command a great CV. However, hiring someone inexperienced in SEO writing who has the ability to write great content is well worth the risk. You can offer that person support, guidance and training, and on top of their natural ability they can learn the skills that will turn them into a great SEO writer.

> Re-designing your website: But the pages won’t match up, and it’ll confuse the search engine spiders, and everything will change, and it’s bad for SEO. On the plus side, though, user engagement could shoot through the roof. A brand new site that looks the bee’s knees, with a brilliant UI and UX (user interface/experience), is going to reduce your bounce rate and improve your conversion rate. And surely that outweighs the possibility of a few weeks of poor rankings that are more than likely to recover anyway.

Bear in mind that this is my personal opinion: if you take these risks and they don’t pay off, remember that they were a risk and never bulletproof.

Blog Post by: Greg McVey writing for Net66

Why don’t People Like my Website?

There are many and varied reasons why someone might dislike a website. It can come down to personal preference, as a lot of sites that appeal to many will, at the same time, put off a few.

For example, I quite like the colour blue. It’s calming, it’s professional, it’s the colour of the sky and the sea and everywhere in life. You can get it in all kinds of different shades, textures and more. But Jordan, who sits across the way from me, really likes the colour red. It’s passionate and driven, with a real go-getter attitude. So obviously Jordan will be more inclined to like red sites, whereas I would maybe prefer a blue site.

Because of such subjectivity, a recent study set out to produce data on users’ browsing habits and what does and doesn’t cause them to dislike a site. Here are the top three reasons why people would block a site from search results:

Too Many Advertisements: People don’t like to be advertised to. Advert breaks between TV shows are quite often referred to as “tea breaks”, with even the presenters of television programmes advising you to pop the kettle on during the ad break. So it follows that if you were searching for a cheap coffee table and found a website full of advertisements, not for a coffee table from that site, but low-grade, high-density adverts for coffee tables from a variety of other sites, you’re not going to be impressed.

Poor Quality Content: You’re looking for advice on building a shed. You run an internet search (or you Google it) and find not a website filled with the soundest advice on building a shed, but a loosely put together website mentioning “garden”, “shed” and “build” a lot. How annoyed would you be? Enough to block the site? Apparently so, with 60.6% of people in this study saying they would block a site with poor quality content.

Incorrect Result: “It’s good but it’s not the one.” When we’re looking for something online, search engines are constantly trying to second-guess us and produce the results we want quicker than their rivals. But in some cases there’s an argument for “more haste, less speed”. Queries can bring up the wrong results, and even though a result may be similar, if it’s not the right one it unfortunately earns a block from 47.2% of people.

So make sure you’re correcting these mistakes before they start to affect your website’s performance in the search engines.

Posted by: Greg McVey writing for Net66

WordPress Attacked by Botnet

Own a WordPress site? Had trouble with it this last week? Well, you’re not the only one. A botnet has been configured to target the default username that is suggested whenever a WordPress site is set up.

A botnet is a network of computers that have been infected with a virus. The virus gains control of the computer it has infected but doesn’t always set to work straight away; it keeps spreading until the attacker has a network of machines large enough for their intent. The attacker can then instruct all of these computers to do their bidding, in this case targeting WordPress sites.

The scale of this attack comes down to the basic install of WordPress. WordPress is a Content Management System (CMS) that uses a username and password to let someone log in to the site and make any changes they want. During the default install of WordPress, the username suggested to everyone is “admin”. This is what has been targeted, and with the large number of WordPress sites out there (upwards of 64m) it wouldn’t be surprising if one of your websites was hit.

There are however several ways you can guard against this:

> Change your username from “admin”. Make it personal to yourself and use a strong password with capital letters, numbers and punctuation. (See the sketch after this list for a quick way to check whether your username is exposed.)

> Enable two-step authentication. This adds a one-time code to the login process for every user of the site, and if you don’t have the code, you won’t be getting in.
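
If you’re not sure whether your own username is exposed, here’s a minimal sketch, in Python with the requests library, of the kind of check an attacker’s script performs: WordPress author archives at /?author=1 usually redirect to /author/<username>/, revealing the login name. The site URL below is a placeholder; only run this against a site you own.

```python
import requests

def exposed_username(site_url):
    """Check whether a WordPress site leaks a login name via the
    author archive redirect (/?author=1 -> /author/<name>/)."""
    resp = requests.get(site_url.rstrip("/") + "/?author=1",
                        allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if "/author/" in location:
        # e.g. http://example.com/author/admin/ -> "admin"
        return location.rstrip("/").rsplit("/", 1)[-1]
    return None

if __name__ == "__main__":
    name = exposed_username("http://example.com")  # placeholder: use your own site
    if name == "admin":
        print("Still using the default 'admin' username - change it.")
    elif name:
        print(f"Author archive reveals the username '{name}'.")
    else:
        print("No username revealed by the author archive redirect.")
```

If the redirect does give away “admin”, renaming the account (or creating a new administrator and deleting the old one) closes off the exact username and password guessing this botnet relies on.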

Hopefully your site hasn’t been compromised and you can take these steps to further safeguard your website.

One of the concerns raised after this attack is that the WordPress websites may not have been the primary target of the assault. Experts fear that the botnet was predominantly made up of home computers, which are relatively weak machines, and that the real aim was to infect servers with the virus. As servers are a lot more powerful than regular computers, they can process much larger amounts of traffic at a time.

Which is where the real threat lies.

Have you been affected by this?

Blog Post by Greg McVey writing for Net66

From Zero to Hero – Google Beating Bing at Keeping Malware out of Searches

Back in July 2010 a study was conducted measuring the number of pages returned by each search engine that contained malware. Google had over twice the malware results of Twitter, Bing and Yahoo combined.

This must have hurt Google to the very core, because another study has now been published and, three years on, Google is no longer at the top of the pile for free malware; they’re now sitting at the bottom, enjoying the fact that they serve the least malware.

This study discovered the following results:

  1. Google performed best, returning only 0.0025% of results with malware in them.
  2. The Blekko search engine came second best, with a 0.0067% return.
  3. Bing returned nearly 5 times as many malicious results as Google, with a 0.012% return.
  4. Standing atop the group, lord of free malware, was the Yandex search engine, with 0.024% of results containing malware.

As you can see, that’s quite a turnaround, with Google performing admirably, especially when you consider that their nearest rival Bing returned nearly 5 times as many malicious results (0.012% against 0.0025%, roughly 4.8 times as many).

The study went further and reported on which searches are most likely to bring back malicious pages. The most likely was searching for current affairs. Malicious pages take advantage of users’ thirst for breaking news, and as there is a rush to learn, people are a bit more careless when clicking links.

An earlier study also found that searches for “free downloads”, “free music” and quite a lot of other things people expect to get without paying were similarly likely to return malicious results.

So my advice would be to check and double-check any page you’re accessing, and if you get a malware warning, turn tail and run. And only get your breaking news from trusted news websites.

Posted by Greg McVey.

How weak content can hurt your site.

Yes, that’s right. That perfectly pretty, published piece of prose could be hurting your site and maybe your rankings. Here’s why:

Content is for people, not Search Engines

Yes, your content ranks well and you’re getting traffic. But what’s your bounce rate like? If your content reads somewhat so-so and you’ve included keywords just to rank for certain phrases, and you’ve achieved that ranking, then kudos: you’re now getting a lot of traffic to your site. Unfortunately, the content you’ve used to get this traffic isn’t engaging enough for your readers, who quickly switch off and bounce right off your site.

As this page is getting a lot of bounces, Google can see that although the page looks relevant to its search engine, humans clearly aren’t voting for it, so why should they continue to rank it there? They won’t. You’ll soon start to see your rankings slip, which isn’t good for traffic or sales.

Think of weak content like an attractive weed. When it first sprouts you do get noticed; people stop, look at your garden and think “Ooh, that’s pretty” before quickly realising it’s a weed and moving on. The weed spreads, takes over your grass and turns into an ugly weed, until you’re left with a garden full of weeds where no one stops to look any more, because it is just a weed garden.

Be a Human, not a Robot

People like to feel connections. If they’re reading something that feels as if it’s been written for them, there will be much higher user engagement and a much, much lower bounce rate. Here’s an idea of how you can improve your content writing.

Be Yourself. There’s a whole industry of content writers out there programmed to churn out text with the occasional keyword inserted into it. But of the 7 billion people on the planet, you and you alone are you. Unique. Let that show through when you’re writing.

Can a previously spammy domain recover?

Now, before you wonder why I’m asking this question, there is a legitimate reason behind it.

Let me give you an example. fishingnetsonline.co.uk has previously been owned and used by a spammer. It’s had thousands of spammy links pointed at it, a keyword density in double figures and meta descriptions as long as my arm. It ranks OK, but it’s getting to the point where more is going in than coming out for the spammer, so to squeeze as much as they can out of the domain they decide to sell it.

Along comes Bob. Bob’s your average guy who owns a fishing net shop and doesn’t have a website. It’s coming into fishing season and Bob decides the time is right to finally get online. To him, fishingnetsonline.co.uk is the perfect domain: it ranks OK, it’s already registered and established online, and all he needs is a new website on it and he’s ready to catch some interest.

His site has been live a week and everything is OK, but then he gets a Webmaster Tools notification. His site is violating Google’s quality guidelines in spectacular fashion, which leaves Bob thinking, “Well, this isn’t fair. I’ve got a new website and no affiliation with the old website or its previous actions. Now I’m out of pocket having bought the domain (for a steep rate, I must add), a new website and hosting, and I’m haemorrhaging money because the site I’ve put together is nowhere, with no rankings.”

So, can poor old Bob rescue his website and his money? The answer is yes, but only in theory.

There are two types of penalties that Google can hand out to sites that are spamming: manual penalties, where a member of Google’s webspam team decides that you’re a spammer and applies a penalty, and algorithmic penalties, where Google’s algorithm applies a penalty based on what it finds when it crawls your site.

Now, you can clean up your site, remove all the outbound spammy links, edit your content down to sensible keyword densities, create genuinely good content on your blog and submit a reconsideration request to Google. The webspam team will see this and remove any manual penalties in place. However, every time the algorithm crawls your site it can still see the rafts of spammy inbound links that are hindering your rankings.
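
As a rough illustration of what a sensible keyword density looks like, here’s a minimal sketch in Python (the sample text and keyword are made-up examples): density is simply the number of times a keyword appears divided by the total word count, expressed as a percentage, and double figures is generally considered far too high.

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Made-up example: 'nets' makes up 3 of the 11 words, about 27% - spammer territory.
sample = "Fishing nets online. Buy fishing nets from our fishing nets shop."
print(f"{keyword_density(sample, 'nets'):.1f}% of the words are 'nets'")
```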

Google does offer a solution by way of the disavow tool. This entails analysing every link to your site that you can find, compiling the bad ones into a plain text file and uploading it through your Google Webmaster Tools account so that the algorithm can take it into account. This process, however, is not instantaneous, and it is only as thorough as the list of links you manage to find.
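
To illustrate the format, here’s a minimal sketch in Python that turns a CSV export of bad backlinks (the file names and column name are made-up examples, roughly what ahrefs or Open Site Explorer can give you) into the kind of plain text file the disavow tool expects: one domain: line or full URL per line, with # comments allowed.

```python
import csv
from urllib.parse import urlparse

INPUT_CSV = "bad_links.csv"   # hypothetical export with a "source_url" column
OUTPUT_TXT = "disavow.txt"

domains = set()
with open(INPUT_CSV, newline="") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["source_url"]).netloc
        if host:
            domains.add(host)

with open(OUTPUT_TXT, "w") as f:
    f.write("# Spammy inbound links inherited with the domain\n")
    for host in sorted(domains):
        f.write(f"domain:{host}\n")

print(f"Wrote {len(domains)} domains to {OUTPUT_TXT}")
```

Disavowing whole domains rather than individual URLs is usually the safer bet for a domain with this sort of history, but be careful: anything in the file is treated as a link you are asking Google to ignore.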

My advice would be to check any used domain you’re thinking of buying. Put it into ahrefs.com or opensiteexplorer.org to see whether there is a history of spammy links to it. These are free resources and very user-friendly; however, if you’re not very handy with the internet, enlist the help of a friend who is.

I’d also advise that, if you are a new business or getting your first website, you consider a brand new domain.

Posted by Greg McVey

The Fickle World of Social Media

This week we’ve seen some quite high-profile incidents that have been covered endlessly all over the media, including social media.

First of all, we’ve seen Paris Brown vilified in the media for making what could be considered homophobic and racist tweets. This has been such a high-profile case due to her appointment as Youth Crime Commissioner for Kent. The Kent Police and Crime Commissioner has also added that Paris’s tweets weren’t checked before she was awarded the £15,000-a-year job.

So this raises a question: should social media accounts be taken into account when candidates are interviewed for a job?

Some who say no point towards the privacy of your social life and the right you have to enjoy yourself when not at work. After all, does your social life affect your ability to do a certain job?

Those who say yes point to your social media accounts as indicative of your true character, especially for sensitive jobs such as childcare, social work and other such professions.

If so, then the second social media fracas this week could land some people in hot water. The passing yesterday of former Prime Minister Margaret Thatcher split opinion sharply.

Some greeted the news with enthusiasm, posting their delight at such tidings. Others were less disrespectful and merely mentioned they felt no sorrow, but no happiness either.

Then you had the opposite end of the spectrum, where people mourned Thatcher’s passing and were critical of those who were openly in a state of delirium.

So, following the first incident, are there now hundreds, maybe even thousands, of jobs at stake because people’s social media accounts contain posts mocking and insulting a recently deceased human being?

My own personal advice would be to find a balance between what you post and how your privacy settings are configured on your Social Media accounts.

What do you think?

Posted by Greg McVey.

If a Penalised site Links to me, will I get Penalised?

With the most recent Penguin refresh affecting quite a lot of websites, and the perennial problem of “I can’t help who links to me”, this question is being asked more and more frequently.

Thankfully Matt Cutts has released a video answering this question once and for all. Before we get into the explanation, let me give you an example.

You’ve got a blog that creates original, quality content and also has a high PageRank. Let’s call this site A.

Site B is your site. You also create beautiful, descriptive content so successfully that site A decides it’s going to link to you.

Site C isn’t in the same league. The content it creates is OK, but no trails are blazed with it. Site C has a low PageRank and would ideally like this to change, so it approaches site A and requests a link, offering to pay for it. Site A agrees, sees this as an avenue of revenue, and eventually sells quite a few links.

Invariably this is picked up by Google who then takes the following steps:

> Reduces Site A’s Toolbar PageRank by anywhere between 20% and 50%.

> Prevents PageRank from being passed forward.

> Removes trust from any links pointing out from site A.

As you can see, that’s a pretty hefty penalty for selling a few links, but no active penalty is passed on to your site (site B) because of the link.

However, you do have to deal with the fact that you’ve lost the link juice from site A, so there is a slight downside.

Phew! Glad we got all that cleared up. Once again thank you to Matt Cutts.

Posted by Greg McVey

Panda Updates – Google Keeping Shtum.

It’s been like this for a while. Rumours abound that Google are going to release another update. “Hogwash,” some say. “Another?” cry others. Then comes the inevitable boom in conversations: “My site has dropped”, “Well, my site has climbed”, “Well, mine is currently somersaulting somewhere over there and I’ve given up”.

Once all the hubbub finally calms down and people are somewhere near rational again, Google come out with a message that stirs it all up again: “Oh yes, we did release an update, sorry about all the bother.” This usually leaves a lot of people wallowing in indignation, occasionally muttering “not fair” and aiming kicks at the odd inanimate object.

However, this is all about to change. Recently Google (or more specifically Matt Cutts) announced that they’re about to integrate Panda into their core algorithm. This in effect ends the manual refreshes of Panda, which is also leading to Google (or again, Matt Cutts) refusing to confirm Panda updates as they have done in the past.

There have been 25 confirmed Panda updates so far. Last week Matt Cutts implied that Google were about to release another Panda update, but later added that updates would no longer have an instant impact because they are being incorporated into the algorithm. Due to the now muted impact of these updates, the decision to confirm them has been reversed and Google will be keeping quiet when faced with questions about updates. A direct quote from Matt Cutts states:

“I don’t expect us to tweet about or confirm current or future Panda updates because they’ll be incorporated into our indexing process and thus be more gradual.”

So that’s it. The next time the Google update rumour mill starts churning, expect no confirmation from Google and try to get off the mill before you get too dizzy.

Posted by Greg McVey

Matt Cutts – The Short Cutts

Most SEO technicians know who Matt Cutts is, both from the very large number of videos he has made over the last few years and from his role as head of Google’s webspam division. I’m sure at some point we have all checked a few of them out, for whatever reason. But what are their advantages and disadvantages?

I personally find that Matt Cutts makes very detailed videos that can often turn out to be quite lengthy. Without realising it, you can become so engrossed in the chase for new information that you end up spending far too much time watching them. After all, time is money. Another thing to consider is that a video can quite simply be out of date, given Google’s ever-changing algorithm, so it is important to check that you’re watching fresh videos.

The benefits of these detailed videos are quite straightforward really. Maybe it’s simply that you can learn something new, or a more detailed answer or a full explanation may be what you need. Everybody is different and has their own way of learning.

This is where The Short Cutts comes in, taking lengthy answers and making them short. This is great for people who don’t have time to spend watching video after video to get the answer they need; The Short Cutts simply gives you the question and its answer. Straight to the point: simple, easy and fast.

The Short Cutts lets you search a question against its large database of videos. It then brings up the relevant videos Matt Cutts has recorded, with a short answer displayed beneath your question. It also provides the full clip for you to watch if you wish to get a more detailed answer, giving you the best of both worlds.

Posted by Greg McVey

Guest Blogging

Guest blogging, or guest posting as some call it, started to build a bigger presence towards the end of last year, mainly due to content marketing. Since the start of 2013 this technique of link building has grown and many webmasters are using it more and more. This is a smart move considering the changes made to Google’s algorithm, penalising big sites (Interflora) and various other linking networks.

How to choose a platform to post on.

The very first thing you should do is decide which blog you wish to put your content on. The niche needs to be relevant; it doesn’t have to be identical to yours, but there needs to be some sort of relevance. Always look out for spun content, keyword stuffing or duplicate content, as these could affect the power of the link.

Ideally you need a platform which is related to your niche and has no grey or black hat SEO associated with it. You can gauge that through its PageRank and link profile (also a good factor to check), along with its social media presence.

Content

Now for the most important part: your content. The aim when writing a successful guest post is to keep the audience entertained. Ideally it should be a minimum of 500 words and related to the niche.

A good tip is to use graphics and humour to keep the target audience engaged in the subject, because let’s face it, you could be reading about your favourite thing in the world, but if it is all one big long paragraph of text then you will not read it properly. Use titles to break up the content and add some cliffhangers just to add that bit more suspense.

The thing with guest blogging is that it is beneficial for both parties involved, and I think this is why Google values the link as much as it does. You are engaging with others related to the topic and you are working for a backlink, which is what Google’s recent refreshes have been all about.

Best possible result?

If you have done all of the above and added fresh, relevant and engaging content to a relevant, high-authority domain, then not only will your site move up the SERPs but there is also a chance that your post could go viral.

This would mean promoting your post on relevant forums and using your social media accounts; depending on how high you are in the SERPs, it could happen.

SEO Rumour Mill – SAPE Network Penalised?

There has been news recently that Google are penalising link networks, most recently the SAPE network. We had never come across this type of link network before, due to its black/grey hat reputation, as we strictly use white hat search engine optimisation techniques. I have done my research on the SAPE network and this is what I found out:

What is the SAPE Network?

SAPE is a Russian website which offers thousands and thousands of webmasters high-PR links in any niche you require, on pages with few outbound links. It covers directory submissions, Google +1s, social bookmarks, retweets; basically everything to give your site a bigger presence on the web.

So this is basically selling high-PR link packages, which we all know is the biggest no-no in Google’s guidelines, so it was only a matter of time before they got found out. SAPE was probably the biggest black hat network in use because of the way it was set up: it promised customers instant results and sales at a relatively low cost, whilst offering apparent safety from search engine updates by using sites with different IPs, name servers and hosting companies, supposedly making the links untraceable.

Now, obviously it sounds too good to be true, and it is: it’s a near certainty that your website would be heavily penalised for breaking the rules and would end up with a bad reputation with Google, which is the last thing you want!

This doesn’t seem to be down to a new algorithm update; it seems to me, and to many other webmasters, that Google have been targeting specific link networks recently (BuildMyRank.com being another), as well as Interflora for its use of advertorial links. Google are finally showing people that they are specifically targeting sites that sell links and break their code of conduct.

If you have used this kind of service, Google will find you out and you will pay the price.

Say What You Like About Google – Says Matt Cutts

There have been a few rumours that Google have been penalising websites just for being critical of Google’s tactics and procedures. It’d be such a shame if Google were to take the approach of banning these websites for dishing out bad press; so much for free speech!

However, the man from Google (Matt Cutts) says that this isn’t the case at all, never has been and never will be. This is exactly the way it should be too: you can’t be a giant such as Google and not be able to take criticism from time to time, and at times they do deserve it. I’m sure there is a long line of businesses who have been hit and lost revenue due to Google’s seemingly strange and sudden changes to their algorithm.

Then of course you have the fact that whatever Joe Bloggs happens to do online, you can bet Google won’t be far behind, watching what you do, then adopting a similar approach themselves to cash in. Sometimes you just think, really, Mr Google? Do you have to stick your finger in all these other pies too? But I guess that’s the nature of large companies; some call it good business sense, others call it plain greed.

Whatever your views on Google, one thing is for sure: like them or loathe them, they have helped to change the way in which we use the internet. For all the people who have been hit and lost money through Google’s search engine, I’m sure there are hundreds who have profited and made money.

So by no means is Google perfect; on a weekly basis it can baffle, bamboozle and frustrate even a professional SEO company like ours. However, in the grand scheme of things it is good for the average internet user and most likely provides more benefit than anything else.

So there you go, Mr Google! A bit of criticism there for you. I’m sure you’ll find that most people think it fair, but at least I added in the odd compliment for good measure too.

So if you are worried about getting a slap from Google, then rather than worrying about what you say about the big G, you really want to pay more attention to your page layout, site architecture, duplicate content and the backlinks you are using. These still look like the main things a webmaster needs to keep an eye on to use the world’s biggest search engine successfully. Freedom of speech is still going strong, people!