Net66 SEO: Google PageRank Not Getting Updated This Year

PageRank. Long was it the staple diet of the internet, instantly giving you an idea of a site’s trustworthiness. It was easy: the higher the PageRank, the more Google trusted the site and the more you should too. A lower PageRank usually meant that Google didn’t trust the site as much, or that it was a new site.

But now it’s different. PageRank hasn’t been updated since February, and now it looks like it’s not going to be updated again this year. See Matt Cutts’ video below:

You can see from this video that PageRank is only updated periodically, so you might not get a PageRank straight away, or for a while. But it’s a tweet that Matt Cutts responded to that gave away just how little importance Google is placing on PageRank. See the Tweet below:

 

That’s going to mean no PageRank update for at least 10 months. Nearly a year of no updates. That’s quite a time, so it does look as if Google is starting to rely on other factors as its main measure of trust in a website, letting PageRank fall into disrepair.

Blog Post by: Greg McVey

Net66 SEO: Google Release New Penguin 2.1 (Penguin 5) Update

On Friday Google confirmed, via their perennial knight of webspam Matt Cutts, that they have released a new version of their Penguin Update. This update runs in tandem with Google’s new algorithm, Hummingbird, with their other significant update, Panda, also being integrated into the algorithm.

Now there is something to be cleared up here, with a lot of people referring to Penguin 5 whilst others refer to it as Penguin 2.1. The reason is that Google have released five Penguin updates; however, the fourth update was deemed too different from the third. So Google decided to name it Penguin 2.0, as it was a pretty revolutionary new update, a cut above the previous Penguins.

But now we have Penguin 2.1. So what does this entail? Well, the official release didn’t really give too much away. It was confirmed by Matt on his Twitter page, with the tweet reading that Google have released the update and that it will affect around 1% of search queries. See the Tweet below:

The thing to notice here is the text “to a noticeable degree”. When Hummingbird was released it affected around 90% of searches on Google, yet the SEO industry only noted a slight change here or there, and there were no mass-scale fallouts in rankings. This time, with the 1% being affected to a noticeable degree, it’s more likely that a higher proportion of page one listings will change. Much more likely to draw attention from the SEO industry.

Penguin 2.1 does seem to have hit a lot of sites that had so far escaped punishment by Google. After all, that’s the point of Penguin 2.1: to root out websites that are manipulating the search algorithm, specifically through bad link building practices.

It seems to have taken hold already with a lot of fluctuations in traffic and rankings being noted throughout the SEO industry, especially on the Webmaster World forums. Have you noticed any ups or downs in your traffic or rankings?

Blog Post by: Greg McVey

Net66 Google: Google Now Integrated with Knowledge Graph

There have been a lot of big updates from Google recently. Not least of which is the completely new algorithm they’ve introduced, called Hummingbird. This algorithm pays closer attention to sentences as a whole, rather than noting which keywords are used and bringing up relevant results for them. This helps Google answer questions better when asked.

Another big update is the integration of two of Google’s most advanced search features: their personalised search service Google Now, and the Knowledge Graph. Personally I love Google Now; it practically second-guesses me. I pull out my phone wondering when my next meeting is, and it’s first on the display of “Cards”.

But now Google are making this an even bigger feature, as you can get updates from people who have a Knowledge Graph card. That is, when you search for someone on Google and it brings up a little bio on the right hand side of the page, that’s the Knowledge Graph guessing at what you want and bringing up a snapshot of the information you’re after.

But if you’re running a search on a device with Google Now on it, then you get something extra: a little “Keep Me Updated” box that you can tick. So say you’re looking up Matt Cutts on a device with Google Now; you can tick the box to make sure that next time he updates his blog, you’re notified. Perfect for the SEO on the go. See the image below:

Google Now and the Knowledge Graph

So, do you use Google Now? Who have you signed up with?

Blog Post by: Greg McVey

Image Courtesy of Search Engine Roundtable

Net66 SEO: Google Releases Hummingbird Algorithm Update

Google last night revealed the biggest change to their famed search algorithm in three years. They tinker with the algorithm all the time, making slight refinements so they can better provide you with the search results you’re looking for.

But this time they’ve added by far the largest update to their algorithm so far. This update will affect 90% of all search queries! That’s right, 90%! To put that in perspective, Google’s Penguin Update affected 2.3% of English queries and caused an uproar that is still lasting to this day.

I’m sure you must be aware of all the penguin recovery techniques that are going around. Even now, months after it’s been rolled out.

So what does Hummingbird actually entail, then? Well, the main purpose of this update is to better answer long-tailed questions put to the search engine. For instance, before the update, if I were to search for “How do I make sticky toffee pudding without raisins?” (because raisins aren’t everyone’s cup of tea), the algorithm would still have counted raisins as a search term, as it’s in the search string. To find recipes without raisins I’d have to enter the more complicated search “Sticky Toffee Pudding Recipe -raisins”, because when you add the minus sign before a word, Google will return only search results without that word in them.
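For illustration, the minus operator is just part of the query text itself, so building such a search URL is plain string work. A minimal Python sketch (the URL shape here is an assumption for illustration, not Google's documented interface):

```python
from urllib.parse import quote_plus

# Exclude a term by prefixing it with "-" inside the query string itself.
query = "sticky toffee pudding recipe -raisins"

# Hypothetical search URL for illustration; the real parameter set may differ.
url = "https://www.google.com/search?q=" + quote_plus(query)
print(url)
```

The minus sign survives URL encoding unchanged, so the exclusion reaches the engine exactly as typed.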

This Hummingbird update is set to change that, so that Google will be able to understand more long-tailed question search terms. Previously the algorithm’s main priority was to match short-tail related keywords, which is why my search for a good sticky toffee pudding recipe took so long.

Google has also taken the time to update their voice search capabilities. They want communication with Google to be as if you were talking to a friend. So for instance, if you were looking up information on Malta you could say “Tell me about the history of Malta” and Google would come up with results for historical data about Malta.

But then you could say something along the lines of “What about its geography?”. Google would then present you with geographical knowledge about the country, without you having to repeat its name. Cool, isn’t it?

I’m sure the Hummingbird update also had a lot to do with how hard Google is pushing into mobile search, as on a mobile it’s easier to hit one button and talk than to fiddle with the device’s small keyboard.

Finally, in a well thought out release, the Hummingbird update also coincides with Google’s 15th birthday celebrations. So, as well as a cute little game on the Google home page, it subtly implies how far Google has come in 15 years by releasing their most intelligent algorithm to date. Smart move, Google.


High Score for the day: 155

Blog Post by: Greg McVey

Net66 Tips – Onsite Optimisation: Building the perfect web page for SEO

One of the most challenging aspects of SEO is building a web page that is perfectly optimised, not only for the algorithms but also for the user. The list of factors is endless, but I am going to show you what I, and the people here at Net66, think is the best way to optimise a web page.

Gone are the days when we could rank solely off meta data and keyword stuffing. Search engines have advanced over the years and it is now all about quality and relevance for the audience. There are also many other ways of generating traffic to your website: social media, blogs, emails, etc.

Crawling and Accessibility

This is crucial to check on your web page, as it can impact the performance of your website. Search engines read websites through automated bots which are programmed to look for specifics. Some of these specifics include:

• Is the page with the content on the correct URL?

• Is this URL user-friendly?

• Is the robots.txt file blocking the robots from crawling any pages?

• If the page is down, are you using the correct status code?

Now these are not all the specifics, but to me they are the most important. So, what do they mean? Having a friendly URL structure ensures the bots can read your website more efficiently, which can only benefit you; it also makes it easier for users to understand what the content is about. The robots.txt file is a set of directives with which you can control which pages can be crawled by robots.
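As a sketch (the paths here are hypothetical, for illustration only), a simple robots.txt might look like this:

```text
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: http://www.example.com/sitemap.xml
```

Any page under a Disallow path won’t be crawled, so double-check that nothing you want indexed is listed there.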

It is recommended that you check this to make sure you are not blocking any pages you wish to be crawled. Lastly, sometimes we all experience technical issues with our websites (if not, you must be doing something wrong…) and when this happens it is important to use the correct status code.

If a page is down temporarily, you should use a 503 status code. If you need to redirect a page to a new address permanently, use a 301 redirect.
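As a rough sketch of that distinction (the scenario names are my own illustrative labels, not part of any real API), Python’s standard http module carries both codes:

```python
from http import HTTPStatus

# Map each scenario from the text to the status code to serve.
# Scenario names are illustrative labels only.
def status_for(scenario: str) -> int:
    codes = {
        "temporarily_down": HTTPStatus.SERVICE_UNAVAILABLE,  # 503: back soon, keep the URL
        "moved_permanently": HTTPStatus.MOVED_PERMANENTLY,   # 301: pass signals to the new address
    }
    return int(codes[scenario])

print(status_for("temporarily_down"))   # 503
print(status_for("moved_permanently"))  # 301
```

A 503 tells crawlers to come back later without dropping the page, while a 301 tells them the move is for good.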

Content

This is the most important factor when it comes to successful SEO. The phrase ‘content is king’ is widely used by webmasters throughout the industry, and they are correct!

As the search engine algorithms have shifted and advanced over the past couple of years the two words which constantly arise are “quality” and “relevance”.

This is exactly what your content should be: quality, and relevant to your niche. Now obviously we still have to abide by Google’s Webmaster Guidelines with regards to uniqueness and keyword stuffing (you know the drill).

At the same time, you need to try and target a specific keyword without breaking the rules. But how do we do that? Google does not pick up on the exact word 100% of the time; it now picks up other relevant keywords too. Here is a prime example:

knowledge graph

As you can see, Google highlights words relevant to swimming supplies, such as swimming goggles, swimming gear, swimming supply and equipment. We can therefore include these other relevant keywords in our body text, meta data, H1s, H2s and the alt tags on our images.

This shows Google that you have done some research, and it will reward you with that extra credibility. For ranking purposes, the exact keyword you wish to rank for needs to be included in the meta title, then broken up in the description, with a 2-3% keyword density in the body text.
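As a rough way to sanity-check that 2-3% figure, here is a simple single-word counter. This is my own illustration, not an official formula:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

print(keyword_density("buy swim goggles and swim gear online", "swim"))  # 2 of 7 words
```

Real pages would want multi-word phrase support, but even this crude version flags text that is obviously stuffed.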

Try and break up your content with images, bullet points, videos and short paragraphs. What this does is keeps the audience interested in the content and this is another factor which Google takes into consideration and that is user experience.

With regards to content, Google seem to be rewarding more engaging writing, so do not be shy: add your sense of humour in there (if you have one, that is) and start engaging with as many people as possible.


Internal Links

When I first started out with SEO I used internal links for ranking purposes, linking to an internal page with exact keyword anchor text. What I then found out is that it didn’t look natural and there was always that risk of getting a telling off from Google.

I now use my internal links wisely. I create user-friendly internal links to the pages deemed most valuable for a certain phrase or phrases, with still some keyword anchor text linking to the page I want to rank.

I find that a good natural mix of anchor text is the way forward, especially after Penguin 2.0. Internal links also create paths for the bots to crawl your website: the more paths, the quicker your website will get read and indexed.
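To see those paths concretely, here is a minimal sketch using Python’s standard html.parser to pull out site-relative (internal) links. The markup and URLs are made-up examples:

```python
from html.parser import HTMLParser

class InternalLinkCollector(HTMLParser):
    """Collect site-relative hrefs -- the crawl paths that internal links create."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("/"):  # treat site-relative URLs as internal
                self.links.append(href)

collector = InternalLinkCollector()
collector.feed('<p>See our <a href="/services/seo">SEO services</a> and '
               '<a href="http://other.example.com/">a partner site</a>.</p>')
print(collector.links)  # only the internal path remains
```

A crawler following those collected paths reaches every internally linked page; pages with no inbound internal link are the ones it may miss.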

 

Blog Post by Jordan Whitehead

Net66 SEO: Google Now Encrypting Every Search

Google is set to make SSL searches the default for all users of their search engine. They’d previously stated that they would only encrypt searches for users of the https:// version of the site, or users who were signed into a Google Account (YouTube, Gmail, AdWords etc.).

But now in a dramatic and sudden U-Turn, they’ve completely reneged on this and encrypted every search term being entered. Now there are two main theories going round the web here:

1. Google have acted on their users’ concerns about data sharing with the US’ PRISM program. To draw users back to their site they have assured us that they’ll encrypt all searches, so no one with the power to do so can see what we’re searching.

2. Advertising sales are down and, as you may know, Google still passes keyword data on to advertisers. By withholding keyword data from analytics, Google could push more people onto AdWords, thus driving up revenue.

I’m more inclined to think it’s a mixture of the two. But what do you think?

Blog Post by Greg McVey

Net66 SEO: Matt Cutts says IPv4 and IPv6 sites are not Duplicate Content

This is something that shouldn’t be of concern to most of you yet as a lot of ISPs haven’t yet made the jump to IPv6 connections for websites.

The difference between the two is that an IPv4 address consists of four decimal numbers separated by dots, a la your regular IP address 123.12.123.12. IPv6 addresses have a much different configuration, consisting of eight groups of hexadecimal digits (letters and numbers). With IPv6 addresses, consecutive groups of zeros can be compressed to a double colon once per address.
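Python’s standard ipaddress module illustrates both formats and the zero compression (the addresses here are standard documentation examples, not real sites):

```python
import ipaddress

v4 = ipaddress.ip_address("123.12.123.12")  # four dotted decimal numbers
v6 = ipaddress.ip_address("2001:0db8:0000:0000:0000:0000:0000:0001")

print(v4)  # 123.12.123.12
print(v6)  # consecutive zero groups collapse to "::" -> 2001:db8::1
```

The long-hand and compressed IPv6 forms are the same address; the notation just omits the run of zero groups.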

So, in theory, if you did have an IPv6 address you’d have two versions of your site reachable over two different connections, which would usually amount to duplicate content. Thankfully Google has devised a way to tell whether it’s the same site on a different connection rather than on a different host/domain (which would be a duplicate content issue).

Matt Cutts explains it all here:

Blog Post by: Greg McVey

Net66 SEO: A New Player in the Digital Marketing Game? Enter Adobe Target

We all know Adobe as the company that created Flash, Photoshop, Fireworks and, of course, the much loved Adobe PDF reader. So why has this creative company suddenly decided to release a new product designed specifically to help business owners understand the complex world of digital marketing?

But further than that, it also has the best interests of the end user in mind. Not only will it help the business owner/SEOer improve the digital marketing aspect of the website; tidying up all the onsite issues will surely increase the benefits of the website as a whole, creating a more enjoyable experience for the end user.

It also deals with optimisation issues that some business owners can find confusing. They may know everything about, say, how to whiten teeth, but when it comes to analytics they may know nothing, and so get confused by the difference between unique visitors and regular visitors, bounce rate and time on site, and the myriad other statistics Google display through their analytics software.

With an online presence being a massive source of revenue for businesses, more and more owners, marketing execs, accountants and anyone who has something to do with the website will want to understand where their revenue is coming from, and how the website has attracted it. This way they can make an informed decision on whether each marketing avenue is worth pursuing after analysing how much business each stream has brought to the site.

This software also goes beyond mere analytics and offers targeting, with the chance to set up favourite audiences so you can make the most of your online market. They further offer a step by step process, so that even the most novice digital marketer can have the chance to get a grip of their digital marketing campaign and steer it in the direction they want.

You can see the video that Adobe have released below:

Blog Post by: Greg McVey

Net66 SEO: Having a Manual Action Penalty Removed Doesn’t Help Rankings

Well, not instantaneously anyway. We’ve seen it a lot over the past year: people claim their website suddenly drops for all its rankings, and its traffic subsequently follows. So what is it that causes their sites to drop in the first place?

Usually it’s some form of manual action that Google applies to the website for violating one of Google’s stringent and vague Webmaster Guidelines. The usual story is that their links weren’t completely in line with Google’s rules, sorry, guidelines, so they clean them all up, submit a reconsideration request and huzzah! No more manual action.

They give it a week and see no return to their previous levels of traffic and rankings. So what’s the problem? Well, it seems that although your penalty has been lifted, it’s more a case of “we won’t punish you further” than “we’ll fix your rankings”.

After all, would you learn your lesson if you could break the guidelines, rank well, get found out, receive a manual action penalty, get within the guidelines again, and be restored? It’d be open season for black hat techniques if you knew you could try anything with impunity, safe in the knowledge that you’d simply be restored to where you were.

So what are your thoughts on this? Do you think traffic and rankings should be restored straight away?

Blog Post by: Greg McVey

Net66 SEO: Google Asking you to Recommend Good Websites for Search

People have long complained about being unable to break into the top of Google’s SERPs simply because, although they have a better website and provide a better service, their competitors spend more on SEO.

Well, all that is set to change (maybe), with Google now asking SEOs to send in websites that they think are first-page worthy but currently aren’t on the first page due to the highly optimised, big-spending websites above them. It was actually a Tweet by Matt Cutts that revealed this news, along with the link to where you can recommend websites.

Here’s what the Tweet said:

 

This has thrown up a question for me though: with the massive amount of competition out there, the fact that anyone can submit leaves it open to exploitation. Especially when you consider that the majority of people who follow Matt Cutts are SEOs who would love to show off their work.

It could also open the doors to black hat techniques, such as submitting a website 200 times from different IPs to make it look like 200 people like your site. So what do you think the reasoning behind this is? Do Google want us to do their jobs for them? Is this a trap? Or do they genuinely want good sites recommended?

What do you think?

Blog Post by: Greg McVey

Net66 SEO: The Keyword Tool is Dead. Long Live the Keyword Planner

So, the Keyword Tool has finally had its last hurrah and been replaced by the new Keyword Planner. The Keyword Tool was quite an excellent tool, designed for PPC specialists but used by SEOs nonetheless. The reason being, it was the closest thing anyone had to traffic estimates for certain phrases, especially when you consider that you had local and global monthly estimates.

You could also add broad match or exact match phrases to the same set of results, which gave a more in-depth look at who was searching for what, in what volume, and presented the figures side by side. Anyone could use this tool too, even without signing into a Google account; you just had to fill in a captcha.

So here’s where the Keyword Planner differs straight away. First of all, you DO need to sign into a Google Account to use the Keyword Planner, drawing the ire of many a marketer. I’m sure they all have Google Accounts already; it’s just frustrating when you need information quickly and have to fiddle about signing into things. Plus, if you want to talk a client through the Keyword Planner, they’re not always going to have a Google account themselves, which could throw a spanner in the works.

Secondly, you have to view exact match and broad match keyword statistics on different pages, which limits your ability to check keywords side by side for the difference in statistics between broad and exact.

Also, you now can’t filter results by “Closely Related”. This always provided more refined results when you were looking for phrases related specifically to one subject. Such was the force of the backlash that Google have already said they will be reinstating the “Closely Related” filter at some point in the future.

It isn’t all doom and gloom though, as there have been some improvements over the old tool, such as more geographic targeting options, allowing people planning local SEO campaigns to break down targeted locations more specifically. You can also bundle these new geographic targets together to get even better results.

You can also upload more keywords of your own, with the limit being much higher than it used to be: you can now upload up to 10,000 keywords.

A nice new feature is that ad groups are suggested to you, a sort of quick fix for keywords relating to what you’re looking for. These groups come with an average cost per click and average monthly traffic, indicators of what you can expect from the ad group. You also get to see which keywords are in the group, so you can pick and choose the ones you want.

So although some features of the old Keyword Tool will be missed, you’ve got the Keyword Planner to look after you, and although you might not love it yet, I’m sure you’ll come to.

Blog Post by: Greg McVey

 

Net66 SEO: Link Building Dos

Find Unlinked Brand Mentions & Logos

If people are talking about your company on the web, you need to be aware of it. So why not run a search for your company name and contact all the places where your company is mentioned but the mention isn’t linked to your website? If it’s someone you know, I’m sure a quick-fire email will get you that link, but you may have to work harder if you’re mentioned on a website without one.

For logos, you could search with an image of your logo and then select the “visually similar” option on Google Image Search. From there you could easily ask website owners to give you a link, or at least to customise the image’s alt text.
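Once you have a page that mentions you, the check is simple: does the brand name appear outside any link? Here is a deliberately crude Python sketch (the regex approach and sample markup are my own illustration; a real tool would use a proper HTML parser):

```python
import re

def has_unlinked_mention(html: str, brand: str) -> bool:
    """Crude check: does the brand name appear outside any <a>...</a> element?"""
    # Remove anchor elements wholesale, then look for the brand in what's left.
    stripped = re.sub(r"<a\b[^>]*>.*?</a>", " ", html, flags=re.I | re.S)
    return brand.lower() in stripped.lower()

print(has_unlinked_mention('<p>We love Net66.</p>', "Net66"))                  # mention, no link
print(has_unlinked_mention('<p><a href="/">Net66</a> said so.</p>', "Net66"))  # already linked
```

Pages where this returns True are the ones worth an outreach email asking for the link.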

Telling Great Stories

Content is king! We all know that, and have done for some time. So it seems strange that you wouldn’t spend a long time on your content building strategies. Rather than publishing blog post after blog post without really getting into it, spend one week where, instead of using half an hour at the end of each day to publish five blog posts, you use that time to create genuine, compelling content that will get shared and linked to.

A story is a great way to do that and is one of the most organic ways to get links to your site.

So there are a couple of tips to help you on your way with link building.

 

Net66 SEO: Has there Been a Google Update?

Word on the forums seems to suggest an update affecting quite a lot of sites around the 21st and 22nd of August. Although the forums are relatively quiet (it is summer and I’m sure a lot of people are on holiday) there were spikes of traffic on both the 20th and the 21st where people were obviously talking about something.

But was it an update?

I’ve checked a few sites and some seem to have really increased, whilst others have just hit an absolute average (Analytics shows a near enough straight line of traffic). Although traffic has seemed to fluctuate, there has not been a massive change in rankings. Yes, there’s the usual fluctuation, but nothing that would make me worried.

There’s quite a few ideas going round the web as to what Google may have done to cause such a spike in conversation. But until we find out more about the effects of this update, we’ll just have to wait and see.

Blog Post by: Greg McVey

 

Net66 SEO: Is Google Deliberately Harming Organic Results?

First of all, “the fold” is what’s immediately viewable on your screen once you’ve performed your internet search (Googled something). If you have to scroll down past the paid-for content and Google’s own Maps service to view the organic results, what does that say about Google’s attitude towards the SEO industry?

We’re all working hard to stay within Google’s strict guidelines, and now it looks like organic listings will soon be third in line in the results displayed on the SERPs. This came about because of a new feature Google have implemented, which means that when you run certain searches on Google, what you see straight away is:

> A carousel. The only purpose this seems to have is to prompt you to refine your search. Each image on the carousel is tied to a different Google search, so if you search “hotels” (as in the example image), each image at the top will have a different search attached to it, such as “Boutique Hotels”, “5 Star Hotels” etc.

> Google Ads. The bottom left of our example image is dominated by Google’s AdWords; a larger-than-average box ensures that it takes up all the space it can.

> Google Maps. Although we set up our own Google Maps listings, we have no say in whether or not our maps will be chosen, and no real way to optimise them. This, again, relies solely on Google.

See the example image for the search “hotels” below:

googles-destroying-seo

 

So what do you think about this new first-page dominance by Google? Is this a deliberate attempt to discourage the SEO industry? Or is it simply Google trying to improve search results for us?

Blog Post by: Greg McVey

Net66 SEO: Matt Cutts Reiterates +1s have no Direct Impact on Rankings

It’s been touted for a long while that Google takes social signals into account in its rankings. With that in mind, would Google’s own social network be a better platform to have your content shared on than other networks?

The response was prompted mainly by the fact that Moz had published a couple of studies showing a massive correlation between the number of Google +1s and rankings: the more +1s something had, the higher it ranked.

But Matt Cutts was quick to point out that correlation doesn’t imply causation, deeming this the politest way to debunk the myth. He also added, more in reference to Moz’s study of Facebook Likes, that if you create compelling and great content, naturally people are going to take it, share it and link to it.

The implication is that it isn’t the likes and +1s that make the content rank; it’s down to the actual content itself, which just happens to attract links organically. He also stated:

“But that doesn’t mean that Google is using those signals in our ranking, rather than chasing +1s of content, your time is much better spent making great content.”

So what do you believe? The cold, hard facts of Moz. Or the genial face of Matt Cutts?

Blog Post by: Greg McVey

Net66 SEO: Generate Ideas for Content the Easy Way

Ok, maybe not easy, as there is some work involved. I simply mean you don’t have to sit alone in a room, lit solely by your blank computer screen, as you scratch your head for inspiration. I’ve actually done that a few times before thinking “There has to be another way”, and thankfully, there was. See the tips below:

Read User Comments

Writers Block

If you’ve written blog posts in the past then you’ll usually get one or two comments agreeing or disagreeing with what you’ve written. That’s not to say either is right or wrong (there are many more than 50 grey areas in SEO these days), and that’s where you’ll get an idea from.

If someone has challenged you on a blog post you’ve written, you can either take inspiration from them and see whether you can write a blog post from their point of view, or you can dedicate a whole blog post to slam-dunking their argument into the bin. It’s up to you.

Use Existing Customers

This can be worked in a few ways. If you’ve had a great success story recently with one of your clients, brag about it. Why not? You’ve put in the hard work, determination and man hours, so a little recognition is what you deserve. Plus this puts your work right in the shop window, giving you something to show off to potential new clients.

Also, you could ask some of your clients for a review of your product or service. You can then flesh this out into a blog post where you agree, disagree or applaud your client’s review. This again puts you in the shop window, as people can see you’re understanding and appreciative of the feedback you get from your clients.

Facebook, Twitter, LinkedIn

If you’re proficient in Social Media then you should be on all of these platforms and better still, you should have networked with several other companies who are all in similar industries to you. This way you can peruse their updates and wait for inspiration to strike.

This could be critiquing what they’ve said on a certain matter, or even identifying yourself with what they’ve said, but go into further detail than they have. If they’ve mentioned something along the lines of one topic, this could set the cogs in your head whirring off on a different tangent, one you could write a blog about.

Follow these tips and you should find yourself with plenty to write about, saving yourself from the dreaded writer’s block.

Blog Post by: Greg McVey

Net66 SEO: Mixed Messages From Google Webmaster?

A post I read recently discussed what you need to evaluate when you get a manual penalty from Google. Some people gather as many link reports as they can and then scythe through them, hoping to cull anything that looks a bit out of the ordinary.

But the post I read talked about analysing just the links within Webmaster Tools itself. However, other examples have shown that analysing only the links in Webmaster Tools isn’t enough. Matt Cutts, head of the webspam team at Google, has recognised this, releasing the following statement:

“It’s certainly the case that we endeavor to show examples from the set of links returned by Webmaster Tools, and likewise we prefer to assess reconsideration requests on the basis of those links. However, if there’s a really good example link that illustrates a problem, we do leave enough room to share that link, especially because it can help point the webmaster in a better direction to diagnose and fix issues.”

So it shows that although Google do try, that is all they can do: try. So if you’re ever on the receiving end of a manual penalty, don’t rely solely on Webmaster Tools to recover. Use other link analysis software and examine as many links as possible to find what is hurting your site.

Net66 SEO: Content Marketing Myths

There has been such a big buzz around content marketing since it first emerged as a “thing” almost a year ago. As with all things SEO, it has been taken to excess, with a lot of people getting it wrong through a combination of over-enthusiasm and impatience. Here are some of the biggest myths concerning content marketing:

Shorter Blogs are Better

It makes sense to a degree. Everyone is so pressed for time these days that small, condensed blogs seem the right choice, as people can read them on the go and want the information there and then. But is that right? A lot of people are saying no, and there are a few posts I’ve read showing that those sites’ longer posts have benefitted from more exposure.

On average, posts between 1,100 and 1,400 words proved more popular than posts between 300 and 800 words. I’ve found personally, as well, that some of the more popular Net66 blogs have more content than the shorter ones. I’ve verified this through our Google Analytics.

More Frequent Content = Better Content Marketing

If you read the first point about longer blogs performing better, then you can already understand why more frequent content isn’t always best. Creating a blog post in excess of 1,100 words every day is no easy feat, unless you employ a solitary writer to spend a full day on each post, which of course isn’t really viable.

So what should you do? The answer is simple: take your time. Quality beats quantity every time, so here are some tips on writing a higher quality blog post:

> Find a subject worth pursuing
> Gather the correct stats
> Source some brilliant images
> Allow yourself the time to do it properly

After you’ve done all that, you might feel suitably tired and mildly as if you’ve wasted time that could have been spent doing something else. Well, you’re wrong! What you’ve done is draw up some quality content, backed it with stats and presented it beautifully. Much better than rushing out 200 words on something vaguely related to your subject.

If you follow the above two tips then you’ll soon find yourself with a lot more time to spend researching your blog posts and allowing yourself time to create them.

Blog Post by: Greg McVey