Net66 Web Design: Designing a Mobile Site – Top Tips

As a business owner you need to keep up to date with every aspect of your business, and that includes the website. Now a lot of us aren't tech savvy, much in the same way that we "computer geeks" aren't socially savvy. So here are some top tips to help you not only understand mobile design, but also have an input into the design process:

Keep it Simple

The mobile experience, although immersive in its own right, can get bogged down by too much content. A mobile device's screen is a lot smaller than even the smallest netbook, so if you have a 500-word, graphic-rich, video-heavy home page, a mobile device will take its time to load all of that content. So keep it simple. An image here or there isn't too bad, but you don't want reams of text either, as users will struggle to read it all on a small screen.

Ensure Compatibility

iPads, iPhones, Androids, tablets and now smartwatches. Each device released makes it harder for one website to do it all. Even with the ubiquitous iPhone, each new release brings a new screen to fit your website to. So you need to make sure your website looks how you want it to on a range of devices.

Responsive Web Design

A responsive website is built to respond to the width of the browser it's being viewed in. This ties in with the point above, as a responsive website can adapt to a whole range of mobile screen sizes. More than that, though: if you're on your computer and shrink your browser window, the website responds straight away, so you can still view all the content of your favourite website while keeping another window open.
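If you're curious how that width-based switching looks in practice, here's a minimal sketch of the same breakpoint logic expressed in script; the 768px breakpoint and the "mobile-layout" class name are assumptions made up for the example, not anything standard:

```typescript
// Illustrative only: reacting to browser width the way a CSS media query would.
// The 768px breakpoint and the "mobile-layout" class are made up for this sketch.
const mobileQuery = window.matchMedia("(max-width: 768px)");

function applyLayout(isMobile: boolean): void {
  // Toggle a class the stylesheet can hook into for the narrow-screen layout.
  document.body.classList.toggle("mobile-layout", isMobile);
}

applyLayout(mobileQuery.matches); // run once on load
mobileQuery.addEventListener("change", (e) => applyLayout(e.matches)); // and again whenever the width crosses the breakpoint
```

In a real responsive site the CSS media queries do most of this work on their own; the script version just makes the idea concrete.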

Full Website Option

Some people are quite averse to mobile websites and would prefer to view the website in full. So make sure you always include this option.

Follow these tips and if ever you want a mobile website designed, you’ll have a head start on what to do.

Blog Post by: Greg McVey

Net66 Google: Google Launches New Card Layout for Mobile Search

Web design has progressed a lot over the years, and it seems the current trend is to simplify everything. We've seen that most recently with Google's redesigned logo. This has led to a lot of companies redesigning their own websites with simplicity in mind.

Google is again rolling out their interpretation of the simple design by introducing a new layout for their mobile search. Known as a card layout, this design segments different areas of the results page and displays them as cards floating apart from each other over a static background.

This has previously been introduced on Google+ and it’s also similar to Facebook’s recent timeline update. But now it seems to be the turn of the Google Search Results themselves. You can see a screenshot below of what the new Card Layout looks like on the iPad:

Image courtesy of Search Engine Land.

I really quite like the new design. I'm also comfortable saying it'd work just as well and look just as nice on your computer. What do you think of the new layout?

 

Net66 SEO: Google Releases Hummingbird Algorithm Update

Google Hummingbird

Google last night revealed their biggest change to their famed search algorithm in three years. They tinker with the algorithm all the time, making slight refinements so they can better provide you with the search results you're looking for.

But this time they've made by far their largest update yet. This update will affect 90% of all search queries! That's right, 90%! To put that in perspective, Google's Penguin update affected 2.3% of English queries and caused an uproar that is still going to this day.

I'm sure you're aware of all the Penguin recovery techniques still doing the rounds, even now, months after it was rolled out.

So what does Hummingbird actually entail? Well, the main purpose of this update is to better answer the long-tail questions that are put to the search engine. For instance, before the update, if I searched for "How do I make sticky toffee pudding without raisins?" (because raisins aren't everyone's cup of tea), the previous algorithm would still have counted "raisins" as a search term, simply because it appears in the search string. To find recipes without raisins I'd have had to enter the more awkward search "sticky toffee pudding recipe -raisins", because when you add a minus sign before a word, Google returns only results without that word in them.

The Hummingbird update is set to change that, so that Google can understand more long-tail question searches. Previously the algorithm's main focus was matching short-tail keywords, which is why my search for a good sticky toffee pudding recipe took so long.

Google has also taken the time to update their voice search capabilities. They want communication with Google to feel as if you were talking to a friend. So if you were looking up information on Malta, you could say "Tell me about the history of Malta" and Google would come up with results for historical information about Malta.

But then you could say something along the lines of "What about its geography?", and Google would present you with geographical knowledge about the country without you having to repeat its name. Cool, isn't it?

I'm sure the Hummingbird update has a lot to do with how hard Google is pushing into mobile search as well, since on a mobile it's easier to hit one button and talk than to fiddle with the device's small keyboard.

Finally, in a nicely timed release, the Hummingbird update coincides with Google's 15th birthday celebrations. So, as well as a cute little game on the Google home page, the launch subtly shows how far Google has come in 15 years, with their most intelligent algorithm to date. Smart move, Google.

google-birthday

High Score for the day: 155

Blog Post by: Greg McVey

Net66 News: Apple Bodges Maps Again

So iOS 7 has launched to much fanfare. Serious fanfare, in fact. On my Twitter feed alone I've seen people leaving university (college, if you're American), people taking the day off work, and people using an impressive array of unsecured WiFi hotspots to secure the much-talked-about upgrade.

However, underneath this triumphant release of their new flagship software, there's been another problem with their maps. It seems Apple's approach of "release flawed software, apply a patch later", which has worked in the past, doesn't work all the time.

Especially not with Apple Maps, which has produced some pretty hilarious errors in the past. This time, though, it was far from a laughing matter, as the software directed people over an airport runway.

Well, not quite. You see, Apple Maps takes you along an access route used by pilots and then concludes by telling you to "take Taxiway B", which is presumably meant to be the safe route to the terminal. The reality is that you turn onto the access route and are met with open concrete all the way to the terminal dead ahead of you.

There are two problems here. 1) Some of that concrete belongs to the runway that Boeing 737s use regularly. 2) Apple Maps directs you to this expanse of concrete.

You would forgive people for thinking that Taxiway B does go all the way to the terminal, when in fact it doesn’t. Thankfully no one was hurt in either of the incidents, but I’m sure Apple Maps’ pride has suffered a wee bit of a blow.

Blog Post by: Greg McVey

Net66 SEO: Google Takes out Another Link Network

Google is no stranger to walking the walk after talking the talk, and they've proven this again this week by taking down a lot of websites from the link network Ghost 2.0. Somehow, in their infinite wisdom, Google have also managed to take down a lot of sites the network had in reserve, including some that hadn't even sent out any external links yet.

How they've done it, I'm not sure. Apparently neither are the link networks, as they've sent out an email. This is a screenshot highlighting part of what the link network had to say:

Ghost Link Network Penalised

This isn’t the first time this has happened either, take a look at our other articles on Link Networks getting penalised:

Net66 SEO: Another Link Network bites the dust.

Net66 SEO: Google going after more link Networks.

SEO Rumour Mill – SAPE Network Penalised?

It also appears that Matt Cutts sent out a subtle warning before the network was taken down. He said he was looking at ghost-related puns to use and eventually went for "they try to look super natural", referring to the artificially generated links.

Everyone loves a pun. And I suspectre you do too.

Blog Post by: Greg McVey

Net66 Tips – Onsite Optimisation: Building the perfect web page for SEO

One of the most challenging aspects of SEO is building a web page that is perfectly optimised not only for the algorithms but also for the user. The list of factors is endless, but I am going to show you what I, and the people here at Net66, think is the best way to optimise a web page.

Gone are the days when we could rank solely on meta data and keyword stuffing. Search engines have advanced over the years and it is now all about quality and relevance for the audience. There are also many other ways of generating traffic to your website, through social media, blogs, emails and so on.

Crawling and Accessibility

web crawler

This is crucial to check on your web page, as it can affect the performance of your website. Search engines read websites through automated bots programmed to look for certain specifics. Some of these include:

• Is the page with the content on the correct URL?

• Is this URL user-friendly?

• Is the robots.txt file blocking the robots from crawling any pages?

• If the page is down then are you using the correct status code?

Now these are not all the specifics, but to me they are the most important. So, what do they mean? Having a friendly URL structure ensures the bots can read your website more efficiently, which can only benefit you; it also makes it easier for users to understand what the content is about. The robots.txt file is a set of directives with which you can control which pages robots are allowed to crawl.

It is recommended that you check this to make sure you are not blocking any pages you wish to be crawled. Lastly, sometimes we can experience technical issues with our website (if not you must be doing it wrong…) and when this happens it is important to use the correct status code.

If a page is down temporarily, you should use a 503 status code, and if you need to redirect a page to a new address permanently, you should use a 301 redirect.
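To make those two codes concrete, here's a minimal sketch of serving them with Node's built-in http module; the "/maintenance" and "/old-page" paths and the one-hour Retry-After value are assumptions for the example, not recommendations:

```typescript
import { createServer } from "http";

// Minimal sketch of the status codes discussed above.
// The paths and the Retry-After value are hypothetical, chosen for illustration.
const server = createServer((req, res) => {
  if (req.url === "/maintenance") {
    // Temporarily down: 503 plus Retry-After tells crawlers to come back later.
    res.writeHead(503, { "Retry-After": "3600", "Content-Type": "text/plain" });
    res.end("Down for maintenance, back soon.");
  } else if (req.url === "/old-page") {
    // Permanently moved: 301 points crawlers (and users) at the new address.
    res.writeHead(301, { Location: "/new-page" });
    res.end();
  } else {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("OK");
  }
});

server.listen(8080);
```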

Content

This is the most important factor when it comes to successful SEO. The phrase 'content is king' is used by webmasters throughout the industry, and they are correct!

As the search engine algorithms have shifted and advanced over the past couple of years the two words which constantly arise are “quality” and “relevance”.

This is exactly what your content should be: quality, and relevant to your niche. Now obviously we still have to abide by Google's Webmaster Guidelines with regards to uniqueness and keyword stuffing (you know the drill).

At the same time, you need to try to target a specific keyword without breaking the rules... but how do we do that? Google does not look for that exact word 100% of the time; it now picks up other relevant keywords too. Here is a prime example:

knowledge graph

As you can see, Google highlights words relevant to swimming supplies, such as swimming goggles, swimming gear, swimming supply and equipment. So we can include these other relevant keywords in our body text, meta data, H1s, H2s and the alt tags on our images.

This shows Google that you have done some research, and it will reward you with that extra credibility. For ranking purposes, the exact keyword you wish to rank for should be included in the meta title, then broken up in the description, with a 2-3% keyword density in the body text.
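As a rough illustration of what that density figure means (occurrences of the phrase divided by total word count, times 100), here's a simple sketch; it uses a naive whitespace split and is nothing like Google's own analysis:

```typescript
// Rough keyword-density sketch: phrase occurrences / total words * 100.
// A simplification for illustration only, not Google's own measure.
function keywordDensity(bodyText: string, keyword: string): number {
  const words = bodyText
    .toLowerCase()
    .replace(/[^a-z0-9\s]/g, " ") // strip punctuation
    .split(/\s+/)
    .filter(Boolean);
  const phrase = keyword.toLowerCase().split(/\s+/).filter(Boolean);

  let hits = 0;
  for (let i = 0; i + phrase.length <= words.length; i++) {
    if (phrase.every((word, j) => words[i + j] === word)) hits++;
  }
  return words.length ? (hits / words.length) * 100 : 0;
}

// e.g. keywordDensity(pageCopy, "swimming goggles") -> aim for roughly 2-3
```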

Try to break up your content with images, bullet points, videos and short paragraphs. This keeps the audience interested in the content, and that feeds into another factor Google takes into consideration: user experience.

On both user experience and content, Google seem to be rewarding more engaging writing, so don't be shy: add your sense of humour (if you have one, that is) and start engaging with as many people as possible.

seo meme

Internal Links

When I first started out in SEO I used internal links for ranking purposes, linking to an internal page with exact-match keyword anchor text. What I found was that it didn't look natural, and there was always the risk of a telling-off from Google.

I now use my internal links wisely: I create user-friendly internal links pointing to the pages deemed most valuable for a given phrase or phrases, with still some keyword anchor text linking to the page I want to rank.

I find that a good, natural mix of anchor text is the way forward, especially after Penguin 2.0. Internal links also create paths for the bots to crawl your website: the more paths there are, the quicker your site will be read and indexed.
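If you want to see those paths for yourself, a quick browser-console sketch like the one below lists the internal link paths on whatever page you're viewing; it's purely illustrative:

```typescript
// Browser-console sketch: list the internal link paths on the current page.
const internalLinks = Array.from(
  document.querySelectorAll<HTMLAnchorElement>("a[href]")
)
  .filter((a) => a.hostname === location.hostname) // keep same-site links only
  .map((a) => a.pathname);

console.log(`${internalLinks.length} internal links found:`, internalLinks);
```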

 

Blog Post by Jordan Whitehead

Net66 SEO: Google Now Encrypting Every Search

Google is set to make SSL searches the default for all users of their search engine. They had previously stated that they would only encrypt searches for people who used the https:// version of the site, or who were signed into their Google account (YouTube, Gmail, AdWords etc.).

But now in a dramatic and sudden U-Turn, they’ve completely reneged on this and encrypted every search term being entered. Now there are two main theories going round the web here:

1. Google have acted on their users' concerns about data sharing with the US PRISM program. So, to draw users back to their site, they have assured us that they'll encrypt all searches so that no one with the power to intercept them can see what we're searching for.

2. Advertising sales are down and, as you may know, Google still passes keyword data on to advertisers. So by withholding keyword data from Analytics, they could push more people onto AdWords, thus driving up revenue.

I’m more inclined to think it’s a mixture of the two. But what do you think?

Blog Post by Greg McVey

Net66 News: Google Changes Logo – Or Figures the Bevel Setting on Photoshop

OK, maybe I'm being a bit cynical here, but if Google's logo were a physical thing, it looks like they've hit it with a brick. Not smashed it, just flattened it. Technically it isn't even a "new" logo, as Google has been using it internally for years now to save on printing costs. You can compare the two logos below:

GoogleLogo

 

To be fair, there is a noticeable difference in the colours, and it's bang on trend with the whole minimalist approach that's so prevalent these days. I do like it as well; it's neat and precise.

What do you think?

Blog Post by Greg McVey

Image: Justin Sullivan/Getty Images; Ars Technica

Net66 News: Move over Lazarus, Here comes Google!

Death

Now I know that Lazarus is from the Bible and is better known for rising from the dead. But being an avid Doctor Who fan, I watched the Lazarus episode, where the antagonist (bad guy) created a machine to restore youth, allowing for a longer life and potential immortality, and it's that reference I'm using.

Google will be launching a new company named Calico, whose ambition is to study ageing and how it can be slowed down, allowing for a longer life. Great, right? But why would Google, more of a technology behemoth than a dermatological or cosmetics company, do this? Well, there are two hypotheses.

First, there’s the idea of Google being altruistic in this and genuinely wanting to help us endure so we can learn, prosper and get the most out of life.

And then there's the cynical view: the longer someone lives, the more:

AdWords you can buy
Google Glass you can buy
Android revenue you can generate
Self-driving cars you can run

You can pick your favourite, and I'm sure there are more. My view is more aligned with the cynic's way of thinking, but as an SEO I've no real love for AdWords, so I sometimes just want to pan Google.

But what do you think?

Blog Post by: Greg McVey

Net66 Google: Léon Foucault Given Google Logo

You may have been pleased to see a new Google Doodle today. If you’ve not already seen it, just cast your eyes downwards:

Google Doodle

 

This is in honour of Léon Foucault, a French physicist who first demonstrated the Earth's rotation with his creation of the Foucault pendulum.

The way this worked was to set up a pendulum and place dominoes, or something similar, around the edge of the pendulum's swing, then set the pendulum swinging.

With no external force acting on it, a pendulum should simply swing backwards and forwards in the same plane. What Foucault's pendulum showed was that the plane of its swing slowly rotates over the course of the day, knocking over the markers one by one, and the only thing that can account for that is the rotation of the Earth beneath it.

Blog Post by: Greg McVey

Net66 SEO: Matt Cutts says IPv4 and IPv6 sites are not Duplicate Content

This is something that shouldn’t be of concern to most of you yet as a lot of ISPs haven’t yet made the jump to IPv6 connections for websites.

The difference between the two is the address format. An IPv4 address is the familiar kind, made up of four numbers between 0 and 255 separated by dots, such as 123.12.123.12. An IPv6 address looks quite different, consisting of eight groups of hexadecimal digits separated by colons, and a consecutive run of all-zero groups can be shortened to "::".
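For a concrete look at the two formats, here's a small sketch using Node's built-in net module; the addresses are documentation-range examples, not real hosts:

```typescript
import { isIPv4, isIPv6 } from "net";

// Example addresses only (documentation ranges), not real hosts.
console.log(isIPv4("192.0.2.12"));  // true - four dotted decimal numbers, 0-255 each
console.log(isIPv6("2001:db8::1")); // true - "::" stands in for the run of zero groups
console.log(isIPv6("2001:0db8:0000:0000:0000:0000:0000:0001")); // true - the same address written out in full
```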

So, in theory, if you serve your site over both, you have two versions of your site on two different connections, which would usually amount to duplicate content. Thankfully, Google can tell when the same site is simply reachable over a different connection, rather than sitting on a different host or domain (which would be a duplicate content issue).

Matt Cutts explains it all here:

Blog Post by: Greg McVey

Net66 SEO: Link Building – What Not to Do Post Penguin

Anchor Text - What not to do.

Optimise Your Anchor Text

To a degree you must still occasionally include the keyword in one or two of your links. But if you're looking to rank for "cheap building services" or something similar, you should not have a high percentage of your anchors using "cheap building services". You should also avoid having lots of close variations of the same anchor text, for instance:

Cheap Building Services
Low Cost Building Services
Cheap Builders
Low Builders

This will still alert Google that what you're doing isn't best practice. You see, the anchors are still over-optimised, because Google understands that all of that anchor text pertains to the same keyword and that you're trying to manipulate the algorithm by varying the anchor text.

You should always try to create links using your brand name, part of a sentence, or just your bare URL. All are better options than overusing "cheap building services" as your anchor.
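As a quick way to sanity-check a profile, here's a hypothetical helper that tallies what percentage of a set of backlinks uses each anchor; the Backlink shape is an assumption made up for the example:

```typescript
// Hypothetical helper: percentage of a link profile using each anchor text,
// handy for spotting over-optimised anchors like the variations listed above.
interface Backlink {
  sourceUrl: string;
  anchor: string;
}

function anchorDistribution(links: Backlink[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const link of links) {
    const anchor = link.anchor.trim().toLowerCase();
    counts.set(anchor, (counts.get(anchor) ?? 0) + 1);
  }
  // Convert raw counts into percentages of the whole profile.
  const total = links.length || 1;
  const percentages = new Map<string, number>();
  for (const [anchor, count] of counts) {
    percentages.set(anchor, (count / total) * 100);
  }
  return percentages;
}
```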

Blog Post by: Greg McVey

Net66 SEO: Google Updates its Links Tool in Webmaster Tools

People have often been frustrated by the lack of link data under the "Links to Your Site" option in Google Webmaster Tools. There were never that many links shown, and they weren't updated all that often.

But it seems that in the background Google had been planning one big update, rather than a steady stream of small ones, and now it looks like the big one has been released.

In a blog post released recently, Google announced that there will now be a "much broader, more diverse cross-section of links". I checked this out this morning and found a lot more links showing up in the Links to Your Site section.

What this also means is that a wider range of top-level domains will show up, rather than just the more country-specific ones. So rather than the focus being on .co.uk domains if your website targets the UK, it will now show many more domains from outside the UK, such as .coms, .nets and so on.

However, Google have still capped the number of links they will show at 100,000. They have made it clear, though, that the mix of links shown will be much more varied.

That means that if you have ever engaged in bad linking practices, you will get a better view of your link profile and be able to tidy it up more quickly and efficiently than you could with the previous set of results.

Blog Post by: Greg McVey

Net66 SEO: Matt Cutts on How to Recover from Panda

February 2011 changed the search engine listings forever, because that was the month Google first released their Panda update. The main aim of the update was to evaluate the quality of the content on a website. At the time, people were neglecting to really write content, focusing instead on how many keywords they could place in their text.

This naturally led to keyword stuffing and degraded the quality of content, as better-phrased sentences were always sacrificed to squeeze in keywords. Take the two sentences below as an example:

A) We’re a SEO Company Manchester who offer a wide range of Manchester SEO Services.

B) Based in Manchester, we’ve been offering SEO services in the county for years now.

Sentence A has clearly been written only to fit in the keywords relating to SEO in Manchester. You can tell because it doesn't quite read well; it's the omission of the "in" between "company" and "Manchester" that leaves the sentence a little off.

Sentence B, however, has been written properly, without forcibly squeezing in the keywords. So sentence B, according to Panda, is higher quality and should rank higher.

Content quality seems to be the only way to combat the Panda update, as Matt Cutts explains in the video below:

So if you do have a website that's been hit by the Panda update, you now know how to combat it.

Blog Post by: Greg McVey

Net66 SEO: A New Player in the Digital Marketing Game? Enter Adobe Target

Target

We all know Adobe as the company behind Flash, Photoshop, Fireworks and, of course, the much-loved Adobe PDF reader. So why has this creative company suddenly decided to release a new product designed specifically to help business owners understand the complex world of digital marketing?

More than that, it also has the best interests of the end user in mind. Not only will it help the business owner or SEO improve the digital marketing side of the website, but tidying up all the onsite issues will benefit the website as a whole, creating a more enjoyable experience for the end user.

It also deals with optimisation issues that some business owners find confusing. They may know everything about, say, how to whiten teeth, but nothing about analytics, so they can get confused by the difference between unique and returning visitors, bounce rate, time on site and the myriad other statistics Google displays through its analytics software.

With an online presence being a massive source of revenue for businesses, more and more owners, marketing execs, accountants and anyone else with a stake in the website will want to understand where their revenue is coming from and how the website has attracted it. That way they can make an informed decision on whether each marketing avenue is worth pursuing, after analysing how much business each stream has brought to the site.

The software also goes beyond mere analytics and offers targeting, with the chance to set up favourite audiences so you can make the most of your online market. They also offer a step-by-step process, so that even the most novice digital marketer has the chance to get a grip on their campaign and steer it in the direction they want.

You can see the video that Adobe have released below:

Blog Post by: Greg McVey

Net66 SEO: Google Potentially Rolling out an Update – Keeping Quiet

Google Panda

As is Google's wont, it seems they've released another update to their search algorithm. And, sticking to their "we won't confirm updates after Panda" line, they won't confirm anything. But large spikes in MozCast (Moz's tracker of day-to-day turbulence in the search results) suggest something significant has changed.

Usually this happens when Google do release an update and SEOs up and down the country see a change in their rankings and/or traffic. So far, the dates when chatter has been at its highest have been the 21st of August and the 4th of September.

We've yet to hear from Google, and it doesn't look like they'll be commenting any time soon. But the sheer volume of chatter on SEO forums makes it look to me like this was no mere fluctuation. This was something big, but as we're unlikely to hear anything from Google, we're just going to have to wait until the dust has settled and we can really analyse the facts and figures from the change.

It could be something to do with the Panda update, which deals with website quality issues, or it could even be something to do with the Penguin update, which examines in detail the links pointing to your website.

So have you seen any changes in your rankings or traffic?

Net66 SEO: Having a Manual Action Penalty Removed Doesn’t Help Rankings

Well, not instantaneously anyway. We've seen it a lot over the past year: people claim their website suddenly drops for all its rankings, and its traffic subsequently follows. So what causes their sites to drop in the first place?

Usually it's some form of manual action that Google applies to the website for violating one of Google's stringent yet vague Webmaster Guidelines. The usual story is that their links weren't completely in line with Google's rules, sorry, guidelines, so they clean them all up, submit a reconsideration request and huzzah! No more manual action.

They give it a week and see no return to their previous levels of traffic and rankings. So what's the problem? Well, it seems that although the penalty has been lifted, it's more a case of "we won't punish you further" than "we'll fix your rankings".

After all, would you learn your lesson if you could break the guidelines, rank well, get found out, receive a manual action penalty, get within the guidelines again and be restored? It'd be open season for black hat techniques if you knew you could try anything with impunity, clean it up and simply be restored to where you were.

So what are your thoughts on this? Do you think traffic and rankings should be restored straight away?

Blog Post by: Greg McVey

Net66 SEO: Mozilla Moves to Block Cookies on Sites

Back in 2011, Google decided it was going to encrypt all searches for people who were signed into their Google account. On the surface this was great: your searches are more secure, more anonymous and come with a whole lot of other privacy bonuses.

One of the drawbacks, however, at least if you're in the internet marketing business, is that when this was introduced a lot of keyword data in people's Google Analytics accounts started to be displayed as "Not Provided", much to the annoyance of those who relied on that data to measure just how successful their SEO campaigns had been.

This wouldn't be a problem in most cases; however, one study has suggested that up to 40% of the data coming through Analytics now descends into this inescapable pit of Not Provided. That's nearly half of all the keyword data an SEO receives, nullified. And now Mozilla, with its blocking of cookies, looks set to increase this number further.

It will do this if it decides that the Google search cookie isn't beneficial to the user, and blocks it. That's 20% of all search traffic GONE if Mozilla press forward with this. "Well, that's no good," I hear you cry; my sentiments exactly. But there is something you can do to help: there's a petition going round, set up by the Interactive Advertising Bureau, that you can sign here. I already have.

So do you think Mozilla will pull the plug on the Google Cookie?

Blog Post by: Greg McVey