Thursday, 21 March 2013

Recovering from a Hacked Website 3 — Off-page SEO


[Image: the relative importance of on-page and off-page SEO]
This is the third part of a series of articles in which I share the steps I took to recover my website Celtnet Recipes from a combination of years of neglect and being hacked.

In the first part I discussed how I removed the malware and hacked content and then protected my site from further attacks.

In the second part I went through how I updated my web pages, making them cleaner and faster to load whilst ensuring I had unique content and unique, meaningful meta tags (on-page SEO, in other words).

In this part I will go through some of the off-page SEO efforts that I have been undertaking on the site.

In a nutshell, off-page SEO means getting links to your site and promoting it through social media and content sites like Google+, Twitter, Facebook, StumbleUpon and Pinterest. I use all of these, either manually or by sharing my RSS feeds with them.

In terms of links, I started with RSS feeds. New content was automatically pushed to one RSS feed, while old content and essentially static content was pushed through another. These feeds were published to feed aggregator services and automatically republished to Twitter and Facebook.
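To give a flavour of how content gets into such a feed, here is a minimal sketch of an RSS 2.0 generator in PHP; the database credentials, table and column names are assumptions for illustration, not my actual schema.

<?php
// Minimal sketch: expose the 20 newest recipes as an RSS 2.0 feed.
// Credentials, table and column names below are illustrative assumptions.
header('Content-Type: application/rss+xml; charset=UTF-8');

$db    = new mysqli('localhost', 'user', 'password', 'recipes');
$items = $db->query('SELECT title, url, summary, added
                     FROM recipe ORDER BY added DESC LIMIT 20');

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo "<rss version=\"2.0\"><channel>\n";
echo "<title>Celtnet Recipes - New Recipes</title>\n";
echo "<link>http://www.celtnet.org.uk/recipes/</link>\n";
echo "<description>The latest recipes added to Celtnet Recipes</description>\n";

while ($row = $items->fetch_assoc()) {
    printf("<item><title>%s</title><link>%s</link><description>%s</description>" .
           "<pubDate>%s</pubDate></item>\n",
           htmlspecialchars($row['title']),
           htmlspecialchars($row['url']),
           htmlspecialchars($row['summary']),
           date(DATE_RSS, strtotime($row['added'])));
}

echo "</channel></rss>\n";
?>

Once a script like this is live, its URL is what you hand to the aggregators and to the Twitter/Facebook auto-publishing services.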

For Google+ I semi-automated publishing by loading my RSS feeds into Google Reader and then publishing any new content to G+ by hand (Google does not like duplicate content on your Google+ pages).

That took care of the social media. Though article directories are not valued as highly by Google as they once were, they can still be a useful source of in-bound links, so I started an article campaign for each page as I updated it.

I also put out a press release whenever a page or section of my site reached a milestone, marked an important event, or could be linked to something in the news.

Next I began guest blogging (by far one of the best ways of getting in-bound links with real Google 'juice' these days). I also started commenting on various blogs to get my name seen and my site's profile raised.

Next I hired a few gigs on Fiverr to get some of this work done for me. There are scam artists there, but there are also some real bargains to be had. The Warrior Forum is also a good source of information and of potential hires for outsourcing this work.

Within a month of starting, my income was beginning to increase again and I was able to use that increase to hire two SEO companies with different approaches to begin link building for me. We're just at the start of this process, but it's looking promising and I hope to give the two companies more URLs next month.

This is going to be a long slog, but I can already see improvements in the SERPs (and, more importantly, in income). As long as this improvement continues, in three months I will be back where I was before the trouble started and in six months I might even be able to live off the income from my website for the first time...

But I am trying not to think about that yet, as there is still a considerable amount of remedial work to be done, and in the meantime I still need to get the day-to-day work of adding content and updating old content going.

In the next article I will go into detail about a new aspect of SEO related to content ownership, rich snippets and Google+.

Tuesday, 19 March 2013

Recovering from a Hacked Website — Part II, On-page SEO


[Image: SEO text with an arrow going upwards — SEO optimisation of web pages]
In the first part of this series on recovering from a hacked website I detailed how I discovered and removed the badware from my site and then hardened my code so that it could not easily be hacked again.

Today's post is the second part of the series, detailing how I improved my on-page SEO.

The truth is that, even had my site not been compromised, its look and feel had become dated and much of the code was bloated, because I had been using it as a learning platform for programming and site development.

Also, as I had moved to cached pages to improve response times and reduce the load on my MySQL databases, I had geo-location scripts that no longer worked. The geo-location scripts themselves were also causing problems, as they too were hitting my MySQL instance. I'm UK based, but almost half my visitors were from North America, so I needed to serve slightly different content to the two audiences.

My first solution was to move to a static, file-based geo-location system that did not use the database. This freed up MySQL resources but increased server load — not an optimal solution. Then I came across Google's JavaScript APIs. They had already solved the problem, and by integrating this with some nifty JavaScript to load my ads and other content I could have cached pages that remained dynamic for some content without the need to bring in external IFrames... sorted!
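For anyone curious about the first, file-based approach, here is a minimal sketch of the idea; the file name, its format and the default country are my assumptions for illustration, not the script I actually ran.

<?php
// Minimal sketch of a file-based geo lookup. 'geo-ranges.csv' is assumed to
// hold lines of "start,end,country" (numeric IPs, sorted by start address).
function country_for_ip($ip, $file = 'geo-ranges.csv')
{
    $ipNum  = (float) sprintf('%u', ip2long($ip));   // unsigned numeric form of the IP
    $ranges = array_map('str_getcsv', file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));

    // Binary search over the sorted ranges
    $lo = 0;
    $hi = count($ranges) - 1;
    while ($lo <= $hi) {
        $mid = (int) (($lo + $hi) / 2);
        list($start, $end, $country) = $ranges[$mid];
        if ($ipNum < (float) $start) {
            $hi = $mid - 1;
        } elseif ($ipNum > (float) $end) {
            $lo = $mid + 1;
        } else {
            return $country;                         // the IP falls inside this range
        }
    }
    return 'GB';                                     // default to serving UK content
}

// Serve North American ad blocks to US/Canadian visitors, UK blocks to everyone else
$adRegion = in_array(country_for_ip($_SERVER['REMOTE_ADDR']), array('US', 'CA')) ? 'NA' : 'UK';
?>

The trade-off is exactly the one described above: no database hit, but the lookup work (and file read) lands on the web server instead.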

OK, one headache removed. What about page design and removal of bloat?

I needed to keep some of my original headers, but wanted to move to a cleaner design and cleaner, more readable fonts. I took Wikipedia as my inspiration, then dusted off my CSS files and stripped out my base three column format definition.

I designed that in CSS a few years ago to give me a header, left column (1), central column (2), right column (3) and footer. There was a nasty hack in there for older Internet Explorer versions, but a quick internet search showed me how to fix that.

Now I had a three column definition with no IE hack, but which still kept the useful feature of having the central, main column, content come first, right after the header, making it more visible to search engines. So my columns in the page code came in the order 2, 1, 3. But with the CSS they all appeared in the right places.

I now had the base CSS file defined, and I kept the main header from the original files, with a few tweaks.

I added my new font definitions and some extra definitions for headers and for the appearance of the left and right columns. I now had a functioning CSS definition that was clean and a third of the size of my previous version. I also had a separate CSS definition for printing that omitted the left and right sidebar content and stripped out the ads; I updated this from my new display CSS and, again, the file size dropped.

Now that I had my CSS, I used an online tool to strip out the extraneous spaces, reducing the file size further, then cached a compressed copy and called this from my web pages. Clean and compressed CSS: a huge space saving.
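As a rough illustration of that caching step, something along these lines will minify the stylesheet, store a gzipped copy and serve it; the file names and cache lifetime are assumed for the example, not my production setup.

<?php
// Rough sketch: keep a minified, gzipped copy of the stylesheet and serve it.
$source = 'style.css';
$cache  = 'cache/style.min.css.gz';

if (!file_exists($cache) || filemtime($source) > filemtime($cache)) {
    $css = file_get_contents($source);
    $css = preg_replace('!/\*.*?\*/!s', '', $css);       // strip comments
    $css = preg_replace('/\s+/', ' ', $css);             // collapse runs of whitespace
    file_put_contents($cache, gzencode(trim($css), 9));  // store the compressed copy
}

// In practice you would first check that the browser accepts gzip, as in the
// page-compression snippet below, and fall back to a plain copy if it does not.
header('Content-Type: text/css');
header('Content-Encoding: gzip');
header('Cache-Control: max-age=604800');                 // let browsers cache it for a week
readfile($cache);
?>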

For the main web pages themselves I also added page compression using the following PHP code in the PHP section before the main HTML code:

// Compress the output if the browser says it accepts gzip encoding
if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip'))
    ob_start("ob_gzhandler");
else
    ob_start();


This was completed with the PHP code:

ob_end_flush();

at the very bottom of the file.

Now the pages would be sent compressed to any web browser that would accept compressed pages. A bandwidth and speed win.

The next problem was a little trickier. As my site had grown, I had lost sight of what I had been doing with the content. I had my main Celtnet Recipes home page, http://www.celtnet.org.uk/recipes/, and beneath that were pages for the regions of the world I had recipes for, for the various countries I had recipes for, and for the ingredients and cookery techniques those recipes covered.

To increase my chances of the links being indexed I had set a limit of 100 recipe links per page, with additional recipes on succeeding pages. This was fine whilst my site was growing, but eventually I had hundreds of recipes for some pages and thousands for others. This meant that the introductory content of my home page for that country/region/ingredient was being replicated hundreds and hundreds of times. Though the links were different on each page, the main text was often longer than the link list, which meant massive duplicate content and duplication of the title and description tags in the page headers.

This, of course, is VERY BAD SEO. So I updated my headers to include the page's position: the first page was the 'Home' page, then the '2nd', '3rd', '4th' pages and so on. This meant that the title and description tags for each page were different.

For the duplicate content problem, I added a short description of what the page was about to the top of each page. This was essentially the same for every page, but included the page number.

Next I wrapped my main text content in PHP so that it would only display on the home page, then added some boilerplate to all succeeding pages of the form 'This page is a continuation of my recipes listing from Britain, the nth page in fact. For more information about Britain and its cuisine, please visit the <home page>.'

or words to that effect. The overall effect was to make the boilerplate text on each page shorter than the link list below it, so that the search engines would see each page as unique content sitting below the home page for that entry.
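As a simplified sketch of how this fits together (the parameter and variable names here are illustrative, not the live site's code):

<?php
// Sketch: page-number-aware title/description tags plus conditional intro text.
// 'page' and the region name are illustrative; the real site derives these itself.
$page      = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$region    = 'Britain';
$introText = 'The full introduction to the region and its cuisine goes here.';

if ($page == 1) {
    $title = "Celtnet Recipes: Traditional $region Recipes";
    $desc  = "Traditional recipes from $region, with background on the cuisine.";
} else {
    $title = "Celtnet Recipes: Traditional $region Recipes, Page $page";
    $desc  = "Page $page of the $region recipe listing on Celtnet Recipes.";
}
?>
<title><?php echo htmlentities($title); ?></title>
<meta name="description" content="<?php echo htmlentities($desc); ?>" />

<?php if ($page == 1): ?>
    <!-- the long introductory text appears only on the first page -->
    <p><?php echo $introText; ?></p>
<?php else: ?>
    <p>This page is a continuation of my recipe listing for <?php echo $region; ?>
       (page <?php echo $page; ?>). For more information about <?php echo $region; ?>
       and its cuisine, please visit the <a href="/recipes/">home page</a>.</p>
<?php endif; ?>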

Now I had pages with cleaner HTML and neater CSS, pages that were more readable for humans and whose content was more distinct than before. With compression and caching they also loaded faster.

Next I validated the code as far as I could. This did reveal a few glaring problems that I fixed, but because I had some third party code and Web 2.0 buttons, the pages did not validate completely. But I did not worry about that as the main content was fine.

To get some extra load time benefits I played around with asynchronous JavaScript loading for some of the ads and some of my own JavaScript (the main page search runs off JavaScript and I also have JavaScript that adds a copyright notice to any text copied from the site).

Again, this brought load time benefits.

From a very bad start I had significantly improved my web site's appearance, on-page SEO and load times. Of course, replicating this throughout the site was (and is) a big issue and it's still on-going. But it's giving me a chance to completely overhaul the content and to make things a lot better overall.

This is very much a work in progress, but it's getting there.

Next time I will be talking a little about something that we all hate, but have to do... off-page SEO.

Monday, 11 March 2013

Recovering from a Hacked Website — Part 1


Part I, Defending your Site


About 18 months ago my website was hacked quite badly, and this series of articles is about how I recovered (or am recovering) from that, with a few insights and tips along the way as to how you can avoid what happened to me in the first place.

I've been on the web since about 2003 and in 2004 I began my site, Celtnet (http://www.celtnet.org.uk). About a year later I started a new section on the site, Celtnet Recipes (http://www.celtnet.org.uk/recipes/). Initially I added a bunch of Welsh Recipes there to go with the various Welsh Legends and lists of Celtic gods I was adding to the main section of my site.

The site was growing and needed to be more dynamic, so I moved over to PHP on top of a MySQL database as my content delivery platform and added a forum as well as an article directory.

Over the years the recipe section grew, initially just with those recipes that I found interesting. Then I started on a personal project to add to the site recipes from each and every country in Europe and then recipes from each and every country in Africa. All the time I was also adding traditional and British recipes.

By 2010 the British and African recipe sections had grown into some of the largest on the web and I was getting lots and lots of visitors. This was converting into quite a decent income.

[Image: Neo from The Matrix stopping bullets — how to make your website bullet (and hacker) proof]
However, though I was expanding the site and updating the code quite often, it was still really only a hobby site for me. I had begun some limited SEO and was working on the in-bound links to my site, but nothing serious.

Then, in 2011, things went a bit awry in my life and I lost interest in pretty much everything. The site was left ticking over and in 2012 it was hacked through a vulnerability in the phpBB forum system. My rankings in Google began to tumble and it was not really until February 2013 that I began to take notice.

By that time things had gone very wrong indeed. All the most popular sections of the site had been compromised and I was nowhere to be seen in the searches that I had previously been most popular for.

Some of the damage was historic: I had cleaned out the bad code, but the Google spiders had stopped coming and I had pharmaceutical and adult spam all over the place (the headers of files had been compromised so that Google's spiders, and visitors referred from Google, saw different content to someone coming directly to the site).

As I say, I had been running it as a hobby site, so I quickly had to learn how to harden it, and in the process I decided to give the site an overhaul to improve my SEO and to get more links.

The first thing was to go over all the code with a fine-tooth comb and to remove some of the riskier JavaScripts I was running. This meant that two of my advertisers had to go, as they relied on public domain code that was just too easily compromised (this had been responsible for some of the exploits — redirects to spam sites).

Next I found that some of my SQL query code had not really been optimized. The main database had grown so big that queries were failing, so I went through all my SQL and optimized everything. That got the site running at full speed again, with queries completing once more.

But to reduce the load on the server overall I decided to cache my pages, also in the hope of making page queries faster.
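The caching itself can be done very simply with PHP's output buffering; the sketch below shows the general idea, with an assumed cache directory and a one-hour lifetime rather than my actual settings.

<?php
// Rough sketch of whole-page caching with output buffering.
$cacheFile = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$cacheLife = 3600;                             // one hour, for illustration

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheLife) {
    readfile($cacheFile);                      // serve the stored copy, skipping MySQL entirely
    exit;
}

ob_start();                                    // otherwise build the page as normal
// ... the usual page generation (database queries, templates) goes here ...
$html = ob_get_contents();
file_put_contents($cacheFile, $html);          // store it for the next visitor
ob_end_flush();                                // and send it to this one
?>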

But this exposed another vulnerability. As the site runs on PHP, SQL injection and the insertion of malicious code into the cache became a possibility.

So, every page that required a variable from a GET or POST request had htmlentities() wrapped around that variable to prevent malicious code being inserted.

For example, say I require a variable timeStamp to be passed to the script.

My original code defined this as:

$timeStamp = $_GET['ts'];

The new header code defined it as:

// encode anything with HTML significance before the value is used or cached
$timeStamp = htmlentities($_GET['ts']);

so that characters with special meaning in HTML, such as '<', '>', '&' and quotes, are encoded as HTML entities rather than passed through raw.

Next I put checks around every variable to ensure that it was valid. If I expected a variable to be numeric only, I checked for that; if it had to follow a specific format, I checked for that too.

The most pain was with searches, of course. But there I made a list of all potentially malicious characters and common malicious code strings, and I stripped those from any user input before performing any searches or caching any pages.
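To illustrate the sort of checks involved, here is a hedged sketch; the parameter names (id, date, q) and the whitelist are examples of the approach, not the site's real code.

<?php
// Numeric-only parameter: anything that is not all digits is rejected
$recipeId = (isset($_GET['id']) && ctype_digit($_GET['id'])) ? (int) $_GET['id'] : 0;
if ($recipeId === 0) {
    header('HTTP/1.1 400 Bad Request');
    exit('Invalid recipe id');
}

// Parameter with a fixed format, e.g. a YYYY-MM-DD date
$date = isset($_GET['date']) ? $_GET['date'] : '';
if (!preg_match('/^\d{4}-\d{2}-\d{2}$/', $date)) {
    $date = date('Y-m-d');                       // fall back to today's date
}

// Free-text search: whitelist letters, digits, spaces, apostrophes and hyphens,
// then cap the length, before the term is used in a query or a cached page
$rawQuery = isset($_GET['q']) ? $_GET['q'] : '';
$query    = preg_replace("/[^A-Za-z0-9 '\\-]/", '', $rawQuery);
$query    = substr(trim($query), 0, 100);
?>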

Next I deployed BadBehavior throughout my site. This stopped the majority of known malicious bots from even getting to see any of my pages and malicious attacks dropped considerably (server load also dropped as a result so it was a win-win).

Now I checked the permissions on all my directories and tightened or changed them, then I updated all third-party software to the latest versions; where a package was still causing problems I switched over to something else.

With the site suitably protected and hardened, the next step was to undo some of the damage done to the overall SEO.

I will detail those steps in the next article...

Wednesday, 24 October 2012

A Tale of Three Blogs



This blog is about technical reviews, SEO, internet marketing and all that schtick. And like everyone at the moment I need some extra money. So I decided to perform an experiment. I have two blogs: one that is mature but which I have not done much with, and one that is new but updated frequently. I also have an AdSense account for my main website, Celtnet... almost everything I need for a little experiment.

I will be adding another blog to the mix and from now until the end of November I am going to be running these blogs in parallel, whilst using different strategies to promote and monetize them. Below I will introduce the blogs and explain my strategies.

1. Celtnet Recipes Blog.

This is the most mature of the blogs; it's been going since 2008. However, I have been concentrating on running my main website, so this blog has not really been updated for a long while. As you might have guessed from the title, it's a recipe-related blog. Originally I used it to feed traffic and links to my main Celtnet Recipes site. It had a steady but small traffic flow of about 30 visitors per day. So this is an established blog with not much content and not much traffic.

As Google now allows product ads in the UK, I am going to monetize this blog using AdSense ads and by putting ads for recipe-related products in the blog body. I am going to post one or two new recipes to the blog every day. I will get backlinks from commenting on other recipe blogs, and I will also try to get five backlinks from articles to the blog or to individual posts every week.

I am going to add a 'popular blog posts' widget and I will link as many posts as possible into themes so there are crosslinks in the blog.

And that will be the whole strategy.

2. Dyfed's Adventures in Publishing.

I started this blog about a month ago, mainly as a personal journal about my various writing and publishing exploits. It has some, but not much, content and very little traffic. I have not really promoted this blog at all.

I will use AdSense ads and I will also use this blog to sell my own eBooks as well as putting links to relevant affiliate products for writers. This will be completely content based and I will (I hope) provide good and useful information.

To promote the blog I will use comments on appropriate blogs, article marketing and guest blogging only.

3. Repossession Homes.

This is a completely new blog with no content. And it's never going to get any meaningful content. What I will add are basic descriptions of houses and electronic products and AdSense ads.

This blog will not be promoted in the usual ways at all; instead I will use free ads on Craigslist and the like to entice visitors to it. There will be no links on the blog apart from ad links, and hopefully visitors will click.

Aims:

Here, then, are three different blogs, each with different content and different marketing and linking strategies. My aim is to make a minimum of $300 per month from each of them.

Whether that is achievable or not, I do not know, but I think it's an interesting experiment to find out.

Basically, this experiment is to find out whether an existing domain or a new one is better, whether guest blogging is better at getting traffic than blog commenting and article links, and whether you can actually make an income from a blog with minimal traffic and absolutely no marketing at all.

Things have been running for about a week now and traffic on the Celtnet Recipes Blog has increased to over 100 hits per day. Dyfed's Adventures in Publishing has gone from nothing to just over 30 hits per day, with occasional spikes of 80 hits or more when articles are published. The final blog gets occasional spikes of traffic, but it's very low. However, it only has two pages of content at the moment; that will increase as I add a new page and new ads every day.

The aim is to work on each site for only an hour every day, leaving time for writing other articles and managing my main site.


And this blog? Well, think of it as a control. I will add content as and when I can, but I will not build any links to it. It will be used just to document what's happening with the other blogs, so that there will be a complete record of what's happened over the next month or so.

Thursday, 18 October 2012

Viral Marketing with eBooks and Widgets

Introduction

There are a number of ways of creating a viral 'buzz' on the internet. The aim of these campaigns is for your link to spread throughout the internet as quickly as possible, using social media and standard link systems to propagate information about your viral 'bait' across the web.

If you are a good writer or a paparazzo then this could be a story or a photograph. But if you are an internet marketer it's more likely to be a free report, or maybe an app or widget.

As a writer and a sometime programmer the eBook and the widget are my preferred methods and it's these that I will discuss here. Of course, I will use my own content as examples (which is another excuse for me to give them a plug).

The Free eBook

Giving away free eBooks and free reports has long been a central marketing strategy... and it still works. The aim is to provide an eBook (you can use a mix of blog posts and fresh content) that is full of links to your website. If you have a product you can also plug that product at the end of the book.

I've done exactly this with my Free Halloween Recipes eBook. You just go to the link page I have provided and download the eBook — free, no strings. Most marketers want you to enter your email address before you grab the eBook, as this lets them grow their list. But I am not that kind of marketer.

I'm a writer and the eBook has links to the other recipe books that I've written.

Basically what I'm doing here is providing something useful for people completely free, but at the end of the eBook I am providing links back to my site and links to the other eBooks I have written.

On the download page I tell people that they are free to download and distribute the eBook as they see fit. They can send it to their friends or add it as a download on their own websites. This is the key to any viral campaign: get the word to spread as far as possible.

I also know a marketer with a very large list (2 million people) who will send the eBook to it, which is an incredibly effective form of viral marketing.

So why don't you go over to my other site and download the Free Halloween Recipes eBook for yourself, to see what I have done with it and how the process works?


The Free Widget

Widgets can also be a great vehicle for viral marketing and for getting backlinks to your site. I wrote a JavaScript tool to convert between different volume, weight and temperature measurements a long while ago. Then it dawned on me that I could package what I had done in a simple way that would be suitable for the sidebar of a website or a blog. I added this to my blogs and my website and it worked well.

It was served from a central site so I could update and adapt the application, but it could be deployed anywhere. A little tinkering with the code and I had something that anyone could use.

I branded it and put a backlink to my site in there. What I eventually got was this:

[Embedded widget: Unit Converter]


A classic functioning widget. I then put a 'This widget on your site' link in the widget and I wrote about it on guest blogs, saying that the code to embed it in websites and blogs was freely available for anyone to use. Then I watched the code get used and the backlinks to my site increase.

Some very simple coding proved to be incredibly valuable in getting links back to my site.

If you run a recipes or food site, or any site that uses measurements, then you can go to my 'Add the Celtnet Unit Converter Widget to your site' page to grab it for yourself.

Conclusion

Neither of these two methods took me very long to create, but by being clever I was able to distribute them across the web. The eBook literally landed on millions of users' desktops and the widget was placed on many hundreds of sites.

I obtained lots of backlinks and lots of visitors (and sales) to my sites and products. All for a couple of days' work, all told.

Viral marketing really, really, works.

Tuesday, 16 October 2012

Best-paying AdSense Niches


If you are writing a website or blogging for money, then you have probably started writing about something that you love and know. This is a great way of getting a name, but may not be such a good way of making money.

The truth is that if you are blogging for an income, the right niche to choose may not always be something you know a lot about. Indeed, the topic you end up writing about may not even be something you like.

Most blogs and websites make money from advertising, and the money you get from the advertising reflects what the person paying for that advert is willing to pay. Advertisers bid on ads based on keywords. The more competitive a keyword is, the more they will have to pay for an ad spot.

Topics where the ads are expensive are known in marketing parlance as 'high paying niches'. These include areas like consumer electronics, health and fitness, credit cards, home mortgage loans, etc. In these areas, because lots of advertisers are bidding on the same keywords, the ads that are displayed on your website will earn you more for every click that someone makes on them.

In some niches, ads have been known to pay out over $5.00 per click. 99% of the time, those ads come from highly competitive advertisers who spend a lot of money on developing attractive ads (which often means that the number of clicks you get also increases). After all, if a company is paying a lot for an ad, it only wants its ads displayed where readers are most likely to purchase something from it, so it is important that your niche is unique, relevant, and has good advertiser competition.

The best paying niches are very competitive, and are almost impossible for a beginner to break into. That said, every niche has sub-niches and you will be surprised at which ones have very few people creating content for them.

So, if you are thinking of creating a new blog or website, do your research first. Start searching for keywords that might relate to your website and you can instantly see how much people are paying to place those ads on a blog just like yours.

To get you started, some of the best-paying niches out there include:

Health: Anti-ageing, Weight Loss and Fitness, Dentistry and Orthodontics, Cancer

Insurance: Health Insurance, Medical Insurance, Auto and Car Insurance

Computers & Internet: Dedicated Hosting, Computer Repairs, Computer Hardware, PC & Internet Security, SEO Software and SEO Tools, Online Degrees, ISPs and Networking, Stockbroking/Automated Trading and Web Development

Banking and Finance: Debt Consolidation, Credit and Finance, Loans and Mortgages

Legal Representation: Just search in Google for "attorney increased the bidding rate"

Digital Photography: Careers in Photography, Digital Photography Education, Digital Photography Schools and Certificates, Wedding Photography

Psychotherapy: This could be included under Health, but it can also be treated as a separate niche since it generates high revenues

Real Estate: Real estate-related niches are good for generating high revenues from Google AdSense

Consumer Electronics: Electronic Devices, Gadgets, Mobiles and others

Antiques and Collectibles: This niche also gives high AdSense earnings

Dating: A good niche, since there are a lot of sites that pay well for dating-related keywords

Leveraging Hashtags for Campaigns and Traffic

Put simply, a hashtag is a word or phrase prefixed with the hash (#) symbol. The symbol has a long history: it was used in the early days of Unix to mark comments and metadata in code. Today, however (unless you are a programmer), the hashtag is more commonly used in short messages on microblogging and social networks (such as Twitter and Google+) to denote discussion topics. For example, #Wikipedia is the hashtag for Wikipedia discussions, and searching for the string #Wikipedia with a search engine or on Twitter will bring up messages containing that tag. Such tags have to be a single string of characters (no spaces) and are case insensitive, though they are often written in mixed case (known in programming circles as CamelCase) to make them more readable (e.g. #HelpCeltnetRecipes).

Hashtags themselves began on internet relay chat (IRC) networks in the late 1980s as a way of labelling groups and topics, marking individual channels as being on a particular subject or of interest to a specific group. This inspired Chris Messina, the open source advocate, to propose that a similar system could be used on Twitter; indeed, it was he who posted the first tweet to incorporate a hashtag, in 2007. It was only in 2009, though, that Twitter produced a search system for hashtags. This, of course, made them much more useful and began the explosion in their use that we see today.

Because the Twitter search system is public, you can add hashtag searches to your own web pages. To add a hashtag search to your web page, simply use the URL http://twitter.com/#search?q=%23xxx (replace the xxx with the text of your hashtag, but do not add the hash sign '#' itself). For example, #HelpCeltnetRecipes would be http://twitter.com/#search?q=%23HelpCeltnetRecipes.

Because search is a navigation tool within the Twitter platform, this functionality became yet another way that people could discover new and interesting tweets outside their immediate group of followers. And because adding the # at the beginning of a word flags it as a searchable string in both Twitter and Google+ feeds, you can start a new discussion yourself with a hashtag, or point users to a discussion topic based around that string. When a user clicks on a #hashtag keyword in a tweet, or searches for it in the Twitter search bar, they see a real-time stream of tweets that include the phrase. As a result, hashtags allow users to follow discussion threads as they happen.

You can even add multiple #-linked keywords to your tweets. It's best not to add more than three, though, or it can start to look like spam. For optimal results, fit the hashtags into the text of your tweets; it saves space. For example: Follow #CeltnetRecipes and see how the #HelpCeltnetRecipes campaign is performing on #Twitter. You can also search for multiple hashtags as well as single ones. This is an excellent marketing tool, as users who specifically search for your #hashtag keyword combination are guaranteed to be interested in your subject, and it allows fine-tuning of hashtags down to a very specific discussion.

Given that tweets are basically free, you have room to experiment with keywords. Indeed, when using #hashtag keywords you're likely to find that it's not easy to predict which words or phrases will attract your audience, so try some out. Test broad keywords and also more specific ones. You have nothing to lose. When you find one that works, stick with it.

When a hashtag becomes popular it is 'trending' and is put on the front page of Twitter and on Google+ pages. This can potentially put your tag in front of a great many people, though of course you never know which keywords will trend. Trending is a double-edged sword: if your hashtags are too general they will be lost in a mass of other tweets and posts, but if your hashtag is too specific no one will look for it. Experimentation is key, and if you crack it, traffic will flood into your site. If you want to see whether your hashtag is original or not, you can use hashtags.org. Hashtags are a very valuable marketing tool. Start using them now!
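If you want to generate those search links programmatically, a tiny PHP helper along these lines will do it; the function is my own illustration, and the URL format is simply the one quoted above.

<?php
// Build the Twitter hashtag search URL described above, url-encoding the '#' as %23.
function hashtag_search_url($tag)
{
    $tag = ltrim($tag, '#');                  // accept the tag with or without a leading hash
    return 'http://twitter.com/#search?q=' . urlencode('#' . $tag);
}

echo hashtag_search_url('HelpCeltnetRecipes');
// prints: http://twitter.com/#search?q=%23HelpCeltnetRecipes
?>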