JournalHome.com has created this blog for the purpose of keeping members and visitors up to date with JournalHome.com site information. The easiest way to stay up to date is by using the RSS/XML feed. Place this URL within your RSS aggregator and it will update automatically whenever a new journal entry is published.
Happy new year to all, I wish you all a blessed and wonderful 2013.
At the time of writing this entry I think there are only a few countries left around the world that are still in 2012, but for the most part we made it!
Personally, 2012 was a great year, full of happy memories, (with a few ups and downs), and I can only hope that 2013 will be even better.
For Journalhome, 2012 was not so great: we had a lot of issues with the servers, not that many new features were added and no new templates were added at all. But that does not mean that I was doing nothing, (I promise). I have stabilized the code a lot more, and 2013 should see a lot more new features that should enhance your stay here. Unfortunately, the background work has to be done before all the 'pretty' stuff can be done.
But, in the end, we all made it, the end of the world came and went, so 2013 can only be better from now on!
I manage an old vBulletin (3.6.9) board and from time to time I need to moderate some posts/threads. One of the problems I have is that, as the only moderator, I sometimes have 5000 or more posts/threads to moderate.
By default, vBulletin tries to display/moderate all the threads at once. This is ridiculous: even if your server can handle 5000+ queries in one script, your browser certainly cannot, (you would need a truckload of memory to handle it).
So the best way is to edit the vBulletin code and limit the number of posts being returned, that way you can moderate ~500 posts/threads at a time.
Look for the file called "moderate.php", it is found in the modcp folder.
Look for the code
$moderated = $db->query_read("SELECT * FROM " . TABLE_PREFIX . "moderation");
And add the following limit checks around the loop that reads those results, to make sure it does not get out of hand. This is a sketch, (the exact loop in your moderate.php will look a little different, but the idea is the same):

$moderated = $db->query_read("SELECT * FROM " . TABLE_PREFIX . "moderation");

$threads_seen = 0;
$posts_seen = 0;
while ($moderation = $db->fetch_array($moderated))
{
	// Skip everything over a safe batch size, (~250 threads and
	// ~500 posts per pass); the rest will show up on the next pass.
	if ($moderation['type'] == 'thread' AND ++$threads_seen > 250)
	{
		continue;
	}
	if ($moderation['type'] == 'reply' AND ++$posts_seen > 500)
	{
		continue;
	}
	// ... the existing code that handles each item stays here ...
}
I know it would have been better to change the queries, (have two queries, one with type='thread' and a limit of 250, and one with type='reply' and a limit of 500), but doing it this way makes it a little easier to explain. Once you understand what is going on, maybe you can change the code yourself to use more efficient queries, (see the sketch below).
Why did I choose 250 threads and 500 posts? Because, on my server, those limits seemed to be the safest way to prevent timeouts.
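For reference, the more efficient version would look something like this, (again just a sketch, assuming the standard vBulletin 3.x moderation table; you would then loop over each result set separately):

$moderated_threads = $db->query_read("SELECT * FROM " . TABLE_PREFIX . "moderation WHERE type = 'thread' LIMIT 250");
$moderated_posts = $db->query_read("SELECT * FROM " . TABLE_PREFIX . "moderation WHERE type = 'reply' LIMIT 500");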
If you have version 4.x of vBulletin, I would be quite curious to hear how it works for you. I don't have that code, so I cannot say for sure if the same issue happens, (but I would hope that they fixed it).
It is already 2012 in some parts of the world and it is slowly getting to that time where we are...
2011 was full of changes for Journalhome, I have, (and in fact I am still busy doing it), re-written a lot of the engine. I know it was frustrating for some of you as everything did not always work as expected... I am aware that there are still some issues with some templates.
2012 should bring some visual changes, new templates, (à la WordPress), and finally a new front page, (this one is 7 years old I think)!
So, I hope you have a wonderful and prosperous new year, filled with all your wishes. If you are driving, be safe on the roads!
Today, (November the 16th), the US Congress is discussing some form of Internet censorship bill.
Basically they want to be able to monitor and censor the content of the Internet, (to prevent piracy and copyright infringements). This might sound like a noble idea, but in reality, anything you blog might be scrutinised and censored under some piracy cover.
Our servers are not located in the US so we would fall outside the scope of any such bill, but we still strongly believe that no government should be able to censor the Internet! You can read more about it here.
I have now stopped the promotion to take stock of the results, (and because it was costing me a lot of money). There have been some sign-ups directly from Facebook, (I can track those), but nothing worth the amount of money spent.
I ended up spending about 50 pounds, (GBP), which was my budget for this experiment. Each 'click', (that is, a user who saw the advert and came to the sign-up page), cost between £0.20 and £0.80, (I am not entirely certain why the cost varied so much, I suspect it has to do with the time of day). So, I ended up with about 70 'visitors' and out of those, I got 4 or 5 sign-ups.
As you can well imagine this is very expensive for such a low return.
So what next?
1. I'll wait a little bit and try Google AdWords.
2. I'll probably wait for the Google+ plugin to be finished...
3. Probably start some kind of competition, in other words, award the 50 pounds to the best post, (per month or per week).
4. Wait and see if someone else has another suggestion.
*I like number 3, but I want to give Google a fair chance to see what comes of it.
I have always been curious about how to advertise Journalhome. The way I see it, there are only two realistic options: Google AdWords or Facebook. I decided to go with Facebook because it is easier to choose the location, gender and user preferences.
I have a very modest budget, so we will see what kind of results we get; I am aiming for 1 'real' new blogger a day.
I will update this entry from time to time to give a bit of feedback. The ad goes 'live' in about 2 or 3 hours, (when the US wakes up), and it is already live in Europe.
I got rid of the 'Total members that have visited us in the last 24 hours: ...' counter; it was ugly, it was slowing the db, and all in all it was not telling us anything very interesting. You are welcome to go to the stats page to have a look at who has/has not visited us.
I upgraded the default editor to version 3.4.4 and fixed a bug that was causing the spell checker to not work properly, (for the record, we use Google spell check).
I re-wrote the messages section, just to make sure it looks a little better.
I also updated the 'lifeless', 'forced energy', and 'exotic' templates, because they were ugly... I know there are others, but I am working on them slowly.
I have been running Bad Behavior for a while now and I must admit I have never really noticed a major drop in the number of spam registrations/entries. This is primarily because it is blindingly easy for a spammer to fix whatever issue the plugin might find with their headers.
It does use some third-party websites, like Project Honeypot, but those have the downside of slowing this website down at every page load.
I need to refine it to only check certain pages, (like comments and signup), while having it turned off for the others. I mean, realistically, I couldn't care less if a spammer is reading the front page, (as long as they don't do anything else naughty).
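Something along these lines should do the trick. This is just a sketch, (it assumes a generic PHP integration where the plugin is included at the top of every page; the paths and the include file name are placeholders, not our exact setup):

// Only run Bad Behavior on the pages spammers actually target;
// everything else skips the (slow) third-party lookups.
$protected_pages = array('/comment.php', '/signup.php');
if (in_array($_SERVER['SCRIPT_NAME'], $protected_pages))
{
	require_once('./bad-behavior/bad-behavior-generic.php');
}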
So, in the meantime, I have turned it off for a while; this will help me to check 2 things.
- First, how much faster the site is, (if at all!), and how much more spam I get. We, (the bots and I :)), are currently disabling no less than 500 to 1000 accounts a day! So if the numbers get much higher than that, I will know that Bad Behavior was helping in some way.
- Secondly, I have metrics to check the speed of the site, so I can compare how much faster it will be.
I will leave it like that for a week or so and then compare everything again...
If any of you have some kind of experience with Bad Behavior, (or if you know how to fine-tune it to work faster/smoother/better), please let me know.
Over the past few weeks/months I have been adding various third-party plugins to allow users to connect to Journalhome using their favourite social network account, (to date I have added Facebook, Twitter and LinkedIn).
But one of the problems I have been facing is how to debug/test my Facebook app. When you create an app, (on your Facebook developer account), it is expected to redirect to your live server. The obvious solution is to create a second, 'test', app.
So let's say your test server is http://127.0.0.1/, (or whatever IP address/website you use); create a new app in Facebook using that address. Make sure that all the values are as close as possible to your 'live' app.
I have been having some issues with http://localhost/; I think this is because Facebook tries to parse the URL and it doesn't like it, so it might be better to use the IP address. It did work sometimes, but not always, and when you are debugging you don't want things to work only 'sometimes'.
Pay special attention to the 'Web Site' section, make sure that the site URL points to your test website, (even if it is local and only accessible to you). Facebook uses this address to allow only applications from that address to make 'calls'.
Note your new Application ID and Application secret, they are obviously not the same as your live application details.
And that's all you need to do with Facebook really, (you can also use a port number, like http://www.example.com:6141/, if you need to).
You now need to create some test accounts. To do that, download the Facebook test user manager; it will allow you to create test accounts, (and delete them, edit them and add friends). Just copy the files somewhere on your test server and run it from there, (like http://127.0.0.1/testuser/).
That's all you need really, just make sure that your test server is pointing to the 'test' version of your application and Facebook will return you back to your test server.
NB: This might be stating the obvious, but make sure that you replace the App ID and App secret when you move to the live server. In my case I have 'live' and 'test' config files to ensure that I don't need to bother about changing the app ID and secret by hand.
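Something like this, (a minimal sketch; the constant names are made up for the example):

// config.php on the test server, (the live server has its own
// copy with the live app's details, so nothing in the code
// changes when moving between the two).
define('FB_APP_ID', 'your-test-app-id');
define('FB_APP_SECRET', 'your-test-app-secret');
define('FB_SITE_URL', 'http://127.0.0.1/');

// The rest of the code then just does:
require_once('./config.php');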
That's it, let me know if you have an easier way of testing Facebook apps. I am also very interested to know if there are other tools to help you test your Facebook app.
Fixed an issue with the automatic ping functionality, (if the site had no title then you could not ping it properly).
Your blogs are now automatically pinging more services, (Google Blog Search in various languages, but not all of them). Some services have been dropped, either because they are too slow/offer a poor service or simply because they want to charge for a service that is mostly free. I will update the list as time goes by, but I will try and make sure we have at least 5 or 10 services being pinged at all times...
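For those wondering what a 'ping' actually is: it is just a small XML-RPC request sent to each service. A simplified sketch, (not our exact code), would look like this:

// Send a weblogUpdates.ping to a service such as
// http://blogsearch.google.com/ping/RPC2
function ping_service($service_url, $blog_name, $blog_url)
{
	$request = '<?xml version="1.0"?>'
		. '<methodCall><methodName>weblogUpdates.ping</methodName>'
		. '<params>'
		. '<param><value><string>' . htmlspecialchars($blog_name) . '</string></value></param>'
		. '<param><value><string>' . htmlspecialchars($blog_url) . '</string></value></param>'
		. '</params></methodCall>';

	$context = stream_context_create(array('http' => array(
		'method' => 'POST',
		'header' => "Content-Type: text/xml\r\n",
		'content' => $request,
	)));
	return file_get_contents($service_url, false, $context);
}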
Fixed some minor display issues if your blog had a lot of entries, (the total number of entries was wrong).
Blogs with looooots of entries/posts every month now have a paginated archive. This should not affect most of you, but some blogs have more than 100 posts per month, (yes they do!), so the archive now paginates for them.
Tags: some clever spark thought it would be funny to have a massive number of tags in one of their posts. So now I have had to limit the number of tags to 100 per post, and the length of each tag is also limited, (for the life of me I cannot see what they were trying to do).
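The check itself is nothing fancy, something like this, (a sketch; the 50-character cap here is just an example, not the exact value used):

// Cap the number of tags per post and the length of each tag.
$tags = array_slice($tags, 0, 100);
foreach ($tags as $key => $tag)
{
	$tags[$key] = substr(trim($tag), 0, 50);
}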
That's all for now, please let me know if you come across any issue that I might have missed.
After the spam attack we had last week things are slowly getting back to normal.
It has nothing to do with the spammers giving us a break; what really happened is that I changed the blog signup page, (to be a bit more user friendly), and the various spam bots that have been flooding us are now hitting the wrong page(s). I fear that once those bots are updated we will be hit with the same amount of spam. But I will deal with it as it happens, (I am watching the events very closely).
I know I have said it many times before, but I am very happy with the way our own anti-spam bot is working; it is catching a huge amount of spam, a vast number of accounts are flagged as 'untrustworthy' by the system, and the spammers are not getting the search engine links they so desperately want.
I am manually looking at all the accounts created, but over 7000 accounts were created, (and I have automatically disabled over 3000), so I still have to remove a lot of them.
I am also very happy with the way the system itself handled the attack, the front page went down once or twice, (under the load), but I have learned from it and we should now be able to handle further such attacks. Those of you that have been around for a long time will know how badly the server used to behave under higher than normal loads.
All this has taken me away from the other improvements I have been working on, so, sadly, there isn't much else new happening on the development front.
Let me know if you have any issues you want me to look at...
Just a quick note to let you know that we have been under a larger than usual spam attack over the last 2-3 days.
The spam bot is still doing its job, (and doing it rather well, I think), but it is taking a bit longer than usual, (only because of the volume of spam).
So please bear with me, (and the bot), for a few days; I am sure they will move on as soon as they realise that their garbage is not allowed to remain here. I will probably remove all 'new' weblogs from the front page, just to make sure that they won't pollute the site any more than they do already.
Also remember to report any weblog that you feel is not appropriate or that should be removed. A vast number of older weblogs are not flagged as spam simply because the spam bot is still running in 'cautious' mode, (as some of you are all too aware).
I have added one more social network to allow you to connect to JournalHome, LinkedIn.
Simply click on the button on the top banner to link your LinkedIn account to JournalHome, as usual you will be asked to allow JournalHome to connect to your profile. Once this is done you will be able to login to Journalhome using your LinkedIn credentials.
If you are a new user you can also create a new account here using your LinkedIn user-name and password, simply click on the button and follow the various prompts.
So you can now login to JournalHome with Facebook, Twitter and LinkedIn.
As usual, if you come across any issues just drop me a message and I will look into it right away.
Since the 16th of June I have seen a big jump in our traffic. This is mainly because Google has made some changes in the way it returns search results; we will never know exactly how they work things out, but basically they punish low-quality websites, or sites that offer duplicate content. It seems that they have come up with a way of knowing what is a good or bad website.
In our little corner of the web, what this means is that if you copy content into your blog from other sources you had better add about 50% 'new', quality content, otherwise Google will punish you, (by that I mean they will make your blog fairly hard, if not impossible, to find).
For JournalHome as a whole, this is not really bad news as most of the content is unique, and I go around and remove all spam/duplicate content anyway, (thanks to those of you who report sites to me from time to time).
Below is a graph of Journalhome traffic over the past 6 months.
The yellow line is the 1st of January when I started removing all the spam accounts and doing some general clean-up. The red line is around the time of the first Google 'Panda', we saw a small jump in traffic but that was short lived, but there was still an increase in traffic. The green line is from the 16th of June when Google Panda 2.2 was released.
You can see a general increase in traffic. I will try and give an update in the next few months so we can see if we were able to maintain a positive trend.
How can you improve traffic to your blog?
Content, content, content. Google accounts for a large part of our traffic, (Facebook, Twitter and Yahoo, together, are not far behind), so we cannot ignore their rules. And their most important rules are:
- Update your content on a regular basis.
- Write quality content; it does not matter what your favourite topic is, as long as it is quality, people will come to your blog.
- Don't copy from other websites! Ever!
If you follow those three points, traffic to your site should follow quickly...
When you want to get the latest Adobe Reader, (Adobe Reader X), it insists on installing some download manager, supposedly to make you think that your browser has been incapable of downloading files until now.
But I suspect that in reality Adobe wants to push some third-party software onto your machine that you never asked for in the first place, (they are pushing the McAfee antivirus as well as the Google Chrome browser quite hard).
I created a new plugin for the Google +1 button; it is basically similar to Facebook's 'Like' button. Watch the video below explaining a bit more about the new button.
Simply go to your plugin page and activate the Google +1 button, (at the bottom of the list). The jury is still out about the real use of such a button, but, what do I know... I get the feeling that Google has not quite finished adding new options, so I will be changing the plugin as need be.
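For the curious, what the plugin injects into your pages is essentially the standard snippet Google published for the button, roughly this, (not our exact code):

// Output Google's standard +1 markup: load their script
// and place the button tag where it should appear.
echo '<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>';
echo '<g:plusone></g:plusone>';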
As always, let me know if you find something wrong.
I have re-enabled Bad Behavior, (and updated it to the latest version in the 2.0 branch), but I don't think I will keep it on permanently. I think it does have a lot of false positives.
So I think the best option is to leave it on most of the time, but not always. I will just need to monitor when our friendly spammers are visiting :). I am looking at their 'new' version, 2.1.x, to see if it will make any big difference for JournalHome.
On the spamming front we are up to about a 90% deletion rate; almost every single new account is a spam account, but I am not too worried about it, (for now), as I will do proper advertising once everything has been upgraded to the new system.
On that note, the migration is taking a lot longer than expected, this is, (as I have said before), mostly because I am trying to rebuild the engine while the car is still running :). But I am making some good progress, the new templates that I had promised are still some way away, but I am still working on it.