Tim Yang’s Weblog


Author Archive


Howto: Tell-a-friend script for Wordpress

1 Aug 2005

I had a tell-a-friend script on my old blog, but Wordpress doesn’t have anything like that, not even in plugin form. So I wrote a quick one. This script is extremely simple: it works, but there are no checks, validations or contingencies built into it. So don’t kill me, I’m still on page 28 of PHP for Dummies (“Concatenating is not a Mexican dance”). Caveat emptor, you have been warned.

The script is meant for use with single post pages. It does not have to be in the Loop, so it can be placed in the sidebar of single.php. If you put it on index.php (or any other page), it will simply send the URL and page title of the first post you ever made (not good). If I were informing Madame X of this post, the message the script sends out would be formatted like this (but you can change it where appropriate to suit your taste):

Subject: Have a look at this blog page, Madame X

Message: Hi, Madame X, I found this interesting post on Tim Yang’s Geek Blog called “Howto: Tell-a-friend script for Wordpress” that I thought you would also find interesting. It’s at http://timyang.com/2005/07/howto-tell-a-friend-script-for-wordpress.

Signed, Tim Yang
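
The instructions and the script itself are behind the “read more” link below, but purely as an illustration, here is a minimal sketch of how the mail-out step could be put together in PHP. It is not the actual script: the form field names (friend_name, friend_email), the hard-coded title and URL, and the bare mail() call are assumptions made for the sketch.

<?php
// Minimal sketch only. Assumes a small form on the single post page submits
// the friend's name and address as friend_name and friend_email. The title
// and URL are hard-coded here; the real script would pull them from the
// single post page itself (e.g. via WordPress template tags).
$friend_name  = trim($_POST['friend_name']);
$friend_email = trim($_POST['friend_email']);
$post_title   = 'Howto: Tell-a-friend script for Wordpress';
$post_url     = 'http://timyang.com/2005/07/howto-tell-a-friend-script-for-wordpress';

$subject = "Have a look at this blog page, $friend_name";
$message = "Hi, $friend_name, I found this interesting post on Tim Yang's Geek Blog "
         . "called \"$post_title\" that I thought you would also find interesting. "
         . "It's at $post_url.\n\nSigned, Tim Yang";

// Like the original, there are no checks or validations; a real version should
// at least verify that $friend_email looks like an e-mail address first.
mail($friend_email, $subject, $message);
?>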

Here are the instructions.
Read the rest of this entry »


TalkDigger.com: Compares the number of inbound links to your site from nine search engines

31 Jul 2005

Coming right on the back of Mary Hodder’s blog search comparison table is Talkdigger.com, which is an easier way to compare the results of nine search engines (both conventional and blog search engines are included). But you have to type very specific URLs: the results for your URL with “www.” and without “www.” will be very different.

Talk Digger: Check who is linking to you

UPDATE: There is now also Uptimebot.com which does the same thing. But they have a better explanation of the results and they include Alexa.com in the search too.


RSS2PDF.com - Read news from feeds on a printable document

31 Jul 2005

I think the whole point of converting a feed to a PDF file is to be able to print it out for reading in hard copy. But the PDF output generated by this free online tool looks so plain. I think the next step for RSS2PDF.com is to allow some customisation of the output — for example, changing the font or the font size at the very least, as well as a choice of layouts.

RSS 2 PDF - Online RSS or Atom Newsfeed to PDF Generator


Howto: Customising the description metatag to the title of each post in Wordpress

31 Jul 2005

Instead of having a set of standard metatags across all the pages of my blog, I wanted to customise the description metatag of each post to the title of the post, and the keywords metatag to the categories of each post. I thought I saw a plugin that did this, but when I looked again, I couldn’t find it. Because I’m reading up on Wordpress Template Tags right now, it made sense for me to try to do something different with them. So out comes the PHP for Dummies manual. I succeeded (partially) in my goal: if you check the description metatag of each post, they are all customised, while the homepage has the standard blog name and description that’s set in the admin interface. But I failed on the categories-as-keywords part because the category template tags don’t work outside of the Loop.

Here’s the code I used. Just copy the description metatag part and paste it between your head tags to achieve the result.

	<meta name="description" content="<?php if ( is_single() ) {
		single_post_title('', true);
	} else {
		bloginfo('name'); echo " - "; bloginfo('description');
	}
	?>" />

Howto: Generating a list of earlier posts in Wordpress

30 Jul 2005

I had this feature on my earlier weblog: I had ten posts on the home page and I wanted to show in the sidebar a list of the ten posts that pre-dated the ones on the homepage. You can see the unstyled list in the sidebar right now. I used the get_posts function that comes with Wordpress. Although the_date() is supposed to work only within the Loop, it works here, presumably because start_wp() sets up the post data the template tags need. Here’s the code I used.

<ul id="earlierposts">
<?php
$posts = get_posts('numberposts=10&offset=10&order=ASC');
foreach ($posts as $post) : start_wp();
?>
<?php
echo "<li><a href="";
the_permalink(); echo "">";
the_title('', '', true);
the_date('j M','</a> <em>','</em></li>');
?>
<?php
endforeach;
?>
</ul>

A Comparison of How Some Blog Aggregation and RSS Search Tools Work

30 Jul 2005

Mary Hodder of Napsterization.org has produced an analysis of five popular blog content search services (Bloglines, Feedster, Technorati, Blogpulse, Pubsub). She examines what each of them searches, how they search, what sort of links they count and how long they keep those links counted. It gives us some idea of why the results from each of the search engines differ so greatly from the others. For example, Bloglines keeps all data on inbound links from Day One, whereas Technorati keeps link data only as long as it is on the front page of a blog, so its link count is much lower but much fresher.

Hodder has put her research into a table in a PDF file for easy reference. I’m sure many people will be using her table to produce more insights into the way each of these search engines works. I hope she’ll include Icerocket.com in that table when it becomes more popular.


Feed Digest : Mix, convert, and syndicate RSS and Atom feeds

30 Jul 2005

Feeddigest is finally released and it has all the features I’d been waiting for. It’s like RSSmix and Bloglines in one. You can combine feeds and there’s an interface where I can see and control all my mixes. There’s even a built-in online feed reader. When the statistics feature comes into play, Feedburner will have a competitor.

http://feeddigest.com/


Retailing: What’s working with online shopping

29 Jul 2005

The McKinsey Quarterly has a good article on tactics for online retailing. The article recommends a strategy of complementary online and offline tactics. Giving the examples of LL Bean and Ross-Simons, it calls the strategy a “triple play”: the online store drives traffic to the bricks-and-mortar store, the bricks-and-mortar store acquires walk-in customers, and a direct mail catalog offers traditional from-home purchasing from a hard copy. The strategy targets shopping behaviours and captures the market whichever way it likes to shop, whether with the convenience of ordering from home or the human contact of buying offline.

The McKinsey Quarterly: Retailing: What’s working online


Howto: Fake a Google Page Rank 10

27 Jul 2005

SEO Black Hat has an interesting article on how to give any website an outward appearance of PR 10. It does work (SEO Black Hat points out a PR10 demo site), but it’s a superficial PR10 that only website visitors are able to see.

  1. Add a permanent (301) redirect with .htaccess or some other means on your website to a PR10 site (e.g. Google.com).
  2. Wait for a Google update to happen. After that, when your website visitors check the PR of your URL, they will see your website now “has” PR 10.
  3. When you have your “new PR”, add a condition to your permanent redirect so that only Googlebots get redirected while your website visitors are allowed into your site (a rough sketch follows after this list). Voila! People can now visit your new PR10 website while Googlebots are still sent away.
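
For illustration only, here is what the conditional redirect in step 3 might look like if it were done in PHP at the top of a page rather than in .htaccess. This is a sketch under that assumption, not the code from the SEO Black Hat article.

<?php
// Rough sketch of step 3: only Googlebot gets the permanent redirect to the
// PR10 site, while ordinary visitors fall through to the normal page.
// Googlebot identifies itself in the User-Agent request header.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (strpos(strtolower($ua), 'googlebot') !== false) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.google.com/');
    exit;
}
// Ordinary visitors continue to the page below as usual.
?>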

As the author of SEO Black Hat also says, Google will not index your site while you are permanently redirecting. So as far as Google is concerned, your PR is the same as before the redirect (and it will probably be lower after the update because it can’t index any of the content on your site). You cannot pass on your “new PR” with outbound links. But your site visitors will get fooled and they won’t know any better (unless they try to Google your site).

Note: this works for Google, but it might also work for Yahoo Search and other search engines like Ask Jeeves. I’m just not sure if it does, though theoretically it ought to.


Gary McKinnon: Scapegoat or public enemy?

24 Jul 2005

Cnet has an interesting story about Gary McKinnon, a London guy who managed to bypass the security of the Department of Defense as well as NASA computers. And he got caught for it. But his story takes an interesting turn after that.

He makes the distinction between bypassing and hacking because he insists he merely found a “blank system-level administrator password” and didn’t cause a breach to happen. Yet he is fighting an extradition order that accuses him of “hacking and causing damage to federal defense systems”. Causing damage is another point of contention. The U.S. Department of Justice has all but labelled him a terrorist and accuses him of willful destruction of irreplaceable information. But the way McKinnon tells it, the damage happened accidentally as he was leaving the system and was far less than what the Americans accuse him of. In other words, McKinnon is being demonised and made a patsy for things he had nothing to do with. There’s more information on the Free Gary McKinnon blog.

