Tim Yang’s Weblog


Posts filed in ‘Online’


Preventing inline images from comment posters

Aug 2005
30

Chris Josephes writes on OReillyNet about an interesting method for detecting whether the server behind an image one of your visitors is hotlinking is quietly serving a substituted image. Hotlinking is a common risk in forums and blog comments, where there is no preset control over what gets posted.

Chris says:

If any site user makes a posting that inlines images from a third party server, the editing software should retrieve the image twice using the HTTP HEAD method. For the first retrieval, don’t pass a Referer header. For the second retrieval, set a Referer header that would reference the full URL of the page that would eventually load the image. For both requests, the HTTP server headers Content-Length and ETag should return identical values. If they don’t, that means the web server is sending out different files. Make sure the comment poster is aware of this, and give them the opportunity to correct the problem.

It would be really nice if someone implemented this method as a plugin for blog CMSes.
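The double-HEAD check Chris describes can be sketched in a few lines. This is a minimal illustration using only the Python standard library; the user-agent string and the injectable `fingerprint` hook are my own additions (the hook makes the comparison testable without a live server), not part of Chris's write-up.

```python
from urllib.request import Request, urlopen

def _head_fingerprint(url, referer=None, timeout=10):
    """HEAD the URL and return its (Content-Length, ETag) pair."""
    headers = {"User-Agent": "comment-checker/0.1"}  # illustrative UA
    if referer:
        headers["Referer"] = referer
    req = Request(url, headers=headers, method="HEAD")
    with urlopen(req, timeout=timeout) as resp:
        return (resp.headers.get("Content-Length"),
                resp.headers.get("ETag"))

def image_is_substituted(image_url, page_url, fingerprint=_head_fingerprint):
    """True if the server appears to send a different file when a
    Referer header is set, per the two-request check described above."""
    without_referer = fingerprint(image_url)
    with_referer = fingerprint(image_url, referer=page_url)
    return without_referer != with_referer
```

A CMS plugin would run `image_is_substituted()` against each third-party image in a submitted comment and warn the poster if it returns True.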


Microformats aren’t relevant

Aug 2005
27

Microformats are sets of XHTML conventions for presenting various bits of metadata on websites. They’re usually used for trivial sets of metadata (hence the ‘micro’ moniker). Microformats include XFN (XHTML Friends Network) and the rel=tag attribute for links.

The microformats.org site describes them as “designed for humans first and machines second … a set of simple, open data formats built upon existing and widely adopted standards.” While I applaud the effort to give developers a standard, I’m not sure microformats are terribly relevant. I’d argue that humans don’t see the difference between one way of marking up and another, even if one is standardised, so it makes no difference to the general population. You don’t have to look further than XFN for evidence of irrelevance. It was proposed two years ago by Matt Mullenweg and has yet to become anything more than an interesting idea. Microformats are certainly not “widely adopted”. Having said that, there is one popular microformat: Technorati’s rel=tag attribute. Unfortunately, the only one who seems to be using it is … yup, Technorati.
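For the “machines second” half of the pitch, here is a toy sketch of how a consumer like Technorati’s crawler might pick rel=tag links out of a page. The markup string is an invented example, and the convention that the tag name is the last path segment is how rel=tag is commonly used; this is an illustration, not any real crawler’s code.

```python
from html.parser import HTMLParser

class TagLinkParser(HTMLParser):
    """Collect tag names from <a rel="tag"> links, rel=tag style."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "tag" in a.get("rel", "").split():
            # By convention the tag name is the last path segment of the href.
            self.tags.append(a.get("href", "").rstrip("/").rsplit("/", 1)[-1])

# Invented sample markup, the kind of thing a blog post footer might contain.
sample = ('<p>Filed under <a href="http://technorati.com/tag/microformats" '
          'rel="tag">microformats</a></p>')
parser = TagLinkParser()
parser.feed(sample)
print(parser.tags)  # ['microformats']
```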


tagifieds.com - an open-ended bulletin board with tags

Aug 2005
20

Written using Ruby on Rails, Tagifieds.com could quite possibly be a work of genius. I haven’t decided yet: it’s still too new, and I haven’t seen its full potential. In the “About” section, the creator insists that “It’s great for online classifieds, recipes, reviews, rants, scrapbooks, and useful information of all kinds.” Yes, it can be used for all those things, but I’m not sure it is the perfect format for any of them. It is a bulletin board, but in the sense of a corkboard with all its chaos, not the online kind usually known as a forum. All posts appear on the front page, and without categories it may be hard to browse for things; you have to use the search function instead.


Akamai News usage index

Aug 2005
19

Akamai is tracking the number of people consuming news from sites like CNN and the BBC around the world, and it’s publishing the numbers on its news usage zeitgeist page. In the last 24 hours, six out of seven news-site visitors got their news from American sites.


Online social tagging is not about sharing

Aug 2005
19

A landmark study from HP’s research department finds that social bookmarking is less about sharing than we thought. A large portion of the del.icio.us tags the study group used to describe documents on the web were self-referencing (e.g. ‘mycomments’) or for self-organising purposes (e.g. ‘toread’). For example, in the last 15 days, over 30,000 links had no tags at all, suggesting they were posted less for sharing than for self-tracking. This was not a study of motives, though, but of how links were used and how they were described. The next step would probably be to distribute a qualitative survey to del.icio.us users, asking for insights into their motives for tagging.


Why should a software company GPL its code

Aug 2005
17

Here’s a very good discussion on Slashdot revolving around business reasons for a company to GPL its code. Some of those include:

  1. Other people can fix your bugs and security holes for you
  2. No need to pay for beta testers
  3. Free development of new features (some of which you might never have thought of yourselves) if you can get a development community started.
  4. Free positive P.R. for your company, especially if things really take off.
  5. Free advertising for your company as well if you brand the package with your company logo and colours by default.

The points raised in this comment, http://ask.slashdot.org/comments.pl?sid=159218&cid=13334726, are especially insightful.


Why NYT and Yahoo News’ need to track their stories is losing them readers

Aug 2005
08

Have you ever wondered why so few people bookmark New York Times stories on del.icio.us? They hardly ever appear on the del.icio.us popular list, not even the recent Karl Rove stories. There is a simple reason. Look at the end of the url of any NYT story: there is a unique session id stuck on it (after .html). It serves no purpose except to let NYT track which pages you visit on nyt.com. But not everyone has the patience or the know-how to strip the session id before posting the article to del.icio.us so that only the real url remains.

That means the same article may get posted to del.icio.us hundreds or thousands of times, but because the url is different every time, del.icio.us assumes they are all unique web pages because del.icio.us tracks them by url. As a result, it will appear as if the article has been unpopular.

This is not to say that NY Times articles don’t get passed around. They do. But only when some really popular site like Kottke or Techdirt has linked to them. Then the url they used (with the unique id) will get posted and re-posted. The unique id is great for tracking the popularity of stories and the flow of traffic on nyt.com, but the stories would be even more popular if people knew they were popular. That’s the whole point behind social bookmarking sites like Del.icio.us. To share articles you liked and to read articles recommended by others.

But NYT is not the only one making this mistake. Yahoo News also adds an unnecessary user session id to the end of its urls, so no matter how many times its stories get posted to Fark.com or Metafilter.com, they never make the popular list either. It’s unnecessary because it would be just as simple to embed a user id into the pages as they are dynamically generated. And although it isn’t really their problem, the admins of del.icio.us could take the initiative to strip session ids from urls. Why? Because superfluous data is being added by their users to the database, and it is skewing the popularity of articles on the site. So there are a few possible solutions, but it doesn’t look like anyone is going to make the first move. It’s up to you: the next time you post a NYT or Yahoo News article to del.icio.us, please remember to strip off the unique session id first.
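The clean-up step itself is simple enough to sketch. The exact shape of NYT’s and Yahoo’s session suffixes varies, so the patterns below (a segment tacked on after .html, plus session-looking query parameters like sid or ex/en/ei) are illustrative assumptions, not a complete list.

```python
import re
from urllib.parse import urlsplit, urlunsplit

def strip_session_id(url):
    """Return the canonical article url with tracking junk removed:
    anything after the .html page name, and session-style query params."""
    scheme, netloc, path, query, _frag = urlsplit(url)
    # NYT-style: extra segment appended after the .html page name.
    m = re.match(r"(.*?\.html)", path)
    if m:
        path = m.group(1)
    # Drop query parameters that look like per-user session ids
    # (assumed names; real trackers may use others).
    params = [p for p in query.split("&")
              if p and not re.match(r"(?i)(sid|session|ex|en|ei)=", p)]
    return urlunsplit((scheme, netloc, path, "&".join(params), ""))

print(strip_session_id(
    "http://www.nytimes.com/2005/08/story.html?ex=1280030400&en=abc123"))
# http://www.nytimes.com/2005/08/story.html
```

A bookmarklet or a del.icio.us-side filter doing exactly this would solve the duplicate-url problem for everyone at once.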


Marketers should communicate more with server managers to prevent service failures

Aug 2005
05

A study called the Internet Campaign Effectiveness Study in the UK found that a lack of communication between marketing and IT is one of the key reasons why serious site failures occur. Site failures might mean slowing down of responses or errors in processing due to timeouts because of heavy traffic.

The study puts the blame heavily on marketers, suggesting that a lack of planning and goal-setting, coupled with a lack of understanding of the processing power required to serve a block of visitors, is creating unnecessary stress on existing capacity. I think this is quite unfair, because it is notoriously difficult to gauge with any accuracy how many people will visit a site during or after a marketing campaign. Yes, a seasoned marketer using conventional media can estimate the degree of generated interest based on the reach and persuasiveness of the message. But even if he supplied those figures to a server manager, the manager still wouldn’t be able to tell how much processing power would be required to serve the expected visitors. For instance, can anyone say how many requests a thousand online visitors will make in one hour? And what kind of requests would those be? Database requests or static page requests?
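To see why the question is so hard, here is the back-of-envelope estimate a server manager would have to make. Every number in it (pages per visit, requests per page, the burstiness factor) is an assumption invented for illustration, and changing any of them swings the answer wildly, which is exactly the problem.

```python
def peak_requests_per_second(visitors_per_hour,
                             pages_per_visit=5,      # assumed browsing depth
                             requests_per_page=12,   # html + images + css, assumed
                             peak_factor=3.0):       # traffic is bursty, not uniform
    """Rough peak request rate implied by a campaign traffic forecast."""
    total_requests = visitors_per_hour * pages_per_visit * requests_per_page
    average_per_second = total_requests / 3600
    return average_per_second * peak_factor

# A thousand visitors an hour under these assumptions:
print(round(peak_requests_per_second(1000), 1))  # 50.0
```

Double the assumed pages per visit or the peak factor and the required capacity doubles too, so a marketer’s headline visitor forecast alone tells the server manager very little.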

I think the key problem occurs when marketers engage in a heavy burst campaign and don’t tell server managers to have extra power on standby. In a burst campaign, a great deal of effort and money is spent to create an attention explosion toward a website. It would therefore be unreasonable to expect zero server failures with the resulting burst response. Adequate contingencies can always be prepared, but only if the manager is expecting to have to make them.


Retailing: What’s working with online shopping

Jul 2005
29

The McKinsey Quarterly has a good article on tactics for online retailing. It recommends a strategy of complementary online and offline tactics. Giving the examples of LL Bean and Ross-Simons, it calls the strategy a “triple play”: the online store drives traffic to the bricks-and-mortar store, the bricks-and-mortar store acquires walk-in customers, and a direct-mail catalog offers traditional from-home purchasing from hard copy. Whichever way the market likes to shop, with the convenience of home or the human contact of offline, this strategy captures it.

The McKinsey Quarterly: Retailing: What’s working online


WashingtonPost.com gives its readers customised news by zipcode

Jul 2005
18

The Washington Post is giving its American online readers the choice of viewing its content based on their zipcode. It will present readers with more news from their area, or else show a version of the paper that’s more international. I think it’s great that newspapers are taking advantage of the internet’s capacity for customisation to make their websites more relevant to different market segments, and that they’re realising the market is highly segmented and the internet can help them target it cost-efficiently. Ecommerce sites like Ebay and Craigslist have been doing this for years, but better late than never.

Post Site Splits Into Local, Global Pages
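The mechanism behind this kind of split is trivially simple, which is part of why it is surprising more papers don’t do it. Here is a toy sketch; the zipcode prefixes and edition names are invented for illustration and have nothing to do with the Post’s actual implementation.

```python
# Invented 3-digit zipcode prefixes standing in for a paper's home metro area.
LOCAL_PREFIXES = {"200", "201", "208", "209", "220", "221"}

def pick_edition(zipcode):
    """Return the edition to serve: 'local' for readers in the home
    metro area, 'national' for everyone else (or no zipcode given)."""
    if zipcode and zipcode[:3] in LOCAL_PREFIXES:
        return "local"
    return "national"

print(pick_edition("20001"))  # local
print(pick_edition("94110"))  # national
```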

