Monday, December 17, 2007

Does Your Website Measure Up?

Website Series Introduction

This series of blog entries will relay some suggestions for improving website usability and performance. It is not intended for the technical web practitioner: instead, the intended audience is the business leader who relies upon the web to deliver sales messages to customers and prospects. The idea is to provide some information to help the business leader communicate with the techies and to evaluate the results achieved.

The visitor experience at your site is influenced by three primary factors: 1) the relevance of the site content; 2) the ease of finding what they are looking for; and 3) the speed and reliability of content delivery. Thus, content, UI and systems all influence the visitor experience. These three factors are what lead visitors to judge your site as good or mediocre, and they determine whether visitors stay, come back to buy, or absorb your sales messages. This series will help you understand what to look for to be sure your site is up to the task.

I plan to publish it in four parts over the next month or two:

Part 1: Is Your GUI Sticky?
The User Interface (UI) is what contributes most to the visitor experience. The objective is to make the experience as positive as possible so visitors will buy and/or come back to buy. Making your site simple to navigate, error free, and comfortable is essential to keeping visitors on the site and getting them back to buy again.

Part 2: Do Search Engines Like You?
Most web site owners find a majority of their traffic comes from the major search engines: Yahoo, Google, MSN, AOL, etc. In simple terms, a web site that is set up to appeal to the nuances of search engines is likely to get more traffic than a web site that is not. There are some basic search engine optimization (SEO) techniques that you want to be sure your developers include routinely in the pages they build for you.

Part 3: Is Your Technology Tuned?
In referring to underlying technology, I do not generally mean the hardware and software choices. Several hardware/software vendor solutions are up to the task of delivering web content. The critical issues are the configuration and the environment the systems operate within. These two factors must both be right to reliably perform the required tasks with speed and almost 100% uptime.

Part 4: Is Your Content King?
Visitors primarily find your web site based upon content (and some SEO techniques) and decide to stay and buy your product or proposition based upon content. Clear, professional copy, “community” information, and complete product information are essential to a truly effective web site.

Thursday, December 6, 2007

Image Data Cache

Earlier this year, I implemented an interesting architecture for serving images. The company had a web site with a searchable portal for real estate listings – about 1 million of them, with over 20 million associated images – and the listing inventory was growing as sluggish real estate sales swelled inventories nationwide. In addition, a new version of the portal displayed additional images in the search results. This situation left the company’s primary image disk storage array extremely stressed, especially while loading new images. The secondary site was offline because an upgrade to the disk appliance’s operating system proved too memory intensive for the appliance model in service. Resolving that required rolling back the software to an earlier version, then reloading the image files onto that appliance by transferring them from the primary site disk array, further stressing the primary hardware. The job was going to take about 10 days, and performance was suffering.

The idea that solved the problems was to cache images on servers at a third party location. Since the company had already signed a contract with a third party to host and serve video, trying the image cache approach for the company made sense. Within 5 business days, the company had implemented the changes needed and had started to cache images on servers at a third party location. Most of that time was spent waiting for the vendor to set up the systems on their end. The changes on the portal servers were relatively simple and implemented within a day or two.

The cache was pretty easy to implement. Code is changed to request the image from the third party URL instead of the traditional source. In this company’s situation, this was easy because most images are called with a database function that builds a URL for the call and executes the request for the image. So all the IT staff needed to do was revise and deploy that function.

Once the request is directed to the off-site cache, the image is served if it is cached. If not, the 404 error invokes software on the cache that fetches the image from the source. In this company’s situation, that source was actually multiple image servers behind an F5 load balancer. The cache vendor then serves the image and, in a concurrent process, copies it to their servers for future requests.
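The cache-miss behavior can be sketched in a few lines. This is an illustration of the pattern only: a real CDN node does the origin fetch and the local copy concurrently, and the "origin" here is a stand-in for the load-balanced image servers.

```python
# Sketch of the cache-miss logic described above: serve from cache if the
# image is present; on a miss (the "404" case), pull from the origin,
# serve the bytes, and keep a copy for future requests.

cache: dict = {}

def fetch_from_origin(path: str) -> bytes:
    # Stand-in for the pull from the load-balanced origin image servers.
    return b"image-bytes-for-" + path.encode()

def serve(path: str) -> bytes:
    if path in cache:
        return cache[path]          # cache hit: no origin traffic
    data = fetch_from_origin(path)  # cache miss: go back to the source
    cache[path] = data              # copy retained; next request is a hit
    return data
```

The important property is that the origin is touched only once per image, which is what relieved the stressed primary storage array.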

Images stay on the vendor’s servers for a time set by the customer. The portal company used a 2-week time to live (TTL). This means that if an image is not used for 2 weeks, it is automatically removed from the vendor’s cache. The vendor distributed the company’s images across 1000 servers, so the company had to add 1000 names to its DNS records.
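The 2-week TTL policy amounts to tracking each image’s last access and evicting anything idle longer than the limit. A toy sketch (the vendor’s real eviction machinery is unknown to me; this just illustrates the policy):

```python
import time

# Illustrative sketch of the 2-week time-to-live (TTL) policy described
# above: each cached image records its last access time, and anything
# untouched for longer than TTL seconds is removed.

TTL = 14 * 24 * 3600   # two weeks, in seconds

cache = {}             # path -> (data, last_access_time)

def get(path, now=None):
    now = time.time() if now is None else now
    entry = cache.get(path)
    if entry is None:
        return None                 # not cached (would trigger origin fetch)
    data, _ = entry
    cache[path] = (data, now)       # touch: any use resets the idle clock
    return data

def evict_idle(now=None):
    now = time.time() if now is None else now
    for path in [p for p, (_, t) in cache.items() if now - t > TTL]:
        del cache[path]
```

Evicting by idle time rather than age means a popular image can stay cached indefinitely, while the long tail of stale listing photos falls away on its own.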

The cost of this service is very reasonable compared to the cost of adding or upgrading disk appliances to meet the growth in images stored. For a few hundred dollars a month in vendor services, the company avoided a disk appliance upgrade that might have cost as much as a half million dollars.

For more information about this project, or to discuss other issues related to large scale image storage for web delivery, contact me at 757.675.5010 or email bobpride @ (Remove the spaces before and after the @ --- I have to put them there to fool the spammers’ bots.)

Friday, November 16, 2007

Is It Time To Chuck MS Office For A Less Expensive Alternative?

Some of us are old enough to remember when Lotus 123 was king of the spreadsheets, WordPerfect ruled the document space, Harvard Graphics was the preferred presentation tool, and our email client was whatever CompuServe provided. However, Microsoft gave us its Office suite with Excel, Word, PowerPoint, and Outlook; and now Microsoft completely dominates the space previously occupied by these former icons.

Many IT managers chafe at Microsoft’s dominance of the office “productivity suite” market. They especially balk at the chunk of their budget it takes to acquire new MS Office licenses (about $500 per seat), upgrade old ones (over $300 per seat), and maintain the Windows Server environments it takes to enable Office desktop use and Internet connectivity.

We now see many alternatives in the marketplace, ranging from no-cost open source products to lower-cost commercially marketed products. On the free end, we currently have Google Docs, among others. (Example: Lotus Symphony, which is currently free but will probably fall into the commercial category once it gains traction.) On the commercial product list, there are many offerings. The best known are Corel WordPerfect Office ($350 or so per seat), StarOffice (under $70), and iWork (under $80).

Are any of these as good as the MS Office standard users are accustomed to? Probably, except for power users who use macros, pivot tables, and collaborators who need to track changes made by one another. However, there are other considerations to conversion that can be daunting. What follows is a discussion of the most critical ones.

1. Conversion. Conversion of documents from one suite to another is not perfect. This is especially true of heavily formatted Word documents and presentation documents: both types may convert with odd results that require reworking the document in the new software. If you have a large library of standard documents to convert, this could be a big effort.

2. Macros. Macros from one suite will generally not translate to a new suite. The work required to build new macros is a factor to consider.

3. Platform. If you want to completely phase out Microsoft, it means changing the operating system for the desktops and servers to something else. That something else is probably going to be some flavor of Unix or Linux. Changing the operating systems on existing hardware is a huge undertaking and will result in user discontent as some favorite applications are not available on the new platform.

4. Email. Outlook has been Microsoft Office’s killer application. It is good and feature rich. In recent implementations it also has integration with other Microsoft applications that will not be found on competing platforms. IT managers may also have to evaluate and select email client software since many suites do not have one. Exchange users who are coordinating calendars, meetings and using other Exchange features also have considerable homework to do.

5. Training. While many competitors have tried to create a look and feel that Microsoft Office users will find familiar, the fact is the applications will have differences and users will need to be trained and supported in making the transition.

6. Collaboration. Compatibility with users in other organizations, “markup” sharing with other users, and other collaboration tools are generally weak in the alternative software. If you have these needs, tread carefully.

Based upon the factors listed above, the attraction of replacing Office dims substantially. The conversion costs are just prohibitive. Web-based applications like Google Docs offer the most painless transition but still have problems to overcome, like the perceived security of sensitive documents and Internet connection downtime.

In my opinion, converting from MS Office is still too costly for large organizations that have already invested in the Microsoft Office software and supporting technology. However, startups, smaller businesses facing upgrades, and new offices of larger organizations should seriously consider the open source alternatives. The initial savings can be substantial.

Monday, November 12, 2007

Michael Gelb has written a new book

Michael Gelb has written a new book. For those of us who know Gelb’s work, this is great news. Mr. Gelb looks at superstars in the arena of human achievement and extracts the essence of the behaviors that make them more creative, productive and successful than the rest of us. Gelb's published works convey keen insight - some might say genius – in a fashion that is readily understandable and usable by the reader.

Gelb’s most well-known books are based upon the creative genius of Leonardo da Vinci and other great thinkers and extract principles we can employ to empower ourselves with some of that genius.

Gelb’s new work, Innovate Like Edison, is about Thomas Edison’s success as an inventor and his amazing ability to get sweeping changes made to the fabric of American society to allow his inventions to be mainstreamed. His inventions of the phonograph, movie camera, and light bulb created the need for the voice and film recording industry and widespread electric power production and distribution. This process of invention and implementation is innovation.

Gelb ferrets the concepts and then describes the practical use of Edison’s Competencies of Innovation:

1. Solution-Centered Mindset
2. Kaleidoscopic Thinking
3. Full-Spectrum Engagement
4. Master Mind Collaboration
5. Super-Value Creation
He also includes lots of practical exercises and lists a plethora of resources for innovators. I won’t spoil the surprise by describing them, but be assured every entrepreneur and business innovator can benefit from understanding and mastering the application of these competencies.

The book is co-authored with Edison’s great-grandniece Sarah Miller Caldicott. This collaboration provided Gelb with access to the Miller/Edison family archives, and that adds to the richness of the product. It is available through major book retailers.

Thursday, November 1, 2007

The power of vision

…I quit cigarettes forever

In 1992, after smoking for the better part of three decades, I quit cigarettes forever. I had quit before --- countless times --- and it did not last. Why did it work in July of 1992? Well, I think there were several reasons: social pressures, rising cost of cigarettes, my mother’s death 15 years before from what was probably smoking-induced cancer, and increasing pressure from the younger people in my life to give it up. However, there was a difference this time which was purely psychological: I could see myself smoke free. I envisioned clothes that did not reek of cigarettes, a fresh smelling car, a day uninterrupted by cigarette breaks, and more cash for other things.

I truly believe that imagining myself as a non-smoker was the big difference. That vision of a smoke free Bob came true, mostly because I could see myself clearly as a non-smoker.

So what? Bob is some big hero because he quit smoking? No, it illustrates a powerful principle that can help us all achieve more.

From the football player who sacks the quarterback to the bride who loses ten pounds before she walks down the aisle, the vision of the goal stokes the fire of achievement and helps get the desired result.

Microsoft spent millions developing and promoting…

It can work in business, too. Microsoft spent millions developing and promoting the Microsoft Solutions Framework for success in managing complex technology projects. One of the key elements of the framework is a vision of the desired outcome(s). This vision assures everyone involved shares the same definition (vision) of the project’s success. Strategic plans routinely include descriptions of the desired future state of the enterprise. This kind of strategic vision for the future status provides both a road map for achievement and a measure of success.

The power of the vision can be effective on a more tactical level, too. I once ran a customer support organization that was struggling to keep up with customer service demands. Working with the staff, I created a vision of the best customer service available and began implementing changes to support it. The first step was to start telling our customers that we provided the best support available in our industry. We shared the vision with our customers and within a few months, the vision became reality. We were the best customer support organization in the industry and we had the awards and customer retention statistics to prove it.

Personalize it for the team members…

So the next time you want to achieve something, create a vision of what the world will be like when the goal is met. Personalize it for the team members, making sure you define what they will be doing, how they will feel, and other clear indicators the goal has been met. Set a time certain for the achievement and be very specific: for example, “I want to complete the vision blog by the close of business on November 2, 2007. When it’s finished, I will feel proud, relieved and happy. I will celebrate with a shot of the good brandy.”

If the goal involves complexity, you may need a plan to achieve it. An easy way to develop a plan is to work backwards from the goal date to the present, detailing the actions needed to achieve the goal. Think of it like building a house. If your planned move-in date is December 1, 2007, think of the last thing that has to happen before you can move in: get an occupancy permit. Before that, you need final inspections. Before that, landscaping must be completed. Before that, utilities must be turned on. By moving backwards from the goal, it is easier to identify the milestones and processes needed to achieve the desired end result.

If you try to plan forward from today, it is much harder...

Don’t be afraid to explore alternative sequences and different approaches. But always work backwards from the goal. If you try to plan forward from today, it is much harder and will take far longer to develop a plan. And the likelihood is that the forward moving plan development will produce an inferior result.

Motivate yourself and your team with colorful, carefully crafted visions of the end result and you will be rewarded with a better future.

Monday, October 22, 2007

...Google's problems in Brazil with inappropriate content...
The Wall Street Journal recently published a front-page article highlighting Google’s problems in Brazil with inappropriate content on its social networking site. Orkut, Google’s version of the social networking sites popularized by others, including NewsCorp’s offering, has, in an inexplicable scenario best explained by Malcolm Gladwell’s book The Tipping Point, become wildly popular with Brazilians. Social networking sites allow users to establish online communities (topics) and upload content both to the community areas on the servers and to personal web pages (which may also have community affinities).

Anyway, leaving the intricacies of the platform aside, the communities created range from those dedicated to praising moms to the darker side of human nature: racism, pedophilia, animal cruelty and so forth. The WSJ does its usual excellent job of explaining how Google got in hot water, so I won’t rehash that. What I want to point out is the root problem and a proposed solution.
The root problem is...
The root problem is that these social networking sites (like any internet site) allow users with access to upload anything. In the case of Google’s Orkut, all content uploaded ends up on Google’s servers in the USA, where free speech is an inalienable right. Not so in Brazil. So the real questions are:

Do content hosts have a legal right or obligation to edit content?
What laws apply?

Let’s deal with the legal right to edit content first. Google and other content hosts have what are called Terms of Service (TOS). In these, they specify the types of content that are not approved for uploading to their servers. If a user violates these terms, then Google has the right under the TOS to delete the content and remove the user account, because the user agreed to the TOS in order to receive an Orkut account.

Next, does Google have an obligation to remove content that violates its Orkut TOS? Well, according to the TOS, they do. However, they also state they are not liable for content that violates the TOS. That’s a topic for another day, so let’s assume for the moment they have an obligation but the timeliness of their obligatory response is fuzzy.

If a Brazilian citizen in Brazil uploads pro-Nazi comments or racial slurs...
Now we get to the meat of today’s topic. What laws apply? If a Brazilian citizen in Brazil uploads pro-Nazi comments or racial slurs and the uploaded content is stored on, and served from, computers in the US, whose laws apply if the content is subsequently served to another Brazilian citizen in Brazil? What if the same content is uploaded by a US citizen standing on US soil? By a Brazilian on US soil? By a US citizen in Brazil?

Logically, if the content is illegal in Brazil and was uploaded by a Brazilian who was actually in Brazil at the time, the Brazilian citizen seems to be the culprit and the Brazilian government within its rights to prosecute that citizen. Yet the content is stored and served from computers in the US, where such content may be perfectly legal. Because the content can be served to citizens in their homes and offices in most countries of the world, which have varying laws and standards, the permutations become seemingly infinite.

Unfortunately, when you offer a free service...
The solution is crafting adequate Terms of Service and enforcing them. If Google (and others) craft TOS agreements that reflect the laws of the countries of origin where they allow users to sign up, then Google and the others should enforce those agreements by screening content before it is available to the public. Unfortunately, when you offer a free service and users upload millions of items daily, enforcement comes at a high cost. Human review of the millions of images alone uploaded daily would be very expensive. When you throw in text posts, the number of editors needed would start to look like the cadre of TSA screeners.

You also have to consider that content which is legal, and allowed by the TOS, in one country may be illegal elsewhere, where users sign up under different content standards. So Google may want to display that content to users in one country but lock it away from users in another.

Some would say police nothing and allow everything...
So what is the solution? Some would say police nothing and allow everything; however, I think we can safely assume that society is not going to accept that alternative. So the solution has to lie with technology, of course. Specifically, the solution is applying image recognition software to screen out photos not meeting the TOS standards and using context-sensitive text screening software to catch the words of extremists and pedophiles who violate local law or the TOS. Since the sophistication of existing software is not up to the task, this is not an easy solution. However, it is probably easier than getting all the countries of the world to agree to uniform laws or to a global internet authority. Google, are you up to the technical challenge? BTW, the Homeland Security folks would love the software, too, so there may be a big paycheck in it for Google!
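To make the per-country screening idea concrete, here is a deliberately trivial sketch. The term lists and country codes are entirely hypothetical, and real moderation requires the context-sensitive analysis argued for above, far beyond keyword matching:

```python
# Illustrative sketch only: a per-country text screen of the kind
# discussed above. The blocked-term lists and country codes are
# hypothetical; real systems need context, not bare keyword matches.

BLOCKED_TERMS = {
    "br": {"banned-term-a", "banned-term-b"},  # hypothetical Brazil list
    "us": {"banned-term-b"},                   # hypothetical US list
}

def violates_tos(text: str, country: str) -> bool:
    """Return True if the text contains a term blocked in that country."""
    words = set(text.lower().split())
    return bool(words & BLOCKED_TERMS.get(country, set()))
```

Even this toy version shows the core point: the same upload can pass in one jurisdiction and fail in another, so the screen (and any display lock) has to be keyed by country.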

Sunday, October 14, 2007

Sell or Hold --- Managing Foreclosure Inventory

With these market conditions, how does a lender decide whether to sell or hold a property

Prestigious sources like the Wall Street Journal and Realtor Magazine have devoted a considerable number of column inches to the growth in residential real estate inventories. Some of that inventory growth has been due to foreclosures, especially in the last 24 months. One online foreclosure marketing company and the Mortgage Bankers Association are both citing year-over-year growth in foreclosures of 40-50% for 2006 over 2005 and forecast similar results in 2007 over 2006. What this really means is that a lot of homes are lender owned today. Because the market is still weak, these homes are not selling.

With these market conditions, how does a lender decide whether to sell or hold a property? By collecting some basic financial data on each property and knowing the cost of capital, a lender can pragmatically decide whether to accept an offer on a foreclosure property or not.

Since most foreclosures will be disposed of within a year or two...

The first figure to know is the lender’s cost of capital. This is the weighted average of the cost of borrowing and equity, expressed as an interest rate. Since most foreclosures will be disposed of within a year or two, it may be best to use the cost of incremental, or additional, capital. Most companies will find their current incremental cost of capital to be in the range of 10-12%, assuming a mix of preferred stock or bonds, debt, and common stock, and a company that is only moderately risky from the investor’s viewpoint.
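The weighted average works like this (the capital structure below is hypothetical, chosen only to land in the 10-12% range mentioned above):

```python
# Sketch of a weighted average cost of capital (WACC) calculation.
# The $60M of debt at 7% and $40M of equity at 15% are hypothetical
# figures for illustration, not from the article.

def wacc(components):
    """components: list of (market_value, rate) pairs for debt, equity, etc."""
    total = sum(value for value, _ in components)
    return sum(value * rate for value, rate in components) / total

rate = wacc([(60_000_000, 0.07), (40_000_000, 0.15)])
print(f"{rate:.1%}")  # → 10.2%
```

Each source of capital is weighted by its share of the total, so cheap debt pulls the blended rate down and more expensive equity pulls it up.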

The next numbers are the holding costs: taxes, insurance, lawn service, utilities, and depreciation. Yes, depreciation must be considered. Any inventory is subject to deterioration, and empty houses are particularly prone to vandalism, storm damage, and general neglect from being unoccupied. This “inventory carrying cost” may be somewhat subjective and vary widely from property to property, but a lender should be able to determine the ratio of such expenses to the loan value outstanding on foreclosed properties by market area.

Finally, the selling costs are needed: real estate commissions, transfer taxes, fixing-up expenses, and other costs borne by the seller. Other than fixing-up expenses, selling costs are not particularly relevant, since they will be incurred whenever the property sells. The only variable there is selling price, so one could argue the difference in selling costs between a full-value sale and a discounted sale should be considered. I chose to ignore them because I focus on the monthly costs of ownership.

Assume the lender has a home appraised at $240,000 with a defaulted balance of $200,000

Now let’s look at an example. Assume the lender has a home appraised at $240,000 with a defaulted balance of $200,000. Also assume there is no chance of recovery from the borrower, the houses in the market are taking an average of 14 months to sell (longer for foreclosures), and the house needs some work to sell at the appraised value.

The numbers:
Cost of capital: 12% or $2000 monthly
Taxes $2,400 or $200 per month
Insurance $800 or $67 per month
Utilities $120 per month
Lawn service average $100 per month (use summer/winter costs to be precise)
Lender’s inventory loss carrying cost expenses in this market 2% annually or $333 per month

The scenario:
This house has been on the market 3 months
The lender has an offer of $148,000 from a qualified buyer who can close in 30 days
Fixing up expenses will be $1800 to meet FHA lending standards

However, the message is it is better not to have and hold...

The monthly holding cost for this property is $2,820. It could be on the market another 11 or more months. That is $31,020 or more in holding costs. The loan balance, less this potential cost, is $168,980. Against that benchmark, and after allowing for the $1,800 of fixing-up expenses, the offer of $148,000 is too far below break even. However, an offer of about $170,800 (the $168,980 benchmark plus the fixing-up expenses) would be a break-even offer after discounting for potential holding costs, and offers approaching this figure should be seriously considered.
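The arithmetic above can be checked in a few lines, using the figures from the example:

```python
# Reproducing the article's sell-or-hold arithmetic for the example property.

monthly_costs = {
    "cost_of_capital": 2000,     # 12% of the $200,000 balance, per month
    "taxes": 200,
    "insurance": 67,
    "utilities": 120,
    "lawn_service": 100,
    "inventory_carrying": 333,   # 2% of the balance annually, per month
}
monthly_total = sum(monthly_costs.values())       # $2,820 per month

months_remaining = 11                             # expected further time on market
holding_cost = monthly_total * months_remaining   # $31,020

balance = 200_000
fix_up = 1_800
benchmark = balance - holding_cost                # $168,980
break_even_offer = benchmark + fix_up             # $170,780, about $170,800

print(monthly_total, holding_cost, benchmark, break_even_offer)
# → 2820 31020 168980 170780
```

Laying the numbers out this way also makes it easy to rerun the decision with a different expected time on market, which, as noted below, is the big unknown.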

One can argue that the time on the market before a sale is the big unknown. I certainly agree. If the lender feels this house will sell faster or slower than average, that should factor into the decision. However, the message is that it is better not to have and hold if the lender gets an offer as good as or better than the averages suggest, after considering the potential holding costs.

Sunday, October 7, 2007

Why did Rupert Murdoch buy Dow Jones Corp?

...let the speculation begin!
Let me start by saying that I have no direct knowledge of Mr. Murdoch’s reasoning and I am offering my own opinions. That said, let the speculation begin!

Much of Murdoch’s News Corporation holdings are traditional print and broadcast media whose profitability is largely dependent on advertising. Although broadcast advertising is doing OK, advertising revenue growth in the print sector is just not happening --- for Murdoch or anyone else. Revenues for most print media advertising are flat or declining in real terms. However, Internet advertising is growing --- and growing fast! I have seen estimates of 60-85% growth in the next 2 years for US markets.
With 2007 U.S. Internet advertising approaching $21 billion, that is a lot of real dollar growth. Most of that growth will be at the expense of traditional advertising media --- especially print. Since Murdoch operates globally, the US market is just a fraction of the growth he perceives in this new venue and it is a huge challenge to his existing revenue stream.

First, one of the few successful subscription Internet publications
In short, Murdoch must capture a share of that growth to offset declines in print advertising revenues. Dow Jones has, buried in all the WSJ and other print media, some golden Internet assets. First is the Journal’s subscription web site, one of the few successful subscription Internet publications. Its success led to a subscription model for other Dow Jones titles. Next, consider the company’s consumer investor portal, with over 6 million daily visitors. Dow Jones online revenues are growing at a healthy clip, too.
Depending on how you do the accounting, the growth is a healthy 15% (diluted) or an impressive 26% (year over year). DJ is also well into a significant redesign of the WSJ for the “digital age”. This initiative was started in 2005, and the initial launch was in January 2007. The redesign included a new web-friendly press width for the print edition, improved content, better organization/navigation, and tighter integration with the online version.

Dow Jones is going to approach $400 million in “online” revenue for 2007...
All in all, Dow Jones is going to approach $400 million in “online” revenue for 2007, about 20% of its total revenue. That’s a remarkable percentage for a major US company. Murdoch likely foresees that achievement and is betting he is getting some infrastructure, talent, and ideas that can help News Corporation achieve growth in the online arena.
In addition, I believe Murdoch is seriously investing in the social experiment we call the Internet. His earlier social networking acquisition, the 5.6 billion dollar bet on DJ, and his other recent investments point to a serious attempt to change News Corporation into a Web 2.0 (and 3.0) enabled 21st century powerhouse. I think he is doing it, and very competently, too.