Saturday, September 20, 2008

Does Your Website Measure Up? Part 4: Does Your Content Rule?


Does Your Website Measure Up?
This blog topic is being published as a serial. This is the last of four parts.


Part 4: Does Your Content Rule?

The visitor experience at your site is influenced by three primary factors:

1. Relevance and quality of the site content;
2. Ease of finding the desired product and/or information; and
3. Speed and reliability of content delivery.

Thus, content, UI and systems all influence the visitor experience. Said another way, content relevance, ease of access, and speed of delivery are what visitors use to judge your site as good or mediocre, and they determine whether visitors stay, come back to buy, or understand and act upon your sales messages.

I discussed the speed and reliability issues in Part 3 of this series. Throughout the series, I have alluded to the importance of content and of accurately portraying your web site’s products, services, or purposes. This fourth and final part of the series will focus on content factors and best practices.

On the Internet, information, or content, is king. It rules! Why? Because the Internet surfer is looking for information, and with good reason: it is the only way the surfer can distinguish your offerings from those of others. The surfer needs reassurance that your site and your business or other proposition is valid, credible, and otherwise trustworthy. Visitors need assurance that your products will fit, will work for their application, and so on. How do they determine this? The only way they can: by examining your site content. Pictures and descriptions are the selling tools on the Internet. Why do you think it is called the information highway?

Let’s explore how your content can “rule” and help you close business or convey your message convincingly online.

Are You Relevant?

The most important point about web site content is that it should be appropriate to the site’s sales proposition and suit the needs of visitors. If your site is a footwear site, adding content that extols the merits of a political candidate is pretty worthless --- unless the candidate is your customer and/or providing a testimonial about your business.

I recently searched online for information on a particular digital camera. I was interested in learning about the quality of the photographs compared to other brands in the same price range and which lenses best suited my intended photographic targets. By reading the user ratings, I found out some basic information and was referred to other web sites for details and comparisons. I also searched for comparisons of specific models. After the research, I found I would be much better off spending $200 more for a slightly better model than the one I had originally intended to buy. It would have been great, though, to find a camera sales web site that had all the relevant information in one place. If that site existed, and had competitive prices, it probably would have received my business. That’s true content: going beyond the manufacturer’s description and specs and getting to the use, handling, good and quirky features, and suitability for certain applications. Many sites had the user rating feature, and those were helpful.

Generally speaking, the more relevant content you provide, the better your site will rank in the search engines and the more visitors your site will appeal to.

Is Your Content Lost Or Found?

What? Remember the search engine discussion in Part 2? Content plays a huge part in determining how easy your content (and therefore your web site) is to find using search engines. Since 60-80% of traffic at the typical web site comes from search engines like Yahoo, MSN, Google and AOL, and content is an essential measure that search engines use to rank your site in their results, your content determines how easily your site is found by search engine users. Without relevance, your content will not place high in search engine results and will therefore be “lost” to searchers rather than found. Why will it be lost? Simply because the search results will show your site on page 20, 200, or 2000. It is a rare search engine results reader who bothers to go past page 3.

Remember the lessons from Part 2 of this series.

Are Your Visitors Playing The Waiting Game?

You just spent $2500 for that cool 20 second Flash introduction on your site, but the number of people leaving the site before getting past the home page is up 30% and overall page views and visitors have declined. What is going on? You are making them wait to see your content! Most people don’t care about your Flash intro --- probably only Flash developers do. The rest of us want to get to the content we came to see NOW, not waste 20 seconds waiting for your branding commercial to run its course. That is why most Flash intros have an option to skip them, although most are not prominent enough. In my view, you should have to click to continue the Flash intro, not to stop it.

The point is, anything that slows down access to your content is probably detrimental, even cool Flash stuff. Poor site design, slow servers, insufficient bandwidth, or high traffic can all contribute to the waiting game and make your visitors cut their losses by leaving for the next site on the search results list. If you make your visitors play the waiting game, you lose every time. Test your site rigorously, monitor its performance with software services developed for the purpose, and always be sure the path to the “close” is as short as it can be.

How to Get Read and Understood

Did you ever pick up a book and, after trying to read it, just give up because it was too hard to understand what the author was trying to say? You were trying to read for entertainment or education, but felt like you needed a doctorate in literature just to understand the darn book! Popular authors understand this and write their books to be readable without their readers having to struggle to understand what is going on. It’s the old KISS (Keep It Simple, Stupid) principle. These authors understand their audiences want to be entertained or educated and do not want to struggle with the text.

Your visitors are the same. They want information in a readable, understandable way. So, you need to write readable text and spice it up with illustrations appropriate for understanding your product or proposition.

Here are some tips for web readability.

1. Load your conclusion into the front of the page. For a product, this could be a statement like “this shoe is great for walking on hard surfaces like roads and sidewalks.”
2. Only include one idea or subject in each paragraph. Like this list, focus each paragraph on a single message to the reader.
3. Use a lot of white space and use subheadings, lists, bold or colored text, and other techniques to provide emphasis to your essential message.
4. Use left justified text, easy to read fonts, etc.
5. Avoid text on dark backgrounds.
6. Include product photos, especially ones showing the product in use if it is not a commonplace item. For abstract ideas, conceptual diagrams can help get the point across.

Remember that most of us have done most of our reading with black letters on a white background. Therefore, our brains find that most comfortable. Use color for text and text backgrounds carefully and always make sure there is a lot of contrast. Many of your visitors may not have good eyesight or large computer monitors, so be sure you don’t make readability decisions without considering those factors.

Your Content Can Rule

Content is the lifeblood of a web site. The more high quality, relevant content you have, the better your site will be, and that overall quality should attract more visitors than sites with lesser content. Budgeting for the development of quality content and making sure your site serves that content in a relaxed, easy to use fashion can make your site a king among knaves.

Concluding the Series

This article concludes the series “Does Your Website Measure Up?” It has been both a challenge and a lot of fun to write. It has forced me to organize my own thinking about web site development and deployment and, as a result, my clients are benefiting from a more streamlined approach to creating or renewing their web sites.

One thing that should be obvious, but that I have not specifically stated, is that web development is complex and involves more skills and talents than a single person typically has in his/her toolbox. Therefore, web development tends to be a collaborative process involving the owner and other persons with the requisite skills. A single web designer/developer usually does not have all the skills needed in a given situation, and that can be problematic. While good results can come from “singleton” web designers, it usually takes more to get the best results. At a minimum, I would suggest you only consider designers who collaborate with developers and copywriters. Good designers can map out a nice looking, easily navigable site, but few have the skills to integrate a database, add PHP (or other) programming, or add AJAX routines to speed up content delivery. Similarly, I find few programmers who are good content writers, although some designers do this well, especially those who come from ad agency backgrounds. Make sure the team you assemble has the skills you need to get your job done.

Most of my web consulting is related to creating and translating business strategy into supportive web strategies and mapping out the general design concept --- typically in the form of a heavily documented site map. I then work with the client to identify the skills required, find people who can contribute successfully, and manage the project through implementation.

Another point concerns process and project management. Web development (or a major site update) involves a series of steps that should be followed, each concluding with a milestone that is the best measurement of progress. These general milestones are listed below. The process involved can be inferred from each milestone.

1. Site Map Completed (content outline, menus)
2. Home Page Design Completed (navigation for all pages)
3. Site CSS Finalized (fonts and colors)
4. Site Mockup Completed (build pages, no content)
5. Site Content Created (text and photos)
6. Hosting Venue Selected
7. Deployment of the Live Site
8. Maintenance Schedule and Assignments

Note that steps 2 and 3 are typically concurrent. At some point in the future, I will add blog articles on each of these process/milestone steps.

Please contact me if you want guidance for your own efforts, clarification of my statements, or to take exception to my ideas. Thanks for reading.

Monday, March 17, 2008

Does Your Website Measure Up? Part 3: Technology Under the Hood


Does Your Website Measure Up?
This blog topic is being published as a serial. This is the third of four parts.


Part 3: Technology Under the Hood

In referring to underlying technology, I do not generally mean the hardware and software choices. Microsoft, Sun, IBM, and open source solutions are all up to the task of delivering web content. Well, OK, we all have our opinions about which vendor’s technology works best in certain situations, but let’s leave that aside. The critical issues are in the installation, the setup, and the environment your systems live in. These must be done right to reliably perform the required tasks with speed and almost 100% uptime.


Is Your Pipe Big Enough?

Perhaps the biggest issue is bandwidth. Bandwidth in the networking sense refers to the data rate or channel capacity of the local area network (LAN) or Internet connection. In our discussion, we are more concerned with the Internet connection capacity or “pipe” size than LAN capacity. If you connect the best Internet server in the world to an insufficient pipe, the user experience will be degraded by the speed limit of the pipe. As an analogy, assume we have five gallons of fuel to dump into our race car in the pit. If we pour that fuel through a soda straw, it will take a while. Your Internet Service Provider establishes bandwidth for your Internet connection. However, there are other ways to expose your web site server(s) to the Internet.

Besides putting a web server for your public site on your own business network, you can utilize third party web server “hosting” in several ways. All third party web hosting solutions involve sharing physical resources and bandwidth in one of a couple of ways. The first, and least expensive, is called virtual hosting. In this setup, you share a server with other small accounts. You can also have one or more dedicated servers in a host’s data center. You can own your own servers (called co-locating) or use servers provided by the host. Third party hosting usually has the advantage of big bandwidth capacity, secure buildings, redundant cooling and electrical supplies, and redundant Internet backbone connections. The Internet backbone refers to the main “trunk” lines of the Internet owned and operated by the major communications companies and the government. If you host in your own shop and your router connecting to the Internet fails, your site is off line until that router is fixed. At a commercial data center, that router is one of several connections to various “trunks” and requests for your server would be automatically re-routed to a working connection. Nice!

A disadvantage of co-located servers is their relative inaccessibility for maintenance or changes that require physical proximity. However, most hosts will perform disk replacements or other minor hardware maintenance for a reasonable fee. The biggest inconveniences are during the initial installation and when any major upgrades are required.

The primary message concerning bandwidth is this: do not put a public server on your local office’s DSL or cable connection if you expect any significant traffic. The pipe is too small, and it may even be a violation of your ISP’s terms of service or use.


Are You Being Served?

The next factor to consider is the equipment used for your web server(s). Surprisingly, a server that functions quite admirably does not have to have the speed and storage capacity of a desktop used for Microsoft Office applications. It does, however, need some failsafe provisions your favorite desktop can live without. Why? Because that server is working 24/7 and it does not have a human connecting with it daily to discover any problems. While monitoring helps (see below), redundancy is very important.

The first area of redundancy is the power supply. The 115-120 volt AC power coming from the wall socket is transformed into 12 and 5 volt DC power to supply the computer’s internal components. The input source and the associated electronics and transformers are a hardware unit collectively called the “power supply”. Computers with two power supplies that can switch automatically to the other when one fails are preferred. These redundant power supplies are inexpensive, so be sure you have them. Next is to have disk drives that are redundant and can be replaced while the machine is still operating (“hot-swapped”). Some form of RAID (Redundant Array of Independent Disks) is appropriate. See the RAID article on Wikipedia for more information on RAID options.

Optimization of your server configuration is important for performance and ease of administration. Each server software system --- Apache, Solaris, Microsoft IIS, etc. --- has its own best practices and performance tradeoffs. Adding Java with Tomcat or other Java servers, ASP, PHP or .Net technologies can complicate optimizations substantially. Look for documents on the ‘Net that offer guidance for server setup and optimization.
For example, a white paper on Apache optimization can be found on Serverwatch.com. Make sure you understand enough to ask your technology providers some questions to be sure they have considered performance optimization.

Monitoring your server (or web site) just to be sure it is available and performing acceptably is important. You don’t want customers or your boss calling to tell you the site is not operational! Monitoring services that alert you by cell or pager to any performance issue or outage can provide early warning of a problem. Monitoring service costs range from free to $50 annually for one URL with simple checks and reporting, up to tens of thousands of dollars per year for complex monitoring and reporting across many URLs.
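To make the idea concrete, here is a minimal Python sketch of the kind of check a monitoring service runs on a schedule. The URL and thresholds are placeholders I made up for illustration; a real service would check from multiple locations and page or email you automatically.

```python
import time
import urllib.request
import urllib.error

SITE_URL = "https://www.example.com/"   # hypothetical URL to watch
TIMEOUT_SECONDS = 10                    # how long we are willing to wait
SLOW_THRESHOLD = 5.0                    # seconds before we call the site "slow"

def check_site(url: str) -> str:
    """Fetch the URL once and classify the result as OK, SLOW, or DOWN."""
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_SECONDS) as response:
            response.read()                 # pull the whole page, like a browser would
            elapsed = time.time() - start
            if response.status != 200:
                return f"DOWN (HTTP {response.status})"
            if elapsed > SLOW_THRESHOLD:
                return f"SLOW ({elapsed:.1f} s)"
            return f"OK ({elapsed:.1f} s)"
    except (urllib.error.URLError, OSError) as exc:
        return f"DOWN ({exc})"

if __name__ == "__main__":
    status = check_site(SITE_URL)
    print(status)
    if status.startswith(("DOWN", "SLOW")):
        # A real monitoring service would page or email you here.
        print("ALERT: investigate the web server")
```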

The message: take steps to keep your site available and performing at its best while monitoring 24/7 for any potential problems.


Is Your Load Too Heavy?

How do you know if a server (or group of servers) is handling its load OK or if it is overloaded? The most obvious answer is “response time”. What is response time? Simply stated, it is the time between the visitor’s request for a page and when that page is displayed in his/her browser. The industry standard goal is 1-3 seconds for a visitor with a broadband (cable or DSL) connection to the Internet, depending upon who you talk to. I say 2-3 seconds is very good, with peak traffic responses of up to 5-10 seconds, as long as that performance level is delivered consistently from hour-to-hour and day-to-day. Most monitoring services have some way to sample response time for a URL (specific page).
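If you want a quick, do-it-yourself sample before paying for a service, a rough sketch like the Python below will do: it times a handful of requests to one page and reports the average and worst case. The URL is hypothetical, and this only measures from one location, so treat the numbers as a sanity check rather than a true measurement.

```python
import time
import urllib.request

PAGE_URL = "https://www.example.com/products.html"  # hypothetical page to sample
SAMPLES = 5

def time_one_request(url: str) -> float:
    """Return the seconds taken to download the page once."""
    start = time.time()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()
    return time.time() - start

if __name__ == "__main__":
    timings = [time_one_request(PAGE_URL) for _ in range(SAMPLES)]
    print("samples :", ["%.2f" % t for t in timings])
    print("average : %.2f s" % (sum(timings) / len(timings)))
    print("worst   : %.2f s  (goal: 2-3 s, peaks under 5-10 s)" % max(timings))
```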

The infrastructure for multiple servers responding to requests for one site’s URLs can get complex. The individual servers can be connected to a single load balancer that directs requests to the least busy server, to each server in turn, or to different server groups in different locations using some logic for the distribution of requests. While beyond the scope of this article, sites receiving hundreds of thousands of visitors per week need this kind of architecture. However, whenever your requirements for a single site exceed the capacity of a single server, some kind of “traffic cop” --- hardware or software --- is needed.
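As a toy illustration of the “traffic cop” logic, the Python sketch below applies a least-busy rule to a made-up pool of servers. Real load balancers also handle health checks, session affinity, and failover, so this is only meant to show the core idea.

```python
import random

# Hypothetical pool of web servers and the number of requests each is handling.
active_connections = {"web01": 0, "web02": 0, "web03": 0}

def pick_least_busy() -> str:
    """Send the next request to the server with the fewest active connections."""
    return min(active_connections, key=active_connections.get)

def handle_request(request_id: int) -> None:
    server = pick_least_busy()
    active_connections[server] += 1
    print(f"request {request_id} -> {server}")
    # The server does its work; when a request finishes, release its slot.
    if random.random() < 0.5:               # pretend some requests finish quickly
        active_connections[server] -= 1

if __name__ == "__main__":
    for i in range(10):
        handle_request(i)
```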

A final note on performance relates to images. Images are the most important selling tool on the Internet, yet they are the most resource intensive components of any site. An image is typically very large compared to a text area of the same size. We have all had the experience of visiting a web page and waiting, and waiting, and waiting for some image to display. This is usually because the image has not been optimized for the web: it is stored on the server at the same resolution as the camera that took the original photo. These days, that can be a million bytes or more. Re-sizing a picture close to the size of the screen real estate it will use and saving it as a JPG or GIF file will usually mitigate the issue by cutting the size to under 10,000 bytes. The quality will still be good for the web at the expected display size, but the photo will not scale to a larger image very well. This may create an issue when a prospective buyer wants to zoom in for a closer look. That’s why you see small images on most web pages that can be clicked to view a bigger version. That bigger version is in fact a different image file with larger dimensions and byte count. However, since it loads alone, the larger size is less bothersome than when you are trying to display a page with several pictures and text.
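As an example of that optimization step, here is a short Python sketch using the Pillow imaging library to shrink a full-resolution photo to web dimensions and save it as a compressed JPEG. The file names and target dimensions are assumptions for illustration; any decent graphics tool can do the same job.

```python
from PIL import Image  # Pillow: pip install Pillow

SOURCE = "camera_original.jpg"     # hypothetical full-resolution photo
OUTPUT = "product_web.jpg"         # web-optimized copy served on the page
MAX_WIDTH, MAX_HEIGHT = 400, 300   # roughly the screen real estate the image gets

def optimize_for_web(source: str, output: str) -> None:
    """Resize the photo to fit the display area and save a compressed JPEG."""
    with Image.open(source) as img:
        img.thumbnail((MAX_WIDTH, MAX_HEIGHT))   # shrinks in place, keeps aspect ratio
        img = img.convert("RGB")                 # JPEG has no alpha channel
        img.save(output, "JPEG", quality=80, optimize=True)

if __name__ == "__main__":
    optimize_for_web(SOURCE, OUTPUT)
```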

Again, a simple message: make sure you optimize photos and that you have enough server horsepower to keep page load times in the 2-3 second range for pages which are viewed most often.


Technical Excellence is in the Details

As this article shows, the quality of service your visitors experience depends on many factors under the server’s hood. If I had to offer one single piece of advice to the novice, it would be to use a third party resource for hosting. That will all but eliminate the bandwidth issue and help with many more of the technology issues that could take your site offline for more than the 0.02% industry standard goal (about 9 minutes per month).

Sunday, January 27, 2008

Does Your Website Measure Up? Part 2: Do Search Engines Like You?


This blog topic is being published as a serial. This is the second of four parts.

Part 2: Do Search Engines Like You?


Most web site owners find a significant amount, if not a majority, of their traffic comes from the major search engines like Yahoo, Google, MSN, AOL, Lycos and a few others. In simple terms, a web site that is set up to standards that meet the expectations of search engines is likely to get more traffic than a web site that is not. This is because those search engine friendly sites will rank higher in the search results, meaning more people will see them in the results and link to them. There are some basic search engine optimization (SEO) techniques that you want to be sure your developers include routinely in the sites and added pages they build for you.

Most search engines respond well to static HTML (Hyper Text Markup Language) pages with lots of text. If you have a site with dynamic URL calls (including search variables, for example), Flash, Frames or other technologies that are not HTML-based, you probably will need SEO specialists to make your site adequately visible to search engines.

Within the current structure of the search engine marketplace, you can improve search results in two basic ways: organically or by paying for placement. Organic SEO refers to using the elements of site and page design and structure that search engines find most acceptable and which produce higher placement in the results based upon those elements alone. This article focuses on organic SEO, but a brief introduction to paid placement is also provided.

Some parts of this topic may get a little technical in terminology. Please don’t let it put you off. Read for the gestalt first. When you finish, use Wikipedia to look up any definitions that you need for clarification. I have provided some hyperlinks for key concepts and definitions if you want a more in-depth view of this subject.

There are also numerous publications for SEO learning. Since the search engines are continually changing the rules, and the different engines have differing approaches, subscribing to a SEO periodical might be good, especially if you are trying to do it yourself or evaluate an SEO employee or vendor. Search Engine News is a good choice for a monthly paid subscription. Search Engine Watch and Pandia Post are good free resources. A review of the results in your favorite search engine for the term “search engine newsletter” or “search engine help” will provide many more paid and free resources to choose from. Just remember when evaluating paid services that this is a labor intensive service and you will typically get what you pay for. Low one-time fees for “search engine submission” are typically of limited value. SEO is an art and requires repeated attention. Be sure to check references and look at the historical site statistics for the reference sites.

Is Your Domain Relevant?

The first step in the SEO game is to pick a domain name that is reflective of the search terms you expect people to use when looking for your products. If you are selling premium ballet shoes, JillsBalletShoes.com is probably a better choice than FancyFootworks.com. Domains that reflect the name of the business and business names that reflect the product or service will help in search engine rankings.

Can Search Engines Spider Your Site?

A critical step is to make sure the content on your site can be found by search engines. Search engines “crawl” or “spider” web sites to look for unique and relevant content associated with certain keywords, which are then used to index and weight the sites’ pages. (There will be more on keywords later.) The beginning point is the home page for the site; e.g., index.html. Therefore, in order for a search engine to successfully spider a site, the links from the home page to the sub-pages need to be clear in the home page HTML. There are two very good ways to accomplish this.

1. Create a site map with all the pages and the links to the pages. Provide one or more links to the site map on the home page. Site maps are terrific and every site should have one. Be sure to keep it up to date as the site changes.
2. Add a list of all pages as hyperlinks. This is why many sites you visit have a section at the bottom of the page with links to all pages in the site. Again, keep it up to date.

Using one, or preferably both, of these techniques will assure your pages can be readily found by the spiders.
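As a toy example of the first technique, the Python sketch below turns a short, hypothetical list of pages into a simple HTML site map that spiders can follow. Most content management systems will generate something like this for you automatically.

```python
# Hypothetical list of the site's pages; a real site map should include every page.
PAGES = [
    ("index.html", "Home"),
    ("ballet-shoes.html", "Ballet Shoes"),
    ("contact.html", "Contact Us"),
]

def build_site_map(pages) -> str:
    """Return a simple HTML list of links that spiders can follow to every page."""
    items = "\n".join(f'  <li><a href="{url}">{label}</a></li>' for url, label in pages)
    return "<ul>\n" + items + "\n</ul>"

if __name__ == "__main__":
    markup = build_site_map(PAGES)
    with open("site-map.html", "w") as f:   # hypothetical output file
        f.write(markup)
    print(markup)
```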

Another critical search engine requirement is the robots.txt file. Your root directory must have a robots.txt file or some search engines will not spider the site. The robots.txt file can be used to disallow access to content by spiders, too. This is frequently done while sites are under development to prevent getting a bad page rank for the domain due to incomplete content.
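Here is a minimal sketch, in Python, of what that development-time lockout might look like. The rules are standard robots.txt directives; the idea is simply to swap in a permissive version when the site goes live.

```python
# Hypothetical example: generate a robots.txt that blocks spiders while the
# site is under construction. At launch, replace it with a permissive file.
UNDER_CONSTRUCTION = True

if UNDER_CONSTRUCTION:
    rules = "User-agent: *\nDisallow: /\n"          # keep all spiders out
else:
    rules = "User-agent: *\nDisallow: /private/\n"  # allow crawling, hide one folder

with open("robots.txt", "w") as f:
    f.write(rules)

print(rules)
```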

As a final word on spiders, you can set up software to capture search spider visits and provide valuable statistics by analyzing the spider and the results of the visit. By seeing which pages the spider visited and comparing it to page rankings, the analysis can help determine if a page needs additional optimization effort. This analysis software can also block unwanted spiders like those who visit only to copy email addresses for use in spam campaigns.
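A bare-bones version of that analysis can be as simple as the Python sketch below, which scans a web server access log for a few well-known spider signatures and tallies the pages they requested. The log file name and format are assumptions; commercial analytics packages do this far more thoroughly.

```python
from collections import Counter

LOG_FILE = "access.log"                                  # hypothetical Apache-style log
SPIDER_SIGNATURES = ("Googlebot", "Slurp", "msnbot")     # Yahoo's spider is Slurp

def tally_spider_pages(path: str) -> Counter:
    """Count which pages known spiders requested, based on user-agent strings."""
    pages = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if any(bot in line for bot in SPIDER_SIGNATURES):
                parts = line.split('"')
                if len(parts) > 1:                       # '"GET /page.html HTTP/1.1"'
                    request = parts[1].split()
                    if len(request) > 1:
                        pages[request[1]] += 1
    return pages

if __name__ == "__main__":
    for page, hits in tally_spider_pages(LOG_FILE).most_common(10):
        print(f"{hits:5d}  {page}")
```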

Do Your Tags, Links and Names Work Together?

Tags, and the keywords they contain, were primarily what search engines used to rank pages in the early days. Because it did not take site developers long to figure out how to use popular keywords like “sex” in their tags to improve page rankings, the search engines now place far less reliance on tags. However, the repetition of keywords in tags, directory and file names, link names to other pages within the site, and the actual copy or content on the site’s pages is very important. This relationship and frequency analysis by the search engines means that keywords should be carefully optimized in conjunction with their use in other elements of a site’s construction.

Here are the basic keyword elements. I am presenting them to show how they are interrelated, but while I’m at it, I’ve included some best practice “how to” suggestions.

Tags
Title tag: Use a few of your most strategic keywords to title your page.
Description tag: Include as many relevant keywords as practical in a description of the site.
Meta tags: List all your keywords here, again remembering to keep them relevant.
Heading tags: Can be added at the top of pages, using keywords.
Alt/Image tags: Use keywords to tag images with relevant identifiers.
Links (hypertext): Use relevant keywords in links to the site’s other pages or files.
Directory Names: Use keywords in directory names. Separate with hyphens.
Filenames: Use keywords in file names. Separate with hyphens.
Site Map: As indicated before, a site map helps the spiders find your pages. It is also another relevant way to include keywords.

If your keywords are relevant and included in tags, names, and content, most search engines will give “extra credit” in the rankings for the repetition. As you will see in a few minutes, the “relevance” is related to the copy, or content, of your site.
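To show how these pieces can reinforce one another, here is a small Python sketch that assembles the head markup for a hypothetical ballet-shoe page. The keywords, file names, and copy are invented for illustration; in practice the tags live directly in the page’s HTML, but the relationships are the same.

```python
KEYWORDS = ["ballet shoes", "pointe shoes", "dance footwear"]  # hypothetical keywords
PAGE_FILENAME = "ballet-shoes.html"   # keyword in the file name, hyphen-separated

def build_head(title, description, keywords):
    """Return <head> markup with title, description, and keyword meta tags."""
    return (
        "<head>\n"
        f"  <title>{title}</title>\n"
        f'  <meta name="description" content="{description}">\n'
        f'  <meta name="keywords" content="{", ".join(keywords)}">\n'
        "</head>"
    )

if __name__ == "__main__":
    print(build_head(
        title="Premium Ballet Shoes and Pointe Shoes",
        description="Premium ballet shoes and dance footwear for every skill level.",
        keywords=KEYWORDS,
    ))
    # Heading, link, and image alt text repeat the same keywords in the page body.
    print('<h1>Ballet Shoes</h1>')
    print('<a href="pointe-shoes.html">Pointe Shoes</a>')
    print('<img src="pointe-shoes.jpg" alt="pink pointe ballet shoes">')
```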

A Word About Stuffing

No, I haven’t changed the subject to holiday turkeys. In search engine terminology, “stuffing” refers to the addition of popular keywords that are not relevant to the site content. Search engines will ignore non-relevant keywords, or even penalize sites that use large numbers of “hot” keywords in tags when those keywords are not present in the copy on the site. Similarly, sites that use “invisible” text (text the same color as the page background, for example) to put keywords on pages so it appears to be in the copy are penalized. Some search engines exclude such sites completely from their results. Needless to say, it is not a good idea to use this technique.

Does Your Copy Complement Your Key Words?

Using copy that integrates the strategic words that search engines will use to index your site is essential to placement success. Using text with these strategic keywords at the top of each page, in page and paragraph headings, and at the beginning of paragraphs complements the keyword based tags and names used in the construction of your site. All these uses of the strategic words combine to create a synergistic effect in most search engine algorithms, and the attention to detail and relevant content is rewarded with higher placement in search results from searches using those keywords.

Paying For Search Engine Placement

After all this, budget permitting, you may choose to augment your organic rankings with some form of paid placement. Paid placement is a complex subject and I only intend to introduce it here. For a more detailed discussion, refer to the newsletters referenced earlier and look at the descriptions of the AdWords and AdSense products on Google’s ads page.

Paid Placement
Paid placement can take many forms. However, the most common is to pay the search engine for preferred placement for certain keywords. The cost can be a function of the number of times the results display the advertiser’s link, the number of searchers who actually click on the link to the site, or a flat periodic fee. Search engines refer to these as sponsored listings, pay per click, and other similar terminology. Most search engines wisely segregate paid results from “organic” results to avoid criticism from their user communities.

Paid Submission
Paid submission or paid inclusion refers to paying a fee for prompt search engine spidering of a web site. It is most frequently used for new sites to expedite appearing in key search engine rankings. Not all search engines provide paid submission. Paid submission does not affect ranking, just inclusion on that specific search engine. Sites will still be ranked by the search engine’s usual algorithms and good organic design is still important. However, if market research shows you that your competition has good success with a particular search engine, you might consider the paid submission approach for a new web site.

Budgeting for keyword “buys” and paid submissions can jumpstart the success of a new site. Ongoing buys of keywords can enhance the success of an established site. However, these techniques are expensive and a minimum budget of several thousand dollars per year will typically be required to have a significant effect, especially with popular keywords.

Traffic Analysis

Knowing how much traffic your site has and your “sales per visitor” can help you evaluate organic and paid search results. If you have 1000 visitors per day and sell $2000 in product per day to 100 visitors, you have a 10% “conversion rate” (visitors who buy), sales of $20 per customer, and sales of $2 per visitor. In the simplest terms, you should be able to evaluate the effect of new initiatives by seeing the change (or lack of change) in these metrics. However, if a site has rapidly growing traffic, has a base of established customers who buy repeatedly, or is constantly changing its product offerings, evaluating the results of SEO changes becomes more difficult. This is because the other traffic and growth factors are commingled with the SEO factors.
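The arithmetic is simple enough to script. The Python sketch below just restates the hypothetical numbers from the example above so each metric has a name you can track week over week.

```python
# Hypothetical daily numbers from the example above.
visitors = 1000          # visitors per day
buyers = 100             # visitors who purchased
sales = 2000.00          # dollars of product sold per day

conversion_rate = buyers / visitors      # share of visitors who buy
sales_per_customer = sales / buyers      # average order value
sales_per_visitor = sales / visitors     # what each visit is worth in revenue

print(f"conversion rate    : {conversion_rate:.1%}")        # 10.0%
print(f"sales per customer : ${sales_per_customer:.2f}")    # $20.00
print(f"sales per visitor  : ${sales_per_visitor:.2f}")     # $2.00
```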

To analyze these complex factors, it is important to know where your traffic comes from and which page they “land” on. Did they link from a search engine; another link source (business partner or advertising resource); or a personal IP address? Did they link to the home page, a product or service page, or a specific informational page like a how-to or success story? It can also be important to know which pages they linked to on your site and which page they were on when they abandoned, or left, the site.

Your web site host should provide some analytic software to get the broad statistics. You can also subscribe to paid analytics that monitor your site and accumulate data for analysis. Ongoing analysis of a site’s dynamics is important to its initial and ongoing success, especially in the realm of search engine results placement. Be sure to budget time and money for this important function.
Overwhelmed?

SEO best practices cover a lot of ground. If you are in charge of a commercial web site that is focused on eCommerce, you must be sure the SEO bases are covered. Similarly, if finding your web site in the search engines has strategic importance to your organization’s operations or marketing success, SEO is a key element of the site design.

The old saying is that if you build a better mousetrap, the world will beat a path to your door. In the twenty-first century, you have to assure your door is easy to find on the web. The world is too big to rely on word of mouth alone to find your mousetrap.

Saturday, January 19, 2008

Does Your Website Measure Up? Part 1: The User Interface


This blog topic is being published as a serial. This is the first of four parts.

Part 1: The User Interface

Is Your GUI Sticky?
The Graphical User Interface (GUI) or, more commonly, the User Interface (UI) is what contributes most to the visitor experience. The objective is to make the experience as positive as possible so visitors will buy and/or come back to buy. Making your site simple to navigate, error free and comfortable is essential to keeping visitors on the site and getting them back to buy your product or take other action again.
The “stickiness” of a web site refers to the characteristics that make a visitor linger and/or return for subsequent visits.
The UI is one key to making a site sticky. The UI design stages the flow of the search and site “browsing” as well as the transactional flow of a sale or call to action. For the purposes of this series, I am going to assume the call to action is a sale, but it could also be to write a legislator or other action that serves the site owner’s purposes.
What are the basics of good user interface design? There are three main elements:
1. The look and feel of the site;
2. The utility of the content presentation; and,
3. Ease of achieving the user’s goal(s).
Look and Feel
Let’s start with the look and feel of the site. A gaudy site might look interesting, but all the bright colors, waving flags, and gyrating dancers are going to keep your visitor from seeing what you want them to see: your messages and how to find what they are looking for. Similarly, most of us spent 12 or more years learning to read black letters on white paper, so when we encounter red letters on a black background we find them very hard to read. That’s not to say your site should look like a newspaper; color goes a long way toward making a site appealing. Just use colors that provide good contrast for easy reading, or intersperse the black-on-white text with relevant illustrations and photographs. Think back to the illustrated children’s books you have encountered. Another good trick is to look at magazine articles and advertisements. After all, the web page is a page, too, and those magazine layouts have good lessons for web designers.

One big difference between a web site and a magazine is that users have come to expect the page header and navigation elements to remain relatively constant from page to page. A magazine has no interactive navigation elements like the web page hyperlink; the publisher assumes the reader will refer back to the table of contents for navigation. On a web site, this commonality of navigation tools is important: the “buttons” and other hyperlinks to other sections of your site should be clear, easy to find, and consistent from page to page.
Also, stick with the proven proportional fonts: Times New Roman, Garamond, or a common sans serif like Verdana. If you use an offbeat font, not only may it be harder to read, but if the user does not have the font on their PC, their browser may substitute something practically illegible. Refer to the inset to see how my browser interprets Lucida Casual, a font not resident on my computer, alongside some other more common fonts. I suggest you use these more common fonts to be sure your visitor does not see something that is very difficult or impossible to read.


Another critical look and feel element is that everything you want the user to act on should be visible without scrolling the page down or to the right. Make your content fit, and check to be sure it fits on various sizes of monitor at the popular screen resolutions. Some web designers refer to this placement as “above the fold”. The term comes from newspapers, which typically have a horizontal fold midway down the page. Keeping your primary message clear and prominently “above the fold” makes sure it is seen at first glance and increases the chance the visitor will take the time to see more of the site.

Content Presentation

Have you seen a web site where the product for sale has several pictures? You are excited to get such detail and you happily click on a picture’s thumbnail. Then you find the picture is HUGE, taking a long time to load and requiring you to scroll to see it. Then, to make matters worse, you have to close the window and click the next one. Compare that experience to a site where the photos are properly sized and all you have to do to see the next one is click “next”. The former approach is poor content presentation and makes a visitor quickly want to look elsewhere. One of the biggest payoffs in a site redesign comes from considering the user: easy to access content and properly sized photographs.

When my team redesigned the Homes.com real estate portal in 2006-7, enhancing the user experience was the primary goal. We even removed some elements that provided advertising revenue to the business because they were an irritant to the visitor. Visitors use Homes.com to quickly and efficiently search for homes in a specific geographic area and to get as much information as possible about each listing that attracts their interest. So, our mission was to present as many listings on the search page as practical considering page load times and aesthetics; present the listings with a primary photo of sufficient size to see the property clearly; and make it easy and fast to see more details about a listing, the listing agent, and the neighborhood. Once our design team focused on those goals, the changes we needed to make on individual pages became pretty obvious. Note that the changes were not in design alone; we had to use new programming techniques, revise the database structure, and write new database packages to speed things up.

This concept of content relevance to the user’s purpose is an essential element of good design. Too often, web developers worry about the site owner’s needs and do not adequately consider the user experience. This can cause visitors to leave before buying and the web site to get less traffic due to poor relevance and presentation.

Help Them Find What They Want (and buy it)

When was the last time you went to a commerce site and had trouble finding the search box? Isn’t that annoying? Did you stay or leave for another site? This illustrates a primary rule for web page presentation and navigation: MAKE IT EASY to do what the user came to do. For example, a pet peeve of mine is the drop down box for states. I can type VA or FL a lot faster than I can scroll and select “Florida” or “Virginia” from a drop down list. A user friendly way to accomplish the same purpose is to ask for the zip code and populate the city, state and country fields for the user from your own database. This saves the user time and keystrokes and assures your application gets accurate information. Besides, what’s the point of spelling out the state? All shipping documents use the 2 letter state abbreviations anyway. (Yeah, I know the Internet is multinational; but if you have a .com site in English, get real. Your primary audience is the USA and maybe Canada. Just have a checkbox for international customers to invoke a different data entry form.)
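A bare-bones Python sketch of that zip code lookup is below; the zip-to-city table is an invented stand-in for whatever database your site already maintains.

```python
# Hypothetical zip-code lookup table; a real site would query its own database.
ZIP_TABLE = {
    "23185": ("Williamsburg", "VA", "USA"),
    "32801": ("Orlando", "FL", "USA"),
}

def autofill_address(zip_code: str):
    """Given a zip code, return (city, state, country) to pre-fill the form."""
    return ZIP_TABLE.get(zip_code)

if __name__ == "__main__":
    entry = autofill_address("23185")
    if entry:
        city, state, country = entry
        print(f"city={city}, state={state}, country={country}")
    else:
        print("unknown zip code: ask the user to fill in the fields")
```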

Another annoyance with drop down lists is list positioning by the first letter only. “V” usually turns up Vermont, and if I type “A” afterwards, I end up back at the top of the list. If your developers insist on using drop down lists, ask them to use positioning based upon all the characters the user types.

A similar rule of thumb is “three clicks to buy”. Your visitor should be able to find, select, and start the checkout process in three clicks. This is especially true for sites with just one or a few products where searching is unnecessary. However, if a visitor types something in a search box, he should be able to buy from the results list and proceed to check out. A related annoyance is when you add an item to the “cart” and the system takes you straight to the cart for checkout. Why not offer the option to continue shopping or check out now, as many sites do? If they want to continue shopping, return them to the page they were on; if not, display the page to start the checkout process.

Conclusion

As you consider new web pages, remember the three “R”s: Readability, Relevance, and Rationality. Your pages should be readable on a small laptop screen in unfavorable lighting. Your message should be prominent and “above the fold”. Be sure your content is relevant to the user's purposes for visiting and presented accordingly. Finally, make sure your site navigation is rational and, again, that it absolutely facilitates the visitor’s reasons for being there.

Monday, December 17, 2007

Does Your Website Measure Up?


Website Series Introduction

This series of blog entries will relay some suggestions for improving website usability and performance. It is not intended for the technical web practitioner: instead, the intended audience is the business leader who relies upon the web to deliver sales messages to customers and prospects. The idea is to provide some information to help the business leader communicate with the techies and to evaluate the results achieved.

The visitor experience at your site is influenced by three primary factors: 1) relevance of the site content; 2) ease of finding what they are looking for; and 3) the speed and reliability of content delivery. Thus, content, UI and systems all influence the visitor experience. Said another way, content relevance, ease of access, and speed of delivery are what visitors use to judge your site as good or mediocre, and they determine whether visitors stay or come back to buy or to get your sales messages. This series will help you understand what to look for to be sure your site is up to the task.

I plan to publish it in four parts over the next month or two:

Part 1: Is Your GUI Sticky?
The User Interface (UI) is what contributes most to the visitor experience. The objective is to make the experience as positive as possible so visitors will buy and/or come back to buy. Making your site simple to navigate, error free and comfortable is essential to keeping visitors on the site and getting them back to buy again.

Part 2: Do Search Engines Like You?
Most web site owners find a majority of their traffic comes from the major search engines: Yahoo, Google, MSN, AOL, etc. In simple terms, a web site that is set up to appeal to the nuances of search engines is likely to get more traffic than a web site that is not. There are some basic search engine optimization (SEO) techniques that you want to be sure your developers include routinely in the pages they build for you.

Part 3: Is Your Technology Tuned?
In referring to underlying technology, I do not generally mean the hardware and software choices. Several hardware/software vendor solutions are up to the task of delivering web content. The critical issues are the configuration and the environment the systems operate within. These two factors must both be right to reliably perform the required tasks with speed and almost 100% uptime.

Part 4: Is Your Content King?
Visitors primarily find your web site based upon content (and some SEO techniques) and decide to stay and buy your product or proposition based upon content. Clear, professional copy, “community” information, and complete product information are essential to a truly effective web site.

Thursday, December 6, 2007

Image Data Cache


Earlier this year, I implemented an interesting architecture for serving images. The company had a web site with a searchable portal for real estate listings – about 1 million of them, with over 20 million images associated with the listings – and it was growing as sluggish real estate sales caused listing inventories to grow nationwide. In addition, a new version of the portal displayed additional images in the search results. This situation caused the company’s primary image disk storage array to be extremely stressed, especially while loading new images. The secondary site was off line because an upgrade to the disk appliance o/s software proved too memory intensive for the model of disk appliance in service. Resolving that required rolling back the software to an earlier version and then reloading the image files on that appliance by transferring the images from the primary site disk array, further stressing the primary hardware. The job was going to take about 10 days, and performance was suffering.

The idea that solved the problems was to cache images on servers at a third party location. Since the company had already signed a contract with a third party to host and serve video, trying the image cache approach made sense. Within 5 business days, the company had implemented the changes needed and had started to cache images on the third party’s servers. Most of that time was spent waiting for the vendor to set up the systems on their end. The changes on the portal servers were relatively simple and implemented within a day or two.

The cache was pretty easy to implement. Code is changed to request the image from the third party URL instead of the traditional source. In this company’s situation, this was easy because most images are called with a database function that builds a URL for the call and executes the request for the image. So, all the IT staff needed to do was revise and deploy that function.

Once the request is directed to the off-site cache, the image is served if it is cached. If not, the resulting 404 error invokes software on the cache that gets the image from the source. In this company’s situation, that source was actually a set of image servers behind an F5 load balancer. The cache vendor then serves the image and copies it to their servers with concurrent processes.
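Here is a simplified Python sketch of that request flow as described above; the URL and the in-memory dictionary are stand-ins for the vendor’s cache servers and the origin, included only to illustrate the hit-or-fetch logic.

```python
import urllib.request

ORIGIN_BASE = "https://images.example-origin.com"   # hypothetical source behind the load balancer
cache = {}                                           # stands in for the vendor's cache storage

def serve_image(path: str) -> bytes:
    """Serve an image from the cache; on a miss, fetch it from the origin and keep a copy."""
    if path in cache:                       # cache hit: serve directly
        return cache[path]
    # Cache miss (the real system sees a 404 on its own storage): pull from the origin.
    with urllib.request.urlopen(ORIGIN_BASE + path, timeout=30) as response:
        data = response.read()
    cache[path] = data                      # keep a copy for the next request
    return data

# Example: the portal's pages would now reference the cache host instead of the origin.
# image_bytes = serve_image("/listings/12345/photo1.jpg")
```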

Images stay on the vendor’s servers for a time set by the customer. The portal company used a 2 week time to live (TTL). This means that if an image is not requested for 2 weeks, it is automatically removed from the vendor’s cache. The vendor distributed the company’s images across 1000 servers, so the company had to add 1000 names to its DNS records.

The cost of this service is very reasonable compared to the cost of adding or upgrading disk appliances to meet the growth in images stored. For a few hundred dollars a month in vendor services, the company avoided a disk appliance upgrade that may have cost as much as a half million dollars.

For more information about this project, or to discuss other issues related to large scale image storage for web delivery, contact me at 757.675.5010 or email bobpride @ cox.net. (Remove the spaces before and after the @ --- I have to put them there to fool the spammers’ bots.)

Friday, November 16, 2007

Is It Time To Chuck MS Office For A Less Expensive Alternative?


Some of us are old enough to remember when Lotus 1-2-3 was king of the spreadsheets, WordPerfect ruled the document space, Harvard Graphics was the preferred presentation tool, and our email client was whatever CompuServe provided. However, Microsoft gave us its Office suite with Excel, Word, PowerPoint, and Outlook; and now Microsoft completely dominates the space previously occupied by these former icons.

Many IT managers chafe at Microsoft’s dominance of the office “productivity suite” market. They especially balk at the chunk of their budget it takes to acquire new MS Office licenses (about $500 per seat), upgrade old ones (over $300 per seat), and maintain the Windows Server environments it takes to enable Office desktop use and Internet connectivity.

We now see many alternatives in the marketplace, ranging from no-cost open source products to lower cost commercially marketed products. On the free end, we currently have OpenOffice.org, Google Docs, and others. (Example: Lotus Symphony, which is currently free but will probably fall into the commercial category once it gains traction.) On the commercial product list, there are many offerings. The best known are Corel WordPerfect Office ($350 or so per seat), StarOffice (under $70), and iWork (under $80).

Are any of these as good as the MS Office standard users are accustomed to? Probably, except for power users who rely on macros and pivot tables, and for collaborators who need to track changes made by one another. However, there are other considerations to conversion that can be daunting. What follows is a discussion of the most critical ones.

1. Conversion. Conversion of documents from one suite to another is not perfect. This is especially true of heavily formatted Word documents and presentation documents: both types may convert with odd results that require reworking the document in the new software. If you have a large library of standard documents to convert, this could be a big effort.

2. Macros. Macros from one suite will generally not translate to a new suite. The work required to build new macros is a factor to consider.

3. Platform. If you want to completely phase out Microsoft, it means changing the operating system for the desktops and servers to something else. That something else is probably going to be some flavor of Unix or Linux. Changing the operating systems on existing hardware is a huge undertaking and will result in user discontent as some favorite applications are not available on the new platform.

4. Email. Outlook has been Microsoft Office’s killer application. It is good and feature rich. In recent implementations it also has integration with other Microsoft applications that will not be found on competing platforms. IT managers may also have to evaluate and select email client software since many suites do not have one. Exchange users who are coordinating calendars, meetings and using other Exchange features also have considerable homework to do.

5. Training. While many competitors have tried to create a look and feel that Microsoft Office users will find familiar, the fact is the applications will have differences and users will need to be trained and supported in making the transition.

6. Collaboration. Compatibility with users in other organizations, “markup” sharing with other users, and other collaboration tools are generally weak in the alternative software. If you have these needs, tread carefully.

Based upon the factors listed above, the attraction of replacing Office dims substantially. The conversion costs are just prohibitive. Web-based applications like Google Docs offer the most painless transition, but still have problems to overcome, like the perceived security of sensitive documents and Internet connection downtime.

In my opinion, converting from MS Office is still too costly for large organizations that have already invested in the Microsoft Office software and supporting technology. However, startups, smaller businesses facing upgrades, and new offices of larger organizations should seriously consider the open source alternatives. The initial savings can be substantial.

Monday, November 12, 2007

Michael Gelb has written a new book


Michael Gelb has written a new book. For those of us who know Gelb’s work, this is great news. Mr. Gelb looks at superstars in the arena of human achievement and extracts the essence of the behaviors that make them more creative, productive and successful than the rest of us. Gelb's published works convey keen insight - some might say genius – in a fashion that is readily understandable and usable by the reader.

Gelb’s most well-known books are based upon the creative genius of Leonardo da Vinci and other great thinkers and extract principles we can employ to empower ourselves with some of that genius.

Gelb’s new work, Innovate Like Edison, is about Thomas Edison’s success as an inventor and his amazing ability to get sweeping changes made to the fabric of American society to allow his inventions to be mainstreamed. His inventions of the phonograph, movie camera, and light bulb created the need for the voice and film recording industry and widespread electric power production and distribution. This process of invention and implementation is innovation.

Gelb ferrets out the concepts and then describes the practical use of Edison’s Competencies of Innovation:

1. Solution-Centered Mindset
2. Kaleidoscopic Thinking
3. Full-Spectrum Engagement
4. Master Mind Collaboration
5. Super-Value Creation
He also includes lots of practical exercises and lists a plethora of resources for innovators. I won’t spoil the surprise by describing them, but be assured every entrepreneur and business innovator can benefit from understanding and mastering the application of these competencies.

The book is co-authored with Edison’s great-grandniece, Sarah Miller Caldicott. This collaboration provided Gelb with access to the Miller/Edison family archives, and that adds to the richness of the product. It is available at Amazon.com and other book retailers.

Thursday, November 1, 2007

The power of vision



…I quit cigarettes forever

In 1992, after smoking for the better part of three decades, I quit cigarettes forever. I had quit before --- countless times --- and it did not last. Why did it work in July of 1992? Well, I think there were several reasons: social pressures, rising cost of cigarettes, my mother’s death 15 years before from what was probably smoking-induced cancer, and increasing pressure from the younger people in my life to give it up. However, there was a difference this time which was purely psychological: I could see myself smoke free. I envisioned clothes that did not reek of cigarettes, a fresh smelling car, a day uninterrupted by cigarette breaks, and more cash for other things.

I truly believe that imagining myself as a non-smoker was the big difference. That vision of a smoke free Bob came true, mostly because I could see myself clearly as a non-smoker.

So what? Bob is some big hero because he quit smoking? No, it illustrates a powerful principle that can help us all achieve more.

From the football player who sacks the quarterback to the bride who loses ten pounds before she walks down the aisle, the vision of the goal stokes the fire of achievement and helps get the desired result.

Microsoft spent millions developing and promoting…

It can work in business, too. Microsoft spent millions developing and promoting the Microsoft Solutions Framework for success in managing complex technology projects. One of the key elements of the framework is a vision of the desired outcome(s). This vision assures everyone involved shares the same definition (vision) of the project’s success. Strategic plans routinely include descriptions of the desired future state of the enterprise. This kind of strategic vision provides both a road map for achievement and a measure of success.

The power of the vision can be effective on a more tactical level, too. I once ran a customer support organization that was struggling to keep up with customer service demands. Working with the staff, I created a vision of the best customer service available and began implementing changes to support it. The first step was to start telling our customers that we provided the best support available in our industry. We shared the vision with our customers and within a few months, the vision became reality. We were the best customer support organization in the industry and we had the awards and customer retention statistics to prove it.

Personalize it for the team members…

So the next time you want to achieve something, create a vision of what the world will be like when the goal is met. Personalize it for the team members, making sure you define what they will be doing, how they will feel, and other clear indicators that the goal has been met. Set a time certain for the achievement and be very specific: for example, “I want to complete the vision blog by the close of business on November 2, 2007. When it’s finished, I will feel proud, relieved and happy. I will celebrate with a shot of the good brandy.”

If the goal involves complexity, you may need a plan to achieve it. An easy way to develop a plan is to work backwards from the goal date to the present, detailing the actions needed to achieve the goal. Think of it like building a house. If your planned move-in date is December 1, 2007, think of the last thing that has to happen before you can move in: getting an occupancy permit. Before that, you need final inspections. Before that, landscaping must be completed. Before that, utilities must be turned on. By moving backwards from the goal, it is easier to identify the milestones and processes needed to achieve the desired end result.

If you try to plan forward from today, it is much harder...

Don’t be afraid to explore alternative sequences and different approaches, but always work backwards from the goal. If you try to plan forward from today, it is much harder and will take far longer to develop a plan. And the likelihood is that forward-moving plan development will produce an inferior result.

Motivate yourself and your team with colorful, carefully crafted visions of the end result and you will be rewarded with a better future.

Monday, October 22, 2007

Orkut.com


...Google's problems in Brazil with inappropriate content...

The Wall Street Journal recently published a front page article highlighting Google’s problems in Brazil with inappropriate content on its Orkut.com social networking site. Orkut, Google’s version of social networking sites like Facebook.com and NewsCorp’s Myspace.com, has become wildly popular with Brazilians, in an inexplicable scenario best explained by Malcolm Gladwell’s book The Tipping Point. Social networking sites allow users to establish online communities (topics) and upload content to both the community areas on the servers and to personal web sites (which may also have community affinities).

Anyway, leaving the intricacies of the platform aside, the communities created range from those dedicated to praising moms to the darker side of human nature: racism, pedophilia, animal cruelty and so forth. The WSJ does its usual excellent job of explaining how Google got in hot water, so I won’t rehash that. What I want to point out is the root problem and a proposed solution.

The root problem is...

The root problem is that these social networking sites (like any internet site) allow users with access to upload anything. In the case of Google’s Orkut, all content uploaded ends up on Google’s servers in the USA, where free speech is an inalienable right. Not so in Brazil. So the real questions are:

1. Do content hosts have a legal right or obligation to edit content?
2. What laws apply?

Let’s deal with the legal right to edit content first. Google, and other content hosts, have what is known as Terms of Service (TOS). In these, they specify the types of content that are not approved for uploading to their servers. If a user violates these terms, then Google has the right under the TOS to delete the content and remove the user account, because the user agreed to the TOS in order to receive an Orkut account.

Next, does Google have an obligation to remove content that violates its Orkut TOS? Well, according to the TOS, they do. However, they also state they are not liable for content that violates the TOS. That’s a topic for another day, so let’s assume for the moment they have an obligation but the timeliness of their obligatory response is fuzzy.

If a Brazilian citizen in Brazil uploads pro-Nazi comments or racial slurs...

Now we get to the meat of today’s topic. What laws apply? If a Brazilian citizen in Brazil uploads pro-Nazi comments or racial slurs and the uploaded content is stored on, and served from, computers in the US, whose laws apply if the content is subsequently served to another Brazilian citizen in Brazil? What if the same content is uploaded by a US citizen standing on US soil? By a Brazilian on US soil? By a US citizen in Brazil?

Logically, if the content is illegal in Brazil and was uploaded by a Brazilian who was actually in Brazil at the time, the Brazilian citizen seems to be the culprit, and the Brazilian government seems within its rights to prosecute that citizen. Yet the content is stored and served from computers in the US, where such content may be perfectly legal. Because the content can be served to citizens in their homes and offices in most countries of the world, each with varying laws and standards, the permutations become seemingly infinite.

Unfortunately, when you offer a free service...

The solution is crafting adequate Terms of Service and enforcing them. If Google (and others) craft TOS agreements that reflect the laws of the countries of origin where they allow users to sign up, then Google and the others should enforce those agreements by screening content before it is available to the public. Unfortunately, when you offer a free service and users upload millions of items daily, enforcement comes at a high cost. Human review of just the millions of images uploaded daily would be very expensive. When you throw in text posts, the number of editors needed would start to look like the cadre of TSA screeners.

You also have to consider that content legal in one country, and allowed under that country’s TOS, may be illegal elsewhere, where users signed up under different content standards. So Google may want to display that content to users in one country but lock it away from users in another.

Some would say police nothing and allow everything...

So what is the solution? Some would say police nothing and allow everything; however, I think we can safely assume that society is not going to accept that alternative. So the solution has to lie with technology, of course. Specifically, the solution is applying image recognition software to screen out photos that do not meet TOS standards and using context-sensitive text screening software to catch the words of extremists and pedophiles who violate local law or the TOS. Since the sophistication of existing software is not yet up to the task, this is not an easy solution. However, it is probably easier than getting all the countries of the world to agree to uniform laws or to a global internet authority. Google, are you up to the technical challenge? BTW, the Homeland Security folks would love the software, too, so there may be a big paycheck in it for Google!
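To make the idea concrete, here is a minimal sketch in Python of what such a screening pipeline could look like. The classifier functions and term lists are hypothetical placeholders, not real Google systems or APIs; the hard part is building recognition software good enough to fill them in.

# Illustrative sketch of automated TOS screening before content goes live.
# The classifier functions below are hypothetical placeholders.

# Per-country term lists reflect the idea that TOS standards differ by country.
BANNED_TERMS = {
    "default": {"example banned phrase"},
    "BR": {"example banned phrase", "example term illegal in brazil"},
}

def image_violates_tos(image_bytes: bytes) -> bool:
    # Placeholder for image recognition (e.g., detecting prohibited imagery).
    return False

def text_violates_tos(text: str, country: str) -> bool:
    # Placeholder for context-sensitive text screening.
    terms = BANNED_TERMS.get(country, BANNED_TERMS["default"])
    lowered = text.lower()
    return any(term in lowered for term in terms)

def screen_upload(country: str, text: str = "", image: bytes = b"") -> str:
    """Return 'publish', 'hold for review', or 'reject' before serving the content."""
    if text and text_violates_tos(text, country):
        return "reject"
    if image and image_violates_tos(image):
        return "hold for review"   # uncertain machine judgment gets queued for a human
    return "publish"

print(screen_upload("BR", text="A community dedicated to praising moms"))  # publish

The design choice worth noting is that screening happens before the content is served, and that the rules can vary by the country under whose TOS the user signed up.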

Sunday, October 14, 2007

Sell or Hold --- Managing Foreclosure Inventory


With these market conditions, how does a lender decide whether to sell or hold a property?

Prestigious sources like the Wall Street Journal and Realtor Magazine have devoted a considerable number of column inches to the growth in residential real estate inventories. Some of that inventory growth has been due to foreclosures, especially in the last 24 months. RealtyTrac.com, an online foreclosure marketing company, and the Mortgage Bankers Association are both citing year-to-year growth in foreclosures of 40-50% for 2006 over 2005 and forecast similar results in 2007 over 2006. What this really means is that a lot of homes are lender owned today. Because the market is still weak, these homes are not selling.

With these market conditions, how does a lender decide whether to sell or hold a property? By collecting some basic financial data on each property and knowing the cost of capital, a lender can pragmatically decide whether to accept an offer on a foreclosure property or not.

Since most foreclosures will be disposed of within a year or two...

The first figure to know is the lender’s cost of capital. This is the weighted average of the cost of borrowing and the cost of equity, expressed as an interest rate. Since most foreclosures will be disposed of within a year or two, it may be best to use the cost of incremental, or additional, capital. Most companies will find their current incremental cost of capital to be in the range of 10-12%, assuming a mix of preferred stock or bonds, debt, and common stock, and assuming the company is only moderately risky from the investor’s viewpoint.
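To make the weighted average concrete, here is a minimal sketch in Python. The capital mix and component costs are hypothetical figures chosen only to land in the 10-12% range discussed above, and tax effects are ignored for simplicity.

# Minimal sketch of a weighted average cost of capital estimate.
# The weights and component costs below are hypothetical examples only.

def weighted_average_cost_of_capital(components):
    """components: list of (weight, annual_cost_rate) pairs; weights should sum to 1."""
    return sum(weight * rate for weight, rate in components)

capital_mix = [
    (0.40, 0.08),   # debt at 8% (ignoring tax effects for simplicity)
    (0.20, 0.09),   # preferred stock / bonds at 9%
    (0.40, 0.15),   # common equity at 15%
]

wacc = weighted_average_cost_of_capital(capital_mix)
print(f"Incremental cost of capital: {wacc:.1%}")   # about 11% with these assumptions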

The next numbers are the holding costs: taxes, insurance, lawn service, utilities, and depreciation. Yes, depreciation must be considered. Any inventory is subject to deterioration, and empty houses are particularly prone to vandalism, storm damage, and general neglect from being unoccupied. This “inventory carrying cost” may be somewhat subjective and vary widely from property to property, but a lender should be able to determine the ratio of such expenses to the loan value outstanding on foreclosed properties by market area.

Finally, the selling costs are needed: real estate commissions, transfer taxes, fixing up expenses, and other costs borne by the seller. Other than fixing up expenses, selling costs are not particularly relevant since they will be incurred whenever the property sells. The only variable there is the selling price, so one could argue the difference in selling costs between a full value sale and a discounted sale should be considered. I chose to ignore that difference because I focus on the monthly costs of ownership.

Assume the lender has a home appraised at $240,000 with a defaulted balance of $200,000

Now let’s look at an example. Assume the lender has a home appraised at $240,000 with a defaulted balance of $200,000. Also assume there is no chance of recovery from the borrower, the houses in the market are taking an average of 14 months to sell (longer for foreclosures), and the house needs some work to sell at the appraised value.

The numbers:
Cost of capital: 12%, or $2,000 per month
Taxes: $2,400 per year, or $200 per month
Insurance: $800 per year, or $67 per month
Utilities: $120 per month
Lawn service: $100 per month on average (use summer/winter costs to be precise)
Lender’s inventory carrying cost (deterioration and loss) in this market: 2% annually, or $333 per month

The scenario:
This house has been on the market 3 months
The lender has an offer of $148,000 from a qualified buyer who can close in 30 days
Fixing up expenses will be $1800 to meet FHA lending standards

However, the message is it is better not to have and hold...

The monthly holding cost for this property is $2,820. It could be on the market another 11 or more months, which means $31,020 or more in holding costs. The loan balance, less this potential cost, is $168,980. Once you account for the fixing up expenses of $1,800, the offer of $148,000 is far below this benchmark discounted value. An offer of about $170,800 (the $168,980 benchmark plus the $1,800 in fixing up expenses) would be a break even offer after discounting for potential holding costs, and offers approaching this figure should be seriously considered.
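For readers who want to check the arithmetic, here is a minimal sketch in Python that restates the example above; the break-even formula simply mirrors the reasoning in this post (benchmark value plus fixing up expenses), not an industry-standard model.

# Sketch of the hold-vs-sell arithmetic from the example above.

loan_balance = 200_000
monthly_holding_cost = sum([
    2_000,  # cost of capital (12% on $200,000)
    200,    # taxes
    67,     # insurance
    120,    # utilities
    100,    # lawn service
    333,    # inventory carrying cost (2% annually)
])                              # = $2,820 per month

expected_months_to_sell = 11    # additional months on the market
fixing_up_expenses = 1_800      # to meet FHA lending standards

potential_holding_costs = monthly_holding_cost * expected_months_to_sell   # $31,020
benchmark_value = loan_balance - potential_holding_costs                   # $168,980
break_even_offer = benchmark_value + fixing_up_expenses                    # about $170,800

offer = 148_000
print(f"Benchmark discounted value: ${benchmark_value:,}")
print(f"Break-even offer:           ${break_even_offer:,}")
print(f"Accept ${offer:,}? {'Yes' if offer >= break_even_offer else 'No'}")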

One can argue that the time on the market before a sale is the big unknown. I certainly agree. If the lender feels this house will sell faster or slower than average, that should factor into the decision. However, the message is that it is better not to have and hold if the lender gets an offer that is as good as or better than the averages suggest, after considering the potential holding costs.

Sunday, October 7, 2007

Why did Rupert Murdoch buy Dow Jones Corp?


...let the speculation begin!

Let me start by saying that I have no direct knowledge of Mr. Murdoch’s reasoning and I am offering my own opinions. That said, let the speculation begin!

Most of Murdoch’s News Corporation holdings are traditional print and broadcast media whose profitability is largely dependent on advertising. Although broadcast advertising is doing OK, advertising revenue growth in the print sector is just not happening --- for Murdoch or anyone else. Revenues for most print media advertising are flat or declining in real terms. However, Internet advertising is growing --- and growing fast! I have seen estimates of 60-85% growth in the next 2 years for US markets.

With 2007 U.S. Internet advertising approaching $21 billion, that is a lot of real dollar growth. Most of that growth will be at the expense of traditional advertising media --- especially print. Since Murdoch operates globally, the US market is just a fraction of the growth he perceives in this new venue, and it is a huge challenge to his existing revenue stream.

First is WSJ.com, one of the few successful subscription Internet publications

In short, Murdoch must capture a share of that growth to offset declines in print advertising revenues. Dow Jones has, buried in all the WSJ and other print media, some golden Internet assets. First is WSJ.com, one of the few successful subscription Internet publications. Its success led to a subscription model for Barrons.com. Next, consider Marketwatch.com, a consumer investor portal with over 6 million daily visitors. Dow Jones’ online revenues are growing at a healthy clip, too.

Depending on how you do the accounting, the growth is a healthy 15% (diluted) or an impressive 26% (year to year). DJ is also well into a significant redesign of the WSJ for the “digital age”. This initiative was started in 2005 and the initial launch was in January 2007. The redesign included a new web-friendly press width for the print edition, improved content, better organization/navigation, and tighter integration with the online version.

Dow Jones is going to approach $400 million in “online” revenue for 2007...

All in all, Dow Jones is going to approach $400 million in “online” revenue for 2007, about 20% of its total revenue. That’s a remarkable percentage for a major US company. Murdoch likely foresees that achievement and is betting he is getting some infrastructure, talent, and ideas that can help News Corporation achieve growth in the online arena.

In addition, I believe Murdoch is seriously investing in the social experiment we call the Internet. His acquisition of MySpace.com, the $5.6 billion bet on DJ, and his other recent investments point to a serious attempt to change News Corporation into a Web 2.0 (and 3.0) enabled 21st century powerhouse. I think he is doing it, and very competently, too.