Hadley Reynolds of IDC moderated a wide-ranging discussion of strategies to enhance "findability" on corporate websites at this week's Gilbane Conference in San Francisco. The discussion bypassed the most commonly discussed topics, such as improving ranking in Google or methods to increase click-through rates. Instead, the focus was on how to ensure the people companies most want to reach -- potential customers -- can reach the right information and be converted from visitors to customers, both on the wider web and within the corporate website. Here are the key takeaways.
The speakers for the discussion were Richard Zwicky of Eightfold Logic, Ed Hoffman of SLI Systems and Sam Mefford of Avalon Consulting.
We Aren't Listening to Customers
Richard Zwicky of Eightfold Logic kicked things off by delivering an admonition, a book recommendation and a roadmap for connecting with future customers. The admonition, from Vanessa Fox's book "Marketing in the Age of Google" (which Zwicky recommended everyone buy), was simple:
Millions of customers are telling us what they want, but we aren't listening.
Zwicky noted that in the online world we have lots of information about visitors -- something offline, traditional marketing doesn't have and can't leverage. This information can, and should, be used to improve the online marketing effort. Some of his key points:
- Marketing is getting people talking about your products, being found where customers are looking -- not necessarily where you want them to look -- and telling them only what they need to know.
- Get the customer to just the information they need; don't waste time telling them things they don't need to know. It's key to make a positive impression on third-party sites and get people to validate a purchase decision in ways independent of the corporate website.
62% of people consult online communities before making a purchase; only 27% go directly to a retailer.
Who Cares About Cost Per Click?
Zwicky decried the focus on cost per click. Getting the sale is the important thing, he said. Once you have the sale, you can then look at the data and work back to the cost.
He also discussed SEO and the relationship of paid to organic traffic. With 88% of traffic being organic, the focus should be on that, not on the small fraction that is paid. SEO also isn't just about customer acquisition; it's key to optimizing the conversion process, effective internal site navigation, and branding, among other things.
Where Will Customers be Tomorrow?
He wrapped up with a discussion of the value of prediction. He said too much focus is on figuring out where users have been or what they have looked at in the past -- that's not what all this data is for.
The key is what your customer will be looking for tomorrow.
If you can get your focus and message on where customers are going next, before they go there, that's where you need to be. To do this, organizations need to constantly listen to the buzz and see what people are talking about. Measure everything you can, because you can't manage what you can't measure. In the end it's not about how many people visit your site; it's about finding the right users and making them your customers.
Two Aspects to Findability
Ed Hoffman kicked off a talk focused primarily on search as it's used within corporate sites. He asserted that there are two aspects to users finding a company's products:
- How your site gets found in the first place, from external sources like Google.
- Once they are on your site, how they get to what they are looking for.
These two aspects are connected because people search within a site the same way they search from outside it.
Hoffman then talked about some of the influences and factors facing companies: the use of Facebook for search and recommendations, the way mobile devices are changing how people search and find information, and the need to understand how competition affects the landscape.
Streamline Arrival, Continuously Refine
Hoffman also talked about how to streamline the experience once users get to a site:
- Have landing pages that get people to what they are looking for.
- Put thought into how users will get to other relevant information on the site if the first page doesn't do it.
- The relevance of internal site search is key.
- The search results page is often a top exit page. Is this because users found what they needed right in the search results (an address or phone number, for example), or are they leaving because they didn't get what they wanted? One way to put numbers behind that question is sketched below.
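As a rough illustration, here is a minimal sketch of how exits from the search results page could be split into likely successes and likely failures. The session fields are an assumption about what an analytics export might contain, not anything specific from the talk.

```python
# Sketch only: field names below are hypothetical analytics-export fields.
def classify_search_exits(sessions):
    """sessions: dicts like {"exit_page": "/search", "result_clicks": 2, "dwell_seconds": 45}.
    Exits with no result clicks and a very short dwell time are the ones
    most likely to be failed searches rather than quick answers."""
    search_exits = [s for s in sessions if s["exit_page"] == "/search"]
    likely_failed = [
        s for s in search_exits
        if s["result_clicks"] == 0 and s["dwell_seconds"] < 15
    ]
    return len(likely_failed), len(search_exits)
```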
He said there is a need to continuously improve. Even Google, famously slow to change from a design perspective, is incorporating more interactive results. It's not just a list of blue links anymore: it's video, images, refining and sorting of results, and the incorporation of social information. All of this can, and should, be added to site search results. Companies should add new features, see what the reaction is and test the results to find better ways of connecting users with what they want.
Four Techniques to Improve Content Quality, with Less Staff
Sam Mefford closed out the session with a discussion of four things a corporate site with a lot of data can do to improve the quality, relevance and findability of content. The problem? Organizations often have so much information in so many different places that it's not feasible to manually integrate and manage it all.
What are the core needs?
- Cross-linking between information
- Better meta-data
- Integration of data that may live in many different places
- Integration of structured content you may not have direct access to
Cross Linking
Here Mefford meant the classic "See Also" box on a page, which can be key to maximizing return and getting people to what they need quickly. The solution is relatively straightforward: use your own search results to kick-start cross-linking. Implement it so the links are generated automatically and an editor then approves the list. Alternatively, a third-party recommendation service like Baynote can be used.
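Here is a minimal sketch of that approach. The `search()` and `save_for_review()` calls are hypothetical stand-ins for whatever site-search API and editorial workflow a site actually has; none of this comes from the talk itself.

```python
# Sketch only: generate "See Also" candidates from your own site search,
# then queue them for an editor to approve before they go live.
def suggest_related_links(page, search, max_links=5):
    """Query site search with the page's title and keywords; other pages
    that rank well are natural cross-link candidates."""
    query = " ".join([page["title"]] + page.get("keywords", []))
    results = search(query)                                   # hypothetical search client
    candidates = [r for r in results if r["url"] != page["url"]]
    return candidates[:max_links]

def build_see_also(pages, search, save_for_review):
    for page in pages:
        links = suggest_related_links(page, search)
        save_for_review(page["url"], links)                   # editor approves or trims the list
```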
Better Metadata
There's too much content to add metadata manually on an item-by-item basis. Identify your key content (the top 200 product pages, for example) and provide manually curated, full metadata for those pages. Then, for the rest of the content, store just one piece of metadata: a link to one of the high-value pages that has full metadata, and have that content assume the metadata values of the page it links to.
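A minimal sketch of that inheritance idea, with illustrative page paths and field names that are assumptions rather than anything Mefford specified, might look like this:

```python
# Sketch only: key pages carry full, hand-curated metadata; everything else
# stores a single pointer and assumes the metadata of the page it points to.
key_pages = {
    "/products/widget-pro": {
        "title": "Widget Pro",
        "description": "Flagship widget for high-volume plants",
        "keywords": ["widget", "industrial", "automation"],
    },
}

long_tail_pages = {
    "/support/widget-pro-firmware": {"inherits_from": "/products/widget-pro"},
}

def resolve_metadata(url):
    if url in key_pages:
        return key_pages[url]
    pointer = long_tail_pages.get(url, {}).get("inherits_from")
    return key_pages.get(pointer, {})   # long-tail page assumes the key page's metadata
```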
Integration of Disparate Data
Systems like Kayak are a great example of this: for any given request, content is coming from many different sources. Search is a good tool for tackling this, since a search index gives you a read-only copy of the data that can be used and then refreshed at any time.
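A minimal sketch of the pattern, with hypothetical source connectors and a generic `index` object standing in for whatever search engine (Solr, Elasticsearch, or similar) the site uses:

```python
# Sketch only: the index is a disposable, read-only copy of the source systems,
# so it can be dropped and rebuilt on any schedule without touching them.
def rebuild_index(index, sources):
    index.clear()                            # hypothetical index API
    for name, fetch in sources.items():      # e.g. {"crm": fetch_crm, "catalog": fetch_catalog}
        for record in fetch():
            record["source"] = name          # remember where each record came from
            index.add(record)
```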
Integration of Structured Content
You may have access to data only through a web page that combines information, with no access to the back-end source data. In this case you can use one of a variety of third-party "smart spiders" to read the pages and then use rules and query languages to pull the relevant data out of the HTML. The extracted information can then be added to your own site.
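As one possible illustration, here is a small sketch using requests and BeautifulSoup in place of a commercial smart spider. The URL structure, CSS selectors and field names are assumptions for the example; real pages would need their own extraction rules.

```python
# Sketch only: when you can reach the rendered page but not the back-end data,
# crawl the HTML and pull fields out with selector rules.
import requests
from bs4 import BeautifulSoup

def scrape_product_table(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    records = []
    for row in soup.select("table.products tr")[1:]:          # skip the header row
        cells = [td.get_text(strip=True) for td in row.select("td")]
        if len(cells) >= 3:
            records.append({"sku": cells[0], "name": cells[1], "price": cells[2]})
    return records   # ready to load into your own content store or search index
```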