
Enterprise Search Implementation, Step 2: Picking The Right Search Engine

By John Gillies posted 07-07-2011 10:42


In my previous posting on implementing a search engine in a law firm, I focused on the first step of the process, namely Establishing the Business Requirements. Getting a detailed list of your business requirements is the essential starting point, because you will use it to compare the features of your “finalists.”

Once you have that list in hand, your next hurdle is figuring out which search engines you want to test. At some point you may well choose to involve an outside consultant with experience in search engine implementations to help guide you through the process; we found the help of our consultant (Joshua Fireman from ii3) to be invaluable. If you haven’t done so before now, this would be a good time.

In determining what sort of search engine you’re looking for, you’re faced with two choices: restrict your search to engines that have been customized for the legal market, or look at engines designed for the general market (for the Fortune 500 crowd, if you will), knowing that you’ll need to do a fair bit of customizing to address the many unique aspects of a legal environment.

You should only consider going the second route if you have reliable support on the technology front, so that you’re confident your business requirements can be realized in your environment. For example, if integration with your DMS is an important requirement, will you actually be able to optimize your “Fortune 500 search engine” to do that effectively? And how much ongoing coding work will be needed so that it continues to function properly as your environment is upgraded? Many firms choose the first route simply because they do not want to depend on so many variables, many of which are beyond their control.

We opted to limit our selection to engines optimized for the legal environment, namely Autonomy iManage Universal Search (a/k/a “IUS”) and Recommind’s Decisiv Search. It is here that the investment of time in defining your business requirements really pays big dividends. In my previous posting, I noted that we had ranked our requirements by importance (from “Essential” down to “Nice to have”). You can now use that list to create an Excel spreadsheet for comparing your finalists.

We created five categories of ranking, with scores running from 5 down to 1. We then added another column rating each engine, also on a five-point scale, on how well it met each requirement. We then went systematically through each item in the business requirements document, assessed how well the feature in question performed, and assigned a score. Excel will calculate the weighted score for each item (so, for example, an “Essential” item that you give a score of 5 gets a weighted score of 25).
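As a purely illustrative sketch (the requirement names, weights, and scores below are invented, not our actual data), the same weighted-scoring arithmetic looks like this in Python:

  # Importance weights: 5 = "Essential" ... 1 = "Nice to have"
  # Each entry: (requirement, importance, {engine: score on a 1-5 scale})
  requirements = [
      ("DMS integration",         5, {"Engine A": 5, "Engine B": 4}),
      ("Relevance ranking",       5, {"Engine A": 3, "Engine B": 5}),
      ("People/expertise search", 3, {"Engine A": 4, "Engine B": 3}),
      ("Saved searches",          1, {"Engine A": 4, "Engine B": 2}),
  ]

  def weighted_totals(reqs):
      """Total weighted score per engine; an item's weighted score is its
      importance multiplied by the engine's score (e.g. 5 x 5 = 25)."""
      totals = {}
      for _name, importance, scores in reqs:
          for engine, score in scores.items():
              totals[engine] = totals.get(engine, 0) + importance * score
      return totals

  print(weighted_totals(requirements))   # {'Engine A': 56, 'Engine B': 56}

In the actual spreadsheet this is simply a column of importance weights, a column of scores for each engine, and a product formula, but the arithmetic is the same.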

There is no “ideal” minimum or maximum score to aim for at the end of this process. It is conceivable that the final scores will be so low that you have to reassess your entire approach, but the likelihood of that is minimal. What you will most likely get is a total score for each of the finalists that enables you to make a much more objective comparison than if you had just seen vendor demos of each.

You can also use Excel to compare your finalists just on their scores for the “Essential” items. (You may find, for example, that the overall result is fairly even between them, but one of them scores significantly higher when comparing just the “Essential” items. Once again, this is important information in helping you make your final decision.)
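Continuing the illustrative sketch above (and reusing its invented data and the weighted_totals function), the “Essential items only” view is the same calculation restricted to the weight-5 rows:

  # Compare the engines on "Essential" (weight 5) requirements only
  essential_only = [r for r in requirements if r[1] == 5]
  print(weighted_totals(essential_only))   # {'Engine A': 40, 'Engine B': 45}

In this made-up example the engines tie overall, but one pulls clearly ahead on the essentials, which is exactly the sort of signal described above.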

That is not, however, the end of the matter. Whether your finalists essentially get the same score (which is what happened in our case) or whether there is a clear winner, there are other non-quantifiable factors that you need to take into account, all of which can significantly influence your final decision.

The first factor is, of course, price. (You may in fact have taken this factor into consideration at the outset in determining which engines were, or were not, going to be tested.) Then, there are some factors that are likely relevant for any firm, as well as others that may be unique to your environment.

Among the common factors might be items such as:

  • What is your relationship with the vendor? If you use other applications from this vendor, what is their track record of responsiveness to the issues you’ve raised about those applications?
  • What are the announced upgrades for the next version of their engine, and what is their development roadmap? What process do they follow in determining which features to focus on for the future?

Aspects relating to your unique environment depend on the state of your current IT infrastructure and might include:

  • How well will this engine integrate with your current applications?
  • What repositories do you intend to index and what are the implications for integrating those different repositories?
  • What internal support requirements are there?

At the end of this process, you should have all the necessary elements for making a final decision, picking a “winner,” and then moving to the next stage, namely the proof of concept.


