This new, exciting feature is designed to improve FAQ search. FAQs are the most prevalent vertical in Answers, but today they are difficult to search effectively: they are rich in semantic information whose intent regular keyword matching often cannot capture. Semantic Text Search helps solve this problem and significantly improves search quality for FAQs.
Semantic Text Search uses BERT, Google's open-source machine learning framework for NLP, to represent phrases as points in space, called embeddings. The diagram below visualizes this process in 2D:
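To make the idea concrete, here is a minimal sketch of phrases as points in space. The 2D coordinates below are made up purely for illustration; in practice BERT produces high-dimensional vectors, and these phrases and numbers are not from the actual product.

```python
import math

# Hypothetical 2D embeddings for three phrases (illustrative only;
# real BERT embeddings have hundreds of dimensions).
embeddings = {
    "How do I reset my password?": (1.0, 2.0),
    "I forgot my login credentials": (1.2, 1.8),
    "What are your store hours?": (5.0, 0.5),
}

def distance(a, b):
    """Euclidean distance between two points in embedding space."""
    return math.dist(a, b)

# Phrases with similar meaning sit close together, even with no
# overlapping keywords.
query = embeddings["How do I reset my password?"]
for phrase, point in embeddings.items():
    print(f"{distance(query, point):.2f}  {phrase}")
```

Note that "How do I reset my password?" and "I forgot my login credentials" share no keywords, yet their points land near each other, which is exactly what keyword matching misses.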
Instead of looking for overlapping keywords, Semantic Text Search measures the distance between the user's query and each FAQ as a proxy for intent: the closer they are, the better the match. It calculates this distance for every FAQ in the Knowledge Graph, then sorts and ranks the search results accordingly.
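The ranking step can be sketched as follows. This is an assumption-laden illustration, not the product's implementation: the FAQ embeddings and the query vector are invented stand-ins, and cosine distance is used as one common choice of distance measure for embeddings.

```python
import math

def cosine_distance(a, b):
    """1 minus cosine similarity: 0 for identical directions, larger when further apart."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

# Hypothetical embeddings for a few FAQs in the Knowledge Graph.
faqs = {
    "How do I return an item?": [0.9, 0.1, 0.2],
    "Where is my order?": [0.2, 0.8, 0.3],
    "Do you ship internationally?": [0.1, 0.3, 0.9],
}

# Stand-in for the embedded user query, e.g. "What's your refund policy?"
query_embedding = [0.85, 0.15, 0.25]

# Sort FAQs by distance to the query: closest (best match) first.
ranked = sorted(faqs, key=lambda q: cosine_distance(query_embedding, faqs[q]))
for faq in ranked:
    print(faq)
```

Sorting by distance is all the ranking needs here: the nearest FAQ in embedding space surfaces first, with no synonym lists or keyword overlap involved.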
Overall, Semantic Text Search is a new algorithm for Answers that searches FAQs based on intent, not keywords. It will allow customers to use Answers for both structured and semi-structured data, with the added benefit of eliminating the need for synonyms.
Here’s a comparison of keyword-based search and Semantic Text Search:
For more information, please see the Vertical Searchable Fields unit.
If you have any comments, feedback, or questions on this feature, leave a comment below!