Language is one of our most basic ways of communicating, but it is also a rich source of information and one that we use all the time, including online. What if we could use that language, both written and spoken, in an automated way? That’s what natural language processing sets out to do.
Natural language processing, or NLP, takes language and processes it into bits of information that software can use. With this information, the software can then do myriad other tasks, which we’ll also examine.
But first, why is natural language processing even necessary? First off, massive amounts of information are created and shared every day through natural language. Billions of social media posts go up daily. Trillions of searches happen on search engines great and small. Call transcripts. Emails. Classifieds. News articles. Some of these, like search queries, benefit directly from NLP. Others, like news articles, can be processed via NLP to create value.
Let’s look at a couple of examples in more detail, starting with news articles.
Jones to Assume Presidency of Acme Corp.
Marcus L. Jones today announced that he was to become the 4th President in Acme Corp. history. He will lead the widget maker into its next chapter as it examines expansion into new markets, such as Europe, Mexico, and Canada.
Now think about all of the things we may want to do with this text. For example, we might want to know which companies, subjects, countries, and other key entities are mentioned so that we can tag and categorize similar articles. One way to start is to decide that only nouns and adjectives are eligible to be considered as tags. For this we would use a part-of-speech tagger, which specifies what part of speech each word in a text is.
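As a rough sketch of what that looks like in practice, here is one way to tag parts of speech with spaCy. The library choice, the en_core_web_sm model, and the exact tags printed are assumptions for illustration, not something prescribed by this article:

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

lede = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history.")

doc = nlp(lede)
for token in doc:
    print(token.text, token.pos_)  # e.g. "Marcus PROPN", "announced VERB"

# Keep only nouns, proper nouns, and adjectives as tag candidates.
candidates = [t.text for t in doc if t.pos_ in {"NOUN", "PROPN", "ADJ"}]
print(candidates)
```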
But even once we identify those words, things are tricky, because is “widget” different from “widgets”? Of course not! So we need to use some normalization, which will collapse words to their core so that different variations can be considered equivalent. And normalization can be complex, in cases such as “Europe” and “EU,” or “Marcus L. Jones” and “Marcus Jones.”
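One common form of normalization is lemmatization, which reduces each token to its dictionary form. A minimal sketch, again assuming spaCy and the same small English model, might look like the following; harder cases such as “Europe” versus “EU” need entity linking rather than lemmatization:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Acme Corp. makes widgets. Each widget ships with a manual.")

# Lemmatization collapses inflected forms ("widgets" -> "widget",
# "makes" -> "make") so that different variations can be treated as equivalent.
for token in doc:
    if token.pos_ in {"NOUN", "VERB"}:
        print(f"{token.text} -> {token.lemma_}")
```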
This approach also ignores that the items relevant for tagging and categorization may not be single words; they could be phrases, such as “Acme Corp.” Identifying these items is the job of tokenization. Tokenization breaks a larger text into smaller pieces. It may break a document into paragraphs, paragraphs into sentences, and sentences into “tokens.” (We won’t say words here, because “Acme Corp.” can be a token but isn’t a word, while “isn’t” is a word but would often be broken into two tokens: “is” and “n’t.”) Tokenization can be surprisingly difficult. Even something as “simple” as identifying sentences in a paragraph is tricky: what happens with the first sentence in our article? Is “Marcus L.” a sentence because it ends with a period and is followed by a word starting with a capital letter?
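To make tokenization concrete, here is a small sketch of sentence and token segmentation using spaCy’s defaults (an assumption on our part; exact splits vary from tokenizer to tokenizer):

```python
import spacy

nlp = spacy.load("en_core_web_sm")
text = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history. He will lead the widget maker "
        "into its next chapter.")

doc = nlp(text)

# Sentence segmentation: abbreviations like "L." and "Corp." should not
# be mistaken for sentence boundaries.
for sent in doc.sents:
    print(sent.text)

# Token segmentation; a tokenizer may also split contractions like
# "isn't" into "is" and "n't".
print([token.text for token in doc])
```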
Altogether, identifying key concepts is what is known as named entity recognition. Named entity recognition is not just about identifying nouns or adjectives, but about identifying important items within a text. In this news article lede, we can be sure that Marcus L. Jones, Acme Corp., Europe, Mexico, and Canada are all named entities.
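Off-the-shelf NER models can already pick out people, organizations, and places. Here is a hedged sketch with spaCy; label names such as PERSON, ORG, and GPE are that library’s conventions, and the exact predictions depend on the model:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Marcus L. Jones today announced that he was to become the 4th "
          "President in Acme Corp. history. He will lead the widget maker "
          "into its next chapter as it examines expansion into new markets, "
          "such as Europe, Mexico, and Canada.")

# Each named entity is a span of text plus a predicted label.
for ent in doc.ents:
    print(ent.text, ent.label_)
# Roughly: "Marcus L. Jones" PERSON, "Acme Corp." ORG,
#          "Europe" LOC, "Mexico" GPE, "Canada" GPE
```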
Finally, we may want to understand the connections between words. This helps our programs work out who the “he” in the second sentence refers to, or that “widget maker” is describing Acme Corp.
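Dependency parsing is one way to recover those connections: each token points to the head it modifies, which is how a program can see that “widget” modifies “maker” and that “he” is the subject of “lead.” Resolving “he” back to Marcus L. Jones (coreference) usually needs an additional component; the sketch below, which again assumes spaCy, only shows the dependency tree:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("He will lead the widget maker into its next chapter.")

# Each token records its grammatical relation (dep_) and the token it
# attaches to (head), forming a tree over the sentence.
for token in doc:
    print(f"{token.text:<8} {token.dep_:<10} head={token.head.text}")
# e.g. "He" is the nominal subject (nsubj) of "lead", and "widget"
# forms a compound with "maker", the direct object of "lead".
```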
Natural language processing for search queries is just as important, but it comes with its own challenges and needs. A good way to illustrate this is to also discuss an important factor in natural language processing: there are thousands of natural languages spoken in the world. Different languages have different needs, and while English is the language that much NLP software starts with, it is not representative of all languages.
As an example, English rarely compounds words together without some separator, be it a space or punctuation. In fact, it is so rare that we have the word portmanteau to describe it. Other languages do not follow this convention, and words will butt up against each other to form an entirely new word. In German, the word “Hundehütte” means dog house. It is a single word, yet it refers to both concepts in combination.
A naive search engine will match Hundehütte to Hundehütte well enough, but it won’t match that query word to the phrase “Hütte für große Hunde,” which means house for big dogs. Natural language processing comes in to decompound the query word into its individual pieces so that the searcher can see the right products. This illustrates another area where the deep learning element of NLP is useful, and how NLP often needs to be language-specific.
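To make decompounding concrete, here is a deliberately naive, dictionary-based sketch in Python. The vocabulary and splitting rule are invented for illustration; production decompounders rely on large lexicons, frequency statistics, or learned models, and handle linking elements such as the “-e-” in Hundehütte:

```python
# Toy decompounder: split a compound into words from a known vocabulary.
# The tiny vocabulary below is purely illustrative.
VOCAB = {"hund", "hunde", "hütte", "haus", "für", "große"}

def decompound(word, vocab=VOCAB):
    word = word.lower()
    if word in vocab:
        return [word]
    # Try split points from longest prefix to shortest; keep the first
    # split where both halves decompose into known words.
    for i in range(len(word) - 1, 0, -1):
        head, tail = word[:i], word[i:]
        if head in vocab:
            rest = decompound(tail, vocab)
            if rest:
                return [head] + rest
    return []  # no decomposition found

print(decompound("Hundehütte"))  # ['hunde', 'hütte']
```

With the query broken into “hunde” and “hütte,” the engine can match documents that mention a Hütte for Hunde even though the surface forms differ.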
Going through all of these steps for natural language processing (known altogether as the natural language processing pipeline) returns information that is structured in a way that software can understand. By now you’ve seen that there is a lot of information hidden inside language. A 40-word paragraph can refer to a company, a person, three regions, and plenty of information about those items. Humans are very good at identifying the important parts of language and understanding how it fits together, but bad at taking hundreds, thousands, or millions of texts and finding trends or grouping them together. Most software programs are the reverse: they can find trends or categorize texts, but they are bad at the text itself. That’s why we use tailored software, natural language processing, to structure the text into a form that those programs can use. (By the way, why not combine all the steps into a single program? Having small, focused programs makes each step better, and allows us to combine different tools for different purposes.)
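Putting the pieces together, one run of such a pipeline over the article lede might return a small structured record like the one below. The field names and the spaCy-based implementation are illustrative assumptions; the point is simply that unstructured text comes out as data other programs can work with:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def analyze(text):
    """Run the NLP pipeline and return a structured summary of the text."""
    doc = nlp(text)
    return {
        "sentences": [sent.text for sent in doc.sents],
        "entities": [(ent.text, ent.label_) for ent in doc.ents],
        "tag_candidates": sorted({t.lemma_.lower() for t in doc
                                  if t.pos_ in {"NOUN", "PROPN", "ADJ"}}),
    }

record = analyze("Marcus L. Jones today announced that he was to become "
                 "the 4th President in Acme Corp. history.")
print(record["entities"])        # who and what the text is about
print(record["tag_candidates"])  # normalized terms usable as tags
```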
Above, we looked at the examples of a news article and a search query, and how we could use natural language processing to transform the text into something more useful. Think now about the other examples of textual content we discussed, like call transcripts, classifieds, or emails. What kind of processing might these texts need?

For more information on how Algolia’s search and discovery APIs leverage NLP, or to learn more about how we can help you implement this powerful technology within your site or app for a more engaging user experience, please contact our team of experts.