When you sign up for AddSearch, you get an account. The account is linked to a sample index that you can use to try out how the search works.
To search your own content, you need to add an index to the account. Adding an index creates its settings and statistics, and lets you add users and subscribe to AddSearch’s plans.
When you add an index to your account, the AddSearch dashboard offers a choice between a crawler index and an API index. Which one should you choose?
If you want an index that is created automatically from your website’s content and integrates well with the ready-made views, creating a crawler index is for you.
However, if you have the technical expertise and want to choose what content to index and design the search experience yourself, creating an API index is for you.
If you’re interested in more details about the two index types, read on.
A crawler index requires you to add your website for crawling, after which our crawlers discover and collect links to the site’s web pages. The content of those pages is then collected and stored in the search index.
In addition to following links on the web pages, AddSearch collects links from sitemaps named sitemap.xml located in the site root.
A crawler index can also index dynamically generated content, a feature referred to as Ajax crawling.
An API index requires you to push your content to the search index with the indexing API. As noted above, this requires technical expertise. Please visit our indexing API reference page for more information.
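As a rough sketch of what a push looks like, the snippet below builds a JSON document and serializes it for sending. The field names, endpoint URL, and HTTP method shown are illustrative assumptions, not the confirmed API contract; consult the indexing API reference for the exact document schema and endpoints.

```python
import json

# Hypothetical document to push to an AddSearch API index.
# Field names here are illustrative assumptions; check the
# indexing API reference for the actual document schema.
document = {
    "id": "doc-1",
    "url": "https://example.com/products/widget",
    "title": "Widget",
    "main_content": "A handy widget for every occasion.",
}

# The indexing API accepts JSON, so the document is serialized
# before it is sent in the request body.
payload = json.dumps(document)
print(payload)

# Sending would look roughly like this (endpoint, method, and
# authentication are assumptions; see the API reference):
#
# import urllib.request
# req = urllib.request.Request(
#     "https://api.addsearch.com/v2/indices/INDEX_ID/documents/",
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
#     method="PUT",
# )
# urllib.request.urlopen(req)
```

The actual request is left commented out because the endpoint and credentials depend on your index; the point is that an API index is populated by your own code pushing documents, rather than by a crawler.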
Besides site search implementations, an API index lets you build search for other contexts, such as mobile or desktop applications.
Please note that if you want to use an API index with the ready-made search UIs, the JSON documents you push to the index must contain the same fields that the ready-made search UIs use. To take a closer look at the fields, check out the search API response.
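One practical way to catch mismatches early is to check each document for the expected fields before pushing it. The field names below (`url`, `title`, `main_content`) are assumptions for illustration; verify the actual set against the search API response.

```python
# Illustrative pre-push check: the field names are assumptions,
# taken as an example of what the ready-made search UIs might
# read. Verify them against the search API response.
REQUIRED_FIELDS = {"url", "title", "main_content"}

def missing_fields(document: dict) -> set:
    """Return the required fields that are absent from the document."""
    return REQUIRED_FIELDS - document.keys()

doc = {"url": "https://example.com/page", "title": "Page"}
print(missing_fields(doc))  # → {'main_content'}
```

Running a check like this before every push keeps incomplete documents out of the index, where a missing field would otherwise show up as a blank spot in the ready-made UI.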