WebCrawlerLimits

The rate limits for the URLs that you want to crawl. You must be authorized to crawl these URLs.

Types

class Builder
object Companion

Properties


The maximum number of web pages crawled from your source URLs, up to 25,000 pages. If the crawl exceeds this limit, the data source sync fails and no web pages are ingested.


The maximum rate at which pages are crawled, up to 300 pages per minute per host.

Functions

inline fun copy(block: WebCrawlerLimits.Builder.() -> Unit = {}): WebCrawlerLimits
open operator override fun equals(other: Any?): Boolean
open override fun hashCode(): Int
open override fun toString(): String
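The class follows the builder-plus-`copy` pattern common in this SDK: values are set through a mutable `Builder`, and `copy(block: Builder.() -> Unit)` re-seeds a builder from an existing instance before applying the block. The following is a minimal, self-contained sketch of that pattern; the property names `maxPages` and `rateLimit` are assumptions inferred from the property descriptions above, not confirmed signatures.

```kotlin
// Hypothetical stand-in illustrating the Builder + copy pattern.
// `maxPages` and `rateLimit` are assumed names, not verified API.
class WebCrawlerLimits private constructor(
    val maxPages: Int?,   // up to 25,000 pages
    val rateLimit: Int?,  // up to 300 pages per minute per host
) {
    class Builder {
        var maxPages: Int? = null
        var rateLimit: Int? = null
        fun build(): WebCrawlerLimits = WebCrawlerLimits(maxPages, rateLimit)
    }

    companion object {
        // DSL-style entry point: WebCrawlerLimits { ... }
        operator fun invoke(block: Builder.() -> Unit): WebCrawlerLimits =
            Builder().apply(block).build()
    }

    // Seeds a Builder from this instance, applies the block, and builds
    // a new immutable instance, mirroring the documented copy() signature.
    fun copy(block: Builder.() -> Unit = {}): WebCrawlerLimits =
        Builder().apply {
            maxPages = this@WebCrawlerLimits.maxPages
            rateLimit = this@WebCrawlerLimits.rateLimit
        }.apply(block).build()

    override fun toString(): String =
        "WebCrawlerLimits(maxPages=$maxPages, rateLimit=$rateLimit)"
}

fun main() {
    val limits = WebCrawlerLimits {
        maxPages = 10_000
        rateLimit = 100
    }
    // copy() keeps existing values; the block overrides only what it sets.
    val faster = limits.copy { rateLimit = 300 }
    println(faster)
}
```

Because `copy` takes a builder lambda with a default empty body, `limits.copy()` alone produces an equivalent instance, while passing a block overrides only the fields it touches.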