Elasticsearch
Important Capabilities
| Capability | Status | Notes |
|---|---|---|
| Platform Instance | ✅ | Enabled by default |
This plugin extracts the following:
- Metadata for indexes
- Column types associated with each index field
CLI-based Ingestion
Install the Plugin
The `elasticsearch` source works out of the box with `acryl-datahub`.
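A typical setup, assuming a Python 3 environment with pip available (the `[elasticsearch]` extra follows the convention used by other DataHub source docs; `datahub check plugins` will confirm what your version actually ships):

```shell
# Install the DataHub CLI; the elasticsearch source ships with it.
pip install 'acryl-datahub[elasticsearch]'

# List the ingestion plugins your install provides, to confirm elasticsearch is available.
datahub check plugins
```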
Starter Recipe
Check out the following recipe to get started with ingestion! See below for full configuration options.
For general pointers on writing and running a recipe, see our main recipe guide.
```yaml
source:
  type: "elasticsearch"
  config:
    # Coordinates
    host: 'localhost:9200'

    # Credentials
    username: user  # optional
    password: pass  # optional

    # SSL support
    use_ssl: False
    verify_certs: False
    ca_certs: "./path/ca.cert"
    client_cert: "./path/client.cert"
    client_key: "./path/client.key"
    ssl_assert_hostname: False
    ssl_assert_fingerprint: "./path/cert.fingerprint"

    # Options
    url_prefix: ""  # optional url_prefix
    env: "PROD"
    index_pattern:
      allow: [".*some_index_name_pattern*"]
      deny: [".*skip_index_name_pattern*"]
    ingest_index_templates: False
    index_template_pattern:
      allow: [".*some_index_template_name_pattern*"]

sink:
  # sink configs
```
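The recipe above uses basic auth; per the configuration schema, an `api_key` field is also accepted, either as a base64-encoded `id:api_key` string or as a two-element `[id, api_key]` list. A minimal sketch with placeholder values:

```yaml
source:
  type: "elasticsearch"
  config:
    host: 'localhost:9200'
    # Two-element form: [id, api_key] (placeholder values)
    api_key: ["my-key-id", "my-key-secret"]
```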
Config Details
Note that a `.` is used to denote nested fields in the YAML recipe.
The JSONSchema for this configuration is inlined below.
{
"title": "ElasticsearchSourceConfig",
"description": "Any source that connects to a platform should inherit this class",
"type": "object",
"properties": {
"env": {
"title": "Env",
"description": "The environment that all assets produced by this connector belong to",
"default": "PROD",
"type": "string"
},
"platform_instance": {
"title": "Platform Instance",
"description": "The instance of the platform that all assets produced by this recipe belong to. This should be unique within the platform. See https://datahubproject.io/docs/platform-instances/ for more details.",
"type": "string"
},
"host": {
"title": "Host",
"description": "The elastic search host URI.",
"default": "localhost:9200",
"type": "string"
},
"username": {
"title": "Username",
"description": "The username credential.",
"type": "string"
},
"password": {
"title": "Password",
"description": "The password credential.",
"type": "string"
},
"api_key": {
"title": "Api Key",
"description": "API Key authentication. Accepts either a list with id and api_key (UTF-8 representation), or a base64 encoded string of id and api_key combined by ':'.",
"anyOf": [
{
"type": "array",
"minItems": 2,
"maxItems": 2,
"items": [
{
"type": "string"
},
{
"type": "string"
}
]
},
{
"type": "string"
}
]
},
"use_ssl": {
"title": "Use Ssl",
"description": "Whether to use SSL for the connection or not.",
"default": false,
"type": "boolean"
},
"verify_certs": {
"title": "Verify Certs",
"description": "Whether to verify SSL certificates.",
"default": false,
"type": "boolean"
},
"ca_certs": {
"title": "Ca Certs",
"description": "Path to a certificate authority (CA) certificate.",
"type": "string"
},
"client_cert": {
"title": "Client Cert",
"description": "Path to the file containing the private key and the certificate, or cert only if using client_key.",
"type": "string"
},
"client_key": {
"title": "Client Key",
"description": "Path to the file containing the private key if using separate cert and key files.",
"type": "string"
},
"ssl_assert_hostname": {
"title": "Ssl Assert Hostname",
"description": "Use hostname verification if not False.",
"default": false,
"type": "boolean"
},
"ssl_assert_fingerprint": {
"title": "Ssl Assert Fingerprint",
"description": "Verify the supplied certificate fingerprint if not None.",
"type": "string"
},
"url_prefix": {
"title": "Url Prefix",
"description": "There are cases where an enterprise has multiple elasticsearch clusters. One way to manage them is to expose a single endpoint for all the clusters and use url_prefix to route requests to the right cluster.",
"default": "",
"type": "string"
},
"index_pattern": {
"title": "Index Pattern",
"description": "Regex patterns for indexes to filter in ingestion.",
"default": {
"allow": [
".*"
],
"deny": [
"^_.*",
"^ilm-history.*"
],
"ignoreCase": true
},
"allOf": [
{
"$ref": "#/definitions/AllowDenyPattern"
}
]
},
"ingest_index_templates": {
"title": "Ingest Index Templates",
"description": "Ingests ES index templates if enabled.",
"default": false,
"type": "boolean"
},
"index_template_pattern": {
"title": "Index Template Pattern",
"description": "The regex patterns for filtering index templates to ingest.",
"default": {
"allow": [
".*"
],
"deny": [
"^_.*"
],
"ignoreCase": true
},
"allOf": [
{
"$ref": "#/definitions/AllowDenyPattern"
}
]
},
"profiling": {
"title": "Profiling",
"description": "Configs to ingest data profiles from ElasticSearch.",
"allOf": [
{
"$ref": "#/definitions/ElasticProfiling"
}
]
},
"collapse_urns": {
"title": "Collapse Urns",
"description": "List of regex patterns to remove from the name of the URN. All indices that collapse to the same name after removal are considered the same dataset. The patterns are applied in order to each URN.\n The main case for having multiple patterns is when the suffixes you are trying to remove have different formats,\n e.g. indices ending with -YYYY-MM-DD as well as ones ending with -epochtime would require 2 regex patterns to remove the suffixes across all URNs.",
"allOf": [
{
"$ref": "#/definitions/CollapseUrns"
}
]
}
},
"additionalProperties": false,
"definitions": {
"AllowDenyPattern": {
"title": "AllowDenyPattern",
"description": "A class to store allow deny regexes",
"type": "object",
"properties": {
"allow": {
"title": "Allow",
"description": "List of regex patterns to include in ingestion",
"default": [
".*"
],
"type": "array",
"items": {
"type": "string"
}
},
"deny": {
"title": "Deny",
"description": "List of regex patterns to exclude from ingestion.",
"default": [],
"type": "array",
"items": {
"type": "string"
}
},
"ignoreCase": {
"title": "Ignorecase",
"description": "Whether to ignore case sensitivity during pattern matching.",
"default": true,
"type": "boolean"
}
},
"additionalProperties": false
},
"OperationConfig": {
"title": "OperationConfig",
"type": "object",
"properties": {
"lower_freq_profile_enabled": {
"title": "Lower Freq Profile Enabled",
"description": "Whether to do profiling at a lower frequency or not. This does not do any scheduling; it just adds additional checks on when not to run profiling.",
"default": false,
"type": "boolean"
},
"profile_day_of_week": {
"title": "Profile Day Of Week",
"description": "Number between 0 and 6 for day of week (both inclusive). 0 is Monday and 6 is Sunday. If not specified, defaults to Nothing and this field does not take effect.",
"type": "integer"
},
"profile_date_of_month": {
"title": "Profile Date Of Month",
"description": "Number between 1 and 31 for date of month (both inclusive). If not specified, defaults to Nothing and this field does not take effect.",
"type": "integer"
}
},
"additionalProperties": false
},
"ElasticProfiling": {
"title": "ElasticProfiling",
"type": "object",
"properties": {
"enabled": {
"title": "Enabled",
"description": "Whether to enable profiling for the elastic search source.",
"default": false,
"type": "boolean"
},
"operation_config": {
"title": "Operation Config",
"description": "Experimental feature. To specify operation configs.",
"allOf": [
{
"$ref": "#/definitions/OperationConfig"
}
]
}
},
"additionalProperties": false
},
"CollapseUrns": {
"title": "CollapseUrns",
"type": "object",
"properties": {
"urns_suffix_regex": {
"title": "Urns Suffix Regex",
"description": "List of regex patterns to remove from the name of the URN. All indices that collapse to the same name after removal are considered the same dataset. The patterns are applied in order to each URN.\n The main case for having multiple patterns is when the suffixes you are trying to remove have different formats,\n e.g. indices ending with -YYYY-MM-DD as well as ones ending with -epochtime would require 2 regex patterns to remove the suffixes across all URNs.",
"type": "array",
"items": {
"type": "string"
}
}
},
"additionalProperties": false
}
}
}
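Both `index_pattern` and `collapse_urns` are plain regex driven. As an illustration of the semantics described in the schema above (this is a sketch of the matching logic, not DataHub's internal implementation): an index is kept if it matches some `allow` pattern and no `deny` pattern, case-insensitively by default, and `urns_suffix_regex` patterns are applied in order so that e.g. daily rollover indices collapse to one dataset.

```python
import re

def is_index_allowed(name, allow=(".*",), deny=("^_.*", "^ilm-history.*"), ignore_case=True):
    """Sketch of AllowDenyPattern semantics: keep a name iff it matches
    some allow pattern and no deny pattern. Defaults mirror the schema's
    index_pattern defaults."""
    flags = re.IGNORECASE if ignore_case else 0
    if any(re.match(p, name, flags) for p in deny):
        return False
    return any(re.match(p, name, flags) for p in allow)

def collapse_urn_name(name, suffix_patterns):
    """Sketch of collapse_urns: strip each suffix regex in order."""
    for pattern in suffix_patterns:
        name = re.sub(pattern, "", name)
    return name

# System indices like "_security" fall under the default deny list.
print(is_index_allowed("my-logs"))    # True
print(is_index_allowed("_security"))  # False

# Two patterns cover two suffix formats: -YYYY-MM-DD and -epochtime.
patterns = [r"-\d{4}-\d{2}-\d{2}$", r"-\d{10}$"]
print(collapse_urn_name("app-logs-2023-06-01", patterns))  # app-logs
print(collapse_urn_name("app-logs-1685577600", patterns))  # app-logs
```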
Code Coordinates
- Class Name:
datahub.ingestion.source.elastic_search.ElasticsearchSource
- Browse on GitHub
Questions
If you've got any questions on configuring ingestion for Elasticsearch, feel free to ping us on our Slack.