REST API

Complete REST API reference for programmatic access to Energylogserver.

Overview

This chapter provides comprehensive documentation for the Energylogserver SIEM APIs, covering REST endpoints, authentication methods, data operations, and integration patterns. The API reference includes endpoints for core components: ELS Data Node, ELS Console, SIEM Agents, and specialized ELS endpoints for dashboards, alerts, archiving, AI operations, and license management.

Energylogserver SIEM exposes multiple API layers for different operational needs:

  • ELS REST API: Core data operations and cluster management.

  • ELS Console API: Dashboard management, saved objects, and visualizations.

  • Alert API: Alert creation, management, and execution.

  • Archive API: Data archiving and retention management.

  • AI API: Machine learning and anomaly detection operations.

  • ELS-specific APIs: License management, SIEM operations, and agent communication.

Connecting to the API

To connect to the APIs, you can use basic authentication or an authorization token.

To generate the authorization token, run the following command:

curl -XPUT http://localhost:9200/_logserver/login -H 'Content-type: application/json' -d '
{
  "username": "$USER",
  "password": "$PASSWORD"
}'

The command returns the token value, which you can then pass to the API in a token header, for example:

curl -H 'token: 192783916598v51j928419b898v1m79821c2' -X GET "http://localhost:9200/_cluster/health"
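
If you script against the API, you can capture the token into a shell variable and reuse it. The sketch below assumes the login response returns the token in a "token" field and that jq is installed; adjust the filter if your response differs:

# Log in once, extract the token (assumed "token" field), and reuse it.
TOKEN=$(curl -s -XPUT http://localhost:9200/_logserver/login -H 'Content-type: application/json' -d '
{
  "username": "$USER",
  "password": "$PASSWORD"
}' | jq -r '.token')

curl -H "token: $TOKEN" -X GET "http://localhost:9200/_cluster/health"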

Basic Authentication

Alternatively, use basic authentication:

curl -u "username:password" -X GET "https://els-data-01:9200/_cluster/health"
curl -H "Authorization: Basic $(echo -n 'admin:password' | base64)" -X GET "https://els-data-01:9200/_cluster/health"

Dashboards API

The Dashboards import/export APIs allow users to import dashboards along with all corresponding saved objects, such as visualizations, saved searches, and index patterns.

Dashboards Import API

Request:

POST /api/opensearch-dashboards/dashboards/import

Query Parameters:

  • force (optional, boolean) - Overwrite any existing objects on ID conflict.

  • exclude (optional, array) - Saved object types that should not be imported.

Example:

curl -XPOST -ulogserver: -H "osd-xsrf: true" -H "Content-Type: application/json" "https://127.0.0.1:5601/api/opensearch-dashboards/dashboards/import?force=true" -d@"${DASHBOARD_FILE}"
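
For example, to overwrite existing objects while skipping index patterns from the imported file, add the exclude parameter (a sketch; "index-pattern" is one saved object type that can be excluded):

curl -XPOST -ulogserver: -H "osd-xsrf: true" -H "Content-Type: application/json" "https://127.0.0.1:5601/api/opensearch-dashboards/dashboards/import?force=true&exclude=index-pattern" -d@"${DASHBOARD_FILE}"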

Dashboards Export API

Request:

GET /api/opensearch-dashboards/dashboards/export

Query Parameters:

  • dashboard (required, array|string) - The ID(s) of the dashboard(s) to export.

Example:

curl -XGET -ulogserver: -H "osd-xsrf: true" -H "Content-Type: application/json" "https://127.0.0.1:5601/api/opensearch-dashboards/dashboards/export?dashboard=${DASHBOARD_ID}" > "${DASHBOARD_FILE}"

Energylogserver API

The Energylogserver API is a typical REST API: data is exchanged in JSON format over the HTTP protocol. By default, port tcp/9200 is used to communicate with the Energylogserver API. The examples in this section use the curl application to communicate with the API.

Program syntax:

curl -XGET -u login:password '127.0.0.1:9200'

Available methods:

  • PUT - sends data to the server.

  • POST - sends a request to the server to change data.

  • DELETE - deletes an index/document.

  • GET - retrieves information about an index/document.

  • HEAD - checks whether an index/document exists.

Available APIs by roles:

  • Index API - manages indexes.

  • Document API - manages documents.

  • Cluster API - manages the cluster.

  • Search API - is used to search for data.

Cluster Management

Cluster health:

curl -X GET "https://els-data-01:9200/_cluster/health"

Node statistics:

curl -X GET "https://els-data-01:9200/_nodes/stats"

Data Operations

Search events:

curl -X POST "https://els-data-01:9200/security-events-*/_search" \
  -H "Content-Type: application/json" \
  -d '{"query": {"match_all": {}}}'
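
A more selective search can combine a field match with a time range. The sketch below assumes events with an "event" field and an "@timestamp" field; the matched value is only illustrative:

curl -X POST "https://els-data-01:9200/security-events-*/_search" \
  -H "Content-Type: application/json" \
  -d '{
    "size": 10,
    "query": {
      "bool": {
        "must": [
          {"match": {"event": "login_failure"}},
          {"range": {"@timestamp": {"gte": "now-1h"}}}
        ]
      }
    }
  }'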

Index a document:

curl -X POST "https://els-data-01:9200/security-events-2025.08.25/_doc" \
  -H "Content-Type: application/json" \
  -d '{"event": "login_success", "timestamp": "2025-08-25T15:47:00Z"}'

Energylogserver Index API

The indices APIs are used to manage individual indices, index settings, aliases, mappings, and index templates.

Adding Index

Adding Index - automatic method:

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/twitter/tweet/1?pretty=true' -d'{
    "user" : "elk01",
    "post_date" : "2017-09-05T10:00:00",
    "message" : "tests auto index generation"
}'

You should see the output:

{
"_index" : "twitter",
  "_type" : "tweet",
  "_id" : "1",
  "_version" : 1,
  "_shards" : {
    "total" : 2,
    "successful" : 1,
    "failed" : 0
  },
  "created" : true
}

The parameter action.auto_create_index must be set to true.
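
If automatic index creation is disabled on your cluster, it can be enabled at runtime through the cluster settings API, for example (a minimal sketch; the setting also accepts index name patterns):

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/_cluster/settings?pretty=true' -d'{
  "persistent": {
    "action.auto_create_index": "true"
  }
}'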

Adding Index - manual method:

Setting the number of shards and replicas:

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/twitter2?pretty=true' -d'{
    "settings" : {
        "number_of_shards" : 1,
        "number_of_replicas" : 1
    }
}'

You should see the output:

{
  "acknowledged" : true
}

Command for manual index generation:

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/twitter2/tweet/1?pretty=true' -d'{
    "user" : "elk01",
    "post_date" : "2017-09-05T10:00:00",
    "message" : "tests manual index generation"
}'

You should see the output:

{
  "_index" : "twitter2",
  "_type" : "tweet",
  "_id" : "1",
  "_version" : 1,
  "_shards" : {
    "total" : 2,
    "successful" : 1,
    "failed" : 0
  },
  "created" : true
}

Delete Index

Delete Index - to delete the twitter index, use the following command:

curl -XDELETE -u login:password '127.0.0.1:9200/twitter?pretty=true'

The delete index API can also be applied to more than one index, either by using a comma-separated list of index names, or to all indices by using _all or * as the index name:

curl -XDELETE -u login:password '127.0.0.1:9200/twitter*?pretty=true'

To allow deletion of indices via wildcards, set the action.destructive_requires_name setting in the configuration to false.
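
Besides the configuration file, this setting can also be changed at runtime through the cluster settings API, for example (a minimal sketch):

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/_cluster/settings?pretty=true' -d'{
  "persistent": {
    "action.destructive_requires_name": false
  }
}'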

API Useful Commands

Get information about replicas and shards:

curl -XGET -u login:password '127.0.0.1:9200/twitter/_settings?pretty=true'
curl -XGET -u login:password '127.0.0.1:9200/twitter2/_settings?pretty=true'

Get information about mapping and alias in the index:

curl -XGET -u login:password '127.0.0.1:9200/twitter/_mappings?pretty=true'
curl -XGET -u login:password '127.0.0.1:9200/twitter/_aliases?pretty=true'

Get all information about the index:

curl -XGET -u login:password '127.0.0.1:9200/twitter?pretty=true'

Check if the index exists:

curl -I -u login:password '127.0.0.1:9200/twitter'

Close the index:

curl -XPOST -u login:password '127.0.0.1:9200/twitter/_close?pretty=true'

Open the index:

curl -XPOST -u login:password '127.0.0.1:9200/twitter/_open?pretty=true'

Get the status of all indexes:

curl -XGET -u login:password '127.0.0.1:9200/_cat/indices?v'

Get the status of one specific index:

curl -XGET -u login:password '127.0.0.1:9200/_cat/indices/twitter?v'

Display how much memory is used by the indexes:

curl -XGET -u login:password '127.0.0.1:9200/_cat/indices?v&h=i,tm&s=tm:desc'

Display details of the shards:

curl -XGET -u login:password '127.0.0.1:9200/_cat/shards?v'

Energylogserver Document API

Create Document

Create a document with a specified ID:

curl -XPUT -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/twitter/tweet/1?pretty=true' -d'{
    "user" : "lab1",
    "post_date" : "2017-08-25T10:00:00",
    "message" : "testing Energylogserver"
}'

You should see the output:

{
  "_index" : "twitter",
  "_type" : "tweet",
  "_id" : "1",
  "_version" : 1,
  "_shards" : {
    "total" : 2,
    "successful" : 1,
    "failed" : 0
  },
  "created" : true
}

Creating a document with an automatically generated ID (note: PUT -> POST):

curl -XPOST -u login:password -H 'Content-Type: application/json' '127.0.0.1:9200/twitter/tweet?pretty=true' -d'{
    "user" : "lab1",
    "post_date" : "2017-08-25T10:10:00",
    "message" : "testing automatic ID generation"
}'

You should see the output:

{
  "_index" : "twitter",
  "_type" : "tweet",
  "_id" : "AV49sTlM8NzerkV9qJfh",
  "_version" : 1,
  "_shards" : {
    "total" : 2,
    "successful" : 1,
    "failed" : 0
  },
  "created" : true
}

Delete Document

Delete a document by ID:

curl -XDELETE -u login:password '127.0.0.1:9200/twitter/tweet/1?pretty=true'
curl -XDELETE -u login:password '127.0.0.1:9200/twitter/tweet/AV49sTlM8NzerkV9qJfh?pretty=true'

Delete a document using a wildcard:

curl -XDELETE -u login:password '127.0.0.1:9200/twitter/tweet/1*?pretty=true'

(Parameter: action.destructive_requires_name must be set to false.)

Useful Commands

Get information about the document:

curl -XGET -u login:password '127.0.0.1:9200/twitter/tweet/1?pretty=true'

You should see the output:

{
    "_index" : "twitter",
    "_type" : "tweet",
    "_id" : "1",
    "_version" : 1,
    "found" : true,
    "_source" : {
        "user" : "lab1",
        "post_date" : "2017-08-25T10:00:00",
        "message" : "testuje Energylogserver"
    }
}

Get the source of the document:

curl -XGET -u login:password '127.0.0.1:9200/twitter/tweet/1/_source?pretty=true'

Alert API

List API

Method: GET
URL: https://<host>:<port>/api/alert/list
Parameters: None

Curl:

curl -XGET 'https://localhost:5601/api/alert/list' -u <user>:<password> -k

Result: JSON array of alert documents.

Change API

Method: POST
URL: https://<host>:<port>/api/alert/change
Parameters: Alert document in JSON format.

Curl:

curl -XPOST 'https://localhost:5601/api/alert/change' -u <user>:<password> -k \
  -H 'Content-Type: application/json' -d '{
  "name": "my_alert",
  "description": "My alert description",
  "index": "logstash-*",
  "search": {
    "request": {
      "search_type": "query_then_fetch",
      "index": ["logstash-*"],
      "body": {
        "size": 0,
        "query": {
          "bool": {
            "must": [
              {"match": {"event": "error"}}
            ]
          }
        }
      }
    }
  },
  "timeframe": "5m",
  "num_events": 10,
  "alert_method": "email",
  "email": "admin@company.com"
}'

Result: JSON document with status, id, and message.
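
An illustrative response shape (not captured from a live system) could look like this:

{
  "status": true,
  "id": "ea9384857de1f493fd84dabb6dfb99ce",
  "message": ""
}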

Run API

Method: GET
URL: https://<host>:<port>/api/alert/run/<id>
Parameters: id - alert document ID.

Curl:

curl -XGET 'https://localhost:5601/api/alert/run/ea9384857de1f493fd84dabb6dfb99ce' -u <user>:<password> -k

Result: JSON document with status and id.

AI API

List API

Method: GET
URL: https://<host>:<port>/api/ai/list
Parameters: None

Curl:

curl -XGET 'https://localhost:5601/api/ai/list' -u <user>:<password> -k

Result: JSON array of AI documents.

Change API

Method: POST
URL: https://<host>:<port>/api/ai/change
Parameters: AI document in JSON format.

Available Parameters:

  • name (required, string) - AI rule name

  • description (optional, string) - Rule description

  • index (required, string) - Index pattern to analyze

  • query (required, JSON object) - Energylogserver query

  • time_field (required, string) - Timestamp field name

  • time_frame (required, string) - Analysis time window

  • value_type (required, string) - Value aggregation type

  • model_type (required, string) - Model type for univariate/multivariate

  • fields (required for multivariate, array) - Field names for analysis

  • contamination (optional, float) - Expected anomaly ratio (0-0.5)

  • n_neighbors (optional, integer) - Number of neighbors for LOF algorithm

  • random_state (optional, integer) - Random seed for reproducibility

  • n_clusters (optional for clustering, integer) - Number of clusters

  • min_samples (optional for clustering, integer) - Minimum samples per cluster

  • forecast_steps (optional for forecasting, integer) - Prediction steps

  • seasonality (optional for forecasting, integer) - Seasonal pattern period

  • min_documents (optional for text, integer) - Minimum documents for training

  • threshold (optional, float) - Anomaly threshold

  • automatic_cron (required, string) - Cron schedule expression

  • automatic_enable (required, boolean) - Enable automatic execution

  • automatic (required, boolean) - Automatic processing flag

  • start_date (optional, string) - Analysis start date

  • multiply_by_values (required, array) - Fields for result multiplication

  • multiply_by_field (required, string) - Field for result multiplication

  • selectedroles (optional, array) - User roles with access

Curl:

curl -XPOST 'https://localhost:5601/api/ai/change' -u <user>:<password> -k \
  -H 'Content-Type: application/json' -d '{
  "name": "network_anomaly",
  "description": "Network anomaly detection",
  "index": "network-*",
  "query": {
    "match_all": {}
  },
  "time_field": "@timestamp",
  "time_frame": "1h",
  "value_type": "avg",
  "model_type": "univariate",
  "field": "bytes_in",
  "contamination": 0.05,
  "n_neighbors": 20,
  "random_state": 42,
  "automatic_cron": "0 */1 * * *",
  "automatic_enable": true,
  "automatic": true,
  "multiply_by_values": ["host"],
  "multiply_by_field": "network.bytes"
}'

Result: JSON document with status, id, and message.

Run API

Method: GET
URL: https://<host>:<port>/api/ai/run/<id>
Parameters: id - AI document ID.

Curl:

curl -XGET 'https://localhost:5601/api/ai/run/ea9384857de1f493fd84dabb6dfb99ce' -u <user>:<password> -k

Result: JSON document with status and id.

Create Rules

The create service adds a new document with the AI rule definition.

Method: PUT
URL:

https://<host>:<port>/api/ai/create

where:

  • host - ELS Console host address

  • port - ELS Console port

  • body - JSON with the definition of the AI rule

Curl:

curl -XPUT 'https://localhost:5601/api/ai/create' -u <user>:<password> -k \
  -H "kbn-version: 6.2.4" -H 'Content-type: application/json' -d'{
  "algorithm_type": "TL",
  "model_name": "test",
  "search": "search:6c226420-3b26-11e9-a1c0-4175602ff5d0",
  "label_field": {"field": "system.cpu.idle.pct"},
  "max_probes": 100,
  "time_frame": "1 day",
  "value_type": "avg",
  "max_predictions": 10,
  "threshold": -1,
  "automatic_cron": "*/5 * * * *",
  "automatic_enable": true,
  "automatic_flag": true,
  "start_date": "now",
  "multiply_by_values": [],
  "multiply_by_field": "none",
  "selectedroles": ["test"]
}'

Validation:

  • algorithm_type - GMA, GMAL, LRS, LRST, RFRS, SMAL, SMA, TL

  • value_type - min, max, avg, count

  • time_frame - 1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, 30 day, 365 day

Body JSON description:

  • algorithm_type (mandatory) - GMA, GMAL, LRS, LRST, RFRS, SMAL, SMA, TL - screen field: Algorithm

  • model_name (mandatory) - not empty string - screen field: AI Rule Name

  • search (mandatory) - search id - screen field: Choose search

  • label_field.field (mandatory) - screen field: Feature to analyse

  • max_probes (mandatory) - integer value - screen field: Max probes

  • time_frame (mandatory) - 1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, 30 day, 365 day - screen field: Time frame

  • value_type (mandatory) - min, max, avg, count - screen field: Value type

  • max_predictions (mandatory) - integer value - screen field: Max predictions

  • threshold (optional, default -1) - integer value - screen field: Threshold

  • automatic_cron (mandatory) - cron format string - screen field: Automatic cycle

  • automatic_enable (mandatory) - true/false - screen field: Enable

  • automatic (mandatory) - true/false - screen field: Automatic

  • start_date (optional, default now) - YYYY-MM-DD HH:mm or now - screen field: Start date

  • multiply_by_values (mandatory) - array of string values - screen field: Multiply by values

  • multiply_by_field (mandatory) - none or full field name, e.g. system.cpu - screen field: Multiply by field

  • selectedroles (optional) - array of role names - screen field: Role

Result:

JSON document with the following fields:

  • status - true if ok

  • id - id of the changed document

  • message - error message

Update Rules

The update service changes the document with the AI rule definition.

Method: POST
URL:

https://<host>:<port>/api/ai/update/<id>

where:

  • host - ELS Console host address

  • port - ELS Console port

  • id - AI rule document id

  • body - JSON with the definition of the AI rule

Curl:

curl -XPOST 'https://localhost:5601/api/ai/update/ea9384857de1f493fd84dabb6dfb99ce' -u <user>:<password> -k \
  -H "kbn-version: 6.2.4" -H 'Content-type: application/json' -d'{
  "algorithm_type": "TL",
  "search": "search:6c226420-3b26-11e9-a1c0-4175602ff5d0",
  "label_field": {"field": "system.cpu.idle.pct"},
  "max_probes": 100,
  "time_frame": "1 day",
  "value_type": "avg",
  "max_predictions": 100,
  "threshold": -1,
  "automatic_cron": "*/5 * * * *",
  "automatic_enable": true,
  "automatic_flag": true,
  "start_date": "now",
  "multiply_by_values": [],
  "multiply_by_field": "none",
  "selectedroles": ["test"]
}'

Validation:

  • algorithm_type - GMA, GMAL, LRS, LRST, RFRS, SMAL, SMA, TL

  • value_type - min, max, avg, count

  • time_frame - 1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, 30 day, 365 day

Body JSON description:

  • algorithm_type (mandatory) - GMA, GMAL, LRS, LRST, RFRS, SMAL, SMA, TL - screen field: Algorithm

  • model_name (mandatory) - not empty string - screen field: AI Rule Name

  • search (mandatory) - search id - screen field: Choose search

  • label_field.field (mandatory) - screen field: Feature to analyse

  • max_probes (mandatory) - integer value - screen field: Max probes

  • time_frame (mandatory) - 1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, 30 day, 365 day - screen field: Time frame

  • value_type (mandatory) - min, max, avg, count - screen field: Value type

  • max_predictions (mandatory) - integer value - screen field: Max predictions

  • threshold (optional, default -1) - integer value - screen field: Threshold

  • automatic_cron (mandatory) - cron format string - screen field: Automatic cycle

  • automatic_enable (mandatory) - true/false - screen field: Enable

  • automatic (mandatory) - true/false - screen field: Automatic

  • start_date (optional, default now) - YYYY-MM-DD HH:mm or now - screen field: Start date

  • multiply_by_values (mandatory) - array of string values - screen field: Multiply by values

  • multiply_by_field (mandatory) - none or full field name, e.g. system.cpu - screen field: Multiply by field

  • selectedroles (optional) - array of role names - screen field: Role

Result:

JSON document with the following fields:

  • status - true if ok

  • id - id of the changed document

  • message - error message

Run Rules

The run service executes the AI rule definition document with the given ID.

Method: GET
URL:

https://<host>:<port>/api/ai/run/<id>

where:

  • host - ELS Console host address

  • port - ELS Console port

  • id - AI rule document id

Curl:

curl -XGET 'https://localhost:5601/api/ai/run/ea9384857de1f493fd84dabb6dfb99ce' -u <user>:<password> -k

Result:

JSON document with the following fields:

  • status - true if ok

  • id - id of the executed document

The AI rule document contains the following fields.

Screen fields (set through the console or the create/update API):

  • _source.model_name

  • _source.algorithm_type

  • _source.search

  • _source.label_field

  • _source.max_probes

  • _source.time_frame

  • _source.value_type

  • _source.max_predictions

  • _source.threshold

  • _source.automatic_cron

  • _source.automatic_enable

  • _source.automatic

  • _source.start_date

  • _source.multiply_by_values

  • _source.multiply_by_field

  • _source.selectedroles

Not screen fields (internal and execution metadata):

  • _index

  • _type

  • _id

  • _source.preparation_date

  • _source.machine_state_uid

  • _source.path_to_logs

  • _source.path_to_machine_state

  • _source.searchSourceJSON

  • _source.processing_time

  • _source.last_execute_mili

  • _source.last_execute

  • _source.pid

  • _source.exit_code

Authentication and Security

API Key Authentication

For programmatic access, generate API keys:

curl -u "admin:password" -X POST "https://els-data-01:9200/_security/api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "monitoring-key",
    "role_descriptors": {
      "monitoring": {
        "cluster": ["monitor"],
        "index": [
          {
            "names": ["security-*", "alerts-*"],
            "privileges": ["read"]
          }
        ]
      }
    },
    "expiration": "7d"
  }'
curl -H "Authorization: ApiKey <base64_encoded_key>" \
  -X GET "https://els-data-01:9200/_cluster/health"

License Management

Check license status:

curl -X GET "https://els-data-01:9200/_logserver/license" \
  -H "Authorization: Basic $(echo -n 'admin:password' | base64)"

Update license:

curl -X POST "https://els-data-01:9200/_logserver/license" \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic $(echo -n 'admin:password' | base64)" \
  -d '@license.json'

Best Practices

API Security

  • Use encrypted connections (HTTPS).

  • Rotate API keys regularly.

  • Restrict endpoint access with firewall rules (see the firewalld sketch after this list).

  • Validate all input data.
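
As an illustration of the firewall recommendation, a firewalld sketch that limits the data node port to a management subnet (the subnet 10.0.0.0/24 is an assumption; adjust to your network):

# Allow tcp/9200 only from the assumed management network.
firewall-cmd --permanent --add-rich-rule='rule family="ipv4" source address="10.0.0.0/24" port port="9200" protocol="tcp" accept'
firewall-cmd --reload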

Performance Optimization

  • Batch requests for bulk operations (see the _bulk sketch after this list).

  • Use appropriate timeouts.

  • Cache frequent responses.

  • Monitor API usage with metrics.
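
For the batching recommendation, the _bulk endpoint accepts newline-delimited action/document pairs in a single request. A minimal sketch indexing two events (index name and fields follow the earlier examples):

curl -X POST "https://els-data-01:9200/_bulk" \
  -H "Content-Type: application/x-ndjson" \
  --data-binary '{"index": {"_index": "security-events-2025.08.25"}}
{"event": "login_success", "timestamp": "2025-08-25T15:47:00Z"}
{"index": {"_index": "security-events-2025.08.25"}}
{"event": "login_failure", "timestamp": "2025-08-25T15:48:00Z"}
'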

Error Handling

  • Implement retry logic for transient errors (see the sketch after this list).

  • Log all API errors.

  • Handle rate limits gracefully.

  • Validate response codes.
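
A minimal retry sketch for transient errors, re-issuing the request a few times with a growing delay (endpoint and credentials follow the earlier examples):

#!/bin/bash
for attempt in 1 2 3; do
  CODE=$(curl -s -k -u admin:password -o /dev/null -w "%{http_code}" \
    "https://els-data-01:9200/_cluster/health")
  if [ "$CODE" = "200" ]; then
    echo "Request succeeded"
    break
  fi
  echo "Attempt $attempt failed with HTTP $CODE, retrying..."
  sleep $((attempt * 5))
done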

Integration Patterns

Automation with Scripts

Example: Daily Health Check

#!/bin/bash
HEALTH=$(curl -s -k -u admin:password https://els-data-01:9200/_cluster/health | jq -r .status)

if [ "$HEALTH" != "green" ]; then
  echo "Cluster health issue detected: $HEALTH" | mail -s "ELS Health Alert" admin@company.com
fi
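
The same pattern can watch other health fields; for instance, a sketch that alerts when shards remain unassigned:

#!/bin/bash
UNASSIGNED=$(curl -s -k -u admin:password https://els-data-01:9200/_cluster/health | jq -r .unassigned_shards)

if [ "$UNASSIGNED" -gt 0 ]; then
  echo "Cluster has $UNASSIGNED unassigned shards" | mail -s "ELS Shard Alert" admin@company.com
fi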

Integration with External Tools

Prometheus Metrics Collection:

scrape_configs:
  - job_name: 'els-api'
    static_configs:
      - targets: ['els-data-01:9200']
    metrics_path: '/_prometheus/metrics'
    scheme: 'https'
    basic_auth:
      username: 'monitoring_user'
      password: 'password'

SIEM Agent Operations

Agent enrollment:

curl -X POST "https://els-server:1515/register" \
  -H "Content-Type: application/json" \
  -d '{"agent_id": "agent-001", "host": "agent-host"}'

Agent status:

curl -X GET "https://els-server:55000/agents/agent-001" \
  -H "Authorization: Bearer <token>"

Network Probe Pipeline Management

Get pipeline status:

curl -X GET "http://localhost:9600/_node/pipelines"

Update pipeline:

curl -X PUT "http://localhost:9600/_node/pipelines/main" \
  -H "Content-Type: application/json" \
  -d '{"pipeline": {"batch_size": 1000}}'