How to Configure Logstash
Logstash is used in MCS to send data to external destinations. MCS includes ElasticSearch by default for demonstration and POC purposes. In production environments, TAG recommends using external data targets. These targets could be an external ElasticSearch, AWS OpenSearch, GCP Pub/Sub, HTTP server, or a CSV file.
Logstash collects data from the MCS REDIS Pipe and forwards it to external destinations, which are controlled by the output plugins. MCS exposes the output part of the configuration so it can be pointed to any destination. The input side of the configuration is internal and not exposed.
A complete list of Logstash output plugins can be found here: Output plugins | Logstash Reference [8.16] | Elastic
MCS exposes several data indices. Each index can be configured independently to send data to one or more external destinations.
Logstash can be configured from MCS version 1.2.0 onwards.
List of currently supported indices
ApplicationStatus.conf
BridgeRxStatistics.conf
BridgeTxStatistics.conf
ChannelCaptions.conf
ChannelEvents.conf
ChannelFingerprint.conf
ChannelScte104.conf
ChannelScte35.conf
ChannelStatistics.conf
ContentMatching.conf
EncoderStatistics.conf
KmsStatistics.conf
NetworkStatistics.conf
NtpStatistics.conf
OttStatistics.conf
PtpStatistics.conf
SsimResults.conf
SystemEvents.conf
SystemKeepAlive.conf
How to Modify MCS to Use an External Data Destination
Updating the configuration
Navigate to the ~/MCS folder.
Open the environment file for editing with the command nano .env and set EXTERNAL_LOG_DB_TYPE="external" as shown below.
If you do not want to use the internal ElasticSearch, set ENABLE_INTERNAL_ES=false.
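A minimal sketch of the relevant lines in the .env file after the change (any other variables in the file stay untouched):

# Send data to external destinations instead of the bundled ElasticSearch
EXTERNAL_LOG_DB_TYPE="external"
# Optional: disable the internal ElasticSearch
ENABLE_INTERNAL_ES=false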
Save (Control O) and close (Control X) the file.
To restart MCS, run the ./run.sh command. Select option 2 (Stop) and then option 1 (Start).
Template files will be placed in the external folder ~/MCS/logstash/pipeline/external
Modify the index files to add new outputs or to modify existing ones.
Example 1: To send all the events to Google Cloud Pub/Sub, modify the output section of the ChannelEvents.conf to add or replace an existing output.
output {
  google_pubsub {
    # Required attributes
    project_id => "my_project"
    topic => "my_topic"
    # Optional if you're using app default credentials
    json_key_file => "service_account_key.json"
  }
}
project_id is the full GCP project ID (not just the project name).
topic is one of the topics created in Pub/Sub for this data stream.
json_key_file points to the path of the Service Account key file as mapped inside the MCS Logstash container. For example, a file uploaded to /home/ubuntu/MCS/logstash/pipeline/external/key.json is available inside the container at /usr/share/logstash/pipeline/external/key.json, so the output configuration in each .conf file should be set to json_key_file => "/usr/share/logstash/pipeline/external/key.json", as shown below.
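Putting this together, a minimal sketch of the ChannelEvents.conf output with the container key file path (the project and topic values are the placeholders from the example above, not real values):

output {
  google_pubsub {
    project_id => "my_project"
    topic => "my_topic"
    json_key_file => "/usr/share/logstash/pipeline/external/key.json"
  }
}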
Example 2: To send events to an HTTP server, modify the output section as follows.
output {
  http {
    format => "json"
    http_method => "post"
    url => "http://192.168.1.1/bar"
  }
}
The output configuration will have different parameters for each destination type. For specific examples, please refer to the Logstash guide.
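For instance, a minimal sketch of an output that forwards an index to an external ElasticSearch cluster; the host, index name, and credentials below are placeholders and should be replaced with values from your own environment:

output {
  elasticsearch {
    hosts => ["https://my-external-es:9200"]
    index => "channel-events"
    user => "elastic"
    password => "changeme"
  }
}

Because an output section may contain more than one plugin block, the same index can be forwarded to several destinations at once.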
Every .conf file includes a drop filter by default, so that all data for an index is discarded unless that index is needed.
filter {
  if "${EXTERNAL_LOG_DB_TYPE}" == "external" {
    drop { }
  }
}
To use a specific index, comment out the drop section and enable the filter and output sections, as shown in the sketch below.
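For example, a sketch of the filter section of a template after the drop has been commented out for an index you want to forward (assuming no additional filter logic is required for that index):

filter {
  # if "${EXTERNAL_LOG_DB_TYPE}" == "external" {
  #   drop { }
  # }
}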