ECSs are core computing resources that host many service applications. These applications generate text log data, including application and system logs. Such logs are essential for monitoring system health, optimizing performance, and troubleshooting. To better manage and analyze ECS text logs, you can ingest them into LTS, which centralizes log storage and enables efficient log query, analysis, and alarm reporting.
Follow these steps to complete the ingestion configuration:
Step 1: Select a Log Stream: Store various log types in separate log streams for better categorization and management.
Step 2: (Optional) Select a Host Group: Define the range of hosts for log ingestion. Host groups are virtual groups of hosts. They help you organize and categorize hosts, making it easier to configure log ingestion for multiple hosts simultaneously. You can add one or more hosts whose logs are to be collected to a single host group, and associate it with the same ingestion configuration.
Step 3: Configure the Collection: Configure the log collection details, including collection paths and policies.
Step 4: Configure Indexing: An index is a storage structure used to query log data. Configuring indexing makes log searches and analysis faster and easier.
Step 5: Complete the Ingestion Configuration: After a log ingestion configuration is created, manage it in the ingestion list.
Setting Multiple Ingestion Configurations in a Batch: Select this mode to collect logs from multiple scenarios.
You can also choose Log Ingestion > Ingestion Management in the navigation pane and click Create. On the displayed page, click ECS (Elastic Cloud Server).
Figure 1 Selecting a log stream

A host group is a virtual group of hosts, allowing you to configure host log collection efficiently. Ensure that ICAgent has been installed on hosts where logs are to be collected and the hosts have been added to a host group.
You can skip this step by not selecting any host group: click Next: Configurations, and then click Skip in the displayed dialog box. However, if you skip this step, the collection configuration will not take effect. You are advised to select host groups during the initial ingestion configuration.
If you initially skip host group selection, you can associate host groups later using either method:
Collection configuration items include the log collection scope, collection mode, and format processing. Configure them as follows.
Protect private and sensitive data: you are advised not to transmit such data in fields included in access logs. Encrypt the data if necessary.
Procedure
For example, /var/logs/**/a.log will match the following logs:
/var/logs/a.log
/var/logs/1/a.log
/var/logs/1/2/a.log
/var/logs/1/2/3/a.log
/var/logs/1/2/3/4/a.log
/var/logs/1/2/3/4/5/a.log
If a log collection path is similar to C:\windows\system32 but logs cannot be collected, enable Web Application Firewall (WAF) and configure the path again.
/var/logs/1/a.log
/var/logs/2/a.log

/var/logs/service-1/a.log
/var/logs/service-2/a.log

/var/logs/service/a1.log
/var/logs/service/a2.log
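The wildcard examples above can be reproduced locally with bash globbing as an illustration. Note this is only an approximation: ICAgent performs its own path matching, and the directory tree used here is made up.

```shell
# Illustration only (not ICAgent itself): approximate LTS path matching
# with bash globbing in a throwaway directory tree.
demo=/tmp/lts-glob-demo
mkdir -p "$demo/var/logs/service-1" "$demo/var/logs/service-2" "$demo/var/logs/1/2/3"
touch "$demo/var/logs/service-1/a.log" "$demo/var/logs/service-2/a.log" \
      "$demo/var/logs/a.log" "$demo/var/logs/1/2/3/a.log"

# Single asterisk: /var/logs/service-*/a.log matches the two service dirs.
ls "$demo"/var/logs/service-*/a.log

# Double asterisk: bash's globstar approximates LTS's ** matching, which
# finds a.log at any depth (subject to Max Directory Depth in LTS).
shopt -s globstar
ls "$demo"/var/logs/**/a.log
```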
The built-in rules are {basename}{connector}{wrapping identifier}.{suffix} and {basename}.{suffix}{connector}{wrapping identifier}. Connectors can be hyphens (-), periods (.), or underscores (_), wrapping identifiers can contain only non-letter characters, and the suffix can contain only letters.
A custom wrapping rule consists of {basename} and a regular expression describing the wrapped file names. Example: If your log file name is test.out.log and the names after wrapping are test.2024-01-01.0.out.log and test.2024-01-01.1.out.log, configure the collection path to /opt/*.log and add a custom wrapping rule: {basename}\.\d{4}-\d{2}-\d{2}\.\d{1}\.out\.log.
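You can sanity-check such a wrapping pattern against example file names locally. ICAgent evaluates RE2, where \d is valid; plain grep -E lacks \d, so the digits are written as [0-9] classes in this sketch.

```shell
# Locally check the wrapping pattern against example file names.
# ICAgent uses RE2 (where \d is valid); grep -E needs [0-9] instead.
pattern='^test\.[0-9]{4}-[0-9]{2}-[0-9]{2}\.[0-9]\.out\.log$'

# Wrapped names match:
printf '%s\n' test.2024-01-01.0.out.log test.2024-01-01.1.out.log | grep -E "$pattern"

# The unwrapped name does not match (grep exits non-zero):
printf '%s\n' test.out.log | grep -E "$pattern" || echo "no match"
```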
After you enable this function, one host log file can be collected to multiple log streams.
After you disable this function, each collection path must be unique. That is, the same log file in the same host cannot be collected to different log streams.
Blacklist filters can be exact matches or wildcard pattern matches. For details, see Collection Paths.
| Parameter | Description |
|---|---|
| Log Type | You can select from the following log types: |
| First Collection Time Offset | If you set this parameter to 7, logs generated within the seven days before the collection start time are collected. This offset takes effect only for the first collection, ensuring that logs are not collected repeatedly. The maximum value is seven days. |
| Event Level | You can filter and collect Windows events based on their severity (information, warning, error, critical, and verbose). This function is available only in Windows Vista or later. |
LTS offers various log parsing rules, including Single-Line - Full-Text Log, Multi-Line - Full-Text Log, JSON, Delimiter, Single-Line - Completely Regular, Multi-Line - Completely Regular, and Combined Parsing. Select a parsing rule that matches your log content. Once collected, structured logs are sent to your specified log stream, enabling field searching.
| Parameter | Description | Example Value |
|---|---|---|
| Max Directory Depth | Specify the number of directory levels that can be traversed when double asterisks (**) are used for fuzzy matching in log collection paths. LTS supports a maximum of 20 directory levels. For example, to collect logs from /var/logs/department/app/a.log, set the collection path to /var/logs/**/a.log and Max Directory Depth to 5. | 5 |
| Split Logs | To prevent individual logs from being too large, or from being truncated and discarded, you can split logs based on file size. | Enable |
| Collect Binary Files | Specify whether to collect log data stored in binary format. You can run a command to check the file type; log files whose type contains charset=binary are binary files. | Enable |
| Log File Code | Select the character encoding of the log files: UTF-8 or GBK. GBK is not supported on Windows. Set the encoding correctly so that log content can be read and parsed without garbled characters or data corruption. | UTF-8 |
| Collection Policy | Set whether ICAgent reads a new log file from the end or from the beginning when collecting it. | Incremental |
| Custom Metadata | | Enable |
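The binary-file check mentioned above can be done with the `file` command; `file -i` is one common choice (an assumption here, since the original command is elided from the table). Binary files report charset=binary in its output.

```shell
# Check whether a log file is binary. `file -i` prints the MIME type and
# charset; binary files report charset=binary. The sample paths below
# are throwaway files created just for this illustration.
printf 'plain text log line\n' > /tmp/text.log
printf '\000\001\002\003' > /tmp/binary.log

file -i /tmp/text.log      # e.g. text/plain; charset=us-ascii
file -i /tmp/binary.log    # e.g. application/octet-stream; charset=binary
```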
| Parameter | Description |
|---|---|
| Log Format | |
| Log Time | System time: the log collection time, used by default and displayed at the beginning of each log event. Time wildcard: you can set a time wildcard so that ICAgent looks for the log printing time as the beginning of a log event. |
| Log Segmentation | Specify this parameter when Log Format is set to Multi-line. By generation time indicates that a time wildcard is used to detect log boundaries, whereas By regular expression indicates that a regular expression is used. |
| By regular expression | You can set a regular expression to look for a specific pattern indicating the beginning of a log event. Specify this parameter when Log Format is set to Multi-line and Log Segmentation to By regular expression. The time wildcard or regular expression looks for the specified pattern right from the beginning of each log line. If no match is found, the system time, which may differ from the time in the log event, is used. In general, you are advised to select Single-line for Log Format and System time for Log Time. ICAgent supports only RE2 regular expressions. For details, see Syntax. |
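The effect of regular-expression segmentation can be sketched with awk as a stand-in for ICAgent: a line matching the start pattern begins a new event, and any other line (such as a stack-trace line) is appended to the current event. The timestamp format and log content below are hypothetical; ICAgent itself would evaluate an RE2 such as ^\d{4}-\d{2}-\d{2}.

```shell
# Hypothetical multi-line log: two events, the first spanning three lines.
cat > /tmp/multiline.log <<'EOF'
2024-01-01 10:00:00 ERROR request failed
  at com.example.Foo(Foo.java:42)
  at com.example.Bar(Bar.java:7)
2024-01-01 10:00:01 INFO recovered
EOF

# Group lines into events: a line starting with a date begins a new
# event; continuation lines are joined onto it with " | ".
awk '
  /^[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] / {
    if (ev != "") print ev
    ev = $0
    next
  }
  { ev = ev " | " $0 }
  END { if (ev != "") print ev }
' /tmp/multiline.log
```

Run against the sample file, this emits one line per log event, which is the grouping behavior the By regular expression setting configures in ICAgent.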
An index is a storage structure used to query log data. Configuring indexing makes log searches and analysis faster and easier. Different index settings generate different query and analysis results. Configure index settings to fit your service requirements.
On this page, click Auto Configure to have LTS generate index fields based on the first log event in the last 15 minutes or on common system reserved fields (such as hostIP, hostName, and pathFile). You can also manually add structured fields. After completing the settings, click Submit. The message "Logs ingested" will appear. You can adjust the index settings after the ingestion configuration is created, but the changes will only affect newly ingested logs.
The created ingestion configuration will be displayed.
Deleting an ingestion configuration may lead to log collection failures, potentially resulting in service exceptions related to user logs. In addition, the deleted ingestion configuration cannot be restored. Exercise caution when performing this operation.
Disabling an ingestion configuration may lead to log collection failures, potentially resulting in service exceptions related to user logs. Exercise caution when performing this operation.
You can set multiple ingestion configurations for multiple scenarios in a batch, avoiding repetitive setups.
In the displayed dialog box, click Yes. The added ingestion configurations will be displayed on the Ingestion Management page after the batch creation is successful.