Tencent Cloud

Cloud Log Service

Last updated: 2025-12-16 21:28:42

Data Processing

Data processing refers to filtering, cleaning, masking (desensitizing), enriching, and distributing log data to target log topics. It can be understood as log ETL (Extract-Transform-Load).
Source Log Topic: the input of a data processing task.
Target Log Topic: the output of a data processing task.
Target Name: a custom alias for a target topic that improves the topic's readability (business attribute). It is the argument of the log_output("alias") function when outputting logs to a specific target topic. A data processing task must have at least one target topic; otherwise, the task cannot be created.
DSL Processing Functions: DSL (Domain Specific Language) is a set of log data processing functions developed by CLS for log ETL requirements. The functions are simple to use and offer high processing performance. The underlying implementation is based on Flink, so logs are processed in real time.
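Conceptually, a processing task applies these steps to each log record before routing it to a target topic by alias. The sketch below is plain Python, not CLS DSL; the field names, masking rule, and topic aliases are invented for illustration:

```python
import re

def process(record):
    """Filter -> clean -> mask -> enrich -> distribute one log record.
    Returns (target_alias, transformed_record), or None to drop it."""
    # Filter: drop debug noise entirely.
    if record.get("level") == "DEBUG":
        return None
    # Clean: strip surrounding whitespace from the message field.
    record["message"] = record.get("message", "").strip()
    # Mask: hide all but the last 4 digits of an 11-digit phone number.
    record["message"] = re.sub(r"\d{7}(\d{4})", r"*******\1", record["message"])
    # Enrich: attach a static environment tag.
    record["env"] = "production"
    # Distribute: errors go to a dedicated target topic (by alias),
    # mirroring what log_output("alias") does in a real DSL task.
    alias = "error-topic" if record["level"] == "ERROR" else "default-topic"
    return alias, record

# Usage: the phone number in the message is masked and the record
# is routed to the "error-topic" alias.
routed = process({"level": "ERROR", "message": " call 13800138000 failed "})
```

In an actual task, the routing aliases would be the target names configured when creating the task.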

Scheduled SQL Analysis

Scheduled SQL analysis refers to periodically querying log data (both search and SQL are supported) over a specified time window and saving the query results to a target log topic.
Source Log Topic: the input of a scheduled SQL task.
Target Log Topic: the output of a scheduled SQL task.
Scheduling Range: the time range of logs to be queried, for example, from 00:00:00 on January 1, 2023 to 00:00:00 on March 31, 2023.
Scheduling Period: the interval between queries, ranging from 1 to 1,440 minutes. To generate daily reports, set it to 1,440 minutes.
SQL Time Window: the time window applied to each query statement. Combined with the scheduling period, it produces tumbling or sliding windows.
Tumbling Window: non-overlapping query windows, for example, a scheduling period of 60 minutes with a 60-minute SQL time window. Typical scenario: hourly reports.
Sliding Window: overlapping query windows, for example, a scheduling period of 1 minute with a 60-minute SQL time window. Typical scenario: a time series chart of active users within the past hour, at 1-minute granularity.
Delayed Execution: the query delay configured under Advanced Settings in the console, ranging from 60 to 120 seconds. Log indexes are generated with some delay, and logs cannot be queried before their indexes are built. A 60-second delay therefore ensures the indexes are ready by the time the query runs (99.9% of index data is generated within 5 seconds).
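The relationship between scheduling period, SQL time window, and execution delay can be sketched as follows. This is an illustrative model, not a CLS API; the function name and tuple layout are invented:

```python
from datetime import datetime, timedelta

def sql_windows(start, end, period_min, window_min, delay_sec=60):
    """Yield (run_time, window_start, window_end) for a scheduled SQL task.
    Equal period and window give tumbling (non-overlapping) windows;
    a window longer than the period gives sliding (overlapping) windows.
    Each query runs delay_sec after its window closes, so that the log
    indexes for the window have time to be generated."""
    t = start
    while t + timedelta(minutes=window_min) <= end:
        win_end = t + timedelta(minutes=window_min)
        yield (win_end + timedelta(seconds=delay_sec), t, win_end)
        t += timedelta(minutes=period_min)

# Tumbling: hourly report (period 60 min, window 60 min) over 3 hours.
tumbling = list(sql_windows(datetime(2023, 1, 1), datetime(2023, 1, 1, 3), 60, 60))
# Sliding: a 60-minute window recomputed every minute over 2 hours.
sliding = list(sql_windows(datetime(2023, 1, 1), datetime(2023, 1, 1, 2), 1, 60))
```

With the tumbling configuration, consecutive windows meet edge to edge; with the sliding configuration, consecutive windows overlap by 59 minutes, which is what produces a smooth 1-minute-granularity chart.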
