Overview of data transformation cases
Common log processing
Case name | Description |
---|---|
Parse NGINX logs | Use the data transformation feature to parse NGINX logs. |
Check data by using functions | Use the data transformation feature to check log data. |
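Outside Simple Log Service, the core of the NGINX parsing case can be sketched with a plain Python regular expression. This is only an illustrative sketch, not the product's DSL; the log line and field names follow the standard NGINX combined log format.

```python
import re

# Named-group pattern for the common NGINX access-log prefix
# (remote address, user, time, request, status, bytes sent).
NGINX_PATTERN = re.compile(
    r'(?P<remote_addr>\S+) - (?P<remote_user>\S+) '
    r'\[(?P<time_local>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<body_bytes_sent>\d+)'
)

def parse_nginx(line: str) -> dict:
    """Return the named groups of an NGINX access-log line, or {} if no match."""
    m = NGINX_PATTERN.match(line)
    return m.groupdict() if m else {}

sample = '192.168.0.1 - - [10/Oct/2023:13:55:36 +0800] "GET /index.html HTTP/1.1" 200 612'
fields = parse_nginx(sample)
print(fields["status"], fields["request"])  # → 200 GET /index.html HTTP/1.1
```

In the actual feature, the equivalent work is done by regular expression functions or the Grok function, as described in the text parsing cases below.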
Log distribution
Case name | Description |
---|---|
Replicate data from a Logstore | Use the data transformation feature to replicate data from a Logstore. |
Replicate and distribute data | Replicate data from a source Logstore and distribute the data to multiple destination Logstores in a typical scenario. |
Transmit data across regions | Use the data transformation feature to transmit data across regions. |
Distribute data to multiple destination Logstores | Distribute data to multiple destination Logstores in various scenarios. |
Aggregate data from multiple source Logstores | Aggregate data from multiple source Logstores to a destination Logstore. |
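The replication and distribution cases above all reduce to deciding, per event, which destination Logstores should receive a copy. A minimal sketch of that routing logic, with invented destination names and an invented `level` field:

```python
# Hedged sketch: condition-based distribution. "backup" and "errors" are
# hypothetical destination names, not Simple Log Service identifiers.
def route(event: dict) -> list:
    """Return the list of destinations an event should be copied to."""
    destinations = ["backup"]            # replicate every event to a backup store
    if event.get("level") == "ERROR":
        destinations.append("errors")    # additionally distribute error events
    return destinations

print(route({"level": "ERROR", "msg": "disk full"}))  # → ['backup', 'errors']
```

Aggregation is the mirror image: multiple source Logstores feed jobs whose routing all points at the same destination.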
Data masking
Case name | Description |
---|---|
Data masking | Use functions to mask sensitive data in various scenarios. |
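The masking case typically keeps a recognizable prefix and suffix of a sensitive value and blanks the middle. A generic Python sketch for an 11-digit phone-like number (the digit grouping here is an assumption for illustration):

```python
import re

def mask_phone(text: str) -> str:
    """Keep the first 3 and last 4 digits of an 11-digit number, mask the middle."""
    return re.sub(r'\b(\d{3})\d{4}(\d{4})\b', r'\1****\2', text)

print(mask_phone("callback: 13812345678"))  # → callback: 138****5678
```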
Datetime conversion
Case name | Description |
---|---|
Datetime conversion | Use functions to convert and offset the datetime. |
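Datetime conversion and offsetting can be sketched with the standard library: parse a zoned timestamp, normalize it to UTC, then apply an hour offset. The timestamp format matches the NGINX `time_local` style; the offset parameter is an illustrative assumption.

```python
from datetime import datetime, timedelta, timezone

def to_utc_with_offset(ts: str, offset_hours: int = 0) -> str:
    """Parse a '%d/%b/%Y:%H:%M:%S %z' timestamp, convert to UTC, apply an offset."""
    dt = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
    shifted = dt.astimezone(timezone.utc) + timedelta(hours=offset_hours)
    return shifted.strftime("%Y-%m-%d %H:%M:%S")

print(to_utc_with_offset("10/Oct/2023:13:55:36 +0800"))  # → 2023-10-10 05:55:36
```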
Text parsing
Case name | Description |
---|---|
Parse Syslog messages in standard formats | Use the Grok function in the domain-specific language (DSL) of Simple Log Service to parse Syslog messages in different formats. |
Parse NGINX logs | Use regular expression functions or the Grok function to parse NGINX access logs. |
Parse Java error logs | Use the data transformation feature to parse Java error logs. |
Extract dynamic key-value pairs from a string | Use different functions to extract dynamic key-value pairs from a string. |
Transform logs in specific text formats | Use LOG DSL orchestration to transform logs to meet data transformation requirements. |
Convert logs to metrics | Use a data transformation function to convert logs to metrics. |
Parse and update JSON data | Use the data transformation feature to parse and update JSON data. |
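The "extract dynamic key-value pairs" case above can be sketched generically: scan a string for `key=value` fragments without knowing the keys in advance. The separator set (`&` or whitespace) is an assumption for illustration.

```python
import re

def extract_kv(text: str) -> dict:
    """Extract key=value pairs separated by '&' or whitespace."""
    return dict(re.findall(r'(\w+)=([^&\s]+)', text))

print(extract_kv("k1=v1&k2=v2 k3=v3"))  # → {'k1': 'v1', 'k2': 'v2', 'k3': 'v3'}
```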
Cases related to IP addresses
Case name | Description |
---|---|
Obtain the IPIP library from OSS and enrich IP address data | Obtain the IPIP library from Object Storage Service (OSS) and use the library to identify the city, state, and country to which an IP address belongs. |
Obtain the IP2Location library from OSS and enrich IP address data | Obtain the IP2Location library from OSS and use the library to identify the city, state, and country to which an IP address belongs. |
Filter flow logs that record Internet traffic | Use the data transformation feature to filter traffic based on a 5-tuple of network traffic. |
Cleanse data by using functions | Use functions to cleanse data in various scenarios. |
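At their core, the IP enrichment cases match an address against CIDR ranges from a geo library and attach the matched location. A self-contained sketch using the standard library; the ranges and labels below are invented samples, not IPIP or IP2Location data.

```python
import ipaddress

# Hypothetical geo table: (network, location) pairs, as a real library would provide.
GEO_TABLE = [
    (ipaddress.ip_network("203.0.113.0/24"), "Hangzhou, Zhejiang, China"),
    (ipaddress.ip_network("198.51.100.0/24"), "Singapore"),
]

def lookup(ip: str) -> str:
    """Return the location of the first matching network, or 'unknown'."""
    addr = ipaddress.ip_address(ip)
    for net, location in GEO_TABLE:
        if addr in net:
            return location
    return "unknown"

print(lookup("203.0.113.9"))  # → Hangzhou, Zhejiang, China
```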
Processing of logs in specific formats
Case name | Description |
---|---|
Process logs in the JSON format | Use the data transformation feature to transform complex JSON data. |
Parse log entries in a CSV-format log file | Parse log entries in a CSV-format log file, such as a Syslog file. |
Convert logs to metrics | Use the data transformation feature to convert logs to metrics. |
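The CSV case above maps positional values in a log line onto named fields. A minimal sketch with the standard `csv` module; the field names are invented for illustration.

```python
import csv
import io

FIELDS = ["time", "host", "status"]  # hypothetical column names

def parse_csv_line(line: str) -> dict:
    """Split one CSV log entry and zip the values onto named fields."""
    row = next(csv.reader(io.StringIO(line)))
    return dict(zip(FIELDS, row))

print(parse_csv_line("2023-10-10 13:55:36,web-01,200"))
# → {'time': '2023-10-10 13:55:36', 'host': 'web-01', 'status': '200'}
```

Using `csv.reader` rather than `str.split(",")` keeps quoted fields containing commas intact.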
Configuration of a data transformation job as a RAM user
Case name | Description |
---|---|
Use a default role to transfer data within the same Alibaba Cloud account | Specify a default role to transfer log data within the same Alibaba Cloud account. |
Use custom roles to transfer data within the same Alibaba Cloud account | Use custom roles to transfer log data within the same Alibaba Cloud account. |
Use custom roles to transfer data across different Alibaba Cloud accounts | Use custom roles to transfer log data across different Alibaba Cloud accounts. |
Use AccessKey pairs to transfer data within the same Alibaba Cloud account | Use AccessKey pairs to transfer log data within the same Alibaba Cloud account. |
Use AccessKey pairs to transfer data across different Alibaba Cloud accounts | Use AccessKey pairs to transfer log data across different Alibaba Cloud accounts. |
Data enrichment
Case name | Description |
---|---|
Log enrichment practices | Common practices for enriching log data. |
Pull data from one Logstore to enrich log data in another Logstore | Use resource functions to pull data from one Logstore to enrich log data in another Logstore. |
Pull a CSV file from OSS to enrich data | Use resource functions to pull data from OSS and use mapping functions to map the data fields to data fields in Simple Log Service. |
Obtain data from an ApsaraDB RDS for MySQL database over the internal network | Obtain data from an ApsaraDB RDS for MySQL database over the internal network. |
Build dictionaries and tables for data enrichment | Describe common methods for building dictionaries and tables and compare the advantages and disadvantages of different building methods. |
Obtain data from an ApsaraDB RDS for MySQL database for data enrichment | Use resource functions to obtain data from an ApsaraDB RDS for MySQL database for data enrichment. |
Use a resource function to obtain incremental data | Use the res_rds_mysql function to obtain incremental data. |
Enrich log data by using mapping functions | Use the e_dict_map and e_search_dict_map functions to enrich log data. |
Pull data from a Hologres database for data enrichment | Use resource functions to pull data from a Hologres database for data enrichment. |
Use the e_table_map function to enrich HTTP response status codes | Use the e_table_map function to enrich HTTP response status codes. |
Filter VPC flow logs for Internet traffic logs | Use the data transformation feature to filter flow logs for Internet traffic logs. |
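Dictionary-based enrichment, the pattern behind the `e_dict_map` case, can be sketched in plain Python: look a field's value up in a table and attach the result as a new field. The status-text table below is a generic HTTP example, not pulled from the product.

```python
# Hypothetical lookup table mapping status codes to human-readable text.
STATUS_TEXT = {"200": "OK", "404": "Not Found", "500": "Server Error"}

def enrich(event: dict) -> dict:
    """Return a copy of the event with a status_text field derived from status."""
    enriched = dict(event)
    enriched["status_text"] = STATUS_TEXT.get(event.get("status"), "unknown")
    return enriched

print(enrich({"status": "404", "path": "/missing"}))
# → {'status': '404', 'path': '/missing', 'status_text': 'Not Found'}
```

Resource functions in the cases above fill such tables from external stores (a Logstore, OSS, ApsaraDB RDS for MySQL, or Hologres) instead of hard-coding them.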
Other cases
Case name | Description |
---|---|
Transform historical data | Best practices for transforming historical data. |