Spark SQL split
The Spark SQL split function splits a string into an array of substrings based on a specified delimiter, which is interpreted as a regular expression. The syntax of the split function in Spark SQL is as follows:
```
split(string, delimiter)
```
Where,
- `string` is the input string that needs to be split.
- `delimiter` is the character or string used as a separator to split the input string. It is treated as a regular expression, so metacharacters such as `.` or `|` must be escaped.
For example, if we have a string "Hello,World" and we want to split it into an array of substrings based on the comma separator, we can use the split function as follows:
```
SELECT split("Hello,World", ",") as words
```
This returns an array of two strings: "Hello" and "World".
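Because the delimiter is treated as a regular expression, splitting on a character that is also a regex metacharacter requires escaping it or wrapping it in a character class. A minimal sketch (assuming Spark's default string-literal escaping):
```
-- Splitting an IP-style string on a literal dot: "." alone would match every character,
-- so escape it, or use a character class such as [.]
SELECT split("192.168.0.1", "\\.") AS octets;  -- ["192", "168", "0", "1"]
SELECT split("192.168.0.1", "[.]") AS octets;  -- same result
```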
We can also apply the split function to a column in a table to split its values. For example, if the `employee` table has a `name` column containing full names separated by spaces, we can split each name into a first and last name as follows:
```
SELECT split(name, " ")[0] as first_name, split(name, " ")[1] as last_name FROM employee
```
This returns a result with two columns, `first_name` and `last_name`, containing the first and last names of all employees in the `employee` table.
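To see the query end to end, here is a self-contained sketch that builds a temporary view with made-up sample rows (the table and column names mirror the example above, but the data itself is hypothetical):
```
-- Hypothetical sample data for illustration only
CREATE OR REPLACE TEMP VIEW employee AS
SELECT * FROM VALUES ('Ada Lovelace'), ('Grace Hopper') AS t(name);

SELECT split(name, ' ')[0] AS first_name,
       split(name, ' ')[1] AS last_name
FROM   employee;
-- first_name | last_name
-- Ada        | Lovelace
-- Grace      | Hopper
```
Note that indexing with `[1]` assumes every name contains at least two space-separated parts; names with a single token would yield a NULL (or an error in ANSI mode) for `last_name`.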