To convert a string column (StringType) to an array column (ArrayType) in PySpark, use the split() function from the pyspark.sql.functions module. It splits a string on a specified delimiter (space, comma, pipe, etc.) and returns an ArrayType column.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("StringToArray").getOrCreate()

# Sample data: names stored as comma-separated strings in a single column
data = [("James, A, Smith", "2018", "M", 3000),
        ("Michael, Rose, Jones", "2010", "M", 4000),
        ("Robert, K, Williams", "2010", "M", 4000)]
columns = ["Name", "dob_year", "gender", "salary"]

# Create DataFrame
df = spark.createDataFrame(data, columns)
df.printSchema()
df.show(truncate=False)
# Split the Name string on commas into an array, then explode it into one row per element
df.withColumn("Name_", F.explode(F.split(F.col("Name"), ","))).show()

df.createOrReplaceTempView("PERSON")
spark.sql("SELECT SPLIT(Name, ',') AS NameArray FROM PERSON").show()
