In Databricks, Delta tables support the same column types as Apache Spark, since Delta Lake is built on top of Spark.
Supported Data Types in Delta Tables:
-
Primitive Types
Integer Numeric Types:
BYTE (8-bit, -128 to 127)
SHORT (16-bit, -32,768 to 32,767)
INT or INTEGER (32-bit, -2,147,483,648 to 2,147,483,647)
LONG (64-bit, -2⁶³ to 2⁶³-1)
Floating-Point Numeric Types:
FLOAT (32-bit, single precision)
DOUBLE (64-bit, double precision)
Fixed-Precision Numeric Types:
DECIMAL(precision, scale) or NUMERIC (up to 38 digits of precision)
Boolean Type:
BOOLEAN (true/false)
Text and String Types:
STRING (variable-length character sequence)
Date and Time Types:
DATE (date only, no time)
TIMESTAMP (date and time with microsecond precision)
TIMESTAMP_NTZ (timestamp without time zone; available in recent Spark and Delta versions)
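Spark SQL's typed literals make these primitive types easy to try out directly; a quick sketch (the column aliases are illustrative):

```sql
-- Typed literals for the primitive types listed above
SELECT
  127Y                                    AS max_byte,   -- BYTE (TINYINT), suffix Y
  32767S                                  AS max_short,  -- SHORT (SMALLINT), suffix S
  2147483647                              AS max_int,    -- INT
  9223372036854775807L                    AS max_long,   -- LONG (BIGINT), suffix L
  CAST(12345.67 AS DECIMAL(10,2))         AS price,      -- fixed precision and scale
  DATE '2024-01-15'                       AS d,          -- DATE literal
  TIMESTAMP '2024-01-15 10:30:00.123456'  AS ts;         -- microsecond precision
```

The Y, S, and L suffixes mark TINYINT, SMALLINT, and BIGINT literals; a literal outside its type's range (e.g. 128Y) is rejected at parse time.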
-
Complex Types
Array: ARRAY&lt;T&gt; (list of values of element type T)
Map: MAP<K, V> (key-value pairs)
Struct: STRUCT<field1: type1, field2: type2, ...> (nested structure)
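Complex values can be built inline with Spark SQL's array(), map(), and named_struct() functions; a minimal sketch (all field names and values are illustrative):

```sql
SELECT
  array(1, 2, 3)                              AS nums,   -- ARRAY<INT>
  map('env', 'prod', 'region', 'us-east')     AS props,  -- MAP<STRING, STRING>
  named_struct('source', 'api', 'version', 2) AS meta;   -- STRUCT<source: STRING, version: INT>
```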
-
Advanced Types
Binary: BINARY (arbitrary binary data)
Intervals (usable in Spark SQL expressions, but note that interval values generally cannot be persisted as Delta table columns):
INTERVAL YEAR TO MONTH
INTERVAL DAY TO SECOND
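In practice, interval values typically appear in date/time arithmetic rather than as stored columns; for example:

```sql
SELECT
  DATE '2024-01-31' + INTERVAL '1' MONTH  AS next_month,  -- year-month interval
  TIMESTAMP '2024-01-15 10:00:00'
    + INTERVAL '1 12:30' DAY TO MINUTE    AS later;       -- day-time interval
```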
-
Example: Creating a Delta Table with Different Column Types
```sql
CREATE TABLE my_delta_table (
  id INT,
  name STRING,
  age SHORT,
  salary DECIMAL(10,2),
  is_active BOOLEAN,
  created_at TIMESTAMP,
  metadata STRUCT<source: STRING, version: INT>,
  tags ARRAY<STRING>,
  properties MAP<STRING, STRING>
) USING DELTA;
```
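Once created, the nested columns of my_delta_table can be addressed with dot, index, and key syntax (the element type of tags is assumed to be STRING, and the 'env' key is hypothetical):

```sql
SELECT
  metadata.source,                 -- struct field access
  metadata.version,
  tags[0]           AS first_tag,  -- array element (0-based index)
  properties['env'] AS env         -- map lookup by key
FROM my_delta_table;
```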
These supported types let Delta tables store structured, semi-structured, and even binary data, making them highly flexible for both analytical and transactional use cases.
Links:
https://docs.delta.io/latest/delta-intro.html
https://parquet.apache.org/docs/file-format/types/