PostgreSQL array encoding in Doobie – text[] vs character varying[]


Suppose we have this PostgreSQL schema:

CREATE TABLE "temp" (id  int, fields text[]);
INSERT INTO "temp" VALUES
  (1, array['abc', 'def']),
  (2, array['abc', 'def', 'jkl']),
  (3, array['abd', 'def']),
  (4, '{"1{c}1", a}');

The following works in plain SQL, returning rows 1 and 2 (@> is the "contains" operator in pg):

SELECT id, fields FROM temp WHERE "fields" @> '{def, abc}';

But the same query in Doobie (with its PostgreSQL extension, doobie-postgres):

import doobie._
import doobie.implicits._
import doobie.postgres.implicits._
val searchTerms = List("def", "abc")
fr"SELECT id, fields FROM temp WHERE fields @> $searchTerms"

Fails with:

org.postgresql.util.PSQLException: ERROR: operator does not exist: text[] @> character varying[]
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.

Casting the interpolated searchTerms value to text[] seems to solve the problem:

val searchTerms = List("def", "abc")
fr"SELECT id, fields FROM temp WHERE fields @> CAST($searchTerms AS text[])"

Alternatively, I can change the type of the fields column from text[] to varchar[] in the database schema to avoid the cast; the original query then runs without error and returns the correct rows.

My main problem is that I don't know why Doobie behaves this way. I will probably run into the same issue again with custom types, so I need to understand why $searchTerms is encoded as character varying[] rather than text[], and what, if anything, I can do to change that behaviour so that I can keep the text[] column type and avoid the cast. I'm guessing there is an implicit instance somewhere controlling the array element type, but I couldn't figure out which one.
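
For completeness, the best I have come up with so far is to hide the cast behind a small fragment helper so call sites stay readable (arrayContains below is just a name I made up, not a Doobie combinator), but I would still prefer to fix the encoding itself:

// Rough sketch: the cast lives in one place and callers keep the text[] column.
def arrayContains(column: Fragment, terms: List[String]): Fragment =
  column ++ fr"@> CAST($terms AS text[])"

val q: Query0[(Int, List[String])] =
  (fr"SELECT id, fields FROM temp WHERE" ++ arrayContains(fr"fields", List("def", "abc")))
    .query[(Int, List[String])]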
