How to pass parameters to a Spark SQL query when using Typesafe Config

I am not able to get string interpolation to work. My daily_query.conf file and code look like this:

```
metrics {
  opt_metrics {
    query = """select * from opt where created_at= '$ds'"""
  }
}
```
```
val config: Config = ConfigFactory.load("daily_query.conf").getConfig("metrics")

val ds = "2022-10-30"

val rawQuery = config.getString("opt_metrics.query")

val q = "s\"\"\""+rawQuery+"\"\"\""

println(q) //output: s"""select * from opt where created_at= '$ds'"""
```

The expectation is to substitute the value of the variable 'ds', as in spark.sql(s"""select * from opt where created_at= '2022-10-30' """).

1 Answer

Answered by stefanobaghino:

The string interpolators are expanded at compile time using macros (see here). This means that the only way to use them programmatically while reading a configuration file is to use a macro yourself (although I'm not 100% sure that's really feasible either). This is probably too complicated for the end goal you want to achieve, and you can probably just use replace, as in this example:

```
val config = com.typesafe.config.ConfigFactory.parseString(
  """metrics {
    |  opt_metrics {
    |    query = "select * from opt where created_at='$ds'"
    |  }
    |}
    |""".stripMargin)

val ds = "2022-10-30"

val rawQuery = config.getString("metrics.opt_metrics.query")

rawQuery.replace("$ds", ds) // evaluates to select * from opt where created_at='2022-10-30'
```

You can play around with this code here on Scastie.
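
For completeness, here is a minimal sketch of how the substituted query could then be handed to spark.sql. The `bind` helper, the local SparkSession setup, and the parameter map are illustrative assumptions and not part of Typesafe Config or Spark; the example also presumes that daily_query.conf is on the classpath and that an `opt` table or temporary view is already registered in the session.

```
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.sql.{DataFrame, SparkSession}

object DailyQuery {

  // Hypothetical helper (not part of Typesafe Config or Spark): replaces every
  // "$name" placeholder in the template with the matching value from params.
  def bind(template: String, params: Map[String, String]): String =
    params.foldLeft(template) { case (acc, (name, value)) =>
      acc.replace("$" + name, value)
    }

  def main(args: Array[String]): Unit = {
    // Assumption: a local session, just for illustration.
    val spark = SparkSession.builder()
      .appName("daily-query")
      .master("local[*]")
      .getOrCreate()

    // Same config lookup as in the question.
    val config: Config = ConfigFactory.load("daily_query.conf").getConfig("metrics")
    val rawQuery = config.getString("opt_metrics.query")

    // Plain string replacement, as suggested in the answer above.
    val query = bind(rawQuery, Map("ds" -> "2022-10-30"))
    // query is now: select * from opt where created_at= '2022-10-30'

    // Assumes an "opt" table or temporary view is already registered.
    val result: DataFrame = spark.sql(query)
    result.show()
  }
}
```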