Spark job server does not return JSON in the correct format

case class Response(jobCompleted: String, detailedMessage: String)

override def runJob(sc: HiveContext, runtime: JobEnvironment, data: JobData): JobOutput = {
  val generateResponse = new GenerateResponse(data, sc)
  val response = generateResponse.generateResponse()
  response.prettyPrint
}

I am trying to get output from the Spark job server in this format from my Scala code:

" result":{
     "jobCompleted":true,
    "detailedMessage":"all good"
   }  

However, what actually comes back is the following: result:{"{\"jobCompleted\":\"true\",\"detailedMessage.."}.

Can someone please point out what I am doing wrong and how to get the correct format? I also tried response.toJson, which returns the AST format shown below (a short sketch of the double encoding follows the sample):

"result": [{
    "jobCompleted": ["true"],
    "detailedMessage": ["all good"]
  }],
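For context: prettyPrint and compactPrint return a plain String, so if that String is serialized again as JSON the inner quotes get escaped, which appears to be what is happening above. Below is a minimal spray-json sketch of that double encoding; the EscapingDemo object and sample values are made up for illustration and are not from the original job:

    import spray.json._
    import DefaultJsonProtocol._

    // Illustration only: hypothetical names, not the original job's code.
    object EscapingDemo extends App {
      case class Response(jobCompleted: String, detailedMessage: String)
      implicit val responseFormat: RootJsonFormat[Response] = jsonFormat2(Response)

      // prettyPrint/compactPrint return a plain String. Serializing that String
      // again as JSON escapes the inner quotes, matching the output shown above.
      val asString: String = Response("true", "all good").toJson.compactPrint
      println(asString)        // {"jobCompleted":"true","detailedMessage":"all good"}
      println(asString.toJson) // "{\"jobCompleted\":\"true\",\"detailedMessage\":\"all good\"}"
    }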

1 Answer

Answered by Siva:

I finally figured it out, based on the Stack Overflow question below. If there is a better way, kindly post it here, as I am new to Scala and the Spark job server.

Convert DataFrame to RDD[Map] in Scala

So the key is to convert the response to a Map[String,JsValue]. Below is the sample code I was playing with.

import spray.json._

case class Response(param1: String, param2: String, param3: List[SubResult])
case class SubResult(lst: List[String])

object ResultFormat extends DefaultJsonProtocol {
  implicit val subResultFormat: RootJsonFormat[SubResult] = jsonFormat1(SubResult)
  implicit val responseFormat: RootJsonFormat[Response] = jsonFormat3(Response)
}

type JobOutput = Map[String, JsValue]

def runJob(....) = {
  import ResultFormat._

  val xlst = List("one", "two")
  val ylst = List("three", "four")
  val subResult1 = SubResult(xlst)
  val subResult2 = SubResult(ylst)
  val subResultList = List(subResult1, subResult2)
  val r = Response("xxxx", "yyy", subResultList)

  // Returns a Map[String, JsValue], which the Spark job server serializes correctly.
  r.toJson.asJsObject.fields
}
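If you want to verify the conversion without deploying a job, here is a small self-contained sketch using plain spray-json (no job-server dependency); the MapConversionDemo object and the sample values are just for illustration:

    import spray.json._
    import DefaultJsonProtocol._

    // Self-contained check of the Map[String, JsValue] conversion (no job server needed).
    object MapConversionDemo extends App {
      case class SubResult(lst: List[String])
      case class Response(param1: String, param2: String, param3: List[SubResult])

      implicit val subResultFormat: RootJsonFormat[SubResult] = jsonFormat1(SubResult)
      implicit val responseFormat: RootJsonFormat[Response] = jsonFormat3(Response)

      val r = Response("xxxx", "yyy",
        List(SubResult(List("one", "two")), SubResult(List("three", "four"))))

      // asJsObject.fields gives Map[String, JsValue]; each value stays a JsValue,
      // so it can be serialized as nested JSON instead of an escaped string.
      val fields: Map[String, JsValue] = r.toJson.asJsObject.fields
      fields.foreach { case (k, v) => println(s"$k -> ${v.compactPrint}") }
      // param1 -> "xxxx"
      // param2 -> "yyy"
      // param3 -> [{"lst":["one","two"]},{"lst":["three","four"]}]
    }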