I have a few very simple functions in Python that I would like to use as UDFs in Spark SQL. It is easy to register and use them from Python, but I would like to call them from Java/Scala via JavaSQLContext or SQLContext. I noticed that Spark 1.2.1 has a registerPython function, but it is not clear to me how to use it, or whether I should ...
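To illustrate, the functions are of roughly this shape (hypothetical examples; the commented-out registration call is a sketch of the PySpark-side API and assumes an existing SQLContext):

```python
# A trivial Python function of the kind I want to expose as a Spark SQL UDF.
def to_upper(s):
    # Guard against SQL NULLs, which arrive as None.
    return s.upper() if s is not None else None

# Registering it from Python is straightforward (sketch, needs a live SQLContext):
# from pyspark.sql.types import StringType
# sqlContext.registerFunction("to_upper", to_upper, StringType())
# sqlContext.sql("SELECT to_upper(name) FROM people")
```

The open question is how to make such a registered Python function callable when the query is issued from the Java/Scala side instead.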
Any ideas on how to do this? I think it may have become easier in 1.3.0, but I'm limited to 1.2.1.
EDIT: As I'm no longer working on this, I'm interested in knowing how to do this in any Spark version.