
I'm trying to install the Cosmos DB Spark connector (https://docs.microsoft.com/en-us/azure/cosmos-db/spark-connector) in Azure Databricks via a cluster init script, but I end up with either errors and a non-working cluster (one of the uber jar's bundled dependencies has a conflicting signature), or code that cannot find the Spark connector even after a 20-minute delay. The same library installs and works fine when I create an interactive cluster and install it through the Azure portal. I've also tried installing it as a non-uber library (with every dependency installed separately), but my code still couldn't see the Cosmos DB Spark connector.

I know I can create a library in the portal and mark it to be installed on every cluster in the Databricks workspace, but I'd like to control this in a more flexible way, as an init script allows. I've also tried different versions from the repository https://repo1.maven.org/maven2/com/microsoft/azure/azure-cosmosdb-spark_2.4.0_2.11/ (matching my Spark/Scala versions).
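For reference, here's roughly what my init script does (a minimal sketch; the version number is just an example, and I'm assuming jars dropped into /databricks/jars end up on the cluster classpath):

```bash
#!/bin/bash
# Sketch of a cluster init script: download the uber jar from Maven Central
# into /databricks/jars so it lands on the cluster classpath at startup.
# The version below is an example -- substitute the one matching your
# Spark/Scala versions.
VERSION="1.4.1"
JAR="azure-cosmosdb-spark_2.4.0_2.11-${VERSION}-uber.jar"
wget -q "https://repo1.maven.org/maven2/com/microsoft/azure/azure-cosmosdb-spark_2.4.0_2.11/${VERSION}/${JAR}" \
     -O "/databricks/jars/${JAR}"
```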

Any help?
