
I want to save data via Data Factory to Blob Storage or some other storage in the Azure cloud, and then get it back into Azure Database for MySQL. But the only options in the UI under Data Factory -> Author & Monitor -> Destination data store are:

Azure Blob Storage, Azure Cosmos DB (MongoDB API), Azure Cosmos DB (SQL API), Azure Data Explorer (Kusto), Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure File Storage, Azure SQL Data Warehouse, Azure SQL Database, Azure SQL Database Managed Instance, Azure Search, and Azure Table Storage.

Is there a way to save it to a MySQL database, maybe via the CLI? Or could I choose one of these options and configure it to work with a MySQL database in Azure?

1 Answer


At present, there is no sink for Azure Database for MySQL, so you cannot do this natively in Azure Data Factory. There is a UserVoice item for this which states that work on it has started.

To do this today, you would need to write a process to do it yourself, triggered by your Data Factory pipeline. You could do this using something like Azure Functions, Azure Automation, etc.
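A minimal sketch of the Azure Functions approach in Python, assuming a blob-triggered function, the pymysql driver, and placeholder connection settings, table, and column names (none of these come from the original answer):

    import csv
    import io
    import logging
    import os

    import azure.functions as func
    import pymysql


    def main(blob: func.InputStream):
        """Blob-triggered function (binding configured in function.json):
        load a CSV dropped by Data Factory into Azure Database for MySQL."""
        logging.info("Processing blob: %s (%d bytes)", blob.name, blob.length)

        # Parse the CSV that the Data Factory copy activity wrote to blob storage.
        reader = csv.reader(io.StringIO(blob.read().decode("utf-8")))
        next(reader, None)  # skip the header row (assumes the export includes one)
        rows = list(reader)

        # Connection settings come from app settings; these names are placeholders.
        # Note: Azure Database for MySQL enforces SSL by default, so you may need
        # to supply the CA certificate via pymysql's ssl options.
        conn = pymysql.connect(
            host=os.environ["MYSQL_HOST"],  # e.g. myserver.mysql.database.azure.com
            user=os.environ["MYSQL_USER"],
            password=os.environ["MYSQL_PASSWORD"],
            database=os.environ["MYSQL_DB"],
        )
        try:
            with conn.cursor() as cur:
                # Assumes a target table whose columns match the CSV layout.
                cur.executemany(
                    "INSERT INTO my_table (col1, col2) VALUES (%s, %s)", rows
                )
            conn.commit()
        finally:
            conn.close()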

Sam Cogan
  • Can you please elaborate on Azure Functions? Maybe a simple example? – Omer Anisfeld Aug 11 '19 at 16:16
  • Azure Functions is just a platform to run your code in a serverless manner, so you'd need to write a script that does what you need, in this case read from Blob Storage and write to MySQL. You can do this in C#, PowerShell, Python, Java, etc. – Sam Cogan Aug 11 '19 at 16:18
  • But when I save the data to blob, it is saved in plain-text format, not something that a tool like mysqldump can use, so it seems kind of expensive for large-scale databases. – Omer Anisfeld Aug 11 '19 at 16:38
  • You can save the data to blob in any format you require. The easiest way would be to have Data Factory create it as a CSV file that matches your database format, then script the import of that CSV. – Sam Cogan Aug 11 '19 at 16:40
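Following up on that last comment, here is a minimal sketch of scripting the CSV import in Python. It assumes the pymysql driver, a Data Factory export named export.csv, a placeholder table my_table, and that local_infile is enabled both on the client connection and in the Azure Database for MySQL server parameters:

    import os

    import pymysql

    # Connect with local_infile enabled so LOAD DATA LOCAL INFILE is permitted.
    # Server, user, and database names below are placeholders.
    conn = pymysql.connect(
        host="myserver.mysql.database.azure.com",
        user="myadmin",
        password=os.environ["MYSQL_PASSWORD"],
        database="mydb",
        local_infile=True,
    )
    try:
        with conn.cursor() as cur:
            # Bulk-load the CSV that Data Factory wrote; IGNORE 1 LINES skips
            # the header row, assuming the export includes one.
            cur.execute(
                "LOAD DATA LOCAL INFILE 'export.csv' "
                "INTO TABLE my_table "
                "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
                "LINES TERMINATED BY '\\n' "
                "IGNORE 1 LINES"
            )
        conn.commit()
    finally:
        conn.close()

For large exports, a bulk load like this is generally much faster than row-by-row INSERT statements, which addresses the scale concern raised in the comments.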