(1) Copy the data from the Azure Storage table to a local file (AzCopy formats it as JSON automatically):
AzCopy /Source:https://{your storage account}.table.core.windows.net/{your table} /Dest:c:\azdata\ /SourceKey:{your storage account key}
The output from this command might look something like this:
[2019/02/27 08:17:33] Manifest file for this export operation is: "{redacted}".
[2019/02/27 08:17:33] All table entities have been downloaded.
[2019/02/27 08:17:33] Validating the integrity of data file list…
[2019/02/27 08:17:33] Validation of data file list passed.
[2019/02/27 08:17:33]
Transfer summary:
Total entities transferred: 139
Elapsed time: 00.00:00:01
The JSON document this produces has some metadata at the top. I deleted that and kept just the array of objects, e.g. [{…},{…},…].
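If you have many exported files, trimming the metadata by hand gets tedious. Here is a minimal Python sketch of one way to automate it, assuming the exported file contains exactly one top-level JSON array of entity objects (the metadata prefix shown is made up, not the actual AzCopy output):

```python
import json

def extract_entity_array(text: str) -> list:
    """Strip any surrounding metadata and parse just the JSON array.

    Assumes the text contains a single top-level JSON array of entity
    objects (an assumption about the export layout; adjust if your
    file looks different).
    """
    start = text.index("[")      # first '[' opens the entity array
    end = text.rindex("]") + 1   # last ']' closes it
    return json.loads(text[start:end])

# Example with a made-up metadata prefix:
sample = 'ExportMetadata: {"version": 1}\n[{"PartitionKey": "p1", "RowKey": "r1"}, {"PartitionKey": "p1", "RowKey": "r2"}]'
entities = extract_entity_array(sample)
print(len(entities))  # 2
```

Write `json.dumps(entities)` back out and you have the clean array the later SQL step expects.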
(2) Copy this local file back up into an Azure Blob Storage container.
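The upload can be done with the same AzCopy tool. A sketch using the v7.x syntax (the key and file name are placeholders, not values from this post):

```shell
AzCopy /Source:c:\azdata\ /Dest:https://{your storage account}.blob.core.windows.net/{your container} /DestKey:{your storage account key} /Pattern:"{your data file}.json"
```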
(3) Create an external data source for the blob container, then bulk-load and shred the JSON:
CREATE EXTERNAL DATA SOURCE MyData
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://{your storage account}.blob.core.windows.net/{your container}'
);

INSERT INTO MyTable ([Columns...])
SELECT row.*
FROM OPENROWSET(
        BULK '{file name from Step (2)}',
        DATA_SOURCE = 'MyData',
        SINGLE_CLOB
     ) AS DataFile
CROSS APPLY OPENJSON(BulkColumn)
WITH (
    [Column1] datatype1,
    [Column2] datatype2,
    ...
) AS row;
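If the OPENJSON ... WITH clause is unfamiliar, here is a conceptual Python stand-in (not how SQL Server implements it) showing what the shredding does: each object in the JSON array becomes one row, with each column pulled out by name and coerced to a declared type:

```python
import json

def shred(json_text, schema):
    """Conceptual stand-in for OPENJSON ... WITH: turn a JSON array of
    objects into rows of typed columns. `schema` maps column name -> type."""
    rows = []
    for obj in json.loads(json_text):
        # One output row per array element; apply the declared type to each column.
        rows.append(tuple(schema[col](obj[col]) for col in schema))
    return rows

doc = '[{"Id": "1", "Name": "alpha"}, {"Id": "2", "Name": "beta"}]'
rows = shred(doc, {"Id": int, "Name": str})
print(rows)  # [(1, 'alpha'), (2, 'beta')]
```

The WITH clause in the T-SQL above plays the role of `schema` here: it names the columns and their target datatypes.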
Here are some references I used:
- import-json-documents-into-sql-server
- sys-external-data-sources-transact-sql
That’s it, you’re done!