I have a script which does the following:
You drag an AIF or M4A file, multiple files, or a folder of files onto it.
The script connects to a server share and gets a list of file names from a SQLite DB, as follows:
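(It's an ordinary droplet; a minimal sketch of the entry point, where processItem is just a hypothetical name for the per-file work described below:)

on open theItems
    repeat with anItem in theItems
        -- a folder would be walked here; a lone file goes straight through
        my processItem(anItem) -- processItem: placeholder for the steps below
    end repeat
end open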
set theCommand to "sqlite3 " & path_to_db & " 'select file_name FROM player_view;' "
set theResult to do shell script theCommand
set theText to paragraphs of theResult
Currently this returns about 14,000 paragraphs, one file name per paragraph.
The script recurses through the list of dropped files (generally fewer than 50 at a time).
The script checks each one: if it is an AIF file it creates a new name ending in .m4a; if it is an M4A it leaves the name as is, already ending in .m4a. It stores the result in the variable newName.
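That naming step is plain string surgery; a rough sketch, assuming the extension really is the four characters ".aif" (info for returns the file's name):

set fileName to name of (info for anItem) -- anItem: one dropped file
if fileName ends with ".aif" then
    set newName to (text 1 thru -5 of fileName) & ".m4a" -- swap the extension
else
    set newName to fileName -- already .m4a; leave it alone
end if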
The script checks whether newName exists in the variable theText:
if newName is in theText then
error newName & " already exists!"
end if
If no error is raised, more processing follows:
convert the AIF to M4A if it isn't already an M4A (a sketch of this step follows the list)
write the ID3 tags
etc.
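(For context, the conversion step can be done with the afconvert tool that ships with macOS; a rough sketch, assuming a 4-character ".aif" extension and that writing the .m4a next to the source is acceptable:)

set srcPath to POSIX path of anItem
set dstPath to (text 1 thru -5 of srcPath) & ".m4a"
-- -f m4af = MPEG-4 audio container, -d aac = AAC audio data
do shell script "afconvert -f m4af -d aac " & quoted form of srcPath & " " & quoted form of dstPath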
The script is already working, and it is fairly quick on a gigabit network, but the SQLite query is quite slow when I'm logged in remotely.
My question is: would it be more efficient to query the SQLite table once per file, instead of grabbing all the text at once?
In other words, I would have to construct a shell command like this (note that this particular name contains double quotes, and in fact all of the names do):
sqlite3 <path.to.db> 'select file_name from player_view where file_name is '\''A&E MV "The Unfolding".m4a'\'';'
Exactly how to escape all those quotes so I can pass the command to do shell script will be a challenge, so before I attempt it I'd like to know whether the extra effort is worth it.
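As far as the quoting goes, I think the trick is to build the SQL first, double any single quotes inside the name (the only character SQL cares about in a single-quoted string; the double quotes are harmless there), and then let quoted form of handle the shell side. A sketch, where replaceText is a small helper defined underneath and count(*) comes back as the text "0" or "1":

set safeName to replaceText(newName, "'", "''") -- double the SQL single quotes
set theQuery to "SELECT count(*) FROM player_view WHERE file_name = '" & safeName & "';"
set theCommand to "sqlite3 " & quoted form of path_to_db & " " & quoted form of theQuery
if (do shell script theCommand) is not "0" then
    error newName & " already exists!"
end if

on replaceText(theText, searchString, replacementString)
    set {oldTIDs, AppleScript's text item delimiters} to {AppleScript's text item delimiters, searchString}
    set theItems to text items of theText
    set AppleScript's text item delimiters to replacementString
    set theText to theItems as text
    set AppleScript's text item delimiters to oldTIDs
    return theText
end replaceText

Whether ~50 of those round trips actually beat one 14,000-row SELECT over the share is exactly the part I can't judge.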