I had the same problem: a bulk copy (BCP) job on SQL Server needed to write a file out to a different server, managed by a different group in the organization.
The trick is to get the drive mapped as the user that the BCP job runs under.
You map the drive as a dedicated Active Directory account that has permissions for the desired destination, what some call a service account. I had to work with my AD folks to get one set up.
To find out which user account BCP is using, run this command from a SQL Server query window:
EXEC xp_cmdshell 'ECHO %USERDOMAIN%\%USERNAME%';
When mapping the drive, run whoami at the command prompt to confirm which account the mapping is being made under.
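For example (the domain and username shown are just placeholders):

whoami
REM output looks like: mydomain\myusername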
In my case, I couldn't log in to the SQL Server interactively as the SYSTEM account, so I built a batch job that Task Scheduler could execute, but run AS the SYSTEM account.
I put commands in the batch job to write the results out to a text file, since I wouldn't be able to see the console output.
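For reference, the scheduled task can be created and kicked off from an elevated command prompt roughly like this (the task name and the path to the batch file are just placeholders):

REM Create a one-shot task that runs the batch file as SYSTEM
schtasks /Create /TN "MapDriveAsSystem" /TR "C:\Test\MapDrive.bat" /SC ONCE /ST 00:00 /RU SYSTEM
REM Run it immediately instead of waiting for the start time
schtasks /Run /TN "MapDriveAsSystem"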
** Batch job below **
ECHO ON
ECHO This will map the drive; it is run by Task Scheduler AS the user SYSTEM
ECHO which should make the mapping available to the SYSTEM account
ECHO List the existing drives first.
net use >> c:\Test\SystemNetUseOutput.txt
net use R: \\MyRemoteServer\MyDirectory pa$$word /USER:MyDomain\MyUsername /PERSISTENT:Yes
ECHO the /PERSISTENT:Yes switch makes the drive mapping remain after a reboot
ECHO List the existing drives with the new mapping
net use >> c:\Test\SystemNetUseOutput.txt
ECHO See what user this batch job ran under
whoami >> c:\Test\SystemNetUseOutput.txt
ECHO need to exit to allow the job to finish
EXIT
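Once the drive is mapped for the SYSTEM account, the BCP export should be able to write straight to the R: drive. As a rough sketch (the server name, query, and output file below are made up), the bcp command might look like:

bcp "SELECT * FROM MyDatabase.dbo.MyTable" queryout R:\MyExport.txt -c -T -S MyServer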
** I hope this helps someone else **
Wow, just running the scheduled task affects the service if it's restarted. Awesome. – eel ghEEz – 2019-05-01T19:44:38.920