I have a problem running SQL*Plus from bash. Here is my code:

```shell
#!/bin/bash

#curl http://192.168.168.165:8080/api_test/xsql/f_exp_order_1016.xsql > script.sql
wget -O script.sql 192.168.168.165:8080/api_test/xsql/f_exp_order_1016.xsql
set NLS_LANG=_.UTF8
sqlplus /nolog << ENDL
connect login/password
set sqlblanklines on
start script.sql
exit
ENDL
```

I download the insert statements from our intranet, put them into an SQL file, and run it through SQL*Plus. This works fine. My problem is that when I save the file script.sql, the encoding goes wrong. All special characters (like íášč) are broken, which causes wrong characters to be inserted into my DB. The encoding of the file is UTF-8, and UTF-8 is also set on the XSQL page on our intranet. So I really don't know where the problem could be.
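One way to check what actually landed in `script.sql` is to inspect it before feeding it to SQL*Plus (a minimal sketch; `file` and `od` are assumed to be available, as they are on most Linux distributions):

```shell
# Report the character set that `file` detects for the downloaded file
file -bi script.sql            # e.g. "text/plain; charset=utf-8"

# Look at the raw bytes of the first lines; valid UTF-8 Czech characters
# appear as two-byte sequences (e.g. octal 303 255 for "í")
head -n 5 script.sql | od -c
```

If `file` already reports something other than utf-8 here, the problem is in the download step rather than in SQL*Plus.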

Any advice about my script is also welcome; I'm a total newbie at Linux scripting :-)

Petr Mensik
  • Use [UNISTR](https://docs.oracle.com/cd/B19306_01/server.102/b14200/functions204.htm) and UCS2 [encoding](http://www.columbia.edu/kermit/ucs2.html). – Ayb May 15 '18 at 14:12

2 Answers


OK, the problem wasn't in the file (its encoding was UTF-8, as it should be) but in the setting of the Oracle NLS_LANG environment variable. The solution was to put this line before executing the SQL*Plus script:

```shell
NLS_LANG="CZECH_CZECH REPUBLIC.UTF8" export NLS_LANG
```
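Putting it together, the script would look roughly like this (a sketch reusing the asker's URL and placeholder credentials; note that the variable must be exported so `sqlplus` inherits it, whereas bash's `set` does not assign environment variables at all):

```shell
#!/bin/bash
# Download the generated INSERT statements (URL from the original script)
wget -O script.sql http://192.168.168.165:8080/api_test/xsql/f_exp_order_1016.xsql

# Tell the Oracle client which character set the script file uses;
# without export, sqlplus would never see the variable
NLS_LANG="CZECH_CZECH REPUBLIC.UTF8"
export NLS_LANG

sqlplus /nolog << ENDL
connect login/password
set sqlblanklines on
start script.sql
exit
ENDL
```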

Petr Mensik

You may need to convert your inserts to an ISO 8859 code page. Check the encoding in use on your Oracle server; it will look something like 'WE8ISO8859P1', which tells you the ISO code page in use.

You can use iconv or something similar to convert your file from UTF-8 to that ISO 8859 encoding, then try running it through SQL*Plus.
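For example (a sketch; the target character set must match whatever the server reports, and `ISO-8859-1` here is just the set corresponding to `WE8ISO8859P1`):

```shell
# Convert the downloaded UTF-8 file to ISO 8859-1 before loading it;
# iconv fails loudly on characters that have no mapping in the target set,
# which is safer than silently inserting garbage
iconv -f UTF-8 -t ISO-8859-1 script.sql > script_iso.sql
```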

  • Thanks for the advice. I was able to fix the characters in the file with `iconv -f UTF8 -t CP1250 < temp.sql > script.sql` (ISO8859-2 didn't work for me), so the file looks fine now. But the characters in my DB are still broken after insertion. So I am inserting a correct file with correct characters and it still ends up broken in the DB. I also checked the DB encoding on the Oracle server and it's UTF8. Am I doing something wrong? – Petr Mensik Mar 15 '12 at 12:28
  • You could load the data into a staging table with the columns as CP1250 then do a secondary load that explicitly does the conversion. However, if this worked before I'd suspect that maybe there's something wonky in your configuration. – ConcernedOfTunbridgeWells Mar 15 '12 at 12:49
  • I don't have direct access to the DB, so I can't make such a change. I also doubt my boss would like this solution :-) But thanks for the advice; at least I'm closer to a solution :-) – Petr Mensik Mar 15 '12 at 13:03