I have a text file on my laptop that I want to load into an Oracle database on Amazon. The only tool available is the SQL*Plus client on my laptop, from which I can connect to the database and run queries. The text file has 3,735 records and 17 columns, as shown below.
1.0,65.79,329.0,...,4.0
2.0,224.9,146.0,...,10.0
3.0,113.0,113.0,...,9.0
4.0,175.4,28.0,...,7.0

I decide to create a SQL script file that contains 3,735 INSERT statements, connect to the database from the SQL*Plus client, and run the script. I use the Linux awk command to build an INSERT statement for each record, as shown in the script below.
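The INSERT statements assume the target table t_data already exists on the Amazon database. A minimal sketch of the table, assuming 17 NUMBER columns with hypothetical names c1 through c17, could look like this:

create table t_data (
  -- hypothetical column names; the real table may use different names and types
  c1  number, c2  number, c3  number, c4  number, c5  number, c6  number,
  c7  number, c8  number, c9  number, c10 number, c11 number, c12 number,
  c13 number, c14 number, c15 number, c16 number, c17 number
);

With the table in place, the awk one-liner turns each line of x.txt into one INSERT statement: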
$ cat x.txt | awk '{print "insert into t_data values(",$0,");"}' > insert_data2.sql
$ cat insert_data2.sql
insert into t_data values( 1.0,65.79,329.0,...,4.0 );
insert into t_data values( 2.0,224.9,146.0,...,10.0 );
insert into t_data values( 3.0,113.0,113.0,...,9.0 );
insert into t_data values( 4.0,175.4,28.0,...,7.0 );
.................

I open Notepad and add "begin" at the beginning of insert_data2.sql and "end; /" at the end to wrap the INSERT statements in a PL/SQL block. This way, all of the inserts are treated as one transaction: the 3,735 INSERT statements either all execute or none of them do. I don't want data partially inserted.
begin
insert into t_data values( 1.0,65.79,329.0,...,4.0 );
insert into t_data values( 2.0,224.9,146.0,...,10.0 );
insert into t_data values( 3.0,113.0,113.0,...,9.0 );
insert into t_data values( 4.0,175.4,28.0,...,7.0 );
.................
end;
/

I connect to the Oracle database on Amazon and run insert_data2.sql. It takes less than 16 seconds to insert the 3,735 records. Not bad.
SQL> @insert_data2

PL/SQL procedure successfully completed.

Elapsed: 00:00:15.93

SQL> select count(1) from t_data;

  COUNT(1)
----------
      3735

Elapsed: 00:00:00.39
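One detail worth noting: an anonymous PL/SQL block does not commit on its own, and there is no COMMIT inside this block. Assuming the SQL*Plus defaults (AUTOCOMMIT off), the count query above sees the rows only because it runs in the same session; the inserts become permanent once I commit explicitly:

SQL> commit;

Commit complete.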