The amount of data processed by modern databases grows steadily, which makes database performance an acute problem: Insert, Update, and Delete operations have to be performed as fast as possible. Devart therefore provides several solutions for speeding up the processing of huge amounts of data. For example, inserting large portions of data into a database is supported by the Loader. Unfortunately, the Loader can only insert data; it cannot be used for updating or deleting.
The new version of Devart Delphi Data Access Components introduces a new mechanism for large data processing: Batch Operations. The idea is that only one parametrized modifying SQL statement is executed, and the multitude of changes comes from the fact that each parameter of such a query holds not a single value but a whole array of values. This approach speeds up data operations dramatically. Moreover, in contrast to the Loader, Batch operations can be used not only for insertion, but for modification and deletion as well.
Let's take a closer look at the capabilities of Batch operations using the example of a BATCH_TEST table that contains columns of the most popular data types.
BATCH_TEST table creation scripts
The scripts below create the table for the various supported database servers; use the one that matches your server's SQL dialect.
CREATE TABLE BATCH_TEST
(
ID NUMBER(9,0),
F_INTEGER NUMBER(9,0),
F_FLOAT NUMBER(12,7),
F_STRING VARCHAR2(250),
F_DATE DATE,
CONSTRAINT PK_BATCH_TEST PRIMARY KEY (ID)
)
CREATE TABLE BATCH_TEST
(
ID INT,
F_INTEGER INT,
F_FLOAT FLOAT,
F_STRING VARCHAR(250),
F_DATE DATETIME,
CONSTRAINT PK_BATCH_TEST PRIMARY KEY (ID)
)
CREATE TABLE BATCH_TEST
(
ID INTEGER,
F_INTEGER INTEGER,
F_FLOAT DOUBLE PRECISION,
F_STRING VARCHAR(250),
F_DATE DATE,
CONSTRAINT PK_BATCH_TEST PRIMARY KEY (ID)
)
CREATE TABLE BATCH_TEST
(
ID INTEGER NOT NULL PRIMARY KEY,
F_INTEGER INTEGER,
F_FLOAT FLOAT,
F_STRING VARCHAR(250),
F_DATE DATE
)
CREATE TABLE BATCH_TEST
(
ID INT,
F_INTEGER INT,
F_FLOAT FLOAT,
F_STRING VARCHAR(250),
F_DATE DATETIME,
CONSTRAINT PK_BATCH_TEST PRIMARY KEY (ID)
)
CREATE TABLE BATCH_TEST
(
ID INTEGER,
F_INTEGER INTEGER,
F_FLOAT FLOAT,
F_STRING VARCHAR(250),
F_DATE DATETIME,
CONSTRAINT PK_BATCH_TEST PRIMARY KEY (ID)
)
Batch operations execution
To insert records into the BATCH_TEST table, we use the following SQL query:
INSERT INTO BATCH_TEST VALUES (:ID, :F_INTEGER, :F_FLOAT, :F_STRING, :F_DATE)
When a simple insertion operation is used, the query parameter values look as follows:
Parameters:

| :ID | :F_INTEGER | :F_FLOAT | :F_STRING | :F_DATE |
| --- | --- | --- | --- | --- |
| 1 | 100 | 2.5 | 'String Value 1' | 01.09.2015 |
After the query execution, one record will be inserted into the BATCH_TEST table.
When using Batch operations, the query and its parameters remain unchanged; however, each parameter now holds a whole array of values:
Parameters:

| :ID | :F_INTEGER | :F_FLOAT | :F_STRING | :F_DATE |
| --- | --- | --- | --- | --- |
| 1 | 100 | 2.5 | 'String Value 1' | 01.09.2015 |
| 2 | 200 | 3.15 | 'String Value 2' | 01.01.2000 |
| 3 | 300 | 5.08 | 'String Value 3' | 09.09.2010 |
| 4 | 400 | 7.5343 | 'String Value 4' | 10.10.2015 |
| 5 | 500 | 0.4555 | 'String Value 5' | 01.09.2015 |
Now 5 records are inserted into the table in a single query execution.
How to implement a Batch operation in the code?
Batch INSERT operation sample
Let's try to insert 1000 rows into the BATCH_TEST table using a Batch INSERT operation:
var
i: Integer;
begin
// describe the SQL query
Query1.SQL.Text := 'INSERT INTO BATCH_TEST VALUES (:ID, :F_INTEGER, :F_FLOAT, :F_STRING, :F_DATE)';
// define the parameter types passed to the query:
Query1.Params[0].DataType := ftInteger;
Query1.Params[1].DataType := ftInteger;
Query1.Params[2].DataType := ftFloat;
Query1.Params[3].DataType := ftString;
Query1.Params[4].DataType := ftDateTime;
// specify the array dimension:
Query1.Params.ValueCount := 1000;
// populate the array with parameter values:
for i := 0 to Query1.Params.ValueCount - 1 do begin
Query1.Params[0][i].AsInteger := i + 1;
Query1.Params[1][i].AsInteger := i + 2000 + 1;
Query1.Params[2][i].AsFloat := (i + 1) / 12;
Query1.Params[3][i].AsString := 'Values ' + IntToStr(i + 1);
Query1.Params[4][i].AsDateTime := Now;
end;
// insert 1000 rows into the BATCH_TEST table
Query1.Execute(1000);
end;
This command inserts 1000 rows into the table with one SQL query, using the prepared array of parameter values. The number of inserted rows is defined by the Iters argument of the Execute(Iters: integer; Offset: integer = 0) method. In addition, you can pass a second argument, Offset (0 by default), which points to the array element the Batch operation starts from.
So we can insert the 1000 records into the BATCH_TEST table in several ways (a generic chunked loop is sketched after these examples):
All 1000 rows at a time:
Query1.Execute(1000);
2×500 rows:
// insert first 500 rows
Query1.Execute(500, 0);
// insert next 500 rows
Query1.Execute(500, 500);
500 rows, then 300, and finally 200:
// insert 500 rows
Query1.Execute(500, 0);
// insert next 300 rows starting from 500
Query1.Execute(300, 500);
// insert next 200 rows starting from 800
Query1.Execute(200, 800);
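For a very large parameter array, the same Iters and Offset arguments make it easy to execute the batch in fixed-size portions inside a loop. Here is a minimal sketch built only on the Execute(Iters, Offset) method and Params.ValueCount shown above; the ChunkSize constant and the value 500 are arbitrary choices for illustration:

const
  ChunkSize = 500; // arbitrary portion size
var
  Offset, Portion: Integer;
begin
  Offset := 0;
  while Offset < Query1.Params.ValueCount do begin
    // number of rows still left in the parameter array
    Portion := Query1.Params.ValueCount - Offset;
    if Portion > ChunkSize then
      Portion := ChunkSize;
    // execute the next portion of the batch, starting at Offset
    Query1.Execute(Portion, Offset);
    Inc(Offset, Portion);
  end;
end;

Splitting the batch into portions like this also makes it easy to report progress between portions.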
Batch UPDATE operation sample
With Batch operations, modifying all 1000 rows of the BATCH_TEST table is just as simple. Note that parameters are indexed in the order they appear in the SQL text, so here index 4 corresponds to :OLDID in the WHERE clause:
var
i: Integer;
begin
// describe the SQL query
Query1.SQL.Text := 'UPDATE BATCH_TEST SET F_INTEGER=:F_INTEGER, F_FLOAT=:F_FLOAT, F_STRING=:F_STRING, F_DATE=:F_DATE WHERE ID=:OLDID';
// define parameter types passed to the query:
Query1.Params[0].DataType := ftInteger;
Query1.Params[1].DataType := ftFloat;
Query1.Params[2].DataType := ftString;
Query1.Params[3].DataType := ftDateTime;
Query1.Params[4].DataType := ftInteger;
// specify the array dimension:
Query1.Params.ValueCount := 1000;
// populate the array with parameter values:
for i := 0 to 1000 - 1 do begin
Query1.Params[0][i].AsInteger := i - 2000 + 1;
Query1.Params[1][i].AsFloat := (i + 1) / 100;
Query1.Params[2][i].AsString := 'New Values ' + IntToStr(i + 1);
Query1.Params[3][i].AsDateTime := Now;
Query1.Params[4][i].AsInteger := i + 1;
end;
// update 1000 rows in the BATCH_TEST table
Query1.Execute(1000);
end;
Batch DELETE operation sample
Deleting 1000 rows from the BATCH_TEST table looks like this:
var
i: Integer;
begin
// describe the SQL query
Query1.SQL.Text := 'DELETE FROM BATCH_TEST WHERE ID=:ID';
// define parameter types passed to the query:
Query1.Params[0].DataType := ftInteger;
// specify the array dimension
Query1.Params.ValueCount := 1000;
// populate the arrays with parameter values
for i := 0 to 1000 - 1 do
Query1.Params[0][i].AsInteger := i + 1;
// delete 1000 rows from the BATCH_TEST table
Query1.Execute(1000);
end;
Performance comparison
The BATCH_TEST table example lets us compare the execution speed of standard database operations and Batch operations:
Execution time for 25 000 records (the lower, the better):

| DAC Name | Operation Type | Standard Operation (sec.) | Batch Operation (sec.) |
| --- | --- | --- | --- |
| ODAC / UniDAC (with OracleUniProvider) | Insert | 17.64 | 0.59 |
|  | Update | 18.28 | 1.20 |
|  | Delete | 16.19 | 0.45 |
| LiteDAC / UniDAC (with SQLiteUniProvider) | Insert | 2292 | 0.92 |
|  | Update | 2535 | 2.63 |
|  | Delete | 2175 | 0.44 |
| PgDAC / UniDAC (with PostgreSQLUniProvider) | Insert | 346.7 | 1.69 |
|  | Update | 334.4 | 4.59 |
|  | Delete | 373.7 | 2.05 |
| IBDAC / UniDAC (with InterBaseUniProvider) | Insert | 55.4 | 3.03 |
|  | Update | 81.9 | 3.58 |
|  | Delete | 61.3 | 0.91 |
| MyDAC / UniDAC (with MySQLUniProvider) | Insert | 1138 | 11.02 |
|  | Update | 1637 | 26.72 |
|  | Delete | 1444 | 17.66 |
| SDAC / UniDAC (with SQLServerUniProvider) | Insert | 19.19 | 3.09 |
|  | Update | 20.22 | 7.67 |
|  | Delete | 18.28 | 3.14 |
It should be noted that the results may differ when modifying the same table on different database servers, since execution speed depends on the settings of a particular server, its current workload, throughput, network connection, etc.
What you shouldn't do when accessing parameters in Batch operations!
When populating the array and inserting records above, we accessed query parameters by index. It might seem more natural to access parameters by name:
for i := 0 to 9999 do begin
Query1.Params.ParamByName('ID')[i].AsInteger := i + 1;
Query1.Params.ParamByName('F_INTEGER')[i].AsInteger := i + 2000 + 1;
Query1.Params.ParamByName('F_FLOAT')[i].AsFloat := (i + 1) / 12;
Query1.Params.ParamByName('F_STRING')[i].AsString := 'Values ' + IntToStr(i + 1);
Query1.Params.ParamByName('F_DATE')[i].AsDateTime := Now;
end;
However, this populates the parameter array more slowly, because the ordinal number of each parameter has to be looked up by its name on every loop iteration. When the loop is executed 10,000 times, the performance loss can become quite significant.
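If you prefer the readability of names, a common compromise is to look each parameter up by name once, before the loop, and then index the cached references. This is only a sketch: the TDAParam type name is an assumption (it is the parameter class used by current Devart DACs, typically declared in the DBAccess unit; check the exact type and unit for your product and version):

var
  i: Integer;
  // TDAParam is an assumption here; use the parameter class of your Devart product
  PId, PInt, PFloat, PStr, PDate: TDAParam;
begin
  // each name lookup happens only once, outside the loop
  PId    := Query1.Params.ParamByName('ID');
  PInt   := Query1.Params.ParamByName('F_INTEGER');
  PFloat := Query1.Params.ParamByName('F_FLOAT');
  PStr   := Query1.Params.ParamByName('F_STRING');
  PDate  := Query1.Params.ParamByName('F_DATE');
  // assumes Params.ValueCount has already been set (e.g. to 10000) as in the samples above
  for i := 0 to Query1.Params.ValueCount - 1 do begin
    PId[i].AsInteger    := i + 1;
    PInt[i].AsInteger   := i + 2000 + 1;
    PFloat[i].AsFloat   := (i + 1) / 12;
    PStr[i].AsString    := 'Values ' + IntToStr(i + 1);
    PDate[i].AsDateTime := Now;
  end;
end;

This keeps the readability of names while performing each name lookup only once instead of once per row.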
thank you very much
and thanks also to mariadb
You are welcome;)
That is an impressive speed increase. I switched to Firebird because a Select ran very slowly on a large table I used; Firebird crunched that like it was nothing. Then I found out that bulk Firebird inserts are very slow: even when using "EXECUTE BLOCK" for ~50k records one could watch the progress meter do its thing, but this is super fast.
My code inserts in 1k record chunks, but I set that to 50k for a test and it was nearly instant.
Bravo..
Thank you for kind words. However, we think there is still much to develop.
Hello,
Does it work with the MSAccess data provider?
Thank you!
The latest UniDAC version 6.2.9 supports Batch operations for the MSAccess data provider.
I have to report a problem with batch insert: when I try to batch insert data with Chinese characters, the Chinese characters are NOT stored correctly in the MySQL DB.
But if I insert just one record, it's correct.
Why??
Hi Sir:
I have a problem when I batch insert Chinese characters (the Chinese characters don't show correctly).
But the weird thing is that when I insert just one record, the Chinese characters show correctly.
Does anybody have the same problem?
Hi, Jeff! Thank you for your notice. We will investigate the issue with Chinese characters on batch insert to MySQL and post here about the results. Are you using MyDAC or UniDAC? And what version of the product are you using?
Thanks DAC Team,
I am using the MyDAC 8.6 trial version.
I am evaluating the option of using MySQL with Delphi.
thanks.
For the record,
I tried using the MyLoader component to implement batch insert, and it's working fine!
I save the data into an array, and when the PutData event is triggered, I loop through the array and call PutColumnData.
Thanks.
Hello,
I tested batch update in one situation but couldn't get it to work.
Is the procedure exactly the same when you have value parameters and also parameters in the WHERE clause? Or do I just have to include the key values as well?
-Tee-
I’m using SDAC ver 6.10.21.
I wanted to use the Batch UPDATE example, but TMSQuery doesn't seem to have the 'ValueCount' property -> MSQuery1.Params.ValueCount.
Also, the Params 2D array doesn't seem to exist:
MSQuery1.Params[0][i].AsInteger := i + 1;
MSQuery1.Params[1][i].AsInteger := i + 2000 + 1;
Am I using an incorrect version?
Thanks.
Hello, Pini.
Support for Batch operations wasn't added yet in SDAC version 6.10.21. It was added in SDAC 7.2.7, released on 09.09.2016.
Batch Test not working? I get the error "The SQL statement is not allowable for a bulk".
DB :MySQL
CREATE TABLE `afaturalar` (
`AFaturaID` int(11) NOT NULL AUTO_INCREMENT,
`AktarimID` int(11) DEFAULT NULL,
`FirmaSicilNo` varchar(20) DEFAULT NULL,
`Yil` smallint(6) DEFAULT NULL,
`Tarih` date DEFAULT NULL,
`Seri` varchar(30) DEFAULT '',
`Sira` varchar(30) DEFAULT '',
`AltFirmaSicilNo` varchar(20) DEFAULT NULL,
`AltFirmaUnvan` varchar(400) DEFAULT '',
`Hizmet` varchar(100) DEFAULT '',
`Miktar` varchar(30) DEFAULT '',
`Matrah` decimal(12,2) DEFAULT '0.00',
`Kdv` decimal(12,2) DEFAULT '0.00',
`GGBTescilNo` varchar(50) DEFAULT '',
`KdvDonem` varchar(20) DEFAULT '',
`IhracatFirmaSicilNo` varchar(20) DEFAULT '',
`ExcelSatirNo` int(11) DEFAULT NULL,
PRIMARY KEY (`AFaturaID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
Delphi Code;
Qry.SQL.Text := 'INSERT INTO afaturalar VALUES (DEFAULT, :AktarimID, :FirmaSicilNo, :Yil, :Tarih, :Seri, :Sira, :AltFirmaSicilNo, :AltFirmaUnvan, :Hizmet, :Miktar, :Matrah, :Kdv, :GGBTescilNo, :KdvDonem, :IhracatFirmaSicilNo, :ExcelSatirNo)';
Qry.Params[0].DataType := ftInteger; //AktarimID
Qry.Params[1].DataType := ftString; //FirmaSicilNo
Qry.Params[2].DataType := ftSmallint; //Yil
Qry.Params[3].DataType := ftDate; //Tarih
Qry.Params[4].DataType := ftString; //Seri
Qry.Params[5].DataType := ftString; //Sira
Qry.Params[6].DataType := ftString; //AltFirmaSicilNo
Qry.Params[7].DataType := ftString; //AltFirmaUnvan
Qry.Params[8].DataType := ftString; //Hizmet
Qry.Params[9].DataType := ftString; //Miktar
Qry.Params[10].DataType := ftFloat; //Matrah
Qry.Params[11].DataType := ftFloat; //Kdv
Qry.Params[12].DataType := ftString; //GGBTescilNo
Qry.Params[13].DataType := ftString; //KdvDonem
Qry.Params[14].DataType := ftString; //IhracatFirmaSicilNo
Qry.Params[15].DataType := ftInteger; //ExcelSatirNo
Qry.Params.ValueCount := 1000;
for i := 0 to Qry.Params.ValueCount - 1 do begin
Qry.Params[0][i].AsInteger := i;
Qry.Params[1][i].AsString := i.ToString;
Qry.Params[2][i].AsSmallInt := i;
Qry.Params[3][i].AsDate := Now;
Qry.Params[4][i].AsString := i.ToString;
Qry.Params[5][i].AsString := i.ToString;
Qry.Params[6][i].AsString := i.ToString;
Qry.Params[7][i].AsString := i.ToString;
Qry.Params[8][i].AsString := i.ToString;
Qry.Params[9][i].AsString := i.ToString;
Qry.Params[10][i].AsFloat := i;
Qry.Params[11][i].AsFloat := i;
Qry.Params[12][i].AsString := i.ToString;
Qry.Params[13][i].AsString := i.ToString;
Qry.Params[14][i].AsString := i.ToString;
Qry.Params[15][i].AsInteger := i;
end;
Qry.Execute(1000);
I tried it, but it's not working.
Hi,
I am using batch insert, which works fine when I have multiple rows (i.e. rowcount > 1).
The statement is Query1.Execute(rowcount);
However, it throws an exception when rowcount = 1 (because I have only one row in the batch and no remaining rows to insert). The exception is: "Could not convert variant of type (Array Variant) into type (OleStr)". Is there any unified way to use the same batch insert facility to handle the remaining single row with the same command?
Hello, Tanvir!
Please specify the name and exact version of our product, which you are using when executing Batch Insert.
Does this approach to batch update/insert work with BLOB fields? (I'm using the latest UniDAC product to connect to an SQLite database.)
Hello, Mulham!
Yes, you can use Batch operations when working with BLOB fields in UniDAC with an SQLite database.