Discussion:
Inserting new records into a large ADO table
Edward Diener
2005-05-22 13:42:01 UTC
In my application I need to periodically insert a number of records into a
fairly large ADO table. The number of records will average about 1000, and
the table may have about 200,000 records initially. If I create a TADOTable
for my table then, as I understand it, the table must be populated with all
the records when I make it Active. As I insert records into this table,
changes go to the server's database with each insert. This seems very
expensive in terms of time and memory. Is there a better way of doing this?

I initially designed this using a client dataset so that I could apply my
inserted updates all at once using ApplyUpdates, thereby reducing traffic
back and forth to the server database. However, that consumes even more
memory: the client dataset, connected to the table through a provider, holds
another 200,000 records in memory alongside the ADO table. Any thoughts on a
better way of doing this would be appreciated. I realize that there are
always memory-speed tradeoffs and am trying to come up with the best
combination for my task.
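[For reference, the ClientDataSet arrangement described above looks roughly like this; component, table, and field names are assumptions, not taken from the post:]

```delphi
// Assumed design-time wiring:
//   ADOTable1 -> DataSetProvider1 -> ClientDataSet1
//   ADOTable1.TableName         := 'MyLargeTable';
//   DataSetProvider1.DataSet    := ADOTable1;
//   ClientDataSet1.ProviderName := 'DataSetProvider1';

procedure TForm1.InsertBatch;
var
  i: Integer;
begin
  ClientDataSet1.Open;  // fetches the full table through the provider
  for i := 1 to 1000 do
  begin
    ClientDataSet1.Append;
    ClientDataSet1.FieldByName('Value').AsInteger := i;
    ClientDataSet1.Post; // cached locally in Delta, not yet on the server
  end;
  // Send all cached inserts to the server in one call;
  // the argument 0 means raise on the first failed update.
  ClientDataSet1.ApplyUpdates(0);
end;
```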
Bill Todd
2005-05-22 14:50:47 UTC
If you are inserting the records in code why not use a Command object
and a parameterized INSERT statement?
--
Bill Todd (TeamB)
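[A minimal sketch of this suggestion; the table, column, and component names are assumptions:]

```delphi
procedure TForm1.InsertRows;
var
  i: Integer;
begin
  ADOCommand1.Connection := ADOConnection1;
  // Parameterized INSERT: the statement text is fixed,
  // only the parameter values change per row.
  ADOCommand1.CommandText :=
    'INSERT INTO MyLargeTable (ID, Value) VALUES (:ID, :Value)';
  for i := 1 to 1000 do
  begin
    ADOCommand1.Parameters.ParamByName('ID').Value := i;
    ADOCommand1.Parameters.ParamByName('Value').Value := i * 10;
    ADOCommand1.Execute; // one round trip per row
  end;
end;
```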
Edward Diener
2005-05-22 16:03:37 UTC
Post by Bill Todd
If you are inserting the records in code why not use a Command object
and a parameterized INSERT statement?
That is a valid idea.

I can do this, but it means traffic to the database server for every
insert, since each INSERT statement is a command to the server. If my
database is on another machine, that means plenty of network traffic.

This is what I meant by the speed-versus-size tradeoff. When I use a
client dataset I can apply my updates all at once. But of course that
means a much larger footprint, keeping my client dataset in memory as
well as reading in my original table.

I guess there is no right answer. I originally designed it thinking that
speed was the most important issue, but with the memory taken up by my
original table and client dataset I am having second thoughts.
Bill Todd
2005-05-22 16:31:28 UTC
Huh? When you call CDS.ApplyUpdates the DataSetProvider creates and
executes one SQL statement for each change in the CDS.Delta property.
If you insert 1,000 rows the DSP will create and send to the server
1,000 INSERT statements.

There is no magic way to send a SQL database server one statement that
inserts 1,000 rows.
--
Bill Todd (TeamB)
Edward Diener
2005-05-22 23:08:55 UTC
Post by Bill Todd
Huh? When you call CDS.ApplyUpdates the DataSetProvider creates and
executes one SQL statement for each change in the CDS.Delta property.
If you insert 1,000 rows the DSP will create and send to the server
1,000 INSERT statements.
There is no magic way to send a SQL database server one statement that
inserts 1,000 rows.
OK, thanks for the info.
Viatcheslav V. Vassiliev
2005-05-23 08:34:10 UTC
Use a prepared command - in this case only the parameter values will be sent
over the network.
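[A sketch of the prepared-command variant; component and table names are assumptions. Setting Prepared before the loop lets the statement be parsed once, so each Execute only ships parameter values:]

```delphi
ADOCommand1.CommandText :=
  'INSERT INTO MyLargeTable (ID, Value) VALUES (:ID, :Value)';
ADOCommand1.Prepared := True; // statement is parsed/compiled once
for i := 1 to 1000 do
begin
  ADOCommand1.Parameters.ParamByName('ID').Value := i;
  ADOCommand1.Parameters.ParamByName('Value').Value := i * 10;
  ADOCommand1.Execute; // only parameter values cross the network now
end;
```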

//------------------------------------------
Regards,
Vassiliev V. V.
http://www.managed-vcl.com - using .Net objects in Delphi for Win32 +
ADO.Net
http://www.oledbdirect.com - The fastest way to access MS SQL Server,
MS Jet (Access) and Interbase (through OLEDB)
Post by Edward Diener
Post by Bill Todd
Huh? When you call CDS.ApplyUpdates the DataSetProvider creates and
executes one SQL statement for each change in the CDS.Delta property.
If you insert 1,000 rows the DSP will create and send to the server
1,000 INSERT statements.
There is no magic way to send a SQL database server one statement that
inserts 1,000 rows.
OK, thanks for the info.
Edward Diener
2005-05-23 10:50:37 UTC
Post by Viatcheslav V. Vassiliev
Use a prepared command - in this case only the parameter values will be sent
over the network.
When I insert a row the data for the row still has to be passed over the
network.
Vitali Kalinin
2005-05-23 11:17:42 UTC
Post by Edward Diener
When I insert a row the data for the row still has to be passed over the
network.
So you think that MS SQL Server could be smart enough to predict the actual
data you want to insert 8-)? AFAIK the fastest approach to insert something
into MS SQL Server is a bulk insert. There are several ways to use it: DTS,
the BULK INSERT T-SQL statement, or the IRowsetFastLoad interface.
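[For instance, the BULK INSERT T-SQL route can be driven from Delphi with a plain command. The file path, table name, and delimiters below are assumptions; note the data file must be readable by the SQL Server machine, not the client:]

```delphi
// One statement loads the whole file server-side.
ADOCommand1.CommandText :=
  'BULK INSERT MyLargeTable ' +
  'FROM ''C:\data\rows.csv'' ' +  // path as seen by the server
  'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
ADOCommand1.Execute;
```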
deanb
2005-05-25 02:19:10 UTC
insert into A (select 1 from dual union select 2 from dual union select
3 from dual).....

Up to about 64K in one string. I forgot the syntax, but you get the
idea.

Or, use a SQL loader.

Dean
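[On MS SQL Server the DUAL table is not needed; a sketch of the same multi-row trick, with an assumed table and column, and UNION ALL to skip the duplicate-elimination cost of plain UNION:]

```delphi
// Several rows inserted in a single round trip.
ADOCommand1.CommandText :=
  'INSERT INTO MyLargeTable (ID) ' +
  'SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3';
ADOCommand1.Execute;
```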
