ADO 2.0 fastest way to load database... OT


Discussion Topic

Return to Forum List
This thread has been locked
Messages 1 - 8 of total 8 in this topic
pdx_climber

Sport climber
portland,or
Topic Author's Original Post - Oct 17, 2007 - 04:50pm PT
Hey all you nerds!

I have a multi-million row Access database. I'm working in C# and want to get it into a DataTable as quickly as possible.

I tried 2 ways and get the same elapsed time.

// fill a DataSet via a helper, then grab its first table
theActs = OleDbHelper.ExecuteDataset(cnAct, CommandType.Text, sSQL2 + sWhere);
tblActs = theActs.Tables[0];

-OR-

// stream the rows through a reader straight into a DataTable
OleDbCommand cmdAct = new OleDbCommand(sSQL2 + sWhere, cnAct);
IDataReader rdr = cmdAct.ExecuteReader();
tblActs1.Load(rdr);

Any faster ways????

Thanks in advance.
Gary

climber
Desolation Basin, Calif.
Oct 17, 2007 - 05:09pm PT
Any faster ways????

MySQL
pdx_climber

Sport climber
portland,or
Topic Author's Reply - Oct 17, 2007 - 05:37pm PT
So it's not slow because of the ADO methods, but because of Access? Is that because MySQL doesn't use OleDb?

A good lead, thanks.
G_Gnome

Trad climber
In the mountains... somewhere...
Oct 17, 2007 - 06:16pm PT
Yep, Microsoft's SQL implementation is seriously slow. The other way to speed it up is to put the DB on a machine running Advanced Server 2003 with at least 8 gigs of RAM. That way it can load the whole thing into RAM.
TradIsGood

Half fast climber
the Gunks end of the country
Oct 17, 2007 - 07:02pm PT
Just for jollies, why not apply a little advanced thinking...

How many bytes per row? How many rows? What kind of data transfer rate can you achieve on your hard drive?

How long would it take to read/write data from hard-drive?

How many times that amount of time is your database load?

How are you going to use the data? For example, if you are just calculating statistics on the whole data set, using any type of database system is silly.
pdx_climber

Sport climber
portland,or
Topic Author's Reply - Oct 17, 2007 - 07:17pm PT
The data is being read into several C# data structures, which are then added to a series of linked lists in a C DLL (via the GAC) for use in an industrial simulation. My boss wants to know what the largest size simulation is...

I can get all the data to load in 5.5 hours by reading in the data from Access in blocks, and transferring each block to the linked lists (72 blocks).

The fewer blocks, the faster it runs, because the time drain is talking to Access. But if I have too few blocks (e.g. 6), a block can go over 1,400,000 K, and I get a memory exception error.

So the speed/memory requirements of talking to the database are the crux. Sounds like a new database is the way to go, but I will probably not have that choice....

Thanks for the input thus far. (edited to Ks, thank TIG)
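The block-by-block loading described above can be sketched roughly like this. Everything specific here is an assumption, not from the post: the connection string, the table name `Acts`, and an integer key column `ActID` to page on are all placeholders.

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class BlockLoader
{
    static void Main()
    {
        int blockSize = 100000;   // rows per block; tune so no block exceeds memory

        // Connection string, table name, and the ActID key column are
        // placeholders -- the original post does not give them.
        using (OleDbConnection cn = new OleDbConnection(
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=acts.mdb"))
        {
            cn.Open();
            for (int start = 0; ; start += blockSize)
            {
                DataTable block = new DataTable();
                using (OleDbCommand cmd = new OleDbCommand(
                    "SELECT * FROM Acts WHERE ActID >= ? AND ActID < ?", cn))
                {
                    // Jet/OleDb parameters are positional; names are ignored
                    cmd.Parameters.AddWithValue("p0", start);
                    cmd.Parameters.AddWithValue("p1", start + blockSize);
                    using (IDataReader rdr = cmd.ExecuteReader())
                        block.Load(rdr);
                }

                if (block.Rows.Count == 0)
                    break;                     // past the last block

                TransferToLinkedLists(block);  // hand off to the C DLL
                block.Dispose();               // free this block before reading the next
            }
        }
    }

    // Stand-in for the P/Invoke call into the C DLL described in the post.
    static void TransferToLinkedLists(DataTable block) { }
}
```

Paging on a key range assumes `ActID` values are dense; if they have gaps, an empty mid-range block would end the loop early, and you'd want to page on row counts (or the max key seen) instead.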
TradIsGood

Half fast climber
the Gunks end of the country
Oct 17, 2007 - 07:24pm PT
1,400,000 MB?


At 10 MB/s you are looking at 39 hours. I think you need to check your math and the size of your drive(s).
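The 39-hour figure is just the transfer-rate arithmetic, assuming the total really were 1,400,000 MB and the drive sustained 10 MB/s:

```csharp
using System;

class TransferTime
{
    static void Main()
    {
        double totalMB = 1400000.0;   // 1,400,000 MB, the figure questioned above
        double rateMBperSec = 10.0;   // assumed sustained disk throughput
        double hours = totalMB / rateMBperSec / 3600.0;
        Console.WriteLine("{0:F1} hours", hours);   // prints "38.9 hours"
    }
}
```

If the post really meant 1,400,000 K (i.e. ~1.4 GB per block), the same arithmetic gives about 2.3 minutes per block, which is why the units matter here.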
healyje

Trad climber
Portland, Oregon
Oct 18, 2007 - 12:03am PT
Getting an OleDbDataReader is generally faster than filling a DataSet - but it requires the connection to remain open. Possibly more important is what SQL text is in 'sSQL2' and 'sWhere'.

I'm also in PDX and you can email me if you like - just put 'SUPERTOPO:' at the start of the subject line if you do.
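A minimal sketch of the reader approach described above: consume rows as they stream in, with the connection held open for the reader's whole lifetime, instead of buffering everything in a DataSet first. The connection string, SQL, and column layout are assumptions for illustration only.

```csharp
using System;
using System.Data.OleDb;

class ReaderSketch
{
    static void Main()
    {
        // The connection must stay open for as long as the reader is in use.
        using (OleDbConnection cn = new OleDbConnection(
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=acts.mdb"))
        using (OleDbCommand cmd = new OleDbCommand(
            "SELECT ActID, ActName FROM Acts", cn))   // placeholder SQL
        {
            cn.Open();
            using (OleDbDataReader rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    // Process each row as it arrives -- no DataSet/DataTable buffering.
                    int id = rdr.GetInt32(0);
                    string name = rdr.GetString(1);
                }
            }
        }
    }
}
```

The trade-off versus DataTable.Load is that you avoid materializing millions of rows in memory, but you cannot revisit rows once the reader has moved past them.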