Question:
Data retrieval from an Access database is very slow?
dave j
2008-11-13 07:54:32 UTC
I have a database at work that is used for data retrieval. Basically I have a table with 12 fields containing a range of data, and I have created a form that lets me search on any combination of these fields, like so:

Find records where field 1 is like marl*, field 3 is like *abb*, and field 12 is like *ben*

The search is carried out through an event procedure I have created. All works well and the required data is returned as expected.

However, the database is networked: the data files (.dbf) are stored on one PC, while the Access .mdb file sits on each individual PC. Data retrieval from these networked locations is VERY slow; it is quick when using the PC where the data files are stored.

The table contains approximately 8,000 records.

Would I be right in thinking that the networked PCs are retrieving all 8,000+ records and then searching them locally for the relevant data, hence the slow speeds?

If that is the case, can I have the PC where the data is stored carry out the search and send only the relevant results to the networked PCs?

Hope that makes sense

Thanks
Three answers:
Yuman
2008-11-13 08:37:34 UTC
Your question makes good sense and your reasoning is to the point.

No matter where the data files are, if Access is the database engine processing the data, it will scan the full table unless it can find the data in the indexes.

Even if you have indexes on the criteria columns, they are no use for a leading-wildcard pattern like *abb*: that kind of criterion still forces a full table scan.
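Yuman's point about wildcards can be illustrated outside Access. The sketch below uses SQLite (not Jet, so the details differ) to show the query planner's view of the two pattern shapes; the table name `parts`, the column `field1`, and the index name are all invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (field1 TEXT)")
conn.execute("CREATE INDEX ix_field1 ON parts (field1)")
# SQLite only applies its LIKE-prefix optimisation when LIKE is case-sensitive.
conn.execute("PRAGMA case_sensitive_like = ON")

def plan(sql):
    """Return the query plan for a statement as one string."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Literal prefix (marl*): the index can narrow the search.
print(plan("SELECT * FROM parts WHERE field1 LIKE 'marl%'"))
# Leading wildcard (*abb*): every row must be examined.
print(plan("SELECT * FROM parts WHERE field1 LIKE '%abb%'"))
```

The first plan reports a SEARCH using the index; the second reports a SCAN, which on a network share means dragging every row across the wire.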
A table of 8,000 records is rather small, even for full table scans with no indexes; that's why it is fast when the data file is local.

A solution is to import the data into a local table when Access starts up. That way you maintain central control of the data while speeding up the queries. To achieve this, put the table-import logic in an AutoExec macro. This article gives more detail on how to do it: http://www.databasedev.co.uk/text-import-macro.html
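The import-at-startup idea can be sketched in miniature. The following uses SQLite from Python rather than Access, with an invented file name and table, purely to show the pattern of copying the shared table across once and then filtering locally:

```python
import os
import sqlite3
import tempfile

# Stand-in for the shared back-end file on the networked PC (invented path).
backend_path = os.path.join(tempfile.gettempdir(), "backend_demo.db")
backend = sqlite3.connect(backend_path)
backend.execute("CREATE TABLE IF NOT EXISTS parts (field1 TEXT, field3 TEXT)")
backend.execute("DELETE FROM parts")
backend.executemany("INSERT INTO parts VALUES (?, ?)",
                    [("marlin", "cabbage"), ("other", "misc")])
backend.commit()
backend.close()

# At startup, pull the whole table across the "network" exactly once...
front = sqlite3.connect(":memory:")
front.execute("ATTACH DATABASE ? AS backend", (backend_path,))
front.execute("CREATE TABLE parts AS SELECT * FROM backend.parts")
front.execute("DETACH DATABASE backend")

# ...so every subsequent wildcard search scans only the local copy.
rows = front.execute(
    "SELECT * FROM parts WHERE field1 LIKE 'marl%' AND field3 LIKE '%abb%'"
).fetchall()
print(rows)  # [('marlin', 'cabbage')]
```

In Access the copy step would live in the AutoExec macro as the linked article describes; the trade-off is that the local copy is only as fresh as the last import.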
anonymous
2008-11-13 08:11:22 UTC
This is the big downside of Access; it sounds like you need to upgrade to a more robust database. Rather than having a split DB, you could copy the .mdb file to the network and have each user create a shortcut to it on their desktop, although that is a nightmare to manage and I'm not sure it would help much. Have you tried compacting the database? The only other option I can think of is to switch to a SQL back end and keep the Access front end.
?
2016-11-04 09:02:45 UTC
Normalizing records makes it much easier to alter your database layout later, and it also saves you from having to hunt down and repair duplicate records when the data changes. For example, suppose you have a list of names in your database and a person changes their last name (gets married or divorced, say). If the data is normalized, every reference to that person is just an ID, and you only have to change the name in one place. If it isn't, you have to search your database for every place the name appears, and if you miss one, life gets yucky.

The same applies if you want to add fields. If you need one extra line in your addresses and the data is normalized, you add the field to a single table; if not, you have to find every table where addresses exist and add a column to each. If you know the final shape of your database in advance and never update it, this matters less, particularly if the database is small. But if the data is important, or can change, normalization keeps things organized so that your data integrity, and your sanity, stay intact.
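The one-place update that normalization buys can be shown concretely. This is a generic SQLite sketch, not tied to the asker's database; all table and column names are invented:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         person_id INTEGER REFERENCES people(id),
                         item TEXT);
    INSERT INTO people VALUES (1, 'Ann Smith');
    INSERT INTO orders VALUES (10, 1, 'widget'), (11, 1, 'gadget');
""")

# The surname change touches exactly one row...
db.execute("UPDATE people SET name = 'Ann Jones' WHERE id = 1")

# ...yet every order reflects it, because orders store only the person's ID.
names = db.execute(
    "SELECT DISTINCT p.name FROM orders o JOIN people p ON p.id = o.person_id"
).fetchall()
print(names)  # [('Ann Jones',)]
```

Had the name been copied into each order row, the update would have to find and fix every copy, which is exactly the risk described above.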


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.