c++ - Performance of table access -


We have an application written entirely in C / Pro*C. To improve performance, we preload some tables into memory: we take some input fields and look up the corresponding output fields in the table.

Usually a table holds approximately 30,000 entries, and at most around 100,000.

But if the table grows to around 10 million entries, I suspect it will hurt the application's performance badly.

Am I wrong about this? If it really does affect performance, is there any way to keep the application's performance stable?

Put differently: if the table really does reach 10 million rows, how will the application cope?

If you leave the table unsorted, search time grows in proportion to the table size. Assuming you are doing nothing else wrong, your example (30K vs. 1M entries) means roughly a 33x longer search. I assume here that you iterate over the table sequentially (i++ style).
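A minimal sketch of that sequential scan, with a hypothetical row layout (the `key` and `value` field names are assumptions, not from the original code):

```c
#include <stddef.h>

/* Hypothetical preloaded row: one input (key) field, one output field. */
struct row {
    int  key;
    char value[32];
};

/* Linear scan: visits elements one by one until a match, so the
 * average lookup cost grows in direct proportion to the table size. */
const struct row *find_linear(const struct row *table, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (table[i].key == key)
            return &table[i];
    return NULL;  /* not found */
}
```

With 30K rows this scan is typically cheap; at 10 million rows every miss costs 10 million comparisons.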

However, if you can sort the table, you can cut the search time dramatically. A search algorithm over sorted data does not have to examine every element: it can use auxiliary structures (trees, hashes, etc.) that are very fast to search, and then jump directly to the wanted element, or at least to a good estimate of where it sits in the main table.

Of course, this comes at a cost: the table must be kept sorted, either as you add or remove elements, or just before you search.
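One way to sketch this in plain C is with the standard library's qsort and bsearch: sort once after preloading, then do O(log n) lookups. The row layout is the same hypothetical one as above, not the original application's:

```c
#include <stdlib.h>

struct row {
    int  key;
    char value[32];
};

/* Comparison used for both sorting and binary search. */
static int cmp_row(const void *a, const void *b)
{
    const struct row *ra = a, *rb = b;
    return (ra->key > rb->key) - (ra->key < rb->key);
}

/* Pay the sorting cost once, right after preloading the table. */
void index_table(struct row *table, size_t n)
{
    qsort(table, n, sizeof *table, cmp_row);
}

/* Binary search: about 24 comparisons for 10 million rows,
 * instead of millions of iterations per lookup. */
const struct row *find_sorted(const struct row *table, size_t n, int key)
{
    struct row probe = { .key = key };
    return bsearch(&probe, table, n, sizeof *table, cmp_row);
}
```

If the table changes frequently, re-sorting on every insert is expensive; in that case a hash table or balanced tree keyed on the input fields is usually the better trade-off.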
