Re: Performance of large queries
Hi all,
first of all, thank you very much for your suggestions.
> In light of this I'm not sure that your code should be 3-4 times
> slower. Have you made sure (as above) that your setup means that all the
> methods you are using are marked as Essential or All? If so can you
> post an example of the type of thing you are doing just in case there is
> something we can suggest?
I'm trying to develop a new database structure for the Geniustrader
project (www.geniustrader.org) to store stock price data.
The structure of the database is quite simple:
---
package GT::DBDyn::Stock::EOD::Data::Prices;
use base 'GT::DBDyn::Stock::EOD::Data::DBI';

__PACKAGE__->table('prices');
__PACKAGE__->columns(Primary   => qw/source code datum exchange/);
__PACKAGE__->columns(Essential => qw/open high low close volume currency/);
__PACKAGE__->columns(Others    => qw/adjust/);
__PACKAGE__->has_a(source   => 'GT::DBDyn::Stock::EOD::Data::Source');
__PACKAGE__->has_a(exchange => 'GT::DBDyn::Stock::Exchange::Data::Exchange');
__PACKAGE__->has_a(code     => 'GT::DBDyn::Names::Data::Names');
---
And the query is the following:

my @ret = GT::DBDyn::Stock::EOD::Data::Prices->search(
    { @_ },
    { order_by => 'datum' },
);
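For reference, a plain DBI query over the same table might look roughly like
the sketch below (the DSN, credentials, and the $code bind value are
placeholders, not taken from the actual setup):

---
use DBI;

# Hypothetical connection details -- adjust DSN/user/password as needed.
my $dbh = DBI->connect('dbi:mysql:geniustrader', $user, $password,
                       { RaiseError => 1 });

# Fetch the primary and essential columns directly, ordered by date.
my $sth = $dbh->prepare(q{
    SELECT source, code, datum, exchange,
           open, high, low, close, volume, currency
    FROM   prices
    WHERE  code = ?
    ORDER BY datum
});
$sth->execute($code);

# Arrayref of hashrefs, one per row -- no object construction involved.
my $rows = $sth->fetchall_arrayref({});
---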
The processing takes quite a long time. The query returns about 2000
rows, and the equivalent hand-written DBI query is faster by a factor
of 3 or 4. ...
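One middle ground worth trying, if I read the Class::DBI documentation
correctly, is a predefined query via set_sql (inherited from Ima::DBI):
Class::DBI expands __ESSENTIAL__ and __TABLE__ in the SQL and, for queries
of this shape, sets up a corresponding search_* method. The name
prices_by_code below is made up for illustration:

---
__PACKAGE__->set_sql(prices_by_code => qq{
    SELECT __ESSENTIAL__
    FROM   __TABLE__
    WHERE  code = ?
    ORDER BY datum
});

# Runs the prepared statement and inflates the rows into objects.
my @prices = GT::DBDyn::Stock::EOD::Data::Prices->search_prices_by_code($code);
---

This avoids the overhead of building the WHERE clause on every call, while
still returning Class::DBI objects.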
CU, Olf
cdbi-talk 19:22 on 15 Jul 2004