Re: my searchs doesn't use indexes
Isaac Cruz Ballesteros wrote:
>
> > You've probably already done this, but may as well check the obvious. =)
> > Did you add the "tac" index after you put the data in on that machine? If
> > so, did you regenerate the index file afterwards?
> >
> > Regenerating the index file might be a good thing to try regardless (stop
> > ldap, slapindex, start ldap).
>
> The "tac" index was in config file when there was no data, so the index "exists".
> I have already try slapindex, with the same result... Also, before slapindex,
> there was one tac (that did exist in database but an ldapsearch using the filter
> tac=35045610 didn't return any result, but with filter tac=35045610* did return
> the correct entry. This happened even after manually deleting and reinserting the
> entry. After slapindex, this problem was corrected.
>
> > Do your two databases have the same data? How do the sizes of the gdbm
> > files compare?
>
> They are not the same. The test database has "only" about 1,000,000 entries, and I
> can't import all the data, as the total size of the gdbm files is 4.5 GB.
>
> Thanks for the help anyway
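On the reindexing suggestion quoted above, spelled out it would look
something like this (the init script path is a guess, adjust for your
system):

    /etc/init.d/ldap stop
    # Run slapindex as the user slapd runs as, or chown the files after;
    # it rebuilds every index configured in slapd.conf.
    slapindex
    /etc/init.d/ldap start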
If you're running ldap as a non-root user (which is a good thing), make
sure all the .ldbm files are owned by that user.
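A quick check, assuming the database files live in /var/lib/ldap (the
path used below) and that slapd runs as a user named "ldap" (both are
guesses, adjust for your installation):

    # Spot files with the wrong owner, then fix them up.
    ls -l /var/lib/ldap/*.ldbm
    chown ldap /var/lib/ldap/*.ldbm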
I have had index corruption problems with OpenLDAP and GDBM. Running
slapindex would not fix the problem. The problems didn't go away until
I switched to Berkeley DB 3.3.11 as the backend database.
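If you want to try the same switch, it comes down to rebuilding OpenLDAP
against Berkeley DB instead of GDBM. A rough sketch; the configure option
and the DB install prefix are from memory, so check ./configure --help on
your version:

    # Build and install Berkeley DB 3.3.11 first (Sleepycat's default
    # prefix is /usr/local/BerkeleyDB.3.3), then rebuild OpenLDAP so
    # back-ldbm uses the Berkeley DB API rather than GDBM.
    env CPPFLAGS="-I/usr/local/BerkeleyDB.3.3/include" \
        LDFLAGS="-L/usr/local/BerkeleyDB.3.3/lib" \
        ./configure --with-ldbm-api=berkeley
    make depend && make && make install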
My experience is that GDBM files also tend to get very large very fast
if you do lots of updates. Reloading the database (stop ldap; slapcat >
foo.ldif; slapadd -c < foo.ldif; chown -R <ldap user> /var/lib/ldap;
start ldap) will shrink the files again, but only temporarily. Berkeley
DB does not have this problem.
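Spelled out, the reload looks something like the script below. The init
script path, the "ldap" user name, and the step of moving the old files
aside (so the new ones get built from scratch) are assumptions for a
typical Linux install:

    # Dump, clear, and reload the database to compact the gdbm files.
    /etc/init.d/ldap stop
    slapcat > foo.ldif               # dump the whole directory to LDIF
    mv /var/lib/ldap/*.ldbm /tmp/    # move the bloated files aside
    slapadd -c < foo.ldif            # reload; -c continues past errors
    chown -R ldap /var/lib/ldap      # slapadd ran as root, fix ownership
    /etc/init.d/ldap start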
Also, GDBM can't handle individual files larger than 2 GB. Updates that
would push a file past that limit return I/O errors and crash the slapd
process. Hopefully you run slapd multithreaded, so this will not
immediately terminate all LDAP service. If your id2entry.ldbm is getting
close to 2 GB, you might want to take steps to avoid this problem before
it happens to you.
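An easy way to keep an eye on it (same /var/lib/ldap assumption as
above):

    # Anything approaching 2 GB here is about to become a problem.
    ls -lh /var/lib/ldap/*.ldbm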
Good luck,
John