Database fills up too quickly

Nbonini
Posts: 2
Joined: Tue Oct 28, 2003 10:04 pm
Location: Brooklyn
Contact:

Database fills up too quickly

Post by Nbonini »

I don't know why the database is limited to 400MB even when I change the limit in the settings. Frequently I get database overruns and the whole application craps out. If I get headers for all of the newsgroups I'm subscribed to, I end up with about 1800 tasks, most of them for binaries newsgroups. Is the solution to create separate databases? I was hoping to avoid that, as I like to have everything in one place. If my problem makes sense to you, let me know if you have any fixes. I'm getting irritated at downloading the same headers over and over again.

One more thing, my news servers are from Newsfeeds.com, so I have access to about 20 servers with HUGE retention. Maybe this is more than NewsPro was designed to handle, but the daily database crashes are pissing me off.

Thanks,

Nathaniel Bonini
blackdog56
Posts: 106
Joined: Thu Feb 27, 2003 7:33 pm

Post by blackdog56 »

Yes, it is well documented. Check this thread http://216.194.102.140/newspro/phpBB2/v ... .php?t=499 for the latest discussion, or search the forum for "multiple databases".
Nbonini

Thanks

Post by Nbonini »

I thought there might be a more elegant solution. Thanks for the help.

Nathaniel
bruce73
Posts: 117
Joined: Tue Mar 04, 2003 11:39 pm

Post by bruce73 »

I use Newsfeeds as well and kept putting off basically starting over with multiple databases, but I'm glad I finally did. Not only does it speed things up (even running multiple instances), but I figure that if one instance does crash, I only have that one to re-download headers for.

I don't know how you handle getting headers now, but when I was using just one database, this is what I did to keep it as small as possible: I did a refresh of the newsgroup list for all servers, then found the server with the best retention for each newsgroup and used just that one to pull headers from, keeping the local retention to 1-3 days. I figured I would see most everything, and since I deal mostly with large files, if any parts were missing I could do an XPAT search. I enabled message-ID retrieval for the other servers and generally had no problems getting what I wanted.

Just an idea, if you don't want to go through the hassle of creating more than one database.
murgoob
Posts: 24
Joined: Thu Mar 13, 2003 4:48 am

Post by murgoob »

I've got a Newsfeeds account, so I know your pain when trying to get headers from all 20-something servers they've got. NewsPro crashed on me quite a bit before I got the hang of things.

I've found that if I limit header downloads to just a couple of servers, I get far fewer database crashes.

I have the four more generic ones set to get new headers each hour: spamkiller, anonymous, goliath-west, and text-west.

All the rest are set to message-direct mode only.

Even with only four servers getting headers, I find I have more than enough headers to go through and find what interests me. If I find something incomplete (very rare), I'll do an XPAT search on it. The XPAT search covers the other servers regardless of whether they're set to "suppress headers", and there's a 99% chance that one of the more specialized servers will be able to complete the post.
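For reference, XPAT is a standard NNTP extension (RFC 2980) that pattern-matches a header across a range of articles on the server, which is why it can find parts on servers you never pull headers from. A rough Python sketch of what such a search sends on the wire (the server and group names below are placeholders, not anything from this thread; Python's nntplib doesn't expose XPAT, so this talks over a raw socket):

```python
import socket

def build_xpat(header, article_range, pattern):
    # XPAT syntax per RFC 2980: XPAT <header> <range|message-id> <pattern>
    return "XPAT %s %s %s\r\n" % (header, article_range, pattern)

def xpat_search(server, group, header, article_range, pattern, port=119):
    """Select a group, run XPAT, and return the matching lines.

    Each returned line looks like '<article-number> <header-value>'.
    """
    with socket.create_connection((server, port), timeout=30) as sock:
        conn = sock.makefile("rwb")
        conn.readline()                         # server greeting (200/201)
        conn.write(b"GROUP %s\r\n" % group.encode())
        conn.flush()
        conn.readline()                         # 211 count first last group
        conn.write(build_xpat(header, article_range, pattern).encode())
        conn.flush()
        status = conn.readline()
        if not status.startswith(b"221"):       # 221 = header list follows
            raise RuntimeError(status.decode().strip())
        matches = []
        while True:
            line = conn.readline().rstrip(b"\r\n")
            if line == b".":                    # end of multi-line response
                break
            matches.append(line.decode("latin-1"))
        return matches
```

Something like `xpat_search("news.example.com", "alt.binaries.test", "Subject", "1-", "*part07*")` would then return the article numbers whose Subject matches the wildcard, which is essentially what the newsreader does when it fills in a missing part from one of the suppressed-header servers.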

Hope that helps.