This is a vague question, and a list of NZB files by itself will not help you.
Indexing a newsgroup server means writing a small client library that downloads all the posts from the usenet server over NNTP. Even the headers (subjects) alone take many gigabytes to store; I believe you would need several terabytes available just to store all the headers of a full-feed server.
A less storage-hungry approach is to index only the last X days, but even that can grow large.
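To make the header-downloading step concrete, here is a minimal sketch in Python. A real indexer would pull overview lines with the standard `nntplib` module (e.g. `NNTP.over`) from your provider; the server name in the comment and the canned sample line below are illustrative only, so the sketch runs offline:

```python
# Sketch: turning NNTP overview (XOVER/OVER) lines into header records.
# A real indexer would fetch these over the network, e.g.:
#   from nntplib import NNTP
#   with NNTP("news.example.com") as srv:   # hypothetical server name
#       resp, overviews = srv.over((first_article, last_article))
# Here we parse a canned overview line instead, so the example runs offline.

# Standard overview field order per the NNTP OVER response.
OVER_FIELDS = ("number", "subject", "from", "date",
               "message_id", "references", "bytes", "lines")

def parse_over_line(line: str) -> dict:
    """Split one tab-separated overview line into a header record."""
    return dict(zip(OVER_FIELDS, line.split("\t")))

sample = ("12345\tUbuntu ISO (1/3)\tposter@example.com\t"
          "01 Jan 2024 00:00:00 GMT\t<abc@example.com>\t\t500000\t7000")
record = parse_over_line(sample)
print(record["subject"])  # → Ubuntu ISO (1/3)
```

Storing only these parsed records (rather than full articles) is what keeps the index in the gigabyte rather than petabyte range.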
Once you have all the post headers, you need some logic to group together the posts that make up a single release of any kind (movie, program, ISO, and so on). For each group you create, you combine the message IDs into an NZB XML file, which you can then feed to your favorite usenet downloader.
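The grouping step above can be sketched as follows. This assumes multipart posts carry a "(part/total)" marker at the end of the subject, which is a common but not universal convention, and it emits a deliberately simplified NZB skeleton (a real NZB also records poster, date, and newsgroups per file). The sample header records are made up:

```python
# Sketch: grouping header records into releases and emitting NZB XML.
# Stripping the trailing "(n/m)" part marker from a subject gives a key
# that collects all segments of one release together.
import re
import xml.etree.ElementTree as ET
from collections import defaultdict

PART_RE = re.compile(r"\s*\(\d+/\d+\)\s*$")

def release_key(subject: str) -> str:
    """Drop the trailing (n/m) part marker to get the release name."""
    return PART_RE.sub("", subject)

def build_nzbs(headers):
    """Group headers by release; return {release name: NZB XML string}."""
    groups = defaultdict(list)
    for h in headers:
        groups[release_key(h["subject"])].append(h)
    nzbs = {}
    for name, parts in groups.items():
        root = ET.Element("nzb", xmlns="http://www.newzbin.com/DTD/2003/nzb")
        file_el = ET.SubElement(root, "file", subject=name)
        segs = ET.SubElement(file_el, "segments")
        for i, p in enumerate(sorted(parts, key=lambda p: p["subject"]), 1):
            seg = ET.SubElement(segs, "segment",
                                bytes=str(p["bytes"]), number=str(i))
            seg.text = p["message_id"].strip("<>")
        nzbs[name] = ET.tostring(root, encoding="unicode")
    return nzbs

headers = [
    {"subject": "Ubuntu ISO (1/2)", "bytes": 500000, "message_id": "<a@x>"},
    {"subject": "Ubuntu ISO (2/2)", "bytes": 400000, "message_id": "<b@x>"},
]
nzbs = build_nzbs(headers)
print(list(nzbs))  # → ['Ubuntu ISO']
```

In practice this is the hard part of writing an indexer: obfuscated subjects, missing parts, and cross-posted releases all need extra heuristics beyond a simple regex.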
But if your question is really that you have several URLs of NZB files you wish to download, there is a nice tool called wget that can fetch any list of URLs for you.
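If you would rather script that step than call an external tool, a few lines of standard-library Python do the same job. The function below is a hedged sketch: the filename handling is naive, and in real use you would add error handling and your actual URL list:

```python
# Sketch: downloading a list of NZB URLs into a directory with plain Python.
import os
import urllib.request

def download_all(urls, dest_dir):
    """Fetch each URL and save it under dest_dir; return the saved paths."""
    os.makedirs(dest_dir, exist_ok=True)
    saved = []
    for url in urls:
        # Naive filename choice: last path component of the URL.
        name = url.rstrip("/").rsplit("/", 1)[-1] or "download.nzb"
        path = os.path.join(dest_dir, name)
        with urllib.request.urlopen(url) as resp, open(path, "wb") as out:
            out.write(resp.read())
        saved.append(path)
    return saved
```

Usage would be `download_all(["https://example.com/some.nzb"], "nzbs")`; the equivalent with wget is simply putting the URLs in a text file and running `wget -i urls.txt`.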