Are there Any ftp Solutions for Grabbing Binaries from an nntp Server?

Tony Baechler tony at baechler.net
Thu Jul 10 06:48:53 UTC 2014


Hi,

It would be cool if there were such a thing, but I couldn't find it.  I
checked, and libcurl (the library which drives curl) definitely doesn't
support news.  Too bad, as that could have been an interesting solution.  I
also looked at lftp, but it doesn't support news either.  However, due to
how Usenet works, I don't think such a client could be written.  As you
know, binaries are split into many smaller parts, each of which gets its
own article number.  If even one article is missing, the binary is corrupt.
A client would have to download and sort all of the article headers just to
determine which binaries are complete.  When a group has potentially a
million articles, that eats a huge amount of memory and makes the client
run extremely slowly.
Adding the complexity of an ftp-style interface on top of trying to manage
that many articles is just asking for trouble and confusion.  The "ls"
output would consist of hundreds of thousands of files, and the next
complaint would be that there are too many files to grab conveniently.  I
know this from experience on Windows.  I suspect the graphical clients have
better memory management or something.  Remember that the Curses clients are
very old and weren't designed for that many articles.  It would almost be
worth writing a new Curses client from scratch.
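
Just to give a sense of the bookkeeping such a client would face, here is
a rough Python sketch of the header-grouping step.  It uses the standard
nntplib module (deprecated and since removed in Python 3.13), and the
server and group names are made up, so treat it as an illustration rather
than a working tool:

import re
import nntplib
from collections import defaultdict

SERVER = "news.example.com"               # made-up server name
GROUP = "alt.binaries.sounds.misc"        # made-up group name
PART_RE = re.compile(r"\((\d+)/(\d+)\)")  # e.g. "(03/42)" in a subject

with nntplib.NNTP(SERVER) as srv:
    resp, count, first, last, name = srv.group(GROUP)
    # Fetch overview (XOVER) data for the last 5000 articles only;
    # doing the whole group is exactly the memory problem described above.
    resp, overviews = srv.over((max(first, last - 5000), last))

parts_seen = defaultdict(set)  # subject minus counter -> part numbers seen
parts_total = {}               # subject minus counter -> expected part count

for number, over in overviews:
    subject = over.get("subject", "")
    m = PART_RE.search(subject)
    if not m:
        continue               # not a multipart binary post
    key = PART_RE.sub("", subject).strip()
    parts_seen[key].add(int(m.group(1)))
    parts_total[key] = int(m.group(2))

for key, seen in sorted(parts_seen.items()):
    status = "COMPLETE" if len(seen) == parts_total[key] else "incomplete"
    print(status, key)

Even this toy version keeps every subject line in memory, and it only looks
at the last few thousand articles.  Scale that up to a full binaries group
and you can see where the memory and speed problems come from.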

If I may make a suggestion, I would go with either something like brag,
which grabs everything and lets you delete what you don't want, or
something like leafnode, which pulls down all of the articles in a group
for processing at
your convenience.  You can get new 1 TB drives on Amazon in the US for $65,
so I would just get one of those and use it solely for news grabs.  Even
with a fairly busy group, you would have a hard time filling 1 TB in one
grab.  Once you sort through what you want after decoding the binaries, you
can move them to another drive for listening, watching, etc.  Likewise, you
could do the reverse.  After you decode the binaries, move the ones you want
to keep to an external drive or a new 1 TB drive for archiving and later
listening, etc.  I personally still grab binaries on Windows, but if I
wanted an entirely Linux solution, that's probably what I would do.  I had
OK luck with brag.  I say "OK" because it grabbed binaries well enough, but
it also kept a lot of extra articles and overhead that I didn't want.  Some
overhead is expected because the headers have to be stored somewhere, but
it looked like it was storing all of the articles plus the binaries and
generating fairly large indexes besides.  I did pull some text groups on a
Linux server and I used "suck" to pull new articles by hand, but if I wanted
a totally automatic solution, I would go for leafnode or similar.
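
For the "grab everything and sort it out later" approach, the dumb half of
the job is easy to script.  Continuing the made-up names from the sketch
above, something like this would dump every part of one post into a spool
directory so it can be decoded later with uudeview, a yEnc decoder, or
whatever you prefer:

import os
import nntplib

SERVER = "news.example.com"              # made-up server name
GROUP = "alt.binaries.sounds.misc"       # made-up group name
SPOOL = os.path.expanduser("~/newsspool")
PART_NUMBERS = [123456, 123457, 123458]  # article numbers of one post's parts

os.makedirs(SPOOL, exist_ok=True)
with nntplib.NNTP(SERVER) as srv:
    srv.group(GROUP)
    for art in PART_NUMBERS:
        # body() skips the headers; use article() if you want them kept too.
        resp, info = srv.body(str(art))
        path = os.path.join(SPOOL, "%d.msg" % art)
        with open(path, "wb") as out:
            out.write(b"\r\n".join(info.lines) + b"\r\n")

Pulling the raw articles is the simple part; the hard part is the indexing
and browsing, which is why I think leafnode plus a decoder is a more
realistic setup than an ftp-style client.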

On 2014-07-09 02:10 PM, Hart Larry wrote:
> In an ideal world, it would seem that I should be able to run an ncftp-style
> client to list and grab binaries off an nntp server. Sure, I understand there
> are several Linux clients where you practically must know exact article
> numbers, such as aub, lottanzb, papercut, newslight, sinntp, nntp-pull, and
> nget. For myself, as I am rather comfortable with trn4 and Pine for dealing
> with Usenet, these other solutions seem complicated.
> I suppose if we could really get Leafnode configured with trn4, then maybe I
> could load larger groups. One of our LUG members, who I convinced to join
> this list, wondered if "curl" would do the trick? But looking in its manual,
> I see nothing about news, usenet, or nntp.
> I have many times looked in the Debian repo, but seemingly the better more
> robust clients are made for graphical settings, including windows.
> Aside from the large number of articles, trn has many troubles with quite
> high article numbers, so what's a grabber to do?
> Now, yes, occasionally I use an Easynews web solution, but Giganews has a
> larger archive, though only over nntp, and they seemingly know little about
> Linux solutions which work well with speech.
> Thanks so much in advance of any solutions.
> Hart

-- 
Have a good day,
Tony Baechler
tony at baechler.net



