Every website is different.
There may be loads of links that are not the "next page".
Formatting, styling, and embedded images can also be a problem.
So despite over 30 years of programming, and having tried programs like Pocket, I don't believe it's possible to automatically convert an arbitrary set of separate web pages into one ebook or document, other than making a mirrored copy with something like "wget".
So if you want a document of arbitrary size that is spread across multiple web pages, the options are:
- use wget to create a local mirror of that part of the website (see the sketch after this list)
- copy and paste a page at a time
- contact the author and buy a copy
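If you go the wget route, something like the sketch below tends to work. The URL is a placeholder, and the exact flags you want will depend on the site; the ones shown are standard wget options: `--mirror` recurses with timestamping, `--convert-links` rewrites links for local viewing, `--page-requisites` pulls in images and CSS, and `--no-parent` stops wget wandering above the starting directory.

```
# Mirror one section of a site for offline reading.
# https://example.com/serial/ is a placeholder URL.
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --no-parent \
     --wait=2 \
     https://example.com/serial/
```

The `--wait=2` pause between requests is just politeness to the server; drop or adjust it as you see fit.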
There is also a legal and moral issue with making a mirror copy or an ebook: is the content in the Public Domain? If not, is your copy purely for your own consumption (which may be fine, depending on your country)?
Disclaimer:
I know an author who used to serialise a story twice a week. He'd be happy with copy and paste for your own use. No automated program would be able to "fetch" a complete story from his site. He does offer an ebook version.