Old 05-27-2011, 05:57 AM   #8
juco
Enthusiast
Join Date: Oct 2003
Location: Bavaria/Germany
Device: Palm m105, Kindle KB
Until the day before yesterday, I used the "old" recipe that shipped with calibre, which I had extended by adding the other category feeds (in much the same way user schuster did in his script):

Code:
__license__   = 'GPL v3'
__copyright__ = '2008-2011, Kovid Goyal <kovid at kovidgoyal.net>, Darko Miletic <darko at gmail.com>, Ralph Stenzel <ralph at klein-aber-fein.de>'
'''
Profile to download FAZ.NET
'''

from calibre.web.feeds.news import BasicNewsRecipe

class FazNet(BasicNewsRecipe):
    title                 = 'FAZ.NET'
    __author__            = 'Kovid Goyal, Darko Miletic, Ralph Stenzel'
    description           = 'Frankfurter Allgemeine Zeitung'
    publisher             = 'Frankfurter Allgemeine Zeitung GmbH'
    category              = 'news, politics, Germany'
    use_embedded_content  = False
    language = 'de'

    max_articles_per_feed = 30
    no_stylesheets        = True
    encoding              = 'utf-8'
    remove_javascript     = True

    html2lrf_options = [
                          '--comment', description
                        , '--category', category
                        , '--publisher', publisher
                        ]

    html2epub_options = 'publisher="' + publisher + '"\ncomments="' + description + '"\ntags="' + category + '"'

    keep_only_tags = [dict(name='div', attrs={'class':'Article'})]

    remove_tags = [
                     dict(name=['object','link','embed','base'])
                    ,dict(name='div', attrs={'class':['LinkBoxModulSmall','ModulVerlagsInfo']})
                  ]

    feeds = [
              ('FAZ.NET Aktuell', 'http://www.faz.net/s/RubF3CE08B362D244869BE7984590CB6AC1/Tpl~Epartner~SRss_.xml'),
              ('Politik', 'http://www.faz.net/s/RubA24ECD630CAE40E483841DB7D16F4211/Tpl~Epartner~SRss_.xml'),
              ('Wirtschaft', 'http://www.faz.net/s/RubC9401175958F4DE28E143E68888825F6/Tpl~Epartner~SRss_.xml'),
              ('Feuilleton', 'http://www.faz.net/s/RubCC21B04EE95145B3AC877C874FB1B611/Tpl~Epartner~SRss_.xml'),
              ('Sport', 'http://www.faz.net/s/Rub9F27A221597D4C39A82856B0FE79F051/Tpl~Epartner~SRss_.xml'),
              ('Gesellschaft', 'http://www.faz.net/s/Rub02DBAA63F9EB43CEB421272A670A685C/Tpl~Epartner~SRss_.xml'),
              ('Finanzen', 'http://www.faz.net/s/Rub4B891837ECD14082816D9E088A2D7CB4/Tpl~Epartner~SRss_.xml'),
              ('Wissen', 'http://www.faz.net/s/Rub7F4BEE0E0C39429A8565089709B70C44/Tpl~Epartner~SRss_.xml'),
              ('Reise', 'http://www.faz.net/s/RubE2FB5CA667054BDEA70FB3BC45F8D91C/Tpl~Epartner~SRss_.xml'),
              ('Technik & Motor', 'http://www.faz.net/s/Rub01E4D53776494844A85FDF23F5707AD8/Tpl~Epartner~SRss_.xml'),
              ('Beruf & Chance', 'http://www.faz.net/s/RubB1E10A8367E8446897468EDAA6EA0504/Tpl~Epartner~SRss_.xml')
            ]

    def print_version(self, url):
        # drop any query string, then rewrite the article URL to FAZ's print template
        article, sep, rest = url.partition('?')
        return article.replace('.html', '~Afor~Eprint.html')

    def preprocess_html(self, soup):
        # force a UTF-8 charset declaration, then strip the body's onload
        # handler and all inline styles before conversion
        mtag = '<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>'
        soup.head.insert(0,mtag)
        del soup.body['onload']
        for item in soup.findAll(style=True):
            del item['style']
        return soup
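For anyone looking into this: the `print_version` method above is a plain string transformation, independent of the rest of the recipe. A minimal standalone sketch (the article URL below is made up for illustration, since I don't have a current one at hand) shows what it does:

```python
def print_version(url):
    # drop any "?..." query string, then point at FAZ's print template
    article, _, _ = url.partition('?')
    return article.replace('.html', '~Afor~Eprint.html')

url = 'http://www.faz.net/s/RubEXAMPLE/Doc~E123~ATpl~Ecommon~Scontent.html?rss_aktuell'
print(print_version(url))
# → http://www.faz.net/s/RubEXAMPLE/Doc~E123~ATpl~Ecommon~Scontent~Afor~Eprint.html
```

If FAZ changed the print-view URL scheme, this is the first place the recipe would silently break, since a wrong print URL still returns a page, just not the stripped-down one the `keep_only_tags` rule expects.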
This script worked very well and delivered the articles in an excellent layout to my Kindle 3. However, the programmers of faz.net seem to have changed something recently, and the recipe has not been working since last Wednesday. The feed URLs are still valid (and unchanged), but I am not competent enough to find and fix the underlying problem myself.

The new recipe from user schuster *does* deliver content, but it does not produce the same polished, well-formatted results (cropping, formatting, etc.) that the previous script did. Perhaps someone here is able to fix the recipe quoted in this post so that it can be put back into service? Any help would be greatly appreciated!

Thanks in advance,
Ralph

Last edited by juco; 05-27-2011 at 08:18 AM.