Hi,
Is it possible to modify the Instapaper.com recipe so that successfully downloaded articles are automatically moved to Instapaper's archive?
Unfortunately, I only have very basic programming skills and have never used Python before. Nonetheless, I tried the following:
Code:
def parse_index(self):
    totalfeeds = []
    lfeeds = self.get_feeds()
    for feedobj in lfeeds:
        feedtitle, feedurl = feedobj
        self.report_progress(0, _('Fetching feed')+' %s...'%(feedtitle if feedtitle else feedurl))
        articles = []
        soup = self.index_to_soup(feedurl)
        for item in soup.findAll('div', attrs={'class':'titleRow'}):
            description = self.tag_to_string(item.div)
            atag = item.a
            if atag and atag.has_key('href'):
                url = atag['href']
                title = self.tag_to_string(atag)
                date = strftime(self.timefmt)
                articles.append({
                    'title'      :title
                    ,'date'      :date
                    ,'url'       :url
                    ,'description':description
                })
        totalfeeds.append((feedtitle, articles))
        for item in soup.findAll('a', attrs={'class':'archiveButton'}):  # get all links with class "archiveButton"
            skipurl = item['href']  # get the respective URLs
            self.index_to_soup(skipurl)  # "click" on them
    return totalfeeds
As I expected, it didn't work. Can someone point me in the right direction?
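For what it's worth, the scraping step I'm attempting (collecting every href from anchors carrying a given class, here "archiveButton") can be tried outside calibre. This is only a sketch: it uses Python's stdlib html.parser instead of calibre's bundled BeautifulSoup, so the parsing calls differ, and the sample markup is hypothetical, not Instapaper's real page:

```python
from html.parser import HTMLParser

class ClassLinkCollector(HTMLParser):
    """Collect href values from <a> tags that carry a given CSS class."""
    def __init__(self, wanted_class):
        super().__init__()
        self.wanted_class = wanted_class
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != 'a':
            return
        attrs = dict(attrs)
        # class attribute may hold several space-separated classes
        classes = (attrs.get('class') or '').split()
        if self.wanted_class in classes and 'href' in attrs:
            self.links.append(attrs['href'])

def collect_links(html_text, wanted_class):
    parser = ClassLinkCollector(wanted_class)
    parser.feed(html_text)
    return parser.links

# Hypothetical markup loosely mimicking a reading-list page
sample = (
    '<div class="titleRow"><a href="/read/1">Article one</a></div>'
    '<a class="archiveButton" href="/archive/1">archive</a>'
    '<a class="archiveButton" href="/archive/2">archive</a>'
)
print(collect_links(sample, 'archiveButton'))  # → ['/archive/1', '/archive/2']
```

Whether simply fetching those URLs with index_to_soup actually archives the article is exactly what I'm unsure about; the button may require a logged-in session or a POST rather than a plain GET.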