MobileRead Forums > E-Book Readers > Kobo Reader > Kobo Developer's Corner
Old 06-13-2022, 05:16 AM   #16
qkqw
Connoisseur
Posts: 58
Karma: 143000
Join Date: Apr 2022
Device: Kobo Libra 2
Quote:
Originally Posted by xyclonei:
I thought using cmd_output instead of cmd_spawn so that the script has a chance to indicate when it is done might sort it out:
In fact, I've been using cmd_spawn instead of cmd_output for exactly this reason: the latter has a maximum timeout of 10 seconds.

But looking at the NickelMenu source code and its docs, it seems to me that cmd_spawn starts detached from the main process, but a chain only continues after the previous item has finished.

My personal experience seems to confirm this, as the dbg_toast message always took its time, depending on how many new articles there were. Additionally, if you look at the NickelMenu docs, you can see that, e.g., the telnet process also waits for the different commands to finish before going on to the next one:

https://github.com/pgaskin/NickelMen.../doc#L207-L209

I might be mistaken, but that's what I gathered so far.
Old 07-06-2022, 05:12 PM   #17
qkqw
I've updated the script to support GIF, WEBP and SVG (with embedded text). For this I had to switch to ImageMagick 7.

Let me know if you happen to find other image formats, but for now these were the only ones I could find.
Old 07-06-2022, 11:57 PM   #18
xyclonei
Connoisseur
Posts: 92
Karma: 10988
Join Date: Dec 2018
Device: Kobo Clara HD
Thank you for the update, @qkqw! Works well.

EDIT: Quick note. Running this version versus the earlier one seems to slow down my device (Clara HD) quite significantly. But then again, I have quite a few articles saved, so that could be why.

Last edited by xyclonei; 07-07-2022 at 01:44 AM.
Old 07-07-2022, 05:01 AM   #19
qkqw
Quote:
Originally Posted by xyclonei:
EDIT: Quick note. Running this version versus the earlier one seems to slow down my device (Clara HD) quite significantly. But then again, I have quite a few articles saved, so that could be why.
There is no throttling or anything like that. It might be because of the additional image formats? Previously it ignored everything other than PNGs.
Old 07-07-2022, 03:31 PM   #20
NiLuJe
BLAM!
Posts: 13,477
Karma: 26012494
Join Date: Jun 2010
Location: Paris, France
Device: Kindle 2i, 3g, 4, 5w, PW, PW2, PW5; Kobo H2O, Forma, Elipsa, Sage, C2E
IM 7 is built with an "HDR first" mindset, so somewhat poorer performance is to be expected (even in Q8 builds). Incidentally, that's why I never moved away from IM 6 for those.
Old 03-19-2023, 01:18 PM   #21
CasualBookworm
Junior Member
Posts: 3
Karma: 168
Join Date: Mar 2023
Device: Kobo Aura Edition 2
First off, I want to extend a huge thank you for this script! I discovered this thread a few weeks back and, as an avid Pocket user, it has been a game-changer!

Also as an avid Pocket user, I've been saving articles since back when the service was called Read It Later, and I only read maybe 70% of what I save, so I have a pretty ridiculous backlog. Running the script against every article didn't seem ideal, so I made a few tweaks that I wanted to share back:

Code:
#!/bin/sh

usage()
{
cat << EOF
usage: $0 [ -d DAYS ]

Pocket Image Fix:
Converts gif, png, svg, and webp images in downloaded Pocket articles to jpeg
so that they render correctly in the Kobo reader

OPTIONS:
  -d DAYS          only process articles downloaded in the last DAYS days
  -h               prints this help message
EOF
}

# Setup
ARTICLES="/mnt/onboard/.kobo/articles"
POCKET="/mnt/onboard/.adds/pocket"

CONVERT="${POCKET}/magick convert -limit time 60 -font ${POCKET}/fonts/DejaVuSans.ttf"
IDENTIFY="${POCKET}/magick identify -limit time 60 -quiet"

LD_LIBRARY_PATH="${POCKET}/lib:${LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH

# Prepare the directory that flags articles as already processed
FLAGS="${POCKET}/processed_articles"
mkdir -p "${FLAGS}"

# Parse arguments
unset ARG_DAYS

while getopts 'd:h' c
do
  case $c in
    d) 
        if ! [ "$OPTARG" -gt 0 ] 2> /dev/null; then
          echo "Invalid arg for -d option! Expected a positive integer, got: $OPTARG"
          usage
          exit 1
        fi
        ARG_DAYS=$OPTARG
        ;;
    *)
        usage
        exit 0
        ;;
  esac
done

mtime=""
if [ -n "$ARG_DAYS" ]; then
  mtime="-mtime -$ARG_DAYS"
  echo "Processing articles downloaded in the last $ARG_DAYS day(s)"
else
  echo "Processing all articles; this may take a long time"
fi

# Counters
skippedd=0
processedd=0

skippedf=0
convertedf=0
failedf=0

# Main Loop
echo "Started: $(date)"

for d in $(find $ARTICLES -mindepth 1 -type d $mtime); do
  dir=$(basename $d)
  if [ -f "${FLAGS}/$dir" ] ; then
    skippedd=`expr $skippedd + 1`
  else
    for i in $(find $d -type f -not -iname "*.html"); do
      FORMAT=$($IDENTIFY -format "%m" "$i")
      if [ "$FORMAT" = "GIF" ] ||
         [ "$FORMAT" = "PNG" ] ||
         [ "$FORMAT" = "SVG" ] ||
         [ "$FORMAT" = "WEBP" ]; then
        $CONVERT "$i" "$i.jpg"
        if [ $? -eq 0 ]; then
          mv "$i.jpg" "${i%.jpg}"
          convertedf=`expr $convertedf + 1`
        else
          failedf=`expr $failedf + 1`
        fi
      else
        skippedf=`expr $skippedf + 1`
      fi
    done
    processedd=`expr $processedd + 1`
    touch "${FLAGS}/$dir"
  fi
done

# Summary
echo "Finished: $(date)"
echo
echo "Articles Processed: $processedd"
echo "Articles Skipped: $skippedd"
echo
echo "Files Converted: $convertedf"
echo "Files Not Converted: $skippedf"
echo "Files Failed: $failedf"
The main changes are:
  • After processing an article, it creates an empty flag file in /mnt/onboard/.adds/pocket/processed_articles; on subsequent runs it checks that directory and skips any articles already flagged as processed
  • It accepts a command line argument that limits it to looking at articles downloaded in the last X days
  • It outputs a little summary when finished

Here's how I'm running it from my NickelMenu config:

Code:
menu_item  :library  :Pocket Images > Convert  :cmd_spawn   :quiet:/mnt/onboard/.adds/pocket/fix.sh -d 7 > /mnt/onboard/.adds/pocket/log
  chain_success                                :dbg_toast   :Started processing articles
menu_item  :library  :Pocket Images > Status   :cmd_output  :1000:tail /mnt/onboard/.adds/pocket/log
The "Convert" item starts processing anything downloaded in the last 7 days and writes the output to a log file. The "Status" item displays that log file so you can check if/when it finished, along with details on what it did.

Two things I've considered adding but haven't really needed yet are:
  • A way to clear out old items in the processed_articles folder so it doesn't keep growing as new articles get added
  • A way to limit the overall number of articles processed in a given run (because the first time you sync a new device everything will be downloaded in the last X days)
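For the first of those, here's a minimal sketch of what a cleanup pass could look like (the `prune_flags` helper name is my own invention; the two paths are the ones used in the script above):

```shell
#!/bin/sh
# Sketch: remove flag files whose article directory no longer exists.
# prune_flags is a hypothetical helper; the paths match the script above.

prune_flags() {
  articles="$1"
  flags="$2"
  for f in "$flags"/*; do
    [ -e "$f" ] || continue                        # empty flags dir: glob stays literal
    if [ ! -d "$articles/$(basename "$f")" ]; then
      rm -f "$f"                                   # article was archived or deleted
    fi
  done
}

prune_flags "/mnt/onboard/.kobo/articles" "/mnt/onboard/.adds/pocket/processed_articles"
```

Running this occasionally (or at the start of the main script) would keep the flag directory from growing forever.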
Old 03-19-2023, 04:06 PM   #22
qkqw
Quote:
Originally Posted by CasualBookworm:
First off, I want to extend a huge thank you for this script! I discovered this thread a few weeks back and, as an avid Pocket user, it has been a game-changer!
You're very welcome

Quote:
Originally Posted by CasualBookworm:
Code:
         convertedf=`expr $convertedf + 1`
You might be able to use $(($convertedf + 1)) instead of expr here.
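For what it's worth, the practical difference is that `expr` forks an external process on every increment, while POSIX arithmetic expansion happens inside the shell:

```shell
#!/bin/sh
# Both lines increment the counter; $(( )) does it in-process,
# while `expr` forks an external command each time.
count=0
count=$((count + 1))
count=`expr $count + 1`
echo "$count"   # prints 2
```

On a slow eReader CPU, avoiding a fork per file can add up.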

Quote:
Originally Posted by CasualBookworm:
After processing an article, it creates an empty file in /mnt/onboard/.adds/pocket/processed_articles as a flag and on subsequent runs it will check that directory and skip over any articles that are flagged as already having been processed
Instead of having a separate directory for this, would it be possible to store a hidden file inside the article folder, i.e. /mnt/onboard/.kobo/articles/1234567890/.processed? That way Kobo should remove the whole folder once you archive/delete an article, and the script could be simplified to ignore folders containing such a file.
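A sketch of what that per-article flag could look like (the .processed name is from the suggestion above; the helper names are made up for illustration):

```shell
#!/bin/sh
# Sketch of the per-article flag idea: the flag lives inside the article
# folder itself, so archiving/deleting the article removes it automatically.
FLAG_FILE=".processed"

is_processed() {
  [ -f "$1/$FLAG_FILE" ]
}

mark_processed() {
  touch "$1/$FLAG_FILE"
}
```

The main loop would then just test `is_processed "$d"` before converting and call `mark_processed "$d"` afterwards.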

Quote:
Originally Posted by CasualBookworm:
It accepts a command line argument that limits it to looking at articles downloaded in the last X days
I rarely have more than 20-30 articles on my Kobo, so I never ran into a problem here. In any case, the IM identify command should be quite fast, even with many files to check?

Regardless, it does make sense to add some sort of limit. However, I'd prefer a number-based approach instead of a day-based one: getting the latest X articles is quite easy, and then you could still check whether those were already processed. What do you think?
Old 03-23-2023, 04:18 PM   #23
CasualBookworm
Quote:
Originally Posted by qkqw:
You might be able to use $(($convertedf + 1)) instead of expr here.
Good call! Shows how often I write shell scripts.

Quote:
Originally Posted by qkqw:
Instead of having a separate directory for this, would it be possible to store a hidden file inside the article folder, i.e. /mnt/onboard/.kobo/articles/1234567890/.processed? That way Kobo should remove the whole folder once you archive/delete an article, and the script could be simplified to ignore folders containing such a file.
I'd initially shied away from putting the flag file in the article's directory since I wasn't sure how the device would react, but I tested it out and the whole folder does get cleaned up after an archive/delete. That definitely solves the issue of storing flags for long-deleted articles.

Quote:
Originally Posted by qkqw:
I rarely have more than 20-30 articles on my Kobo, so I never ran into a problem here. In any case, the IM identify command should be quite fast, even with many files to check?
I have something ridiculous like 2,000 articles on mine, so re-checking the files can be pretty slow. In some quick testing on my Aura Edition 2, just running identify on 61 articles with a total of 375 files took about 30 seconds, so having a way to skip articles does seem important.
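Extrapolating from those numbers with rough integer arithmetic (the ~2,000-article backlog is from the earlier post; the ratios are my own back-of-envelope):

```shell
#!/bin/sh
# Back-of-envelope from the measurement above: identifying 375 files across
# 61 articles took ~30 s. Scale that to a ~2000-article backlog.
files=375; seconds=30; articles=61; backlog=2000

ms_per_file=$(( seconds * 1000 / files ))          # ~80 ms per identify call
est_files=$(( backlog * files / articles ))        # ~12000+ files in the backlog
est_minutes=$(( est_files * ms_per_file / 60000 )) # ~16 minutes just re-scanning
echo "${ms_per_file} ms/file, ~${est_minutes} min for ${est_files} files"
```

So skipping already-processed articles saves on the order of a quarter of an hour per run at that backlog size.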

Quote:
Originally Posted by qkqw:
Regardless, it does make sense to add some sort of limit. However, I'd prefer a number-based approach instead of a day-based one: getting the latest X articles is quite easy, and then you could still check whether those were already processed. What do you think?
The only issue I can see with getting the latest X articles is that processing a directory could update its modified time, which could bump it back to the top of the list. If we fetched all articles but then only processed X (i.e., skipped articles didn't count against the limit), that definitely seems like it could work!
Old 03-23-2023, 09:54 PM   #24
CasualBookworm
I'm not sure how I overlooked this the first time, but shortly after that last post I realized that the article directories are in numeric order, so a reverse sort will give a reliable newest-first order!
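A quick check of that property (the IDs here are made up; `sort -n` compares numerically, so digit count doesn't matter the way it would in a plain lexical sort):

```shell
#!/bin/sh
# A numeric reverse sort puts the largest (newest) Pocket ID first even when
# the IDs differ in length; a lexical sort would rank 999999999 highest.
newest=$(printf '%s\n' 999999999 1000000000 1234567890 | sort -nr | head -n 1)
echo "$newest"   # prints 1234567890
```

The script below uses `sort -Vr`, which handles these all-numeric names the same way.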

I've updated my tweaked script to:
  • Use $(($x + 1)) instead of expr
  • Create .processed files inside the article directories instead of a separate flag directory
  • Remove -d DAYS and the -mtime argument when finding directories
  • Iterate over directories in reverse numeric order, newest first
  • Add -p ARTICLES to limit the number of articles processed in a given run
  • Add -s ARTICLES to limit the number of articles processed OR skipped in a given run
  • Add -r to force re-processing old articles
(those last two are probably more useful for debugging than regular use)

Code:
#!/bin/sh

usage()
{
cat << EOF
usage: $0 [ -r ] [ -s ARTICLES ] [ -p ARTICLES ]

Pocket Image Fix:
Converts gif, png, svg, and webp images in downloaded Pocket articles to jpeg
so that they render correctly in the Kobo reader

OPTIONS:
  -r               reprocess previously processed articles
  -s ARTICLES      scan at most ARTICLES articles
  -p ARTICLES      process at most ARTICLES articles
  -h               prints this help message
EOF
}

# Setup
ARTICLES="/mnt/onboard/.kobo/articles"
POCKET="/mnt/onboard/.adds/pocket"

CONVERT="${POCKET}/magick convert -limit time 60 -font ${POCKET}/fonts/DejaVuSans.ttf"
IDENTIFY="${POCKET}/magick identify -limit time 60 -quiet"

LD_LIBRARY_PATH="${POCKET}/lib:${LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH

FLAG_FILE=".processed"

# Parse arguments
unset ARG_PROCESS_LIMIT
unset ARG_SCAN_LIMIT
unset ARG_REPROCESS

while getopts 's:p:rh' c
do
  case $c in
    s) 
        if ! [ "$OPTARG" -gt 0 ] 2> /dev/null; then
          echo "Invalid arg for -s option! Expected a positive integer, got: $OPTARG"
          usage
          exit 1
        fi
        ARG_SCAN_LIMIT=$OPTARG
        ;;
    p) 
        if ! [ "$OPTARG" -gt 0 ] 2> /dev/null; then
          echo "Invalid arg for -p option! Expected a positive integer, got: $OPTARG"
          usage
          exit 1
        fi
        ARG_PROCESS_LIMIT=$OPTARG
        ;;
    r) 
        ARG_REPROCESS=1
        ;;
    *)
        usage
        exit 0
        ;;
  esac
done

if [ -n "$ARG_SCAN_LIMIT" ]; then
  echo "Scanning up to $ARG_SCAN_LIMIT articles"
else
  echo "Scanning all articles"
fi

if [ -n "$ARG_PROCESS_LIMIT" ]; then
  echo "Processing up to $ARG_PROCESS_LIMIT articles"
else
  echo "Processing all articles; this could take a while"
fi

if [ -n "$ARG_REPROCESS" ]; then
  echo "Reprocessing previously processed articles"
fi

# Counters
skippedd=0
processedd=0

skippedf=0
convertedf=0
failedf=0

# Main Loop
echo "Started: $(date)"

for d in $(find $ARTICLES -mindepth 1 -type d | sort -Vr); do
  if [ -z "$ARG_REPROCESS" ] && [ -f "$d/$FLAG_FILE" ]; then
    skippedd=$(($skippedd + 1))
  else
    for i in $(find $d -type f -not \( -iname "*.html" -o -iname "$FLAG_FILE" \) ); do
      FORMAT=$($IDENTIFY -format "%m" "$i")
      if [ "$FORMAT" = "GIF" ] ||
         [ "$FORMAT" = "PNG" ] ||
         [ "$FORMAT" = "SVG" ] ||
         [ "$FORMAT" = "WEBP" ]; then
        $CONVERT "$i" "$i.jpg"
        if [ $? -eq 0 ]; then
          mv "$i.jpg" "${i%.jpg}"
          convertedf=$(($convertedf + 1))
        else
          failedf=$(($failedf + 1))
        fi
      else
        skippedf=$(($skippedf + 1))
      fi
    done
    processedd=$(($processedd + 1))
    touch "$d/$FLAG_FILE"

    if [ "$processedd" -ge "$ARG_PROCESS_LIMIT" ] 2> /dev/null; then
      break
    fi
  fi
  if [ $(($processedd + $skippedd)) -ge "$ARG_SCAN_LIMIT" ] 2> /dev/null; then
    break
  fi
done

# Summary
echo "Finished: $(date)"
echo
echo "Articles Processed: $processedd"
echo "Articles Skipped: $skippedd"
echo
echo "Files Converted: $convertedf"
echo "Files Not Converted: $skippedf"
echo "Files Failed: $failedf"
And then I also slightly tweaked the arguments I use when calling the script:
Code:
menu_item  :library  :Pocket Images > Convert  :cmd_spawn   :quiet:/mnt/onboard/.adds/pocket/fix.sh -p 40 > /mnt/onboard/.adds/pocket/log
  chain_success                                :dbg_toast   :Started processing articles
menu_item  :library  :Pocket Images > Status   :cmd_output  :1000:tail -n 15 /mnt/onboard/.adds/pocket/log
If there's nothing new to process, it returns almost instantly, and the 40-article limit seems to do a decent job of keeping the device from slowing down for too long at any one time.
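One subtle detail in the script worth calling out: when -p/-s aren't given, the limit variables stay unset, so the numeric comparison is a test(1) error; the `2> /dev/null` silences it, and the non-zero status reads as "no limit". In isolation:

```shell
#!/bin/sh
# With LIMIT unset, [ 5 -ge "" ] is a test(1) error (silenced by 2>/dev/null)
# and takes the else branch; once LIMIT is set, the comparison works normally.
processed=5
unset LIMIT

if [ "$processed" -ge "$LIMIT" ] 2> /dev/null; then
  no_limit="stop"
else
  no_limit="continue"   # unset limit: keep going
fi

LIMIT=3
if [ "$processed" -ge "$LIMIT" ] 2> /dev/null; then
  with_limit="stop"     # 5 >= 3, so the limit kicks in
else
  with_limit="continue"
fi

echo "$no_limit $with_limit"   # prints: continue stop
```

It works, though an explicit `[ -z "$ARG_PROCESS_LIMIT" ]` check would say the same thing more loudly.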
Old 03-26-2023, 03:55 AM   #25
xyclonei
Thank you for the updated script. Works splendidly!
Old 04-12-2023, 12:16 PM   #26
jaxasaurus
Junior Member
Posts: 2
Karma: 10
Join Date: Apr 2023
Device: kobo libra 2
Extracting files to the right location

Quote:
Originally Posted by qkqw:
Then extract pocket.zip to /mnt/onboard/.adds/pocket.
I just got a new kobo libra 2 yesterday and noticed this pocket issue right away, thanks for this fix! I'm a newbie and I don't understand how to extract pocket to the right place, I don't have /mnt/onboard/ as folders. Can someone please explain or advise where I should put the pocket folder?

I have NickelMenu installed and it's viewable on my kobo ✅

Newbie attempts:
- I've tried just deleting /mnt/onboard/ from the code in both locations and putting the pocket folder in the .adds folder (didn't work)
- I've tried adding the folders /mnt/onboard/.adds (new .adds folder) in the .kobo directory but that didn't work either 😅

I was hoping to use my kobo to read more long form articles and blogs that have a lot of images.
Old 04-12-2023, 08:44 PM   #27
DNSB
Bibliophagist
Posts: 35,380
Karma: 145435140
Join Date: Jul 2010
Location: Vancouver
Device: Kobo Sage, Forma, Clara HD, Lenovo M8 FHD, Paperwhite 4, Tolino epos
Quote:
Originally Posted by jaxasaurus:
I just got a new kobo libra 2 yesterday and noticed this pocket issue right away, thanks for this fix! I'm a newbie and I don't understand how to extract pocket to the right place, I don't have /mnt/onboard/ as folders. Can someone please explain or advise where I should put the pocket folder?

I have NickelMenu installed and it's viewable on my kobo ✅

Newbie attempts:
- I've tried just deleting /mnt/onboard/ from the code in both locations and putting the pocket folder in the .adds folder (didn't work)
- I've tried adding the folders /mnt/onboard/.adds (new .adds folder) in the .kobo directory but that didn't work either 😅

I was hoping to use my kobo to read more long form articles and blogs that have a lot of images.
/mnt/onboard/ is how a Linux-style OS, such as the one Kobo uses, refers to the onboard storage, and it must stay in the script. If the path is /mnt/onboard/.adds/pocket, the Windows equivalent would be D:\.adds\pocket when copying files to your Kobo from a Windows machine (change D: to whatever drive letter your Kobo mounts at).

When I played with the script, I just opened the .zip file and then dragged the pocket directory to the .adds directory on my Kobo.

Last edited by DNSB; 04-12-2023 at 08:47 PM.
Old 04-13-2023, 01:20 AM   #28
jaxasaurus
Thanks for taking the time to explain all that! It helps to know and eliminates one possibility.

Quote:
Originally Posted by DNSB:
When I played with the script, I just opened the .zip file and then dragged the pocket directory to the .adds directory on my Kobo.
That was the first thing I tried (plus restarting), but something else must be happening or I'm doing something else wrong. Ah well ¯\_(ツ)_/¯
Old 04-30-2023, 02:22 PM   #29
bouchacha
Junior Member
Posts: 3
Karma: 10
Join Date: Apr 2023
Device: Kobo Clara 2E
If I understand this thread correctly, the purpose of the script is to fix instances where you see the "image not found" icon. It does not address the issue of some images not being included by Pocket at all. For example, here's how this article shows up in the web version of Pocket:

[screenshot: the article with its images in the Pocket web app]

Versus how it shows up on my Kobo (running the Fix Pocket Images script made no difference):

[screenshot: the same article on the Kobo, with images missing]
Am I understanding the limitations correctly? Thanks
Old 05-01-2023, 05:02 AM   #30
qkqw
You're describing two different issues:

- The two screenshots are from the Pocket overview section. In most cases images do work properly there (even with acoup), and it's usually just a sync issue. I've had the same image display in the overview but not inside the article because it was a PNG.
- Sometimes whole sections and/or images are missing from the article. That often happens with <code> sections, videos, and advanced <picture> elements. In those cases you're right: there's not much we can do, as the section simply isn't there. If you check the web view, however, you'll notice that the same section is missing there too, so it's more likely a Pocket issue than a Kobo one.