Summary: | Disable bilinear filtering of images at native resolution -- Patch supplied | ||
---|---|---|---|
Product: | poppler | Reporter: | Charles Hyder <dr.hyder> |
Component: | splash backend | Assignee: | poppler-bugs <poppler-bugs> |
Status: | RESOLVED MOVED | QA Contact: | |
Severity: | major | ||
Priority: | medium | CC: | jwilk |
Version: | unspecified | ||
Hardware: | x86 (IA32) | ||
OS: | All | ||
Attachments:
- You can clearly see it does the same we do
- Adobe Page Display Settings
- pdf with exactly 400ppi image
- Force Image Interpolation Option
- poppler-0.24.1-bug-68360.patch
- The git patch against master
- Test files for the patch
- pdf that gets me an arithmetic exception
- attachment-21019-0.html
Description
Charles Hyder
2013-08-21 00:37:26 UTC
It is an image of one resolution being painted at a different resolution. We have improved the code to do bilinear filtering now instead of the crude filtering we had before.

ImageOutputDev does not do the bilinear filtering since it's just writing the raw data to a file, not painting it to a page.

Sincerely, I don't see the problem here.

(In reply to comment #1)
> It is an image of one resolution being painted at a different resolution. We
> have improved the code to do bilinear filtering now instead of the crude
> filtering we had before.
>
> ImageOutputDev does not do the bilinear filtering since it's just writing the
> raw data to a file, not painting it to a page.
>
> Sincerely, I don't see the problem here.

Perhaps I haven't clearly stated why this situation constitutes a problem. Here's why I think it does: if I have, say, a JPEG image at 300dpi, create a PDF file by embedding the image in it, and then render the page of the PDF that contains the image at a resolution of exactly 300dpi, then ideally I should get an image that's IDENTICAL to the original JPEG image. In fact, this is how it was in previous versions. But it is no longer the case: some filtering is unnecessarily applied.

When describing the problem originally, by "rendering" I meant rendering at the original resolution, 400dpi in that case. So the rendering resolution is the same as the original image resolution, not different! In fact, bilinear filtering (by "filtering" you probably mean "interpolation") is the way to go for rendering at resolutions DIFFERENT from the original one (bicubic is even better and is not very hard to implement either). But for the same resolution (and resolutions that are integer multiples of the original one) no filtering should be applied to embedded images. On the other hand, even if filtering is applied, it's based on interpolation, and as such, interpolated values should coincide EXACTLY with the original pixel intensities if the scaling factor is an integer.
To sum up: 1) no filtering is necessary when rendering images at certain resolutions; 2) whatever filtering is used - even if it's used at all times - it is, apparently, not implemented correctly.

As a matter of fact, accurate rendering of embedded images is critical for the project I'm working on. I'll appreciate it very much if this issue is resolved so that I can update the Poppler code to the latest version.

I think you don't understand it, i don't think there's any bug, Adobe does the same, so I'm going to close this. If you can prove we behave differently to any other PDF renderer please reopen the bug.

P.S: Don't demand stuff of people that give you things for free, it's bad manners

Reopening the bug due to it being inappropriately closed.

(In reply to comment #3)
> I think you don't understand it, i don't think there's any bug, Adobe does
> the same, so I'm going to close this. If you can prove we behave differently
> to any other PDF renderer please reopen the bug.

http://ge.tt/3AEJrIq/v/0 -- this is the result of the PDF file linked to in the first post related to this bug (http://ge.tt/56fxgSp/v/0?c) being rendered in Adobe Acrobat Pro v. 10.1.3. To avoid further replies along the lines of "I think you don't understand it", here are detailed instructions to follow:

1) open the page http://ge.tt/3AEJrIq/v/0
2) click on the "download" link on the left
3) open the image with a good image viewer, like IrfanView
4) set the magnification of the image viewer to 100%
5) observe the fact that all pixels of the PDF rendered on screen by Adobe have intensities of either (0,0,0) or (255,255,255), i.e. are either black or white -- there's no anti-aliasing, "filtering" or anything else like that.
6) Note two important facts:
--- a) the magnification was set to 400% in Adobe Acrobat (visible in the screenshot);
--- b) the resolution of the computer screen was set to 100dpi (this is important, as by default MS Windows - and possibly other systems - screens are set to 96dpi; you have to change your screen resolution in system preferences to reproduce this; please let me know if you need help doing so).

It is the combination of a) and b) that allows Adobe Acrobat to render a PDF page so that the rendering of the embedded 400dpi image is "pixel-to-pixel" (overall scaling factor is 1). This combination amounts to Adobe Acrobat rendering at 400dpi. Once again: Adobe Acrobat renders a bitonal embedded image as bitonal when rendering at the original image resolution. And so does any properly implemented PDF renderer. Please let me know if you need more examples.

> P.S: Don't demand stuff of people that give you things for free, it's bad
> manners

No demands intended. Just wanted to find out if whoever wrote the buggy update to the Poppler renderer is interested at all in confirming and fixing the bug, or if I should do it myself (which would be much more time-consuming since it would be somebody else's code that I would have to work with), and if so, what the chances are of the owner(s) of this open source project accepting the patch (so that I don't waste my time in vain).

P.S. Frankly, so far the attitude of Mr. Albert Astals Cid toward this bug report reminds me of that of Adobe when, in the early versions of Acrobat, they kept saying that the horrible rendering of Type 3 fonts was "just fine". This eventually led to possibly millions of people having to print PDFs on paper that were simply unreadable on screens. Adobe eventually fixed their renderer, and so can we, I hope.

P.P.S.
While it is true that the developers of Poppler are doing their work for free, it is also true that many people write free software that uses Poppler, as it's becoming the #1 PDF lib on Unices. It would be wrong to say that none of them assume at least some level of adequate maintenance of the Poppler code.

(In reply to comment #4)
> P.P.S. While it is true that the developers of Poppler are doing their
> work for free, it is also true that many people write free software
> that uses Poppler, as it's becoming the #1 PDF lib on Unices. It would be
> wrong to say that none of them assume at least some level of adequate
> maintenance of the Poppler code.

You are welcome to fork it and give it that level of adequate maintenance.

(In reply to comment #4)
> P.P.S. While it is true that the developers of Poppler are doing their
> work for free, it is also true that many people write free software
> that uses Poppler, as it's becoming the #1 PDF lib on Unices. It would be
> wrong to say that none of them assume at least some level of adequate
> maintenance of the Poppler code.

You are also welcome to hire people to work on poppler, making its maintenance easier.

Created attachment 84815 [details]
You can clearly see it does the same we do
Ok, so let's post screenshots if screenshots are needed.
My Adobe Reader.
Now, Adobe Reader *does* have an option to disable what they call "image smoothing", and you probably disabled that option. Now, *that* would be an acceptable feature request: adding an option to the SplashOutputDev so that it does not smooth images. But no, it's not a bug. So i'll give you a choice: either i close this bug again or we repurpose this as a feature request, your decision.

Created attachment 84816 [details]
Adobe Page Display Settings
I've attached a screenshot of the Adobe Acrobat page display settings. You're right, Adobe behaves differently depending on these settings! I've always had the "2D graphics acceleration" setting on. With it on, the rendering proceeds as it should (in my opinion; more on that later): smoothing the image for non-integer scaling factors and rendering bitonal for integer scaling factors. However, if I uncheck that box then I get an option to set "Smooth images". Amazingly, this option controls smoothing for ALL scaling factors at once: with it on you always get smoothing (even at 1-to-1 scale), and with it off you never get smoothing. It's amazing, but true: my graphics card driver does a better job scaling images than Adobe does )

Another example of a piece of software that proceeds in the (partly) correct fashion (in my opinion) is WinDjView - smoothing at all scales except scale=1.

Another piece of (somewhat irrelevant) information about PDF: there's the so-called "interpolate" flag for each JPEG image in the PDF that should tell any (self-respecting) PDF renderer to interpolate images for scales > 1 (i.e., if a PDF page containing a 300dpi image is shown at 600% magnification; a screen resolution of 100dpi is again assumed). This flag has no effect on how the image is encoded, only on how it should be rendered. It's rarely used by PDF producers, however. This has no direct relevance to the question we're discussing; just wanted to mention it in case someone finds it curious.

I would like to outline the reasons for not smoothing images at integer scale factors next. Since I was only given a choice to either close this ticket or move it to a feature request, and since closing it means I can't write along these lines any further, I'd have to agree to moving it to a feature request. Shall I open a feature request, or can you move this stuff over there somehow?

I know there's the Interpolate flag, and we obey it, just set it and the scaling will be gone.
(In reply to comment #11)
> I know there's the Interpolate flag, and we obey it, just set it and the
> scaling will be gone.

Actually i was talking shit here, we obey it in the sense that if you set it to true, we do interpolate; if you set it to false (which is the default) we may still interpolate or not depending on our heuristics.

Hmmm, actually i'm confused now because this file is not hitting the new bilinear code, which pdftoppm command line are you using exactly?

(In reply to comment #13)
> Hmmm, actually i'm confused now because this file is not hitting the new
> bilinear code, which pdftoppm command line are you using exactly?

pdftoppm.exe -r 400 page.pdf

Here's the version:

$ pdftoppm.exe -h
pdftoppm version 0.22.5

Ok, sorry my bad, yes

./utils/pdftoppm -r 400 -png ~/page.pdf lolo

does hit the bilinear filtering path. So the new name of the bug + enhancement category should fix the thing for you and get the old behaviour.

Now, are you guys interested in coding this feature? Should be a matter of no more than 50 lines, basically a setter on SplashOutputDev, a setter on Splash, modifying isImageInterpolationRequired and making pdftoppm set that option if needed + the manpage change.

Meanwhile, let me elaborate a little on the topic of scaling. Here's a pic: http://i.stack.imgur.com/JMpbI.png -- this is just some pic I found on the web which is good for illustrating what I want to say. Let's say that the image we're trying to display is a raster image, and in this pic it's the edgy b/w thing. Let's also assume that we want to render it at a low resolution, so that a pixel on the screen (or any other device) is much larger than a pixel of the image - let's say the pale gray squares in the pic are the screen pixels. What are the different modes of rendering?
In what follows, I will basically retell this: http://netpbm.sourceforge.net/doc/pamscale.html

-- Discrete Sampling: find the geometric center of each screen pixel and see within which of the image pixels this center falls, then render the (big) screen pixel with the color of the thus-found center image pixel.

-- Pixel Mixing: for each big screen pixel find the average intensity as follows:

  intensity of the screen pixel =
    { (sum over all image pixels that are inside the screen pixel, even those
       that are partially inside) OF
      (intensity of the image pixel) X (area of the image pixel that is inside
       the screen pixel) }
    / (area of the screen pixel)

This formula actually works for any scale factor (i.e., linear size of the screen pixel divided by linear size of the image pixel). It produces greyscale images when the scale factor is not an integer (assuming we render a bitonal image), but produces bitonal images for integer scale factors. To quote the netpbm authors: "This method tends to replicate what the human eye does as it moves closer to or further away from an image. It also tends to replicate what the human eye sees, when far enough away to make the pixelization disappear, if an image is not made of pixels and simply stretches or shrinks."

In other words, this is the most natural method for scaling and thus the one that, I think, should be used by default. Again, this method will produce a bitonal image when a raster bitonal image at resolution D embedded in a PDF is rendered at resolution N x D, where N is an integer, and will produce a greyscale image otherwise.

N.B. The discussion of scaling above is equally applicable to color images. I'm using a bitonal example since it's easier to illustrate points with it.

Interestingly, this "pixel mixing" logic can be applied to rendering vector graphics as well. Obviously, it will almost always produce greyscale images even when rendering black vector graphics (say, black fonts).
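The pixel-mixing rule described above can be sketched in a few lines. This is an illustrative 1-D box filter under the definitions in this comment, not poppler code; `pixelMixRow` is a hypothetical name:

```cpp
#include <algorithm>
#include <vector>

// Pixel mixing (area-weighted averaging) for one row of 8-bit samples:
// each destination pixel is the average of the source pixels it covers,
// weighted by the covered area. Hypothetical helper, not poppler API.
std::vector<unsigned char> pixelMixRow(const std::vector<unsigned char> &src,
                                       int dstWidth) {
  const int srcWidth = static_cast<int>(src.size());
  std::vector<unsigned char> dst(dstWidth);
  for (int i = 0; i < dstWidth; ++i) {
    // source interval [x0, x1) covered by destination pixel i
    const double x0 = static_cast<double>(i) * srcWidth / dstWidth;
    const double x1 = static_cast<double>(i + 1) * srcWidth / dstWidth;
    double sum = 0.0;
    for (int j = static_cast<int>(x0); j < x1; ++j) {
      const double overlap =
          std::min(x1, j + 1.0) - std::max(x0, static_cast<double>(j));
      sum += overlap * src[j];
    }
    dst[i] = static_cast<unsigned char>(sum / (x1 - x0) + 0.5);
  }
  return dst;
}
```

As claimed above, an integer scale factor keeps a bitonal row bitonal: downscaling {0, 0, 255, 255} to width 2 (factor exactly 2) yields {0, 255}, while downscaling {0, 255, 0} to width 2 (factor 1.5) yields the grey values {85, 85}.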
Now, THIS is where some people would like to have a choice: let's say we have a hybrid page with black vector fonts and raster graphics. When using pixel mixing, the fonts will always come out greyscale. I would assume that there are users, myself included, who would sometimes need the black fonts rendered as black, with no filtering, i.e. using the "discrete sampling" defined above. In fact, this already is an option in pdftoppm: -aaVector. I can only guess that the new code that you're referring to (the code that is not executed when rendering my example) is the one that renders vector graphics. There, in fact, several methods can be used - see the netpbm page I linked to, specifically the "resampling" section. The reason to consider something beyond pixel mixing is that "resampling" may produce anti-aliased results that appear sharper. But again, this may be made an option.

Anyhow, it seems to me still that there's some bug. That is, I tend to think that raster graphics is still rendered using pixel mixing, as it has always been in the early versions of Poppler/Xpdf. What I think is causing the unneeded smoothing at scale=1 is some bug in the geometry calculations. This is a slippery part, in fact: if we render a 300dpi image at, say, 301dpi, then we'd get greyscale with pixel mixing. Now, one can easily get 301dpi out of 300dpi through a rounding error somewhere. Also, at scale=1 the screen pixel size equals the image pixel size. This means one may fit the two grids so that each image pixel fits perfectly inside a screen pixel. That is, one may - but one may also translate one of the grids by 1/2 pixel and end up with intensity averaging. In my experience with Xpdf this never ever happened.
Somehow, the guys who wrote it were really trying to align the two grids, it seems ) Seriously, however, the only sensible approach - given the fact that the position of an image on the screen is arbitrary - probably was to check whether the scale is an integer and switch off pixel mapping in those cases.

"pixel mapping" -- typo, I meant "pixel mixing"

"What I think is causing unneeded smoothing at scale=1 is some bug in the geometry calculations."

No, it is caused by the fact that it executes the new scaleImageYuXuBilinear code instead of the old scaleImageYuXu.

About the rest of the comment, no clue, i'm by far not a graphics expert.

(In reply to comment #15)
> Ok, sorry my bad, yes
> ./utils/pdftoppm -r 400 -png ~/page.pdf lolo
>
> Does hit the bilinear filtering path.

Again, I don't really understand why raster graphics needs any bilinear filtering since, as I wrote, the good old pixel mixing (I assume that's what the old code used) is the natural way to go. Rendering vector graphics, again, is a different matter, though.

> So the new name of the bug +
> enhancement category should fix the thing for you and get the old behaviour.
>
> Now, are you guys interested in coding this feature?

"Guys" is just myself.

> Should be a matter of
> no more than 50 lines, basically a setter on SplashOutputDev, a setter on
> Splash, modifying isImageInterpolationRequired and making pdftoppm set that
> option if needed + the manpage change.

I might. However, the issue is not very high on my personal priorities list since I'm all set using Poppler v 0.20 for now, and unless Adobe releases an update to their PDF spec such that the old Poppler code won't render it, I won't feel any pressure really.

(In reply to comment #18)
> "What I think is causing unneeded smoothing at scale=1 is some bug in the
> geometry calculations."
>
> No, it is caused by the fact that it executes the new scaleImageYuXuBilinear
> code instead of the old scaleImageYuXu

Oh, so it's just for images!
What was the motivation for moving to a new algo, btw? Just curious. Obviously, it did better in some situation, I assume. Can I see an example? Or can you point me to the right place to look, please?

Reading Splash.cc:

----------
// determine if a scaled image requires interpolation based on the scale and
// the interpolate flag from the image dictionary
static GBool isImageInterpolationRequired(int srcWidth, int srcHeight,
                                          int scaledWidth, int scaledHeight,
                                          GBool interpolate) {
  if (interpolate)
    return gTrue;

  /* When scale factor is >= 400% we don't interpolate. See bugs #25268, #9860 */
  if (scaledWidth / srcWidth >= 4 || scaledHeight / srcHeight >= 4)
    return gFalse;

  return gTrue;
}
-----------

I had to reread the comment "When scale factor..." 3 or 4 times: I could not believe what I was seeing. https://bugs.freedesktop.org/show_bug.cgi?id=25268 -- here's what happened, in simple words: xpdf/early versions of Poppler used nearest-neighbor scaling (same as discrete sampling in the netpbm lingo). This, clearly, produced results much worse than Adobe's, which uses pixel mixing, as it should. Some folks noticed that Evince (based on Poppler) produces bad quality results and asked: how come Adobe does so much better? Someone brought the "interpolate" tag into the discussion, and, as is usually the case when people don't know exactly what they are talking about, a hypothesis surfaced: "Adobe must be interpolating those images despite the fact that the interpolate flag is not set", that is, "Adobe is cheating to get nice looking results". And so the developers introduced interpolation for all images in Poppler. Soon people came forward who realized that this downgrades the quality (essentially the same issue as the one I raised originally!). And so the compromise was made: "When scale factor is >= 400% we don't interpolate."
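To make the heuristic concrete, here is a simplified restatement of the check quoted above (plain bool instead of GBool; illustrative, not the verbatim poppler source). Because `scaledWidth / srcWidth` is C integer division, every scale factor below 4 -- including exactly 1, the case this bug is about -- falls through to "interpolate":

```cpp
// Simplified restatement of the isImageInterpolationRequired() heuristic
// quoted above. Illustrative only, not the exact poppler source.
bool interpolationRequired(int srcWidth, int srcHeight,
                           int scaledWidth, int scaledHeight,
                           bool interpolate) {
  if (interpolate)
    return true;  // the PDF's Interpolate flag always wins
  // integer division: 399/100 == 3, so a 3.99x scale is still "below 400%"
  if (scaledWidth / srcWidth >= 4 || scaledHeight / srcHeight >= 4)
    return false;  // at >= 400%, fall back to nearest neighbour
  return true;  // everything else, even an exact 100% scale, gets smoothed
}
```

So a 100x100 image rendered into a 100x100 area is still smoothed, while the same image rendered into 400x400 is not -- exactly the ad hoc behaviour complained about below.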
Let me say a couple of things:
1) This "rule" is a) ad hoc; b) wrong.
2) There's a fundamental misunderstanding: pixel mixing and interpolation are two different things!

Again from Splash.cc:

if (!tilingPattern && isImageInterpolationRequired(srcWidth, srcHeight, scaledWidth, scaledHeight, interpolate)) {
  scaleImageYuXuBilinear(src, srcData, srcMode, nComps, srcAlpha,
                         srcWidth, srcHeight, scaledWidth, scaledHeight, dest);
} else {
  scaleImageYuXu(src, srcData, srcMode, nComps, srcAlpha,
                 srcWidth, srcHeight, scaledWidth, scaledHeight, dest);
}

So, basically, it's either "bilinear" (Adobe with "interpolate" set) or "nearest neighbor" ("poor" quality xpdf-style; "poor" here refers to the quality for non-integer scaling factors - this is the issue people were discussing). What about pixel mixing?? There's a third way (Adobe with "interpolate" not set) - the pixel mixing way!

Doing some more reading now, hold on...

Confirmed: Splash::scaleImageYuXu (scaling the image up) does not use pixel mixing, it uses nearest neighbor. Unbelievable! Netpbm has been around for ages. You didn't have to reinvent the wheel, you could have just copied the code.

Here's the plan:
1) introduce pixel mixing into Splash::scaleImageYuXu, and also into scaleImageYdXu & scaleImageYuXd -- any "u" (upsampling) function;
2) reduce

if (!tilingPattern && isImageInterpolationRequired(srcWidth, srcHeight, scaledWidth, scaledHeight, interpolate))
  scaleImageYuXuBilinear(...);

to

if (!tilingPattern && interpolate)
  scaleImageYuXuBilinear(...);

that is, interpolate if and only if the "interpolate" flag is set.
I'm going to start with rewriting a pair of functions (both defined in Splash.cc) that implement bilinear interpolation:

static void expandRow(Guchar *srcBuf, Guchar *dstBuf, int srcWidth, int scaledWidth, int nComps)

void Splash::scaleImageYuXuBilinear(SplashImageSource src, void *srcData, SplashColorMode srcMode, int nComps, GBool srcAlpha, int srcWidth, int srcHeight, int scaledWidth, int scaledHeight, SplashBitmap *dest)

The first one interpolates along x, the second one interpolates along y using the results of calling the first one on adjacent rows. There are two problems that I'm going to address:
1) both functions use floating point math; I'm going to replace that with integer math, fully preserving accuracy and hugely improving the speed;
2) when scaledWidth == srcWidth and scaledHeight == srcHeight, the nodes of the source and destination grids coincide, and the interpolation should (mathematically speaking) produce an identical image. So, I suspect, there is indeed a bug after all. I'm going to find it.

Once I solve both problems the status of the bug will be as follows:
1) the original issue (bug) of fuzzy rendering at scale = 1 will be resolved;
2) for scale > 1 the rendering of images will be done as before, but at improved speed. In particular, at 1 < scale < 4 (4 is an AD HOC number introduced previously by someone else) the rendering will be done using interpolation REGARDLESS of whether the "interpolate" flag is set. I consider the last point a (minor) bug as well, but one that doesn't bother me personally. I'm not going to address it.

In summary, I'm going to: 1) improve the speed of image rendering at scale > 1, while not altering what is produced - images produced will be interpolated for 1 < scale < 4, as before; 2) fix interpolation at scale = 1: the image should just be copied 1-to-1 in this case, obviously.
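As a sketch of what an integer-math row expansion might look like (hypothetical code assuming 16.16 fixed point; `expandRowInt` is an illustrative name, the actual patch is attached to this bug): the step between destination samples is computed once in fixed point, and when scaledWidth == srcWidth that step is exactly 1 << 16, so every fractional part is zero and the row is copied 1-to-1 -- precisely the scale=1 identity argued for above.

```cpp
#include <vector>

// Linear interpolation along x in 16.16 fixed point, for one row of
// single-component 8-bit samples. Hypothetical sketch of an integer-math
// expandRow; the poppler original uses floating point.
std::vector<unsigned char> expandRowInt(const std::vector<unsigned char> &src,
                                        int scaledWidth) {
  const int srcWidth = static_cast<int>(src.size());
  std::vector<unsigned char> dst(scaledWidth);
  if (srcWidth < 2 || scaledWidth < 2) {
    for (int i = 0; i < scaledWidth; ++i)
      dst[i] = src[0];  // degenerate cases: nothing to interpolate
    return dst;
  }
  // distance between destination samples in source coordinates (16.16);
  // for scaledWidth == srcWidth this is exactly 1 << 16
  const long long step =
      (static_cast<long long>(srcWidth - 1) << 16) / (scaledWidth - 1);
  long long pos = 0;  // current source position, 16.16 fixed point
  for (int i = 0; i < scaledWidth; ++i, pos += step) {
    int x = static_cast<int>(pos >> 16);
    int frac = static_cast<int>(pos & 0xffff);
    if (x >= srcWidth - 1) {  // clamp at the right edge
      x = srcWidth - 2;
      frac = 0x10000;
    }
    // frac == 0 copies src[x] exactly; no rounding drift at integer steps
    dst[i] = static_cast<unsigned char>(
        (src[x] * (0x10000 - frac) + src[x + 1] * frac) >> 16);
  }
  return dst;
}
```

At scaledWidth == srcWidth the output equals the input byte for byte; expanding {0, 255} to width 3 gives {0, 127, 255}.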
The plan in comment 23 obsoletes the plan in comment 22, so please disregard the latter (the issues raised there are partly valid, however).

Created attachment 84829 [details]
pdf with exactly 400ppi image
The image is not exactly 400ppi.
$ pdfimages -list page.pdf
page num type width height color comp bpc enc interp object ID x-ppi y-ppi size ratio
--------------------------------------------------------------------------------------------
1 0 image 2284 3471 gray 1 1 jbig2 no 13 0 401 400 3365B 0.3%
In fact looking at the image matrix it has a 0.0475 degree rotation.
411.1199951 0.3412323 -0.5185699 624.7799988 3.6575623 -0.1516113 cm
I've attached a modified version of the file with the rotation removed and the image resolution exactly 400ppi.
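For reference, the 0.0475-degree figure can be recovered from the matrix itself: a PDF `cm` operator takes six numbers a b c d e f, and the rotation of the x axis is atan2(b, a). A small helper (illustrative, not poppler code):

```cpp
#include <cmath>

// Rotation (in degrees) of the x axis encoded by a PDF transformation
// matrix [a b c d e f]. Illustrative helper, not poppler API.
double rotationDegrees(double a, double b) {
  return std::atan2(b, a) * 180.0 / 3.14159265358979323846;
}
```

With the numbers quoted above, rotationDegrees(411.1199951, 0.3412323) is about 0.0476 degrees, in line with the figure mentioned for this file.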
Created attachment 84866 [details]
Force Image Interpolation Option

I have done more than what I planned to:

1) Wrote expandRowInt() and Splash::scaleImageYuXuBilinearInt(), which use integer math - should work much faster; they also use more accurate geometry math, but that would be visible only when scaling, say, a 2x2 pixel image to 3x3.

2) Rewrote isImageInterpolationRequired():

--------------
// determine if a scaled image requires interpolation based on the scale and
// the interpolate flag from the image dictionary
static GBool isImageInterpolationRequired(int srcWidth, int srcHeight,
                                          int scaledWidth, int scaledHeight,
                                          GBool interpolate,
                                          GBool forceImageInterpolationA) {
  if (interpolate)
    return gTrue;

  /* When asked not to interpolate, or when scale factor is >= 400%, we don't
     interpolate. See bugs #25268, #9860, #68360 */
  if (!forceImageInterpolationA ||
      scaledWidth / srcWidth >= 4 || scaledHeight / srcHeight >= 4)
    return gFalse;

  return gTrue;
}
----------------

By default, forceImageInterpolation == gTrue everywhere, so the behaviour is exactly as before: ad hoc, but what are you gonna do if people are used to that? If it's set to false, no interpolation is done unless the PDF file requests it by itself. The forceImageInterpolationA parameter is eventually settable via globalParams, similar to the antialiasing options.

3) Changed the constructors of SplashOutputDev and Splash to incorporate the forceImageInterpolationA argument. This allows one to create instances of those classes that behave just as pre-0.20 versions of Poppler would. At the same time, all code that assumes the post-0.20 behavior is 100% compatible.

4) Added command line parsing for the new "-fii" option in pdftoppm. This allows one to control the forceImageInterpolation option.

5) Updated the pdftoppm.1 man page.

Can you please attach a proper patch produced by git diff? Do you have any before and after timings? I think the -fii option should just be called "-ii".
We don't use the word "force" in similar options like -aa. It would also be better if the option took three values: "yes", "no", and "default". This would allow the user to force interpolation to be always on, always off, or follow the existing behavior.

I've never used git, but I can do a simple diff. Would you like that?

Timings - I'm on Cygwin, while I'm compiling Poppler purely within Windows. For this reason I can't use Cygwin's "time" command - i.e. I can, but it won't properly report the program time; it bunches everything into the "real" time section for native Windows executables. Can someone do the tests under a Unix?

Regarding the naming - feel free to change it to whatever you like. Regarding a default value - I just copied the syntax of the option parsing from the -aa option, so if you know how to do better, feel free to do so.

Created attachment 84903 [details]
The patch for Force Image Interpolation
Patch produced by running diff -uN file.{h|cc} file-new.{h|cc} > file.{h|cc}.patch consecutively on all changed files. My first Unix patch ever, so please tell me if I'm not doing it right )
(In reply to comment #30)
> Created attachment 84903 [details]
> The patch for Force Image Interpolation
>
> Patch produced by running diff -uN file.{h|cc} file-new.{h|cc} >
> file.{h|cc}.patch consecutively on all changed files. My first Unix patch
> ever, so please tell me if I'm not doing it right )

If you are going to use diff, the best way is to keep a copy of the original untarred poppler. Make your changes in another copy of the poppler tree. Then:

$ diff -rup <poppler-orig-dir> <poppler-new-dir>

Using git would be preferred. The most basic use of git would be:

$ git clone git://git.freedesktop.org/git/poppler/poppler
$ cd poppler
# make some changes
$ git diff

(In reply to comment #29)
> Timings - I'm on Cygwin, while I'm compiling Poppler purely within
> Windows. For this reason I can't use Cygwin's "time" command - i.e. I can,
> but it won't properly report the program time; it bunches everything into
> the "real" time section for native Windows executables. Can someone do the
> tests under a Unix?

Ok, I can do the timing test when I have a usable diff I can apply.

> Regarding the naming - feel free to change it to whatever you like. Regarding
> a default value - I just copied the syntax of the option parsing from the -aa
> option, so if you know how to do better, feel free to do so.

Generally the author of the patch is expected to address the review comments. After all, it is you who wants this change applied. Changing the option name would take you 5 seconds.

Created attachment 86277 [details]
poppler-0.24.1-bug-68360.patch
Sorry for the delay, this is the patch produced by
diff -rup poppler-0.24.1 poppler-0.24.1-new > poppler-0.24.1-bug-68360.patch
as requested
Please don't use GlobalParams to pass the data around; GlobalParams is a thing we're trying to get rid of. You have a SplashOutputDev there in pdftoppm, just use it.

Regarding the option, the naming seems wrong: what you want is an option to manually force it off, not to force it on. It should be called something like disableImageInterpolation and default to false.

Also, you should develop against git master as Adrian said; your patch does not apply to git master already, and we are not going to add this new feature to stable releases.

Created attachment 100363 [details] [review]
The git patch against master

Sorry for the inactivity, just recently got time to actually produce the patch. In short: all the "forceImageInterpolation" stuff has been removed altogether. What is added is:

1) an integer math version of the bilinear scaling algorithms, and a change to the code in SplashOutputDev to use the former instead of the floating point math version;

2) new code in drawImage() and isImageInterpolationRequired() that changes how images are drawn at native resolutions and their multiples (other resolutions are unaffected). The native rendering resolution of an image is the image's own resolution, so that an image of n x m pixels occupies a rectangle of n x m pixels on the raster output device when rendered at native resolution;

3) a few explicit type casts to reduce the number of compiler warnings; this is irrelevant to the current bug, really.

The main effect of 1) is that the code should be faster (to test speed use pdftoppm -r 450 in the example below). The main effect of 2) can be observed by preparing, say, a bitonal TIFF (CCITT Group 4 compression) image at, say, 300dpi, embedding it in a PDF (I use Acrobat's import function, but for those without Acrobat there's a tool called c42pdf which can embed such images without re-encoding), then running pdftoppm -r 300 file.pdf (the new, patched version) on the PDF.
What you get is a PBM file which is IDENTICAL to the bitonal image you started with. Furthermore, when running pdftoppm -r 600, each original pixel becomes a block of 2x2 pixels, and similarly at all resolutions that are multiples of 300dpi. At all other resolutions the rendering behavior has been left as it was.

Please attach all the files you are describing in your test, it'll be easier for us to review/reproduce what you say; also i've no clue where to find c42pdf.

Created attachment 100415 [details]
Test files for the patch
Here are the test files. The file names are pretty self-explanatory.
http://c42pdf.ffii.org/ --- the tool I mentioned.

Created attachment 110149 [details]
pdf that gets me an arithmetic exception
After weeks of looking at literally hundreds of PDFs and deciding whether the new code is better than the old one, i stumbled over one where the new code crashes.
Please have a look
i run
./utils/pdftoppm thefile.pdf mooooooooooooo
Ping?

Any news of this bug?

Created attachment 139030 [details]
attachment-21019-0.html

Nothing new. I used the fixed code for my project, never finished committing it to the Poppler tree, and eventually left the project, so I don't have any interest in doing anything about the issue anymore. But I outlined the issue and posted some code, so anyone really interested can easily go all the way with the patch.

Tue, 24 Apr 2018, 0:55 <bugzilla-daemon@freedesktop.org>:

> Comment #41 <https://bugs.freedesktop.org/show_bug.cgi?id=68360#c41> on
> bug 68360 <https://bugs.freedesktop.org/show_bug.cgi?id=68360> from
> roucaries.bastien+bugs@gmail.com
>
> Any news of this bug ?
>
> You are receiving this mail because:
> - You reported the bug.

-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/poppler/poppler/issues/515.