I'm developing a package that produces visualizations. Most of my (informal) testing has been done in a notebook: generating visualizations and then repeatedly checking that things are still in order.
I was wondering if there is any Julia package (or testing design strategy) for such a situation. Perhaps comparing the SVG output to see if it matches (this seems highly inappropriate)?
Depending on how much run-to-run variance you expect in these visualizations, https://github.com/JuliaTesting/ReferenceTests.jl may help
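A minimal sketch of how that could look (here `make_plot()` is a hypothetical function from the package under test that returns an image; `@test_reference` compares against a stored file and offers to create it interactively on the first run):

```julia
using ReferenceTests, Test

@testset "visual regressions" begin
    # Hypothetical: returns the rendered figure as an image (e.g. a matrix of colorants).
    img = make_plot()
    # Compares `img` against the stored reference image on disk.
    @test_reference "references/simple_plot.png" img
end
```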
Can you explain what you don't like about comparing the svg output? Which aspects of a figure do you expect to be preserved across runs?
If the new SVG has extra whitespace that doesn't affect the rendered image, the test would still fail.
Although, changes in margins and white space could be considered significant changes… I don't think there's an easy way to get the bounding box of the visible graphics from an SVG - lots of transforms to analyse… Although you could convert it to PNG and look for borders :thinking:
I meant in the SVG file. Like <circle cx = 1> and <circle cx=1>.
you can parse the xml and compare that
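For instance, with EzXML.jl (a sketch; note that in well-formed XML the attribute values must be quoted, unlike the shorthand above):

```julia
using EzXML

# The same element with different whitespace around the attribute:
a = parsexml("""<svg><circle cx = "1"/></svg>""")
b = parsexml("""<svg><circle cx="1"/></svg>""")

# Compare canonical serializations rather than the raw file bytes;
# the parser normalizes insignificant whitespace away.
string(a) == string(b)
</imports>
```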
cormullion said:
Although, changes in margins and white space could be considered significant changes… I don't think there's an easy way to get the bounding box of the visible graphics from an SVG - lots of transforms to analyse… Although you could convert it to PNG and look for borders :thinking:
Yeah, the closest I got was to turn it into a PNG and define a distance function over the pixel matrices to decide whether they are similar enough.
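That approach can be sketched in a few lines (assuming both renders have been decoded into same-sized grayscale Float64 matrices with values in [0, 1]; all names here are made up):

```julia
using Statistics

# Mean absolute pixel difference between two grayscale images.
imgdist(a::AbstractMatrix, b::AbstractMatrix) = mean(abs.(a .- b))

# Declare two renders "similar enough" if they have the same size and
# their mean pixel difference is below a tolerance.
issimilar(a, b; tol = 0.01) = size(a) == size(b) && imgdist(a, b) <= tol
```

Usage would then be something like `issimilar(render_png(fig), load("references/fig.png"))`, with `render_png` standing in for whatever rasterization your package uses.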
Sebastian Pfitzner said:
you can parse the xml and compare that
wouldn't it still suffer from the same "white space" issue?
Your two examples should parse the same.
But you'll still run into issues if something in your stack changes and svgs get generated differently without a visible change
Sebastian Pfitzner said:
Your two examples should parse the same.
I mean, if I have something like <circle ... style = {fill=nothing, stroke=red}> and <circle ... style = {stroke=red}> I think I'd again get an error.... I can think of other similar "differences" that are not actually "important".
Sebastian Pfitzner said:
But you'll still run into issues if something in your stack changes and svgs get generated differently without a visible change
Yeah, that is the issue here. I actually want to compare the images, not the actual svg. At least, not in these tests I'm writing.
I think raster images seem to be the way to go.
Thanks for the inputs.
Davi Sales Barreira has marked this topic as resolved.
What about this pkg https://github.com/JuliaPlots/VisualRegressionTests.jl ?
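From memory of its README (double-check the exact macro arguments against the package docs), the idea is that you pass it a function that saves a plot to a file, and it diffs the result against a reference image:

```julia
using VisualRegressionTests, Test

# The test function must accept a filename and save the plot there.
# `make_plot` and `savefig` are hypothetical calls from your own package.
function makeplot(fname)
    fig = make_plot()
    savefig(fig, fname)
end

@plottest makeplot "references/simple_plot.png"
```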
Last updated: Nov 06 2024 at 04:40 UTC