This page describes how to create "Layout Engine Test Cases".

Guidelines

Where are they?

You can find the test cases under the following directory:

test/layoutengine

http://svn.apache.org/viewcvs.cgi/xmlgraphics/fop/trunk/test/layoutengine/

The test cases are divided into groups depending on which resources they need. Test cases that should work against the "out-of-the-box" version of FOP are in test/layoutengine/standard-testcases. Test cases that require hyphenation files are in test/layoutengine/hyphenation-testcases. Further subdirectories may be added in the future.

How do I run them?

The easiest way is to simply run the FOP build process. There's an Ant target called "junit" that runs all of FOP's enabled tests. The target "junit-layout" runs all the layout engine tests and the targets "junit-layout-standard" and "junit-layout-hyphenation" run only the standard and hyphenation tests respectively.
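
For example, from the FOP base directory (the directory containing build.xml):

ant junit-layout

runs all the layout engine tests, and

ant junit-layout-standard

runs only the standard ones.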

Or you can set up FOP in your favorite Java IDE and run the JUnit tests from there.

Please note that the file disabled-testcases.xml in test/layoutengine contains a list of file names: these tests are excluded from the JUnit tests by default. They currently fail and are kept as a reminder for those who fix bugs or implement missing features.

Setting up the Test Suite in your favorite IDE

You can run the test suite from your IDE by setting up a JUnit launch configuration on the org.apache.fop.layoutengine.LayoutEngineTestSuite class. There are a few system properties you can use to control which test cases are run.

To make switching between different sets of test cases easier, consider appending an "x" (or similar) to a property's name; the property is then no longer found, which effectively disables it.

Eclipse Users

Eclipse users can set these properties in the "Run Configurations" dialog: on the second tab, "Arguments", enter them in the text box labeled "VM Arguments".

Examples (the property name shown below is the one used by FOP's test code at the time of writing; verify the exact names supported against LayoutEngineTestSuite):
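
-Dfop.layoutengine.disabled=test/layoutengine/disabled-testcases.xml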

Example of a disabled property (the appended "x" makes the name unknown, so the property is ignored):
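
-Dfop.layoutengine.disabledx=test/layoutengine/disabled-testcases.xml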

How does it work?

The tests are basically XML files with a predefined structure. Here's the raw structure:

<testcase>
  <info>
    <p>
      This test checks <something>.....
    </p>
  </info>
  <variables>
    <img>../../resources/images/bgimg300dpi.jpg</img>
  </variables>
  <fo>
    <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format" xmlns:svg="http://www.w3.org/2000/svg">
      <fo:layout-master-set>

        <!-- etc. etc. -->

        <fo:block-container background-image="##img">

        <!-- etc. etc. -->

    </fo:root>
  </fo>
  <checks>
    <eval expected="0 0 360000 360000" xpath="/areaTree/pageSequence/pageViewport/@bounds" desc="page size"/>
    <true xpath="/areaTree/pageSequence/pageViewport/page[1]"/>
    <true xpath="not(/areaTree/pageSequence/pageViewport/page[2])"/>
    <eval expected="0 0 360000 360000" xpath="/areaTree/pageSequence/pageViewport/page[1]/regionViewport/@rect" desc="region body area"/>
  </checks>
</testcase>

When you are creating a test case, you can use the XSLT stylesheet "testcase2fo.xsl" to run the test manually through FOP (or any other XSL-FO implementation). The stylesheet simply extracts the XSL-FO document from the test case file.
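
For example, using an XSLT processor such as xsltproc (the test case file name is illustrative):

xsltproc test/layoutengine/testcase2fo.xsl mytestcase.xml > mytestcase.fo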

When the JUnit tests for the layout engine are run, the following happens: the checks are extracted from the XML file, and each of them is run against the result of a FOP processing run. Most tests check against the "Area Tree XML" that is generated by FOP's XMLRenderer. From the command line you can obtain it by using "-at" instead of "-pdf", for example. These checks are normally XPath queries against the area tree XML (see the "true" and "eval" checks below).
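
For example (the exact command-line syntax may vary slightly between FOP versions):

fop -fo mytest.fo -at mytest.at.xml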

When you run test cases, the generated area tree XML is written to the build/test-results/layoutengine directory.

What checks do I have available?

Area Tree Checks

true

Format:

<true xpath="[XPath expression which results in a boolean value (true or false)]" fail-msg="[optional string]"/>

If the XPath expression evaluates to anything other than "true", the test fails. The optional fail-msg attribute can be used to supply a custom error message when the check fails.
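
For example, this check (reusing an expression from the sample test case above) fails with a custom message if a second page was generated:

<true xpath="not(/areaTree/pageSequence/pageViewport/page[2])" fail-msg="expected exactly one page"/>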

eval

Format:

<eval expected="[expected value]" xpath="[XPath expression]" tolerance="[optional number]" desc="[optional description]"/>

This is similar to the "true" check, but here you specify an expected value that the XPath expression must evaluate to. If such a test fails, you generally get a more descriptive error message than with "true". You can use the optional "tolerance" attribute to specify a tolerance for number comparisons. The optional "desc" attribute (used in the sample test case above) describes what is being checked.
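
For example (the XPath expression is illustrative; the exact path depends on the structure of the generated area tree):

<eval expected="14400" xpath="//flow/block[1]/lineArea[1]/@bpd" desc="height of the first line" tolerance="1"/>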

element-list

Format:

<element-list category="[category]" id="[id]" index="[index]">
  (box|penalty|glue|skip)*
</element-list>

This kind of check is intended for those who know how the Knuth element list approach works, i.e. mostly for developers. With this check you can intercept and verify an element list that is generated during layout.

<box w="[length]"/>

<penalty w="[length]" p="[p]" flagged="[boolean]" aux="[boolean]"/> <!-- p can also be "INF" or "-INF" -->

<glue w="[length]"/> <!-- stretch and shrink are not yet implemented -->

<skip>[integer]</skip> <!-- can be used to skip n elements in the list -->
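
A sketch of a complete check (the category name and element values are illustrative, not taken from an actual test): skip the first two elements, then expect a box, a forced-break penalty and a glue:

<element-list category="breaker" index="0">
  <skip>2</skip>
  <box w="14400"/>
  <penalty w="0" p="-INF"/>
  <glue w="0"/>
</element-list>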

Event Check

It is possible to check that certain events have occurred while processing the tests, and that (some of) their parameters match expected values. For example:

<event-checks>
  <event key="inlineContainerAutoIPDNotSupported" fallback="300.0"/>
</event-checks>

The general format is:

<event key="[key as returned by Event.getEventKey()]" [param1="expected value1" param2="expected value2" ...]/>

In theory we should check on the event ID rather than its key to avoid any ambiguity, but that would be impractical (the full ID of the event shown in the example is "org.apache.fop.layoutmgr.BlockLevelEventProducer.inlineContainerAutoIPDNotSupported"...). In practice, the key alone will be enough to distinguish events.

It is possible to specify only the subset of the event's parameters that is of interest. The expected value should be what the toString method called on the parameter would return.

The event checker processes events sequentially. If several events with the same key are expected, they will be checked in the given order against the sequence of actual events. The checker will not skip an event in an attempt to find a subsequent one that matches; this keeps the event checking precise. It also means that if you want to check one event, you must add checks for all the preceding events with the same key as well.

For example, if two "inlineContainerAutoIPDNotSupported" events are expected, one with a fallback value of 300 and one with a fallback value of 100, and we are interested in checking the second one, then we must still add a check for the first one. Otherwise, the checker has no means to determine whether the test should fail because the first event has a fallback value of 300 while 100 was expected, or whether it should pass because the first event should be ignored and the second one does have a fallback value of 100.
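
In that case, the checks would look like this:

<event-checks>
  <event key="inlineContainerAutoIPDNotSupported" fallback="300.0"/>
  <event key="inlineContainerAutoIPDNotSupported" fallback="100.0"/>
</event-checks>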

Events that are not checked will be passed on to a regular EventListener that will typically display the corresponding message on the standard output.

How to create additional kinds of checks?

You can implement new checks by subclassing org.apache.fop.layoutengine.LayoutEngineCheck. When you do that, be sure to register the new class in the static initialization block of org.apache.fop.layoutengine.LayoutEngineTester.
