Reference manual

Architectural overview

Synopsis

The core functionality of the IDEAL software is to compute the dose distribution for any PBS ion beam treatment plan, using Geant4/GATE as the dose engine. The primary purpose is to provide an independent dose calculation tool that may be useful in clinics, but the software is designed to be useful also for a variety of research topics related to dose calculations.

Since Geant4/GATE simulations are quite CPU time consuming, IDEAL is designed to run on a GNU/Linux cluster in order to achieve the statistical goal within a reasonable time. IDEAL is implemented as a set of python modules and scripts. These convert the information in the input DICOM plan data into a set of data files and scripts that can be used by Gate to simulate the delivery of each of the beams requested in the treatment plan, using beamline model details and CT calibration curves that are configured by the user during installation and commissioning of IDEAL. The dose distribution for each beam is computed separately up to the statistical goal specified by the user: a number of primaries, an average uncertainty or a fixed calculation time. The dose grid is by default the same as used by the treatment planning system, but the user can choose to override the spatial resolution. The beam doses as well as the plan dose are exported as DICOM files.

Workflow

A device-oriented example view of the workflow is illustrated in the figure below.

"IDEAL workflow diagram"

In this view, the user is logged in to the treatment planning workstation.

  1. The user exports a treatment plan from the TPS to DICOM files on a shared file system. The exported data include the planning CT, the treatment plan (including the ion beam sequence), the structure set and physical and/or effective dose for the beams and/or the plan, in a folder that is also mounted on the submission node of the cluster.

  2. The user logs in to the submission node of the cluster and runs one of the IDEAL user interface scripts (clidc.py or sokrates.py) to start an independent dose calculation. The minimum input information is the treatment plan file and the statistical goal (number of primaries or average uncertainty). Optional overrides of the default settings are discussed in the following sections.

  3. IDEAL collects and checks all necessary DICOM data referred to by the treatment plan file: the structure set (including exactly one ROI of type “External”), the planning CT and the TPS dose.

  4. IDEAL finds the beamline model(s) corresponding to the treatment machine(s) specified in the plan file, as well as the conversion tables for the CT protocol of the planning CT.

  5. A ‘work directory’ is set up with all scripts specific for the IDC of the treatment plan:
     • configuration file for preprocessing (CT image cropping, material overrides)
     • shell scripts, macros and input files for GateRTion
     • configuration file for postprocessing (dose accumulation, resampling and export to DICOM)
     • submit file for the HTCondor (cluster job management system) “DAGman” job

  6. The IDC is started by submitting the condor “DAGman” job. The DAGman graph has just three nodes:
     • preprocessing on the submit node; typically takes about a minute.
     • simulation on the calculation nodes (Nbeams*Ncores jobs, where Nbeams is the number of beams and Ncores is the number of physical cores in the cluster); the runtime is typically on the order of hours, depending on many factors.
     • postprocessing on the submit node; takes a couple of minutes.

  7. A “job control daemon” is spawned which will regularly (typically every 300 seconds) check whether the statistical goal (average uncertainty or number of primaries) has been reached for each successive beam. If the goal is reached, then a semaphore file “STOP-<beamname>” is created in the work directory. The scripts that are called by the Gate “StopOnScript” actor check the presence of that semaphore file to decide whether to stop the simulation or to continue.
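The stop check performed by those scripts amounts to a simple file-existence test, which can be sketched as follows (the function name is hypothetical; the actual scripts called by the StopOnScript actor differ in detail):

```python
import os

def should_stop(workdir, beamname):
    # The job control daemon creates "STOP-<beamname>" in the work
    # directory once the statistical goal for that beam has been
    # reached; the StopOnScript hook only needs to check existence.
    return os.path.exists(os.path.join(workdir, "STOP-" + beamname))
```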

Preprocessing

The job-dependent details for the preprocessing are computed and saved to a text file by ideal.impl.idc_details.WritePreProcessingConfigFile(). The preprocessing is performed by the bin/preprocess_ct_image.py script as part of the HTCondor DAGman corresponding to the IDC job.

  • Cropping: the minimum bounding box is computed that encloses both the “padded” External ROI volume in the planning CT and the TPS dose distribution. The External ROI is padded with a fixed thickness of air on all six sides. The padding thickness can be configured with an entry for air box margin [mm] in the [simulation] section of the system configuration file.

  • Hounsfield unit values are truncated to the maximum HU value specified in the density curve given for the relevant CT protocol.

  • In all voxels whose central point is not within the External ROI volume, the material is overridden with air (G4_AIR).

  • Each override specified by the IDEAL user is applied to all voxels whose central point lies within the ROI to which the override applies. The user should not specify different material overrides for two or more overlapping ROIs, since the order in which the overrides will be applied is random.

  • The material overrides are implemented by extending a copy of the interpolated Schneider table corresponding to the relevant CT protocol. In the extension, HU values larger than the maximum HU value in the density curve tables are associated with the override materials, and in the preprocessed CT image those high HU values are used for the voxels that have material overrides.

  • A mass file is created with the same geometry as the cropped CT image, with floating point voxel values representing the density (in grams per cubic centimeter). For voxels with material overrides (e.g. outside the External ROI), the density value is taken from the overriding material; for all other voxels the density is obtained by simple linear interpolation in the density curve.

  • A dose mask file is created with the same geometry as the output dose files, with value 1 (0) for all voxels with the central point inside (outside) the External ROI.
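The dose mask construction described in the last bullet can be sketched as follows; `inside_external` is a hypothetical predicate standing in for the actual ROI containment test, and the origin is taken as the center of voxel (0,0,0) as in DICOM/ITK:

```python
import numpy as np

def make_dose_mask(shape, origin, spacing, inside_external):
    # Voxel value is 1 (0) if its central point lies inside (outside)
    # the External ROI. 'inside_external' is a hypothetical predicate
    # on a physical (x, y, z) point; the real code uses the structure
    # set contours.
    mask = np.zeros(shape, dtype=np.uint8)
    for idx in np.ndindex(*shape):
        center = np.asarray(origin) + np.asarray(idx) * np.asarray(spacing)
        if inside_external(center):
            mask[idx] = 1
    return mask
```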

Calculation of average uncertainty

The dose distribution of each beam in a beamset (or treatment plan) is computed separately. When computing the dose for one beam, GateRTion runs on (almost) all physical cores of the cluster, simulating protons or ions with kinematics that are randomly sampled from all spots in the beam, with probabilities proportional to the number of planned particles per spot and Gaussian spreads given by the source properties file for the beamline (treatment machine) by which the beam is planned to be delivered.

For all particles that are tracked through the TPS dose volume within the (cropped) CT geometry, the physical dose (in water, by convention) is computed by the DoseActor in GateRTion as the deposited energy divided by mass, multiplied by the stopping power ratio (water to material, where the material is either the Schneider material corresponding to the voxel Hounsfield value or override material, in case the user specified an override for a ROI including the voxel). The dose is saved periodically (by default every 300 seconds, configurable in the system configuration file).

The final dose grid (typically the same as that of the TPS) usually does not have the same spatial resolution as the CT. The job control daemon computes an estimate of the Type A (“statistical”) dose uncertainty in each (resampled) voxel by resampling the (intermediate) dose distributions from all condor jobs to the TPS dose grid, dividing each dose by the number of primaries simulated on the corresponding core, and computing the weighted average and weighted standard deviation of the dose-per-primary for each voxel. The number of primaries simulated on each core serves as the weight. The relative uncertainty in each voxel is the ratio of the (weighted) standard deviation to the (weighted) average.

A “mean maximum” value of dose-per-primary is estimated by computing the mean of the Ntop highest values in the distribution of the weighted average dose per voxel per primary. A threshold value is then defined as a fraction P (in percent) of this “mean maximum” dose-per-primary value. The “average uncertainty” is then computed as the simple unweighted average of the relative uncertainties in those voxels in which the dose-per-primary is higher than this threshold.

The values of Ntop and P can be set in the system configuration file (defaults: Ntop=100 and P=50 percent).
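The average uncertainty estimator described above can be sketched with numpy as follows (a simplified illustration, not the actual job control daemon code; the function and argument names are made up):

```python
import numpy as np

def average_uncertainty(doses, nprims, ntop=100, p=50.0):
    # doses: per-core dose arrays, already resampled to the TPS grid;
    # nprims: number of primaries simulated on each core (the weights).
    dpp = np.array([d / n for d, n in zip(doses, nprims)])  # dose per primary
    w = np.asarray(nprims, dtype=float)
    mean = np.average(dpp, axis=0, weights=w)
    var = np.average((dpp - mean) ** 2, axis=0, weights=w)
    std = np.sqrt(var)
    with np.errstate(divide="ignore", invalid="ignore"):
        rel = np.where(mean > 0, std / mean, 0.0)  # relative uncertainty
    # threshold: P percent of the mean of the Ntop highest mean values
    flat = np.sort(mean.ravel())[::-1]
    mean_max = flat[: min(ntop, flat.size)].mean()
    thr = 0.01 * p * mean_max
    sel = mean > thr
    # unweighted average of the relative uncertainties above threshold
    return rel[sel].mean() if sel.any() else 0.0
```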

When the user starts an IDC with an uncertainty value as the statistical goal, the job control daemon will apply this goal to the dose of every beam. The simulations for a beam will not be stopped until the average uncertainty estimate described above for the beam dose is below the target value. In other engines or treatment planning systems, the average uncertainty goal may refer to the plan dose instead.

Postprocessing

Actions in post processing:

  • Accumulate dose distributions and total number of primaries from simulations on all cluster cores.

  • Scale the dose with the ratio of the planned and simulated number of primary particles.

  • Scale the dose with an overall correction factor.

  • Resample the dose to the output dose geometry (typically the same as the TPS dose geometry).

  • For protons: compute a simple estimate of the “effective” dose by multiplying with 1.1.

  • Save beam doses in the format(s) configured by the user.

  • Compute the gamma index value for every voxel in the beam dose with dose above threshold, if gamma index parameters are configured and the corresponding TPS beam dose is available. Output only in MHD format (no DICOM).

  • Compute plan doses (if plan dose output is configured by the user).

  • Compute gamma index distributions for plan doses (if TPS plan dose is available and gamma index calculation is enabled in the system configuration).

  • Update user log summary text file with settings and performance data.

  • Copy final output (beam and plan dose files, user log summary) to another shared folder (typically a CIFS mount of a folder on Windows file share, to be configured by the user in the system configuration file).

  • Clean up: compress (rather than delete: to allow analysis in case of trouble) the outputs from all GateRTion simulations, remove temporary copies. This is usually the most time consuming part of the post processing.
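The first accumulation and scaling steps of the list above can be sketched as follows (the function name is hypothetical):

```python
import numpy as np

def accumulate_and_scale(partial_doses, nprims, n_planned, corr=1.0):
    # Sum the per-core dose distributions, then scale with the ratio of
    # planned to simulated primaries and an overall correction factor;
    # a sketch of the first postprocessing steps, not the IDEAL code.
    total = np.sum(partial_doses, axis=0)
    n_simulated = sum(nprims)
    return total * (n_planned / n_simulated) * corr
```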

(To be continued.)

Developers’ manual

How to hack stuff.

TODO list

  • Make IDEAL/pyidc into a “pip” project.

  • Support rotating gantry.

  • Support range shifter on movable snout.

  • The job_executor and idc_details classes are too big; lots of implementation details should be delegated to separate classes. Specifically, the creation of the Gate work directory and the creation of all condor-related files should be wrapped in separate classes. That would also make it more straightforward to support other cluster job management systems, such as SLURM and OpenPBS.

  • There are still many ways in which the user can provide wrong input and then get an error that is uninformative. For instance, using “G4_Water” instead of “G4_WATER” as override material for a ROI results in a KeyError that only says that the “key” is unknown.

Modules

The two central modules in IDEAL are the humbly named ideal.impl.idc_details and ideal.impl.job_executor. The ideal.impl.idc_details module is responsible for collecting and checking the input data and the user settings. The ideal.impl.job_executor is responsible for the execution of all steps of the calculation, which is organized in three phases: pre-processing, simulation and post-processing. Both modules should be used by a script (e.g. the command line user interface script clidc.py) that runs on the submission node of the cluster.

IDEAL currently has two scripts that provide user interfaces (UI): one “command line” user interface (clidc.py) and one “graphical” user interface (sokrates.py). Both these UIs are effectively just differently implemented wrappers around the idc_details and job_executor modules; with clidc.py, the user specifies the input plan file and simulation details via command line options, while with sokrates.py these inputs are selected with traditional GUI elements such as drop-down menus, radio buttons, etc. Research users are of course not limited to these UI scripts and can write their own scripts based on the IDEAL modules.

This module implements the “green” box in the “prepare” diagram included in the “device description” of IDEAL, the one that says “get beamline model”.

class impl.beamline_model.Test_GetBeamLineModel(methodName='runTest')
impl.dicom_dose_template.write_dicom_dose_template(rtplan, beamnr, filename, phantom=False)

Create a template DICOM file for storing a dose distribution corresponding to a given treatment plan.

  • rtplan: a pydicom Dataset object containing a PBS ion beam plan.
  • beamnr: a string containing the beam number to be used for referral. Should contain “PLAN” for plan dose files.
  • filename: file path of the desired output DICOM file.
  • phantom: boolean flag to indicate whether this is for a CT (False) or phantom dose (True).

impl.dual_logging.get_dual_logging(verbose=False, quiet=False, level=None, prefix='logfile', daemon_file=None)

Configure logging such that all log messages go both to a file and to stdout, filtered with different log levels.

This module defines the function which writes a Gate macro very similar to the one that Alessio composed for his PSQA simulations.

Some commands which were executed by Alessio from an external macro file are now copied here directly, in an effort to completely eliminate the use of aliases. Writing all beam numbers and beamset names explicitly hopefully facilitates later debugging.

impl.gate_macro.check(ct=True, **kwargs)

Function to check completeness and correctness of a long list of key word arguments for gate macro composition.

impl.gate_macro.roman_year()

Vanity function to print the year in Roman numerals.

impl.gate_macro.write_gate_macro_file(ct=True, **kwargs)

This function writes the main macro for PSQA calculations for a specific beam in a beamset.

  • beamset: name of the beamset
  • uid: DICOM identifier of the full plan/beamset
  • user: string that identifies the person who is responsible for this calculation
  • spotfile: text file

This module serves to define the available Hounsfield lookup tables (HLUT) as defined in the materials/HLUT/hlut.conf configuration file in the commissioning directory of an IDEAL installation.

In the ‘hlut.conf’ configuration file, each section represents a supported CT protocol and the data in the section should unambiguously describe for which kinds of CTs the protocol is relevant and how to use it to define the Gate/Geant4 material to use for each HU value in a given CT image with that protocol.

There are two kinds of protocols: “Schneider” protocols, which are intended for both clinical and commissioning purposes, and “commissioning” protocols, in which the HU to material assignment is coded directly by the commissioning physicist.

For the Schneider type protocols the ‘hlut.conf’ file should specify two text files, providing the “density curve” and the “material composition”, respectively. The density curve typically specifies the density for about a dozen HU values between -1024 and +3000. The composition file specifies a mixture of elements for each of several HU intervals. Gate/Geant4 has a method for converting these two tables, for a given density tolerance value, into a “HU to material” table, along with a database file that has the same format as the “Gate material database” file (usually named “GateMaterials.db”) and defines lots of “interpolated materials”, each with a slightly different density but with exactly the same composition as one of the materials in the “composition” file.

The conversion of the Schneider density and composition tables into the HU-to-materials table and the associated new database is taken care of by the gate_hlut_cache module, which saves the new table and database in a cache directory such that they can be reused.

The protocols can either be directly selected through their name or through a match with the metadata included in the CT DICOM files.

A match with DICOM metadata allows the “automatic detection” of the CT protocol. In the hlut.conf file, the commissioning physicist lists one or more DICOM tags (by keyword or by hexadecimal tag) and which text strings the DICOM values are expected to have for that CT protocol. It is the responsibility of the commissioning physicist to define this in such a way that for all data that will be used with this particular IDEAL installation, the automatic selection will always result in one single match. In case multiple DICOM tags are given per CT protocol, the protocol with the most matches “wins”.

The user may also provide a name or keyword in the IDEAL command invocation (e.g. the -c option of clidc.py), which is then directly matched with the CT protocol names. First a case-sensitive exact match is attempted; if that fails, a case-insensitive partial match is tried (the given keyword may be a substring of the full CT protocol name). If the partial match yields multiple results, an exception is thrown and the user should try again.
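The keyword lookup described here can be sketched as follows (a simplified stand-in, not the actual hlut_conf code; the protocol names in the example are made up):

```python
def match_protocol(keyword, protocol_names):
    # First a case-sensitive exact match, then a case-insensitive
    # substring match; anything other than exactly one partial match
    # is reported as an error, as described above.
    if keyword in protocol_names:
        return keyword
    partial = [p for p in protocol_names if keyword.lower() in p.lower()]
    if len(partial) == 1:
        return partial[0]
    raise KeyError(f"keyword {keyword!r} matches {len(partial)} CT protocols")
```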

class impl.hlut_conf.hlut(name, prsr_section, hutol=None)

This class serves to provide the HU to material conversion tables, based on the information from the HLUT configuration file. This can be either a “Schneider” type conversion (Schneider tables, interpolated with a density tolerance value to a large list of interpolated materials) or a “commissioning” type conversion in which the HU to materials are directly coded by the commissioning physicist.

match(ct)

For now, check that all requirements match. :param ct: a DCM data set as read with pydicom.dcmread.

class impl.hlut_conf.hlut_conf(s={})

This is a ‘singleton’ class, only one HLUT configuration object is supposed to exist. The singleton HLUT configuration object behaves like a Python dictionary with limited functionality: no comparisons (since there is only one such object), no additions or removals, but the elements are not strictly ‘const’. The HLUT configuration is initialized the first time its instance is acquired with the static getInstance() method.

static getInstance(fname=None)

Get a reference to the one and only “HLUT configuration” instance, if it exists. The ‘fname’ argument should ONLY be used by the unit tests. In normal use, the HLUT conf file should be defined by the system configuration.
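The singleton access pattern can be illustrated with the following simplified sketch (not the actual hlut_conf class, which offers a richer dictionary-like interface):

```python
class singleton_conf:
    # Simplified illustration of the singleton pattern used by
    # hlut_conf (and by system_configuration); not the IDEAL classes.
    _instance = None

    def __init__(self, data):
        self._data = dict(data)

    def __getitem__(self, key):
        # elements can be read; no additions or removals are offered
        return self._data[key]

    @staticmethod
    def getInstance(data=None):
        # the configuration is initialized the first time the
        # instance is acquired, as described above
        if singleton_conf._instance is None:
            singleton_conf._instance = singleton_conf(data or {})
        return singleton_conf._instance
```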

hlut_match_dicom(ctslice)

This method tries to find the HLUT whose DICOM metadata matching rules uniquely match the user-provided CT file object (a data set returned by pydicom.dcmread, e.g. the first CT slice in the series coming with the DICOM treatment plan file).

In case of success, it returns the CT protocol name.

In case of failure, this function will throw a KeyError with some explanatory text that should help the user understand what is going wrong.

hlut_match_keyword(kw)

This method compares a user-provided keyword (e.g. from the -c option for the clidc.py script) to the names of the CT protocols provided in the hlut.conf file.

In case of success, it returns the matching CT protocol name.

In case of failure, this function will throw a KeyError with some explanatory text that should help the user understand what is going wrong.

class impl.hlut_conf.test_good_hlut_conf(methodName='runTest')
setUp()

Hook method for setting up the test fixture before exercising it.

tearDown()

Hook method for deconstructing the test fixture after testing it.

This module implements the job executor, which uses a “system configuration” object and a “plan details” object to prepare the simulation subjobs, to run them, to combine the partial dose distributions into a final DICOM dose, and to report success or failure.

The job executor is normally created after an ‘idc_details’ object has been created and configured with all the plan details and user wishes. The job executor then creates the Gate workspace and cluster submit files. The job is not immediately submitted to the cluster: if the user interacts with the “sokrates” GUI, then the user can inspect the configuration by running a limited number of primaries and visualizing the result with “Gate --qt”. After the user gives the OK, the job is submitted to the cluster and control is taken over by the “job control daemon”, which monitors the number of simulated primaries, the statistical uncertainty and the elapsed time since the start of the simulation, and decides when to stop the simulations and accumulate the final results.

impl.system_configuration.get_sysconfig(filepath=None, verbose=False, debug=False, username=None, want_logfile='default')

This function parses the “system configuration file”, checks the contents against trivial mistakes and then creates the system_configuration singleton object.

  • filepath: file path of the system configuration file; dirname(script)/../cfg/system.cfg is used by default.
  • verbose: boolean; write DEBUG level log information to standard output if True, INFO level if False.
  • debug: boolean; clean up the bulky temporary data if False, leave it for debugging if True.
  • username: who is running this, with what role?
  • want_logfile: possible values are the empty string (no logfile), an absolute file path (implying no logs to stdout) or ‘default’ (a generated log file path; some logs will be streamed to standard output).

class impl.system_configuration.system_configuration(s={})

This is a ‘singleton’ class, only one system configuration object is supposed to exist. The singleton system configuration object behaves like a dictionary, except that you get a (shallow) copy of an object from it, instead of a reference, when you “get” an “item”.

The system configuration is initialized once at the start of whatever program is using it, through the function call get_sysconfig (see below). After initialization it should then not change anymore.

The implementation aims to avoid changing the system configuration inadvertently. This is not banking software.

static getInstance()

Get a reference to the one and only “system configuration” instance, if it exists.

This module was developed as part of the effort in Uppsala (2015-2017) to develop a framework to analyze the effect of patient motion on the dose distribution in proton PBS (primarily at Skandion, but it should work for other clinics as well). Authors: David Boersma and Pierre Granger

class utils.roi_utils.contour_layer(points=None, ref=None, name='notset', z=None, ignore_orientation=True)

This is an auxiliary class for the region_of_interest class defined below. A contour_layer object describes the ROI at one particular z-value. It is basically a list of 2D contours. The ones with positive orientation (sum of angles between successive contour segments is +360 degrees) will be used for inclusion, the ones with negative orientation (sum of angles is -360 degrees) will be used for exclusion. All points of an exclusion contour should be included by an inclusion contour.

utils.roi_utils.list_roinames(ds)

Return the names of the ROIs in a given dicom structure set as a list of strings.

utils.roi_utils.list_roinumbers(ds)

Return the numbers of the ROIs in a given dicom structure set.

utils.roi_utils.scrutinize_contour(points)

Test that points is a (n,3) array suitable for contour definition.

utils.roi_utils.sum_of_angles(points, name='unspecified', rounded=True, scrutinize=False)

Code to determine left/right handedness of contour. We do this by computing the angles between each successive segment in the contour. By “segment” we mean the difference vector between successive contour points. The cross and dot products between two segments are proportional to sine and cosine of the angle between the segments, respectively.
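The orientation test can be sketched as follows for a closed 2D contour (an illustration of the cross/dot product idea, not the actual sum_of_angles implementation):

```python
import numpy as np

def sum_of_angles_2d(points):
    # Sum the turning angles (in degrees) between successive contour
    # segments: +360 for a counterclockwise (inclusion) contour, -360
    # for a clockwise (exclusion) contour. 'points' is a closed 2D
    # contour given as a sequence of (x, y) pairs.
    pts = np.asarray(points, dtype=float)
    segs = np.diff(np.vstack([pts, pts[:1]]), axis=0)  # close the loop
    total = 0.0
    n = len(segs)
    for i in range(n):
        a, b = segs[i], segs[(i + 1) % n]
        cross = a[0] * b[1] - a[1] * b[0]  # proportional to sin(angle)
        dot = a[0] * b[0] + a[1] * b[1]    # proportional to cos(angle)
        total += np.degrees(np.arctan2(cross, dot))
    return round(total)
```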

This module provides a function for mass weighted resampling of an image file that contains a 3D dose distribution to match the geometry (origin, spacing, number of voxels per dimension) of a reference image.

The resampled dose R_j in voxel j of the new grid is computed as follows from the dose D_i of voxels i in the input dose distribution, the corresponding masses (or mass densities) M_i and the volumes V_i_j of each pair of input voxel i and output voxel j:

R_j = Sum_i w_i_j D_i / N_j
w_i_j = V_i_j * M_i
N_j = Sum_i w_i_j

If you choose to run the multithreaded version, then it is your own responsibility to make a wise choice for the number of threads, based on (e.g.) the number of physical cores, the available RAM and the current workload on the machine.
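A one-dimensional sketch of the mass weighted resampling formula above (for illustration only; the actual implementation works on 3D images and uses vectorized numpy operations):

```python
import numpy as np

def mass_weighted_resample_1d(edges_in, dose, mass, edges_out):
    # R_j = Sum_i w_i_j D_i / N_j with w_i_j = V_i_j * M_i, where
    # V_i_j is here the overlap length of input voxel i and output
    # voxel j; N_j = Sum_i w_i_j normalizes the weights.
    R = np.zeros(len(edges_out) - 1)
    for j in range(len(edges_out) - 1):
        num, norm = 0.0, 0.0
        for i in range(len(edges_in) - 1):
            overlap = min(edges_in[i + 1], edges_out[j + 1]) - max(edges_in[i], edges_out[j])
            if overlap > 0:
                w = overlap * mass[i]
                num += w * dose[i]
                norm += w
        R[j] = num / norm if norm > 0 else 0.0
    return R
```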

class utils.resample_dose.dose_resampling_tests(methodName='runTest')
setUp()

Hook method for setting up the test fixture before exercising it.

utils.resample_dose.enclosing_geometry(img1, img2)

Does img1 enclose img2?

This function could be used to test whether img2 can be used as a reference image for resampling img1.

utils.resample_dose.equal_geometry(img1, img2)

Do img1 and img2 have the same geometry (same voxels)?

This is an auxiliary function for mass_weighted_resampling.

utils.resample_dose.mass_weighted_resampling(dose, mass, newgrid)

This function computes a dose distribution using the geometry (origin, size, spacing) of the newgrid image, using the energy deposition and mass images, which may have a different geometry. A typical use case is that a Gate simulation first computes the dose w.r.t. a patient CT (exporting also the mass image), and then we want to resample this dose distribution to a new grid, e.g. the grid of the dose distribution computed by a TPS.

This implementation relies on a bit of numpy magic (np.tensordot, repeatedly). An intuitively clearer but in practice much slower implementation is given by _mwr_wit_loops(dose, mass, newgrid); the unit tests verify that these two implementations indeed yield the same result.

class utils.resample_dose.overlaptests(methodName='runTest')

This module provides a simple function that converts an ITK image with (short int) HU values into a float32 image with density values.

utils.mass_image.create_mass_image(ct, hlut_path, overrides={})

This function creates a mass image based on the HU values in a ct image, a Hounsfield-to-density lookup table and (optionally) a dictionary of override densities for specific HU values.

If the HU-to-density lookup table has 2 columns, it is interpreted as a density curve that needs to be interpolated for the intermediate HU values. If the HU-to-density lookup table has 3 columns, then it is interpreted as a step-wise density table, with a constant density within each successive interval (no interpolation).
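The two table flavors can be sketched as follows (a hypothetical helper; the actual create_mass_image works on ITK images rather than scalar HU values):

```python
import numpy as np

def hu_to_density(hu, table):
    # 2 columns (HU, density): linear interpolation between the
    # tabulated HU values. 3 columns (HU_low, HU_high, density):
    # step-wise constant density per interval, no interpolation.
    t = np.asarray(table, dtype=float)
    if t.shape[1] == 2:
        return np.interp(hu, t[:, 0], t[:, 1])
    for lo, hi, rho in t:
        if lo <= hu < hi:
            return rho
    raise ValueError(f"HU value {hu} outside table range")
```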

class utils.mass_image.mass_image_test(methodName='runTest')
class utils.gate_pbs_plan_file.gate_pbs_plan(planpath, bml=None, radtype='proton')

It might be more appropriate to call this a ‘MCsquared’ or ‘REGGUI’ plan. It can still be used for reading “normal” Gate-style PBS plan text files, which do not contain any information about passive elements.

class utils.gate_pbs_plan_file.gate_pbs_plan_file(planpath, gangle=0.0, allow0=False)

Class to write GATE plan descriptions from arbitrary spot specifications produced e.g. by a TPS such as RayStation.

class utils.gate_pbs_plan_file.test_gate_pbs_plan_reading(methodName='runTest')
setUp()

Hook method for setting up the test fixture before exercising it.

tearDown()

Hook method for deconstructing the test fixture after testing it.

Compare two 3D images using the gamma index formalism as introduced by Daniel Low (1998).

class utils.gamma_index.Test_GammaIndex3dIdenticalMesh(methodName='runTest')
class utils.gamma_index.Test_GammaIndex3dUnequalMesh(methodName='runTest')
utils.gamma_index.gamma_index_3d_equal_geometry(imgref, imgtarget, dta=3.0, dd=3.0, ddpercent=True, threshold=0.0, defvalue=-1.0, verbose=False, threshold_percent=False)

Compare two images with equal geometry, using the gamma index formalism as introduced by Daniel Low (1998).

  • ddpercent indicates the “dose difference” scale as a relative value, in units of percent (the dd value is this percentage of the max dose in the reference image)
  • ddabs indicates the “dose difference” scale as an absolute value
  • dta indicates the distance scale (“distance to agreement”) in millimeter (e.g. 3mm)
  • threshold indicates the minimum dose value (exclusive) for calculating gamma values: target voxels with dose<=threshold are skipped and get assigned gamma=defvalue.
  • threshold_percent is a flag; True means that threshold is given in percent, False (default) means that the threshold is absolute.

Returns an image with the same geometry as the target image. For all target voxels that have d>threshold, a gamma index value is given. For all other voxels the “defvalue” is given. If the geometries of the input images are not equal, then a ValueError is raised.

utils.gamma_index.gamma_index_3d_unequal_geometry(imgref, imgtarget, dta=3.0, dd=3.0, ddpercent=True, threshold=0.0, defvalue=-1.0, verbose=False, threshold_percent=False)

Compare 3-dimensional arrays with possibly different spacing and different origin, using the gamma index formalism, popular in medical physics. We assume that the meshes are NOT rotated w.r.t. each other.

  • dd indicates by default the “dose difference” scale as a relative value, in units of percent (the dd value is this percentage of the max dose in the reference image).
  • If ddpercent is False, then dd is taken as an absolute value.
  • dta indicates the distance scale (“distance to agreement”) in millimeter (e.g. 3mm)
  • threshold indicates the minimum dose value (exclusive) for calculating gamma values
  • threshold_percent is a flag; True means that threshold is given in percent, False (default) means that the threshold is absolute.

Returns an image with the same geometry as the target image. For all target voxels that are in the overlap region with the reference image and that have d>threshold, a gamma index value is given. For all other voxels the “defvalue” is given.

utils.gamma_index.get_gamma_index(ref, target, **kwargs)

Compare two 3D images using the gamma index formalism as introduced by Daniel Low (1998). The positional arguments ‘ref’ and ‘target’ should behave like ITK image objects. Possible keyword arguments include:

  • dd indicates the “dose difference” scale as a relative value, in units of percent (the dd value is this percentage of the max dose in the reference image)
  • ddpercent is a flag; True (default) means that dd is given in percent, False means that dd is absolute.
  • dta indicates the distance scale (“distance to agreement”) in millimeter (e.g. 3mm)
  • threshold indicates the minimum dose value (exclusive) for calculating gamma values
  • verbose is a flag; True will result in some chatter, False will keep the computation quiet.
  • threshold_percent is a flag; True means that threshold is given in percent, False (default) means that the threshold is absolute.

Returns an image with the same geometry as the target image. For all target voxels in the overlap between ref and target that have d>threshold, a gamma index value is given. For all other voxels the “defvalue” is given. TODO: allow 2D images, by creating 3D images with a 1-bin Z dimension. Should be very easy. The 3D gamma image computed using these “fake 3D” images can then be collapsed back to a 2D image.
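A minimal equal-geometry variant of the gamma computation can be sketched as follows (brute force, absolute dose difference scale only; the real functions additionally support percent scales, thresholds and default values):

```python
import numpy as np

def gamma_equal_geometry(ref, target, spacing, dta=3.0, dd_abs=0.3):
    # For each target voxel, take the minimum over nearby reference
    # voxels of sqrt((dose difference/dd)^2 + (distance/dta)^2).
    # The search window is limited to about dta around each voxel,
    # which is an approximation of the full minimum.
    reach = int(np.ceil(dta / min(spacing)))
    g = np.full(target.shape, np.inf)
    for idx in np.ndindex(*target.shape):
        for off in np.ndindex(*((2 * reach + 1,) * target.ndim)):
            j = tuple(i + o - reach for i, o in zip(idx, off))
            if any(k < 0 or k >= s for k, s in zip(j, ref.shape)):
                continue  # neighbor outside the reference image
            dist2 = sum(((o - reach) * sp) ** 2 for o, sp in zip(off, spacing))
            ddiff = target[idx] - ref[j]
            g[idx] = min(g[idx], np.sqrt((ddiff / dd_abs) ** 2 + dist2 / dta ** 2))
    return g
```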

utils.crop.crop_and_pad_image(input_img, from_index, to_index, hu_value_for_padding)

We would like to use itk.RegionOfInterestImageFilter, but that filter does not recalculate the origin correctly. So now we do this manually, through numpy.

utils.crop.crop_image(input_img, from_index, to_index)

We would like to use itk.RegionOfInterestImageFilter but that filter does not recalculate the origin correctly. So now we do this manually, through numpy.

class utils.crop.test_crop(methodName='runTest')
setUp()

Hook method for setting up the test fixture before exercising it.

tearDown()

Hook method for deconstructing the test fixture after testing it.

class utils.crop.test_crop_and_pad(methodName='runTest')
setUp()

Hook method for setting up the test fixture before exercising it.

tearDown()

Hook method for deconstructing the test fixture after testing it.

class utils.bounding_box.bounding_box(**kwargs)

Define ranges in which things are in 3D space. Maybe this should be more dimensionally flexible, to also handle 2D or N dimensional bounding boxes.

indices_in_image(img, rtol=0.0, atol=0.001)

Determine the voxel indices of the bounding box corners in a given image.

In case the corners are within rounding margin on a boundary between voxels in the image, then voxels that would only have an infinitesimal intersection with the bounding box are not included. The parameters atol and rtol are used with np.isclose to determine whether BB corners lie on a voxel boundary. In case the BB corners are outside of the image volume, the indices will be out of range (negative, or larger than the image size).

Returns two numpy arrays of size 3 and dtype int, representing the inclusive/exclusive image indices of the lower/upper corners, respectively.

class utils.bounding_box.test_bounding_box(methodName='runTest')
class utils.beamset_info.beamset_info(rpfp)

This class reads a DICOM ‘RT Ion Plan Storage’ file and collects related information such as TPS dose files. It does NOT (yet) try to read a referred structure set and/or CT images. This acts as a wrapper (all DICOM access on the plan file happens here). This has a few advantages over direct DICOM access in the other modules:

  • we can deal with different “DICOM dialects” here; some TPSs may store their plans in different ways.
  • if ‘private tags’ need to be taken into account, then we can also do that here.
  • we can make a similar class, with the same attributes, for a treatment plan stored in a different format, e.g. for research, commissioning or QA purposes.

Then the rest of the code can work with that in the same way.