In our RV 2019 paper, we introduced shape expressions, a new language for expressing regular expressions over basic geometric shapes in time-series data. In this paper, we give the first algorithm to mine (linear) shape expressions from data. The algorithm combines three ingredients: a procedure that automatically partitions a time series into piecewise-linear segments, a bucketing strategy over the parameters of these segments that abstracts the time series into a discrete trace, and an automata-learning-based procedure that learns a regular expression over the resulting shapes.
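The abstraction step can be pictured with a toy sketch (not the paper's algorithm: fixed-size windows stand in for the automatic partitioner, and slope-sign buckets for the learned bucketing strategy):

```python
import numpy as np

def abstract_trace(y, t, window=20, slope_eps=0.05):
    """Partition a time series into fixed-size chunks, fit a line to each,
    and bucket the fitted slopes into discrete letters."""
    letters = []
    for i in range(0, len(y) - window + 1, window):
        slope, _ = np.polyfit(t[i:i + window], y[i:i + window], 1)
        if slope > slope_eps:
            letters.append('u')   # rising segment
        elif slope < -slope_eps:
            letters.append('d')   # falling segment
        else:
            letters.append('f')   # flat segment
    # collapse consecutive repeats so the trace reads like a word over shapes
    word = [letters[0]]
    for c in letters[1:]:
        if c != word[-1]:
            word.append(c)
    return ''.join(word)

# a ramp-up, plateau, ramp-down signal abstracts to the word 'ufd'
t = np.linspace(0, 3, 120)
y = np.concatenate([t[:40], np.full(40, t[39]), t[39] - (t[80:] - t[80])])
```

A regular-expression learner can then be run over such discrete words.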
In our RV 2019 paper, we introduced shape expressions, a new language for expressing regular expressions over basic geometric shapes in time-series data. This is the journal extension with proofs and more case studies.
We propose Stochastic Temporal Logic (StTL) for expressing probabilistic specifications on time-varying behaviors of controlled stochastic dynamical systems; quantitative semantics for StTL to reason about the robust satisfaction of an StTL formula by a given stochastic system; and the use of the robustness value as the objective function for controller design. We also present parameter inference for Parametric StTL specifications, which allows specifications to be mined from output traces of the underlying system. This paper was accepted for presentation at the EMSOFT 2019 conference.
Proposes a general framework of convergence classifier functions (CCFs) for differentiating converging behavior from non-converging behavior. CCFs can be based on M-step Lyapunov functions, Signal Temporal Logic, or neural networks. Also proposes a number of testing techniques to automatically identify non-converging behaviors of a system. Journal extension of our EMSOFT '16 paper; adds proofs and experimental results.
We extend our CAV 2015 paper with a different algorithm for computing the Skorokhod distance. The key assumption for this algorithm is that traces are sampled at a higher frequency, allowing a simpler piecewise-constant interpolation scheme to be used to interpret traces. The paper studies the tradeoff between the sampling frequency and the computational efficiency of the new algorithm. It also extends the transference results to TPTL, which allows freeze quantifiers in the value domain.
We extend our RV 2015 paper on online monitoring of STL. We clarify certain notions: for STL formulas with an unbounded time horizon, we introduce the notion of nominal robustness. The key technical result in this paper is that, for arbitrary STL formulas, the nominal quantitative robustness can be monitored using a number of variables that is exponential in the length of the formula but independent of the length of the trace being monitored.
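For intuition on trace-length-independent monitoring, here is a standard bounded-memory building block: streaming the robustness of a bounded-horizon "always" with a monotonic deque (a classical sliding-window minimum; the paper's result for arbitrary, even unbounded-horizon formulas is far more general):

```python
from collections import deque

def online_always_robustness(samples, window):
    """Stream the robustness of G[0,window) (x > 0): at each step emit the
    minimum sample value over the last `window` samples, using a monotonic
    deque so memory is O(window), independent of the total trace length."""
    dq = deque()  # holds (index, value) pairs with strictly increasing values
    out = []
    for i, x in enumerate(samples):
        while dq and dq[-1][1] >= x:
            dq.pop()              # dominated values can never be the minimum
        dq.append((i, x))
        if dq[0][0] <= i - window:
            dq.popleft()          # front has slid out of the window
        if i >= window - 1:
            out.append(dq[0][1])
    return out
```

Each sample is pushed and popped at most once, so the amortized cost per step is constant.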
We use a Gaussian Process model as a surrogate model for an arbitrary black-box model of a closed-loop control system and use Bayesian optimization to find possible values for model parameters that lead to the model violating given STL requirements. We also explore a technique to scale to higher dimensions using random embeddings.
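The core loop can be sketched in a self-contained way, with a hand-rolled GP and a lower-confidence-bound acquisition standing in for a production Bayesian-optimization library (`falsify` and the toy robustness function below are illustrative, not the paper's implementation):

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel between two 1-D arrays of points."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and standard deviation at query points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def falsify(robustness, bounds, n_init=5, n_iter=25, seed=0):
    """Bayesian-optimization-style search for a parameter value that
    minimizes (and hopefully makes negative) the STL robustness."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, n_init)
    y = np.array([robustness(x) for x in X])
    grid = np.linspace(lo, hi, 400)
    for _ in range(n_iter):
        mu, sd = gp_posterior((X - lo) / (hi - lo), y, (grid - lo) / (hi - lo))
        x_next = grid[np.argmin(mu - 2.0 * sd)]  # lower confidence bound
        X = np.append(X, x_next)
        y = np.append(y, robustness(x_next))
        if y[-1] < 0:      # negative robustness = requirement violated
            break
    i = np.argmin(y)
    return X[i], y[i]
```

In practice the robustness function wraps a simulation of the closed-loop model; negative values correspond to STL violations.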
Designers of industrial embedded control systems, such as automotive, aerospace, and medical-device control systems, use verification and testing activities to increase their confidence that performance requirements and safety standards are met. Since testing and verification tasks account for a significant portion of the development effort, increasing the efficiency of testing and verification will have a significant impact on the total development cost. Existing and emerging simulation-based approaches offer improved means of testing and, in some cases, verifying the correctness of control system designs.
This is a journal version of our presentation at the Numerical Software Verification Workshop on verifying a powertrain control system using KeYmaera and numerically-obtained safety invariant sets.
This is a journal version of our HSCC '13 paper on mining STL requirements of closed-loop control models. We give some interesting requirement templates in this paper in addition to more experimental results.
Journal version of the ESOP 2010 paper. Derives concurrency control for a sequential library to make it thread-safe, given proof of sequential correctness of the library. Thread safety defined as: a) absence of assertion failures in concurrent context, or b) linearizability of library.
Extended version of the ASE '09 paper. Presents a new static analysis for deadlocks arising from wait-notify based synchronization in programs using nested monitors.
This chapter gives an overview of techniques for qualitative and quantitative monitoring of Cyber-Physical System (CPS) behaviours. We present an overview of some of the important applications and, finally, we describe the tools supporting CPS monitoring and compare their main features.
Modern cyber-physical systems (CPS) are often developed in a model-based development (MBD) paradigm. The MBD paradigm involves the construction of different kinds of models: (1) a plant model that encapsulates the physical components of the system (e.g., mechanical, electrical, chemical components) using representations based on differential and algebraic equations, (2) a controller model that encapsulates the embedded software components of the system, and (3) an environment model that encapsulates physical assumptions on the external environment of the CPS application. In order to reason about the correctness of CPS applications, we typically pose the following question: For all possible environment scenarios, does the closed-loop system consisting of the plant and the controller exhibit the desired behavior? Typically, the desired behavior is expressed in terms of properties that specify unsafe behaviors of the closed-loop system. Often, such behaviors are expressed using variants of real-time temporal logics. In this chapter, we will examine formal methods based on bounded-time reachability analysis, simulation-guided reachability analysis, deductive techniques based on safety invariants, and formal, requirement-driven testing techniques. We will review key results in the literature, and discuss the scalability and applicability of such systems to various academic and industrial contexts. We conclude this chapter by discussing the challenge to formal verification and testing techniques posed by newer CPS applications that use AI-based software components.
We consider the problem of monitoring temporal patterns expressed in Signal Temporal Logic (STL) over time-series data in a clairvoyant fashion. Existing offline or online monitoring algorithms can only compute the satisfaction of a given STL formula on the time-series data that is available. We use off-the-shelf statistical time-series analysis techniques to fit available data to a model and use this model to forecast future signal values. We derive the joint probability distribution of predicted signal values and use this to compute the satisfaction probability of a given signal pattern over the prediction horizon. There are numerous potential applications of such prescient detection of temporal patterns. We demonstrate the practicality of our approach on case studies in automated insulin delivery, unmanned aerial vehicles, and household power consumption data.
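The pipeline can be sketched with an AR(1) model in place of the off-the-shelf forecasters used in the paper (`prob_eventually_above` is a hypothetical helper estimating, by Monte Carlo, the satisfaction probability of an "eventually above threshold" pattern over the prediction horizon):

```python
import numpy as np

def ar1_fit(x):
    """Least-squares fit of x[t+1] = a*x[t] + b + noise; returns (a, b, sigma)."""
    X, Y = x[:-1], x[1:]
    A = np.vstack([X, np.ones_like(X)]).T
    (a, b), *_ = np.linalg.lstsq(A, Y, rcond=None)
    resid = Y - (a * X + b)
    return a, b, resid.std()

def prob_eventually_above(x, c, horizon, n_mc=2000, seed=0):
    """Monte-Carlo estimate of P( F[1,horizon] (signal > c) ) under the
    fitted AR(1) forecast distribution, starting from the last observation."""
    a, b, s = ar1_fit(np.asarray(x, float))
    rng = np.random.default_rng(seed)
    cur = np.full(n_mc, x[-1], float)
    hit = np.zeros(n_mc, bool)
    for _ in range(horizon):
        cur = a * cur + b + rng.normal(0, s, n_mc)
        hit |= cur > c
    return hit.mean()
```

The paper works with the joint distribution of forecasts analytically; sampling it, as here, is the simplest stand-in.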
Given sets of labeled timed traces, finds an STL classifier that separates the traces into distinct classes. Steps: (1) systematically enumerates longer PSTL formulas from smaller PSTL subformulas; (2) for each PSTL formula, successively finds parameter valuations on the validity domain boundary of the formula using multi-dimensional binary search; (3) for each valuation thus found, checks whether the resulting STL formula achieves a low misclassification rate (MCR); if yes, terminates; if not, continues to the next point on the validity domain boundary until a bound on the number of recursive boundary-computation calls is reached, after which it moves on to the next PSTL formula. Uses the idea of signatures (satisfaction values for a random selection of traces and parameter values) to heuristically eliminate possibly equivalent PSTL formulas.
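The one-dimensional core of step (2) looks like this (a bisection to the boundary along a single direction; the multi-dimensional search in the paper applies this idea along diagonal directions of the parameter space):

```python
def boundary_point(sat, lo, hi, tol=1e-6):
    """Binary search along the segment lo..hi for a point on the validity
    domain boundary of a monotone parametric property. Assumes sat(lo) is
    False and sat(hi) is True; monotonicity means satisfaction flips once."""
    assert not sat(lo) and sat(hi)
    for _ in range(64):
        if hi - lo < tol:
            break
        mid = (lo + hi) / 2
        if sat(mid):
            hi = mid   # boundary lies at or below mid
        else:
            lo = mid   # boundary lies above mid
    return hi
```

Here `sat` would wrap the check "does the STL instance at this parameter value hold on all positive traces"; any monotone predicate works.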
Given a black-box system with inputs and outputs, and a correctness specification for the outputs, mines assumptions on the input (i.e. the environment) for the corresponding outputs to satisfy the given correctness spec. The input assumptions are assumed to be expressible in STL. The mining algorithm enumerates PSTL formulas and uses decision trees on the space of robustness values induced by STL formulas obtained by sampling the parameter space of the PSTL formula. Learned decision tree is then converted into an STL formula that constitutes the environment assumption. Uses falsification to refine the environment assumption (eliminate inputs that do not satisfy the output requirements).
We formulate numerically-robust inductive proof rules for unbounded stability and safety properties of continuous dynamical systems. These induction rules robustify standard notions of Lyapunov functions and barrier certificates so that they can tolerate small numerical errors. In this way, numerically-driven decision procedures can establish a sound and relative-complete proof system for unbounded properties of very general nonlinear systems. We demonstrate the effectiveness of the proposed rules for rigorously verifying unbounded properties of various nonlinear systems, including a challenging powertrain control model.
Convolutional Neural Networks have been proposed for identifying objects in images. In self-driving cars, these CNNs detect objects in frames of a video. While CNNs like SqueezeDet and YOLO have shown a lot of promise at fast one-shot object detection, they can make mistakes because they do not integrate temporal information across frames. We formulate TQTL, an extension of STL to reason about sanity conditions that an object detector must satisfy while detecting objects across multiple frames (e.g., persistence of labels for the same object, constraints imposed by physical laws of motion, etc.). Using TQTL, we show that both SqueezeDet and YOLO can miss objects, detect phantom objects, and change bounding boxes of the same object across frames in ways that can confuse a downstream decision-making system. A cool feature of using TQTL is that you do not need any ground-truth labels! We may not be able to check whether a hippopotamus is indeed labeled a hippopotamus, but we will certainly detect if the CNN changes its label to velociraptor.
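One of the simplest such sanity conditions, label persistence, can be checked directly on detector output (a toy sketch, not TQTL itself; `frames` is an assumed representation mapping tracked object ids to predicted labels):

```python
def label_persistence_violations(frames):
    """Flag frames where a tracked object's class label changes from its
    previously seen label. No ground-truth labels are needed, only the
    detector's own output. `frames` is a list of dicts: object id -> label."""
    seen, violations = {}, []
    for t, dets in enumerate(frames):
        for obj, label in dets.items():
            if obj in seen and seen[obj] != label:
                violations.append((t, obj, seen[obj], label))
            seen[obj] = label
    return violations
```

Each violation records the frame index, the object id, and the old and new labels, which is enough to localize the flickering detection.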
This paper explores an SMT-based formulation for runtime enforcement of an STL property, utilizing previous work by Chao Wang's group on synthesizing Boolean shields for LTL properties.
The use of neural networks for control has been on the rise, partly due to the successes of deep reinforcement learning techniques. The question that we broach in this paper is the following: is it possible to train the neural network controller in such a way that the system always remains safe? We answer this question in the affirmative using the formalism of control barrier functions. A control barrier function (CBF) guarantees that for every state of the given control system, there exists a control input that keeps the system safe; that is, it guarantees that the system trajectories never leave a designated safe set of states. In our setup, we assume that the user provides a CBF, and the neural network controller is trained to respect the CBF conditions. We check the latter using the nonlinear SMT solver dReal. If dReal gives a counterexample, we incorporate this counterexample into the cost function for training the network, which guarantees that this counterexample (and hopefully a robust region around it) will satisfy the CBF conditions. This yields a verification-in-the-loop reinforcement learning procedure that produces correct-by-construction neural network controllers. The version here fixes an unfortunate typo in Equation 5 of the official version: essentially, the sup and inf operators appear swapped in the official version.
Deep Reinforcement Learning uses custom-designed reward functions to steer a closed-loop control system toward desired behavior. User-created state-based reward functions are susceptible to reward hacking, where the agent learns an undesirable behavior because the user did not foresee how the agent could maximize expected long-term rewards through unexpected policies that can be both undesirable and unsafe. The usual fix is reward shaping, i.e., careful crafting of rewards; of course, this can be error-prone as well. Our solution is to express the desired high-level behavior using the formalism of Signal Temporal Logic (STL). We formulate an algorithm to assign state-based rewards by computing the quantitative semantics of the given STL specification over a bounded horizon (similar to performing a rollout). Our experiments show that using STL-based rewards potentially allows deep RL to converge faster to a safer policy.
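The reward computation reduces to evaluating quantitative semantics over a rollout; a minimal sketch for two atomic patterns and their conjunction (illustrative, not the paper's full STL semantics):

```python
def always_robustness(trace, c):
    """Robustness of G (x < c) over a finite rollout: min over samples of c - x."""
    return min(c - x for x in trace)

def eventually_robustness(trace, c):
    """Robustness of F (x > c): max over samples of x - c."""
    return max(x - c for x in trace)

def stl_reward(trace, safe_limit, goal):
    """Shaped reward for 'always stay below safe_limit AND eventually exceed
    goal': conjunction takes the min of the two robustness values."""
    return min(always_robustness(trace, safe_limit),
               eventually_robustness(trace, goal))
```

A positive reward means the rollout satisfies the specification, with larger values indicating more robust satisfaction, which gives the RL agent a graded training signal rather than a sparse pass/fail one.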
Shape Expressions (SEs) are like regular expressions where atomic symbols specify shapes. While regexps are used for pattern matching in strings, SEs are used for pattern matching in time-series data (i.e., signals). Basic shapes can include lines, polynomial curves, exponential curves, sinusoids, etc. The language of SEs includes unions and Kleene star in addition to basic shapes. While a signal can exactly match an SE, we also allow noisy matches, i.e., where a signal segment may be at a bounded deviation from a constituent basic shape in the SE. We typically use statistical measures such as mean squared error (MSE) for measuring such deviations. Thus, an SE corresponds to both an exact language over signals and a noisy language (for a given error threshold). In this paper, we introduce SEs and give an automata-based algorithm for monitoring an SE.
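A noisy match against a single line shape is just an MSE test (a minimal sketch; the paper's matching handles richer shape classes and automata built over them):

```python
import numpy as np

def matches_line(t, y, slope, intercept, mse_threshold):
    """Noisy match: does segment (t, y) stay within `mse_threshold` mean
    squared error of the line slope*t + intercept?"""
    return float(np.mean((y - (slope * t + intercept)) ** 2)) <= mse_threshold
```

The monitoring automaton would apply such checks to candidate segment decompositions of the signal, one check per atomic shape in the expression.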
We use simulation-guided techniques to learn a barrier certificate to prove the safety of a neural network-controlled dynamical system.
This invited paper introduces Stochastic Temporal Logic, and the challenges and solution approaches to some interesting problems in this logic.
This invited paper outlines the challenges in monitoring the security, privacy, information leakage in CPS applications and some tentative solution approaches based on hyper-properties.
This paper proposes performing fault localization for CPS models using statistical techniques. In software testing, tools such as Tarantula and later extensions showed the efficacy of basing fault localization on the correlation between accessing a program location and the program run being faulty. We extend this observation to CPS, where runs can be assessed quantitatively for their degree of satisfaction/violation. We also focus on optimizing/analyzing model parameters. A common paradigm for storing model parameters is in the form of look-up tables. We exploit the special metric structure provided by the look-up table (assuming that the table represents a discretization of some nonlinear function). We have implemented this idea in a Matlab extension called Vajra that builds on top of the Breach toolbox. The tool also provides cool visualizations of likely bug locations.
A key shortcoming of our CAV '17 paper on logical clustering is that the user needs to provide a relative ordering between parameters of a given PSTL formula (that is used as the logical lens for projecting time-series data to a learning-amenable space). Here, we make the observation that for monotone logical properties, we can instead project a time-series datum to the (approximate) validity domain boundary of the given PSTL formula and then perform clustering on this space. This allows us to effectively cluster and learn some interesting new classes of STL-based classifiers.
For reliable situation awareness in autonomous vehicle applications, we need to develop robust and reliable image processing and machine learning algorithms. Currently, there is no general framework for reasoning about the performance of perception systems. This paper introduces Timed Quality Temporal Logic (TQTL) as a formal language for monitoring and testing the performance of object detection and situation awareness algorithms for autonomous vehicle applications. We demonstrate that it is possible to describe interesting properties as TQTL formulas and detect cases where the properties are violated.
In order to effectively analyze and build cyberphysical systems (CPS), designers today have to combat the data deluge problem, i.e., the burden of processing intractably large amounts of data produced by complex models and experiments. In this work, we utilize monotonic parametric signal temporal logic (PSTL) to design features for unsupervised classification of time series data. This enables using off-the-shelf machine learning tools to automatically cluster similar traces with respect to a given PSTL formula. We demonstrate how this technique produces interpretable formulas that are amenable to analysis and understanding using a few representative examples. We illustrate this with case studies related to automotive engine testing, highway traffic analysis, and auto-grading massively open online courses.
We present a technique to investigate abnormal behaviors of signals in both time and frequency domains using an extension of time-frequency logic that uses the continuous wavelet transform. Abnormal signal behaviors such as unexpected oscillations, called hunting behavior, can be challenging to capture in the time domain; however, these behaviors can be naturally captured in the time-frequency domain. We introduce the concept of parametric time-frequency logic and propose a parameter synthesis approach that can be used to classify hunting behavior. We perform a comparative analysis between the proposed algorithm, an approach based on support vector machines using linear classification, and a method that infers a signal temporal logic formula as a data classifier. We present experimental results based on data from a hydrogen fuel cell vehicle application and electrocardiogram data extracted from the MIT-BIH Arrhythmia Database.
Evaluation of industrial embedded control system designs is a time-consuming and imperfect process. While an ideal process would apply a formal verification technique such as model checking or theorem proving, these techniques do not scale to industrial design problems, and it is often difficult to use these techniques to verify performance aspects of control system designs, such as stability or convergence. For industrial designs, engineers rely on testing processes to identify critical or unexpected behaviors. We propose a novel framework called Underminer to improve the testing process; this is an automated technique to identify non-converging behaviors in embedded control system designs. Underminer treats the system as a black box, and lets the designer indicate the model parameters, inputs and outputs that are of interest. It supports a multiplicity of convergence-like notions, such as those based on Lyapunov analysis and those based on temporal logic formulae. Underminer can be applied in the context of testing models created in the controller-design phase, and can also be applied in a scenario such as hardware-in-the-loop testing. We demonstrate the efficacy of Underminer by evaluating its performance on several examples.
We study the problem of falsifying reachability properties of real-time control software acting in a closed-loop with a given model of the plant dynamics. Our approach employs numerical techniques to simulate a plant model, which may be highly nonlinear and hybrid, in combination with symbolic simulation of the controller software. The state-space and input-space of the plant are systematically searched using a plant abstraction that is implicitly defined by ``quantization'' of the plant state, but never explicitly constructed. Simultaneously, the controller behaviors are explored using a symbolic execution of the control software. On-the-fly exploration of the overall closed-loop abstraction results in abstract counterexamples, which are used to refine the plant abstraction iteratively until a concrete violation is found. Empirical evaluation of our approach shows its promise in treating controller software that has precise, formal semantics, using an exact method such as symbolic execution, while using numerical simulations to produce abstractions of the underlying plant model that is often an approximation of the actual plant. We also discuss a preliminary comparison of our approach with techniques that are primarily simulation-based.
Forward invariant cuts are invariant and safe sets that need not contain the initial states. This paper introduces the forward invariant cut rule in the calculus of the KeYmaera theorem prover. It also suggests how designers could translate localized design information (about a given discrete mode, or a set of modes) into a forward invariant cut that helps simplify the proof of system safety (by eliminating parts of the hybrid state space).
Signal Temporal Logic (STL) allows specifying temporal properties of real-valued signals over dense time domains. Along with the usual Boolean semantics, it is also equipped with quantitative semantics that allow estimating how robustly a given signal satisfies a given property. Offline algorithms to compute the robust satisfaction value require the entire signal over a given finite time horizon to be available before computation can proceed. Online algorithms can estimate upper and lower bounds on the robust satisfaction value (called the robust satisfaction interval, RoSI) over prefixes of a complete signal, updating the RoSI as more information about the signal becomes available. In this paper, we propose memory-efficient algorithms to monitor the RoSI in an online fashion. The FMSD paper is a much better and more comprehensive version of this paper: it defines nominal quantitative semantics for STL and also shows that the nominal quantitative semantics for arbitrary STL formulas can be monitored using a finite amount of memory that is independent of the length of the trace.
The Skorokhod distance is a metric on functions (over time) that quantifies mismatches between signal values as well as time jitter. This metric has heretofore been used to define topologies for hybrid systems, but Vinayak and Rupak had a nice paper in HSCC 2015 on computing the Skorokhod distance for polygonal traces in polynomial time. We employ this metric for testing conformance. First, we explore a restriction of the metric that limits time jitter to a finite window; this brings the complexity down to linear in the length of the trace. Next, we explore an optimization-guided approach to find an input to two dynamical systems that maximizes the Skorokhod distance between the corresponding outputs; this corresponds to finding non-conformant behavior. Finally, we show that traces satisfying similar temporal logic properties are classified as close by the Skorokhod metric, further establishing its usefulness in quantifying system conformance. The FMSD paper is a more comprehensive version of this paper, considering Skorokhod distance computation between traces constructed using piecewise-constant interpolation. Specifically, in the FMSD paper, we study the tradeoff between the runtime complexity of the distance computation (between piecewise-linear traces and piecewise-constant traces) and the higher sampling rate required for piecewise-constant traces.
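To see why bounding the time jitter helps, here is a simplified DTW-style dynamic program (not the Skorokhod distance itself, since it ignores the retiming penalty, but it shows how a jitter window `w` cuts the DP down to O(n*w) entries):

```python
import math

def windowed_conformance_distance(x, y, w):
    """Best achievable bound on pointwise value mismatch when matching two
    equal-length sampled traces monotonically, with time jitter limited to
    w samples: a DTW-like DP that takes the max (not the sum) of gaps."""
    n = len(x)
    INF = math.inf
    d = [[INF] * n for _ in range(n)]   # d[i][j]: best max-gap matching prefixes
    for i in range(n):
        for j in range(max(0, i - w), min(n, i + w + 1)):  # only |i-j| <= w
            gap = abs(x[i] - y[j])
            if i == 0 and j == 0:
                d[i][j] = gap
            else:
                prev = min(d[i - 1][j] if i > 0 else INF,
                           d[i][j - 1] if j > 0 else INF,
                           d[i - 1][j - 1] if i > 0 and j > 0 else INF)
                d[i][j] = max(gap, prev)
    return d[n - 1][n - 1]
```

With `w = 0` the traces must be compared pointwise; allowing one sample of jitter can absorb a shifted spike entirely.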
Automotive embedded control systems are a vital aspect of modern automotive development, but the considerable complexity of these systems has made quality checking a challenging endeavor. Simulation-based checking approaches are attractive, as they often scale well with the complexity of the system design. This paper presents an overview of simulation-guided techniques that can be used to increase the confidence in the quality of an automotive powertrain control system design. We discuss the relationship between simulation-based approaches and the broader areas of verification and powertrain control design. Also, we discuss new software tools that use simulation-guided approaches to address various aspects of automotive powertrain control design verification. We conclude by considering ongoing challenges in developing new simulation-guided tools and applying them in a powertrain control development context.
Falsification for CPS models is about finding input signals to the model that cause the corresponding output signals to violate a specified Signal Temporal Logic (STL) property. In falsification tools like S-TaLiRo and Breach, a strategy to finitize the search space (the infinite-dimensional input signal space) is to define a finite parameterization of signals using control points. In this paper, we explore a strategy that imposes a discrete grid on the signal space, uses Tabu search to explore this grid, and uses random restarts to mitigate local optima. We also propose a grid refinement strategy to lessen the user burden of specifying the control points, and explore a finite-differences-based approach to stochastically approximate the descent direction during the local search phase. We also have follow-up unpublished work on simultaneous-perturbation-based stochastic approximation, which shows better results. Link to appear soon!
We present an approach that uses Rapidly-exploring Random Trees (RRTs) to explore the state space of a dynamical system, with the dual objective of maximizing coverage (measured using the star-discrepancy metric) and minimizing the robust satisfaction value of a given Signal Temporal Logic requirement. In other words, a bug-finding technique guided by robustness and coverage.
Contraction metrics help establish incremental stability of dynamical systems. We present a simulation-guided technique to discover contraction metrics for highly nonlinear systems.
We present an approach for finding violations of safety properties of hybrid systems. Existing approaches search for complete system trajectories that begin from an initial state and reach some unsafe state. Our approach searches over sequences of trajectory segments starting from any system states, where adjacent segments may have gaps; our approach then seeks to narrow these gaps iteratively. Interesting segmented trajectories are discovered using a randomized scatter-and-simulate algorithm.
We present a technique that allows us to find Sum-of-Squares Lyapunov functions for hybrid dynamical systems. Given a template form for the Lyapunov function, our technique uses simulation data to infer constraints, the solution of which provides a candidate Lyapunov function. Our technique then uses a global optimizer to search the region of interest for trajectories that violate the Lyapunov conditions for the given function, which are fed back into the procedure as additional constraints. Once the procedure terminates, we use a nonlinear SMT solver to verify the Lyapunov conditions, in order to obtain a proof of soundness of the candidate Lyapunov function (which the global optimizer cannot guarantee). Failures of this check give yet another set of constraints to drive the search procedure.
We present models for an important problem in the automotive domain of powertrain control: that of air-fuel ratio control. The first and most complex model consists of a plant model that is a dynamical system capturing the physical behavior of important subsystems in the engine, and a multi-mode controller representing different regimes of operation. The second and third models are successive simplifications of the first, and are intended to present the research community with manageable benchmark problems for verification and analysis tools. We also present a suite of interesting requirements for these models expressed in a temporal logic over real-valued signals and dense time.
We present a technique that allows us to find Sum-of-Squares Lyapunov functions for nonlinear discrete dynamical systems. Given a template form for the Lyapunov function, our technique uses simulation data to infer constraints, the solution of which provides a candidate Lyapunov function. Our technique then uses a global optimizer to search the region of interest for trajectories that violate the Lyapunov conditions for the given function, which are fed back into the procedure as additional constraints. We derive conditions involving the Lipschitz constants of the vector field and the Lyapunov candidate function, and a sampling density that guarantee that testing the Lyapunov conditions at each point in the random sampling of the region of interest is sufficient to prove the validity of the Lyapunov function over the entire region.
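The simulate-fit-check loop can be sketched for quadratic candidates on a 2-D system (plain least squares in place of the SOS machinery, and a sample check in place of the global optimizer and the Lipschitz-based guarantee; all helper names are illustrative):

```python
import numpy as np

def lyapunov_candidate_and_check(step, samples, test_points):
    """Fit a quadratic Lyapunov candidate V(x) = x^T P x from simulated
    one-step pairs by least squares (asking V(x') - V(x) = -||x||^2 on the
    data), then test the decrease condition at fresh sample points.
    Returns (P, list of violating test points)."""
    feats = lambda x: np.array([x[0] ** 2, 2 * x[0] * x[1], x[1] ** 2])
    A, b = [], []
    for x in samples:
        xp = step(x)                       # one simulation step from x
        A.append(feats(xp) - feats(x))
        b.append(-(x[0] ** 2 + x[1] ** 2))
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    P = np.array([[p[0], p[1]], [p[1], p[2]]])
    V = lambda x: x @ P @ x
    # violating points would be fed back as additional constraints
    bad = [x for x in test_points if V(step(x)) - V(x) > 1e-8]
    return P, bad
```

For a stable linear map the least-squares fit recovers the solution of the discrete Lyapunov equation exactly, so the sample check passes everywhere.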
Usual falsification techniques search for simulation traces violating a given safety property by sequential search. We explore the idea of using a global optimizer to minimize the distance between end-points of trajectory segments thereby approaching a real trajectory. Such directionless search has the potential to search the state-space in a less restricted fashion, and thus more comprehensively for systems with large amounts of switching.
Over a chosen metric topology, we define a transducer as robust if input strings within bounded distance produce output strings within bounded distance. Algorithms based on checking emptiness of automata are presented for robustness analysis of transducers over the metric space induced by the Manhattan distance and the one induced by the Levenshtein (edit) distance.
The theory of regular functions provides a class of analyzable functions from strings to numbers that can provide a foundation for analyzing quantitative properties of finite-state systems. Cost Register Automata (CRAs) are deterministic machines that map strings to costs using a vocabulary of interesting operations such as addition, scaling, and minimum. The notion of regularity provides a yardstick to measure expressiveness, and to study decision problems and theoretical properties of the resulting classes of cost functions. Over suitably defined vocabularies, CRAs generalize weighted automata and probabilistic automata, and also allow us to express generalized min-cost problems over directed graphs.
Requirements for industrial control systems are often imprecise, non-modular, evolving, or even simply unknown. We propose a framework for automatically mining requirements from Simulink models. The input to our algorithm is a requirement template expressed in Parametric Signal Temporal Logic. Our algorithm is an instance of counterexample-guided inductive synthesis: an intermediate candidate requirement is synthesized from simulation traces of the system, which is refined using counterexamples to the candidate obtained with the help of a falsification tool.
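The counterexample-guided loop, specialized to mining a single upper-bound parameter (with exhaustive search over a finite input pool standing in for the falsification tool; `simulate` and the pools are hypothetical stand-ins for the Simulink model and its input space):

```python
def mine_threshold(simulate, inputs, falsify_pool, rounds=10):
    """CEGIS-style mining of the tightest c such that G (x <= c) holds:
    synthesize c from the traces seen so far (the max signal value), then
    ask a 'falsifier' for an input whose trace exceeds the candidate;
    repeat until no counterexample is found."""
    seen = [simulate(u) for u in inputs]
    c = max(max(tr) for tr in seen)        # candidate requirement: G (x <= c)
    for _ in range(rounds):
        cex = next((u for u in falsify_pool if max(simulate(u)) > c), None)
        if cex is None:
            return c                       # no counterexample: candidate holds
        c = max(c, max(simulate(cex)))     # refine candidate with counterexample
    return c
```

The monotonicity of the template (larger c is easier to satisfy) is what makes the refinement step a simple max update.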
We propose a new design methodology for communication protocols based on the observation that protocols are usually understood in terms of abstract scenarios, each of which consists of a set of transition snippets. A transition snippet describes a single step of the concrete behavior of a protocol, or a single symbolic execution step, or a mixture of both. Given a protocol specified as snippets, we show how intelligent expression and synthesis coupled with an explicit-state model checker can be used in an iterative manner to converge to a complete, fully symbolic protocol description. The advantage of this approach is twofold: (1) specifying a protocol as scenarios is more natural, and (2) our synthesis procedure provides a correct-by-construction protocol from such a designer-friendly specification.
We present a notion of robustness for a networked system when the underlying network is prone to errors. Networked system is modeled as Mealy machines communicating over a set of internal channels, and interacting with the outside world through a fixed set of input and output channels. We focus on network errors that arise from channel perturbations, and given a worst-case bound 'i' for the number of errors that can occur in the internal channels, we estimate if the worst-case error in the output channel is less than a given bound 'o'. Such a network is called (i,o)-robust. Error is measured as distance from unperturbed output, and we give procedures polynomial in the size of the network when the distance metric is the edit (Levenshtein) distance or the L1-norm.
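The edit-distance case rests on the classic Levenshtein dynamic program (standard textbook version, shown here for reference; the paper's contribution is the polynomial-time analysis over networks of Mealy machines, not this DP itself):

```python
def edit_distance(a, b):
    """Levenshtein distance via the classic O(|a|*|b|) dynamic program,
    kept in two rows so memory is O(|b|)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute (or match)
        prev = cur
    return prev[-1]
```

Here the output-channel error of a perturbed network run would be measured as the edit distance between the perturbed and unperturbed output strings.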
Nondeterministic Streaming String Transducers (NSSTs) can realize MSO-definable relations between strings and are closed under sequential composition. Functional NSSTs are nondeterministic, yet implement functions from strings to strings. Checking functionality of an NSST is in PSPACE, equivalence checking for functional NSSTs is PSPACE-complete, while that for arbitrary NSSTs is undecidable. Checking containment of an NSST in a finite union of deterministic SSTs is in PSPACE. An NSST could potentially be used as a model for concurrent methods acting on a singly-linked data structure.
Derives concurrency control for a sequential library to make it thread-safe, given proof of sequential correctness of the library. Thread safety defined as: a) absence of assertion failures in concurrent context, or b) linearizability of library.
Identifies syntactic fragment and associated sufficient conditions such that recursive methods therein can be algorithmically verified. Methods and pre/post- conditions encoded as tree automata; correctness reduced to language emptiness.
Derives interface contracts over concurrent invocations of library methods to ensure deadlock-freedom. Symbolically explores the exponential number of alias patterns across method arguments, using an SMT solver to derive contracts in terms of these patterns.
For Boolean Programs specified by pre/post-condition semantics, uses local annotations generated by pre/post-condition propagation to generate repairs. Improves on existing complexity results by an intelligent repair extraction.
Reduces verification of parameterized data structures to the emptiness of tree automata. Can verify properties like acyclicity for a reasonable set of methods operating on lists, trees, and general graphs.