Conference and Summer School on Declarative Programming
17th to 22nd September 2017
Declarative programming is an advanced paradigm for modeling and solving complex problems. It has attracted increasing attention over the last decades, e.g., in the domains of data and knowledge engineering, databases, artificial intelligence, natural language processing, modeling and processing combinatorial problems, and for establishing systems for the web.
The conference Declare 2017 aims to promote the cross-fertilizing exchange of ideas and experiences among researchers and students from the different communities interested in the foundations, applications, and combinations of high-level, declarative programming and related areas.
It will be accompanied by a one-week summer school on Advanced Concepts for Databases and Logic Programming for students and PhD students.
The technical program of the event will include invited talks, presentations of refereed papers, and system demonstrations.
Part of Declare'17
University of Würzburg, Germany
Min Fang and Hans Tompits
In recent years, different methods for supporting the development of answer-set programming (ASP) code have been introduced. During such a development process, often it would be desirable to have a natural-language representation of answer sets, e.g., when dealing with domain experts unfamiliar with ASP.
In this paper, we address this point and provide an approach for such a representation, defined in terms of a controlled natural language (CNL), which in turn relies on the annotation language LANA for the specification of meta-information for answer-set programs. Our approach has been implemented as an Eclipse plug-in for SeaLion, a dedicated IDE for ASP.
Lorenz Leutgeb and Antonius Weinzierl
Answer-Set Programming (ASP) is a well-known and expressive logic programming paradigm based on efficient solvers. State-of-the-art ASP solvers require the ASP program to be variable-free; they thus ground the program upfront, at the cost of a potentially exponential explosion of the space required. Lazy-grounding, where solving and grounding are interleaved, circumvents this grounding bottleneck, but the resulting solvers lack many important search techniques and optimizations. The recently introduced ASP solver Alpha combines lazy-grounding with conflict-driven nogood learning (CDNL), a core technique of efficient ASP solving. This work presents how techniques for efficient propagation can be lifted to the lazy-grounding setting. The Alpha solver and its components are presented, and detailed benchmarks comparing Alpha to other ASP solvers demonstrate the feasibility of this approach.
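The grounding bottleneck the abstract refers to can be made concrete with a back-of-the-envelope calculation (this is a generic illustration, not taken from the paper): a single rule with k variables over a domain of n constants already has n**k ground instances.

```python
# Illustration of why upfront grounding can explode: the number of ground
# instances of one rule grows exponentially in the number of its variables.
def ground_instances(domain_size, num_vars):
    """Ground instances of a single rule: one per assignment of constants
    to variables, i.e. domain_size ** num_vars."""
    return domain_size ** num_vars

# One rule, 3 variables, 1000 constants: a billion ground instances.
blowup = ground_instances(1000, 3)
```

Lazy grounding avoids materializing these instances upfront and only grounds rules as the search actually needs them.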
Jan Rasmus Tikovsky
In recent years, concolic testing, a technique combining concrete and symbolic execution for the automated generation of test cases, has gained increasing popularity. Concolic testing tools are initialized with expressions on concrete input data.
Instead of just evaluating them, they additionally collect symbolic information along specific execution paths. This information can be used to systematically compute alternative inputs that explore yet unvisited paths. In this way, test cases can be generated that cover all branches of a given program.
The first concolic testing tools were developed for imperative languages, analyzing code at a very low level. Recently, there have also been some approaches investigating the concolic execution of declarative languages. In this work, we discuss the application of concolic testing to the functional logic language Curry. More precisely, we present ccti, a concolic interpreter adapted for the automated generation of test cases for both purely functional and non-deterministic programs.
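The core loop described above can be sketched in a few lines (a toy illustration of the general concolic idea, not of ccti itself): run the program on a concrete input while recording the branch conditions taken, then negate a recorded condition to obtain an input for the other branch. The names and the hard-wired "solver" below are illustrative assumptions.

```python
# Toy concolic loop: concrete execution that also records the path condition,
# plus a hard-wired "solver" that flips the last branch to reach new code.

def concolic_run(x):
    """Run the program under test on concrete input x while recording,
    symbolically, which side of each branch was taken."""
    path = []
    if x > 10:                       # the single branch of the toy program
        path.append(("x > 10", True))
        result = "big"
    else:
        path.append(("x > 10", False))
        result = "small"
    return result, path

def flip_last(path):
    """Derive an input satisfying the negation of the last branch condition.
    A real tool would call a constraint solver; here the answer is hard-wired
    for the single constraint x > 10."""
    cond, taken = path[-1]
    assert cond == "x > 10"
    return 0 if taken else 11        # any value on the other side of the branch

# Start from one concrete seed, then generate an input for the unvisited branch.
r1, path1 = concolic_run(42)         # takes the x > 10 branch
seed2 = flip_last(path1)             # proposes an input for the other branch
r2, _ = concolic_run(seed2)          # now both branches are covered
```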
CalcuList (Calculator with List manipulation) is an educational language for teaching functional programming, extended with some imperative and side-effect features that are enabled at the programmer's explicit request. In addition to strings and lists, the language natively supports JSON objects. It has a Python-like syntax, and interactive computation sessions with the user are established through a REPL (Read-Evaluate-Print-Loop) shell. The object code produced by a compilation is a program that is eventually executed by the CalcuList Virtual Machine (CLVM).
in Z6, Lecture Hall 0.001
Thom Frühwirth (Talk by Daniel Gall)
In previous work we added justifications to Constraint Handling Rules (CHR) to enable logical retraction of constraints for dynamic algorithms. We presented a straightforward source-to-source transformation to implement this conservative extension. In this companion paper, we improve the performance of the transformation.
We discuss its worst-case time complexity in general. Then we perform experiments. We benchmark the dynamic problem of maintaining shortest paths under addition and retraction of paths. The results validate our complexity considerations.
Pedro Roque, Vasco Pedro and Salvador Abreu
Applying parallelism to constraint solving seems a promising approach and it has been done with varying degrees of success. Early attempts to parallelize constraint propagation, which constitutes the core of traditional interleaved propagation and search constraint solving, were hindered by its essentially sequential nature.
Recently, parallelization efforts have focussed mainly on the search part of constraint solving, as well as on local-search based solving. Lately, a particular source of parallelism has become pervasive in the guise of GPUs, which are able to run thousands of parallel threads and have naturally drawn the attention of researchers in parallel constraint solving.
We address challenges faced when using multiple devices for constraint solving, especially GPUs, such as deciding on the appropriate level of parallelism to employ, load balancing and inter-device communication, and present our current solutions.
Frank Richter, Daniel Gall and Thom Frühwirth
In the abstract operational semantics of Constraint Handling Rules (CHR), propagation rules, i.e., rules that only add information, can be applied again and again. This trivial non-termination is typically avoided by a propagation history. A more declarative approach is persistent constraints: constraints that are introduced by propagation rules are made persistent and cannot be removed. A propagation rule is then only applied if its derived constraints are not already persistent.
The operational semantics with persistent constraints Ω! differs substantially from other operational semantics, hence the standard confluence test cannot be applied. In this paper, a confluence test for Ω! is presented. Since Ω! breaks monotonicity of CHR, a weaker property is established that is shown to suffice for a decidable confluence criterion for terminating Ω! programs. The confluence test is implemented using a source-to-source transformation.
Falco Nogatz, Jona Kalkus and Dietmar Seipel
XML Schema is a well-established mechanism to define the structure and constrain the content of an XML document. While this approach is by itself very declarative, currently available tools for XML validation are not. In this paper, we introduce an implementation of an XSD validator in SWI-Prolog, made publicly available as the package library(xsd). Our approach is based on flattening the XSD and XML documents into Prolog facts. The top-down validation makes extensive use of Prolog's backtracking and unification capabilities. To ensure compliance with the XSD standard and to support test-driven development, we have created a test framework based on the Test Anything Protocol and SWI-Prolog's quasi-quotations.
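The flattening step described above can be illustrated as follows (a hypothetical Python sketch of the general idea; library(xsd) itself produces Prolog facts, not Python tuples): every element becomes a node fact with an identifier and its parent, and text content becomes a separate fact, so validation can then be phrased as queries over these flat relations.

```python
# Sketch of flattening an XML tree into flat, fact-like tuples:
# ("node", Id, ParentId, Tag) and ("text", Id, Content).
import xml.etree.ElementTree as ET

def flatten(xml_string):
    """Turn an XML tree into node/3- and text/2-style facts."""
    root = ET.fromstring(xml_string)
    facts, counter = [], [0]

    def visit(elem, parent_id):
        counter[0] += 1
        node_id = counter[0]
        facts.append(("node", node_id, parent_id, elem.tag))
        if elem.text and elem.text.strip():
            facts.append(("text", node_id, elem.text.strip()))
        for child in elem:
            visit(child, node_id)

    visit(root, 0)   # 0 = artificial parent of the document root
    return facts

facts = flatten("<book><title>Prolog</title><year>2017</year></book>")
```

On this input the flattening yields one node fact per element plus two text facts, a representation over which structural constraints are easy to query.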
Philipp Koerner and Sebastian Krings
In general, even though Prolog is a dynamically typed language, predicates may not be called with arbitrarily typed arguments. Assumptions regarding type or mode are often made implicitly, without being directly represented in the source code. This complicates identifying the types or data structures anticipated by predicates. As a consequence, Covington et al. proposed that Prolog developers should implement their own runtime type checking system. In this paper, we present a re-usable Prolog library named plspec. It offers a simple and easily extensible DSL used to specify the type and structure of input and output arguments. Additionally, multiple kinds of runtime checks could be inserted elegantly by using Prolog language features such as co-routining and term expansion. Furthermore, we discuss performance impacts and possible future usage of these annotations.
Type systems are a powerful tool in modern programming languages. Static descriptive type inference algorithms for logic programs, which do not rely on a priori type declarations, usually abstract the program success set. This makes types overly generous in several cases, due to the unconstrained use of logic variables, which may cause the acceptance of more terms than intended in a successful computation. We argue that in a fully typed logic programming language we should have type constraints over all variables. In this paper we propose that all uses of logic variables in a program should be type constrained, where by constraints we mean that either the type of the variable is strictly smaller than the set of all possible terms, or that there is an equality constraint between different types. The types in which all variables are constrained are named closed types. Here we define the notion of closed types and a closure operation which transforms general regular types into closed types.
Emmanuelle-Anna Dietz, Steffen Hölldobler and Richard Mörbitz
It seems widely accepted that human reasoning cannot be modeled by means of classical logic. Psychological experiments have repeatedly shown that participants' answers systematically deviate from the classical logically correct answers. Recently, a new computational logic approach to modeling human syllogistic reasoning has been developed which seems to perform better than other state-of-the-art cognitive theories. We take this approach as starting point, yet instead of trying to model the human reasoner, we aim at identifying clusters of reasoners, which can be characterized by reasoning principles or by heuristic strategies.
Processing programs as data is one of the successes of functional and logic programming. Higher-order functions, as program-processing programs are called in functional programming, and meta-programs, as they are called in logic programming, are widespread declarative programming techniques. In logic programming, there is a gap between the meta-programming practice and its theory: Meta-programming's formalisations do not explicitly address meta-programming's impredicativity and are cumbersome. This article aims at overcoming this unsatisfactory situation by discussing the relevance of impredicativity to meta-programming and by revisiting Ambivalent Logic's syntax and model theory. The impredicative language and model theory proposed in this article are conservative extensions of the language and model theory of first-order logic.
The Push Method for Bottom-Up Evaluation in deductive databases was previously defined as a translation from Datalog to C++. Performance tests on some benchmarks from the OpenRuleBench collection gave very encouraging results. However, most of the systems used for comparison compile the query into code of an abstract machine and then use an emulator for this code. Therefore, runtimes cannot be directly compared. In this paper, we propose an abstract machine for bottom-up evaluation of Datalog based on the Push Method. This also helps to clarify some optimizations we previously expected from the C++ compiler. Since the interpreted code of the abstract machine must do something useful "standalone", we also consider declarative output with templates.
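The push principle behind this evaluation strategy can be sketched as follows (a generic illustration in Python; the function names are mine, and the paper's abstract machine is not reproduced here): every newly derived fact is immediately "pushed" into the rule bodies that consume it, shown for the classic transitive-closure program tc(X,Y) :- edge(X,Y). tc(X,Z) :- tc(X,Y), edge(Y,Z).

```python
# Push-style bottom-up evaluation of transitive closure: instead of iterating
# over all rules until a fixed point, each new tc fact immediately triggers
# the rule bodies in which it occurs.

def transitive_closure(edges):
    tc = set()

    def push(fact):
        """Add a new tc fact and immediately derive its consequences."""
        if fact in tc:
            return                   # duplicate: nothing new to push
        tc.add(fact)
        x, y = fact
        for (a, b) in edges:
            if a == y:               # body match: tc(x,y), edge(y,b) => tc(x,b)
                push((x, b))

    for e in edges:                  # rule 1: every edge is in tc
        push(e)
    return tc

result = transitive_closure({(1, 2), (2, 3), (3, 4)})
```

Because consequences are consumed as soon as they are produced, no intermediate delta relations need to be materialized, which is one source of the method's efficiency.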
Michael Hanus and Jonas Oberschweiber
We present CPM, a package manager for the declarative multi-paradigm language Curry. Although CPM inherits many ideas from package managers for other programming languages, a distinguishing feature of CPM is its support for checking the rules of semantic versioning, a convenient principle for associating meaningful version numbers with different software releases. Although the correct use of version numbers is important in software package systems, where packages depend on specific releases of other packages, it is often treated as an informal agreement and usually not checked by package managers. CPM is different in this respect: it provides support for checking the semantic requirements implied by the semantic versioning scheme. Since these semantic requirements are undecidable in general, CPM uses the property-based testing tool CurryCheck to check the semantic equivalence of two different versions of a software package. Thus, CPM provides a good compromise between the mere use and the formal verification of the semantic versioning rules.
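The idea of approximating an undecidable equivalence by property-based testing can be sketched as follows (in Python rather than Curry, and with illustrative function names; CPM itself delegates this to CurryCheck): test the old and new versions of an exported operation on many generated inputs and report a counterexample if they disagree.

```python
# Property-based check of semantic equivalence between two versions of an
# operation: sample random inputs and compare the results.
import random

def reverse_v1(xs):
    """Operation as exported by the old release (illustrative)."""
    return list(reversed(xs))

def reverse_v2(xs):
    """Re-implementation in the new release (illustrative)."""
    out = []
    for x in xs:
        out.insert(0, x)
    return out

def equivalent_on_samples(f, g, samples=200, seed=1):
    """f and g agree on all randomly generated integer lists tried."""
    rng = random.Random(seed)
    for _ in range(samples):
        xs = [rng.randint(-5, 5) for _ in range(rng.randint(0, 8))]
        if f(xs) != g(xs):
            return False             # counterexample: not equivalent
    return True

ok = equivalent_on_samples(reverse_v1, reverse_v2)
```

A passing check does not prove equivalence, which is exactly the compromise between mere convention and formal verification mentioned above.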
Armin Wolf (Talk by Hans-Joachim Goltz)
Motivated by the necessity to model the energy loss of energy storage devices, a Proportional Constraint is introduced in finite integer domain Constraint Programming. Rounding is therefore used within its definition. For practical applications in finite domain Constraint Programming, pruning rules are presented and their correctness is proven. Further, it is shown by examples that the number of iterations necessary to reach a fixed point while pruning depends on the considered constraint instances. However, fixed-point iteration always results in the strongest notion of bounds consistency. Furthermore, an alternative modeling of the Proportional Constraint is presented. The run-times of the implementations of both alternatives are compared, showing that the implementation of the Proportional Constraint based on the presented pruning rules always performs better on the sample problem classes.
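Bounds pruning with fixed-point iteration for a rounded proportional relation can be sketched as follows (a hypothetical illustration for y = round(p·x/100) over integer intervals; the paper's actual pruning rules and their consistency guarantees are not reproduced here): tighten y's bounds from x's, shrink x to values whose image stays within y's bounds, and repeat until nothing changes.

```python
# Toy bounds pruning for the relation y = round(p * x / 100) on integer
# interval domains, iterated to a fixed point.

def prune(x_lo, x_hi, y_lo, y_hi, p):
    """One pruning step: tighten y's bounds from x's, then x's from y's."""
    # y grows monotonically with x, so its bounds follow from x's bounds.
    y_lo = max(y_lo, round(p * x_lo / 100))
    y_hi = min(y_hi, round(p * x_hi / 100))
    # Keep only x values whose image lies within y's current bounds.
    while x_lo <= x_hi and not (y_lo <= round(p * x_lo / 100) <= y_hi):
        x_lo += 1
    while x_hi >= x_lo and not (y_lo <= round(p * x_hi / 100) <= y_hi):
        x_hi -= 1
    return x_lo, x_hi, y_lo, y_hi

def fixpoint(x_lo, x_hi, y_lo, y_hi, p):
    """Iterate pruning until the domains no longer change."""
    state = (x_lo, x_hi, y_lo, y_hi)
    while True:
        new = prune(*state, p)
        if new == state:
            return state
        state = new

# x in [0,100], y in [10,20], y = round(0.9 * x): x shrinks to [11,22].
bounds = fixpoint(0, 100, 10, 20, 90)
```

How many iterations such a loop needs before stabilizing depends on the instance, which is the behaviour the abstract describes for the fixed-point iteration.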
The programming paradigm of constrained objects is a declarative variant of the object-oriented paradigm wherein objects define the structure of a system and declarative constraints (rather than imperative methods) define its behavior. Constrained objects have many uses in the engineering domain and computation in this paradigm is essentially constraint solving. This paper is concerned with an extension of constrained objects called temporal constrained objects, which are especially appropriate for modeling dynamical systems. The main extensions are series variables and metric temporal operators to declaratively specify time-varying behavior. The language TCOB exemplifies this paradigm and the execution of TCOB programs consists of constraint solving within a time-based simulation framework. One of the challenges in TCOB is identifying errors owing both to the complexity of programs and the underlying constraint solving methods. We address this problem by extracting a run-time trace of the execution of a TCOB program and providing an analysis of the cause of error. The run-time trace also serves as a basis, in many cases, for constructing a finite-state machine which in turn can be used for "model-checking" properties of the system. The paper also presents abstraction techniques for dealing with simulations that result in large state spaces.
in Toskanasaal, Residence Würzburg
in Residence Restaurant
in Z6, Lecture Hall 0.001
Gonçalo Carnaz, Roy Bayot, Vitor Beires Nogueira, Teresa Gonçalves and Paulo Quaresma
The Agatha project aims to develop an intelligent system that resorts to open sources (video, audio and text) of information for surveillance and crime control. Named-entity recognition combined with ontologies is the approach followed for the textual sources. This work describes the theoretical basis together with the system implementations for the text analysis component of the Agatha framework.
Sibylle Schwarz and Mario Wenzel
We present a method to control Lego EV3 robots by Prolog programs. The connection between the robot and SWI-Prolog is established via ev3dev. The sensors and actors of the robot can be operated from Prolog programs using our collection of predefined predicates. We demonstrate our approach by some examples from our introductory robotics courses.
I teach a course on Advanced Programming for 4th-semester bachelor students of computer science.
In this note, I explain my reasons for choosing the topics to teach, as well as their order and presentation.
In particular, I show how I include automated exercises using the Leipzig autotool system.
Jan C. Dageförde and Herbert Kuchen
Object-oriented (OO) languages such as Java are the dominating programming languages nowadays, among other reasons due to their ability to encapsulate data and operations working on them, as well as due to their support of inheritance. However, in contrast to constraint-logic languages, they are not particularly suited for solving search problems. During development of enterprise software, which occasionally requires some search, one option is to produce components in different languages and let them communicate. However, this can be clumsy.
As a remedy, we have developed the constraint-logic OO language Muli, which augments Java with logic variables and encapsulated search. Its implementation is based on a symbolic Java virtual machine that supports constraint solving and backtracking. In the present paper, we focus on the non-deterministic operational semantics of an imperative core language.
Ke Liu, Sven Löffler and Petra Hofstedt
Parallel constraint solving is a promising way to enhance the performance of constraint programming. Yet, current solutions for parallel constraint solving ignore the importance of hypergraph decomposition when mapping constraints onto cores. This paper explains why and how hypergraph decomposition can be employed to distribute the workload relatively evenly in parallel constraint solving. We present our dedicated hypergraph decomposition method det-k-CP for parallel constraint solving. The result of det-k-CP, which conforms to the four conditions of hypertree decomposition, can be used to allocate the constraints of a given constraint network to cores for parallel constraint solving. Our benchmark evaluations have shown that det-k-CP can decompose a hypergraph relatively evenly for constraint networks of a specific scale. Moreover, we obtained competitive execution times as long as the hypergraphs are sufficiently simple.
To start the registration process, please fill in the form on our registration page. You will automatically receive a confirmation email that we have received your data. This confirmation email also includes a copy of the data you entered.
We will send you instructions for the bank transfer via email. This may take up to three days. Once we have received the money, your registration is complete. You will get a written receipt at the conference.
Universität Würzburg – Franconia – Bavaria – Germany
+49 931 322770 · Website
Hotel Zur Stadt Mainz
+49 931 53155 · Website
Hotel Würzburger Hof
+49 931 53814 · Website
We have collected the information for guests on a separate page.