Theory Days

Estonia-Latvia

Abstracts

Aleksandrs Belovs (University of Latvia). Tight Quantum Lower Bound for k-Distinctness.

We introduce a new quantum query lower bound framework. It is inspired by Zhandry's compressed oracle technique, but it also subsumes the polynomial method as a special case. Compared to Zhandry's technique, our approach has two key differences. First, we do not use any oracles (except for the standard input oracle) and define "knowledge" directly from the expansion in the Fourier basis. Second, we allow arbitrary probability distributions of inputs. We demonstrate the power of this framework by proving the first tight quantum query lower bound for the k-Distinctness problem.

Maiara Bollauf (University of Tartu). Coset Shaping for Coded Modulation.

This talk introduces coset shaping, a new geometric shaping technique for coded QAM and PAM signaling. The method applies to both information and parity bits without additional complexity. We show that, as the code length and modulation order increase, the gap to capacity can be made arbitrarily small. In addition, we present bounds and performance comparisons demonstrating the advantages of coset shaping when combined with short- and medium-length codes.

Chris Brzuska (Aalto University). On Bounded Storage Key Agreement and One-Way Functions.

We study key agreement in the bounded-storage model, where the participants and the adversary can use an a priori fixed bounded amount of space, and receive a large stream of data. While key agreement is known to exist unconditionally in this model (Cachin and Maurer, Crypto’97), there are strong lower bounds on the space complexity of the participants, round complexity, and communication complexity that unconditional protocols can achieve.

In this work, we explore how a minimal use of cryptographic assumptions can help circumvent these lower bounds. We obtain several contributions:

Assuming one-way functions, we construct a one-round key agreement in the bounded-storage model, with arbitrary polynomial space gap between the participants and the adversary, and communication slightly larger than the adversarial storage. Additionally, our protocol can achieve everlasting security using a second streaming round.

In the other direction, we show that one-way functions are necessary for key agreement in the bounded-storage model with large space gaps. We further extend our results to the setting of fully-streaming adversaries, and to the setting of key agreement with multiple streaming rounds.

Our results rely on a combination of information-theoretic arguments and technical ingredients such as pseudorandom generators for space-bounded computation, and a tight characterization of the space efficiency of known reductions between standard Minicrypt primitives (from distributional one-way functions to pseudorandom functions), which might be of independent interest.

Matteo Campanelli (University of Tartu, Offchain Labs). Subvector Commitments with Optimal Opening Complexity.

Subvector commitments (SVC) enable committing to a sequence of values and provably revealing any number of values at specific positions at a later time, potentially keeping preprocessed auxiliary information. SVCs have an extremely broad range of applications, from compiling PCPs into cryptographic arguments to proofs of space and much more. We introduce a simple pairing-based SVC where the prover can open a subvector of size k in time linear in k in the standard model (without any random oracle). This improves on the state of the art in the same category (Tomescu et al., SCN 2020) which required O(k log^2(k)) time. Our scheme has several nice structural properties (associative proofs and other aggregation features) and is extremely practical—it concretely obtains orders-of-magnitude speedups relative to prior work.
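As a classical point of comparison for the subvector-opening costs discussed above, a Merkle tree already gives a (sub)vector commitment, but each of the k opened positions costs a logarithmic-size proof. The sketch below is this folklore baseline, not the pairing-based scheme from the talk.

```python
import hashlib

# Folklore Merkle-tree vector commitment (vector length must be a power
# of two here).  Commit to a vector; open any position with a
# logarithmic-size sibling path.
H = lambda b: hashlib.sha256(b).digest()

def commit(values):
    layer = [H(v) for v in values]
    tree = [layer]
    while len(layer) > 1:
        layer = [H(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
        tree.append(layer)
    return tree  # tree[-1][0] is the commitment (root)

def open_pos(tree, i):
    proof = []
    for layer in tree[:-1]:
        proof.append(layer[i ^ 1])  # sibling at this level
        i //= 2
    return proof

def verify(root, i, value, proof):
    h = H(value)
    for sib in proof:
        h = H(h + sib) if i % 2 == 0 else H(sib + h)
        i //= 2
    return h == root

vec = [b"a", b"b", b"c", b"d"]
tree = commit(vec)
root = tree[-1][0]
assert verify(root, 2, b"c", open_pos(tree, 2))
```

Opening a subvector of size k this way costs O(k log n) hashes, which is the kind of per-position overhead the talk's O(k)-time scheme avoids.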

Silvio Capobianco (Tallinn University of Technology). Two approaches to convergence of trajectories.

In a discrete-time dynamical system, points of a topological space are moved by successive applications of a continuous function. Each point will then follow a trajectory, which is a function that associates to each time the point's position after the corresponding number of iterations of either the update, or its inverse if the latter is reversible. We know what it means for a sequence of points to converge to a point; but then, what does it mean for a sequence of trajectories to converge to a trajectory?

In this talk, we discuss two possible approaches to the problem, proposed by Abram Besicovitch and Hermann Weyl, in the case of a discretized dynamics, where the phase space is partitioned into finitely many blocks and only the current block is recorded instead of the exact position. Trajectories then become infinite (or bi-infinite for a reversible dynamics) words over a finite alphabet. The core idea is that two infinite words are "close" if the set of points where they differ is "proportionally small". Two realizations of this idea lead to two pseudo-distances, and the metric spaces obtained by identifying sequences at pseudo-distance zero are very different from the more common Cantor space. We then review an application by Enrico Formenti and others to the study of cellular automata: in particular, the shift map, which is chaotic in the Cantor space, becomes an isometry of the Besicovitch and Weyl spaces. We conclude with more recent work with Pierre Guillon in a more general setting where infinite words are replaced by configurations over generic groups: in particular, we give a necessary and sufficient condition for the more general shifts to be isometries.

Maksims Dimitrijevs (University of Latvia). Lights Out Problem to Benchmark Real Quantum Hardware.

We present Grover's search implementations of the Lights Out problem for graphs with a 2x2 grid of lamps and with 6 lamps on a Möbius ladder to benchmark the performance of real quantum hardware. Additionally, we ran small Grover SAT baseline problem instances to diagnose device limitations. Problem instances are executed on publicly accessible quantum processors from IBM and IQM. Our results indicate improvements in publicly available quantum hardware capacity. We also observe that performance may vary among quantum devices of the same manufacturing revision, and that calibration quality strongly influences outcomes.
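For readers unfamiliar with the problem: pressing a lamp toggles it and its grid neighbours, and a configuration is solvable if some set of presses turns every lamp off. The 2x2 instance can be checked classically by brute force; the sketch below is this classical baseline, not the Grover implementation benchmarked in the talk.

```python
from itertools import product

# Lamps indexed row-major: 0 1 / 2 3.  Pressing lamp i toggles lamp i and
# its horizontal/vertical neighbours; each mask is the toggled bit set.
PRESS = [0b0111, 0b1011, 0b1101, 0b1110]

def solve(state):
    """Return a tuple of 0/1 presses turning all lamps off, or None."""
    for presses in product([0, 1], repeat=4):
        s = state
        for i, p in enumerate(presses):
            if p:
                s ^= PRESS[i]
        if s == 0:
            return presses
    return None

# Every 2x2 state is solvable with a unique solution, since the four
# press masks are linearly independent over GF(2).
assert solve(0b1111) == (1, 1, 1, 1)
```

Grover's search replaces this exhaustive loop with amplitude amplification over the 2^4 press patterns, which is what the talk benchmarks on IBM and IQM devices.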

Ulrik Sørgaard Djupvik (Tallinn University of Technology). Introduction to Sweedler Theory.

Sweedler theory studies so-called measurable maps between algebras and was introduced by M.E. Sweedler in 1969 in the context of Hopf algebras. A particular highlight of the theory is the family of so-called Sweedler operations, which describe certain interactions between algebras and coalgebras. In this talk I will describe the original theory, including defining the five Sweedler operations in the classical setting. I will then give a more general framework in which to do Sweedler theory. Namely, I will define measurable maps between monoids (resp. bimodules) in a duoidal category, and explain how this witnesses the category of monoids (resp. bimodules) being graded by the monoidal category of comonoids (resp. cobimodules), in the sense of V-graded categories.

Dirk Draheim (Tallinn University of Technology). Conditional Events, Conditional Random Variables and the Strong Law of Large Numbers.

In (Draheim 2017), we established a frequentist semantics for partial conditional probabilities. On this basis, we defined a notion of conditional probabilities and utilized it in a version of the Strong Law of Large Numbers for conditional probabilities. In this talk, we present recent work (with Ahto Buldas) that extends these results to the continuous case, resulting in a notion of conditional random variables. Beyond the technical details (definitions, theorem, proof outline), the talk will include extensive illustrations, motivation, and examples. (Dirk Draheim (2017), http://dx.doi.org/10.1007/978-3-319-69868-7)
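As a toy illustration of the frequentist reading of conditional probability (not the talk's formal construction), the relative frequency of an event A among those trials where a condition B occurs converges to P(A | B):

```python
import random

# A = "die shows 6", B = "die shows an even number", so P(A | B) = 1/3.
# The conditional relative frequency hits/trials converges to this value
# by the strong law of large numbers.
random.seed(0)
hits = trials = 0
for _ in range(100_000):
    roll = random.randint(1, 6)
    if roll % 2 == 0:        # condition B occurred
        trials += 1
        hits += (roll == 6)  # event A
cond_freq = hits / trials    # close to 1/3
```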

Matthew Earnshaw (University of Tartu). Monoidal categories graded by partial commutative monoids.

Parallel composition of processes is often subject to constraints: programs accessing shared memory may only run in parallel safely if they operate on disjoint memory locations, or processes consuming a bounded resource may only run in parallel if their combined usage is within the bound. Effectful categories axiomatize an instance of this phenomenon in programming language semantics: effectful computations may be combined in parallel with central computations, but not with each other. Monoidal categories axiomatize the unrestricted case, in which the globally defined monoidal product allows all processes to be put in parallel. This work introduces monoidal categories graded by partial commutative monoids (PCMs) to axiomatize the structure common to all of these situations.

Christoph Egger (Chalmers University of Technology). Domino: Verifying Cryptographic Protocols.

Domino is an opinionated verification tool for proofs of cryptographic protocols in the computational model -- security is argued by *reduction* to (computational) hardness assumptions.

In contrast to well-established tools like EasyCrypt, Domino specifically targets protocol proofs and trades generality and expressivity whenever this simplification makes proofs of complex cryptographic protocols like TLS easier.

Massimo Equi (Aalto University). Limits of Distributed Quantum Advantage.

In the classical LOCAL model of distributed computing we study how many communication rounds are needed to solve a computational task in a distributed network. The network is modeled by a graph, where the nodes represent computers of unbounded computational power. When looking for distributed quantum advantage, one can compare LOCAL with its quantum generalization quantum-LOCAL, or even stronger models of distributed computing. In our recent works, we found some forms of quantum advantage, but also discovered that there are limits to it. In this talk, I will focus on presenting the lower bounds we found for distributed quantum advantage for specific classes of problems. Namely, distributed linear programs do not permit any quantum advantage, while locally checkable labelling (LCL) problems do, but only up to a $\sqrt{n}$ factor.

Miika Hannula (University of Tartu). Semiring Semantics in Logic and Databases.

In semiring provenance, a topic originating in database theory, database tuples are annotated with values from a specific semiring, and queries propagate these annotations to outputs. Different semirings can encode different information: polynomial semirings represent provenance, tropical semirings model shortest path distances, and natural numbers count the number of derivations for a query output. In recent years this perspective has been extended beyond basic query languages to more expressive logical formalisms that include also negation and fixed-point operators, and connected to classical topics in finite model theory such as expressiveness and game-based techniques. In this talk, we survey these developments, including the presenter’s recent work on semiring-based dependencies.

Wulf Harder (University of Latvia, QuBalt). Shaping Loss Landscapes for SAT: From Hamiltonian Cycle Encodings to Integer Factorisation.

(abstract available upon request to registered participants)

Aleksi Kalsta (Aalto University). Simple Attacks against (Extended) Fiat-Shamir.

The Fiat-Shamir (FS) transform turns interactive public-coin arguments into non-interactive arguments while preserving soundness in the Random Oracle Model (ROM), whereas soundness in the standard model does not hold in general. One of the counterexamples is the work by Bartusek et al. (BBHMR, TCC 2019), which shows that FS fails for a (contrived) version of the Kilian-Micali protocol. Concretely, BBHMR design a collision-resistant hash function and a sound PCP such that Kilian-Micali is sound, but its FS transform is not, regardless of which hash function instantiates the random oracle. In a recent breakthrough, Khovratovich, Rothblum, and Soukhanov (KRS, CRYPTO 2025) show that FS-based attacks do not only apply to contrived argument systems, but actually apply to deployed versions of GKR-based succinct arguments.

KRS thus clarifies that the gap between real-life hash functions and the ROM is of larger practical relevance than previously thought. In this work, we improve our understanding of FS counterexamples by simplifying (and modernizing) the BBHMR attack. Concretely, we determine simple (as well as more complex) equations relating the polynomial commitment scheme (PC) and the hash function whose solutions lead to attacks on soundness for GKR-style protocols, including more complex protocols such as PlonK. Notably, we show that such attacks can even apply to secure PCs.

Finally, we show a counterexample against the extended Fiat-Shamir transform (XFS), which was recently proposed by Arnon and Yogev (AY, CRYPTO 2025) to prevent KRS-style attacks. The existence of contrived counterexamples was conjectured by AY in relation to BBHMR and Barak (FOCS 2001). Our XFS counterexample is indeed contrived and uses universal argument systems similar to Barak. In turn, it is simpler than Barak and does not rely on the involved techniques developed by BBHMR.

Toomas Krips (University of Tartu). Multi-Party Distributed Point Functions with Polylogarithmic Key Size from Invariants of Matrices.

Distributed point functions (DPFs), introduced in 2014, are a widely used primitive in secure computation with a wide variety of applications. However, until recently, constructions of DPFs with polylogarithmic keys were known only for the two-party setting; multi-party schemes had key sizes that are exponential in the number of parties or superlogarithmic in the domain size. We generalize the efficient tree-based two-party DPF approach and obtain a polylogarithmic-size DPF for any number of parties. We use a technique in which off-path leaves satisfy an invariant ensuring that the secret-shared vector is mapped into secret submodules by public matrices. Our scheme is a secure DPF under factoring, the Algebraic Group Model, and a new MinRank-type assumption, against which we benchmarked various MinRank attacks. The output of our scheme lies in the exponent in a group where Diffie-Hellman-type problems are hard, which limits the usability of the scheme. Still, it is the first multi-party DPF generalizing the tree-based two-party approach. Our scheme is the first where the key is polylogarithmic in the domain size and independent of the number of parties, and where key generation and evaluation can be computed efficiently, independently of the number of parties.

Erki Külaots (University of Tartu). Special Soundness and Binding Properties: A Framework for Tightly Secure zk-SNARKs.

Interactive arguments often combine polynomial IOPs with polynomial commitment schemes (PCSs). Frequently, the interactive argument is proven to be knowledge sound, but this incurs a high security loss when applying the Fiat-Shamir transformation to obtain a non-interactive argument in the random oracle model (ROM). I will introduce the notion of special soundness for polynomial IOPs, which yields a smaller security loss, and discuss the properties of polynomial commitment schemes that enable the combined interactive argument to be computationally special sound.

Russell W. F. Lai (Aalto University). Hardness of hinted ISIS from space-time hardness of lattice problems.

To assume the hardness of finding short vectors in (Euclidean) lattices is the bread and butter of lattice-based cryptography. These problems are even conjectured (Lombardi and Vaikuntanathan, CRYPTO'20) to be hard for algorithms that run in single-exponential time but polynomial memory. In this work, we explore a perhaps surprising connection between the non-existence of such algorithms and the hardness of solving some hinted lattice problems in polynomial time. Roughly, Hinted Inhomogeneous Short Integer Solutions (H-ISIS) asks, given a random underdetermined system of linear equations (A, b) and, as hints, many short integral vectors in the kernel of A, to find an integral solution to the linear system only marginally longer than the hints. This suggests that the space-time hardness of lattice problems is a promising foundation for some advanced cryptographic primitives that have recently been proposed but are, so far, based on ad-hoc lattice assumptions with hints.
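Informally, and in our own notation (which may differ from the paper's), the hinted problem described above reads:

```latex
\textbf{H-ISIS (informal).}\ \text{Given } A \in \mathbb{Z}_q^{n \times m} \text{ with } m > n,
\text{ a target } b \in \mathbb{Z}_q^{n}, \text{ and hints } k_1, \dots, k_\ell \in \mathbb{Z}^{m}
\text{ with } A k_i \equiv 0 \pmod{q},\ \|k_i\| \le \beta,
\text{ find } s \in \mathbb{Z}^{m} \text{ with } A s \equiv b \pmod{q} \text{ and } \|s\| \le \gamma\beta
\text{ for a small slack } \gamma.
```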

Based on joint work with Martin R. Albrecht and Eamonn W. Postlethwaite.

Helger Lipmaa (University of Tartu). Plonk is Simulation Extractable in ROM Under Falsifiable Assumptions.

Solving a long-standing open problem, Faonio, Fiore, and Russo proved that the widely used Plonk zk-SNARK is simulation extractable. However, their proof assumes both the random oracle model (ROM) and the algebraic group model. We prove that the same holds in the ROM under falsifiable assumptions. We combine the template of Faust et al., who proved that simulation extractability follows from knowledge soundness, (weak) unique response, and trapdoorless zero-knowledge, with the recent result of Lipmaa, Parisella, and Siim (Crypto 2025), who proved that Plonk has knowledge soundness in the ROM under falsifiable assumptions. For this, we prove that Plonk satisfies new variants of the weak unique response and trapdoorless zero-knowledge properties. We prove that several commonly used gadgets, like the linearization trick, are not trapdoorless zero-knowledge when considered as independent commit-and-prove zk-SNARKs.

Urmas Luhaäär (University of Tartu). Cyclo: Lightweight Lattice-based Folding via Partial Range Checks.

Folding schemes are cryptographic protocols that allow one to cheaply reduce proving multiple statements to proving a single one of the same size. They are useful in building IVC, ZKVMs and other applications. A popular approach to constructing folding schemes uses lattice-based cryptography. One of the main challenges in lattice-based folding is managing the norm growth of the witness vector. State-of-the-art lattice-based folding schemes such as Latticefold [Boneh, Chen 2025], Latticefold+ [Boneh, Chen 2025] and Neo [Nguyen, Setty 2025] address this concern by decomposing the witness and using norm checks. Cyclo makes folding cheaper by omitting norm checks and decomposition on the accumulated witness. This introduces linear norm growth in both soundness and correctness. To achieve arbitrary folding depth, one of the known folding schemes can be applied from time to time to reset the norm. This results in an amortized norm-refreshing folding scheme. We also provide a clean algebraic description of the encoding from vectors over $\mathbb{F}_q$ to vectors over $\mathcal{R}_q$ found in Neo [Nguyen, Setty 2025].
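The norm growth from folding without decomposition is a generic consequence of the triangle inequality (a generic observation, not Cyclo's exact parameters): folding accumulated witnesses $z_1, z_2$ with a challenge $r$ only bounds the folded witness by

```latex
\| z_1 + r \cdot z_2 \|_\infty \;\le\; \| z_1 \|_\infty + |r| \cdot \| z_2 \|_\infty,
```

so the bound on the accumulated norm grows with each fold, which is what the occasional norm-resetting fold repairs.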

Mariana Milicich (University of Tartu). A Quantitative Interpretation for Useful Call-by-Value.

Usefulness is an optimisation mechanism and a key ingredient for proving that leftmost-outermost reduction is time-invariant. This context-sensitive mechanism has been extended to other reduction strategies. In particular, useful call-by-value reduction optimises standard call-by-value while preserving its original semantics. However, this preservation result is based on techniques that heavily rely on the operational behaviour of the calculi at play, and lacks the flexibility to be adapted to alternative variants of such calculi. In this talk, we will go through the first inductive definition of useful call-by-value, UCBV. Introducing this calculus is necessary to define the non-idempotent intersection type system U, the first quantitative interpretation for useful call-by-value. U serves as a model of useful call-by-value: it captures UCBV-termination through typability, abstracting away from the intricate operational behaviour of UCBV. It is quantitative because it offers step-count information for UCBV reduction to normal form.

This is joint work with Pablo Barenbaum and Delia Kesner.

Álvaro Montes (Universitat Pompeu Fabra). Improving the Efficiency of Function-Hiding Functional Commitments.

Function-hiding functional commitment schemes allow one party to commit to a private function $f$ and later prove $f (x) = y$ for public $x$ and $y$ without revealing additional information about the function. We construct efficient function-hiding functional commitment schemes for arithmetic circuits of bounded size that achieve proof sizes below 1.6 kB—over an order of magnitude smaller than previous constructions—while simultaneously reducing proving and verification times. We achieve these results by introducing a novel information-theoretic interactive proof system called Polynomial Interactive Oracle Proofs with Randomized Indexer (rPHPs). By compiling rPHPs with commit-and-prove zkSNARKs, we are able to leverage relaxed zero-knowledge notions for our building blocks. This approach eliminates the overhead of strict privacy requirements of prior work, directly translating into improved efficiency in both communication and computation.

Joint work with Vanesa Daza and Carla Ràfols.

Liisi Nõojärv (Tallinn University of Technology). Formal Verification of Proof Search and Countermodel Construction for Intuitionistic Propositional Logic.

I will talk about a fully machine-verified framework for proof search and countermodel construction in intuitionistic propositional logic, based on the IG sequent calculus of Corsi and Tassi. This consists of a terminating proof-search algorithm augmented with a mechanism recording derivation attempts that cannot be completed as refutations ("anti-derivations"). From the latter, Kripke countermodels can be extracted. Both the search algorithm and the countermodel extraction from refutations are implemented in the Lean proof assistant where soundness and completeness are established with respect to the Kripke semantics.

Kristiina Oksner (University of Tartu). On the Minimum Length of Functional Batch Codes with Small Recovery Sets.

Batch codes are of potential use for load balancing and private information retrieval in distributed data storage systems. Recently, a special case of batch codes, termed functional batch codes, was proposed in the literature. In functional batch codes, the users can query linear combinations of the information symbols, and not only the information symbols themselves, as is done in standard batch codes. In this work, we consider functional batch codes with the additional property that every query is reconstructed from a small number of coded symbols. We derive bounds on the minimum length of such codes, and evaluate the results by numerical computations.

Roberto Parisella (University of Tartu). On Knowledge-Soundness of Plonk in ROM from Falsifiable Assumptions.

Lipmaa, Parisella, and Siim [Eurocrypt, 2024] proved the extractability of the KZG polynomial commitment scheme under the falsifiable assumption ARSDH. They also showed that variants of real-world zk-SNARKs like Plonk can be made knowledge-sound in the random oracle model (ROM) under the ARSDH assumption. However, their approach did not consider various batching optimizations, resulting in their variant of Plonk having an approximately 3.5 times longer argument. Our contributions are: (1) We prove that several batch-opening protocols for KZG, used in modern zk-SNARKs, have computational special-soundness under the ARSDH assumption. (2) We prove that interactive Plonk has computational special-soundness under the ARSDH assumption and a new falsifiable assumption SplitRSDH. We also prove that two minor modifications of interactive Plonk have computational special-soundness under only ARSDH and a simpler variant of SplitRSDH. We define a new type-safe oracle framework of the AGMOS (AGM with oblivious sampling) and prove that SplitRSDH is secure in it. The Fiat-Shamir transform can be applied to obtain non-interactive versions, which are secure in the ROM under the same assumptions.

Krišjānis Petručeņa (University of Latvia). Random-Walk Key Relaying for Decentralized QKD Networks.

Quantum key distribution (QKD) networks need relaying beyond direct links, but many approaches rely on centralized control or global topology knowledge. We study a decentralized alternative called random flow, where $M$ key fragments are forwarded via parallel random walks to improve security.
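A minimal simulation of the fragment-relaying idea (illustrative only; the actual protocol and its security analysis are the talk's): the key is split into $M$ XOR-fragments, each forwarded along an independent random walk, and the destination reconstructs by XORing what it receives. An adversary must observe all $M$ walks to learn the key.

```python
import random
from functools import reduce

random.seed(1)

# A 5-cycle network; each node forwards a fragment to a uniformly random
# neighbour until it reaches the destination.
ADJ = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
SRC, DST, M = 0, 3, 8

# Split the key into M XOR-fragments: M-1 random, one fixing the XOR.
key = random.getrandbits(32)
frags = [random.getrandbits(32) for _ in range(M - 1)]
frags.append(reduce(lambda a, b: a ^ b, frags, key))

def walk(start, dest):
    """Forward one fragment along a uniform random walk until arrival."""
    node, hops = start, 0
    while node != dest:
        node = random.choice(ADJ[node])
        hops += 1
    return hops

hops = [walk(SRC, DST) for _ in frags]          # independent parallel walks
recovered = reduce(lambda a, b: a ^ b, frags)   # destination reconstructs
assert recovered == key
```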

Pavla Procházková (Tallinn University of Technology). Representability theory for multiactions.

Monoidal categories and multicategories, which can be viewed as a strict generalisation of the former, are classical structures used to model compositional systems. Representability theory for multicategories, introduced in 2000 by Hermida, allows one to express the theory of monoidal categories in the language of multicategories. We present work towards analogous results for more complex structures, namely actions of monoidal categories and of multicategories. Although this work was initially motivated by the study of skew monoidal categories and skew multicategories, multiactions turn out to provide an expressive framework that captures various structures beyond the skew ones. Joint work with: Nathanael Arkor

Jevgenijs Protopopovs (University of Tartu). RaceHarness: Environment Modelling for Sound Static Race Detection.

I will present ongoing work on using environment models to improve the precision of modular static data race detection for the C programming language. The presentation will discuss the use of model checking to enable more analyzer-friendly encodings of actor-based environment models suitable for use with lock-based data race verifiers. The approach is demonstrated to improve the precision of the Goblint static analyzer on Linux kernel modules.

Alexander Rivosh (University of Latvia). The Dice are Loaded: Large Language Models and Random Numbers.

Large language models (LLMs) are increasingly deployed to generate synthetic data, which often includes numerical values. This work-in-progress report explores specific biases exhibited by many LLMs when producing numerical outputs. We investigate the statistical properties of integer outputs from several LLMs by repeatedly prompting each model to return a random integer within a fixed range, without auxiliary tools (no RAG, no MCP, etc.). The resulting distributions deviate markedly from those of conventional pseudorandom number generators and instead resemble the biased patterns observed in human-generated sequences. Crucially, the observed biases are model-specific: distinct statistical signatures emerge across architectures, quantisation levels, temperature settings, and other inference parameters. These signatures enable the fingerprinting of LLMs accessed through cloud services, providing a lightweight means of verifying whether a given inference endpoint corresponds to a claimed model version. Moreover, systematic analysis of such numerical outputs offers a pathway towards mechanistic interpretability, allowing researchers to infer aspects of a model’s internal processing without direct access to its weights. The effect is observed primarily in smaller and earlier-generation LLMs, and is often most pronounced in models without a “thinking” mode, but is also observed in some of the modern top models. Our work highlights both the limitations of LLMs as standalone random number generators and the utility of their biased output distributions for model identification and behavioural analysis in closed-source, cloud-only environments.
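The deviation from uniformity described above can be quantified with a chi-square statistic. In the sketch below, `llm_samples` is a hypothetical stand-in for model outputs, illustrating a human-like over-representation of 7; real data would come from repeatedly prompting a model.

```python
import collections
import random

# Chi-square statistic of sampled integers in [lo, hi] against the
# uniform distribution.
def chi_square(samples, lo=1, hi=10):
    n, k = len(samples), hi - lo + 1
    expected = n / k
    counts = collections.Counter(samples)
    return sum((counts.get(v, 0) - expected) ** 2 / expected
               for v in range(lo, hi + 1))

random.seed(0)
prng_samples = [random.randint(1, 10) for _ in range(1000)]
# HYPOTHETICAL biased sample: 7 and 3 heavily over-represented.
llm_samples = [7] * 400 + [3] * 200 + [random.randint(1, 10) for _ in range(400)]

# With 9 degrees of freedom, values far above ~27.9 (the 0.999 quantile)
# reject uniformity; the biased sample fails decisively, while the PRNG
# sample's statistic stays much smaller.
biased_stat = chi_square(llm_samples)
prng_stat = chi_square(prng_samples)
assert biased_stat > 27.9 > 0 and prng_stat < biased_stat
```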

Kübra Seyhan (Ondokuz Mayis University). A Fully Quantum Secure Communication: PQC+QKD Hybridization For Authenticated Key Sharing.

In preparation for the quantum computer era, two paradigms have emerged: Post-Quantum Cryptography (PQC) and quantum cryptography. Establishing a secret key between two parties communicating over a public channel is one of the most fundamental goals in cryptography. While key exchange and key encapsulation mechanisms relying on computationally hard problems have been proposed as PQC-based solutions, Quantum Key Distribution (QKD) protocols have been proposed as quantum cryptography-based solutions with information-theoretic security. In this talk, PQC+QKD hybridization, as defined by Loïc Ferreira in 2026, will be discussed to show how these two paradigms can be used to capture distinct security properties. Additionally, we will identify which information security concepts can be captured by this hybridization using hash functions, message authentication codes, and authenticated encryption functions. By adapting a strong security model, the security proofs and limitations in achieving features such as soundness, external authentication, key indistinguishability, and identity privacy will be analyzed. Finally, the details of a possible optimized framework for hybridizing PQC and QKD without compromising security objectives will be discussed.

Janno Siim (University of Tartu). Linearization Trick in SNARKs.

Verifying the correctness of computation can be faster than doing the computation itself. SNARK is a cryptographic tool that allows one to do exactly this.

SNARKs that internally use the KZG polynomial commitment scheme often use a so-called "linearization trick", which further reduces communication cost. Justification for the security of this trick has been mostly informal. In this talk (based on my PKC26 paper), I discuss how to formalize this trick and how to extend it to other homomorphic polynomial commitment schemes.

Juris Smotrovs (University of Latvia). Low degree representing polynomials of high sensitivity Boolean functions.

High sensitivity ensures that a Boolean function has high deterministic query complexity. A low degree of the representing polynomial, on the other hand, is necessary for the function to have low quantum exact query complexity, which motivates the study of such functions. Nisan and Szegedy (1994), Tal (2013), and Wellens (2020) established the square root of sensitivity, up to a constant factor, as a lower bound on the degree. But how tight are these bounds? Using earlier and new approaches for converting the multivariate representing polynomial into a univariate polynomial and analyzing it, we offer numerical results for degrees up to 32 which improve the abovementioned general bounds for these degrees and allow us to offer a finer conjecture on the lower bounds. [Joint work with Jurijs Zaicevs.]
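For tiny functions, both quantities can be computed by brute force; for example, OR on 3 bits has sensitivity 3 (at the all-zero input) and representing-polynomial degree 3. The sketch below is this elementary computation, not the talk's univariate-reduction technique.

```python
from itertools import combinations, product

def sensitivity(f, n):
    # Max over inputs x of the number of single-bit flips changing f(x).
    return max(sum(f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:])
                   for i in range(n))
               for x in product((0, 1), repeat=n))

def degree(f, n):
    # Degree of the unique multilinear real polynomial representing f:
    # largest |S| whose monomial coefficient (Moebius inversion over
    # subsets of S) is nonzero.
    def coeff(S):
        return sum((-1) ** (len(S) - len(T))
                   * f(tuple(1 if i in T else 0 for i in range(n)))
                   for r in range(len(S) + 1)
                   for T in combinations(S, r))
    return max((len(S) for r in range(n + 1)
                for S in combinations(range(n), r) if coeff(S) != 0),
               default=0)

OR3 = lambda x: int(any(x))
assert sensitivity(OR3, 3) == 3 and degree(OR3, 3) == 3
```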

Michel Smykalla (Tallinn University of Technology). Equality of Proofs in Substructural Intuitionistic Modal Logics.

In this talk, we present non-commutative linear intuitionistic versions SIK, SIT, SIK4, SIS4 of the modal logics K, T, K4 and S4 with truth, conjunction and box, in terms of sequent calculi. Each of these sequent calculi admits cut and is sound and complete, in particular with regard to equality of proofs, with respect to the appropriate categorical semantics. The box modality is interpreted as a strong monoidal functor in the case of SIK, and progressively strengthens to a strong monoidal comonad in the case of SIS4. For equational soundness and completeness to hold, the antecedent formulae have to be indexed by natural numbers for box-depths, but for a variant SIS4∗ of SIS4 where box is an idempotent strong monoidal comonad (satisfying more equations than in the basic non-idempotent case), these indices can be restricted to 0 and 1. This corresponds to the unboxed and boxed compartments of antecedents in sequent calculi for modal logics with exchange.

Nikita Snetkov (Cybernetica AS). Coppercloud: Blind Server-Supported RSA Signatures.

Privacy-preserving technologies are a well-established area of research for modern digital systems, driven by regulatory mandates such as the GDPR and the growing demand for user privacy. One example among the available technologies is blind signatures: a primitive that supports the creation of privacy-sensitive applications, from secure e-voting and anonymous digital cash to privacy-preserving credentials and identity management systems. In this talk, we present Coppercloud, a blind server-supported RSA signature scheme designed to enhance privacy in digital identity systems. We discuss the security requirements for blind server-supported signing by defining a corresponding ideal functionality, and argue that Coppercloud securely realizes this functionality in the Universal Composability (UC) model.

Igors Stepanovs (Amazon). You Read It Out of Context: Causality in Secure Messaging.

When Alice replies "yes" in a group chat, every member should know what she said "yes" to. This property is called causality preservation: messages are always seen in their causal context. Most end-to-end encrypted messengers do not guarantee it. A malicious server (or even a benign network fault) can selectively delay or reorder messages, causing different group members to observe inconsistent views of the conversation. This can lead to confusion without violating any cryptographic guarantee.
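The textbook distributed-systems remedy for inconsistent views is causal delivery via vector clocks. The sketch below (a standard technique, not the construction discussed in the talk, and with no cryptography) buffers a message until everything its sender had already seen has been delivered locally:

```python
# Minimal causal-delivery sketch using vector clocks.
class Member:
    def __init__(self, me, group_size):
        self.me = me
        self.clock = [0] * group_size  # messages delivered from each member
        self.pending = []              # messages buffered until deliverable

    def send(self, text):
        self.clock[self.me] += 1
        return (self.me, list(self.clock), text)

    def _deliverable(self, sender, stamp):
        # deliverable iff this is the sender's next message and we have
        # already delivered everything the sender had seen when sending
        return (stamp[sender] == self.clock[sender] + 1 and
                all(stamp[i] <= self.clock[i]
                    for i in range(len(stamp)) if i != sender))

    def receive(self, msg):
        self.pending.append(msg)
        delivered, progress = [], True
        while progress:
            progress = False
            for m in list(self.pending):
                sender, stamp, text = m
                if self._deliverable(sender, stamp):
                    self.clock[sender] = stamp[sender]
                    delivered.append(text)
                    self.pending.remove(m)
                    progress = True
        return delivered
```

If Carol receives Alice's "yes" before the question it answers, the reply stays buffered until the question arrives, so Carol never sees it out of context. The talk's point is that achieving (and authenticating) such guarantees in end-to-end encrypted group messaging is much harder than this plaintext sketch suggests.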

In this talk, I will introduce the problem of causality in secure messaging. I will explain what causality means in this context and why even two-party chats can fail to preserve it (building on prior work). In the group setting, the problem becomes considerably harder. I will sketch how causality-aware features could work in practice in an app like Signal, and the protocol machinery required to support them. I will discuss trade-offs between the strength of causality guarantees and the communication overhead they require.

This is ongoing joint work with Akshaya Kumar and Joseph Jaeger.

Tanel Tammet (Tallinn University of Technology). From Natural Language to Explainable Question Answering via a Neurosymbolic First-Order Logic Pipeline.

We present a neurosymbolic pipeline that converts natural language into extended first-order logic (FOL) using large language models. The system processes English statements, rules, and questions through several intermediate representations to produce structured logical forms. These are then evaluated by the commonsense-extended FOL prover GK to perform reasoning and answer queries. The resulting proofs are translated back into human-readable explanations, enabling explainable commonsense question answering. We report on the design, experiments, and lessons learned from developing this approach.

Robert Tarjan (Princeton University). Fast and Simple Sorting Using Partial Information.

We consider the problem of sorting a set of items having an unknown total order by doing binary comparisons of the items, given the outcomes of some pre-existing comparisons. We present a simple algorithm with a running time of O(m + n + log T), where n, m, and T are the number of items, the number of pre-existing comparisons, and the number of total orders consistent with the outcomes of the pre-existing comparisons, respectively. The algorithm does O(log T) comparisons.

Both our time and comparison bounds are best possible up to constant factors, thus resolving a problem which has been open since 1978 (Fredman, Theoretical Computer Science). The best previous algorithm with a bound of O(log T) on the number of comparisons has a non-linear time bound and is much more complicated. Our algorithm combines two classic algorithms: topological sort and heapsort, with the right kind of heap. This is joint work with Bernhard Haeupler, Richard Hladík, John Iacono, Václav Rozhoň, and Jakub Tětek.
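The flavour of that combination can be conveyed by the following sketch: run a topological sort over the graph of pre-existing comparisons, and keep the currently available items in a binary heap ordered by fresh comparisons. (This is only an illustration of the idea; the paper's algorithm needs a more carefully chosen heap to achieve the stated bounds.)

```python
import heapq

def sort_with_partial_info(items, known_pairs, less):
    """Sort `items` given known_pairs of (a, b) with a < b already known,
    using the comparator `less` only for fresh comparisons."""
    succ = {x: [] for x in items}
    indeg = {x: 0 for x in items}
    for a, b in known_pairs:
        succ[a].append(b)
        indeg[b] += 1

    class Key:  # wrap items so heapq orders them via `less`
        __slots__ = ("x",)
        def __init__(self, x): self.x = x
        def __lt__(self, other): return less(self.x, other.x)

    # start with the sources of the known-comparison DAG
    heap = [Key(x) for x in items if indeg[x] == 0]
    heapq.heapify(heap)
    out = []
    while heap:
        k = heapq.heappop(heap)  # smallest currently available item
        out.append(k.x)
        for y in succ[k.x]:
            indeg[y] -= 1
            if indeg[y] == 0:
                heapq.heappush(heap, Key(y))
    return out
```

The output is correctly sorted because the global minimum of the remaining items can have no remaining known predecessors, so it is always in the heap; the pre-existing comparisons prune which items the heap ever has to compare.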

Dominique Unruh (RWTH Aachen, University of Tartu). Poor man’s dependent types for Isabelle/HOL.

The type system of Isabelle/HOL does not support dependent types or arbitrary quantification over types. We introduce a system to mimic dependent types and existential quantification over types in limited circumstances at the top level of theorems.

Jevgēnijs Vihrovs (University of Latvia). Quantum Time-Space Tradeoffs for Exponential Dynamic Programming.

We investigate the quantum algorithms for dynamic programming by Ambainis et al. (SODA'19). While these algorithms give provable speedups and apply to a variety of NP-hard problems, they have a notable drawback: they require a large amount of Quantum Random Access Memory (QRAM), which could be very challenging to implement in a physical quantum computer. In this work, we study how the space complexity can be improved by trading it for time, while still retaining a speedup over the classical algorithms. We show novel quantum time-space tradeoffs, which we obtain by adjusting the parameters of these algorithms and combining them with "quantized" classical strategies.
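For context on where the QRAM requirement comes from, the classical exponential-space dynamic programs in question look like the textbook Held-Karp algorithm for TSP (one representative problem; the quantum algorithms query a table of this 2^n-sized kind from QRAM):

```python
from itertools import combinations

def held_karp(dist):
    """Classical Held-Karp DP for TSP: O(2^n * n^2) time, O(2^n * n) space.
    dist[i][j] is the cost of travelling from city i to city j."""
    n = len(dist)
    # dp[(S, j)] = cheapest path from city 0 visiting exactly the set S
    # (as a bitmask over cities 1..n-1) and ending at city j in S
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            mask = 0
            for j in S:
                mask |= 1 << j
            for j in S:
                prev = mask ^ (1 << j)
                dp[(mask, j)] = min(dp[(prev, k)] + dist[k][j]
                                    for k in S if k != j)
    full = (1 << n) - 2  # every city except 0
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))
```

The table `dp` has size exponential in n, which is exactly the memory that a time-space tradeoff tries to shrink at the cost of recomputing parts of it.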

Joint work with Susanna Caroppo, Dārta Zajakina, Aleksejs Zajakins.

Niels Voorneveld (Cybernetica AS). Towards an equational theory of information leakage and corruption.

In cryptographic protocol specification, one may mark certain resources as leaked or corrupted. Can a resource be seen by unauthorized parties? Can it be influenced by outsiders? Such operations have equational properties: for example, leaking the same resource twice is the same as leaking it once, and leaking a key and an encrypted message could leak the original message. We can think of leakage and corruption as a kind of computational side effect, which enables us to link privacy and resilience properties to the theory of program equivalence. We present ideas on how this enables us to develop a categorical relational theory to study the severity of leaks and corruption. This is based on ongoing work with Alessandro Di Giorgio and Pawel Sobocinski.
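As a toy illustration of the two equations mentioned above (a minimal sketch, not the categorical theory under development), one can model the leaked resources as a set closed under a decryption rule, with idempotence of leaking holding by construction:

```python
# Resources are tagged tuples: ("key", k), ("enc", k, m), ("msg", m).
def leak(state, resource):
    """Leak a resource. Idempotent: leaking twice equals leaking once,
    because set union with an existing element is a no-op."""
    return close(state | {resource})

def close(state):
    """If a key and a ciphertext under that key are both leaked,
    the plaintext is leaked as well."""
    changed = True
    while changed:
        changed = False
        for r in list(state):
            if r[0] == "enc" and ("key", r[1]) in state:
                m = ("msg", r[2])
                if m not in state:
                    state = state | {m}
                    changed = True
    return state
```

Here leaking `("key", "k")` and then `("enc", "k", "hello")` also puts `("msg", "hello")` into the leaked set, matching the key-plus-ciphertext equation; the talk's programme is to organise such laws into a relational theory rather than an ad hoc closure.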

Abuzer Yakaryilmaz (University of Latvia). Teaching Quantum Globally: Experiences and Perspectives.

Between 2021 and 2024, we co-developed and delivered a series of fully online quantum computing courses through a collaboration between QWorld and the University of Latvia. These courses were designed as open and globally accessible learning environments, combining structured lectures with guided self-study and extended learning phases. The aim was to provide accessible yet academically grounded quantum computing education to participants from diverse educational and geographical backgrounds. In this talk, we present these experiences together with statistical and qualitative analyses, and discuss implications for designing scalable and inclusive quantum computing education in global online settings.