[dlc-dev] New oracle signing mode optimized for multi-oracle DLCs
Nadav Kohen
nadavk25 at gmail.com
Thu Feb 11 10:31:13 CET 2021
Hi all,
Me from the future here: wow, this email is long and mostly not worth
reading. I have one important question at the very bottom of the email
(where the *bold* text starts) and everything else is just context and
motivation for a purely cryptographic question. Feel free to skip ahead to
that :)
----------------------------
In the current multi-oracle CET model, one computes the single-oracle CET
set and then, for each CET in that set corresponding to an outcome interval
[a, b], one computes a (minimal) set of secondary CETs covering some
interval [a-x, b+y], where both x and y are between the minSupport and
maxError parameters.
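To make the parameters concrete, here is a tiny Python sketch (the naming
and conventions are mine, not from any spec) of the bounds any secondary
cover has to satisfy:

def secondary_cover_bounds(a, b, min_support, max_error):
    # For a primary CET covering [a, b], the secondary CET(s) must cover at
    # least [a - min_support, b + min_support] and must not extend beyond
    # [a - max_error, b + max_error].
    must_cover = (a - min_support, b + min_support)
    may_cover = (a - max_error, b + max_error)
    return must_cover, may_cover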
---------------- Skip this section if you want to get to the new stuff ----------------
I have recently been trying to see if I can reliably make these secondary
sets have only a single CET, meaning that numeric outcomes with bounded
differences allowed can scale (in CET terms) just as well as enumerated
outcomes.
The original thinking was to split into cases: sometimes we can just
"ignore some digits" to get a single secondary CET that satisfies our
needs, but if the primary CET is too close to a multiple of maxError
(where close means within minSupport), then a second secondary CET is
required to cover the other side of that nearby multiple of maxError.
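For what it's worth, the "too close to a multiple of maxError" case can be
checked roughly like this (Python, assuming maxError is a power of the base
so that ignoring digits means rounding down to an aligned block of width
maxError):

def single_ignore_digits_cover(a, b, min_support, max_error):
    # Can one aligned block of width max_error cover [a - min_support, b + min_support]?
    # If not, the required cover straddles a multiple of max_error, and a
    # second secondary CET is needed for the other side.
    lo, hi = a - min_support, b + min_support
    block_start = (lo // max_error) * max_error
    return hi <= block_start + max_error - 1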
The newer and better thinking is to have oracles sign the same event under
multiple bases and then to take intersections of intervals resulting from
prefixes in two (possibly different) bases in order to cover [a-x, b+y].
For many parameters (minSupport, maxError, payoutCurve) this works
reliably (meaning a "match", i.e. an intersection satisfying the
constraints, is found, resulting in a single secondary CET), but for
certain parameters many cases cannot be covered this way and require
multiple secondary CETs (leading to further exponential growth of CETs in
the number of oracles).
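For reference, this is roughly what I mean by prefix intervals and their
intersections (Python sketch, helper names are mine):

def prefix_interval(prefix, base, num_digits):
    # Integer interval [lo, hi] covered by attesting only to the digits in
    # `prefix` (most significant first) out of num_digits base-`base` digits.
    k = num_digits - len(prefix)
    lo = 0
    for d in prefix:
        lo = lo * base + d
    lo *= base ** k
    return lo, lo + base ** k - 1

def intersect(iv1, iv2):
    lo, hi = max(iv1[0], iv2[0]), min(iv1[1], iv2[1])
    return (lo, hi) if lo <= hi else None

# e.g. prefix [3] in base 10 (2 digits) covers [30, 39], prefix [0, 1] in
# base 2 (7 digits) covers [32, 63], and their intersection is [32, 39].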
Another idea I thought might help is to have oracles sign "shifted
versions" of outcomes as well (e.g. also sign x + 1000), but preliminary
experiments don't show much promise for this approach.
---------------- New stuff starts here ----------------
My newest idea is that perhaps oracles could partition their outcome domain
into nearly equal parts (e.g. [0, 9], [10, 19], [20, 29], ..., [90, 99] for
the domain [0, 99]) and then attest to an event being less than some value
(or greater than, but everything is symmetric, so I will focus on less
than).
If oracles were to follow such a protocol for both greater than and less
than, then custom intersections could be taken so that covering CETs can be
generated between any two partition points. If maxDiff(p_i, p_(i+1)) <
maxError - minSupport, then we can guarantee singleton secondary CET sets,
meaning any scaling solution (exponential growth -> linear) applicable to
enumerated-outcome multi-oracle will be applicable everywhere.
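Here is a sketch of the guarantee I am after (my own naming, using
half-open covering intervals [p_i, p_j) and assuming the partition spans
the whole domain):

def singleton_cover(partition, a, b, min_support, max_error):
    # Look for partition points p_i <= a - min_support and p_j > b + min_support
    # such that [p_i, p_j) also stays inside [a - max_error, b + max_error].
    # If every gap between adjacent partition points is < max_error - min_support,
    # such a pair always exists, so a single secondary CET suffices.
    lows = [p for p in partition if a - max_error <= p <= a - min_support]
    highs = [p for p in partition if b + min_support < p <= b + max_error + 1]
    if lows and highs:
        return max(lows), min(highs)
    return None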
Here is my high-level idea for how to accomplish this. The oracle will
generate a single random private key for this example event, x_0, which
corresponds to "outcome < 10." Then a deterministic one-way function is
used to compute x_1 = next(x_0), which will correspond to "outcome < 20,"
and so on with x_(i+1) = next(x_i), where x_i means "outcome < (i+1)*10."
Now say that the outcome is 35: the oracle will release x_3, from which any
user can compute x_i for any i >= 3. Thus they can construct a single CET
for the event "i*10 < outcome < j*10" for any i and j (together with the
symmetric greater-than chain for the lower bound).
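To illustrate just the scalar side (ignoring the point-derivation problem
below), here is a toy version where a hash chain stands in for *next*; this
is only meant to show the release/derive flow, not a proposal for the
actual functions:

import hashlib

def next_key(x: bytes) -> bytes:
    # Stand-in one-way step for next(); x_i corresponds to "outcome < (i+1)*10".
    return hashlib.sha256(x).digest()

def chain(x0: bytes, length: int) -> list:
    keys = [x0]
    for _ in range(length - 1):
        keys.append(next_key(keys[-1]))
    return keys

# Oracle side: the outcome 35 lies in [30, 39], so "outcome < 40" is the
# tightest true statement and the oracle releases x_3.  Anyone holding x_3
# can hash forward to get x_4, x_5, ..., but cannot recover x_0, x_1, x_2.
keys = chain(b"\x01" * 32, 10)
released = keys[3]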
The key complication is that we need a one-way point derivation function,
P_i = derive_point(x_i), such that there exists a deterministic (not
necessarily one-way) function next_point with next_point(P_i) = P_(i+1).
*----------------- MY QUESTION IS DOWN HERE -----------------------------------*
If such a triple,
(*next*: Scalar => Scalar, *derive_point*: Scalar => Point, *next_point*:
Point => Point),
of deterministic functions were to exist satisfying the properties:
1. *next* is one-way
2. *derive_point* is one-way
3. *next*(x) = y => *next_point*(*derive_point*(x)) = *derive_point*(y)
Then there is a simple algorithm to convert my single-base, single-oracle
CET compression solution into an algorithm that supports (still
single-base) multiple oracles with bounded differences allowed.
Essentially we need a kind of homomorphic encryption where the encryption
function (*derive_point*) converts applications of *next* on its domain
into applications of *next_point* on its range, and where we are free to
choose any one-way function *next* and any function *next_point* that make
this work.
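To make the shape of the requirement concrete, here is a toy triple that
satisfies property 3 but clearly fails property 1 (its *next* is just
addition of a public constant, so it is not one-way). It uses
exponentiation mod a prime as a stand-in for EC point multiplication, and I
renamed *next* to next_scalar to avoid Python's builtin:

# WARNING: toy sketch only.  next_scalar(x) = x + C is NOT one-way, so this
# satisfies property 3 (the homomorphism) while failing property 1; it is
# here only to make the constraint concrete, not to answer the question.

P = 2**127 - 1            # toy prime modulus (stand-in for the curve group)
G = 3                     # toy generator (stand-in for the base point)
C = 0xDEADBEEF            # public step constant (hypothetical)

def derive_point(x: int) -> int:
    return pow(G, x, P)                  # "x * G" in EC terms

def next_scalar(x: int) -> int:
    return (x + C) % (P - 1)             # deterministic, but NOT one-way

def next_point(point: int) -> int:
    return (point * pow(G, C, P)) % P    # "point + C * G" in EC terms

x = 123456789
assert next_point(derive_point(x)) == derive_point(next_scalar(x))  # property 3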
I don't (off the top of my head) know whether such a suite of functions,
hopefully simple ones, exists, and I am writing to the mailing list hoping
that someone else might have an idea for how this might be accomplished.
Anyone have any ideas?
Best,
Nadav