

Knowledge and complexity

We have introduced the concept of algorithmic knowledge in order to represent not only what agents know or can know, but also how long they need to come to know what they can know. Our analyses up to now can only answer the first question. By means of the systems presented so far one can infer formulae of the form $K_i^{\exists}\alpha$, but no definite statements of the form $K_i^n\alpha$ or $\lnot K_i^n \alpha$. However, to decide whether an agent can solve a problem under certain constraints, it is necessary to compute the exact amount of time he needs to solve that problem.

To answer the ``how long'' question, a complexity analysis is needed. The underlying idea is simple. The complexities of many classes of reasoning problems are well known and can be computed at low cost. (Complexities are typically, but not necessarily, measured as functions of the input size.) Since the average computation speed of an agent can be assumed to be constant, the amount of time he needs to solve some problem can be computed on the basis of the complexity function for the problem's class.

Suppose that any formula $\alpha$ in a class $\mathcal{C}$ can be computed at the cost $f(\Vert\alpha\Vert)$, where $\Vert\alpha\Vert$ is the length of $\alpha$. Let $c_i$ be a constant that measures the computation speed of agent $i$. If $\alpha \in \mathcal{C}$ and agent $i$ is able to infer $\alpha$, then it can be inferred that $i$ is able to compute $\alpha$ within $c_i * f(\Vert\alpha\Vert)$ time units. That is, $K_i^{c_i * f(\Vert\alpha\Vert)}\alpha$ can be assumed to be true. So, with the aid of complexity theory we can obtain epistemic principles for specific problem classes.
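
To make the calculation concrete, here is a purely illustrative instance (the class, the complexity function and the numbers are assumptions, not taken from the analysis above): let $\mathcal{C}$ be a class whose formulae can be computed at cost $f(n)=2^{n}$, let $\Vert\alpha\Vert = 4$, and let $c_i = 3$. If agent $i$ is able to infer $\alpha$ at all, then he can compute it within $c_i * f(\Vert\alpha\Vert) = 3 * 2^{4} = 48$ time units, so $K_i^{48}\alpha$ may be assumed.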

We shall not make any assumption about the nature of the complexity measures and develop our logics independently of the complexity theory in use. We calculate the cost of computing a formula of the language $\mathcal{L}_{N}^{AK}$ by way of cost functions, which are agent-dependent, partial functions from $\mathcal{L}_{N}^{AK}$ to the set of natural numbers. That is, a cost function $f_i$ is defined for each agent $i$. Intuitively, $f_i(\alpha)$ is the number of time units needed by $i$ to decide $\alpha$ on the basis of all that he knows. Such a cost function is defined on the basis of known complexity functions for specific problem classes which can be expressed in the language. Clearly, $f_i$ is defined only for certain sets of formulae, namely for those formulae whose complexities are known. The agents' computation speed and the details of how the complexity of a certain formula is measured are encapsulated in the specification of the cost function.
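
The following is a minimal sketch, in Python, of how such an agent-dependent, partial cost function might be realised. The formula representation, the classifier, the complexity table and the speed constants are illustrative assumptions only and are not part of the logic itself.

\begin{verbatim}
from typing import Callable, Dict, Optional

# Hypothetical classifier: returns the name of a problem class whose
# complexity is known, or None when the formula falls outside such a class.
def classify(formula: str) -> Optional[str]:
    return "objective" if "K" not in formula else None

# Known complexity functions per class, as functions of the formula length.
COMPLEXITY: Dict[str, Callable[[int], int]] = {
    "objective": lambda n: 2 ** n,   # purely propositional formulae
}

def make_cost_function(speed: int) -> Callable[[str], Optional[int]]:
    """Build the partial cost function f_i of an agent with the given
    computation speed; it returns None where f_i is undefined."""
    def f_i(formula: str) -> Optional[int]:
        cls = classify(formula)
        if cls is None:
            return None                          # f_i undefined here
        return speed * COMPLEXITY[cls](len(formula))
    return f_i

f_1 = make_cost_function(speed=3)   # cost function of agent 1
print(f_1("p or not p"))            # defined: 3 * 2**11
print(f_1("K1 p"))                  # None: complexity unknown in this sketch
\end{verbatim}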

If a cost function $f_i$ is defined for a formula, then certain epistemic statements concerning that formula can be made. If the formula $\alpha$ can be inferred reliably by the agent $i$, then the amount of time needed to infer it is $f_i(\alpha)$, so $K_i^{f_i(\alpha)}\alpha$ can be assumed to be true. If $\alpha$ cannot be inferred reliably by $i$, then the introspective knowledge of that fact can be established after $f_i(\alpha)+1$ time units, because his computation of $\alpha$ will return a positive or negative answer after at most $f_i(\alpha)$ time units. Therefore, $K_i^{\exists}\alpha \to K_i^{f_i(\alpha)}\alpha$ and $\lnot K_i^{\exists}\alpha \to K_i^{f_i(\alpha)+1} \lnot K_i^{\exists} \alpha$ are valid. These axioms will be added to the systems of algorithmic knowledge examined earlier to obtain more powerful logics for resource-bounded reasoning.


\begin{definition}
For each $i\in Agent$ let $f_i: \mathcal{L}_{N}^{AK} \to \mathbb{N}$ be a partial function, called the cost function of agent $i$. The following axiom schemata are added to the systems of algorithmic knowledge:
\begin{description}
\item[(AC1)] $K_i^{\exists}\alpha \to K_i^{f_i(\alpha)}\alpha$
\item[(AC2)] $\lnot K_i^{\exists}\alpha \to K_i^{f_i(\alpha)+1}\lnot K_i^{\exists}\alpha$
\end{description}
provided that $f_i(\alpha)$ is defined.
\end{definition}

The complexity analysis makes it possible to prove many unconditional, definite epistemic sentences of the form $K_i^n\alpha$. Let $\alpha$ be a propositional tautology and let $f_i(\alpha)$ be defined. Applying the rule (NEC$^A$) yields the theorem $K_i^{\exists}\alpha$. Hence, $K_i^{f_i(\alpha)}\alpha$ can be derived, by (AC1).
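
As a sketch, the derivation just described can be written out step by step (the rule and axiom names are those used above; the layout is only illustrative):

\begin{enumerate}
\item $\alpha$ \hfill (propositional tautology)
\item $K_i^{\exists}\alpha$ \hfill (NEC$^A$, from 1)
\item $K_i^{\exists}\alpha \to K_i^{f_i(\alpha)}\alpha$ \hfill (AC1, $f_i(\alpha)$ defined)
\item $K_i^{f_i(\alpha)}\alpha$ \hfill (modus ponens, 2, 3)
\end{enumerate}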

The main task in specifying a system of algorithmic knowledge with complexity analysis is to define the cost functions for the language $\mathcal{L}_{N}^{AK}$. Let us consider how such cost functions can be constructed. We have argued in section 5.2.1 that if $\alpha$ is provable then $K_i^{\exists}\alpha$ can be inferred. We have posed the question whether a natural number $n$ can be determined such that the stronger sentence $K_i^n\alpha$ can be assumed as a postulate. It is not yet known whether the provability problem for K$_N^A$ is decidable, so we restrict our attention to certain subclasses. Let $\alpha$ be an objective formula. Then, as we know from complexity theory, it can be decided in time proportional to $2^{\Vert\alpha\Vert}$. If an agent is able to recognize objective formulae and to select a special procedure to compute them, he can derive reliably each objective tautology $\alpha$ in a time proportional to $2^{\Vert\alpha\Vert}$.
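
A minimal sketch of such a special procedure, again in Python and under illustrative assumptions (objective formulae are represented as Python boolean expressions over variables $p, q, r, \dots$; this is not the syntax of $\mathcal{L}_{N}^{AK}$):

\begin{verbatim}
from itertools import product
import re

def is_objective_tautology(formula: str) -> bool:
    """Decide a purely propositional (objective) formula by exhaustive
    truth-table evaluation; the loop runs through at most 2**n assignments
    for n variables, in line with the exponential bound discussed above."""
    variables = sorted(set(re.findall(r"\b[p-z]\d*\b", formula)))
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if not eval(formula, {"__builtins__": {}}, assignment):
            return False     # a falsifying assignment was found
    return True

print(is_objective_tautology("p or not p"))   # True
print(is_objective_tautology("p and q"))      # False
\end{verbatim}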

Interestingly, the previous analysis can be used by an agent within the system in order to reason about himself or about other agents, provided that he has a built-in mechanism to calculate the complexity of reasoning problems. (Such a mechanism can be adopted easily by an agent.) Assume that an agent $k$ tries to find out how long agent $i$ will need to infer a formula $\alpha$ if he chooses to. It is plausible to assume that $k$ can recognize relatively easily that $\alpha$ belongs to the class of objective formulae, so he can reason about agent $i$ exactly as we did above to find out that the time agent $i$ needs is proportional to $2^{\Vert\alpha\Vert}$. Generally, $k$ does not know $i$'s computation speed. However, he may obtain that information from external sources or through his own observations. If $k$ knows the computation speed of $i$, he will be able to compute the amount of time needed by $i$ to infer $\alpha$. In other words, he can calculate $f_i(\alpha)$ for any objective formula $\alpha$. But to estimate the time $i$ would need to derive $\alpha$, agent $k$ does not have to derive it himself. He only has to recognize the class that $\alpha$ belongs to and then to calculate the complexity of $\alpha$, which can be accomplished in linear time. So $K_k^{c_k * \Vert\alpha\Vert}(K_i^{\exists} \alpha \to K_i^{f_i(\alpha)}\alpha)$ is a plausible postulate, where $c_k$ is the computation speed of $k$.
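
A purely illustrative instance with assumed numbers: suppose $\Vert\alpha\Vert = 5$, that $k$ has learned that $i$'s speed constant is $c_i = 2$, and that $c_k = 4$. Then $k$ computes $f_i(\alpha) = c_i * 2^{\Vert\alpha\Vert} = 2 * 32 = 64$ in roughly $c_k * \Vert\alpha\Vert = 20$ time units, so $K_k^{20}(K_i^{\exists}\alpha \to K_i^{64}\alpha)$ may be postulated.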

Hence, the definition of the cost function for $i\in Agent$ may include the following clauses: for all objective formulae $\alpha$,
\begin{itemize}
\item $f_i(\alpha) = c_i * 2^{\Vert\alpha\Vert}$, and
\item $f_i(K_j^{\exists}\alpha \to K_j^{f_j(\alpha)}\alpha) = c_i * \Vert\alpha\Vert$ for each agent $j$ whose computation speed is known to $i$,
\end{itemize}
where $c_i$ is the computation speed of $i$.

The complexity of the decision procedures for normal modal logic has been investigated extensively ([HM92]). It has been shown (cf. theorem 28) that modal epistemic logic can be embedded faithfully into our systems of algorithmic knowledge, i.e., there is a fragment of the language $\mathcal{L}_{N}^{AK}$ which can be translated one-to-one into modal logic. Consequently, the complexity results obtained for normal modal logic can be applied to determine the cost of solving problems which can be formalized in that fragment. In that way the cost function $f_i$ can be specified for that fragment.

