Kolmogorov Complexity and Hausdorff Dimension

Ludwig Staiger
Introduction

The concept of Kolmogorov or program size complexity of strings was introduced by R.J. Solomonoff, A.N. Kolmogorov and G.J. Chaitin in the sixties. For infinite strings (sequences) this concept is strongly related to P. Martin-Löf's definition of random sequences. Roughly speaking, a sequence is random if almost all of its initial words have a complexity which is close to their length. In other words, random sequences have a relative complexity of 1. Thus one may find it natural to investigate the degree of randomness of a sequence with the help of its relative complexity. For instance, if $x_1 x_2 \ldots x_i \ldots$ is a random sequence, the sequences $x_1 y\, x_2 y \ldots x_i y \ldots$ or $x_1 x_1 x_2 x_2 \ldots x_i x_i \ldots$ may be considered as random of degree $1/2$. Having in mind this form of degree of randomness, i.e. measuring randomness as the amount of information which must be provided on the average in order to specify a particular symbol of that sequence, one is naturally led to the question to which extent this idea is consistent with other measure- or information-theoretic concepts. We consider the following two concepts: the first one is called the entropy and is also known as upper Minkowski dimension, upper metric dimension, topological entropy or, under a convergence condition, as Shannon's channel capacity; the second one is the Hausdorff dimension. Either of these, in some sense, measures the size of sets of infinite strings, and it is to be expected that sets of large size also contain complex sequences, whereas sets of small size contain only sequences of limited complexity. For our proposed size measures, however, every maximally complex string $\xi$ is contained in a set $\{\xi\}$ of smallest possible size. It is therefore not possible to obtain upper bounds on the complexity of sequences in sets of small size without any computability constraints on these sets. In order to describe computability constraints it is useful to take advantage of the theory of ω-languages.
[In: "Fundamentals of Computation Theory" (J. Csirik, J. Demetrovics and F. Gécseg, Eds.), Lecture Notes in Comput. Sci. No. 380, Springer-Verlag, Berlin 1989, pp. 334 - 343.]
This theory investigates classes of sets of infinite strings definable by classes of machines or generated via certain operations from classes of languages (sets of finite strings).
Kolmogorov complexity

We start with a brief account of the necessary prerequisites in Kolmogorov or program size complexity; for more detailed information see the book [Sc1] or the survey papers [ZL] and [LV]. The thesis [vL] gives a nice recent survey of the work on random sequences. We conclude this section with a short presentation of our results.

Program size complexity defines the complexity of a finite string to be the length of a shortest program which prints the string. Accordingly, the complexity of an infinite string $\xi$ is a function $K(\xi/\cdot) : \mathbb{N} \to \mathbb{N}$ where $K(\xi/n)$ is the complexity of the initial part of length $n$ of the string $\xi$. In this paper we are mainly interested in the first order approximation (i.e. the linear growth) of $K(\xi/\cdot)$. We consider the functions

(0.1)    $\kappa(\xi) := \limsup_{n\to\infty} \frac{K(\xi/n)}{n}\,, \qquad \underline{\kappa}(\xi) := \liminf_{n\to\infty} \frac{K(\xi/n)}{n}$

and we compare the functions

(0.2)    $\kappa(F) := \sup\{\kappa(\xi) : \xi \in F\}$   and   $\underline{\kappa}(F) := \sup\{\underline{\kappa}(\xi) : \xi \in F\}$
defined for ω-languages to the corresponding entropy $H_F$ or Hausdorff dimension $\dim F$. Since we are mainly interested in the above mentioned first order approximations, the relations between Kolmogorov, Chaitin and other concepts of program complexity established in [KS], [LC] and [Sc2] prove that the functions $\kappa$ and $\underline{\kappa}$ do not depend on the particular kind of complexity we use. We therefore (also in view of Theorem 2.5 and Proposition 2.10 below) agree on the following concept of conditional complexity:

(0.3)    $K_A(w \mid n) := \inf\{|\pi| : A(\pi, n) = w \wedge |w| = n\}$
Here $\pi$ denotes a program for the algorithm $A$ which, under the additional input $n$, outputs the string $w$ of length $|w| = n$. As usual we consider a complexity function $K := K_U$, where $U$ is an optimal algorithm, i.e. for every algorithm $A$ there is a constant $c_A$ such that

(0.4)    $K(w \mid n) \le K_A(w \mid n) + c_A$   for all $w$ and $n$.
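Since $K$ is not computable, (0.1) cannot be evaluated directly; the intended reading of $\kappa$ and $\underline{\kappa}$ as asymptotic compression rates can nevertheless be illustrated with an ordinary compressor as a rough, computable stand-in for an optimal algorithm $U$. The following sketch is an illustration only and not part of the paper; it uses zlib on prefixes of the interleaved "degree 1/2" sequence from the introduction.

```python
import os
import zlib

# Illustration only: K is not computable, so we use the zlib-compressed size
# of a prefix (in bits) as a rough, computable stand-in for K(xi/n).
# Here r = 2, so one alphabet symbol corresponds to one bit.
def approx_K(prefix: bytes) -> int:
    return 8 * len(zlib.compress(prefix, 9))

# xi = x_1 y x_2 y ... with random bits x_i and the fixed symbol y = '1',
# the "degree 1/2" example from the introduction.
x = bytes((b & 1) + ord('0') for b in os.urandom(50000))
xi = bytes(c for bit in x for c in (bit, ord('1')))

for n in (2000, 20000, 100000):
    print(n, round(approx_K(xi[:n]) / n, 3))
# The printed ratios imitate K(xi/n)/n and drop toward 1/2 as n grows
# (only roughly, because of compressor overhead).
```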
After some prerequisites on ω-languages in the next section (more detailed information can be obtained from Chapter XIV of the book [Ei] and the recent survey
papers [HR], [S4] or [Th]; [S3] contains a comprehensive study of ω-languages definable by Turing machines), we begin our study with the derivation of properties of the entropy of ω-languages and of upper bounds on the set function $\kappa$ via entropy. The above mentioned computability constraints are specified in terms of recursive (definable by Turing machines) ω-languages, and it is shown that for ω-languages definable by finite automata (regular or, more generally, finite-state) the maximum growth of $K(\xi/\cdot)$ can be determined more exactly. Then lower bounds for general ω-languages are considered. It turns out that the Hausdorff dimension $\dim F$ is a general lower bound on $\underline{\kappa}(F)$, and we exhibit examples showing that there are complexity gaps between $\underline{\kappa}(F)$ and $\kappa(F)$ even for simple recursive ω-languages. To show that those gaps do not exist for regular ω-languages is the aim of the following part. More precisely, it is shown that the maximum growth of $K(\xi/\cdot)$ in a regular ω-language closely resembles the growth of $K(\xi/\cdot)$ for random sequences, including P. Martin-Löf's result on complexity dips. The last part of this paper is devoted to the class of ω-power languages, a class not defined by machines but likewise exhibiting some regularity in its structure and, therefore, also allowing for a more precise calculation of the functions $\underline{\kappa}$ and $\kappa$.
Preliminaries

By $\mathbb{N} = \{0, 1, 2, \ldots\}$ we denote the set of natural numbers. We consider the space $X^\omega$ of infinite strings (sequences) over a finite alphabet of cardinality $\#X = r$. By $X^*$ we denote the set of finite strings (words) on $X$, including the empty word $e$. For $w \in X^*$ and $b \in X^* \cup X^\omega$ let $w \cdot b$ be their concatenation.¹ This concatenation product extends in an obvious way to subsets $W \subseteq X^*$ and $B \subseteq X^* \cup X^\omega$. As usual we refer to subsets of $X^*$ as languages and to subsets of $X^\omega$ as ω-languages. For a language $W$ let $W^0 := \{e\}$, $W^* := \bigcup_{i \in \mathbb{N}} W^i$, and by $W^\omega$ we denote the set of infinite sequences formed by concatenating words of $W$. Furthermore, $|w|$ is the length of the word $w \in X^*$ and $\ell(W) := \inf\{|w| : w \in W\}$ denotes the minimum length of a word in $W$.

For a word $w$ and a set $B \subseteq X^* \cup X^\omega$ we call $B/w := \{b : w \cdot b \in B\}$ the state of $B$ derived by the word $w$, and we call a set $B$ finite-state provided $\{B/w : w \in X^*\}$ is a finite set. Moreover, $A(B) := \{w : w \in X^* \wedge B/w \ne \emptyset\}$ is the set of all initial words (prefixes) of the set $B \subseteq X^* \cup X^\omega$. Finite-state languages are also known as regular languages (cf. [Sa]), whereas regular ω-languages are the finite unions of sets of the form $W \cdot V^\omega$ where $W$ and $V$ are regular languages.²
¹ For the sake of brevity, in what follows we shall write $w \cdot B$, $W \cdot b$, $b$ and $A(b)$ instead of $\{w\} \cdot B$, $W \cdot \{b\}$, $\{b\}$ and $A(\{b\})$, respectively.
Next we mention the following representation of regular ω-languages, a proof of which can be found e.g. in Chapter XIV of [Ei]. To this end we say that a language $V \subseteq X^*$ is prefix-free provided it does not contain words $w, v$ such that $w \ne v$ and $w$ is a prefix of $v$, i.e. $w \in A(v)$.
1.1 Theorem. An ω-language $F \subseteq X^\omega$ is regular if and only if there are an $n \in \mathbb{N}$ and regular languages $W_i, V_i \subseteq X^*$ such that the languages $V_i$ are prefix-free and

$F = \bigcup_{i=1}^{n} W_i \cdot V_i^\omega\,.$
We consider $X^\omega$ as a metric space with the metric $\varrho$ defined by

(1.2)    $\varrho(\xi, \eta) := \inf\{r^{-|w|} : w \in A(\xi) \cap A(\eta)\}\,.$

Since $X$ is finite, this space is compact. Its open (and simultaneously closed) balls are the sets of the form $w \cdot X^\omega$ ($w \in X^*$). Thus the closure $\mathcal{C}(F)$ of a subset $F \subseteq X^\omega$ can be described as follows:

(1.3)    $\mathcal{C}(F) = \bigcap_{i \in \mathbb{N}} (A(F) \cap X^i) \cdot X^\omega = \{\xi : A(\xi) \subseteq A(F)\}$
$G_\delta$-sets in $(X^\omega, \varrho)$ are characterized as follows (cf. [Ei], [LS]): Let $V^\delta := \{\xi : A(\xi) \cap V \text{ is infinite}\}$; then $E \subseteq X^\omega$ is a $G_\delta$-set if and only if $\exists V (V \subseteq X^* \wedge E = V^\delta)$.
As usual we denote by $\Sigma_i$ and $\Pi_i$ the classes of languages in the arithmetical hierarchy.
Upper bounds

In this section we derive upper bounds on the complexity of infinite strings in a given ω-language $F$ by means of its entropy $H_F$. We start with some connections between the entropy $H_W$ of languages and the complexity of words $w \in W$. To this end let $s_W$ be the structure function of the language $W$, which is defined as follows (cf. [Ku]):

(2.1)    $s_W(n) := \#\{w : w \in W \wedge |w| = n\}\,.$

The corresponding structure generating function is

(2.2)    $s_W(t) := \sum_{i \in \mathbb{N}} s_W(i) \cdot t^i\,.$
² In contrast to the case of languages, not every finite-state ω-language is also regular (cf. [S4, Section 5]).
The series $s_W$ is a positive series and its convergence radius $\mathrm{rad}\,W$ satisfies $r^{-1} \le \mathrm{rad}\,W$. We define

(2.3)    $s_W(\mathrm{rad}\,W) := \sup\{s_W(t) : t < \mathrm{rad}\,W\}$   and   $s_W(t) := \infty$ if $t > \mathrm{rad}\,W$,

and consider $s_W$ also as a function mapping $[0, \infty)$ to $[0, \infty) \cup \{\infty\}$. The entropy of a language $W$ is defined as follows:

(2.4)    $H_W := \limsup_{n\to\infty} n^{-1} \cdot \log_r s_W(n) = -\log_r \mathrm{rad}\,W\,.$

For an ω-language $F \subseteq X^\omega$ the notions $s_F$, $H_F$ and $\mathrm{rad}\,F$ are defined as $s_{A(F)}$, $H_{A(F)}$ and $\mathrm{rad}\,A(F)$, respectively. Therefore, the entropy of an ω-language $F$ coincides with the entropy of its closure $\mathcal{C}(F)$, since $A(\mathcal{C}(F)) = A(F)$. First we derive two results for the complexity of finite strings.
2.5 Theorem. If $W \in \Sigma_1 \cup \Pi_1$ then there is a constant $c$ such that for every $w \in W$ it holds that $K(w \mid |w|) \le \log_r s_W(|w|) + c$.

Proof. The $\Sigma_1$-part of the theorem is Theorem 1.1.i of [dL]. For the $\Pi_1$-part define $A(\pi, n)$ in the following way: enumerate $X^n \setminus W$ up to the point when $r^n - r^{|\pi|}$ elements of length $n$ have appeared. Then take from the rest the $q(\pi)$-th word of length $n$.³ If $|\pi| \ge \log_r s_W(n)$ then the above enumeration process terminates. Hence $K_A(w \mid |w|) \le \lceil \log_r s_W(|w|) \rceil$ for every $w \in W$. □
This result, however, cannot be transferred to the next classes of the arithmetical hierarchy.
2.6 Example ([Sc1],[vL]). There are sequences $\xi \in X^\omega$ satisfying $A(\xi) \in \Sigma_2 \cap \Pi_2$ and $K(\xi/n) \ge n - o(n)$.⁴ Since $s_{\{\xi\}} \equiv 1$, our assertion follows. □
Our Theorem 2.5 leads to the following improvement of Theorem 1 of [S1].
2.7 Proposition. If $A(F) \in \Sigma_1 \cup \Pi_1$ then $\kappa(F) \le H_F$. □
2.8 Proposition. Let $V \in \Sigma_1 \cup \Pi_1$. Then for every $\xi \in V^\delta$ there are infinitely many $n \in \mathbb{N}$ such that $K(\xi/n) \le \log_r s_V(n) + c$ for some constant $c$. □
³ By $q(\pi)$ we denote the position of the word $\pi$ in the lexicographical ordering of the set $\{\sigma : \sigma \in X^* \wedge |\sigma| = |\pi|\}$.
⁴ In what follows we shall use the small-o-notation as well as the abbreviations $\le_{ae}$ and $\le_{io}$ (and likewise $\ge_{ae}$, $\ge_{io}$, $>_{io}$) to denote that the corresponding inequalities hold almost everywhere or infinitely often, respectively.
In Theorem 2 of [S1], for the special case of finite-state ω-languages, the following bound, which is stronger than Proposition 2.7, is obtained. Its proof is based on an auxiliary proposition which can be found in [S2].

2.9 Proposition. If $F \subseteq X^\omega$ is a finite-state ω-language for which a language $W \subseteq X^*$ with $\mathcal{C}(W^\omega) = \mathcal{C}(F)$ exists, then there is a constant $c$ such that for all $n \in \mathbb{N}$ and all $w \in A(F)$ the inequality $|H_F \cdot n - \log_r s_{F/w}(n)| \le c$ is satisfied. □

Remark. The condition $\exists W (W \subseteq X^* \wedge \mathcal{C}(F) = \mathcal{C}(W^\omega))$ is equivalent to the following one employed in Corollary 4 of [S2]: $\forall w (w \in A(F) \to \exists v (\mathcal{C}(F)/(w \cdot v) \supseteq F))$.

2.10 Proposition ([S1]). Let $F \subseteq X^\omega$ be finite-state. Then for every $\xi \in F$ there is a constant $c$ such that $K(\xi/n) \le_{ae} H_F \cdot n + c$.
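For a concrete finite-state example the bounded deviation asserted in Proposition 2.9 can be observed numerically. The sketch below is an illustration, not taken from the paper: it counts $s_F(n)$ for the assumed sample set $F$ of binary sequences without the factor 11 via the transition matrix of its automaton and compares $\log_2 s_F(n)$ with $H_F \cdot n$ (here only for the initial state, i.e. $w = e$). The identity of $H_F$ with $\log_2$ of the spectral radius of the matrix is a standard fact about finite-state closed sets which is assumed here, not proved above.

```python
import numpy as np

# Transition matrix of the automaton accepting A(F) for the sample
# finite-state omega-language F = { xi in {0,1}^omega : xi contains no "11" }.
# State 0: last symbol was 0 (or start), state 1: last symbol was 1.
M = np.array([[1, 1],    # from state 0 one may read 0 (stay) or 1 (go to 1)
              [1, 0]])   # from state 1 only 0 is allowed

def s_F(n: int) -> int:
    # number of words of length n in A(F): all paths of length n from state 0
    return int(np.linalg.matrix_power(M, n).sum(axis=1)[0])

# H_F = log_2 of the spectral radius of M (Perron eigenvalue); this standard
# identity for finite-state closed F is assumed here, it is not shown above.
H_F = float(np.log2(max(abs(np.linalg.eigvals(M)))))

for n in (5, 10, 20, 40):
    dev = abs(np.log2(s_F(n)) - H_F * n)
    print(n, s_F(n), round(dev, 3))   # the deviation stays bounded (Prop. 2.9)
```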
Coverings and Hausdorff dimension
An $r^{-n}$-covering of a set $F \subseteq X^\omega$ is a family $(v \cdot X^\omega)_{v \in V}$ of balls such that $\bigcup_{v \in V} v \cdot X^\omega = V \cdot X^\omega \supseteq F$ and whose diameters $\mathrm{diam}\; v \cdot X^\omega$ do not exceed $r^{-n}$, i.e. $\ell(V) \ge n$. In Section 14 of [Bi] the general definition of the Hausdorff dimension of a subset of a metric space is given. We recall this definition for our space $(X^\omega, \varrho)$. Let

(3.1)    $\mathcal{L}_\alpha(F; V) := \sum_{v \in V} (\mathrm{diam}\; v \cdot X^\omega)^\alpha = \sum_{v \in V} r^{-\alpha \cdot |v|} = s_V(r^{-\alpha})$

for a covering $(v \cdot X^\omega)_{v \in V}$ of $F$ and $0 \le \alpha \le 1$. Then

(3.2)    $\mathcal{L}_\alpha(F) := \lim_{n\to\infty} \inf\{\mathcal{L}_\alpha(F; V) : V \cdot X^\omega \supseteq F \wedge \ell(V) \ge n\}$

is the α-dimensional outer measure of $F$. Now, consider $\mathcal{L}_\alpha(F)$ for fixed $F$ as a function of $\alpha$. Then there is an $\alpha_0 \in [0, 1]$ such that $\mathcal{L}_\alpha(F) = \infty$ if $\alpha < \alpha_0$ and $\mathcal{L}_\alpha(F) = 0$ if $\alpha > \alpha_0$. This "change-over" point $\alpha_0$ is called the Hausdorff dimension $\dim F$ of the set $F$, i.e.

(3.3)    $\dim F := \sup\{\alpha : \mathcal{L}_\alpha(F) = \infty\} = \inf\{\alpha : \mathcal{L}_\alpha(F) = 0\}\,.$

We observe that the measures $\kappa$, $\underline{\kappa}$, $\dim$ and $H$ share some common properties (cf. [Bi], [S2]).

3.4 Property. Let $\phi$ be an arbitrary one of the measures $\kappa$, $\underline{\kappa}$, $\dim$ or $H$. Then for subsets $E, F, F_i \subseteq X^\omega$ the following identities hold:
(3.4.1)  $\phi(w \cdot F) = \phi(F)$ and $\phi(F/w) \le \phi(F)$ for $w \in X^*$;
(3.4.2)  $\phi(E \cup F) = \max\{\phi(E), \phi(F)\}$, and
(3.4.3)  $\phi(\bigcup_{i \in \mathbb{N}} F_i) = \sup\{\phi(F_i) : i \in \mathbb{N}\}$, if $\phi \ne H$.
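The change-over behaviour in (3.3) can be watched numerically. For a closed set $F$ the prefixes of length $n$ yield the particular covering $V = A(F) \cap X^n$ with $\mathcal{L}_\alpha(F; V) = s_F(n) \cdot r^{-\alpha n}$; these sums only bound $\mathcal{L}_\alpha(F)$ from above, but they already show the flip around $\alpha_0 = \dim F$. The sketch below, an assumed toy example that is not part of the paper, does this for the set $F$ of sequences whose odd positions all carry the symbol 0, for which $\dim F = 1/2$ (by the results on finite-state closed sets quoted later, Property 4.1).

```python
# An assumed toy example: F = { xi in {0,1}^omega : every odd position of xi
# carries the symbol 0 }, so only the even positions are free and dim F = 1/2.
# We evaluate the covering sums (3.1) for the coverings of F by all of its
# prefixes of length n; they only bound L_alpha(F) from above, but the
# change-over of (3.3) at alpha_0 = 1/2 is already visible.
def covering_sum(alpha: float, n: int) -> float:
    s_F_n = 2 ** (n // 2)                 # number of prefixes of F of length n
    return s_F_n * 2.0 ** (-alpha * n)    # = s_F(n) * r**(-alpha*n), r = 2

for alpha in (0.3, 0.5, 0.7):
    print(alpha, [round(covering_sum(alpha, n), 4) for n in (10, 20, 40, 80)])
# alpha < 1/2: the sums grow without bound; alpha > 1/2: they tend to 0.
```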
For nonempty sets of the form $W \cdot F$ the last property implies that $\phi(W \cdot F) = \phi(F)$ for $\phi \ne H$, whereas $H_{W \cdot F} = \max\{H_{A(W)}, H_F\}$ in virtue of $A(W \cdot F) = A(W) \cup W \cdot A(F)$ (cf. [LS]). Next we draw a connection between the Hausdorff dimension of $G_\delta$-sets and the structure generating function of languages.
3.5 Lemma. If $s_V(r^{-\alpha}) < \infty$ then $\mathcal{L}_\alpha(V^\delta) = 0$.

Proof. Define $V^{(i)} := \{v : v \in V \wedge \#(A(v) \cap V) = i - 1\}$. By definition $V^{(i)} \cdot X^\omega \supseteq V^\delta$ and $V = \bigcup_{i \in \mathbb{N}} V^{(i)}$ is a disjoint union. Now, the hypothesis guarantees that $\mathcal{L}_\alpha(V^\delta; V^{(i)}) = s_{V^{(i)}}(r^{-\alpha})$ tends to zero as $i$ approaches infinity. □
If $s_V(r^{-\alpha}) < \infty$ then, in particular, $\alpha \ge H_V$. Consequently

(3.6)    $\dim V^\delta \le H_V\,.$
Lemma 3.5 has in a certain sense a converse.
3.7 Lemma. If $\mathcal{L}_\alpha(F) = 0$ then there is a $W \subseteq X^*$ such that $F \subseteq W^\delta$ and $s_W(r^{-\alpha}) < \infty$.

Proof. If $\alpha = 0$ the measure $\mathcal{L}_0$ is the counting measure. Hence $F = \emptyset$ and the assertion is obvious. Let $\alpha > 0$. Choose a family $\{W_i : i \in \mathbb{N}\}$ such that $W_i \cdot X^\omega \supseteq F$ and $\mathcal{L}_\alpha(F; W_i) < r^{-i}$. This, in particular, implies $\ell(W_i) \ge \alpha^{-1} \cdot i$. Define $W := \bigcup_{i \in \mathbb{N}} W_i$. Then $s_W(r^{-\alpha}) \le \sum_{i \in \mathbb{N}} \mathcal{L}_\alpha(F; W_i) < \infty$. It remains to show that $W^\delta \supseteq F$. This is obvious because, in view of $W_i \cdot X^\omega \supseteq F$ and $\ell(W_i) \ge \alpha^{-1} \cdot i$, every $\xi \in F$ has arbitrarily long initial words in $W$. □

Since $s_W(r^{-\alpha}) < \infty$ implies $\alpha \ge H_W$, this leads to the following consequence of Lemma 3.7 and Eqs. (3.3) and (3.6):

(3.8)    $\dim F = \inf\{H_W : W^\delta \supseteq F\}$
Now, consider for $F \subseteq X^\omega$ an infinite subset $M \subseteq \mathbb{N}$ such that $(n^{-1} \cdot \log_r s_F(n))_{n \in M}$ tends to $\liminf_{n\to\infty} n^{-1} \cdot \log_r s_F(n)$, and define $W := \bigcup_{n \in M} A(F) \cap X^n$. Then $W^\delta = \mathcal{C}(F)$, and Eq. (3.6) proves the following inequality:

(3.9)    $\dim \mathcal{C}(F) \le \liminf_{n\to\infty} n^{-1} \cdot \log_r s_F(n) \le H_F$
Our first lower bound is a general one. It is obtained by combining the above relations between the structure function and the Hausdorff dimension with a simple counting argument. We consider the set $E(\alpha, f) := \{\xi : K(\xi/n) \ge_{ae} \alpha \cdot n - f(n)\}$ of $(\alpha, f)$-complex sequences.
Its complement $X^\omega \setminus E(\alpha, f)$ can be described as $V^\delta$ where $V := \{v : K(v \mid |v|) < \alpha \cdot |v| - f(|v|)\}$. Counting the number of programs of length $< k$ (there are at most $(r^k - 1)/(r - 1)$ of them) one obtains

(3.10)    $\#\{w : |w| = n \wedge K(w \mid n) < k\} < r^k/(r-1)\,.$

Hence $s_V(i) \le 2 \cdot r^{\alpha \cdot i - f(i)}$ and, consequently, $\sum_{i \in \mathbb{N}} r^{-f(i)} < \infty$ implies $s_V(r^{-\alpha}) < \infty$. Utilizing Lemma 3.5 we obtain our

3.11 Lemma. Let $F \subseteq X^\omega$, and let $f : \mathbb{N} \to \mathbb{N}$ be an arbitrary function such that $\sum_{i \in \mathbb{N}} r^{-f(i)} < \infty$. Then $\mathcal{L}_\alpha(F) > 0$ implies $F \cap E(\alpha, f) \ne \emptyset$.
Now Lemma 3.11 and Eq. (3.3) yield the following lower bound on $\underline{\kappa}(F)$.
3.12 Corollary ([Ry]). $\forall F\, (F \subseteq X^\omega \to \dim F \le \underline{\kappa}(F))$.

We conclude this section with examples which show, on the one hand, that there might be large gaps between $\underline{\kappa}(F)$ and $\kappa(F)$ as well as large gaps between $\underline{\kappa}(\xi)$ and $\kappa(\xi)$ for a single string $\xi \in F$, even if $A(F)$ is recursive, thus showing that the Hausdorff dimension is not an upper bound to $\kappa$, and, on the other hand, that the bound of Proposition 2.7 is in several cases very imprecise.
3.13 Example. Let $F := \prod_{i \in \mathbb{N}} X^{(2i)!} \cdot x^{(2i+1)!}$ where $x \in X$. Clearly, $A(F)$ is recursive. When we consider $F$ as a subset of $W^\delta$ with $W := \bigcup_{i \in \mathbb{N}} \prod_{j=1}^{i} X^{(2j)!} \cdot x^{(2j+1)!}$, Proposition 2.8 proves that $\underline{\kappa}(F) = 0$. On the other hand, Daley's diagonalization argument [Da] shows that there is a $\xi \in F$ satisfying $\kappa(\xi) = 1$. □
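The qualitative behaviour behind Example 3.13 can be made visible with a computable stand-in for $K$. The sketch below is an illustration only, not the construction of the example itself: it alternates free random blocks with forced all-zero blocks, a geometric growth schedule $4^k$ replaces the factorial one so that the oscillation shows up at modest prefix lengths, and the zlib-compressed size serves as a rough proxy for $K(\xi/n)$.

```python
import os
import zlib

# Illustration only: a sequence built from alternating "free" random blocks
# and forced all-zero blocks whose lengths grow geometrically (4**k instead of
# the factorial schedule of Example 3.13).  The compressed size of prefixes,
# a computable stand-in for K(xi/n), oscillates: it is small at the end of a
# forced block and large at the end of a free block.
blocks, boundaries = [], []
total = 0
for k in range(1, 9):
    length = 4 ** k
    if k % 2 == 1:                       # free block: random bits
        block = bytes((b & 1) + ord('0') for b in os.urandom(length))
    else:                                # forced block: all zeros
        block = b'0' * length
    blocks.append(block)
    total += length
    boundaries.append(total)

xi = b"".join(blocks)
for n in boundaries:
    ratio = 8 * len(zlib.compress(xi[:n], 9)) / n   # compressed bits per symbol
    print(n, round(ratio, 3))
# The ratios oscillate between small and large values, imitating the gap
# between liminf and limsup of K(xi/n)/n.
```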
3.14 Example ([LS]). Let $E := \{x\}^\omega \cup \bigcup_{i \in \mathbb{N}} x^i \cdot y \cdot X^{i!} \cdot \{x\}^\omega$, where $x, y \in X$ and $x \ne y$. Then one easily verifies that $H_E = 1$ but $\kappa(E) = 0$. □
Regular ω-languages

Our Example 3.13 shows that the Hausdorff dimension is, in general, no suitable upper bound to $\kappa(F)$. In this section we exhibit a class of ω-languages which behaves quite regularly in the sense that $\dim F$ is also an upper bound (and, therefore, in view of Corollary 3.12, a tight one) to $\kappa(F)$. We start with the following property, which is proved in Corollary 7 of [S7].
4.1 Property. If $F$ is finite-state and closed then $\dim F = H_F$.
Next we generalize Theorem 4 of [S6].

4.2 Theorem. If $F \subseteq X^\omega$ is regular and $\alpha = \dim F$ then $\mathcal{L}_\alpha(F) > 0$.
Before we proceed to the proof of the theorem we mention the following properties, which can be found e.g. in Section 9 of [LS] or Chapter VIII of [Ei]:

(4.3)    $H_{A(V)} = H_V$ if $V$ is a regular language, and
(4.4)    $H_V < H_{V^*}$ if $V$ is a regular and prefix-free language.
Moreover, the following holds.

4.5 Lemma. If $V$ is regular and prefix-free then $\mathcal{L}_\alpha(V^\omega) = \mathcal{L}_\alpha(\mathcal{C}(V^\omega))$ for $\alpha = \dim \mathcal{C}(V^\omega)$.

Proof. We have $\mathcal{C}(V^\omega) = V^\omega \cup V^* \cdot E$ where $E := \{\xi : A(\xi) \subseteq A(V)\}$ [LS, Section 5]. Hence, by the above Eqs. (3.6), (4.3) and (4.4): $\dim E \le H_E \le H_{A(V)} = H_V < H_{V^*}$. Moreover, $A(V^\omega) \cup \{e\} = V^* \cdot A(V)$ implies $H_{V^*} \le H_{V^\omega} \le H_{\mathcal{C}(V^\omega)} = \dim \mathcal{C}(V^\omega)$ by Property 4.1. Thus for $\alpha = \dim \mathcal{C}(V^\omega) > \dim E$ we have $\mathcal{L}_\alpha(E) = 0$, which in view of the above identity $\mathcal{C}(V^\omega) = V^\omega \cup V^* \cdot E$ and Eq. (3.4.3) proves our assertion. □

The following consequence is immediate.
4.6 Corollary. If $V$ is regular and prefix-free then $\dim V^\omega = \dim \mathcal{C}(V^\omega) = H_{V^\omega}$.
Proof of Theorem 4.2. For closed regular ω-languages $F$ the assertion is proved in Theorem 4 of [S6]. According to Theorem 1.1 a regular ω-language $F$ can be represented as a finite union of sets of the form $W \cdot V^\omega$ where $W$ and $V$ are regular and $V$ is prefix-free. From Property 3.4 we obtain that there is some $w \cdot V^\omega \subseteq F$ such that $\dim w \cdot V^\omega = \dim F$. Now, for $\alpha = \dim F$ we have

$\mathcal{L}_\alpha(w \cdot V^\omega) = r^{-\alpha \cdot |w|} \cdot \mathcal{L}_\alpha(V^\omega) = r^{-\alpha \cdot |w|} \cdot \mathcal{L}_\alpha(\mathcal{C}(V^\omega)) > 0$

by the above Lemma 4.5 and Theorem 4 of [S6]. □

Moreover, together with Lemma 3.11 we can improve Theorem 6 of [S1] as follows.
4.7 Theorem. Let $F \subseteq X^\omega$ be regular, and let $f : \mathbb{N} \to \mathbb{N}$ be a function such that $\sum_{i \in \mathbb{N}} r^{-f(i)} < \infty$. Then there is a $\xi \in F$ such that $K(\xi/n) \ge_{ae} \dim F \cdot n - f(n)$.
As a further consequence we obtain an improvement of the upper bound derived in Proposition 2.10, which together with the preceding theorem gives evidence that in a regular ω-language $F$ the maximum and minimum complexity of a maximally complex sequence differ only slightly.
4.8 Theorem. Let $F$ be a regular ω-language. Then for every $\xi \in F$ there is a constant $c$ such that $K(\xi/n) \le_{ae} \dim F \cdot n + c$.
Proof. According to Theorem 1.1 we have some $w \in X^*$ and a prefix-free regular language $V$ such that $\xi \in w \cdot V^\omega \subseteq F$. Thus $\xi \in w \cdot \mathcal{C}(V^\omega) = \mathcal{C}(w \cdot V^\omega)$. Corollary 4.6 implies $H_{V^\omega} = \dim V^\omega \le \dim F$, and the assertion follows from Proposition 2.10. □

In the previous section we gave an example of complexity gaps in (sets of) infinite strings, and in this section we showed that for regular ω-languages there are no such gaps. In the late sixties P. Martin-Löf and G.J. Chaitin (cf. [LV]) proved that each sequence $\xi$ in the regular ω-language $X^\omega$ has infinitely many complexity dips $K(\xi/n) \le n - f(n)$ when $f : \mathbb{N} \to \mathbb{N}$ is a recursive function satisfying $\sum_{i \in \mathbb{N}} r^{-f(i)} = \infty$. The proof sketched in [ZL] easily extends to regular ω-languages of the form $V^\omega$ where $V \subseteq \{w : w \in X^* \wedge |w| = k\}$, in the sense that $K(\xi/n) \le_{io} \dim V^\omega \cdot n - f(n)$ where $f$ is as above and $\xi \in V^\omega$. This has its reason in the regularity of the branching behaviour of the infinite $r$-ary tree corresponding to $F = V^\omega$, more exactly in the fact that for every $v, w \in A(F)$ and for all $n \in \mathbb{N}$ the numbers $s_{F/w}(n)$ and $s_{F/v}(n)$ do not differ too much from each other. In Proposition 2.9 we have shown that, in particular, for every ω-language of the form $E = w \cdot V^\omega$ with $V \subseteq X^*$ regular there is a constant $c$ such that $|\log_r s_{E/v}(n) - H_E \cdot n| < c$ holds for all $n \in \mathbb{N}$ and all $v \in A(E)$. Those ω-languages fulfill the above mentioned requirement, and the idea of P. Martin-Löf's proof can be transferred to ω-languages of the above mentioned form $E = w \cdot V^\omega$. This leads to the following result on complexity dips in regular ω-languages.
4.9 Theorem. Let $F$ be a regular ω-language and let $f : \mathbb{N} \to \mathbb{N}$ be a recursive function satisfying $\sum_{i \in \mathbb{N}} r^{-f(i)} = \infty$. Then $K(\xi/n) \le_{io} \dim F \cdot n - f(n)$ holds for all $\xi \in F$.
ω-power languages

We continue our investigations by considering ω-languages of the special shape $W^\omega$ where $W$ is some (not necessarily regular) language. It turns out that one can obtain more precise estimates for $\underline{\kappa}(F)$ and $\kappa(F)$ in the case of those ω-languages. First we are going to prove an assertion claimed by B.Ya. Ryabko [Ry] concerning the Hausdorff dimension of ω-powers $W^\omega$. To this end we quote the theorem of [S5].

5.1 Theorem. Let $W \subseteq X^*$ be an arbitrary language. Then for every $\varepsilon > 0$ there is a finite subset $V \subseteq W$ such that $H_{W^*} - H_{V^*} < \varepsilon$.

Observe now that for every finite $V \subseteq X^*$ the ω-power $V^\omega$ is a regular and closed ω-language. Consequently, Property 4.1 and Eq. (4.3) prove $\dim V^\omega = H_{V^\omega} = H_{V^*}$.
Thus, in particular, any finite $V \subseteq W$ satisfies $H_{V^*} = \dim V^\omega \le \dim W^\omega$. Applying our theorem yields $H_{W^*} \le \dim W^\omega$. Conversely, utilizing Eq. (3.6) and the obvious inclusion $W^\omega \subseteq (W^*)^\delta$, we obtain $\dim W^\omega \le H_{W^*}$, which proves the assertion:

(5.2)    $\dim W^\omega = H_{W^*}$
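Equation (5.2) makes $\dim W^\omega$ computable for concrete languages: by (2.4), $H_{W^*} = -\log_r \mathrm{rad}\,W^*$, and for a prefix-free code $W$ the radius $\mathrm{rad}\,W^*$ is the smallest positive root of $s_W(t) = 1$ whenever such a root exists. This standard fact about the star operation is assumed, not proved, in the following numerical sketch (a toy example that is not part of the paper) for the prefix-free language $W = \{0, 10\}$.

```python
from math import log2

# Sample prefix-free language W = {"0", "10"} over X = {0, 1}, r = 2.
W = ["0", "10"]

def s_W(t: float) -> float:
    # structure generating function (2.2) evaluated at t
    return sum(t ** len(w) for w in W)

# Find the smallest positive root of s_W(t) = 1 by bisection; s_W is
# increasing on [0, 1], s_W(0) = 0 and s_W(1) = #W >= 1.
lo, hi = 0.0, 1.0
for _ in range(100):
    mid = (lo + hi) / 2
    if s_W(mid) < 1.0:
        lo = mid
    else:
        hi = mid

dim = -log2(hi)                       # dim W^omega = H_{W*} = -log_r rad W*
print(round(dim, 6))                  # ~ 0.694242 = log_2((1 + sqrt(5)) / 2)
```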
Proposition 2.8, Corollary 3.12 and Eq. (5.2) yield the following estimate of $\underline{\kappa}(W^\omega)$.

5.3 Proposition. If $W \in \Sigma_1 \cup \Pi_1$ then $\underline{\kappa}(W^\omega) = \dim W^\omega$. □

Remark. The $\Sigma_1$-part of Proposition 5.3 was proved in a different manner in [Ry]. □
Next we give an example which simultaneously shows that the bound in Corollary 3.12 is not tight in general and that Proposition 5.3 cannot be extended to higher classes of the arithmetical hierarchy.
5.4 Example. Consider the same set $A(\xi)$ as in Example 2.6. From Chapter VIII of [Ei] we know that $s_W(r^{-1}) \ge 1$ is a necessary condition for $H_{W^*} = \dim W^\omega = 1$. Let $\xi = \prod_{i \in \mathbb{N}} w_i$ where $|w_i| = i!$, and define $W := \{w_i : i \in \mathbb{N}\}$. One easily verifies $\xi \in W^\omega$, $W \in \Sigma_2 \cap \Pi_2$ and $s_W(r^{-1}) < 1$. Consequently, $\dim W^\omega < 1 = \underline{\kappa}(\xi) = \underline{\kappa}(W^\omega)$. □
We continue this section by showing that for $\kappa(W^\omega)$ in Eq. (0.2) the supremum can be replaced by a maximum. To this end we mention the following well-known relation between the complexity of a product $w \cdot v$ and the complexity of its factors $w$ and $v$ (cf. [ZL]):

(5.5)    $\exists c\, \forall w\, \forall v\, \big(K(w \cdot v \mid |w \cdot v|) \ge K(v \mid |v|) - 2 \cdot \log_r |w| - c\big)$
Furthermore we need the following theorem which can be proved as Theorem 3 in [S1].
5.6 Theorem. Let $W \subseteq X^*$, let $(\xi_i)_{i \in \mathbb{N}}$ be a family of elements of $\mathcal{C}(W^\omega)$ and let $(f_i)_{i \in \mathbb{N}}$ be a family of functions $f_i : \mathbb{N} \to \mathbb{N}$ such that $K(\xi_i/n) >_{io} f_i(n)$ for all $i \in \mathbb{N}$, and such that for all $k, m \in \mathbb{N}$ the inequality $f_i(n + k) + m \le_{ae} f_j(n)$ holds whenever $i < j$. Then there is a $\xi \in W^\omega$ such that $\forall i\, (i \in \mathbb{N} \to K(\xi/n) >_{io} f_i(n))$. □
As an immediate consequence of our theorem we get the announced identity:

(5.7)    $\kappa(W^\omega) = \max\{\kappa(\xi) : \xi \in W^\omega\}$
Utilizing the same idea as in the proof of Theorem 3 of [S1] we can give a more precise (at least for $W \in \Sigma_1$) estimate of the value of $\kappa(W^\omega)$.

5.8 Lemma. Let $W \subseteq X^*$. Then $\kappa(W^\omega) \ge H_{W^\omega} = \max\{\dim W^\omega, H_{A(W)}\}$. □
As an immediate consequence we obtain from Proposition 2.7 the following identity for languages $W \in \Sigma_1$:

(5.9)    $\kappa(W^\omega) = H_{W^\omega} = \max\{\dim W^\omega, H_{A(W)}\}$, if $W \in \Sigma_1\,.$
Comparing these results with the corresponding ones for the function $\underline{\kappa}$, namely $\underline{\kappa}(W^\omega) \ge \dim W^\omega$ or $\underline{\kappa}(W^\omega) = \dim W^\omega$, reveals a reason for the possible appearance of complexity gaps in ω-power languages.
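A hypothetical toy construction (not contained in the paper) may illustrate this: for the prefix-free recursive language $W := \{1^n 0 v 0^{3n} : v \in \{0,1\}^n,\ n \ge 1\}$ the prefixes $1^n 0 v'$ with $|v'| \le n$ dominate, so $H_{A(W)} = 1/2$, whereas $\dim W^\omega = H_{W^*} \approx 0.366$ by Eq. (5.2) together with the root formula for prefix-free codes assumed in the previous sketch. Hence $\underline{\kappa}(W^\omega) = \dim W^\omega \approx 0.366$ by Proposition 5.3, but $\kappa(W^\omega) = \max\{\dim W^\omega, H_{A(W)}\} = 1/2$ by (5.9). The following sketch computes the two values.

```python
from math import log2

# Hypothetical illustration: W = { 1^n 0 v 0^(3n) : v in {0,1}^n, n >= 1 },
# a prefix-free recursive language; every word carries n free bits inside
# 4n+1 forced symbols.

def s_W(t: float) -> float:
    # structure generating function: sum_{n>=1} 2^n * t^(5n+1), in closed form
    return 2 * t ** 6 / (1 - 2 * t ** 5)

# dim W^omega = H_{W*} = -log_2 of the smallest root of s_W(t) = 1 (Eq. (5.2)
# plus the assumed root formula for the star of a prefix-free code).
lo, hi = 0.0, 2.0 ** (-1 / 5) - 1e-9   # s_W is increasing on this interval
for _ in range(200):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if s_W(mid) < 1 else (lo, mid)
dim_W_omega = -log2(hi)

# H_{A(W)} = 1/2: among the prefixes of length L the dominant group are the
# words 1^n 0 v' with |v'| <= n, roughly 2^(L/2) many of them.
H_A_W = 0.5

print(round(dim_W_omega, 3))              # ~ 0.366 = underline-kappa(W^omega)
print(round(max(dim_W_omega, H_A_W), 3))  # = 0.5   = kappa(W^omega) by (5.9)
```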
References

[Bi] Billingsley, P., Ergodic Theory and Information. Wiley, New York 1965.

[Da] Daley, R.P., The extent and density of sequences within the minimal-program complexity hierarchies. J. Comput. System Sci. 15 (1974), 151 - 163.

[dL] de Luca, A., On the entropy of a formal language. In: Automata Theory and Formal Languages, Proc. 2nd GI Conference (H. Brakhage, Ed.), Lect. Notes Comput. Sci. 33, Springer-Verlag, Berlin 1975, 103 - 109.

[Ei] Eilenberg, S., Automata, Languages, and Machines. Vol. A, Academic Press, New York 1974.

[HR] Hoogeboom, H.J. and Rozenberg, G., Infinitary languages: Basic theory and applications to concurrent systems. In: Current Trends in Concurrency - Overviews and Tutorials (J.W. de Bakker, W.-P. de Roever and G. Rozenberg, Eds.), Lect. Notes Comput. Sci. 224, Springer-Verlag, Berlin 1986, 266 - 342.

[KS] Katseff, H.P. and Sipser, M., Several results in program size complexity. Theoret. Comput. Sci. 15 (1981), 291 - 309.

[Ku] Kuich, W., On the entropy of context-free languages. Inform. and Control 16 (1970) 2, 173 - 200.
[LC] Leung-Yan-Cheong, S.K. and Cover, T., Some equivalences between Shannon entropy and Kolmogorov complexity. IEEE Trans. Inform. Theory IT-24 (1978), 331 - 338.

[LV] Li, M. and Vitányi, P.M.B., Two decades of applied Kolmogorov complexity. In: Proc. 3rd IEEE Structure in Complexity Conference, 1988.

[LS] Lindner, R. and Staiger, L., Algebraische Codierungstheorie - Theorie der sequentiellen Codierungen. Akademie-Verlag, Berlin 1977.

[Ry] Ryabko, B.Ya., Noiseless coding of combinatorial sources, Hausdorff dimension and Kolmogorov complexity. Problemy Peredachi Informatsii 22 (1986) 3, 16 - 26. [Russian]

[Sa] Salomaa, A., Theory of Automata. Pergamon, Oxford 1969.

[Sc1] Schnorr, C.P., Zufälligkeit und Wahrscheinlichkeit. Lect. Notes Math. 218, Springer-Verlag, Berlin 1971.

[Sc2] Schnorr, C.P., Process complexity and effective random tests. J. Comput. System Sci. 7 (1973) 4, 376 - 388.

[S1] Staiger, L., Complexity and entropy. In: Mathematical Foundations of Computer Science (J. Gruska and M. Chytil, Eds.), Lect. Notes Comput. Sci. 118, Springer-Verlag, Berlin 1981, 508 - 514.

[S2] Staiger, L., The entropy of finite-state ω-languages. Probl. Control and Inform. Theory 14 (1985) 5, 383 - 392.

[S3] Staiger, L., Hierarchies of recursive ω-languages. J. Inform. Process. Cybern. EIK 22 (1986) 5/6, 219 - 241.

[S4] Staiger, L., Research in the theory of ω-languages. J. Inform. Process. Cybern. EIK 23 (1987) 8/9, 415 - 439.

[S5] Staiger, L., Ein Satz über die Entropie von Untermonoiden. Theoret. Comput. Sci. 61 (1988) 2-3, 279 - 282.

[S6] Staiger, L., Quadtrees and the Hausdorff dimension of pictures. In: Proc. GEOBILD '89 (A. Hübler, W. Nagel, B.D. Ripley and G. Werner, Eds.), Mathematical Research 51, Akademie-Verlag, Berlin 1989.

[S7] Staiger, L., Combinatorial properties of the Hausdorff dimension. J. Statist. Plann. Inference 22 (1989), to appear.⁵

⁵ This paper appeared in: J. Statist. Plann. Inference 23 (1989), 95 - 100.
[Th] Thomas, W., Automata on infinite objects. Aachener Informatik-Berichte 88-17.⁶

[vL] van Lambalgen, M., Random sequences. Ph.D. Thesis, Univ. of Amsterdam, 1987.

[ZL] Zvonkin, A.K. and Levin, L.A., Complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Math. Surveys 25 (1970), 83 - 124.

⁶ This paper appeared in: Handbook of Theoretical Computer Science, Vol. B (J. van Leeuwen, Ed.), Elsevier, Amsterdam 1990, 133 - 191.