Language and Automata Theory and Applications. 2020 Jan 7;12038:222–232. doi: 10.1007/978-3-030-40608-0_15

Limited Two-Way Deterministic Finite Automata with Advice

Ahmet Bilal Uçan
Editors: Alberto Leporati, Carlos Martín-Vide, Dana Shapira, Claudio Zandron
PMCID: PMC7206619

Abstract

External assistance in the form of strings called advice is given to an automaton in order to make it a non-uniform model of computation. Automata with advice are then examined to better understand the limitations imposed by uniformity, a property shared by all feasible computational models. The main contribution of this paper is to introduce and investigate an extension of the model introduced by Küçük et al. [6]. The model is called the circular deterministic finite automaton with advice tape (cdfat). In this model the input head is allowed to pass over the input multiple times. The number of allowed passes over the input, which is typically a function of the input length, is considered a resource besides the amount of advice. The results proved for the model include a hierarchy for cdfat with real-time heads, a simulation of 1w/1w cdfat by 1w/rt cdfat, lower bounds on the resources a cdfat needs in order to recognize any language, a bound on the utilizable advice regardless of the allowed pass limit, a relation between the utilizable pass limit and the advice limit, and some closure properties.

Keywords: Formal languages, Automata theory, Advised computation

Introduction

Advised computation, where external trusted assistance is provided to a machine to help it with computational tasks, was introduced by Karp and Lipton [4] in 1982. Damm and Holzer [1] considered giving advice to restricted versions of Turing machines. Recent work on finite automata with advice includes the papers of Yamakami [8–11], Tadaki et al. [7], Freivalds [3], Küçük et al. [6], and Ďuriš et al. [2]. Today, there are many different models in the literature, partly because of the several options available for a machine to access its advice. However, all such models share some common properties. There is an advice function, which maps input lengths to advice strings and need not be computable. Advice strings are composed of characters from an advice alphabet. The machine has to use the same advice string when operating on inputs of the same length. We investigate the class of languages recognized by a machine when it consults some advice function with a bounded growth rate. We then vary that upper bound to see what happens to the aforementioned class. An advised automaton takes advantage of an advice string by reading the character under the advice head and choosing the appropriate transition from its transition function accordingly. So the same machine may recognize different languages using different advice functions.

We focus on the advice tape model introduced by Küçük et al. in [6]. Since that model becomes extremely powerful (able to recognize all languages) when allowed to use a 2-way input head, and is remarkably limited in the 1-way head case [2, Theorem 2], [6, Theorem 13], we examine a limited version of two-way input access.

Some common terminology used in this paper is as follows: n denotes the input length, M denotes an automaton, L denotes a language, h denotes an advice function, w denotes a string, $\Sigma$ denotes the input alphabet, $\Gamma$ denotes the advice alphabet, $\forall$ means any, ALL denotes the set of all languages, and $|w|_c$ denotes the number of occurrences of the character c in the string w.

Here are some definitions of concepts that will be used in our discussion:

Definition 1

[6]. Inline graphic.

Definition 2

[2, Definition 5]. Let Inline graphic be a family of relations Inline graphic for some Inline graphic such that Inline graphic, there is a Inline graphic such that Inline graphic and Inline graphic for some Inline graphic. Let Inline graphic be the language Inline graphic. We call Inline graphic a prefix-sensitive language for relation family R.

Definition 3

We call L a prefix-sensitive language iff there exists a relation family R such that Inline graphic is a prefix-sensitive language for relation family R.

Our Model

We defined this model and decided to work on it because it seems to provide a smooth passage from a one-way input head to a two-way input head. The name of the new model is circular deterministic finite automaton with advice tape (cdfat), which may have real-time or 1-way input and advice heads (four possible versions). Circular machines read their input circularly; that is, when the input endmarker has been seen and the next transition dictates that the machine move its input head to the right, the input head immediately returns to the beginning position. The advice head is not allowed to perform such a move.

Note that when restricted to a single pass over the input, this model is exactly the same as the standard deterministic finite automaton with advice tape model (except for its two-way input head version) introduced by Küçük et al. [6].

Definition

A circular deterministic finite automaton is a 9-tuple Inline graphic where

  • (i)

    Q is a finite set of internal states,

  • (ii)

    Inline graphic is a finite set of symbols called the input alphabet that does not contain the endmarker symbol, Inline graphic, such that Inline graphic and Inline graphic,

  • (iii)

Inline graphic is a finite set of symbols called the advice alphabet that does not contain the endmarker symbol, Inline graphic, such that Inline graphic and Inline graphic,

  • (iv)

Inline graphic represents the set of allowed input head movements, where S and R mean stay-put and right, respectively,

  • (v)

Inline graphic represents the set of allowed advice head movements, where S and R mean stay-put and right, respectively,

  • (vi)

    Inline graphic is the initial state on which the execution begins,

  • (vii)

    Inline graphic is the accept state on which the execution halts and accepts,

  • (viii)

    Inline graphic is the reject state on which the execution halts and rejects,

  • (ix)

Inline graphic is the transition function such that Inline graphic implies that when the automaton is in state Inline graphic and it scans Inline graphic on its input tape and Inline graphic on its advice tape, a transition occurs which changes the state of the automaton to Inline graphic, meanwhile moving the input and advice tape heads in the directions specified respectively by Inline graphic and Inline graphic.

A cdfat Inline graphic is said to accept (reject) a string Inline graphic with the help of an advice string Inline graphic if and only if M, when started in its initial state Inline graphic with Inline graphic on the input tape and Inline graphic on the advice tape and with both tape heads scanning the first symbols, reaches the accepting (rejecting) state, Inline graphic (Inline graphic), by changing states and moving the input and advice tape heads as specified by its transition function, Inline graphic.

A language L defined on the alphabet Inline graphic is said to be recognized by such a cdfat M with the help of an advice function Inline graphic if and only if

  • for every x in L, M accepts x with the help of Inline graphic, and

  • for every x not in L, M rejects x with the help of Inline graphic.

A language L is said to be recognized by a cdfat, M, using O(g(n))-length advice if there exists an advice function h with the following properties:

  • $|h(n)| \in O(g(n))$ for all n, and

  • M recognizes L with the help of h(n).

A language L is said to be recognized by a cdfat, M, using f(n) passes over the input if and only if, during the execution on any input of length n, transitions of the form Inline graphic are used at most f(n) times in total.

Note that a cdfat is not allowed to have a transition of the form Inline graphic; however, there can be transitions of the form Inline graphic. The input endmarker is there to inform the machine that the end of the input has been reached. Omitting it might yield a different model; for the sake of backward compatibility we continue to use it.

For notational purposes, Inline graphic denotes the set of languages recognized by a cdfat with real-time heads, Inline graphic-length advice, and f(n) passes. When a head is allowed to stay put on its tape, we use a different notation. For instance, Inline graphic denotes the set of languages recognized by a cdfat with a 1-way input head and a real-time advice head, using g(n)-length advice and f(n) passes.
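To make the definitions above concrete, the following is a minimal Python sketch of a cdfat run, written for this presentation; it is not part of the original paper. The dictionary encoding of the transition function, the function names, and the convention that an undefined transition rejects are all assumptions made for illustration.

```python
# Hypothetical sketch of a cdfat run (circular input head, one-way advice head).
# The paper defines the model abstractly as a 9-tuple; this encoding is only
# illustrative.  Loop detection is omitted for brevity.

END = "$"  # endmarker appended to both tapes; not part of either alphabet

def run_cdfat(delta, q0, q_acc, q_rej, x, advice, max_passes=None):
    """delta maps (state, input symbol, advice symbol) to
    (next state, input move, advice move), with moves in {"S", "R"}.
    Moving right while the input head is on the endmarker wraps the head
    back to position 0 and counts as one completed pass over the input.
    Returns (accepted, number of wrap-around transitions used)."""
    tape_in, tape_adv = x + END, advice + END
    q, i, j, passes = q0, 0, 0, 0
    while q not in (q_acc, q_rej):
        key = (q, tape_in[i], tape_adv[j])
        if key not in delta:        # undefined transition: reject (a convention of this sketch)
            return False, passes
        q, move_in, move_adv = delta[key]
        if move_in == "R":
            i += 1
            if i == len(tape_in):   # wrap around: one more pass
                i, passes = 0, passes + 1
                if max_passes is not None and passes > max_passes:
                    return False, passes
        if move_adv == "R":
            j = min(j + 1, len(tape_adv) - 1)  # the advice head never wraps
    return q == q_acc, passes
```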

Results

Theorem 1

A language L is prefix-sensitive if and only if for all Inline graphic, there exists Inline graphic such that Inline graphic has Inline graphic equivalence classes.

Proof

Assume that for some language L it holds that for all Inline graphic, there exists Inline graphic such that Inline graphic has Inline graphic equivalence classes. Let f be a function which maps any Inline graphic to an n such that Inline graphic has Inline graphic equivalence classes. Define an infinite family of relations Inline graphic such that Inline graphic and for all Inline graphic and all Inline graphic, Inline graphic. It holds that Inline graphic, there is a Inline graphic such that Inline graphic and Inline graphic for some Inline graphic. This is because if there were no such y for some Inline graphic and Inline graphic, then Inline graphic would hold and the number of equivalence classes would not be Inline graphic. According to Definition 2, we conclude that L is prefix-sensitive.

For the other direction, let L be a prefix-sensitive language. According to Definition 2, Inline graphic where Inline graphic is a function and Inline graphic is an infinite sequence of relations such that Inline graphic and Inline graphic, there is a Inline graphic such that Inline graphic and Inline graphic for some Inline graphic. It holds that for all Inline graphic, Inline graphic has Inline graphic equivalence classes. This is because if the number of equivalence classes of Inline graphic were less than Inline graphic for some k, there would be two strings Inline graphic and Inline graphic such that Inline graphic, which would imply that there is no y of length f(k) such that Inline graphic and Inline graphic for some Inline graphic.    Inline graphic

Theorem 2

Inline graphic.

Proof

Let Inline graphic where each Inline graphic is a distinct input word of length n and each Inline graphic is either the accept or the reject symbol. Devise a machine M that tries to match the input word against the advice, character by character, in real-time execution. If a mismatch occurs while trying to match the input word, machine M advances its input head until it is at the beginning position again. Note that the advice head will be at the first character of the next word on the advice at the end of this process. Then M tries to match the next word, and so on. At some point the matching ends with success; that is, machine M sees the input endmarker without having encountered a mismatch. At that point it accepts or rejects the string depending on which Inline graphic character it is seeing on the advice.   Inline graphic
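The advice layout and the matching strategy of this proof can be sketched as follows in Python. The helper names, the verdict characters 'A' and 'R', and the set-based representation of the language are illustrative assumptions, not notation from the paper.

```python
from itertools import product

def advice_for_length(lang, sigma, n):
    """List every word of length n over sigma, each followed by a one-character
    verdict: 'A' if the word is in the language, 'R' otherwise."""
    return "".join(
        "".join(w) + ("A" if "".join(w) in lang else "R")
        for w in product(sigma, repeat=n)
    )

def recognize_with_advice(x, advice):
    """Scan the advice block by block (each block holds one word plus its
    verdict, i.e. len(x)+1 characters) until the block whose word matches x,
    then answer according to that block's verdict."""
    n = len(x)
    for start in range(0, len(advice), n + 1):
        block = advice[start:start + n + 1]
        if block[:n] == x:
            return block[n] == "A"
    return False

# Example: advice = advice_for_length({"01", "10"}, "01", 2)
#          recognize_with_advice("10", advice)  -> True
```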

Theorem 3

For any function Inline graphic, Inline graphic.

Proof

The idea is that for any given machine M, one can devise a new machine Inline graphic such that Inline graphic uses k times fewer passes than M for all n and for an arbitrary Inline graphic, and still recognizes the same language with the help of some other advice function. Let us group the passes of machine M so that the Inline graphic group consists of passes Inline graphic through ik. With a single pass, machine Inline graphic simulates a group of k passes of M: the first pass simulates the first group, the second pass simulates the second group, and so on. Since Inline graphic cannot know which state to begin a pass with without knowing the result of the previous one, it simulates all possibilities and remembers the final state of the previous group of passes, again using its states. Therefore the size of the state set of Inline graphic is Inline graphic, where s is the number of states of M.

The new advice function Inline graphic is a compressed version of the old one. Let Inline graphic be the new advice alphabet, whose symbols represent the k-tuples of symbols of Inline graphic. Inline graphic holds. Let Inline graphic for all Inline graphic. Note that without loss of generality we assume that |h(n)| is an integer multiple of k. We prepare the new advice so that its first portion encodes the advice that M consumes during the first group of k passes, its second portion encodes the advice consumed during the second group, and so on.   Inline graphic
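The per-position bundling of k passes' worth of advice can be sketched as below; representing the old advice as one explicit segment per pass, and rendering the symbols of the enlarged alphabet as Python tuples, are assumptions of this illustration.

```python
def compress_advice(segments, k):
    """segments: equal-length advice segments, one per pass of the original
    machine M.  Every k consecutive segments are merged position by position,
    so each symbol of the compressed advice is the k-tuple of characters that
    M would read at the same step of the k passes in one group."""
    assert segments and len(segments) % k == 0
    assert len({len(s) for s in segments}) == 1
    compressed = []
    for g in range(0, len(segments), k):          # one group of k passes
        group = segments[g:g + k]
        for pos in range(len(group[0])):          # one merged symbol per step
            compressed.append(tuple(seg[pos] for seg in group))
    return compressed
```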

Theorem 4

For any function Inline graphic, Inline graphic.

Proof

Let Inline graphic be machines recognizing Inline graphic with the help of advice functions Inline graphic, respectively. Let Inline graphic be the machine which is claimed to recognize the concatenation language Inline graphic with the help of the advice function Inline graphic. The idea is to guess the words Inline graphic and Inline graphic such that Inline graphic is the input word. Machine Inline graphic does not know where to divide the input, so it simply tries all the possibilities. We replace the advice characters whose locations correspond to the last character of the first portion of the input with their marked versions in order to inform the machine Inline graphic.

In the first pass over the input, machine Inline graphic first simulates Inline graphic on the first portion of the input and stores the last internal state of that execution. Then it simulates Inline graphic on the rest of the input and stores the last state of that execution as well. It then begins the second pass, simulating Inline graphic again, but this time starting from the last saved state of that thread; when the pass completes, Inline graphic updates the last state of the thread, and so on. Throughout the execution of Inline graphic, two separate threads of execution are simulated at the same time. At the end of at most f(n) passes, if both threads end by accepting their respective sub-inputs, Inline graphic accepts the input. Otherwise, Inline graphic continues the computation with a different division of the input. Note that, given an input word of length n, there are Inline graphic different pairs of words whose concatenation is the input word. At the end of at most Inline graphic passes, if no division works, Inline graphic rejects the input. According to Theorem 3, only the asymptotic growth rate of the number of passes matters, so nf(n) passes can do the same job.

Note that we should double the old advice alphabet size and introduce marked versions of the old symbols in order to mark the position of the input separation on the advice Inline graphic. Also note that the advice string Inline graphic will be an interleaved version of Inline graphic and Inline graphic, concatenated for all Inline graphic.   Inline graphic
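One possible way to assemble such an advice string is sketched below. The paper does not spell out the exact interleaving, so the block separator '#', the rendering of a marked symbol as the symbol followed by an apostrophe, and the per-split layout are assumptions of this sketch; in the actual construction a marked symbol would be a single character from a doubled alphabet.

```python
def concat_advice(h1, h2, n, mark="'", sep="#"):
    """h1, h2: advice functions (length -> string) of the two component machines.
    For every split position i of an input of length n, pair the advice of M1
    for the length-i prefix with the advice of M2 for the length-(n-i) suffix,
    marking the advice position that corresponds to the split point."""
    blocks = []
    for i in range(n + 1):
        a, b = h1(i), h2(n - i)
        a_marked = (a[:-1] + a[-1] + mark) if a else mark   # mark the split point
        blocks.append(a_marked + b)
    return sep.join(blocks)
```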

Corollary 1

Inline graphic is closed under concatenation.

Lemma 1

Let Inline graphic. Then for all n and for all k smaller than n, Inline graphic has Inline graphic equivalence classes.

Proof

Let Inline graphic and let Inline graphic be the set of all states except the accept and reject states. Let Inline graphic be a mapping which maps the internal state of the machine when the input head is for the first time on the first character of w and the advice head is at the Inline graphic position, to the internal state of the machine when the input head is for the first time on the first character right after w. Besides its parameters w and i, this mapping depends on the advice content and the transition function of the machine. Here we consider a single machine working on inputs of the same length n; therefore the mapping depends only on its parameters.

Consider the execution of a real-time circular machine on two different inputs of length n, namely Inline graphic and Inline graphic. If we can find two words Inline graphic and Inline graphic such that Inline graphic for all Inline graphic, then the two inputs must have the same fate for all z.

Given a single i, there are fewer than Inline graphic distinct functions Inline graphic. Considering all f(n) such functions for a word w, there are fewer than Inline graphic different possibilities. If the number of equivalence classes of the relation Inline graphic were greater than Inline graphic for some k and n, there would be two words Inline graphic and Inline graphic in different equivalence classes with exactly the same mappings. This is a contradiction.    Inline graphic

Theorem 5

Let Inline graphic and Inline graphic. Then Inline graphic.

Proof

Consider the language family Inline graphic. Note that Inline graphic is assumed to be a non-decreasing function and the input length Inline graphic. Inputs consist of repetitions of a substring w. Define Inline graphic for all Inline graphic. Depending on the choice of Inline graphic, Inline graphic. We will give three lemmas: two of them establish a hierarchy for the range Inline graphic, and the last one places Inline graphic within it.

Lemma 2

Inline graphic.

Since, given the input length n and the function Inline graphic, we can deduce the period of the input, we can check one position of the repeating substring w in each pass. Therefore our machine will need Inline graphic many passes.

The advice strings are of the form (parentheses are meta-characters),

[Display formula not recovered: a 0/1 advice string whose 1s mark the input positions to be compared during each pass.]

Our machine first searches for the first 1 on the advice tape; when it is found, the machine saves the corresponding input character in its states and continues searching for the next 1. When it sees the next 1 it checks the corresponding input character against the one it saved before. If they mismatch, the input is rejected. The machine then continues searching for 1s and performs the same check until the end of the first pass. It then starts the second pass and repeats the same procedure, this time checking the equality of the next character position within the substring w. If the advice endmarker is reached, the input is accepted.
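The check performed in each pass can be sketched as follows; the parameter p stands for the period deduced from the input length, and the helper names are assumptions of this illustration.

```python
def pass_advice(n, p, j):
    """0/1 advice segment for pass j: a 1 at every input position congruent to j mod p."""
    return "".join("1" if i % p == j else "0" for i in range(n))

def is_periodic(x, p):
    """Simulate the p passes: in pass j, remember the first marked character and
    compare every later marked character against it, rejecting on a mismatch.
    Accept if every pass finishes without a mismatch."""
    n = len(x)
    for j in range(p):
        saved = None
        for i, bit in enumerate(pass_advice(n, p, j)):
            if bit == "1":
                if saved is None:
                    saved = x[i]          # first marked character of this pass
                elif x[i] != saved:
                    return False          # mismatch: reject
    return True

# is_periodic("abcabcabc", 3) -> True;  is_periodic("abcabd", 3) -> False
```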

Lemma 3

Inline graphic.

Observe that any Inline graphic is prefix-sensitive. Thinking of each word as the concatenation of the first period w and the rest, in other words selecting k to be Inline graphic for all n, Inline graphic has Inline graphic equivalence classes. According to Lemma 1, Inline graphic.

Lemma 4

No prefix-sensitive language is in Inline graphic.

According to Lemma 1, for any language Inline graphic, Inline graphic has Inline graphic equivalence classes. Therefore, according to Theorem 1, L is not prefix-sensitive.   Inline graphic

Theorem 6

Let Inline graphic and Inline graphic. Then Inline graphic.

Proof

Let Inline graphic, Inline graphic be machines recognizing languages Inline graphic, Inline graphic with the help of advice functions Inline graphic and Inline graphic respectively. Devise a new advice function,

[Display formula not recovered: the new advice for length n consists of the first machine's advice, then the fresh separator character, then the second machine's advice.]

for all n, where Inline graphic is a brand-new advice character that occurs nowhere else. Let Inline graphic be the machine recognizing the union language with the help of Inline graphic. Machine Inline graphic first simulates Inline graphic, and during this simulation it treats the Inline graphic character on the advice as an endmarker. When this simulation ends, which may take at most f(n) passes over the input, Inline graphic stores the result in its states and starts simulating Inline graphic after adjusting its heads to the proper positions, that is, the input head to the beginning and the advice head to the character right after Inline graphic. After at most Inline graphic passes over the input, it completes the execution and stores the result in its states. In this way it may end up in one of four different states, corresponding to the four possible acceptance outcomes of Inline graphic and Inline graphic. By merging some of those states into the accept state and the rest into the reject state, the union, intersection, or difference of Inline graphic and Inline graphic can all be recognized.   Inline graphic
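The advice combination and the sequential simulation can be sketched as follows; the separator character '#' and the callables run_m1 / run_m2 standing for the simulations of the two component machines are assumptions of this illustration.

```python
SEP = "#"  # stands for the brand-new advice character of the proof

def combined_advice(h1, h2, n):
    """Advice of the first machine, the fresh separator, then the advice of the second."""
    return h1(n) + SEP + h2(n)

def run_boolean_combination(x, advice, run_m1, run_m2, combine):
    """run_m1 / run_m2: callables (input, advice segment) -> bool.
    The first simulation treats the separator as its endmarker; the second
    starts right after it.  combine merges the two verdicts."""
    left, right = advice.split(SEP, 1)
    r1 = run_m1(x, left)      # at most f(n) passes
    r2 = run_m2(x, right)     # at most f(n) further passes
    return combine(r1, r2)

# Union:        run_boolean_combination(x, adv, m1, m2, lambda a, b: a or b)
# Intersection: run_boolean_combination(x, adv, m1, m2, lambda a, b: a and b)
```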

Corollary 2

Inline graphic is closed under union and intersection.

Theorem 7

For any function Inline graphic, Inline graphic.

Proof

The proof is an easy modification of the proof given by Ďuriš et al. for [2, Theorem 3].    Inline graphic

Theorem 8

For any function Inline graphic, Inline graphic.

Proof

Consider the execution of an s-state cdfat with one-way heads. While the advice head stays put, passing over the input more than s times forces the machine to enter an infinite loop, since some state must repeat at the beginning of a pass. Thus, a machine must advance its advice head before that threshold is reached. Therefore at most sg(n) passes are possible in an execution that eventually halts.   Inline graphic

Theorem 9

For any functions Inline graphic, Inline graphic.

Proof

It is possible to simulate a one-way advice head with a real-time advice head using additional advice. The idea is to replicate each advice character Inline graphic times and to use separator characters Inline graphic to mark the transition locations. That is, for all n,

[Display formula not recovered: the new advice repeats each character of h(n) the required number of times, inserts a separator character after each block of repetitions, and ends with repetitions of the stand-in endmarker character.]

where Inline graphic for all Inline graphic and Inline graphic is a new advice character which stands in for the endmarker (the advice is not allowed to contain more than one real endmarker character). When the new machine reads Inline graphic on Inline graphic, it behaves exactly as the old machine does when it sees the endmarker on h.

Whenever the old machine keeps its advice head stationary, the new machine moves it right one step. Whenever the old machine moves its advice head one step, the new machine enters a subprogram that takes the advice head to the next Inline graphic character.

This trick works because a cdfat with one-way heads must advance its advice head within Inline graphic computational steps. This is because, without loss of generality, we can assume that at least one head moves in each step, and of course the input head can move at most Inline graphic times in an execution.   Inline graphic
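The padding of the advice can be sketched as follows; the separator and stand-in endmarker characters, and leaving the replication count as an explicit parameter (the paper derives a concrete bound for it), are assumptions of this illustration.

```python
def realtime_advice(h_n, reps, sep="|", fake_end="&"):
    """h_n: the old advice string for some input length.  Each character is
    replicated reps times and followed by a separator, so a machine whose
    advice head must move at every step can dwell on the 'same' character
    for as long as the old machine could have stayed put.  The string ends
    with replicated copies of the stand-in endmarker character."""
    padded = "".join(c * reps + sep for c in h_n)
    return padded + fake_end * reps
```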

Corollary 3

For any function Inline graphic, Inline graphic.

It is already known that the dfat with a 2-way input head is equal in power to the prefix advice model when provided with constant advice [5, Theorem 3.8]. Since our model is sandwiched between the 2-way input model and the prefix advice model in terms of power, we deduce that Inline graphic for all Inline graphic. Therefore an interesting question to ask is: what is the minimum advice for which more passes over the input enlarge the class of recognized languages? Küçük et al. showed that, when provided with polynomial advice, a 2-way input head is more powerful than a 1-way head [6, Theorem 14]. We prove a stronger result and give a definitive answer to the aforementioned question: it turns out that even 2 passes over the input are more powerful than a single pass when the machine is provided with increasing advice.

Theorem 10

Let Inline graphic be any function in Inline graphic. Then Inline graphic.

Proof

Consider the language family Inline graphic. The following two lemmas establish the proof.

Lemma 5

Inline graphic.

Proof

Küçük et al. proved that for any advice length function f, if Inline graphic, then for all n and all Inline graphic, Inline graphic has O(f(n)) equivalence classes [6, Lemma 6]. It can be shown that for all n, there exists Inline graphic such that Inline graphic has Inline graphic equivalence classes. Since Inline graphic, we conclude that Inline graphic.

Lemma 6

Inline graphic.

Proof

The idea is to devise a machine which counts the character 1 in the first pass and the character 2 in the second pass. Let Inline graphic and Inline graphic. Observe that Inline graphic or Inline graphic can easily be recognized by a cdfat with a single pass. In order to recognize Inline graphic, for instance, let Inline graphic be the advice function, and consider a machine which keeps its advice head stationary when it sees a character other than 1 on its input and advances its advice head when it sees a 1 on the input. It accepts a string iff both endmarkers are read at the same time. Inline graphic can be recognized similarly. Since Inline graphic, according to Theorem 6, Inline graphic.    Inline graphic
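The single-pass counting machines and their two-pass combination can be sketched as follows. Taking the advice to be a block of f(n) zeros and taking the language to require both counts to equal f(n) is an illustrative reading of the construction; the exact counts required by the language family may differ.

```python
def count_matches(x, advice, target_char):
    """Advance the advice head only on target_char; accept iff the advice
    endmarker and the input endmarker are reached at the same moment,
    i.e. iff x contains exactly len(advice) occurrences of target_char."""
    j = 0
    for c in x:
        if c == target_char:
            j += 1
            if j > len(advice):        # advice exhausted too early
                return False
    return j == len(advice)

def two_pass_recognizer(x, f_n):
    """First pass counts 1s, second pass counts 2s, each against advice 0^{f(n)}."""
    advice = "0" * f_n
    return count_matches(x, advice, "1") and count_matches(x, advice, "2")
```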

Lemma 7

Let Inline graphic. Then for all n and for all k smaller than n, Inline graphic has Inline graphic equivalence classes.

Proof

Define a configuration Inline graphic of a machine to be a pair consisting of an internal state and an advice position. Define a non-stopping configuration Inline graphic of a machine to be any configuration where q is a state other than the accept and reject states. Let Inline graphic be the set of all non-stopping configurations for a machine and for input length n (Inline graphic). Without loss of generality, assume our machines always end their execution when the input head is on the endmarker. Let w be a substring of the input (not containing the endmarker) and let Inline graphic be a mapping which maps the configuration of the machine when the first character of the word w is read for the first time on the input tape to the configuration of the machine when the character right after the word w is read for the first time on the input tape. The function Inline graphic depends on the transition function of the machine, the specific word w being processed, and the advice content. We focus on a single machine and inputs of the same length n; therefore, in our case, Inline graphic depends only on w.

Consider the execution of a circular machine on two different inputs of length n, namely Inline graphic and Inline graphic. Both executions start at the initial configuration, and after each pass they continue from a new configuration. If we can find two words Inline graphic and Inline graphic such that Inline graphic, then the two inputs Inline graphic and Inline graphic must have the same fate for all z.

There are fewer than Inline graphic distinct functions Inline graphic. If the number of equivalence classes of Inline graphic were greater than Inline graphic for some k and n, there would be two words Inline graphic in two different equivalence classes with the same mapping. This is a contradiction.    Inline graphic
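The substring mapping used in this counting argument can be made explicit with the transition-function encoding from the earlier simulator sketch; the function below is an illustrative assumption and presumes the machine never dwells on one input character forever.

```python
def substring_map(delta, tape_adv, w, start_conf):
    """Map a non-stopping configuration (state, advice position) held when the
    first character of w is about to be read to the configuration held when
    the character right after w is about to be read."""
    q, j = start_conf
    for c in w:
        while True:                            # steps spent on this input character
            q, move_in, move_adv = delta[(q, c, tape_adv[j])]
            if move_adv == "R":
                j = min(j + 1, len(tape_adv) - 1)
            if move_in == "R":
                break                          # input head moves on to the next character
    return q, j
```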

Theorem 11

Let Inline graphic. Then the classes Inline graphic and Inline graphic are incomparable.

Proof

Recall that Inline graphic is nothing but our way of denoting the class of languages recognized by the model introduced by Küçük et al. in [6] when given access to unlimited advice. According to Theorem 5, a prefix-sensitive language is in Inline graphic no matter how slowly f(n) grows. However, we know from Ďuriš et al. [2, Theorem 2] that no prefix-sensitive language is in Inline graphic. Therefore Inline graphic.

On the other hand, as stated in the proof of Lemma 6, the language Inline graphic can easily be recognized by a machine with one-way heads, given access to Inline graphic-length advice. It is easy to see that for all n, there exists k such that Inline graphic has Inline graphic equivalence classes. When Inline graphic is selected to be linear in n, according to Lemma 1, Inline graphic. Therefore Inline graphic.    Inline graphic

An interesting question to ask is the minimum amount of advice or number of passes needed for the model to recognize any language. We can show some lower bounds using Lemmas 1 and 7. PAL denotes the language of even-length palindromes.

Corollary 4

Inline graphic.

Corollary 5

Inline graphic.

Conclusions and Open Questions

We showed that a cdfat with real-time heads can utilize up to linearly many passes over the input, and that with exponentially many passes the real-time machine can recognize any language. However, we do not know whether the machine can utilize more than linearly many passes. There may be a clever algorithm for recognizing any language with linearly many passes.

We showed that even the most powerful version of the cdfat, that is, the one having one-way input and advice heads, cannot recognize some languages when it is not provided with enough advice (a nearly linear bound). However, we are not aware of an algorithm for this machine that uses less than exponential resources to recognize an arbitrary language. It would be nice to know the minimum amount of resources needed to recognize any language.

We compared the class of languages recognized by a single-pass deterministic finite automaton with one-way heads and unlimited advice with the growing class of languages recognized by a real-time cdfat as more passes over the input are allowed. Since the former class is larger than the latter when only a constant number of passes is allowed, and the reverse is true when exponentially many passes are allowed, we wondered how that growth takes place and whether there is a pass limit at which the two classes become equal. It turned out that this is not the case: as long as the allowed pass limit is non-constant and sub-logarithmic, the two classes are not subsets of each other. However, we do not know exactly when the latter class comes to encompass the former.

Acknowledgements

Thanks to Prof. Cem Say, who helped edit the paper and correct my mistakes, and to my family for their constant support and love.

Contributor Information

Alberto Leporati, Email: alberto.leporati@unimib.it.

Carlos Martín-Vide, Email: carlos.martin@urv.cat.

Dana Shapira, Email: shapird@g.ariel.ac.il.

Claudio Zandron, Email: zandron@disco.unimib.it.

Ahmet Bilal Uçan, Email: ahmet.ucan@boun.edu.tr.

References

  • 1.Damm C, Holzer M. Automata that take advice. In: Wiedermann J, Hájek P, editors. Mathematical Foundations of Computer Science 1995. Heidelberg: Springer; 1995. pp. 149–158.
  • 2.Ďuriš P, Korbaš R, Královič R, Královič R. Determinism and nondeterminism in finite automata with advice. In: Böckenhauer H-J, Komm D, Unger W, editors. Adventures Between Lower Bounds and Higher Altitudes: Essays Dedicated to Juraj Hromkovič on the Occasion of His 60th Birthday. Cham: Springer; 2018. pp. 3–16.
  • 3.Freivalds R. Amount of nonconstructivity in deterministic finite automata. Theoret. Comput. Sci. 2010;411(38–39):3436–3443. doi: 10.1016/j.tcs.2010.05.038.
  • 4.Karp R, Lipton R. Turing machines that take advice. Enseign. Math. 1982;28:191–209.
  • 5.Küçük U. Finite and small-space automata with advice. Ph.D. thesis, Boğaziçi University; 2018.
  • 6.Küçük U, Say ACC, Yakaryılmaz A. Finite automata with advice tapes. In: Béal M-P, Carton O, editors. Developments in Language Theory. Heidelberg: Springer; 2013. pp. 301–312.
  • 7.Tadaki K, Yamakami T, Lin JCH. Theory of one-tape linear-time Turing machines. Theoret. Comput. Sci. 2010;411(1):22–43. doi: 10.1016/j.tcs.2009.08.031.
  • 8.Yamakami T. Swapping lemmas for regular and context-free languages with advice. CoRR abs/0808.4122; 2008. http://arxiv.org/abs/0808.4122
  • 9.Yamakami T. The roles of advice to one-tape linear-time Turing machines and finite automata. Int. J. Found. Comput. Sci. 2010;21(6):941–962. doi: 10.1142/S0129054110007659.
  • 10.Yamakami T. Immunity and pseudorandomness of context-free languages. Theoret. Comput. Sci. 2011;412(45):6432–6450. doi: 10.1016/j.tcs.2011.07.013.
  • 11.Yamakami T. One-way reversible and quantum finite automata with advice. In: Dediu A-H, Martín-Vide C, editors. Language and Automata Theory and Applications. Heidelberg: Springer; 2012. pp. 526–537.
