Abstract. A programming language designed for studies of parallelism and based on Wagner's uniformly reflexive structures is introduced. The measure of depth of computation in the language is studied. The partial recursive functions are shown to be computable in uniformly bounded depth. A comparison of the measure with other proposed measures of computational complexity leads to the suggestion of a list of properties to be checked in classifying such measures.
Abstract. The author's forthcoming book proves central results in computability and complexity theory ...
This research is about operational- and complexity-oriented aspects of classical foundations of com-...
We describe three orthogonal complexity measures: parallel time, amount of hardware, and degree of n...
Abstract. We introduce Computational Depth, a measure for the amount of “nonrandom” or “useful” inform...
Torturing an uninformed witness cannot give information about the crime. Leonid Levin [Lev84] Abstra...
Depth of an object concerns a tradeoff between computation time and excess of program length over th...
Depth of an object concerns a tradeoff between computation time and excess of program length over t...
What can we compute--even with unlimited resources? Is everything within reach? Or are computations ...
Abstract. In the 1980s, Bennett introduced computational depth as a formal measure of the amount of co...
Depth is a complexity measure for natural systems of the kind studied in statistical physics and is ...
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mathematics, 1972. Vita. Bibliography...
This paper investigates Bennett's notions of strong and weak computational depth (also called log...
Abstract. This paper reviews and investigates Bennett's notions of strong and weak computational depth...
In this paper, the methods of recursive function theory are used to study the size (or cost or compl...
Abstract. The computation of Boolean functions by parallel computers with shared memory (PRAMs and WRA...