135 Cards in this Set
- Front
- Back
What is a computer that is designed to perform just one task?
|
Fixed Program Computer
|
|
What is a computer that is flexible enough to be re-programmed?
|
Von Neumann Architecture
|
|
How was an ENIAC computer programmed?
|
Through toggling switches and cables.
|
|
What are four downsides to assembly language?
|
It is machine dependent; in other words, it is not portable. There is no error checking. It is also hard to read, and tedious to write. There is a huge learning barrier.
|
|
The terms “universal” and “implementable” describe what concept in this class?
|
A Programming Language.
|
|
What does “universal” mean?
|
Any computable function can be defined in this language.
|
|
What does “implementable” mean?
|
It can be implemented on existing hardware platforms.
|
|
The “Do what I mean” vs. “Do what I say” tradeoff scheme applies to what two concepts?
|
Declarative and Imperative languages.
|
|
In terms of expressing an algorithm, is declarative or imperative favorable?
|
Declarative. Imperative requires us to provide the base primitives to actually build the algorithm.
|
|
In terms of learning curve, is declarative or imperative favorable?
|
Declarative has a shallow learning curve, as it is problem-domain specific.
|
|
In terms of performance, is declarative or imperative favorable?
|
Imperative, as it is close to the machine and is mostly efficient.
|
|
What are some subsets of declarative languages?
|
Functional, Logic/Constraint Based, and Dataflow.
|
|
What are some subsets of imperative languages?
|
Procedural/Von Neumann, Object-Oriented, Scripting
|
|
What sort of language is Prolog?
|
Logic/Constraint Based Declarative Language.
|
|
What sort of language is Id or Val?
|
Dataflow Declarative Language
|
|
What sort of language is Perl or Python?
|
Scripting Imperative
|
|
What sort of language is C++ or Java?
|
Object-Oriented Imperative
|
|
What sort of language is LISP/Scheme?
|
Functional Declarative
|
|
What sort of language is Pascal?
|
Procedural Imperative
|
|
What sort of language is Fortran or C?
|
Procedural Imperative
|
|
What sort of language is Shell?
|
Scripting Imperative
|
|
What sort of languages are modeled on manipulating data, generally through a sequential computation of statements that directly manipulate data in memory?
|
Procedural Imperative
|
|
What sort of languages are modeled as interactions between objects that collaborate with each other?
|
Object-Oriented
|
|
What sort of languages focus on developer productivity and are often used for integration of components?
|
Scripting Imperative
|
|
What sort of languages are mathematically inspired in that programs are defined in terms of mathematical functions (equivalences)?
|
Functional Declarative
|
|
Is there a concept of memory or state in functional languages?
|
No. Functions simply map values onto values. Control flow is implicit and based on recursion.
|
|
What sort of languages are inspired by propositional logic?
|
Logic and constraint-based Declarative
|
|
If all languages we have talked about are Turing Complete, what are some characteristics of a good language?
|
Readability, Expressivity, Writability, Reliability
|
|
When is translation cost amortized over many runs?
|
This is an advantage of compiling over interpretation.
|
|
When are extensive and slow optimizations possible?
|
This is an advantage of compiling over interpretation.
|
|
When are run-time errors harder to diagnose?
|
This is a disadvantage of compiling over interpretation.
|
|
Are good compilers easy to build or hard to build?
|
They are hard to build.
|
|
What is efficient execution?
|
When there are no translation costs at run-time; this is an advantage of compiling over interpreting.
|
|
Can compiling produce invalid machine code?
|
No; however, an assembler can. Basically, an assembler performs no/few checks beyond syntactic correctness.
|
|
Can an assembler type-check?
|
No, however a compiler can.
|
|
A (X) is capable of complex code generation while a (Y) performs no transformations.
|
X = compiler. Y = assembler.
|
|
What is translation during execution called?
|
Interpretation
|
|
On how many inputs does an interpreter work?
|
Two; the program and the actual input.
|
|
When is there excellent debugging and checking?
|
This is an advantage of interpretation versus compilation. The source code is known when an error occurs, and both inputs are available (so it is easier to check types).
|
|
Can an interpreter generate and evaluate new code at runtime?
|
Yes.
|
|
When does translation occur many times, thus leading to redundant work?
|
This is a disadvantage of interpretation versus compilation.
|
|
When does translation cost occur at runtime, thus making it inefficient?
|
This is a disadvantage of interpretation versus compilation.
|
|
What are two ways to mix compilation and interpretation?
|
Implicit Compilation, and Explicit Compilation.
|
|
What is implicit compilation?
|
Compilation occurs “behind the scenes”; the tool appears to the user as an interpreter (e.g. Python).
|
|
What is explicit compilation?
|
Separate compilation step; the user is aware of the byte code (e.g. Java). Compile high-level code to low-level byte code; interpret the much simpler byte code.
|
|
When is an interpreter easier to optimize?
|
When it is a low-level interpreter (i.e. you are mixing compilation and interpreting).
|
|
What is an advantage of implicit compilation?
|
There is flexibility like an interpreter.
|
|
What is an advantage of explicit compilation?
|
Source code is not revealed (though byte code is easier to decompile than machine code).
|
|
What is it called when you modify source code before it is passed to actual compiler or interpreter?
|
Preprocessing.
|
|
Is a pre-processor the same as a source-to-source compiler?
|
No. A compiler will never return incorrect code, whereas a preprocessor might; a preprocessor performs no inspection of semantic correctness.
|
|
How do you know if a language is compilable?
|
If most of its checks can be done at compile time. Usually a language is designed with either compilation or interpretation in mind.
|
|
Given a new hardware platform, how do you obtain a compiler?
|
You either bootstrap (build it from scratch), or cross-compile if you already have a working compiler.
|
|
So how should one go about cross-compiling to many many machines?
|
Instead of writing an any→target compiler for every machine, write one any→virtual-machine compiler. (You still have to implement the virtual machine by hand on each target.) Thus you combine a virtual machine with bootstrapping.
|
|
What is the optimization phase split into?
|
Machine-independent optimization, and Machine-dependent optimization.
|
|
How does optimization transform code?
|
It transforms code to improve performance, but doesn't touch semantics.
|
|
What sort of machine-independent optimization reduces subroutine call overhead?
|
Inlining of (short) subroutines
|
|
What sort of machine-independent optimization reduces unnecessary memory access?
|
Store-load pair elimination
|
|
What sort of machine-independent optimization avoids jump-to-jump?
|
Jump coalescing
|
|
How can you avoid doing machine-dependent optimizations?
|
By compiling to C code instead of assembly code.
|
|
What sort of machine-dependent optimization moves failure cases out of the hot path?
|
Branch-prediction-friendly code layout
|
|
What sort of machine-dependent optimization avoids spill code (minimizes stores/loads)?
|
Clever register allocation
|
|
What phase discovers the meaning of a program by creating an abstract syntax tree without “extraneous” tokens?
|
Semantic Analysis
|
|
What are some features of semantic analysis enabled by the fact that it uses a symbol table?
|
Enforce identifiers are declared before use, subroutines provide correct number of arguments.
|
|
What is the step after semantic analysis?
|
Intermediate code generation.
|
|
What are some characteristics of IFs (intermediate forms)?
|
Machine independent, compact, ease of optimization. They often resemble machine code for an imaginary machine.
|
|
Why have a separate lexical analysis phase?
|
It is easier (and more efficient) to define syntax rules in terms of tokens. In other words, it greatly simplifies the subsequent syntax analysis.
|
|
What makes a DFA deterministic?
|
For every state and for each input, there exists only one transition.
|
|
How do you know if two DFA states are equivalent?
|
If, for each input element, they lead to the same partition.
|
|
What is the parse tree sometimes called?
|
Concrete syntax tree
|
|
What are some characteristics of BNF notation?
|
<expr> ::= id | <expr>
|
|
What is EBNF?
|
An ISO standard that extends BNF to simplify grammars:
- Use = instead of ::=
- Use “,” for concatenation
- [A] = 0 or 1 times
- {A} = 0 or more times
|
|
What makes a program syntactically correct?
|
If it can be derived from the start symbol.
|
|
What is a left most derivation?
|
If we always choose the left-most non-terminal symbol
|
|
What is the big O for an arbitrary CFG without any lookahead?
|
O(n^3)
|
|
What are two types of restricted grammars that can be parsed in linear time, i.e. O(n)?
|
LL: “Left to right, Left-most derivation”, i.e. the class of grammars for which a left-most derivation always results in a parse tree.
LR: “Left to right, Right-most derivation”.
|
|
Are LL or LR parsers considered “natural”?
|
LR Parsers
|
|
Where do LL parsers create parse-tree from?
|
From beginning at the root, top-down
|
|
Where do LR parsers create parse-tree from?
|
From beginning at leaves, bottom-up
|
|
Which parsers are called predictive parsers?
|
LL Parsers
|
|
What does introducing a name do?
|
It creates an abstraction
|
|
What does binding a name to an entity do?
|
It resolves an abstraction
|
|
Do static objects have a fixed address?
|
Yes the address is fixed.
|
|
When are static objects deallocated?
|
Deallocated at program termination
|
|
What are some examples of static objects?
|
Program code, constants, and global variables (both initialized and uninitialized).
For uninitialized variables, the value is computed at runtime; the compiler then disallows subsequent updates.
|
|
When is there no allocation/deallocation runtime overhead?
|
Using static allocation of global variables. Advantage
|
|
When is allocation size-fixed and possible wasteful?
|
Using static allocation of global variables. Disadvantage.
|
|
When can compiler optimize addresses?
|
Using static allocation of global variables. Advantage
|
|
What objects are allocated Last in First Out?
|
Stack objects
|
|
When can the maximum size of stack be adjusted?
|
At runtime
|
|
What sort of memory structure is essential for subroutine calls?
|
Runtime stack
|
|
How do you allocate memory for subroutine calls on demand?
|
You use a runtime stack; allocate a new frame on each call; deallocate it on return.
|
|
What do we call the code generated just before the call to a subroutine?
|
Setup
|
|
What do we call the code generated before the subroutine body executes?
|
Prologue
|
|
What do we call the code generated after the subroutine body completes?
|
Epilogue
|
|
What do we call the code generated right after the subroutine call?
|
Cleanup
|
|
When are offset computations required at runtime (downside)?
|
When using a stack frame
|
|
When are heap objects allocated and deallocated?
|
At arbitrary times.
|
|
How do you allocate on the heap in C++?
|
You use keyword “new”
|
|
How do you deallocate from the heap in C++?
|
You use keyword “delete”
|
|
What is internal fragmentation?
|
Occurs when a storage-management algorithm allocates a block that is larger than required to hold a given object; the extra space is then unused.
|
|
What is external fragmentation?
|
Occurs when blocks assigned to active objects are scattered across the heap in such a way that the remaining unused space consists of multiple blocks; there may be quite a lot of free space, but no one piece may be large enough to satisfy a given request.
|
|
What is merging the free space in a heap called?
|
Compacting the heap
|
|
When should one use heap allocation?
|
Use it for large buffers and long-living objects.
|
|
What is a dangling reference?
|
When a binding outlives the object it refers to.
|
|
What is the dangling reference bug called in C?
|
“Use-after-free” bug.
|
|
What is a memory leak?
|
When objects live forever, even if they are no longer required.
|
|
What is the advantage of garbage collection?
|
It automatically deallocates objects when it is safe to do so.
|
|
What is the set of active bindings at any given moment called?
|
Referencing environment
|
|
How is the referencing environment defined?
|
By scope rules.
|
|
What is the textual region in which a binding is active?
|
The scope of binding
|
|
What is it called when active bindings depend on control flow?
|
Dynamically scoped
|
|
What is it called when bindings are determined at compile time?
|
Statically scoped. Bindings do not depend on call history.
|
|
When dynamically scoped, in what referencing environment is the called subroutine body executed?
|
In the referencing environment of the subroutine's caller.
|
|
When statically scoped, in what referencing environment is the called subroutine body executed?
|
In the subroutine definition
|
|
How can the scope of a binding be extended?
|
Through closures
|
|
When a closure is defined, what does it capture?
|
All bindings
|
|
How do we implement scope?
|
Use a symbol table, one of the two central data structures in a compiler (the other is the abstract syntax tree).
It maps Name → Entity (address, data type, extra info); use a map-like abstract type such as a HashMap or TreeMap.
|
|
How do we keep track of scopes?
|
Stack of environments
|
|
How is the stack of scope environments implemented?
|
With an enclosing scope pointer, aka static chain pointer.
|
|
What is the resulting data structure of scope environments called?
|
Static chain (a list based stack of maps)
|
|
What is the lookup time for the static chain?
|
O(n) where n is the nesting level.
|
|
What are disadvantages of dynamic scoping?
|
It is hard to reason about program, and scope management might have a lot of runtime overhead.
|
|
What is one use of dynamic scoping?
|
As output parameters for methods (write to variables of caller)
|
|
Why is there even such a thing as a need for forward declaration?
|
How else could you define a list type (a recursive type) whose next pointer is of the type being defined? The same applies to mutually recursive functions.
|
|
What is a forward declaration?
|
It is a promise: “I'll tell you shortly what it means.” It allows for recursive structures.
|
|
Why use namespaces?
|
Enable information hiding. Avoid name clashes with internal helper functions (stop littering global namespace).
|
|
What are the three forms of permeability?
|
Closed, Open, Selectively Open
|
|
What form of permeability is this: anything not explicitly imported is not visible?
|
Closed. Names are available only via import.
|
|
What form of permeability is this: exported names become automatically visible?
|
Open. You can still hide internals, but the referencing environment can become large.
|
|
What sort of permeability is this: automatically visible with a fully qualified name?
|
Selectively Open. Java uses this.
|
|
What is it called when a compiler disallows access to any references to structure internals?
|
Opaque Export
|
|
What is an example of an opaque export in Java?
|
Private class inside class. Interface outside.
|
|
What are the two types of modules?
|
Manager & Type
|
|
What is a manager module?
|
The module exists only once; it is basically a collection of subroutines and possibly types. Example: packages in Java.
|
|
What is a type module?
|
Can be instantiated many times; each instance has private state. Example: a Java class.
|