147 Cards in this Set
- Front
- Back
When is it appropriate for a class x to extend a class y? |
-when X is a kind of Y -when it makes sense for X objects to have all the operations of Y |
|
In general, should you use inheritance or composition? why? |
-Prefer composition. Inheritance pulls in all of Y's methods; composition exposes only the specific methods you choose (delegating via getters and setters). Use composition when X is not a kind of Y. |
|
when methods are private.. |
easier to debug, because no other methods are changing the data at the same time |
|
Quicksort's complexity- average case |
-keep partitioning and sorting the halves -O(n log2 n), where log2 n is the recursion depth |
|
Quicksort's complexity- worst case happens when? |
O(n^2) -when the array is already sorted or inversely sorted |
|
Pivot's value for Quicksort |
best: the median -- guarantees O(n log n) |
|
Plausible choice for pivot value for Quicksort |
median of the three- pick three values and find their median |
|
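The quicksort cards above can be combined into a short Java sketch; the class and method names are mine, and the median-of-three pivot is what keeps already-sorted input from hitting the O(n^2) worst case:

```java
public class QuickSortDemo {
    // Sort a[lo..hi] in place, using median-of-three pivot selection.
    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int mid = lo + (hi - lo) / 2;
        // Median of three: pick three values and use their median as the pivot.
        int pivot = median(a[lo], a[mid], a[hi]);
        int i = lo, j = hi;
        while (i <= j) {                       // Hoare-style partition
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++; j--;
            }
        }
        quicksort(a, lo, j);                   // keep partitioning each half
        quicksort(a, i, hi);
    }

    static int median(int x, int y, int z) {
        return Math.max(Math.min(x, y), Math.min(Math.max(x, y), z));
    }
}
```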
Two ways of creating thread |
1) class Animation extends Thread {...} 2) class Animation implements Runnable {...}. Both require a public void run() method. |
|
When start() is called, what happens? [extends Thread] |
requests the scheduler to run the thread; the thread enters the ready state |
|
signature of extends Thread |
class Animation extends Thread{ public void run() {} } Animation anim = new Animation(); anim.start(); |
|
signature of implements Runnable interface |
class Animation implements Runnable { } Runnable requires run method Animation anim = new Animation(); Thread myThread = new Thread(anim); myThread.start(); |
|
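Fleshing out the second snippet above into a runnable sketch (the `ran` flag and `runAnimation` helper are mine, added so the result is observable):

```java
public class ThreadDemo {
    static class Animation implements Runnable {
        volatile boolean ran = false;
        public void run() {                 // Runnable requires public void run()
            ran = true;                     // the animation work would go here
        }
    }

    // Starts the thread, waits for it to die, and reports whether run() executed.
    static boolean runAnimation() {
        Animation anim = new Animation();
        Thread myThread = new Thread(anim); // wrap the Runnable in a Thread
        myThread.start();                   // ready state: the scheduler will run it
        try {
            myThread.join();                // wait until the thread dies (run() returned)
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return anim.ran;
    }
}
```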
What happens when you call a Thread's start() method? |
Thread's state changes to be ready. |
|
What happens when you call a Thread's run() method? |
The Thread's state changes to running. |
|
What is monitor? |
A monitor is a set of routines protected by a mutual-exclusion lock: none of the routines in the monitor can be executed by a thread until that thread acquires the lock. |
|
When does a thread die? |
When it finishes (returns from) its run() method. |
|
daemon thread |
thread that will be automatically terminated when all non-daemon threads die. They usually run background supporting tasks for normal threads. |
|
Livelock |
when two or more processes repeatedly change state and at the same time block one another |
|
if an object is mutable,.. |
every access to it must be synchronized |
|
When a Thread enters synchronized code, what does this prevent other Threads from doing? (Be precise.) |
From executing any code synchronized on the same monitor (lock); other code is unaffected. |
|
What object is used by a synchronized instance method as a monitor? |
the instance itself (this) |
|
What object is used by a synchronized static method as a monitor? |
the monitor of the Class object |
|
Give two possible reasons for using a synchronized statement rather than a synchronized method. |
1) you want to synchronize only part of the method 2) you need to synchronize on a different object's monitor |
|
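A minimal sketch of both reasons above (class and field names are mine): the synchronized statement locks only part of the method, and it picks its own monitor object rather than `this`:

```java
public class SyncDemo {
    private final Object countLock = new Object(); // dedicated monitor, not 'this'
    private int count = 0;

    public int increment() {
        // unsynchronized work could go here (reason 1: lock only part of the method)
        synchronized (countLock) {  // reason 2: a synchronized statement chooses its monitor
            return ++count;
        }
    }
}
```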
Distinguish between parallelism and concurrency. |
Parallelism is when two or more processes run simultaneously (on separate processors); concurrency is when two or more processes are in progress at the same time, possibly interleaved on one processor. |
|
What is a "check-then-act" error, and why is it always an error? |
Checking a condition and then acting on it as two separate steps: other threads can execute in the gap between the check and the act, so the condition may no longer hold when you act. |
|
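The classic check-then-act example is put-if-absent; this sketch (class and method names are mine) shows the broken version and the fix that makes the check and the act atomic under one monitor:

```java
import java.util.HashMap;
import java.util.Map;

public class CheckThenActDemo {
    private final Map<String, Integer> cache = new HashMap<>();

    // BROKEN: another thread can insert the key in the gap between
    // containsKey (the check) and put (the act).
    public void putIfAbsentBroken(String key, int value) {
        if (!cache.containsKey(key)) {
            cache.put(key, value);
        }
    }

    // FIXED: check and act happen atomically while holding this object's lock.
    public synchronized void putIfAbsentSafe(String key, int value) {
        if (!cache.containsKey(key)) {
            cache.put(key, value);
        }
    }

    public synchronized Integer get(String key) { return cache.get(key); }
}
```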
What is a data race, and what must you synchronize in order to avoid having one? |
when two threads access the same data at the same time and at least one modifies it; the end result becomes unpredictable. Synchronize every method that touches the shared data. |
|
When does a deadlock occur? |
when two or more processes are each waiting to get a lock from one another, or waiting for one another to finish |
|
What does it mean to say that an operation is atomic? |
when the operation happens all at once, meaning no other threads can access the same data while the operation is being performed |
|
present the following graph [insert picture] as an adjacency matrix. |
element[i][j] has a mark if nodes i and j are connected - for an undirected graph the matrix is symmetric about the main diagonal |
|
Represent the following graph [insert picture] as an edge set. |
nodeset = {1,2,3} edgeset = {(1,2),(1,3)} |
|
Represent the following graph [insert picture] as an adjacency set. |
nodeset = {1,2,3}; adjacency set of node 1 = {p, q}, where p = (1,2) and q = (1,3) |
|
When searching a graph, what are the two techniques for avoiding getting caught in a cyclic loop? |
1. keep a set of nodes that have already been explored 2. mark the nodes themselves as having been explored |
|
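A sketch of technique 1 above in Java (names are mine): a depth-first search that keeps a set of explored nodes, so a cycle cannot trap it in an infinite loop:

```java
import java.util.*;

public class GraphSearchDemo {
    // Depth-first search over an adjacency-set representation.
    static List<Integer> dfs(Map<Integer, List<Integer>> adj, int start) {
        List<Integer> order = new ArrayList<>();
        Set<Integer> explored = new HashSet<>();   // technique 1: explored-node set
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(start);
        while (!stack.isEmpty()) {
            int node = stack.pop();
            if (!explored.add(node)) continue;     // already seen: skip (cycle guard)
            order.add(node);
            for (int next : adj.getOrDefault(node, List.of())) stack.push(next);
        }
        return order;
    }
}
```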
What are the fields in a Plex? |
-Containers -Contents - Origins -destinations |
|
Mention three ways in which a hypergraph might extend the notion of a graph. |
-Edges may originate from and/or point to graphs or other edges - nodes may be in many graphs at the same time - it is a collection of zero or more graphs |
|
What validity rules apply to the fields of a Plex? |
-If plex x is a container of plex y, then plex y is a content of plex x. -If plex y is an origin of plex x, then plex x is a destination of plex y. |
|
If a Plex implements a node in a regular (non-hyper) graph, what are the contents of each field of that Plex? |
-content set is empty -container set contains the single graph the node is in -origin set contains the edges that point to the node -destination set contains the edges that come out of the node |
|
If a Plex implements a graph, what are the contents? |
-container, origin and destination sets are empty -content set contains the graph's nodes and edges |
|
If a Plex implements an edge, what are the contents? |
-container set contains the single graph the edge is in -content set is empty -origin set contains the node the edge comes from -destination set contains the node the edge points to |
|
What is the name of the open-source alternative to Google's MapReduce? |
Hadoop |
|
Briefly define "sequential cutoff". |
the input size below which it is faster to do the work sequentially than to split it into more tasks |
|
What does invoke do in fork/join? |
runs a ForkJoinTask and waits for its result - should be called from sequential code only |
|
Create a subclass of Thread. What's it in forkjoin? |
create a subclass of RecursiveTask (or RecursiveAction) |
|
Override run. Forkjoin? |
compute() |
|
Call start. Forkjoin? |
fork() (or invoke() when called from sequential code) |
|
Put a result in a shared field. Forkjoin? |
return the value from compute(); join() retrieves it |
|
Call run directly. Forkjoin? |
call compute() directly |
|
join. Forkjoin? |
join(), but fork/join's join() also returns the computed value |
|
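The Thread-to-fork/join mapping in the cards above can be sketched as one RecursiveTask (class name and cutoff value are mine, chosen for illustration):

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class SumTask extends RecursiveTask<Long> {  // subclass RecursiveTask, not Thread
    static final int CUTOFF = 1000;   // sequential cutoff: below this, don't split
    final int[] a; final int lo, hi;

    SumTask(int[] a, int lo, int hi) { this.a = a; this.lo = lo; this.hi = hi; }

    protected Long compute() {        // override compute, not run
        if (hi - lo <= CUTOFF) {      // small enough: just do it sequentially
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += a[i];
            return sum;               // return the result instead of a shared field
        }
        int mid = (lo + hi) / 2;
        SumTask left = new SumTask(a, lo, mid);
        SumTask right = new SumTask(a, mid, hi);
        left.fork();                          // like start(): run in parallel
        long rightSum = right.compute();      // this thread does the right half itself
        return rightSum + left.join();        // join() also returns the value
    }

    public static long sum(int[] a) {
        // invoke: submit and wait; called from sequential code only
        return new ForkJoinPool().invoke(new SumTask(a, 0, a.length));
    }
}
```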
map in Mapreduce |
maps each input to key/value pairs and adds them to a hashmap |
|
reduce in MapReduce |
counts the number of occurrences of each key in the hashmap, then emits the totals |
|
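A single-process Java sketch of the two phases above for word counting (this simulates the idea only; it is not the Hadoop API, and the class and method names are mine):

```java
import java.util.*;

public class WordCountDemo {
    // map phase: emit the pair (word, 1) for every word in the input
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.split("\\s+")) {
            pairs.add(Map.entry(word, 1));
        }
        return pairs;
    }

    // reduce phase: for each key, combine all of its emitted values
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }
}
```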
Forkjoin vs Mapreduce |
-Fork/join takes advantage of multiple cores and recursively partitions a task into smaller tasks -MapReduce does one big split and doesn't communicate until the reduce phase -Fork/join is limited by input size since it runs on a single node -MapReduce scales well to larger inputs |
|
RAM =? |
Random Access Machine |
|
RAM model's assumptions |
1) There is one single processing unit 2)There is an arbitrarily large amount of memory 3) Accessing any arbitrarily chosen memory location takes unit time |
|
RAM model breaks down when? |
when an array is too large to fit in memory |
|
Thread-based or shared-memory processing |
Threads share memory and communicate by reading and writing that memory; Java supports this out of the box |
|
Message passing parallel processing |
Threads (or actors) send messages to one another |
|
Overhead |
costs incurred by parallel algorithm but not sequential algorithm |
|
What are the overheads? |
1) synchronization 2) communication 3) contention (for shared memory, e.g. locks) 4) extra computation |
|
CTA =? |
Candidate Type Architecture |
|
Amdahl's law? |
If proportion P of a program can run in parallel on N processors and 1-P must run sequentially, the maximum speedup is 1 / ((1-P) + P/N); as N grows, speedup is capped at 1/(1-P). With more data, the parallelizable proportion P tends to grow, so more speedup is possible. |
|
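The formula above in code (names are mine), which makes the cap at 1/(1-P) easy to check numerically:

```java
public class AmdahlDemo {
    // Maximum speedup when proportion p of the work can use n processors:
    // speedup = 1 / ((1 - p) + p / n)
    static double maxSpeedup(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }
}
```

For example, a fully parallel program (p = 1.0) on 4 processors speeds up 4x, while a half-sequential one (p = 0.5) never beats 2x no matter how many processors you add.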
what is load imbalance? |
when one process has much less work to do than the others, so it sits idle |
|
Java's concurrency rules: |
1) It's the mutable state: minimize shared mutable state 2) make fields final unless they need to be mutable 3) document your synchronization policy |
|
What is an actor? |
a lightweight thread with its own private memory |
|
Model of concurrency's elements: |
1) actors 2) shared memory 3) STM (software transactional memory): run code as a transaction; if the values it read have changed meanwhile, retry; otherwise commit the update |
|
Pros for function programming: |
immutable values are automatically thread-safe. |
|
Moore's law: |
The number of transistors on a chip doubles roughly every two years (popularly, "processors keep getting faster"), as stated by Intel co-founder Gordon Moore. |
|
Functional Programming characteristics: |
1) functions are treated as values 2) functions act like functions in mathematics 3) immutable values are strongly preferred over mutable values |
|
Scala advantages: |
1) less verbose 2) designed for concurrent programs (Java was not) 3) Scala is more consistent: all values are objects in Scala, but not in Java (primitives) 4) principle of uniform access: callers don't need to know whether a name is a function or a value |
|
var vs val |
var is mutable; val is immutable |
|
Unit type's value |
() |
|
null in Java == ..in scala |
Option type |
|
"You can write a Fortran program in any language." |
You can write a crappy program in any language. |
|
Scala's map |
List(1, 2, -3, -4) map (_.abs) returns List(1, 2, 3, 4) |
|
Scala's filter |
List(1, 2, -3, -4) filter (_ >= 0) returns List(1, 2) |
|
Scala's foldl |
val list = List(10, 2, 3); list.foldLeft(0)(_ - _) returns ((0 - 10) - 2) - 3 = -15 |
|
Scala's foldr |
val list = List(10, 2, 3); list.foldRight(0)(_ - _) returns 10 - (2 - (3 - 0)) = 11 |
|
Scala's reduceLeft |
val list = List(10, 2, 3); list.reduceLeft(_ - _) returns (10 - 2) - 3 = 5 |
|
minimaxing |
assume the opponent always picks the move that minimizes your outcome, then choose the move that maximizes over those minimums |
|
Heuristic function |
estimates, at a given node, how good that position is for you (used to choose your best move) |
|
As you minimax on a game tree, the PBV at the node you start from (where it is your move) will never ____________. |
be lower than the PBV of the node's parent--beta cutoff |
|
As you minimax on a game tree, the PBV at a node where it is the opponent's turn will never ____________. |
be higher than the PBV of the node's parent--alpha cutoff |
|
To maximize the amount you can prune a game tree, in what order should you explore nodes? |
best moves first: at each node, examine the most promising children first, so cutoffs happen as early as possible |
|
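The minimax and pruning cards above can be sketched compactly in Java. The tree representation (an `Integer` leaf or an `Object[]` of children) and the class name are mine, chosen to keep the example tiny:

```java
public class AlphaBetaDemo {
    // node is either an Integer leaf value or an Object[] of child nodes
    static int minimax(Object node, boolean myMove, int alpha, int beta) {
        if (node instanceof Integer) return (Integer) node;
        int best = myMove ? Integer.MIN_VALUE : Integer.MAX_VALUE;
        for (Object child : (Object[]) node) {
            int v = minimax(child, !myMove, alpha, beta);
            if (myMove) { best = Math.max(best, v); alpha = Math.max(alpha, best); }
            else        { best = Math.min(best, v); beta  = Math.min(beta,  best); }
            if (beta <= alpha) break;  // cutoff: a rational opponent never allows this line
        }
        return best;
    }
}
```

On the tree {{3, 5}, {2, 9}} with you to move at the root, the result is 3: the opponent would hold the first branch to 3 and the second to at most 2, and pruning skips the 9 entirely.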
Define state space |
consists of: 1) a set of states 2) a set of operators |
|
Define "best first search." |
search for the node that seems to be closest to the goal state |
|
Define "heuristic." |
is a rule of thumb used to decide what's the best choice given the current state, incorporating knowledge of the problem space. |
|
Define "blind search." |
searching algorithm which doesn't utilize any knowledge of the problem |
|
What does it mean to say that a heuristic is "optimistic"? |
never overestimate the distance from the current state to the goal |
|
In the A* algorithm formula f(N) = g(N) + h(N), what are N, g, h, and f? |
N = the current node; g = distance from the start state to N; h = estimated distance from N to the goal; f = our best guess at the distance from the start state to the goal, passing through N |
|
When searching a state space, what is the single most important way to minimize the size of the state space? |
a good choice of operators |
|
Is the A* algorithm a best-first search technique? If so, how? If not, why not? |
Yes: at each step it expands the node with the lowest f(N), i.e. the node that currently looks best. |
|
Under what circumstances will A* find a best solution? |
if h(n) is optimistic |
|
What are the space requirements for an A* search? |
may require exponential storage |
|
How does IDA* differ from an iterative deepening search? |
instead of using g(N) (the depth so far) as the cutoff, use f(N) (the estimated total cost); if a node's f exceeds the limit, don't search below it |
|
What are the space requirements for an IDA* search? |
linear in length of the path |
|
What does List(1, 2, 3, 4, 5) match { case a @ List(b @ _, c @ _*) => s"After $b in $a comes $c" } evaluate to? |
"After 1 in List(1, 2, 3, 4, 5) comes List(2, 3, 4, 5)" - a binds the whole list, b the head, c the rest |
|
Pseudorandom number |
look random but are predictable if you know the algorithm |
|
linear congruential method |
r = (a*r + b) % m; a and b are big prime numbers, m is 2^32 or 2^64 |
|
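A minimal Java sketch of the linear congruential method (class name is mine, and the constants are the classic C-library ones, used here purely for illustration). With m = 2^32, Java's int arithmetic overflows modulo 2^32 for free, so no explicit `%` is needed:

```java
public class LcgDemo {
    private int r;  // the seed is the initial value of r

    LcgDemo(int seed) { this.r = seed; }

    int next() {
        // r = (a*r + b) mod 2^32, via int overflow
        r = 1_103_515_245 * r + 12_345;
        return r;
    }
}
```

Two generators seeded identically produce the same sequence, which is exactly what "pseudorandom" means: it looks random but is predictable if you know the algorithm and the seed.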
seed? |
initial value of r (random number) |
|
What is another name for a Gaussian distribution? |
the normal distribution; Java's nextGaussian() returns values with mean 0.0 and standard deviation 1.0 |
|
It's usually a good idea to put the word static in front of Random random = new Random(). Why? |
so that the whole program shares one Random object (one pseudorandom sequence) instead of creating a new generator per instance; this keeps the sampling equally likely and uncorrelated |
|
runtime of stupid sort |
O((n+1)!) |
|
Scala's equivalent of static: |
companion object |
|
sealed classes(Scala) |
all of its subclasses must be declared in the same file |
|
Trait (Scala) = Java.... |
interface |
|
substring(a,b) |
returns the characters from index a up to, but not including, index b |
|
singleton object |
there is only one copy of the object |
|
referential transparency |
1) deterministic 2) side-effect free |
|
function literal |
parameter_list => function_body |
|
higher-order function |
ones that take function as a parameter or return function as a result |
|
get method from map |
returns Some(value), or None if the key is absent |
|
getOrElse(4,"Hello") |
can return a String even when the map's values are Ints: the result type widens to the common supertype, which can hide type errors. Problem! |
|
Tail recursive |
if the recursive call is the very last thing it does |
|
Factorial's tail recursive version |
def factorial(n: Int) = factorial1(n, 1); def factorial1(n: Int, a: Int): Int = if (n == 0) a else factorial1(n - 1, n * a) |
|
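The same accumulator trick in Java (class name is mine). The recursive call is the very last thing `factorial1` does, so it is tail recursive; note that Scala's compiler rewrites such self-calls into loops, while the JVM itself does not eliminate tail calls:

```java
public class FactorialDemo {
    public static long factorial(long n) {
        return factorial1(n, 1);           // accumulator starts at 1
    }

    // Tail recursive: the partial product is carried along in the accumulator a,
    // so nothing remains to do after the recursive call returns.
    private static long factorial1(long n, long a) {
        if (n == 0) return a;
        return factorial1(n - 1, n * a);
    }
}
```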
What is currying? |
Currying transforms a function of several arguments into a chain of one-argument functions; it's a way to make partially applied functions. |
|
best technique from genetic algorithm: |
sexual reproduction with a small mutation rate |
|
curve fitting's gene: |
a,b,c,d,e,f |
|
curve fitting's chromosome: |
[a,b,c,d,e,f] |
|
Advantages of probabilistic matching |
1) there's more genetic diversity, since every organism has some chance to be a parent 2) the best organisms still contribute the most to the next generation |
|
Genetic algorithms are |
1) fun 2) mind-bogglingly slow 3) good for a very few types of problems - use one when you see no other way to tackle the problem |
|
Binary search |
-keep changing the range to compare based on the value in the middle |
|
Binary search's complexity |
O(logn) |
|
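A minimal Java sketch of the two cards above (class and method names are mine): each comparison against the middle value halves the range, giving O(log n):

```java
public class BinarySearchDemo {
    // Returns the index of target in the sorted array a, or -1 if absent.
    static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;        // avoids overflow of (lo + hi)
            if (a[mid] == target) return mid;
            else if (a[mid] < target) lo = mid + 1;  // keep the upper half
            else hi = mid - 1;                       // keep the lower half
        }
        return -1;
    }
}
```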
Bubble sort |
each pass bubbles the largest remaining element to the end by swapping adjacent out-of-order pairs |
|
Bubble sort's complexity |
O(n^2) |
|
Bubble sort's loop invariant |
for all j > outer, if i < j, then a[i] <= a[j] |
|
selection sort |
find the smallest value in the unsorted part and swap it with the element at the current location |
|
selection sort's complexity |
O(n^2) |
|
Selection sort's loop invariant |
for all i, if outer <= i <= inner, then a[min] <= a[i] |
|
Insertion sort |
insert to the right position starting from the beginning |
|
Heapsort |
-build the heap by sifting up -repeatedly remove the root, move the rightmost leaf to the root, then sift again -works only for balanced, left-justified binary trees!! |
|
Heapsort's complexity |
O(nlogn) |
|
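An array-based heapsort sketch in Java (names are mine). This is the common sift-down variant: heapify sifts every non-leaf down rather than sifting up, but the O(n log n) behavior and the root-swap loop match the cards above:

```java
public class HeapSortDemo {
    public static void heapsort(int[] a) {
        int n = a.length;
        // heapify: sift down every non-leaf (leaves already have the heap property)
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n);
        // repeatedly swap the root (the max) with the last leaf and shrink the heap
        for (int end = n - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;
            siftDown(a, 0, end);
        }
    }

    // Restore the heap property at index i, within the first 'size' elements.
    static void siftDown(int[] a, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;                                     // left child
            if (child + 1 < size && a[child + 1] > a[child]) child++;  // bigger child
            if (a[i] >= a[child]) return;     // heap property holds: done
            int t = a[i]; a[i] = a[child]; a[child] = t;
            i = child;
        }
    }
}
```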
when's Heapsort better than Quicksort? |
when we need a guaranteed worst time, especially when n is large. |
|
When is Quicksort better than Heapsort? |
The average performance is better. If n is not too large, quicksort is a better choice. |
|
Define heap |
a balanced, left-justified binary tree in which every node's children have values smaller than (or equal to) the node's |
|
Define heap property |
children's values are less than or equal to parent's |
|
When representing a binary tree as an array,the right child of the root is at index ________. |
2*i + 2, with the root at index 0 (so the root's right child is at index 2) |
|
When representing a binary tree as an array,the left child of the root is at index ________. |
2*i +1 |
|
a binary tree is balanced if... |
every node at depths 0 through n-2 has two children, where n is the depth of the tree (only the deepest level may be incomplete) |
|
Which nodes of the binary tree automatically have the heap property? |
leaves |
|
During Heapsort, what is the maximum number of nodes that can fail to have the heap property? |
only nodes with children can fail to have the heap property - the leaves always have it - so at most about n/2 of the n nodes |
|
case class in Scala |
-automatically creates equals, hashCode, toString, and copy methods |
|
Alpha-beta searching is based on what two assumptions? |
1. The opponent is rational. 2. The opponent's algorithm is similar to the player's. |
|
What algorithm type is effectively a depth-first search on a virtual tree? |
Backtracking algorithm |
|
What algorithm type uses at least two recursive calls at each level? |
Divide and conquer algorithm |
|
What type of algorithm can be described by "do what is best in the short term, and hope the long term works out"? |
Greedy algorithm |
|
Define "heuristic." |
a rule of thumb used to evaluate which option to look at first |
|
define satisficing |
stop as soon as the solution found is good enough |
|
Describe a genetic algorithm for arranging dates. |
1. Randomly create 100 ways of arranging dates 2. Repeat 1000 times: 3. For each arrangement, calculate the sum of the rating differences between paired dates 4. Keep the 10 arrangements with the lowest totals 5. Within each kept arrangement, keep the pairs whose rating difference is less than 10, and shuffle the rest |
|
Name or otherwise indicate three different problems for which a greedy algorithm works well. |
Activity-selection problem, Fractional Knapsack, Huffman coding |
|
A dynamic programming algorithm can find an optimal solution to a problem if, for that problem, the "principle of optimality" holds. What is this principle? |
Whatever the initial state is, remaining decisions must be optimal with regard to the state resulting from the first decision. |