67 Cards in this Set
- Front
- Back
In an ordered linked list, the search algorithm is somewhat improved because |
the list is sorted |
|
What should be used to traverse a list |
a reference variable not in the list |
|
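The traversal cards above can be sketched as follows; the class and method names are illustrative, not taken from any particular textbook implementation. The key point is that a separate reference walks the list while the head stays put:

```java
public class Traverse {
    static class Node {
        int info;
        Node next;
        Node(int info) { this.info = info; }
    }

    // Walk the list with a separate reference; head itself is never moved,
    // so the list is not lost after the traversal.
    static int length(Node head) {
        int count = 0;
        Node current = head;          // traversal uses a copy of head
        while (current != null) {
            count++;
            current = current.next;   // advance the copy, not head
        }
        return count;
    }
}
```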
The algorithms to implement the operation of find, insert, and remove are |
not identical for sorted and unsorted lists |
|
the data type of each list |
depends on the specific application |
|
What is the time complexity to insert at the end of an array-based list |
O(1) |
|
When using one reference variable to insert an item into a linked list |
the order of the sequence of events matters |
|
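The card above can be illustrated with a minimal sketch (names are illustrative, not from the textbook). With one extra reference, the two link assignments must happen in this order, or the tail of the list becomes unreachable:

```java
public class Insert {
    static class Node {
        int info;
        Node next;
        Node(int info) { this.info = info; }
    }

    // Insert a new value after 'current' using a single extra reference.
    // The order of the two assignments matters: if current.next were
    // overwritten first, the rest of the list would be lost.
    static void insertAfter(Node current, int value) {
        Node newNode = new Node(value);
        newNode.next = current.next;  // 1: new node adopts the tail
        current.next = newNode;       // 2: splice the new node in
    }
}
```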
retrieving the information of the first node |
is a basic operation of the list ADT |
|
Searching through a linked list or inserting an item in a linked list |
requires a traversal of the list |
|
a linked list is |
a collection of nodes |
|
the address of the first node in a linked list is stored in |
a separate location, commonly called the head |
|
In general, the list reference variable, which points to the first node |
should not be used to traverse a linked list; a separate reference should be used instead |
|
What is the time complexity to print a linked list |
O(n) |
|
Which of the following should be used to traverse a list |
a reference variable other than the head of the list |
|
in a linked list |
every node except the last has a successor |
|
a header node |
is a placeholder node at the beginning of a list, used to simplify list processing |
|
In the Dale textbook, when implementing a circular list |
we can make use of the same node class, LLObjectNode, as we did when we implemented our unsorted, sorted, and indexed lists |
|
In a circular linked list |
every node has a successor |
|
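A minimal sketch of the circular-list card above (class names are illustrative): the last node's next link points back to the first node, which is exactly why every node has a successor.

```java
public class CircularList {
    static class Node {
        int info;
        Node next;
        Node(int info) { this.info = info; }
    }

    // Build a circular list: the last node links back to the first,
    // so no node's next reference is null.
    static Node build(int... values) {
        Node first = new Node(values[0]);
        Node last = first;
        for (int i = 1; i < values.length; i++) {
            last.next = new Node(values[i]);
            last = last.next;
        }
        last.next = first;  // close the circle
        return first;
    }
}
```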
the getNext method is more complicated for our circular list |
than it is for its linear counterpart |
|
In the Dale textbook's doubly linked list, the nodes are linked in |
two directions, front and back |
|
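A sketch of the doubly linked structure described above, with illustrative names: each node carries a link in both directions, front and back.

```java
public class Doubly {
    static class Node {
        int info;
        Node next;  // link toward the back of the list
        Node prev;  // link toward the front of the list
        Node(int info) { this.info = info; }
    }

    // Link node b after node a in both directions.
    static void link(Node a, Node b) {
        a.next = b;
        b.prev = a;
    }
}
```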
A trailer node is a placeholder node |
at the end of a list, used to simplify list processing |
|
The header node contains |
no actual list information; it is a placeholder node |
|
In a diagram of a binary tree |
each node in the binary tree is represented as a circle, and the circle is labeled with the node's value |
|
the term for the maximum level of the binary tree |
height |
|
In preorder traversal |
you visit the root first, then traverse the left subtree, then traverse the right subtree |
|
In postorder traversal |
you traverse the left subtree, then the right subtree, and visit the root last |
|
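The two traversal orders on the cards above can be sketched recursively (illustrative names, not a textbook class): preorder visits the node before its subtrees, postorder visits it after both subtrees.

```java
import java.util.ArrayList;
import java.util.List;

public class Traversals {
    static class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    // Preorder: visit the node, then its left subtree, then its right subtree.
    static void preorder(Node n, List<Integer> out) {
        if (n == null) return;
        out.add(n.key);
        preorder(n.left, out);
        preorder(n.right, out);
    }

    // Postorder: traverse both subtrees first; the node itself is visited last.
    static void postorder(Node n, List<Integer> out) {
        if (n == null) return;
        postorder(n.left, out);
        postorder(n.right, out);
        out.add(n.key);
    }
}
```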
What is the level of the children of the root node of a binary tree |
1 |
|
In a binary search tree, the key in the root node is larger than every key in the left subtree and |
is smaller than every key in the right subtree |
|
If a binary tree has two subtrees, left and right, which of the following must be true about left and right |
left and right are binary trees |
|
Each node in a binary tree contains |
data, a link to the left child, and a link to the right child |
|
If T is a binary search tree with n nodes, where n>0 |
the average number of nodes visited in a search of T is O(log2 n) |
|
In a binary search tree, the key in the root node |
is larger than every key in the left subtree and smaller than every key in the right subtree |
|
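The search-tree property on these cards can be sketched as follows (illustrative names, not the textbook's classes): insertion keeps smaller keys left and larger keys right, which is what lets a search follow a single root-to-leaf path.

```java
public class BST {
    static class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    // Insert keeping the search property: smaller keys go left, larger go right.
    static Node insert(Node root, int key) {
        if (root == null) return new Node(key);
        if (key < root.key) root.left = insert(root.left, key);
        else root.right = insert(root.right, key);
        return root;
    }

    // Search follows one root-to-leaf path; on a reasonably balanced tree
    // this visits about O(log2 n) nodes.
    static boolean contains(Node root, int key) {
        while (root != null) {
            if (key == root.key) return true;
            root = key < root.key ? root.left : root.right;
        }
        return false;
    }
}
```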
In a binary search tree, the average number of nodes visited in a search of T |
is O(log2 n) |
|
For a heap, the order property says that each child's key can be |
no larger than its parent's key |
|
In a priority queue, we access the element |
that has the highest priority |
|
There are many ways |
to implement a priority queue |
|
A heap is based on a |
complete binary tree |
|
A precondition of the reheapDown operation |
is that the heap's order property may be violated only at the root |
|
A heap must exhibit two properties; these are |
shape and order |
|
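The reheapDown cards above can be sketched like this, as a plain max-heap sift-down on an array (a sketch, not the Dale textbook's exact method): only the root may violate the order property, and the element sinks until each parent is at least as large as its children.

```java
public class Heap {
    // Restore the order property of a max-heap when only the element at
    // 'root' may be out of place, e.g. after the maximum has been removed
    // and the last element moved into the root position.
    static void reheapDown(int[] heap, int root, int last) {
        int left = 2 * root + 1;
        while (left <= last) {
            int right = left + 1;
            int maxChild = (right <= last && heap[right] > heap[left]) ? right : left;
            if (heap[root] >= heap[maxChild]) return;  // order property holds
            int tmp = heap[root];                      // swap down and continue
            heap[root] = heap[maxChild];
            heap[maxChild] = tmp;
            root = maxChild;
            left = 2 * root + 1;
        }
    }
}
```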
The insertion sort algorithm sorts the list by |
moving each element to its proper place |
|
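A minimal insertion sort sketch matching the card above (illustrative names): each element is shifted left into its proper place within the already-sorted prefix.

```java
public class InsertionSort {
    // Insertion sort: move each element into its proper place
    // within the sorted prefix of the array.
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int item = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > item) {
                a[j + 1] = a[j];  // shift larger elements one slot right
                j--;
            }
            a[j + 1] = item;      // drop the element into its place
        }
    }
}
```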
A selection sort can be implemented by selecting the largest element in the (unsorted portion of the) list and |
moving it to the bottom of the list |
|
Selection sort can be applied to |
linked lists and array based lists |
|
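The selection sort variant described above can be sketched like so (illustrative names): repeatedly select the largest element in the unsorted portion and swap it to the bottom (end) of that portion.

```java
public class SelectionSort {
    // Selection sort, largest-to-bottom variant: each pass finds the
    // largest element in the unsorted portion a[0..bottom] and swaps it
    // into position 'bottom'.
    static void sort(int[] a) {
        for (int bottom = a.length - 1; bottom > 0; bottom--) {
            int maxIndex = 0;
            for (int i = 1; i <= bottom; i++) {
                if (a[i] > a[maxIndex]) maxIndex = i;
            }
            int tmp = a[bottom];
            a[bottom] = a[maxIndex];
            a[maxIndex] = tmp;
        }
    }
}
```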
Heap sort overcomes the worst case for |
quick sort |
|
Merge sort divides the list into |
two sublists of nearly equal size |
|
Quick sort uses a |
splitValue |
|
A priority queue may be implemented |
as a heap |
|
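The priority-queue cards above can be demonstrated with the standard library: java.util.PriorityQueue is itself backed by a binary heap. With a reversed comparator it behaves as a max-priority queue, so the highest-priority element is accessed first (the helper method here is illustrative).

```java
import java.util.Comparator;
import java.util.PriorityQueue;

public class PQDemo {
    // Heap-backed priority queue: with a reversed comparator, poll()
    // removes and returns the element with the highest value.
    static int highestFirst(int... values) {
        PriorityQueue<Integer> pq = new PriorityQueue<>(Comparator.reverseOrder());
        for (int v : values) pq.add(v);
        return pq.poll();
    }
}
```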
Suppose that L is a list of n elements, where n > 0. Let W(n) denote the number of key comparisons in the worst case of merge sort. Which of the following is true? |
W(n) = O(n log2n) |
|
A selection sort can also be implemented by selecting the |
largest element in the (unsorted portion of the) list and moving it to the bottom of the list |
|
Heap sort, for array-based lists |
is of the order O(n log2n) even in the worst case |
|
It can be shown that the average number of key comparisons for insertion sort is |
O(n^2) |
|
The selection sort algorithm repeatedly moves the smallest element from the unsorted list |
to the beginning of the unsorted list |
|
If the list above were sorted with selection sort, which two elements would be swapped first? |
5 and 16 |
|
In quick sort, the list is partitioned into two sublists by selecting a pivot, and the two sublists are then sorted and combined into one list in such a way |
that the combined list is sorted |
|
Quick sort is implemented using |
recursion |
|
A heap is an array in which each element contains a key, such that the key in the element at position k in the list is at least as large as the key in the element at position |
2k+1 and 2k+2 |
|
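The index arithmetic on the card above can be checked directly; this helper is a sketch (not from the source) that verifies the array-stored heap property: the key at position k is at least as large as the keys at positions 2k+1 and 2k+2, when those positions exist.

```java
public class HeapCheck {
    // Verify the max-heap order property for an array-stored heap:
    // for every k, a[k] >= a[2k+1] and a[k] >= a[2k+2] where defined.
    static boolean isHeap(int[] a) {
        for (int k = 0; k < a.length; k++) {
            int left = 2 * k + 1;
            int right = 2 * k + 2;
            if (left < a.length && a[k] < a[left]) return false;
            if (right < a.length && a[k] < a[right]) return false;
        }
        return true;
    }
}
```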
The selection sort algorithm sorts a list by selecting the smallest element in the (unsorted portion of the) list |
and then moving this smallest element to the top of the unsorted list |
|
The quick sort algorithm uses the |
divide-and-conquer technique to sort a list |
|
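The quick sort cards above can be sketched as follows; this is one common formulation (pivot taken as the first element), not the textbook's exact code. The list is partitioned around a splitValue, the splitValue is placed between the two partitions, and each side is sorted recursively.

```java
public class QuickSort {
    // Quick sort: partition around a splitValue (pivot), place the pivot
    // between the two partitions, then recursively sort each side.
    static void sort(int[] a, int first, int last) {
        if (first >= last) return;
        int splitValue = a[first];  // pivot choice: first element
        int lo = first, hi = last;
        while (lo < hi) {
            while (a[hi] > splitValue) hi--;               // skip larger elements
            while (lo < hi && a[lo] <= splitValue) lo++;   // skip smaller/equal
            if (lo < hi) { int t = a[lo]; a[lo] = a[hi]; a[hi] = t; }
        }
        a[first] = a[hi];           // move the pivot into its final position
        a[hi] = splitValue;
        sort(a, first, hi - 1);
        sort(a, hi + 1, last);
    }
}
```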
The lower bound on comparison-based sorting algorithms is |
O(n log2n) |
|
In quick sort, the average case for the number of swaps |
is O(n log2n) |
|
A selection sort can be applied to |
linked lists |
|
Merge sort |
does not use a pivot |
|
Suppose that L is a list of n elements, where n > 0. Let A(n) denote the number of key comparisons in the average case of merge sort. Which of the following is true |
A(n) = O(n log2n) |
|
A merge sort algorithm partitions the list into two sublists of roughly equal size, sorts the sublists |
and then combines the sorted sublists into one sorted list |
|
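The merge sort cards above can be sketched like this (illustrative names): the list is split into two sublists of roughly equal size, each is sorted recursively, and the two sorted sublists are combined in the merge step, where most of the work happens.

```java
import java.util.Arrays;

public class MergeSort {
    // Merge sort: split into two halves, sort each recursively,
    // then merge the two sorted sublists into one sorted list.
    static int[] sort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left = sort(Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    // The bulk of the sorting work is done here, in the merge.
    static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}
```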
The worst case for the number of comparisons in quick sort is |
O(n^2) |
|
In the merge sort algorithm |
most of the sorting work is done when merging the sorted sublists |
|
Priority queues may be implemented as |
a heap |