44 Cards in this Set
what is a priority queue
|
a data structure that supports operations such as insert and deleteMin
|
|
What are some uses for a priority queue
|
operating system scheduling
|
|
how do you implement a priority queue
|
-an unsorted linked list: insert at the front in O(1), but deleteMin must traverse the list, which takes O(N)
-or keep the list sorted, which makes insertions expensive (O(N)) and deleteMins cheap (O(1)) -a binary search tree: O(log N) average running time for insert and deleteMin -a binary heap |
|
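A minimal sketch of the binary-heap option, using Python's standard `heapq` module (a min-heap stored in a plain list):

```python
import heapq

pq = []                       # empty binary min-heap, stored as a list
for x in [5, 1, 4, 2]:
    heapq.heappush(pq, x)     # insert: O(log N) percolate up

smallest = heapq.heappop(pq)  # deleteMin: O(log N) percolate down
```

After the pop, `smallest` is 1 and the remaining elements are still a valid heap.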
what is a heap
|
-a binary tree that is full except for the lowest level, which is filled left to right: a complete binary tree (a tree of height h has between 2^h and 2^(h+1) - 1 nodes)
-can be represented as an array (left child = 2 * i; right child = 2 * i + 1) |
|
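The array encoding above can be checked directly; this sketch uses 1-based indexing, so slot 0 is left unused:

```python
# 1-based array layout of a complete binary tree:
#   index:  1   2   3   4   5
#   value: 10  20  30  40  50
heap = [None, 10, 20, 30, 40, 50]

def left(i):   return 2 * i        # left child index
def right(i):  return 2 * i + 1    # right child index
def parent(i): return i // 2       # parent index
```

So node 1's children are at indices 2 and 3, and node 5's parent is node 2.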
what is a heap-order property
|
-allows operations to be performed quickly
-keeps the minimum at the root: every node is no larger than its children -to insert, place the item at the next available leaf and swap upward while the parent is larger...percolate up -to deleteMin, remove the root and repeatedly swap the resulting empty spot with its smaller child...percolate down |
|
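The percolate-up insert and percolate-down deleteMin described above, sketched in Python with a 1-based array (the function names are my own, not from the cards):

```python
heap = [None]  # index 0 unused; heap[1] always holds the minimum

def insert(x):
    heap.append(x)                       # start at the next available leaf
    i = len(heap) - 1
    while i > 1 and heap[i // 2] > x:    # percolate up while the parent is larger
        heap[i] = heap[i // 2]
        i //= 2
    heap[i] = x

def delete_min():
    minimum, last = heap[1], heap.pop()  # remove root; detach the last leaf
    n = len(heap) - 1
    if n >= 1:
        i = 1
        while 2 * i <= n:                # percolate the empty spot down
            child = 2 * i
            if child < n and heap[child + 1] < heap[child]:
                child += 1               # take the smaller child
            if heap[child] < last:
                heap[i] = heap[child]
                i = child
            else:
                break
        heap[i] = last
    return minimum

for x in (5, 3, 8, 1):
    insert(x)
```

Repeated `delete_min()` calls now return 1, 3, 5, 8 in order.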
Typically what is the Big O of simple sorting algorithms...what about more complex ones
|
N^2
N log N -generally measured by the number of comparisons and the number of swaps |
|
what is insertion sort good for
|
almost sorted data
|
|
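A minimal insertion sort sketch; on almost-sorted input the inner while loop barely runs, so the sort approaches O(N):

```python
def insertion_sort(a):
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop the key into the gap
    return a
```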
when is quick sort at its worst
|
for already sorted or reverse-sorted data (with a naive pivot choice such as the first element)
|
|
what affects the efficiency of a sort algorithm
|
the number of inversions
|
|
how do you measure efficiency in a sort algorithm
|
-check the results of the algorithm on various data, such as reversed, almost sorted, etc.
-count the number of comparisons (except for radix sort) |
|
how do you perform selection sort;
what is the Big O |
-find the smallest remaining element, then swap it into position i
-increment i and repeat -O(N^2) |
|
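The find-smallest-then-swap loop above, sketched in Python:

```python
def selection_sort(a):
    for i in range(len(a) - 1):
        smallest = i
        for j in range(i + 1, len(a)):     # find the smallest in a[i:]
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]  # swap it into place, then i++
    return a
```

Two nested O(N) loops give the O(N^2) running time the card quotes.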
What is the advantages of a heap sort
|
-in-place and non-recursive
-doesn't use multiple arrays -best used for extremely large data sets |
|
what are the disadvantages of a heap sort
|
-slower than merge and quick sorts
-memory requirement is doubled when a second array is used to hold the sorted elements |
|
what is the Big O of the heap sort
|
-each deleteMin takes log N time
-building a binary heap of N elements takes O(N) time -N deleteMins give a total running time of O(N log N) |
|
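A heapsort sketch using the standard `heapq` module: heapify builds the heap in O(N), then N deleteMins at O(log N) each give the O(N log N) total. Note this stdlib version returns a second list for brevity; a true in-place heapsort works with a max-heap inside the original array:

```python
import heapq

def heap_sort(a):
    heapq.heapify(a)   # build a binary min-heap in place: O(N)
    # N deleteMins, each O(log N); collected in ascending order
    return [heapq.heappop(a) for _ in range(len(a))]
```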
who created radix sort
|
Harold H. Seward at MIT in 1954
|
|
what is the Big O of radix sort
|
O(N) (more precisely, O(d * N) for keys with d digits)
|
|
what is one of the fastest sorting algorithms when implemented correctly
|
radix sort
|
|
advantages of radix sort
|
-stable
-works in linear time, unlike most sorts -can handle large amounts of data without sacrificing time |
|
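A least-significant-digit radix sort sketch for non-negative integers; each pass is a stable bucket distribution by one decimal digit, which is why the overall sort is stable:

```python
def radix_sort(a, base=10):
    if not a:
        return a
    digit = 1
    while max(a) // digit > 0:          # one pass per digit position
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // digit) % base].append(x)  # stable distribution
        a = [x for bucket in buckets for x in bucket]
        digit *= base
    return a
```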
disadvantages of radix sort
|
-can take up more space than other algorithms because of the additional sublists needed
-takes more time to write because the code needs to be rewritten for each type of data |
|
who developed the quicksort algorithm and why
|
-C.A.R. Hoare in 1962, while working in the Soviet Union: he needed to sort words to be translated so they could be matched against an already-sorted Russian-to-English dictionary held on magnetic tape
|
|
advantages of quicksort
|
-best case is N log N
-one of the fastest sorting algorithms that does not need additional memory -simplicity |
|
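A short quicksort sketch. For clarity it partitions into new lists (the in-place, no-extra-memory form the card praises partitions within the array); picking the middle element as pivot sidesteps the sorted-input worst case:

```python
def quick_sort(a):
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]                  # middle element as pivot
    less  = [x for x in a if x < pivot]     # partition around the pivot
    equal = [x for x in a if x == pivot]
    more  = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(more)
```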
who created merge sort
|
-John von Neumann in 1945
-based on divide and conquer |
|
advantages of merge sort
|
-worst case N log N
-stable -works easily with linked lists |
|
disadvantages of merge sort
|
-may require an auxiliary array up to size N
|
|
how do you perform merge sort
|
-recursively split the list in half, sort each half, then combine the sorted sublists together by merging
|
|
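The split-then-merge steps above, sketched in Python; the `<=` comparison is what keeps the sort stable:

```python
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])  # sort each half
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge the two sorted sublists
        if left[i] <= right[j]:              # <= keeps equal keys in order
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]     # append whichever half remains
```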
what is a graph
|
a set of vertices, V and a set of edges, E
|
|
what are directed graphs called
|
digraphs
|
|
vertices w and v are adjacent if
|
(v, w) ∈ E
|
|
an acyclic graph is
|
a directed graph that has no cycles (a DAG)
|
|
the length of a path is
|
the number of edges on the path
|
|
what are some uses for graphs
|
-utilities planning
-routing -flowcharts -computer networks -electrical circuits |
|
what is the first known graph problem
|
-the Königsberg bridge problem
-inspired by asking whether it is possible to walk a route that crosses each bridge exactly once -Euler proved it was impossible, in terms of graph theory |
|
what does BFS stand for
|
-Breadth First Search
|
|
How do you perform BFS
|
-begin with the first node; list all adjacent nodes
-return to the first adjacent node; list all adjacent nodes that have not been listed yet -continue until all nodes are visited |
|
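The level-by-level visiting order above falls out of a queue; a minimal sketch over an adjacency-list graph (the example graph is my own):

```python
from collections import deque

def bfs(graph, start):
    """graph: dict mapping a node to the list of its adjacent nodes."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in graph[node]:        # list all adjacent, unlisted nodes
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return order

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
```

From 'A' this visits A, then both of A's neighbors, then D.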
What does DFS stand for
|
-Depth First Search
|
|
how do you perform DFS
|
-preorder traversal
-visit a node; go to the next adjacent node; continue -if you come to a dead end, back-track until there is a path to an unvisited node -continue until everything is visited or no paths to unvisited nodes remain |
|
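The same steps recursively; returning from the recursion is the back-tracking the card describes (example graph is my own):

```python
def dfs(graph, node, visited=None):
    """Preorder: visit the node, then recurse into each unvisited neighbor."""
    if visited is None:
        visited = []
    visited.append(node)               # visit first (preorder)
    for nxt in graph[node]:
        if nxt not in visited:
            dfs(graph, nxt, visited)   # go deeper; returning = back-tracking
    return visited

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
```

From 'A' this dives A → B → D, back-tracks to A, then visits C.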
Time for DFS
|
O(E) (O(V + E) counting vertex visits, with adjacency lists)
|
|
What is SSSP
|
-Single Source Shortest Paths
-Dijkstra |
|
What is MST
|
-Minimum Spanning Tree
-Prim -Kruskal -Both greedy algorithms |
|
What is APSP
|
-All-Pairs Shortest Paths
-Floyd |
|
What is Dijkstra and the Big O
|
-finds the shortest path from a source vertex to all other vertices in a weighted graph
-O(E + V^2) with a simple array; O(E log V) with a binary heap |
|
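A Dijkstra sketch over an adjacency-dict weighted graph. The "scan all vertices for the closest unvisited one" loop below is the simple O(V^2) form; a heap-based frontier gives O(E log V). The example graph is my own:

```python
def dijkstra(graph, source):
    """graph: {u: {v: weight, ...}}; returns shortest distance to each vertex."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    unvisited = set(graph)
    while unvisited:
        u = min(unvisited, key=dist.get)  # closest unvisited vertex: O(V) scan
        unvisited.remove(u)
        for v, w in graph[u].items():     # relax each outgoing edge
            dist[v] = min(dist[v], dist[u] + w)
    return dist

graph = {'A': {'B': 1, 'C': 4}, 'B': {'C': 2}, 'C': {}}
```

Here the direct A→C edge (weight 4) loses to the A→B→C path (weight 3).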
What is Prim and Big O
|
-grow the tree by connecting the next-shortest edge that doesn't form a cycle
-O(V^2) |
|
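A Prim sketch on an undirected weighted graph: grow the tree from one vertex, always attaching the cheapest edge that reaches a new vertex. This version keeps the frontier in a heap rather than the O(V^2) array scan the card quotes; the example graph is my own:

```python
import heapq

def prim(graph, start):
    """graph: {u: {v: weight}} undirected; returns (MST edges, total weight)."""
    in_tree, edges, total = {start}, [], 0
    frontier = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(frontier)
    while frontier and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(frontier)  # cheapest edge leaving the tree
        if v in in_tree:
            continue                       # would form a cycle; skip it
        in_tree.add(v)
        edges.append((u, v)); total += w
        for nxt, nw in graph[v].items():   # extend the frontier from v
            if nxt not in in_tree:
                heapq.heappush(frontier, (nw, v, nxt))
    return edges, total

graph = {'A': {'B': 1, 'C': 3}, 'B': {'A': 1, 'C': 1}, 'C': {'A': 3, 'B': 1}}
```

On this triangle the weight-3 edge is skipped, leaving an MST of total weight 2.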
Compare Prim and Kruskal
|
-Prim is best when there are many edges, since its running time does not depend on the number of edges
-Kruskal is good for sparse graphs with few edges |
|
What is Kruskal and the Big O
|
-sort the edges, then repeatedly take the shortest edge that doesn't create a cycle
-O(E log E) |
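The sort-and-take loop above, sketched with a small union-find to detect cycles (the edge list is my own example; sorting the edges is the O(E log E) term):

```python
def kruskal(edges, vertices):
    """edges: [(weight, u, v)] undirected; returns (MST edges, total weight)."""
    parent = {v: v for v in vertices}

    def find(v):                        # union-find: locate set representative
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    mst, total = [], 0
    for w, u, v in sorted(edges):       # shortest edges first: O(E log E)
        ru, rv = find(u), find(v)
        if ru != rv:                    # different trees, so no cycle
            parent[ru] = rv             # union the two trees
            mst.append((u, v)); total += w
    return mst, total

edges = [(1, 'A', 'B'), (1, 'B', 'C'), (3, 'A', 'C')]
```

On this triangle the two weight-1 edges are taken and the weight-3 edge is rejected as a cycle.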