Title: Wires, bits, and the cost of sorting
Speaker: Samuel Jaques
Affiliation: University of Waterloo
Location: MC 5501
Abstract: How hard is it to sort a list of n integers? A basic algorithms course says it takes O(n log n) time, but what if the list is enormous, so big you would need to cover the surface of the moon just to store it? At that point we need to worry about how we model the physical architecture of the computer. A reasonable assumption is that, up to constants (possibly very large or small), the different components of a computer (wires, bits of memory, CPUs, etc.) should be in roughly constant ratios. Switching to this perspective, I will prove a small theorem: the physical geometry of a computer implies that the time to sort a list is inversely related to the total length of wire in the computer. This will be a brief window into the science fiction of "galactic algorithms", algorithms at incomprehensible scales that cryptographers analyze to decide whether exponentially hard attacks are truly hard enough.