David R Karger | Massachusetts Institute of Technology (MIT)
Papers by David R Karger
Companion Publication of the 2021 Conference on Computer Supported Cooperative Work and Social Computing
Proceedings of the Ninth Annual ACM-SIAM Symposium on Discrete Algorithms, 1998
SIAM Journal on Computing, Dec 1, 1993
... deterministic problems. A wealth of different models for stochastic combinatorial optimization has appeared in the literature, perhaps most commonly two-stage and multi-stage stochastic optimization; see the survey by Swamy and Shmoys (2006). ...
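As context for this snippet (written out here as an illustration, not quoted from the paper), the two-stage stochastic programs it mentions choose a first-stage decision x before the random scenario ω is revealed and a recourse decision y afterwards; a standard linear formulation is:

```latex
\min_{x \in X} \; c^{\top} x + \mathbb{E}_{\omega}\!\left[ Q(x,\omega) \right],
\qquad
Q(x,\omega) = \min_{y \ge 0} \left\{ q(\omega)^{\top} y \;:\; W(\omega)\, y \ge h(\omega) - T(\omega)\, x \right\}.
```

Multi-stage models iterate this structure over several decision epochs, with each stage's decision made after observing the randomness of the stages before it.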
Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing - CSCW '16, 2016
Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms, 2009
Lecture Notes in Computer Science, 2005
Companion to the 20th annual ACM SIGPLAN conference on Object-oriented programming, systems, languages, and applications - OOPSLA '05, 2005
Proceedings of the 8th international conference on Intelligent user interfaces - IUI '03, 2003
Lecture Notes in Computer Science, 2003
Web Semantics: Science, Services and Agents on the World Wide Web, 2005
Journal of the ACM, 1995
We present a randomized linear-time algorithm to find a minimum spanning tree in a connected graph with edge weights. The algorithm uses random sampling in combination with a recently discovered linear-time algorithm for verifying a minimum spanning tree. Our computational model is a unit-cost random-access machine with the restriction that the only operations allowed on edge weights are binary comparisons.
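As a hedged illustration of the sampling-and-filtering idea behind this result (a minimal sketch, not the paper's linear-time algorithm, which also relies on Borůvka contraction steps and the linear-time verification routine), the sketch below uses Kruskal's algorithm and a naive path-maximum walk as stand-ins; all function names are hypothetical.

```python
# Sketch only: edges that are "F-heavy" with respect to a spanning forest F of
# a random sample cannot belong to the minimum spanning forest, so they can be
# discarded before the final pass.
import random
from collections import defaultdict

def kruskal_msf(n, edges):
    """Minimum spanning forest of (weight, u, v) edges via Kruskal."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    forest = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            forest.append((w, u, v))
    return forest

def f_heavy(forest, edge):
    """True if `edge` is heavier than every edge on the forest path joining its
    endpoints (naive walk; the paper uses a linear-time verification routine)."""
    w, u, v = edge
    adj = defaultdict(list)
    for fw, a, b in forest:
        adj[a].append((b, fw))
        adj[b].append((a, fw))
    stack = [(u, -1, 0)]                 # (node, parent, max weight so far)
    while stack:
        x, p, mx = stack.pop()
        if x == v:
            return w > mx
        for y, fw in adj[x]:
            if y != p:
                stack.append((y, x, max(mx, fw)))
    return False                         # endpoints not connected in F: keep the edge

def sampled_mst(n, edges):
    """Sample half the edges, build a forest of the sample, discard F-heavy
    edges, and finish on the survivors."""
    sample = [e for e in edges if random.random() < 0.5]
    forest = kruskal_msf(n, sample)
    survivors = [e for e in edges if not f_heavy(forest, e)]
    return kruskal_msf(n, survivors)
```

Correctness here does not depend on the coin flips: the filter only discards edges that provably cannot belong to the minimum spanning forest (the cycle property applied to the sampled forest), so the randomness affects only the running time.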
Journal of the ACM, 2000
We significantly improve known time bounds for solving the minimum cut problem on undirected graphs. We use a "semiduality" between minimum cuts and maximum spanning tree packings combined with our previously developed random sampling techniques. We give a randomized (Monte Carlo) algorithm that finds a minimum cut in an m-edge, n-vertex graph with high probability in O(m log³ n) time. We also give a simpler randomized algorithm that finds all minimum cuts with high probability in O(m log³ n) time. This variant has an optimal RNC parallelization. Both variants improve on the previous best time bound of O(n² log³ n). Other applications of the tree-packing approach are new, nearly tight bounds on the number of near-minimum cuts a graph may have, and a new data structure for representing them in a space-efficient manner.
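As a hedged aside, the sketch below is not the O(m log³ n) tree-packing algorithm described above; it is the far simpler randomized contraction algorithm from earlier work, included only to illustrate the Monte Carlo character of randomized minimum-cut computation. Function names are hypothetical.

```python
# Sketch only: each trial contracts random edges until two super-vertices
# remain; enough independent trials find the true minimum cut with high
# probability.
import math
import random

def contract_once(n, edges):
    """One contraction trial over an edge list of (u, v) pairs; returns the
    number of edges crossing the resulting two-block partition."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    order = edges[:]
    random.shuffle(order)          # a random edge ordering simulates picking a
    remaining = n                  # uniformly random crossing edge at each step
    for u, v in order:
        if remaining == 2:
            break
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            remaining -= 1
    return sum(1 for u, v in edges if find(u) != find(v))

def min_cut(n, edges):
    """Monte Carlo minimum cut: the best result over many trials is the true
    minimum with high probability, and it is never smaller than the minimum."""
    trials = int(n * (n - 1) * math.log(max(n, 2))) + 1
    return min(contract_once(n, edges) for _ in range(trials))
```

Each trial preserves some fixed minimum cut with probability at least 2/(n(n-1)), so Θ(n² log n) independent trials succeed with high probability; a trial can overestimate the minimum cut but never underestimate it.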
IEEE Transactions on Visualization and Computer Graphics, 2000
Information fragmentation is a pervasive problem in personal information management. Even a seemingly simple decision, such as whether to say "yes" to a dinner invitation, often depends upon information from several sources: a calendar, a paper flyer, web sites, a previous email conversation, etc. This information is fragmented by the very tools that have been designed to help us manage it. Applications often store their data in their own particular locations and representations, inaccessible to other applications. Consider the ...
A first-year graduate course in algorithms. Emphasizes fundamental algorithms and advanced methods of algorithmic design, analysis, and implementation. Data structures. Network flows. Linear programming. Computational geometry. Approximation algorithms. Alternate years.
A significant portion of the web today consists of dynamically generated content. Dynamically generated web pages are in essence HTML views of an underlying data source, usually a relational database. We have robust infrastructure for generating pages from data sources, but the reverse process of extracting (or mining) the data back from those same pages remains difficult. A record is a single HTML-embedded data tuple. Sets of records represent semantically distinct classes of items, such as people, courses, or ...
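As a hedged, self-contained sketch of the general idea (not the paper's extraction system), template-generated records can be surfaced by grouping sibling subtrees that share an identical tag structure under a common parent; everything below, including the threshold of three repeats, is a hypothetical illustration.

```python
# Sketch only: build a minimal DOM, fingerprint each subtree by its nested tag
# structure, and report groups of identically shaped siblings as candidate
# record sets.
from collections import defaultdict
from html.parser import HTMLParser

VOID_TAGS = {"br", "img", "hr", "input", "meta", "link"}

class TreeBuilder(HTMLParser):
    """Builds a minimal DOM where each node is a (tag, children) pair."""
    def __init__(self):
        super().__init__()
        self.root = ("root", [])
        self.stack = [self.root]

    def handle_starttag(self, tag, attrs):
        node = (tag, [])
        self.stack[-1][1].append(node)
        if tag not in VOID_TAGS:       # void elements never get an end tag
            self.stack.append(node)

    def handle_endtag(self, tag):
        for i in range(len(self.stack) - 1, 0, -1):
            if self.stack[i][0] == tag:
                del self.stack[i:]     # close the matching element and anything
                break                  # left unclosed inside it

def signature(node):
    """Structural fingerprint of a subtree: a nested tuple of tag names."""
    tag, children = node
    return (tag, tuple(signature(c) for c in children))

def candidate_records(html, min_repeats=3):
    """Group each parent's children by structural signature; a signature that
    repeats several times under one parent marks a likely record set."""
    builder = TreeBuilder()
    builder.feed(html)
    record_sets = []
    stack = [builder.root]
    while stack:
        _tag, children = stack.pop()
        groups = defaultdict(list)
        for child in children:
            groups[signature(child)].append(child)
        for nodes in groups.values():
            if len(nodes) >= min_repeats:
                record_sets.append(nodes)
        stack.extend(children)
    return record_sets
```

For example, feeding it a page whose <ul> holds many identically structured <li> entries returns those <li> subtrees as one candidate record set.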
In this position paper we explore current work in AtomsMasher, an end-user reactive programming environment for the Web, highlight ongoing work in user interface design, privacy, and sharing, and look towards a future of extending end-user programming from simple tasks to complete experiences.