Columbia University Department of Computer Science


The IBM Research | NYU | Columbia THEORY DAY
Friday, November 14, 2003

New York University, Courant Institute 
Room 109 Warren Weaver Hall
251 Mercer Street
New York, NY 10012

The IBM Research | NYU | Columbia Theory Day is a semi-annual conference
that aims to bring together people in the New York metropolitan area for one
day of interaction and discussion.  The Theory Day features several (usually
4-5) hour-long presentations by leading theoretical computer scientists on
state-of-the-art advances in various areas.  Some presentations survey the
latest advances in an area, while others concentrate on a particular result.
The meeting is free and open to everyone; in particular, students are encouraged to attend.

Program:

 9:30  - 10:00   Coffee and bagels

10:00 - 10:55 Prof. Oded Goldreich
On the Implementation of Huge Random Objects

10:55 - 11:05 Short break

11:05 - 12:00 Dr. Chandra Chekuri
Maximum Coverage Problem with Group Budget Constraints and Applications

12:00 - 2:00 Lunch break

2:00 - 2:55 Dr. Dana Ron
Testing Bipartiteness: The Dense, The Sparse, and The General

2:55 - 3:15 Coffee break

3:15 - 4:10 Prof. Russell Impagliazzo
Hardness as Randomness: a Survey of Universal Derandomization



The Theory Day will be held at Room 109 Warren Weaver Hall,
Courant Institute of Mathematical Sciences, New York University,
251 Mercer Street, New York, NY 10012


Organizers:

Yevgeniy Dodis (NYU)

Tal Malkin (Columbia)

Tal Rabin (IBM Research)

Baruch Schieber (IBM Research)


Abstracts:


On the Implementation of Huge Random Objects

Prof. Oded Goldreich
(Weizmann Institute of Science
and Radcliffe Institute for Advanced Study
at Harvard University)

We initiate a general study of pseudo-random implementations of huge
random objects, and apply it to a few areas in which random objects
occur naturally. For example, a random object being considered may be
a random connected graph, a random bounded-degree graph, or a random
error-correcting code with good distance. A pseudo-random
implementation of such type T objects must generate genuine objects of
type T that cannot be distinguished from random ones, rather than
objects that merely cannot be distinguished from type T objects
(while not actually being of type T at all).

We will model a type T object as a function, and access objects by
queries into these functions. We investigate supporting both standard
queries that only evaluate the primary function at locations of the
user's choice (e.g., edge queries in a graph), and complex queries that
may ask for the result of a computation on the primary function, where
this computation is infeasible to perform with a polynomial number of
standard queries (e.g., providing the next vertex along a Hamiltonian
path in the graph).
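
As a rough illustration of the "standard query" interface (and not of the
construction in the talk), the sketch below answers edge queries for a huge
"random-looking" graph lazily: each answer is derived from a keyed hash, so
the graph is never materialized and repeated queries are consistent. The
class name, key, and edge probability are illustrative choices, and the keyed
hash merely stands in for a pseudo-random function.

import hashlib

class LazyRandomGraph:
    """Toy sketch: answer edge queries for a huge "random-looking" graph on n
    vertices without materializing it. Each edge's presence is derived
    deterministically from a secret key, so repeated queries agree."""

    def __init__(self, n, key, edge_prob=0.5):
        self.n = n
        self.key = key
        self.edge_prob = edge_prob

    def has_edge(self, u, v):
        """Standard query: is (u, v) an edge? Evaluated on demand."""
        if u == v:
            return False
        a, b = min(u, v), max(u, v)  # undirected graph: normalize the pair
        digest = hashlib.sha256(f"{self.key}:{a}:{b}".encode()).digest()
        x = int.from_bytes(digest[:8], "big") / 2**64  # map to [0, 1)
        return x < self.edge_prob

# Query two edges of a graph on 10**12 vertices; the answers are consistent.
G = LazyRandomGraph(n=10**12, key="demo-key")
print(G.has_edge(3, 999), G.has_edge(999, 3))

Such a naive sampler says nothing about complex queries (e.g., the next
vertex on a Hamiltonian path), which is exactly the kind of support the
talk is concerned with.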

Joint work with Shafi Goldwasser and Asaf Nussboim.





Maximum Coverage Problem
with Group Budget Constraints and Applications

Dr. Chandra Chekuri
(Bell Labs)

We consider the following problem. We are given m sets from a
universe, each of which is colored by one of \ell distinct colors. We
are to pick k (<= \ell) sets of distinct colors so as to maximize the
size of their union. If m=\ell the problem is the same as the maximum
coverage problem and the greedy algorithm is an "optimal"
(1-1/e)-approximation algorithm. For the general problem we show that
a greedy algorithm is a 1/2-approximation algorithm. We then
demonstrate several non-trivial applications of this simple covering
abstraction and discuss some extensions.
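
As a point of reference, here is a minimal sketch (in Python, with
illustrative names and set representation) of the greedy rule described
above: repeatedly pick, among colors not yet used, the set that covers the
most uncovered elements, stopping after k sets. This is the rule to which
the 1/2-approximation guarantee refers.

def greedy_group_coverage(sets_by_color, k):
    """Greedy heuristic for maximum coverage with group (color) constraints:
    pick at most one set per color and at most k sets in total, trying to
    maximize the size of their union. sets_by_color maps each color to a
    list of frozensets of elements."""
    covered, used_colors, chosen = set(), set(), []
    for _ in range(k):
        best = None  # (gain, color, candidate set)
        for color, sets in sets_by_color.items():
            if color in used_colors:
                continue
            for s in sets:
                gain = len(s - covered)
                if best is None or gain > best[0]:
                    best = (gain, color, s)
        if best is None or best[0] == 0:
            break  # no remaining set adds anything new
        gain, color, s = best
        covered |= s
        used_colors.add(color)
        chosen.append((color, s))
    return chosen, covered

# Tiny example: three colors, pick k = 2 sets of distinct colors.
instance = {
    "red":   [frozenset({1, 2, 3})],
    "blue":  [frozenset({3, 4}), frozenset({5, 6, 7})],
    "green": [frozenset({1, 7})],
}
chosen, covered = greedy_group_coverage(instance, k=2)
print(covered)  # {1, 2, 3, 5, 6, 7}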

Joint work with Amit Kumar.




Testing Bipartiteness:
The Dense, The Sparse, and The General

Dr. Dana Ron
(Radcliffe Institute for Advanced Study
at Harvard
and Tel-Aviv University)


Property Testing is the study of the following family of problems:
given query access to an object (e.g., a graph or a function), a testing
algorithm must decide whether the object has a predetermined property
(e.g., 3-colorability or linearity) or whether it is "epsilon-far"
from having the property. By "epsilon-far" we mean that an epsilon
fraction of the object must be modified in order for it to obtain the property.

Property testing algorithms exist for many types of objects and
properties. This talk will focus on testing graph properties, and in
particular on a basic graph property that has received quite a bit of
attention: Bipartiteness. It turns out that for this property, as well
as others, the complexity of testing varies significantly with the model
used for testing, where different models are appropriate for different
types of graphs.

This talk will survey three results (and mention a few other related
results).
(1) The first result is an algorithm for testing bipartiteness of dense
graphs. The complexity of this algorithm is independent of the size of
the graph (a sketch in this spirit appears after the list).
(2) The second result is an algorithm for testing bipartiteness of
bounded-degree graphs. The complexity of this algorithm grows
(roughly) like sqrt{n}, where n is the number of graph vertices, and
there is a matching lower bound.
(3) The third, and most recent, result studies a model that is appropriate
for all types of graphs. This work bridges the gap between the first
two results: as long as the average degree of the graph is at most
sqrt{n}, it achieves the same complexity as the second algorithm, and
as the average degree increases, it approaches the complexity of the
first algorithm. Here too the bound is almost tight.
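
As a rough illustration of the dense-graph tester in result (1), the sketch
below samples a vertex set whose size depends only on epsilon, queries all
pairs through an edge oracle, and accepts iff the sampled subgraph is
bipartite. The sample-size constant and function names are illustrative
rather than the exact bounds from the paper; the point is that the number of
queries does not depend on n.

import random
from collections import deque

def is_bipartite(adj):
    """Check bipartiteness of a small graph given as {vertex: set of neighbors}
    by attempting a 2-coloring with breadth-first search."""
    color = {}
    for start in adj:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return False  # odd cycle found
    return True

def test_bipartite_dense(has_edge, n, eps, rng=random):
    """Property-testing sketch for dense graphs: accept iff a random vertex
    sample (of size depending only on eps) induces a bipartite subgraph.
    Bipartite graphs are always accepted; graphs that are eps-far from
    bipartite are rejected with constant probability for a suitable sample
    size."""
    sample = rng.sample(range(n), min(n, int(10 / eps ** 2)))
    adj = {u: set() for u in sample}
    for i, u in enumerate(sample):
        for v in sample[i + 1:]:
            if has_edge(u, v):
                adj[u].add(v)
                adj[v].add(u)
    return is_bipartite(adj)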

This talk is based on joint works with Oded Goldreich, Shafi Goldwasser,
Tali Kaufman, and Michael Krivelevich.



Hardness as Randomness:
a Survey of Universal Derandomization

Prof. Russell Impagliazzo
(UCSD and IAS)

We survey recent developments in the study of probabilistic complexity
classes. While the evidence seems to support the conjecture that
probabilism can be deterministically simulated with relatively low
overhead, i.e., that P=BPP, it also indicates that this may be a
difficult question to resolve. In fact, proving that probabilistic
algorithms have non-trivial deterministic simulations is basically
equivalent to proving circuit lower bounds, either in the algebraic or
Boolean models.

In this talk, I'll survey some results from the theory of derandomization.
I'll stress connections to other questions, especially circuit complexity,
explicit extractors, hardness amplification, and error-correcting codes.
Much of the talk is based on work by Valentine Kabanets, Avi Wigderson
and myself, but I'll also include results by many other researchers.






"Every day is Theory Day"