Thursday, July 9, 2009

High Throughput Testing and TCS meets EDA

Several items caught my interest this week:

The latest document used by the French Minister in charge of Research, which sets out the axes of research of national interest, includes:

"..Premier axe : santé, bien-être, alimentation et biotechnologies (analyses biologiques à haut débit, nanobiotechnologies, cohortes suivies sur vingt ans, robots d’aide aux personnes dépendantes)..."


The topic of high-throughput testing seems to be getting some traction at the policy level. While we are on the subject of high-throughput testing, here is a recent arXiv preprint on compressive sensing and group testing: Boolean Compressed Sensing and Noisy Group Testing by George Atia and Venkatesh Saligrama (I featured it here).
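To give a flavor of the connection, here is a minimal sketch of the Boolean (OR) measurement model underlying group testing, together with the simple "clear every item that appears in a negative test" decoder. The sizes, the random pooling design, and the noiseless setting are illustrative assumptions on my part, not the construction studied in the Atia/Saligrama preprint.

```python
# Boolean group testing sketch: a sparse set of "defective" items is recovered
# from pooled tests, each test returning the OR of the items it contains.
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 200, 60, 5           # items, pooled tests, number of defectives
x = np.zeros(n, dtype=bool)
x[rng.choice(n, size=k, replace=False)] = True   # unknown sparse defective set

A = rng.random((m, n)) < 0.05                    # pooling matrix: test i includes item j
y = (A.astype(int) @ x.astype(int)) > 0          # Boolean OR outcome of each test (noiseless)

# Simple decoder: an item is cleared if it appears in any negative test
cleared = A[~y].any(axis=0)
x_hat = ~cleared

print("missed defectives:", int((x & ~x_hat).sum()))
print("false positives  :", int((~x & x_hat).sum()))
```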

The second item of interest is more general and concerns how theoretical researchers view applied problems. The US NSF recently held a workshop on Design Automation and Theory/Electronic Design Automation that drew theoretical researchers into the problems of chip engineering. Dick Lipton and Suresh Venkatasubramanian, both well-known researchers in Theoretical Computer Science (TCS), wrote about it on their respective blogs:

In Suresh's entry, one can read:

A second thought was how the lessons of massive data analysis might be useful in the realm of DA. One speaker described one critical problem as being the degree of complexity associated with current DA tools: there are over 4000 "knobs" to turn in one such tool ! It's believed that these knobs are not independent, and might even be contradictory. If we think of each "run" of the DA tool, outputting some kind of chip layout, as a point in this 4000+ dimensional space, I wonder whether techniques for dimensionality reduction and manifold analysis might be useful to find a set of "core knobs" that control the process.


Let us note that the issue of engineers having to deal with "4000 knobs" is eerily similar to what the Experimental Probabilistic Hypersurface (EPH) seems to be solving for the thermal-hydraulics codes used in nuclear reactor simulations. I note that he has a view similar to mine about performing dimensionality reduction. An issue that also affects the EPH is setting up the right metric between the "knobs". A small sketch of the dimensionality-reduction idea follows.
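Here is a hedged sketch of that idea: treat each run of a DA tool as a point in knob-space and use PCA to see whether a few "core" directions explain most of the variation. The data below are synthetic by assumption (a handful of latent factors driving the 4000 knobs); a real study would use logged tool runs and would first need a sensible metric and scaling for heterogeneous knobs, which is precisely the open issue mentioned above.

```python
# PCA on synthetic "knob settings" to look for a small set of core directions.
import numpy as np

rng = np.random.default_rng(1)

n_runs, n_knobs, n_core = 500, 4000, 10

# Assumed synthetic structure: the 4000 knob settings are driven by ~10 latent factors
latent = rng.normal(size=(n_runs, n_core))
mixing = rng.normal(size=(n_core, n_knobs))
runs = latent @ mixing + 0.01 * rng.normal(size=(n_runs, n_knobs))

# PCA via SVD of the centered run matrix
centered = runs - runs.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s**2) / (s**2).sum()

print("variance explained by the first 10 components:",
      round(float(explained[:10].sum()), 4))
```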

Table: Parameters of the Cathare code as used in the EPH. From "L'Hypersurface Probabiliste : Construction explicite à partir du Code Cathare" by Olga Zeydina.
