2012 Cheriton research symposium

Friday, September 21, 2012 10:00 am - 5:00 pm EDT (GMT -04:00)

The Cheriton School of Computer Science will hold its annual Cheriton Symposium on September 21 in the Davis Centre.

This year's symposium will consist of talks by Prof. David R. Cheriton of Stanford University and by Faculty Fellowship recipients Gladimir Baranoski and Peter Forsyth, from 2:00 pm to 5:00 pm in DC 1302.

Posters by Cheriton Graduate Student Scholarship recipients will be on display in the Great Hall, Davis Centre, from 10:00 am to 4:00 pm.

Schedule

10:00 am - 4:00 pm: DC Great Hall - Poster Session

12:30 pm: Lunch in DC 1301

2:00 - 2:10 pm: DC 1302 - David Taylor - Welcome and opening remarks

2:10 - 3:00 pm: DC 1302 - David R. Cheriton - Abort Rates and Degree of Concurrency with Transactional Memory

Hardware transactional memory has been proposed as a means to allow the highly concurrent programming that is required with multi-core systems to achieve high throughput, without all the complication and overhead of locking. However, a higher degree of concurrency can lead to a higher transaction abort rate, reducing the throughput and thus the benefit of the concurrency. In this talk, I describe our investigation of the abort rate of two different approaches to hardware transactional memory, namely the conventional in-place update as well as the snapshot isolation model provided by the HICAMP architecture. Our results show that the latter approach together with a merge-update extension to conflict resolution leads to lower or comparable abort rates while allowing higher degrees of concurrency and thus higher throughput, at the cost of the relaxed form of consistency provided by snapshot isolation.
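To make the comparison above concrete, the toy Python sketch below contrasts two conflict-detection policies at commit time: an eager in-place-update scheme, where read-write and write-write overlaps both force an abort, and a snapshot-isolation scheme, where only write-write overlaps do. The transaction sizes, key space, and validation rule are invented for this illustration and do not model the HICAMP architecture or the merge-update extension discussed in the talk.

```python
# Toy abort-rate comparison: eager in-place update vs. snapshot isolation.
# Illustrative only; not a model of HICAMP or of merge-update conflict resolution.
import random

def run_batch(n_txns, n_keys, reads_per_txn, writes_per_txn, policy, seed=0):
    """Simulate one batch of concurrent transactions validating at commit."""
    rng = random.Random(seed)
    committed_writes = set()   # keys written by already-committed transactions
    aborts = 0
    for _ in range(n_txns):
        reads = {rng.randrange(n_keys) for _ in range(reads_per_txn)}
        writes = {rng.randrange(n_keys) for _ in range(writes_per_txn)}
        if policy == "in_place":
            # Eager scheme: abort if reads or writes overlap a committed write.
            conflict = bool((reads | writes) & committed_writes)
        else:  # "snapshot"
            # Snapshot isolation: reads come from a consistent snapshot,
            # so only write-write overlaps force an abort.
            conflict = bool(writes & committed_writes)
        if conflict:
            aborts += 1
        else:
            committed_writes |= writes
    return aborts / n_txns

for policy in ("in_place", "snapshot"):
    rate = run_batch(n_txns=200, n_keys=1000, reads_per_txn=8,
                     writes_per_txn=2, policy=policy)
    print(f"{policy:9s} abort rate: {rate:.2f}")
```

Because the eager policy also counts read-write overlaps as conflicts, its abort rate in this toy grows faster as more transactions commit concurrently, which is the effect the talk quantifies for real workloads.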

3:05 - 4:00 pm: DC 1302 - Peter Forsyth - Optimal Order Execution: Do You Know What Your Broker is Doing?

Algorithmic trade execution has become a standard technique for institutional market players in recent years, particularly in the equity market where electronic trading is most prevalent. A trade execution algorithm typically seeks to execute a trade decision optimally upon receiving inputs from a human trader. A common form of optimality criterion seeks to strike a balance between minimizing pricing impact and minimizing timing risk. For example, in the case of selling a large number of shares, a fast liquidation will cause the share price to drop, whereas a slow liquidation will expose the seller to timing risk due to the stochastic nature of the share price.

A desirable strategy can be defined in terms of a Pareto optimal solution. We seek to determine the strategy which, for a given expected revenue from selling a block of shares, minimizes the risk (i.e. the variance of the revenue).
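In symbols (notation introduced here for illustration, not taken from the talk): writing R(v) for the random revenue obtained under liquidation strategy v, the Pareto-optimal strategies described above can be written as

```latex
% Illustrative mean-variance formulation; R(v), r and \lambda are our notation.
\min_{v(\cdot)} \ \mathrm{Var}[R(v)]
\quad \text{subject to} \quad \mathbb{E}[R(v)] = r,
% for each attainable expected revenue level r; the efficient frontier can also
% be traced out by the scalarized problem, varying the risk-aversion weight:
\min_{v(\cdot)} \ \bigl\{ \lambda\,\mathrm{Var}[R(v)] - \mathbb{E}[R(v)] \bigr\},
\qquad \lambda > 0 .
```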

We compare optimal liquidation policies in continuous time in the presence of trading impact using numerical solutions of Hamilton-Jacobi-Bellman (HJB) partial differential equations (PDEs). The industry standard approach (the Almgren and Chriss strategy) is based on an approximate solution to the HJB equation. In terms of the mean-variance efficient frontier, the original Almgren/Chriss strategy is significantly sub-optimal compared to the solution obtained by solving the HJB equation numerically.
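For reference, the Almgren/Chriss strategy mentioned above has a well-known closed form in the continuous-time setting with linear temporary impact: the holdings decay like sinh(kappa*(T - t))/sinh(kappa*T) with kappa = sqrt(lambda*sigma^2/eta). The sketch below simply evaluates that trajectory; the parameter values are placeholders chosen to produce readable output, not figures from the talk.

```python
# Closed-form Almgren/Chriss liquidation trajectory (the approximate
# "industry standard" strategy referred to above).  Parameters are
# illustrative placeholders only.
import math

def almgren_chriss_holdings(X, T, sigma, eta, lam, n_steps=10):
    """Shares still held at each time step when selling X shares over [0, T].

    kappa = sqrt(lam * sigma**2 / eta) controls how front-loaded the
    liquidation is: higher risk aversion (lam) or volatility (sigma) means
    selling faster; higher temporary impact (eta) means selling more slowly.
    """
    kappa = math.sqrt(lam * sigma ** 2 / eta)
    times = [T * k / n_steps for k in range(n_steps + 1)]
    return [(t, X * math.sinh(kappa * (T - t)) / math.sinh(kappa * T))
            for t in times]

for t, x in almgren_chriss_holdings(X=1_000_000, T=1.0, sigma=0.3,
                                    eta=2.5e-6, lam=1e-6):
    print(f"t = {t:4.2f}  shares remaining = {x:12.0f}")
```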

4:05 - 5:00 pm: DC 1302 - Gladimir Baranoski - Benefits and Pitfalls of Interdisciplinary Research on Light and Matter Interactions

Models of light and matter interactions employ computer simulations to describe how different materials absorb and scatter light. These models are extensively used in a wide range of fields, from computer graphics (e.g., realistic image synthesis) and biomedical optics (e.g., noninvasive diagnosis of medical conditions) to systems biology (e.g., prediction of plant responses to environmental stress) and remote sensing (e.g., early detection of diseases in vegetation). More recently, models of light and matter interactions are also being used to accelerate the hypothesis generation and validation cycles of theoretical research in these fields. The development of simulation frameworks that can be used across disciplines requires a sound scientific methodology, and it is rarely a linear process. For example, a model should be carefully evaluated by comparing its results with actual measured data, a basic requirement in the physical and life sciences. However, such data often either does not exist or is not readily available. Although significant progress has been achieved on the predictive modelling of light and matter interactions, several technical and political pitfalls severely hinder further advances in this area and limit the applicability of existing models. In this talk, we present a broad discussion of these issues. This discussion is illustrated with examples derived from openly accessible models developed by the Natural Phenomena Simulation Group (NPSG) at the University of Waterloo.
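As a minimal illustration of the kind of simulation such models are built on, the Python sketch below runs a toy Monte Carlo photon walk through a homogeneous slab and tallies the reflected, absorbed, and transmitted fractions. The geometry, coefficients, and isotropic 1-D scattering rule are simplifications invented for this sketch; it is not one of the NPSG models mentioned above.

```python
# Minimal Monte Carlo sketch of light-matter interaction in a homogeneous slab:
# each photon travels until it is absorbed or leaves the material.
# Coefficients are illustrative and do not correspond to any real material.
import math
import random

def simulate_slab(n_photons, thickness, mu_a, mu_s, seed=1):
    """Return (reflected, absorbed, transmitted) fractions for a 1-D slab."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s                      # total interaction coefficient
    counts = {"reflected": 0, "absorbed": 0, "transmitted": 0}
    for _ in range(n_photons):
        z, direction = 0.0, 1.0             # depth and travel direction (+/-1)
        while True:
            z += direction * -math.log(rng.random()) / mu_t   # free path length
            if z < 0.0:
                counts["reflected"] += 1
                break
            if z > thickness:
                counts["transmitted"] += 1
                break
            if rng.random() < mu_a / mu_t:  # absorption event
                counts["absorbed"] += 1
                break
            direction = rng.choice((-1.0, 1.0))               # isotropic scatter
    return {k: v / n_photons for k, v in counts.items()}

print(simulate_slab(n_photons=50_000, thickness=1.0, mu_a=0.5, mu_s=2.0))
```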

Previous symposiums