This blog post is only available in English.
On Thursday, March 30, four uOttawa law students (Florence So, J.D. 2018; Marshall Jeske, J.D. 2018; Suzie Dunn, LL.M. 2018; and myself) began the eight-hour road trip to New Haven, Connecticut, for WeRobot 2017. New Haven is a small coastal city best known for Yale University. Yale Law School was the venue for WeRobot, a two-day conference featuring panel discussions from the who’s who of robot law and policy.
The experts, ranging from lawyers to engineers, gather to discuss current and emerging legal, social, and ethical issues arising from robotics and artificial intelligence. Day one consisted of six panels focused on technical and theoretical issues with robots and AI. The five panel discussions on day two focused on regulatory challenges, as well as some proposed solutions.
The panelists and papers from the sixth annual WeRobot were all very interesting and can be found online at http://www.werobot2017.com/program.
Bonus: Check out Amanda Levendowski’s Twitter feed (@levendowski) for her compilation of live tweets from the conference!
- How Copyright Law Creates Biased Artificial Intelligence
by Amanda Levendowski
Amanda Levendowski argues that copyright law can bias machine learning by restricting algorithms to incomplete datasets; giving them access to more complete data would produce fairer results and decisions. She argues for a fair use (fair dealing) right to use copyrighted material for the greater good of the learning algorithm and, ultimately, the public. This is particularly relevant in the Canadian context, since our copyright laws are up for review this year.
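To make the intuition concrete, here is a minimal, purely illustrative sketch in Python (the corpus snippets and the toy word-frequency "model" are invented for illustration and are not from Levendowski's paper): a model trained only on older public-domain text ends up skewed toward archaic language, while the same model trained on a fuller corpus better reflects how people actually write today.

```python
# Illustrative sketch only: a toy "language model" built by counting word
# frequencies. The corpora below are invented examples.

from collections import Counter

def train(corpus):
    """Estimate how often each word appears across the corpus."""
    counts = Counter(word for doc in corpus for word in doc.split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

# Hypothetical corpora: older public-domain texts vs. contemporary,
# copyrighted writing that reflects how people speak today.
public_domain = ["thee thou hath spoken", "the carriage hath arrived"]
contemporary  = ["the train arrived on time", "she spoke at the conference"]

restricted_model = train(public_domain)                 # copyright-constrained
fuller_model     = train(public_domain + contemporary)  # with fair use/dealing

# The restricted model has never seen ordinary modern words, i.e., it is
# skewed by what it was legally allowed to read.
print("'spoke' in restricted model:", restricted_model.get("spoke", 0.0))
print("'spoke' in fuller model:    ", fuller_model.get("spoke", 0.0))
```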
Highlights by Florence So, JD 2018
Of particular interest was the panel on human-automation regulation, which presented:
- Framing Human-Automation Regulation: A New Modus Operandi from Cognitive Engineering
by Marc Canellas, Rachel Haga, et al., and
- Reformulating Regulation Around Driverless Cars
by Tracy Pearl.
The panelists talked about the disconnect between the engineers as designers and the pilots as users. For instance, when a certain behaviour is fundamentally built into a system, it is very difficult to modify without affecting the overall system architecture. The only way around such a deep-rooted but undesirable feature at the testing or even production stage is to train the users (pilots) to avoid the “glitch”. This scenario speaks to the age-old design problem where the system brilliantly built by engineers may not be the one the user actually wished for. (And that assumes the user knew what they wanted!) An obvious solution is to bring in user input early, at the design stage.

The same goes for policy. Engineers approach building things from a technical perspective: what is scientifically possible, and does it comply with IEEE standards? They are not thinking about policy. Meanwhile, policy makers are not aware of what the engineers are building. By the time the two groups finally talk to each other, it is often too late to roll back the system. Engineers define the problem statement before they build; I think this problem statement should be defined by engineers and policy makers together. Policy by design, so to speak. The rationale is to ensure that engineers aren’t just solving the problem right, but solving the right problem. It sounds simple, but it is difficult to implement in practice.
As a digression, change management is another essential component that is generally lacking in the technology field. Users are used to doing things a certain way, and it is quite rude to dump a new system onto them. Change management can prepare them to accept the change and avoid “shock”. And if done properly, users can give valuable insight back to the engineers.
Overall, this conference was an eye-opener for me. More engineers need to attend this type of conference. In my entire career in IT and design, policy never came up once. Policy is not in the curriculum of design and engineering programs. This gap is a big problem, and the situation needs to change immediately.
Highlights by Marshall Jeske, JD 2018
- Feminist Perspectives on Drone Regulation
by Kristen Thomasen
This paper by Windsor Law Professor (and CLTS PhD candidate) Kristen Thomasen explores drone regulation through a feminist legal critique. Kristen argues that current drone regulation is focused too narrowly on safety. She wants regulators and drone makers to start considering other implications of the technology, including privacy concerns and how these technologies may disproportionately affect certain marginalized groups.
- Sporting Chances: Robot Referees and the Automation of Enforcement
by Karen Levy and Meg Leta Jones
This paper examines automated enforcement in professional sports. Several case studies are explored, including automated line calls in tennis, instant replay in football, rule infractions in golf, and the debate around adopting an automated strike zone in baseball. The authors conclude that each sport exists as its own contained legal world, with unique cultural and market incentives that drive the adoption or rejection of automated rules. One theme that runs through each sport is the idea of “the sporting chance”: rules in sports are not meant to be 100% absolute, and there should be some room for athletes to influence how the rules are applied.
During the panel, much of the discussion focused on what the history of automation in sports can tell us about other forms of automation in society. For example, why do we cringe at the thought of automated speeding tickets or automated parking enforcement? Part of what emerged in the discussion is that strict rules (don’t park here or you will be fined; or, in baseball, anything within this area is a strike) are only one element of the legal and cultural regime at play. There are also principles and standards that co-exist with these hard-line rules and are much harder to automate. In baseball, the best umpires’ strike zones will expand and contract depending on the game situation. When one team is winning by 20 runs, a good umpire will expand the strike zone in order to get the game over with more quickly.
I would like to thank the University of Ottawa for its generous support in attending WeRobot 2017. The "Dean's Conference Fund" was issued by the Association Étudiante en Common Law Student Society (AÉECLSS).