With plenty of companies already working on prototypes, the age of driverless cars is rapidly approaching. Like any disruptive technology, autonomous vehicles raise a plethora of practical and legal implications across multiple industries that will soon require regulation and standardisation. For example, the question of who is at fault, or who should be held accountable, in a collision involving a driverless car is a grey area that will require some form of regulatory guidance.

The most compelling issue raised by autonomous vehicles, however, is an ethical one – a moral and social dilemma that is stirring a great deal of discussion. If a driverless car is about to hit a pedestrian (or several pedestrians), should it swerve and risk killing its occupants? Driverless cars are touted as a safer alternative to human-driven vehicles, but despite their capabilities it is not inconceivable that such a situation could arise. Given that these vehicles will have to be programmed with a set of rules determining how they decide in such situations, this difficult question is one that will need an answer, and soon.
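To make the point concrete, consider a purely hypothetical sketch of what encoding such a rule might look like. Everything here – the `Scenario` fields, the `choose_manoeuvre` function, the casualty-minimising policy – is an illustrative assumption, not any manufacturer's actual system; the same structure could just as easily encode a rule that always protects the occupants, which is precisely the dilemma.

```python
from dataclasses import dataclass

# Hypothetical illustration only: names and logic are assumptions for the
# sake of argument, not a real autonomous-vehicle API.

@dataclass
class Scenario:
    pedestrians_at_risk: int   # people in the car's current path
    occupants: int             # people inside the vehicle
    swerve_survivable: bool    # whether swerving is likely survivable for occupants

def choose_manoeuvre(s: Scenario) -> str:
    """Return 'swerve' or 'stay_course' for a given scenario.

    This toy policy minimises expected casualties – only one of several
    defensible ethical positions a programmer could be asked to encode.
    """
    if s.swerve_survivable:
        return "swerve"                  # avoid pedestrians at little cost to occupants
    if s.pedestrians_at_risk > s.occupants:
        return "swerve"                  # fewer expected casualties overall
    return "stay_course"                 # staying the course risks fewer lives

# Example: three pedestrians ahead, one occupant, swerve likely fatal to occupant
print(choose_manoeuvre(Scenario(pedestrians_at_risk=3, occupants=1, swerve_survivable=False)))
# -> swerve
```

Whichever branch the function takes, someone has to write it, and that choice is exactly the moral question regulators and manufacturers will have to answer.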